Hi! I am trying to de-seasonalize specific humidity data by subtracting the monthly mean. When I use ct.climate.monthly_mean() on the data, I get the error message below. The same errors appear if I use ct.cube.resample(freq='month') instead.

My workflow can be found here: https://sis-dev.climate.copernicus.eu/toolbox-editor/1474/example-3-seasonal-variations-of-humidity-20s-20n-heatmap 
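
In short, the step that fails looks roughly like this (a simplified sketch, not the exact workflow code; data is the specific humidity (hus) cube retrieved earlier in the workflow):

# Either of these calls produces the errors shown below:
monthly_mean = ct.climate.monthly_mean(data)
# monthly_mean = ct.cube.resample(data, freq='month')

# The plan is then to subtract the monthly mean to de-seasonalize the data:
deseasonalized = data - monthly_mean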

First, I get a RuntimeError stating that there is an invalid argument with regard to NetCDF. It then raises a ValueError saying that an array could not be safely cast from dtype int64 to int32. How do I make sure both are of dtype int64? And what does the first error mean?

Received error:

Traceback (most recent call last):
File "/opt/cdstoolbox/cdscompute/cdscompute/cdshandlers/services/handler.py", line 49, in handle_request
result = cached(context.method, proc, context, context.args, context.kwargs)
File "/opt/cdstoolbox/cdscompute/cdscompute/caching.py", line 108, in cached
result = proc(context, *context.args, **context.kwargs)
File "/opt/cdstoolbox/cdscompute/cdscompute/services.py", line 118, in __call__
return p(*args, **kwargs)
File "/opt/cdstoolbox/cdscompute/cdscompute/services.py", line 59, in __call__
return self.proc(context, *args, **kwargs)
File "/home/cds/cdsservices/services/python_service.py", line 38, in execute
raise exceptions.InternalError(logging + traceback, '')
cdsclient.exceptions.InternalError: Traceback (most recent call last):
File "/opt/cdstoolbox/cdscdm/cdscdm/io.py", line 144, in dataarray_to_netcdf
dataset.to_netcdf(output_path, **kwargs)
File "/usr/local/lib/python3.6/site-packages/xarray/core/dataset.py", line 1547, in to_netcdf
invalid_netcdf=invalid_netcdf,
File "/usr/local/lib/python3.6/site-packages/xarray/backends/api.py", line 1073, in to_netcdf
dataset, store, writer, encoding=encoding, unlimited_dims=unlimited_dims
File "/usr/local/lib/python3.6/site-packages/xarray/backends/api.py", line 1119, in dump_to_store
store.store(variables, attrs, check_encoding, writer, unlimited_dims=unlimited_dims)
File "/usr/local/lib/python3.6/site-packages/xarray/backends/common.py", line 298, in store
variables, check_encoding_set, writer, unlimited_dims=unlimited_dims
File "/usr/local/lib/python3.6/site-packages/xarray/backends/common.py", line 336, in set_variables
name, v, check, unlimited_dims=unlimited_dims
File "/usr/local/lib/python3.6/site-packages/xarray/backends/netCDF4_.py", line 478, in prepare_variable
fill_value=fill_value,
File "netCDF4/_netCDF4.pyx", line 2768, in netCDF4._netCDF4.Dataset.createVariable
File "netCDF4/_netCDF4.pyx", line 3857, in netCDF4._netCDF4.Variable.__init__
File "netCDF4/_netCDF4.pyx", line 1887, in netCDF4._netCDF4._ensure_nc_success
RuntimeError: NetCDF: Invalid argument

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/opt/cdstoolbox/jsonrequest/jsonrequest/coding.py", line 36, in encode_complex_object
return encoder(python_object, **context)
File "/opt/cdstoolbox/jsonrequest/jsonrequest/coders.py", line 99, in encoder_wrapper
metadata = encoder(python_object, path, cachedir=cachedir, cacheurl=cacheurl, **context)
File "/opt/cdstoolbox/cdscdm/cdscdm/jsoncoders.py", line 12, in dataarray_encoder
io.dataarray_to_netcdf(python_object, path)
File "/opt/cdstoolbox/cdscdm/cdscdm/io.py", line 146, in dataarray_to_netcdf
dataset.to_netcdf(output_path, engine='scipy', **kwargs)
File "/usr/local/lib/python3.6/site-packages/xarray/core/dataset.py", line 1547, in to_netcdf
invalid_netcdf=invalid_netcdf,
File "/usr/local/lib/python3.6/site-packages/xarray/backends/api.py", line 1073, in to_netcdf
dataset, store, writer, encoding=encoding, unlimited_dims=unlimited_dims
File "/usr/local/lib/python3.6/site-packages/xarray/backends/api.py", line 1119, in dump_to_store
store.store(variables, attrs, check_encoding, writer, unlimited_dims=unlimited_dims)
File "/usr/local/lib/python3.6/site-packages/xarray/backends/common.py", line 293, in store
variables, attributes = self.encode(variables, attributes)
File "/usr/local/lib/python3.6/site-packages/xarray/backends/common.py", line 383, in encode
variables = {k: self.encode_variable(v) for k, v in variables.items()}
File "/usr/local/lib/python3.6/site-packages/xarray/backends/common.py", line 383, in <dictcomp>
variables = {k: self.encode_variable(v) for k, v in variables.items()}
File "/usr/local/lib/python3.6/site-packages/xarray/backends/scipy_.py", line 191, in encode_variable
variable = encode_nc3_variable(variable)
File "/usr/local/lib/python3.6/site-packages/xarray/backends/netcdf3.py", line 82, in encode_nc3_variable
attrs = encode_nc3_attrs(var.attrs)
File "/usr/local/lib/python3.6/site-packages/xarray/backends/netcdf3.py", line 72, in encode_nc3_attrs
return {k: encode_nc3_attr_value(v) for k, v in attrs.items()}
File "/usr/local/lib/python3.6/site-packages/xarray/backends/netcdf3.py", line 72, in <dictcomp>
return {k: encode_nc3_attr_value(v) for k, v in attrs.items()}
File "/usr/local/lib/python3.6/site-packages/xarray/backends/netcdf3.py", line 65, in encode_nc3_attr_value
value = coerce_nc3_dtype(np.atleast_1d(value))
File "/usr/local/lib/python3.6/site-packages/xarray/backends/netcdf3.py", line 53, in coerce_nc3_dtype
f"could not safely cast array from dtype {dtype} to {new_dtype}"
ValueError: could not safely cast array from dtype int64 to int32

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/opt/cdstoolbox/jsonrequest/jsonrequest/requests.py", line 71, in jsonrequestcall
resp = coding.encode(req.callable(*req.args, **req.kwargs), register=encoders, **context)
File "/opt/cdstoolbox/jsonrequest/jsonrequest/coding.py", line 61, in encode
return encode_complex_object(python_object, register, **context)
File "/opt/cdstoolbox/jsonrequest/jsonrequest/coding.py", line 39, in encode_complex_object
raise TypeError("Can't encode object %r" % (python_object,)) from exc
TypeError: Can't encode object <xarray.DataArray 'hus' (time: 0, height: 251)>
dask.array<sub, shape=(0, 251), dtype=float32, chunksize=(0, 251), chunktype=numpy.ndarray>
Coordinates:
* time (time) datetime64[ns]
* height (height) float64 0.0 200.0 400.0 600.0 ... 4.96e+04 4.98e+04 5e+04
Attributes:
units: 1.0

Thanks

Huayuan



2 Comments

  1. Dear Huayuan,

    Here is a working version of your workflow: https://sis-dev.climate.copernicus.eu/toolbox-editor/112/example-3-seasonal-variations-of-humidity-20s-20n-heatmap-1

    I have fixed a few things:

    • first I resampled the downloaded data to month start ('MS') instead of to the middle of the month: data = ct.cube.resample(data, dim='time', freq='MS')
    • I reviewed your anomaly calculation, but please check that this is what you were intending to do (see the sketch after this list):
      • first I compute the climatology mean, i.e. the monthly mean over all the available years (the annual cycle)
      • then I calculate the anomaly with respect to the climatology mean: ct.climate.anomaly(data_mu, annual_cycle)
      • finally I compute the anomaly percentage with respect to the climatology mean: ct.cube.groupby_operator(anomaly, annual_cycle, 'time.month', operation='/') * 100.
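
    In outline, the core of the workflow then looks roughly like this (a simplified sketch; the climatology step is written here with ct.climate.climatology_mean, but please refer to the linked workflow for the exact call):

        # monthly means anchored at month start ('MS')
        data_mu = ct.cube.resample(data, dim='time', freq='MS')

        # climatology mean: the monthly mean over all available years (the annual cycle);
        # shown here with ct.climate.climatology_mean, see the linked workflow for the exact call
        annual_cycle = ct.climate.climatology_mean(data_mu)

        # anomaly with respect to the climatology mean
        anomaly = ct.climate.anomaly(data_mu, annual_cycle)

        # anomaly as a percentage of the climatology mean, matching each
        # time step to its climatological month
        anomaly_percent = ct.cube.groupby_operator(anomaly, annual_cycle, 'time.month', operation='/') * 100.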

    Please let me know if you have any questions on any of these steps.

    Regards.

    Vivien

    1. Thank you for the help, Vivien! It works flawlessly now.