HDF5 "strings.nc" {
GROUP "/" {
ATTRIBUTE "_NCProperties" {
DATATYPE H5T_STRING {
STRSIZE 8192;
STRPAD H5T_STR_NULLTERM;
CSET H5T_CSET_UTF8;
CTYPE H5T_C_S1;
}
DATASPACE SCALAR
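
The dump above is truncated, but it shows netCDF-4's _NCProperties attribute stored on the root group as a fixed-width, null-terminated UTF-8 string. A minimal sketch of inspecting it directly with h5py (assuming "strings.nc" is available locally):

import h5py

# Read the _NCProperties attribute shown in the dump above.  h5py may
# return fixed-width string attributes as bytes, so decode defensively.
with h5py.File("strings.nc", "r") as f:
    props = f.attrs["_NCProperties"]
    if isinstance(props, bytes):
        props = props.decode("utf-8")
    print(props)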
import xarray as xr

@xr.register_dataset_accessor('crs')
@xr.register_dataarray_accessor('crs')
def make_crs(xarray_obj):
    return xarray_obj.coords['crs'].item()
In [24]: ds = xr.Dataset({'x': 1}, coords={'foo': 'bar', 'crs': 'hello, my lovely unwrapped object'})
In [25]: ds
Out[25]:
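
As a usage sketch (assuming the accessor registration above has run; the Dataset repr is omitted in the preview), attribute access under the registered name calls make_crs on the object and returns the unwrapped Python value rather than a zero-dimensional DataArray:

In [26]: ds.crs
Out[26]: 'hello, my lovely unwrapped object'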
# Copyright 2017 Google Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
=============================== warnings summary ===============================
xarray/tests/test_backends.py::NetCDF4DataTest::test_default_fill_value
/home/travis/build/pydata/xarray/xarray/conventions.py:1162: RuntimeWarning: saving variable x with floating point data as an integer dtype without any _FillValue to use for NaNs
for k, v in iteritems(variables))
xarray/tests/test_backends.py::NetCDF4DataStoreAutocloseTrue::test_default_fill_value
/home/travis/build/pydata/xarray/xarray/conventions.py:1162: RuntimeWarning: saving variable x with floating point data as an integer dtype without any _FillValue to use for NaNs
for k, v in iteritems(variables))
xarray/tests/test_backends.py::NetCDF4ViaDaskDataTest::test_default_fill_value
/home/travis/build/pydata/xarray/xarray/conventions.py:1162: RuntimeWarning: saving variable x with floating point data as an integer dtype without any _FillValue to use for NaNs
for k, v in iteritems(variables))
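
The RuntimeWarning repeated above fires when floating-point data (which may contain NaNs) is written to netCDF with an integer on-disk dtype but no fill value declared. A minimal sketch of silencing it by declaring an explicit _FillValue in the encoding (the file name and fill value here are illustrative, not from the test suite):

import numpy as np
import xarray as xr

ds = xr.Dataset({"x": ("t", np.array([1.0, np.nan, 3.0]))})

# Declaring both the on-disk integer dtype and a _FillValue tells the
# netCDF backend how to encode NaNs, so the warning above is not raised.
ds.to_netcdf("out.nc", encoding={"x": {"dtype": "int32", "_FillValue": -9999}})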
@shoyer
shoyer / fails.txt
Created October 25, 2017 03:56
xarray build failure: No module named 'hypothesis.extra.pytestplugin'
This file has been truncated.
travis_fold:start:worker_info
Worker information
hostname: 1d69df04-f002-4b58-a506-c7b9fba6bdd5@1.i-03218df-production-2-worker-org-ec2.travisci.net
version: v3.1.0 https://github.com/travis-ci/worker/tree/5b106cb773802277034d07b0a865af58d67bb1d2
instance: 51daf6d:travisci/ci-garnet:packer-1503972846 (via amqp)
startup: 527.739487ms
travis_fold:end:worker_info
travis_fold:start:system_info
Build system information
Build language: python
TypeError Traceback (most recent call last)
~/conda/envs/xarray-py36/lib/python3.6/site-packages/distributed/protocol/pickle.py in dumps(x)
37 try:
---> 38 result = pickle.dumps(x, protocol=pickle.HIGHEST_PROTOCOL)
39 if len(result) < 1000:
TypeError: can't pickle netCDF4._netCDF4.Variable objects
During handling of the above exception, another exception occurred:
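
The truncated traceback above is the usual failure when a live netCDF4.Variable handle gets pickled for a dask.distributed worker. A sketch of common workarounds, assuming a local file named example.nc (exact behavior depends on the xarray and distributed versions involved):

import xarray as xr

# Workaround 1: load the data into memory before it crosses process
# boundaries -- NumPy arrays pickle fine, open netCDF4 handles do not.
ds = xr.open_dataset("example.nc").load()

# Workaround 2: open lazily with dask chunks so that workers read the
# file themselves instead of receiving a pickled file handle.
ds_lazy = xr.open_dataset("example.nc", chunks={"time": 100})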
# Copyright 2017 Google LLC.
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
# https://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,

I'll mention one other option that I've been contemplating recently, a bit of a hybrid of solutions 1 and 2:

  • We could build a new library with dispatchable functions inside NumPy itself, e.g., "numpy.api."

Functions in numpy.api work just like those in numpy, with two critical differences:

  1. They support overloading, via some to-be-determined mechanism.
  2. They don't coerce unknown types to NumPy arrays via np.array()/__array__().

This approach has a number of advantages over adjusting existing NumPy functions:

  • Backwards compatibility. For any particular numpy function without overloads, there is assuredly existing code that relies on it always coercing to numpy arrays. Every time we add a new overload in NumPy (e.g., np.any() recently), we've seen things break in downstream libraries like pandas. Even if we require downstream libraries to opt in (e.g., by implementing __array_ufunc__), that just pushes the breakage downstream.
  • Predictability. We can remove any uncertainty over whether a NumPy function s
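
To make the two numbered differences above concrete, here is a toy sketch of one possible dispatch mechanism. The __numpy_api__ protocol name and the overloadable helper are hypothetical illustrations, not an actual NumPy API:

import functools
import numpy as np

def overloadable(numpy_func):
    # Toy decorator: dispatch to a duck array's own implementation if it
    # opts in via a hypothetical __numpy_api__ protocol; otherwise call the
    # plain NumPy function without coercing unknown types via np.array().
    @functools.wraps(numpy_func)
    def wrapper(x, *args, **kwargs):
        impl = getattr(type(x), "__numpy_api__", None)
        if impl is not None:
            # Difference 1: the argument's type can override the function.
            return impl(x, numpy_func, *args, **kwargs)
        if not isinstance(x, (np.ndarray, np.generic, list, tuple, int, float, complex)):
            # Difference 2: unknown types raise instead of being coerced.
            raise TypeError("no implementation of %s for %r"
                            % (numpy_func.__name__, type(x)))
        return numpy_func(x, *args, **kwargs)
    return wrapper

# e.g. a hypothetical numpy.api.sum built on top of np.sum:
api_sum = overloadable(np.sum)
api_sum([1, 2, 3])  # -> 6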
@shoyer
shoyer / scipy-vs-numba-interp1d.ipynb
Last active September 18, 2019 08:05
scipy vs numba interp1d.ipynb