Return a new DataArray whose dataset is given by selecting
index labels along the specified dimension(s).
.. warning::
Do not try to assign values when using any of the indexing methods
``isel`` or ``sel``::
da = xr.DataArray([0, 1, 2, 3], dims=['x'])
# DO ... |
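The reason for this warning can be sketched with plain numpy, whose fancy indexing likewise returns a copy rather than a view; this is an analogy, not the xarray code itself:

```python
import numpy as np

a = np.arange(4)
selected = a[[0, 1]]   # fancy indexing returns a copy, not a view
selected[:] = -1       # only the copy is modified
# the assignment never reaches `a`; the same silent no-op happens when
# assigning to the result of isel/sel
```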
Return a new DataArray whose dataset is given by pointwise integer
indexing along the specified dimension(s).
See Also
--------
Dataset.isel_points
def isel_points(self, dim='points', **indexers):
"""Return a new DataArray whose dataset is given by pointwise integer
ind... |
Return a new DataArray whose dataset is given by pointwise selection
of index labels along the specified dimension(s).
See Also
--------
Dataset.sel_points
def sel_points(self, dim='points', method=None, tolerance=None,
**indexers):
"""Return a new DataArray ... |
Conform this object onto a new set of indexes, filling in
missing values with NaN.
Parameters
----------
indexers : dict, optional
Dictionary with keys given by dimension names and values given by
arrays of coordinate tick labels. Any mis-matched coordinate
... |
Multidimensional interpolation of variables.
coords : dict, optional
Mapping from dimension names to the new coordinates.
New coordinate can be a scalar, array-like or DataArray.
If DataArrays are passed as new coordinates, their dimensions are
used for the broadc... |
Interpolate this object onto the coordinates of another object,
filling out of range values with NaN.
Parameters
----------
other : Dataset or DataArray
Object with an 'indexes' attribute giving a mapping from dimension
names to a 1d array-like, which provides c... |
Returns a new DataArray with renamed coordinates or a new name.
Parameters
----------
new_name_or_name_dict : str or dict-like, optional
If the argument is dict-like, it is used as a mapping from old
names to new names for coordinates. Otherwise, use the argument
... |
Returns a new DataArray with swapped dimensions.
Parameters
----------
dims_dict : dict-like
Dictionary whose keys are current dimension names and whose values
are new names. Each value must already be a coordinate on this
array.
Returns
----... |
Return a new object with an additional axis (or axes) inserted at
the corresponding position in the array shape.
If dim is already a scalar coordinate, it will be promoted to a 1D
coordinate consisting of a single value.
Parameters
----------
dim : str, sequence of str,... |
Set DataArray (multi-)indexes using one or more existing
coordinates.
Parameters
----------
indexes : {dim: index, ...}
Mapping from names matching dimensions and values given
by (lists of) the names of existing coordinates or variables to set
as new ... |
Reset the specified index(es) or multi-index level(s).
Parameters
----------
dims_or_levels : str or list
Name(s) of the dimension(s) and/or multi-index level(s) that will
be reset.
drop : bool, optional
If True, remove the specified indexes and/or mu... |
Rearrange index levels using input order.
Parameters
----------
dim_order : optional
Mapping from names matching dimensions and values given
by lists representing new level orders. Every given dimension
must have a multi-index.
inplace : bool, optiona... |
Stack any number of existing dimensions into a single new dimension.
New dimensions will be added at the end, and the corresponding
coordinate variables will be combined into a MultiIndex.
Parameters
----------
dimensions : Mapping of the form new_name=(dim1, dim2, ...)
... |
Unstack existing dimensions corresponding to MultiIndexes into
multiple new dimensions.
New dimensions will be added at the end.
Parameters
----------
dim : str or sequence of str, optional
Dimension(s) over which to unstack. By default unstacks all
Mult... |
Return a new DataArray object with transposed dimensions.
Parameters
----------
*dims : str, optional
By default, reverse the dimensions. Otherwise, reorder the
dimensions to this order.
Returns
-------
transposed : DataArray
The retu... |
Drop coordinates or index labels from this DataArray.
Parameters
----------
labels : scalar or list of scalars
Name(s) of coordinate variables or index labels to drop.
dim : str, optional
Dimension along which to drop index labels. By default (if
``di... |
Returns a new array with dropped labels for missing values along
the provided dimension.
Parameters
----------
dim : str
Dimension along which to drop missing values. Dropping along
multiple dimensions simultaneously is not yet supported.
how : {'any', 'a... |
Fill missing values in this object.
This operation follows the normal broadcasting and alignment rules that
xarray uses for binary arithmetic, except the result is aligned to this
object (``join='left'``) instead of aligned to the intersection of
index coordinates (``join='inner'``).
... |
Fill NaN values by propagating values forward
*Requires bottleneck.*
Parameters
----------
dim : str
Specifies the dimension along which to propagate values when
filling.
limit : int, default None
The maximum number of consecutive NaN values ... |
Fill NaN values by propagating values backward
*Requires bottleneck.*
Parameters
----------
dim : str
Specifies the dimension along which to propagate values when
filling.
limit : int, default None
The maximum number of consecutive NaN values... |
Reduce this array by applying `func` along some dimension(s).
Parameters
----------
func : function
Function which can be called in the form
`f(x, axis=axis, **kwargs)` to return the result of reducing an
np.ndarray over an integer valued axis.
dim : ... |
Convert this array into a pandas object with the same shape.
The type of the returned object depends on the number of DataArray
dimensions:
* 1D -> `pandas.Series`
* 2D -> `pandas.DataFrame`
* 3D -> `pandas.Panel`
Only works for arrays with 3 or fewer dimensions.
... |
Convert this array and its coordinates into a tidy pandas.DataFrame.
The DataFrame is indexed by the Cartesian product of index coordinates
(in the form of a :py:class:`pandas.MultiIndex`).
Other coordinates are included as columns in the DataFrame.
def to_dataframe(self, name=None):
... |
Convert this array into a pandas.Series.
The Series is indexed by the Cartesian product of index coordinates
(in the form of a :py:class:`pandas.MultiIndex`).
def to_series(self):
"""Convert this array into a pandas.Series.
The Series is indexed by the Cartesian product of index coord... |
Convert this array into a numpy.ma.MaskedArray
Parameters
----------
copy : bool
If True (default) make a copy of the array in the result. If False,
a MaskedArray view of DataArray.values is returned.
Returns
-------
result : MaskedArray
... |
Write DataArray contents to a netCDF file.
Parameters
----------
path : str or Path, optional
Path to which to save this dataset. If no path is provided, this
function returns the resulting netCDF file as a bytes object; in
this case, we need to use scipy.io.... |
Convert this xarray.DataArray into a dictionary following xarray
naming conventions.
Converts all variables and attributes to native Python objects.
Useful for converting to json. To avoid datetime incompatibility
use decode_times=False kwarg in xarray.open_dataset.
Parameters
... |
Convert a dictionary into an xarray.DataArray
Input dict can take several forms::
d = {'dims': ('t'), 'data': x}
d = {'coords': {'t': {'dims': 't', 'data': t,
'attrs': {'units':'s'}}},
'attrs': {'title': 'air temperature'},
... |
Convert a pandas.Series into an xarray.DataArray.
If the series's index is a MultiIndex, it will be expanded into a
tensor product of one-dimensional coordinates (filling in missing
values with NaN). Thus this operation should be the inverse of the
`to_series` method.
def from_series(c... |
Helper function for equals and identical
def _all_compat(self, other, compat_str):
"""Helper function for equals and identical"""
def compat(x, y):
return getattr(x.variable, compat_str)(y.variable)
return (utils.dict_equiv(self.coords, other.coords, compat=compat) and
... |
Like equals, but also checks the array name and attributes, and
attributes on all coordinates.
See Also
--------
DataArray.broadcast_equals
DataArray.equals
def identical(self, other):
"""Like equals, but also checks the array name and attributes, and
attributes ... |
If the dataarray has 1 dimensional coordinates or comes from a slice
we can show that info in the title
Parameters
----------
truncate : integer
maximum number of characters for title
Returns
-------
title : string
Can be used for plot ti... |
Calculate the n-th order discrete difference along given axis.
Parameters
----------
dim : str, optional
Dimension over which to calculate the finite difference.
n : int, optional
The number of times values are differenced.
label : str, optional
... |
Shift this array by an offset along one or more dimensions.
Only the data is moved; coordinates stay in place. Values shifted from
beyond array bounds are replaced by NaN. This is consistent with the
behavior of ``shift`` in pandas.
Parameters
----------
shifts : Mappin... |
Roll this array by an offset along one or more dimensions.
Unlike shift, roll may rotate all variables, including coordinates
if specified. The direction of rotation is consistent with
:py:func:`numpy.roll`.
Parameters
----------
roll_coords : bool
Indicates... |
Perform dot product of two DataArrays along their shared dims.
Equivalent to taking tensordot over all shared dims.
Parameters
----------
other : DataArray
The other array with which the dot product is performed.
dims: list of strings, optional
Al... |
Sort object by labels or values (along an axis).
Sorts the dataarray, either along specified dimensions,
or according to values of 1-D dataarrays that share dimension
with calling object.
If the input variables are dataarrays, then the dataarrays are aligned
(via left-join) to ... |
Compute the qth quantile of the data along the specified dimension.
Returns the qth quantile(s) of the array elements.
Parameters
----------
q : float in range of [0,1] (or sequence of floats)
Quantile to compute, which must be between 0 and 1 inclusive.
dim : str ... |
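The underlying semantics match `numpy.quantile`, shown here as a plain-numpy sketch of the same computation:

```python
import numpy as np

data = np.array([1.0, 2.0, 3.0, 4.0])
median = np.quantile(data, 0.5)       # 2.5, with linear interpolation
q = np.quantile(data, [0.25, 0.75])   # one result per requested quantile
```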
Ranks the data.
Equal values are assigned a rank that is the average of the ranks that
would have been otherwise assigned to all of the values within that
set. Ranks begin at 1, not 0. If pct, computes percentage ranks.
NaNs in the input array are returned as NaNs.
The `bottl... |
Differentiate the array with second-order accurate central
differences.
.. note::
This feature is limited to simple cartesian geometry, i.e. coord
must be one dimensional.
Parameters
----------
coord: str
The coordinate to be used to comp... |
Integrate the array with the trapezoidal rule.
.. note::
This feature is limited to simple cartesian geometry, i.e. coord
must be one dimensional.
Parameters
----------
dim: str, or a sequence of str
Coordinate(s) used for the integration.
da... |
Convert this rolling object to xr.DataArray,
where the window dimension is stacked as a new dimension
Parameters
----------
window_dim: str
New name of the window dimension.
stride: integer, optional
Size of stride for the rolling window.
fill_val... |
Reduce the items in this group by applying `func` along some
dimension(s).
Parameters
----------
func : function
Function which can be called in the form
`func(x, **kwargs)` to return the result of collapsing an
np.ndarray over the rolling dimensio... |
Number of non-nan entries in each rolling window.
def _counts(self):
""" Number of non-nan entries in each rolling window. """
rolling_dim = utils.get_temp_dimname(self.obj.dims, '_rolling_dim')
# We use False as the fill_value instead of np.nan, since boolean
# array is faster to be r... |
Methods to return a wrapped function for any function `func` for
numpy methods.
def _reduce_method(cls, func):
"""
Methods to return a wrapped function for any function `func` for
numpy methods.
"""
def wrapped_func(self, **kwargs):
return self.reduce(func, ... |
Methods to return a wrapped function for any function `func` for
bottleneck methods, except for `median`.
def _bottleneck_reduce(cls, func):
"""
Methods to return a wrapped function for any function `func` for
bottleneck methods, except for `median`.
"""
def wrapped_fun... |
Reduce the items in this group by applying `func` along some
dimension(s).
Parameters
----------
func : function
Function which can be called in the form
`func(x, **kwargs)` to return the result of collapsing an
np.ndarray over an the rolling dimensio... |
Return a wrapped function for injecting numpy and bottleneck methods.
see ops.inject_datasetrolling_methods
def _reduce_method(cls, func):
"""
Return a wrapped function for injecting numpy and bottleneck methods.
see ops.inject_datasetrolling_methods
"""
def wrapped_f... |
Convert this rolling object to xr.Dataset,
where the window dimension is stacked as a new dimension
Parameters
----------
window_dim: str
New name of the window dimension.
stride: integer, optional
Size of stride for the rolling window.
fill_value... |
Return a wrapped function for injecting numpy methods.
see ops.inject_coarsen_methods
def _reduce_method(cls, func):
"""
Return a wrapped function for injecting numpy methods.
see ops.inject_coarsen_methods
"""
def wrapped_func(self, **kwargs):
from .dataarra... |
Return a wrapped function for injecting numpy methods.
see ops.inject_coarsen_methods
def _reduce_method(cls, func):
"""
Return a wrapped function for injecting numpy methods.
see ops.inject_coarsen_methods
"""
def wrapped_func(self, **kwargs):
from .dataset ... |
Ensure that a variable with vlen bytes is converted to fixed width.
def ensure_fixed_length_bytes(var):
"""Ensure that a variable with vlen bytes is converted to fixed width."""
dims, data, attrs, encoding = unpack_for_encoding(var)
if check_vlen_dtype(data.dtype) == bytes:
# TODO: figure out how t... |
Convert numpy/dask arrays from fixed width bytes to characters.
def bytes_to_char(arr):
"""Convert numpy/dask arrays from fixed width bytes to characters."""
if arr.dtype.kind != 'S':
raise ValueError('argument must have a fixed-width bytes dtype')
if isinstance(arr, dask_array_type):
impo... |
Like netCDF4.stringtochar, but faster and more flexible.
def _numpy_bytes_to_char(arr):
"""Like netCDF4.stringtochar, but faster and more flexible.
"""
# ensure the array is contiguous
arr = np.array(arr, copy=False, order='C', dtype=np.string_)
return arr.reshape(arr.shape + (1,)).view('S1') |
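A self-contained version of the reshape/view trick above, using `np.bytes_` in place of the deprecated `np.string_` alias:

```python
import numpy as np

def numpy_bytes_to_char(arr):
    # ensure the array is contiguous, then expose each fixed-width bytes
    # element as a trailing axis of single characters ('S1')
    arr = np.asarray(arr, dtype=np.bytes_, order='C')
    return arr.reshape(arr.shape + (1,)).view('S1')

chars = numpy_bytes_to_char(np.array([b'abc', b'xyz']))
# chars has shape (2, 3), one 'S1' element per character
```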
Convert numpy/dask arrays from characters to fixed width bytes.
def char_to_bytes(arr):
"""Convert numpy/dask arrays from characters to fixed width bytes."""
if arr.dtype != 'S1':
raise ValueError("argument must have dtype='S1'")
if not arr.ndim:
# no dimension to concatenate along
... |
Like netCDF4.chartostring, but faster and more flexible.
def _numpy_char_to_bytes(arr):
"""Like netCDF4.chartostring, but faster and more flexible.
"""
# based on: http://stackoverflow.com/a/10984878/809705
arr = np.array(arr, copy=False, order='C')
dtype = 'S' + str(arr.shape[-1])
return arr.v... |
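The inverse view trick can be sketched as follows, reinterpreting the trailing axis of single characters as one fixed-width bytes element:

```python
import numpy as np

def numpy_char_to_bytes(arr):
    # view the trailing 'S1' axis as a single fixed-width bytes element
    arr = np.asarray(arr, order='C')
    dtype = 'S' + str(arr.shape[-1])
    return arr.view(dtype).reshape(arr.shape[:-1])

chars = np.array([[b'a', b'b'], [b'c', b'd']], dtype='S1')
# numpy_char_to_bytes(chars) round-trips to [b'ab', b'cd']
```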
Given an array, safely cast it to a pandas.Index.
If it is already a pandas.Index, return it unchanged.
Unlike pandas.Index, if the array has dtype=object or dtype=timedelta64,
this function will not attempt to do automatic type conversion but will
always return an index with dtype=object.
def safe_c... |
Creating a MultiIndex from a product without refactorizing levels.
Keeping levels the same gives back the original labels when we unstack.
Parameters
----------
levels : sequence of pd.Index
Values for each MultiIndex level.
names : optional sequence of objects
Names for each level... |
Wrap a transformed array with __array_wrap__ if it can be done safely.
This lets us treat arbitrary functions that take and return ndarray objects
like ufuncs, as long as they return an array with the same shape.
def maybe_wrap_array(original, new_array):
"""Wrap a transformed array with __array_wrap__ is... |
Compare two objects for equivalence (identity or equality), using
array_equiv if either object is an ndarray
def equivalent(first: T, second: T) -> bool:
"""Compare two objects for equivalence (identity or equality), using
array_equiv if either object is an ndarray
"""
# TODO: refactor to avoid cir... |
Returns the first value from iterable, as well as a new iterator with
the same content as the original iterable
def peek_at(iterable: Iterable[T]) -> Tuple[T, Iterator[T]]:
"""Returns the first value from iterable, as well as a new iterator with
the same content as the original iterable
"""
gen = i... |
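One standard-library way to implement this, chaining the consumed item back in front of the remaining iterator:

```python
from itertools import chain

def peek_at(iterable):
    # consume one item, then chain it back in front of the rest
    gen = iter(iterable)
    peek = next(gen)
    return peek, chain([peek], gen)

first, rest = peek_at(x for x in [1, 2, 3])
# first == 1 and rest still yields the full sequence 1, 2, 3
```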
Check the safety of updating one dictionary with another.
Raises ValueError if dictionaries have non-compatible values for any key,
where compatibility is determined by identity (they are the same item) or
the `compat` function.
Parameters
----------
first_dict, second_dict : dict-like
... |
Remove incompatible items from the first dictionary in-place.
Items are retained if their keys are found in both dictionaries and the
values are compatible.
Parameters
----------
first_dict, second_dict : dict-like
Mappings to merge.
compat : function, optional
Binary operator ... |
Whether to treat a value as a scalar.
Any non-iterable, string, or 0-D array
def is_scalar(value: Any) -> bool:
"""Whether to treat a value as a scalar.
Any non-iterable, string, or 0-D array
"""
return (
getattr(value, 'ndim', None) == 0 or
isinstance(value, (str, bytes)) or not
... |
Given a value, wrap it in a 0-D numpy.ndarray with dtype=object.
def to_0d_object_array(value: Any) -> np.ndarray:
"""Given a value, wrap it in a 0-D numpy.ndarray with dtype=object.
"""
result = np.empty((), dtype=object)
result[()] = value
return result |
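A usage sketch showing why the empty-array-plus-assignment idiom is needed: `np.array` would unpack a list into a 1-D array, while this stores it whole:

```python
import numpy as np

def to_0d_object_array(value):
    # np.empty + item assignment stores the value whole, unlike np.array
    result = np.empty((), dtype=object)
    result[()] = value
    return result

wrapped = to_0d_object_array([1, 2, 3])
# wrapped.ndim == 0 and wrapped.item() is the original list
```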
Given a value, wrap it in a 0-D numpy.ndarray.
def to_0d_array(value: Any) -> np.ndarray:
"""Given a value, wrap it in a 0-D numpy.ndarray.
"""
if np.isscalar(value) or (isinstance(value, np.ndarray) and
value.ndim == 0):
return np.array(value)
else:
return... |
Test equivalence of two dict-like objects. If any of the values are
numpy arrays, compare them correctly.
Parameters
----------
first, second : dict-like
Dictionaries to compare for equality
compat : function, optional
Binary operator to determine if two values are compatible. By de... |
Return the intersection of two dictionaries as a new OrderedDict.
Items are retained if their keys are found in both dictionaries and the
values are compatible.
Parameters
----------
first_dict, second_dict : dict-like
Mappings to merge.
compat : function, optional
Binary opera... |
Return True if values of an array are uniformly spaced and sorted.
>>> is_uniform_spaced(range(5))
True
>>> is_uniform_spaced([-4, 0, 100])
False
kwargs are additional arguments to ``np.isclose``
def is_uniform_spaced(arr, **kwargs) -> bool:
"""Return True if values of an array are uniformly ... |
Convert attribute values from numpy objects to native Python objects,
for use in to_dict
def decode_numpy_dict_values(attrs: Mapping[K, V]) -> Dict[K, V]:
"""Convert attribute values from numpy objects to native Python objects,
for use in to_dict
"""
attrs = dict(attrs)
for k, v in attrs.items(... |
Convert val out of numpy time, for use in to_dict.
Needed because of numpy bug GH#7619
def ensure_us_time_resolution(val):
"""Convert val out of numpy time, for use in to_dict.
Needed because of numpy bug GH#7619"""
if np.issubdtype(val.dtype, np.datetime64):
val = val.astype('datetime64[us]')
... |
Get a new dimension name based on new_dim that is not used in dims.
If the same name exists, we prepend underscore(s) to the name.
Example1:
dims: ['a', 'b', 'c']
new_dim: ['_rolling']
-> ['_rolling']
Example2:
dims: ['a', 'b', 'c', '_rolling']
new_dim: ['_rolling']... |
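The two examples above follow from a simple loop that prepends underscores until the candidate name no longer collides:

```python
def get_temp_dimname(dims, new_dim):
    # prepend underscores until the candidate name is unused in dims
    while new_dim in dims:
        new_dim = '_' + new_dim
    return new_dim

get_temp_dimname(['a', 'b', 'c'], '_rolling')              # '_rolling'
get_temp_dimname(['a', 'b', 'c', '_rolling'], '_rolling')  # '__rolling'
```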
Given an object array with no missing values, infer its dtype from its
first element
def _infer_dtype(array, name=None):
"""Given an object array with no missing values, infer its dtype from its
first element
"""
if array.dtype.kind != 'O':
raise TypeError('infer_type must be called on a dt... |
Create a copy of an array with the given dtype.
We use this instead of np.array() to ensure that custom object dtypes end
up on the resulting array.
def _copy_with_dtype(data, dtype):
"""Create a copy of an array with the given dtype.
We use this instead of np.array() to ensure that custom object dty... |
Converts a Variable into a Variable which follows some
of the CF conventions:
- NaNs are masked using _FillValue (or the deprecated missing_value)
- Rescaling via: scale_factor and add_offset
- datetimes are converted to the CF 'units since time' format
- dtype encodings are enfor... |
Decodes a variable which may hold CF encoded information.
This includes variables that have been masked and scaled, which
hold CF style time variables (this is almost always the case if
the dataset has been serialized) and which have strings encoded
as character arrays.
Parameters
----------
... |
Adds time attributes to time bounds variables.
Variables handling time bounds ("Cell boundaries" in the CF
conventions) do not necessarily carry the necessary attributes to be
decoded. This copies the attributes from the time variable to the
associated boundaries.
See Also:
http://cfconventio... |
Decode several CF encoded variables.
See: decode_cf_variable
def decode_cf_variables(variables, attributes, concat_characters=True,
mask_and_scale=True, decode_times=True,
decode_coords=True, drop_variables=None,
use_cftime=None):
"""
... |
Decode the given Dataset or Datastore according to CF conventions into
a new Dataset.
Parameters
----------
obj : Dataset or DataStore
Object to decode.
concat_characters : bool, optional
Should character arrays be concatenated to strings, for
example: ['h', 'e', 'l', 'l', '... |
Decode a set of CF encoded variables and attributes.
See Also: decode_cf_variable
Parameters
----------
variables : dict
A dictionary mapping from variable name to xarray.Variable
attributes : dict
A dictionary mapping from attribute name to value
concat_characters : bool
... |
Encode coordinates on the given dataset object into variable specific
and global attributes.
When possible, this is done according to CF conventions.
Parameters
----------
dataset : Dataset
Object to encode.
Returns
-------
variables : dict
attrs : dict
def encode_dataset... |
A function which takes a dict of variables and attributes
and encodes them to conform to CF conventions as much
as possible. This includes masking, scaling, character
array handling, and CF-time encoding.
Decode a set of CF encoded variables and attributes.
See Also: decode_cf_variable
Para... |
Coerce an array to a data type that can be stored in a netCDF-3 file
This function performs the following dtype conversions:
int64 -> int32
bool -> int8
Data is checked for equality, or equivalence (non-NaN values) with
`np.allclose` with the default keyword arguments.
def coerce_nc3_dtyp... |
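A hedged sketch of this coercion, using exact equality where the original checks equivalence with `np.allclose`:

```python
import numpy as np

def coerce_nc3_dtype(arr):
    # netCDF-3 stores neither 64-bit integers nor booleans; cast and then
    # verify the round-trip was lossless before accepting the result
    conversions = {'int64': 'int32', 'bool': 'int8'}
    new_dtype = conversions.get(str(arr.dtype))
    if new_dtype is not None:
        cast = arr.astype(new_dtype)
        if not (cast == arr).all():
            raise ValueError('could not safely cast array to ' + new_dtype)
        arr = cast
    return arr
```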
Test whether an object can be validly converted to a netCDF-3
dimension, variable or attribute name
Earlier versions of the netCDF C-library reference implementation
enforced a more restricted set of characters in creating new names,
but permitted reading names containing arbitrary bytes. This
spec... |
Convert an object into a Variable.
Parameters
----------
obj : object
Object to convert into a Variable.
- If the object is already a Variable, return a shallow copy.
- Otherwise, if the object has 'dims' and 'data' attributes, convert
it into a new Variable.
- If... |
Convert arrays of datetime.datetime and datetime.timedelta objects into
datetime64 and timedelta64, according to the pandas convention.
def _possibly_convert_objects(values):
"""Convert arrays of datetime.datetime and datetime.timedelta objects into
datetime64 and timedelta64, according to the pandas conve... |
Prepare and wrap data to put in a Variable.
- If data does not have the necessary attributes, convert it to ndarray.
- If data has dtype=datetime64, ensure that it has ns precision. If it's a
pandas.Timestamp, convert it to datetime64.
- If data is already a pandas or xarray object (other than an Ind... |
Return the given values as a numpy array, or as an individual item if
it's a 0d datetime64 or timedelta64 array.
Importantly, this function does not copy data if it is already an ndarray -
otherwise, it will not be possible to update Variable values in place.
This function mostly exists because 0-dime... |
Create broadcast compatible variables, with the same dimensions.
Unlike the result of broadcast_variables(), some variables may have
dimensions of size 1 instead of the size of the broadcast dimension.
def _broadcast_compat_variables(*variables):
"""Create broadcast compatible variables, with the same... |
Given any number of variables, return variables with matching dimensions
and broadcast data.
The data on the returned variables will be a view of the data on the
corresponding original arrays, but dimensions will be reordered and
inserted so that both broadcast arrays have the same dimensions. The new
... |
Concatenate variables along a new or existing dimension.
Parameters
----------
variables : iterable of Array
Arrays to stack together. Each variable is expected to have
matching dimensions and shape except for along the stacked
dimension.
dim : str or DataArray, optional
... |
Check for uniqueness of MultiIndex level names in all given
variables.
Not public API. Used for checking consistency of DataArray and Dataset
objects.
def assert_unique_multiindex_level_names(variables):
"""Check for uniqueness of MultiIndex level names in all given
variables.
Not public API.... |
Manually trigger loading of this variable's data from disk or a
remote source into memory and return this variable.
Normally, it should not be necessary to call this method in user code,
because all xarray functions should either work on deferred data or
load data automatically.
... |
Return this variable as a base xarray.Variable
def to_base_variable(self):
"""Return this variable as a base xarray.Variable"""
return Variable(self.dims, self._data, self._attrs,
encoding=self._encoding, fastpath=True) |
Return this variable as an xarray.IndexVariable
def to_index_variable(self):
"""Return this variable as an xarray.IndexVariable"""
return IndexVariable(self.dims, self._data, self._attrs,
encoding=self._encoding, fastpath=True) |
Dictionary representation of variable.
def to_dict(self, data=True):
"""Dictionary representation of variable."""
item = {'dims': self.dims,
'attrs': decode_numpy_dict_values(self.attrs)}
if data:
item['data'] = ensure_us_time_resolution(self.values).tolist()
... |
Prepare an indexing key for an indexing operation.
Parameters
----------
key: int, slice, array, dict or tuple of integers, slices and arrays
Any valid input for indexing.
Returns
-------
dims: tuple
Dimension of the resultant variable.
i... |
Make sanity checks
def _validate_indexers(self, key):
""" Make sanity checks """
for dim, k in zip(self.dims, key):
if isinstance(k, BASIC_INDEXING_TYPES):
pass
else:
if not isinstance(k, Variable):
k = np.asarray(k)
... |
Equivalent to numpy's nonzero, but returns a tuple of Variables.
def _nonzero(self):
""" Equivalent numpy's nonzero but returns a tuple of Varibles. """
# TODO we should replace dask's native nonzero
# after https://github.com/dask/dask/issues/1076 is implemented.
nonzeros = np.nonzero(self.... |
Used by IndexVariable to return IndexVariable objects when possible.
def _finalize_indexing_result(self, dims, data):
"""Used by IndexVariable to return IndexVariable objects when possible.
"""
return type(self)(dims, data, self._attrs, self._encoding,
fastpath=True) |