Data-type of the array's elements. Returns ------- numpy.dtype This NDArray's data type. Examples -------- >>> x = mx.nd.zeros((2,3)) >>> x.dtype <type 'numpy.float32'> >>> y = mx.nd.zeros((2,3), dtype='int32') >>> y.dtype ...
Whether this array's corresponding gradient array (registered via `autograd.mark_variables`) has been updated by `autograd.backward` since last reset. `_fresh_grad` needs to be manually set to False after consuming the gradient (usually after updating this array). def _fresh_grad(se...
Returns a ``numpy.ndarray`` object with value copied from this array. Examples -------- >>> x = mx.nd.ones((2,3)) >>> y = x.asnumpy() >>> type(y) <type 'numpy.ndarray'> >>> y array([[ 1., 1., 1.], [ 1., 1., 1.]], dtype=float32) ...
Returns a copy of the array after casting to a specified type. Parameters ---------- dtype : numpy.dtype or str The type of the returned array. copy : bool Default `True`. By default, astype always returns a newly allocated ndarray on the same context...
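A minimal usage sketch for the cast described above, assuming the standard `mx.nd` namespace: >>> import mxnet as mx >>> x = mx.nd.zeros((2, 3), dtype='float32') >>> y = x.astype('int32') >>> y.dtype <type 'numpy.int32'>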
Copies the value of this array to another array. If ``other`` is a ``NDArray`` object, then ``other.shape`` and ``self.shape`` should be the same. This function copies the value from ``self`` to ``other``. If ``other`` is a context, a new ``NDArray`` will be first created on th...
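A short sketch of the two copy targets described above (another NDArray, or a context): >>> import mxnet as mx >>> x = mx.nd.ones((2, 3)) >>> y = mx.nd.zeros((2, 3)) >>> x.copyto(y) # copies the values of x into y >>> z = x.copyto(mx.cpu()) # allocates a new NDArray on the target context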
Returns an array on the target device with the same value as this array. If the target context is the same as ``self.context``, then ``self`` is returned. Otherwise, a copy is made. Parameters ---------- context : Context The target context. Returns ...
Attach a gradient buffer to this NDArray, so that `backward` can compute gradient with respect to it. Parameters ---------- grad_req : {'write', 'add', 'null'} How gradient will be accumulated. - 'write': gradient will be overwritten on every backward. ...
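A hedged end-to-end sketch of the attach/record/backward cycle described above: >>> import mxnet as mx >>> x = mx.nd.ones((2, 3)) >>> x.attach_grad() # allocate a gradient buffer, default grad_req='write' >>> with mx.autograd.record(): ... y = (x * 2).sum() >>> y.backward() >>> x.grad # the buffer now holds dy/dx = 2 everywhere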
Returns gradient buffer attached to this NDArray. def grad(self): """Returns gradient buffer attached to this NDArray.""" from . import _ndarray_cls hdl = NDArrayHandle() check_call(_LIB.MXNDArrayGetGrad(self.handle, ctypes.byref(hdl))) if hdl.value is None: return N...
Returns a new NDArray, detached from the current graph. def detach(self): """Returns a new NDArray, detached from the current graph.""" from . import _ndarray_cls hdl = NDArrayHandle() check_call(_LIB.MXNDArrayDetach(self.handle, ctypes.byref(hdl))) return _ndarray_cls(hdl)
Compute the gradients of this NDArray w.r.t. variables. Parameters ---------- out_grad : NDArray, optional Gradient with respect to head. retain_graph : bool, optional Whether to retain the computation graph for another backward pass on the same graph. ...
Build the align array def build(self, align_path): """ Build the align array """ file = open(align_path, 'r') lines = file.readlines() file.close() # words: list([op, ed, word]) words = [] for line in lines: _op, _ed, word = line.strip...
Get sentence def sentence(self, padding=75): """ Get sentence """ vec = word_to_vector(self.sentence_str) vec += [-1] * (padding - self.sentence_length) return np.array(vec, dtype=np.int32)
Get words def word(self, _id, padding=75): """ Get words """ word = self.words[_id][2] vec = word_to_vector(word) vec += [-1] * (padding - len(vec)) return np.array(vec, dtype=np.int32)
Get the position of words def word_frame_pos(self, _id): """ Get the position of words """ left = int(self.words[_id][0]/1000) right = max(left+1, int(self.words[_id][1]/1000)) return (left, right)
Prepares the module for processing a data batch by pulling row_sparse parameters from kvstore to all devices based on rowids. Parameters ---------- param_rowids : dict of str to NDArray or list of NDArrays def prepare_sparse_params(self, param_rowids): '''Prepares the module fo...
Saves model parameters to file. Parameters ---------- fname : str Path to output param file. Examples -------- >>> # An example of saving module parameters. >>> mod.save_params('myfile') def save_params(self, fname): """Saves model parameters ...
Copy data from kvstore to `arg_params` and `aux_params`. Parameters ---------- arg_params : list of NDArray Target parameter arrays. aux_params : list of NDArray Target aux arrays. Notes ----- - This function will inplace update the NDArray...
Clips gradient norm. The norm is computed over all gradients together, as if they were concatenated into a single vector. Gradients are modified in-place. The method is first used in `[ICML2013] On the difficulty of training recurrent neural networks` Note that the gradients...
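The row above does not name the exact function; a sketch assuming it corresponds to `mxnet.gluon.utils.clip_global_norm`, which rescales the arrays in place when their joint 2-norm exceeds `max_norm`: >>> import mxnet as mx >>> from mxnet import gluon >>> grads = [mx.nd.ones((2, 2)) * 10, mx.nd.ones((3,)) * 10] >>> total_norm = gluon.utils.clip_global_norm(grads, max_norm=1.0)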
Rescale the gradient of provided parameters by a certain scale def rescale_grad(self, scale=None, param_name=None): """ Rescale the gradient of provided parameters by a certain scale """ if scale is None or param_name is None: return param_idx = self._exec_group.param_names.index(pa...
builds factorization machine network with proper formulation: y = w_0 + \sum_i(x_i w_i) + 0.5(\sum_i\sum_j<v_i,v_j>x_i x_j - \sum_i<v_i,v_i>x_i^2) def factorization_machine_model(factor_size, num_features, lr_mult_config, wd_mult_config, init_config): """ builds factorization machine networ...
Reshape data into (num_example, batch_size) def batchify(data, batch_size): """Reshape data into (num_example, batch_size)""" nbatch = data.shape[0] // batch_size data = data[:nbatch * batch_size] data = data.reshape((batch_size, nbatch)).T return data
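A short usage sketch for the `batchify` helper shown above, with a 1-D array of 10 tokens and batch_size=2 (the trailing remainder is dropped): >>> import numpy as np >>> data = np.arange(10) >>> batchify(data, batch_size=2).shape (5, 2)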
Tokenizes a text file. def tokenize(self, path): """Tokenizes a text file.""" assert os.path.exists(path) # Add words to the dictionary with open(path, 'r') as f: tokens = 0 for line in f: words = line.split() + ['<eos>'] tokens +=...
Build docstring for symbolic functions. def _build_doc(func_name, desc, arg_names, arg_types, arg_desc, key_var_num_args=None, ret_type=None): """Build docstring for symbolic functions.""" param_str = _build_param_doc(arg...
Get user friendly information of the output shapes. def get_output_shape(sym, **input_shapes): """Get user friendly information of the output shapes.""" _, s_outputs, _ = sym.infer_shape(**input_shapes) return dict(zip(sym.list_outputs(), s_outputs))
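A usage sketch for the helper above; the symbol names 'data' and 'fc' are only illustrative: >>> import mxnet as mx >>> data = mx.sym.Variable('data') >>> net = mx.sym.FullyConnected(data=data, num_hidden=10, name='fc') >>> get_output_shape(net, data=(2, 5)) {'fc_output': (2, 10)}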
Query CUDA for the number of GPUs present. Raises ------ Will raise an exception on any CUDA error. Returns ------- count : int The number of GPUs. def num_gpus(): """Query CUDA for the number of GPUs present. Raises ------ Will raise an exception on any CUDA error. ...
Query CUDA for the free and total bytes of GPU global memory. Parameters ---------- device_id : int, optional The device id of the GPU device. Raises ------ Will raise an exception on any CUDA error. Returns ------- (free, total) : (int, int) The free and total GPU memory, in bytes. d...
Returns the current context. By default, `mx.cpu()` is used for all the computations and it can be overridden by using `with mx.Context(x)` statement where x can be cpu(device_id) or gpu(device_id). Examples ------- >>> mx.current_context() cpu(0) >>> with mx.Context('gpu', 1): # Cont...
Populates synsets - a map of index to label for the data items. Populates the data in the dataset, making tuples of (data, label) def _list_audio_files(self, root, skip_rows=0): """Populates synsets - a map of index to label for the data items. Populates the data in the dataset, making tuples o...
Returns a new dataset with the first element of each sample transformed by the transformer function `fn`. This is useful, for example, when you only want to transform data while keeping the label as is. lazy=False is passed to transform_first for the dataset so that all transforms could be perf...
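A minimal sketch of `transform_first` on a Gluon `ArrayDataset` (the dataset and transform here are illustrative): >>> import mxnet as mx >>> from mxnet import gluon >>> dataset = gluon.data.ArrayDataset(mx.nd.arange(6).reshape((3, 2)), mx.nd.array([0, 1, 0])) >>> scaled = dataset.transform_first(lambda x: x * 2) # label is left unchanged >>> data, label = scaled[0]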
Try to configure cython and return cython configuration def config_cython(): """Try to configure cython and return cython configuration""" if not with_cython: return [] # pylint: disable=unreachable if os.name == 'nt': print("WARNING: Cython is not supported on Windows, will compile wit...
Compose symbol on inputs. This call mutates the current symbol. Parameters ---------- args: provide positional arguments kwargs: provide keyword arguments Returns ------- the resulting symbol def _compose(self, *args, **kwargs)...
Set the attribute of the symbol. Parameters ---------- **kwargs The attributes to set def _set_attr(self, **kwargs): """Set the attribute of the symbol. Parameters ---------- **kwargs The attributes to set """ keys = c_st...
Configuration factory for various networks Parameters ---------- network : str base network name, such as vgg_reduced, inceptionv3, resnet... data_shape : int input data dimension kwargs : dict extra arguments def get_config(network, data_shape, **kwargs): """Configurat...
Wrapper for getting the training symbol Parameters ---------- network : str name for the base network symbol data_shape : int input shape kwargs : dict see symbol_builder.get_symbol_train for more details def get_symbol_train(network, data_shape, **kwargs): """Wrapper for get ...
Set the trainer this parameter is associated with. def _set_trainer(self, trainer): """ Set the trainer this parameter is associated with. """ # trainer cannot be replaced for sparse params if self._stype != 'default' and self._trainer and trainer and self._trainer is not trainer: r...
Get row_sparse data from row_sparse parameters based on row_id. def _get_row_sparse(self, arr_list, ctx, row_id): """ Get row_sparse data from row_sparse parameters based on row_id. """ # get row sparse params based on row ids if not isinstance(row_id, ndarray.NDArray): raise TypeEr...
(Re)initializes by loading from data. def _load_init(self, data, ctx): """(Re)initializes by loading from data.""" if self.shape: for self_dim, data_dim in zip(self.shape, data.shape): assert self_dim in (0, data_dim), \ "Failed loading Parameter '%s' fro...
Finishes deferred initialization. def _finish_deferred_init(self): """Finishes deferred initialization.""" if not self._deferred_init: return init, ctx, default_init, data = self._deferred_init self._deferred_init = () assert self.shape is not None and np.prod(self.s...
Sets data and grad. def _init_impl(self, data, ctx_list): """Sets data and grad.""" self._ctx_list = list(ctx_list) self._ctx_map = [[], []] for i, ctx in enumerate(self._ctx_list): dev_list = self._ctx_map[ctx.device_typeid&1] while len(dev_list) <= ctx.device_i...
Initialize grad buffers. def _init_grad(self): """Initialize grad buffers.""" if self.grad_req == 'null': self._grad = None return self._grad = [ndarray.zeros(shape=i.shape, dtype=i.dtype, ctx=i.context, stype=self._grad_stype) for i ...
Reduce data from multiple context to cpu. def _reduce(self): """Reduce data from multiple context to cpu.""" ctx = context.cpu() if self._stype == 'default': block = self.list_data() data = ndarray.add_n(*(w.copyto(ctx) for w in block)) / len(block) else: ...
Initializes parameter and gradient arrays. Only used for :py:class:`NDArray` API. Parameters ---------- init : Initializer The initializer to use. Overrides :py:meth:`Parameter.init` and default_init. ctx : Context or list of Context, defaults to :py:meth:`context.current_co...
Re-assign Parameter to other contexts. Parameters ---------- ctx : Context or list of Context, default ``context.current_context()``. Assign Parameter to given context. If ctx is a list of Context, a copy will be made for each context. def reset_ctx(self, ctx): ...
Sets this parameter's value on all contexts. def set_data(self, data): """Sets this parameter's value on all contexts.""" self.shape = data.shape if self._data is None: assert self._deferred_init, \ "Parameter '%s' has not been initialized"%self.name sel...
Returns a copy of the 'row_sparse' parameter on the same context as row_id's. The copy only retains rows whose ids occur in provided row ids. The parameter must have been initialized on this context before. Parameters ---------- row_id: NDArray Row ids to retain for ...
Returns copies of the 'row_sparse' parameter on all contexts, in the same order as creation. The copy only retains rows whose ids occur in provided row ids. The parameter must have been initialized before. Parameters ---------- row_id: NDArray Row ids to retain for t...
Returns a copy of this parameter on one context. Must have been initialized on this context before. For sparse parameters, use :py:meth:`Parameter.row_sparse_data` instead. Parameters ---------- ctx : Context Desired context. Returns ------- ...
Returns copies of this parameter on all contexts, in the same order as creation. For sparse parameters, use :py:meth:`Parameter.list_row_sparse_data` instead. Returns ------- list of NDArrays def list_data(self): """Returns copies of this parameter on all contexts, in t...
Returns a gradient buffer for this parameter on one context. Parameters ---------- ctx : Context Desired context. def grad(self, ctx=None): """Returns a gradient buffer for this parameter on one context. Parameters ---------- ctx : Context ...
Returns gradient buffers on all contexts, in the same order as :py:meth:`values`. def list_grad(self): """Returns gradient buffers on all contexts, in the same order as :py:meth:`values`.""" if self._data is not None and self._grad is None: raise RuntimeError( ...
Returns a list of contexts this parameter is initialized on. def list_ctx(self): """Returns a list of contexts this parameter is initialized on.""" if self._data is None: if self._deferred_init: return self._deferred_init[1] raise RuntimeError("Parameter '%s' has...
Sets gradient buffer on all contexts to 0. No action is taken if parameter is uninitialized or doesn't require gradient. def zero_grad(self): """Sets gradient buffer on all contexts to 0. No action is taken if parameter is uninitialized or doesn't require gradient.""" if self._grad is N...
Returns a symbol representing this parameter. def var(self): """Returns a symbol representing this parameter.""" if self._var is None: self._var = symbol.var(self.name, shape=self.shape, dtype=self.dtype, lr_mult=self.lr_mult, wd_mult=self.wd_mult, ...
Cast data and gradient of this Parameter to a new data type. Parameters ---------- dtype : str or numpy.dtype The new data type. def cast(self, dtype): """Cast data and gradient of this Parameter to a new data type. Parameters ---------- dtype : str...
Retrieves a :py:class:`Parameter` with name ``self.prefix+name``. If not found, :py:func:`get` will first try to retrieve it from "shared" dict. If still not found, :py:func:`get` will create a new :py:class:`Parameter` with key-word arguments and insert it to self. Parameters -...
Retrieves a :py:class:`.Constant` with name ``self.prefix+name``. If not found, :py:func:`get` will first try to retrieve it from "shared" dict. If still not found, :py:func:`get` will create a new :py:class:`.Constant` with key-word arguments and insert it to self. Parameters -...
Copies all Parameters in ``other`` to self. def update(self, other): """Copies all Parameters in ``other`` to self.""" for k, v in other.items(): if k in self._params: assert self._params[k] is v, \ "Cannot update self with other because they have differe...
Initializes all Parameters managed by this dictionary to be used for :py:class:`NDArray` API. It has no effect when using :py:class:`Symbol` API. Parameters ---------- init : Initializer Global default Initializer to be used when :py:meth:`Parameter.init` is ``None``. ...
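A hedged sketch of initializing a block's ParameterDict for imperative (NDArray) use; `Dense` is used only as an example block: >>> import mxnet as mx >>> from mxnet import gluon >>> net = gluon.nn.Dense(3) >>> net.collect_params().initialize(mx.init.Xavier(), ctx=mx.cpu()) >>> net(mx.nd.ones((1, 4))) # first forward call completes deferred shape inference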
Set an attribute to a new value for all Parameters. For example, set grad_req to null if you don't need gradient w.r.t a model's Parameters:: model.collect_params().setattr('grad_req', 'null') or change the learning rate multiplier:: model.collect_params().setattr('lr...
Save parameters to file. Parameters ---------- filename : str Path to parameter file. strip_prefix : str, default '' Strip prefix from parameter names before saving. def save(self, filename, strip_prefix=''): """Save parameters to file. Paramete...
Load parameters from file. Parameters ---------- filename : str Path to parameter file. ctx : Context or list of Context Context(s) initialize loaded parameters on. allow_missing : bool, default False Whether to silently skip loading parameter...
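A short sketch pairing the save/load calls described in the two rows above, reusing the initialized `net` from the sketch after the initialize row and a hypothetical file name: >>> net.collect_params().save('net.params') >>> net.collect_params().load('net.params', ctx=mx.cpu())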
Create a Torch function from the FunctionHandle. def _make_torch_function(handle): """Create a Torch function from the FunctionHandle.""" # Get the property of function n_used_vars = mx_uint() n_scalars = mx_uint() n_mutate_vars = mx_uint() type_mask = ctypes.c_int() check_call(_LIB.MXFuncD...
List and add all the torch backed ndarray functions to current module. def _init_torch_module(): """List and add all the torch backed ndarray functions to current module.""" plist = ctypes.POINTER(FunctionHandle)() size = ctypes.c_uint() check_call(_LIB.MXListFunctions(ctypes.byref(size), ...
r"""Inception v3 model from `"Rethinking the Inception Architecture for Computer Vision" <http://arxiv.org/abs/1512.00567>`_ paper. Parameters ---------- pretrained : bool, default False Whether to load the pretrained weights for model. ctx : Context, default CPU The context in ...
Pack a string into MXImageRecord. Parameters ---------- header : IRHeader Header of the image record. ``header.label`` can be a number or an array. See more detail in ``IRHeader``. s : str Raw image string to be packed. Returns ------- s : str The packed str...
Unpack a MXImageRecord to string. Parameters ---------- s : str String buffer from ``MXRecordIO.read``. Returns ------- header : IRHeader Header of the image record. s : str Unpacked string. Examples -------- >>> record = mx.recordio.MXRecordIO('test.re...
Unpack a MXImageRecord to image. Parameters ---------- s : str String buffer from ``MXRecordIO.read``. iscolor : int Image format option for ``cv2.imdecode``. Returns ------- header : IRHeader Header of the image record. img : numpy.ndarray Unpacked imag...
Pack an image into ``MXImageRecord``. Parameters ---------- header : IRHeader Header of the image record. ``header.label`` can be a number or an array. See more detail in ``IRHeader``. img : numpy.ndarray Image to be packed. quality : int Quality for JPEG encoding in...
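A hedged sketch of packing an in-memory image with the `IRHeader`/`pack_img` pair described above (JPEG encoding requires OpenCV to be installed): >>> import numpy as np >>> import mxnet as mx >>> header = mx.recordio.IRHeader(flag=0, label=1.0, id=0, id2=0) >>> img = np.zeros((32, 32, 3), dtype=np.uint8) >>> packed = mx.recordio.pack_img(header, img, quality=95, img_fmt='.jpg')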
Opens the record file. def open(self): """Opens the record file.""" if self.flag == "w": check_call(_LIB.MXRecordIOWriterCreate(self.uri, ctypes.byref(self.handle))) self.writable = True elif self.flag == "r": check_call(_LIB.MXRecordIOReaderCreate(self.uri, ...
Check process id to ensure integrity, reset if in new process. def _check_pid(self, allow_reset=False): """Check process id to ensure integrity, reset if in new process.""" if not self.pid == current_process().pid: if allow_reset: self.reset() else: ...
Closes the record file. def close(self): """Closes the record file.""" if not self.is_open: return if self.writable: check_call(_LIB.MXRecordIOWriterFree(self.handle)) else: check_call(_LIB.MXRecordIOReaderFree(self.handle)) self.is_open = Fal...
Inserts a string buffer as a record. Examples --------- >>> record = mx.recordio.MXRecordIO('tmp.rec', 'w') >>> for i in range(5): ... record.write('record_%d'%i) >>> record.close() Parameters ---------- buf : string (python2), bytes (python3)...
Returns record as a string. Examples --------- >>> record = mx.recordio.MXRecordIO('tmp.rec', 'r') >>> for i in range(5): ... item = record.read() ... print(item) record_0 record_1 record_2 record_3 record_4 >>> recor...
Closes the record file. def close(self): """Closes the record file.""" if not self.is_open: return super(MXIndexedRecordIO, self).close() self.fidx.close()
Sets the current read pointer position. This function is called internally by `read_idx(idx)` to position the read pointer at the record with index `idx`. It doesn't return anything. def seek(self, idx): """Sets the current read pointer position. This function is internally called by `read_idx(idx)` ...
Returns the current position of write head. Examples --------- >>> record = mx.recordio.MXIndexedRecordIO('tmp.idx', 'tmp.rec', 'w') >>> print(record.tell()) 0 >>> for i in range(5): ... record.write_idx(i, 'record_%d'%i) ... print(record.tell()) ...
Inserts input record at given index. Examples --------- >>> for i in range(5): ... record.write_idx(i, 'record_%d'%i) >>> record.close() Parameters ---------- idx : int Index of a file. buf : Record to write. def writ...
Add new metrics as new columns to the selected pandas dataframe. Parameters ---------- dataframe : pandas.DataFrame The dataframe to be modified. metrics : metric.EvalMetric New metrics to be added. def _add_new_columns(dataframe, metrics): """Add new metrics as new columns ...
Generates callback arguments for model.fit() for a set of callback objects. Callback objects like PandasLogger(), LiveLearningCurve() get passed in. This assembles all their callback arguments. def args_wrapper(*args): """Generates callback arguments for model.fit() for a set of callback objects. ...
Append new metrics to selected dataframes. Parameters ---------- metrics : metric.EvalMetric New metrics to be added. df_name : str Name of the dataframe to be modified. def append_metrics(self, metrics, df_name): """Append new metrics to selected datafr...
Callback function for training. def train_cb(self, param): """Callback function for training. """ if param.nbatch % self.frequent == 0: self._process_batch(param, 'train')
Update the selected dataframe after a completed batch. Parameters ---------- dataframe : pandas.DataFrame The dataframe to be modified. def _process_batch(self, param, dataframe): """Update the selected dataframe after a completed batch ...
Callback function after each epoch. It records each epoch's elapsed time and appends it to the epoch dataframe. def epoch_cb(self): """Callback function after each epoch. It records each epoch's elapsed time and appends it to the epoch dataframe. """ metrics = {} metrics['elapsed'] = self.ela...
Render the plot with bokeh.io and push to notebook. def _push_render(self): """Render the plot with bokeh.io and push to notebook. """ bokeh.io.push_notebook(handle=self.handle) self.last_update = time.time()
Update the selected dataframe after a completed batch. Parameters ---------- df_name : str Name of the dataframe to be modified. def _process_batch(self, param, df_name): """Update the selected dataframe after a completed batch Parameters ---------- df...
:param nested_list: list of lists of strings :return: dictionary mapping from string to int, and the inverse of that dictionary def build_vocab(nested_list): """ :param nested_list: list of lists of strings :return: dictionary mapping from string to int, and the inverse of that dictionary """ # Build vocabulary ...
Reads a csv of sentences/tag sequences into a pandas dataframe. Converts it into X = array(list(int)) & Y = array(list(int)). Splits into training and test sets. Builds dictionaries mapping from index labels to labels / indexed features to features. :param data_dir: directory to read in csv data from :para...
Build NN symbol depending on the length of the input sequence def sym_gen(seq_len): """ Build NN symbol depending on the length of the input sequence """ sentence_shape = train_iter.provide_data[0][1] char_sentence_shape = train_iter.provide_data[1][1] entities_shape = train_iter.provide_label[...
Draw random samples from an approximately log-uniform or Zipfian distribution. This operation randomly samples *num_sampled* candidates from the range of integers [0, range_max). The elements of sampled_candidates are drawn with replacement from the base distribution. The base distribution for this operator is...
Run a for loop with user-defined computation over Symbols on dimension 0. This operator simulates a for loop, and `body` has the computation for one iteration of the for loop. It runs the computation in `body` on each slice from the input NDArrays. `body` takes two arguments as input and outputs a tuple of tw...
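The row above documents the symbolic form; a sketch using the imperative counterpart `mx.nd.contrib.foreach` (an assumption about which contrib operator this is), accumulating a running sum over dimension 0: >>> import mxnet as mx >>> def step(data, states): ... new_state = states[0] + data # add the current slice to the running sum ... return new_state, [new_state] >>> data = mx.nd.arange(6).reshape((3, 2)) >>> outs, states = mx.nd.contrib.foreach(step, data, [mx.nd.zeros((2,))])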
Run a while loop with user-defined computation and loop condition. This operator simulates a while loop which iteratively does customized computation as long as the condition is satisfied. `loop_vars` is a Symbol or nested lists of Symbols that the computation uses. `cond` is a user-defined functio...
Run an if-then-else using a user-defined condition and computation. This operator simulates an if-like branch which chooses to do one of the two customized computations according to the specified condition. `pred` is a scalar MXNet Symbol, indicating which branch of computation should be used. `then_...
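A small sketch of the imperative counterpart `mx.nd.contrib.cond` (assumed here; the row documents the symbolic form), where both branches are zero-argument callables: >>> import mxnet as mx >>> a, b = mx.nd.array([1.0]), mx.nd.array([2.0]) >>> out = mx.nd.contrib.cond(a < b, lambda: a + 5, lambda: b * 3)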
Indexes unknown and reserved tokens. def _index_unknown_and_reserved_tokens(self, unknown_token, reserved_tokens): """Indexes unknown and reserved tokens.""" self._unknown_token = unknown_token # Thus, constants.UNKNOWN_IDX must be 0. self._idx_to_token = [unknown_token] if re...
Indexes keys of `counter`. Indexes keys of `counter` according to frequency thresholds such as `most_freq_count` and `min_freq`. def _index_counter_keys(self, counter, unknown_token, reserved_tokens, most_freq_count, min_freq): """Indexes keys of `counter`. ...
Converts tokens to indices according to the vocabulary. Parameters ---------- tokens : str or list of strs A source token or tokens to be converted. Returns ------- int or list of ints A token index or a list of token indices according to the v...
Converts token indices to tokens according to the vocabulary. Parameters ---------- indices : int or list of ints A source token index or token indices to be converted. Returns ------- str or list of strs A token or a list of tokens according t...
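The two rows above appear to describe the `mxnet.contrib.text` vocabulary methods; a hedged sketch under that assumption: >>> import collections >>> from mxnet.contrib import text >>> counter = collections.Counter(['hello', 'world', 'hello']) >>> vocab = text.vocab.Vocabulary(counter) >>> vocab.to_indices(['hello', 'world']) # e.g. [1, 2]; index 0 is reserved for the unknown token >>> vocab.to_tokens([1, 2])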
Create an io iterator by handle. def _make_io_iterator(handle): """Create an io iterator by handle.""" name = ctypes.c_char_p() desc = ctypes.c_char_p() num_args = mx_uint() arg_names = ctypes.POINTER(ctypes.c_char_p)() arg_types = ctypes.POINTER(ctypes.c_char_p)() arg_descs = ctypes.POINTE...
List and add all the data iterators to current module. def _init_io_module(): """List and add all the data iterators to current module.""" plist = ctypes.POINTER(ctypes.c_void_p)() size = ctypes.c_uint() check_call(_LIB.MXListDataIters(ctypes.byref(size), ctypes.byref(plist))) module_obj = sys.modu...
Get DataDesc list from attribute lists. Parameters ---------- shapes : a tuple of (name_, shape_) types : a tuple of (name_, np.dtype) def get_list(shapes, types): """Get DataDesc list from attribute lists. Parameters ---------- shapes : a tuple of (na...
Get next data batch from iterator. Returns ------- DataBatch The data of next batch. Raises ------ StopIteration If the end of the data is reached. def next(self): """Get next data batch from iterator. Returns ------- ...
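A minimal sketch of pulling one batch from a built-in iterator (`NDArrayIter` is used for illustration): >>> import numpy as np >>> import mxnet as mx >>> it = mx.io.NDArrayIter(np.arange(20).reshape((10, 2)), batch_size=5) >>> batch = it.next() >>> batch.data[0].shape (5, 2)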