Returns the implicit feature associated with the given implicit value.
def implied_feature (implicit_value):
""" Returns the implicit feature associated with the given implicit value.
"""
assert isinstance(implicit_value, basestring)
components = implicit_value.split('-')
if components[0] not in _... |
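The row above truncates before the registry lookup. A minimal sketch of the mechanics, assuming a small illustrative registry (`_IMPLICIT_VALUES` is invented for this example; the real module keeps a global table of implicit values):

```python
# Illustrative sketch: map an implicit value such as 'gcc-2.95.2' to its
# feature via the first dash-separated component. The registry below is
# invented for the example, not the module's real table.
_IMPLICIT_VALUES = {"gcc": "<toolset>", "msvc": "<toolset>"}

def implied_feature(implicit_value):
    component = implicit_value.split("-")[0]
    if component not in _IMPLICIT_VALUES:
        raise ValueError(
            "'%s' is not a value of an implicit feature" % implicit_value)
    return _IMPLICIT_VALUES[component]
```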
Checks that the given name is a valid feature; otherwise, raises an exception.
def validate_feature (name):
""" Checks that the given name is a valid feature; otherwise, raises an exception.
"""
assert isinstance(name, basestring)
if name not in __all_features:
raise InvalidFeature ("'%s' is not a valid featur... |
Helper for expand_subfeatures.
Given a feature and value, or just a value corresponding to an
implicit feature, returns a property set consisting of all component
subfeatures and their values. For example:
expand_subfeatures <toolset>gcc-2.95.2-linux-x86
-> <toolset>gcc ... |
Make all elements of properties corresponding to implicit features
explicit, and express all subfeature values as separate properties
in their own right. For example, the property
gcc-2.95.2-linux-x86
might expand to
<toolset>gcc <toolset-version>2.95.2 <toolset-os>linux <toolset-cpu>x86
... |
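The expansion in the example can be sketched mechanically: split the combined value on dashes and pair the pieces with the toolset's subfeatures. The fixed subfeature list below is an assumption for illustration; the real code derives it per feature:

```python
# Pair dash-separated components with subfeature names. This fixed list is
# an assumption for the example only.
_SUBFEATURES = ["toolset", "toolset-version", "toolset-os", "toolset-cpu"]

def expand_combined_value(value):
    components = value.split("-")
    return ["<%s>%s" % (f, v) for f, v in zip(_SUBFEATURES, components)]
```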
Adds the given values to the given feature.
def extend (name, values):
""" Adds the given values to the given feature.
"""
assert isinstance(name, basestring)
assert is_iterable_typed(values, basestring)
name = add_grist (name)
__validate_feature (name)
feature = __all_features [name]
... |
Checks that value-string is a valid value-string for the given feature.
def validate_value_string (f, value_string):
""" Checks that value-string is a valid value-string for the given feature.
"""
assert isinstance(f, Feature)
assert isinstance(value_string, basestring)
if f.free or value_string in... |
Declares a subfeature.
feature_name: Root feature that is not a subfeature.
value_string: An optional value-string specifying which feature or
subfeature values this subfeature is specific to,
if any.
subfeature: The name of the subfeature ... |
Sets the components of the given composite property.
All parameters are <feature>value strings
def compose (composite_property_s, component_properties_s):
""" Sets the components of the given composite property.
All parameters are <feature>value strings
"""
from . import property
component_p... |
Returns all values of the given feature specified by the given property set.
def get_values (feature, properties):
""" Returns all values of the given feature specified by the given property set.
"""
if feature[0] != '<':
feature = '<' + feature + '>'
result = []
for p in properties:
i... |
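The visible part of `get_values` normalizes the grist before the loop is cut off. A self-contained sketch, reconstructed on the assumption that the truncated tail does a simple prefix match:

```python
def get_values(feature, properties):
    # Wrap bare feature names in grist, then collect matching values.
    if not feature.startswith("<"):
        feature = "<" + feature + ">"
    return [p[len(feature):] for p in properties if p.startswith(feature)]
```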
Expand all composite properties in the set so that all components
are explicitly expressed.
def expand_composites (properties):
""" Expand all composite properties in the set so that all components
are explicitly expressed.
"""
if __debug__:
from .property import Property
as... |
Return true iff f is an ordinary subfeature of the parent_property's
feature, or if f is a subfeature of the parent_property's feature
specific to the parent_property's value.
def is_subfeature_of (parent_property, f):
""" Return true iff f is an ordinary subfeature of the parent_property's
... |
As is_subfeature_of, for subproperties.
def __is_subproperty_of (parent_property, p):
""" As is_subfeature_of, for subproperties.
"""
if __debug__:
from .property import Property
assert isinstance(parent_property, Property)
assert isinstance(p, Property)
return is_subfeature_of ... |
Given a property set which may consist of composite and implicit
properties and combined subfeature values, returns an expanded,
normalized property set with all implicit features expressed
explicitly, all subfeature values individually expressed, and all
components of composite properti... |
Given a set of properties, add default values for features not
represented in the set.
Note: if there is an ordinary feature F1 and a composite feature
F2, which includes some value for F1, and both features have default values,
then the default value of F1 will be added, not the value ... |
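The defaulting behaviour described in the note can be sketched with an illustrative feature table; the table contents and the grist parsing are assumptions made for the example:

```python
# Illustrative defaults table; the real module stores Feature objects.
_DEFAULTS = {"<toolset>": "gcc", "<variant>": "debug"}

def add_defaults(properties):
    # Features already present keep their value; missing ones get defaults.
    present = set(p[:p.index(">") + 1] for p in properties)
    result = list(properties)
    for feature in sorted(_DEFAULTS):
        if feature not in present:
            result.append(feature + _DEFAULTS[feature])
    return result
```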
Given an expanded property set, eliminate all redundancy: properties
which are elements of other (composite) properties in the set will
be eliminated. Non-symmetric properties equal to default values will be
eliminated, unless they override a value from some composite property.
Implicit p... |
Given a property-set of the form
v1/v2/...vN-1/<fN>vN/<fN+1>vN+1/...<fM>vM
Returns
v1 v2 ... vN-1 <fN>vN <fN+1>vN+1 ... <fM>vM
Note that vN...vM may contain slashes. This is resilient to the
substitution of backslashes for slashes, since Jam, unbidden,
sometimes swaps slash direction o... |
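The split described above can be sketched directly: cut on either slash direction, then glue ungristed pieces back onto a preceding gristed token so gristed values keep their embedded slashes. This is a sketch of the stated contract, not the original implementation:

```python
import re

# Sketch of the property-set split: '/' or '\' separates tokens, but a
# piece without grist that follows a gristed token is part of its value.
def split_properties(s):
    result = []
    for piece in re.split(r"[\\/]", s):
        if result and result[-1].startswith("<") and not piece.startswith("<"):
            result[-1] += "/" + piece
        else:
            result.append(piece)
    return result
```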
Combine all subproperties into their parent properties
Requires: for every subproperty, there is a parent property. All
features are explicitly expressed.
This rule probably shouldn't be needed, but
build-request.expand-no-defaults is being abused for unintended
purposes and i... |
Given a property, return the subset of features consisting of all
ordinary subfeatures of the property's feature, and all specific
subfeatures of the property's feature which are conditional on the
property's value.
def __select_subfeatures (parent_property, features):
""" Given a property,... |
Retrieves the interpretation function used.
def _get_interpretation_function(interpretation, dtype):
"""
Retrieves the interpretation function used.
"""
type_string = dtype.__name__
name = "%s__%s" % (interpretation, type_string)
global _interpretations
if not hasattr(_interpretations, n... |
Returns the description and output type for a given interpretation.
def _get_interpretation_description_and_output_type(interpretation, dtype):
"""
Returns the description and output type for a given interpretation.
"""
type_string = dtype.__name__
name = "%s__%s" % (interpretation, type_string)
... |
Returns a list of the available interpretations and what they do.
If indent is specified, then the entire doc string is indented by that amount.
def _get_embeddable_interpretation_doc(indent = 0):
"""
Returns a list of the available interpretations and what they do.
If indent is specified, then the e... |
A function to load a previously saved SentenceSplitter instance.
Parameters
----------
unpickler : GLUnpickler
A GLUnpickler file handler.
version : int
Version number maintained by the class writer.
def _load_version(cls, unpickler, version):
"""
... |
Fits the transformer using the given data.
def fit(self, data):
"""
Fits the transformer using the given data.
"""
_raise_error_if_not_sframe(data, "data")
fitted_state = {}
feature_columns = _internal_utils.get_column_names(data, self._exclude, self._features)
... |
Transforms the data.
def transform(self, data):
"""
Transforms the data.
"""
if not self._get("fitted"):
raise RuntimeError("`transform` called before `fit` or `fit_transform`.")
data = data.copy()
output_column_prefix = self._get("output_column_prefix")
... |
Transforms short text into a dictionary of TFIDF-weighted 3-gram
character counts.
def short_text__str(self, column_name, output_column_prefix):
"""
Transforms short text into a dictionary of TFIDF-weighted 3-gram
character counts.
"""
from ._ngram_counter import NGramC... |
Interprets an integer column as a categorical variable.
def categorical__int(self, column_name, output_column_prefix):
"""
Interprets an integer column as a categorical variable.
"""
return [_ColumnFunctionTransformation(
features = [column_name],
output_column_... |
Sets up the content transforms.
def _setup_from_data(self, data):
"""
Sets up the content transforms.
"""
fitted_state = {}
_raise_error_if_not_of_type(data, [_SFrame])
feature_columns = _internal_utils.get_column_names(data, self._exclude, self._features)
if... |
Fits a transformer using the SFrame `data`.
Parameters
----------
data : SFrame
The data used to fit the transformer.
Returns
-------
self (A fitted object)
See Also
--------
transform, fit_transform
def fit(self, data):
"""... |
Fits and transforms the SFrame `data` using a fitted model.
Parameters
----------
data : SFrame
The data to be transformed.
Returns
-------
out: SFrame
A transformed SFrame.
... |
Transform the SFrame `data` using a fitted model.
Parameters
----------
data : SFrame
The data to be transformed.
Returns
-------
out: SFrame
A transformed SFrame.
See Also
... |
Returns a structured description of the model, including (where relevant)
the schema of the training data, description of the training data,
training statistics, and model hyperparameters.
Returns
-------
sections : list (of list of tuples)
A list of summary sections... |
Save the model as a directory, which can be loaded with the
:py:func:`~turicreate.load_model` method.
Parameters
----------
pickler : GLPickler
An opened GLPickle archive (Do not close the archive).
See Also
--------
turicreate.load_model
Ex... |
Create a new mock object.
Args:
# class_to_mock: the class to be mocked
class_to_mock: class
Returns:
MockObject that can be used as the class_to_mock would be.
def CreateMock(self, class_to_mock):
"""Create a new mock object.
Args:
# class_to_mock: the class to be mocked
... |
Replace a method, attribute, etc. with a Mock.
This will replace a class or module with a MockObject, and everything else
(method, function, etc) with a MockAnything. This can be overridden to
always use a MockAnything by setting use_mock_anything to True.
Args:
obj: A Python object (class, mod... |
Verify that all of the expected calls have been made.
Raises:
ExpectedMethodCallsError: if there are still more method calls in the
expected queue.
def _Verify(self):
"""Verify that all of the expected calls have been made.
Raises:
ExpectedMethodCallsError: if there are still more met... |
Verify the called method is expected.
This can be an ordered method, or part of an unordered set.
Returns:
The expected mock method.
Raises:
UnexpectedMethodCall if the method called was not expected.
def _VerifyMethodCall(self):
"""Verify the called method is expected.
This can be ... |
Returns a possible group from the end of the call queue or None if no
other methods are on the stack.
def GetPossibleGroup(self):
"""Returns a possible group from the end of the call queue or None if no
other methods are on the stack.
"""
# Remove this method from the tail of the queue so we can a... |
Checks if the last method (a possible group) is an instance of our
group_class. Adds the current method to this group or creates a new one.
Args:
group_name: the name of the group.
group_class: the class used to create instance of this new group
def _CheckAndCreateNewGroup(self, group_name, group... |
Check to see if the RHS is an instance of class_name.
Args:
# rhs: the right hand side of the test
rhs: object
Returns:
bool
def equals(self, rhs):
"""Check to see if the RHS is an instance of class_name.
Args:
# rhs: the right hand side of the test
rhs: object
Ret... |
Check to see if RHS is almost equal to float_value
Args:
rhs: the value to compare to float_value
Returns:
bool
def equals(self, rhs):
"""Check to see if RHS is almost equal to float_value
Args:
rhs: the value to compare to float_value
Returns:
bool
"""
try:
... |
Check whether the given key/value pair is in the rhs dict.
Returns:
bool
def equals(self, rhs):
"""Check whether the given key/value pair is in the rhs dict.
Returns:
bool
"""
try:
return rhs[self._key] == self._value
except Exception:
return False |
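The `equals` body above is visible in full; wrapped in a minimal standalone class (the constructor is assumed from context), it behaves like this:

```python
class ContainsKeyValue(object):
    # Constructor assumed from context: remember the key/value to look for.
    def __init__(self, key, value):
        self._key = key
        self._value = value

    def equals(self, rhs):
        try:
            return rhs[self._key] == self._value
        except Exception:
            # Missing key, or rhs is not a mapping at all.
            return False
```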
Check to see whether actual_seq has same elements as expected_seq.
Args:
actual_seq: sequence
Returns:
bool
def equals(self, actual_seq):
"""Check to see whether actual_seq has same elements as expected_seq.
Args:
actual_seq: sequence
Returns:
bool
"""
try:
... |
Checks whether any Comparator is equal to rhs.
Args:
# rhs: can be anything
Returns:
bool
def equals(self, rhs):
"""Checks whether any Comparator is equal to rhs.
Args:
# rhs: can be anything
Returns:
bool
"""
for comparator in self._comparators:
if compar... |
Remove a method call from the group.
If the method is not in the set, an UnexpectedMethodCallError will be
raised.
Args:
mock_method: a mock method that should be equal to a method in the group.
Returns:
The mock method from the group
Raises:
UnexpectedMethodCallError if the mo... |
Remove a method call from the group.
If the method is not in the set, an UnexpectedMethodCallError will be
raised.
Args:
mock_method: a mock method that should be equal to a method in the group.
Returns:
The mock method from the group
Raises:
UnexpectedMethodCallError if the mo... |
Return True if all methods in this group are called at least once.
def IsSatisfied(self):
"""Return True if all methods in this group are called at least once."""
# NOTE(psycho): We can't use the simple set difference here because we want
# to match different parameters which are considered the same e.g. I... |
Convert a _imputer model to the protobuf spec.
Parameters
----------
model: Imputer
A trained Imputer model.
input_features: str
Name of the input column.
output_features: str
Name of the output column.
Returns
-------
model_spec: An object of type Model_pb.
... |
Common utilities to set the classifier interface params.
def set_classifier_interface_params(spec, features, class_labels,
model_accessor_for_class_labels, output_features = None):
"""
Common utilities to set the classifier interface params.
"""
# Normalize the features list.
features = _fm... |
Common utilities to set the regressor interface params.
def set_regressor_interface_params(spec, features, output_features):
""" Common utilities to set the regressor interface params.
"""
if output_features is None:
output_features = [("predicted_class", datatypes.Double())]
else:
outp... |
Common utilities to set transform interface params.
def set_transform_interface_params(spec, input_features, output_features, are_optional = False):
""" Common utilities to set transform interface params.
"""
input_features = _fm.process_or_validate_features(input_features)
output_features = _fm.proces... |
Convert SFrame to batch form, where each row contains a sequence of length
predictions_in_chunk * prediction_window, and there is a single label per
prediction window.
def prep_data(data, features, session_id, prediction_window, predictions_in_chunk, target=None, verbose=True):
"""
Convert SFrame to ba... |
Loads into numpy array from SFrame, assuming SFrame stores data flattened
def _load_into_numpy(sf, np_array, start, end, strides=None, shape=None):
"""Loads into numpy array from SFrame, assuming SFrame stores data flattened"""
np_array[:] = 0.0
np_array_2d = np_array.reshape((np_array.shape[0], np_array.s... |
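The snippet above zeroes the target and views it as 2-D before it is cut off. A sketch of the flattened-load idea, with a hypothetical `rows` standing in for the SFrame column of flattened values:

```python
import numpy as np

# Sketch: zero the target, view it as (samples x flattened features), and
# copy each flattened sample in. `rows` is a hypothetical stand-in for the
# SFrame data; the real function also handles strides and shapes.
def load_into_numpy(rows, np_array):
    np_array[:] = 0.0
    np_array_2d = np_array.reshape((np_array.shape[0], -1))
    for i, row in enumerate(rows):
        np_array_2d[i, :len(row)] = row
```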
Set the inputs of the network spec.
Parameters
----------
input_names: [str]
List of input names of the network.
input_dims: [tuple]
List of input dimensions of the network. The ordering of input_dims
is the same as input_names.
Examples
... |
Set the outputs of the network spec.
Parameters
----------
output_names: [str]
List of output names of the network.
output_dims: [tuple]
List of output dimensions of the network. The ordering of output_dims is the same
as output_names.
Examp... |
Set class labels to the model spec to make it a neural network classifier.
Parameters
----------
class_labels: list[int or str]
A list of integers or strings that map the index of the output of a
neural network to labels in a classifier.
predicted_feature_name: ... |
Add optional inputs and outputs to the model spec.
Parameters
----------
optionals_in: [str]
List of inputs that are optionals.
optionals_out: [str]
List of outputs that are optionals.
See Also
--------
set_input, set_output
def add_opt... |
Add an embedding layer to the model.
Parameters
----------
name: str
The name of this layer
W: numpy.array
Weight matrix of shape (output_channels, input_dim).
b: numpy.array
Bias vector of shape (output_channels, ).
input_dim: int
... |
Add a softmax layer to the model.
Parameters
----------
name: str
The name of this layer.
input_name: str
The input blob name of this layer.
output_name: str
The output blob name of this layer.
See Also
--------
add_ac... |
Add an activation layer to the model.
Parameters
----------
name: str
The name of this layer
non_linearity: str
The non_linearity (activation) function of this layer.
It can be one of the following:
- 'RELU': Rectified Linear Unit (Re... |
Add an element-wise operation layer to the model.
Parameters
----------
name: str
The name of this layer.
input_names: [str]
A list of input blob names of this layer. The input blobs should have the same shape.
output_name: str
The output blob ... |
Add upsample layer to the model.
Parameters
----------
name: str
The name of this layer.
scaling_factor_h: int
Scaling factor on the vertical direction.
scaling_factor_w: int
Scaling factor on the horizontal direction.
input_name: str
... |
Add scale layer to the model.
Parameters
----------
name: str
The name of this layer.
W: int | numpy.array
Scale of the input.
b: int | numpy.array
Bias to add to the input.
has_bias: boolean
Whether the bias vector of this... |
Add bias layer to the model.
Parameters
----------
name: str
The name of this layer.
b: int | numpy.array
Bias to add to the input.
input_name: str
The input blob name of this layer.
output_name: str
The output blob name of... |
Add sequence repeat layer to the model.
Parameters
----------
name: str
The name of this layer.
nrep: int
Number of repetitions of the input blob along the sequence axis.
input_name: str
The input blob name of this layer.
output_name: ... |
Add a convolution layer to the network.
Please see the ConvolutionLayerParams in Core ML neural network
protobuf message for more information about input and output blob dimensions.
Parameters
----------
name: str
The name of this layer.
kernel_channels: int... |
Add a padding layer to the model. Kindly refer to NeuralNetwork.proto for details.
Parameters
----------
name: str
The name of this layer.
left: int
Number of elements to be padded on the left side of the input blob.
right: int
Number of eleme... |
Add a cropping layer to the model.
The cropping layer has two functional modes:
- When it has 1 input blob, it crops the input blob based
on the 4 parameters [left, right, top, bottom].
- When it has 2 input blobs, it crops the first input blob based
on the ... |
Add a simple recurrent layer to the model.
Parameters
----------
name: str
The name of this layer.
W_h: numpy.array
Weights of the recurrent layer's hidden state. Must be of shape (hidden_size, hidden_size).
W_x: numpy.array
Weights of the rec... |
Add a Gated-Recurrent Unit (GRU) layer to the model.
Parameters
----------
name: str
The name of this layer.
W_h: [numpy.array]
List of recursion weight matrices. The ordering is [R_z, R_r, R_o],
where R_z, R_r and R_o are weight matrices at update ga... |
Add a Uni-directional LSTM layer to the model.
Parameters
----------
name: str
The name of this layer.
W_h: [numpy.array]
List of recursion weight matrices. The ordering is [R_i, R_f, R_o, R_z],
where R_i, R_f, R_o, R_z are weight matrices at input ga... |
Add a Bi-directional LSTM layer to the model.
Parameters
----------
name: str
The name of this layer.
W_h: [numpy.array]
List of recursion weight matrices for the forward layer. The ordering is [R_i, R_f, R_o, R_z],
where R_i, R_f, R_o, R_z are weight... |
Add a flatten layer. Only flattens the channel, height and width axes. Leaves the sequence axis as is.
Parameters
----------
name: str
The name of this layer.
mode: int
- If mode == 0, the flatten layer is in CHANNEL_FIRST mode.
- If mode == 1, the f... |
Add a slice layer. Equivalent to the numpy slice [start_index:end_index:stride],
start_index is included, while end_index is exclusive.
Parameters
----------
name: str
The name of this layer.
input_name: str
The input blob name of this layer.
outp... |
Add a data reorganization layer of type "SPACE_TO_DEPTH" or "DEPTH_TO_SPACE".
Parameters
----------
name: str
The name of this layer.
input_name: str
The input blob name of this layer.
output_name: str
The output blob name of this layer.
... |
Add a Batch Normalization layer. Batch Normalization operation is
defined as:
`y = gamma * (x - mean) / sqrt(variance + epsilon) + beta`
Parameters
----------
name: str
The name of this layer.
channels: int
Number of channels of the input blob.
... |
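The formula quoted above can be checked numerically. Parameter names follow the docstring; per-channel broadcasting and the protobuf wiring are left out of this sketch:

```python
import numpy as np

# Direct numerical transcription of the docstring's formula; gamma, beta,
# mean and variance would be per-channel arrays in the real layer.
def batch_norm(x, gamma, beta, mean, variance, epsilon=1e-5):
    return gamma * (x - mean) / np.sqrt(variance + epsilon) + beta
```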
Add a permute layer. Assumes that the input has dimensions in the order [Seq, C, H, W]
Parameters
----------
name: str
The name of this layer.
dim: tuple
The order in which to permute the input dimensions = [seq,C,H,W].
Must have length 4 and a permut... |
Add a reshape layer. Kindly refer to NeuralNetwork.proto for details.
Parameters
----------
name: str
The name of this layer.
target_shape: tuple
Shape of the output blob. The product of target_shape must be equal
to the product of the dimensions of the input blob.
... |
Add a reduce layer. Applies the function specified by the parameter mode,
along dimension(s) specified by the parameter axis.
Parameters
----------
name: str
The name of this layer.
input_name: str
The input blob name of this layer.
output_name: ... |
Add a LRN (local response normalization) layer. Please see the LRNLayerParams message in Core ML neural network
protobuf for more information about the operation of this layer. Supports "across" channels normalization.
Parameters
----------
name: str
The name of this layer.
... |
Add an MVN (mean variance normalization) layer. Computes mean, variance and normalizes the input.
Parameters
----------
name: str
The name of this layer.
input_name: str
The input blob name of this layer.
output_name: str
The output blob name... |
Add L2 normalize layer. Normalizes the input by the L2 norm, i.e. divides by
the square root of the sum of squares of all elements of the input along C, H and W dimensions.
Parameters
----------
name: str
The name of this layer.
input_name: str
The i... |
Add a Unary layer. Applies the specified function (mode) to all the elements of the input.
Please see the UnaryFunctionLayerParams message in Core ML neural network
protobuf for more information about the operation of this layer.
Prior to the application of the function the input can be scaled a... |
Add a Split layer that uniformly splits the input along the channel dimension
to produce multiple outputs.
Parameters
----------
name: str
The name of this layer.
input_name: str
The input blob name of this layer.
output_names: [str]
... |
Add a load constant layer.
Parameters
----------
name: str
The name of this layer.
output_name: str
The output blob name of this layer.
constant_value: numpy.array
value of the constant as a numpy array.
shape: [int]
Lis... |
Add a custom layer.
Parameters
----------
name: str
The name of this layer.
input_names: [str]
The input blob names to this layer.
output_names: [str]
The output blob names from this layer.
custom_proto_spec: CustomLayerParams
... |
Add pre-processing parameters to the neural network object
Parameters
----------
image_input_names: [str]
Name of input blobs that are images
is_bgr: boolean | dict()
Channel order for input blobs that are images. BGR if True else RGB.
To specify a d... |
Registers a new generator class, specifying a set of
properties relevant to this scanner. Ctor for that class
should have one parameter: list of properties.
def register(scanner_class, relevant_properties):
""" Registers a new generator class, specifying a set of
properties relevant to thi... |
Returns an instance of previously registered scanner
with the specified properties.
def get(scanner_class, properties):
""" Returns an instance of previously registered scanner
with the specified properties.
"""
assert issubclass(scanner_class, Scanner)
assert is_iterable_typed(properti... |
Installs the specified scanner on actual target 'target'.
vtarget: virtual target from which 'target' was actualized.
def install (self, scanner, target, vtarget):
""" Installs the specified scanner on actual target 'target'.
vtarget: virtual target from which 'target' was actualized.
... |
Fills in the rest of the function data into the skeleton function object
that was created via _make_skel_func().
def _fill_function(func, globals, defaults, dict, module, closure_values):
""" Fills in the rest of the function data into the skeleton function object
that was created via _make_skel_func().
... |
Creates a skeleton function object that contains just the provided
code and the correct number of cells in func_closure. All other
func attributes (e.g. func_globals) are empty.
def _make_skel_func(code, cell_count, base_globals=None):
""" Creates a skeleton function object that contains just the ... |
Put attributes from `class_dict` back on `skeleton_class`.
See CloudPickler.save_dynamic_class for more info.
def _rehydrate_skeleton_class(skeleton_class, class_dict):
"""Put attributes from `class_dict` back on `skeleton_class`.
See CloudPickler.save_dynamic_class for more info.
"""
for attrnam... |
Iterate over each part instead of calling imp.find_module directly.
This function is able to find submodules (e.g. sklearn.tree)
def _find_module(mod_name):
"""
Iterate over each part instead of calling imp.find_module directly.
This function is able to find submodules (e.g. sklearn.tree)
"""
pat... |
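The truncated helper walks the dotted name part by part. A modern sketch of the same idea using importlib (the original used the now-deprecated imp module, so this is an approximation, not the source):

```python
import importlib

# Import 'a.b.c' by importing each dotted prefix in turn; this is how
# submodules are found without imp.find_module. Sketch, not the original.
def find_module(mod_name):
    module = None
    for i in range(mod_name.count(".") + 1):
        prefix = ".".join(mod_name.split(".")[:i + 1])
        module = importlib.import_module(prefix)
    return module
```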
Save a module as an import
def save_module(self, obj):
"""
Save a module as an import
"""
mod_name = obj.__name__
# If module is successfully found then it is not a dynamically created module
if hasattr(obj, '__file__'):
is_dynamic = False
else:
... |
Ensure de-pickler imports any package child-modules that
are needed by the function
def _save_subimports(self, code, top_level_dependencies):
"""
Ensure de-pickler imports any package child-modules that
are needed by the function
"""
# check if any known dependency is an... |
Save a class that can't be stored as module global.
This method is used to serialize classes that are defined inside
functions, or that otherwise can't be serialized as attribute lookups
from global modules.
def save_dynamic_class(self, obj):
"""
Save a class that can't be stor... |
Pickles an actual func object.
A func comprises: code, globals, defaults, closure, and dict. We
extract and save these, injecting reducing functions at certain points
to recreate the func object. Keep in mind that some of these pieces
can contain a ref to the func itself. Thus, a nai... |
Find all globals names read or written to by codeblock co
def extract_code_globals(cls, co):
"""
Find all globals names read or written to by codeblock co
"""
out_names = cls._extract_code_globals_cache.get(co)
if out_names is None:
try:
names = co.co... |
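The body above truncates at the per-code-object cache. The core idea can be sketched as a recursive walk over `co_names` plus nested code constants; this simplification omits the caching and the cell-variable handling the real version does:

```python
import types

# Collect global names referenced by a code object; nested functions live
# in co_consts as code objects and are scanned recursively. Simplified
# sketch: the real version also caches results per code object.
def extract_code_globals(co):
    names = set(co.co_names)
    for const in co.co_consts:
        if isinstance(const, types.CodeType):
            names |= extract_code_globals(const)
    return names
```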
Save a "global".
The name of this method is somewhat misleading: all types get
dispatched here.
def save_global(self, obj, name=None, pack=struct.pack):
"""
Save a "global".
The name of this method is somewhat misleading: all types get
dispatched here.
"""
... |
Modified to support __transient__ on new objects
Change only affects protocol level 2 (which is always used by PiCloud
def save_reduce(self, func, args, state=None,
listitems=None, dictitems=None, obj=None):
"""Modified to support __transient__ on new objects
Change only aff... |