Entry point for serializing values. Most custom types should use :func:`~bloop.types.Type.dynamo_dump`. This wraps the return value of :func:`~bloop.types.Type.dynamo_dump` in DynamoDB's wire format. For example, serializing a string enum to an int: .. code-block:: python value =...
Entry point for deserializing values. Most custom types should use :func:`~bloop.types.Type.dynamo_load`. This unpacks DynamoDB's wire format and calls :func:`~bloop.types.Type.dynamo_load` on the inner value. For example, deserializing an int to a string enum: .. code-block:: python ...
Returns the DynamoDB backing type for a given python value's type :: 4 -> 'N' ['x', 3] -> 'L' {2, 4} -> 'SS' def backing_type_for(value): """Returns the DynamoDB backing type for a given python value's type :: 4 -> 'N' ['x', 3] -> ...
Monitor changes in approximately real-time and replicate them def stream_replicate(): """Monitor changes in approximately real-time and replicate them""" stream = primary.stream(SomeDataBlob, "trim_horizon") next_heartbeat = pendulum.now() while True: now = pendulum.now() if now >= next...
Move to the "trim_horizon" or "latest" of the entire stream. def _move_stream_endpoint(coordinator, position): """Move to the "trim_horizon" or "latest" of the entire stream.""" # 0) Everything will be rebuilt from DescribeStream. stream_arn = coordinator.stream_arn coordinator.roots.clear() coordi...
Scan through the *entire* Stream for the first record after ``time``. This is an extremely expensive, naive algorithm that starts at trim_horizon and simply dumps records into the void until the first hit. General improvements in performance are tough; we can use the fact that Shards have a max life of 24...
Move to the Stream position described by the token. The following rules are applied when interpolation is required: - If a shard does not exist (past the trim_horizon) it is ignored. If that shard had children, its children are also checked against the existing shards. - If none of the shards in the...
Poll active shards for records and insert them into the buffer. Rotate exhausted shards. Returns immediately if the buffer isn't empty. def advance_shards(self): """Poll active shards for records and insert them into the buffer. Rotate exhausted shards. Returns immediately if the buffer isn...
Keep active shards with "trim_horizon", "latest" iterators alive by advancing their iterators. def heartbeat(self): """Keep active shards with "trim_horizon", "latest" iterators alive by advancing their iterators.""" for shard in self.active: if shard.sequence_number is None: ...
JSON-serializable representation of the current Stream state. Use :func:`Engine.stream(YourModel, token) <bloop.engine.Engine.stream>` to create an identical stream, or :func:`stream.move_to(token) <bloop.stream.Stream.move_to>` to move an existing stream to this position. :returns: Stream sta...
Remove a Shard from the Coordinator. Drops all buffered records from the Shard. If the Shard is active or a root, it is removed and any children promoted to those roles. :param shard: The shard to remove :type shard: :class:`~bloop.stream.shard.Shard` :param bool drop_buffered_record...
Set the Coordinator to a specific endpoint or time, or load state from a token. :param position: "trim_horizon", "latest", :class:`~datetime.datetime`, or a :attr:`Coordinator.token <bloop.stream.coordinator.Coordinator.token>` def move_to(self, position): """Set the Coordinator to a speci...
Create a tuple of (ordering, (record, shard)) for use in a RecordBuffer. def heap_item(clock, record, shard): """Create a tuple of (ordering, (record, shard)) for use in a RecordBuffer.""" # Primary ordering is by event creation time. # However, creation time is *approximate* and has whole-second resolutio...
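The ordering described above can be sketched as a plain heap key: the record's approximate creation time first, then a monotonically increasing clock value as a tiebreaker. The record layout (`record["meta"]["created_at"]`) and the `clock` callable are assumptions for illustration, not necessarily bloop's actual internals.

```python
import heapq

def heap_item(clock, record, shard):
    """Sketch: build an (ordering, (record, shard)) tuple for a RecordBuffer.

    Primary ordering is the record's approximate creation time, which only has
    whole-second resolution; a monotonically increasing clock value breaks
    ties so heap comparisons never fall through to the (uncomparable) payload.
    """
    # `clock` is assumed to be a callable returning a monotonically
    # increasing integer (a hypothetical stand-in for the real tiebreaker).
    ordering = (record["meta"]["created_at"], clock())
    return ordering, (record, shard)

_counter = iter(range(10**9))
clock = lambda: next(_counter)

buffer = []
heapq.heappush(buffer, heap_item(clock, {"meta": {"created_at": 5}}, "shard-a"))
heapq.heappush(buffer, heap_item(clock, {"meta": {"created_at": 3}}, "shard-b"))
heapq.heappush(buffer, heap_item(clock, {"meta": {"created_at": 5}}, "shard-c"))
```

Popping the heap yields the oldest record first; records with the same whole-second timestamp come out in insertion order thanks to the clock tiebreaker.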
Push a new record into the buffer :param dict record: new record :param shard: Shard the record came from :type shard: :class:`~bloop.stream.shard.Shard` def push(self, record, shard): """Push a new record into the buffer :param dict record: new record :param shard: Sh...
Push multiple (record, shard) pairs at once, with only one :meth:`heapq.heapify` call to maintain order. :param record_shard_pairs: list of ``(record, shard)`` tuples (see :func:`~bloop.stream.buffer.RecordBuffer.push`). def push_all(self, record_shard_pairs): """Push multiple (record, sha...
Yields a (name, value) tuple for each column of an object whose value isn't missing def loaded_columns(obj: BaseModel): """Yields a (name, value) tuple for each column of an object whose value isn't missing""" for column in sorted(obj.Meta.columns, key=lambda c: c.name): value = getattr(obj, column.name, mis...
Push values by dynamo_name into an object def unpack_from_dynamodb(*, attrs, expected, model=None, obj=None, engine=None, context=None, **kwargs): """Push values by dynamo_name into an object""" context = context or {"engine": engine} engine = engine or context.get("engine", None) if not engine: ...
Set an object's field to default if it doesn't have a value def setdefault(obj, field, default): """Set an object's field to default if it doesn't have a value""" setattr(obj, field, getattr(obj, field, default))
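Since `setdefault` above is self-contained, a quick usage sketch shows both branches (the `Config` class is just an illustrative stand-in):

```python
def setdefault(obj, field, default):
    """Set an object's field to default if it doesn't have a value."""
    setattr(obj, field, getattr(obj, field, default))

class Config:
    pass

cfg = Config()
setdefault(cfg, "retries", 3)   # attribute missing -> set to the default
cfg.timeout = 30
setdefault(cfg, "timeout", 60)  # attribute present -> left untouched
```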
Bind a column to the model with the given name. This method is primarily used during BaseModel.__init_subclass__, although it can be used to easily attach a new column to an existing model: .. code-block:: python import bloop.models class User(BaseModel): id = Column(String, ...
Bind an index to the model with the given name. This method is primarily used during BaseModel.__init_subclass__, although it can be used to easily attach a new index to an existing model: .. code-block:: python import bloop.models class User(BaseModel): ...
Recalculate the projection, hash_key, and range_key for the given index. :param meta: model.Meta to find columns by name :param index: The index to refresh def refresh_index(meta, index) -> None: """Recalculate the projection, hash_key, and range_key for the given index. :param meta: model.Meta to fi...
Unconditionally remove any columns or indexes bound to the given name or dynamo_name. .. code-block:: python import bloop.models class User(BaseModel): id = Column(String, hash_key=True) email = Column(String, dynamo_name="e") by_email = GlobalSecondaryIndex(p...
dict (dynamo name) -> obj def _load(cls, attrs, *, context, **kwargs): """ dict (dynamo name) -> obj """ return unpack_from_dynamodb( model=cls, attrs=attrs or {}, expected=cls.Meta.columns, context=context, **kwargs)
obj -> dict def _dump(cls, obj, *, context, **kwargs): """ obj -> dict """ if obj is None: return None dump = context["engine"]._dump filtered = filter( lambda item: item[1] is not None, (( column.dynamo_name, dump(colu...
Returns True if the actual index is a valid superset of the expected index def is_valid_superset(actual_projection, index): """Returns True if the actual index is a valid superset of the expected index""" projection_type = actual_projection["ProjectionType"] if projection_type == "ALL": return True...
Save an object to DynamoDB. :param item: Unpacked into kwargs for :func:`boto3.DynamoDB.Client.update_item`. :raises bloop.exceptions.ConstraintViolation: if the condition (or atomic) is not met. def save_item(self, item): """Save an object to DynamoDB. :param item: Unpacked into kwar...
Delete an object in DynamoDB. :param item: Unpacked into kwargs for :func:`boto3.DynamoDB.Client.delete_item`. :raises bloop.exceptions.ConstraintViolation: if the condition (or atomic) is not met. def delete_item(self, item): """Delete an object in DynamoDB. :param item: Unpacked int...
Loads any number of items in chunks, handling continuation tokens. :param items: Unpacked in chunks into "RequestItems" for :func:`boto3.DynamoDB.Client.batch_get_item`. def load_items(self, items): """Loads any number of items in chunks, handling continuation tokens. :param items: Unpacked i...
Invoke query/scan by name. Response always includes "Count" and "ScannedCount" :param str mode: "query" or "scan" :param request: Unpacked into :func:`boto3.DynamoDB.Client.query` or :func:`boto3.DynamoDB.Client.scan` def search_items(self, mode, request): """Invoke query/scan by name...
Create the model's table. Returns True if the table is being created, False otherwise. Does not wait for the table to finish creating, and does not validate an existing table. Will not raise "ResourceInUseException" if the table exists or is being created. :param str table_name: The name of the table ...
Polls until the table is ready, then returns the first description received once the table was ready. The returned dict is standardized to ensure all fields are present, even when empty or across different DynamoDB API versions. TTL information is also inserted. :param table_name: The name of the ta...
Polls until a creating table is ready, then verifies the description against the model's requirements. The model may have a subset of all GSIs and LSIs on the table, but the key structure must be exactly the same. The table must have a stream if the model expects one, but not the other way around. Wh...
Calls UpdateTimeToLive on the table according to model.Meta["ttl"] :param table_name: The name of the table to enable the TTL setting on :param model: The model to get TTL settings from def enable_ttl(self, table_name, model): """Calls UpdateTimeToLive on the table according to model.Meta["ttl...
Calls UpdateContinuousBackups on the table according to model.Meta["continuous_backups"] :param table_name: The name of the table to enable Continuous Backups on :param model: The model to get Continuous Backups settings from def enable_backups(self, table_name, model): """Calls UpdateContinuo...
Wraps :func:`boto3.DynamoDBStreams.Client.describe_stream`, handling continuation tokens. :param str stream_arn: Stream arn, usually from the model's ``Meta.stream["arn"]``. :param str first_shard: *(Optional)* If provided, only shards after this shard id will be returned. :return: All shards i...
Wraps :func:`boto3.DynamoDBStreams.Client.get_shard_iterator`. :param str stream_arn: Stream arn. Usually :data:`Shard.stream_arn <bloop.stream.shard.Shard.stream_arn>`. :param str shard_id: Shard identifier. Usually :data:`Shard.shard_id <bloop.stream.shard.Shard.shard_id>`. :param str itera...
Wraps :func:`boto3.DynamoDBStreams.Client.get_records`. :param iterator_id: Iterator id. Usually :data:`Shard.iterator_id <bloop.stream.shard.Shard.iterator_id>`. :return: Dict with "Records" list (may be empty) and "NextShardIterator" str (may not exist). :rtype: dict :raises bloop.ex...
Wraps :func:`boto3.DynamoDB.Client.transact_get_items`. :param items: Unpacked into "TransactItems" for :func:`boto3.DynamoDB.Client.transact_get_items` :raises bloop.exceptions.TransactionCanceled: if the transaction was canceled. :return: Dict with "Responses" list def transaction_read(s...
Wraps :func:`boto3.DynamoDB.Client.transact_write_items`. :param items: Unpacked into "TransactItems" for :func:`boto3.DynamoDB.Client.transact_write_items` :param client_request_token: Idempotency token valid for 10 minutes from first use. Unpacked into "ClientRequestToken" :...
Only allows == against query_on.hash_key def check_hash_key(query_on, key): """Only allows == against query_on.hash_key""" return ( isinstance(key, BaseCondition) and (key.operation == "==") and (key.column is query_on.hash_key) )
BeginsWith, Between, or any Comparison except '!=' against query_on.range_key def check_range_key(query_on, key): """BeginsWith, Between, or any Comparison except '!=' against query_on.range_key""" return ( isinstance(key, BaseCondition) and key.operation in ("begins_with", "between", "<", ">",...
Constructs a :class:`~bloop.search.PreparedSearch`. def prepare(self): """Constructs a :class:`~bloop.search.PreparedSearch`.""" p = PreparedSearch() p.prepare( engine=self.engine, mode=self.mode, model=self.model, index=self.index, ke...
Validates the search parameters and builds the base request dict for each Query/Scan call. def prepare( self, engine=None, mode=None, model=None, index=None, key=None, filter=None, projection=None, consistent=None, forward=None, parallel=None): """Validates the search parameters and bui...
Number of items that have been loaded from DynamoDB so far, including buffered items. def count(self): """Number of items that have been loaded from DynamoDB so far, including buffered items.""" if self.request["Select"] == "COUNT": while not self.exhausted: next(self, None)...
Number of items that DynamoDB evaluated, before any filter was applied. def scanned(self): """Number of items that DynamoDB evaluated, before any filter was applied.""" if self.request["Select"] == "COUNT": while not self.exhausted: next(self, None) return self._scan...
Return the first result. If there are no results, raises :exc:`~bloop.exceptions.ConstraintViolation`. :return: The first result. :raises bloop.exceptions.ConstraintViolation: No results. def first(self): """Return the first result. If there are no results, raises :exc:`~bloop.exceptions.Con...
Return the unique result. If there is not exactly one result, raises :exc:`~bloop.exceptions.ConstraintViolation`. :return: The unique result. :raises bloop.exceptions.ConstraintViolation: Not exactly one result. def one(self): """Return the unique result. If there is not exactly one...
Reset to the initial state, clearing the buffer and zeroing count and scanned. def reset(self): """Reset to the initial state, clearing the buffer and zeroing count and scanned.""" self.buffer.clear() self._count = 0 self._scanned = 0 self._exhausted = False self.request...
Get a type by the common bloop operation name: get/check/delete/save def by_alias(cls, name: str) -> "TxType": """Get a type by the common bloop operation name: get/check/delete/save""" return { "get": TxType.Get, "check": TxType.Check, "delete": TxType.Delete, ...
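A minimal sketch of how such an enum could look; the member values and the `save -> Update` mapping are assumptions inferred from the alias list, not necessarily bloop's actual definitions.

```python
import enum

class TxType(enum.Enum):
    """Sketch of a transaction item type; member values are assumed."""
    Get = "Get"
    Check = "ConditionCheck"
    Delete = "Delete"
    Update = "Update"

    @classmethod
    def by_alias(cls, name: str) -> "TxType":
        """Get a type by the common bloop operation name: get/check/delete/save."""
        return {
            "get": cls.Get,
            "check": cls.Check,
            "delete": cls.Delete,
            "save": cls.Update,   # assumed: "save" maps to an Update item
        }[name]
```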
Create a new PreparedTransaction that can be committed. This is called automatically when exiting the transaction as a context: .. code-block:: python >>> engine = Engine() >>> tx = WriteTransaction(engine) >>> prepared = tx.prepare() >>> prepared.commi...
Create a unique transaction id and dump the items into a cached request object. def prepare(self, engine, mode, items) -> None: """ Create a unique transaction id and dump the items into a cached request object. """ self.tx_id = str(uuid.uuid4()).replace("-", "") self.engine =...
Commit the transaction with a fixed transaction id. A read transaction can call commit() any number of times, while a write transaction can only use the same tx_id for 10 minutes from the first call. def commit(self) -> None: """ Commit the transaction with a fixed transaction id. ...
Add one or more objects to be loaded in this transaction. At most 10 items can be loaded in the same transaction. All objects will be loaded each time you call commit(). :param objs: Objects to add to the set that are loaded in this transaction. :return: this transaction for chaining...
Add a condition which must be met for the transaction to commit. While the condition is checked against the provided object, that object will not be modified. It is only used to provide the hash and range key to apply the condition to. At most 10 items can be checked, saved, or deleted in the...
Add one or more objects to be saved in this transaction. At most 10 items can be checked, saved, or deleted in the same transaction. The same idempotency token will be used for a single prepared transaction, which allows you to safely call commit on the PreparedCommit object multiple times. ...
Produces a numpy array of integers which encode the supplied cube dimensions. def encode(self, cube_dimensions): """ Produces a numpy array of integers which encode the supplied cube dimensions. """ return np.asarray([getattr(cube_dimensions[d], s) for d in s...
Produce a list of dictionaries for each dimension in this transcoder def decode(self, descriptor): """ Produce a list of dictionaries for each dimension in this transcoder """ i = iter(descriptor) n = len(self._schema) # Add the name key to our schema schema = self._schema + ('...
Download cub archive from cub_url and store it in cub_archive_name def dl_cub(cub_url, cub_archive_name): """ Download cub archive from cub_url and store it in cub_archive_name """ with open(cub_archive_name, 'wb') as f: remote_file = urllib2.urlopen(cub_url) meta = remote_file.info() ...
Compute the SHA1 hash of filename def sha_hash_file(filename): """ Compute the SHA1 hash of filename """ hash_sha = hashlib.sha1() with open(filename, 'rb') as f: for chunk in iter(lambda: f.read(1024*1024), b""): hash_sha.update(chunk) return hash_sha.hexdigest()
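The chunked loop above produces the same digest as hashing the whole file at once; a small round-trip check on a temporary file (payload size chosen to span several 1 MiB chunks):

```python
import hashlib
import os
import tempfile

def sha_hash_file(filename):
    """Compute the SHA1 hash of filename, reading in 1 MiB chunks."""
    hash_sha = hashlib.sha1()
    with open(filename, 'rb') as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            hash_sha.update(chunk)
    return hash_sha.hexdigest()

payload = b"x" * (3 * 1024 * 1024 + 17)  # spans several chunks
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(payload)
digest = sha_hash_file(tmp.name)
os.unlink(tmp.name)
```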
Downloads and installs cub into mb_inc_path def install_cub(mb_inc_path): """ Downloads and installs cub into mb_inc_path """ cub_url = 'https://github.com/NVlabs/cub/archive/1.6.4.zip' cub_sha_hash = '0d5659200132c2576be0b3959383fa756de6105d' cub_version_str = 'Current release: v1.6.4 (12/06/2016)' ...
Emit a list of architecture flags for each CUDA device found ['--gpu-architecture=sm_30', '--gpu-architecture=sm_52'] def cuda_architecture_flags(device_info): """ Emit a list of architecture flags for each CUDA device found ['--gpu-architecture=sm_30', '--gpu-architecture=sm_52'] """ # Figure ...
Create an extension that builds the custom tensorflow ops def create_tensorflow_extension(nvcc_settings, device_info): """ Create an extension that builds the custom tensorflow ops """ import tensorflow as tf import glob use_cuda = (bool(nvcc_settings['cuda_available']) and tf.test.is_built_wi...
Inform montblanc about dimension sizes def updated_dimensions(self): """ Inform montblanc about dimension sizes """ return [("ntime", args.ntime), # Timesteps ("nchan", args.nchan), # Channels ("na", args.na), # Antenna ("npsrc", len(...
Supply point source lm coordinates to montblanc def point_lm(self, context): """ Supply point source lm coordinates to montblanc """ # Shape (npsrc, 2) (ls, us), _ = context.array_extents(context.name) return np.asarray(lm_coords[ls:us], dtype=context.dtype)
Supply point source stokes parameters to montblanc def point_stokes(self, context): """ Supply point source stokes parameters to montblanc """ # Shape (npsrc, ntime, 4) (ls, us), (lt, ut), (l, u) = context.array_extents(context.name) data = np.empty(context.shape, context.dtype) ...
Supply UVW antenna coordinates to montblanc def uvw(self, context): """ Supply UVW antenna coordinates to montblanc """ # Shape (ntime, na, 3) (lt, ut), (la, ua), (l, u) = context.array_extents(context.name) # Create empty UVW coordinates data = np.empty(context.shape, context...
Monkeypatch distutils.Distribution.reinitialize_command() to match the behavior of Distribution.get_command_obj() This fixes a problem where 'pip install -e' does not reinitialise options using the setup(options={...}) variable for the build_ext command. This also affects other option sources such as setup.c...
Compute the number of baselines for the given number of antenna. Can specify whether auto-correlations should be taken into account def nr_of_baselines(na, auto_correlations=False): """ Compute the number of baselines for the given number of antenna. Can specify whether auto-correlations sh...
Compute the number of antenna for the given number of baselines. Can specify whether auto-correlations should be taken into account def nr_of_antenna(nbl, auto_correlations=False): """ Compute the number of antenna for the given number of baselines. Can specify whether auto-correlations sho...
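The two helpers are inverses of the standard interferometry relation nbl = na(na-1)/2, or na(na+1)/2 when auto-correlations are included. A sketch under that assumption, inverting via the quadratic formula:

```python
import math

def nr_of_baselines(na, auto_correlations=False):
    """Baselines for na antenna: na*(na-1)/2, plus na if auto-correlations count."""
    m = 1 if auto_correlations else -1
    return (na * (na + m)) // 2

def nr_of_antenna(nbl, auto_correlations=False):
    """Invert the baseline relation by solving na^2 + m*na - 2*nbl = 0."""
    m = 1 if auto_correlations else -1
    return int(round((-m + math.sqrt(m * m + 8 * nbl)) / 2))
```

For example, 7 antenna give 21 baselines without auto-correlations (7*6/2) and 28 with them (7*8/2), and each count inverts back to 7.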
Estimates the memory in bytes required for an array of the supplied shape and dtype def array_bytes(shape, dtype): """ Estimates the memory in bytes required for an array of the supplied shape and dtype """ return np.product(shape)*np.dtype(dtype).itemsize
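The same size estimate with `np.prod` (the modern spelling; `np.product` was deprecated and later removed from NumPy):

```python
import numpy as np

def array_bytes(shape, dtype):
    """Memory in bytes required for an array of the given shape and dtype."""
    # np.prod replaces the deprecated np.product used in the original
    return int(np.prod(shape)) * np.dtype(dtype).itemsize
```

This matches NumPy's own accounting, e.g. `array_bytes(s, d) == np.zeros(s, d).nbytes`.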
Returns a random array of the same shape and type as the supplied array argument, or the supplied shape and dtype def random_like(ary=None, shape=None, dtype=None): """ Returns a random array of the same shape and type as the supplied array argument, or the supplied shape and dtype """ if ary i...
Return a flattened version of the nested argument def flatten(nested): """ Return a flattened version of the nested argument """ flat_return = list() def __inner_flat(nested,flat): for i in nested: __inner_flat(i, flat) if isinstance(i, list) else flat.append(i) return flat __...
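An equivalent, arguably more readable recursive sketch of the same flattening behavior:

```python
def flatten(nested):
    """Return a flattened version of the (arbitrarily) nested list argument."""
    flat = []
    for item in nested:
        if isinstance(item, list):
            flat.extend(flatten(item))  # recurse into sublists
        else:
            flat.append(item)           # keep leaf values in order
    return flat
```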
Return the number of bytes required by an array Arguments --------------- ary : dict Dictionary representation of an array template : dict A dictionary of key-values, used to replace any string values in the array with concrete integral values Returns ----------...
Return the number of bytes required by a dictionary of arrays. Arguments --------------- arrays : list A list of dictionaries defining the arrays template : dict A dictionary of key-values, used to replace any string values in the arrays with concrete integral values...
Returns the number of timesteps possible, given the registered arrays and a memory budget defined by bytes_available Arguments ---------------- bytes_available : int The memory budget, or available number of bytes for solving the problem. arrays : list List of dictionaries d...
Substitutes string values in the supplied shape parameter with integer variables stored in a dictionary Parameters ---------- sshape : tuple/string composed of integers and strings. The strings should relate to integral properties registered with this Solver object variables : dict...
Shape a list of lists into the appropriate shape and data type def shape_list(l,shape,dtype): """ Shape a list of lists into the appropriate shape and data type """ return np.array(l, dtype=dtype).reshape(shape)
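`shape_list` is a thin wrapper over `np.array(...).reshape(...)`; for example:

```python
import numpy as np

def shape_list(l, shape, dtype):
    """Shape a (possibly nested) list into an array of the given shape and dtype."""
    return np.array(l, dtype=dtype).reshape(shape)

# A flat list reshaped into a 2x3 float32 array
a = shape_list([1, 2, 3, 4, 5, 6], (2, 3), np.float32)
```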
Return a function defining the conversion process between two NumPy arrays of different shapes def array_convert_function(sshape_one, sshape_two, variables): """ Return a function defining the conversion process between two NumPy arrays of different shapes """ if not isinstance(sshape_one, tuple): ssha...
Redistribute threads from the Z dimension towards the X dimension. Also clamp number of threads to the problem dimension size, if necessary def redistribute_threads(blockdimx, blockdimy, blockdimz, dimx, dimy, dimz): """ Redistribute threads from the Z dimension towards the X dimension. Also cl...
Register the default dimensions for a RIME solver def register_default_dimensions(cube, slvr_cfg): """ Register the default dimensions for a RIME solver """ import montblanc.src_types as mbs # Pull out the configuration options for the basics autocor = slvr_cfg['auto_correlations'] ntime = 10 ...
Hack to get IP address from the interface def get_ip_address(ifname): """ Hack to get IP address from the interface """ s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM) return socket.inet_ntoa(fcntl.ioctl( s.fileno(), 0x8915, # SIOCGIFADDR struct.pack('256s', ifname[:15]) ...
Find nvcc and the CUDA installation def nvcc_compiler_settings(): """ Find nvcc and the CUDA installation """ search_paths = os.environ.get('PATH', '').split(os.pathsep) nvcc_path = find_in_path('nvcc', search_paths) default_cuda_path = os.path.join('usr', 'local', 'cuda') cuda_path = os.environ.g...
Poor man's deviceQuery. Returns CUDA_VERSION information and CUDA device information in JSON format def inspect_cuda_version_and_devices(compiler, settings): """ Poor man's deviceQuery. Returns CUDA_VERSION information and CUDA device information in JSON format """ try: output = build_and...
inject deep into distutils to customize gcc/nvcc dispatch def customize_compiler_for_nvcc(compiler, nvcc_settings): """inject deep into distutils to customize gcc/nvcc dispatch """ # tell the compiler it can process .cu files compiler.src_extensions.append('.cu') # save references to the default comp...
Return cuda device information and nvcc/cuda setup def inspect_cuda(): """ Return cuda device information and nvcc/cuda setup """ nvcc_settings = nvcc_compiler_settings() sysconfig.get_config_vars() nvcc_compiler = ccompiler.new_compiler() sysconfig.customize_compiler(nvcc_compiler) customize_c...
Returns a dictionary suitable for templating strings with properties and dimensions related to this Solver object. Used in templated GPU kernels. def template_dict(self): """ Returns a dictionary suitable for templating strings with properties and dimensions related to this Sol...
Factory function that produces a RIME solver def rime_solver(slvr_cfg): """ Factory function that produces a RIME solver """ from montblanc.impl.rime.tensorflow.RimeSolver import RimeSolver return RimeSolver(slvr_cfg)
Returns a dictionary of source methods found on this object, keyed on method name. Source methods are identified by argspec, a list of argument specifiers. So, for example, an argspec of :code:`[['self', 'context'], ['s', 'c']]` would match methods looking like: .. code-block:: python def f(sel...
Returns a dictionary of source methods found on this object, keyed on method name. Source methods are identified by (self, context) arguments on this object. For example: .. code-block:: python def f(self, context): ... is a source method, but ...
Computes parallactic angles per timestep for the given reference antenna position and field centre. Arguments: times: ndarray Array of unique times with shape (ntime,), obtained from TIME column of MS table antenna_positions: ndarray of shape (na, 3) Antenna ...
Setup logging configuration def setup_logging(): """ Setup logging configuration """ # Console formatter, mention name cfmt = logging.Formatter(('%(name)s - %(levelname)s - %(message)s')) # File formatter, mention time ffmt = logging.Formatter(('%(asctime)s - %(levelname)s - %(message)s')) #...
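A minimal sketch of that configuration: the console formatter mentions the logger name, while the timestamped format is reserved for a file handler. The logger name "montblanc" and the omission of the file handler are assumptions.

```python
import logging

def setup_logging():
    """Sketch: console handler names the logger, file format mentions time."""
    cfmt = logging.Formatter('%(name)s - %(levelname)s - %(message)s')
    ffmt = logging.Formatter('%(asctime)s - %(levelname)s - %(message)s')  # for a file handler (omitted here)

    log = logging.getLogger("montblanc")  # assumed logger name
    log.setLevel(logging.DEBUG)

    console = logging.StreamHandler()
    console.setFormatter(cfmt)
    log.addHandler(console)
    return log

log = setup_logging()
```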
Caches constant arrays associated with an array name. The intent of this decorator is to avoid the cost of recreating and storing many arrays of constant data, especially data created by np.zeros or np.ones. Instead, a single array of the first given shape is created and any further requests for co...
Caches chunks of default data. This decorator caches generated default data so as to avoid recomputing it on a subsequent queries to the provider. def chunk_cache(method): """ Caches chunks of default data. This decorator caches generated default data so as to avoid recomputing it on a su...
Create a DefaultsSourceProvider object. This provides default data sources for each array defined on the hypercube. The data sources may either be obtained from the array's 'default' data source or the 'test' data source. def _create_defaults_source_provider(cube, data_source): """ Create a Defaults...
Constructs a tensorflow expression for computing the RIME def _construct_tensorflow_expression(slvr_cfg, feed_data, device, shard): """ Constructs a tensorflow expression for computing the RIME """ zero = tf.constant(0) src_count = zero src_ph_vars = feed_data.src_ph_vars LSA = feed_data.local ...
Get data from the data source, checking the return values def _get_data(data_source, context): """ Get data from the data source, checking the return values """ try: # Get data from the data source data = data_source.source(context) # Complain about None values if data is None:...
Supply data to the data sink def _supply_data(data_sink, context): """ Supply data to the data sink """ try: data_sink.sink(context) except Exception as e: ex = ValueError("An exception occurred while " "supplying data to data sink '{ds}'\n\n" "{e}\n\n" "...
Given a list of source_providers, apply the list of suggested dimension updates given in provider.updated_dimensions() to the supplied hypercube. Dimension global_sizes are always updated with the supplied sizes and lower_extent is always set to 0. upper_extent is set to any reductions (current upp...
Sets up the hypercube given a solver configuration def _setup_hypercube(cube, slvr_cfg): """ Sets up the hypercube given a solver configuration """ mbu.register_default_dimensions(cube, slvr_cfg) # Configure the dimensions of the beam cube cube.register_dimension('beam_lw', 2, ...
Partition data sources into 1. Dictionary of data sources associated with radio sources. 2. List of data sources to feed multiple times. 3. List of data sources to feed once. def _partition(iter_dims, data_sources): """ Partition data sources into 1. Dictionary of data sources associated with...