Returns the ID for a given physics group name def execute(self, conn, name='', transaction = False): """ Returns the ID for a given physics group name """ binds={} if name: op = ('=', 'like')['%' in name] sql = self.sql + " WHERE pg.physics_group_name %s :physic...
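The snippet above selects the SQL comparison operator by indexing a tuple with a boolean: `like` when the name contains a SQL wildcard (`%`), `=` otherwise. A minimal runnable sketch of that idiom (the helper name and clause text are illustrative, mirroring the snippet):

```python
# Sketch of the operator-selection idiom: ('=', 'like')[False] -> '=',
# ('=', 'like')[True] -> 'like', because booleans index as 0/1.
def build_where_clause(name):
    """Return a WHERE fragment for physics_group_name (illustrative only)."""
    op = ('=', 'like')['%' in name]
    return "WHERE pg.physics_group_name %s :physics_group_name" % op

print(build_where_clause("Tracker"))  # exact match, uses "="
print(build_where_clause("Track%"))   # wildcard match, uses "like"
```

The same bind variable works for both cases; only the operator changes.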
Method to insert the Output Config. app_name, release_version, pset_hash, global_tag and output_module_label are required. args: businput (dict): input dictionary. Updated Oct 12, 2011 def insertOutputConfig(self, businput): """ Method to insert the Output Co...
API to get a list of supported REST APIs. In the case a particular API is specified, the docstring of that API is displayed. :param call: call to get detailed information about (Optional) :type call: str :return: List of APIs or detailed information about a specific call (parameters and...
API to list primary datasets :param primary_ds_type: List primary datasets with primary dataset type (Optional) :type primary_ds_type: str :param primary_ds_name: List only that primary dataset (Optional) :type primary_ds_name: str :returns: List of dictionaries containing the follow...
API to list primary dataset types :param primary_ds_type: List only that primary dataset type (Optional) :type primary_ds_type: str :param dataset: List the primary dataset type for that dataset (Optional) :type dataset: str :returns: List of dictionaries containing the following key...
API to list dataset(s) in DBS * You can use ANY combination of these parameters in this API * In absence of parameters, all valid datasets known to the DBS instance will be returned :param dataset: Full dataset (path) of the dataset. :type dataset: str :param parent_dataset: Fu...
API to list datasets in DBS. To be called by datasetlist url with post call. :param dataset: list of datasets [dataset1,dataset2,..,dataset n] (must have either a list of dataset or dataset_id), Max length 1000. :type dataset: list :param dataset_id: list of dataset ids [dataset_id1,dataset_id2,..,dat...
API to list data tiers known to DBS. :param data_tier_name: List details on that data tier (Optional) :type data_tier_name: str :returns: List of dictionaries containing the following keys (data_tier_id, data_tier_name, create_by, creation_date) def listDataTiers(self, data_tier_name=""): ...
API to list a block in DBS. At least one of the parameters block_name, dataset, data_tier_name or logical_file_name is required. If data_tier_name is provided, min_cdate and max_cdate have to be specified and the difference in time has to be less than 31 days. :param block_name: name of the b...
API to list blocks first generated in origin_site_name. :param origin_site_name: Origin Site Name (Optional, No wildcards) :type origin_site_name: str :param dataset: dataset ( No wildcards, either dataset or block name needed) :type dataset: str :param block_name: :type...
API to list block parents of multiple blocks. To be called by blockparents url with post call. :param block_names: list of block names [block_name1, block_name2, ...] (Required). Max length 1000. :type block_names: list def listBlocksParents(self): """ API to list block parents of mult...
API to list block children. :param block_name: name of the block whose children need to be found (Required) :type block_name: str :returns: List of dictionaries containing the following keys (block_name) :rtype: list of dicts def listBlockChildren(self, block_name=""): """ API...
API that returns summary information like total size and total number of events in a dataset or a list of blocks :param block_name: list block summaries for block_name(s) :type block_name: str, list :param dataset: list block summaries for all blocks in dataset :type dataset: str ...
API to list files in DBS. Either non-wildcarded logical_file_name, non-wildcarded dataset or non-wildcarded block_name is required. The combination of a non-wildcarded dataset or block_name with a wildcarded logical_file_name is supported. * For lumi_list the following two json formats are supported: ...
API to list files in DBS. Either non-wildcarded logical_file_name, non-wildcarded dataset, non-wildcarded block_name or non-wildcarded lfn list is required. The combination of a non-wildcarded dataset or block_name with a wildcarded logical_file_name is supported. * For lumi_list the followi...
API to list number of files, event counts and number of lumis in a given block or dataset. If the optional run_num is given, the outputs are: * The number of files which have data (lumis) for that run number; * The total number of events in those files; * The total number of ...
API to list a dataset's parents in DBS. :param dataset: dataset (Required) :type dataset: str :returns: List of dictionaries containing the following keys (this_dataset, parent_dataset_id, parent_dataset) :rtype: list of dicts def listDatasetParents(self, dataset=''): """ ...
API to list OutputConfigs in DBS. * You can use any combination of these parameters in this API * All parameters are optional, if you do not provide any parameter, all configs will be listed from DBS :param dataset: Full dataset (path) of the dataset :type dataset: str :param l...
API to list file parents :param logical_file_name: logical_file_name of file (Required) :type logical_file_name: str, list :param block_id: ID of the block, whose files should be listed :type block_id: int, str :param block_name: Name of the block, whose files should be listed...
IMPORTANT: This is a ***WMAgent*** special case API. It is not for others. API to list File Parentage for a given block with or without a list of LFNs. It is used with the POST method of fileparents call. Using the child_lfn_list will significantly affect the API running speed. :param block...
API to list file children. One of the parameters is mandatory. :param logical_file_name: logical_file_name of file (Required) :type logical_file_name: str, list :param block_name: block_name :type block_name: str :param block_id: block_id :type block_id: str, int ...
API to list Lumi for files. Either logical_file_name or block_name is required. No wild card support in this API :param block_name: Name of the block :type block_name: str :param logical_file_name: logical_file_name of file :type logical_file_name: str, list :param run_num: List...
API to list all runs in DBS. At least one parameter is mandatory. :param logical_file_name: List all runs in the file :type logical_file_name: str :param block_name: List all runs in the block :type block_name: str :param dataset: List all runs in that dataset :type data...
API to list data types known to dbs (when no parameter supplied). :param dataset: Returns data type (of primary dataset) of the dataset (Optional) :type dataset: str :param datatype: List specific data type :type datatype: str :returns: List of dictionaries containing the follow...
API to list all information related to the block_name :param block_name: Name of block to be dumped (Required) :type block_name: str def dumpBlock(self, block_name): """ API to list all information related to the block_name :param block_name: Name of block to be dumped (...
API to list all Acquisition Eras in DBS. :param acquisition_era_name: Acquisition era name (Optional, wild cards allowed) :type acquisition_era_name: str :returns: List of dictionaries containing the following keys (description, end_date, acquisition_era_name, create_by, creation_date and start_dat...
API to list all Processing Eras in DBS. :param processing_version: Processing Version (Optional). If provided just this processing_version will be listed :type processing_version: str :returns: List of dictionaries containing the following keys (create_by, processing_version, description, creat...
API to list all release versions in DBS :param release_version: List only that release version :type release_version: str :param dataset: List release version of the specified dataset :type dataset: str :param logical_file_name: List release version of the logical file name ...
API to list dataset access types. :param dataset_access_type: List only that dataset access type (Optional) :type dataset_access_type: str :returns: List of dictionaries containing the following key (dataset_access_type). :rtype: List of dicts def listDatasetAccessTypes(self, dataset_access_t...
API to list all physics groups. :param physics_group_name: List that specific physics group (Optional) :type physics_group_name: basestring :returns: List of dictionaries containing the following key (physics_group_name) :rtype: list of dicts def listPhysicsGroups(self, physics_group_n...
API to list run summaries, like the maximal lumisection in a run. :param dataset: dataset name (Optional) :type dataset: str :param run_num: Run number (Required) :type run_num: str, long, int :rtype: list containing a dictionary with key max_lumi def listRunSummaries(self, dat...
List all events def list(): """ List all events """ entries = lambder.list_events() for e in entries: click.echo(str(e))
Create an event def add(name, function_name, cron): """ Create an event """ lambder.add_event(name=name, function_name=function_name, cron=cron)
Load events from a json file def load(file): """ Load events from a json file """ with open(file, 'r') as f: contents = f.read() lambder.load_events(contents)
Manage AWS Lambda functions def functions(context): """ Manage AWS Lambda functions """ # find lambder.json in CWD config_file = "./lambder.json" if os.path.isfile(config_file): context.obj = FunctionConfig(config_file) pass
List lambder functions def list(): """ List lambder functions """ functions = lambder.list_functions() output = json.dumps( functions, sort_keys=True, indent=4, separators=(',', ':') ) click.echo(output)
Create a new lambda project def new( name, bucket, timeout, memory, description, subnet_ids, security_group_ids ): """ Create a new lambda project """ config = {} if timeout: config['timeout'] = timeout if memory: config['memory'] = memory if description:...
Deploy/Update a function from a project directory def deploy( config, name, bucket, timeout, memory, description, subnet_ids, security_group_ids ): """ Deploy/Update a function from a project directory """ # options should override config if it is there myname = name or conf...
Delete lambda function, role, and zipfile def rm(config, name, bucket): """ Delete lambda function, role, and zipfile """ # options should override config if it is there myname = name or config.name mybucket = bucket or config.bucket click.echo('Deleting {} from {}'.format(myname, mybucket)) l...
Invoke function in AWS def invoke(config, name, input): """ Invoke function in AWS """ # options should override config if it is there myname = name or config.name click.echo('Invoking ' + myname) output = lambder.invoke_function(myname, input) click.echo(output)
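The deploy, rm and invoke commands above all resolve settings as "CLI option wins, config file is the fallback" via `name or config.name`. A small sketch of that pattern (this `FunctionConfig` is a hypothetical stand-in for the object loaded from lambder.json):

```python
# "Options override config" resolution, as used by deploy/rm/invoke above.
class FunctionConfig:
    """Hypothetical stand-in for the config parsed from lambder.json."""
    def __init__(self, name, bucket):
        self.name = name
        self.bucket = bucket

def resolve(option, config_value):
    # `or` falls through to the config value when the CLI option is None/empty
    return option or config_value

config = FunctionConfig(name="my-function", bucket="my-bucket")
print(resolve(None, config.name))        # falls back to the config value
print(resolve("cli-name", config.name))  # CLI option takes precedence
```

One caveat of `or`-based fallback: falsy-but-valid option values (e.g. `0` for a timeout) also fall through to the config.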
Insert the data in several steps and commit when each step finishes or rollback if there is a problem. def putBlock(self, blockcontent, migration=False): """ Insert the data in several steps and commit when each step finishes or rollback if there is a problem. """ #YG try: ...
Insert Release version, application, parameter set hashes and the map(output module config). def insertOutputModuleConfig(self, remoteConfig, migration=False): """ Insert Release version, application, parameter set hashes and the map(output module config). """ otptIdList = [] m...
This method inserts a dataset from a block object into DBS. def insertDataset(self, blockcontent, otptIdList, migration=False): """ This method inserts a dataset from a block object into DBS. """ dataset = blockcontent['dataset'] conn = self.dbi.connection() # First, chec...
_insertDatasetOnly_ Insert the dataset and only the dataset Meant to be called after everything else is put into place. The insertDataset flag is set to false if the dataset already exists def insertDatasetWOannex(self, dataset, blockcontent, otptIdList, conn, ins...
Returns sites. def listSites(self, block_name="", site_name=""): """ Returns sites. """ try: conn = self.dbi.connection() if block_name: result = self.blksitelist.execute(conn, block_name) else: result = self.sitelist.e...
Input dictionary has to have the following keys: site_name it builds the correct dictionary for dao input and executes the dao def insertSite(self, businput): """ Input dictionary has to have the following keys: site_name it builds the correct dictionary for dao input an...
das_map = {'lookup' : [{params : {'param1' : 'required', 'param2' : 'optional', 'param3' : 'default_value' ...}, url : 'https://cmsweb.cern.ch:8443/dbs/prod/global/DBSReader/acquisitioneras/', das_map : {'das_param1' : dbs_param1, ...} ...
:param: sourceList: list which needs to be sliced :type: list :param: sliceSize: size of the slice :type: int :return: iterator of the sliced list def slicedIterator(sourceList, sliceSize): """ :param: sourceList: list which needs to be sliced :type: list :param: sliceSize: size of the sl...
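A runnable sketch consistent with the signature and docstring above (the body is an assumption; the original is truncated):

```python
# Yield successive slices of sliceSize elements from sourceList.
def slicedIterator(sourceList, sliceSize):
    for start in range(0, len(sourceList), sliceSize):
        yield sourceList[start:start + sliceSize]

list(slicedIterator([1, 2, 3, 4, 5], 2))  # -> [[1, 2], [3, 4], [5]]
```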
Helper function to check input before sending it to the server :param method: Name of the API :type method: str :param validParameters: Allowed parameters for the API call :type validParameters: list :param requiredParameters: Required parameters for the API call (Default: None) :type requi...
Helper function to split a list used as an input parameter for requests, since Apache has a limitation of 8190 bytes for the length of a URI. We extended it to also split lfn and dataset list length for POST calls to avoid DB abuse even though there is no limit on how long the list can be. YG 2015-5-13 :param data...
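One way such a splitter might budget chunks against the 8190-byte Apache limit is to grow a chunk until its encoded form would exceed the limit, then start a new one. A hedged sketch (the accounting here is illustrative; the real helper's bookkeeping may differ):

```python
APACHE_URI_LIMIT = 8190  # the byte limit mentioned above

def split_for_uri(key, values, limit=APACHE_URI_LIMIT):
    """Split values into chunks whose "key=v1,v2,..." string stays under limit bytes.

    Illustrative sketch only, not the actual DBS client helper.
    """
    chunks, current = [], []
    for value in values:
        candidate = current + [value]
        encoded = key + "=" + ",".join(candidate)
        if current and len(encoded) > limit:
            # adding this value would overflow the limit; start a new chunk
            chunks.append(current)
            current = [value]
        else:
            current = candidate
    if current:
        chunks.append(current)
    return chunks
```

Each resulting chunk can then be issued as its own request and the partial results merged.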
Decorator to split up server calls for methods using url parameters, due to the length limitation of the URI in Apache. By default 8190 bytes def split_calls(func): """ Decorator to split up server calls for methods using url parameters, due to the length limitation of the URI in Apache. By default 819...
A private method to make HTTP call to the DBS Server :param method: REST API to call, e.g. 'datasets, blocks, files, ...'. :type method: str :param params: Parameters to the API call, e.g. {'dataset':'/PrimaryDS/ProcessedDS/TIER'}. :type params: dict :param callmethod: The HTTP ...
An internal method, should not be used by clients :param httperror: Thrown httperror by the server def __parseForException(self, http_error): """ An internal method, should not be used by clients :param httperror: Thrown httperror by the server """ data = http_error.bo...
Returns the time needed to process the request by the frontend server in microseconds and the EPOC timestamp of the request in microseconds. :rtype: tuple containing processing time and timestamp def requestTimingInfo(self): """ Returns the time needed to process the request by the fro...
API to insert a bulk block :param blockDump: Output of the block dump command, example can be found in https://svnweb.cern.ch/trac/CMSDMWM/browser/DBS/trunk/Client/tests/dbsclient_t/unittests/blockdump.dict :type blockDump: dict def insertBulkBlock(self, blockDump): """ API to insert a...
API to insert a list of files into DBS. Up to 10 files can be inserted in one request. :param qInserts: True means that inserts will be queued instead of done immediately. INSERT QUEUE Manager will perform the inserts within a few minutes. :type qInserts: bool :param filesList: List of dic...
API to list file parents using lumi section info. :param block_name: name of the block that has files whose parents need to be found (Required) :type block_name: str :param logical_file_name: if not all the file parentages under the block are needed, this lfn list gives the files that need to find it...
API to list block parents. :param block_name: name of the block whose parents need to be found (Required) :type block_name: str :returns: List of dictionaries containing the following keys (block_name) :rtype: list of dicts def listBlockParents(self, **kwargs): """ API to list...
API to list a block in DBS. At least one of the parameters block_name, dataset, data_tier_name or logical_file_name is required. If data_tier_name is provided, min_cdate and max_cdate have to be specified and the difference in time has to be less than 31 days. :param block_name: name of the b...
API to list dataset(s) in DBS * You can use ANY combination of these parameters in this API * In absence of parameters, all valid datasets known to the DBS instance will be returned :param dataset: Full dataset (path) of the dataset :type dataset: str :param parent_dataset: Ful...
API to list datasets in DBS. :param dataset: list of datasets [dataset1,dataset2,..,dataset n] (Required if dataset_id is not presented), Max length 1000. :type dataset: list :param dataset_id: list of dataset_ids that are the primary keys of datasets table: [dataset_id1,dataset_id2,..,dataset_...
API to list files in DBS. Non-wildcarded logical_file_name, non-wildcarded dataset, non-wildcarded block_name or non-wildcarded lfn list is required. The combination of a non-wildcarded dataset or block_name with a wildcarded logical_file_name is supported. * For lumi_list the following two json for...
API to list primary dataset types :param primary_ds_type: List only that primary dataset type (Optional) :type primary_ds_type: str :param dataset: List the primary dataset type for that dataset (Optional) :type dataset: str :returns: List of dictionaries containing the following key...
API to list all run dictionary, for example: [{'run_num': [160578, 160498, 160447, 160379]}]. At least one parameter is mandatory. :param logical_file_name: List all runs in the file :type logical_file_name: str :param block_name: List all runs in the block :type block_name: st...
API to update the end_date of an acquisition era :param acquisition_era_name: acquisition_era_name to update (Required) :type acquisition_era_name: str :param end_date: end_date not zero (Required) :type end_date: int def updateAcqEraEndDate(self, **kwargs): """ API to ...
Lists lumi section numbers within a file, a list of files or a block. def execute(self, conn, logical_file_name='', block_name='', run_num=-1, validFileOnly=0, migration=False): """ Lists lumi section numbers within a file, a list of files or a block. """ sql = "" wheresql = "" lfn_generat...
cursors = self.dbi.processData(sql, binds, conn, transaction=transaction, returnCursor=True) for i in cursors: d = self.formatCursor(i, size=100) if isinstance(d, list) or isinstance(d, GeneratorType): for elem in d: yield elem elif d: ...
Lists all primary datasets if pattern is not provided. def execute(self, conn, run_num=-1, logical_file_name="", block_name="", dataset="", trans=False): """ Lists all primary datasets if pattern is not provided. """ sql = self.sql binds = {} if logical_file_name and "%" not in...
Return a list of dictionaries. Each dictionary represents one device. The dictionary contains the following keys: port, unique_id and in_use. `port` can be used with :func:`open`. `unique_id` is the serial number of the device (and can also be used with :func:`open`) and `in_use` indicates whether ...
Open an aardvark device and return an :class:`Aardvark` object. If the device cannot be opened an :class:`IOError` is raised. The `port` can be retrieved by :func:`find_devices`. Usually, the first device is 0, the second 1, etc. If you are using only one device, you can therefore omit the parameter ...
Set this to `True` to enable the hardware I2C interface. If set to `False` the hardware interface will be disabled and its pins (SDA and SCL) can be used as GPIOs. def enable_i2c(self): """Set this to `True` to enable the hardware I2C interface. If set to `False` the hardware interface ...
Set this to `True` to enable the hardware SPI interface. If set to `False` the hardware interface will be disabled and its pins (MISO, MOSI, SCK and SS) can be used as GPIOs. def enable_spi(self): """Set this to `True` to enable the hardware SPI interface. If set to `False` the hardware...
I2C bitrate in kHz. Not every bitrate is supported by the host adapter. Therefore, the actual bitrate may be less than the value which is set. The power-on default value is 100 kHz. def i2c_bitrate(self): """I2C bitrate in kHz. Not every bitrate is supported by the host adapter...
Setting this to `True` will enable the I2C pullup resistors. If set to `False` the pullup resistors will be disabled. Raises an :exc:`IOError` if the hardware adapter does not support pullup resistors. def i2c_pullups(self): """Setting this to `True` will enable the I2C pullup resistor...
Setting this to `True` will activate the power pins (4 and 6). If set to `False` the power will be deactivated. Raises an :exc:`IOError` if the hardware adapter does not support the switchable power pins. def target_power(self): """Setting this to `True` will activate the power pins (4...
I2C bus lock timeout in ms. Minimum value is 10 ms and the maximum value is 450 ms. Not every value can be set and will be rounded to the next possible number. You can read back the property to get the actual value. The power-on default value is 200 ms. def i2c_bus_timeout(self): ...
Make an I2C write access. The given I2C device is addressed and data given as a string is written. The transaction is finished with an I2C stop condition unless I2C_NO_STOP is set in the flags. 10 bit addresses are supported if the I2C_10_BIT_ADDR flag is set. def i2c_master_write(sel...
Make an I2C read access. The given I2C device is addressed and clock cycles for `length` bytes are generated. A short read will occur if the device generates an early NAK. The transaction is finished with an I2C stop condition unless the I2C_NO_STOP flag is set. def i2c_master...
Make an I2C write/read access. First an I2C write access is issued. No stop condition will be generated. Instead the read access begins with a repeated start. This method is useful for accessing most addressable I2C devices like EEPROMs, port expander, etc. Basically, this is ...
Wait for an event to occur. If `timeout` is given, it specifies the length of time in milliseconds which the function will wait for events before returning. If `timeout` is omitted, negative or None, the call will block until there is an event. Returns a list of events. In case ...
Enable I2C slave mode. The device will respond to the specified slave_address if it is addressed. You can wait for the data with :func:`poll` and get it with `i2c_slave_read`. def enable_i2c_slave(self, slave_address): """Enable I2C slave mode. The device will respond...
Read the bytes from an I2C slave reception. The bytes are returned as a string object. def i2c_slave_read(self): """Read the bytes from an I2C slave reception. The bytes are returned as a string object. """ data = array.array('B', (0,) * self.BUFFER_SIZE) status, addr,...
Returns the number of bytes transmitted by the slave. def i2c_slave_last_transmit_size(self): """Returns the number of bytes transmitted by the slave.""" ret = api.py_aa_i2c_slave_write_stats(self.handle) _raise_error_if_negative(ret) return ret
Retrieve any data fetched by the monitor. This function has an integrated timeout mechanism. You should use :func:`poll` to determine if there is any data available. Returns a list of data bytes and special symbols. There are three special symbols: `I2C_MONITOR_NACK`, I2C_MONITOR_STAR...
SPI bitrate in kHz. Not every bitrate is supported by the host adapter. Therefore, the actual bitrate may be less than the value which is set. The slowest bitrate supported is 125kHz. Any smaller value will be rounded up to 125kHz. The power-on default value is 1000 kHz. def spi_bitrat...
Configure the SPI interface. def spi_configure(self, polarity, phase, bitorder): """Configure the SPI interface.""" ret = api.py_aa_spi_configure(self.handle, polarity, phase, bitorder) _raise_error_if_negative(ret)
Configure the SPI interface by the well known SPI modes. def spi_configure_mode(self, spi_mode): """Configure the SPI interface by the well known SPI modes.""" if spi_mode == SPI_MODE_0: self.spi_configure(SPI_POL_RISING_FALLING, SPI_PHASE_SAMPLE_SETUP, SPI_BITORDER_MSB)...
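The mode dispatch in spi_configure_mode follows the four canonical SPI modes (mode = CPOL·2 + CPHA). A sketch of that mapping as a lookup table; the constant names mirror the snippet, but their numeric values here are placeholders, not the library's actual values:

```python
# Placeholder constant values (the real library defines its own).
SPI_POL_RISING_FALLING = 0   # CPOL = 0
SPI_POL_FALLING_RISING = 1   # CPOL = 1
SPI_PHASE_SAMPLE_SETUP = 0   # CPHA = 0
SPI_PHASE_SETUP_SAMPLE = 1   # CPHA = 1

SPI_MODE_TO_CONFIG = {
    0: (SPI_POL_RISING_FALLING, SPI_PHASE_SAMPLE_SETUP),  # mode 0: CPOL=0, CPHA=0
    1: (SPI_POL_RISING_FALLING, SPI_PHASE_SETUP_SAMPLE),  # mode 1: CPOL=0, CPHA=1
    2: (SPI_POL_FALLING_RISING, SPI_PHASE_SAMPLE_SETUP),  # mode 2: CPOL=1, CPHA=0
    3: (SPI_POL_FALLING_RISING, SPI_PHASE_SETUP_SAMPLE),  # mode 3: CPOL=1, CPHA=1
}

def spi_mode_to_config(mode):
    """Return the (polarity, phase) pair for a canonical SPI mode number."""
    try:
        return SPI_MODE_TO_CONFIG[mode]
    except KeyError:
        raise ValueError("unsupported SPI mode: %r" % mode)
```

A table lookup like this replaces the if/elif chain visible in the snippet while expressing the same mapping.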
Write a stream of bytes to a SPI device. def spi_write(self, data): """Write a stream of bytes to a SPI device.""" data_out = array.array('B', data) data_in = array.array('B', (0,) * len(data_out)) ret = api.py_aa_spi_write(self.handle, len(data_out), data_out, len(data_...
Change the output polarity on the SS line. Please note that this only affects the master functions. def spi_ss_polarity(self, polarity): """Change the output polarity on the SS line. Please note that this only affects the master functions. """ ret = api.py_aa_spi_master_ss_pol...
Customize edit form. def edit_form(self, obj): """Customize edit form.""" form = super(OAISetModelView, self).edit_form(obj) del form.spec return form
Flatten nested dictionary for GET / POST / DELETE API request def flatten_nested_hash(hash_table): """ Flatten nested dictionary for GET / POST / DELETE API request """ def flatten(hash_table, brackets=True): f = {} for key, value in hash_table.items(): _key = '[' + str(key)...
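A minimal runnable version of the flattening shown above, completing the truncated body for nested dicts only (the original may also handle other container types):

```python
# Flatten a nested dict into bracketed keys, e.g. {'a': {'b': 1}} -> {'a[b]': 1},
# the key shape expected by GET / POST / DELETE API requests.
def flatten_nested_hash(hash_table):
    def flatten(table, brackets=True):
        flat = {}
        for key, value in table.items():
            _key = '[' + str(key) + ']' if brackets else str(key)
            if isinstance(value, dict):
                # recurse; nested levels always get bracketed keys
                for k, v in flatten(value).items():
                    flat[_key + k] = v
            else:
                flat[_key] = value
        return flat
    return flatten(hash_table, brackets=False)

flatten_nested_hash({'vars': {'name': 'Ada', 'id': 7}, 'email': 'a@b.c'})
# -> {'vars[name]': 'Ada', 'vars[id]': 7, 'email': 'a@b.c'}
```

Note the top level is emitted without brackets (`email`, not `[email]`), matching the `brackets=False` entry point in the snippet.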
Perform an HTTP GET / POST / DELETE request def sailthru_http_request(url, data, method, file_data=None, headers=None, request_timeout=10): """ Perform an HTTP GET / POST / DELETE request """ data = flatten_nested_hash(data) method = method.upper() params, data = (None, data) if method == 'POST...
Return an instance of schema for given verb. def _schema_from_verb(verb, partial=False): """Return an instance of schema for given verb.""" from .verbs import Verbs return getattr(Verbs, verb)(partial=partial)
Return resumption token serializer. def serialize(pagination, **kwargs): """Return resumption token serializer.""" if not pagination.has_next: return token_builder = URLSafeTimedSerializer( current_app.config['SECRET_KEY'], salt=kwargs['verb'], ) schema = _schema_from_verb(...
Serialize resumption token. def _deserialize(self, value, attr, data): """Serialize resumption token.""" token_builder = URLSafeTimedSerializer( current_app.config['SECRET_KEY'], salt=data['verb'], ) result = token_builder.loads(value, max_age=current_app.config[...
Deserialize a data structure to an object. def load(self, data, many=None, partial=None): """Deserialize a data structure to an object.""" result = super(ResumptionTokenSchema, self).load( data, many=many, partial=partial ) result.data.update( result.data.get('re...
Validate arguments in incoming request. def make_request_validator(request): """Validate arguments in incoming request.""" verb = request.values.get('verb', '', type=str) resumption_token = request.values.get('resumptionToken', None) schema = Verbs if resumption_token is None else ResumptionVerbs ...
Parse an ISO8601-formatted datetime and return a datetime object. Inspired by the marshmallow.utils.from_iso function, but also accepts datestrings that don't contain the time. def from_iso_permissive(datestring, use_dateutil=True): """Parse an ISO8601-formatted datetime and return a datetime ...
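A parser in the same permissive spirit can be sketched with strptime fallbacks. The accepted format list is an assumption; the original delegates to the marshmallow/dateutil machinery:

```python
from datetime import datetime

# Accept a full ISO8601 datetime or a bare date (no time part), as
# from_iso_permissive does above. Illustrative sketch only.
def from_iso_permissive(datestring):
    for fmt in ("%Y-%m-%dT%H:%M:%S", "%Y-%m-%d"):
        try:
            return datetime.strptime(datestring, fmt)
        except ValueError:
            continue
    raise ValueError("not an ISO8601 datetime: %r" % datestring)

from_iso_permissive("2015-05-13")           # bare date, midnight assumed
from_iso_permissive("2015-05-13T10:20:30")  # full datetime
```

This keeps OAI-PMH `from`/`until` arguments usable whether or not the client supplies a time component.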
Check range between dates under keys ``from_`` and ``until``. def validate(self, data): """Check range between dates under keys ``from_`` and ``until``.""" if 'verb' in data and data['verb'] != self.__class__.__name__: raise ValidationError( # FIXME encode data ...
Upgrade database. def upgrade(): """Upgrade database.""" op.create_table( 'oaiserver_set', sa.Column('created', sa.DateTime(), nullable=False), sa.Column('updated', sa.DateTime(), nullable=False), sa.Column('id', sa.Integer(), nullable=False), sa.Column('spec', sa.String...