package node aims to package a (present working node) for a user into a container. This assumes that the node is a single partition. :param root: the root of the node to package, default is / :param name: the name for the image. If not specified, will use machine's psutil.disk_partitions() def pack...
unpackage node is intended to unpackage a node that was packaged with package_node. The image should be a .tgz file. The general steps are to: 1. Package the node using the package_node function 2. Transfer the package somewhere that Singularity is installed def unpack_node(image_path,name=None,output_fold...
get_build template returns a string or file for a particular build template, which is intended to build a version of a Singularity image on a cloud resource. :param template_name: the name of the template to retrieve in build/scripts :param params: (if needed) a dictionary of parameters to substitute in the...
sniff_extension will attempt to determine the file type based on the extension, and return the proper mimetype :param file_path: the full path to the file to sniff :param verbose: print stuff out def sniff_extension(file_path,verbose=True): '''sniff_extension will attempt to determine the file type bas...
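The extension-to-mimetype idea in sniff_extension can be approximated with the standard library's mimetypes module. This is a minimal sketch, not the library's own lookup table; the fallback mimetype is an assumption:

```python
import mimetypes

def sniff_extension(file_path, verbose=True):
    '''a simplified stand-in: guess the mimetype from the file extension,
    falling back to application/octet-stream when unknown'''
    mime_type, _ = mimetypes.guess_type(file_path)
    if mime_type is None:
        mime_type = "application/octet-stream"
    if verbose:
        print("%s -> %s" % (file_path, mime_type))
    return mime_type

mime = sniff_extension("index.html", verbose=False)
```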
get_script will return a build script_name, if it is included in singularity/build/scripts, otherwise will alert the user and return None :param script_name: the name of the script to look for def get_script(script_name): '''get_script will return a build script_name, if it is included in singularity...
zip_up will zip up some list of files into a package (.zip) :param file_list: a list of files to include in the zip. :param output_folder: the output folder to create the zip in. If not :param zip_name: the name of the zipfile to return. specified, a temporary folder will be given. def zip_up(file_lis...
get_container_contents will return a list of folders and or files for a container. The environmental variable SINGULARITY_HUB being set means that container objects are referenced instead of packages :param container: the container to get content for :param gets: a list of file names to return, without ...
get_image_hashes returns the hash for an image across all levels. This is the quickest, easiest way to define a container's reproducibility on each level. def get_image_hashes(image_path, version=None, levels=None): '''get_image_hashes returns the hash for an image across all levels. This is the quickest, ...
get_image_hash will generate a sha1 hash of an image, depending on a level of reproducibility specified by the user. (see function get_levels for descriptions) the user can also provide a level_filter manually with level_filter (for custom levels) :param level: the level of reproducibility to use, which map...
get_content_hashes is like get_image_hash, but it returns a complete dictionary of file names (keys) and their respective hashes (values). This function is intended for more research purposes and was used to generate the levels in the first place. If include_sizes is True, we include a second data structur...
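The name-to-hash dictionary that get_content_hashes describes can be sketched over plain (name, bytes) pairs. This is a toy stand-in for the idea only; the real function reads members out of the image tar:

```python
import hashlib

def get_content_hashes(members, include_sizes=True):
    '''map file names (keys) to md5 digests of their content (values);
    optionally also return a parallel dict of file sizes'''
    hashes = dict()
    sizes = dict()
    for name, content in members:
        hashes[name] = hashlib.md5(content).hexdigest()
        sizes[name] = len(content)
    if include_sizes:
        return hashes, sizes
    return hashes

hashes, sizes = get_content_hashes([("etc/hosts", b"127.0.0.1 localhost\n")])
```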
get_image_hash will return an md5 hash of the file based on a criteria level. :param level: one of LOW, MEDIUM, HIGH :param image_path: full path to the singularity image def get_image_file_hash(image_path): '''get_image_hash will return an md5 hash of the file based on a criteria level. :param level: ...
container_difference will return a data structure to render an html tree (graph) of the differences between two images or packages. The second container is subtracted from the first :param container: the primary container object (to subtract from) :param container_subtract: the second container object ...
container_sim will return a data structure to render an html tree (graph) of the intersection (commonalities) between two images or packages :param container1: the first container object :param container2: the second container object if either not defined, need :param image_package1: a packaged contain...
tree will render an html tree (graph) of a container def container_tree(container=None,image_package=None): '''tree will render an html tree (graph) of a container ''' guts = get_container_contents(container=container, image_package=image_package, ...
make_container_tree will convert a list of folders and files into a json structure that represents a graph. :param folders: a list of folders in the image :param files: a list of files in the folder :param parse_files: return 'files' lookup in result, to associate ID of node with files (default True) :p...
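Converting flat folder and file lists into a graph structure can be illustrated with a toy sketch. Node IDs and the output shape here are illustrative assumptions, not the library's exact format:

```python
def make_container_tree(folders, files):
    '''a toy conversion of flat paths into nodes and parent->child links'''
    nodes = {"": 0}   # path -> node id; "" is the root
    links = []        # (parent_id, child_id) edges
    for path in folders + files:
        parts = path.strip("/").split("/")
        for depth in range(len(parts)):
            sub = "/".join(parts[: depth + 1])
            if sub not in nodes:
                nodes[sub] = len(nodes)
                parent = "/".join(parts[:depth])
                links.append((nodes[parent], nodes[sub]))
    return {"nodes": nodes, "links": links}

tree = make_container_tree(["etc", "etc/ssl"], ["etc/hosts"])
```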
make package tree will make a dendrogram comparing a matrix of packages :param matrix: a pandas df of packages, with names in index and columns :param labels: a list of labels corresponding to row names, will be pulled from rows if not defined :param title: a title for the plot, if not defined, will be ...
make interactive tree will return complete html for an interactive tree :param title: a title for the plot, if not defined, will be left out. def make_interactive_tree(matrix=None,labels=None): '''make interactive tree will return complete html for an interactive tree :param title: a title for the plot, if...
add_node will add a node to its parent def add_node(node, parent): '''add_node will add a node to its parent ''' newNode = dict(node_id=node.id, children=[]) parent["children"].append(newNode) if node.left: add_node(node.left, newNode) if node.right: add_node(node.right, newNode)
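The add_node recursion converts a binary tree into the nested dict that d3 expects. A runnable sketch, using a toy node class in place of scipy's ClusterNode (same id/left/right shape):

```python
class Node:
    '''a toy stand-in for scipy's ClusterNode'''
    def __init__(self, node_id, left=None, right=None):
        self.id = node_id
        self.left = left
        self.right = right

def add_node(node, parent):
    '''recursively append node (and its children) to parent["children"]'''
    newNode = dict(node_id=node.id, children=[])
    parent["children"].append(newNode)
    if node.left:
        add_node(node.left, newNode)
    if node.right:
        add_node(node.right, newNode)

root = Node(2, left=Node(0), right=Node(1))
d3 = dict(children=[])
add_node(root, d3)   # d3["children"][0] is now the nested tree
```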
label tree will again recursively label the tree :param n: the root node, usually d3['children'][0] :param lookup: the node/id lookup def label_tree(n,lookup): '''label tree will again recursively label the tree :param n: the root node, usually d3['children'][0] :param lookup: the node/id lookup ...
extract app will extract metadata for one or more apps Parameters ========== image: the absolute path to the image app_name: the name of the app under /scif/apps def extract_apps(image, app_names): ''' extract app will extract metadata for one or more apps Parameters ...
run_command uses subprocess to send a command to the terminal. :param cmd: the command to send, should be a list for subprocess :param error_message: the error message to give to user if fails, if none specified, will alert that command failed. :param sudopw: if specified (not None) command will be run ...
download_repo :param repo_url: the url of the repo to clone from :param destination: the full path to the destination for the repo def download_repo(repo_url, destination, commit=None): '''download_repo :param repo_url: the url of the repo to clone from :param destination: the full path to the dest...
get tags will return a list of tags that describe the software in an image, meaning inside of a particular folder. If search_folder is not defined, uses lib :param container: if provided, will use container as image. Can also provide :param image_package: if provided, can be used instead of container :pa...
file counts will return a list of files that match one or more regular expressions. if no patterns are defined, a default of readme is used. All patterns and files are made case insensitive. Parameters ========== :param container: if provided, will use container as image. Can also provide :param...
extension counts will return a dictionary with counts of file extensions for an image. :param container: if provided, will use container as image. Can also provide :param image_package: if provided, can be used instead of container :param file_list: the complete list of files :param return_counts: r...
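Counting file extensions over a file list is a small dictionary fold. A simplified sketch that ignores the container/package arguments; grouping extensionless files under "none" is an assumption:

```python
import os

def extension_counts(file_list):
    '''count file extensions across a list of paths; files with no
    extension are grouped under "none"'''
    counts = dict()
    for file_path in file_list:
        ext = os.path.splitext(file_path)[1].lstrip(".") or "none"
        counts[ext] = counts.get(ext, 0) + 1
    return counts

counts = extension_counts(["a.py", "b.py", "README"])
```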
assess_differences will compare two images on each level of reproducibility, returning for each level a dictionary with files that are the same, different, and an overall score. :param size_heuristic: if True, assess root owned files based on size :param guts1,guts2: the result (dict with sizes,roots,e...
include_file will look at a path and determine if it matches a regular expression from a level def include_file(member,file_filter): '''include_file will look at a path and determine if it matches a regular expression from a level ''' member_path = member.name.replace('.','',1) if len(member_p...
assess if a file is root owned, meaning "root" or user/group id of 0 def is_root_owned(member): '''assess if a file is root owned, meaning "root" or user/group id of 0''' if member.uid == 0 or member.gid == 0: return True elif member.uname == 'root' or member.gname == 'root': retu...
Determine if the filter wants the file to be read for content. In the case of yes, we would then want to add the content to the hash and not the file object. def assess_content(member,file_filter): '''Determine if the filter wants the file to be read for content. In the case of yes, we would then want ...
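These member checks operate on tarfile members. A runnable sketch of the is_root_owned logic shown above, exercised with tarfile.TarInfo objects (the sample file names are illustrative):

```python
import tarfile

def is_root_owned(member):
    '''a member is root owned if uid/gid is 0 or the owner name is "root"'''
    if member.uid == 0 or member.gid == 0:
        return True
    elif member.uname == 'root' or member.gname == 'root':
        return True
    return False

root_file = tarfile.TarInfo(name="etc/shadow")   # TarInfo defaults to uid=0, gid=0
user_file = tarfile.TarInfo(name="home/user/notes.txt")
user_file.uid = 1000
user_file.gid = 1000
user_file.uname = "user"
user_file.gname = "user"
```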
get_custom_level will generate a custom level for the user, based on a regular expression. If used outside the context of tarsum, the user can generate their own named and described filters. :param regexp: must be defined, the file filter regular expression :param description: optional description def...
get_level returns a single level, with option to customize files added and skipped. def get_level(level,version=None,include_files=None,skip_files=None): '''get_level returns a single level, with option to customize files added and skipped. ''' levels = get_levels(version=version) level_names ...
modify level is intended to add / modify a content type. Default content type is list, meaning the entry is appended. If you set append to False, the content will be overwritten. For any other content type, the entry is overwritten. def modify_level(level,field,values,append=True): '''modify level is in...
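The append-vs-overwrite behavior described above can be sketched in a few lines. This is a minimal reading of the docstring, not the library's exact implementation:

```python
def modify_level(level, field, values, append=True):
    '''append to a list entry by default; overwrite the entry when
    append=False or when the existing value is not a list'''
    if isinstance(level.get(field), list) and append:
        level[field].extend(values)
    else:
        level[field] = values
    return level

level = {"include_files": ["etc/hosts"]}
modify_level(level, "include_files", ["etc/group"])   # appended
```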
get_levels returns a dictionary of levels (key) and values (dictionaries with descriptions and regular expressions for files) for the user. :param version: the version of singularity to use (default is 2.2) :param include_files: files to add to the level, only relevant if def get_levels(version=None): ...
make set efficient will convert all lists of items in levels to a set to speed up operations def make_levels_set(levels): '''make set efficient will convert all lists of items in levels to a set to speed up operations''' for level_key,level_filters in levels.items(): levels[level_key] = make_le...
make level set will convert one level into a set def make_level_set(level): '''make level set will convert one level into a set''' new_level = dict() for key,value in level.items(): if isinstance(value,list): new_level[key] = set(value) else: new_level[key] =...
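The list-to-set conversion in make_level_set is fully visible above; a self-contained version with a usage example:

```python
def make_level_set(level):
    '''convert list values in a level dict to sets, so membership tests
    during hashing are O(1); non-list values pass through unchanged'''
    new_level = dict()
    for key, value in level.items():
        if isinstance(value, list):
            new_level[key] = set(value)
        else:
            new_level[key] = value
    return new_level

level_set = make_level_set({"include_files": ["a", "b", "a"],
                            "description": "demo"})
```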
extract the file guts from an in memory tarfile. The file is not closed. This should not be done for large images. def extract_guts(image_path, tar, file_filter=None, tag_root=True, include_sizes=True): '''extract the file guts from an in ...
get an in memory tar of an image. Use carefully, not as reliable as get_image_tar def get_memory_tar(image_path): '''get an in memory tar of an image. Use carefully, not as reliable as get_image_tar ''' byte_array = Client.image.export(image_path) file_object = io.BytesIO(byte_array) ...
get an image tar, either written in memory or to the file system. file_obj will either be the file object, or the file itself. def get_image_tar(image_path): '''get an image tar, either written in memory or to the file system. file_obj will either be the file object, or the file itself. ''' ...
delete image tar will close a file object (if extracted into memory) or delete from the file system (if saved to disk) def delete_image_tar(file_obj, tar): '''delete image tar will close a file object (if extracted into memory) or delete from the file system (if saved to disk)''' try: file_obj....
extract_content will extract content from an image using cat. If return_hash=True, a hash sum is returned instead def extract_content(image_path, member_name, return_hash=False): '''extract_content will extract content from an image using cat. If return_hash=True, a hash sum is returned instead ''' if member_nam...
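The content-or-hash switch can be sketched over raw bytes. This is a simplified stand-in; the real function cats the member out of the image first, and md5 is assumed as the digest:

```python
import hashlib

def extract_content(content, return_hash=False):
    '''return the bytes themselves, or an md5 hex digest of them'''
    if return_hash:
        return hashlib.md5(content).hexdigest()
    return content

digest = extract_content(b"hello", return_hash=True)
```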
run_build takes a build directory and params dictionary, and does the following: - downloads repo to a temporary directory - changes branch or commit, if needed - creates and bootstraps singularity image from Singularity file - returns a dictionary with: image (path), metadata (dict) ...
finish build sends the build and data (response) to a response url :param build_dir: the directory of the build :param response_url: where to send the response. If None, won't send :param data: the data object to send as a post :param clean_up: If true (default) removes build directory def send_build_data(bu...
send build close sends a final response (post) to the server to bring down the instance. The following must be included in params: repo_url, logfile, repo_id, secret, log_file, token def send_build_close(params,response_url): '''send build close sends a final response (post) to the server to bring down ...
remove unicode keys and values from dict, encoding in utf8 def remove_unicode_dict(input_dict): '''remove unicode keys and values from dict, encoding in utf8 ''' if isinstance(input_dict, collections.Mapping): return dict(map(remove_unicode_dict, input_dict.iteritems())) elif isinstance(input_d...
update_dict will update lists in a dictionary. If the key is not included, it will add the value as a new list. If it is, it will append. :param input_dict: the dict to update :param value: the value to update with def update_dict(input_dict,key,value): '''update_dict will update lists in a dictionary. If the key ...
update_dict_sum will increment a dictionary key by an increment, and add a value of 0 if it doesn't exist :param input_dict: the dict to update :param increment: the value to increment by. Default is 1 :param initial_value: value to start with. Default is 0 def update_dict_sum(input_dict,key,increment...
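Both helpers above are small dict utilities; a sketch reconstructed from the docstrings (not the library's exact code):

```python
def update_dict(input_dict, key, value):
    '''append value to the list at key, creating the list if missing'''
    if key in input_dict:
        input_dict[key].append(value)
    else:
        input_dict[key] = [value]
    return input_dict

def update_dict_sum(input_dict, key, increment=1, initial_value=0):
    '''increment a counter at key, starting from initial_value if missing'''
    if key not in input_dict:
        input_dict[key] = initial_value
    input_dict[key] += increment
    return input_dict

counts = {}
update_dict_sum(counts, "py")
update_dict_sum(counts, "py")   # counts["py"] is now 2
```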
a simple Jaccard (information coefficient) to compare two lists of overlaps/diffs def information_coefficient(total1,total2,intersect): '''a simple Jaccard (information coefficient) to compare two lists of overlaps/diffs ''' total = total1 + total2 return 2.0*len(intersect) / total
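The formula is fully visible above: 2·|intersection| / (|A| + |B|), i.e. 1.0 for identical sets and 0.0 for disjoint ones. A runnable copy with a usage example (the sample file sets are illustrative):

```python
def information_coefficient(total1, total2, intersect):
    '''score similarity of two collections: 2*|intersection| / (|A|+|B|)'''
    total = total1 + total2
    return 2.0 * len(intersect) / total

a = {"bin/sh", "etc/hosts", "usr/lib"}
b = {"bin/sh", "etc/hosts", "opt/app"}
score = information_coefficient(len(a), len(b), a & b)   # 2 shared of 3+3
```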
RSA analysis will compare the similarity of two matrices def RSA(m1,m2): '''RSA analysis will compare the similarity of two matrices ''' from scipy.stats import pearsonr import scipy.linalg import numpy # This will take the diagonal of each matrix (and the other half is changed to nan) and fla...
get_google_service will return a Google API service :param service_type: the service to get (default is storage) :param version: version to use (default is v1) def get_google_service(service_type=None,version=None): ''' get_google_service will return a Google API service :param service_type: the serv...
get_folder will return the folder with folder_name, and if create=True, will create it if not found. If folder is found or created, the metadata is returned, otherwise None is returned :param storage_service: the drive_service created from get_storage_service :param bucket: the bucket object from get_bu...
get_image_path will determine an image path based on a repo url, removing any token, and taking into account urls that end with .git. :param repo_url: the repo url to parse: :param trailing_path: the trailing path (commit then hash is common) def get_image_path(repo_url, trailing_path): '''get_image_pa...
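The URL cleanup that get_image_path describes (drop any token, drop a trailing .git, keep namespace/repo plus the trailing path) can be sketched with a regular expression. The exact rules here are assumptions read off the docstring:

```python
import re

def get_image_path(repo_url, trailing_path):
    '''strip scheme plus any user token and a trailing .git from the repo
    url, then join the namespace/repo part with the trailing path'''
    repo_url = re.sub(r"^https?://([^@]+@)?", "", repo_url)
    if repo_url.endswith(".git"):
        repo_url = repo_url[:-4]
    return "%s/%s" % (repo_url.split("/", 1)[1], trailing_path)

path = get_image_path("https://token@github.com/user/repo.git", "commit/abc123")
```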
run_build will generate the Singularity build from a spec_file from a repo_url. If no arguments are required, the metadata api is queried for the values. :param build_dir: directory to do the build in. If not specified, will use temporary. :param spec_file: the spec_file name to use, assumed to be in g...
finish_build will finish the build by way of sending the log to the same bucket. the params are loaded from the previous function that built the image, expected in $HOME/params.pkl :: note: this function is currently configured to work with Google Compute Engine metadata api, and should (will) be custom...
get_build_metadata will return metadata about an instance from within it. :param key: the key to look up def get_build_metadata(key): '''get_build_metadata will return metadata about an instance from within it. :param key: the key to look up ''' headers = {"Metadata-Flavor":"Google"} url = "htt...
get_build_params uses get_build_metadata to retrieve corresponding meta data values for a build :param metadata: a list, each item a dictionary of metadata, in format: metadata = [{'key': 'repo_url', 'value': repo_url }, {'key': 'repo_id', 'value': repo_id }, {'key': 'credential'...
wrapper around the rsync command. the ssh connection arguments are set automatically. any args are just passed directly to rsync. you can use {host_string} in place of the server. the kwargs are passed on to the 'local' fabric command. if not set, 'capture' is set to False.
Bootstrap an EC2 instance that has been booted into an AMI from http://www.daemonology.net/freebsd-on-ec2/ Note: deprecated, current AMI images are basically pre-bootstrapped, they just need to be configured. def bootstrap(**kwargs): """ Bootstrap an EC2 instance that has been booted into an AMI from http://ww...
Digital Ocean's FreeBSD droplets are pretty much already pre-bootstrapped, including having python2.7 and sudo etc. pre-installed. the only thing we need to change is to allow root to login (without a password) enable pf and ensure it is running def bootstrap(**kwargs): """Digital Ocean's FreeBSD droplet...
we need some files to bootstrap the FreeBSD installation. Some... - need to be provided by the user (i.e. authorized_keys) - others have some (sensible) defaults (i.e. rc.conf) - some can be downloaded via URL, e.g. http://pkg.freebsd.org/freebsd:10:x86:64/latest/Latest/pkg....
computes the name of the disk devices that are suitable installation targets by subtracting CDROM- and USB devices from the list of total mounts. def devices(self): """ computes the name of the disk devices that are suitable installation targets by subtracting CDROM- and USB devices ...
download bootstrap assets to control host. If present on the control host they will be uploaded to the target host during bootstrapping. def fetch_assets(self): """ download bootstrap assets to control host. If present on the control host they will be uploaded to the target host during bootstra...
:param res: :class:`requests.Response` object Parse the given request and generate an informative string from it def res_to_str(res): """ :param res: :class:`requests.Response` object Parse the given request and generate an informative string from it """ if 'Authorization' in res.request.head...
Returns all the info extracted from a resource section of the apipie json :param resource_name: Name of the resource that is defined by the section :param resource_dct: Dictionary as generated by apipie of the resource definition def parse_resource_definition(resource_name, resource_dct): """ ...
Returns the appropriate resource name for the given URL. :param url: API URL stub, like: '/api/hosts' :return: Resource name, like 'hosts', or None if not found def parse_resource_from_url(self, url): """ Returns the appropriate resource name for the given URL. :param url: A...
There are three cases, because apipie definitions can have multiple signatures but python does not For example, the api endpoint: /api/myres/:myres_id/subres/:subres_id/subres2 for method *index* will be translated to the api method name: subres_index_subres2 So ...
Generate function for specific method and using specific api :param as_global: if set, will use the global function name, instead of the class method (usually {resource}_{class_method}) when defining the function def generate_func(self, as_global=False): """ Generate fu...
Generate documentation for single parameter of function :param param: dict contains info about parameter :param sub: prefix string for recursive purposes def create_param_doc(cls, param, prefix=None): """ Generate documentation for single parameter of function :param param: dict...
This function parses one of the elements of the definitions dict for a plugin and extracts the relevant information :param http_method: HTTP method that uses (GET, POST, DELETE, ...) :param funcs: functions related to that HTTP method def convert_plugin_def(http_method, funcs): """ ...
Given a repo will return the version string, according to semantic versioning, counting as non-backwards compatible commit any one with a message header that matches (case insensitive):: sem-ver: .*break.* And as features any commit with a header matching:: sem-ver: feature And count...
Given a repo and optionally a base revision to start from, will return the list of authors. def get_authors(repo_path, from_commit): """ Given a repo and optionally a base revision to start from, will return the list of authors. """ repo = dulwich.repo.Repo(repo_path) refs = get_refs(repo) ...
Emit a spout Tuple message. :param tup: the Tuple to send to Storm, should contain only JSON-serializable data. :type tup: list or tuple :param tup_id: the ID for the Tuple. Leave this blank for an unreliable emit. :type tup_id: str :pa...
The inside of ``run``'s infinite loop. Separated out so it can be properly unit tested. def _run(self): """The inside of ``run``'s infinite loop. Separated out so it can be properly unit tested. """ cmd = self.read_command() if cmd["command"] == "next": sel...
Called when a bolt acknowledges a Tuple in the topology. :param tup_id: the ID of the Tuple that has been fully acknowledged in the topology. :type tup_id: str def ack(self, tup_id): """Called when a bolt acknowledges a Tuple in the topology. :param tup_id: the ...
Called when a Tuple fails in the topology A reliable spout will replay a failed tuple up to ``max_fails`` times. :param tup_id: the ID of the Tuple that has failed in the topology either due to a bolt calling ``fail()`` or a Tuple timing out. :type...
Emit a spout Tuple & add metadata about it to `unacked_tuples`. In order for this to work, `tup_id` is a required parameter. See :meth:`Bolt.emit`. def emit( self, tup, tup_id=None, stream=None, direct_task=None, need_task_ids=False ): """Emit a spout Tuple & add metadata about it...
Handler to drop us into a remote debugger upon receiving SIGUSR1 def remote_pdb_handler(signum, frame): """ Handler to drop us into a remote debugger upon receiving SIGUSR1 """ try: from remote_pdb import RemotePdb rdb = RemotePdb(host="127.0.0.1", port=0) rdb.set_trace(frame=frame) ...
Emit a record. If a formatter is specified, it is used to format the record. If exception information is present, it is formatted using traceback.print_exception and sent to Storm. def emit(self, record): """ Emit a record. If a formatter is specified, it is used to fo...
Add helpful instance variables to component after initial handshake with Storm. Also configure logging. def _setup_component(self, storm_conf, context): """Add helpful instance variables to component after initial handshake with Storm. Also configure logging. """ self.topology...
Read and process an initial handshake message from Storm. def read_handshake(self): """Read and process an initial handshake message from Storm.""" msg = self.read_message() pid_dir, _conf, _context = msg["pidDir"], msg["conf"], msg["context"] # Write a blank PID file out to the pidDir...
Send a message to Storm via stdout. def send_message(self, message): """Send a message to Storm via stdout.""" if not isinstance(message, dict): logger = self.logger if self.logger else log logger.error( "%s.%d attempted to send a non dict message to Storm: " "%r...
Report an exception back to Storm via logging. :param exception: a Python exception. :param tup: a :class:`Tuple` object. def raise_exception(self, exception, tup=None): """Report an exception back to Storm via logging. :param exception: a Python exception. :param tup: a :clas...
Log a message to Storm optionally providing a logging level. :param message: the log message to send to Storm. :type message: str :param level: the logging level that Storm should use when writing the ``message``. Can be one of: trace, debug, info, warn, or ...
Emit a new Tuple to a stream. :param tup: the Tuple payload to send to Storm, should contain only JSON-serializable data. :type tup: :class:`list` or :class:`pystorm.component.Tuple` :param tup_id: the ID for the Tuple. If omitted by a :class:`pystorm....
Main run loop for all components. Performs initial handshake with Storm and reads Tuples handing them off to subclasses. Any exceptions are caught and logged back to Storm prior to the Python process exiting. .. warning:: Subclasses should **not** override this method. d...
Properly kill Python process including zombie threads. def _exit(self, status_code): """Properly kill Python process including zombie threads.""" # If there are active threads still running infinite loops, sys.exit # won't kill them but os._exit will. os._exit skips calling cleanup # ha...
Returns a TextIOWrapper around the given stream that handles UTF-8 encoding/decoding. def _wrap_stream(stream): """Returns a TextIOWrapper around the given stream that handles UTF-8 encoding/decoding. """ if hasattr(stream, "buffer"): return io.TextIOWrapper(stream.b...
The Storm multilang protocol consists of JSON messages followed by a newline and "end\n". All of Storm's messages (for either bolts or spouts) should be of the form:: '<command or task_id from prior emit>\\nend\\n' Command example, an incoming Tuple to a bolt:: ...
Serialize to JSON a message dictionary. def serialize_dict(self, msg_dict): """Serialize to JSON a message dictionary.""" serialized = json.dumps(msg_dict, namedtuple_as_object=False) if PY2: serialized = serialized.decode("utf-8") serialized = "{}\nend\n".format(serialized)...
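The JSON-plus-"end" framing is simple to demonstrate. A sketch using the stdlib json module (the real code passes namedtuple_as_object=False, which is a simplejson option, and handles Python 2 decoding):

```python
import json

def serialize_dict(msg_dict):
    '''serialize a message dict to JSON and frame it with "end" per the
    Storm multilang protocol'''
    serialized = json.dumps(msg_dict)
    return "{}\nend\n".format(serialized)

framed = serialize_dict({"command": "ack", "id": "tuple-1"})
```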
Read a tuple from the pipe to Storm. def read_tuple(self): """Read a tuple from the pipe to Storm.""" cmd = self.read_command() source = cmd["comp"] stream = cmd["stream"] values = cmd["tuple"] val_type = self._source_tuple_types[source].get(stream) return Tuple(...
Emit a new Tuple to a stream. :param tup: the Tuple payload to send to Storm, should contain only JSON-serializable data. :type tup: :class:`list` or :class:`pystorm.component.Tuple` :param stream: the ID of the stream to emit this Tuple to. Specify ``...
Indicate that processing of a Tuple has succeeded. :param tup: the Tuple to acknowledge. :type tup: :class:`str` or :class:`pystorm.component.Tuple` def ack(self, tup): """Indicate that processing of a Tuple has succeeded. :param tup: the Tuple to acknowledge. :type tup: :clas...
Indicate that processing of a Tuple has failed. :param tup: the Tuple to fail (its ``id`` if ``str``). :type tup: :class:`str` or :class:`pystorm.component.Tuple` def fail(self, tup): """Indicate that processing of a Tuple has failed. :param tup: the Tuple to fail (its ``id`` if ``str...
The inside of ``run``'s infinite loop. Separated out so it can be properly unit tested. def _run(self): """The inside of ``run``'s infinite loop. Separated out so it can be properly unit tested. """ tup = self.read_tuple() self._current_tups = [tup] if self.is_...
Process an exception encountered while running the ``run()`` loop. Called right before program exits. def _handle_run_exception(self, exc): """Process an exception encountered while running the ``run()`` loop. Called right before program exits. """ if len(self._current_tups) =...
Modified emit that will not return task IDs after emitting. See :class:`pystorm.component.Bolt` for more information. :returns: ``None``. def emit(self, tup, **kwargs): """Modified emit that will not return task IDs after emitting. See :class:`pystorm.component.Bolt` for more informa...
Increment tick counter, and call ``process_batch`` for all current batches if tick counter exceeds ``ticks_between_batches``. See :class:`pystorm.component.Bolt` for more information. .. warning:: This method should **not** be overridden. If you want to tweak how Tuples...
Iterate through all batches, call process_batch on them, and ack. Separated out for the rare instances when we want to subclass BatchingBolt and customize what mechanism causes batches to be processed. def process_batches(self): """Iterate through all batches, call process_batch on the...
Group non-tick Tuples into batches by ``group_key``. .. warning:: This method should **not** be overridden. If you want to tweak how Tuples are grouped into batches, override ``group_key``. def process(self, tup): """Group non-tick Tuples into batches by ``group_key``. ...
Process an exception encountered while running the ``run()`` loop. Called right before program exits. def _handle_run_exception(self, exc): """Process an exception encountered while running the ``run()`` loop. Called right before program exits. """ self.raise_exception(exc, se...
The inside of ``_batch_entry``'s infinite loop. Separated out so it can be properly unit tested. def _batch_entry_run(self): """The inside of ``_batch_entry``'s infinite loop. Separated out so it can be properly unit tested. """ time.sleep(self.secs_between_batches) wi...