This function will run once logging has been configured. It just removes the temporary stream Handler from the logging handlers. def __remove_temp_logging_handler(): ''' This function will run once logging has been configured. It just removes the temporary stream Handler from the logging handlers. ...
This function will log all unhandled Python exceptions. def __global_logging_exception_handler(exc_type, exc_value, exc_traceback): ''' This function will log all unhandled Python exceptions. ''' if exc_type.__name__ == "KeyboardInterrupt": # Do not log the exception or display the traceback ...
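A handler like this is typically installed via `sys.excepthook`. A minimal standalone sketch, with the name-mangling underscores dropped; the log message wording and the bare `return` for `KeyboardInterrupt` are assumptions beyond the visible snippet:

```python
import logging
import sys

log = logging.getLogger(__name__)

def global_logging_exception_handler(exc_type, exc_value, exc_traceback):
    # Do not log the exception or display the traceback for Ctrl-C.
    if exc_type.__name__ == 'KeyboardInterrupt':
        return
    # exc_info hands the full traceback tuple to the logging framework.
    log.error('An unhandled exception was caught',
              exc_info=(exc_type, exc_value, exc_traceback))

# Install as the process-wide hook for uncaught exceptions.
sys.excepthook = global_logging_exception_handler
```

Comparing on `exc_type.__name__` rather than the class itself avoids an identity mismatch if the interpreter is shutting down and the builtins module has been partially torn down.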
Parse GRUB conf file CLI Example: .. code-block:: bash salt '*' grub.conf def conf(): ''' Parse GRUB conf file CLI Example: .. code-block:: bash salt '*' grub.conf ''' stanza = '' stanzas = [] in_stanza = False ret = {} pos = 0 try: with...
Used by conf() to break config lines into name/value pairs def _parse_line(line=''): ''' Used by conf() to break config lines into name/value pairs ''' parts = line.split() key = parts.pop(0) value = ' '.join(parts) return key, value
Display upgrade impact information without actually upgrading the device. system_image (Mandatory Option) Path on bootflash: to system image upgrade file. kickstart_image Path on bootflash: to kickstart image upgrade file. (Not required if using combined system/kickstart image file) ...
Upgrade NX-OS switch. system_image (Mandatory Option) Path on bootflash: to system image upgrade file. kickstart_image Path on bootflash: to kickstart image upgrade file. (Not required if using combined system/kickstart image file) Default: None issu Set this optio...
Helper method that does the heavy lifting for upgrades. def _upgrade(system_image, kickstart_image, issu, **kwargs): ''' Helper method that does the heavy lifting for upgrades. ''' si = system_image ki = kickstart_image dev = 'bootflash' cmd = 'terminal dont-ask ; install all' if ki is...
Helper method to parse upgrade data from the NX-OS device. def _parse_upgrade_data(data): ''' Helper method to parse upgrade data from the NX-OS device. ''' upgrade_result = {} upgrade_result['upgrade_data'] = None upgrade_result['succeeded'] = False upgrade_result['upgrade_required'] = Fal...
Standardizes all responses :param status: :param message: :param data: :param debug_msg: :return: def __standardize_result(status, message, data=None, debug_msg=None): ''' Standardizes all responses :param status: :param message: :param data: :param debug_msg: :return:...
Determines the filepath to use :param path: :return: def __get_docker_file_path(path): ''' Determines the filepath to use :param path: :return: ''' if os.path.isfile(path): return path for dc_filename in DEFAULT_DC_FILENAMES: file_path = os.path.join(path, dc_filen...
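A runnable sketch of the lookup order, with the name-mangling underscores dropped. The contents of `DEFAULT_DC_FILENAMES` are an assumption (the usual compose filenames); the real tuple may differ:

```python
import os

# Assumption: candidate compose filenames; the upstream constant may differ.
DEFAULT_DC_FILENAMES = ('docker-compose.yml', 'docker-compose.yaml')

def get_docker_file_path(path):
    # If the caller already pointed at a file, use it directly.
    if os.path.isfile(path):
        return path
    # Otherwise look for a compose file inside the directory.
    for dc_filename in DEFAULT_DC_FILENAMES:
        file_path = os.path.join(path, dc_filename)
        if os.path.isfile(file_path):
            return file_path
    # Nothing found.
    return None
```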
Read the compose file if it exists in the directory :param file_path: :return: def __read_docker_compose_file(file_path): ''' Read the compose file if it exists in the directory :param file_path: :return: ''' if not os.path.isfile(file_path): return __standardize_result(False,...
Read the compose file and load its contents :param path: :return: def __load_docker_compose(path): ''' Read the compose file and load its contents :param path: :return: ''' file_path = __get_docker_file_path(path) if file_path is None: msg = 'Could not find docker-compos...
Dump the docker-compose content to the given path :param path: :param content: the not-yet dumped content :return: def __dump_docker_compose(path, content, already_existed): ''' Dump the docker-compose content to the given path :param path: :param content: the not-yet dumped content :return: ''' try: dumped = yaml.safe_dump(content, indent=2, default_flow...
Write docker-compose to a path in order to use it with docker-compose ( config check ) :param path: docker_compose contains the docker-compose file :return: def __write_docker_compose(path, docker_compose, already_existed): ''' Write docker-compose to a path in order to use it wi...
Load a docker-compose project from path :param path: :return: def __load_project(path): ''' Load a docker-compose project from path :param path: :return: ''' file_path = __get_docker_file_path(path) if file_path is None: msg = 'Could not find docker-compose file at {0}'.fo...
Load a docker-compose project from file path :param file_path: :return: def __load_project_from_file_path(file_path): ''' Load a docker-compose project from file path :param file_path: :return: ''' try: project = get_project(project_dir=os.path.dirname(file_path), ...
Will load the compose file located at path Then determines the format/contents of the sent definition err or results are only set if there were any :param path: :param definition: :return tuple(compose_result, loaded_definition, err): def __load_compose_definitions(path, definition): ''' ...
Utility function to dump the compose result to a file. :param path: :param compose_result: :param success_msg: the message to give upon success :return: def __dump_compose_file(path, compose_result, success_msg, already_existed): ''' Utility function to dump the compose result to a file. ...
Get action executed for each container :param project: :param service_names: :return: def _get_convergence_plans(project, service_names): ''' Get action executed for each container :param project: :param service_names: :return: ''' ret = {} plans = project._get_convergence...
Get the content of the docker-compose file into a directory path Path where the docker-compose file is stored on the server CLI Example: .. code-block:: bash salt myminion dockercompose.get /path/where/docker-compose/stored def get(path): ''' Get the content of the docker-compos...
Create and validate a docker-compose file into a directory path Path where the docker-compose file will be stored on the server docker_compose docker_compose file CLI Example: .. code-block:: bash salt myminion dockercompose.create /path/where/docker-compose/stored content ...
Pull image for containers in the docker-compose file, service_names is a python list, if omitted pull all images path Path where the docker-compose file is stored on the server service_names If specified will pull only the image for the specified services CLI Example: .. code-bloc...
Pause running containers in the docker-compose file, service_names is a python list, if omitted pause all containers path Path where the docker-compose file is stored on the server service_names If specified will pause only the specified services CLI Example: .. code-block:: bash ...
Remove stopped containers in the docker-compose file, service_names is a python list, if omitted remove all stopped containers path Path where the docker-compose file is stored on the server service_names If specified will remove only the specified stopped services CLI Example: .....
List all running containers and report some information about them path Path where the docker-compose file is stored on the server CLI Example: .. code-block:: bash salt myminion dockercompose.ps /path/where/docker-compose/stored def ps(path): ''' List all running containers and...
Create and start containers defined in the docker-compose.yml file located in path, service_names is a python list, if omitted create and start all containers path Path where the docker-compose file is stored on the server service_names If specified will create and start only the specif...
Create or update the definition of a docker-compose service This does not pull or up the service This will re-write your yaml file. Comments will be lost. Indentation is set to 2 spaces path Path where the docker-compose file is stored on the server service_name Name of the service to cr...
Remove the definition of a docker-compose service This does not rm the container This will re-write your yaml file. Comments will be lost. Indentation is set to 2 spaces path Path where the docker-compose file is stored on the server service_name Name of the service to remove CLI Ex...
Change the tag of a docker-compose service This does not pull or up the service This will re-write your yaml file. Comments will be lost. Indentation is set to 2 spaces path Path where the docker-compose file is stored on the server service_name Name of the service whose tag will be changed tag ...
Take a low tag and split it back into the low dict that it came from def split_low_tag(tag): ''' Take a low tag and split it back into the low dict that it came from ''' state, id_, name, fun = tag.split('_|-') return {'state': state, '__id__': id_, 'name': name, ...
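The tag format is four fields joined by the `_|-` delimiter. A standalone sketch of the split; the final `'fun'` key is inferred from the four-way unpack, since the dict literal is truncated above:

```python
def split_low_tag(tag):
    # Tags look like '<state>_|-<id>_|-<name>_|-<fun>'.
    state, id_, name, fun = tag.split('_|-')
    return {'state': state,
            '__id__': id_,
            'name': name,
            'fun': fun}

# e.g. a tag produced for a pkg.installed state
low = split_low_tag('pkg_|-vim_|-vim_|-installed')
```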
Generate a NULL duration for when states do not run but we want the results to be consistent. def _calculate_fake_duration(): ''' Generate a NULL duration for when states do not run but we want the results to be consistent. ''' utc_start_time = datetime.datetime.utcnow() local_start_time = ...
Return the directory that accumulator data is stored in, creating it if it doesn't exist. def get_accumulator_dir(cachedir): ''' Return the directory that accumulator data is stored in, creating it if it doesn't exist. ''' fn_ = os.path.join(cachedir, 'accumulator') if not os.path.isdir(fn_...
Trim any function off of a requisite def trim_req(req): ''' Trim any function off of a requisite ''' reqfirst = next(iter(req)) if '.' in reqfirst: return {reqfirst.split('.')[0]: req[reqfirst]} return req
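The requisite-trimming logic is visible in full above; reproduced as a standalone sketch with a usage example:

```python
def trim_req(req):
    # A requisite may name a full function ('file.managed'); keep only
    # the state module part ('file') so lookups match either form.
    reqfirst = next(iter(req))
    if '.' in reqfirst:
        return {reqfirst.split('.')[0]: req[reqfirst]}
    return req
```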
Return a set of the arguments passed to the named state def state_args(id_, state, high): ''' Return a set of the arguments passed to the named state ''' args = set() if id_ not in high: return args if state not in high[id_]: return args for item in high[id_][state]: ...
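A sketch of `state_args` based on the visible prefix; the handling of non-dict and multi-key items past the truncation point is an assumption:

```python
def state_args(id_, state, high):
    # Collect the argument names declared as single-key dicts
    # under high[id_][state].
    args = set()
    if id_ not in high:
        return args
    if state not in high[id_]:
        return args
    for item in high[id_][state]:
        # Skip bare function names like 'installed'.
        if not isinstance(item, dict):
            continue
        # Assumption: only single-key dicts count as arguments.
        if len(item) != 1:
            continue
        args.add(next(iter(item)))
    return args
```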
Scan high data for the id referencing the given name and return a list of (IDs, state) tuples that match Note: if `state` is sls, then we are looking for all IDs that match the given SLS def find_name(name, state, high): ''' Scan high data for the id referencing the given name and return a list of (IDs, s...
Scan for all ids in the given sls and return them in a dict; {name: state} def find_sls_ids(sls, high): ''' Scan for all ids in the given sls and return them in a dict; {name: state} ''' ret = [] for nid, item in six.iteritems(high): try: sls_tgt = item['__sls__'] except...
Format the state into a log message def format_log(ret): ''' Format the state into a log message ''' msg = '' if isinstance(ret, dict): # Looks like the ret may be a valid state return if 'changes' in ret: # Yep, looks like a valid state return chg = ret['cha...
Compile the master side low state data, and build the hidden state file def master_compile(master_opts, minion_opts, grains, id_, saltenv): ''' Compile the master side low state data, and build the hidden state file ''' st_ = MasterHighState(master_opts, minion_opts, grains, id_, saltenv) return st...
Enforce the states in a template def render_template(self, template, **kwargs): ''' Enforce the states in a template ''' high = compile_template(template, self.rend, self.opts['renderer'], se...
Turns dot delimited function refs into function strings def pad_funcs(self, high): ''' Turns dot delimited function refs into function strings ''' for name in high: if not isinstance(high[name], dict): if isinstance(high[name], six.string_types): ...
Verify that the high data is viable and follows the data structure def verify_high(self, high): ''' Verify that the high data is viable and follows the data structure ''' errors = [] if not isinstance(high, dict): errors.append('High data is not a dictionary and is i...
Sort the chunk list verifying that the chunks follow the order specified in the order options. def order_chunks(self, chunks): ''' Sort the chunk list verifying that the chunks follow the order specified in the order options. ''' cap = 1 for chunk in chunks: ...
Read in the __exclude__ list and remove all excluded objects from the high data def apply_exclude(self, high): ''' Read in the __exclude__ list and remove all excluded objects from the high data ''' if '__exclude__' not in high: return high ex_sls = s...
Whenever a state run starts, gather the pillar data fresh def _gather_pillar(self): ''' Whenever a state run starts, gather the pillar data fresh ''' if self._pillar_override: if self._pillar_enc: try: self._pillar_override = salt.utils.cr...
Check the module initialization function, if this is the first run of a state package that has a mod_init function, then execute the mod_init function in the state module. def _mod_init(self, low): ''' Check the module initialization function, if this is the first run of a state...
Execute the aggregation systems to runtime modify the low chunk def _mod_aggregate(self, low, running, chunks): ''' Execute the aggregation systems to runtime modify the low chunk ''' agg_opt = self.functions['config.option']('state_aggregate') if 'aggregate' in low: ...
Check that unless doesn't return 0, and that onlyif returns a 0. def _run_check(self, low_data): ''' Check that unless doesn't return 0, and that onlyif returns a 0. ''' ret = {'result': False, 'comment': []} cmd_opts = {} if 'shell' in self.opts['grains']: ...
Check that unless doesn't return 0, and that onlyif returns a 0. def _run_check_onlyif(self, low_data, cmd_opts): ''' Check that unless doesn't return 0, and that onlyif returns a 0. ''' ret = {'result': False} if not isinstance(low_data['onlyif'], list): low_data_o...
Alter the way a successful state run is determined def _run_check_cmd(self, low_data): ''' Alter the way a successful state run is determined ''' ret = {'result': False} cmd_opts = {} if 'shell' in self.opts['grains']: cmd_opts['shell'] = self.opts['grains']....
Read the state loader value and load up the correct states subsystem def _load_states(self): ''' Read the state loader value and load up the correct states subsystem ''' if self.states_loader == 'thorium': self.states = salt.loader.thorium(self.opts, self.functions, {}) # TOD...
Load the modules into the state def load_modules(self, data=None, proxy=None): ''' Load the modules into the state ''' log.info('Loading fresh modules for state activity') self.utils = salt.loader.utils(self.opts) self.functions = salt.loader.minion_mods(self.opts, self....
Refresh all the modules def module_refresh(self): ''' Refresh all the modules ''' log.debug('Refreshing modules...') if self.opts['grains'].get('os') != 'MacOS': # In case a package has been installed into the current python # process 'site-packages', the...
Check to see if the modules for this state instance need to be updated, only update if the state is a file or a package and if it changed something. If the file function is managed check to see if the file is a possible module type, e.g. a python, pyx, or .so. Always refresh if the funct...
Verify the data, return an error statement if something is wrong def verify_data(self, data): ''' Verify the data, return an error statement if something is wrong ''' errors = [] if 'state' not in data: errors.append('Missing "state" data') if 'fun' not in da...
Verify the chunks in a list of low data structures def verify_chunks(self, chunks): ''' Verify the chunks in a list of low data structures ''' err = [] for chunk in chunks: err.extend(self.verify_data(chunk)) return err
Sort the chunk list verifying that the chunks follow the order specified in the order options. def order_chunks(self, chunks): ''' Sort the chunk list verifying that the chunks follow the order specified in the order options. ''' cap = 1 for chunk in chunks: ...
"Compile" the high data as it is retrieved from the CLI or YAML into the individual state executor structures def compile_high_data(self, high, orchestration_jid=None): ''' "Compile" the high data as it is retrieved from the CLI or YAML into the individual state executor structures ...
Pull the extend data and add it to the respective high data def reconcile_extend(self, high): ''' Pull the extend data and add it to the respective high data ''' errors = [] if '__extend__' not in high: return high, errors ext = high.pop('__extend__') ...
Extend the data reference with requisite_in arguments def requisite_in(self, high): ''' Extend the data reference with requisite_in arguments ''' req_in = {'require_in', 'watch_in', 'onfail_in', 'onchanges_in', 'use', 'use_in', 'prereq', 'prereq_in'} req_in_all = req_in.union({'...
The target function to call that will create the parallel thread/process def _call_parallel_target(self, name, cdata, low): ''' The target function to call that will create the parallel thread/process ''' # we need to re-record start/end duration here because it is impossible to ...
Call the state defined in the given cdata in parallel def call_parallel(self, cdata, low): ''' Call the state defined in the given cdata in parallel ''' # There are a number of possibilities to not have the cdata # populated with what we might have expected, so just be smart ...
Call a state directly with the low data structure, verify data before processing. def call(self, low, chunks=None, running=None, retries=1): ''' Call a state directly with the low data structure, verify data before processing. ''' use_uptime = False if os.path.is...
Read in the arguments from the low level slot syntax to make a last minute runtime call to gather relevant data for the specific routine Will parse strings, first level of dictionary values, and strings and first level dict values inside of lists def format_slots(self, cdata): ''' ...
Verify the specified retry data def verify_retry_data(self, retry_data): ''' Verify the specified retry data ''' retry_defaults = { 'until': True, 'attempts': 2, 'splay': 0, 'interval': 30, } expected_da...
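A sketch of the validation using the defaults visible above. The exact merge and type-checking behaviour past the truncation is an assumption, and the `expected_data_type` table here is hypothetical:

```python
def verify_retry_data(retry_data):
    # Defaults taken from the snippet above.
    retry_defaults = {
        'until': True,
        'attempts': 2,
        'splay': 0,
        'interval': 30,
    }
    # Assumption: each option has one expected type.
    expected_data_type = {
        'until': bool,
        'attempts': int,
        'splay': int,
        'interval': int,
    }
    validated = {}
    for key, default in retry_defaults.items():
        value = retry_data.get(key, default)
        # Fall back to the default when the caller supplied a bad type.
        if not isinstance(value, expected_data_type[key]):
            value = default
        validated[key] = value
    return validated
```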
Iterate over a list of chunks and call them, checking for requires. def call_chunks(self, chunks): ''' Iterate over a list of chunks and call them, checking for requires. ''' # Check for any disabled states disabled = {} if 'state_runs_disabled' in self.opts['grains']: ...
Check if the low data chunk should send a failhard signal def check_failhard(self, low, running): ''' Check if the low data chunk should send a failhard signal ''' tag = _gen_tag(low) if self.opts.get('test', False): return False if low.get('failhard', self.o...
Check to see if this low chunk has been paused def check_pause(self, low): ''' Check to see if this low chunk has been paused ''' if not self.jid: # Can't pause on salt-ssh since we can't track continuous state return pause_path = os.path.join(self.opts['...
Check the running dict for processes and resolve them def reconcile_procs(self, running): ''' Check the running dict for processes and resolve them ''' retset = set() for tag in running: proc = running[tag].get('proc') if proc: if not proc...
Look into the running data to check the status of all requisite states def check_requisite(self, low, running, chunks, pre=False): ''' Look into the running data to check the status of all requisite states ''' disabled_reqs = self.opts.get('disabled_requisites', []) ...
Fire an event on the master bus If `fire_event` is set to True an event will be sent with the chunk name in the tag and the chunk result in the event data. If `fire_event` is set to a string such as `mystate/is/finished`, an event will be sent with the string added to the tag and the c...
Check if a chunk has any requires, execute the requires and then the chunk def call_chunk(self, low, running, chunks): ''' Check if a chunk has any requires, execute the requires and then the chunk ''' low = self._mod_aggregate(low, running, chunks) self._mod_ini...
Find all of the listen routines and call the associated mod_watch runs def call_listen(self, chunks, running): ''' Find all of the listen routines and call the associated mod_watch runs ''' listeners = [] crefs = {} for chunk in chunks: crefs[(chunk['state'],...
Set the .call function on a state, if not already there. :param high: :return: def inject_default_call(self, high): ''' Set the .call function on a state, if not already there. :param high: :return: ''' for chunk in high: state = high[chunk] if not is...
Process a high data call and ensure the defined states. def call_high(self, high, orchestration_jid=None): ''' Process a high data call and ensure the defined states. ''' self.inject_default_call(high) errors = [] # If there is extension data reconcile it high, e...
Enforce the states in a template def call_template(self, template): ''' Enforce the states in a template ''' high = compile_template(template, self.rend, self.opts['renderer'], self.opts['ren...
Enforce the states in a template, pass the template as a string def call_template_str(self, template): ''' Enforce the states in a template, pass the template as a string ''' high = compile_template_str(template, self.rend, ...
The options used by the High State object are derived from options on the minion and the master, or just the minion if the high state call is entirely local. def __gen_opts(self, opts): ''' The options used by the High State object are derived from options on the minion and the ...
Pull the file server environments out of the master options def _get_envs(self): ''' Pull the file server environments out of the master options ''' envs = ['base'] if 'file_roots' in self.opts: envs.extend([x for x in list(self.opts['file_roots']) ...
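A standalone sketch of the environment lookup, taking `opts` as a plain dict instead of `self.opts`; the de-duplication against the existing list matches the visible list comprehension, but the truncated tail is an assumption:

```python
def get_envs(opts):
    # 'base' is always present; every key in file_roots adds an env.
    envs = ['base']
    if 'file_roots' in opts:
        envs.extend([x for x in list(opts['file_roots'])
                     if x not in envs])
    return envs

# e.g. a master config with two file_roots sections
environments = get_envs({'file_roots': {'base': [], 'dev': []}})
```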
Gather the top files def get_tops(self): ''' Gather the top files ''' tops = DefaultOrderedDict(list) include = DefaultOrderedDict(list) done = DefaultOrderedDict(list) found = 0 # did we find any contents in the top files? # Gather initial top files ...
Cleanly merge the top files def merge_tops(self, tops): ''' Cleanly merge the top files ''' merging_strategy = self.opts['top_file_merging_strategy'] try: merge_attr = '_merge_tops_{0}'.format(merging_strategy) merge_func = getattr(self, merge_attr) ...
The default merging strategy. The base env is authoritative, so it is checked first, followed by the remaining environments. In top files from environments other than "base", only the section matching the environment from the top file will be considered, and it too will be ignored if tha...
For each saltenv, only consider the top file from that saltenv. All sections matching a given saltenv, which appear in a different saltenv's top file, will be ignored. def _merge_tops_same(self, tops): ''' For each saltenv, only consider the top file from that saltenv. All secti...
Merge the top files into a single dictionary def _merge_tops_merge_all(self, tops): ''' Merge the top files into a single dictionary ''' def _read_tgt(tgt): match_type = None states = [] for item in tgt: if isinstance(item, dict): ...
Verify the contents of the top file data def verify_tops(self, tops): ''' Verify the contents of the top file data ''' errors = [] if not isinstance(tops, dict): errors.append('Top data was not formed as a dict') # No further checks will work, bail out ...
Returns the high data derived from the top file def get_top(self): ''' Returns the high data derived from the top file ''' try: tops = self.get_tops() except SaltRenderError as err: log.error('Unable to render top file: %s', err.error) return ...
Search through the top high data for matches and return the states that this minion needs to execute. Returns: {'saltenv': ['state1', 'state2', ...]} def top_matches(self, top): ''' Search through the top high data for matches and return the states that this minion need...
If autoload_dynamic_modules is True then automatically load the dynamic modules def load_dynamic(self, matches): ''' If autoload_dynamic_modules is True then automatically load the dynamic modules ''' if not self.opts['autoload_dynamic_modules']: return ...
Render a state file and retrieve all of the include states def render_state(self, sls, saltenv, mods, matches, local=False): ''' Render a state file and retrieve all of the include states ''' errors = [] if not local: state_data = self.client.get_state(sls, saltenv) ...
Take a state and apply the iorder system def _handle_iorder(self, state): ''' Take a state and apply the iorder system ''' if self.opts['state_auto_order']: for name in state: for s_dec in state[name]: if not isinstance(s_dec, six.string_t...
Add sls and saltenv components to the state def _handle_state_decls(self, state, sls, saltenv, errors): ''' Add sls and saltenv components to the state ''' for name in state: if not isinstance(state[name], dict): if name == '__extend__': c...
Take the extend dec out of state and apply to the highstate global dec def _handle_extend(self, state, sls, saltenv, errors): ''' Take the extend dec out of state and apply to the highstate global dec ''' if 'extend' in state: ext = state.pop('extend') ...
Take the exclude dec out of the state and apply it to the highstate global dec def _handle_exclude(self, state, sls, saltenv, errors): ''' Take the exclude dec out of the state and apply it to the highstate global dec ''' if 'exclude' in state: exc = state.po...
Gather the state files and render them into a single unified salt high data structure. def render_highstate(self, matches): ''' Gather the state files and render them into a single unified salt high data structure. ''' highstate = self.building_highstate all_erro...
Check the pillar for errors, refuse to run the state if there are errors in the pillar and return the pillar errors def _check_pillar(self, force=False): ''' Check the pillar for errors, refuse to run the state if there are errors in the pillar and return the pillar errors ''' ...
Reads over the matches and returns a matches dict with just the ones that are in the whitelist def matches_whitelist(self, matches, whitelist): ''' Reads over the matches and returns a matches dict with just the ones that are in the whitelist ''' if not whitelist: ...
Run the sequence to execute the salt highstate for this minion def call_highstate(self, exclude=None, cache=None, cache_name='highstate', force=False, whitelist=None, orchestration_jid=None): ''' Run the sequence to execute the salt highstate for this minion ''' #...
Return just the highstate or the errors def compile_highstate(self): ''' Return just the highstate or the errors ''' err = [] top = self.get_top() err += self.verify_tops(top) matches = self.top_matches(top) high, errors = self.render_highstate(matches) ...
Compile the highstate but don't run it, return the low chunks to see exactly what the highstate will execute def compile_low_chunks(self): ''' Compile the highstate but don't run it, return the low chunks to see exactly what the highstate will execute ''' top = self.get_...
Return all used and unused states for the minion based on the top match data def compile_state_usage(self): ''' Return all used and unused states for the minion based on the top match data ''' err = [] top = self.get_top() err += self.verify_tops(top) if err: ...
Load the modules into the state def load_modules(self, data=None, proxy=None): ''' Load the modules into the state ''' log.info('Loading fresh modules for state activity') # Load a modified client interface that looks like the interface used # from the minion, but uses r...