Return a pandas dataframe with all the training jobs, along with their hyperparameters, results, and metadata. This also includes a column to indicate if a training job was the best seen so far. def _fetch_dataframe(self): """Return a pandas dataframe with all the training jobs, along with thei...
A dictionary describing the ranges of all tuned hyperparameters. The keys are the names of the hyperparameter, and the values are the ranges. def tuning_ranges(self): """A dictionary describing the ranges of all tuned hyperparameters. The keys are the names of the hyperparameter, and the values...
Call ``DescribeHyperParameterTuningJob`` for the hyperparameter tuning job. Args: force_refresh (bool): Set to True to fetch the latest data from SageMaker API. Returns: dict: The Amazon SageMaker response for ``DescribeHyperParameterTuningJob``. def description(self, force_re...
A (paginated) list of everything from ``ListTrainingJobsForTuningJob``. Args: force_refresh (bool): Set to True to fetch the latest data from SageMaker API. Returns: dict: The Amazon SageMaker response for ``ListTrainingJobsForTuningJob``. def training_job_summaries(self, forc...
Clear the object of all local caches of API methods, so that the next time any properties are accessed they will be refreshed from the service. def clear_cache(self): """Clear the object of all local caches of API methods, so that the next time any properties are accessed they will be r...
Return a dictionary with two datetime objects, start_time and end_time, covering the interval of the training job def _determine_timeinterval(self): """Return a dictionary with two datetime objects, start_time and end_time, covering the interval of the training job """ descripti...
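A minimal sketch of the interval computation described above. The field names (`TrainingStartTime`, `TrainingEndTime`) and the one-minute end padding are assumptions about the describe-call response, not the SDK's exact implementation:

```python
from datetime import datetime, timedelta

def determine_timeinterval(description, buffer_minutes=1):
    # Hypothetical sketch: read start/end from a DescribeTrainingJob-style
    # response; pad the end so trailing metric datapoints are not missed.
    start = description['TrainingStartTime']
    end = description.get('TrainingEndTime') or datetime.utcnow()
    return {'start_time': start, 'end_time': end + timedelta(minutes=buffer_minutes)}
```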
Fetch all the values of a named metric, and add them to _data def _fetch_metric(self, metric_name): """Fetch all the values of a named metric, and add them to _data """ request = { 'Namespace': self.CLOUDWATCH_NAMESPACE, 'MetricName': metric_name, 'Dimensions...
Store a single metric in the _data dict which can be converted to a dataframe. def _add_single_metric(self, timestamp, metric_name, value): """Store a single metric in the _data dict which can be converted to a dataframe. """ # note that this method is built this way to make it ...
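The column-oriented `_data` layout described above can be sketched like this; the column names are illustrative, not necessarily the library's:

```python
from collections import defaultdict

def add_single_metric(data, timestamp, metric_name, value):
    # Append one datapoint per column; pandas.DataFrame(data) can consume
    # a dict of equal-length lists like this directly.
    data['timestamp'].append(timestamp)
    data['metric_name'].append(metric_name)
    data['value'].append(value)

data = defaultdict(list)
add_single_metric(data, 0, 'loss', 0.9)
add_single_metric(data, 60, 'loss', 0.5)
```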
Helper method to discover the metrics defined for a training job. def _metric_names_for_training_job(self): """Helper method to discover the metrics defined for a training job. """ training_description = self._sage_client.describe_training_job(TrainingJobName=self._training_job_name) m...
Append a timestamp to the provided string. This function assures that the total length of the resulting string is not longer than the specified max length, trimming the input parameter if necessary. Args: base (str): String used as prefix to generate the unique name. max_length (int): Maxi...
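The trimming behavior can be sketched as follows; the exact timestamp format and trimming rule here are assumptions, not the SDK's implementation:

```python
import time

def name_from_base(base, max_length=63):
    # Sketch only: trim the base so that base + '-' + timestamp never
    # exceeds max_length.
    timestamp = time.strftime('%Y-%m-%d-%H-%M-%S', time.gmtime())
    trimmed = base[:max_length - len(timestamp) - 1]
    return '{}-{}'.format(trimmed, timestamp)
```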
Extract the base name of the image to use as the 'algorithm name' for the job. Args: image (str): Image name. Returns: str: Algorithm name, as extracted from the image name. def base_name_from_image(image): """Extract the base name of the image to use as the 'algorithm name' for the job. ...
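A plausible sketch of the extraction: drop any registry/repository prefix and any `:tag` suffix. The regex is a reasonable reading of the behavior described, not a guaranteed match for the SDK's pattern:

```python
import re

def base_name_from_image(image):
    # group(2) is the repository basename between the last '/' and any ':tag'.
    match = re.match(r'^(.+/)?([^:/]+)(:[^:]+)?$', image)
    return match.group(2) if match else image
```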
Return a timestamp with millisecond precision. def sagemaker_timestamp(): """Return a timestamp with millisecond precision.""" moment = time.time() moment_ms = repr(moment).split('.')[1][:3] return time.strftime("%Y-%m-%d-%H-%M-%S-{}".format(moment_ms), time.gmtime(moment))
Print the function name and arguments for debugging. def debug(func): """Print the function name and arguments for debugging.""" @wraps(func) def wrapper(*args, **kwargs): print("{} args: {} kwargs: {}".format(func.__name__, args, kwargs)) return func(*args, **kwargs) return wrapper
Convert the input to a string, unless it is a unicode string in Python 2. Unicode strings are supported as native strings in Python 3, but ``str()`` cannot be invoked on unicode strings in Python 2, so we need to check for that case when converting user-specified values to strings. Args: value...
Returns the name used in the API given a full ARN for a training job or hyperparameter tuning job. def extract_name_from_job_arn(arn): """Returns the name used in the API given a full ARN for a training job or hyperparameter tuning job. """ slash_pos = arn.find('/') if slash_pos == -1: ...
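Completing the visible fragment above into a runnable form (the error message is an assumption):

```python
def extract_name_from_job_arn(arn):
    # Everything after the first '/' in the ARN is the job name.
    slash_pos = arn.find('/')
    if slash_pos == -1:
        raise ValueError('Cannot parse invalid ARN: {}'.format(arn))
    return arn[slash_pos + 1:]
```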
Returns a string containing the last modified time and the secondary training job status message. Args: job_description: Returned response from DescribeTrainingJob call prev_description: Previous job description from DescribeTrainingJob call Returns: str: Job status string to be printed. de...

Download a folder from S3 to a local path Args: bucket_name (str): S3 bucket name prefix (str): S3 prefix within the bucket that will be downloaded. Can be a single file. target (str): destination path where the downloaded items will be placed sagemaker_session (:class:`sagemaker.se...
Create a tar file containing all the source_files Args: source_files (List[str]): List of file paths that will be contained in the tar file Returns: (str): path to created tar file def create_tar_file(source_files, target=None): """Create a tar file containing all the source_files A...
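A self-contained sketch of the tar creation; gzip compression and basename arcnames are assumptions rather than the SDK's exact choices:

```python
import os
import tarfile
import tempfile

def create_tar_file(source_files, target=None):
    # If no target is given, create a temporary archive path.
    if target is None:
        fd, target = tempfile.mkstemp(suffix='.tar.gz')
        os.close(fd)
    with tarfile.open(target, mode='w:gz') as t:
        for f in source_files:
            t.add(f, arcname=os.path.basename(f))
    return target
```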
Download a single file from S3 into a local path Args: bucket_name (str): S3 bucket name path (str): file path within the bucket target (str): destination directory for the downloaded file. sagemaker_session (:class:`sagemaker.session.Session`): a sagemaker session to interact with ...
Sync to_directory with from_directory by copying each file in to_directory with new contents. Files in to_directory will be overwritten by files of the same name in from_directory. We need to keep two copies of the log directory because otherwise TensorBoard picks up temp files from `aws...
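The overwrite-copy step can be sketched as below. This is a non-recursive simplification under the assumption that only regular files at the top level need syncing:

```python
import os
import shutil

def sync_directories(from_directory, to_directory):
    # Copy each regular file, overwriting any file of the same name
    # already present in to_directory.
    os.makedirs(to_directory, exist_ok=True)
    for name in os.listdir(from_directory):
        src = os.path.join(from_directory, name)
        if os.path.isfile(src):
            shutil.copy2(src, os.path.join(to_directory, name))
```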
Create a TensorBoard process. Returns: tuple: A tuple containing: int: The port number. process: The TensorBoard process. Raises: OSError: If no ports between 6006 and 6105 are available for starting TensorBoard. def create_tensorboard_process(s...
Run TensorBoard process. def run(self): """Run TensorBoard process.""" port, tensorboard_process = self.create_tensorboard_process() LOGGER.info('TensorBoard 0.1.7 at http://localhost:{}'.format(port)) while not self.estimator.checkpoint_path: self.event.wait(1) wit...
Train a model using the input training dataset. See :func:`~sagemaker.estimator.EstimatorBase.fit` for more details. Args: inputs (str or dict or sagemaker.session.s3_input): Information about the training data. This can be one of three types: * (str) - the...
Convert the job description to init params that can be handled by the class constructor Args: job_details: the returned job details from a describe_training_job API call. Returns: dictionary: The transformed init_params def _prepare_init_params_from_job_description(cls, job_d...
Create a SageMaker ``TensorFlowModel`` object that can be deployed to an ``Endpoint``. Args: role (str): The ``ExecutionRoleArn`` IAM Role ARN for the ``Model``, which is also used during transform jobs. If not specified, the role from the Estimator will be used. model_s...
Return hyperparameters used by your custom TensorFlow code during model training. def hyperparameters(self): """Return hyperparameters used by your custom TensorFlow code during model training.""" hyperparameters = super(TensorFlow, self).hyperparameters() self.checkpoint_path = self.checkpoin...
Extracts subnets and security group ids as lists from a VpcConfig dict Args: vpc_config (dict): a VpcConfig dict containing 'Subnets' and 'SecurityGroupIds' do_sanitize (bool): whether to sanitize the VpcConfig dict before extracting values Returns: Tuple of lists as (subnets, security...
Checks that an instance of VpcConfig has the expected keys and values, removes unexpected keys, and raises ValueErrors if any expectations are violated Args: vpc_config (dict): a VpcConfig dict containing 'Subnets' and 'SecurityGroupIds' Returns: A valid VpcConfig dict containing only 'Sub...
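A sketch of the sanitization just described; the exact error messages and edge-case handling are assumptions:

```python
def sanitize_vpc_config(vpc_config):
    # None passes through; non-dicts and empty required fields are rejected.
    if vpc_config is None:
        return None
    if not isinstance(vpc_config, dict):
        raise ValueError('vpc_config is not a dict: {}'.format(vpc_config))
    subnets = vpc_config.get('Subnets')
    security_group_ids = vpc_config.get('SecurityGroupIds')
    if not subnets or not security_group_ids:
        raise ValueError('vpc_config must contain nonempty Subnets and SecurityGroupIds')
    # Unexpected keys are dropped by rebuilding the dict.
    return {'Subnets': subnets, 'SecurityGroupIds': security_group_ids}
```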
Return the SageMaker hyperparameters for training this KMeans Estimator def hyperparameters(self): """Return the SageMaker hyperparameters for training this KMeans Estimator""" hp_dict = dict(force_dense='True') # KMeans requires this hp to fit on Record objects hp_dict.update(super(KMeans, se...
Creates a new tuner by copying the request fields from the provided parent to the new instance of ``HyperparameterTuner`` followed by addition of warm start configuration with the type as "IdenticalDataAndAlgorithm" and ``parents`` as the union of provided list of ``additional_parents`` and the ...
Creates a new ``HyperParameterTuner`` by copying the request fields from the provided parent to the new instance of ``HyperparameterTuner`` followed by addition of warm start configuration with the type as "TransferLearning" and ``parents`` as the union of provided list of ``additional_parents`` and the...
Creates an instance of ``WarmStartConfig`` class, from warm start configuration response from DescribeTrainingJob. Args: warm_start_config (dict): The expected format of the ``warm_start_config`` contains two first-class fields: * "type": Type of warm start tuner...
Converts the ``self`` instance to the desired input request format. Returns: dict: Containing the "WarmStartType" and "ParentHyperParameterTuningJobs" as the first class fields. Examples: >>> warm_start_config = WarmStartConfig(warm_start_type=WarmStartTypes.TransferLearning,pa...
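The request shape from the docstring can be built like this; the free function is hypothetical (the real code is a method on `WarmStartConfig`), but the field names come from the docstring above:

```python
def warm_start_to_input_req(warm_start_type, parents):
    # "WarmStartType" and "ParentHyperParameterTuningJobs" are the
    # first-class fields named in the docstring.
    return {
        'WarmStartType': warm_start_type,
        'ParentHyperParameterTuningJobs': [
            {'HyperParameterTuningJobName': parent} for parent in parents
        ],
    }
```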
Start a hyperparameter tuning job. Args: inputs: Information about the training data. Please refer to the ``fit()`` method of the associated estimator, as this can take any of the following forms: * (str) - The S3 location where training data is saved. ...
Attach to an existing hyperparameter tuning job. Create a HyperparameterTuner bound to an existing hyperparameter tuning job. After attaching, if there exists a best training job (or any other completed training job), that can be deployed to create an Amazon SageMaker Endpoint and return a ``Pr...
Deploy the best trained or user specified model to an Amazon SageMaker endpoint and return a ``sagemaker.RealTimePredictor`` object. For more information: http://docs.aws.amazon.com/sagemaker/latest/dg/how-it-works-training.html Args: initial_instance_count (int): Minimum number of...
Return name of the best training job for the latest hyperparameter tuning job. Raises: Exception: If there is no best training job available for the hyperparameter tuning job. def best_training_job(self): """Return name of the best training job for the latest hyperparameter tuning job. ...
Delete an Amazon SageMaker endpoint. If an endpoint name is not specified, this defaults to looking for an endpoint that shares a name with the best training job for deletion. Args: endpoint_name (str): Name of the endpoint to delete def delete_endpoint(self, endpoint_name=None): ...
Return the hyperparameter ranges in a dictionary to be used as part of a request for creating a hyperparameter tuning job. def hyperparameter_ranges(self): """Return the hyperparameter ranges in a dictionary to be used as part of a request for creating a hyperparameter tuning job. """ ...
Creates a new ``HyperparameterTuner`` by copying the request fields from the provided parent to the new instance of ``HyperparameterTuner``, followed by the addition of a warm start configuration with the type as "TransferLearning" and parents as the union of the provided list of ``additional_parents`` and the ``...
Creates a new ``HyperparameterTuner`` by copying the request fields from the provided parent to the new instance of ``HyperparameterTuner``, followed by the addition of a warm start configuration with the type as "IdenticalDataAndAlgorithm" and parents as the union of the provided list of ``additional_parents`` a...
Creates a new ``HyperparameterTuner`` with ``WarmStartConfig``, where type will be equal to ``warm_start_type`` and ``parents`` will be equal to the union of ``additional_parents`` and self. Args: additional_parents (set{str}): Additional parents along with self, to be used for warm starting. ...
Create a new Amazon SageMaker hyperparameter tuning job from the HyperparameterTuner. Args: tuner (sagemaker.tuner.HyperparameterTuner): HyperparameterTuner object created by the user. inputs (str): Parameters used when called :meth:`~sagemaker.estimator.EstimatorBase.fit`. Ret...
Return True iff there is an element, a, of arr such that a is not None def some(arr): """Return True iff there is an element, a, of arr such that a is not None""" return functools.reduce(lambda x, y: x or (y is not None), arr, False)
Iterate over the available events coming from a set of log streams in a single log group interleaving the events from each stream so they're yielded in timestamp order. Args: client (boto3 client): The boto client for logs. log_group (str): The name of the log group. streams (list of st...
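The interleaving itself reduces to a timestamp-ordered merge. The sketch below assumes each stream's events are already sorted by `'timestamp'` and omits the CloudWatch pagination the real code handles:

```python
import heapq

def interleave_events(streams):
    # heapq.merge lazily yields events from all streams in global
    # timestamp order, given per-stream sorted input.
    return heapq.merge(*streams, key=lambda event: event['timestamp'])
```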
A generator for log items in a single stream. This will yield all the items that are available at the current moment. Args: client (boto3.CloudWatchLogs.Client): The Boto client for CloudWatch logs. log_group (str): The name of the log group. stream_name (str): The name of the specific ...
Create a SageMaker ``RLEstimatorModel`` object that can be deployed to an Endpoint. Args: role (str): The ``ExecutionRoleArn`` IAM Role ARN for the ``Model``, which is also used during transform jobs. If not specified, the role from the Estimator will be used. vpc_config...
Return the Docker image to use for training. The :meth:`~sagemaker.estimator.EstimatorBase.fit` method, which does the model training, calls this method to find the image to use for model training. Returns: str: The URI of the Docker image. def train_image(self): """Return...
Convert the job description to init params that can be handled by the class constructor Args: job_details: the returned job details from a describe_training_job API call. model_channel_name (str): Name of the channel where pre-trained model data will be downloaded. ...
Return hyperparameters used by your custom TensorFlow code during model training. def hyperparameters(self): """Return hyperparameters used by your custom TensorFlow code during model training.""" hyperparameters = super(RLEstimator, self).hyperparameters() additional_hyperparameters = {SAGEMA...
Provides default metric definitions based on provided toolkit. Args: toolkit(sagemaker.rl.RLToolkit): RL Toolkit to be used for training. Returns: list: metric definitions def default_metric_definitions(cls, toolkit): """Provides default metric definitions based on pro...
Prepare S3 operations (specify where to upload `source_dir`) and environment variables related to framework. Args: estimator (sagemaker.estimator.Estimator): The framework estimator to get information from and update. s3_operations (dict): The dict to specify s3 operations (upload `source_dir`)...
Set up amazon algorithm estimator, adding the required `feature_dim` hyperparameter from training data. Args: estimator (sagemaker.amazon.amazon_estimator.AmazonAlgorithmEstimatorBase): An estimator for a built-in Amazon algorithm to get information from and update. inputs: The training...
Export Airflow base training config from an estimator Args: estimator (sagemaker.estimator.EstimatorBase): The estimator to export training config from. Can be a BYO estimator, Framework estimator or Amazon algorithm estimator. inputs: Information about the training data. Pl...
Export Airflow training config from an estimator Args: estimator (sagemaker.estimator.EstimatorBase): The estimator to export training config from. Can be a BYO estimator, Framework estimator or Amazon algorithm estimator. inputs: Information about the training data. Please ...
Export Airflow tuning config from an estimator Args: tuner (sagemaker.tuner.HyperparameterTuner): The tuner to export tuning config from. inputs: Information about the training data. Please refer to the ``fit()`` method of the associated estimator in the tuner, as this can take any ...
Update the S3 URI of the framework source directory in the given estimator. Args: estimator (sagemaker.estimator.Framework): The Framework estimator to update. job_name (str): The new job name included in the submit S3 URI Returns: str: The updated S3 URI of framework source directory de...
Update training job of the estimator from a task in the DAG Args: estimator (sagemaker.estimator.EstimatorBase): The estimator to update task_id (str): The task id of any airflow.contrib.operators.SageMakerTrainingOperator or airflow.contrib.operators.SageMakerTuningOperator that genera...
Prepare the framework model container information. Specify related S3 operations for Airflow to perform. (Upload `source_dir`) Args: model (sagemaker.model.FrameworkModel): The framework model instance_type (str): The EC2 instance type to deploy this Model to. For example, 'ml.p2.xlarge'. ...
Export Airflow model config from a SageMaker model Args: instance_type (str): The EC2 instance type to deploy this Model to. For example, 'ml.p2.xlarge' model (sagemaker.model.FrameworkModel): The SageMaker model to export Airflow config from role (str): The ``ExecutionRoleArn`` IAM Role AR...
Export Airflow model config from a SageMaker estimator Args: instance_type (str): The EC2 instance type to deploy this Model to. For example, 'ml.p2.xlarge' estimator (sagemaker.model.EstimatorBase): The SageMaker estimator to export Airflow config from. It has to be an estimator associ...
Export Airflow transform config from a SageMaker transformer Args: transformer (sagemaker.transformer.Transformer): The SageMaker transformer to export Airflow config from. data (str): Input data location in S3. data_type (str): What the S3 location defines (default: 'S3Prefix')...
Export Airflow transform config from a SageMaker estimator Args: estimator (sagemaker.model.EstimatorBase): The SageMaker estimator to export Airflow config from. It has to be an estimator associated with a training job. task_id (str): The task id of any airflow.contrib.operators.SageMa...
Export Airflow deploy config from a SageMaker model Args: model (sagemaker.model.Model): The SageMaker model to export the Airflow config from. instance_type (str): The EC2 instance type to deploy this Model to. For example, 'ml.p2.xlarge'. initial_instance_count (int): The initial number o...
Export Airflow deploy config from a SageMaker estimator Args: estimator (sagemaker.model.EstimatorBase): The SageMaker estimator to export Airflow config from. It has to be an estimator associated with a training job. task_id (str): The task id of any airflow.contrib.operators.SageMaker...
Create a model to deploy. The serializer, deserializer, content_type, and accept arguments are only used to define a default RealTimePredictor. They are ignored if an explicit predictor class is passed in. Other arguments are passed through to the Model class. Args: role (s...
Return a ``Transformer`` that uses a SageMaker Model based on the training job. It reuses the SageMaker Session and base job name used by the Estimator. Args: instance_count (int): Number of EC2 instances to use. instance_type (str): Type of EC2 instance to use, for example, 'ml...
Set any values in the estimator that need to be set before training. Args: * job_name (str): Name of the training job to be created. If not specified, one is generated, using the base name given to the constructor if applicable. def _prepare_for_training(self, job_name=None): ...
Train a model using the input training dataset. The API calls the Amazon SageMaker CreateTrainingJob API to start model training. The API uses the configuration you provided to create the estimator and the specified input training data to send the CreateTrainingJob request to Amazon SageMaker. ...
Compile a Neo model using the input model. Args: target_instance_family (str): Identifies the device that you want to run your model after compilation, for example: ml_c5. Allowed strings are: ml_c5, ml_m5, ml_c4, ml_m4, jetsontx1, jetsontx2, ml_p2, ml_p3, deeplens, ...
Attach to an existing training job. Create an Estimator bound to an existing training job, each subclass is responsible to implement ``_prepare_init_params_from_job_description()`` as this method delegates the actual conversion of a training job description to the arguments that the class const...
Deploy the trained model to an Amazon SageMaker endpoint and return a ``sagemaker.RealTimePredictor`` object. More information: http://docs.aws.amazon.com/sagemaker/latest/dg/how-it-works-training.html Args: initial_instance_count (int): Minimum number of EC2 instances to deploy to...
str: The model location in S3. Only set if Estimator has been ``fit()``. def model_data(self): """str: The model location in S3. Only set if Estimator has been ``fit()``.""" if self.latest_training_job is not None: model_uri = self.sagemaker_session.sagemaker_client.describe_training_job( ...
Convert the job description to init params that can be handled by the class constructor Args: job_details: the returned job details from a describe_training_job API call. model_channel_name (str): Name of the channel where pre-trained model data will be downloaded. Returns: ...
Delete an Amazon SageMaker ``Endpoint``. Raises: ValueError: If the endpoint does not exist. def delete_endpoint(self): """Delete an Amazon SageMaker ``Endpoint``. Raises: ValueError: If the endpoint does not exist. """ self._ensure_latest_training_job(...
Return a ``Transformer`` that uses a SageMaker Model based on the training job. It reuses the SageMaker Session and base job name used by the Estimator. Args: instance_count (int): Number of EC2 instances to use. instance_type (str): Type of EC2 instance to use, for example, 'ml...
Return a ``TrainingJobAnalytics`` object for the current training job. def training_job_analytics(self): """Return a ``TrainingJobAnalytics`` object for the current training job. """ if self._current_job_name is None: raise ValueError('Estimator is not associated with a TrainingJob'...
Returns VpcConfig dict either from this Estimator's subnets and security groups, or else validate and return an optional override value. def get_vpc_config(self, vpc_config_override=vpc_utils.VPC_CONFIG_DEFAULT): """ Returns VpcConfig dict either from this Estimator's subnets and security group...
Create a new Amazon SageMaker training job from the estimator. Args: estimator (sagemaker.estimator.EstimatorBase): Estimator object created by the user. inputs (str): Parameters used when called :meth:`~sagemaker.estimator.EstimatorBase.fit`. Returns: sagemaker.es...
Create a model to deploy. Args: role (str): The ``ExecutionRoleArn`` IAM Role ARN for the ``Model``, which is also used during transform jobs. If not specified, the role from the Estimator will be used. image (str): A container image to use for deploying the model. Defa...
Convert the job description to init params that can be handled by the class constructor Args: job_details: the returned job details from a describe_training_job API call. model_channel_name (str): Name of the channel where pre-trained model data will be downloaded Returns: ...
Set hyperparameters needed for training. This method will also validate ``source_dir``. Args: * job_name (str): Name of the training job to be created. If not specified, one is generated, using the base name given to the constructor if applicable. def _prepare_for_training(self, jo...
Upload the user training script to S3 and return the location. Returns: S3 URI def _stage_user_code_in_s3(self): """Upload the user training script to S3 and return the location. Returns: S3 URI """ local_mode = self.output_path.startswith('file://') if self.code_loc...
Get the appropriate value to pass as source_dir to model constructor on deploying Returns: str: Either a local or an S3 path pointing to the source_dir to be used for code by the model to be deployed def _model_source_dir(self): """Get the appropriate value to pass as source_dir to model c...
Convert the job description to init params that can be handled by the class constructor Args: job_details: the returned job details from a describe_training_job API call. model_channel_name (str): Name of the channel where pre-trained model data will be downloaded Returns: ...
Return the Docker image to use for training. The :meth:`~sagemaker.estimator.EstimatorBase.fit` method, which does the model training, calls this method to find the image to use for model training. Returns: str: The URI of the Docker image. def train_image(self): """Retur...
Attach to an existing training job. Create an Estimator bound to an existing training job, each subclass is responsible to implement ``_prepare_init_params_from_job_description()`` as this method delegates the actual conversion of a training job description to the arguments that the class const...
Return a ``Transformer`` that uses a SageMaker Model based on the training job. It reuses the SageMaker Session and base job name used by the Estimator. Args: instance_count (int): Number of EC2 instances to use. instance_type (str): Type of EC2 instance to use, for example, 'ml...
Return all non-None ``hyperparameter`` values on ``obj`` as a ``dict[str,str].`` def serialize_all(obj): """Return all non-None ``hyperparameter`` values on ``obj`` as a ``dict[str,str].``""" if '_hyperparameters' not in dir(obj): return {} return {k: str(v) for k, v in obj._hyperpa...
Start the Local Transform Job Args: input_data (dict): Describes the dataset to be transformed and the location where it is stored. output_data (dict): Identifies the location where to save the results from the transform job transform_resources (dict): compute instances for ...
Describe this _LocalTransformJob The response is a JSON-like dictionary that follows the response of the boto describe_transform_job() API. Returns: dict: description of this _LocalTransformJob def describe(self): """Describe this _LocalTransformJob The response i...
Get all the Environment variables that will be passed to the container Certain input fields such as BatchStrategy have different values for the API vs the Environment variables, such as SingleRecord vs SINGLE_RECORD. This method also handles this conversion. Args: **kwargs: existin...
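The CamelCase-to-environment-variable conversion mentioned above (e.g. `SingleRecord` → `SINGLE_RECORD`) can be sketched with a hypothetical helper; the name `to_env_value` is not from the SDK:

```python
import re

def to_env_value(api_value):
    # Insert '_' before each interior uppercase letter, then uppercase.
    return re.sub(r'(?<!^)(?=[A-Z])', '_', api_value).upper()
```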
Represent the parameter range as a dictionary suitable for a request to create an Amazon SageMaker hyperparameter tuning job. Args: name (str): The name of the hyperparameter. Returns: dict[str, str]: A dictionary that contains the name and values of the hyperparameter. ...
Represent the parameter range as a dictionary suitable for a request to create an Amazon SageMaker hyperparameter tuning job using one of the deep learning frameworks. The deep learning framework images require that hyperparameters be serialized as JSON. Args: name (str): The name ...
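Both range representations can be sketched on a minimal parameter class. The field names mirror the tuning API's range entries; the JSON-serialized variant for framework images follows the requirement stated above. The class shape here is an assumption, not the SDK's exact one:

```python
import json

class ContinuousParameter(object):
    """Minimal sketch of a tunable continuous range."""

    def __init__(self, min_value, max_value):
        self.min_value = min_value
        self.max_value = max_value

    def as_tuning_range(self, name):
        # Plain string values, as the tuning request expects.
        return {'Name': name,
                'MinValue': str(self.min_value),
                'MaxValue': str(self.max_value)}

    def as_json_range(self, name):
        # Framework images require JSON-serialized hyperparameter values.
        return {'Name': name,
                'MinValue': json.dumps(str(self.min_value)),
                'MaxValue': json.dumps(str(self.max_value))}
```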
Create a definition for executing a container as part of a SageMaker model. Args: image (str): Docker image to run for this container. model_data_url (str): S3 URI of data required by this container, e.g. SageMaker training job model artifacts (default: None). env (dict[str, str...
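A sketch of the container definition builder; the key names (`Image`, `ModelDataUrl`, `Environment`) follow the CreateModel request shape, and the defaults are assumptions:

```python
def container_def(image, model_data_url=None, env=None):
    # ModelDataUrl is optional; Environment defaults to an empty mapping.
    if env is None:
        env = {}
    c_def = {'Image': image, 'Environment': env}
    if model_data_url:
        c_def['ModelDataUrl'] = model_data_url
    return c_def
```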
Create a definition for executing a pipeline of containers as part of a SageMaker model. Args: models (list[sagemaker.Model]): this will be a list of ``sagemaker.Model`` objects in the order the inference should be invoked. instance_type (str): The EC2 instance type to deploy this Model to. ...
Create a production variant description suitable for use in a ``ProductionVariant`` list as part of a ``CreateEndpointConfig`` request. Args: model_name (str): The name of the SageMaker model this production variant references. instance_type (str): The EC2 instance type for this production vari...
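The variant description can be sketched as a dict builder; the default variant name and weight here are assumptions consistent with the description above:

```python
def production_variant(model_name, instance_type, initial_instance_count=1,
                       variant_name='AllTraffic', initial_weight=1):
    # One entry of the ProductionVariants list in a CreateEndpointConfig request.
    return {
        'ModelName': model_name,
        'InstanceType': instance_type,
        'InitialInstanceCount': initial_instance_count,
        'VariantName': variant_name,
        'InitialVariantWeight': initial_weight,
    }
```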
Return the role ARN whose credentials are used to call the API. Throws an exception if the role cannot be determined. Args: sagemaker_session(Session): Current sagemaker session Returns: (str): The role ARN def get_execution_role(sagemaker_session=None): """Return the role ARN whose credentials are used to call the A...
Initialize this SageMaker Session. Creates or uses a boto_session, sagemaker_client and sagemaker_runtime_client. Sets the region_name. def _initialize(self, boto_session, sagemaker_client, sagemaker_runtime_client): """Initialize this SageMaker Session. Creates or uses a boto_session...
Upload local file or directory to S3. If a single file is specified for upload, the resulting S3 object key is ``{key_prefix}/{filename}`` (filename does not include the local path, if any specified). If a directory is specified for upload, the API uploads all content, recursively, pre...