Helper for database admin API calls. def database_admin_api(self): """Helper for database admin API calls.""" if self._database_admin_api is None: self._database_admin_api = DatabaseAdminClient( credentials=self.credentials, client_info=_CLIENT_INFO ) r...
Make a copy of this client. Copies the local data stored as simple types but does not copy the current state of any open connections with the Cloud Spanner API. :rtype: :class:`.Client` :returns: A copy of the current client. def copy(self): """Make a copy of this client. ...
List available instance configurations for the client's project. .. _RPC docs: https://cloud.google.com/spanner/docs/reference/rpc/\ google.spanner.admin.instance.v1#google.spanner.admin.\ instance.v1.InstanceAdmin.ListInstanceConfigs See `RPC docs`_. ...
Factory to create an instance associated with this client. :type instance_id: str :param instance_id: The ID of the instance. :type configuration_name: string :param configuration_name: (Optional) Name of the instance configuration used to set up the instance's clu...
List instances for the client's project. See https://cloud.google.com/spanner/reference/rpc/google.spanner.admin.database.v1#google.spanner.admin.database.v1.InstanceAdmin.ListInstances :type filter_: string :param filter_: (Optional) Filter to select instances listed. See ...
Predicate for determining when to retry. We retry if and only if the 'reason' is 'backendError' or 'rateLimitExceeded'. def _should_retry(exc): """Predicate for determining when to retry. We retry if and only if the 'reason' is 'backendError' or 'rateLimitExceeded'. """ if not hasattr(exc...
Default unit test session. def default(session, django_dep=('django',)): """Default unit test session. """ # Install all test dependencies, then install this package in-place. deps = UNIT_TEST_DEPS deps += django_dep session.install(*deps) for local_dep in LOCAL_DEPS: session.inst...
Run the unit test suite. def unit(session): """Run the unit test suite.""" # Testing multiple version of django # See https://www.djangoproject.com/download/ for supported version django_deps_27 = [ ('django==1.8.19',), ('django >= 1.11.0, < 2.0.0dev',), ] if session.virtualen...
Run the system test suite. def system(session): """Run the system test suite.""" # Sanity check: Only run system tests if the environment variable is set. if not os.environ.get('GOOGLE_APPLICATION_CREDENTIALS', ''): session.skip('Credentials must be set via environment variable.') # Use pre-r...
Detect correct entry type from resource and instantiate. :type resource: dict :param resource: One entry resource from API response. :type client: :class:`~google.cloud.logging.client.Client` :param client: Client that owns the log entry. :type loggers: dict :param loggers: A mapping ...
Retrieve the metadata key in the metadata server. See: https://cloud.google.com/compute/docs/storing-retrieving-metadata :type metadata_key: str :param metadata_key: Key of the metadata which will form the url. You can also supply query parameters after the metadata key. ...
Define the loss of a TF graph :param y: correct labels :param model: output of the model :param mean: boolean indicating whether to return the mean of the loss or a vector of losses, one per input in the batch :return: the mean of the loss if `mean` is True, otherwise a vector with the per-sample loss def ...
Only initializes the variables of a TensorFlow session that were not already initialized. :param sess: the TensorFlow session :return: def initialize_uninitialized_global_variables(sess): """ Only initializes the variables of a TensorFlow session that were not already initialized. :param sess: the Tensor...
Train a TF graph. This function is deprecated. Prefer cleverhans.train.train when possible. cleverhans.train.train supports multiple GPUs but this function is still needed to support legacy models that do not support calling fprop more than once. :param sess: TF session to use when training the graph :para...
Compute the accuracy of a TF model on some data :param sess: TF session to use :param x: input placeholder :param y: output placeholder (for labels) :param predictions: model output predictions :param X_test: numpy array with training inputs :param Y_test: numpy array with training outputs :param feed: An...
Wrapper around deprecated function. def batch_eval(*args, **kwargs): """ Wrapper around deprecated function. """ # Inside function to avoid circular import from cleverhans.evaluation import batch_eval as new_batch_eval warnings.warn("batch_eval has moved to cleverhans.evaluation. " "batch_e...
Helper function that computes the current class prediction :param sess: TF session :param x: the input placeholder :param predictions: the model's symbolic output :param samples: numpy array with input samples (dims must match x) :param feed: An optional dictionary that is appended to the feeding d...
Helper function to normalize a batch of vectors. :param x: the input placeholder :param epsilon: stabilizes division :return: the batch of l2 normalized vector def l2_batch_normalize(x, epsilon=1e-12, scope=None): """ Helper function to normalize a batch of vectors. :param x: the input placeholder :param...
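The normalization the helper performs can be shown without TensorFlow. Below is a minimal pure-Python sketch that treats a batch as a list of float lists; the library version operates on tensors, but the role of `epsilon` (stabilizing the division for near-zero vectors) is the same.

```python
import math

def l2_batch_normalize(batch, epsilon=1e-12):
    """Normalize each vector in `batch` (a list of lists) to unit L2 norm.

    `epsilon` is added under the square root so all-zero vectors do not
    cause a division by zero, mirroring the TF helper described above.
    """
    out = []
    for vec in batch:
        norm = math.sqrt(sum(v * v for v in vec) + epsilon)
        out.append([v / norm for v in vec])
    return out
```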
Helper function to compute kl-divergence KL(p || q) def kl_with_logits(p_logits, q_logits, scope=None, loss_collection=tf.GraphKeys.REGULARIZATION_LOSSES): """Helper function to compute kl-divergence KL(p || q) """ with tf.name_scope(scope, "kl_divergence") as name: p = tf.nn.softmax(p_log...
Helper function to clip the perturbation to epsilon norm ball. :param eta: A tensor with the current perturbation. :param ord: Order of the norm (mimics Numpy). Possible values: np.inf, 1 or 2. :param eps: Epsilon, bound of the perturbation. def clip_eta(eta, ord, eps): """ Helper function to c...
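The clipping semantics described above can be sketched for a single flat perturbation vector in pure Python: component-wise clamping for the infinity norm, and rescaling onto the ball for the L2 norm. This is an illustration of the idea, not the tensor implementation.

```python
import math

def clip_eta(eta, ord, eps):
    """Clip perturbation `eta` (a flat list of floats) to the `eps` norm ball.

    For ord=inf each component is clamped to [-eps, eps]; for ord=2 the
    vector is rescaled only if its norm exceeds eps.
    """
    if ord == float('inf'):
        return [max(-eps, min(eps, e)) for e in eta]
    if ord == 2:
        norm = math.sqrt(sum(e * e for e in eta))
        # Avoid dividing by zero; leave vectors inside the ball untouched.
        factor = min(1.0, eps / max(norm, 1e-12))
        return [e * factor for e in eta]
    raise NotImplementedError('only inf and 2 norms in this sketch')
```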
Returns the list of devices that multi-replica code should use. :param devices: list of string device names, e.g. ["/GPU:0"] If the user specifies this, `infer_devices` checks that it is valid, and then uses this user-specified list. If the user does not specify this, infer_devices uses: -...
Returns a list of string names of all available GPUs def get_available_gpus(): """ Returns a list of string names of all available GPUs """ local_device_protos = device_lib.list_local_devices() return [x.name for x in local_device_protos if x.device_type == 'GPU']
A wrapper for clip_by_value that casts the clipping range if needed. def clip_by_value(t, clip_value_min, clip_value_max, name=None): """ A wrapper for clip_by_value that casts the clipping range if needed. """ def cast_clip(clip): """ Cast clipping range argument if needed. """ if t.dtype in (...
A wrapper around tf multiplication that does more automatic casting of the input. def mul(a, b): """ A wrapper around tf multiplication that does more automatic casting of the input. """ def multiply(a, b): """Multiplication""" return a * b return op_with_scalar_cast(a, b, multiply)
A wrapper around tf division that does more automatic casting of the input. def div(a, b): """ A wrapper around tf division that does more automatic casting of the input. """ def divide(a, b): """Division""" return a / b return op_with_scalar_cast(a, b, divide)
Builds the graph to compute f(a, b). If only one of the two arguments is a scalar and the operation would cause a type error without casting, casts the scalar to match the tensor. :param a: a tf-compatible array or scalar :param b: a tf-compatible array or scalar def op_with_scalar_cast(a, b, f): """ Bui...
Create the Jacobian graph to be run later in a TF session :param predictions: the model's symbolic output (linear output, pre-softmax) :param x: the input placeholder :param nb_classes: the number of classes the model has :return: def jacobian_graph(predictions, x, nb_classes): """ Create the Jacobia...
Augment an adversary's substitute training set using the Jacobian of a substitute model to generate new synthetic inputs. See https://arxiv.org/abs/1602.02697 for more details. See cleverhans_tutorials/mnist_blackbox.py for example use case :param sess: TF session in which the substitute model is defined :par...
Run evaluation on a saved model :param filepath: path to model to evaluate :param train_start: index of first training set example :param train_end: index of last training set example :param test_start: index of first test set example :param test_end: index of last test set example :param batch_size: size o...
Preprocessing the inputs before calling session.run() :param X_batch: A dictionary of inputs to the first sub-graph :return: A tuple, `(fetches, fd)`, with `fetches` being a list of Tensors to be fetched and `fd` the feed dictionary. def set_input(self, X_batch=None): """ Preprocessing th...
Postprocess the outputs of the Session.run(). Move the outputs of sub-graphs to next ones and return the output of the last sub-graph. :param fvals: A list of fetched values returned by Session.run() :return: A dictionary of fetched values returned by the last sub-graph. def proc_fvals(self, fvals): "...
Helper method to write images from single batch into datastore. def _write_single_batch_images_internal(self, batch_id, client_batch): """Helper method to write images from single batch into datastore.""" client = self._datastore_client batch_key = client.key(self._entity_kind_batches, batch_id) for im...
Writes all image batches to the datastore. def write_to_datastore(self): """Writes all image batches to the datastore.""" client = self._datastore_client with client.no_transact_batch() as client_batch: for batch_id, batch_data in iteritems(self._data): batch_key = client.key(self._entity_kin...
Writes only images from one batch to the datastore. def write_single_batch_images_to_datastore(self, batch_id): """Writes only images from one batch to the datastore.""" client = self._datastore_client with client.no_transact_batch() as client_batch: self._write_single_batch_images_internal(batch_id,...
Initializes batches by reading from the datastore. def init_from_datastore(self): """Initializes batches by reading from the datastore.""" self._data = {} for entity in self._datastore_client.query_fetch( kind=self._entity_kind_batches): batch_id = entity.key.flat_path[-1] self._data[ba...
Adds batch with given ID and list of properties. def add_batch(self, batch_id, batch_properties=None): """Adds batch with given ID and list of properties.""" if batch_properties is None: batch_properties = {} if not isinstance(batch_properties, dict): raise ValueError('batch_properties has to be ...
Adds image to given batch. def add_image(self, batch_id, image_id, image_properties=None): """Adds image to given batch.""" if batch_id not in self._data: raise KeyError('Batch with ID "{0}" does not exist'.format(batch_id)) if image_properties is None: image_properties = {} if not isinstan...
Reads list of dataset images from the datastore. def _read_image_list(self, skip_image_ids=None): """Reads list of dataset images from the datastore.""" if skip_image_ids is None: skip_image_ids = [] images = self._storage_client.list_blobs( prefix=os.path.join('dataset', self._dataset_name) ...
Initializes dataset batches from the list of images in the datastore. Args: batch_size: batch size allowed_epsilon: list of allowed epsilon or None to use default skip_image_ids: list of image ids to skip max_num_images: maximum number of images to read def init_from_storage_write_to_datas...
Init list of adversarial batches from dataset batches and submissions. Args: dataset_batches: instances of DatasetBatches attack_submission_ids: iterable with IDs of all (targeted and nontargeted) attack submissions; can be obtained as CompetitionSubmissions.get_all_attack_ids() def i...
Returns total number of all generated adversarial examples. def count_generated_adv_examples(self): """Returns total number of all generated adversarial examples.""" result = {} for v in itervalues(self.data): s_id = v['submission_id'] result[s_id] = result.get(s_id, 0) + len(v['images']) r...
Load a saved model, gather its predictions, and save a confidence report. :param filepath: path to model to evaluate :param train_start: index of first training set example to use :param train_end: index of last training set example to use :param test_start: index of first test set example to use :param test_...
Prints out accuracy, coverage, etc. statistics :param correctness: ndarray One bool per example specifying whether it was correctly classified :param confidence: ndarray The probability associated with each prediction :param name: str The name of this type of data (e.g. "clean", "MaxConfidence") def ...
Load a saved model, gather its predictions, and save a confidence report. This function works by running a single MaxConfidence attack on each example. This provides a reasonable estimate of the true failure rate quickly, so long as the model does not suffer from gradient masking. However, this estimate is mo...
MNIST CleverHans tutorial :param train_start: index of first training set example :param train_end: index of last training set example :param test_start: index of first test set example :param test_end: index of last test set example :param nb_epochs: number of epochs to train model :param batch_size: size ...
Generate symbolic graph for adversarial examples and return. :param x: The model's symbolic inputs. :param kwargs: Keyword arguments for the base attacker def generate(self, x, **kwargs): """ Generate symbolic graph for adversarial examples and return. :param x: The model's symbolic inputs. :...
Runs the untargeted attack. :param x: The input :param true_y: The correct label for `x`. This attack aims to produce misclassification. def attack(self, x, true_y): """ Runs the untargeted attack. :param x: The input :param true_y: The correct label for `x`. This attack aims to produce misclas...
Run the attack on a specific target class. :param x: tf Tensor. The input example. :param target_y: tf Tensor. The attacker's desired target class. Returns: A targeted adversarial example, intended to be classified as the target class. def attack_class(self, x, target_y): """ Run the attack o...
This helper function computes a batch start and end index :param batch_nb: the batch number :param data_length: the total length of the data being parsed by batches :param batch_size: the number of inputs in each batch :return: pair of (start, end) indices def batch_indices(batch_nb, data_length, batch_size): ...
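The index computation described above can be written out directly. One detail worth showing: when the last batch would run past the end of the data, the window is shifted back so every batch stays full.

```python
def batch_indices(batch_nb, data_length, batch_size):
    """Return (start, end) indices for batch number `batch_nb`.

    If the final batch would overrun `data_length`, the window is shifted
    back so it still contains exactly `batch_size` items.
    """
    start = int(batch_nb * batch_size)
    end = int((batch_nb + 1) * batch_size)
    if end > data_length:
        shift = end - data_length
        start -= shift
        end -= shift
    return start, end
```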
Returns a list of class indices excluding the class indexed by class_ind :param nb_classes: number of classes in the task :param class_ind: the class index to be omitted :return: list of class indices excluding the class indexed by class_ind def other_classes(nb_classes, class_ind): """ Returns a list of cla...
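This helper is simple enough to show whole; a minimal version with the input check made explicit:

```python
def other_classes(nb_classes, class_ind):
    """Return all class indices in [0, nb_classes) except `class_ind`."""
    if class_ind < 0 or class_ind >= nb_classes:
        raise ValueError('class_ind must be within the range [0, nb_classes)')
    return [i for i in range(nb_classes) if i != class_ind]
```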
Converts a class vector (integers) to binary class matrix. This is adapted from the Keras function with the same name. :param y: class vector to be converted into a matrix (integers from 0 to nb_classes). :param nb_classes: total number of classes. :param num_classses: deprecated version...
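The one-hot conversion can be sketched in pure Python; the library version returns a numpy array rather than nested lists, but the mapping is the same.

```python
def to_categorical(y, nb_classes):
    """Convert a sequence of integer class labels to a one-hot matrix.

    Each row has `nb_classes` entries, all zero except a one at the
    position of that example's label.
    """
    matrix = []
    for label in y:
        row = [0] * nb_classes
        row[label] = 1
        matrix.append(row)
    return matrix
```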
Take in an array of correct labels and randomly select a different label for each label in the array. This is typically used to randomly select a target class in targeted adversarial examples attacks (i.e., when the search algorithm takes in both a source class and target class to compute the adversarial exampl...
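The target-selection step can be illustrated on plain integer labels. This sketch differs from the library helper, which operates on one-hot numpy arrays; the `seed` parameter is added here only to make the example reproducible.

```python
import random

def random_targets(labels, nb_classes, seed=None):
    """For each correct label, pick a *different* label uniformly at random.

    Used when a targeted attack needs a target class distinct from the
    source class, as described above.
    """
    rng = random.Random(seed)
    targets = []
    for label in labels:
        candidates = [c for c in range(nb_classes) if c != label]
        targets.append(rng.choice(candidates))
    return targets
```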
Deprecation wrapper def pair_visual(*args, **kwargs): """Deprecation wrapper""" warnings.warn("`pair_visual` has moved to `cleverhans.plot.pyplot_image`. " "cleverhans.utils.pair_visual may be removed on or after " "2019-04-24.") from cleverhans.plot.pyplot_image import pair_visua...
Deprecation wrapper def grid_visual(*args, **kwargs): """Deprecation wrapper""" warnings.warn("`grid_visual` has moved to `cleverhans.plot.pyplot_image`. " "cleverhans.utils.grid_visual may be removed on or after " "2019-04-24.") from cleverhans.plot.pyplot_image import grid_visua...
Deprecation wrapper def get_logits_over_interval(*args, **kwargs): """Deprecation wrapper""" warnings.warn("`get_logits_over_interval` has moved to " "`cleverhans.plot.pyplot_image`. " "cleverhans.utils.get_logits_over_interval may be removed on " "or after 2019-04-2...
Deprecation wrapper def linear_extrapolation_plot(*args, **kwargs): """Deprecation wrapper""" warnings.warn("`linear_extrapolation_plot` has moved to " "`cleverhans.plot.pyplot_image`. " "cleverhans.utils.linear_extrapolation_plot may be removed on " "or after 2019-0...
Create a logger object with the given name. If this is the first time that we call this method, then initialize the formatter. def create_logger(name): """ Create a logger object with the given name. If this is the first time that we call this method, then initialize the formatter. """ base = logging...
Returns a version of `normal_dict` whose iteration order is always the same def deterministic_dict(normal_dict): """ Returns a version of `normal_dict` whose iteration order is always the same """ out = OrderedDict() for key in sorted(normal_dict.keys()): out[key] = normal_dict[key] return out
Return the union of l1 and l2, with a deterministic ordering. (Union of python sets does not necessarily have a consistent iteration order) :param l1: list of items :param l2: list of items :returns: list containing one copy of each item that is in l1 or in l2 def ordered_union(l1, l2): """ Return the uni...
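A straightforward way to get a deterministic union is to preserve first-occurrence order while de-duplicating:

```python
def ordered_union(l1, l2):
    """Union of two lists, keeping first-occurrence order (no duplicates)."""
    out = []
    for item in list(l1) + list(l2):
        if item not in out:
            out.append(item)
    return out
```

Because the output order depends only on the input lists (never on hash order), repeated runs iterate identically, which is what the docstring above asks for.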
like zip but with these properties: - returns a list, rather than an iterator. This is the old Python2 zip behavior. - a guarantee that all arguments are the same length. (normal zip silently drops entries to make them the same length) def safe_zip(*args): """like zip but with these properties: - returns a l...
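The two guarantees listed above (a concrete list, and a hard error on length mismatch) are easy to show directly:

```python
def safe_zip(*args):
    """zip() that returns a list and requires all arguments be equal length."""
    args = [list(arg) for arg in args]
    length = len(args[0])
    if not all(len(arg) == length for arg in args):
        raise ValueError('Lengths of arguments do not match: '
                         + str([len(arg) for arg in args]))
    return list(zip(*args))
```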
Calls shell command with argument substitution. Args: command: command represented as a list. Each element of the list is one token of the command. For example "cp a b" becomes ['cp', 'a', 'b'] If any element of the list looks like '${NAME}' then it will be replaced by value from **kwargs with ...
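The `${NAME}` substitution step can be sketched on its own. The function name `substitute_command_args` is hypothetical, and actually running the resolved command (e.g. via `subprocess`) is deliberately left out of this sketch.

```python
import re

def substitute_command_args(command, **kwargs):
    """Replace whole-token '${NAME}' entries in a command list with kwargs.

    Tokens that are not exactly of the form '${NAME}' pass through
    unchanged; a missing NAME raises KeyError.
    """
    pattern = re.compile(r'^\$\{(\w+)\}$')
    resolved = []
    for token in command:
        match = pattern.match(token)
        if match:
            resolved.append(str(kwargs[match.group(1)]))
        else:
            resolved.append(token)
    return resolved
```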
Returns a copy of a dictionary whose values are numpy arrays. Copies their values rather than copying references to them. def deep_copy(numpy_dict): """ Returns a copy of a dictionary whose values are numpy arrays. Copies their values rather than copying references to them. """ out = {} for key in numpy_...
Load and preprocess MNIST dataset :param datadir: path to folder where data should be stored :param train_start: index of first training set example :param train_end: index of last training set example :param test_start: index of first test set example :param test_end: index of last test set example :return...
Preprocess CIFAR10 dataset :return: def data_cifar10(train_start=0, train_end=50000, test_start=0, test_end=10000): """ Preprocess CIFAR10 dataset :return: """ # These values are specific to CIFAR10 img_rows = 32 img_cols = 32 nb_classes = 10 # the data, shuffled and split between train and test...
Load a saved model and print out its accuracy on different data distributions This function works by running a single attack on each example. This provides a reasonable estimate of the true failure rate quickly, so long as the model does not suffer from gradient masking. However, this estimate is mostly intend...
The actual implementation of the evaluation. :param sess: tf.Session :param model: cleverhans.model.Model :param dataset: cleverhans.dataset.Dataset :param factory: the dataset factory corresponding to `dataset` :param x_data: numpy array of input examples :param y_data: numpy array of class labels :param...
Print accuracies def main(argv=None): """ Print accuracies """ try: _name_of_script, filepath = argv except ValueError: raise ValueError(argv) print_accuracies(filepath=filepath, test_start=FLAGS.test_start, test_end=FLAGS.test_end, which_set=FLAGS.which_set, n...
PyTorch implementation of the Fast Gradient Method. :param model_fn: a callable that takes an input tensor and returns the model logits. :param x: input tensor. :param eps: epsilon (input variation parameter); see https://arxiv.org/abs/1412.6572. :param ord: Order of the norm (mimics NumPy). Possible values: np...
Read png images from the input directory in batches. Args: input_dir: input directory batch_shape: shape of minibatch array, i.e. [batch_size, height, width, 3] Yields: filenames: list of file names, without path, of each image Length of this list may be less than batch_size; in this case only fi...
Saves images to the output directory. Args: images: array with minibatch of images filenames: list of filenames without path If the number of file names in this list is less than the number of images in the minibatch, then only the first len(filenames) images will be saved. output_dir: directory where to sav...
Run the sample attack def main(_): """Run the sample attack""" # Images for the inception classifier are normalized to be in the [-1, 1] interval, # eps is a difference between pixels, so it should be in the [0, 2] interval. # Renormalizing epsilon from [0, 255] to [0, 2]. eps = 2.0 * FLAGS.max_epsilon / 255.0 batch_sh...
Load training and test data. def ld_cifar10(): """Load training and test data.""" train_transforms = torchvision.transforms.Compose([torchvision.transforms.ToTensor()]) test_transforms = torchvision.transforms.Compose([torchvision.transforms.ToTensor()]) train_dataset = torchvision.datasets.CIFAR10(root='/tmp/...
Plots a success-fail curve from a confidence report stored on disk. :param path: string filepath for the stored report. (Should be the output of make_confidence_report*.py) :param success_name: The name (confidence report key) of the data that should be used to measure success rate :param fail_names: A li...
Plot a success fail curve from a confidence report :param report: A confidence report (the type of object saved by make_confidence_report.py) :param success_name: see plot_report_from_path :param fail_names: see plot_report_from_path :param label: see plot_report_from_path :param is_max_confidence: see pl...
Make a success-failure curve. :param report: A confidence report (the type of object saved by make_confidence_report.py) :param success_name: see plot_report_from_path :param fail_names: see plot_report_from_path :returns: fail_optimal: list of failure rates on adversarial data for the optimal (t ...
Train a TF graph :param sess: TF session to use when training the graph :param x: input placeholder :param y: output placeholder (for labels) :param predictions: model output predictions :param X_train: numpy array with training inputs :param Y_train: numpy array with training outputs :param...
Clone variables unused by the attack on all GPUs. Specifically, the ground-truth label, y, has to be preserved until the training step. :param inputs: A list of dictionaries as the inputs to each step. :param outputs: A list of dictionaries as the outputs of each step. :param g0_inputs: Initial variabl...
Return a tensor that constructs adversarial examples for the given input. Generate uses tf.py_func in order to operate over tensors. :param x: (required) A tensor with the inputs. :param kwargs: See `parse_params` def generate(self, x, **kwargs): """ Return a tensor that constructs adversarial exam...
:param y_target: (optional) A tensor with the one-hot target labels. :param batch_size: The number of inputs to include in a batch and process simultaneously. :param binary_search_steps: The number of times we perform binary search to find the optimal trade...
Perform the attack on the given instance for the given targets. def attack(self, x_val, targets): """ Perform the attack on the given instance for the given targets. """ def lbfgs_objective(adv_x, self, targets, oimgs, CONST): """ returns the function value and the gradient for fmin_l_bfgs_b """...
Set the device before the next fprop to create a new graph on the specified device. def set_device(self, device_name): """ Set the device before the next fprop to create a new graph on the specified device. """ device_name = unify_device_name(device_name) self.device_name = device_name ...
Return a list of assignment operations that syncs the parameters of all model copies with the one on host_device. :param host_device: (required str) the name of the device with latest parameters def create_sync_ops(self, host_device): """ Return a list of assignment operations t...
Create and initialize a variable using a numpy array and set trainable. :param name: (required str) name of the variable :param initializer: a numpy array or a tensor def get_variable(self, name, initializer): """ Create and initialize a variable using a numpy array and set trainable. :param name: ...
Create and initialize layer parameters on the device previously set in self.device_name. :param new_input_shape: a list or tuple for the shape of the input. def set_input_shape_ngpu(self, new_input_shape): """ Create and initialize layer parameters on the device previously set in self.device_name....
Create an assignment operation for each weight on all devices. The weight is assigned the value of the copy on the `host_device'. def create_sync_ops(self, host_device): """Create an assignment operation for each weight on all devices. The weight is assigned the value of the copy on the `host_device'. ...
Tensorflow implementation of the perturbation method used for virtual adversarial training: https://arxiv.org/abs/1507.00677 :param model: the model which returns the network unnormalized logits :param x: the input placeholder :param logits: the model's unnormalized output tensor (the input to ...
Generate symbolic graph for adversarial examples and return. :param x: The model's symbolic inputs. :param kwargs: See `parse_params` def generate(self, x, **kwargs): """ Generate symbolic graph for adversarial examples and return. :param x: The model's symbolic inputs. :param kwargs: See `pa...
Takes in a dictionary of parameters and applies attack-specific checks before saving them as attributes. Attack-specific parameters: :param eps: (optional float) the epsilon (input variation parameter) :param nb_iter: (optional) the number of iterations Defaults to 1 if not specified :param x...
Iterate with exponential backoff on failures. Useful to wrap results of datastore Query.fetch to avoid 429 error. Args: base_iter: basic iterator of generator object max_num_tries: maximum number of tries for each request max_backoff: maximum backoff, in seconds start_backoff: initial value of bac...
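The backoff-wrapped iterator can be sketched in pure Python. Assumptions are labeled in the code: the set of retriable exceptions, the backoff multiplier, and the injected `sleep_fn` (used so the behavior is testable without real delays) are all illustration choices, not the library's exact signature.

```python
import time

def iterate_with_exp_backoff(base_iter,
                             max_num_tries=6,
                             max_backoff=300.0,
                             start_backoff=4.0,
                             backoff_multiplier=2.0,
                             retriable_exceptions=(IOError,),
                             sleep_fn=time.sleep):
    """Yield items from `base_iter`, retrying next() with exponential backoff.

    After each successful item the try counter and backoff reset; after
    `max_num_tries` consecutive failures the exception propagates.
    """
    try_number = 0
    backoff = start_backoff
    while True:
        try:
            item = next(base_iter)
        except StopIteration:
            return
        except retriable_exceptions:
            try_number += 1
            if try_number >= max_num_tries:
                raise
            sleep_fn(backoff)
            backoff = min(backoff * backoff_multiplier, max_backoff)
            continue
        try_number = 0
        backoff = start_backoff
        yield item
```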
Lists names of all blobs by their prefix. def list_blobs(self, prefix=''): """Lists names of all blobs by their prefix.""" return [b.name for b in self.bucket.list_blobs(prefix=prefix)]
Begins a batch. def begin(self): """Begins a batch.""" if self._cur_batch: raise ValueError('Previous batch is not committed.') self._cur_batch = self._client.batch() self._cur_batch.begin() self._num_mutations = 0
Rolls back pending mutations. Keep in mind that NoTransactionBatch splits all mutations into smaller batches and commits them as soon as the mutation buffer reaches its maximum length. That's why the rollback method will only roll back pending mutations from the buffer, but won't be able to roll back already committ...
Adds mutation of the entity to the mutation buffer. If the mutation buffer reaches its capacity then this method commits all pending mutations from the buffer and empties it. Args: entity: entity which should be put into the datastore def put(self, entity): """Adds mutation of the entity to the mutat...
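The buffer-then-commit behavior described above can be sketched without a datastore. `MutationBuffer` and `commit_fn` are hypothetical stand-ins: `commit_fn` takes the place of the real datastore commit, and `max_mutations` bounds the buffer as in the docstring.

```python
class MutationBuffer:
    """Minimal sketch of buffered puts with automatic commit on overflow."""

    def __init__(self, commit_fn, max_mutations=2):
        self._commit_fn = commit_fn
        self._max_mutations = max_mutations
        self._buffer = []

    def put(self, entity):
        """Buffer a mutation; commit everything once capacity is reached."""
        self._buffer.append(entity)
        if len(self._buffer) >= self._max_mutations:
            self.commit()

    def commit(self):
        """Flush all pending mutations in one batch and empty the buffer."""
        if self._buffer:
            self._commit_fn(list(self._buffer))
            self._buffer = []
```

This also makes the rollback caveat above concrete: once `commit` has run, those mutations are out of the buffer, so a later rollback could only discard whatever is still pending.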
Adds deletion of the entity with given key to the mutation buffer. If the mutation buffer reaches its capacity then this method commits all pending mutations from the buffer and empties it. Args: key: key of the entity which should be deleted def delete(self, key): """Adds deletion of the entity with...
Retrieves an entity given its key. def get(self, key, transaction=None): """Retrieves an entity given its key.""" return self._client.get(key, transaction=transaction)
MNIST tutorial for Carlini and Wagner's attack :param train_start: index of first training set example :param train_end: index of last training set example :param test_start: index of first test set example :param test_end: index of last test set example :param viz_enabled: (boolean) activate plots of adversa...
Selects the Attack Class using string input. :param attack_string: adversarial attack name in string format :return: attack class defined in cleverhans.attacks_eager def attack_selection(attack_string): """ Selects the Attack Class using string input. :param attack_string: adversarial attack name in string f...
MNIST cleverhans tutorial :param train_start: index of first training set example. :param train_end: index of last training set example. :param test_start: index of first test set example. :param test_end: index of last test set example. :param nb_epochs: number of epochs to train model. :param batch_size: ...
Removes directory tree as a superuser. Args: dir_name: name of the directory to remove. This function is necessary to clean up directories created from inside a Docker container, since they are usually written as root and thus have to be removed as root. def sudo_remove_dirtree(dir_name): """Removes directory tree a...
Main function which runs worker. def main(args): """Main function which runs worker.""" title = '## Starting evaluation of round {0} ##'.format(args.round_name) logging.info('\n' + '#' * len(title) + '\n' + '#' * len(title) + '\n' + '##' + ' ' * (len(title)-2) + '##' ...