Adds standard TensorBoard CLI flags to parser. def define_flags(self, parser): """Adds standard TensorBoard CLI flags to parser.""" parser.add_argument( '--logdir', metavar='PATH', type=str, default='', help='''\ Directory where TensorBoard will look to find TensorFlow e...
Fixes standard TensorBoard CLI flags. def fix_flags(self, flags): """Fixes standard TensorBoard CLI flags.""" FlagsError = base_plugin.FlagsError if flags.version_tb: pass elif flags.inspect: if flags.logdir and flags.event_file: raise FlagsError( 'Mu...
Put a message into the outgoing message stack. Outgoing messages are stored indefinitely to support multiple users. def put(self, message): """Put a message into the outgoing message stack. Outgoing messages are stored indefinitely to support multiple users. """ with self._outgoing_lock: se...
Get message(s) from the outgoing message stack. Blocks until an item at stack position pos becomes available. This method is thread safe. Args: pos: An int specifying the top position of the message stack to access. For example, if the stack counter is at 3 and pos == 2, then the 2nd ...
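The put/get pair described above can be sketched in plain Python. The `MessageStack` class below, its single condition variable, and the 1-based `pos` convention are illustrative assumptions, not the actual debugger-plugin implementation; it only demonstrates the stated contract that messages are kept indefinitely and that `get(pos)` blocks until position `pos` exists.

```python
import threading

class MessageStack:
    """Illustrative sketch: a thread-safe message store where get(pos)
    blocks until position pos becomes available."""

    def __init__(self):
        self._messages = []
        self._condition = threading.Condition()

    def put(self, message):
        # Messages are stored indefinitely so late-joining readers can
        # catch up from any position.
        with self._condition:
            self._messages.append(message)
            self._condition.notify_all()

    def get(self, pos):
        # Block until at least `pos` messages have been put (1-based).
        with self._condition:
            while len(self._messages) < pos:
                self._condition.wait()
            return self._messages[pos - 1]
```

A reader thread calling `get(3)` before three messages exist simply waits on the condition until a third `put` wakes it.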
Run custom scalar demo and generate event files. def run(): """Run custom scalar demo and generate event files.""" step = tf.compat.v1.placeholder(tf.float32, shape=[]) with tf.name_scope('loss'): # Specify 2 different loss values, each tagged differently. summary_lib.scalar('foo', tf.pow(0.9, step)) ...
Returns the plugin, if possible. Args: context: The TBContext flags. Returns: A InteractiveInferencePlugin instance or None if it couldn't be loaded. def load(self, context): """Returns the plugin, if possible. Args: context: The TBContext flags. Returns: A InteractiveIn...
Stores a config file used by the embedding projector. Args: summary_writer: The summary writer used for writing events. config: `tf.contrib.tensorboard.plugins.projector.ProjectorConfig` proto that holds the configuration for the projector such as paths to checkpoint files and metadata files for ...
Wraps absl.flags's define functions so tf.flags accepts old names. def _wrap_define_function(original_function): """Wraps absl.flags's define functions so tf.flags accepts old names.""" def wrapper(*args, **kwargs): """Wrapper function that turns old keyword names to new ones.""" has_old_names...
Returns a (run,tag) tuple storing the evaluations of the specified metric. Args: session_name: str. metric_name: MetricName protobuf. Returns: (run, tag) tuple. def run_tag_from_session_and_metric(session_name, metric_name): """Returns a (run,tag) tuple storing the evaluations of the specified metric...
Returns the last evaluations of the given metric at the given session. Args: multiplexer: The EventMultiplexer instance allowing access to the exported summary data. session_name: String. The session name for which to get the metric evaluations. metric_name: api_pb2.MetricName proto. The ...
Return {runName: {tagName: {displayName: ..., description: ...}}}. def index_impl(self): """Return {runName: {tagName: {displayName: ..., description: ...}}}.""" if self._db_connection_provider: # Read tags from the database. db = self._db_connection_provider() cursor = db.execute(''' ...
Result of the form `(body, mime_type)`. def scalars_impl(self, tag, run, experiment, output_format): """Result of the form `(body, mime_type)`.""" if self._db_connection_provider: db = self._db_connection_provider() # We select for steps greater than -1 because the writer inserts # placeholde...
Obtains value for scalar event given blob and dtype enum. Args: scalar_data_blob: The blob obtained from the database. dtype_enum: The enum representing the dtype. Returns: The scalar value. def _get_value(self, scalar_data_blob, dtype_enum): """Obtains value for scalar event given blob...
Given a tag and single run, return array of ScalarEvents. def scalars_route(self, request): """Given a tag and single run, return array of ScalarEvents.""" # TODO: return HTTP status code for malformed requests tag = request.args.get('tag') run = request.args.get('run') experiment = request.args.ge...
Add a run to the multiplexer. If the name is not specified, it is the same as the path. If a run by that name exists, and we are already watching the right path, do nothing. If we are watching a different path, replace the event accumulator. If `Reload` has been called, it will `Reload` the n...
Load runs from a directory; recursively walks subdirectories. If path doesn't exist, no-op. This ensures that it is safe to call `AddRunsFromDirectory` multiple times, even before the directory is made. If path is a directory, load event files in the directory (if any exist) and recursively call A...
Call `Reload` on every `EventAccumulator`. def Reload(self): """Call `Reload` on every `EventAccumulator`.""" logger.info('Beginning EventMultiplexer.Reload()') self._reload_called = True # Build a list so we're safe even if the list of accumulators is modified # even while we're reloading. wit...
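The comment in `Reload` above ("Build a list so we're safe even if the list of accumulators is modified") describes a snapshot-then-iterate pattern that is easy to show in isolation. The `Multiplexer` and `Accumulator` classes below are reduced stand-ins, not TensorBoard's real classes; only the locking pattern is taken from the text.

```python
import threading

class Accumulator:
    """Stand-in accumulator that just counts Reload calls."""
    def __init__(self):
        self.reloads = 0

    def Reload(self):
        self.reloads += 1

class Multiplexer:
    """Sketch of the snapshot-then-reload pattern described above."""

    def __init__(self):
        self._accumulators_mutex = threading.Lock()
        self._accumulators = {}
        self._reload_called = False

    def AddRun(self, name, accumulator):
        with self._accumulators_mutex:
            self._accumulators[name] = accumulator

    def Reload(self):
        self._reload_called = True
        # Copy the values under the lock so iteration is safe even if
        # another thread adds or removes runs while we reload.
        with self._accumulators_mutex:
            items = list(self._accumulators.values())
        for accumulator in items:
            accumulator.Reload()
```

The lock is held only long enough to take the snapshot, so a slow `Accumulator.Reload` never blocks concurrent `AddRun` calls.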
Get index of runs and assets for a given plugin. Args: plugin_name: Name of the plugin we are checking for. Returns: A dictionary that maps from run_name to a list of plugin assets for that run. def PluginAssets(self, plugin_name): """Get index of runs and assets for a given plugin. ...
Return the contents for a specific plugin asset from a run. Args: run: The string name of the run. plugin_name: The string name of a plugin. asset_name: The string name of an asset. Returns: The string contents of the plugin asset. Raises: KeyError: If the asset is not avail...
Retrieve the scalar events associated with a run and tag. Args: run: A string name of the run for which values are retrieved. tag: A string name of the tag for which values are retrieved. Raises: KeyError: If the run is not found, or the tag is not available for the given run. R...
Get the session.run() metadata associated with a TensorFlow run and tag. Args: run: A string name of a TensorFlow run. tag: A string name of the tag associated with a particular session.run(). Raises: KeyError: If the run is not found, or the tag is not available for the given run. ...
Retrieve the audio events associated with a run and tag. Args: run: A string name of the run for which values are retrieved. tag: A string name of the tag for which values are retrieved. Raises: KeyError: If the run is not found, or the tag is not available for the given run. Re...
Retrieve the tensor events associated with a run and tag. Args: run: A string name of the run for which values are retrieved. tag: A string name of the tag for which values are retrieved. Raises: KeyError: If the run is not found, or the tag is not available for the given run. R...
Returns a 2-layer dictionary of the form {run: {tag: content}}. The `content` referred above is the content field of the PluginData proto for the specified plugin within a Summary.Value proto. Args: plugin_name: The name of the plugin for which to fetch content. Returns: A dictionary of t...
Return the summary metadata for the given tag on the given run. Args: run: A string name of the run for which summary metadata is to be retrieved. tag: A string name of the tag whose summary metadata is to be retrieved. Raises: KeyError: If the run is not found, or the tag is...
Return all the run names in the `EventMultiplexer`. Returns: ``` {runName: { scalarValues: [tagA, tagB, tagC], graph: true, meta_graph: true}} ``` def Runs(self): """Return all the run names in the `EventMultiplexer`. Returns: ``` {runName: { scalarValues: [tagA,...
Write a text summary. Arguments: name: A name for this summary. The summary tag used for TensorBoard will be this name prefixed by any active name scopes. data: A UTF-8 string tensor value. step: Explicit `int64`-castable monotonic step value for this summary. If omitted, this defaults to `tf...
Create a text tf.Summary protobuf. Arguments: tag: String tag for the summary. data: A Python bytestring (of type bytes), a Unicode string, or a numpy data array of those types. description: Optional long-form description for this summary, as a `str`. Markdown is supported. Defaults to empty....
Initializes flags and calls main(). def run_main(): """Initializes flags and calls main().""" program.setup_environment() if getattr(tf, '__version__', 'stub') == 'stub': print("TensorFlow installation not found - running with reduced feature set.", file=sys.stderr) tensorboard = program.Tensor...
Create a `summary_pb2.SummaryMetadata` proto for image plugin data. Returns: A `summary_pb2.SummaryMetadata` protobuf object. def create_summary_metadata(display_name, description): """Create a `summary_pb2.SummaryMetadata` proto for image plugin data. Returns: A `summary_pb2.SummaryMetadata` protobuf ...
Create a legacy audio summary op for use in a TensorFlow graph. Arguments: name: A unique name for the generated summary node. audio: A `Tensor` representing audio data with shape `[k, t, c]`, where `k` is the number of audio clips, `t` is the number of frames, and `c` is the number of channels. ...
Create a legacy audio summary protobuf. This behaves as if you were to create an `op` with the same arguments (wrapped with constant tensors where appropriate) and then execute that summary op in a TensorFlow session. Arguments: name: A unique name for the generated summary node. audio: An `np.array` ...
Create a PR curve summary op for a single binary classifier. Computes true/false positive/negative values for the given `predictions` against the ground truth `labels`, against a list of evenly distributed threshold values in `[0, 1]` of length `num_thresholds`. Each number in `predictions`, a float in `[0, 1...
Create a PR curves summary protobuf. Arguments: name: A name for the generated node. Will also serve as a series name in TensorBoard. labels: The ground truth values. A bool numpy array. predictions: A float32 numpy array whose values are in the range `[0, 1]`. Dimensions must match those...
Computes a precision-recall curve summary across batches of data. This function is similar to op() above, but can be used to compute the PR curve across multiple batches of labels and predictions, in the same style as the metrics found in tf.metrics. This function creates multiple local variables for storing ...
Create an op that collects data for visualizing PR curves. Unlike the op above, this one avoids computing precision, recall, and the intermediate counts. Instead, it accepts those tensors as arguments and relies on the caller to ensure that the calculations are correct (and the counts yield the provided precis...
Create a PR curves summary protobuf from raw data values. Args: name: A tag attached to the summary. Used by TensorBoard for organization. true_positive_counts: A rank-1 numpy array of true positive counts. Must contain `num_thresholds` elements and be castable to float32. false_positive_counts: ...
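The raw-data variant above relies on the caller to supply counts from which precision and recall follow directly. The helper below shows that arithmetic in plain Python; the function name and list-based representation are assumptions, and the 0/0 → 0.0 guard is one common convention rather than the op's documented behavior.

```python
def precision_recall_from_counts(tp, fp, fn):
    """Compute per-threshold precision and recall from raw count lists.

    Sketch of the arithmetic the raw-data PR-curve summary relies on:
    precision = TP / (TP + FP), recall = TP / (TP + FN), with 0/0
    mapped to 0.0 (an assumed convention).
    """
    precision = [t / (t + f) if (t + f) else 0.0 for t, f in zip(tp, fp)]
    recall = [t / (t + n) if (t + n) else 0.0 for t, n in zip(tp, fn)]
    return precision, recall
```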
A private helper method for generating a tensor summary. We use a helper method instead of having `op` directly call `raw_data_op` to prevent the scope of `raw_data_op` from being embedded within `op`. Arguments are the same as for raw_data_op. Returns: A tensor summary that collects data for PR curves. ...
Executes the request. Returns: An array of tuples representing the metric evaluations--each of the form (<wall time in secs>, <training step>, <metric value>). def run(self): """Executes the request. Returns: An array of tuples representing the metric evaluations--each of the form ...
This plugin is active iff any run has at least one histograms tag. def is_active(self): """This plugin is active iff any run has at least one histograms tag.""" if self._db_connection_provider: # The plugin is active if one relevant tag can be found in the database. db = self._db_connection_provide...
Result of the form `(body, mime_type)`, or `ValueError`. At most `downsample_to` events will be returned. If this value is `None`, then no downsampling will be performed. def histograms_impl(self, tag, run, downsample_to=None): """Result of the form `(body, mime_type)`, or `ValueError`. At most `down...
Obtains values for histogram data given blob and dtype enum. Args: data_blob: The blob obtained from the database. dtype_enum: The enum representing the dtype. shape_string: A comma-separated string of numbers denoting shape. Returns: The histogram values as a list served to the frontend...
Given a tag and single run, return array of histogram values. def histograms_route(self, request): """Given a tag and single run, return array of histogram values.""" tag = request.args.get('tag') run = request.args.get('run') try: (body, mime_type) = self.histograms_impl( tag, run, dow...
Initialize the graph and session, if this has not yet been done. def _lazily_initialize(self): """Initialize the graph and session, if this has not yet been done.""" # TODO(nickfelt): remove on-demand imports once dep situation is fixed. import tensorflow.compat.v1 as tf with self._initialization_lock:...
Tries to get the scalars plugin. Returns: The scalars plugin, or None if it is not yet registered. def _get_scalars_plugin(self): """Tries to get the scalars plugin. Returns: The scalars plugin, or None if it is not yet registered. """ if scalars_metadata.PLUGIN_NAME in self._plugin_n...
This plugin is active if 2 conditions hold. 1. The scalars plugin is registered and active. 2. There is a custom layout for the dashboard. Returns: A boolean. Whether the plugin is active. def is_active(self): """This plugin is active if 2 conditions hold. 1. The scalars plugin is registered and...
Provides a response for downloading scalars data for a data series. Args: run: The run. tag: The specific tag. response_format: A string. One of the values of the OutputFormat enum of the scalar plugin. Raises: ValueError: If the scalars plugin is not registered. Returns: ...
Given a tag regex and single run, return ScalarEvents. This route takes 2 GET params: run: A run string to find tags for. tag: A string that is a regex used to find matching tags. The response is a JSON object: { // Whether the regular expression is valid. Also false if empty. regexVali...
Given a tag regex and single run, return ScalarEvents. Args: run: A run string. tag_regex_string: A regular expression that captures portions of tags. Raises: ValueError: if the scalars plugin is not registered. Returns: A dictionary that is the JSON-able response. def scalars_im...
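The regex-matching step behind the route above can be sketched without the plugin machinery. Only the `regexValid` key (false for an empty or invalid expression) is taken from the documented response; the function name, the `payload` shape, and the use of `re.search` over full-match are assumptions.

```python
import re

def tags_matching_regex(tag_regex_string, tags):
    """Sketch: filter tag names by a user-supplied regex, returning a
    dict shaped like the route's JSON response."""
    try:
        # An empty string is treated as invalid, per the route docs.
        pattern = re.compile(tag_regex_string) if tag_regex_string else None
    except re.error:
        pattern = None
    if pattern is None:
        return {'regexValid': False, 'payload': {}}
    payload = {tag: [] for tag in tags if pattern.search(tag)}
    return {'regexValid': True, 'payload': payload}
```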
Fetches the custom layout specified by the config file in the logdir. If more than 1 run contains a layout, this method merges the layouts by merging charts within individual categories. If 2 categories with the same name are found, the charts within are merged. The merging is based on the order of...
Given an iterable of string contents, make a table row. Args: contents: An iterable yielding strings. tag: The tag to place contents in. Defaults to 'td', you might want 'th'. Returns: A string containing the content strings, organized into a table row. Example: make_table_row(['one', 'two', 'three...
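A minimal version of the row helper described above fits in a few lines. The exact markup (trailing newline, no attributes) is an assumption based on the example in the docstring, not a copy of the real implementation.

```python
def make_table_row(contents, tag='td'):
    """Sketch: wrap each content string in a <tag> cell and the whole
    row in <tr>.  Pass tag='th' for a header row."""
    cells = ''.join('<%s>%s</%s>' % (tag, s, tag) for s in contents)
    return '<tr>' + cells + '</tr>\n'
```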
Given a numpy ndarray of strings, concatenate them into a html table. Args: contents: A np.ndarray of strings. May be 1d or 2d. In the 1d case, the table is laid out vertically (i.e. row-major). headers: A np.ndarray or list of string header names for the table. Returns: A string containing all ...
Given a np.ndarray with nDims > 2, reduce it to 2d. It does this by selecting the zeroth coordinate for every dimension greater than two. Args: arr: a numpy ndarray of dimension at least 2. Returns: A two-dimensional subarray from the input array. Raises: ValueError: If the argument is not a ...
Take a numpy.ndarray containing strings, and convert it into html. If the ndarray contains a single scalar string, that string is converted to html via our sanitized markdown parser. If it contains an array of strings, the strings are individually converted to html and then composed into a table using make_tab...
Convert a TensorEvent into a JSON-compatible response. def process_string_tensor_event(event): """Convert a TensorEvent into a JSON-compatible response.""" string_arr = tensor_util.make_ndarray(event.tensor_proto) html = text_array_to_html(string_arr) return { 'wall_time': event.wall_time, 'step': ...
Determines whether this plugin is active. This plugin is only active if TensorBoard sampled any text summaries. Returns: Whether this plugin is active. def is_active(self): """Determines whether this plugin is active. This plugin is only active if TensorBoard sampled any text summaries. R...
Attempts to launch a thread to compute index_impl(). This may not launch a new thread if one is already running to compute index_impl(); in that case, this function is a no-op. def _maybe_launch_index_impl_thread(self): """Attempts to launch a thread to compute index_impl(). This may not launch a new...
Computes index_impl() asynchronously on a separate thread. def _async_index_impl(self): """Computes index_impl() asynchronously on a separate thread.""" start = time.time() logger.info('TextPlugin computing index_impl() in a new thread') self._index_cached = self.index_impl() self._index_impl_threa...
Create a `summary_pb2.SummaryMetadata` proto for pr_curves plugin data. Arguments: display_name: The display name used in TensorBoard. description: The description to show in TensorBoard. num_thresholds: The number of thresholds to use for PR curves. Returns: A `summary_pb2.SummaryMetadata` protob...
Parse summary metadata to a Python object. Arguments: content: The `content` field of a `SummaryMetadata` proto corresponding to the pr_curves plugin. Returns: A `PrCurvesPlugin` protobuf object. def parse_plugin_metadata(content): """Parse summary metadata to a Python object. Arguments: c...
Return a field to `Observations` dict for the event generator. Args: generator: A generator over event protos. query_for_tag: A string that if specified, only create observations for events with this tag name. Returns: A dict mapping keys in `TRACKED_FIELDS` to an `Observation` list. def get_fi...
Returns a dictionary of tags that a user could query over. Args: field_to_obs: Dict that maps string field to `Observation` list. Returns: A dict that maps keys in `TAG_FIELDS` to a list of string tags present in the event files. If the dict does not have any observations of the type, maps to an e...
Prints a shallow dict to console. Args: d: Dict to print. show_missing: Whether to show keys with empty values. def print_dict(d, show_missing=True): """Prints a shallow dict to console. Args: d: Dict to print. show_missing: Whether to show keys with empty values. """ for k, v in sorted(d.i...
Transform the field-to-obs mapping into a printable dictionary. Args: field_to_obs: Dict that maps string field to `Observation` list. Returns: A dict with the keys and values to print to console. def get_dict_to_print(field_to_obs): """Transform the field-to-obs mapping into a printable dictionary. ...
Returns elements that break the monotonically non-decreasing trend. This is used to find instances of global step values that are "out-of-order", which may trigger TensorBoard event discarding logic. Args: list_of_numbers: A list of numbers. Returns: A list of tuples in which each tuple contains two eleme...
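The out-of-order detector described above reduces to one comparison per adjacent pair. The function name below and the `(previous, current)` tuple shape are assumptions; the monotonically non-decreasing criterion is from the text.

```python
def find_nonincreasing_pairs(list_of_numbers):
    """Sketch: return (previous, current) tuples wherever the sequence
    decreases, i.e. where a global step would appear out of order."""
    return [
        (prev, cur)
        for prev, cur in zip(list_of_numbers, list_of_numbers[1:])
        if cur < prev
    ]
```

Equal adjacent values are allowed, since the trend only has to be non-decreasing.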
Returns a list of event generators for subdirectories with event files. The number of generators returned should equal the number of directories within logdir that contain event files. If only logdir contains event files, returns a list of length one. Args: logdir: A log directory that contains event file...
Returns a list of InspectionUnit objects given either logdir or event_file. If logdir is given, the number of InspectionUnits should equal the number of directories or subdirectories that contain event files. If event_file is given, the number of InspectionUnits should be 1. Args: logdir: A log directory...
Main function for inspector that prints out a digest of event files. Args: logdir: A log directory that contains event files. event_file: Or, a particular event file path. tag: An optional tag name to query for. Raises: ValueError: If neither logdir nor event_file is given, or both are given. de...
Adds DebuggerPlugin CLI flags to parser. def define_flags(self, parser): """Adds DebuggerPlugin CLI flags to parser.""" group = parser.add_argument_group('debugger plugin') group.add_argument( '--debugger_data_server_grpc_port', metavar='PORT', type=int, default=-1, ...
Returns the debugger plugin, if possible. Args: context: The TBContext flags including `add_arguments`. Returns: A DebuggerPlugin instance or None if it couldn't be loaded. def load(self, context): """Returns the debugger plugin, if possible. Args: context: The TBContext flags incl...
Returns a summary metadata for the HParams plugin. Returns a summary_pb2.SummaryMetadata holding a copy of the given HParamsPluginData message in its plugin_data.content field. Sets the version field of the hparams_plugin_data_pb copy to PLUGIN_DATA_VERSION. Args: hparams_plugin_data_pb: the HParamsPlug...
Returns a data oneof's field from plugin_data.content. Raises HParamsError if the content doesn't have 'data_oneof_field' set or this file is incompatible with the version of the metadata stored. Args: content: The SummaryMetadata.plugin_data.content to use. data_oneof_field: string. The name of the dat...
Writes an event proto to disk. This method is threadsafe with respect to invocations of itself. Args: event: The event proto. Raises: IOError: If writing the event proto to disk fails. def write_event(self, event): """Writes an event proto to disk. This method is threadsafe with res...
Disposes of this events writer manager, making it no longer usable. Call this method when this object is done being used in order to clean up resources and handlers. This method should only ever be called once. def dispose(self): """Disposes of this events writer manager, making it no longer usable. ...
Creates a new events writer. Args: directory: The directory in which to write files containing events. Returns: A new events writer, which corresponds to a new events file. def _create_events_writer(self, directory): """Creates a new events writer. Args: directory: The directory in...
Obtains the names of debugger-related events files within the directory. Returns: The names of the debugger-related events files written to disk. The names are sorted in increasing events file index. def _fetch_events_files_on_disk(self): """Obtains the names of debugger-related events files withi...
Re-export all symbols from the original tf.summary. This function finds the original tf.summary V2 API and re-exports all the symbols from it within this module as well, so that when this module is patched into the TF API namespace as the new tf.summary, the effect is an overlay that just adds TensorBoard-prov...
Encode `image` to PNG on `thread_count` threads in parallel. Returns: A `float` representing number of seconds that it takes all threads to finish encoding `image`. def bench(image, thread_count): """Encode `image` to PNG on `thread_count` threads in parallel. Returns: A `float` representing number...
Generate a square RGB test image of the given side length. def _image_of_size(image_size): """Generate a square RGB test image of the given side length.""" return np.random.uniform(0, 256, [image_size, image_size, 3]).astype(np.uint8)
Format a line of a table. Arguments: headers: A list of strings that are used as the table headers. fields: A list of the same length as `headers` where `fields[i]` is the entry for `headers[i]` in this row. Elements can be of arbitrary types. Pass `headers` to print the header row. Returns: ...
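The row formatter described above (pass `headers` as the fields to print the header row) can be sketched with fixed-width columns. The per-column width rule below, keyed off each header's length with a minimum of 10, is purely an assumption for illustration.

```python
def format_line(headers, fields):
    """Sketch: left-justify each field into a column sized from its
    header; pass headers as fields to print the header row."""
    cells = []
    for header, field in zip(headers, fields):
        # Assumed sizing rule: at least 10 wide, plus 2 for spacing.
        width = max(len(header), 10) + 2
        cells.append(str(field).ljust(width))
    return ''.join(cells).rstrip()
```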
Extract all nodes with gated-gRPC debug ops attached. Uses cached values if available. This method is thread-safe. Args: graph_def: A tf.GraphDef proto. matching_debug_op: Return only tensors and nodes matching the specified debug op name (optional). If `None`, will extract only ...
Expand the base name if there are node names nested under the node. For example, if there are two nodes in the graph, "a" and "a/read", then calling this function on "a" will give "a/(a)", a form that points at a leaf node in the nested TensorBoard graph. Calling this function on "a/read" will just ret...
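The expansion rule above ("a" becomes "a/(a)" when nodes are nested under it, while "a/read" stays as-is) is a pure string computation. The function signature below, taking the set of all node names explicitly, is an assumption; the rule itself is from the text.

```python
def expand_base_name(node_name, all_node_names):
    """Sketch: if other nodes are nested under node_name, point at the
    leaf form 'a/(a)'; otherwise return the name unchanged."""
    prefix = node_name + '/'
    if any(other.startswith(prefix) for other in all_node_names):
        leaf = node_name.split('/')[-1]
        return '%s/(%s)' % (node_name, leaf)
    return node_name
```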
Load runs from a directory; recursively walks subdirectories. If path doesn't exist, no-op. This ensures that it is safe to call `AddRunsFromDirectory` multiple times, even before the directory is made. Args: path: A string path to a directory to load runs from. name: Optional, specifies a n...
Load events from every detected run. def Reload(self): """Load events from every detected run.""" logger.info('Beginning DbImportMultiplexer.Reload()') # Defer event sink creation until needed; this ensures it will only exist in # the thread that calls Reload(), since DB connections must be thread-loca...
Returns a batched event iterator over the run directory event files. def load_batches(self): """Returns a batched event iterator over the run directory event files.""" event_iterator = self._directory_watcher.Load() while True: events = [] event_bytes = 0 start = time.time() for eve...
Processes a single tf.Event and records it in tagged_data. def _process_event(self, event, tagged_data): """Processes a single tf.Event and records it in tagged_data.""" event_type = event.WhichOneof('what') # Handle the most common case first. if event_type == 'summary': for value in event.summa...
Create a TensorFlow op to group data into histogram buckets. Arguments: data: A `Tensor` of any shape. Must be castable to `float64`. bucket_count: Optional positive `int` or scalar `int32` `Tensor`. Returns: A `Tensor` of shape `[k, 3]` and type `float64`. The `i`th row is a triple `[left_edge, ri...
Create a legacy histogram summary op. Arguments: name: A unique name for the generated summary node. data: A `Tensor` of any shape. Must be castable to `float64`. bucket_count: Optional positive `int`. The output will have this many buckets, except in two edge cases. If there is no data, then ...
Create a legacy histogram summary protobuf. Arguments: name: A unique name for the generated summary, including any desired name scopes. data: A `np.array` or array-like form of any shape. Must have type castable to `float`. bucket_count: Optional positive `int`. The output will have this ...
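The `[k, 3]` output described above, rows of `[left_edge, right_edge, count]`, can be reproduced in pure Python. This is only a sketch of the bucketing idea: the handling of the two edge cases (empty data, constant data) is an assumption about behavior the truncated docstring does not fully specify.

```python
def bucketize(data, bucket_count=5):
    """Sketch: group numbers into bucket_count equal-width buckets,
    returning rows of [left_edge, right_edge, count]."""
    if not data:
        return []  # Assumed edge case: no data, no buckets.
    lo, hi = min(data), max(data)
    if lo == hi:
        # Assumed edge case: constant data gets one degenerate bucket.
        return [[lo, hi, len(data)]]
    width = (hi - lo) / bucket_count
    buckets = [[lo + i * width, lo + (i + 1) * width, 0]
               for i in range(bucket_count)]
    for x in data:
        # Clamp so the maximum value lands in the last bucket.
        index = min(int((x - lo) / width), bucket_count - 1)
        buckets[index][2] += 1
    return buckets
```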
Add a tensor to the watch store. def add(self, value): """Add a tensor to the watch store.""" if self._disposed: raise ValueError( 'Cannot add value: this _WatchStore instance is already disposed') self._data.append(value) if hasattr(value, 'nbytes'): self._in_mem_bytes += value.nbytes ...
Get number of values in memory. def num_in_memory(self): """Get number of values in memory.""" n = len(self._data) - 1 while n >= 0: if isinstance(self._data[n], _TensorValueDiscarded): break n -= 1 return len(self._data) - 1 - n
Get the number of values discarded due to exceeding both limits. def num_discarded(self): """Get the number of values discarded due to exceeding both limits.""" if not self._data: return 0 n = 0 while n < len(self._data): if not isinstance(self._data[n], _TensorValueDiscarded): brea...
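The `num_in_memory` / `num_discarded` bookkeeping above scans for a `_TensorValueDiscarded` sentinel from opposite ends of the list. The reduced `WatchStore` below isolates just that counting logic; the `discard` method is an illustrative stand-in for the real store's eviction path.

```python
class _TensorValueDiscarded:
    """Sentinel standing in for a value dropped from memory."""

class WatchStore:
    """Sketch of the in-memory/discarded counting described above."""

    def __init__(self):
        self._data = []

    def add(self, value):
        self._data.append(value)

    def discard(self, index):
        # Illustrative eviction: replace a value with the sentinel.
        self._data[index] = _TensorValueDiscarded()

    def num_in_memory(self):
        # Count values after the last discarded sentinel.
        n = len(self._data) - 1
        while n >= 0 and not isinstance(self._data[n], _TensorValueDiscarded):
            n -= 1
        return len(self._data) - 1 - n

    def num_discarded(self):
        # Count the run of discarded sentinels at the front.
        n = 0
        while n < len(self._data) and isinstance(self._data[n], _TensorValueDiscarded):
            n += 1
        return n
```

Because values are always discarded oldest-first, the sentinels form a prefix and the live values a suffix, which is what makes both linear scans correct.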
Query the values at given time indices. Args: time_indices: 0-based time indices to query, as a `list` of `int`. Returns: Values as a list of `numpy.ndarray` (for time indices in memory) or `None` (for time indices discarded). def query(self, time_indices): """Query the values at given ...
Add a tensor value. Args: watch_key: A string representing the debugger tensor watch, e.g., 'Dense_1/BiasAdd:0:DebugIdentity'. tensor_value: The value of the tensor as a numpy.ndarray. def add(self, watch_key, tensor_value): """Add a tensor value. Args: watch_key: A string repre...
Query tensor store for a given watch_key. Args: watch_key: The watch key to query. time_indices: A numpy-style slicing string for time indices. E.g., `-1`, `:-2`, `[::2]`. If not provided (`None`), will use -1. slicing: A numpy-style slicing string for individual time steps. mapping...
Start listening on the given gRPC port. This method of an instance of DebuggerPlugin can be invoked at most once. This method is not thread safe. Args: grpc_port: port number to listen at. Raises: ValueError: If this instance is already listening at a gRPC port. def listen(self, grpc_por...
Determines whether this plugin is active. This plugin is active if any health pills information is present for any run. Returns: A boolean. Whether this plugin is active. def is_active(self): """Determines whether this plugin is active. This plugin is active if any health pills information...
A (wrapped) werkzeug handler for serving health pills. Accepts POST requests and responds with health pills. The request accepts several POST parameters: node_names: (required string) A JSON-ified list of node names for which the client would like to request health pills. run: (optional ...
Obtains the health pills for a run sampled by the event multiplexer. This is much faster than the alternative path of reading health pills from disk. Args: run: The run to fetch health pills for. node_names: A list of node names for which to retrieve health pills. Returns: A diction...
Converts an event_accumulator.TensorEvent to a HealthPillEvent. Args: tensor_event: The event_accumulator.TensorEvent to convert. node_name: The name of the node (without the output slot). device: The device. output_slot: The integer output slot this health pill is relevant to. Returns...