Run a backtest for the given algorithm. This is shared between the cli and :func:`zipline.run_algo`. def _run(handle_data, initialize, before_trading_start, analyze, algofile, algotext, defines, data_frequency, capital_base, bundle, ...
Load all of the given extensions. This should be called by run_algo or the cli. Parameters ---------- default : bool Load the default extension (~/.zipline/extension.py)? extension : iterable[str] The paths to the extensions to load. If the path ends in ``.py`` it is treated ...
Run a trading algorithm. Parameters ---------- start : datetime The start date of the backtest. end : datetime The end date of the backtest. initialize : callable[context -> None] The initialize function to use for the algorithm. This is called once at the very begi...
Extra sources always have a sid column. We expand the given data (by forward filling) to the full range of the simulation dates, so that lookup is fast during simulation. def handle_extra_source(self, source_df, sim_params): """ Extra sources always have a sid column. We expan...
Given an asset and dt, returns the last traded dt from the viewpoint of the given dt. If there is a trade on the dt, the answer is dt provided. def get_last_traded_dt(self, asset, dt, data_frequency): """ Given an asset and dt, returns the last traded dt from the viewpoint of t...
Internal method that determines if this asset/field combination represents a fetcher value or a regular OHLCVP lookup. def _is_extra_source(asset, field, map): """ Internal method that determines if this asset/field combination represents a fetcher value or a regular OHLCVP lookup. ...
Public API method that returns a scalar value representing the value of the desired asset's field at the given dt. Parameters ---------- assets : Asset, ContinuousFuture, or iterable of same. The asset or assets whose data is desired. field : {'open', 'high', ...
Public API method that returns a scalar value representing the value of the desired asset's field at the given dt. Parameters ---------- assets : Asset The asset or assets whose data is desired. This cannot be an arbitrary AssetConvertible. field :...
Returns a list of adjustments between the dt and perspective_dt for the given field and list of assets. Parameters ---------- assets : list of type Asset, or Asset The asset, or assets whose adjustments are desired. field : {'open', 'high', 'low', 'close', 'volume', \...
Returns a scalar value representing the value of the desired asset's field at the given dt with adjustments applied. Parameters ---------- asset : Asset The asset whose data is desired. field : {'open', 'high', 'low', 'close', 'volume', \ 'price', 'l...
Internal method that returns a dataframe containing history bars of daily frequency for the given sids. def _get_history_daily_window(self, assets, end_dt, bar_count, field_to...
Internal method that returns a dataframe containing history bars of minute frequency for the given sids. def _get_history_minute_window(self, assets, end_dt, bar_count, field_to_use): """ Internal method that returns a dataframe containing history bars ...
Public API method that returns a dataframe containing the requested history window. Data is fully adjusted. Parameters ---------- assets : list of zipline.data.Asset objects The assets whose data is desired. bar_count: int The number of bars desired. ...
Internal method that gets a window of adjusted minute data for an asset and specified date range. Used to support the history API method for minute bars. Missing bars are filled with NaN. Parameters ---------- assets : iterable[Asset] The assets whose data ...
Internal method that gets a window of adjusted daily data for a sid and specified date range. Used to support the history API method for daily bars. Parameters ---------- asset : Asset The asset whose data is desired. start_dt: pandas.Timestamp ...
Internal method that returns a list of adjustments for the given sid. Parameters ---------- asset : Asset The asset for which to return adjustments. adjustments_dict: dict A dictionary of sid -> list that is used as a cache. table_name: string ...
Returns any splits for the given sids and the given dt. Parameters ---------- assets : container Assets for which we want splits. dt : pd.Timestamp The date for which we are checking for splits. Note: this is expected to be midnight UTC. Retu...
Returns all the stock dividends for a specific sid that occur in the given trading range. Parameters ---------- sid: int The asset whose stock dividends should be returned. trading_days: pd.DatetimeIndex The trading range. Returns ------...
Returns a list of assets for the current date, as defined by the fetcher data. Returns ------- list: a list of Asset objects. def get_fetcher_assets(self, dt): """ Returns a list of assets for the current date, as defined by the fetcher data. Returns ...
Retrieves the future chain for the contract at the given `dt` according the `continuous_future` specification. Returns ------- future_chain : list[Future] A list of active futures, where the first index is the current contract specified by the continuous future ...
Make a function that checks whether a scalar or array is of a given kind (e.g. float, int, datetime, timedelta). def make_kind_check(python_types, numpy_kind): """ Make a function that checks whether a scalar or array is of a given kind (e.g. float, int, datetime, timedelta). """ def check(valu...
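A runnable sketch of the factory described above; the closure shape matches the snippet, but the body shown here is an assumption rather than the original source:

```python
import numpy as np

def make_kind_check(python_types, numpy_kind):
    """Build a predicate that is True for native Python scalars of
    ``python_types`` or numpy values whose dtype kind is ``numpy_kind``."""
    def check(value):
        if hasattr(value, 'dtype'):
            return value.dtype.kind == numpy_kind
        return isinstance(value, python_types)
    return check

# Kind codes follow numpy's convention: 'f' float, 'i' int, 'M' datetime.
is_float = make_kind_check(float, 'f')
is_int = make_kind_check(int, 'i')
```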
Make a value with the specified numpy dtype. Only datetime64[ns] and datetime64[D] are supported for datetime dtypes. def coerce_to_dtype(dtype, value): """ Make a value with the specified numpy dtype. Only datetime64[ns] and datetime64[D] are supported for datetime dtypes. """ name = dtype.n...
Restride `array` to repeat `count` times along the first axis. Parameters ---------- array : np.array The array to restride. count : int Number of times to repeat `array`. Returns ------- result : array Array of shape (count,) + array.shape, composed of `array` repe...
Restride `array` to repeat `count` times along the last axis. Parameters ---------- array : np.array The array to restride. count : int Number of times to repeat `array`. Returns ------- result : array Array of shape array.shape + (count,) composed of `array` repeat...
Restride an array of shape (X_0, ... X_N) into an array of shape (length, X_0 - length + 1, ... X_N) where each slice at index i along the first axis is equivalent to result[i] = array[length * i:length * (i + 1)] Parameters ---------- array : np.ndarray The bas...
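The restriding helpers above build zero-copy views with ``as_strided``; a sketch of the first two (a stride of zero on the new axis makes every slice alias the same memory):

```python
import numpy as np
from numpy.lib.stride_tricks import as_strided

def repeat_first_axis(array, count):
    """View of ``array`` repeated ``count`` times along a new first axis,
    without copying: the new axis gets a stride of 0."""
    return as_strided(array, (count,) + array.shape, (0,) + array.strides)

def repeat_last_axis(array, count):
    """View of ``array`` repeated ``count`` times along a new last axis."""
    return as_strided(array, array.shape + (count,), array.strides + (0,))
```

Because the result aliases the input buffer, writing through these views is unsafe; they are meant for read-only windowed computations.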
Check if a value is np.NaT. def isnat(obj): """ Check if a value is np.NaT. """ if obj.dtype.kind not in ('m', 'M'): raise ValueError("%s is not a numpy datetime or timedelta" % obj.dtype) return obj.view(int64_dtype) == iNaT
Generic is_missing function that handles NaN and NaT. def is_missing(data, missing_value): """ Generic is_missing function that handles NaN and NaT. """ if is_float(data) and isnan(missing_value): return isnan(data) elif is_datetime(data) and isnat(missing_value): return isnat(data)...
Simple wrapper around numpy.busday_count that returns `float` arrays rather than int arrays, and handles `NaT`s by returning `NaN`s where the inputs were `NaT`. Doesn't support custom weekdays or calendars, but probably should in the future. See Also -------- np.busday_count def busday_count_mask_NaT(beg...
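A sketch of the NaT-masking behavior described above, built on the public ``np.busday_count`` (the real implementation may operate on integer views instead; the substitution strategy here is an assumption):

```python
import numpy as np

def busday_count_mask_NaT(begindates, enddates):
    """np.busday_count that returns floats and yields NaN wherever either
    input was NaT, instead of raising."""
    nat_mask = np.isnat(begindates) | np.isnat(enddates)
    # Substitute an arbitrary valid date so busday_count can run at all.
    filler = np.datetime64('1970-01-01')
    out = np.busday_count(
        np.where(nat_mask, filler, begindates).astype('datetime64[D]'),
        np.where(nat_mask, filler, enddates).astype('datetime64[D]'),
    ).astype(float)
    out[nat_mask] = np.nan
    return out
```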
Compute indices of values in ``a`` that differ from the previous value. Parameters ---------- a : np.ndarray The array in which to compute indices of change. include_first : bool Whether or not to consider the first index of the array as "changed". Example ------- >>> import numpy ...
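A sketch of the change-detection described above, comparing each element to its predecessor (equivalent in spirit, though the original may be implemented differently):

```python
import numpy as np

def changed_locations(a, include_first):
    """Indices at which ``a`` differs from the previous element."""
    indices = np.flatnonzero(a[1:] != a[:-1]) + 1
    if include_first:
        return np.hstack([[0], indices])
    return indices
```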
Compute the start and end dates to run a pipeline for. Parameters ---------- sessions : DatetimeIndex The available dates. start_date : pd.Timestamp The first date in the pipeline. end_date : pd.Timestamp The last date in the pipeline. chunksize : int or None The...
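The chunking described above can be sketched as a generator over session positions (the function name comes from zipline's pipeline utilities; the body is an assumption):

```python
import pandas as pd

def compute_date_range_chunks(sessions, start_date, end_date, chunksize):
    """Yield (chunk_start, chunk_end) session pairs covering the range in
    blocks of at most ``chunksize`` sessions; one pair if chunksize is None."""
    start_ix = sessions.get_loc(start_date)
    end_ix = sessions.get_loc(end_date)
    if chunksize is None:
        yield start_date, end_date
        return
    for chunk_start in range(start_ix, end_ix + 1, chunksize):
        chunk_end = min(chunk_start + chunksize - 1, end_ix)
        yield sessions[chunk_start], sessions[chunk_end]
```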
Compute a pipeline. Parameters ---------- pipeline : zipline.pipeline.Pipeline The pipeline to run. start_date : pd.Timestamp Start date of the computed matrix. end_date : pd.Timestamp End date of the computed matrix. Returns ...
Compute a lifetimes matrix from our AssetFinder, then drop columns that didn't exist at all during the query dates. Parameters ---------- domain : zipline.pipeline.domain.Domain Domain for which we're computing a pipeline. start_date : pd.Timestamp Base s...
Compute the Pipeline terms in the graph for the requested start and end dates. This is where we do the actual work of running a pipeline. Parameters ---------- graph : zipline.pipeline.graph.ExecutionPlan Dependency graph of the terms to be executed. dates :...
Convert raw computed pipeline results into a DataFrame for public APIs. Parameters ---------- terms : dict[str -> Term] Dict mapping column names to terms. data : dict[str -> ndarray[ndim=2]] Dict mapping column names to computed results for those names. ...
Verify that the values passed to compute_chunk are well-formed. def _validate_compute_chunk_params(self, graph, dates, sids, initial_workspace): """ ...
Resolve a concrete domain for ``pipeline``. def resolve_domain(self, pipeline): """Resolve a concrete domain for ``pipeline``. """ domain = pipeline.domain(default=self._default_domain) if domain is GENERIC: raise ValueError( "Unable to determine domain for P...
Decorator for API methods that should only be called after TradingAlgorithm.initialize. `exception` will be raised if the method is called before initialize has completed. Examples -------- @require_initialized(SomeException("Don't do that!")) def method(self): # Do stuff that should o...
Decorator for API methods that cannot be called from within TradingAlgorithm.before_trading_start. `exception` will be raised if the method is called inside `before_trading_start`. Examples -------- @disallowed_in_before_trading_start(SomeException("Don't do that!")) def method(self): ...
Simple implementation of grouped row-wise function application. Parameters ---------- data : ndarray[ndim=2] Input array over which to apply a grouped function. group_labels : ndarray[ndim=2, dtype=int64] Labels to use to bucket inputs from array. Should be the same shape as arr...
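The "simple implementation" the docstring refers to is presumably a double loop over rows and labels; a self-contained sketch:

```python
import numpy as np

def naive_grouped_rowwise_apply(data, group_labels, func):
    """Apply ``func`` to each label-group within each row of ``data``,
    writing results back into the matching positions."""
    out = np.empty_like(data, dtype=float)
    for i in range(data.shape[0]):
        for label in np.unique(group_labels[i]):
            mask = group_labels[i] == label
            out[i, mask] = func(data[i, mask])
    return out
```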
Format a bulleted list of values. Parameters ---------- items : sequence The items to make a list. indent : int, optional The number of spaces to add before each bullet. bullet_type : str, optional The bullet type to use. Returns ------- formatted_list : str ...
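A minimal sketch matching the parameters listed above (the default bullet character is an assumption):

```python
def bulleted_list(items, indent=0, bullet_type='-'):
    """Format ``items`` as a bulleted list, one item per line."""
    prefix = ' ' * indent + bullet_type + ' '
    return '\n'.join(prefix + str(item) for item in items)
```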
Create a DataFrame representing lifetimes of assets that are constantly rotating in and out of existence. Parameters ---------- num_assets : int How many assets to create. first_start : pd.Timestamp The start date for the first asset. frequency : str or pd.tseries.offsets.Offset...
Create a DataFrame representing assets that exist for the full duration between `start_date` and `end_date`. Parameters ---------- sids : array-like of int start_date : pd.Timestamp, optional end_date : pd.Timestamp, optional symbols : list, optional Symbols to use for the assets. ...
Create a DataFrame representing assets that exist for the full duration between `start_date` and `end_date`, from multiple countries. def make_simple_multi_country_equity_info(countries_to_sids, countries_to_exchanges, start_date, ...
Create a DataFrame representing assets that all begin at the same start date, but have cascading end dates. Parameters ---------- num_assets : int How many assets to create. start_date : pd.Timestamp The start date for all the assets. first_end : pd.Timestamp The date at...
Create a DataFrame representing futures for `root_symbols` during `year`. Generates a contract per triple of (symbol, year, month) supplied to `root_symbols`, `years`, and `month_codes`. Parameters ---------- first_sid : int The first sid to use for assigning sids to the created contracts....
Make futures testing data that simulates the notice/expiration date behavior of physical commodities like oil. Parameters ---------- first_sid : int The first sid to use for assigning sids to the created contracts. root_symbols : list[str] A list of root symbols for which to create ...
Construct a Filter returning True for asset/date pairs where the output of ``self`` matches ``other``. def eq(self, other): """ Construct a Filter returning True for asset/date pairs where the output of ``self`` matches ``other``. """ # We treat this as an error because ...
Construct a Filter matching values starting with ``prefix``. Parameters ---------- prefix : str String prefix against which to compare values produced by ``self``. Returns ------- matches : Filter Filter returning True for all sid/date pairs for ...
Construct a Filter matching values ending with ``suffix``. Parameters ---------- suffix : str String suffix against which to compare values produced by ``self``. Returns ------- matches : Filter Filter returning True for all sid/date pairs for wh...
Construct a Filter matching values containing ``substring``. Parameters ---------- substring : str Sub-string against which to compare values produced by ``self``. Returns ------- matches : Filter Filter returning True for all sid/date pairs for ...
Construct a Filter that checks regex matches against ``pattern``. Parameters ---------- pattern : str Regex pattern against which to compare values produced by ``self``. Returns ------- matches : Filter Filter returning True for all sid/date pair...
Construct a Filter indicating whether values are in ``choices``. Parameters ---------- choices : iterable[str or int] An iterable of choices. Returns ------- matches : Filter Filter returning True for all sid/date pairs for which ``self`` ...
Called with the result of a pipeline. This needs to return an object which can be put into the workspace to continue doing computations. This is the inverse of :func:`~zipline.pipeline.term.Term.postprocess`. def to_workspace_value(self, result, assets): """ Called with the result of a...
Convert an array produced by this classifier into an array of integer labels and a missing value label. def _to_integral(self, output_array): """ Convert an array produced by this classifier into an array of integer labels and a missing value label. """ if self.dtype == ...
Override the default array allocation to produce a LabelArray when we have a string-like dtype. def _allocate_output(self, windows, shape): """ Override the default array allocation to produce a LabelArray when we have a string-like dtype. """ if self.dtype == int64_dtyp...
Check that all axes of a pandas object are unique. Parameters ---------- obj : pd.Series / pd.DataFrame / pd.Panel The object to validate. Returns ------- obj : pd.Series / pd.DataFrame / pd.Panel The validated object, unchanged. Raises ------ ValueError If...
Modify a preprocessor to explicitly allow `None`. Parameters ---------- preprocessor : callable[callable, str, any -> any] A preprocessor to delegate to when `arg is not None`. Returns ------- optional_preprocessor : callable[callable, str, any -> any] A preprocessor that deleg...
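Given the ``callable[callable, str, any -> any]`` preprocessor signature described above, the None-passthrough wrapper is nearly a one-liner (a sketch, with an assumed name):

```python
def optionally(preprocessor):
    """Wrap ``preprocessor`` so that None is passed through unchanged
    instead of being handed to the preprocessor."""
    def wrapper(func, argname, arg):
        return arg if arg is None else preprocessor(func, argname, arg)
    return wrapper
```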
Argument preprocessor that converts the input into a numpy dtype. Examples -------- >>> import numpy as np >>> from zipline.utils.preprocess import preprocess >>> @preprocess(dtype=ensure_dtype) ... def foo(dtype): ... return dtype ... >>> foo(float) dtype('float64') def en...
Argument preprocessor that converts the input into a tzinfo object. Examples -------- >>> from zipline.utils.preprocess import preprocess >>> @preprocess(tz=ensure_timezone) ... def foo(tz): ... return tz >>> foo('utc') <UTC> def ensure_timezone(func, argname, arg): """Argument...
Argument preprocessor that converts the input into a pandas Timestamp object. Examples -------- >>> from zipline.utils.preprocess import preprocess >>> @preprocess(ts=ensure_timestamp) ... def foo(ts): ... return ts >>> foo('2014-01-01') Timestamp('2014-01-01 00:00:00') def ens...
Preprocessing decorator that verifies inputs have expected numpy dtypes. Examples -------- >>> from numpy import dtype, arange, int8, float64 >>> @expect_dtypes(x=dtype(int8)) ... def foo(x, y): ... return x, y ... >>> foo(arange(3, dtype=int8), 'foo') (array([0, 1, 2], dtype=int...
Preprocessing decorator that verifies inputs have expected dtype kinds. Examples -------- >>> from numpy import int64, int32, float32 >>> @expect_kinds(x='i') ... def foo(x): ... return x ... >>> foo(int64(2)) 2 >>> foo(int32(2)) 2 >>> foo(float32(2)) # doctest: +NOR...
Preprocessing decorator that verifies inputs have expected types. Examples -------- >>> @expect_types(x=int, y=str) ... def foo(x, y): ... return x, y ... >>> foo(2, '3') (2, '3') >>> foo(2.0, '3') # doctest: +NORMALIZE_WHITESPACE +ELLIPSIS Traceback (most recent call last):...
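A simplified, self-contained sketch of such a type-checking decorator (the real ``expect_types`` is built on zipline's ``preprocess`` machinery; this version inspects arguments directly):

```python
from functools import wraps

def expect_types(**named_types):
    """Decorator raising TypeError when a named argument is not an
    instance of its expected type."""
    def decorator(func):
        argnames = func.__code__.co_varnames[:func.__code__.co_argcount]
        @wraps(func)
        def wrapped(*args, **kwargs):
            bound = dict(zip(argnames, args))
            bound.update(kwargs)
            for name, expected in named_types.items():
                if name in bound and not isinstance(bound[name], expected):
                    raise TypeError(
                        "%s() expected a value of type %s for argument %r, "
                        "but got %s instead." % (
                            func.__name__,
                            expected.__name__,
                            name,
                            type(bound[name]).__name__,
                        )
                    )
            return func(*args, **kwargs)
        return wrapped
    return decorator
```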
Factory for making preprocessing functions that check a predicate on the input value. Parameters ---------- exc_type : Exception The exception type to raise if the predicate fails. template : str A template string to use to create error messages. Should have %-style named te...
Preprocessing decorator that verifies inputs are elements of some expected collection. Examples -------- >>> @expect_element(x=('a', 'b')) ... def foo(x): ... return x.upper() ... >>> foo('a') 'A' >>> foo('b') 'B' >>> foo('c') # doctest: +NORMALIZE_WHITESPACE +ELLIPS...
Preprocessing decorator verifying that inputs fall INCLUSIVELY between bounds. Bounds should be passed as a pair of ``(min_value, max_value)``. ``None`` may be passed as ``min_value`` or ``max_value`` to signify that the input is only bounded above or below. Examples -------- >>> @expect_...
Preprocessing decorator that verifies inputs are numpy arrays with a specific dimensionality. Examples -------- >>> from numpy import array >>> @expect_dimensions(x=1, y=2) ... def foo(x, y): ... return x[0] + y[0, 0] ... >>> foo(array([1, 1]), array([[1, 1], [2, 2]])) 2 ...
A preprocessing decorator that coerces inputs of a given type by passing them to a callable. Parameters ---------- from : type or tuple of types Input types on which to call ``to``. to : function Coercion function to call on inputs. **to_kwargs Additional keywords to fo...
Preprocessing decorator that applies type coercions. Parameters ---------- **kwargs : dict[str -> (type, callable)] Keyword arguments mapping function parameter names to pairs of (from_type, to_type). Examples -------- >>> @coerce_types(x=(float, int), y=(int, str)) ... d...
Validate that a dictionary has an expected set of keys. def validate_keys(dict_, expected, funcname): """Validate that a dictionary has an expected set of keys. """ expected = set(expected) received = set(dict_) missing = expected - received if missing: raise ValueError( "M...
Construct a new enum object. Parameters ---------- *options : iterable of str The names of the fields for the enum. Returns ------- enum A new enum collection. Examples -------- >>> e = enum('a', 'b', 'c') >>> e <enum: ('a', 'b', 'c')> >>> e.a 0 ...
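One simple way to build such an enum is a namedtuple instance whose values are consecutive integers (a sketch consistent with the doctest above):

```python
from collections import namedtuple

def enum(*options):
    """Build an immutable collection mapping each name in ``options`` to
    its index: enum('a', 'b').a == 0, .b == 1."""
    return namedtuple('Enum', options)(*range(len(options)))
```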
Get the oldest frame in the panel. def oldest_frame(self, raw=False): """ Get the oldest frame in the panel. """ if raw: return self.buffer.values[:, self._start_index, :] return self.buffer.iloc[:, self._start_index, :]
Resizes the buffer to hold a new window with a new cap_multiple. If cap_multiple is None, then the old cap_multiple is used. def extend_back(self, missing_dts): """ Resizes the buffer to hold a new window with a new cap_multiple. If cap_multiple is None, then the old cap_multiple is use...
Get a Panel that is the current data in view. It is not safe to persist these objects because internal data might change def get_current(self, item=None, raw=False, start=None, end=None): """ Get a Panel that is the current data in view. It is not safe to persist these objects because i...
Set the values stored in our current in-view data to be values of the passed panel. The passed panel must have the same indices as the panel that would be returned by self.get_current. def set_current(self, panel): """ Set the values stored in our current in-view data to be values of t...
Roll window worth of data up to position zero. Save the effort of having to expensively roll at each iteration def _roll_data(self): """ Roll window worth of data up to position zero. Save the effort of having to expensively roll at each iteration """ self.buffer.values...
Get the oldest frame in the panel. def oldest_frame(self, raw=False): """ Get the oldest frame in the panel. """ if raw: return self.buffer.values[:, self._oldest_frame_idx(), :] return self.buffer.iloc[:, self._oldest_frame_idx(), :]
Get a Panel that is the current data in view. It is not safe to persist these objects because internal data might change def get_current(self): """ Get a Panel that is the current data in view. It is not safe to persist these objects because internal data might change """ ...
Update internal state based on price triggers and the trade event's price. def check_triggers(self, price, dt): """ Update internal state based on price triggers and the trade event's price. """ stop_reached, limit_reached, sl_stop_reached = \ self.check_orde...
Given an order and a trade event, return a tuple of (stop_reached, limit_reached). For market orders, will return (False, False). For stop orders, limit_reached will always be False. For limit orders, stop_reached will always be False. For stop limit orders, a Boolean is returned ...
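For intuition, the comparisons for a buy order can be sketched as follows (sell orders mirror the inequalities; the function name and signature here are illustrative, not the source's):

```python
def check_triggers_buy(price, stop=None, limit=None):
    """Trigger state for a buy order at ``price``.

    A buy stop triggers once price rises to or above the stop; a buy
    limit is satisfied once price falls to or below the limit. With
    neither set (a market order), both flags stay False.
    """
    stop_reached = stop is not None and price >= stop
    limit_reached = limit is not None and price <= limit
    return stop_reached, limit_reached
```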
For a market order, True. For a stop order, True IFF stop_reached. For a limit order, True IFF limit_reached. def triggered(self): """ For a market order, True. For a stop order, True IFF stop_reached. For a limit order, True IFF limit_reached. """ if sel...
Lives in zipline.__init__ for doctests. def setup(self, np=np, numpy_version=numpy_version, StrictVersion=StrictVersion, new_pandas=new_pandas): """Lives in zipline.__init__ for doctests.""" if numpy_version >= StrictVersion('1.14'): self.old_opts = np.get_print...
Lives in zipline.__init__ for doctests. def teardown(self, np=np): """Lives in zipline.__init__ for doctests.""" if self.old_err is not None: np.seterr(**self.old_err) if self.old_opts is not None: np.set_printoptions(**self.old_opts)
Define a unique string for any set of representable args. def hash_args(*args, **kwargs): """Define a unique string for any set of representable args.""" arg_string = '_'.join([str(arg) for arg in args]) kwarg_string = '_'.join([str(key) + '=' + str(value) for key, value in ite...
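A self-contained sketch of such a keying helper (the digest step and the kwarg ordering are assumptions; the source above is cut off before they appear):

```python
from hashlib import md5

def hash_args(*args, **kwargs):
    """Deterministic hex digest for any set of str()-representable args
    and kwargs; sorting kwargs makes the key order-insensitive."""
    arg_string = '_'.join(str(arg) for arg in args)
    kwarg_string = '_'.join(
        '%s=%s' % (key, value) for key, value in sorted(kwargs.items())
    )
    combined = ':'.join([arg_string, kwarg_string])
    return md5(combined.encode('utf-8')).hexdigest()
```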
Assert that an event meets the protocol for datasource outputs. def assert_datasource_protocol(event): """Assert that an event meets the protocol for datasource outputs.""" assert event.type in DATASOURCE_TYPE # Done packets have no dt. if not event.type == DATASOURCE_TYPE.DONE: assert isinst...
Assert that an event meets the protocol for datasource TRADE outputs. def assert_trade_protocol(event): """Assert that an event meets the protocol for datasource TRADE outputs.""" assert_datasource_protocol(event) assert event.type == DATASOURCE_TYPE.TRADE assert isinstance(event.price, numbers.Real) ...
Takes an iterable of sources, generating namestrings and piping their output into date_sort. def date_sorted_sources(*sources): """ Takes an iterable of sources, generating namestrings and piping their output into date_sort. """ sorted_stream = heapq.merge(*(_decorate_source(s) for s in sources...
Creates trade_count trades for each sid in the sids list. The first trade will be on sim_params.start_session, and daily thereafter for each sid. Thus, two sids should result in two trades per day. def create_daily_trade_source(sids, sim_params, asset_fin...
Load data table from zip file provided by Quandl. def load_data_table(file, index_col, show_progress=False): """ Load data table from zip file provided by Quandl. """ with ZipFile(file) as zip_file: file_names = zip_file.namelist() assert len(file_nam...
Fetch WIKI Prices data table from Quandl def fetch_data_table(api_key, show_progress, retries): """ Fetch WIKI Prices data table from Quandl """ for _ in range(retries): try: if show_progress: log.info('Downloading WIKI metadata....
quandl_bundle builds a daily dataset using Quandl's WIKI Prices dataset. For more information on Quandl's API and how to obtain an API key, please visit https://docs.quandl.com/docs#section-authentication def quandl_bundle(environ, asset_db_writer, minute_bar_writer, ...
Download streaming data from a URL, printing progress information to the terminal. Parameters ---------- url : str A URL that can be understood by ``requests.get``. chunk_size : int Number of bytes to read at a time from requests. **progress_kwargs Forwarded to click.pro...
Download data from a URL, returning a BytesIO containing the loaded data. Parameters ---------- url : str A URL that can be understood by ``requests.get``. Returns ------- data : BytesIO A BytesIO containing the downloaded data. def download_without_progress(url): """ ...
Resample a DataFrame with minute data into the frame expected by a BcolzDailyBarWriter. Parameters ---------- minute_frame : pd.DataFrame A DataFrame with the columns `open`, `high`, `low`, `close`, `volume`, and `dt` (minute dts) calendar : trading_calendars.trading_calendar.Tradin...
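The resampling described above reduces each session's minutes to one daily bar. A pandas sketch, grouping by calendar day rather than by trading-calendar session (the calendar-alignment step is omitted for brevity):

```python
import pandas as pd

# Standard OHLCV reductions for collapsing minutes into a session bar.
OHLCV_AGG = {'open': 'first', 'high': 'max', 'low': 'min',
             'close': 'last', 'volume': 'sum'}

def minute_frame_to_session_frame(minute_frame):
    """Collapse a minute-level OHLCV frame with a 'dt' column into one
    row per calendar day."""
    frame = minute_frame.set_index('dt')
    return frame.resample('1D').agg(OHLCV_AGG).dropna(how='all')
```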
Resample an array with minute data into an array with session data. This function assumes that the minute data is the exact length of all minutes in the sessions in the output. Parameters ---------- column : str The `open`, `high`, `low`, `close`, or `volume` column. close_locs : array...
The open field's aggregation returns the first value that occurs for the day. If there has been no data on or before the `dt`, the open is `nan`. Once the first non-nan open is seen, that value remains constant per asset for the remainder of the day. Returns -------
The high field's aggregation returns the largest high seen between the market open and the current dt. If there has been no data on or before the `dt` the high is `nan`. Returns ------- np.array with dtype=float64, in order of assets parameter. def highs(self, assets, dt): ...
The low field's aggregation returns the smallest low seen between the market open and the current dt. If there has been no data on or before the `dt` the low is `nan`. Returns ------- np.array with dtype=float64, in order of assets parameter. def lows(self, assets, dt): ...
The close field's aggregation returns the latest close at the given dt. If the close for the given dt is `nan`, the most recent non-nan `close` is used. If there has been no data on or before the `dt` the close is `nan`. Returns ------- np.array with dtype=float6...
The volume field's aggregation returns the sum of all volumes between the market open and the `dt`. If there has been no data on or before the `dt`, the volume is 0. Returns ------- np.array with dtype=int64, in order of assets parameter. def volumes(self, assets, dt): ""...
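The five "session so far" aggregations described in the last entries can be sketched for a single asset as reductions over the minutes seen up to the current dt (the DataFrame here is hypothetical sample data, not from the source):

```python
import numpy as np
import pandas as pd

# Three minutes of one asset's session; the first minute has no trades.
minutes = pd.DataFrame({
    'open':   [np.nan, 2.0, 3.0],
    'high':   [np.nan, 3.0, 2.5],
    'low':    [np.nan, 1.0, 0.5],
    'close':  [np.nan, 2.5, np.nan],
    'volume': [0, 10, 20],
})

session_so_far = {
    # The first non-nan open of the day stays fixed thereafter.
    'open': (minutes['open'].dropna().iloc[0]
             if minutes['open'].notna().any() else np.nan),
    'high': minutes['high'].max(),               # largest high seen so far
    'low': minutes['low'].min(),                 # smallest low seen so far
    'close': minutes['close'].ffill().iloc[-1],  # most recent non-nan close
    'volume': minutes['volume'].sum(),           # running total, 0 if no data
}
```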