Extract a model from its distributed containers. Args: model (`torch.nn.Module`): The model to extract. keep_fp32_wrapper (`bool`, *optional*): Whether to remove mixed precision hooks from the model. recursive (`bool`, *optional*, defaults to `False`): Whether to recursively extract...
def extract_model_from_parallel(model, keep_fp32_wrapper: bool = True, recursive: bool = False): """ Extract a model from its distributed containers. Args: model (`torch.nn.Module`): The model to extract. keep_fp32_wrapper (`bool`, *optional*): Whether to remove mixe...
Introduces a blocking point in the script, making sure all processes have reached this point before continuing. <Tip warning={true}> Make sure all processes will reach this instruction otherwise one of your processes will hang forever. </Tip>
def wait_for_everyone(): """ Introduces a blocking point in the script, making sure all processes have reached this point before continuing. <Tip warning={true}> Make sure all processes will reach this instruction otherwise one of your processes will hang forever. </Tip> """ PartialState(...
Cleans the state dictionary from a model and removes tensor aliasing if present. Args: state_dict (`dict`): The state dictionary from a model
def clean_state_dict_for_safetensors(state_dict: dict): """ Cleans the state dictionary from a model and removes tensor aliasing if present. Args: state_dict (`dict`): The state dictionary from a model """ ptrs = collections.defaultdict(list) # When bnb serialization is used...
Save the data to disk. Use in place of `torch.save()`. Args: obj: The data to save f: The file (or file-like object) to use to save the data save_on_each_node (`bool`, *optional*, defaults to `False`): Whether to only save on the global main process safe_serialization (`bool`, *...
def save(obj, f, save_on_each_node: bool = False, safe_serialization: bool = False): """ Save the data to disk. Use in place of `torch.save()`. Args: obj: The data to save f: The file (or file-like object) to use to save the data save_on_each_node (`bool`, *o...
A context manager that will temporarily clear environment variables. When this context exits, the previous environment variables will be back. Example: ```python >>> import os >>> from accelerate.utils import clear_environment >>> os.environ["FOO"] = "bar" >>> with clear_environment(): ... print(os.environ) ......
def clear_environment(): """ A context manager that will temporarily clear environment variables. When this context exits, the previous environment variables will be back. Example: ```python >>> import os >>> from accelerate.utils import clear_environment >>> os.environ["FOO"] = "bar...
A context manager that will add each keyword argument passed to `os.environ` and remove them when exiting. Will convert the values in `kwargs` to strings and upper-case all the keys. Example: ```python >>> import os >>> from accelerate.utils import patch_environment >>> with patch_environment(FOO="bar"): ... pr...
def patch_environment(**kwargs): """ A context manager that will add each keyword argument passed to `os.environ` and remove them when exiting. Will convert the values in `kwargs` to strings and upper-case all the keys. Example: ```python >>> import os >>> from accelerate.utils import pat...
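The truncated body above can be sketched in full with a stdlib `contextmanager`: cache any pre-existing values, set the upper-cased keys, and restore everything on exit. This is a minimal version consistent with the description; the library's actual code may differ in details.

```python
import contextlib
import os


@contextlib.contextmanager
def patch_environment(**kwargs):
    """Temporarily set upper-cased keys in os.environ, restoring prior state on exit."""
    existing = {}
    for key, value in kwargs.items():
        key = key.upper()
        if key in os.environ:
            # Remember the old value so it can be restored afterwards.
            existing[key] = os.environ[key]
        os.environ[key] = str(value)
    try:
        yield
    finally:
        for key in kwargs:
            key = key.upper()
            if key in existing:
                os.environ[key] = existing[key]
            else:
                del os.environ[key]
```

Note that values are stringified because `os.environ` only accepts strings, and keys that did not exist before are deleted rather than left behind.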
Gets a pretty name from `obj`.
def get_pretty_name(obj): """ Gets a pretty name from `obj`. """ if not hasattr(obj, "__qualname__") and not hasattr(obj, "__name__"): obj = getattr(obj, "__class__", obj) if hasattr(obj, "__qualname__"): return obj.__qualname__ if hasattr(obj, "__name__"): return obj.__...
Recursively merges two dictionaries. Args: source (`dict`): The dictionary to merge into `destination`. destination (`dict`): The dictionary to merge `source` into.
def merge_dicts(source, destination): """ Recursively merges two dictionaries. Args: source (`dict`): The dictionary to merge into `destination`. destination (`dict`): The dictionary to merge `source` into. """ for key, value in source.items(): if isinstance(value, dict): ...
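The recursion cut off above can be completed as follows (a sketch consistent with the docstring; the library's exact code may differ):

```python
def merge_dicts(source, destination):
    """Recursively merge `source` into `destination` and return `destination`."""
    for key, value in source.items():
        if isinstance(value, dict):
            # Recurse into nested dicts, creating the destination node if missing.
            node = destination.setdefault(key, {})
            merge_dicts(value, node)
        else:
            destination[key] = value
    return destination
```

`destination` is mutated in place; nested keys present only in `destination` survive the merge.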
Checks if a port is in use on `localhost`. Useful for checking if multiple `accelerate launch` commands have been run and need to see if the port is already in use.
def is_port_in_use(port: int = None) -> bool: """ Checks if a port is in use on `localhost`. Useful for checking if multiple `accelerate launch` commands have been run and need to see if the port is already in use. """ if port is None: port = 29500 with socket.socket(socket.AF_INET, sock...
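One plausible completion of the socket check above attempts a connection and treats success as "in use" (`connect_ex` returns 0 on success, an errno otherwise):

```python
import socket


def is_port_in_use(port: int = None) -> bool:
    """Return True if something is already listening on `port` on localhost."""
    if port is None:
        port = 29500  # default port used by `accelerate launch`
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        # connect_ex returns 0 on a successful connection, an error code otherwise.
        return s.connect_ex(("127.0.0.1", port)) == 0
```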
Converts `size` from bytes to the largest possible unit
def convert_bytes(size): "Converts `size` from bytes to the largest possible unit" for x in ["bytes", "KB", "MB", "GB", "TB"]: if size < 1024.0: return f"{round(size, 2)} {x}" size /= 1024.0 return f"{round(size, 2)} PB"
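Since `convert_bytes` appears in full above, its behavior can be demonstrated directly:

```python
def convert_bytes(size):
    "Converts `size` from bytes to the largest possible unit"
    for x in ["bytes", "KB", "MB", "GB", "TB"]:
        if size < 1024.0:
            return f"{round(size, 2)} {x}"
        size /= 1024.0
    return f"{round(size, 2)} PB"


print(convert_bytes(512))          # 512 bytes
print(convert_bytes(1536))         # 1.5 KB
print(convert_bytes(3 * 1024**3))  # 3.0 GB
```

Each division by 1024 promotes the value to the next unit until it drops below 1024, so the result always uses the largest unit that keeps the number above 1.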
Warns if the kernel version is below the recommended minimum on Linux.
def check_os_kernel(): """Warns if the kernel version is below the recommended minimum on Linux.""" # see issue #1929 info = platform.uname() system = info.system if system != "Linux": return _, version, *_ = re.split(r"(\d+\.\d+\.\d+)", info.release) min_version = "5.5.0" if Ve...
Recursive `getattr`. Args: obj: A class instance holding the attribute. attr (`str`): The attribute that is to be retrieved, e.g. 'attribute1.attribute2'.
def recursive_getattr(obj, attr: str): """ Recursive `getattr`. Args: obj: A class instance holding the attribute. attr (`str`): The attribute that is to be retrieved, e.g. 'attribute1.attribute2'. """ def _getattr(obj, attr): return getattr(obj, att...
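The truncated `_getattr` helper above folds plain `getattr` over the dotted path; a complete sketch (names mirror the snippet, the fold detail is an assumption):

```python
import functools


def recursive_getattr(obj, attr: str):
    """Dotted-path getattr, e.g. recursive_getattr(model, "layer1.weight")."""
    def _getattr(obj, attr):
        return getattr(obj, attr)

    # Fold _getattr over each path segment, starting from obj.
    return functools.reduce(_getattr, attr.split("."), obj)
```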
Helper function for reproducible behavior to set the seed in `random`, `numpy`, `torch`. Args: seed (`int`): The seed to set. device_specific (`bool`, *optional*, defaults to `False`): Whether to differ the seed on each device slightly with `self.process_index`. deterministic (`bool`, *opti...
def set_seed(seed: int, device_specific: bool = False, deterministic: bool = False): """ Helper function for reproducible behavior to set the seed in `random`, `numpy`, `torch`. Args: seed (`int`): The seed to set. device_specific (`bool`, *optional*, defaults to `False`): ...
Helper function to install appropriate xla wheels based on the `torch` version in Google Colaboratory. Args: upgrade (`bool`, *optional*, defaults to `False`): Whether to upgrade `torch` and install the latest `torch_xla` wheels. Example: ```python >>> from accelerate.utils import install_xla >>> instal...
def install_xla(upgrade: bool = False): """ Helper function to install appropriate xla wheels based on the `torch` version in Google Colaboratory. Args: upgrade (`bool`, *optional*, defaults to `False`): Whether to upgrade `torch` and install the latest `torch_xla` wheels. Example:...
Wrapper around `tqdm.tqdm` that optionally displays only on the main process. Args: main_process_only (`bool`, *optional*): Whether to display the progress bar only on the main process
def tqdm(*args, main_process_only: bool = True, **kwargs): """ Wrapper around `tqdm.tqdm` that optionally displays only on the main process. Args: main_process_only (`bool`, *optional*): Whether to display the progress bar only on the main process """ if not is_tqdm_available():...
Recursively converts the linear and layernorm layers of a model to their `transformers_engine` counterpart.
def convert_model(model, to_transformer_engine=True, _convert_linear=True, _convert_ln=True): """ Recursively converts the linear and layernorm layers of a model to their `transformers_engine` counterpart. """ if not is_fp8_available(): raise ImportError("Using `convert_model` requires transform...
Returns whether a given model has some `transformer_engine` layer or not.
def has_transformer_engine_layers(model): """ Returns whether a given model has some `transformer_engine` layer or not. """ if not is_fp8_available(): raise ImportError("Using `has_transformer_engine_layers` requires transformer_engine to be installed.") for m in model.modules(): ...
Compares a library version to some requirement using a given operation. Args: library_or_version (`str` or `packaging.version.Version`): A library name or a version to check. operation (`str`): A string representation of an operator, such as `">"` or `"<="`. requirement_version (`str`): ...
def compare_versions(library_or_version: Union[str, Version], operation: str, requirement_version: str): """ Compares a library version to some requirement using a given operation. Args: library_or_version (`str` or `packaging.version.Version`): A library name or a version to check. ...
Compares the current PyTorch version to a given reference with an operation. Args: operation (`str`): A string representation of an operator, such as `">"` or `"<="` version (`str`): A string version of PyTorch
def is_torch_version(operation: str, version: str): """ Compares the current PyTorch version to a given reference with an operation. Args: operation (`str`): A string representation of an operator, such as `">"` or `"<="` version (`str`): A string version of PyTorch ...
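The real helpers lean on `packaging.version` for parsing; a stdlib-only sketch of the same idea maps the operator string through the `operator` module and compares naive dotted-number tuples. `naive_compare_versions` is an illustrative name, and this simplification ignores pre-release and local version segments that `packaging` handles.

```python
import operator

# Map operator strings like ">" onto their callable counterparts.
STR_OPERATION_TO_FUNC = {
    ">": operator.gt, ">=": operator.ge,
    "<": operator.lt, "<=": operator.le,
    "==": operator.eq, "!=": operator.ne,
}


def naive_compare_versions(version: str, operation: str, requirement_version: str) -> bool:
    """Compare dotted version strings, e.g. naive_compare_versions("2.1.0", ">=", "2.0")."""
    parse = lambda v: tuple(int(part) for part in v.split("."))
    return STR_OPERATION_TO_FUNC[operation](parse(version), parse(requirement_version))
```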
With this test, an observed batch size of 64 should result in negligible differences in the scheduler after going through the correct number of steps. Uses single, two, and four steps to test.
def accumulation_test(num_processes: int = 2): """ With this test, an observed batch size of 64 should result in negligible differences in the scheduler after going through the correct number of steps. Uses single, two, and four steps to test. """ from transformers import get_linear_schedule_with...
Generates a tuple of dummy DataLoaders to test with
def dummy_dataloaders(a=2, b=3, batch_size=16, n_train_batches: int = 10, n_valid_batches: int = 2): "Generates a tuple of dummy DataLoaders to test with" def get_dataset(n_batches): x = torch.randn(batch_size * n_batches, 1) return TensorDataset(x, a * x + b + 0.1 * torch.randn(batch_size * n_...
Trains for `num_epochs`
def train(num_epochs, model, dataloader, optimizer, accelerator, scheduler=None): "Trains for `num_epochs`" rands = [] for epoch in range(num_epochs): # Train quickly model.train() for batch in dataloader: x, y = batch outputs = model(x) loss = to...
Helper function parsing the command line options @retval ArgumentParser
def parse_args(): """ Helper function parsing the command line options @retval ArgumentParser """ parser = ArgumentParser( description=( "PyTorch TPU distributed training launch " "helper utility that will spawn up " "multiple distributed processes" ...
Retrieve a URL as a string.
def r(url, headers=ADOBE_REQ_HEADERS): """Retrieve a URL as a string.""" req = session.get(url, headers=headers) req.encoding = 'utf-8' return req.text
First stage of parsing the XML.
def get_products_xml(adobeurl): """First stage of parsing the XML.""" print('Source URL is: ' + adobeurl) return ET.fromstring(r(adobeurl))
2nd stage of parsing the XML.
def parse_products_xml(products_xml, urlVersion, allowedPlatforms): """2nd stage of parsing the XML.""" if urlVersion == 6: prefix = 'channels/' else: prefix = '' cdn = products_xml.find(prefix + 'channel/cdn/secure').text products = {} parent_map = {c: p for p in products_xml.i...
Question prompt default Y.
def questiony(question: str) -> bool: """Question prompt default Y.""" reply = None while reply not in ("", "y", "n"): reply = input(f"{question} (Y/n): ").lower() return (reply in ("", "y"))
Question prompt default N.
def questionn(question: str) -> bool: """Question prompt default N.""" reply = None while reply not in ("", "y", "n"): reply = input(f"{question} (y/N): ").lower() return (reply in ("y", "Y"))
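For unit testing, the prompt loops above can be parameterized over the input function. This is an adaptation for illustration — the original reads from `input` directly and has no `input_func` parameter:

```python
def questiony(question: str, input_func=input) -> bool:
    """Yes/no prompt defaulting to yes: an empty reply counts as 'y'."""
    reply = None
    while reply not in ("", "y", "n"):
        # Lower-case the reply so 'Y'/'N' are accepted too.
        reply = input_func(f"{question} (Y/n): ").lower()
    return reply in ("", "y")
```

The same trick applies to `questionn`, with the empty reply counting as "n" instead.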
Retrieve JSON.
def get_application_json(buildGuid): """Retrieve JSON.""" headers = ADOBE_REQ_HEADERS.copy() headers['x-adobe-build-guid'] = buildGuid return json.loads(r(ADOBE_APPLICATION_JSON_URL, headers))
Ask for desired download folder
def get_download_path(): """Ask for desired download folder""" if (args.destination): print('\nUsing provided destination: ' + args.destination) dest = args.destination else: print('\nPlease navigate to the desired downloads folder, or cancel to abort.') p = Popen(['/usr/bin...
Download a file
def download_file(url, product_dir, s, v, name=None): """Download a file""" if not name: name = url.split('/')[-1].split('?')[0] print('Url is: ' + url) print('[{}_{}] Downloading {}'.format(s, v, name)) file_path = os.path.join(product_dir, name) response = session.head(url, stream=True...
Download APRO
def download_APRO(appInfo, cdn): """Download APRO""" manifest = get_products_xml(cdn + appInfo['buildGuid']) downloadURL = manifest.find('asset_list/asset/asset_path').text dest = get_download_path() sapCode = appInfo['sapCode'] version = appInfo['productVersion'] name = 'Install {}_{}_{}.dmg...
Run main execution.
def run_ccdl(products, cdn, sapCodes, allowedPlatforms): """Run main execution.""" sapCode = args.sapCode if not sapCode: for s, d in sapCodes.items(): print('[{}]{}{}'.format(s, (10 - len(s)) * ' ', d)) while sapCode is None: val = input( '\nPleas...
Constructs and sends a request. Returns response object. method - HTTP method url - request url params - (optional) Dictionary or bytes to be sent in the query string of the new request data - (optional) Dictionary, bytes, or file-like object to send in the body of the request json - (optional) Any json compatible...
def request( method: str, url: StrOrURL, *, params: Optional[Mapping[str, str]] = None, data: Any = None, json: Any = None, headers: Optional[LooseHeaders] = None, skip_auto_headers: Optional[Iterable[str]] = None, auth: Optional[BasicAuth] = None, allow_redirects: bool = True, ...
Load netrc from file. Attempt to load it from the path specified by the env-var NETRC or in the default location in the user's home directory. Returns None if it couldn't be found or fails to parse.
def netrc_from_env() -> Optional[netrc.netrc]: """Load netrc from file. Attempt to load it from the path specified by the env-var NETRC or in the default location in the user's home directory. Returns None if it couldn't be found or fails to parse. """ netrc_env = os.environ.get("NETRC") ...
Return :py:class:`~aiohttp.BasicAuth` credentials for ``host`` from ``netrc_obj``. :raises LookupError: if ``netrc_obj`` is :py:data:`None` or if no entry is found for the ``host``.
def basicauth_from_netrc(netrc_obj: Optional[netrc.netrc], host: str) -> BasicAuth: """ Return :py:class:`~aiohttp.BasicAuth` credentials for ``host`` from ``netrc_obj``. :raises LookupError: if ``netrc_obj`` is :py:data:`None` or if no entry is found for the ``host``. """ if netrc_obj ...
Get a permitted proxy for the given URL from the env.
def get_env_proxy_for_url(url: URL) -> Tuple[URL, Optional[BasicAuth]]: """Get a permitted proxy for the given URL from the env.""" if url.host is not None and proxy_bypass(url.host): raise LookupError(f"Proxying is disallowed for `{url.host!r}`") proxies_in_env = proxies_from_env() try: ...
Parses a MIME type into its components. mimetype is a MIME type string. Returns a MimeType object. Example: >>> parse_mimetype('text/html; charset=utf-8') MimeType(type='text', subtype='html', suffix='', parameters={'charset': 'utf-8'})
def parse_mimetype(mimetype: str) -> MimeType: """Parses a MIME type into its components. mimetype is a MIME type string. Returns a MimeType object. Example: >>> parse_mimetype('text/html; charset=utf-8') MimeType(type='text', subtype='html', suffix='', parameters={'charset': 'u...
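A simplified sketch of the parse shows the shape of the result. Note that aiohttp's real `MimeType` stores parameters in a `MultiDict` and handles RFC quoting rules; `parse_mimetype_sketch` is an illustrative name for this cut-down version.

```python
from dataclasses import dataclass, field
from typing import Dict


@dataclass
class MimeType:
    type: str
    subtype: str
    suffix: str
    parameters: Dict[str, str] = field(default_factory=dict)


def parse_mimetype_sketch(mimetype: str) -> MimeType:
    """Split 'type/subtype+suffix; key=value' into parts (no RFC quoting rules)."""
    fulltype, *param_parts = mimetype.lower().split(";")
    mtype, _, msubtype = fulltype.strip().partition("/")
    # A structured-syntax suffix like '+json' hangs off the subtype.
    msubtype, _, suffix = msubtype.partition("+")
    parameters = {}
    for item in param_parts:
        key, _, value = item.strip().partition("=")
        parameters[key] = value
    return MimeType(mtype, msubtype, suffix, parameters)
```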
Return 7-bit content as quoted-string. Format content into a quoted-string as defined in RFC5322 for Internet Message Format. Notice that this is not the 8-bit HTTP format, but the 7-bit email format. Content must be in usascii or a ValueError is raised.
def quoted_string(content: str) -> str: """Return 7-bit content as quoted-string. Format content into a quoted-string as defined in RFC5322 for Internet Message Format. Notice that this is not the 8-bit HTTP format, but the 7-bit email format. Content must be in usascii or a ValueError is raised. ...
Sets ``Content-Disposition`` header for MIME. This is the MIME payload Content-Disposition header from RFC 2183 and RFC 7579 section 4.2, not the HTTP Content-Disposition from RFC 6266. disptype is a disposition type: inline, attachment, form-data. Should be valid extension token (see RFC 2183) quote_fields performs...
def content_disposition_header( disptype: str, quote_fields: bool = True, _charset: str = "utf-8", **params: str ) -> str: """Sets ``Content-Disposition`` header for MIME. This is the MIME payload Content-Disposition header from RFC 2183 and RFC 7579 section 4.2, not the HTTP Content-Disposition from ...
Checks if received content type is processable as an expected one. Both arguments should be given without parameters.
def is_expected_content_type( response_content_type: str, expected_content_type: str ) -> bool: """Checks if received content type is processable as an expected one. Both arguments should be given without parameters. """ if expected_content_type == "application/json": return json_re.match(r...
Set future exception. If the future is marked as complete, this function is a no-op. :param exc_cause: An exception that is a direct cause of ``exc``. Only set if provided.
def set_exception( fut: "asyncio.Future[_T] | ErrorableProtocol", exc: BaseException, exc_cause: BaseException = _EXC_SENTINEL, ) -> None: """Set future exception. If the future is marked as complete, this function is a no-op. :param exc_cause: An exception that is a direct cause of ``exc``. ...
Process a date string, return a datetime object
def parse_http_date(date_str: Optional[str]) -> Optional[datetime.datetime]: """Process a date string, return a datetime object""" if date_str is not None: timetuple = parsedate(date_str) if timetuple is not None: with suppress(ValueError): return datetime.datetime(*...
Check if a request must return an empty body.
def must_be_empty_body(method: str, code: int) -> bool: """Check if a request must return an empty body.""" return ( status_code_must_be_empty_body(code) or method_must_be_empty_body(method) or (200 <= code < 300 and method.upper() == hdrs.METH_CONNECT) )
Check if a method must return an empty body.
def method_must_be_empty_body(method: str) -> bool: """Check if a method must return an empty body.""" # https://datatracker.ietf.org/doc/html/rfc9112#section-6.3-2.1 # https://datatracker.ietf.org/doc/html/rfc9112#section-6.3-2.2 return method.upper() == hdrs.METH_HEAD
Check if a status code must return an empty body.
def status_code_must_be_empty_body(code: int) -> bool: """Check if a status code must return an empty body.""" # https://datatracker.ietf.org/doc/html/rfc9112#section-6.3-2.1 return code in {204, 304} or 100 <= code < 200
Check if a Content-Length header should be removed. This should always be a subset of must_be_empty_body
def should_remove_content_length(method: str, code: int) -> bool: """Check if a Content-Length header should be removed. This should always be a subset of must_be_empty_body """ # https://www.rfc-editor.org/rfc/rfc9110.html#section-8.6-8 # https://www.rfc-editor.org/rfc/rfc9110.html#section-15.4.5-...
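Taken together, the empty-body helpers above compose like this. String literals stand in for the `aiohttp.hdrs` constants, and the comments restate the RFC 9112 rules the snippets cite:

```python
def status_code_must_be_empty_body(code: int) -> bool:
    # RFC 9112 section 6.3: 1xx, 204 and 304 responses never carry a body.
    return code in {204, 304} or 100 <= code < 200


def method_must_be_empty_body(method: str) -> bool:
    # Responses to HEAD echo the headers of GET but omit the body.
    return method.upper() == "HEAD"


def must_be_empty_body(method: str, code: int) -> bool:
    return (
        status_code_must_be_empty_body(code)
        or method_must_be_empty_body(method)
        # A successful CONNECT switches to tunnelling, so no body either.
        or (200 <= code < 300 and method.upper() == "CONNECT")
    )
```

`should_remove_content_length` is documented above as a strict subset of `must_be_empty_body`, which is exactly what the subset test later in this section asserts.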
Check if the upgrade header is supported.
def _is_supported_upgrade(headers: CIMultiDictProxy[str]) -> bool: """Check if the upgrade header is supported.""" return headers.get(hdrs.UPGRADE, "").lower() in {"tcp", "websocket"}
Websocket masking function. `mask` is a `bytes` object of length 4; `data` is a `bytearray` object of any length. The contents of `data` are masked with `mask`, as specified in section 5.3 of RFC 6455. Note that this function mutates the `data` argument. This pure-python implementation may be replaced by an optimize...
def _websocket_mask_python(mask: bytes, data: bytearray) -> None: """Websocket masking function. `mask` is a `bytes` object of length 4; `data` is a `bytearray` object of any length. The contents of `data` are masked with `mask`, as specified in section 5.3 of RFC 6455. Note that this function mut...
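The masking described above is a straightforward in-place XOR (the docstring notes this pure-Python path may be replaced by an optimized version); a minimal sketch:

```python
def websocket_mask(mask: bytes, data: bytearray) -> None:
    """XOR each byte of `data` with the repeating 4-byte `mask` (RFC 6455 section 5.3)."""
    assert len(mask) == 4
    for i in range(len(data)):
        # The mask repeats every four bytes across the payload.
        data[i] ^= mask[i % 4]
```

Because XOR is an involution, applying the same mask twice restores the original payload, which is also how the receiving side unmasks frames.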
Set up pytest fixture. Allow fixtures to be coroutines. Run coroutine fixtures in an event loop.
def pytest_fixture_setup(fixturedef): # type: ignore[no-untyped-def] """Set up pytest fixture. Allow fixtures to be coroutines. Run coroutine fixtures in an event loop. """ func = fixturedef.func if inspect.isasyncgenfunction(func): # async generator fixture is_async_gen = True ...
--fast config option
def fast(request): # type: ignore[no-untyped-def] """--fast config option""" return request.config.getoption("--aiohttp-fast")
--enable-loop-debug config option
def loop_debug(request): # type: ignore[no-untyped-def] """--enable-loop-debug config option""" return request.config.getoption("--aiohttp-enable-loop-debug")
Context manager which checks for RuntimeWarnings. This exists specifically to avoid "coroutine 'X' was never awaited" warnings being missed. If RuntimeWarnings occur in the context a RuntimeError is raised.
def _runtime_warning_context(): # type: ignore[no-untyped-def] """Context manager which checks for RuntimeWarnings. This exists specifically to avoid "coroutine 'X' was never awaited" warnings being missed. If RuntimeWarnings occur in the context a RuntimeError is raised. """ with warnings.ca...
Passthrough loop context. Sets up and tears down a loop unless one is passed in via the loop argument when it's passed straight through.
def _passthrough_loop_context(loop, fast=False): # type: ignore[no-untyped-def] """Passthrough loop context. Sets up and tears down a loop unless one is passed in via the loop argument when it's passed straight through. """ if loop: # loop already exists, pass it straight through y...
Fix pytest collecting for coroutines.
def pytest_pycollect_makeitem(collector, name, obj): # type: ignore[no-untyped-def] """Fix pytest collecting for coroutines.""" if collector.funcnamefilter(name) and asyncio.iscoroutinefunction(obj): return list(collector._genfunctions(name, obj))
Run coroutines in an event loop instead of a normal function call.
def pytest_pyfunc_call(pyfuncitem): # type: ignore[no-untyped-def] """Run coroutines in an event loop instead of a normal function call.""" fast = pyfuncitem.config.getoption("--aiohttp-fast") if asyncio.iscoroutinefunction(pyfuncitem.function): existing_loop = pyfuncitem.funcargs.get( ...
Return an instance of the event loop.
def loop(loop_factory, fast, loop_debug): # type: ignore[no-untyped-def] """Return an instance of the event loop.""" policy = loop_factory() asyncio.set_event_loop_policy(policy) with loop_context(fast=fast) as _loop: if loop_debug: _loop.set_debug(True) # pragma: no cover ...
Return a port that is unused on the current host.
def aiohttp_unused_port() -> Callable[[], int]: """Return a port that is unused on the current host.""" return _unused_port
Factory to create a TestServer instance, given an app. aiohttp_server(app, **kwargs)
def aiohttp_server(loop: asyncio.AbstractEventLoop) -> Iterator[AiohttpServer]: """Factory to create a TestServer instance, given an app. aiohttp_server(app, **kwargs) """ servers = [] async def go(app, *, port=None, **kwargs): # type: ignore[no-untyped-def] server = TestServer(app, port=...
Factory to create a RawTestServer instance, given a web handler. aiohttp_raw_server(handler, **kwargs)
def aiohttp_raw_server(loop: asyncio.AbstractEventLoop) -> Iterator[AiohttpRawServer]: """Factory to create a RawTestServer instance, given a web handler. aiohttp_raw_server(handler, **kwargs) """ servers = [] async def go(handler, *, port=None, **kwargs): # type: ignore[no-untyped-def] s...
Client class to use in ``aiohttp_client`` factory. Use it for passing custom ``TestClient`` implementations. Example:: class MyClient(TestClient): async def login(self, *, user, pw): payload = {"username": user, "password": pw} return await self.post("/login", json=payload) @pytes...
def aiohttp_client_cls() -> Type[TestClient]: """ Client class to use in ``aiohttp_client`` factory. Use it for passing custom ``TestClient`` implementations. Example:: class MyClient(TestClient): async def login(self, *, user, pw): payload = {"username": user, "passw...
Factory to create a TestClient instance. aiohttp_client(app, **kwargs) aiohttp_client(server, **kwargs) aiohttp_client(raw_server, **kwargs)
def aiohttp_client( loop: asyncio.AbstractEventLoop, aiohttp_client_cls: Type[TestClient] ) -> Iterator[AiohttpClient]: """Factory to create a TestClient instance. aiohttp_client(app, **kwargs) aiohttp_client(server, **kwargs) aiohttp_client(raw_server, **kwargs) """ clients = [] async...
Return a port that is unused on the current host.
def unused_port() -> int: """Return a port that is unused on the current host.""" with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s: s.bind(("127.0.0.1", 0)) return cast(int, s.getsockname()[1])
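`unused_port` appears in full above; binding to port 0 asks the OS for a free ephemeral port, which the socket then reports back:

```python
import socket
from typing import cast


def unused_port() -> int:
    """Return a port that is unused on the current host."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        # Port 0 tells the OS to pick any free ephemeral port.
        s.bind(("127.0.0.1", 0))
        return cast(int, s.getsockname()[1])


port = unused_port()
```

There is an inherent race here: the port is free when checked but could be claimed by another process before the test server binds it, which is acceptable for test fixtures.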
A contextmanager that creates an event_loop, for test purposes. Handles the creation and cleanup of a test loop.
def loop_context( loop_factory: _LOOP_FACTORY = asyncio.new_event_loop, fast: bool = False ) -> Iterator[asyncio.AbstractEventLoop]: """A contextmanager that creates an event_loop, for test purposes. Handles the creation and cleanup of a test loop. """ loop = setup_test_loop(loop_factory) yield...
Create and return an asyncio.BaseEventLoop instance. The caller should also call teardown_test_loop, once they are done with the loop.
def setup_test_loop( loop_factory: _LOOP_FACTORY = asyncio.new_event_loop, ) -> asyncio.AbstractEventLoop: """Create and return an asyncio.BaseEventLoop instance. The caller should also call teardown_test_loop, once they are done with the loop. """ loop = loop_factory() asyncio.set_event_lo...
Teardown and cleanup an event_loop created by setup_test_loop.
def teardown_test_loop(loop: asyncio.AbstractEventLoop, fast: bool = False) -> None: """Teardown and cleanup an event_loop created by setup_test_loop.""" closed = loop.is_closed() if not closed: loop.call_soon(loop.stop) loop.run_forever() loop.close() if not fast: gc.c...
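Stripped to stdlib asyncio, the setup/teardown pair above works roughly like this (a sketch; the real fixtures add executor shutdown and debug handling):

```python
import asyncio
import gc


def setup_test_loop() -> asyncio.AbstractEventLoop:
    """Create a fresh event loop and install it as the current one."""
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    return loop


def teardown_test_loop(loop: asyncio.AbstractEventLoop, fast: bool = False) -> None:
    """Stop, close and (optionally) garbage-collect a loop made by setup_test_loop."""
    if not loop.is_closed():
        # Schedule the stop, then spin once so pending callbacks drain.
        loop.call_soon(loop.stop)
        loop.run_forever()
        loop.close()
    if not fast:
        gc.collect()
    asyncio.set_event_loop(None)
```

The `fast` flag mirrors the `--aiohttp-fast` option above: skipping the `gc.collect()` speeds up large test runs at the cost of delayed resource cleanup.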
Creates a mocked web.Request for testing purposes. Useful in unit tests, when spinning up a full web server is overkill or specific conditions and errors are hard to trigger.
def make_mocked_request( method: str, path: str, headers: Any = None, *, match_info: Any = sentinel, version: HttpVersion = HttpVersion(1, 1), closing: bool = False, app: Any = None, writer: Any = sentinel, protocol: Any = sentinel, transport: Any = sentinel, payload: Any...
Creates a coroutine mock.
def make_mocked_coro( return_value: Any = sentinel, raise_exception: Any = sentinel ) -> Any: """Creates a coroutine mock.""" async def mock_coro(*args: Any, **kwargs: Any) -> Any: if raise_exception is not sentinel: raise raise_exception if not inspect.isawaitable(return_value...
Run an app locally
def run_app( app: Union[Application, Awaitable[Application]], *, debug: bool = False, host: Optional[Union[str, HostSequence]] = None, port: Optional[int] = None, path: Union[PathLike, TypingIterable[PathLike], None] = None, sock: Optional[Union[socket.socket, TypingIterable[socket.socket]]]...
Factory for producing a middleware that normalizes the path of a request. Normalizing means: - Add or remove a trailing slash to the path. - Double slashes are replaced by one. The middleware returns as soon as it finds a path that resolves correctly. The order if both merge and append/remove are enabled is ...
def normalize_path_middleware( *, append_slash: bool = True, remove_slash: bool = False, merge_slashes: bool = True, redirect_class: Type[HTTPMove] = HTTPPermanentRedirect, ) -> Middleware: """Factory for producing a middleware that normalizes the path of a request. Normalizing means: ...
Prepare :file:`.netrc` with given contents. Monkey-patches :envvar:`NETRC` to point to created file.
def netrc_contents( tmp_path: Path, monkeypatch: pytest.MonkeyPatch, request: pytest.FixtureRequest, ): """ Prepare :file:`.netrc` with given contents. Monkey-patches :envvar:`NETRC` to point to created file. """ netrc_contents = getattr(request, "param", None) netrc_file_path = tm...
Find all importables in the project. Return them in order.
def _find_all_importables(pkg: ModuleType) -> List[str]: """Find all importables in the project. Return them in order. """ return sorted( set( chain.from_iterable( _discover_path_importables(Path(p), pkg.__name__) for p in pkg.__path__ ), ), )
Yield all importables under a given path and package.
def _discover_path_importables( pkg_pth: Path, pkg_name: str, ) -> Generator[str, None, None]: """Yield all importables under a given path and package.""" for dir_path, _d, file_names in os.walk(pkg_pth): pkg_dir_path = Path(dir_path) if pkg_dir_path.parts[-1] == "__pycache__": ...
Verify that exploding importables doesn't explode. This is seeking for any import errors including ones caused by circular imports.
def test_no_warnings(import_path: str) -> None: """Verify that exploding importables doesn't explode. This is seeking for any import errors including ones caused by circular imports. """ imp_cmd = ( # fmt: off sys.executable, "-W", "error", # The following deprecatio...
Test appropriate Authorization header is sent when netrc is not empty.
def test_basicauth_from_netrc_present( make_request: Any, expected_auth: Optional[helpers.BasicAuth], ): """Test appropriate Authorization header is sent when netrc is not empty.""" req = make_request("get", "http://example.com", trust_env=True) assert req.headers[hdrs.AUTHORIZATION] == expected_aut...
Test no authorization header is sent via netrc if trust_env is False
def test_basicauth_from_netrc_present_untrusted_env( make_request: Any, ): """Test no authorization header is sent via netrc if trust_env is False""" req = make_request("get", "http://example.com", trust_env=False) assert hdrs.AUTHORIZATION not in req.headers
Test that no Authorization header is sent when netrc is empty
def test_basicauth_from_empty_netrc( make_request: Any, ): """Test that no Authorization header is sent when netrc is empty""" req = make_request("get", "http://example.com", trust_env=True) assert hdrs.AUTHORIZATION not in req.headers
Create pickled data for test_pickle_format().
def dump_cookiejar() -> bytes: # pragma: no cover """Create pickled data for test_pickle_format().""" cj = CookieJar() cj.update_cookies(cookies_to_send.__pytest_wrapped__.obj()) return pickle.dumps(cj._cookies, pickle.HIGHEST_PROTOCOL)
Test if cookiejar pickle format breaks. If this test fails, it may indicate that saved cookiejars will stop working. If that happens then: 1. Avoid releasing the change in a bugfix release. 2. Try to include a migration script in the release notes (example below). 3. Use dump_cookiejar() at the top of this...
def test_pickle_format(cookies_to_send) -> None: """Test if cookiejar pickle format breaks. If this test fails, it may indicate that saved cookiejars will stop working. If that happens then: 1. Avoid releasing the change in a bugfix release. 2. Try to include a migration script in the relea...
Per RFC 2045, media type matching is case insensitive.
def test_is_expected_content_type_json_non_lowercase(): """Per RFC 2045, media type matching is case insensitive.""" expected_ct = "application/json" response_ct = "Application/JSON" assert is_expected_content_type( response_content_type=response_ct, expected_content_type=expected_ct )
Test that reading netrc files from env works as expected
def test_netrc_from_env(expected_username: str): """Test that reading netrc files from env works as expected""" netrc_obj = helpers.netrc_from_env() assert netrc_obj.authenticators("example.com")[0] == expected_username
Test that netrc file contents are properly parsed into BasicAuth tuples
def test_basicauth_present_in_netrc( expected_auth: helpers.BasicAuth, ): """Test that netrc file contents are properly parsed into BasicAuth tuples""" netrc_obj = helpers.netrc_from_env() assert expected_auth == helpers.basicauth_from_netrc(netrc_obj, "example.com")
Test that an error is raised if netrc doesn't have an entry for our host
def test_read_basicauth_from_empty_netrc(): """Test that an error is raised if netrc doesn't have an entry for our host""" netrc_obj = helpers.netrc_from_env() with pytest.raises( LookupError, match="No entry for example.com found in the `.netrc` file." ): helpers.basicauth_from_netrc(n...
Test that HEAD is the only method that unequivocally must have an empty body.
def test_method_must_be_empty_body():
    """Test that HEAD is the only method that unequivocally must have an empty body."""
    assert method_must_be_empty_body("HEAD") is True
    # CONNECT is only empty on a successful response
    assert method_must_be_empty_body("CONNECT") is False
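The HTTP rules behind this distinction can be sketched directly. `empty_response_body` is a hypothetical helper, not aiohttp's `must_be_empty_body`; it encodes the RFC 9112 message-body rules the test alludes to:

```python
# Sketch of when a response body is forbidden: always for HEAD,
# for a successful (2xx) CONNECT, and for 1xx/204/304 statuses.
# Hypothetical helper, not aiohttp's actual implementation.
def empty_response_body(method: str, status: int) -> bool:
    if method.upper() == "HEAD":
        return True  # HEAD never has a body, regardless of status
    if method.upper() == "CONNECT" and 200 <= status < 300:
        return True  # a 2xx CONNECT switches the connection to a tunnel
    return status < 200 or status in (204, 304)

print(empty_response_body("HEAD", 200))     # True
print(empty_response_body("CONNECT", 200))  # True
print(empty_response_body("CONNECT", 407))  # False
print(empty_response_body("GET", 304))      # True
```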
Test should_remove_content_length is always a subset of must_be_empty_body.
def test_should_remove_content_length_is_subset_of_must_be_empty_body():
    """Test should_remove_content_length is always a subset of must_be_empty_body."""
    assert should_remove_content_length("GET", 101) is True
    assert must_be_empty_body("GET", 101) is True
    assert should_remove_content_length("GET", 102...
Test that invalid chunked encoding doesn't allow content-length to be used.
def test_bad_chunked_py(loop: Any, protocol: Any) -> None:
    """Test that invalid chunked encoding doesn't allow content-length to be used."""
    parser = HttpRequestParserPy(
        protocol,
        loop,
        2**16,
        max_line_size=8190,
        max_field_size=8190,
    )
    text = (
        b"GET / HT...
C parser behaves differently. Maybe we should align them later.
def test_bad_chunked_c(loop: Any, protocol: Any) -> None:
    """C parser behaves differently. Maybe we should align them later."""
    parser = HttpRequestParserC(
        protocol,
        loop,
        2**16,
        max_line_size=8190,
        max_field_size=8190,
    )
    text = (
        b"GET / HTTP/1.1\r\nHost...
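The danger these two tests guard against is that Python's `int()` accepts more than the HTTP chunk-size grammar allows (underscores, surrounding whitespace), so a lax parser and a strict one can disagree on where the body ends, which is a request-smuggling vector. A sketch, where `strict_chunk_size` is a hypothetical strict parser, not aiohttp's:

```python
# Python's int() happily parses b"0_2e" as hex 0x2e == 46, even though
# an underscore is illegal in an HTTP chunk-size token.
def strict_chunk_size(token: bytes) -> int:
    # Accept only raw hex digits, as the chunked grammar requires.
    if not token or any(c not in b"0123456789abcdefABCDEF" for c in token):
        raise ValueError(f"invalid chunk size: {token!r}")
    return int(token, 16)

print(int(b"0_2e", 16))          # 46 -- the lax parse silently accepts it
print(strict_chunk_size(b"2e"))  # 46
# strict_chunk_size(b"0_2e")     # raises ValueError
```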
Test not upgraded if missing Upgrade header.
def test_bad_upgrade(parser: Any) -> None:
    """Test not upgraded if missing Upgrade header."""
    text = b"GET /test HTTP/1.1\r\nconnection: upgrade\r\n\r\n"
    messages, upgrade, tail = parser.feed_data(text)
    msg = messages[0][0]
    assert not msg.upgrade
    assert not upgrade
Still a lot of dodgy servers send bad responses like this.
def test_http_response_parser_bad_crlf(response: Any) -> None:
    """Still a lot of dodgy servers send bad responses like this."""
    messages, upgrade, tail = response.feed_data(
        b"HTTP/1.0 200 OK\nFoo: abc\nBar: def\n\nBODY\n"
    )
    msg = messages[0][0]
    assert msg.headers["Foo"] == "abc"
    asse...
See https://github.com/aio-libs/aiohttp/issues/6197
def test___all__(pytester: pytest.Pytester) -> None:
    """See https://github.com/aio-libs/aiohttp/issues/6197"""
    pytester.makepyfile(
        test_a="""
        from aiohttp import *
        assert 'GunicornWebWorker' in globals()
        """
    )
    result = pytester.runpytest("-vv")
    result.assert_...
Check that importing aiohttp doesn't take too long. Obviously, the time may vary on different machines and may need to be adjusted from time to time, but this should provide an early warning if something is added that significantly increases import time.
def test_import_time(pytester: pytest.Pytester) -> None:
    """Check that importing aiohttp doesn't take too long.

    Obviously, the time may vary on different machines and may need to be
    adjusted from time to time, but this should provide an early warning if
    something is added that significantly increases i...
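One common way to make such a check reliable is to time the import in a fresh interpreter, so modules already cached in `sys.modules` don't skew the result. A minimal sketch of that approach (timing `json` here so the snippet runs anywhere; whether aiohttp's test does exactly this is not shown above):

```python
import subprocess
import sys
import time

# Time an import in a clean child interpreter rather than in-process,
# so previously imported modules can't hide the real cost.
start = time.monotonic()
subprocess.run([sys.executable, "-c", "import json"], check=True)
elapsed = time.monotonic() - start
print(f"import took {elapsed:.3f}s")
```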
Return the URL of an instance of a running secure proxy. This fixture also spawns that instance and tears it down after the test.
def secure_proxy_url(tls_certificate_pem_path):
    """Return the URL of an instance of a running secure proxy.

    This fixture also spawns that instance and tears it down after the test.
    """
    proxypy_args = [
        # --threadless does not work on windows, see
        # https://github.com/abhinavsingh/proxy....
Test that the logger does nothing when the log level is disabled.
def test_logger_does_nothing_when_disabled(caplog: pytest.LogCaptureFixture) -> None:
    """Test that the logger does nothing when the log level is disabled."""
    mock_logger = logging.getLogger("test.aiohttp.log")
    mock_logger.setLevel(logging.INFO)
    access_logger = AccessLogger(mock_logger, "%b")
    access_...
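The "does nothing" behavior rests on the standard-library level check: a logger consults `isEnabledFor()` before formatting a record, so messages below the effective level are dropped cheaply. A small stdlib-only illustration (the logger name here is arbitrary):

```python
import logging

# With the level set to INFO, DEBUG records are filtered out before
# any formatting work happens.
logger = logging.getLogger("demo.access")
logger.setLevel(logging.INFO)

print(logger.isEnabledFor(logging.DEBUG))  # False
print(logger.isEnabledFor(logging.INFO))   # True
```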
Create a temp path with hello.txt and compressed versions. The uncompressed text file path is returned by default. Alternatively, an indirect parameter can be passed with an encoding to get a compressed path.
def hello_txt(request, tmp_path_factory) -> pathlib.Path:
    """Create a temp path with hello.txt and compressed versions.

    The uncompressed text file path is returned by default. Alternatively,
    an indirect parameter can be passed with an encoding to get a compressed path.
    """
    txt = tmp_path_factory.mk...
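The fixture's setup can be sketched with the standard library: write the plain file, then store a pre-compressed sibling next to it, as static-file handlers that negotiate `Content-Encoding` expect. The directory layout and file contents here are illustrative, not the fixture's exact values:

```python
import gzip
import pathlib
import tempfile

# Build hello.txt plus a pre-compressed hello.txt.gz alongside it.
root = pathlib.Path(tempfile.mkdtemp())
txt = root / "hello.txt"
txt.write_text("Hello, world!\n")
(root / "hello.txt.gz").write_bytes(gzip.compress(txt.read_bytes()))

print(sorted(p.name for p in root.iterdir()))  # ['hello.txt', 'hello.txt.gz']
```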
Extract provider id from provider specification.

:param provider_spec: provider specification can be in the form of the "PROVIDER_ID" or
    "apache-airflow-providers-PROVIDER", optionally followed by ">=VERSION".
:return: short provider_id with `.` instead of `-` in case of `apache` and other providers with ...
def get_provider_id(provider_spec: str) -> str:
    """
    Extract provider id from provider specification.

    :param provider_spec: provider specification can be in the form of the "PROVIDER_ID" or
        "apache-airflow-providers-PROVIDER", optionally followed by ">=VERSION".
    :return: short provider_id wi...
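Given the two documented input shapes, the conversion can be sketched as below. `provider_id_from_spec` is a hypothetical reconstruction from the docstring, not Airflow's exact implementation:

```python
# Sketch: strip an optional ">=VERSION" suffix and the
# "apache-airflow-providers-" prefix, then map "-" to "." to get
# the short provider id described in the docstring above.
def provider_id_from_spec(provider_spec: str) -> str:
    spec = provider_spec.split(">=", 1)[0]
    prefix = "apache-airflow-providers-"
    if spec.startswith(prefix):
        spec = spec[len(prefix):]
    return spec.replace("-", ".")

print(provider_id_from_spec("apache-airflow-providers-apache-kafka>=1.0"))  # apache.kafka
print(provider_id_from_spec("google"))  # google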
Convert provider specification with provider_id to provider requirement.

The requirement can be used when constructing dependencies. It automatically adds pre-release specifier
in case we are building pre-release version of Airflow. This way we can handle the case when airflow
depends on specific version of the provid...
def get_provider_requirement(provider_spec: str) -> str:
    """
    Convert provider specification with provider_id to provider requirement.

    The requirement can be used when constructing dependencies. It automatically adds pre-release specifier
    in case we are building pre-release version of Airflow. This way ...
Convert provider specification to extra dependency.

:param provider_requirement: requirement of the provider in the form of apache-airflow-provider-*,
    optionally followed by >=VERSION.
:return: extra dependency in the form of apache-airflow[extra]
def convert_to_extra_dependency(provider_requirement: str) -> str:
    """
    Convert provider specification to extra dependency.

    :param provider_requirement: requirement of the provider in the form of apache-airflow-provider-*,
        optionally followed by >=VERSION.
    :return: extra dependency in the form o...
Produce the Python exclusion that should be used - converted from the list of python versions.

:param excluded_python_versions: list of python versions to exclude the dependency for.
:return: python version exclusion string that can be added to dependency in specification.
def get_python_exclusion(excluded_python_versions: list[str]):
    """
    Produce the Python exclusion that should be used - converted from the list of python versions.

    :param excluded_python_versions: list of python versions to exclude the dependency for.
    :return: python version exclusion string that can be ...
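Such an exclusion is typically expressed as a PEP 508 environment marker appended to the requirement string. A sketch of one plausible encoding; the exact string Airflow emits is not shown above, so this is an assumption:

```python
# Sketch: build a ";python_version != '...'" PEP 508 marker from a
# list of excluded versions. Hypothetical format, for illustration.
def python_exclusion_marker(excluded: list[str]) -> str:
    if not excluded:
        return ""
    clauses = " and ".join(f"python_version != '{v}'" for v in excluded)
    return f";{clauses}"

print(python_exclusion_marker(["3.9", "3.10"]))
# ;python_version != '3.9' and python_version != '3.10'
```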
Whether the dependency should be skipped for an editable build for the current python version.

:param excluded_python_versions: list of excluded python versions.
:return: True if the dependency should be skipped for an editable build for the current python version.
def skip_for_editable_build(excluded_python_versions: list[str]) -> bool:
    """
    Whether the dependency should be skipped for an editable build for the current python version.

    :param excluded_python_versions: list of excluded python versions.
    :return: True if the dependency should be skipped for an editable build f...
Expand (potentially nested) env vars.

Repeat and apply `expandvars` and `expanduser` until interpolation stops having any effect.
def expand_env_var(env_var: str | None) -> str | None:
    """
    Expand (potentially nested) env vars.

    Repeat and apply `expandvars` and `expanduser` until
    interpolation stops having any effect.
    """
    if not env_var:
        return env_var
    while True:
        interpolated = os.path.expanduser(os.pa...
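The fixpoint loop above can be reproduced in full with the standard library; the nested-variable example values below are made up for the demonstration:

```python
import os

# Self-contained version of the fixpoint loop: keep applying
# expanduser + expandvars until the string stops changing.
def expand_env_var(env_var):
    if not env_var:
        return env_var
    while True:
        interpolated = os.path.expanduser(os.path.expandvars(env_var))
        if interpolated == env_var:
            return interpolated
        env_var = interpolated

# Nested expansion: resolving $OUTER yields another variable reference,
# which the second pass resolves in turn.
os.environ["INNER"] = "world"
os.environ["OUTER"] = "hello $INNER"
print(expand_env_var("$OUTER"))  # hello world
```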