Declare the view as a JSON API method
This converts the view's return value into a :cls:`JsonResponse`.
The following return types are supported:
- tuple: a tuple of (response, status, headers)
- any other object is converted to JSON
def jsonapi(f):
""" Declare the view as a JSON ... |
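The decorator described above can be sketched without any web framework. This is a hypothetical, framework-agnostic version (the real one presumably wraps a Flask `JsonResponse`); the function and variable names here are my own:

```python
import json
from functools import wraps


def jsonapi(f):
    # Minimal sketch: serialize the view's return value to JSON.
    # A (response, status, headers) tuple is unpacked; anything else
    # is encoded directly with default status 200.
    @wraps(f)
    def wrapper(*args, **kwargs):
        rv = f(*args, **kwargs)
        status, headers = 200, {}
        if isinstance(rv, tuple):
            rv, status, headers = rv
        headers = dict(headers or {})
        headers.setdefault("Content-Type", "application/json")
        return json.dumps(rv), status, headers
    return wrapper
```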
Download + unpack given package into temp dir ``tmp``.
Return ``(real_version, source)`` where ``real_version`` is the "actual"
version downloaded (e.g. if a Git master was indicated, it will be the SHA
of master HEAD) and ``source`` is the source directory (relative to
unpacked source) to import into ... |
Vendorize Python package ``distribution`` at version/SHA ``version``.
Specify the vendor folder (e.g. ``<mypackage>/vendor``) as ``vendor_dir``.
For Crate/PyPI releases, ``package`` should be the name of the software
entry on those sites, and ``version`` should be a specific version number.
E.g. ``ven... |
Create a passworded sudo-capable user.
Used by other tasks to execute the test suite so sudo tests work.
def make_sudouser(c):
"""
Create a passworded sudo-capable user.
Used by other tasks to execute the test suite so sudo tests work.
"""
user = c.travis.sudo.user
password = c.travis.sud... |
Set up passwordless SSH keypair & authorized_hosts access to localhost.
def make_sshable(c):
"""
Set up passwordless SSH keypair & authorized_hosts access to localhost.
"""
user = c.travis.sudo.user
home = "~{0}".format(user)
# Run sudo() as the new sudo user; means less chown'ing, etc.
c.c... |
Run some command under Travis-oriented sudo subshell/virtualenv.
:param str command:
Command string to run, e.g. ``inv coverage``, ``inv integration``, etc.
(Does not necessarily need to be an Invoke task, but...)
def sudo_run(c, command):
"""
Run some command under Travis-oriented sudo su... |
Install and execute ``black`` under appropriate circumstances, with diffs.
Installs and runs ``black`` under Python 3.6 (the first version it
supports). Since this sort of CI based task only needs to run once per
commit (formatting is not going to change between interpreters) this seems
like a worthwhi... |
List based implementation of binary tree algorithm for concordance
measure after :cite:`Christensen2005`.
def _calc(self, x, y):
"""
List based implementation of binary tree algorithm for concordance
measure after :cite:`Christensen2005`.
"""
x = np.array(x)
y =... |
Wrapper function to decorate a function
def decorator(self, func):
""" Wrapper function to decorate a function """
if inspect.isfunction(func):
func._methodview = self
elif inspect.ismethod(func):
func.__func__._methodview = self
else:
raise Assertion... |
Test if the method matches the provided set of arguments
:param verb: HTTP verb. Uppercase
:type verb: str
:param params: Existing route parameters
:type params: set
:returns: Whether this view matches
:rtype: bool
def matches(self, verb, params):
""" Test if th... |
Detect a view matching the query
:param method: HTTP method
:param route_params: Route parameters dict
:return: Method
:rtype: Callable|None
def _match_view(self, method, route_params):
""" Detect a view matching the query
:param method: HTTP method
:param rout... |
Register the view with a URL route
:param app: Flask application
:type app: flask.Flask|flask.Blueprint
:param name: Unique view name
:type name: str
:param rules: List of route rules to use
:type rules: Iterable[str|werkzeug.routing.Rule]
:param class_args: Args... |
Calculates the steady state probability vector for a regular Markov
transition matrix P.
Parameters
----------
P : array
(k, k), an ergodic Markov transition probability matrix.
Returns
-------
: array
(k, ), steady state distribution.
Exa... |
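The steady-state vector described above is the left eigenvector of P for eigenvalue 1, normalized to sum to one. A minimal NumPy sketch (the eigen-decomposition approach and function name are mine, not necessarily the library's implementation):

```python
import numpy as np


def steady_state(P):
    """Steady-state distribution of an ergodic transition matrix (a sketch)."""
    # Left eigenvector of P <=> right eigenvector of P.T
    vals, vecs = np.linalg.eig(P.T)
    i = np.argmin(np.abs(vals - 1.0))
    pi = np.real(vecs[:, i])
    return pi / pi.sum()  # normalize (also fixes the eigenvector's sign)
```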
Calculates the matrix of first mean passage times for an ergodic transition
probability matrix.
Parameters
----------
P : array
(k, k), an ergodic Markov transition probability matrix.
Returns
-------
M : array
(k, k), elements are the expected value for the num... |
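One standard way to compute first mean passage times is the Kemeny–Snell fundamental-matrix formula, m_ij = (z_jj - z_ij) / pi_j with diagonal entries equal to the mean recurrence time 1/pi_i. A sketch under that assumption (not necessarily the library's exact implementation):

```python
import numpy as np


def first_mean_passage_times(P):
    """First mean passage times via the fundamental matrix (a sketch)."""
    k = P.shape[0]
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    pi = pi / pi.sum()
    A = np.tile(pi, (k, 1))               # every row is the steady state
    Z = np.linalg.inv(np.eye(k) - P + A)  # fundamental matrix
    M = np.empty((k, k))
    for i in range(k):
        for j in range(k):
            if i == j:
                M[i, j] = 1.0 / pi[i]     # mean recurrence time
            else:
                M[i, j] = (Z[j, j] - Z[i, j]) / pi[j]
    return M
```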
Variances of first mean passage times for an ergodic transition
probability matrix.
Parameters
----------
P : array
(k, k), an ergodic Markov transition probability matrix.
Returns
-------
: array
(k, k), elements are the variances for the number of in... |
Examine world state, returning data on what needs updating for release.
:param c: Invoke ``Context`` object or subclass.
:returns:
Two dicts (technically, dict subclasses, which allow attribute access),
``actions`` and ``state`` (in that order.)
``actions`` maps release component name... |
Print current release (version, changelog, tag, etc) status.
Doubles as a subroutine, returning the return values from its inner call to
``_converge`` (an ``(actions, state)`` two-tuple of Lexicons).
def status(c):
"""
Print current release (version, changelog, tag, etc) status.
Doubles as a subr... |
Edit changelog & version, git commit, and git tag, to set up for release.
def prepare(c):
"""
Edit changelog & version, git commit, and git tag, to set up for release.
"""
# Print dry-run/status/actions-to-take data & grab programmatic result
# TODO: maybe expand the enum-based stuff to have values... |
Examine current repo state to determine what type of release to prep.
:returns:
A two-tuple of ``(branch-name, line-type)`` where:
- ``branch-name`` is the current branch name, e.g. ``1.1``, ``master``,
``gobbledygook`` (or, usually, ``HEAD`` if not on a branch).
- ``line-type`` ... |
Return all released versions from given ``changelog``, sorted.
:param dict changelog:
A changelog dict as returned by ``releases.util.parse_changelog``.
:returns: A sorted list of `semantic_version.Version` objects.
def _versions_from_changelog(changelog):
"""
Return all released versions fro... |
Return most recent branch-appropriate release, if any, and its contents.
:param dict changelog:
Changelog contents, as returned by ``releases.util.parse_changelog``.
:param str branch:
Branch name.
:param release_type:
Member of `Release`, e.g. `Release.FEATURE`.
:returns:
... |
Return sorted list of release-style tags as semver objects.
def _get_tags(c):
"""
Return sorted list of release-style tags as semver objects.
"""
tags_ = []
for tagstr in c.run("git tag", hide=True).stdout.strip().split("\n"):
try:
tags_.append(Version(tagstr))
# Ignore ... |
Try to find 'the' One True Package for this project.
Mostly for obtaining the ``_version`` file within it.
Uses the ``packaging.package`` config setting if defined. If not defined,
fallback is to look for a single top-level Python package (directory
containing ``__init__.py``). (This search ignores a ... |
Build sdist and/or wheel archives, optionally in a temp base directory.
All parameters save ``directory`` honor config settings of the same name,
under the ``packaging`` tree. E.g. say ``.configure({'packaging': {'wheel':
True}})`` to force building wheel archives by default.
:param bool sdist:
... |
Publish code to PyPI or index of choice.
All parameters save ``dry_run`` and ``directory`` honor config settings of
the same name, under the ``packaging`` tree. E.g. say
``.configure({'packaging': {'wheel': True}})`` to force building wheel
archives by default.
:param bool sdist:
Whether t... |
Upload (potentially also signing) all artifacts in ``directory``.
:param str index:
Custom upload index/repository name.
By default, uses whatever the invoked ``pip`` is configured to use.
Modify your ``pypirc`` file to add new named repositories.
:param bool sign:
Whether to ... |
Context-manage a temporary directory.
Can be given ``skip_cleanup`` to skip cleanup, and ``explicit`` to choose a
specific location.
(If both are given, this is basically not doing anything, but it allows
code that normally requires a secure temporary directory to 'dry run'
instead.)
def tmpdir(s... |
Generate random spatial permutations for inference on LISA vectors.
Parameters
----------
permutations : int, optional
Number of random permutations of observations.
alternative : string, optional
Type of alternative to form in generating p-values.
Op... |
Plot the rose diagram.
Parameters
----------
attribute : (n,) ndarray, optional
Variable to specify colors of the colorbars.
ax : Matplotlib Axes instance, optional
If given, the figure will be created inside this axis.
Default =None. Note, this axis ... |
Plot vectors of positional transition of LISA values starting
from the same origin.
def plot_origin(self): # TODO add attribute option to color vectors
"""
Plot vectors of positional transition of LISA values starting
from the same origin.
"""
import matplotlib.cm as cm... |
Plot vectors of positional transition of LISA values
within quadrant in scatterplot in a polar plot.
Parameters
----------
ax : Matplotlib Axes instance, optional
If given, the figure will be created inside this axis.
Default =None.
arrows : boolean, opti... |
Nuke docs build target directory so next build is clean.
def _clean(c):
"""
Nuke docs build target directory so next build is clean.
"""
if isdir(c.sphinx.target):
rmtree(c.sphinx.target) |
Open build target's index.html in a browser (using 'open').
def _browse(c):
"""
Open build target's index.html in a browser (using 'open').
"""
index = join(c.sphinx.target, c.sphinx.target_file)
c.run("open {0}".format(index)) |
Build the project's Sphinx docs.
def build(
c,
clean=False,
browse=False,
nitpick=False,
opts=None,
source=None,
target=None,
):
"""
Build the project's Sphinx docs.
"""
if clean:
_clean(c)
if opts is None:
opts = ""
if nitpick:
opts += " -n -... |
Display documentation contents with the 'tree' program.
def tree(c):
"""
Display documentation contents with the 'tree' program.
"""
ignore = ".git|*.pyc|*.swp|dist|*.egg-info|_static|_build|_templates"
c.run('tree -Ca -I "{0}" {1}'.format(ignore, c.sphinx.source)) |
Build both doc sites w/ maxed nitpicking.
def sites(c):
"""
Build both doc sites w/ maxed nitpicking.
"""
# TODO: This is super lolzy but we haven't actually tackled nontrivial
# in-Python task calling yet, so we do this to get a copy of 'our' context,
# which has been updated with the per-coll... |
Watch both doc trees & rebuild them if files change.
This includes e.g. rebuilding the API docs if the source code changes;
rebuilding the WWW docs if the README changes; etc.
Reuses the configuration values ``packaging.package`` or ``tests.package``
(the former winning over the latter if both defined... |
Random permutation of rows and columns of a matrix
Parameters
----------
X : array
(k, k), array to be permuted.
ids : array
range (k, ).
Returns
-------
X : array
(k, k) with rows and columns randomly shuffled.
Examples
--------
>>> import ... |
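The key detail above is that one and the same permutation is applied to both rows and columns, which keeps the matrix's relational structure intact. A sketch (the helper name and `rng` parameter are mine):

```python
import numpy as np


def shuffle_matrix(X, ids=None, rng=None):
    """Apply a single random permutation to both rows and columns (a sketch)."""
    rng = np.random.default_rng(rng)
    if ids is None:
        ids = rng.permutation(X.shape[0])
    # np.ix_ builds the cross-product index so X[i, j] -> X[ids[i], ids[j]]
    return X[np.ix_(ids, ids)]
```

Because rows and columns move together, the multiset of diagonal entries is preserved.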
Flattens the lower part of an n x n matrix into an n*(n-1)/2 x 1 vector.
Parameters
----------
matrix : array
(n, n) numpy array, a distance matrix.
Returns
-------
lowvec : array
numpy array, the lower half of the distance matrix flattened into
a ve... |
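For a symmetric distance matrix this flattening is exactly the strict lower triangle, which NumPy exposes directly. A minimal sketch (function name is mine):

```python
import numpy as np


def flatten_lower(matrix):
    """Strict lower-triangle entries as an n*(n-1)/2 vector (a sketch)."""
    n = matrix.shape[0]
    rows, cols = np.tril_indices(n, k=-1)  # k=-1 excludes the diagonal
    return matrix[rows, cols]
```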
r"""
Run black on the current source tree (all ``.py`` files).
.. warning::
``black`` only runs on Python 3.6 or above. (However, it can be
executed against Python 2 compatible code.)
:param int line_length:
Line length argument. Default: ``79``.
:param list folders:
Li... |
Normalize the response value into a 3-tuple (rv, status, headers)
:type rv: tuple|*
:returns: tuple(rv, status, headers)
:rtype: tuple(Response|JsonResponse|*, int|None, dict|None)
def normalize_response_value(rv):
""" Normalize the response value into a 3-tuple (rv, status, headers)
... |
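The normalization described above can be sketched as padding whatever the view returned out to a fixed 3-tuple (assuming tuples shorter than three elements are padded with `None`; the real code may differ):

```python
def normalize_response_value(rv):
    """Normalize a view return value to (rv, status, headers); a sketch."""
    status = headers = None
    if isinstance(rv, tuple):
        # Pad short tuples so unpacking always succeeds.
        rv, status, headers = rv + (None,) * (3 - len(rv))
    return rv, status, headers
```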
Make JsonResponse
:param rv: Response: the object to encode, or tuple (response, status, headers)
:type rv: tuple|*
:rtype: JsonResponse
def make_json_response(rv):
""" Make JsonResponse
:param rv: Response: the object to encode, or tuple (response, status, headers)
:type rv: tuple|*
:rtype... |
Markov-based mobility index.
Parameters
----------
p : array
(k, k), Markov transition probability matrix.
measure : string
If measure= "P",
:math:`M_{P} = \\frac{m-\sum_{i=1}^m P_{ii}}{m-1}`;
if measure = "D",
:math:`M_{D} = 1... |
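The "P" measure above is a plain trace statistic: an identity matrix (no mobility) scores 0, while maximal off-diagonal movement pushes the score toward m/(m-1). A sketch of just that branch:

```python
import numpy as np


def markov_mobility_P(p):
    """Trace-based mobility measure M_P = (m - trace(P)) / (m - 1); a sketch."""
    m = p.shape[0]
    return (m - np.trace(p)) / (m - 1)
```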
chi-squared test of difference between two transition matrices.
Parameters
----------
T1 : array
(k, k), matrix of transitions (counts).
T2 : array
(k, k), matrix of transitions (counts) to use to form the
probabilities under the null.
Returns
-------
... |
Kullback information based test of Markov Homogeneity.
Parameters
----------
F : array
(s, r, r), values are transitions (not probabilities) for
s strata, r initial states, r terminal states.
Returns
-------
Results : dictionary
(key - value)
Condit... |
Prais conditional mobility measure.
Parameters
----------
pmat : matrix
(k, k), Markov probability transition matrix.
Returns
-------
pr : matrix
(1, k), conditional mobility measures for each of the k classes.
Notes
-----
Prais' conditional mobility measur... |
Test for homogeneity of Markov transition probabilities across regimes.
Parameters
----------
transition_matrices : list
of transition matrices for regimes, all matrices must
have same size (r, c). r is the number of rows in the
... |
Calculate sojourn time based on a given transition probability matrix.
Parameters
----------
p : array
(k, k), a Markov transition probability matrix.
Returns
-------
: array
(k, ), sojourn times. Each element is the expected time a Markov
... |
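Since the time spent consecutively in state i is geometric with continuation probability p_ii, the expected sojourn time is 1/(1 - p_ii). A one-line NumPy sketch:

```python
import numpy as np


def sojourn_time(p):
    """Expected consecutive time in each state: 1 / (1 - p_ii); a sketch."""
    pii = np.diag(np.asarray(p, dtype=float))
    with np.errstate(divide="ignore"):  # absorbing states (p_ii == 1) -> inf
        return 1.0 / (1.0 - pii)
```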
Helper to estimate spatial lag conditioned Markov transition
probability matrices based on maximum likelihood techniques.
def _calc(self, y, w):
'''Helper to estimate spatial lag conditioned Markov transition
probability matrices based on maximum likelihood techniques.
'''
if s... |
A summary method to call the Markov homogeneity test to test for
temporally lagged spatial dependence.
To learn more about the properties of the tests, refer to
:cite:`Rey2016a` and :cite:`Kang2018`.
def summary(self, file_name=None):
"""
A summary method to call the Markov hom... |
Helper method for classifying continuous data.
def _maybe_classify(self, y, k, cutoffs):
'''Helper method for classifying continuous data.
'''
rows, cols = y.shape
if cutoffs is None:
if self.fixed:
mcyb = mc.Quantiles(y.flatten(), k=k)
yb =... |
Detect spillover locations for diffusion in LISA Markov.
Parameters
----------
quadrant : int
which quadrant in the scatterplot should form the core
of a cluster.
neighbors_on : binary
If false, then only the 1st o... |
Get entity property names
:param entity: Entity
:type entity: sqlalchemy.ext.declarative.api.DeclarativeMeta
:returns: Set of entity property names
:rtype: set
def get_entity_propnames(entity):
""" Get entity property names
:param entity: Entity
:type entity: sqlal... |
Get entity property names that are loaded (e.g. won't produce new queries)
:param entity: Entity
:type entity: sqlalchemy.ext.declarative.api.DeclarativeMeta
:returns: Set of entity property names
:rtype: set
def get_entity_loaded_propnames(entity):
""" Get entity property names th... |
Return a Version whose minor number is one greater than self's.
.. note::
The new Version will always have a zeroed-out bugfix/tertiary version
number, because the "next minor release" of e.g. 1.2.1 is 1.3.0, not
1.3.1.
def next_minor(self):
"""
Return a Version whose minor number ... |
Check that ``encoding`` is a valid Python encoding
:param name: name under which the encoding is known to the user, e.g. 'default encoding'
:param encoding_to_check: name of the encoding to check, e.g. 'utf-8'
:param source: source where the encoding has been set, e.g. option name
:raise pygount.common.... |
Generator function to yield lines (delimited with ``'\n'``) stored in
``text``. This is useful when a regular expression should only match on a
per line basis in a memory efficient way.
def lines(text):
"""
Generator function to yield lines (delimited with ``'\n'``) stored in
``text``. This is usef... |
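The memory-efficiency point above is that the generator walks the string with `str.find` instead of materializing `text.split("\n")`. A plausible sketch:

```python
def lines(text):
    """Yield newline-delimited lines from ``text`` lazily (a sketch)."""
    start = 0
    while True:
        newline = text.find("\n", start)
        if newline == -1:
            break
        yield text[start:newline]
        start = newline + 1
    if start < len(text):
        yield text[start:]  # final line without a trailing newline
```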
The first line and its number (starting with 0) in the source code that
indicated that the source code is generated.
:param source_lines: lines of text to scan
:param generated_regexes: regular expressions a line must match to indicate
the source code is generated.
:param max_line_count: maximum... |
Similar to tokens but converts strings after a colon (:) to comments.
def _pythonized_comments(tokens):
"""
Similar to tokens but converts strings after a colon (:) to comments.
"""
is_after_colon = True
for token_type, token_text in tokens:
if is_after_colon and (token_type in pygments.tok... |
The encoding used by the text file stored in ``source_path``.
The algorithm used is:
* If ``encoding`` is ``'automatic'``, attempt the following:
1. Check BOM for UTF-8, UTF-16 and UTF-32.
2. Look for XML prolog or magic heading like ``# -*- coding: cp1252 -*-``
3. Read the file using UTF-8.
... |
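Step 1 of the algorithm (BOM detection) can be sketched with the standard library's BOM constants. Note the check order: the UTF-32-LE BOM begins with the UTF-16-LE BOM, so the longer prefixes must be tried first. The function name and return conventions here are my own:

```python
import codecs

# Longest BOMs first: BOM_UTF32_LE starts with BOM_UTF16_LE.
_BOMS = [
    (codecs.BOM_UTF32_LE, "utf-32"),
    (codecs.BOM_UTF32_BE, "utf-32"),
    (codecs.BOM_UTF8, "utf-8-sig"),
    (codecs.BOM_UTF16_LE, "utf-16"),
    (codecs.BOM_UTF16_BE, "utf-16"),
]


def encoding_from_bom(prefix, default=None):
    """Detect an encoding from a file's leading bytes via its BOM (a sketch)."""
    for bom, name in _BOMS:
        if prefix.startswith(bom):
            return name
    return default
```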
Initial quick check if there is a lexer for ``source_path``. This removes
the need for calling :py:func:`pygments.lexers.guess_lexer_for_filename()`
which fully reads the source file.
def has_lexer(source_path):
"""
Initial quick check if there is a lexer for ``source_path``. This removes
the need ... |
Analysis for line counts in source code stored in ``source_path``.
:param source_path:
:param group: name of a logical group the source code belongs to, e.g. a
package.
:param encoding: encoding according to :func:`encoding_for`
:param fallback_encoding: fallback encoding according to
:func:... |
replace all masked values
calculate flatField from a 2d-polynomial fit, filling
all high-gradient areas within the averaged fit-image
returns flatField, average background level, fitted image, valid indices mask
def polynomial(img, mask, inplace=False, replace_all=False,
max_dev=1e-5, max_iter... |
TODO
def errorDist(scale, measExpTime, n_events_in_expTime,
event_duration, std,
points_per_time=100, n_repetitions=300):
'''
TODO
'''
ntimes = 10
s1 = measExpTime * scale * 10
# exp. time factor 1/16-->16:
p2 = np.logspace(-4, 4, 18, base=2)
t = ... |
std ... standard deviation of every signal
dur1...dur3 --> event duration per second
n1...n3 --> number of events per second
def exampleSignals(std=1, dur1=1, dur2=3, dur3=0.2,
n1=0.5, n2=0.5, n3=2):
'''
std ... standard deviation of every signal
dur1...dur3 --> even... |
returns Gaussian shaped signal fluctuations [events]
t --> times to calculate event for
n --> numbers of events per sec
duration --> event duration per sec
std --> std of event if averaged over time
offs --> event offset
def _flux(t, n, duration, std, offs=1):
'''
returns Gauss... |
capture signal and return its standard deviation
#TODO: more detail
def _capture(f, t, t0, factor):
'''
capture signal and return its standard deviation
#TODO: more detail
'''
n_per_sec = len(t) / t[-1]
# len of one split:
n = int(t0 * factor * n_per_sec)
s = len(f) // n
... |
Return a generic camera matrix
[[fx, 0, cx],
[ 0, fy, cy],
[ 0, 0, 1]]
for a given image shape
def genericCameraMatrix(shape, angularField=60):
'''
Return a generic camera matrix
[[fx, 0, cx],
[ 0, fy, cy],
[ 0, 0, 1]]
for a given image shape
'''
# http:/... |
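The matrix above follows the pinhole model: the focal length in pixels comes from the angular field of view via f = (w/2) / tan(fov/2), and the principal point (cx, cy) sits at the image centre. A sketch assuming the angular field is the horizontal one (the original may define it differently):

```python
import math

import numpy as np


def generic_camera_matrix(shape, angular_field=60):
    """Generic pinhole intrinsics for an image of the given shape (a sketch)."""
    height, width = shape[:2]
    # focal length in pixels from the (assumed horizontal) field of view
    f = (width / 2) / math.tan(math.radians(angular_field) / 2)
    return np.array([[f, 0, width / 2],
                     [0, f, height / 2],
                     [0, 0, 1.0]])
```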
calculate the spatially resolved standard deviation
for a given 2d array
ksize -> kernel size
blurred(optional) -> with same ksize gaussian filtered image
setting this parameter reduces processing time
def standardDeviation2d(img, ksize=5, blurred=None):
'''
... |
fn['mean', 'median']
fill_mask=True:
replace masked areas with filtered results
fill_mask=False:
masked areas are ignored
def maskedFilter(arr, mask, ksize=30, fill_mask=True,
fn='median'):
'''
fn['mean', 'median']
fill_mask=True:
replace masked... |
Extract vignetting from a set of images
containing different devices
The devices' spatial inhomogeneities are averaged.
This method is referred to as 'Method C' in
---
K.Bedrich, M.Bokalic et al.:
ELECTROLUMINESCENCE IMAGING OF PV DEVICES:
ADVANCED FLAT FIELD CALIBRATION,2017
---
... |
Calculate the averaged signal-to-noise ratio SNR50
as defined by IEC NP 60904-13
needs 2 reference EL images and one background image
def SNR_IEC(i1, i2, ibg=0, allow_color_images=False):
'''
Calculate the averaged signal-to-noise ratio SNR50
as defined by IEC NP 60904-13
needs 2 refe... |
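A common estimator for this two-image SNR takes the signal from the averaged (background-subtracted) images and the noise from their difference, whose standard deviation is sqrt(2) times the per-image noise. This is a hedged sketch of that generic estimator, not necessarily the exact IEC 60904-13 definition:

```python
import numpy as np


def snr_two_images(i1, i2, ibg=0):
    """SNR from two nominally identical images (a sketch, not the IEC text)."""
    i1 = np.asarray(i1, dtype=float)
    i2 = np.asarray(i2, dtype=float)
    signal = 0.5 * (i1 + i2) - ibg
    # Subtracting the images cancels the signal; the difference's std
    # is sqrt(2) times the single-image noise level.
    noise = np.std(i1 - i2) / np.sqrt(2)
    return np.mean(signal) / noise
```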
angle [DEG]
def _rotate(img, angle):
'''
angle [DEG]
'''
s = img.shape
if angle == 0:
return img
else:
M = cv2.getRotationMatrix2D((s[1] // 2,
s[0] // 2), angle, 1)
return cv2.warpAffi... |
return the offset (x, y) that best fits self._base_img,
found through template matching
def _findOverlap(self, img_rgb, overlap, overlapDeviation,
rotation, rotationDeviation):
'''
return the offset (x, y) that best fits self._base_img,
found through template matching
'''
... |
estimate the noise level function as stDev over image intensity
from a set of 2 image groups
images at the same position have to show
the identical setup, so
imgs1[i] - imgs2[i] = noise
def estimateFromImages(imgs1, imgs2=None, mn_mx=None, nbins=100):
'''
estimate the noise level functio... |
get the parameters needed by 'function'
through curve fitting
def _evaluate(x, y, weights):
'''
get the parameters needed by 'function'
through curve fitting
'''
i = _validI(x, y, weights)
xx = x[i]
y = y[i]
try:
fitParams = _fit(xx, y)
#... |
limit [function] to a minimum y value
def boundedFunction(x, minY, ax, ay):
'''
limit [function] to a minimum y value
'''
y = function(x, ax, ay)
return np.maximum(np.nan_to_num(y), minY) |
general square root function
def function(x, ax, ay):
'''
general square root function
'''
with np.errstate(invalid='ignore'):
return ay * (x - ax)**0.5 |
return indices that have enough data points and are not erroneous
def _validI(x, y, weights):
'''
return indices that have enough data points and are not erroneous
'''
# density filter:
i = np.logical_and(np.isfinite(y), weights > np.median(weights))
# filter outliers:
try:
... |
in case the NLF cannot be described by
a square root function,
fall back to bounded polynomial interpolation
def smooth(x, y, weights):
'''
in case the NLF cannot be described by
a square root function,
fall back to bounded polynomial interpolation
'''
# Spline hard to smooth properly, ther... |
Estimate the NLF from one or two images of the same kind
def oneImageNLF(img, img2=None, signal=None):
'''
Estimate the NLF from one or two images of the same kind
'''
x, y, weights, signal = calcNLF(img, img2, signal)
_, fn, _ = _evaluate(x, y, weights)
return fn, signal |
Get the range of image intensities
that most pixels fall within
def _getMinMax(img):
'''
Get the range of image intensities
that most pixels fall within
'''
av = np.mean(img)
std = np.std(img)
# define range for segmentation:
mn = av - 3 * std
mx = av + 3 * std
... |
Calculate the noise level function (NLF) as f(intensity)
using one or two images.
The approach for this work is published in JPV##########
img2 - 2nd image taken under same conditions
used to estimate noise via image difference
signalFromMultipleImages - whether the signal is an avera... |
fit unstructured data
def polyfit2d(x, y, z, order=3 #bounds=None
):
'''
fit unstructured data
'''
ncols = (order + 1)**2
G = np.zeros((x.size, ncols))
ij = itertools.product(list(range(order+1)), list(range(order+1)))
for k, (i,j) in enumerate(ij):
G[:,k] = ... |
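The snippet above is cut off mid-line, but the standard least-squares approach it sets up can be sketched in full: build a Vandermonde-style design matrix over all monomials x^i * y^j up to the given order, then solve with `lstsq`. This is a hedged reconstruction of the general technique, not necessarily the original file's exact code:

```python
import itertools

import numpy as np


def polyfit2d(x, y, z, order=3):
    """Least-squares fit of a 2-D polynomial to scattered points (a sketch)."""
    G = np.zeros((x.size, (order + 1) ** 2))
    for k, (i, j) in enumerate(itertools.product(range(order + 1), repeat=2)):
        G[:, k] = x ** i * y ** j  # one monomial per column
    coeffs, *_ = np.linalg.lstsq(G, z, rcond=None)
    return coeffs


def polyval2d(x, y, coeffs):
    """Evaluate the fitted polynomial at (x, y)."""
    order = int(round(np.sqrt(coeffs.size))) - 1
    z = np.zeros_like(np.asarray(x, dtype=float))
    for k, (i, j) in enumerate(itertools.product(range(order + 1), repeat=2)):
        z += coeffs[k] * x ** i * y ** j
    return z
```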
replace all masked values with polynomial fitted ones
def polyfit2dGrid(arr, mask=None, order=3, replace_all=False,
copy=True, outgrid=None):
'''
replace all masked values with polynomial fitted ones
'''
s0,s1 = arr.shape
if mask is None:
if outgrid is None:
... |
find closest minimum position next to middle line
relative: return position relative to middle line
f: relative decrease (0...1) - setting this value close to one will
discriminate positions further away from the center
##order: 2 for cubic refinement
def minimumLineInArray(arr, relative=False, ... |
remove all low frequencies by setting a square in the middle of the
Fourier transformation of the size (2*threshold)^2 to zero
threshold = 0...1
def highPassFilter(self, threshold):
'''
remove all low frequencies by setting a square in the middle of the
Fourier transformati... |
remove all high frequencies by setting everything outside a square in the middle
of the size (2*threshold)^2 to zero
threshold = 0...1
def lowPassFilter(self, threshold):
'''
remove all high frequencies by setting everything outside a square in the middle
of the size (2*threshold... |
do inverse Fourier transform and return result
def reconstructImage(self):
'''
do inverse Fourier transform and return result
'''
f_ishift = np.fft.ifftshift(self.fshift)
return np.real(np.fft.ifft2(f_ishift)) |
x,y,v --> 1d numpy.array
grid --> 2d numpy.array
fast if number of given values is small relative to grid resolution
def interpolate2dUnstructuredIDW(x, y, v, grid, power=2):
'''
x,y,v --> 1d numpy.array
grid --> 2d numpy.array
fast if number of given values is small relative to grid ... |
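The "fast when there are few sample points" claim above holds because inverse-distance weighting can loop over the samples (not the grid cells), accumulating weighted sums vectorized over the whole grid. A small sketch of the idea (parameter names and the `eps` guard against division by zero are mine):

```python
import numpy as np


def idw_interpolate(x, y, v, grid_shape, power=2, eps=1e-12):
    """Inverse-distance-weighted interpolation onto a regular grid (a sketch)."""
    gy, gx = np.mgrid[0:grid_shape[0], 0:grid_shape[1]]
    out = np.zeros(grid_shape, dtype=float)
    weights = np.zeros(grid_shape, dtype=float)
    # One vectorized pass per sample point: cheap when samples are few.
    for xi, yi, vi in zip(x, y, v):
        d2 = (gx - xi) ** 2 + (gy - yi) ** 2
        w = 1.0 / (d2 ** (power / 2) + eps)
        out += w * vi
        weights += w
    return out / weights
```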
returns the Histogram of Oriented Gradients
:param ksize: convolution kernel size as (y,x) - needs to be odd
:param orientations: number of orientations in between rad=0 and rad=pi
similar to http://scikit-image.org/docs/dev/auto_examples/plot_hog.html
but faster and with less options
def hog(i... |
visualize HOG as polynomial around cell center
for [grid] * cells
def visualize(hog, grid=(10, 10), radCircle=None):
'''
visualize HOG as polynomial around cell center
for [grid] * cells
'''
s0, s1, nang = hog.shape
angles = np.linspace(0, np.pi, nang + 1)[:-1]
# center ... |
Post process measured flat field [arr].
Depending on the measurement, different
post processing [method]s are beneficial.
The available methods are presented in
---
K.Bedrich, M.Bokalic et al.:
ELECTROLUMINESCENCE IMAGING OF PV DEVICES:
ADVANCED FLAT FIELD CALI... |
border [None], if images are corrected and device ends at
image border
[one number] (like 50),
if there is an equally spaced border
around the device
[two tu... |
#########
mask -- optional
def addImage(self, image, mask=None):
'''
#########
mask -- optional
'''
self._last_diff = diff = image - self.noSTE
ste = diff > self.threshold
removeSinglePixels(ste)
self.mask_clean = clean = ~ste
... |
return STE area - relative to image area
def relativeAreaSTE(self):
'''
return STE area - relative to image area
'''
s = self.noSTE.shape
return np.sum(self.mask_STE) / (s[0] * s[1]) |
return distribution of STE intensity
def intensityDistributionSTE(self, bins=10, range=None):
'''
return distribution of STE intensity
'''
v = np.abs(self._last_diff[self.mask_STE])
return np.histogram(v, bins, range) |
transform a float array to an unsigned integer array of a fitting dtype
adds an offset, to get rid of negative values
range = (min, max) - scale values between given range
cutNegative - all values <0 will be set to 0
cutHigh - set to False to rather scale values to fit
def toUIntArray(img, dtype=None... |
transform an unsigned integer array into a
float array of the right size
def toFloatArray(img):
'''
transform an unsigned integer array into a
float array of the right size
'''
_D = {1: np.float32, # uint8
2: np.float32, # uint16
4: np.float64, # uint32
... |
cast array to the next higher integer array
if dtype=unsigned integer
def toNoUintArray(arr):
'''
cast array to the next higher integer array
if dtype=unsigned integer
'''
d = arr.dtype
if d.kind == 'u':
arr = arr.astype({1: np.int16,
2: np.int32,
... |