librosa.util.softmask

librosa.util.softmask(X, X_ref, *, power=1, split_zeros=False)

Robustly compute a soft mask operation:

    M = X**power / (X**power + X_ref**power)
Parameters
- X : np.ndarray
  The (non-negative) input array corresponding to the positive mask elements.
- X_ref : np.ndarray
  The (non-negative) array of reference or background elements. Must have the same shape as X.
- power : number > 0 or np.inf
  If finite, returns the soft mask computed in a numerically stable way.
  If infinite, returns a hard (binary) mask equivalent to X > X_ref.
  Note: for hard masks, ties are always broken in favor of X_ref (mask=0).
- split_zeros : bool
  If True, entries where X and X_ref are both small (close to 0) will receive mask values of 0.5.
  Otherwise, the mask is set to 0 for these entries.
Returns
- mask : np.ndarray, shape=X.shape
  The output mask array.
Raises
- ParameterError
  If X and X_ref have different shapes.
  If X or X_ref are negative anywhere.
  If power <= 0.
Examples
>>> X = 2 * np.ones((3, 3))
>>> X_ref = np.vander(np.arange(3.0))
>>> X
array([[ 2.,  2.,  2.],
       [ 2.,  2.,  2.],
       [ 2.,  2.,  2.]])
>>> X_ref
array([[ 0.,  0.,  1.],
       [ 1.,  1.,  1.],
       [ 4.,  2.,  1.]])
>>> librosa.util.softmask(X, X_ref, power=1)
array([[ 1.   ,  1.   ,  0.667],
       [ 0.667,  0.667,  0.667],
       [ 0.333,  0.5  ,  0.667]])
>>> librosa.util.softmask(X_ref, X, power=1)
array([[ 0.   ,  0.   ,  0.333],
       [ 0.333,  0.333,  0.333],
       [ 0.667,  0.5  ,  0.333]])
>>> librosa.util.softmask(X, X_ref, power=2)
array([[ 1. ,  1. ,  0.8],
       [ 0.8,  0.8,  0.8],
       [ 0.2,  0.5,  0.8]])
>>> librosa.util.softmask(X, X_ref, power=4)
array([[ 1.   ,  1.   ,  0.941],
       [ 0.941,  0.941,  0.941],
       [ 0.059,  0.5  ,  0.941]])
>>> librosa.util.softmask(X, X_ref, power=100)
array([[  1.000e+00,   1.000e+00,   1.000e+00],
       [  1.000e+00,   1.000e+00,   1.000e+00],
       [  7.889e-31,   5.000e-01,   1.000e+00]])
>>> librosa.util.softmask(X, X_ref, power=np.inf)
array([[ True,  True,  True],
       [ True,  True,  True],
       [False, False,  True]], dtype=bool)
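The "numerically stable" computation mentioned above can be sketched roughly as follows. This is an illustrative re-implementation, not librosa's exact source: the helper name `softmask_sketch` is made up for this example, and it raises a plain `ValueError` where librosa would raise `ParameterError`. The key idea is to divide both inputs by their element-wise maximum before exponentiation, so the ratios stay in [0, 1] and large powers cannot overflow.

```python
import numpy as np

def softmask_sketch(X, X_ref, power=1, split_zeros=False):
    """Illustrative soft-mask sketch: M = X**power / (X**power + X_ref**power)."""
    if X.shape != X_ref.shape:
        raise ValueError("X and X_ref must have the same shape")
    if np.any(X < 0) or np.any(X_ref < 0):
        raise ValueError("X and X_ref must be non-negative")
    if power <= 0:
        raise ValueError("power must be strictly positive")

    # Element-wise normalizer; ratios X/Z and X_ref/Z lie in [0, 1],
    # so raising them to a large power cannot overflow.
    Z = np.maximum(X, X_ref).astype(float)
    bad = Z < np.finfo(Z.dtype).tiny  # entries where both inputs are ~0
    Z[bad] = 1  # avoid division by zero; these entries are patched below

    if np.isfinite(power):
        mask = (X / Z) ** power
        ref_mask = (X_ref / Z) ** power
        good = ~bad
        mask[good] /= mask[good] + ref_mask[good]
        # Degenerate entries: both inputs were effectively zero
        mask[bad] = 0.5 if split_zeros else 0.0
    else:
        # Hard (binary) mask; ties are broken in favor of X_ref (mask=0)
        mask = X > X_ref
    return mask
```

With the example arrays above, `softmask_sketch(X, X_ref, power=1)` reproduces the documented values (e.g. 2/(2+4) = 0.333 in the lower-left corner), and `power=np.inf` yields the boolean hard mask.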