librosa.util.sparsify_rows

librosa.util.sparsify_rows(x, quantile=0.01, dtype=None)

Return a row-sparse matrix approximating the input

Parameters:
x : np.ndarray [ndim <= 2]

The input matrix to sparsify.

quantile : float in [0, 1.0)

Fraction of the total magnitude to discard in each row of x (see the thresholding sketch below).

dtype : np.dtype, optional

The dtype of the output array. If not provided, then x.dtype will be used.

Returns:
x_sparse : scipy.sparse.csr_matrix [shape=x.shape]

Row-sparsified approximation of x

If x.ndim == 1, then x is interpreted as a row vector, and x_sparse.shape == (1, len(x)).

Raises:
ParameterError

If x.ndim > 2

If quantile lies outside [0, 1.0)
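
The quantile threshold is applied independently to each row: the smallest-magnitude entries whose cumulative share of the row's total magnitude stays below quantile are zeroed, and every entry at or above the resulting magnitude cutoff is retained. The following NumPy sketch is a hypothetical re-implementation for illustration only, not the library's source (the name sparsify_rows_sketch is invented here); it reproduces that rule and matches the behavior shown in the Examples:

import numpy as np
import scipy.sparse

def sparsify_rows_sketch(x, quantile=0.01, dtype=None):
    # Hypothetical re-implementation for illustration only.
    x = np.atleast_2d(x)
    if dtype is None:
        dtype = x.dtype

    x_sparse = scipy.sparse.lil_matrix(x.shape, dtype=dtype)

    mags = np.abs(x)
    # Total magnitude of each row (assumed nonzero here)
    norms = mags.sum(axis=1, keepdims=True)

    # Sort each row's magnitudes and accumulate their share of the row total
    mag_sort = np.sort(mags, axis=1)
    cumulative = np.cumsum(mag_sort / norms, axis=1)

    # First sorted position whose cumulative share reaches the quantile;
    # that entry's magnitude becomes the per-row cutoff
    cutoff_idx = np.argmax(cumulative >= quantile, axis=1)

    for i, j in enumerate(cutoff_idx):
        # Keep everything at or above the cutoff magnitude (ties included)
        keep = np.flatnonzero(mags[i] >= mag_sort[i, j])
        x_sparse[i, keep] = x[i, keep]

    return x_sparse.tocsr()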

Notes

This function caches at level 40.
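
Caching in librosa is controlled by environment variables read at import time; a level-40 function such as this one is only memoized when the configured cache level is at least 40. A minimal sketch, assuming the standard LIBROSA_CACHE_DIR and LIBROSA_CACHE_LEVEL variables:

import os

# Configure the cache before importing librosa; both variables are read at import time.
os.environ['LIBROSA_CACHE_DIR'] = '/tmp/librosa_cache'  # any writable directory
os.environ['LIBROSA_CACHE_LEVEL'] = '40'                 # enable caching of level-40 functions

import librosa

# Repeated calls with the same arguments can now be served from the cache.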

Examples

>>> # Construct a Hann window to sparsify
>>> import scipy.signal
>>> import librosa
>>> x = scipy.signal.hann(32)
>>> x
array([ 0.   ,  0.01 ,  0.041,  0.09 ,  0.156,  0.236,  0.326,
        0.424,  0.525,  0.625,  0.72 ,  0.806,  0.879,  0.937,
        0.977,  0.997,  0.997,  0.977,  0.937,  0.879,  0.806,
        0.72 ,  0.625,  0.525,  0.424,  0.326,  0.236,  0.156,
        0.09 ,  0.041,  0.01 ,  0.   ])
>>> # Discard the bottom percentile
>>> x_sparse = librosa.util.sparsify_rows(x, quantile=0.01)
>>> x_sparse
<1x32 sparse matrix of type '<type 'numpy.float64'>'
    with 26 stored elements in Compressed Sparse Row format>
>>> x_sparse.todense()
matrix([[ 0.   ,  0.   ,  0.   ,  0.09 ,  0.156,  0.236,  0.326,
          0.424,  0.525,  0.625,  0.72 ,  0.806,  0.879,  0.937,
          0.977,  0.997,  0.997,  0.977,  0.937,  0.879,  0.806,
          0.72 ,  0.625,  0.525,  0.424,  0.326,  0.236,  0.156,
          0.09 ,  0.   ,  0.   ,  0.   ]])
>>> # Discard up to the bottom 10th percentile
>>> x_sparse = librosa.util.sparsify_rows(x, quantile=0.1)
>>> x_sparse
<1x32 sparse matrix of type '<type 'numpy.float64'>'
    with 20 stored elements in Compressed Sparse Row format>
>>> x_sparse.todense()
matrix([[ 0.   ,  0.   ,  0.   ,  0.   ,  0.   ,  0.   ,  0.326,
          0.424,  0.525,  0.625,  0.72 ,  0.806,  0.879,  0.937,
          0.977,  0.997,  0.997,  0.977,  0.937,  0.879,  0.806,
          0.72 ,  0.625,  0.525,  0.424,  0.326,  0.   ,  0.   ,
          0.   ,  0.   ,  0.   ,  0.   ]])
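
The function also accepts 2-dimensional input, in which case each row is thresholded independently. The snippet below is a small illustrative addition (not part of the original doctest output); only the deterministic shape is shown, since the values depend on the random input:

>>> # Sparsify each row of a random matrix independently
>>> import numpy as np
>>> X = np.random.randn(4, 32)
>>> X_sparse = librosa.util.sparsify_rows(X, quantile=0.05)
>>> X_sparse.shape
(4, 32)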