View covariance_iir.py
#!/usr/bin/env python2
from __future__ import division
import numpy as np
__version__ = "2017-01-13 jan denis-bz-py t-online de"
#...............................................................................
class Covariance_iir( object ):
""" running Covariance_iir filter, up-weighting more recent data like IIR
View ansicolors.py
#!/usr/bin/env python
""" ansicolors.py: colors( str [fg= bg= style=] ) -> str with color codes
fg, bg: black red green yellow blue magenta cyan white
or a number 0 .. 255
style: bold faint italic underline blink blink2 negative concealed crossed
see https://en.wikipedia.org/wiki/ANSI_escape_code#Colors
less -R file-with-colors
"""
# pip install ansicolors -> .../site-packages/colors.py
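The package's own `colors()` implementation is not reproduced here; a minimal sketch of the underlying SGR escape codes from the Wikipedia page above (the function name `ansi` and its signature are mine, not the package's API):

```python
def ansi(s, fg=None, bg=None, bold=False):
    """ wrap s in ANSI SGR escape codes; fg, bg are 0 .. 255 palette numbers """
    codes = []
    if bold:
        codes.append("1")
    if fg is not None:
        codes.append("38;5;%d" % fg)   # 256-color foreground
    if bg is not None:
        codes.append("48;5;%d" % bg)   # 256-color background
    if not codes:
        return s
    return "\x1b[%sm%s\x1b[0m" % (";".join(codes), s)

print(ansi("warning", fg=1, bold=True))   # bold red, in a terminal
```

`less -R` (as noted above) passes these escape codes through instead of showing them raw.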
View Downhill-mnist.md

mnist-downhill.py below runs the downhill optimizer on the MNIST test set of handwritten digits.

Downhill in a nutshell:

  • gradient descent optimizers: SGD, RMSProp, Ada*, with momentum
  • a thin wrapper for theano (really thin: 1500 lines, half comments)
  • well-written, narrative doc
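The optimizers listed all build on gradient descent with momentum; a minimal numpy sketch of that core step (not downhill's actual code — names and parameters are mine):

```python
import numpy as np

def sgd_momentum(grad, x0, lr=0.1, mu=0.9, nsteps=500):
    """ gradient descent with momentum: the velocity v accumulates
        past gradients, decaying by mu each step """
    x = np.asarray(x0, float)
    v = np.zeros_like(x)
    for _ in range(nsteps):
        v = mu * v - lr * grad(x)
        x = x + v
    return x

# minimize (x - 3)^2 : the gradient is 2 (x - 3)
xmin = sgd_momentum(lambda x: 2 * (x - 3), x0=[0.0])
```

RMSProp and the Ada* family differ mainly in how they rescale `lr` per coordinate from a running history of gradients.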
View theano-example-how-to-monitor-gradients.py
# a tiny example of how to monitor gradients in theano
# from http://www.marekrei.com/blog/theano-tutorial/ 9. Minimal Training Example
# denis-bz 2016-11-04 nov
import theano
import theano.tensor as TT
import numpy as np
floatx = theano.config.floatX
np.set_printoptions( threshold=20, edgeitems=10, linewidth=100,
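Theano itself may not be installed; the same idea — watch gradient magnitudes while training — can be sketched framework-free with numpy finite differences (this is an analogue, not the blog post's theano code):

```python
import numpy as np

def num_grad(f, w, eps=1e-6):
    """ central finite differences, one coordinate at a time:
        a framework-free way to get a gradient to monitor """
    g = np.zeros_like(w)
    for i in range(w.size):
        e = np.zeros_like(w); e[i] = eps
        g[i] = (f(w + e) - f(w - e)) / (2 * eps)
    return g

# monitor the gradient while fitting a 1-d least-squares model y = w x
X = np.array([[1.0], [2.0], [3.0]])
y = np.array([2.0, 4.0, 6.0])
loss = lambda w: np.mean((X.dot(w) - y) ** 2)
w = np.zeros(1)
for step in range(200):
    g = num_grad(loss, w)
    w -= 0.1 * g
    # print( step, loss(w), np.abs(g).max() )   # the monitored quantities
```

In theano proper, the symbolic gradient from `theano.grad` can simply be returned as an extra output of the compiled training function.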
View sparse-dot-ndarray.py
""" Which {sparse or numpy array} * {sparse or numpy array} work ?
Which combinations are valid, what's the result type ?
Here try-it-and-see on N^2 combinations
with `safe_sparse_dot` from scikit-learn, not "*" .
See also:
http://scipy-lectures.github.com/advanced/scipy_sparse/
https://scipy.github.io/old-wiki/pages/SciPyPackages/Sparse.html
http://stackoverflow.com/questions/tagged/scipy+sparse-matrix (lots)
"""
# Keywords: scipy sparse dot-product basics tutorial
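A couple of the combinations, try-it-and-see (scipy.sparse `csr_matrix`; note that on the older spmatrix classes `*` means matrix multiplication, which is one reason the file above prefers an explicit dot):

```python
import numpy as np
import scipy.sparse as sp

A = sp.csr_matrix(np.eye(3))       # sparse
B = np.arange(9.0).reshape(3, 3)   # dense ndarray

print(type(A.dot(A)))   # sparse.dot(sparse) -> sparse
print(type(A.dot(B)))   # sparse.dot(dense)  -> dense
print(type(A * A))      # "*" on spmatrix is matrix multiply, not elementwise
```

The newer `csr_array` classes change the meaning of `*` to elementwise, matching numpy — another reason to spell out the product you mean.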
View 0-MNIST-KNN-svm.md

Compare sklearn KNN rbf poly2 on MNIST digits

Purpose: compare 4 scikit-learn classifiers on a venerable test case, the MNIST database of 70000 handwritten digits, 28 x 28 pixels.

Keywords: classification, benchmark, MNIST, KNN, SVM, scikit-learn, python

knn-mismatch-10
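The benchmark script itself is not shown here; as a hedged illustration of what KNN does (a minimal majority-vote classifier in numpy, names mine — the real comparison uses scikit-learn's KNeighborsClassifier and SVMs):

```python
import numpy as np

def knn_predict(Xtrain, ytrain, Xtest, k=3):
    """ minimal k-nearest-neighbour classifier:
        majority vote among the k closest training points """
    yhat = []
    for x in Xtest:
        dist2 = ((Xtrain - x) ** 2).sum(axis=1)      # squared euclidean distances
        nearest = ytrain[np.argsort(dist2)[:k]]
        vals, counts = np.unique(nearest, return_counts=True)
        yhat.append(vals[np.argmax(counts)])
    return np.array(yhat)
```

On MNIST, `Xtrain` would be the 60000 training images flattened to 784-vectors; the whole cost of KNN is at predict time, scanning the training set for every test point.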

View Signal-98%-plus-Noise-20%.md

Signal 98 % + Noise 20 % = 100 % ?

Data is often split into "signal" + "noise", with

|data|^2 = |signal|^2 + |noise|^2
         = sum of squares, data1^2 + data2^2 + ...

Sums of squares can be rather non-intuitive. For example,

View truncated-svd-diagonal.py
#!/usr/bin/env python
""" Gavish + Donoho, Optimal Hard Threshold for Singular Values is 4 / sqrt 3, 2014, 14p
A = D (Signal, diagonal) + Noise
Atrunc = truncated SVD( A )
Writing Atrunc = D + Res, how well does Atrunc approximate D ?
|Atrunc|, |Res| increase with ntrunc
"""
# what am I missing:
# if one knows that Signal is diagonal, just threshold A ?
# See also
View 0-Stripy.md

Stripy: percentile stripes for scatterplots

Keywords: scatterplot, percentiles, quantiles, visualize, regression, nonparametric

ozone-stripy-4june

What does this show ? Consider a fat vertical line at a given x in one of these plots. The colored bands are, from low to high,