suncurves.md

suncurves.py: how does the sun's shadow vary over the year?

[plot: suncurves]

Test-error-in-classification.md

How noisy is test error in classification?

Algorithms for classification, in particular binary classification, have two different objectives (a small numeric illustration follows the list):

  • a smooth, fast, approximate loss function, used inside most optimizers
  • the real loss, measured on a test set: expensive to calculate, so usually done only at the end.
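
A small illustration of the gap between the two, using logistic loss as the smooth surrogate and 0-1 error as the real test loss; the random margins below are made-up stand-ins for y * f(x) on a test set, not data from the repo:

# illustration only: smooth surrogate vs. real 0-1 test error
import numpy as np

def logistic_loss( margins ):
    """ smooth surrogate: mean log(1 + exp( -y f(x) )) """
    return np.log1p( np.exp( -margins )).mean()

def zero_one_error( margins ):
    """ the real loss: fraction of test points misclassified """
    return (margins <= 0).mean()

rng = np.random.RandomState( 0 )
margins = rng.normal( loc=0.5, scale=1.0, size=1000 )   # fake y * f(x)
print( "logistic loss %.3g   0-1 error %.3g" % (
        logistic_loss( margins ), zero_one_error( margins )))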
datalogger.py
""" logger = Datalogger( "x y z ..." ): log vars to plot or save
logger( locals(), x= ... ) in a loop
looks up x y ... in e.g. locals()
and grows
logger.mem["x"] = [x0 x1 ...]
logger.mem["y"] = [y0 y1 ...]
... over all calls to logger(), e.g. in a loop.
logger.savez( npzfile ) saves all x y z ... with numpy.savez .
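
A minimal sketch of a class with that interface, assuming only what the docstring says (the real datalogger.py may handle arguments, checking and errors differently):

import numpy as np

class Datalogger( object ):
    """ sketch: log named vars to lists in self.mem, save them with savez """
    def __init__( self, names ):
        self.names = names.split()
        self.mem = { name: [] for name in self.names }

    def __call__( self, vars=None, **kw ):
        # look up each name in the keyword args first, else in e.g. locals()
        for name in self.names:
            if name in kw:
                self.mem[name].append( kw[name] )
            elif vars is not None and name in vars:
                self.mem[name].append( vars[name] )

    def savez( self, npzfile ):
        # save all the logged lists as arrays with numpy.savez
        np.savez( npzfile, **{ name: np.asarray( vals )
                               for name, vals in self.mem.items() })

# usage, as in the docstring:
logger = Datalogger( "x y" )
for x in range( 3 ):
    y = x * x
    logger( locals() )
logger.savez( "xy.npz" )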
covariance_iir.py
#!/usr/bin/env python2
from __future__ import division
import numpy as np
__version__ = "2017-01-13 jan denis-bz-py t-online de"
#...............................................................................
class Covariance_iir( object ):
""" running Covariance_iir filter, up-weighting more recent data like IIR
ansicolors.py
#!/usr/bin/env python
""" ansicolors.py: colors( str [fg= bg= style=] ) -> str with color codes
fg, bg: black red green yellow blue magenta cyan white
or a number 0 .. 255
style: bold faint italic underline blink blink2 negative concealed crossed
see https://en.wikipedia.org/wiki/ANSI_escape_code#Colors
less -R file-with-colors
"""
# pip install ansicolors -> .../site-packages/colors.py
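
A usage sketch following the signature in the docstring above; the function name and keywords are taken from that docstring, not checked against the pip ansicolors package mentioned in the comment:

from ansicolors import colors   # this file's own interface, per the docstring

print( colors( "Warning: low disk space", fg="yellow", style="bold" ))
print( colors( "ok", fg="green" ))
print( colors( "fatal", fg="white", bg="red", style="underline" ))
print( colors( "256-color test", fg=208 ))   # numbers 0 .. 255 also work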
Downhill-mnist.md

mnist-downhill.py below runs the downhill optimizer on the MNIST test set of handwritten digits.

Downhill in a nutshell:

  • gradient descent optimizers: SGD, RMSProp, Ada*, with momentum
  • a thin wrapper for theano (really thin: 1500 lines, half comments)
  • well-written, narrative doc
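
A rough sketch of the usual calling pattern, assuming the high-level downhill.minimize() entry point with loss / train / inputs / algo keywords as in the downhill docs; check those docs before running, and substitute real MNIST arrays for the random stand-ins:

import numpy as np
import theano
import theano.tensor as TT
import downhill

x = TT.matrix( "x" )
y = TT.ivector( "y" )
W = theano.shared( np.zeros(( 784, 10 )), name="W" )
b = theano.shared( np.zeros( 10 ), name="b" )
p = TT.nnet.softmax( TT.dot( x, W ) + b )
loss = - TT.mean( TT.log( p )[ TT.arange( y.shape[0] ), y ])   # cross-entropy

X = np.random.rand( 1000, 784 )                     # stand-in for MNIST images
Y = np.random.randint( 0, 10, 1000 ).astype( "int32" )

downhill.minimize(
    loss=loss,
    train=[X, Y],
    inputs=[x, y],
    algo="rmsprop",
    batch_size=128 )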
theano-example-how-to-monitor-gradients.py
# a tiny example of how to monitor gradients in theano
# from http://www.marekrei.com/blog/theano-tutorial/ 9. Minimal Training Example
# denis-bz 2016-11-04 nov
import theano
import theano.tensor as TT
import numpy as np
floatx = theano.config.floatX
np.set_printoptions( threshold=20, edgeitems=10, linewidth=100,
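
Below that excerpt, a hedged sketch of the monitoring idea itself: the tiny model, target 20.0 and learning rate 0.1 follow Marek Rei's minimal training example, and the gradient is simply returned from theano.function so it can be printed at every step.

xval = np.asarray( [1.0, 1.0], dtype=floatx )
x = TT.vector( "x" )
target = TT.scalar( "target" )
w = theano.shared( np.asarray( [0.2, 0.7], dtype=floatx ), name="w" )
y = TT.dot( x, w )
cost = TT.sqr( target - y )
gradw = TT.grad( cost, w )

train = theano.function(
    [x, target],
    [y, cost, gradw],                       # outputs include the gradient
    updates=[ (w, w - 0.1 * gradw) ] )

for step in range( 10 ):
    yval, costval, gradval = train( xval, np.asarray( 20.0, dtype=floatx ))
    print( "step %d  cost %-8.3g  grad %s" % (step, float( costval ), gradval ))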
sparse-dot-ndarray.py
""" Which {sparse or numpy array} * {sparse or numpy array} products work?
Which combinations are valid, and what is the result type?
Here, try-it-and-see on all N^2 combinations,
with `safe_sparse_dot` from scikit-learn, not "*" .
See also:
http://scipy-lectures.github.com/advanced/scipy_sparse/
https://scipy.github.io/old-wiki/pages/SciPyPackages/Sparse.html
http://stackoverflow.com/questions/tagged/scipy+sparse-matrix (lots)
"""
# Keywords: scipy sparse dot-product basics tutorial
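
In the same try-it-and-see spirit, a tiny sample that just prints the result type of a few combinations (the full N^2 table in sparse-dot-ndarray.py is not reproduced here):

import numpy as np
import scipy.sparse as sp
from sklearn.utils.extmath import safe_sparse_dot

A = sp.random( 5, 5, density=0.2, format="csr" )
B = np.ones(( 5, 3 ))

for left, right, name in [
        (A, A, "csr * csr"),
        (A, B, "csr * dense"),
        (B.T, A.toarray(), "dense * dense") ]:
    out = safe_sparse_dot( left, right )
    print( "%-14s -> %s" % (name, type( out )))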
0-MNIST-KNN-svm.md

Compare sklearn KNN rbf poly2 on MNIST digits

Purpose: compare 4 scikit-learn classifiers on a venerable test case, the MNIST database of 70000 handwritten digits, 28 x 28 pixels.
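
A rough sketch of the kind of comparison meant here, with illustrative classifiers and parameters (not necessarily the exact four benchmarked), assuming fetch_openml( "mnist_784" ) for the data:

from sklearn.datasets import fetch_openml
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = fetch_openml( "mnist_784", version=1, return_X_y=True, as_frame=False )
X /= 255.0                                   # pixels 0 .. 255 -> 0 .. 1
# subsample the training set to keep the run short; all 60000 is much slower
Xtrain, ytrain = X[:10000], y[:10000]
Xtest,  ytest  = X[60000:], y[60000:]

classifiers = {
    "KNN k=5":   KNeighborsClassifier( n_neighbors=5 ),
    "SVM rbf":   SVC( kernel="rbf", gamma="scale" ),
    "SVM poly2": SVC( kernel="poly", degree=2, gamma="scale" ),
}
for name, clf in classifiers.items():
    clf.fit( Xtrain, ytrain )
    print( "%-10s test accuracy %.3f" % (name, clf.score( Xtest, ytest )))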

Keywords: classification, benchmark, MNIST, KNN, SVM, scikit-learn, python

[plot: knn-mismatch-10]