View 0-Adaptive-soft-threshold-smooth-abs.md

Adaptive soft threshold and smooth abs: scale by average |X|

The soft threshold and smooth absolute value functions

(figure: adasoft plot)

are widely used in optimization and signal processing. (Soft thresholding squeezes small values to 0; if "noise" is small and "signal" large, this improves the signal-to-noise ratio. Smooth abs, also called
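
Both functions in a minimal numpy sketch (illustrative names, not this repo's code); the adaptive part scales the threshold by average |x|:

    import numpy as np

    def softthresh( x, t ):
        """ shrink x towards 0 by t: |x| <= t -> 0 """
        return np.sign( x ) * np.maximum( np.abs( x ) - t, 0 )

    def smoothabs( x, eps ):
        """ ~ |x| for |x| >> eps, smooth near 0 """
        return np.sqrt( x*x + eps*eps ) - eps

    x = np.random.randn( 1000 )
    t = 0.5 * np.abs( x ).mean()  # adaptive: threshold scaled by average |x|, 0.5 illustrative
    xsoft = softthresh( x, t )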

View 0-Qmin.md

Qmin: minimize a noisy function by fitting quadratics.

Purpose: short, clear code for

  • fitting quadratics to data, aka quadratic regression
  • iterating quad fits to a local minimum of a noisy function.

This code is for students of programming and optimization to read and try out, not for professionals.
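
The core of one quad-fit step, in a numpy sketch (an illustration of the idea, not this repo's code): fit a parabola by least squares, then move to its vertex:

    import numpy as np

    x = np.linspace( -1, 1, 20 )
    y = (x - 0.3)**2 + 0.01 * np.random.randn( x.size )  # a noisy quadratic
    c2, c1, c0 = np.polyfit( x, y, 2 )  # quadratic regression, highest power first
    xmin = -c1 / (2 * c2)  # vertex of the fitted parabola, near the true 0.3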

View Test-error-in-classification.md

How noisy is test error in classification?

Algorithms for classification, in particular binary classification, have two different objectives:

  • a smooth, fast, approximate loss function used in most optimizers
  • real loss measured on a test set: expensive to calculate, so usually done only at the end; how noisy that is can be estimated with the sketch below.
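
Misclassifications on n independent test points are roughly binomial, so an observed error rate p has standard deviation about sqrt( p (1 - p) / n ). A numpy sketch with toy numbers (not from this repo):

    import numpy as np

    p, n = 0.05, 10000  # true error rate, test set size
    sd = np.sqrt( p * (1 - p) / n )  # ~ 0.0022: 5 % +- 0.2 %
    # so ranking classifiers a few tenths of a percent apart needs a big test set.
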
View datalogger.py
""" logger = Datalogger( "x y z ..." ): log vars to plot or save
logger( locals(), x= ... ) in a loop
looks up x y ... in e.g. locals()
and grows
logger.mem["x"] = [x0 x1 ...]
logger.mem["y"] = [y0 y1 ...]
... over all calls to logger(), e.g. in a loop.
logger.savez( npzfile ) saves all x y z ... with numpy.savez .
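    Example, a hypothetical loop logging two scalars (a usage sketch,
    not code from this file):
        logger = Datalogger( "x y" )
        for t in range( 100 ):
            x, y = t, t*t
            logger( locals() )
        logger.savez( "xy.npz" )  # later: np.load( "xy.npz" )["x"]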
View covariance_iir.py
#!/usr/bin/env python2
from __future__ import division
import numpy as np
__version__ = "2017-01-13 jan denis-bz-py t-online de"
#...............................................................................
class Covariance_iir( object ):
""" running Covariance_iir filter, up-weighting more recent data like IIR
View ansicolors.py
#!/usr/bin/env python
""" ansicolors.py: colors( str [fg= bg= style=] ) -> str with color codes
fg, bg: black red green yellow blue magenta cyan white
or a number 0 .. 255
style: bold faint italic underline blink blink2 negative concealed crossed
see https://en.wikipedia.org/wiki/ANSI_escape_code#Colors
less -R file-with-colors
"""
# pip install ansicolors -> .../site-packages/colors.py
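# Example usage, a sketch assuming the colors() call documented above:
#   from ansicolors import colors
#   print( colors( "error:", fg="white", bg="red", style="bold" ))
#   print( colors( "gray", fg=244 ))  # 0 .. 255, see the wikipedia table
# then e.g. `less -R logfile` shows the colors.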
View Downhill-mnist.md

mnist-downhill.py below runs the downhill optimizer on the MNIST test set of handwritten digits.

Downhill in a nutshell:

  • gradient descent optimizers: SGD, RMSProp, Ada*, with momentum
  • a thin wrapper for theano (really thin: 1500 lines, half comments)
  • well-written, narrative doc
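
For flavor, a minimal sketch of the one-call API on toy data (hypothetical code, assuming downhill.minimize as described in its doc; the real mnist-downhill.py follows below):

    import numpy as np
    import theano, theano.tensor as TT
    import downhill

    x = TT.matrix( "x" )
    w = theano.shared( np.ones( 10, dtype="float32" ), name="w" )
    loss = TT.sqr( TT.dot( x, w )).mean()  # a toy quadratic loss in w
    data = np.random.rand( 256, 10 ).astype( "float32" )
    downhill.minimize( loss, train=[data], inputs=[x], algo="rmsprop", batch_size=32 )
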
View theano-example-how-to-monitor-gradients.py
# a tiny example of how to monitor gradients in theano
# from http://www.marekrei.com/blog/theano-tutorial/ 9. Minimal Training Example
# denis-bz 2016-11-04 nov
import theano
import theano.tensor as TT
import numpy as np
floatx = theano.config.floatX
np.set_printoptions( threshold=20, edgeitems=10, linewidth=100,