@vihari
Created September 23, 2017 06:16
Perceptron Convergence Analysis
#!/usr/bin/python
"""
Is the convergence rate of the perceptron update dependent on the input
dimensionality?
"""
import numpy as np

N = 100   # number of training points
lr = 1    # learning rate

for sz in [5, 10, 100, 500, 1000, 5000, 10000]:
    # Sample random Gaussian data and a random hyperplane, then label the data
    # with that hyperplane so the problem is linearly separable by construction.
    dat = np.random.normal(scale=10, size=[N, sz])
    worig = np.random.normal(scale=10, size=[sz])
    y = np.where(np.dot(dat, worig) > 0, np.ones([N]), -np.ones([N]))

    na = 10   # number of random restarts to average over
    ne = 0    # total epochs to convergence, summed over attempts
    for attempt in range(na):
        w = np.random.normal(size=[sz])
        for epoch in range(100000):
            br = True
            for i, x in enumerate(dat):
                # Perceptron update on every misclassified point.
                if y[i]*np.dot(x, w) < 0:
                    w += lr*y[i]*x
                    br = False
            if br:
                # A full pass with no mistakes: converged for this attempt.
                ne += epoch
                break
    # Average assumes every attempt converged within the epoch cap.
    ne /= float(na)
    print "Dimension: %d Converged in %f epochs" % (sz, ne)