@amitsaha
Created April 12, 2012 14:54
Hill Climbing with Multiple Solutions
'''Hill climbing search with multiple solutions
in a continuous variable space. (See: http://en.wikipedia.org/wiki/Hill_climbing)
Single variable.'''
# Feel free to reuse, modify, play!
# Report errors/suggestions to: amitsaha.in@gmail.com
# Amit Saha (http://echorand.me)
# 7 April, 2012
'''
README:
1. Objective functions are defined as func1(), func2(), ...
and assigned to the global variable objfunc
2. Assumes a 1-D maximization problem
3. Lower and upper bounds of the variable are specified
in the globals lb and ub
4. Uses the central difference method to calculate the gradient of a function
5. If a move doesn't improve on the previous one, a random noise is added to it
for at most maxstuck times before exiting the procedure
6. The file solutions.dat contains the X and F data of all the solutions evaluated
7. The file best.dat contains the X and F of the best solution reached from the
different starting points
8. Execution begins from __main__ below
'''
import math
import random
#math globals
fabs=math.fabs
sin=math.sin
cos=math.cos
exp=math.exp
pi=math.pi
#number of solutions
numsolutions=5
#lower bound and upper bound
#function func1
#lb=0.0
#ub=2.0
#function func2
lb=-3.0
ub=3.0
#function
def func1(x):
    # defined on [lb, ub] = [0.0, 2.0]
    # to see what this function looks like:
    # gnuplot> plot [0:2] exp(-x*x)*(sin(3.0*pi*x)**2)
    return exp(-x*x)*(sin(3.0*pi*x)**2)
#function
def func2(x):
    # defined on [lb, ub] = [-3.0, 3.0]
    # to see what this function looks like:
    # gnuplot> plot [-3:3] x*sin(x*x)
    return x*sin(x*x)
#objective function variable used later
#objfunc=func1
objfunc=func2
#file to store all the data
f=open('solutions.dat','w')
#file to store the best reached from a solution
f_best=open('best.dat','w')
# I am doing gradient ascent here, since I want to maximize the function
def grad_ascent(x):
    # scalar multiple (step size) for gradient ascent
    scalar = 0.001
    # central difference method to calculate the gradient of the function
    delta = 0.001
    delta_x = (objfunc(x+delta) - objfunc(x-delta))/(2.0*delta)
    return x + (scalar*delta_x)
#Definition of the simple hill climbing search
def hill_climb(x, next_move):
    # starting point
    current = x
    # maximum number of retries when it gets stuck
    maxstuck = 10
    # initialize counter
    stuck = 0
    while True:
        # evaluate the current solution
        current_val = objfunc(current)
        # book keeping
        f.write('{0:f} {1:f}\n'.format(current, current_val))
        # next move (avoid shadowing the next() builtin)
        candidate = next_move(current)
        # check the quality of the move
        if objfunc(candidate) > current_val:
            # accept it for the next iteration
            current = candidate
        else:
            # if stuck
            if stuck < maxstuck:
                stuck = stuck + 1
                # add a uniform random noise
                current = candidate + (random.uniform(0.01, 0.05)*candidate)
                # clamp in case it overshoots the bounds of the variable
                if current < lb:
                    current = lb
                if current > ub:
                    current = ub
            else:
                # if the quality of the move does not improve for
                # maxstuck times, quit and hope for the best
                break
    return current, current_val
# the beginning
if __name__ == '__main__':
    # to make results reproducible
    random.seed(10)
    # initial solutions in [lb, ub]
    x_init = []
    for i in range(numsolutions):
        x_init.append(lb + (ub-lb)*random.random())
    # try to reach the highest point starting from
    # the different solutions
    for x in x_init:
        # start the procedure
        bestx, bestf = hill_climb(x, grad_ascent)
        print('Best solution with {0:f} :: {1:f}'.format(x, bestf))
        # store
        f_best.write('{0:f} {1:f}\n'.format(bestx, bestf))
    # close files
    f.close()
    f_best.close()
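As a quick sanity check of the central-difference scheme used in grad_ascent() (README item 4), the snippet below compares it against the analytic derivative of func2, i.e. d/dx [x*sin(x*x)] = sin(x*x) + 2*x*x*cos(x*x). The helper name central_diff is my own for illustration; it is not part of the gist.

```python
import math

def central_diff(fn, x, delta=0.001):
    # same scheme as grad_ascent(): (f(x+d) - f(x-d)) / (2d)
    return (fn(x + delta) - fn(x - delta)) / (2.0 * delta)

f2 = lambda x: x * math.sin(x * x)
analytic = lambda x: math.sin(x * x) + 2.0 * x * x * math.cos(x * x)

for x in (0.5, 1.0, 1.5):
    approx = central_diff(f2, x)
    # central differences are second-order accurate in delta,
    # so with delta=0.001 the error should be well under 1e-4
    assert abs(approx - analytic(x)) < 1e-4
```

Because the error of the central difference shrinks quadratically with delta, the default delta=0.001 in grad_ascent() is more than accurate enough for this smooth objective.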
@amitsaha
Copy link
Author

Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment
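The stuck-handling step (README item 5) can be isolated as a small sketch, assuming the same bounds lb=-3.0, ub=3.0 as func2: the point is perturbed by uniform multiplicative noise and then clamped back into [lb, ub]. The function name perturb is mine, not from the gist.

```python
import random

def perturb(x, lb=-3.0, ub=3.0):
    # add uniform multiplicative noise, as hill_climb() does when stuck
    x = x + random.uniform(0.01, 0.05) * x
    # clamp in case the perturbed point leaves [lb, ub]
    return max(lb, min(ub, x))

random.seed(10)
print(perturb(2.99))  # prints 3.0 (perturbed past ub, then clamped)
```

Clamping keeps every restart inside the search interval, so the bookkeeping files never record a point outside [lb, ub].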