An implementation of the Euler method for approximating function values
# Tom A. Thorogood - 2013 - github.com/tomthorogood
#
# This software is free to use, modify, and distribute as you wish. It is intended for educational purposes only, and is
# provided without a guarantee of functionality or warranty of any sort. Usage is solely at your own risk.
#
# The Euler class implements Euler's method to estimate the value of a function (f) at an input value (x), given the
# function's differential equation and one known point.
#
# The class requires the following:
# 1) A differential equation, expressed as a function of two variables (x and y)
# 2) A known x value and its corresponding known y value
# 3) An x value for which a y value is desired (the goal)
# 4) A step size h (smaller values give more precise estimates)
#
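# Each step advances the estimate using the standard Euler update (which is what step() below performs):
#
#     y_{n+1} = y_n + h * f'(x_n, y_n)
#     x_{n+1} = x_n + h
#
# and the process repeats until x reaches the goal value.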
#
# The class is stateful, and can be interacted with and examined stepwise:
#
# e.step()
# e.x # returns the last used x value
# e.y # returns the last estimated y value
# e.f_prime # returns the last estimated value of the derivative of f(x) at value x
# e.num_steps # returns the number of steps elapsed
#
# ===================================================
# BASIC USAGE:
#
# from euler import Euler
#
# def diff(x,y): return (2*x) + (3*y**2)
# e = Euler(
#     differential = diff,
#     known_x = -3,
#     known_y = 1,
#     goal = -1,
#     h = 0.25
# )
#
# e.run()
# [returns -.862823900]
# ====================================================
#
# You do not need to pass in the sign of 'h'. The class will determine the sign of h based on known_x and goal.
# If you do pass in a signed h, its sign is discarded and recomputed from known_x and goal, so it will not hurt anything.
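#
# For example, reusing diff from above with a hypothetical starting point to the right of the goal:
#
# e = Euler(differential = diff, known_x = -1, known_y = 1, goal = -3, h = 0.25)
#
# Because goal < known_x, the class negates h internally and steps from x = -1 down to x = -3.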
#
# You can use a combination of step-wise and automatic processing:
#
# e.step() #advance the calculation one step
# e.step() #advance the calculation one step
# e.run() #complete the calculation through to the end
class Euler(object):
    def __init__(self, differential, known_x, known_y, goal, h=1):
        self.differential = differential
        self.goal = float(goal)
        self.h = float(abs(h))
        self.x = float(known_x)
        self.y = float(known_y)
        # Derivative of f at the current point; updated on every step.
        self.f_prime = self.differential(self.x, self.y)
        self.direction = self.step_increase
        self.num_steps = 0
        # Step backwards (negative h) if the goal lies to the left of the known x.
        if self.goal < self.x:
            self.h = -self.h
            self.direction = self.step_decrease

    def step(self):
        # One Euler update: y_{n+1} = y_n + h * f'(x_n, y_n), x_{n+1} = x_n + h
        self.num_steps += 1
        self.f_prime = self.differential(self.x, self.y)
        self.y = self.y + (self.h * self.f_prime)
        self.x = self.x + self.h

    def step_increase(self):
        while self.x < self.goal:
            self.step()

    def step_decrease(self):
        while self.x > self.goal:
            self.step()

    def run(self):
        self.direction()
        return self.y
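

# A minimal usage sketch (not part of the original gist): dy/dx = y with y(0) = 1
# has the exact solution y = exp(x), so the estimate at x = 1 should creep toward
# e ~= 2.71828 as the step size h is halved. Power-of-two step sizes are used so
# that repeated additions of h land exactly on the goal.
if __name__ == "__main__":
    def diff(x, y):
        # dy/dx = y
        return y

    for h in (0.5, 0.25, 0.125, 0.0625):
        e = Euler(differential=diff, known_x=0, known_y=1, goal=1, h=h)
        estimate = e.run()
        print("h = %-7s steps = %-3d estimate of e = %.6f" % (h, e.num_steps, estimate))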