@Sujimichi
Created May 27, 2012 02:15
Reduced version of my NN solution to RubyWarrior, with support for differently layered NNs removed and the extra comments stripped. Includes two evolved genomes; put one in a file called 'genome' alongside player.rb in the warrior folder (a short sketch of doing this follows the genome data).
0.4208665093792384,0.8271085823481578,-0.08677969236421801,-1.3812424909479102,0.25662357967972205,0.895993960666072,0.5969217612341047,1.003827461932993,0.3535111488721142,1.626234122590899,-0.9807624824831435,0.8455667360510928,-0.6596045396559488,-0.32330752252947614,0.30962364095597517,-1.051266687042697,-0.8619251071880633,1.0131278329152824,0.7045901871883101,0.5102103240087068,-0.5795087495450799,-1.4718975639160914,1.110149168791025,0.4848770449182779,-0.131034115009259,-0.8994602986777726,1.379669139856652,-0.24300620776607484,-0.032104129747120846,0.21746064790006747,-1.1751996077694868,-0.4416596710613584,1.2838430768445492,-0.33854260905174827,-0.5552927940371294,0.19992797470487422,-1.2768902810663525,-3.223132063475557,2.6814521898725237,-1.9607168460213784,1.9180731391978023,0.6077600309107104,0.4123090743791793,0.8246054475587798,-0.9596703206415195,0.2646596987299211,-0.06851920283518687,0.9944253709871688,0.24848358116043623,0.7495855784944395,-1.108429896105167,1.3753052571268305,3.181034120558026,-1.2709155470015525,0.7253500256394799,2.4192587701666324,2.122847393625867,-0.7075864717716149,2.694571198462502,-0.23738793552525383,-2.6191541131659126,0.6470749529047221,-2.0785370070988187,0.21706440490961854,-1.0164747730432557,0.16792697963601755,0.5023039485374504,-1.6222578128267933,-1.6384362302042,0.3516923946976447,-0.08501410283203104,0.10333251322753634,0.9694370284329388,0.7053274460037942,-1.9104000853472534,1.347447081663761,-0.875632334774423,0.4292051447798868,-0.1650455988119438,-0.3090559012185816,0.8122849956890801,-0.43795253002994095,-0.6324174808071471,1.052703281088451,-1.1261157583338193,1.073884982118626,1.602349239517356,-1.5930163275608602,-0.5929434775878311,-0.08861092539187976,2.2426990558759963,-0.6078690447899587,-1.4689188415414924,0.8433812929082517,-0.09344770334669705,0.5036286454281341,-0.6455383167824994,-2.2115467726742777,0.9784517143150431,-1.5474794887298606,-1.0121117734936058,1.315809794779084,-0.038522836988336806,-0.1922363540190457,0.34537500635109775,-0.07035164026793084,0.1304654458666375,0.046499273727501345,-0.776155984209477,-0.37063833219461223,2.953491431451073,-0.05017712876532365,-0.43929723081370353,-0.9906992496151041,-1.9884220925888678,-2.869539337671243,-1.4158283863183865,0.7505673085798398,0.9765036522488568,0.2733308317062427,1.7150446139071789,-1.1283890568444024,-0.11792247213316376,2.4010116280638307,-0.4041747776518825,0.17187082634317985,2.9376700582278765,-0.5871316925721045,1.3928431558645926,-0.1618869375393336,0.5749040004267305,-0.4636279527831687,1.3626708417630131,0.711740301132716,-0.0659435614151952,0.28541086127064585,-1.649405424787955,-1.1626701947387152,0.8682689002089043,-0.8377851096785819,-1.65359097309008,-0.818010200816063,-0.6911964701717429,1.273553027323357
0.5802744208575772,-0.28574107165038476,-0.4662748785179248,-1.2171913093525357,0.5656330027974735,-0.5586257421359373,0.5084206048395205,1.003827461932993,0.709080278772143,1.6392156876576782,-1.5816557649098901,1.324602802894768,0.8260058048894272,0.7178511436920003,-0.5804795029026557,-0.6639538111439616,-1.0830547059169757,1.9874278830514984,1.060462572920094,0.043450244058545695,-0.9386108314118661,-0.20380921105848493,0.9387378967454564,1.2449728237120754,-0.22972591303958378,-0.24970922127807615,1.62529564108517,0.5417832369383961,-0.11528627896579335,1.4118376011513845,-0.7192935919079477,-0.3072746576279173,0.5584597818074516,-0.7422452538004506,-0.4634791859355173,0.004736003702502156,-0.522971915073979,-1.6080762596194578,1.2867003151679492,-1.251724631556766,1.2491658636125211,1.3107157100065336,0.1900789190385367,1.8111732526381408,-1.401736698163552,0.5447534774158844,-0.8207816593573641,1.4844152049571304,0.7437535038324593,1.392577856212811,-1.0991699952789844,0.8711055513346737,1.6172844865648934,0.8405928179966495,0.4943506198504791,1.4284968522597425,-0.12065215283619857,-1.4419055454769207,2.7846954519824108,-0.7774835582067521,-2.145877659337063,0.9779931808120903,-0.410640494708302,0.03333410784760027,-1.451123469025841,0.5979415584663418,0.4598369719701315,-1.2593196304976257,-0.35010422037185673,1.1346703795954287,-0.9273859981034838,-0.4201631565453021,1.0212588282106947,0.700940688812856,-1.8844466294018205,0.7347508702944914,-1.0349553738651185,-0.04385884711391752,-0.13241728359821703,0.42457537461219197,1.3247043094193365,-0.8305801586055288,-0.7672892639721568,1.6630102110931597,-1.4630564793478578,1.7414649188338496,0.2974684205486997,-0.5212079800825287,0.18517333251448853,0.411561910031378,1.0719799685548463,-1.7452448790752302,-0.9517129751400798,-0.08448533537832714,-1.2325858448108846,-0.034609688077519296,-1.1319323029736506,-2.504005551714977,-0.3791599946626758,-1.1627861021688033,-0.694483154179575,0.897682867193146,0.1689166248283508,0.1244239416555194,-0.03729333603100804,0.18323346496734283,-0.34400573706550336,-0.09399750371134646,-1.7745907738007982,-0.6104668588467705,2.7888692908831696,0.35413361313989,-1.6159956231111794,-1.2513571752160617,-1.121825686506059,-1.96465470421759,-1.8425058033652832,1.23775809354918,0.6562636799454941,-0.17649029415055761,1.6087244520620438,0.4469884419946354,0.6381523759405975,-0.15787817155644912,-0.6123560840087449,0.48782473306474006,3.4922828293755748,-1.048601918281,0.7836219082927285,-0.8332571162284346,0.6919216046299118,-0.6089382368131837,0.2010862248447225,-0.3640148193914603,-0.7425593566026867,1.3646274903635693,-0.26189256664438787,-1.7190295526618184,-0.4506178405825584,0.1614189724301064,-1.5381221619979772,-0.8047572832141019,-0.3636923895878802,-0.07888040239556415
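To use one of the genomes above, copy a complete comma-separated line into a file named genome next to player.rb. A minimal sketch of doing this from Ruby, assuming the copied line is held in a hypothetical string genome_line:

# genome_line (hypothetical) holds one complete comma-separated line copied from above
File.write("./genome", genome_line) # Player#initialize reads ./genome, splits it on "," and converts each entry with to_f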
class Player
  def initialize
    genome = File.open("./genome", "r"){|f| f.readlines}.join.split(",").map{|s| s.to_f} #Read genome from file.
    @brain = Brain.new({:in => 16, :inner => 6, :out => 8}, genome) #Initialize the warrior's brain (a neural net)
  end
  def play_turn(warrior)
    @previous_health ||= 20
    inputs = input_array_for(warrior) #Sense the world and present it as an array of inputs for the NN
    action, impulse = @brain.act_on(inputs) #send inputs to the neural network and interpret its output as :action and :impulse
    begin #send 'impulse' and 'action' from brain to warrior. Done inside a rescue as the brain may request actions the body can't yet do, like rest! in the earlier levels.
      warrior.send(*["#{action}!", (action.eql?(:rest) ? nil : impulse)].compact)
    rescue NoMethodError => e
      puts "body failed to understand brain! #{e.message}"
    end
    @previous_health = warrior.health if warrior.respond_to?(:health)
  end
  def input_array_for warrior
    dirs = [:left, :forward, :right, :backward] #directions in which things can be
    things = [:wall, :enemy, :captive] #types of thing there can be
    vis_scale = [0, 0.6, 0.3] #used to scale the values returned by :look.
    if warrior.respond_to?(:feel)
      can_look = warrior.respond_to?(:look)
      inputs = things.map do |thing| #for each of the things
        dirs.map do |dir| #in each of the directions
          v = (warrior.feel(dir).send("#{thing}?").eql?(true) ? 1 : 0) #test if that thing is there, returning 1 for true else 0
          if can_look #if the warrior can also look
            look = warrior.look(dir) #look in that direction
            v = v + look.map{|l| (l.send("#{thing}?").eql?(true) ? 1 : 0) * vis_scale[look.index(l)] }.max #reduce the three visible spaces to a single value, ie [0,1,1] => [0, 0.6, 0.3] => 0.6
          end
          v
        end
      end
    else
      inputs = [1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0] #inputs for an empty corridor.
    end
    inputs << (warrior.respond_to?(:shoot!) ? 1 : 0)
    w_health = warrior.respond_to?(:health) ? warrior.health : 20
    inputs << (1 - 1.0/20 * w_health).round(2)
    inputs << ((@previous_health > w_health) ? 1 : 0) #sense of health dropping
    inputs << 1 #constant bias input: a fixed 1 lets each node learn an offset (the standard NN bias term) - and it's REALLY important!
    inputs.flatten
  end
end
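# Summary of the 16-element input vector built by input_array_for above (matching Brain's :in => 16):
#   0..3   wall    in :left, :forward, :right, :backward (feel, plus scaled look values where available)
#   4..7   enemy   in the same four directions
#   8..11  captive in the same four directions
#   12     1 if the warrior has shoot!, else 0
#   13     fraction of health lost so far: (1 - health/20), rounded to 2 d.p.
#   14     1 if health dropped since the last turn, else 0
#   15     constant 1 (bias input)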
class Brain
  def initialize nodes, genome
    p1 = (nodes[:in] * nodes[:inner]) #number of genes (weights) needed for the first layer; with the 16/6/8 network above that is 16*6 = 96, leaving 6*8 = 48 for the second layer
    @network = NeuralNetwork.new([
      NeuralLayer.new(genome[0..p1-1].in_groups_of(nodes[:in])), #init 1st layer with weights from the genome
      NeuralLayer.new(genome[p1..(genome.size-1)].in_groups_of(nodes[:inner])) #init 2nd layer with weights from the genome
    ])
  end
  def act_on inputs
    output = @network.process(inputs) #send inputs through the neural net and get the result as an array of values (one per output node).
    if output[0].abs > output[1].abs #moving forward or backward
      impulse = (output[0] > 0) ? :forward : :backward
    else #moving left or right
      impulse = (output[1] > 0) ? :left : :right
    end
    actions = [[:walk, output[2]], [:attack, output[3]], [:rest, output[4]], [:rescue, output[5]], [:pivot, output[6]], [:shoot, output[7]]]
    action = actions.max_by{|grp| grp.last}.first #select the action with the largest output value
    impulse = :rest if action.eql?(:rest) #rest is the only non-directional action (this line is only needed by assertions in BootCamp, not core to the solution's function)
    [action, impulse] #return the action and impulse
  end
end
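# Worked example of the act_on mapping above, with a hypothetical network output:
#   output = [0.3, -0.9, 0.1, 0.8, -0.2, 0.05, 0.4, -0.6]
#   |output[0]| (0.3) < |output[1]| (0.9), so the impulse axis is left/right; output[1] < 0 gives impulse = :right
#   of outputs 2..7 the largest is output[3] (0.8), so action = :attack
#   act_on returns [:attack, :right], which play_turn turns into warrior.attack!(:right)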
class NeuralNetwork
  def initialize layers
    @layers = layers[0..2]
  end
  def process inputs
    output = inputs
    @layers.each{ |layer| output = layer.process(output) } #<---This calls the neural network calculation step
    output
  end
end
class NeuralLayer
  def initialize weights
    @weights = weights
    @nodes = {:output => @weights.size, :input => @weights.first.size}
  end
  def process inputs
    @nodes[:output].times.map{ |i| inputs.zip(@weights[i]).map{|d| d.product_with_activation}.sum.round(2) } #<---Main NN calculation step.
  end
end
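# For a layer with inputs x and weight rows w_i, each output node i above computes:
#   out_i = round( sum_j sin(x_j * w_ij), 2 )
# i.e. a weighted sum where sin() is applied to each individual product (via Array#product_with_activation
# below) rather than to the summed total, as a conventional activation function would be.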
class Array
  def sum; self.inject{|i,j| i.to_f + j.to_f}; end
  def product; self.inject{|i,j| i.to_f * j.to_f}; end
  def product_with_activation; Math.sin(product); end
  def in_groups_of n
    col = []
    self.each_slice(n){|slice| col << slice}
    return col
  end
end
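A minimal sketch of exercising the Brain outside the game, for anyone poking at the code: it builds a 16/6/8 network from a random genome (not one of the evolved genomes above, so the chosen action is meaningless) and feeds it a dummy input vector. The __FILE__ guard keeps it inert when RubyWarrior loads player.rb.

if __FILE__ == $0
  nodes  = {:in => 16, :inner => 6, :out => 8}
  length = nodes[:in] * nodes[:inner] + nodes[:inner] * nodes[:out] # 16*6 + 6*8 = 144 weights
  genome = Array.new(length){ rand * 2 - 1 }                        # random weights in place of an evolved genome
  brain  = Brain.new(nodes, genome)
  inputs = [0] * 15 + [1]                                           # 15 zeroed senses plus the constant bias input
  p brain.act_on(inputs)                                            # e.g. [:walk, :forward] - depends on the random weights
end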