Genetic Programming: Hypothesis Evolution, by Nam Nguyen and Greg Nelson

TRANSCRIPT

  • Slide 1
  • Genetic Programming: Hypothesis Evolution. By Nam Nguyen and Greg Nelson
  • Slide 2
  • Introduction To Evol: An attempt to more directly simulate the process of natural selection. The system is set up as a large board with agents and examples. Each agent executes on its own thread. Agents live on the board and wander around processing examples and interacting with other agents. Two kinds of agents: helpers and fighters. Examples act as food for the agents.
  • Slide 3
  • Goal: Implement a new genetic algorithm that, rather than being based on probabilities, actually models the process of natural selection. Build a dynamic, intuitive learning system that very closely emulates natural selection. Find a balance in parameter settings (how much "life" an agent starts with, what the "age" threshold for breeding is, how much an agent's "life" is boosted by eating an example, etc.); some settings cause agents to breed like rabbits and grow toward doomsday, others lead to almost immediate extinction.
  • Slide 4
  • Problem Representation: The environment is represented by a board. All entities live on this board. Agents walk around, eat examples, breed, share information, and fight with other agents.
  • Slide 5
  • Hierarchy Tree: Entity at the root, with Example and Agent below it; Agent splits into Helper and Fighter.
  • Slide 6
  • Play Tennis Example. Example 1: If (Outlook = Sunny) && (Temp = Hot) && (Humidity = High) && (Wind = Weak) then (PlayTennis = Yes). Bit string: 10010010101 (first three bits for Outlook, etc.). All hypotheses and examples are represented in this notation (a sketch of the encoding appears after the slides).
  • Slide 7
  • Agents: The agents contain a genotype and a phenotype. Both represent hypotheses. The phenotype is used (and changed); the genotype is inherited at birth, never changed, and passed on to future generations. Agents also interact by breeding, helping each other, or fighting.
  • Slide 8
  • Examples Continued: Examples are created from the training data set and act as food for the agents. When an agent comes upon an example, it does one of two things: 1. If it classifies the example correctly, it "eats" it, which increases its strength and its likelihood of survival. 2. If not, it tries to learn something from the example (changing some of its phenotype bits to match it). This eat-or-learn rule is sketched in code after the slides.
  • Slide 9
  • When Helpers Meet Other Helpers: Helper 1 (0110100), Helper 2 (1110110)
  • Slide 10
  • You are too young. We should help each other. Okay, let's be friends! Helper 1 (0110100), Helper 2 (1110110)
  • Slide 11
  • This is what I learned. Here's my knowledge. Helper 1 (0110100), Helper 2 (1110110): Bit 5 is true, Bit 6 is true
  • Slide 12
  • When Helpers Meet Other Helpers, Part 2: Helper 1 (Life 500), Helper 2 (Life 800)
  • Slide 13
  • How old are you?
  • Slide 14
  • I am above the threshold.
  • Slide 15
  • Slide 16
  • Slide 17
  • Censored
  • Slide 18
  • Several Cycles Later: Helper 1 Life 250, Helper 2 Life 550, Helper 3 (Baby) Life 500
  • Slide 19
  • When Fighters Meet Other Agents: Fighter 1
  • Slide 20
  • Quit stealing my mates and examples!
  • Slide 21
  • No, you stop!
  • Slide 22
  • Lightning Bolt
  • Slide 23
  • Slide 24
  • Slide 25
  • POW! AH HA HA HA!
  • Slide 26
  • When Fighters Meet Wise Helpers: Helper 1 (Life 900), Fighter 1 (Life 800)
  • Slide 27
  • Must fight!
  • Slide 28
  • That agent is too cute to bully. Hello! How are you?
  • Slide 29
  • Get over here!
  • Slide 30
  • Zip! AHHHHH!
  • Slide 31
  • Whoosh!
  • Slide 32
  • Slide 33
  • Slide 34
  • Smooch!
  • Slide 35
  • Several Cycles Later: Helper 1 Life 650, Fighter 1 Life 550, Fighter 2 (Baby) Life 500 (the life transfers behind these interactions are sketched in code after the slides)
  • Slide 36
  • Learning Algorithm: Agents move around the board, learning from examples and from each other. The more successful agents breed more readily, while the less successful agents are killed off by starvation and combat. A minimal simulation-loop sketch appears after the slides.
  • Slide 37
  • Performance: Seemed to learn simple tasks such as playTennis and mushroom: classified 80% of the playTennis set correctly and 92% of the mushroom test set correctly, although the latter is because it learned to classify everything as edible. Performed dismally on other tasks such as voting. Varying parameters might improve stability and speed up convergence.
  • Slide 38
  • Problems: Population control (doomsday and extinction, memory issues). Too much interaction between agents, not enough eating! Disturbing bug (fixed): agents would have a child and then immediately mate with it!
  • Slide 39
  • Conclusions: Hard to control/predict a dynamic population of agents acting independently. Nondeterministic: different results every time, although similar trends (above). With lots of work, we think this could work very well.
  • Slide 40
  • Sample log file:
    (Fighter3,9999692) at [12,3] ate (person161,99) at [12,3]
    (Fighter3,10499691) at [12,3] ate (person161,98) at [12,3]
    (Fighter0,14499692) at [20,1] is breeding with (Helper0,8999691) at [20,1]!
    (Fighter0,14499692) at [20,1] begat (Fighter20,500000) at [20,1]!
    (Fighter20,499999) at [19,1] beat up (Fighter19,499987) at [19,1]!
    (Fighter19,499987) at [19,1] is dying!
    (Fighter0,14499686) at [20,2] is breeding with (Helper0,8999685) at [20,2]!
    (Fighter0,14499686) at [20,2] begat (Fighter21,500000) at [20,2]!
    (Fighter21,500000) at [20,2] got beat up by (Fighter0,14499685) at [20,2]
    (Fighter21,500000) at [20,2] is dying!
    (Helper1,13999675) at [18,0] ate (person77,98) at [18,0]
    (Fighter0,14499676) at [21,4] is breeding with (Helper0,8999675) at [21,4]!
    (Fighter0,14499676) at [21,4] begat (Fighter22,500000) at [21,4]!
    (Fighter3,10999672) at [18,0] ate (person77,97) at [18,0]
    (Helper2,9499670) at [5,16] ate (person112,99) at [5,16]
    (Fighter3,11499670) at [18,24] ate (person52,95) at [18,24]
    (Helper2,9999665) at [6,17] ate (person130,100) at [6,17]
  • Slide 41
  • Sample output:
    Time limit reached: terminating.
    The following 42 rules were learned from the training set:
    Helper1, RANK 2670: IF ((handicappedkids=y) OR (handicappedkids=n)) AND... AND ((export_act_safrica=y)) THEN Democrat.
  • Slide 42
  • Questions ???
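
The sketches below expand on the technical slides. First, the bit-string notation from slide 6: each attribute gets one bit per possible value, plus a final bit for the target. The value orderings used here (Sunny/Overcast/Rain, Hot/Mild/Cool, High/Normal, Weak/Strong) and the coverage test for hypotheses are assumptions, chosen so that the slide's Example 1 reproduces the string 10010010101; the slides themselves do not spell these details out.

```python
# A minimal sketch of the slide-6 bit-string encoding (value orderings are
# assumptions chosen so that Example 1 encodes to 10010010101).
SCHEMA = [
    ("Outlook",  ["Sunny", "Overcast", "Rain"]),
    ("Temp",     ["Hot", "Mild", "Cool"]),
    ("Humidity", ["High", "Normal"]),
    ("Wind",     ["Weak", "Strong"]),
]

def encode_example(attrs, play_tennis):
    """One bit per attribute value (1 = this is the example's value),
    plus one final bit for the PlayTennis label."""
    bits = []
    for name, values in SCHEMA:
        bits.extend("1" if v == attrs[name] else "0" for v in values)
    bits.append("1" if play_tennis else "0")
    return "".join(bits)

def covers(hypothesis, example):
    """Assumed matching rule: a hypothesis (bits over the attribute positions,
    where 1 means 'this value is acceptable') covers an example if, for every
    attribute, the example's value bit is also set in the hypothesis."""
    i = 0
    for _, values in SCHEMA:
        n = len(values)
        h, e = hypothesis[i:i + n], example[i:i + n]
        if not any(hb == eb == "1" for hb, eb in zip(h, e)):
            return False
        i += n
    return True

example1 = encode_example(
    {"Outlook": "Sunny", "Temp": "Hot", "Humidity": "High", "Wind": "Weak"},
    play_tennis=True,
)
print(example1)                        # -> 10010010101, as on slide 6
print(covers("1011111111", example1))  # a permissive hypothesis covers it
```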
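
Next, the entity hierarchy from slide 5 combined with the eat-or-learn rule from slide 8. The class names (Entity, Example, Agent, Helper, Fighter) and the genotype/phenotype split follow the slides; the life constants, the coverage-based classify rule, and the "flip a couple of mismatched phenotype bits" learning step are assumptions, since the slides describe the behaviour only qualitatively.

```python
import random

EAT_BOOST = 500       # assumed life gained by eating an example
LEARN_FLIPS = 2       # assumed number of phenotype bits corrected on a miss

class Entity:
    """Anything that lives on the board (slide 5)."""
    def __init__(self, x, y):
        self.x, self.y = x, y

class Example(Entity):
    """A training example: attribute bits plus a label; food for agents."""
    def __init__(self, x, y, bits, label):
        super().__init__(x, y)
        self.bits, self.label = list(bits), label

class Agent(Entity):
    """Carries a fixed genotype (inherited, never changed) and a working
    phenotype (used for classification and updated by learning)."""
    def __init__(self, x, y, genotype, life=500):
        super().__init__(x, y)
        self.genotype = list(genotype)   # attribute bits + 1 label bit
        self.phenotype = list(genotype)
        self.life = life
        self.age = 0

    def classify(self, example):
        # Assumed rule: if the phenotype's attribute bits cover the example,
        # predict the phenotype's label bit; otherwise predict the opposite.
        covered = all(p == 1 for p, e in zip(self.phenotype, example.bits) if e == 1)
        return self.phenotype[-1] if covered else 1 - self.phenotype[-1]

    def process_example(self, example):
        if self.classify(example) == example.label:
            self.life += EAT_BOOST       # correct classification: "eat" it
            return "ate"
        # wrong: learn by flipping a few mismatched phenotype bits toward it
        mismatched = [i for i, e in enumerate(example.bits)
                      if self.phenotype[i] != e]
        for i in random.sample(mismatched, min(LEARN_FLIPS, len(mismatched))):
            self.phenotype[i] = example.bits[i]
        return "learned"

class Helper(Agent):
    pass

class Fighter(Agent):
    pass

h = Helper(0, 0, genotype=[1,0,0, 1,0,0, 1,0, 1,0, 1])
ex = Example(0, 0, bits=[1,0,0, 1,0,0, 1,0, 1,0], label=1)
print(h.process_example(ex))   # -> "ate": the phenotype covers this example
```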
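
Slides 9 through 35 act out the two interaction patterns in cartoon form: helpers swap what they have learned and, once both are above the age threshold, breed, with each parent donating life to the child (which is why the parents on slide 18 drop from 500/800 to 250/550 while the baby starts at 500); fighters beat up weaker agents but breed with sufficiently strong ("wise") helpers, as on slide 35. The 250-per-parent donation is read off those before/after numbers; the age threshold, crossover scheme, fight damage, and function names below are illustrative assumptions.

```python
import random
from dataclasses import dataclass

AGE_THRESHOLD = 100      # assumed breeding age
LIFE_DONATION = 250      # each parent's contribution (slides 18 and 35)

@dataclass
class Agent:
    name: str
    kind: str            # "helper" or "fighter"
    genotype: list
    life: int = 500
    age: int = 0

    def __post_init__(self):
        self.phenotype = list(self.genotype)

def share_knowledge(a, b):
    """Helpers exchange learned bits ("Bit 5 is true" / "Bit 6 is true"):
    here each copies one random phenotype bit from the other."""
    i, j = random.randrange(len(a.phenotype)), random.randrange(len(b.phenotype))
    a.phenotype[j], b.phenotype[i] = b.phenotype[j], a.phenotype[i]

def breed(a, b):
    """If both parents are above the age threshold, produce a child whose
    genotype is a one-point crossover of the parents' genotypes and whose
    starting life is donated by the parents (250 each)."""
    if a.age < AGE_THRESHOLD or b.age < AGE_THRESHOLD:
        return None                     # "You are too young."
    cut = random.randrange(1, len(a.genotype))
    kind = random.choice([a.kind, b.kind])
    a.life -= LIFE_DONATION
    b.life -= LIFE_DONATION
    return Agent(name=kind.capitalize() + "Baby", kind=kind,
                 genotype=a.genotype[:cut] + b.genotype[cut:],
                 life=2 * LIFE_DONATION)

def fight(attacker, defender):
    """A fighter beats up a weaker agent; the loser pays with life (the log on
    slide 40 shows losers dying once their life runs out)."""
    loser = defender if attacker.life >= defender.life else attacker
    loser.life -= 2 * LIFE_DONATION     # assumed damage
    return loser

def meet(a, b):
    """Dispatch the interactions the slides act out."""
    if a.kind == "fighter" and not (b.kind == "helper" and b.life > a.life):
        return fight(a, b)              # bully anyone who isn't a wiser helper
    if a.kind == "helper" and b.kind == "helper":
        share_knowledge(a, b)
    return breed(a, b)                  # the wise helper gets a smooch instead

h1 = Agent("Helper1", "helper", [0,1,1,0,1,0,0], life=500, age=150)
h2 = Agent("Helper2", "helper", [1,1,1,0,1,1,0], life=800, age=200)
baby = meet(h1, h2)
print(h1.life, h2.life, baby.life)      # -> 250 550 500, matching slide 18
```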
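
Finally, the overall loop summarized on slide 36: agents wander the board, eat or learn from examples, interact when they collide, pay a living cost each cycle, and die of starvation when their life runs out. In the authors' system every agent runs on its own thread; the single-threaded loop below is a simplification, and its constants and the random stand-in for classification are illustrative only.

```python
import random

BOARD_SIZE, CYCLES = 25, 1000            # assumed board and run length
LIVING_COST, EAT_BOOST, START_LIFE = 1, 100, 500

class Example:
    def __init__(self):
        self.x, self.y = random.randrange(BOARD_SIZE), random.randrange(BOARD_SIZE)

class Agent:
    def __init__(self):
        self.x, self.y = random.randrange(BOARD_SIZE), random.randrange(BOARD_SIZE)
        self.life, self.age = START_LIFE, 0

    def wander(self):
        """Random walk on the board; staying alive costs life every cycle."""
        self.x = (self.x + random.choice((-1, 0, 1))) % BOARD_SIZE
        self.y = (self.y + random.choice((-1, 0, 1))) % BOARD_SIZE
        self.life -= LIVING_COST
        self.age += 1

    def process(self, example):
        """Stand-in for classify-correctly-then-eat; a real agent would use
        its phenotype here and learn from the example on a miss."""
        if random.random() < 0.5:
            self.life += EAT_BOOST
            return "ate"
        return "learned"

def run():
    agents = [Agent() for _ in range(20)]
    examples = [Example() for _ in range(100)]
    for cycle in range(CYCLES):
        for agent in agents:
            agent.wander()
            for ex in examples:
                if (ex.x, ex.y) == (agent.x, agent.y):
                    agent.process(ex)
            # collisions with other agents would trigger the breeding,
            # helping, and fighting interactions sketched above
        agents = [a for a in agents if a.life > 0]    # starvation
        if not agents:
            print("extinction at cycle", cycle)
            return
    print(len(agents), "agents survived", CYCLES, "cycles")

run()
```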