WCCI06 Vancouver, 27Jul06, Bill Howell – slide 1 of 21
Genetic Specification of Recurrent Neural Networks: Initial Thoughts
World Congress on Computational Intelligence 2006, Vancouver
Bill Howell, Natural Resources Canada, Ottawa
I. Introduction – Genes and ANNs
Historically:
• Biological inspiration of artificial neural networks, right from the beginning
• Ongoing mutual influence

Outline:
I. Introduction – Genes and ANNs
II. Inspiration for DNA-ANNs
III. What might we hope to achieve with DNA-ANNs?
IV. Recommendations and Star-Gazing
V. Conclusions
I. Introduction – Genes and ANNs
Computational Neuro-Genetic Modelling
1. N. Kasabov, L. Benuskova, S. Wysoski, IJCNN04 & 05
Modeling gene networks for spiking neurons
2. R. Storjohann & G. Marcus, IJCNN05
“Neurogene – Integrated simulation of gene regulation, neural activity and neurodevelopment”
3. J.P. Thivierge & G. Marcus WCCI06
Genetics, growth, & environment
I. Introduction – Genes and ANNs
ANN Challenges
• Better, much faster learning algorithms
– initial specification and evolution of complex architectures
– plasticity versus memory
– robustness versus optimality
• Pre-loading:
– data -> functions -> knowledge -> behaviours
• responses of: viruses, bacteria, microbes, plants
• instinct of: animals, man
I. Introduction – Genes and ANNs
Recurrent Neural Nets (RNNs)
• Due to their recurrent connections, RNNs are a more powerful and general form of ANN
• Problems for which we typically use RNNs are very challenging: modeling, control, approximate dynamic programming
• Interpreting the final structure & weights is even more challenging than for most other ANNs
• Chaotic NNs ...?
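The recurrent connections that give RNNs their extra power over feedforward ANNs can be sketched in a few lines. This is an illustrative toy under assumed sizes and random weights, not any particular RNN from the literature:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden = 3, 5

W_in = rng.normal(scale=0.5, size=(n_hidden, n_in))      # input weights
W_rec = rng.normal(scale=0.5, size=(n_hidden, n_hidden))  # recurrent weights

def rnn_step(h, x):
    """One update: the hidden state h feeds back into itself via W_rec,
    which is what distinguishes an RNN from a feedforward ANN."""
    return np.tanh(W_in @ x + W_rec @ h)

# Drive the network with a short random input sequence; the final state
# depends on the whole history, not just the last input.
h = np.zeros(n_hidden)
for x in rng.normal(size=(10, n_in)):
    h = rnn_step(h, x)
print(h.shape)
```

Because the state carries history forward, training such a net (e.g. by backpropagation through time) is far harder than training a feedforward net, which is the difficulty the later slides return to.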
II. Inspiration for DNA-ANNs
1. Genetic – non-protein-coding DNA and RNA
2. Brain models
3. Artificial Neural Networks – trends
II. Inspiration for DNA-ANNs
Non-protein-coding RNA (npcRNA). Mattick, John S. (UQueensland), "The hidden genetic program of complex organisms", Scientific American, Oct04, pp. 60-67. See http://imbuq.edu.au/groups/mattick
[Figure: gene expression pathway. A DNA gene (exons and introns) is transcribed into a primary RNA transcript, then spliced. The assembled exonic RNA is processed into mRNA and translated into protein – the traditional concept of genes. The intronic RNA is either degraded and recycled, or processed into microRNAs and other noncoding RNAs, which feed into gene regulation and other functions.]
II. Inspiration for DNA-ANNs
Mattick: Cambrian Complexity Explosion. Mattick, John S. (UQueensland), "The hidden genetic program of complex organisms", Scientific American, Oct04, pp. 60-67. See http://imbuq.edu.au/groups/mattick
[Figure: complexity versus time (4,000 to 1,000 million years ago to present). Eubacteria and archaea occupy the unicellular world, followed by single-celled eukaryotes; animals, plants and fungi form the multicellular world, whose sharp rise in complexity is marked "Origin of new regulatory system?".]
II. Inspiration for DNA-ANNs
Finite automaton from DNA mechanisms. Shapiro & Benenson, "Bringing DNA computers to life", Scientific American, May06, pp. 45-51
[Figure: molecular finite automaton (labels from the original diagram): software strands 1 and 2; disease-associated mRNA; active "yes-yes" software molecule; protector strand; FokI enzyme; diagnostic molecule; genes 1–4; inactive drug.]
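The "yes-yes" cascade in the figure behaves like a simple finite automaton: the computation stays in the "yes" state only while each disease marker it checks is present, and only a terminal "yes" releases the drug. The sketch below mimics that logic in ordinary software; the marker names are invented for illustration, not taken from the paper:

```python
def diagnose(markers_detected, required_markers):
    """Two-state automaton in the spirit of the 'yes-yes' molecule:
    each transition consumes one required marker; a single missing
    marker flips the state to 'no' and aborts the cascade."""
    state = "yes"
    for marker in required_markers:
        if marker not in markers_detected:
            state = "no"
            break
    return state == "yes"  # True => release the (otherwise inactive) drug

print(diagnose({"mRNA_1", "mRNA_2"}, ["mRNA_1", "mRNA_2"]))  # True
print(diagnose({"mRNA_1"}, ["mRNA_1", "mRNA_2"]))            # False
```

The point of the analogy for DNA-ANNs: DNA mechanisms can already implement state machines, so genetic specification of computation is not purely metaphorical.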
II. Inspiration for DNA-ANNs
March of the Penguins. S. Pinker, The Language Instinct: How the Mind Creates Language, New York: William Morrow & Company, 1994; Perennial Classics edition, 2000
JUST instinct?
II. Inspiration for DNA-ANNs
Models of the Brain
• Sensory systems
• Motor
• Memory
• Cognition, planning
• Behaviours
II. Inspiration for DNA-ANNs
Trends with ANNs
• Local, incremental learning approaches – neural gas models, evolving connectionist systems
• Multi-phase ANN architectures – extreme learning machines, echo state networks
• Ensemble solutions – and their hierarchies and networks
• Signal processing & information theoretics
• Recurrent Neural Networks (RNNs)
• Evolution of ANNs
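One of the multi-phase architectures listed above, the echo state network, is simple enough to sketch: a fixed random recurrent "reservoir" whose only trained part is a linear readout fitted by least squares. This is a minimal illustrative version under assumed sizes, not a reference implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
n_res, n_in = 50, 1

# Fixed random reservoir, scaled so its spectral radius is below 1
# (a standard condition for the "echo state" property).
W_res = rng.normal(size=(n_res, n_res))
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))
W_in = rng.normal(size=(n_res, n_in))

def run_reservoir(inputs):
    """Collect reservoir states while driving it with an input sequence."""
    h, states = np.zeros(n_res), []
    for x in inputs:
        h = np.tanh(W_res @ h + W_in @ x)
        states.append(h)
    return np.array(states)

# Toy task: predict the next value of a sine wave.
u = np.sin(np.linspace(0, 8 * np.pi, 400)).reshape(-1, 1)
X = run_reservoir(u[:-1])
y = u[1:, 0]
W_out, *_ = np.linalg.lstsq(X, y, rcond=None)  # only the readout is trained
pred = X @ W_out
print(float(np.mean((pred - y) ** 2)))  # small mean-squared error
```

The appeal for this talk is exactly the "multi-phase" split: the expensive recurrent structure is specified up front (here randomly; under a DNA-ANN scheme, genetically), and only a cheap linear phase is learned.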
III. What might we hope to achieve with DNA-ANNs?
• Starting with the right answer!
• Higher levels of abstraction
• Rapid and effective:
– learning (generalisations)
– evolution (restructure for strategies)
• Resource utilisation – reuse of "modules"
• Control & ADP – faster, more reliable, more robust
III. What might we hope to achieve with DNA-ANNs?
Starting with the right answer
• Trivial solution – give me the answer and I'll solve the problem ultra fast!
• Measures of problem similarity - perhaps at higher levels of abstraction, especially when data appears dissimilar (reminiscent of generality of signal processing)
III. What might we hope to achieve with DNA-ANNs?
Higher levels of abstraction
• Problems – decompose & modularise. For example, ANNs can regenerate learned images from noisy data. Can a similar feat be accomplished for problem decomposition/modularisation at abstract levels, to help evolve ensembles of ANNs?
• Occam's razor (the simplest model that explains the data) may NOT always be a good approach with complex systems!?
• Meaning/ logic – as emergent properties
III. What might we hope to achieve with DNA-ANNs?
Rapidity, Resources
• Rapid, effective, safe: training (fit) -> learning (generalize) -> evolving (strategize)
• Resource utilisation – re-use of "functional and connecting modules", functional overloading, multiple simultaneous hypotheses
III. What might we hope to achieve with DNA-ANNs?
Non-linear dynamical systems Modeling and Control
Perhaps the biggest payback for DNA-ANNs would be their application to the special, but important, case of RNNs.
IV. Recommendations and Star-Gazing
Question:
Current algorithms for learning and evolving – are they adequate for more complex hierarchies and ensembles of ANNs, and for more abstract capabilities?
• To some extent - yes?
• I suspect that we are also looking for additional formulations, and that to some extent their initial development may depend on having a set of powerful, predictable and robust "modules" as a starting point.
• Two examples of what this might connect to:
1. Local and global brain models – elegant, powerful ways of building systems
2. Classical AI and symbolic logic are an extreme example of “new” learning formulations for higher-level-abstraction ANNs
IV. Recommendations and Star-Gazing
Artificial Neural Networks (ANNs)
Existing CI capabilities are a basis:
1. Start with a "small-world universal function approximation" collection of ANN and RNN modules (custom built or selected from a variety of problem solutions)
2. Develop "generic interfaces" between combinations of two or more modules, or modules of modules
3. Develop "problem formulation/classification" capabilities (rules, evolutionary strategies, etc.)
4. ANN phase changes (crystalline -> gaseous)
5. Develop learning/evolving strategies that can do points 1 to 4 above
6. Chaos – perhaps scramble through state-space, but DON'T get locked in to pre-existing structures
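Point 2, the "generic interfaces" between modules, can be illustrated with a toy sketch in which every module shares one call signature, so a "module of modules" composes the same way as its parts. All the module names here are hypothetical stand-ins, not components from any actual library:

```python
def scaler(x):
    """Stand-in for a trained pre-processing module."""
    return [v / 10.0 for v in x]

def thresholder(x):
    """Stand-in for a trained classifier module."""
    return [1 if v > 0.5 else 0 for v in x]

def compose(*modules):
    """Build a 'module of modules' exposing the same interface as
    its parts, so compositions can themselves be composed."""
    def pipeline(x):
        for m in modules:
            x = m(x)
        return x
    return pipeline

classify = compose(scaler, thresholder)
print(classify([3.0, 9.0]))  # [0, 1]
```

The design point is that a uniform interface is what lets points 1–3 interact: a library of modules (point 1) is only re-usable at scale if composition (point 2) needs no custom glue per pairing.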
IV. Recommendations and Star-Gazing
Recurrent Neural Networks
• My feeling is that, because of their great power and the difficulty of rapidly training them, RNNs offer a challenge on which DNA-RNNs may show tangible, qualitative benefits beyond merely speeding up training and providing good generalisation.
• Question: Will the "genetic specification" of DNA-RNNs beat hand-crafted libraries (likely the starting point)?
• Play with & observe DNA-RNNs
Conclusions
Biological computational/processing capabilities have always been the holy grail of advanced computing.
As we advance, that brings us to an awareness of the next level of concepts. This process may go on for a long time....
Right now, the genetics revolution is suggestive of DNA-RNNs.