Knowledge Processing (5th Semester, Winter)
TRANSCRIPT
Regression Models and Knowledge Processing
WINTER SEMESTER (5th)
School of Mathematics, Aristotle University of Thessaloniki, 54124
4. Networks and Neural Networks
Ioannis Antoniou, Charalampos Bratsas
[email protected] [email protected]
This educational material is provided under a Creative Commons license.
Network Functions: Evolution, Change, Time
If time is continuous, 𝕋 = [0, +∞) or 𝕋 = (−∞, +∞) = ℝ:
Net Evolution in Continuous Time = Continuous Net Evolution.
If time is discrete, 𝕋 = {0, 1, 2, …} ⊆ ℕ0 or 𝕋 ⊆ ℤ:
Net Evolution in Discrete Time = Discrete Net Evolution.
Net Time and Web Time
Events in the Web are registered and ordered in terms of UTC.
In our model, the relevant events are registered and ordered in terms of UTC:
t denotes the registration time of events, and
𝕋 denotes the set of all possible values of time.
Net Evolution Models: Net Dynamical Systems = Net Dynamics
Stochastic Graph Evolution = Stochastic Net Evolution
Net Evolution Models are a natural mathematical framework for capturing the Functions and Evolution of Complex Distributed Systems:
• Node states as "Mind"
• Node updates as "Mind Update"
• Weights as "Memory"
• Weight updates as "Learning" (Adaptation, Utility)
• Net Evolution as "Reasoning"
Neuro-Economic Representation and Understanding
Time is Money
Network Dynamical Systems

State Dynamics: S^t : (ψ, w) ⟼ S^t(ψ, w) = (ψ(t), w(t))

(ψ(t), w(t)) = (ψκ(t), wκλ(t)) = S^t(ψκ, wκλ) = (Sκ^t(ψ1, ψ2, …, wαβ), Sκλ^t(ψ1, ψ2, …, wαβ)),

the solution of the Dynamical Equation.

Sκ^t(ψ1, ψ2, …, wαβ) is the Activation Dynamics-Algorithm of the node κ
Sκλ^t(ψ1, ψ2, …, wαβ) is the Learning Dynamics-Algorithm of the channel κ ⟶ λ
κ, λ = 1, 2, …, N
Differential Equation for Continuous Time (t ≥ 0 or t real)

Net Flow: (ψ(t), w(t)) = S^t(ψ(0), w(0))

(d/dt) (ψ(t), w(t)) = Φ(ψ(t), w(t))

Componentwise:
(d/dt) ψ1(t) = Φ1(ψ1(t), ψ2(t), …, wαβ(t))
⋮
(d/dt) ψN(t) = ΦN(ψ1(t), ψ2(t), …, wαβ(t))
(d/dt) wκλ(t) = Φκλ(ψ1(t), ψ2(t), …, wαβ(t))

κ, λ = 1, 2, …, N
Difference Equation for Discrete Time (t = 0, 1, 2, … or t integer)

Net Update at discrete steps: (ψ(t+1), w(t+1)) = S(ψ(t), w(t))

Componentwise:
ψ1(t+1) = S1(ψ1(t), ψ2(t), …, wαβ(t))
⋮
ψN(t+1) = SN(ψ1(t), ψ2(t), …, wαβ(t))
wκλ(t+1) = Sκλ(ψ1(t), ψ2(t), …, wαβ(t))

Equivalently, in incremental form:
ψ1(t+1) = ψ1(t) + Φ1(ψ1(t), ψ2(t), …, wαβ(t))
⋮
ψN(t+1) = ψN(t) + ΦN(ψ1(t), ψ2(t), …, wαβ(t))
wκλ(t+1) = wκλ(t) + Φκλ(ψ1(t), ψ2(t), …, wαβ(t))

κ, λ = 1, 2, …, N
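The discrete net update can be sketched in Python; the particular maps Φ below are hypothetical placeholders chosen for illustration, not prescribed by the slides.

```python
import numpy as np

def net_step(psi, w, phi_node, phi_link):
    """One discrete step: psi(t+1) = psi(t) + Phi(psi, w), w(t+1) = w(t) + Phi_kl(psi, w)."""
    return psi + phi_node(psi, w), w + phi_link(psi, w)

# Hypothetical choices of the maps Phi (illustration only):
phi_node = lambda psi, w: 0.1 * (w.T @ psi - psi)    # relax toward the aggregated input
phi_link = lambda psi, w: 0.01 * np.outer(psi, psi)  # Hebb-like weight drift

psi = np.array([1.0, 0.0, 0.5])  # node states psi_1..psi_3
w = np.eye(3)                    # channel weights w_kl
psi1, w1 = net_step(psi, w, phi_node, phi_link)
```

The coupled pair (state update, weight update) mirrors the componentwise form above: activation dynamics for ψ and learning dynamics for w advance together at each step.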
Activation Dynamics
ψκ(t+1) = Sκ(ψ1(t), ψ2(t), …, wαβ(t))
ψκ(t+1) = ψκ(t) + Φκ(ψ1(t), ψ2(t), …, wαβ(t))
1st-order Discrete NDS: for each node v, the node update map Sv depends only on the states associated with the 1-neighborhood of the node.
Direct Causality = Determinism
EXAMPLES: Neural Networks, Cellular Automata
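A cellular automaton is the simplest instance of such a 1st-order discrete NDS; a minimal sketch of an elementary cellular automaton (Wolfram's Rule 110, chosen here as an example) where each cell's update depends only on its 1-neighborhood:

```python
def ca_step(state, rule=110):
    """One synchronous update of an elementary cellular automaton (periodic boundary).
    The new value of each cell depends only on its 1-neighborhood (left, self, right)."""
    n = len(state)
    out = []
    for i in range(n):
        left, center, right = state[(i - 1) % n], state[i], state[(i + 1) % n]
        idx = (left << 2) | (center << 1) | right  # neighborhood pattern as 3-bit index
        out.append((rule >> idx) & 1)              # look up the rule's output bit
    return out

state = [0, 0, 0, 1, 0, 0, 0]
next_state = ca_step(state)
```

A single active cell spreads according to the rule table, exactly the "direct causality" of a deterministic 1-neighborhood update.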
Activation Dynamics: Neural Networks
ψκ(t+1) = Activation[Aggregation(ψν, wαβ)] of Neuron κ
Input → Output
Neuron Model:
• McCulloch W., Pitts W. 1943, A logical calculus of the ideas immanent in nervous activity, Bulletin of Mathematical Biophysics 5, 115–133
• Haykin S. 1999, Neural Networks: A Comprehensive Foundation, Pearson Prentice Hall, New Jersey
• Anthony M. 2001, Discrete Mathematics of Neural Networks: Selected Topics, Society for Industrial and Applied Mathematics, Philadelphia, USA
AGGREGATION Maps — EXAMPLES ("Multiple Linear Model")
Bilinear Aggregation: g(ψν, wαβ) = Σλ ψλ wλκ
Since the early days of the study of NN, the bilinear function has been used to model the aggregated input to each node at time t.
Affine Aggregation = Bilinear Aggregation with bias: g(ψν, wαβ) = Σλ ψλ wλκ + bκ
Used in the Perceptron.
Rosenblatt F. 1958, The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain, Psychological Review 65, 386-408
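A minimal sketch of the affine aggregation g(ψν, wαβ) = Σλ ψλ wλκ + bκ; the numeric values are made up for illustration:

```python
import numpy as np

def aggregate(psi, w, b):
    """Affine aggregation for each node kappa: sum over lambda of psi_lambda * w[lambda, kappa], plus bias b_kappa."""
    return psi @ w + b

psi = np.array([1.0, -1.0, 0.5])   # states of 3 input nodes
w = np.array([[0.2, 0.5],          # w[lambda, kappa]: weight of channel lambda -> kappa
              [0.1, -0.3],
              [0.4, 0.0]])
b = np.array([0.0, 0.1])           # bias per receiving node
z = aggregate(psi, w, b)           # aggregated input to each of the 2 receiving nodes
```

With b = 0 this reduces to the bilinear aggregation; the bias term is what distinguishes the Perceptron's affine version.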
NN Output Activation Maps = Transfer Functions — EXAMPLES
Unit Step (Heaviside) Map:
φκ(x) = θ(x − ακ) = { 1, if x ≥ ακ; 0, if x < ακ }
ακ = the Activation Threshold of the Node κ.
We may also consider Activation Maps with values in [−1, 1]:
φκ(x) = { +1, if x > 0; 0, if x = 0; −1, if x < 0 }
"All or None" Property
McCulloch-Pitts Neuron Model, 1943
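The two threshold activations above, sketched as Python functions:

```python
def heaviside(x, a=0.0):
    """Unit-step (Heaviside) activation with threshold a: 1 if x >= a, else 0."""
    return 1 if x >= a else 0

def all_or_none(x):
    """Activation with values in {-1, 0, +1}: the sign of x ('all or none')."""
    return (x > 0) - (x < 0)
```

Both fire in an all-or-none fashion: the output carries no information about how far the input exceeds the threshold.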
NN Output Activation Maps = Transfer Functions — EXAMPLES
Piecewise Linear Map:
φκ(x) = { 1, if x ≥ +1/2; x, if x ∈ (−1/2, +1/2); 0, if x ≤ −1/2 }
NN Output Activation Maps = Transfer Functions — EXAMPLES
Sigmoid Map (Innovation):
φκ(x) = 1 / (1 + e^(−β(x − ακ)))
ακ = the Activation Threshold of the Node κ
β = the slope parameter of the Sigmoid Map
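A sketch of the sigmoid map; for a large slope β it approaches the unit step at the threshold ακ:

```python
import math

def sigmoid(x, a=0.0, beta=1.0):
    """Logistic activation 1 / (1 + exp(-beta * (x - a))); a = threshold, beta = slope."""
    return 1.0 / (1.0 + math.exp(-beta * (x - a)))

# Steep slope: the map is nearly a unit step at the threshold
values = [sigmoid(x, a=0.0, beta=50.0) for x in (-1.0, 0.0, 1.0)]
```

Unlike the Heaviside map, the sigmoid is differentiable, which is what makes gradient-based weight updates possible.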
![Page 14: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/14.jpg)
NN Output Activation Maps = Transfer Functions — EXAMPLES
Hyperbolic Tangent Map:
φκ(x) = tanh(αx)
Ferrazzi et al. 2007, BMC Bioinformatics 8(Suppl 5):S2, doi:10.1186/1471-2105-8-S5-S2
![Page 15: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/15.jpg)
Sigmoid Distribution (Continuous)
Density: ρ(x) = e^(−x) / (1 + e^(−x))², x real
Distribution function: F(x) = 1 / (1 + e^(−x))
Applications: Population Dynamics, Neural Networks, Learning, Innovations, Opinion Adoption
![Page 16: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/16.jpg)
Aggregation + Output as Utility Models
Neuro-Economics
Advertisement
Sex and Money
Neuro-Theology
![Page 17: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/17.jpg)
Learning Dynamics: Link Update Algorithm
wκλ(t+1) = Sκλ(ψ1(t), ψ2(t), …, wαβ(t)) = wκλ(t) + Φκλ(ψ1(t), ψ2(t), …, wαβ(t))
Constant Weights — No Learning, the simplest case: w(t) = w, wκλ(t) = wκλ
The structure of the Net does not change with time.
The first NN: McCulloch and Pitts 1943, A Logical Calculus of the Ideas Immanent in Nervous Activity, Bull. Math. Biophysics 5, 115-133
McCulloch and Pitts admitted that this is not the case for the Nervous System.
![Page 18: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/18.jpg)
Classical Logic as Boolean Algebra
Definition: a set 𝔏 equipped with the operations (∨, ∧, ≤) satisfying L1–L5 below is called a Lattice.
A Lattice satisfying in addition L6 is a Bounded Lattice;
satisfying also L7, a Complemented Lattice;
satisfying also L8, an Orthocomplemented Lattice;
satisfying also L9, a Distributive Orthocomplemented Lattice = Boolean Algebra.

Law-Property | Formula
L1 Idempotent: A ∨ A = A; A ∧ A = A
L2 Commutative: A ∨ B = B ∨ A; A ∧ B = B ∧ A
L3 Associative: A ∨ (B ∨ Γ) = (A ∨ B) ∨ Γ; A ∧ (B ∧ Γ) = (A ∧ B) ∧ Γ
L4 Absorption: A ∧ (A ∨ B) = A; A ∨ (A ∧ B) = A
L5 Order: A ≤ B ⟺ A = A ∧ B; A ≤ B ⟺ B = A ∨ B
L6 Bounded: O ∨ A = A; I ∨ A = I; O ∧ A = O; I ∧ A = A
L7 Complement: A ∨ Aᶜ = I; A ∧ Aᶜ = O
L8 Orthocomplement: (Aᶜ)ᶜ = A
L9 Distributivity: A ∨ (B ∧ Γ) = (A ∨ B) ∧ (A ∨ Γ); A ∧ (B ∨ Γ) = (A ∧ B) ∨ (A ∧ Γ)
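The laws can be checked mechanically on the Boolean algebra of subsets of a small set Y; a brute-force sketch over all 8 subsets of a 3-element set:

```python
from itertools import combinations

Y = frozenset({1, 2, 3})
subsets = [frozenset(c) for r in range(len(Y) + 1) for c in combinations(Y, r)]

def comp(A):
    """Complement within Y."""
    return Y - A

# Distributivity (L9), orthocomplement (L8) and complement (L7) hold for all subsets:
distributive = all(
    (A | (B & C)) == ((A | B) & (A | C)) and (A & (B | C)) == ((A & B) | (A & C))
    for A in subsets for B in subsets for C in subsets
)
involutive = all(comp(comp(A)) == A for A in subsets)
complemented = all((A | comp(A)) == Y and (A & comp(A)) == frozenset() for A in subsets)
```

Here ∨, ∧, O, I correspond to union, intersection, the empty set and Y, in line with the theorem on subset algebras below.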
![Page 19: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/19.jpg)
Theorems
1. The subsets of a set Y form the Boolean Algebra (ℬ[Y], ∪, ∩, ᶜ, ⊆, Y).
2. Stone Representation Theorem: every abstract Boolean Algebra is representable as a Boolean Algebra of subsets.
Stone M. 1936, The Theory of Representations for Boolean Algebras, Trans. Amer. Math. Soc. 40, 37-111
3. Shannon Representation Theorem: every abstract Boolean Algebra is representable as a Boolean Algebra of switching circuits.
Shannon C. 1938, A Symbolic Analysis of Relay and Switching Circuits, AMS Transactions 57, 713-723 (MIT Master's Thesis)
Lee C. 1959, Representation of Switching Circuits by Binary Decision Diagrams, Bell System Techn. J. 38, 985-999
Causal Networks
Belief Networks
![Page 20: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/20.jpg)
Learning Dynamics
Hebb Learning (multi-linear aggregation):
wακ(t+1) − wακ(t) = ζ(t) ψα(t) Σν ψν(t) wνκ(t)
ζ(t) = the Learning rate
Simplest Hebb Learning: wακ(t+1) − wακ(t) = ζ ψα(t) ψκ(t), with ζ = the constant Learning rate
Hebb D.O. 1949, The Organization of Behavior, New York: John Wiley
The simple Hebbian Learning Rule is unstable, because the synaptic weights increase or decrease exponentially.
Euliano N. 1999, Neural and Adaptive Systems: Fundamentals Through Simulations, Wiley
Hence the need for generalizations.
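The instability is easy to see numerically. In this made-up one-channel example (a single self-exciting node with constant input), the simplest Hebb rule makes the weight grow geometrically, like (1 + ζ)^t:

```python
def hebb_run(steps, zeta=0.5, w0=0.1):
    """Iterate the simplest Hebb rule w += zeta * psi_pre * psi_post,
    with psi_pre = 1 and psi_post = w * psi_pre (bilinear aggregation)."""
    w = w0
    for _ in range(steps):
        psi_post = w * 1.0       # activation driven by the weight itself
        w += zeta * 1.0 * psi_post
    return w

w_short, w_long = hebb_run(5), hebb_run(50)  # geometric growth: no stable weight
```

After 50 steps the weight has exploded by many orders of magnitude, which is exactly the instability the slide attributes to the simple Hebbian rule.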
![Page 21: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/21.jpg)
Learning Dynamics
Oja Learning: wακ(t+1) = wακ(t) + ζ(t) [ψα(t) − wακ(t) Σν ψν(t) wνκ(t)] Σν ψν(t) wνκ(t)
ζ(t) = the Learning rate
NN Learning by Supervision: the Widrow-Hoff rule or Delta rule.
The adjustment of the weights wακ depends not on the actual activation xκ(t) of the node κ, but on the difference between the actual activation xκ(t) and the desired activation dκ provided by a teacher:
wακ(t+1) = wακ(t) + η xα(t) [dκ(t) − xκ(t)]
For a constant desired activation: wακ(t+1) = wακ(t) + η xα(t) [dκ − xκ(t)]
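Both rules can be sketched as one-step vector updates for a single output node (the inputs x and targets here are made up for illustration):

```python
import numpy as np

def oja_update(w, x, zeta=0.1):
    """Oja rule: w += zeta * y * (x - y * w), with y = w . x; keeps the norm of w bounded."""
    y = w @ x
    return w + zeta * y * (x - y * w)

def delta_update(w, x, d, zeta=0.1):
    """Widrow-Hoff (delta) rule: adjust by the error between desired d and actual output."""
    return w + zeta * x * (d - w @ x)

x = np.array([1.0, 0.0])
w = np.array([0.1, 0.1])
for _ in range(500):
    w = oja_update(w, x)            # converges to a unit-norm weight vector

w_delta = np.zeros(2)
for _ in range(200):
    w_delta = delta_update(w_delta, x, d=0.5)  # the output w . x converges to d
```

In contrast to the simple Hebb rule, Oja's correction term keeps the weight norm bounded, and the delta rule drives the output toward the teacher's target rather than growing without limit.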
![Page 22: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/22.jpg)
Learning
Hebb Learning: Experimental Evidence
Some synaptic changes observed by Eric Kandel (Nobel Prize in Physiology or Medicine, 2000) provide examples of Hebbian learning in the marine gastropod Aplysia californica.
Antonov I., Antonova I., Kandel E., Hawkins R. 2003, Activity-Dependent Presynaptic Facilitation and Hebbian LTP are Both Required and Interact during Classical Conditioning in Aplysia, Neuron 37 (1): 135–147
Kandel E. 2007, In Search of Memory: The Emergence of a New Science of Mind, New York: W. W. Norton & Company
![Page 23: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/23.jpg)
Memory and Learning Dynamics, 2016
Poo M., et al. 2016, What is memory? The present state of the engram, BMC Biology 14:40, DOI: 10.1186/s12915-016-0261-6
![Page 24: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/24.jpg)
Stochastic NN: Stochastic Activation
Stochastic McCulloch-Pitts Neuron Model:
φκ(z) = { +1, with probability p(z); −1, with probability 1 − p(z) }
p(z) = 1 / (1 + e^(−βz))
Haykin S. 1999, Neural Networks: A Comprehensive Foundation, Prentice Hall
Boltzmann Machines: the activation is the probability of generating an action-potential spike, determined via a logistic function of the sum of the inputs to a unit.
Ackley D. H., Hinton G. E., Sejnowski T. J. 1985, A Learning Algorithm for Boltzmann Machines, Cognitive Science 9 (1): 147–169, DOI:10.1207/s15516709cog0901_7
Hinton G. E., Osindero S., Teh Y. 2006, A fast learning algorithm for deep belief nets, Neural Computation 18 (7): 1527–1554, DOI:10.1162/neco.2006.18.7.1527, PMID 16764513
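A sampling sketch of the stochastic McCulloch-Pitts activation; the input z and slope β are arbitrary illustration values:

```python
import math
import random

def stochastic_neuron(z, beta=1.0, rng=random):
    """Fire +1 with probability p(z) = 1/(1 + exp(-beta*z)), else -1."""
    p = 1.0 / (1.0 + math.exp(-beta * z))
    return 1 if rng.random() < p else -1

random.seed(0)  # reproducible sampling
samples = [stochastic_neuron(2.0, beta=5.0) for _ in range(1000)]
```

For a strongly positive input the unit fires +1 almost always, recovering the deterministic all-or-none neuron in the limit of large β.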
![Page 25: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/25.jpg)
Neural Nets
NN are Information Processors able to learn from observed data, with or without a supervisor, rather than having to be programmed.
NN consist of interconnected Processing Units called Neurons.
NN are defined by:
• the Architecture: the (directed) Graph of the Connections and the Weight (strength) of the Connections
• the Activation Dynamics (Activation Update)
• the Learning Dynamics (Weight Update)
NN are useful when the solution of a problem of interest is difficult due to:
• lack of physical/statistical understanding of the problem
• statistical variations in the observable data
• complex (for example non-linear) mechanisms underlying the data
![Page 26: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/26.jpg)
Neural Nets
Supervised Learning:
1) a Supervisor provides the target (desired) outputs as objectives;
2) the weights are adjusted according to the difference between the target (desired) and the actual outputs for a given input.
Example: Perceptrons
Supervised Learning Applications:
• function approximation
• regression analysis
• time series prediction
• inferring a function from observations
• inferring a model from observations (system identification)
• estimating and modeling the input/output response operators underlying the data
![Page 27: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/27.jpg)
Neural Nets
Unsupervised Learning:
1) the target (desired) outputs are not specified;
2) the weights are adjusted to cluster the inputs into groups with similar features.
Examples: Kohonen Net = Self-Organizing Maps, AR = Adaptive Resonance, CL = Competitive Learning
Kohonen T. 1995, Self-Organizing Maps, Springer-Verlag, Berlin
Unsupervised Learning Applications:
• estimation problems
• source separation
• classification
• filtering (e.g. e-mail spam filtering)
• clustering, categorization
• signal separation
• compression
• chaos identification in time series
![Page 28: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/28.jpg)
Neural Nets
Reinforcement Learning (learning through interaction with the environment; relevance feedback): external evaluation criteria learned by mutations.
The Environment acts as Supervisor (Markov model).
Example: Genetic Algorithms
Reinforcement Learning Applications:
• pattern recognition (radar systems, face identification, object recognition)
• sequence recognition (gesture, speech, handwritten text recognition)
• pattern association
• novelty detection
• game playing, decision making (backgammon, chess, racing, medical diagnosis, investments)
• control (vehicle control, process control)
• optimization
• data mining (knowledge discovery in databases, "KDD")
• search
Pham D., Xing L. 1995, Neural Networks for Identification, Prediction and Control, Springer-Verlag, Berlin
![Page 29: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/29.jpg)
Neural Nets
Committee of NN: a collection of different NN that together "vote" on a given example.
This generally gives a much better result than other neural network models.
Starting with the same architecture and training but using different initial random weights gives a variety of vastly different NN.
Committees tend to stabilize the result.
Committee Learning is similar to general NN learning; variety is obtained by training the NN Committee Members from different random starting weights rather than from different randomly selected subsets of the training data.
![Page 30: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/30.jpg)
NEURAL NETS are Cognitive: Intelligent Processing
Cognition = Knowledge Acquisition is today carried out by Artificial Neural Networks simulating the Brain:
Comprehension
Thinking, Reasoning:
• Inference
• Decision Making
• Abstraction
• Generalization and Discrimination
• Problem Solving
• Classification
• Categorization
• Summarization
• Translation
Learning
![Page 31: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/31.jpg)
Deep Learning
• a cascade of a hierarchy of many layers
• each successive layer uses the output of the previous layer as input
• unsupervised (for pattern analysis) or supervised (for classification)
• learning of feature representations in each layer
• higher-level features are derived from lower-level features (hierarchical representation)
• learning of multiple levels of representations corresponding to different levels of abstraction; the levels form a hierarchy of concepts
http://googleresearch.blogspot.co.uk/2016/01/teach-yourself-deep-learning-with.html
https://www.udacity.com/course/deep-learning--ud730
![Page 32: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/32.jpg)
Impression Formation
How individual pieces of information about another person or issue are integrated to form a global impression of the individual.
Neuron Model — Information Integration Theory, Cognitive Algebra
Anderson N. 1971, Integration Theory and Attitude Change, Psychological Review 78, 171–206
Anderson N. 1981, Cognitive Algebra: Integration Theory Applied to Social Attribution, in L. Berkowitz (Ed.), Advances in Experimental Social Psychology (Vol. 7, pp. 1–101), Academic Press, New York
Anderson N. 2013, Unified Psychology Based on Three Laws of Information Integration, Review of General Psychology 17, 125–132
![Page 33: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/33.jpg)
Memory, Cognition, Emotion, Imagination, Will
Awareness-Consciousness of:
• Sensing: seeing, hearing, smelling, tasting, touching, body
• Cognition
• Emotion
• Imagination
• Errors-Illusions
• Will
• Mind-Self (as Software)
are supported by the Nervous System, most probably within the Association Area.
Emotion, Imagination, Will and Awareness-Consciousness have not been shown (yet, despite the effort) to be functionalities of mathematical NEURAL NETS or other NETS.
![Page 34: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/34.jpg)
Boolean Networks = SHANNON GRAPH = Binary Decision Diagram (BDD): propositional directed acyclic graphs (PDAG), data structures representing Boolean functions.
Logical operations on SGs: many logical operations on SGs can be implemented by polynomial-time graph algorithms:
• conjunction
• disjunction
• negation
• existential abstraction
• universal abstraction
Repeating these operations several times may, in the worst case, require exponential time.
![Page 35: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/35.jpg)
Random Boolean Networks (RBN) = Kauffman Networks, proposed as models of Genetic Regulatory Networks.
Kauffman S. A. 1969, Metabolic Stability and Epigenesis in Randomly Constructed Genetic Nets, Journal of Theoretical Biology 22, 437-467
Kauffman S. A. 1993, Origins of Order: Self-Organization and Selection in Evolution, Oxford University Press, ISBN 0-19-507951-5
An RBN is a system of N binary-state nodes (representing genes) with K inputs to each node, representing regulatory mechanisms. The two states (on/off) represent the status of a gene being active or inactive, respectively. The state of the network at any point in time is given by the current states of all N genes. Simulation of RBNs is performed in discrete time steps.
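A minimal RBN simulator, following the description above: each node gets K random inputs and a random Boolean function. The parameters N = 6, K = 2 and the initial state are illustrative choices:

```python
import random

def random_boolean_network(n, k, seed=0):
    """Build an RBN: each node gets k random input nodes and a random Boolean function
    given as a truth table of length 2**k."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def rbn_step(state, inputs, tables):
    """Synchronous update: each node reads its k inputs and looks up its truth table."""
    new = []
    for ins, table in zip(inputs, tables):
        idx = 0
        for i in ins:
            idx = (idx << 1) | state[i]  # pack input bits into a table index
        new.append(table[idx])
    return new

inputs, tables = random_boolean_network(n=6, k=2)
state = [0, 1, 0, 1, 1, 0]
trajectory = [state]
for _ in range(10):
    state = rbn_step(state, inputs, tables)
    trajectory.append(state)
```

Since the state space is finite (2^N states) and the update is deterministic, every trajectory eventually enters a cycle, which is what makes attractors the central object of study in Kauffman's model.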
![Page 36: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/36.jpg)
Bayesian Networks, Graphical Models
P[A=α | B=β] = the degree of belief that the variable A has the value α, given that B=β.
Bayes rule: P[Ξ | Η] = P[Ξ∩Η] / P[Η] = P[Ξ] P[Η|Ξ] / P[Η]
For a Bayesian network, the joint probability factorizes over the parents of each node ν:
P[x1, x2, …, xN] = Πν P[xν | parents(xν)]
Example:
P[x1, x2, x3, x4, x5] = P[x1] P[x2] P[x3 | x1, x2] P[x4 | x3] P[x5 | x3, x4]
In general, however, the Multiplicative Theorem holds:
p[x1, x2, …, xN] = p[xN | xN−1, …, x2, x1] ⋯ p[x3 | x2, x1] p[x2 | x1] p[x1], N = 2, 3, …
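The Multiplicative Theorem can be checked on a small made-up joint distribution of two binary variables:

```python
from itertools import product

# A made-up joint distribution over two binary variables x1, x2
joint = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

def p1(x1):
    """Marginal P[x1]."""
    return sum(p for (a, _), p in joint.items() if a == x1)

def p2_given_1(x2, x1):
    """Conditional P[x2 | x1] = P[x1, x2] / P[x1]."""
    return joint[(x1, x2)] / p1(x1)

# Multiplicative theorem for N = 2: P[x1, x2] = P[x2 | x1] P[x1]
ok = all(abs(joint[(a, b)] - p2_given_1(b, a) * p1(a)) < 1e-12
         for a, b in product((0, 1), repeat=2))
```

A Bayesian network simply prunes this general chain-rule factorization: each conditional keeps only the parents of the node instead of all its predecessors.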
![Page 37: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/37.jpg)
Bayesian Networks, Graphical Models
The graph must be acyclic:
P[x3 | x2] P[x1 | x3] P[x2 | x1] is not a consistent probability.
![Page 38: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/38.jpg)
Bayesian Networks, Graphical Models
Links indicate Direction, Causality, Temporal succession.
GM include:
• Markov Nets
• Hidden Markov Nets
• Kalman Filters
• Neural Nets
Koller D., Friedman N. 2009, Probabilistic Graphical Models: Principles and Techniques, MIT Press
Darwiche A. 2009, Modeling and Reasoning with Bayesian Networks, Cambridge
Jensen F. 2001, Bayesian Networks and Decision Graphs, Springer
![Page 39: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/39.jpg)
Propagation, Inference, Causality
Epidemics, Belief-Fame, Gossip, Evidence, Navigation, Percolation, Games
Probabilistic Inference and Belief Propagation are NP-hard as general computational problems.
Cooper G. 1990, The Computational Complexity of Probabilistic Inference Using Bayesian Belief Networks, Artificial Intelligence 42, 393-405
Cooper G., Herskovits E. 1991, Determination of the Entropy of a Belief Network is NP-Hard, Knowledge Systems AI Laboratory, Stanford, CA, KSL Technical Report 90-21
There is no systematic solution.
Belief Propagation is tractable in sparse graphs.
Many algorithms are available.
Treatment of cycles is possible in many cases.
![Page 40: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/40.jpg)
Spreading Activation:
a method for searching Semantic Networks and Brain Networks.
The search process is initiated by labeling a set of source nodes
(e.g. concepts in a semantic network) with "activation levels " and
then iteratively propagating or "spreading" that activation out to other nodes linked to the source nodes.
Most often these "activation levels" are real values that decay as activation propagates through the network.
When the activation levels are discrete this process is often referred to as marker passing.
Activation may originate from alternate paths, identified by distinct markers, and terminate when two alternate paths reach the same node.
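A sketch of spreading activation with real-valued levels that decay per hop, on a hypothetical small semantic net (the node names and decay factor are made up for illustration):

```python
def spread_activation(adjacency, sources, decay=0.5, steps=3):
    """Label the source nodes with activation 1.0, then iteratively spread
    activation along links, decaying by a constant factor per hop."""
    activation = {node: 0.0 for node in adjacency}
    for s in sources:
        activation[s] = 1.0
    for _ in range(steps):
        new = dict(activation)
        for node, neighbors in adjacency.items():
            for nb in neighbors:
                new[nb] = max(new[nb], activation[node] * decay)
        activation = new
    return activation

# Hypothetical semantic net: concept -> semantically linked concepts
net = {"dog": ["animal", "pet"], "animal": ["living"], "pet": [], "living": []}
act = spread_activation(net, sources=["dog"])
```

Nodes farther from the source end up with geometrically smaller activation, so ranking nodes by activation level approximates semantic relatedness to the source concepts.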
![Page 41: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/41.jpg)
Causal Nets
• Accessibility = Causal Link
• The Universe as a Causal Net (Relativity)
• Causal Sets
• Causal Sets Triangulation
![Page 42: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/42.jpg)
Semantic Nets, Mind Nets, Decision Nets
• Nodes: the Concepts
• Links: the Semantic Links, expressing Semantic Relations (Natural Language)
Knowledge Ontologies = Semantic Nets + Reasoning Rules
Semantic Web, Folksonomies
Avoid Babel
![Page 43: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/43.jpg)
Semantic Processing
![Page 44: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/44.jpg)
The Web of Linked Data
• Anyone can publish data.
• Entities (data collections) are linked to one another, creating a global graph of semantically interlinked data.
• The data describe themselves: if an application encounters data described by an unknown vocabulary, it can resolve the URIs that identify the vocabulary terms to locate their definition in RDFS or OWL.
• The Web of Data is open: applications can discover new data sources at run time by following links.
![Page 45: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/45.jpg)
Open Data have the greatest "value" in terms of their potential for exploitation by applications.
![Page 46: Επεξεργασία Γνώσης Knowledge Processing (5ο WINTER …cosal.auth.gr/iantonio/sites/default/files... · Kandel. 2007, In Search of Memory: The Emergence of a New](https://reader031.vdocuments.mx/reader031/viewer/2022040411/5eda9ff809f66a09130ba694/html5/thumbnails/46.jpg)
The Generations of the Web

| Era | Description | Source of Added Value |
|---|---|---|
| Pre-Web, 1980 | Desktop computers | Computation |
| Web 1.0, 1990 | Posting of documents: Provider → User; the Browser | Computation + interlinking of documents |
| Web 2.0, 2000 | Posting of documents: Provider ⇆ User; Social Web; interactive communication; "barbed wire in the meadows of Cyberspace" | Web 1.0 + Wikipedia + Information Society: exchange of experience, information, and errors |
| Web 3.0, 2010 | Semantic Web: the Ontology (Semantic Network); machine knowledge processing; cloud of Linked Data; Internet of Things (computers, smartphones, fridges, ovens, sensors, animals) | Web 2.0 + interlinking of concepts + reasoning + error reduction through disambiguation and discrimination |
Aristotle's Ontology, 330 BC
«Την κατά µηδεµίαν συµπλοκήν λεγοµένων έκαστον ήτοι ουσίαν σηµαίνει ή ποσόν ή ποιόν τι ή πού ή ποτέ ή κείσθαι ή έχειν ή ποιείν ή πάσχειν. Έστι δε ουσία µεν ως τύπω ειπείν οίον άνθρωπος, ίππος· ποσόν δε οίον δίπηχυ, τρίπηχυ· ποιόν δε οίον λευκόν, γραµµατικόν· προς τι δε οίον διπλάσιον, ήµισυ, µείζον· πού δε οίον εν Λυκείω, εν αγορά· ποτέ δε οίον χθές, πέρυσιν· κείσθαι δε οίον ανάκειται, κάθηται· έχειν δε οίον υποδέδεται, ώπλισται· ποιείν δε οίον τέµνειν, καίειν· πάσχειν δε οίον τέµνεσθαι, καίεσθαι.» [Αριστοτελης, Κατηγοριαι 1, 4]
Each expression said without any combination signifies either substance, or quantity, or quality, or relation, or where, or when, or position, or having, or action, or affection. Substance is, roughly speaking, e.g. man, horse; quantity e.g. two cubits, three cubits; quality e.g. white, grammatical; relation e.g. double, half, greater; where e.g. in the Lyceum, in the agora; when e.g. yesterday, last year; position e.g. lying, sitting; having e.g. is shod, is armed; action e.g. to cut, to burn; affection e.g. to be cut, to be burned. [Aristotle, Categories 1, 4]
Aristotle's Ontology

| ΚΑΤΗΓΟΡΙΑ | CATEGORY | EXAMPLE |
|---|---|---|
| Οὐσία (substance) | Substance | man, horse (the fundamental general concepts) |
| Ποσόν (quantity) | Quantity | two cubits, three cubits |
| Ποιόν (quality) | Quality | white, educated |
| Πρός τι (relation) | Relation | double, half, greater |
| Πού (place) | Where | in the Lyceum, in the agora |
| Πότε (time) | When | yesterday, last year |
| Κεῖσθαι (position, placement) | Position | lying down, sitting |
| Ἔχειν (state) | Having | is shod, is armed |
| Ποιεῖν (action) | Action | to cut, to burn |
| Πάσχειν (affection) | Passion | to be cut, to be burned |
Semantic Network

Bostantzoglou's Ontology, 1962 — General Concepts:
Existence, Relation, Unity, Order, Quantity, Number, Time, Space, Dimension, Shape, Energy, Motion
Material Beings: inanimate, animate
Immaterial Beings: Mind, Will, Action, Values, Emotion, Ethos, God

Roget's Thesaurus Ontology, 2004:
WORDS EXPRESSING ABSTRACT RELATIONS — Existence, Relation, Quantity, Order, Number, Time, Change, Causation
WORDS RELATING TO SPACE — Space, Dimensions, Form, Motion
WORDS RELATING TO MATTER — Matter, Inorganic Matter, Organic Matter
WORDS RELATING TO THE INTELLECT — Ideas, Formation of Ideas, Communication of Ideas
WORDS RELATING TO VOLITION — Individual Volition, Intersocial Volition
WORDS RELATING TO AFFECTIONS — Personal Affections, Sympathetic Affections, Moral Affections, Religious Affections
Semantic Network
Semantic Network
Semantic Network
διὰ γὰρ τὴν ἀλλήλων τῶν εἰδῶν συμπλοκὴν ὁ λόγος γέγονεν ἡμῖν.
Only by the mutual interweaving of concepts do we attain discourse of reason.
[Plato, Sophist 259e]
τῶν ὄντων τὰ μὲν αὐτὰ καθ' αὑτά, τὰ δὲ πρὸς ἄλλα ἀεὶ λέγεσθαι.
Of the things that are, some are spoken of in themselves, others always relative to other things.
[Plato, Sophist 254a]
SOURCE NODES = initial concepts = beings in themselves (καθ' αὑτά)
INTERMEDIATE NODES = concepts defined through relation-communication with other concepts = beings relative to others (πρὸς ἄλλα)
OBSERVABLE NODES: Σ_λ (α_νλ + α_λν) = deg(ν) > 0
UNOBSERVABLE NODES: Σ_λ (α_νλ + α_λν) = deg(ν) = 0
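The observability condition above can be checked directly from an adjacency matrix; the 4-node directed net below is a hypothetical illustration:

```python
# Hypothetical directed semantic net: adj[v][l] = 1 iff there is a link v -> l.
adj = [
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 0],
    [0, 0, 0, 0],  # node 3 has no links at all
]
n = len(adj)

# deg(v) = sum over l of (a_vl + a_lv): total out-degree plus in-degree.
deg = [sum(adj[v][l] + adj[l][v] for l in range(n)) for v in range(n)]

# A node is observable iff it has at least one incoming or outgoing link.
observable = [v for v in range(n) if deg[v] > 0]
unobservable = [v for v in range(n) if deg[v] == 0]
```

Here nodes 0, 1, 2 are observable and the isolated node 3 is unobservable: it participates in no semantic relation.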
What Changed
• Collective distributed intelligence (Mind)
• New business models
• Evolution of advertising
• Production and distribution of products
Knowledge Transfer Nets — Knowledge = Justified True Belief (Plato, Theaetetus)

«Την μετα λόγου αληθή δόξαν επιστήμη είναι, την δε άλογον εκτός επιστήμης· και ων μεν μη εστι λόγος, ουκ επιστητά είναι» [Πλατων, Θεαιτητος 201d]
True opinion accompanied by an account (logos) is knowledge; opinion without an account lies outside knowledge, and things of which there is no account are not knowable.

ὡς ἄρα τὴν διαφορὰν ἑκάστου ἂν λαμβάνῃς ᾗ τῶν ἄλλων διαφέρει, λόγον, ὥς φασί τινες, λήψῃ· ἕως δ' ἂν κοινοῦ τινος ἐφάπτῃ, ἐκείνων πέρι σοι ἔσται ὁ λόγος ὧν ἂν ἡ κοινότης ᾖ. [Πλατων, Θεαιτητος 208d]
If you grasp the difference by which each thing differs from everything else, you will, as some say, grasp its account; but as long as you touch only something it has in common with others, your account will be about all the things that share that commonality.

Logos = the specific difference, discrimination (in depth and breadth)

«(Αι δόξαι αι αληθείς) ου πολλού άξιαι εισίν, εως αν τις αυτάς δήση αιτίας λογισμώ… και δια ταυτα δη τιμιωτερον επιστημη ορθης δοξης εστιν» [Πλατων, Μενων 97e–98a]
(True opinions) are not worth much until one ties them down by reasoning out the cause… and this is why knowledge is more valuable than right opinion.
Knowledge Nets and Opinion-Consensus Nets

Knowledge level ψκ as the activation of node κ.
Link with the nodes λ of maximum knowledge difference ψλ − ψκ: knowledge acquisition.
→ Increase Knowledge

Opinion ψκ as the activation of node κ.
Link with the nodes λ of minimum opinion difference ψλ − ψκ: homophily.
→ Reach Consensus
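The consensus rule can be sketched as synchronous averaging over neighbours (my own minimal illustration, not code from the course): each node's opinion ψ moves toward the mean opinion of its neighbours, so on a connected net the spread of opinions shrinks toward zero.

```python
def consensus_step(psi, neighbors, rate=0.5):
    """One synchronous update: each node moves toward its neighbours' mean opinion."""
    new = []
    for k, nbrs in enumerate(neighbors):
        if nbrs:
            mean = sum(psi[l] for l in nbrs) / len(nbrs)
            new.append(psi[k] + rate * (mean - psi[k]))
        else:
            new.append(psi[k])  # isolated nodes keep their opinion
    return new

# Hypothetical 3-node path 0 - 1 - 2 with initial opinions 0, 0.5, 1.
neighbors = [[1], [0, 2], [1]]
psi = [0.0, 0.5, 1.0]
for _ in range(50):
    psi = consensus_step(psi, neighbors)
spread = max(psi) - min(psi)  # shrinks toward 0: consensus
```

Replacing the averaging by linking toward the neighbour with the *maximum* difference ψλ − ψκ would instead model the knowledge-acquisition rule, where a node seeks out the most knowledgeable partner.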
Cellular Automata
Partial difference equations: Y(k+1, m+1) − q·Y(k+1, m) − p·Y(k, m) = 0, k, m = 1, 2, 3, …
Wolfram S. 2002, A New Kind of Science, Wolfram Media, Champaign, ISBN 1-57955-008-8
Net of cells with finitely many states (2-adic: ON, OFF).
An initial state (time t = 0) is selected by assigning a state to each cell. A new generation (t + 1) is created according to a fixed rule that determines the new state of each cell from the current state of the cell and the states of the linked cells. Typically, the update rule is the same for every cell, does not change over time, and is applied to the whole grid simultaneously.
Example: a cell is "ON" in the next generation if exactly two of the cells in its neighborhood are "ON" in the current generation; otherwise the cell is "OFF" in the next generation.
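The example rule can be made runnable; the choice of the von Neumann 4-neighbourhood and the small 3×3 grid below are my assumptions, since the slide states the rule without fixing a neighbourhood:

```python
def step(grid):
    """Synchronous update: a cell is ON (1) in the next generation iff
    exactly two of its von Neumann neighbours are ON now."""
    rows, cols = len(grid), len(grid[0])
    new = [[0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            # Count ON neighbours (N, S, W, E), clipping at the grid edges.
            on = sum(grid[x][y]
                     for x, y in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                     if 0 <= x < rows and 0 <= y < cols)
            new[i][j] = 1 if on == 2 else 0
    return new

# Initial generation (t = 0): a horizontal line of three ON cells.
grid = [[0, 0, 0],
        [1, 1, 1],
        [0, 0, 0]]
grid = step(grid)  # only the centre cell has exactly two ON neighbours
```

Note that the rule is applied to every cell simultaneously, as the slide requires: the new grid is built entirely from the old one before it replaces it.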
Cellular Automata Von Neumann 4-Neighbourhood
N
W E
S
Cellular Automata Moore 8-Neighbourhood
Probabilistic Cellular Automata
NW N NE
W E
SW S SE
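Both neighbourhoods are commonly encoded as coordinate offsets; the sketch below (my own) also shows that the von Neumann neighbourhood is exactly the subset of the Moore neighbourhood at Manhattan distance 1:

```python
# Moore 8-neighbourhood: all surrounding cells of (0, 0).
moore = [(di, dj) for di in (-1, 0, 1) for dj in (-1, 0, 1)
         if (di, dj) != (0, 0)]

# Von Neumann 4-neighbourhood: the Moore offsets at Manhattan distance 1.
von_neumann = [(di, dj) for (di, dj) in moore if abs(di) + abs(dj) == 1]
```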
Genetic Algorithms, Evolutionary Strategies, Net Games
Swarms, Agents, Sensor Nets
Collective Intelligence
Immune System Models
The Generations of the Web

| Era | Description | Source of Added Value |
|---|---|---|
| Pre-Web, 1980 | Desktop computers | Computation |
| Web 1.0, 1990 | Posting of documents: Provider → User; the Browser | Computation + interlinking of documents |
| Web 2.0, 2000 | Posting of documents: Provider ⇆ User; Social Web; interactive communication; "barbed wire in the meadows of Cyberspace" | Web 1.0 + Wikipedia + Information Society: exchange of experience, information, and errors |
| Web 3.0, 2010 | Semantic Web: the Ontology (Semantic Network); machine knowledge processing; cloud of Linked Data; Internet of Things (computers, smartphones, fridges, ovens, sensors, animals) | Web 2.0 + interlinking of concepts + reasoning + error reduction through disambiguation and discrimination |
| Web 4.0, 2020 (?) | Knowledge Nets; the Knowledge Cloud as an interoperating entity; Planetary Mind, Noosphere (?) | Web 3.0 + global knowledge + time + interlinking of beings |