Application of Neural Networks (Aplicación de Redes Neuronales)

Upload: solid34

Post on 13-Apr-2018


TRANSCRIPT

7/26/2019 Aplicación de Redes Neuronales


Self-organizing incremental neural network and its application

F. Shen (1), O. Hasegawa (2)

(1) National Key Laboratory for Novel Software Technology, Nanjing University
(2) Imaging Science and Engineering Lab, Tokyo Institute of Technology

June 12, 2009


    Contents of this tutorial

    1 What is SOINN

    2 Why SOINN

    3 Detail algorithm of SOINN

    4 SOINN for machine learning

    5 SOINN for associative memory

    6 References


What is SOINN

SOINN: Self-organizing incremental neural network

Represents the topological structure of the input data
Realizes online incremental learning

Background: Networks for topology representation

SOM (Self-Organizing Map): predefines the structure and size of the network
NG (Neural Gas): predefines the network size
GNG (Growing Neural Gas): predefines the network size; its constant learning rate leads to non-stationary results


Background: Networks for incremental learning

Incremental learning: learning new knowledge without destroying previously learned knowledge (the stability-plasticity dilemma)

ART (Adaptive Resonance Theory): needs a user-defined threshold
Multilayer perceptrons: learning new knowledge destroys old knowledge
Sub-network methods: need plenty of storage


Characteristics of SOINN

Neurons are self-organized, with no predefined network structure and size
Adaptively finds a suitable number of neurons for the network
Realizes online incremental learning without any prior condition
Finds typical prototypes for large-scale data sets
Robust to noise

Detail algorithm of SOINN

Structure: Two-layer competitive network

Two-layer competitive network
First layer: competitive learning on the input data
Second layer: competitive learning on the output of the first layer
Output: the topology structure and weight vectors of the second layer
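The two-layer data flow above can be sketched as follows. This is a minimal illustration of the pipeline only: `train_soinn_layer` is a hypothetical stand-in that does plain winner-take-all quantization, not the actual SOINN update rules.

```python
import numpy as np

def train_soinn_layer(data, n_nodes, seed=0):
    # Hypothetical stand-in for one competitive layer: simple
    # winner-take-all vector quantization, shown only to make the
    # layer-1 -> layer-2 data flow concrete.
    rng = np.random.default_rng(seed)
    nodes = data[rng.choice(len(data), size=n_nodes, replace=False)].copy()
    for _ in range(20):                 # a few competitive passes
        for x in data:
            winner = np.argmin(np.linalg.norm(nodes - x, axis=1))
            nodes[winner] += 0.1 * (x - nodes[winner])  # pull winner toward input
    return nodes

# First layer competes for the raw input data...
data = np.random.default_rng(1).normal(size=(200, 2))
layer1_nodes = train_soinn_layer(data, n_nodes=20)
# ...second layer competes for the output (node weights) of the first layer.
layer2_nodes = train_soinn_layer(layer1_nodes, n_nodes=5)
```

The point of the two stages is that the second layer quantizes an already-denoised representation, which is why SOINN reports the second layer's topology and weights as its output.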


Training flowchart of SOINN

Adaptively updated threshold
Between-class insertion
Update weights of nodes
Within-class insertion
Remove noise nodes
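The steps in the flowchart above can be sketched as one training step. This is a schematic outline under simplifying assumptions: the node list, threshold list, and edge set are hypothetical encodings; the threshold refresh is a stand-in for the adaptive rule given later in the tutorial; and the periodic within-class insertion and noise-node removal are omitted.

```python
import numpy as np

def soinn_step(x, nodes, thresholds, edges):
    """One schematic SOINN training step (simplified sketch, not the
    full algorithm). nodes: list of weight vectors; thresholds: list
    of per-node similarity thresholds; edges: set of node-id pairs."""
    x = np.asarray(x, dtype=float)
    if len(nodes) < 2:                      # bootstrap: first inputs become nodes
        nodes.append(x)
        thresholds.append(np.inf)           # new nodes start with an infinite threshold
        return "insert"
    dists = [np.linalg.norm(x - n) for n in nodes]
    w1, w2 = np.argsort(dists)[:2]          # winner and second winner
    # Between-class insertion: x lies outside a winner's similarity threshold.
    if dists[w1] > thresholds[w1] or dists[w2] > thresholds[w2]:
        nodes.append(x)
        thresholds.append(np.inf)
        return "insert"
    edges.add(frozenset((int(w1), int(w2))))   # link winner and second winner
    nodes[w1] += 0.5 * (x - nodes[w1])         # update weight of the winner
    # Stand-in threshold refresh; the adaptive rule appears later in the tutorial.
    thresholds[w1] = min(np.linalg.norm(nodes[w1] - n)
                         for j, n in enumerate(nodes) if j != w1)
    # (within-class insertion and noise-node removal run periodically, omitted here)
    return "update"
```

Running it on a few points shows the two branches: nearby inputs merge into existing nodes, while a far-away input triggers a between-class insertion.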


First layer: adaptively updating threshold Ti

Basic idea: within-class distance ≤ Ti ≤ between-class distance.

1. Initialize: Ti = +∞ when node i is a new node.
2. When i is the winner or second winner, update Ti as follows:

If i has neighbors, Ti is updated as the maximum distance between i and all of its neighbors:

    Ti = max_{c ∈ Ni} ||Wi - Wc||    (1)

If i has no neighbors, Ti is updated as the minimum distance between i and all other nodes in network A:

    Ti = min_{c ∈ A\{i}} ||Wi - Wc||    (2)
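Equations (1) and (2) can be sketched directly. This is a minimal illustration under assumed encodings: the network A is a dict mapping node id to weight vector, and the topology is a set of undirected edges given as frozenset pairs.

```python
import numpy as np

def update_threshold(i, weights, edges):
    """Similarity threshold T_i for node i, per Eqs. (1) and (2):
    the maximum distance to i's neighbors if any exist, otherwise the
    minimum distance to every other node in the network.
    weights: dict node id -> weight vector; edges: set of frozenset pairs."""
    neighbors = {j for e in edges if i in e for j in e if j != i}
    if neighbors:                                                  # Eq. (1)
        return max(np.linalg.norm(weights[i] - weights[j]) for j in neighbors)
    return min(np.linalg.norm(weights[i] - weights[j])             # Eq. (2)
               for j in weights if j != i)
```

For instance, a node whose neighbors sit at distances 1 and 3 gets Ti = 3, while the same node with no edges takes the distance to its nearest node instead.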


Second layer: constant threshold Tc

Basic idea 1: within-class distance ≤ Tc ≤ between-class distance.
Basic idea 2: we already have some knowledge of the input data from the results of the first layer.

Within-class distance:

    dw = (1/NC) Σ_{(i,j) ∈ C} ||Wi - Wj||    (3)

Between-class distance of two classes Ci and Cj:

    db(Ci, Cj) = min_{i ∈ Ci, j ∈ Cj} ||Wi - Wj||    (4)
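Equations (3) and (4) can be sketched directly. One stated assumption: NC in Eq. (3) is read as the number of node pairs in class C, so dw is the mean pairwise distance; the list-of-vectors encoding of a class is hypothetical.

```python
import numpy as np
from itertools import combinations

def within_class_distance(weights):
    """Eq. (3): mean pairwise distance over all node pairs of one class,
    taking NC as the number of pairs (an interpretation).
    weights: list of node weight vectors."""
    pairs = list(combinations(weights, 2))
    return sum(np.linalg.norm(wi - wj) for wi, wj in pairs) / len(pairs)

def between_class_distance(class_i, class_j):
    """Eq. (4): minimum node-to-node distance between two classes."""
    return min(np.linalg.norm(wi - wj) for wi in class_i for wj in class_j)
```

These two quantities give the second layer a data-driven scale: the first-layer result tells us roughly how tight each class is and how far apart classes sit, so a single constant Tc can be placed between them.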


Second layer: constant threshold Tc (continued)

1 Set Tc as the minimum between-class distance:

    Tc = db(Ci1, Cj1) = min_{k,l=1,...,Q, k≠l} db(Ck, Cl)    (5)

2 If Tc is less than the within-class distance dw, set Tc as the next minimum between-class distance:

    Tc = db(Ci2, Cj2) = min_{k,l=1,...,Q, k≠l, k≠i1, l≠j1} db(Ck, Cl)    (6)

3 Go to step 2 to update Tc until Tc is greater than dw.
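The threshold-selection loop can be sketched as follows — a minimal, hypothetical illustration assuming the between-class distances db(Ck, Cl) of all class pairs have already been collected into a list:

```python
def constant_threshold(d_w, between_class_distances):
    # Walk the between-class distances from smallest to largest and
    # return the first one that exceeds the within-class distance d_w.
    for d_b in sorted(between_class_distances):
        if d_b > d_w:
            return d_b
    return None  # no between-class distance exceeds d_w
```

Sorting makes the "next minimum" step explicit: each iteration considers the next-smallest between-class distance, exactly as in the update loop above.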


Updating the learning rates ε1(t) and ε2(t)

Update of the weight vectors (ξ: input signal; s1: winner node; Ns1: neighbors of the winner):

    ΔWs1 = ε1(t)(ξ − Ws1)    (8)

    ΔWi = ε2(t)(ξ − Wi)    (i ∈ Ns1)    (9)

After the size of the network becomes stable, fine-tune the network.

Stochastic approximation: a number of adaptation steps with a strength ε(t) decaying slowly, but not too slowly, i.e., Σ_{t=1}^{∞} ε(t) = ∞ and Σ_{t=1}^{∞} ε²(t) < ∞.
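A minimal sketch of one adaptation step, Eqs. (8)–(9). The schedules ε1(t) = 1/t and ε2(t) = 1/(100 t) are illustrative choices (an assumption here, not taken from the slides) that satisfy the stochastic-approximation conditions above:

```python
def adapt(W, s1, neighbors, xi, t):
    # Eq. (8): move the winner s1 toward the input xi with rate eps1.
    # Eq. (9): move each neighbor i of s1 with the smaller rate eps2.
    # eps1 = 1/t and eps2 = 1/(100 t) are illustrative decaying schedules.
    eps1, eps2 = 1.0 / t, 1.0 / (100.0 * t)
    W[s1] = [w + eps1 * (x - w) for w, x in zip(W[s1], xi)]
    for i in neighbors:
        W[i] = [w + eps2 * (x - w) for w, x in zip(W[i], xi)]
```

At t = 1 the winner jumps onto the input (eps1 = 1), while its neighbors move only 1% of the way; as t grows, all movements shrink toward zero.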


Single-layer SOINN

For topology representation, the first layer is enough.

Within-class insertion happens only slightly in the first layer.

Use subclass and density information to judge whether a connection is needed.


Artificial data set: topology representation

Stationary and non-stationary:

Stationary: all training data obey the same distribution.

Non-stationary: the next training sample may obey a different distribution from the previous one.

(Figure: original data; result for stationary input; result for non-stationary input.)
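The two presentation orders can be mimicked with a toy sampler — hypothetical illustration code, not from the slides, where each class is simply a list of samples:

```python
import random

def stationary_stream(classes, n, seed=0):
    # Stationary: every draw comes from the same fixed mixture of classes.
    rng = random.Random(seed)
    return [rng.choice(rng.choice(classes)) for _ in range(n)]

def nonstationary_stream(classes, n_per_class, seed=0):
    # Non-stationary: classes are presented one after another,
    # so the input distribution changes over time.
    rng = random.Random(seed)
    return [rng.choice(c) for c in classes for _ in range(n_per_class)]
```

An incremental learner must handle the second stream without forgetting the classes seen earlier, which is exactly what the non-stationary experiment tests.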


Artificial data set: topology representation (continued)

(Figure: original data; two-layer SOINN result; single-layer SOINN result.)

Conclusion of experiments: SOINN is able to

Represent the topology structure of the input data.

Realize incremental learning.

Automatically learn the number of nodes, de-noise, etc.


    1 What is SOINN

    2 Why SOINN

    3 Detail algorithm of SOINN

    4 SOINN for machine learning

    5 SOINN for associative memory

    6 References


Some objectives of unsupervised learning

Automatically learn the number of classes in the input data

Clustering with no a priori knowledge

Topology representation

Realize real-time incremental learning

Separate classes with a low-density overlapped area


SOINN for unsupervised learning: if two nodes are connected by a path, they belong to one class

1 Run SOINN on the input data and output the topology representation of the nodes.

2 Initialize all nodes as unclassified.

3 Randomly choose one unclassified node i from the node set A. Mark node i as classified and label it as class Ci.

4 Search A to find all unclassified nodes that are connected to node i by a path. Mark these nodes as classified and label them with the same class as node i.

5 Go to step 3 to continue the classification process until all nodes are classified.
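Steps 2–5 amount to labeling the connected components of the topology graph. Below is a minimal sketch — hypothetical helper code using breadth-first search in place of the literal random choice, which does not change the resulting classes:

```python
from collections import deque

def label_classes(nodes, edges):
    # Build the adjacency structure of the topology graph.
    adj = {i: set() for i in nodes}
    for i, j in edges:
        adj[i].add(j)
        adj[j].add(i)
    label, next_class = {}, 0
    for i in nodes:
        if i in label:          # node already classified
            continue
        label[i] = next_class   # pick an unclassified node, start a new class
        queue = deque([i])
        while queue:            # spread the label along every path from i
            u = queue.popleft()
            for v in adj[u]:
                if v not in label:
                    label[v] = next_class
                    queue.append(v)
        next_class += 1         # repeat until all nodes are classified
    return label
```

Two nodes end up with the same label exactly when a path connects them, matching the rule stated in the slide title.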

    F. Shen, O. Hasegawa Self-organizing incremental neural network and its application Contents

    What is SOINNWhy SOINNDetail algorithm of SOINN

    SOINN for machine learningSOINN for associative memory

    References

    Unsupervised learningSupervised learningSemi-supervised learningActive learning

    SOINN f i d l i If t d t d

    http://find/
  • 7/26/2019 Aplicacin de Redes Neuronales

    86/222

    SOINN for unsupervised learning: If two nodes connected

    with one path, the nodes belong to one class

    1 Do SOINN for input data, output topology representation ofnodes

    2 Initialize all nodes as unclassified.3 Randomly choose one unclassified node i from node set A.

    Mark node ias classified and label it as class Ci.

    4 Search A to find all unclassified nodes that are connected to

    node iwith a path. Mark these nodes as classified and labelthem as the same class as node i.

    5 Go to Step3 to continue the classification process until allnodes are classified.

Artificial data set: 5 classes with 10% noise

[Figures: original data (left), clustering result (right)]

Conclusion of experiments

Automatically reports the number of classes.
Perfectly clusters data with different shapes and distributions.
Finds typical prototypes; incremental learning; de-noising; etc.

Face recognition: AT&T face data set

Experiment results

Automatically reports that there are 10 classes.
Prototypes of every class are reported.
With such prototypes, the recognition ratio (1-NN rule) is 90%.
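The recognition ratio above comes from classifying each test face by the label of its nearest prototype. A minimal sketch of the 1-NN rule (the toy prototypes are hypothetical, not the AT&T pipeline):

```python
import math

def nn_classify(x, prototypes, labels):
    """1-NN rule: assign x the label of its nearest prototype."""
    dists = [math.dist(x, p) for p in prototypes]  # Euclidean distances
    return labels[dists.index(min(dists))]

# toy prototypes for two hypothetical classes
protos = [(0.0, 0.0), (1.0, 1.0), (5.0, 5.0)]
labs = ["A", "A", "B"]
```

With real face data, x and the prototypes would be image feature vectors and the labels subject identities; the rule itself is unchanged.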

Prototype-based classifier: based on 1-NN or k-NN rule

Nearest Neighbor Classifier (NNC): all training data as prototypes
Nearest Mean Classifier (NMC): mean of each class as prototypes
k-means classifier (KMC), Learning Vector Quantization (LVQ), and others: predefine the number of prototypes for every class.

Main difficulty

1 How to find enough prototypes without overfitting
2 How to realize incremental learning

Incremental learning of new data inside one class (non-stationary or concept drift);
Incremental learning of new classes.
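These classifiers differ only in how prototypes are chosen; the decision rule stays nearest-neighbor. An NMC sketch, where the single prototype per class is the class mean (illustrative code, not from the tutorial):

```python
import math

def nmc_fit(X, y):
    """Nearest Mean Classifier (NMC): the one prototype of each
    class is the mean of that class's training points."""
    groups = {}
    for xi, yi in zip(X, y):
        groups.setdefault(yi, []).append(xi)
    # per-class coordinate-wise mean
    return {c: tuple(sum(col) / len(pts) for col in zip(*pts))
            for c, pts in groups.items()}

def nmc_predict(x, prototypes):
    """Classify x by the nearest class mean (1-NN over the means)."""
    return min(prototypes, key=lambda c: math.dist(x, prototypes[c]))
```

NNC would instead keep every training point as a prototype; SOINN's goal is the middle ground, learning how many prototypes each class needs.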

SOINN for supervised learning: Targets

Automatically learn the number of prototypes needed to represent every class
Only the prototypes used to determine the decision boundary will be retained
Realize both types of incremental learning
Robust to noise
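The second type of incremental learning, adding new classes, is easy to picture for a prototype classifier: new prototypes are appended without touching the old ones. A toy sketch of that target property (not the SOINN algorithm itself):

```python
import math

class PrototypeClassifier:
    """Toy prototype classifier supporting class-incremental
    learning: new classes are added without retraining old ones."""

    def __init__(self):
        self.protos = []  # list of (vector, label) pairs

    def add_class(self, label, vectors):
        # appending prototypes leaves existing classes untouched
        self.protos.extend((tuple(v), label) for v in vectors)

    def predict(self, x):
        # 1-NN rule over all stored prototypes
        return min(self.protos, key=lambda p: math.dist(x, p[0]))[1]
```

SOINN's supervised variant additionally decides how many prototypes each class needs and prunes those far from the decision boundary.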

    Adjusted SOINN Classifier (ASC)
