Self Organizing Maps (SOM)
Based on Competitive learning
Winner Takes All (WTA)/ Winning neuron
Neurons at nodes of lattice
Form topographic map of input patterns
Spatial locations of neurons indicate intrinsic statistical features of the input patterns
Non-linear generalization of PCA
Inspiration: Human brain
Sensory inputs represented by topologically ordered computational maps
Feature maps
Feature mapping models
Kohonen model
More general
Class of vector coding algorithms
Willshaw–von der Malsburg model
Same input and output dimensions
Not strictly WTA
Properties
Approximation of the input space
Topological ordering
Density matching
Feature selection
Visualization
Contextual/ semantic map
Topological map
Goal
Transform an incoming signal pattern of arbitrary dimension into a one- or two-dimensional discrete map
Process
Competition
Finds the winning neuron
Cooperation
Neighborhood function for lateral interaction
Synaptic Adaptation
Topological ordering by weight updates
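The three stages of the process can be sketched for a single input, assuming a 2-D lattice of weight vectors; the lattice size, sigma, and learning rate below are illustrative choices, not prescribed values:

```python
import numpy as np

rng = np.random.default_rng(0)
rows, cols, dim = 5, 5, 3
W = rng.random((rows, cols, dim))   # weight vector per lattice node
x = rng.random(dim)                 # one input pattern

# Competition: the winning neuron minimizes Euclidean distance to x
dists = np.linalg.norm(W - x, axis=2)
winner = np.unravel_index(np.argmin(dists), dists.shape)

# Cooperation: Gaussian neighborhood around the winner on the lattice
sigma = 1.5
grid = np.indices((rows, cols)).transpose(1, 2, 0)
lattice_dist2 = np.sum((grid - np.array(winner)) ** 2, axis=2)
h = np.exp(-lattice_dist2 / (2 * sigma ** 2))

# Synaptic adaptation: move each weight toward x, scaled by its
# neighborhood strength h (the winner moves most, h = 1 there)
eta = 0.1
W += eta * h[:, :, None] * (x - W)
```

Because h decays with lattice distance, neighboring neurons are pulled toward the same inputs, which is what produces the topological ordering.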
Phases
Self-organizing/ Ordering
Convergence
SOM Algorithm
Components and parameters
Network topology defining the output space
Time-varying neighbourhood function
Non-zero learning rate parameter
Continuous input space of activation patterns
Steps
Initialization
Sampling
Similarity Matching
Updating
Continuation
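The five steps above can be sketched as a minimal training loop; the lattice size, decay schedules, and synthetic data are illustrative assumptions rather than prescribed settings:

```python
import numpy as np

rng = np.random.default_rng(1)
rows, cols, dim, n_iters = 8, 8, 2, 500
X = rng.random((200, dim))          # continuous input space of patterns

# Initialization: random weight vectors, one per lattice node
W = rng.random((rows, cols, dim))
grid = np.indices((rows, cols)).transpose(1, 2, 0).astype(float)

for t in range(n_iters):
    # Sampling: draw an input pattern at random
    x = X[rng.integers(len(X))]
    # Similarity matching: find the best-matching (winning) neuron
    bmu = np.unravel_index(np.argmin(np.linalg.norm(W - x, axis=2)),
                           (rows, cols))
    # Updating: time-varying learning rate and neighborhood width
    eta = 0.5 * np.exp(-t / n_iters)
    sigma = 3.0 * np.exp(-t / n_iters)
    d2 = np.sum((grid - np.array(bmu)) ** 2, axis=2)
    h = np.exp(-d2 / (2 * sigma ** 2))
    W += eta * h[:, :, None] * (x - W)
    # Continuation: repeat until the feature map stops changing appreciably
```

The shrinking sigma reflects the two phases: a wide neighborhood during ordering, then a narrow one during convergence.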
Vector quantization
Technique that exploits underlying structure of input vectors for data compression
Voronoi/ nearest-neighbour quantizer
Vector quantizer with minimum encoding distortion
Computed approximately by SOM Algorithm
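A nearest-neighbour (Voronoi) quantizer can be illustrated with a made-up codebook: each input vector is encoded as the index of its closest codebook vector, which is a lossy compression of the data:

```python
import numpy as np

# Illustrative 4-entry codebook; real codebooks are learned from data
codebook = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])

def encode(x):
    """Index of the nearest codebook (reconstruction) vector."""
    return int(np.argmin(np.linalg.norm(codebook - x, axis=1)))

def decode(i):
    """Reconstruction vector for a code index."""
    return codebook[i]

x = np.array([0.9, 0.2])
i = encode(x)        # index of the Voronoi cell containing x
x_hat = decode(i)    # lossy reconstruction of x
```

In the SOM connection, the trained weight vectors play the role of the codebook, so the map computes an approximate minimum-distortion quantizer.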
Learning vector quantization (LVQ)
Supervised learning technique to improve the quality of classifier decision regions
Form of lossy data compression
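The LVQ idea can be sketched with the LVQ1 update rule: labeled inputs attract the nearest prototype when its class matches and repel it otherwise. The prototypes, data, and learning rate below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
proto = np.array([[0.2, 0.2], [0.8, 0.8]])   # one prototype per class
proto_label = np.array([0, 1])

# Synthetic two-class data: two well-separated Gaussian clusters
X = np.vstack([rng.normal(0.2, 0.05, (50, 2)),
               rng.normal(0.8, 0.05, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
eta = 0.05

for x, label in zip(X, y):
    i = np.argmin(np.linalg.norm(proto - x, axis=1))  # nearest prototype
    if proto_label[i] == label:
        proto[i] += eta * (x - proto[i])   # correct class: attract
    else:
        proto[i] -= eta * (x - proto[i])   # wrong class: repel
```

In practice the prototypes are often initialized from a trained SOM or k-means codebook; the supervised updates then sharpen the decision boundaries between classes.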