Self-Organizing Maps – Kohonen Maps
A Self-Organizing Map (or Kohonen Map, or SOM) is a type of artificial neural network inspired by biological models of neural systems from the 1970s. It follows an unsupervised learning approach and trains its network through a competitive learning algorithm. SOM is used for clustering and mapping (dimensionality reduction): it maps multidimensional data onto a lower-dimensional space, which reduces complex problems to a form that is easier to interpret. A SOM has two layers, the input layer and the output layer. The architecture of a Self-Organizing Map with two clusters and n input features per sample is given below:
How does SOM work?
Consider input data of size (m, n), where m is the number of training examples and n is the number of features in each example. First, the algorithm initializes a weight matrix of size (n, C), where C is the number of clusters. Then, iterating over the input data, for each training example it updates the winning vector (the weight vector with the shortest distance, e.g. Euclidean distance, to the training example). The weight update rule is:
wij = wij(old) + alpha(t) * (xik - wij(old))
where alpha is the learning rate at time t, j denotes the winning vector, i denotes the ith feature of the training example, and k denotes the kth training example in the input data. After training the SOM network, the trained weights are used to cluster new examples: a new example falls into the cluster of its winning vector.
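A minimal sketch of this update rule, assuming the weights are held in a NumPy array of shape (n, C) as described above (the function names are illustrative, not from a library):

```python
import numpy as np

def find_winner(weights, x):
    """Return the column index j of the weight vector closest to x
    (squared Euclidean distance); weights has shape (n, C), x shape (n,)."""
    distances = np.sum((weights - x[:, None]) ** 2, axis=0)
    return int(np.argmin(distances))

def update_winner(weights, x, alpha):
    """Apply wij = wij(old) + alpha * (xi - wij(old)) to the winning
    vector, moving it toward the training example x. Returns j."""
    j = find_winner(weights, x)
    weights[:, j] += alpha * (x - weights[:, j])
    return j
```

Note that the update moves the winning vector *toward* the example, which is why the rule adds the scaled difference rather than subtracting it.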
Algorithm
Steps involved are :
Step 1 − Initialize the weights, the learning rate α and the neighborhood topological scheme.
Step 2 − Repeat steps 3-9 while the stopping condition is false.
Step 3 − Repeat steps 4-6 for every input vector x.
Step 4 − Calculate the squared Euclidean distance D(j) = Σi (xi − wij)² for each unit j = 1 to C.
Step 5 − Obtain the winning unit J for which D(j) is minimum.
Step 6 − Update the weights of the winning unit by the relation wij(new) = wij(old) + α(t) * (xi − wij(old)).
Step 7 − Decrease the learning rate α (for example, α(t+1) = 0.5 α(t) is a common choice).
Step 8 − Reduce the radius of the topological neighborhood scheme.
Step 9 − Check for the stopping condition for the network.
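The steps above can be sketched as a single training loop. This is a simplified 1-D version, assuming a fixed number of epochs as the stopping condition; the initial learning rate, radius, and decay factors are illustrative choices, not fixed by the algorithm:

```python
import numpy as np

def train_som(data, n_clusters, alpha=0.5, epochs=20, seed=0):
    """Train a 1-D SOM. data has shape (m, n); returns weights (n, C)."""
    rng = np.random.default_rng(seed)
    m, n = data.shape
    weights = rng.random((n, n_clusters))        # Step 1: initialize weights
    radius = n_clusters / 2.0                    # initial neighborhood radius
    for t in range(epochs):                      # Step 2: loop until stop
        for x in data:                           # Step 3: each input vector
            d = np.sum((weights - x[:, None]) ** 2, axis=0)  # Step 4: D(j)
            j = int(np.argmin(d))                # Step 5: winning unit J
            for c in range(n_clusters):          # Step 6: update J and its
                if abs(c - j) <= radius:         #   neighbors within radius
                    weights[:, c] += alpha * (x - weights[:, c])
        alpha *= 0.5                             # Step 7: decay learning rate
        radius *= 0.9                            # Step 8: shrink neighborhood
    return weights                               # Step 9: stop after epochs
```

After training, a new example is assigned to the cluster of its nearest weight vector, exactly as in the clustering step described earlier.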
Answer
- It is an unsupervised neural network that reduces the input dimensionality in order to represent its distribution as a map. Therefore, SOM forms a map where similar samples are mapped close together.
- Self-organizing map (SOM) is a neural network-based dimensionality reduction algorithm generally used to represent a high-dimensional dataset as a two-dimensional discretized pattern.
- Here, the vectors that are close in the high-dimensional space also end up being mapped to SOM nodes that are close in low-dimensional space.
- They differ from other artificial neural networks as they apply competitive learning as opposed to error-correction learning.
Answer
SOM mapping starts from initializing the weight vectors. The steps involved are:
- Each node’s weights are initialized.
- A vector is chosen at random from the set of training data.
- Every node is examined to calculate which one’s weights are most like the input vector. The winning node is commonly known as the Best Matching Unit (BMU).
- Then the neighbourhood of the BMU is calculated. The number of neighbours decreases over time.
- The winning weight is rewarded by becoming more like the sample vector. The neighbours also become more like the sample vector. The closer a node is to the BMU, the more its weights are altered; the farther away a neighbour is from the BMU, the less it learns.
- Repeat from the second step (choosing a random training vector) for N iterations.
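The distance-dependent update in the last two steps is often implemented with a Gaussian neighbourhood function centred on the BMU; a sketch of that idea (the Gaussian form and the function names are illustrative choices, not the only option):

```python
import numpy as np

def neighbourhood(node_pos, bmu_pos, sigma):
    """Gaussian falloff: 1.0 at the BMU itself, smaller for nodes
    that are farther away on the map grid."""
    d2 = np.sum((node_pos - bmu_pos) ** 2)
    return float(np.exp(-d2 / (2 * sigma ** 2)))

def update_node(weight, sample, alpha, node_pos, bmu_pos, sigma):
    """Move a node's weight toward the sample, scaled by how close
    the node sits to the BMU on the grid."""
    theta = neighbourhood(node_pos, bmu_pos, sigma)
    return weight + alpha * theta * (sample - weight)
```

Shrinking sigma over the iterations reproduces the "number of neighbours decreases over time" behaviour described above.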
The Best Matching Unit is found by running through all weight vectors and calculating the distance from each one to the sample vector. The weight with the shortest distance is the winner. There are numerous ways to determine the distance; the most commonly used is the Euclidean distance, and that is what is used in the following implementation.
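A minimal BMU search, assuming the weight vectors are stored as rows of a NumPy array (the function name is illustrative):

```python
import numpy as np

def best_matching_unit(weight_vectors, sample):
    """Run through all weight vectors (shape (C, n)) and return the
    index of the one with the shortest Euclidean distance to sample."""
    dists = np.linalg.norm(weight_vectors - sample, axis=1)
    return int(np.argmin(dists))
```

Since argmin is unaffected by monotone transforms, the squared distance works equally well and skips the square root.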
Answer
The computational algorithm of SOM consists of two basic procedures: finding the winning neuron, and adjusting the weights of the winning neuron as well as its neighbouring neurons.