
SOM Algorithm and its variant

Self Organizing Maps – Kohonen Maps

A Self-Organizing Map (also called a Kohonen Map or SOM) is a type of Artificial Neural Network inspired by biological models of neural systems from the 1970s. It follows an unsupervised learning approach and trains its network through a competitive learning algorithm. SOM is used for clustering and mapping (or dimensionality reduction): it maps multidimensional data onto a lower-dimensional space, which reduces complex problems to a form that is easier to interpret. A SOM has two layers, an Input layer and an Output layer. The architecture of a Self-Organizing Map with two clusters and n input features per sample is given below:


How does SOM work?

Let’s say we have input data of size (m, n), where m is the number of training examples and n is the number of features in each example. First, the algorithm initializes the weights of size (n, C), where C is the number of clusters. Then, iterating over the input data, for each training example it updates the winning vector (the weight vector with the shortest distance, e.g. Euclidean distance, from the training example). The weight update rule is given by:

wij = wij(old) + alpha(t) * (xik - wij(old))

where alpha is the learning rate at time t, j denotes the winning vector, i denotes the ith feature of the training example, and k denotes the kth training example from the input data. After training the SOM network, the trained weights are used for clustering new examples: a new example falls into the cluster of its winning vector.
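As a rough illustration of this update, here is a minimal NumPy sketch (the array shapes and variable names are assumptions for this example) that finds the winning weight vector for one training example and moves it toward that example:

import numpy as np

def som_step(weights, x, alpha):
    """One SOM update: find the winning column of `weights` for sample `x`
    and move it toward `x`. `weights` has shape (n, C), `x` has shape (n,)."""
    # Squared Euclidean distance from x to each of the C weight vectors
    distances = np.sum((weights - x[:, None]) ** 2, axis=0)
    j = np.argmin(distances)                      # index of the winning vector
    # Weight update rule: wij(new) = wij(old) + alpha * (xi - wij(old))
    weights[:, j] += alpha * (x - weights[:, j])
    return j, weights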

Algorithm

Steps involved are:

Step 1 − Initialize the weights, the learning rate α and the neighborhood topological scheme.

Step 2 − Repeat steps 3-9 while the stopping condition is not true.

Step 3 − Repeat steps 4-6 for every input vector x.

Step 4 − Calculate the square of the Euclidean distance, D(j) = Σi (xi - wij)^2, for each output unit j = 1 to m.

Step 5 − Obtain the winning unit J for which D(j) is minimum.

Step 6 − Calculate the new weight of the winning unit by the following relation: wij(new) = wij(old) + α * (xi - wij(old)).

Step 7 − Update the learning rate α by decaying it (for example, α(t+1) = 0.5 α(t)).

Step 8 − Reduce the radius of the topological neighborhood scheme.

Step 9 − Check for the stopping condition for the network.


Example:- A Kohonen self-organizing map is used to cluster four vectors. Let the vectors to be clustered be (1,1,0,0); (0,0,0,1); (1,0,0,0); (0,0,1,1). The maximum number of clusters to be formed is m = 2. Suppose the learning rate is α = 0.5, and the neighborhood of node J is set so that only one cluster updates its weights at each step (R = 0). The initial weight matrix is [0.2 0.4 0.6 0.8] and [0.9 0.7 0.5 0.3] (one row per cluster unit).
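A minimal NumPy sketch of this example is given below. The vectors, initial weights, α = 0.5 and R = 0 come from the example above; the number of epochs and the halving of the learning rate after each epoch are assumptions for illustration. Each row of weights holds the weight vector of one cluster:

import numpy as np

# Training vectors from the example
X = np.array([[1, 1, 0, 0],
              [0, 0, 0, 1],
              [1, 0, 0, 0],
              [0, 0, 1, 1]], dtype=float)

# Initial weight matrix: one row per cluster (m = 2 clusters, 4 features)
weights = np.array([[0.2, 0.4, 0.6, 0.8],
                    [0.9, 0.7, 0.5, 0.3]])

alpha = 0.5           # initial learning rate
epochs = 10           # assumed number of passes over the data

for epoch in range(epochs):
    for x in X:
        # Step 4: squared Euclidean distance to each cluster's weight vector
        d = np.sum((weights - x) ** 2, axis=1)
        # Step 5: winning unit J (R = 0, so only the winner is updated)
        J = np.argmin(d)
        # Step 6: move the winner's weights toward the input vector
        weights[J] += alpha * (x - weights[J])
    # Step 7: decay the learning rate after each epoch (assumed halving)
    alpha *= 0.5

# Assign each training vector to the cluster of its winning unit
clusters = [int(np.argmin(np.sum((weights - x) ** 2, axis=1))) for x in X]
print("Final weights:\n", weights)
print("Cluster assignments:", clusters)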




Ques. What is a Self-Organizing Map (SOM)?

Answer

  • It is an unsupervised neural network that reduces the input dimensionality in order to represent its distribution as a map. Therefore, SOM forms a map where similar samples are mapped close together.
  • Self-organizing map (SOM) is a neural network-based dimensionality reduction algorithm generally used to represent a high-dimensional dataset as a two-dimensional discretized pattern.
  • Here, the vectors that are close in the high-dimensional space also end up being mapped to SOM nodes that are close in low-dimensional space (see the sketch after this list).
  • They differ from other artificial neural networks as they apply competitive learning as opposed to error-correction learning.
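To make the idea of mapping high-dimensional points onto a 2D grid concrete, here is a small sketch using the third-party MiniSom library; the grid size, iteration count and random data are assumptions for illustration:

import numpy as np
from minisom import MiniSom  # third-party package: pip install minisom

# Some random 10-dimensional data standing in for a real dataset
data = np.random.rand(200, 10)

# A 6x6 map of nodes, each with a 10-dimensional weight vector
som = MiniSom(6, 6, 10, sigma=1.0, learning_rate=0.5)
som.random_weights_init(data)
som.train_random(data, 1000)      # 1000 training iterations

# Each sample is mapped to the 2D coordinates of its Best Matching Unit;
# similar samples should land on the same or neighbouring nodes
print(som.winner(data[0]))        # e.g. (3, 5)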

 

Ques. Write the steps used in the SOM algorithm.

Answer

The SOM mapping process starts with initializing the weight vectors.

The steps involved are:

  1. Each node’s weights are initialized.
  2. A vector is chosen at random from the set of training data.
  3. Every node is examined to calculate which one’s weights are most like the input vector. The winning node is commonly known as the Best Matching Unit (BMU).
  4. Then the neighbourhood of the BMU is calculated. The number of neighbours decreases over time.
  5. The winning weight is rewarded by becoming more like the sample vector. The neighbors also become more like the sample vector. The closer a node is to the BMU, the more its weights get altered, and the farther away a neighbor is from the BMU, the less it learns.
  6. Repeat step 2 for N iterations.

The Best Matching Unit is found by calculating the distance from each weight vector to the sample vector, running through all weight vectors. The weight with the shortest distance is the winner. There are numerous ways to determine the distance; however, the most commonly used method is the Euclidean distance, and that is what is used in the sketch below.
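A minimal sketch of BMU selection with NumPy (the array shapes and names are assumptions) could look like this:

import numpy as np

def best_matching_unit(weights, sample):
    """Return the index of the weight vector closest to `sample`.
    `weights` has shape (num_nodes, num_features), `sample` has shape (num_features,)."""
    # Euclidean distance from the sample to every node's weight vector
    distances = np.linalg.norm(weights - sample, axis=1)
    return int(np.argmin(distances))   # index of the BMU

# Example usage with 4 nodes of 3 features each
weights = np.random.rand(4, 3)
print(best_matching_unit(weights, np.array([0.1, 0.9, 0.4])))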

 

Ques. What are the basic processes used in SOM? Also explain the stages of the SOM algorithm.

Answer

The computational algorithm of SOM consists of two basic procedures: finding the winning neuron, and adjusting the weights of the winning neuron as well as its neighbouring neurons. These two procedures are repeated, with the learning rate and neighbourhood radius gradually decreasing, until the stopping condition is met.
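As a rough sketch of the second procedure, here is how the weights of the winner and its neighbours on a 2D grid might be adjusted using a Gaussian neighbourhood function (the grid layout, sigma and variable names are assumptions for illustration):

import numpy as np

def update_weights(weights, grid_coords, sample, bmu_index, alpha, sigma):
    """Move every node's weights toward `sample`, scaled by how close the node
    is to the BMU on the 2D grid.
    `weights`: (num_nodes, num_features), `grid_coords`: (num_nodes, 2)."""
    # Squared grid distance of every node from the BMU
    d2 = np.sum((grid_coords - grid_coords[bmu_index]) ** 2, axis=1)
    # Gaussian neighbourhood: 1 at the BMU, decaying with grid distance
    h = np.exp(-d2 / (2 * sigma ** 2))
    # Nodes near the BMU move strongly toward the sample; distant nodes barely move
    weights += alpha * h[:, None] * (sample - weights)
    return weights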

