# 2D SONN algorithm with decremental gain

Here’s a simple algorithm to build self-organizing neural networks (SONN) for 2D clustering problems with a simple decremental gain.

The input data $x$ is mapped onto an $N_n \times N_m$ grid of clusters, and $N_F$ is the number of features in each cluster. To represent the amount of change in the weights as a function of the distance from the center (winning) cluster $(n_0,m_0)$, here I use a window function $\lambda$, and the goal is to decrement the gain $g(t+1)$ used for updating the weights at the next iteration step.

Step 1 : Set the weights of all clusters to random values:

$w_i^{n,m}=\mathrm{random}$,

for $n=0,1,2,\ldots,N_n-1$; $m=0,1,2,\ldots,N_m-1$; and $i=0,1,2,\ldots,N_F-1$.

Set the initial gain $g(0)=1$.
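A minimal sketch of Step 1 in Python with NumPy is shown below; the grid size, feature count, and random seed are illustrative assumptions, not values prescribed by the algorithm:

```python
import numpy as np

rng = np.random.default_rng(seed=0)  # assumed seed, for reproducibility

N_n, N_m = 10, 10  # assumed grid of N_n x N_m clusters
N_F = 3            # assumed number of features per pattern

# Step 1: random initial weights w_i^{n,m} and initial gain g(0) = 1
weights = rng.random((N_n, N_m, N_F))
gain = 1.0
```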

Step 2 : For each input pattern

$x^t$, where $t = 1, 2, \ldots, k$:

(a). Identify the cluster that is closest to the $t$-th input:

$(n_0, m_0)=\displaystyle\arg\min_{j,l} \left\|x^t-w^{j,l}\right\|$.

(b). Update the weights of the clusters in the neighborhood $N$ of the winning cluster $(n_0,m_0)$ according to the rule:

$w_i^{n,m}(t+1) \leftarrow w_i^{n,m}(t)+g(t)\,\lambda(n,m)\left[x_i^t-w_i^{n,m}(t)\right]$, for $(n,m) \in N$,

where $\lambda(n,m)$ is the window function, which decreases with the distance of cluster $(n,m)$ from the winner $(n_0,m_0)$.
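The algorithm leaves $\lambda$ open, so the sketch below (continuing the one above) assumes a Gaussian window centered on the winning cluster, with a hypothetical width parameter `sigma`; the neighborhood $N$ is then implicit, since the window makes updates to distant clusters negligible:

```python
def find_winner(x, weights):
    """Step 2(a): return (n0, m0), the cluster whose weight vector
    is closest to the input pattern x in Euclidean distance."""
    dists = np.linalg.norm(weights - x, axis=2)  # shape (N_n, N_m)
    return np.unravel_index(np.argmin(dists), dists.shape)


def update_weights(x, weights, gain, n0, m0, sigma=2.0):
    """Step 2(b): pull each weight vector toward x, scaled by the
    gain g(t) and the window lambda(n, m). The Gaussian form of
    lambda is an assumption; the algorithm only requires that it
    depend on the distance from the winner."""
    n_idx, m_idx = np.indices(weights.shape[:2])
    sq_dist = (n_idx - n0) ** 2 + (m_idx - m0) ** 2
    lam = np.exp(-sq_dist / (2.0 * sigma**2))  # window lambda(n, m)
    weights += gain * lam[..., None] * (x - weights)
```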

Step 3 : Decrement the gain term used for adapting the weights:

$g(t+1)=\mu g(t)$,

where $0 < \mu < 1$ is the decay factor of the gain.

Step 4 : Repeat from Step 2 until convergence.
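Putting Steps 2-4 together (and continuing the sketch above), a training loop might look like the following; `mu`, the tolerance, and the epoch cap are illustrative choices, and convergence is tested as the largest weight change over a full pass:

```python
def train(X, weights, mu=0.99, tol=1e-4, max_epochs=100):
    """Steps 2-4: present each pattern x^t, update the weights,
    decay the gain g(t+1) = mu * g(t), and stop once the weights
    barely move between passes (assumed convergence test)."""
    gain = 1.0  # g(0) = 1, from Step 1
    for _ in range(max_epochs):
        previous = weights.copy()
        for x in X:  # input patterns x^t, t = 1, 2, ..., k
            n0, m0 = find_winner(x, weights)
            update_weights(x, weights, gain, n0, m0)
            gain *= mu  # Step 3: g(t+1) = mu * g(t)
        if np.abs(weights - previous).max() < tol:
            break  # Step 4: converged
    return weights


# Usage with toy data: k = 200 random patterns with N_F features each
X = rng.random((200, N_F))
weights = train(X, weights)
```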

For the analogous 1D SONN algorithm, see here.