7. Oja’s Hebbian learning rule
Book chapters
See Chapter 19 Section 2 on the learning rule of Oja.
Python classes
The ojas_rule.oja module contains all code required for this exercise.
At the beginning of your exercise solution file, import the contained functions by
from neurodynex.ojas_rule.oja import *
You can then run the exercise functions by executing, e.g.
cloud = make_cloud() # generate data points
wcourse = learn(cloud) # learn weights and return timecourse
7.1. Exercise: Circular data
Use the functions make_cloud and learn to get the timecourse for weights that are learned on a circular data cloud (ratio=1). Plot the time course of both components of the weight vector. Repeat this many times (learn will choose random initial conditions on each run), and plot the results into the same figure. Can you explain what happens?
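A self-contained sketch of this experiment (make_circular_cloud and oja_learn below are hypothetical stand-ins for neurodynex’s make_cloud and learn, whose exact signatures may differ):

```python
import numpy as np

def make_circular_cloud(n=2000, rng=None):
    # Stand-in for make_cloud(..., ratio=1): a zero-mean,
    # circularly symmetric 2D Gaussian cloud.
    rng = np.random.default_rng() if rng is None else rng
    return rng.standard_normal((n, 2))

def oja_learn(cloud, eta=0.005, rng=None):
    # Oja's rule: w <- w + eta * y * (x - y * w), with y = w . x;
    # the initial weight vector is chosen at random on each run.
    rng = np.random.default_rng() if rng is None else rng
    w = rng.standard_normal(2)
    wcourse = np.empty_like(cloud)
    for i, x in enumerate(cloud):
        y = w @ x
        w = w + eta * y * (x - y * w)
        wcourse[i] = w
    return wcourse

wcourse = oja_learn(make_circular_cloud())
print(np.linalg.norm(wcourse[-1]))  # the weight norm settles near 1
```

Plotting wcourse[:, 0] and wcourse[:, 1] against the step index shows the effect to look for: on circular data the weight norm stabilizes near 1, but no direction is preferred, so different runs end at different points on the unit circle.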
7.2. Exercise: Elliptic data
Repeat the previous question with an elongated elliptic data cloud (e.g. ratio=0.3). Again, repeat this several times.
7.2.1. Question
What difference in terms of learning do you observe with respect to the circular data clouds?
7.2.2. Question
Try to change the orientation of the ellipse (try several different angles). Can you explain what Oja’s rule does?
Note
To gain more insight, plot the learned weight vector in 2D space, and relate its orientation to that of the elliptic data cloud.
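The orientation experiment can be sketched as follows (make_elliptic_cloud and oja_learn are hypothetical re-implementations of what make_cloud and learn are assumed to do; the real neurodynex API may differ):

```python
import numpy as np

def make_elliptic_cloud(n=5000, ratio=0.3, angle_deg=30.0, rng=None):
    # Hypothetical analogue of make_cloud(..., ratio, angle):
    # a Gaussian ellipse with minor/major axis ratio `ratio`,
    # rotated by angle_deg.
    rng = np.random.default_rng() if rng is None else rng
    pts = np.column_stack([rng.standard_normal(n),
                           ratio * rng.standard_normal(n)])
    t = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(t), -np.sin(t)],
                    [np.sin(t),  np.cos(t)]])
    return pts @ rot.T

def oja_learn(cloud, eta=0.01, rng=None):
    # Oja's rule with a random initial weight vector.
    rng = np.random.default_rng() if rng is None else rng
    w = rng.standard_normal(2)
    for x in cloud:
        y = w @ x
        w = w + eta * y * (x - y * w)
    return w

angle = 30.0
w = oja_learn(make_elliptic_cloud(angle_deg=angle))
# Up to sign, w should line up with the long axis of the ellipse,
# i.e. its angle modulo 180 degrees should be close to `angle`.
print(np.degrees(np.arctan2(w[1], w[0])) % 180.0)
```

Repeating this for several values of angle should show the learned vector tracking the principal axis of the data: Oja’s rule extracts the first principal component, normalized to unit length.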
7.3. Exercise: Non-centered data
The above exercises assume that the input activities can be negative (indeed the inputs were always statistically centered). In actual neurons, if we interpret their activity as a firing rate, it cannot be negative.
Repeat the previous exercise, but apply the learning rule to a non-centered data cloud. E.g., use 5 + make_cloud(...), which centers the data around (5,5). What conclusions can you draw? Can you think of a modification to the learning rule?