Vocabulary Networks: Program-12
This simulation illustrates the effect of a single massive
event that turns OFF all the words in a model vocabulary.
This event takes place at update 100.
The program allows you to re-activate a small number of
words and watch how the vocabulary responds.
THINGS TO DO.
Find a network with a high level attractor state.
Set the value of nEv to 0 and set sEv to 0. This will
give you a simulation where your network experiences a
massive loss event, and ALL words are deactivated.
Next set the value of nEv to 1 and the value of sEv to 50.
This will give you a simulation where 50 words are
reactivated in a single event.
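The experiment above can be sketched in a few lines of Python. This is a minimal, hypothetical reconstruction, not the program's actual code: it assumes words are boolean nodes, each linked to a handful of other words, and that a word (re)activates when enough of its neighbours are active. The parameter names nEv and sEv mirror the program's controls, but the network size, connectivity, threshold, and update rule are all assumptions made for illustration.

```python
import random

N = 1000          # vocabulary size (assumed)
K = 6             # links per word (assumed)
THRESHOLD = 2     # active neighbours needed to keep a word active (assumed)
random.seed(1)

# Each word is linked to K randomly chosen other words.
links = [random.sample(range(N), K) for _ in range(N)]
active = [True] * N            # start in a high-level attractor state

def step(active):
    """One synchronous update of the whole vocabulary."""
    return [sum(active[j] for j in links[i]) >= THRESHOLD
            for i in range(N)]

nEv, sEv = 1, 50               # one reactivation event of 50 words

history = []
for t in range(300):
    if t == 100:               # the massive loss event: all words OFF
        active = [False] * N
    if 100 < t <= 100 + nEv:   # nEv reactivation events of sEv words each
        for i in random.sample(range(N), sEv):
            active[i] = True
    active = step(active)
    history.append(sum(active))
```

Plotting `history` (the number of active words at each update) shows whether the reactivated words manage to pull the rest of the vocabulary back towards the attractor state, or whether the activation dies away again.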
QUESTIONS TO ASK.
Is there any evidence for recovery with these parameter
values? What happens if you increase the value of sEv?
What happens if you increase the value of nEv?
Can you find a combination of values for nEv and sEv
that always results in a full recovery?
How important is the value of the rEv parameter in these
simulations?
What happens in models with a low level of activity in
their attractor state?
What implications does this set of behaviours have
for theories of L2 vocabulary acquisition?