Vocabulary Networks: Program-12

  Use these boxes to set the basic parameters for your model

  NTWK sets up the network for this simulation. Choose a number between 0000 and 9999 for this parameter.
  

  nEv sets how many events will occur in your simulation. Choose a number between 0 and 500 for this parameter.
  

  sEv sets the size of an event - how many words each event affects. Choose a number between 0 and 500 for this parameter.
  

  rEv determines which words are affected by your events. Choose a number between 0 and 9999 for this parameter.
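  The four parameters above each have a documented range. As a minimal sketch
  (the dictionary and function names here are illustrative, not part of the
  program), the ranges can be checked like this:

  ```python
  # Hypothetical table of the parameter ranges described above.
  PARAMS = {
      "NTWK": (0, 9999),  # selects the network for the simulation
      "nEv":  (0, 500),   # number of events
      "sEv":  (0, 500),   # size of an event (words affected)
      "rEv":  (0, 9999),  # determines which words an event affects
  }

  def check_params(settings):
      """Raise ValueError if any setting falls outside its documented range."""
      for name, value in settings.items():
          lo, hi = PARAMS[name]
          if not lo <= value <= hi:
              raise ValueError(f"{name}={value} is outside {lo}..{hi}")
  ```

  For example, check_params({"NTWK": 1234, "nEv": 1, "sEv": 50, "rEv": 7})
  passes silently, while an out-of-range value such as nEv=501 raises an error.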
  

   




  This simulation illustrates the effect of a single massive
  event that turns OFF all the words in a model vocabulary.
  This event takes place at update 100.
  The program allows you to re-activate a small number of
  words and watch how the vocabulary responds.
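  The behaviour described above can be sketched in a few lines of Python. This
  is an illustrative model only, not the program's actual implementation: the
  network size, the number of links per word, the activation threshold, and the
  schedule of one reactivation event per update after the loss are all
  assumptions made for the sketch.

  ```python
  import random

  def run_simulation(ntwk=1234, n_ev=1, s_ev=50, r_ev=42,
                     n_words=500, n_links=5, threshold=2, updates=200):
      """Boolean vocabulary network with a total loss event at update 100,
      followed by n_ev reactivation events of s_ev words each."""
      # NTWK seeds the network structure: each word is linked to a few
      # randomly chosen words.
      rng = random.Random(ntwk)
      links = [rng.sample(range(n_words), n_links) for _ in range(n_words)]
      active = [True] * n_words          # start in a fully active state
      event_rng = random.Random(r_ev)    # rEv picks which words events touch
      history = []
      for t in range(updates):
          if t == 100:                   # the massive loss event
              active = [False] * n_words
          # Assumed schedule: one reactivation event per update,
          # starting immediately after the loss event.
          if 100 < t <= 100 + n_ev:
              for w in event_rng.sample(range(n_words), s_ev):
                  active[w] = True
          # Assumed update rule: a word is active on the next step if at
          # least `threshold` of its linked words are currently active.
          active = [sum(active[j] for j in links[i]) >= threshold
                    for i in range(n_words)]
          history.append(sum(active))    # record vocabulary size over time
      return history
  ```

  With nEv=0 the vocabulary stays dark after update 100; raising sEv and nEv
  tests whether a partial reactivation can spread back toward the attractor,
  which is exactly the question the exercises below explore.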
  THINGS TO DO.
  Find a network with a high level attractor state.
  First set the value of nEv to 0 and set sEv to 0. With no
  reactivation events, your network simply experiences the
  massive loss event, and ALL words remain deactivated.
  Next set nEv=1 and sEv=50. This will give you a simulation
  where 50 words are reactivated in a single event.
  QUESTIONS TO ASK.
  Is there any evidence for recovery with these parameter
  values? What happens if you increase the value of sEv?
  What happens if you increase the value of nEv?
  Can you find a combination of values for nEv and sEv
  that always results in a full recovery?
  How important is the value of the rEv parameter in these
  simulations?
  What happens in models with a low level of activity in
  their attractor state?
 
  What implications does this set of behaviours have
  for theories of L2 vocabulary acquisition?