Vocabulary Networks: Program-3

  Use these boxes to set the basic parameters for your model. A sketch of how the parameters might fit together in code follows the descriptions below.

  NTWK sets up the network for this simulation. Choose a number between 0000 and 9999 for this parameter.
  

  INIT randomises the current state of the words in this simulation. Choose a number between 0000 and 9999 for this parameter.
  

  nEv sets how many events will occur in your simulation. Choose a number between 0 and 500 for this parameter.
  

  sEv sets the size of an event - how many words to turn ON. Choose a number between 0 and 500 for this parameter.
  

  rEv determines which words are affected by the events you set up. Choose a number between 0 and 9999 for this parameter.
  

  SPK sets how many words are turned ON at update 99.
  
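  The six parameters together define a single run of the model. Below is a
  minimal sketch, in Python, of how they might be collected; the class name,
  the default values, and the reading of rEv as the seed that picks which
  words the events affect are assumptions for illustration, not part of
  Program-3 itself.

from dataclasses import dataclass

@dataclass
class Settings:
    NTWK: int = 1234   # 0000-9999: seed that builds the network
    INIT: int = 5678   # 0000-9999: seed that randomises the starting word states
    nEv: int = 10      # 0-500: how many events occur during the run
    sEv: int = 20      # 0-500: how many words each event turns ON
    rEv: int = 4321    # 0-9999: assumed seed picking which words each event affects
    SPK: int = 0       # how many words are turned ON at update 99

spike_run = Settings(nEv=0, sEv=0, SPK=600)   # the "large spike, no events" setting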

   




  This simulation shows the effect of turning words ON
  when the network is not in a stable attractor state.

  It builds a network like the ones in Program-1, and lets
  the network settle into a stable attractor state.

  You can move the network out of its attractor state by
  changing the value of the SPK parameter. This parameter
  turns ON a number of words at update 99.
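  A rough sketch of this mechanism is given below, assuming a simple Boolean
  network in which a word turns ON when most of its neighbours are ON: nEv
  events of size sEv turn random words ON, and SPK extra words are forced ON
  at update 99. The update rule, the event timing, and the function names are
  assumptions; only the parameter meanings come from the descriptions above.

import random

def run(network, state, nEv=0, sEv=0, rEv=0, SPK=0, n_updates=200):
    """network: list of neighbour lists; state: list of ON/OFF booleans."""
    ev_rng = random.Random(rEv)        # rEv seeds the choice of affected words
    n_words = len(state)
    # assumed: the nEv events are spread over randomly chosen updates
    event_updates = set(ev_rng.sample(range(n_updates), min(nEv, n_updates)))
    history = []
    for update in range(n_updates):
        if update in event_updates:
            # an event: turn sEv randomly chosen words ON
            for w in ev_rng.sample(range(n_words), min(sEv, n_words)):
                state[w] = True
        if update == 99 and SPK > 0:
            # the spike: force a large set of words ON
            for w in ev_rng.sample(range(n_words), min(SPK, n_words)):
                state[w] = True
        # assumed update rule: a word is ON when most of its neighbours are ON
        state = [sum(state[j] for j in nbrs) > len(nbrs) / 2 for nbrs in network]
        history.append(sum(state))     # total activation after each update
    return state, history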
 
  THINGS TO DO.
  Find a network with a low level of activation. Set
  SPK to a high value like 600, and set nEv to 0. How does
  the network react to this spike?
  Now experiment with the nEv and sEv parameters and see
  if you can find a combination of values that prevents the
  network from falling back into its attractor state.
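  One way to run that experiment outside the program is sketched below. It
  reuses run() from the sketch above, together with an assumed random
  network builder; the network size, link count, starting activation, and
  the threshold used to decide that the network has "fallen back" are
  arbitrary illustrations, not values taken from Program-3.

import random

def build_network(n_words=1000, links=5, seed=1234):
    """Assumed stand-in for NTWK: each word gets a few random neighbours."""
    rng = random.Random(seed)
    return [rng.sample(range(n_words), links) for _ in range(n_words)]

network = build_network(seed=1234)                  # plays the role of NTWK
init_rng = random.Random(5678)                      # plays the role of INIT
start = [init_rng.random() < 0.1 for _ in network]  # low starting activation

baseline, _ = run(network, list(start), SPK=0)      # settle with no spike
for nEv in (0, 50, 100):
    for sEv in (0, 10, 50):
        final, _ = run(network, list(start), nEv=nEv, sEv=sEv, rEv=4321, SPK=600)
        fell_back = abs(sum(final) - sum(baseline)) < 0.05 * len(network)
        print(nEv, sEv, "back in attractor" if fell_back else "moved away")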

  QUESTIONS TO ASK.
  How long do the effects of a large spike last?
  Do spikes affect high-level or low-level vocabularies more?
  How do spikes interact with clusters of activation?
 
  What implications does this set of behaviours have
  for theories of L2 vocabulary acquisition?