Vocabulary Networks: Program-2

  Use these boxes to set the basic parameters for your model. (A short code sketch after the parameter list shows one way they might fit together.)

  NTWK sets up the network for this simulation. Choose a number between 0000 and 9999 for this parameter.
  

  INIT randomises the current state of the words in this simulation. Choose a number between 0000 and 9999 for this parameter.
  

  nEv sets how many events will occur in your simulation. Choose a number between 0 and 500 for this parameter.
  

  sEv sets the size of an event - how many words each event turns ON. Choose a number between 0 and 500 for this parameter.
  

  rEv determines which particular words are affected by the events. Choose a number between 0000 and 9999 for this parameter.
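
  The sketch below (Python) shows one way these five boxes could map onto a
  program: NTWK, INIT and rEv acting as random seeds, with nEv and sEv shaping
  the event schedule. This is an illustration under stated assumptions, not the
  program's actual code.

    import random

    N_WORDS = 1000      # size of the network described below
    N_UPDATES = 1000    # number of update cycles in a run

    def build_network(ntwk, inputs_per_word=2):
        # Assumption: NTWK seeds a fixed random wiring, so the same value
        # always rebuilds the same network.
        rng = random.Random(ntwk)
        return [rng.sample(range(N_WORDS), inputs_per_word)
                for _ in range(N_WORDS)]

    def initial_state(init):
        # Assumption: INIT seeds the starting pattern, turning roughly
        # half of the words ON.
        rng = random.Random(init)
        return [rng.random() < 0.5 for _ in range(N_WORDS)]

    def event_schedule(rev, n_ev, s_ev):
        # Assumption: rEv seeds the choice of which update cycles carry an
        # event and which sEv words each event turns ON.
        rng = random.Random(rev)
        ticks = sorted(rng.sample(range(N_UPDATES), n_ev))
        return {t: rng.sample(range(N_WORDS), s_ev) for t in ticks}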
  

   




 This simulation shows the effect of turning words ON.

  It randomly builds a network of 1000 words,
  turns about half of these words ON, and lets the
  network settle into a stable attractor state.
 
  The simulation runs through 1000 updates.
  The green dots show the number of active words.
  The red dots mark the occurrence of an EVENT.
  An event turns ON a fixed number of words (the sEv parameter).
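
  As a rough illustration of that loop, here is a compact, self-contained
  Python sketch. The update rule used here (a word is ON at the next cycle
  only when both of its two input words are ON) and the short burn-in that
  stands in for "settling into an attractor" are assumptions made so the
  sketch will run; the program's actual rule may differ.

    import random

    N_WORDS, N_UPDATES = 1000, 1000

    def run(ntwk, init, n_ev, s_ev, rev):
        rng_net = random.Random(ntwk)            # NTWK: wiring of the network
        inputs = [rng_net.sample(range(N_WORDS), 2) for _ in range(N_WORDS)]

        rng_init = random.Random(init)           # INIT: starting state
        state = [rng_init.random() < 0.5 for _ in range(N_WORDS)]

        for _ in range(100):                     # crude burn-in: let it settle
            state = [state[a] and state[b] for a, b in inputs]

        rng_ev = random.Random(rev)              # rEv: which events, which words
        event_ticks = sorted(rng_ev.sample(range(N_UPDATES), n_ev))
        hits = {t: rng_ev.sample(range(N_WORDS), s_ev) for t in event_ticks}

        active = []                              # the green dots
        for t in range(N_UPDATES):
            if t in hits:                        # an EVENT (a red dot)
                for w in hits[t]:
                    state[w] = True
            state = [state[a] and state[b] for a, b in inputs]
            active.append(sum(state))
        return active, event_ticks

  For example, run(ntwk=1234, init=5678, n_ev=20, s_ev=50, rev=42) returns the
  activity curve and the cycles at which the twenty events fell.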
 
  THINGS TO DO
  Change the value of NTWK to get a new model.
  Change the value of INIT to re-randomise the starting state of the words.
  Change nEv to tell the program how many events you want.
  Change sEv to tell the program how big an event should be.
  Change rEv to get a different set of events.

  THINGS TO NOTE
  Infrequent events don't usually have a lasting effect.
  Frequent small events sometimes produce big effects.
  The network usually returns to its rest state. Why?
 
  What implications does this set of behaviours have
  for theories of L2 vocabulary acquisition?