- Taxonomic neighborhood density for 244K lexical items from Reilly and Desai (2017) paper (Plain text file)
- Map of the semantic system from Binder et al. (2009) "Where is the semantic system?" paper: NIFTI files of the "All" map and the "General" map (remove _.txt from the filenames; these are .nii files)
- Supplementary material from Desai et al. (2011) "Neural career of sensory-motor metaphors" paper (pdf file)
COHELD (COntrastive HEbbian Learning with Delays) is a neural network simulator implementing the Contrastive Hebbian Learning (CHL) algorithm. CHL allows the use of hidden units in continuous Hopfield networks, so the network can solve non-linearly separable tasks (for which feedforward backpropagation networks are commonly used) within an attractor network framework, using a simple and biologically realistic Hebbian learning rule. It can also be used for unsupervised learning. Because no distinction between “input” and “output” is necessary, unlike in feedforward networks, COHELD networks can learn mappings in multiple directions, from any part of a pattern to other parts of the pattern. In addition, COHELD can process temporal patterns using delay connections, so it can also be used for temporal tasks where Simple Recurrent Networks (Elman nets) or Jordan nets are typically used. It supports a wide variety of network architectures.
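To illustrate the CHL algorithm itself (not COHELD's actual Java implementation, and without its delay connections), here is a minimal NumPy sketch under standard assumptions: sigmoid units with symmetric weights are settled in a free (minus) phase with only the inputs clamped, then in a clamped (plus) phase with inputs and targets clamped, and the weight update is the difference of the two phases' co-activity products. All function and variable names here are illustrative, not taken from the package.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def settle(W, b, state, clamped, n_steps=50):
    """Iterate unit updates toward a fixed point; clamped units stay fixed."""
    s = state.copy()
    for _ in range(n_steps):
        new = sigmoid(W @ s + b)
        s = np.where(clamped, s, new)   # only free units are updated
    return s

def chl_update(W, b, x_idx, y_idx, x, y, lr=0.1):
    """One Contrastive Hebbian Learning step on one (input, target) pattern."""
    n = W.shape[0]
    # Minus (free) phase: clamp only the input units and settle.
    s = np.full(n, 0.5)
    clamp_minus = np.zeros(n, dtype=bool)
    clamp_minus[x_idx] = True
    s[x_idx] = x
    s_minus = settle(W, b, s, clamp_minus)
    # Plus (clamped) phase: clamp inputs and targets, settle again.
    clamp_plus = clamp_minus.copy()
    clamp_plus[y_idx] = True
    s2 = s_minus.copy()
    s2[y_idx] = y
    s_plus = settle(W, b, s2, clamp_plus)
    # Hebbian / anti-Hebbian update: plus-phase minus minus-phase co-activity.
    dW = lr * (np.outer(s_plus, s_plus) - np.outer(s_minus, s_minus))
    np.fill_diagonal(dW, 0.0)           # no self-connections
    W += 0.5 * (dW + dW.T)              # keep the weight matrix symmetric
    b += lr * (s_plus - s_minus)
    return W, b
```

Because the update depends only on locally available pre- and post-synaptic activities in the two phases, it remains a Hebbian rule, and because nothing in the settling process distinguishes inputs from outputs, the same trained weights can be settled from any clamped subset of a pattern to recover the rest.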
This no-frills package was written because the CHL algorithm is not implemented in most popular neural network packages. It is written in Java and runs on several platforms, including Windows, Unix, and Mac OS X. A User Guide is included. It is available free for non-profit use.