@braintimeException
Last active August 14, 2018 14:49
Google Summer of Code 2018 Project Summary

Google Summer of Code 2018 with INCF and GeNN

A PyNN interface to GeNN

This project aimed to bridge PyNN, a Python package for defining spiking neural networks, to GeNN, an efficient simulator written in C++ and capable of running on GPUs. The initial goal was just a PyNN interface; however, the intermediate product, a Python wrapper for GeNN, has proven to be usable on its own.

Work done

During the implementation phase of the project it turned out that the most efficient approach was to first build a Python interface to GeNN and then link it to PyNN. Therefore, the outcome of this project is two interfaces:

  • PyGeNN is a Python interface to GeNN and is usable on its own.
  • The PyNN interface to GeNN links to PyGeNN and allows running a large selection of the standard models defined in PyNN.

PyGeNN provides access to all crucial parts of GeNN. Using PyGeNN, it is possible to create a new neural network model, fine-tune GeNN, e.g. by specifying how and when variables should be initialized, and derive new NeuronModels, PostsynapticModels, WeightUpdateModels and CurrentSourceModels in Python and pass them back to the C++ backend. It is fully functional.
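To make the idea of deriving models in Python and passing them to a C++ backend concrete, here is a minimal pure-Python sketch of the declarative pattern such a wrapper uses. The names (`make_custom_neuron`, the field names, the `$(...)` placeholder syntax) are illustrative assumptions, not the actual PyGeNN API.

```python
# Hypothetical sketch: a model is described declaratively as parameter
# names (compile-time constants), variable names (per-neuron state) and
# C-like code strings that the backend would compile.

def make_custom_neuron(name, param_names, var_names, sim_code, threshold_code):
    """Bundle a neuron-model description the way a Python-to-C++
    wrapper might, before handing it to the backend."""
    return {
        "name": name,
        "param_names": list(param_names),
        "var_names": list(var_names),
        "sim_code": sim_code,
        "threshold_code": threshold_code,
    }

# A leaky integrate-and-fire neuron described declaratively.
lif = make_custom_neuron(
    name="LIF",
    param_names=["tau_m", "v_thresh", "v_reset"],
    var_names=["v"],
    sim_code="$(v) += (-$(v) + $(Isyn)) * (DT / $(tau_m));",
    threshold_code="$(v) >= $(v_thresh)",
)

print(lif["name"], lif["param_names"])
```

The key design point is that no numerical work happens in Python: the description is pure data, so the backend is free to generate and compile efficient GPU code from it.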

The PyNN interface fully supports Populations, PopulationViews and Projections. Assembly has limited support: it can be used to record from cells, but not to create new Projections. Manipulating Populations through an Assembly is untested, but might work. Many of the standard models are already supported; a list of the missing ones can be found in the TODO section.
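To illustrate why Assembly is the hard case, here is a pure-Python sketch (not PyNN itself; all class bodies are simplified assumptions) of the three container concepts: a PopulationView slices one population, while an Assembly mixes cells from several, which has no single counterpart structure in GeNN.

```python
# Simplified stand-ins for PyNN's container concepts.

class Population:
    """A homogeneous group of neurons (maps directly to a GeNN population)."""
    def __init__(self, label, size):
        self.label, self.size = label, size

class PopulationView:
    """An indexed subset of ONE population."""
    def __init__(self, parent, indices):
        self.parent, self.indices = parent, list(indices)

class Assembly:
    """A bag of populations/views, possibly from DIFFERENT populations.
    GeNN has no single structure for this, so a projection from an
    Assembly would need one GeNN projection per member."""
    def __init__(self, *members):
        self.members = list(members)

    def flat_cells(self):
        cells = []
        for m in self.members:
            if isinstance(m, Population):
                cells += [(m.label, i) for i in range(m.size)]
            else:
                cells += [(m.parent.label, i) for i in m.indices]
        return cells

exc = Population("exc", 4)
inh = Population("inh", 2)
mixed = Assembly(PopulationView(exc, [0, 2]), inh)
print(mixed.flat_cells())
```

Recording from `mixed` only needs the flat cell list above, which is why recording through an Assembly works while projecting from one does not.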

For instructions on how to use it, see the repository on GitHub linked in the References section. This repository is single-purpose and I am the only contributor so far.

Additionally, during the implementation of the PyNN interface it became clear that the most elegant way to implement one of the features, current sources, was to extend GeNN with a corresponding module. My mentors supported this initiative and I added a new model family named CurrentSources.

TODO

The PyNN interface to GeNN is still missing a few features:

  • random initialization using native random number generators (RNGs). Native RNGs are RNGs implemented in the target simulator (in this case GeNN), as opposed to RNGs implemented in PyNN itself. Using native RNGs can speed up model initialization greatly.

  • assemblies are not fully supported. Assembly is a PyNN class that allows merging different populations, or parts of populations, into a single structure. There is no direct counterpart for such a structure in GeNN, and a workaround would have a strong negative effect on performance. Assemblies can still be used to specify which neurons to record.

  • recording of gsyn. At the moment, only spike and voltage recording has been tested. The problem with gsyn and other postsynaptic variables is that in PyNN they are part of a cell, while in GeNN they are part of a synaptic projection. Otherwise, in principle, any state variable which does not come from the postsynaptic part of the model can be recorded easily (by default, without manually extending the recorder, PyNN only allows recording spikes, voltage and, for some models, gsyn).

  • some standard models are missing:

    • cells: GIF_cond_exp, IF_facets_hardware1, SpikeSourceGamma, SpikeSourceInhGamma
    • synapses: ElectricalSynapse, all stochastic synapses, MultiQuantalSynapse
  • the Izhikevich model is broken (synaptic input has no effect; possibly a scaling error)

  • projections & populations: no evaluation of lazyarrays unless necessary. The current implementation always evaluates lazyarrays, but this only makes sense when values differ from each other. Conversely, if a single value is supplied, or native random initialization (not implemented) is used, there is no need to evaluate the lazyarray, which would save memory.

  • projections: globalG. GeNN can be configured to use a single value for a whole synaptic projection and thus save memory. Implementing this option is closely related to the previous bullet point.

  • one of the last features implemented is support for switching parameters to variables on the fly, depending on whether a single value or a list is supplied. However, this breaks GeNN's derived parameters feature, which depends on the position of parameters. PyNN has "computed_parameters"; whether these can replace derived parameters needs investigation.
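The lazyarray point above can be sketched in pure Python. This is a stdlib-only illustration of the intended optimisation, not PyNN's actual lazyarray library: a lazy value stores either a single scalar or a per-element rule, and only materialises a full list when values genuinely differ.

```python
# Hypothetical LazyValue: homogeneous values stay scalar; only
# heterogeneous values are expanded element by element.

class LazyValue:
    def __init__(self, value, size):
        # `value` is either a plain scalar or a callable index -> value
        self.value, self.size = value, size

    @property
    def is_homogeneous(self):
        return not callable(self.value)

    def evaluate(self):
        if self.is_homogeneous:
            # single value: no per-element storage is ever allocated
            return self.value
        return [self.value(i) for i in range(self.size)]

# One shared weight for a million synapses: stays a single float.
weight = LazyValue(0.05, size=1_000_000)
# Per-synapse delays: must be expanded.
delays = LazyValue(lambda i: 1.0 + (i % 3), size=6)

print(weight.evaluate())
print(delays.evaluate())
```

Skipping evaluation in the homogeneous case is also what would make the globalG option cheap to support: a projection whose weight never left scalar form maps directly onto GeNN's single-value mode.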

Links to my contribution

Thanks

Many thanks to my mentors, Dr. James Knight and Prof. Thomas Nowotny. They have always been very helpful and responsive.
