Merge pre and postsynaptic update #461
Conversation
…MPostUpdateBeCombined`` implementation
# Conflicts:
#	tests/unit/synapseGroup.cc
…t have been merged
* Did some renaming to clarify
* Implementations of ``SynapseGroup::getWUPreMergeHashDigest`` and ``SynapseGroup::getWUPostMergeHashDigest``
* All comes together into implementation of ``NeuronGroup::mergePrePostSynapses``
…d outgoing synapse groups with variables and code
Codecov Report
```diff
@@            Coverage Diff             @@
##           master     #461      +/-   ##
==========================================
+ Coverage   87.25%   87.52%   +0.26%
==========================================
  Files          78       78
  Lines       17056    17277     +221
==========================================
+ Hits        14883    15121     +238
+ Misses       2173     2156      -17
```
Continue to review full report at Codecov.
Looking at the thesaurus, maybe we should call this process "fuse", "fused".

I like it - will save that change for when I'm at home and can throw the Visual Studio magic rename tool at it 😄
… away
* correct logic in general w.r.t. master-slave stuff
# Conflicts:
#	include/genn/genn/synapseGroup.h
#	src/genn/genn/synapseGroup.cc
I'm afraid the renaming somewhat dwarfs the actual changes but still... The most meaningful new bits are probably the rules which define when updates can be merged and are in
tnowotny left a comment
Interesting point about the trade-off of "efficient hacky" with "more clean but less efficient" there ...
As for the implementation details:
a) your list of conditions sounds right
b) I assume you have added a test or two (I can't easily review the detailed code changes)
GeNN has supported pre- and postsynaptic weight update model variables and associated update code since #152 (with some extra flexibility introduced in #377). This is nice in terms of modularity but the implementation is very sub-optimal for models with complex connectivity when compared to the hackier approach. If you have a neuron population with multiple incoming or outgoing synapse populations, all with the same learning rule, the variables get duplicated and, in the neuron kernel, they all get read from memory, updated using the same code and written back to memory.
Postsynaptic models suffered from the same problem, which I solved in #201. This PR basically does the same for these pre- and postsynaptic weight updates. Essentially, two outgoing synapse groups with presynaptic update code (the incoming case with postsynaptic code is basically equivalent) will get merged together if:
The previous change for postsynaptic models significantly improved performance on the microcircuit model so I would imagine this change will be similarly beneficial once we start simulating models which combine more complex connectivity and learning.
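For intuition, the merge decision described above can be sketched as comparing hash digests of the parts of the update that must match. This is a minimal illustrative sketch, not GeNN's actual implementation: `SynapseGroupSketch`, `preMergeHashDigest` and `canFusePreUpdates` are hypothetical names (only ``SynapseGroup::getWUPreMergeHashDigest`` is real), and the real digest covers more than just code strings.

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <vector>

// Hypothetical stand-in for a synapse group's mergeable presynaptic state.
struct SynapseGroupSketch {
    std::string preUpdateCode;            // presynaptic weight update code
    std::vector<std::string> preVarInit;  // per-variable initialisation snippets

    // Combine the update code and variable initialisers into one digest,
    // loosely analogous in spirit to SynapseGroup::getWUPreMergeHashDigest.
    size_t preMergeHashDigest() const {
        size_t h = std::hash<std::string>{}(preUpdateCode);
        for (const auto &v : preVarInit) {
            // boost-style hash combine of each initialiser snippet
            h ^= std::hash<std::string>{}(v) + 0x9e3779b9 + (h << 6) + (h >> 2);
        }
        return h;
    }
};

// Two groups can share ("fuse") their presynaptic update when their digests match.
bool canFusePreUpdates(const SynapseGroupSketch &a, const SynapseGroupSketch &b) {
    return a.preMergeHashDigest() == b.preMergeHashDigest();
}
```

With this scheme, adding a new merge condition just means folding more state into the digest, so the comparison itself never changes.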