
Allow more efficient implementation of trace-based STDP rules #152

Merged
neworderofjamie merged 94 commits into master from weight_update_pre_post_var on Oct 11, 2018

Conversation

neworderofjamie (Contributor) commented on Jul 10, 2017

In the process of investigating spike-based learning in SpineML and finding it lacking, I realised that GeNN is missing some useful functionality for spike-based learning too.

If you want to implement trace-based STDP rules, you often need state that is stored and updated only once per pre- or postsynaptic neuron - typically low-pass-filtered spike trains. Short of hacking this into neuron models, GeNN had no way to support this - until now…

Basically, this pull request adds pre and postsynaptic state variables to weight update models (in a backward-compatible manner). These are substituted into existing bits of weight update user code in the normal way, but additionally there are two new pieces of weight update user code which get inserted into the appropriate neuron kernel when populations spike. The last aspect is the one thing I'm a little concerned about as it seems a little ugly - any thoughts @tnowotny?

As an example, the following weight update model implements Pfister and Gerstner's triplet rule:

//----------------------------------------------------------------------------
// PfisterTriplet
//----------------------------------------------------------------------------
class PfisterTriplet : public WeightUpdateModels::Base
{
public:
    DECLARE_WEIGHT_UPDATE_MODEL(PfisterTriplet, 10, 1, 2, 2);

    SET_PARAM_NAMES({
      "tauPlus",  // 0 - Fast presynaptic time constant (ms)
      "tauMinus", // 1 - Fast postsynaptic time constant (ms)
      "tauX",     // 2 - Slow presynaptic time constant (ms)
      "tauY",     // 3 - Slow postsynaptic time constant (ms)
      "A2Plus",   // 4 - Pair-based potentiation strength (dimensionless)
      "A2Minus",  // 5 - Pair-based depression strength (dimensionless)
      "A3Plus",   // 6 - Triplet-based potentiation strength (dimensionless)
      "A3Minus",  // 7 - Triplet-based depression strength (dimensionless)
      "Wmin",     // 8 - Minimum weight
      "Wmax",     // 9 - Maximum weight
    });

    SET_VARS({{"g", "scalar"}});
    SET_PRE_VARS({{"r1", "scalar"}, {"r2", "scalar"}});
    SET_POST_VARS({{"o1", "scalar"}, {"o2", "scalar"}});

    SET_PRE_SPIKE_CODE(
        "scalar dt = $(t) - $(sT_pre);\n"
        "$(r1) = ($(r1) * exp(-dt / $(tauPlus))) + 1.0;\n"
        "$(r2) =  ($(sT_pre) == 0.0) ? 0.0 : ($(r2) + 1.0) * exp(-dt / $(tauX));\n");

    SET_POST_SPIKE_CODE(
        "scalar dt = $(t) - $(sT_post);\n"
        "$(o1) = ($(o1) * exp(-dt / $(tauMinus))) + 1.0;\n"
        "$(o2) = ($(sT_post) == 0.0) ? 0.0 : ($(o2) + 1.0) * exp(-dt / $(tauY));\n");

    SET_SIM_CODE(
        "$(addtoinSyn) = $(g);\n"
        "$(updatelinsyn);\n"
        "scalar dt = $(t) - $(sT_post); \n"
        "if (dt > 0)\n"
        "{\n"
        "    scalar o1 = $(o1) * exp(-dt / $(tauMinus));\n"
        "    scalar newWeight = $(g) - o1 * ($(A2Minus) + ($(A3Minus) * $(r2)));\n"
        "    $(g) = (newWeight < $(Wmin)) ? $(Wmin) : newWeight;\n"
        "}\n");

    SET_LEARN_POST_CODE(
        "scalar dt = $(t) - $(sT_pre);\n"
        "if (dt > 0)\n"
        "{\n"
        "    scalar r1 = $(r1) * exp(-dt / $(tauPlus));\n"
        "    scalar newWeight = $(g) + r1 * ($(A2Plus) + ($(A3Plus) * $(o2)));\n"
        "    $(g) = (newWeight > $(Wmax)) ? $(Wmax) : newWeight;\n"
        "}\n");

    SET_NEEDS_PRE_SPIKE_TIME(true);
    SET_NEEDS_POST_SPIKE_TIME(true);
};

IMPLEMENT_MODEL(PfisterTriplet);

Continuing the theme of recreating bits of my thesis using GeNN, I implemented the Sjöström et al. (2001) pairing-frequency experiment here, which shows that the GeNN version behaves much as the Pfister and Gerstner paper suggests:

[Figure: Sjöström et al. (2001) pairing-frequency experiment reproduced with GeNN]

@neworderofjamie neworderofjamie deleted the weight_update_pre_post_var branch September 18, 2017 11:23
@neworderofjamie neworderofjamie restored the weight_update_pre_post_var branch December 8, 2017 15:48

codecov bot commented Dec 8, 2017

Codecov Report

❗ No coverage uploaded for pull request base (master@25ef548).
The diff coverage is 94.17%.


@@            Coverage Diff            @@
##             master     #152   +/-   ##
=========================================
  Coverage          ?   83.43%           
=========================================
  Files             ?       45           
  Lines             ?     7057           
  Branches          ?        0           
=========================================
  Hits              ?     5888           
  Misses            ?     1169           
  Partials          ?        0
Impacted Files Coverage Δ
lib/include/standardSubstitutions.h 100% <ø> (ø)
lib/include/codeGenUtils.h 97.72% <ø> (ø)
lib/src/generateCPU.cc 94.05% <100%> (ø)
lib/src/standardGeneratedSections.cc 98.58% <100%> (ø)
lib/include/newModels.h 79.06% <100%> (ø)
lib/src/generateRunner.cc 84.73% <100%> (ø)
lib/src/modelSpec.cc 64.24% <100%> (ø)
lib/include/modelSpec.h 84.37% <100%> (ø)
lib/include/neuronGroup.h 97.61% <100%> (ø)
lib/src/standardSubstitutions.cc 94.39% <100%> (ø)
... and 7 more

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 25ef548...192e91a.

@neworderofjamie neworderofjamie changed the base branch from development to master February 14, 2018 12:23
@neworderofjamie neworderofjamie added this to the GeNN 3.2.0 milestone Mar 20, 2018
… calling ``initConnectivity`` or ``initVar`` with a parameter-less model, i.e. ``initConnectivity<InitSparseConnectivitySnippet::OneToOne>({})``, can now be just ``initConnectivity<InitSparseConnectivitySnippet::OneToOne>()``
neworderofjamie (Contributor, Author) commented:

@tnowotny this is finally ready and, aside from documentation updates etc., is the last feature for the 3.2.0 release! Are you OK to merge? Previous concerns about delays are fixed and I've added new tests.

tnowotny (Member) commented:

I believe the design is right. The code for updating the spike traces needs to be in the neuron kernel. The only other solution would be a separate kernel, which seems silly unless the trace code were computationally very heavy and caused a lot of divergence between spiking and silent neurons in the neuron kernel.
I agree that it seems odd for a synapse aspect to influence the neuron kernel code, but it only makes sense in this circumstance. The average user would never know anyway.

tnowotny (Member) commented:

I did not inspect the code in as much detail as I often do, but I think we should go ahead and merge - there are some very busy times ahead.

