Communication by binary and sparse spikes is a key factor for the energy...
The objective of continual learning (CL) is to learn tasks sequentially...
Deep spiking neural networks (SNNs) offer the promise of low-power artif...
Surprising events trigger measurable brain activity and influence human...
We consider the problem of training a neural network to store a set of p...
Can we use spiking neural networks (SNN) as generative models of multi-n...
Fitting network models to neural activity is becoming an important tool ...
We study how permutation symmetries in overparameterized multi-layer neu...
Back-propagation (BP) is costly to implement in hardware and implausible...
Reservoir computing is a powerful tool to explain how the brain learns t...
Surprise-based learning allows agents to adapt quickly in non-stationary...
The permutation symmetry of neurons in each layer of a deep neural netwo...
Training deep neural networks with the error backpropagation algorithm i...
As deep learning advances, algorithms of music composition increase in p...
Hand in hand with deep learning advancements, algorithms of music compos...
Modern reinforcement learning algorithms reach super-human performance i...
Learning weights in a spiking neural network with hidden neurons, using...
Learning and memory are intertwined in our brain and their relationship ...
Brains need to predict how the body reacts to motor commands. It is an o...
In machine learning, error back-propagation in multi-layer neural networ...
A big challenge in algorithmic composition is to devise a model that is ...
Surprise describes a range of phenomena from unexpected events to behavi...