Spiking neural networks

The growth of the internet of things is making the problem of processing telemetry data on end devices more acute. Spiking neural networks (SNNs) running on neuromorphic chips can resolve this issue.

Current computer architectures are too energy-inefficient to run neural networks that could be trained to process data directly on end devices.

SNNs running on neuromorphic hardware have great potential in this area. SNN operations are asynchronous and event-driven – only neurons and synapses that process incoming pulses (spikes) are active and consume energy. Due to the asynchronous nature of pulse propagation in SNNs, solutions based on them are more than three orders of magnitude more energy efficient than solutions based on CPU and GPU chips.
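The event-driven principle can be illustrated with a leaky integrate-and-fire neuron that performs computation only when a spike arrives; between events it does nothing. This is a minimal sketch, not the neuron model used by any particular neuromorphic chip, and the parameter values are illustrative assumptions.

```python
import math

class LIFNeuron:
    """Leaky integrate-and-fire neuron updated only when a spike arrives
    (event-driven), illustrating why idle neurons consume no compute."""
    def __init__(self, tau=20.0, threshold=1.0):
        self.tau = tau            # membrane time constant (ms), assumed value
        self.threshold = threshold
        self.v = 0.0              # membrane potential
        self.last_t = 0.0         # time of the last update (ms)

    def receive_spike(self, t, weight):
        # Decay the potential for the elapsed interval, then add the input.
        self.v *= math.exp(-(t - self.last_t) / self.tau)
        self.v += weight
        self.last_t = t
        if self.v >= self.threshold:
            self.v = 0.0          # reset after firing
            return True           # output spike emitted
        return False

n = LIFNeuron()
events = [(1.0, 0.6), (3.0, 0.6), (60.0, 0.6)]
out = [n.receive_spike(t, w) for t, w in events]
print(out)  # -> [False, True, False]
```

Two closely spaced inputs sum to cross the threshold and produce an output spike; a late, isolated input does not, because the potential has decayed in the meantime.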

For SNNs, the development and selection of training algorithms is vital.

The training principles for SNNs differ from those underpinning the classical backpropagation method used to train artificial neural networks. For an SNN, the principles underlying spike-timing-dependent plasticity (STDP) are more applicable: the network is trained through local, asynchronous changes in synaptic weights. However, current approaches to training SNNs running on neuromorphic hardware are largely based on porting pre-trained convolutional neural networks to the SNN, an approach that makes no provision for training directly on a neuromorphic chip.
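The local, asynchronous weight change behind STDP can be sketched with the classic pair-based rule: a synapse is strengthened when the presynaptic spike precedes the postsynaptic one, and weakened otherwise. The exponential window and all constants below are assumptions taken from the standard textbook form of the rule, not from any specific chip or training scheme.

```python
import math

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012,
                tau_plus=20.0, tau_minus=20.0, w_min=0.0, w_max=1.0):
    """Pair-based STDP: potentiate if the presynaptic spike precedes the
    postsynaptic one, depress otherwise. The update is purely local to
    one synapse - no global error signal is needed."""
    dt = t_post - t_pre
    if dt > 0:      # pre before post -> strengthen (causal pair)
        w += a_plus * math.exp(-dt / tau_plus)
    elif dt < 0:    # post before pre -> weaken (anti-causal pair)
        w -= a_minus * math.exp(dt / tau_minus)
    return min(max(w, w_min), w_max)

w = 0.5
w = stdp_update(w, t_pre=10.0, t_post=15.0)   # causal pair: weight grows
print(round(w, 4))  # -> 0.5078
```

Because each update depends only on the spike times seen by that one synapse, the rule maps naturally onto asynchronous neuromorphic hardware, unlike backpropagation, which requires a synchronized global pass.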

We are building a platform that will allow us to conduct research into SNN training methods based on the STDP process, and to develop cognitive architectures on that basis.

1

We are studying the concept of self-organizing liquid state machines (SOLSM) – a promising trend in the field of SNN training that combines Hebbian neural plasticity rules and the liquid state machine approach (M. Kiselev, A Synaptic Plasticity Rule Providing a Unified Approach to Supervised and Unsupervised Learning, Proceedings of IJCNN-2017, Anchorage, 2017, pp. 3806-3813).
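The liquid state machine side of this approach can be sketched as follows: a fixed, randomly connected recurrent "liquid" transforms an input stream into a rich state trajectory, and only a simple linear readout is trained on that state. The sketch below is rate-based (tanh units as a crude stand-in for spiking dynamics) and uses an assumed toy task of recalling the input from one step earlier; the self-organizing variant would additionally adapt the liquid's weights with Hebbian rules, which is omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random recurrent reservoir ("liquid"); only the readout is trained.
N = 50                                    # liquid neurons
W_in = rng.normal(0, 0.5, (N, 1))         # input weights
W = rng.normal(0, 1.0, (N, N)) * (rng.random((N, N)) < 0.1)  # sparse recurrence
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # keep spectral radius below 1

def liquid_states(inputs):
    x = np.zeros(N)
    states = []
    for u in inputs:
        # Rate-based update: a crude stand-in for spiking dynamics.
        x = np.tanh(W @ x + W_in[:, 0] * u)
        states.append(x.copy())
    return np.array(states)

# Train a linear readout by least squares to reproduce a delayed input,
# demonstrating that the liquid's state retains recent history.
u = rng.random(200)
S = liquid_states(u)
target = np.roll(u, 1)                    # the input from 1 step ago
w_out, *_ = np.linalg.lstsq(S[10:], target[10:], rcond=None)
pred = S[10:] @ w_out
print(np.corrcoef(pred, target[10:])[0, 1])
```

The point of the construction is that the hard part (the recurrent dynamics) is never trained by backpropagation; only a cheap linear fit on the liquid's state is needed, which is what makes the approach attractive for SNN hardware.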

2

We are researching potential memory mechanisms in SNNs. In the context of recognizing complex spatial-temporal patterns with a significant time duration, it is essential for the network to have a working memory mechanism that stores information about the start of the pattern throughout its duration, and integrates data about temporally spaced stimuli into a single picture. Using a genetic algorithm, we found a neuron model, plasticity rules, and network structure that show good results for the formation of working memory.
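A search of this kind can be sketched as a simple genetic algorithm over a genome of network parameters. Everything below is an illustrative assumption: the genome (membrane time constant, firing threshold, plasticity rate) and the fitness function are placeholders, whereas the real fitness would score the working-memory performance of the resulting network.

```python
import random

random.seed(1)

# Toy genome: (membrane tau, firing threshold, plasticity rate).
# Placeholder fitness: distance to an assumed "good" parameter set; the
# real objective would evaluate working-memory behavior of the network.
TARGET = (20.0, 1.0, 0.01)

def fitness(g):
    return -sum((a - b) ** 2 for a, b in zip(g, TARGET))

def mutate(g, scale=0.1):
    return tuple(x + random.gauss(0, scale * abs(x) + 1e-3) for x in g)

def crossover(a, b):
    return tuple(random.choice(pair) for pair in zip(a, b))

pop = [mutate(TARGET, scale=2.0) for _ in range(30)]   # random initial pool
best0 = max(map(fitness, pop))
for gen in range(40):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]                      # elitism: keep the best unchanged
    pop = elite + [mutate(crossover(random.choice(elite),
                                    random.choice(elite)))
                   for _ in range(20)]

best = max(pop, key=fitness)
print(fitness(best) >= best0)  # -> True (elitism never loses the best genome)
```

The same loop applies unchanged when the genome also encodes discrete choices such as plasticity rule variants or connectivity structure; only `mutate` and `crossover` need to respect those types.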

3

We are researching preprocessing layers for the Winner Takes All (WTA) architecture. One result is a set of metrics and criteria for determining, from a data sample, whether a preprocessing layer is useful for a given type of task.
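The WTA mechanism itself reduces to lateral inhibition: the most strongly driven neuron fires and suppresses the rest. The sketch below shows a hard WTA stage behind an assumed placeholder preprocessing layer (a random projection); the actual preprocessing layers and usefulness metrics under study are not reproduced here.

```python
import numpy as np

def wta_layer(input_currents):
    """Hard winner-takes-all: lateral inhibition lets only the most
    strongly driven neuron emit a spike (one-hot output)."""
    out = np.zeros_like(input_currents)
    out[np.argmax(input_currents)] = 1.0
    return out

# A preprocessing layer (here a random projection, an assumed placeholder)
# feeds the WTA stage; a usefulness metric could then measure how well
# the resulting winners separate inputs of different classes.
rng = np.random.default_rng(42)
W_pre = rng.normal(size=(8, 4))          # 4 inputs -> 8 feature neurons

x = np.array([0.9, 0.1, 0.4, 0.2])
winner = wta_layer(W_pre @ x)
print(winner.sum())  # exactly one neuron fires
```

In a spiking implementation the `argmax` is realized in time rather than algebraically: the first neuron to reach threshold fires and inhibits its competitors, but the one-hot outcome is the same.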

4

We are developing a hardware-software neuromorphic platform, including memristor-based applications, in collaboration with Motiv.

Cognitive architectures

The development of cognitive architectures will lead to the emergence of new SNN capabilities with valuable applications.

Our research team is focused on developing cognitive architectures.

Cognitive architecture would unlock the following capabilities of neural networks, all of which are of practical importance in the field of security and beyond:

  • Long-term contextual memory
  • Continuous learning
  • Awareness of environment
  • Actions in environment
  • Formulation and manipulation of data abstractions
  • Risk assessment
  • Motivation and goal setting
  • Long-term forecasting and planning

Here are just some of the tasks we are addressing in this area:

1
Data symbolization

Symbolization imparts structure and hierarchy to information. A symbolization mechanism is implemented to some extent in convolutional neural networks; however, the potential of SNNs, as biologically inspired networks, is greater, and the task of manipulating symbolic information in SNNs needs to be addressed.

2
Long-term memory

The development of data symbolization mechanisms in SNNs could be the key to solving the problem of long-term memory, because symbols capture information as complete images that can then be extracted from memory.

3
Spiking network macroarchitecture

We are researching macroarchitectures with a view to solving the above problems.