
Current Research

You can find our current research projects here. For the bigger picture, be sure to check out our Research Roadmap and approach to AGI.


Episodic Memory

2018

Imagine trying to accomplish everyday tasks with only a memory for generic facts – without even remembering who you are and what you’ve done so far! Yet that is effectively how most AI/ML algorithms operate.

We’re developing a complementary learning system with a long-term memory akin to the neocortex and a shorter-term system analogous to the hippocampus.

The objective is to enable memory of specific combinations of states, or episodic memory, which enhances the learning and recall of ‘typical’ patterns (i.e. classification), or semantic memory. This in turn enables a self-narrative, faster learning from less data, and the ability to build on existing knowledge.
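
To make the idea concrete, here is a minimal Python sketch of a complementary learning system in the spirit described above: a fast ‘hippocampal’ store memorizes specific states after a single exposure, while a slow ‘cortical’ learner gradually consolidates them into typical patterns. All class names, sizes and learning rates are illustrative, not our actual implementation.

import numpy as np

class EpisodicStore:
    """Fast, one-shot 'hippocampal' memory: stores specific states verbatim."""
    def __init__(self):
        self.keys, self.labels = [], []

    def write(self, state, label):
        self.keys.append(np.asarray(state, dtype=float))
        self.labels.append(label)

    def recall(self, state):
        # Nearest-neighbour lookup of the most similar stored episode.
        dists = [np.linalg.norm(k - state) for k in self.keys]
        return self.labels[int(np.argmin(dists))]

class SemanticMemory:
    """Slow 'cortical' learner: gradually averages episodes into class prototypes."""
    def __init__(self, dim, n_classes, lr=0.05):
        self.prototypes = np.zeros((n_classes, dim))
        self.lr = lr

    def consolidate(self, state, label):
        # Small incremental update, so 'typical' patterns emerge slowly.
        self.prototypes[label] += self.lr * (state - self.prototypes[label])

    def classify(self, state):
        return int(np.argmin(np.linalg.norm(self.prototypes - state, axis=1)))

episodic = EpisodicStore()
semantic = SemanticMemory(dim=4, n_classes=2)
x, y = np.array([1.0, 0.0, 1.0, 0.0]), 1
episodic.write(x, y)        # remembered immediately, after one exposure
semantic.consolidate(x, y)  # absorbed gradually, e.g. over many replays
print(episodic.recall(x), semantic.classify(x))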

Roadmap: Continuous Learning

 


Predictive Capsules

2018

We believe that capsule networks promise inherently better generalization, which is a key weakness of conventional artificial neural networks.

We published an initial paper on unsupervised sparse capsules earlier this year, extending the work of Sabour et al. to allow only local, unsupervised training, and arguably obtained much better generalization. We are now developing a deeper understanding of capsules and how they might be implemented by pyramidal neurons.
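
For background, the routing-by-agreement step at the heart of Sabour et al.’s capsules can be sketched in a few lines of numpy; our sparse capsules change how the capsules are trained rather than this routing loop, and the shapes and iteration count below are illustrative only.

import numpy as np

def squash(s, axis=-1, eps=1e-8):
    """Non-linearity from Sabour et al.: shrinks short vectors, preserves direction."""
    norm_sq = np.sum(s ** 2, axis=axis, keepdims=True)
    return (norm_sq / (1.0 + norm_sq)) * s / np.sqrt(norm_sq + eps)

def route(u_hat, n_iters=3):
    """Dynamic routing between one layer of capsules and the next.

    u_hat: predictions of child capsules for parent capsules,
           shape (n_children, n_parents, parent_dim).
    """
    n_children, n_parents, _ = u_hat.shape
    b = np.zeros((n_children, n_parents))         # routing logits
    for _ in range(n_iters):
        # Each child distributes its output across parents (softmax over parents).
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)
        s = np.einsum('ij,ijk->jk', c, u_hat)     # weighted sum per parent
        v = squash(s)                              # parent capsule outputs
        b += np.einsum('ijk,jk->ij', u_hat, v)     # agreement updates the logits
    return v

# Illustrative shapes: 6 child capsules predicting 3 parent capsules of dimension 8.
v = route(np.random.randn(6, 3, 8))
print(v.shape)  # (3, 8)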

Roadmap: Representation


Continuous online learning of sparse representations

2017-2018

This project was the foundation of our approach to learning representations of data, with ambitious criteria – continuous, online, unsupervised learning of sparse distributed representations, resulting in state-of-the-art performance even given nonstationary input. We reviewed a broad range of historical techniques and experimented with some novel combinations of older competitive learning methods and modern convolutional networks. We obtained some fundamental insights into effective sparse representations and how to train them.
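
As a rough illustration of the kind of scheme we mean, here is a toy online k-winners-take-all competitive layer in Python: each streamed sample is encoded by its k most responsive units, and only those winners adapt. The rule and parameters are illustrative, not the specific algorithm developed in this project.

import numpy as np

class OnlineKSparseLayer:
    """Toy online competitive learner producing sparse distributed codes.

    Each input is encoded by its k most responsive units (the winners), and
    only the winners' weights move toward the input - a classic competitive rule.
    """
    def __init__(self, n_inputs, n_units, k=5, lr=0.01, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(scale=0.1, size=(n_units, n_inputs))
        self.k, self.lr = k, lr

    def encode(self, x):
        scores = self.w @ x
        winners = np.argsort(scores)[-self.k:]      # k most active units
        code = np.zeros(len(self.w))
        code[winners] = scores[winners]              # sparse distributed code
        return code, winners

    def update(self, x, winners):
        # Online update: winners move toward the current input, tracking a
        # nonstationary input distribution without storing a dataset.
        self.w[winners] += self.lr * (x - self.w[winners])

layer = OnlineKSparseLayer(n_inputs=16, n_units=64, k=5)
for _ in range(1000):                                # streaming, one sample at a time
    x = np.random.rand(16)
    code, winners = layer.encode(x)
    layer.update(x, winners)
print(np.count_nonzero(code), "of 64 units active")  # sparse code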

Roadmap: Continual learning


Sequence learning: Alternatives to Backpropagation Through Time

2018-2019

We are intensely interested in biologically plausible alternatives to backpropagation through time (BPTT). BPTT is used to associate causes and effects that are widely separated in time. The problem is that it requires storing the network’s intermediate states for every time-step up to a fixed horizon (e.g. 1000 steps) so that gradients can be computed for all synaptic weights. Not only is this memory intensive, but the finite time window is also very restrictive. There is no neurological equivalent to BPTT – nature does it another way, which we hope to copy.
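
To see where the memory cost comes from, here is a toy numpy sketch of BPTT for a single-layer recurrent network: the forward pass must cache the hidden state at every one of the T time-steps so the backward pass can unroll through them, so memory grows linearly with the horizon. The model, sizes and loss placement are illustrative.

import numpy as np

def bptt_grad(W, xs, dL_dh_T):
    """Toy BPTT for h_t = tanh(W @ h_{t-1} + x_t); note the cache of all T states."""
    T, H = len(xs), W.shape[0]
    hs = [np.zeros(H)]
    for x in xs:                           # forward pass: every hidden state is stored
        hs.append(np.tanh(W @ hs[-1] + x))
    dW, dh = np.zeros_like(W), dL_dh_T     # gradient arrives at the final step only
    for t in range(T, 0, -1):              # backward pass replays the cached states
        dpre = dh * (1.0 - hs[t] ** 2)     # derivative of tanh
        dW += np.outer(dpre, hs[t - 1])
        dh = W.T @ dpre
    return dW

H, T = 256, 1000
xs = [np.random.randn(H) * 0.01 for _ in range(T)]
dW = bptt_grad(np.eye(H) * 0.5, xs, np.random.randn(H))
# Memory for the cached states alone is T x H floats, scaling linearly with T.
print(f"cached states: {T * H * 8 / 1e6:.1f} MB for T={T}, H={H}")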

Roadmap: Sequence learning