Density Networks and Neuromorphic Computing

How algorithms and hardware inspired by the brain can unlock new potential 

ChatGPT and other large statistical learning approaches are impressive, interesting, and often useful, but they cannot replicate the flexibility, efficiency, and mastery of the human mind. 

At Cambrya, we expect artificial learning will begin to simulate that natural elegance as algorithms and hardware move closer to the level of integration seen in the brain. 

You can hear a new sound and quickly notice it, identify it as novel, and recognize it if you hear it again – even in an environment with other sounds. No single equation or process in your mind produces this result, and no information is extracted or organized into a database. Instead, a million tiny, simultaneous interactions alter the complex and constantly changing hardware of your brain.

For decades, the field of neuromorphic computing has been developing hardware and software inspired by biological constraints that can work together to simulate behaviors we see in animal brains. However, neuromorphic approaches have consistently struggled to solve complex, real-world tasks. 

Using a simulation of neuromorphic architecture, a newly initialized instance of our prototype hearing Density Network can listen to a duet for the first time and, within seconds, learn, identify, follow, and extract the sounds of each individual instrument. We are continuing to develop the Density Network's ability to extract distinct voices in a crowded environment – effectively solving the cocktail party problem. 


Density Networks are fundamentally compatible with neuromorphic hardware. All information in the network is represented entirely by the following three elements (illustrated in the sketch after this list):

  1. the circuit structure of the network itself; 

  2. the autonomously governed states of each neuron within the network; and 

  3. the localized transmission of analog signals between interconnected neurons. 
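
Cambrya has not published the internals of Density Networks, so the following is only a minimal data-structure sketch of those three carriers; every name, type, and rule below is a hypothetical illustration, not the actual implementation. Connection maps stand in for the circuit structure, each neuron owns its own state, and analog signals may only pass between neurons that are directly connected.

```python
from __future__ import annotations
from dataclasses import dataclass, field


@dataclass
class Neuron:
    """Hypothetical neuron: it owns only its own, locally governed state."""
    state: float = 0.0                                          # 2. autonomously governed state
    outgoing: dict[int, float] = field(default_factory=dict)    # 1. circuit structure (target id -> strength)


@dataclass
class DensityNetworkSketch:
    """Toy container for the three information carriers listed above."""
    neurons: dict[int, Neuron] = field(default_factory=dict)
    pending: list[tuple[int, int, float]] = field(default_factory=list)  # 3. analog signals in flight

    def transmit(self, src: int, dst: int, value: float) -> None:
        """Queue a localized analog signal; only directly connected neurons may exchange one."""
        if dst in self.neurons[src].outgoing:
            self.pending.append((src, dst, value))
```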


All of this sound source separation happens locally within the Density Network, without any prior knowledge of or access to external data. This means that Density Networks are inherently more efficient than existing AI approaches, in both training requirements and computation costs. 

Unlike many current AI approaches, Density Networks do not make statistical predictions. Rather, each neuron within a Density Network is governed by a finite set of interactive behaviors that respond to changing stimuli to coordinate the development of new connections and concepts. Each neuron contains smaller structures that receive, store, manipulate, and send information to inform that decision-making process. Complex coordination emerges from these processes happening in parallel across the network, and the network's shape evolves to reflect the shapes of stimuli in the environment. By interpreting this evolution through a suite of analytical tools, we can extract useful outputs such as separated sound sources and gain insight into how each part of the network contributes to source identification.
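
As a deliberately simplified illustration of that idea, the sketch below gives each neuron a small, fixed set of local behaviors: buffer incoming signals, fold them into its own state, grow a new connection when local activity is high, and pass attenuated signals to its neighbors. The thresholds, decay factor, and growth rule are hypothetical placeholders rather than the behaviors Cambrya's neurons actually use; the point is only that network-level structure can emerge from rules that never consult anything beyond a neuron's own state and inbox.

```python
from __future__ import annotations
import random


class LocalNeuron:
    """Illustrative neuron with a small, fixed repertoire of purely local behaviors."""

    GROWTH_THRESHOLD = 1.0   # hypothetical activity level at which a new connection forms
    DECAY = 0.9              # hypothetical retention of stored state between steps
    ATTENUATION = 0.4        # hypothetical fraction of state forwarded to neighbors

    def __init__(self, nid: int):
        self.nid = nid
        self.inbox: list[float] = []       # receive: buffered incoming analog signals
        self.state: float = 0.0            # store: the neuron's own internal state
        self.neighbors: set[int] = set()   # this neuron's local view of the circuit structure

    def step(self, network: dict[int, "LocalNeuron"]) -> None:
        # manipulate: integrate buffered stimuli into local state, with decay
        self.state = self.DECAY * self.state + sum(self.inbox)
        self.inbox.clear()

        # local behavior 1: grow a new connection when activity is high enough
        if self.state > self.GROWTH_THRESHOLD:
            candidate = random.choice(list(network))
            if candidate != self.nid:
                self.neighbors.add(candidate)

        # local behavior 2 (send): forward an attenuated analog signal to each neighbor
        for nid in self.neighbors:
            network[nid].inbox.append(self.ATTENUATION * self.state)


# Stepping every neuron "at once" (sequentially here, in parallel on real hardware)
# lets coordination emerge with no global controller: the set of connections that
# forms is itself the network's record of the stimuli it has experienced.
network = {i: LocalNeuron(i) for i in range(8)}
network[0].inbox.append(2.0)               # an external stimulus arriving at one neuron
for _ in range(20):
    for neuron in network.values():
        neuron.step(network)
```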

Every aspect of network processing, from perceiving to learning to recognizing to responding, happens continuously and simultaneously, in full adherence to neuromorphic principles and constraints.
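
One consequence worth spelling out is that there is no separate training phase in this picture. Reusing the hypothetical LocalNeuron from the previous sketch, a continuous processing loop might look like the following, where every incoming sample is perceived, learned from, and reflected in the network's state within the same sweep; this is an assumed structure for illustration, not Cambrya's actual pipeline.

```python
from __future__ import annotations
from typing import Iterable, Iterator


def run_continuously(network: dict[int, LocalNeuron],
                     audio_stream: Iterable[float],
                     input_id: int = 0) -> Iterator[dict[int, float]]:
    """Feed an open-ended stream of analog samples through the network.

    Perceiving, learning, and recognizing are not separate phases: every
    sample updates connections and states in the same pass, and the caller
    can read out the evolving network state at any moment.
    """
    for sample in audio_stream:                 # perceiving: stimuli arrive as they occur
        network[input_id].inbox.append(sample)  # hypothetical input neuron
        for neuron in network.values():         # learning and recognizing in one sweep
            neuron.step(network)
        yield {nid: n.state for nid, n in network.items()}  # responding: observable state
```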

We believe that the synchronous development of these brain-inspired algorithms and hardware will change our expectations of what a computer can accomplish. Our aim is to collaborate with neuromorphic researchers to truly bring this next generation of computing to life through the joint design of biologically plausible algorithms and hardware.
