An Atomic Boltzmann Machine Capable of On-chip Learning

July 2020

Abstract

The Boltzmann Machine (BM) is a neural network composed of stochastically firing neurons that can learn complex probability distributions by adapting the synaptic interactions between the neurons. BMs represent a broad, generic class of stochastic neural networks that can be used for data clustering, generative modelling and deep learning. A key drawback of software-based stochastic neural networks is the Monte Carlo sampling they require, whose computational cost scales intractably with the number of neurons. Here, we realize a physical implementation of a BM directly in the stochastic spin dynamics of a gated ensemble of coupled cobalt atoms on the surface of semiconducting black phosphorus. Using scanning tunnelling microscopy to implement the concept of orbital memory, we demonstrate the bottom-up construction of atomic ensembles whose stochastic current noise is defined by a reconfigurable multi-well energy landscape. Exploiting the anisotropic behaviour of black phosphorus, we build ensembles of atoms with two well-separated intrinsic time scales that represent neurons and synapses. By characterizing the conditional steady-state distribution of the neurons for given synaptic configurations, we illustrate that an ensemble can represent many distinct probability distributions. By probing the intrinsic synaptic dynamics, we reveal an autonomous reorganization of the synapses in response to external electrical stimuli. This self-adaptive architecture paves the way for on-chip learning directly in atomic-scale machine learning hardware.
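To make the sampling bottleneck concrete, the sketch below shows how a conventional software BM is sampled by Gibbs updates. It is a minimal illustrative toy, not code from this work: the network size, weights and function names are all hypothetical. Each full sweep updates every neuron conditioned on all the others, so the work per sweep grows with the number of neurons, and the number of sweeps needed for the chain to equilibrate typically grows much faster; the atomic ensembles described above instead draw such samples physically, through their intrinsic stochastic dynamics.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Toy, hypothetical parameters: n binary neurons s_i in {0, 1},
# symmetric synaptic weights W with zero diagonal, and biases b.
n = 5
W = rng.normal(scale=0.5, size=(n, n))
W = (W + W.T) / 2.0
np.fill_diagonal(W, 0.0)
b = rng.normal(scale=0.1, size=n)


def gibbs_sweeps(num_sweeps, s=None):
    """Run one Markov chain over the joint neuron state.

    Samples approach the Boltzmann distribution over states only
    after many sweeps; this equilibration is the Monte Carlo cost
    referred to in the abstract.
    """
    if s is None:
        s = rng.integers(0, 2, size=n)
    samples = []
    for _ in range(num_sweeps):
        for i in range(n):
            # Conditional firing probability of neuron i given the
            # current state of all other neurons (sigmoid of its
            # local field).
            p_on = 1.0 / (1.0 + np.exp(-(W[i] @ s + b[i])))
            s[i] = rng.random() < p_on
        samples.append(s.copy())
    return np.array(samples)


states = gibbs_sweeps(10_000)
print("empirical mean activity per neuron:", states.mean(axis=0))
```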
