Exploring Emerging Device Physics for Efficient Spin-Based Neuromorphic Computing

Author: Kezhou Yang
Release: 2023



In the past decade, artificial intelligence has undergone vast development thanks to deep learning techniques. However, large computational overhead limits the application of AI in scenarios where area and energy budgets are constrained, a consequence of the architectural mismatch between von Neumann hardware and deep learning algorithms. As a promising solution to this problem, neuromorphic computing has attracted great research interest. While there are efforts to build neuromorphic computing systems on CMOS technology, memristors, which provide intrinsic dynamics similar to those of synapses and neurons, are also under exploration. Among the different types of memristors, this dissertation focuses on spintronic devices, which offer richer neural and synaptic functionalities at a low operating voltage.

The work in this dissertation comprises both simulation and experimental parts. On the simulation side, a stochastic neuron design based on a magnetic tunnel junction exploiting the magnetoelectric effect is proposed. The stochastic neurons are used to build spiking neural networks, which show improved spike sparsity while maintaining good test accuracy. Beyond spiking neural networks, an all-spin Bayesian neural network is proposed, in which the intrinsic stochasticity of scaled devices is exploited for random number generation. A magnetic tunnel junction based on the voltage-controlled magnetic anisotropy effect is explored and used to solve the write sneak-path problem in crossbar array structures.

On the experimental side, Hall bars are fabricated on ferromagnetic/heavy-metal material stacks and operated as neurons. The relation between Hall bar characteristics and device size is explored, and hardware-in-the-loop training with Hall bar neurons is studied.
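To give a rough feel for the stochastic-neuron idea mentioned above, the sketch below models a magnetic tunnel junction neuron whose switching (spiking) probability rises sigmoidally with the input current. The sigmoid form, the critical current `i_c`, and the sharpness `beta` are illustrative assumptions for this sketch, not the device model developed in the dissertation.

```python
import numpy as np

rng = np.random.default_rng(0)

def mtj_neuron_spike(i_in, i_c=1.0, beta=4.0):
    """Hypothetical stochastic MTJ neuron.

    The probability of the junction switching (emitting a spike) is
    modeled as a sigmoid of the input current relative to an assumed
    critical current i_c; beta sets how sharp the transition is.
    Returns (spike, probability).
    """
    p = 1.0 / (1.0 + np.exp(-beta * (i_in - i_c)))
    return rng.random() < p, p

# Sub-threshold input spikes rarely; strong input spikes almost always,
# which is the source of the sparse, stochastic firing behavior.
spike_low, p_low = mtj_neuron_spike(0.2)
spike_high, p_high = mtj_neuron_spike(2.0)
```

Because firing is probabilistic rather than deterministic, weak inputs mostly stay silent, which is one intuition for the improved spike sparsity reported for networks built from such neurons.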