A Provably Easy Construction of High-Accuracy Random Binary Neural Networks

Start: 
Friday, April 3, 2026 12:00 pm
End: 
Friday, April 3, 2026 12:50 pm
Location: 
STAG 110
Nick Marshall

In this talk, we describe a novel randomized algorithm for constructing binary neural networks with tunable accuracy. The approach is motivated by hyperdimensional computing (HDC), a brain-inspired paradigm that leverages high-dimensional vector representations, offering efficient hardware implementation and robustness to model corruptions. Unlike traditional low-precision methods based on quantization, we consider binary embeddings of data as points in the hypercube equipped with the Hamming distance. We propose a novel family of floating-point neural networks, G-Nets, which are general enough to mimic standard network layers. Each floating-point G-Net has a randomized binary embedding, an embedded hyperdimensional (EHD) G-Net, that retains the accuracy of its floating-point counterpart, with theoretical guarantees that follow from concentration of measure. This talk is based on joint work with Alireza Aghasi, Saeid Pourmand, and Wyatt Whiting.
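To give a flavor of the ideas in the abstract, the following is a minimal illustrative sketch of a randomized binary embedding into the hypercube with the Hamming distance, using classical random-hyperplane (sign-projection) hashing. This is an assumption for illustration only, not the G-Net or EHD G-Net construction from the talk; it is meant only to show how concentration of measure makes a high-dimensional random binary embedding approximately preserve geometry.

```python
import numpy as np

rng = np.random.default_rng(0)

def binary_embed(X, R):
    # Map a real vector to a point in the hypercube {0,1}^D via the
    # signs of a random Gaussian projection (random-hyperplane hashing).
    # Hypothetical example; not the construction from the talk.
    return (X @ R > 0).astype(np.uint8)

def hamming(a, b):
    # Hamming distance between two binary vectors.
    return int(np.count_nonzero(a != b))

d, D = 16, 4096                      # input dim, hyperdimensional embedding dim
R = rng.standard_normal((d, D))      # shared random projection

x = rng.standard_normal(d)
y = rng.standard_normal(d)
bx, by = binary_embed(x, R), binary_embed(y, R)

# For random-hyperplane hashing, the expected normalized Hamming distance
# equals angle(x, y) / pi; concentration of measure makes the estimate
# tight as D grows.
cos = x @ y / (np.linalg.norm(x) * np.linalg.norm(y))
angle_frac = np.arccos(np.clip(cos, -1.0, 1.0)) / np.pi
print(hamming(bx, by) / D, angle_frac)
```

With D = 4096 the empirical normalized Hamming distance typically lands within a few percent of the angle-based prediction, which is the kind of high-probability guarantee that concentration arguments formalize.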