Cornell University

Title: Do we really need all that precision in AI at every single step?

Abstract: Digital computing insists on exactness, carrying a fixed precision through every step of a computation. Even on analog platforms, the usual goal has been to suppress noise and approach digital-like accuracy as closely as possible. But does AI really need that much precision, especially when chasing it costs so much energy? I will introduce *physics-aware stochastic training*: rather than fighting the inherent uncertainty of physical analog devices, we turn it to our advantage. Demonstrated on an optical computing platform, this approach yields reliable AI model inferences using roughly half a photon per neuron activation (dot product plus activation function), restricted only by the quantum limit. It opens a new lane for energy-efficient AI accelerators that embrace, rather than suppress, the messiness of the physical world.
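A minimal sketch of the idea (my own illustration, not the speaker's implementation): model the optical dot product as a Poisson photon-count readout at a fixed photon budget, so the noise sits at the shot-noise (quantum) limit, and run that same noisy forward pass inside the training loop so the weights adapt to the noise statistics. The half-photon budget, the logistic toy model, and the "gradient as if exact" update are all illustrative assumptions.

```python
import numpy as np

def noisy_dot(W, x, photons=0.5, rng=None):
    """Simulate an analog optical dot product read out at a small photon budget.

    The exact result W @ x is encoded in optical intensity and detected as a
    Poisson photon count, so the readout noise sits at the shot-noise
    (quantum) limit. `photons` is the mean photon count per output element.
    """
    rng = rng or np.random.default_rng()
    z = W @ x
    scale = photons / (np.mean(np.abs(z)) + 1e-12)  # photons per unit of signal
    return np.sign(z) * rng.poisson(np.abs(z) * scale) / scale

def train_noise_aware(X, y, photons=0.5, lr=0.1, epochs=200, seed=0):
    """Physics-aware stochastic training, sketched as noisy-forward SGD:
    the same stochastic readout used at inference time runs inside the
    training loop, so the learned weights account for the device noise
    instead of assuming exact arithmetic."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = noisy_dot(w[None, :], xi, photons, rng)[0]
            p = 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))  # logistic activation
            w -= lr * (p - yi) * xi  # gradient computed as if the pass were exact
    return w
```

Raising `photons` tightens the readout toward the exact dot product; the point of the training scheme is that even at a sub-photon budget, weights trained through the noisy readout still give usable inferences on average.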
