
Speaker "Michael Andregg" Details Back

 

Topic

Machine learning at the human brain scale: Staying at the frontier with novel hardware

Abstract

Neural networks and machine learning have achieved astounding results, but we are still a long way from general human-level performance. One big reason is that the largest artificial neural networks have at least a million times fewer parameters than the human brain. It is well recognized that better hardware yields better training performance on many datasets, and all of the largest ML powerhouses (Microsoft, Google, Facebook, Baidu) have large internal projects to improve ML hardware. This talk will cover the performance limitations of the latest physical hardware (ASICs, FPGAs, GPUs) for running today's machine learning algorithms, and will also survey recent progress on novel ML hardware implementations from a physics perspective. Finally, I'll cover Fathom's own optical computing approach, which has the potential to achieve significant breakthroughs in training performance.

Profile

Michael Andregg is chief of strategy and co-founder of Fathom Computing, a startup building optics-based computing hardware for AI.