How Facebook deals with the fact AI is a mess on smartphones

Posted on: Dec 29, 2018

Facebook has a whole set of internal tools to optimize its neural networks to run on mobile devices. Still, the company finds it difficult to navigate a smartphone market that is byzantine in its complexity, with thousands of different chipsets, most of them offering poor performance, and software stacks that aren't quite up to the job.

AI on mobile devices is a bit of a mess, and it's a headache for Facebook, which gets 90% of its advertising revenue from people using its service on mobile.

Those are some of the takeaways from a recent research paper by Facebook's AI researchers, who detail the tricks they've had to devise to work around the hardware shortcomings of mobile devices.

That includes things like tweaking how many "threads" an application uses, settling on a common denominator across a plethora of different chip designs and capabilities. That means they generally can't "optimize" their code for any given device.
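The paper's actual heuristic isn't given here, but the lowest-common-denominator idea can be sketched as a thread-count picker. Everything below is illustrative: the function name, the big.LITTLE halving, and the cap of 4 are assumptions, not Facebook's published numbers.

```python
def pick_thread_count(reported_cores: int) -> int:
    """Pick a conservative inference thread count for an unknown mobile SoC.

    Hypothetical heuristic: many mobile chips pair fast "big" cores with
    slow "LITTLE" ones, so assume only about half the reported cores are
    worth using, cap the result to avoid oversubscription on high-core-count
    phones, and never return less than one thread.
    """
    usable = max(1, reported_cores // 2)  # assume half the cores are fast
    return min(usable, 4)                 # common-denominator cap
```

A runtime would call this once at startup with the device's reported core count (e.g. `os.cpu_count()` on platforms that expose it) rather than tuning per chipset.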

Despite Facebook's enormous resources, there's a lot the whole industry still needs to do, they write.

The paper, "Machine Learning at Facebook: Understanding Inference at the Edge," is posted on the publications page at Facebook's research site, and is authored by Carole-Jean Wu and 25 colleagues. It has been submitted to the IEEE International Symposium on High-Performance Computer Architecture, taking place next February in Washington, D.C.

The authors outline the two-pronged problem: More and more, there's a need to perform AI on mobiles. But the landscape of chips and software in the devices is a "Wild West," a mess of different parts, different software APIs, and generally poor performance.

There's a need for apps on "edge" devices, including smartphones but also Oculus Rift headsets and other devices, to perform "inference," the part of machine learning where the computer uses its trained neural network to answer questions.
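To make "inference" concrete: it is the forward pass only, applying weights that were fixed during training to a new input, with no learning on the device. The toy network below is purely illustrative (made-up weights, two inputs, two hidden units), not anything from the paper.

```python
def relu(xs):
    """Rectified linear activation: zero out negative values."""
    return [max(0.0, v) for v in xs]

def dense(x, weights, bias):
    """One fully connected layer: matrix-vector product plus bias."""
    return [sum(w * xi for w, xi in zip(row, x)) + b
            for row, b in zip(weights, bias)]

# Inference: run a new input through pre-trained (here, made-up) weights.
x = [1.0, 2.0]
hidden = relu(dense(x, [[0.5, -0.25], [0.1, 0.3]], [0.0, 0.1]))
```

On a phone, the same computation runs against the SoC's CPU, GPU, or DSP; the paper's point is that which of those is actually usable varies wildly across the two-thousand-plus chipsets in the field.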

The authors cite things such as performing inference in real time on images that are going to be uploaded to Instagram as the kind of task that needs local processing to avoid the latency of going to the cloud to do inference.

But Facebook is up against frankly crummy hardware when considering the vast array of smartphones in the wild.

The company's "neural network engine is deployed on over one billion mobile devices," they point out, comprising "over two thousand unique SOCs [system on a chip, a semiconductor composed of not just a microprocessor but other compute blocks]." That's across ten thousand different models of phones and tablets.