How TinyML Makes Artificial Intelligence Ubiquitous

Posted on: Nov 03, 2020

TinyML is the latest development from the world of deep learning and artificial intelligence. It brings the capability to run machine learning models on ubiquitous microcontrollers - the small electronic chips present almost everywhere.

Microcontrollers are the brains of many devices we use almost every day. From TV remote controls to elevators to smart speakers, they are everywhere. Multiple sensors that emit telemetry data are connected to a microcontroller. Actuators, such as switches and motors, are connected to the same microcontroller. It carries embedded code that acquires data from the sensors and controls the actuators.
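As a rough illustration of what that embedded code does, the sketch below polls a simulated temperature sensor and drives a simulated fan actuator. The sensor readings, threshold, and function names are hypothetical stand-ins, not a real hardware abstraction layer; on actual hardware the sensor read would be an ADC call and the actuator a GPIO write.

```python
def read_temperature_sensor(raw_readings, step):
    """Simulated sensor read; on real hardware this would read an ADC."""
    return raw_readings[step % len(raw_readings)]

def set_fan(on):
    """Simulated actuator; on real hardware this would toggle a GPIO pin."""
    return "FAN_ON" if on else "FAN_OFF"

def control_loop(raw_readings, threshold_c=30.0, steps=4):
    """The classic embedded pattern: sense, decide, actuate, repeat."""
    states = []
    for step in range(steps):
        temp_c = read_temperature_sensor(raw_readings, step)
        states.append(set_fan(temp_c > threshold_c))
    return states

states = control_loop([25.0, 31.5, 29.0, 33.2])
# ['FAN_OFF', 'FAN_ON', 'FAN_OFF', 'FAN_ON']
```

On a real microcontroller this loop runs forever in firmware; TinyML's contribution is to replace the hard-coded threshold decision with a learned model evaluated in the same loop.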

The rise of TinyML marks a significant shift in how end-users consume AI. Vendors from the hardware and software industries are collaborating to bring AI models to the microcontrollers.

The ability to run sophisticated deep learning models embedded within an electronic device opens up many avenues. TinyML needs no edge device, cloud backend, or Internet connectivity. It runs locally on the same microcontroller that already contains the logic to manage the connected sensors and actuators.
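To make "running a model on a microcontroller" concrete, here is a minimal sketch of the kind of computation a TinyML runtime performs on-device: an 8-bit quantized fully connected layer evaluated with integer arithmetic only, since many microcontrollers lack floating-point hardware. The weights, scales, and inputs are toy values for illustration, not from any real trained model.

```python
def quantize(x, scale, zero_point=0):
    """Map a real value into the int8 range using a scale factor."""
    q = round(x / scale) + zero_point
    return max(-128, min(127, q))  # clamp to int8

def dense_int8(inputs_q, weights_q, in_scale, w_scale, out_scale):
    """One quantized dense layer: int8 multiply, wide accumulate, rescale."""
    # Accumulate products in a wide integer, as embedded ML kernels do.
    acc = sum(i * w for i, w in zip(inputs_q, weights_q))
    # Rescale the accumulator back into the output's quantized domain.
    return quantize(acc * in_scale * w_scale / out_scale, 1.0)

inputs_q = [quantize(v, 0.05) for v in [0.5, -0.25, 1.0]]
weights_q = [quantize(v, 0.01) for v in [0.2, 0.4, -0.1]]
out_q = dense_int8(inputs_q, weights_q, 0.05, 0.01, 0.02)
# out_q == -5, matching the float result (-0.1) at output scale 0.02
```

Real TinyML frameworks such as TensorFlow Lite for Microcontrollers implement the same idea with optimized fixed-point kernels, which is what lets models fit in kilobytes of memory.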

The Evolution of TinyML

To appreciate the power of TinyML, we need to understand the evolution of AI in the cloud and at the edge.

Phase 1 - AI in the Cloud

During the early days of AI, machine learning models were trained and hosted in the cloud. The massive compute power needed to run AI made the cloud the ideal choice. Developers and data scientists leveraged high-end CPUs and GPUs to train the models and then hosted them for inference. Every application that consumed AI talked to the cloud, and the application in turn talked to the microcontroller to manage the sensors and actuators.
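In this Phase 1 pattern, the application never runs the model itself; every prediction is an HTTP round trip to a cloud endpoint. The sketch below shows that client shape. The endpoint URL and JSON payload structure are hypothetical assumptions for illustration, not a specific vendor's API.

```python
import json
from urllib import request

# Hypothetical cloud prediction endpoint, not a real service.
CLOUD_ENDPOINT = "https://example.com/v1/models/sensor-model:predict"

def build_predict_request(sensor_values):
    """Serialize sensor telemetry into a JSON body for the cloud model."""
    return json.dumps({"instances": [sensor_values]})

def predict_in_cloud(sensor_values):
    """Send one prediction request; a network round trip on every call."""
    req = request.Request(
        CLOUD_ENDPOINT,
        data=build_predict_request(sensor_values).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req, timeout=5) as resp:
        return json.load(resp)["predictions"][0]
```

The key property to notice is architectural: the device only collects data and applies results, while all intelligence lives behind the network, which is exactly what the later phases move away from.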

Phase 2 - AI at the Edge

While the cloud continues to be the logical home for AI, it introduces latency whenever an application consumes a deep learning model. Imagine every request going to the cloud for processing each time you speak to a smart speaker. The delay involved in the round trip kills the experience. Other scenarios, such as industrial automation, smart healthcare, and connected vehicles, demand that AI models run locally.
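The round-trip cost can be framed as simple latency-budget arithmetic. The numbers below are illustrative assumptions, not measurements: even if cloud hardware runs the model much faster, the network round trip dominates the user-perceived response time, which is the case for moving inference to the edge and ultimately onto the microcontroller itself.

```python
def response_time_ms(inference_ms, network_round_trip_ms=0.0):
    """Total user-perceived latency: compute time plus any network cost."""
    return inference_ms + network_round_trip_ms

# Assumed figures: fast cloud inference behind a slow round trip,
# versus slower on-device inference with no network at all.
cloud_total = response_time_ms(inference_ms=5.0, network_round_trip_ms=150.0)
local_total = response_time_ms(inference_ms=40.0)
# cloud_total == 155.0 ms, local_total == 40.0 ms
```

Under these assumptions the local path wins despite weaker hardware, and it keeps working when connectivity drops, which matters for vehicles, factories, and medical devices.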