Global Shipments of TinyML Devices to Reach 2.5 Billion by 2030

Industrial and Manufacturing, Smart Cities, and Consumer Applications are driving the need for Tiny Machine Learning chipsets.

According to global tech market advisory firm ABI Research, a total of 2.5 billion devices are expected to ship with a Tiny Machine Learning (TinyML) chipset in 2030, propelled by the increasing focus on low latency, advanced automation, and the availability of low-cost, ultra-power-efficient Artificial Intelligence (AI) chipsets. Also known as very edge AI or embedded AI, these chipsets perform AI inference almost entirely on board, while continuing to rely on external resources, such as gateways, on-premise servers, or the cloud, for training.

As enterprises look for AI solutions in areas such as voice activation, image or video screening, people tracking, and ambient tracking, end users have struggled with the restricted nature of battery-powered sensors and embedded modules, which run on the limited computational resources of general-purpose microcontrollers. Edge sensors and devices often need to handle large amounts of data, yet because they are low-powered, they struggle to support high computing performance and high data throughput, causing latency issues. “Since AI is deployed to make immediate critical decisions such as quality inspection, surveillance, and alarm management, any latency within the system may result in machine stoppage or slowdown causing heavy damages or loss in productivity. Moving AI to the edge mitigates potential vulnerability and risks such as unreliable connectivity and delayed responses,” explains Lian Jye Su, Principal Analyst at ABI Research.

Imbued with quantized AI models, TinyML chipsets enable smart sensors to perform data analytics on hardware and software dedicated to low-power systems, typically in the milliwatt range, using algorithms, networks, and models of 100 kB and below. Arm and CEVA have each launched chipset IP solutions that support low-power AI inference, with supporting software libraries, toolchains, and models. Low-power AI chipset vendors, including GreenWaves Technologies, Lattice Semiconductor, Rockchip, Syntiant, and XMOS, launched embedded AI chipset products in 2019. Realizing the potential of TinyML in machine vision, CMOS vendors such as Sony and HiMax are also integrating TinyML chipsets into their CMOS sensors. “This means the market will soon start to see multiple AI chipsets in a single device at sensor and device level,” Su says.
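The quantization mentioned above is what lets models shrink to the 100 kB range: 32-bit float weights are mapped to 8-bit integers, cutting storage roughly fourfold at the cost of small rounding errors. A minimal sketch of this idea, assuming a simple affine (scale and zero-point) scheme with illustrative weight values, not any particular vendor's implementation:

```python
# Illustrative sketch of 8-bit affine weight quantization, the technique
# TinyML frameworks use to fit models into microcontroller memory.
# Weight values and function names here are hypothetical examples.

def quantize(weights, num_bits=8):
    """Map float weights to integers in [-(2^(b-1)), 2^(b-1) - 1]."""
    qmin, qmax = -(2 ** (num_bits - 1)), 2 ** (num_bits - 1) - 1
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / (qmax - qmin) or 1.0  # float step per integer level
    zero_point = round(qmin - lo / scale)     # integer that represents 0.0
    q = [max(qmin, min(qmax, round(w / scale) + zero_point)) for w in weights]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate float weights from the quantized values."""
    return [(v - zero_point) * scale for v in q]

weights = [-0.51, -0.02, 0.0, 0.3, 0.49]
q, scale, zp = quantize(weights)
restored = dequantize(q, scale, zp)
# Each quantized weight now fits in one byte instead of four,
# and each restored value differs from the original by at most ~one scale step.
```

The same scale and zero point travel with the model, so the chipset can run integer-only arithmetic during inference and only convert back to floats where needed.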

More importantly, it is not just hardware development that is accelerating the democratization of TinyML. Open-source software development from Google, through TensorFlow Lite for Microcontrollers, and proprietary solutions from the likes of SensiML offer developer-friendly software tools and libraries, allowing more AI developers to create models that can support very edge applications. Developing competent and differentiated hardware is no longer enough. TinyML chipset manufacturers must build their own AI developer ecosystems or join existing ones, embrace open source, and clearly articulate their unique selling points and target markets to end users. Without these conditions, chipset suppliers will struggle to generate scale for their products in what is expected to be a very competitive market.

“At the moment most of these solutions are still in the early stages of commercial deployment in smart cities and smart manufacturing, mainly used for asset tracking and anomaly sensing, and yet to achieve large-scale adoption. While able to offer better processing capabilities, sensors with TinyML are often much more expensive. End users will also need to design and introduce a new set of procedures and protocols to leverage the information and insights derived from these sensors,” concludes Su.

These findings are from ABI Research’s Very Edge AI Chipset for TinyML Applications application analysis report.
