Elevators that respond to voice commands, cameras that notify store managers when to restock shelves and video streams that keep tabs on everything from cash register lines to parking space availability.
These are a few of the millions of scenarios becoming possible thanks to a combination of artificial intelligence and computing on the edge. Standalone edge devices can take advantage of AI tools for things like translating text or recognizing images without having to constantly access cloud computing capabilities.
At its Ignite digital conference, Microsoft unveiled the public preview of Azure Percept, a platform of hardware and services that aims to simplify how customers use Azure AI technologies on the edge – including taking advantage of Azure cloud offerings such as device management, AI model development and analytics.
Roanne Sones, corporate vice president of Microsoft’s edge and platform group, said the goal of the new offering is to give customers a single, end-to-end system, from the hardware to the AI capabilities, that “just works” without requiring a lot of technical know-how.
The Azure Percept platform includes a development kit with an intelligent camera, Azure Percept Vision. There’s also a “getting started” experience called Azure Percept Studio that guides customers, with or without coding expertise, through the entire AI lifecycle, including developing, training and deploying proof-of-concept ideas.
For example, a company may want to set up a system to automatically identify irregular produce on a production line so workers can pull those items off before shipping.
Azure Percept Vision and Azure Percept Audio, which ships separately from the development kit, connect to Azure services in the cloud and come with embedded hardware-accelerated AI modules that enable speech and vision AI at the edge, even when the device isn’t connected to the internet. That’s useful for scenarios in which the device needs to make lightning-fast calculations without taking the time to connect to the cloud, or in places where internet connectivity isn’t always reliable, such as a factory floor or a location with spotty service.
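For a rough sense of what that kind of on-device inference looks like, the sketch below scores a single camera frame against a locally stored image-classification model using ONNX Runtime. It is only an illustration, not Azure Percept’s actual pipeline; the model file, class labels and 224x224 input size are assumptions.

```python
# Illustrative only: local (offline) vision inference against an
# image-classification model exported to ONNX. The file name, labels
# and 224x224 input size are assumptions, not Azure Percept specifics.
import numpy as np
import onnxruntime as ort
from PIL import Image

LABELS = ["normal", "irregular"]  # hypothetical produce classes

session = ort.InferenceSession("produce_classifier.onnx")
input_name = session.get_inputs()[0].name

def classify(frame_path: str) -> str:
    """Score one frame entirely on the device -- no cloud round trip."""
    img = Image.open(frame_path).convert("RGB").resize((224, 224))
    x = np.asarray(img, dtype=np.float32) / 255.0   # scale pixels to [0, 1]
    x = x.transpose(2, 0, 1)[np.newaxis, ...]       # HWC -> NCHW, batch of 1
    scores = session.run(None, {input_name: x})[0][0]
    return LABELS[int(np.argmax(scores))]

print(classify("frame_0001.jpg"))  # e.g. "irregular" -> pull the item off the line
```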
In addition to the new hardware, Microsoft is working with third-party silicon and equipment manufacturers to build an ecosystem of intelligent edge devices certified to run on the Azure Percept platform, Sones said.
“We’ve started with the two most common AI workloads, vision and voice, sight and sound, and we’ve given out that blueprint so that manufacturers can take the basics of what we’ve started,” she said. “But they can envision it in any kind of responsible form factor to cover a pattern of the world.”
Making AI at the edge more accessible
The goal of the Azure Percept platform is to simplify the process of developing, training and deploying edge AI solutions, making it easier for more customers to take advantage of these kinds of offerings, according to Moe Tanabian, a Microsoft vice president and general manager of the Azure edge and devices group.
For example, most successful edge AI implementations today require engineers to design and build devices, plus data scientists to build and train AI models to run on those devices. Engineering and data science are typically distinct skill sets held by different groups of highly trained people.
“With Azure Percept, we broke that barrier,” Tanabian said. “For many use cases, we significantly lowered the technical bar needed to develop edge AI-based solutions, and citizen developers can build these without needing deep embedded engineering or data science skills.”
The hardware in the Azure Percept development kit also uses the industry standard 80/20 T-slot framing architecture, which the company says will make it easier for customers to pilot proof-of-concept ideas everywhere from retail stores to factory floors using existing industrial infrastructure, before scaling up to wider production with certified devices.
As customers work on their proof-of-concept ideas with the Azure Percept development kit, they will have access to Azure AI Cognitive Services and Azure Machine Learning models as well as AI models available from the open-source community that have been designed to run on the edge.
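To give a flavor of what drawing on those cloud services might look like during prototyping, here is a small sketch that sends an image to the Computer Vision analyze endpoint in Azure AI Cognitive Services. The endpoint, key and image file are placeholders for your own resource, and Azure Percept Studio wraps much of this up for you.

```python
# Small sketch: tag an image with the Computer Vision service in
# Azure AI Cognitive Services. Endpoint, key and image path are placeholders.
import os
import requests

endpoint = os.environ["VISION_ENDPOINT"]  # e.g. https://<resource>.cognitiveservices.azure.com
key = os.environ["VISION_KEY"]

analyze_url = f"{endpoint}/vision/v3.2/analyze"

with open("shelf_photo.jpg", "rb") as f:
    image_bytes = f.read()

response = requests.post(
    analyze_url,
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": key,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

for tag in response.json()["tags"]:
    print(f'{tag["name"]}: {tag["confidence"]:.2f}')
```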
In addition, Azure Percept devices automatically connect to Azure IoT Hub, which helps enable reliable communication with security protections between Internet of Things, or IoT, devices and the cloud. Customers can also integrate Azure Percept-based solutions with Azure Machine Learning processes that combine data science and IT operations to help companies develop machine learning models faster.
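As a rough illustration of that device-to-cloud plumbing, the sketch below sends one inference result to Azure IoT Hub with the azure-iot-device Python SDK. On Azure Percept hardware the provisioning and connection are handled for you, so the connection string and payload here are placeholders.

```python
# Minimal device-to-cloud sketch with the azure-iot-device SDK.
# The connection string and payload are placeholders; Azure Percept
# devices are provisioned to IoT Hub automatically.
import json
import os

from azure.iot.device import IoTHubDeviceClient, Message

# e.g. "HostName=<hub>.azure-devices.net;DeviceId=<id>;SharedAccessKey=<key>"
conn_str = os.environ["IOTHUB_DEVICE_CONNECTION_STRING"]

client = IoTHubDeviceClient.create_from_connection_string(conn_str)
client.connect()

# Send one hypothetical inference result up to the hub.
payload = {"label": "irregular", "confidence": 0.93}
msg = Message(json.dumps(payload))
msg.content_type = "application/json"
msg.content_encoding = "utf-8"
client.send_message(msg)

client.shutdown()
```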
In the months to come, Microsoft aims to expand the number of third-party certified Azure Percept devices, so anybody who builds and trains a proof-of-concept edge AI solution with the Azure Percept development kit will be able to deploy it with a certified device from the marketplace, according to Christa St. Pierre, a product manager in Microsoft’s Azure edge and platform group.
“Anybody who builds a prototype using one of our development kits, if they buy a certified device, they don’t have to do any additional work,” she said.
Security and responsibility
Because Azure Percept runs on Azure, it includes the security protections already baked into the Azure platform, the company says.
Microsoft also says that all the components of the Azure Percept platform, from the development kit and services to Azure AI models, have gone through Microsoft’s internal assessment process to operate in accordance with Microsoft’s responsible AI principles: fairness, reliability and safety, privacy and security, inclusiveness, transparency, and accountability.
The Azure Percept team is currently working with select early customers to understand their concerns around the responsible development and deployment of AI on edge devices, and the team will provide them with documentation and access to toolkits such as Fairlearn and InterpretML for their own responsible AI implementations.
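As one example of the kind of check those toolkits support, here is a minimal Fairlearn sketch that compares a model’s accuracy across groups of a sensitive feature; the labels, predictions and groups are made-up stand-ins. InterpretML offers a similar workflow for explaining which features drive a model’s predictions.

```python
# Minimal Fairlearn sketch: compare accuracy across a sensitive feature.
# The labels, predictions and groups below are made-up stand-ins.
from fairlearn.metrics import MetricFrame
from sklearn.metrics import accuracy_score

y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]
group = ["A", "A", "A", "A", "B", "B", "B", "B"]  # hypothetical sensitive feature

mf = MetricFrame(
    metrics=accuracy_score,
    y_true=y_true,
    y_pred=y_pred,
    sensitive_features=group,
)

print("Overall accuracy:", mf.overall)
print("Accuracy by group:\n", mf.by_group)
print("Largest gap between groups:", mf.difference())
```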
Ultimately, Sones said, Microsoft hopes to enable the development of an ecosystem of intelligent edge devices that can take advantage of Azure services, in the same way that the Windows operating system has helped enable the personal computer marketplace.
“We are a platform company at our core. If we’re going to truly get to a scale where the billions of devices that exist on the edge get connected to Azure, there is not going to be one hyperscale cloud that solves all that through their first-party devices portfolio,” she said. “That is why we’ve done it in an ecosystem-centric way.”
Top image: The Azure Percept platform makes it easy for anyone to deploy artificial intelligence on the edge. Devices include, from left to right, Trusted Platform Module, Azure Percept Vision and Azure Percept Audio. Each device integrates with industry standard 80/20 hardware. Photo credit: Microsoft