
Let’s say you have a coffee pot, but not your average coffee pot: this thing is voice-controlled and produces custom drinks for each user, learning from their preferences over time. It’s smart, thanks to artificial intelligence.
Perhaps even more incredibly, this coffee pot isn’t connected to the internet at all, and it doesn’t need to send user data back and forth to a large, far-off data center. Instead, all of the data processing happens locally, right in the coffee pot itself, thanks to edge AI.
Edge AI
Edge AI is the implementation of artificial intelligence in an edge computing environment. That means AI computations are done at the edge of a given network, usually on the device where the data is created, instead of in a centralized cloud computing facility or offsite data center. The edge could be a device like a camera or car, or a mini-data center located in a hospital. And edge AI builds intelligence into all of these environments, even if they aren’t connected to the internet.
Processing data close to where it is actually created lets devices make decisions in milliseconds, without an internet connection or a round trip to the cloud. Essentially, as a device produces data, the algorithms on board can put that data to use immediately.
And this kind of technology can be used far beyond the reaches of your kitchen countertop. Edge AI’s ability to more securely produce real-time analytics at higher speeds, lower costs and with less power has made it an attractive alternative to cloud computing AI — and companies across industries like manufacturing, healthcare and energy are taking advantage.
What Is Edge AI?
Put simply, edge AI is a combination of edge computing and artificial intelligence. Therefore, to understand edge AI, one first must understand edge computing.
These days, smart devices are everywhere. Everything from the watch on your wrist to the car in your garage is capable of performing autonomous computing and exchanging data with other smart devices — a concept commonly known as the internet of things, or IoT. All that data flying back and forth puts a heavy strain on data centers. But edge computing is meant to ease that burden by moving some of the processing closer to its point of origin. So, rather than traveling to the cloud, the job is done “on the edge,” so to speak.
The “edge” simply refers to the device being used. This can be a phone, a camera, a car, a medical device or a television. So edge computing is when the computer doing the work is inside or near that device. And, like any other computer, edge computers are designed to process data.
By extension, edge AI is essentially “doing that work and making decisions locally,” according to Aaron Allsbrook, the founder and CTO of edge AI startup ClearBlade. “It’s all the rules. It’s the processing, it’s gathering the data, it’s understanding the data — not in big servers or in the cloud, but in your house, in a work or a job site, or in a parking garage,” he told Built In. “It’s about moving the different mathematical algorithms, and running those predictions at the edge.”
How Does Edge AI Technology Work?
For machines to successfully make predictions, see and identify objects, drive cars, converse with humans, and move through the world, they have to, in effect, functionally emulate human cognition. In other words: they have to implement artificial intelligence.
Many AI models are powered by machine learning, which gives them the ability to learn and optimize processes without being explicitly programmed to do so. Some of these models are built on neural networks: layered data structures, loosely modeled on the human brain, that are trained to answer a specific type of question by being shown many examples of that question along with the correct answers. This training process is known as deep learning.
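The example-driven training process described above can be sketched in miniature. The following is an illustrative toy, not any production framework: a single artificial neuron is shown labeled examples of the logical-OR question and nudges its weights toward the correct answer each time it gets one wrong.

```python
def train_neuron(examples, epochs=50, lr=0.1):
    """Learn two weights and a bias from (inputs, label) pairs."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in examples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred               # zero when the answer was right
            w[0] += lr * err * x1            # nudge weights toward the answer
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# The "question" is logical OR: output 1 if either input is 1.
examples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_neuron(examples)
```

After training, the neuron answers the question correctly for all four possible inputs. Real deep learning stacks many such units into layers and trains them on far larger data sets, but the show-examples-and-adjust loop is the same idea.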
With edge AI, machine learning algorithms can run directly at the edge of a given network, close to where the data and information needed to run the system are generated, such as an IoT device or machine equipped with an edge computing device. Edge AI devices use embedded algorithms to monitor the device’s behavior, as well as collect and process the device data. This allows the device to make decisions, automatically correct problems and make future performance predictions.
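As a concrete sketch of that on-device monitoring loop, here is a hypothetical example in Python. The window size, warm-up length and tolerance are invented for illustration; a real device would tune them to its sensor and likely use a trained model rather than a simple statistical threshold.

```python
from collections import deque

class EdgeMonitor:
    """Flags anomalous sensor readings locally, with no network round trip."""

    def __init__(self, window=20, tolerance=3.0):
        self.readings = deque(maxlen=window)   # rolling window of recent values
        self.tolerance = tolerance             # how many std devs counts as a fault

    def ingest(self, value):
        """Process one reading as it is produced; return True if it looks anomalous."""
        anomalous = False
        if len(self.readings) >= 5:            # wait for a small baseline first
            mean = sum(self.readings) / len(self.readings)
            var = sum((r - mean) ** 2 for r in self.readings) / len(self.readings)
            std = var ** 0.5 or 1e-9           # treat a flat signal as near-zero spread
            anomalous = abs(value - mean) > self.tolerance * std
        self.readings.append(value)
        return anomalous
```

Feeding the monitor a steady stream of readings and then a sudden spike flags only the spike, entirely on the device — exactly the kind of local decision-making the paragraph above describes.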
Edge AI can run on a wide range of hardware, from existing central processing units, or CPUs, to microcontrollers and advanced neural processing devices. Some of the most widely used edge computers are made by large tech companies like Intel, NXP and Qualcomm.
Rob May, the founder and CTO of Dianthus, a company that builds machine learning tools for e-commerce startups, says that the high levels of computing power achieved by these devices could eventually make edge AI even bigger than cloud AI. But this will largely depend on the kinds of chips that come to market, and how cheap and efficient they are.
One startup working toward this is Axelera, which is in the midst of designing a chip that offers high computing performance and usability at what co-founder and CEO Fabrizio Del Maffeo says is a “fraction of the price and power consumption” of the tech we typically see today. Instead of moving data from memory to the CPU and back to memory again, as most computers do, the chip merges the memory with the CPU in what is called “in-memory computing,” he explained. “We are modifying the memory and making the calculation inside the memory. This allows the chip to be very efficient because you don’t move so much data.” By essentially embedding computing elements inside the memory, he continued, the computer’s neural networks can be accelerated.
Axelera certainly isn’t the only company innovating in this space. The edge AI hardware market is projected to grow from 920 million units in 2021 to more than 2 billion units by 2026. And, according to one estimate, the edge AI chip market alone is expected to grow by some $73 billion by 2025.
The Importance of Edge AI
Overall, the edge AI space is growing right alongside the increasing ubiquity of artificial intelligence in general, which is continuing to optimize even the most mundane parts of daily life — from predicting the best day for trash collection in a city, to helping small businesses run more efficiently. And despite its technological complexities, the ultimate goal of edge AI is a fairly simple one: to get closer to devices themselves, thus reducing the amount of data that needs to be moved around.
This, in turn, preserves bandwidth and reduces latency, the time it takes for data to travel to where it is processed and for a response to come back. Plus, because the data doesn’t have to travel far, edge AI makes real-time analytics possible, which can have a big impact on future innovations in this already fast-moving space.
“Edge AI is much more real time. It’s looking at the data as it comes off of the live feeds, that makes predictions in the live feeds,” Allsbrook said. “If we can do AI at the edge, we can make really cool predictions pretty dynamically without having to move data to centralized clouds or IT.”
Edge AI Examples
Allsbrook’s company ClearBlade is a prime example of this. It specializes in implementing edge AI in industrial IoT devices, particularly for predictive maintenance: forecasting when certain equipment is going to fail. Thanks to an engine called Onyx, which lets people build models in many different languages and with various types of tools, companies can make inferences directly on the ClearBlade platform from live IoT data feeds.
Allsbrook says he’s also noticed that people are using Onyx to repurpose their old models using edge AI, bringing them onto new technology without having to build a fresh model or hire a new team of data scientists.
“We’re finding people now are just taking models they built in the mid-’90s or early 2000s that run in big batch supercomputers. They now can take that same model and drop it into a Raspberry Pi-like device,” he said, referring to the popular small, cheap computer that runs on Linux. “It can really change how they’re able to get value out of that very old model.”
Industrial IoT is only one application of edge AI, however. Others range from DevOps to robotics to consumer technology:
Healthcare
Edge AI allows hospitals and other healthcare providers to reap the benefits of artificial intelligence without having to transmit sensitive patient information unnecessarily. All the data collected from health monitoring devices like cardiac trackers and blood pressure sensors can be processed and analyzed locally, enabling real-time analytics that help medical professionals provide better care to patients.
Driverless Cars
When a 4,000-pound autonomous vehicle is driving down a busy road, every millisecond counts. The rapid data processing enabled by edge AI allows the system to respond quickly to the world around it, ideally making the vehicle safer and more reliable.
Smart Homes
From video doorbells to voice-controlled light bulbs to refrigerators that monitor things like food consumption and expiration dates, smart homes contain a web of IoT devices that are meant to work together to make residents’ lives easier. Instead of these devices sending all the data from the house to a centralized remote server for processing, edge AI allows all of this to happen onsite, making it faster and more secure.
Advantages and Disadvantages of Edge AI
Dianthus’ May first became aware of edge AI as a partner at venture capital firm PJC, when Deeplite, a company that brings AI computation abilities to edge devices like drones and cameras, was pitched to him as an investment opportunity. He was skeptical, and even passed at first.
“When I thought about the edge AI market initially, I thought it was going to be a small market,” he told Built In. But as he thought about it more and more, he came to realize that it’s a very “cost-driven” sector. So, when Deeplite pitched to PJC again, he agreed to invest in it.
“Let’s say you have a small security camera, and you want that camera to do some kind of analysis. You might not be able to keep that camera cloud-connected all the time, compared to just having a chip in there and a $2 microcontroller,” he continued. “[Edge AI], in a lot of cases, can be a much better model.”
Cost effectiveness is just one of edge AI’s many draws. Integrating artificial intelligence at the edge makes real-time data analytics possible, preserves bandwidth and reduces the latency often experienced in cloud-connected machine learning models. Decisions can be sped up by several milliseconds, which can make all the difference in something like an autonomous car. Plus, because data stays on the device rather than moving back and forth between it and remote data centers, it is often more secure. And with security comes privacy, which has become increasingly important across the board, especially when it comes to IoT devices.
Still, edge AI is “pretty new,” said Allsbrook. “I would say it’s emerging. I think we’re just beginning to see what’s possible.”
Of course, no technology is perfect, and edge AI comes with its fair share of drawbacks, too. For one, edge AI systems, like other AI models, have to be retrained on a regular, ongoing basis, using data from the edge devices themselves. That often means assembling a training data set by transferring data from a huge number of edge devices to the cloud, which can be complex depending on the bandwidth available and the connectivity of those devices.
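One way to picture that data-collection step is a device that buffers samples locally and only ships a compressed batch upstream when the buffer fills, trading freshness for bandwidth. The batch size and the JSON-plus-gzip format here are illustrative assumptions, not a real product API.

```python
import gzip
import json

class TrainingDataUploader:
    """Collects training samples on a device and emits them in compressed batches."""

    def __init__(self, batch_size=100):
        self.batch_size = batch_size
        self.buffer = []                       # samples collected on the device

    def record(self, sample):
        """Buffer one sample; return a compressed batch once the buffer fills."""
        self.buffer.append(sample)
        if len(self.buffer) < self.batch_size:
            return None                        # keep collecting, nothing to send yet
        payload = gzip.compress(json.dumps(self.buffer).encode("utf-8"))
        self.buffer = []                       # start the next batch
        return payload                         # caller ships this to the cloud
```

A caller would hand each returned payload to whatever transport links the device to its retraining pipeline; decompressing it recovers the original samples. Batching like this is one common way to cope with the limited bandwidth and intermittent connectivity the paragraph above describes.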
Security is also an area of concern, just in a different way. While edge computing can make systems safer by keeping processing local, the infrastructure and the devices themselves need their own security measures. These can include access control, traffic monitoring, data backups, antivirus and anti-malware software, and even encryption.
As such, the edge AI market has not grown as quickly as some may have hoped. Since PJC’s initial investment in Deeplite and, by extension, the larger edge AI space, May said the market has been “a little slower than anticipated” in terms of general adoption. He attributes this to the long design cycles of the devices edge AI actually runs on: things like drones, phones and cars. Another big factor, he says, is a general lack of awareness among some tech companies that the space even exists.
“You have to be aware of something, and then you have to build it into a cycle. And that might take 18 months to get something out the door, from a design perspective,” May said. “I think all that kind of adds up. But I’m hopeful that it’s going to pick up more and more.”
And it seems to already be happening. According to Allied Market Research, the edge AI market was worth about $9 billion in 2020, and it is projected to be worth nearly $60 billion by 2030. Meanwhile, in 2020, Gartner predicted that 75 percent of enterprise data will be generated and processed at the edge by 2025, outside a traditional data center or cloud.
And companies in this space are continuing to garner attention from investors. AI chip startup Axelera, for example, recently raised $27 million in funding, and brought on Jonathan Ballon, the former VP and general manager of Intel’s edge AI and internet of things group, as its chairman. In the near future, Del Maffeo says Axelera has plans to focus on the security, retail and industrial automation industries because all three of them invest heavily in computer vision tech and cameras, which its chips are specifically designed for. For example, the chips can be used for cashierless checkouts at Amazon Go grocery stores, or in security cameras in smart cities, or in collaborative robots that require vision to prevent accidents with their human coworkers.
“Edge computing can help people improve the quality of their life,” Del Maffeo said. “We are at a tipping point where new technologies can really make a huge impact on the quality of life of people around us. … If you look forward 10 to 15 years, the world is going to change a lot all around us, and edge AI can positively impact it.”
The Future of Edge AI
If and when we do have all our smart devices equipped with AI on the edge, May said it’s likely we’ll be able to notice it in our everyday lives. He predicts more things will have voices embedded in them. And this tech will likely work faster and be cheaper.
“When you think about the ability to build intelligence into every device, no matter how small, whether it’s internet-connected or not, I think that’s a pretty powerful opportunity,” May said. “It’s hard if all of that has to be cloud-connected all the time,” he continued, so “edge AI is one of the key technologies that’s going to help with this sort of ambient computing.”
In fact, he thinks eventually edge AI could be even bigger than cloud AI, so long as it’s able to maintain high levels of computing power. That’s not to say this technology will completely oust cloud computing. Companies will likely still need to keep all their software-as-a-service, or SaaS, applications, database applications and infrastructure in the cloud.
Instead, Del Maffeo predicts the two will complement each other as artificial intelligence continues to get more sophisticated.
“Your phone is getting more and more powerful. Your TV is becoming a computer,” he said. “I mean, computers are everywhere. It’s a classic trend.”