What About Edge AI?

This might not be an incredibly technical blog post, but it will address a fundamental change that's happening right now with AI. It's called edge computing, and it was previously applied to more deterministic systems, but now it's making its way into the AI world.

It's also a complete reversal of an enormous change that went on over the past 15 years - the move to the cloud...

Cloud computing made a lot of sense in its heyday, when it was new. It still makes sense for a lot of workloads and data sets. But it doesn't always make sense for resource-intensive, data-heavy processes, where it gets expensive to keep shipping all of that data back and forth. And that's where edge computing comes in.

So what is edge AI?

It's essentially the idea that AI processes get done locally on a device, rather than up in the cloud, or in another centralized location like a data center.

So if you have a system that's an endpoint on the Internet in its own physical location, and it processes AI functions there, that's an example of edge AI.

Experts, including the people at Nvidia, are talking about 'AI PCs' - individual personal computers, running AI systems and large language models.

That's inherently what edge AI is about - having these localized systems working on their own, instead of delivering the raw material for an AI process to the central location.

As for the benefits of this approach, experts cite scalability and lower latency, along with better privacy and security. So there are functional reasons to run AI at the edge, and there are reasons involving user protection.
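To make the latency point concrete, here's a minimal back-of-the-envelope sketch in Python. All of the numbers are hypothetical assumptions for illustration, not measurements: it compares the time to ship one camera frame to the cloud and back against just processing it on-device.

```python
# Back-of-the-envelope comparison of cloud round-trip vs. on-device
# inference. All numbers below are hypothetical, for illustration only.

def cloud_time_s(payload_bytes, bandwidth_bps, rtt_s, server_infer_s):
    """Time to upload the payload, wait out a network round trip,
    and run inference on the server."""
    return payload_bytes * 8 / bandwidth_bps + rtt_s + server_infer_s

def edge_time_s(local_infer_s):
    """Time to run inference locally -- no network involved."""
    return local_infer_s

# Hypothetical scenario: a 1 MB camera frame over a 20 Mbit/s uplink,
# a 60 ms round trip, 10 ms on a big cloud GPU vs. 50 ms on a local NPU.
frame = 1_000_000  # bytes
cloud = cloud_time_s(frame, 20e6, 0.060, 0.010)
edge = edge_time_s(0.050)

print(f"cloud: {cloud * 1000:.0f} ms, edge: {edge * 1000:.0f} ms")
```

With those made-up numbers, the upload alone dwarfs the inference time, which is the whole latency argument in one line of arithmetic: the slower local processor still wins because the data never leaves the device.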

In one of our recent classes, someone estimated that there are 300 million AI PCs working in their own siloed instances. That's a strange picture to paint in the days of the cloud - but then, if you understand the value proposition of edge AI, it makes sense!

That creates a need for new architecture and hardware: alongside the CPU and GPU, AI PCs add an NPU - a processor specialized for neural network operations.

Tiffany Yeung at Nvidia reports how this type of technology is feeding into self-driving vehicles, AI diagnosis in hospitals, and various kinds of agriculture. The edge factor just makes these applications more efficient.

As for the definition of edge computing, it's essentially real-time processing without the cloud. Presumably, the results of the edge AI can be sent to the cloud later...
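That "process now, sync later" pattern is easy to sketch. Below is a toy Python version - the class and method names are my own invention, not any particular product's API. Raw readings get summarized on the device in real time, and only the compact summaries are queued for a later cloud upload.

```python
# Toy "edge" pattern: process raw data locally in real time, keep only
# compact summaries to sync with the cloud later. Names are illustrative.

class EdgeNode:
    def __init__(self):
        self.pending = []  # summaries waiting for the next cloud sync

    def process(self, readings):
        """Run the 'AI' step locally -- here just a stand-in summary --
        and queue the result instead of shipping raw readings upstream."""
        summary = {
            "count": len(readings),
            "mean": sum(readings) / len(readings),
            "max": max(readings),
        }
        self.pending.append(summary)
        return summary

    def sync_to_cloud(self):
        """Later, when connectivity is cheap, flush summaries upstream."""
        batch, self.pending = self.pending, []
        return batch  # in real life: POST this batch to a cloud endpoint

node = EdgeNode()
node.process([21.0, 22.5, 23.1])  # raw data never leaves the device
node.process([19.8, 20.2])
print(len(node.sync_to_cloud()))  # the later upload is two small summaries
```

The point of the sketch: the bulky raw readings stay on the endpoint, and the cloud only ever sees the small, already-processed results, whenever the device gets around to uploading them.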

IBM characterizes edge AI computing this way:

"Edge computing is a distributed computing framework that brings enterprise applications closer to data sources such as IoT devices or local edge servers. This proximity to data at its source can deliver strong business benefits, including faster insights, improved response times and better bandwidth availability."

I don't know if that helps if you're trying to use edge AI for something concrete, but it does sort of outline how we're approaching this option.

As for an MIT angle, here's a piece from 2022 talking about how MIT people are doing this, and why...

So even though it might sound cryptic, it's really that simple - in the AI age, we're taking workloads and processes and moving them back from the cloud to a local endpoint. We're doing that because it's safer and often easier, and because it focuses on quality, where the move to the cloud was mostly about cutting costs.

This is a big idea to focus on as we continue to evaluate what's being done with AI in our time. Keep an eye out for more from our conferences and classes on what makes this industry tick, and what executives and others are looking at as they embrace new automation technologies.
