
Breakthrough use cases are emerging as computation moves from the cloud to the edge

Presented by Qualcomm       


Distributed intelligence is the headline-grabbing megatrend transforming the tech industry. It combines secure, on-device AI technology with powerful cloud-edge AI processing, and throws in the speed of 5G as the link. This is the power of AI, distributed across devices to enable new experiences.

“It’s absolutely inevitable,” says Mike Vildibill, VP of Product Management at Qualcomm Technologies. “It’s happening everywhere decision-making has to be made locally, and therefore intelligence is happening locally.”

Distributed processing and decision-making at the edge require the same kind of high performance and efficiency needed in a data center. The most dramatic example is the autonomous car. In the course of driving, it’s rapidly pulling in environmental data all around it, comparing that information to its onboard database, and making life-and-death decisions at lightning-fast speed. Physics does not allow enough time to transfer that data to a remote cloud data center, process it, and send a decision back; these decisions must be low latency.
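To make that latency argument concrete, here is a rough back-of-the-envelope sketch in Python. The 100 km/h speed and 100 ms cloud round trip are illustrative assumptions, not figures from the article:

```python
# Illustrative arithmetic only: how far a car travels while waiting on the cloud.
# Both input figures are assumptions chosen to make the point concrete.
speed_kmh = 100                      # assumed highway speed
round_trip_s = 0.100                 # assumed cloud round-trip latency (100 ms)

speed_ms = speed_kmh * 1000 / 3600   # km/h -> m/s (about 27.8 m/s)
distance_m = speed_ms * round_trip_s

print(f"At {speed_kmh} km/h, the car travels {distance_m:.1f} m "
      f"while waiting {round_trip_s * 1000:.0f} ms for a remote answer.")
# Roughly 2.8 m of travel before the decision even arrives, which is why the
# inference has to happen locally.
```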

“You need very high performance local to you, in the car, near the camera, near where the data is, where the action is occurring,” Vildibill explains. “But even more important, you need the power efficiency, because inevitably these distributed locations are power constrained, whether in a car, in a retail environment, or on the manufacturing floor.” 

A closer look at the edge

As sensor data comes into an inferencing engine, it’s identifying anomalies and making decisions quickly about how to act. Inferencing on the edge device is counted in trillions of operations per second (TOPS). Qualcomm’s Cloud AI 100 platform was designed for this kind of fast and power-efficient inferencing at the edge, Vildibill says. A single card delivers up to 400 TOPS, and by working with AMD and Gigabyte, a single server can now contain up to 16 Cloud AI 100 cards. That puts the inferencing performance of one server at 6.4 peta-operations per second (POPS). And 16 to 20 of those servers can be placed into one server rack, for more than 100 POPS of AI performance.
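The rack-level number follows directly from the per-card figure; here is a quick sketch of the arithmetic in Python, using only the values quoted above:

```python
# Aggregating the Cloud AI 100 figures quoted above: card -> server -> rack.
TOPS_PER_CARD = 400        # per-card peak cited in the article
CARDS_PER_SERVER = 16      # AMD/Gigabyte server configuration cited above

server_tops = TOPS_PER_CARD * CARDS_PER_SERVER
print(f"Per server: {server_tops} TOPS = {server_tops / 1000} POPS")  # 6.4 POPS

# 16 to 20 such servers per rack, as quoted above.
for servers_per_rack in (16, 20):
    rack_pops = servers_per_rack * server_tops / 1000
    print(f"{servers_per_rack} servers/rack -> {rack_pops:.1f} POPS")  # 102.4 to 128.0
```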

Distributed intelligence is the next step. Edge devices aren’t just making decisions based on fixed, pre-trained models; increasingly, those models are informed by what the devices are learning every second at the edge. Consider, for example, a smartphone powered by the Snapdragon 888 5G Mobile Platform and its 6th gen AI Engine. The intelligence collected at the edge gets fed back into the trained models, which are then shared with all the devices at all of the edge points. This approach is called federated learning.
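A minimal sketch of that federated-learning loop, using a toy linear model in plain Python and NumPy; this illustrates the pattern only, not the Snapdragon or Cloud AI software stack:

```python
# Federated averaging (FedAvg) in miniature: each device trains on its own
# private data, and only the updated weights are averaged into the shared model.
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, X, y, lr=0.1, steps=5):
    """One edge device refines the shared weights on its local data."""
    w = weights.copy()
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)   # gradient of mean squared error
        w -= lr * grad
    return w

# Three hypothetical edge devices, each holding its own private samples.
true_w = np.array([2.0, -1.0])
devices = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    devices.append((X, y))

global_w = np.zeros(2)
for _ in range(10):
    # Local training: raw data never leaves the device, only weights do.
    local_ws = [local_update(global_w, X, y) for X, y in devices]
    # Aggregation: the shared model is the average of the local updates,
    # and it is then pushed back out to every edge point.
    global_w = np.mean(local_ws, axis=0)

print("Shared model weights:", global_w)   # converges toward [2.0, -1.0]
```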

“What we see going on at the edge is really a very significant, pivotal point in the industry, where not only has a lot of computation moved from the cloud to the edge, but now we’re also seeing more and more intelligence is being developed at the edge and fed back to the cloud,” Vildibill says.

The players in the AI inference acceleration field

Major tech companies are competing to bring this kind of power to the edge. MLPerf Inference v1.0 is a cornerstone of MLCommons’ initiative to provide benchmarks and metrics that level the industry playing field through the comparison of machine learning systems, software, and solutions. The latest round of benchmarking measured how quickly a trained neural network can process new data for a wide range of applications across a variety of form factors, and introduced a system power measurement methodology. Products that deliver the highest performance at the lowest power are the most critical, both in data centers and at the edge.

Qualcomm put the performance of its Cloud AI 100 platform to the test. It topped the datacenter and edge inferencing charts with the highest inference performance at the lowest power among all 17 MLPerf 1.0 submissions. The platform delivered over 3.5 times the performance of the NVIDIA T4 while consuming roughly the same amount of energy, at lower latency. The Gigabyte server with the Cloud AI 100 delivers performance comparable to the newest NVIDIA A30 GPU while consuming less than half the total system power.

“The edge demands raw performance, power efficiency, and constrained form factors, which is what the Qualcomm Cloud AI 100 is built for,” Vildibill says.

The PCI Express card delivers up to 400 trillion operations per second, and smaller products in the family carry that down into the 70 TOPS realm. They’re incredibly power-efficient, providing orders of magnitude more performance than is currently possible in a high-end mobile device and surpassing competitors in the market.

Groundbreaking vision use cases at the edge

“Inference at the edge enables scenarios that haven’t been possible up to now,” Vildibill says. “And it takes over human tasks and performs them far better than a human ever could.”

Cameras and sensors are now collecting terabytes of local data in retail environments, in smart manufacturing, in smart cities, and for safety applications in every industry. At those points of data capture, whether the input is coming from car sensors that have spotted a pedestrian unexpectedly stepping out into the street, or safety cameras on the floor of a manufacturing facility, optimized AI models are running at lightning speed. And this instantaneous computation, enabled by processors like the Cloud AI 100, delivers real-time, actionable insights to businesses, followed by split-second responses.

There are three fundamental categories of inferencing, Vildibill adds. The first is vision, whether it be an autonomous car or a smart city with cameras. The second is language, which includes speech-to-text and natural language processing. The third is analytics, which powers the kinds of systems used, for example, by social media platforms to recommend connections to users.

While all three are vital, vision inferencing is having a particularly large impact on safety and business operations in retail and manufacturing. A company can now connect up to 24 high-definition cameras to the smallest Cloud AI 100 product, such as the Qualcomm Cloud AI 100 Edge Development Kit, and run AI on every frame from each of those cameras in real time. With that kind of data and processing speed, an incredible array of new use cases has been unlocked. Here are just some of the opportunities available now.
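To put that camera count in perspective, here is a rough frame-budget calculation in Python; the 30 fps rate is an assumption for illustration, and only the 24-camera figure comes from the article:

```python
# Back-of-the-envelope frame budget for 24 real-time HD camera streams.
CAMERAS = 24
FPS = 30                               # assumed per-camera frame rate

frames_per_second = CAMERAS * FPS      # 720 inferences per second to keep up
budget_ms = 1000 / frames_per_second   # time available per frame

print(f"{frames_per_second} frames/s leaves about {budget_ms:.2f} ms per frame")
# Roughly 1.4 ms per frame if one accelerator services every stream in turn,
# which is the kind of sustained, power-efficient throughput the edge demands.
```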

New possibilities for business operations

  • On the floor of retail environments, cameras are monitoring queue lengths at the registers and alerting managers when lines are too long and more registers need to be opened to manage the rush. This queue data, captured in real time, can be aggregated and analyzed to make recommendations about staffing, such as when the store should hire more clerks to improve efficiency (a minimal sketch of this kind of queue monitor follows this list).
  • In-store stock can be tracked, and notifications sent to the supply chain when inventory is low.
  • Inferencing can help optimize retail marketing campaigns in store. Cameras and sensors monitor traffic patterns past sales displays and observe customers’ reactions to them (demonstrating an interest in a sale item by removing it from the shelf and inspecting it, placing the item in their cart, or putting it back on the shelf, for instance).
  • Employee behavior can also be tracked to monitor efficiency and make recommendations about process improvements.
  • In manufacturing assembly lines, AI cameras can track the movement of materials and alert supervisors if there are any gaps in supply that would slow the line down, as well as offer insight into the best course of action to keep it moving.
  • Manufacturers can leverage real-time feedback into the IT infrastructure and supply chain as goods are processed and moved by employees.
  • Manufacturers can also use AI vision to identify flaws that humans can miss, making the visual inspection process for quality control and safety on manufacturing lines faster, cheaper, and far more accurate.
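As referenced in the first bullet, here is a minimal sketch of the queue-length alerting flow. The detector output, region, and threshold are hypothetical placeholders, not Qualcomm software; the point is only the shape of the logic:

```python
# Hypothetical queue monitor: count person detections inside a queue region
# and alert staff when the line gets too long.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    x: float                 # normalized horizontal position in the frame, 0..1

QUEUE_REGION = (0.6, 1.0)    # assumed image region covering the register queue
MAX_QUEUE = 5                # assumed threshold before a manager is alerted

def people_in_queue(detections: list[Detection]) -> int:
    """Count person detections whose position falls inside the queue region."""
    lo, hi = QUEUE_REGION
    return sum(1 for d in detections if d.label == "person" and lo <= d.x <= hi)

def process_frame(detections: list[Detection]) -> None:
    count = people_in_queue(detections)
    if count > MAX_QUEUE:
        # A real deployment would notify staff and log the event for later
        # staffing analysis rather than print.
        print(f"ALERT: {count} customers waiting, open another register")

# Hypothetical frame: seven detected people, six of them standing in the queue.
frame = [Detection("person", x) for x in (0.1, 0.62, 0.7, 0.75, 0.8, 0.9, 0.95)]
process_frame(frame)
```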

Improving employee and customer safety

Companies are increasingly concerned about the safety of their employees. At the same time, they’re often under pressure to improve efficiency and lower costs. That can mean going without a dedicated safety team to monitor issues, or not having enough staff to respond to them. Fortunately, vision AI is now the answer for many of these challenges, and the Qualcomm Cloud AI 100 powers solutions across all of them.

  • Cameras monitoring the manufacturing floor will alert a supervisor when an employee walks by without their safety gear.
  • Forklifts can be tracked and someone alerted if they’re moving erratically.
  • A camera positioned in an Employees Only area of a retail location can detect whether an actual employee is in view or send an alert that an unidentified person is entering a restricted area.
  • Sensors can detect spills or breakage and alert supervisors of a possible hazard.
  • A human’s pose can be analyzed, not just to tell whether they’re inspecting a product and might need customer service, but whether they’re on the ground and may need medical aid (a minimal sketch of this check follows the list).
  • In parking lots, driving behavior can be monitored, and alerts sent in real time about accidents, unsafe drivers, and more.
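A minimal sketch of the pose-based “person on the ground” check mentioned above. The keypoints and threshold are hypothetical; in practice they would come from a pose-estimation model running on the accelerator:

```python
# Hypothetical fall check: flag a pose whose head and hips both sit at floor level.
FLOOR_Y = 0.8     # assumed: y-coordinates below this line count as floor level
                  # (normalized image coordinates, y grows downward)

def person_is_down(pose: dict[str, tuple[float, float]]) -> bool:
    """Return True when the head and hips are both at floor level."""
    return pose["head"][1] > FLOOR_Y and pose["hips"][1] > FLOOR_Y

# Hypothetical keypoints for a person lying on the ground.
pose = {"head": (0.40, 0.85), "hips": (0.55, 0.88), "feet": (0.70, 0.90)}

if person_is_down(pose):
    print("ALERT: possible fall detected, notify on-site staff")
```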

“These are just a few of the solutions our customers are starting to put in place,” Vildibill says. “We’re seeing this every day with our products, which marry performance with high efficiency, smaller form factor, and the lowest Total Cost of Ownership (TCO) possible with low latency and high privacy. We’ve achieved the extreme performance and extreme efficiency that’s fundamental to these use cases now, and are continuing to unlock new segments.”

Dig deeper: Learn more about the future of AI-to-edge-computing.


Sponsored articles are content produced by a company that is either paying for the post or has a business relationship with VentureBeat, and they’re always clearly marked. Content produced by our editorial team is never influenced by advertisers or sponsors in any way. For more information, contact [email protected]
