
Nvidia launches EGX platform for AI processing at the edge

Published: Tue 28 May 2019

Early adopters of EGX include BMW Group and Foxconn

Nvidia has announced a new accelerated computing platform for deploying AI at the edge of the network.

AI has grown into an indispensable tool for businesses of all stripes, yet the majority of these applications still sit in the core, whether that’s on-prem or in the cloud.

A whole host of applications, such as smart mirrors in retail stores, demand near-instantaneous, high-throughput AI processing where the data is produced.

As the number of Internet-connected devices and sensors skyrockets, firms want to process the near-continuous stream of data they produce, while also reducing the amount they must send to the cloud.

To meet these demands, Nvidia has announced EGX, an AI platform for the instant processing of data in the telecommunications, manufacturing, transportation and retail industries. Companies can start small with the compact Jetson Nano server or go large with a full rack of Nvidia T4s.

“A scalable platform like Nvidia EGX allows [companies] to easily deploy systems to meet their needs on premises, in the cloud or both,” said Bob Pette, VP and GM of Enterprise and Edge Computing at Nvidia.

“On-prem AI Cloud-in-a-Box”

EGX runs an optimised software stack for the edge with support for containerised applications. The Nvidia Edge Stack includes a CUDA Kubernetes plugin, a CUDA container runtime, CUDA-X libraries, and containerised AI frameworks and applications, including TensorRT, TensorRT Inference Server and DeepStream. EGX is also fully integrated with Red Hat’s container orchestration platform, OpenShift.
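The Kubernetes support in the stack means GPU workloads can be scheduled like any other container. As a minimal sketch (the pod name and image tag are illustrative, and it assumes the Nvidia Kubernetes device plugin is installed on the cluster), a pod requesting one GPU for an inference server might look like:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: trt-inference          # illustrative name
spec:
  containers:
  - name: tensorrt-server
    image: nvcr.io/nvidia/tensorrtserver:19.05-py3  # tag illustrative
    resources:
      limits:
        nvidia.com/gpu: 1      # GPU requested via the device plugin
```

The `nvidia.com/gpu` resource is what the device plugin advertises to the scheduler, so pods land only on GPU-equipped nodes.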

“By combining Red Hat OpenShift and Nvidia EGX-enabled platforms, customers can better optimise their distributed operations with a consistent, high-performance, container-centric environment,” said Chris Wright, CTO at Red Hat.

For networking, EGX is leveraging NICs and switches from Mellanox, the smart networking company which Nvidia acquired for $6.9 billion (£5.45 billion) earlier this year, and Ethernet and IP-based networking technologies from Cisco.

Nvidia AI computing services offered by the major cloud providers are architecturally compatible with Nvidia EGX, meaning applications developed in the cloud can run on Nvidia EGX and vice versa.

EGX also connects with the major public cloud IoT services, including AWS Greengrass and Azure IoT Edge, so customers can remotely manage deployments.

Nvidia said over 40 companies and organisations are already using EGX, including BMW, Foxconn, Harvard Medical School, and Seagate.

“Foxconn PC production lines are limited by the speed of inspection, because it currently takes four seconds to manually inspect each part,” said Mark Chein, GM at Foxconn D Group.

“Our goal is to increase the throughput of the PC production line by over 40 percent using the Nvidia EGX platform for real-time intelligent decision-making at the edge. Our model detects and classifies 16 defect types and locations simultaneously using fast neural networks running on Nvidia GPUs, achieving 98 percent accuracy at a superhuman throughput rate.”
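The decision step the quote describes, detecting which of 16 defect types are present in a part, is essentially a multi-label classification. A minimal sketch of that step follows; it is illustrative, not Foxconn's actual model, and in a real pipeline the per-class scores would come from GPU inference (e.g. via TensorRT) rather than hand-written values:

```python
DEFECT_CLASSES = 16  # number of defect types cited in the quote

def detect_defects(scores, threshold=0.5):
    """Return the indices of defect classes whose score exceeds the
    threshold. `scores` holds 16 per-class probabilities; here they
    are plain numbers standing in for a network's output."""
    assert len(scores) == DEFECT_CLASSES
    return [i for i, s in enumerate(scores) if s > threshold]

# Example: only defect classes 2 and 7 exceed the threshold
scores = [0.0] * DEFECT_CLASSES
scores[2], scores[7] = 0.9, 0.8
print(detect_defects(scores))  # prints [2, 7]
```

Because each class is thresholded independently, several defect types can be flagged on the same part in a single pass, which is what "detects and classifies 16 defect types and locations simultaneously" implies.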


Tags: AI, AWS, Azure, edge, IoT, machine learning, Nvidia