NVIDIA reveals new tech for mobility, the logistics industry, and the Omniverse


At CES this week, NVIDIA and Deloitte detailed their collaboration, which is aimed at easing the biggest pain points in AV development. Deloitte, a leading global consulting firm, is pairing with NVIDIA to offer a range of services for data generation, collection, ingestion, curation, labeling, and deep neural network (DNN) training with NVIDIA DGX SuperPOD.

NVIDIA DGX systems and advanced training tools enable streamlined, large-scale DNN training and optimization. Using the power of GPUs and AI, developers can seamlessly collect and curate data to comprehensively train DNNs for autonomous vehicle perception, planning, driving, and more. Developers can also train and test these DNNs in simulation with NVIDIA DRIVE Sim, a physically accurate, cloud-based simulation platform. It taps into NVIDIA’s core technologies including NVIDIA RTX, Omniverse, and AI to deliver a wide range of real-world scenarios for AV development and validation.
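
As a rough illustration of the kind of DNN training such a pipeline automates at scale, the sketch below trains a small perception classifier in PyTorch on a stand-in dataset. The dataset, label set, network, and hyperparameters are placeholder assumptions for illustration only and are not part of NVIDIA's or Deloitte's actual tooling.

```python
# Minimal, hypothetical sketch of a perception-DNN training loop of the kind a
# DGX-based pipeline would run at far larger scale. The dataset and network are
# stand-ins; no NVIDIA or Deloitte tooling is used here.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, Dataset


class DrivingFrames(Dataset):
    """Placeholder dataset: random 'camera frames' with per-frame class labels."""

    def __init__(self, num_frames=1024, num_classes=5):
        self.images = torch.rand(num_frames, 3, 224, 224)
        self.labels = torch.randint(0, num_classes, (num_frames,))

    def __len__(self):
        return len(self.labels)

    def __getitem__(self, idx):
        return self.images[idx], self.labels[idx]


def train(num_classes=5, epochs=2):
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = nn.Sequential(  # toy stand-in for a real perception backbone
        nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
        nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(32, num_classes),
    ).to(device)
    loader = DataLoader(DrivingFrames(num_classes=num_classes), batch_size=64, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    for epoch in range(epochs):
        for images, labels in loader:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            optimizer.step()
        print(f"epoch {epoch}: last batch loss {loss.item():.3f}")


if __name__ == "__main__":
    train()
```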

“The robust AI infrastructure provided by NVIDIA DGX SuperPOD is paving the way for our clients to develop transformative autonomous driving solutions for safer and more efficient transportation,” said Ashok Divakaran, Connected and Autonomous Vehicle Lead at Deloitte.

The Deloitte Center for AI Computing is built on NVIDIA DGX A100 systems to bring together the supercomputing architecture and expertise that Deloitte clients require as they become AI-fueled organizations. To further scale AV development and speed time to results, customers can choose the NVIDIA DGX SuperPOD, which includes 20 or more DGX systems plus networking and storage. Finally, NVIDIA and Deloitte make it possible to curate specific scenarios for comprehensive DNN training with Synthetic Data Generation-as-a-Service. Developers can take advantage of simulation expertise to generate high-fidelity training data that covers the rare and hazardous situations AVs must be able to handle safely.
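
The idea behind curating rare scenarios with synthetic data can be sketched as simple scenario-parameter randomization. The Scenario class, parameter names, and ranges below are assumptions for illustration and are not DRIVE Sim's actual scenario interface.

```python
# Illustrative sketch of scenario randomization for rare or hazardous cases.
# Parameter names, ranges, and the Scenario class are assumptions for
# illustration only; DRIVE Sim's real scenario API is not used here.
import random
from dataclasses import dataclass


@dataclass
class Scenario:
    weather: str
    time_of_day: float        # hours, 0-24
    pedestrian_count: int
    lead_vehicle_cut_in: bool


def sample_rare_scenarios(n, seed=0):
    """Oversample conditions that are rare on real roads (night, fog, cut-ins)."""
    rng = random.Random(seed)
    scenarios = []
    for _ in range(n):
        scenarios.append(Scenario(
            weather=rng.choice(["fog", "heavy_rain", "snow", "glare"]),
            time_of_day=rng.choice([rng.uniform(0, 5), rng.uniform(21, 24)]),
            pedestrian_count=rng.randint(5, 30),
            lead_vehicle_cut_in=rng.random() < 0.5,
        ))
    return scenarios


if __name__ == "__main__":
    for scenario in sample_rare_scenarios(3):
        print(scenario)
```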

NVIDIA Builds Isaac AMR Platform to Aid $9 Trillion Logistics Industry

Isaac AMR extends NVIDIA Isaac capabilities for building and deploying robotics applications, bringing mapping, site analytics, and fleet optimization onto NVIDIA Certified Systems.

The Isaac AMR platform uses NVIDIA Omniverse for creating digital twins of the facility where AMRs will be deployed. NVIDIA Isaac Sim (built on Omniverse) simulates the behavior of robot fleets, people, and other machines in the digital twins with high-fidelity physics and perception. It also enables synthetic data generation for the training of AI models. Isaac AMR consists of GPU-accelerated AI technologies and SDKs including DeepMap, ReOpt, and Metropolis. These technologies are securely orchestrated and cloud-delivered with NVIDIA Fleet Command.
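
A minimal headless simulation session gives a feel for how such a digital twin can be driven from Python. The sketch below assumes the Isaac Sim Python API of this period (omni.isaac.kit and omni.isaac.core); exact module paths and options may differ between releases.

```python
# Minimal headless Isaac Sim session sketch: create a world, step physics, exit.
# Assumes the Isaac Sim Python API of this era (omni.isaac.kit / omni.isaac.core);
# module paths and options may differ between releases.
from omni.isaac.kit import SimulationApp

# The SimulationApp must be created before importing other omni.isaac modules.
simulation_app = SimulationApp({"headless": True})

from omni.isaac.core import World  # noqa: E402

world = World()
world.scene.add_default_ground_plane()
world.reset()

for _ in range(240):              # simulate a few seconds of the digital twin
    world.step(render=False)

simulation_app.close()
```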

NVIDIA’s recent acquisition of DeepMap brings advances in mapping for autonomous vehicles to the AMR industry as well. AMR deployments can access the DeepMap platform’s cloud-based SDK to help accelerate robot mapping of large facilities from weeks to days while achieving centimeter-level accuracy.

The DeepMap Update Client enables robot maps to be updated as frequently as necessary, in real time. And the DeepMap SDK adds layers of intelligence to maps through semantic understanding, so robots can identify what the objects in a map represent and know whether they can move through a given area. It can also handle both indoor and outdoor map building. As part of the Isaac AMR platform, NVIDIA DeepMap integrates with other components, such as Metropolis, ReOpt, Isaac Sim via Omniverse, and more.
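
The idea of layering semantic understanding onto a map can be illustrated with a small grid in which each cell carries a class label and a traversability flag. The classes, resolution, and interface below are assumptions for illustration and are not the DeepMap SDK.

```python
# Illustrative semantic map layer: each grid cell stores a semantic class, so a
# planner can ask "what is here, and may I cross it?" Classes, resolution, and
# API are illustrative assumptions, not the DeepMap SDK.
import numpy as np

CLASSES = {0: "free_floor", 1: "shelf", 2: "charging_dock", 3: "pedestrian_lane"}
TRAVERSABLE = {0: True, 1: False, 2: False, 3: True}


class SemanticMap:
    def __init__(self, width_m, height_m, resolution_m=0.05):
        shape = (int(height_m / resolution_m), int(width_m / resolution_m))
        self.resolution = resolution_m
        self.labels = np.zeros(shape, dtype=np.uint8)   # semantic class per cell

    def _cell(self, x_m, y_m):
        return int(y_m / self.resolution), int(x_m / self.resolution)

    def label_region(self, x0, y0, x1, y1, class_id):
        r0, c0 = self._cell(x0, y0)
        r1, c1 = self._cell(x1, y1)
        self.labels[r0:r1, c0:c1] = class_id

    def query(self, x_m, y_m):
        class_id = int(self.labels[self._cell(x_m, y_m)])
        return CLASSES[class_id], TRAVERSABLE[class_id]


if __name__ == "__main__":
    m = SemanticMap(width_m=20, height_m=10)
    m.label_region(2, 2, 6, 8, class_id=1)   # mark a shelf block
    print(m.query(3.0, 4.0))                 # ('shelf', False)
    print(m.query(10.0, 5.0))                # ('free_floor', True)
```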

With Metropolis, AMRs have access to additional layers of situational awareness on the factory floor, enabling them to avoid high-congestion areas, eliminate blind spots, and enhance the visibility of both people and other AMRs. In addition, Metropolis’s pre-trained models provide a head start in customizing for site-specific needs.

NVIDIA ReOpt AI software libraries can be used to optimize vehicle route planning and logistics in real time, which can be applied to fleets of AMRs. NVIDIA ReOpt provides dynamic re-optimization of routes for a fleet of heterogeneous AMRs based on a number of constraints.
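
As a toy illustration of the kind of re-optimization such libraries perform at far larger scale, the sketch below greedily reassigns pickup tasks to a small heterogeneous AMR fleet and can simply be re-run whenever tasks or robot positions change. The cost model and constraints are simplified assumptions; this is not the ReOpt API.

```python
# Toy re-optimization sketch: greedily assign tasks to a heterogeneous AMR
# fleet, then re-run whenever tasks arrive or robots move. The cost model and
# constraints are simplified assumptions, not the NVIDIA ReOpt API.
import math
from dataclasses import dataclass, field


@dataclass
class Robot:
    name: str
    x: float
    y: float
    max_payload_kg: float
    route: list = field(default_factory=list)


@dataclass
class Task:
    name: str
    x: float
    y: float
    payload_kg: float


def reoptimize(robots, tasks):
    """Assign each task to the feasible robot whose route currently ends closest to it."""
    for r in robots:
        r.route.clear()
    positions = {r.name: (r.x, r.y) for r in robots}
    for task in sorted(tasks, key=lambda t: -t.payload_kg):        # heavy tasks first
        feasible = [r for r in robots if r.max_payload_kg >= task.payload_kg]
        if not feasible:
            continue                                               # no robot can carry it
        best = min(feasible, key=lambda r: math.dist(positions[r.name], (task.x, task.y)))
        best.route.append(task.name)
        positions[best.name] = (task.x, task.y)                    # robot ends at the task
    return {r.name: r.route for r in robots}


if __name__ == "__main__":
    fleet = [Robot("amr-small", 0, 0, 50), Robot("amr-heavy", 10, 0, 500)]
    tasks = [Task("pallet", 9, 1, 300), Task("tote", 1, 2, 10), Task("bin", 2, 8, 20)]
    print(reoptimize(fleet, tasks))    # re-run this call as conditions change
```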

Open Version of Omniverse for Individual Creators and Artists Worldwide

In a special address at CES, NVIDIA also announced new platform developments for Omniverse Machinima and Omniverse Audio2Face, new platform features like Nucleus Cloud and 3D marketplaces, as well as ecosystem updates.

“With this technology, content creators get more than just a fast renderer,” said Zhelong Xu, a digital artist and Omniverse Creator based in Shanghai. “NVIDIA Omniverse and RTX give artists a powerful platform with infinite possibilities.”

New features within Omniverse include: 

  • Omniverse Nucleus Cloud enables “one-click-to-collaborate” simple sharing of large Omniverse 3D scenes, meaning artists can collaborate from across the room or the globe without transferring massive datasets. 
  • New support for the Omniverse ecosystem provided by leading 3D marketplaces and digital asset libraries gives creators an even easier way to build their scenes. TurboSquid by Shutterstock, CGTrader, Sketchfab, and Twinbru have released thousands of Omniverse-ready assets for creators, all based on the Universal Scene Description (USD) format (see the short USD sketch after this list).
  • Omniverse Machinima adds new game assets for RTX creators who love to game, including Bannerlord and Squad assets in the Machinima library. Creators can remix and recreate their own game cinematics with these assets by dragging and dropping them into their scenes.
  • Omniverse Audio2Face, an AI-enabled app that instantly animates a 3D face with just an audio track, now offers blend shape support and direct export to Epic’s MetaHuman Creator app.
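
Because the marketplace assets are distributed as USD, a scene can pull them in as references through the open USD Python API (pxr). The asset path and prim names below are placeholders.

```python
# Compose a small scene by referencing an external USD asset, using the open
# USD Python API (pxr). The asset path and prim names are placeholders for
# downloaded marketplace files.
from pxr import Usd, UsdGeom, Gf

stage = Usd.Stage.CreateNew("scene.usda")
UsdGeom.Xform.Define(stage, "/World")

# Reference a downloaded marketplace asset into the scene (path is hypothetical).
chair = stage.DefinePrim("/World/Chair", "Xform")
chair.GetReferences().AddReference("./assets/office_chair.usd")

# Place the referenced asset.
UsdGeom.XformCommonAPI(chair).SetTranslate(Gf.Vec3d(1.0, 0.0, 0.5))

stage.SetDefaultPrim(stage.GetPrimAtPath("/World"))
stage.GetRootLayer().Save()
```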

The latest Omniverse Connectors, extensions, and asset libraries include e-on software’s VUE, PlantFactory, and PlantCatalog, as well as Twinbru assets, among others.

Autonomous Era Arrives at CES 2022 With NVIDIA DRIVE Hyperion and Omniverse Avatar

During a special virtual address at the show, Ali Kani, vice president and general manager of Automotive at NVIDIA, detailed the capabilities of DRIVE Hyperion and the many ways the industry is developing on the platform. These critical advancements show the maturity of autonomous driving technology as companies begin to deploy safer, more efficient transportation.

The DRIVE Hyperion architecture has been adopted by hundreds of automakers, truck makers, tier 1 suppliers, and robotaxi companies, ushering in the new era of autonomy. Bringing this comprehensive platform architecture to the global automotive ecosystem requires collaboration with leading tier 1 suppliers. Desay, Flex, Quanta, Valeo, and ZF are now DRIVE Hyperion 8 platform scaling partners, manufacturing production-ready designs with the highest levels of functional safety and security.

“We are excited to work with NVIDIA on their DRIVE Hyperion platform,” said Geoffrey Buoquot, CTO and vice president of Strategy at Valeo. “On top of our latest generation ultrasonic sensors providing digital raw data that their AI classifiers can process and our 12 cameras, including the new 8-megapixel cameras, we are now also able to deliver an Orin-based platform to support autonomous driving applications with consistent performance under automotive environmental conditions and production requirements.”

“Flex is thrilled to collaborate with NVIDIA to help accelerate the deployment of autonomous and ADAS systems leveraging the DRIVE Orin platform to design solutions for use across multiple customers,” said Mike Thoeny, president of Automotive at Flex.

Many leading new energy vehicle (NEV) makers are adopting DRIVE Hyperion as the platform to develop these clean, intelligent models. From the storied performance heritage of Polestar to the breakthrough success of IM Motors, Li Auto, NIO, R Auto, and Xpeng, these companies are reinventing the personal transportation experience. DRIVE Concierge combines NVIDIA Omniverse Avatar, DRIVE IX, DRIVE AV 4D perception, the Riva GPU-accelerated speech AI SDK, and an array of deep neural networks to delight customers on every drive.

Omniverse Avatar connects speech AI, computer vision, natural language understanding, recommendation engines, and simulation. Avatars created on the platform are interactive characters with ray-traced 3D graphics that can see, speak, converse on a wide range of subjects, and understand naturally spoken intent. These important developments in intelligent driving technology, as well as innovations from suppliers, automakers, and trucking companies all building on NVIDIA DRIVE, are heralding the arrival of the autonomous era.
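
Conceptually, an avatar built this way is a pipeline that chains those components together. The sketch below wires stub stages into that shape; every stage is a placeholder assumption rather than the actual Omniverse Avatar, Riva, or recommender APIs.

```python
# Conceptual pipeline shape for an interactive avatar: speech recognition feeds
# language understanding, which drives a response, synthesized speech, and the
# animated face. Every stage here is a stub assumption, not an NVIDIA API.
from dataclasses import dataclass


@dataclass
class AvatarResponse:
    text: str
    audio: bytes
    face_animation: list      # e.g. per-frame blend-shape weights


class AvatarPipeline:
    def __init__(self, asr, nlu, recommender, tts, animator):
        self.asr, self.nlu = asr, nlu
        self.recommender, self.tts, self.animator = recommender, tts, animator

    def respond(self, audio_in: bytes, camera_frame=None) -> AvatarResponse:
        transcript = self.asr(audio_in)                       # speech -> text
        intent = self.nlu(transcript)                         # text -> intent/slots
        reply_text = self.recommender(intent, camera_frame)   # pick an answer/action
        reply_audio = self.tts(reply_text)                    # text -> speech
        frames = self.animator(reply_audio)                   # speech -> face animation
        return AvatarResponse(reply_text, reply_audio, frames)


if __name__ == "__main__":
    # Trivial stubs so the pipeline runs end to end.
    pipeline = AvatarPipeline(
        asr=lambda audio: "navigate to the nearest charger",
        nlu=lambda text: {"intent": "navigate", "target": "charger"},
        recommender=lambda intent, frame: f"Routing to the nearest {intent['target']}.",
        tts=lambda text: text.encode(),
        animator=lambda audio: [[0.0] * 52 for _ in range(3)],
    )
    print(pipeline.respond(b"...").text)
```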

Source: NVIDIA