CES, the global tech event, has become a launchpad for industry collaborations aimed primarily at easing the biggest pain points in autonomous vehicle (AV) development.
Deloitte and NVIDIA have formed an alliance to offer services spanning data generation, collection, ingestion, curation, and labeling, as well as deep neural network (DNN) training with NVIDIA DGX SuperPOD.
According to Deloitte and NVIDIA, AVs are central to today's agile mobility landscape and are born in the data center. The two firms are therefore teaming up to provide a robust foundation for developers deploying self-driving technology.
Creating AVs is time-consuming and requires massive amounts of data. As an illustration: a fleet of 50 test vehicles driving six hours a day generates 1.6 petabytes daily; if all that data were stored on standard 1GB flash drives, they'd cover more than 100 football fields. Yet even that isn't enough.
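To put those figures in perspective, a quick back-of-the-envelope calculation (assuming the data volume is spread evenly across the fleet, which the article does not state) shows the sensor data rate each test vehicle would sustain:

```python
# Back-of-the-envelope check of the data volumes quoted above:
# 50 test vehicles, 6 hours of driving per day, 1.6 PB generated per day.

FLEET_SIZE = 50
HOURS_PER_DAY = 6
DAILY_TOTAL_PB = 1.6

daily_per_vehicle_tb = DAILY_TOTAL_PB * 1000 / FLEET_SIZE    # PB -> TB per vehicle
hourly_per_vehicle_tb = daily_per_vehicle_tb / HOURS_PER_DAY
per_second_gb = hourly_per_vehicle_tb * 1000 / 3600          # TB/hour -> GB/second

print(f"{daily_per_vehicle_tb:.0f} TB per vehicle per day")    # 32 TB
print(f"{hourly_per_vehicle_tb:.2f} TB per vehicle per hour")  # 5.33 TB
print(f"{per_second_gb:.2f} GB per vehicle per second")        # 1.48 GB
```

Roughly 1.5 GB per second per vehicle, sustained for six hours, explains why data ingestion and curation infrastructure is the first pain point the alliance targets.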
NVIDIA DGX systems and their advanced training tools enable streamlined, large-scale DNN training and optimization. Harnessing the power of GPUs and AI, developers can collect and curate data to comprehensively train DNNs for autonomous vehicle perception, planning, driving, and more.
Developers can also train and test these DNNs in simulation with NVIDIA DRIVE Sim, a physically accurate, cloud-based simulation platform built on NVIDIA's core technologies, including NVIDIA RTX, Omniverse, and AI, to provide a broad range of real-world scenarios for AV development and validation.
Using NVIDIA Omniverse Replicator, DRIVE Sim can generate high-fidelity synthetic data for training a vehicle's perception systems or testing its decision-making processes.
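Omniverse Replicator's actual API runs inside the Omniverse runtime, but the core idea behind synthetic data generation, randomizing scene parameters and emitting ground-truth labels alongside every sample, can be sketched in plain Python. All names below are hypothetical and purely illustrative, not Replicator's API:

```python
import random

# Toy sketch of domain randomization for synthetic training data: each sample
# pairs randomized scene parameters with ground-truth labels that a simulator
# produces for free (no human annotation needed).

WEATHER = ["clear", "rain", "fog", "snow"]
TIME_OF_DAY = ["dawn", "noon", "dusk", "night"]

def sample_scenario(rng: random.Random) -> dict:
    """Draw one randomized driving scenario with its ground-truth labels."""
    n_vehicles = rng.randint(0, 12)
    n_pedestrians = rng.randint(0, 8)
    return {
        "weather": rng.choice(WEATHER),
        "time_of_day": rng.choice(TIME_OF_DAY),
        "sun_angle_deg": rng.uniform(0.0, 90.0),
        # Ground truth comes directly from the scene description.
        "labels": {"vehicles": n_vehicles, "pedestrians": n_pedestrians},
    }

rng = random.Random(42)
dataset = [sample_scenario(rng) for _ in range(1000)]
foggy = sum(1 for s in dataset if s["weather"] == "fog")
print(f"{foggy} of {len(dataset)} scenarios are foggy")
```

In a real pipeline, each scenario would drive a physically based renderer, so rare conditions (fog at dusk with many pedestrians) can be oversampled deliberately rather than waited for on real roads.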
Ashok Divakaran, Connected and Autonomous Vehicle Lead at Deloitte, stated: “The robust AI infrastructure provided by NVIDIA DGX SuperPOD is paving the way for our clients to develop transformative autonomous driving solutions for safer and more efficient transportation.”
Last March, Deloitte announced the launch of the Deloitte Center for AI Computing, a unique center designed to accelerate the development of innovative AI solutions for its clients. The center is built on NVIDIA DGX A100 systems, bringing together the supercomputing architecture and expertise that Deloitte clients require as they become AI-fueled organizations.
To further scale AV development and shorten time to outcome, clients can opt for NVIDIA DGX SuperPOD, which comprises 20 or more DGX systems plus networking and storage.
NVIDIA and Deloitte have also launched Infrastructure-as-a-Service, which offers management of the DGX SuperPOD infrastructure in an on-premises or co-location environment.
Through MLOps-as-a-Service, the NVIDIA-Deloitte alliance also offers a turnkey solution that deploys and supports enterprise-grade MLOps software to train DNNs and accelerate accuracy, improving data scientists' productivity by up to 30 percent.
Finally, NVIDIA and Deloitte make it possible to curate specific scenarios for comprehensive DNN training with Synthetic Data Generation-as-a-Service. Developers can take advantage of simulation expertise to generate high-fidelity training data to cover the rare and hazardous situations AVs must be able to handle safely.
The Autonomous Era Dawns at CES 2022 with NVIDIA DRIVE Hyperion and Omniverse Avatar
CES has long been a showcase of what’s coming down the technology pipeline. This year, NVIDIA is showing the radical innovation happening now.
Ali Kani, vice president and general manager of Automotive at NVIDIA, detailed the capabilities of DRIVE Hyperion and the many ways the industry is developing on the platform. These critical advancements show the maturity of autonomous driving technology as companies begin to deploy safer, more efficient transportation.
The DRIVE Hyperion architecture has been adopted by hundreds of automakers, truck makers, tier 1 suppliers, and robotaxi companies, ushering in the new era of autonomy. Bringing this comprehensive platform architecture to the global automotive ecosystem requires collaboration with leading tier 1 suppliers.
Desay, Flex, Quanta, Valeo, and ZF are now DRIVE Hyperion 8 platform scaling partners, manufacturing production-ready designs with the highest levels of functional safety and security.
Geoffrey Buoquot, CTO and vice president of Strategy at Valeo stated: “We are excited to work with NVIDIA on their DRIVE Hyperion platform. On top of our latest generation ultrasonic sensors providing digital raw data that their AI classifiers can process and our 12 cameras, including the new 8-megapixel cameras, we are now also able to deliver an Orin-based platform to support autonomous driving applications with consistent performance under automotive environmental conditions and production requirements.”
Mike Thoeny, president of Automotive at Flex stated: “Flex is thrilled to collaborate with NVIDIA to help accelerate the deployment of autonomous and ADAS systems leveraging the DRIVE Orin platform to design solutions for use across multiple customers.”
The radical transformation of the transportation industry extends from computing power to powertrains.
Electric vehicles have just started to become the new norm, and they aren't just better for the environment: they fundamentally improve the driving experience. With a quieter and more sustainable profile, EVs will begin to make up the majority of cars sold over the next several decades.
Numerous leading new energy vehicle (NEV) makers are adopting DRIVE Hyperion as the platform on which to develop these clean, intelligent models. From the storied performance heritage of Polestar to the breakthrough success of IM Motors, Li Auto, NIO, R Auto, and Xpeng, these companies are reinventing the personal transportation experience.
AI isn’t just transforming personal transportation, it’s also addressing the rapidly growing challenges faced by the trucking and logistics industry.
Rounding out the in-vehicle experience is the NVIDIA DRIVE Concierge platform, which delivers intelligent services that are always on.
DRIVE Concierge combines NVIDIA Omniverse Avatar, DRIVE IX, DRIVE AV 4D perception, Riva GPU-accelerated speech AI SDK and an array of deep neural networks to delight customers on every drive.
Omniverse Avatar integrates speech AI, computer vision, natural language understanding, recommendation engines, and simulation. Avatars built on the platform are interactive characters with ray-traced 3D graphics that can see, speak, converse on a wide range of subjects, and understand naturally spoken intent.
These important developments in intelligent driving technology, as well as innovations from suppliers, automakers, and trucking companies all building on NVIDIA DRIVE, are heralding the arrival of the autonomous era.
NVIDIA's Isaac AMR Platform Aids the $9 Trillion Logistics Industry
NVIDIA recently unveiled the Isaac Autonomous Mobile Robot (AMR) platform to optimize operational efficiency and accelerate the deployment of AMRs. Isaac AMR extends NVIDIA Isaac capabilities for building and deploying robotics applications, bringing mapping, site analytics, and fleet optimization onto NVIDIA-Certified Systems.
The Isaac AMR platform uses NVIDIA Omniverse to build digital twins of the facilities where AMRs will be deployed. NVIDIA Isaac Sim, built on Omniverse, simulates the behavior of robot fleets, people, and other machines within the digital twin using high-fidelity physics and perception.
It also enables synthetic data generation for training AI models. Isaac AMR consists of GPU-accelerated AI technologies and SDKs, including DeepMap, ReOpt, and Metropolis, all securely orchestrated and cloud-delivered with NVIDIA Fleet Command.
Mapping doesn’t account for everything in these environments. And the advanced sensors onboard AMRs aren’t always sufficient to ensure safe and efficient operation.
The NVIDIA Metropolis video analytics platform meets the need for higher-level real-time “outside-in” perception by providing access to cameras and sensors deployed all over the factory or warehouse floor.
With Metropolis, AMRs have access to additional layers of situational awareness on the factory floor, enabling them to avoid high-congestion areas, eliminate blind spots, and enhance the visibility of both people and other AMRs. In addition, Metropolis’s pre-trained models provide a head start in customizing for site-specific needs.
NVIDIA’s recent acquisition of DeepMap brings advances in mapping for autonomous vehicles to the AMR industry as well. AMR deployments can access the DeepMap platform’s cloud-based SDK to help accelerate robot mapping of large facilities from weeks to days while achieving centimeter-level accuracy.
The DeepMap Update Client enables robot maps to be updated as frequently as necessary, in real time. The DeepMap SDK also adds layers of intelligence to maps through semantic understanding, so robots can identify objects and determine whether a path is traversable. It can handle both indoor and outdoor map building.
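The DeepMap SDK itself is proprietary, but the idea of a semantic map layer, geometry annotated with class labels that a planner can query for traversability, can be illustrated with a toy structure. Everything here is hypothetical, for illustration only:

```python
# Toy illustration of a semantic map layer: grid cells carry a class label,
# and a planner queries whether a cell is traversable for an AMR.

TRAVERSABLE = {"floor", "ramp", "dock"}

semantic_map = {
    (0, 0): "floor",
    (0, 1): "ramp",
    (1, 0): "shelf",
    (1, 1): "forklift_lane",
}

def can_traverse(cell: tuple) -> bool:
    """Return True if the cell's semantic class permits AMR travel.

    Unknown cells default to non-traversable, a conservative choice
    for safety on a factory floor.
    """
    label = semantic_map.get(cell, "unknown")
    return label in TRAVERSABLE

print(can_traverse((0, 1)))  # ramp -> True
print(can_traverse((1, 0)))  # shelf -> False
```

A real semantic map would attach labels to 3D geometry rather than grid cells, but the planner-facing query is the same: class first, geometry second.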
As part of the Isaac AMR platform, NVIDIA DeepMap integrates with other components, such as Metropolis, ReOpt, Isaac Sim via Omniverse, and more.
NVIDIA ReOpt AI software libraries can be used to optimize vehicle route planning and logistics in real time, and can be applied to fleets of AMRs.
Using Isaac Sim, companies can simulate interactions among multiple AMRs with NVIDIA ReOpt. These simulations run quickly and accurately in digital twins of environments such as warehouses, and routes can be re-evaluated as conditions change before robots are deployed in production, saving time and money.
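ReOpt is a GPU-accelerated solver, but the class of problem it tackles, ordering stops to minimize total travel, can be shown with a simple nearest-neighbor heuristic. This is a toy baseline for illustration, not NVIDIA's algorithm or API:

```python
import math

# Toy nearest-neighbor heuristic for ordering an AMR's pick stops.
# Real solvers such as NVIDIA ReOpt find far better routes at scale;
# this only illustrates the optimization problem being solved.

def dist(a, b):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def nearest_neighbor_route(start, stops):
    """Greedily visit the closest unvisited stop until none remain."""
    route, current, remaining = [start], start, list(stops)
    while remaining:
        nxt = min(remaining, key=lambda s: dist(current, s))
        remaining.remove(nxt)
        route.append(nxt)
        current = nxt
    return route

stops = [(5.0, 1.0), (1.0, 1.0), (2.0, 4.0), (6.0, 3.0)]
route = nearest_neighbor_route((0.0, 0.0), stops)
total = sum(dist(a, b) for a, b in zip(route, route[1:]))
print(route)
print(f"total distance: {total:.2f}")
```

Nearest-neighbor is fast but can get stuck in poor orderings; production solvers search over many candidate routes and handle time windows, battery constraints, and multiple vehicles simultaneously, which is where GPU acceleration pays off.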
Omniverse Free Version Available to Millions of Individual Creators and Artists Worldwide
Designed to be the foundation that connects virtual worlds, NVIDIA Omniverse is now available to millions of individual NVIDIA Studio creators using GeForce RTX and NVIDIA RTX GPUs.
With Omniverse, NVIDIA’s real-time 3D design collaboration and virtual world simulation platform, artists, designers and creators can use leading design applications to create 3D assets and scenes from their laptop or workstation.
In a special address at CES, NVIDIA also announced new platform developments for Omniverse Machinima and Omniverse Audio2Face, new platform features like Nucleus Cloud and 3D marketplaces, as well as ecosystem updates.
Zhelong Xu, a digital artist and Omniverse Creator based in Shanghai stated: “With this technology, content creators get more than just a fast renderer. NVIDIA Omniverse and RTX give artists a powerful platform with infinite possibilities.”
The culmination of over 20 years of NVIDIA’s groundbreaking work, Omniverse brings graphics, AI, simulation and scalable compute into a single platform to enhance existing 3D workflows.
With Omniverse, free for individual users, GeForce RTX Studio creators can connect their favorite 3D design tools to a single scene and simultaneously create and edit between the apps.
Salient Latest Features of Omniverse
Omniverse Nucleus Cloud enables “one-click-to-collaborate” sharing of large Omniverse 3D scenes, meaning artists can collaborate from across the room or across the globe without transferring massive datasets. Changes made by one artist are reflected back to every collaborator, like working on a cloud-shared document, but for a 3D scene.
New support for the Omniverse ecosystem provided by leading 3D marketplaces and digital asset libraries gives creators an even easier way to build their scenes. TurboSquid by Shutterstock, CGTrader, Sketchfab, and Twinbru have released thousands of Omniverse-ready assets for creators, all based on Universal Scene Description (USD) format, and are found directly in the Omniverse Launcher. Reallusion’s ActorCore, Daz3D, and e-on software’s PlantCatalog will soon release their own Omniverse-ready assets.
Omniverse Machinima, for RTX creators who love to game, now features new, free characters, objects, and environments from leading game titles such as Mechwarrior 5 and Shadow Warrior 3, plus Mount & Blade II: Bannerlord and Squad assets in the Machinima library. Creators can remix and recreate their own game cinematics with these assets by dragging and dropping them into their scenes.
Omniverse Audio2Face, a revolutionary AI-enabled app that instantly animates a 3D face with just an audio track, now offers blendshape support and direct export to Epic’s MetaHuman Creator app. This leaves the tedious, manual blend-shaping process to AI, so artists and creators can spend more time on their creative workflows.
There are currently 14 connectors to applications such as Autodesk 3ds Max, Autodesk Maya, and Epic Games' Unreal Engine, with many more in the pipeline, including an Adobe Substance 3D Material Extension coming soon.
Article received from NVIDIA