What these robots are doing will probably remind you of the VR training from the Metal Gear Solid series.
Sources -
Omniverse (NVIDIA Omniverse™ is a platform of APIs, SDKs, and services that enable developers to easily integrate Universal Scene Description (OpenUSD) and RTX rendering technologies into existing software tools and simulation workflows for building AI systems.): https://www.nvidia.com/en-au/omniverse/
Project GR00T (GR00T is a general-purpose foundation model that promises to transform humanoid robot learning in simulation and the real world): https://developer.nvidia.com/project-GR00T
Isaac Perceptor (NVIDIA Isaac™ Perceptor is a collection of hardware-accelerated packages for visual AI, tailored for Autonomous Mobile Robot (AMR) to perceive, localize, and operate robustly in unstructured environments.): https://developer.nvidia.com/isaac/perceptor
Isaac Sim (NVIDIA Isaac Sim™ is an extensible robotics simulation platform that gives you a faster, better way to design, test, and train AI-based robots. It’s powered by Omniverse™ to deliver scalable, photorealistic, and physically accurate virtual environments for building high-fidelity simulations.): https://developer.nvidia.com/isaac-sim
Omniverse cloud API press release: https://nvidianews.nvidia.com/news/omniverse-cloud-apis-industrial-digital-twin
Digital twins warehouse blog: https://blogs.nvidia.com/blog/ai-digital-twins-industrial-automation-demo/
NVIDIA held their GTC (GPU Technology Conference) very recently; the two-hour keynote video currently has over 7.5 million views.
At the 1:30:00 mark of the video, Jensen Huang talks about Omniverse and robotics. One of the uses NVIDIA found for Omniverse is creating physics-accurate simulations that are then applied in the real world, hence their term "digital twin"; they claim this saves on costs, planning, etc. Of course, this isn't entirely new: engineers have long designed things virtually before manufacturing them.
Let's skip to the robotics part. Here are a few short videos that were shown in that keynote (except for the video from 2022).
Here's their promotional video titled, "NVIDIA Robotics: A Journey From AVs to Humanoids":
Video title: "Fusing Real-Time AI With Digital Twins"
Video description:
Discover the AI that'll drive the next phase of industrial automation—how it'll be developed, refined, and first deployed in simulation in digital twins. Complex AI is being tested in real time inside an Omniverse digital twin of a warehouse, showcasing AI that's been developed inside this digital twin. It’s a workflow that developers can use to build AI gyms to train and evaluate complex AI, all in real time within the digital twin. This is something that otherwise would be incredibly costly or impossible to run in the real world—particularly for heavy industry, factories, and supply chains. This demo leverages NVIDIA Metropolis, Omniverse, cuOpt, and Isaac for robot perception to create an end-to-end concept of how to fully automate logistically complex cobot spaces.
Video title: "NVIDIA Isaac Perceptor 3D Surround Vision"
Video description:
NVIDIA Isaac Perceptor, optimized on Jetson Orin, uses multiple cameras for 3D surround perception to detect obstacles like low-lying hazards or overhangs, which are invisible to standard 2D lidar.
Using robust AI-based depth estimation, GPU-accelerated 3D reconstruction, and semantic segmentation, the mobile robot can work more safely alongside humans.
Key Points:
- Isaac Perceptor provides live multi-camera surround visual perception running on Jetson.
- Isaac Perceptor is GPU-accelerated and optimized for Orin, leaving headroom for other software such as a navigation stack.
- Traditional approaches have predominantly used 2D lidars, which offer limited functionality, or 3D lidars, which are prohibitively expensive. Isaac Perceptor offers an affordable, camera-based solution that doesn't compromise on capability, and it also brings visual AI semantics to autonomy.
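To make the 2D-lidar limitation concrete: a planar lidar only detects geometry that crosses its single scan plane, while 3D surround depth perception sees anything inside the robot's swept volume. Here's a toy geometric sketch of that difference (my own illustration, not NVIDIA code; the mounting height and obstacle dimensions are made-up assumptions):

```python
# Toy illustration: why a single-plane 2D lidar misses obstacles that
# full 3D perception catches. Each obstacle is a box given as
# (x_min, x_max, z_min, z_max) in metres along the robot's path.

LIDAR_SCAN_HEIGHT = 0.20  # assumed 2D lidar mounting height

def lidar_2d_sees(obstacle):
    """A planar lidar only detects geometry intersecting its scan plane."""
    _, _, z_min, z_max = obstacle
    return z_min <= LIDAR_SCAN_HEIGHT <= z_max

def depth_3d_sees(obstacle, robot_height=1.5):
    """3D surround depth sees anything inside the robot's swept volume."""
    _, _, z_min, z_max = obstacle
    return z_min < robot_height and z_max > 0.0

obstacles = {
    "pallet (low-lying hazard)":   (1.0, 2.0, 0.00, 0.12),  # below the scan plane
    "forklift tines (overhang)":   (3.0, 3.5, 0.90, 1.10),  # above the scan plane
    "wall":                        (5.0, 5.1, 0.00, 3.00),  # crosses the scan plane
}

for name, box in obstacles.items():
    print(f"{name:28s} 2D lidar: {lidar_2d_sees(box)!s:6s} 3D vision: {depth_3d_sees(box)}")
```

The low pallet and the overhanging tines never intersect the lidar's scan plane, so only the wall shows up for the 2D sensor, while the volume check catches all three.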
Here's a video from March 2022 of NVIDIA using their Omniverse platform and Isaac Sim to train robots in simulation and then applying the results in the real world, titled "Narrowing the Sim2Real Gap with NVIDIA Isaac Sim":
That's all the related videos I could find. Now robots will get more skills and training than you ever will in your lifetime. In the future, companies can finally stop posting entry-level jobs that require a minimum of five years of experience. Instead, they'll train robots in Omniverse in a simulated environment of the workplace and then have the robots do those entry-level jobs.
I'm being facetious since I already made a thread talking about the estimated costs of manufacturing a humanoid robot: https://www.installbaseforum.com/fo...k-per-unit-lets-look-at-the-competition.2505/
More realistically, what I could see happening is a company selling a service where they come to your workplace, scan the environment, put it into Omniverse, and work out what kind of robot you need and what it needs to be trained on. As a business owner, you won't have to know how any of this works since you're paying someone else to provide it. Then you just replace all the fleshy humans with your new AI-powered overlords.