NVIDIA Jetson supports Zipline drone deliveries, as Omniverse enables Amazon digital twins


NVIDIA Jetson Xavier NX processes sensor inputs for the P1 delivery drone. Source: Zipline

Robotics, simulation, and artificial intelligence are providing new capabilities for supply chain automation. For example, drone deliveries by Zipline International Inc. and Amazon Robotics' digital twins for package handling demonstrate how NVIDIA Corp. technologies can enable industrial applications.

“You can pick the right place for your algorithms to run to make sure you’re getting the most out of the hardware and the power that you are putting into the system,” said A.J. Frantz, navigation lead at Zipline, in a case study.

NVIDIA claimed that its Jetson Orin modules can perform up to 275 trillion operations per second (TOPS) to provide mission-critical computing for autonomous systems in everything from delivery services and agriculture to mining and undersea exploration. The Santa Clara, Calif.-based company added that Jetson’s energy efficiency can help businesses electrify their vehicles and reduce carbon emissions to meet sustainability goals.

Zipline drones rely on Jetson Xavier NX to avoid obstacles

Founded in 2011, Zipline said it has completed more than 800,000 deliveries of food, medication, and more in seven countries. The San Francisco-based company said its drones have flown over 55 million miles using the NVIDIA Jetson edge AI platform for autonomous navigation and landings.

Zipline, which raised $330 million in April at a valuation of $4.2 billion, is a member of the NVIDIA Inception program, through which startups can get technology support. The company's Platform One, or P1, drone uses a Jetson Xavier NX system-on-module (SOM) to process sensor inputs.

“The NVIDIA Jetson module in the wing is part of what delivers our acoustic detection and avoidance system, so it allows us to listen for other aircraft in the airspace around us and plot trajectories that avoid any conflict,” Frantz explained.

Zipline's fixed-wing drones can fly more than 55 miles (88.5 km) from a distribution center at 70 mph (112.6 kph) and then return. Capable of hauling up to 4 lb. (1.8 kg) of cargo, they fly autonomously and release packages at their destinations by parachute.

P2 hybrid drone includes Jetson Orin NX for sensor fusion, safety

Zipline's Platform Two, or P2, hybrid drone can fly fast on fixed wings as well as hover. It can carry 8 lb. (3.6 kg) of cargo for 10 miles (16 km), and it includes a droid that can be lowered on a tether to precisely place deliveries. It is intended for use in dense urban environments.

The P2 uses two Jetson Orin NX modules. One handles sensor fusion to understand the drone's environment; the other is in the droid for redundancy and safety.

Zipline claimed that its drones, nicknamed "Zips," can deliver items seven times faster than ground vehicles. It boasted that it completes one delivery every 70 seconds globally.

“Our aircraft fly at 70 miles per hour, as the crow flies, so no traffic, no waiting at lights — we’re talking minutes here in terms of delivery times,” said Joseph Mardall, head of engineering at Zipline. “Single-digit minutes are common for deliveries, so it’s faster than any alternative.”

In addition to transporting pizza, vitamins, and medications, Zipline works with Walmart, restaurant chain Sweetgreen, Michigan Medicine, MultiCare Health Systems, Intermountain Health, and the government of Rwanda, among others. It delivers to more than 4,000 hospitals and health centers.

Amazon uses Omniverse, Adobe Substance 3D for realistic packages

For warehouse robots to be able to handle a wide range of packages, they need to be trained on massive but realistic data sets, according to Amazon Robotics.

“The increasing importance of AI and synthetic data to run simulation models comes with new challenges,” noted Adobe Inc. in a blog post. “One of these challenges is the creation of massive amounts of 3D assets to train AI perception programs in large-scale, real-time simulations.”

Amazon Robotics turned to Adobe Substance 3D, Universal Scene Description (USD), and NVIDIA Omniverse to develop random but realistic 3D environments and thousands of digital twins of packages for training AI models.

NVIDIA Omniverse integrates with Adobe Substance 3D to generate realistic package models for training robots. Source: Adobe

NVIDIA Omniverse allows simulations to be modified, shared

“The Virtual Systems Team collaborates on a wide range of projects, encompassing both extensive solution-level simulations and individual workstation emulators as part of larger solutions,” explained Hunter Liu, technical artist at Amazon Robotics.

“To describe the 3D worlds required for these simulations, the team utilizes USD,” he said. “One of the team’s primary focuses lies in generating synthetic data for training machine learning models used in intelligent robotic perception programs.”

The team uses Houdini for procedural mesh generation and Substance 3D Designer for texture generation and loading virtual boxes into Omniverse, added Haining Cao, a texturing artist at Amazon Robotics.

The team has developed multiple workflows to represent the vast variety of packages that Amazon handles. It has increased its output from two to 300 assets per hour, said Liu.

“To introduce further variations, we utilize PDG (Procedural Dependency Graph) within Houdini,” he noted. “PDG enables us to efficiently batch process multiple variations, transforming the Illustrator files into distinct meshes and textures.”

After generating the synthetic data and publishing the results to Omniverse, the Adobe-NVIDIA integration enables Amazon's team to change parameters to, for example, simulate worn cardboard. The team can also use Python to trigger randomized values and collaborate on the data within Omniverse, said Liu.
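The randomization idea described above can be sketched in plain Python, independent of Omniverse. The class, parameter names, and value ranges below are illustrative assumptions for this article, not Amazon's actual pipeline or the Omniverse API:

```python
import random
from dataclasses import dataclass

@dataclass
class PackageParams:
    """Hypothetical parameters for one synthetic package variant."""
    width_cm: float
    height_cm: float
    depth_cm: float
    wear: float            # 0.0 = pristine cardboard, 1.0 = heavily worn
    label_rotation_deg: float

def randomize_package(rng: random.Random) -> PackageParams:
    """Sample one randomized package for a synthetic training data set."""
    return PackageParams(
        width_cm=rng.uniform(10.0, 60.0),
        height_cm=rng.uniform(5.0, 40.0),
        depth_cm=rng.uniform(10.0, 60.0),
        wear=rng.uniform(0.0, 1.0),
        label_rotation_deg=rng.uniform(-15.0, 15.0),
    )

# A fixed seed makes each batch of variants reproducible across runs.
rng = random.Random(42)
batch = [randomize_package(rng) for _ in range(1000)]
```

In a real workflow, each sampled parameter set would drive a material or mesh variation (worn textures, box dimensions, label placement) rendered in the simulator rather than a plain data class.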

In addition, Substance 3D includes features for creating “intricate and detailed textures while maintaining flexibility, efficiency, and compatibility with other software tools,” he said. Simulation-specific extensions bundled with NVIDIA Isaac Sim allow for further generation of synthetic data and live simulations using robotic manipulators, lidar, and other sensors, Liu added.
