Soft Robotics mGripAI uses simulation to train in NVIDIA Isaac Sim

Soft Robotics grippers can acquire and move items that might be damaged by classic mechanical gripper fingers. | Credit: Soft Robotics

Robots are finally getting a grip. 

Developers have been striving to close the gap on robotic gripping for the past several years, pursuing applications for multibillion-dollar industries. Securely gripping and transferring fast-moving items on conveyor belts holds vast promise for businesses. 

Soft Robotics, a startup based in Bedford, Mass., is harnessing NVIDIA Isaac Sim to help close the sim-to-real gap for a handful of robotic gripping applications. One area is perfecting grasping for the pick and place of foods for packaging.

Food packaging and processing companies are using the startup’s mGripAI system, which combines soft grasping with 3D vision and AI to grasp delicate foods such as proteins, produce, and bakery items without damage.

“We’re selling the hands, the eyes and the brains of the picking solution,” said David Weatherwax, senior director of software engineering at Soft Robotics. 

Unlike other industries that have adopted robotics, the $8 trillion food market has been slow to develop robots that can handle variable items in unstructured environments, according to Soft Robotics.

The company, founded in 2013, recently landed $26 million in Series C funding from Tyson Ventures, Marel and Johnsonville Ventures.

Companies such as Tyson Foods and Johnsonville are betting on the adoption of robotic automation to help improve safety and increase production in their facilities. Both companies rely on Soft Robotics technologies. 

Soft Robotics is a member of the NVIDIA Inception program, which provides companies with GPU support and guidance on AI platforms.

Getting a Grip With Synthetic Data

Soft Robotics develops unique models for every one of its gripping applications, each requiring specific data sets. And picking from piles of wet, slippery chicken and other foods can be a tricky challenge. 

Using Omniverse and Isaac Sim, the company can create 3D renderings of chicken parts against different backgrounds, such as on conveyor belts or in bins, and under different lighting scenarios.

The company taps Isaac Replicator to generate synthetic data, producing hundreds of thousands of images per model and distributing that workload across an array of cloud instances. Isaac Replicator is a set of tools, APIs, and workflows for generating synthetic data with Isaac Sim.
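
The article doesn't describe the company's scripts, but a domain-randomization pass of this kind can be sketched with the omni.replicator.core Python API inside Isaac Sim. In the sketch below, the asset path, pile size, stage units, and output directory are all placeholder assumptions, not details from Soft Robotics.

```python
# Hedged sketch of a Replicator-style randomization script; paths, counts and
# units are placeholders, not Soft Robotics' actual pipeline.
import omni.replicator.core as rep

PART_USD = "/path/to/chicken_part.usd"  # placeholder asset

with rep.new_layer():
    # Conveyor-belt stand-in and a camera looking down at the pick area.
    conveyor = rep.create.plane(scale=(2, 1, 1))
    camera = rep.create.camera(position=(0, 0, 150), look_at=conveyor)
    render_product = rep.create.render_product(camera, (1024, 1024))

    def scatter_parts():
        # Instantiate a pile of parts from the USD asset and randomize pose.
        instances = rep.randomizer.instantiate([PART_USD], size=20)
        with instances:
            rep.modify.semantics([("class", "chicken")])
            rep.modify.pose(
                position=rep.distribution.uniform((-50, -25, 5), (50, 25, 30)),
                rotation=rep.distribution.uniform((0, 0, 0), (360, 360, 360)),
            )
        return instances.node

    def randomize_lighting():
        # Vary light direction and intensity so glare and shadows differ per frame.
        light = rep.create.light(
            light_type="distant",
            intensity=rep.distribution.uniform(500, 3000),
            rotation=rep.distribution.uniform((0, -180, -180), (0, 180, 180)),
        )
        return light.node

    rep.randomizer.register(scatter_parts)
    rep.randomizer.register(randomize_lighting)

    # Write RGB frames plus labels that can feed model training.
    writer = rep.WriterRegistry.get("BasicWriter")
    writer.initialize(output_dir="_out_synthetic", rgb=True,
                      bounding_box_2d_tight=True, instance_segmentation=True)
    writer.attach([render_product])

    with rep.trigger.on_frame(num_frames=1000):
        rep.randomizer.scatter_parts()
        rep.randomizer.randomize_lighting()
```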

It also runs pose estimation models to help its gripping system determine the orientation of each item it picks.
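
The article doesn't detail the pose model, but as a rough illustration of the idea, the in-plane pick angle of a detected item can be recovered from its segmentation mask with a principal-axis fit. The snippet below is a generic, hypothetical sketch of that step, not Soft Robotics' pose-estimation pipeline.

```python
# Illustrative only: estimate a planar pick angle from a single-item mask.
import numpy as np

def pick_angle_from_mask(mask: np.ndarray) -> float:
    """Return the in-plane orientation (degrees, 0-180) of the item's major axis."""
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs, ys], axis=1).astype(np.float64)
    pts -= pts.mean(axis=0)                 # center the pixel cloud
    cov = np.cov(pts, rowvar=False)         # 2x2 covariance of pixel coordinates
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    major = eigvecs[:, np.argmax(eigvals)]  # direction of greatest spread
    return float(np.degrees(np.arctan2(major[1], major[0])) % 180.0)

# Example: a synthetic elongated blob lying roughly along the image's x-axis.
mask = np.zeros((100, 100), dtype=bool)
mask[40:60, 20:80] = True
print(pick_angle_from_mask(mask))           # ~0 degrees
```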

On-site NVIDIA A100 GPUs enable Soft Robotics to run split-second inference with the application-specific models deployed in these food-processing facilities. Meanwhile, simulation and training in Isaac Sim give the company access to A100s for scaling up those workloads.
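
As a loose illustration of what split-second inference involves, the sketch below times one mixed-precision forward pass on a GPU with PyTorch; the ResNet-50 backbone, input size, and timing harness are placeholders rather than the company's deployed models or runtime.

```python
# Generic low-latency GPU inference sketch; the model is a stand-in, not
# one of Soft Robotics' application-specific picking models.
import time
import torch
import torchvision

device = torch.device("cuda")                         # assumes an NVIDIA GPU
model = torchvision.models.resnet50(weights=None).eval().to(device)
frame = torch.rand(1, 3, 1024, 1024, device=device)   # stand-in camera frame

with torch.inference_mode(), torch.autocast("cuda", dtype=torch.float16):
    model(frame)                                       # warm-up pass
    torch.cuda.synchronize()
    start = time.perf_counter()
    model(frame)
    torch.cuda.synchronize()                           # wait for the GPU to finish
    print(f"latency: {(time.perf_counter() - start) * 1000:.1f} ms")
```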

“Our current setup is fully synthetic, which allows us to rapidly deploy new applications. We’re all in on Omniverse and Isaac Sim, and that’s been working great for us,” said Weatherwax. 

Solving Issues With Occlusion, Lighting 

A big challenge for Soft Robotics is occlusion: understanding how different pieces of chicken stack up and overlap one another when dumped into a pile. "How those form can be pretty complex," Weatherwax said.

Glare on wet chicken can throw off detection models. "A key thing for us is the lighting, so the NVIDIA RTX-driven ray tracing is really important," he said.

Glare on wet chicken is a classic lighting and vision problem that requires a new approach to training machine learning vision models. | Credit: Soft Robotics

But where it gets really interesting is modeling it all in 3D and figuring out, in a split second, which item in a pile is least obstructed and most accessible for a robot gripper to pick and place.

Omniverse enables Soft Robotics to build such environments as physics-accurate synthetic data sets. "One of the big challenges we have is how all these amorphous objects form into a pile," Weatherwax said.
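
Purely as an illustration of that decision, one simple way to rank candidates is to score each item by the fraction of its full silhouette that remains visible in the scene. The helper below assumes hypothetical per-item amodal masks (the silhouette each item would have if nothing covered it) and a rendered instance map; it is not Soft Robotics' actual selection logic.

```python
# Hypothetical occlusion scoring over a pile of detected items.
import numpy as np

def least_occluded(amodal_masks: dict[int, np.ndarray],
                   instance_map: np.ndarray) -> int:
    """Return the id of the item whose visible fraction is highest."""
    best_id, best_score = -1, -1.0
    for item_id, full_mask in amodal_masks.items():
        full_area = full_mask.sum()
        if full_area == 0:
            continue
        visible_area = np.logical_and(full_mask, instance_map == item_id).sum()
        score = visible_area / full_area      # 1.0 means fully unobstructed
        if score > best_score:
            best_id, best_score = item_id, score
    return best_id

# Example: item 2 is drawn on top and partially covers item 1.
full = {1: np.zeros((10, 10), bool), 2: np.zeros((10, 10), bool)}
full[1][2:8, 2:8] = True
full[2][4:10, 4:10] = True
visible = np.zeros((10, 10), int)
visible[full[1]] = 1
visible[full[2]] = 2
print(least_occluded(full, visible))          # -> 2 (fully visible)
```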

Boosting Production Line Pick Accuracy

Production lines in food processing plants can move fast, and robots deployed with application-specific models promise to handle as many as 100 picks per minute, or roughly one pick every 600 milliseconds.

That capability is still a work in progress; success hinges on accurate representations of piles of items, supported by training data sets that cover the many ways items can fall into a pile.

The objective is to give the robot the best available pick in a complex and dynamic environment. If food items fall off the conveyor belt or are otherwise damaged, they are counted as waste, which directly reduces yield.

Driving Production Gains 

Meat-packing companies rely on lines of people to process chicken, but, like so many other industries, they have faced labor shortages. Some companies building new food-processing plants can’t even attract enough workers at launch, said Weatherwax.

“They are having a lot of staffing challenges, so there’s a push to automate,” he said.

The Omniverse-driven work for food processing companies has delivered a more than 10X increase in Soft Robotics’ simulation capacity, accelerating deployment times for AI picking systems from months to days.

And that’s enabling Soft Robotics customers to get a grip on more than just deploying automated chicken-picking lines: it also helps them weather a labor challenge that has hit many industries, especially those with elevated injury and health risks.

“Handling raw chicken is a job better suited for a robot,” he said.

Download Isaac Sim here to use the Replicator features.

