March 24, 2026 Volume 22 Issue 12
 

Designfax weekly eMagazine


UR AI Trainer for robotics: First lab-to-factory AI model trainer

Universal Robots (UR) unveiled the UR AI Trainer at GTC 2026 in Silicon Valley (March 16-19, 2026). Developed in collaboration with Scale AI, the AI Trainer marks a tectonic shift as robots move from pre-programmed applications to fully AI-driven tasks. These systems are powered by robust data generated in AI training cells where robots imitate humans. 

The AI Trainer allows human operators to guide UR robots through tasks in a leader-follower setup while automatically capturing high-quality multimodal data for robotics AI development. [Credit: Image courtesy of Universal Robots]

"Our customers, ranging from large enterprises to AI research labs, are no longer just asking for AI features," said Anders Beck, VP of AI Robotics Products at Universal Robots. "They need a way to collect high-fidelity, synchronized robot and vision data to train AI models on the same robots they intend to deploy. Our AI Trainer is the industry's first direct lab-to-factory solution for AI model training." 

Alongside the new AI Trainer, Universal Robots' GTC booth showcased a state-of-the-art robotic foundation model from Generalist AI, a UR preferred model partner. Leveraging this model, two UR robots completed a complex smartphone packaging task that would not have been possible without recent advances in Physical AI.

Enabling AI-ready data capture with force feedback and direct torque control
AI robotics training is often hindered by fragmented hardware and low-fidelity data capture. Much of today's training data is collected on research robots not suited for production environments, and many systems rely only on visual feedback, making delicate or contact-rich tasks difficult.

UR partner Generalist AI demonstrated how advances in data collection and AI models translate into real-world robotic performance, with two UR7e robots autonomously executing a complex smartphone packaging task using embodied foundation models. [Credit: Image courtesy of Universal Robots]

"The AI Trainer directly addresses these barriers," said Beck. "By utilizing our unique Direct Torque Control and force feedback features, we give developers direct influence over how the robot physically interacts with the world, training on the same robust hardware used in over 100,000 industrial deployments."

Scale AI partnership enables a flywheel of integrated robotics data
With the AI Trainer, operators physically guide a "leader" robot through a task while a synchronized "follower" robot mirrors the motion in real time. During each demonstration, the system records synchronized motion, force, and visual data, producing the structured datasets required to train Vision-Language-Action (VLA) models.
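The capture loop described above can be sketched in a few lines. This is a minimal illustration of recording synchronized multimodal demonstration samples, not UR's or Scale AI's actual software; the `Sample` fields, the `record_episode` helper, and the dummy state source are all assumptions for the sketch.

```python
import json
from dataclasses import dataclass, asdict

# One synchronized sample from a demonstration episode.
# Field names are illustrative, not UR's actual data schema.
@dataclass
class Sample:
    t: float            # timestamp within the episode (s)
    joint_pos: list     # leader-robot joint positions (rad)
    wrench: list        # force/torque at the tool flange (N, N*m)
    image_ref: str      # reference to the camera frame stored alongside

def record_episode(read_state, n_samples=5, period=0.02):
    """Poll a state source at a fixed rate and build a structured episode.

    read_state() stands in for whatever driver reports the leader robot's
    joints, wrench, and camera frame in lockstep (the synchronization the
    article describes).
    """
    episode = []
    for i in range(n_samples):
        joints, wrench, image_ref = read_state()
        episode.append(Sample(t=i * period, joint_pos=joints,
                              wrench=wrench, image_ref=image_ref))
    return episode

# Dummy state source so the sketch runs without any hardware.
def fake_state():
    return [0.0] * 6, [0.0] * 6, "frame_000.png"

episode = record_episode(fake_state)
print(json.dumps([asdict(s) for s in episode[:1]]))
```

Each episode is then a uniform, timestamped record that a VLA training pipeline can consume, pairing what the robot saw with how it moved and what forces it felt.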

Deployed on UR's AI Accelerator platform, the UR AI Trainer combines UR robots with Scale AI software to enable data capture on UR robots in production and at scale, creating a continuous feedback loop that drives ongoing optimization of physical AI systems.

"Universal Robots is a leader in industrial robotics, and its global footprint offers the ideal foundation for data capture and AI deployment," said Ben Levin, General Manager, Physical AI at Scale AI. "Together, we've created an integrated robotics data flywheel, allowing customers to train, deploy, and improve their AI models faster than ever before."

As part of this collaboration, UR and Scale AI will release a large-scale industrial dataset collected on UR robots later this year.

First-hand encounters with AI Trainer at GTC
With GTC as the official launch pad, attendees experienced the system first-hand at UR's booth, guiding two UR3e "leader" robots that provided haptic input to control two UR7e "follower" robots. The setup let visitors perform an advanced smartphone packaging task with haptic feedback for imitation learning and VLA training, with demonstration data recorded in real time on Scale's stack and replayable directly on the AI Trainer.

The process of capturing robot training data for AI models was further showcased through a demo of the same smartphone packaging task, this time trained virtually. Built in NVIDIA Omniverse and leveraging Isaac Sim, the physics-accurate simulation let attendees control a virtual bi-manual UR3e system with real-time haptic feedback, using two Haply Inverse3 devices as "leaders."

Universal Robots is also exploring the use of the NVIDIA Physical AI Data Factory Blueprint to automate and scale its synthetic data generation, transforming world-scale compute into a production engine for high-quality robotic training data.

Attendees got to experience robots being trained on-site and virtually. [Credit: Image courtesy of Universal Robots]

"The shift toward Physical AI requires a fundamental move from rigid, pre-programmed automation to generalist robots that can perceive, reason, and learn through human-like interaction," said Amit Goel, head of robotics and edge AI ecosystem at NVIDIA. "By leveraging the NVIDIA Isaac simulation frameworks, Universal Robots is building a scalable engine for high-fidelity data capture and generation, providing the essential infrastructure to train the next generation of autonomous systems at scale."

Generalist AI demonstrates real-world robotic foundation model performance
Complementing the two data-capture demonstrations, Generalist's showcase highlighted how advances in data collection and AI models translate into real-world robotic performance. In the first public demonstration of Generalist's embodied foundation models, two UR7e robots autonomously executed a complex smartphone packaging task, demonstrating dexterity, coordination, and contact-rich manipulation in a real-world environment. The demonstration showed how scaled, high-quality training data combined with frontier model architectures can enable robust physical AI systems beyond the lab. 

"Generalist is building embodied foundation models that deliver industry-leading dexterity and reliability," said Pete Florence, co-founder and CEO of Generalist AI. "This demonstration on Universal Robots' trusted industrial platform shows how physical common sense can be translated into real-world capability, paving the way for deployment across industries at scale."

Source: Universal Robots

Published March 2026


Copyright © 2026 by Nelson Publishing, Inc. All rights reserved. Reproduction Prohibited.