Collaborative grocery storage. A single set of Helix neural network weights runs simultaneously on two robots as they work together to put away groceries neither robot has ever seen before. Image & Video: Figure

Technology USA | 16 March 2025

Milestone: Humanoid Robot Understands What We Say and Takes Action!

Figure, an artificial-intelligence development company based in California, United States, has created a robot capable of understanding voice commands and translating them into action, opening exciting new possibilities for deploying humanoids in unstructured environments.

Helix is a first-of-its-kind “System 1, System 2” Vision-Language-Action (VLA) model for high-rate, dexterous control of the entire humanoid upper body: System 2 (S2) “thinks slow” about high-level goals, while System 1 (S1) “thinks fast” to execute and adjust actions in real time.
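To make the division of labor concrete, here is a minimal, hypothetical sketch of such a two-system control loop in Python: a slow vision-language component refreshes a compact latent goal at a few hertz, while a fast policy reads the latest latent at a much higher rate and emits joint commands. The class names, loop rates, latent dimension, and dummy encoders are illustrative assumptions, not details of Figure's actual implementation.

```python
import threading
import time
import numpy as np

# Illustrative constants; the real Helix rates and dimensions are not
# specified in this article.
LATENT_DIM = 64
S2_HZ = 8     # "think slow": high-level vision-language reasoning
S1_HZ = 200   # "think fast": real-time reactive visuomotor control


class SlowSystem:
    """S2 stand-in: turns an image and a command into a latent goal vector."""

    def encode(self, image: np.ndarray, command: str) -> np.ndarray:
        # Placeholder for VLM inference: hash the command into a
        # pseudo-embedding and mix in a coarse image statistic.
        rng = np.random.default_rng(abs(hash(command)) % (2**32))
        return rng.standard_normal(LATENT_DIM) + image.mean()


class FastSystem:
    """S1 stand-in: maps the latest latent goal and joint state to targets."""

    def act(self, latent: np.ndarray, joint_state: np.ndarray) -> np.ndarray:
        # Placeholder policy: nudge joints toward a target derived from the latent.
        target = np.tanh(latent[: joint_state.size])
        return joint_state + 0.05 * (target - joint_state)


def run_demo(duration_s: float = 1.0) -> None:
    s2, s1 = SlowSystem(), FastSystem()
    latent = np.zeros(LATENT_DIM)
    joints = np.zeros(20)  # hypothetical upper-body joint vector
    lock = threading.Lock()

    def s2_loop() -> None:
        nonlocal latent
        end = time.time() + duration_s
        while time.time() < end:
            image = np.random.rand(32, 32)  # stand-in camera frame
            new_latent = s2.encode(image, "put away the groceries")
            with lock:
                latent = new_latent
            time.sleep(1.0 / S2_HZ)

    def s1_loop() -> None:
        nonlocal joints
        end = time.time() + duration_s
        while time.time() < end:
            with lock:
                goal = latent.copy()
            joints = s1.act(goal, joints)
            time.sleep(1.0 / S1_HZ)

    threads = [threading.Thread(target=s2_loop), threading.Thread(target=s1_loop)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print("final joint command:", np.round(joints[:5], 3))


if __name__ == "__main__":
    run_demo()
```

The point of the sketch is the decoupling: the fast loop never waits on the slow one, it simply reads whichever latent goal was last published, which is what allows deliberate reasoning and real-time reactive control to coexist.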

Helix is the first VLA model to offer full upper-body humanoid control, operate across multiple robots at once, pick up previously unseen objects, run from a single set of neural network weights, and be ready for commercial deployment. It unifies perception, language understanding, and learned control to overcome several longstanding challenges in robotics. The company directly translated the rich semantic knowledge captured in vision-language models into generalizable robot control. As a result, the model can drive an entire humanoid upper body from natural language and produce long-horizon, collaborative, dexterous manipulation without task-specific demonstrations or extensive manual programming.

Source: Figure
