Embodied AI Robot Large Model (Including VLA) Research Report, 2026
Research on Robot Large Models: World Models Are About to Become Standard, and OEMs Enter and Accelerate Mass Production and Application
ResearchInChina has released the Embodied AI Robot Large Model (Including VLA) Research Report, 2026, which focuses on the research, analysis, and summary of the following content:
The basic concepts, industrial ecosystem map, multi-dimensional classification (application scope, capability modality, architecture), industry development drivers, key technology development directions, and commercialization modes of Embodied AI robot large models;
The layout planning, team building, core talents, large model products and their applications, detailed introduction and implementation status of Embodied AI robot large model products, Embodied AI ecosystem partners, and recent key dynamics of 11 tech giants in the Embodied AI robot field, including Alibaba Group, NVIDIA, Google DeepMind, OpenAI, Microsoft, Huawei, Tencent RoboticsX, Baidu, ByteDance, iFlytek, and SenseTime;
The profile, development history and planning, robot products and large model installation, detailed introduction of self-developed large models, large model ecosystem cooperation, and recent key dynamics of 10 well-known robot enterprises, including UBTECH Robotics, Unitree Robotics, AgiBot, Leju Robotics, Galbot, RobotEra, FigureAI, Sanctuary AI, 1X Technologies, and Neura Robotics;
The layout planning, team building, core talents, robot products and large model installation, summary of large model products, detailed introduction of Embodied AI robot large model products, Embodied AI ecosystem partners, and recent key dynamics of 11 OEMs in the Embodied AI robot field, including Tesla, Toyota, Honda, Hyundai, Xiaomi, XPeng, GAC Group, Chery, Leapmotor, BYD, and Dongfeng Motor. In addition, this report summarizes the layout of 13 other global OEMs in the Embodied AI robot field.
Compared with traditional robot control algorithms, Embodied AI robot large models ("robot large models" for short) can make end-to-end or hierarchical decisions without precise modeling and can operate in unstructured, open environments (homes, outdoors, cluttered desktops). Compared with general large models, robot large models focus more on the fusion and understanding of multi-modal information (vision, lidar, touch, text, etc.), aiming to complete closed-loop actions in the physical world and output motion commands such as joint angles, speeds, and grasping forces.
In recent years, the Embodied AI robot large model field has shown the following development trends:
1. Embodied AI Players Have Begun to Apply World Models
Currently, robot large models represented by Vision-Language-Action (VLA) models have made significant progress in the "perception-decision-execution" closed loop, enabling robots to understand instructions and generate actions. However, such models still face bottlenecks in coping with the high diversity and uncertainty of the physical world. In essence, they "imitate" patterns in training data, lacking foresight of action consequences and an understanding of physical logic.
The introduction of world models is intended precisely to break this limitation. The core of a world model is to give robots the ability to "imagine the future". Trained on multi-modal data, it constructs an internal dynamic representation of the physical environment and can predict state changes over multiple future steps from the current state and planned actions. Robots can thus transform from passive instruction followers into active decision-makers capable of "mental deduction". For example, when performing a "pouring water" task, a robot equipped with a world model can not only identify cups and kettles but also predict the water flow trajectory, the cup's tilt angle, and possible spills before acting, thereby planning a safer and more accurate action sequence.
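The predict-then-plan idea described above can be sketched in a few lines. The toy Python below is purely illustrative — the class names, dimensions, and linear dynamics are all assumptions, not any vendor's architecture: a world model steps a state forward under planned actions, and a planner picks the action sequence whose imagined trajectory has the lowest cost (e.g. predicted spillage in the pouring-water example).

```python
import numpy as np

class ToyWorldModel:
    """Illustrative latent dynamics model: predicts the next state from
    the current state and a planned action. A real world model would be
    a learned neural network over multi-modal observations; here a fixed
    random linear map stands in for the learned dynamics."""
    def __init__(self, state_dim, action_dim, seed=0):
        rng = np.random.default_rng(seed)
        self.A = rng.normal(scale=0.1, size=(state_dim, state_dim))
        self.B = rng.normal(scale=0.1, size=(state_dim, action_dim))

    def step(self, state, action):
        return state + self.A @ state + self.B @ action

    def rollout(self, state, actions):
        """Imagine the state trajectory for a planned action sequence."""
        traj = []
        for a in actions:
            state = self.step(state, a)
            traj.append(state)
        return traj

def plan(model, state, candidate_plans, cost_fn):
    """Pick the action sequence whose imagined outcome has lowest cost,
    i.e. 'mental deduction' before any real-world action is taken."""
    return min(candidate_plans,
               key=lambda acts: cost_fn(model.rollout(state, acts)))
```

The same rollout machinery doubles as the "simulation engine" mentioned below: sampling many imagined trajectories yields synthetic training scenarios without real robot data.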
Driving forces for the application of world models mainly come from three aspects:
Solving the data bottleneck: Collecting high-quality real robot data is extremely costly and limited in scale, and has become a core constraint on capability upgrades. World models can serve as powerful "data generators" and "simulation engines", generating massive, controllable, high-fidelity synthetic training scenarios and greatly reducing reliance on expensive real robot data.
Improving decision and generalization capabilities: Through prediction and deduction, world models enable robots to have a certain degree of causal reasoning and physical intuition, capable of handling new scenarios and new objects not seen in training, and achieving "learning by analogy".
Realizing the collaborative evolution of "cerebrum" and "cerebellum": The industry consensus is that future robots’ intelligence will be the result of collaborative evolution of the "cerebrum" (high-level cognition and planning) and the "cerebellum" (low-level motion control). As a key component of the high-level "cerebrum", the world model forms a complementary relationship with execution-oriented models such as VLA, jointly constituting a complete intelligent system.
Many enterprises have developed their own world models, such as Alibaba's WorldVLA, NVIDIA's WAM, Tencent's Hunyuan 3D World Model, Unitree Robotics' UnifoLM-WMA-0, and AgiBot's GE-1. Among them, Unitree Robotics' UnifoLM-WMA-0, released and open-sourced around September 2025, is designed for general robot learning and has been adapted to the company's humanoid and quadruped robots, with two modes: decision and simulation. The decision mode can predict future physical interactions (such as stacking stability and collision risks), correct actions, and improve robustness on complex tasks. The simulation mode can generate high-fidelity synthetic data to relieve the scarcity of real robot training data.
AgiBot's GE-1, a video-generative world model for robot control, was released in August 2025. With a closed-loop architecture of "video generation + policy learning + simulation evaluation", it realizes end-to-end reasoning from "seeing" to "thinking" to "acting". GE-1 works in concert with AgiBot's GO-1 series base models: GO-1 handles general task planning and common-sense knowledge, while GE-1 specializes in spatiotemporal prediction and action rehearsal, improving the task success rate and stability of the G2 robot in complex scenarios.
GE-1 was officially deployed on the industrial-grade interactive embodied operation robot G2 in October 2025, and AgiBot announced that it had won an order worth hundreds of millions of yuan from Longcheer Technology. The robot has performed tasks such as "making sandwiches", "pouring tea", and "wiping the desktop".
2. Robot Large Models Achieve Cross-Platform Applications
In the traditional robot development mode, each robot's software and algorithms must be specially developed and optimized for its unique hardware configuration (sensors, actuators, form factor), leading to high R&D costs, long cycles, and non-reusable capabilities. Cross-platform robot large models overcome this drawback: by building a powerful end-to-end multi-modal foundation model, they implant transferable general intelligence into robots, enabling capabilities to generalize and deploy rapidly across different embodiments (such as humanoid, quadruped, robotic arm), different tasks, and different environments.
Since 2025, robot large models such as NVIDIA's GR00T series, Google DeepMind's Gemini Robotics, Microsoft's Rho-alpha, Huawei's CloudRobo, and RobotEra's ERA-42 have all supported cross-platform development and cross-scenario applications.
In Q3 2025, NVIDIA released the GR00T N1.6 model, positioned as a general-purpose humanoid robot VLA large model. Through a unified multi-modal interface, a modular adaptation layer, a simulation-reality collaborative pipeline, and a hierarchical deployment architecture, it realizes cross-platform application in the mode of "one training, multi-machine adaptation". It supports humanoid dual-arm and mobile robotic arms, warehouse AGVs, medical assistive robots, scientific research robots, etc. It can execute tasks on new objects and in new scenarios without large amounts of data, and can be flexibly adapted to application scenarios such as industrial manufacturing, logistics and warehousing, household and commercial services, medical care and health, and scientific research and development.
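The "one training, multi-machine adaptation" pattern can be illustrated as a shared backbone plus small per-embodiment heads. This is a hedged sketch under assumed names and dimensions, not NVIDIA's actual implementation: random projections stand in for the pretrained foundation model, and only the small per-robot head changes when the model moves to new hardware.

```python
import numpy as np

class SharedBackbone:
    """Embodiment-agnostic 'general intelligence': maps observations to
    a common latent representation. A fixed random projection stands in
    for the pretrained multi-modal foundation model (trained once)."""
    def __init__(self, obs_dim, latent_dim=16, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(size=(latent_dim, obs_dim)) / np.sqrt(obs_dim)

    def encode(self, obs):
        return np.tanh(self.W @ obs)

class EmbodimentHead:
    """Per-robot adaptation layer: decodes the shared latent into this
    platform's action space (e.g. 7-DoF arm vs. 12-DoF quadruped).
    Only this small head is retrained per platform."""
    def __init__(self, latent_dim, action_dim, seed=1):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(size=(action_dim, latent_dim)) / np.sqrt(latent_dim)

    def act(self, latent):
        return self.W @ latent

backbone = SharedBackbone(obs_dim=32)
heads = {"humanoid_arm": EmbodimentHead(16, 7),
         "quadruped": EmbodimentHead(16, 12)}

obs = np.ones(32)
latent = backbone.encode(obs)                  # shared, trained once
arm_cmd = heads["humanoid_arm"].act(latent)    # 7-dim arm command
leg_cmd = heads["quadruped"].act(latent)       # 12-dim leg command
```

The design choice this sketch captures is that generalization lives in the backbone, while hardware specificity is confined to a thin, cheaply retrainable layer.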
RobotEra's end-to-end VLA embodied large model ERA-42 was released in December 2024 and initially adapted to its dexterous hand XHAND1. In mid-2025, the model was successively applied across platforms to the wheeled service robot Q5 and the bipedal humanoid robot L7, enabling rapid adaptation to new tasks without pre-programming.
3. An Increasing Number of Robot Large Models Are Open-Sourced
Open-sourcing large models is not simple technical sharing. Open-source models gather the wisdom of global developers and can quickly overcome complex "long-tail problems" in the physical world. At the same time, open-sourcing breaks the traditional closed-source business model, allowing small and medium-sized enterprises to develop quickly on top of open-source models, focus their resources on hardware innovation and scenario implementation, and form an industrial pattern of "giants build the platform, and hundreds of enterprises perform on it".
The core of open-sourcing is to lower the R&D threshold, accelerate technological iteration, build ecosystem barriers, promote large-scale implementation, and form a positive flywheel of "open source — ecosystem — data — more powerful models".
Xiaomi's VLA large model for Embodied AI robots, Xiaomi-Robotics-0, was officially open-sourced on February 12, 2026 under the Apache License 2.0 (which permits commercial use, modification, and distribution without copyleft "contagion"), with the complete code, pre-trained weights, technical documents, papers, deployment solutions, etc. released in full. Xiaomi-Robotics-0 reuses Xiaomi's autonomous driving perception and decision technology, realizing technology interoperability between robots and automobiles. It adopts a Mixture of Experts (MoE) architecture that separates the "cerebrum" (vision-language understanding) from the "cerebellum" (action execution). This design mitigates the inference latency problem of traditional VLA models, making it better suited to consumer robot products that require real-time response.
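The cerebrum/cerebellum split and its latency benefit can be sketched as a hierarchical fast-slow control loop. Everything here is illustrative — the function names, tick counts, and stub planner are assumptions, not Xiaomi's design: an expensive planner replans only every few ticks, while a lightweight controller issues a command on every tick using the latest cached plan, so motor output never waits on slow vision-language inference.

```python
def slow_cerebrum(goal, observation):
    """Vision-language 'cerebrum': expensive, runs at low frequency,
    outputs a subgoal. A stub stands in for a large VLM."""
    return {"subgoal": f"reach:{goal}", "obs_snapshot": observation}

def fast_cerebellum(plan, observation):
    """Lightweight 'cerebellum': runs every control tick, turning the
    cached plan plus a fresh observation into a motor command."""
    return {"cmd": plan["subgoal"], "tick_obs": observation}

def control_loop(goal, observations, slow_period=5):
    """Hierarchical fast-slow loop: the cerebrum replans every
    `slow_period` ticks; the cerebellum acts on every tick."""
    plan, commands = None, []
    for tick, obs in enumerate(observations):
        if tick % slow_period == 0:
            plan = slow_cerebrum(goal, obs)           # slow path
        commands.append(fast_cerebellum(plan, obs))   # fast path
    return commands

# 12 control ticks; the cerebrum runs only at ticks 0, 5, and 10.
cmds = control_loop("cup", observations=list(range(12)), slow_period=5)
```

In a real system the two halves would run in separate processes at different rates; the interleaved loop above just makes the frequency asymmetry explicit.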
4. OEMs Enter the Market to Solve the Scarcity of Real Data for Embodied AI Robot Large Models and Provide Field Verification Scenarios
The entry of multiple OEMs into the Embodied AI and humanoid robot track brings massive industrial scenario data, automotive-grade sensor data, and a mature autonomous driving technology stack to Embodied AI large models (VLA, world models, etc.). Algorithms such as BEV perception, multi-modal fusion, and end-to-end decision-making can be migrated directly to robots to train and improve models' environmental understanding, task planning, and motion control capabilities. OEMs' production line scenarios can verify the reliability and success rate of robot large models, expose model defects, provide highly reliable real robot interaction data for subsequent model correction, and effectively narrow the gap between simulation and reality.
In addition, OEMs bring automotive-grade safety standards and hardware-software co-design to robots, greatly reducing large models' inference latency and improving their reliability and deployment efficiency. The core supply chains of automobiles and robots (batteries, motors, sensors, domain controllers, etc.) overlap heavily; some institutions estimate the overlap rate exceeds 50%. Economies of scale greatly reduce the cost of core hardware, and model deployment costs decrease in step.
For example, to solve the data problem, GAC Group borrows from its autonomous driving data collection experience: it sends robots into real scenarios to collect real data while carrying out in-depth adaptation and field verification of core functions, forming a closed-loop data growth model of "learning by using, using by learning". On the cost side, its robots reuse vehicle components (such as chips and lidar) and achieve 100% localization of key components. GAC has clearly planned to mass-produce its fourth-generation product GoMate Mini in 2027, taking the security scenario as its robots' first commercial application field.
1 Overview of Embodied AI Robot Large Models and Key Technology Development Directions
1.1 Core Definitions of Embodied AI Robot Large Models
1.1.1 Definition and Evolution of Embodied AI: Shifting from Weak Interaction to High Autonomy
1.1.2 Definition of Embodied AI Robots: Autonomously Understanding the Environment and Completing Tasks via Artificial Intelligence
1.1.3 Definition of Embodied AI Robot Large Models
1.2 Global Industrial Ecosystem Map of Embodied AI Robot Large Models
1.3 Classification of Embodied AI Robot Large Models
1.3.1 By Application Scope
1.3.2 By Capability Modality
1.3.3 By Architectural Form
1.4 Industry Development Drivers of Embodied AI Robot Large Models
1.4.1 Overview
1.4.2 Policies as the Core Engine
1.4.3 Technology
1.4.4 Market Demand
1.4.5 Increased Capital Investment
1.4.6 Industrial Collaboration
1.4.7 Data Closed Loop Facilitates Model Iteration
1.4.8 Aggregation of Interdisciplinary Talents
1.5 Key Technology Development Directions of Embodied AI Robot Large Models
1.5.1 Overview
1.5.2 Multi-Modal Perception and Unified Representation
1.5.3 World Model
1.5.4 VLA End-to-End Architecture
1.5.5 Hierarchical Fast-Slow System
1.5.6 Enhancing Generalization Ability and Data Efficiency
1.5.7 Safety and Reliability
1.5.8 Lightweight and Edge Deployment
1.6 Commercialization Modes of Embodied AI Robot Large Models
1.6.1 Model Technology Output
1.6.2 Integrated Hardware and Software Sales
1.6.3 Scenario-Based Service Operation
1.6.4 Data and Tool Ecosystem Services
1.6.5 Key Strategies and Evolution Directions for Commercial Implementation of Embodied AI Robot Large Models
2 Global Major Players and Products: Tech Giant Camp
2.1 Summary of Typical Embodied AI Large Model Products of Tech Giants (1)-(3)
2.2 Alibaba Group
2.2.1 Industrial Layout and Planning for Embodied AI Robots
2.2.2 Large Model R&D and Engineering Team: Tongyi Lab
2.2.3 Establishment of the "Robot and Embodied AI Business Unit": Detailed Introduction
2.2.4 Establishment of the "Robot and Embodied AI Business Unit": 2026–2028 Business Plan
2.2.5 Core Team Members and Their Resumes of Embodied AI Robot Large Models
2.2.6 Large Model Product System
2.2.7 Embodied AI Robot Large Models: Milestones in the Development
2.2.8 Embodied AI Robot Large Models: Products Summary
2.2.9 Embodied AI Robot Large Models: RynnBrain Series – The World’s First Embodied AI Brain Foundation Model with Spatiotemporal Memory
2.2.10 Embodied AI Robot Large Models: Flagship General Embodied Model RynnBrain30BA3B
2.2.11 Embodied AI Robot Large Models: RynnVLA001
2.2.12 Embodied AI Robot Large Models: RynnEC – Video Multi-Modal Embodied Cognition Model
2.2.13 Embodied AI Robot Large Models: WorldVLA – Fully Autoregressive Embodied AI Large Model
2.2.14 Embodied AI Robot Large Models: Summary of Implemented Robots
2.2.15 Embodied AI Robot Large Models: Ecosystem Partners
2.3 NVIDIA
2.3.1 Profile
2.3.2 Industrial Layout History of Embodied AI Robots
2.3.3 Core Team for Embodied AI Robots
2.3.4 Summary of Embodied AI Robot-Related Products
2.3.5 Embodied AI Robot Large Models: Development History
2.3.6 Embodied AI Robot Large Models: Products Summary
2.3.7 Embodied AI Robot Large Models: Isaac GR00T – VLA Large Model
2.3.8 Embodied AI Robot Large Models: Dream Zero – World Action Model
2.3.9 Embodied AI Robot Large Models: Implementation Status
2.3.10 Embodied AI Robot Large Models: Ecosystem Partners
2.3.11 Embodied AI Robot Large Models: Key Dynamics
2.4 Google DeepMind
2.4.1 Core Team for Embodied AI Robots: Google DeepMind
2.4.2 Profile
2.4.3 Development History
2.4.4 Core Research Directions: 10 Major Fields
2.4.5 Core Team Members and Their Resumes
2.4.6 Summary of Large Models
2.4.7 Major Large Model: Gemini
2.4.8 Embodied AI Robot Large Models: Gemini Robotics
2.5 OpenAI
2.5.1 Profile
2.5.2 Financing History: Valuation Increased More Than 25 Times in Three Years
2.5.3 Development History
2.5.4 Organizational Structure
2.5.5 Product Matrix
2.5.6 Industrial Layout and Planning for Embodied AI Robots
2.5.7 Core Team Members and Resumes of the Humanoid Robot Lab
2.5.8 Embodied AI Robot Large Models: Products Summary
2.5.9 Embodied AI Robot Large Models: GPT-5 Embodied Adaptation Version
2.5.10 Embodied AI Robot Large Models: VLA Foundation Model
2.5.11 Embodied AI Robot Large Models: Ecosystem Partners
2.6 Microsoft
2.6.1 Industrial Layout History and Planning of Embodied AI Robots
2.6.2 Team Setup for Embodied AI Robots
2.6.3 Core Members and Their Resumes of the Embodied AI Team
2.6.4 Summary of Self-Developed Large Model Products
2.6.5 Embodied AI Robot Large Models: R&D History
2.6.6 Embodied AI Robot Large Models: Rho-alpha – VLA+ Model
2.6.7 Embodied AI Robot Large Models: Ecosystem Partners
2.6.8 Embodied AI Robot Large Models: Recent Key News and Dynamics
2.7 Huawei
2.7.1 Industrial Layout and Planning for Embodied AI Robots
2.7.2 Panoramic Table of Core Teams and Platforms for Embodied AI Robots
2.7.3 Core Team Members and Resumes of the Embodied AI Special Task Group
2.7.4 Overview of Pangu Large Model Products
2.7.5 Pangu Large Model Capabilities: Multi-Modal Technology
2.7.6 Pangu Large Model Capabilities: Reasoning Technology
2.7.7 Pangu Large Model AI Cloud Services
2.7.8 Embodied AI Robot Large Models: CloudRobo
2.7.9 Embodied AI Robot Large Models: Ecosystem Partners
2.8 Tencent RoboticsX
2.8.1 Profile (1)-(2)
2.8.2 Development History
2.8.3 Embodied AI Robot Large Models: Products Summary
2.8.4 Embodied AI Robot Large Models: Tairos-Perception
2.8.5 Embodied AI Robot Large Models: Tairos-Planner
2.8.6 Embodied AI Robot Large Models: Tairos-Action
2.8.7 Embodied AI Robot Large Models: Ecosystem Partners
2.8.8 Embodied AI Robot Large Models: Key Dynamics
2.9 Baidu
2.9.1 Industrial Layout History and Planning for Embodied AI Robots
2.9.2 Introduction to Teams for Embodied AI Robots
2.9.3 Summary of Large Model Products
2.9.4 Embodied AI Robot Large Models: ERNIE Embodied Control Model
2.9.5 Embodied AI Robot Large Models: Ecosystem Partners
2.9.6 Embodied AI Robot Large Models: Key Dynamics
2.10 ByteDance
2.10.1 Industrial Layout History and Planning for Embodied AI Robots
2.10.2 Introduction to Teams for Embodied AI Robots
2.10.3 Core Team Members and Their Resumes of SeedRobotics for Embodied AI Robots
2.10.4 Summary of Large Model Products
2.10.5 Embodied AI Robot Large Models: Products Summary
2.10.6 Embodied AI Robot Large Models: GR Series – Robot Cerebellum
2.10.7 Embodied AI Robot Large Models: Robix – Robot Cerebrum
2.10.8 Embodied AI Robot Large Models: M3-Agent – Multi-Modal Long-Term Memory
2.10.9 Embodied AI Robot Large Models: Ecosystem Partners
2.11 iFlytek
2.11.1 Industrial Layout and Planning for Embodied AI Robots
2.11.2 Related Teams/Enterprises for Embodied AI Robots
2.11.3 Summary of Large Model Products
2.11.4 Embodied AI Robot Large Models: Products Summary
2.11.5 Embodied AI Robot Large Models: iFlyBot-VLM
2.11.6 Embodied AI Robot Large Models: iFlyBot-VLA
2.11.7 Embodied AI Robot Large Models: Ecosystem Partners
2.11.8 Embodied AI Robot Large Models: Key Dynamics
2.12 SenseTime
2.12.1 Industrial Layout and Planning for Embodied AI Robots
2.12.2 Related Teams/Enterprises for Embodied AI Robots
2.12.3 Summary of Large Model Products
2.12.4 Embodied AI Robot Large Models: Products Summary
2.12.5 Embodied AI Robot Large Models: Wuneng Embodied AI Platform
2.12.6 Embodied AI Robot Large Models: A1 – Embodied Super Brain
2.12.7 Embodied AI Robot Large Models: Ecosystem Partners
2.12.8 Embodied AI Robot Large Models: Key Dynamics
3 Global Major Players and Products: Robot Enterprise Camp
3.1 Summary of Typical Embodied AI Large Model Products of Robot Enterprises (1)-(3)
3.2 UBTECH Robotics (UBTECH)
3.2.1 Profile
3.2.2 Revenue
3.2.3 Overview of Robot Products
3.2.4 Core Technology System
3.2.5 Development Strategy and Planning
3.2.6 Embodied AI Robot Large Models: BrainNet Architecture
3.2.7 Layout of Embodied AI Robot Large Models
3.2.8 Embodied AI Robot Large Models: Core Information of Three Major Models
3.2.9 Embodied AI Robot Large Models: Development History of Thinker Multi-Modal Large Model
3.2.10 Embodied AI Robot Large Models: Thinker – Multi-Modal Large Model for Humanoid Robots
3.2.11 Embodied AI Robot Large Models: Details of Large Model Installation on Humanoid Robots
3.2.12 Embodied AI Robot Large Models: Ecosystem Partners
3.3 Unitree Robotics
3.3.1 Profile
3.3.2 Market and Product Strategic Planning
3.3.3 Embodied AI Robot Large Models: Development History
3.3.4 Embodied AI Robot Large Models: Self-Developed UnifoLM Series
3.3.5 Embodied AI Robot Large Models: UnifoLM-WMA-0
3.3.6 Embodied AI Robot Large Models: UnifoLM-VLA-0
3.3.7 Embodied AI Robot Large Models: Ecosystem Partners
3.3.8 Embodied AI Robot Large Models: Details of Large Models Adapted to Robots
3.4 AgiBot
3.4.1 Profile
3.4.2 Product Overview
3.4.3 Embodied AI Robot Large Models: Five Self-Developed Core Models
3.4.4 Embodied AI Robot Large Models: GO-1
3.4.5 Embodied AI Robot Large Models: Guiguang Dongyu Large Model
3.4.6 Embodied AI Robot Large Models: WorkGPT
3.4.7 Embodied AI Robot Large Models: ActionGPT – Motion Large Model
3.4.8 Embodied AI Robot Large Models: GE-1 – World Model
3.4.9 Embodied AI Robot Large Models: Ecosystem Partners
3.4.10 Embodied AI Robot Large Models: Details of Large Model Installation on Humanoid Robots
3.4.11 "Project A"
3.5 Leju Robotics
3.5.1 Profile
3.5.2 Development History
3.5.3 Product Overview
3.5.4 Development Strategy and Planning
3.5.5 Embodied AI Robot Large Models: Development History
3.5.6 Embodied AI Robot Large Models: Embodied AI Module (Self-Developed Multi-Modal by Leju) & Education Large Model
3.5.7 Embodied AI Robot Large Models: Ecosystem Partners
3.5.8 Embodied AI Robot Large Models: Details of Large Model Installation on Humanoid Robots
3.5.9 Latest Dynamics
3.6 Galbot
3.6.1 Profile
3.6.2 Core Team Members
3.6.3 Product Overview
3.6.4 Strategic Planning
3.6.5 Embodied AI Robot Large Models: Summary of Self-Developed Large Models
3.6.6 Embodied AI Robot Large Models: GraspVLA – Grasping Foundation Model
3.6.7 Embodied AI Robot Large Models: Navigation Large Model
3.6.8 Embodied AI Robot Large Models: GroceryVLA – Retail Scenario Large Model
3.6.9 Embodied AI Robot Large Models: Ecosystem Partners
3.7 RobotEra
3.7.1 Profile
3.7.2 Overview of Robot Products
3.7.3 Four Stages of Embodied AI Robot Large Model Exploration
3.7.4 Embodied AI Robot Large Models: ERA-42
3.7.5 Embodied AI Robot Large Models: CtrlWorld – Controllable Generation World Model
3.7.6 Embodied AI Robot Large Models: Joining Two Top Industry-University-Research Alliances Simultaneously
3.7.7 Embodied AI Robot Large Models: Joint Open-Sourcing of AIGC Robot Large Model with Tsinghua University
3.7.8 Embodied AI Robot Large Models: Ecosystem Partners
3.8 FigureAI
3.8.1 Profile
3.8.2 Embodied AI Robot Large Models: Milestones in the Development
3.8.3 Embodied AI Robot Large Models: Helix – End-to-End VLA General Embodied AI Model
3.8.4 Embodied AI Robot Large Models: Details of Large Model Installation on Humanoid Robots
3.8.5 Embodied AI Robot Large Models: Industry Chain Partners
3.9 Sanctuary AI
3.9.1 Profile
3.9.2 Core Team Members and Their Resumes
3.9.3 Embodied AI Robot Large Models: Development History
3.9.4 Embodied AI Robot Large Models: Details of Large Model Installation on Humanoid Robots
3.9.5 Embodied AI Robot Large Models: Summary of Self-developed Large Model
3.9.6 Embodied AI Robot Large Models: Carbon-v3
3.9.7 Embodied AI Robot Large Models: LBM – Large Behavior Model
3.9.8 Embodied AI Robot Large Models: Industry Chain Partners
3.10 1X Technologies
3.10.1 Profile
3.10.2 Development History
3.10.3 Core Team Members and Background
3.10.4 Embodied AI Robot Large Models: R&D and Deployment History
3.10.5 Embodied AI Robot Large Models: Summary of Self-Developed Large Models and Their Implementation Status
3.10.6 Embodied AI Robot Large Models: Redwood AI
3.10.7 Embodied AI Robot Large Models: 1X World Model
3.10.8 Embodied AI Robot Large Models: Ecosystem Partners
3.11 Neura Robotics
3.11.1 Profile
3.11.2 Embodied AI Robot Large Models: Development History
3.11.3 Embodied AI Robot Large Models: Summary of Self-Developed Model System
3.11.4 Embodied AI Robot Large Models: NEFM
3.11.5 Embodied AI Robot Large Models: Ecosystem Partners
3.11.6 Latest Dynamics: China Headquarters Settled in Xiaoshan, Hangzhou
4 Global Major Players and Products: Cross-Border OEMs Camp
4.1 Summary of Typical Embodied AI Large Model Products of OEMs (1)-(4)
4.2 Tesla
4.2.1 Industrial Layout History and Planning for Embodied AI Robots
4.2.2 Strategic Positioning in the Embodied AI Field
4.2.3 Team Setup for Embodied AI Robots
4.2.4 Core Team Members and Resumes of the Optimus Robot Team
4.2.5 Embodied AI Robot Products and Large Model Deployment Status
4.2.6 Embodied AI Robot Large Models: Summary of Large Model Products
4.2.7 Embodied AI Robot Large Models: FSD – End-to-End Embodied Control Model
4.2.8 Embodied AI Robot Large Models: Grok 4 – Embodied Interaction Large Model
4.2.9 Optimus Humanoid Robot Brain Adopting Dojo Supercomputer System
4.2.10 Multiplexing FSD Software Algorithms for Robots
4.2.11 AI Humanoid Robot Software Algorithms – Perception Algorithms
4.2.12 AI Humanoid Robot Software Algorithms – Motion Planning
4.2.13 Embodied AI Robot Large Models: Ecosystem Partners
4.2.14 Embodied AI Robot Large Models: Key Dynamics
4.3 Toyota
4.3.1 Industrial Layout History and Planning for Embodied AI Robots
4.3.2 Related Teams/Companies for Embodied AI Robots
4.3.3 Core Members and Their Resumes of the Research Institute
4.3.4 Embodied AI Robot Products and Large Model Deployment Status
4.3.5 Embodied AI Robot Large Models: Summary of Large Model
4.3.6 Embodied AI Robot Large Models: LBM
4.3.7 Embodied AI Robot Large Models: Ecosystem Partners
4.3.8 Embodied AI Robot Large Models: Key Dynamics
4.4 Honda
4.4.1 Industrial Layout History and Planning for Embodied AI Robots
4.4.2 Related Teams/Companies for Embodied AI Robots
4.4.3 Embodied AI Robot Products and Large Model Deployment Status
4.4.4 Release of 2026 Core Technology Roadmap for Embodied Robots
4.4.5 Overview of Embodied AI Robot Large Models
4.4.6 Embodied AI Robot Large Models: Key Dynamics
4.5 Hyundai
4.5.1 Industrial Layout History and Planning for Embodied AI Robots
4.5.2 AIRobotics Strategy: Partnering Human Progress
4.5.3 Related Teams/Companies for Embodied AI Robots
4.5.4 Holding a Controlling Stake in Boston Dynamics
4.5.5 Boston Dynamics: Profile
4.5.6 Boston Dynamics: Core Team Members and Resumes
4.5.7 Embodied AI Robot Products and Large Model Deployment Status
4.5.8 Embodied AI Robot Large Models: Overview of Large Model
4.5.9 Embodied AI Robot Large Models: Ecosystem Partners
4.5.10 Embodied AI Robot Large Models: Key Dynamics
4.6 Xiaomi
4.6.1 Industrial Layout History and Planning for Embodied AI Robots
4.6.2 Related Teams/Companies for Embodied AI Robots
4.6.3 Panoramic View of Investment Ecosystem in the Embodied AI Robot Field
4.6.4 Embodied AI Robot Products and Large Model Deployment Status
4.6.5 Embodied AI Robot Large Models: Overview of Large Model
4.6.6 Embodied AI Robot Large Models: Xiaomi-Robotics-0
4.6.7 Embodied AI Robot: Self-Developed Software Algorithms
4.6.8 Empowerment of Automotive Technology on Embodied AI Robots
4.6.9 Embodied AI Robot Large Models: Ecosystem Partners
4.6.10 Embodied AI Robot Large Models: Key Dynamics
4.7 XPeng
4.7.1 Industrial Layout History and Planning for Embodied AI Robots
4.7.2 Related Teams/Companies for Embodied AI Robots
4.7.3 Core Talents and Their Resumes for Embodied AI Robots
4.7.4 Product Iteration History and Large Model Deployment Status of Embodied AI Robots
4.7.5 Embodied AI Robot Large Models: Summary of Large Model
4.7.6 Embodied AI Robot Large Models: VLT – Robot-Specific Decision Large Model
4.7.7 Embodied AI Robot Large Models: 2nd-Generation VLA Physical World Large Model
4.7.8 Embodied AI Robot Large Models: VLM – Multi-Modal Interaction Large Model
4.7.9 Multiplexing Automotive Algorithm Technology for Humanoid Robots
4.7.10 Embodied AI Robot Large Models: Ecosystem Partners
4.7.11 Embodied AI Robot Large Models: Key Dynamics
4.8 GAC Group
4.8.1 Industrial Layout History and Planning for Embodied AI Robots
4.8.2 Related Teams/Companies for Embodied AI Robots
4.8.3 Establishment of Huilun Technology: Responsible for Core R&D, Production and Sales of Robots
4.8.4 Huilun Technology: Core Members and Their Resumes
4.8.5 Embodied AI Robot Products and Large Model Deployment Status
4.8.6 Embodied AI Robot Large Models: Summary of Large Model
4.8.7 Embodied AI Robot Large Models: GoMate – General Multi-Modal Large Model
4.8.8 Embodied AI Robot Large Models: Embodied AI Motion Control Small Model
4.8.9 Embodied AI Robot Large Models: GoMate Mini – Security Vertical Large Model
4.8.10 Application of Autonomous Driving Technology on Humanoid Robots
4.8.11 Embodied AI Robot Large Models: Ecosystem Partners
4.8.12 Embodied AI Robot Large Models: Summary of Key Dynamics
4.9 Chery
4.9.1 Industrial Layout History and Planning for Embodied AI Robots
4.9.2 Related Teams/Companies for Embodied AI Robots
4.9.3 Embodied AI Robot Products and Large Model Deployment Status
4.9.4 Embodied AI Robot Large Models: Overview of Large Model
4.9.5 Embodied AI Robot Large Models: Ecosystem Partners
4.10 Leapmotor
4.10.1 Industrial Layout History and Planning for Embodied AI Robots
4.10.2 Related Teams/Companies for Embodied AI Robots
4.10.3 Core Team Members and Their Resumes of Embodied AI Robot Team
4.10.4 Embodied AI Robot Products and Large Model Deployment Status
4.10.5 Embodied AI Robot Large Models: Summary and Planning of Large Model
4.10.6 Embodied AI Robot Large Models: Ecosystem Partners
4.10.7 Embodied AI Robot Large Models: Key Dynamics
4.11 BYD
4.11.1 Industrial Layout History and Planning for Embodied AI Robots
4.11.2 Related Teams/Companies for Embodied AI Robots
4.11.3 Core Talents and Their Resumes for Embodied AI Robots
4.11.4 Investment Ecosystem in the Embodied AI Robot Field
4.11.5 Embodied AI Robot Products and Large Model Deployment Status
4.11.6 Summary of Embodied AI Robot Large Models
4.11.7 Embodied AI Robot Large Models: Key Dynamics
4.12 Dongfeng Motor
4.12.1 Industrial Layout History and Planning for Embodied AI Robots
4.12.2 Related Teams/Companies for Embodied AI Robots
4.12.3 Embodied AI Robot Products and Large Model Deployment Status
4.12.4 Embodied AI Robot Large Models: Taiji Large Model
4.12.5 Embodied AI Robot Large Models: Ecosystem Partners
4.12.6 Embodied AI Robot Large Models: Key Dynamics
4.13 Summary of Global Other Main OEMs' Layout in the Embodied AI Robot Field (1)-(4)
Global and China CNC Machine Tool Industry Report, 2022-2027
As typical mechatronics products, CNC machine tools are a combination of mechanical technology and CNC intelligence. The upstream mainly involves castings, sheet metal parts, precision parts, function...
Global and China Hydraulic Industry Report, 2021-2026
Hydraulic components are key parts for mobile machineries including construction machinery, agricultural and forestry machinery, material handling equipment and commercial vehicle. The global construc...
China Motion Controller Industry Report, 2021-2026
The motion control system is the core component of intelligent manufacturing equipment, usually composed of controllers, motors, drivers, and human-computer interaction interfaces. Through the control...
Global and China Industrial Robot Servo Motor Industry Report, 2021-2026
As the actuator of control system, servo motor is one of the three crucial parts to industrial robot and its development is bound up with industrial robots. Given the slow progress of 3C electronics a...
Global and China Industrial Laser Industry Report, 2020-2026
As one of the most advanced manufacturing and processing technologies in the world, laser technology has been widely used in industrial production, communications, information processing, medical beau...
Global and China Mining-use Autonomous Driving Industry Report, 2020-2021
Demand and policies speed up landing of Autonomous Driving in Mining
Traditional mines have problems in recruitment, efficiency, costs, and potential safety hazards, while which can be solved by aut...
Autonomous Agricultural Machinery Research Report, 2020
Autonomous Agricultural Machinery Research: 17,000 sets of autonomous agricultural machinery systems were sold in 2020, a year-on-year increase of 188%
Autonomous agricultural machinery relies heavil...
Global and China CNC Machine Tool Industry Report, 2020-2026
As a typical type of mechatronic products, CNC machine tools combine mechanical technology with CNC intelligence. The upstream mainly involves castings, sheet weldments, precision parts, functional pa...
Global and China Hydraulic Industry Report, 2020-2026
Hydraulic parts, essential to modern equipment manufacturing, are mostly used in mobile machinery, industrial machinery and large-sized equipment. Especially, construction machinery consumes the overw...