China Automotive Gesture Interaction Development Research Report, 2022-2023
  • Published: Mar. 2023
  • Pages: 186
  • Code: LYX001
  • Hard Copy: USD $3,800
  • Single User License (PDF Unprintable): USD $3,600
  • Enterprise-wide License (PDF Printable & Editable): USD $5,400
  • Hard Copy + Single User License: USD $4,000


Vehicle gesture interaction research: in 2022, installations of vehicle gesture recognition soared by 315.6% year on year.

China Automotive Gesture Interaction Development Research Report, 2022-2023 released by ResearchInChina analyzes and studies four aspects: gesture interaction technology, benchmarking vehicle gesture interaction solutions, gesture interaction industry chain, and gesture interaction solution providers.

1. In 2022, installations of vehicle gesture recognition functions soared by 315.6% year on year.

With the iterative upgrade of intelligent cockpit technology, cockpit services are evolving from passive to active intelligence, and human-computer interaction is shifting from single-modal to multi-modal. Against this backdrop, vehicle gesture interaction functions are booming. In 2022, gesture recognition (as a standard configuration) was installed in 427,000 passenger cars in China, a year-on-year surge of 315.6%; the installation rate reached 2.1%, 1.6 percentage points higher than in 2021.
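The headline figures above imply a few derived quantities. A quick back-of-envelope check (the vehicle base and the 2021 figure below are derived from the cited numbers, not separately reported data):

```python
# Sanity-check the cited figures: 427,000 standard-fit installations at a
# 2.1% installation rate, up 315.6% year on year.
installations_2022 = 427_000
installation_rate_2022 = 0.021

# Implied passenger-car base covered by the statistics (derived, not reported).
implied_vehicle_base = installations_2022 / installation_rate_2022

# A 315.6% year-on-year increase means 2022 = 2021 * (1 + 3.156).
implied_installations_2021 = installations_2022 / (1 + 3.156)

print(f"{implied_vehicle_base:,.0f}")        # about 20.3 million vehicles
print(f"{implied_installations_2021:,.0f}")  # about 103,000 vehicles
```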

[Figure: Gesture Interaction 1]

By brand, Changan Automobile posted the highest gesture recognition installation rate in 2022, at 33.0%, 13.1 percentage points higher than in 2021. By model, Changan had a total of 6 models (e.g., UNI-V, CS75 and UNI-K) equipped with gesture recognition as a standard configuration in 2022, 5 more than in 2021.

[Figure: Gesture Interaction 2]

The gesture recognition feature of the Changan UNI-K adopts a 3D ToF solution, enabling functions such as song switching and navigation activation. The specific gestures are: swipe the palm horizontally to the left/right to play the previous/next song; make a finger heart to navigate home; give a thumbs-up to navigate to the workplace.

[Figure: Gesture Interaction 3]

2. The control scope of gesture recognition is extending from software to hardware, and from the inside to the outside of cars.

As gesture interaction technology gains popularity and finds application in ever more scenarios, vehicle gesture interaction is also taking off, and automakers are pushing hard to build out cockpit interaction functions. Gesture-controlled functions have expanded from the initial in-vehicle infotainment features (e.g., phone calls, media volume and navigation) to body hardware and safety systems (e.g., windows/sunroof/sunshades, doors, and driving).

In addition, manufacturers are developing exterior gesture control technology. One example is the WEY Mocha, which allows gesture control of ignition, forward/backward movement, stopping, and shutdown from outside the car. In the future, gesture recognition will no longer be limited to occupants and will gradually cover the actions of people outside the car, for instance, recognizing the command gestures of traffic police or the hand signals of cyclists around the vehicle.

3. Six gesture recognition technology routes.

By technology route, gesture recognition is dominated by six approaches: 3D-camera-based structured light, ToF and stereo vision; millimeter-wave radar; ultrasonics; and bioelectricity-based myoelectricity.

[Figure: Gesture Interaction 4]

At the current stage, 3D-camera-based gesture sensing prevails among vehicle gesture recognition technology routes. The solution consists of a 3D camera and a control unit. Composed of a camera, infrared LEDs and a sensor, the 3D camera captures hand movements; image processing algorithms then recognize the gesture type and issue the corresponding instructions. The 3D camera route can be subdivided into structured light, ToF and stereo vision.

1. Structured light projects light carrying coded information onto the human body; an infrared sensor collects the reflected pattern, and a processor then builds a 3D model. With mature hardware, high recognition accuracy and high resolution, it is suited to close-range scenarios within 10 meters. The gesture recognition on the Neta S, rolled out in July 2022, uses a structured light solution.

The in-cabin gesture recognition sensor of the Neta S is located above the interior rearview mirror. It can recognize 6 gestures: swipe the palm back and forth to adjust the light transmittance of the sunroof; make a "shh" sign to enter silent mode; rotate a finger clockwise/counterclockwise to adjust the volume; move the palm left or right to switch audio and video programs; make a "V" sign to take in-car selfies; give a thumbs-up to save favorite programs.

[Figure: Gesture Interaction 5]

2. ToF (time-of-flight) technology ranges by measuring the round-trip time of emitted light, with the underlying photosensitive elements constructing 3D images. It obtains effective, real-time depth information within 5 meters and suits a wider range of scenarios, acquiring effective depth-of-field information whether the ambient light is strong (e.g., sunlight) or weak. The gesture recognition solutions installed in production models such as the BMW iX, Li Auto L9 and new ARCFOX αS HI Edition are all ToF solutions.
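The ranging principle behind ToF reduces to one formula: distance is half the round-trip time multiplied by the speed of light. A minimal sketch:

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance implied by a measured round-trip time: d = c * t / 2."""
    return C * round_trip_time_s / 2.0

# Light reflected by a hand about 1 m from the sensor returns after ~6.67 ns,
# which is why ToF sensors need picosecond-scale timing resolution.
print(f"{tof_distance_m(6.67e-9):.3f} m")  # ~1.000 m
```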

The in-cabin gesture recognition sensor of the BMW iX lies at the dome light above the center console screen. It can recognize 8 gestures, including:
① swipe the hand to the left or right to reject a phone call/close a pop-up;
② point the index finger back and forth to answer a phone call/confirm a pop-up;
③ rotate the index finger clockwise to turn the volume up or zoom in on the navigation map;
④ rotate the index finger counterclockwise to turn the volume down or zoom out on the navigation map;
⑤ move a fist with the thumb extended to the left or right to play the previous/next song;
⑥ point the extended index and middle fingers toward the display to trigger an individually assignable gesture;
⑦ stretch out all five fingers, make a fist, then stretch out all five fingers again to trigger an individually assignable gesture;
⑧ bring the thumb and index finger together and swipe the hand to the right or left for a view around the car (requires the automated parking assist system PLUS).

[Figure: Gesture Interaction 6]

To enable gesture recognition and control by occupants in both rows, the Li Auto L9 has gesture recognition sensors in the front and rear cabin. The front sensor is located above the interior rearview mirror, and the rear one above the rear entertainment screen.

The front cabin sensor can recognize 2 gestures, including:
① point toward the windows/sunroof/sunshades to control them (combined with voice interaction);
② make a fist, hold it, and swipe up or down on the play page to adjust the volume.

The rear cabin sensor can recognize 7 gestures, including:
① stretch out all five fingers and rest the inner side of the elbow on the armrest for 2 seconds to activate gesture control;
② stretch out all five fingers and swipe the hand down to turn on the screen;
③ stretch out all five fingers and swipe the hand to move the cursor;
④ stretch out all five fingers and make a fist to select an icon;
⑤ stretch out all five fingers, make a fist, hold it, and move the hand to share content from the rear entertainment screen to the front display;
⑥ stretch out all five fingers, make a fist, hold it, and swipe the hand left or right on the play page to adjust the playback progress;
⑦ stretch out all five fingers and swipe the hand up to exit the current content.

[Figure: Gesture Interaction 7]

3. Stereo vision technology, based on the parallax principle, recovers the 3D geometry of objects from multiple images. It is a cost-effective solution with low hardware requirements and no need for additional special devices. The gesture recognition solution carried by the Mercedes-Benz EQS launched in May 2022 is a stereo vision solution.
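The parallax principle also reduces to a single formula: depth Z = f·B/d, where f is the focal length in pixels, B the baseline between the two cameras, and d the disparity between matched pixels. A sketch with illustrative (not vendor-specific) numbers:

```python
def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from parallax: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 800 px focal length, 6 cm camera baseline.
# A hand matched with 60 px of disparity sits 0.8 m from the cameras;
# the nearer the hand, the larger the disparity.
print(stereo_depth_m(800, 0.06, 60))   # 0.8 m
print(stereo_depth_m(800, 0.06, 120))  # 0.4 m
```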

The in-cabin gesture recognition sensor of the Mercedes-Benz EQS is located at the reading light on the roof and can recognize 3 gestures, including:
① make a "V" sign to call up favorites;
② swipe the hand back and forth under the interior rearview mirror to control the sunroof;
③ swipe the hand toward the inside of the car to automatically close the doors (requires the optional four-door electric switches).

[Figure: Gesture Interaction 8]


At present, gesture recognition technologies such as millimeter-wave radar, ultrasonics, and bioelectricity-based myoelectricity have yet to be widely used for in-cabin gesture recognition. Compared with conventional vision-based gesture recognition, these technologies still have limitations and face challenges.

1. Radar is a radio wave sensor that can accurately detect the position and movement of hands even in the presence of obstacles.

In 2020, Ainstein, the American subsidiary of Muniu Technology, established a joint venture brand, RADAC, with ADAC Automotive. At CES 2020, Ainstein introduced a radar-based vehicle gesture recognition solution in which the sensor sits on top of the tailgate, allowing users to open the tailgate by swiping a hand left or right.
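Radar gesture sensing typically exploits the Doppler effect: a moving hand shifts the frequency of the reflected wave by f_d = 2·v·f_c/c. A sketch with an assumed 60 GHz carrier (an illustrative figure, not Ainstein's published spec):

```python
C = 299_792_458.0  # speed of light in m/s

def doppler_shift_hz(hand_speed_mps: float, carrier_hz: float) -> float:
    """Doppler shift of a radar return off a hand moving at the given speed."""
    return 2.0 * hand_speed_mps * carrier_hz / C

# A swipe at 0.5 m/s in front of a 60 GHz radar shifts the return by ~200 Hz,
# which is straightforward to extract from the received spectrum.
print(round(doppler_shift_hz(0.5, 60e9)))  # 200
```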

[Figure: Gesture Interaction 9]

2. Ultrasonic radar. In February 2020, DS showcased the Aero Sport Lounge concept car at the Geneva International Motor Show. Integrating Leap Motion and Ultrahaptics technologies, the car can recognize and understand occupants' gestures and give them haptic feedback through stereo ultrasonic waves emitted by a micro-speaker.

The in-cabin gesture recognition and ultrasonic feedback sensor of the DS Aero Sport Lounge is located at the center armrest and supports gesture control of 5 functions, including:
①adjust in-cabin temperature and blowing velocity;
②adjust tracks and volume;
③process navigation/map, including new route settings;
④answer/reject phone calls;
⑤switch menu functions.

[Figure: Gesture Interaction 10]

3. Bioelectricity refers to the electric signals generated by human muscle activity; bioelectric sensors recognize gestures and movements by measuring these signals. At present, myoelectric gesture recognition is mostly used to control external devices and interaction interfaces such as prosthetics, virtual reality and gaming devices. Thalmic Labs, a Canadian company dedicated to smart gesture control products, introduced the MYO armband, the first wearable device based on myoelectric technology. Eight myoelectric sensors embedded in the armband record the electric signals of the arm muscles and recognize different gestures by analyzing them. In practice, users can control drones, computers, smartphones and other electronic devices via MYO's Bluetooth connection. There are no vehicle use cases at present.
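In simplified form, such a system extracts a feature from each window of EMG samples and thresholds or classifies it. The RMS feature below is standard in EMG processing, but the threshold and gesture labels are illustrative, not MYO's actual algorithm:

```python
import math

def rms(window: list[float]) -> float:
    """Root-mean-square amplitude of one window of EMG samples."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def classify(window: list[float], threshold: float = 0.5) -> str:
    """Toy classifier: strong muscle activation reads as a fist clench."""
    return "fist_clench" if rms(window) > threshold else "rest"

# High-amplitude samples (active muscle) vs. near-zero samples (relaxed arm).
print(classify([0.9, -0.8, 0.85, -0.95]))   # fist_clench
print(classify([0.05, -0.04, 0.03, 0.02]))  # rest
```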

[Figure: Gesture Interaction 11]

1 Overview of Gesture Interaction
1.1 Introduction to Gesture Interaction
1.2 Key Features of Gesture Interaction
1.3 Application Scenarios of Gesture Interaction
1.3.1 Mobile Devices
1.3.2 Smart Wearables
1.3.3 Smart Home
1.3.4 Outdoor/Indoor Experience Areas
1.3.5 Automobiles
1.4 Development Route of Automotive Gesture Interaction 
1.4.1 Development Route of Gesture Interaction and Intelligent Cockpit 
1.4.2 Development Route of Gesture Interaction and Multimodal Interaction
1.4.3 Installation History of Automotive Gesture Interaction
1.4.4 Development History and Trends of Intelligent Cockpit Interaction Scenarios (1)
1.4.5 Development History and Trends of Intelligent Cockpit Interaction Scenarios (2)
1.4.6 Development History and Trends of Intelligent Cockpit Interaction Scenarios (3) 
1.4.7 Development Trends of Gesture Interaction (1)
1.4.8 Development Trends of Gesture Interaction (2)
1.4.9 Development Trends of Gesture Interaction (3)

2 Gesture Interaction Industry Chain
2.1 Types of Gesture Interaction Technology
2.2 Development History of Gesture Interaction Technology by Type
2.3 Gesture Interaction Industry Chain
2.4 Gesture Interaction Algorithm
2.5 Gesture Interaction Solution Providers
2.6 Trends of Gesture Interaction Patent Filings
2.7 TOP10 Companies by Number of Gesture Interaction Patents

3 Gesture Interaction Solutions for Benchmarking Models
3.1 Installation of Vehicle Gesture Recognition 
3.1.1 Installations & Installation Rate 
3.1.2 Ranking of Brands
3.1.3 Ranking of Vehicle Models
3.1.4 Price Features 
3.1.5 Installation of Gesture Function in New Models in 2022
3.1.6 Price Features of New Models in 2022
3.2 Cockpit Interaction Modes and Gesture Interaction Functions of Major Models
3.3 Gesture Interaction Solutions for Benchmarking Models
3.3.1 BMW
3.3.2 Li Auto
3.3.3 ARCFOX 
3.3.4 Mercedes-Benz
3.3.5 Neta
3.3.6 RADAC & DS & BYTON

4 Gesture Interaction Solution Providers
4.1 Cipia Vision (Eyesight Technologies)
4.1.1 Profile
4.1.2 Gesture Interaction Products 
4.1.3 In-cabin Interaction Solutions
4.1.4 In-cabin Solutions 
4.2 Ultraleap (Leap Motion)
4.2.1 Profile
4.2.2 Hand Tracking Technology
4.2.3 Hand Tracking Hardware
4.2.4 Gesture Tracking Software
4.2.5 Mid-air Haptic Technology
4.2.6 Haptic Feedback Kit
4.2.7 Technical Reference Platform
4.2.8 Application of Products: Gesture Solution Based on AR Helmet
4.2.9 Application of Products: Outdoor Gesture Solution
4.2.10 Application of Products: Vehicle Gesture Solution
4.2.11 Development Route
4.3 Aptiv
4.3.1 Profile
4.3.2 Solutions
4.3.3 In-cabin Sensing Platform
4.3.4 Gesture Recognition Technology
4.4 Cerence Inc.
4.4.1 Profile 
4.4.2 Cockpit Interaction Solutions
4.4.3 Core Technologies
4.4.4 Human-Computer Interaction: Eyesight + Gesture/Voice
4.4.5 Human-Computer Interaction: Gesture
4.4.6 Development Plan
4.5 Melexis
4.5.1 Profile
4.5.2 ToF Sensor Chip
4.5.3 Gesture Interaction
4.6 SenseTime
4.6.1 Profile
4.6.2 Core Technologies of Smart Cars
4.6.3 Smart Car Solutions 
4.6.4 SenseAuto Cabin
4.6.5 In-cabin Interaction Technology
4.7 uSens
4.7.1 Profile
4.7.2 Core Technologies
4.7.3 Product Layout
4.7.4 Gesture Recognition System Solutions
4.7.5 Strategic Planning
4.7.6 Development Route of Interaction
4.8 Geefish Technology
4.8.1 Profile
4.8.2 ToF Modules for Gesture Recognition 
4.8.3 In-depth Human Vehicle Interaction Solution
4.8.4 Gesture Recognition Technology
4.8.5 Exploration Route
4.9 iGentAI 
4.9.1 Profile
4.9.2 IVI Solutions

5 Appendix: Summary of Automotive Gesture Interaction Patents
5.1 Summary of Gesture Interaction Patents of OEMs
5.2 Summary of Gesture Interaction Patents of Suppliers
5.3 Summary of Gesture Interaction Patents of Research Institutes
5.4 Summary of Gesture Interaction Patents of Colleges and Universities 
5.5 Summary of Gesture Interaction Patents of Individuals 
 

