CN115359463A - Gesture-controlled intelligent surface in vehicle and using method thereof

Gesture-controlled intelligent surface in vehicle and using method thereof

Info

Publication number
CN115359463A
Authority
CN
China
Prior art keywords: gesture, target detection, detection module, controlled, vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211018494.7A
Other languages
Chinese (zh)
Inventor
王金磊
丰建芬
陈津义
刘健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changzhou Xingyu Automotive Lighting Systems Co Ltd
Original Assignee
Changzhou Xingyu Automotive Lighting Systems Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date: 2022-08-24
Publication date: 2022-11-18
Application filed by Changzhou Xingyu Automotive Lighting Systems Co Ltd filed Critical Changzhou Xingyu Automotive Lighting Systems Co Ltd
Priority to CN202211018494.7A
Publication of CN115359463A
Legal status: Pending (current)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/50: Context or environment of the image
    • G06V 20/59: Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V 20/597: Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/70: Arrangements using pattern recognition or machine learning
    • G06V 10/82: Arrangements using neural networks
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20: Movements or behaviour, e.g. gesture recognition
    • G06V 2201/00: Indexing scheme relating to image or video recognition or understanding
    • G06V 2201/07: Target detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a gesture-controlled intelligent surface in a vehicle and a method of using it. A camera module collects near-infrared images of the hands of the driver and passengers, a target detection module recognizes static gestures with a YOLOv5 gesture target detection algorithm, and the recognition result is sent over a CAN bus to the intelligent surface, which starts the corresponding atmosphere-pattern rhythm effect, such as a color-switching, breathing, flowing-water, circulation, or cheerful mode. This enriches human-machine interaction in the cabin.

Description

Gesture-controlled intelligent surface in vehicle and using method thereof
Technical Field
The invention relates to a gesture-controlled intelligent surface in a vehicle and a method of using it, and belongs to the field of intelligent vehicle lighting.
Background
Intelligent surfaces are the development trend for the interior trim and electronics of future intelligent vehicle cabins: bright, colorful atmosphere patterns are displayed on the trim, and the surface can be designed into an instrument panel, a door trim panel, a seat, and the like.
However, existing mass-produced intelligent surfaces are controlled in only one way: the pattern lights up when the vehicle is powered on and plays a fixed breathing rhythm preset in software, and occupants cannot control the atmosphere rhythm effect themselves. The result is monotonous; the intelligent surface is merely a different form of the traditional ambient lamp, it does not set the tone of an intelligent cabin, and it lacks intelligent human-machine interaction. As expectations grow, occupants increasingly want to control the various parts of the cabin themselves, in a way that is simple, efficient, and supports blind (eyes-free) operation, so current intelligent surfaces need more intelligent control modes.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provide a gesture-controlled intelligent surface in a vehicle and a control method thereof.
In order to solve the technical problems, the technical scheme of the invention is as follows:
the invention provides a gesture-controlled intelligent surface in a vehicle, which comprises a camera module, a target detection module and an intelligent surface, wherein the camera module is connected with the input end of the target detection module, and the output end of the target detection module is connected with the input end of the intelligent surface.
Further, the camera module comprises a near-infrared camera and a light supplement module thereof.
Further, the output end of the near-infrared camera is connected to the input end of the target detection module via LVDS, and the near-infrared camera is used to collect hand images of the driver and passengers.
Further, the near-infrared camera is installed at the center of the vehicle's instrument panel.
Further, the target detection module comprises an NVIDIA AI processing board, and the algorithm adopted by the target detection module is a YOLOv5 gesture target detection algorithm.
Further, the output end of the target detection module is connected with the input end of the intelligent surface through a CAN bus, and the target detection module is used for sending the gesture recognition result to the intelligent surface through the CAN bus.
Further, the intelligent surface comprises an MCU, an LED constant current driving chip and an RGB LED array;
the MCU is used for analyzing CAN bus information and acquiring a gesture instruction;
and the LED constant current driving chip receives an SPI signal sent by the MCU and performs the corresponding rhythm-effect control on the RGB LED array.
Further, the gesture command includes, but is not limited to, a thumbs-up (like), heart, or OK gesture, or the numbers 1-9.
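As a rough illustration of the gesture target detection step, the following is a minimal Python sketch. It assumes the publicly available ultralytics/yolov5 PyTorch Hub interface and a hypothetical custom-trained weights file gesture_yolov5.pt; the patent does not disclose the training pipeline or class list, so the confidence threshold and class names here are placeholders.

```python
import torch

# Minimal sketch: load a YOLOv5 model with hypothetical custom gesture weights.
# "ultralytics/yolov5" is the public PyTorch Hub repo; gesture_yolov5.pt is assumed.
model = torch.hub.load("ultralytics/yolov5", "custom", path="gesture_yolov5.pt")
model.conf = 0.5  # detection confidence threshold (illustrative value)

def detect_gesture(nir_image):
    """Run gesture detection on a 640x640 near-infrared frame.

    Returns the class name of the highest-confidence detection
    (e.g. "like", "heart", "ok", "1" ... "9"), or None if nothing is found.
    """
    results = model(nir_image, size=640)
    detections = results.xyxy[0]  # tensor of (x1, y1, x2, y2, confidence, class)
    if detections.shape[0] == 0:
        return None
    best = detections[detections[:, 4].argmax()]
    return model.names[int(best[5])]
```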
The invention also provides a control method for the gesture-controlled intelligent surface in a vehicle, which comprises the following steps:
S1, the camera module collects hand images of the driver and passengers and sends a near-infrared hand image to the target detection module every 10 frames;
S2, the target detection module preprocesses the near-infrared image, cropping it to a resolution of 640 x 640;
S3, the target detection module, which has first been trained through a network model to obtain the YOLOv5 gesture target detection algorithm, feeds the preprocessed image into the gesture target detection algorithm and predicts the result;
S4, the prediction result is analyzed; if no valid gesture is detected, the hand image is acquired again; if the gesture is valid, the gesture information is sent to the intelligent surface over the CAN bus;
S5, the intelligent surface parses the CAN bus signal to obtain the occupant's gesture action and starts a different atmosphere-pattern rhythm effect for each gesture action.
Further, the atmosphere-pattern rhythm effect is a color-switching, breathing, flowing-water, circulation, or cheerful mode.
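The following Python sketch strings steps S1-S4 together, assuming an OpenCV-readable near-infrared camera and the python-can library. The camera index, the CAN arbitration ID 0x3A0, and the one-byte gesture coding are placeholders invented for illustration; detect_gesture is the YOLOv5 helper sketched above.

```python
import cv2
import can  # python-can

# Placeholder gesture-to-byte coding; the patent does not define the CAN payload.
GESTURE_CODES = {"like": 0x01, "heart": 0x02, "ok": 0x03}
GESTURE_CODES.update({str(n): 0x10 + n for n in range(1, 10)})  # digits 1-9

bus = can.interface.Bus(channel="can0", interface="socketcan")  # placeholder channel
cap = cv2.VideoCapture(0)  # placeholder NIR camera source

frame_index = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame_index += 1
    if frame_index % 10:                       # S1: forward only every 10th frame
        continue
    h, w = frame.shape[:2]                     # S2: center-crop to 640 x 640
    y0, x0 = max(0, (h - 640) // 2), max(0, (w - 640) // 2)
    crop = frame[y0:y0 + 640, x0:x0 + 640]
    gesture = detect_gesture(crop)             # S3: YOLOv5 prediction (see above)
    if gesture not in GESTURE_CODES:           # S4: not a valid gesture, re-acquire
        continue
    msg = can.Message(arbitration_id=0x3A0,    # placeholder gesture message ID
                      data=[GESTURE_CODES[gesture]],
                      is_extended_id=False)
    bus.send(msg)                              # S4: notify the intelligent surface
```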
By adopting this technical solution, the camera module collects hand images of the driver and passengers, the static gestures of the occupants are recognized through visual perception, and the atmosphere pattern on the intelligent surface responds with a different rhythm effect for each gesture, such as color switching, breathing, flowing water, circulation, and cheerful modes, which enriches human-machine interaction and raises the refinement of the cabin.
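The text lists the available effects but does not fix which gesture triggers which; one hypothetical assignment, continuing the Python sketches above, could be:

```python
# Hypothetical gesture-to-effect mapping; the patent does not prescribe this pairing.
EFFECT_BY_GESTURE = {
    "like":  "color_switch",
    "heart": "breathing",
    "ok":    "flowing_water",
    "1":     "circulation",
    "2":     "cheerful",
}
```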
Drawings
FIG. 1 is a system schematic block diagram of a gesture controlled in-vehicle smart surface of the present invention;
FIG. 2 is an algorithm flow chart of the control method of the gesture controlled in-vehicle smart surface of the present invention.
Detailed Description
So that the manner in which the invention is achieved can be understood in detail, a more particular description of the invention, briefly summarized above, is given with reference to the embodiments illustrated in the appended drawings.
Example one
As shown in fig. 1, the present embodiment provides a gesture-controlled intelligent surface in a vehicle, which includes a camera module, a target detection module, and an intelligent surface, wherein the camera module is connected to an input terminal of the target detection module, and an output terminal of the target detection module is connected to an input terminal of the intelligent surface.
As shown in fig. 1, the camera module of the present embodiment includes a near-infrared camera and its fill-light module. The near-infrared camera is installed at the center of the vehicle's instrument panel, its output end is connected to the input end of the target detection module via LVDS, and it is used to collect hand images of the driver and passengers.
As shown in fig. 1, the target detection module of the present embodiment includes an NVIDIA AI processing board, and the algorithm adopted by the target detection module is a YOLOv5 gesture target detection algorithm. The output end of the target detection module is connected to the input end of the intelligent surface through a CAN bus, and the target detection module is used to send the gesture recognition result to the intelligent surface over the CAN bus.
As shown in fig. 1, the intelligent surface of the present embodiment includes an MCU, an LED constant current driving chip, and an RGB LED array;
the MCU is used for parsing the CAN bus information and acquiring a gesture instruction, where the gesture instruction includes, but is not limited to, a thumbs-up (like), heart, or OK gesture, or the numbers 1-9;
and the LED constant current driving chip receives the SPI signal sent by the MCU and performs the corresponding rhythm-effect control on the RGB LED array.
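The MCU firmware itself is not disclosed (in a production part it would typically be C running on the MCU). Purely as a host-side illustration of the described data path, the Python sketch below uses python-can to parse the gesture message and spidev to push a command frame to the constant-current LED driver; the CAN ID, gesture coding, and SPI payload format are all assumptions.

```python
import can      # python-can
import spidev   # Linux SPI userspace interface

GESTURE_BY_CODE = {0x01: "like", 0x02: "heart", 0x03: "ok"}  # placeholder coding
EFFECT_FRAMES = {                                            # placeholder SPI payloads
    "like":  [0xA0, 0x01],   # color-switching mode
    "heart": [0xA0, 0x02],   # breathing mode
    "ok":    [0xA0, 0x03],   # flowing-water mode
}

spi = spidev.SpiDev()
spi.open(0, 0)               # SPI bus 0, chip-select 0 (placeholder wiring)
spi.max_speed_hz = 1_000_000

bus = can.interface.Bus(channel="can0", interface="socketcan")
for msg in bus:                              # iterate over received CAN messages
    if msg.arbitration_id != 0x3A0:          # placeholder gesture message ID
        continue
    gesture = GESTURE_BY_CODE.get(msg.data[0])
    if gesture not in EFFECT_FRAMES:
        continue
    spi.xfer2(list(EFFECT_FRAMES[gesture]))  # send the rhythm-effect command to the LED driver
```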
Example two
As shown in fig. 2, the present embodiment provides a method for controlling an intelligent surface in a vehicle through gesture control, which includes:
s1, a camera module collects hand images of a driver and a passenger, and sends a near-infrared image of the hand of the passenger to a target detection module every 10 frames;
s2, preprocessing the near-infrared image by a target detection module, and cutting the near-infrared image into an image with the resolution of 640 x 640;
s3, a target detection module firstly obtains a YOLOv5 gesture target detection algorithm through network model training, and transmits the preprocessed image into the gesture target detection algorithm to predict a result;
s4, analyzing the prediction result, and if the hand gesture is not an effective gesture, re-acquiring the hand image; if the gesture is effective, sending gesture information to the intelligent surface through the CAN bus;
and S5, analyzing the gesture actions of the passenger by the intelligent surface through analyzing the CAN bus signals, and starting different atmosphere pattern rhythm effects according to different gesture actions, wherein the atmosphere pattern rhythm effects include but are not limited to a color switching mode, a breathing mode, a water flowing mode, a circulating mode and a cheerful mode.
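How each rhythm effect is generated is not specified in the text; as one hedged example, a breathing mode could be produced by streaming a slowly rising and falling brightness level to the LED driver, for instance:

```python
import math

def breathing_brightness(t, period_s=3.0, max_level=255):
    """Hypothetical breathing-mode curve: brightness rises and falls smoothly
    over period_s seconds, returned as an 8-bit level for the LED driver."""
    phase = (t % period_s) / period_s  # 0..1 within one breath
    return int(max_level * 0.5 * (1 - math.cos(2 * math.pi * phase)))
```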
The technical problems, technical solutions, and advantages of the present invention have been described in detail with reference to the above embodiments. It should be understood that the above embodiments are only examples of the present invention and should not be construed as limiting it; any modifications, equivalent substitutions, improvements, and the like made within the spirit and principles of the present invention fall within its scope of protection.

Claims (10)

1. A gesture-controlled in-vehicle intelligent surface, characterized by comprising a camera module, a target detection module, and an intelligent surface, wherein the camera module is connected to the input end of the target detection module, and the output end of the target detection module is connected to the input end of the intelligent surface.
2. The gesture-controlled in-vehicle smart surface of claim 1, wherein: the camera module comprises a near-infrared camera and a light supplementing module thereof.
3. The gesture-controlled in-vehicle smart surface of claim 2, wherein: the output end of the near-infrared camera is connected to the input end of the target detection module via LVDS, and the near-infrared camera is used to collect hand images of the driver and passengers.
4. The gesture-controlled in-vehicle smart surface of claim 3, wherein: the near-infrared camera is installed at the center of the vehicle's instrument panel.
5. The gesture-controlled in-vehicle smart surface of claim 1, wherein: the target detection module comprises an NVIDIA AI processing board, and the algorithm adopted by the target detection module is a YOLOv5 gesture target detection algorithm.
6. The gesture-controlled in-vehicle smart surface of claim 1, wherein: the output end of the target detection module is connected with the input end of the intelligent surface through a CAN bus, and the target detection module is used for sending the gesture recognition result to the intelligent surface through the CAN bus.
7. The gesture-controlled in-vehicle smart surface of claim 1, wherein: the intelligent surface comprises an MCU, an LED constant current driving chip and an RGB LED array;
the MCU is used for analyzing CAN bus information and acquiring a gesture instruction;
and the LED constant current driving chip receives an SPI signal sent by the MCU and performs the corresponding rhythm-effect control on the RGB LED array.
8. The gesture-controlled in-vehicle smart surface of claim 7, wherein: the gesture instructions include, but are not limited to, a thumbs-up (like), heart, or OK gesture, or the numbers 1-9.
9. A method of controlling a gesture-controlled smart surface in a vehicle according to any one of claims 1 to 9, comprising:
s1, a camera module collects hand images of a driver and a passenger, and sends a near-infrared image of the hand of the passenger to a target detection module every 10 frames;
s2, preprocessing the near-infrared image by a target detection module, and cutting the near-infrared image into an image with the resolution of 640 x 640;
s3, a target detection module firstly obtains a YOLOv5 gesture target detection algorithm through network model training, and transmits the preprocessed image into the gesture target detection algorithm to predict a result;
s4, analyzing the prediction result, and if the hand image is not an effective gesture, acquiring the hand image again; if the gesture is effective, sending gesture information to the intelligent surface through the CAN bus;
and S5, analyzing the gesture actions of the passengers by the intelligent surface through analyzing the CAN bus signals, and starting different atmosphere pattern rhythm effects according to different gesture actions.
10. The method of claim 9, wherein: the atmosphere-pattern rhythm effect is a color-switching mode, a breathing mode, a flowing-water mode, a circulation mode, or a cheerful mode.
CN202211018494.7A, filed 2022-08-24 (priority 2022-08-24): Gesture-controlled intelligent surface in vehicle and using method thereof. Status: Pending. Published as CN115359463A.

Priority Applications (1)

Application Number: CN202211018494.7A (published as CN115359463A)
Priority Date: 2022-08-24
Filing Date: 2022-08-24
Title: Gesture-controlled intelligent surface in vehicle and using method thereof

Applications Claiming Priority (1)

Application Number: CN202211018494.7A (published as CN115359463A)
Priority Date: 2022-08-24
Filing Date: 2022-08-24
Title: Gesture-controlled intelligent surface in vehicle and using method thereof

Publications (1)

Publication Number: CN115359463A
Publication Date: 2022-11-18

Family

ID=84005529

Family Applications (1)

Application Number: CN202211018494.7A (published as CN115359463A, pending)
Title: Gesture-controlled intelligent surface in vehicle and using method thereof

Country Status (1)

Country: CN
Family publication: CN115359463A

Similar Documents

Publication Number Title
CN110949248A (en) Vehicle multi-mode atmosphere lamp control system and method
CN211457428U (en) Intelligent emotion recognition atmosphere lamp based on image recognition
CN113212124B (en) Intelligent automatic adjusting method and device for car window brightness
CN208827916U (en) Vehicle steering wheel and vehicle
CN114071841B (en) Vehicle atmosphere lamp control system and method, vehicle and computer storage medium
CN114347932A (en) Vehicle personalized control system and method based on state of passenger in vehicle
CN107813677A (en) A kind of automatic air condition HVAC assembly apparatus
CN208615804U (en) Atmosphere lamp automatic regulating system and vehicle
CN115359463A (en) Gesture-controlled intelligent surface in vehicle and using method thereof
CN215068628U (en) Vehicle-mounted auxiliary traffic light prompting controller
CN213108907U (en) Air conditioner control switch control system with atmosphere lamp follow-up function and automobile
CN110001510A (en) A kind of interactive approach and system, the vehicle of vehicle and pedestrian
CN209395747U (en) Sunroof control system
CN212677409U (en) Intelligent atmosphere lamp system
CN108482279A (en) A kind of integrated alarms lamp switch and control method, automobile
CN211280814U (en) Ceiling lighting system of vehicle and vehicle
CN213019464U (en) Integrative tail lamp of electric motor car intelligence LED
CN114435231A (en) Automobile lamp control system
CN112669835A (en) Voice anthropomorphic interaction system based on vehicle-mounted intelligent robot and implementation method thereof
CN117580220B (en) Automatic control system for light brightness in automobile
CN105082938A (en) Vehicle touch screen air-conditioner control system
CN219154420U (en) Vehicle welcome system and vehicle
CN216351535U (en) Projection system for vehicle
CN220391161U (en) Closing and opening control system of automobile ceiling screen
CN217718466U (en) Take roof canopy in car of touch control function

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination