CN113696850A - Vehicle control method and device based on gestures and storage medium - Google Patents

Vehicle control method and device based on gestures and storage medium

Info

Publication number
CN113696850A
CN113696850A (application number CN202110993042.XA)
Authority
CN
China
Prior art keywords
vehicle
gesture
target
control result
vehicle control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110993042.XA
Other languages
Chinese (zh)
Inventor
朱鹤群
胡晓健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Xianta Intelligent Technology Co Ltd
Original Assignee
Shanghai Xianta Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Xianta Intelligent Technology Co Ltd filed Critical Shanghai Xianta Intelligent Technology Co Ltd
Priority to CN202110993042.XA priority Critical patent/CN113696850A/en
Publication of CN113696850A publication Critical patent/CN113696850A/en
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/20Means to switch the anti-theft system on or off
    • B60R25/2045Means to switch the anti-theft system on or off by hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a gesture-based vehicle control method and device and a storage medium. The method comprises the following steps: acquiring an image to be learned, captured by a camera module mounted on a target vehicle, together with first mapping information, wherein the image to be learned records a first gesture action of a target user, and the first mapping information indicates that the first gesture action is mapped to a first vehicle control result; determining second mapping information based on the first vehicle control result, the first gesture action and a preset association rule, wherein the second mapping information indicates that a second gesture action is mapped to a second vehicle control result; updating a mapping relation according to the first mapping information and the second mapping information; acquiring a control image captured by the camera module, wherein the control image records a gesture action to be executed by the target user; and controlling the target vehicle according to the gesture action to be executed and the mapping relation.

Description

Vehicle control method and device based on gestures and storage medium
Technical Field
The invention relates to the technical field of vehicle control, in particular to a vehicle control method and device based on gestures and a storage medium.
Background
With the continuous development of technology, automatic vehicle control functions such as automatic driving, automatic parking and automatic unlocking are becoming increasingly common on the market, alongside the traditional way of controlling a vehicle manually.
In the prior art, when a vehicle is to be unlocked, for example, an image can be collected by a camera, and if the collected image contains the figure of a user, the vehicle can be unlocked. However, this approach is not very accurate: people near the vehicle who are not its owner may also be captured, and unlocking the vehicle for them would pose a serious safety hazard. Moreover, if the user wishes to perform operations other than unlocking the vehicle, such as ignition, turning on the air conditioner or playing music, these personalized operations cannot be carried out automatically.
How to realize automatic vehicle control in a way that improves control accuracy, meets users' personalized needs and improves the user experience has therefore become a key concern in the industry.
Disclosure of Invention
The invention provides a gesture-based vehicle control method and device and a storage medium, aiming to solve the problems of inaccurate vehicle control, low safety and poor convenience.
According to a first aspect of the present invention, there is provided a gesture-based vehicle control method comprising:
acquiring an image to be learned, captured by a camera module mounted on a target vehicle, together with first mapping information, wherein the image to be learned records a first gesture action of a target user, and the first mapping information indicates that the first gesture action is mapped to a first vehicle control result;
determining second mapping information based on the first vehicle control result, the first gesture action and a preset association rule, wherein the second mapping information indicates that a second gesture action is mapped to a second vehicle control result, and the association rule defines the vehicle control result associated with the first vehicle control result (namely the second vehicle control result) and the relationship between the first gesture action and the second gesture action;
updating the stored mapping relation according to the first mapping information and the second mapping information;
acquiring a control image captured by a camera module mounted on the target vehicle, wherein the control image records a gesture action to be executed by the target user;
and controlling the target vehicle according to the gesture action to be executed and the mapping relation.
Optionally, the function of the first vehicle control result is opposite to that of the second vehicle control result.
Optionally, if the first vehicle control result includes unlocking the target vehicle, the second vehicle control result includes locking the target vehicle;
if the first vehicle control result comprises igniting the target vehicle, the second vehicle control result comprises extinguishing the target vehicle;
if the first vehicle control result comprises that the vehicle-mounted equipment of the target vehicle is opened, the second vehicle control result is that the vehicle-mounted equipment is closed;
if the first vehicle control result comprises increasing an operating parameter of a vehicle-mounted device of the target vehicle, the second vehicle control result is decreasing that operating parameter;
and if the first vehicle control result comprises that the working state of the vehicle-mounted equipment of the target vehicle is changed from a first state to a second state, the second vehicle control result is that the working state is changed from the second state to the first state.
Optionally, the relationship between the first gesture action and the second gesture action satisfies any one of the following:
the first gesture action is a motion of the target user's hand along a first line, and the second gesture action is a motion of the target user's hand along a second line, where the first line and the second line have the same or similar shapes but opposite directions of movement;
the first gesture action is a motion of the target user's hand along a first figure, and the second gesture action is a motion of the target user's hand along a second figure, where the first figure and the second figure have the same or similar shapes but differ by a specified deflection angle or are mirror images of each other;
the first gesture action and the second gesture action are the same gesture action.
Optionally, the controlling the target vehicle according to the gesture motion to be executed and the mapping relationship includes:
and determining a target control result based on the gesture action to be executed and the mapping relation, and controlling the target vehicle to execute the target control result.
Optionally, the target control result includes at least one of:
ignition, flameout, vehicle locking, advancing, backing, executing a parking process, turning on the vehicle-mounted equipment, turning off the vehicle-mounted equipment, and controlling the vehicle-mounted equipment to generate specified change.
Optionally, the method further includes:
determining a motion frequency of the gesture motion to be performed in the control image;
the controlling the target vehicle to execute the target control result includes:
and controlling the target vehicle based on the action frequency and the target control result.
Optionally, the target control result includes: a movement of the target vehicle, the movement including traveling or backing up;
controlling the target vehicle based on the action frequency and the target control result, including:
controlling the movement of the target vehicle and adapting the speed of the target vehicle to the change in the action frequency.
Optionally, matching the vehicle speed of the target vehicle to the change in action frequency includes:
increasing the vehicle speed of the target vehicle when the action frequency is greater than a preset frequency threshold;
reducing the vehicle speed of the target vehicle when the action frequency is less than the frequency threshold.
According to a second aspect of the present invention, there is provided a gesture-based vehicle control apparatus comprising:
a first acquisition module, configured to acquire an image to be learned, captured by a camera module mounted on a target vehicle, together with first mapping information, wherein the image to be learned records a first gesture action of a target user, and the first mapping information indicates that the first gesture action is mapped to a first vehicle control result;
a mapping determination module, configured to determine second mapping information based on the first vehicle control result, the first gesture action and a preset association rule, wherein the second mapping information indicates that a second gesture action is mapped to a second vehicle control result, and the association rule defines the vehicle control result associated with the first vehicle control result (namely the second vehicle control result) and the relationship between the first gesture action and the second gesture action;
an updating module, configured to update the stored mapping relation according to the first mapping information and the second mapping information;
a second acquisition module, configured to acquire a control image captured by a camera module mounted on the target vehicle, wherein the control image records a gesture action to be executed by the target user;
and a control module, configured to control the target vehicle according to the gesture action to be executed and the mapping relation.
According to a third aspect of the present invention, there is provided a storage medium having a program stored thereon, wherein the program, when executed by a processor, performs the steps of the method of the first aspect.
According to a fourth aspect of the present invention, there is provided an electronic device comprising a memory, a processor and a program stored on the memory and executable on the processor, wherein the processor implements the steps of the method of the first aspect when executing the program.
According to the gesture-based vehicle control method provided by the invention, after the first gesture action and the first mapping information are learned, the second mapping information can be determined automatically based on the first vehicle control result indicated by the first mapping information, the first gesture action, and the preset association rule. A mapping between the second gesture action and the second vehicle control result is thereby established, which effectively improves the efficiency of establishing and updating the mapping relation. Applying the association rule also ensures that the second mapping information is related to the first mapping information, making the gesture actions easy for the user to memorize and use.
In addition, since the mapping between gesture actions and control results is formed through learning, and control is performed based on gesture actions, the accuracy and safety of vehicle control can be improved. Moreover, users can customize their own gesture actions, which meets their personalized needs and improves the user experience.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a schematic flow diagram illustrating a gesture-based vehicle control method in accordance with an exemplary embodiment of the present invention;
FIG. 2 is a schematic flow diagram illustrating another gesture-based vehicle control method in accordance with an exemplary embodiment of the present invention;
FIG. 3 is a hardware block diagram of an electronic device in which a gesture-based vehicle control apparatus is located according to an exemplary embodiment of the present invention;
FIG. 4 is a block diagram of a gesture-based vehicle control apparatus, according to an exemplary embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The technical solution of the present invention will be described in detail below with specific examples. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
Referring to fig. 1, fig. 1 is a schematic flowchart illustrating a gesture-based vehicle control method according to an exemplary embodiment of the invention. The method may be applied to an electronic device having a memory and a processor, and may include the following steps:
Step 102: acquiring an image to be learned, captured by a camera module mounted on a target vehicle, together with first mapping information, wherein the image to be learned records a first gesture action of a target user, and the first mapping information indicates that the first gesture action is mapped to a first vehicle control result;
Step 104: determining second mapping information based on the first vehicle control result, the first gesture action and a preset association rule, wherein the second mapping information indicates that a second gesture action is mapped to a second vehicle control result, and the association rule defines the vehicle control result associated with the first vehicle control result (namely the second vehicle control result) and the relationship between the first gesture action and the second gesture action;
Step 106: updating the stored mapping relation according to the first mapping information and the second mapping information;
Step 108: acquiring a control image captured by a camera module mounted on the target vehicle, wherein the control image records a gesture action to be executed by the target user;
Step 110: controlling the target vehicle according to the gesture action to be executed and the mapping relation.
The above steps are explained in detail below.
In this embodiment, the target vehicle may be equipped with a camera module. The camera module may be a camera, for example the camera of a driving recorder or a specially installed camera, which is not limited herein. The camera module can capture images of the target vehicle's surroundings in real time. The camera module that collects the control image and the one that collects the image to be learned may be the same module or different modules.
The first mapping information may be formed from user input or received from an external source.
The mapping relation refers to the mapping between each gesture action and the corresponding vehicle control result.
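As an illustration only (the patent does not prescribe a data structure), the stored mapping relation can be pictured as a small lookup table from gesture actions to vehicle control results, updated with the first and second mapping information; every identifier below is an assumption, not patent text:

```python
# Illustrative sketch of the stored mapping relation described above.
# The gesture and result identifiers are assumptions, not patent text.

class GestureMapping:
    def __init__(self):
        # gesture action identifier -> vehicle control result
        self._table = {}

    def update(self, *mapping_infos):
        """Write (gesture action, control result) pairs into the store."""
        for gesture, result in mapping_infos:
            self._table[gesture] = result

    def lookup(self, gesture):
        """Return the control result mapped to a gesture, or None."""
        return self._table.get(gesture)


mapping = GestureMapping()
first_mapping_info = ("circle_clockwise", "unlock")        # learned from user
second_mapping_info = ("circle_counterclockwise", "lock")  # derived by rule
mapping.update(first_mapping_info, second_mapping_info)
print(mapping.lookup("circle_clockwise"))  # unlock
```

In such a sketch, a gesture with no stored mapping simply yields no control result, so unrecognized gestures leave the vehicle untouched.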
In this embodiment, the function of the first vehicle control result is opposite to that of the second vehicle control result; for example:
if the first vehicle control result comprises unlocking the target vehicle, the second vehicle control result comprises locking the target vehicle;
if the first vehicle control result comprises igniting the target vehicle, the second vehicle control result comprises extinguishing the target vehicle;
if the first vehicle control result comprises that the vehicle-mounted equipment of the target vehicle is opened, the second vehicle control result is that the vehicle-mounted equipment is closed;
if the first vehicle control result comprises increasing an operating parameter of a vehicle-mounted device of the target vehicle, the second vehicle control result is decreasing that operating parameter;
if the first vehicle control result comprises that the working state of the vehicle-mounted equipment of the target vehicle is changed from a first state to a second state, the second vehicle control result is that the working state is changed from the second state to the first state;
if the first vehicle control result includes the traveling of the target vehicle, the second vehicle control result includes the backing of the target vehicle.
The on-board device may be any device mounted on the vehicle, such as an air conditioner, a playing device, a display, an air purifier, an image capturing device, and the like.
The operating parameter may be any parameter formed during the operation of the above vehicle-mounted device, such as the temperature of an air conditioner, the volume of a playing device, and the like.
The first state and the second state may be any information describing the operating state of the vehicle-mounted device, for example, the first state of the air conditioner may be a cooling state, the second state of the air conditioner may be a heating state, the first state of the playback device may be a playback state, and the second state of the playback device may be a pause state.
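The opposite-function pairs listed above can be encoded as a simple table. The following is a hedged sketch; the result identifiers are illustrative assumptions rather than names from the patent:

```python
# Hypothetical encoding of the opposite-function association rule.
# All result identifiers are illustrative assumptions.

OPPOSITE_RESULTS = {
    "unlock": "lock",
    "ignite": "flameout",
    "device_on": "device_off",
    "parameter_up": "parameter_down",
    "first_to_second_state": "second_to_first_state",
}

def second_control_result(first_result):
    """Derive the second vehicle control result associated with the
    first one under the opposite-function rule."""
    return OPPOSITE_RESULTS[first_result]

print(second_control_result("unlock"))  # lock
```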
In other examples, the first vehicle control result and the second vehicle control result may also be associated vehicle control results whose functions are not opposite, such as igniting the vehicle and turning on the air conditioner, or turning on the air conditioner and adjusting its temperature.
In this embodiment, the relationship between the first gesture action and the second gesture action satisfies any one of the following conditions:
the first gesture action is a motion of the target user's hand along a first line, and the second gesture action is a motion of the target user's hand along a second line, where the first line and the second line have the same or similar shapes but opposite directions of movement;
the first gesture action is a motion of the target user's hand along a first figure, and the second gesture action is a motion of the target user's hand along a second figure, where the first figure and the second figure have the same or similar shapes but differ by a specified deflection angle or are mirror images of each other;
the first gesture action and the second gesture action are the same gesture action.
In a specific example, the skeleton nodes of the target user can be identified in the image, and the gesture made by the target user's hand can be judged from the motion of those skeleton nodes. When the first gesture is circling clockwise, the second gesture associated with it may be circling counterclockwise. When the first gesture is drawing an upright triangle, the associated second gesture may be drawing an inverted triangle (which can be understood either as a figure symmetrical to the upright triangle or as the upright triangle deflected by 180 degrees). When the first gesture is an upward swipe, the associated second gesture may be a downward swipe; when the first gesture is a leftward swipe, the associated second gesture may be a rightward swipe.
Of course, the above examples are merely exemplary, and other related actions may exist in practical applications, which are not examples herein.
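As one hypothetical way to distinguish the clockwise and counterclockwise circling mentioned above from a skeleton-node trajectory, the signed area of the traced path (shoelace formula) gives the winding direction. The sampling scheme and coordinate convention here are assumptions, not part of the patent:

```python
# Distinguish clockwise from counterclockwise circling using the signed
# area of the hand trajectory (shoelace formula). Image coordinates are
# assumed, with y increasing downward, so a positive signed area means
# the path winds clockwise on screen.

def circling_direction(points):
    """points: list of (x, y) hand positions sampled over time."""
    doubled_area = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        doubled_area += x1 * y2 - x2 * y1
    return "clockwise" if doubled_area > 0 else "counterclockwise"

# A square traced top-left -> top-right -> bottom-right -> bottom-left
# winds clockwise on screen.
print(circling_direction([(0, 0), (1, 0), (1, 1), (0, 1)]))  # clockwise
```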
In this embodiment, a mapping relation between each gesture action and a vehicle control result may also be stored, and the mapping relation may be updated based on the first mapping information and the second mapping information, that is, the mappings represented by the first mapping information and the second mapping information are written into the stored mapping relation. In addition, the stored mapping relation may also contain mappings that were not determined through the above steps.
In a specific implementation of step 110, a target control result may be determined based on the gesture action to be executed and the mapping relation, and the target vehicle may be controlled to execute the target control result.
Specifically, a target gesture action can be located in the mapping relation based on the gesture action to be executed; the vehicle control result mapped to that target gesture action is the target control result, and the target vehicle can then be controlled to execute it.
The target control result may include at least one of:
ignition, flameout, locking the vehicle, traveling, backing up, performing a parking process, turning on a vehicle-mounted device, turning off a vehicle-mounted device, and controlling a vehicle-mounted device to make a specified change (e.g. increasing or decreasing an operating parameter, or switching between the first state and the second state mentioned earlier).
For example, suppose the first gesture is a clockwise circle and the associated second gesture is a counterclockwise circle, with the first vehicle control result being unlocking and the second vehicle control result being locking. Then, when the gesture action to be executed is circling clockwise, it may be determined that the vehicle control result it maps to (i.e., the target control result) is unlocking.
By adopting the above method, after the first gesture action and the first mapping information are learned, the second mapping information can be determined automatically based on the first vehicle control result indicated by the first mapping information, the first gesture action, and the preset association rule. A mapping between the second gesture action and the second vehicle control result is thereby established, which effectively improves the efficiency of establishing and updating the mapping relation. Applying the association rule also ensures that the second mapping information is related to the first mapping information, making the gesture actions easy for the user to memorize and use.
In addition, since the mapping between gesture actions and control results is formed through learning, and control is performed based on gesture actions, the accuracy and safety of vehicle control can be improved. Moreover, users can customize their own gesture actions, which meets their personalized needs and improves the user experience.
The following describes another embodiment of a gesture-based vehicle control method provided by the present invention.
Referring to fig. 2, fig. 2 is a schematic flow chart illustrating another gesture-based vehicle control method according to an exemplary embodiment of the invention.
In one example, the target vehicle may be loaded with an electronic device having computing capabilities, which is capable of executing the steps of the method of the present embodiment, and the method of the present embodiment may be applied to the electronic device.
In another example, only a camera module for capturing images is loaded on the target vehicle, the camera module can send the captured images to a server, and the server executes the steps of the method described in this embodiment, so that the method described in this embodiment can be applied to the server.
With respect to step 110:
in one example, when matching, a gesture motion to be performed may be extracted from the control image, and a gesture feature may be extracted, where the gesture feature may be, for example, a position and/or a position change of a skeleton node of the gesture motion, and then the gesture feature of the gesture motion to be performed is matched with gesture features of some existing gesture motions, so as to obtain a matched target gesture motion.
In another example, upon matching, the control image may also be input to a gesture recognition model that may predict a target gesture action that results in a match. Wherein the gesture recognition model may be trained based on only a large number of gesture motions.
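A minimal sketch of the first, feature-matching variant, assuming gesture features have already been reduced to fixed-length vectors; the similarity measure, threshold, and all names are assumptions, not from the patent:

```python
import math

# Match an extracted gesture feature vector against stored gesture
# features by cosine similarity. The 0.9 threshold is an assumed value.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def match_gesture(feature, stored, threshold=0.9):
    """Return the stored gesture best matching `feature`, or None if no
    stored gesture reaches the similarity threshold."""
    best_name, best_sim = None, threshold
    for name, stored_feature in stored.items():
        sim = cosine_similarity(feature, stored_feature)
        if sim >= best_sim:
            best_name, best_sim = name, sim
    return best_name

stored = {
    "circle_clockwise": [1.0, 0.0, 0.2],
    "swipe_up": [0.0, 1.0, 0.0],
}
print(match_gesture([0.9, 0.1, 0.25], stored))  # circle_clockwise
```

Returning None for below-threshold matches mirrors the safety concern in the background section: an ambiguous gesture is better ignored than mapped to the wrong control result.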
In this embodiment, if a matching target gesture action is obtained, the vehicle control result corresponding to the target gesture action may be obtained as the target control result based on the pre-stored mapping between gesture actions and vehicle control results, and the target vehicle may be controlled based on the target control result.
In an embodiment of the present invention, referring to fig. 2, the method may further include:
Step 201: determining, based on the control image, the action frequency of the gesture action to be executed.
For the specific method of determining the action frequency, reference may be made to the related art, which is not described in detail here.
Correspondingly, step 110 may include:
step 1101: and controlling the target vehicle based on the action frequency and the target control result.
If the target control result comprises: controlling the target vehicle to move, wherein the movement comprises traveling or reversing, then: step 1101 may specifically include:
controlling the target vehicle to move and enabling the vehicle speed of the target vehicle to be adapted to the action frequency change.
Specifically, a frequency threshold may be preset, and its specific value may be set according to the actual application scenario. It can then be judged whether the action frequency of the gesture action to be executed is greater than the frequency threshold; if so, the vehicle speed can be increased while the target vehicle moves; if not, the vehicle speed can be reduced.
For example, the gesture action to be executed may be circling clockwise, and the corresponding vehicle control result may be reversing (i.e., backing up) during parking. As long as the clockwise circling does not stop, the vehicle can be controlled to keep backing up. The target user can then vary the action frequency (e.g., the frequency of circling): if the frequency rises, the vehicle can be controlled to move faster; if it falls, the vehicle can be controlled to move more slowly. Of course, this is merely an illustrative example, and other methods may be adopted in practical applications.
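The frequency-to-speed adaptation described above can be sketched as a simple threshold rule; the threshold and step values below are illustrative assumptions:

```python
# Adapt vehicle speed to the gesture action frequency: speed up while
# the frequency exceeds a preset threshold, slow down while it falls
# below it. Threshold and step values are assumptions.

FREQUENCY_THRESHOLD = 1.0  # e.g. circles per second
SPEED_STEP = 0.5           # speed change per control tick, arbitrary units

def adjust_speed(current_speed, action_frequency,
                 threshold=FREQUENCY_THRESHOLD, step=SPEED_STEP):
    if action_frequency > threshold:
        return current_speed + step
    if action_frequency < threshold:
        return max(0.0, current_speed - step)  # never go below standstill
    return current_speed

print(adjust_speed(2.0, 1.5))  # 2.5
```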
In addition, in the mapping relationship, in addition to the above-mentioned first mapping information and second mapping information, other custom gesture actions and mapping information corresponding to the custom gesture actions may be recorded, and other standard gesture actions and mapping information corresponding to the standard gesture actions may also be recorded.
In this embodiment, it may also be detected whether an identity feature of the target user is present, and the identity of the target user may then be recognized based on that feature.
In one example, the identity feature may be a face feature of the target user, and the face image of the target user may be acquired based on a camera module mounted on the target vehicle, and then the face feature may be extracted, and the identity of the target user may be identified based on the face feature.
In another example, the identity feature may be a voice feature of the target user, such as a voiceprint, and the voice of the target user may be collected based on a voice collection module, such as a microphone, installed on the target vehicle, and then the voice feature may be extracted, and the identity of the target user may be recognized based on the voice feature.
Of course, besides the above examples, the identity of the target user may also be recognized in other ways, for example according to the Bluetooth or Wi-Fi signal of the target user's mobile phone, which is not limited herein.
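A minimal sketch of the identity check described above might try whichever identity features were detected in a fixed preference order. The feature kinds, lookup tables, and order below are illustrative assumptions; real face or voiceprint matching would use a recognition model rather than exact lookup:

```python
def identify_user(features, registry):
    """Resolve the target user's identity from whichever identity features
    are present.

    `features` maps a feature kind ('face', 'voiceprint', 'bt_mac') to the
    value extracted from the sensors; `registry` maps each kind to a
    {value: user_id} table of enrolled users. Returns the first matched
    user id, or None if no feature identifies the target user.
    """
    for kind in ('face', 'voiceprint', 'bt_mac'):   # illustrative preference order
        value = features.get(kind)
        if value is not None and value in registry.get(kind, {}):
            return registry[kind][value]
    return None                                     # no feature matched
```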
Corresponding to the embodiment of the vehicle control method based on the gesture, the invention further provides an embodiment of a vehicle control device based on the gesture.
The embodiment of the gesture-based vehicle control device can be applied to an electronic device. The device embodiments may be implemented by software, by hardware, or by a combination of hardware and software. Taking a software implementation as an example, the device, as a logical device, is formed by the processor of the electronic device where it is located reading corresponding computer program instructions from non-volatile memory into memory and running them. At the hardware level, fig. 3 shows a hardware structure diagram of the electronic device where the gesture-based vehicle control apparatus of the present invention is located.
Referring to fig. 3, an electronic device 30 is provided, which includes:
a processor 31; and
a memory 32 for storing executable instructions of the processor;
wherein the processor 31 is configured to perform the above-mentioned method via execution of the executable instructions.
The processor 31 is capable of communicating with the memory 32 via a bus 33.
In addition to the processor, the memory, and the bus shown in fig. 3, the electronic device in which the apparatus is located in the embodiment may also include other hardware according to the actual function of the electronic device, which is not described again.
Referring to fig. 4, fig. 4 is a block diagram of a gesture-based vehicle control apparatus according to an exemplary embodiment of the present invention. The apparatus may be applied to the electronic device shown in fig. 3, and the gesture-based vehicle control apparatus 4 includes:
a first obtaining unit 410, configured to obtain an image to be learned and first mapping information, where the image to be learned records a first gesture of a target user, and the first mapping information represents that the first gesture is mapped to a first vehicle control result, where the image to be learned is acquired by a camera module mounted on a target vehicle;
a mapping determining unit 420, configured to determine second mapping information based on the first vehicle control result, the first gesture action, and a preset association rule, where the second mapping information represents that the second gesture action is mapped to the second vehicle control result, and the association rule defines that the vehicle control result associated with the first vehicle control result is the second vehicle control result, as well as the relationship between the first gesture action and the second gesture action;
an updating unit 430, configured to update the stored mapping relationship according to the first mapping information and the second mapping information;
a second obtaining unit 440, configured to obtain a control image collected by a camera module mounted on the target vehicle, where the control image records the gesture action to be executed by the target user;
the control unit 450 is configured to control the target vehicle according to the gesture motion to be performed and the mapping relationship.
Optionally, the first vehicle control result and the second vehicle control result are functionally opposite vehicle control results.
Optionally, if the first vehicle control result includes unlocking the target vehicle, the second vehicle control result includes locking the target vehicle;
if the first vehicle control result includes igniting the target vehicle, the second vehicle control result includes flameout of the target vehicle;
if the first vehicle control result includes turning on vehicle-mounted equipment of the target vehicle, the second vehicle control result is turning off the vehicle-mounted equipment;
if the first vehicle control result includes increasing a working parameter of the vehicle-mounted equipment of the target vehicle, the second vehicle control result is decreasing the working parameter;
if the first vehicle control result includes changing the working state of the vehicle-mounted equipment of the target vehicle from a first state to a second state, the second vehicle control result is changing the working state from the second state to the first state;
if the first vehicle control result includes forward traveling of the target vehicle, the second vehicle control result includes backing of the target vehicle.
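As a sketch only (the pair names are illustrative, not prescribed by the specification), the "functionally opposite" pairing above could be encoded as a table, and the second mapping information derived from the first via the association rule's gesture transform:

```python
# Illustrative encoding of the functionally opposite control-result pairs:
# unlock/lock, ignite/flameout, equipment on/off, parameter up/down,
# forward travel/backing.
OPPOSITE_RESULT = {
    'unlock': 'lock',
    'ignite': 'flameout',
    'equipment_on': 'equipment_off',
    'param_up': 'param_down',
    'forward': 'reverse',
}

def derive_second_mapping(first_gesture, first_result, transform_gesture):
    """Apply the association rule: the second vehicle control result is the
    one associated (here: functionally opposite) to the first, and the
    second gesture action is obtained from the first gesture action via the
    rule's gesture transform (e.g., reversing the motion direction)."""
    second_result = OPPOSITE_RESULT[first_result]
    second_gesture = transform_gesture(first_gesture)
    return second_gesture, second_result
```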
Optionally, the relationship between the first gesture action and the second gesture action satisfies any one of the following:
the first gesture action is a gesture action of the target user's hand moving along a first line, and the second gesture action is a gesture action of the target user's hand moving along a second line, wherein the first line and the second line are the same or similar in shape but opposite in movement direction;
the first gesture action is a gesture action of the target user's hand moving along a first figure, and the second gesture action is a gesture action of the target user's hand moving along a second figure, wherein the first figure and the second figure are the same or similar in shape but differ by a specified deflection angle or are symmetrical to each other;
the first gesture action and the second gesture action are the same gesture action.
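The first two relationships above can be illustrated with small trajectory transforms, under the simplifying assumption (not fixed by the specification) that a gesture action is represented as a sequence of 2-D points:

```python
import math

def reverse_trajectory(points):
    """Second gesture as the first line traced in the opposite direction:
    same shape, reversed motion order."""
    return list(reversed(points))

def rotate_trajectory(points, angle_deg):
    """Second gesture as the first figure deflected by a specified angle
    about the origin (standard 2-D rotation)."""
    a = math.radians(angle_deg)
    return [(x * math.cos(a) - y * math.sin(a),
             x * math.sin(a) + y * math.cos(a)) for x, y in points]
```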
Optionally, controlling the target vehicle according to the gesture motion to be performed and the mapping relationship includes:
determining a target control result based on the gesture action to be executed and the mapping relationship, and controlling the target vehicle to execute the target control result.
Optionally, the target control result includes at least one of:
ignition, flameout, vehicle locking, advancing, backing, executing a parking process, turning on the vehicle-mounted equipment, turning off the vehicle-mounted equipment, and controlling the vehicle-mounted equipment to generate specified change.
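A minimal sketch of determining and executing the target control result from the stored mapping relationship might look as follows; the `Vehicle` class is a toy stand-in for the real vehicle control interface, and all names are illustrative:

```python
class Vehicle:
    """Toy stand-in for the target vehicle's control interface."""
    def __init__(self):
        self.log = []                    # records executed control results
    def ignite(self):   self.log.append('ignite')
    def flameout(self): self.log.append('flameout')
    def lock(self):     self.log.append('lock')

def control_vehicle(gesture, mapping, vehicle):
    """Look up the gesture action to be executed in the stored mapping
    relationship and have the vehicle execute the target control result."""
    result = mapping.get(gesture)
    if result is None:
        return None                      # unrecognized gesture: do nothing
    getattr(vehicle, result)()           # e.g., vehicle.ignite()
    return result
```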
Optionally, the apparatus further includes:
a frequency determination unit, configured to determine the action frequency of the gesture action to be executed in the control image;
the control unit is specifically configured to:
controlling the target vehicle based on the action frequency and the target control result.
Optionally, the target control result includes: controlling the target vehicle to move, wherein the movement comprises traveling or backing up;
the control unit is specifically configured to:
controlling the target vehicle to move, and adapting the vehicle speed of the target vehicle to changes in the action frequency.
Optionally, the control unit is specifically configured to:
when the action frequency is greater than a preset frequency threshold, increasing the vehicle speed of the target vehicle;
when the action frequency is less than the frequency threshold, reducing the vehicle speed of the target vehicle.
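The threshold rule above can be sketched as a small speed-adjustment function; the step size and speed limits are illustrative assumptions, not values from the specification:

```python
def adjust_speed(current_speed, action_frequency, freq_threshold,
                 step=1.0, min_speed=0.0, max_speed=5.0):
    """Adapt vehicle speed to the gesture's action frequency: above the
    preset threshold speed up by `step`, below it slow down by `step`,
    clamped to [min_speed, max_speed] (units and values illustrative)."""
    if action_frequency > freq_threshold:
        return min(current_speed + step, max_speed)
    if action_frequency < freq_threshold:
        return max(current_speed - step, min_speed)
    return current_speed                 # exactly at threshold: hold speed
```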
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the scheme of the application. One of ordinary skill in the art can understand and implement it without inventive effort.
Embodiments of the present invention also provide a computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements the above-mentioned method.
Those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be completed by program instructions running on related hardware. The program may be stored in a computer-readable storage medium; when executed, the program performs the steps of the above method embodiments. The aforementioned storage medium includes various media that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (12)

1. A gesture-based vehicle control method, comprising:
acquiring an image to be learned and first mapping information, wherein the image to be learned is acquired by a camera module loaded on a target vehicle and records a first gesture action of a target user, and the first mapping information represents that the first gesture action is mapped to a first vehicle control result;
determining second mapping information based on the first vehicle control result, the first gesture action and a preset association rule, wherein the second mapping information represents that the second gesture action is mapped to the second vehicle control result, and the association rule defines that the vehicle control result associated with the first vehicle control result is the second vehicle control result, as well as the relationship between the first gesture action and the second gesture action;
updating the stored mapping relation according to the first mapping information and the second mapping information;
acquiring a control image acquired by a camera module loaded on a target vehicle, wherein the control image records gesture actions to be executed of a target user;
and controlling the target vehicle according to the gesture action to be executed and the mapping relation.
2. The method of claim 1, wherein the first vehicle control result and the second vehicle control result are functionally opposite vehicle control results.
3. The method of claim 2,
if the first vehicle control result includes unlocking the target vehicle, the second vehicle control result includes locking the target vehicle;
if the first vehicle control result includes igniting the target vehicle, the second vehicle control result includes flameout of the target vehicle;
if the first vehicle control result includes turning on vehicle-mounted equipment of the target vehicle, the second vehicle control result is turning off the vehicle-mounted equipment;
if the first vehicle control result includes increasing a working parameter of the vehicle-mounted equipment of the target vehicle, the second vehicle control result is decreasing the working parameter;
if the first vehicle control result includes changing the working state of the vehicle-mounted equipment of the target vehicle from a first state to a second state, the second vehicle control result is changing the working state from the second state to the first state;
if the first vehicle control result includes forward traveling of the target vehicle, the second vehicle control result includes backing of the target vehicle.
4. The method of claim 1, wherein the relationship between the first gesture action and the second gesture action satisfies any one of the following:
the first gesture action is a gesture action of the target user's hand moving along a first line, and the second gesture action is a gesture action of the target user's hand moving along a second line, wherein the first line and the second line are the same or similar in shape but opposite in movement direction;
the first gesture action is a gesture action of the target user's hand moving along a first figure, and the second gesture action is a gesture action of the target user's hand moving along a second figure, wherein the first figure and the second figure are the same or similar in shape but differ by a specified deflection angle or are symmetrical to each other;
the first gesture action and the second gesture action are the same gesture action.
5. The method according to any one of claims 1 to 4, wherein controlling the target vehicle according to the mapping relation and the gesture action to be performed comprises:
determining a target control result based on the gesture action to be executed and the mapping relationship, and controlling the target vehicle to execute the target control result.
6. The method of claim 5,
the target control result includes at least one of:
ignition, flameout, vehicle locking, advancing, backing, executing a parking process, turning on the vehicle-mounted equipment, turning off the vehicle-mounted equipment, and controlling the vehicle-mounted equipment to generate specified change.
7. The method of claim 5, further comprising:
determining the action frequency of the gesture action to be executed in the control image;
the controlling the target vehicle to execute the target control result includes:
controlling the target vehicle based on the action frequency and the target control result.
8. The method of claim 7, wherein the target control result comprises: controlling the target vehicle to move, wherein the movement comprises traveling or backing up;
controlling the target vehicle based on the action frequency and the target control result, including:
controlling the target vehicle to move, and adapting the vehicle speed of the target vehicle to changes in the action frequency.
9. The method of claim 8, wherein adapting the vehicle speed of the target vehicle to changes in the action frequency comprises:
when the action frequency is greater than a preset frequency threshold, increasing the vehicle speed of the target vehicle;
when the action frequency is less than the frequency threshold, reducing the vehicle speed of the target vehicle.
10. A gesture-based vehicle control apparatus, comprising:
a first acquisition module, configured to acquire an image to be learned and first mapping information, wherein the image to be learned is acquired by a camera module loaded on a target vehicle and records a first gesture action of a target user, and the first mapping information represents that the first gesture action is mapped to a first vehicle control result;
a mapping determination module, configured to determine second mapping information based on the first vehicle control result, the first gesture action and a preset association rule, wherein the second mapping information represents that the second gesture action is mapped to the second vehicle control result, and the association rule defines that the vehicle control result associated with the first vehicle control result is the second vehicle control result, as well as the relationship between the first gesture action and the second gesture action;
the updating module is used for updating the stored mapping relation according to the first mapping information and the second mapping information;
the second acquisition module is used for acquiring a control image acquired by a camera module loaded on a target vehicle, and the control image records gesture actions to be executed of the target user;
and the control module is used for controlling the target vehicle according to the gesture action to be executed and the mapping relation.
11. A storage medium having a program stored thereon, wherein the program, when executed by a processor, performs the steps of the method of any one of claims 1-9.
12. An electronic device comprising a memory, a processor and a program stored on the memory and executable on the processor, wherein the steps of the method of any of claims 1-9 are implemented when the program is executed by the processor.
CN202110993042.XA 2021-08-27 2021-08-27 Vehicle control method and device based on gestures and storage medium Pending CN113696850A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110993042.XA CN113696850A (en) 2021-08-27 2021-08-27 Vehicle control method and device based on gestures and storage medium

Publications (1)

Publication Number Publication Date
CN113696850A (en) 2021-11-26

Family

ID=78655722

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110993042.XA Pending CN113696850A (en) 2021-08-27 2021-08-27 Vehicle control method and device based on gestures and storage medium

Country Status (1)

Country Link
CN (1) CN113696850A (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050212751A1 (en) * 2004-03-23 2005-09-29 Marvit David L Customizable gesture mappings for motion controlled handheld devices
US20100306716A1 (en) * 2009-05-29 2010-12-02 Microsoft Corporation Extending standard gestures
CN102782612A (en) * 2010-02-24 2012-11-14 诺基亚公司 Gesture control
CN104216514A (en) * 2014-07-08 2014-12-17 深圳市华宝电子科技有限公司 Method and device for controlling vehicle-mounted device, and vehicle
CN105426658A (en) * 2015-10-29 2016-03-23 东莞酷派软件技术有限公司 Vehicle pre-starting method and related apparatus
CN107765853A (en) * 2017-10-13 2018-03-06 广东欧珀移动通信有限公司 Using method for closing, device, storage medium and electronic equipment
CN109032358A (en) * 2018-08-27 2018-12-18 百度在线网络技术(北京)有限公司 The control method and device of AR interaction dummy model based on gesture identification
CN110297545A (en) * 2019-07-01 2019-10-01 京东方科技集团股份有限公司 Gestural control method, gesture control device and system and storage medium
CN111625086A (en) * 2020-04-24 2020-09-04 爱驰汽车有限公司 Vehicle interaction method, system, device and storage medium based on user action
CN111813321A (en) * 2020-08-12 2020-10-23 Oppo广东移动通信有限公司 Gesture control method and related device


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
AD01 Patent right deemed abandoned

Effective date of abandoning: 20240308