CN108460354B - Unmanned aerial vehicle control method and device, unmanned aerial vehicle and system - Google Patents

Unmanned aerial vehicle control method and device, unmanned aerial vehicle and system

Info

Publication number
CN108460354B
CN108460354B (granted publication of application CN201810199832.9A)
Authority
CN
China
Prior art keywords
posture, image, gesture, unmanned aerial vehicle
Prior art date
Legal status (an assumption from automated analysis, not a legal conclusion)
Expired - Fee Related
Application number
CN201810199832.9A
Other languages
Chinese (zh)
Other versions
CN108460354A (en)
Inventor
Inventor not disclosed
Current Assignee (the listed assignees may be inaccurate)
Shenzhen Zhendi Information Technology Co ltd
Original Assignee
Shenzhen Zhendi Information Technology Co ltd
Priority date (an assumption, not a legal conclusion)
Filing date
Publication date
Application filed by Shenzhen Zhendi Information Technology Co ltd filed Critical Shenzhen Zhendi Information Technology Co ltd
Priority to CN201810199832.9A priority Critical patent/CN108460354B/en
Publication of CN108460354A publication Critical patent/CN108460354A/en
Application granted granted Critical
Publication of CN108460354B publication Critical patent/CN108460354B/en

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 — Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 — Movements or behaviour, e.g. gesture recognition
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 — Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 — Information retrieval of still image data
    • G06F16/58 — Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583 — Retrieval using metadata automatically derived from the content

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Library & Information Science (AREA)
  • General Physics & Mathematics (AREA)
  • Psychiatry (AREA)
  • Human Computer Interaction (AREA)
  • Social Psychology (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides an unmanned aerial vehicle control method and device, an unmanned aerial vehicle, and a system, relating to the technical field of unmanned aerial vehicles. The method is executed by the unmanned aerial vehicle and comprises the following steps: monitoring a current posture of a user; determining the operation to be currently executed according to the posture; and executing the determined operation. The invention enables the user to operate the unmanned aerial vehicle without holding a ground terminal, thereby providing great convenience for the user.

Description

Unmanned aerial vehicle control method and device, unmanned aerial vehicle and system
Technical Field
The invention relates to the technical field of unmanned aerial vehicles, in particular to an unmanned aerial vehicle control method, an unmanned aerial vehicle control device, an unmanned aerial vehicle and a system.
Background
A traditional unmanned aerial vehicle requires the user to operate it through a ground terminal such as a remote controller or a mobile phone provided with a remote-control APP. Because this mode requires the user to hold the ground terminal, it imposes considerable posture restrictions on the user and brings much inconvenience. For example, when the user operates the unmanned aerial vehicle to take an aerial photograph, holding the ground terminal makes it inconvenient to strike other poses; conversely, if the user wants to strike other poses, it is inconvenient to operate the ground terminal to control the unmanned aerial vehicle.
For the problem in the prior art that the user must hold a ground terminal to control the unmanned aerial vehicle, which is rather inconvenient, no effective solution has yet been proposed.
Disclosure of Invention
In view of this, the present invention provides an unmanned aerial vehicle control method and apparatus, an unmanned aerial vehicle, and a system, which can solve the technical problem in the prior art that a user must hold a ground terminal to operate an unmanned aerial vehicle, which is inconvenient.
In order to achieve the above purpose, the embodiment of the present invention adopts the following technical solutions:
in a first aspect, an embodiment of the present invention provides an unmanned aerial vehicle control method, where the method is executed by an unmanned aerial vehicle and includes: monitoring a current posture of a user; determining the operation to be currently executed according to the posture; and executing the determined operation.
With reference to the first aspect, an embodiment of the present invention provides a first possible implementation manner of the first aspect, where the step of monitoring a current posture of the user includes: and acquiring the current posture image of the user through the camera of the unmanned aerial vehicle.
With reference to the first possible implementation manner of the first aspect, an embodiment of the present invention provides a second possible implementation manner of the first aspect, where the step of determining, according to the posture, the operation to be currently executed includes: searching a pre-stored operation database for an operation instruction corresponding to the current posture image, wherein the operation database stores the correspondence between posture images and operation instructions, and the operation instructions comprise photographing instructions and/or flight control instructions; and determining the operation to be currently executed according to the operation instruction.
With reference to the first possible implementation manner of the first aspect, an embodiment of the present invention provides a third possible implementation manner of the first aspect, where the step of determining, according to the posture, the operation to be currently executed includes: searching a pre-established posture database for an image matching the current posture image, wherein the posture database stores posture images corresponding to a plurality of photographing trigger postures; and if such an image exists, determining that the posture corresponding to the current posture image is a photographing trigger posture and that the operation to be currently executed is a photographing operation.
With reference to the third possible implementation manner of the first aspect, an embodiment of the present invention provides a fourth possible implementation manner of the first aspect, where the posture database is established by: receiving posture images entered by the user in advance and storing them in the posture database; and/or searching for posture images in a stored photo album and storing, in the posture database, those posture images whose corresponding postures appear with a frequency higher than a preset frequency, wherein the album contains images historically photographed by the unmanned aerial vehicle.
With reference to the third possible implementation manner of the first aspect, an embodiment of the present invention provides a fifth possible implementation manner of the first aspect, where the step of searching a pre-established posture database for an image matching the current posture image includes: determining the posture corresponding to the current posture image through a human body posture recognition algorithm; comparing that posture with the photographing trigger posture corresponding to each posture image in the pre-established posture database; judging whether there exists a photographing trigger posture whose similarity to the posture corresponding to the current posture image is higher than a preset similarity threshold; and if so, determining that an image matching the current posture image exists in the posture database.
With reference to the third possible implementation manner of the first aspect, an embodiment of the present invention provides a sixth possible implementation manner of the first aspect, where the method further includes: and sending the image obtained by executing the photographing operation to a terminal associated with the unmanned aerial vehicle, and/or storing the image in a storage device of the unmanned aerial vehicle.
In a second aspect, an embodiment of the present invention further provides an unmanned aerial vehicle control apparatus, where the apparatus is disposed on the unmanned aerial vehicle side and includes: a posture monitoring module for monitoring a current posture of a user; an operation determining module for determining, according to the posture, the operation to be currently executed; and an execution module for executing the determined operation.
In a third aspect, an embodiment of the present invention provides an unmanned aerial vehicle, where the unmanned aerial vehicle control device in the second aspect is arranged on the unmanned aerial vehicle.
In a fourth aspect, an embodiment of the present invention provides an unmanned aerial vehicle system, where the system includes the unmanned aerial vehicle of the third aspect and a terminal device; the unmanned aerial vehicle is in communication connection with the terminal device, and the terminal device is used for exchanging information with the unmanned aerial vehicle.
The embodiments of the invention provide an unmanned aerial vehicle control method and apparatus, an unmanned aerial vehicle, and an unmanned aerial vehicle system. Compared with the prior art, in which the user must hold a ground terminal such as a remote controller to operate the unmanned aerial vehicle, the embodiments of the invention enable the user to operate the unmanned aerial vehicle without holding a ground terminal, thereby providing great convenience for the user.
Additional features and advantages of the disclosure will be set forth in the description that follows, or may in part be learned by practice of the disclosure.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 shows a flowchart of an unmanned aerial vehicle control method provided by an embodiment of the present invention;
Fig. 2 shows a posture schematic diagram provided by an embodiment of the present invention;
Fig. 3 shows a flowchart of a posture image matching method provided by an embodiment of the present invention;
Fig. 4 shows a working schematic diagram of an unmanned aerial vehicle provided by an embodiment of the present invention;
Fig. 5 shows a structural block diagram of an unmanned aerial vehicle control apparatus provided by an embodiment of the present invention;
Fig. 6 shows a schematic structural diagram of an unmanned aerial vehicle system provided by an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
At present, most unmanned aerial vehicles require the user to hold a ground terminal such as a remote controller for operation, which is inconvenient. To solve this problem, embodiments of the present invention provide an unmanned aerial vehicle control method and apparatus, an unmanned aerial vehicle, and an unmanned aerial vehicle system, which can be applied to the field of unmanned aerial vehicles. The embodiments of the invention are described in detail below.
Referring to fig. 1, a flowchart of an unmanned aerial vehicle control method is shown. The method is executed by the unmanned aerial vehicle and includes the following steps:
step S102, monitoring the current posture of the user. In practical application, the unmanned aerial vehicle is usually provided with image acquisition equipment such as a camera, and the unmanned aerial vehicle can monitor the current posture of a user in real time through the camera; specifically, the current posture image of the user may be acquired through a camera of the drone. In practical application, a camera of the unmanned aerial vehicle can be defaulted to be started in real time, and the unmanned aerial vehicle monitors the current posture of a user in real time; the camera of the unmanned aerial vehicle can also be opened after the user sets for through ground terminals such as a remote controller, and the unmanned aerial vehicle in the photographing mode monitors the current posture of the user again.
Step S104: determine the operation to be currently executed according to the posture. Specifically, different postures correspond to different operations.
In one embodiment, this may be performed with reference to the following steps:
(1) Search a pre-stored operation database for an operation instruction corresponding to the current posture image. The operation database stores the correspondence between posture images and operation instructions; the operation instructions comprise photographing instructions and/or flight control instructions. A photographing instruction directs the unmanned aerial vehicle to capture an image through the camera, and the flight control instructions include direction control, hover control, landing control, return control, follow control, speed control and other instructions for the unmanned aerial vehicle. It can be understood that the operation database stores a plurality of posture images, each containing a specific posture, and different postures correspond to different operation instructions. By recognizing the posture, the unmanned aerial vehicle can determine the operation instruction corresponding to it.
(2) Determine the operation to be currently executed according to the operation instruction. Specifically, when the operation instruction is a photographing instruction, the unmanned aerial vehicle determines that the operation to be currently executed is a photographing operation; when the operation instruction is a flight control instruction, the unmanned aerial vehicle determines that the operation to be currently executed is the operation indicated by the flight control instruction. For example, when the flight control instruction is hover control, the unmanned aerial vehicle determines that the operation to be currently executed is a hover operation; when the flight control instruction is return control, the operation to be currently executed is a return operation.
Step S106: execute the determined operation. Once the unmanned aerial vehicle determines the operation to be executed, it directly executes that operation, thereby meeting the user's operation requirements.
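The three steps S102–S106 amount to a table-driven dispatch from a recognized posture to an operation. A minimal sketch follows; the posture labels, instruction names, and handler stubs are hypothetical placeholders chosen for illustration, not part of the patent:

```python
from typing import Optional

# Pre-stored "operation database": posture label -> operation instruction.
# The labels and instruction names below are illustrative assumptions.
OPERATION_DATABASE = {
    "arms_crossed": "photograph",
    "one_arm_up": "hover",
    "both_arms_up": "return_home",
}

def determine_operation(posture: str) -> Optional[str]:
    """Step S104: look up the operation instruction for a recognized posture."""
    return OPERATION_DATABASE.get(posture)

def execute_operation(instruction: str) -> str:
    """Step S106: dispatch the determined operation (stubbed as a string result)."""
    handlers = {
        "photograph": lambda: "captured image",
        "hover": lambda: "hovering",
        "return_home": lambda: "returning to home point",
    }
    return handlers[instruction]()

def control_step(current_posture: str) -> Optional[str]:
    """One pass of the loop: a posture monitored in step S102 is mapped to an
    operation and executed; unrecognized postures trigger nothing."""
    instruction = determine_operation(current_posture)
    return execute_operation(instruction) if instruction else None
```

In a real system `current_posture` would come from the pose recognition described later, and the handlers would issue camera and flight commands instead of returning strings.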
In the method of this embodiment, the unmanned aerial vehicle monitors the user's current posture, determines the operation to be currently executed according to the posture, and executes the determined operation. Compared with the prior art, in which the user must hold a ground terminal such as a remote controller to operate the unmanned aerial vehicle, this embodiment enables the user to operate the unmanned aerial vehicle without holding a ground terminal, thereby providing great convenience for the user.
Aerial photography is a common use of the unmanned aerial vehicle. When performing aerial photography, the user usually needs to hold a remote controller or mobile phone and press a photographing button to control the unmanned aerial vehicle to take pictures, for example to acquire images containing the user. However, this photographing control method greatly limits the poses the user can strike. So that the user can pose freely during aerial photography without being restricted by having to hold a remote controller, this embodiment provides a specific implementation of determining the operation to be currently executed according to the posture, including:
(1) Search a pre-established posture database for an image matching the current posture image. The posture database stores posture images corresponding to a plurality of photographing trigger postures. It can be understood that each posture image contains a specific posture, such as crossing the arms, raising the head, turning the face sideways, long-jumping, walking, standing with arms akimbo, running, bending over, cupping the chin, or folding the arms; for ease of understanding, reference may be made to the posture schematic diagram of fig. 2, which simply illustrates some of these postures. The posture database stores a plurality of posture images, each containing one photographing trigger posture; the photographing trigger postures may include the most common user postures, and the specific trigger postures may be preset by the user.
(2) If such an image exists, determine that the posture corresponding to the current posture image is a photographing trigger posture and that the operation to be currently executed is a photographing operation.
This embodiment provides the following two ways of establishing the posture database:
the first method is as follows: and receiving a gesture image which is input by a user in advance, and storing the gesture image in a gesture database. The user may store a variety of gesture images that may be taken in advance in the gesture database.
Mode 2: search for posture images in a stored photo album and store, in the posture database, those posture images whose corresponding postures appear with a frequency higher than a preset frequency. The album contains images historically photographed by the unmanned aerial vehicle, and may specifically be an APP album or another cloud album. Through big-data analysis of the album, such as frequency counting, posture images whose postures appear more often than a certain frequency can be screened out of the stored images, confirmed as the user's preferred photographing postures, and stored in the posture database as photographing trigger postures.
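Mode 2 can be sketched as a frequency filter over posture labels extracted from the album. The labels below are hypothetical, and the sketch assumes each stored photo has already been assigned a posture label by a pose-recognition model:

```python
from collections import Counter

def build_posture_database(album_postures, min_frequency):
    """Mode 2 sketch: keep the postures that appear in the album strictly more
    often than a preset frequency, and use them as photographing trigger postures.

    album_postures: one posture label per historical photo (already recognized).
    min_frequency: the preset frequency threshold.
    """
    counts = Counter(album_postures)
    return {posture for posture, n in counts.items() if n > min_frequency}

# Hypothetical album: labels a pose-recognition model assigned to stored photos.
album = ["arms_crossed", "jump", "arms_crossed", "chin_rest",
         "arms_crossed", "jump", "side_face"]
trigger_postures = build_posture_database(album, min_frequency=1)
# "arms_crossed" (3 occurrences) and "jump" (2) exceed the threshold of 1
```

In practice the representative posture image for each retained label, not just the label, would be stored in the database for later matching.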
In a specific application, the posture database may be flexibly established using Mode 1 or Mode 2 according to actual requirements, or using both modes simultaneously. Of course, the two modes above are merely illustrative and should not be considered limiting.
Specifically, referring to the flowchart of the posture image matching method shown in fig. 3, which mainly shows a specific implementation of searching a pre-established posture database for an image matching the current posture image, the method includes the following steps:
Step S302: determine the posture corresponding to the current posture image through a human body posture recognition algorithm. The principle of such an algorithm is to recognize the positions of human joint points or key parts in the posture image. Taking joint points as an example, in a specific implementation the joint point positions may be estimated from a depth map, or directly from an RGB image. When the human body posture recognition algorithm processes the current posture image, a heat map is obtained first; the basic procedure is to classify and calibrate the colors of the human body heat map, then estimate the positions of the three-dimensional joint points, and finally match the calibrated colors with the three-dimensional joint points, so that the posture corresponding to the current posture image can be determined.
Step S304: compare the posture corresponding to the current posture image with the photographing trigger posture corresponding to each posture image in the pre-established posture database. That is, the determined current posture is compared with the photographing trigger postures. Each posture image corresponds to one posture: the posture corresponding to the current posture image is the current posture, and the postures corresponding to the posture images stored in the posture database are photographing trigger postures. Specifically, an image comparison algorithm may be used to compare the posture corresponding to the current posture image with the photographing trigger posture corresponding to each posture image in the database one by one.
Step S306: judge whether there exists a photographing trigger posture whose similarity to the posture corresponding to the current posture image is higher than a preset similarity threshold. Specifically, the human body posture recognition algorithm may convert the current posture image into a two-dimensional joint point map, convert that map into coordinate point-array data, and compare it with the coordinate point-array data corresponding to each photographing trigger posture in the posture database, thereby determining the similarity between the two.
Step S308: if such a posture exists, determine that an image matching the current posture image is stored in the posture database. For example, with the preset similarity threshold set to 80%, a similarity higher than 80% indicates that an image matching the current posture image is stored in the posture database; this means the posture corresponding to the current posture image is a photographing trigger posture, and the unmanned aerial vehicle can determine that a photographing instruction needs to be executed.
Through the above method, the user's current posture image can be compared against the posture database to judge whether a matching image is stored there. Because the postures corresponding to the images stored in the posture database are photographing trigger postures, the unmanned aerial vehicle can thereby judge whether the user's current posture corresponds to a photographing operation and, if so, execute the photographing operation.
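Steps S302–S308 can be sketched as a comparison of two-dimensional joint point coordinates. The normalization and the similarity formula below (one minus the mean distance between matched joints) are assumptions chosen for illustration; the patent only specifies that a similarity above a threshold such as 80% counts as a match:

```python
import math

def normalize(points):
    """Center the joint coordinates and scale them to unit spread, so the
    comparison is invariant to where and how large the person appears."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    scale = max(math.hypot(x - cx, y - cy) for x, y in points) or 1.0
    return [((x - cx) / scale, (y - cy) / scale) for x, y in points]

def posture_similarity(a, b):
    """Similarity in [0, 1]: one minus half the mean distance between matched
    joints (distances are at most ~2 after normalization). This particular
    formula is an illustrative assumption, not the patent's."""
    a, b = normalize(a), normalize(b)
    mean_dist = sum(math.hypot(ax - bx, ay - by)
                    for (ax, ay), (bx, by) in zip(a, b)) / len(a)
    return max(0.0, 1.0 - mean_dist / 2.0)

def matches_trigger(current, trigger_db, threshold=0.80):
    """Steps S304-S308: does any stored trigger posture exceed the threshold?"""
    return any(posture_similarity(current, trig) >= threshold
               for trig in trigger_db)
```

Here each posture is a list of (x, y) joint coordinates in a fixed joint order, as would be produced by the joint-point estimation of step S302.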
Further, after executing the photographing operation, the unmanned aerial vehicle may send the resulting image to a terminal associated with it and/or store the image in its own storage device. The associated terminal may be the user's mobile terminal (a mobile phone, etc.) or a terminal such as a cloud server, so that the user can conveniently store or directly view the photographed image. The storage device of the unmanned aerial vehicle may specifically be a memory, etc.; the user can later view the image information stored in it, for example by importing it from the unmanned aerial vehicle's storage device into an intelligent terminal such as a computer or a mobile phone.
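The "and/or" delivery just described can be sketched as follows. The storage abstraction (a dict standing in for the drone's memory) and the `send_fn` callback (standing in for the link to the associated terminal or cloud) are illustrative assumptions:

```python
def deliver_photo(image, storage=None, send_fn=None):
    """Sketch of post-photograph delivery: store in the drone's storage device,
    send to an associated terminal, or both.

    storage: dict-like object standing in for the drone's onboard memory.
    send_fn: callable standing in for the link (APP, cloud server) that
             carries the image to the associated terminal.
    """
    key = None
    if storage is not None:
        key = f"photo_{len(storage):04d}.jpg"  # illustrative naming scheme
        storage[key] = image
    if send_fn is not None:
        send_fn(image)  # e.g. push to the user's phone or a cloud album
    return key
```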
In one embodiment, reference may be made to the working schematic diagram of the unmanned aerial vehicle shown in fig. 4, which simply illustrates the connections between components when the unmanned aerial vehicle photographs using the posture detection mechanism. For ease of understanding, it is described as follows: the memory stores a posture database, and the posture images it contains may be the user's habitual posing postures collected from an album, or posture images imported into the memory from an external device such as a computer. After the ground terminal turns on the photographing mode through the processor of the unmanned aerial vehicle, the unmanned aerial vehicle immediately turns on the camera to monitor the user's posture. The camera transmits the acquired current posture image to the processor in real time, so that the processor compares it with the photographing trigger postures stored in the posture database in the memory; if the posture corresponding to the current posture image is determined to be a photographing trigger posture, the camera is controlled to execute the photographing operation, and the photographed picture is stored to the ground terminal. Specifically, the ground terminal in fig. 4 may be a mobile phone, a computer, etc.; in practical applications it may be provided with a corresponding APP, connect to a cloud server through the APP, and store the photographed picture in the cloud through the APP.
Corresponding to the aforementioned unmanned aerial vehicle control method, this embodiment further provides an unmanned aerial vehicle control apparatus. Referring to the structural block diagram of the apparatus shown in fig. 5, the apparatus is disposed on the unmanned aerial vehicle side and specifically includes the following modules:
A posture monitoring module 502, configured to monitor the current posture of the user. In one embodiment, the posture monitoring module 502 is configured to acquire the current posture image of the user via the camera of the unmanned aerial vehicle.
An operation determining module 504, configured to determine, according to the posture, the operation to be currently executed. In one embodiment, the operation determining module 504 is configured to search a pre-stored operation database for an operation instruction corresponding to the current posture image, wherein the operation database stores the correspondence between posture images and operation instructions, and the operation instructions comprise photographing instructions and/or flight control instructions; and to determine the operation to be currently executed according to the operation instruction.
An executing module 506, configured to execute the determined operation.
In the above apparatus of this embodiment, the unmanned aerial vehicle monitors the user's current posture, determines the operation to be currently executed according to the posture, and executes the determined operation. Compared with the prior art, in which the user must hold a ground terminal such as a remote controller to operate the unmanned aerial vehicle, this embodiment enables the user to operate the unmanned aerial vehicle without holding a ground terminal, thereby providing great convenience for the user.
In another embodiment, the operation determining module includes a searching unit and a determining unit, which are specifically described as follows:
the searching unit is used for searching whether an image matched with the current posture image exists in a preset posture database; the gesture database stores gesture images corresponding to a plurality of photographing triggering gestures. Specifically, the establishment mode of the gesture database comprises the following steps: receiving a gesture image input by a user in advance, and storing the gesture image in a gesture database; and/or searching posture images from a stored photo album, and storing the posture images with the appearance frequency of the corresponding postures higher than the preset frequency in a posture database; wherein, the album contains the image that unmanned aerial vehicle history was shot.
The determining unit is configured to determine, when the searching unit finds a match, that the posture corresponding to the current posture image is a photographing trigger posture and that the operation to be executed is a photographing operation.
In a specific implementation, the searching unit is configured to: determine the posture corresponding to the current posture image through a human body posture recognition algorithm; compare that posture with the photographing trigger posture corresponding to each posture image in the pre-established posture database; judge whether there exists a photographing trigger posture whose similarity to the posture corresponding to the current posture image is higher than a preset similarity threshold; and if so, determine that an image matching the current posture image exists in the posture database.
In practical applications, the unmanned aerial vehicle control apparatus provided by this embodiment further includes an image processing module, configured to send the image obtained by executing the photographing operation to a terminal associated with the unmanned aerial vehicle and/or to store the image in a storage device of the unmanned aerial vehicle.
The device provided by the embodiment has the same implementation principle and technical effect as the foregoing embodiment, and for the sake of brief description, reference may be made to the corresponding contents in the foregoing method embodiment for the portion of the embodiment of the device that is not mentioned.
Further, this embodiment also provides an unmanned aerial vehicle on which the above unmanned aerial vehicle control device is arranged.
As can be clearly understood by those skilled in the art, for convenience and brevity of description, for the specific working process of the unmanned aerial vehicle described above, reference may be made to the corresponding process in the foregoing embodiments, which is not repeated here.
In addition, this embodiment also provides an unmanned aerial vehicle system; referring to the schematic structural diagram of an unmanned aerial vehicle system shown in fig. 6, the system includes an unmanned aerial vehicle 100 and a terminal device 200, where the unmanned aerial vehicle 100 is in communication connection with the terminal device 200, and the terminal device 200 is used for exchanging information with the unmanned aerial vehicle 100.
The terminal device may be a remote controller of the unmanned aerial vehicle, or a terminal such as a mobile phone or a tablet computer, which is not described in detail here.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the system described above may refer to the corresponding process in the foregoing embodiments, and is not described herein again.
The computer program product of the unmanned aerial vehicle control method, device, unmanned aerial vehicle, and system provided by the embodiments of the present invention includes a computer-readable storage medium storing program code, and the instructions included in the program code may be used to execute the methods described in the foregoing method embodiments; for specific implementations, reference may be made to the method embodiments, which are not repeated here.
In addition, in the description of the embodiments of the present invention, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "coupled" are to be construed broadly, e.g., as a fixed connection, a removable connection, or an integral connection; as a mechanical connection or an electrical connection; or as a direct connection, an indirect connection through an intermediate medium, or a communication between the interiors of two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific circumstances.
If the functions are implemented in the form of software functional units and sold or used as a stand-alone product, they may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that the above-mentioned embodiments are only specific embodiments of the present invention, used to illustrate the technical solutions of the present invention rather than to limit them, and the protection scope of the present invention is not limited thereto. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person familiar with the technical field may, within the technical scope disclosed by the present invention, still modify or readily conceive of changes to the technical solutions described in the foregoing embodiments, or make equivalent substitutions for some of their technical features; such modifications, changes, or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the embodiments of the present invention, and shall all be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (5)

1. An unmanned aerial vehicle control method, characterized in that the method is executed by an unmanned aerial vehicle, and the method comprises:
monitoring a current posture of a user;
determining the operation to be executed currently according to the gesture;
performing the determined operation;
the step of monitoring the current posture of the user comprises: acquiring a current posture image of a user through a camera of the unmanned aerial vehicle;
the step of determining the operation to be currently executed according to the gesture includes: searching whether an image matched with the current posture image exists in a preset posture database; the gesture database stores gesture images corresponding to a plurality of photographing triggering gestures; the photograph trigger gesture comprises a user gesture; if so, determining that the gesture corresponding to the current gesture image is a photographing triggering gesture, and the current operation to be executed is a photographing operation;
searching for an operation instruction corresponding to the current posture image in a pre-stored operation database, wherein the correspondence between posture images and operation instructions is stored in the operation database; the operation instructions comprise a photographing instruction and/or a flight control instruction, the photographing instruction instructs the unmanned aerial vehicle to capture images through the camera, and the flight control instruction comprises direction control, hovering control, landing control, return control, following control, and speed control instructions for the unmanned aerial vehicle; a plurality of posture images are stored in the operation database, each posture image contains a specific posture, different postures correspond to different operation instructions, and the unmanned aerial vehicle determines the operation instruction corresponding to a posture by recognizing the posture;
determining the operation to be currently executed according to the operation instruction: when the operation instruction is a photographing instruction, the unmanned aerial vehicle determines that the operation to be currently executed is a photographing operation; when the operation instruction is a flight control instruction, the unmanned aerial vehicle determines that the operation to be currently executed is the operation indicated by the flight control instruction, e.g., when the flight control instruction is hovering control, the unmanned aerial vehicle determines that the operation to be currently executed is a hovering operation, and when the flight control instruction is return control, the unmanned aerial vehicle determines that the operation to be currently executed is a return operation;
the posture database is established by: receiving posture images input by the user in advance and storing them in the posture database, where various posture images that the user may take are stored in the posture database in advance; or searching for posture images in a stored album and storing, in the posture database, the posture images whose corresponding postures appear more frequently than a preset frequency; the album contains images historically captured by the unmanned aerial vehicle, and the stored album contains historically taken photos; by performing big data analysis and frequency statistics on the album, posture images appearing more frequently than a certain frequency are screened out from all posture images stored in the album, determined as the user's preferred photographing postures, and stored in the posture database as photographing trigger postures;
determining the posture corresponding to the current posture image through a human body posture recognition algorithm, where joint point position estimation is performed using a depth map or directly based on an RGB (red, green, and blue) image; when the current posture image is processed by the human body posture recognition algorithm, a heat map is obtained first, color classification and color calibration are performed on the human body heat map, position estimation is then performed on three-dimensional joint points, and finally the calibrated colors are matched with the three-dimensional joint points to determine the posture corresponding to the current posture image;
comparing the posture corresponding to the current posture image with the photographing trigger posture corresponding to each posture image in the preset posture database, that is, comparing the determined current posture with the photographing trigger postures, where the posture corresponding to the current posture image is compared one by one with the photographing trigger posture corresponding to each posture image in the posture database using an image comparison algorithm;
judging whether there is a photographing trigger posture whose similarity to the posture corresponding to the current posture image is higher than a preset similarity threshold, where the human body posture recognition algorithm converts the current posture image into a two-dimensional joint point image, converts the two-dimensional joint point image into coordinate point array data, and compares the coordinate point array data with the coordinate point array data corresponding to each photographing trigger posture in the posture database to determine the similarity between the two;
and if so, determining that an image matching the current posture image is stored in the posture database and that the unmanned aerial vehicle needs to execute the photographing instruction.
2. The method of claim 1, further comprising:
and sending the image obtained by executing the photographing operation to a terminal associated with the unmanned aerial vehicle, and/or storing the image in a storage device of the unmanned aerial vehicle.
3. An unmanned aerial vehicle control device, characterized in that the device is arranged on the unmanned aerial vehicle side, and the device comprises:
a gesture monitoring module for monitoring a current gesture of a user;
the operation determining module is used for determining the operation to be executed currently according to the gesture;
an execution module to execute the determined operation;
the gesture monitoring module is further used for acquiring a current posture image of the user through a camera of the unmanned aerial vehicle;
the operation determining module is further used for searching whether an image matched with the current posture image exists in a preset posture database; the gesture database stores gesture images corresponding to a plurality of photographing triggering gestures; the photograph trigger gesture comprises a user gesture; if so, determining that the gesture corresponding to the current gesture image is a photographing triggering gesture, and the current operation to be executed is a photographing operation;
the gesture database establishing module is used for receiving gesture images input by a user in advance and storing the gesture images in the gesture database; and/or searching posture images from a stored photo album, and storing the posture images with the appearance frequency of the corresponding postures higher than the preset frequency in the posture database; the photo album comprises images shot by the unmanned aerial vehicle historically;
the apparatus is further configured to:
searching for an operation instruction corresponding to the current posture image in a pre-stored operation database, wherein the correspondence between posture images and operation instructions is stored in the operation database; the operation instructions comprise a photographing instruction and/or a flight control instruction, the photographing instruction instructs the unmanned aerial vehicle to capture images through the camera, and the flight control instruction comprises direction control, hovering control, landing control, return control, following control, and speed control instructions for the unmanned aerial vehicle; a plurality of posture images are stored in the operation database, each posture image contains a specific posture, different postures correspond to different operation instructions, and the unmanned aerial vehicle determines the operation instruction corresponding to a posture by recognizing the posture;
determining the operation to be currently executed according to the operation instruction: when the operation instruction is a photographing instruction, the unmanned aerial vehicle determines that the operation to be currently executed is a photographing operation; when the operation instruction is a flight control instruction, the unmanned aerial vehicle determines that the operation to be currently executed is the operation indicated by the flight control instruction, e.g., when the flight control instruction is hovering control, the unmanned aerial vehicle determines that the operation to be currently executed is a hovering operation, and when the flight control instruction is return control, the unmanned aerial vehicle determines that the operation to be currently executed is a return operation;
the posture database is established by: receiving posture images input by the user in advance and storing them in the posture database, where various posture images that the user may take are stored in the posture database in advance; or searching for posture images in a stored album and storing, in the posture database, the posture images whose corresponding postures appear more frequently than a preset frequency; the album contains images historically captured by the unmanned aerial vehicle, and the stored album contains historically taken photos; by performing big data analysis and frequency statistics on the album, posture images appearing more frequently than a certain frequency are screened out from all posture images stored in the album, determined as the user's preferred photographing postures, and stored in the posture database as photographing trigger postures;
determining the posture corresponding to the current posture image through a human body posture recognition algorithm, where joint point position estimation is performed using a depth map or directly based on an RGB (red, green, and blue) image; when the current posture image is processed by the human body posture recognition algorithm, a heat map is obtained first, color classification and color calibration are performed on the human body heat map, position estimation is then performed on three-dimensional joint points, and finally the calibrated colors are matched with the three-dimensional joint points to determine the posture corresponding to the current posture image;
comparing the posture corresponding to the current posture image with the photographing trigger posture corresponding to each posture image in the preset posture database, that is, comparing the determined current posture with the photographing trigger postures, where the posture corresponding to the current posture image is compared one by one with the photographing trigger posture corresponding to each posture image in the posture database using an image comparison algorithm;
judging whether there is a photographing trigger posture whose similarity to the posture corresponding to the current posture image is higher than a preset similarity threshold, where the human body posture recognition algorithm converts the current posture image into a two-dimensional joint point image, converts the two-dimensional joint point image into coordinate point array data, and compares the coordinate point array data with the coordinate point array data corresponding to each photographing trigger posture in the posture database to determine the similarity between the two;
and if so, determining that an image matching the current posture image is stored in the posture database and that the unmanned aerial vehicle needs to execute the photographing instruction.
4. An unmanned aerial vehicle, characterized in that the unmanned aerial vehicle control device of claim 3 is arranged on the unmanned aerial vehicle.
5. An unmanned aerial vehicle system, characterized in that the system comprises the unmanned aerial vehicle of claim 4 and a terminal device, the unmanned aerial vehicle being in communication connection with the terminal device;
the terminal device is used for exchanging information with the unmanned aerial vehicle.
CN201810199832.9A 2018-03-09 2018-03-09 Unmanned aerial vehicle control method and device, unmanned aerial vehicle and system Expired - Fee Related CN108460354B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810199832.9A CN108460354B (en) 2018-03-09 2018-03-09 Unmanned aerial vehicle control method and device, unmanned aerial vehicle and system

Publications (2)

Publication Number Publication Date
CN108460354A CN108460354A (en) 2018-08-28
CN108460354B true CN108460354B (en) 2020-12-29

Family

ID=63217151

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810199832.9A Expired - Fee Related CN108460354B (en) 2018-03-09 2018-03-09 Unmanned aerial vehicle control method and device, unmanned aerial vehicle and system

Country Status (1)

Country Link
CN (1) CN108460354B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110426970B (en) * 2019-06-25 2021-05-25 西安爱生无人机技术有限公司 Unmanned aerial vehicle photographing system and control method thereof
CN110650287A (en) * 2019-09-05 2020-01-03 深圳市道通智能航空技术有限公司 Shooting control method and device, aircraft and flight system
CN111123959B (en) * 2019-11-18 2023-05-30 亿航智能设备(广州)有限公司 Unmanned aerial vehicle control method based on gesture recognition and unmanned aerial vehicle adopting method
CN114168744A (en) * 2021-11-25 2022-03-11 中国航空工业集团公司沈阳飞机设计研究所 Unmanned aerial vehicle control intention understanding method based on knowledge graph

Citations (5)

Publication number Priority date Publication date Assignee Title
CN105487673A (en) * 2016-01-04 2016-04-13 京东方科技集团股份有限公司 Man-machine interactive system, method and device
CN105589553A (en) * 2014-09-23 2016-05-18 上海影创信息科技有限公司 Gesture control method and system for intelligent equipment
CN105892474A (en) * 2016-03-31 2016-08-24 深圳奥比中光科技有限公司 Unmanned plane and control method of unmanned plane
CN106161954A (en) * 2016-08-16 2016-11-23 北京金山安全软件有限公司 Video shooting control method and device and electronic equipment
CN106227341A (en) * 2016-07-20 2016-12-14 南京邮电大学 Unmanned plane gesture interaction method based on degree of depth study and system

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
CN103295028B (en) * 2013-05-21 2018-09-04 深圳Tcl新技术有限公司 gesture operation control method, device and intelligent display terminal
CN104363494A (en) * 2013-12-21 2015-02-18 滁州惠智科技服务有限公司 Gesture recognition system for smart television
US9600736B2 (en) * 2015-06-29 2017-03-21 International Business Machines Corporation Pose detection using depth camera
CN107741781A (en) * 2017-09-01 2018-02-27 中国科学院深圳先进技术研究院 Flight control method, device, unmanned plane and the storage medium of unmanned plane

Also Published As

Publication number Publication date
CN108460354A (en) 2018-08-28

Similar Documents

Publication Publication Date Title
CN108460354B (en) Unmanned aerial vehicle control method and device, unmanned aerial vehicle and system
EP3965003A1 (en) Image processing method and device
WO2019128507A1 (en) Image processing method and apparatus, storage medium and electronic device
KR102067057B1 (en) A digital device and method of controlling thereof
CN105488371B (en) Face recognition method and device
EP3432207A1 (en) Method for biometric recognition and terminal device
US10587847B2 (en) Content capture and transmission of data of a subject to a target device
CN108229369A (en) Image capturing method, device, storage medium and electronic equipment
WO2019137131A1 (en) Image processing method, apparatus, storage medium, and electronic device
CN110688914A (en) Gesture recognition method, intelligent device, storage medium and electronic device
CN107231470B (en) Image processing method, mobile terminal and computer readable storage medium
WO2019052329A1 (en) Facial recognition method and related product
WO2019024717A1 (en) Anti-counterfeiting processing method and related product
CN106292799B (en) Unmanned plane, remote control and its control method
EP3534250B1 (en) Target detection method and unmanned aerial vehicle
WO2019011098A1 (en) Unlocking control method and relevant product
WO2019084825A1 (en) Image processing method and device, and unmanned aerial vehicle
US11151398B2 (en) Anti-counterfeiting processing method, electronic device, and non-transitory computer-readable storage medium
US10971152B2 (en) Imaging control method and apparatus, control device, and imaging device
CN112699849A (en) Gesture recognition method and device, electronic equipment, readable storage medium and chip
CN110580053A (en) Target tracking method, aircraft and flight system
WO2018049630A1 (en) Photographing method and terminal
CN107341190B (en) Picture screening method, terminal and computer readable storage medium
CN106829662A (en) A kind of multifunctional intellectual elevator device and control method
CN107729736B (en) Face recognition method and related product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20201229