CN112711274A - Unmanned aerial vehicle control method and device, unmanned aerial vehicle and computer readable storage medium - Google Patents

Unmanned aerial vehicle control method and device, unmanned aerial vehicle and computer readable storage medium

Info

Publication number
CN112711274A
CN112711274A
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
preset time
control
triggered
Prior art date
Legal status
Pending
Application number
CN202110072365.5A
Other languages
Chinese (zh)
Inventor
张显志
高硕
Current Assignee
Sichuan Yidian Aviation Technology Co ltd
Original Assignee
Sichuan Yidian Aviation Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Sichuan Yidian Aviation Technology Co ltd
Priority to CN202110072365.5A
Publication of CN112711274A

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/10: Simultaneous control of position or course in three dimensions
    • G05D 1/101: Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses an unmanned aerial vehicle control method applied to an unmanned aerial vehicle. The method comprises the following steps: after detecting that the palm control mode is triggered, judging whether a control instruction is triggered within a first preset time; if the control instruction is triggered within the first preset time, executing the current flight task of the unmanned aerial vehicle; and when the current flight task is completed, controlling the unmanned aerial vehicle to land. The invention also discloses an unmanned aerial vehicle control device, an unmanned aerial vehicle and a computer readable storage medium. The invention improves the control efficiency of the unmanned aerial vehicle and reduces hardware cost.

Description

Unmanned aerial vehicle control method and device, unmanned aerial vehicle and computer readable storage medium
Technical Field
The invention relates to the technical field of unmanned aerial vehicles, in particular to an unmanned aerial vehicle control method and device, an unmanned aerial vehicle and a computer readable storage medium.
Background
With the rapid development of science and technology, more and more technologies are applied to unmanned aerial vehicles. Currently, unmanned aerial vehicles are widely used in fields such as military and police work, public safety, electric power, news media, meteorology, forest fire prevention and disaster prevention. Specifically, an unmanned aerial vehicle can perform flight missions such as military reconnaissance, police reconnaissance, counter-terrorism and riot control, mass-event monitoring, safety monitoring, emergency rescue and disaster relief, patrol and inspection, search and rescue, tracking and searching, public safety, traffic supervision, traffic-duty evidence collection, exploration and survey, environmental monitoring, meteorological monitoring, industrial monitoring and inspection, scientific investigation, and record evidence collection.
Currently, an unmanned aerial vehicle executing a flight mission is controlled either with a remote controller or with a mobile phone APP (Application). However, a remote controller requires extra hardware cost; moreover, with both a remote controller and an APP, commands must be transmitted wirelessly to the unmanned aerial vehicle for execution. That is, the unmanned aerial vehicle cannot autonomously trigger instructions or autonomously execute flight tasks, which makes the control operation relatively complicated and reduces the control efficiency of the unmanned aerial vehicle.
Disclosure of Invention
The invention mainly aims to provide an unmanned aerial vehicle control method and device, an unmanned aerial vehicle and a computer readable storage medium, and aims to improve the control efficiency of the unmanned aerial vehicle and reduce the hardware cost.
In order to achieve the above object, the present invention provides an unmanned aerial vehicle control method, which is applied to an unmanned aerial vehicle, and comprises the following steps:
after detecting that the palm control mode is triggered, judging whether a control instruction is triggered within a first preset time;
if the control instruction is triggered within the first preset time, executing the current flight task of the unmanned aerial vehicle;
and when the current flight task is completed, controlling the unmanned aerial vehicle to land.
Optionally, the current flight task is a photographing task, and the step of executing the current flight task of the unmanned aerial vehicle includes:
acquiring the photographing task and acquiring a photographing object of the photographing task;
starting a corresponding flight action according to the photographing task, and detecting whether the photographing object can be photographed within a second preset time;
if the shooting object can be shot within the second preset time, tracking the shooting object for shooting until the shooting task is completed;
and if the shooting object cannot be shot within the second preset time, controlling the unmanned aerial vehicle to land.
Optionally, the step of starting a corresponding flight action according to the photographing task includes:
controlling the unmanned aerial vehicle to hover according to the photographing task;
rotating the heading of the unmanned aerial vehicle until the shooting object is photographed; and/or,
and adjusting the height of the unmanned aerial vehicle until the shooting object is shot.
Optionally, the unmanned aerial vehicle includes an acceleration sensor, and the step of determining whether to trigger the control instruction within a first preset time includes:
detecting the running state of the unmanned aerial vehicle through the acceleration sensor within a first preset time;
when the running state is detected to be a free-falling body state, judging that a control instruction is triggered within the first preset time;
and if the running state is not detected to be a free-fall state, judging that the control instruction is not triggered within the first preset time.
Optionally, before the step of determining whether to trigger the control instruction within the first preset time, the method further includes:
initializing the unmanned aerial vehicle after the palm control mode is triggered;
and starting the blades of the unmanned aerial vehicle in preparation for flight.
Optionally, after the step of determining whether to trigger the control instruction within a first preset time after detecting that the palm control mode is triggered, the method further includes:
if the control instruction is not triggered within the first preset time, stopping the palm control mode;
detecting whether the palm control mode is triggered again;
and if the palm control mode is triggered again, the step of judging whether a control instruction is triggered within a first preset time is carried out.
Optionally, before the step of determining whether to trigger the control instruction within a first preset time after detecting that the palm control mode is triggered, the method further includes:
when the change of the attitude of the unmanned aerial vehicle is detected, detecting an attitude change parameter of the unmanned aerial vehicle, and judging whether the attitude change parameter is greater than a preset threshold value or not;
if the attitude change parameter is larger than the preset threshold value, triggering the palm control mode; or,
and when the on-off key of the unmanned aerial vehicle is continuously pressed a preset number of times, triggering the palm control mode.
In addition, in order to achieve the above object, the present invention also provides an unmanned aerial vehicle control apparatus, including:
the trigger judging module is used for judging whether a control instruction is triggered within first preset time or not after the trigger of the palm control mode is detected;
the task execution module is used for executing the current flight task of the unmanned aerial vehicle if the control instruction is triggered within the first preset time;
and the landing control module is used for controlling the unmanned aerial vehicle to land when the current flight task is executed and completed.
In addition, in order to achieve the above object, the present invention also provides an unmanned aerial vehicle, including: a memory, a processor, and a drone control program stored on the memory and executable on the processor, the drone control program when executed by the processor implementing the steps of the drone control method as described above.
Furthermore, to achieve the above object, the present invention also provides a computer-readable storage medium having stored thereon a drone control program, which when executed by a processor, implements the steps of the drone control method as described above.
The invention provides an unmanned aerial vehicle control method and device, an unmanned aerial vehicle and a computer readable storage medium. The unmanned aerial vehicle control method is applied to an unmanned aerial vehicle: after detecting that the palm control mode is triggered, whether a control instruction is triggered within a first preset time is judged; if the control instruction is triggered within the first preset time, the current flight task of the unmanned aerial vehicle is executed; and when the current flight task is completed, the unmanned aerial vehicle is controlled to land. In this way, the control method runs on the unmanned aerial vehicle itself, that is, specific code is embedded into its flight control program, so that the unmanned aerial vehicle autonomously triggers the palm control mode and its control instructions without being controlled through a remote controller, a mobile phone APP or the like; the control operation is thereby simplified, hardware cost is reduced, and the control efficiency of the unmanned aerial vehicle is improved. Moreover, the unmanned aerial vehicle can autonomously execute the current flight task without user operation, further simplifying the control operation and further improving control efficiency. Therefore, the invention improves the control efficiency of the unmanned aerial vehicle and reduces hardware cost.
Drawings
Fig. 1 is a schematic terminal structure diagram of a hardware operating environment according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of a first embodiment of the unmanned aerial vehicle control method according to the present invention;
fig. 3 is a schematic flow chart of a second embodiment of the unmanned aerial vehicle control method according to the present invention;
fig. 4 is a schematic functional module diagram of the first embodiment of the unmanned aerial vehicle control device according to the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Referring to fig. 1, fig. 1 is a schematic terminal structure diagram of a hardware operating environment according to an embodiment of the present invention.
The terminal in the embodiment of the present invention is an unmanned aerial vehicle control device, which may be a terminal device having a processing function, such as an unmanned aerial vehicle, a Personal Computer (PC), a microcomputer, a notebook computer, or a server.
As shown in fig. 1, the terminal may include: a processor 1001, such as a CPU (Central Processing Unit), a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. The communication bus 1002 is used to enable connective communication between these components. The user interface 1003 may include a display screen (Display) and an input unit such as a hand-held controller; optionally, the user interface 1003 may also include a standard wired interface and a wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a WI-FI interface). The memory 1005 may be a high-speed RAM memory or a non-volatile memory, such as a flash memory. The memory 1005 may alternatively be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the terminal structure shown in fig. 1 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
As shown in fig. 1, the memory 1005, which is one type of computer storage medium, may include an operating system, a network communication module, a user interface module, and a drone control program.
In the terminal shown in fig. 1, the processor 1001 may be configured to invoke a drone control program stored in the memory 1005 and perform the following operations:
judging whether a control instruction is triggered within a first preset time or not after the triggering of the palm control mode is detected;
if the control instruction is triggered within the first preset time, executing the current flight task of the unmanned aerial vehicle;
and when the current flight task is completed, controlling the unmanned aerial vehicle to land.
Further, the processor 1001 may be configured to invoke the drone control program stored in the memory 1005, and also perform the following operations:
acquiring the photographing task and acquiring a photographing object of the photographing task;
starting a corresponding flight action according to the photographing task, and detecting whether the photographing object can be photographed within a second preset time;
if the shooting object can be shot within the second preset time, tracking the shooting object for shooting until the shooting task is completed;
and if the shooting object cannot be shot within the second preset time, controlling the unmanned aerial vehicle to land.
Further, the processor 1001 may be configured to invoke the drone control program stored in the memory 1005, and also perform the following operations:
controlling the unmanned aerial vehicle to hover according to the photographing task;
rotating the heading of the unmanned aerial vehicle until the shooting object is photographed; and/or,
and adjusting the height of the unmanned aerial vehicle until the shooting object is shot.
Further, the drone includes an acceleration sensor, and the processor 1001 may be configured to invoke a drone control program stored in the memory 1005, and further perform the following operations:
detecting the running state of the unmanned aerial vehicle through the acceleration sensor within a first preset time;
when the running state is detected to be a free-falling body state, judging that a control instruction is triggered within the first preset time;
and if the running state is not detected to be a free-fall state, judging that the control instruction is not triggered within the first preset time.
Further, the processor 1001 may be configured to invoke the drone control program stored in the memory 1005, and also perform the following operations:
initializing the unmanned aerial vehicle after the palm control mode is triggered;
and starting the blades of the unmanned aerial vehicle in preparation for flight.
Further, the processor 1001 may be configured to invoke the drone control program stored in the memory 1005, and also perform the following operations:
if the control instruction is not triggered within the first preset time, stopping the palm control mode;
detecting whether the palm control mode is triggered again;
and if the palm control mode is triggered again, the step of judging whether a control instruction is triggered within a first preset time is carried out.
Further, the processor 1001 may be configured to invoke the drone control program stored in the memory 1005, and also perform the following operations:
when the change of the attitude of the unmanned aerial vehicle is detected, detecting an attitude change parameter of the unmanned aerial vehicle, and judging whether the attitude change parameter is greater than a preset threshold value or not;
if the attitude change parameter is larger than the preset threshold value, triggering the palm control mode; or,
and when the on-off key of the unmanned aerial vehicle is continuously pressed a preset number of times, triggering the palm control mode.
Based on the hardware structure, the invention provides various embodiments of the unmanned aerial vehicle control method.
The invention provides an unmanned aerial vehicle control method.
Referring to fig. 2, fig. 2 is a schematic flow chart of a first embodiment of the unmanned aerial vehicle control method of the present invention.
In this embodiment, the drone control method is applied to a drone, and includes the following steps S10 to S30:
step S10, after detecting that the palm control mode is triggered, judging whether a control instruction is triggered within a first preset time;
In this embodiment, the unmanned aerial vehicle control method is implemented autonomously by the unmanned aerial vehicle, which may be a fixed-wing aircraft, a rotary-wing aircraft or another type of aircraft, and is not limited here. Specific code can be embedded into the flight control program of the unmanned aerial vehicle to realize autonomous control.
When the unmanned aerial vehicle is in a standby state, that is, when its blades and all flight control devices are stopped, it is detected whether the user triggers the palm control mode, i.e., whether the user selects the palm control mode. The control modes may include a remote controller control mode, a mobile phone APP (Application) control mode, a palm control mode, and the like. During detection, after it is detected that the palm control mode is triggered, it is judged whether a control instruction is triggered within a first preset time. The first preset time may be set according to actual conditions, for example, 3 minutes or 5 minutes, and is not limited here.
In the palm control mode, the user only needs to place the unmanned aerial vehicle in a palm and can trigger corresponding control instructions through simple actions. That is, operations such as detecting the palm control mode and the control instructions are performed by the unmanned aerial vehicle itself, which then autonomously carries out flight control to realize various flight tasks.
In an embodiment, the user can make the unmanned aerial vehicle trigger the palm control mode through an attitude change. Specifically, an acceleration sensor deployed on the unmanned aerial vehicle detects whether the attitude of the unmanned aerial vehicle changes; when a change is detected, the acceleration sensor detects the attitude change parameters of the unmanned aerial vehicle, and it is judged whether each attitude change parameter is greater than its preset threshold, so as to prevent false triggering. The attitude change parameters include parameters such as attitude change distance and attitude change gravity; correspondingly, the preset thresholds include a corresponding attitude change distance threshold and attitude change gravity threshold. In other embodiments, the user may also trigger the palm control mode by continuously pressing the on-off key a preset number of times. The preset number of times may be 2, and is not limited here.
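As a minimal sketch, the anti-false-touch check described above (trigger only when every attitude change parameter exceeds its preset threshold) might look like the following; the threshold values and units are illustrative assumptions, not taken from the patent.

```python
# Illustrative preset thresholds; the patent does not specify values or units.
DISTANCE_THRESHOLD = 0.10  # attitude change distance threshold (assumed metres)
GRAVITY_THRESHOLD = 2.0    # attitude change gravity threshold (assumed g)

def should_trigger_palm_mode(change_distance: float, change_gravity: float) -> bool:
    """Trigger the palm control mode only if every attitude change
    parameter exceeds its corresponding preset threshold, so that a
    small accidental bump does not trigger the mode."""
    return (change_distance > DISTANCE_THRESHOLD
            and change_gravity > GRAVITY_THRESHOLD)
```

Requiring both parameters to exceed their thresholds is one plausible reading of the anti-false-touch design; a real flight controller would tune these thresholds against recorded sensor traces.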
In an embodiment, judging whether the control instruction is triggered within the first preset time may be done by judging whether the unmanned aerial vehicle has been thrown by the user. It can be understood that, once thrown, the unmanned aerial vehicle is in a free-fall state, that is, it changes from a static state to a free-fall state. Specifically, the running state of the unmanned aerial vehicle is detected through the acceleration sensor within the first preset time; when the running state is detected to change from a static state to a free-fall state, it is judged that the control instruction is triggered within the first preset time; if the running state is not detected to be a free-fall state, it is judged that the control instruction is not triggered within the first preset time. It should be noted that the three-axis values may be detected by the acceleration sensor deployed on the unmanned aerial vehicle, so as to determine whether the control instruction is triggered according to whether the sum of squares of the three-axis values is close to 0. In other embodiments, the user may also trigger the control instruction by releasing or tossing the unmanned aerial vehicle; the specific execution flow is substantially the same as the throwing process described above and is not detailed here.
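The sum-of-squares free-fall test above can be sketched as follows. The tolerance constant and the sampling interface are assumptions; the patent only states that the sum of squares of the three axis values should be close to 0 in free fall.

```python
# Hypothetical tolerance for "close to 0"; in free fall the accelerometer
# measures near-zero proper acceleration, while at rest it reads about 1 g.
FREE_FALL_TOLERANCE = 0.3  # (g^2), illustrative value

def is_free_fall(ax: float, ay: float, az: float) -> bool:
    """True when the sum of squares of the three axis values (in g)
    is close to 0, i.e. the vehicle is in a free-fall state."""
    return ax * ax + ay * ay + az * az < FREE_FALL_TOLERANCE

def control_instruction_triggered(samples) -> bool:
    """Scan accelerometer samples collected within the first preset time
    and report whether a free-fall state (the control instruction)
    occurred in any of them."""
    return any(is_free_fall(ax, ay, az) for ax, ay, az in samples)
```

A sample taken while the drone rests in the palm reads roughly (0, 0, 1) g and fails the test; a sample taken just after the throw reads near (0, 0, 0) g and passes it.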
It should be noted that, after the palm control mode is triggered, in order to ensure that the subsequent flight task can be executed immediately and to prevent the unmanned aerial vehicle from falling, the unmanned aerial vehicle may be initialized, that is, initialization operations such as calibration are performed on each flight control device and flight device. Then, the blades of the unmanned aerial vehicle are started, so that after the user throws it, the unmanned aerial vehicle has power to hover and does not fall.
Step S20, if the control command is triggered within the first preset time, executing the current flight task of the unmanned aerial vehicle;
and then, if the control instruction is triggered within the first preset time, executing the current flight task of the unmanned aerial vehicle. The flight missions comprise shooting missions, monitoring missions, identifying missions, tracking missions and the like, and specifically comprise military reconnaissance, police reconnaissance, anti-terrorism and anti-riot, group events, safety monitoring, emergency rescue and relief, patrol and inspection, search and rescue, tracking and searching, public safety, traffic supervision, traffic duty evidence obtaining, exploration and survey, environment monitoring, meteorological monitoring, industrial monitoring and inspection, scientific investigation, record evidence obtaining and other missions.
In an embodiment, if the current flight task is a photographing task, the photographing task and its shooting object are obtained, and a corresponding flight action is started according to the photographing task: the unmanned aerial vehicle hovers and waits, rotating its heading and adjusting its flying altitude to search for the shooting object. If the shooting object cannot be found within a certain period of time, the unmanned aerial vehicle is controlled to return and land close to its user; finally it stops the blades and exits the palm control mode to enter a standby state. If the shooting object is found within that period, it is tracked until the photographing task is completed; after the photographing task is completed, the unmanned aerial vehicle is likewise controlled to return and land close to its user, stop the blades and exit the palm control mode to enter the standby state. In other embodiments, the current flight task may also be a monitoring task or the like, whose specific execution flow is similar to that of the photographing task described above and is not detailed here.
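The photographing-task flow above (hover, search within the second preset time, then either track-and-shoot or land) can be sketched as below. The `SimDrone` class and all method names are illustrative assumptions standing in for the real flight-control interface.

```python
import time

class SimDrone:
    """Minimal stand-in for the flight-control interface; method names
    are assumptions for illustration, not from the patent."""
    def __init__(self, subject_visible_after: int):
        self._search_calls = 0
        self._visible_after = subject_visible_after
        self._shots = 0
        self.log = []
    def hover(self): self.log.append("hover")
    def rotate_heading(self): self._search_calls += 1
    def subject_in_view(self): return self._search_calls >= self._visible_after
    def track_and_shoot(self): self._shots += 1
    def task_complete(self): return self._shots >= 3
    def return_and_land(self): self.log.append("land")
    def stop_blades(self): self.log.append("stop")
    def exit_palm_mode(self): self.log.append("standby")

def run_photo_task(drone, second_preset_time_s: float) -> bool:
    """Hover, search for the shooting object within the second preset
    time, track-and-shoot if found; in both branches finish by landing,
    stopping the blades and returning to standby. Returns whether the
    subject was found."""
    drone.hover()
    deadline = time.monotonic() + second_preset_time_s
    found = False
    while time.monotonic() < deadline:
        if drone.subject_in_view():
            found = True
            break
        drone.rotate_heading()  # sweep heading to search for the subject
    if found:
        while not drone.task_complete():
            drone.track_and_shoot()
    drone.return_and_land()
    drone.stop_blades()
    drone.exit_palm_mode()
    return found
```

Note that both branches converge on the same landing sequence, matching the description: the only difference is whether a tracking-and-shooting phase occurs first.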
In an embodiment, the current flight task may be sent to the unmanned aerial vehicle by a mobile phone APP or another third-party terminal, so that the unmanned aerial vehicle executes, or prepares to execute, the sent flight task. In other embodiments, the current flight task may be triggered by a button on the unmanned aerial vehicle, or the unmanned aerial vehicle may detect its own attitude data to determine whether the user has performed a particular operation on it; it can be understood that each particular operation corresponds to a flight task.
Step S30, when the current flight task is completed, controlling the unmanned aerial vehicle to land.
In this embodiment, when the current flight task is completed, the unmanned aerial vehicle is controlled to land. The complete landing process includes returning, descending and approaching the user of the unmanned aerial vehicle; finally, the blades are stopped and the palm control mode is exited, so as to enter the standby state.
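Putting steps S10 to S30 together with the landing sequence gives a single control cycle, sketched below. The `StubDrone` interface and its method names are illustrative assumptions, not part of the patent.

```python
class StubDrone:
    """Hypothetical flight-control stub used only to exercise the cycle."""
    def __init__(self, instruction_triggered: bool):
        self._triggered = instruction_triggered
        self.actions = []
    def wait_for_control_instruction(self, timeout_s: float) -> bool:
        return self._triggered            # S10: e.g. free-fall detected
    def execute_current_task(self): self.actions.append("task")
    def return_and_land(self): self.actions.append("return+land")
    def stop_blades(self): self.actions.append("stop_blades")
    def exit_palm_mode(self): self.actions.append("exit_palm_mode")

def palm_control_cycle(drone, first_preset_time_s: float) -> str:
    """One full palm-control cycle: wait for the control instruction
    within the first preset time (S10), execute the current flight task
    (S20), then land and return to standby (S30)."""
    if drone.wait_for_control_instruction(first_preset_time_s):  # S10
        drone.execute_current_task()                             # S20
        drone.return_and_land()   # S30: return, descend, approach the user
        drone.stop_blades()
    drone.exit_palm_mode()        # with or without a task, back to standby
    return "standby"
```

If no control instruction arrives within the first preset time, the cycle exits the palm control mode directly, matching the embodiment where the mode is stopped and re-triggering is awaited.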
The embodiment of the invention provides an unmanned aerial vehicle control method applied to an unmanned aerial vehicle: after detecting that the palm control mode is triggered, it is judged whether a control instruction is triggered within a first preset time; if the control instruction is triggered within the first preset time, the current flight task of the unmanned aerial vehicle is executed; and when the current flight task is completed, the unmanned aerial vehicle is controlled to land. In this way, the control method runs on the unmanned aerial vehicle itself, that is, specific code is embedded into its flight control program, so that the unmanned aerial vehicle autonomously triggers the palm control mode and its control instructions without needing a remote controller or a mobile phone APP (Application); the control operation is thereby simplified, hardware cost is reduced, and the control efficiency of the unmanned aerial vehicle is improved. Moreover, the unmanned aerial vehicle can autonomously execute the current flight task without user operation, further simplifying the control operation and further improving control efficiency. Therefore, the control efficiency of the unmanned aerial vehicle is improved and the hardware cost is reduced.
Further, based on the first embodiment, a second embodiment of the unmanned aerial vehicle control method is provided.
Referring to fig. 3, fig. 3 is a schematic flow chart of a method for controlling an unmanned aerial vehicle according to a second embodiment of the present invention.
In this embodiment, step S20, executing the current flight task of the unmanned aerial vehicle, includes the following steps S21 to S24:
step S21, acquiring the photographing task and acquiring a photographing object of the photographing task;
before the unmanned aerial vehicle executes the flight task, firstly, the current flight task of the unmanned aerial vehicle is obtained, the current flight task is a photographing task, and then, a photographing object of the photographing task is obtained. The shooting objects are targets to be shot and targets to be subsequently tracked and shot.
In an embodiment, the shooting object may be a human face, and the photographing task includes face identification information of that face, such as the distances, angles and sizes between facial organs. When the shooting object is a pedestrian, the photographing task includes biometric information such as the pedestrian's body shape and voiceprint. In other embodiments, the shooting object may be a building, a vehicle, or the like.
Step S22, starting corresponding flight action according to the photographing task, and detecting whether the photographing object can be photographed within a second preset time;
In this embodiment, different photographing tasks correspond to different flight actions. Therefore, the corresponding flight action is started according to the photographing task, and it is detected whether the shooting object can be photographed within the second preset time. The flight actions include hovering, rotating the heading, adjusting the altitude, and the like.
It should be noted that detecting a shooting object refers to the process of distinguishing a specific target (or a type of target) from other targets (or other types of targets); the target objects mainly include buildings, vehicles, pedestrians, and the like. The identification process includes distinguishing between two very similar targets, and also distinguishing one type of target from other types. The basic principle of target identification is to use target feature information such as color and contour in the image, estimate the size, shape, weight and surface physical characteristic parameters of the target through various mathematical multi-dimensional space transformations, and finally make an identification judgment in a classifier according to a discriminant function determined from a large number of training samples.
Specifically, the identification of the shooting object proceeds as follows. First, the camera device of the airborne system acquires, to the maximum extent, images containing the target object group and its environment. Then, positive and negative sample images are distinguished from the acquired images to build an image sample library for the classifier; haar features of the positive and negative sample images are extracted according to the requirements of the adaboost algorithm, and a floating-point classifier model is trained from the extracted haar features using adaboost. Next, the floating-point data output by the floating-point classifier model is converted to fixed-point data according to a preset data structure, and the fixed-point data is stored in that structure to train a fixed-point classifier. Then, a RANdom SAmple Consensus (RANSAC) method is used to fit a motion model of the background and compensate for it, converting moving-target identification under a dynamic background into moving-target identification under a static background.
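The RANSAC background-compensation step can be sketched as below. As a simplifying assumption for illustration, the background motion model is taken to be a pure 2D translation between matched feature points; the patent does not specify the model's actual form.

```python
import random

def ransac_translation(matches, iters=200, tol=2.0, seed=0):
    """Fit a global (dx, dy) background-motion model to matched point
    pairs ((x, y), (x2, y2)) with RANSAC: repeatedly hypothesize a model
    from a minimal sample (one pair suffices for a translation), count
    inliers within `tol` pixels, and keep the largest consensus set."""
    rng = random.Random(seed)
    best_inliers, best_model = [], (0.0, 0.0)
    for _ in range(iters):
        (x, y), (x2, y2) = rng.choice(matches)   # minimal sample: 1 pair
        dx, dy = x2 - x, y2 - y
        inliers = [m for m in matches
                   if abs(m[1][0] - m[0][0] - dx) <= tol
                   and abs(m[1][1] - m[0][1] - dy) <= tol]
        if len(inliers) > len(best_inliers):
            best_inliers, best_model = inliers, (dx, dy)
    if not best_inliers:
        return best_model, []
    # refine the model on the full consensus set (least-squares = mean)
    n = len(best_inliers)
    dx = sum(p[1][0] - p[0][0] for p in best_inliers) / n
    dy = sum(p[1][1] - p[0][1] for p in best_inliers) / n
    return (dx, dy), best_inliers
```

Subtracting the fitted (dx, dy) from the second frame compensates the camera-induced background motion, so residual motion belongs to genuinely moving targets, as in the static-background conversion described above.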
Finally, the trained fixed-point classifier model is invoked through a DSP (Digital Signal Processor) image processing chip in the airborne system, and all moving objects converted to the static background are classified according to the fixed-point data in the model (that is, the preset data required when classifying with the AdaBoost algorithm) to obtain the target object group. The target object group information includes the two-dimensional coordinates and size of each moving object in the image; in other words, the acquired image information is classified by the pre-trained fixed-point classifier to obtain the target object group information.
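The float-to-fixed conversion step described above can be sketched as a simple quantization, here in Python (the Q-format, fraction width, function names, and sample weights are illustrative assumptions; the patent does not specify the preset data structure):

```python
def to_fixed_point(values, frac_bits=8):
    # Scale each floating-point parameter by 2**frac_bits and round,
    # mimicking the float-to-fixed conversion done before the classifier
    # is deployed on a DSP without floating-point hardware.
    scale = 1 << frac_bits
    return [int(round(v * scale)) for v in values]

def from_fixed_point(values, frac_bits=8):
    # Inverse mapping, used here only to show what precision survives.
    scale = 1 << frac_bits
    return [v / scale for v in values]

weights = [0.5, -1.25, 0.0078125]   # hypothetical classifier weights
fixed = to_fixed_point(weights)     # integers safe for integer-only DSP math
```

Values representable in the chosen fraction width round-trip exactly; everything else incurs a quantization error of at most half a step.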
Specifically, in the step S22, starting the corresponding flight action according to the photographing task includes the following steps a221 to a223:
step a221, controlling the unmanned aerial vehicle to hover according to the photographing task;
For the photographing task, the unmanned aerial vehicle is first controlled to hover. The hovering height of the unmanned aerial vehicle is a preset height that can be set according to actual needs, for example 1 meter or 1.5 meters, and is not limited here.
It should be noted that the hovering height of the unmanned aerial vehicle can be detected through optical flow information; of course, it may also be detected through other sensor information.
Step a222, rotating the course of the unmanned aerial vehicle until the shooting object is shot; and/or,
After the unmanned aerial vehicle is controlled to hover, its heading is rotated until the photographic subject is captured. It should be noted that the heading may be rotated to the left or to the right, turning at a preset steering speed until the subject is captured, that is, until the subject is identified; for the specific identification manner, refer to the identification steps of the photographic subject described above, which are not repeated here.
Step a223, adjusting the height of the unmanned aerial vehicle until the shooting object is shot.
In this embodiment, if the photographic subject can be captured merely by rotating the heading of the unmanned aerial vehicle, its height need not be adjusted. If the subject still cannot be captured after rotating the heading, the height of the unmanned aerial vehicle is adjusted until the subject is captured; specifically, the height can be raised so that the viewing angle is wider.
Of course, it may also be possible to capture the photographic subject by adjusting only the height of the unmanned aerial vehicle, in which case its heading need not be rotated.
It can be understood that rotating the heading and adjusting the height of the unmanned aerial vehicle both change the shooting angle of view, so that the photographic subject can be searched for and identified more comprehensively and accurately.
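The search behaviour of steps a221 to a223 — hover, sweep the heading, and only raise the altitude when a full rotation finds nothing — can be sketched as follows (the function names, step sizes, and the `target_visible` callback standing in for the onboard detector are assumptions for illustration):

```python
def search_for_target(target_visible, yaw_step=15.0, alt_step=0.5,
                      max_alt=3.0, start_alt=1.0):
    # Sweep the heading in fixed increments at the current hover height;
    # if one full rotation fails to find the subject, climb and retry,
    # mirroring the "rotate first, then adjust height" order in the text.
    # target_visible(yaw_deg, alt_m) returns True when the subject is seen.
    alt = start_alt
    while alt <= max_alt:
        yaw = 0.0
        while yaw < 360.0:              # one full heading sweep
            if target_visible(yaw, alt):
                return yaw, alt         # pose at which the subject was found
            yaw += yaw_step
        alt += alt_step                 # widen the view before sweeping again
    return None                         # not found within the altitude budget
```

A caller would map the returned pose (or `None`) onto step S23 or S24 respectively: track the subject if found, otherwise land.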
Step S23, if the shooting object can be shot in the second preset time, tracking the shooting object for shooting until the shooting task is completed;
In this embodiment, if the photographic subject can be captured within the second preset time, the subject is tracked and photographed until the photographing task is completed. The second preset time may be set according to actual needs, for example 5 minutes or 10 minutes, and is not limited here. In addition, the second preset time can be determined from the battery level of the unmanned aerial vehicle, to prevent the battery from running so low that insufficient power causes a crash.
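Deriving the second preset time from the battery level, as suggested above, could look like the following sketch (the reserve percentage, full timeout, and linear scaling are assumptions, not taken from the patent):

```python
def second_preset_time(battery_pct, full_timeout_s=600, reserve_pct=20):
    # Shrink the search/track timeout linearly with remaining charge,
    # always keeping a fixed reserve so the drone can still land safely.
    usable = max(0, battery_pct - reserve_pct)
    return int(full_timeout_s * usable / (100 - reserve_pct))
```

With these illustrative numbers, a full battery allows the full 10-minute window, while anything at or below the reserve forces an immediate landing.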
The photographic subject is tracked as follows: after the subject is identified and located, the unmanned aerial vehicle tracks it according to the command. Specifically, the dynamic target is detected and tracked in the image, and the distance between the unmanned aerial vehicle and the dynamic target is measured from the correspondence between the target in the image and the target in the real environment, completing confirmation of the dynamic target so that it always appears at the center of the imaging plane. Target confirmation is the key link in moving-target tracking: only once the target is determined can it serve as a feedback signal to form closed-loop control and guide the tracking flight of the rotorcraft. Locating the photographic subject means that, after recognition and confirmation are complete, information such as the image angle and the size of the target object is calculated, and the relative position and coordinate information of the target is then sent to the corresponding information processing center to obtain target positioning information, so that the target is fully locked and tracked.
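The closed-loop idea above — using the target's offset from the image centre as a feedback signal — can be sketched as one proportional control step (the gains, frame size, and sign conventions are illustrative assumptions):

```python
def track_step(target_px, frame_size=(640, 480), k_yaw=0.05, k_climb=0.02):
    # Pixel error between the detected target and the image centre is fed
    # back as yaw-rate and climb-rate commands, so that repeated calls
    # drive the subject toward the centre of the imaging plane.
    cx, cy = frame_size[0] / 2, frame_size[1] / 2
    ex, ey = target_px[0] - cx, target_px[1] - cy
    # Target right of centre -> yaw right; target above centre (smaller y)
    # -> positive climb command.
    return k_yaw * ex, -k_climb * ey
```

In practice the output would feed the flight controller each frame; a real system would add rate limits and a derivative or integral term to damp oscillation.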
And step S24, if the shooting object can not be shot within the second preset time, controlling the unmanned aerial vehicle to land.
In this embodiment, if the photographic subject is not captured within the second preset time, the unmanned aerial vehicle is controlled to land. The second preset time may be set according to actual needs, for example 5 minutes or 10 minutes, and is not limited here; it can also be determined from the battery level of the unmanned aerial vehicle, to prevent the battery from running so low that insufficient power causes a crash.
It should be noted that the complete landing process includes returning, descending, and approaching the user of the drone, and finally stopping the propellers and exiting the palm control mode to enter the standby state.
In this embodiment, the unmanned aerial vehicle achieves follow-shooting by executing, during the control process, the flight code associated with the photographing task, without requiring the user to control the unmanned aerial vehicle to carry out the task, thereby further improving the control efficiency and the intelligence of unmanned aerial vehicle control.
Further, based on the first embodiment, a third embodiment of the unmanned aerial vehicle control method of the present invention is provided.
In this embodiment, the step S10 of determining whether the control instruction is triggered within the first preset time includes the following steps a11 to a13:
step a11, detecting the running state of the unmanned aerial vehicle through the acceleration sensor within a first preset time;
Starting from the triggering of the palm control mode, within a first preset time, the running state of the unmanned aerial vehicle is detected through an acceleration sensor deployed on the unmanned aerial vehicle. The acceleration sensor can measure three-axis values, and the running state of the unmanned aerial vehicle is detected according to these three-axis values.
The running state includes states such as stationary, free fall, hovering, ascending, and descending. Of course, other running states may be included according to actual needs, and are not limited here.
Step a12, when the running state is detected to be a free-fall state, judging that a control instruction is triggered within the first preset time;
step a13, if the operation state is not detected to be a free-fall state, determining that the control instruction is not triggered within the first preset time.
It can be understood that when the unmanned aerial vehicle is placed on the palm, it is in a stationary state; when it is then thrown, placed, or tossed, it enters a free-fall state, that is, it changes from the stationary state to the free-fall state. Whether the unmanned aerial vehicle is in free fall can be judged by the sum of the squares of the three-axis acceleration values being close to 0 (at rest, by contrast, the measured magnitude is close to 1 g). Specifically, when the running state is detected to be free fall, that is, a change from the stationary state to the free-fall state is detected, it is determined that a control instruction is triggered within the first preset time; if the free-fall state is not detected, it is determined that the control instruction is not triggered within the first preset time.
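The free-fall judgment from three-axis accelerometer values can be sketched as follows (the thresholds and state names are illustrative; the physical basis is that an accelerometer reads roughly 1 g at rest and near zero in free fall):

```python
import math

def detect_state(ax, ay, az, g=9.81, eps=2.0):
    # Classify one accelerometer sample (m/s^2) by its total magnitude:
    # near zero -> free fall (the throw/release trigger),
    # near 1 g  -> at rest or steady hover,
    # otherwise -> some other dynamic motion.
    mag = math.sqrt(ax * ax + ay * ay + az * az)
    if mag < eps:
        return "free_fall"
    if abs(mag - g) < eps:
        return "static_or_steady"
    return "dynamic"
```

A production implementation would require the free-fall condition to persist for several consecutive samples before triggering, to reject sensor noise.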
In this embodiment, the user can trigger the control instruction by throwing, placing, or tossing the unmanned aerial vehicle; correspondingly, the unmanned aerial vehicle can autonomously detect, through the acceleration sensor, whether the control instruction is triggered, without being controlled through a third-party terminal such as a remote controller or a mobile phone APP, thereby further reducing the operational complexity, further improving the control efficiency of the unmanned aerial vehicle, and further reducing the hardware cost.
Further, based on the first embodiment, a fourth embodiment of the unmanned aerial vehicle control method of the present invention is provided.
In this embodiment, before the step S10 of determining whether the control instruction is triggered within the first preset time, the following steps A and B are further included:
step A, initializing the unmanned aerial vehicle after the palm control mode is triggered;
and B, starting the blades of the unmanned aerial vehicle to prepare for flying.
After the palm control mode is triggered, the unmanned aerial vehicle is initialized, that is, initialization operations such as calibration are performed on each flight control device and flight device of the unmanned aerial vehicle. The blades of the drone are then started in preparation for flight.
In this embodiment, to ensure that the subsequent flight task can be carried out immediately and to prevent the unmanned aerial vehicle from crashing, the unmanned aerial vehicle is initialized after the palm control mode is triggered, that is, initialization operations such as calibration are performed on each flight control device and flight device of the unmanned aerial vehicle. The blades are then started, so that after the user throws the unmanned aerial vehicle, it has power to hover and does not fall.
Further, based on the first embodiment, a fifth embodiment of the unmanned aerial vehicle control method of the present invention is provided.
In this embodiment, after the step S10, the following steps C to E are further included:
step C, if the control instruction is not triggered within the first preset time, stopping the palm control mode;
In this embodiment, if the control instruction is not triggered within the first preset time, the palm control mode is stopped, to prevent waiting too long for the control instruction to be triggered.
In an embodiment, after the palm control mode is stopped, if the blades of the unmanned aerial vehicle have been started, they are stopped and the unmanned aerial vehicle is controlled to enter the standby state. In other embodiments, other operations may also be set as desired.
Step D, detecting whether the palm control mode is triggered again;
and E, if the palm control mode is triggered again, judging whether a control instruction is triggered within a first preset time.
After the palm control mode is stopped, whether it is re-triggered is detected. If the palm control mode is triggered again, the process returns to step S10 to determine whether the control instruction is triggered within the first preset time, so that the flight task can again be executed accurately.
In an embodiment, the user can trigger the palm control mode through a change in the attitude of the unmanned aerial vehicle. Specifically, the acceleration sensor deployed on the unmanned aerial vehicle detects whether the attitude of the unmanned aerial vehicle changes; when a change is detected, the attitude change parameters are measured by the acceleration sensor and compared against the preset thresholds, providing an anti-false-touch function. The attitude change parameters include parameters such as the attitude change distance and the attitude change acceleration; correspondingly, the preset thresholds include a corresponding attitude change distance threshold and attitude change acceleration threshold. In other embodiments, the user may also trigger the palm control mode by pressing a key continuously a preset number of times; the preset number of times may be 2, and is not limited here.
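The anti-false-touch check — requiring the attitude change parameters to exceed their preset thresholds — can be sketched as below (whether the two thresholds are combined with AND or OR is not specified in the text, so the conjunction here is an assumption, as are the parameter names and values):

```python
def gesture_triggers(delta_dist_m, delta_accel_g,
                     dist_threshold=0.15, accel_threshold=0.5):
    # Trigger the palm control mode only when BOTH attitude-change
    # parameters exceed their preset thresholds, so that small bumps
    # or vibrations do not activate the drone by mistake.
    return delta_dist_m > dist_threshold and delta_accel_g > accel_threshold
```

Raising either threshold trades sensitivity for fewer accidental activations, which is the stated purpose of the check.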
In this embodiment, if the control instruction is not triggered within the first preset time, the palm control mode is stopped to prevent the control instruction from being triggered by long-time waiting, so that the power consumption of the unmanned aerial vehicle is reduced, and the energy efficiency of the unmanned aerial vehicle is improved.
Further, based on the first embodiment, a sixth embodiment of the unmanned aerial vehicle control method of the present invention is provided.
In this embodiment, before the step S10, the following steps F-H are further included:
step F, when the change of the attitude of the unmanned aerial vehicle is detected, detecting an attitude change parameter of the unmanned aerial vehicle, and judging whether the attitude change parameter is larger than a preset threshold value or not;
To trigger the palm control mode, when the unmanned aerial vehicle is in the standby state, that is, when its blades and all flight control devices are stopped, whether the user triggers the palm control mode is detected, that is, whether the user selects the palm control mode. Specifically, when a change in the attitude of the unmanned aerial vehicle is detected, its attitude change parameter is measured and compared with the preset threshold.
It should be noted that whether the attitude of the unmanned aerial vehicle changes can be detected through the acceleration sensor deployed on the unmanned aerial vehicle; when a change is detected, the attitude change parameter can be measured by the same acceleration sensor.
The attitude change parameters include parameters such as the attitude change distance and the attitude change acceleration. Correspondingly, the preset thresholds include a corresponding attitude change distance threshold and attitude change acceleration threshold. The preset thresholds may be set according to actual needs and are not limited here.
Step G, if the attitude change parameter is larger than the preset threshold, triggering the palm control mode; or,
Then, if the attitude change parameter of the unmanned aerial vehicle is greater than the preset threshold, the palm control mode is triggered, entering the autonomous, intelligent control mode of the unmanned aerial vehicle.
And step H, triggering the palm control mode when detecting that the on-off key of the unmanned aerial vehicle is continuously pressed for a preset number of times.
Alternatively, when the on-off key of the unmanned aerial vehicle is continuously pressed the preset number of times, the palm control mode is triggered, entering the autonomous, intelligent control mode of the unmanned aerial vehicle.
The preset number of times may be set according to actual needs, for example 2 or 3 times, and is not limited here. The interval between presses should be smaller than a preset time interval, which can be set according to the actual situation, for example 0.5 second, 0.4 second, or 0.6 second, and is not limited here.
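The consecutive-press trigger with a maximum interval between presses can be sketched as follows (the function name and the timestamp-list interface are assumptions for illustration):

```python
def presses_trigger(timestamps, required=2, max_interval=0.5):
    # Trigger the palm control mode when the last `required` key presses
    # each arrive within `max_interval` seconds of the previous one;
    # values mirror the examples in the text (2 presses, 0.5 s).
    if len(timestamps) < required:
        return False
    recent = timestamps[-required:]
    return all(b - a <= max_interval for a, b in zip(recent, recent[1:]))
```

An on-device implementation would feed this from the key-interrupt handler and reset the list once the mode is triggered.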
In a specific embodiment, the on-off key may be replaced by a dedicated key, and the number of presses may be 1; the specific execution flow is substantially the same as that for the on-off key and is not repeated here.
In this embodiment, the palm control mode can be triggered by shaking the unmanned aerial vehicle so that its attitude changes, or by continuously pressing its on-off key, without a third-party terminal such as a remote controller or a mobile phone APP, thereby further reducing the operational complexity and further improving the control efficiency of the unmanned aerial vehicle. Correspondingly, the unmanned aerial vehicle can autonomously detect whether the palm control mode is triggered, further improving its intelligence. Moreover, in the process of detecting whether the attitude of the unmanned aerial vehicle changes, judging whether the attitude change parameter is greater than the preset threshold prevents false touches, thereby improving the accuracy of unmanned aerial vehicle control.
The invention also provides an unmanned aerial vehicle control device.
Referring to fig. 4, fig. 4 is a functional module schematic diagram of the first embodiment of the unmanned aerial vehicle control device of the present invention.
In this embodiment, the drone controlling device includes:
the trigger judging module 10 is configured to judge whether a control instruction is triggered within a first preset time after detecting that the palm control mode is triggered;
the task execution module 20 is configured to execute a current flight task of the unmanned aerial vehicle if the control instruction is triggered within the first preset time;
and the landing control module 30 is used for controlling the unmanned aerial vehicle to land when the current flight task is executed and completed.
Each virtual function module of the above-mentioned drone control device is stored in the memory 1005 of the drone control device shown in fig. 1, and is used to implement all functions of the drone control program; when each module is executed by the processor 1001, the unmanned aerial vehicle control function can be realized.
Further, the task execution module 20 includes:
the task acquisition unit is used for acquiring the photographing task and acquiring a photographing object of the photographing task;
the action starting unit is used for starting corresponding flight actions according to the shooting task and detecting whether the shooting object can be shot within second preset time;
the shooting tracking unit is used for tracking the shooting object to shoot until the shooting task is finished if the shooting object can be shot within the second preset time;
and the landing control unit is used for controlling the unmanned aerial vehicle to land if the shooting object cannot be shot within the second preset time.
Further, the action starting unit includes:
the hovering control subunit is used for controlling the unmanned aerial vehicle to hover according to the photographing task;
the course rotating subunit is used for rotating the course of the unmanned aerial vehicle until the shooting object is shot; and/or,
and the height adjusting subunit is used for adjusting the height of the unmanned aerial vehicle until the shooting object is shot.
Further, the triggering judgment module 10 includes:
the state detection unit is used for detecting the running state of the unmanned aerial vehicle through the acceleration sensor within a first preset time;
the instruction judging unit is used for judging that a control instruction is triggered within the first preset time when the operation state is detected to be a free-fall state;
and the instruction judging unit is further used for judging that the control instruction is not triggered within the first preset time if the operation state is not detected to be a free-fall state.
Further, the unmanned aerial vehicle control device further comprises:
the initialization module is used for initializing the unmanned aerial vehicle after the palm control mode is triggered;
the blade starting module is used for starting the blades of the unmanned aerial vehicle in preparation for flight.
Further, the unmanned aerial vehicle control device further comprises:
the mode stopping module is used for stopping the palm control mode if the control instruction is not triggered within the first preset time;
the mode detection module is used for detecting whether the palm control mode is triggered again;
and the mode triggering module is used for entering the step of judging whether to trigger the control instruction within the first preset time if the palm control mode is triggered again.
Further, the unmanned aerial vehicle control device further comprises:
the attitude detection module is used for detecting an attitude change parameter of the unmanned aerial vehicle when the unmanned aerial vehicle is detected to have a change in attitude, and judging whether the attitude change parameter is greater than a preset threshold value;
the mode triggering module is further used for triggering the palm control mode if the attitude change parameter is larger than the preset threshold; or,
and the mode triggering module is also used for triggering the palm control mode when the on-off key of the unmanned aerial vehicle is continuously pressed for the preset times.
The function implementation of each module in the unmanned aerial vehicle control device corresponds to each step in the unmanned aerial vehicle control method embodiment, and the function and implementation process are not described in detail herein.
The present invention also provides an unmanned aerial vehicle, comprising: memory, a processor and a drone control program stored on the memory and executable on the processor, the drone control program when executed by the processor implementing the steps of the drone control method of any of the above embodiments.
The specific embodiment of the unmanned aerial vehicle of the invention is basically the same as the embodiments of the unmanned aerial vehicle control method, and is not described herein again.
The present invention also provides a computer-readable storage medium having stored thereon a drone control program which, when executed by a processor, implements the steps of the drone control method according to any one of the above embodiments.
The specific embodiment of the computer-readable storage medium of the present invention is substantially the same as the embodiments of the above-mentioned unmanned aerial vehicle control method, and is not described herein again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) as described above, and includes instructions for enabling a terminal device (e.g., a drone, a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. The unmanned aerial vehicle control method is applied to an unmanned aerial vehicle, and comprises the following steps:
judging whether a control instruction is triggered within a first preset time or not after the triggering of the palm control mode is detected;
if the control instruction is triggered within the first preset time, executing the current flight task of the unmanned aerial vehicle;
and when the current flight task is executed, controlling the unmanned aerial vehicle to land.
2. The drone controlling method of claim 1, wherein the current flight mission is a photo task, the step of performing the current flight mission of the drone including:
acquiring the photographing task and acquiring a photographing object of the photographing task;
starting a corresponding flight action according to the photographing task, and detecting whether the photographing object can be photographed within a second preset time;
if the shooting object can be shot within the second preset time, tracking the shooting object for shooting until the shooting task is completed;
and if the shooting object cannot be shot within the second preset time, controlling the unmanned aerial vehicle to land.
3. The drone controlling method of claim 2, wherein the step of initiating a corresponding flight action according to the photo task comprises:
controlling the unmanned aerial vehicle to hover according to the photographing task;
rotating the course of the unmanned aerial vehicle until the shooting object is shot; and/or,
and adjusting the height of the unmanned aerial vehicle until the shooting object is shot.
4. The drone controlling method of claim 1, wherein the drone includes an acceleration sensor, and the step of determining whether to trigger a control command within a first preset time includes:
detecting the running state of the unmanned aerial vehicle through the acceleration sensor within a first preset time;
when the running state is detected to be a free-falling body state, judging that a control instruction is triggered within the first preset time;
if the operation state is not detected to be a free-fall state, it is determined that the control instruction is not triggered within the first preset time.
5. The drone controlling method of claim 1, wherein before the step of determining whether to trigger the control command within a first preset time, further comprising:
initializing the unmanned aerial vehicle after the palm control mode is triggered;
starting the blades of the drone in preparation for flight.
6. The drone controlling method according to claim 1, wherein after the step of determining whether to trigger the control command within a first preset time after detecting the palm control mode triggering, further comprising:
if the control instruction is not triggered within the first preset time, stopping the palm control mode;
detecting whether the palm control mode is triggered again;
and if the palm control mode is triggered again, the step of judging whether a control instruction is triggered within a first preset time is carried out.
7. The drone controlling method according to claim 1, wherein before the step of determining whether to trigger the control command within a first preset time after detecting the palm control mode triggering, further comprising:
when the change of the attitude of the unmanned aerial vehicle is detected, detecting an attitude change parameter of the unmanned aerial vehicle, and judging whether the attitude change parameter is greater than a preset threshold value or not;
if the attitude change parameter is larger than the preset threshold value, triggering a palm control mode; or,
and when the on-off key of the unmanned aerial vehicle is continuously pressed for the preset times, triggering the palm control mode.
8. An unmanned aerial vehicle control device, characterized in that, unmanned aerial vehicle control device includes:
the trigger judging module is used for judging whether a control instruction is triggered within first preset time or not after the trigger of the palm control mode is detected;
the task execution module is used for executing the current flight task of the unmanned aerial vehicle if the control instruction is triggered within the first preset time;
and the landing control module is used for controlling the unmanned aerial vehicle to land when the current flight task is executed and completed.
9. A drone, characterized in that it comprises: memory, a processor and a drone control program stored on the memory and executable on the processor, the drone control program when executed by the processor implementing the steps of the drone control method of any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a drone control program, which when executed by a processor implements the steps of the drone control method of any one of claims 1 to 7.
CN202110072365.5A 2021-01-19 2021-01-19 Unmanned aerial vehicle control method and device, unmanned aerial vehicle and computer readable storage medium Pending CN112711274A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110072365.5A CN112711274A (en) 2021-01-19 2021-01-19 Unmanned aerial vehicle control method and device, unmanned aerial vehicle and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN112711274A true CN112711274A (en) 2021-04-27

Family

ID=75549542

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110072365.5A Pending CN112711274A (en) 2021-01-19 2021-01-19 Unmanned aerial vehicle control method and device, unmanned aerial vehicle and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN112711274A (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103149939A (en) * 2013-02-26 2013-06-12 北京航空航天大学 Dynamic target tracking and positioning method of unmanned plane based on vision
CN103426172A (en) * 2013-08-08 2013-12-04 深圳一电科技有限公司 Vision-based target tracking method and device
CN104890889A (en) * 2015-05-13 2015-09-09 深圳一电科技有限公司 Control method of aircraft and aircraft
CN105527972A (en) * 2016-01-13 2016-04-27 深圳一电航空技术有限公司 Unmanned aerial vehicle (UAV) flight control method and device
CN105730707A (en) * 2016-04-28 2016-07-06 深圳飞马机器人科技有限公司 Manual throwing automatic takeoff method for unmanned aerial vehicles
CN106506956A (en) * 2016-11-17 2017-03-15 歌尔股份有限公司 Based on the track up method of unmanned plane, track up apparatus and system
CN106647798A (en) * 2016-09-30 2017-05-10 腾讯科技(深圳)有限公司 Take-off control method and take-off control apparatus for aircrafts
CN106873607A (en) * 2017-04-01 2017-06-20 高域(北京)智能科技研究院有限公司 Takeoff method, control device and unmanned vehicle
CN206671894U (en) * 2017-04-01 2017-11-24 高域(北京)智能科技研究院有限公司 Takeoff control device and unmanned aerial vehicle thereof
CN206804018U (en) * 2017-04-13 2017-12-26 高域(北京)智能科技研究院有限公司 Environmental data server, unmanned vehicle and alignment system
CN108521812A (en) * 2017-05-19 2018-09-11 深圳市大疆创新科技有限公司 Unmanned aerial vehicle control method, unmanned aerial vehicle, and machine-readable storage medium
CN109241820A (en) * 2018-07-10 2019-01-18 北京二郎神科技有限公司 Autonomous photographing method for unmanned aerial vehicle based on space exploration
CN111127518A (en) * 2019-12-24 2020-05-08 深圳火星探索科技有限公司 Target tracking method and device based on unmanned aerial vehicle
WO2020107310A1 (en) * 2018-11-29 2020-06-04 深圳市大疆创新科技有限公司 Unmanned aerial vehicle control method and control device and unmanned aerial vehicle


Similar Documents

Publication Publication Date Title
US11720126B2 (en) Motion and image-based control system
US20220083078A1 (en) Method for controlling aircraft, device, and aircraft
US10824149B2 (en) System and method for automated aerial system operation
US10168700B2 (en) Control of an aerial drone using recognized gestures
US11604479B2 (en) Methods and system for vision-based landing
US9973737B1 (en) Unmanned aerial vehicle assistant for monitoring of user activity
CN108572659B (en) Method for controlling unmanned aerial vehicle and unmanned aerial vehicle supporting same
WO2018209702A1 (en) Method for controlling unmanned aerial vehicle, unmanned aerial vehicle and machine-readable storage medium
KR102254491B1 (en) Automatic fly drone embedded with intelligent image analysis function
WO2018103689A1 (en) Relative azimuth control method and apparatus for unmanned aerial vehicle
US20200108914A1 (en) Unmanned aerial vehicle and method for controlling same
De Croon et al. Sky segmentation approach to obstacle avoidance
CN109196439B (en) Unmanned aerial vehicle control method and device and unmanned aerial vehicle
CN112740226A (en) Operating system and method of movable object based on human body indication
Pareek et al. Person identification using autonomous drone through resource constraint devices
JPWO2020136703A1 (en) Unmanned aerial vehicle control system, unmanned aerial vehicle control method, and program
EP3399380A1 (en) Somatosensory remote controller, somatosensory remote control flight system and method, and remote control method
Patrona et al. An overview of hand gesture languages for autonomous UAV handling
De Marsico et al. Using hands as an easy UAV joystick for entertainment applications
Jung et al. Real time embedded system framework for autonomous drone racing using deep learning techniques
Bruce et al. Ready—aim—fly! hands-free face-based HRI for 3D trajectory control of UAVs
US20210181769A1 (en) Movable platform control method, movable platform, terminal device, and system
CN112711274A (en) Unmanned aerial vehicle control method and device, unmanned aerial vehicle and computer readable storage medium
Wong et al. Low cost unmanned aerial vehicle monitoring using smart phone technology
Schelle et al. Gestural transmission of tasking information to an airborne UAV

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 2021-04-27)