WO2018103689A1 - Method and apparatus for controlling the relative orientation of a drone - Google Patents

Method and apparatus for controlling the relative orientation of a drone

Info

Publication number
WO2018103689A1
Authority
WO
WIPO (PCT)
Prior art keywords
drone
relative orientation
information
gesture
height difference
Prior art date
Application number
PCT/CN2017/114974
Other languages
English (en)
French (fr)
Inventor
任毫亮
Original Assignee
北京奇虎科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京奇虎科技有限公司
Publication of WO2018103689A1

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/106Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones

Definitions

  • the present invention relates to the field of aviation science and technology, and more particularly to a method and apparatus for controlling relative orientation of a drone.
  • A drone, also referred to as an unmanned aerial vehicle (UAV), is an unmanned aircraft operated by remote control and by an onboard program-control device.
  • To maintain the balance of the airframe and complete its tasks, an increasing number of sensors are installed on unmanned aircraft. With the development of microelectronics technology, integrating multiple high-precision sensors on small UAVs has become a reality.
  • The functions that UAVs can achieve are also increasing, and UAVs have been widely used in aerial reconnaissance, surveillance, communication, anti-submarine operations, and electronic jamming. In some usage scenarios, the drone must be controlled to maintain a given relative orientation with the user, for example to complete aerial shooting.
  • However, the flight control of a drone is greatly affected by its environment. If the user performs relative azimuth control entirely by hand, not only is a high level of flight-control skill required, but the user must also concentrate on the relative position control of the drone, making it difficult to complete other operations at the same time. Therefore, the following two methods are generally used to control the relative orientation of a drone:
  • Method 1: sensors on the drone and on the remote control device detect the orientation information of each, and relative orientation control is performed according to that information. Method 2: the drone identifies the user through image-recognition algorithms and performs relative orientation control accordingly.
  • Method 1 relies on the positioning sensor of the remote control device, offers low precision, and works against making the remote control device lightweight, degrading the human-computer interaction experience during relative orientation control. Method 2 involves image algorithms of high complexity that occupy a large share of the drone's computing resources and also limit the application scenarios of relative position control; neither effect is satisfactory.
  • The object of the present invention is to provide a method and device for controlling the relative orientation of a drone, together with a method and device for a wearable device that assist such control, which are capable of improving the efficiency of the drone's relative orientation control.
  • the present invention also provides a drone control device and a wearable device control device that are compatible with the aforementioned control method.
  • In a first aspect, an embodiment of the present invention provides a method for controlling the relative orientation of a drone, comprising the steps of: acquiring an infrared image formed after infrared light emitted by the wearable device acts on a gesture area; determining, according to the infrared image, the gesture area whose contour is delineated by the infrared light and a gesture instruction type representing a relative orientation preset value; detecting first relative orientation information between the drone and the gesture area; and controlling the flight state of the drone according to the first relative orientation information and the gesture instruction type, so that the relative orientation between the drone and the gesture area equals the preset value.
  • In a second aspect, an embodiment of the present invention provides a method for a wearable device to assist the relative orientation control of a drone, comprising the steps of: receiving, over a trusted connection, a driving instruction for driving the wearable device to emit infrared light; in response to the driving instruction, driving the wearable device's preset infrared illuminating component to emit infrared light, so that the drone can determine the gesture area and the gesture instruction type representing the relative orientation preset value from the infrared imaging and apply them to its relative orientation control; receiving an alarm instruction from the drone over the trusted connection; and, in response to the alarm instruction, controlling the wearable device to activate its vibration motor and/or turn on its indicator light to alert the user that the drone is currently in an azimuth deviation state.
  • In a third aspect, an embodiment of the present invention provides a drone relative orientation control apparatus, comprising: at least one processor; and at least one memory communicably coupled to the at least one processor. The memory includes processor-executable instructions that, when executed by the at least one processor, cause the apparatus to perform at least the following operations: acquiring an infrared image formed after infrared light emitted by the wearable device acts on a gesture area; determining, according to the infrared image, the gesture area delineated by the infrared light and the gesture instruction type representing a relative orientation preset value; detecting first relative orientation information between the drone and the gesture area; and controlling the flight state of the drone according to the first relative orientation information and the gesture instruction type, so that the relative orientation between the drone and the gesture area equals the preset value.
  • In a fourth aspect, an embodiment of the present invention provides a drone relative orientation assist control apparatus for a wearable device, comprising: at least one processor; and at least one memory communicably coupled to the at least one processor. The at least one memory includes processor-executable instructions that, when executed by the at least one processor, cause the apparatus to perform at least the following operations: receiving, over a trusted connection, a driving instruction for driving the wearable device to emit infrared light; in response to the driving instruction, driving the wearable device's preset infrared light emitting component to emit infrared light, so that the drone determines the gesture area and the gesture instruction type representing the relative orientation preset value from the infrared imaging and applies them to its relative orientation control; receiving an alarm instruction from the drone over the trusted connection; and, in response to the alarm instruction, controlling the wearable device to activate its vibration motor and/or turn on its indicator light to alert the user that the drone is currently in an azimuth deviation state.
  • In a fifth aspect, an embodiment of the present invention provides a drone control device having the function of implementing the method for controlling the relative orientation of a drone in the first aspect above. The function may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more units corresponding to the functions described above.
  • In one possible design, the structure of the drone control device includes: one or more cameras, at least one of which has an infrared imaging function; one or more sensors for detecting the relative orientation information; a memory for storing a program supporting the drone in performing the above drone relative orientation control method; a communication interface for communicating with the wearable device or another device or communication network; one or more processors for executing the program stored in the memory; and one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more programs being used to drive the one or more processors to construct units for performing the drone relative orientation control method described in the first aspect or any one of its implementations.
  • In a sixth aspect, an embodiment of the present invention provides a wearable device control device having the function of implementing the drone relative orientation assist control method for the wearable device in the second aspect. The function may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more units corresponding to the functions described above.
  • In one possible design, the structure of the wearable device control device includes: a memory for storing a program supporting the wearable device in performing the above drone relative orientation assist control method for the wearable device; a communication interface for the wearable device to communicate with a drone or another device or communication network; a vibration motor and/or indicator light for prompting the user about the current state of the drone; one or more processors for executing the program stored in the memory; an infrared lighting assembly comprising one or more infrared light sources for emitting infrared light; and one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more programs being used to drive the one or more processors to construct units for performing the drone relative orientation assist control method for the wearable device described in the second aspect or any one of its implementations.
  • In a seventh aspect, an embodiment of the present invention provides a computer program comprising computer readable code which, when run by the drone, causes the method described in the first aspect to be performed.
  • In an eighth aspect, an embodiment of the present invention provides a computer program comprising computer readable code which, when run by the wearable device, causes the method described in the second aspect to be performed.
  • In a ninth aspect, an embodiment of the present invention provides a computer readable medium in which the computer program according to the seventh or eighth aspect is stored.
  • the technical solution provided by the present invention has at least the following advantages:
  • the present invention determines a gesture area according to an infrared image formed by infrared light emitted by the wearable device, and a gesture instruction type that represents a relative orientation preset value, and detects a first relative orientation information between the drone and the gesture area. And controlling the flight state of the drone according to the first relative orientation information and the gesture instruction type, so that the relative orientation between the drone and the gesture area is the preset value.
  • The gesture area and the gesture can be recognized from the infrared imaging, and the relative orientation determined by combining this with the drone's position information, thereby realizing relative azimuth control of the drone. This reduces the amount of image-recognition computation on the drone and improves both the efficiency and the accuracy of the relative orientation control.
  • By using a wearable device capable of emitting infrared light, the user can adjust the relative orientation between the drone and himself through gesture control, without the wearable device having to provide orientation information. This reduces the cost of the wearable device, makes it lighter, and improves the user experience of drone relative orientation control.
  • The drone can maintain the preset relative orientation with the user as the user moves uphill or downhill, reducing the impact of environmental changes on the relative orientation control. When such an impact is detected, the drone sends an alarm instruction to the wearable device to prompt the user that the drone is currently in an azimuth deviation state, helping the user make timely adjustments and preventing loss of the drone and safety accidents.
  • FIG. 1 is a block diagram showing the structure of an apparatus for a relative azimuth control method of a drone according to an embodiment of the present invention
  • FIG. 2 is a schematic flow chart of a method for controlling a relative orientation of a drone according to an embodiment of the present invention
  • FIG. 3 is a schematic diagram of a scenario of a relative azimuth control process of a drone according to an embodiment of the present invention
  • FIG. 4 is a schematic diagram of a scenario of a relative azimuth control process of a drone according to an embodiment of the present invention
  • FIG. 5 is a schematic flow chart of a method for assisting a relative orientation of a drone for a wearable device according to an embodiment of the present invention
  • FIG. 6 is a structural block diagram of a relative azimuth control device for a drone according to an embodiment of the present invention.
  • FIG. 7 is a structural block diagram of a relative azimuth assist control device for a wearable device according to an embodiment of the present invention.
  • FIG. 8 is a schematic structural diagram of a drone control device according to an embodiment of the present invention.
  • FIG. 9 is a schematic structural diagram of a device for controlling a wearable device according to an embodiment of the present invention.
  • Figure 10 shows a block diagram of a drone or wearable device for performing the method according to the invention
  • Figure 11 shows a schematic diagram of a memory unit for holding or carrying program code implementing a method in accordance with the present invention.
  • A "control device" or "drone control device" as used herein includes both a device having only a wireless signal receiver without transmitting capability and a device having receiving and transmitting hardware capable of two-way communication over a two-way communication link.
  • Such devices may include cellular or other communication devices with single-line, multi-line, or no display; and portable, transportable, or vehicle-mounted (aerial, marine and/or land-based) mobile smart devices, such as drones, unmanned airships, and the like.
  • A "wearable device" or "wearable device control device" as used herein includes both a device having only a wireless signal transmitter without receiving capability and a device having both receiving and transmitting hardware. Such a device is designed to be worn on the body, especially on the arm, and includes smart bracelets, smart watches, wristbands, and the like.
  • The method of the present invention is mainly applicable to terminals with communication capability, such as drones and wearable devices, and is not limited by operating system type: Android, iOS, Windows Phone, Symbian, other operating systems, and embedded operating systems may all be used.
  • Referring to FIG. 1, a block diagram of the device structure used by the method for controlling the relative orientation of a drone is shown.
  • The overall structure includes a processor 704, a sensor module, a controller, an execution control terminal, and the like. The sensor module includes an inertial measurement unit (IMU, comprising an acceleration sensor and a gyro sensor), a magnetometer, a direction sensor, a ranging sensor, a satellite positioning sensor (such as a GPS or Beidou sensor), an image sensor, and so on, which generate the sensor data from which azimuth information, heading information, image information, positioning information, distance information, etc. are produced for drone control, thereby reflecting the various parameters of the drone's flight and enabling the drone to adjust itself.
  • the inertial sensor can detect the change of the attitude data of the drone, and the drone adjusts its posture after acquiring the attitude data to ensure flight according to the control command;
  • The distance sensor can detect the distance to an obstacle so that an obstacle-avoidance action can be performed quickly, ensuring that the airframe is not damaged.
  • The image sensor is used to acquire the infrared image formed after the infrared light emitted by the wearable device acts on the gesture area. From the infrared image, the drone determines the gesture area and a gesture instruction type representing a relative orientation preset value; detects first relative orientation information between the drone and the gesture area by using the gyro sensor, satellite positioning sensor, ranging sensor, and direction sensor; and controls the flight state of the drone according to the first relative orientation information and the gesture instruction type, so that the relative orientation between the drone and the gesture area equals the preset value.
  • The processor 704 is the core component that integrates data, controls transmission, and executes operations. It identifies specific information from the data through a series of algorithms and determines, according to that information, the operations to be executed. Those skilled in the art will understand that the processor 704 can not only integrate and send sensor data but also perform other operations; in particular, the processor 704 is capable of performing any of the drone relative orientation control methods described herein.
  • The controller is the control device that controls the drone. Generally, when a remote control device (such as a wearable device) is used as the controller, the control frequency between the drone and the controller must be set to ensure that the drone flies under effective control. The execution control terminal is used by the drone to execute operation instructions; it communicates with the processor 704 to ensure that the drone acts according to those instructions.
  • the method includes the following steps:
  • Step S11 Acquire an infrared image formed by the infrared light emitted by the wearable device acting on the gesture area.
  • UAVs usually include a camera unit, processor, storage, and the like, and perform gesture recognition based on computer vision.
  • In this embodiment, the drone includes at least one camera with an infrared imaging function; the drone acquires the infrared image formed by the infrared light emitted by the wearable device and separates the gesture area from the background area according to the infrared image, completing the gesture segmentation and achieving infrared gesture recognition.
  • the camera unit includes at least one camera.
  • the camera unit can acquire an infrared image by any one or any of an IR-CUT dual filter technology, an IR lens technology, and an infrared induction CCD technology.
  • the wearable device emits infrared light to illuminate the back of the user's hand, so that the outline of the user's gesture area is "illuminated” by the infrared light to form an infrared image of the gesture area.
  • the user's gesture area and background area are distinguished by infrared light in infrared imaging.
  • Step S12: determining, according to the infrared image, the gesture area whose contour is delineated by the infrared light and the gesture instruction type representing the relative orientation preset value.
  • the drone separates the gesture area from the background area according to the infrared image to complete the gesture segmentation, and the gesture area can be determined.
  • an image algorithm may also be utilized to cause the camera unit of the drone to lock the gesture area.
  • Specifically, determining the gesture area and the gesture instruction type from the infrared image comprises: acquiring one or more frames from the preview infrared images captured by the camera unit; determining, in those frames, the gesture area whose contour is delineated by the infrared light; extracting gesture feature data from the gesture area; and matching it against preset gesture-instruction-type description data to determine the corresponding gesture instruction type.
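The segmentation-and-matching step can be sketched as follows. This is a minimal illustration, not the patent's actual algorithm: the intensity threshold, the binary-mask templates, and the IoU acceptance threshold are all assumed values.

```python
import numpy as np

def segment_gesture(ir_frame, threshold=200):
    """Separate the infrared-lit gesture area from the background.

    The wearable device's infrared light makes the hand much brighter than
    the background in the IR frame, so a simple intensity threshold
    (the value 200 is an assumed tuning parameter) yields a binary mask."""
    return (ir_frame >= threshold).astype(np.uint8)

def match_gesture(mask, templates):
    """Match the segmented gesture mask against preset gesture templates.

    Templates are assumed to be binary masks of the same shape; the match
    score is intersection-over-union (IoU), a common template-matching
    similarity measure. Returns None if no template clears the threshold."""
    best_type, best_iou = None, 0.5  # 0.5 = assumed acceptance threshold
    for gesture_type, tpl in templates.items():
        inter = np.logical_and(mask, tpl).sum()
        union = np.logical_or(mask, tpl).sum()
        iou = inter / union if union else 0.0
        if iou > best_iou:
            best_type, best_iou = gesture_type, iou
    return best_type
```

In practice the patent's matching could equally be an HMM or a neural network, as noted below; the template-matching variant is shown only because it is the shortest to illustrate.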
  • the gesture instruction type can characterize the relative orientation preset for the drone relative orientation control.
  • the video acquired by the UAV through the camera unit can be regarded as composed of multi-frame images.
  • For static gesture recognition, only one or a few frames containing the gesture are analyzed to extract gesture feature data, which may include gesture contour data and/or gesture depth data.
  • For dynamic gesture recognition, the spatio-temporal features of the gesture must also be acquired. The common methods of dynamic gesture spatio-temporal trajectory analysis fall into two main categories: trajectory matching and state-space modeling. It is therefore necessary to analyze multiple frames to obtain the spatio-temporal trajectory generated by the gesture during movement.
  • After acquiring the infrared image, the drone separates the gesture area from the background according to the infrared image to complete gesture segmentation, determines the gesture area whose contour is delineated by the infrared light, then acquires gesture features from the gesture area and estimates gesture model parameters for gesture analysis, and finally classifies the gesture according to the model parameters to determine the corresponding gesture instruction type, realizing infrared gesture recognition.
  • The recognition method may be based on template matching, on a hidden Markov model (HMM), or on a neural network.
  • In this way, the drone can determine the corresponding instruction type from the infrared-imaged gesture. This reduces the occupation of computing resources, shortens the response time required when the user controls the drone's relative position by gesture recognition against complex backgrounds or in dim light, and improves the efficiency and accuracy of human-computer interaction, especially while the drone and/or the user is moving.
  • Step S13 detecting first relative orientation information between the drone and the gesture area.
  • The drone determines the relative position according to the relative orientation information between itself and the gesture area. It should be noted that the first relative orientation information includes any one or more of: the distance information, azimuth information, elevation angle information, and height difference information between the drone and the gesture area, and the positioning information of the drone. "First relative orientation information" is thus a general term; in a specific application, the particular data items can be chosen from those listed here as needed.
  • Specifically, detecting the first relative orientation information between the drone and the gesture area comprises: detecting the drone's positioning information with its satellite positioning sensor; detecting the distance between the drone and the gesture area with the ranging sensor; detecting the azimuth between the drone and the gesture area with the drone's direction sensor; detecting the elevation angle between the drone and the gesture area with the gyro sensor and calculating the height difference between them from the distance and the elevation angle; and calculating the relative orientation information between the drone and the gesture area from the positioning information, height difference information, azimuth information, and distance information.
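The height-difference calculation from the ranging distance and the elevation angle is a right-triangle decomposition, which can be sketched as follows (the function name and units are illustrative; sensor noise handling is omitted):

```python
import math

def relative_orientation(slant_distance_m, elevation_deg):
    """Resolve the ranging sensor's slant distance l and the gyro-derived
    elevation angle theta into a horizontal distance and a height difference:
        height_diff = l * sin(theta),  horizontal = l * cos(theta)."""
    theta = math.radians(elevation_deg)
    height_diff = slant_distance_m * math.sin(theta)
    horizontal = slant_distance_m * math.cos(theta)
    return horizontal, height_diff
```

For example, a 10 m slant range at a 30° elevation angle resolves into roughly an 8.66 m horizontal distance and a 5 m height difference.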
  • By adjusting the horizontal distance between the drone and the gesture area, horizontal movement of the drone relative to the user can be achieved. The head orientation of the drone, or the shooting direction of its camera unit, can be adjusted so that when the position of the gesture area changes, the drone reorients its nose or camera unit accordingly, keeping the camera unit locked on the user at all times. The relative height between the drone and the operator can also be adjusted, so that the drone maintains the preset relative orientation while the user moves uphill or downhill.
  • The positioning information in the first relative orientation information may represent the latitude and longitude coordinates A(x1, y1) of the drone, acquired by the drone's satellite positioning sensor. The positioning function is implemented through a satellite positioning system to which the sensor connects, including but not limited to the GPS, Beidou, GLONASS, or Galileo positioning systems.
  • The distance is the straight-line distance l between the drone and the gesture area; the ranging sensor is a laser ranging sensor and/or an infrared ranging sensor.
  • The azimuth angle αAB, also called the Azimuth (Az for short), denotes the horizontal angle measured clockwise from the drone's true-north direction to the direction line toward the gesture area; in the Android system, for example, it can be obtained via public static float[] getOrientation(float[] R, float[] values). From the positioning information, distance, and azimuth, the positioning coordinates B(x2, y2) of the current gesture area can be obtained.
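Deriving B(x2, y2) from the drone's fix A(x1, y1), the azimuth, and the horizontal distance can be sketched with an equirectangular small-distance approximation. The patent does not specify a projection, so this formulation is an assumption, adequate only for the short ranges involved:

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, metres

def project_position(lat_deg, lon_deg, azimuth_deg, horizontal_m):
    """Estimate the gesture-area coordinates B from the drone's own fix A,
    the azimuth toward the target, and the horizontal distance, using an
    equirectangular small-distance approximation (no geodesic accuracy)."""
    az = math.radians(azimuth_deg)
    dlat = horizontal_m * math.cos(az) / EARTH_RADIUS_M
    dlon = horizontal_m * math.sin(az) / (
        EARTH_RADIUS_M * math.cos(math.radians(lat_deg)))
    return lat_deg + math.degrees(dlat), lon_deg + math.degrees(dlon)
```

A point 100 m due east (azimuth 90°) of A leaves the latitude essentially unchanged and increases only the longitude, as expected.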
  • A ranging sensor or ultrasonic ranging sensor may also be used to detect the relative height between the drone and the ground, which can substitute for the height difference information in some usage scenarios (such as open, flat terrain).
  • From the above, the first relative orientation information of the drone is obtained, representing the relative orientation between the drone and the gesture area. The first relative orientation information may be derived from the sensor readings using an information fusion algorithm, such as a Kalman filtering algorithm, to improve accuracy.
  • The first relative orientation information may be decomposed into the positioning information, the height difference information, the azimuth information, and the distance information, or it may be a data packet containing them. In either case it characterizes the relative orientation between the drone and the gesture area for use in the drone's relative orientation control.
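As a sketch of the Kalman-filter fusion mentioned above, a minimal one-dimensional filter for smoothing a noisy range reading might look like this (the process and measurement variances q and r are assumed tuning values, not values from the patent):

```python
class Kalman1D:
    """Minimal one-dimensional Kalman filter with a constant-value process
    model, illustrating how a noisy ranging measurement can be smoothed."""

    def __init__(self, x0, p0=1.0, q=0.01, r=0.5):
        self.x, self.p, self.q, self.r = x0, p0, q, r

    def update(self, z):
        self.p += self.q                # predict: grow uncertainty
        k = self.p / (self.p + self.r)  # Kalman gain
        self.x += k * (z - self.x)      # correct with measurement z
        self.p *= (1.0 - k)             # shrink uncertainty
        return self.x
```

Repeated updates converge the estimate toward the measured distance while damping individual sensor jitter; a real implementation would fuse several sensor channels with a multi-dimensional state.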
  • Step S14 controlling the flight state of the drone according to the first relative orientation information and the gesture instruction type, so that the relative orientation between the drone and the gesture area is the preset value.
  • Specifically, second relative orientation information may be calculated from the first relative orientation information and the relative orientation preset value represented by the gesture instruction type; the drone is then controlled according to the second relative orientation information to adjust its flight state and thereby its relative orientation with the gesture area.
  • The relative orientation preset value may include the adjusted distance, azimuth angle, and height difference between the drone and the gesture area.
  • The drone is controlled to adjust its heading angle according to the azimuth information in the second relative orientation information, so that its nose faces the gesture area; the height difference may be the preset height difference or the same height difference as characterized by the first relative orientation information.
  • The coordinate position C(x3, y3) represented by the position information in the second relative orientation information may be obtained, according to the foregoing formula, from the drone's coordinate position A(x1, y1) represented by the position information in the first relative orientation information together with the preset value.
  • By adjusting the distance, the horizontal distance between the drone and the operator can be changed, achieving horizontal movement relative to the operator. By adjusting the nose orientation of the drone, which in effect controls the shooting direction of the camera unit, the drone can reorient its nose so that the camera always stays locked on the operator. By adjusting the height difference, the height between the drone and the operator can be changed, allowing the drone to follow the operator's uphill and downhill movement.
  • For example, starting from the first relative orientation information and the preset value, only the distance between the drone and the gesture area may be changed while keeping the azimuth and height difference constant, so that the drone maintains the preset distance from the gesture area; or only the azimuth may be changed while keeping the distance and height difference constant, so that the drone performs operations such as shooting while circling the gesture area.
  • the user can realize the control of the relative orientation of the drone simply and quickly.
• after the relative orientation between the drone and the gesture area reaches the preset value, the method further includes the step of detecting third relative orientation information between the drone and the gesture area.
• the principle and procedure for detecting the third relative orientation information are the same as for the first relative orientation information, and are not described again here. Further, according to actual needs, at least one of the following solutions may be included:
• the flight state of the drone is controlled such that the height difference between the drone and the gesture area is raised to the first preset height difference.
• the flight state of the drone is controlled such that the height difference between the drone and the gesture area is reduced to the second preset height difference.
• the rate of change of the distance represented by the third relative orientation information may represent the relative speed between the user and the drone; by adjusting the flight speed of the drone, the distance between the drone and the gesture area is kept within a predetermined distance range.
• the preset speed range and the predetermined distance range may be set in advance according to actual needs and/or set correspondingly according to the relative orientation preset value represented by the gesture instruction type, so as to ensure that the drone follows the user well without losing it, and/or to ensure that the drone can capture clearer images when shooting.
• since the update frequency of the ranging sensor is generally low, when the gesture area moves at a relatively fast speed, that is, when the user moves quickly, the response of the drone is often not fast enough and lags behind.
• the ranging sensor can be used to measure the distance to the gesture area, the motion speed of the gesture area can be calculated periodically, and the following speed of the drone can be adjusted in real time according to that motion speed.
• the drone can thus adjust its flight speed according to the moving speed of the gesture area, keeping the relative orientation between the drone and the user within a preset range, achieving a good following effect and improving the user's interaction experience with the drone during fast motion.
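• the following-speed adjustment described above can be sketched as a simple proportional controller with the gesture area's estimated motion speed used as a feedforward term (the function names, gain, and limits are illustrative assumptions, not values from the original disclosure):

```python
def estimate_speed(prev_dist, curr_dist, dt, drone_speed):
    """Gesture-area speed along the line of sight (1-D sketch):
    rate of change of the measured separation plus the drone's own speed."""
    return (curr_dist - prev_dist) / dt + drone_speed

def follow_speed(distance_m, target_distance_m, gesture_speed_mps,
                 gain=0.8, max_speed_mps=10.0):
    """Commanded flight speed: feedforward on the gesture area's speed so
    the drone keeps up between slow ranging-sensor updates, plus a
    proportional correction of the remaining distance error."""
    error = distance_m - target_distance_m
    speed = gesture_speed_mps + gain * error
    return max(-max_speed_mps, min(max_speed_mps, speed))
```

the feedforward term is what mitigates the lag caused by the ranging sensor's low update frequency: even with stale distance readings, the drone keeps moving at the user's last estimated speed.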
• when the height difference between the user and the drone is less than the first preset height difference, the flight state of the drone is controlled so that the height difference is raised to the first preset height difference.
• in this way, the drone can maintain the preset relative height with the operator by adjusting its own height while the operator ascends, avoiding a collision accident.
  • the first preset height difference is 0.5 meters.
  • the second preset height difference is 0.5 meters.
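• the two height-difference bounds above can be expressed as a simple clamp on the commanded height difference (a minimal sketch; the function name is an assumption, and the 0.5 m defaults follow the example values given in the text):

```python
def clamp_height_difference(height_diff_m, min_diff_m=0.5, max_diff_m=None):
    """Apply the first (minimum) preset height difference and, if given,
    the second (maximum) preset height difference to a measured value.
    Returns the height difference the drone should fly to."""
    if height_diff_m < min_diff_m:
        return min_diff_m          # raise to the first preset height difference
    if max_diff_m is not None and height_diff_m > max_diff_m:
        return max_diff_m          # reduce to the second preset height difference
    return height_diff_m
```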
• the drone relative orientation control method shown in this embodiment of the present invention adjusts the flight state of the drone in real time according to the relative orientation between the drone and the user, thereby achieving relative orientation control between the drone and the user.
• the complexity of controlling the drone is reduced, and the hidden danger of operation errors is also reduced.
  • the user in this embodiment may not only be a real person, but also other drones, or devices such as cars on the ground.
  • Another embodiment is an improvement based on the previous embodiment.
• when the drone determines that any of the following azimuth deviation state conditions is satisfied, it sends an alarm instruction to the wearable device based on the trusted connection: the rate of change of the distance represented by the third relative orientation information is greater than the preset speed range; the height difference represented by the third relative orientation information is less than the first preset height difference; or the height difference represented by the third relative orientation information is greater than the second preset height difference.
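• the three azimuth-deviation conditions can be evaluated together as a simple check (a sketch only; the threshold values and message strings are illustrative assumptions):

```python
def azimuth_deviation_alarms(dist_rate_mps, height_diff_m,
                             max_rate_mps=3.0,
                             first_preset_m=0.5, second_preset_m=2.0):
    """Return the list of azimuth-deviation conditions currently met,
    mirroring the three alarm conditions in the text."""
    alarms = []
    if abs(dist_rate_mps) > max_rate_mps:
        alarms.append("distance changing too fast")
    if height_diff_m < first_preset_m:
        alarms.append("height difference below first preset")
    elif height_diff_m > second_preset_m:
        alarms.append("height difference above second preset")
    return alarms
```

a non-empty result would trigger sending the alarm instruction to the wearable device over the trusted connection.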
• when the drone detects any of the above changes, it sends an alarm instruction to the wearable device to prompt the user that the drone is currently in an azimuth deviation state, which helps the user make corresponding adjustments in time and reduces the hidden dangers of drone loss and safety accidents.
• to implement the drone's drive control of the infrared light emitted by the wearable device and reduce the power consumption of the wearable device, the method further includes the following preliminary step: sending, based on the trusted connection, a drive instruction to the wearable device for driving it to emit infrared light.
• the drone and the wearable device are typically connected via a communication link to transmit data and instructions.
• wireless communication is preferably used.
• when the distance is long or the electromagnetic environment is complicated, a signal amplifying device such as a signal repeater can be added.
• a trusted connection is preferably adopted, so that only a drone and a remote control device that have passed identity (ID) authentication can interact.
• another embodiment of the present invention further includes the following preliminary steps: authenticating the wearable device through the communication connection; and, when the identity verification succeeds, establishing a trusted connection between the drone and the wearable device.
• through this preliminary step, only a wearable device that has passed identity (ID) authentication can establish a trusted connection with the drone and thereby interact with it, preventing misjudgment by the identification device or malicious interference, and improving system accuracy and security.
• the implementation of the present invention can determine the gesture area and recognize gestures by infrared imaging, and determine the relative orientation in combination with the position information of the drone itself, thereby improving the relative orientation control efficiency of the drone and the user experience.
  • the method includes the following steps:
• Step S21: based on the trusted connection, receive a drive instruction from the drone for driving the wearable device to emit infrared light.
  • the wearable device interacts with the drone through gesture recognition and communication connections.
  • the communication connection uses a wireless communication connection.
• when the distance between the wearable device and the drone is relatively long, or the environmental electromagnetic conditions are complicated, a signal amplifying device such as a signal repeater can be added.
  • a trust connection manner may also be adopted, so that only the drone and the wearable device that have been authenticated by identity (ID) can perform the interaction operation.
  • ID identity
• the trusted connection between the drone and the wearable device may be any one or more of a Bluetooth trusted connection, a near field communication (NFC) trusted connection, a UWB trusted connection, a ZigBee trusted connection, or an Internet trusted connection.
  • the wearable device receives a drive command of the drone based on the connection, and the drive command is used to drive the wearable device to emit infrared light.
• Step S22: in response to the drive instruction, drive the infrared light-emitting component preset in the wearable device to emit infrared light, so that the drone determines, based on infrared imaging, the gesture area and the gesture instruction type characterizing the relative orientation preset value, to be applied to the relative orientation control of the drone.
  • the wearable device includes an infrared illumination assembly, and in one embodiment, the infrared light diodes of the infrared illumination assembly are arranged linearly along the sides of the wear device.
  • the infrared illuminating component presets one or more infrared point light sources, such as infrared light emitting diodes, for emitting infrared light.
  • Another embodiment is an improvement made on the basis of the previous embodiment.
• the wearable device is adapted to be worn on the arm such that the gesture implementation area is located between the wearable device and the identification device.
• the wearable device, in response to the drive instruction of the drone, drives its preset infrared light-emitting component to emit infrared light and form an infrared ring, so that the drone determines, based on infrared imaging, the gesture area and the gesture instruction type characterizing the relative orientation preset value, to be applied to the relative orientation control of the drone.
  • the drone can generate a corresponding gesture interaction event based on the infrared imaging capture gesture.
• the infrared light emitted by the wearable device is diffusely reflected on the back of the user's hand, "illuminating" the outline of the hand, so that in infrared imaging the user's gesture area is distinguished from the background area by the infrared light.
• the infrared light may also be partially absorbed by the hand, making the infrared image of the hand even more distinct.
• the drone performs gesture segmentation based on infrared imaging, which reduces the processor's computational load, shortens the response time, and improves the efficiency and accuracy of gesture recognition and drone relative orientation control; the effect is particularly significant when the drone or the user is moving.
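• the IR-based gesture segmentation described above can be sketched as a simple intensity threshold: the IR-lit hand is much brighter in the infrared image than the background, so a single cut-off separates the two (the function name and the threshold value are illustrative assumptions):

```python
import numpy as np

def segment_gesture(ir_frame, threshold=200):
    """Separate the IR-lit gesture area from the background by intensity
    thresholding, and return the gesture bounding box plus the mask.
    Returns (None, mask) when no bright region is present."""
    mask = ir_frame >= threshold
    if not mask.any():
        return None, mask
    ys, xs = np.nonzero(mask)
    bbox = (xs.min(), ys.min(), xs.max(), ys.max())  # (x0, y0, x1, y1)
    return bbox, mask
```

compared with segmenting a hand from an arbitrary visible-light background, this is a constant-time pixel test per frame, which is why the text can claim reduced computation and faster response.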
• in response to the drive instruction, the wearable device may drive only some of the one or more preset infrared point light sources in the infrared light-emitting component to emit infrared light.
  • the energy consumption of the wearable device can be reduced and the use time can be prolonged while ensuring the use effect.
• the infrared light-emitting component is controlled to emit infrared light in the wavelength range of 0.76 to 2.5 μm.
• in this way, the hand contour in the infrared image acquired by the identification device is mainly formed by the infrared light reflected by the hand, which is safer for the human body and gives a better recognition effect.
• Step S23: receive an alarm instruction from the drone based on the trusted connection.
  • the wearable device and the drone can interact through wireless communication.
• when the distance between the wearable device and the drone is relatively long, or the environmental electromagnetic conditions are complicated, a signal amplifying device such as a signal repeater can likewise be added.
• a trusted connection manner may also be adopted, so that only a drone and a wearable device that have passed identity (ID) authentication can interact.
• based on the trusted connection, the wearable device receives the alarm instruction that the drone sends when it determines that an azimuth deviation state condition is satisfied, for prompting the user that the drone is currently in an azimuth deviation state.
• Step S24: in response to the alarm instruction, control the wearable device to activate its vibration motor and/or turn on its indicator light to prompt the user that the drone is currently in an azimuth deviation state.
• after receiving the alarm instruction, the wearable device activates the vibration motor and/or turns on the indicator light accordingly.
• the vibration mode of the vibration motor and/or the flashing mode of the indicator light may be a preset mode, or a mode set correspondingly for the different drone azimuth deviation states characterized by the alarm instruction; it may be set at the factory or set by the user.
• the method further includes the following concurrent step: counting the working duration of the infrared light-emitting component, and controlling the infrared light-emitting component to stop emitting infrared light when the working duration exceeds a predetermined length of time. This allows the wearable device to automatically turn off the infrared light-emitting component after it has been illuminated for the predetermined length of time, effectively preventing waste of electric energy due to user negligence and the like; the user can also set the predetermined length of time to control the illumination duration of the infrared light-emitting component, improving efficiency.
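• the auto-off behaviour can be sketched as a small timer object (the class name, default limit, and injectable clock are illustrative assumptions; the clock parameter just makes the logic testable):

```python
import time

class IrEmitterTimer:
    """Track the infrared component's working duration and report when it
    should be switched off after a user-settable predetermined length of time."""

    def __init__(self, limit_s=300.0, clock=time.monotonic):
        self.limit_s = limit_s      # predetermined length of time, seconds
        self.clock = clock
        self.started_at = None

    def start(self):
        """Call when the IR component begins emitting."""
        self.started_at = self.clock()

    def should_stop(self):
        """True once the working duration exceeds the predetermined limit."""
        return (self.started_at is not None
                and self.clock() - self.started_at > self.limit_s)
```

the firmware would poll `should_stop()` in its main loop and cut power to the IR LEDs when it returns true.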
• the drone relative orientation assist control method for the wearable device of the present invention further includes the following steps: sending an identity verification request to the drone through the communication connection; and, when the identity verification succeeds, establishing a trusted connection between the wearable device and the drone.
• the implementation of the present invention can improve the relative orientation control efficiency of the drone and the user experience by performing infrared gesture and communication interaction with the drone.
• based on the above drone relative orientation control method, the present invention further proposes a drone relative orientation control device.
• it includes an imaging unit 11, an identification unit 12, a detecting unit 13, and a control unit 14, which are specifically disclosed as follows:
  • the imaging unit 11 is configured to acquire an infrared image formed by the infrared light emitted by the wearable device acting on the gesture area.
  • the drone usually includes a camera unit 11, a processor, a memory, etc., and performs gesture recognition based on computer vision.
• the drone includes at least one camera having an infrared imaging function; the drone acquires an infrared image formed by the infrared light emitted by the wearable device, and separates the gesture area from the background area according to the infrared image to complete gesture segmentation and achieve infrared gesture recognition.
  • the camera unit 11 includes at least one camera.
  • the camera unit 11 can acquire an infrared image by any one or any of IR-CUT dual filter technology, IR lens technology, and infrared sensing CCD technology.
  • the wearable device emits infrared light to illuminate the back of the user's hand, so that the outline of the user's gesture area is "illuminated” by the infrared light to form an infrared image of the gesture area.
  • the user's gesture area and background area are distinguished by infrared light in infrared imaging.
  • the identification unit 12 is configured to determine a gesture area that describes the contour by the infrared light and a gesture instruction type that characterizes the relative orientation preset value according to the infrared image.
  • the drone separates the gesture area from the background area according to the infrared image to complete the gesture segmentation, and the gesture area can be determined.
  • an image algorithm may also be utilized to cause the camera unit 11 of the drone to lock the gesture area.
• the identification unit 12 determining the gesture area and the gesture instruction type according to the infrared image includes: acquiring one or more frames of images from the preview infrared image acquired by the imaging unit 11; determining, in those frames, the gesture area whose contour is described by the infrared light; and extracting gesture feature data based on the gesture area and matching it with preset gesture instruction type description data to determine the corresponding gesture instruction type.
  • the gesture instruction type can characterize the relative orientation preset for the drone relative orientation control.
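• the matching of extracted gesture feature data against preset gesture-instruction-type descriptions can be sketched as nearest-template classification (a minimal sketch; the feature vectors, template names, and distance threshold are all illustrative assumptions):

```python
def match_gesture(feature, templates, max_dist=0.5):
    """Classify an extracted gesture feature vector by Euclidean distance
    to preset gesture-instruction-type template vectors.
    Returns the matching instruction type, or None if nothing is close."""
    best_type, best_d = None, float("inf")
    for gtype, tmpl in templates.items():
        d = sum((a - b) ** 2 for a, b in zip(feature, tmpl)) ** 0.5
        if d < best_d:
            best_type, best_d = gtype, d
    return best_type if best_d <= max_dist else None
```

each recognized instruction type would then map to a relative orientation preset value (distance, azimuth, height difference) for the control step.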
  • the video acquired by the UAV through the camera unit 11 can be regarded as composed of a multi-frame image.
• for static gesture recognition, only one or a few frames containing the gesture are analyzed to extract gesture feature data, which may include gesture contour data and/or gesture depth data.
• for dynamic gesture recognition, the spatio-temporal features of the gesture must also be acquired.
• the common methods of dynamic gesture spatio-temporal trajectory analysis fall mainly into two categories: trajectory matching and state space modeling. It is therefore necessary to analyze multiple frames of images to obtain the spatio-temporal trajectory produced by the gesture in motion.
• the identification unit 12 separates the gesture area from the background area according to the infrared image to complete gesture segmentation and determine the gesture area described by the infrared light; it then acquires gesture features from the gesture area, estimates the gesture model parameters, analyzes the gesture, and classifies it according to the model parameters to determine the corresponding gesture instruction type, thereby achieving infrared gesture recognition.
• the identification method may be based on template matching, on a hidden Markov model (HMM), or on a neural network.
• in this way, the drone can determine the corresponding instruction type from gestures captured by infrared imaging. This reduces the occupation of computing resources, shortens the response time required for the user to control the relative orientation of the drone by gesture recognition in complex or dimly lit backgrounds, and improves the efficiency and accuracy of human-computer interaction, especially when the drone and/or the user is moving.
  • the detecting unit 13 is configured to detect first relative orientation information between the drone and the gesture area.
• the drone determines the relative position according to the relative orientation information between the drone and the gesture area. It should be noted that the first relative orientation information includes any one or more of the distance information, azimuth information, height angle information, and height difference information between the drone and the gesture area, and the positioning information of the drone. The term "first relative orientation information" is thus a general one; in a specific application, the specific data it contains can be chosen as needed from those listed here.
• the process by which the detecting unit 13 detects the first relative orientation information between the drone and the gesture area includes: detecting the positioning information of the drone through the satellite positioning sensor of the drone; detecting the distance information between the drone and the gesture area through the ranging sensor of the drone; detecting the azimuth information between the drone and the gesture area through the direction sensor of the drone; detecting the height angle information between the drone and the gesture area through the gyro sensor of the drone, and calculating the height difference information between the drone and the gesture area from the distance information and the height angle information; and calculating the relative orientation information between the drone and the gesture area from the positioning information, the height difference information, the azimuth information, and the distance information.
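• the height-difference calculation from the measured straight-line distance and the height angle reduces to basic trigonometry, h = l·sin(θ), with the horizontal component d = l·cos(θ) (the function names below are illustrative):

```python
import math

def height_difference(line_of_sight_m, elevation_deg):
    """Height difference between drone and gesture area from the measured
    straight-line distance l and the height (elevation) angle: h = l * sin(theta)."""
    return line_of_sight_m * math.sin(math.radians(elevation_deg))

def horizontal_distance(line_of_sight_m, elevation_deg):
    """Horizontal component of the same measurement: d = l * cos(theta)."""
    return line_of_sight_m * math.cos(math.radians(elevation_deg))
```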
• by means of the distance information, the horizontal distance between the drone and the gesture area can be adjusted to achieve horizontal movement of the drone relative to the user.
• by means of the azimuth information, the nose orientation of the drone can be adjusted, or the shooting direction of the imaging unit 11 in the drone can be controlled, so that when the position of the gesture area changes, the drone can adjust the nose or the orientation of the imaging unit 11 accordingly, enabling the imaging unit 11 to always keep the user locked in view.
• by means of the height difference information, the relative height between the drone and the operator can be adjusted, so that the drone maintains the preset relative orientation when the user moves uphill or downhill.
• the positioning information in the first relative orientation information may represent the latitude and longitude coordinates A(x1, y1) of the drone, acquired by a satellite positioning sensor installed on the drone; the satellite positioning sensor implements its positioning function based on the satellite positioning system to which it is connected, which includes but is not limited to: the GPS positioning system, the Beidou positioning system, the GLONASS positioning system, or the Galileo positioning system;
• the distance is the straight-line distance l between the drone and the gesture area, and the ranging sensor is a laser ranging sensor and/or an infrared ranging sensor;
• the azimuth angle αAB, also called the azimuth (Az for short), indicates the horizontal angle measured clockwise from the north direction of the drone to the direction line toward the gesture area; in the Android system, for example, it can be obtained through public static float[] getOrientation(float[] R, float[] values);
• from these, the positioning coordinates B(x2, y2) of the current gesture area can be obtained.
• a ranging sensor or an ultrasonic ranging sensor may also be used to detect the relative height between the drone and the ground, which may substitute for the height difference information in some usage scenarios (such as open, flat ground).
• by means of the distance information, the horizontal distance between the drone and the gesture area can be adjusted to achieve horizontal movement of the drone relative to the user.
• by means of the azimuth information, the nose orientation of the drone can be adjusted, or the shooting direction of the imaging unit 11 in the drone can be controlled, so that when the position of the gesture area changes, the drone can adjust the nose or the orientation of the imaging unit 11 accordingly, enabling the imaging unit 11 to always keep the user locked in view.
• by means of the height difference information, the height between the drone and the user can be adjusted, so that the drone maintains the relative orientation when the user moves uphill or downhill.
  • the first relative orientation information of the drone is obtained, which is used to represent the relative orientation between the drone and the gesture area.
  • the first relative orientation information may be obtained according to the information using an information fusion algorithm, such as a Kalman filter algorithm, to improve accuracy.
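• the Kalman-filter fusion mentioned above can be illustrated with a single scalar update step, e.g. smoothing the noisy measured distance (a minimal sketch; the noise parameters q and r are illustrative assumptions, and a real implementation would fuse several sensor channels in a multi-dimensional state):

```python
def kalman_1d(z_measured, x_prev, p_prev, q=0.01, r=0.5):
    """One scalar Kalman filter step.
    z_measured: new sensor reading; x_prev/p_prev: previous state estimate
    and its variance; q: process noise; r: measurement noise."""
    # predict (constant-value model: the state carries over, uncertainty grows)
    x_pred, p_pred = x_prev, p_prev + q
    # update: blend prediction and measurement by the Kalman gain
    k = p_pred / (p_pred + r)
    x_new = x_pred + k * (z_measured - x_pred)
    p_new = (1 - k) * p_pred
    return x_new, p_new
```

each new ranging-sensor reading would be passed through this step so that the distance term in the first relative orientation information is less jittery than the raw measurement.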
• the first relative orientation information may be decomposable into the positioning information, the height difference information, the azimuth information, and the distance information.
  • the first relative orientation information may also be a data packet including the positioning information, the height difference information, the azimuth information, and the distance information.
  • the first relative orientation information characterizes the relative orientation between the drone and the gesture area for the relative orientation control of the drone.
  • the control unit 14 is configured to control the flight state of the drone according to the first relative orientation information and the gesture instruction type, so that the relative orientation between the drone and the gesture region is The preset value.
• the second relative orientation information may be calculated from the first relative orientation information and the relative orientation preset value represented by the gesture instruction type; the control unit 14 then, according to the second relative orientation information, controls the drone to adjust its flight state and adjust the relative orientation between the drone and the gesture area.
  • the relative orientation preset value may include a distance, an azimuth angle, and a height difference between the adjusted drone and the gesture area.
• the control unit 14 can flexibly select at least one of the following according to actual requirements, so that the relative orientation between the drone and the gesture area reaches the preset value:
  • the drone is controlled to adjust the heading angle according to the azimuth information in the second relative orientation information, so that the drone is oriented toward the gesture area.
• the drone is controlled to maintain either the height difference given by the preset value or the same height difference as that characterized by the first relative orientation information.
• the coordinate position C(x3, y3) represented by the position information in the second relative orientation information may be calculated, according to the foregoing formula, from the drone coordinate position A(x1, y1) represented by the position information in the first relative orientation information and the preset value.
• by adjusting the distance, the horizontal distance between the drone and the operator can be changed to achieve horizontal movement of the drone relative to the operator.
• by adjusting the azimuth, the nose orientation of the drone can be changed, which in effect controls the shooting direction of the imaging unit 11 in the drone.
• when the position of the gesture area changes, the drone can adjust its nose orientation accordingly, so that the camera always keeps the operator locked in view.
• by adjusting the height difference, the height of the drone relative to the operator can be changed, enabling the drone to follow the operator's uphill and downhill movement.
• the control unit 14 may, according to the first relative orientation information and the preset value, change only the distance between the drone and the gesture area while keeping the azimuth and the height difference constant, so that the drone and the gesture area remain at the preset distance; or change only the azimuth while keeping the distance and the height difference constant, so that the drone performs operations such as orbiting the gesture area while shooting.
  • the user can realize the control of the relative orientation of the drone simply and quickly.
• after the relative orientation between the drone and the gesture area reaches the preset value, the detecting unit 13 is further configured to detect third relative orientation information between the drone and the gesture area.
• the principle and procedure for detecting the third relative orientation information are the same as for the first relative orientation information, and are not described again here.
  • a determining unit is further included, and the determining unit and the control unit 14 can be configured according to at least one solution according to actual requirements:
• the determining unit is configured to determine whether the rate of change of the distance characterized by the third relative orientation information is greater than a preset speed range; the control unit 14 is configured to, when it is greater than the preset speed range, control the drone to adjust its flight speed such that the distance between the drone and the gesture area is within a predetermined distance range.
• the determining unit is configured to determine whether the height difference represented by the third relative orientation information is less than the first preset height difference; the control unit 14 is configured to, when it is less than the first preset height difference, control the flight state of the drone such that the height difference between the drone and the gesture area is raised to the first preset height difference.
• the determining unit is configured to determine whether the height difference represented by the third relative orientation information is greater than a second preset height difference; the control unit 14 is configured to, when it is greater than the second preset height difference, control the flight state of the drone such that the height difference between the drone and the gesture area is reduced to the second preset height difference.
• the rate of change of the distance represented by the third relative orientation information may represent the relative speed between the user and the drone; by adjusting the flight speed of the drone, the distance between the drone and the gesture area is kept within a predetermined distance range.
• the preset speed range and the predetermined distance range may be set in advance according to actual needs and/or set correspondingly according to the relative orientation preset value represented by the gesture instruction type, so as to ensure that the drone follows the user well without losing it, and/or to ensure that the drone can capture clearer images when shooting.
• since the update frequency of the ranging sensor is generally low, when the gesture area moves at a relatively fast speed, that is, when the user moves quickly, the response of the drone is often not fast enough and lags behind.
• the ranging sensor can be used to measure the distance to the gesture area, the motion speed of the gesture area can be calculated periodically, and the following speed of the drone can be adjusted in real time according to that motion speed.
• the drone can thus adjust its flight speed according to the moving speed of the gesture area, keeping the relative orientation between the drone and the user within a preset range, achieving a good following effect and improving the user's interaction experience with the drone during fast motion.
• when the height difference between the user and the drone is less than the first preset height difference, the flight state of the drone is controlled so that the height difference is raised to the first preset height difference.
  • the first preset height difference is 0.5 meters.
  • the second preset height difference is 0.5 meters.
• the drone relative orientation control device shown in this embodiment of the present invention adjusts the flight state of the drone in real time according to the relative orientation between the drone and the user, thereby achieving relative orientation control between the drone and the user.
• the complexity of controlling the drone is reduced, and the hidden danger of operation errors is also reduced.
  • the user in this embodiment may not only be a real person, but also other drones, or devices such as cars on the ground.
• a transmitting unit is further included, configured to, when the drone determines that any of the following azimuth deviation state conditions is satisfied, send an alarm instruction to the wearable device based on the trusted connection: the rate of change of the distance represented by the third relative orientation information is greater than a preset speed range; the height difference represented by the third relative orientation information is less than the first preset height difference; or the height difference represented by the third relative orientation information is greater than a second preset height difference.
• the transmitting unit sends the alarm instruction to the wearable device to prompt the user that the drone is currently in an azimuth deviation state, which helps the user make corresponding adjustments in time and reduces the hidden dangers of drone loss and safety accidents.
  • to implement the drone's drive control of the infrared light emitted by the wearable device and reduce the wearable device's power consumption, the transmitting unit is further configured to send, over the trusted connection, a driving instruction to the wearable device for driving it to emit infrared light.
  • UAVs and wearable devices are typically connected by communication to effect the transmission of data and instructions.
  • wireless communication is used.
  • when the distance is long or the electromagnetic environment is complex, a signal amplifying device such as a signal repeater can be used.
  • a trusted connection is adopted, so that only drones and remote control devices that have passed identity (ID) authentication can perform interaction operations.
  • another embodiment of the present invention further includes a first communication unit configured to: authenticate the wearable device through the communication connection; and, when the verification succeeds, establish a trusted connection between the drone and the wearable device.
  • the implementation of the present invention can determine the gesture area and recognize gestures by infrared imaging, and determine the relative orientation by combining the position information of the drone itself, thereby improving the efficiency of drone relative orientation control and the user experience.
  • based on the above drone relative orientation auxiliary control method for a wearable device, the present invention further provides a drone relative orientation auxiliary control device for a wearable device.
  • it includes a first receiving unit 21, a driving unit 22, a second receiving unit 23, and an alarm unit 24; the functions implemented by each unit are specifically disclosed as follows:
  • the first receiving unit 21 is configured to receive, over the trusted connection, a driving instruction from the drone for driving the wearable device to emit infrared light.
  • the wearable device interacts with the drone through gesture recognition and communication connections.
  • the communication connection uses a wireless communication connection.
  • when the distance between the wearable device and the drone is relatively long, or the environmental electromagnetic conditions are complicated, a signal amplifying device such as a signal repeater can be used.
  • a trusted connection manner may also be adopted, so that only drones and wearable devices that have passed identity (ID) authentication can perform interaction operations.
  • the trusted connection between the drone and the wearable device may be any one or more of a Bluetooth trusted connection, a near field communication connection, a UWB trusted connection, a ZigBee trusted connection, or an Internet trusted connection.
  • the first receiving unit 21 of the wearable device receives, over the trusted connection, a driving instruction from the drone, the driving instruction being for driving the wearable device to emit infrared light.
  • the driving unit 22 is configured to, in response to the driving instruction, drive the infrared illuminating component preset in the wearable device to emit infrared light, so that the drone determines, based on infrared imaging, the gesture area and the gesture instruction type representing the relative orientation preset value, for application to relative orientation control of the drone.
  • the wearable device includes an infrared illumination assembly, and in one embodiment, the infrared light diodes of the infrared illumination assembly are arranged linearly along the sides of the wear device.
  • the infrared illuminating component presets one or more infrared point light sources, such as infrared light emitting diodes, for emitting infrared light.
  • Another embodiment is an improvement made on the basis of the previous embodiment.
  • the wearable device is adapted to be disposed on the arm, such that the gesture implementation area is located between the wearable device and the identification device.
  • in response to the driving instruction of the drone, the driving unit 22 drives the infrared illuminating component preset in the wearable device to emit infrared light and form an infrared ring, so that the drone determines the gesture area based on infrared imaging.
  • after the infrared illuminating component of the wearable device emits infrared light, the drone captures gestures based on infrared imaging to generate corresponding gesture interaction events.
  • the infrared light emitted by the wearable device is diffusely reflected on the back of the user's hand, "illuminating" the outline of the hand, so that the user's gesture area and the background area are distinguished by infrared light in infrared imaging.
  • the infrared light may also be partially absorbed by the hand, making the infrared imaging of the hand more visible.
  • the drone performs gesture segmentation based on infrared imaging, which reduces the computational load on the processor, shortens the response time, and improves the efficiency and accuracy of gesture recognition and drone relative orientation control; the effect is particularly significant when the drone or the user is moving.
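The segmentation idea above can be sketched minimally: because the hand is "lit" by the wearable device's infrared light, it appears much brighter than the background in the infrared frame, so a simple intensity threshold separates the gesture area. This is an illustrative sketch, not the patent's implementation; the array layout and threshold value are assumptions.

```python
import numpy as np

def segment_gesture(ir_frame: np.ndarray, threshold: int = 200) -> np.ndarray:
    """Return a boolean mask of the infrared-lit gesture area."""
    # pixels at or above the threshold are assumed to be IR-illuminated hand
    return ir_frame >= threshold

def gesture_bbox(mask: np.ndarray):
    """Bounding box (top, left, bottom, right) of the mask, or None if empty."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    return int(ys.min()), int(xs.min()), int(ys.max()), int(xs.max())
```

The bounding box could then be handed to the tracking/locking step; a real system would likely add morphological cleanup and adaptive thresholding.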
  • in one embodiment, the driving unit 22, in response to the driving instruction, drives only some of the one or more preset infrared point light sources in the infrared illuminating component to emit infrared light.
  • when needed, more infrared point sources are driven to emit light; otherwise, fewer infrared point sources are driven to emit light. Therefore, the energy consumption of the wearable device can be reduced and its use time prolonged while the intended effect is ensured.
  • the infrared illuminating component is controlled to emit infrared light in the wavelength range of 0.76 to 2.5 μm.
  • the hand contour in the infrared image acquired by the identification device is then mainly formed by the infrared light reflected by the hand, which is safer for the human body and gives a better recognition effect.
  • the second receiving unit 23 is configured to receive an alert instruction of the drone based on the trusted connection.
  • the wearable device and the drone can interact through wireless communication.
  • when the distance between the wearable device and the drone is relatively long, or the environmental electromagnetic conditions are complicated, a signal amplifying device such as a signal repeater can also be used.
  • a trusted connection manner may also be adopted, so that only drones and wearable devices that have passed identity (ID) authentication can perform interaction operations.
  • the second receiving unit 23 receives, over the trusted connection, the alarm command sent by the drone when a determined orientation deviation condition is satisfied, which is used to prompt the user that the drone is currently in an orientation deviation state.
  • the alarm unit 24 is configured to control the wearable device to activate the vibration motor and/or turn on the indicator light in response to the alarm instruction to prompt the user that the current drone is in an azimuth deviation state.
  • the alert unit 24 activates the vibration motor and/or turns on the indicator light in response to the alert command.
  • the vibration mode of the vibration motor and/or the flashing mode of the indicator light may be a preset mode, or modes set correspondingly for the different drone orientation deviation states represented by the alarm instruction; these may be set at the factory or by the user.
  • the method further includes a shutdown unit configured to calculate the working duration of the infrared illuminating component and, when the working duration exceeds a predetermined length of time, control the infrared illuminating component to stop emitting infrared light.
  • in this way, the wearable device can automatically turn off the infrared illuminating component after it has been lit for more than a predetermined period of time, effectively preventing waste of electric energy due to user negligence; the user can also set the predetermined length of time to control the illumination duration of the infrared illuminating component, improving efficiency.
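The shutdown unit above amounts to a simple timer. A minimal sketch follows, assuming an injected clock so the logic is testable; the class and method names are illustrative, not from the patent.

```python
class InfraredShutoffTimer:
    """Track how long the infrared component has been lit and signal
    shutdown once a user-configurable time limit is exceeded."""

    def __init__(self, limit_s: float, now=lambda: 0.0):
        self.limit_s = limit_s   # predetermined length of time (seconds)
        self.now = now           # injected clock, e.g. time.monotonic
        self.lit_since = None    # None while the component is off

    def turn_on(self):
        if self.lit_since is None:
            self.lit_since = self.now()

    def should_turn_off(self) -> bool:
        return (self.lit_since is not None
                and self.now() - self.lit_since > self.limit_s)
```

In the device firmware, `should_turn_off()` would be polled periodically and, when true, the infrared illuminating component would be driven off.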
  • the drone relative orientation auxiliary control device for the wearable device of the present invention further includes a second communication unit configured to: send an identity verification request to the drone through the communication connection; and, when the identity verification succeeds, establish a trusted connection between the wearable device and the drone.
  • the implementation of the present invention can improve the efficiency of drone relative orientation control by performing infrared gesture and communication interaction with the drone, improving the user experience.
  • another embodiment of the present invention further provides a UAV control device having a function of implementing the above-described UAV relative azimuth control method.
  • the functions may be implemented by hardware, or by hardware executing corresponding software.
  • the hardware or software includes one or more units corresponding to the functions described above.
  • the structure of the drone control device includes:
  • one or more cameras 707, at least one of which has an infrared imaging function;
  • the memory 702 is configured to store a program for supporting the wearable device to perform the foregoing method for controlling the relative orientation of the drone;
  • a communication interface 703, configured to communicate with the wearable device or other device or communication network;
  • One or more processors 704 for executing programs stored in the memory are provided.
  • the one or more programs 705 are configured to drive the one or more processors 704 to construct a unit for performing any of the above-described drone relative orientation control methods.
  • FIG. 8 shows a block diagram of a partial structure of the drone related to the drone relative orientation control device provided by the embodiment of the present invention, including: a memory 702, a communication interface 703, one or more processors 704, applications 705, a power source 706, one or more cameras 707, one or more sensors 708, and the like.
  • the memory 702 can be used to store software programs and modules, and the processor 704 executes various functional applications and data processing of the drone by running software programs and modules stored in the memory 702.
  • the memory 702 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program 705 required for at least one function, and the like; the storage data area may store data created according to usage of the drone, and the like.
  • memory 702 can include high-speed random access memory, and can also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
  • the communication interface 703 is used in the above control process to communicate with the wearable device and other devices or communication networks.
  • the communication interface 703 is an interface between the processor 704 and the external subsystem for transmitting information between the processor 704 and the external system to achieve the purpose of the control subsystem.
  • the processor 704 is the control center of the drone; it connects the various parts of the entire drone relative orientation control device using various communication interfaces 703 and lines, and performs the various functions of the drone and processes data by running or executing software programs and/or modules stored in the memory 702 and by calling data stored in the memory 702, thereby monitoring the drone as a whole.
  • the processor 704 may include one or more processing units; preferably, the processor 704 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, applications 705, etc., and the modem processor mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 704.
  • one or more applications 705 are stored in the memory 702 and configured to be executed by the one or more processors 704, the one or more programs being configured to perform the functions implemented by any embodiment of the drone relative orientation control method.
  • a power source 706 (such as a battery) that supplies power to the various components.
  • the power source 706 can be logically coupled to the processor 704 via a power management system, to manage functions such as charging, discharging, and power consumption through the power management system.
  • the drone may further include one or more cameras 707, including at least one camera with infrared imaging functions; the cameras 707 are connected to the processor 704 and controlled by it, and the images acquired by the cameras 707 may be stored in the memory 702.
  • the one or more sensors 708 may include inertial sensors (IMU, containing accelerometers and gyroscope sensors), magnetometers, direction sensors, ranging sensors, satellite positioning sensors (e.g., GPS sensors, Beidou sensors), image sensors, and the like.
  • the drone may further include a Bluetooth module or the like, which will not be described herein.
  • the processor 704 included in the drone has the function of performing the drone relative orientation control method described above.
  • an embodiment of the invention further provides a computer storage medium for storing computer software instructions for use by the above drone relative orientation control device, comprising a program designed for the drone to perform the above method.
  • another embodiment of the present invention further provides a wearable device control device having a function for implementing the above-described method for assisting a relative position control of a wearable device.
  • the functions may be implemented by hardware or by corresponding software implemented by hardware.
  • the hardware or software includes one or more units corresponding to the functions described above.
  • the structure of the wearable device control device includes:
  • a memory 702 configured to store a program for supporting the wearable device to perform the above-described drone relative orientation assist control method for the wearable device;
  • a communication interface 703, configured to communicate with the wearable device or other devices or communication networks;
  • a vibration motor and/or indicator light 710 for prompting the user of the current state of the drone;
  • One or more processors 704 for executing programs stored in the memory are provided.
  • the infrared illuminating component 709 includes one or more infrared light sources for emitting infrared light;
  • the one or more programs 705 are configured to drive the one or more processors 704 to construct a unit for performing any of the above-described wearable device drone assisted relative orientation control methods.
  • FIG. 9 is a block diagram showing a partial structure of a smart wristband associated with the wearable device control device provided by the embodiment of the present invention, including: a memory 702, a communication interface 703, one or more processors 704, one or more applications 705, a power source 706, an infrared illumination component 709, an indicator light 710, and the like.
  • the structure illustrated in Figure 9 does not constitute a limitation of the smart wristband, which may include more or fewer components than illustrated, combine some components, or use a different arrangement of components.
  • the memory 702 can be used to store software programs and modules, and the processor 704 executes various functional applications and data processing of the wearable device by running software programs and modules stored in the memory 702.
  • the memory 702 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application 705 required for at least one function, and the like; the storage data area may store data created according to use of the wearable device, and the like.
  • memory 702 can include high-speed random access memory, and can also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
  • the communication interface 703 is used for communication between the smart wristband and the drone relative orientation control device, other devices, or communication networks in the above control process.
  • the communication interface 703 is an interface between the processor 704 and the external subsystem for transmitting information between the processor 704 and the external system to achieve the purpose of the control subsystem.
  • the processor 704 is the control center of the smart wristband; it connects the various parts of the entire wearable device using various communication interfaces 703 and lines, and performs the various functions of the wearable device and processes data by running or executing software programs and/or modules stored in the memory 702 and by calling data stored in the memory 702, thereby monitoring the wearable device as a whole.
  • the processor 704 may include one or more processing units; preferably, the processor 704 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, applications 705, etc., and the modem processor mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 704.
  • one or more applications 705 are stored in the memory 702 and configured to be executed by the one or more processors 704, the one or more programs being configured to perform the functions implemented by any embodiment of the drone relative orientation auxiliary control method for a wearable device.
  • the infrared illuminating component 709 presets one or more infrared point light sources, such as infrared light emitting diodes, for emitting infrared light.
  • the indicator light 710 may be in a preset mode or a corresponding mode set for different UAV azimuth deviation states according to the alarm instruction; and is used to prompt the user that the current drone is in an azimuth deviation state.
  • the processor 704 included in the wearable device further has the following functions:
  • in response to the driving instruction, driving the infrared illuminating component preset in the wearable device to emit infrared light, so that the drone determines, based on infrared imaging, the gesture area and the gesture instruction type representing the relative orientation preset value, for application to drone relative orientation control;
  • in response to the alarm instruction, controlling the wearable device to activate the vibration motor and/or turn on the indicator light to prompt the user that the drone is currently in an orientation deviation state.
  • Figure 10 illustrates a drone or a wearable device (collectively referred to as drone and wearable device) that can implement relative orientation control or relative orientation auxiliary control in accordance with the present invention.
  • the device conventionally includes a processor 1010 and a computer program product or computer readable medium in the form of a memory 1020.
  • the memory 1020 may be an electronic memory such as a flash memory, an EEPROM (Electrically Erasable Programmable Read Only Memory), an EPROM, a hard disk, or a ROM.
  • the memory 1020 has a storage space 1030 for program code 1031 for performing any of the above method steps.
  • storage space 1030 for program code may include various program code 1031 for implementing various steps in the above methods, respectively.
  • the program code can be read from or written to one or more computer program products.
  • These computer program products include program code carriers such as hard disks, compact disks (CDs), memory cards or floppy disks.
  • such a computer program product is typically a portable or fixed storage unit as described with reference to FIG. 11.
  • the storage unit may have a storage section, storage space, or the like arranged similarly to the memory 1020 in FIG. 10.
  • the program code can be compressed, for example, in an appropriate form.
  • the storage unit comprises program code 1031' for performing the steps of the method according to the invention, i.e., code that can be read by a processor such as 1010; when executed by the device, this code causes the device to perform each step in the method described above.
  • the disclosed system, apparatus and method can be implemented in other manners.
  • the device embodiments described above are merely illustrative.
  • the division of units is only a logical functional division; in actual implementation there may be other division manners, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
  • the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices or units, and may be electrical, mechanical or in other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • User Interface Of Digital Computer (AREA)
  • Selective Calling Equipment (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A drone relative orientation control method and device. The drone relative orientation control method comprises the following steps: acquiring an infrared image formed after infrared light emitted by a wearable device acts on a gesture area (S11); determining, from the infrared image, the gesture area whose contour is described by the infrared light and a gesture instruction type representing a relative orientation preset value (S12); detecting first relative orientation information between the drone and the gesture area (S13); and controlling the flight state of the drone according to the first relative orientation information and the gesture instruction type, so that the relative orientation between the drone and the gesture area equals the preset value (S14). The efficiency of drone relative orientation control and the user experience of drone relative orientation control can thereby be improved.

Description

Drone Relative Orientation Control Method and Device. Technical Field
The present invention relates to the field of aviation science and technology, and more particularly to a drone relative orientation control method and device.
Background
An unmanned aerial vehicle, or drone for short, is an unmanned aircraft operated by remote control methods and an on-board program control device. To maintain balance and complete work tasks, more and more sensors are installed on the drone body, and with the development of microelectronics, integrating multiple high-precision sensors on a small drone has become a reality. Drones now offer more and more functions and are widely used in aerial reconnaissance, surveillance, communication, anti-submarine work, electronic jamming, and the like. In some usage scenarios, the drone must be controlled to maintain a certain relative orientation to the user, for example to complete operations such as aerial photography.
Drone flight control is strongly affected by the environment. If the user performs relative orientation control entirely by hand, not only is a high level of flight-control skill required, but the user must concentrate on relative orientation control and can hardly complete other operations at the same time. Therefore, the following two methods are usually adopted for drone relative orientation control:
First, sensors on the drone and the remote control device detect the orientation information of both, and relative orientation control is performed according to that orientation information.
Second, computer-vision tracking based on graphics algorithms is used for relative orientation control.
However, the first method relies on the positioning sensor of the remote control device, which has low precision and works against making the remote control device lightweight, impairing the human-machine interaction experience in relative orientation control; the graphics algorithms of the second method are highly complex, occupying a large amount of the drone's computing resources and limiting the application scenarios of drone relative position control, so the results are unsatisfactory.
Summary of the Invention
An object of the present invention is to address at least one of the above deficiencies by providing a drone relative orientation control method and device, and a drone relative orientation auxiliary control method and device for a wearable device; the device can improve the efficiency of drone relative orientation control, and the control method provides an implementation for that improvement.
Further, the present invention also provides a drone control device and a wearable device control device adapted to the foregoing control methods.
To achieve the above objects, the present invention adopts the technical solutions of the following aspects:
In a first aspect, an embodiment of the present invention provides a drone relative orientation control method, comprising the following steps: acquiring an infrared image formed after infrared light emitted by a wearable device acts on a gesture area; determining, from the infrared image, the gesture area whose contour is described by the infrared light and a gesture instruction type representing a relative orientation preset value; detecting first relative orientation information between the drone and the gesture area; and controlling the flight state of the drone according to the first relative orientation information and the gesture instruction type, so that the relative orientation between the drone and the gesture area equals the preset value.
In a second aspect, an embodiment of the present invention provides a drone relative orientation auxiliary control method for a wearable device, comprising the following steps: receiving, over a trusted connection, a driving instruction from the drone for driving the wearable device to emit infrared light; in response to the driving instruction, driving an infrared illuminating component preset in the wearable device to emit infrared light, so that the drone determines, based on infrared imaging, the gesture area and the gesture instruction type representing the relative orientation preset value, for application to drone relative orientation control; receiving, over the trusted connection, an alarm instruction from the drone; and, in response to the alarm instruction, controlling the wearable device to activate a vibration motor and/or turn on an indicator light to prompt the user that the drone is currently in an orientation deviation state.
In a third aspect, an embodiment of the present invention provides a drone relative orientation control device, comprising: at least one processor; and at least one memory communicably connected to the at least one processor; the at least one memory contains processor-executable instructions which, when executed by the at least one processor, cause the device to perform at least the following operations: acquiring an infrared image formed after infrared light emitted by a wearable device acts on a gesture area; determining, from the infrared image, the gesture area whose contour is described by the infrared light and a gesture instruction type representing a relative orientation preset value; detecting first relative orientation information between the drone and the gesture area; and controlling the flight state of the drone according to the first relative orientation information and the gesture instruction type, so that the relative orientation between the drone and the gesture area equals the preset value.
In a fourth aspect, an embodiment of the present invention provides a drone relative orientation auxiliary control device for a wearable device, comprising: at least one processor; and at least one memory communicably connected to the at least one processor; the at least one memory contains processor-executable instructions which, when executed by the at least one processor, cause the device to perform at least the following operations: receiving, over a trusted connection, a driving instruction from the drone for driving the wearable device to emit infrared light; in response to the driving instruction, driving an infrared illuminating component preset in the wearable device to emit infrared light, so that the drone determines, based on infrared imaging, the gesture area and the gesture instruction type representing the relative orientation preset value, for application to drone relative orientation control; receiving, over the trusted connection, an alarm instruction from the drone; and, in response to the alarm instruction, controlling the wearable device to activate a vibration motor and/or turn on an indicator light to prompt the user that the drone is currently in an orientation deviation state.
In a fifth aspect, an embodiment of the present invention provides a drone control device having the function of implementing the drone relative orientation control method of the first aspect. The function may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more units corresponding to the function. In one possible design, the structure of the drone control device includes: one or more cameras, at least one of which has an infrared imaging function; one or more sensors for detecting the relative orientation information; a memory for storing a program supporting execution of the above drone relative orientation control method; a communication interface for communication between the drone and the wearable device or other devices or communication networks; one or more processors for executing the program stored in the memory; and one or more application programs stored in the memory and configured to be executed by the one or more processors, the one or more programs being used to drive the one or more processors to construct units for performing the drone relative orientation control method of the first aspect or any implementation thereof.
In a sixth aspect, an embodiment of the present invention provides a wearable device control device having the function of implementing the drone relative orientation auxiliary control method for a wearable device of the second aspect. The function may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more units corresponding to the function. In one possible design, the structure of the wearable device control device includes: a memory for storing a program supporting execution of the above drone relative orientation auxiliary control method for a wearable device; a communication interface for communication between the wearable device and the drone or other devices or communication networks; a vibration motor and/or indicator light for prompting the user of the current state of the drone; one or more processors for executing the program stored in the memory; an infrared illuminating component including one or more infrared light sources for emitting infrared light; and one or more application programs stored in the memory and configured to be executed by the one or more processors, the one or more programs being used to drive the one or more processors to construct units for performing the drone relative orientation auxiliary control method for a wearable device of the second aspect or any implementation thereof.
In a seventh aspect, an embodiment of the present invention provides a computer program comprising computer-readable code which, when run by a drone, causes the method of the first aspect to be performed.
In an eighth aspect, an embodiment of the present invention provides a computer program comprising computer-readable code which, when run by a wearable device, causes the method of the second aspect to be performed.
In a ninth aspect, an embodiment of the present invention provides a computer-readable medium storing the computer program of the seventh or eighth aspect.
Compared with the prior art, the technical solutions provided by the present invention have at least the following advantages:
First, the present invention determines the gesture area and the gesture instruction type representing the relative orientation preset value from the infrared image formed under the infrared light emitted by the wearable device, detects the first relative orientation information between the drone and the gesture area, and controls the flight state of the drone according to the first relative orientation information and the gesture instruction type, so that the relative orientation between the drone and the gesture area equals the preset value. The gesture area can be determined and gestures recognized from infrared imaging, and the relative orientation determined in combination with the drone's own orientation information, thereby achieving drone relative orientation control while reducing the computational load of image recognition, improving both the efficiency and the accuracy of drone relative orientation control.
Second, during control the user only needs a wearable device capable of emitting infrared light to adjust the relative orientation between the drone and himself through gesture control, without the wearable device having to provide orientation information. This helps reduce the cost of the wearable device and makes it lighter, improving the user experience of drone relative orientation control.
Finally, by detecting and controlling the height difference and distance between the drone and the gesture area, the drone can maintain the preset relative orientation to the user even when the user moves up or down a slope, reducing the influence of environmental changes on relative orientation control. Moreover, when such an influence is detected, the drone sends an alarm instruction to the wearable device to prompt the user that the drone is currently in an orientation deviation state, helping the user make timely adjustments and preventing loss of the drone and safety accidents.
Additional aspects and advantages of the present invention will be set forth in part in the following description; they will become more readily apparent from the description, or be learned by practice of the invention.
Brief Description of the Drawings
To explain the technical solutions in the embodiments of the present invention more clearly, the drawings required in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
Figure 1 is a structural block diagram of a device for a drone relative orientation control method according to an embodiment of the present invention;
Figure 2 is a schematic flowchart of a drone relative orientation control method according to an embodiment of the present invention;
Figure 3 is a schematic scene diagram of a drone relative orientation control process according to an embodiment of the present invention;
Figure 4 is a schematic scene diagram of a drone relative orientation control process according to an embodiment of the present invention;
Figure 5 is a schematic flowchart of a drone-assisted relative orientation control method for a wearable device according to an embodiment of the present invention;
Figure 6 is a structural block diagram of a drone relative orientation control device according to an embodiment of the present invention;
Figure 7 is a structural block diagram of a drone relative orientation auxiliary control device for a wearable device according to an embodiment of the present invention;
Figure 8 is a schematic structural diagram of a drone control device according to an embodiment of the present invention;
Figure 9 is a schematic structural diagram of a wearable device control device according to an embodiment of the present invention;
Figure 10 shows a block diagram of a drone or wearable device for performing the method according to the present invention; and
Figure 11 shows a schematic diagram of a storage unit for holding or carrying program code implementing the method according to the present invention.
Detailed Description
To enable those skilled in the art to better understand the solutions of the present invention, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the drawings in the embodiments.
Some flows described in the specification, claims and drawings of the present invention contain operations appearing in a particular order, but it should be clearly understood that these operations may be executed out of the order in which they appear herein or in parallel. Operation numbers such as S10 and S11 are used only to distinguish different operations; the numbers themselves do not represent any execution order. In addition, these flows may include more or fewer operations, which may be executed sequentially or in parallel. It should be noted that descriptions such as "first" and "second" herein are used to distinguish different messages, devices, modules, etc.; they do not represent a sequence, nor do they limit "first" and "second" to different types.
Those of ordinary skill in the art will understand that, unless specifically stated, the singular forms "a", "an", "the" and "said" used herein may also include plural forms. It should be further understood that the word "comprising" used in the specification of the present invention refers to the presence of the stated features, integers, steps, operations, elements and/or components, but does not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof. It should be understood that when an element is said to be "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may also be present. Moreover, "connected" or "coupled" as used herein may include wireless connection or wireless coupling. The expression "and/or" as used herein includes all or any unit and all combinations of one or more of the associated listed items.
Those of ordinary skill in the art will understand that, unless otherwise defined, all terms used herein (including technical and scientific terms) have the same meanings as commonly understood by those of ordinary skill in the art to which the present invention belongs. It should also be understood that terms such as those defined in general dictionaries should be understood to have meanings consistent with their meanings in the context of the prior art and, unless specifically defined as herein, are not to be interpreted in an idealized or overly formal sense.
Those of ordinary skill in the art will understand that the "control device" or "drone control device" used herein includes both devices with only a wireless signal receiver and no transmitting capability, and devices with receiving and transmitting hardware capable of two-way communication over a two-way communication link. Such devices may include: cellular or other communication devices, with a single-line display, a multi-line display, or no multi-line display; and portable, transportable mobile smart devices mounted in vehicles (air, sea and/or land), such as drones and unmanned airships.
Those of ordinary skill in the art will understand that the "wearable device" or "wearable device control device" used herein includes both devices with only a wireless signal transmitter and no receiving capability, and devices with receiving and transmitting hardware capable of two-way communication over a two-way communication link. Such devices may be designed to be arranged on a person, especially on the arm, and include smart wristbands, smart watches, bracelets, and the like.
The method of the present invention is mainly applicable to terminals with communication functions such as drones or wearable devices, and is not limited to any type of operating system; the operating system may be Android, iOS, WP, Symbian or another operating system, or an embedded operating system.
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the drawings in the embodiments, in which the same or similar reference numerals throughout denote the same or similar elements or elements having the same or similar functions. Obviously, the described embodiments are only part of the embodiments of the present invention, not all of them. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
In one embodiment of the present invention, the structural block diagram of the device for the drone relative orientation control method is shown in Figure 1. The overall structure includes the processor 704, a sensor module, a controller, and an execution control end. The sensor module includes inertial sensors (inertial measurement unit, IMU, containing acceleration sensors and gyroscope sensors), magnetometers, direction sensors, ranging sensors, satellite positioning sensors (e.g., GPS sensors, Beidou sensors), image sensors, etc., which generate various sensor data used to produce azimuth information, heading information, image information, positioning information and distance information for drone control, reflecting the flight parameters of the drone and facilitating its self-adjustment. For example, when the drone's flight is affected by wind, the inertial sensors detect the change in attitude data, and after obtaining the attitude data the drone adjusts its attitude to keep flying according to the control instructions. As another example, when an obstacle is encountered in some direction during flight, the distance sensor detects the distance to the obstacle so that an obstacle-avoidance action can be made quickly, protecting the airframe; with obstacle avoidance in place, the drone can independently perform tasks such as spatial detection. As a further example, when the user wants to control the relative orientation of the drone, in one embodiment of the present invention as shown in Figure 2, the image sensor acquires the infrared image formed after the infrared light emitted by the wearable device acts on the gesture area; the gesture area and the gesture instruction type representing the relative orientation preset value are determined from the infrared image; the gyroscope sensor, satellite positioning sensor, ranging sensor and direction sensor detect the first relative orientation information between the drone and the gesture area; and the flight state of the drone is controlled according to the first relative orientation information and the gesture instruction type, so that the relative orientation between the drone and the gesture area equals the preset value.
The processor 704 is the core part that integrates data, sends control and performs operations. On receiving data sent by the sensor module, it identifies specific information from the data through a series of algorithms and determines from this information the operations to be performed. Those skilled in the art will understand that the processor 704 can do more than integrate sensor data and send instructions; in the present invention, the processor 704 is capable of performing any of the drone relative orientation control methods. The controller is a control component for controlling the drone; generally, when a remote control device (such as a wearable device) controls the drone as the controller, the control frequency between the drone and the controller needs to be set to ensure effective control of the drone's flight. The execution control end is used by the drone to execute operation instructions and communicates with the processor 704 to ensure that the drone executes according to the operation instructions.
Referring to Figure 2, an embodiment of the drone relative orientation control method of the present invention comprises the following steps:
Step S11: acquiring the infrared image formed after the infrared light emitted by the wearable device acts on the gesture area.
A drone usually includes a camera unit, a processor, a memory, etc., and performs gesture recognition based on computer vision. In one embodiment, the drone includes at least one camera with an infrared imaging function; the drone acquires the infrared image formed under the infrared light emitted by the wearable device and separates the gesture area from the background area according to the infrared image to complete gesture segmentation, thereby achieving infrared gesture recognition.
The camera unit includes at least one camera. In possible embodiments, the camera unit may acquire infrared images through any one or more of IR-CUT dual-filter technology, IR lens technology, and infrared-sensitive CCD technology.
The wearable device emits infrared light onto the back of the user's hand, so that the contour of the user's gesture area is "illuminated" by the infrared light, forming an infrared image of the gesture area in which the user's gesture area is distinguished from the background area by the infrared light.
Step S12: determining, from the infrared image, the gesture area whose contour is described by the infrared light and the gesture instruction type representing the relative orientation preset value.
As mentioned above, the drone separates the gesture area from the background area according to the infrared image to complete gesture segmentation and determine the gesture area. In some possible embodiments, image algorithms may also be used to make the drone's camera unit lock onto the gesture area. In one embodiment, determining the gesture area and the gesture instruction type from the infrared image includes: obtaining one or more frames from the preview infrared images acquired by the camera unit; determining the gesture area whose contour is described by the infrared light in those frames; and extracting gesture feature data based on the gesture area and matching it against preset gesture instruction type description data to determine the corresponding gesture instruction type. In embodiments of the present invention, the gesture instruction type represents a relative orientation preset value, for use in drone relative orientation control.
The video acquired by the drone's camera unit can be regarded as a sequence of frames. In static gesture recognition, only one or a few frames need to be analyzed to extract gesture feature data, which may include gesture contour data and/or gesture depth data. Dynamic gesture recognition additionally requires the spatiotemporal features of the gesture; common approaches to analyzing a dynamic gesture's spatiotemporal trajectory fall into two main classes: trajectory template matching (Trajectories Matching) and state space modeling (State Space Modeling). Multiple frames must therefore be analyzed to obtain the spatiotemporal trajectory produced by the gesture in motion.
After acquiring the infrared image, the drone separates the gesture area from the background area according to the infrared image to complete gesture segmentation and determine the gesture area whose contour is described by the infrared light; it then extracts gesture features from the gesture area, estimates gesture model parameters for gesture analysis, and classifies the gesture according to the model parameters to determine the corresponding gesture instruction type, achieving infrared gesture recognition. The recognition method may be based on template matching, hidden Markov models (HMM), neural networks, or other methods.
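Of the recognition routes just listed, template matching is the simplest to sketch: extracted gesture feature vectors are compared against preset per-gesture templates and the nearest template wins. The feature vectors, template names, and values below are made-up illustrations; real feature extraction (contour/depth features) is out of scope here.

```python
import math

# hypothetical per-gesture feature templates (illustrative values only)
TEMPLATES = {
    "hold_position": [1.0, 0.0, 0.0],
    "follow_5m":     [0.0, 1.0, 0.0],
    "ascend":        [0.0, 0.0, 1.0],
}

def classify_gesture(features, templates=TEMPLATES):
    """Return the name of the template nearest to the feature vector
    (Euclidean distance), i.e. the recognized gesture instruction type."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(templates, key=lambda name: dist(features, templates[name]))
```

An HMM- or network-based recognizer would replace the nearest-template rule but keep the same interface: feature data in, gesture instruction type out.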
由于利用红外光有效增强了手势区域与背景区域的区分度,能够使无人机根据红外成像捕捉手势,确定相应的指令类型。减少了计算资源的占用,缩短用户在背景复杂或光线昏暗的情况下利用手势识别进行无人机相对方位控制所需的响应时间,提高了用户人机交互的效率和准确性,尤其在无人机和/或用户处于移动过程中时,其效果尤为显著。
Step S13: detecting first relative-orientation information between the drone and the gesture region.
The drone determines the relative position from the relative-orientation information between itself and the gesture region. It should be noted that the first relative-orientation information includes any one or more of the distance information, azimuth information, elevation-angle information and height-difference information between the drone and the gesture region, and the drone's own positioning information; "first relative-orientation information" is therefore a generic term, and in a concrete application the specific data listed here may be selected as needed. In one embodiment, detecting the first relative-orientation information between the drone and the gesture region comprises: detecting the drone's positioning information with its satellite positioning sensor; detecting the distance information between the drone and the gesture region with its ranging sensor; detecting the azimuth information between the drone and the gesture region with its orientation sensor; detecting the elevation-angle information between the drone and the gesture region with its gyroscope sensor and computing the height-difference information from the distance and elevation-angle information; and computing the relative-orientation information between the drone and the gesture region from the positioning, height-difference, azimuth and distance information.
From the distance information, the horizontal distance between the drone and the gesture region can be adjusted, realizing horizontal movement between drone and user. From the azimuth information, the drone's nose heading, or in effect the shooting direction of its onboard camera unit, can be adjusted, so that when the position of the gesture region changes the drone correspondingly reorients its nose or camera unit and keeps the user locked in view. From the height-difference information, the relative height between the drone and the operator can be adjusted, so that the drone maintains the preset relative orientation while the user moves uphill or downhill.
In some embodiments of the present invention, the positioning information in the first relative-orientation information may represent the drone's latitude-longitude coordinates A(x1, y1), obtained by the drone's satellite positioning sensor. The positioning function is based on the satellite positioning system the sensor connects to, which includes but is not limited to GPS, BeiDou, GLONASS or Galileo. The distance is the straight-line distance l between the drone and the gesture region, and the ranging sensor is a laser ranging sensor and/or an infrared ranging sensor. The azimuth ΦAB, also called the horizontal bearing (Azimuth (angle), Az for short), is the horizontal angle measured clockwise from the drone's north-pointing direction line to the direction line of the gesture region; for example, on an Android system, public static float[] getOrientation(float[] R, float[] values) returns the azimuth obtained by the orientation sensor, in the range 0~359, where 360/0 is due north, 90 due east, 180 due south and 270 due west. The elevation angle θ is the angle between the direction line from a point to the observed target and the horizontal plane, generally a depression angle here; it can be measured by the gyroscope sensor (gyroscope/gyro) together with the above ranging sensor or camera, giving the height difference h = l × sinθ. Then, according to:

[Formula image PCTCN2017114974-appb-000001 in the original]

the positioning coordinates B(x2, y2) of the current gesture region are obtained.
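The formula itself survives only as an image in the original, so the computation below is an illustrative reconstruction, not the patent's exact formula. It assumes a local flat-earth frame (x east, y north), azimuth measured clockwise from north, and θ the depression angle from the quantities defined above; the function name and frame convention are assumptions:

```python
import math

def gesture_position(x1, y1, slant_range, azimuth_deg, elevation_deg):
    """Estimate gesture-region coordinates B(x2, y2) and the height
    difference h from the drone position A(x1, y1), the straight-line
    distance l, the azimuth, and the elevation (depression) angle theta.

    Horizontal distance = l*cos(theta); height difference h = l*sin(theta),
    matching the h = l x sin(theta) relation in the text.
    """
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    horizontal = slant_range * math.cos(el)
    h = slant_range * math.sin(el)
    x2 = x1 + horizontal * math.sin(az)   # east component
    y2 = y1 + horizontal * math.cos(az)   # north component
    return x2, y2, h
```

With real latitude-longitude coordinates the east/north offsets would additionally be scaled by the meters-per-degree factors at the drone's latitude.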
Of course, in some embodiments the above ranging sensor or an ultrasonic ranging sensor may also be used to detect the drone's height relative to the ground, which in certain scenarios (such as open flat terrain) can substitute for the height-difference information. As before, the distance information allows the horizontal distance between drone and gesture region to be adjusted for horizontal movement relative to the user; the azimuth allows the nose heading, or the shooting direction of the onboard camera unit, to be adjusted so that the camera keeps the user locked as the gesture region moves; and the height-difference information allows the height between drone and user to be adjusted so that the drone maintains the relative orientation while the user moves uphill or downhill.
By analyzing and computing the positioning, height-difference, azimuth and distance information described above, the drone's first relative-orientation information is obtained, representing the relative orientation between the drone and the gesture region. The first relative-orientation information may be derived from these data with an information-fusion algorithm, such as Kalman filtering, to improve accuracy; correspondingly, the positioning, height-difference, azimuth and distance information can be solved back out of it. Alternatively, the first relative-orientation information may simply be a data packet containing the positioning, height-difference, azimuth and distance information. It characterizes the relative orientation between the drone and the gesture region and is used for the drone's relative-orientation control.
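The patent only names Kalman filtering as one possible fusion algorithm. A minimal 1-D sketch, assuming a constant-state model with additive noise (an assumption, since no model is specified), shows how a noisy range track could be smoothed:

```python
def kalman_1d(measurements, meas_var, process_var, x0=0.0, p0=1.0):
    """Minimal 1-D Kalman filter over a scalar track (e.g. distance).

    meas_var: measurement noise variance; process_var: how much the
    true value is allowed to drift between samples.  Returns the list
    of filtered estimates, one per measurement.
    """
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + process_var          # predict: state assumed constant, uncertainty grows
        k = p / (p + meas_var)       # Kalman gain
        x = x + k * (z - x)          # update toward measurement z
        p = (1 - k) * p
        estimates.append(x)
    return estimates
```

A full implementation would fuse the positioning, azimuth, distance and height channels in one state vector; the scalar case above captures the predict/update structure.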
Step S14: controlling the flight state of the drone according to the first relative-orientation information and the gesture instruction type, so that the relative orientation between the drone and the gesture region equals the preset value.
After the first relative-orientation information has been obtained and the gesture instruction type determined, second relative-orientation information can be computed from the first relative-orientation information and the preset relative-orientation value represented by the gesture instruction type; the drone is then controlled, according to the second relative-orientation information, to adjust its flight state and thereby its relative orientation to the gesture region. The preset relative-orientation value may include the adjusted distance, azimuth and height difference between the drone and the gesture region.
Accordingly, after the second relative-orientation information is obtained, at least one of the following schemes may be flexibly selected according to actual needs, so that the relative orientation between the drone and the gesture region equals the preset value:
First, controlling the drone to fly to the coordinate position represented by the position information in the second relative-orientation information, so that the distance between the drone and the gesture region equals the preset distance in the preset value.
Second, controlling the drone to adjust its heading angle according to the azimuth information in the second relative-orientation information, so that the drone faces the gesture region.
Third, controlling the drone to adjust its flight altitude according to the second height-difference information in the second relative-orientation information, so that the height difference between the drone and the gesture region equals the height difference in the preset value or the height difference represented by the first relative-orientation information.
The coordinate position C(x3, y3) represented by the position information in the second relative-orientation information can be derived from the drone coordinate position A(x1, y1) represented by the position information in the first relative-orientation information and the preset value, using the foregoing formula. From the heading distance, the horizontal distance between drone and operator can be adjusted for horizontal movement; from the heading angle, the nose heading, in effect the shooting direction of the onboard camera unit, can be adjusted, so that when the remote control's trajectory changes the drone reorients its nose and the camera keeps the operator locked; and from the height difference, the height between drone and operator can be adjusted, enabling uphill and downhill motion.
In embodiments of the above schemes, referring to Figure 3, in some scenarios the drone may, according to the first relative-orientation information and the preset value, change only its distance to the gesture region while keeping the azimuth and height difference constant, maintaining the preset distance; or change only the azimuth while keeping distance and height difference constant, so that the drone circles the gesture region for shooting and similar operations. The user can thus control the drone's relative orientation simply and quickly.
Since in real scenarios the user may be moving, changes in the user's position or height change the relative position. Therefore, to keep the relative orientation between drone and gesture region at the preset value, some embodiments of the present invention further include, after the relative orientation has reached the preset value, a step of detecting third relative-orientation information between the drone and the gesture region. The principle and steps are the same as for detecting the first relative-orientation information and are not repeated here. Further, according to actual needs, at least one of the following follow-up schemes may be included:
First, determining whether the rate of change of the distance represented by the third relative-orientation information exceeds a preset speed range; when it does, controlling the drone to adjust its flight speed so that the distance between the drone and the gesture region stays within a predetermined distance range.
Second, determining whether the height difference represented by the third relative-orientation information is smaller than a first preset height difference;
when it is, controlling the drone's flight state to raise the height difference between the drone and the gesture region to the first preset height difference.
Third, determining whether the height difference represented by the third relative-orientation information is greater than a second preset height difference;
when it is, controlling the drone's flight state to lower the height difference between the drone and the gesture region to the second preset height difference.
In the above schemes, the rate of change of the distance represented by the third relative-orientation information can characterize the relative speed between the user and the drone; by adjusting the drone's flight speed, the distance between drone and gesture region is kept within the predetermined range. The preset speed range and predetermined distance range may be set in advance according to actual needs and/or derived from the preset relative-orientation value represented by the gesture instruction type, the goal being that the drone follows the user well without losing track, and/or captures reasonably clear footage while shooting. Specifically, since ranging sensors generally have low update rates, when the gesture region moves quickly — that is, when the user moves fast — the drone's reaction tends to lag. Besides raising the ranging sensor's update rate, the distance to the gesture region can be measured with the ranging sensor and the gesture region's speed computed periodically, and the drone's following speed adjusted in real time according to that speed. The drone can thus adapt its flight speed to the movement of the gesture region, keeping the relative orientation within the preset range, achieving good following behavior and improving the interaction experience when the user moves fast.
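The follow-speed adjustment just described — estimate the user's radial speed from successive range readings and track it while correcting the range error — can be sketched as below; the proportional gain, speed limit, and function name are illustrative assumptions not taken from the patent:

```python
def follow_speed(distances, dt, target_dist, max_rate=2.0, k_p=0.8):
    """Command a follow speed from a series of range readings.

    distances: successive ranging-sensor readings (meters);
    dt: sampling interval (seconds); target_dist: preset distance.
    Each command is the estimated user radial speed plus a proportional
    correction of the range error, clamped to +/- max_rate (m/s).
    """
    commands = []
    prev = distances[0]
    for d in distances[1:]:
        range_rate = (d - prev) / dt          # >0: user pulling away
        correction = k_p * (d - target_dist)  # close the range error
        cmd = range_rate + correction
        cmd = max(-max_rate, min(max_rate, cmd))  # respect speed limits
        commands.append(cmd)
        prev = d
    return commands
```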
As shown in Figure 4, when the height difference between the user and the drone is smaller than the first preset height difference, the drone's flight state is controlled to raise the height difference to the first preset height difference. With this scheme, while the user moves uphill the drone can adjust its own height to maintain the preset relative height to the operator and avoid collision accidents. Preferably, the first preset height difference is 0.5 meters. Similarly, when the height difference between the operator and the drone is greater than the second preset height difference, the drone's flight state is controlled to lower the height difference to the second preset height difference; in this way, while the user moves downhill the drone adjusts its own height to maintain the preset relative height and avoid collision accidents. Preferably, the second preset height difference is 0.5 meters.
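The two height-difference checks reduce to a clamp: climb when the gap falls below the first preset value, descend when it exceeds the second. A minimal sketch, assuming the preferred 0.5 m for both presets (the function name is invented for illustration):

```python
def altitude_command(height_diff, h_min=0.5, h_max=0.5):
    """Return the climb (+) or descend (-) amount, in meters, needed to
    keep the drone-to-user height difference inside [h_min, h_max];
    returns 0.0 when the gap is already within the presets."""
    if height_diff < h_min:
        return h_min - height_diff   # climb to restore the preset gap
    if height_diff > h_max:
        return h_max - height_diff   # descend to restore the preset gap
    return 0.0
```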
With the drone relative-orientation control method shown in the embodiments of the present invention, the drone's flight state is adjusted in real time from the relative orientation between drone and user, thereby achieving control of that relative orientation. Compared with the prior art, this reduces the complexity of drone control operations and the risk of operating errors. Note that the "user" in this embodiment need not be a real person; it may also be another drone, a car on the ground, or similar equipment.
Another embodiment improves on the previous one: to guard against loss of the drone and safety accidents, when the drone determines that any one of the following orientation-deviation conditions is met, it sends an alert instruction to the wearable device over the trusted connection: the rate of change of the distance represented by the third relative-orientation information exceeds the preset speed range; the height difference represented by the third relative-orientation information is smaller than the first preset height difference;
or the height difference represented by the third relative-orientation information is greater than the second preset height difference.
When the drone detects the effect of such environmental changes, it sends the alert instruction to the wearable device to notify the user that the drone is in an orientation-deviation state, helping the user make timely adjustments and reducing the risk of losing the drone or causing an accident.
In some embodiments of the present invention, to let the drone drive the wearable device's infrared emission and reduce the wearable device's power consumption, the method further includes the preliminary step of sending, over the trusted connection, a drive instruction to the wearable device for driving it to emit infrared light.
The drone and the wearable device usually transmit data and instructions over a communication connection, generally wireless; in some cases, for example when they are far apart or the electromagnetic environment is complex, signal amplification equipment such as a signal repeater may also be used. In one embodiment, to ensure accurate and safe human-machine interaction, a trusted connection is used, so that only a drone and a remote control device that have passed identity (ID) verification can interact.
As noted above, to improve the accuracy and safety of drone control, another embodiment of the present invention further includes the preliminary steps of authenticating the wearable device over the communication connection and, when authentication succeeds, establishing a trusted connection between the drone and the wearable device. This ensures that only an ID-verified wearable device can establish a trusted connection with the drone and interact with it, preventing misjudgment by the recognition device or malicious interference by others, and improving system accuracy and safety.
As this disclosure of the drone relative-orientation control method shows, implementing the present invention — determining the gesture region and recognizing gestures through infrared imaging, and combining the drone's own orientation information to determine the relative orientation — can improve the efficiency of drone relative-orientation control and the user experience.
Referring to Figure 5, an embodiment of the drone relative-orientation auxiliary control method for a wearable device of the present invention comprises the following steps:
Step S21: receiving, over a trusted connection, the drone's drive instruction for driving the wearable device to emit infrared light.
The wearable device interacts with the drone through gesture recognition and a communication connection. Generally, the communication connection is wireless; in some cases, for example when the wearable device and the drone are far apart or the electromagnetic environment is complex, signal amplification equipment such as a signal repeater may also be used. In one embodiment, to ensure accurate and safe human-machine interaction, a trusted connection may be used, so that only an ID-verified drone and wearable device can interact. In possible embodiments, the trusted connection between drone and wearable device may be any one or more of a Bluetooth trusted connection, a near-field communication connection, a UWB trusted connection, a ZigBee trusted connection or an Internet trusted connection. Over this connection the wearable device receives the drone's drive instruction, which drives the wearable device to emit infrared light.
Step S22: in response to the drive instruction, driving the wearable device's built-in infrared light-emitting assembly to emit infrared light, so that the drone determines, from infrared imaging, a gesture region and a gesture instruction type representing a preset relative-orientation value, for use in drone relative-orientation control.
The wearable device includes an infrared light-emitting assembly; in one embodiment, its infrared light-emitting diodes are arranged linearly along the side of the wearable device. The assembly provides one or more infrared point light sources, such as infrared LEDs, for emitting infrared light. Another embodiment improves on the previous one: so that the emitted infrared light favors gesture recognition, the wearable device is adapted to be worn on the arm, placing the gesture region between the wearable device and the recognition device. In one possible design, in response to the drone's drive instruction the wearable device drives its built-in infrared light-emitting assembly to emit infrared light forming an infrared halo, so that the drone determines the gesture region and the gesture instruction type from infrared imaging for relative-orientation control.
Once the assembly emits infrared light, the drone can capture gestures from infrared imaging and generate the corresponding gesture interaction events. The infrared light diffusely reflects off the back of the user's hand, "lighting up" the hand outline so that the gesture region is distinguished from the background in infrared imaging; the light may also be partly absorbed by the hand, making the hand's infrared image more distinct. Gesture segmentation based on infrared imaging therefore reduces the processor's computational load, shortens response time, and improves the efficiency and accuracy of gesture recognition and relative-position control, especially while the drone or the user is moving.
In one embodiment, in response to the drive instruction the wearable device drives only a preset subset of the infrared point light sources in the assembly, allowing adjustment to the actual environment: when the background is relatively complex or the ambient light relatively dim, more infrared point sources are driven; otherwise fewer. This lowers the wearable device's power consumption and extends its operating time while preserving effectiveness.
In another embodiment, the wavelength range of the infrared light emitted under control is 0.76~2.5 μm, so that the hand outline in the infrared image acquired by the recognition device is formed mainly by infrared light reflected from the hand; this is relatively safe for the human body and recognizes better.
Step S23: receiving the drone's alert instruction over the trusted connection.
The wearable device may interact with the drone over wireless communication. In some cases, for example when they are far apart or the electromagnetic environment is complex, signal amplification equipment such as a signal repeater may also be used. In one embodiment, to ensure accurate and safe human-machine interaction, a trusted connection may be used, so that only an ID-verified drone and wearable device can interact.
To guard against loss of the drone and safety accidents, the wearable device receives over the trusted connection the alert instruction the drone sends when it determines that an orientation-deviation condition is met, notifying the user that the drone is in an orientation-deviation state.
Step S24: in response to the alert instruction, controlling the wearable device to start a vibration motor and/or turn on an indicator light, to notify the user that the drone is currently in an orientation-deviation state.
After receiving the alert instruction, the wearable device responds by starting the vibration motor and/or turning on the indicator light. The vibration pattern of the motor and/or the flashing pattern of the light may be a preset pattern or patterns set according to the different drone orientation-deviation states the alert instruction represents; they may be configured at the factory or by the user.
Notifying the user that the drone is in an orientation-deviation state helps the user make timely adjustments and reduces the risk of losing the drone or causing an accident.
In some embodiments of the present invention, to reduce the wearable device's power consumption, the method further includes the concurrent step of timing how long the infrared light-emitting assembly has been working and, when that working duration exceeds a predetermined value, controlling the assembly to stop emitting infrared light. The wearable device thus automatically turns off the infrared assembly once it has emitted longer than the predetermined duration, effectively preventing power waste caused by user negligence; the user can also set the predetermined duration to control the emission time, improving efficiency.
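The auto-off behavior just described can be modeled as a simple timer check; the class and method names below are invented for illustration and do not come from the patent:

```python
class IrEmitter:
    """Toy model of the wearable's auto-off logic: the emitter records
    when it was switched on, and each tick() shuts it off once the
    working duration exceeds the preset limit."""

    def __init__(self, max_seconds):
        self.max_seconds = max_seconds  # user-settable duration limit
        self.on_since = None            # None means: not emitting

    def turn_on(self, now):
        self.on_since = now

    def tick(self, now):
        """Return True while still emitting, False once auto-off fired."""
        if self.on_since is not None and now - self.on_since > self.max_seconds:
            self.on_since = None        # stop emitting
        return self.on_since is not None
```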
As noted above, in some embodiments of the present invention, to improve the accuracy and safety of drone relative-orientation control, the drone relative-orientation auxiliary control method for a wearable device further includes the preliminary steps of sending an identity-verification request to the drone over the communication connection and, when verification succeeds, establishing a trusted connection between the wearable device and the drone. This ensures that only an ID-verified wearable device can establish a trusted connection with the drone and interact with it, preventing misjudgment by the drone or malicious interference by others, and improving system accuracy and safety.
As this disclosure of the drone relative-orientation auxiliary control method for a wearable device shows, implementing the present invention — interacting with the drone through infrared gestures and communication — can improve the efficiency of drone relative-orientation control and the user experience.
Following a modular design approach, the present invention further proposes, on the basis of the above drone relative-orientation control method, a drone relative-orientation control apparatus.
Referring to Figure 6, an embodiment of the drone relative-orientation control apparatus of the present invention comprises a camera unit 11, an identification unit 12, a detection unit 13 and a control unit 14, whose functions are disclosed as follows:
The camera unit 11 is configured to acquire an infrared image formed after infrared light emitted by a wearable device strikes a gesture region.
A drone typically comprises the camera unit 11, a processor, storage and the like, and performs gesture recognition based on computer vision. In one embodiment, the drone includes at least one camera with infrared imaging capability; it acquires the infrared image formed under the infrared light emitted by the wearable device and, based on that image, separates the gesture region from the background region to complete gesture segmentation, achieving infrared gesture recognition.
The camera unit 11 includes at least one camera and, in possible embodiments, may acquire infrared images using any one or more of IR-CUT dual-filter technology, IR lens technology and infrared-sensitive CCD technology.
The infrared light emitted by the wearable device illuminates the back of the user's hand, so that the outline of the gesture region is "lit up" by infrared light and an infrared image of the gesture region is formed, distinguishing the gesture region from the background region in infrared imaging.
The identification unit 12 is configured to determine, from the infrared image, the gesture region whose outline is described by the infrared light and a gesture instruction type representing a preset relative-orientation value.
As described above, the drone separates the gesture region from the background region based on the infrared image to complete gesture segmentation and determine the gesture region. In some possible embodiments, image algorithms may further be used to make the camera unit 11 lock onto the gesture region. In one embodiment, the identification unit 12 determines the gesture region and the gesture instruction type from the infrared image by: obtaining one or more frames from the preview infrared images captured by the camera unit 11; determining, in those frames, the gesture region whose outline is described by infrared light; and extracting gesture feature data from the gesture region and matching it against preset gesture-instruction-type description data to determine the corresponding gesture instruction type. In embodiments of the present invention, the gesture instruction type represents a preset relative-orientation value used for drone relative-orientation control.
The video captured by the camera unit 11 can be regarded as a sequence of frames. In static gesture recognition, only one or a few frames need to be analyzed to extract gesture feature data, which may include gesture outline data and/or gesture depth data. Dynamic gesture recognition additionally requires the gesture's spatio-temporal features; the common approaches to dynamic gesture trajectory analysis fall into two broad classes, trajectory template matching (Trajectories Matching) and state-space modeling (State Space Modeling), so multiple frames must be analyzed to obtain the trajectory the gesture traces while moving.
After the drone acquires the infrared image, the identification unit 12 separates the gesture region from the background region to complete gesture segmentation and determines the gesture region described by infrared light; it then extracts gesture features from that region, estimates gesture-model parameters for gesture analysis, and classifies the gesture according to the model parameters to determine the corresponding gesture instruction type, achieving infrared gesture recognition. The recognition method may be based on template matching, hidden Markov models (HMM), neural networks, or the like.
Because the infrared light effectively sharpens the distinction between the gesture region and the background, the drone can capture gestures from infrared imaging and determine the corresponding instruction type. This reduces the computational load and shortens the response time needed for gesture-based relative-orientation control in complex backgrounds or dim light, improving the efficiency and accuracy of human-machine interaction, especially while the drone and/or the user is moving.
The detection unit 13 is configured to detect first relative-orientation information between the drone and the gesture region.
The drone determines the relative position from the relative-orientation information between itself and the gesture region. As noted, the first relative-orientation information includes any one or more of the distance information, azimuth information, elevation-angle information and height-difference information between the drone and the gesture region, and the drone's own positioning information; it is a generic term, and in concrete applications the specific data listed here may be selected as needed. In one embodiment, the detection unit 13 detects the first relative-orientation information by: detecting the drone's positioning information with its satellite positioning sensor; detecting the distance information between the drone and the gesture region with its ranging sensor; detecting the azimuth information between the drone and the gesture region with its orientation sensor; detecting the elevation-angle information between the drone and the gesture region with its gyroscope sensor and computing the height-difference information from the distance and elevation-angle information; and computing the relative-orientation information between the drone and the gesture region from the positioning, height-difference, azimuth and distance information.
From the distance information, the horizontal distance between drone and gesture region can be adjusted, realizing horizontal movement between drone and user. From the azimuth information, the nose heading, or in effect the shooting direction of the camera unit 11, can be adjusted, so that when the gesture region moves the drone correspondingly reorients its nose or camera unit 11 and keeps the user locked. From the height-difference information, the relative height between drone and operator can be adjusted, so that the drone maintains the preset relative orientation while the user moves uphill or downhill.
In some embodiments of the present invention, the positioning information in the first relative-orientation information may represent the drone's latitude-longitude coordinates A(x1, y1), obtained by the drone's satellite positioning sensor, whose positioning function is based on the connected satellite positioning system — including but not limited to GPS, BeiDou, GLONASS or Galileo. The distance is the straight-line distance l between the drone and the gesture region, the ranging sensor being a laser ranging sensor and/or an infrared ranging sensor. The azimuth ΦAB, also called the horizontal bearing (Azimuth (angle), Az for short), is the horizontal angle measured clockwise from the drone's north-pointing direction line to the direction line of the gesture region; for example, on an Android system, public static float[] getOrientation(float[] R, float[] values) returns the azimuth obtained by the orientation sensor, in the range 0~359, where 360/0 is due north, 90 due east, 180 due south and 270 due west. The elevation angle θ is the angle between the direction line from a point to the observed target and the horizontal plane, generally a depression angle here; it can be measured by the gyroscope sensor (gyroscope/gyro) together with the above ranging sensor or camera, giving the height difference h = l × sinθ. Then, according to:

[Formula image PCTCN2017114974-appb-000002 in the original]

the positioning coordinates B(x2, y2) of the current gesture region are obtained.
Of course, in some embodiments the above ranging sensor or an ultrasonic ranging sensor may also detect the drone's height relative to the ground, substituting for the height-difference information in certain scenarios (such as open flat terrain). As before, the distance information allows the horizontal distance between drone and gesture region to be adjusted for horizontal movement relative to the user; the azimuth allows the nose heading, or the shooting direction of the camera unit 11, to be adjusted so that the camera unit 11 keeps the user locked as the gesture region moves; and the height-difference information allows the height between drone and user to be adjusted so that the drone maintains the relative orientation while the user moves uphill or downhill.
By analyzing and computing the positioning, height-difference, azimuth and distance information described above, the drone's first relative-orientation information is obtained, representing the relative orientation between the drone and the gesture region. It may be derived with an information-fusion algorithm, such as Kalman filtering, to improve accuracy, in which case the positioning, height-difference, azimuth and distance information can be solved back out of it; alternatively, it may be a data packet containing those items. It characterizes the relative orientation between the drone and the gesture region and is used for the drone's relative-orientation control.
The control unit 14 is configured to control the flight state of the drone according to the first relative-orientation information and the gesture instruction type, so that the relative orientation between the drone and the gesture region equals the preset value.
After the first relative-orientation information has been obtained and the gesture instruction type determined, second relative-orientation information can be computed from the first relative-orientation information and the preset relative-orientation value represented by the gesture instruction type; the control unit 14 then controls the drone to adjust its flight state according to the second relative-orientation information, adjusting its relative orientation to the gesture region. The preset relative-orientation value may include the adjusted distance, azimuth and height difference between the drone and the gesture region.
Accordingly, after the second relative-orientation information is obtained, the control unit 14 may flexibly select at least one of the following schemes according to actual needs, so that the relative orientation between the drone and the gesture region equals the preset value:
First, controlling the drone to fly to the coordinate position represented by the position information in the second relative-orientation information, so that the distance between the drone and the gesture region equals the preset distance in the preset value.
Second, controlling the drone to adjust its heading angle according to the azimuth information in the second relative-orientation information, so that the drone faces the gesture region.
Third, controlling the drone to adjust its flight altitude according to the second height-difference information in the second relative-orientation information, so that the height difference between the drone and the gesture region equals the height difference in the preset value or the height difference represented by the first relative-orientation information.
The coordinate position C(x3, y3) represented by the position information in the second relative-orientation information can be derived from the drone coordinate position A(x1, y1) represented by the position information in the first relative-orientation information and the preset value, using the foregoing formula. From the heading distance, the horizontal distance between drone and operator can be adjusted for horizontal movement; from the heading angle, the nose heading, in effect the shooting direction of the camera unit 11, can be adjusted, so that when the remote control's trajectory changes the drone reorients its nose and the camera keeps the operator locked; and from the height difference, the height between drone and operator can be adjusted, enabling uphill and downhill motion.
In embodiments of the above schemes, referring to Figure 3, in some scenarios the control unit 14 may, according to the first relative-orientation information and the preset value, change only the drone's distance to the gesture region while keeping the azimuth and height difference constant, maintaining the preset distance; or change only the azimuth while keeping distance and height difference constant, so that the drone circles the gesture region for shooting and similar operations. The user can thus control the drone's relative orientation simply and quickly.
Since in real scenarios the user may be moving, changes in the user's position or height change the relative position. Therefore, to keep the relative orientation between drone and gesture region at the preset value, in some embodiments of the present invention the detection unit 13 is further configured to detect third relative-orientation information between the drone and the gesture region after the relative orientation has reached the preset value; the principle and steps are the same as for detecting the first relative-orientation information and are not repeated here. Further, a determination unit is included, and the determination unit and the control unit 14 may be configured according to actual needs in at least one of the following schemes:
First, the determination unit is configured to determine whether the rate of change of the distance represented by the third relative-orientation information exceeds a preset speed range; the control unit 14 is configured to control the drone, when it does, to adjust its flight speed so that the distance between the drone and the gesture region stays within a predetermined distance range.
Second, the determination unit is configured to determine whether the height difference represented by the third relative-orientation information is smaller than a first preset height difference; the control unit 14 is configured to control the drone's flight state, when it is, to raise the height difference between the drone and the gesture region to the first preset height difference.
Third, the determination unit is configured to determine whether the height difference represented by the third relative-orientation information is greater than a second preset height difference; the control unit 14 is configured to control the drone's flight state, when it is, to lower the height difference between the drone and the gesture region to the second preset height difference.
In the above schemes, the rate of change of the distance represented by the third relative-orientation information can characterize the relative speed between the user and the drone; by adjusting the drone's flight speed, the distance between drone and gesture region is kept within the predetermined range. The preset speed range and predetermined distance range may be set in advance according to actual needs and/or derived from the preset relative-orientation value represented by the gesture instruction type, the goal being that the drone follows the user well without losing track, and/or captures reasonably clear footage while shooting. Specifically, since ranging sensors generally have low update rates, when the gesture region moves quickly — that is, when the user moves fast — the drone's reaction tends to lag. Besides raising the ranging sensor's update rate, the distance to the gesture region can be measured with the ranging sensor, the gesture region's speed computed periodically, and the drone's following speed adjusted in real time accordingly, keeping the relative orientation within the preset range, achieving good following behavior and improving the interaction experience when the user moves fast.
As shown in Figure 4, when the height difference between the user and the drone is smaller than the first preset height difference, the drone's flight state is controlled to raise the height difference to the first preset height difference; while the user moves uphill the drone can thus adjust its own height to maintain the preset relative height to the operator and avoid collision accidents. Preferably, the first preset height difference is 0.5 meters. Similarly, when the height difference between the operator and the drone is greater than the second preset height difference, the drone's flight state is controlled to lower the height difference to the second preset height difference; while the user moves downhill the drone thus maintains the preset relative height and avoids collision accidents. Preferably, the second preset height difference is 0.5 meters.
With the drone relative-orientation control apparatus shown in these embodiments of the present invention, the drone's flight state is adjusted in real time from the relative orientation between drone and user, achieving control of that relative orientation. Compared with the prior art, this reduces the complexity of drone control operations and the risk of operating errors. Note that the "user" in this embodiment need not be a real person; it may also be another drone, a car on the ground, or similar equipment.
Another embodiment improves on the previous one: to guard against loss of the drone and safety accidents, a sending unit is further included, configured to send an alert instruction to the wearable device over the trusted connection when the drone determines that any one of the following orientation-deviation conditions is met: the rate of change of the distance represented by the third relative-orientation information exceeds the preset speed range; the height difference represented by the third relative-orientation information is smaller than the first preset height difference; or the height difference represented by the third relative-orientation information is greater than the second preset height difference.
When the drone detects the effect of such environmental changes, the sending unit sends the alert instruction to the wearable device to notify the user that the drone is in an orientation-deviation state, helping the user make timely adjustments and reducing the risk of losing the drone or causing an accident.
In some embodiments of the present invention, to let the drone drive the wearable device's infrared emission and reduce the wearable device's power consumption, the sending unit is further configured to send, over the trusted connection, a drive instruction to the wearable device for driving it to emit infrared light.
The drone and the wearable device usually transmit data and instructions over a communication connection, generally wireless; in some cases, for example when they are far apart or the electromagnetic environment is complex, signal amplification equipment such as a signal repeater may also be used. In one embodiment, to ensure accurate and safe human-machine interaction, a trusted connection is used, so that only a drone and a remote control device that have passed identity (ID) verification can interact.
As noted above, to improve the accuracy and safety of drone control, another embodiment of the present invention further includes a first communication unit configured to authenticate the wearable device over the communication connection and, when authentication succeeds, establish a trusted connection between the drone and the wearable device. This ensures that only an ID-verified wearable device can establish a trusted connection with the drone and interact with it, preventing misjudgment by the recognition device or malicious interference by others, and improving system accuracy and safety.
As this disclosure of the drone relative-orientation control apparatus shows, implementing the present invention — determining the gesture region and recognizing gestures through infrared imaging, combined with the drone's own orientation information to determine the relative orientation — can improve the efficiency of drone relative-orientation control and the user experience.
Following the modular design approach, the present invention further proposes, on the basis of the above drone relative-orientation auxiliary control method for a wearable device, a drone relative-orientation auxiliary control apparatus for a wearable device.
Referring to Figure 7, an embodiment of the drone relative-orientation auxiliary control apparatus for a wearable device of the present invention comprises a first receiving unit 21, a drive unit 22, a second receiving unit 23 and an alert unit 24, whose functions are disclosed as follows:
The first receiving unit 21 is configured to receive, over a trusted connection, the drone's drive instruction for driving the wearable device to emit infrared light.
The wearable device interacts with the drone through gesture recognition and a communication connection. Generally, the communication connection is wireless; in some cases, for example when the wearable device and the drone are far apart or the electromagnetic environment is complex, signal amplification equipment such as a signal repeater may also be used. In one embodiment, to ensure accurate and safe human-machine interaction, a trusted connection may be used, so that only an ID-verified drone and wearable device can interact. In possible embodiments, the trusted connection between drone and wearable device may be any one or more of a Bluetooth trusted connection, a near-field communication connection, a UWB trusted connection, a ZigBee trusted connection or an Internet trusted connection. Over this connection the first receiving unit 21 receives the drone's drive instruction, which drives the wearable device to emit infrared light.
The drive unit 22 is configured to drive, in response to the drive instruction, the wearable device's built-in infrared light-emitting assembly to emit infrared light, so that the drone determines, from infrared imaging, a gesture region and a gesture instruction type representing a preset relative-orientation value, for use in drone relative-orientation control.
The wearable device includes an infrared light-emitting assembly; in one embodiment, its infrared light-emitting diodes are arranged linearly along the side of the wearable device, and the assembly provides one or more infrared point light sources, such as infrared LEDs, for emitting infrared light. Another embodiment improves on the previous one: so that the emitted infrared light favors gesture recognition, the wearable device is adapted to be worn on the arm, placing the gesture region between the wearable device and the recognition device. In one possible design, in response to the drone's drive instruction the drive unit 22 drives the wearable device's built-in infrared light-emitting assembly to emit infrared light forming an infrared halo, so that the drone determines the gesture region and the gesture instruction type from infrared imaging for relative-orientation control.
Once the assembly emits infrared light, the drone captures gestures from infrared imaging and generates the corresponding gesture interaction events. The infrared light diffusely reflects off the back of the user's hand, "lighting up" the hand outline so that the gesture region is distinguished from the background in infrared imaging; the light may also be partly absorbed by the hand, making the hand's infrared image more distinct. Gesture segmentation based on infrared imaging therefore reduces the processor's computational load, shortens response time, and improves the efficiency and accuracy of gesture recognition and relative-position control, especially while the drone or the user is moving.
In one embodiment, in response to the drive instruction the drive unit 22 drives only a preset subset of the infrared point light sources in the assembly, allowing adjustment to the actual environment: more sources in complex backgrounds or dim light, fewer otherwise, lowering the wearable device's power consumption and extending its operating time while preserving effectiveness.
In another embodiment, the wavelength range of the infrared light emitted under control is 0.76~2.5 μm, so that the hand outline in the acquired infrared image is formed mainly by infrared light reflected from the hand; this is relatively safe for the human body and recognizes better.
The second receiving unit 23 is configured to receive the drone's alert instruction over the trusted connection.
The wearable device may interact with the drone over wireless communication. In some cases, for example when they are far apart or the electromagnetic environment is complex, signal amplification equipment such as a signal repeater may also be used; in one embodiment, to ensure accurate and safe interaction, a trusted connection may be used, so that only an ID-verified drone and wearable device can interact.
To guard against loss of the drone and safety accidents, the second receiving unit 23 receives over the trusted connection the alert instruction the drone sends when it determines that an orientation-deviation condition is met, notifying the user that the drone is in an orientation-deviation state.
The alert unit 24 is configured to control the wearable device, in response to the alert instruction, to start a vibration motor and/or turn on an indicator light, notifying the user that the drone is currently in an orientation-deviation state.
After the wearable device receives the alert instruction, the alert unit 24 responds by starting the vibration motor and/or turning on the indicator light. The vibration pattern of the motor and/or the flashing pattern of the light may be a preset pattern or patterns set according to the different drone orientation-deviation states the alert instruction represents; they may be configured at the factory or by the user.
Notifying the user that the drone is in an orientation-deviation state helps the user make timely adjustments and reduces the risk of losing the drone or causing an accident.
In some embodiments of the present invention, to reduce the wearable device's power consumption, a shutoff unit is further included, configured to time how long the infrared light-emitting assembly has been working and, when that working duration exceeds a predetermined value, control the assembly to stop emitting infrared light. The wearable device thus automatically turns off the infrared assembly once it has emitted longer than the predetermined duration, effectively preventing power waste caused by user negligence; the user can also set the predetermined duration to control the emission time, improving efficiency.
As noted above, in some embodiments of the present invention, to improve the accuracy and safety of drone relative-orientation control, the drone relative-orientation auxiliary control apparatus for a wearable device further includes a second communication unit configured to send an identity-verification request to the drone over the communication connection and, when verification succeeds, establish a trusted connection between the wearable device and the drone. This ensures that only an ID-verified wearable device can establish a trusted connection with the drone and interact with it, preventing misjudgment by the drone or malicious interference by others, and improving system accuracy and safety.
As this disclosure of the drone relative-orientation auxiliary control apparatus for a wearable device shows, implementing the present invention — interacting with the drone through infrared gestures and communication — can improve the efficiency of drone relative-orientation control and the user experience.
Referring to Figure 8, another embodiment of the present invention further provides a drone control apparatus having the functionality to implement the above drone relative-orientation control method. The functionality may be implemented in hardware, or by hardware executing corresponding software; the hardware or software comprises one or more units corresponding to the functions above.
In one possible design, the structure of the drone control apparatus includes:
one or more cameras 707, at least one of which has infrared imaging capability;
one or more sensors 708 for detecting the relative-orientation information;
a memory 702 for storing a program supporting execution of the above drone relative-orientation control method;
a communication interface 703 for the drone to communicate with the wearable device, other devices or a communication network;
one or more processors 704 for executing the program stored in the memory;
one or more application programs 705, the one or more application programs being stored in the memory and configured to be executed by the one or more processors;
the one or more programs 705 being used to drive the one or more processors 704 to construct units for executing any of the above drone relative-orientation control methods.
Figure 8 is a block diagram of the part of the drone relevant to the drone relative-orientation control apparatus provided by the embodiment of the present invention. It includes a memory 702, a communication interface 703, one or more processors 704, one or more application programs 705, a power supply 706, one or more cameras 707 and one or more sensors 708, among other components. Those skilled in the art will appreciate that the structure shown in Figure 8 does not limit the drone, which may include more or fewer components than shown, combine certain components, or arrange components differently.
The components of the drone are described below with reference to Figure 8:
The memory 702 may store software programs and modules; by running the software programs and modules stored in the memory 702, the processor 704 executes the drone's various functional applications and data processing. The memory 702 may mainly comprise a program storage area and a data storage area: the program storage area may store an operating system and the application programs 705 required for at least one function, while the data storage area may store data created through use of the drone. In addition, the memory 702 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash device, or other non-volatile solid-state storage device.
The communication interface 703 is used in the above control process for the drone to communicate with the wearable device, other devices or a communication network. It is the interface through which the processor 704 communicates with external subsystems, transmitting information between the processor 704 and the outside so as to control those subsystems.
The processor 704 is the drone's control center. It connects all parts of the drone relative-orientation control apparatus through the various communication interfaces 703 and lines, and, by running or executing the software programs and/or modules stored in the memory 702 and calling the data stored there, performs the drone's various functions and processes data, monitoring the drone as a whole. Optionally, the processor 704 may include one or more processing units; preferably, the processor 704 may integrate an application processor, which mainly handles the operating system, user interface and application programs 705, and a modem processor, which mainly handles wireless communication. It will be understood that the modem processor need not be integrated into the processor 704.
The one or more application programs 705 are preferably stored in the memory 702 and configured to be executed by the one or more processors 704, the one or more programs being configured for the functions implemented by any embodiment of the drone relative-orientation control method.
A power supply 706 (such as a battery) powers the components; preferably, the power supply 706 is logically connected to the processor 704 through a power management system, which manages charging, discharging and power consumption.
The drone may further include one or more cameras 707, at least one of which has infrared imaging capability; these cameras 707 are connected to and controlled by the processor 704, and the images they acquire may be stored in the memory 702.
The one or more sensors 708 include an inertial measurement unit (IMU, containing an acceleration sensor and a gyroscope sensor), a magnetometer, an orientation sensor, a ranging sensor, a satellite positioning sensor (e.g. a GPS or BeiDou sensor) and an image sensor, generating sensor data from which the azimuth, heading, image, positioning and distance information used for drone control is produced, reflecting the drone's in-flight parameters so that it can adjust itself and achieve relative-orientation control.
Although not shown, the drone may further include a Bluetooth module and the like, which are not described here.
In this embodiment of the present invention, the processor 704 included in the drone further has the following functions:
acquiring an infrared image formed under the infrared light emitted by a wearable device;
determining, from the infrared image, a gesture region and a gesture instruction type representing a preset relative-orientation value;
detecting first relative-orientation information between the drone and the gesture region;
controlling the flight state of the drone according to the first relative-orientation information and the gesture instruction type, so that the relative orientation between the drone and the gesture region equals the preset value.
An embodiment of the present invention further provides a computer storage medium for storing the computer software instructions used by the above drone relative-orientation control apparatus, including the program designed for the drone as described above.
Referring to Figure 9, another embodiment of the present invention further provides a wearable-device control apparatus having the functionality to implement the above drone relative-orientation auxiliary control method for a wearable device. The functionality may be implemented in hardware, or by hardware executing corresponding software; the hardware or software comprises one or more units corresponding to the functions above.
In one possible design, the structure of the wearable-device control apparatus includes:
a memory 702 for storing a program supporting the wearable device in executing the above drone relative-orientation auxiliary control method for a wearable device;
a communication interface 703 for the wearable device to communicate with the drone, other devices or a communication network;
a vibration motor and/or indicator light 710 for notifying the user of the drone's current state;
one or more processors 704 for executing the program stored in the memory;
an infrared light-emitting assembly 709, including one or more infrared light sources, for emitting infrared light;
one or more application programs 705, the one or more application programs being stored in the memory and configured to be executed by the one or more processors;
the one or more programs 705 being used to drive the one or more processors 704 to construct units for executing any of the above drone relative-orientation auxiliary control methods for a wearable device.
Figure 9 is a block diagram of the part of a smart wristband relevant to the wearable-device control apparatus provided by the embodiment of the present invention. It includes a memory 702, a communication interface 703, one or more processors 704, one or more application programs 705, a power supply 706, an infrared light-emitting assembly 709 and an indicator light 710, among other components. Those skilled in the art will appreciate that the structure shown in Figure 9 does not limit the wristband, which may include more or fewer components than shown, combine certain components, or arrange components differently.
The components of the smart wristband are described below with reference to Figure 9:
The memory 702 may store software programs and modules; by running the software programs and modules stored in the memory 702, the processor 704 executes the wearable device's various functional applications and data processing. The memory 702 may mainly comprise a program storage area and a data storage area: the program storage area may store an operating system and the application programs 705 required for at least one function, while the data storage area may store data created through use of the wearable device. In addition, the memory 702 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash device, or other non-volatile solid-state storage device.
The communication interface 703 is used in the above control process for the smart wristband to communicate with the drone relative-orientation control apparatus, other devices or a communication network. It is the interface through which the processor 704 communicates with external subsystems, transmitting information between the processor 704 and the outside so as to control those subsystems.
The processor 704 is the smart wristband's control center. It connects all parts of the wearable device through the various communication interfaces 703 and lines, and, by running or executing the software programs and/or modules stored in the memory 702 and calling the data stored there, performs the wearable device's various functions and processes data, monitoring the wearable device as a whole. Optionally, the processor 704 may include one or more processing units; preferably, the processor 704 may integrate an application processor, which mainly handles the operating system, user interface and application programs 705, and a modem processor, which mainly handles wireless communication. It will be understood that the modem processor need not be integrated into the processor 704.
The one or more application programs 705 are preferably stored in the memory 702 and configured to be executed by the one or more processors 704, the one or more programs being configured for the functions implemented by any embodiment of the drone relative-orientation auxiliary control method for a wearable device.
The infrared light-emitting assembly 709 provides one or more infrared point light sources, such as infrared LEDs, for emitting infrared light.
The indicator light 710 may flash in a preset pattern or in patterns set according to the different drone orientation-deviation states the alert instruction represents, notifying the user that the drone is currently in an orientation-deviation state.
In this embodiment of the present invention, the processor 704 included in the wearable device further has the following functions:
receiving, over a trusted connection, the drone's drive instruction for driving the wearable device to emit infrared light;
driving, in response to the drive instruction, the wearable device's built-in infrared light-emitting assembly to emit infrared light, so that the drone determines, from infrared imaging, a gesture region and a gesture instruction type representing a preset relative-orientation value, for use in drone relative-orientation control;
receiving the drone's alert instruction over the trusted connection;
controlling the wearable device, in response to the alert instruction, to start a vibration motor and/or turn on an indicator light, notifying the user that the drone is currently in an orientation-deviation state.
Those skilled in the art will clearly understand that, for convenience and brevity of description, the specific working processes of the systems, apparatuses and units described above may refer to the corresponding processes in the foregoing method embodiments and are not repeated here.
Figure 10 shows a drone capable of relative-orientation control, or a wearable device for drone relative-orientation auxiliary control, according to the present invention (below, drone and wearable device are collectively called the device). The device conventionally includes a processor 1010 and a computer program product or computer-readable medium in the form of a memory 1020. The memory 1020 may be an electronic memory such as flash memory, EEPROM (electrically erasable programmable read-only memory), EPROM, a hard disk or ROM. The memory 1020 has a storage space 1030 for program code 1031 for performing any of the method steps above; for example, the storage space 1030 for program code may include individual pieces of program code 1031 for implementing the various steps of the methods above. The program code can be read from, or written into, one or more computer program products, which comprise program-code carriers such as hard disks, compact discs (CD), memory cards or floppy disks. Such a computer program product is typically a portable or fixed storage unit as described with reference to Figure 11; the storage unit may have storage segments or storage spaces arranged similarly to the memory 1020 in Figure 10, and the program code may, for example, be compressed in an appropriate form. Typically, the storage unit includes program code 1031' for performing the method steps according to the present invention, i.e. code readable by a processor such as 1010, which, when run by the device, causes the device to perform the various steps of the methods described above.
In the several embodiments provided in this application, those skilled in the art will understand that the disclosed systems, apparatuses and methods may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into units is only a logical functional division, and other divisions are possible in actual implementation; multiple units or components may be combined or integrated into another system, and some features may be omitted or not executed. Further, the mutual couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, apparatuses or units, and may be electrical, mechanical or of other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiment's solution.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, may exist separately physically, or may be integrated two or more to a unit. The integrated units may be implemented in hardware or as software functional units.
Those of ordinary skill in the art will understand that all or some of the steps of the various methods in the above embodiments can be completed by program instructions controlling the relevant hardware; the program may be stored in a computer-readable storage medium, which may include read-only memory (ROM), random access memory (RAM), magnetic disks, optical discs, and the like.
The drone relative-orientation control method and apparatus provided by the present invention have been described in detail above. Those of ordinary skill in the art, following the ideas of the embodiments of the present invention, may make changes in the specific implementation and scope of application; in summary, the content of this specification should not be construed as limiting the present invention.

Claims (37)

  1. A drone relative-orientation control method, comprising the following steps:
    acquiring an infrared image formed after infrared light emitted by a wearable device strikes a gesture region;
    determining, from the infrared image, the gesture region whose outline is described by the infrared light and a gesture instruction type representing a preset relative-orientation value;
    detecting first relative-orientation information between the drone and the gesture region;
    controlling the flight state of the drone according to the first relative-orientation information and the gesture instruction type, so that the relative orientation between the drone and the gesture region equals the preset value.
  2. The drone relative-orientation control method according to claim 1, wherein the process of controlling the flight state of the drone according to the first relative-orientation information and the gesture instruction type specifically comprises:
    computing second relative-orientation information from the first relative-orientation information and the preset relative-orientation value represented by the gesture instruction type, and controlling the drone to adjust its flight state according to the second relative-orientation information.
  3. The drone relative-orientation control method according to claim 2, wherein the process of controlling the drone to adjust its flight state according to the second relative-orientation information specifically comprises:
    controlling the drone to fly to the coordinate position represented by the position information in the second relative-orientation information, so that the distance between the drone and the gesture region equals the preset distance in the preset value.
  4. The drone relative-orientation control method according to claim 3, wherein the process of controlling the drone to adjust its flight state according to the second relative-orientation information further comprises:
    controlling the drone to adjust its heading angle according to the azimuth information in the second relative-orientation information, so that the drone faces the gesture region.
  5. The drone relative-orientation control method according to claim 3, wherein the process of controlling the drone to adjust its flight state according to the second relative-orientation information further comprises:
    controlling the drone to adjust its flight altitude according to the second height-difference information in the second relative-orientation information, so that the height difference between the drone and the gesture region equals the height difference in the preset value or the height difference represented by the first relative-orientation information.
  6. The drone relative-orientation control method according to claim 5, wherein the process of detecting the relative-orientation information between the drone and the gesture region comprises:
    detecting the drone's positioning information with the drone's satellite positioning sensor;
    detecting the distance information between the drone and the gesture region with the drone's ranging sensor;
    detecting the azimuth information between the drone and the gesture region with the drone's orientation sensor;
    detecting the elevation-angle information between the drone and the gesture region with the drone's gyroscope sensor, and computing the height-difference information between the drone and the gesture region from the distance information and the elevation-angle information;
    computing the relative-orientation information between the drone and the gesture region from the positioning information, height-difference information, azimuth information and distance information.
  7. The drone relative-orientation control method according to claim 6, wherein the process of determining the gesture region and the gesture instruction type from the infrared image comprises:
    obtaining one or more frames from the preview infrared images captured by the camera unit;
    determining, in the frames, the gesture region described by infrared light;
    extracting gesture feature data from the gesture region and matching it against preset gesture-instruction-type description data to determine the corresponding gesture instruction type.
  8. The drone relative-orientation control method according to claim 6, further comprising the following subsequent steps:
    detecting third relative-orientation information between the drone and the gesture region;
    determining whether the rate of change of the distance represented by the third relative-orientation information exceeds a preset speed range;
    when it does, controlling the drone to adjust its flight speed, so that the distance between the drone and the gesture region stays within a predetermined distance range.
  9. The drone relative-orientation control method according to claim 8, further comprising the following subsequent steps:
    determining whether the height difference represented by the third relative-orientation information is smaller than a first preset height difference;
    when it is, controlling the drone's flight state to raise the height difference between the drone and the gesture region to the first preset height difference.
  10. The drone relative-orientation control method according to claim 9, further comprising the following subsequent steps:
    determining whether the height difference represented by the third relative-orientation information is greater than a second preset height difference;
    when it is, controlling the drone's flight state to lower the height difference between the drone and the gesture region to the second preset height difference.
  11. The drone relative-orientation control method according to claim 10, further comprising the following subsequent step:
    when it is determined that any one of the following orientation-deviation conditions is met, sending an alert instruction to the wearable device over a trusted connection:
    the rate of change of the distance represented by the third relative-orientation information exceeds the preset speed range;
    the height difference represented by the third relative-orientation information is smaller than the first preset height difference;
    the height difference represented by the third relative-orientation information is greater than the second preset height difference.
  12. The drone relative-orientation control method according to claim 6, further comprising the following preliminary step:
    sending, over a trusted connection, a drive instruction to the wearable device for driving it to emit infrared light.
  13. The drone relative-orientation control method according to claim 11 or 12, further comprising the following preliminary steps:
    authenticating the wearable device over a communication connection;
    when the identity verification succeeds, establishing a trusted connection between the drone and the wearable device.
  14. A drone relative-orientation auxiliary control method for a wearable device, comprising the following steps:
    receiving, over a trusted connection, the drone's drive instruction for driving the wearable device to emit infrared light;
    in response to the drive instruction, driving the wearable device's built-in infrared light-emitting assembly to emit infrared light, so that the drone determines, from infrared imaging, a gesture region and a gesture instruction type representing a preset relative-orientation value, for use in drone relative-orientation control;
    receiving the drone's alert instruction over the trusted connection;
    in response to the alert instruction, controlling the wearable device to start a vibration motor and/or turn on an indicator light, to notify the user that the drone is currently in an orientation-deviation state.
  15. The drone relative-orientation auxiliary control method for a wearable device according to claim 14, further comprising the following preliminary steps:
    sending an identity-verification request to the drone over a communication connection;
    when the identity verification succeeds, establishing a trusted connection between the wearable device and the drone.
  16. The drone relative-orientation auxiliary control method for a wearable device according to claim 14, further comprising the following concurrent step:
    timing the working duration of the infrared light-emitting assembly and, when that working duration exceeds a predetermined value, controlling the infrared light-emitting assembly to stop emitting infrared light.
  17. A drone relative-orientation control apparatus, comprising:
    at least one processor;
    and at least one memory communicatively connected to the at least one processor, the at least one memory containing processor-executable instructions which, when executed by the at least one processor, cause the apparatus to perform at least the following operations:
    acquiring an infrared image formed after infrared light emitted by a wearable device strikes a gesture region;
    determining, from the infrared image, the gesture region whose outline is described by the infrared light and a gesture instruction type representing a preset relative-orientation value;
    detecting first relative-orientation information between the drone and the gesture region;
    controlling the flight state of the drone according to the first relative-orientation information and the gesture instruction type, so that the relative orientation between the drone and the gesture region equals the preset value.
  18. The drone relative-orientation control apparatus according to claim 17, wherein the process of controlling the flight state of the drone according to the first relative-orientation information and the gesture instruction type specifically comprises:
    computing second relative-orientation information from the first relative-orientation information and the preset relative-orientation value represented by the gesture instruction type, and controlling the drone to adjust its flight state according to the second relative-orientation information.
  19. The drone relative-orientation control apparatus according to claim 18, wherein the process of controlling the drone to adjust its flight state according to the second relative-orientation information specifically comprises:
    controlling the drone to fly to the coordinate position represented by the position information in the second relative-orientation information, so that the distance between the drone and the gesture region equals the preset distance in the preset value.
  20. The drone relative-orientation control apparatus according to claim 19, wherein the process of controlling the drone to adjust its flight state according to the second relative-orientation information further comprises:
    controlling the drone to adjust its heading angle according to the azimuth information in the second relative-orientation information, so that the drone faces the gesture region.
  21. The drone relative-orientation control apparatus according to claim 19, wherein the process of controlling the drone to adjust its flight state according to the second relative-orientation information further comprises:
    controlling the drone to adjust its flight altitude according to the second height-difference information in the second relative-orientation information, so that the height difference between the drone and the gesture region equals the height difference in the preset value or the height difference represented by the first relative-orientation information.
  22. The drone relative-orientation control apparatus according to claim 21, wherein the process of detecting the relative-orientation information between the drone and the gesture region comprises:
    detecting the drone's positioning information with the drone's satellite positioning sensor;
    detecting the distance information between the drone and the gesture region with the drone's ranging sensor;
    detecting the azimuth information between the drone and the gesture region with the drone's orientation sensor;
    detecting the elevation-angle information between the drone and the gesture region with the drone's gyroscope sensor, and computing the height-difference information between the drone and the gesture region from the distance information and the elevation-angle information;
    computing the relative-orientation information between the drone and the gesture region from the positioning information, height-difference information, azimuth information and distance information.
  23. The drone relative-orientation control apparatus according to claim 22, wherein the process of determining the gesture region and the gesture instruction type from the infrared image comprises:
    obtaining one or more frames from the preview infrared images captured by the camera unit;
    determining, in the frames, the gesture region described by infrared light;
    extracting gesture feature data from the gesture region and matching it against preset gesture-instruction-type description data to determine the corresponding gesture instruction type.
  24. The drone relative-orientation control apparatus according to claim 22, wherein the operations further comprise:
    detecting third relative-orientation information between the drone and the gesture region;
    determining whether the rate of change of the distance represented by the third relative-orientation information exceeds a preset speed range;
    when it does, controlling the drone to adjust its flight speed, so that the distance between the drone and the gesture region stays within a predetermined distance range.
  25. The drone relative-orientation control apparatus according to claim 24, wherein the operations further comprise:
    determining whether the height difference represented by the third relative-orientation information is smaller than a first preset height difference;
    when it is, controlling the drone's flight state to raise the height difference between the drone and the gesture region to the first preset height difference.
  26. The drone relative-orientation control apparatus according to claim 25, wherein the operations further comprise:
    determining whether the height difference represented by the third relative-orientation information is greater than a second preset height difference;
    when it is, controlling the drone's flight state to lower the height difference between the drone and the gesture region to the second preset height difference.
  27. The drone relative-orientation control apparatus according to claim 26, wherein the operations further comprise:
    when it is determined that any one of the following orientation-deviation conditions is met, sending an alert instruction to the wearable device over a trusted connection:
    the rate of change of the distance represented by the third relative-orientation information exceeds the preset speed range;
    the height difference represented by the third relative-orientation information is smaller than the first preset height difference;
    the height difference represented by the third relative-orientation information is greater than the second preset height difference.
  28. The drone relative-orientation control apparatus according to claim 22, wherein the operations further comprise:
    sending, over a trusted connection, a drive instruction to the wearable device for driving it to emit infrared light.
  29. The drone relative-orientation control apparatus according to claim 28, wherein the operations further comprise:
    authenticating the wearable device over a communication connection;
    when the identity verification succeeds, establishing a trusted connection between the drone and the wearable device.
  30. A drone relative-orientation auxiliary control apparatus for a wearable device, comprising:
    at least one processor;
    and at least one memory communicatively connected to the at least one processor, the at least one memory containing processor-executable instructions which, when executed by the at least one processor, cause the apparatus to perform at least the following operations:
    receiving, over a trusted connection, the drone's drive instruction for driving the wearable device to emit infrared light;
    in response to the drive instruction, driving the wearable device's built-in infrared light-emitting assembly to emit infrared light, so that the drone determines, from infrared imaging, a gesture region and a gesture instruction type representing a preset relative-orientation value, for use in drone relative-orientation control;
    receiving the drone's alert instruction over the trusted connection;
    in response to the alert instruction, controlling the wearable device to start a vibration motor and/or turn on an indicator light, to notify the user that the drone is currently in an orientation-deviation state.
  31. The drone relative-orientation auxiliary control apparatus for a wearable device according to claim 30, wherein the operations further comprise:
    sending an identity-verification request to the drone over a communication connection;
    when the identity verification succeeds, establishing a trusted connection between the wearable device and the drone.
  32. The drone relative-orientation auxiliary control apparatus for a wearable device according to claim 30, wherein the operations further comprise:
    timing the working duration of the infrared light-emitting assembly and, when that working duration exceeds a predetermined value, controlling the infrared light-emitting assembly to stop emitting infrared light.
  33. A drone control apparatus, comprising:
    one or more cameras, at least one of which has infrared imaging capability;
    one or more sensors for detecting the relative-orientation information;
    a memory for storing a program supporting execution of the above drone relative-orientation control method;
    a communication interface for the drone to communicate with the wearable device, other devices or a communication network;
    one or more processors for executing the program stored in the memory;
    one or more application programs, the one or more application programs being stored in the memory and configured to be executed by the one or more processors;
    the one or more programs being used to drive the one or more processors to construct units for executing the drone relative-orientation control method according to any one of claims 1 to 13.
  34. A wearable-device control apparatus, comprising:
    a memory for storing a program supporting the wearable device in executing the above drone relative-orientation auxiliary control method for a wearable device;
    a communication interface for the wearable device to communicate with the drone, other devices or a communication network;
    a vibration motor and/or an indicator light for notifying the user of the drone's current state;
    one or more processors for executing the program stored in the memory;
    an infrared light-emitting assembly, including one or more infrared light sources, for emitting infrared light;
    one or more application programs, the one or more application programs being stored in the memory and configured to be executed by the one or more processors;
    the one or more programs being used to drive the one or more processors to construct units for executing the drone relative-orientation auxiliary control method for a wearable device according to any one of claims 14 to 16.
  35. A computer program comprising computer-readable code which, when run by a drone, causes the method according to any one of claims 1 to 13 to be performed.
  36. A computer program comprising computer-readable code which, when run by a wearable device, causes the method according to any one of claims 14 to 16 to be performed.
  37. A computer-readable medium storing the computer program according to claim 35 or 36.
PCT/CN2017/114974 2016-12-07 2017-12-07 Method and device for controlling the relative orientation of an unmanned aerial vehicle WO2018103689A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201611114763.4 2016-12-07
CN201611114763.4A CN106444843B (zh) Method and device for controlling the relative orientation of an unmanned aerial vehicle

Publications (1)

Publication Number Publication Date
WO2018103689A1 (zh) 2018-06-14

Family

ID=58216143

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/114974 WO2018103689A1 (zh) 2016-12-07 2017-12-07 无人机相对方位控制方法及装置

Country Status (2)

Country Link
CN (1) CN106444843B (zh)
WO (1) WO2018103689A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109857260A (zh) * 2019-02-27 2019-06-07 百度在线网络技术(北京)有限公司 Control method, device and system for three-dimensional interactive images
CN115841487A (zh) * 2023-02-20 2023-03-24 深圳金三立视频科技股份有限公司 Hidden-danger locating method and terminal along a power transmission line

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106444843B (zh) 2016-12-07 2019-02-15 北京奇虎科技有限公司 Method and device for controlling the relative orientation of an unmanned aerial vehicle
CN109690440B (zh) * 2017-03-31 2022-03-08 深圳市大疆创新科技有限公司 Flight control method for an unmanned aerial vehicle, and unmanned aerial vehicle
WO2018191840A1 (zh) * 2017-04-17 2018-10-25 英华达(上海)科技有限公司 Interactive photographing system and method for an unmanned aerial vehicle
WO2018195883A1 (zh) * 2017-04-28 2018-11-01 深圳市大疆创新科技有限公司 Control method and device for an unmanned aerial vehicle, and unmanned aerial vehicle
CN107024725B (zh) * 2017-05-31 2023-09-22 湖南傲英创视信息科技有限公司 Large-field-of-view low-light low-altitude unmanned aerial vehicle detection device
CN107643074B (zh) * 2017-09-07 2019-12-03 天津津航技术物理研究所 Azimuth presetting method for whisk-broom imaging of an aerial scanner
WO2020019193A1 (zh) * 2018-07-25 2020-01-30 深圳市大疆创新科技有限公司 Unmanned aerial vehicle control method and system, and unmanned aerial vehicle
CN109270954A (zh) * 2018-10-30 2019-01-25 西南科技大学 Gesture-recognition-based unmanned aerial vehicle interaction system and control method therefor
CN109725637B (zh) * 2018-12-04 2021-10-15 广东嘉腾机器人自动化有限公司 AGV packet-loss-prevention scheduling method, storage device and AGV traffic management system
WO2021026780A1 (zh) * 2019-08-13 2021-02-18 深圳市大疆创新科技有限公司 Shooting control method, terminal, gimbal, system and storage medium
CN112051856B (zh) * 2020-07-31 2024-01-19 深圳市贝贝特科技实业有限公司 Composite sensing system for dynamic recovery of an unmanned aerial vehicle
CN114442305A (zh) * 2020-11-02 2022-05-06 上海迈利船舶科技有限公司 Vision-enhanced AIS ship telescope

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105138126A * 2015-08-26 2015-12-09 小米科技有限责任公司 Shooting control method and device for an unmanned aerial vehicle, and electronic device
CN105518576A * 2013-06-28 2016-04-20 陈家铭 Controlling device operation according to hand gestures
WO2016078742A1 (de) * 2014-11-20 2016-05-26 Audi Ag Method for operating a navigation system of a motor vehicle by means of an operating gesture
CN105677300A * 2016-02-04 2016-06-15 普宙飞行器科技(深圳)有限公司 Method for controlling an unmanned aerial vehicle based on gesture recognition, unmanned aerial vehicle and system
CN105676860A * 2016-03-17 2016-06-15 歌尔声学股份有限公司 Wearable device, unmanned aerial vehicle control apparatus and control implementation method
CN106054914A * 2016-08-17 2016-10-26 腾讯科技(深圳)有限公司 Aircraft control method and aircraft control device
CN106094846A * 2016-05-31 2016-11-09 中国航空工业集团公司西安飞机设计研究所 Aircraft flight control method
CN106444843A * 2016-12-07 2017-02-22 北京奇虎科技有限公司 Method and device for controlling the relative orientation of an unmanned aerial vehicle



Also Published As

Publication number Publication date
CN106444843B (zh) 2019-02-15
CN106444843A (zh) 2017-02-22

Similar Documents

Publication Publication Date Title
WO2018103689A1 (zh) 无人机相对方位控制方法及装置
US9977434B2 (en) Automatic tracking mode for controlling an unmanned aerial vehicle
CN110494360B (zh) 用于提供自主摄影及摄像的***和方法
US11460844B2 (en) Unmanned aerial image capture platform
CN110692027B (zh) 用于提供无人机应用的易用的释放和自动定位的***和方法
US11604479B2 (en) Methods and system for vision-based landing
US20210356956A1 (en) Systems and methods for controlling an unmanned aerial vehicle
US20230195102A1 (en) Systems and methods for adjusting flight control of an unmanned aerial vehicle
US11531336B2 (en) Systems and methods for automatically customizing operation of a robotic vehicle
US11531340B2 (en) Flying body, living body detection system, living body detection method, program and recording medium
JPWO2017170148A1 (ja) 飛行装置、電子機器およびプログラム
TW201706970A (zh) 無人飛機導航系統及方法
US10557718B2 (en) Auxiliary control method and system for unmanned aerial vehicle
KR102486768B1 (ko) 탐지 상황에 따라 자동으로 이동 경로를 설정하는 무인 항공기, 및 운용 방법
US11354897B2 (en) Output control apparatus for estimating recognition level for a plurality of taget objects, display control system, and output control method for operating output control apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17879044

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17879044

Country of ref document: EP

Kind code of ref document: A1