WO2024092586A1 - Control method and device for unmanned aerial vehicle, and storage medium - Google Patents

Control method and device for unmanned aerial vehicle, and storage medium

Info

Publication number
WO2024092586A1
WO2024092586A1 PCT/CN2022/129378 CN2022129378W WO2024092586A1 WO 2024092586 A1 WO2024092586 A1 WO 2024092586A1 CN 2022129378 W CN2022129378 W CN 2022129378W WO 2024092586 A1 WO2024092586 A1 WO 2024092586A1
Authority
WO
WIPO (PCT)
Prior art keywords
drone
posture
preset
target object
controlling
Prior art date
Application number
PCT/CN2022/129378
Other languages
English (en)
French (fr)
Inventor
李博文
邬奇峰
袁一然
Original Assignee
深圳市大疆创新科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2022/129378
Publication of WO2024092586A1

Definitions

  • the present application relates to the field of control, and in particular to a method, device and storage medium for controlling an unmanned aerial vehicle.
  • Drones can be used in aerial photography, inspection, forest protection, disaster investigation, pesticide spraying and other scenarios, which has led to their widespread use.
  • To meet the needs of such scenarios, the waypoint flight function has emerged.
  • In the waypoint flight mode, the drone can fly along a preset route and perform corresponding shooting actions or flight operations at preset waypoints or preset points of interest on the preset route.
  • The embodiments of the present application provide a control method, device and storage medium for a drone, aiming to solve the problem that the drone's degree of freedom in shooting points of interest in the waypoint flight mode is low.
  • In a first aspect, an embodiment of the present application provides a method for controlling a drone, comprising: obtaining a first instruction when the drone flies along a preset route; wherein the first instruction is generated according to a first operation, input by a user when the drone flies along the preset route, indicating a target object;
  • in response to the first instruction, the drone is controlled to continue flying along the preset route, while the posture of the drone and/or the posture of the shooting device carried by the drone is controlled to continuously shoot the target object while the drone is flying along the preset route; wherein the preset route includes at least two preset waypoints.
  • In a second aspect, an embodiment of the present application provides a method for controlling a drone, including:
  • when the drone flies along a preset route, receiving a first operation, input by the user, indicating a target object; wherein the first operation is used to generate a first instruction, and the first instruction is used to control the drone to continue flying along the preset route while controlling the posture of the drone and/or the posture of a shooting device carried by the drone, so as to continuously shoot the target object while the drone is flying along the preset route;
  • the preset route includes at least two preset waypoints.
  • In a third aspect, an embodiment of the present application provides a method for controlling a drone, including:
  • when the drone flies along a preset route, controlling the drone to shoot a preset target object to generate a first image; during the process of controlling the drone to shoot the target object, the posture of the drone or the posture of the photographing device carried by the drone is automatically controlled so that the target object remains at a preset position in a first direction of the first image;
  • during the process of controlling the drone to continuously shoot the target object, the posture of the drone or the posture of the photographing device carried on the drone is controlled according to a second operation input by the user to adjust the position of the target object in a second direction of the first image; wherein the preset route includes at least two preset waypoints.
  • In a fourth aspect, an embodiment of the present application provides a method for controlling a drone, including:
  • when the drone flies along a preset route, obtaining a first image generated by the drone photographing a preset target object, and receiving a second operation input by the user on the first image; wherein the second operation is used to generate a second instruction, and the second instruction is used to:
  • during the process of controlling the drone to continuously photograph the target object, automatically control the posture of the drone or the posture of the photographing device carried on the drone so that the target object remains at a preset position in a first direction of the first image, and, according to the second operation, control the posture of the drone or the posture of the photographing device carried on the drone to adjust the position of the target object in a second direction of the first image; wherein the preset route includes at least two preset waypoints.
  • In a fifth aspect, an embodiment of the present application provides a control device for a drone, the control device comprising a memory and a processor;
  • the memory is used to store a computer program;
  • the processor is used to execute the computer program and implement the method steps described in the first aspect or the third aspect when executing the computer program.
  • In a sixth aspect, an embodiment of the present application provides a control device for a drone, the control device comprising a memory and a processor;
  • the memory is used to store a computer program;
  • the processor is used to execute the computer program and implement the method steps described in the second aspect or the fourth aspect when executing the computer program.
  • an embodiment of the present application provides a drone, comprising a control device for the drone as described in the fifth aspect.
  • an embodiment of the present application provides a control terminal, comprising the control device of the drone as described in the sixth aspect.
  • an embodiment of the present application provides a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, the processor implements the method steps described in the first aspect or the third aspect.
  • an embodiment of the present application provides a computer-readable storage medium, which stores a computer program.
  • When the computer program is executed by a processor, the processor implements the method steps described in the second aspect or the fourth aspect.
  • a user can select a target object in real time through a first operation to generate a first instruction while the drone is flying along a preset route.
  • According to the first instruction, the drone can continue flying along the preset route while controlling its own posture and/or the posture of a shooting device on the drone, so as to continuously shoot the target object while the drone is flying along the preset route. This improves the shooting freedom in the waypoint flight mode.
  • FIG. 1 is a schematic diagram of an application scenario of a control system provided in an embodiment of the present application.
  • FIG. 2 is a schematic diagram of the structure of a photographing device provided in an embodiment of the present application.
  • FIG. 3 is a first flowchart of the steps of the control method provided in an embodiment of the present application.
  • FIG. 4 is a first schematic diagram of waypoint flight of a UAV provided in an embodiment of the present application.
  • FIG. 5 is a second schematic diagram of waypoint flight of a UAV provided in an embodiment of the present application.
  • FIG. 6 is a third schematic diagram of waypoint flight of a UAV provided in an embodiment of the present application.
  • FIG. 7 is a fourth schematic diagram of waypoint flight of a UAV provided in an embodiment of the present application.
  • FIG. 8 is a first schematic diagram of a shooting picture of a drone provided in an embodiment of the present application.
  • FIG. 9 is a second schematic diagram of a shooting picture of a drone provided in an embodiment of the present application.
  • FIG. 10 is a second flowchart of the steps of the control method provided in an embodiment of the present application.
  • FIG. 11 is a first interactive scene diagram of the control terminal provided in an embodiment of the present application.
  • FIG. 12 is a second interactive scene diagram of the control terminal provided in an embodiment of the present application.
  • FIG. 13 is a third flowchart of the steps of the control method provided in an embodiment of the present application.
  • FIG. 14 is a fourth flowchart of the steps of the control method provided in an embodiment of the present application.
  • FIG. 15 is a schematic diagram of the structure of a control device provided in an embodiment of the present application.
  • Drones can be used in aerial photography, inspection, forest protection, disaster investigation, pesticide spraying and other scenarios, which has led to their widespread use.
  • the waypoint flight function has emerged.
  • In the waypoint flight mode, the drone can fly along a preset route and perform corresponding shooting actions or flight operations at preset waypoints or preset points of interest on the preset route.
  • the embodiments of the present application provide a control method, device and storage medium for a drone, aiming to solve the problem that the drone has low shooting freedom in waypoint flight mode.
  • The control method provided in the embodiments of the present application is applied to unmanned aerial vehicles, also known as drones.
  • The drone may be a rotor-type drone, such as a quad-rotor, hexa-rotor, or octo-rotor drone, a fixed-wing drone, or a combination of a rotor-type drone and a fixed-wing drone.
  • the drone can respond to control commands sent by at least two control terminals (such as a remote controller and a head-mounted display device) to control the drone to perform operations such as fuselage deflection and gimbal rotation.
  • a control system includes a drone 100 and a control terminal 200, and the drone 100 can be connected to the control terminal 200 for communication.
  • the control terminal 200 can be used to control the drone 100.
  • The control terminal 200 may include at least one of a remote controller, a smart phone, a tablet computer, and a wearable device; the wearable device includes a head-mounted display device, and the head-mounted display device may include a virtual reality (VR) display device or a first-person view (FPV) display device.
  • the drone 100 includes a fuselage 110, a power system 120, an imaging device 130, and a control device (not shown in FIG. 1 ).
  • the fuselage 110 may include a nose.
  • the drone 100 further includes an arm, wherein the arm is connected to the fuselage 110, and the arm is used to install the power system.
  • the power system 120 may be directly installed on the fuselage 110.
  • the power system 120 is used to provide flight power for the UAV, and the power system 120 may include a motor and a propeller mounted on the motor and driven by the motor.
  • the power system 120 can drive the fuselage 110 of the UAV 100 to rotate around one or more rotation axes.
  • the above-mentioned rotation axes may include a roll axis, a yaw axis, and a pitch axis.
  • the motor can be a DC motor or an AC motor.
  • the motor can be a brushless motor or a brushed motor.
  • the camera 130 is directly carried on the fuselage 110 or carried on the fuselage 110 through the gimbal, and is used to shoot images, which may be pictures and/or videos.
  • the drone may include a gimbal 140, on which the camera 130 is mounted, and the gimbal 140 is connected to the fuselage 110.
  • the gimbal 140 can control the yaw rotation of the camera 130 to adjust the yaw orientation of the camera 130.
  • the gimbal 140 may include a yaw motor 141, which is used to control the yaw rotation of the camera 130.
  • the gimbal 140 can control the pitch rotation of the camera 130 to adjust the pitch orientation of the camera 130.
  • the gimbal 140 may include a pitch motor, which is used to control the pitch rotation of the camera 130.
  • the gimbal 140 can control the roll rotation of the camera 130 to adjust the roll orientation of the camera 130.
  • the gimbal 140 may include a roll motor, which is used to control the roll rotation of the camera 130.
  • the yaw rotation of the camera 130 and the yaw rotation of the body 110 may be associated. Further, the camera 130 may yaw rotate following the yaw rotation of the body 110, or the body 110 may yaw rotate following the yaw rotation of the camera 130.
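As an illustration of the yaw-follow association described above, the following minimal Python sketch shows a camera attitude following the body yaw (or vice versa); the class names, fields, and follow modes are assumptions made for this sketch and are not taken from the publication.

```python
from dataclasses import dataclass


@dataclass
class Attitude:
    yaw: float = 0.0    # degrees, rotation about the yaw axis
    pitch: float = 0.0  # degrees, rotation about the pitch axis
    roll: float = 0.0   # degrees, rotation about the roll axis


class GimbalFollowDemo:
    """Illustrative only: the camera yaw may follow the body yaw, or vice versa."""

    def __init__(self, mode: str = "camera_follows_body"):
        self.mode = mode
        self.body = Attitude()
        self.camera = Attitude()

    def set_body_yaw(self, yaw_deg: float) -> None:
        self.body.yaw = yaw_deg
        if self.mode == "camera_follows_body":
            # The camera yaw-rotates following the yaw rotation of the body.
            self.camera.yaw = yaw_deg

    def set_camera_yaw(self, yaw_deg: float) -> None:
        self.camera.yaw = yaw_deg
        if self.mode == "body_follows_camera":
            # The body yaw-rotates following the yaw rotation of the camera.
            self.body.yaw = yaw_deg


if __name__ == "__main__":
    g = GimbalFollowDemo()
    g.set_body_yaw(30.0)
    print(g.body.yaw, g.camera.yaw)  # 30.0 30.0
```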
  • the control terminal may include an input device, wherein the input device may detect a control operation of a user of the control terminal, and the control terminal may generate a control instruction for the drone according to the control operation of the user detected by the input device.
  • the control terminal may generate a yaw control instruction according to a nose yaw control operation of the user of the control terminal detected by the input device, and the control terminal may send the nose yaw control instruction to the drone;
  • the control terminal may generate a pitch control instruction according to a pan/tilt pitch control operation of the user of the control terminal detected by the input device, and the control terminal may send the pan/tilt pitch control instruction to the drone.
  • the control terminal 200 includes a remote controller, which is provided with an input device 210 and a communication device 220.
  • the communication device 220 is a wireless communication device, which may include at least one of a high-frequency radio transceiver, a WIFI module, and a Bluetooth module.
  • the input device 210 is used to generate corresponding control instructions in response to the user's manipulation, so that the remote controller can control the drone to adjust the flight attitude and/or flight speed through the control instruction.
  • the input device 210 includes at least one of a button, a joystick, a dial, and a touch screen.
  • the input device 210 is a joystick, which is installed on the body of the remote controller.
  • the remote controller senses the user's yaw control manipulation of the joystick to generate a corresponding control instruction, and sends the first yaw control instruction to the drone 100 through the communication device 220.
  • the control terminal 200 can receive images transmitted by the drone 100 and display them on the display device 230.
  • the display device 230 can be integrated into the control terminal 200, or the display device 230 can be separately provided from the control terminal 200 and communicatively connected to the control terminal 200.
  • the communication connection method can be a wired communication connection method or a wireless communication connection method.
  • the wireless communication connection method can be a WiFi connection, a Bluetooth connection, or a high-frequency wireless signal connection.
  • the user may generate control instructions by using buttons, joysticks, and dials, or by inputting operations on a touch display screen, which is not limited here.
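A minimal sketch of how a remote controller of this kind might map a joystick deflection to a control instruction and hand it to the communication device; the message fields, the normalization to [-1, 1], and the `send` callback are illustrative assumptions.

```python
import json
from typing import Callable


def make_control_instruction(stick_x: float, stick_y: float,
                             max_yaw_rate: float = 90.0,
                             max_pitch_rate: float = 45.0) -> dict:
    """Map normalized stick deflections in [-1, 1] to yaw/pitch rate commands."""
    stick_x = max(-1.0, min(1.0, stick_x))
    stick_y = max(-1.0, min(1.0, stick_y))
    return {
        "type": "attitude_rate",
        "yaw_rate_dps": stick_x * max_yaw_rate,             # nose yaw control
        "gimbal_pitch_rate_dps": stick_y * max_pitch_rate,  # gimbal pitch control
    }


def send_instruction(instruction: dict, send: Callable[[bytes], None]) -> None:
    """Serialize the instruction and hand it to the communication device (e.g. WiFi/radio)."""
    send(json.dumps(instruction).encode("utf-8"))


if __name__ == "__main__":
    send_instruction(make_control_instruction(0.5, -0.2), send=lambda b: print(b))
```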
  • FIG. 3 is a flowchart of a control method provided in an embodiment of the present application.
  • the control method can be applied to the above-mentioned drone 100, or to a system composed of the above-mentioned drone 100 and the control terminal 200.
  • the drone 100 can be connected to the control terminal 200 in communication, so as to control the drone 100 to perform corresponding operations according to the control instructions sent by the control terminal 200.
  • the method includes step S101 and step S102:
  • S101: When the drone flies along a preset route, a first instruction is obtained; wherein the first instruction is generated according to a first operation, input by the user when the drone flies along the preset route, indicating a target object.
  • S102: In response to the first instruction, the drone is controlled to continue to fly along the preset route, while the attitude of the drone and/or the attitude of the shooting device on the drone is controlled to continuously shoot the target object while the drone is flying along the preset route.
  • the preset route includes at least two preset waypoints.
  • the preset route may be a route pre-set by the user, including at least two preset waypoints.
  • the preset waypoints may be manually set by the user. Specifically, in some embodiments, the preset waypoints may be determined by the user by clicking on a map displayed by the control terminal; in some embodiments, the preset waypoints may be determined by the user based on image recognition after selecting a frame on a shooting interface displayed by the control terminal; in some embodiments, the user may also control the drone to fly to a specified geographical location, and determine the specified geographical location as a preset waypoint by operating the control terminal.
  • the user can select the order of multiple preset waypoints so that the drone passes through the multiple preset waypoints in order when executing the preset route.
  • the drone can also choose not to pass through the preset waypoints, but to fly around the preset waypoints within a specified radius.
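The preset route and its waypoints, including the optional fly-around radius mentioned above, could be represented roughly as in the following sketch; the field names and units are assumptions for illustration only.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Waypoint:
    lat: float
    lon: float
    alt_m: float
    # If set, the drone does not pass through the waypoint itself but flies
    # around it within this radius (metres).
    fly_around_radius_m: Optional[float] = None


@dataclass
class PresetRoute:
    waypoints: List[Waypoint] = field(default_factory=list)

    def validate(self) -> None:
        # A preset route includes at least two preset waypoints.
        if len(self.waypoints) < 2:
            raise ValueError("a preset route needs at least two preset waypoints")


route = PresetRoute([
    Waypoint(22.54, 113.95, 120.0),                            # waypoint A
    Waypoint(22.55, 113.96, 120.0, fly_around_radius_m=30.0),  # waypoint B
])
route.validate()
```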
  • the first instruction may be generated by the control terminal and sent to the drone. That is, the first operation input by the user to indicate the target object may be an operation input by the user to the control terminal.
  • the above step of obtaining the first instruction may be a step of receiving the first instruction when the drone flies along a preset route.
  • the first instruction may be generated by the drone itself, and the above step of obtaining the first instruction may also be a step of generating the first instruction by the drone.
  • the target object may be a geographical location or a geographical location range, or a photographic object appearing in a photographing screen of a photographing device.
  • the user may control the drone to continuously photograph the target object while flying along a preset route by indicating a first operation of the target object.
  • the drone may perform a corresponding operation according to the first instruction.
  • the attitude of the drone and/or the attitude of the camera mounted on the drone may be controlled without interrupting the drone's flight along the preset route, so that the drone can continue to photograph the target object while flying along the preset route.
  • the drone can obtain a first instruction, and in response to the first instruction, control the drone to continue flying along the preset route while controlling the drone's posture and/or the posture of a shooting device carried on the drone.
  • the user can control the drone's posture and/or the posture of the shooting device carried on the drone in real time by using the first operation indicating the target object to continuously shoot the target object.
  • the attitude of the drone may include the orientation of the nose, such as the yaw rotation angle of the nose.
  • the attitude of the camera may also include the attitude of the camera along the yaw axis, pitch axis or roll axis, which is not further limited here.
  • Continuously photographing the target object can be understood as the drone controlling its own posture and the posture of the photographing device in at least part of the preset route so that the target object can be photographed by the photographing device.
  • the target object can be located at a relatively fixed position in the field of view of the photographing device.
  • the user can select a target object in real time through a first operation to generate a first instruction while the drone is flying along a preset route.
  • the drone can control the drone to continue flying along the preset route according to the first instruction, while controlling the attitude of the drone and/or the attitude of the shooting device on the drone, so as to continuously shoot the target object while the drone is flying along the preset route, which improves the shooting freedom in the waypoint flight mode.
  • controlling the posture of the drone and/or the posture of a photographing device carried on the drone can be achieved by controlling the posture of the drone and/or the photographing device along the pitch axis and/or the yaw axis.
  • the above step 102 may specifically include at least one of the following:
  • the posture of the shooting device carried by the UAV along the yaw axis is controlled so as to continuously shoot the target object while the UAV is flying along the preset route.
  • the nose attitude may include the attitude of the nose along the yaw axis and the attitude of the nose along the pitch axis.
  • the yaw attitude of the drone's nose can be associated with the yaw attitude of the shooting device. Furthermore, when the shooting device is carried by a gimbal, the yaw rotation of the gimbal can follow the yaw rotation of the nose.
  • the drone can change the shooting angle of the shooting device by controlling the rotation of the nose along the yaw axis and the rotation of the gimbal along the pitch axis. Therefore, in some embodiments, the drone can continuously shoot the target object by controlling the direction of the nose along the yaw axis and the attitude of the shooting device along the pitch axis. In other optional embodiments, the drone can also simultaneously control the attitude of the gimbal along the yaw axis and the pitch axis to continuously shoot the target object, which will not be listed one by one here.
  • The maximum speed or maximum angular velocity of the rotation of the nose and/or the shooting device can also be controlled.
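One possible way to steer the nose along the yaw axis and the gimbal along the pitch axis toward a target while respecting a maximum angular velocity is sketched below; the local east-north-up coordinate convention, gains, and function names are assumptions, not the control law of the publication.

```python
import math


def aim_angles(drone_pos, target_pos):
    """Yaw/pitch (degrees) that point the nose/gimbal at the target.

    Positions are (east, north, up) metres in a local frame (an assumption
    made for illustration).
    """
    de = target_pos[0] - drone_pos[0]
    dn = target_pos[1] - drone_pos[1]
    du = target_pos[2] - drone_pos[2]
    yaw = math.degrees(math.atan2(de, dn))            # 0 deg = north, clockwise positive
    pitch = math.degrees(math.atan2(du, math.hypot(de, dn)))
    return yaw, pitch


def rate_limited_step(current_deg, desired_deg, max_rate_dps, dt_s):
    """Move toward the desired angle without exceeding the maximum angular velocity."""
    error = (desired_deg - current_deg + 180.0) % 360.0 - 180.0  # shortest-path error
    max_step = max_rate_dps * dt_s
    return current_deg + max(-max_step, min(max_step, error))


# Example: one 0.1 s control tick while the drone keeps flying the route.
yaw_cmd, pitch_cmd = aim_angles((0.0, 0.0, 100.0), (50.0, 80.0, 20.0))
nose_yaw = rate_limited_step(0.0, yaw_cmd, max_rate_dps=60.0, dt_s=0.1)
gimbal_pitch = rate_limited_step(0.0, pitch_cmd, max_rate_dps=30.0, dt_s=0.1)
print(nose_yaw, gimbal_pitch)
```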
  • the drone and/or the control terminal can recognize the image of the target object and determine the type of the target object, so that a corresponding control strategy can be determined according to the type of the target object.
  • the above step 102 may specifically include:
  • When the target object is of a preset type, the drone is controlled to change the drone's posture and/or the posture of the shooting device according to the status information of the target object collected in real time, so that when the target object is in motion, the drone and/or the shooting device carried by the drone can follow the target object for continuous shooting.
  • the shooting device carried by the drone can obtain an image of the target object, and determine whether the target object belongs to the preset type by extracting image features and comparing them with preset type features. In the case where the target object belongs to the preset type, it means that the target object is likely to be a dynamic target.
  • the drone can control the posture of the drone and/or the posture of the shooting device carried by the drone based on the state information of the target object obtained in real time, so as to follow the target object for continuous shooting.
  • the state information of the target object may include the position information and motion state information of the target object.
  • the state information of the target object may be at least partially acquired by a camera mounted on a drone.
  • the motion state information of the target object may be acquired based on the position information of the drone and the position information of the target object at two time points.
  • the drone may capture images of the target object at two time points and calculate the motion speed of the target object by triangulation or other methods.
  • the drone may control the rotation of the drone's nose and/or camera according to the motion state information and the position information of the target object while continuing to fly along a preset route.
  • the UAV can also continue to fly along the route and, based on the status information of the target object collected in real time, control the posture of the UAV and/or the posture of the shooting device to follow the target object for continuous shooting, thereby achieving dynamic tracking of the target object, further improving the shooting freedom in the waypoint flight scenario.
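The real-time status information discussed above may, as noted, be derived from the target's positions at two time points; a minimal sketch of a finite-difference velocity estimate and a constant-velocity prediction follows, with all names and the constant-velocity assumption being illustrative.

```python
def estimate_velocity(p1, t1, p2, t2):
    """Finite-difference velocity (m/s) from two timestamped positions (east, north, up)."""
    dt = t2 - t1
    if dt <= 0.0:
        raise ValueError("t2 must be later than t1")
    return tuple((b - a) / dt for a, b in zip(p1, p2))


def predict_position(p, v, dt):
    """Constant-velocity prediction of where the dynamic target will be after dt seconds."""
    return tuple(pi + vi * dt for pi, vi in zip(p, v))


p1, t1 = (10.0, 0.0, 0.0), 0.0
p2, t2 = (12.0, 1.0, 0.0), 1.0
v = estimate_velocity(p1, t1, p2, t2)        # (2.0, 1.0, 0.0) m/s
aim_point = predict_position(p2, v, dt=0.5)  # used to keep the moving target in frame
print(v, aim_point)
```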
  • the preset route in FIG. 4 includes preset waypoints A, B, C, and D.
  • When the drone flies along the preset route, it needs to pass through the preset waypoints A, B, C, and D in sequence.
  • In the scenario shown in FIG. 4, the target object is a dynamic target.
  • the target object will move as the drone flies along the preset route.
  • the drone can control the drone's nose and/or shooting device to turn from facing point X to facing point Y, that is, to follow the movement of the target object for continuous shooting.
  • the above step 102 may specifically include:
  • When the target object is not of a preset type, the drone is controlled to change the posture of the drone and/or the posture of the shooting device according to the initial state information of the target object, so as to continuously shoot the target object while the drone flies along the preset route.
  • the initial state information may be information obtained by the drone when responding to the first instruction, or may be information obtained by the drone at a certain moment after responding to the first instruction.
  • the initial state information may include the position information of the target object obtained by the drone when responding to the first instruction. Since the position information of the target object will not change when it is a static target, the drone can change the posture of the drone and/or the posture of the shooting device carried by the drone according to the initial position information of the target object while continuing to fly along the preset route, so as to continuously shoot the target object.
  • the drone does not need to obtain the status information of the target object in real time, but only needs to obtain the initial status information of the target object, that is, the drone can continuously shoot the target object while flying along the preset route, reducing resource usage.
  • the target object in FIG. 5 is a static target, and the target object is always located at point X.
  • the drone can continuously shoot the target object by controlling the nose of the drone and/or the shooting device carried by the drone toward point X while flying along the preset route.
  • By judging whether the target object belongs to a preset type, it can be determined whether the target object is likely to be a dynamic target, and different control strategies are adopted for static targets and dynamic targets, so that the drone can reduce resource usage when continuously shooting a static target while still being able to track and lock a dynamic target in real time.
  • The preset type can be a type of dynamic target that users shoot in most shooting scenes.
  • the preset type can include cars, ships, and people, so as to improve the accuracy of judgment while covering most shooting scenes where dynamic targets are tracked and locked in real time.
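A minimal sketch of the strategy split described above, assuming an image recognizer that returns a class label; the label set mirrors the preset types mentioned (cars, ships, people), and the strategy names are illustrative assumptions.

```python
PRESET_DYNAMIC_TYPES = {"car", "ship", "person"}


def choose_tracking_strategy(target_class: str) -> str:
    """Dynamic targets are followed using real-time state; static targets use
    only the initial state information, which reduces resource usage."""
    if target_class in PRESET_DYNAMIC_TYPES:
        return "follow_with_realtime_state"
    return "aim_at_initial_position"


assert choose_tracking_strategy("car") == "follow_with_realtime_state"
assert choose_tracking_strategy("building") == "aim_at_initial_position"
```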
  • the drone can control the posture of the drone and/or the posture of the camera mounted on the drone while continuing to fly along the preset route, so as to continuously photograph the target object before the flight of the preset route ends.
  • the drone can also control the posture of the drone and/or the posture of the camera mounted on the drone while continuing to fly along the preset route, so as to continuously photograph the target object in a section of the route.
  • the above step 102 may specifically include:
  • Before the first preset condition is met, the posture of the drone and/or a shooting device carried by the drone is controlled to continuously shoot the target object while the drone is flying along the preset route.
  • the first preset condition includes the drone reaching a target preset waypoint among the preset waypoints, and the target preset waypoint includes one of the preset waypoints located after the position of the drone on the preset route.
  • the target preset waypoint can be one of the preset waypoints on the preset route that is located after the current position of the drone.
  • the current position of the drone can be the position of the drone when the drone executes the corresponding control operation in response to the first instruction.
  • the first preset condition may further include:
  • At least part of the target object is out of the field of view of the shooting device.
  • the target object can be a static target or a dynamic target.
  • the target object may be out of the field of view (FOV) of the camera due to its own movement, resulting in the drone being unable to continue tracking and locking the target object through the camera.
  • the drone can control the posture of the drone and/or the posture of the shooting device on the drone before at least part of the target object leaves the field of view of the shooting device, so as to continuously shoot the target object.
  • the situation where at least part of the target object is out of the field of view of the camera can be determined according to the specific application scenario. For example, if the target object is a person, the target object can be determined by head-shoulder recognition and other methods. If the head or shoulder of the target object is out of the field of view of the camera, the target object cannot be identified and tracked.
  • In order to reduce the cases in which continuous shooting is stopped, the drone may stop continuously shooting the target object only after at least part of the target object has been out of the field of view of the shooting device for a duration greater than or equal to a preset time.
  • the first preset condition may also include: the time for continuously photographing the target object is greater than or equal to the photographing time preset by the user; the user inputs an operation to stop continuously photographing the target object, etc., which are not listed here one by one.
  • When the first preset condition is met, the drone may no longer continuously photograph the target object.
  • In some embodiments, the control method may further include:
  • after the continuous shooting of the target object ends, controlling the posture of the drone and/or the posture of the shooting device carried by the drone according to the original preset parameters of the preset route.
  • the original preset parameters of the preset route may be the default control parameters of the preset route, or may be the control parameters preset by the user.
  • In some embodiments, the drone stops continuously shooting the target object only after all of the conditions in the first preset condition are met; in other optional embodiments, the drone may stop continuously shooting the target object when any one of the conditions in the first preset condition, or one or more specific conditions, is met, and the specific settings can be made according to actual needs.
  • the user can pre-set the shooting strategy of the drone after the continuous shooting ends.
  • For example, the attitude of the drone and/or the attitude of the shooting device can be adjusted according to the current flight direction along the preset route, so that the drone's nose and the shooting device face the flight direction; or the drone can adjust the attitude of the drone and the attitude of the shooting device according to the nose attitude angle and shooting-device orientation angle preset by the user.
  • the specific settings can be made according to actual needs and are not further limited here.
  • users can freely control the start and end conditions of continuous shooting of the target object, and can choose the shooting strategy after the continuous shooting ends, which further enhances the freedom of shooting during waypoint flight and improves the user experience.
  • the drone can respond to the first instruction when flying along the preset route to point E, and control the attitude of the drone and/or the attitude of the camera to continuously shoot the target object. Assuming that the target preset waypoint is point C, when the drone reaches point C, it is equivalent to satisfying the first preset condition. At this time, the drone can adjust the nose attitude and the direction of the camera to be consistent with the flight direction, and fly the route from point C to point D.
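The conditions under which continuous shooting of the target object ends could be evaluated as in the following sketch; whether all conditions or any single condition must hold is left configurable, mirroring the alternatives described above, and the field names, thresholds, and defaults are assumptions.

```python
from dataclasses import dataclass


@dataclass
class ShootingStatus:
    reached_target_waypoint: bool
    seconds_target_out_of_fov: float
    shooting_duration_s: float
    user_requested_stop: bool


def first_preset_condition_met(s: ShootingStatus,
                               out_of_fov_grace_s: float = 2.0,
                               max_shooting_s: float = 60.0,
                               require_all: bool = False) -> bool:
    """Return True when continuous shooting of the target object should end."""
    conditions = [
        s.reached_target_waypoint,
        s.seconds_target_out_of_fov >= out_of_fov_grace_s,
        s.shooting_duration_s >= max_shooting_s,
        s.user_requested_stop,
    ]
    return all(conditions) if require_all else any(conditions)


status = ShootingStatus(True, 0.0, 12.0, False)
print(first_preset_condition_met(status))  # True: the target preset waypoint was reached
```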
  • the user can pre-set a target point of interest so that the drone can continuously shoot the target point of interest while flying along the preset route. Furthermore, the user can set the association between the target point of interest and the preset waypoint so that the drone can continuously shoot the target point of interest when it reaches the preset waypoint.
  • the user can also select the nose attitude angle, gimbal pitch angle, etc. corresponding to the preset waypoint during route editing or route setting, so that when the drone flies along the preset route and reaches the preset waypoint, the attitude of the nose and camera device can be controlled according to the nose attitude angle and gimbal pitch angle.
  • Optionally, the drone can continuously shoot the target object while it is flying along a preset route and while its own posture and the posture of the shooting device can be manually controlled by the user.
  • the above step 102 may specifically include:
  • when the second preset condition is met, the posture of the drone and/or the posture of the shooting device on the drone is controlled to continuously shoot the target object while the drone flies along the preset route;
  • wherein the second preset condition includes that the posture of the drone and/or the posture of the shooting device can be manually controlled by the user.
  • the second preset condition may include at least one of the following:
  • the control mode of the posture of the drone and/or the posture of the shooting device carried by the drone is a manual control mode.
  • the preset route does not have a preset target point of interest.
  • wherein the target point of interest is used to make the drone, during its flight along the preset route, control the posture of the drone and/or the posture of the shooting device on the drone so as to continuously shoot the target point of interest.
  • the control mode of the attitude of the drone and/or the attitude of the camera mounted on the drone can be preset by the user.
  • the user can preset the attitude of the drone and/or the attitude of the camera mounted on the drone to be manually controlled, so that when the drone flies along a preset route, the user can control the attitude of the drone or the attitude of the camera by inputting operations on the joystick of the control terminal.
  • Target points of interest can be set by the user during the process of route setting or editing. They can be associated with all preset waypoints of a preset route, or with some preset waypoints in a preset route. Target points of interest can be static targets, represented by location information. In some cases, target objects can be regarded as "target points of interest" set in real time while the drone is flying along a preset route.
  • The drone can continuously shoot the target object if the second preset condition is met. In other words, while the drone is flying along the preset route, continuous shooting of the target object, continuous shooting of the target point of interest, and shooting based on the shooting strategy preset by the user are parallel options that do not conflict with one another.
  • the user can pre-set the target points of interest and shooting strategies, and when the target points of interest and shooting strategies do not exist during the flight of the drone along the preset route, the user can control the drone to continuously shoot the target object through the first operation, thereby avoiding conflicts between multiple shooting strategies and improving the user experience.
  • the user sets the drone's nose posture and camera orientation to follow the flight direction of the route from A to B, sets the drone's nose posture and camera orientation to manual control mode from B to C, and sets the target point of interest Y from C to D.
  • When the drone flies along the preset route, the drone's posture and camera orientation are controlled to follow the route direction from A to B, and from B to E they are manually controlled by the user. If the user does not perform manual control, the drone's posture and camera orientation at point B can be maintained or controlled along the route direction by default.
  • the drone responds to the first instruction and controls the drone's posture and camera orientation to continuously shoot point X.
  • From C to D, the drone's posture and camera orientation are controlled to continuously shoot point Y.
  • In some embodiments, the drone needs to meet all the conditions of the second preset condition before performing continuous shooting of the target object; in other optional embodiments, the drone can perform continuous shooting of the target object when any one of the conditions in the second preset condition, or one or more specific conditions, is met.
  • the specific setting can be made according to actual needs.
  • The drone's continuous photography of the target object while flying along a preset route may also override a previously set target point of interest and other user settings.
  • the above step 102 may specifically include:
  • the posture of the drone and/or the posture of the shooting device carried by the drone is controlled to adjust the drone from continuously shooting the target point of interest to shooting the target object.
  • In this way, while the drone is flying along the preset route, the user can make the target object override the preset target point of interest through the first operation, and control the drone to switch from continuously shooting the target point of interest to shooting the target object, which further improves the user's shooting freedom while the drone is flying along the preset route.
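A minimal sketch of deciding whether a first instruction received in real time takes effect, combining the second preset condition with the optional override of a preset target point of interest; the flag names and the decision structure are assumptions for illustration.

```python
def accept_first_instruction(manual_control_mode: bool,
                             has_target_point_of_interest: bool,
                             allow_override_point_of_interest: bool) -> bool:
    """Decide whether the target object indicated in real time becomes the shooting target."""
    # Second preset condition: posture is user-controllable and no preset
    # target point of interest is active on this route segment.
    if manual_control_mode and not has_target_point_of_interest:
        return True
    # Optional behaviour: the target object may override a preset point of interest.
    if has_target_point_of_interest and allow_override_point_of_interest:
        return True
    return False


print(accept_first_instruction(True, False, False))  # True
print(accept_first_instruction(False, True, True))   # True (override)
print(accept_first_instruction(False, True, False))  # False
```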
  • the user can adjust the shooting picture through manual control while the drone is continuously shooting the target object.
  • control method may further include:
  • the drone or a photographing device carried by the drone is automatically controlled so that the target object remains at a preset position in a first direction of a first image acquired by the photographing device;
  • the drone or a photographing device carried by the drone is controlled according to a second operation input by the user to adjust the position of the target object in the second direction of the first image.
  • the first image may be acquired by a camera of the drone, and may include multiple frames of images within a preset time period.
  • The target object remaining at a preset position in a first direction of the first image may be understood as meaning that the position of the target object in the first direction of the first image remains substantially unchanged.
  • the user can also control the posture of the drone or the posture of the camera mounted on the drone through a second operation to adjust the position of the target object in the second direction.
  • the second operation is similar to the first operation and can be an operation input by the user to the control terminal, such as an input operation to a joystick, a dial or a sliding button.
  • the second operation includes an operation input by a user on a control stick, and according to the second operation input by the user, controlling the drone or a shooting device carried by the drone includes:
  • the posture of the drone or the posture of a camera mounted on the drone is controlled according to the direction and amount of the operation input to the joystick.
  • Figures 8 and 9 may be first images of two time points when the drone is flying along a preset route.
  • the first direction is the X-axis direction in the figure, that is, the horizontal direction
  • the second direction is the Y-axis direction in the figure, that is, the vertical direction
  • the target object is the high-rise building shown in the figure.
  • the posture of the drone or the posture of the shooting device carried on the drone can be automatically controlled so that the position of the target object in the horizontal direction of Figures 8 and 9 remains approximately unchanged.
  • the user can control the posture of the drone or the posture of the shooting device carried on the drone by inputting operations on the control joystick of the control terminal to change the position of the target object in the vertical direction of the first image, as shown in Figures 8 and 9.
  • When the drone is flying along a preset route and continuously photographing the target object, the user can have the drone automatically control the target object to maintain the preset position in the first direction of the first image, and manually adjust the position of the target object in the second direction of the first image, which gives the user greater freedom in camera movement during the continuous photographing of the target object, thereby further enhancing the user experience.
  • the first direction and the second direction may be specifically set according to the actual application scenario.
  • the control strategy may be determined according to the settings of the first direction and the second direction.
  • the first direction and the second direction may be two directions perpendicular to each other, and the first direction and the second direction may be associated with the yaw axis direction and the pitch axis direction of the drone and/or the camera.
  • For example, the first direction may be a horizontal direction of the first image.
  • the above-mentioned step of automatically controlling the posture of the drone or the posture of the shooting device carried on the drone may specifically include: automatically controlling the posture of the drone or the posture of the shooting device carried on the drone along the yaw axis so that the target object remains at a preset position in the first direction of the first image acquired by the shooting device.
  • the second direction may be a vertical direction of the first image.
  • the step of controlling the posture of the drone or the posture of the shooting device carried by the drone according to the second operation input by the user may specifically include:
  • the posture of the drone or the shooting device carried by the drone along the pitch axis is controlled.
  • the control objects in the two control steps may be different. Since the nose of the drone can rotate 360° in the yaw axis direction, in some embodiments, the above-mentioned step of automatically controlling the drone or the shooting device carried by the drone may specifically include:
  • the deflection of the nose of the drone along the yaw axis is automatically controlled so that the target object remains at a preset position in the first direction of the first image acquired by the shooting device.
  • the step of controlling the posture of the drone or the posture of the shooting device carried by the drone according to the second operation input by the user may specifically include:
  • the posture of the shooting device carried by the drone along the pitch axis is controlled.
  • For example, the drone can automatically control the nose posture during the continuous shooting of the high-rise building so as to keep the target object at a preset position in the horizontal direction of the first image.
  • the user can adjust the pitch angle of the gimbal by inputting a second operation to control the posture of the shooting device along the pitch axis so as to capture images of different heights of the high-rise building.
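The split between automatic yaw control, which holds the target at the preset horizontal position in the image, and manual pitch control driven by the joystick could look like the sketch below; the proportional gain, the pixel-coordinate convention, and the rate limits are assumptions.

```python
def control_step(target_px_x: float, image_width: int,
                 stick_pitch: float,
                 preset_fraction_x: float = 0.5,
                 yaw_gain_dps_per_px: float = 0.05,
                 max_pitch_rate_dps: float = 30.0):
    """One control tick while continuously shooting the target.

    Returns (yaw_rate_dps, gimbal_pitch_rate_dps).
    - Yaw is controlled automatically so the target stays at the preset
      horizontal (first-direction) position in the first image.
    - Pitch follows the user's second operation (normalized stick in [-1, 1]),
      adjusting the target's vertical (second-direction) position.
    """
    desired_px_x = preset_fraction_x * image_width
    error_px = target_px_x - desired_px_x
    yaw_rate = yaw_gain_dps_per_px * error_px  # proportional correction
    pitch_rate = max(-1.0, min(1.0, stick_pitch)) * max_pitch_rate_dps
    return yaw_rate, pitch_rate


print(control_step(target_px_x=1100, image_width=1920, stick_pitch=-0.4))
```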
  • the first direction may also be the vertical direction of the first image
  • the second direction may also be the horizontal direction of the first image, so as to support more camera movement angles in the yaw axis direction when the target object is a target with a longer length (such as a wall, a forest, etc.).
  • the function of the user adjusting the shooting picture through manual control can be enabled when the target object is a static target.
  • the drone can automatically control the drone or the shooting device on the drone according to the static position information of the target object, so that the target object remains at the preset position in the first direction of the first image acquired by the shooting device.
  • the function of adjusting the shooting picture by manual control by the user can be preset to the on state or disabled state when setting or editing the route.
  • the user can also set control parameters through a second operation so that the drone automatically controls the posture of the drone or the posture of a shooting device on the drone according to the control parameters corresponding to the second operation to adjust the position of the target object in the second direction of the first image.
  • the interactive interface of the control terminal may have interactive controls for adjusting control parameters, such as controlling the nose posture, gimbal pitch, etc.
  • the user can input operations on the interactive controls to enable the drone to automatically control the posture of the drone or the posture of the shooting device on the drone to adjust the position of the target object in the second direction of the first image.
  • control parameters can be set according to actual needs.
  • the control parameters may include at least one of the following:
  • For example, the user can pre-set a movement trajectory of the nose posture or the gimbal pitch, such as rotating from angle A to angle B within a preset time period; through the second operation, the drone automatically controls the posture of the drone or the posture of the shooting device on the drone according to the pre-set movement trajectory of the nose posture or the gimbal pitch, so as to adjust the position of the target object in the second direction of the first image.
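The pre-set movement trajectory of rotating from angle A to angle B within a preset time period could be realized by simple interpolation, as in this sketch; linear interpolation and the function name are assumptions.

```python
def gimbal_pitch_on_trajectory(angle_a_deg: float, angle_b_deg: float,
                               duration_s: float, elapsed_s: float) -> float:
    """Gimbal pitch at time `elapsed_s` when rotating from angle A to angle B
    within a preset time period (linear interpolation, clamped at the ends)."""
    if duration_s <= 0.0:
        return angle_b_deg
    t = max(0.0, min(1.0, elapsed_s / duration_s))
    return angle_a_deg + (angle_b_deg - angle_a_deg) * t


# Example: rotate the gimbal from -10 deg to -60 deg over 5 seconds.
for t in (0.0, 2.5, 5.0):
    print(t, gimbal_pitch_on_trajectory(-10.0, -60.0, 5.0, t))
```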
  • FIG. 10 is a flowchart of steps of a control method provided by an embodiment of the present application.
  • the control method can be applied to the control terminal 200 described above, or to a system composed of the drone 100 and the control terminal 200 described above.
  • the control terminal 200 can be connected to the drone 100 in communication to control the drone 100 to perform corresponding operations according to the control instructions sent by the control terminal 200.
  • the method includes:
  • when the drone flies along a preset route, receiving a first operation, input by the user, indicating a target object; wherein the first operation is used to generate a first instruction, and the first instruction is used to control the drone to continue flying along the preset route while controlling the attitude of the drone and/or the attitude of the shooting device carried by the drone, so as to continuously shoot the target object while the drone is flying along the preset route.
  • the preset route includes at least two preset waypoints.
  • the above-mentioned first instruction can be used to control the drone to execute the method steps in the above-mentioned control method embodiment, which will not be repeated here to avoid repetition.
  • The control terminal can receive a first operation, input by the user, indicating a target object when the drone is flying along a preset route; the first operation is used to generate a first instruction, and the first instruction is used to control the drone to continue flying along the preset route while controlling the attitude of the drone and/or the attitude of the shooting device carried by the drone, so as to continuously shoot the target object during the drone's flight along the preset route.
  • the user can select the target object for continuous shooting in real time by performing the first operation on the control terminal during the drone's flight along the preset route, thereby improving the shooting freedom.
  • control terminal can generate the first instruction according to the user's first operation, or the control terminal can communicate with the drone according to the user's first operation and instruct the drone to generate the first instruction, which is not limited here.
  • the method may further include:
  • the first image captured by the camera of the UAV is displayed.
  • Receiving a first operation input by a user indicating a target object may specifically include:
  • a first operation on a first image region of a first image is received, where the first image region includes image content corresponding to a target object.
  • the first operation may include a frame selection operation
  • the first image area may be an image area determined by the frame selection operation. The image area can be determined more easily by the frame selection operation, which simplifies the user's operation.
  • image recognition technology can be used to generate interactive controls corresponding to objects of a preset type on the first image.
  • the first operation may include a click operation.
  • the method may further include: displaying at least one interactive control on the first image, where the interactive control corresponds to at least a portion of the image area in the first image.
  • Receiving a first operation on a first image area of the first image includes: receiving a click operation on the interactive control corresponding to the first image area.
  • Displaying at least one interactive control on the first image may specifically include: displaying, according to a preset type, at least one interactive control in the image area of the first image corresponding to an object of the preset type.
  • the preset types may include cars, ships, and people.
  • the user can determine the target object of the preset type by inputting operations on the interactive controls, thereby improving the convenience of user operations.
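On the control-terminal side, a frame-selection rectangle or a click on an interactive control could be translated into a first instruction roughly as sketched below; the detection structure, the hit-test, and the instruction format are assumptions for illustration.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class Detection:
    label: str                       # e.g. "car", "ship", "person" (preset types)
    box: Tuple[int, int, int, int]   # x, y, width, height in image pixels


def target_from_click(detections: List[Detection], x: int, y: int) -> Optional[Detection]:
    """Return the detection whose interactive control contains the clicked point."""
    for d in detections:
        bx, by, bw, bh = d.box
        if bx <= x <= bx + bw and by <= y <= by + bh:
            return d
    return None


def make_first_instruction(region: Tuple[int, int, int, int]) -> dict:
    """Build a first instruction that indicates the target object by its image region."""
    return {"type": "continuous_shoot_target", "image_region": region}


detections = [Detection("car", (100, 200, 80, 60)), Detection("person", (400, 300, 40, 90))]
picked = target_from_click(detections, 130, 220)
if picked is not None:
    print(make_first_instruction(picked.box))
```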
  • FIG. 13 is a flowchart of the steps of a control method provided by an embodiment of the present application.
  • the control method can be applied to the above-mentioned drone 100, or to a system composed of the above-mentioned drone 100 and a control terminal 200.
  • the drone 100 can be connected to the control terminal 200 in communication to control the drone 100 to perform corresponding operations according to the control instructions sent by the control terminal 200.
  • the method includes:
  • S301: When the drone flies along a preset route, the drone is controlled to shoot a preset target object to generate a first image.
  • S302: During the process of controlling the drone to photograph the target object, the drone or a photographing device mounted on the drone is automatically controlled to keep the target object at a preset position in a first direction of the first image.
  • S303: During the process of controlling the drone to continuously photograph the target object, the posture of the drone or the posture of the photographing device carried on the drone is controlled according to a second operation input by the user, so as to adjust the position of the target object in a second direction of the first image.
  • the preset route includes at least two preset waypoints.
  • the target object can be pre-set by the user when setting or editing the route.
  • the user can determine the target object by clicking on the map displayed by the control terminal; in some embodiments, the target object can be determined by the user based on image recognition after selecting a frame on the shooting interface displayed by the control terminal; in some embodiments, the user can also control the drone to fly to a specified geographical location or geographical location range, and determine the specified geographical location or geographical location range as the target object by operating the control terminal.
  • the target object may also be determined by user input when the drone is flying along a preset route.
  • For the rest, step S302 and step S303 can refer to the explanation of the corresponding method steps in the above embodiments, and will not be described again here to avoid repetition.
  • the drone can automatically control the target object to maintain a preset position in the first direction of the first image, and the user can manually adjust the position of the target object in the second direction of the first image, thereby giving the user greater freedom in camera movement during the continuous shooting of the target object, further enhancing the user experience.
  • the first direction may be the horizontal direction of the first image
  • automatically controlling the posture of the drone or the posture of the shooting device carried on the drone may specifically include: automatically controlling the posture of the drone or the shooting device carried on the drone along the yaw axis.
  • the second direction may be the vertical direction of the first image.
  • Controlling the posture of the drone or the posture of the shooting device carried on the drone according to the second operation may specifically include: controlling, according to the second operation, the posture of the drone or the shooting device carried on the drone along the pitch axis.
  • The second operation may include an operation input by the user on a control stick.
  • Controlling the posture of the drone or the posture of the camera mounted on the drone may specifically include: controlling the posture of the drone or the posture of the camera mounted on the drone according to the direction and amount of the operation input on the control stick.
  • In some embodiments, controlling the posture of the drone or the posture of the shooting device carried by the drone may specifically include: automatically controlling the posture of the drone or the posture of the shooting device carried by the drone according to preset control parameters.
  • The control parameters may include, for example, a pre-set movement trajectory of the nose posture or the gimbal pitch.
  • FIG. 14 is a flowchart of steps of a control method provided by an embodiment of the present application.
  • the control method can be applied to the control terminal 200 described above, or to a system composed of the drone 100 and the control terminal 200 described above.
  • the control terminal 200 can be connected to the drone 100 in communication to control the drone 100 to perform corresponding operations according to the control instructions sent by the control terminal 200.
  • the method includes:
  • S401: When the drone flies along a preset route, a first image generated by the drone photographing a preset target object is obtained.
  • S402: Receive a second operation input by the user on the first image.
  • the second operation is used to generate a second instruction, and the second instruction is used to:
  • the posture of the drone or the posture of the photographing device carried by the drone is automatically controlled so that the target object remains at a preset position in the first direction of the first image.
  • according to the second operation, the posture of the drone or the posture of the photographing device carried by the drone is controlled to adjust the position of the target object in the second direction of the first image.
  • the preset route includes at least two preset waypoints.
  • the second instruction can be specifically used to control the drone to execute the method steps described in the above control method embodiment applied to the drone, and will not be described again here to avoid repetition.
  • FIG 15 is a schematic block diagram of the structure of a control device provided in an embodiment of the present application.
  • The control device can be applied to the aforementioned drone 100 or control terminal 200; the control device can be integrated into the aforementioned drone 100 or control terminal 200, or can be provided separately from the drone 100 or control terminal 200 and communicatively connected thereto, and the aforementioned control methods can also be applied to the control device.
  • control device 500 includes a processor 501 and a memory 502.
  • the processor 501 and the memory 502 are connected via a bus 503, and the bus 503 is, for example, an I2C (Inter-integrated Circuit) bus.
  • the processor 501 can be a micro-controller unit (MCU), a central processing unit (CPU) or a digital signal processor (DSP), etc.
  • The memory 502 can be a Flash chip, a read-only memory (ROM), a magnetic disk, an optical disc, a USB flash drive, or a removable hard disk, etc.
  • the processor 501 is used to run the computer program stored in the memory 502, and implement the following steps when executing the computer program:
  • when the drone flies along a preset route, a first instruction is obtained; wherein the first instruction is generated based on a first operation, input by the user, indicating a target object when the drone flies along the preset route; in response to the first instruction, the drone is controlled to continue flying along the preset route, while the posture of the drone and/or the posture of a shooting device carried by the drone is controlled, so as to continuously shoot the target object while the drone is flying along the preset route; wherein the preset route includes at least two preset waypoints.
  • Alternatively, the following steps are implemented: when the drone is flying along a preset route, a first operation input by a user and indicating a target object is received.
  • The first operation is used to generate a first instruction.
  • The first instruction is used to control the drone to continue flying along the preset route while controlling the posture of the drone and/or the posture of a shooting device carried by the drone, so as to continuously shoot the target object while the drone is flying along the preset route.
  • The preset route includes at least two preset waypoints.
  • Alternatively, the following steps are implemented: when the drone flies along a preset route, the drone is controlled to shoot a preset target object to generate a first image; in the process of controlling the drone to shoot the target object, the posture of the drone or the posture of a shooting device carried on the drone is automatically controlled so that the target object remains at a preset position in a first direction of the first image; in the process of controlling the drone to continuously shoot the target object, the posture of the drone or the posture of the shooting device carried on the drone is controlled according to a second operation input by a user to adjust the position of the target object in a second direction of the first image; wherein the preset route includes at least two preset waypoints.
  • Alternatively, the following steps are implemented: when the drone flies along a preset route, a first image generated by the drone photographing a preset target object is obtained; a second operation input by the user on the first image is received, wherein the second operation is used to generate a second instruction, and the second instruction is used to: in the process of controlling the drone to continuously photograph the target object, automatically control the posture of the drone or the posture of a photographing device carried on the drone, so that the target object remains at a preset position in a first direction of the first image; and, in the process of controlling the drone to continuously photograph the target object, control, according to the second operation, the posture of the drone or the posture of the photographing device carried on the drone to adjust the position of the target object in a second direction of the first image; wherein the preset route includes at least two preset waypoints. On the control-terminal side, the second operation could be packaged roughly as sketched below.
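  • The following sketch shows one way a control terminal might encode the user's second operation into a second instruction; the message fields, the JSON encoding and the function name are assumptions chosen for the example, not a format specified by this application.

```python
import json
import time

def build_second_instruction(stick_deflection: float) -> bytes:
    """Encode the user's second operation as a second-instruction payload."""
    payload = {
        "type": "second_instruction",
        "timestamp": time.time(),
        # Direction and amount of the stick input; the drone uses this to adjust
        # the target's position in the second direction of the first image while
        # automatically holding it at the preset position in the first direction.
        "stick_deflection": max(-1.0, min(1.0, stick_deflection)),
    }
    return json.dumps(payload).encode("utf-8")

# Example: half-deflection of the pitch stick, ready to send over the wireless link.
packet = build_second_instruction(0.5)
```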
  • An embodiment of the present application further provides a computer-readable storage medium storing a computer program; the computer program includes program instructions, and a processor executes the program instructions to implement the steps of the control methods provided in the above embodiments.
  • The computer-readable storage medium may be an internal storage unit of the drone or control terminal described in any of the aforementioned embodiments, such as a hard disk or memory of the drone or control terminal.
  • The computer-readable storage medium may also be an external storage device of the drone or control terminal, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card or a flash card equipped on the drone or control terminal.

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A control method and apparatus for an unmanned aerial vehicle (drone), and a storage medium, applied to the field of control. The method includes: when the drone flies along a preset route, obtaining a first instruction, wherein the first instruction is generated according to a first operation, input by a user and indicating a target object, that is obtained while the drone flies along the preset route; and, in response to the first instruction, controlling the drone to continue flying along the preset route while controlling the posture of the drone and/or the posture of a shooting device carried by the drone, so as to continuously shoot the target object while the drone flies along the preset route, wherein the preset route includes at least two preset waypoints.

Description

无人机的控制方法、装置及存储介质 技术领域
本申请涉及控制领域,尤其涉及一种无人机控制方法、装置及存储介质。
背景技术
无人机,能够应用于航拍、巡检、森林防护、灾情勘察及农药喷洒等场景,进而使得无人机得到了广泛应用。随着用户对无人机需求的日益增加,航点飞行功能就此产生。在航点飞行模式中,无人机可以沿预设航线进行飞行,并针对预设航线中的预设航点或预设兴趣点,执行相应的拍摄动作或飞行操作。
目前,针对兴趣点的设置,用户需要在编辑航线时事先预设兴趣点,在航点飞行的过程中,无人机会在沿预设航线飞行的同时,朝向预设的兴趣点拍摄。用户无法在航线飞行过程中调整兴趣点或调整针对兴趣点的拍摄动作或飞行操作,拍摄自由度不高。
发明内容
基于此,本申请实施例提供了一种无人机的控制方法、装置及存储介质,旨在解决无人机在航点飞行模式下兴趣点拍摄自由度不高的问题。
第一方面,本申请实施例提供了一种无人机的控制方法,包括:在所述无人机沿预设航线飞行时,获取第一指令;其中,所述第一指令根据所述无人机沿所述预设航线飞行时获取用户输入的指示目标对象的第一操作生成;
响应于所述第一指令,控制所述无人机继续沿所述预设航线飞行的同时,控制所述无人机的姿态和/或所述无人机上搭载的拍摄装置的姿态,以在所述无人机沿所述预设航线飞行的过程中对所述目标对象进行持续拍摄;其中,所述预设航线包括至少两个预设航点。
第二方面,本申请实施例提供了一种无人机的控制方法,包括:
在所述无人机沿预设航线飞行时,接收用户输入的指示目标对象的第一操作;其中,所述第一操作用于生成第一指令,所述第一指令用于控制所述无人机继续沿所述预设航线飞行的同时,控制所述无人机的姿态和/或所述无人机上搭载的拍摄装置的姿态,以在所述无人机沿所述预设航线飞行的过程中对所述目标对象进行持续拍摄;所述预设航线包括至少两个预设航点。
第三方面,本申请实施例提供了一种无人机的控制方法,包括:
在所述无人机沿预设航线飞行时,控制所述无人机对预设的目标对象进行拍摄以生成第一图像;
控制所述无人机对所述目标对象进行拍摄的过程中,自动控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态,以使得所述目标对象保持在所述第一图像的第一方向的预设位置;
控制所述无人机对所述目标对象进行持续拍摄的过程中,根据用户输入的 第二操作,控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态,以调整所述目标对象在所述第一图像的第二方向的位置;其中,所述预设航线包括至少两个预设航点。
第四方面,本申请实施例提供了一种无人机的控制方法,包括:
在所述无人机沿预设航线飞行时,获取所述无人机对预设的目标对象拍摄生成的第一图像;
接收用户对第一图像输入的第二操作;
其中,所述第二操作用于生成第二指令,所述第二指令用于:
在控制所述无人机对所述目标对象进行持续拍摄的过程中,自动控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态,以使得所述目标对象保持在所述第一图像的第一方向的预设位置;
控制所述无人机对所述目标对象进行持续拍摄的过程中,根据所述第二操作,控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态,以调整所述目标对象在所述第一图像的第二方向的位置;其中,所述预设航线包括至少两个预设航点。
第五方面,本申请实施例提供了一种无人机的控制装置,所述控制装置包括存储器和处理器;
所述存储器,用于存储计算机程序;
所述处理器,用于执行所述计算机程序并在执行所述计算机程序时,实现如第一方面或第三方面所述的方法步骤。
第六方面,本申请实施例提供了一种无人机的控制装置,所述控制装置包括存储器和处理器;
所述存储器,用于存储计算机程序;
所述处理器,用于执行所述计算机程序并在执行所述计算机程序时,实现如第二方面或第四方面所述的方法步骤。
第七方面,本申请实施例提供了一种无人机,包括如第五方面所述的无人机的控制装置。
第八方面,本申请实施例提供了一种控制终端,包括如第六方面所述的无人机的控制装置。
第九方面,本申请实施例提供了一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,所述计算机程序被处理器执行时使所述处理器实现如第一方面或第三方面所述的方法步骤。
第十方面,本申请实施例提供了一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,所述计算机程序被处理器执行时使所述处理器实现如第二方面或第四方面所述的方法步骤。
本申请实施例中,用户可以在无人机沿预设航线飞行的过程中,通过第一操作实时选择目标对象以生成第一指令,无人机可以根据第一指令,控制无人机继续沿预设航线飞行的同时,控制无人机的姿态和/或无人机上搭载的拍摄装置的姿态,以在无人机沿预设航线飞行的过程中对目标对象进行持续拍摄,这提升了在航点飞行模式下的拍摄自由度。
应当理解的是,以上的一般描述和后文的细节描述仅是示例性和解释性的,并不能限制本申请。
附图说明
为了更清楚地说明本申请实施例技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1是本申请实施例提供的控制***的应用场景示意图;
图2是本申请实施例提供的拍摄装置的结构示意图;
图3是本申请实施例提供的控制方法的步骤流程图之一;
图4是本申请实施例提供的无人机航点飞行示意图之一;
图5是本申请实施例提供的无人机航点飞行示意图之二;
图6是本申请实施例提供的无人机航点飞行示意图之三;
图7是本申请实施例提供的无人机航点飞行示意图之四;
图8是本申请实施例提供的无人机的拍摄画面示意图之一;
图9是本申请实施例提供的无人机的拍摄画面示意图之二;
图10是本申请实施例提供的控制方法的步骤流程图之二;
图11是本申请实施例提供的控制终端的交互场景图之一;
图12是本申请实施例提供的控制终端的交互场景图之二;
图13是本申请实施例提供的控制方法的步骤流程图之三;
图14是本申请实施例提供的控制方法的步骤流程图之四;
图15是本申请实施例提供的控制装置的结构示意图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。
附图中所示的流程图仅是示例说明,不是必须包括所有的内容和操作/步骤,也不是必须按所描述的顺序执行。例如,有的操作/步骤还可以分解、组合或部分合并,因此实际执行的顺序有可能根据实际情况改变。
无人机,能够应用于航拍、巡检、森林防护、灾情勘察及农药喷洒等场景,进而使得无人机得到了广泛应用。随着用户对无人机需求的日益增加,航点飞行功能就此产生。在航点飞行模式中,无人机可以沿预设航线进行飞行,并针对预设航线中的预设航点或预设兴趣点,执行相应的拍摄动作或飞行操作。
目前,针对兴趣点的设置,用户需要在编辑航线时事先选择兴趣点,在航点飞行的过程中,无人机会在沿预设航线飞行的同时,朝向兴趣点拍摄。用户无法在航线飞行过程中调整兴趣点或调整针对兴趣点的拍摄动作或飞行操作, 拍摄自由度不高。
为解决上述技术问题,本申请实施例提供了一种无人机的控制方法、装置及存储介质,旨在解决无人机在航点飞行模式下拍摄自由度不高的问题。
下面结合附图,对本申请的一些实施例作详细说明。在不冲突的情况下,下述的实施例及实施例中的特征可以相互组合。
需要说明的是,本申请实施例提供的控制方法,应用于无人机,也称无人机,无人机可以包括旋翼型无人机,例如四旋翼无人机、六旋翼无人机、八旋翼无人机,也可以是固定翼无人机,还可以是旋翼型与固定翼无人机的组合,本申请实施例对此不作具体限定。
无人机可以响应至少两个控制终端(例如遥控器和头戴式显示装置)发送的控制指令来控制无人机进行机身偏转、云台转动等操作。
请参阅图1,图1是实施本申请实施例提供的一种控制***的结构示意图。如图1所示,一种控制***包括无人机100和控制终端200,无人机100可以与控制终端200通信连接。控制终端200可以用于控制无人机100。其中,控制终端200可以包括遥控器、智能手机、平板电脑中的至少一种,也可以包括遥控器、智能手机、穿戴式设备中的至少一种,该穿戴式设备包括头戴式显示装置,头戴式显示装置可以包括虚拟现实(VR,virtual reality)显示设备或第一人称视角(FPV,first person view)显示设备。
在一些实施例中,无人机100包括机身110、动力***120、成像装置130和控制装置(图1中未示出)。该机身110可以包括机头。在某些实施例中,无人机100还包括机臂,其中,机臂与机身110连接,机臂用于安装动力***,在一些实施例中,动力***120可以直接安装在机身110上。
该动力***120用于为无人机提供飞行动力,动力***120可以包括电机和安装在电机上并电机驱动的螺旋桨。动力***120可以带动无人机100的机身110可以围绕一个或多个旋转轴转动。例如,上述旋转轴可以包括横滚轴、偏航轴和俯仰轴。当动力***120带动机身110围绕偏航轴转动,此时机身的机头的偏航朝向就会改变,即可以通过控制动力***120控制机身110偏航转动。应理解,电机可以是直流电机,也可以交流电机。另外,电机可以是无刷电机,也可以是有刷电机。
拍摄装置130直接承载或者通过云台承载于机身110,用于拍摄图像,图像可以是图片和/或视频。在一些实施例中,如图1所示,无人机可以包括云台140,拍摄装置130安装在云台140上,云台140与机身110连接。在一些实施例中,云台140能够控制拍摄装置130偏航转动以调节拍摄装置130的偏航朝向,具体地,云台140可以包括偏航电机141,偏航电机141用于控制拍摄装置130偏航转动。在一些实施例中,云台140能够控制拍摄装置130俯仰转动以调节拍摄装置130的俯仰朝向,具体地,云台140可以包括俯仰电机,俯仰电机用于控制拍摄装置130俯仰转动。在一些实施例中,云台140能够控制拍摄装置130横滚转动以调节拍摄装置130的横滚朝向,具体地,云台140可以包括横滚电机,俯仰电机用于控制拍摄装置130横滚转动。
在偏航方向上,拍摄装置130的偏航转动和机身110的偏航转动可以是关联 的,进一步地,拍摄装置130可以跟随机身110的偏航转动而偏航转动,或者机身110跟随拍摄装置130的偏航转动而偏航转动。
控制终端可以包括输入装置,其中,输入装置可以检测控制终端的用户的控制操作,控制终端可以根据输入装置检测到的用户的控制操作生成对无人机的控制指令。例如,控制终端可以根据输入装置检测到的控制终端的用户的机头偏航控制操作生成偏航控制指令,控制终端可以将机头偏航控制指令发送给无人机;控制终端可以根据输入装置检测到的控制终端的用户的云台俯仰控制操作生成俯仰控制指令,控制终端可以将云台俯仰控制指令发送给无人机。
请参阅图2,在一些实施例中,控制终端200包括遥控器,遥控器设置有输入装置210和通信装置220,通信装置220为无线通信装置,该无线通信装置可包括高频无线电收发器、WIFI模组、蓝牙模组中的至少一个。输入装置210用于响应用户的操控生成对应的控制指令,从而使得遥控器可以通过该控制指令操控无人机进行飞行姿态和/或飞行速度调整。其中,输入装置210包括按键、摇杆、拨轮、触控显示屏中的至少一种。例如,输入装置210为摇杆,摇杆安装于遥控器的本体上,遥控器感测用户对摇杆的偏航控制操控生成对应的控制指令,并通过通信装置220将第一偏航控制指令发送给无人机100。
在一些实施例中,控制终端200可以接收无人机100传输的图像,并显示装置230进行显示,显示装置230可以集成于控制终端200,或显示装置230与控制终端200分离设置,并与控制终端200通信连接,该通信连接的方式可以通过有线通信连接方式或无线通信连接方式,如,无线通信连接方式可以是WiFi连接、蓝牙连接、或高频无线信号连接。
在一些实施例中,控制终端200包括遥控器,遥控器设置有输入装置210和通信装置220,通信装置220为无线通信装置,该无线通信装置可包括高频无线电收发器、WIFI模组、蓝牙模组中的至少一个。输入装置210用于响应用户的操控生成对应的控制指令,从而使得遥控器可以通过该控制指令操控无人机进行飞行姿态和/或飞行速度调整。其中,输入装置210包括按键、摇杆、拨轮、触控显示屏中的至少一种。例如,输入装置210为摇杆,摇杆安装于遥控器的本体上。
用户可以通过使用按键、摇杆、拨轮来生成控制指令,也可以通过对触控显示屏的输入操作来生成控制指令,在此不作限定。
以下,将结合图1中的场景对本申请的实施例提供的无人机的控制方法进行详细介绍。需知,图1中的场景仅用于解释本申请实施例提供的无人机的控制方法,但并不构成对本申请实施例提供的无人机的控制方法应用场景的限定。
请参阅图3,图3是本申请实施例提供的一种控制方法的步骤流程图。该控制方法可以应用于上述的无人机100,或应用于上述无人机100和控制终端200组成的***。该无人机100可以与控制终端200通信连接,以根据控制终端200发送的控制指令,控制无人机100执行对应的操作。
该方法包括步骤S101和步骤S102:
S101、在无人机沿预设航线飞行时,获取第一指令;其中,第一指令根据无人机沿预设航线飞行时获取用户输入的指示目标对象的第一操作生成。
S102、响应于第一指令,控制无人机继续沿预设航线飞行的同时,控制无人机的姿态和/或无人机上搭载的拍摄装置的姿态,以在无人机沿预设航线飞行的过程中对目标对象进行持续拍摄。其中,预设航线包括至少两个预设航点。
步骤101中,上述预设航线可以为用户预先设置的航线,包括至少两个预设航点。其中,预设航点可以由用户手动设置。具体地,一些实施例中,预设航点可以由用户通过在控制终端显示的地图上进行点选操作确定;一些实施例中,预设航点可以由用户通过在控制终端显示的拍摄界面上进行框选操作后,基于图像识别确定;一些实施例中,用户也可以控制无人机飞行到指定的地理位置后,通过对控制终端的操作将指定的地理位置确定为预设航点。
应理解,用户可以选择多个预设航点的顺序,以使得无人机在执行预设航线时,按照顺序依次经过多个预设航点。在一些实施例中,例如航线为曲线时,无人机也可以选择不经过预设航点,而围绕预设航点进行指定半径的飞行。
第一指令可以由控制终端生成并发送至无人机。也即,用户输入的指示目标对象的第一操作可以为用户针对控制终端输入的操作。上述获取第一指令的步骤,可以为无人机沿预设航线飞行时,接收第一指令的步骤。在一些实施例中,第一指令可以由无人机自行生成,上述获取第一指令的步骤,也可以为无人机生成第一指令的步骤。
其中,目标对象可以为一地理位置或一个地理位置范围,或者出现在拍摄装置的拍摄画面中的一个拍摄对象。用户可以通过指示目标对象的第一操作,来控制无人机在沿预设航线飞行的过程中,对目标对象进行持续拍摄。
步骤102中,无人机可以根据第一指令,执行相应的操作。这样,在不中断无人机沿预设航线飞行的过程的同时,可以控制无人机的姿态和/或无人机上搭载的拍摄装置的姿态,以使得无人机在沿预设航线飞行的过程中持续拍摄目标对象。
无人机可以在沿预设航线飞行的过程中,在用户输入指示目标对象的第一操作后,获取第一指令,并响应于第一指令控制无人机继续沿预设航线飞行的同时,控制无人机的姿态和/或无人机上搭载的拍摄装置的姿态,从而用户可以在无人机沿预设航线飞行的过程中,通过指示目标对象的第一操作实时控制无人机的姿态和/或无人机上搭载的拍摄装置的姿态,以对目标对象进行持续拍摄。
其中,无人机的姿态,可以包括机头的朝向,例如机头偏航转动角度。相应地,拍摄装置的姿态也可以包括拍摄装置沿偏航轴、俯仰轴或横滚轴的姿态,在此不作进一步的限定。
对目标对象进行持续拍摄,可以理解为在至少部分预设航线中,无人机可以通过控制自身姿态以及拍摄装置的姿态,以使得目标对象能够被拍摄装置拍摄到。具体地,为了提高拍摄成片的质量,目标对象可以位于拍摄装置视野中相对固定的位置。
这样,用户可以在无人机沿预设航线飞行的过程中,通过第一操作实时选择目标对象以生成第一指令,无人机可以根据第一指令,控制无人机继续沿预设航线飞行的同时,控制无人机的姿态和/或无人机上搭载的拍摄装置的姿态,以在无人机沿预设航线飞行的过程中对目标对象进行持续拍摄,这提升了在航 点飞行模式下的拍摄自由度
可选地,控制无人机的姿态和/或无人机上搭载的拍摄装置的姿态,可以通过控制无人机和/或拍摄装置沿俯仰轴和/或偏航轴的姿态来实现。
在一些实施例中,上述步骤102,具体可以包括以下至少一项:
控制所述无人机改变机头姿态,以在所述无人机沿所述预设航线飞行的过程中对所述目标对象进行持续拍摄;
控制所述无人机上搭载的拍摄装置沿俯仰轴的姿态,以在所述无人机沿所述预设航线飞行的过程中对所述目标对象进行持续拍摄;
控制所述无人机上搭载的拍摄装置沿偏航轴的姿态,以在所述无人机沿所述预设航线飞行的过程中对所述目标对象进行持续拍摄。
其中,机头姿态可以包括机头沿偏航轴的姿态和沿俯仰轴的姿态。
为了便于控制,无人机的机头偏航姿态可以与拍摄装置偏航姿态相关联。进一步地,在拍摄装置由云台搭载的情况下,云台的偏航转动可以跟随机头的偏航转动。无人机可以通过控制机头沿偏航轴的转动以及云台沿俯仰轴的转动,来改变拍摄装置的拍摄角度。因而在一些实施例中,无人机可以通过控制机头沿偏航轴朝向和拍摄装置沿俯仰轴的姿态,来对目标对象进行持续拍摄。在其他可选的实施例中,无人机也可以同时控制云台沿偏航轴和俯仰轴的姿态,来对目标对象进行持续拍摄,在此不再一一列举。
在一些实施例中,为了使得无人机沿预设航线飞行时对目标对象的持续拍摄较为连贯平滑,可以控制机头和/或拍摄装置转动时的最大速度或最大角速度。
在不同的拍摄场景中,用户对静态目标和动态目标均存在拍摄需求,而对于静态目标和动态目标的拍摄策略存在不同。可选地,在沿预设航线飞行的过程中,无人机和/或控制终端可以对目标对象的图像进行识别,并确定目标对象的类型,从而可以根据目标对象的类型,来确定相应的控制策略。
在一些实施例中,上述步骤102,具体可以包括:
在目标对象属于预设类型的情况下,控制无人机根据实时采集到的目标对象的状态信息改变无人机的姿态和/或拍摄装置的姿态,以使得目标对象处于运动状态时,无人机和/或无人机搭载的拍摄装置能够跟随目标对象进行持续拍摄。
其中,无人机搭载的拍摄装置可以获取目标对象的图像,通过提取图像特征与预设类型特征相比对等方式,确定目标对象是否属于预设类型。在目标对象属于预设类型的情况下,即表明目标对象有较大概率属于动态目标。为了使得无人机在沿预设航线飞行的过程中,能够实时锁定目标对象进行拍摄,无人机可以根据实时获取到的目标对象的状态信息,来控制无人机的姿态和/或无人机搭载的拍摄装置的姿态,以跟随目标对象进行持续拍摄。
具体地,目标对象的状态信息,可以包括目标对象的位置信息以及运动状态信息。在一些实施例中,目标对象的状态信息可以至少部分由无人机上搭载的拍摄装置获取。举例而言,目标对象的运动状态信息可以根据两个时间点无人机的位置信息以及目标对象的位置信息,无人机可以拍摄两个时间点目标对象的图像,通过三角测量等方式计算得到目标对象的运动速度。进而,无人机可以在继续沿预设航线飞行时,根据目标对象的运动状态信息和目标对象的位 置信息来控制无人机的机头和/或拍摄装置转动。
这样,在目标对象为动态目标的情况下,无人机也可以在继续沿航线飞行的过程中,根据实时采集的目标对象的状态信息,通过控制无人机的姿态和/或拍摄装置的姿态跟随目标对象进行持续拍摄,实现对目标对象的动态追踪锁定(tracking),进一步提升了航点飞行场景下的拍摄自由度。
示例性地,参照图4,图4中预设航线包括预设航点A、B、C、D,无人机在沿预设航线飞行时需要依次经过预设航点A、B、C、D。在目标对象为动态目标时,目标对象会随着无人机沿预设航线飞行过程运动。例如无人机在位于航点B时,目标对象位于X点,无人机在位于航点C时,目标对象位于Y点,则无人机在由航点B飞行至航点C的过程中,可以控制无人机的机头和/或拍摄装置由朝向X点转变为朝向Y点,即跟随目标对象的运动进行持续拍摄。
相应地,若目标对象不属于预设类型,即表明目标对象有较大概率属于静态目标。在一些实施例中,上述步骤102,具体可以包括:
在目标对象不属于预设类型的情况下,控制无人机根据目标对象的初始状态信息改变无人机的姿态和/或拍摄装置的姿态,以在所述无人机沿所述预设航线飞行的过程中对所述目标对象进行持续拍摄。
其中,初始状态信息可以为无人机在响应第一指令时获取的信息,也可以为无人机在响应第一指令后的某一时刻获取的信息。举例而言,初始状态信息可以包括无人机在响应第一指令时获取的目标对象的位置信息。由于目标对象为静态目标时,其位置信息不会发生变化,因而无人机在继续沿预设航线飞行的过程中,可以根据目标对象初始的位置信息,改变无人机的姿态和/或无人机搭载的拍摄装置的姿态,以对目标对象进行持续拍摄。
这样,在目标对象大概率为静态目标的情况下,无人机无需实时获取目标对象的状态信息,而仅需获取目标对象的初始状态信息,即可以使得无人机在沿预设航线飞行的过程中,对目标对象进行持续拍摄,减少了资源占用。
示例性地,参照图5,图5中目标对象为静态目标,目标对象始终位于X点,此时无人机可以在沿预设航线飞行的过程中,通过控制无人机的机头和/或无人机搭载的拍摄装置朝向X点,以对目标对象进行持续拍摄。
本申请实施例中,通过判断目标对象是否属于预设类型,以确定目标对象是否大概率属于动态目标,从而针对静态目标和动态目标采用不同的控制策略,以使得无人机在能够实时追踪锁定动态目标的前提下,降低在对静态目标进行持续拍摄时的资源占用。
可选地,预设类型可以为用户在大部分拍摄场景下拍摄的动态目标。在一些实施例中,预设类型可以包括车、船和人,以在覆盖大部分动态目标实时追踪锁定的拍摄场景的同时,提升判断的准确度。
可选地,无人机可以在响应于第一指令后,在继续沿预设航线飞行的过程中,控制无人机的姿态和/或无人机搭载的拍摄装置的姿态,以在预设航线的飞行结束前,始终对目标对象进行持续拍摄。也可以在继续沿预设航线飞行的过程中,控制无人机的姿态和/或无人机搭载的拍摄装置的姿态,以在其中的一段航线持续拍摄目标对象。
在一些实施例中,上述步骤102,具体可以包括:
在无人机满足第一预设条件之前,控制无人机和/或无人机上搭载的拍摄装置的姿态,以在无人机沿预设航线飞行的过程中对目标对象进行持续拍摄。
其中,第一预设条件包括无人机到达预设航点中的目标预设航点,目标预设航点包括位于无人机在预设航线上的位置之后的其中一个预设航点。
也即,在无人机沿预设航线飞行至目标预设航点之前,可以控制无人机和/或拍摄装置的姿态,以对目标对象进行持续拍摄。目标预设航点可以为预设航线上位于无人机的当前位置之后的其中一个预设航点。无人机的当前位置可以为无人机响应于第一指令,执行相应的控制操作时的位置。
在一些实施例中,第一预设条件还可以包括:
在控制无人机继续沿预设航线飞行的过程中,至少部分目标对象脱离拍摄装置的视野。
由上述内容可知,目标对象可以为静态目标,也可以为动态目标。在目标对象为动态目标的情况下,在无人机沿预设航线飞行的过程中,目标对象可能会由于自身运动脱离拍摄装置的(Field of View,FOV)视野,从而导致无人机无法通过拍摄装置继续追踪锁定目标对象。
因而,无人机可以在至少部分目标对象脱离拍摄装置的视野之前,控制无人机的姿态和/或无人机上搭载的拍摄装置的姿态,以对目标对象进行持续拍摄。
可以理解的是,至少部分目标对象脱离拍摄装置的视野的情况可以根据具体的应用场景进行确定。举例而言,对于目标对象为人的情况,可以通过头-肩识别等方式确定目标对象,若目标对象的头部或肩部脱离拍摄装置的视野,则无法继续对目标对象进行识别与跟踪锁定。
具体地,在一些实施例中,为了减少停止持续拍摄的情况,无人机可以在至少部分目标对象脱离拍摄装置的视野的持续时间大于或等于预设时间后,不再对目标对象进行持续拍摄。
在其他可选的实施例中,第一预设条件还可以包括:对目标对象持续拍摄的时间大于或等于用户预设的拍摄时间;用户输入停止对目标对象进行持续拍摄的操作等,在此不再一一列举。
可选地,在无人机满足第一预设条件时,无人机可以不再对目标对象进行持续拍摄。
在一些实施例中,上述控制方法还可以包括:
在无人机满足第一预设条件时,按照预设航线的原始预设参数控制无人机的姿态和/或无人机上搭载拍摄装置的姿态。
其中,预设航线的原始预设参数,可以为预设航线默认的控制参数,也可以为用户预设的控制参数。
应理解的是,在一些实施例中,无人机需要满足第一预设条件的所有条件,再执行对目标对象的持续拍摄;在其他可选的实施例中,无人机可以在满足第一预设条件中的其中任意一个条件或者特定一个或多个条件的情况下,执行对目标对象的持续拍摄,具体可以根据实际需要进行设置。
举例而言,用户可以预先设置在持续拍摄结束后无人机的拍摄策略,在无 人机满足第一预设条件的其中一项时,可以根据当前沿预设航线的飞行方向调整无人机的姿态和/或拍摄装置的姿态,使得无人机的机头和拍摄装置朝向飞行方向;或者,无人机可以根据用户预设的机头姿态角度和拍摄装置朝向角度,调整无人机的姿态和拍摄装置的姿态。具体可以根据实际需要进行设置,在此不作进一步的限定。
这样,用户可以自由控制对目标对象持续拍摄的开始和结束条件,并可以选择持续拍摄结束后的拍摄策略,进一步提升了航点飞行时拍摄的自由度,提高了用户的使用体验。
示例性地,参照图6,无人机可以在沿预设航线飞行至E点时响应于第一指令,控制无人机的姿态和/或拍摄装置的姿态,以对目标对象进行持续拍摄。假定目标预设航点为C点,则无人机在到达C点时,即相当于满足了第一预设条件,此时无人机可以将机头姿态和拍摄装置的朝向调整为与飞行方向一致,并进行C至D点的航线飞行。
由于在航线设置与编辑时,用户可以预先设置目标兴趣点,以在无人机沿预设航线飞行的过程中,对目标兴趣点进行持续拍摄。进一步地,用户可以设置目标兴趣点与预设航点之间的关联关系,以使得无人机在到达预设航点时,对目标兴趣点进行持续拍摄。
用户也可以在航线编辑或航线设置的过程中,选择预设航点所对应的机头姿态角,云台俯仰角等,以在无人机沿预设航线飞行,且到达预设航点后,根据机头姿态角和云台俯仰角控制机头和拍摄装置的姿态。
为了避免无人机对目标对象的持续拍摄与对目标兴趣点的持续拍摄,或者与用户预设的拍摄策略冲突,可选地,无人机可以在沿预设航线飞行,且自身姿态和拍摄装置的姿态能够由用户手动控制的情况下,对目标对象进行持续拍摄。
在一些实施例中,上述步骤102,具体可以包括:
若无人机满足第二预设条件,则控制无人机的姿态和/或无人机上搭载的拍摄装置的姿态,以在无人机沿预设航线飞行的过程中对目标对象进行持续拍摄;第二预设条件包括无人机的姿态和/或拍摄装置的姿态能够由用户手动控制。
进一步地,在一些实施例中,第二预设条件可以包括以下至少一项:
无人机的姿态和/或无人机上搭载的拍摄装置的姿态的控制模式为手动控制模式。
预设航线不存在预设的目标兴趣点。其中,在预设航线存在目标兴趣点的情况下,无人机在按照预设航线飞行的过程中,控制无人机的姿态和/或无人机上搭载的拍摄装置的姿态,以对目标兴趣点进行持续拍摄。
其中,无人机的姿态和/或无人机上搭载的拍摄装置的姿态的控制模式可以由用户预先设定。举例而言,用户可以预设无人机的姿态和/或无人机上搭载的拍摄装置的姿态由手动控制,从而在无人机沿预设航线飞行时,用户可以通过对控制终端的摇杆的输入操作,来控制无人机的姿态或者拍摄装置的姿态。
目标兴趣点可以由用户在航线设置或编辑的过程中设置,其可以关联预设航线的所有预设航点,也可以关联预设航线中的部分预设航点。目标兴趣点可 以为静态目标,以位置信息进行表征。在一些情况下,目标对象可以看作在无人机沿预设航线飞行时实时设置的“目标兴趣点”。
无人机可以在满足第二预设条件的情况下,进行对目标对象的持续拍摄。换句话说,在无人机沿预设航线飞行的过程中,无人机对目标对象的持续拍摄、对目标兴趣点的持续拍摄、以及基于用户预设的拍摄策略进行拍摄三者可以并行。
这样,用户可以通过预先设置目标兴趣点以及拍摄策略,并可以在无人机在沿预设航线飞行过程中未存在目标兴趣点以及拍摄策略的情况下,通过第一操作控制无人机对目标对象进行持续拍摄,避免了多种拍摄策略之间的冲突,提升了用户的使用体验。
举例而言,参照7,A至B点用户设置了无人机的机头姿态和拍摄装置朝向跟随航线飞行方向,B至C点设置了无人机的机头姿态和拍摄装置朝向为手动控制模式,C至D点设置了目标兴趣点Y点。则无人机在沿预设航线飞行的过程中,A至B点跟随航线方向控制无人机的姿态和拍摄装置的姿态,B至E点为用户手动控制,若用户未进行手动控制,则可以保持B点的无人机的姿态和拍摄装置的姿态或者默认保持沿航线方向控制无人机的姿态和拍摄装置的姿态。在E至C点无人机响应于第一指令,控制无人机的姿态和拍摄装置的姿态,以对X点进行持续拍摄。在C至D点则控制无人机的姿态和拍摄装置的姿态,以对Y点进行持续拍摄。
应理解的是,在一些实施例中,无人机需要满足第二预设条件的所有条件,再执行对目标对象的持续拍摄;在其他可选的实施例中,无人机可以在满足第二预设条件中的其中任意一个条件或者特定一个或多个条件的情况下,执行对目标对象的持续拍摄。具体可以根据实际需要进行设置。
可选地,无人机在沿预设航线飞行时对目标对象的持续拍摄,也可以覆盖之前的目标兴趣点以及其他用户设置。
在一些实施例中,上述步骤102,具体可以包括:
若预设航线存在预设的目标兴趣点,则控制无人机的姿态和/或无人机上搭载的拍摄装置的姿态,以将无人机由对目标兴趣点进行持续拍摄调整为对目标对象进行拍摄。
这样,用户可以在无人机沿预设航线飞行的过程中,通过第一操作将目标对象覆盖预设的目标兴趣点,并控制无人机由对目标兴趣点进行持续拍摄调整为对目标对象进行拍摄,进一步提升了用户在无人机沿预设航线飞行的过程中的拍摄自由度。
可选地,为了满足更多的拍摄场景需求,在无人机对目标对象进行持续拍摄的过程中,用户可以通过手动控制对拍摄画面进行调整。
在一些实施例中,上述控制方法,还可以包括:
控制无人机对目标对象进行持续拍摄的过程中,自动控制无人机或者无人机上搭载的拍摄装置,以使得目标对象保持在拍摄装置获取的第一图像的第一方向的预设位置;
控制无人机对目标对象进行持续拍摄的过程中,根据用户输入的第二操作, 控制无人机或者无人机上搭载的拍摄装置,以调整目标对象在第一图像的第二方向的位置。
其中,第一图像可以由无人机的拍摄装置获取,其可以包括预设时间段内的多帧图像。目标对象保持在第一图像的第一方向的预设位置,可以理解为目标对象在第一图像的第一方向上的位置大致不变。
在此过程中,用户也可以通过第二操作来控制无人机的姿态或者无人机上搭载的拍摄装置的姿态,以调整目标对象在第二方向的位置。第二操作与第一操作类似,可以为用户针对控制终端输入的操作,例如针对摇杆、拨轮或者滑动按键的输入操作。
在一些实施例中,第二操作包括用户对控制摇杆输入的操作,根据用户输入的第二操作,控制无人机或者无人机上搭载的拍摄装置,包括:
根据对上述控制摇杆输入的操作的方向和杆量,控制无人机的姿态或者无人机上搭载的拍摄装置的姿态。
举例而言,一并参照图8至图9,图8和图9可以分别为无人机在沿预设航线飞行时,两个时间点的第一图像。其中,第一方向为图中的X轴方向,即水平方向,第二方向为图中的Y轴方向,即竖直方向,目标对象为图中所示的高楼。在无人机沿预设航线飞行的过程中,可以通过自动控制无人机的姿态或者无人机上搭载的拍摄装置的姿态,以使得目标对象在图8和图9的水平方向的位置大致保持不变。而在此过程中,用户可以通过在控制终端的控制摇杆的输入操作,控制无人机的姿态或者无人机上搭载的拍摄装置的姿态,以改变目标对象在第一图像竖直方向的位置,如图8和图9所示。
这样,用户可以在无人机沿预设航线飞行,且对目标对象进行持续拍摄的过程中,可以由无人机自动控制目标对象保持第一图像的第一方向的预设位置,而通过手动调整目标对象在第一图像的第二方向上的位置,从而在对目标对象进行持续拍摄的过程中给予用户更大的运镜自由度,进一步提升了用户的使用体验。
第一方向和第二方向可以根据实际的应用场景进行具体设置。其控制策略可以根据第一方向和第二方向的设置确定。可选地,为了便于用户控制,第一方向和第二方向可以为相互垂直的两个方向,且第一方向和第二方向可以与无人机和/或拍摄装置的偏航轴方向及俯仰轴方向关联。
在一些实施例中,第一方向可以为第一图像的竖直方向。上述自动控制无人机的姿态或者无人机上搭载的拍摄装置的姿态的步骤,具体可以包括:自动控制无人机的姿态或者无人机上搭载的拍摄装置沿偏航轴的姿态,以使得目标对象保持在拍摄装置获取的第一图像的第一方向的预设位置。
上述第二方向可以为第一图像的竖直方向,上述根据用户输入的第二操作,控制无人机的姿态或者无人机上搭载的拍摄装置的姿态的步骤,具体可以包括:
根据用户输入的第二操作,控制无人机或者无人机上搭载的拍摄装置沿俯仰轴的姿态。
应理解,为了避免控制冲突,实现无人机自动控制过程和根据用户第二操作进行控制的过程的解耦,两个控制步骤中的控制对象可以不同。由于无人机 的机头能够在偏航轴方向360°转动,在一些实施例中,上述自动控制无人机或者无人机上搭载的拍摄装置的步骤,具体可以包括:
自动控制无人机的机头沿偏航轴的偏转,以使得目标对象保持在拍摄装置获取的第一图像的第一方向的预设位置。
相应地,由于无人机调整自身在俯仰轴上的姿态较为困难,为了简化控制,上述根据用户输入的第二操作,控制无人机的姿态或者无人机上搭载的拍摄装置的姿态的步骤,具体可以包括:
根据用户输入的第二操作,控制无人机上搭载的拍摄装置沿俯仰轴的姿态。
举例而言,在目标对象为高楼的情况下,为了在对高楼进行持续拍摄的过程中,能够支持俯仰轴方向更多的运镜角度,以拍摄到高楼的不同高度,无人机可以在对高楼进行持续拍摄的过程中,自动控制机头姿态,以使得目标对象保持在第一图像的竖直方向的预设位置。同时,用户可以通过输入的第二操作,调节云台的俯仰角度,以控制拍摄装置沿俯仰轴的姿态,以拍摄高楼不同高度的画面。
在其他可选的实施例中,第一方向也可以为第一图像的竖直方向,第二方向也可以为第一图像的水平方向,以在目标对象为具有较长长度(如围墙、树林等)的目标时,支持偏航轴方向的更多运镜角度。
可以理解的是,为了避免目标对象为动态目标的情况下,用户手动控制无人机或者无人机上搭载的拍摄装置导致目标对象脱离拍摄装置的视野,进而无法继续识别与追踪锁定目标对象,在对目标对象进行持续拍摄的过程中,用户通过手动控制对拍摄画面进行调整的功能可以在目标对象为静态目标的情况下开启。在目标对象为静态目标的情况下,无人机可以根据目标对象的静态位置信息,执行自动控制无人机或者无人机上搭载的拍摄装置,以使得目标对象保持在拍摄装置获取的第一图像的第一方向的预设位置。
同时,为了避免用户的误操作,在对目标对象进行持续拍摄的过程中,用户通过手动控制对拍摄画面进行调整的功能可以在航线设置或者编辑时,预设为开启状态或者禁用状态。
可选地,在一些实施例中,用户也可以通过第二操作设置控制参数,以使得无人机根据第二操作对应的控制参数,自动控制无人机的姿态或者无人机上搭载的拍摄装置的姿态,以调整目标对象在第一图像的第二方向的位置。
举例而言,控制终端的交互界面可以具有调整控制参数,例如控制机头姿态、云台俯仰等的交互控件,用户可以通过对交互控件的输入操作,来使得无人机自动控制无人机的姿态或者无人机上搭载的拍摄装置的姿态,以调整目标对象在第一图像的第二方向的位置。
上述控制参数可以根据实际需要进行设置,在一些实施例中,控制参数可以包括以下至少一项:
拍摄装置和/或无人机的机头的转动方向;
拍摄装置和/或无人机的机头的转动速度;
拍摄装置和/或无人机的机头的转动加速度;
拍摄装置和/或无人机在无人机到达其中一个预设航点时的姿态。
进一步地,为了简化用户操作,用户可以预先设置机头姿态或云台俯仰的运动轨迹,例如在预设时间段由A角度转动至B角度,并通过第二操作控制无人机按照预先设置机头姿态或云台俯仰的运动轨迹,自动控制无人机的姿态或者无人机上搭载的拍摄装置的姿态,以调整目标对象在第一图像的第二方向的位置。
参照图10,图10是本申请实施例提供的一种控制方法的步骤流程图。该控制方法可以应用于上述的控制终端200,或应用于上述无人机100和控制终端200组成的***。该控制终端200可以与无人机100通信连接,以根据控制终端200发送的控制指令,控制无人机100执行对应的操作。其中,该方法包括:
S201、在无人机沿预设航线飞行时,接收用户输入的指示目标对象的第一操作。
其中,第一操作用于生成第一指令,第一指令用于控制无人机继续沿预设航线飞行的同时,控制所述无人机的姿态和/或所述无人机上搭载的拍摄装置的姿态,以在所述无人机沿所述预设航线飞行的过程中对所述目标对象进行持续拍摄。预设航线包括至少两个预设航点。上述第一指令可以用于控制无人机执行上述控制方法实施例中的方法步骤,为避免重复在此不再赘述。
控制终端可以在无人机沿预设航线飞行时,接收用户输入的指示目标对象的第一操作,第一操作用于生成第一指令,第一指令用于控制第一指令用于控制无人机继续沿预设航线飞行的同时,控制所述无人机的姿态和/或所述无人机上搭载的拍摄装置的姿态,以在所述无人机沿所述预设航线飞行的过程中对所述目标对象进行持续拍摄。这样,用户可以在无人机沿预设航线飞行的过程中,通过对控制终端的第一操作实时选择目标对象进行持续拍摄,提升了拍摄自由度。
应理解的是,控制终端可以根据用户的第一操作生成第一指令,或者,控制终端可以根据用户的第一操作与无人机通信,并指示无人机生成第一指令,在此不作限定。
在一些实施例中,步骤201之前,上述方法还可以包括:
显示无人飞行器的拍摄装置采集的第一图像。
接收用户输入的指示目标对象的第一操作,具体可以包括:
接收对第一图像的第一图像区域的第一操作,第一图像区域包括目标对象对应的图像内容。
示例性地,参照图11至图12,在一些实施例中,第一操作可以包括框选操作,第一图像区域可以为框选操作确定的图像区域。通过框选操作可以较为容易地确定图像区域,简化了用户的操作。
可选地,在无人机沿预设航线飞行的过程中,由于控制终端可以获取并显示无人机实时采集的第一图像,因而为了便于用户选择目标对象,可以通过图像识别技术,生成对应第一图像上属于预设类型的对象的交互控件。
在一些实施例中,第一操作可以包括点击操作,步骤201之前,上述方法还可以包括:在第一图像显示至少一个交互控件,交互控件对应第一图像中的 至少部分图像区域。接收对第一图像的第一图像区域的第一操作,包括:接收对第一图像区域对应的交互控件的点击操作。
具体地,为了便于用户选择目标对象,在一些实施例中,在第一图像显示至少一个交互控件,具体可以包括:根据预设类型在第一图像对应的图像区域,显示对应预设类型的至少一个交互控件。预设类型可以包括车、船和人。
这样,通过预先对第一图像进行识别,并针对第一图像中属于预设类型的对象,生成相应的交互控件,用户即可以通过对交互控件的输入操作,确定属于预设类型的目标对象,提升了用户操作的便捷性。
参照图13,图13是本申请实施例提供的一种控制方法的步骤流程图。该控制方法可以应用于上述的无人机100,或应用于上述无人机100和控制终端200组成的***。该无人机100可以与控制终端200通信连接,以根据控制终端200发送的控制指令,控制无人机100执行对应的操作。其中,该方法包括:
S301、在无人机沿预设航线飞行时,控制无人机对预设的目标对象进行拍摄以生成第一图像。
S302、控制无人机对目标对象进行拍摄的过程中,自动控制无人机或者无人机上搭载的拍摄装置,以使得目标对象保持在第一图像的第一方向的预设位置。
S303、控制无人机对目标对象进行持续拍摄的过程中,根据用户输入的第二操作,控制无人机或者无人机上搭载的拍摄装置,以调整目标对象在第一图像的第二方向的位置。其中,预设航线包括至少两个预设航点。
步骤301中,目标对象可以由用户在进行航线设置或编辑时预先设置。例如,一些实施例中,用户可以通过在控制终端显示的地图上进行点选操作确定目标对象;一些实施例中,目标对象可以由用户通过在控制终端显示的拍摄界面上进行框选操作后,基于图像识别确定;一些实施例中,用户也可以控制无人机飞行到指定的地理位置或地理位置范围后,通过对控制终端的操作将指定的地理位置或地理位置范围确定为目标对象。
在其他可选的实施例中,目标对象也可以为无人机在沿预设航线飞行的过程中,通过用户输入的操作确定,具体可以参照上述实施例的解释说明。
步骤302和步骤303、以及下述实施例的具体实现和有益效果均可以参照上述实施例的方法步骤的解释说明,为避免重复,在此不再赘述。
这样,用户可以在无人机沿预设航线飞行,且对目标对象进行持续拍摄的过程中,可以由无人机自动控制目标对象保持在第一图像的第一方向的预设位置,而通过手动调整目标对象在第一图像的第二方向上的位置,从而在对目标对象进行持续拍摄的过程中给予用户更大的运镜自由度,进一步提升了用户的使用体验。
在一些实施例中,第一方向可以为第一图像的水平方向,自动控制无人机的姿态或者无人机上搭载的拍摄装置的姿态,具体可以包括:自动控制无人机或者无人机上搭载的拍摄装置沿偏航轴的姿态。
在一些实施例中,第二方向可以为第一图像的竖直方向,根据第二操作,控制无人机的姿态或者无人机上搭载的拍摄装置的姿态,具体可以包括:根据 第二操作,控制无人机或者无人机上搭载的拍摄装置沿俯仰轴的姿态。
在一些实施例中,第二操作可以包括用户对控制摇杆输入的操作,根据第二操作,控制无人机的姿态或者无人机上搭载的拍摄装置的姿态,具体可以包括:根据对控制摇杆输入的操作的方向和杆量改变,控制无人机的姿态或者无人机上搭载的拍摄装置的姿态。
在一些实施例中,根据第二操作,控制无人机的姿态或者无人机上搭载的拍摄装置的姿态,具体可以包括:根据预设的控制参数,自动控制无人机的姿态或者无人机上搭载的拍摄装置的姿态。
在一些实施例中,控制参数,可以包括以下至少一项:
拍摄装置和/或无人机的机头的转动方向;
拍摄装置和/或无人机的机头的转动速度;
拍摄装置和/或无人机的机头的转动加速度;
拍摄装置和/或无人机在无人机到达其中一个预设航点时的姿态。
参照图14,图14是本申请实施例提供的一种控制方法的步骤流程图。该控制方法可以应用于上述的控制终端200,或应用于上述无人机100和控制终端200组成的***。该控制终端200可以与无人机100通信连接,以根据控制终端200发送的控制指令,控制无人机100执行对应的操作。其中,该方法包括:
S401、在无人机沿预设航线飞行时,获取无人机对预设的目标对象拍摄生成的第一图像。
S402、接收用户输入的第二操作。
其中,第二操作用于生成第二指令,第二指令用于:
在控制无人机对目标对象进行持续拍摄的过程中,自动控制无人机的姿态或者无人机上搭载的拍摄装置的姿态,以使得目标对象保持在第一图像的第一方向的预设位置。
控制无人机对目标对象进行持续拍摄的过程中,根据第二操作,控制无人机的姿态或者无人机上搭载的拍摄装置的姿态,以调整目标对象在第一图像的第二方向的位置。其中,预设航线包括至少两个预设航点。
第二指令具体可以用于控制无人机执行如上述应用于无人机的控制方法实施例所述的方法步骤,为避免重复,在此不再赘述。
请参阅图15,图15是本申请实施例提供的一种控制装置的结构示意性框图。该控制装置可以应用于前述的无人机100或控制终端200,其中,该控制装置可以集成于前述的无人机100或控制终端200,也可以与无人机100或控制终端200独立设置,并通信连接,前述的控制方法也可以应用于该控制装置。
如图15所示,该控制装置500包括处理器501及存储器502,处理器501、存储器502通过总线503连接,该总线503比如为I2C(Inter-integrated Circuit)总线。
具体地,处理器501可以是微控制单元(Micro-controller Unit,MCU)、中央处理单元(Central Processing Unit,CPU)或数字信号处理器(Digital Signal Processor,DSP)等。
具体地,存储器502可以是Flash芯片、只读存储器(ROM,Read-Only Memory)磁盘、光盘、U盘或移动硬盘等。
其中,所述处理器501用于运行存储在存储器502中的计算机程序,并在执行所述计算机程序时实现如下步骤:
在所述无人机沿预设航线飞行时,获取第一指令;其中,所述第一指令根据所述无人机沿所述预设航线飞行时获取用户输入的指示目标对象的第一操作生成;响应于所述第一指令,控制所述无人机继续沿所述预设航线飞行的同时,控制所述无人机的姿态和/或所述无人机上搭载的拍摄装置的姿态,以在所述无人机沿所述预设航线飞行的过程中对所述目标对象进行持续拍摄;其中,所述预设航线包括至少两个预设航点。
或者,实现如下步骤:在所述无人机沿预设航线飞行时,接收用户输入的指示目标对象的第一操作。其中,所述第一操作用于生成第一指令,所述第一指令用于控制所述无人机继续沿所述预设航线飞行的同时,控制所述无人机的姿态和/或所述无人机上搭载的拍摄装置的姿态,以在所述无人机沿所述预设航线飞行的过程中对所述目标对象进行持续拍摄;所述预设航线包括至少两个预设航点。
或者,实现如下步骤:在所述无人机沿预设航线飞行时,控制所述无人机对预设的目标对象进行拍摄以生成第一图像;控制所述无人机对所述目标对象进行拍摄的过程中,自动控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态,以使得所述目标对象保持在所述第一图像的第一方向的预设位置;控制所述无人机对所述目标对象进行持续拍摄的过程中,根据用户输入的第二操作,控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态,以调整所述目标对象在所述第一图像的第二方向的位置;其中,所述预设航线包括至少两个预设航点。
或者,实现如下步骤:在所述无人机沿预设航线飞行时,获取所述无人机对预设的目标对象拍摄生成的第一图像;接收用户对第一图像输入的第二操作;其中,所述第二操作用于生成第二指令,所述第二指令用于:在控制所述无人机对所述目标对象进行持续拍摄的过程中,自动控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态,以使得所述目标对象保持在所述第一图像的第一方向的预设位置;控制所述无人机对所述目标对象进行持续拍摄的过程中,根据所述第二操作,控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态,以调整所述目标对象在所述第一图像的第二方向的位置;其中,所述预设航线包括至少两个预设航点。
需要说明的是,所属领域的技术人员可以清楚地了解到,为了描述的方便和简洁,上述描述的控制装置的具体工作过程,可以参考前述控制方法实施例中的对应过程,在此不再赘述。
本申请实施例还提供一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,所述计算机程序中包括程序指令,所述处理器执行所述程序指令,实现上述实施例提供的控制方法的步骤。
其中,所述计算机可读存储介质可以是前述任一实施例所述的无人机或控 制终端的内部存储单元,例如无人机或控制终端的硬盘或内存。所述计算机可读存储介质也可以是无人机或控制终端的外部存储设备,例如无人机或控制终端上配备的插接式硬盘,智能存储卡(Smart Media Card,SMC),安全数字(Secure Digital,SD)卡,闪存卡(Flash Card)等。
应当理解,在此本申请说明书中所使用的术语仅仅是出于描述特定实施例的目的而并不意在限制本申请。如在本申请说明书和所附权利要求书中所使用的那样,除非上下文清楚地指明其它情况,否则单数形式的“一”、“一个”及“该”意在包括复数形式。
还应当理解,在本申请说明书和所附权利要求书中使用的术语“和/或”是指相关联列出的项中的一个或多个的任何组合以及所有可能组合,并且包括这些组合。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到各种等效的修改或替换,这些修改或替换都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以权利要求的保护范围为准。

Claims (36)

  1. 一种无人机的控制方法,其特征在于,包括:
    在所述无人机沿预设航线飞行时,获取第一指令;其中,所述第一指令根据所述无人机沿所述预设航线飞行时获取用户输入的指示目标对象的第一操作生成;
    响应于所述第一指令,控制所述无人机继续沿所述预设航线飞行的同时,控制所述无人机的姿态和/或所述无人机上搭载的拍摄装置的姿态,以在所述无人机沿所述预设航线飞行的过程中对所述目标对象进行持续拍摄;其中,所述预设航线包括至少两个预设航点。
  2. 根据权利要求1所述的方法,其特征在于,控制所述无人机的姿态和/或所述无人机上搭载的拍摄装置的姿态,以在所述无人机沿所述预设航线飞行的过程中对所述目标对象进行持续拍摄,包括以下至少一项:
    控制所述无人机改变机头姿态,以在所述无人机沿所述预设航线飞行的过程中对所述目标对象进行持续拍摄;
    控制所述无人机上搭载的拍摄装置沿俯仰轴的姿态,以在所述无人机沿所述预设航线飞行的过程中对所述目标对象进行持续拍摄;
    控制所述无人机上搭载的拍摄装置沿偏航轴的姿态,以在所述无人机沿所述预设航线飞行的过程中对所述目标对象进行持续拍摄。
  3. 根据权利要求1所述的方法,其特征在于,所述控制所述无人机的姿态和/或所述无人机上搭载的拍摄装置的姿态,以在所述无人机沿所述预设航线飞行的过程中对所述目标对象进行持续拍摄,包括:
    在所述目标对象属于预设类型的情况下,控制所述无人机根据实时采集到的所述目标对象的状态信息改变所述无人机的姿态和/或所述拍摄装置的姿态,以使得所述目标对象处于运动状态时,所述无人机和/或所述拍摄装置能够跟随所述目标对象进行持续拍摄。
  4. 根据权利要求3所述的方法,其特征在于,所述目标对象的运动状态信息和位置信息至少部分由所述无人机上搭载的拍摄装置获取。
  5. 根据权利要求1所述的方法,其特征在于,所述控制所述无人机的姿态和/或所述无人机上搭载的拍摄装置的姿态,以在所述无人机沿所述预设航线飞行的过程中对所述目标对象进行持续拍摄,包括:
    在所述目标对象不属于预设类型的情况下,控制所述无人机根据所述目标对象的初始状态信息改变所述无人机的姿态和/或所述拍摄装置的姿态,以在所述无人机沿所述预设航线飞行的过程中对所述目标对象进行持续拍摄。
  6. 根据权利要求3-5中任一项所述的方法,其特征在于,所述预设类型包括车、船和人。
  7. 根据权利要求1所述的方法,其特征在于,控制所述无人机的姿态和/或所述无人机上搭载的拍摄装置的姿态,以在所述无人机沿所述预设航线飞行的过程中对所述目标对象进行持续拍摄,包括:
    在所述无人机满足第一预设条件之前,控制所述无人机的姿态和/或所述无人机上搭载的拍摄装置的姿态,以在所述无人机沿所述预设航线飞行的过程中对所述目标对象进行持续拍摄;
    其中,所述第一预设条件包括所述无人机到达所述预设航点中的目标预设航点,所述目标预设航点包括位于所述无人机在所述预设航线上的位置之后的其中一个所述预设航点。
  8. 根据权利要求7所述的方法,其特征在于,所述第一预设条件还包括:
    在控制所述无人机继续沿所述预设航线飞行的过程中,至少部分所述目标对象脱离所述拍摄装置的视野。
  9. 根据权利要求7或8所述的方法,其特征在于,所述方法还包括:
    在所述无人机满足所述第一预设条件的其中一项时,按照所述预设航线的原始预设参数控制所述无人机的姿态和/或所述无人机上搭载的拍摄装置的姿态。
  10. 根据权利要求1所述的方法,其特征在于,控制所述无人机的姿态和/或所述无人机上搭载的拍摄装置的姿态,以在所述无人机沿所述预设航线飞行的过程中对所述目标对象进行持续拍摄,包括:
    若所述无人机满足第二预设条件,则控制所述无人机的姿态和/或所述无人机上搭载的拍摄装置的姿态,以在所述无人机沿所述预设航线飞行的过程中对所述目标对象进行持续拍摄;所述第二预设条件包括所述无人机的姿态和/或拍摄装置的姿态能够由用户手动控制。
  11. 根据权利要求10所述的方法,其特征在于,所述第二预设条件包括以下至少一项:
    所述无人机的姿态和/或所述无人机上搭载的拍摄装置的姿态的控制模式为手动控制模式;
    所述预设航线不存在预设的目标兴趣点;其中,在所述预设航线存在所述目标兴趣点的情况下,所述无人机在按照所述预设航线飞行的过程中,控制所述无人机的姿态和/或所述无人机上搭载的拍摄装置的姿态,以对所述目标兴趣点进行持续拍摄。
  12. 根据权利要求1所述的方法,其特征在于,控制所述无人机的姿态和/或所述无人机上搭载的拍摄装置的姿态,以在所述无人机沿所述预设航线飞行的过程中对所述目标对象进行持续拍摄,包括:
    若所述预设航线存在预设的目标兴趣点,则控制所述无人机的姿态和/或所述无人机上搭载的拍摄装置的姿态,以将所述无人机由对所述目标兴趣点进行持续拍摄调整为对所述目标对象进行拍摄。
  13. 根据权利要求1所述的方法,其特征在于,所述方法还包括:
    控制所述无人机对所述目标对象进行持续拍摄的过程中,自动控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态,以使得所述目标对象保持在所述拍摄装置获取的第一图像的第一方向的预设位置;
    控制所述无人机对所述目标对象进行持续拍摄的过程中,根据用户输入的第二操作,控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态,以调整所述目标对象在所述第一图像的第二方向的位置。
  14. 根据权利要求13所述的方法,其特征在于,所述第一方向为所述第一图像的水平方向,所述自动控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态,包括:
    自动控制无人机或者所述无人机上搭载的拍摄装置沿偏航轴的姿态。
  15. 根据权利要求13所述的方法,其特征在于,所述第二方向为所述第一图像的竖直方向,所述根据用户输入的第二操作,控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态,包括:
    根据用户输入的第二操作,控制无人机或者所述无人机上搭载的拍摄装置沿俯仰轴的姿态。
  16. 根据权利要求13所述的方法,其特征在于,所述第二操作包括用户对控制摇杆输入的操作,所述控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态,包括:
    根据对上述控制摇杆输入的操作的方向和杆量,控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态。
  17. 根据权利要求13所述的方法,其特征在于,所述根据用户输入的第二操作,控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态,包括:
    根据所述第二操作对应的控制参数,自动控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态。
  18. 根据权利要求17所述的方法,其特征在于,所述控制参数,包括以下至少一项:
    所述拍摄装置和/或所述无人机的机头转动方向;
    所述拍摄装置和/或所述无人机的转动速度;
    所述拍摄装置和/或所述无人机的转动加速度;
    所述拍摄装置和/或所述无人机在所述无人机到达其中一个所述预设航点时的姿态。
  19. 一种无人机的控制方法,其特征在于,包括:
    在所述无人机沿预设航线飞行时,接收用户输入的指示目标对象的第一操作;
    其中,所述第一操作用于生成第一指令,所述第一指令用于控制所述无人机继续沿所述预设航线飞行的同时,控制所述无人机的姿态和/或所述无人机上搭载的拍摄装置的姿态,以在所述无人机沿所述预设航线飞行的过程中对所述目标对象进行持续拍摄;所述预设航线包括至少两个预设航点。
  20. 根据权利要求19所述的方法,其特征在于,所述接收用户输入的指示目标对象的第一操作之前,所述方法还包括:
    显示所述无人飞行器的拍摄装置采集的第一图像;
    所述接收用户输入的指示目标对象的第一操作,包括:
    接收对第一图像的第一图像区域的第一操作,所述第一图像区域包括所述目标对象对应的图像内容。
  21. 根据权利要求20所述的方法,其特征在于,所述第一操作包括框选操作,所述第一图像区域为所述框选操作确定的图像区域。
  22. 根据权利要求20所述的方法,其特征在于,所述第一操作包括点击操作,所述接收用户输入的指示目标对象的第一操作之前,所述方法还包括:
    在所述第一图像显示至少一个交互控件,所述交互控件对应所述第一图像中的至少部分图像区域;
    所述接收对第一图像的第一图像区域的第一操作,包括:
    接收对所述第一图像区域对应的交互控件的点击操作。
  23. 根据权利要求22所述的方法,其特征在于,在所述第一图像显示至少一个交互控件,包括:
    根据预设类型在所述第一图像对应的图像区域,显示对应所述预设类型的至少一个交互控件;所述预设类型包括车、船和人。
  24. 一种无人机的控制方法,其特征在于,包括:
    在所述无人机沿预设航线飞行时,控制所述无人机对预设的目标对象进行拍摄以生成第一图像;
    控制所述无人机对所述目标对象进行拍摄的过程中,自动控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态,以使得所述目标对象保持在所述第一图像的第一方向的预设位置;
    控制所述无人机对所述目标对象进行持续拍摄的过程中,根据用户输入的第二操作,控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态,以调整所述目标对象在所述第一图像的第二方向的位置;其中,所述预设航线包括至少两个预设航点。
  25. 根据权利要求24所述的方法,其特征在于,所述第一方向为所述第一图像的水平方向,所述自动控制所述无人机的姿态或者所述无人机上搭载的拍 摄装置的姿态,包括:
    自动控制无人机或者所述无人机上搭载的拍摄装置沿偏航轴的姿态。
  26. 根据权利要求24所述的方法,其特征在于,所述第二方向为所述第一图像的竖直方向,根据所述第二操作,控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态,包括:
    根据所述第二操作,控制无人机或者所述无人机上搭载的拍摄装置沿俯仰轴的姿态。
  27. 根据权利要求24所述的方法,其特征在于,所述第二操作包括用户对控制摇杆输入的操作,根据所述第二操作,控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态,包括:
    根据对所述控制摇杆输入的操作的方向和杆量改变,控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态。
  28. 根据权利要求24所述的方法,其特征在于,根据所述第二操作,控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态,包括:
    根据预设的控制参数,自动控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态。
  29. 根据权利要求28所述的方法,其特征在于,所述控制参数,包括以下至少一项:
    所述拍摄装置和/或所述无人机的机头的转动方向;
    所述拍摄装置和/或所述无人机的机头的转动速度;
    所述拍摄装置和/或所述无人机的机头的转动加速度;
    所述拍摄装置和/或所述无人机在所述无人机到达其中一个所述预设航点时的姿态。
  30. 一种无人机的控制方法,其特征在于,包括:
    在所述无人机沿预设航线飞行时,获取所述无人机对预设的目标对象拍摄生成的第一图像;
    接收用户对第一图像输入的第二操作;
    其中,所述第二操作用于生成第二指令,所述第二指令用于:
    在控制所述无人机对所述目标对象进行持续拍摄的过程中,自动控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态,以使得所述目标对象保持在所述第一图像的第一方向的预设位置;
    控制所述无人机对所述目标对象进行持续拍摄的过程中,根据所述第二操作,控制所述无人机的姿态或者所述无人机上搭载的拍摄装置的姿态,以调整所述目标对象在所述第一图像的第二方向的位置;其中,所述预设航线包括至少两个预设航点。
  31. 一种无人机的控制装置,其特征在于,所述控制装置包括存储器和处理 器;
    所述存储器,用于存储计算机程序;
    所述处理器,用于执行所述计算机程序并在执行所述计算机程序时,实现如权利要求1-18中任一项所述的方法步骤,或者,实现如权利要求24-29中任一项所述的方法步骤。
  32. 一种无人机的控制装置,其特征在于,所述控制装置包括存储器和处理器;
    所述存储器,用于存储计算机程序;
    所述处理器,用于执行所述计算机程序并在执行所述计算机程序时,实现如权利要求19-23中任一项所述的方法步骤,或者,实现如权利要求30所述的方法步骤。
  33. 一种无人机,其特征在于,包括如权利要求31所述的控制装置。
  34. 一种控制终端,其特征在于,包括如权利要求32所述的控制装置。
  35. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质存储有计算机程序,所述计算机程序被处理器执行时使所述处理器实现如权利要求1-18中任一项所述的方法步骤,或者,实现如权利要求24-29中任一项所述的方法步骤。
  36. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质存储有计算机程序,所述计算机程序被处理器执行时使所述处理器实现如权利要求19-23中任一项所述的方法步骤,或者,实现如权利要求30所述的方法步骤。
PCT/CN2022/129378 2022-11-02 2022-11-02 无人机的控制方法、装置及存储介质 WO2024092586A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/129378 WO2024092586A1 (zh) 2022-11-02 2022-11-02 无人机的控制方法、装置及存储介质


Publications (1)

Publication Number Publication Date
WO2024092586A1 2024-05-10

Family

ID=90929238

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/129378 WO2024092586A1 (zh) 2022-11-02 2022-11-02 无人机的控制方法、装置及存储介质

Country Status (1)

Country Link
WO (1) WO2024092586A1 (zh)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100228406A1 (en) * 2009-03-03 2010-09-09 Honeywell International Inc. UAV Flight Control Method And System
CN105518555A (zh) * 2014-07-30 2016-04-20 深圳市大疆创新科技有限公司 目标追踪***及方法
CN107943088A (zh) * 2017-12-22 2018-04-20 广州亿航智能技术有限公司 一种控制无人机的方法及其***
US20200073385A1 (en) * 2018-09-04 2020-03-05 Skydio, Inc. Applications and skills for an autonomous unmanned aerial vehicle
CN110771141A (zh) * 2018-11-19 2020-02-07 深圳市大疆创新科技有限公司 拍摄方法和无人机
WO2022193081A1 (zh) * 2021-03-15 2022-09-22 深圳市大疆创新科技有限公司 无人机的控制方法、装置及无人机

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
DJI: "Manvic2 Enterprise Series User Manual 1.8", MANVIC2 ENTERPRISE, 30 April 2021 (2021-04-30), pages 1 - 67, XP009556162 *

Similar Documents

Publication Publication Date Title
US20200346753A1 (en) Uav control method, device and uav
WO2019227441A1 (zh) 可移动平台的拍摄控制方法和设备
WO2020211813A1 (zh) 立面环绕飞行的控制方法、装置、终端及存储介质
WO2020143677A1 (zh) 一种飞行控制方法及飞行控制***
WO2020019106A1 (zh) 云台和无人机控制方法、云台及无人机
WO2019061295A1 (zh) 一种视频处理方法、设备、无人机及***
WO2021026789A1 (zh) 基于手持云台的拍摄方法、手持云台及存储介质
WO2019227289A1 (zh) 延时拍摄控制方法和设备
WO2019195991A1 (zh) 运动轨迹确定、延时摄影方法、设备及机器可读存储介质
WO2021212445A1 (zh) 拍摄方法、可移动平台、控制设备和存储介质
WO2018090807A1 (zh) 飞行拍摄控制***和方法、智能移动通信终端、飞行器
WO2020019212A1 (zh) 视频播放速度控制方法及***、控制终端和可移动平台
US20210004005A1 (en) Image capture method and device, and machine-readable storage medium
WO2020133410A1 (zh) 一种拍摄方法及装置
WO2020014953A1 (zh) 一种图像处理方法及设备
JP4896115B2 (ja) 空中移動体からの自動追尾撮影装置
WO2020087346A1 (zh) 拍摄控制方法、可移动平台、控制设备及存储介质
WO2022082440A1 (zh) 确定目标跟随策略的方法、装置、***、设备及存储介质
WO2022061541A1 (zh) 控制方法、手持云台、***及计算机可读存储介质
WO2020209167A1 (ja) 情報処理装置、情報処理方法、及びプログラム
WO2021217408A1 (zh) 无人机***及其控制方法和装置
WO2024092586A1 (zh) 无人机的控制方法、装置及存储介质
JP2018014608A (ja) 制御装置、撮像装置、移動体、制御方法、及びプログラム
WO2022141122A1 (zh) 无人机的控制方法、无人机及存储介质
WO2019227352A1 (zh) 飞行控制方法及飞行器