WO2019104554A1 - Control method for unmanned aerial vehicle and control terminal - Google Patents

Control method for unmanned aerial vehicle and control terminal

Info

Publication number
WO2019104554A1
WO2019104554A1 PCT/CN2017/113653 CN2017113653W WO2019104554A1 WO 2019104554 A1 WO2019104554 A1 WO 2019104554A1 CN 2017113653 W CN2017113653 W CN 2017113653W WO 2019104554 A1 WO2019104554 A1 WO 2019104554A1
Authority
WO
WIPO (PCT)
Prior art keywords
drone
target
drones
job task
control
Prior art date
Application number
PCT/CN2017/113653
Other languages
English (en)
Chinese (zh)
Inventor
钟和立
吴旭民
Original Assignee
深圳市大疆创新科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2017/113653 (WO2019104554A1)
Priority to CN201780026702.9A (CN109154828A)
Publication of WO2019104554A1

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/10: Simultaneous control of position or course in three dimensions
    • G05D 1/101: Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D 1/08: Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D 1/0808: Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft

Definitions

  • Embodiments of the invention relate to the field of drones, and in particular to a control method and a control terminal for a drone.
  • A pilot (flying hand) can control a drone through a remote controller to perform work tasks in a work area, such as a pesticide spraying task. However, when the operating area of the drone is large, a single drone may need a long time to complete the work task, resulting in low operating efficiency of the drone.
  • Embodiments of the present invention provide a control method and a control terminal for a drone to improve the work efficiency of the drone.
  • a first aspect of the present invention provides a control method for a drone, which is applied to a control terminal of a drone, and includes:
  • a second aspect of the embodiments of the present invention provides a control method for a drone, which is applied to a control terminal of a drone, and includes:
  • the target drone being one of at least two drones communicatively coupled to the control terminal;
  • a third aspect of the embodiments of the present invention provides a control terminal for a drone, including: a processor and a communication interface;
  • the processor is used to:
  • the communication interface is configured to transmit the target job task data determined for each of at least two drones communicatively coupled to the control terminal to the corresponding drone, to cause the drone to perform the target job task indicated by the target job task data.
  • a fourth aspect of the embodiments of the present invention provides a control terminal for a drone, including: a processor and a display component;
  • the processor is used to:
  • the target drone being one of at least two drones communicatively coupled to the control terminal;
  • In the control method and the control terminal of the unmanned aerial vehicle provided by the embodiments, the control terminal detects the user's work task assignment operation, determines target job task data for each of the at least two drones communicatively connected with the control terminal according to the work task assignment operation, and sends the target job task data corresponding to each drone to that drone, so that the same control terminal can control a plurality of drones to perform work tasks, thereby improving the operating efficiency of the drones.
  • FIG. 1 is a flowchart of a method for controlling a drone according to an embodiment of the present invention
  • FIG. 2 is a schematic diagram of a user interface according to an embodiment of the present invention.
  • FIG. 3 is a schematic diagram of a user interface according to an embodiment of the present invention.
  • FIG. 4 is a schematic diagram of a user interface according to an embodiment of the present invention.
  • FIG. 5 is a schematic diagram of a user interface according to an embodiment of the present invention.
  • FIG. 6 is a schematic diagram of a user interface according to an embodiment of the present invention.
  • FIG. 7 is a schematic diagram of a user interface according to an embodiment of the present invention.
  • FIG. 8 is a flowchart of a method for controlling a drone according to another embodiment of the present invention.
  • FIG. 9 is a schematic diagram of a user interface according to another embodiment of the present invention.
  • FIG. 10 is a flowchart of a method for controlling a drone according to another embodiment of the present invention.
  • FIG. 11 is a schematic diagram of a user interface according to another embodiment of the present invention.
  • FIG. 12 is a flowchart of a method for controlling a drone according to another embodiment of the present invention.
  • FIG. 13 is a flowchart of a method for controlling a drone according to another embodiment of the present invention.
  • FIG. 14 is a schematic diagram of a user interface according to another embodiment of the present invention.
  • FIG. 15 is a flowchart of a method for controlling a drone according to another embodiment of the present invention.
  • FIG. 16 is a schematic diagram of a user interface according to another embodiment of the present invention.
  • FIG. 17 is a flowchart of a method for controlling a drone according to another embodiment of the present invention.
  • FIG. 18 is a schematic diagram of a user interface according to another embodiment of the present invention.
  • FIG. 19 is a flowchart of a method for controlling a drone according to another embodiment of the present invention.
  • FIG. 21 is a flowchart of a method for controlling a drone according to another embodiment of the present invention.
  • FIG. 22 is a schematic diagram of a user interface according to another embodiment of the present invention.
  • FIG. 23 is a flowchart of a method for controlling a drone according to an embodiment of the present invention.
  • FIG. 24 is a schematic diagram of a user interface according to another embodiment of the present invention.
  • FIG. 25 is a schematic diagram of a user interface according to another embodiment of the present invention.
  • FIG. 26 is a schematic diagram of a user interface according to another embodiment of the present invention.
  • FIG. 27 is a schematic diagram of a user interface according to another embodiment of the present invention.
  • FIG. 29 is a structural diagram of a control terminal of a drone according to another embodiment of the present invention.
  • When a component is referred to as being “fixed to” another component, it can be directly on the other component or an intervening component may be present. When a component is considered to be “connected to” another component, it can be directly connected to the other component or an intervening component may be present.
  • FIG. 1 is a flowchart of a method for controlling a drone according to an embodiment of the present invention.
  • the control method of the drone is applied to a control terminal of the drone.
  • the method in this embodiment may include:
  • Step S101 Detect a job task assignment operation of the user.
  • the execution body of the method of this embodiment may be a control terminal of the drone, and the control terminal may specifically be a remote controller, a smart phone, a tablet computer, a ground control station, a laptop computer, a watch, a wristband, and the like, and combinations thereof.
  • the control terminal is provided with a user interface, and the user can operate the user interface.
  • the control terminal can detect the operation of the user interface on the user, and control the drone connected to the control terminal according to the operation of the user.
  • the control terminal can be communicatively connected with at least two drones, and the drones can specifically be agricultural drones. This embodiment takes a control terminal communicatively connected with two drones as an example.
  • the user interface 20 of the control terminal includes a plurality of icons, such as an icon 21, an icon 22, an icon 23, and an icon 24.
  • the user interface 20 may display a job task list, where each job task in the job task list is a task for a drone to perform; when the user operates (for example, clicks) the icon 22, the user interface 20 may display user information; when the user operates (for example, clicks) the icon 23, the user interface 20 can display information of the drones that are communicatively connected to the control terminal; and when the user operates (for example, clicks) the icon 24, the user interface 20 can display the settings items associated with the control terminal. This is merely a schematic illustration, and does not limit the specific form and content of the user interface, nor the specific shape and function of each icon in the user interface.
  • the job task list may include a plurality of job tasks, and each of the plurality of job tasks may be a job task that a drone has already performed, a job task that a drone has not yet performed, or a job task that a drone is currently performing.
  • the user interface 20 displays a job task icon 25 of No. 1 field, a job task icon 26 of No. 2 field, and a job task icon 27 of No. 3 field.
  • Each job task icon includes a work time, a pre-operation estimated area, the number of plots, a task status, location information, and a task identifier.
  • the task identifier may be a task number, such as a sequence of numbers, or a sequence of other forms, which is not specifically limited herein.
  • The user, for example the pilot (flying hand), can perform an assignment operation on a job task displayed in the job task list. For example, after the user clicks the job task icon 25 or the arrow 251 included in the job task icon 25, the user interface 20 displays a pop-up box 30. The pop-up box 30 is provided for the user to select a drone to perform the work task corresponding to No. 1 field.
  • The control terminal is communicatively connected with the No. 1 drone and the No. 2 drone. The user interface displays a drop-down list 32, and the drop-down list 32 includes the identification information of the No. 1 drone and the identification information of the No. 2 drone. The user can select the identification information of one drone from the drop-down list 32, for example the identification information of the No. 1 drone, and click the OK button to set the No. 1 drone as the drone that performs the work task corresponding to No. 1 field.
  • The job task icon 25 further includes a positioning mark 252 of No. 1 field. When the user operates the positioning mark 252, the user interface displays an electronic map in which the position or area indicated by the positioning information 1 is displayed; optionally, the positioning information 1 is the position information of No. 1 field.
  • The control terminal can detect the user's click operation on the job task icon 25 or on the arrow 251 included in the job task icon 25, and can also detect the user's selection operation on the identification information of the No. 1 drone. The click operation on the job task icon 25 or on the arrow 251, together with the selection operation on the identification information of the No. 1 drone, constitutes the user's assignment operation for the job task corresponding to No. 1 field.
  • Similarly, the control terminal can also detect the user's assignment operation for the job task corresponding to No. 2 field; the user can set the No. 2 drone as the drone that performs the work task corresponding to No. 2 field. The specific process is not described again here.
  • Step S102 Determine target job task data for each of at least two drones communicatively connected to the control terminal according to the detected job task assignment operation.
  • the control terminal may store a plurality of job task data locally, or a plurality of job task data may be stored in a server or storage system communicatively connected to the control terminal. Optionally, each job task data corresponds to one job task icon on the user interface; that is, each job task icon on the user interface can be associated with one job task data through the task identifier. The job task data includes route information, the flight height of the drone along the route, the flight speed, the flight attitude, and the actions that the drone needs to perform when flying along the route, such as spraying or aerial photography.
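  • A minimal sketch of how such job task data could be represented in code is given below (Python). The field names, units, and the in-memory store keyed by task identifier are illustrative assumptions for this description only, not a format defined by the embodiments.

      from dataclasses import dataclass, field
      from typing import List, Tuple

      @dataclass
      class JobTaskData:
          """Illustrative container for the job task data described above."""
          task_id: str                                # task identifier shown on the job task icon
          route: List[Tuple[float, float]]            # route information as (latitude, longitude) waypoints
          flight_height_m: float                      # flight height of the drone along the route
          flight_speed_mps: float                     # flight speed along the route
          flight_attitude: str = "level"              # simplified placeholder for attitude settings
          actions: List[str] = field(default_factory=list)  # e.g. ["spray"] or ["aerial_photo"]

      # A task store keyed by task identifier, kept locally or on a connected server.
      TASK_STORE = {
          "field-1": JobTaskData("field-1", [(22.54, 113.95), (22.55, 113.96)], 3.0, 5.0, actions=["spray"]),
          "field-2": JobTaskData("field-2", [(22.60, 113.90), (22.61, 113.91)], 3.0, 5.0, actions=["spray"]),
      }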
  • When the control terminal detects the assignment operation for the job task corresponding to No. 1 field, the control terminal may acquire, locally or from the server or storage system communicatively connected to the control terminal, the job task data corresponding to the task identifier of the job task corresponding to No. 1 field, and determine that job task data as the target job task data of the No. 1 drone. Similarly, according to the task identifier of the job task corresponding to No. 2 field, the control terminal may acquire, locally or from the server or storage system communicatively connected to the control terminal, the job task data corresponding to that task identifier, and determine it as the target job task data of the No. 2 drone.
  • Step S103 Send the target job task data determined for each of the at least two drones communicatively connected to the control terminal to the corresponding drone, to cause the drone to execute the target job task indicated by the target job task data.
  • After the control terminal determines the target work task data of the No. 1 drone and the No. 2 drone, the target work task data of the No. 1 drone is transmitted to the No. 1 drone so that the No. 1 drone executes the target work task indicated by that data, and the target work task data of the No. 2 drone is sent to the No. 2 drone so that the No. 2 drone executes the target work task indicated by that data.
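  • Steps S101 to S103 amount to mapping each user assignment onto stored task data and pushing that data over the link to the corresponding drone. The sketch below illustrates this flow under stated assumptions: TASK_STORE, send_to_drone, and the payload fields are invented placeholders, not an actual drone SDK.

      # Minimal task store keyed by task identifier; the payload fields are illustrative.
      TASK_STORE = {
          "field-1": {"route": [(22.54, 113.95), (22.55, 113.96)], "height_m": 3.0, "speed_mps": 5.0},
          "field-2": {"route": [(22.60, 113.90), (22.61, 113.91)], "height_m": 3.0, "speed_mps": 5.0},
      }

      def send_to_drone(drone_id, payload):
          # Placeholder for the control terminal's wireless link to a drone.
          print(f"-> {drone_id}: {payload}")

      def dispatch_assignments(assignments):
          """assignments maps a task identifier (e.g. 'field-1') to a drone identifier,
          mirroring the job task assignment operations detected in step S101."""
          for task_id, drone_id in assignments.items():
              task_data = TASK_STORE[task_id]        # step S102: determine target job task data
              send_to_drone(drone_id, task_data)     # step S103: send it to the corresponding drone

      # Example: the work task of No. 1 field goes to drone 1, that of No. 2 field to drone 2.
      dispatch_assignments({"field-1": "drone-1", "field-2": "drone-2"})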
  • The target job task data may include work area information, where the work area information may include at least one of location information, area information, and boundary information of the work area. For example, when the control terminal detects the user's assignment operation for the job task corresponding to No. 1 field, the control terminal can acquire the work area information corresponding to No. 1 field.
  • One implementation manner in which the control terminal acquires the work area information corresponding to No. 1 field is as follows: the control terminal, or a server or storage system communicatively connected to the control terminal, stores the work area information corresponding to each work task, and the control terminal can obtain the work area information corresponding to No. 1 field locally or from the server or storage system connected to it.
  • The icon 253 is an icon for identifying No. 1 field. The user interface will display an electronic map 40 as shown in FIG. 4. When the control terminal detects the user's operation on the icon 41, it determines at least one of the position information, the area information, and the boundary information of the work area corresponding to No. 1 field.
  • After the control terminal determines the work area information corresponding to No. 1 field, it transmits that work area information to the No. 1 drone according to the user's assignment operation for the work task corresponding to No. 1 field. The No. 1 drone can then plan the route to be flown over No. 1 field based on the work area information corresponding to No. 1 field, and execute the route. Similarly, the control terminal transmits the work area information corresponding to No. 2 field to the No. 2 drone; after receiving it, the No. 2 drone can plan the route to be flown over No. 2 field and execute the route.
  • The target job task data sent by the control terminal to a drone may specifically be route information. The target job task data sent by the control terminal to the No. 1 drone may specifically be the route information corresponding to its target work area, for example No. 1 field, and the target job task data sent by the control terminal to the No. 2 drone may specifically be the route information corresponding to its target work area, for example No. 2 field.
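  • As a rough illustration of how a drone could turn work area information into a route, the following sketch sweeps a plot in parallel back-and-forth passes. It assumes an axis-aligned rectangular boundary in local metres and a fixed swath width, which is far simpler than what an actual agricultural route planner would handle; it is only meant to make the "plan the route from the work area information" step concrete.

      def plan_lawnmower_route(boundary, swath_width):
          """Very simplified route planning from work area boundary information.
          boundary is (min_x, min_y, max_x, max_y) in local metres."""
          min_x, min_y, max_x, max_y = boundary
          waypoints, x, reverse = [], min_x, False
          while x <= max_x:
              y_start, y_end = (max_y, min_y) if reverse else (min_y, max_y)
              waypoints.append((x, y_start))
              waypoints.append((x, y_end))
              reverse = not reverse
              x += swath_width
          return waypoints

      # Example: a 40 m x 30 m plot covered with a 5 m spray swath.
      print(plan_lawnmower_route((0.0, 0.0, 40.0, 30.0), 5.0))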
  • The same job task can also be performed by multiple drones. As shown in FIG. 5, the number of plots included in No. 1 field is two. When the user clicks the positioning mark 252 of the positioning information 1 as shown in FIG. 5, the user interface displays an electronic map 60, and the electronic map 60 shows the two plots of No. 1 field, such as parcel A and parcel B.
  • The user interface 20 displays a pop-up box 70 as shown in FIG. 7, and the pop-up box 70 includes a selection box 71 and a selection box 72. The selection box 71 includes a drop-down arrow 711; when the user operates the drop-down arrow 711, the user interface displays a drop-down list 712. The drop-down list 712 includes the identification information of the No. 1 drone and the identification information of the No. 2 drone, and the user can select the identification information of one drone from the drop-down list 712, for example the identification information of the No. 1 drone, so that parcel A is assigned to the No. 1 drone to operate. Likewise, the selection box 72 includes a drop-down arrow 721; when the user operates the drop-down arrow 721, the user interface displays a drop-down list 722. The drop-down list 722 includes the identification information of the No. 1 drone and the identification information of the No. 2 drone, and the user can select the identification information of one drone from the drop-down list 722, for example the identification information of the No. 2 drone, so that parcel B is assigned to the No. 2 drone to operate. In this way, the work task corresponding to No. 1 field is performed jointly by the No. 1 drone and the No. 2 drone. This is only a schematic description, and does not limit the correspondence between the drones that are communicatively connected to the control terminal and the work areas.
  • In this embodiment, the control terminal detects the user's job task assignment operation, determines target job task data for each of the at least two drones communicatively connected to the control terminal according to the job task assignment operation, and sends the target job task data corresponding to each drone to that drone, so that the same control terminal can control a plurality of drones to perform work tasks, thereby improving the operating efficiency of the drones.
  • Embodiments of the present invention provide a method for controlling a drone.
  • FIG. 8 is a flowchart of a method for controlling a drone according to another embodiment of the present invention. As shown in FIG. 8, on the basis of the embodiment shown in FIG. 1, the method in this embodiment may further include:
  • Step S801 Detect a flight state control operation of the user.
  • A control terminal communicatively connected with two drones, namely the No. 1 drone and the No. 2 drone, is again taken as an example. Through the above embodiment, the control terminal can assign target work task data to each of the two drones. For example, the target work task data assigned by the control terminal to the No. 1 drone is the route data corresponding to the work area 91 in the electronic map 90 shown in FIG. 9, and the target work task data assigned by the control terminal to the No. 2 drone is the route data corresponding to the work area 92.
  • The control terminal displays the route 911 corresponding to the work area 91 in the work area 91. The control terminal also transmits the route data corresponding to the work area 92 to the No. 2 drone, and displays the route 921 corresponding to the work area 92 in the work area 92.
  • The control terminal may also display, in the electronic map 90, an icon 912 for marking the No. 1 drone and an icon 922 for marking the No. 2 drone. The No. 1 drone and the No. 2 drone can also transmit their position information to the control terminal in real time; the control terminal adjusts the position of the icon 912 on the route 911 according to the real-time position information of the No. 1 drone, so that the icon 912 moves along the route 911, and adjusts the position of the icon 922 on the route 921 according to the real-time position information of the No. 2 drone, so that the icon 922 moves along the route 921.
  • The user can also control the flight states of the No. 1 drone and the No. 2 drone through the control terminal, for example the flight speed and flight altitude of the No. 1 drone and the No. 2 drone.
  • the user can input data in the input box 931 to control the flying height of the drone No. 1, or adjust the arrow on the progress bar 932 to adjust the flying height of the drone No. 1. It is also possible to input data in input box 941 to control the flight speed of drone 1, or adjust the arrow on progress bar 942 to adjust the flight speed of drone 1.
  • the control terminal can detect the user's control operation on the flight state of the UAV No. 1.
  • Similarly, the user can enter data in the input box 951 to control the flying height of the No. 2 drone, or adjust the arrow on the progress bar 952 to adjust the flying height of the No. 2 drone. Data can also be entered in the input box 961 to control the flight speed of the No. 2 drone, or the arrow on the progress bar 962 can be adjusted to adjust the flight speed of the No. 2 drone. The control terminal can detect the user's control operation on the flight state of the No. 2 drone.
  • Step S802 controlling flight states of the at least two unmanned aerial vehicles or controlling flight states of the target drones in the at least two unmanned aerial vehicles according to the flight state control operation.
  • the control terminal can control the flight state of the UAV No. 1 according to the user's control operation on the flight state of the UAV No. 1, and control the No. 2 unmanned according to the user's control operation on the flight state of the UAV No. 2 Flight status of the aircraft.
  • the user can control only the flight state of the UAV No. 1, or only the flight state of the UAV No. 2.
  • For example, the control terminal may determine the flying height set by the user for the No. 2 drone according to the data entered by the user in the input box 951, and send that flying height to the No. 2 drone, so that the No. 2 drone adjusts its current flight altitude according to the received flying height.
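  • The flight state control described here reduces to turning a user-interface value (an input box or a progress-bar position) into a command addressed to one selected drone. A minimal sketch follows; the command format, the clamping limits, and send_to_drone are assumptions made for illustration only.

      def send_to_drone(drone_id, command):
          # Placeholder for the control terminal's link to the drone.
          print(f"-> {drone_id}: {command}")

      def on_flight_height_input(drone_id, height_m, min_height=1.0, max_height=30.0):
          """Called when the user edits a height input box (e.g. 951) or drags the
          corresponding progress-bar arrow; clamps the value and sends a command so
          the drone adjusts its current flight altitude accordingly."""
          height_m = max(min_height, min(max_height, height_m))
          send_to_drone(drone_id, {"type": "set_flight_height", "height_m": height_m})

      def on_flight_speed_input(drone_id, speed_mps, max_speed=8.0):
          """Same idea for a flight-speed input box / progress bar."""
          speed_mps = max(0.0, min(max_speed, speed_mps))
          send_to_drone(drone_id, {"type": "set_flight_speed", "speed_mps": speed_mps})

      # Example: the user sets a 2.5 m working height for the No. 2 drone only.
      on_flight_height_input("drone-2", 2.5)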
  • the method further includes:
  • Step S1001 Receive target job task execution status information sent by each of the at least two drones.
  • Each of the No. 1 drone and the No. 2 drone can also send the execution status information of the target work task it performs, for example the completion rate, to the control terminal, and the control terminal receives the completion rate of the target job task transmitted by each of the No. 1 drone and the No. 2 drone.
  • Step S1002 Display target job task execution status information sent by each of the at least two drones.
  • After the control terminal receives the completion rate of the target job task sent by each of the No. 1 drone and the No. 2 drone, the completion rate sent by each drone is displayed on the user interface. For example, the display box 111 of the user interface 110 can display the icon 912 of the No. 1 drone and the completion rate of the target work task of the No. 1 drone, and the display box 112 of the user interface 110 can display the icon 922 of the No. 2 drone and the completion rate of the target work task of the No. 2 drone.
  • the method further includes:
  • Step S1201 Receive flight state information sent by each of the at least two drones.
  • each of the UAV No. 1 and UAV No. 2 can also transmit its flight status information to the control terminal.
  • the control terminal receives flight status information transmitted by each of the UAV No. 1 and the UAV No. 2.
  • the flight status information includes one or more of flight speed, flight altitude, location information, and power information.
  • Step S1202 Display flight state information sent by each of the at least two drones.
  • After receiving the flight state information sent by each of the No. 1 drone and the No. 2 drone, the control terminal displays the flight state information sent by each drone on the user interface. For example, the display box 111 of the user interface 110 can also display the battery level, speed, and altitude of the No. 1 drone, and the display box 112 of the user interface 110 can also display the battery level, speed, and altitude of the No. 2 drone.
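  • Steps S1001/S1002 and S1201/S1202 boil down to keeping a per-drone status record that is updated from incoming messages and rendered into each display box. The sketch below assumes an invented message format and a print-based stand-in for the user interface.

      # Latest status per drone, as shown in display boxes 111/112; the message
      # fields (completion, battery, speed_mps, altitude_m) are assumptions.
      status_by_drone = {}

      def on_status_message(msg):
          """Handle a message such as
          {"drone_id": "drone-1", "completion": 0.42, "battery": 0.77, "speed_mps": 4.8, "altitude_m": 3.1}."""
          status_by_drone.setdefault(msg["drone_id"], {}).update(msg)

      def render_display_boxes():
          """Crude stand-in for the user interface's per-drone display boxes."""
          for drone_id, s in sorted(status_by_drone.items()):
              print(f"[{drone_id}] task {s.get('completion', 0.0):.0%} done | "
                    f"battery {s.get('battery', 0.0):.0%} | "
                    f"speed {s.get('speed_mps', 0.0):.1f} m/s | "
                    f"altitude {s.get('altitude_m', 0.0):.1f} m")

      on_status_message({"drone_id": "drone-1", "completion": 0.42, "battery": 0.77, "speed_mps": 4.8, "altitude_m": 3.1})
      on_status_message({"drone_id": "drone-2", "completion": 0.10, "battery": 0.91, "speed_mps": 5.0, "altitude_m": 2.9})
      render_display_boxes()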
  • the method further includes:
  • Step S1301 Receive fault information sent by the first drone of the at least two drones.
  • The drones communicatively connected to the control terminal may further include a No. 3 drone, and the control terminal sends the route data corresponding to the work area 141 to the No. 3 drone according to a job task assignment operation of the user. Correspondingly, the display box 113 of the user interface 140 can also display the icon of the No. 3 drone, the completion rate of the target work task of the No. 3 drone, and the battery level, speed, and altitude of the No. 3 drone. The order of the display boxes 111, 112, and 113 may be determined according to the order in which the control terminal established communication connections with the three drones, or according to the numbers of the three drones.
  • When the No. 1 drone, the No. 2 drone, and the No. 3 drone perform their respective target work tasks, if one of them, for example the No. 2 drone, fails, the No. 2 drone can send fault information to the control terminal, and the control terminal can receive the fault information sent by the No. 2 drone.
  • Step S1302 Display fault indication information associated with the first drone according to the fault information.
  • The fault indication information associated with the fault information is displayed on the user interface, for example the user interface 140: the icon 922 of the No. 2 drone on the route 921 can be displayed in a highlighted state, for example in red; the icon of the No. 2 drone in the display box 112 can be displayed in a highlighted state, for example in red; or/and a fault indication icon 142 can be displayed in the display box 112 to prompt the user that the No. 2 drone has failed.
  • Step S1303 Detecting a fault information viewing operation of the user.
  • The user may click the display box 112, the fault indication icon 142, or the icon 922 of the No. 2 drone to view the fault information of the No. 2 drone. The control terminal can detect the user's click operation on the display box 112, the fault indication icon 142, or the icon 922.
  • Step S1304 Display the fault information sent by the first drone according to the detected fault information viewing operation.
  • The control terminal may display the fault information sent by the No. 2 drone according to the user's click operation on the display box 112, the fault indication icon 142, or the icon 922. By viewing the fault information, the pilot can learn which specific component of the No. 2 drone is faulty, and in this way the working state of the drone can be tracked in time.
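  • Steps S1301 to S1304 can be sketched as a small fault registry: an incoming fault message highlights the drone's icon, and a later click on the display box, the fault indication icon, or the drone icon shows the stored detail. The message shape and UI calls below are illustrative assumptions.

      # Fault information kept by the control terminal, keyed by drone identifier.
      faults = {}

      def on_fault_message(msg):
          """Step S1301, e.g. {"drone_id": "drone-2", "component": "spray pump", "detail": "flow sensor timeout"}."""
          faults[msg["drone_id"]] = msg
          # Step S1302: show fault indication (highlight the drone icon, e.g. in red).
          print(f"UI: highlight icon of {msg['drone_id']} and show a fault indication icon")

      def on_fault_view_click(drone_id):
          """Steps S1303/S1304: the user clicks the display box, the fault indication
          icon, or the drone icon, and the stored fault information is displayed."""
          fault = faults.get(drone_id)
          if fault is None:
              print(f"UI: no fault recorded for {drone_id}")
          else:
              print(f"UI: {drone_id} fault in {fault['component']}: {fault['detail']}")

      on_fault_message({"drone_id": "drone-2", "component": "spray pump", "detail": "flow sensor timeout"})
      on_fault_view_click("drone-2")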
  • In this embodiment, the control terminal receives the target job task execution state information and the flight state information sent by each of the at least two drones communicatively connected to it, and displays the information sent by each drone on the user interface, so that the user can monitor multiple drones at the same time. When the control terminal receives fault information sent by one of the at least two drones, it can also display fault indication information on the user interface to prompt the user which drone has failed, so that the user can find the fault immediately.
  • Embodiments of the present invention provide a method for controlling a drone.
  • FIG. 15 is a flowchart of a method for controlling a drone according to another embodiment of the present invention. As shown in FIG. 15, the method in this embodiment may further include:
  • Step S1501 Detect a takeoff control operation of the user.
  • the "execute task" icon as shown in FIG. 16 can be clicked.
  • the user interface 20 displays a bullet box 160 as shown in FIG. 16, and the bullet box 160 includes a drone No. 1
  • the user can control at least one of the UAV No. 1 and the UAV No. 2 to take off by operating at least one of the slide button 161, the slide button 162, and the slide button 163.
  • sliding the slide button to the right indicates opening.
  • Step S1502 Control the at least two UAVs to take off or control the target UAV in the at least two UAVs to take off according to the takeoff control operation.
  • When the control terminal detects that the user has turned on the slide button 161, the No. 1 drone is controlled to take off; when the control terminal detects that the user has turned on the slide button 162, the No. 2 drone is controlled to take off; and when the control terminal detects that the user has turned on the slide button 163, the No. 1 drone and the No. 2 drone are controlled to take off at the same time.
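  • The three slide buttons map onto a simple dispatch rule: command one drone, the other, or both at once. The sketch below assumes a generic send_to_drone link and an invented button naming; the pause and end operations described later follow the same pattern with a different command type.

      FLEET = ["drone-1", "drone-2"]   # drones communicatively connected to the control terminal

      def send_to_drone(drone_id, command):
          # Placeholder for the control terminal's link to a drone.
          print(f"-> {drone_id}: {command}")

      def on_takeoff_slide_button(button):
          """Assumed mapping: 'takeoff-1'/'takeoff-2' command a single drone
          (cf. slide buttons 161/162), 'takeoff-all' commands both (cf. slide button 163)."""
          targets = FLEET if button == "takeoff-all" else [FLEET[int(button[-1]) - 1]]
          for drone_id in targets:
              send_to_drone(drone_id, {"type": "takeoff"})

      on_takeoff_slide_button("takeoff-1")     # only the No. 1 drone takes off
      on_takeoff_slide_button("takeoff-all")   # both drones take off at the same time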
  • the method further includes:
  • Step S1701 Detect a return flight control operation of the user.
  • The user can also click the “Return” icon shown in FIG. 18 to control at least one of the No. 1 drone and the No. 2 drone to return. The user interface 110 displays a pop-up box 180 as shown in FIG. 18; the pop-up box 180 includes a slide button 181 for controlling the No. 1 drone to return, a slide button 182 for controlling the No. 2 drone to return, and a slide button 183 for controlling the No. 1 drone and the No. 2 drone to return at the same time. The user can control at least one of the No. 1 drone and the No. 2 drone to return by operating at least one of the slide buttons 181, 182, and 183, where sliding a slide button to the right indicates turning it on.
  • Step S1702 Control the at least two unmanned aircrafts to return or control the target drones in the at least two drones to return according to the return flight control operation.
  • When the control terminal detects that the user has turned on the slide button 181, it controls the No. 1 drone to return; at this time, the control terminal can send the position information of the preset return point 184 to the No. 1 drone so that the No. 1 drone returns to the preset return point 184 along the return route 185. When the control terminal detects that the user has turned on the slide button 182, it controls the No. 2 drone to return; at this time, the control terminal can send the position information of the preset return point 184 to the No. 2 drone so that the No. 2 drone returns to the preset return point 184 along the return route 186. When the control terminal detects that the user has turned on the slide button 183, it controls the No. 1 drone and the No. 2 drone to return at the same time; at this time, the control terminal can send the position information of the preset return point 184 to the No. 1 drone and the No. 2 drone respectively, so that the No. 1 drone returns to the preset return point 184 along the return route 185 and the No. 2 drone returns to the preset return point 184 along the return route 186.
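  • The return flight control differs from takeoff only in that the command carries the position of the preset return point, so each selected drone can fly its own return route back to that point. A minimal sketch with an assumed command format:

      PRESET_RETURN_POINT = (22.543, 113.957)   # e.g. the dosing point of an agricultural drone

      def send_to_drone(drone_id, command):
          # Placeholder for the control terminal's link to a drone.
          print(f"-> {drone_id}: {command}")

      def command_return(drone_ids):
          """Send each selected drone the position of the preset return point
          (cf. return routes 185 and 186 in the example above)."""
          for drone_id in drone_ids:
              send_to_drone(drone_id, {"type": "return_to_point", "point": PRESET_RETURN_POINT})

      command_return(["drone-1"])              # slide button 181: only the No. 1 drone returns
      command_return(["drone-1", "drone-2"])   # slide button 183: both drones return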
  • the method further includes:
  • Step S1901 Detect a job task pause operation of the user.
  • The user can also click the “Pause” icon shown in FIG. 20 to control at least one of the No. 1 drone and the No. 2 drone to suspend execution of the target job task. The user interface 110 displays a pop-up box 200 as shown in FIG. 20; the pop-up box 200 includes a slide button 201 for controlling the No. 1 drone to suspend execution of the target job task, a slide button 202 for controlling the No. 2 drone to suspend execution of the target job task, and a slide button 203 for controlling the No. 1 drone and the No. 2 drone to suspend execution of the target job task at the same time. The user can control at least one of the No. 1 drone and the No. 2 drone to suspend execution of the target job task by operating at least one of the slide buttons 201, 202, and 203, where sliding a slide button to the right indicates turning it on.
  • Step S1902 Control the at least two UAVs to suspend execution of the target job task or control the target UAV in the at least two UAVs to suspend execution of the target job task according to the job task suspending operation.
  • When the control terminal detects that the user has turned on the slide button 201, the No. 1 drone is controlled to suspend execution of the target job task; when the control terminal detects that the user has turned on the slide button 202, the No. 2 drone is controlled to suspend execution of the target job task; and when the control terminal detects that the user has turned on the slide button 203, the No. 1 drone and the No. 2 drone are controlled to suspend execution of the target job task at the same time.
  • the method further includes:
  • Step S2101 Detect a job task end operation of the user.
  • The user can also click the “End” icon shown in FIG. 22 to control at least one of the No. 1 drone and the No. 2 drone to end execution of the target job task. The user interface 110 displays a pop-up box 220 as shown in FIG. 22; the pop-up box 220 includes a slide button 221 for controlling the No. 1 drone to end execution of the target job task, a slide button 222 for controlling the No. 2 drone to end execution of the target job task, and a slide button 223 for controlling the No. 1 drone and the No. 2 drone to end execution of the target job task at the same time. The user can control at least one of the No. 1 drone and the No. 2 drone to end execution of the target job task by operating at least one of the slide buttons 221, 222, and 223, where sliding a slide button to the right indicates turning it on.
  • Step S2102 controlling the at least two drones to end the execution of the target job task or controlling the target drone of the at least two drones to end the execution of the target job task according to the job task ending operation.
  • When the control terminal detects that the user has turned on the slide button 221, the No. 1 drone is controlled to end execution of the target job task; when the control terminal detects that the user has turned on the slide button 222, the No. 2 drone is controlled to end execution of the target job task; and when the control terminal detects that the user has turned on the slide button 223, the No. 1 drone and the No. 2 drone are controlled to end execution of the target job task at the same time.
  • In this embodiment, the control terminal detects the user's takeoff control operation, return flight control operation, job task pause operation, or job task end operation, and accordingly controls the at least two drones communicatively connected to the control terminal to take off, return, suspend execution of the target job task, or end execution of the target job task, or controls the target drone among the at least two drones to do so. This improves the flexibility of controlling the at least two drones communicatively connected to the control terminal.
  • Embodiments of the present invention provide a method for controlling a drone.
  • The method in this embodiment may further include: adjusting a zoom level of the displayed electronic map so that the icon indicating the location information of each of the at least two drones is completely displayed on the electronic map.
  • For example, the control terminal can automatically adjust the zoom level of the electronic map 90 so that the icon 912 indicating the position information of the No. 1 drone and the icon 922 indicating the position information of the No. 2 drone are completely displayed on the electronic map 90.
  • When the control terminal receives the location information sent by each of the plurality of drones communicatively connected to it, the control terminal may adjust the zoom level of the electronic map according to the location information sent by each drone, so that the icon indicating the location information of each drone is completely displayed on the electronic map.
  • The adjusting of the zoom level of the displayed electronic map so that the icon indicating the location information of each of the at least two drones is completely displayed on the electronic map may include: adjusting the zoom level of the displayed electronic map so that the icon indicating the position information of each of the at least two drones and the icon indicating the position information of the preset return point are completely displayed on the electronic map.
  • For example, the control terminal may adjust the zoom level of the electronic map 90 according to the position information of the preset return point and the position information sent by each of the plurality of drones communicatively connected to the control terminal, so that the icon 912 indicating the position information of the No. 1 drone and the icon 922 indicating the position information of the No. 2 drone are completely displayed on the electronic map 90, while the icon 97 indicating the position information of the preset return point is also completely displayed on the electronic map 90.
  • the preset return point may be a dosing point of the agricultural drone.
  • The preset return point may be determined by the user performing a return point setting operation on the control terminal; the preset return point may be at the same position as the control terminal or at a different position.
  • The adjusting of the zoom level of the displayed electronic map so that the icon indicating the position information of each of the at least two drones is completely displayed on the electronic map may also include: adjusting the zoom level of the displayed electronic map so that the icon indicating the location information of each of the at least two drones and the icon indicating the location information of the control terminal are completely displayed on the electronic map.
  • the location where the preset return point is located is different from the location where the control terminal is located.
  • For example, the control terminal is located at the position indicated by the icon 98 in the electronic map 90. The control terminal may further adjust the zoom level of the electronic map 90 according to the location information of the control terminal and the location information sent by each of the plurality of drones communicatively connected to the control terminal, so that the icon 912 indicating the location information of the No. 1 drone and the icon 922 indicating the location information of the No. 2 drone are completely displayed on the electronic map 90, while the icon 98 indicating the location information of the control terminal is also completely displayed on the electronic map 90.
  • The location where the control terminal is located may be the preset return point; if the drones that are in communication with the control terminal are agricultural drones, the location of the control terminal may also be the dosing point of the agricultural drones.
  • In this embodiment, making the icon indicating the position information of each drone completely displayed on the electronic map, or making the icon indicating the position information of the preset return point completely displayed on the electronic map, or making the icon indicating the location information of the control terminal completely displayed on the electronic map, improves the flexibility of electronic map content display.
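  • Fitting every drone icon (and, optionally, the preset return point and the control terminal) on the map is essentially a bounding-box problem: collect the positions to be shown and pick the largest zoom level whose visible span still covers them. The simplified sketch below assumes a tile-style map whose visible span halves with each zoom step; real map SDKs usually expose an equivalent "fit bounds" helper, so this is only a stand-in.

      import math

      def fit_zoom(points, base_span_deg=360.0, max_zoom=20, margin=1.2):
          """Return a zoom level at which all (lat, lon) points fit on screen,
          assuming the visible span halves with each zoom step from base_span_deg."""
          lats = [p[0] for p in points]
          lons = [p[1] for p in points]
          span = max(max(lats) - min(lats), max(lons) - min(lons)) * margin
          if span <= 0:
              return max_zoom                              # a single point: zoom in fully
          zoom = int(math.floor(math.log2(base_span_deg / span)))
          return max(0, min(max_zoom, zoom))

      drones = [(22.540, 113.950), (22.556, 113.972)]      # positions behind icons 912 and 922
      return_point = (22.543, 113.957)                     # position behind icon 97
      control_terminal = (22.541, 113.955)                 # position behind icon 98
      print(fit_zoom(drones))                                       # drones only
      print(fit_zoom(drones + [return_point, control_terminal]))    # drones + return point + terminal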
  • Embodiments of the present invention provide a method for controlling a drone.
  • FIG. 23 is a flowchart of a method for controlling a drone according to an embodiment of the present invention.
  • the control method of the drone is applied to a control terminal of the drone.
  • the method in this embodiment may include:
  • Step S2301 Detecting a target drone selection operation of the user, the target drone being one of at least two drones communicatively connected to the control terminal.
  • the user interface 20 of the control terminal includes a plurality of icons, such as an icon 21, an icon 22, an icon 23, and an icon 24.
  • The user interface 20 may display information of the drones that are communicatively connected to the control terminal; for example, the drones communicatively connected to the control terminal include the No. 1 drone and the No. 2 drone. When the user clicks the icon 23, the user interface 20 can display a pop-up box 240. The pop-up box 240 includes an icon 241 and an icon 242, and the user can click the icon 241 or the icon 242. When the control terminal detects that the user clicks the icon 241, it determines that the target drone selected by the user is the No. 1 drone; when the control terminal detects that the user clicks the icon 242, it determines that the target drone selected by the user is the No. 2 drone.
  • Step S2302 Display a control interface corresponding to the target drone according to the target drone selection operation.
  • the control terminal can display the control interface corresponding to the UAV No. 1 according to the user's selection operation of the UAV No. 1.
  • the control terminal can display the control interface corresponding to the UAV No. 2 according to the user's selection operation of the UAV No. 2.
  • The control terminal may display the control interface of the drone corresponding to the display box selected by the user according to the user's selection operation on the display box 111, the display box 112, or the display box 113. The control terminal may also display the control interface of the drone corresponding to the icon selected by the user according to the user's selection operation on the icon 912 or the icon 922.
  • the control terminal can display the control interface corresponding to the UAV No. 1 as shown in FIG. 25 according to the user's selection operation of the UAV No. 1.
  • The control interface 250 displays an icon 912 for marking the No. 1 drone, a task assignment icon 254 for assigning a work task to the No. 1 drone, and an input box for controlling the flying height of the No. 1 drone.
  • the user can assign a job task to the drone 1 by clicking the task assignment icon 254.
  • the user can also enter data in input box 931 to control the flying height of drone 1, or adjust the arrow on progress bar 932 to adjust the flying height of drone 1.
  • only the control interface corresponding to the UAV No. 1 is schematically illustrated, and the specific form and content of the control interface are not limited.
  • The method further includes: detecting the user's job task assignment operation on the control interface; determining target job task data for the target drone according to the detected job task assignment operation; and sending the target job task data to the target drone to cause the target drone to execute the target job task indicated by the target job task data.
  • The control terminal displays a pop-up window 260 on the control interface 250 according to the user's click operation on the task assignment icon 254, and the pop-up window 260 includes a drop-down arrow 261. When the user operates the drop-down arrow 261, the pop-up window 260 displays a drop-down list 262. The drop-down list 262 includes a work task list containing a plurality of work tasks, such as the work task of No. 1 field and the work task of No. 2 field; this is merely a schematic illustration and does not limit the specific form and content of the control interface. The user can select a job task, for example the work task of No. 1 field, and the control terminal determines the target work task data for the No. 1 drone according to that operation; the target work task data may specifically be the route information corresponding to No. 1 field. Further, the control terminal transmits the route information corresponding to No. 1 field to the No. 1 drone, so that the No. 1 drone executes the work task corresponding to No. 1 field.
  • After the control terminal transmits the route information corresponding to No. 1 field to the No. 1 drone, the control terminal can display the control interface 270. The control interface 270 displays the work area 91 of the No. 1 drone, the route 911 corresponding to the work area 91, and the icon 912 for marking the No. 1 drone.
  • the method further includes: detecting a flight state control operation of the user on the control interface; determining a flight state control instruction according to the detected flight state control operation; and transmitting the flight state control command to the A target drone to control the flight status of the target drone.
  • On the control interface, the user can also enter data in the input box 931 to control the flying height of the No. 1 drone, or adjust the arrow on the progress bar 932 to adjust the flying height of the No. 1 drone. The control terminal can detect the user's control operation on the flight state of the No. 1 drone and, according to that control operation, generate a flying height control command; the flying height control command is sent to the No. 1 drone to control the flying height of the No. 1 drone.
  • the method further includes: receiving target job task execution state information sent by the target drone; and displaying target job task execution state information sent by the target drone on the control interface.
  • The control terminal receives the completion rate of the target job task sent by the No. 1 drone, and displays the completion rate of the target job task sent by the No. 1 drone on the control interface 270; for example, the completion rate of the target work task of the No. 1 drone is displayed in the display box 111 of the control interface 270.
  • the method further includes: receiving flight state information sent by the target drone; and displaying flight state information sent by the target drone on the control interface.
  • the control terminal can also receive flight status information transmitted by the UAV No. 1.
  • the flight status information includes one or more of flight speed, flight altitude, location information, and power information.
  • The display box 111 of the control interface 270 can also display the battery level, speed, and altitude of the No. 1 drone. The control terminal can also receive the location information transmitted by the No. 1 drone and display the icon 912, which indicates the position information of the No. 1 drone, on the route 911 that the No. 1 drone is executing.
  • the method further includes: receiving fault information sent by the target drone; and displaying fault indication information associated with the target drone on the control interface according to the fault information.
  • The control terminal can receive the fault information sent by the No. 1 drone, and display the fault indication information associated with that fault information on the control interface 270. For example, the icon 912 of the No. 1 drone on the route 911 can be displayed in a highlighted state, for example in red; the icon of the No. 1 drone in the display box 111 can be displayed in a highlighted state, for example in red; or/and a fault indication icon 271 can be displayed in the display box 111 to prompt the user that the No. 1 drone has failed.
  • The method further includes: detecting the user's fault information viewing operation on the control interface; and displaying, on the control interface, the fault information sent by the target drone according to the detected fault information viewing operation.
  • The user may click the display box 111, the fault indication icon 271, or the icon 912 of the No. 1 drone to view the fault information of the No. 1 drone.
  • In this embodiment, the control terminal detects the user's target drone selection operation, determines the target drone selected by the user from the plurality of drones communicatively connected to the control terminal, and displays the control interface corresponding to the target drone. The user interface can thus switch from an interface that controls multiple drones to an interface that controls the target drone, further improving the flexibility of the user interface and the flexibility of drone control.
  • FIG. 28 is a structural diagram of a control terminal of a drone according to an embodiment of the present invention.
  • the control terminal 280 of the drone includes a processor 281 and a communication interface 282.
  • The processor 281 is configured to: detect the user's job task assignment operation; and determine target job task data for each of at least two drones communicatively connected to the control terminal according to the detected job task assignment operation. The communication interface 282 is configured to send the target job task data determined for each of the at least two drones communicatively connected to the control terminal to the corresponding drone, to cause that drone to execute the target job task indicated by the target job task data.
  • The processor 281 is further configured to: detect a flight state control operation of the user; and, according to the flight state control operation, control the flight states of the at least two drones or control the flight state of the target drone among the at least two drones.
  • The control terminal 280 of the drone further includes a display component 283. The communication interface 282 is further configured to receive the target job task execution state information sent by each of the at least two drones, and the processor 281 is further configured to control the display component 283 to display the target job task execution state information sent by each of the at least two drones.
  • the communication interface 282 is further configured to: receive flight state information sent by each of the at least two drones; the processor 281 is further configured to: control the display component 283 to display the at least two drones Flight status information for each transmission.
  • the flight status information includes one or more of flight speed, flight altitude, location information, and power information.
  • The communication interface 282 is further configured to: receive fault information sent by the first drone of the at least two drones; the processor 281 is further configured to: control the display component 283 to display the fault indication information associated with the first drone.
  • The processor 281 is further configured to: detect a fault information viewing operation of the user; and, according to the detected fault information viewing operation, control the display component 283 to display the fault information sent by the first drone.
  • The processor 281 is further configured to: detect a takeoff control operation of the user; and, according to the takeoff control operation, control the at least two drones to take off or control the target drone among the at least two drones to take off.
  • The processor 281 is further configured to: detect a return flight control operation of the user; and, according to the return flight control operation, control the at least two drones to return or control the target drone among the at least two drones to return.
  • The processor 281 is further configured to: detect a job task pause operation of the user; and, according to the job task pause operation, control the at least two drones to suspend execution of the target job task or control the target drone among the at least two drones to suspend execution of the target job task.
  • The processor 281 is further configured to: detect a job task end operation of the user; and, according to the job task end operation, control the at least two drones to end execution of the target job task or control the target drone among the at least two drones to end execution of the target job task.
  • the processor 281 is further configured to: adjust a zoom level of the displayed electronic map, so that an icon indicating location information of each of the at least two drones is completely displayed on the electronic map.
  • the processor 281 adjusts a zoom level of the displayed electronic map to enable the icon indicating the location information of each of the at least two drones to be completely displayed on the electronic map, specifically for : adjusting a zoom level of the displayed electronic map such that an icon indicating location information of each of the at least two drones and an icon indicating location information of the preset return point are completely displayed on the electronic map .
  • the processor 281 adjusts a zoom level of the displayed electronic map to enable the icon indicating the location information of each of the at least two drones to be completely displayed on the electronic map, specifically for : adjusting a zoom level of the displayed electronic map such that an icon indicating location information of each of the at least two drones and an icon indicating location information of the control terminal are completely displayed on the electronic map .
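  • The structure described for the control terminal 280 (a processor driving a communication interface and a display component) can be mirrored in code as a thin coordinator over two injected interfaces. The skeleton below is a hedged sketch whose class and method names are illustrative only and do not correspond to any particular SDK.

      class CommunicationInterface:
          """Stand-in for communication interface 282: talks to the drones."""
          def send(self, drone_id, payload):
              print(f"-> {drone_id}: {payload}")

      class DisplayComponent:
          """Stand-in for display component 283: renders the user interface."""
          def show(self, text):
              print(f"UI: {text}")

      class ControlTerminal:
          """Processor-like coordinator (cf. processor 281) wiring user operations,
          the communication interface, and the display component together."""
          def __init__(self, comm, display):
              self.comm = comm
              self.display = display

          def on_task_assignment(self, drone_id, task_data):
              self.comm.send(drone_id, {"type": "job_task", **task_data})

          def on_status(self, drone_id, status):
              self.display.show(f"{drone_id}: {status}")

          def on_fault(self, drone_id, fault):
              self.display.show(f"{drone_id} FAULT: {fault}")

      terminal = ControlTerminal(CommunicationInterface(), DisplayComponent())
      terminal.on_task_assignment("drone-1", {"route": [(22.54, 113.95), (22.55, 113.96)]})
      terminal.on_status("drone-1", {"completion": 0.42, "battery": 0.77})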
  • The control terminal provided by this embodiment of the present invention is similar in principle to the embodiments shown in FIG. 1 to FIG. 22, and details are not described herein again.
  • The control terminal detects the user's job task assignment operation, determines target job task data for each of the at least two drones communicatively connected to the control terminal according to the job task assignment operation, and sends the target job task data corresponding to each drone to that drone, so that the same control terminal can control a plurality of drones to perform work tasks, thereby improving the operating efficiency of the drones.
  • FIG. 29 is a structural diagram of a control terminal of a drone according to another embodiment of the present invention.
  • the control terminal 290 of the drone includes a processor 291 and a display component 292.
  • The processor 291 is configured to: detect the user's target drone selection operation, where the target drone is one of at least two drones communicatively connected to the control terminal; and, according to the target drone selection operation, control the display component 292 to display a control interface corresponding to the target drone.
  • The control terminal 290 further includes a communication interface 293. The processor 291 is further configured to: detect the user's job task assignment operation on the control interface; and determine target job task data for the target drone according to the detected job task assignment operation. The communication interface 293 is configured to send the target job task data to the target drone, to cause the target drone to perform the target job task indicated by the target job task data.
  • the processor 291 is further configured to: detect a flight state control operation of the user on the control interface; determine a flight state control command according to the detected flight state control operation; and the communication interface 293 is further configured to: A flight state control command is sent to the target drone to control the flight state of the target drone.
  • the communication interface 293 is further configured to: receive target job task execution state information sent by the target drone; the processor 291 is further configured to: control the display component 292 to display, on the control interface, the target job task execution state information sent by the target drone.
  • the communication interface 293 is further configured to: receive flight state information sent by the target drone; the processor 291 is further configured to: control the display component 292 to display, on the control interface, the flight state information sent by the target drone.
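The two preceding paragraphs describe a receive-and-display loop: the communication interface receives job task execution state and flight state messages from the target drone, and the display component updates the control interface. The sketch below assumes simple dictionary-shaped messages; a real link would carry protocol-defined telemetry, and the class and field names are assumptions.

```python
# Illustrative receive-and-display loop for the per-drone control interface.

from typing import Dict


class ControlInterfaceView:
    """Very small stand-in for the display component's control interface."""

    def __init__(self, drone_id: str):
        self.drone_id = drone_id
        self.fields: Dict[str, str] = {}

    def update(self, field: str, value: str) -> None:
        self.fields[field] = value
        print(f"[{self.drone_id}] {field}: {value}")


def on_status_message(view: ControlInterfaceView, message: dict) -> None:
    # Route each incoming message type to the fields it updates.
    if message.get("type") == "JOB_TASK_STATE":
        view.update("task progress", f"{message['progress_pct']}%")
    elif message.get("type") == "FLIGHT_STATE":
        view.update("altitude", f"{message['altitude_m']} m")
        view.update("speed", f"{message['speed_mps']} m/s")


view = ControlInterfaceView("uav-1")
on_status_message(view, {"type": "JOB_TASK_STATE", "progress_pct": 42})
on_status_message(view, {"type": "FLIGHT_STATE", "altitude_m": 12.5, "speed_mps": 4.0})
```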
  • the communication interface 293 is further configured to: receive fault information sent by the target drone; the processor 291 is further configured to: control, according to the fault information, the display component 292 to display fault indication information associated with the target drone.
  • the processor 291 is further configured to: detect a fault information viewing operation of the user on the control interface, and, according to the detected fault information viewing operation, control the display component 292 to display, on the control interface, the fault information sent by the target drone.
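Taken together, the fault-handling paragraphs describe a two-step flow: incoming fault information first produces a compact fault indication associated with the target drone, and the detailed fault information is shown only after the user's fault information viewing operation. The sketch below illustrates that flow; the FaultPanel class and its method names are assumptions, not part of the embodiments.

```python
# Illustrative two-step fault flow: indicator on receipt, details on demand.

from typing import Dict, List


class FaultPanel:
    def __init__(self):
        self._faults: Dict[str, List[str]] = {}

    def on_fault(self, drone_id: str, fault_text: str) -> None:
        """Store the fault and display only an indicator for that drone."""
        self._faults.setdefault(drone_id, []).append(fault_text)
        print(f"[{drone_id}] fault indicator: {len(self._faults[drone_id])} fault(s)")

    def on_view_faults(self, drone_id: str) -> None:
        """User performed the viewing operation: show the detailed faults."""
        for fault_text in self._faults.get(drone_id, []):
            print(f"[{drone_id}] {fault_text}")


panel = FaultPanel()
panel.on_fault("uav-2", "motor 3 over-temperature")
panel.on_view_faults("uav-2")
```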
  • the specific principles and implementations of the control terminal provided by this embodiment of the present invention are similar to those of the embodiment shown in FIG. 23, and details are not described herein again.
  • in this embodiment, the control terminal detects the user's target drone selection operation, determines the target drone selected by the user from the plurality of drones communicatively connected to the control terminal, and displays the control interface corresponding to the target drone. This allows the user interface to switch from an interface that controls multiple drones to an interface that controls a single target drone, further enhancing the flexibility of the user interface and of drone control.
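A minimal sketch of the interface switch summarized above, assuming a fleet-overview view and a per-drone control-interface view; the rendering functions are placeholders, not part of the described display component.

```python
# Illustrative switch from a multi-drone overview to a per-drone interface.

from typing import Sequence


def render_overview(drone_ids: Sequence[str]) -> None:
    print("overview:", ", ".join(drone_ids))


def render_control_interface(drone_id: str) -> None:
    print(f"control interface for {drone_id}")


def on_target_drone_selected(drone_ids: Sequence[str], selected: str) -> None:
    if selected in drone_ids:
        render_control_interface(selected)   # per-drone view
    else:
        render_overview(drone_ids)           # fall back to the fleet overview


fleet = ["uav-1", "uav-2", "uav-3"]
render_overview(fleet)
on_target_drone_selected(fleet, "uav-2")
```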
  • it should be understood that, in the several embodiments provided by the present invention, the disclosed apparatus and method may be implemented in other manners.
  • the device embodiments described above are merely illustrative; for example, the division of the units is only a logical function division, and in actual implementation there may be other division manners, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, and may be in an electrical, mechanical or other form.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of hardware plus software functional units.
  • the above-described integrated unit implemented in the form of a software functional unit can be stored in a computer readable storage medium.
  • the above software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform some of the steps of the methods described in the embodiments of the present invention.
  • the foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed are a control method for an unmanned aerial vehicle and a control terminal (280, 290). The method comprises: detecting a job task assignment operation of a user (S101); determining, according to the detected job task assignment operation, target job task data for each of at least two unmanned aerial vehicles communicatively connected to a control terminal (280, 290) (S102); and sending the target job task data determined for each of the at least two unmanned aerial vehicles communicatively connected to the control terminal (280, 290) to the corresponding unmanned aerial vehicle, so that the unmanned aerial vehicle performs the target job task indicated by the target job task data (S103). The present invention detects, by means of the control terminal (280, 290), the user's job task assignment operation, determines, according to the job task assignment operation, the target job task data for each of the at least two unmanned aerial vehicles communicatively connected to the control terminal (280, 290), and sends the target job task data corresponding to each unmanned aerial vehicle to that unmanned aerial vehicle, so that the same control terminal (280, 290) can control a plurality of unmanned aerial vehicles to perform work tasks, thereby improving the work efficiency of the unmanned aerial vehicles.
PCT/CN2017/113653 2017-11-29 2017-11-29 Procédé de commande pour véhicule aérien sans pilote et terminal de commande WO2019104554A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2017/113653 WO2019104554A1 (fr) 2017-11-29 2017-11-29 Procédé de commande pour véhicule aérien sans pilote et terminal de commande
CN201780026702.9A CN109154828A (zh) 2017-11-29 2017-11-29 无人机的控制方法及控制终端

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/113653 WO2019104554A1 (fr) 2017-11-29 2017-11-29 Procédé de commande pour véhicule aérien sans pilote et terminal de commande

Publications (1)

Publication Number Publication Date
WO2019104554A1 true WO2019104554A1 (fr) 2019-06-06

Family

ID=64803848

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/113653 WO2019104554A1 (fr) 2017-11-29 2017-11-29 Procédé de commande pour véhicule aérien sans pilote et terminal de commande

Country Status (2)

Country Link
CN (1) CN109154828A (fr)
WO (1) WO2019104554A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210278834A1 (en) * 2018-07-17 2021-09-09 Emesent IP Pty Ltd. Method for Exploration and Mapping Using an Aerial Vehicle

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111356984A (zh) * 2019-01-21 2020-06-30 深圳市大疆创新科技有限公司 任务显示方法及装置
WO2020199126A1 (fr) * 2019-04-02 2020-10-08 深圳市大疆创新科技有限公司 Procédé de commande pour plateforme mobile, borne de commande et support d'informations lisible par ordinateur
CN111684385B (zh) * 2019-05-27 2023-03-28 深圳市大疆创新科技有限公司 飞行控制方法、控制终端和无人机
CN112166404A (zh) * 2019-08-08 2021-01-01 深圳市大疆创新科技有限公司 农机监管方法及装置
CN113467511B (zh) * 2021-07-15 2022-12-27 广西壮族自治区自然资源调查监测院 无人机任务协同方法及***

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110098874A1 (en) * 2009-10-26 2011-04-28 Electronics And Telecommunications Research Institute Method and apparatus for navigating robot
CN106383646A (zh) * 2016-10-26 2017-02-08 广州极飞科技有限公司 一种无人飞行器启动植保作业的方法和装置
CN106502265A (zh) * 2016-10-26 2017-03-15 广州极飞科技有限公司 一种无人飞行器的航线生成方法和装置
CN106716288A (zh) * 2016-11-24 2017-05-24 深圳市大疆创新科技有限公司 农业无人飞行器的控制方法、地面控制端及存储介质
CN106774393A (zh) * 2016-09-22 2017-05-31 重庆零度智控智能科技有限公司 一种任务进度计算方法、装置及无人机
CN107390709A (zh) * 2017-08-25 2017-11-24 上海拓攻机器人有限公司 一种植保无人机多机协同作业方法及***

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016065513A1 (fr) * 2014-10-27 2016-05-06 深圳市大疆创新科技有限公司 Procédé et appareil permettant d'indiquer la position d'un véhicule aérien
CN105892472A (zh) * 2015-02-13 2016-08-24 Lg电子株式会社 移动终端及其控制方法
WO2017079623A1 (fr) * 2015-11-06 2017-05-11 Massachusetts Institute Of Technology Attribution dynamique de tâches dans une mission autonome multi-uav
CN105867181A (zh) * 2016-04-01 2016-08-17 腾讯科技(深圳)有限公司 无人机的控制方法和装置
CN107077411A (zh) * 2016-09-23 2017-08-18 深圳市大疆创新科技有限公司 应用于遥控器的信息提示方法及遥控器
WO2018058309A1 (fr) * 2016-09-27 2018-04-05 深圳市大疆创新科技有限公司 Procédé de commande, dispositif de commande, dispositif électronique et système de commande de véhicule aérien
CN106527481A (zh) * 2016-12-06 2017-03-22 重庆零度智控智能科技有限公司 无人机飞行控制方法、装置及无人机
CN106657320A (zh) * 2016-12-15 2017-05-10 北京佰人科技有限责任公司 基于地图定位的通讯方法和通讯装置
CN206658205U (zh) * 2017-05-04 2017-11-21 国网浙江省电力公司杭州供电公司 一种无人机数据传输***

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110098874A1 (en) * 2009-10-26 2011-04-28 Electronics And Telecommunications Research Institute Method and apparatus for navigating robot
CN106774393A (zh) * 2016-09-22 2017-05-31 重庆零度智控智能科技有限公司 一种任务进度计算方法、装置及无人机
CN106383646A (zh) * 2016-10-26 2017-02-08 广州极飞科技有限公司 一种无人飞行器启动植保作业的方法和装置
CN106502265A (zh) * 2016-10-26 2017-03-15 广州极飞科技有限公司 一种无人飞行器的航线生成方法和装置
CN106716288A (zh) * 2016-11-24 2017-05-24 深圳市大疆创新科技有限公司 农业无人飞行器的控制方法、地面控制端及存储介质
CN107390709A (zh) * 2017-08-25 2017-11-24 上海拓攻机器人有限公司 一种植保无人机多机协同作业方法及***

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210278834A1 (en) * 2018-07-17 2021-09-09 Emesent IP Pty Ltd. Method for Exploration and Mapping Using an Aerial Vehicle

Also Published As

Publication number Publication date
CN109154828A (zh) 2019-01-04

Similar Documents

Publication Publication Date Title
WO2019104554A1 (fr) Procédé de commande pour véhicule aérien sans pilote et terminal de commande
CN110531960B (zh) 用于通过虚拟世界,在现实世界中开发,测试以及部署数字现实应用程式的***与方法
US20190251851A1 (en) Navigation method and device based on three-dimensional map
CN109542119B (zh) 飞行器航线规划方法及***
EP3345832B1 (fr) Véhicule aérien sans pilote et procédé de commande
CN108227746A (zh) 一种无人机集群控制***及方法
US11334077B2 (en) Method and device for locating faulty photovoltaic panel, and unmanned aerial vehicle
CN105763620B (zh) 一种无人机与飞手的匹配方法和***
US20180276997A1 (en) Flight tag obtaining method, terminal, and server
JP2017502397A (ja) 飛行ミッション処理方法、装置及びシステム
US11087633B2 (en) Simulation server capable of interacting with a plurality of simulators to perform a plurality of simulations
WO2022247498A1 (fr) Surveillance de véhicule aérien sans pilote
WO2019119200A1 (fr) Procédé d'attribution de tâche de travail destiné à un véhicule aérien sans pilote, dispositif associé et support de stockage
CN113158116A (zh) 基于移动互联网的无人机控制平台
WO2023040986A1 (fr) Procédé et appareil de gestion de tâches d'engin volant sans pilote embarqué, dispositif et support de stockage
CN109714830A (zh) 一种飞行日志上传方法、装置及移动终端、无人机
CN107622177B (zh) 基于eati方法的航空投送仿真方法
CN108693892A (zh) 一种跟踪方法、电子装置
Tso et al. A human factors testbed for command and control of unmanned air vehicles
WO2019084952A1 (fr) Procédé d'attribution d'itinéraire aérien, serveur, dispositif terminal, dispositif de commande, et système
WO2023193604A1 (fr) Procédé de planification en ligne de tâche d'itinéraire et appareil associé
CN110020217A (zh) 接机信息控制/显示方法/***,介质、服务端及车载端
WO2023025202A1 (fr) Procédé et appareil de commande de direction de cardan et terminal
CN109799841A (zh) 一种无人机地面控制***、设备和存储介质
US11164121B1 (en) Task communication and management system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17933467

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17933467

Country of ref document: EP

Kind code of ref document: A1