CN111279284B - Control method and apparatus - Google Patents

Control method and apparatus

Info

Publication number
CN111279284B
Authority
CN
China
Prior art keywords
target
area
spraying
map
control information
Prior art date
Legal status
Active
Application number
CN201880069355.2A
Other languages
Chinese (zh)
Other versions
CN111279284A (en)
Inventor
李文林
李焕婷
雷蔚然
Current Assignee
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd
Publication of CN111279284A
Application granted
Publication of CN111279284B


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D1/00 Dropping, ejecting, releasing, or receiving articles, liquids, or the like, in flight
    • B64D1/16 Dropping or releasing powdered, liquid, or gaseous matter, e.g. for fire-fighting
    • B64D1/18 Dropping or releasing powdered, liquid, or gaseous matter, e.g. for fire-fighting by spraying, e.g. insecticides
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 Geographical information databases

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Automation & Control Theory (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Pest Control & Pesticides (AREA)
  • Catching Or Destruction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A control method and apparatus, the method comprising: acquiring a two-dimensional map of a target area and displaying it on a display device (S401), wherein the target area contains a target crop and the two-dimensional map is obtained from captured images acquired while a surveying and mapping unmanned aerial vehicle flies over the target area; determining a target map area corresponding to the target crop in the two-dimensional map (S402), wherein the target map area is used for determining spraying control information and the spraying control information is used for controlling a spraying unmanned aerial vehicle to spray the target crop; and displaying an identifier on the target map area in the two-dimensional map displayed by the display device (S403). Through the identifier, the user can quickly and intuitively locate the target map area in the two-dimensional map, which is the area where the spraying operation needs to be performed, so that the spraying control information can be obtained more intelligently and quickly, improving both the spraying effect and the spraying efficiency.

Description

Control method and apparatus
Technical Field
The embodiments of the present invention relate to the technical field of unmanned aerial vehicles, and in particular to a control method and a control device.
Background
With the increasing popularity of consumer unmanned aerial vehicles, industrial-application unmanned aerial vehicles have also begun to emerge. In agriculture, agricultural unmanned aerial vehicles occupy an important position among industrial-application unmanned aerial vehicles: they can perform plant-protection operations on farmland, such as spraying operations (spraying water, pesticides, seeds, etc.), and bring great convenience to the agricultural field, for example by saving user time, improving operation efficiency, increasing operation benefits and improving the utilization efficiency of agricultural machinery.
However, existing spraying operation modes are not highly intelligent and are inconvenient: for example, an operator has to observe the target crop with the naked eye and manually control the unmanned aerial vehicle to spray it, so the spraying efficiency is low.
Disclosure of Invention
The embodiments of the present invention provide a control method and a control device that help the user observe the target crops to be sprayed in the environment quickly and intuitively, so that spraying control information can be generated quickly and the spraying efficiency improved.
In a first aspect, an embodiment of the present invention provides a control method, including:
Acquiring and displaying a two-dimensional map of a target area on a display device, wherein the target area contains a target crop, and the two-dimensional map is obtained from captured images acquired while a surveying and mapping unmanned aerial vehicle flies over the target area;
Determining a target map area corresponding to the target crop in the two-dimensional map, wherein the target map area is used for determining spraying control information, and the spraying control information is used for controlling a spraying unmanned aerial vehicle to spray the target crop; and
Displaying an identifier on the target map area in the two-dimensional map displayed by the display device.
In a second aspect, an embodiment of the present invention provides a control apparatus, including: a processor and a display device;
The processor is configured to acquire a two-dimensional map of a target area, wherein the target area contains a target crop, and the two-dimensional map is obtained from captured images acquired while a surveying and mapping unmanned aerial vehicle flies over the target area;
the display device is used for displaying the two-dimensional map acquired by the processor;
The processor is further configured to determine a target map area corresponding to the target crop in the two-dimensional map, where the target map area is used to determine spraying control information, and the spraying control information is used to control a spraying unmanned aerial vehicle to spray the target crop;
the display device is further configured to display an identifier on a target map area in the displayed two-dimensional map.
In a third aspect, an embodiment of the present invention provides a computer-readable storage medium storing a computer program, where the computer program includes at least one piece of code executable by a computer to control the computer to perform the control method according to the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer program which, when executed by a computer, implements the control method according to the first aspect.
According to the control method and control device, a two-dimensional map of a target area is acquired and displayed on a display device, wherein the target area contains a target crop and the two-dimensional map is obtained from captured images acquired while a surveying and mapping unmanned aerial vehicle flies over the target area; a target map area corresponding to the target crop is determined in the two-dimensional map, wherein the target map area is used for determining spraying control information and the spraying control information is used for controlling a spraying unmanned aerial vehicle to spray the target crop; and an identifier is displayed on the target map area in the two-dimensional map displayed by the display device. Through the identifier, the user can quickly and intuitively determine the target map area in the two-dimensional map, which is the area where the spraying operation needs to be performed, so that the spraying control information can be obtained more intelligently and quickly, improving both the spraying effect and the spraying efficiency.
Drawings
In order to describe the technical solutions of the embodiments of the present invention or of the prior art more clearly, the drawings used in the description of the embodiments or of the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and a person skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic diagram of mapping a target crop area by a mapping unmanned aerial vehicle according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of an application scenario provided in an embodiment of the present invention;
fig. 3 is a schematic diagram of a spraying unmanned aerial vehicle for spraying a plurality of target crops according to the embodiment of the invention;
FIG. 4 is a flow chart of a control method according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a two-dimensional map displayed by a display device according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of displaying identifiers on a two-dimensional map according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of editing an initial map area according to an embodiment of the present invention;
FIG. 8 is another schematic view of editing an initial map area according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of a selected spray area according to one embodiment of the present invention;
FIG. 10 is a schematic diagram showing a three-dimensional scene according to an embodiment of the present invention;
FIG. 11 is a schematic diagram of displaying identifiers in a three-dimensional scene according to an embodiment of the present invention;
FIG. 12 is a schematic diagram showing spray control information in a three-dimensional scene according to an embodiment of the present invention;
FIG. 13 is a schematic diagram showing spray control information in a three-dimensional scene with identifiers displayed according to an embodiment of the present invention;
Fig. 14 is a schematic structural diagram of a control device according to an embodiment of the present invention;
fig. 15 is a schematic structural diagram of a control system according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It will be understood that when an element is referred to as being "fixed to" another element, it can be directly on the other element or intervening elements may also be present. When a component is considered to be "connected" to another component, it can be directly connected to the other component or intervening components may also be present.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used herein in the description of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. The term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.
The control method, the control device, the mapping unmanned aerial vehicle and the spraying unmanned aerial vehicle provided by the embodiments of the present invention can be applied to a scenario in which a spraying unmanned aerial vehicle performs spraying control on a plurality of target crops in a crop area. Fig. 1 is a schematic diagram of a mapping unmanned aerial vehicle mapping a target crop area according to an embodiment of the present invention. As shown in fig. 1, the mapping unmanned aerial vehicle 101 flies over the target area, and a two-dimensional map is obtained by photographing the target area with the photographing device mounted on the mapping unmanned aerial vehicle 101. The two-dimensional map includes a target map area in which a plurality of target crops are planted. The target crops may be any agricultural crop; further, the target crops may include trees, and the trees may include fruit trees, tea trees, rubber trees and the like, although the embodiments of the present invention are not limited thereto. Fig. 2 is a schematic diagram of an application scenario according to an embodiment of the present invention, showing a mapping unmanned aerial vehicle 101, a control device 102, a spraying unmanned aerial vehicle 103 and a ground control terminal 104. The control device 102 is any device that can determine the spraying control information from the two-dimensional map of the target area as described above; for example, the control device 102 may be a terminal device with a display screen, including one or more of a smart phone, a desktop computer, a laptop computer and a wearable device (watch, bracelet). In fig. 2 the control device 102 is schematically illustrated as a computer, but the embodiments of the present invention are not limited thereto. The ground control terminal 104 of the spraying unmanned aerial vehicle 103 may be one or more of a remote controller, a smart phone, a desktop computer, a laptop computer and a wearable device (watch, bracelet). The embodiment of the present invention is schematically illustrated by taking a remote controller 1041 and a terminal device 1042 as the ground control terminal 104. The terminal device 1042 is, for example, a smart phone, a wearable device or a tablet computer, but the embodiment of the invention is not limited thereto. In some embodiments, the control device 102 and the terminal device 1042 may be the same device.
The mapping unmanned aerial vehicle 101 may acquire the images output by the photographing device and obtain a two-dimensional map of the target area from those images, where the two-dimensional map of the target area is used to determine spraying control information for the spraying unmanned aerial vehicle to spray the multiple target crops. The control device 102 may acquire the two-dimensional map of the target area from the mapping unmanned aerial vehicle 101 directly or indirectly, over a wired or wireless communication link, and determine spraying control information from the two-dimensional map, where the spraying control information is used to control the spraying unmanned aerial vehicle 103 to spray the multiple target crops in the target area. In some cases, the control device 102 may obtain the two-dimensional map of the target area from a source other than the mapping unmanned aerial vehicle 101. The spraying unmanned aerial vehicle 103 may acquire the spraying control information from the control device 102 directly or indirectly, over a wired or wireless communication link, and perform spraying control on each target crop in the target crop area according to the spraying control information. In some embodiments, the control device 102 may send the spraying control information to the ground control terminal 104 of the spraying unmanned aerial vehicle 103, and the ground control terminal 104 controls the spraying unmanned aerial vehicle 103 to spray the target crops according to that information. In some embodiments, the control device 102 may send the spraying control information to the spraying unmanned aerial vehicle 103 directly, and the spraying unmanned aerial vehicle 103 sprays the target crops according to the spraying control information, as shown in fig. 3.
Some embodiments of the present invention are described in detail below with reference to the accompanying drawings. The following embodiments and features of the embodiments may be combined with each other without conflict.
Fig. 4 is a flowchart of a control method according to an embodiment of the present invention. As shown in fig. 4, the method of this embodiment may be applied to a control device and may include:
s401, acquiring and displaying a two-dimensional map of the target area on the display device.
In this embodiment, the control device may acquire a two-dimensional map of a target area. The target area may be an area specified by the user as required and contains a target crop. Optionally, the target crop is, for example, a tree, and the tree is, for example, a fruit tree; the target crop of this embodiment is not limited thereto.
The two-dimensional map of the target area is obtained from captured images acquired while the surveying and mapping unmanned aerial vehicle flies over the target area. A photographing device is mounted on the mapping unmanned aerial vehicle; while the mapping unmanned aerial vehicle flies over the target area, the photographing device photographs the target area so as to obtain the images output by the photographing device. Optionally, the mapping unmanned aerial vehicle may fly within the target area according to a pre-planned route. While the mapping unmanned aerial vehicle flies, the photographing device may capture images at fixed time intervals or at fixed distances.
Optionally, after obtaining the captured images, the mapping unmanned aerial vehicle may generate a two-dimensional map of the target area from them, and the control device then obtains the two-dimensional map. In one possible implementation, the control device receives the two-dimensional map sent by the mapping unmanned aerial vehicle over a wireless or wired communication connection, where the connection may be direct, i.e. point-to-point communication, or indirect, i.e. communication via an intermediate device (e.g. a ground control terminal of the mapping unmanned aerial vehicle). In another possible implementation, the mapping unmanned aerial vehicle stores the two-dimensional map in a storage device, and the control device retrieves the two-dimensional map from that storage device. The storage device is, for example, a secure digital card (Secure Digital Memory Card, SD card), although the embodiment is not limited thereto: the mapping unmanned aerial vehicle may store the obtained two-dimensional map in the SD card, the user then removes the SD card from the mapping unmanned aerial vehicle and inserts it into the control device, and the control device obtains the two-dimensional map from the inserted SD card.
Alternatively, after the mapping drone obtains the captured image, the control device may obtain the captured image, and then the control device generates a two-dimensional map of the target area from the captured image. How the control device obtains the captured image may refer to a description of how the control device obtains the two-dimensional map, which is not described herein.
In this embodiment, the control device includes a display device. After obtaining the two-dimensional map of the target area, the control device displays it on the display device so that the user can view the two-dimensional map of the target area. For example, the two-dimensional map of the target area displayed on the display device is shown in fig. 5. It should be noted that although the two-dimensional map shown in fig. 5 is black and white, in practical applications the two-dimensional map displayed on the display device may be in color.
S402, determining a target map area corresponding to the target crop in the two-dimensional map.
In this embodiment, the control device may determine a target map area corresponding to the target crop in the two-dimensional map; the target map area may be the map area in the two-dimensional map in which the target crop is planted. It should be noted that the position information of the target map area corresponding to the target crop in the two-dimensional map may represent the position information of the target crop, and the size of the target map area may represent the horizontal size of the target crop.
The target map area is used to determine spraying control information, and the spraying control information is used to control the spraying unmanned aerial vehicle to spray the target crops. Optionally, the spraying control information includes a flight path of the spraying unmanned aerial vehicle between the target crops; the flight path is, for example, a polyline route. If the target crops are dense, the spraying unmanned aerial vehicle may keep the spray head open on the flight path between the target crops; if the target crops are sparse, the spraying unmanned aerial vehicle may close the spray head on the flight path between the target crops and open it at each target crop. Optionally, the spraying control information includes a spraying mode used by the spraying unmanned aerial vehicle when spraying each of the plurality of target crops. Optionally, the spraying control information includes both the flight path of the spraying unmanned aerial vehicle between the target crops and the spraying mode used when spraying each of the plurality of target crops.
Optionally, the spraying mode includes one or more of a flight path, a spray amplitude, a flight speed and a spray density.
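For illustration only, the spraying control information described above could be represented in software roughly as in the following Python sketch. The class and field names (SprayControlInfo, SprayPattern, spray_amplitude_m and so on) are assumptions made for this example and are not part of the embodiments.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class SprayPattern:
    # Per-crop spraying mode; field names are illustrative assumptions.
    spray_amplitude_m: float        # swath width of the spray
    flight_speed_m_s: float         # flight speed while spraying this crop
    spray_density_l_per_m2: float   # liquid applied per unit area

@dataclass
class SprayControlInfo:
    # Flight path between the target crops, e.g. a polyline of (x, y, z) waypoints.
    flight_path: List[Tuple[float, float, float]]
    # One spraying mode per target crop, in the same order as the crops are visited.
    spray_patterns: List[SprayPattern] = field(default_factory=list)
    # Whether the spray head stays open on legs between crops (dense planting).
    spray_head_open_between_crops: bool = False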
In some embodiments, one possible implementation of S402 is: inputting the two-dimensional map into a preset recognition model to determine the target map area. The preset recognition model may be used to recognize the target map area corresponding to the target crop; the control device inputs the two-dimensional map obtained in S401 into the preset recognition model, so that the target map area corresponding to the target crop in the two-dimensional map can be recognized and thus determined.
Optionally, the control device of this embodiment further acquires height information of the target area, where the height information is determined from the captured images. Accordingly, one possible implementation of inputting the two-dimensional map into the preset recognition model to determine the target map area is: inputting the height information and the two-dimensional map into the preset recognition model to determine the target map area. In this embodiment, parallax information may be determined from the captured images, and the height information of the target area, that is, the height map of the target area, may then be determined from the parallax information. Since the height information of the target area is also used when the target map area is identified, the accuracy of determining the target map area in the two-dimensional map can be improved.
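As a sketch of how the two-dimensional map and the height information might be fed to the preset recognition model together, the example below stacks the height map as an extra input channel. The embodiments do not specify the model's architecture or interface, so the callable model and the crop class index used here are assumptions.

import numpy as np

def segment_target_crops(rgb_map: np.ndarray,
                         height_map: np.ndarray,
                         model,
                         crop_class: int = 1) -> np.ndarray:
    # rgb_map:    H x W x 3 two-dimensional map (orthomosaic) of the target area.
    # height_map: H x W height information derived from parallax in the captured images.
    # model:      any callable mapping an H x W x 4 array to per-pixel class labels
    #             (stands in for the preset recognition model; its API is assumed).
    x = np.dstack([rgb_map.astype(np.float32) / 255.0,
                   height_map.astype(np.float32)])
    labels = model(x)             # H x W integer class map (assumed output format)
    return labels == crop_class   # boolean mask of the target map area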
S403, displaying an identifier on a target map area in the two-dimensional map displayed by the display device.
In this embodiment, after determining the target map area, the control device displays an identifier on the target map area in the two-dimensional map displayed by the display device. The identifier is any symbol that serves to identify the area, and may be one or more of a tile, a grid, a dot matrix, a number and a letter. Through the identifier displayed by the display device, the user can quickly and intuitively determine which map areas in the two-dimensional map are target map areas. As shown in fig. 6, a fruit tree is taken as the target object, and darker tiles are displayed on the target map areas corresponding to the fruit trees; in some embodiments the identifier is a grid over the target map areas, which is not specifically limited here.
In this embodiment, a two-dimensional map of the target area is acquired and displayed on the display device, wherein the target area contains a target crop and the two-dimensional map is obtained from captured images acquired while the surveying and mapping unmanned aerial vehicle flies over the target area; a target map area corresponding to the target crop is determined in the two-dimensional map, wherein the target map area is used for determining spraying control information and the spraying control information is used for controlling the spraying unmanned aerial vehicle to spray the target crop; and an identifier is displayed on the target map area in the two-dimensional map displayed by the display device. Through the identifier, the user can quickly and intuitively determine the target map area in the two-dimensional map, which is the area where the spraying operation needs to be performed, so that the spraying control information can be obtained more intelligently and quickly, improving both the spraying effect and the spraying efficiency.
In some embodiments, the target area contains multiple types of target objects, for example fruit trees, buildings, ground, water surfaces, utility poles and so on, and the multiple types of target objects include the target crop. One possible implementation of S402 is: identifying the multiple types of target objects in the two-dimensional map to determine the map areas corresponding to each type of target object in the two-dimensional map. In this embodiment, the multiple types of target objects in the two-dimensional map can be identified, so that the map areas corresponding to the various types of target objects can be determined. Optionally, one possible implementation of identifying the multiple types of target objects in the two-dimensional map is: inputting the two-dimensional map into a preset recognition model to recognize the multiple types of target objects in the two-dimensional map. Further, the height information of the target area and the two-dimensional map may both be input into the preset recognition model to recognize the various types of target objects in the two-dimensional map.
Accordingly, one possible implementation of S403 is: displaying, on the map area corresponding to each type of target object in the two-dimensional map displayed by the display device, an identifier corresponding to that type of target object, where different types of target objects use different kinds of identifiers. In this embodiment, after the multiple types of target objects in the two-dimensional map have been identified and the corresponding map areas determined, an identifier corresponding to each type of target object is displayed on that type's map area in the two-dimensional map displayed by the display device. For example, identifiers corresponding to fruit trees are displayed on the map areas corresponding to fruit trees, identifiers corresponding to buildings are displayed on the map areas corresponding to buildings, and identifiers corresponding to water surfaces are displayed on the map areas corresponding to water surfaces. Since different types of target objects use different kinds of identifiers, the user can quickly determine the map areas corresponding to the different types of target objects through the different kinds of identifiers displayed in the two-dimensional map in the display interface, for example as shown in fig. 6.
The different kinds of identifiers may differ in at least one of color, shape, transparency, line type and dot type; for example, different kinds of identifiers may be distinguished by different colors, shapes or transparencies.
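A minimal sketch of displaying different kinds of identifiers for different types of target objects is given below, using semi-transparent colored tiles blended over each class's map area; the specific class names, colors and alpha value are illustrative assumptions.

import numpy as np

# Illustrative colors per object type; the embodiments only require that
# different types of target objects use visually distinct identifiers.
CLASS_COLORS = {
    "fruit_tree": (0, 160, 0),
    "building":   (200, 60, 60),
    "water":      (60, 60, 220),
}

def overlay_identifiers(rgb_map: np.ndarray,
                        class_masks: dict,
                        alpha: float = 0.5) -> np.ndarray:
    # class_masks maps an object-type name to a boolean H x W mask of its map area.
    out = rgb_map.astype(np.float32).copy()
    for name, mask in class_masks.items():
        color = np.array(CLASS_COLORS[name], dtype=np.float32)
        out[mask] = (1.0 - alpha) * out[mask] + alpha * color
    return out.astype(np.uint8)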
Optionally, the control device of the present embodiment further detects an identifier display operation input by the user before displaying an identifier corresponding to each type of target object on a map area corresponding to the type of target object in the two-dimensional map displayed by the display device. Then, the control device displays the identifier corresponding to each type of target object on the map area corresponding to the type of target object in the two-dimensional map displayed by the display device according to the identifier display operation.
The control device may detect the identifier display operation through an interaction means, for example. The interaction means may be an important component of the control device and serves as the interface for interaction with the user; the user can control the control device by operating the interaction means. When the user wants to control the control device, the user operates the interaction means of the control device, and the control device detects the user's operation through the interaction means. In this embodiment, when the user wants to display the identifiers, the user performs an identifier display operation on the interaction means, and the interaction means detects the identifier display operation, so that the control device can detect the user's identifier display operation through the interaction means. In some embodiments, the interaction means may be integrated with the display device described above as a touch display screen.
In some embodiments, one possible implementation of determining the target map area corresponding to the target crop in the two-dimensional map is: determining an initial map area corresponding to the target crop in the two-dimensional map, displaying the initial map area, and detecting an area editing operation of the user, where the area editing operation is used for editing the initial map area; the target map area is then determined according to the detected area editing operation.
The control device of this embodiment may determine an initial map area corresponding to the target crop in the two-dimensional map; for example, it may input the two-dimensional map into the preset recognition model to determine a map area corresponding to the target crop, and this map area may be referred to as the initial map area. Further, the height information of the target area and the two-dimensional map may both be input into the preset recognition model to determine the map area corresponding to the target crop (i.e. the initial map area). After the initial map area is determined, it is displayed by the display device. If the user needs to edit the initial map area, for example because some target crops were not recognized or because some non-target crops were recognized as target crops, the control device of this embodiment may detect the user's area editing operation and then determine the target map area according to it, for example as shown in fig. 7 or fig. 8. The area editing operation may be an area adding operation, in which case the determined target map area contains the initial map area and is larger than it; an area deleting operation, in which case the determined target map area is contained in the initial map area and is smaller than it; or an area correction operation, in which case the determined target map area corrects the erroneous parts of the initial map area.
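The area editing operations could be applied to the recognized initial map area roughly as sketched below, treating the map areas as boolean pixel masks; the operation names "add" and "delete" and the mask representation are assumptions made for the example (an area correction can be expressed as a delete followed by an add).

import numpy as np

def apply_region_edits(initial_mask: np.ndarray, edits) -> np.ndarray:
    # edits: iterable of (op, region_mask) pairs, where op is "add" or "delete"
    # and region_mask is a boolean H x W array drawn by the user.
    target_mask = initial_mask.copy()
    for op, region in edits:
        if op == "add":            # target crops the model missed
            target_mask |= region
        elif op == "delete":       # non-target crops mistakenly identified
            target_mask &= ~region
        else:
            raise ValueError(f"unknown edit operation: {op}")
    return target_mask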
In some embodiments, the control device of this embodiment also detects a calibration point determination operation input by the user; it determines the position information of the calibration point according to the calibration point determination operation and sends the position information of the calibration point to the spraying unmanned aerial vehicle, where the position information of the calibration point is used for correcting the position of the spraying unmanned aerial vehicle during spraying.
In this embodiment, the control device may detect the calibration point determination operation input by the user on the displayed two-dimensional map. Since each pixel in the two-dimensional map can represent the position information of the corresponding location, the position information of the calibration point can be determined from the calibration point determination operation, as shown in fig. 9, and then sent to the spraying unmanned aerial vehicle. After receiving the position information of the calibration point, the spraying unmanned aerial vehicle can correct its position during spraying according to that information, that is, the position output by the positioning device in the spraying unmanned aerial vehicle is corrected according to the position information of the calibration point to obtain the accurate position of the spraying unmanned aerial vehicle, thereby ensuring the spraying accuracy of the spraying unmanned aerial vehicle during spraying.
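The embodiments do not specify the correction algorithm, but a simple constant-offset correction based on the calibration point could look like the following sketch; the helper functions and the planar (x, y) position representation are assumptions.

from typing import Tuple

def position_offset(calibration_point: Tuple[float, float],
                    measured_at_calibration: Tuple[float, float]) -> Tuple[float, float]:
    # Offset between the known calibration point (taken from the two-dimensional map)
    # and the position reported by the drone's positioning device at that point.
    return (calibration_point[0] - measured_at_calibration[0],
            calibration_point[1] - measured_at_calibration[1])

def corrected_position(measured: Tuple[float, float],
                       offset: Tuple[float, float]) -> Tuple[float, float]:
    # Apply the offset to later position fixes during the spraying operation.
    return (measured[0] + offset[0], measured[1] + offset[1])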
In some embodiments, the control device of the present embodiment further generates, according to the target map area, spraying control information for the spraying unmanned aerial vehicle to spray the target crop; and displaying the spray control information. In this embodiment, the control device may generate, according to the determined target map area in the two-dimensional map, spraying control information for the spraying unmanned aerial vehicle to spray the target crop in the target map area, and then display the spraying control information through the display device, for example, may display the spraying control information on the target map area of the two-dimensional map displayed by the display device, and the embodiment is not limited thereto.
In some embodiments, the control device of this embodiment further obtains the height information of the target area, which is determined from the captured images of the mapping unmanned aerial vehicle described above. The control device then inputs the height information of the target area and the two-dimensional map of the target area into the preset recognition model, so that not only the target map area of the target crop but also the height information of the target crop can be determined.
Accordingly, one possible implementation of generating the spraying control information is: generating the spraying control information for the spraying unmanned aerial vehicle to spray the target crops according to the determined height information of the target crops and the target map area. If the heights of the target crops differ greatly, the generated spraying control information can make the spraying unmanned aerial vehicle fly and spray at a safe flight height.
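As an illustrative sketch of generating spraying control information from the target map area and the crop height information, the example below plans a single safe flight altitude above the tallest crop and a simple row-by-row visiting order over the crop positions; the clearance value and the ordering strategy are assumptions, not the claimed method.

import numpy as np

def plan_spray_route(crop_centroids: np.ndarray,
                     crop_heights: np.ndarray,
                     clearance_m: float = 2.0) -> list:
    # crop_centroids: N x 2 array of (x, y) map positions of the target crops.
    # crop_heights:   N array of crop heights taken from the height information.
    # Fly above the tallest crop so that large height differences remain safe.
    altitude = float(crop_heights.max()) + clearance_m
    # Visit crops row by row: sort by y first, then by x within each row.
    order = np.lexsort((crop_centroids[:, 0], crop_centroids[:, 1]))
    return [(float(x), float(y), altitude) for x, y in crop_centroids[order]]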
In some embodiments, the control device further detects the user's spraying operation area selection operation and determines a spraying area in the two-dimensional map according to that operation. Accordingly, one possible implementation of generating the spraying control information for the spraying unmanned aerial vehicle to spray the target crops according to the target map area is: generating the spraying control information according to the target map area within the spraying area.
In some application scenarios, the user only needs to spray the target objects in part of the target area, so the user can plan the spraying area. Accordingly, the control device may detect the user's spraying operation area selection operation and determine the spraying area in the two-dimensional map according to that operation, for example as shown in fig. 9. After the spraying area is determined, the spraying control information generated by the control device is generated according to the target map area within the spraying area, and the spraying unmanned aerial vehicle sprays the target crops in the target map area within the spraying area according to the spraying control information.
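Restricting the generated spraying control information to the target map area inside the user-selected spraying area can be sketched as a point-in-polygon filter over the crop positions; the use of matplotlib's Path for the containment test is an illustrative choice, not part of the embodiments.

import numpy as np
from matplotlib.path import Path  # point-in-polygon test; library choice is illustrative

def crops_in_spray_area(crop_centroids: np.ndarray,
                        spray_area_polygon: np.ndarray) -> np.ndarray:
    # Returns a boolean array marking crops whose centroid lies inside the
    # user-selected spraying area (an M x 2 polygon in map coordinates).
    return Path(spray_area_polygon).contains_points(crop_centroids)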
In some embodiments, the control device of this embodiment further generates and displays a three-dimensional scene according to the two-dimensional map and the height information of the target area. Accordingly, one possible implementation of displaying the spraying control information is: displaying the spraying control information in the three-dimensional scene. In this embodiment, after obtaining the height information of the target area, the control device may generate a three-dimensional scene of the target area according to the height information and the two-dimensional map, and then display the three-dimensional scene through the display device, for example as shown in fig. 10. In some embodiments, the identifier may also be displayed in the displayed three-dimensional scene, for example as shown in fig. 11.
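A minimal sketch of building a displayable three-dimensional scene from the two-dimensional map and the height information is shown below, producing a colored point cloud that any 3D viewer could render; the ground sampling distance parameter and the point-cloud representation are assumptions.

import numpy as np

def build_point_cloud(rgb_map: np.ndarray,
                      height_map: np.ndarray,
                      ground_sampling_distance_m: float = 0.1):
    # Convert the two-dimensional map plus height map into (points, colors).
    h, w = height_map.shape
    ys, xs = np.mgrid[0:h, 0:w]
    points = np.column_stack([
        xs.ravel() * ground_sampling_distance_m,   # x in metres
        ys.ravel() * ground_sampling_distance_m,   # y in metres
        height_map.ravel(),                        # z from the height information
    ])
    colors = rgb_map.reshape(-1, 3)
    return points, colors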
The control device may then, after determining the spray control information, also display the spray control information in a three-dimensional scene, for example as shown in fig. 12, so that the user knows how the target crop is sprayed by the spraying drone. In some embodiments, the spray control information may also be displayed in a three-dimensional scene with an identifier displayed, for example as shown in fig. 13.
In some embodiments, the control device of this embodiment further detects a spraying control information adjustment operation of the user and adjusts the spraying control information according to that operation. In this embodiment, after the control device displays the spraying control information, if the user considers that the spraying control information is not suitable, it can be adjusted; accordingly, the control device may detect the user's spraying control information adjustment operation and adjust the spraying control information according to it. The adjusted spraying control information can make the spraying unmanned aerial vehicle safer during spraying, or minimize the time or power consumed, greatly improving the spraying efficiency of the spraying unmanned aerial vehicle.
In summary, a two-dimensional map of the target area is acquired and displayed on the display device, wherein the target area contains a target crop and the two-dimensional map is obtained from captured images acquired while the surveying and mapping unmanned aerial vehicle flies over the target area; a target map area corresponding to the target crop is determined in the two-dimensional map, wherein the target map area is used for determining spraying control information and the spraying control information is used for controlling the spraying unmanned aerial vehicle to spray the target crop; and an identifier is displayed on the target map area in the two-dimensional map displayed by the display device. Through the identifier, the user can quickly and intuitively determine the target map area in the two-dimensional map, which is the area where the spraying operation needs to be performed, so that the spraying control information can be obtained more intelligently, improving the spraying effect and the spraying efficiency. Moreover, after the spraying control information is generated, it can be displayed, so that the user can quickly learn the flight and spraying path of the spraying unmanned aerial vehicle; in addition, the displayed spraying control information can be adjusted as required, which further improves the spraying effect and efficiency and ensures the safety of the spraying operation.
An embodiment of the present invention also provides a computer storage medium storing program instructions which, when executed, may perform some or all of the steps of the control method of fig. 4 and its corresponding embodiment.
Fig. 14 is a schematic structural diagram of a control device according to an embodiment of the present invention, as shown in fig. 14, a control device 1400 according to the present embodiment may include: a processor 1401 and a display device 1402. The processor 1401 and the display device 1402 are connected via a bus. Optionally, the control device 1400 may further comprise an interaction means 1403, which interaction means 1403 may be connected to the above-mentioned components via a bus. Optionally, the control device 1400 may further comprise a communication device 1404, where the communication device 1404 may be connected to the above-mentioned components by a bus.
The processor 1401 is configured to obtain a two-dimensional map of a target area, where the target area includes a target crop, where the two-dimensional map is obtained from a captured image obtained when the mapping unmanned aerial vehicle flies over the target area.
The display device 1402 is configured to display the two-dimensional map acquired by the processor 1401.
The processor 1401 is further configured to determine a target map area corresponding to the target crop in the two-dimensional map, where the target map area is used to determine spraying control information, and the spraying control information is used to control the spraying of the target crop by the spraying unmanned aerial vehicle.
The display device 1402 is further configured to display an identifier on a target map area in the displayed two-dimensional map.
In some embodiments, the processor 1401 is specifically configured to, when determining the target map area corresponding to the target crop in the two-dimensional map: input the two-dimensional map into a preset recognition model to determine the target map area.
In some embodiments, the processor 1401 is further configured to obtain height information of the target area, where the height information is determined according to the captured image.
The processor 1401 is specifically configured to, when inputting the two-dimensional map into the preset recognition model to determine the target map area: input the height information and the two-dimensional map into the preset recognition model to determine the target map area.
In some embodiments, the target area includes a plurality of types of target objects therein, wherein the plurality of types of target objects include the target crop.
The processor 1401 is specifically configured to, when determining a target map area corresponding to the target crop in the two-dimensional map: identify multiple types of target objects in the two-dimensional map to determine the map areas corresponding to the various types of target objects in the two-dimensional map.
The display device 1402 is specifically configured to, when displaying an identifier on a target map area in the displayed two-dimensional map: display, on the map area corresponding to each type of target object in the displayed two-dimensional map, an identifier corresponding to that type of target object, where different types of target objects use different kinds of identifiers.
In some embodiments, the processor 1401 is specifically configured to, when determining the target map area corresponding to the target crop in the two-dimensional map:
determining an initial map area corresponding to the target crop in the two-dimensional map;
displaying the initial map area;
determining a target map area according to the detected area editing operation of the user, wherein the area editing operation is used for editing the initial map area;
Wherein the interaction means 1403 is used to detect a region editing operation of the user.
In some embodiments, the interaction means 1403 is configured to detect a calibration point determination operation input by the user.
The processor 1401 is further configured to determine location information of a calibration point according to the calibration point determining operation.
The communication device 1404 is configured to send the location information of the calibration point to the unmanned spraying plane, where the location information of the calibration point is used for correcting the location of the unmanned spraying plane during spraying.
In some embodiments, the processor 1401 is further configured to generate, according to the target map area, spray control information of the spraying unmanned aerial vehicle to spray the target crop.
The display device 1402 is configured to display the spraying control information.
In some embodiments, the processor 1401 is further configured to obtain height information of the target area, where the height information is determined according to the captured image.
The processor 1401 is specifically configured to, when determining a target map area corresponding to the target crop in the two-dimensional map: input the height information and the two-dimensional map into the preset recognition model to determine the target map area and the height information of the target crop.
The processor 1401 is specifically configured to, when generating, according to the target map area, the spraying control information for the spraying unmanned aerial vehicle to spray the target crop: generate the spraying control information for the spraying unmanned aerial vehicle to spray the target crops according to the target map area and the height information of the target crops.
In some embodiments, the processor 1401 is further configured to generate a three-dimensional scene according to the two-dimensional map and the height information of the target area.
The display device 1402 is further configured to display the three-dimensional scene.
The display device 1402 is specifically configured to, when displaying the spraying control information: display the spraying control information in the three-dimensional scene.
In some embodiments, the interaction means 1403 is configured to detect a user's spray job area selection operation, and determine a spray area in the two-dimensional map according to the spray job area selection operation;
The processor 1401 is specifically configured to, when generating, according to the target map area, the spraying control information for the spraying unmanned aerial vehicle to spray the target crop: generate the spraying control information for the spraying unmanned aerial vehicle to spray the target crops according to the target map area within the spraying area.
In some embodiments, the interaction means 1403 is configured to detect a user's spray control information adjustment operation.
The processor 1401 is further configured to adjust the spraying control information according to the spraying control information adjustment operation.
In some embodiments, the spray control information includes a flight path between the target crops of the spray drone.
In some embodiments, the spraying control information includes a spraying mode used by the spraying unmanned aerial vehicle when spraying each of the plurality of target crops.
In some embodiments, the spraying mode includes one or more of a flight path, a spray amplitude, a flight speed and a spray density.
In some embodiments, the target crop is a tree.
In some embodiments, the tree is a fruit tree.
Optionally, the control apparatus 1400 of the present embodiment may further include: a memory (not shown in the figure) for storing program codes, and when the program codes are executed, the control device 1400 can implement the above-described technical solution.
The control device of the present embodiment may be used to execute the technical solutions of the control device in the foregoing method embodiments of the present invention, and its implementation principle and technical effects are similar, and are not repeated herein.
Fig. 15 is a schematic structural diagram of a control system according to an embodiment of the present invention, as shown in fig. 15, a control system 1500 of the present embodiment may include: a mapping drone 1501, a control device 1502, and a spraying drone 1503.
The mapping unmanned aerial vehicle 1501 is used for acquiring a shooting image when flying on a target area, and the shooting image is used for determining a two-dimensional map of the target area;
The control device 1502 may adopt the structure of the embodiment shown in fig. 14, and correspondingly, may execute the technical solutions of the control device in the above method embodiments, and the implementation principle and technical effects are similar, which are not repeated herein.
The spraying unmanned aerial vehicle 1503 may acquire the spraying control information determined by the control device 1502, and then spray the target crop according to the spraying control information. In one possible implementation, the spraying drone 1503 receives the spraying control information sent by the control device 1502 through a wireless communication connection or a wired communication connection. In another possible implementation, after determining the spray control information, the control device 1502 stores the spray control information in a storage device, and accordingly, the spray drone 1503 retrieves the spray control information from the storage device. The storage device is, for example, an SD card, the control device may store the obtained spray control information in the SD card, and then the user pulls out the SD card from the control device and inserts the SD card into the spray unmanned aerial vehicle 1503, and the spray unmanned aerial vehicle 1503 obtains the spray control information from the SD card inserted therein.
Those of ordinary skill in the art will appreciate that: all or part of the steps for implementing the above method embodiments may be implemented by hardware associated with program instructions, where the foregoing program may be stored in a computer readable storage medium, and when executed, the program performs steps including the above method embodiments; and the aforementioned storage medium includes: read-Only Memory (ROM), random access Memory (Random Access Memory, RAM), magnetic or optical disk, and the like.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the invention.

Claims (26)

1. A control method, characterized by comprising:
acquiring and displaying a two-dimensional map of a target area on a display device, wherein the target area comprises a plurality of types of target objects, and the plurality of types of target objects comprise target crops;
Determining a target map area corresponding to the target crop in the two-dimensional map;
Displaying identifiers on the target map areas displayed by the display device, wherein the target objects of the multiple types in the two-dimensional map respectively correspond to multiple map areas, the multiple map areas comprise the target map areas, identifiers representing the target objects of the corresponding types are respectively displayed on the multiple map areas, and different identifiers are adopted for the target objects of different types;
Generating spraying control information according to the target map area, wherein the spraying control information is used for controlling the unmanned spraying machine to spray the target crops, and the spraying control information comprises the flight path of the unmanned spraying machine between the target crops and a spraying mode aiming at the target crops; and
Displaying the spray control information in the three-dimensional scene in a case where the display device displays the three-dimensional scene, wherein the three-dimensional scene is determined according to the height information of the target area and the two-dimensional map;
Wherein the determining the target map area corresponding to the target crop in the two-dimensional map includes:
determining an initial map area corresponding to the target crop in the two-dimensional map;
and determining the target map area according to the area editing operation of the user on the initial map area.
2. The method of claim 1, wherein the determining a target map area in the two-dimensional map corresponding to the target crop comprises:
the two-dimensional map is input into a preset recognition model to determine the initial map region.
3. The method as recited in claim 2, further comprising: acquiring the height information of the target area, wherein the height information of the target area is determined according to the shot image;
The inputting the two-dimensional map into a preset recognition model to determine the initial map area comprises the following steps:
And inputting the height information of the target area and the two-dimensional map into the preset recognition model to determine the initial map area.
4. A method according to any one of claims 1 to 3, wherein,
The determining the target map area according to the area editing operation of the user on the initial map area comprises the following steps:
detecting the region editing operation of a user;
And determining the target map area according to the detected area editing operation.
5. The method according to any one of claims 1 to 3, further comprising:
detecting a calibration-point determination operation input by a user; and
determining position information of a calibration point according to the calibration-point determination operation, and sending the position information of the calibration point to the spraying unmanned aerial vehicle, wherein the position information of the calibration point is used to correct the position of the spraying unmanned aerial vehicle during spraying.
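Claim 5 sends the position of a user-defined calibration point to the vehicle so that its position can be corrected during spraying. A common form of such correction, assumed here only for illustration, is a constant planar offset between the surveyed calibration point and the position the vehicle actually measures at it:

```python
from typing import List, Tuple

Point = Tuple[float, float]


def correction_offset(surveyed: Point, measured: Point) -> Point:
    """Offset that maps the vehicle's measured fix onto the surveyed calibration point."""
    return (surveyed[0] - measured[0], surveyed[1] - measured[1])


def correct_path(path: List[Point], offset: Point) -> List[Point]:
    """Shift every planned waypoint by the calibration offset."""
    return [(x + offset[0], y + offset[1]) for x, y in path]


offset = correction_offset(surveyed=(100.0, 200.0), measured=(100.8, 199.4))
print(correct_path([(10.0, 5.0), (12.0, 5.0)], offset))
```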
6. The method according to any one of claims 1 to 3, further comprising:
displaying the spraying control information in the target map area of the two-dimensional map when the display device displays the two-dimensional map.
7. The method of claim 6, wherein the determining a target map area corresponding to the target crop in the two-dimensional map comprises:
inputting the height information of the target area and the two-dimensional map into a preset recognition model to determine the target map area and height information of the target crop;
and wherein the generating, according to the target map area, spraying control information for the spraying unmanned aerial vehicle to spray the target crop comprises:
generating the spraying control information for the spraying unmanned aerial vehicle to spray the target crop according to the target map area and the height information of the target crop.
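Claim 7 additionally recovers the height of the target crop and uses it, together with the target map area, when generating the spraying control information. One straightforward use, shown purely as an illustration and not taken from the patent, is to keep a fixed clearance above each crop when assigning flight altitudes to waypoints:

```python
from typing import Dict, List, Tuple

Point = Tuple[float, float]


def plan_altitudes(crop_heights: Dict[Point, float],
                   path: List[Point],
                   clearance_m: float = 2.0) -> List[Tuple[float, float, float]]:
    """Attach a z-coordinate to each waypoint: crop height plus a safety clearance."""
    return [(x, y, crop_heights.get((x, y), 0.0) + clearance_m) for x, y in path]


heights = {(10.0, 5.0): 3.2, (12.0, 5.0): 2.8}
print(plan_altitudes(heights, [(10.0, 5.0), (12.0, 5.0)]))
```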
8. The method of claim 1, wherein the displaying the spraying control information in the three-dimensional scene comprises:
displaying the spraying control information in the three-dimensional scene in which the identifier is displayed.
9. The method of claim 1, further comprising: detecting a spraying-area selection operation of a user, and determining a spraying area in the two-dimensional map according to the spraying-area selection operation;
wherein the generating, according to the target map area, spraying control information for the spraying unmanned aerial vehicle to spray the target crop comprises:
generating the spraying control information for the spraying unmanned aerial vehicle to spray the target crop according to the target map area within the spraying area.
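Claim 9 restricts generation of the spraying control information to the part of the target map area that falls inside the spraying area selected by the user. If both areas are represented as boolean masks over the same map grid (an assumption of this sketch, not something the patent specifies), that restriction is an element-wise AND:

```python
import numpy as np


def restrict_to_spray_area(target_mask: np.ndarray, spray_mask: np.ndarray) -> np.ndarray:
    """Keep only target-crop cells that lie inside the user-selected spraying area."""
    return np.logical_and(target_mask, spray_mask)


target = np.zeros((8, 8), dtype=bool)
target[2:6, 2:6] = True          # cells detected as target crop
spray = np.zeros((8, 8), dtype=bool)
spray[0:4, :] = True             # user-selected spraying area
print(int(restrict_to_spray_area(target, spray).sum()), "cells will be sprayed")
```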
10. The method of claim 1, further comprising:
detecting a spraying control information adjustment operation of a user; and
adjusting the spraying control information according to the spraying control information adjustment operation.
11. The method of claim 1, wherein the spraying mode comprises one or more of spray width, flight speed, and spray density.
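Claim 11 lists spray width, flight speed, and spray density as the components of the spraying mode. These three quantities jointly determine the nozzle flow rate the vehicle must deliver; the conversion below (area covered per second times the target density, with 10 000 m² per hectare) is ordinary arithmetic, not something specific to this patent, and the figures are illustrative only.

```python
def required_flow_l_per_min(spray_width_m: float,
                            flight_speed_mps: float,
                            spray_density_l_per_ha: float) -> float:
    """Flow rate needed so that the chosen width and speed achieve the target density."""
    area_per_second_m2 = spray_width_m * flight_speed_mps
    flow_l_per_s = spray_density_l_per_ha * area_per_second_m2 / 10_000.0
    return flow_l_per_s * 60.0


# Example: a 4 m swath at 3 m/s with 15 L/ha needs about 1.08 L/min.
print(round(required_flow_l_per_min(4.0, 3.0, 15.0), 2))
```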
12. The method according to any one of claims 1 to 3, wherein the target crop comprises a tree.
13. The method of claim 12, wherein the tree comprises a fruit tree.
14. A control apparatus, characterized by comprising: a processor and a display device;
the processor is configured to acquire a two-dimensional map of a target area, wherein the target area contains multiple types of target objects, and the multiple types of target objects include a target crop;
the display device is configured to display the two-dimensional map acquired by the processor;
the processor is further configured to determine a target map area corresponding to the target crop in the two-dimensional map;
the display device is further configured to display an identifier on the displayed target map area, wherein the multiple types of target objects in the two-dimensional map respectively correspond to multiple map areas, the multiple map areas include the target map area, an identifier representing the corresponding type of target object is displayed on each of the multiple map areas, and different identifiers are used for different types of target objects;
the processor is further configured to generate spraying control information according to the target map area, wherein the spraying control information is used to control a spraying unmanned aerial vehicle to spray the target crop, and the spraying control information includes a flight path of the spraying unmanned aerial vehicle between the target crops and a spraying mode for the target crop;
the processor is further configured to adjust the spraying control information according to a spraying control adjustment operation of a user; and
the display device is further configured to display the adjusted spraying control information in a three-dimensional scene, the three-dimensional scene being determined according to height information of the target area and the two-dimensional map.
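Claim 14 derives the three-dimensional scene from the two-dimensional map plus the height information of the target area. A minimal way to picture that derivation, offered only as a sketch and not as the patented rendering pipeline, is to lift each map cell to a 3-D vertex at its measured height:

```python
from typing import List, Tuple


def lift_to_3d(cell_size_m: float,
               height_grid: List[List[float]]) -> List[Tuple[float, float, float]]:
    """Turn a 2-D height grid into (x, y, z) vertices for a simple 3-D scene."""
    vertices = []
    for row, heights in enumerate(height_grid):
        for col, z in enumerate(heights):
            vertices.append((col * cell_size_m, row * cell_size_m, z))
    return vertices


# Two-by-two grid of heights at 1 m spacing.
print(lift_to_3d(1.0, [[0.0, 0.2], [2.9, 3.1]]))
```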
15. The control apparatus of claim 14, wherein the processor, when determining the target map area corresponding to the target crop in the two-dimensional map, is specifically configured to: input the two-dimensional map into a preset recognition model to determine an initial map area, and determine the target map area according to an area editing operation performed by a user on the initial map area.
16. The control apparatus of claim 15, wherein the processor is further configured to acquire the height information of the target area, the height information of the target area being determined according to a captured image;
and wherein the processor, when inputting the two-dimensional map into the preset recognition model to determine the initial map area, is specifically configured to: input the height information of the target area and the two-dimensional map into the preset recognition model to determine the initial map area.
17. The control apparatus according to claim 15 or 16, wherein
the processor, when determining the target map area according to the area editing operation performed by the user on the initial map area, is specifically configured to: determine the target map area according to the detected area editing operation.
18. The control apparatus according to any one of claims 14 to 16, further comprising: an interaction device and a communication device;
the interaction device is configured to detect a calibration-point determination operation input by a user;
the processor is further configured to determine position information of a calibration point according to the detected calibration-point determination operation; and
the communication device is configured to send the position information of the calibration point to the spraying unmanned aerial vehicle, the position information of the calibration point being used to correct the position of the spraying unmanned aerial vehicle during spraying.
19. The control apparatus according to any one of claims 14 to 16, wherein
the display device is configured to display the spraying control information in the target map area of the two-dimensional map when the display device displays the two-dimensional map.
20. The control apparatus of claim 14, wherein
the processor, when determining the target map area corresponding to the target crop in the two-dimensional map, is specifically configured to: input the height information of the target area and the two-dimensional map into a preset recognition model to determine the target map area and height information of the target crop; and
the processor, when generating the spraying control information according to the target map area, is specifically configured to: generate the spraying control information for the spraying unmanned aerial vehicle to spray the target crop according to the target map area and the height information of the target crop.
21. The control apparatus of claim 14, wherein
the display device, when displaying the spraying control information in the three-dimensional scene, is specifically configured to: display the spraying control information in the three-dimensional scene in which the identifier is displayed.
22. The control apparatus of claim 14, further comprising: an interaction device;
the interaction device is configured to detect a spraying-area selection operation of a user and to determine a spraying area in the two-dimensional map according to the spraying-area selection operation; and
the processor, when generating the spraying control information according to the target map area, is specifically configured to: generate the spraying control information for the spraying unmanned aerial vehicle to spray the target crop according to the target map area within the spraying area.
23. The control apparatus of claim 14, further comprising: an interaction device;
the interaction device is configured to detect the spraying control information adjustment operation of the user.
24. The control apparatus of any one of claims 14 to 16, wherein the spraying mode comprises one or more of spray width, flight speed, and spray density.
25. The control apparatus of any one of claims 14 to 16, wherein the target crop comprises a tree.
26. The control apparatus of claim 25, wherein the tree comprises a fruit tree.
CN201880069355.2A 2018-11-29 2018-11-29 Control method and apparatus Active CN111279284B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/118297 WO2020107354A1 (en) 2018-11-29 2018-11-29 Control method and device

Publications (2)

Publication Number Publication Date
CN111279284A CN111279284A (en) 2020-06-12
CN111279284B true CN111279284B (en) 2024-05-17

Family

ID=70851880

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880069355.2A Active CN111279284B (en) 2018-11-29 2018-11-29 Control method and apparatus

Country Status (2)

Country Link
CN (1) CN111279284B (en)
WO (1) WO2020107354A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112506464A (en) * 2020-12-14 2021-03-16 广州极飞科技有限公司 Equipment display method and device, electronic equipment, system and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108008735A (en) * 2017-11-07 2018-05-08 深圳常锋信息技术有限公司 Plant protection operation control method, system and the terminal device of unmanned plane

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105159319A (en) * 2015-09-29 2015-12-16 广州极飞电子科技有限公司 Spraying method of unmanned plane and unmanned plane
CN105259909A (en) * 2015-11-08 2016-01-20 杨珊珊 Vegetation data acquisition method and acquisition apparatus based on unmanned aerial vehicle
EP3185170A1 (en) * 2015-12-22 2017-06-28 Thomson Licensing Method for controlling movement of at least one movable object, computer readable storage medium and apparatus configured to control movement of at least one movable object
CN107933921A (en) * 2017-10-30 2018-04-20 广州极飞科技有限公司 Aircraft and its sprinkling Route Generation and execution method, apparatus, control terminal
CN107950506A (en) * 2017-11-15 2018-04-24 广州极飞科技有限公司 Mobile device, the sprinkling control method and device based on mobile device
CN108253971A (en) * 2017-12-29 2018-07-06 深圳创动科技有限公司 A kind of method for inspecting and system
CN108846325A (en) * 2018-05-28 2018-11-20 广州极飞科技有限公司 Planing method, device, storage medium and the processor of target area operation

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
"互联网+林业"背景下智能飞防***设计综述;龙树全;周文安;杨显华;牛颢;;电子技术与软件工程;20180710(第13期);全文 *
基于无人机可见光影像的农田作物分类方法比较;郭鹏;武法东;戴建国;王海江;徐丽萍;张国顺;;农业工程学报;20170708(第13期);全文 *
飞机防治马尾松毛虫关键技术探讨;李永松;伍南;李密;;绿色科技;20171015(第19期);全文 *

Also Published As

Publication number Publication date
WO2020107354A1 (en) 2020-06-04
CN111279284A (en) 2020-06-12

Similar Documents

Publication Publication Date Title
CN109358650B (en) Routing inspection path planning method and device, unmanned aerial vehicle and computer readable storage medium
US11498676B2 (en) Method and apparatus for controlling flight of unmanned aerial vehicle
US10599142B2 (en) Display device and control method for display device
CN105116911B (en) Unmanned plane spray method
US9275267B2 (en) System and method for automatic registration of 3D data with electro-optical imagery via photogrammetric bundle adjustment
EP3859480A1 (en) Unmanned aerial vehicle control method, device and spraying system, and unmanned aerial vehicle and storage medium
US20110115812A1 (en) Method for colorization of point cloud data based on radiometric imagery
US11543836B2 (en) Unmanned aerial vehicle action plan creation system, method and program
CN106292799B (en) Unmanned plane, remote control and its control method
WO2020103108A1 (en) Semantic generation method and device, drone and storage medium
CN106647804A (en) Automatic routing inspection method and system
CN109076173A (en) Image output generation method, equipment and unmanned plane
WO2020103109A1 (en) Map generation method and device, drone and storage medium
US20180276997A1 (en) Flight tag obtaining method, terminal, and server
CN108885467B (en) Control method, terminal, management platform, system and storage medium
JP2015177397A (en) Head-mounted display, and farm work assistance system
CN106919186A (en) Unmanned vehicle flight control operation method and device
CN105701496B (en) A kind of go disk recognition methods based on artificial intelligence technology
CN109479086A (en) Method and apparatus relative to object zoom
CN111279284B (en) Control method and apparatus
CN109801336A (en) Airborne target locating system and method based on visible light and infrared light vision
CN113741495B (en) Unmanned aerial vehicle attitude adjustment method and device, computer equipment and storage medium
CN112991440A (en) Vehicle positioning method and device, storage medium and electronic device
CN112180987B (en) Collaborative operation method, collaborative operation system, collaborative operation device, collaborative operation computer equipment and collaborative operation storage medium
CN112106112A (en) Point cloud fusion method, device and system and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant