WO2019042067A1 - Aerial vehicle control method, aerial vehicle, program and recording medium - Google Patents

Aerial vehicle control method, aerial vehicle, program and recording medium

Info

Publication number
WO2019042067A1
Authority
WO
WIPO (PCT)
Prior art keywords
work area
image information
sample image
flying body
area
Prior art date
Application number
PCT/CN2018/098109
Other languages
French (fr)
Chinese (zh)
Inventor
顾磊
王向伟
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to CN201880015439.8A (published as CN110383333A)
Publication of WO2019042067A1

Classifications

    • G05D 1/10 — Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots; simultaneous control of position or course in three dimensions
    • G05D 1/0202 — Control of position or course in two dimensions, specially adapted to aircraft
    • B64C 13/18 — Control systems for actuating flying-control surfaces; initiating means actuated automatically, e.g. responsive to gust detectors, using automatic pilot
    • G06T 7/00 — Image analysis
    • G06T 7/90 — Image analysis; determination of colour characteristics
    • B64U 10/16 — Type of UAV; flying platforms with five or more distinct rotor axes, e.g. octocopters
    • B64U 2101/30 — UAVs specially adapted for particular uses or applications: imaging, photography or videography
    • B64U 2101/45 — UAVs specially adapted for particular uses or applications: releasing liquids or powders in-flight, e.g. crop-dusting
    • B64U 2201/104 — UAVs characterised by their flight controls: autonomous navigation using satellite radio beacon positioning systems, e.g. GPS

Definitions

  • the present disclosure relates to a flying body control method, a flying body, a program, and a recording medium for specifying a work area in which a flying body performs work.
  • Flying bodies flying in the air are used in various fields, such as aerial photography of objects from above, aerial surveying of objects such as terrain and buildings on the ground, and air transport of goods to destinations.
  • The flying body is also used for spraying applications, for example in agriculture, where it carries spray materials such as pesticides, fertilizers, and water and sprays them onto targets such as crops within a predetermined work area on farmland.
  • Patent Document 1 discloses a rotary-wing drone including four rotors, a bracket, a main control box, a windproof box, and a spray box, with the rotors mounted at the outer ends of the X-shaped bracket.
  • A windproof box is provided in the lower part of the X-shaped bracket, and the spray box is arranged inside the windproof box.
  • Patent Document 1: Chinese Utility Model No. 204297097.
  • When the unmanned aerial vehicle disclosed in Patent Document 1 is used to perform work such as spraying a spray material (for example, a pesticide, a fertilizer, or water), the work is generally required to be carried out accurately within a predetermined work area (for example, farmland). If the spray material drifts outside the work area, it may harm human health and the environment (for example, when pesticide lands on a farmhouse next to the farmland), and it also wastes the spray material. On the other hand, if part of the work area is not covered by the spray, the purpose of spraying is not achieved.
  • Conventionally, the user manually controls the flight of the drone using a transmitter or the like while visually observing the work area.
  • In the case of a large work area, however, the user must concentrate on operating the drone for a long time, which is not only a heavy physical and mental burden but also makes it difficult to perform the work accurately within the work area.
  • It is also known for a user to set the work area in advance while referring to an electronic map on a terminal device capable of communicating with the drone, so that the flying body performs the work while flying along a flight path within the set work area.
  • However, because the electronic map is not updated continuously, it does not necessarily reflect the latest terrain, and even if the map's position information is up to date, errors of several meters are not unusual.
  • In addition, for a user who is not accustomed to using a terminal device, learning how to set the work area is itself a burden.
  • In one aspect, a flying body control method for specifying a work area of a flying body includes: a step of acquiring sample image information of the work area; and a step of specifying the work area within a target area based on similarity with the sample image information.
  • the sample image information can be obtained from an image taken at a reference position located in the work area.
  • the sample image information may be acquired from one image selected from a plurality of images respectively taken at a plurality of locations located in the work area.
  • The reference position may be located at an inner position at least a predetermined distance away from the boundary line of the work area.
  • the sample image information may contain information related to the attributes of the color.
  • The step of specifying the work area within the target area may include: a step of determining a series of imaging positions within the target area; a step of performing imaging at each imaging position; and a step of specifying the work area based on the similarity between a composite image of the captured images and the sample image information.
  • The step of specifying the work area based on the similarity between the composite image of the captured images and the sample image information may include: each time imaging is performed at an imaging position, synthesizing the images captured so far, and when a similar region whose similarity to the sample image information is at or above a first threshold forms a closed region in the composite image, designating that closed region as the work area.
  • The step of designating, as the work area, a region of the composite image whose similarity to the sample image information is at or above the first threshold may include: designating, as the work area, a region of the composite image whose similarity to the sample image information is at or above the first threshold and whose area is at or above a second threshold.
  • a series of imaging positions can constitute a path that starts from the reference position and expands spirally.
  • In one aspect, a flying body that specifies a work area includes a processing unit, and the processing unit acquires sample image information of the work area and specifies the work area within a target area based on similarity with the sample image information.
  • the flying body also has an imaging device, and the sample image information can be acquired from an image captured at a reference position located in the work area.
  • the sample image information may be acquired from one image selected from a plurality of images respectively taken at a plurality of locations located in the work area.
  • The reference position may be located at an inner position at least a predetermined distance away from the boundary line of the work area.
  • the sample image information may contain information related to the attributes of the color.
  • the processing unit may determine a series of imaging positions in the target region, perform imaging at each imaging position, and specify a work region based on the similarity between the composite image of the captured image and the sample image information.
  • Each time imaging is performed at an imaging position, the processing unit may synthesize the images captured so far, and when a similar region whose similarity to the sample image information is at or above a first threshold forms a closed region in the composite image, designate that closed region as the work area.
  • The processing unit may acquire information indicating a predetermined range of the target area, and designate, as the work area, a region whose similarity to the sample image information is at or above the first threshold in an image obtained by synthesizing the images captured at the imaging positions within the predetermined range.
  • The processing unit may designate, as the work area, a region of the composite image whose similarity to the sample image information is at or above the first threshold and whose area is at or above a second threshold.
  • a series of imaging positions can constitute a path that starts from the reference position and expands spirally.
  • FIG. 1 is a diagram showing an example of the appearance of an unmanned aerial vehicle.
  • FIG. 2 is a block diagram showing an example of the hardware configuration of the unmanned aerial vehicle.
  • FIG. 3 is a flowchart showing an example of the processing procedure of the flying body control method in the present disclosure.
  • FIG. 4 is a schematic view showing an example of a work area in the present disclosure.
  • FIG. 5 is a schematic diagram showing an example of a reference position in the present disclosure.
  • FIG. 6 is a flowchart showing an example of the procedure for designating a work area in the present disclosure.
  • FIG. 7 is a schematic diagram showing a series of imaging positions in the work area that constitute a probe path.
  • FIG. 8 is a schematic diagram showing the image ranges captured at the series of imaging positions.
  • FIG. 9 is a flowchart showing one embodiment of the steps of designating a work area.
  • FIG. 10 is a schematic diagram showing a composite image of images captured at a series of imaging positions.
  • FIG. 11 is a schematic diagram showing a composite image of images captured at a series of imaging positions.
  • FIG. 12 is a schematic diagram showing a composite image of images captured at a series of imaging positions.
  • FIG. 13 is a schematic diagram showing an example of a work area in the present disclosure.
  • FIG. 14 is a flowchart showing one embodiment of the steps of designating a work area.
  • FIG. 15 is a schematic diagram showing a predetermined range set in the target area.
  • FIG. 16 is a schematic diagram showing the designated work area.
  • the flight control method defines various processes (steps) in the information processing apparatus for controlling the flight of the flying body.
  • The flying body includes an aircraft that moves in the air (e.g., a remotely controlled unmanned aircraft or a helicopter).
  • the flying body can also be an unmanned aerial vehicle (UAV).
  • the flying body can specify a work area (for example, farmland) for performing predetermined operations (for example, spraying pesticides, fertilizers, water, etc.).
  • the program related to the present disclosure is a program for causing the information processing apparatus to execute various processes (steps).
  • the recording medium to which the present disclosure relates is recorded with a program (i.e., a program for causing the information processing apparatus to execute various processes (steps)).
  • the flying body is exemplified by an unmanned aerial vehicle (UAV).
  • the unmanned aerial vehicle is labeled "UAV".
  • In each of the embodiments described below, the flying body specifies a work area containing spray targets such as crops on farmland.
  • The following description takes as an example the case where, after one flying body specifies the work area, the same flying body sets a flight path for spraying the spray material substantially uniformly over the entire designated work area and performs the work along that flight path; however, the present disclosure is not limited thereto.
  • For example, after a probe flying body according to the present disclosure specifies the work area, the information on the designated work area may be transmitted to another work flying body so that the work flying body performs the work within the designated work range.
  • In each of the embodiments of the present disclosure described below, the information processing apparatus is described taking as an example a processing unit provided inside the flying body.
  • However, an independent remote server or other information processing apparatus capable of communicating with the flying body may also be used.
  • In addition, a remote information processing apparatus capable of communicating with the processing unit inside the flying body may perform a part of the various processes (steps) according to the present disclosure.
  • The term "communication" as used here is a broad concept that includes all data communication; it covers not only wired connection by a cable or the like but also connection by wireless communication, and it includes not only the case where the information processing apparatus communicates directly with the flying body but also indirect communication via a transmitter or a storage medium.
  • FIG. 1 is a diagram showing an example of the appearance of the unmanned aerial vehicle 100.
  • the unmanned aerial vehicle 100 for example, performs a spraying operation of a spray object such as a pesticide, a fertilizer, or a water on a spray target in a designated work area after specifying a work area of the spray operation.
  • the configuration of the unmanned aerial vehicle 100 includes a UAV main body 102, a rotor mechanism 130, a nozzle 200, a storage tank 210, and an image pickup device 270.
  • the unmanned aerial vehicle 100 can, for example, designate a work area (for example, a farmland) after the predetermined target area is detected by the imaging device 270, set a flight path in the work area, move along the set flight path, and pass the nozzle. 200 sprays pesticides, fertilizers, water, and the like stored in the storage tank 210.
  • the movement of the unmanned aerial vehicle 100 refers to flight, including at least a flight of ascending, descending, left-rotating, right-rotating, left-right moving, and right-level moving. Further, when the unmanned flying body 100 is only designated for the work area, and the work is performed by the other working flying body, the unmanned flying body 100 may not include the nozzle 200 and the storage tank 210.
  • The unmanned aerial vehicle 100 is provided with a plurality of rotor mechanisms (propellers) 130.
  • the unmanned aerial vehicle 100 is provided with, for example, eight rotor mechanisms 130.
  • the unmanned aerial vehicle 100 moves the unmanned aerial vehicle 100 by controlling the rotation of these rotor mechanisms 130.
  • the number of rotors is not limited to eight.
  • the unmanned aerial vehicle 100 can be a fixed wing aircraft without a rotor.
  • FIG. 2 is a block diagram showing one example of the hardware configuration of the unmanned aerial vehicle 100.
  • The unmanned aerial vehicle 100 includes a processing unit 110, a memory 120, a rotor mechanism 130, a GPS receiver 140, an inertial measurement device 150, a magnetic compass 160, a barometric altimeter 170, a millimeter wave radar 180, a wind speed and direction meter 190, a nozzle 200, a storage tank 210, a pressure sensor 220, a flow sensor 230, a memory 240, a communication interface 250, a battery 260, and an imaging device 270.
  • the processing unit 110 is configured by a processor, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor).
  • the processing unit 110 performs signal processing for overall control of the operation of each part of the unmanned aerial vehicle 100, input/output processing of data with other parts, arithmetic processing of data, and storage processing of data.
  • the processing unit 110 has a function of performing processing related to control of flight in the unmanned aerial vehicle 100.
  • the processing unit 110 controls the flight of the unmanned aerial vehicle 100 in accordance with a program stored in the memory 120 or the memory 240 and information related to the flight path. In addition, the processing unit 110 controls the unmanned aerial vehicle 100 in accordance with data and instructions received from a remote server via the communication interface 250.
  • the processing unit 110 controls the flight of the unmanned aerial vehicle 100 by controlling the rotor mechanism 130. That is, the processing unit 110 controls the position including the latitude, longitude, and altitude of the unmanned aerial vehicle 100 by controlling the rotor mechanism 130.
  • the processing unit 110 controls the rotor mechanism 130 based on position information acquired by at least one of the GPS receiver 140, the inertial measurement device 150, the magnetic compass 160, the barometric altimeter 170, and the millimeter wave radar 180.
  • the memory 120 is an example of a storage section.
  • The memory 120 stores programs and the like necessary for the processing unit 110 to control the rotor mechanism 130, the GPS receiver 140, the inertial measurement device 150, the magnetic compass 160, the barometric altimeter 170, the millimeter wave radar 180, the wind speed and direction meter 190, the nozzle 200, the storage tank 210, the pressure sensor 220, the flow sensor 230, the memory 240, the communication interface 250, and the imaging device 270.
  • the memory 120 stores various kinds of information and data used when the processing unit 110 performs processing.
  • The memory 120 may be a computer-readable recording medium and may include at least one of an SRAM (Static Random Access Memory), a DRAM (Dynamic Random Access Memory), an EPROM (Erasable Programmable Read Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), and a flash memory such as a USB (Universal Serial Bus) memory.
  • The memory 120 may be provided inside the unmanned aerial vehicle 100, or may be provided so as to be detachable from the unmanned aerial vehicle 100.
  • the rotor mechanism 130 has a plurality of rotors and a plurality of drive motors that rotate the plurality of rotors.
  • the rotor mechanism 130 controls the flight (rise, fall, horizontal movement, rotation, tilt, etc.) of the unmanned aerial vehicle 100 by rotating the rotor to generate airflow in a specific direction.
  • the GPS receiver 140 receives a plurality of signals indicating the time transmitted from a plurality of navigation satellites (ie, GPS satellites) and the position (coordinates) of each GPS satellite.
  • the GPS receiver 140 calculates the position of the GPS receiver 140 (i.e., the position of the unmanned aerial vehicle 100) based on the received plurality of signals.
  • the GPS receiver 140 outputs the position information of the unmanned aerial vehicle 100 to the processing unit 110. Further, the processing of the position information of the GPS receiver 140 may be performed by the processing unit 110 instead of the GPS receiver 140.
  • In this case, the processing unit 110 receives, as input, the information indicating the time and the position of each GPS satellite contained in the plurality of signals received by the GPS receiver 140.
  • An inertial measurement unit (IMU: Inertial Measurement Unit) 150 detects the posture of the unmanned aerial vehicle 100 and outputs the detection result to the processing unit 110.
  • the inertial measurement device 150 detects the acceleration in the three-axis direction of the front, rear, left and right, and up and down of the unmanned aerial vehicle 100, and the angular velocity in the three-axis direction of the pitch axis, the roll axis, and the yaw axis as the posture of the unmanned aerial vehicle 100.
  • the magnetic compass 160 detects the orientation of the nose of the unmanned aerial vehicle 100, and outputs the detection result to the processing unit 110.
  • the barometric altimeter 170 detects the altitude at which the unmanned aerial vehicle 100 flies, and outputs the detection result to the processing unit 110.
  • the millimeter wave radar 180 transmits a high-frequency wave of a millimeter wave band, and measures a reflected wave reflected by the ground and the object to detect the position of the ground and the object, and outputs the detection result to the processing unit 110.
  • the detection result may indicate, for example, the distance (i.e., height) from the unmanned aerial vehicle 100 to the ground.
  • the result of the detection may also represent, for example, the distance from the unmanned vehicle 100 to the object.
  • the detection result may also indicate the topography of the work area where the spraying operation is performed, for example, by the unmanned aerial vehicle 100.
  • the wind speed and direction meter 190 detects the wind speed and the wind direction around the unmanned aerial vehicle 100, and outputs the detection result to the processing unit 110.
  • the detection result may indicate the wind speed and the wind direction in the work area where the unmanned aerial vehicle 100 is flying.
  • The nozzle 200 is disposed at the end of a pipe that discharges spray material such as pesticide, fertilizer, or water, and sprays the material downward (in the vertical direction), for example.
  • The nozzle 200 may have a plurality of outlets (for example, four).
  • the nozzle 200 adjusts the on/off of the injection, the injection amount, and the injection speed based on the control of the processing unit 110. Thereby, the spray from the nozzle 200 is sprayed toward the spray target at a predetermined injection amount and injection speed.
  • the storage tank 210 contains sprays such as pesticides, fertilizers, and water.
  • the storage tank 210 sends the spray to the nozzle 200 via a pipe based on the control of the processing unit 110.
  • the nozzle 200 and the storage tank 210 are included in one example of the configuration of the spray mechanism.
  • the pressure sensor 220 detects the pressure of the spray injected from the nozzle 200 and outputs the detection result to the processing unit 110.
  • the detection result may indicate, for example, the amount of injection or the ejection speed from the nozzle 200.
  • the flow rate sensor 230 detects the flow rate of the sprayed material ejected from the nozzle 200, and outputs the detection result to the processing unit 110.
  • the detection result may indicate, for example, the amount of injection or the ejection speed from the nozzle 200.
  • the memory 240 is an example of a storage section.
  • The memory 240 stores and saves various kinds of data and information.
  • the memory 240 may be an HDD (Hard Disk Drive), an SSD (Solid State Drive), a memory card, a USB memory, or the like.
  • The memory 240 may be provided inside the unmanned aerial vehicle 100, or may be provided so as to be detachable from the unmanned aerial vehicle 100.
  • The communication interface 250 communicates with a terminal 50. The communication interface 250 receives from the terminal 50 various kinds of information relating to the flight path, the spray material, and the like, and receives from the terminal 50 various instructions for the processing unit 110.
  • the battery 260 has a function as a drive source for each part of the unmanned aerial vehicle 100, and supplies the required power to each part of the unmanned aerial vehicle 100.
  • the imaging device 270 is a camera that images an object (for example, the above-described farmland) included in a desired imaging range.
  • In the illustrated example, the imaging device 270 is fixed to the lower portion of the unmanned aerial vehicle 100; however, the imaging device 270 may instead be mounted on a gimbal that rotatably supports it around the yaw axis, the roll axis, and the pitch axis.
  • the flying body control method in the present disclosure specifies a work area in which a job is to be performed from a predetermined probe target area.
  • the unmanned aerial vehicle 100 can accurately grasp the work area by specifying the work area immediately before the work, thereby achieving efficient work.
  • FIG. 3 is a flowchart showing one example of a processing procedure (step) of the aircraft control method in the present disclosure.
  • FIG. 4 is a schematic view showing one example of the work area. In the present disclosure, for convenience of explanation, the work area A shown in FIG. 4 is taken as an example, but the area, shape, and the like of an actual work area are not limited thereto.
  • the flying body control method S100 in the present disclosure first acquires sample image information of the work area A of the unmanned aerial vehicle 100 (step S110).
  • The sample image information referred to here may be any information indicating a feature of the work area A that can be used to distinguish, within the target area, the portion that belongs to the work area A from the portion that does not; for example, it may include at least one of information related to shape and information related to color attributes.
  • The information related to color attributes may be information indicating brightness, chroma, and hue, such as RGB information.
  • The sample image information may be a regular color pattern within the work area A; for example, when the work area A is vegetable farmland, it can be characterized by a regular green color pattern.
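As a concrete illustration of the color-attribute idea above, the following is a minimal sketch (not the patent's prescribed formula) of turning a reference image into sample image information and scoring other pixels against it. It assumes the image is available as an H x W x 3 RGB NumPy array; the mean/standard-deviation descriptor and the distance-based similarity score are illustrative choices.

```python
import numpy as np

def sample_descriptor(rgb_image: np.ndarray) -> dict:
    """Summarize an H x W x 3 RGB image (values 0-255) as a simple color descriptor."""
    pixels = rgb_image.reshape(-1, 3).astype(np.float64)
    return {"mean": pixels.mean(axis=0), "std": pixels.std(axis=0)}

def color_similarity(pixels_rgb: np.ndarray, descriptor: dict) -> np.ndarray:
    """Per-pixel similarity in [0, 1]: 1.0 when a pixel equals the sample mean,
    falling off with Euclidean distance in RGB space (maximum distance ~441)."""
    diff = pixels_rgb.astype(np.float64) - descriptor["mean"]
    dist = np.linalg.norm(diff, axis=-1)
    return 1.0 - dist / np.sqrt(3 * 255.0 ** 2)
```

A pixel or patch whose score is at least 0.7 would then count as similar to the sample, matching the first threshold used later in the text.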
  • the sample image information is preferably acquired from an image obtained at a predetermined position located in the work area A.
  • the unmanned aerial vehicle 100 can move to a position above the reference position P0 according to a position signal transmitted by a beacon of a reference position P0 set in the work area A in advance, and perform imaging at the reference position P0.
  • Sample image information such as RGB information is acquired from the captured image.
  • the reference position P0 in the present disclosure is not limited to the case of being determined by the position information transmitted by the beacon, and may be determined, for example, based on position information previously input by the user in the terminal device capable of communicating with the unmanned aerial vehicle 100.
  • images may be respectively captured at a plurality of locations within the work area A, and sample image information may be obtained from one of the plurality of images selected by the user.
  • The method of acquiring the sample image information is not limited to the above; for example, it may be acquired from a remote database via wireless communication or a network, or, when there is a history of previously specifying the work area A or performing work in it, past sample image information may be acquired from the memory of the unmanned aerial vehicle 100 or the like.
  • Acquiring the sample image information from near the center of the work area A reflects the work area A more accurately than acquiring it from near the boundary line. Therefore, as shown in FIG. 5, the reference position P0 is preferably an inner position at least a predetermined distance away from the boundary line of the work area.
  • The unmanned aerial vehicle 100 is initialized with user-preset flight parameters before flight.
  • the flight parameters include, for example, a safe flying height (for example, 20 m), a range of the target area (for example, a radius of 100 m from the reference position P0), a flying speed (for example, 5 m/s), and the like, but are not limited thereto.
  • The target area is the area within which the unmanned aerial vehicle 100 judges whether each part belongs to the work area, and is preferably set in advance as a parameter by the user as needed. For example, the user can set, as the target area, the range within a 100 m radius of the reference position indicated by the beacon.
  • The target area is probed, and a region within it whose similarity to the sample image information is at or above a certain degree (for example, 70%) is recognized as the work area; in this way, the work area can be accurately specified.
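The flight parameters and probe settings mentioned above might be grouped as below. This is only a sketch; the field names are hypothetical, and the defaults simply echo the examples given in the text (20 m safe height, 100 m target radius, 5 m/s, 70% similarity, 10% overlap).

```python
from dataclasses import dataclass

@dataclass
class ProbeSettings:
    """User-preset parameters used to initialize the probe flight (illustrative names)."""
    safe_altitude_m: float = 20.0       # safe flying height
    target_radius_m: float = 100.0      # radius of the target area around reference position P0
    flight_speed_mps: float = 5.0       # cruise speed while probing
    similarity_threshold: float = 0.70  # first threshold for sample-image similarity
    min_overlap: float = 0.10           # required overlap between adjacent imaging ranges

settings = ProbeSettings()  # defaults match the examples given in the text
```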
  • FIG. 6 is a flowchart showing one example of processing (step S120) of designating a work area in the present disclosure.
  • the unmanned aerial vehicle 100 determines a series of path points (i.e., imaging positions) P0, P1, P2, ..., Pn constituting a flight path of the unmanned aerial vehicle 100 in the target area (step S121).
  • The series of path points P0, P1, P2, ..., Pn is spaced so that the images captured at adjacent path points overlap at or above a predetermined overlap rate (e.g., 10%).
  • For example, the imaging range at P0 is E0, the imaging range at P1 is E1, and the imaging range at P2 is E2; E0 and E1 overlap by 10% or more, and E1 and E2 likewise overlap by 10% or more. By ensuring that adjacent imaging ranges overlap at least at a certain ratio in this way, the images captured at the series of path points P0, P1, P2, ..., Pn can be accurately synthesized (stitched).
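One way to check that adjacent imaging ranges such as E0 and E1 actually overlap by the required rate is sketched below, modelling each footprint as an axis-aligned rectangle in local ground coordinates (metres). The rectangle model and the choice of normalizing by the smaller footprint are assumptions for illustration.

```python
def overlap_ratio(a, b) -> float:
    """Overlap area between two axis-aligned footprints, as a fraction of the smaller one.
    Each footprint is (x_min, y_min, x_max, y_max) in local ground coordinates (metres)."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    smaller = min((a[2] - a[0]) * (a[3] - a[1]), (b[2] - b[0]) * (b[3] - b[1]))
    return inter / smaller if smaller > 0 else 0.0

# Do E0 and E1 overlap by at least 10%?
assert overlap_ratio((0, 0, 50, 50), (40, 0, 90, 50)) >= 0.10
```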
  • All of the path points P0, P1, P2, ..., Pn may be determined in advance based on the imaging range calculated from the flying height of the unmanned aerial vehicle 100 and the angle of view of the lens used in the imaging device 270; however, the present disclosure is not limited to this. Alternatively, the next path point Px+1 may be determined only after path point Px is reached, such that the imaging ranges overlap at the predetermined ratio.
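Below is a sketch of how the imaging range and the resulting path-point spacing might be derived from the flying height and the lens angle of view, as described above. It assumes a nadir-pointing camera over flat ground; the 70-degree field of view in the example is hypothetical.

```python
import math

def ground_footprint(altitude_m: float, fov_deg: float) -> float:
    """Width of the ground area covered by a nadir-pointing camera (along one axis),
    computed from the flying height and the lens angle of view along that axis."""
    return 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)

def waypoint_spacing(altitude_m: float, fov_deg: float, overlap: float = 0.10) -> float:
    """Distance between adjacent path points so neighbouring images overlap by `overlap`."""
    return ground_footprint(altitude_m, fov_deg) * (1.0 - overlap)

# Example: 20 m altitude, 70-degree field of view, 10% overlap -> about 25 m between path points
step = waypoint_spacing(20.0, 70.0, 0.10)
```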
  • The series of path points P0, P1, P2, ..., Pn starts from the reference position P0 in FIG. 5 that was used to acquire the sample image information.
  • In this way, the sample image information is acquired and imaging is started at the same first path point, which improves efficiency.
  • As shown in FIG. 7, the series of path points P0, P1, P2, ..., Pn forms a spirally expanding path. Since the reference position P0 is located near the center of the work area A, probing outward in a spiral allows the unmanned aerial vehicle 100 to specify the work area A quickly and accurately.
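A minimal sketch of generating such a spirally expanding series of path points around the reference position P0 follows. The text does not prescribe a particular spiral shape, so the square spiral below is an illustrative choice; the step length would come from the footprint/overlap calculation above.

```python
def spiral_waypoints(step_m: float, n_points: int):
    """Path points (x, y) in metres relative to P0, expanding in a square spiral:
    1 step east, 1 north, 2 west, 2 south, 3 east, 3 north, ..."""
    points = [(0.0, 0.0)]            # P0, the reference position
    x = y = 0.0
    leg, direction = 1, 0
    dirs = [(1, 0), (0, 1), (-1, 0), (0, -1)]   # east, north, west, south
    while len(points) < n_points:
        for _ in range(2):           # two legs share the same length
            dx, dy = dirs[direction % 4]
            for _ in range(leg):
                x += dx * step_m
                y += dy * step_m
                points.append((x, y))
                if len(points) >= n_points:
                    return points
            direction += 1
        leg += 1
    return points

waypoints = spiral_waypoints(step_m=25.0, n_points=30)   # P0, P1, ..., P29
```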
  • Next, the unmanned aerial vehicle 100 moves to the path points in sequence and performs imaging at each of them (step S122), and specifies the work area based on the similarity between the composite image of the captured images and the sample image information (step S123).
  • FIG. 9 is a flowchart showing the first embodiment S200 of the step (step S122) of designating the work area after the unmanned aerial vehicle 100 performs imaging at the route point P0 and moves to the route point P1. As shown in FIG. 9, the unmanned aerial vehicle 100 first performs imaging at the route point P1 (step S210).
  • the image captured at the route point P0 is combined with the image captured at the route point P1 (step S220).
  • The synthesis is preferably performed in the processing unit 110 of the unmanned aerial vehicle 100, but it may also be performed in a remote information processing apparatus.
  • In that case, the unmanned aerial vehicle 100 may transmit the captured images to the remote information processing apparatus via the communication interface 250 and then receive the synthesized image.
  • The unmanned aerial vehicle 100 estimates, as a similar region, the region of the synthesized image whose similarity to the sample image information is at or above a first threshold (for example, 70%) (step S230). For example, the unmanned aerial vehicle 100 estimates, as the similar region, the region of the synthesized image whose RGB information is 70% or more similar to that of the sample image information. For this estimation, a region growing method based on the image captured at the reference position P0, for example, can be used.
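A minimal region-growing sketch over a per-pixel similarity map (for example, one produced by the color_similarity sketch earlier), seeded at the pixel corresponding to the reference position P0, is given below. It illustrates the general technique named above, not the patent's exact algorithm.

```python
from collections import deque
import numpy as np

def grow_region(similarity: np.ndarray, seed: tuple, threshold: float = 0.70) -> np.ndarray:
    """Boolean mask of pixels connected to `seed` whose similarity is >= threshold
    (4-connected region growing on an H x W similarity map)."""
    h, w = similarity.shape
    mask = np.zeros((h, w), dtype=bool)
    if similarity[seed] < threshold:
        return mask
    queue = deque([seed])
    mask[seed] = True
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < h and 0 <= nc < w and not mask[nr, nc] and similarity[nr, nc] >= threshold:
                mask[nr, nc] = True
                queue.append((nr, nc))
    return mask
```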
  • Next, in step S240, it is judged whether or not the similar region estimated in step S230 is a closed region.
  • Since path point P0 is located near the center of the work area A and path point P1 is adjacent to P0, the entire range of the composite image is a similar region, as shown in FIG. 10. It is therefore judged that the similar region is not a closed region (NO in step S240), and the unmanned aerial vehicle 100 moves to the next path point P2 (step S260).
  • After the unmanned aerial vehicle 100 performs imaging at path point P2 (step S210), the images captured at path points P0, P1, and P2 are combined (stitched) (step S220). Then, the region of the synthesized image whose similarity to the sample image information is at or above the first threshold is estimated as the similar region (step S230). In this way, each time imaging is performed at a path point Px, the images captured at all of the path points P0, P1, P2, ..., Px-1, Px reached so far are synthesized again and the similar region is re-estimated. Each time an image captured at a new path point is added, the composite image grows, as shown in FIG. 11 and FIG. 12.
  • Eventually, the similar region becomes a closed region (YES in step S240).
  • In this case, the closed region, that is, the similar region, is designated as the work area A (step S250), and the processing ends.
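One possible way to test whether the estimated similar region is "closed", i.e. bounded on all sides by already-imaged, dissimilar area rather than running off the edge of the stitched coverage, is sketched below. Representing the stitched coverage as a boolean mask is an assumption of this sketch.

```python
import numpy as np

def is_closed_region(similar_mask: np.ndarray, covered_mask: np.ndarray) -> bool:
    """True if every pixel of the similar region lies strictly inside the imaged coverage,
    i.e. none of its 4-neighbours falls outside the stitched (covered) area.

    similar_mask: boolean H x W mask of pixels similar to the sample image information.
    covered_mask: boolean H x W mask of pixels actually covered by the stitched images.
    """
    h, w = similar_mask.shape
    # Pad coverage with False so that touching the image border also counts as "not closed".
    padded = np.zeros((h + 2, w + 2), dtype=bool)
    padded[1:-1, 1:-1] = covered_mask
    rs, cs = np.nonzero(similar_mask)
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        if not padded[rs + 1 + dr, cs + 1 + dc].all():
            return False          # some similar pixel borders territory not yet imaged
    return True                   # similar region is fully enclosed by imaged area
```

If the test returns False, the probe would continue to the next path point (step S260); if True, the region would be designated as the work area A (step S250).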
  • the geographical information of the designated work area A (for example, the range of longitude and latitude in the GPS information) can be obtained from the position information of each route point.
  • The position information of each path point may be calculated from the relative distance to the reference position transmitted by the beacon at the time the series of path points is determined (that is, when step S121 is performed); however, in order to obtain more accurate position information, it is preferable that the unmanned aerial vehicle 100 acquires the position information of each path point from the GPS receiver 140 each time it performs imaging there.
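A small sketch of recovering the longitude/latitude range of the designated work area from the GPS fixes recorded at the path points, as suggested above. Reducing the area to a bounding latitude/longitude range (rather than an exact polygon) is a simplification made for illustration.

```python
def latlon_range(gps_fixes, in_work_area):
    """Bounding latitude/longitude range of the designated work area.

    gps_fixes:     list of (lat, lon) recorded by the GPS receiver at each path point.
    in_work_area:  list of bool, True if that path point's footprint lies in the work area.
    """
    inside = [fix for fix, flag in zip(gps_fixes, in_work_area) if flag]
    if not inside:
        return None
    lats = [lat for lat, _ in inside]
    lons = [lon for _, lon in inside]
    return (min(lats), max(lats)), (min(lons), max(lons))
```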
  • the present disclosure provides a second embodiment that is capable of determining a plurality of closed work areas within a predetermined range at a time.
  • the unmanned aerial vehicle 100 acquires information indicating a predetermined range of the target area in advance (step S310).
  • the predetermined range of the object area B is specified by a parameter indicating the range.
  • For example, the predetermined range may be specified as the area within a 100-meter radius centered on the reference position P0 to which the beacon belongs.
  • Next, the unmanned aerial vehicle 100 performs imaging at path point P0 (step S320). It then judges whether imaging has been completed at all path points within 100 meters of the reference position (that is, path point) P0 (step S330). Since there are still path points within 100 meters of the reference position P0 that have not been imaged (NO in step S330), the unmanned aerial vehicle 100 moves to the next path point P1 (step S350) and performs imaging at path point P1 (step S320). The unmanned aerial vehicle 100 then judges again whether imaging has been completed at all path points within 100 meters of the reference position P0 (step S330). In this way, movement and imaging are repeated until imaging has been completed at all path points within 100 meters of the reference position P0.
  • When imaging at all path points is completed, the captured images are synthesized (step S340), and the synthesized image covers the entire predetermined range of the target area B shown in FIG. 15 (that is, the range within 100 m of the reference position P0). Then, the regions of the synthesized image whose similarity to the sample image information is at or above the first threshold (for example, 70%) are designated as work areas (step S360). In this way, a plurality of closed areas A1, A2, and A3 located within the target area can be specified at one time.
  • the composition can be performed every time the image is captured at each of the route points.
  • Further, a region of the synthesized image whose similarity to the sample image information is at or above the first threshold (e.g., 70%) and whose area is at or above a second threshold (for example, 10 square meters) may be designated as the work area.
  • The second threshold may be preset according to the actual work area and the size of regions that could cause interference, and recorded in the memory 120 or the memory 240. As shown in FIG. 16, the farmland A1 and the farmland A3 are each 10 square meters or larger and are therefore designated as work areas, while the grassland A2 is smaller than 10 square meters and is not designated as a work area; the work area can thus be specified accurately.
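A sketch of this final filtering step: threshold the similarity map, label the connected regions (A1, A2, A3, ...), convert pixel counts to square metres via the ground sampling distance, and keep only regions meeting the second threshold. The use of scipy.ndimage.label and the ground-sampling-distance conversion are assumptions of this sketch, not requirements of the method described above.

```python
import numpy as np
from scipy import ndimage

def designate_work_areas(similarity: np.ndarray,
                         gsd_m: float,
                         first_threshold: float = 0.70,
                         second_threshold_m2: float = 10.0) -> np.ndarray:
    """Label map of designated work areas (0 = background).

    similarity:           H x W per-pixel similarity to the sample image information.
    gsd_m:                ground sampling distance, metres per pixel.
    first_threshold:      minimum similarity (e.g. 70%).
    second_threshold_m2:  minimum area in square metres (e.g. 10 m^2).
    """
    mask = similarity >= first_threshold
    labels, n = ndimage.label(mask)                    # connected similar regions A1, A2, A3, ...
    pixel_area_m2 = gsd_m * gsd_m
    out = np.zeros_like(labels)
    for region_id in range(1, n + 1):
        area_m2 = np.count_nonzero(labels == region_id) * pixel_area_m2
        if area_m2 >= second_threshold_m2:             # keep farmland-sized regions, drop small patches
            out[labels == region_id] = region_id
    return out
```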
  • After the work area is designated, the flying body control method in the present disclosure may plan a work path within the work area by an existing method and perform the work. In this way, even if the user does not manually set the work area, the flying body can automatically grasp the correct work area, and efficient work that is neither excessive nor insufficient can be realized.
  • One or more of step S220 of synthesizing images and step S230 of estimating the similar region in the first embodiment, and step S340 of synthesizing images and step S360 of designating the work area in the second embodiment, may be executed in a remote information processing apparatus such as a cloud server.
  • the flying body control method in the present disclosure can also be realized by causing the unmanned aerial vehicle 100 to execute a program of each process (step).
  • the program can be stored in memory 120, memory 240, or other storage medium.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Image Analysis (AREA)

Abstract

The present invention enables an aerial vehicle to accurately specify an operation area, so as to operate more effectively. An aerial vehicle control method for specifying an operation area of an aerial vehicle comprises: the step of acquiring sample image information of the operation area; and the step of specifying the operation area in an object area according to the similarity with the sample image information.

Description

Flying body control method, flying body, program and recording medium
Technical Field
The present disclosure relates to a flying body control method, a flying body, a program, and a recording medium for specifying a work area in which a flying body performs work.
Background Art
Flying bodies flying in the air are used in various fields, such as aerial photography of objects from above, aerial surveying of objects such as terrain and buildings on the ground, and air transport of goods to destinations. In addition, flying bodies are also used for spraying applications, for example in agriculture, where they carry spray materials such as pesticides, fertilizers, and water and spray them onto targets such as crops within a predetermined work area on farmland.
As an example of a flying body used in the agricultural field, Patent Document 1 discloses a rotary-wing drone including four rotors, a bracket, a main control box, a windproof box, and a spray box; the windproof box is provided in the lower part of the X-shaped bracket, on whose outer ends the rotors are mounted, and the spray box is arranged inside the windproof box.
Prior Art Literature
Patent Document 1: Chinese Utility Model No. 204297097.
Summary of the Invention
When the unmanned aerial vehicle disclosed in Patent Document 1 is used to perform work such as spraying a spray material (for example, a pesticide, a fertilizer, or water), the work is generally required to be carried out accurately within a predetermined work area (for example, farmland). If the spray material drifts outside the work area, it may harm human health and the environment (for example, when pesticide lands on a farmhouse next to the farmland), and it also wastes the spray material. On the other hand, if part of the work area is not covered by the spray, the purpose of spraying is not achieved.
Conventionally, the user manually controls the flight of the drone using a transmitter or the like while visually observing the work area. However, in the case of a large work area, the user must concentrate on operating the drone for a long time, which is not only a heavy physical and mental burden but also makes it difficult to perform the work accurately within the work area. It is also known for a user to set the work area in advance while referring to an electronic map on a terminal device capable of communicating with the drone, so that the flying body performs the work while flying along a flight path within the set work area. However, because the electronic map is not updated continuously, it does not necessarily reflect the latest terrain, and even if the map's position information is up to date, errors of several meters are not unusual. In addition, for a user who is not accustomed to using a terminal device, learning how to set the work area is itself a burden.
In one aspect, a flying body control method for specifying a work area of a flying body includes: a step of acquiring sample image information of the work area; and a step of specifying the work area within a target area based on similarity with the sample image information.
The sample image information may be acquired from an image captured at a reference position located within the work area.
The sample image information may be acquired from one image selected from a plurality of images captured at a plurality of positions located within the work area.
The reference position may be located at an inner position at least a predetermined distance away from the boundary line of the work area.
The sample image information may include information related to color attributes.
The step of specifying the work area within the target area may include: a step of determining a series of imaging positions within the target area; a step of performing imaging at each imaging position; and a step of specifying the work area based on the similarity between a composite image of the captured images and the sample image information.
The step of specifying the work area based on the similarity between the composite image of the captured images and the sample image information may include: each time imaging is performed at an imaging position, synthesizing the images captured so far, and when a similar region whose similarity to the sample image information is at or above a first threshold forms a closed region, designating that closed region as the work area.
The step of designating, as the work area, a region of the composite image whose similarity to the sample image information is at or above the first threshold may include: designating, as the work area, a region of the composite image whose similarity to the sample image information is at or above the first threshold and whose area is at or above a second threshold.
The series of imaging positions may constitute a path that starts from the reference position and expands in a spiral.
In one aspect, a flying body that specifies a work area includes a processing unit, and the processing unit acquires sample image information of the work area and specifies the work area within a target area based on similarity with the sample image information.
The flying body may further include an imaging device, and the sample image information may be acquired from an image captured at a reference position located within the work area.
The sample image information may be acquired from one image selected from a plurality of images captured at a plurality of positions located within the work area.
The reference position may be located at an inner position at least a predetermined distance away from the boundary line of the work area.
The sample image information may include information related to color attributes.
The processing unit may determine a series of imaging positions within the target area, perform imaging at each imaging position, and specify the work area based on the similarity between a composite image of the captured images and the sample image information.
Each time imaging is performed at an imaging position, the processing unit may synthesize the images captured so far, and when a similar region whose similarity to the sample image information is at or above a first threshold forms a closed region, designate that closed region as the work area.
The processing unit may acquire information indicating a predetermined range of the target area, and designate, as the work area, a region whose similarity to the sample image information is at or above the first threshold in an image obtained by synthesizing the images captured at the imaging positions within the predetermined range.
The processing unit may designate, as the work area, a region of the composite image whose similarity to the sample image information is at or above the first threshold and whose area is at or above a second threshold.
The series of imaging positions may constitute a path that starts from the reference position and expands in a spiral.
In one aspect, a program causes a flying body that specifies a work area to execute: a step of acquiring sample image information of the work area; and a step of specifying the work area within a target area based on similarity with the sample image information.
In one aspect, a recording medium records a program that causes a flying body that specifies a work area to execute: a step of acquiring sample image information of the work area; and a step of specifying the work area within a target area based on similarity with the sample image information.
The above summary does not list all of the features of the present disclosure. Sub-combinations of these feature sets may also constitute an invention.
Brief Description of the Drawings
FIG. 1 is a diagram showing an example of the appearance of an unmanned aerial vehicle.
FIG. 2 is a block diagram showing an example of the hardware configuration of the unmanned aerial vehicle.
FIG. 3 is a flowchart showing an example of the processing procedure of the flying body control method in the present disclosure.
FIG. 4 is a schematic view showing an example of a work area in the present disclosure.
FIG. 5 is a schematic diagram showing an example of a reference position in the present disclosure.
FIG. 6 is a flowchart showing an example of the procedure for designating a work area in the present disclosure.
FIG. 7 is a schematic diagram showing a series of imaging positions in the work area that constitute a probe path.
FIG. 8 is a schematic diagram showing the image ranges captured at the series of imaging positions.
FIG. 9 is a flowchart showing one embodiment of the steps of designating a work area.
FIG. 10 is a schematic diagram showing a composite image of images captured at a series of imaging positions.
FIG. 11 is a schematic diagram showing a composite image of images captured at a series of imaging positions.
FIG. 12 is a schematic diagram showing a composite image of images captured at a series of imaging positions.
FIG. 13 is a schematic diagram showing an example of a work area in the present disclosure.
FIG. 14 is a flowchart showing one embodiment of the steps of designating a work area.
FIG. 15 is a schematic diagram showing a predetermined range set in the target area.
FIG. 16 is a schematic diagram showing the designated work area.
Explanation of symbols:
100 Unmanned aerial vehicle (UAV)
102 UAV main body
110 Processing unit
120 Memory
130 Rotor mechanism
140 GPS receiver
150 Inertial measurement device
160 Magnetic compass
170 Barometric altimeter
180 Millimeter wave radar
190 Wind speed and direction meter
200 Nozzle
210 Storage tank
220 Pressure sensor
230 Flow sensor
240 Memory
250 Communication interface
260 Battery
270 Imaging device
Detailed Description
Embodiments of the present disclosure will be further described below with reference to the accompanying drawings.
The present disclosure is described below through embodiments of the invention, but the following embodiments do not limit the invention according to the claims. Not all combinations of the features described in the embodiments are essential to the solution of the invention.
The claims, the description, the drawings, and the abstract contain matter subject to copyright protection. The copyright owner does not object to the reproduction of these documents by anyone as they appear in the files or records of the Patent Office; otherwise, all copyrights are reserved.
本公开涉及的飞行控制方法规定了用于对飞行体的飞行进行控制的信息处理装置中的各种处理(步骤)。飞行体包括在空中移动的飞行器(例如遥控无人驾驶飞机、直升机)。飞行体也可以是无人飞行体(UAV:Unmanned Aerial Vehicle)。飞行体能够指定进行预定的作业(例如喷洒农药、肥料、水等)的作业区域(例如农田)。The flight control method according to the present disclosure defines various processes (steps) in the information processing apparatus for controlling the flight of the flying body. The flight body includes an aircraft that moves in the air (eg, a remote drone, a helicopter). The flying body can also be an unmanned aerial vehicle (UAV). The flying body can specify a work area (for example, farmland) for performing predetermined operations (for example, spraying pesticides, fertilizers, water, etc.).
本公开涉及的程序为用于使信息处理装置执行各种处理(步骤)的程序。The program related to the present disclosure is a program for causing the information processing apparatus to execute various processes (steps).
本公开涉及的记录介质记录有程序(即用于使信息处理装置执行各种处理(步骤)的程序)。The recording medium to which the present disclosure relates is recorded with a program (i.e., a program for causing the information processing apparatus to execute various processes (steps)).
在以下所示的本公开涉及的各实施方式中,飞行体以无人飞行体(UAV)为例。本说明书的附图中,无人飞行体标记为“UAV”。在以下所示的各实施方式中,飞行体指定包括农田的作物等喷洒对象的作业区域。以下,以通过一个飞行体指定作业区域后,通过同一飞行体对该指定的作业区域设定用于将喷洒物大致均匀地完全喷洒的飞行路径,并沿着飞行路径进行作业的情况为例进行说明,但本公开不限于此。例如,可以在一个本公开涉及的探查用飞行体在指定作业区域之后,将所指定的作业区域的信息传送到其他作业用飞行体,以使该作业用飞行体在该指定的作业范围内进行作业。In each of the embodiments of the present disclosure shown below, the flying body is exemplified by an unmanned aerial vehicle (UAV). In the drawings of the present specification, the unmanned aerial vehicle is labeled "UAV". In each of the embodiments described below, the flying body specifies a work area of a spray object such as a crop of a farmland. Hereinafter, after the work area is designated by one flying body, a flight path for uniformly spraying the spray material to the designated work area is set by the same flying body, and the work is performed along the flight path as an example. Description, but the disclosure is not limited thereto. For example, after the designated flying object of the present disclosure is in the designated working area, the information of the designated working area may be transmitted to the other working flying body so that the working flying body is performed within the designated working range. operation.
In each of the embodiments of the present disclosure described below, the information processing apparatus is described as a processing unit provided inside the flying body; however, it may instead be an independent information processing apparatus, such as a remote server, that can communicate with the flying body. In addition, a processing unit inside the flying body and a remote information processing apparatus that can communicate with the flying body may each execute part of the various processes (steps) according to the present disclosure. The term "communication" used here is a broad concept covering all data communication; it includes not only wired connection via a cable or the like but also wireless connection, and not only direct communication between the information processing apparatus and the flying body but also indirect communication via a transmitter or a storage medium.
FIG. 1 shows an example of the appearance of the unmanned aerial vehicle 100. The unmanned aerial vehicle 100, for example, specifies the work area of a spraying operation and then sprays pesticide, fertilizer, water, or other spray material onto the spray targets in the specified work area. The unmanned aerial vehicle 100 includes a UAV main body 102, rotor mechanisms 130, a nozzle 200, a storage tank 210, and an imaging device 270. The unmanned aerial vehicle 100 can, for example, survey a predetermined object area with the imaging device 270 to specify a work area (for example, farmland), set a flight path within the work area, move along the set flight path, and spray the pesticide, fertilizer, water, or the like stored in the storage tank 210 through the nozzle 200. Movement of the unmanned aerial vehicle 100 means flight, including at least ascending, descending, left rotation, right rotation, horizontal movement to the left, and horizontal movement to the right. When the unmanned aerial vehicle 100 only specifies the work area and the subsequent operation is performed by another working flying body, the unmanned aerial vehicle 100 need not include the nozzle 200 and the storage tank 210.
The unmanned aerial vehicle 100 includes a plurality of rotor mechanisms (propellers) 130, for example eight. The unmanned aerial vehicle 100 moves by controlling the rotation of these rotor mechanisms 130. The number of rotors is not limited to eight, however, and the unmanned aerial vehicle 100 may also be a fixed-wing aircraft without rotors.
Next, a configuration example of the unmanned aerial vehicle 100 is described.
FIG. 2 is a block diagram showing an example of the hardware configuration of the unmanned aerial vehicle 100. The unmanned aerial vehicle 100 includes a processing unit 110, a memory 120, the rotor mechanisms 130, a GPS receiver 140, an inertial measurement unit 150, a magnetic compass 160, a barometric altimeter 170, a millimeter wave radar 180, a wind speed and direction meter 190, the nozzle 200, the storage tank 210, a pressure sensor 220, a flow sensor 230, a storage 240, a communication interface 250, a battery 260, and the imaging device 270.
The processing unit 110 is constituted by a processor, for example a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a DSP (Digital Signal Processor). The processing unit 110 performs signal processing for overall control of the operation of each part of the unmanned aerial vehicle 100, input and output of data to and from the other parts, arithmetic processing of data, and storage of data. The processing unit 110 has the function of executing the processing related to flight control in the unmanned aerial vehicle 100.
The processing unit 110 controls the flight of the unmanned aerial vehicle 100 in accordance with a program stored in the memory 120 or the storage 240 and with information related to the flight path. The processing unit 110 also controls the unmanned aerial vehicle 100 in accordance with data and instructions received from a remote server via the communication interface 250.
The processing unit 110 controls the flight of the unmanned aerial vehicle 100 by controlling the rotor mechanisms 130; that is, it controls the position of the unmanned aerial vehicle 100, including its latitude, longitude, and altitude, by controlling the rotor mechanisms 130. The processing unit 110 controls the rotor mechanisms 130 based on position information acquired by at least one of the GPS receiver 140, the inertial measurement unit 150, the magnetic compass 160, the barometric altimeter 170, and the millimeter wave radar 180.
The memory 120 is an example of a storage section. The memory 120 stores the programs that the processing unit 110 needs in order to control the rotor mechanisms 130, the GPS receiver 140, the inertial measurement unit 150, the magnetic compass 160, the barometric altimeter 170, the millimeter wave radar 180, the wind speed and direction meter 190, the nozzle 200, the storage tank 210, the pressure sensor 220, the flow sensor 230, the storage 240, the communication interface 250, and the imaging device 270. The memory 120 also holds the various information and data used by the processing unit 110 during processing. The memory 120 may be a computer-readable recording medium and may include at least one of an SRAM (Static Random Access Memory), a DRAM (Dynamic Random Access Memory), an EPROM (Erasable Programmable Read-Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), and a flash memory such as a USB (Universal Serial Bus) memory. The memory 120 may be provided inside the unmanned aerial vehicle 100 or may be detachable from it.
The rotor mechanisms 130 have a plurality of rotors and a plurality of drive motors that rotate the rotors. By rotating the rotors to generate airflow in specific directions, the rotor mechanisms 130 control the flight of the unmanned aerial vehicle 100 (ascent, descent, horizontal movement, rotation, tilting, and so on).
The GPS receiver 140 receives a plurality of signals transmitted from a plurality of navigation satellites (GPS satellites), each indicating the time and the position (coordinates) of the corresponding satellite. Based on the received signals, the GPS receiver 140 calculates its own position, that is, the position of the unmanned aerial vehicle 100, and outputs this position information to the processing unit 110. The calculation of the position may instead be performed by the processing unit 110; in that case, the information on the times and satellite positions contained in the signals received by the GPS receiver 140 is input to the processing unit 110.
The inertial measurement unit (IMU) 150 detects the attitude of the unmanned aerial vehicle 100 and outputs the detection result to the processing unit 110. As the attitude, the inertial measurement unit 150 detects the accelerations of the unmanned aerial vehicle 100 in the three axial directions (front-rear, left-right, and up-down) and the angular velocities about the three axes (pitch, roll, and yaw).
The magnetic compass 160 detects the heading of the nose of the unmanned aerial vehicle 100 and outputs the detection result to the processing unit 110. The barometric altimeter 170 detects the altitude at which the unmanned aerial vehicle 100 flies and outputs the detection result to the processing unit 110.
The millimeter wave radar 180 transmits high-frequency radio waves in the millimeter band, measures the waves reflected by the ground and by objects to detect their positions, and outputs the detection results to the processing unit 110. A detection result may represent, for example, the distance from the unmanned aerial vehicle 100 to the ground (that is, its height), the distance from the unmanned aerial vehicle 100 to an object, or the topography of the work area in which the unmanned aerial vehicle 100 performs the spraying operation.
The wind speed and direction meter 190 detects the wind speed and wind direction around the unmanned aerial vehicle 100 and outputs the detection results to the processing unit 110. The detection results may represent the wind speed and wind direction in the work area in which the unmanned aerial vehicle 100 flies.
The nozzle 200 is provided at the end of a pipe through which the spray material (pesticide, fertilizer, water, or the like) is delivered, and ejects the spray material, for example downward (in the vertical direction). The nozzle 200 may have a plurality of outlets (for example, four). Under the control of the processing unit 110, the nozzle 200 switches ejection on and off and adjusts the ejection amount and ejection speed, so that the spray material is applied toward the spray targets at a predetermined amount and speed. The storage tank 210 holds the spray material, such as pesticide, fertilizer, or water, and delivers it to the nozzle 200 through the pipe under the control of the processing unit 110. The nozzle 200 and the storage tank 210 are an example of the components of a spraying mechanism.
The pressure sensor 220 detects the pressure of the spray material ejected from the nozzle 200 and outputs the detection result to the processing unit 110. The flow sensor 230 detects the flow rate of the spray material ejected from the nozzle 200 and outputs the detection result to the processing unit 110. Either detection result may represent, for example, the ejection amount or the ejection speed of the nozzle 200.
The storage 240 is another example of a storage section. The storage 240 stores and retains various data and information. The storage 240 may be an HDD (Hard Disk Drive), an SSD (Solid State Drive), a memory card, a USB memory, or the like. The storage 240 may be provided inside the unmanned aerial vehicle 100 or may be detachable from it.
The communication interface 250 communicates with a terminal 50. The communication interface 250 receives from the terminal 50 various information related to the flight path, the spray material, and so on, as well as various instructions directed to the processing unit 110.
The battery 260 functions as the drive source of each part of the unmanned aerial vehicle 100 and supplies the required power to each part.
The imaging device 270 is a camera that images subjects (for example, the above-mentioned farmland) within a desired imaging range. In the present disclosure, the imaging device is shown fixed to the lower part of the unmanned aerial vehicle 100, but it may instead be mounted on a gimbal that rotatably supports the imaging device 270 about the yaw, roll, and pitch axes.
An example of the flying body control method of the present disclosure is described below. The flying body control method of the present disclosure specifies, within a predetermined survey object area, the work area in which the operation is to be performed. By specifying the work area immediately before the operation, the unmanned aerial vehicle 100 can grasp the exact work area and perform the operation efficiently.
In the following, the case where the processing unit 110 provided in the unmanned aerial vehicle 100 executes each step of the flying body control method of the present disclosure is described as an example; however, the present invention is not limited to this, and some of the steps may be executed by a server remote from the unmanned aerial vehicle 100.
FIG. 3 is a flowchart showing an example of the processing procedure (steps) of the flying body control method of the present disclosure, and FIG. 4 is a schematic view showing an example of a work area. For convenience of explanation, the work area A shown in FIG. 4 is used as an example throughout the present disclosure, but the area, shape, and other properties of an actual work area are of course not limited to this.
As shown in FIG. 3, the flying body control method S100 of the present disclosure first acquires sample image information of the work area A of the unmanned aerial vehicle 100 (step S110). The sample image information may be any information representing features of the work area A that can be used to distinguish the work-area portions from the non-work-area portions of the object area; for example, it may contain at least one of shape-related information and color-attribute information. The color-attribute information may be information expressing lightness, saturation, and hue, such as RGB information. The sample image information may also be a regular color pattern within the work area A. For example, when the work area A is a vegetable field, it can be identified by a regular green pattern (color pattern).
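As an illustration of how the color-attribute part of the sample image information might be represented, the following Python sketch summarizes a captured reference image by the mean and spread of its RGB channels. The function name, the central-patch cropping, and the choice of mean and standard deviation are assumptions made here for clarity; the disclosure only requires that the information characterize the color attributes of the work area.

import numpy as np

def sample_color_signature(image_rgb: np.ndarray, patch_fraction: float = 0.5) -> dict:
    """Summarize the color attributes of a reference image of the work area.

    image_rgb: H x W x 3 uint8 array captured above the reference position P0.
    patch_fraction: fraction of the image (centered) used as the sample patch,
                    so that field edges do not contaminate the signature.
    """
    h, w, _ = image_rgb.shape
    ph, pw = int(h * patch_fraction), int(w * patch_fraction)
    top, left = (h - ph) // 2, (w - pw) // 2
    patch = image_rgb[top:top + ph, left:left + pw].astype(np.float32)

    # Mean and spread of each RGB channel act as a simple color signature.
    return {
        "mean_rgb": patch.reshape(-1, 3).mean(axis=0),
        "std_rgb": patch.reshape(-1, 3).std(axis=0),
    }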
The sample image information is preferably acquired from an image obtained at a predetermined position within the work area A. As a specific example, as shown in FIG. 5, the unmanned aerial vehicle 100 may move above a reference position P0 according to a position signal transmitted by a beacon placed in advance at the reference position P0 within the work area A, capture an image at the reference position P0, and acquire sample image information such as RGB information from the captured image. The reference position P0 in the present disclosure is not limited to being determined from position information transmitted by a beacon; it may, for example, be determined from position information entered in advance by the user on a terminal device that can communicate with the unmanned aerial vehicle 100. Furthermore, to acquire more accurate sample image information, images may be captured at a plurality of positions within the work area A, and the sample image information may be obtained from one image that the user selects from among them.
The method of acquiring the sample image information is not limited to the above, however. For example, it may be acquired from a remote database via wireless communication or a network, or, when there is a history of the work area A having been specified or worked in the past, past sample image information may be acquired from the memory of the unmanned aerial vehicle 100 or the like.
In addition, sample image information acquired near the center of the work area A reflects the work area A more accurately than information acquired near its boundary. Therefore, as shown in FIG. 5, the reference position P0 is preferably an interior position at least a predetermined distance from the boundary line of the work area.
Preferably, in the present disclosure, the unmanned aerial vehicle 100 is initialized before flight with flight parameters set in advance by the user. The flight parameters include, for example, a safe flying altitude (for example, 20 m), the range of the object area (for example, a radius of 100 m from the reference position P0), and a flight speed (for example, 5 m/s), but are not limited to these.
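A minimal sketch of how such user-set flight parameters could be held before initialization is given below; the field names and default values simply mirror the examples above and are not a prescribed configuration format.

from dataclasses import dataclass

@dataclass
class FlightParameters:
    """User-set parameters applied to the UAV before the survey flight (illustrative)."""
    safe_altitude_m: float = 20.0    # safe flying height above ground
    survey_radius_m: float = 100.0   # object-area radius around reference position P0
    flight_speed_mps: float = 5.0    # cruise speed during the survey

params = FlightParameters()  # defaults can be overridden from the terminal device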
Next, the work area within the object area is specified based on the similarity to the acquired sample image information (step S120). The object area is the area that is examined to determine which parts belong to the work area of the unmanned aerial vehicle 100, and it is preferably set in advance by the user as a parameter as needed. For example, the user may set the area within a 100 m radius of the reference position where the beacon is located as the object area.
As described above, after the unmanned aerial vehicle 100 acquires the sample image information representing the features of the work area, it surveys the object area and recognizes as the work area those portions whose similarity to the sample image information is above a certain level (for example, 70%), so that the work area can be specified accurately.
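One possible way to express the "at least 70% similar" test on a per-pixel basis is sketched below, using a normalized RGB distance to the sample color as the similarity measure. The metric itself is an assumption; the disclosure does not fix how similarity is computed.

import numpy as np

def similarity_mask(image_rgb: np.ndarray, mean_rgb: np.ndarray,
                    threshold: float = 0.70) -> np.ndarray:
    """Mark pixels whose color is at least `threshold` similar to the sample.

    Similarity is defined here as 1 - (Euclidean RGB distance / max distance),
    which is only one plausible choice.
    """
    diff = image_rgb.astype(np.float32) - mean_rgb.astype(np.float32)
    dist = np.linalg.norm(diff, axis=-1)              # per-pixel RGB distance
    max_dist = np.linalg.norm([255.0, 255.0, 255.0])  # worst-case distance
    similarity = 1.0 - dist / max_dist
    return similarity >= threshold                    # boolean mask of work-area-like pixels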
FIG. 6 is a flowchart showing an example of the process of specifying the work area (step S120). First, as shown in FIG. 7, the unmanned aerial vehicle 100 determines, within the object area, a series of waypoints (that is, imaging positions) P0, P1, P2, ..., Pn that make up its flight path (step S121). These waypoints are spaced so that images captured at adjacent waypoints overlap at or above a predetermined overlap rate (for example, 10%). For example, in FIG. 8, when the imaging range at P0 is E0, the imaging range at P1 is E1, and the imaging range at P2 is E2, E0 and E1 overlap by at least 10%, and E1 and E2 likewise overlap by at least 10%. By making at least adjacent imaging ranges overlap by a certain proportion in this way, the images captured at the series of waypoints P0, P1, P2, ..., Pn can be composited (stitched) accurately.
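The spacing implied by the overlap requirement reduces to a one-line calculation, sketched here under the assumption that the ground footprint of a single image is already known (one way to obtain it follows the next paragraph).

def waypoint_spacing(footprint_along_track_m: float, overlap: float = 0.10) -> float:
    """Maximum spacing between adjacent waypoints so consecutive images overlap.

    With an along-track ground footprint of F metres and a required overlap
    ratio r, the camera centres may be at most F * (1 - r) apart.
    """
    if not 0.0 <= overlap < 1.0:
        raise ValueError("overlap must be in [0, 1)")
    return footprint_along_track_m * (1.0 - overlap)

# Example: a 60 m footprint with 10% overlap allows at most 54 m between waypoints.
spacing = waypoint_spacing(60.0, 0.10)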
All of the waypoints P0, P1, P2, ..., Pn may be determined in advance from the imaging range calculated from the flight altitude of the unmanned aerial vehicle 100 and the angle of view of the lens used in the imaging device, but the present disclosure is not limited to this. For example, the next waypoint Px+1 may be determined only after waypoint Px has been reached, so that the imaging ranges overlap at the predetermined rate.
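Under the simplifying assumptions of a nadir-pointing camera and flat terrain, the imaging range mentioned above can be estimated from the flight altitude and the lens angle of view as follows; a real system would also account for lens distortion and terrain relief.

import math

def ground_footprint(altitude_m: float, fov_deg: float) -> float:
    """Width of the ground area covered by the camera along one axis.

    Assumes a nadir-pointing camera and flat ground: footprint = 2 * h * tan(FOV / 2).
    """
    return 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)

# Example: at 20 m altitude with an 84-degree field of view,
# the footprint is roughly 36 m.
width_m = ground_footprint(20.0, 84.0)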
Preferably, the series of waypoints P0, P1, P2, ..., Pn starts from the reference position P0 in FIG. 5 used for acquiring the sample image information. In that case, after the unmanned aerial vehicle 100 moves to the reference position P0, it can start imaging at the first waypoint at the same time as acquiring the sample image information, which improves efficiency.
Also preferably, as shown in FIG. 7, the series of waypoints P0, P1, P2, ..., Pn forms a path that expands outward in a spiral. Since the reference position P0 lies near the center of the work area A, surveying so that the unmanned aerial vehicle 100 expands outward in a spiral allows the work area A to be specified quickly and accurately.
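A rectangular outward spiral is one simple way to generate such a waypoint sequence; the sketch below assumes a local metric coordinate frame centered on P0 and stops once a candidate waypoint would leave the object area. The rectangular shape is an illustrative choice, not a requirement of the disclosure.

def spiral_waypoints(origin_xy, step_m, max_radius_m):
    """Generate waypoints on an outward rectangular spiral starting at P0.

    origin_xy:     (x, y) of the reference position P0 in a local metric frame.
    step_m:        spacing between adjacent waypoints (from the overlap rule).
    max_radius_m:  stop once a waypoint would leave the object area.
    """
    x, y = origin_xy
    points = [(x, y)]
    directions = [(1, 0), (0, 1), (-1, 0), (0, -1)]  # east, north, west, south
    leg, d = 1, 0
    while True:
        for _ in range(2):                     # each leg length is used twice
            dx, dy = directions[d % 4]
            for _ in range(leg):
                x, y = x + dx * step_m, y + dy * step_m
                if (x - origin_xy[0]) ** 2 + (y - origin_xy[1]) ** 2 > max_radius_m ** 2:
                    return points
                points.append((x, y))
            d += 1
        leg += 1

waypoints = spiral_waypoints((0.0, 0.0), step_m=54.0, max_radius_m=100.0)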
The unmanned aerial vehicle 100 then moves to the waypoints in turn and captures images (step S122), and determines the work area based on the similarity between the composite of the captured images and the sample image information (step S123). Specific embodiments are described below.
FIG. 9 is a flowchart of a first specific embodiment S200 of the step of specifying the work area (step S122) after the unmanned aerial vehicle 100 has captured an image at waypoint P0 and moved to waypoint P1. As shown in FIG. 9, the unmanned aerial vehicle 100 first captures an image at waypoint P1 (step S210).
Next, the image captured at waypoint P0 and the image captured at waypoint P1 are composited (step S220). The compositing is preferably performed in the processing unit 110 of the unmanned aerial vehicle 100, but it may also be performed in a remote information processing apparatus. Specifically, the unmanned aerial vehicle 100 may transmit the captured images to the remote information processing apparatus via the communication interface 250 and then receive the composited image.
Next, the unmanned aerial vehicle 100 estimates, as a similar region, the region of the composited image whose similarity to the sample image information is at or above a first threshold (for example, 70%) (step S230). For example, the unmanned aerial vehicle 100 estimates as the similar region those parts of the composited image whose RGB information is at least 70% similar to that of the sample image information. For this estimation, region growing based on the image captured at the reference position P0 or a similar method can be used.
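Region growing seeded at the reference position can be implemented as an ordinary flood fill over the per-pixel similarity mask, as sketched below. The 4-connectivity and the breadth-first traversal are implementation choices; the disclosure names region growing without fixing these details.

from collections import deque
import numpy as np

def grow_similar_region(similar_mask: np.ndarray, seed_rc: tuple) -> np.ndarray:
    """Region growing on the stitched mosaic, seeded at the reference position P0.

    similar_mask: boolean H x W array, True where the pixel is at least 70%
                  similar to the sample image information.
    seed_rc:      (row, col) of the pixel corresponding to P0 in the mosaic.
    Returns the connected set of similar pixels reachable from the seed.
    """
    region = np.zeros_like(similar_mask, dtype=bool)
    if not similar_mask[seed_rc]:
        return region
    h, w = similar_mask.shape
    queue = deque([seed_rc])
    region[seed_rc] = True
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < h and 0 <= nc < w and similar_mask[nr, nc] and not region[nr, nc]:
                region[nr, nc] = True
                queue.append((nr, nc))
    return region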
It is then judged whether the similar region estimated in step S230 is a closed region (step S240). At this point, because waypoint P0 lies near the center of the work area A and waypoint P1 is adjacent to P0, the entire composite image is the similar region, as shown in FIG. 10. It is therefore judged that the similar region is not a closed region (No in step S240), and the unmanned aerial vehicle 100 moves to the next waypoint P2 (step S260).
Next, after capturing an image at waypoint P2 (step S210), the unmanned aerial vehicle 100 composites (stitches) the images captured at waypoints P0, P1, and P2 (step S220), and again estimates as the similar region the region of the composited image whose similarity to the sample image information is at or above the first threshold (step S230). In this way, each time an image is captured at a waypoint Px, the steps of compositing all the images captured at the waypoints P0, P1, P2, ..., Px-1, Px reached so far and estimating the similar region are repeated. Each time an image captured at a new waypoint is added, the composite image grows, and, as shown in FIG. 11, parts that do not belong to the similar region appear in it. Eventually, as shown in FIG. 12, the similar region becomes a closed region (Yes in step S240). At that point, the closed region, that is, the similar region, is specified as the work area A (step S250), and the processing ends.
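One plausible reading of the closed-region test in step S240 is that the similar region no longer touches the rim of the area covered by the stitched images, so it is completely surrounded by imaged, non-similar ground. The sketch below implements that reading; it is an interpretation, not the definitive check.

import numpy as np

def region_is_closed(region: np.ndarray, covered: np.ndarray) -> bool:
    """Decide whether the grown similar region is a closed area.

    region:  boolean mask of the similar region inside the current mosaic.
    covered: boolean mask of pixels actually covered by stitched images
             (the mosaic is irregular while the spiral is still short).
    The region is treated as closed when none of its pixels lie on the rim
    of the covered area.
    """
    # A covered pixel is on the rim if any 4-neighbour lies outside the coverage.
    padded = np.pad(covered, 1, constant_values=False)
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    rim = covered & ~interior
    return not np.any(region & rim)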
In this way, even without necessarily moving to and imaging at all of the series of waypoints P0, P1, P2, ..., Pn determined within the object area, the processing can be ended by specifying the work area A as soon as the similar region becomes a closed region, which saves power and time and allows the work area to be specified quickly and accurately.
The geographic information of the specified work area A (for example, the latitude and longitude ranges in GPS information) can be obtained from the position information of the waypoints. The position information of each waypoint may be calculated from its distance relative to the reference position transmitted by the beacon when the series of waypoints is determined (that is, when step S121 is executed); however, to obtain more accurate position information, the unmanned aerial vehicle 100 preferably acquires the position information of each waypoint from the GPS receiver 140 each time it captures an image there.
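As a rough illustration, the latitude and longitude extent of the designated area could be summarized from the recorded waypoint fixes as below; a production system would instead georeference the mosaic itself, so this bounding box is only an assumption made for exposition.

def work_area_extent(waypoint_fixes):
    """Bounding latitude/longitude range of the designated work area.

    waypoint_fixes: list of (latitude, longitude) tuples recorded from the
    GPS receiver at each imaging waypoint that contributed to the area.
    """
    lats = [lat for lat, _ in waypoint_fixes]
    lons = [lon for _, lon in waypoint_fixes]
    return {"lat_range": (min(lats), max(lats)),
            "lon_range": (min(lons), max(lons))}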
Because the first specific embodiment of the step of specifying the work area ends the processing once the closed work area containing the reference position P0 has been determined, only one closed region can be determined per execution. However, as shown in FIG. 13, when there are multiple closed regions A1, A2, A3, the above method would have to be executed for each closed work area separately, which is cumbersome. The present disclosure therefore provides a second specific embodiment that can determine multiple closed work areas within a predetermined range at once.
The second specific embodiment S300 of the step of specifying the work area in the flying body control method of the present disclosure is described below. First, as shown in FIG. 14, the unmanned aerial vehicle 100 acquires in advance information representing the predetermined range of the object area (step S310). The predetermined range of the object area B is specified by a parameter representing the range; for example, it may be specified as the area within a 100 m radius of the reference position P0 where the beacon is located.
The unmanned aerial vehicle 100 then captures an image at waypoint P0 (step S320) and judges whether imaging has been completed at all waypoints within 100 m of the reference position (that is, waypoint) P0 (step S330). Since there are still waypoints within 100 m of the reference position P0 that have not been imaged (No in step S330), the unmanned aerial vehicle 100 moves on to the next waypoint P1 (step S350) and captures an image there (step S320). The unmanned aerial vehicle 100 then judges again whether imaging has been completed at all waypoints within 100 m of the reference position P0 (step S330). Movement and imaging are repeated in this way until imaging has been completed at all waypoints within a 100 m radius of the reference position P0.
After imaging has been completed at all waypoints within a 100 m radius of the reference position P0 (Yes in step S330), the captured images are composited. As a result, the composited image covers the entire predetermined range of the object area B shown in FIG. 15 (that is, the range within 100 m of the reference position P0). The regions of the composited image whose similarity to the sample image information is at or above the first threshold (for example, 70%) are then specified as work areas (step S360). In this way, the multiple closed regions A1, A2, A3 located within the object area can be specified at once.
In this embodiment, compositing could also be performed each time an image is captured at a waypoint, but it is preferable to composite all the images together after imaging at all the waypoints has been completed. Reducing the number of compositing operations saves the resources of the processing unit 110 and saves power.
If every region whose similarity to the sample image information within the predetermined range of the object area B is at or above the first threshold were specified as a work area, many spurious regions could be included. For example, when a small patch of grass A2 (5 square meters) lies near the farmland A1 (100 square meters) and the farmland A3 (20 square meters) shown in FIG. 15, and the sample image information of the farmland is RGB information corresponding to a greenish color, the grass A2 might also be specified as a work area. To avoid such interference, it is further preferable that, in step S360, the regions of the composited image whose similarity to the sample image information is at or above the first threshold (for example, 70%) and whose area is at or above a second threshold (for example, 10 square meters) are specified as the work areas. The second threshold can be set in advance according to the actual work areas and the size of regions likely to cause interference, and recorded in the memory 120 or the storage 240. As a result, as shown in FIG. 16, the farmland A1 and the farmland A3, being 10 square meters or larger, are specified as work areas, while the grass A2, being smaller than 10 square meters, is not, so the work areas can be specified accurately.
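The combination of the first (similarity) threshold and the second (area) threshold can be sketched as a connected-component filter over the similarity mask of the whole object area B. The use of scipy's labelling and of a uniform ground sampling distance are assumptions made for this illustration.

import numpy as np
from scipy import ndimage

def designate_work_areas(similar_mask: np.ndarray, gsd_m: float,
                         min_area_m2: float = 10.0) -> np.ndarray:
    """Keep only similar regions whose ground area exceeds the second threshold.

    similar_mask: boolean mask of pixels at least 70% similar to the sample
                  image information over the whole object area B.
    gsd_m:        ground sampling distance of the mosaic in metres per pixel.
    min_area_m2:  second threshold; smaller patches (e.g. a 5 m^2 strip of
                  grass) are rejected as interference.
    """
    labels, num = ndimage.label(similar_mask)   # 4-connected components
    pixel_area_m2 = gsd_m * gsd_m
    keep = np.zeros_like(similar_mask, dtype=bool)
    for region_id in range(1, num + 1):
        region = labels == region_id
        if region.sum() * pixel_area_m2 >= min_area_m2:
            keep |= region                      # A1 and A3 survive, A2 is dropped
    return keep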
After the work area has been specified, the flying body control method of the present disclosure may plan the work path within the work area by an existing method and perform the operation. As a result, even if the user does not set the work area manually, the flying body can automatically and accurately grasp the work area and carry out efficient work that is neither excessive nor insufficient.
All of the processes (steps) of the flying body control method of the present disclosure can be executed by the processing unit 110 of the unmanned aerial vehicle 100, but this is not a limitation. For example, one or more of the step of compositing images (step S220) and the step of estimating the similar region (step S230) in the first specific embodiment, and the step of compositing images (step S340) and the step of specifying the work areas (step S360) in the second specific embodiment, may be executed in a remote information processing apparatus (for example, a cloud server).
The flying body control method of the present disclosure can also be realized as a program that causes the unmanned aerial vehicle 100 to execute the processes (steps). The program can be stored in the memory 120, the storage 240, or another storage medium.
The present disclosure has been described above using embodiments, but the technical scope of the invention according to the present disclosure is not limited to the scope described in the above embodiments. It will be apparent to those skilled in the art that various changes and improvements can be made to the above embodiments, and it is clear from the claims that embodiments incorporating such changes or improvements are also included within the technical scope of the present disclosure.
The order of execution of the operations, procedures, steps, stages, and the like in the devices, systems, programs, and methods shown in the claims, the description, and the drawings may be realized in any order, unless "before", "prior to", or the like is explicitly indicated and unless the output of an earlier process is used in a later process. Even where "first", "next", and the like are used for convenience in describing the operational flow in the claims, the description, and the drawings, this does not mean that the flow must be carried out in that order.

Claims (22)

  1. A flying body control method for specifying a work area of a flying body, the method comprising:
    a step of acquiring sample image information of the work area; and
    a step of specifying the work area within an object area based on similarity to the sample image information.
  2. The flying body control method according to claim 1, wherein
    the sample image information is acquired from an image captured at a reference position located within the work area.
  3. The flying body control method according to claim 2, wherein
    the sample image information is acquired from one image selected from a plurality of images respectively captured at a plurality of positions located within the work area.
  4. The flying body control method according to claim 2, wherein
    the reference position is an interior position at least a predetermined distance from a boundary line of the work area.
  5. The flying body control method according to any one of claims 1 to 4, wherein
    the sample image information contains information related to color attributes.
  6. The flying body control method according to claim 2, wherein
    the step of specifying the work area within the object area comprises:
    a step of determining a series of imaging positions within the object area;
    a step of capturing an image at each of the imaging positions; and
    a step of specifying the work area based on similarity between a composite image of the captured images and the sample image information.
  7. The flying body control method according to claim 6, wherein
    the step of specifying the work area based on the similarity between the composite image of the captured images and the sample image information comprises:
    compositing the images already captured each time an image is captured at one of the imaging positions, and,
    when a similar region of the composited image whose similarity to the sample image information is at or above a first threshold is a closed region,
    a step of designating the closed region as the work area.
  8. The flying body control method according to claim 6, wherein
    the step of specifying the work area based on the similarity between the composite image of the captured images and the sample image information comprises:
    a step of acquiring information representing a predetermined range of the object area; and
    a step of designating, as the work area, a region of an image obtained by compositing the images captured at the imaging positions located within the predetermined range, the region having a similarity to the sample image information at or above a first threshold.
  9. The flying body control method according to claim 8, wherein
    the step of designating, as the work area, the region of the composited image whose similarity to the sample image information is at or above the first threshold comprises:
    a step of designating, as the work area, a region of the composited image whose similarity to the sample image information is at or above the first threshold and whose area is at or above a second threshold.
  10. The flying body control method according to any one of claims 6 to 9, wherein
    the series of imaging positions forms a path that starts from the reference position and expands in a spiral.
  11. A flying body that specifies a work area, the flying body comprising
    a processing unit, wherein
    the processing unit acquires sample image information of the work area
    and specifies the work area within an object area based on similarity to the sample image information.
  12. The flying body according to claim 11, further comprising
    an imaging device, wherein
    the sample image information is acquired from an image captured at a reference position located within the work area.
  13. The flying body according to claim 12, wherein
    the sample image information is acquired from one image selected from a plurality of images respectively captured at a plurality of positions located within the work area.
  14. The flying body according to claim 12, wherein
    the reference position is an interior position at least a predetermined distance from a boundary line of the work area.
  15. The flying body according to any one of claims 11 to 14, wherein
    the sample image information contains information related to color attributes.
  16. The flying body according to claim 12, wherein
    the processing unit determines a series of imaging positions within the object area, captures an image at each of the imaging positions, and specifies the work area based on similarity between a composite image of the captured images and the sample image information.
  17. The flying body according to claim 16, wherein
    the processing unit composites the images already captured each time an image is captured at one of the imaging positions, and,
    when a similar region of the composited image whose similarity to the sample image information is at or above a first threshold is a closed region, designates the closed region as the work area.
  18. The flying body according to claim 16, wherein
    the processing unit acquires information representing a predetermined range of the object area and designates, as the work area, a region of an image obtained by compositing the images captured at the imaging positions located within the predetermined range, the region having a similarity to the sample image information at or above a first threshold.
  19. The flying body according to claim 18, wherein
    the processing unit designates, as the work area, a region of the composited image whose similarity to the sample image information is at or above the first threshold and whose area is at or above a second threshold.
  20. The flying body according to any one of claims 16 to 19, wherein
    the series of imaging positions forms a path that starts from the reference position and expands in a spiral.
  21. A program for causing a flying body that specifies a work area to execute:
    a step of acquiring sample image information of the work area; and
    a step of specifying the work area within an object area based on similarity to the sample image information.
  22. A computer-readable recording medium
    recording a program for causing a flying body that specifies a work area to execute:
    a step of acquiring sample image information of the work area; and
    a step of specifying the work area within an object area based on similarity to the sample image information.
PCT/CN2018/098109 2017-08-29 2018-08-01 Aerial vehicle control method, aerial vehicle, program and recording medium WO2019042067A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201880015439.8A CN110383333A (en) 2017-08-29 2018-08-01 Flight body controlling means, flying body, program and recording medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-164640 2017-08-29
JP2017164640A JP2019045898A (en) 2017-08-29 2017-08-29 Flying body control method, flying body, program and recording medium

Publications (1)

Publication Number Publication Date
WO2019042067A1 true WO2019042067A1 (en) 2019-03-07

Family

ID=65524863

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/098109 WO2019042067A1 (en) 2017-08-29 2018-08-01 Aerial vehicle control method, aerial vehicle, program and recording medium

Country Status (3)

Country Link
JP (1) JP2019045898A (en)
CN (1) CN110383333A (en)
WO (1) WO2019042067A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6651682B1 (en) * 2019-04-02 2020-02-19 株式会社ネクスドローン Material distribution system and material distribution device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5123812B2 (en) * 2008-10-10 2013-01-23 日立オートモティブシステムズ株式会社 Road marking recognition system
JP6014795B2 (en) * 2012-08-24 2016-10-26 ヤンマー株式会社 Travel path identification device
JP6387782B2 (en) * 2014-10-17 2018-09-12 ソニー株式会社 Control device, control method, and computer program
US9898932B2 (en) * 2015-05-04 2018-02-20 International Business Machines Corporation Unmanned vehicle movement path assignment and management
CN105354841B (en) * 2015-10-21 2019-02-01 武汉工程大学 A kind of rapid remote sensing image matching method and system
JP6621140B2 (en) * 2016-02-16 2019-12-18 株式会社ナイルワークス Method and program for spraying medicine by unmanned air vehicle
CN106020233B (en) * 2016-07-08 2023-11-28 聂浩然 Unmanned aerial vehicle plant protection operation system, unmanned aerial vehicle for plant protection operation and control method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103235830A (en) * 2013-05-13 2013-08-07 北京臻迪科技有限公司 Unmanned aerial vehicle (UAV)-based electric power line patrol method and device and UAV
CN103985117A (en) * 2014-04-28 2014-08-13 上海融军科技有限公司 Method for capturing and determining object based on remote sensing image
US20170193830A1 (en) * 2016-01-05 2017-07-06 California Institute Of Technology Controlling unmanned aerial vehicles to avoid obstacle collision
CN106956778A (en) * 2017-05-23 2017-07-18 广东容祺智能科技有限公司 A kind of unmanned plane pesticide spraying method and system

Also Published As

Publication number Publication date
CN110383333A (en) 2019-10-25
JP2019045898A (en) 2019-03-22

Similar Documents

Publication Publication Date Title
JP6962720B2 (en) Flight control methods, information processing equipment, programs and recording media
Rieke et al. High-precision positioning and real-time data processing of UAV-systems
WO2022094854A1 (en) Growth monitoring method for crops, and devices and storage medium
Küng et al. The accuracy of automatic photogrammetric techniques on ultra-light UAV imagery
CN110968110A (en) Method and device for determining operation area, unmanned aerial vehicle and storage medium
CN110254722B (en) Aircraft system, aircraft system method and computer-readable storage medium
JP6836385B2 (en) Positioning device, location method and program
WO2018209898A1 (en) Information processing device, aerial photographing path generation method, aerial photographing path generation system, program and recording medium
Madawalagama et al. Low cost aerial mapping with consumer-grade drones
JP6138326B1 (en) MOBILE BODY, MOBILE BODY CONTROL METHOD, PROGRAM FOR CONTROLLING MOBILE BODY, CONTROL SYSTEM, AND INFORMATION PROCESSING DEVICE
CN110570463B (en) Target state estimation method and device and unmanned aerial vehicle
Liu et al. Development of a positioning system using UAV-based computer vision for an airboat navigation in paddy field
JP6289750B1 (en) Mobile object, mobile object control method, mobile object control system, and mobile object control program
WO2021159249A1 (en) Route planning method and device, and storage medium
JP2023041675A (en) Drone-work support system and drone-work support method
CN108007437B (en) Method for measuring farmland boundary and internal obstacles based on multi-rotor aircraft
CN110785355A (en) Unmanned aerial vehicle testing method, device and storage medium
Jensen et al. Low-cost multispectral aerial imaging using autonomous runway-free small flying wing vehicles
WO2019042067A1 (en) Aerial vehicle control method, aerial vehicle, program and recording medium
CN109163718A (en) A kind of unmanned plane autonomous navigation method towards groups of building
WO2020088399A1 (en) Information processing device, flight control method, and flight control system
Conte et al. Evaluation of a light-weight LiDAR and a photogrammetric system for unmanned airborne mapping applications
WO2021207977A1 (en) Movable platform operation method, movable platform and electronic device
WO2021081922A1 (en) Control method and apparatus, and storage medium
WO2022095061A1 (en) Spraying assessment method and device based on radar, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18851749

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18851749

Country of ref document: EP

Kind code of ref document: A1