CN110383333A - Flight body controlling means, flying body, program and recording medium - Google Patents

Flight body controlling means, flying body, program and recording medium Download PDF

Info

Publication number
CN110383333A
CN110383333A (application CN201880015439.8A)
Authority
CN
China
Prior art keywords
operating area
image information
sample image
area
similarity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201880015439.8A
Other languages
Chinese (zh)
Inventor
顾磊
王向伟
Current Assignee
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd
Publication of CN110383333A
Legal status: Pending

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/0202 Control of position or course in two dimensions specially adapted to aircraft
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C13/00 Control systems or transmitting systems for actuating flying-control surfaces, lift-increasing flaps, air brakes, or spoilers
    • B64C13/02 Initiating means
    • B64C13/16 Initiating means actuated automatically, e.g. responsive to gust detectors
    • B64C13/18 Initiating means actuated automatically, e.g. responsive to gust detectors using automatic pilot
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/90 Determination of colour characteristics
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00 Type of UAV
    • B64U10/10 Rotorcrafts
    • B64U10/13 Flying platforms
    • B64U10/16 Flying platforms with five or more distinct rotor axes, e.g. octocopters
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00 UAVs specially adapted for particular uses or applications
    • B64U2101/30 UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00 UAVs specially adapted for particular uses or applications
    • B64U2101/45 UAVs specially adapted for particular uses or applications for releasing liquids or powders in-flight, e.g. crop-dusting
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00 UAVs characterised by their flight controls
    • B64U2201/10 UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
    • B64U2201/104 UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS] using satellite radio beacon positioning systems, e.g. GPS


Abstract

An object is to enable a flying body to accurately specify an operating area and thereby carry out operations more efficiently. A flight body control method for specifying an operating area of a flying body includes a step of obtaining sample image information of the operating area, and a step of specifying the operating area within a subject area according to similarity to the sample image information.

Description

Flight body controlling means, flying body, program and recording medium

Technical field

The present disclosure relates to a flight body control method, a flying body, a program, and a recording medium for specifying an operating area in which the flying body performs an operation.
Background technique
Flying bodies that move through the air are used in many fields, for example aerial surveying, in which terrain and objects such as buildings on the ground are imaged and measured from above; aerial photography of objects from overhead; and air transport, in which cargo is carried to a destination. In addition, flying bodies are used for spraying: in agriculture, for example, a flying body loaded with a spray material such as pesticide, fertilizer, or water sprays it onto crops and other targets within a predetermined operating area of a farm field.
As an example of a flying body for agricultural use, Patent Document 1 discloses a rotary-wing unmanned aerial vehicle that includes four rotors, an X-shaped bracket whose outer ends carry the rotors, a main control box, an anti-vibration box arranged at the lower part of the bracket, and a spray tank arranged inside the anti-vibration box.
Prior art literature
Patent Document 1: Chinese Utility Model No. 204297097, specification.
Summary of the invention
When spraying a spray material (such as pesticide, fertilizer, or water) with an unmanned aerial vehicle like the one shown in Patent Document 1, the operation generally must be carried out accurately within the range of a predetermined operating area (such as a farm field). If the spray material drifts outside the operating area, it may harm health or the environment (for example, when pesticide lands on residences beside the field), and it also wastes the spray material. Conversely, if part of the operating area is not covered by the spray, the purpose of the spraying is not achieved either.
Conventionally, a user visually observed the operating area while manually controlling the flight of the UAV with a transmitter or the like. For a large operating area, however, the user must concentrate on piloting for a long time, which is both physically and mentally taxing, and it is still difficult to operate accurately within the operating area. It is also known that a user can preset the operating area on a terminal device that communicates with the UAV while referring to an electronic map, and the flying body then performs the operation while flying along a flight path within the set operating area. However, because electronic maps are not updated continuously, they do not necessarily reflect the latest terrain, and even when the map's positional information is current, errors of several meters are not uncommon. Moreover, for users unaccustomed to terminal devices, simply learning how to set the operating area is itself a burden.
In one aspect, a flight body control method for specifying an operating area of a flying body includes a step of obtaining sample image information of the operating area, and a step of specifying the operating area within a subject area according to similarity to the sample image information.
The sample image information may be obtained from an image captured at a reference position located within the operating area.
The sample image information may be obtained from one image selected from multiple images captured at multiple positions within the operating area.
The reference position may be located inside the operating area, at least a preset distance from its boundary line.
The sample image information may include information related to attributes of color.
The step of specifying the operating area within the subject area may include: a step of determining a series of camera positions within the subject area; a step of capturing an image at each camera position; and a step of specifying the operating area according to the similarity between a composite of the captured images and the sample image information.
The step of specifying the operating area according to the similarity between the composite image and the sample image information may include: synthesizing the captured images each time an image is captured at a camera position, and, when a region of the synthesized image whose similarity to the sample image information is at or above a first threshold forms an enclosed region, specifying that enclosed region as the operating area.
The step of specifying, as the operating area, a region of the synthesized image whose similarity to the sample image information is at or above the first threshold may include specifying, as the operating area, a region whose similarity to the sample image information is at or above the first threshold and whose area is at or above a second threshold.
The series of camera positions may form a path extending in a spiral starting from the reference position.
In one aspect, a flying body that specifies an operating area includes a processing unit, and the processing unit obtains sample image information of the operating area and specifies the operating area within a subject area according to similarity to the sample image information.
The flying body may further include an imaging device, and the sample image information may be obtained from an image captured at a reference position located within the operating area.
The sample image information may be obtained from one image selected from multiple images captured at multiple positions within the operating area.
The reference position may be located inside the operating area, at least a preset distance from its boundary line.
The sample image information may include information related to attributes of color.
The processing unit may determine a series of camera positions within the subject area, capture an image at each camera position, and specify the operating area according to the similarity between a composite of the captured images and the sample image information.
The processing unit may synthesize the captured images each time an image is captured at a camera position, and, when a region of the synthesized image whose similarity to the sample image information is at or above a first threshold forms an enclosed region, specify that enclosed region as the operating area.
The processing unit may obtain information indicating a preset range of the subject area, and specify, as the operating area, a region within the preset range of the image synthesized from the images captured at the camera positions whose similarity to the sample image information is at or above the first threshold.
The processing unit may specify, as the operating area, a region of the synthesized image whose similarity to the sample image information is at or above the first threshold and whose area is at or above a second threshold.
The series of camera positions may form a path extending in a spiral starting from the reference position.
In one aspect, a program causes a flying body that specifies an operating area to execute: a step of obtaining sample image information of the operating area; and a step of specifying the operating area within a subject area according to similarity to the sample image information.
In one aspect, a recording medium records a program that causes a flying body that specifies an operating area to execute: a step of obtaining sample image information of the operating area; and a step of specifying the operating area within a subject area according to similarity to the sample image information.
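As a rough illustration only (not the patent's actual implementation), the two claimed steps might be sketched in Python as follows, assuming the sample image information is reduced to a mean RGB color and using a 70% first threshold:

```python
import math

def sample_color_features(pixels):
    """Mean (R, G, B) of the reference image, given as (r, g, b) tuples in
    0-255. A simplified stand-in for the claimed 'sample image information'."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def color_similarity(a, b):
    """Similarity in [0, 1]: 1 minus the Euclidean RGB distance, normalized
    by the largest possible distance (black to white)."""
    return 1.0 - math.dist(a, b) / math.dist((0, 0, 0), (255, 255, 255))

def specify_operating_area(tiles, sample_feat, threshold=0.7):
    """Flag each tile of the subject area whose similarity to the sample
    features is at or above the threshold (the 'first threshold')."""
    return [color_similarity(sample_color_features(t), sample_feat) >= threshold
            for t in tiles]
```

For example, a crop-colored tile matches the crop-colored sample while a reddish-soil tile does not; the real method would operate on stitched aerial imagery rather than pre-cut tiles.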
The above summary does not enumerate all features of the present disclosure. Sub-combinations of these feature groups may also constitute inventions.
Detailed description of the invention
Fig. 1 is a diagram showing an example of the appearance of an unmanned flying body.
Fig. 2 is a block diagram showing an example of the hardware configuration of the unmanned flying body.
Fig. 3 is a flowchart showing an example of the processing procedure of the flight body control method of the present disclosure.
Fig. 4 is a schematic diagram showing an example of an operating area in the present disclosure.
Fig. 5 is a schematic diagram showing an example of the reference position in the present disclosure.
Fig. 6 is a flowchart showing an example of the step of specifying the operating area in the present disclosure.
Fig. 7 is a schematic diagram showing a series of camera positions forming a detection path within the operating area.
Fig. 8 is a schematic diagram showing the image ranges captured at the series of camera positions.
Fig. 9 is a flowchart showing one embodiment of the step of specifying the operating area.
Fig. 10 is a schematic diagram showing a composite of the images captured at the series of camera positions.
Fig. 11 is a schematic diagram showing a composite of the images captured at the series of camera positions.
Fig. 12 is a schematic diagram showing a composite of the images captured at the series of camera positions.
Fig. 13 is a schematic diagram showing an example of an operating area in the present disclosure.
Fig. 14 is a flowchart showing one embodiment of the step of specifying the operating area.
Fig. 15 is a schematic diagram showing a preset range set within the subject area.
Fig. 16 is a schematic diagram showing the specified operating area.
Symbol description:
100 unmanned flying body (UAV)
102 UAV main body
110 processing unit
120 memory
130 rotor mechanism
140 GPS receiver
150 inertial measurement unit
160 magnetic compass
170 barometric altimeter
180 millimeter-wave radar
190 wind speed and direction meter
200 nozzle
210 storage tank
220 pressure sensor
230 flow sensor
240 memory
250 communication interface
260 battery
270 imaging device
Specific embodiment
Embodiments of the present disclosure are further described below with reference to the drawings.
The disclosure is explained below through embodiments of the invention, but the following embodiments do not limit the invention according to the claims. Not all combinations of the features described in the embodiments are essential to the solution of the invention.
The claims, specification, drawings, and abstract contain matter subject to copyright protection. The copyright owner does not object to reproduction of these documents by any person as they appear in the files or records of the Patent Office; in all other cases, all copyrights are reserved.
The flight control method according to the present disclosure defines various processes (steps) in an information processing device for controlling the flight of a flying body. A flying body includes an aircraft that moves through the air (e.g., a drone or helicopter). The flying body may be an unmanned aerial vehicle (UAV). The flying body can specify an operating area (e.g., a farm field) in which a predetermined operation (e.g., spraying pesticide, fertilizer, water, etc.) is performed.
The program according to the present disclosure is a program for causing an information processing device to execute various processes (steps).
The recording medium according to the present disclosure records a program (i.e., a program for causing an information processing device to execute various processes (steps)).
In each embodiment of the present disclosure described below, the flying body is exemplified by an unmanned aerial vehicle (UAV), labeled "UAV" in the drawings of this specification. In each embodiment below, the flying body specifies an operating area containing spray targets such as crops in a farm field. The description assumes that after one flying body specifies the operating area, the same flying body sets a flight path over the specified operating area along which the spray material is distributed substantially evenly and performs the operation along that flight path; however, the present disclosure is not limited to this. For example, after a detection flying body according to the present disclosure specifies the operating area, information on the specified operating area may be transmitted to a separate operation flying body, which then performs the operation within the specified operating area.
In each embodiment described below, the information processing device is described as a processing unit provided inside the flying body, but it may instead be an information processing device capable of communicating with the flying body, such as an independent remote server. A part of the various processes (steps) according to the present disclosure may also be executed separately by the processing unit inside the flying body and by a remote information processing device communicating with the flying body. "Communication" here is a broad concept covering all data communication: it includes not only wired connection by cable or the like but also connection by wireless communication, and not only direct communication between the information processing device and the flying body but also indirect communication via a transmitter or a storage medium.
Fig. 1 is a diagram showing an example of the appearance of the unmanned flying body 100. After specifying the operating area of a spraying operation, for example, the unmanned flying body 100 sprays a spray material such as pesticide, fertilizer, or water onto the spray targets within the specified operating area. The unmanned flying body 100 comprises a UAV main body 102, a rotor mechanism 130, a nozzle 200, a storage tank 210, and an imaging device 270. The unmanned flying body 100 can, for example, specify the operating area (e.g., a farm field) by surveying a predetermined subject area with the imaging device 270, set a flight path within the operating area, move along the set flight path, and spray the pesticide, fertilizer, water, etc. stored in the storage tank 210 through the nozzle 200. Movement of the unmanned flying body 100 means flight, including at least ascent, descent, left rotation, right rotation, and horizontal movement to the left and right. When the unmanned flying body 100 only specifies the operating area and another operation flying body later performs the operation, the unmanned flying body 100 need not have the nozzle 200 and the storage tank 210.
The unmanned flying body 100 has multiple rotor mechanisms (propellers) 130, for example eight. The unmanned flying body 100 moves by controlling the rotation of these rotor mechanisms 130. The number of rotors is not limited to eight, however, and the unmanned flying body 100 may also be a fixed-wing aircraft without rotors.
Next, a configuration example of the unmanned flying body 100 is described.
Fig. 2 is a block diagram showing an example of the hardware configuration of the unmanned flying body 100. The unmanned flying body 100 is configured to include: a processing unit 110, memory 120, rotor mechanism 130, GPS receiver 140, inertial measurement unit 150, magnetic compass 160, barometric altimeter 170, millimeter-wave radar 180, wind speed and direction meter 190, nozzle 200, storage tank 210, pressure sensor 220, flow sensor 230, memory 240, communication interface 250, battery 260, and imaging device 270.
The processing unit 110 consists of a processor, such as a CPU (Central Processing Unit), MPU (Micro Processing Unit), or DSP (Digital Signal Processor). The processing unit 110 performs signal processing for overall control of the operations of each part of the unmanned flying body 100, input/output processing of data with the other parts, data computation, and data storage. The processing unit 110 has the function of executing processing related to flight control in the unmanned flying body 100.
The processing unit 110 controls the flight of the unmanned flying body 100 according to programs and flight-path information stored in the memory 120 or the memory 240. The processing unit 110 also controls the unmanned flying body 100 according to data and instructions received from a remote server through the communication interface 250.
The processing unit 110 controls the flight of the unmanned flying body 100 by controlling the rotor mechanism 130; that is, by controlling the rotor mechanism 130 it controls the position of the unmanned flying body 100, including latitude, longitude, and altitude. The processing unit 110 controls the rotor mechanism 130 based on positional information obtained from at least one of the GPS receiver 140, the inertial measurement unit 150, the magnetic compass 160, the barometric altimeter 170, and the millimeter-wave radar 180.
The memory 120 is an example of a storage unit. The memory 120 stores the programs the processing unit 110 needs to control the rotor mechanism 130, GPS receiver 140, inertial measurement unit 150, magnetic compass 160, barometric altimeter 170, millimeter-wave radar 180, wind speed and direction meter 190, nozzle 200, storage tank 210, pressure sensor 220, flow sensor 230, memory 240, communication interface 250, and imaging device 270. The memory 120 saves the various information and data used in the processing of the processing unit 110. The memory 120 may be a computer-readable recording medium and may include at least one of SRAM (Static Random Access Memory), DRAM (Dynamic Random Access Memory), EPROM (Erasable Programmable Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), and flash memory such as USB (Universal Serial Bus) memory. The memory 120 may be provided inside the unmanned flying body 100 or may be detachable from it.
The rotor mechanism 130 has multiple rotors and multiple drive motors that rotate them. By rotating the rotors, the rotor mechanism 130 generates airflow in a specific direction and thereby controls the flight of the unmanned flying body 100 (ascent, descent, horizontal movement, rotation, tilting).
The GPS receiver 140 receives multiple signals transmitted from multiple navigation satellites (GPS satellites) indicating the time and the position (coordinates) of each GPS satellite. Based on the received signals, the GPS receiver 140 calculates its own position (i.e., the position of the unmanned flying body 100) and outputs the positional information of the unmanned flying body 100 to the processing unit 110. The calculation of the positional information may instead be performed by the processing unit 110 rather than by the GPS receiver 140; in that case, the time and satellite-position information contained in the signals received by the GPS receiver 140 are input to the processing unit 110.
The inertial measurement unit (IMU) 150 detects the attitude of the unmanned flying body 100 and outputs the detection result to the processing unit 110. As the attitude of the unmanned flying body 100, it detects the accelerations along the three axes (front/rear, left/right, up/down) and the angular velocities about the pitch, roll, and yaw axes.
The magnetic compass 160 detects the heading of the nose of the unmanned flying body 100 and outputs the detection result to the processing unit 110. The barometric altimeter 170 detects the flight altitude of the unmanned flying body 100 and outputs the detection result to the processing unit 110.
The millimeter-wave radar 180 transmits high-frequency radio waves in the millimeter band and measures the waves reflected from the ground and objects, thereby detecting the positions of the ground and objects, and outputs the detection result to the processing unit 110. The detection result may indicate, for example, the distance from the unmanned flying body 100 to the ground (i.e., altitude), the distance from the unmanned flying body 100 to an object, or the terrain of the operating area where the unmanned flying body 100 performs the spraying operation.
The wind speed and direction meter 190 detects the wind speed and direction around the unmanned flying body 100 and outputs the detection result to the processing unit 110. The detection result may indicate the wind speed and direction in the operating area where the unmanned flying body 100 flies.
The nozzle 200 is provided at the end of a pipeline that delivers the spray material (pesticide, fertilizer, water, etc.) and sprays it, for example, downward (in the vertical direction). The nozzle 200 may have multiple openings (for example, four). Under the control of the processing unit 110, the nozzle 200 adjusts the on/off state, amount, and speed of the spraying, so that the spray material is sprayed from the nozzle 200 toward the spray target at a predetermined amount and speed. The storage tank 210 holds the spray material such as pesticide, fertilizer, or water and, under the control of the processing unit 110, sends it to the nozzle 200 via the pipeline. The nozzle 200 and the storage tank 210 are one example of the configuration of a spraying mechanism.
The pressure sensor 220 detects the pressure of the spray material sprayed from the nozzle 200 and outputs the detection result to the processing unit 110. The flow sensor 230 detects the flow rate of the spray material sprayed from the nozzle 200 and outputs the detection result to the processing unit 110. Either detection result may indicate, for example, the spray amount or spray speed from the nozzle 200.
The memory 240 is an example of a storage unit; it stores and saves various data and information. The memory 240 may be an HDD (Hard Disk Drive), SSD (Solid State Drive), RAM card, USB memory, etc. The memory 240 may be provided inside the unmanned flying body 100 or may be detachable from it.
The communication interface 250 communicates with a terminal 50. The communication interface 250 receives from the terminal 50 various information related to the flight path, the spray material, etc., as well as various instructions for the processing unit 110.
The battery 260 functions as the drive source for each part of the unmanned flying body 100 and supplies the necessary power to each part of the unmanned flying body 100.
The imaging device 270 is a camera that images subjects within a desired imaging range (for example, the above-mentioned farm field). In the present disclosure, an example is shown in which the imaging device is fixed to the lower part of the unmanned flying body 100, but it may also be mounted on a gimbal that rotatably supports the imaging device 270 about the yaw, roll, and pitch axes.
An example of the flight body control method in the present disclosure is described below. The flight body control method in the present disclosure detects and specifies, within a predetermined subject area, the operating area where the operation is to be performed. By specifying the operating area before the operation, the unmanned flying body 100 knows the exact operating area and can operate efficiently.
In the following, each step of the flight body control method in the present disclosure is described as being executed by the processing unit 110 provided in the unmanned flying body 100; however, the present invention is not limited to this, and some of the steps may be executed by a server remote from the unmanned flying body 100.
Fig. 3 is a flowchart showing an example of the processing procedure (steps) of the flight body control method in the present disclosure. Fig. 4 is a schematic diagram showing an example of an operating area. For ease of description, the operating area A shown in Fig. 4 is used as an example in the present disclosure, but the area, shape, etc. of an actual operating area are not limited to it.
As shown in Fig. 3, the flight body control method S100 in the present disclosure first obtains sample image information of the operating area A of the unmanned flying body 100 (step S110). The sample image information here can be any information representing features of the operating area A that can be used to distinguish, within the subject area, the parts that belong to the operating area A from the parts that do not; for example, it may include at least one of shape-related information and information related to attributes of color. The color-attribute information may be information expressing lightness, chroma, and hue, such as RGB information. The sample image information may also be a regular color pattern within the operating area A; for example, when the operating area A is a vegetable field, it can be identified by its regular pattern of green.
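As a minimal sketch of what "information related to attributes of color" could look like in practice (the representation below is an assumption for illustration, not the patent's specification), the lightness, chroma, and hue of an image can be summarized by converting RGB pixels to HSV:

```python
import colorsys

def color_attribute_info(pixels):
    """Average hue, saturation (chroma), and value (lightness) of an image
    given as (r, g, b) tuples in 0-255 -- one possible form of the
    color-attribute sample image information described above."""
    n = len(pixels)
    h = s = v = 0.0
    for r, g, b in pixels:
        hp, sp, vp = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        h += hp
        s += sp
        v += vp
    return {"hue": h / n, "saturation": s / n, "value": v / n}
```

A pure-green reference patch, for instance, yields a hue of 1/3 with full saturation and lightness, which could then serve as the feature compared against the subject area.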
Sample image information is preferably obtained from the image that the predetermined position being located in the A of operating area obtains.As a specific example, as shown in Figure 5, unmanned flight's body 100 can be moved to the top of reference position P0 according to the position signal that the beacon for the reference position P0 being set in advance in the A of operating area is sent, in reference position, P0 is imaged, and obtains RGB information equal samples image information from the photographed images.In addition, the situation that the reference position P0 in the disclosure is not limited to the location information sent by beacon to determine, such as can also be determined according to the location information that user pre-enters in the terminal installation that can be communicated with unmanned flight's body 100.In addition, in order to obtain more accurate sample image information photographed images can be distinguished in multiple positions in the A of operating area, and obtain sample image information from the image selected in these multiple images from by user.
However, the method of acquiring the sample image information is not limited to the above. For example, it may be obtained from a remote database via wireless communication or a network, or, when the operating area A has previously been specified or worked, past sample image information may be read from the memory of the unmanned aerial vehicle 100 or the like.
In addition, sample image information obtained near the center of the operating area A reflects the operating area A more accurately than information obtained near its boundary. Therefore, as shown in FIG. 5, the reference position P0 is preferably located inside the boundary of the operating area by at least a predetermined distance.
Preferably, in the present disclosure, the unmanned aerial vehicle 100 is initialized before flight with flight parameters preset by the user. The flight parameters include, but are not limited to, a cruising altitude (e.g., 20 m), the extent of the subject area (e.g., a 100 m radius from the reference position P0), and a flight speed (e.g., 5 m/s).
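The pre-flight initialization described above might be held in a small parameter structure; a sketch using the example values from the text (the field names are illustrative, not from the patent):

```python
from dataclasses import dataclass

@dataclass
class FlightParameters:
    """User-preset flight parameters loaded before takeoff."""
    altitude_m: float = 20.0         # cruising altitude
    subject_radius_m: float = 100.0  # subject-area radius around reference position P0
    speed_mps: float = 5.0           # flight speed

params = FlightParameters()
```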
Next, the operating area within the subject area is specified according to its similarity to the acquired sample image information (step S120). The subject area is the region within which it is judged whether each part belongs to the operating area of the unmanned aerial vehicle 100, and it is preferably preset by the user as a parameter as needed. For example, the user may set the subject area to a 100 m radius around the reference position where the beacon is located.
As described above, after acquiring the sample image information representing the features of the operating area, the unmanned aerial vehicle 100 surveys the subject area and identifies as the operating area those parts within the subject area whose similarity to the sample image information is at or above a certain degree (e.g., 70%). The operating area can thereby be specified accurately.
FIG. 6 is a flowchart showing an example of the process of specifying the operating area (step S120) in the present disclosure. First, as shown in FIG. 7, the unmanned aerial vehicle 100 determines, within the subject area, a series of path points (i.e., imaging positions) P0, P1, P2, …, Pn that constitute its flight path (step S121). The spacing of these path points P0, P1, P2, …, Pn is set so that images captured at adjacent path points overlap by at least a predetermined rate (e.g., 10%). For example, in FIG. 8, if the imaging range at P0 is E0, the imaging range at P1 is E1, and the imaging range at P2 is E2, then E0 and E1 overlap by 10% or more, and E1 and E2 also overlap by 10% or more. By ensuring that at least adjacent imaging ranges overlap by a certain proportion, the images captured at the series of path points P0, P1, P2, …, Pn can be stitched together accurately.
Here, all path points P0, P1, P2, …, Pn may be determined in advance from the imaging range computed from the flying height of the unmanned aerial vehicle 100 and the field of view of the lens used in the imaging device, but the present disclosure is not limited to this. For example, the next path point Px+1 may be determined after reaching a path point Px so that the imaging ranges overlap at the predetermined rate.
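The imaging range and path-point spacing can be derived from the flying height and the lens field of view; a sketch assuming a nadir-pointing camera over flat terrain (the formula and names are an illustrative geometric model, not the patent's exact computation):

```python
import math

def footprint_width(altitude_m, fov_deg):
    """Ground width covered by a nadir-pointing camera over flat terrain."""
    return 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)

def max_spacing(altitude_m, fov_deg, overlap=0.10):
    """Largest path-point spacing that still gives the required overlap."""
    return footprint_width(altitude_m, fov_deg) * (1.0 - overlap)

w = footprint_width(20.0, 90.0)    # 2 * 20 * tan(45 deg) = 40 m
d = max_spacing(20.0, 90.0, 0.10)  # 40 m * 0.9 = 36 m
```

With the example parameters (20 m altitude, a hypothetical 90-degree field of view, 10% overlap), adjacent path points would be at most 36 m apart.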
Preferably, the series of path points P0, P1, P2, …, Pn starts from the reference position P0 in FIG. 5 used for obtaining the sample image information. Then, once the unmanned aerial vehicle 100 has moved to the reference position P0, it can obtain the sample image information and at the same time capture the image at the first path point, which improves efficiency.
More preferably, as shown in FIG. 7, the series of path points P0, P1, P2, …, Pn forms a spirally extending path. Because the reference position P0 lies near the center of the operating area A, having the unmanned aerial vehicle 100 survey along an outward spiral enables the operating area A to be specified quickly and accurately.
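The spirally extending sequence of path points might be generated as an Archimedean spiral whose turn gap and waypoint step both follow the overlap-derived spacing; an illustrative sketch, not the patent's exact construction:

```python
import math

def spiral_path(spacing, radius, x0=0.0, y0=0.0):
    """Waypoints along an Archimedean spiral r = b * theta from (x0, y0).

    `spacing` sets both the gap between successive turns and the
    approximate arc-length step between waypoints; generation stops
    once `radius` is exceeded.
    """
    b = spacing / (2.0 * math.pi)  # radial growth per radian
    points = []
    theta = 0.0
    while True:
        r = b * theta
        if r > radius:
            break
        points.append((x0 + r * math.cos(theta), y0 + r * math.sin(theta)))
        # advance theta so the arc length between waypoints is ~spacing
        theta += spacing / max(r, b)
    return points

wps = spiral_path(spacing=36.0, radius=100.0)
```

The first waypoint coincides with the reference position, matching the preference that imaging starts at P0.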
Next, the unmanned aerial vehicle 100 moves to each path point in turn and captures images (step S122), and the operating area is determined according to the similarity between the composite of the captured images and the above sample image information (step S123). Specific embodiments of this are described below.
FIG. 9 is a flowchart of a first specific embodiment S200 of the step of specifying the operating area, after the unmanned aerial vehicle 100 has captured an image at path point P0 and moved to path point P1 (step S122). As shown in FIG. 9, the unmanned aerial vehicle 100 first captures an image at path point P1 (step S210).
Next, the image captured at path point P0 and the image captured at path point P1 are stitched together (step S220). The stitching is preferably performed by the processing unit 110 of the unmanned aerial vehicle 100, but it may also be performed by a remote information processing device. Specifically, the unmanned aerial vehicle 100 may transmit the captured images to the remote information processing device via the communication interface 250 and receive the stitched image in return.
Next, the unmanned aerial vehicle 100 estimates, as a similar region, the region of the stitched image whose similarity to the sample image information is at or above a first threshold (e.g., 70%) (step S230). For example, the unmanned aerial vehicle 100 estimates as the similar region the region where the RGB values of the stitched image are 70% or more similar to those of the sample image information. For this estimation, region growing seeded from the image captured at the reference position P0 may be used, among other methods.
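Region growing from the reference-position seed could be sketched as a flood fill that keeps pixels whose RGB distance to the sample signature stays within a tolerance; the 4-connectivity, Euclidean distance metric, and tolerance value below are illustrative assumptions:

```python
from collections import deque
import numpy as np

def grow_similar_region(image, seed, signature, tol=30.0):
    """4-connected region growing from `seed`, keeping pixels whose
    RGB distance to `signature` is within `tol`.

    Returns a boolean mask over the H x W image.
    """
    h, w, _ = image.shape
    mask = np.zeros((h, w), dtype=bool)
    sig = np.asarray(signature, dtype=float)
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        if not (0 <= y < h and 0 <= x < w) or mask[y, x]:
            continue
        if np.linalg.norm(image[y, x].astype(float) - sig) > tol:
            continue  # pixel too dissimilar to the sample signature
        mask[y, x] = True
        queue.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
    return mask

# Toy mosaic: left 6 columns "field green", right 4 columns "road gray".
img = np.zeros((10, 10, 3), dtype=float)
img[:, :6] = (60.0, 180.0, 60.0)
img[:, 6:] = (120.0, 120.0, 120.0)
region = grow_similar_region(img, seed=(5, 0), signature=(60.0, 180.0, 60.0))
```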
Next, it is judged whether the similar region estimated in step S230 is a closed region (step S240). At this point, since path point P0 lies near the center of the operating area A and path point P1 is adjacent to P0, the entire range of the stitched image is the similar region, as shown in FIG. 10. Therefore, the similar region is judged not to be a closed region (step S240, No), and the vehicle moves to the next path point P2 (step S260).
Next, after capturing an image at path point P2 (step S210), the unmanned aerial vehicle 100 stitches together the images captured at path points P0, P1, and P2 (step S220), and again estimates as the similar region the region of the stitched image whose similarity to the sample image information is at or above the first threshold (step S230). In this way, each time an image is captured at a path point Px, the steps of stitching the images captured at all path points P0, P1, P2, …, Px-1, Px reached so far and estimating the similar region are repeated. Each time the image captured at a new path point is added, the stitched image grows, and parts not belonging to the similar region appear in it, as shown in FIG. 11. Before long, as shown in FIG. 12, the similar region becomes a closed region (step S240, Yes). At that point, the closed region, that is, the similar region, is specified as the operating area A (step S250), and the process ends.
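One way to read the closed-region test of step S240 is that the similar region is closed once it no longer touches the border of the stitched mosaic, i.e., dissimilar pixels surround it on all sides; a sketch under that assumption:

```python
import numpy as np

def is_closed_region(mask):
    """True when the similar-region mask does not touch any border
    of the stitched image, i.e., dissimilar pixels enclose it."""
    return not (mask[0, :].any() or mask[-1, :].any()
                or mask[:, 0].any() or mask[:, -1].any())

# FIG. 10 situation: the whole mosaic is similar, so not closed yet.
full = np.ones((8, 8), dtype=bool)
# FIG. 12 situation: the similar region is enclosed by dissimilar pixels.
enclosed = np.zeros((8, 8), dtype=bool)
enclosed[2:6, 2:6] = True
```

In the loop of the first embodiment, the vehicle would stop moving to further path points as soon as this test returns True.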
In this way, even without visiting all of the series of path points P0, P1, P2, …, Pn determined in the subject area and imaging at each, the operating area A can be specified and the process ended at the stage where the similar region becomes a closed region. This saves power and time while specifying the operating area quickly and accurately.
In addition, the geographic information of the specified operating area A (e.g., its range in longitude and latitude from GPS information) can be obtained from the position information of each path point. The position of each path point may be calculated, when the series of path points is determined (i.e., when step S121 is executed), from its relative distance to the reference position transmitted by the beacon. However, to obtain more accurate position information, the unmanned aerial vehicle 100 preferably acquires the position of each path point from the GPS receiver 140 each time it captures an image there.
The first specific embodiment of the step of specifying the operating area ends at the stage where the closed operating area containing the reference position P0 is determined, so each execution can determine only one closed region. However, when there are multiple closed regions A1, A2, A3 as shown in FIG. 13, the above method would have to be executed for each closed operating area, which would be cumbersome. The present disclosure therefore provides a second specific embodiment that can determine multiple closed operating areas within a preset range in a single pass.
The second specific embodiment S300 of the step of specifying the operating area in the flying body control method of the present disclosure is described below. First, as shown in FIG. 14, the unmanned aerial vehicle 100 obtains in advance information indicating the preset range of the subject area (step S310). The preset range of the subject area B is specified by a parameter indicating its extent; for example, it may be specified as the area within a 100 m radius centered on the reference position P0 where the beacon is located.
Next, the unmanned aerial vehicle 100 captures an image at path point P0 (step S320). It then judges whether imaging has been completed at all path points within 100 m of the reference position (i.e., path point) P0 (step S330). Since there are other path points within 100 m of the reference position P0 that have not yet been imaged (step S330, No), the unmanned aerial vehicle 100 moves to the next path point P1 (step S350) and captures an image there (step S320). The unmanned aerial vehicle 100 then again judges whether imaging has been completed at all path points within 100 m of the reference position P0 (step S330). Movement and imaging are repeated in this way until imaging is completed at all path points within the 100 m radius of the reference position P0.
After imaging at all path points within the 100 m radius of the reference position P0 is completed (step S330, Yes), the captured images are stitched together (step S340). The stitched image then covers the entire preset range of the subject area B shown in FIG. 15 (i.e., the range within 100 m of the reference position P0). Next, the regions of the stitched image whose similarity to the sample image information is at or above the first threshold (e.g., 70%) are specified as operating areas (step S360). The multiple closed regions A1, A2, A3 in the subject area can thereby be specified at once.
In the present embodiment, stitching could also be performed each time an image is captured at a path point, but it is preferable to stitch all the images together after imaging at all path points is complete. Reducing the number of stitching operations saves the resources of the processing unit 110 and reduces power consumption.
If every region of the preset range of the subject area B whose similarity to the sample image information is at or above the first threshold were specified as an operating area, many spurious regions might be included. For example, when a lawn A2 (5 square meters) lies near the farmland A1 (100 square meters) and the farmland A3 (20 square meters) shown in FIG. 15, and the sample image information of the farmland is RGB information corresponding to a greenish color, the lawn A2 might also be specified as an operating area. To avoid such interference, it is further preferable that, in step S360, only the regions of the stitched image whose similarity to the sample image information is at or above the first threshold (e.g., 70%) and whose area is at or above a second threshold (e.g., 10 square meters) be specified as operating areas. The second threshold can be preset according to the areas of the actual operating areas and of the regions likely to cause interference, and recorded in the memory 120 or the memory 240. As a result, as shown in FIG. 16, the farmland A1 and the farmland A3, being 10 square meters or larger, are specified as operating areas, while the lawn A2, being smaller than 10 square meters, is not, so the operating areas can be specified accurately.
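The second-threshold filtering might be realized as a connected-component pass that discards regions below the minimum area; a dependency-free sketch (the labeling strategy and toy region sizes are illustrative):

```python
from collections import deque
import numpy as np

def filter_by_area(mask, min_area):
    """Keep only 4-connected components of `mask` whose pixel count
    is at least `min_area` (the second threshold)."""
    h, w = mask.shape
    seen = np.zeros_like(mask, dtype=bool)
    out = np.zeros_like(mask, dtype=bool)
    for sy in range(h):
        for sx in range(w):
            if not mask[sy, sx] or seen[sy, sx]:
                continue
            # BFS over one connected component
            comp, queue = [], deque([(sy, sx)])
            seen[sy, sx] = True
            while queue:
                y, x = queue.popleft()
                comp.append((y, x))
                for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
                    if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                        seen[ny, nx] = True
                        queue.append((ny, nx))
            if len(comp) >= min_area:  # area at or above the second threshold
                for y, x in comp:
                    out[y, x] = True
    return out

# Toy FIG. 15: two "farmlands" (25 px and 16 px) and one small "lawn" (4 px).
m = np.zeros((20, 20), dtype=bool)
m[1:6, 1:6] = True       # A1: 25 px
m[1:5, 10:14] = True     # A3: 16 px
m[15:17, 15:17] = True   # A2: 4 px, below the threshold
kept = filter_by_area(m, min_area=10)
```

With `min_area=10`, only the two larger components survive, mirroring how A1 and A3 are kept while A2 is discarded.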
After specifying the operating area, the flying body control method of the present disclosure can plan a working path within the operating area by existing methods and carry out the work. Thus, even if the user does not set the operating area manually, the flying body can automatically and accurately grasp the operating area, achieving efficient work that neither overshoots nor falls short of it.
Each process (step) of the flying body control method of the present disclosure can be executed by the processing unit 110 of the unmanned aerial vehicle 100, but is not limited to this. For example, one or more of the image stitching step S220 and the similar-region estimation step S230 in the first specific embodiment, and the image stitching step S340 and the operating-area specification step S360 in the second specific embodiment, may be executed by a remote information processing device (e.g., a cloud server).
In addition, the flying body control method of the present disclosure may be realized by a program that causes the unmanned aerial vehicle 100 to execute each of the processes (steps). The program may be stored in the memory 120, the memory 240, or another storage medium.
The present disclosure has been described above using embodiments, but the technical scope of the invention according to the present disclosure is not limited to the scope described in the above embodiments. It is obvious to those of ordinary skill in the art that various changes or improvements can be made to the above embodiments. It is apparent from the description of the claims that modes incorporating such changes or improvements can also fall within the technical scope of the present disclosure.
The order of execution of the operations, procedures, steps, and stages of each process in the devices, systems, programs, and methods shown in the claims, specification, and drawings may be realized in any order, unless explicitly indicated by "before", "prior to", or the like, and as long as the output of a preceding process is not used by a subsequent process. Even where the operational flows in the claims, specification, and drawings are described using "first", "next", and the like for convenience, this does not mean they must be performed in that order.

Claims (22)

  1. A flying body control method for specifying an operating area of a flying body, comprising:
    a step of acquiring sample image information of the operating area; and
    a step of specifying the operating area within a subject area according to a similarity to the sample image information.
  2. The flying body control method according to claim 1, wherein
    the sample image information is obtained from an image captured at a reference position located within the operating area.
  3. The flying body control method according to claim 2, wherein
    the sample image information is obtained from one image selected from a plurality of images captured respectively at a plurality of positions located within the operating area.
  4. The flying body control method according to claim 2, wherein
    the reference position is located inside the boundary of the operating area by at least a predetermined distance.
  5. The flying body control method according to any one of claims 1 to 4, wherein
    the sample image information includes information related to an attribute of color.
  6. The flying body control method according to claim 2, wherein
    the step of specifying the operating area within the subject area comprises:
    a step of determining a series of imaging positions within the subject area;
    a step of capturing an image at each of the imaging positions; and
    a step of specifying the operating area according to a similarity between a composite of the captured images and the sample image information.
  7. The flying body control method according to claim 6, wherein
    the step of specifying the operating area according to the similarity between the composite of the captured images and the sample image information comprises:
    stitching the captured images each time an image is captured at one of the imaging positions, and,
    when a similar region of the stitched image whose similarity to the sample image information is at or above a first threshold is a closed region,
    a step of specifying the closed region as the operating area.
  8. The flying body control method according to claim 6, wherein
    the step of specifying the operating area according to the similarity between the composite of the captured images and the sample image information comprises:
    a step of obtaining information indicating a preset range of the subject area; and
    a step of specifying, as the operating area, a region of the image obtained by stitching the images captured at the imaging positions within the preset range whose similarity to the sample image information is at or above a first threshold.
  9. The flying body control method according to claim 8, wherein
    the step of specifying as the operating area the region of the stitched image whose similarity to the sample image information is at or above the first threshold comprises:
    a step of specifying, as the operating area, a region of the stitched image whose similarity to the sample image information is at or above the first threshold and whose area is at or above a second threshold.
  10. The flying body control method according to any one of claims 6 to 9, wherein
    the series of imaging positions constitutes a path that starts from the reference position and extends spirally.
  11. A flying body that specifies an operating area,
    comprising a processing unit,
    wherein the processing unit acquires sample image information of the operating area,
    and specifies the operating area within a subject area according to a similarity to the sample image information.
  12. The flying body according to claim 11, wherein
    the flying body further comprises an imaging device, and
    the sample image information is obtained from an image captured at a reference position located within the operating area.
  13. The flying body according to claim 12, wherein
    the sample image information is obtained from one image selected from a plurality of images captured respectively at a plurality of positions located within the operating area.
  14. The flying body according to claim 12, wherein
    the reference position is located inside the boundary of the operating area by at least a predetermined distance.
  15. The flying body according to any one of claims 11 to 14, wherein
    the sample image information includes information related to an attribute of color.
  16. The flying body according to claim 12, wherein
    the processing unit determines a series of imaging positions within the subject area, captures an image at each of the imaging positions, and specifies the operating area according to a similarity between a composite of the captured images and the sample image information.
  17. The flying body according to claim 16, wherein
    the processing unit stitches the captured images each time an image is captured at one of the imaging positions, and,
    when a similar region of the stitched image whose similarity to the sample image information is at or above a first threshold is a closed region, specifies the closed region as the operating area.
  18. The flying body according to claim 16, wherein
    the processing unit obtains information indicating a preset range of the subject area, and specifies, as the operating area, a region of the image obtained by stitching the images captured at the imaging positions within the preset range whose similarity to the sample image information is at or above a first threshold.
  19. The flying body according to claim 18, wherein
    the processing unit specifies, as the operating area, a region of the stitched image whose similarity to the sample image information is at or above the first threshold and whose area is at or above a second threshold.
  20. The flying body according to any one of claims 16 to 19, wherein
    the series of imaging positions constitutes a path that starts from the reference position and extends spirally.
  21. A program for causing a flying body that specifies an operating area to execute:
    a step of acquiring sample image information of the operating area; and
    a step of specifying the operating area within a subject area according to a similarity to the sample image information.
  22. A computer-readable recording medium,
    on which is recorded a program for causing a flying body that specifies an operating area to execute:
    a step of acquiring sample image information of the operating area; and
    a step of specifying the operating area within a subject area according to a similarity to the sample image information.
CN201880015439.8A 2017-08-29 2018-08-01 Flying body control method, flying body, program and recording medium Pending CN110383333A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2017-164640 2017-08-29
JP2017164640A JP2019045898A (en) 2017-08-29 2017-08-29 Flying body control method, flying body, program and recording medium
PCT/CN2018/098109 WO2019042067A1 (en) 2017-08-29 2018-08-01 Aerial vehicle control method, aerial vehicle, program and recording medium

Publications (1)

Publication Number Publication Date
CN110383333A true CN110383333A (en) 2019-10-25

Family

ID=65524863

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880015439.8A Pending CN110383333A (en) Flying body control method, flying body, program and recording medium

Country Status (3)

Country Link
JP (1) JP2019045898A (en)
CN (1) CN110383333A (en)
WO (1) WO2019042067A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6651682B1 (en) * 2019-04-02 2020-02-19 株式会社ネクスドローン Material distribution system and material distribution device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105354841A (en) * 2015-10-21 2016-02-24 武汉工程大学 Fast matching method and system for remote sensing images
CN106020233A (en) * 2016-07-08 2016-10-12 聂浩然 Unmanned aerial vehicle (UAV) adopted plant protection system, unmanned aerial vehicle (UAV) for plant protection and its control method
US20160327959A1 (en) * 2015-05-04 2016-11-10 International Business Machines Corporation Unmanned Vehicle Movement Path Assignment and Management
CN106796112A (en) * 2014-10-17 2017-05-31 索尼公司 Detection vehicle control apparatus, control method and computer program
CN106956778A (en) * 2017-05-23 2017-07-18 广东容祺智能科技有限公司 A kind of unmanned plane pesticide spraying method and system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5123812B2 (en) * 2008-10-10 2013-01-23 日立オートモティブシステムズ株式会社 Road marking recognition system
JP6014795B2 (en) * 2012-08-24 2016-10-26 ヤンマー株式会社 Travel path identification device
CN103235830A (en) * 2013-05-13 2013-08-07 北京臻迪科技有限公司 Unmanned aerial vehicle (UAV)-based electric power line patrol method and device and UAV
CN103985117A (en) * 2014-04-28 2014-08-13 上海融军科技有限公司 Method for capturing and determining object based on remote sensing image
US10665115B2 (en) * 2016-01-05 2020-05-26 California Institute Of Technology Controlling unmanned aerial vehicles to avoid obstacle collision
JP6621140B2 (en) * 2016-02-16 2019-12-18 株式会社ナイルワークス Method and program for spraying medicine by unmanned air vehicle


Also Published As

Publication number Publication date
JP2019045898A (en) 2019-03-22
WO2019042067A1 (en) 2019-03-07


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication (application publication date: 20191025)