CN106767817A - Method and aircraft for obtaining flight location information - Google Patents

Method and aircraft for obtaining flight location information

Info

Publication number
CN106767817A
CN106767817A (Application CN201611100259.9A)
Authority
CN
China
Prior art keywords
location information
flight location
moment
camera
individual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201611100259.9A
Other languages
Chinese (zh)
Other versions
CN106767817B (en)
Inventor
黄盈
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201611100259.9A priority Critical patent/CN106767817B/en
Publication of CN106767817A publication Critical patent/CN106767817A/en
Priority to PCT/CN2017/111577 priority patent/WO2018095278A1/en
Application granted granted Critical
Publication of CN106767817B publication Critical patent/CN106767817B/en
Priority to US16/296,073 priority patent/US10942529B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/20 Instruments for performing navigational calculations

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Navigation (AREA)

Abstract

The embodiment of the invention discloses a method for obtaining flight location information, including: determining first intrinsic parameters according to the first real-time images and second intrinsic parameters according to the second real-time images captured by an aircraft equipped with two cameras; obtaining first initial positioning information and second initial positioning information at an initial moment; determining first flight location information according to the first initial positioning information and the first intrinsic parameters, and determining second flight location information according to the second initial positioning information and the second intrinsic parameters; and obtaining target flight location information from the first flight location information and the second flight location information using a preset positioning constraint condition. The embodiment of the invention also provides an aircraft. The invention can obtain target flight location information closer to the true value; accurate location information can still be obtained without using an optical-flow camera or a high-accuracy inertial sensor, reducing the error while also reducing the cost of the aircraft.

Description

Method and aircraft for obtaining flight location information
Technical field
The present invention relates to the technical field of intelligent aircraft, and in particular to a method for obtaining flight location information and an aircraft.
Background technology
An unpiloted flying machine is referred to simply as an aircraft, and owing to its many advantages the aircraft has become a focus of development in many countries. On the one hand, an aircraft is small and maneuverable and therefore hard to detect; on the other hand, it can carry multiple sensors and provide high-resolution target information in diverse forms. In addition, an aircraft is inexpensive and causes no casualties, so it has the characteristics of economy and safety.
Because external navigation aids are lacking, it is difficult for an aircraft to estimate its own position and motion in an unknown environment, and this key problem must be solved for autonomous navigation. The way this problem is solved is closely tied to the type of sensor carried by the aircraft. In existing schemes, the location information of the aircraft can be obtained by measurement with a monocular camera, an optical-flow camera or an inertial sensor mounted on the fuselage, and flight control of the aircraft is performed using this location information.
However, in practical applications the positioning precision of a monocular camera or an inertial sensor is poor and the accumulated error is large, while an optical-flow camera or a high-precision inertial sensor is generally expensive, which increases the cost of the aircraft and is not conducive to widespread application.
The content of the invention
Embodiments of the present invention provide a method and an aircraft for obtaining flight location information, which can obtain target flight location information closer to the true value; accurate location information can still be obtained without using an optical-flow camera or a high-accuracy inertial sensor, reducing the error while also reducing the cost of the aircraft.
In view of this, a first aspect of the present invention provides a method for obtaining flight location information. The method is applied to an aircraft that includes a first camera and a second camera, wherein the first camera is used to obtain N first real-time images corresponding to N different moments, the second camera is used to obtain N second real-time images corresponding to the N different moments, and N is a positive integer greater than or equal to 2. The method includes:
determining (N-1) first intrinsic parameters according to the N first real-time images, and determining (N-1) second intrinsic parameters according to the N second real-time images;
obtaining first initial positioning information of the first camera and second initial positioning information of the second camera at an initial moment;
determining (N-1) first flight location information corresponding to (N-1) moments according to the first initial positioning information and the (N-1) first intrinsic parameters, and determining (N-1) second flight location information corresponding to the (N-1) moments according to the second initial positioning information and the (N-1) second intrinsic parameters;
obtaining, according to the (N-1) first flight location information and the (N-1) second flight location information, the target flight location information corresponding to the end moment among the N different moments using a preset positioning constraint condition.
A second aspect of the present invention provides an aircraft. The aircraft includes a first camera and a second camera, wherein the first camera is used to obtain N first real-time images corresponding to N different moments, the second camera is used to obtain N second real-time images corresponding to the N different moments, and N is a positive integer greater than or equal to 2. The aircraft includes:
a first determining module, configured to determine (N-1) first intrinsic parameters according to the N first real-time images and determine (N-1) second intrinsic parameters according to the N second real-time images;
a first acquisition module, configured to obtain the first initial positioning information of the first camera and the second initial positioning information of the second camera at the initial moment;
a second determining module, configured to determine (N-1) first flight location information corresponding to (N-1) moments according to the first initial positioning information obtained by the first acquisition module and the (N-1) first intrinsic parameters determined by the first determining module, and to determine (N-1) second flight location information corresponding to the (N-1) moments according to the second initial positioning information obtained by the first acquisition module and the (N-1) second intrinsic parameters determined by the first determining module;
a second acquisition module, configured to obtain, according to the (N-1) first flight location information and the (N-1) second flight location information determined by the second determining module, the target flight location information corresponding to the end moment among the N different moments using the preset positioning constraint condition.
As can be seen from the above technical solutions, the embodiments of the present invention have the following advantages:
In the embodiment of the present invention, the aircraft includes a first camera and a second camera; the first camera is used to obtain N first real-time images corresponding to N different moments, and the second camera is used to obtain N second real-time images corresponding to the N different moments. Flight location information can be obtained with this aircraft as follows: (N-1) first intrinsic parameters are determined according to the N first real-time images and (N-1) second intrinsic parameters are determined according to the N second real-time images; the first initial positioning information of the first camera and the second initial positioning information of the second camera at the initial moment are obtained; (N-1) first flight location information corresponding to (N-1) moments is then determined according to the first initial positioning information and the (N-1) first intrinsic parameters, and (N-1) second flight location information corresponding to the (N-1) moments is determined according to the second initial positioning information and the (N-1) second intrinsic parameters; finally, the target flight location information corresponding to the end moment among the N different moments is obtained from the (N-1) first flight location information and the (N-1) second flight location information using the preset positioning constraint condition. In this way, aircraft positioning is realised with a binocular camera: images corresponding to multiple moments are obtained in real time, the translation parameters between successive frames are obtained by analysis, the two cameras each use their translation parameters to obtain corresponding location information, and the location information is finally corrected with the preset positioning constraint condition to obtain target flight location information closer to the true value. Accurate location information can thus be obtained without using an optical-flow camera or a high-accuracy inertial sensor, reducing the error while also reducing the cost of the aircraft.
Brief description of the drawings
Fig. 1 is a schematic diagram of an embodiment of the method for obtaining flight location information in the embodiment of the present invention;
Fig. 2 is a schematic diagram of an aircraft provided with a binocular camera in the embodiment of the present invention;
Fig. 3 is a schematic diagram of positioning with the binocular camera in the embodiment of the present invention;
Fig. 4 is a schematic flowchart of obtaining target flight location information in the embodiment of the present invention;
Fig. 5 is a schematic diagram of the workflow of the binocular camera in an application scenario;
Fig. 6 is a schematic diagram of an embodiment of the aircraft in the embodiment of the present invention;
Fig. 7 is a schematic diagram of another embodiment of the aircraft in the embodiment of the present invention;
Fig. 8 is a schematic diagram of another embodiment of the aircraft in the embodiment of the present invention;
Fig. 9 is a schematic diagram of another embodiment of the aircraft in the embodiment of the present invention;
Fig. 10 is a schematic diagram of another embodiment of the aircraft in the embodiment of the present invention;
Fig. 11 is a schematic diagram of another embodiment of the aircraft in the embodiment of the present invention;
Fig. 12 is a schematic diagram of another embodiment of the aircraft in the embodiment of the present invention;
Fig. 13 is a schematic structural diagram of the aircraft in the embodiment of the present invention.
Specific embodiment
Embodiments of the present invention provide a method and an aircraft for obtaining flight location information, which can obtain target flight location information closer to the true value; accurate location information can still be obtained without using an optical-flow camera or a high-accuracy inertial sensor, reducing the error while also reducing the cost of the aircraft.
The terms "first", "second", "third", "fourth" and the like (if present) in the description, the claims and the above drawings are used to distinguish similar objects and are not necessarily used to describe a specific order or sequence. It should be understood that data so used may be interchanged where appropriate, so that the embodiments of the invention described herein can, for example, be implemented in an order other than that illustrated or described herein. In addition, the terms "comprising" and "having" and any variations thereof are intended to cover a non-exclusive inclusion; for example, a process, method, system, product or device that contains a series of steps or units is not necessarily limited to those steps or units expressly listed, but may include other steps or units that are not expressly listed or that are inherent to such process, method, product or device.
It should be understood that the solution of the present invention is mainly applied to the operation of an aircraft. An aircraft, or unmanned aerial vehicle (UAV), is a flying device that performs a specific aviation mission under wireless remote control or program control. It is a powered airborne vehicle that does not carry an operator, uses aerodynamic forces to provide the required lift, can fly automatically or be guided remotely, can be expendable or recoverable, and can carry a lethal or non-lethal payload.
It should be noted that the aircraft may be an unmanned aerial vehicle, a model airplane, or another kind of flying device, which is not limited herein.
Nowadays, an unmanned aerial vehicle can hover automatically with a vertical error within 10 centimetres and a horizontal accuracy of about 1 metre; when higher accuracy is required, manual fine-tuning is needed. Automatic hovering essentially fixes the UAV at a preset height and horizontal position, which means the UAV must first read its own position, that is, produce a set of three-dimensional coordinates, and this step is the most important. Determining the position of the UAV relatively accurately is therefore the premise and basis for the UAV to complete the hovering action.
The positioning technologies commonly used on unmanned aerial vehicles include the following:
First, positioning based on a Global Positioning System (GPS) module. GPS can achieve spatial positioning of the UAV by combining the position information of at least four satellites. Positioning centred on GPS and assisted by various other sensors is the mainstream scheme used by UAVs today. To cope with the error caused by selective availability (SA) in the GPS system, the GPS carried by a UAV usually uses differential GPS technology to improve positioning accuracy.
Second, positioning with a vision system. The onboard camera shoots continuously and provides consecutive image frames for the navigation system. In the image feature matching procedure, a feature tracker obtains natural landmark information from two consecutive image frames and measures the displacement between pairs of physical features. By periodically recording new feature points and comparing repeated feature points, the homography matrix of the three-dimensional geometric projection between the image capture sequences can be calculated, so that the UAV can be positioned.
Third, a high-accuracy positioning scheme combining radio with laser fixed points. In radio positioning, with the precise position of a beacon station known, the receiver receives the radio signal sent by the station, calculates the interval between the emission and the reception of the signal, obtains the relative distance between the station and the object, and thereby determines the position.
Among these three approaches, the vision system is free from the constraint of having to receive GPS signals; by cooperating with components such as inertial sensors it can keep the UAV stable even without a GPS signal, so a UAV using this scheme can be used in areas with distinctive environmental features, for example working environments near rivers or houses. The present invention mainly uses a vision system for positioning, as described in detail below.
Referring to Fig. 1, an embodiment of the method for obtaining flight location information in the embodiment of the present invention includes the following steps.
201. The aircraft, which includes a first camera and a second camera, determines (N-1) first intrinsic parameters according to N first real-time images and (N-1) second intrinsic parameters according to N second real-time images, wherein the first camera is used to obtain the N first real-time images corresponding to N different moments, the second camera is used to obtain the N second real-time images corresponding to the N different moments, and N is a positive integer greater than or equal to 2.
In this embodiment, the aircraft includes one pair of binocular cameras, that is, two cameras, defined as the first camera and the second camera respectively. The binocular camera can provide depth information and location information at the same time, where the depth information mainly refers to height information. One way to obtain depth information is to mount the binocular camera on the aircraft facing vertically downward, so that height changes can be captured well.
The first camera and the second camera are located at two different positions on the aircraft and capture N frames of images at the same time, where N is a positive integer greater than or equal to 2, which ensures that images of consecutive moments are obtained so that features can be compared. Each real-time image captured by the first camera at the N moments is called a first real-time image, and each real-time image captured by the second camera at the N moments is called a second real-time image.
The first real-time images contain the N frames corresponding to the N moments. Translation parameters are obtained by comparing the features of consecutive frames, so the N frames yield (N-1) sets of translation parameters, which are collectively called the (N-1) first intrinsic parameters. Similarly, the second real-time images also contain N frames corresponding to the N moments; feature comparison of consecutive frames yields (N-1) sets of translation parameters, which are called the (N-1) second intrinsic parameters.
202. Obtain the first initial positioning information at the initial moment among the N different moments through the first camera, and obtain the second initial positioning information at the initial moment among the N different moments through the second camera.
In this embodiment, the aircraft can obtain the first initial positioning information and the second initial positioning information corresponding to the initial moment among the N moments.
The first initial positioning information is obtained from what the first camera captures at the initial moment among the N different moments, and the second initial positioning information is obtained from what the second camera captures at that initial moment. If the whole space in which the aircraft flies is regarded as a three-dimensional coordinate system, the first initial positioning information is the position of the origin of the three-dimensional coordinate system captured by the first camera, and the second initial positioning information is the position of the origin of the three-dimensional coordinate system captured by the second camera.
203. Determine the (N-1) first flight location information corresponding to (N-1) moments according to the first initial positioning information and the (N-1) first intrinsic parameters, and determine the (N-1) second flight location information corresponding to the (N-1) moments according to the second initial positioning information and the (N-1) second intrinsic parameters.
In this embodiment, the aircraft has obtained the first initial positioning information and has calculated the (N-1) first intrinsic parameters, so the (N-1) first flight location information corresponding to the (N-1) moments can be determined from the first initial positioning information and the (N-1) first intrinsic parameters. Similarly, the (N-1) second flight location information corresponding to the (N-1) moments can be determined according to the second initial positioning information and the (N-1) second intrinsic parameters.
Specifically, taking the first flight location information as an example, assume N is 5, the first initial positioning information is X1, and the first intrinsic parameters at moments N1, N2, N3 and N4 are a, b, c and d respectively. Then the first flight location information at moment N1 is a·X1, at moment N2 it is ab·X1, at moment N3 it is abc·X1, and at moment N4 it is abcd·X1.
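The accumulation above amounts to composing the per-interval transforms onto the initial position. Below is a minimal sketch of that idea, assuming each intrinsic parameter is applied as a rotation-plus-translation step (consistent with the propagation formula X_{N+2} = R_{N+1}·X_{N+1} + t_{N+1} used later in this description); the function and variable names are illustrative, not the patent's reference implementation:

```python
import numpy as np

def propagate_positions(x0, transforms):
    """Compose the per-interval (R, t) pairs onto an initial 3D position x0.

    x0:         initial positioning information, shape (3,)
    transforms: list of (R, t), one pair per interval between consecutive moments,
                with R a 3x3 rotation matrix and t a 3-vector
    returns:    the (N-1) propagated flight positions
    """
    positions = []
    x = np.asarray(x0, dtype=float)
    for R, t in transforms:
        x = R @ x + t           # apply this interval's intrinsic parameters
        positions.append(x.copy())
    return positions

# Example with N = 5: four (R, t) pairs yield four flight positions.
identity = np.eye(3)
step = (identity, np.array([0.1, 0.0, 0.02]))
poses = propagate_positions([0.0, 0.0, 1.0], [step] * 4)
```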
204. According to the (N-1) first flight location information and the (N-1) second flight location information, obtain the target flight location information corresponding to the end moment among the N different moments using the preset positioning constraint condition.
In this embodiment, the aircraft can use the preset positioning constraint condition to correct and adjust the (N-1) first flight location information and the (N-1) second flight location information that have been obtained, so that the error between the adjusted (N-1) first flight location information and the adjusted (N-1) second flight location information is minimised; a solver then computes the optimal solution from the adjusted first and second flight location information, which yields the target flight location information, used as the flight location information at the end moment among the N different moments.
The target flight location information is sent to the flight control module of the aircraft, which uses it to fly or hover.
In the embodiment of the present invention, the aircraft includes a first camera and a second camera; the first camera is used to obtain N first real-time images corresponding to N different moments, and the second camera is used to obtain N second real-time images corresponding to the N different moments. Flight location information can be obtained with this aircraft as follows: (N-1) first intrinsic parameters are determined according to the N first real-time images and (N-1) second intrinsic parameters are determined according to the N second real-time images; the first initial positioning information of the first camera and the second initial positioning information of the second camera at the initial moment are obtained; (N-1) first flight location information corresponding to (N-1) moments is then determined according to the first initial positioning information and the (N-1) first intrinsic parameters, and (N-1) second flight location information corresponding to the (N-1) moments is determined according to the second initial positioning information and the (N-1) second intrinsic parameters; finally, the target flight location information corresponding to the end moment among the N different moments is obtained from the (N-1) first flight location information and the (N-1) second flight location information using the preset positioning constraint condition. In this way, aircraft positioning is realised with a binocular camera: images corresponding to multiple moments are obtained in real time, the translation parameters between successive frames are obtained by analysis, the two cameras each use their translation parameters to obtain corresponding location information, and the location information is finally corrected with the preset positioning constraint condition to obtain target flight location information closer to the true value. Accurate location information can thus be obtained without using an optical-flow camera or a high-accuracy inertial sensor, reducing the error while also reducing the cost of the aircraft.
Optionally, on the basis of the embodiment corresponding to Fig. 1, in a first optional embodiment of the method for obtaining flight location information provided by the embodiment of the present invention, before obtaining the first initial positioning information of the first camera and the second initial positioning information of the second camera at the initial moment, the method may further include:
arranging the first camera and the second camera on the same horizontal line of the aircraft within a preset camera distance range.
In this embodiment, refer to Fig. 2, a schematic diagram of an aircraft provided with a binocular camera in the embodiment of the present invention. As shown in the figure, the first camera and the second camera need to be arranged on the same horizontal line of the aircraft, and the spacing between them must lie within the preset camera distance range; the two camera positions in Fig. 2 are only an illustration and should not be construed as limiting this case.
It should be noted that the preset camera distance range is usually 6 to 10 centimetres; some adjustment is possible in practical applications, which is not limited herein.
However, in practice two mounted cameras cannot be placed exactly on the same horizontal line in the mathematical sense, so the two cameras need to be stereo-calibrated; stereo calibration can use Zhang Zhengyou's calibration method.
Specifically, the implementation of Zhang Zhengyou's calibration method may comprise the following steps:
1. Print a checkerboard and paste it on a plane as the calibration target.
2. By adjusting the orientation of the calibration target or the camera, take photos of the target from several different directions.
3. Extract feature points (such as corner points) from the photos.
4. Estimate the five intrinsic parameters and all the extrinsic parameters under the ideal, distortion-free assumption.
5. Estimate the distortion coefficients that exist in practice due to radial distortion using the least squares method.
6. Refine the estimate by maximum likelihood estimation to improve the estimation accuracy.
Through this process, five intrinsic parameters, three extrinsic parameters and two distortion coefficients are obtained with high estimation accuracy. With this information, distortion correction, image rectification and finally recovery of three-dimensional spatial information can be performed.
The parameters that a binocular camera needs to calibrate include, but are not limited to, the camera intrinsic matrix, the distortion coefficient matrix, the essential matrix, the fundamental matrix, the rotation matrix and the translation matrix. The camera intrinsic matrix and the distortion coefficient matrix can be calibrated by the monocular calibration method; the main difference between binocular calibration and monocular calibration is that binocular calibration also needs to calibrate the relative relationship between the left and right camera coordinate systems.
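As a concrete illustration of the checkerboard procedure and the binocular calibration described above, the following sketch uses OpenCV's standard calibration calls. It is a minimal example under assumed image paths and board geometry, not the patent's own implementation:

```python
import glob
import cv2
import numpy as np

# Assumed checkerboard geometry and image locations (illustrative only).
BOARD = (9, 6)                  # inner corners per row and column
SQUARE = 0.025                  # square size in metres
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE

obj_pts, left_pts, right_pts = [], [], []
for lf, rf in zip(sorted(glob.glob("left/*.png")), sorted(glob.glob("right/*.png"))):
    gray_l = cv2.imread(lf, cv2.IMREAD_GRAYSCALE)
    gray_r = cv2.imread(rf, cv2.IMREAD_GRAYSCALE)
    ok_l, corners_l = cv2.findChessboardCorners(gray_l, BOARD)
    ok_r, corners_r = cv2.findChessboardCorners(gray_r, BOARD)
    if ok_l and ok_r:
        obj_pts.append(objp)
        left_pts.append(corners_l)
        right_pts.append(corners_r)

size = gray_l.shape[::-1]
# Monocular calibration of each camera: intrinsic matrix and distortion coefficients.
_, K1, D1, _, _ = cv2.calibrateCamera(obj_pts, left_pts, size, None, None)
_, K2, D2, _, _ = cv2.calibrateCamera(obj_pts, right_pts, size, None, None)

# Stereo calibration: rotation R, translation T, essential E and fundamental F
# matrices relating the second (right) camera to the first (left) one.
_, K1, D1, K2, D2, R, T, E, F = cv2.stereoCalibrate(
    obj_pts, left_pts, right_pts, K1, D1, K2, D2, size,
    flags=cv2.CALIB_FIX_INTRINSIC)
print("baseline (m):", float(np.linalg.norm(T)))
```

The R and T returned here describe the second camera relative to the first one, which is exactly the relative relationship between the left and right camera coordinate systems discussed above.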
Secondly, in the embodiment of the present invention, the binocular camera is required to face vertically downward and to be mounted on the same horizontal line, with the distance between the two cameras within the preset camera distance range. With this mounting, the first camera and the second camera can capture satisfactory real-time images: if the spacing between the two cameras is too small it is difficult to obtain reasonable depth and location information, while if it is too large nearby objects cannot be captured by both cameras, so reference objects are lost.
Optionally, on the basis of the embodiment corresponding to Fig. 1, in a second optional embodiment of the method for obtaining flight location information provided by the embodiment of the present invention, before obtaining the first initial positioning information at the initial moment among the N different moments through the first camera and obtaining the second initial positioning information at the initial moment among the N different moments through the second camera, the method may further include:
obtaining a first sub-image corresponding to a first moment and a second sub-image corresponding to a second moment through the first camera, wherein the first moment and the second moment are two of the N different moments, and the first sub-image and the second sub-image belong to the first real-time images;
obtaining a third sub-image corresponding to the first moment and a fourth sub-image corresponding to the second moment through the second camera, wherein the third sub-image and the fourth sub-image belong to the second real-time images;
and obtaining the first depth information and the second depth information by measurement based on binocular stereo vision.
In this embodiment, before the aircraft obtains the first initial positioning information and the second initial positioning information, the first camera can be used to obtain the first sub-image corresponding to the first moment and, at the next moment, i.e. the second moment, the corresponding second sub-image. Similarly, the second camera obtains the corresponding third sub-image at the first moment and the fourth sub-image at the second moment. Naturally, the first sub-image and the second sub-image belong to the first real-time images, and the third sub-image and the fourth sub-image belong to the second real-time images.
Then, based on binocular stereo vision measurement, the first depth information of the first sub-image, the second depth information of the second sub-image, the third depth information of the third sub-image and the fourth depth information of the fourth sub-image can be obtained respectively. Binocular stereo vision is an important form of machine vision: based on the parallax principle, imaging devices acquire two images of the measured object from different positions, and the three-dimensional geometric information of the object is obtained by calculating the positional deviation between corresponding points in the images.
Specifically, by comparing the first sub-image and the third sub-image captured at the first moment, fusing the two views and observing the differences between them, an obvious sense of depth is obtained; the correspondences between features are established and the projections of the same spatial physical point in the different images are associated, so that the first depth information can be obtained. Similarly, by comparing the second sub-image and the fourth sub-image captured at the second moment, the second depth information can be obtained.
The binocular stereo vision measurement method has the advantages of high efficiency, suitable accuracy, simple system structure and low cost, and is very suitable for online, non-contact product inspection and quality control at the manufacturing site. When measuring moving objects, because image acquisition is completed in an instant, the stereo vision method is a relatively effective measurement method.
Secondly, in the embodiment of the present invention, the aircraft obtains the first sub-image corresponding to the first moment and the second sub-image corresponding to the second moment through the first camera, and obtains the third sub-image corresponding to the first moment and the fourth sub-image corresponding to the second moment through the second camera, and then obtains, by measurement based on binocular stereo vision, the first depth information of the first sub-image, the second depth information of the second sub-image, the third depth information of the third sub-image and the fourth depth information of the fourth sub-image. In this way, the first camera and the second camera can also obtain depth information, i.e. height information, which overcomes the shortcoming that a monocular camera or an optical-flow camera cannot provide depth information and thereby enhances the practicality of the scheme; at the same time, the obtained depth information can also be used for terrain recognition, object recognition and height determination, improving the diversity of the scheme.
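Below is a minimal sketch of how a rectified left/right pair captured at one moment can be turned into per-pixel depth with block matching and the standard relation depth = focal length x baseline / disparity. OpenCV's StereoSGBM matcher is used purely as an illustration, and the focal length, baseline and file names are assumed values:

```python
import cv2
import numpy as np

def depth_from_pair(left_gray, right_gray, focal_px, baseline_m):
    """Estimate per-pixel depth from one rectified left/right image pair.

    focal_px:   focal length in pixels (from calibration)
    baseline_m: distance between the two cameras in metres
    """
    matcher = cv2.StereoSGBM_create(minDisparity=0,
                                    numDisparities=64,   # must be a multiple of 16
                                    blockSize=7)
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan               # pixels with no valid match
    # Standard triangulation relation: depth = focal * baseline / disparity.
    return focal_px * baseline_m / disparity

# Example with an 8 cm baseline, i.e. within the 6-10 cm range mentioned above.
left = cv2.imread("left_t1.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right_t1.png", cv2.IMREAD_GRAYSCALE)
depth_map = depth_from_pair(left, right, focal_px=520.0, baseline_m=0.08)
```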
Optionally, on the basis of the second embodiment corresponding to Fig. 1, in a third optional embodiment of the method for obtaining flight location information provided by the embodiment of the present invention, the first intrinsic parameters may include a first rotation matrix and a first translation vector, and the second intrinsic parameters include a second rotation matrix and a second translation vector, wherein the first rotation matrix is used to represent the angle change of the first camera, the second rotation matrix is used to represent the angle change of the second camera, the first translation vector is used to represent the height change of the first camera, and the second translation vector is used to represent the height change of the second camera.
In this embodiment, the first camera obtains the first intrinsic parameters and the second camera obtains the second intrinsic parameters; both are intrinsic parameters, and intrinsic parameters consist of a rotation matrix and a translation vector. The rotation matrix and the translation vector are introduced separately below.
The relative position relationship between any two coordinate systems can be described by two matrices: a rotation matrix R and a translation matrix T. Here R and T are used to describe the relative relationship between the left and right camera coordinate systems, specifically the conversion of coordinates under the left camera into coordinates under the right camera, that is, the conversion of coordinates under the first camera into coordinates under the second camera.
Suppose there is a point P in space whose coordinate in the world coordinate system is P_W, and let l denote the left camera and r the right camera. Its coordinates in the left and right camera coordinate systems can be expressed as:
P_l = R_l·P_W + T_l, P_r = R_r·P_W + T_r (1)
where P_l and P_r are further related by:
P_r = R·P_l + T (2)
In binocular camera analysis the left camera, i.e. the first camera, is usually taken as the master coordinate system, but R and T describe the conversion from left to right, so T_x is negative. Combining equations (1) and (2) gives:
R = R_r·R_l^T, T = T_r - R·T_l (3)
The camera extrinsic parameters from monocular calibration are exactly the R_l, T_l, R_r and T_r here; substituting them into equation (3) yields the rotation matrix R and the translation matrix T, and the translation vector t can be obtained from the translation matrix T.
The essential matrix, which is composed from the rotation matrix and the translation vector, is very important in epipolar geometry: it can simplify stereo matching, and problems such as finding epipolar lines, which are solved with epipolar geometry, require the essential matrix to be known. Therefore the essential matrix can also be determined from the rotation matrix R and the translation matrix T during binocular calibration.
The essential matrix is usually denoted by the letter E; its physical meaning is the parameter by which the left and right coordinate systems are converted into each other, and it describes the relationship between corresponding points on the left and right image planes.
Again, the embodiment of the present invention explains that the binocular camera can obtain a rotation matrix and a translation vector and that the intrinsic parameters are built from them. In this way, each camera of the binocular pair needs to be calibrated, and the rotation matrix and translation vector obtained describe the relative position relationship between the two cameras and can also constitute the intrinsic parameters, thereby ensuring the feasibility and practicality of the scheme.
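A small numpy sketch of equation (3), turning the per-camera extrinsics R_l, T_l, R_r, T_r from monocular calibration into the left-to-right rotation and translation; the names are illustrative:

```python
import numpy as np

def stereo_extrinsics(R_l, T_l, R_r, T_r):
    """Relative pose of the right camera with respect to the left one.

    Derived from P_l = R_l P_W + T_l and P_r = R_r P_W + T_r, which give
    P_r = (R_r R_l^T) P_l + (T_r - R_r R_l^T T_l), i.e. equation (3).
    """
    R = R_r @ R_l.T
    T = T_r - R @ T_l
    return R, T
```

The returned R and T play the same role as the externally measured R_ext and t_ext used in the positioning constraint later in this description.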
Optionally, on the basis of the third embodiment corresponding to Fig. 1, in a fourth optional embodiment of the method for obtaining flight location information provided by the embodiment of the present invention, determining the (N-1) first intrinsic parameters according to the N first real-time images and determining the (N-1) second intrinsic parameters according to the N second real-time images may include:
calculating the (N-1) first intrinsic parameters according to the following formula:
λ2·C^(-1)·x_j^2 = R1·(λ1·C^(-1)·x_j^1) + t1
wherein λ1 represents the first depth information, λ2 represents the second depth information, x_j^1 represents the coordinates of the target point X_j observed in the first sub-image, x_j^2 represents the coordinates of the target point X_j observed in the second sub-image, C represents the camera internal parameter measured in advance, R1 represents the first rotation matrix and t1 represents the first translation vector;
and calculating the (N-1) second intrinsic parameters according to the following formula:
λ4·C^(-1)·y_k^4 = R2·(λ3·C^(-1)·y_k^3) + t2
wherein λ3 represents the third depth information, λ4 represents the fourth depth information, y_k^3 represents the coordinates of the target point Y_k observed in the third sub-image, y_k^4 represents the coordinates of the target point Y_k observed in the fourth sub-image, R2 represents the second rotation matrix and t2 represents the second translation vector.
In this embodiment, refer to Fig. 3, a schematic diagram of positioning with the binocular camera in the embodiment of the present invention, in which R denotes the (N-1) first intrinsic parameters, L denotes the (N-1) second intrinsic parameters and E denotes the preset positioning constraint condition.
Specifically, from the real-time images captured by each camera at each moment, the rotation matrix and translation vector of the real-time images can be calculated using the ORB (Oriented FAST and Rotated BRIEF) feature algorithm. The ORB feature points of each real-time image frame are first extracted and then matched with the ORB feature points of the previous frame, so that the ORB feature point sets corresponding to two of the N moments are obtained:
z1 = {x_1^1, x_2^1, ..., x_n^1}, z2 = {x_1^2, x_2^2, ..., x_n^2}
Here z1 is the feature point set of the image at the earlier moment and z2 is the feature point set of the image at the later moment. In practical applications there are n groups of matched points; only one group is used here as an illustration. If z1 and z2 are perfectly matched, every group of points should satisfy the following equation:
λ2·C^(-1)·x_j^2 = R1·(λ1·C^(-1)·x_j^1) + t1
wherein λ1 represents the first depth information, λ2 represents the second depth information, x_j^1 represents the coordinates of the target point X_j observed in the first sub-image, x_j^2 represents the coordinates of the target point X_j observed in the second sub-image, C represents the camera internal parameter measured in advance, R1 represents the first rotation matrix and t1 represents the first translation vector.
Of course, in the same way it can be determined that every group of points in the second camera satisfies the following equation:
λ4·C^(-1)·y_k^4 = R2·(λ3·C^(-1)·y_k^3) + t2
wherein λ3 represents the third depth information, λ4 represents the fourth depth information, y_k^3 represents the coordinates of the target point Y_k observed in the third sub-image, y_k^4 represents the coordinates of the target point Y_k observed in the fourth sub-image, R2 represents the second rotation matrix and t2 represents the second translation vector.
Combining equations (6), (7), (8) and (9) into a system of equations, the first intrinsic parameters and the second intrinsic parameters can be calculated, that is, the first rotation matrix, the first translation vector, the second rotation matrix and the second translation vector are obtained.
Further, the embodiment of the present invention gives the corresponding calculation formulas for determining the (N-1) first intrinsic parameters and the (N-1) second intrinsic parameters; the intrinsic parameters can be calculated through the corresponding formulas, which provides a feasible basis for realising the scheme and thereby increases its feasibility.
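The per-camera motion estimation described above can be prototyped with ORB features and a standard relative-pose recovery. The sketch below uses OpenCV calls as one possible realisation under an assumed intrinsic matrix; it is not the patent's own solver, and recoverPose returns the translation only up to scale, whereas the patent fixes the scale with the stereo depth information:

```python
import cv2
import numpy as np

def frame_to_frame_motion(img_prev, img_curr, K):
    """Estimate the rotation matrix and (unit-scale) translation vector between
    two consecutive frames of one camera from matched ORB feature points."""
    orb = cv2.ORB_create(nfeatures=1000)
    kp1, des1 = orb.detectAndCompute(img_prev, None)
    kp2, des2 = orb.detectAndCompute(img_curr, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Essential matrix from the matched points, then decomposition into R and t.
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t

# Assumed pinhole intrinsic matrix C (focal length and principal point illustrative).
K = np.array([[520.0, 0.0, 320.0],
              [0.0, 520.0, 240.0],
              [0.0, 0.0, 1.0]])
```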
Optionally, on the basis of the embodiment corresponding to Fig. 1, in a fifth optional embodiment of the method for obtaining flight location information provided by the embodiment of the present invention, obtaining the target flight location information corresponding to the end moment among the N different moments using the preset positioning constraint condition according to the (N-1) first flight location information and the (N-1) second flight location information may include:
calculating, according to the following formula, the minimum of the variance between the second flight location information and the first flight location information that satisfies the preset positioning constraint condition:
min_{X,Y} Σ_{j=1}^{N} || Y_j - (R_ext·X_j + t_ext) ||^2
wherein X represents the first flight location information, Y represents the second flight location information, the expression denotes the minimum of the variance between the second flight location information and the first flight location information under the preset positioning constraint condition, N represents the N-th moment, j represents the j-th of the N moments, X_j represents the first flight location information corresponding to the j-th moment, Y_j represents the second flight location information corresponding to the j-th moment, R_ext represents the rotation matrix between the first camera and the second camera measured in advance, and t_ext represents the translation vector between the first camera and the second camera measured in advance;
and calculating the target flight location information according to the variance minimum value.
In this embodiment, the minimum for each group of first flight location information and second flight location information is first calculated with the following formula:
min_{X,Y} Σ_{j=1}^{N} || Y_j - (R_ext·X_j + t_ext) ||^2
wherein X represents the first flight location information, Y represents the second flight location information, the expression denotes the minimum of the variance between the second flight location information and the first flight location information under the preset positioning constraint condition, N represents the N-th moment, j represents the j-th of the N moments, X_j represents the first flight location information corresponding to the j-th moment, Y_j represents the second flight location information corresponding to the j-th moment, R_ext represents the rotation matrix between the first camera and the second camera measured in advance, and t_ext represents the translation vector between the first camera and the second camera measured in advance.
In this way N groups of adjusted flight location information are obtained: the first flight location information and the second flight location information jointly form {X1, Y1}, {X2, Y2}, ..., {Xn, Yn}, and after adjustment each group {X1, Y1}, {X2, Y2}, ..., {Xn, Yn} is closer to the minimum, so the measurement result is more accurate.
Here R_ext represents the rotation matrix between the first camera and the second camera measured in advance and t_ext represents the translation vector between them; R_ext and t_ext together serve as the external parameters of the cameras and can be obtained by stereo calibration.
For ease of introduction, refer to Fig. 4, a schematic flowchart of obtaining target flight location information in the embodiment of the present invention. In step 201, the aircraft calculates the current poses of the left and right cameras respectively, i.e. the current flight location information, which may specifically include the coordinate point position in the three-dimensional coordinate system and the heading. In step 202, the graph relationship is constructed with the general graph optimisation framework (g2o) and the flight location information is corrected using the binocular constraint, i.e. the preset positioning constraint condition. g2o is an implementation of a set of algorithms that, following the theory of solving nonlinear least squares problems, selects the most suitable algorithm for the specific problem; it is a platform to which linear equation solvers can be added and in which one can write one's own optimisation objective function and decide how it is updated. In step 203, the solver of g2o is used to obtain the optimal solution, and finally in step 204 the current pose information, i.e. the current flight location information, is updated with the optimal solution; the updated flight location information is exactly the target flight location information.
Secondly, in the embodiment of the present invention, the first flight location information and the second flight location information are obtained by measurement based on the binocular camera, and the constraint between the flight location information of the two cameras is established; by means of this constraint the optimised flight location information of the aircraft, i.e. the target flight location information, can be solved, thereby reducing the error and improving the positioning accuracy.
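Below is a minimal sketch of the binocular constraint posed as an ordinary least-squares problem, written with scipy instead of g2o purely for illustration; R_ext and t_ext are the stereo extrinsics, and all function and variable names are assumptions:

```python
import numpy as np
from scipy.optimize import least_squares

def fuse_trajectories(X_meas, Y_meas, R_ext, t_ext):
    """Jointly adjust the per-camera positions so that each pair (X_j, Y_j)
    respects Y_j ~= R_ext X_j + t_ext while staying close to its measurement."""
    X_meas = np.asarray(X_meas, dtype=float)
    Y_meas = np.asarray(Y_meas, dtype=float)
    n = len(X_meas)

    def residuals(params):
        X = params[:3 * n].reshape(n, 3)
        Y = params[3 * n:].reshape(n, 3)
        r_constraint = (Y - (X @ R_ext.T + t_ext)).ravel()   # binocular constraint
        r_measured = np.concatenate([(X - X_meas).ravel(),    # stay near measurements
                                     (Y - Y_meas).ravel()])
        return np.concatenate([r_constraint, r_measured])

    x0 = np.concatenate([X_meas.ravel(), Y_meas.ravel()])
    sol = least_squares(residuals, x0)
    return sol.x[:3 * n].reshape(n, 3), sol.x[3 * n:].reshape(n, 3)
```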
Optionally, on the basis of Fig. 1 and any one of the first to fifth embodiments corresponding to Fig. 1, in a sixth optional embodiment of the method for obtaining flight location information provided by the embodiment of the present invention, after obtaining the target flight location information corresponding to the end moment among the N different moments using the preset positioning constraint condition according to the (N-1) first flight location information and the (N-1) second flight location information, the method may further include:
determining, according to the target flight location information, first sub-flight location information corresponding to the (N+1)-th moment, the first sub-flight location information being one piece of information in the target flight location information;
obtaining second sub-flight location information corresponding to the (N+1)-th moment using the preset positioning constraint condition and the first sub-flight location information;
determining third sub-flight location information corresponding to the (N+2)-th moment according to the first sub-flight location information and the first intrinsic parameters;
obtaining fourth sub-flight location information corresponding to the (N+2)-th moment using the preset positioning constraint condition and the third sub-flight location information;
and calculating a first optimal solution from the first sub-flight location information and the third sub-flight location information, and a second optimal solution from the second sub-flight location information and the fourth sub-flight location information, the first optimal solution and the second optimal solution constituting the flight location information at the (N+2)-th moment.
In this embodiment, after the aircraft obtains the target flight location information corresponding to the end moment among the N different moments using the preset positioning constraint condition, the subsequent flight location information can also be calculated from the target flight location information.
Specifically, the known target flight location information contains the location information of the first camera and the location information of the second camera. Suppose only one piece of location information X1 corresponding to the (N+1)-th moment is selected; X1 is called the first sub-flight location information. The location information Y1 corresponding to the (N+1)-th moment, i.e. the second sub-flight location information, is then deduced using the preset positioning constraint condition. At this point one group of sub-flight location information has been obtained, and the acquisition of the next group begins.
According to X1 and the first intrinsic parameters, the third sub-flight location information corresponding to the (N+2)-th moment, i.e. X2, is calculated. Similarly, using the preset positioning constraint condition and X2, the fourth sub-flight location information corresponding to the (N+2)-th moment, i.e. Y2, is calculated. At this point the next group of sub-flight location information has also been obtained, and subsequent sub-flight location information can continue to be acquired in the same way, which is not repeated here.
In practical applications, the two cameras each obtain an optimal solution from the X and Y values calculated respectively, for example using the least squares method; the two optimal solutions constitute the flight location information at the (N+2)-th moment.
Secondly, in the embodiment of the present invention, after the optimal target flight location information is obtained, the target flight location information and the preset positioning constraint condition can be used to predict the optimal flight location information for a following period of time. In this way, on the one hand a feasible means of obtaining accurate flight location information is provided, which increases the flexibility of the scheme; on the other hand, the subsequently obtained flight location information pays more attention to global considerations, which is conducive to determining the location information of the aircraft in the global coordinate system.
Optionally, on the basis of the sixth embodiment corresponding to Fig. 1, in a seventh optional embodiment of the method for obtaining flight location information provided by the embodiment of the present invention, determining the third sub-flight location information corresponding to the (N+2)-th moment according to the first sub-flight location information and the first intrinsic parameters may include:
calculating the third sub-flight location information corresponding to the (N+2)-th moment according to the following formula:
X_{N+2} = R_{N+1}·X_{N+1} + t_{N+1}
wherein X_{N+2} represents the third sub-flight location information corresponding to the (N+2)-th moment, R_{N+1} represents the rotation matrix at the (N+1)-th moment in the first intrinsic parameters, t_{N+1} represents the translation vector at the (N+1)-th moment in the first intrinsic parameters, and X_{N+1} represents the first sub-flight location information corresponding to the (N+1)-th moment.
In this embodiment, how to calculate the third sub-flight location information corresponding to the (N+2)-th moment is introduced in detail. Since the intrinsic parameters have already been obtained and consist of a rotation matrix and a translation vector, the third sub-flight location information can be obtained using the rotation matrix and the translation vector.
The third sub-flight location information corresponding to the (N+2)-th moment is calculated using the following formula:
X_{N+2} = R_{N+1}·X_{N+1} + t_{N+1} (11)
wherein X_{N+2} represents the third sub-flight location information corresponding to the (N+2)-th moment, R_{N+1} represents the rotation matrix at the (N+1)-th moment in the first intrinsic parameters, t_{N+1} represents the translation vector at the (N+1)-th moment in the first intrinsic parameters, and X_{N+1} represents the first sub-flight location information corresponding to the (N+1)-th moment.
In this way, the sub-flight location information at the current moment can be calculated each time from the sub-flight location information at the previous moment. The series of sub-flight location information thus calculated and the external parameters of the binocular camera are then input into g2o to build the graph relationship, the solver of g2o is called to obtain the least-squares optimal solution, and finally the target flight location information is updated with this optimal solution; meanwhile, the optimal solution is also sent to the flight control module of the aircraft.
Again, in the embodiment of the present invention, the third sub-flight location information corresponding to the following moment is calculated from the first sub-flight location information corresponding to the previous moment, that is, it can be calculated with the corresponding formula; in this way the practicality and feasibility of the scheme can be improved.
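A small sketch combining equation (11) with the binocular constraint to roll the estimate forward one moment at a time; the function and variable names are illustrative:

```python
import numpy as np

def predict_next_pair(X_prev, R_step, t_step, R_ext, t_ext):
    """Predict the next first-camera position with equation (11) and derive the
    matching second-camera position from the preset positioning constraint."""
    X_next = R_step @ X_prev + t_step      # X_{N+2} = R_{N+1} X_{N+1} + t_{N+1}
    Y_next = R_ext @ X_next + t_ext        # second camera via the extrinsic constraint
    return X_next, Y_next
```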
For ease of understanding, the method for obtaining flight location information in the present invention is described in detail below with a concrete application scenario. Refer to Fig. 5, a schematic diagram of the workflow of the binocular camera in the application scenario:
In step 301, suppose the aircraft used is an unmanned aerial vehicle; the UAV first collects the real-time images of the left and right eyes respectively through the binocular camera it carries facing vertically downward.
In step 302, the depth value of the image is calculated using the real-time images of the left and right eyes.
In step 303, the rotation matrices and translation vectors of the left and right cameras are calculated respectively based on ORB image feature points; because the images collected by the left and right cameras are different, their image feature points differ, so there is an error between the rotation matrices and translation vectors calculated for the two cameras.
In step 304, two groups of constraint conditions between the rotation matrices and translation vectors are established according to the constraint between the binocular cameras, and the optimal solution of the UAV pose is obtained using the least squares method; this optimal solution is the location information of the UAV.
In step 305, this information is sent to the flight control system, so that the UAV can obtain more accurate location information.
The aircraft of the present invention is described in detail below. Referring to Fig. 6, the aircraft in the embodiment of the present invention includes a first camera and a second camera, wherein the first camera is used to obtain N first real-time images corresponding to N different moments, the second camera is used to obtain N second real-time images corresponding to the N different moments, and N is a positive integer greater than or equal to 2. The aircraft includes:
a first determining module 401, configured to determine (N-1) first intrinsic parameters according to the N first real-time images and determine (N-1) second intrinsic parameters according to the N second real-time images;
a first acquisition module 402, configured to obtain the first initial positioning information at the initial moment among the N different moments through the first camera, and obtain the second initial positioning information at the initial moment among the N different moments through the second camera;
a second determining module 403, configured to determine (N-1) first flight location information corresponding to (N-1) moments according to the first initial positioning information obtained by the first acquisition module 402 and the (N-1) first intrinsic parameters determined by the first determining module 401, and to determine (N-1) second flight location information corresponding to the (N-1) moments according to the second initial positioning information obtained by the first acquisition module 402 and the (N-1) second intrinsic parameters determined by the first determining module 401;
a second acquisition module 404, configured to obtain, according to the (N-1) first flight location information and the (N-1) second flight location information determined by the second determining module 403, the target flight location information corresponding to the end moment among the N different moments using the preset positioning constraint condition.
In this embodiment, the first determining module 401 determines the (N-1) first intrinsic parameters according to the N first real-time images and the (N-1) second intrinsic parameters according to the N second real-time images; the first acquisition module 402 obtains, through the first camera, the first initial positioning information of the initial moment among the N different moments and obtains, through the second camera, the second initial positioning information of the initial moment among the N different moments; the second determining module 403 determines the (N-1) first flight location information corresponding to the (N-1) moments according to the first initial positioning information obtained by the first acquisition module 402 and the (N-1) first intrinsic parameters determined by the first determining module 401, and determines the (N-1) second flight location information corresponding to the (N-1) moments according to the second initial positioning information obtained by the first acquisition module 402 and the (N-1) second intrinsic parameters determined by the first determining module 401; and the second acquisition module 404 obtains, according to the (N-1) first flight location information and the (N-1) second flight location information determined by the second determining module 403, the target flight location information corresponding to the end moment among the N different moments by using the preset position constraint condition.
In this embodiment of the present invention, the aircraft realizes positioning by using a binocular camera: images corresponding to multiple different moments can be obtained in real time, the translation parameters between every two frames are then obtained by analysis, corresponding location information is obtained for each of the two cameras from the translation parameters, and the location information is finally corrected with the preset position constraint condition to obtain target flight location information closer to the actual value. Even without an optical-flow camera or a high-accuracy inertial sensor, accurate location information can still be obtained, the error is reduced, and the cost of the aircraft is also reduced.
Optionally, on the basis of the embodiment corresponding to Fig. 6, referring to Fig. 7, in another embodiment of the aircraft provided in the embodiments of the present invention, the aircraft further includes:
a setup module 405, configured to, before the first acquisition module 402 obtains through the first camera the first initial positioning information of the initial moment among the N different moments and obtains through the second camera the second initial positioning information of the initial moment among the N different moments, arrange the first camera and the second camera on the same horizontal line of the aircraft within a preset camera distance range.
Further, in this embodiment of the present invention, the binocular camera is required to face vertically downward and to be installed on the same horizontal line, with the distance between the two cameras kept within the preset camera distance range. With this mounting, the first camera and the second camera can capture real-time images that meet the requirements: if the interval between the two cameras is too small, it is difficult to obtain reasonable depth information and location information, whereas if the interval is too large, nearby objects cannot be captured by both cameras and reference objects are lacking.
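A rough numerical illustration of this trade-off follows; the focal length, true depth and half-pixel matching error are assumed values, used only to show that the stereo relation depth = f × baseline / disparity degrades quickly when the baseline is too small.

f_px = 700.0                          # focal length in pixels (assumed)
Z_true = 10.0                         # true depth in metres (assumed)
for B in (0.02, 0.12):                # 2 cm versus 12 cm camera interval
    d_ideal = f_px * B / Z_true       # ideal disparity in pixels
    d_noisy = d_ideal + 0.5           # half-pixel matching error (assumed)
    Z_est = f_px * B / d_noisy        # recovered depth
    print(B, round(d_ideal, 2), round(Z_est, 2))   # the small baseline loses far more depth accuracy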
Optionally, on the basis of the embodiment corresponding to Fig. 6, referring to Fig. 8, in another embodiment of the aircraft provided in the embodiments of the present invention, the aircraft further includes:
a third acquisition module 406, configured to, before the first determining module 401 determines the (N-1) first intrinsic parameters according to the N first real-time images and determines the (N-1) second intrinsic parameters according to the N second real-time images, obtain, through the first camera, a first sub-image corresponding to a first moment and a second sub-image corresponding to a second moment, where the first moment and the second moment are two of the N different moments, and the first sub-image and the second sub-image belong to the first real-time images;
a fourth acquisition module 407, configured to obtain, through the second camera, a third sub-image corresponding to the first moment and a fourth sub-image corresponding to the second moment, where the third sub-image and the fourth sub-image belong to the second real-time images;
a measurement module 408, configured to obtain the first depth information and the second depth information by measurement based on binocular stereo vision.
Further, in this embodiment of the present invention, the aircraft obtains, through the first camera, the first sub-image corresponding to the first moment and the second sub-image corresponding to the second moment, obtains, through the second camera, the third sub-image corresponding to the first moment and the fourth sub-image corresponding to the second moment, and then obtains, by measurement based on binocular stereo vision, the first depth information of the first sub-image, the second depth information of the second sub-image, the third depth information of the third sub-image and the fourth depth information of the fourth sub-image. In this way the first camera and the second camera can also obtain depth information, namely height information, overcoming the shortcoming that a monocular camera and an optical-flow camera cannot provide depth information, which enhances the practicality of the solution; at the same time, after the depth information is obtained it can further be used for terrain recognition, object recognition and height determination, improving the diversity of the solution.
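A minimal Python/OpenCV sketch of the binocular depth measurement is given below; it assumes rectified grayscale left and right images and assumed focal-length and baseline values, and it illustrates the binocular stereo vision idea rather than the exact procedure of the embodiment.

import cv2
import numpy as np

def depth_map(left_gray, right_gray, f_px, baseline_m):
    # Disparity from semi-global block matching, then depth = f * baseline / disparity.
    sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
    disp = sgbm.compute(left_gray, right_gray).astype(np.float32) / 16.0   # fixed-point to pixels
    disp[disp <= 0] = np.nan                                               # mark invalid matches
    return f_px * baseline_m / disp                                        # per-pixel depth in metres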
Optionally, on the basis of the embodiment corresponding to Fig. 8, in another embodiment of the aircraft provided in the embodiments of the present invention, the first intrinsic parameters include a first rotation matrix and a first translation vector, and the second intrinsic parameters include a second rotation matrix and a second translation vector, where the first rotation matrix is used to represent the angle change of the first camera, the second rotation matrix is used to represent the angle change of the second camera, the first translation vector is used to represent the height change of the first camera, and the second translation vector is used to represent the height change of the second camera.
Further, this embodiment of the present invention illustrates that the binocular camera can obtain rotation matrices and translation vectors, and that the intrinsic parameters are built from the rotation matrices and translation vectors. In this way, each camera of the binocular camera needs to be calibrated separately; the rotation matrix and translation vector obtained can describe the relative position relation between the two cameras and can also constitute the intrinsic parameters, which ensures the feasibility and practicality of the solution.
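As a hedged illustration of such a calibration, the sketch below uses OpenCV stereo calibration to obtain a rotation matrix and a translation vector describing the relative pose between the two cameras; the chessboard object points, the detected corner lists, the initial intrinsics and the image size are assumed inputs.

import cv2

def calibrate_extrinsics(object_points, corners_left, corners_right, K1, d1, K2, d2, image_size):
    # Returns the rotation matrix and translation vector from the first camera to the second camera.
    _, _, _, _, _, R_ext, t_ext, _, _ = cv2.stereoCalibrate(
        object_points, corners_left, corners_right,
        K1, d1, K2, d2, image_size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    return R_ext, t_ext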
Optionally, on the basis of the embodiment corresponding to Fig. 8, referring to Fig. 9, in another embodiment of the aircraft provided in the embodiments of the present invention, the first determining module 401 includes:
a first computing unit 4011, configured to calculate the (N-1) first intrinsic parameters with the following formulas:
$\lambda_1 \begin{bmatrix} z_{1j} \\ 1 \end{bmatrix} = C X_j$;
$\lambda_2 \begin{bmatrix} z_{2j} \\ 1 \end{bmatrix} = C (R_1 X_j + t_1)$;
where $\lambda_1$ represents the first depth information, $\lambda_2$ represents the second depth information, $z_{1j}$ represents the target point $X_j$ of the three-dimensional space in the first sub-image, $z_{2j}$ represents the target point $X_j$ of the three-dimensional space in the second sub-image, C represents the inner parameter measured in advance, $R_1$ represents the first rotation matrix, and $t_1$ represents the first translation vector;
and to calculate the (N-1) second intrinsic parameters with the following formulas:
$\lambda_3 \begin{bmatrix} z_{3k} \\ 1 \end{bmatrix} = C Y_k$;
$\lambda_4 \begin{bmatrix} z_{4k} \\ 1 \end{bmatrix} = C (R_2 Y_k + t_2)$;
where $\lambda_3$ represents the third depth information, $\lambda_4$ represents the fourth depth information, $z_{3k}$ represents the target point $Y_k$ of the three-dimensional space in the third sub-image, $z_{4k}$ represents the target point $Y_k$ of the three-dimensional space in the fourth sub-image, $R_2$ represents the second rotation matrix, and $t_2$ represents the second translation vector.
Furthermore, this embodiment of the present invention gives the corresponding computing formulas for determining the (N-1) first intrinsic parameters and the (N-1) second intrinsic parameters; the intrinsic parameters can be calculated by the corresponding formulas, which provides a feasible basis for realizing the solution and thereby increases its feasibility.
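A small numeric check of the projection relation used above is given below; the intrinsic matrix C and the three-dimensional target point are made-up values, chosen only to show that the scale factor in the formula equals the depth of the point.

import numpy as np

C = np.array([[700.0,   0.0, 320.0],
              [  0.0, 700.0, 240.0],
              [  0.0,   0.0,   1.0]])   # inner parameter measured in advance (assumed values)
X_j = np.array([0.5, -0.2, 4.0])        # target point in three-dimensional space (assumed)

p = C @ X_j                  # equals lambda_1 * [z_1j; 1]
lambda_1 = p[2]              # the scale factor, i.e. the depth of X_j (4.0 here)
z_1j = p[:2] / lambda_1      # pixel coordinates of the target point in the first sub-image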
Optionally, on the basis of the embodiment corresponding to Fig. 6, referring to Figure 10, in another embodiment of the aircraft provided in the embodiments of the present invention, the second acquisition module 404 includes:
a second computing unit 4041, configured to calculate, with the following formula, the minimum variance between the second flight location information and the first flight location information under the preset position constraint condition:
$\min_{X,Y} \sum_{j=1}^{N} \left\| (R_{ext} Y_j + t_{ext}) - X_j \right\|^2$;
where X represents the first flight location information, Y represents the second flight location information, $\min_{X,Y} \sum_{j=1}^{N} \| (R_{ext} Y_j + t_{ext}) - X_j \|^2$ represents the minimum variance between the second flight location information and the first flight location information under the preset position constraint condition, N represents the Nth moment, j represents the jth moment among the N moments, $X_j$ represents the first flight location information corresponding to the jth moment, $Y_j$ represents the second flight location information corresponding to the jth moment, $R_{ext}$ represents the rotation matrix between the first camera and the second camera measured in advance, and $t_{ext}$ represents the translation vector between the first camera and the second camera measured in advance;
a third computing unit 4042, configured to calculate the target flight location information according to the minimum variance calculated by the second computing unit 4041.
Further, in this embodiment of the present invention, the first flight location information and the second flight location information are measured separately with the binocular camera, and a constraint between the flight location information of the two cameras is established; the optimal flight location information of the aircraft, that is, the target flight location information, can be solved through this constraint, thereby reducing the error and improving the positioning accuracy.
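A minimal fusion sketch under the binocular constraint follows; the function name and the closed-form averaging step are assumptions made for illustration (the embodiment itself solves the least-squares problem with g2o), with R_ext and t_ext denoting the pre-measured extrinsics between the two cameras.

import numpy as np

def fuse_positions(X, Y, R_ext, t_ext):
    # X, Y: (N, 3) arrays of the first / second flight location information.
    Y_in_first = (R_ext @ Y.T).T + t_ext   # second camera's estimates in the first camera's frame
    # For each moment j, the minimiser of ||Z - X_j||^2 + ||Z - (R_ext Y_j + t_ext)||^2 is the midpoint:
    return 0.5 * (X + Y_in_first)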
Optionally, on the basis of any one of the embodiments corresponding to Fig. 6 to Figure 10, referring to Figure 11, in another embodiment of the aircraft provided in the embodiments of the present invention, the aircraft further includes:
a third determining module 409A, configured to, after the second acquisition module 404 obtains, according to the (N-1) first flight location information and the (N-1) second flight location information, the target flight location information corresponding to the end moment among the N different moments by using the preset position constraint condition, determine, according to the target flight location information, first sub flight location information corresponding to the (N+1)th moment, the first sub flight location information being one piece of information in the target flight location information;
a fifth acquisition module 409B, configured to obtain second sub flight location information corresponding to the (N+1)th moment by using the preset position constraint condition and the first sub flight location information determined by the third determining module 409A;
a fourth determining module 409C, configured to determine third sub flight location information corresponding to the (N+2)th moment according to the first sub flight location information determined by the third determining module 409A and the first intrinsic parameters;
a sixth acquisition module 409D, configured to obtain fourth sub flight location information corresponding to the (N+2)th moment by using the preset position constraint condition and the third sub flight location information determined by the fourth determining module 409C;
a computing module 409E, configured to calculate a first optimal solution of the first sub flight location information determined by the third determining module 409A and the third sub flight location information determined by the fourth determining module 409C, and calculate a second optimal solution of the second sub flight location information obtained by the fifth acquisition module 409B and the fourth sub flight location information obtained by the sixth acquisition module 409D, the first optimal solution and the second optimal solution being the optimal flight location information of the (N+2)th moment.
Further, in this embodiment of the present invention, after the optimal target flight location information is obtained, the target flight location information and the preset position constraint condition can be used to predict the optimal flight location information over a following period of time. In this way, on the one hand a feasible means of obtaining accurate flight location information is provided, which increases the flexibility of the solution; on the other hand, the subsequently obtained flight location information takes more global considerations into account, which is conducive to determining the location information of the aircraft in the global coordinate system.
Optionally, on the basis of the embodiment corresponding to Figure 11, referring to Figure 12, in another embodiment of the aircraft provided in the embodiments of the present invention, the fourth determining module 409C includes:
a fourth computing unit 409C1, configured to calculate the third sub flight location information corresponding to the (N+2)th moment with the following formula:
$X_{N+2} = R_{N+1} X_{N+1} + t_{N+1}$
where $X_{N+2}$ represents the third sub flight location information corresponding to the (N+2)th moment, $R_{N+1}$ represents the rotation matrix at the (N+1)th moment in the first intrinsic parameters, $t_{N+1}$ represents the translation vector at the (N+1)th moment in the first intrinsic parameters, and $X_{N+1}$ represents the first sub flight location information corresponding to the (N+1)th moment.
Further, in this embodiment of the present invention, the third sub flight location information corresponding to the next moment is calculated from the first sub flight location information corresponding to the previous moment, that is, it can be calculated with the corresponding formula; in this way the practicality and feasibility of the solution are improved.
An embodiment of the present invention further provides another aircraft. As shown in Figure 13, for convenience of description, only the parts related to this embodiment of the present invention are illustrated; for specific technical details that are not disclosed, refer to the method part of the embodiments of the present invention. The aircraft is described by taking an unmanned aerial vehicle as an example:
Figure 13 is a block diagram of part of the structure of the unmanned aerial vehicle related to the aircraft provided in this embodiment of the present invention. Referring to Figure 13, the unmanned aerial vehicle includes components such as a radio frequency (RF) circuit 510, a memory 520, an input unit 530, a display unit 540, a sensor 550, an audio circuit 560, a Wireless Fidelity (WiFi) module 570, a processor 580 and a power supply 590. Persons skilled in the art can understand that the structure shown in Figure 13 does not constitute a limitation on the unmanned aerial vehicle, which may include more or fewer components than illustrated, combine some components, or use different component arrangements.
Each component of the unmanned aerial vehicle is described below in detail with reference to Figure 13:
The RF circuit 510 may be configured to receive and send signals during information transmission and reception; in particular, after receiving downlink information of an aircraft control device, the RF circuit delivers it to the processor 580 for processing, and it also sends designed uplink data to the aircraft control device. Generally, the RF circuit 510 includes but is not limited to an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer and the like. In addition, the RF circuit 510 may also communicate with networks and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, Short Messaging Service (SMS) and the like.
The memory 520 may be configured to store software programs and modules, and the processor 580 performs various function applications and data processing of the unmanned aerial vehicle by running the software programs and modules stored in the memory 520. The memory 520 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function and an image playing function) and the like, and the data storage area may store data (such as audio data and a phone book) created according to the use of the unmanned aerial vehicle and the like. In addition, the memory 520 may include a high-speed random access memory, and may also include a non-volatile memory, for example, at least one magnetic disk storage device, a flash memory device or another volatile solid-state storage device.
The input unit 530 may be configured to receive input digit or character information and generate key signal input related to user settings and function control of the unmanned aerial vehicle. Specifically, the input unit 530 may include a touch panel 531 and other input devices 532. The touch panel 531, also referred to as a touch screen, may collect a touch operation performed by the user on or near it (such as an operation performed by the user on or near the touch panel 531 with a finger, a stylus or any other suitable object or accessory) and drive a corresponding connecting apparatus according to a preset program. Optionally, the touch panel 531 may include two parts: a touch detection apparatus and a touch controller. The touch detection apparatus detects the touch orientation of the user, detects the signal brought by the touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection apparatus, converts it into contact coordinates, sends them to the processor 580, and can receive and execute commands sent by the processor 580. Furthermore, the touch panel 531 may be implemented in multiple types such as resistive, capacitive, infrared and surface acoustic wave types. Besides the touch panel 531, the input unit 530 may further include the other input devices 532. Specifically, the other input devices 532 may include but are not limited to one or more of a physical keyboard, function keys (such as volume control keys and a switch key), a trackball, a mouse, a joystick and the like.
The display unit 540 may be configured to display information input by the user or information provided to the user and various menus of the unmanned aerial vehicle. The display unit 540 may include a display panel 541; optionally, the display panel 541 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) or the like. Further, the touch panel 531 may cover the display panel 541; after detecting a touch operation on or near it, the touch panel 531 transmits the operation to the processor 580 to determine the type of the touch event, and the processor 580 then provides corresponding visual output on the display panel 541 according to the type of the touch event. Although in Figure 13 the touch panel 531 and the display panel 541 implement the input and output functions as two independent components, in some embodiments the touch panel 531 and the display panel 541 may be integrated to implement the input and output functions.
The unmanned aerial vehicle may further include at least one sensor 550, such as an optical sensor, a motion sensor and other sensors. Specifically, the optical sensor may include an ambient light sensor and a proximity sensor, where the ambient light sensor may adjust the brightness of the display panel 541 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 541 and/or the backlight when the unmanned aerial vehicle moves close to an object. As one kind of motion sensor, an accelerometer sensor may detect the magnitude of acceleration in each direction (generally three axes), may detect the magnitude and direction of gravity when static, and may be used in applications for recognizing the UAV attitude (such as landscape/portrait switching, related games and magnetometer attitude calibration) and in vibration-recognition related functions (such as a pedometer and knocking). Other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer and an infrared sensor may also be configured, and details are not described here again.
The audio circuit 560, a loudspeaker 561 and a microphone 562 may provide an audio interface between the user and the unmanned aerial vehicle. The audio circuit 560 may transmit an electrical signal converted from received audio data to the loudspeaker 561, and the loudspeaker 561 converts it into a sound signal for output; on the other hand, the microphone 562 converts a collected sound signal into an electrical signal, the audio circuit 560 receives it and converts it into audio data, and after the audio data is output to the processor 580 for processing, it is sent through the RF circuit 510 to, for example, another device, or the audio data is output to the memory 520 for further processing.
WiFi is a short-range wireless transmission technology. Through the WiFi module 570, the unmanned aerial vehicle may help the user to send and receive e-mails, browse web pages, access streaming media and so on, providing wireless broadband Internet access for the user. Although Figure 13 shows the WiFi module 570, it can be understood that it is not an essential component and may be omitted as needed without changing the essence of the invention.
The processor 580 is the control center of the unmanned aerial vehicle; it connects all parts of the whole unmanned aerial vehicle through various interfaces and lines, and performs various functions of the unmanned aerial vehicle and processes data by running or executing the software programs and/or modules stored in the memory 520 and calling the data stored in the memory 520, so as to monitor the unmanned aerial vehicle as a whole. Optionally, the processor 580 may include one or more processing units; preferably, the processor 580 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, application programs and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 580.
The unmanned aerial vehicle further includes the power supply 590 (such as a battery) that supplies power to all the components. Preferably, the power supply may be logically connected to the processor 580 through a power management system, so that functions such as charging management, discharging management and power consumption management are implemented through the power management system.
Although not shown, the unmanned aerial vehicle may further include a camera, a Bluetooth module and the like, and details are not described here again.
In this embodiment of the present invention, the processor 580 included in the unmanned aerial vehicle further has the following functions (an end-to-end sketch is given after the list):
determining (N-1) first intrinsic parameters according to the N first real-time images, and determining (N-1) second intrinsic parameters according to the N second real-time images;
obtaining first initial positioning information of the first camera and second initial positioning information of the second camera at the initial moment;
determining (N-1) first flight location information corresponding to the (N-1) moments according to the first initial positioning information and the (N-1) first intrinsic parameters, and determining (N-1) second flight location information corresponding to the (N-1) moments according to the second initial positioning information and the (N-1) second intrinsic parameters;
obtaining, according to the (N-1) first flight location information and the (N-1) second flight location information, the target flight location information corresponding to the end moment among the N different moments by using a preset position constraint condition.
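To tie these functions together, the following end-to-end sketch assumes that the per-moment rotation matrices and translation vectors of each camera have already been estimated (for instance with the feature-matching sketch shown earlier); every function and variable name here is illustrative and is not an API of the flight control system.

import numpy as np

def run_localization(R1_list, t1_list, R2_list, t2_list, x0_first, x0_second, R_ext, t_ext):
    # Propagate each camera's flight location information from its initial positioning
    # information, then fuse the two tracks under the preset position constraint condition.
    X, Y = [np.asarray(x0_first, float)], [np.asarray(x0_second, float)]
    for R1, t1, R2, t2 in zip(R1_list, t1_list, R2_list, t2_list):
        X.append(R1 @ X[-1] + np.ravel(t1))    # first flight location information
        Y.append(R2 @ Y[-1] + np.ravel(t2))    # second flight location information
    X, Y = np.array(X), np.array(Y)
    Y_in_first = (R_ext @ Y.T).T + t_ext       # map the second track into the first camera's frame
    return 0.5 * (X + Y_in_first)              # target flight location information (least-squares compromise)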
It may be clearly understood by persons skilled in the art that, for convenience and brevity of description, for the detailed working processes of the foregoing system, apparatus and unit, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described here again.
In the several embodiments provided in this application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the described apparatus embodiments are merely exemplary; the unit division is merely a logical function division and may be another division in actual implementation; multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the displayed or discussed mutual couplings, direct couplings or communication connections may be implemented through some interfaces, and the indirect couplings or communication connections between the apparatuses or units may be electrical, mechanical or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, that is, they may be located in one position or distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each of the units may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or may be implemented in the form of a software functional unit.
When the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of the present invention essentially, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for instructing a computer device (which may be a personal computer, a server, a network device or the like) to perform all or part of the steps of the methods described in the embodiments of the present invention. The foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
The foregoing embodiments are merely intended to describe the technical solutions of the present invention, but not to limit them. Although the present invention is described in detail with reference to the foregoing embodiments, persons of ordinary skill in the art should understand that they may still modify the technical solutions described in the foregoing embodiments or make equivalent replacements to some technical features thereof, and these modifications or replacements do not make the essence of the corresponding technical solutions depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (16)

1. A method for obtaining flight location information, wherein the method is applied to an aircraft, the aircraft comprises a first camera and a second camera, the first camera is configured to obtain N first real-time images corresponding to N different moments, the second camera is configured to obtain N second real-time images corresponding to the N different moments, and N is a positive integer greater than or equal to 2, the method comprising:
determining (N-1) first intrinsic parameters according to the N first real-time images, and determining (N-1) second intrinsic parameters according to the N second real-time images;
obtaining, through the first camera, first initial positioning information of the initial moment among the N different moments, and obtaining, through the second camera, second initial positioning information of the initial moment among the N different moments;
determining (N-1) first flight location information corresponding to the (N-1) moments according to the first initial positioning information and the (N-1) first intrinsic parameters, and determining (N-1) second flight location information corresponding to the (N-1) moments according to the second initial positioning information and the (N-1) second intrinsic parameters;
obtaining, according to the (N-1) first flight location information and the (N-1) second flight location information, target flight location information corresponding to the end moment among the N different moments by using a preset position constraint condition.
2. method according to claim 1, it is characterised in that it is described by first camera obtain it is described it is N number of not First initial positioning information of middle initial time in the same time, and by the second camera obtain it is described it is N number of not in the same time Before second initial positioning information of middle initial time, methods described also includes:
In preset camera distance range, first camera and the second camera are arranged at the aircraft In same horizontal line.
3. method according to claim 1, it is characterised in that described that (N-1) is determined according to N number of first realtime graphic Individual first intrinsic parameters, and before determining (N-1) individual second intrinsic parameters according to N number of second realtime graphic, methods described Also include:
First moment corresponding first subgraph and the second moment corresponding second subgraph are obtained by first camera Picture, wherein, first moment and second moment be it is described it is N number of not in the same time in two moment, first son Image belongs to first realtime graphic with second subgraph;
obtaining, through the second camera, a third sub-image corresponding to the first moment and a fourth sub-image corresponding to the second moment, where the third sub-image and the fourth sub-image belong to the second real-time images;
The first depth information and the second depth information are obtained using based on binocular stereo vision mode measurement.
4. method according to claim 3, it is characterised in that first intrinsic parameters include the first spin matrix and First translation vector, second intrinsic parameters include the second spin matrix and the second translation vector, wherein, first rotation Torque battle array is used to represent the angle change of first camera, and second spin matrix is used to represent the second camera Angle change, first translation vector is used to represent the height change of first camera, second translation vector Height change for representing the second camera.
5. The method according to claim 4, wherein the determining (N-1) first intrinsic parameters according to the N first real-time images and determining (N-1) second intrinsic parameters according to the N second real-time images comprises:
calculating the (N-1) first intrinsic parameters with the following formulas:
$\lambda_1 \begin{bmatrix} z_{1j} \\ 1 \end{bmatrix} = C X_j$;
$\lambda_2 \begin{bmatrix} z_{2j} \\ 1 \end{bmatrix} = C (R_1 X_j + t_1)$;
where $\lambda_1$ represents the first depth information, $\lambda_2$ represents the second depth information, $z_{1j}$ represents the target point $X_j$ of the three-dimensional space in the first sub-image, $z_{2j}$ represents the target point $X_j$ of the three-dimensional space in the second sub-image, C represents the inner parameter measured in advance, $R_1$ represents the first rotation matrix, and $t_1$ represents the first translation vector; and
calculating the (N-1) second intrinsic parameters with the following formulas:
$\lambda_3 \begin{bmatrix} z_{3k} \\ 1 \end{bmatrix} = C Y_k$;
$\lambda_4 \begin{bmatrix} z_{4k} \\ 1 \end{bmatrix} = C (R_2 Y_k + t_2)$;
where $\lambda_3$ represents the third depth information, $\lambda_4$ represents the fourth depth information, $z_{3k}$ represents the target point $Y_k$ of the three-dimensional space in the third sub-image, $z_{4k}$ represents the target point $Y_k$ of the three-dimensional space in the fourth sub-image, $R_2$ represents the second rotation matrix, and $t_2$ represents the second translation vector.
6. The method according to claim 1, wherein the obtaining, according to the (N-1) first flight location information and the (N-1) second flight location information, the target flight location information corresponding to the end moment among the N different moments by using a preset position constraint condition comprises:
calculating, with the following formula, the minimum variance between the second flight location information and the first flight location information under the preset position constraint condition:
$\min_{X,Y} \sum_{j=1}^{N} \left\| (R_{ext} Y_j + t_{ext}) - X_j \right\|^2$;
where X represents the first flight location information, Y represents the second flight location information, $\min_{X,Y} \sum_{j=1}^{N} \| (R_{ext} Y_j + t_{ext}) - X_j \|^2$ represents the minimum variance between the second flight location information and the first flight location information under the preset position constraint condition, N represents the Nth moment, j represents the jth moment among the N moments, $X_j$ represents the first flight location information corresponding to the jth moment, $Y_j$ represents the second flight location information corresponding to the jth moment, $R_{ext}$ represents the rotation matrix between the first camera and the second camera measured in advance, and $t_{ext}$ represents the translation vector between the first camera and the second camera measured in advance; and
calculating the target flight location information according to the minimum variance.
7. method according to any one of claim 1 to 6, it is characterised in that (N-1) individual first described in the basis flies Row location information and (N-1) individual second flight location information, N number of difference is obtained using preset position constraint condition After target flight location information in moment corresponding to finish time, methods described also includes:
According to the target flight location information, the first sub- flight location information corresponding to (N+1) moment, described are determined One sub- flight location information is an information in the target flight location information;
Using the preset position constraint condition and the first sub- flight location information, (N+1) moment institute is obtained Corresponding second sub- flight location information;
According to the described first sub- flight location information and the first intrinsic parameters, the 3rd son corresponding to (N+2) moment is determined Flight location information;
Using the preset position constraint condition and the 3rd sub- flight location information, (N+2) moment institute is obtained Corresponding 4th sub- flight location information;
calculating a first optimal solution of the first sub flight location information and the third sub flight location information, and calculating a second optimal solution of the second sub flight location information and the fourth sub flight location information, the first optimal solution and the second optimal solution being the optimal flight location information of the (N+2)th moment.
8. The method according to claim 7, wherein the determining third sub flight location information corresponding to the (N+2)th moment according to the first sub flight location information and the first intrinsic parameters comprises:
calculating the third sub flight location information corresponding to the (N+2)th moment with the following formula:
$X_{N+2} = R_{N+1} X_{N+1} + t_{N+1}$
where $X_{N+2}$ represents the third sub flight location information corresponding to the (N+2)th moment, $R_{N+1}$ represents the rotation matrix at the (N+1)th moment in the first intrinsic parameters, $t_{N+1}$ represents the translation vector at the (N+1)th moment in the first intrinsic parameters, and $X_{N+1}$ represents the first sub flight location information corresponding to the (N+1)th moment.
9. a kind of aircraft, it is characterised in that the aircraft includes the first camera and second camera, wherein, it is described First camera is used to obtain N number of or not first realtime graphic corresponding in the same time, and the second camera is used to obtain institute N number of not corresponding N number of second realtime graphic in the same time is stated, the N is the positive integer more than or equal to 2, and the aircraft includes:
First determining module, for determining (N-1) individual first intrinsic parameters according to N number of first realtime graphic, and according to institute State N number of second realtime graphic and determine (N-1) individual second intrinsic parameters;
First acquisition module, the first initial positioning information and described second for obtaining the first camera described in initial time Second initial positioning information of camera;
Second determining module, for first initial positioning information and described first obtained according to first acquisition module (N-1) individual first intrinsic parameters that determining module determines, it is determined that (N-1) the individual moment corresponding (N-1) individual first is winged Row location information, and according to second initial positioning information and first determining module of first acquisition module acquisition (N-1) individual second intrinsic parameters for determining, it is determined that (N-1) individual moment corresponding (N-1) individual second flight location information;
Second acquisition module, for according to second determining module determine described in (N-1) individual first flight location information with And (N-1) individual second flight location information, using preset position constraint condition obtain it is described it is N number of not in the same time at the end of Carve corresponding target flight location information.
10. aircraft according to claim 9, it is characterised in that the aircraft also includes:
Setup module, N number of not middle starting in the same time is obtained for first acquisition module by first camera First initial positioning information at moment, and N number of not middle initial time in the same time is obtained by the second camera Before second initial positioning information, in preset camera distance range, by first camera and the second camera It is arranged in the same horizontal line of the aircraft.
11. aircraft according to claim 9, it is characterised in that the aircraft also includes:
3rd acquisition module, (N-1) individual first is determined for first determining module according to N number of first realtime graphic Levy parameter, and before determining (N-1) individual second intrinsic parameters according to N number of second realtime graphic, imaged by described first Head obtains the first moment corresponding first subgraph and the second moment corresponding second subgraph, wherein, first moment With second moment be it is described it is N number of not in the same time in two moment, first subgraph and second subgraph Belong to first realtime graphic;
a fourth acquisition module, configured to obtain a third sub-image corresponding to the first moment and a fourth sub-image corresponding to the second moment, where the third sub-image and the fourth sub-image belong to the second real-time images;
Measurement module, for obtaining the first depth information and the second depth letter using based on binocular stereo vision mode measurement Breath.
12. aircraft according to claim 11, it is characterised in that first intrinsic parameters include the first spin matrix And first translation vector, second intrinsic parameters include the second spin matrix and the second translation vector, wherein, described the One spin matrix is used to represent the angle change of first camera, and second spin matrix is used to represent that described second takes the photograph As the angle change of head, first translation vector is used to represent the height change of first camera, second translation Vector is used to represent the height change of the second camera.
13. The aircraft according to claim 12, wherein the first determining module comprises:
a first computing unit, configured to calculate the (N-1) first intrinsic parameters with the following formulas:
$\lambda_1 \begin{bmatrix} z_{1j} \\ 1 \end{bmatrix} = C X_j$;
$\lambda_2 \begin{bmatrix} z_{2j} \\ 1 \end{bmatrix} = C (R_1 X_j + t_1)$;
where $\lambda_1$ represents the first depth information, $\lambda_2$ represents the second depth information, $z_{1j}$ represents the target point $X_j$ of the three-dimensional space in the first sub-image, $z_{2j}$ represents the target point $X_j$ of the three-dimensional space in the second sub-image, C represents the inner parameter measured in advance, $R_1$ represents the first rotation matrix, and $t_1$ represents the first translation vector;
and to calculate the (N-1) second intrinsic parameters with the following formulas:
$\lambda_3 \begin{bmatrix} z_{3k} \\ 1 \end{bmatrix} = C Y_k$;
$\lambda_4 \begin{bmatrix} z_{4k} \\ 1 \end{bmatrix} = C (R_2 Y_k + t_2)$;
where $\lambda_3$ represents the third depth information, $\lambda_4$ represents the fourth depth information, $z_{3k}$ represents the target point $Y_k$ of the three-dimensional space in the third sub-image, $z_{4k}$ represents the target point $Y_k$ of the three-dimensional space in the fourth sub-image, $R_2$ represents the second rotation matrix, and $t_2$ represents the second translation vector.
14. The aircraft according to claim 9, wherein the second acquisition module comprises:
a second computing unit, configured to calculate, with the following formula, the minimum variance between the second flight location information and the first flight location information under the preset position constraint condition:
$\min_{X,Y} \sum_{j=1}^{N} \left\| (R_{ext} Y_j + t_{ext}) - X_j \right\|^2$;
where X represents the first flight location information, Y represents the second flight location information, $\min_{X,Y} \sum_{j=1}^{N} \| (R_{ext} Y_j + t_{ext}) - X_j \|^2$ represents the minimum variance between the second flight location information and the first flight location information under the preset position constraint condition, N represents the Nth moment, j represents the jth moment among the N moments, $X_j$ represents the first flight location information corresponding to the jth moment, $Y_j$ represents the second flight location information corresponding to the jth moment, $R_{ext}$ represents the rotation matrix between the first camera and the second camera measured in advance, and $t_{ext}$ represents the translation vector between the first camera and the second camera measured in advance;
the target flight location information being calculated according to the minimum variance.
15. aircraft according to any one of claim 9 to 14, it is characterised in that the aircraft also includes:
3rd determining module, for second acquisition module according to (N-1) individual first flight location information and described (N-1) individual second flight location information, using preset position constraint condition obtain it is described it is N number of not in the same time middle finish time institute it is right After the target flight location information answered, according to the target flight location information, first corresponding to (N+1) moment is determined Sub- flight location information, the first sub- flight location information is an information in the target flight location information;
5th acquisition module, for determined using the preset position constraint condition and the 3rd determining module described the One sub- flight location information, obtains the second sub- flight location information corresponding to (N+1) moment;
4th determining module, for the described first sub- flight location information and first determined according to the 3rd determining module Intrinsic parameters, determine the 3rd sub- flight location information corresponding to (N+2) moment;
6th acquisition module, for determined using the preset position constraint condition and the 4th determining module described the Three sub- flight location informations, obtain the 4th sub- flight location information corresponding to (N+2) moment;
a computing module, configured to calculate a first optimal solution of the first sub flight location information determined by the third determining module and the third sub flight location information determined by the fourth determining module, and calculate a second optimal solution of the second sub flight location information obtained by the fifth acquisition module and the fourth sub flight location information obtained by the sixth acquisition module, the first optimal solution and the second optimal solution being the optimal flight location information of the (N+2)th moment.
16. The aircraft according to claim 15, wherein the fourth determining module comprises:
a third computing unit, configured to calculate the third sub flight location information corresponding to the (N+2)th moment with the following formula:
$X_{N+2} = R_{N+1} X_{N+1} + t_{N+1}$
where $X_{N+2}$ represents the third sub flight location information corresponding to the (N+2)th moment, $R_{N+1}$ represents the rotation matrix at the (N+1)th moment in the first intrinsic parameters, $t_{N+1}$ represents the translation vector at the (N+1)th moment in the first intrinsic parameters, and $X_{N+1}$ represents the first sub flight location information corresponding to the (N+1)th moment.
CN201611100259.9A 2016-11-24 2016-12-01 A kind of method and aircraft obtaining flight location information Active CN106767817B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201611100259.9A CN106767817B (en) 2016-12-01 2016-12-01 A kind of method and aircraft obtaining flight location information
PCT/CN2017/111577 WO2018095278A1 (en) 2016-11-24 2017-11-17 Aircraft information acquisition method, apparatus and device
US16/296,073 US10942529B2 (en) 2016-11-24 2019-03-07 Aircraft information acquisition method, apparatus and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611100259.9A CN106767817B (en) 2016-12-01 2016-12-01 A kind of method and aircraft obtaining flight location information

Publications (2)

Publication Number Publication Date
CN106767817A true CN106767817A (en) 2017-05-31
CN106767817B CN106767817B (en) 2019-01-04

Family

ID=58884163

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611100259.9A Active CN106767817B (en) 2016-11-24 2016-12-01 A kind of method and aircraft obtaining flight location information

Country Status (1)

Country Link
CN (1) CN106767817B (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103824298A (en) * 2014-03-10 2014-05-28 北京理工大学 Intelligent body visual and three-dimensional positioning method based on double cameras and intelligent body visual and three-dimensional positioning device based on double cameras
CN104236548A (en) * 2014-09-12 2014-12-24 清华大学 Indoor autonomous navigation method for micro unmanned aerial vehicle
JP2016151566A (en) * 2015-02-19 2016-08-22 株式会社島津製作所 Motion tracker device
CN105346706A (en) * 2015-11-13 2016-02-24 深圳市道通智能航空技术有限公司 Flight device, and flight control system and method
CN105335733A (en) * 2015-11-23 2016-02-17 西安韦德沃德航空科技有限公司 Autonomous landing visual positioning method and system for unmanned aerial vehicle
CN105928493A (en) * 2016-04-05 2016-09-07 王建立 Binocular vision three-dimensional mapping system and method based on UAV
CN105844692A (en) * 2016-04-27 2016-08-10 北京博瑞空间科技发展有限公司 Binocular stereoscopic vision based 3D reconstruction device, method, system and UAV
CN106153008A (en) * 2016-06-17 2016-11-23 北京理工大学 A kind of rotor wing unmanned aerial vehicle objective localization method of view-based access control model

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018095278A1 (en) * 2016-11-24 2018-05-31 腾讯科技(深圳)有限公司 Aircraft information acquisition method, apparatus and device
US10942529B2 (en) 2016-11-24 2021-03-09 Tencent Technology (Shenzhen) Company Limited Aircraft information acquisition method, apparatus and device
CN107270900A (en) * 2017-07-25 2017-10-20 广州阿路比电子科技有限公司 A kind of 6DOF locus and the detecting system and method for posture
WO2019019139A1 (en) * 2017-07-28 2019-01-31 Qualcomm Incorporated Image sensor calibration in a robotic vehicle
CN109099927A (en) * 2018-09-26 2018-12-28 北京永安信通科技股份有限公司 Object positioning method, object positioning device and electronic equipment

Also Published As

Publication number Publication date
CN106767817B (en) 2019-01-04

Similar Documents

Publication Publication Date Title
US10942529B2 (en) Aircraft information acquisition method, apparatus and device
CN106767682A (en) A kind of method and aircraft for obtaining flying height information
CN110147705B (en) Vehicle positioning method based on visual perception and electronic equipment
US11822353B2 (en) Simple multi-sensor calibration
CN207117844U (en) More VR/AR equipment collaborations systems
CN106767817A (en) A kind of method and aircraft for obtaining flight location information
CN108153320A (en) Flight control method and the electronic equipment for supporting this method
CN108139758A (en) Apparatus of transport positioning based on significant characteristics
CN111145339B (en) Image processing method and device, equipment and storage medium
JP2017509939A (en) Method and system for generating a map including sparse and dense mapping information
US11353891B2 (en) Target tracking method and apparatus
CN106444797A (en) Method for controlling aircraft to descend and related device
EP3852065A1 (en) Data processing method and apparatus
CN105892476A (en) Control method and control terminal of aircraft
CN114140528A (en) Data annotation method and device, computer equipment and storage medium
CN107622525A (en) Threedimensional model preparation method, apparatus and system
CN109977845A (en) A kind of drivable region detection method and car-mounted terminal
CN106323242A (en) Space structure detection method and device for unmanned aerial vehicle
CN109190648A (en) Simulated environment generation method, device, mobile terminal and computer-readable storage medium
CN107087441A (en) A kind of information processing method and its device
CN112595728B (en) Road problem determination method and related device
CN106020219B (en) A kind of control method and device of aircraft
CN110148167A (en) A kind of distance measurement method and terminal device
WO2020062024A1 (en) Distance measurement method and device based on unmanned aerial vehicle and unmanned aerial vehicle
CN114429515A (en) Point cloud map construction method, device and equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant