CN109076206A - UAV-based stereoscopic imaging method and device - Google Patents
UAV-based stereoscopic imaging method and device
- Publication number
- CN109076206A CN109076206A CN201780018614.4A CN201780018614A CN109076206A CN 109076206 A CN109076206 A CN 109076206A CN 201780018614 A CN201780018614 A CN 201780018614A CN 109076206 A CN109076206 A CN 109076206A
- Authority
- CN
- China
- Prior art keywords
- unmanned plane
- flight path
- unmanned
- target
- stereoscopic imaging
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single 2D image sensor
- H04N13/211—Image signal generators using stereoscopic image cameras using a single 2D image sensor using temporal multiplexing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single 2D image sensor
- H04N13/218—Image signal generators using stereoscopic image cameras using a single 2D image sensor using spatial multiplexing
Abstract
The present invention provides a UAV-based stereoscopic imaging method and device. The method includes: acquiring, by a UAV, at least two target images of a target object from different positions, wherein the target object in the at least two target images at least partially overlaps; and generating a stereoscopic image of the target object according to the at least two target images. By using a UAV in place of the two cameras of a binocular camera to acquire at least two target images of the target object from different positions, a stereoscopic image of the target object is obtained. The distance between acquisition viewpoints is increased in a low-cost manner, the required imaging resolution is reduced, and the reconstruction accuracy is improved.
Description
Technical field
The present invention relates to the field of imaging, and in particular, to a UAV-based stereoscopic imaging method and device.
Background art
Stereoscopic vision is achieved through differences in viewpoint position. For example, there is a certain distance between a person's left and right eyes, so there are slight differences between the images they form, from which the distance of an object can be judged. The distance between acquisition viewpoints is called the baseline; the longer the baseline, the easier it is to obtain stereoscopic vision, or in other words, the more pronounced the stereoscopic effect.
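The relation behind this paragraph can be sketched numerically. The snippet below is an illustrative calculation, not part of the patent: it assumes the standard rectified pinhole stereo model, in which depth Z = f·B/d, where f is the focal length in pixels, B the baseline, and d the disparity between the two images.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point under the rectified pinhole stereo model: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A longer baseline yields a larger disparity for the same depth,
# which is why a long baseline makes the stereo effect easier to obtain.
f = 1000.0            # focal length in pixels (assumed)
Z = 50.0              # true depth in meters (assumed)
for B in (0.1, 1.0, 10.0):   # baselines: eye-like, small rig, UAV-scale
    d = f * B / Z            # disparity observed for a point at depth Z
    print(f"baseline {B:5.1f} m -> disparity {d:6.1f} px")
```

The numbers confirm the text: at eye-like baselines the disparity of a distant point is only a couple of pixels, while a UAV-scale baseline produces a disparity two orders of magnitude larger.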
In the prior art, a binocular camera is used to obtain stereoscopic images. Under the trade-off between image resolution and object-distance accuracy, an inevitable result is that the distance between the two cameras of the binocular camera is relatively large. Due to space limitations, the application of binocular cameras on small UAVs is very limited, while large UAVs are mostly used in industry for related surveying and mapping, which is costly and unfavorable for popularization.
Summary of the invention
The present invention provides a UAV-based stereoscopic imaging method and device.
According to a first aspect of the present invention, a UAV-based stereoscopic imaging method is provided, including: acquiring, by a UAV, at least two target images of a target object from different positions, wherein the target object in the at least two target images at least partially overlaps; and generating a stereoscopic image of the target object according to the at least two target images.
According to a second aspect of the present invention, a UAV-based stereoscopic imaging device is provided, including one or more processors working individually or collectively, the processor being communicatively connected to the UAV. The processor is configured to: acquire, by the UAV, at least two target images of a target object from different positions, wherein the target object in the at least two target images at least partially overlaps; and generate a stereoscopic image of the target object according to the at least two target images.
According to a third aspect of the present invention, a computer-readable storage medium is provided, on which a computer program is stored. When executed by a processor, the program performs the following steps: acquiring, by a UAV, at least two target images of a target object from different positions, wherein the target object in the at least two target images at least partially overlaps; and generating a stereoscopic image of the target object according to the at least two target images.
It can be seen from the above technical solutions provided by the embodiments of the present invention that, by using a UAV in place of the two cameras of a binocular camera to acquire at least two target images of a target object from different positions, a stereoscopic image of the target object is obtained. The distance between acquisition viewpoints is increased in a low-cost manner, the required imaging resolution is reduced, and the reconstruction accuracy is improved.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
Fig. 1 is a schematic structural diagram of a UAV provided by an embodiment of the present invention;
Fig. 2 is a flowchart of a UAV-based stereoscopic imaging method in an embodiment of the present invention;
Fig. 3 is a schematic structural diagram of a UAV-based stereoscopic imaging device in an embodiment of the present invention;
Fig. 4 is a schematic structural diagram of a UAV-based stereoscopic imaging device in an embodiment of the present invention, showing the flight paths of two UAVs;
Fig. 5 is a schematic structural diagram of a UAV-based stereoscopic imaging device in another embodiment of the present invention, showing the flight path of one UAV;
Fig. 6 is a structural block diagram of a UAV-based stereoscopic imaging device in an embodiment of the present invention;
Fig. 7 is a structural block diagram of a UAV-based stereoscopic imaging device in another embodiment of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.

The UAV-based stereoscopic imaging method and device of the present invention are described in detail below with reference to the accompanying drawings. In the case of no conflict, the features in the following embodiments can be combined with each other.
Fig. 1 is a schematic diagram of a UAV 100 provided by an embodiment of the present invention. The UAV 100 may include a carrier 102 and a payload 104. In some embodiments, the payload 104 may be mounted on the UAV 100 directly, without the carrier 102. In this embodiment, the carrier 102 is a gimbal, for example, a two-axis or three-axis gimbal. The payload 104 may be an image acquisition or imaging device (such as a camera, a camcorder, an infrared imaging device, an ultraviolet imaging device, or a similar device) or an audio acquisition device (for example, a parabolic microphone), and may provide static sensing data (such as pictures) or dynamic sensing data (such as video). The payload 104 is mounted on the carrier 102 so that the rotation of the payload 104 can be controlled by the carrier 102.
Further, the UAV 100 may include a power mechanism 106, a sensing system 108, and a communication system 110. The power mechanism 106 may include one or more rotors, propellers, blades, motors, electronic speed controllers, and the like. For example, the rotor of the power mechanism may be a self-tightening rotor, a rotor assembly, or another rotor power unit. The UAV 100 may have one or more power mechanisms, all of which may be of the same type. Optionally, one or more of the power mechanisms may be of a different type. The power mechanism 106 may be mounted on the UAV by suitable means, such as by a support component (such as a drive shaft), and may be mounted at any suitable position on the UAV 100, such as the top, bottom, front, rear, sides, or any combination thereof. The flight of the UAV 100 is controlled by controlling one or more of the power mechanisms 106.
The sensing system 108 may include one or more sensors to sense the spatial orientation, velocity, and/or acceleration of the UAV 100 (such as rotation and translation with respect to up to three degrees of freedom). The one or more sensors may include a GPS sensor, a motion sensor, an inertial sensor, a proximity sensor, or an image sensor. The sensing data provided by the sensing system 108 can be used to track the spatial orientation, velocity, and/or acceleration of the UAV 100 (as described below, using a suitable processing unit and/or control unit). Optionally, the sensing system 108 can be used to collect environmental data of the UAV, such as weather conditions, the positions of potential obstacles, the positions of geographical features, the positions of man-made structures, and the like.
The communication system 110 can communicate with a terminal 112 having a communication system 114 via wireless signals 116. The communication systems 110 and 114 may include any number of transmitters, receivers, and/or transceivers for wireless communication. The communication may be one-way communication, such that data can be sent in only one direction. For example, one-way communication may include only the UAV 100 transmitting data to the terminal 112, or vice versa: one or more transmitters of the communication system 110 may send data to one or more receivers of the communication system 114, or the other way around. Optionally, the communication may be two-way communication, such that data can be transmitted between the UAV 100 and the terminal 112 in both directions. In two-way communication, one or more transmitters of the communication system 110 can send data to one or more receivers of the communication system 114, and vice versa.
In some embodiments, the terminal 112 can provide control data to one or more of the UAV 100, the carrier 102, and the payload 104, and receive information from one or more of the UAV 100, the carrier 102, and the payload 104 (such as the position and/or motion information of the UAV, carrier, or payload, data sensed by the payload, or image data captured by a camera).
In some embodiments, the UAV 100 can communicate with remote devices other than the terminal 112, and the terminal 112 can also communicate with remote devices other than the UAV 100. For example, the UAV and/or the terminal 112 can communicate with another UAV, or with the carrier or payload of another UAV. When needed, the other remote device may be a second terminal or another computing device (such as a computer, a desktop computer, a tablet computer, a smartphone, or another mobile device). The remote device can transmit data to the UAV 100, receive data from the UAV 100, transmit data to the terminal 112, and/or receive data from the terminal 112. Optionally, the remote device may be connected to the Internet or another telecommunications network, so that data received from the UAV 100 and/or the terminal 112 can be uploaded to a website or a server.
In some embodiments, the movement of the UAV 100, the movement of the carrier 102, and the movement of the payload 104 relative to a fixed reference (such as the external environment) and/or relative to one another can be controlled by the terminal 112. The terminal 112 may be a remote control terminal located away from the UAV, the carrier, and/or the payload. The terminal 112 can be located on or affixed to a support platform. Optionally, the terminal 112 can be handheld or wearable. For example, the terminal 112 may include a smartphone, a tablet computer, a desktop computer, a computer, glasses, gloves, a helmet, a microphone, or any combination thereof. The terminal 112 may include a user interface, such as a keyboard, a mouse, a joystick, a touch screen, or a display. Any suitable user input can be used to interact with the terminal 112, such as manually entered instructions, voice control, gesture control, or position control (for example, through the movement, position, or tilt of the terminal 112).
It should be noted that, in the embodiments of the present invention, a stereoscopic image may include an individual stereoscopic image, and may also include continuous stereoscopic video.
Embodiment one
Embodiment one of the present invention provides a UAV-based stereoscopic imaging method. Fig. 2 is a flowchart of the UAV-based stereoscopic imaging method provided by this embodiment. As shown in Fig. 2, the UAV-based stereoscopic imaging method may include the following steps:

Step S201: acquiring, by a UAV 100, at least two target images of a target object from different positions, wherein the target object in the at least two target images at least partially overlaps.

Optionally, the at least two target images may include two or more target images; preferably, two target images of the target object are acquired by the UAV 100 from different positions.
In this embodiment, two target images of the target object are acquired by the UAV 100 from different positions. Specific implementations may include the following two:

First implementation

With reference to Fig. 3 and Fig. 4, there are two UAVs 100 (namely the UAV 110 and the UAV 120 shown in Fig. 3 and Fig. 4), and the different positions include a first position and a second position. Step S201 may include: controlling the two UAVs 100 to be located at the first position and the second position, respectively, and obtaining the target images of the target object P (with spatial coordinates X, Y, Z) captured by the two UAVs 100 at their respective positions. The two target images obtained in this way have good real-time performance.

Second implementation

Referring to Fig. 5, there is one UAV 100 (namely the UAV 130 shown in Fig. 5), and the different positions include a first position and a second position. Step S201 may include: controlling the UAV 100 to be located at the first position and the second position, respectively, and obtaining the target images of the target object P captured by the UAV 100 at the corresponding positions. Compared with the first implementation, reconstructing the stereoscopic image by time-sharing shooting with a single UAV 100 reduces the cost of three-dimensional reconstruction, but the real-time performance of the two obtained target images is inferior to that of the first implementation.
The specific implementation processes of the above two ways of obtaining target images are described separately below.
(1) Mode based on two UAVs 100
In this embodiment, controlling the two UAVs 100 to be located at the first position and the second position, respectively, may include: controlling the two UAVs 100 to fly synchronously, the first position and the second position being the real-time positions of the corresponding UAVs 100, so as to ensure the synchronism of the two obtained target images.

While controlling the two UAVs 100 to fly synchronously, the relative relationship between the two UAVs 100 also needs to be controlled to remain constant, ensuring that the calibration relationship between the two UAVs 100 is unchanged, thereby improving the registration of the two target images as much as possible and facilitating three-dimensional reconstruction. The relative relationship may include at least one of the following: the shooting attitudes of the two UAVs 100, and the positional relationship between the two UAVs 100. The shooting attitudes of the two UAVs 100 may include: the angle between the shooting directions of the two UAVs 100 (that is, the angle between the respective lines connecting the two UAVs 100 to the target object P). In this embodiment, the angle between the shooting directions of the two UAVs 100 is greater than 0° and less than 180°. For example, during the synchronous flight of the two UAVs 100, the angle between their shooting directions can be maintained at 45° or 60° at all times.
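The angle between the two shooting directions, defined above as the angle between each UAV's line to the target object P, can be computed from positions alone. A minimal sketch follows; the coordinate values and function name are illustrative, not from the patent:

```python
import math

def shooting_angle_deg(uav_a, uav_b, target):
    """Angle between the lines from each UAV to the target object P, in degrees."""
    va = [t - a for t, a in zip(target, uav_a)]   # line of sight of UAV A
    vb = [t - b for t, b in zip(target, uav_b)]   # line of sight of UAV B
    dot = sum(x * y for x, y in zip(va, vb))
    na = math.sqrt(sum(x * x for x in va))
    nb = math.sqrt(sum(x * x for x in vb))
    # clamp against rounding before acos
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (na * nb)))))

# Two UAVs at the same height h = 5 m, equidistant from the target:
print(shooting_angle_deg((-5.0, 0.0, 5.0), (5.0, 0.0, 5.0), (0.0, 10.0, 0.0)))
```

A flight controller could evaluate this on each position update and correct course whenever the angle drifts from the chosen set point (for example 45° or 60°).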
Referring to Fig. 4, the positional relationship between the two UAVs 100 may include: the distance w between the two UAVs 100. Compared with the distance between the two cameras of a binocular camera, the distance w between the two UAVs 100 can be greatly increased; for example, it can range from several meters to tens of meters. In this way, the resolution required for three-dimensional reconstruction is substantially reduced.
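The claim that a longer baseline lowers the required resolution can be made quantitative. Under the standard rectified stereo model (an assumption of this sketch, not stated in the patent), the depth error for a given disparity error is approximately dZ ≈ Z²·dd/(f·B), so multiplying the baseline by ten divides the depth error, at fixed pixel accuracy, by ten:

```python
def depth_error_m(depth_m, focal_px, baseline_m, disparity_err_px=1.0):
    """Approximate depth uncertainty for a disparity error: dZ ~ Z^2 * dd / (f * B)."""
    return depth_m ** 2 * disparity_err_px / (focal_px * baseline_m)

# Same target depth and camera; baselines from binocular-rig scale to UAV scale:
for B in (0.5, 5.0, 50.0):
    err = depth_error_m(depth_m=100.0, focal_px=1000.0, baseline_m=B)
    print(f"baseline {B:5.1f} m -> depth error ~{err:6.2f} m per pixel of disparity error")
```

Equivalently, to reach a target depth accuracy, a longer baseline tolerates a proportionally larger disparity error, which is the sense in which the required imaging resolution is reduced.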
Further, referring to Fig. 4, while controlling the two UAVs 100 to fly synchronously, the real-time heights h of the two UAVs 100 also need to be controlled to be equal, and the real-time distances s from the two UAVs 100 to the target object P need to be controlled to be equal, ensuring the registration of the two target images and thereby facilitating three-dimensional reconstruction. In this embodiment, controlling the real-time heights h of the two UAVs 100 to be equal means that, at the same moment, the heights h of the two UAVs 100 are controlled to be equal, for example, both 5 meters, so that the registration of the two target images is high, which is beneficial to three-dimensional reconstruction. At different moments, the heights h of the two UAVs 100 may be equal or unequal, and can be selected according to the imaging requirements.

Further, obtaining the target images of the target object P captured by the two UAVs 100 at their respective positions may include: obtaining the target images of the target object P captured synchronously by the two UAVs 100 at their respective positions, which further ensures the synchronism of the two obtained target images and facilitates the generation of smooth stereoscopic video.
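Synchronous capture can be enforced in software by pairing only frames whose capture timestamps agree within a tolerance. The following is a hedged sketch of such a check; the 5 ms tolerance, data layout, and function name are our assumptions, not specified by the patent:

```python
def pick_synchronized_pairs(frames_a, frames_b, max_skew_s=0.005):
    """Pair frames from two UAVs whose timestamps differ by at most max_skew_s.

    frames_a / frames_b: lists of (timestamp_seconds, frame_id), sorted by time.
    Returns a list of (frame_id_a, frame_id_b) pairs.
    """
    pairs, j = [], 0
    for t_a, id_a in frames_a:
        # advance j to the frame in frames_b closest in time to t_a
        while j + 1 < len(frames_b) and abs(frames_b[j + 1][0] - t_a) <= abs(frames_b[j][0] - t_a):
            j += 1
        if frames_b and abs(frames_b[j][0] - t_a) <= max_skew_s:
            pairs.append((id_a, frames_b[j][1]))
    return pairs

a = [(0.000, "A0"), (0.033, "A1"), (0.066, "A2")]
b = [(0.001, "B0"), (0.040, "B1"), (0.067, "B2")]
print(pick_synchronized_pairs(a, b))  # A0/B0 and A2/B2 fall within the 5 ms tolerance
```

Dropping unpaired frames (A1/B1 above) trades a little frame rate for the registration needed to produce smooth stereoscopic video.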
(2) Mode based on one UAV 100
In this embodiment, the relative relationship of the UAV 100 between the first position and the second position is constant, so that the registration of the two target images is improved as much as possible, facilitating three-dimensional reconstruction. The relative relationship includes at least one of the following: the shooting attitudes of the UAV 100 at the first position and the second position, and the positional relationship between the first position and the second position of the UAV 100. The shooting attitudes of the UAV 100 at the first position and the second position may include: the angle between the shooting directions of the UAV 100 at the first position and the second position (that is, the angle between the line connecting the UAV 100 at the first position to the target object P and the line connecting the UAV 100 at the second position to the target object P). In this embodiment, the angle between the shooting directions of the UAV 100 at the first position and the second position is greater than 0° and less than 180°; for example, it can be 45° or 60°. For reconstructing stereoscopic video, the angle between the shooting directions of the UAV 100 at the first position and the second position needs to be maintained at the same value at all times.
Referring to Fig. 5, the positional relationship between the first position and the second position of the UAV 100 may include: the distance w of the UAV 100 between the first position and the second position. Compared with the distance between the two cameras of a binocular camera, controlling the one UAV 100 to be located at the first position and the second position at different moments allows the distance w between the first position and the second position to be greatly increased; for example, it can range from several meters to tens of meters. In this way, the resolution required for three-dimensional reconstruction is substantially reduced.
The distance from the first position to the target object P (s in Fig. 5) is equal to the distance from the second position to the target object P (also s in Fig. 5), and the height of the first position (h in Fig. 5) is also equal to the height of the second position (also h in Fig. 5), ensuring the registration of the two target images and thereby facilitating three-dimensional reconstruction. In this embodiment, the distances from the first position and the second position to the target object P can each be 5 meters, or other values.
Further, the first position and the second position may belong to the same flight path, or may belong to different flight paths, so as to meet different needs. For example, in one embodiment, the first position and the second position belong to the same flight path. Optionally, the UAV 100 flies to the first position and hovers for a first preset duration, during which the image acquisition or imaging device on the UAV 100 is aimed at the target object P for shooting, thereby obtaining one of the target images. Then, the UAV 100 flies to the second position and hovers for a second preset duration, during which the image acquisition or imaging device on the UAV 100 is aimed at the target object P for shooting, thereby obtaining the other target image, ensuring the clarity of the two target images. The first preset duration and the second preset duration can be set as needed, and may be equal or unequal. In addition, this embodiment does not limit the flight path of the UAV 100; preferably, the flight path of the UAV 100 is a circle at a constant height, which improves the efficiency of three-dimensional reconstruction. In this embodiment, any two points on the circular flight path can be taken as the first position and the second position, respectively. Further, the UAV 100 flies laterally along the circular flight path, which facilitates the reconstruction of continuous stereoscopic video.
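The preferred circular flight path at a constant height can be sketched as a set of waypoints, any two of which then serve as the first and second positions. A minimal illustration follows; the parameter names and values are ours, not the patent's:

```python
import math

def circular_waypoints(center_xy, radius_m, height_m, n_points):
    """Waypoints (x, y, z) evenly spaced on a circle at a constant height."""
    cx, cy = center_xy
    return [
        (cx + radius_m * math.cos(2 * math.pi * k / n_points),
         cy + radius_m * math.sin(2 * math.pi * k / n_points),
         height_m)
        for k in range(n_points)
    ]

# Circle of radius 5 m around the target, at the 5 m height used as an example
# above; any two waypoints can be taken as the first and second positions.
wps = circular_waypoints((0.0, 0.0), radius_m=5.0, height_m=5.0, n_points=8)
first_position, second_position = wps[0], wps[1]
print(first_position, second_position)
```

Because every waypoint sits at the same height and the same distance from the center, any pair automatically satisfies the equal-h and equal-s conditions described above.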
In another embodiment, the first position and the second position are located on different flight paths. In this embodiment, the different flight paths include a first flight path and a second flight path, and controlling the UAV 100 to be located at the first position and the second position, respectively, may include: first controlling the UAV 100 to fly successively along the first flight path and the second flight path, then choosing the first position from the first flight path and the second position from the second flight path. The stereoscopic image is reconstructed by time-sharing shooting with a single UAV, reducing the cost of three-dimensional reconstruction.

Further, controlling the UAV 100 to fly successively along the first flight path and the second flight path may include: controlling the UAV 100 to fly horizontally along the first flight path and then the second flight path, which facilitates the reconstruction of continuous stereoscopic video.

Choosing the first position from the first flight path and the second position from the second flight path may include: choosing multiple first positions from the first flight path and multiple second positions from the second flight path, the multiple first positions corresponding one-to-one with the multiple second positions. Obtaining the target images of the target object P captured by the UAV 100 at the corresponding positions may include: obtaining the target images of the target object P captured by the UAV 100 at the corresponding first positions and second positions, which facilitates the generation of continuous stereoscopic video of the target object P.
Step S202: generating a stereoscopic image of the target object P according to the at least two target images.

Specifically, step S202 includes: fusing the at least two target images to generate the stereoscopic image of the target object P. In this embodiment, any fusion method in the prior art can be used to fuse the at least two target images.

When there are more than two target images, two of the images can first be fused to generate a stereoscopic image of the target object P, then the currently generated stereoscopic image is fused with one of the remaining target images, and so on, until all the images have been fused and a single stereoscopic image is obtained.
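The iterative pairwise fusion just described can be sketched generically: fuse the first two images, then fold each remaining image into the current result. The patent leaves the fusion operation itself to the prior art, so the sketch below only fixes the iteration order and takes a stand-in fusion function that averages pixel values; a real system would substitute a prior-art stereo fusion method:

```python
from functools import reduce

def fuse_pair(img_a, img_b):
    """Stand-in fusion: average overlapping pixel values. A real system
    would replace this with a prior-art stereo fusion method."""
    return [[(a + b) / 2 for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(img_a, img_b)]

def fuse_all(images):
    """Fuse two images first, then fold each remaining image into the result."""
    if len(images) < 2:
        raise ValueError("need at least two target images")
    return reduce(fuse_pair, images)

# Three tiny 2x2 "images" represented as nested lists of pixel values:
imgs = [[[0, 0], [0, 0]], [[4, 4], [4, 4]], [[8, 8], [8, 8]]]
print(fuse_all(imgs))  # each step averages the current result with the next image
```

Note that this left-fold order matches the text exactly: the result after each step stands in for the "currently generated stereoscopic image" that is fused with the next remaining target image.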
In the embodiments of the present invention, by using a UAV 100 in place of the two cameras of a binocular camera to acquire at least two target images of a target object P from different positions, a stereoscopic image of the target object P is obtained. The distance between acquisition viewpoints is increased in a low-cost manner, the required imaging resolution is reduced, and the reconstruction accuracy is improved.
Further, after step S202, the UAV-based stereoscopic imaging method may also include: generating stereoscopic video of the target object from the stereoscopic images generated from the target images at different moments, improving the user experience.
Further, after step S202, the UAV-based stereoscopic imaging method may also include: sending the stereoscopic image to a display device 400, so that the stereoscopic image can be displayed by the display device 400 and intuitively presented to the user. The display device 400 may be a smartphone, a tablet computer, a desktop computer, a computer, video glasses, or the like.
Embodiment two
With reference to Fig. 6 and Fig. 7, Embodiment two of the present invention provides a UAV-based stereoscopic imaging device. The device may include a UAV 100 and a processor 200 (for example, a single-core or multi-core processor), the processor 200 being communicatively connected to the UAV 100.
The processor 200 may be a central processing unit (CPU). The processor 200 may further include a hardware chip. The hardware chip may be an application-specific integrated circuit (ASIC), a programmable logic device (PLD), or a combination thereof. The PLD may be a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), generic array logic (GAL), or any combination thereof.
There may be one or more processors 200, working individually or collectively. The processor 200 is configured to: acquire, by the UAV 100, at least two target images of a target object P from different positions, wherein the target object P in the at least two target images at least partially overlaps; and generate a stereoscopic image of the target object P according to the at least two target images.
In this embodiment of the present invention, the two cameras of a binocular camera are replaced by the UAV 100, which acquires at least two target images of the target object P from different positions, thereby obtaining a stereoscopic image of the target object P. In this low-cost manner, the distance between the image acquisition points is increased, which reduces the demand on imaging resolution and improves reconstruction accuracy.
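The accuracy benefit of a wide baseline can be checked with the pinhole stereo relation Z = f·w/d: a disparity error of Δd pixels causes a first-order depth error of roughly Z²·Δd/(f·w), which shrinks as the distance w between the acquisition points grows. A minimal sketch with hypothetical numbers (none of the values come from the patent):

```python
def depth(f_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole stereo depth: Z = f * w / d."""
    return f_px * baseline_m / disparity_px

def depth_error(f_px: float, baseline_m: float, z_m: float,
                disp_err_px: float = 1.0) -> float:
    """First-order depth error caused by a disparity error:
    dZ ~= Z^2 * delta_d / (f * w)."""
    return z_m ** 2 * disp_err_px / (f_px * baseline_m)

f = 1000.0  # focal length in pixels (hypothetical)
z = 100.0   # range to the target in metres (hypothetical)
narrow = depth_error(f, baseline_m=0.1, z_m=z)   # rigid camera-rig baseline
wide = depth_error(f, baseline_m=10.0, z_m=z)    # baseline between two UAVs
# a wider baseline gives a smaller depth error at the same image resolution
assert wide < narrow
```

With the same sensor resolution, moving from a 0.1 m camera-rig baseline to a 10 m two-UAV baseline reduces the first-order depth error at 100 m range from about 100 m to about 1 m, which is the effect the paragraph above describes.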
In one embodiment, the processor 200 is configured to fuse the at least two target images to generate the stereoscopic image of the target object P.
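The embodiment does not specify the fusion algorithm. One common way to fuse a left/right pair into a single stereoscopic image is a red-cyan anaglyph; the NumPy sketch below illustrates the idea (the anaglyph choice, array shapes, and channel mapping are assumptions, not the patent's method):

```python
import numpy as np

def anaglyph(left_gray: np.ndarray, right_gray: np.ndarray) -> np.ndarray:
    """Fuse two grayscale views into one red-cyan anaglyph of shape (H, W, 3).

    Left view -> red channel; right view -> green and blue channels.
    """
    if left_gray.shape != right_gray.shape:
        raise ValueError("views must have the same size (at least partly overlapping)")
    h, w = left_gray.shape
    out = np.empty((h, w, 3), dtype=np.uint8)
    out[..., 0] = left_gray   # R
    out[..., 1] = right_gray  # G
    out[..., 2] = right_gray  # B
    return out

# Hypothetical 4x4 views standing in for the two target images
left = np.full((4, 4), 200, dtype=np.uint8)
right = np.full((4, 4), 50, dtype=np.uint8)
img = anaglyph(left, right)
```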
In one embodiment, there are two UAVs 100, both communicatively connected to the processor 200, and the different positions include a first position and a second position. The processor 200 is configured to control the two UAVs 100 to be located at the first position and the second position, respectively, and to acquire the target images of the target object P captured by the two UAVs 100 at their respective positions.
In one embodiment, the processor 200 is configured to control the two UAVs 100 to fly synchronously, the first position and the second position being the real-time positions of the corresponding UAVs 100.
In one embodiment, while controlling the two UAVs 100 to fly synchronously, the processor 200 is further configured to keep the relative relationship between the two UAVs 100 constant.
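The text does not say how the constant relative relationship is enforced. One plausible scheme is leader-follower control, where the second UAV is commanded to the first UAV's real-time position plus a fixed offset; the sketch below uses hypothetical values:

```python
import numpy as np

def follower_setpoint(leader_pos: np.ndarray, offset: np.ndarray) -> np.ndarray:
    """Position command that keeps the second UAV at a fixed offset
    from the first while both fly along the leader's track."""
    return leader_pos + offset

# Hypothetical values: a 5 m lateral baseline at 20 m altitude.
offset = np.array([0.0, 5.0, 0.0])
leader_track = [np.array([float(t), 0.0, 20.0]) for t in range(3)]
follower_track = [follower_setpoint(p, offset) for p in leader_track]

# The relative relationship (here, the baseline vector) stays constant.
assert all(np.allclose(f - p, offset) for f, p in zip(follower_track, leader_track))
```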
In one embodiment, the relative relationship includes at least one of the following: the shooting attitudes of the two UAVs 100, and the positional relationship between the two UAVs 100.
In one embodiment, the shooting attitudes of the two UAVs 100 include the angle between the shooting directions of the two UAVs 100.
In one embodiment, the positional relationship between the two UAVs 100 includes the distance between the two UAVs 100 (w in Fig. 4).
In one embodiment, while controlling the two UAVs 100 to fly synchronously, the processor 200 is further configured to keep the real-time heights of the two UAVs 100 (h in Fig. 4) equal, and to keep the real-time distances from the two UAVs 100 to the target object P (s in Fig. 4) equal.
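Equal heights h and equal slant ranges s to the target, together with the baseline w, pin down a symmetric pair of shooting positions on a circle around the target. A geometric sketch (function name and target-at-origin convention are assumptions; the symbols s, h, w follow Fig. 4):

```python
import math

def symmetric_viewpoints(target, s, h, w):
    """Two shooting positions at equal height h above the target's level,
    each at slant range s from the target, separated by baseline w."""
    r = math.sqrt(s ** 2 - h ** 2)    # horizontal radius of the circle around the target
    half = math.asin(w / (2.0 * r))   # half the angular separation on that circle
    tx, ty, tz = target
    p1 = (tx + r * math.cos(half), ty + r * math.sin(half), tz + h)
    p2 = (tx + r * math.cos(half), ty - r * math.sin(half), tz + h)
    return p1, p2

# Hypothetical geometry: 100 m slant range, 60 m height, 10 m baseline.
p1, p2 = symmetric_viewpoints((0.0, 0.0, 0.0), s=100.0, h=60.0, w=10.0)
```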
In one embodiment, the processor 200 is configured to acquire the target images of the target object P captured synchronously by the two UAVs 100 at their respective positions.
In one embodiment, there is one UAV 100, and the different positions include a first position and a second position. The processor 200 is configured to control the UAV 100 to be located at the first position and the second position, and to acquire the target images of the target object P captured by the UAV 100 at the corresponding positions.
In one embodiment, the first position and the second position belong to the same flight path.
In one embodiment, the first position and the second position are located on different flight paths.
In one embodiment, the different flight paths include a first flight path and a second flight path. The processor 200 is configured to control the UAV 100 to fly along the first flight path and then along the second flight path, to choose the first position from the first flight path, and to choose the second position from the second flight path.
In one embodiment, the processor 200 is configured to control the UAV 100 to fly horizontally along the first flight path and then along the second flight path.
In one embodiment, the processor 200 is configured to choose multiple first positions from the first flight path and multiple second positions from the second flight path, the multiple first positions corresponding one-to-one with the multiple second positions, and to acquire the target images of the target object P captured by the UAV 100 at each corresponding first position and second position.
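For the single-UAV case, the one-to-one correspondence between first and second positions can be realized by flying two parallel horizontal tracks offset by the baseline and pairing the i-th waypoint of each track. A sketch under those assumptions (the waypoint layout is illustrative, not taken from the patent):

```python
def paired_waypoints(start, length, n, altitude, baseline):
    """Waypoints for one UAV flying two parallel horizontal tracks.

    First track: n points along +x starting at `start`; second track: the
    same points shifted by `baseline` in y. Waypoint i on the first track
    pairs with waypoint i on the second track, giving n stereo pairs with
    a constant baseline.
    """
    x0, y0 = start
    step = length / (n - 1)
    first = [(x0 + i * step, y0, altitude) for i in range(n)]
    second = [(x, y + baseline, z) for (x, y, z) in first]
    return first, second

# Hypothetical mission: 40 m tracks, 5 stereo pairs, 30 m altitude, 8 m baseline.
first, second = paired_waypoints((0.0, 0.0), length=40.0, n=5,
                                 altitude=30.0, baseline=8.0)
# each pair differs only by the baseline offset, so the relative
# relationship between the two shooting positions is constant
assert all(b[1] - a[1] == 8.0 and a[0] == b[0] for a, b in zip(first, second))
```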
In one embodiment, the relative relationship of the UAV 100 between the first position and the second position remains constant.
In one embodiment, the relative relationship includes at least one of the following: the shooting attitudes of the UAV 100 at the first position and the second position, and the positional relationship of the UAV 100 between the first position and the second position.
In one embodiment, the shooting attitudes of the UAV 100 at the first position and the second position include the angle between the shooting directions of the UAV 100 at the first position and the second position.
In one embodiment, the positional relationship of the UAV 100 between the first position and the second position includes the distance between the first position and the second position (w in Fig. 5).
In one embodiment, the distance from the first position to the target object P and the distance from the second position to the target object P (both s in Fig. 5) are equal, and the height of the first position and the height of the second position (both h in Fig. 5) are also equal.
In one embodiment, after generating the stereoscopic image of the target object P according to the two target images, the processor 200 is further configured to generate a stereoscopic video according to the stereoscopic images generated from the target images at different moments.
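Generating a stereoscopic video then amounts to assembling the per-moment stereo pairs into frames in some stereoscopic format; a side-by-side layout is one common choice (assumed here, not specified by the patent):

```python
import numpy as np

def side_by_side_video(lefts, rights):
    """Stack per-moment stereo pairs into side-by-side video frames.

    lefts, rights: equal-length lists of (H, W) frames captured at the
    same moments; returns an array of shape (T, H, 2*W).
    """
    if len(lefts) != len(rights):
        raise ValueError("need one right frame per left frame")
    return np.stack([np.hstack((l, r)) for l, r in zip(lefts, rights)])

# Hypothetical 2x2 frames at three moments
T, H, W = 3, 2, 2
lefts = [np.full((H, W), t, dtype=np.uint8) for t in range(T)]
rights = [np.full((H, W), 10 + t, dtype=np.uint8) for t in range(T)]
video = side_by_side_video(lefts, rights)
```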
In one embodiment, the processor 200 is communicatively connected to a display device 400, and after generating the stereoscopic image of the target object P, the processor 200 sends the stereoscopic image to the display device 400.
When there are two UAVs 100, the processor 200 can be the flight controller of one of the UAVs 100 or the combination of the flight controllers of the two UAVs 100; alternatively, the processor 200 can be an independently arranged controller.
When there is one UAV 100, the processor 200 can be the flight controller of the UAV 100, or an independently arranged controller.
Further, the stereoscopic imaging apparatus based on the UAV may also include a storage device. The storage device may include volatile memory, such as random-access memory (RAM); the storage device may also include non-volatile memory, such as flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the storage device may also include a combination of the above kinds of memory. Optionally, the storage device is configured to store program instructions. The processor 200 can call the program instructions to implement the related methods of Embodiment One above.
It should be noted that, for the specific implementation of the processor 200 of this embodiment of the present invention, reference can be made to the description of the corresponding content in each of the foregoing embodiments, which will not be repeated here.
Embodiment three
Embodiment Three of the present invention provides a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, the steps of the UAV-based stereoscopic imaging method described in Embodiment One above are performed.
As for the apparatus embodiments, since they essentially correspond to the method embodiments, reference can be made to the descriptions of the method embodiments for the relevant parts. The apparatus embodiments described above are merely illustrative: the units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units, that is, they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purposes of the solutions of the embodiments, which those of ordinary skill in the art can understand and implement without creative effort.
Reference to "a specific example", "some examples", or the like means that a specific feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic representations of these terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Any process or method description in a flowchart or otherwise described herein can be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present invention includes additional implementations in which functions may be executed out of the order shown or discussed, including substantially concurrently or in the reverse order depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of the present invention belong.
The logic and/or steps represented in the flowcharts or otherwise described herein can, for example, be considered an ordered list of executable instructions for implementing logical functions, and can be embodied in any computer-readable medium for use by, or in combination with, an instruction execution system, apparatus, or device (such as a computer-based system, a system including a processor, or another system that can fetch and execute instructions from an instruction execution system, apparatus, or device). For the purposes of this specification, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transmit a program for use by, or in combination with, an instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random-access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). The computer-readable medium can even be paper or another suitable medium on which the program can be printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it if necessary, and can then be stored in a computer memory.
It should be understood that each part of the present invention can be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, multiple steps or methods can be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or a combination of the following techniques known in the art can be used: a discrete logic circuit having logic gates for implementing logical functions on data signals, an application-specific integrated circuit having suitable combinational logic gates, a programmable gate array (PGA), a field-programmable gate array (FPGA), and so on.
Those skilled in the art can understand that all or part of the steps carried by the above method embodiments can be completed by instructing relevant hardware through a program; the program can be stored in a computer-readable storage medium, and when executed, performs one of the steps of the method embodiments or a combination thereof.
In addition, the functional units in the embodiments of the present invention can be integrated in one processing module, or each unit can exist alone physically, or two or more units can be integrated in one module. The integrated module can be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it can also be stored in a computer-readable storage medium.
The storage medium mentioned above can be a read-only memory, a magnetic disk, an optical disc, or the like. Although the embodiments of the present invention have been shown and described above, it should be understood that the above embodiments are exemplary and should not be construed as limiting the present invention; those skilled in the art can change, modify, replace, and vary the above embodiments within the scope of the present invention.
Claims (47)
1. A stereoscopic imaging method based on an unmanned aerial vehicle (UAV), comprising:
acquiring, by a UAV, at least two target images of a target object from different positions, wherein the target object in the at least two target images at least partially overlaps; and
generating a stereoscopic image of the target object according to the at least two target images.
2. The method according to claim 1, wherein generating the stereoscopic image of the target object according to the at least two target images comprises:
fusing the at least two target images to generate the stereoscopic image of the target object.
3. The method according to claim 1, wherein there are two UAVs, and the different positions comprise a first position and a second position;
acquiring, by the UAV, the two target images of the target object from the different positions comprises:
controlling the two UAVs to be located at the first position and the second position, respectively; and
acquiring the target images of the target object captured by the two UAVs at their respective positions.
4. The method according to claim 3, wherein controlling the two UAVs to be located at the first position and the second position comprises:
controlling the two UAVs to fly synchronously, the first position and the second position being the real-time positions of the corresponding UAVs.
5. The method according to claim 4, wherein, while the two UAVs are controlled to fly synchronously, the method further comprises:
keeping the relative relationship between the two UAVs constant.
6. The method according to claim 5, wherein the relative relationship comprises at least one of the following: the shooting attitudes of the two UAVs, and the positional relationship between the two UAVs.
7. The method according to claim 6, wherein the shooting attitudes of the two UAVs comprise the angle between the shooting directions of the two UAVs.
8. The method according to claim 6, wherein the positional relationship between the two UAVs comprises the distance between the two UAVs.
9. The method according to claim 4, wherein, while the two UAVs are controlled to fly synchronously, the method further comprises:
keeping the real-time heights of the two UAVs equal; and
keeping the real-time distances from the two UAVs to the target object equal.
10. The method according to claim 4, wherein acquiring the target images of the target object captured by the two UAVs at their respective positions comprises:
acquiring the target images of the target object captured synchronously by the two UAVs at their respective positions.
11. The method according to claim 1, wherein there is one UAV, and the different positions comprise a first position and a second position;
acquiring, by the UAV, the at least two target images of the target object from the different positions comprises:
controlling the UAV to be located at the first position and the second position; and
acquiring the target images of the target object captured by the UAV at the corresponding positions.
12. The method according to claim 11, wherein the first position and the second position belong to the same flight path.
13. The method according to claim 11, wherein the first position and the second position are located on different flight paths.
14. The method according to claim 13, wherein the different flight paths comprise a first flight path and a second flight path, and controlling the UAV to be located at the first position and the second position comprises:
controlling the UAV to fly along the first flight path and then along the second flight path; and
choosing the first position from the first flight path, and choosing the second position from the second flight path.
15. The method according to claim 14, wherein controlling the UAV to fly along the first flight path and then along the second flight path comprises:
controlling the UAV to fly horizontally along the first flight path and then along the second flight path.
16. The method according to claim 14, wherein choosing the first position from the first flight path and choosing the second position from the second flight path comprises:
choosing multiple first positions from the first flight path, and choosing multiple second positions from the second flight path, the multiple first positions corresponding one-to-one with the multiple second positions;
and acquiring the target images of the target object captured by the UAV at the corresponding positions comprises:
acquiring the target images of the target object captured by the UAV at each corresponding first position and second position.
17. The method according to any one of claims 12 to 16, wherein the relative relationship of the UAV between the first position and the second position remains constant.
18. The method according to claim 17, wherein the relative relationship comprises at least one of the following: the shooting attitudes of the UAV at the first position and the second position, and the positional relationship of the UAV between the first position and the second position.
19. The method according to claim 18, wherein the shooting attitudes of the UAV at the first position and the second position comprise the angle between the shooting directions of the UAV at the first position and the second position.
20. The method according to claim 18, wherein the positional relationship of the UAV between the first position and the second position comprises the distance between the first position and the second position.
21. The method according to any one of claims 12 to 16, wherein the distance from the first position to the target object is equal to the distance from the second position to the target object, and the height of the first position is equal to the height of the second position.
22. The method according to claim 1, wherein, after generating the stereoscopic image of the target object according to the two target images, the method further comprises:
generating a stereoscopic video according to the stereoscopic images generated from the target images at different moments.
23. The method according to claim 1, wherein, after generating the stereoscopic image of the target object, the method further comprises:
sending the stereoscopic image to a display device.
24. A stereoscopic imaging apparatus based on an unmanned aerial vehicle (UAV), comprising a UAV and one or more processors working individually or jointly, the processor being communicatively connected to the UAV;
the processor is configured to:
acquire, by the UAV, at least two target images of a target object from different positions, wherein the target object in the at least two target images at least partially overlaps; and
generate a stereoscopic image of the target object according to the at least two target images.
25. The stereoscopic imaging apparatus according to claim 24, wherein the processor is configured to fuse the at least two target images to generate the stereoscopic image of the target object.
26. The stereoscopic imaging apparatus according to claim 24, wherein there are two UAVs, both communicatively connected to the processor;
the different positions comprise a first position and a second position; and
the processor is configured to control the two UAVs to be located at the first position and the second position, respectively, and to acquire the target images of the target object captured by the two UAVs at their respective positions.
27. The stereoscopic imaging apparatus according to claim 26, wherein the processor is configured to control the two UAVs to fly synchronously, the first position and the second position being the real-time positions of the corresponding UAVs.
28. The stereoscopic imaging apparatus according to claim 27, wherein, while controlling the two UAVs to fly synchronously, the processor is further configured to keep the relative relationship between the two UAVs constant.
29. The stereoscopic imaging apparatus according to claim 28, wherein the relative relationship comprises at least one of the following: the shooting attitudes of the two UAVs, and the positional relationship between the two UAVs.
30. The stereoscopic imaging apparatus according to claim 29, wherein the shooting attitudes of the two UAVs comprise the angle between the shooting directions of the two UAVs.
31. The stereoscopic imaging apparatus according to claim 29, wherein the positional relationship between the two UAVs comprises the distance between the two UAVs.
32. The stereoscopic imaging apparatus according to claim 27, wherein, while controlling the two UAVs to fly synchronously, the processor is further configured to:
keep the real-time heights of the two UAVs equal; and
keep the real-time distances from the two UAVs to the target object equal.
33. The stereoscopic imaging apparatus according to claim 27, wherein the processor is configured to acquire the target images of the target object captured synchronously by the two UAVs at their respective positions.
34. The stereoscopic imaging apparatus according to claim 24, wherein there is one UAV, and the different positions comprise a first position and a second position;
the processor is configured to:
control the UAV to be located at the first position and the second position; and
acquire the target images of the target object captured by the UAV at the corresponding positions.
35. The stereoscopic imaging apparatus according to claim 34, wherein the first position and the second position belong to the same flight path.
36. The stereoscopic imaging apparatus according to claim 34, wherein the first position and the second position are located on different flight paths.
37. The stereoscopic imaging apparatus according to claim 36, wherein the different flight paths comprise a first flight path and a second flight path, and the processor is configured to:
control the UAV to fly along the first flight path and then along the second flight path; and
choose the first position from the first flight path, and choose the second position from the second flight path.
38. The stereoscopic imaging apparatus according to claim 37, wherein the processor is configured to control the UAV to fly horizontally along the first flight path and then along the second flight path.
39. The stereoscopic imaging apparatus according to claim 37, wherein the processor is configured to choose multiple first positions from the first flight path and multiple second positions from the second flight path, the multiple first positions corresponding one-to-one with the multiple second positions, and to acquire the target images of the target object captured by the UAV at each corresponding first position and second position.
40. The stereoscopic imaging apparatus according to any one of claims 35 to 39, wherein the relative relationship of the UAV between the first position and the second position remains constant.
41. The stereoscopic imaging apparatus according to claim 40, wherein the relative relationship comprises at least one of the following: the shooting attitudes of the UAV at the first position and the second position, and the positional relationship of the UAV between the first position and the second position.
42. The stereoscopic imaging apparatus according to claim 41, wherein the shooting attitudes of the UAV at the first position and the second position comprise the angle between the shooting directions of the UAV at the first position and the second position.
43. The stereoscopic imaging apparatus according to claim 41, wherein the positional relationship of the UAV between the first position and the second position comprises the distance between the first position and the second position.
44. The stereoscopic imaging apparatus according to any one of claims 35 to 39, wherein the distance from the first position to the target object is equal to the distance from the second position to the target object, and the height of the first position is equal to the height of the second position.
45. The stereoscopic imaging apparatus according to claim 24, wherein, after generating the stereoscopic image of the target object according to the two target images, the processor is further configured to generate a stereoscopic video according to the stereoscopic images generated from the target images at different moments.
46. The stereoscopic imaging apparatus according to claim 24, wherein the processor is communicatively connected to a display device, and after generating the stereoscopic image of the target object, the processor sends the stereoscopic image to the display device.
47. A computer-readable storage medium on which a computer program is stored, wherein, when executed by a processor, the program performs the steps of the UAV-based stereoscopic imaging method according to any one of claims 1 to 23.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110004601.XA CN112672133A (en) | 2017-12-22 | 2017-12-22 | Three-dimensional imaging method and device based on unmanned aerial vehicle and computer readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2017/118034 WO2019119426A1 (en) | 2017-12-22 | 2017-12-22 | Stereoscopic imaging method and apparatus based on unmanned aerial vehicle |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110004601.XA Division CN112672133A (en) | 2017-12-22 | 2017-12-22 | Three-dimensional imaging method and device based on unmanned aerial vehicle and computer readable storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109076206A true CN109076206A (en) | 2018-12-21 |
CN109076206B CN109076206B (en) | 2021-01-26 |
Family
ID=64812363
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201780018614.4A Expired - Fee Related CN109076206B (en) | 2017-12-22 | 2017-12-22 | Three-dimensional imaging method and device based on unmanned aerial vehicle |
CN202110004601.XA Withdrawn CN112672133A (en) | 2017-12-22 | 2017-12-22 | Three-dimensional imaging method and device based on unmanned aerial vehicle and computer readable storage medium |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110004601.XA Withdrawn CN112672133A (en) | 2017-12-22 | 2017-12-22 | Three-dimensional imaging method and device based on unmanned aerial vehicle and computer readable storage medium |
Country Status (2)
Country | Link |
---|---|
CN (2) | CN109076206B (en) |
WO (1) | WO2019119426A1 (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110068306A (en) * | 2019-04-19 | 2019-07-30 | 弈酷高科技(深圳)有限公司 | A kind of unmanned plane inspection photometry system and method |
CN111757084A (en) * | 2020-07-30 | 2020-10-09 | 北京博清科技有限公司 | Acquisition method and acquisition device for three-dimensional image and readable storage medium |
CN112684456A (en) * | 2020-12-22 | 2021-04-20 | 安徽配隆天环保科技有限公司 | Unmanned aerial vehicle supersound solid imaging model system |
CN113542718A (en) * | 2021-07-20 | 2021-10-22 | 翁均明 | Unmanned aerial vehicle stereo photography method |
CN113608550A (en) * | 2021-08-06 | 2021-11-05 | 寰宇鹏翔航空科技(深圳)有限公司 | Unmanned aerial vehicle data acquisition control method, unmanned aerial vehicle and storage medium |
CN113542718B (en) * | 2021-07-20 | 2024-06-28 | 翁均明 | Unmanned aerial vehicle stereoscopic photographing method |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105225241A (en) * | 2015-09-25 | 2016-01-06 | 广州极飞电子科技有限公司 | The acquisition methods of unmanned plane depth image and unmanned plane |
CN105245846A (en) * | 2015-10-12 | 2016-01-13 | 西安斯凯智能科技有限公司 | Multi-unmanned aerial vehicle cooperative tracking type shooting system and shooting method |
CN106162145A (en) * | 2016-07-26 | 2016-11-23 | 北京奇虎科技有限公司 | Stereoscopic image generation method based on unmanned plane, device |
CN106296821A (en) * | 2016-08-19 | 2017-01-04 | 刘建国 | Multi-view angle three-dimensional method for reconstructing based on unmanned plane and system |
CN106331684A (en) * | 2016-08-30 | 2017-01-11 | 长江三峡勘测研究院有限公司(武汉) | Three-dimensional image obtaining method based on small unmanned aerial vehicle video recording in engineering geological survey |
CN107124606A (en) * | 2017-05-31 | 2017-09-01 | 东莞市妙音广告传媒有限公司 | The long-range image pickup method of digital video advertisement based on unmanned plane |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8350894B2 (en) * | 2009-04-17 | 2013-01-08 | The Boeing Company | System and method for stereoscopic imaging |
IL208910A0 (en) * | 2010-10-24 | 2011-02-28 | Rafael Advanced Defense Sys | Tracking and identification of a moving object from a moving sensor using a 3d model |
CN105391939B (en) * | 2015-11-04 | 2017-09-29 | 腾讯科技(深圳)有限公司 | Unmanned plane filming control method and device, unmanned plane image pickup method and unmanned plane |
CN205507553U (en) * | 2016-04-07 | 2016-08-24 | 吉林禾熙科技开发有限公司 | Three-dimensional scene data acquisition control device of unmanned aerial vehicle
CN106485736B (en) * | 2016-10-27 | 2022-04-12 | 深圳市道通智能航空技术股份有限公司 | Panoramic visual tracking method for unmanned aerial vehicle, unmanned aerial vehicle and control terminal |
- 2017-12-22 CN CN201780018614.4A patent/CN109076206B/en not_active Expired - Fee Related
- 2017-12-22 WO PCT/CN2017/118034 patent/WO2019119426A1/en active Application Filing
- 2017-12-22 CN CN202110004601.XA patent/CN112672133A/en not_active Withdrawn
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105225241A (en) * | 2015-09-25 | 2016-01-06 | 广州极飞电子科技有限公司 | The acquisition methods of unmanned plane depth image and unmanned plane |
CN105245846A (en) * | 2015-10-12 | 2016-01-13 | 西安斯凯智能科技有限公司 | Multi-unmanned aerial vehicle cooperative tracking type shooting system and shooting method |
CN106162145A (en) * | 2016-07-26 | 2016-11-23 | 北京奇虎科技有限公司 | Stereoscopic image generation method based on unmanned plane, device |
CN106296821A (en) * | 2016-08-19 | 2017-01-04 | 刘建国 | Multi-view angle three-dimensional method for reconstructing based on unmanned plane and system |
CN106331684A (en) * | 2016-08-30 | 2017-01-11 | 长江三峡勘测研究院有限公司(武汉) | Three-dimensional image obtaining method based on small unmanned aerial vehicle video recording in engineering geological survey |
CN107124606A (en) * | 2017-05-31 | 2017-09-01 | 东莞市妙音广告传媒有限公司 | The long-range image pickup method of digital video advertisement based on unmanned plane |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110068306A (en) * | 2019-04-19 | 2019-07-30 | 弈酷高科技(深圳)有限公司 | A kind of unmanned plane inspection photometry system and method |
CN111757084A (en) * | 2020-07-30 | 2020-10-09 | 北京博清科技有限公司 | Acquisition method and acquisition device for three-dimensional image and readable storage medium |
CN112684456A (en) * | 2020-12-22 | 2021-04-20 | 安徽配隆天环保科技有限公司 | Unmanned aerial vehicle supersound solid imaging model system |
CN112684456B (en) * | 2020-12-22 | 2024-05-17 | 安徽配隆天环保科技有限公司 | Unmanned aerial vehicle ultrasonic three-dimensional imaging model system |
CN113542718A (en) * | 2021-07-20 | 2021-10-22 | 翁均明 | Unmanned aerial vehicle stereo photography method |
CN113542718B (en) * | 2021-07-20 | 2024-06-28 | 翁均明 | Unmanned aerial vehicle stereoscopic photographing method |
CN113608550A (en) * | 2021-08-06 | 2021-11-05 | 寰宇鹏翔航空科技(深圳)有限公司 | Unmanned aerial vehicle data acquisition control method, unmanned aerial vehicle and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN112672133A (en) | 2021-04-16 |
CN109076206B (en) | 2021-01-26 |
WO2019119426A1 (en) | 2019-06-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104699247B (en) | A kind of virtual reality interactive system and method based on machine vision | |
US11649052B2 (en) | System and method for providing autonomous photography and videography | |
CN106029501B (en) | UAV panoramic imagery | |
CN109076206A (en) | Stereoscopic imaging method and device based on unmanned plane | |
CN109791405A (en) | System and method for controlling the image captured by imaging device | |
CN108605098A (en) | system and method for rolling shutter correction | |
CN108156441A (en) | Visual is stablized | |
CN105898346A (en) | Control method, electronic equipment and control system | |
US11212437B2 (en) | Immersive capture and review | |
CN108139799A (en) | The system and method for region of interest (ROI) processing image data based on user | |
CN108292489A (en) | Information processing unit and image generating method | |
CN108351653A (en) | System and method for UAV flight controls | |
CN107168352A (en) | Target tracking system and method | |
CN108351654A (en) | System and method for visual target tracking | |
CN107079141A (en) | Image mosaic for 3 D video | |
CN108351649A (en) | System and method for UAV interactive instructions and control | |
CN107850436A (en) | Merged using the sensor of inertial sensor and imaging sensor | |
CN107850901A (en) | Merged using the sensor of inertial sensor and imaging sensor | |
CN104729484B (en) | The three-dimensional boat of unmanned plane various visual angles takes the photograph the method that device and its focal length determine | |
CN104536579A (en) | Interactive three-dimensional scenery and digital image high-speed fusing processing system and method | |
CN108885487B (en) | Gesture control method of wearable system and wearable system | |
WO2020014987A1 (en) | Mobile robot control method and apparatus, device, and storage medium | |
CN107850899A (en) | Merged using the sensor of inertial sensor and imaging sensor | |
CN109076173A (en) | Image output generation method, equipment and unmanned plane | |
CN106791360A (en) | Generate the method and device of panoramic video |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 2021-01-26 ||