CN112204947A - Image processing device, imaging device, unmanned aircraft, image processing method, and program - Google Patents

Image processing device, imaging device, unmanned aircraft, image processing method, and program

Info

Publication number
CN112204947A
CN112204947A (application CN202080002874.4A)
Authority
CN
China
Prior art keywords
lens
image
positions
exposure period
image processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202080002874.4A
Other languages
Chinese (zh)
Inventor
高宫诚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
SZ DJI Innovations Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Publication of CN112204947A publication Critical patent/CN112204947A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/36Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32Means for focusing
    • G03B13/34Power focusing
    • G03B13/36Autofocus systems
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)
  • Automatic Focus Adjustment (AREA)
  • Focusing (AREA)

Abstract

When a lens included in an optical system, such as a focus lens, is moved, an unintended change in the angle of view may occur due to distortion or the like. An image processing apparatus of the present invention includes: an acquisition section for acquiring information indicating a plurality of positions of a lens that moves within an exposure period of an image pickup element; and a correction section that corrects, based on the plurality of positions, an image acquired by exposing the image pickup element through the lens within the exposure period. An image processing method of the present invention includes: a stage of acquiring information indicating a plurality of positions of a lens that moves within an exposure period of an image pickup element; and a stage of correcting, based on the plurality of positions, an image acquired by exposing the image pickup element through the lens within the exposure period.

Description

Image processing device, imaging device, unmanned aircraft, image processing method, and program
Technical Field
The invention relates to an image processing device, an imaging device, an unmanned aircraft, an image processing method, and a program.
Background
Patent Document 1 describes a technique of correcting aberration of pixels at input screen coordinates using a distortion correction parameter table that stores pixel position coordinate data corresponding to lens parameters.
Patent Document 1: Japanese Patent Application Laid-Open No. 2011-.
Disclosure of Invention
The technical problems to be solved by the invention are as follows:
When a lens included in an optical system, such as a focus lens, is moved, an unintended change in the angle of view may occur due to distortion or the like.
Technical means for solving the problems:
an image processing apparatus according to an aspect of the present invention includes an acquisition section for acquiring information indicating a plurality of positions of a lens that moves within an exposure period of an image pickup element. The image processing apparatus includes a correction section that corrects an image acquired by exposing the image pickup element through the lens for an exposure period based on the plurality of positions.
The correction section may calculate an average position of the lens within the exposure period based on the plurality of positions, and correct distortion of the image based on the calculated average position of the lens.
The correction section may correct distortion of the image based on the average position of the lens and a distortion coefficient corresponding to the lens position.
The correction section may calculate, for each combination of the plurality of positions and a plurality of pixels included in the image pickup element, a pixel position on the image corresponding to that pixel, based on the plurality of positions and the positions of the plurality of pixels, and may correct the image by calculating, for each of the plurality of pixels, an average position of the pixel positions calculated for that pixel.
The image pickup element may be exposed during exposure periods that differ among a plurality of pixel columns included in the image pickup element. The acquisition section may acquire information indicating a plurality of positions of the lens moving within the exposure period of each of the plurality of pixel columns. The correction section may correct the images acquired by the plurality of pixel columns, respectively, based on the plurality of positions within the exposure period of each of the plurality of pixel columns.
The lens may be a focus lens movable in the optical axis direction.
In order to adjust the focus of an optical system including the lens, the lens may be reciprocated in the optical axis direction during an exposure period. The acquisition section may acquire information indicating a plurality of positions of the lens that reciprocates within the exposure period.
In order to adjust the focus of an optical system including the lens, the lens may be moved in one direction in the optical axis direction within an exposure period. The acquisition section may acquire information indicating a plurality of positions of the lens that moves in one direction within the exposure period.
The exposure period may be a period in which the image pickup element is repeatedly exposed in order to acquire each of a plurality of moving picture constituent images constituting a moving picture. The correction section may correct each of the plurality of moving picture constituent images based on a plurality of positions within an exposure period for respectively acquiring each of the plurality of moving picture constituent images.
An image pickup apparatus according to an aspect of the present invention may include the image processing apparatus. The image pickup apparatus may include the image pickup element.
The lens may be a focus lens movable in the optical axis direction. The image pickup apparatus may include a focus adjustment section that adjusts a focus of an optical system including the lens based on the image corrected by the correction section.
An unmanned aerial vehicle according to an aspect of the present invention includes the imaging device and moves.
An image processing method according to an aspect of the present invention includes a stage of acquiring information indicating a plurality of positions of a lens that moves within an exposure period of an image pickup element. The image processing method includes a stage of correcting, based on the plurality of positions, an image acquired by exposing the image pickup element through the lens within the exposure period.
The program according to one aspect of the present invention may be a program for causing a computer to function as the image processing apparatus.
According to an aspect of the present invention, the influence of the variation in the angle of view that occurs as the lens moves can be suppressed.
In addition, the above summary does not list all necessary features of the present invention. Furthermore, sub-combinations of these feature sets may also constitute the invention.
Drawings
Fig. 1 is a diagram showing an example of an external perspective view of an imaging apparatus 100 according to the present embodiment.
Fig. 2 is a diagram showing functional blocks of the imaging apparatus 100 according to the present embodiment.
Fig. 3 is a diagram schematically illustrating the processing performed by the correction unit 140.
Fig. 4 shows an example of distortion coefficients.
Fig. 5 schematically shows one example of the positional relationship of the coordinates on the image sensor 120 and the image coordinates (x, y).
Fig. 6 is a diagram illustrating a method of calculating a position average value of the focus lens 210.
Fig. 7 is a flowchart showing one example of the execution steps of the image capturing apparatus 100.
Fig. 8 is a diagram for explaining another correction method performed by the correction unit 140.
Fig. 9 is a diagram schematically illustrating correction processing when pixel data is read by rolling readout.
Fig. 10 shows an Unmanned Aerial Vehicle (UAV) having the imaging device 100 mounted thereon.
FIG. 11 illustrates one example of a computer 1200 in which aspects of the invention may be embodied, in whole or in part.
Description of the symbols:
10 UAV
20 UAV body
50 universal joint
60 image pickup device
100 image pickup device
102 image pickup part
104 image processing unit
110 image pickup control unit
112 focus adjusting part
120 image sensor
130 memory
140 correcting part
142 acquisition part
160 display part
162 indicating part
200 lens part
210 focusing lens
211 zoom lens
212 lens driving section
213 lens driving section
220 lens control part
222 memory
214, 215 position sensor
300 remote operation device
310 pixel data
311 corrected image
320 pixel data
321 corrected image
810 pixel data
811 corrected pixel data
820 pixel data
821 corrected pixel data
1200 computer
1210 host controller
1212 CPU
1214 RAM
1220 input/output controller
1222 communication interface
1230 ROM
Detailed Description
The present invention will be described below with reference to embodiments of the invention, but the following embodiments do not limit the invention according to the claims. Moreover, all combinations of features described in the embodiments are not necessarily essential to the inventive solution. It will be apparent to those skilled in the art that various changes and modifications can be made in the following embodiments. It is apparent from the description of the claims that the modes to which such changes or improvements are made are included in the technical scope of the present invention.
The claims, the specification, the drawings, and the abstract contain matter subject to copyright protection. The copyright owner has no objection to the facsimile reproduction of these documents by anyone, as they appear in the patent office files or records. However, in all other cases, all copyrights are reserved.
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams, where a block may represent (1) a stage of a process in which an operation is performed or (2) a "section" of a device having the role of performing the operation. Specific stages and "sections" may be implemented by programmable circuits and/or processors. Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. Programmable circuits may include reconfigurable hardware circuits. Reconfigurable hardware circuits may include storage elements such as logical AND, logical OR, logical XOR, logical NAND, logical NOR and other logical operations, flip-flops, registers, field programmable gate arrays (FPGAs), and programmable logic arrays (PLAs).
The computer readable medium may include any tangible device capable of storing instructions to be executed by a suitable device. As a result, a computer readable medium having instructions stored thereon constitutes an article of manufacture including instructions that can be executed to create means for implementing the operations specified in the flowcharts or block diagrams. Examples of the computer readable medium may include an electronic storage medium, a magnetic storage medium, an optical storage medium, an electromagnetic storage medium, a semiconductor storage medium, and the like. More specific examples of the computer readable medium may include a Floppy (registered trademark) disk, a diskette, a hard disk, a random access memory (RAM), a read only memory (ROM), an erasable programmable read only memory (EPROM or flash memory), an electrically erasable programmable read only memory (EEPROM), a static random access memory (SRAM), a compact disc read only memory (CD-ROM), a digital versatile disc (DVD), a Blu-ray (registered trademark) disc, a memory stick, an integrated circuit card, and the like.
Computer readable instructions may include either source code or object code written in any combination of one or more programming languages, including conventional procedural programming languages. Computer readable instructions may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state setting data, or object-oriented programming languages such as Smalltalk, JAVA (registered trademark), and C++, or conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer readable instructions may be provided to a processor or programmable circuitry of a general purpose computer, special purpose computer, or other programmable data processing apparatus, either locally or via a local area network (LAN), a wide area network (WAN) such as the Internet, or the like. The processor or programmable circuitry may execute the computer readable instructions to create means for implementing the operations specified in the flowcharts or block diagrams. Examples of the processor include a computer processor, a processing unit, a microprocessor, a digital signal processor, a controller, a microcontroller, and the like.
Fig. 1 is a diagram showing an example of an external perspective view of an imaging apparatus 100 according to the present embodiment. Fig. 2 is a diagram showing functional blocks of the imaging apparatus 100 according to the present embodiment.
The imaging device 100 includes an imaging section 102 and a lens section 200. The imaging unit 102 includes an image sensor 120, an image processing unit 104, an imaging control unit 110, a memory 130, an instruction unit 162, and a display unit 160.
The image sensor 120 is an image pickup device such as a CCD or a CMOS. The image sensor 120 receives light through an optical system provided in the lens unit 200. The image sensor 120 outputs image data of an optical image imaged by the optical system included in the lens unit 200 to the image processing unit 104.
The imaging control unit 110 and the image processing unit 104 may be constituted by a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like. The memory 130 may be a computer-readable recording medium and may include at least one of SRAM, DRAM, EPROM, EEPROM, USB memory, and other flash memories. The memory 130 stores programs necessary for the imaging control unit 110 to control the image sensor 120 and the like, programs necessary for the image processing unit 104 to execute image processing, and so on. The memory 130 may be provided inside the housing of the image pickup apparatus 100. The memory 130 may be configured to be detachable from the housing of the image pickup apparatus 100.
The instruction unit 162 is a user interface for receiving an instruction from the user to the image pickup apparatus 100. The display unit 160 displays an image captured by the image sensor 120 and processed by the image processing unit 104, various setting information of the imaging apparatus 100, and the like. The display portion 160 may be composed of a touch panel.
The imaging control unit 110 controls the lens unit 200 and the image sensor 120. The imaging control unit 110 controls adjustment of the focal position and focal distance of the optical system included in the lens unit 200. The imaging control unit 110 outputs a control instruction to the lens control unit 220 included in the lens unit 200 based on information indicating an instruction of the user, thereby controlling the lens unit 200.
The lens section 200 includes a focus lens 210, a zoom lens 211, a lens driving section 212, a lens driving section 213, a lens control section 220, a memory 222, a position sensor 214, and a position sensor 215. The focus lens 210 and the zoom lens 211 may include at least one lens. The focus lens 210 and the zoom lens 211 are lenses included in the optical system. At least a part or all of the focus lens 210 and the zoom lens 211 are configured to be movable along the optical axis of the optical system.
The imaging control section 110 includes a focus adjustment section 112. The focus adjustment unit 112 adjusts the focus of the optical system included in the lens unit 200 by controlling the focus lens 210.
The lens portion 200 may be an interchangeable lens that is provided to be attachable to and detachable from the image pickup portion 102. The lens driving unit 212 may include a driving device such as a stepping motor, a DC motor, a coreless motor, or an ultrasonic motor. At least a part or all of the focus lens 210 receives power from a driving device included in the lens driving section 212 via a mechanism member such as a cam ring or a guide shaft, and moves. The lens driving unit 213 may include a driving device such as a stepping motor, a DC motor, a coreless motor, or an ultrasonic motor. At least a part or all of the zoom lens 211 is moved by power from a driving device included in the lens driving section 213 via a mechanism member such as a cam ring or a guide shaft.
The lens control section 220 drives at least one of the lens driving section 212 and the lens driving section 213 in accordance with a lens control instruction from the image pickup section 102, and moves at least one of the focus lens 210 and the zoom lens 211 in the optical axis direction via a mechanism member to perform at least one of a zooming action and a focusing action. The lens control command is, for example, a zoom control command and a focus control command.
The memory 222 stores control values of the focus lens 210 and the zoom lens 211 that are moved via the lens driving section 212 and the lens driving section 213. The memory 222 may include at least one of SRAM, DRAM, EPROM, EEPROM, USB memory, and other flash memories.
The position sensor 214 detects the position of the focus lens 210. The position sensor 215 detects the position of the zoom lens 211. The position sensor 214 and the position sensor 215 may be a Magnetoresistive (MR) sensor or the like.
The imaging control unit 110 outputs a control command to the image sensor 120 based on information indicating an instruction from the user via the instruction unit 162 or the like, and causes the image sensor 120 to execute control including imaging operation control. The image captured by the image sensor 120 is processed by the image processing section 104 and stored in the memory 130.
The image acquired by the image sensor 120 is input to the image processing section 104. The correction section 140 corrects the image acquired by the image sensor 120. The display unit 160 displays the image corrected by the correction unit 140. The memory 130 stores the image corrected by the correcting unit 140. The image corrected by the correction section 140 can be transferred from the memory 130 to a recording medium such as a memory card.
The image processing section 104 includes a correction section 140 and an acquisition section 142. The acquisition section 142 acquires information indicating a plurality of positions of the focus lens 210 that move within the exposure period of the image sensor 120. For example, the acquisition section 142 acquires information indicating a plurality of positions of the focus lens 210 from the focus adjustment section 112. The correction section 140 corrects the image acquired by exposing the image sensor 120 through the focus lens 210 for the exposure period based on the plurality of positions.
For example, the correction section 140 calculates an average position of the focus lens 210 within the exposure period based on the plurality of positions of the focus lens 210 acquired, and corrects distortion of the image based on the calculated average position of the focus lens 210. Specifically, the correction unit 140 corrects distortion of the image based on the average position of the focus lens 210 and a distortion coefficient corresponding to the position of the focus lens 210.
The correction unit 140 may calculate a plurality of pixel positions corresponding to a plurality of pixels on the image for each combination of a plurality of positions of the focus lens 210 and a plurality of pixels of the image sensor 120, based on a plurality of positions of the focus lens 210 and a plurality of pixel positions included in the image sensor 120. The correction section 140 may calculate an average position of the calculated pixel positions for each of the pixels, thereby correcting the image.
The image sensor 120 may be exposed during exposure periods that differ among a plurality of pixel columns included in the image sensor 120. For example, the imaging control unit 110 can read pixel information from the image sensor 120 by rolling readout. At this time, the acquisition section 142 acquires information indicating a plurality of positions of the focus lens 210 moving within the exposure period of each of the plurality of pixel columns. The correction section 140 corrects the images acquired by the plurality of pixel columns, respectively, based on the plurality of positions within the exposure period of each of the plurality of pixel columns.
In order to adjust the focus of the optical system including the focus lens 210, the focus lens 210 is reciprocated in the optical axis direction during the exposure period. For example, the focus adjustment section 112 causes the focus lens 210 to swing (wobbling) within the exposure period. The acquisition section 142 acquires information indicating a plurality of positions of the focus lens 210 that reciprocates within the exposure period. The correction section 140 corrects the image based on the plurality of positions of the focus lens 210 that reciprocates within the exposure period.
In order to adjust the focus of the optical system including the focus lens 210, the focus lens 210 may be moved in one direction in the optical axis direction during the exposure period. The acquisition section 142 acquires information indicating a plurality of positions of the focus lens 210 that moves in one direction within the exposure period. The correction section 140 corrects the image based on a plurality of positions of the focus lens 210 that move in one direction within the exposure period.
The exposure period may be a period in which the image sensor 120 is repeatedly exposed in order to acquire each of a plurality of moving picture constituent images constituting a moving picture, respectively. The correction section 140 may correct each of the plurality of moving picture constituent images based on a plurality of positions within an exposure period for respectively acquiring each of the plurality of moving picture constituent images.
The focus adjustment section 112 adjusts the focus of the optical system including the focus lens 210 based on the image corrected by the correction section 140. For example, the focus adjustment section 112 determines the optical axis direction position of the focus lens 210 based on the contrast value of the image corrected by the correction section 140, and moves the focus lens 210 to the determined position.
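The patent does not define how the contrast value is obtained. Purely as an illustration, a common contrast measure sums squared differences between neighboring pixels of the corrected image, as in the Python sketch below (function and variable names are assumptions, not part of the patent):

```python
import numpy as np

def contrast_value(corrected_image: np.ndarray) -> float:
    """Simple focus (contrast) measure over a corrected image: the sum of squared
    horizontal and vertical pixel differences. Higher values indicate sharper focus."""
    img = corrected_image.astype(np.float64)
    dx = np.diff(img, axis=1)  # horizontal neighbor differences
    dy = np.diff(img, axis=0)  # vertical neighbor differences
    return float(np.sum(dx * dx) + np.sum(dy * dy))
```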
Fig. 3 is a diagram schematically illustrating the processing performed by the correction section 140. Fig. 3 illustrates the processing performed by the correction section 140 when the exposure periods of all the horizontal pixel columns included in the image sensor 120 are the same; specifically, the processing when the image sensor 120 performs continuous shooting with global readout is described. In this embodiment, the image sensor 120 has N (N is a natural number) horizontal pixel columns.
The imaging control unit 110 exposes the horizontal pixel columns 1 to N included in the image sensor 120 at a timing based on the vertical synchronization signal VD. Fig. 3 shows the exposure period from time t1 to time t7 and the exposure period from time t9 to time t15.
The focus adjustment section 112 causes the optical axis direction position of the focus lens 210 to be detected a plurality of times during the period from one vertical synchronization signal VD to the next vertical synchronization signal VD. Specifically, the optical axis direction position of the focus lens 210 is detected at predetermined time intervals and a predetermined number of times, starting from the vertical synchronization signal VD at time t0. The optical axis direction position of the focus lens 210 is then detected at predetermined time intervals and a predetermined number of times, starting from the vertical synchronization signal VD at time t7.
As shown in Fig. 3, the lens control section 220 detects LP1, LP2, LP3, LP4, LP5, LP6, and LP7 in the exposure period from time t1 to time t7. The lens control section 220 then detects LP9, LP10, LP11, LP12, LP13, LP14, and LP15 in the exposure period from time t9 to time t15. LPi (i is a natural number) indicates the optical axis direction position of the focus lens 210.
In accordance with the vertical synchronization signal VD at the time t7, the focus adjustment section 112 acquires lens position information indicating LP1, LP2, LP3, LP4, LP5, LP6, and LP7 detected during the exposure period from the time t1 to the time t7 from the lens control section 220, and outputs the lens position information to the image processing section 104. The acquisition section 142 acquires the lens position information output from the focus adjustment section 112.
The correction section 140 calculates the average value of LP1, LP2, LP3, LP4, LP5, LP6, and LP7 based on the lens position information. The correction section 140 calculates a distortion coefficient based on the average value of LP1 to LP7. The distortion coefficient is information representing distortion determined according to the position of the focus lens 210, and is described in relation to Fig. 4 and the like. The average value is a value calculated by dividing the time-weighted sum of LP1 to LP7 by the time from time t1 to time t7; a specific calculation method is described in relation to Fig. 6 and the like.
The image processing section 104 acquires the pixel data 310 of horizontal pixel columns 1 to N from the image sensor 120 based on the vertical synchronization signal VD at time t7. The correction section 140 corrects the image of the pixel data 310 based on the pixel data 310 acquired from the image sensor 120 and the distortion coefficient corresponding to the position average value of the focus lens 210, and generates a corrected image 311. The corrected image 311 generated by the correction section 140 is stored in the memory 130 and is output to the display unit 160 as an image to be displayed by the display unit 160.
Likewise, in accordance with the vertical synchronization signal VD at time t15, the focus adjustment section 112 acquires from the lens control section 220 the lens position information indicating LP9, LP10, LP11, LP12, LP13, LP14, and LP15 detected during the exposure period from time t9 to time t15, and outputs it to the image processing section 104. The acquisition section 142 acquires the lens position information output from the focus adjustment section 112.
The correction section 140 calculates the average value of LP9, LP10, LP11, LP12, LP13, LP14, and LP15 based on the lens position information. The correction section 140 calculates a distortion coefficient based on the average value of LP9 to LP15.
The image processing section 104 acquires the pixel data 320 of horizontal pixel columns 1 to N from the image sensor 120 based on the vertical synchronization signal VD at time t15. The correction section 140 corrects the image of the pixel data 320 based on the pixel data 320 acquired from the image sensor 120 and the distortion coefficient corresponding to the position average value of the focus lens 210, and generates a corrected image 321. The corrected image 321 generated by the correction section 140 is stored in the memory 130 and is output to the display unit 160 as an image to be displayed by the display unit 160.
The imaging apparatus 100 executes the exposure process, the image data reading process, the image correction process, and the process of displaying on the display unit 160 once as described above based on the vertical synchronization signal VD. The imaging apparatus 100 repeatedly executes the above-described processing for each vertical synchronization signal VD.
Fig. 4 shows an example of distortion coefficients. The horizontal axis of Fig. 4 represents the position of the focus lens 210, and the vertical axis represents the distortion coefficient value. The distortion coefficients include k1, k2, and k3. The memory 130 stores distortion coefficient data representing the dependence of k1, k2, and k3 on the position of the focus lens 210. The distortion coefficient data may be calculated in advance based on the lens design data of the optical system included in the lens section 200 and stored in the memory 130. The distortion coefficient data may also be determined in advance by experiment and stored in the memory 130. The distortion coefficient data of k1, k2, and k3 may be data representing functions having the position of the focus lens 210 as a variable; such functions can be obtained by fitting the precomputed distortion coefficients k1, k2, and k3 to functions having the position of the focus lens 210 as a variable. The distortion coefficient data of k1, k2, and k3 may instead be mapping data that maps the position of the focus lens 210 to k1, k2, and k3.
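As one hedged illustration of how such distortion coefficient data might be held and looked up (the patent does not prescribe an implementation; the table values, names, and choice of linear interpolation below are assumptions), the coefficients can be interpolated from a precomputed table keyed by lens position:

```python
import numpy as np

# Hypothetical distortion coefficient data: sampled focus lens positions and the
# corresponding k1, k2, k3 values, assumed to have been precomputed from lens
# design data or determined by experiment, as described above.
LENS_POSITIONS = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
K_TABLE = {
    "k1": np.array([-0.120, -0.115, -0.108, -0.100, -0.090]),
    "k2": np.array([0.030, 0.029, 0.027, 0.025, 0.022]),
    "k3": np.array([-0.004, -0.004, -0.003, -0.003, -0.002]),
}

def distortion_coefficients(lens_position: float) -> tuple[float, float, float]:
    """Return (k1, k2, k3) for a given focus lens position by linear interpolation."""
    k1 = float(np.interp(lens_position, LENS_POSITIONS, K_TABLE["k1"]))
    k2 = float(np.interp(lens_position, LENS_POSITIONS, K_TABLE["k2"]))
    k3 = float(np.interp(lens_position, LENS_POSITIONS, K_TABLE["k3"]))
    return k1, k2, k3
```

A fitted polynomial in the lens position, or a direct mapping table, would serve equally well, matching the two forms of distortion coefficient data described above.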
Using the distortion coefficients k1, k2, and k3, the relation between the coordinates (x_distorted, y_distorted) on the image sensor 120 and the normalized image coordinates (x, y) can be expressed by the following Formula 1.
[Formula 1]
(The formula image PCTCN2020071175-APPB-000001 is not reproduced in this text.)
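Although the exact expression of Formula 1 is given only as an image in the source, the surrounding description (distortion coefficients k1(LP), k2(LP), k3(LP) relating normalized image coordinates (x, y) to distorted sensor coordinates) suggests the standard radial distortion model; the following is therefore an assumed reconstruction, not the patent's verbatim formula.

```latex
\begin{aligned}
x_{\mathrm{distorted}} &= x\,\bigl(1 + k_1(LP)\,r^2 + k_2(LP)\,r^4 + k_3(LP)\,r^6\bigr) \\
y_{\mathrm{distorted}} &= y\,\bigl(1 + k_1(LP)\,r^2 + k_2(LP)\,r^4 + k_3(LP)\,r^6\bigr) \\
r^2 &= x^2 + y^2
\end{aligned}
```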
The coordinates (x, y) represent coordinates within the corrected image. LP in Formula 1 denotes the position of the focus lens 210. Since the distortion coefficients k1, k2, and k3 depend on LP, they are written as k1(LP), k2(LP), and k3(LP) in Formula 1. Fig. 5 schematically shows the relationship between the coordinates (x_distorted, y_distorted) on the image sensor 120 and the image coordinates (x, y).
The correction section 140 applies the coordinates of each pixel in the pixel data to (x_distorted, y_distorted) of Formula 1, applies k1, k2, and k3 calculated from the position average value of the focus lens 210 and the distortion coefficient data to k1(LP), k2(LP), and k3(LP), respectively, and then calculates the coordinates (x, y) satisfying Formula 1. The correction section 140 then uses the pixel value at the coordinates (x_distorted, y_distorted) in the image data acquired from the image sensor 120 as the pixel value of the coordinates (x, y), thereby generating a corrected image.
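A minimal Python sketch of this correction is given below; it is illustrative only (function and variable names are not from the patent, and it reuses the distortion_coefficients helper and the assumed radial form of Formula 1 sketched above). For simplicity the remapping is done in the gather direction, sampling the sensor image at the distorted coordinates computed for each corrected pixel, whereas the description above assigns sensor pixel values to the computed corrected coordinates; the underlying model is the same.

```python
import numpy as np

def correct_distortion(raw: np.ndarray, lens_positions: list[float]) -> np.ndarray:
    """Generate a corrected image from raw sensor pixel data.

    raw            : sensor image (H x W) exposed through the focus lens.
    lens_positions : focus lens positions LP1..LPn detected within the exposure period.
    """
    h, w = raw.shape
    lp_avg = sum(lens_positions) / len(lens_positions)   # simple average of LP1..LPn
    k1, k2, k3 = distortion_coefficients(lp_avg)         # coefficients for the average position

    # Normalized coordinates (x, y) of the corrected image, centered on the optical axis.
    cx, cy, f = (w - 1) / 2.0, (h - 1) / 2.0, float(max(h, w))
    ys, xs = np.mgrid[0:h, 0:w]
    x = (xs - cx) / f
    y = (ys - cy) / f

    # Assumed radial model: map corrected coordinates to distorted sensor coordinates.
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2**2 + k3 * r2**3
    xd = x * scale * f + cx
    yd = y * scale * f + cy

    # Nearest-neighbor sampling of the sensor image; coordinates outside the sensor become 0.
    xi = np.clip(np.rint(xd).astype(int), 0, w - 1)
    yi = np.clip(np.rint(yd).astype(int), 0, h - 1)
    corrected = raw[yi, xi].copy()
    corrected[(xd < 0) | (xd > w - 1) | (yd < 0) | (yd > h - 1)] = 0
    return corrected
```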
As described in association with fig. 3, 4, 5, and the like, the correction section 140 corrects the image using a distortion coefficient determined according to the average position of the focus lens 210 within the exposure period. Thereby, as shown in fig. 3, the corrected image 311 and the corrected image 321 in which the variation in the angle of view caused by the movement of the focus lens 210 is suppressed can be generated.
Fig. 6 is a diagram for explaining the calculation method of the position average value of the focus lens 210. In Fig. 6, T is the exposure time; that is, T is the time from t1 to t7. Td is the time interval at which the position of the focus lens 210 is detected. In the example shown in Fig. 6, T = 6Td. The position average value of the focus lens 210 is a value calculated as the time average of LP1 to LP7.
Specifically, the position average value of the focus lens 210 is calculated by dividing the weighted sum of LP1 to LP7 by T. The weighted sum of LP1 to LP7 is calculated as Σ αi × LPi, where i is a natural number from 1 to 7.
Here, αi is the weight coefficient of the weighted sum, and the sum of α1 to α7 equals T. αi may be determined as the length of time, within the period from time ti − Td/2 to time ti + Td/2, that falls inside the exposure period. Specifically, α1 and α7 are Td/2, and α2 to α6 are Td. By determining the weight coefficient αi in this way based on the time at which each position of the focus lens 210 is detected, the time average value of the position of the focus lens 210 can be calculated.
Alternatively, the average value may be a value calculated by dividing the sum of LP1 to LP7 by 7. That is, the average value may be the sum of the positions of the focus lens 210 detected within the exposure period divided by the number of position detections of the focus lens 210.
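A short sketch of both averaging options, using the weighting just described (end samples weighted Td/2, interior samples Td); the names and example values are illustrative only:

```python
def average_lens_position(lp_samples: list[float], td: float) -> float:
    """Time-weighted average of lens positions sampled every td seconds:
    the first and last samples carry weight td/2, interior samples weight td,
    so the weights sum to the exposure time T = (len(lp_samples) - 1) * td."""
    n = len(lp_samples)
    if n == 1:
        return lp_samples[0]
    weights = [td / 2.0 if i in (0, n - 1) else td for i in range(n)]
    exposure_time = (n - 1) * td
    return sum(w * lp for w, lp in zip(weights, lp_samples)) / exposure_time

def simple_average_lens_position(lp_samples: list[float]) -> float:
    """Alternative: sum of the detected positions divided by the number of detections."""
    return sum(lp_samples) / len(lp_samples)

# Example with seven samples LP1..LP7 as in Fig. 6 (the values are illustrative):
lp = [100.0, 102.0, 104.0, 103.0, 101.0, 100.0, 99.0]
print(average_lens_position(lp, td=1.0), simple_average_lens_position(lp))
```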
Fig. 7 is a flowchart showing one example of the execution steps of the image capturing apparatus 100. When the vertical synchronization signal VD is triggered, the flowchart starts.
In S600, the focus adjustment section 112 starts detection of the position of the focus lens 210 by outputting a timing signal for detecting the position of the focus lens 210 to the lens control section 220.
In S602, after a predetermined time has elapsed from the vertical synchronization signal VD, the imaging control unit 110 starts exposure of the image sensor 120. In S604, the lens control section 220 detects the position of the focus lens 210 based on the timing signal output from the focus adjustment section 112.
In S606, the imaging control unit 110 determines whether or not to end exposure. For example, when a trigger of a new vertical synchronization signal VD is detected, the imaging control unit 110 determines that exposure is to be terminated. Until it is determined that the exposure is ended, the image sensor 120 is exposed, and the process of S604 is repeatedly executed.
If it is determined in S606 that the exposure is to be ended, in S608, the correction unit 140 acquires lens position information of the focus lens 210 from the lens control unit 220 via the focus adjustment unit 112, and calculates a position average value of the focus lens 210.
In S610, the correction section 140 acquires the pixel data output from the image sensor 120. In S612, the correction section 140 calculates, based on the distortion coefficient data, the image coordinates (x, y) corresponding to the pixel coordinates (x_distorted, y_distorted) of the image sensor 120.
In S614, the correction section 140 generates a corrected image by applying the pixel value at each coordinate of the pixel data output from the image sensor 120 as the pixel value of the corresponding image coordinates calculated in S612.
In S616, the image processing section 104 stores the corrected image generated by the correcting section 140 in the memory 130. After the process of S616 is completed, the process of the present flowchart is ended.
The corrected image stored in the memory 130 in S616 is output to the display unit 160 as, for example, a moving picture constituent image for display. When the correction processing is performed while a moving picture is being shot, the corrected images stored in the memory 130 are recorded in the memory 130 as moving picture constituent images of the moving picture data after the shooting is completed.
Further, the correction section 140 may store the image coordinates calculated in S612 in the memory 130 in association with the position average value of the focus lens 210. When the difference between a newly calculated position average value of the focus lens 210 and an average value stored in the memory 130 is smaller than a predetermined value, the correction section 140 may use the image coordinates stored in the memory 130 without performing the process of S612. This reduces the amount of computation required to calculate the image coordinates.
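One possible form of this reuse is a small cache keyed by the average lens position, as in the sketch below (illustrative names; the threshold and the way the coordinate map is computed are assumptions):

```python
class CoordinateCache:
    """Reuses a previously computed image-coordinate map when the average
    lens position changes by less than a given threshold."""

    def __init__(self, threshold: float):
        self.threshold = threshold
        self.cached_lp_avg = None
        self.cached_coords = None

    def get_or_compute(self, lp_avg, compute_coords):
        # Reuse the stored coordinate map if the average position barely changed.
        if (self.cached_lp_avg is not None
                and abs(lp_avg - self.cached_lp_avg) < self.threshold):
            return self.cached_coords
        self.cached_coords = compute_coords(lp_avg)  # e.g. the (x, y) map computed in S612
        self.cached_lp_avg = lp_avg
        return self.cached_coords
```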
As described in association with fig. 3 to 7 and the like, the correction section 140 corrects the image distortion based on the average value of the positions of the focus lens 210 within the exposure period. It is thereby possible to generate a corrected image in which the influence of the change in the angle of view that occurs due to the movement of the focus lens 210 during the exposure period is suppressed.
Fig. 8 is a diagram for explaining another correction method performed by the correction section 140. Fig. 8 shows an enlarged part of the image coordinates, indicating the pixel coordinates (x_distorted, y_distorted) of the image sensor 120 and the corrected image coordinates (x, y).
The correction section 140 calculates the image coordinates (x, y) based on each of the positions of the focus lens 210 detected within the exposure period. Correction processing by this alternative method is described below using the example shown in Fig. 3.
The correction section 140 calculates the image coordinates (x, y) from each of LP1, LP2, LP3, LP4, LP5, LP6, and LP7 in the exposure period from time t1 to t7. Specifically, the correction section 140 applies the coordinates of a pixel in the pixel data acquired from the image sensor 120 to (x_distorted, y_distorted) of Formula 1, applies k1, k2, and k3 calculated from LP1 and the distortion coefficient data to k1(LP), k2(LP), and k3(LP) of Formula 1, respectively, and calculates the coordinates (x, y) satisfying Formula 1. The calculated coordinates (x, y) are shown in Fig. 8 as (x1, y1).
The same operation is performed for LP2: the correction section 140 applies the coordinates of the pixel in the pixel data to (x_distorted, y_distorted) of Formula 1, applies k1, k2, and k3 calculated from LP2 and the distortion coefficient data to k1(LP), k2(LP), and k3(LP) of Formula 1, respectively, and calculates the coordinates (x, y) satisfying Formula 1. The calculated coordinates (x, y) are shown in Fig. 8 as (x2, y2). The same operation is performed for LP3, LP4, LP5, LP6, and LP7: coordinates (x, y) satisfying Formula 1 are calculated using k1, k2, and k3 obtained from LPi (i is a natural number from 3 to 7) and the distortion coefficient data, giving (xi, yi) corresponding to each LPi.
The correction section 140 applies the pixel value at the coordinates (x_distorted, y_distorted) in the pixel data as the pixel value of the average coordinates (x, y) of (xi, yi) (i is a natural number from 1 to 7). The correction section 140 performs the same processing on each pixel of the pixel data acquired from the image sensor 120 to generate a corrected image.
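A condensed sketch of this alternative method follows (hypothetical names, reusing the distortion_coefficients helper and the assumed radial form of Formula 1 from above; the fixed-point inversion is one possible way to solve Formula 1 for (x, y), not a method prescribed by the patent). Each detected lens position yields its own corrected coordinates for a pixel, and the pixel value is placed at their average:

```python
import numpy as np

def averaged_corrected_coordinates(x_dist: float, y_dist: float,
                                   lens_positions: list[float]) -> tuple[float, float]:
    """Average, over all detected lens positions LPi, of the corrected normalized
    coordinates (xi, yi) obtained for one distorted pixel (x_dist, y_dist)."""
    coords = []
    for lp in lens_positions:
        k1, k2, k3 = distortion_coefficients(lp)  # coefficients for this lens position
        # Solve the assumed radial model for (x, y) by fixed-point iteration.
        x, y = x_dist, y_dist
        for _ in range(10):
            r2 = x * x + y * y
            scale = 1.0 + k1 * r2 + k2 * r2**2 + k3 * r2**3
            x, y = x_dist / scale, y_dist / scale
        coords.append((x, y))
    xs, ys = zip(*coords)
    return float(np.mean(xs)), float(np.mean(ys))
```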
By the correction method described in association with fig. 8, it is also possible to provide a corrected image in which the influence of the change in the angle of view occurring due to the movement of the focus lens 210 during the exposure period is suppressed. Compared with the correction method described in connection with fig. 3 and 5, etc., it is possible to generate a corrected image in which the influence of the change in the angle of view is further reduced.
Fig. 9 is a diagram schematically illustrating the correction processing performed by the correction section 140 when pixel data is read from the image sensor 120 by rolling readout. When rolling readout is performed, the imaging control unit 110 performs exposure while sequentially shifting the exposure start timing of horizontal pixel columns 1 to N, so that the exposure period differs among the pixel columns included in the image sensor 120.
As shown in Fig. 9, the exposure period of horizontal pixel column 1 is from time t1 to time t7, and the exposure period of horizontal pixel column N is from time t7 to time t13. In general, the exposure period of horizontal pixel column i is from time t1 + δT × (i − 1) to time t7 + δT × (i − 1) (i is a natural number from 1 to N). δT is the interval between the exposure start timings of adjacent horizontal pixel columns.
The positions of the focus lens 210 detected within the exposure period of horizontal pixel column 1 are LP1, LP2, LP3, LP4, LP5, LP6, and LP7. The correction section 140 calculates the distortion coefficients k1, k2, and k3 corresponding to the average value of LP1 to LP7, based on the average value of LP1 to LP7 and the distortion coefficient data. The correction section 140 applies the calculated distortion coefficients k1, k2, and k3 to Formula 1, corrects the pixel data 810 of horizontal pixel column 1, and generates corrected pixel data 811.
Likewise, the positions of the focus lens 210 detected within the exposure period of horizontal pixel column 2 are LP2, LP3, LP4, LP5, LP6, and LP7. The correction section 140 calculates the distortion coefficients k1, k2, and k3 corresponding to the average value of LP2 to LP7, based on the average value of LP2 to LP7 and the distortion coefficient data. The correction section 140 applies the calculated distortion coefficients k1, k2, and k3 to Formula 1 and corrects the pixel data of horizontal pixel column 2 to generate corrected pixel data. The average value here is a value calculated by dividing the time-weighted sum of the positions detected within the exposure period of the pixel column by the exposure time; specifically, it is calculated by the method described in relation to Fig. 6 and the like.
The correction section 140 performs the same processing on horizontal pixel columns 3 to N to generate corrected pixel data. For example, the positions of the focus lens 210 detected within the exposure period of horizontal pixel column N are LP7, LP8, LP9, LP10, LP11, LP12, and LP13. The correction section 140 calculates the average value of LP7 to LP13 and, based on that average value and the distortion coefficient data, calculates the distortion coefficients k1, k2, and k3 corresponding to the average value. The correction section 140 applies the calculated distortion coefficients k1, k2, and k3 to Formula 1 and corrects the pixel data 820 of horizontal pixel column N to generate corrected pixel data 821. The correction section 140 then generates one corrected image from the corrected pixel data generated for each horizontal pixel column.
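For the rolling readout case, the per-column handling might look like the sketch below (illustrative only; it assumes the lens position samples are aligned so that the exposure window of pixel column i covers consecutive samples starting at index i − 1, as in Fig. 9, and it reuses the average_lens_position and distortion_coefficients helpers sketched above). Each column's pixel data is then corrected with its own coefficients via Formula 1, exactly as in the global readout case.

```python
def per_column_distortion_coefficients(n_columns: int, lp_samples: list[float],
                                       samples_per_exposure: int, td: float) -> list[tuple[float, float, float]]:
    """Per-column (k1, k2, k3) for rolling readout: each horizontal pixel column's
    coefficients come from the average lens position within its own exposure period."""
    coeffs = []
    for col in range(n_columns):
        # Assumed alignment: column col+1's exposure covers samples col .. col + samples_per_exposure - 1.
        window = lp_samples[col:col + samples_per_exposure]
        lp_avg = average_lens_position(window, td)
        coeffs.append(distortion_coefficients(lp_avg))
    return coeffs
```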
As described above, when rolling readout is performed from the image sensor 120, the exposure periods of the horizontal pixel columns differ from one another, so the detected positions of the focus lens 210 may also differ from pixel column to pixel column. Accordingly, for each group of pixel columns whose positions of the focus lens 210 detected within the exposure period are the same, the correction section 140 applies to Formula 1 the distortion coefficients k1, k2, and k3 based on the average value of the positions of the focus lens 210 within that exposure period, and generates corrected pixel data.
Fig. 9 and the related description above concern correction, based on the position average value of the focus lens 210, of image data read by rolling readout. The correction method described in relation to Fig. 8 and the like may also be applied to the correction of image data read by rolling readout. For example, the correction section 140 may calculate, for each pixel column, the image coordinates corresponding to each position of the focus lens 210, and use the average of the calculated image coordinates as the coordinates at which the pixel value is applied in the corrected image.
As described above, the image pickup apparatus 100 can provide an image in which the influence of the change in the angle of view caused by the movement of the focus lens 210 is suppressed. This effect is particularly pronounced for a small lens device that is small relative to the size of the image sensor. For example, when the optical system is miniaturized relative to the size of the image sensor, the influence of distortion aberration caused by the movement of the focus lens becomes significant. Therefore, if the focus lens wobbles during live view shooting, for example, a change in the angle of view accompanying the wobbling may be observed in the live view image. According to the image pickup apparatus 100, as described above, the influence of the change in the angle of view caused by the movement of the focus lens 210 can be suppressed, so the change in the angle of view caused by the wobbling is less likely to be observed in the live view image. Furthermore, by detecting the contrast value used for focus control from the corrected image, it is possible to prevent the image area subject to contrast value detection from being shifted by the movement of the focus lens 210. Therefore, focus control based on the contrast value can be performed more accurately.
The processing described in relation to the image pickup apparatus 100 of the present embodiment is applicable not only to wobbling of the focus lens 210 but also to the case where the focus lens 210 moves in one direction within the exposure period. The processing described in relation to the image pickup apparatus 100 of the present embodiment is applicable not only to images during live view shooting but also to correction processing when generating moving picture data for recording and correction processing when generating still image data for recording. The processing described in relation to the image pickup apparatus 100 of the present embodiment is also applicable to movement of lenses other than the focus lens 210. That is, the correction section 140 may correct an image acquired by exposing the image sensor 120 during an exposure period based on a plurality of positions of a lens other than the focus lens 210 that moves within the exposure period of the image sensor 120.
The imaging device 100 may be mounted on a mobile body. The imaging device 100 may also be mounted on an unmanned aerial vehicle (UAV) as shown in Fig. 10. The UAV 10 may include a UAV body 20, a gimbal 50, a plurality of imaging devices 60, and the imaging device 100. The gimbal 50 and the imaging device 100 are an example of an imaging system. The UAV 10 is an example of a mobile body propelled by a propulsion section. The concept of a mobile body includes, in addition to UAVs, flying bodies such as airplanes moving in the air, vehicles moving on the ground, ships moving on water, and the like.
The UAV body 20 includes a plurality of rotors. Multiple rotors are one example of a propulsion section. The UAV body 20 flies the UAV10 by controlling the rotation of the plurality of rotors. The UAV body 20 uses, for example, four rotors to fly the UAV 10. The number of rotors is not limited to four. Alternatively, the UAV10 may be a fixed wing aircraft without a rotor.
The imaging apparatus 100 is an imaging camera that captures an object included in a desired imaging range. The gimbal 50 rotatably supports the imaging apparatus 100. The gimbal 50 is an example of a support mechanism. For example, the gimbal 50 rotatably supports the imaging apparatus 100 centered on the pitch axis using an actuator. The gimbal 50 further rotatably supports the imaging apparatus 100 centered on the roll axis and the yaw axis, respectively, using actuators. The gimbal 50 can change the attitude of the imaging apparatus 100 by rotating the imaging apparatus 100 about at least one of the yaw axis, the pitch axis, and the roll axis.
The plurality of imaging devices 60 are sensing cameras for imaging the surroundings of the UAV10 in order to control the flight of the UAV 10. Two cameras 60 may be provided at the nose, i.e., the front, of the UAV 10. Also, two other cameras 60 may be provided on the bottom surface of the UAV 10. The two image pickup devices 60 on the front side may be paired to function as a so-called stereo camera. The two imaging devices 60 on the bottom surface side may also be paired to function as a stereo camera. Three-dimensional spatial data around the UAV10 may be generated from images taken by multiple cameras 60. The number of cameras 60 included in the UAV10 is not limited to four. It is sufficient that the UAV10 includes at least one camera 60. The UAV10 may also include at least one camera 60 at the nose, tail, sides, bottom, and top of the UAV 10. The angle of view that can be set in the image pickup device 60 can be larger than that which can be set in the image pickup device 100. The imaging device 60 may also have a single focus lens or a fisheye lens.
The remote operation device 300 communicates with the UAV10 to remotely operate the UAV 10. The remote operation device 300 may be in wireless communication with the UAV 10. The remote operation device 300 transmits instruction information indicating various instructions related to the movement of the UAV10, such as ascending, descending, accelerating, decelerating, advancing, retreating, and rotating, to the UAV 10. The indication includes, for example, an indication to raise the altitude of the UAV 10. The indication may indicate an altitude at which the UAV10 should be located. The UAV10 moves to be located at an altitude indicated by the instruction received from the remote operation device 300. The indication may include a lift instruction to lift the UAV 10. The UAV10 ascends while receiving the ascending instruction. When the altitude of the UAV10 has reached an upper limit altitude, the UAV10 may be restricted from ascending even if an ascending command is accepted.
FIG. 11 illustrates one example of a computer 1200 that may embody aspects of the present invention in whole or in part. A program installed on the computer 1200 can cause the computer 1200 to function as one or more "sections" of the apparatus according to the embodiment of the present invention, or to execute operations associated with those "sections". Alternatively, the program can cause the computer 1200 to execute those operations or those one or more "sections". The program enables the computer 1200 to execute the processes according to the embodiments of the present invention or the stages of those processes. Such a program may be executed by the CPU 1212 to cause the computer 1200 to perform specified operations associated with some or all of the blocks in the flowcharts and block diagrams described herein.
The computer 1200 of the present embodiment includes a CPU1212 and a RAM1214, which are connected to each other through a host controller 1210. The computer 1200 also includes a communication interface 1222, an input/output unit, which are connected to the host controller 1210 through the input/output controller 1220. Computer 1200 also includes a ROM 1230. The CPU1212 operates according to programs stored in the ROM1230 and the RAM1214, thereby controlling the respective units.
The communication interface 1222 communicates with other electronic devices via a network. A hard disk drive may store the programs and data used by the CPU 1212 in the computer 1200. The ROM 1230 stores a boot program or the like executed by the computer 1200 at the time of activation, and/or a program dependent on the hardware of the computer 1200. The program is provided via a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or via a network. The program is installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and is executed by the CPU 1212. The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above. An apparatus or method may be constituted by realizing operations or processing of information through the use of the computer 1200.
For example, when performing communication between the computer 1200 and an external device, the CPU1212 may execute a communication program loaded in the RAM1214 and instruct the communication interface 1222 to perform communication processing based on processing described in the communication program. Under the control of the CPU1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM1214 or a USB memory, and transmits the read transmission data to a network, or writes reception data received from the network to a reception buffer provided on the recording medium, or the like.
In addition, the CPU1212 can cause the RAM1214 to read all or a necessary portion of a file or a database stored in an external recording medium such as a USB memory, and perform various types of processing on data on the RAM 1214. Then, the CPU1212 may write back the processed data to the external recording medium.
Various types of information such as various types of programs, data, tables, and databases may be stored in the recording medium and subjected to information processing. With respect to data read from the RAM1214, the CPU1212 may execute various types of processing described throughout this disclosure, including various types of operations specified by instruction sequences of programs, information processing, condition judgment, condition branching, unconditional branching, retrieval/replacement of information, and the like, and write the result back into the RAM 1214. Further, the CPU1212 can retrieve information in files, databases, etc., within the recording medium. For example, when a plurality of entries having attribute values of a first attribute respectively associated with attribute values of a second attribute are stored in a recording medium, the CPU1212 may retrieve an entry matching a condition specifying an attribute value of the first attribute from the associated plurality of entries and read an attribute value of the second attribute stored in the entry, thereby acquiring an attribute value of the second attribute associated with the first attribute satisfying a predetermined condition.
The programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium in the vicinity of the computer 1200. In addition, a recording medium such as a hard disk or a RAM provided in a server system connected to a dedicated communication network or the Internet may be used as the computer-readable storage medium, so that the programs can be provided to the computer 1200 via the network.
It should be noted that the execution order of operations, procedures, steps, stages, and the like in the apparatuses, systems, programs, and methods shown in the claims, the specification, and the drawings may be implemented in any order, unless expressly indicated by terms such as "before" or "prior to", and as long as the output of a preceding process is not used in a subsequent process. Even where the operational flow in the claims, the specification, and the drawings is described using terms such as "first" and "next" for convenience, it does not necessarily mean that the flow must be performed in that order.
The present invention has been described above using the embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. It will be apparent to those skilled in the art that various changes and modifications can be made to the above embodiments. It is apparent from the description of the claims that embodiments to which such changes or modifications are made can also be included in the technical scope of the present invention.

Claims (14)

  1. An image processing apparatus characterized by comprising:
    an acquisition section for acquiring information indicating a plurality of positions of a lens that moves within an exposure period of an image pickup element; and
    a correction section that corrects, based on the plurality of positions, an image acquired by exposing the image pickup element through the lens in the exposure period.
  2. The image processing apparatus according to claim 1,
    the correction section calculates an average position of the lens in the exposure period based on the plurality of positions, and corrects distortion of the image based on the calculated average position of the lens.
  3. The image processing apparatus according to claim 2,
    the correction section corrects distortion of the image based on the average position of the lens and a distortion coefficient corresponding to the lens position.
  4. The image processing apparatus according to claim 1,
    the correction section calculates, based on the plurality of positions and the positions of a plurality of pixels included in the image pickup element, a plurality of pixel positions on the image for each combination of the plurality of positions and the plurality of pixels, calculates an average position of the plurality of pixel positions for each of the plurality of pixels, and corrects the image.
  5. The image processing apparatus according to claim 1 or 2,
    the image pickup element is exposed in mutually different exposure periods for each of a plurality of pixel columns included in the image pickup element,
    the acquisition section acquires information indicating a plurality of positions of the lens that moves within the exposure period of each of the plurality of pixel columns,
    the correction section corrects the images acquired by the plurality of pixel columns, respectively, based on the plurality of positions within the exposure period of each of the plurality of pixel columns.
  6. The image processing apparatus according to claim 1 or 2,
    the lens is a focus lens movable in the optical axis direction.
  7. The image processing apparatus according to claim 6,
    in order to adjust a focus of an optical system including the lens, the lens is reciprocated in the optical axis direction during the exposure period,
    the acquisition section acquires information indicating a plurality of positions of the lens that reciprocates within the exposure period.
  8. The image processing apparatus according to claim 6,
    in order to adjust a focus of an optical system including the lens, the lens is moved in one direction along the optical axis during the exposure period,
    the acquisition section acquires information indicating a plurality of positions of the lens that moves in the one direction within the exposure period.
  9. The image processing apparatus according to claim 1 or 2,
    the exposure period is a period in which the image pickup element is repeatedly exposed in order to acquire each of a plurality of moving picture constituent images constituting a moving picture,
    the correction section corrects each of the plurality of moving picture constituent images based on the plurality of positions within the exposure period for acquiring each of the plurality of moving picture constituent images.
  10. An image pickup apparatus, comprising:
    the image processing apparatus according to any one of claims 1 to 9; and
    the image pickup element.
  11. The image pickup apparatus according to claim 10,
    the lens is a focus lens movable in the optical axis direction, and
    the image pickup apparatus further includes a focus adjustment section that adjusts a focus of an optical system including the lens based on the image corrected by the correction section.
  12. An unmanned aerial vehicle that includes the image pickup apparatus according to claim 10 and moves.
  13. An image processing method, comprising:
    a stage of acquiring information indicating a plurality of positions of a lens that moves within an exposure period of an image pickup element; and
    a stage of correcting, based on the plurality of positions, an image acquired by exposing the image pickup element through the lens in the exposure period.
  14. A program for causing a computer to function as the image processing apparatus according to claim 1 or 2.
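
For illustration only, the following Python sketch follows the structure of claims 2 and 3: the lens positions sampled during the exposure period are averaged, a distortion coefficient corresponding to that average position is looked up, and the image is corrected. The sampled positions, the coefficient table, the nearest-neighbour lookup, and the single-coefficient radial distortion model are all assumptions made for this sketch and are not taken from the embodiment.

import numpy as np

# Lens positions sampled while the shutter was open (illustrative values).
lens_positions = [102.0, 104.5, 107.0, 109.5]
average_position = sum(lens_positions) / len(lens_positions)

# Hypothetical table mapping lens positions to a radial distortion
# coefficient k1; a real system would interpolate calibrated data.
position_to_k1 = {100.0: -0.08, 105.0: -0.10, 110.0: -0.12}
nearest = min(position_to_k1, key=lambda p: abs(p - average_position))
k1 = position_to_k1[nearest]

def undistort(image, k1):
    """Correct simple radial distortion by resampling each output pixel
    from its distorted source coordinate (nearest-neighbour for brevity)."""
    h, w = image.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w]
    # Normalised squared radius of each output pixel from the image centre.
    r2 = ((xs - cx) / cx) ** 2 + ((ys - cy) / cy) ** 2
    # Source coordinates in the distorted input under the k1-only model.
    src_x = cx + (xs - cx) * (1 + k1 * r2)
    src_y = cy + (ys - cy) * (1 + k1 * r2)
    src_x = np.clip(np.rint(src_x), 0, w - 1).astype(int)
    src_y = np.clip(np.rint(src_y), 0, h - 1).astype(int)
    return image[src_y, src_x]

corrected = undistort(np.random.rand(480, 640), k1)
print(average_position, k1, corrected.shape)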
CN202080002874.4A 2019-01-31 2020-01-09 Image processing device, imaging device, unmanned aircraft, image processing method, and program Pending CN112204947A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2019015752A JP6746857B2 (en) 2019-01-31 2019-01-31 Image processing device, imaging device, unmanned aerial vehicle, image processing method, and program
JP2019-015752 2019-01-31
PCT/CN2020/071175 WO2020156085A1 (en) 2019-01-31 2020-01-09 Image processing apparatus, photographing apparatus, unmanned aerial aircraft, image processing method and program

Publications (1)

Publication Number Publication Date
CN112204947A true CN112204947A (en) 2021-01-08

Family

ID=71840865

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080002874.4A Pending CN112204947A (en) 2019-01-31 2020-01-09 Image processing device, imaging device, unmanned aircraft, image processing method, and program

Country Status (3)

Country Link
JP (1) JP6746857B2 (en)
CN (1) CN112204947A (en)
WO (1) WO2020156085A1 (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1525396A (en) * 2003-06-20 2004-09-01 北京中星微电子有限公司 A distortion correction method for lens imaging
US20060062557A1 (en) * 2004-09-17 2006-03-23 Canon Kabushiki Kaisha Camera system, image capturing apparatus, and a method of an image capturing apparatus
US20060140503A1 (en) * 2004-12-28 2006-06-29 Tohru Kurata Methods for correcting distortions of image-taking video signals and apparatus for correcting distortions of image-taking video signals
US20080175514A1 (en) * 2003-08-06 2008-07-24 Sony Corporation Image processing apparatus, image processing system, imaging apparatus and image processing method
CN101572777A (en) * 2008-05-02 2009-11-04 奥林巴斯映像株式会社 Filming device and filming method
JP2010141814A (en) * 2008-12-15 2010-06-24 Nikon Corp Image processing apparatus, imaging apparatus, program, and method of image processing
US20140300799A1 (en) * 2013-04-05 2014-10-09 Olympus Corporation Imaging device, method for controlling imaging device, and information storage device
US20180182075A1 (en) * 2016-12-22 2018-06-28 Canon Kabushiki Kaisha Image processing apparatus, image capturing apparatus, method of image processing, and storage medium
US20180217352A1 (en) * 2017-02-02 2018-08-02 Canon Kabushiki Kaisha Focus control apparatus, image capturing apparatus, and focus control method
US20180299748A1 (en) * 2014-05-01 2018-10-18 Canon Kabushiki Kaisha Focus adjustment device, method for controlling the same, and image capture apparatus

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011061444A (en) * 2009-09-09 2011-03-24 Hitachi Information & Communication Engineering Ltd Aberration correction device and method
JP5891440B2 (en) * 2011-05-16 2016-03-23 パナソニックIpマネジメント株式会社 Lens unit and imaging device
WO2013171954A1 (en) * 2012-05-17 2013-11-21 パナソニック株式会社 Imaging device, semiconductor integrated circuit and imaging method
JP5963542B2 (en) * 2012-05-30 2016-08-03 キヤノン株式会社 Image processing apparatus, control method thereof, and program
CN104380709B (en) * 2012-06-22 2018-05-29 富士胶片株式会社 Photographic device and its method of controlling operation
JP6136019B2 (en) * 2014-02-03 2017-05-31 パナソニックIpマネジメント株式会社 Moving image photographing apparatus and focusing method of moving image photographing apparatus
JP6516443B2 (en) * 2014-11-10 2019-05-22 オリンパス株式会社 Camera system
WO2017122348A1 (en) * 2016-01-15 2017-07-20 オリンパス株式会社 Focus control device, endoscope device, and operation method for focus control device
JP7166920B2 (en) * 2016-08-05 2022-11-08 ソニーグループ株式会社 IMAGING DEVICE, SOLID-STATE IMAGE SENSOR, CAMERA MODULE, AND IMAGING METHOD
CN110337805B (en) * 2017-03-01 2021-03-23 富士胶片株式会社 Image pickup apparatus, image processing method, and storage medium


Also Published As

Publication number Publication date
JP6746857B2 (en) 2020-08-26
WO2020156085A1 (en) 2020-08-06
JP2020123897A (en) 2020-08-13

Similar Documents

Publication Publication Date Title
CN108235815B (en) Imaging control device, imaging system, moving object, imaging control method, and medium
WO2019120082A1 (en) Control device, system, control method, and program
JP6874251B2 (en) Devices, imaging devices, moving objects, methods, and programs
CN110337609B (en) Control device, lens device, imaging device, flying object, and control method
JP6503607B2 (en) Imaging control apparatus, imaging apparatus, imaging system, moving object, imaging control method, and program
US20220046177A1 (en) Control device, camera device, movable object, control method, and program
CN112204947A (en) Image processing device, imaging device, unmanned aircraft, image processing method, and program
JP2019083390A (en) Control device, imaging device, mobile body, control method, and program
JP6543859B1 (en) IMAGE PROCESSING DEVICE, IMAGING DEVICE, MOBILE OBJECT, IMAGE PROCESSING METHOD, AND PROGRAM
JP6641574B1 (en) Determination device, moving object, determination method, and program
CN110785997B (en) Control device, imaging device, mobile body, and control method
JP6547984B2 (en) CONTROL DEVICE, IMAGING DEVICE, IMAGING SYSTEM, MOBILE OBJECT, CONTROL METHOD, AND PROGRAM
CN112313941A (en) Control device, imaging device, control method, and program
CN112313943A (en) Device, imaging device, moving object, method, and program
JP6569157B1 (en) Control device, imaging device, moving object, control method, and program
CN112313574B (en) Control device, imaging system, control method, and program
JP7043706B2 (en) Control device, imaging system, control method, and program
JP6961888B1 (en) Devices, imaging devices, mobiles, programs and methods
JP6878738B1 (en) Control devices, imaging systems, moving objects, control methods, and programs
WO2021249245A1 (en) Device, camera device, camera system, and movable member
JP6459012B1 (en) Control device, imaging device, flying object, control method, and program
CN112136315A (en) Control device, imaging device, mobile body, control method, and program
JP2021047367A (en) Control device, imaging device, control method, and program

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
AD01 Patent right deemed abandoned

Effective date of abandoning: 20230228

AD01 Patent right deemed abandoned