WO2020156085A1 - Image processing device, imaging device, unmanned aerial vehicle, image processing method, and program - Google Patents

Image processing device, imaging device, unmanned aerial vehicle, image processing method, and program

Info

Publication number
WO2020156085A1
Authority
WO
WIPO (PCT)
Prior art keywords
lens
image
positions
image processing
exposure period
Prior art date
Application number
PCT/CN2020/071175
Other languages
English (en)
French (fr)
Chinese (zh)
Inventor
高宫诚
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to CN202080002874.4A (CN112204947A)
Publication of WO2020156085A1

Links

Images

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/36Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32Means for focusing
    • G03B13/34Power focusing
    • G03B13/36Autofocus systems
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Definitions

  • the present invention relates to an image processing device, an imaging device, an unmanned aerial vehicle, an image processing method, and a program.
  • Patent Document 1 describes a technique that corrects the aberration of input screen-coordinate pixels using a distortion correction parameter table, which stores coordinate data for each pixel position corresponding to the lens parameters.
  • Patent Document 1 Japanese Patent Laid-Open No. 2011-61444.
  • when a lens included in the optical system, such as the focus lens, is moved, unintended changes in the angle of view may occur due to distortion or the like.
  • An image processing device includes an acquisition unit for acquiring information representing a plurality of positions of the lens that moves within the exposure period of the imaging element.
  • the image processing device includes a correction unit that corrects an image obtained by exposing the image sensor through the lens during an exposure period based on a plurality of positions.
  • the correction unit may calculate the average position of the lens during the exposure period based on a plurality of positions, and correct the distortion of the image based on the calculated average position of the lens.
  • the correction unit may correct the distortion of the image based on the average position of the lens and the distortion coefficient corresponding to the lens position.
  • the correction unit may calculate, for each combination of the plurality of positions and the plurality of pixels, a pixel position corresponding to each pixel of the image, based on the plurality of lens positions and the positions of the plurality of pixels included in the imaging element, and may correct the image by calculating, for each pixel, the average of the plurality of calculated pixel positions.
  • the imaging element may be exposed with exposure periods that differ among the plurality of pixel rows included in the imaging element.
  • the acquiring unit may acquire information indicating a plurality of positions of the lens that moves within the respective exposure periods of the plurality of pixel rows.
  • the correction unit may correct the images respectively acquired by the plurality of pixel rows based on the plurality of positions within the exposure period of each of the plurality of pixel rows.
  • the lens may be a focusing lens movable in the direction of the optical axis.
  • the lens may reciprocate in the optical axis direction during the exposure period.
  • the acquiring unit may acquire information representing multiple positions of the reciprocating lens during the exposure period.
  • in order to adjust the focus of the optical system including the lens, the lens may be moved in one direction along the optical axis during the exposure period.
  • the acquiring unit may acquire information representing multiple positions of the lens moving in one direction during the exposure period.
  • the exposure time period may be a time period during which the imaging element is repeatedly exposed in order to separately acquire each of a plurality of moving image constituent images constituting the moving image.
  • the correcting unit may correct each of the moving image constituting images based on a plurality of positions within the exposure period for each of the plurality of moving image constituting images.
  • the imaging device may include the above-mentioned image processing device.
  • the imaging device may include an imaging element.
  • the lens may be a focusing lens movable in the direction of the optical axis.
  • the imaging device may include a focus adjustment part that adjusts the focus of the optical system including the lens based on the image corrected by the correction part.
  • An unmanned aerial vehicle according to one aspect of the present invention includes the above-described imaging device and moves.
  • An image processing method includes a stage of acquiring information indicating a plurality of positions of a lens that moves within an exposure period of an imaging element.
  • the image processing method includes a stage of correcting an image obtained by exposing an imaging element through a lens during an exposure period based on a plurality of positions.
  • the program according to one aspect of the present invention may be a program for causing a computer to function as the above-mentioned image processing apparatus.
  • FIG. 1 is a diagram showing an example of an external perspective view of an imaging device 100 according to this embodiment.
  • FIG. 2 is a diagram showing functional blocks of the imaging device 100 according to this embodiment.
  • FIG. 3 is a diagram schematically illustrating processing performed by the correction unit 140.
  • Fig. 4 shows an example of the distortion coefficient.
  • FIG. 5 schematically shows an example of the positional relationship between the coordinates on the image sensor 120 and the image coordinates (x, y).
  • FIG. 6 is a diagram illustrating a calculation method of the position average value of the focus lens 210.
  • FIG. 7 is a flowchart showing an example of the execution procedure of the imaging device 100.
  • FIG. 8 is a diagram illustrating another correction method executed by the correction unit 140.
  • FIG. 9 is a diagram schematically illustrating correction processing when pixel data is read out by rolling readout.
  • FIG. 10 shows an unmanned aerial vehicle (UAV) equipped with the imaging device 100.
  • FIG. 11 shows an example of a computer 1200 that may fully or partially embody aspects of the present invention.
  • the blocks may represent (1) stages of a process in which operations are performed or (2) "parts" of a device responsible for performing the operations.
  • the specified stages and "parts" may be implemented by programmable circuits and/or processors.
  • dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits.
  • the programmable circuit may include a reconfigurable hardware circuit.
  • reconfigurable hardware circuits may include logic operation elements such as logical AND, logical OR, logical XOR, logical NAND, and logical NOR, as well as storage elements such as flip-flops, registers, field-programmable gate arrays (FPGA), and programmable logic arrays (PLA).
  • the computer-readable medium may include any tangible device that can store instructions for execution by a suitable device.
  • the computer-readable medium having instructions stored thereon constitutes a product that includes instructions which can be executed to create means for performing the operations specified in the flowcharts or block diagrams.
  • the computer-readable medium may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like.
  • computer-readable media may include a floppy (registered trademark) disk, a flexible disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an electrically erasable programmable read-only memory (EEPROM), a static random access memory (SRAM), a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a Blu-ray (registered trademark) disc, a memory stick, an integrated circuit card, and the like.
  • the computer-readable instructions may include any one of source code or object code described in any combination of one or more programming languages.
  • the source code or object code may include conventional procedural programming languages.
  • it may take the form of assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or object-oriented languages such as Smalltalk, JAVA (registered trademark), C++, and the like.
  • the computer-readable instructions may be provided to a processor or programmable circuit of a general-purpose computer, a special-purpose computer, or another programmable data processing device, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet.
  • the processor or programmable circuit can execute computer-readable instructions to create means for performing the operations specified in the flowchart or block diagram.
  • Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
  • FIG. 1 is a diagram showing an example of an external perspective view of an imaging device 100 according to this embodiment.
  • FIG. 2 is a diagram showing functional blocks of the imaging device 100 according to this embodiment.
  • the imaging device 100 includes an imaging unit 102 and a lens unit 200.
  • the imaging unit 102 includes an image sensor 120, an image processing unit 104, an imaging control unit 110, a memory 130, an instruction unit 162, and a display unit 160.
  • the image sensor 120 is an imaging element such as CCD or CMOS.
  • the image sensor 120 receives light through an optical system included in the lens unit 200.
  • the image sensor 120 outputs image data of an optical image formed by the optical system of the lens unit 200 to the image processing unit 104.
  • the imaging control unit 110 and the image processing unit 104 may be constituted by a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like.
  • the memory 130 may be a computer-readable recording medium, and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory.
  • the memory 130 stores a program necessary for the imaging control unit 110 to control the image sensor 120 and the like, a program necessary for the image processing unit 104 to execute image processing, and the like.
  • the memory 130 may be provided inside the housing of the imaging device 100.
  • the memory 130 may be configured to be detachable from the housing of the imaging device 100.
  • the instruction unit 162 is a user interface that accepts instructions to the imaging device 100 from the user.
  • the display unit 160 displays images captured by the image sensor 120 and processed by the image processing unit 104, various setting information of the imaging device 100, and the like.
  • the display part 160 may be composed of a touch panel.
  • the imaging control unit 110 controls the lens unit 200 and the image sensor 120.
  • the imaging control unit 110 controls the adjustment of the focal position and focal length of the optical system included in the lens unit 200.
  • the imaging control unit 110 outputs a control command to the lens control unit 220 included in the lens unit 200 based on the information indicating the user's instruction, thereby controlling the lens unit 200.
  • the lens unit 200 includes a focus lens 210, a zoom lens 211, a lens drive unit 212, a lens drive unit 213, a lens control unit 220, a memory 222, a position sensor 214, and a position sensor 215.
  • the focus lens 210 and the zoom lens 211 may include at least one lens.
  • the focus lens 210 and the zoom lens 211 are lenses included in the optical system. At least a part or all of the focus lens 210 and the zoom lens 211 are configured to be movable along the optical axis of the optical system.
  • the imaging control unit 110 includes a focus adjustment unit 112.
  • the focus adjustment unit 112 controls the focus lens 210 to adjust the focus of the optical system included in the lens unit 200.
  • the lens unit 200 may be an interchangeable lens provided to be detachable from the imaging unit 102.
  • the lens driving part 212 may include a driving device such as a stepping motor, a DC motor, a coreless motor, or an ultrasonic motor. At least a part or all of the focus lens 210 receives power from a driving device included in the lens driving unit 212 via a cam ring, a guide shaft, and other mechanism members, and moves.
  • the lens driving part 213 may include a driving device such as a stepping motor, a DC motor, a coreless motor, or an ultrasonic motor. At least a part or all of the zoom lens 211 receives power from a driving device included in the lens driving unit 213 via a cam ring, a guide shaft, and other mechanism members, and moves.
  • the lens control section 220 drives at least one of the lens drive section 212 and the lens drive section 213 in accordance with a lens control command from the imaging section 102, and moves at least one of the focus lens 210 and the zoom lens 211 along the optical axis direction via the mechanism members to perform at least one of a zoom operation and a focus operation.
  • the lens control commands are, for example, zoom control commands and focus control commands.
  • the memory 222 stores control values of the focus lens 210 and the zoom lens 211 that are moved via the lens drive unit 212 and the lens drive unit 213.
  • the memory 222 may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory.
  • the position sensor 214 detects the position of the focus lens 210.
  • the position sensor 215 detects the position of the zoom lens 211.
  • the position sensor 214 and the position sensor 215 may be magnetoresistive (MR) sensors or the like.
  • the imaging control unit 110 outputs a control command to the image sensor 120 based on information indicating an instruction from the user through the instruction unit 162 or the like, thereby performing control of the image sensor 120 including control of the imaging operation.
  • the image captured by the image sensor 120 is processed by the image processing unit 104 and stored in the memory 130.
  • the image acquired by the image sensor 120 is input to the image processing unit 104.
  • the correction unit 140 corrects the image acquired by the image sensor 120.
  • the display unit 160 displays the image corrected by the correction unit 140.
  • the memory 130 stores the image corrected by the correction unit 140.
  • the image corrected by the correction unit 140 may be transferred from the memory 130 to a recording medium such as a memory card.
  • the image processing unit 104 includes a correction unit 140 and an acquisition unit 142.
  • the acquisition section 142 acquires information representing a plurality of positions of the focus lens 210 that moves within the exposure period of the image sensor 120.
  • the acquisition unit 142 acquires information indicating a plurality of positions of the focus lens 210 from the focus adjustment unit 112.
  • the correction unit 140 corrects the image acquired by exposing the image sensor 120 through the focus lens 210 during the exposure period based on a plurality of positions.
  • the correction unit 140 calculates the average position of the focus lens 210 during the exposure period based on the acquired positions of the focus lens 210, and corrects the distortion of the image based on the calculated average position of the focus lens 210. Specifically, the correction unit 140 corrects the distortion of the image based on the average position of the focus lens 210 and the distortion coefficient corresponding to the position of the focus lens 210.
  • the correction unit 140 may calculate, for each combination of the multiple positions of the focus lens 210 and the multiple pixels of the image sensor 120, the pixel position on the image corresponding to that pixel, so that multiple pixel positions are obtained for each pixel.
  • the correction unit 140 may calculate, for each of the plurality of pixels, the average of the calculated pixel positions, and correct the image accordingly.
  • the image sensor 120 may be exposed with exposure periods that differ among the plurality of pixel rows included in the image sensor 120.
  • the imaging control unit 110 may read pixel information from the image sensor 120 by rolling readout.
  • the acquisition unit 142 acquires information indicating a plurality of positions of the focus lens 210 that moves within the exposure period of each of the plurality of pixel rows.
  • the correction unit 140 corrects the images respectively acquired by the plurality of pixel rows based on the plurality of positions within the exposure period of each of the plurality of pixel rows.
  • the focus lens 210 reciprocates in the optical axis direction during the exposure period.
  • the focus adjustment part 112 causes the focus lens 210 to wobble during the exposure period.
  • the acquisition section 142 acquires information indicating a plurality of positions of the focus lens 210 that reciprocate within the exposure period.
  • the correction section 140 corrects the image based on the multiple positions of the focus lens 210 that reciprocate within the exposure period.
  • the focus lens 210 may be moved in one direction in the optical axis direction during the exposure period.
  • the acquisition section 142 acquires information representing a plurality of positions of the focus lens 210 moving in one direction within the exposure period.
  • the correction part 140 corrects the image based on a plurality of positions of the focus lens 210 moving in one direction within the exposure period.
  • the exposure period may be a period during which the image sensor 120 is repeatedly exposed in order to acquire each of a plurality of moving image constituent images constituting a moving image.
  • the correcting unit 140 may correct each of the moving image constituting images based on a plurality of positions within the exposure period for each of the plurality of moving image constituting images.
  • the focus adjustment unit 112 adjusts the focus of the optical system including the focus lens 210 based on the image corrected by the correction unit 140. For example, the focus adjustment unit 112 determines the position of the focus lens 210 in the optical axis direction based on the contrast value of the image corrected by the correction unit 140, and moves the focus lens 210 to the determined position.
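As a rough illustration of contrast evaluation on the corrected image, the sketch below computes one common focus metric (variance of a Laplacian response). The metric, function name, and use of NumPy are illustrative assumptions; the patent does not specify how the contrast value is computed.

```python
import numpy as np

def contrast_value(corrected_image: np.ndarray) -> float:
    """Score image sharpness of the corrected image (higher = sharper).

    Uses the variance of a 4-neighbour Laplacian response, a common
    contrast metric for contrast-detection autofocus (assumed here).
    """
    img = corrected_image.astype(np.float64)
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return float(lap.var())
```

A focus search would then evaluate such a value at candidate positions of the focus lens 210 and move the lens toward the position giving the highest score.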
  • FIG. 3 is a diagram schematically illustrating processing performed by the correction unit 140.
  • FIG. 3 relates to the processing performed by the correction unit 140 when the exposure periods of all horizontal pixel rows included in the image sensor 120 are the same, and specifically describes the processing when the image sensor 120 performs continuous shooting with global readout.
  • the image sensor 120 has N (N is a natural number) horizontal pixel rows.
  • the imaging control unit 110 exposes horizontal pixel row 1 to horizontal pixel row N included in the image sensor 120 at a timing based on the vertical synchronization signal VD.
  • FIG. 3 shows the exposure period from time t1 to time t7 and the exposure period from time t9 to time t15.
  • the focus adjustment unit 112 detects the position of the focus lens 210 in the optical axis direction multiple times. Specifically, the focus adjustment unit 112 starts from the vertical synchronization signal VD at time t0, and detects the position of the focus lens 210 in the optical axis direction at a predetermined time interval and a predetermined number of times. In addition, the focus adjusting unit 112 starts from the vertical synchronization signal VD at time t7, and detects the position of the focus lens 210 in the optical axis direction at a predetermined time interval and a predetermined number of times.
  • the lens control unit 220 detects LP1, LP2, LP3, LP4, LP5, LP6, and LP7 during the exposure period from time t1 to time t7.
  • the lens control unit 220 detects LP9, LP10, LP11, LP12, LP13, LP14, and LP15 during the exposure period from time t9 to time t15.
  • LPi (i is a natural number) represents the position of the focus lens 210 in the optical axis direction.
  • the focus adjustment unit 112 obtains lens position information from the lens control unit 220 and outputs it to the image processing unit 104.
  • the lens position information indicates LP1, LP2, LP3, LP4, LP5, LP6, and LP7 detected within the exposure period from time t1 to time t7.
  • the acquisition unit 142 acquires lens position information output from the focus adjustment unit 112.
  • the correction unit 140 calculates the average value of LP1, LP2, LP3, LP4, LP5, LP6, and LP7 based on the lens position information.
  • the correction unit 140 calculates the distortion coefficient based on the average value of LP1, LP2, LP3, LP4, LP5, LP6, and LP7.
  • the distortion coefficient is information indicating the distortion aberration determined according to the position of the focus lens 210. The distortion coefficient is described in relation to FIG. 4 and elsewhere.
  • the average value is a value calculated by dividing the weighted sum of LP1 to LP7 by the time from time t1 to time t7. The specific calculation method of the average is described in relation to FIG. 6 and elsewhere.
  • the image processing unit 104 acquires the pixel data 310 of horizontal pixel row 1 to horizontal pixel row N from the image sensor 120 based on the vertical synchronization signal VD at time t7.
  • the image processing unit 104 corrects the image of the pixel data 310 based on the pixel data 310 acquired from the image sensor 120 and the distortion coefficient corresponding to the average position of the focus lens 210 to generate a corrected image 311.
  • the corrected image 311 generated by the correction unit 140 is stored in the memory 130 and is output to the display unit 160 as a display image of the display unit 160.
  • the focus adjustment unit 112 obtains lens position information from the lens control unit 220 and outputs it to the image processing unit 104.
  • the lens position information indicates LP9, LP10, LP11, LP12, LP13, LP14, and LP15 detected within the exposure period from time t9 to time t15.
  • the acquisition unit 142 acquires lens position information output from the focus adjustment unit 112.
  • the correction unit 140 calculates the average value of LP9, LP10, LP11, LP12, LP13, LP14, and LP15 based on the lens position information.
  • the correction unit 140 calculates the distortion coefficient based on the average value of LP9, LP10, LP11, LP12, LP13, LP14, and LP15.
  • the image processing unit 104 acquires the pixel data 320 of horizontal pixel row 1 to horizontal pixel row N from the image sensor 120 based on the vertical synchronization signal VD at time t15.
  • the image processing unit 104 corrects the image of the pixel data 320 based on the pixel data 320 acquired from the image sensor 120 and the distortion coefficient corresponding to the average position of the focus lens 210 to generate a corrected image 321.
  • the corrected image 321 generated by the correction unit 140 is stored in the memory 130 and is output to the display unit 160 as a display image of the display unit 160.
  • based on the vertical synchronization signal VD, the imaging device 100 performs one round of exposure processing, image data readout processing, image correction processing, and display processing on the display unit 160 as described above. The imaging device 100 repeatedly executes this processing for each vertical synchronization signal VD.
  • Fig. 4 shows an example of the distortion coefficient.
  • the horizontal axis of FIG. 4 is the position of the focus lens 210, and the vertical axis is the distortion coefficient value.
  • the distortion coefficients include k1, k2, and k3.
  • the memory 130 stores distortion coefficient data indicating the dependence of k1, k2, and k3 on the position of the focus lens 210.
  • the distortion coefficient data can be calculated in advance based on the lens design data of the optical system of the lens unit 200 and stored in the memory 130.
  • the distortion coefficient data may also be calculated in advance through experiments and stored in the memory 130.
  • the distortion coefficient data of k1, k2, and k3 may be data representing a function with the position of the focus lens 210 as a variable.
  • the correlation function can be obtained by fitting the pre-calculated distortion coefficients k1, k2, and k3 to a function with the position of the focus lens 210 as a variable.
  • the distortion coefficient data of k1, k2, and k3 may be mapping data for mapping the position of the focus lens 210 to k1, k2, and k3.
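A minimal sketch of how such distortion coefficient data could be held as mapping data and queried for an arbitrary lens position. The table values, interpolation choice, and names are illustrative assumptions, not values from the patent.

```python
import numpy as np

# Hypothetical mapping data: rows of (focus-lens position LP, k1, k2, k3).
# The numbers are placeholders, not the patent's coefficients.
COEFF_TABLE = np.array([
    [0.0,   -0.120, 0.015, -0.002],
    [50.0,  -0.100, 0.012, -0.001],
    [100.0, -0.085, 0.010, -0.001],
    [150.0, -0.070, 0.008,  0.000],
])

def distortion_coeffs(lens_position: float) -> tuple:
    """Return (k1(LP), k2(LP), k3(LP)) by linear interpolation over the table."""
    lp = COEFF_TABLE[:, 0]
    return tuple(float(np.interp(lens_position, lp, COEFF_TABLE[:, i]))
                 for i in (1, 2, 3))
```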
  • the distortion coefficients k1, k2, and k3 can be used to express the relationship between the coordinates (x_distorted, y_distorted) on the image sensor 120 and the normalized image coordinates (x, y) with the following Equation 1.
  • the coordinates (x, y) represent the coordinates in the corrected image.
  • LP in Equation 1 represents the position of the focus lens 210.
  • the distortion coefficients k1, k2, and k3 depend on LP, and therefore, are expressed as k1(LP), k2(LP), and k3(LP) in Equation 1.
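Assuming k1, k2, and k3 are the usual radial distortion coefficients applied to normalized image coordinates (an assumption about the form, since the body of Equation 1 is not reproduced here), Equation 1 would plausibly read:

    x_distorted = x · (1 + k1(LP)·r² + k2(LP)·r⁴ + k3(LP)·r⁶)
    y_distorted = y · (1 + k1(LP)·r² + k2(LP)·r⁴ + k3(LP)·r⁶),  with r² = x² + y²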
  • FIG. 5 schematically shows an example of the positional relationship between the coordinates (x_distorted, y_distorted) on the image sensor 120 and the image coordinates (x, y).
  • the correction unit 140 applies the coordinates of each pixel in the pixel data to (x_distorted, y_distorted) in Equation 1, applies k1, k2, and k3 calculated from the average position of the focus lens 210 and the aforementioned distortion coefficient data to k1(LP), k2(LP), and k3(LP), and calculates the coordinates (x, y) that satisfy Equation 1. Then, the correction section 140 uses the pixel value of the coordinate (x_distorted, y_distorted) in the image data acquired from the image sensor 120 as the pixel value of the coordinate (x, y), thereby generating a corrected image.
  • the correction unit 140 uses the distortion coefficient determined according to the average position of the focus lens 210 during the exposure period to correct the image. As a result, as shown in FIG. 3, it is possible to generate the corrected image 311 and the corrected image 321 in which the change in the angle of view caused by the movement of the focus lens 210 is suppressed.
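The following NumPy sketch illustrates this style of correction under the assumption that Equation 1 is the standard radial model sketched above: the coordinates of every captured pixel are "undistorted" with the coefficients for the average lens position, and the pixel value is placed at the resulting position. The intrinsics (fx, fy, cx, cy), the fixed-point inversion, and the nearest-neighbour placement are all illustrative assumptions.

```python
import numpy as np

def undistort_coords(xd, yd, k1, k2, k3, iters=5):
    """Solve the assumed Equation 1 for (x, y) given distorted normalized
    coordinates (xd, yd), by simple fixed-point iteration."""
    x, y = xd.copy(), yd.copy()
    for _ in range(iters):
        r2 = x * x + y * y
        scale = 1.0 + k1 * r2 + k2 * r2**2 + k3 * r2**3
        x, y = xd / scale, yd / scale
    return x, y

def correct_image(raw, k1, k2, k3, fx, fy, cx, cy):
    """Scatter each captured pixel of `raw` (H x W) to its corrected position."""
    h, w = raw.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
    xd = (xs - cx) / fx              # normalized (x_distorted, y_distorted)
    yd = (ys - cy) / fy
    x, y = undistort_coords(xd, yd, k1, k2, k3)
    u = np.clip(np.rint(x * fx + cx).astype(int), 0, w - 1)
    v = np.clip(np.rint(y * fy + cy).astype(int), 0, h - 1)
    corrected = np.zeros_like(raw)
    corrected[v, u] = raw            # value at (xd, yd) becomes value at (x, y)
    return corrected
```

A practical remapper would also fill holes (for example by inverse mapping or interpolation); the scatter above only mirrors the assignment described in the preceding paragraph.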
  • FIG. 6 is a diagram for explaining a calculation method of the position average value of the focus lens 210.
  • T is the exposure time.
  • Te is the time from t1 to t7.
  • Td is the time interval for detecting the position of the focus lens 210.
  • T = 6Td.
  • the position average value of the focus lens 210 is a value calculated from the time average value of LP1 to LP7.
  • the weighted sum of LP1 to LP7 is divided by T to calculate the average position of the focus lens 210. The weighted sum of LP1 to LP7 is calculated as Σ(ωi × LPi), where i is a natural number from 1 to 7.
  • ωi is the weight coefficient of the weighted sum.
  • the sum of ω1 to ω7 is T.
  • ωi may be specified based on the length of time, within the period from time ti − Td/2 to time ti + Td/2, that is included in the exposure period. Specifically, ω1 and ω7 are Td/2, and ω2 to ω6 are Td.
  • the weight coefficient ωi is determined according to the time at which the position of the focus lens 210 is detected, so that the time average value of the position of the focus lens 210 can be calculated.
  • the average value may be a value calculated by dividing the sum of LP1 to LP7 by 7. That is, the average value may be a value calculated by dividing the sum of the positions of the focus lens 210 detected during the exposure period by the number of detections of the position of the focus lens 210.
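A small sketch of the two averaging variants just described (the FIG. 6 time-weighted average and the simple arithmetic mean). Function names and the sample values in the example are assumptions.

```python
def average_lens_position(positions, td):
    """FIG. 6-style time-weighted average: the first and last samples are
    weighted Td/2, intermediate samples Td, and the weighted sum is divided
    by the exposure time T = (n - 1) * Td."""
    n = len(positions)
    if n < 2:
        return float(positions[0])
    weights = [td] * n
    weights[0] = weights[-1] = td / 2.0
    exposure_time = (n - 1) * td
    return sum(w * p for w, p in zip(weights, positions)) / exposure_time

def simple_average(positions):
    """Alternative: divide the sum of detected positions by the number of detections."""
    return sum(positions) / len(positions)

# Example with seven samples standing in for LP1..LP7 (illustrative values):
lp = [10.0, 12.0, 14.0, 15.0, 14.0, 12.0, 10.0]
print(average_lens_position(lp, td=1.0), simple_average(lp))
```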
  • FIG. 7 is a flowchart showing an example of the execution procedure of the imaging device 100. When the vertical synchronization signal VD is triggered, this flowchart starts.
  • the focus adjustment unit 112 outputs a timing signal for detecting the position of the focus lens 210 to the lens control unit 220, thereby detecting the position of the focus lens 210.
  • the imaging control unit 110 starts to expose the image sensor 120.
  • the lens control unit 220 detects the position of the focus lens 210 based on the timing signal output from the focus adjustment unit 112.
  • the imaging control unit 110 determines whether to end the exposure. For example, when a trigger of a new vertical synchronization signal VD is detected, the imaging control unit 110 determines that the exposure has ended. Until it is determined that the exposure is ended, the image sensor 120 is kept exposed, and the process of S604 is repeatedly executed.
  • the correction unit 140 obtains the lens position information of the focus lens 210 from the lens control unit 220 through the focus adjustment unit 112, and calculates the average position of the focus lens 210.
  • the correction unit 140 acquires pixel data output from the image sensor 120.
  • the correction unit 140 calculates the image coordinates (x, y) corresponding to the pixel coordinates (x distorted , y distorted ) of the image sensor 120 based on the distortion coefficient data.
  • the correction unit 140 assigns each pixel value of the pixel data output from the image sensor 120 as the pixel value of the corresponding image coordinate calculated in S612, thereby generating a corrected image.
  • the image processing unit 104 stores the corrected image generated by the correction unit 140 in the memory 130. After the processing of S616 is completed, the processing of this flowchart ends.
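Putting the steps of this flowchart together, one vertical-sync cycle can be sketched as the function below. Every argument is an injected callable standing in for the corresponding unit of the imaging device 100; the structure, not the implementation, is the point.

```python
def process_frame(read_lens_positions, read_pixel_data, average_positions,
                  correct, store):
    """One VD cycle: collect lens positions sampled during the exposure,
    read the pixel data, average the positions, correct the image, and
    store/display the result (a structural sketch, not the patent's code)."""
    lens_positions = read_lens_positions()      # positions detected during exposure
    raw = read_pixel_data()                     # pixel data from the image sensor 120
    avg = average_positions(lens_positions)     # e.g. the FIG. 6 weighted average
    corrected = correct(raw, avg)               # distortion correction via Equation 1
    store(corrected)                            # memory 130 / display unit 160
    return corrected
```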
  • the corrected image stored in the memory 130 in S616 is output to the display unit 160, for example, as a constituent image of the displayed moving image.
  • the corrected image stored in the memory 130 is recorded in the memory 130 as a moving image constituent image of moving image data.
  • the correction unit 140 may associate the image coordinates calculated in S612 with the average position of the focus lens 210 and store them in the memory 130.
  • the correction unit 140 may use the image coordinates stored in the memory 130 without performing the processing of S612. This can reduce the amount of calculation used to calculate the image coordinates.
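One way the stored image coordinates could be reused is a small cache keyed by the average lens position, as sketched below; the rounding granularity and names are assumptions.

```python
_coord_cache = {}

def cached_image_coords(avg_lens_position, compute_coords):
    """Return the image-coordinate map for an average focus-lens position,
    computing it only the first time that (rounded) position is seen.
    `compute_coords` is any callable performing the S612-style computation."""
    key = round(avg_lens_position, 2)   # assumed quantization of the lens position
    if key not in _coord_cache:
        _coord_cache[key] = compute_coords(key)
    return _coord_cache[key]
```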
  • the correction unit 140 corrects image distortion based on the average position of the focus lens 210 during the exposure period. It is thereby possible to generate a corrected image in which the influence of the change in the angle of view that occurs due to the movement of the focus lens 210 during the exposure period is suppressed.
  • FIG. 8 is a diagram illustrating another correction method executed by the correction unit 140.
  • FIG. 8 enlarges part of the image coordinates, showing the relationship between the pixel coordinates (x_distorted, y_distorted) of the image sensor 120 and the corrected image coordinates (x, y).
  • the correction unit 140 calculates the image coordinates (x, y) based on each position of the focus lens 210 detected during the exposure period.
  • the correction processing performed based on other correction methods will be described with reference to the example shown in FIG. 3.
  • the correction unit 140 calculates the image coordinates (x, y) based on each of LP1, LP2, LP3, LP4, LP5, LP6, and LP7 in the exposure period from time t1 to time t7. Specifically, the correction unit 140 applies the coordinates of the pixels in the pixel data acquired from the image sensor 120 to (x_distorted, y_distorted) in Equation 1, applies k1, k2, and k3 calculated from LP1 and the distortion coefficient data to k1(LP), k2(LP), and k3(LP) of Equation 1, and calculates the coordinates (x, y) that satisfy Equation 1.
  • the calculated coordinates (x, y) are shown as (x1, y1) in FIG. 8.
  • the correction unit 140 then applies the coordinates of the pixels in the pixel data to (x_distorted, y_distorted) in Equation 1, applies k1, k2, and k3 calculated from LP2 and the distortion coefficient data to k1(LP), k2(LP), and k3(LP) of Equation 1, and calculates the coordinates (x, y) that satisfy Equation 1. The same is done for LP3 to LP7.
  • the calculated coordinates (x, y) are shown as (x2, y2) in FIG. 8.
  • the correction unit 140 applies the pixel value of the coordinate (x_distorted, y_distorted) in the pixel data as the pixel value of the coordinate (x, y) obtained by averaging (xi, yi) (i is a natural number from 1 to 7).
  • the correction unit 140 performs the same processing on each pixel of the pixel data acquired from the image sensor 120 to generate a corrected image.
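A sketch of this per-position variant, again assuming the standard radial form of Equation 1: each pixel's corrected coordinates are computed once per detected lens position LP1..LPn, the coordinates are averaged per pixel, and the pixel value is placed at the averaged position. `coeffs_for(lp)` is an assumed helper returning (k1, k2, k3) for a lens position, and the intrinsics are illustrative.

```python
import numpy as np

def correct_image_per_position(raw, lens_positions, coeffs_for,
                               fx, fy, cx, cy, iters=5):
    h, w = raw.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
    xd, yd = (xs - cx) / fx, (ys - cy) / fy      # (x_distorted, y_distorted)
    x_sum = np.zeros_like(xd)
    y_sum = np.zeros_like(yd)
    for lp in lens_positions:                    # LP1 .. LPn
        k1, k2, k3 = coeffs_for(lp)
        x, y = xd.copy(), yd.copy()
        for _ in range(iters):                   # invert the assumed Equation 1
            r2 = x * x + y * y
            s = 1.0 + k1 * r2 + k2 * r2**2 + k3 * r2**3
            x, y = xd / s, yd / s
        x_sum += x
        y_sum += y
    x_avg = x_sum / len(lens_positions)          # average of (xi, yi) per pixel
    y_avg = y_sum / len(lens_positions)
    u = np.clip(np.rint(x_avg * fx + cx).astype(int), 0, w - 1)
    v = np.clip(np.rint(y_avg * fy + cy).astype(int), 0, h - 1)
    corrected = np.zeros_like(raw)
    corrected[v, u] = raw
    return corrected
```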
  • the correction method described in connection with FIG. 8 can also provide a corrected image in which the influence of the change in the angle of view due to the movement of the focus lens 210 during the exposure period is suppressed. Compared with the correction method described in connection with FIGS. 3 and 5, it can generate a corrected image that further reduces the influence of angle-of-view changes.
  • FIG. 9 is a diagram schematically illustrating the correction processing of the correction unit 140 when pixel data is read from the image sensor 120 by rolling readout.
  • the imaging control unit 110 sequentially shifts the exposure start times of horizontal pixel row 1 to horizontal pixel row N to perform exposure. Therefore, the exposure period differs among the pixel rows included in the image sensor 120.
  • the exposure period of the horizontal pixel row 1 is from time t1 to time t7.
  • the exposure period of the horizontal pixel row N is from time t7 to time t13.
  • the exposure period of the horizontal pixel row i is from time t1+ΔT×(i-1) to time t7+ΔT×(i-1) (i is a natural number from 1 to N).
  • ΔT is the interval between the exposure start times of adjacent horizontal pixel rows.
  • the positions of the focus lens 210 detected during the exposure period of the horizontal pixel row 1 are LP1, LP2, LP3, LP4, LP5, LP6, and LP7.
  • the correction unit 140 calculates the distortion coefficients k1, k2, and k3 corresponding to the average value of LP1, LP2, LP3, LP4, LP5, LP6, and LP7, based on that average value and the distortion coefficient data.
  • the correction unit 140 applies the calculated distortion coefficients k1, k2, and k3 to Equation 1, and corrects the pixel data 810 of the horizontal pixel row 1 to generate corrected pixel data 811.
  • the positions of the focus lens 210 detected during the exposure period of the horizontal pixel row 2 are LP2, LP3, LP4, LP5, LP6, and LP7.
  • the correction unit 140 calculates the distortion coefficients k1, k2, and k3 corresponding to the average value of LP2, LP3, LP4, LP5, LP6, and LP7, based on that average value and the distortion coefficient data.
  • the correction unit 140 applies the calculated distortion coefficients k1, k2, and k3 to Equation 1, and corrects the pixel data of the horizontal pixel row 2 to generate corrected pixel data.
  • the average value is a value calculated by dividing the weighted sum of LP1 to LP7 by the time from time t1 to time t7. Specifically, the average value is a value calculated by the calculation method described in relation to FIG. 6 and the like.
  • the correction unit 140 performs the same processing on horizontal pixel row 3 to horizontal pixel row N, and generates corrected pixel data.
  • the positions of the focus lens 210 detected during the exposure period of the horizontal pixel row N are LP7, LP8, LP9, LP10, LP11, LP12, and LP13.
  • the correction unit 140 calculates the average value of LP7, LP8, LP9, LP10, LP11, LP12, and LP13 and, based on that average value and the distortion coefficient data, calculates the distortion coefficients k1, k2, and k3 corresponding to the calculated average value.
  • the correction unit 140 applies the calculated distortion coefficients k1, k2, and k3 to Equation 1, and corrects the pixel data 820 of the horizontal pixel row N, thereby generating corrected pixel data 821.
  • the correction unit 140 generates one corrected image based on corrected pixel data generated from each horizontal pixel column.
  • for each group of pixel rows for which the same positions of the focus lens 210 were detected during the exposure period, the correction unit 140 applies the distortion coefficients k1, k2, and k3 based on the average position of the focus lens 210 during that exposure period to Equation 1, generating corrected pixel data.
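For the rolling-readout case, the per-row average can be obtained by intersecting each row's exposure window with the lens-position sample times, as in this sketch. The timing convention and names are assumptions, and it assumes at least one sample falls inside every row's window.

```python
import numpy as np

def per_row_average_positions(n_rows, row_offset, exposure_time,
                              sample_times, sample_positions):
    """Average, for each horizontal pixel row i, the focus-lens positions whose
    detection times fall inside that row's exposure window
    [i * row_offset, i * row_offset + exposure_time]."""
    t = np.asarray(sample_times, dtype=float)
    p = np.asarray(sample_positions, dtype=float)
    averages = np.empty(n_rows)
    for i in range(n_rows):
        start = i * row_offset
        in_window = (t >= start) & (t <= start + exposure_time)
        averages[i] = p[in_window].mean()
    return averages
```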
  • FIG. 9 and the related description explain correction processing of image data read out by rolling readout based on the average value of the positions of the focus lens 210.
  • the correction method described in connection with FIG. 8 and the like can also be applied to the correction of image data read out by rolling readout.
  • the correction unit 140 may calculate, for each pixel row, the image coordinates corresponding to each position of the focus lens 210, and assign the pixel values of the corrected image at the average of the calculated image coordinates.
  • with the imaging device 100, it is possible to provide an image in which the influence of the change in the angle of view caused by the movement of the focus lens 210 is suppressed.
  • this effect is particularly pronounced for a lens device that is small relative to the size of the image sensor.
  • when the optical system is miniaturized relative to the size of the image sensor, the effect of distortion caused by the movement of the focus lens becomes significant. Therefore, for example, if the focus lens wobbles during live view shooting, a change in the angle of view accompanying the wobble may be observed in the live view image.
  • with the imaging device 100, the influence of the angle-of-view variation caused by the movement of the focus lens 210 can be suppressed, so that the angle-of-view variation caused by the wobble is not easily observed in the live view image.
  • the contrast value used for the focus control is detected from the corrected image, so that the image area of the contrast value detection target can be suppressed from changing due to the movement of the focus lens 210. Therefore, it is possible to more accurately perform focus control based on the contrast value.
  • the processing described in connection with the imaging device 100 of this embodiment is applicable not only to wobbling of the focus lens 210 but also to processing when the focus lens 210 is moved in one direction during the exposure period.
  • the processing described in relation to the imaging device 100 of the present embodiment is applicable not only to images during live view shooting, but also to correction processing when generating moving image data for recording and when generating still image data for recording.
  • the processing described in connection with the imaging device 100 of this embodiment is also applicable to the movement of lenses other than the focus lens 210. That is, the correction section 140 corrects the image acquired by exposing the image sensor 120 during the relevant exposure period based on a plurality of positions of the lens other than the focus lens 210 moved during the exposure period of the image sensor 120.
  • the aforementioned imaging device 100 may be mounted on a mobile body.
  • the imaging device 100 may also be mounted on an unmanned aerial vehicle (UAV) as shown in FIG. 10.
  • the UAV 10 may include a UAV main body 20, a gimbal 50, a plurality of imaging devices 60, and the imaging device 100.
  • the gimbal 50 and the imaging device 100 are an example of an imaging system.
  • the UAV 10 is an example of a mobile body propelled by a propulsion unit.
  • the concept of a mobile body includes, in addition to UAVs, other flying objects moving in the air, vehicles moving on the ground, ships moving on the water, and the like.
  • the UAV main body 20 includes a plurality of rotors. Multiple rotors are an example of a propulsion section.
  • the UAV main body 20 makes the UAV 10 fly by controlling the rotation of a plurality of rotors.
  • the UAV main body 20 uses, for example, four rotors to fly the UAV 10. The number of rotors is not limited to four.
  • UAV10 can also be a fixed-wing aircraft without rotors.
  • the imaging device 100 is an imaging camera that captures a subject included in a desired imaging range.
  • the gimbal 50 rotatably supports the imaging device 100.
  • the gimbal 50 is an example of a supporting mechanism.
  • the gimbal 50 uses an actuator to rotatably support the imaging device 100 around the pitch axis.
  • the gimbal 50 uses actuators to further rotatably support the imaging device 100 around the roll axis and the yaw axis, respectively.
  • the gimbal 50 can change the posture of the imaging device 100 by rotating the imaging device 100 around at least one of the yaw axis, the pitch axis, and the roll axis.
  • the plurality of imaging devices 60 are sensing cameras that photograph the surroundings of the UAV 10 in order to control the flight of the UAV 10.
  • the two camera devices 60 can be installed on the nose of the UAV 10, that is, on the front.
  • the other two camera devices 60 may be provided on the bottom surface of the UAV 10.
  • the two camera devices 60 on the front side may be paired to function as a so-called stereo camera.
  • the two imaging devices 60 on the bottom side may also be paired to function as a stereo camera.
  • the three-dimensional spatial data around the UAV 10 can be generated based on the images captured by the plurality of camera devices 60.
  • the number of imaging devices 60 included in the UAV 10 is not limited to four.
  • the UAV 10 may include at least one camera device 60.
  • the UAV 10 may also include at least one camera 60 on the nose, tail, side, bottom and top surfaces of the UAV 10, respectively.
  • the viewing angle that can be set in the camera device 60 may be larger than the viewing angle that can be set in the camera device 100.
  • the imaging device 60 may have a single focus lens or a fisheye lens.
  • the remote operation device 300 communicates with the UAV 10 to remotely operate the UAV 10.
  • the remote operation device 300 can perform wireless communication with the UAV 10.
  • the remote operation device 300 transmits to the UAV 10 instruction information indicating various commands related to the movement of the UAV 10 such as ascending, descending, accelerating, decelerating, forwarding, retreating, and rotating.
  • the instruction information includes, for example, instruction information for raising the height of the UAV 10.
  • the indication information may indicate the height at which the UAV10 should be located.
  • the UAV 10 moves to be located at the height indicated by the instruction information received from the remote operation device 300.
  • the instruction information may include an ascent instruction to raise the UAV 10. The UAV 10 rises while receiving the ascent instruction. When the height of the UAV 10 has reached its upper limit, the ascent of the UAV 10 can be restricted even if the ascent instruction is received.
  • FIG. 11 shows an example of a computer 1200 that can embody various aspects of the present invention in whole or in part.
  • the program installed on the computer 1200 can cause the computer 1200 to function as the operations associated with the device according to the embodiment of the present invention or as one or more "parts" of the device.
  • a program installed on the computer 1200 can make the computer 1200 function as the correction unit 140 or the image processing unit 104.
  • the program can enable the computer 1200 to perform related operations or related functions of one or more "parts".
  • This program can make the computer 1200 execute the process or stages of the process involved in the embodiment of the present invention.
  • Such a program may be executed by the CPU 1212, so that the computer 1200 executes specified operations associated with some or all blocks in the flowcharts and block diagrams described in this specification.
  • the computer 1200 of this embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other through a host controller 1210.
  • the computer 1200 further includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 through the input/output controller 1220.
  • the computer 1200 also includes a ROM 1230.
  • the CPU 1212 operates according to the programs stored in the ROM 1230 and RAM 1214 to control each unit.
  • the communication interface 1222 communicates with other electronic devices via a network.
  • the hard disk drive can store programs and data used by the CPU 1212 in the computer 1200.
  • the ROM 1230 stores therein a boot program executed by the computer 1200 during operation, and/or a program that depends on the hardware of the computer 1200.
  • the program is provided via a computer-readable recording medium such as a CD-ROM, a USB memory, or an IC card, or via a network.
  • the program is installed in RAM 1214 or ROM 1230 which is also an example of a computer-readable recording medium, and is executed by CPU 1212.
  • the information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above.
  • an apparatus or method may be constituted by implementing operations or processing of information through the use of the computer 1200.
  • the CPU 1212 may execute a communication program loaded in the RAM 1214, and based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing.
  • the communication interface 1222 reads the transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and transmits the read transmission data to the network, or writes the reception data received from the network into a reception buffer provided on the recording medium.
  • the CPU 1212 can make the RAM 1214 read all or necessary parts of files or databases stored in an external recording medium such as a USB memory, and perform various types of processing on the data on the RAM 1214. Then, the CPU 1212 can write the processed data back to the external recording medium.
  • the CPU 1212 can perform various types of processing described throughout this disclosure and specified by the instruction sequence of the program, including various types of operations, information processing, conditional judgment, conditional branching, unconditional branching, and information retrieval/replacement, on the data read from the RAM 1214, and writes the results back to the RAM 1214.
  • the CPU 1212 can search for information in files, databases, and the like in the recording medium. For example, when multiple entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 may retrieve from the multiple entries an entry matching a specified attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
  • the above-mentioned programs or software modules may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200.
  • a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium so that the program can be provided to the computer 1200 via the network.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)
  • Automatic Focus Adjustment (AREA)
  • Focusing (AREA)
PCT/CN2020/071175 2019-01-31 2020-01-09 Image processing device, imaging device, unmanned aerial vehicle, image processing method, and program WO2020156085A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202080002874.4A CN112204947A (zh) 2019-01-31 2020-01-09 Image processing device, imaging device, unmanned aerial vehicle, image processing method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019015752A JP6746857B2 (ja) 2019-01-31 2019-01-31 Image processing device, imaging device, unmanned aerial vehicle, image processing method, and program
JP2019-015752 2019-01-31

Publications (1)

Publication Number Publication Date
WO2020156085A1 true WO2020156085A1 (zh) 2020-08-06

Family

ID=71840865

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/071175 WO2020156085A1 (zh) 2019-01-31 2020-01-09 图像处理装置、摄像装置、无人驾驶航空器、图像处理方法以及程序

Country Status (3)

Country Link
JP (1) JP6746857B2 (ja)
CN (1) CN112204947A (ja)
WO (1) WO2020156085A1 (ja)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011061444A (ja) * 2009-09-09 2011-03-24 Hitachi Information & Communication Engineering Ltd Aberration correction device and aberration correction method
CN103038689A (zh) * 2011-05-16 2013-04-10 松下电器产业株式会社 Lens unit and imaging device
US20140300799A1 (en) * 2013-04-05 2014-10-09 Olympus Corporation Imaging device, method for controlling imaging device, and information storage device
CN104380709A (zh) * 2012-06-22 2015-02-25 富士胶片株式会社 Imaging device and operation control method thereof

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1320813C (zh) * 2003-06-20 2007-06-06 北京中星微电子有限公司 一种镜头成像畸变校正的方法
US7596286B2 (en) * 2003-08-06 2009-09-29 Sony Corporation Image processing apparatus, image processing system, imaging apparatus and image processing method
JP4522207B2 (ja) * 2004-09-17 2010-08-11 キヤノン株式会社 カメラシステム、カメラ本体及び交換レンズ
JP4310645B2 (ja) * 2004-12-28 2009-08-12 ソニー株式会社 撮像画像信号の歪み補正方法および撮像画像信号の歪み補正装置
JP2009296561A (ja) * 2008-05-02 2009-12-17 Olympus Imaging Corp 撮像装置及び撮像方法
JP5272699B2 (ja) * 2008-12-15 2013-08-28 株式会社ニコン 画像処理装置、撮像装置、プログラムおよび画像処理方法
JP5934940B2 (ja) * 2012-05-17 2016-06-15 パナソニックIpマネジメント株式会社 撮像装置、半導体集積回路および撮像方法
JP5963542B2 (ja) * 2012-05-30 2016-08-03 キヤノン株式会社 画像処理装置、その制御方法及びプログラム
JP6136019B2 (ja) * 2014-02-03 2017-05-31 パナソニックIpマネジメント株式会社 動画像撮影装置、および、動画像撮影装置の合焦方法
JP6313685B2 (ja) * 2014-05-01 2018-04-18 キヤノン株式会社 撮像装置およびその制御方法
JP6516443B2 (ja) * 2014-11-10 2019-05-22 オリンパス株式会社 カメラシステム
WO2017122348A1 (ja) * 2016-01-15 2017-07-20 オリンパス株式会社 フォーカス制御装置、内視鏡装置及びフォーカス制御装置の作動方法
WO2018025659A1 (ja) * 2016-08-05 2018-02-08 ソニー株式会社 撮像装置、固体撮像素子、カメラモジュール、駆動制御部、および撮像方法
JP6906947B2 (ja) * 2016-12-22 2021-07-21 キヤノン株式会社 画像処理装置、撮像装置、画像処理方法およびコンピュータのプログラム
US10705312B2 (en) * 2017-02-02 2020-07-07 Canon Kabushiki Kaisha Focus control apparatus, image capturing apparatus, and focus control method
CN110337805B (zh) * 2017-03-01 2021-03-23 富士胶片株式会社 摄像装置、图像处理装置、图像处理方法及存储介质

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011061444A (ja) * 2009-09-09 2011-03-24 Hitachi Information & Communication Engineering Ltd Aberration correction device and aberration correction method
CN103038689A (zh) * 2011-05-16 2013-04-10 松下电器产业株式会社 Lens unit and imaging device
CN104380709A (zh) * 2012-06-22 2015-02-25 富士胶片株式会社 Imaging device and operation control method thereof
US20140300799A1 (en) * 2013-04-05 2014-10-09 Olympus Corporation Imaging device, method for controlling imaging device, and information storage device

Also Published As

Publication number Publication date
JP6746857B2 (ja) 2020-08-26
CN112204947A (zh) 2021-01-08
JP2020123897A (ja) 2020-08-13

Similar Documents

Publication Publication Date Title
WO2018185939A1 (ja) 撮像制御装置、撮像装置、撮像システム、移動体、撮像制御方法、及びプログラム
WO2019120082A1 (zh) 控制装置、***、控制方法以及程序
WO2021013143A1 (zh) 装置、摄像装置、移动体、方法以及程序
WO2020156085A1 (zh) 图像处理装置、摄像装置、无人驾驶航空器、图像处理方法以及程序
WO2019174343A1 (zh) 活动体检测装置、控制装置、移动体、活动体检测方法及程序
WO2020216037A1 (zh) 控制装置、摄像装置、移动体、控制方法以及程序
WO2018185940A1 (ja) 撮像制御装置、撮像装置、撮像システム、移動体、撮像制御方法、及びプログラム
WO2021031833A1 (zh) 控制装置、摄像***、控制方法以及程序
US11125970B2 (en) Method for lens autofocusing and imaging device thereof
WO2019223614A1 (zh) 控制装置、摄像装置、移动体、控制方法以及程序
JP7043706B2 (ja) 制御装置、撮像システム、制御方法、及びプログラム
WO2021204020A1 (zh) 装置、摄像装置、摄像***、移动体、方法以及程序
WO2021052216A1 (zh) 控制装置、摄像装置、控制方法以及程序
WO2021249245A1 (zh) 装置、摄像装置、摄像***及移动体
WO2020244440A1 (zh) 控制装置、摄像装置、摄像***、控制方法以及程序
JP6961888B1 (ja) 装置、撮像装置、移動体、プログラム及び方法
JP6569157B1 (ja) 制御装置、撮像装置、移動体、制御方法、及びプログラム
WO2018163300A1 (ja) 制御装置、撮像装置、撮像システム、移動体、制御方法、及びプログラム
WO2021233177A1 (zh) 图像处理装置、摄像装置、移动体、程序以及方法
WO2021031840A1 (zh) 装置、摄像装置、移动体、方法以及程序
JP6878738B1 (ja) 制御装置、撮像システム、移動体、制御方法、及びプログラム
WO2021143425A1 (zh) 控制装置、摄像装置、移动体、控制方法以及程序
WO2020083342A1 (zh) 控制装置、摄像装置、移动体、控制方法以及程序
WO2020088438A1 (zh) 控制装置、摄像装置、***、控制方法以及程序
WO2019085794A1 (zh) 控制装置、摄像装置、飞行体、控制方法以及程序

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20748134

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20748134

Country of ref document: EP

Kind code of ref document: A1