WO2021088669A1 - Control device, imaging device, control method, and program - Google Patents

Control device, imaging device, control method, and program

Info

Publication number
WO2021088669A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
image sensor
cropped
optical system
exposure time
Prior art date
Application number
PCT/CN2020/123565
Other languages
English (en)
French (fr)
Inventor
本庄谦一
安田知长
Original Assignee
深圳市大疆创新科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to CN202080004464.3A (CN112585938B)
Publication of WO2021088669A1


Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B5/00Adjustment of optical system relative to image or object surface other than for focusing
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B7/00Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
    • G03B7/08Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
    • G03B7/091Digital circuits
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Definitions

  • the present invention relates to a control device, an imaging device, a control method, and a program.
  • Patent Document 1 discloses removing moiré by moving the imaging element while an image signal of one frame is generated, or by moving the imaging element at intervals of four frames of image-signal generation.
  • the control device may be a control device that controls an imaging device, the imaging device including: an optical system; an image sensor; and a moving mechanism that moves the optical system or the image sensor within a movable range, in a direction intersecting the optical axis of the optical system, to perform shake correction.
  • the control device may include a circuit configured to: cause the image sensor to output a first captured image at a first time point within a first exposure time during which the moving mechanism performs shake correction; and control the moving mechanism to move the optical system or the image sensor in a first direction by a first movement amount, thereby moving the optical system or the image sensor to a first predetermined position within the movable range.
  • the circuit may be configured to cause the image sensor to output the second captured image at a second time point after the first time point within the first exposure time.
  • the circuit may be configured to generate an image based on the first captured image and the second captured image.
  • the circuit may be configured to: crop a first cropped image from a first cropped area in the first captured image; crop a second cropped image from a second cropped area in the second captured image, the second cropped area being obtained by moving the first cropped area in a second direction opposite to the first direction by an amount equivalent to the first movement amount; and generate the image based on the first cropped image and the second cropped image.
  • the circuit may be configured to: at a second time point within the first exposure time, control the moving mechanism to move the optical system or the image sensor in a third direction by a second movement amount, thereby moving the optical system or the image sensor to the first position within the movable range.
  • the circuit may be configured to: cause the image sensor to output a third captured image at a third time point after the second time point within the first exposure time; and crop a third cropped image from a third cropped area in the third captured image, the third cropped area being obtained by moving the second cropped area in a fourth direction opposite to the third direction by an amount equivalent to the second movement amount.
  • the circuit may be configured to further generate an image based on the third cropped image.
  • the circuit may be configured to: when the optical system or the image sensor moves beyond a first movement range within the movable range, determine that moment as the first time point and cause the image sensor to output the first captured image.
  • the circuit may be configured to adjust the timing of the second time point based on the first captured image.
  • the circuit may be configured to adjust the timing of the second time point so that, when the first captured image is in a saturated state, the time from the first time point to the second time point is shorter than when the first captured image is not in a saturated state.
  • the imaging device may be operated in a first imaging mode in which a subject having a first brightness is captured, and in a second imaging mode in which a subject having a second brightness brighter than the first brightness is captured.
  • the circuit may be configured to set the first exposure time to a first time period when the imaging device operates in the first imaging mode, and to a second time period shorter than the first time period when the imaging device operates in the second imaging mode.
  • the circuit may be configured to: acquire a plurality of cropped images, including the first cropped image and the second cropped image, within the first exposure time; select cropped images that satisfy a predetermined condition from the plurality of cropped images; and generate the image based on the selected cropped images.
  • the circuit may be configured to generate an image by adding the pixel value of each pixel of the first cropped image and the pixel value of each pixel of the second cropped image.
  • the imaging device may include the above-mentioned control device, optical system, image sensor, and moving mechanism.
  • the control method may be a control method for controlling an imaging device, the imaging device including: an optical system; an image sensor; and a moving mechanism that moves the optical system or the image sensor within a movable range, in a direction intersecting the optical axis of the optical system, to perform shake correction.
  • the control method may include: causing the image sensor to output a first captured image at a first time point within a first exposure time during which the moving mechanism performs shake correction; and controlling the moving mechanism to move the optical system or the image sensor in a first direction by a first movement amount, thereby moving the optical system or the image sensor to a first predetermined position within the movable range.
  • the control method may include the step of causing the image sensor to output a second captured image at a second time point after the first time point within the first exposure time.
  • the control method may include the step of generating an image based on the first captured image and the second captured image.
  • the program according to one aspect of the present invention may be a program for causing a computer to function as the above-mentioned control device.
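The control flow summarized in the above aspects can be sketched in Python. This is a minimal illustration, not the patented implementation: `capture_frame` and `reset_position` are hypothetical stand-ins for the image sensor output and the moving mechanism's reset to the first position, and movement amounts are assumed to be expressed in pixels.

```python
import numpy as np

def divided_long_exposure(capture_frame, reset_position, n_splits, crop_xy, crop_size):
    """Synthesize one long-exposure image from n_splits divided exposures.

    capture_frame():  returns one captured image (2-D array) per divided exposure
    reset_position(): moves the lens or image sensor back to the first position
                      and returns the (dx, dy) movement amount applied, in pixels
    crop_xy:   (x, y) top-left corner of the first cropped area
    crop_size: (w, h) size of every cropped area
    """
    x, y = crop_xy
    w, h = crop_size
    acc = None
    for i in range(n_splits):
        frame = np.asarray(capture_frame())
        cropped = frame[y:y + h, x:x + w].astype(np.int64)
        # Pixel-value addition of the cropped images builds the final image.
        acc = cropped if acc is None else acc + cropped
        if i < n_splits - 1:
            dx, dy = reset_position()
            # Shift the crop region opposite to the reset movement so the
            # subject stays at the same position in every cropped image.
            x, y = x - dx, y - dy
    return acc
```

Each loop iteration corresponds to one divided exposure time (S1 to S6 in FIG. 4); the accumulated array corresponds to the image generated from the cropped images.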
  • FIG. 1 is a diagram showing an example of an external perspective view of the imaging device according to this embodiment.
  • FIG. 2 is a diagram showing functional blocks of the imaging device according to this embodiment.
  • FIG. 3 is a diagram showing how the shake angle of the imaging device changes with time.
  • FIG. 4 is a diagram showing an imaging flow of the imaging device.
  • FIG. 5 is a diagram showing an imaging flow of the imaging device.
  • FIG. 6 is a diagram showing an imaging flow of the imaging device.
  • Fig. 7 is a diagram showing an example of the hardware configuration.
  • the blocks may represent (1) a stage of a process of performing operations or (2) a "part" of a device that performs operations.
  • Specific stages and “parts” can be implemented by programmable circuits and/or processors.
  • Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits.
  • Programmable circuits may include reconfigurable hardware circuits.
  • Reconfigurable hardware circuits may include logic operations such as logical AND, logical OR, logical XOR, logical NAND, and logical NOR, as well as flip-flops, registers, and memory elements such as field-programmable gate arrays (FPGAs) and programmable logic arrays (PLAs).
  • the computer-readable medium may include any tangible device that can store instructions to be executed by a suitable device.
  • the computer-readable medium on which instructions are stored includes a product that includes instructions that can be executed to create means for performing operations specified by the flowchart or block diagram.
  • the computer-readable medium may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like.
  • the computer-readable medium may include a floppy disk (registered trademark), hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM) or flash memory, electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile disc (DVD), Blu-ray (RTM) disc, memory stick, integrated circuit card, and the like.
  • the computer-readable instructions may include any one of source code or object code described in any combination of one or more programming languages.
  • The source code or object code may be written in a conventional procedural programming language, including assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, and state-setting data, or in an object-oriented language such as Smalltalk, JAVA (registered trademark), or C++.
  • the computer-readable instructions may be provided to a processor or programmable circuit of a general-purpose computer, special-purpose computer, or other programmable data processing device, either locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet.
  • the processor or programmable circuit can execute computer-readable instructions to create means for performing the operations specified in the flowchart or block diagram.
  • Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and so on.
  • FIG. 1 is a diagram showing an example of an external perspective view of an imaging device 100 according to this embodiment.
  • FIG. 2 is a diagram showing functional blocks of the imaging device 100 according to this embodiment.
  • the imaging device 100 includes an imaging unit 102 and a lens unit 200.
  • the imaging unit 102 includes an image sensor 120, an imaging control unit 110, and a memory 130.
  • the image sensor 120 may be a CCD or CMOS sensor.
  • the image sensor 120 outputs image data of the optical image formed by the zoom lens 211 and the focus lens 210 to the imaging control unit 110.
  • the imaging control unit 110 may be constituted by a microprocessor such as a CPU or an MPU, a microcontroller such as an MCU, or the like.
  • the memory 130 may be a computer-readable recording medium, and may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory.
  • the memory 130 stores programs and the like necessary for the imaging control unit 110 to control the image sensor 120 and the like.
  • the memory 130 may be provided inside the housing of the imaging device 100.
  • the memory 130 may be configured to be detachable from the housing of the imaging device 100.
  • the imaging unit 102 may further include an indication unit 162 and a display unit 160.
  • the instruction unit 162 is a user interface that accepts instructions to the imaging device 100 from the user.
  • the display unit 160 displays images captured by the image sensor 120, various setting information of the imaging device 100, and the like.
  • the display part 160 may be composed of a touch panel.
  • the lens unit 200 includes a focus lens 210, a zoom lens 211, a lens drive unit 212, a lens drive unit 213, and a lens control unit 220.
  • the focus lens 210 and the zoom lens 211 may include at least one lens. At least a part or all of the focus lens 210 and the zoom lens 211 are configured to be movable along the optical axis.
  • the lens unit 200 may be an interchangeable lens provided to be detachable from the imaging unit 102.
  • the lens driving unit 212 moves at least a part or all of the focus lens 210 along the optical axis via mechanical components such as a cam ring and a guide shaft.
  • the lens driving unit 213 moves at least a part or all of the zoom lens 211 along the optical axis via a mechanism member such as a cam ring and a guide shaft.
  • the lens control section 220 drives at least one of the lens drive section 212 and the lens drive section 213 in accordance with a lens control instruction from the imaging section 102, and moves at least one of the focus lens 210 and the zoom lens 211 along the optical axis direction via the mechanism members to perform at least one of a zoom operation and a focus operation.
  • the lens control commands are, for example, zoom control commands and focus control commands.
  • the lens unit 200 further includes a memory 240, a position sensor 214, and a position sensor 215.
  • the memory 240 stores the control values of the focus lens 210 and the zoom lens 211 driven via the lens drive unit 212 and the lens drive unit 213.
  • the memory 240 may include at least one of SRAM, DRAM, EPROM, EEPROM, and flash memory such as USB memory.
  • the position sensor 214 detects the position of the focus lens 210.
  • the position sensor 214 can detect the current focus position.
  • the position sensor 215 detects the position of the zoom lens 211.
  • the position sensor 215 can detect the current zoom position of the zoom lens 211.
  • the lens part 200 includes an optical image stabilization mechanism (OIS). More specifically, the lens section 200 includes a lens 231 for image stabilization, a lens driving section 233, a position sensor 235, and a vibration sensor 250.
  • the vibration sensor 250 may be a gyro sensor that detects the vibration of the imaging device 100.
  • the vibration sensor 250 may be an acceleration sensor that detects the vibration of the imaging device 100.
  • the gyro sensor detects, for example, angular jitter and rotational jitter.
  • the acceleration sensor detects displacement jitter in the X direction or the Y direction, for example.
  • the gyro sensor can also convert angular jitter and rotational jitter into an X-direction vector or a Y-direction vector.
  • the acceleration sensor can also convert displacement jitter in the X or Y direction into angular jitter and rotational jitter.
  • the vibration sensor 250 may be a combination of an acceleration sensor and a gyro sensor.
  • the lens driving section 233 moves the lens 231 in a direction intersecting the optical axis.
  • the lens driving part 233 can move the lens 231 in a direction perpendicular to the optical axis.
  • the lens driving part 233 may include a voice coil motor.
  • the position sensor 235 detects the position of the lens 231.
  • the position sensor 235 can detect the position of the lens 231 in the direction perpendicular to the optical axis.
  • the position sensor 235 may output the position of the lens 231 in the direction perpendicular to the optical axis as a vibration signal indicating the vibration of the lens 231.
  • the lens section 200 is an example of an image stabilization device.
  • the lens control unit 220 acquires a vibration signal indicating vibration from the vibration sensor 250, and performs image stabilization by causing the lens driving unit 233 to move the lens 231 in a direction intersecting the optical axis based on the vibration signal.
  • the image sensor 120 captures images imaged via the zoom lens 211, the focus lens 210, and the lens 231.
  • the imaging unit 102 also includes a body image stabilization mechanism (BIS). More specifically, the imaging unit 102 further includes an image sensor driving unit 150 and a position sensor 152.
  • the image sensor driving part 150 moves the image sensor 120 in a direction intersecting the optical axis.
  • the image sensor driving part 150 moves the image sensor 120 in a direction perpendicular to the optical axis.
  • the image sensor driving part 150 may include a voice coil motor.
  • the position sensor 152 detects the position of the image sensor 120.
  • the position sensor 152 can detect the position of the image sensor 120 in a direction perpendicular to the optical axis.
  • the position sensor 152 may output the position of the image sensor 120 in the direction perpendicular to the optical axis as a vibration signal indicating the vibration of the image sensor 120.
  • the imaging control unit 110 acquires a vibration signal indicating vibration from the vibration sensor 250, and performs image stabilization by moving the image sensor 120 in a direction intersecting the optical axis via the image sensor driving unit 150.
  • the imaging device 100 may include at least one of OIS and BIS.
  • the image sensor driving section 150 or the lens driving section 233 is an example of a moving mechanism.
  • since the imaging device 100 includes OIS or BIS, image stabilization is performed by moving the lens 231 or the image sensor 120 in a direction that cancels the shake of the imaging device 100.
  • the movable range of the lens 231 or the image sensor 120 has physical limitations. Therefore, if the shake of the imaging device 100 is large, the imaging device 100 may not be able to perform image stabilization sufficiently.
  • when photographing a dark subject such as stars, the exposure time tends to be relatively long.
  • when the user holds the imaging device 100 by hand to shoot, if the exposure time is long, the shake of the imaging device 100 becomes large, and the imaging device 100 may not be able to sufficiently perform image stabilization.
  • FIG. 3 shows how the shake angle of the imaging device 100 changes with time.
  • the exposure time is relatively long, such as 12 seconds.
  • the shake angle of the imaging device 100 may exceed the shake angle indicating the limit of the movable range 500 of the lens 231 or the image sensor 120.
  • the position of the subject 520 such as a star on the captured image 510 captured by the imaging apparatus 100 shifts within the exposure time, and the subject 520 in the captured image 510 is blurred.
  • the imaging control unit 110 divides the desired exposure time required for shooting the desired subject into a plurality of exposure times and, for each divided exposure time (S1 to S6), causes the lens driving section 233 or the image sensor driving section 150 to perform a reset operation that moves the lens 231 or the image sensor 120 to a first predetermined position within the movable range.
  • the first position may be, for example, the center position of the movable range of the lens 231 or the image sensor 120.
  • the imaging control unit 110 acquires the captured images (600-1 to 600-6) output from the image sensor 120 for each divided exposure time (S1 to S6).
  • the imaging control unit 110 generates an image of a desired exposure time based on the captured images (600-1 to 600-6) output from the image sensor 120.
  • the imaging control unit 110 can align the captured images (600-1 to 600-6) output from the image sensor 120 and add the pixel values of each pixel of the captured images (600-1 to 600-6) to generate an image of the desired exposure time.
  • the image of the desired exposure time is equivalent to one image taken with the total exposure time obtained by adding the divided exposure time.
  • the imaging control unit 110 obtains cropped images by cropping regions (610-1 to 610-6) of a predetermined size from each of the captured images (600-1 to 600-6).
  • the imaging control unit 110 adjusts the position of the cropping area (610-1 to 610-6) for each divided exposure time.
  • the imaging control unit 110 moves the lens 231 or the image sensor 120 to the first predetermined position within the movable range by, for example, moving the lens 231 or the image sensor 120 in a first direction by a first movement amount via the image sensor driving unit 150 or the lens driving unit 233. Accordingly, the imaging control unit 110 moves the cropped region in the captured image in a second direction opposite to the first direction by an amount equivalent to the first movement amount.
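A minimal sketch of this crop-region compensation (the function name is illustrative, and movement amounts are assumed to be expressed in pixels):

```python
def shift_crop_region(region, movement):
    """Translate a crop region opposite to the stabilization reset movement.

    region:   (x, y, width, height) of the current cropped area, in pixels
    movement: (dx, dy) movement amount applied to the lens/sensor, in pixels
    """
    x, y, w, h = region
    dx, dy = movement
    # Move by an equivalent amount in the opposite (second) direction.
    return (x - dx, y - dy, w, h)
```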
  • the imaging control section 110 generates an image 620 of a desired exposure time based on the respective cropped images acquired for each divided exposure time.
  • the imaging control unit 110 generates an image 620 of a desired exposure time by adding the pixel value of each pixel of each cropped image acquired for each divided exposure time.
  • at the start of each divided exposure time, the lens 231 or the image sensor 120 is moved to the first position. Therefore, the movement range of the lens 231 or the image sensor 120 during the divided exposure time for shake correction can be kept within the movable range.
  • the imaging control section 110 can reliably perform shake correction at each divided exposure time.
  • the imaging control section 110 moves the cropping area at each divided exposure time so that the desired subject exists in a specific position in the cropping area.
  • the imaging control section 110 generates one image by adding the pixel value of each pixel of the plurality of cropped images. As a result, an image equivalent to an image captured with a desired exposure time can be obtained.
  • the imaging control section 110 causes the image sensor 120 to output the captured image 600-1 at the first time point (for example, 2[S] in FIG. 4, where S denotes seconds) within the first exposure time during which the image sensor driving section 150 or the lens driving section 233 performs shake correction.
  • the imaging control unit 110 crops the first cropped image from the cropped area 610-1 in the captured image 600-1.
  • the imaging control part 110 may control the image sensor driving part 150 or the lens driving part 233 to move the lens 231 or the image sensor 120 in the first direction by the first movement amount at the first time point (for example, 2[S] in FIG. 4), thereby moving the lens 231 or the image sensor 120 to the first predetermined position within the movable range.
  • the imaging control section 110 causes the image sensor 120 to output the captured image 600-2 at a second time point (4[S]) after the first time point (2[S]) within the first exposure time.
  • the imaging control unit 110 moves the cropping area 610-1 in a second direction opposite to the first direction by an amount equivalent to the first movement amount. Then, the imaging control unit 110 crops the second cropped image from the moved cropped area 610-2 in the captured image 600-2.
  • the imaging control part 110 may control the image sensor driving part 150 or the lens driving part 233 to move the lens 231 or the image sensor 120 in the third direction by the second movement amount, thereby moving the lens 231 or the image sensor 120 to the first position within the movable range.
  • the imaging control section 110 causes the image sensor 120 to output the captured image 600-3 at a third time point (6[S]) after the second time point (4[S]) within the first exposure time.
  • the imaging control unit 110 moves the cropping area 610-2 in a fourth direction opposite to the third direction by an amount equivalent to the second movement amount.
  • the imaging control unit 110 crops the third cropped image from the moved cropped area 610-3 in the captured image 600-3.
  • the imaging control section 110 causes the image sensor 120 to output the captured images 600-4 to 600-6 every predetermined time period (for example, 2[S]), and returns the lens 231 or the image sensor 120 to the first position.
  • the imaging control unit 110 crops the fourth cropped image, the fifth cropped image, and the sixth cropped image from the respective cropped regions 610-4 to 610-6 of the captured images 600-4 to 600-6.
  • the imaging control unit 110 generates an image 620 by performing pixel addition on the first cropped image to the sixth cropped image.
  • in this way, the shake correction can be sufficiently performed even when the total exposure time is long.
  • the imaging control section 110 may set the total exposure time based on the brightness of the subject.
  • the imaging control section 110 may set the total exposure time based on the ISO sensitivity and the F value (aperture value).
  • the imaging control section 110 may set the total exposure time based on the level of the star as the photographic subject, for example.
  • the imaging control unit 110 may set the total exposure time according to the star level, so that the total exposure time when shooting stars of a second level dimmer than a first level is longer than the total exposure time when shooting stars of the first level.
  • the imaging control unit 110 may set the total exposure time so that the total exposure time when shooting stars up to the sixth level is longer than the total exposure time when shooting stars up to the third level.
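One plausible realization of this level-dependent setting, assuming star levels are stellar magnitudes (larger numbers denote dimmer stars) and an illustrative doubling rule that the patent does not specify:

```python
def total_exposure_for_magnitude(limit_mag):
    """Return a total exposure time (seconds) for a limiting star magnitude.

    Larger magnitude numbers denote dimmer stars, so the exposure grows
    monotonically with limit_mag; the base value and doubling rule here
    are illustrative assumptions.
    """
    base = 4.0  # seconds at magnitude 1 (illustrative)
    return base * (2.0 ** (limit_mag - 1))
```

Any monotonically increasing mapping satisfies the stated requirement that shooting up to the sixth level uses a longer total exposure than shooting up to the third level.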
  • the duration of each exposure time divided from the total exposure time may be variable. As shown in FIG. 5, when the lens 231 or the image sensor 120 moves beyond the first movement range 501 within the movable range 500, the imaging control unit 110 may determine that moment as the time point at which the lens 231 or the image sensor 120 is moved to the first position. Alternatively, when the lens 231 or the image sensor 120 reaches a limit position determined based on the movable range, the imaging control section 110 may determine that moment as the time point at which the lens 231 or the image sensor 120 is moved to the first position.
  • at that time point, the imaging control section 110 may cause the image sensor 120 to output a captured image and move the lens 231 or the image sensor 120 to the first position.
  • the imaging control unit 110 adjusts the timing of moving the lens 231 or the image sensor 120 to the first position within the total exposure time based on the S/N ratio of the captured image taken within the divided exposure time.
  • the imaging control section 110 may set the shortest time of the divided exposure time so that the S/N ratio of the captured image is equal to or greater than the threshold value.
  • the S/N ratio of the captured image depends on the pixel size of the image sensor 120. Therefore, the imaging control section 110 may set the shortest time among the divided exposure times based on the pixel size of the image sensor 120.
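Under a shot-noise-limited model (an assumption; the patent does not specify a noise model), the S/N ratio of a divided capture grows as the square root of the collected signal, so a shortest divided exposure satisfying a target S/N can be sketched as:

```python
def min_divided_exposure(flux, pixel_area, snr_threshold):
    """Shortest divided exposure so that shot-noise-limited S/N meets the threshold.

    flux:          photo-electrons per second per unit sensor area (assumed known)
    pixel_area:    pixel size of the image sensor, in area units matching flux
    snr_threshold: required S/N ratio of each divided capture
    Model: SNR = sqrt(flux * pixel_area * t)  =>  t >= SNR^2 / (flux * pixel_area)
    """
    return snr_threshold ** 2 / (flux * pixel_area)
```

Larger pixels collect more signal per unit time, which is consistent with the statement that the S/N ratio depends on the pixel size of the image sensor 120.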
  • the imaging control section 110 may set the timing of moving the lens 231 or the image sensor 120 to the first position so that the divided exposure time is equal to or greater than the shortest time based on the S/N ratio.
  • the imaging control section 110 may adjust the timing of the second time point after the first time point based on the first captured image at the first time point captured with the divided exposure time.
  • the imaging control unit 110 may adjust the timing of the second time point after the first time point based on whether the first captured image at the first time point, captured with the divided exposure time, is in a saturated state. For example, when the imaging device 100 photographs the night sky including the moon and stars, the moon may be too bright, and the moon area in the captured image taken with the divided exposure time may be in a saturated state. Therefore, as shown in FIG. 6, when the first captured image 600-1 is in a saturated state, the imaging control unit 110 can adjust the timing of the second time point so that the time from the first time point to the second time point (S2), that is, the divided exposure time, is shorter than the divided exposure time when the first captured image 600-1 is not in a saturated state.
  • the saturated state of the captured image refers to a state in which the pixel value or brightness of pixels in an area of the captured image whose size is equal to or greater than a predetermined size is equal to or greater than a threshold value.
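The saturation test described above can be sketched with NumPy; counting pixels at or above a value threshold is a simplification of the "area of a predetermined size" condition, and both threshold values are placeholders:

```python
import numpy as np

def is_saturated(image, value_threshold=250, area_threshold=64):
    """Return True if the captured image is in a saturated state, approximated
    here as: at least area_threshold pixels have a pixel value at or above
    value_threshold."""
    return int(np.count_nonzero(np.asarray(image) >= value_threshold)) >= area_threshold
```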
  • the imaging apparatus 100 may operate in a first imaging mode in which a subject having a first brightness is captured, and in a second imaging mode in which a subject having a second brightness brighter than the first brightness is captured.
  • the imaging apparatus 100 may operate in a first imaging mode in which stars of a first level or higher are photographed, and in a second imaging mode in which stars of a second level or higher lower than the first level are photographed.
  • the imaging device 100 may operate in an imaging mode corresponding to the star level of the subject.
  • when the imaging device 100 operates in the first imaging mode, the imaging control section 110 sets the first exposure time to the first time period; when the imaging device 100 operates in the second imaging mode, the imaging control section 110 may set the first exposure time to a second time period shorter than the first time period.
  • the imaging control part 110 may acquire a plurality of cropped images from a plurality of captured images taken within the first exposure time, and select cropped images that satisfy a predetermined condition from the plurality of cropped images.
  • the imaging control section 110 may generate an image of the first exposure time based on the selected cropped image.
  • the predetermined condition may be a condition based on image quality.
  • the predetermined condition may be a condition based on the S/N ratio of the image.
  • the imaging control section 110 may select a cropped image having an S/N ratio equal to or greater than a threshold value from a plurality of cropped images.
  • the imaging control section 110 may extend the first exposure time by an amount of time equivalent to the exposure time corresponding to the cropped image that is not selected.
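The selection-and-extension logic in the bullets above can be sketched as follows. The S/N estimate (mean divided by standard deviation), the threshold, and the helper name are illustrative assumptions; the patent does not specify how the S/N ratio is computed.

```python
import numpy as np

def select_crops(crops, snr_threshold=20.0, sub_exposure_s=2.0):
    """Keep only crops whose estimated S/N is at or above the threshold,
    and report how much extra exposure is needed to replace the rejects."""
    selected, rejected = [], 0
    for crop in crops:
        noise = crop.std()
        snr = crop.mean() / noise if noise > 0 else float("inf")
        if snr >= snr_threshold:
            selected.append(crop)
        else:
            rejected += 1
    # Extend the first exposure time by the exposure the rejects would
    # have contributed, as described above.
    extension_s = rejected * sub_exposure_s
    return selected, extension_s

good = np.full((2, 2), 100.0)                  # noiseless crop -> kept
bad = np.array([[0.0, 200.0], [0.0, 200.0]])   # S/N = 1 -> rejected
print(select_crops([good, bad])[1])            # -> 2.0
```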
  • the imaging control section 110 may determine whether to perform pixel addition for each pixel in the plurality of cropped images.
  • the imaging control unit 110 trims an image of a first region whose brightness is equal to or greater than a threshold value from one cropped image to obtain a first partial image equivalent to the first region.
  • the imaging control unit 110 adds the pixel values of the respective pixels in the second area excluding the first area of each cropped image to obtain a second partial image corresponding to the second area.
  • the imaging control unit 110 may synthesize the first partial image of the first area and the second partial image of the second area to generate one image including the first area and the second area.
  • the imaging control section 110 trims the image of the first region including the moon from one cropped image to obtain a first partial image equivalent to the first region.
  • the imaging control section 110 adds the pixel values of each pixel in the second area including the stars other than the first area of each cropped image to obtain a second partial image corresponding to the second area.
  • the imaging control unit 110 may synthesize the first partial image of the first region including the moon and the second partial image of the second region including stars other than the moon, and generate one image including the first region and the second region.
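A minimal sketch of this region-wise synthesis, assuming a brightness threshold picks out the first (moon-like) region, that the first partial image is taken from a single crop, and that all crops are already aligned. The function name and threshold are illustrative assumptions.

```python
import numpy as np

def composite(crops, bright_threshold=200):
    """Pixels that are bright in one crop (e.g. the moon) are taken from
    that single crop; the remaining pixels (e.g. stars) are pixel-added
    across all crops, then the two partial images are combined."""
    reference = crops[0].astype(np.float32)
    first_region = reference >= bright_threshold              # bright area
    added = np.sum([c.astype(np.float32) for c in crops], axis=0)
    return np.where(first_region, reference, added)           # combine

a = np.array([[250, 10], [10, 10]], dtype=np.uint8)
b = np.array([[240, 10], [10, 10]], dtype=np.uint8)
out = composite([a, b])
print(out[0, 0], out[0, 1])  # -> 250.0 20.0  (moon kept once, stars added)
```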
  • FIG. 7 shows an example of a computer 1200 that may fully or partially embody various aspects of the present invention.
  • the program installed on the computer 1200 can cause the computer 1200 to function as one or more "parts" of the device according to the embodiment of the present invention, or to perform operations associated with that device. Alternatively, the program can cause the computer 1200 to perform those operations or the functions of the one or more "parts".
  • This program enables the computer 1200 to execute the process or stages of the process involved in the embodiment of the present invention.
  • Such a program may be executed by the CPU 1212, so that the computer 1200 executes designated operations associated with some or all of the blocks in the flowcharts and block diagrams described in this specification.
  • the computer 1200 of this embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other through a host controller 1210.
  • the computer 1200 further includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 through an input/output controller 1220.
  • the computer 1200 also includes a ROM 1230.
  • the CPU 1212 operates in accordance with programs stored in the ROM 1230 and RAM 1214 to control each unit.
  • the communication interface 1222 communicates with other electronic devices through a network.
  • the hard disk drive can store programs and data used by the CPU 1212 in the computer 1200.
  • the ROM 1230 stores therein a boot program executed by the computer 1200 during operation, etc., and/or programs that depend on the hardware of the computer 1200.
  • the program is provided via a computer-readable recording medium such as a CD-ROM, USB memory, or IC card, or via a network.
  • the program is installed in RAM 1214 or ROM 1230 which is also an example of a computer-readable recording medium, and is executed by CPU 1212.
  • the information processing described in these programs is read by the computer 1200 and causes cooperation between the programs and the various types of hardware resources described above.
  • the apparatus or method can be constituted by realizing operation or processing of information according to the use of the computer 1200.
  • the CPU 1212 can execute a communication program loaded in the RAM 1214, and based on the processing described in the communication program, instructs the communication interface 1222 to perform communication processing.
  • the communication interface 1222, under the control of the CPU 1212, reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and sends the read transmission data to the network, or writes reception data received from the network into a reception buffer provided in the recording medium.
  • the CPU 1212 can cause the RAM 1214 to read all or a necessary part of a file or database stored in an external recording medium such as a USB memory, and perform various types of processing on the data in the RAM 1214. The CPU 1212 can then write the processed data back to the external recording medium.
  • on data read from the RAM 1214, the CPU 1212 can perform various types of processing described throughout this disclosure and specified by the instruction sequences of programs, including various operations, information processing, condition determination, conditional branching, unconditional branching, and information search/replacement, and write the results back to the RAM 1214.
  • the CPU 1212 can also search for information in files, databases, and the like in the recording medium. For example, when multiple entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 can retrieve from these entries an entry matching the condition that specifies the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby obtain the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
  • the programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200.
  • a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium so that the program can be provided to the computer 1200 via the network.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Exposure Control For Cameras (AREA)
  • Adjustment Of Camera Lenses (AREA)

Abstract

A control device controls an imaging device (100) that includes: an optical system; an image sensor (120); and a movement mechanism (150/233) that performs shake correction by moving the optical system or the image sensor (120) within a predetermined movable range in a direction intersecting the optical axis of the optical system. The control device may include a circuit configured to: at a first time point within a first exposure time during which the movement mechanism (150/233) performs the shake correction, cause the image sensor (120) to output a first captured image, and control the movement mechanism (150/233) to move the optical system or the image sensor (120) by a first movement amount in a first direction so as to move the optical system or the image sensor (120) to a first predetermined position within the movable range; cause the image sensor (120) to output a second captured image at a second time point after the first time point within the first exposure time; and generate an image based on the first captured image and the second captured image. This addresses the problem that shake correction may not be performed sufficiently when the exposure time is long.

Description

Control device, imaging device, control method, and program
This application claims priority to Japanese patent application No. JP2019-200783, filed on 2019-11-05, the content of which is incorporated herein by reference.
Technical Field
The present invention relates to a control device, an imaging device, a control method, and a program.
Background Art
Patent Document 1 discloses removing moiré by moving the imaging element while an image signal for one frame is generated, or by moving the imaging element in the interval between generating image signals for four frames.
[Prior Art Documents]
[Patent Documents]
[Patent Document 1] Japanese Patent Application Publication No. 2008-35241
Summary of the Invention
Technical Problem to Be Solved
When shake correction is performed by moving the optical system or the image sensor in a direction intersecting the optical axis, a long exposure time may make it impossible to correct shake sufficiently by moving the optical system or the image sensor only within its physically movable range.
Technical Solution
A control device according to one aspect of the present invention may be a control device that controls an imaging device including: an optical system; an image sensor; and a movement mechanism that performs shake correction by moving the optical system or the image sensor within a predetermined movable range in a direction intersecting the optical axis of the optical system. The control device may include a circuit configured to: cause the image sensor to output a first captured image at a first time point within a first exposure time during which the movement mechanism performs the shake correction, and control the movement mechanism to move the optical system or the image sensor by a first movement amount in a first direction so as to move the optical system or the image sensor to a first predetermined position within the movable range. The circuit may be configured to cause the image sensor to output a second captured image at a second time point after the first time point within the first exposure time. The circuit may be configured to generate an image based on the first captured image and the second captured image.
The circuit may be configured to: crop a first cropped image from a first crop region within the first captured image; crop a second cropped image from a second crop region within the second captured image, the second crop region being obtained by moving the first crop region by an amount equivalent to the first movement amount in a second direction opposite to the first direction; and generate the image based on the first cropped image and the second cropped image.
The circuit may be configured to: at the second time point within the first exposure time, control the movement mechanism to move the optical system or the image sensor by a second movement amount in a third direction so as to move the optical system or the image sensor to the first position within the movable range. The circuit may be configured to: at a third time point after the second time point within the first exposure time, cause the image sensor to output a third captured image, and crop a third cropped image from a third crop region within the third captured image, the third crop region being obtained by moving the second crop region by an amount equivalent to the second movement amount in a fourth direction opposite to the third direction. The circuit may be configured to generate the image further based on the third cropped image.
The circuit may be configured to: when the optical system or the image sensor moves beyond a first movement range within the movable range, determine that it is the first time point, and cause the image sensor to output the first captured image.
The circuit may be configured to adjust the timing of the second time point based on the first captured image.
The circuit may be configured to adjust the timing of the second time point such that, when the first captured image is in a saturated state, the time from the first time point to the second time point is shorter than when the first captured image is not in a saturated state.
The imaging device may operate in a first imaging mode for photographing a subject having a first brightness, and in a second imaging mode for photographing a subject having a second brightness brighter than the first brightness. The circuit may be configured to set the first exposure time to a first time period when the imaging device operates in the first imaging mode, and to set the first exposure time to a second time period shorter than the first time period when the imaging device operates in the second imaging mode.
The circuit may be configured to: acquire, within the first exposure time, a plurality of cropped images including the first cropped image and the second cropped image; select a cropped image satisfying a predetermined condition from the plurality of cropped images; and generate the image based on the selected cropped image.
The circuit may be configured to generate the image by adding the pixel value of each pixel of the first cropped image to the pixel value of the corresponding pixel of the second cropped image.
An imaging device according to one aspect of the present invention may include the above control device, the optical system, the image sensor, and the movement mechanism.
A control method according to one aspect of the present invention may be a control method for controlling an imaging device including: an optical system; an image sensor; and a movement mechanism that performs shake correction, based on vibration information indicating shake vibration, by moving the optical system or the image sensor within a predetermined movable range in a direction intersecting the optical axis of the optical system. The control method may include: causing the image sensor to output a first captured image at a first time point within a first exposure time during which the movement mechanism performs the shake correction, and controlling the movement mechanism to move the optical system or the image sensor by a first movement amount in a first direction so as to move the optical system or the image sensor to a first predetermined position within the movable range. The control method may include the step of causing the image sensor to output a second captured image at a second time point after the first time point within the first exposure time. The control method may include the step of generating an image based on the first captured image and the second captured image.
A program according to one aspect of the present invention may be a program for causing a computer to function as the above control device.
According to one aspect of the present invention, even when the exposure time is long, shake correction can be performed sufficiently by moving the optical system or the image sensor within its physically movable range in a direction intersecting the optical axis.
The above summary does not enumerate all of the necessary features of the present invention. Sub-combinations of these feature groups may also constitute inventions.
Brief Description of the Drawings
FIG. 1 shows an example of an external perspective view of the imaging device according to the present embodiment.
FIG. 2 shows the functional blocks of the imaging device according to the present embodiment.
FIG. 3 shows how the shake angle of the imaging device changes over time.
FIG. 4 illustrates the imaging sequence of the imaging device.
FIG. 5 illustrates the imaging sequence of the imaging device.
FIG. 6 illustrates the imaging sequence of the imaging device.
FIG. 7 shows an example of a hardware configuration.
Detailed Description
Hereinafter, the present invention will be described through embodiments of the invention, but the following embodiments do not limit the invention according to the claims. Moreover, not all combinations of features described in the embodiments are necessarily essential to the solution of the invention. It is apparent to those skilled in the art that various changes or improvements can be made to the following embodiments. It is apparent from the claims that embodiments with such changes or improvements are also included within the technical scope of the present invention.
The claims, specification, drawings, and abstract contain matters subject to copyright protection. The copyright holder will not object to anyone reproducing these documents as they appear in the files or records of the Patent Office. In all other cases, however, all copyrights are reserved.
Various embodiments of the present invention may be described with reference to flowcharts and block diagrams, where a block may represent (1) a stage of a process in which an operation is performed or (2) a "part" of a device having the role of performing the operation. Particular stages and "parts" may be implemented by programmable circuits and/or processors. Dedicated circuits may include digital and/or analog hardware circuits, and may include integrated circuits (ICs) and/or discrete circuits. Programmable circuits may include reconfigurable hardware circuits. Reconfigurable hardware circuits may include logical AND, logical OR, logical XOR, logical NAND, logical NOR, and other logic operations, as well as memory elements such as flip-flops, registers, field programmable gate arrays (FPGA), and programmable logic arrays (PLA).
A computer-readable medium may include any tangible device capable of storing instructions to be executed by a suitable device. As a result, a computer-readable medium having instructions stored thereon constitutes a product that includes instructions that can be executed to create means for performing the operations specified in the flowcharts or block diagrams. Examples of computer-readable media may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like. More specific examples of computer-readable media may include floppy (registered trademark) disks, diskettes, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), compact disc read-only memory (CD-ROM), digital versatile disc (DVD), Blu-ray (RTM) disc, memory stick, integrated circuit card, and the like.
Computer-readable instructions may include either source code or object code described in any combination of one or more programming languages. The source code or object code includes conventional procedural programming languages. Conventional procedural programming languages may be assembler instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or object-oriented programming languages such as Smalltalk, JAVA (registered trademark), and C++, as well as the "C" programming language or similar programming languages. Computer-readable instructions may be provided to a processor or programmable circuit of a general-purpose computer, special-purpose computer, or other programmable data processing device, locally or via a local area network (LAN) or a wide area network (WAN) such as the Internet. The processor or programmable circuit may execute the computer-readable instructions to create means for performing the operations specified in the flowcharts or block diagrams. Examples of processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, and the like.
FIG. 1 shows an example of an external perspective view of the imaging device 100 according to the present embodiment. FIG. 2 shows the functional blocks of the imaging device 100 according to the present embodiment.
The imaging device 100 includes an imaging unit 102 and a lens unit 200. The imaging unit 102 includes an image sensor 120, an imaging control unit 110, and a memory 130. The image sensor 120 may be a CCD or CMOS sensor. The image sensor 120 outputs image data of the optical image formed through the zoom lens 211 and the focus lens 210 to the imaging control unit 110. The imaging control unit 110 may be a microprocessor such as a CPU or MPU, a microcontroller such as an MCU, or the like. The memory 130 may be a computer-readable recording medium and may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, and USB memory. The memory 130 stores the programs and the like that the imaging control unit 110 needs to control the image sensor 120 and other units. The memory 130 may be provided inside the housing of the imaging device 100, or may be provided so as to be removable from the housing of the imaging device 100.
The imaging unit 102 may further include an instruction unit 162 and a display unit 160. The instruction unit 162 is a user interface that receives instructions for the imaging device 100 from the user. The display unit 160 displays images captured by the image sensor 120, various setting information of the imaging device 100, and the like. The display unit 160 may be a touch panel.
The lens unit 200 includes a focus lens 210, a zoom lens 211, lens drive units 212 and 213, and a lens control unit 220. The focus lens 210 and the zoom lens 211 may each include at least one lens. At least part or all of the focus lens 210 and the zoom lens 211 is configured to be movable along the optical axis. The lens unit 200 may be an interchangeable lens provided so as to be attachable to and detachable from the imaging unit 102. The lens drive unit 212 moves at least part or all of the focus lens 210 along the optical axis via mechanical members such as a cam ring and guide shafts. The lens drive unit 213 moves at least part or all of the zoom lens 211 along the optical axis via mechanical members such as a cam ring and guide shafts. The lens control unit 220 drives at least one of the lens drive units 212 and 213 in accordance with lens control commands from the imaging unit 102, and moves at least one of the focus lens 210 and the zoom lens 211 along the optical axis direction via the mechanical members to perform at least one of a zoom operation and a focus operation. The lens control commands are, for example, zoom control commands and focus control commands.
The lens unit 200 further includes a memory 240 and position sensors 214 and 215. The memory 240 stores control values of the focus lens 210 and the zoom lens 211 driven via the lens drive units 212 and 213. The memory 240 may include at least one of flash memories such as SRAM, DRAM, EPROM, EEPROM, and USB memory. The position sensor 214 detects the position of the focus lens 210 and may detect the current focus position. The position sensor 215 detects the position of the zoom lens 211 and may detect the current zoom position of the zoom lens 211.
The lens unit 200 includes an optical image stabilization (OIS) mechanism. More specifically, the lens unit 200 includes a lens 231 for image stabilization, a lens drive unit 233, a position sensor 235, and a vibration sensor 250. The vibration sensor 250 may be a gyro sensor that detects vibration of the imaging device 100, or an acceleration sensor that detects vibration of the imaging device 100. The gyro sensor detects, for example, angular shake and rotational shake. The acceleration sensor detects, for example, displacement shake in the X or Y direction. The gyro sensor may also convert angle and rotation into an X-direction or Y-direction vector. The acceleration sensor may also convert displacement shake in the X or Y direction into angular shake and rotational shake. The vibration sensor 250 may be a combination of an acceleration sensor and a gyro sensor. The lens drive unit 233 moves the lens 231 in a direction intersecting the optical axis, and may move the lens 231 in a direction perpendicular to the optical axis. The lens drive unit 233 may include a voice coil motor.
The position sensor 235 detects the position of the lens 231, and may detect the position of the lens 231 in the direction perpendicular to the optical axis. The position sensor 235 may output the position of the lens 231 in the direction perpendicular to the optical axis as a vibration signal indicating the vibration of the lens 231.
The lens unit 200 is an example of an image stabilization device. The lens control unit 220 acquires a vibration signal indicating vibration from the vibration sensor 250, and based on that vibration signal performs image stabilization by vibrating the lens 231 in a direction intersecting the optical axis via the lens drive unit 233. The image sensor 120 captures the image formed through the zoom lens 211, the focus lens 210, and the lens 231.
The imaging unit 102 further includes a body image stabilization (BIS) mechanism. More specifically, the imaging unit 102 further includes an image sensor drive unit 150 and a position sensor 152. The image sensor drive unit 150 moves the image sensor 120 in a direction intersecting the optical axis, for example perpendicular to the optical axis, and may include a voice coil motor. The position sensor 152 detects the position of the image sensor 120, and may detect its position in the direction perpendicular to the optical axis. The position sensor 152 may output the position of the image sensor 120 in the direction perpendicular to the optical axis as a vibration signal indicating the vibration of the image sensor 120. The imaging control unit 110 acquires a vibration signal indicating vibration from the vibration sensor 250, and based on that vibration signal performs image stabilization by vibrating the image sensor 120 in a direction intersecting the optical axis via the image sensor drive unit 150.
The imaging device 100 only needs to include at least one of OIS and BIS. The image sensor drive unit 150 or the lens drive unit 233 is an example of the movement mechanism.
As described above, since the imaging device 100 includes OIS or BIS, image stabilization is performed by moving the lens 231 or the image sensor 120 in the direction that cancels the shake of the imaging device 100. However, the movable range of the lens 231 or the image sensor 120 is physically limited. Therefore, if the shake of the imaging device 100 is large, the imaging device 100 may not be able to stabilize the image sufficiently.
For example, when photographing a subject in a dark place, such as stars, the exposure time tends to be relatively long. When the user holds the imaging device 100 by hand, a long exposure time means larger shake of the imaging device 100, and the imaging device 100 may not be able to stabilize the image sufficiently.
FIG. 3 shows how the shake angle of the imaging device 100 changes over time. When photographing subjects such as stars, the exposure time is relatively long, for example 12 seconds. When the user holds the imaging device 100 by hand to photograph stars, it is difficult to keep the imaging device 100 still during the exposure time. Therefore, during the exposure time, for example at time point T1, the shake angle of the imaging device 100 may exceed the shake angle corresponding to the limit of the movable range 500 of the lens 231 or the image sensor 120. In that case, the position of a subject 520 such as a star in the captured image 510 taken by the imaging device 100 shifts during the exposure time, and the subject 520 is blurred in the captured image 510.
Therefore, in the present embodiment, as shown in FIG. 4, the imaging control unit 110 divides the exposure time required to photograph the desired subject into multiple exposure times, and for each divided exposure time (S1 to S6) causes the lens drive unit 233 or the image sensor drive unit 150 to perform a reset operation that moves the lens 231 or the image sensor 120 to a first predetermined position within the movable range. The first position may be, for example, the center of the movable range of the lens 231 or the image sensor 120.
The imaging control unit 110 acquires, for each divided exposure time (S1 to S6), the captured image (600-1 to 600-6) output from the image sensor 120, and generates an image of the desired exposure time based on these captured images (600-1 to 600-6). The imaging control unit 110 may generate the image of the desired exposure time by aligning the captured images (600-1 to 600-6) output from the image sensor 120 and adding the pixel value of each pixel of every captured image (600-1 to 600-6). The image of the desired exposure time is equivalent to a single image taken with the total exposure time obtained by adding the divided exposure times.
The imaging control unit 110 crops a crop region (610-1 to 610-6) of a predetermined size from each captured image (600-1 to 600-6) to acquire a cropped image. That is, the imaging control unit 110 acquires a cropped image by trimming the image of the crop region (610-1 to 610-6) of the predetermined size from each captured image (600-1 to 600-6).
The imaging control unit 110 adjusts the position of the crop region (610-1 to 610-6) for each divided exposure time. For example, the imaging control unit 110 moves the lens 231 or the image sensor 120 by a first movement amount in a first direction via the image sensor drive unit 150, thereby moving the lens 231 or the image sensor 120 to the first predetermined position within the movable range. Accordingly, the imaging control unit 110 moves the crop region in the captured image by an amount equivalent to the first movement amount in a second direction opposite to the first direction.
The imaging control unit 110 generates the image 620 of the desired exposure time based on the cropped images acquired for the respective divided exposure times. The imaging control unit 110 generates the image 620 of the desired exposure time by adding the pixel value of each pixel of each cropped image acquired for each divided exposure time.
At each divided exposure time, the lens 231 or the image sensor 120 is moved to the first position. The range over which the lens 231 or the image sensor 120 must move for shake correction within one divided exposure time can therefore be kept within the movable range, and the imaging control unit 110 can reliably perform shake correction in each divided exposure time. Furthermore, the imaging control unit 110 moves the crop region for each divided exposure time so that the desired subject is at a specific position within the crop region. Thus, even though the lens 231 or the image sensor 120 is moved to the first position at each divided exposure time, the desired subject can be positioned at the same specific position in every cropped image. In addition, the imaging control unit 110 generates a single image by adding the pixel value of each pixel across the multiple cropped images, thereby obtaining an image equivalent to one taken with the desired exposure time.
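The reset-and-crop-shift mechanism described above — reset the stabilizer to the center, shift the crop window the opposite way by the same amount, then pixel-add the aligned crops — can be sketched as follows. The frame generation, shift amounts in pixels, and the omission of bounds checking are illustrative simplifications of the OIS/BIS control loop.

```python
import numpy as np

def synthesize(frames, resets, crop_size=(50, 50), origin=(25, 25)):
    """frames[i]: image captured during the i-th divided exposure.
    resets[i]: (dy, dx) the lens/sensor was moved by the reset operation
    before frame i was captured ((0, 0) for the first frame). The crop
    region is moved the opposite way by an equivalent amount, and the
    aligned crops are pixel-added to emulate one long exposure."""
    h, w = crop_size
    y, x = origin
    total = np.zeros(crop_size, dtype=np.float64)
    for frame, (dy, dx) in zip(frames, resets):
        y, x = y - dy, x - dx          # opposite direction, same amount
        total += frame[y:y + h, x:x + w]
    return total

# Two divided exposures of a flat scene add up to twice the signal.
frames = [np.ones((100, 100)), np.ones((100, 100))]
total = synthesize(frames, [(0, 0), (2, -3)])
print(total.shape, total[0, 0])  # -> (50, 50) 2.0
```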
The imaging control unit 110 causes the image sensor 120 to output the captured image 600-1 at a first time point (for example, 2[s] in FIG. 4, where s denotes seconds) within a first exposure time during which the image sensor drive unit 150 or the lens drive unit 233 performs shake correction. The imaging control unit 110 also crops a first cropped image from the crop region 610-1 within the captured image 600-1. Then, the imaging control unit 110 may control the image sensor drive unit 150 or the lens drive unit 233 to move the lens 231 or the image sensor 120 by a first movement amount in a first direction, thereby moving the lens 231 or the image sensor 120 to the first predetermined position within the movable range.
The imaging control unit 110 causes the image sensor 120 to output the captured image 600-2 at a second time point (4[s]) after the first time point (2[s]) within the first exposure time. The imaging control unit 110 also moves the crop region 610-1 by an amount equivalent to the first movement amount in a second direction opposite to the first direction. The imaging control unit 110 then crops a second cropped image from the moved crop region 610-2 within the captured image 600-2.
At the second time point (4[s]) within the first exposure time, the imaging control unit 110 may control the image sensor drive unit 150 or the lens drive unit 233 to move the lens 231 or the image sensor 120 by a second movement amount in a third direction, thereby moving the lens 231 or the image sensor 120 to the first position within the movable range. The imaging control unit 110 causes the image sensor 120 to output the captured image 600-3 at a third time point (6[s]) after the second time point (4[s]) within the first exposure time. The imaging control unit 110 moves the crop region 610-2 by an amount equivalent to the second movement amount in a fourth direction opposite to the third direction, and crops a third cropped image from the moved crop region 610-3 within the captured image 600-3. Similarly, the imaging control unit 110 causes the image sensor 120 to output the captured images 600-4 to 600-6 at every predetermined interval (for example, 2[s]) while returning the lens 231 or the image sensor 120 to the first position. The imaging control unit 110 crops a fourth, fifth, and sixth cropped image from the respective crop regions 610-4 to 610-6 of the captured images 600-4 to 600-6. The imaging control unit 110 generates one image 620 by pixel-adding the first through sixth cropped images.
Therefore, according to the present embodiment, even when the exposure time is relatively long, as when photographing stars and the like, shake correction can be performed sufficiently by moving the lens 231 or the image sensor 120 within its physically movable range in a direction intersecting the optical axis.
The imaging control unit 110 may set the total exposure time based on the brightness of the subject, and may set it based on the ISO sensitivity and the F-number (aperture value). The imaging control unit 110 may, for example, set the total exposure time based on the magnitude class of the stars to be photographed. The imaging control unit 110 may set a total exposure time corresponding to the star magnitude class such that the total exposure time for photographing stars of a second class lower than a first class is longer than the total exposure time for photographing stars of the first class. For example, the imaging control unit 110 may set the total exposure time so that photographing stars down to the sixth magnitude uses a longer total exposure time than photographing stars down to the third magnitude.
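As a numeric illustration of why a fainter magnitude limit needs a longer total exposure: each additional stellar magnitude corresponds to roughly a 2.512x drop in flux, so one simple rule scales the total exposure by that factor per magnitude. The baseline magnitude and exposure below are assumed values for illustration, not from the patent.

```python
FLUX_RATIO_PER_MAGNITUDE = 100 ** (1 / 5)   # ~2.512, by definition of magnitudes

def total_exposure_s(limit_magnitude, base_magnitude=3, base_exposure_s=4.0):
    """Illustrative rule: reaching stars one magnitude fainter requires
    collecting ~2.512x more light, so scale the total exposure accordingly.
    Baseline values are assumptions, not patent-specified."""
    return base_exposure_s * FLUX_RATIO_PER_MAGNITUDE ** (limit_magnitude - base_magnitude)

print(round(total_exposure_s(3), 1))  # -> 4.0
print(round(total_exposure_s(6), 1))  # -> 63.4  (longer for fainter sixth-magnitude stars)
```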
The duration of each exposure time divided from the total exposure time may be variable. As shown in FIG. 5, when the lens 231 or the image sensor 120 moves beyond a first movement range 501 within the movable range 500, the imaging control unit 110 may determine that it is the time point at which to move the lens 231 or the image sensor 120 to the first position. When the lens 231 or the image sensor 120 reaches a limit position determined based on the movable range, the imaging control unit 110 may determine that it is the time point at which to move the lens 231 or the image sensor 120 to the first position.
At the time point when the lens 231 or the image sensor 120 moves beyond the first movement range 501 within the movable range 500, the imaging control unit 110 may cause the image sensor 120 to output a captured image and move the lens 231 or the image sensor 120 to the first position.
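A minimal sketch of this trigger, with the stabilizer position, movable range, and inner first movement range expressed as normalized, assumed values (the patent does not give concrete numbers):

```python
def needs_reset(position, first_movement_range=0.8, movable_range=1.0):
    """Return True when the stabilizer has left the inner first movement
    range (a fraction of the full movable range), i.e. when the current
    divided exposure should end and the lens/sensor should be re-centered."""
    return abs(position) > first_movement_range * movable_range

# Positions drift during the exposure; a reset triggers past the inner range.
trace = [0.1, 0.4, 0.7, 0.85]
print([needs_reset(p) for p in trace])  # -> [False, False, False, True]
```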
The imaging control unit 110 adjusts the timing of moving the lens 231 or the image sensor 120 to the first position within the total exposure time based on the S/N ratio of the images captured within the divided exposure times. The imaging control unit 110 may set the minimum duration of a divided exposure time such that the S/N ratio of the captured image is equal to or greater than a threshold. The S/N ratio of a captured image depends on the pixel size of the image sensor 120, so the imaging control unit 110 may set the minimum duration of the divided exposure times based on the pixel size of the image sensor 120. The imaging control unit 110 may set the timing of moving the lens 231 or the image sensor 120 to the first position such that each divided exposure time is equal to or longer than the minimum duration based on the S/N ratio.
The imaging control unit 110 may adjust the timing of the second time point, which follows the first time point, based on the first captured image taken at the first time point with the divided exposure time, and in particular based on whether that image is in a saturated state. For example, when the imaging device 100 photographs a night sky including the moon and stars, the moon may be so bright that the moon region of an image taken with the divided exposure time is saturated. Therefore, as shown in FIG. 6, when at least part of the first captured image 600-1 taken at the first time point is in a saturated state, the imaging control unit 110 may adjust the timing of the second time point so that the time from the first time point to the second time point (S2), that is, the divided exposure time, is shorter than the exposure time used when the first captured image 600-1 is not in a saturated state. Here, a captured image being in a saturated state means that at least a partial region of the captured image is saturated, that is, a state in which the pixel values or brightness of the pixels in a region of the captured image whose size is equal to or greater than a predetermined size are equal to or greater than a threshold value.
The imaging device 100 may operate in a first imaging mode for photographing a subject having a first brightness, and in a second imaging mode for photographing a subject having a second brightness brighter than the first brightness. The imaging device 100 may operate in a first imaging mode for photographing stars of a first magnitude class or above, and in a second imaging mode for photographing stars of a second magnitude class or above, the second class being lower than the first. The imaging device 100 may thus operate in an imaging mode chosen according to the magnitude class of the stars to be photographed. When the imaging device 100 operates in the first imaging mode, the imaging control unit 110 sets the first exposure time to a first time period; when the imaging device 100 operates in the second imaging mode, the imaging control unit 110 may set the first exposure time to a second time period shorter than the first time period.
The imaging control unit 110 may acquire a plurality of cropped images from the plurality of images captured within the first exposure time, and select from them the cropped images that satisfy a predetermined condition. The imaging control unit 110 may generate the image of the first exposure time based on the selected cropped images. The predetermined condition may be a condition based on image quality, for example a condition based on the S/N ratio of the image. The imaging control unit 110 may select, from the plurality of cropped images, the cropped images whose S/N ratio is equal to or greater than a threshold. The imaging control unit 110 may extend the first exposure time by an amount equivalent to the exposure time corresponding to the cropped images that were not selected.
The imaging control unit 110 may determine, for each pixel of the plurality of cropped images, whether to perform pixel addition. The imaging control unit 110 trims, from one cropped image, the image of a first region whose brightness is equal to or greater than a threshold, to acquire a first partial image corresponding to the first region. The imaging control unit 110 adds the pixel values of the pixels in a second region of each cropped image, excluding the first region, to acquire a second partial image corresponding to the second region. The imaging control unit 110 may combine the first partial image of the first region and the second partial image of the second region to generate one image including the first region and the second region.
For example, when the imaging device 100 captures an image including the moon and stars, the imaging control unit 110 trims the image of a first region including the moon from one cropped image to acquire a first partial image corresponding to the first region. The imaging control unit 110 adds the pixel values of the pixels in a second region of each cropped image that includes the stars and excludes the first region, to acquire a second partial image corresponding to the second region. The imaging control unit 110 may combine the first partial image of the first region including the moon and the second partial image of the second region including the stars other than the moon, to generate one image including the first region and the second region.
FIG. 7 shows an example of a computer 1200 in which aspects of the present invention may be wholly or partly embodied. A program installed on the computer 1200 can cause the computer 1200 to function as one or more "parts" of, or to perform operations associated with, the device according to the embodiment of the present invention. Alternatively, the program can cause the computer 1200 to perform those operations or the functions of the one or more "parts". The program can cause the computer 1200 to execute the process, or stages of the process, according to the embodiment of the present invention. Such a program may be executed by the CPU 1212 to cause the computer 1200 to execute the designated operations associated with some or all of the blocks of the flowcharts and block diagrams described in this specification.
The computer 1200 of the present embodiment includes a CPU 1212 and a RAM 1214, which are connected to each other by a host controller 1210. The computer 1200 also includes a communication interface 1222 and an input/output unit, which are connected to the host controller 1210 via an input/output controller 1220. The computer 1200 also includes a ROM 1230. The CPU 1212 operates in accordance with the programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
The communication interface 1222 communicates with other electronic devices via a network. A hard disk drive may store the programs and data used by the CPU 1212 in the computer 1200. The ROM 1230 stores a boot program or the like executed by the computer 1200 at startup, and/or programs that depend on the hardware of the computer 1200. Programs are provided via a computer-readable recording medium such as a CD-ROM, USB memory, or IC card, or via a network. A program is installed in the RAM 1214 or the ROM 1230, which are also examples of computer-readable recording media, and is executed by the CPU 1212. The information processing described in these programs is read by the computer 1200 and brings about cooperation between the programs and the various types of hardware resources described above. A device or method may be constituted by realizing operations or processing of information according to the use of the computer 1200.
For example, when communication is performed between the computer 1200 and an external device, the CPU 1212 may execute a communication program loaded in the RAM 1214 and, based on the processing described in the communication program, instruct the communication interface 1222 to perform communication processing. Under the control of the CPU 1212, the communication interface 1222 reads transmission data stored in a transmission buffer provided in a recording medium such as the RAM 1214 or a USB memory and sends the read transmission data to the network, or writes reception data received from the network into a reception buffer or the like provided in the recording medium.
Furthermore, the CPU 1212 may cause the RAM 1214 to read all or a necessary part of a file or database stored in an external recording medium such as a USB memory, and perform various types of processing on the data in the RAM 1214. The CPU 1212 may then write the processed data back to the external recording medium.
Various types of information such as various programs, data, tables, and databases may be stored in recording media and subjected to information processing. On the data read from the RAM 1214, the CPU 1212 may perform various types of processing described throughout this disclosure and specified by the instruction sequences of programs, including various operations, information processing, condition determination, conditional branching, unconditional branching, and information search/replacement, and write the results back to the RAM 1214. The CPU 1212 may also search for information in files, databases, and the like in the recording medium. For example, when multiple entries, each having an attribute value of a first attribute associated with an attribute value of a second attribute, are stored in the recording medium, the CPU 1212 may retrieve from these entries an entry matching the condition that specifies the attribute value of the first attribute, read the attribute value of the second attribute stored in that entry, and thereby acquire the attribute value of the second attribute associated with the first attribute satisfying the predetermined condition.
The programs or software modules described above may be stored on the computer 1200 or on a computer-readable storage medium near the computer 1200. A recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium, whereby the programs can be provided to the computer 1200 via the network.
The present invention has been described above using embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. It is apparent to those skilled in the art that various changes or improvements can be made to the above embodiments. It is apparent from the claims that embodiments with such changes or improvements are also included within the technical scope of the present invention.
It should be noted that the order of execution of the operations, procedures, steps, stages, and the like of the devices, systems, programs, and methods shown in the claims, specification, and drawings may be realized in any order, as long as "before", "prior to", or the like is not explicitly stated and as long as the output of a preceding process is not used in a subsequent process. Even where operational flows in the claims, specification, and drawings are described using "first", "next", and the like for convenience, this does not mean that they must be carried out in that order.
Reference Signs
100 imaging device
102 imaging unit
110 imaging control unit
120 image sensor
130 memory
150 image sensor drive unit
152 position sensor
160 display unit
162 instruction unit
200 lens unit
210 focus lens
211 zoom lens
212, 213 lens drive units
214, 215 position sensors
220 lens control unit
231 image-stabilization lens
233 lens drive unit
235 position sensor
240 memory
250 vibration sensor
500 movable range
501 movement range
510 captured image
520 subject
600 captured image
610 crop region
620 image
1200 computer
1210 host controller
1212 CPU
1214 RAM
1220 input/output controller
1222 communication interface
1230 ROM

Claims (12)

  1. A control device that controls an imaging device including an optical system, an image sensor, and a movement mechanism that performs shake correction by moving the optical system or the image sensor within a predetermined movable range in a direction intersecting the optical axis of the optical system, wherein
    the control device comprises a circuit configured to: cause the image sensor to output a first captured image at a first time point within a first exposure time during which the movement mechanism performs the shake correction, and control the movement mechanism to move the optical system or the image sensor by a first movement amount in a first direction so as to move the optical system or the image sensor to a first predetermined position within the movable range;
    cause the image sensor to output a second captured image at a second time point after the first time point within the first exposure time; and
    generate an image based on the first captured image and the second captured image.
  2. The control device according to claim 1, wherein the circuit is configured to:
    crop a first cropped image from a first crop region within the first captured image;
    crop a second cropped image from a second crop region within the second captured image, the second crop region being obtained by moving the first crop region by an amount equivalent to the first movement amount in a second direction opposite to the first direction; and
    generate the image based on the first cropped image and the second cropped image.
  3. The control device according to claim 2, wherein the circuit is configured to:
    at the second time point within the first exposure time, control the movement mechanism to move the optical system or the image sensor by a second movement amount in a third direction so as to move the optical system or the image sensor to the first position within the movable range;
    at a third time point after the second time point within the first exposure time, cause the image sensor to output a third captured image, and crop a third cropped image from a third crop region within the third captured image, the third crop region being obtained by moving the second crop region by an amount equivalent to the second movement amount in a fourth direction opposite to the third direction; and
    generate the image further based on the third cropped image.
  4. The control device according to claim 1, wherein, when the optical system or the image sensor moves beyond a first movement range within the movable range, the circuit determines that it is the first time point and causes the image sensor to output the first captured image.
  5. The control device according to claim 1, wherein the circuit is configured to adjust the timing of the second time point based on the first captured image.
  6. The control device according to claim 5, wherein the circuit is configured to adjust the timing of the second time point such that, when the first captured image is in a saturated state, the time from the first time point to the second time point is shorter than when the first captured image is not in a saturated state.
  7. The control device according to claim 1, wherein the imaging device is operable in a first imaging mode for photographing a subject having a first brightness and in a second imaging mode for photographing a subject having a second brightness brighter than the first brightness, and
    the circuit is configured to set the first exposure time to a first time period when the imaging device operates in the first imaging mode, and to set the first exposure time to a second time period shorter than the first time period when the imaging device operates in the second imaging mode.
  8. The control device according to claim 2, wherein the circuit is configured to: acquire, from a plurality of images captured within the first exposure time, a plurality of cropped images including the first cropped image and the second cropped image; select a cropped image satisfying a predetermined condition from the plurality of cropped images; and generate the image based on the selected cropped image.
  9. The control device according to claim 2, wherein the circuit is configured to generate the image by adding the pixel value of each pixel of the first cropped image to the pixel value of the corresponding pixel of the second cropped image.
  10. An imaging device comprising: the control device according to any one of claims 1 to 9; the optical system; the image sensor; and the movement mechanism.
  11. A control method for controlling an imaging device including an optical system, an image sensor, and a movement mechanism that performs shake correction, based on vibration information indicating shake vibration, by moving the optical system or the image sensor within a predetermined movable range in a direction intersecting the optical axis of the optical system, the method comprising:
    causing the image sensor to output a first captured image at a first time point within a first exposure time during which the movement mechanism performs the shake correction, and controlling the movement mechanism to move the optical system or the image sensor by a first movement amount in a first direction so as to move the optical system or the image sensor to a first predetermined position within the movable range;
    causing the image sensor to output a second captured image at a second time point after the first time point within the first exposure time; and
    generating an image based on the first captured image and the second captured image.
  12. A program for causing a computer to function as the control device according to any one of claims 1 to 9.
PCT/CN2020/123565 2019-11-05 2020-10-26 Control device, imaging device, control method, and program WO2021088669A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202080004464.3A 2019-11-05 2020-10-26 Control device, imaging device, control method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-200783 2019-11-05
JP2019200783A JP2021077935A (ja) 2019-11-05 2019-11-05 制御装置、撮像装置、制御方法、及びプログラム

Publications (1)

Publication Number Publication Date
WO2021088669A1 true WO2021088669A1 (zh) 2021-05-14

Family

ID=75849410

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/123565 WO2021088669A1 (zh) 2019-11-05 2020-10-26 控制装置、摄像装置、控制方法以及程序

Country Status (2)

Country Link
JP (1) JP2021077935A (zh)
WO (1) WO2021088669A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114592339A (zh) * 2022-03-31 2022-06-07 南通成鹏纺织有限公司 一种纺织面料生产用的智能裁剪装置

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003032540A (ja) * 2001-07-12 2003-01-31 Olympus Optical Co Ltd 撮像装置
JP2005176050A (ja) * 2003-12-12 2005-06-30 Nikon Corp 撮像装置
CN1960446A (zh) * 2005-11-04 2007-05-09 索尼株式会社 摄像装置、摄像方法以及程序
CN101035206A (zh) * 2006-03-10 2007-09-12 奥林巴斯映像株式会社 电子抖动校正装置以及电子抖动校正方法
CN104243863A (zh) * 2013-06-06 2014-12-24 奥林巴斯株式会社 拍摄装置、拍摄方法
CN104796621A (zh) * 2014-01-20 2015-07-22 奥林巴斯株式会社 摄像装置以及摄像方法
CN107465867A (zh) * 2016-06-06 2017-12-12 奥林巴斯株式会社 摄像装置和摄像方法

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010078635A (ja) * 2008-09-24 2010-04-08 Sanyo Electric Co Ltd ブレ補正装置及び撮像装置
JP2016054447A (ja) * 2014-09-04 2016-04-14 オリンパス株式会社 撮像装置、画像処理装置、画像処理方法、および画像処理プログラム


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114592339A (zh) * 2022-03-31 2022-06-07 南通成鹏纺织有限公司 一种纺织面料生产用的智能裁剪装置
CN114592339B (zh) * 2022-03-31 2022-11-11 南通成鹏纺织有限公司 一种纺织面料生产用的智能裁剪装置

Also Published As

Publication number Publication date
JP2021077935A (ja) 2021-05-20

Similar Documents

Publication Publication Date Title
JP6700872B2 (ja) 像振れ補正装置及びその制御方法、撮像装置、プログラム、記憶媒体
US20190052808A1 (en) Smart Image Sensor Having Integrated Memory and Processor
US10212348B2 (en) Image processing apparatus, its control method, image capturing apparatus, and storage medium
JP2016201662A (ja) 撮像装置およびその制御方法
WO2019206052A1 (zh) 控制装置、摄像装置、控制方法以及程序
WO2021088669A1 (zh) 控制装置、摄像装置、控制方法以及程序
CN112585938B (zh) 控制装置、摄像装置、控制方法以及程序
JP2018007272A (ja) 画像処理装置、撮像装置およびプログラム
US20210235017A1 (en) Imaging device and control method thereof
WO2021057462A1 (zh) 控制装置、摄像装置、控制方法以及程序
TWI424739B (zh) 電子裝置、影像擷取裝置及其控制方法
WO2020192551A1 (zh) 控制装置、摄像***、控制方法以及程序
WO2019179374A1 (zh) 显示控制装置、摄像装置、显示控制方法
JP2010157792A (ja) 被写体追跡装置
US9549112B2 (en) Image capturing apparatus, and control method therefor
JP2019197295A (ja) 画像処理装置、画像処理方法およびプログラム
JP2021152595A (ja) 制御装置、撮像装置、制御方法、及びプログラム
JP6687210B1 (ja) 撮像装置
US11127118B2 (en) Image processing apparatus, image pickup apparatus, control method to control an image processing apparatus, and storage medium
JP2021153264A (ja) 制御装置、撮像システム、制御方法、及びプログラム
US20240028113A1 (en) Control apparatus, image pickup apparatus, control method, and storage medium
US10681274B2 (en) Imaging apparatus and control method thereof
JP6746857B2 (ja) 画像処理装置、撮像装置、無人航空機、画像処理方法、及びプログラム
JP2012027302A (ja) 投影機能付き撮像装置
JP2018014659A (ja) 撮像装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20884619

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20884619

Country of ref document: EP

Kind code of ref document: A1