WO2012140899A1 - Imaging device, semiconductor integrated circuit, and imaging method - Google Patents
Imaging device, semiconductor integrated circuit, and imaging method
- Publication number
- WO2012140899A1 (PCT/JP2012/002555)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- focus lens
- displacement
- imaging
- displacement pattern
- image sensor
- Prior art date
Classifications
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B7/00—Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
- G03B7/08—Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0075—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for altering, e.g. increasing, the depth of field or depth of focus
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/665—Control of cameras or camera modules involving internal camera communication with the image sensor, e.g. synchronising or multiplexing SSIS control signals
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
- H04N23/958—Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging
- H04N23/959—Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging by adjusting depth of field during image capture, e.g. maximising or setting range based on scene characteristics
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
Definitions
- The present application relates to an imaging device that can shoot moving images using depth-of-field extension technology.
- EDOF (extended depth of field) technology extends the range of subject distances that appear in focus.
- In one EDOF approach, a focus sweep operation that moves the focus lens or the image sensor during the exposure time is performed, so that blur is convolved uniformly in the depth direction (that is, the blur at each depth is equalized).
- A method has then been proposed that obtains an EDOF image by performing image restoration processing using the resulting blur pattern (Non-Patent Document 1). This method is called "Flexible DOF" (hereinafter "F-DOF").
- F-DOF is known to yield good image quality and a strong EDOF effect. Because the off-axis characteristics depend on the lens itself, performance is easy to improve. However, since the same subject must be convolved onto the same image position even while the focus position moves during exposure, the optical system must be image-side telecentric.
- One field of application of EDOF technology is microscopy.
- Because the subject to be imaged is a static object, it can be imaged over an extended period of time.
- For this reason, the focal stack method has long been used.
- In the focal stack method, a plurality of images with different in-focus positions are photographed, and regions judged to be in focus are extracted from each image and combined to obtain an EDOF image. Since these operations require labor and time, techniques that apply the F-DOF method have also been proposed (Patent Documents 1 to 4).
- In Patent Documents 1 to 4, when F-DOF is used in a microscope, the sample (the subject) or the lens barrel is moved during exposure.
- When the image sensor is moved, it is moved at a constant speed (Non-Patent Document 5). When the focus lens is moved instead, it must be displaced so that the corresponding imaging plane moves at a constant speed (Non-Patent Document 1). It is known that the movement may run from the far-side in-focus end position to the near-side in-focus end position, or vice versa.
- FIGS. 12(a) and 12(b) show the exposure state and the readout state of the image sensor with time on the horizontal axis.
- FIG. 12(c) shows, with time on the horizontal axis, the displacement of the focus lens, i.e., the focus position, on the vertical axis.
- The hatched portions indicate the timing of exposure of the image sensor and of data readout.
- The sweep image is obtained by reciprocating the focus position between the near focus position and the far focus position during the exposure time of the image sensor. Even with such a sweep pattern, if the displacement speed of the focus lens in the linear displacement portion is constant, the exposure time at each focus position is uniform, and an effect equivalent to that described above can be obtained.
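As an illustration, such a reciprocating sweep can be modeled as a triangular waveform whose linear portions move at constant speed, so that each focus position receives an equal share of the exposure. The following is a minimal sketch; the function name, focus positions, and units are hypothetical, not taken from the patent:

```python
def triangular_sweep(t: float, v_near: float, v_far: float, period: float) -> float:
    """Focus position at time t for a reciprocating (triangular) sweep:
    constant-speed displacement between the far and near focus positions."""
    phase = (t % period) / period        # 0..1 within one round trip
    tri = 1.0 - abs(2.0 * phase - 1.0)   # 0 -> 1 -> 0 triangle
    return v_far + (v_near - v_far) * tri

# Sample one exposure (= one round trip) and bucket the dwell time by
# focus position: constant sweep speed gives near-uniform exposure per depth.
samples = [triangular_sweep(i / 10000.0, 19.8, 18.1, 1.0) for i in range(10000)]
bins = [0] * 10
for p in samples:
    k = min(int((p - 18.1) / (19.8 - 18.1) * 10), 9)
    bins[k] += 1
print(bins)  # each bucket holds roughly the same number of samples
```

The near-equal bucket counts reflect the statement above: with constant sweep speed in the linear portions, exposure time is uniform across focus positions.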
- This technology can be applied to ordinary digital still cameras and digital video cameras, which in recent years have been required to shoot more easily and with fewer failures.
- EDOF technology can be expected to eliminate focusing errors by producing all-in-focus images. Because it offers high image quality and a large EDOF effect, allows the EDOF range to be changed arbitrarily, can be realized by adapting a normal autofocus mechanism (no special optical system is needed), and makes it easy to switch between EDOF shooting and normal shooting, the F-DOF method is considered preferable when EDOF technology is applied to digital still cameras and digital video cameras.
- When EDOF technology is used in a digital still camera or digital video camera, it is preferable that EDOF shooting also be possible during moving-image shooting.
- the present application provides an imaging device capable of shooting a high-quality EDOF moving image, an integrated circuit used in the imaging device, and an imaging method.
- An imaging device according to one embodiment includes: an image sensor having a plurality of two-dimensionally arranged photoelectric conversion elements that constitute an imaging surface, the image sensor generating an image signal by exposing the plurality of photoelectric conversion elements and reading electrical signals from them; a lens optical system including a focus lens that collects light toward the image sensor; a drive unit that drives either the image sensor or the focus lens so as to change the distance between the image sensor and the focus lens; a displacement control unit configured to control, by outputting commands to the drive unit, the displacement of the driven image sensor or focus lens based on a predetermined displacement pattern; and a synchronization unit configured to control the displacement control unit based on the exposure timing of the image sensor.
- The predetermined displacement pattern includes a first type displacement pattern and a second type displacement pattern in which the image sensor or the focus lens is displaced over different ranges between a first in-focus position of the focus lens or the image sensor, at which focus is achieved at a first subject distance in the imaging scene, and a second in-focus position, at which focus is achieved at a second subject distance in the imaging scene.
- The first type displacement pattern and the second type displacement pattern are alternately repeated.
- In this way, the focus lens or the image sensor is displaced so that, in the Flexible DOF method, sweep imaging for obtaining the omnifocal image and sweep imaging for obtaining depth information are alternately repeated. Shooting can therefore satisfy both the image quality of the omnifocal image and the depth measurement accuracy.
- FIG. 1 is a block configuration diagram of an imaging apparatus according to the first and third embodiments, and FIG. 2 is a flowchart showing its operation.
- FIG. 3 is a flowchart illustrating the exposure/sweep step of FIG. 2 in more detail.
- (a) shows an example of the displacement pattern of the focus lens or image sensor in the first and second embodiments; (b) and (c) show other examples of displacement patterns.
- A block configuration diagram of the imaging apparatus according to the second embodiment is shown, together with a flowchart showing its operation.
- (a) illustrates the rolling shutter operation of a CMOS image sensor.
- (b) and (c) each show an example of a focus lens displacement pattern.
- (a) and (b) show examples of the focus lens displacement pattern when an image sensor composed of a CMOS image sensor is used to obtain a full-sweep image and a half-sweep image.
- (a) and (b) show examples of the displacement pattern of the focus lens or image sensor in the third embodiment.
- (a) shows the timing of exposure and readout of the image sensor when the exposure time is limited by an electronic shutter in the third embodiment; (b) shows the focus lens displacement pattern in that case.
- In the third embodiment, an example of the displacement pattern of the focus lens or image sensor when the exposure time is limited is shown.
- (a) and (b) show exposure timings in the image sensor; (c) and (d) show the focus lens displacement patterns for obtaining an EDOF image corresponding to the exposure timings of (a) and (b). FIG. 13 is a block configuration diagram of an imaging apparatus examined by the present inventor.
- FIG. 14 shows the positional relationship between the subject, the focus lens, and the image sensor. FIG. 15 shows the relationship between the position of the focus lens in the imaging apparatus of FIG. 13 and the exposure time. FIG. 16 is a graph showing an example of the relationship between the subject distance u and the image-plane-side focal length v. FIG. 17 is a block configuration diagram of another imaging apparatus examined by the present inventor. The displacement pattern of the focus lens in the F-DOF method is also shown.
- FIGS. 19(a) to 19(c) show measurement results of a depth measurement method. FIG. 20 is another representation of the results of FIGS. 19(a) to 19(c).
- FIGS. 21(a) and 21(b) show the focus lens displacement patterns for obtaining a full-sweep image and a half-sweep image.
- the inventor of the present application has examined in detail a structure suitable for obtaining an EDOF video in a digital still camera or a digital video camera having a mechanism for driving a focus lens such as an autofocus mechanism used for normal photographing.
- An imaging apparatus 300 shown in FIG. 13 has a structure that displaces the focus lens during the exposure time.
- the imaging apparatus 300 includes a lens optical system 120 including a focus lens 101, a focus lens driving unit 103 that drives the focus lens 101, and an imaging element 104.
- the focus lens position detecting unit 115 detects the current position (initial position) of the focus lens 101. After detection, the position of the focus lens 101 is displaced to a predetermined end position, for example, the nearest end or the farthest end.
- FIG. 14 is a schematic diagram showing the positional relationship between the subject included in the shooting scene and the focus lens 101 and the image sensor 104 in the image capturing apparatus 300.
- the closest end is the position of the focus lens 101 when the focus lens 101 is moved so that the subject closest to the imaging device 300 among the subjects included in the shooting scene forms an image on the imaging surface of the imaging device 104.
- the distance u from the subject in focus on the imaging surface to the focus lens 101 of the imaging apparatus 300 is the shortest, and the distance v between the focus lens 101 and the imaging element 104 is the longest.
- The farthest end refers to the position of the focus lens 101 when the focus lens 101 is moved so that the subject farthest from the imaging apparatus 300 among the subjects included in the shooting scene forms an image on the imaging surface of the image sensor 104.
- the distance u from the subject focused on the imaging surface to the focus lens 101 of the imaging apparatus 300 is the longest, and the distance v between the focus lens 101 and the imaging element is the shortest.
- the distance between the nearest end and the farthest end of the focus lens 101 is shown larger than the distance between the subject and the imaging apparatus 300 for ease of illustration.
- the exposure time determination unit 114 determines shooting parameters such as a shutter speed and an aperture value.
- the exposure / focus lens displacement synchronization unit 107 that synchronizes the exposure and the displacement of the focus lens outputs an exposure start command to the focus lens displacement control unit 106 and the shutter opening / closing command unit 112.
- The displacement destination is the nearest end if the current end position is the farthest end, and the farthest end if the current end position is the nearest end.
- a command for displacing the focus lens 101 within the exposure time is output to the focus lens displacement control unit 106.
- FIG. 15 shows the relationship between the exposure time and exposure amount and the focal length on the image plane side.
- The image-plane-side focal length changes depending on the position of the focus lens 101. Based on commands from the focus lens displacement control unit 106, the focus lens 101 is driven by the focus lens driving unit 103 so that its position is displaced at a constant speed with respect to the image sensor surface.
- FIG. 16 shows the relationship between u and v when f is 18 [mm].
- the distance v between the lens principal point and the image sensor changes.
- Driving the focus lens 101 so that the displacement of the focus lens changes at a constant speed with respect to the imaging element surface means that the changing speed of v is constant.
- On the other hand, the distance u between the subject-side focal plane and the lens principal point does not change at a constant speed.
- Since the horizontal axis of FIG. 16 is the image-plane-side focal length v, it varies inversely with the subject distance u: the longer (more distant) the subject distance, the shorter the image-plane-side focal length v.
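This inverse relationship follows from the thin-lens (Gaussian) imaging equation 1/u + 1/v = 1/f. A short sketch, assuming a simple thin-lens model with f = 18 mm as in FIG. 16 (the function name and sample distances are illustrative):

```python
def image_distance(u_mm: float, f_mm: float = 18.0) -> float:
    """Solve the thin-lens equation 1/u + 1/v = 1/f for v."""
    if u_mm <= f_mm:
        raise ValueError("subject must lie beyond the focal length")
    return u_mm * f_mm / (u_mm - f_mm)

# The longer the subject distance u, the shorter the image-plane-side
# focal length v, approaching f as u goes to infinity:
for u in (200.0, 500.0, 1000.0, 5000.0):  # subject distances in mm
    print(f"u = {u:6.0f} mm -> v = {image_distance(u):.3f} mm")
```

Stepping v at a constant speed therefore does not step u at a constant speed, which is the point made in the text above.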
- Upon receiving an exposure start command from the exposure / focus lens displacement synchronization unit 107, the shutter opening / closing command unit 112 immediately opens the shutter 111. After a predetermined exposure time has elapsed, the exposure / focus lens displacement synchronization unit 107 outputs an exposure end command to the shutter opening / closing command unit 112, which then immediately closes the shutter 111.
- the formed optical image is converted into an electric signal by the image sensor 104, and an image signal is output to the image processing unit 109 via the readout circuit 108.
- The exposure / focus lens displacement synchronization unit 107 notifies the image processing unit 109 that the exposure has been completed and that the image was captured with the focus lens displaced according to F-DOF.
- the image processing unit 109 receives the image signal, performs necessary signal processing, and outputs it to the recording unit 110.
- An imaging apparatus 400 illustrated in FIG. 17 includes an image sensor 104, an image sensor position detection unit 202, an exposure / image sensor displacement synchronization unit 207, an image sensor displacement control unit 206, and an image sensor driving unit 203, and displaces the image sensor during the exposure time. Unlike in the imaging apparatus 300, the image sensor position detection unit 202 detects the position of the image sensor 104. The exposure / image sensor displacement synchronization unit 207 synchronizes the exposure timing with the displacement of the image sensor 104. The image sensor displacement control unit 206 controls the displacement of the image sensor 104. The image sensor driving unit 203 receives signals from the image sensor displacement control unit 206 and drives the image sensor 104.
- the image sensor position detection unit 202 detects the current position (initial position) of the image sensor 104. After detection, the position of the image sensor 104 is displaced to a predetermined end position, for example, the nearest end or the farthest end.
- the nearest end of the predetermined focusing range is the image sensor 104 so that the subject closest to the imaging device 400 among the subjects included in the shooting scene forms an image on the imaging surface of the image sensor 104. This refers to the position of the image sensor 104 when moved.
- the distance u from the subject to the focus lens 101 is the shortest, and the distance v between the focus lens 101 and the image sensor 104 is the longest.
- The farthest end refers to the position of the image sensor 104 when the image sensor 104 is moved so that the subject farthest from the imaging apparatus 400 among the subjects included in the shooting scene forms an image on the imaging surface of the image sensor 104.
- the distance u from the subject to the focus lens 101 is the longest, and the distance v between the focus lens 101 and the image sensor 104 is the shortest.
- the exposure time determination unit 114 determines shooting parameters such as a shutter speed and an aperture value.
- the exposure / imaging device displacement synchronization unit 207 that synchronizes the exposure and the imaging device displacement outputs an exposure start command to the imaging device displacement control unit 206 and the shutter opening / closing command unit 112.
- The displacement destination is the nearest end if the current end position is the farthest end, and the farthest end if the current end position is the nearest end.
- a command for displacing the image sensor 104 within the exposure time is output to the image sensor displacement control unit 206.
- the image sensor 104 is displaced at a constant speed.
- Upon receiving an exposure start command from the exposure / imaging element displacement synchronization unit 207, the shutter opening / closing command unit 112 immediately opens the shutter 111. After a predetermined exposure time has elapsed, the exposure / imaging element displacement synchronization unit 207 outputs an exposure end command to the shutter opening / closing command unit 112, which then immediately closes the shutter 111.
- the formed optical image is converted into an electric signal by the image sensor 104, and the electric signal is output to the image processing unit 109 via the readout circuit 108.
- The exposure / imaging element displacement synchronization unit 207 notifies the image processing unit 109 that the exposure has been completed and that the image was captured with the image sensor displaced according to F-DOF.
- The other components perform the same operations as in the imaging apparatus 300 shown in FIG. 13.
- F-DOF shooting can thus be realized in a digital still camera or digital video camera.
- For a moving image, it is preferable to shoot continuously so that no time lag arises between the frames constituting the moving image.
- By reciprocating the position of the focus lens between the farthest end and the nearest end and assigning one video frame period to each of the forward and backward displacements, a smooth EDOF moving image can be shot.
- If depth information of the shooting scene, that is, information indicating the front-rear relationship of the plurality of subjects included in the scene, can be acquired,
- three-dimensional information of the shooting scene is obtained.
- Various methods for measuring the depth of a shooting scene have been proposed. They can be broadly divided into active methods, which irradiate the scene with infrared light, ultrasonic waves, a laser, or the like and calculate the distance from the time until the reflected wave returns or the angle of the reflected wave, and passive methods, which calculate the distance from the captured images alone. In particular, passive methods, which require no device for irradiating infrared light or the like, are widely used in cameras.
- One such passive method is DFD (Depth from Defocus).
- As a way of realizing DFD, a method called half sweep, which uses the above-mentioned F-DOF, has been proposed (Non-Patent Document 2).
- In the half sweep method, the focus sweep range of F-DOF is divided in two at an in-focus position between the far-side in-focus end position (farthest end) and the near-side in-focus end position (nearest end).
- The depth is then estimated using the two images obtained by sweeping each half.
- For distinction, the above-described method of sweeping the entire section from the far-side in-focus end position to the near-side in-focus end position is referred to as a full sweep.
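The division above can be sketched as follows, assuming (purely for illustration) that the split point is the midpoint of the image-plane displacement range; the function name and numeric values are hypothetical:

```python
def half_sweep_ranges(v_far_end: float, v_near_end: float):
    """Split the full F-DOF sweep range at an intermediate position into
    a far-side half sweep and a near-side half sweep (image-plane
    coordinates; the midpoint split is an assumption for illustration)."""
    v_mid = (v_far_end + v_near_end) / 2.0
    far_half = (v_far_end, v_mid)    # swept to produce the far-sweep image
    near_half = (v_mid, v_near_end)  # swept to produce the near-sweep image
    return far_half, near_half

far_half, near_half = half_sweep_ranges(18.1, 19.8)
print(far_half, near_half)
```

The two sub-ranges together cover the full-sweep section, so a full sweep and a pair of half sweeps traverse the same overall focus range.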
- FIGS. 19 and 20 show an example of depth estimation results obtained by the DFD disclosed in Non-Patent Document 2.
- In each of FIGS. 19(a), (b), and (c), the left half uses an image with strong texture (many edges) and the right half uses an image with weak texture (few edges)
- to estimate the depth, that is, the distance from the imaging device.
- the lower end and the upper end correspond to the near side and the far side of the depth, respectively.
- the shades of hatching in these drawings indicate the estimated distance value, and the darker the hatching, the farther the estimated distance is.
- FIG. 19(a) shows the true value of the depth. That is, when the depth is correctly estimated, the upper part of the figure is shown with darker hatching and the lower part with lighter hatching.
- FIG. 19B shows the depth estimation result obtained by a general DFD method (full sweep method)
- FIG. 19C shows the depth estimation result obtained by the half sweep method.
- FIG. 20 shows the results of FIGS. 19(b) and 19(c) numerically.
- the horizontal axis indicates the depth, and the upper end of FIG. 19 corresponds to the left end of the horizontal axis.
- The vertical axis indicates the accuracy rate of the depths estimated in FIGS. 19(b) and 19(c) with respect to the true value shown in FIG. 19(a). This accuracy rate aggregates the regions of differing texture strength (the left and right halves of the image) into a single value. FIG. 20 shows that the depth estimation obtained by the half sweep method is superior.
- As described above, when EDOF is adopted in a digital still camera or digital video camera, it is preferable to use the F-DOF method.
- When shooting a moving image with the F-DOF method, it is required to shoot continuously without causing a time lag between frames.
- For this purpose, the position of the focus lens is reciprocated between the farthest end and the nearest end as shown in FIG. 21(a).
- By assigning one video frame period to each displacement, a smooth EDOF moving image can be shot.
- Depth estimation by DFD can also be realized for moving images by acquiring images swept on the near side and on the far side of the focus range.
- In FIG. 21(b), the range over which the focus lens sweeps is divided in two at an intermediate focus position between the near focus position and the far focus position. Specifically, by alternately reciprocating the focus lens between the farthest end and the nearest end according to the displacement patterns shown as the near sweep NS and the far sweep FS, generation of the omnifocal EDOF image and depth estimation can likewise be performed continuously, and an EDOF moving image can be shot.
- However, the displacement pattern for obtaining the omnifocal image, that is, the displacement pattern AS from the farthest end to the nearest end,
- spans two video frames.
- Consequently, the omnifocal image is generated from two images shifted in time.
- If the subject moves, the subject position in the image obtained by the near sweep NS deviates from that in the image obtained by the far sweep FS, and it has been found that the image quality of the omnifocal image is greatly degraded as a result.
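The time-shift problem can be quantified with a toy model. In the alternating schedule, the two half-sweep images combined into one omnifocal image are captured one frame period apart, so a laterally moving subject is misregistered between them in proportion to that period. The frame rate and subject speed below are hypothetical, chosen only to illustrate the scale of the deviation:

```python
FRAME_PERIOD_MS = 33.3  # hypothetical frame period for 30 fps video

# Alternating half-sweep schedule: each video frame is either a near
# sweep (NS) or a far sweep (FS), so the omnifocal pattern AS = NS + FS
# spans two consecutive frames.
schedule = ["NS" if i % 2 == 0 else "FS" for i in range(6)]
print(schedule)

def misregistration_px(subject_speed_px_per_ms: float) -> float:
    """Positional deviation between the NS and FS images combined into
    one omnifocal image, for a subject moving laterally at constant speed."""
    return subject_speed_px_per_ms * FRAME_PERIOD_MS

print(f"deviation for 0.3 px/ms subject: {misregistration_px(0.3):.1f} px")
```

Even a modest subject speed yields a deviation of several pixels between the two constituent images, which is consistent with the degradation described above.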
- An imaging device according to one embodiment of the present invention includes: an image sensor having a plurality of two-dimensionally arranged photoelectric conversion elements that constitute an imaging surface, the image sensor generating an image signal by exposing the plurality of photoelectric conversion elements and reading electrical signals from them; a lens optical system including a focus lens that collects light toward the image sensor; a drive unit that drives either the image sensor or the focus lens so as to change the distance between the image sensor and the focus lens; a displacement control unit configured to control, by outputting commands to the drive unit, the displacement of the driven image sensor or focus lens based on a predetermined displacement pattern; and a synchronization unit configured to control the displacement control unit based on the exposure timing of the image sensor.
- The predetermined displacement pattern includes a first type displacement pattern and a second type displacement pattern in which the image sensor or the focus lens is displaced over different ranges between a first in-focus position of the focus lens or the image sensor, at which focus is achieved at a first subject distance in the imaging scene, and a second in-focus position, at which focus is achieved at a second subject distance in the imaging scene.
- The first type displacement pattern and the second type displacement pattern are alternately repeated.
- the displacement range of the first type displacement pattern includes at least a part of the displacement range of the second type displacement pattern.
- the displacement range of the first type displacement pattern is the entire section between the first focus position and the second focus position.
- The second type displacement pattern includes a second F type displacement pattern, whose displacement range is the entire section between the first in-focus position and an intermediate position between the first in-focus position and the second in-focus position, and a second N type displacement pattern, whose displacement range is the entire section between the intermediate position and the second in-focus position. The first type displacement pattern is sandwiched between the second F type and the second N type displacement patterns.
- the first type displacement pattern, the second F type displacement pattern, and the second N type displacement pattern each displace the entire displacement range at least once in one direction.
- The first type displacement pattern and the second F type displacement pattern are continuous with each other, as are the first type displacement pattern and the second N type displacement pattern.
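The ordering constraints above (each first type pattern sandwiched between the second F type and second N type patterns) can be illustrated with a simple frame-schedule generator; the labels are illustrative, not taken from the patent text:

```python
def sandwiched_schedule(n_frames: int) -> list:
    """Generate a per-frame pattern sequence in which every first-type
    (full-range) sweep is sandwiched between a second-F-type and a
    second-N-type (half-range) sweep: 2F, 1, 2N, 1, 2F, 1, 2N, ..."""
    cycle = ["second-F", "first", "second-N", "first"]
    return [cycle[i % len(cycle)] for i in range(n_frames)]

print(sandwiched_schedule(8))
```

In this sequence every "first" frame has a second-F sweep on one side and a second-N sweep on the other, matching the sandwiched arrangement described above.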
- The imaging apparatus includes an exposure time determination unit that determines the exposure time of the image sensor based on the imaging scene, and a displacement setting unit that determines the displacement pattern based on the first in-focus position, the second in-focus position, and the exposure time.
- The imaging apparatus further includes a position detection unit that detects the position of the driven image sensor or focus lens, and the displacement control unit commands the drive amount to the drive unit based on the output of the position detection unit and the displacement pattern.
- the imaging apparatus further includes a readout circuit that reads out the image signal from the imaging element, and the synchronization unit controls the displacement control unit and the readout circuit based on an exposure timing of the imaging element.
- An omnifocal image is generated from an image signal obtained while the driven image sensor or focus lens is displaced.
- Depth information is generated from an image signal obtained while the driven image sensor or focus lens is displaced.
- the image sensor is a CCD image sensor.
- the image sensor is a CMOS image sensor.
- the first type displacement pattern, the second F type displacement pattern, and the second N type displacement pattern each reciprocate at least the entire displacement range an integer number of times.
- An integrated circuit according to one embodiment of the present invention is an integrated circuit for an imaging apparatus that includes: an image sensor having a plurality of two-dimensionally arranged photoelectric conversion elements that constitute an imaging surface, the image sensor generating an image signal by exposing the plurality of photoelectric conversion elements and reading electrical signals from them; a lens optical system including a focus lens that collects light toward the image sensor; and a drive unit that drives either the image sensor or the focus lens so as to change the distance between the image sensor and the focus lens.
- The integrated circuit includes a displacement control unit configured to control, by outputting commands to the drive unit, the displacement of the driven image sensor or focus lens based on a predetermined displacement pattern, and a synchronization unit configured to control the displacement control unit based on the exposure timing of the image sensor. The predetermined displacement pattern includes a first type displacement pattern and a second type displacement pattern in which the image sensor or the focus lens is displaced over different ranges between a first in-focus position, at which focus is achieved at a first subject distance in the imaging scene, and a second in-focus position, at which focus is achieved at a second subject distance.
- An imaging method according to one embodiment is a method for imaging an imaging scene by focusing light on an imaging element that has a plurality of photoelectric conversion elements arranged two-dimensionally and constituting an imaging surface, and that generates an image signal by exposing the plurality of photoelectric conversion elements and reading electrical signals from them. The plurality of photoelectric conversion elements are exposed while the focus lens or the imaging element is displaced between a first focus position of the focus lens or the imaging element that is in focus at a first subject distance in the imaging scene and a second focus position of the focus lens or the imaging element that is in focus at a second subject distance in the imaging scene.
- FIG. 1 is a block diagram showing an imaging apparatus 100 according to this embodiment.
- the imaging apparatus 100 includes a focus lens driving unit 103, an image sensor 104, a focus lens displacement control unit 106, an exposure / focus lens displacement synchronization unit 107, and a lens optical system 120.
- the imaging element 104 is a CCD image sensor in the present embodiment, and has a plurality of photoelectric conversion elements that are arranged two-dimensionally and constitute an imaging surface. After light is incident on the plurality of photoelectric conversion elements and they are exposed, electrical signals are read from them, thereby generating an image signal.
- the lens optical system 120 includes a focus lens 101 that collects light toward the image sensor 104 and forms an image on the image sensor 104.
- the lens optical system 120 may include one or more other lenses in addition to the focus lens 101.
- the focus lens 101 may also be composed of a plurality of lenses.
- in that case, the position of the focus lens refers to the position of the principal point of the plurality of lenses.
- the focus lens drive unit 103 functions as a drive unit that drives one of the image sensor 104 or the focus lens 101 so that the distance between the image sensor 104 and the focus lens 101 changes. That is, the focus lens driving unit 103 drives the focus lens 101 based on the drive signal so that the distance between the image sensor 104 and the focus lens 101 changes.
- the focus lens displacement control unit 106 is configured to control the displacement of the focus lens 101 based on a predetermined displacement pattern by outputting a command to the focus lens driving unit 103 as described below.
- the exposure / focus lens displacement synchronization unit 107 is configured to control the focus lens displacement control unit 106 based on the exposure timing of the image sensor 104.
- the imaging apparatus 100 further includes a focus lens position detection unit 102, a focus lens displacement setting unit 105, a readout circuit 108, an image processing unit 109, a recording unit 110, a shutter 111, a shutter opening / closing command unit 112, a release reception unit 113, and an exposure time determination unit 114.
- the focus lens position detection unit 102 includes a position sensor, detects the position of the focus lens 101, and outputs a detection signal to the focus lens displacement control unit 106.
- the focus lens displacement setting unit 105 sets a displacement pattern of the focus lens 101 and outputs it as the target focus lens position. The focus lens displacement control unit 106 then calculates a drive signal from the difference between the target focus lens position and the current position of the focus lens 101 detected by the focus lens position detection unit 102, and outputs the drive signal to the focus lens drive unit 103.
- the exposure time determination unit 114 determines the exposure time of the image sensor 104 and outputs information on the exposure time to the exposure / focus lens displacement synchronization unit 107 and the focus lens displacement setting unit 105.
- the exposure / focus lens displacement synchronization unit 107 outputs commands to the shutter opening / closing command unit 112, the focus lens displacement control unit 106, and the readout circuit 108 so that exposure, driving of the focus lens 101, and readout of the electrical signal from the image sensor 104 are performed at synchronized timings based on the exposure time information. Specifically, it instructs the shutter opening / closing command unit 112 on the exposure timing and the exposure time, and instructs the focus lens displacement control unit 106 on the driving of the focus lens 101 and the drive time.
- the shutter 111 performs an opening / closing operation in accordance with a command from the shutter opening / closing command unit 112.
- the image sensor 104 is exposed by the light collected by the focus lens 101, and the exposed light is converted into an electrical signal and output.
- the readout circuit 108 reads out an electrical signal by outputting a readout signal to the image sensor 104, and outputs the readout electrical signal to the image processing unit 109.
- the image processing unit 109 performs various corrections and the like on the input electric signal, sequentially constructs an image signal constituting an image of a shooting scene for one video frame, and outputs the image signal to the recording unit 110. Further, as will be described below, three-dimensional information of a shooting scene may be obtained.
- the image pickup apparatus 100 can expose the image pickup device 104 while driving the focus lens 101 and changing the position of the focus lens to obtain a sweep image.
- the focus lens position detection unit 102, the focus lens driving unit 103, the imaging element 104, the image processing unit 109, the release reception unit 113, and the recording unit 110 may be configured by known hardware.
- Some or all of the components may be configured by software stored in an information processing circuit such as a CPU and a storage unit such as a memory.
- the information processing circuit controls each component of the imaging apparatus 100 by reading out software defining the procedure of the imaging method described below from the memory and executing the procedure of the imaging method.
- Some of the components realized by the software stored in the information processing circuit and the memory may be configured by a dedicated integrated circuit.
- the focus lens displacement setting unit 105, the focus lens displacement control unit 106, the exposure / focus lens displacement synchronization unit 107, and the shutter opening / closing command unit 112 may constitute an integrated circuit.
- Next, the imaging method of the present embodiment will be described, in particular the position of the focus lens for obtaining the sweep image and the timing of exposure and signal readout of the image sensor 104.
- FIG. 2 is a flowchart showing the imaging method of the present embodiment.
- the exposure time determination unit 114 determines an exposure time parameter from shooting parameters such as a shutter speed and an aperture value (S102).
- the exposure time parameter is output to the focus lens displacement setting unit 105 and the exposure / focus lens displacement synchronization unit 107.
- the focus lens displacement setting unit 105 generates a displacement pattern of the focus lens position (S103).
- the displacement pattern will be described in detail below.
- After the displacement pattern of the focus lens position is determined, the exposure / focus lens displacement synchronization unit 107 outputs commands to the shutter opening / closing command unit 112, the focus lens displacement setting unit 105, and the readout circuit 108 so that they operate based on the exposure timing of the image sensor 104. As a result, the shutter opening / closing command unit 112 releases the shutter 111 (S104), exposure of the image sensor 104 starts, and, in synchronization with the start of exposure, the focus lens driving unit 103 displaces the focus lens 101 as commanded by the focus lens displacement control unit 106 (S105).
- here, the term “synchronization” covers both the case where operations are simultaneous and the case where they are separated by a predetermined delay time.
- an electrical signal constituting an image of the photographic scene is output from the image sensor 104 to the readout circuit 108 at a predetermined timing synchronized with the displacement of the focus lens 101.
- the shutter 111 is closed (S106), and the focus position displacement is stopped (S107) to complete the photographing.
- the exposure / sweep operation may be continued until a recording stop processing command is input from the user.
- FIG. 3 is a flowchart showing the flow of the sweep operation during shooting.
- FIG. 4A is a diagram showing a position change of the focus lens, that is, a sweep pattern (displacement pattern) when the position of the focus lens is displaced between the farthest end and the nearest end.
- the horizontal axis represents time, and the vertical axis represents the position of the focus lens (distance from the image sensor).
- a solid line indicates a full sweep displacement pattern (first type displacement pattern), and a double line and a dotted line indicate a half sweep displacement pattern (second type displacement pattern). More specifically, the double line indicates a near sweep displacement pattern (second N type displacement pattern), and the dotted line indicates a far sweep displacement pattern (second F type displacement pattern).
- the near sweep displacement pattern uses as its displacement range the entire range between the nearest end and the intermediate position between the nearest and farthest ends, while the far sweep displacement pattern uses as its displacement range the entire range between the farthest end and the intermediate position.
- the displacement range of the near sweep displacement pattern and the far sweep displacement pattern is different from the displacement range of the full sweep displacement pattern, and is a part of the displacement range of the full sweep displacement pattern.
- the displacement range of the near-sweep displacement pattern and the displacement range of the far-sweep displacement pattern are mutually exclusive without overlapping.
- the intermediate position need not be exactly midway between the nearest end and the farthest end.
- the full sweep displacement pattern, the near sweep displacement pattern, and the far sweep displacement pattern each match the period required for the image sensor 104 to acquire one image, that is, one video frame period. In each pattern, the entire displacement range is traversed in one direction at least once.
- the full sweep displacement pattern and the near sweep displacement pattern, and the full sweep displacement pattern and the far sweep displacement pattern are connected to each other. For this reason, when the displacement of the focus lens 101 is switched from full sweep to half sweep, it is not necessary to jump the position of the focus lens 101, and the focus lens 101 can be moved smoothly.
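As an illustration only (the helper below is ours, not part of the patent), the four-frame cycle of FIG. 4(a), consisting of a far half sweep, a full sweep, a near half sweep, and a second full sweep, can be sketched as a piecewise-linear lens position, normalized so that 0 is the nearest end, 1 the farthest end, and 0.5 the intermediate position:

```python
def sweep_cycle_position(t):
    """Normalized focus-lens position for one 4-frame cycle of Fig. 4(a):
    S2 far half sweep, S3 full sweep, S4 near half sweep, S5 full sweep.
    t is time in video-frame units; 0 = nearest end, 1 = farthest end."""
    t = t % 4.0
    if t < 1.0:                       # S2: farthest -> intermediate -> farthest
        return 1.0 - 0.5 * (1.0 - abs(2.0 * t - 1.0))
    if t < 2.0:                       # S3: full sweep, farthest -> nearest
        return 1.0 - (t - 1.0)
    if t < 3.0:                       # S4: nearest -> intermediate -> nearest
        return 0.5 * (1.0 - abs(2.0 * (t - 2.0) - 1.0))
    return t - 3.0                    # S5: full sweep, nearest -> farthest
```

Adjacent segments meet (the position is continuous at t = 1, 2, 3, and 4), which is the "no jump" property noted above.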
- the farthest end and the nearest end refer to the positions of the focus lens 101 at which, when the focus lens 101 is moved as described above, the subject farthest from the imaging device and the subject closest to the imaging device within a predetermined distance range in the imaging scene form an image on the imaging surface of the image sensor 104. The subject that forms an image at the farthest end (first focus position) is located at the longest distance from the imaging device (first subject distance) within the predetermined distance range, and the subject that forms an image at the nearest end (second focus position) is located at the shortest distance from the imaging device (second subject distance) within the predetermined distance range.
- the focus lens driving unit 103 moves the focus lens 101 to the farthest end, which is the initial position, based on a command from the focus lens displacement control unit 106 (S1).
- the position of the focus lens 101 is displaced between the farthest end and the intermediate position, moving from the farthest end to the intermediate position and back to the farthest end (S2). That is, the focus lens 101 is half-swept according to the far sweep displacement pattern.
- a full sweep is performed so that the focus lens 101 is displaced from the farthest end toward the nearest end (S3).
- the position of the focus lens 101 is displaced so as to move from the nearest end to the intermediate position and back to the nearest end in accordance with the near sweep displacement pattern.
- half sweep is performed again.
- After S4, a full sweep is performed again so that the focus lens 101 is displaced from the nearest end toward the farthest end according to the full sweep displacement pattern (S5).
- With steps S2-S5, one cycle of the sweep operation of the focus lens in this embodiment is completed. When shooting a moving image, this operation may be repeated.
- the input indicating the completion of shooting by the user is confirmed (S6), and when the completion of shooting is instructed, the sweep operation is terminated.
- the image processing unit 109 may be configured to obtain 3D information of a shooting scene.
- From the electrical signal of the image obtained by performing a full sweep, that is, by exposure while moving the focus lens 101 according to the full sweep displacement pattern (S3), an omnifocal (all-in-focus) image can be obtained.
- From the electrical signals of the images obtained by the preceding and following half sweeps, that is, by exposure while moving the focus lens 101 according to the far sweep displacement pattern and the near sweep displacement pattern (S2, S4), depth information of the shooting scene can be obtained.
- 3D information in the shooting scene can be obtained by using this omnifocal image and depth information. That is, it is possible to obtain three-dimensional information in one scene from one full sweep image and a total of three images, i.e., a far sweep image and a near sweep image taken at timings before and after that. For example, it is possible to reconstruct an image focused on an arbitrary subject between the farthest end and the nearest end of the shooting scene.
- In this embodiment, half sweep images are used for depth estimation. As described above, since depth estimation based on half sweep images is more accurate than depth estimation based on a full sweep image, the accuracy of the three-dimensional information obtained in the present embodiment is high.
- the full sweep displacement pattern is sandwiched between the far sweep displacement pattern and the near sweep displacement pattern.
- the full sweep displacement pattern in step S3 is sandwiched between the far sweep displacement pattern in step S2 and the near sweep displacement pattern in step S4.
- the full sweep displacement pattern in step S5 is sandwiched between the near sweep displacement pattern in step S4 and the far sweep displacement pattern in step S2.
- the displacement pattern used in this embodiment has four video frames as one period, but the omnifocal image and the above-described three-dimensional information are obtained every two video frame periods. For this reason, according to this embodiment, a smooth EDOF moving image can be obtained. For example, if an image sensor capable of imaging at 30 fps is used, a 15 fps three-dimensional video can be obtained. If an image sensor capable of higher-speed imaging is used, a smoother (higher frame rate) EDOF moving image can be realized.
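The rate arithmetic in the preceding paragraph reduces to a one-line calculation (a trivial illustrative helper, not from the patent):

```python
def edof_output_fps(sensor_fps, frames_per_result):
    """Output rate of the 3D/EDOF stream when one omnifocal image plus
    depth information is produced every `frames_per_result` video frames."""
    return sensor_fps / frames_per_result

# A 30 fps sensor with one result every 2 frames yields 15 fps 3D video.
```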
- Since the omnifocal image is obtained using a full sweep displacement pattern that matches the duration of one video frame, the entire omnifocal image is acquired within a single, temporally consistent period. From these facts, according to the present embodiment, it is possible to obtain a smooth, high-quality EDOF moving image without a sense of incongruity.
- the initial position of the focus lens is set to the farthest end, but the initial position may be the nearest end.
- the displacement pattern used in the present embodiment may not include the displacement pattern in step S5.
- the generation of the three-dimensional information may be performed by a signal processing unit other than the image processing unit 109, for example, a computer or signal processing unit outside the imaging apparatus 100.
- FIG. 4B shows an example of another sweep pattern that realizes 3D video shooting.
- the order of the displacement patterns is the same as in the example described with reference to FIG. 4(a), but this example differs in that the start position of each displacement pattern is the intermediate position.
- the full sweep displacement pattern is such that the focus lens 101 moves from the intermediate position to the nearest end, then moves to the farthest end, and then returns to the intermediate position.
- the time for moving the focus lens 101 to the initial position at the start of shooting is generally shortened, and shooting can be started earlier.
- FIG. 4C shows an example of another sweep pattern that realizes three-dimensional still image shooting.
- the order of the displacement patterns is the same as in the example described with reference to FIG. 4(a), but this example differs in that each displacement pattern is a one-way displacement. According to this example, since no displacement pattern includes a reciprocating displacement, the displacement distance of the focus lens 101 is shortened, and the power consumption of the imaging apparatus 100 can be reduced.
- This sweep pattern is preferably used when continuous shooting is not required as in still image shooting. However, the displacement pattern of this example may be used for moving image shooting.
- the omnifocal image and the three-dimensional information are obtained in three video frame periods, and the rate is only slightly reduced compared to the example shown with reference to FIGS. 4 (a) and 4 (b).
- the imaging device in which the focus lens is displaced with the displacement pattern of this example can be suitably used as an imaging device with low power consumption, for example, in applications where smoothness of moving images is not important, such as a surveillance camera.
- FIG. 5 is a block diagram showing the imaging apparatus 200 of the present embodiment.
- the same components as those in the imaging device 100 according to the first embodiment are denoted by the same reference numerals.
- the imaging apparatus 200 is different from the imaging apparatus 100 in that the distance between the focus lens 101 and the imaging element 104 is changed by moving the position of the imaging element 104.
- the imaging apparatus 200 includes an imaging element position detection unit 202, an imaging element driving unit 203, an imaging element displacement setting unit 205, an imaging element displacement control unit 206, and an exposure / imaging element displacement synchronization unit 207.
- the image sensor position detector 202 includes a position sensor, detects the position of the image sensor 104, and outputs a detection signal to the image sensor displacement controller 206.
- the image sensor displacement setting unit 205 sets a displacement pattern of the image sensor 104 and sets it as the position of the target image sensor.
- the image sensor displacement control unit 206 calculates a drive signal from the difference between the target image sensor position and the current position of the image sensor 104 detected by the image sensor position detector 202, and outputs the drive signal to the image sensor drive unit 203.
- the exposure time determining unit 114 determines the exposure time of the image sensor 104. Also, information relating to the exposure time is output to the exposure / image sensor displacement synchronization unit 207.
- the exposure / imaging element displacement synchronization unit 207 outputs commands to the shutter opening / closing command unit 112, the imaging element displacement control unit 206, and the readout circuit 108 so that exposure, driving of the imaging element 104, and readout of the electrical signal from the imaging element 104 are performed at synchronized timings based on the exposure time information. Specifically, it instructs the shutter opening / closing command unit 112 on the exposure timing and the exposure time, and instructs the imaging element displacement control unit 206 on the driving of the imaging element 104 and the drive time. As a result, the imaging apparatus 200 can expose the imaging element 104 while driving it to change its position, and obtain a sweep image.
- FIG. 6 is a flowchart showing the imaging method of the present embodiment. Except that the imaging element is displaced to change the distance between the imaging element and the focus lens, it is the same as the imaging method of the first embodiment described with reference to FIG. 2.
- the displacement pattern of the image sensor is the same as that of FIGS. 4A, 4B, and 4C, which is the displacement pattern of the focus lens position described in the first embodiment.
- the imaging devices of the first and second embodiments use a CCD image sensor as the imaging element. Since a CCD image sensor is capable of a global shutter operation in which all pixels are read out simultaneously, the focus lens displacement patterns in the imaging apparatuses of the first and second embodiments are also well suited to a CCD image sensor.
- an imaging device, an integrated circuit, and an imaging method using a CMOS image sensor as an imaging element will be described.
- An image sensor constituted by a CMOS image sensor is suitable for reading a large number of pixels at high speed.
- As a result, an image sensor capable of reading out FullHD (1920 × 1080) images at 60 frames per second has been realized.
- FIG. 7A shows the timing for reading out charges from the pixel set in such an image sensor.
- the horizontal axis indicates time, and the vertical axis indicates the position of the readout row of the image sensor.
- the image sensor is composed of a plurality of N pixel rows.
- scanning is sequentially performed from the first row of the image sensor to read out charges from each pixel, and charge is accumulated immediately thereafter. After a predetermined time has elapsed, scanning is performed again to read out charges from each pixel. Thus, an image signal is obtained. After the scanning of the Nth row is completed, a continuous moving image can be obtained by repeating the scanning from the top again.
- As shown in FIG. 7A, when shooting is performed with the rolling shutter, the imaging timing is shifted within the image sensor plane, and a shift of up to one video frame occurs between the first row and the last row.
- FIG. 7B, whose horizontal axis corresponds to that of FIG. 7A, shows a displacement pattern in which the focus lens is swept from the farthest end to the nearest end within one video frame period under the rolling shutter.
- the focus lens moves in the entire range from the farthest end to the nearest end during the exposure period of the first readout row.
- the focus lens is located only at the nearest end during the exposure period of the Nth row.
- FIG. 7C shows an example of a displacement pattern suitable for an image sensor constituted by a CMOS image sensor.
- the displacement pattern shown in FIG. 7C starts from the farthest end and, after reaching the nearest end, returns to the farthest end within one video frame period; that is, it makes one reciprocating displacement between the farthest end and the nearest end per video frame.
- In this example, the reciprocating motion of the displacement pattern matches one video frame, but the reciprocating motion only needs to be synchronized with the exposure time. That is, the reciprocation may be repeated an integer number of times, two or more, within the exposure period.
- All of the pixels in the image sensor plane can be exposed uniformly by continuing this displacement pattern for a period of two video frames, that is, by performing the sweep operation twice.
- FIGS. 8A and 8B show an example of the displacement pattern of the focus lens when a full sweep image and a half sweep image are obtained using an image sensor constituted by a CMOS image sensor.
- That is, the focus lens may be swept twice to obtain one sweep image.
- the imaging apparatus according to the present embodiment is the same as the imaging apparatus according to the first embodiment except that the imaging element 104 is a CMOS image sensor. For this reason, differences will be particularly described.
- FIGS. 9A and 9B show examples of displacement patterns of the focus lens used in the imaging apparatus of the present embodiment; they correspond to FIGS. 4(a) and 4(b) of the first embodiment.
- In steps S3 and S5, a one-way movement of the focus lens position is performed. Since image exposure and readout are performed in units of video frames, the focus position is temporarily stopped at the farthest end or the nearest end. For this reason, a total of three video frames is required to capture the full sweep images in steps S3 and S5.
- In steps S2 and S4, a half sweep image is captured by two reciprocating sweeps, and the required time is two video frame periods.
- A waiting time may be inserted so that the shooting intervals of the sweep images are equal. This waiting time may also be omitted; in that case, the time intervals from step S2 to step S3 and from step S3 to step S4 are not the same.
- the full sweep operation for obtaining the omnifocal image and the half sweep operation for obtaining the depth information are continuously repeated, and the near sweep operation and the far sweep operation are alternately repeated once each in the half sweep operation.
- FIG. 9B is an example of another sweep pattern that realizes three-dimensional moving image shooting using a CMOS image sensor.
- The imaging order of the sweep images is the same as in the displacement pattern described with reference to FIG. 9A, but this pattern differs in that the start position of each sweep operation is the intermediate position and the full sweep operation is a reciprocating sweep between the nearest end and the farthest end starting from the intermediate position.
- the time for moving the focus lens 101 to the initial position at the start of photographing is generally shortened.
- Since this displacement pattern, unlike the one shown in FIG. 9A, does not include a one-way displacement for moving the focus lens position, the interval for capturing the omnifocal image can be shortened. This is effective in improving the frame rate of the moving image. Specifically, the frame rate can be increased by a factor of 1.5 compared with the case of using the displacement pattern shown in FIG. 9A.
- the omnifocal image can be taken once every four video frames. For this reason, for example, if an image sensor constituted by a CMOS image sensor capable of imaging at 60 fps is used, a three-dimensional moving image at 15 fps can be obtained. If an image sensor capable of high-speed reading is combined, the effect can be further exhibited.
- the exposure time may be controlled using an electronic shutter to limit the amount of light incident on the image sensor.
- FIG. 10A shows the read row timing of the image sensor that performs such exposure.
- the horizontal axis indicates time
- the vertical axis indicates the position of the readout row of the image sensor.
- FIG. 10B shows a displacement pattern of the focus lens in this case.
- the reciprocation cycle of the displacement pattern is shortened according to the exposure time.
- The number of times the focus lens reciprocates between the nearest end and the farthest end in the displacement pattern is two or more from the start of exposure to the end of readout.
- the focus lens at the start of exposure is positioned approximately in the middle between the nearest end and the farthest end.
- the initial position of the focus lens changes according to the exposure time varied by the electronic shutter. Note that the position of the focus lens is arbitrary in the non-exposure state. In the example shown in FIG. 10B, the initial position of the focus lens is the nearest end (dotted line portion in FIG. 10B).
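The relationship described above, in which the reciprocation cycle shortens with the electronic-shutter exposure time so that an integer number (two or more) of reciprocations fits into it, can be sketched as follows (the helper name is ours, not from the patent):

```python
def reciprocation_period(exposure_time, n_reciprocations):
    """Sweep reciprocation period such that exactly `n_reciprocations`
    full reciprocations (an integer, two or more, per the text) fit
    into the electronic-shutter exposure time."""
    if n_reciprocations < 2:
        raise ValueError("the text calls for two or more reciprocations")
    return exposure_time / n_reciprocations
```

Halving the exposure time with the electronic shutter thus halves the reciprocation period for the same number of reciprocations.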
- FIG. 11 shows an example of a displacement pattern in which a sweep image equivalent to the displacement pattern shown in FIG. 9B is obtained using an electronic shutter.
- The order in which the sweep images are captured is the same as in the displacement pattern of FIG. 9B, but the exposure time is shortened by the electronic shutter. For this reason, the period of the reciprocating displacement in each displacement pattern is shortened. Since no exposure is performed from the initial position of each displacement pattern until the start of exposure, the focus lens may be at an arbitrary position during that interval. In the example shown in FIG. 11, considering continuity with the preceding and following displacement patterns, the initial position of each displacement pattern is set to the nearest end or the farthest end. By doing so, performance equivalent to the sweep pattern described in FIG. 9B can be obtained while controlling the exposure time.
- the displacement pattern of the focus lens in the imaging apparatus of the present embodiment preferably includes a reciprocal displacement. For this reason, it is difficult to adopt the displacement pattern corresponding to FIG.
- a still image may be taken with a displacement pattern for obtaining three consecutive sweep images among the displacement patterns described with reference to FIGS.
- The above-described effects can also be obtained when, instead of the focus lens, the position of the image pickup device is displaced according to the displacement patterns shown in FIGS. 9 and 11.
- This half sweep operation alternately repeats a near sweep operation and a far sweep operation once each, and each sweep operation reciprocates twice over its predetermined sweep range, making it possible to capture a three-dimensional moving image.
- According to the imaging apparatus, integrated circuit, and imaging method disclosed in the present application, an omnifocal image and depth information can be obtained alternately, continuously, and at high speed by devising the sweep method.
- the imaging device, integrated circuit, and imaging method disclosed in the present application are suitably used for an imaging device such as a consumer or business digital still camera or digital movie camera.
Abstract
Description
1/f = 1/u + 1/v (Equation 1)
This relationship holds. When multiple lenses are present, it is considered at the lens principal point position. As an example, FIG. 16 shows the relationship between u and v when f is 18 [mm]. As the focus lens 101 is displaced, the distance v between the lens principal point and the image sensor changes. Driving the focus lens 101 so that its displacement changes at a constant speed with respect to the image sensor plane means that this rate of change of v is constant. As shown in FIG. 16, even when v is displaced at a constant speed, the distance u between the subject-side focal plane and the lens principal point is not displaced at a constant speed. Moreover, since the horizontal axis of FIG. 16 is the image-side focal distance v, it has an inverse relationship with the subject distance u: the longer the subject distance (the farther the subject), the shorter the image-side focal distance v.
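The inverse relationship described above can be illustrated with a short numerical sketch (for exposition only, not part of the disclosure): solving Equation 1 for v at the focal length f = 18 mm used in the FIG. 16 example shows v shrinking toward f as the subject distance u grows.

```python
# Illustrative sketch of Equation 1 (1/f = 1/u + 1/v): for a fixed
# focal length f, the image-side distance v decreases as the subject
# distance u increases, approaching f for distant subjects.

def image_side_distance(f_mm: float, u_mm: float) -> float:
    """Solve 1/f = 1/u + 1/v for v (all distances in mm)."""
    if u_mm <= f_mm:
        raise ValueError("subject must lie beyond the focal length")
    return 1.0 / (1.0 / f_mm - 1.0 / u_mm)

f = 18.0  # focal length from the FIG. 16 example [mm]
for u in (100.0, 500.0, 2000.0, 10000.0):
    v = image_side_distance(f, u)
    print(f"u = {u:8.0f} mm -> v = {v:.3f} mm")
```

For u = 100 mm this gives v ≈ 21.951 mm, while for u = 10000 mm it gives v ≈ 18.032 mm, consistent with the inverse relationship between u and v noted in the text.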
Hereinafter, a first embodiment of an imaging apparatus, integrated circuit, and imaging method according to the present invention will be described with reference to the drawings.
A second embodiment of the imaging apparatus, integrated circuit, and imaging method according to the present invention will be described with reference to FIGS. 5 and 6.
The imaging apparatuses of the first and second embodiments use a CCD image sensor as the image sensor. Since a CCD image sensor is capable of a global shutter operation in which all pixels are read out simultaneously, the focus lens displacement patterns in the imaging apparatuses of the first and second embodiments are suited to a CCD image sensor. In the present embodiment, an imaging apparatus, integrated circuit, and imaging method using a CMOS image sensor as the image sensor will be described.
101 focus lens
102 focus lens position detection unit
103 focus lens drive unit
104 image sensor
105 focus lens displacement setting unit
106 focus lens displacement control unit
107 exposure/focus lens displacement synchronization unit
108 readout circuit
109 image processing unit
110 recording unit
111 shutter
112 shutter open/close instruction unit
113 release reception unit
114 exposure time determination unit
115 focus lens position detection unit
120 lens
202 image sensor position detection unit
203 image sensor drive unit
205 image sensor displacement setting unit
206 image sensor displacement control unit
207 exposure/image sensor displacement synchronization unit
Claims (16)
- An imaging apparatus comprising:
an image sensor having a plurality of photoelectric conversion elements arranged two-dimensionally to form an imaging surface, the image sensor generating an image signal by exposing the plurality of photoelectric conversion elements and reading out electrical signals from the plurality of photoelectric conversion elements;
a lens optical system, including a focus lens, that converges light toward the image sensor;
a drive unit that drives one of the image sensor and the focus lens so that the distance between the image sensor and the focus lens changes;
a displacement control unit configured to control the displacement of the driven image sensor or focus lens based on a predetermined displacement pattern by outputting commands to the drive unit; and
a synchronization unit configured to control the displacement control unit based on the exposure timing of the image sensor,
wherein the predetermined displacement pattern includes a first-type displacement pattern and a second-type displacement pattern in which the image sensor or the focus lens is displaced over different ranges between a first in-focus position of the focus lens or the image sensor, at which a first subject distance in an imaging scene is in focus, and a second in-focus position of the focus lens or the image sensor, at which a second subject distance in the imaging scene is in focus, and wherein the first-type displacement pattern and the second-type displacement pattern are alternately repeated.
- The imaging apparatus according to claim 1, wherein the displacement range of the first-type displacement pattern includes at least part of the displacement range of the second-type displacement pattern.
- The imaging apparatus according to claim 2, wherein the displacement range of the first-type displacement pattern is the entire interval between the first in-focus position and the second in-focus position.
- The imaging apparatus according to claim 3, wherein the second-type displacement pattern includes a 2F-type displacement pattern whose displacement range is the entire interval between the first in-focus position and an intermediate position between the first and second in-focus positions, and a 2N-type displacement pattern whose displacement range is the entire interval between the intermediate position and the second in-focus position, and wherein the first-type displacement pattern is sandwiched between the 2F-type and 2N-type displacement patterns.
- The imaging apparatus according to claim 4, wherein each of the first-type, 2F-type, and 2N-type displacement patterns is displaced in one direction over its entire displacement range at least once.
- The imaging apparatus according to claim 5, wherein the first-type displacement pattern is contiguous with the 2F-type displacement pattern and with the 2N-type displacement pattern, respectively.
- The imaging apparatus according to any one of claims 1 to 6, further comprising:
an exposure time determination unit that determines the exposure time of the image sensor based on the imaging scene; and
a displacement setting unit that determines the displacement pattern based on the first in-focus position, the second in-focus position, and the exposure time.
- The imaging apparatus according to claim 7, further comprising a position detection unit that detects the position of the driven image sensor or focus lens, wherein the displacement control unit commands a drive amount to the drive unit based on the output of the position detection unit and the displacement pattern.
- The imaging apparatus according to claim 8, further comprising a readout circuit that reads out the image signal from the image sensor, wherein the synchronization unit controls the displacement control unit and the readout circuit based on the exposure timing of the image sensor.
- The imaging apparatus according to any one of claims 1 to 9, wherein an omnifocal image is generated from image signals obtained while the driven image sensor or focus lens is displaced based on the first-type displacement pattern.
- The imaging apparatus according to any one of claims 1 to 10, wherein depth information is generated from image signals obtained while the driven image sensor or focus lens is displaced based on the second-type displacement pattern.
- The imaging apparatus according to any one of claims 1 to 11, wherein the image sensor is a CCD image sensor.
- The imaging apparatus according to any one of claims 1 to 11, wherein the image sensor is a CMOS image sensor.
- The imaging apparatus according to claim 13, wherein each of the first-type, 2F-type, and 2N-type displacement patterns reciprocates over at least its entire displacement range an integer number of times.
- An integrated circuit for an imaging apparatus that includes: an image sensor having a plurality of photoelectric conversion elements arranged two-dimensionally to form an imaging surface, the image sensor generating an image signal by exposing the plurality of photoelectric conversion elements and reading out electrical signals from the plurality of photoelectric conversion elements; a lens optical system, including a focus lens, that converges light toward the image sensor; and a drive unit that drives one of the image sensor and the focus lens so that the distance between the image sensor and the focus lens changes, the integrated circuit comprising:
a displacement control unit configured to control the displacement of the driven image sensor or focus lens based on a predetermined displacement pattern by outputting commands to the drive unit; and
a synchronization unit configured to control the displacement control unit based on the exposure timing of the image sensor,
wherein the predetermined displacement pattern includes a first-type displacement pattern and a second-type displacement pattern in which the image sensor or the focus lens is displaced over different ranges between a first in-focus position of the focus lens or the image sensor, at which a first subject distance in an imaging scene is in focus, and a second in-focus position of the focus lens or the image sensor, at which a second subject distance in the imaging scene is in focus, and wherein the first-type displacement pattern and the second-type displacement pattern are alternately repeated.
- An imaging method for forming an image of an imaging scene by converging light with a focus lens onto an image sensor having a plurality of photoelectric conversion elements arranged two-dimensionally to form an imaging surface, the image sensor generating an image signal by exposing the plurality of photoelectric conversion elements and reading out electrical signals from the plurality of photoelectric conversion elements, the method comprising:
exposing the plurality of photoelectric conversion elements while displacing the focus lens or the image sensor in a first-type displacement pattern and a second-type displacement pattern in which the image sensor or the focus lens is displaced over different ranges between a first in-focus position of the focus lens or the image sensor, at which a first subject distance in the imaging scene is in focus, and a second in-focus position of the focus lens or the image sensor, at which a second subject distance in the imaging scene is in focus.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/809,951 US8890995B2 (en) | 2011-04-15 | 2012-04-12 | Image pickup apparatus, semiconductor integrated circuit and image pickup method |
EP12771918.5A EP2698658B1 (en) | 2011-04-15 | 2012-04-12 | Image pickup apparatus, semiconductor integrated circuit and image pickup method |
JP2012542282A JP5847728B2 (ja) | 2011-04-15 | 2012-04-12 | 撮像装置、半導体集積回路および撮像方法 |
CN201280001608.5A CN102934003B (zh) | 2011-04-15 | 2012-04-12 | 摄像装置、半导体集成电路以及摄像方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011-090782 | 2011-04-15 | ||
JP2011090782 | 2011-04-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012140899A1 true WO2012140899A1 (ja) | 2012-10-18 |
Family
ID=47009088
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/002555 WO2012140899A1 (ja) | 2011-04-15 | 2012-04-12 | 撮像装置、半導体集積回路および撮像方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US8890995B2 (ja) |
EP (1) | EP2698658B1 (ja) |
JP (1) | JP5847728B2 (ja) |
CN (1) | CN102934003B (ja) |
WO (1) | WO2012140899A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013171954A1 (ja) * | 2012-05-17 | 2013-11-21 | パナソニック株式会社 | 撮像装置、半導体集積回路および撮像方法 |
JP2014130131A (ja) * | 2012-11-29 | 2014-07-10 | Panasonic Corp | 撮像装置、半導体集積回路および撮像方法 |
WO2014171051A1 (ja) * | 2013-04-15 | 2014-10-23 | パナソニック株式会社 | 距離測定装置、及び、距離測定方法 |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8989447B2 (en) | 2012-08-13 | 2015-03-24 | Texas Instruments Incorporated | Dynamic focus for computational imaging |
JP5744161B2 (ja) * | 2013-11-18 | 2015-07-01 | シャープ株式会社 | 画像処理装置 |
JP5895270B2 (ja) * | 2014-03-28 | 2016-03-30 | パナソニックIpマネジメント株式会社 | 撮像装置 |
TWI524108B (zh) * | 2014-04-24 | 2016-03-01 | 瑞昱半導體股份有限公司 | 被動式自動對焦裝置與方法 |
US9525814B2 (en) * | 2014-10-12 | 2016-12-20 | Himax Imaging Limited | Automatic focus searching using focal sweep technique |
CN104639831B (zh) * | 2015-01-05 | 2018-12-11 | 信利光电股份有限公司 | 一种照相机及拓展景深的方法 |
CN105163034B (zh) * | 2015-09-28 | 2018-06-29 | 广东欧珀移动通信有限公司 | 一种拍照方法及移动终端 |
US10033917B1 (en) | 2015-11-13 | 2018-07-24 | Apple Inc. | Dynamic optical shift/tilt lens |
US11893755B2 (en) | 2018-01-19 | 2024-02-06 | Interdigital Vc Holdings, Inc. | Multi-focal planes with varying positions |
US11477434B2 (en) | 2018-03-23 | 2022-10-18 | Pcms Holdings, Inc. | Multifocal plane based method to produce stereoscopic viewpoints in a DIBR system (MFP-DIBR) |
WO2020010018A1 (en) | 2018-07-05 | 2020-01-09 | Pcms Holdings, Inc. | Method and system for near-eye focal plane overlays for 3d perception of content on 2d displays |
CN111064895B (zh) * | 2019-12-31 | 2022-02-01 | 维沃移动通信有限公司 | 一种虚化拍摄方法和电子设备 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE2301800A1 (de) | 1973-01-15 | 1974-10-10 | Leitz Ernst Gmbh | Verfahren zur erweiterung des schaerfentiefebereiches bei der optischen und elektronenmikroskopischen abbildung |
JPH0527084A (ja) | 1991-07-25 | 1993-02-05 | Toshiba Eng & Constr Co Ltd | 燃料集合体のチヤンネルボツクス載置確認装置 |
JP3084130B2 (ja) | 1992-05-12 | 2000-09-04 | オリンパス光学工業株式会社 | 画像入力装置 |
JP3191928B2 (ja) | 1988-02-23 | 2001-07-23 | オリンパス光学工業株式会社 | 画像入出力装置 |
JP2009545929A (ja) * | 2006-08-01 | 2009-12-24 | クゥアルコム・インコーポレイテッド | 平面視低電力モバイル装置における立体画像およびビデオのリアルタイム取得および生成 |
US7711259B2 (en) | 2006-07-14 | 2010-05-04 | Aptina Imaging Corporation | Method and apparatus for increasing depth of field for an imager |
JP2010175435A (ja) * | 2009-01-30 | 2010-08-12 | Nippon Hoso Kyokai <Nhk> | 三次元情報検出装置及び三次元情報検出方法 |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6068312A (ja) | 1983-05-13 | 1985-04-18 | Shimadzu Corp | 光学顕微鏡像の撮影方法 |
DE3905619C2 (de) | 1988-02-23 | 2000-04-13 | Olympus Optical Co | Bildeingabe-/Ausgabevorrichtung |
JP3752510B2 (ja) * | 1996-04-15 | 2006-03-08 | イーストマン コダック カンパニー | 画像の自動被写体検出方法 |
JP2001281529A (ja) * | 2000-03-29 | 2001-10-10 | Minolta Co Ltd | デジタルカメラ |
US6956612B2 (en) * | 2001-07-31 | 2005-10-18 | Hewlett-Packard Development Company, L.P. | User selectable focus regions in an image capturing device |
US7064904B2 (en) * | 2004-01-23 | 2006-06-20 | Canon Kabushiki Kaisha | Image processor allowing shooting at close range |
GB0406730D0 (en) * | 2004-03-25 | 2004-04-28 | 1 Ltd | Focussing method |
JP4348261B2 (ja) * | 2004-08-31 | 2009-10-21 | Hoya株式会社 | トリミング撮像装置 |
JP4756960B2 (ja) * | 2005-09-02 | 2011-08-24 | キヤノン株式会社 | 撮像装置及びその制御方法、コンピュータプログラム及び記憶媒体 |
JP5032775B2 (ja) * | 2006-02-17 | 2012-09-26 | 富士フイルム株式会社 | レンズ装置 |
JP2008009341A (ja) * | 2006-06-30 | 2008-01-17 | Sony Corp | オートフォーカス装置、撮像装置及びオートフォーカス方法 |
JP4951433B2 (ja) * | 2007-07-24 | 2012-06-13 | ペンタックスリコーイメージング株式会社 | 焦点調節方法及び焦点調節装置 |
US8436935B2 (en) * | 2007-08-29 | 2013-05-07 | Panasonic Corporation | Image picking-up device with a moving focusing lens |
US8390729B2 (en) * | 2007-09-05 | 2013-03-05 | International Business Machines Corporation | Method and apparatus for providing a video image having multiple focal lengths |
JP5070023B2 (ja) | 2007-12-10 | 2012-11-07 | 三星電子株式会社 | 撮像装置および撮像方法 |
JP5139840B2 (ja) | 2008-02-27 | 2013-02-06 | 京セラ株式会社 | 撮像装置、画像生成方法、および電子機器 |
JP5009880B2 (ja) * | 2008-09-19 | 2012-08-22 | 富士フイルム株式会社 | 撮像装置及び撮像方法 |
JP5138521B2 (ja) * | 2008-09-19 | 2013-02-06 | 富士フイルム株式会社 | 撮像装置及び撮像方法 |
JP5328384B2 (ja) | 2009-01-14 | 2013-10-30 | キヤノン株式会社 | レンズ制御装置、光学機器及びレンズ制御方法 |
WO2011070755A1 (ja) * | 2009-12-07 | 2011-06-16 | パナソニック株式会社 | 撮像装置及びその制御方法 |
WO2011070757A1 (ja) * | 2009-12-07 | 2011-06-16 | パナソニック株式会社 | 撮像装置および撮像方法 |
JP5177184B2 (ja) * | 2010-07-30 | 2013-04-03 | 株式会社ニコン | 焦点調節装置および撮像装置 |
-
2012
- 2012-04-12 US US13/809,951 patent/US8890995B2/en not_active Expired - Fee Related
- 2012-04-12 CN CN201280001608.5A patent/CN102934003B/zh not_active Expired - Fee Related
- 2012-04-12 JP JP2012542282A patent/JP5847728B2/ja not_active Expired - Fee Related
- 2012-04-12 WO PCT/JP2012/002555 patent/WO2012140899A1/ja active Application Filing
- 2012-04-12 EP EP12771918.5A patent/EP2698658B1/en not_active Not-in-force
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE2301800A1 (de) | 1973-01-15 | 1974-10-10 | Leitz Ernst Gmbh | Verfahren zur erweiterung des schaerfentiefebereiches bei der optischen und elektronenmikroskopischen abbildung |
JP3191928B2 (ja) | 1988-02-23 | 2001-07-23 | オリンパス光学工業株式会社 | 画像入出力装置 |
JPH0527084A (ja) | 1991-07-25 | 1993-02-05 | Toshiba Eng & Constr Co Ltd | 燃料集合体のチヤンネルボツクス載置確認装置 |
JP3084130B2 (ja) | 1992-05-12 | 2000-09-04 | オリンパス光学工業株式会社 | 画像入力装置 |
US7711259B2 (en) | 2006-07-14 | 2010-05-04 | Aptina Imaging Corporation | Method and apparatus for increasing depth of field for an imager |
JP2009545929A (ja) * | 2006-08-01 | 2009-12-24 | クゥアルコム・インコーポレイテッド | 平面視低電力モバイル装置における立体画像およびビデオのリアルタイム取得および生成 |
JP2010175435A (ja) * | 2009-01-30 | 2010-08-12 | Nippon Hoso Kyokai <Nhk> | 三次元情報検出装置及び三次元情報検出方法 |
Non-Patent Citations (3)
Title |
---|
H. NAGAHARA; S. KUTHIRUMMAL; C. ZHOU; S. NAYAR: "Flexible Depth of Field Photography", EUROPEAN CONFERENCE ON COMPUTER VISION (ECCV, 16 October 2008 (2008-10-16) |
See also references of EP2698658A4 |
SHUHEI MATSUI; HAJIME NAGAHARA; RIN'ICHIRO TANIGUCHI: "Focus Sweep Imaging for Depth From Defocus", INFORMATION PROCESSING SOCIETY OF JAPAN, SIG NOTES, 2010-CVIM-174, 2010 |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013171954A1 (ja) * | 2012-05-17 | 2013-11-21 | パナソニック株式会社 | 撮像装置、半導体集積回路および撮像方法 |
US8890996B2 (en) | 2012-05-17 | 2014-11-18 | Panasonic Corporation | Imaging device, semiconductor integrated circuit and imaging method |
JP2014130131A (ja) * | 2012-11-29 | 2014-07-10 | Panasonic Corp | 撮像装置、半導体集積回路および撮像方法 |
WO2014171051A1 (ja) * | 2013-04-15 | 2014-10-23 | パナソニック株式会社 | 距離測定装置、及び、距離測定方法 |
US9467616B2 (en) | 2013-04-15 | 2016-10-11 | Panasonic Intellectual Property Management Co., Ltd. | Distance measurement apparatus and distance measurement method |
JPWO2014171051A1 (ja) * | 2013-04-15 | 2017-02-16 | パナソニックIpマネジメント株式会社 | 距離測定装置、及び、距離測定方法 |
Also Published As
Publication number | Publication date |
---|---|
CN102934003A (zh) | 2013-02-13 |
EP2698658A4 (en) | 2014-09-24 |
US20130113984A1 (en) | 2013-05-09 |
EP2698658B1 (en) | 2018-09-12 |
EP2698658A1 (en) | 2014-02-19 |
JPWO2012140899A1 (ja) | 2014-07-28 |
CN102934003B (zh) | 2016-06-08 |
JP5847728B2 (ja) | 2016-01-27 |
US8890995B2 (en) | 2014-11-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5847728B2 (ja) | 撮像装置、半導体集積回路および撮像方法 | |
JP5934940B2 (ja) | 撮像装置、半導体集積回路および撮像方法 | |
JP5809688B2 (ja) | 撮像装置、半導体集積回路および撮像方法 | |
JP5914834B2 (ja) | 撮像装置、半導体集積回路および撮像方法 | |
US20120268613A1 (en) | Image capturing apparatus and control method thereof | |
JP6210333B2 (ja) | 距離測定装置、及び、距離測定方法 | |
JP2014007580A (ja) | 撮像装置およびその制御方法ならびにプログラム | |
JP6152772B2 (ja) | 撮像装置、半導体集積回路および撮像方法 | |
JP6238578B2 (ja) | 撮像装置およびその制御方法 | |
JP6486041B2 (ja) | 撮像装置およびその制御方法 | |
JP3542312B2 (ja) | 電子的撮像装置 | |
JP6994665B1 (ja) | 撮像装置 | |
JP2010107662A (ja) | 撮像装置、測距装置および測距方法 | |
JP2006023653A (ja) | 光学機器 | |
WO2012157407A1 (ja) | 撮像装置及び合焦制御方法 | |
JP2017134154A (ja) | フォーカス制御装置、撮像装置およびフォーカス制御プログラム | |
JP7175635B2 (ja) | 撮像装置およびその制御方法 | |
JP6900228B2 (ja) | 撮像装置、撮像システム、撮像装置の制御方法、および、プログラム | |
JP2006237764A (ja) | ビデオカメラ | |
JP2021043371A (ja) | 撮像装置およびその制御方法 | |
JP2010054922A (ja) | 電子カメラ |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201280001608.5 Country of ref document: CN |
|
ENP | Entry into the national phase |
Ref document number: 2012542282 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12771918 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13809951 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012771918 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |