WO2017163588A1 - Image processing apparatus, image pickup apparatus, and control methods therefor, and program - Google Patents
- Publication number: WO2017163588A1 (PCT application PCT/JP2017/002504)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/673—Focus control based on electronic image sensor signals based on contrast or high frequency components of image signals, e.g. hill climbing method
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
- G02B7/34—Systems for automatic generation of focusing signals using different areas in a pupil plane
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B13/00—Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
- G03B13/32—Means for focusing
- G03B13/34—Power focusing
- G03B13/36—Autofocus systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
- H04N23/958—Computational photography systems, e.g. light-field imaging systems for extended depth of field imaging
Definitions
- the present invention relates to an image processing apparatus, an imaging apparatus, a control method thereof, and a program.
- Non-Patent Document 1 discloses an imaging apparatus using an image sensor in which one microlens and a plurality of divided photoelectric conversion units are formed for each pixel. Each divided photoelectric conversion unit receives, via the shared microlens, a light beam that has passed through a different pupil partial region of the photographing lens, thereby realizing so-called pupil division. The output signal is therefore equivalent to Light Field (LF) data containing both the spatial distribution of light intensity and its angular distribution, and a plurality of viewpoint images can be obtained from it.
- In Non-Patent Document 1, the acquired LF data is used to generate a composite image formed on a virtual plane different from the imaging surface, which makes it possible to change the focus position (in-focus position) of a captured image after shooting. This is also called a refocusing technique.
- However, Non-Patent Document 1 does not consider how to operate, in parallel, the display of a viewpoint-changed image and the display of an image whose focus position has been changed.
- The present invention has been made in view of the above problem. That is, it provides a technique that allows an operation for displaying a viewpoint-changed image and an operation for displaying an image whose focus position has been changed to be performed in parallel, based on a plurality of viewpoint images.
- To this end, the image processing apparatus of the present invention has the following configuration: acquisition means for acquiring an image signal including the intensity and angle information of incident light; operation means for receiving an operation for changing the viewpoint and an operation for changing the focus position; and processing means for generating, based on a plurality of viewpoint images obtained from the image signal, a display image in which the viewpoint is changed according to the viewpoint-changing operation and the focus position is changed according to the focus-changing operation.
- According to the present invention, an operation for displaying a viewpoint-changed image and an operation for displaying an image with a changed focus position can be performed in parallel.
- FIG. 1 is a block diagram illustrating a functional configuration example of a digital camera as an example of an image processing apparatus according to an embodiment of the present invention.
- A block diagram illustrating an example of the functional configuration of the image processing unit according to the first embodiment.
- FIG. 3 is a diagram schematically illustrating a pixel arrangement according to the first embodiment.
- A plan view and a cross-sectional view schematically illustrating the outline of a pixel according to the first embodiment.
- FIG. 6 is a diagram for explaining an example of a light intensity distribution inside a pixel according to the first embodiment.
- FIG. 6 is a diagram for explaining an example of pupil intensity distribution according to the first embodiment.
- A diagram for explaining the relationship between the image sensor according to the first embodiment and pupil division.
- A diagram for explaining the relationship between the defocus amount and the image shift amount in the first viewpoint image and the second viewpoint image.
- FIG. 5 is a diagram illustrating an example of a contrast distribution of a captured image according to the first embodiment.
- A diagram for explaining an example of parallax enhancement in which the difference between viewpoint images according to the first embodiment is enlarged.
- A diagram for explaining the outline of the refocus processing according to the first embodiment.
- FIG. 4 is a diagram for explaining a relationship between an image sensor according to Embodiment 1 and pupil division.
- A diagram for explaining pupil shift at peripheral image heights of the image sensor according to the first embodiment.
- A flowchart showing a series of operations related to viewpoint movement and focus adjustment for a captured image according to the first embodiment.
- A flowchart showing a series of operations of the viewpoint image operation processing according to the first embodiment.
- A flowchart showing a series of operations of the development processing according to the first embodiment.
- A flowchart showing a series of operations of the parallax image operation processing according to the second embodiment.
- A diagram schematically illustrating a UI for viewpoint movement and focus adjustment according to the third embodiment.
- A flowchart showing a series of operations of the parallax image operation processing according to the fourth embodiment.
- A diagram schematically illustrating a UI example of viewpoint movement and focus adjustment according to the fourth embodiment.
- A diagram schematically illustrating a UI example of viewpoint movement and focus adjustment according to the fourth embodiment.
- A diagram schematically illustrating a UI example of viewpoint movement and focus adjustment according to the fourth embodiment.
- A configuration may also be included in which an arbitrary device transmits LF data and operation contents to a server device (including a virtual machine) provided with processing means such as a processor on the Internet or a local network, and part or all of the processing for the LF data is executed on the server device.
- In that case, the configuration may be such that the arbitrary device receives and displays the processing result from the server device.
- FIG. 1 is a block diagram illustrating a functional configuration example of a digital camera 100 as an example of an image processing apparatus according to the present embodiment.
- One or more of the functional blocks shown in FIG. 1 may be realized by hardware such as an ASIC or a programmable logic array (PLA), by a programmable processor such as a CPU or MPU executing software, or by a combination of software and hardware. Therefore, in the following description, even when different functional blocks are described as the operating subject, they can be realized by the same hardware.
- the first lens group 101 includes, for example, a zoom lens that constitutes the imaging optical system, is disposed at the tip of the imaging optical system, and is held so as to be movable back and forth in the optical axis direction.
- The shutter 102 includes a diaphragm, and adjusts the amount of light incident on the image sensor 107 during photographing by adjusting its aperture diameter. When a still image is taken, it also functions as a shutter that adjusts the exposure time.
- The shutter 102 and the second lens group 103 constituting the imaging optical system move forward and backward in the optical axis direction integrally, and realize a magnification-varying (zoom) function by interlocking with the forward and backward movement of the first lens group 101.
- the third lens group 105 includes, for example, a focus lens that forms an imaging optical system, and performs focus adjustment by advancing and retracting in the optical axis direction.
- the optical element 106 includes an optical low-pass filter, and reduces false colors and moire in the captured image.
- the image sensor 107 includes an image sensor composed of, for example, a CMOS photosensor and a peripheral circuit, and is disposed on the imaging surface of the imaging optical system.
- The zoom actuator 111 includes a driving device that moves the first to third lens groups forward and backward. By rotating a cam cylinder (not shown), the zoom actuator 111 drives the first to third lens groups forward and backward in the optical axis direction.
- the aperture shutter actuator 112 includes a drive device that generates the operation of the shutter 102, and controls the aperture diameter and shutter operation of the shutter 102 according to the control of the aperture shutter drive unit 128.
- the focus actuator 114 includes a driving device that generates an advance / retreat operation of the third lens group 105, and performs focus adjustment by driving the third lens group 105 forward / backward in the optical axis direction.
- the illuminating device 115 includes an electronic flash for illuminating a subject at the time of photographing.
- The auxiliary light emitting unit 116 includes a light emitting device for AF auxiliary light, and improves focus detection performance for dark or low-contrast subjects by projecting an image of a mask having a predetermined aperture pattern onto the subject field through a light projecting lens.
- The control unit 121 includes a CPU (or MPU), a ROM, and a RAM, and controls each unit of the entire digital camera 100 by loading and executing a program stored in the ROM, thereby executing a series of operations such as AF, photographing, image processing, and recording.
- the control unit 121 may include an A / D converter, a D / A converter, a communication interface circuit, and the like.
- The control unit 121 also functions as a display control unit that controls the content displayed on the display unit 131, and may execute the processing of the image processing unit 125 in its place.
- the electronic flash control unit 122 includes a control circuit or a control module, and controls lighting of the illumination device 115 in synchronization with the photographing operation.
- the auxiliary light driving unit 123 controls the lighting of the auxiliary light emitting unit 116 in synchronization with the focus detection operation.
- the image sensor driving unit 124 controls the imaging operation of the image sensor 107 and A / D-converts the acquired image signal and transmits it to the control unit 121.
- The image processing unit 125 performs processes such as gamma conversion, color interpolation, and JPEG compression on the image acquired by the image sensor 107.
- the focus driving unit 126, the aperture shutter driving unit 128, and the zoom driving unit 129 each include a control circuit or a control module.
- the focus driving unit 126 controls the focus actuator 114 based on the focus detection result.
- the aperture shutter drive unit 128 controls the aperture shutter actuator 112 at a predetermined timing of the photographing operation.
- the zoom drive unit 129 controls the zoom actuator 111 according to the zoom operation of the photographer.
- the display unit 131 includes a display device such as an LCD, and displays, for example, information on the shooting mode of the camera, a preview image before shooting and a confirmation image after shooting, a display image in a focused state at the time of focus detection, and the like.
- the operation unit 132 includes a switch group for operating the digital camera 100, and includes, for example, a power switch, a release (shooting trigger) switch, a zoom operation switch, a shooting mode selection switch, and the like.
- the control unit 121 controls each unit of the digital camera 100 in order to execute an operation corresponding to the user operation.
- the recording medium 133 includes, for example, a detachable flash memory, and records captured images.
- the communication unit 134 includes a communication circuit or a module, and establishes communication with an external device (for example, a server installed outside) using a communication method compliant with a predetermined standard.
- the communication unit 134 performs upload / download of image data, reception of a result of a predetermined process performed by the external device with respect to the uploaded image data, and the like.
- the image acquisition unit 151 stores image data read from the recording medium 133.
- The image data consists of an image obtained by combining a first viewpoint image and a second viewpoint image, described later (also referred to as an A+B image), together with the first viewpoint image itself.
- the subtraction unit 152 generates a second viewpoint image by subtracting the first viewpoint image from the A + B image.
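The subtraction step above can be sketched as follows; a minimal illustration with toy data (the function name and values are mine, not from the patent), assuming both images are raw integer arrays of the same shape:

```python
import numpy as np

def second_viewpoint(a_plus_b, a):
    """Recover the B (second viewpoint) image by subtracting the first
    viewpoint image A from the combined A+B image, as subtraction
    unit 152 does. Widening to int32 avoids unsigned underflow."""
    return a_plus_b.astype(np.int32) - a.astype(np.int32)

# Toy 2x2 example: combined A+B counts and first-viewpoint counts.
ab = np.array([[10, 12], [14, 16]])
a = np.array([[4, 5], [6, 7]])
b = second_viewpoint(ab, a)
```

Recording only A+B and A (instead of A and B separately) halves the extra data needed to reconstruct both viewpoints.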
- the shading processing unit 153 corrects the light amount change due to the image heights of the first viewpoint image and the second viewpoint image.
- the operation information acquisition unit 154 receives adjustment values for viewpoint movement and refocus changed by the user, and supplies adjustment values operated by the user to the viewpoint change processing unit 155 and the refocus processing unit 156.
- the viewpoint change processing unit 155 synthesizes an image in which the viewpoint is changed by changing the addition ratio (weighting) of the first viewpoint image and the second viewpoint image. Although details will be described later, an image in which the depth of field is enlarged or reduced can be generated by the processing of the viewpoint change processing unit 155.
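The weighted combination can be sketched as below; the simple linear blend and parameter `w` are illustrative assumptions, not the patent's exact addition ratio:

```python
import numpy as np

def shift_viewpoint(a, b, w):
    """Blend the first (a) and second (b) viewpoint images with
    complementary weights. w = 0.5 reproduces the ordinary combined
    image; biasing w toward 0 or 1 shifts the effective viewpoint
    toward one pupil partial region. Illustrative sketch only."""
    return w * a + (1.0 - w) * b

a = np.array([[2.0, 4.0]])
b = np.array([[6.0, 8.0]])
```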
- the refocus processing unit 156 generates a composite image by shift-adding the first viewpoint image and the second viewpoint image in the pupil division direction, and generates images at different focus positions. Details of the processing by the refocus processing unit 156 will be described later.
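The shift-add idea can be sketched as follows; a minimal version under the assumption of an integer shift along the pupil-division axis (a real implementation would crop rather than wrap at the borders):

```python
import numpy as np

def refocus(a, b, shift):
    """Shift the two viewpoint images by opposite integer amounts along
    the pupil-division (horizontal) axis and add them, approximating an
    image formed on a virtual plane; the sign of `shift` selects front
    or rear virtual planes. np.roll wraps at the borders, which a real
    implementation would crop. Sketch only."""
    return np.roll(a, shift, axis=1) + np.roll(b, -shift, axis=1)

a = np.array([[0, 1, 0, 0]])
b = np.array([[0, 0, 0, 1]])
```

With shift = 0 the two viewpoints simply add (the ordinary A+B image); a nonzero shift re-aligns defocused subjects, moving the apparent focus.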
- the image processing unit 125 performs development processing by the configuration of a white balance unit 157, a demosaicing unit 158, a gamma conversion unit 159, and a color adjustment unit 160 described below.
- the white balance unit 157 performs white balance processing. Specifically, a gain is applied to each of R, G, and B so that R, G, and B in the white region have the same color.
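The per-channel gain idea can be sketched as below; normalizing the gains to the G channel is a common convention and an assumption here — the patent does not specify the exact gain computation:

```python
import numpy as np

def white_balance(rgb, white_patch):
    """Scale R, G, B so that a known white region becomes neutral.
    Gains are G_mean / channel_mean over the white patch; sketch of
    the white balance unit 157, not its exact method."""
    means = white_patch.reshape(-1, 3).mean(axis=0)
    gains = means[1] / means          # normalize gains to the G channel
    return rgb * gains

patch = np.array([[[2.0, 4.0, 8.0]]])  # a bluish "white" region
balanced = white_balance(patch, patch)
```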
- The demosaicing unit 158 interpolates, for each pixel, the two colors missing from the three-primary-color mosaic image data, generating a color image in which R, G, and B image data are present at every pixel.
- Specifically, the demosaicing unit 158 interpolates the target pixel using the pixels around it; as a result of this interpolation, color image data of the three primary colors R, G, and B is generated for each pixel.
- the gamma conversion unit 159 applies gamma correction processing to the color image data of each pixel to generate color image data matched with the display characteristics of the display unit 131, for example.
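The gamma step can be sketched as a plain power law; the actual display characteristic of the display unit 131 (e.g. an sRGB-like curve) is not given in the text, so the exponent here is an assumption:

```python
import numpy as np

def gamma_encode(linear, gamma=2.2):
    """Map linear sensor values in [0, 1] to display-referred values
    with a simple power-law curve. Sketch of the gamma conversion
    unit 159; the real display characteristic may differ."""
    return np.clip(linear, 0.0, 1.0) ** (1.0 / gamma)

ends = gamma_encode(np.array([0.0, 1.0]))
mid = gamma_encode(np.array([0.25]))[0]
```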
- the color adjustment unit 160 applies various color adjustment processes such as noise reduction, saturation enhancement, hue correction, and edge enhancement, which are processes for improving the appearance of the image, to the color image data.
- the compression unit 161 compresses the color-adjusted color image data by a method based on a predetermined compression method such as JPEG, and reduces the data size of the color image data when recording.
- the output unit 163 outputs the above-described color image data, compressed image data, or display data for a user interface.
- FIG. 3 shows a two-dimensionally arranged pixel array in a range of 4 columns ⁇ 4 rows, and further shows a subpixel array included in each pixel in a range of 8 columns ⁇ 4 rows.
- In each 2 × 2 pixel group, a pixel 200R having R (red) spectral sensitivity is arranged at the upper left position, pixels 200G having G (green) spectral sensitivity at the upper right and lower left positions, and a pixel 200B having B (blue) spectral sensitivity at the lower right position.
- Furthermore, each pixel has a sub-pixel 201 and a sub-pixel 202 arranged in 2 columns × 1 row.
- FIG. 4A is a plan view of the pixel 200G viewed from the light receiving surface side (+z side) of the image sensor 107, and FIG. 4B is a cross-sectional view of the a-a cross section of FIG. 4A viewed from the −y side.
- The pixel 200G includes a photoelectric conversion unit 301 and a photoelectric conversion unit 302, obtained by dividing the pixel into N_H (= 2) parts in the x direction and N_V (= 1) part in the y direction.
- the photoelectric conversion unit 301 and the photoelectric conversion unit 302 correspond to the sub-pixel 201 and the sub-pixel 202, respectively.
- The pixel 200G has a microlens 305 for condensing incident light on the light receiving side (+z direction) of the pixel, and is configured so that a light beam incident through the microlens 305 is received by the photoelectric conversion unit 301 or the photoelectric conversion unit 302.
- The photoelectric conversion unit 301 and the photoelectric conversion unit 302 may each be a pin-structure photodiode in which an intrinsic layer is sandwiched between a p-type layer and an n-type layer or, if necessary, a pn-junction photodiode in which the intrinsic layer is omitted.
- The color filter 306 is disposed between the microlens 305 and the photoelectric conversion units 301 and 302, and passes light in a predetermined wavelength band.
- FIG. 4B shows an example in which one color filter 306 is provided for the pixel 200G; however, if necessary, a color filter having a different spectral transmittance may be provided for each sub-pixel, or the color filter may be omitted.
- In each photoelectric conversion unit, pairs of electrons and holes are generated according to the amount of received light and separated by the depletion layer. The negatively charged electrons are accumulated in the n-type layer, while the holes are discharged outside the image sensor 107 through the p-type layer 300 connected to a constant voltage source (not shown). The electrons accumulated in the n-type layers of the photoelectric conversion units 301 and 302 are transferred to a capacitance unit (floating diffusion, FD) through a transfer gate and converted into a voltage signal.
- FIG. 5 shows a correspondence between a cross-sectional view of the pixel 200G shown in FIG. 4A taken along the line aa from the + y side and the exit pupil plane of the imaging optical system.
- the x-axis and y-axis of the cross-sectional view of the pixel 200G are inverted with respect to FIGS. 4A to 4B in order to correspond to the coordinate axis of the exit pupil plane.
- the pupil partial area 501 of the sub-pixel 201 represents a pupil area that can be received by the sub-pixel 201.
- The pupil partial region 501 of the sub-pixel 201 has its centroid decentered toward the +X side on the pupil plane, and is in an approximately conjugate relationship, via the microlens, with the light receiving surface of the photoelectric conversion unit 301 whose centroid is decentered in the −x direction.
- the pupil partial area 502 of the sub-pixel 202 represents a pupil area that can be received by the sub-pixel 202.
- The pupil partial region 502 of the sub-pixel 202 has its centroid decentered toward the −X side on the pupil plane, and is in an approximately conjugate relationship, via the microlens, with the light receiving surface of the photoelectric conversion unit 302 whose centroid is decentered in the +x direction.
- The pupil region 500 is the pupil region over which the entire pixel 200G can receive light when the photoelectric conversion units 301 and 302 (sub-pixels 201 and 202) are all combined.
- FIGS. 6A and 6B show examples of light intensity distributions when light is incident on the microlens 305 formed in the pixel 200G. FIG. 6A shows the light intensity distribution in a cross section parallel to the microlens optical axis, and FIG. 6B shows the light intensity distribution in a cross section perpendicular to the microlens optical axis at the microlens focal position.
- In FIG. 6A, H represents the convex-side principal plane of the microlens 305, f represents the focal length of the microlens, nFΔ represents the movable range of the focal position by refocusing (described later), and φ represents the maximum angle of the incident light beam.
- The incident light is condensed at the focal position by the microlens 305; however, due to diffraction arising from the wave nature of light, the diameter of the condensed spot cannot be made smaller than the diffraction limit Δ.
- The condensed spot of the microlens is also about 1 μm in diameter.
- For this reason, the pupil partial region 501, which is conjugate with the light receiving surface of the photoelectric conversion unit 301 via the microlens 305 (and likewise the pupil partial region 502 for the photoelectric conversion unit 302), is not clearly pupil-divided due to diffraction blur, and instead exhibits a gradually varying light reception rate distribution (pupil intensity distribution).
- The pupil intensity distribution in this pixel 200G is as shown schematically in FIG. 7, with pupil coordinates on the horizontal axis and the light reception rate on the vertical axis.
- The pupil intensity distribution 701 (solid line) is an example of the pupil intensity distribution along the X axis of the pupil partial region 501 in FIG. 5, and the pupil intensity distribution 702 (broken line) is an example of the pupil intensity distribution along the X axis of the pupil partial region 502.
- the pupil partial area 501 and the pupil partial area 502 have gentle pupil intensity peaks at different pupil positions, indicating that the light passing through the microlens 305 is gently divided into pupils.
- each light flux that has passed through different pupil partial areas passes through the imaging surface 800 and enters each pixel of the image sensor 107 at different angles.
- The light is received by the sub-pixel 201 (photoelectric conversion unit 301) and the sub-pixel 202 (photoelectric conversion unit 302) of each 2 × 1-divided pixel. That is, the image sensor 107 has an array of pixels each provided with a plurality of sub-pixels configured to receive light beams passing through different pupil partial regions of the imaging optical system.
- In this embodiment, the light reception signals of the sub-pixels 201 of the pixels are collected to generate a first viewpoint image, and the light reception signals of the sub-pixels 202 of the pixels are collected to generate a second viewpoint image.
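The collection of sub-pixel signals into viewpoint images can be sketched as below, under the assumption (mine, for illustration) that each pixel's two sub-pixels occupy adjacent columns of the readout:

```python
import numpy as np

def split_viewpoints(raw):
    """Gather sub-pixel signals into viewpoint images. Assuming each
    pixel's two sub-pixels (201 | 202) are read out as adjacent
    columns, even columns form the first viewpoint image and odd
    columns the second. The readout order is an assumption."""
    return raw[:, 0::2], raw[:, 1::2]

raw = np.array([[1, 2, 3, 4],
                [5, 6, 7, 8]])
first, second = split_viewpoints(raw)
```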
- In this embodiment, the first viewpoint image and the second viewpoint image are Bayer-array images; demosaicing processing may be applied to them as necessary.
- Although FIG. 8A shows an example in which the pupil region is divided into two in the horizontal direction, pupil division may instead be performed in the vertical direction depending on the sub-pixel division method.
- The present invention is not limited to this; this embodiment and the other embodiments are applicable as long as a plurality of viewpoint images can be acquired by a known technique. For example, as in Japanese Patent Application Laid-Open No. , a configuration in which a plurality of cameras with different viewpoints are collectively regarded as the image sensor 107 may be used.
- Alternatively, a configuration may be used in which the image from the imaging optical system is re-imaged on a microlens array (called re-imaging because the once-formed image is in a diffusing state) and an image pickup element is provided on that image plane.
- Furthermore, a method of inserting a mask (gain modulation element) with an appropriate pattern into the optical path of the photographing optical system can also be used.
- FIG. 8B schematically shows the relationship between the defocus amount between the first viewpoint image and the second viewpoint image and the image shift amount between the first viewpoint image and the second viewpoint image.
- The image sensor 107 (not shown in FIG. 8B) is arranged on the imaging surface 800, and as in FIG. 5, the exit pupil of the imaging optical system is divided into two: the pupil partial region 501 and the pupil partial region 502.
- The defocus amount d represents, by its magnitude |d|, the distance from the imaging position of the subject to the imaging surface 800. Its sign is negative (d < 0) when the imaging position of the subject is on the subject side of the imaging surface 800 (also referred to as a front focus state), and positive (d > 0) when the imaging position is on the side opposite the subject (also referred to as a rear focus state).
- The front focus state (d < 0) and the rear focus state (d > 0) are collectively called a defocus state (|d| > 0).
- In the front focus state (d < 0), of the light flux from the subject 802, the light flux that has passed through the pupil partial region 501 (502) is condensed once and then spreads to a width Γ1 (Γ2) around the centroid position G1 (G2) of the light flux, forming a blurred image on the imaging surface 800.
- The blurred image is received by the sub-pixels 201 (202) constituting each pixel arranged in the image sensor, and a first viewpoint image (second viewpoint image) is generated. Therefore, the first viewpoint image (second viewpoint image) is recorded as a subject image in which the subject 802 is blurred with the width Γ1 (Γ2) at the centroid position G1 (G2) on the imaging surface 800.
- The blur width Γ1 (Γ2) of the subject image increases roughly in proportion to the magnitude |d| of the defocus amount d. Similarly, the magnitude |p| of the image shift amount p of the subject image between the first viewpoint image and the second viewpoint image (that is, the difference G1 − G2 between the centroid positions of the light fluxes) also increases roughly in proportion to |d|.
- In the rear focus state (d > 0), the image shift direction of the subject image between the first viewpoint image and the second viewpoint image is opposite to that in the front focus state, but the relationship is otherwise the same.
- Therefore, as the magnitude of the defocus amount of the first and second viewpoint images, or of the imaging signal obtained by adding them, increases, the image shift amount between the first viewpoint image and the second viewpoint image increases.
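This proportionality is what phase-difference detection exploits: the shift p can be estimated by correlating the two viewpoint images. A minimal sum-of-absolute-differences search, purely illustrative (real systems use more robust correlation and sub-pixel interpolation):

```python
import numpy as np

def image_shift(a, b, max_shift=4):
    """Estimate the horizontal image shift between two viewpoint images
    by finding the roll of b that minimizes the sum of absolute
    differences against a, evaluated away from the wrapped borders.
    Returns the roll amount needed to align b to a. Sketch only."""
    best, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        err = np.abs(np.roll(b, s, axis=1) - a)[:, max_shift:-max_shift].sum()
        if err < best_err:
            best, best_err = s, err
    return best

a = np.array([[0, 0, 0, 0, 1, 2, 3, 4, 0, 0, 0, 0]], dtype=float)
b = np.roll(a, 1, axis=1)  # b is a shifted one column to the right
```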
- Viewpoint image correction processing and refocus processing are performed in three stages. In the first stage, the viewpoint change processing unit 155 calculates a contrast distribution representing the level of contrast based on each pixel value of the captured image. In the second stage, based on the calculated contrast distribution, the viewpoint change processing unit 155 applies, for each pixel, a conversion that enlarges the difference between the plurality of viewpoint images (first and second viewpoint images) to enhance parallax, generating a plurality of corrected viewpoint images (first and second corrected viewpoint images). In the third stage, the refocus processing unit 156 relatively shift-adds the plurality of corrected viewpoint images to generate a refocus image.
- i and j are integers, and the position at the j-th place in the row direction and the i-th place in the column direction of the image sensor 107 is represented as (j, i).
- the first viewpoint image of the pixel at the position (j, i) is A0 (j, i)
- the second viewpoint image is B0 (j, i)
- the viewpoint change processing unit 155 calculates the luminance Y (j, i) by matching the color centroids of R, G, and B at each position (j, i) of the Bayer-arrayed captured image I (j, i) according to Equation (1).
- the viewpoint change processing unit 155 applies a high-pass filter such as [1, 2, -1, -4, -1, 2, 1] to the luminance Y (j, i) in the horizontal direction (column i direction), which is the pupil division direction, to calculate the high-frequency component dY (j, i) in the horizontal direction.
- if necessary, the viewpoint change processing unit 155 may apply a high-frequency cut filter such as [1, 1, 1, 1, 1, 1, 1] in the vertical direction (row j direction), which is not the pupil division direction, to suppress high-frequency noise in the vertical direction.
- the viewpoint change processing unit 155 then calculates the normalized horizontal high-frequency component dZ (j, i) according to Equation (2).
- the constant Y0 is added to the denominator to prevent Equation (2) from diverging due to division by zero.
- before the normalization in Equation (2), the viewpoint change processing unit 155 may suppress high-frequency noise by applying a high-frequency cut filter to the luminance Y (j, i) as necessary.
- the viewpoint change processing unit 155 calculates the contrast distribution C (j, i) according to the equation (3).
- the first line of Expression (3) indicates that the contrast distribution C (j, i) is set to 0 when the luminance of the captured image is lower than the predetermined luminance Yc.
- the third line of Equation (3) indicates that the contrast distribution C (j, i) is set to 1 when the normalized high-frequency component dZ (j, i) is larger than the predetermined value Zc.
- the second line of Expression (3) indicates that a value obtained by normalizing dZ (j, i) with Zc is a contrast distribution C (j, i).
- the contrast distribution C (j, i) takes a value in the range of [0, 1]. The closer to 0, the lower the contrast, and the closer to 1, the higher the contrast.
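The luminance high-pass, normalization, and contrast steps described above can be sketched as follows. The filter kernel is the one given in the text; the constants Y0, Yc, and Zc are illustrative assumptions (the exact forms are in Equations (1) to (3)):

```python
import numpy as np

def horizontal_highpass(Y, kernel=(1, 2, -1, -4, -1, 2, 1)):
    """Apply the horizontal (pupil-division direction) high-pass filter
    from the text to the luminance image Y, row by row."""
    k = np.asarray(kernel, dtype=float)
    return np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, Y)

def contrast_distribution(Y, dY, Y0=1.0, Yc=16.0, Zc=0.5):
    """Sketch of Equations (2)-(3): normalized high-frequency component dZ
    and contrast distribution C in [0, 1]. Y0, Yc, Zc are assumed constants."""
    dZ = np.abs(dY) / (Y + Y0)       # Equation (2); Y0 prevents divide-by-zero
    C = np.clip(dZ / Zc, 0.0, 1.0)   # Equation (3), second and third lines
    C[Y < Yc] = 0.0                  # first line: dark pixels get C = 0
    return C
```

A flat (textureless) region yields dZ = 0 and hence C = 0, while strong horizontal edges saturate C at 1, matching the behavior described for FIG. 9.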
- FIG. 9 shows an example of the contrast distribution C (j, i) of the captured image obtained by Expression (3).
- the white portion indicates that there are many horizontal high-frequency components and the contrast is high
- the black portion indicates that there are few horizontal high-frequency components and the contrast is low.
- parallax enhancement processing of the parallax image
- an image shift distribution of the viewpoint image is calculated.
- the image shift distribution is obtained by performing a correlation operation on the pair of images, the first viewpoint image A0 and the second viewpoint image B0, and calculating the relative positional shift amount of the pair.
- various methods are known for the correlation calculation.
- for example, the viewpoint change processing unit 155 can obtain a correlation value by summing the absolute values of the differences between the pair of images, as shown in Equation (4).
- A0i and B0i represent the luminance of the i-th pixel of the first viewpoint image A0 and the second viewpoint image B0, respectively.
- Ni represents the number of pixels used in the calculation and is set appropriately according to the minimum calculation range of the image shift distribution.
- the viewpoint change processing unit 155 calculates, as the image shift amount, the k that minimizes COR (k) in Equation (4). That is, with the pair of images shifted by k pixels, the absolute value of the difference between each i-th pixel of A0 and of B0 in the row direction is taken, and these absolute values are summed over a plurality of pixels in the row direction. The k that makes this sum COR (k) smallest is regarded as the image shift amount between A0 and B0, in units of pixels.
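Equation (4) is a sum-of-absolute-differences (SAD) search. A minimal one-dimensional sketch, with an assumed search window:

```python
import numpy as np

def image_shift_sad(a0, b0, k_range=range(-4, 5)):
    """Sketch of Equation (4): sum of absolute differences between a pair of
    one-dimensional viewpoint signals; the k minimizing COR(k) is taken as
    the image shift amount. k_range is an assumed search window."""
    def cor(k):
        # Compare A0 shifted by k pixels against B0 over the overlapping range,
        # normalizing by the number of overlapping pixels.
        if k >= 0:
            return np.abs(a0[k:] - b0[:len(b0) - k]).sum() / (len(a0) - k)
        return np.abs(a0[:len(a0) + k] - b0[-k:]).sum() / (len(a0) + k)
    return min(k_range, key=cor)
```

The two-dimensional form of Equation (5) sums the same quantity over nj rows while shifting only in the pupil-division direction.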
- when the correlation is calculated over a two-dimensional region, it is defined by Equation (5).
- A0ij and B0ij represent the luminance of the i-th pixel in the j-th column of the first viewpoint image A0 and the second viewpoint image B0, respectively.
- ni represents the number of pixels used for the calculation
- nj represents the number in the column direction of the pair of images for which the correlation calculation is performed.
- the viewpoint change processing unit 155 calculates k that minimizes the COR (k) in the equation (5) as the image deviation amount in the same manner as the equation (4). Note that the subscript k is added only to i and is independent of j. This corresponds to performing the correlation calculation while moving the two-dimensional image only in the pupil division direction.
- the viewpoint change processing unit 155 can calculate the image shift amount of each region of the first viewpoint image A0 and the second viewpoint image B0 according to the equation (5), and calculate the image shift distribution.
- the sharpness processing described later for the refocus processing is performed only on high-contrast portions. Therefore, in the contrast distribution calculation described above, the correlation calculation of Equation (5) may be skipped for regions where the contrast distribution C (j, i) is 0 (that is, positions with luminance lower than the predetermined luminance Yc).
- the pupil intensity distribution exhibits gentle pupil division because of diffraction blur in the pupil division performed by the microlens formed for each pixel and the photoelectric conversion units into which each pixel is divided. In the plurality of viewpoint images corresponding to such gently divided pupil intensity distributions, the effective F value in the pupil division direction does not become sufficiently dark (large), so the effective depth of focus is difficult to deepen.
- therefore, the viewpoint change processing unit 155 performs, on the plurality of viewpoint images (first viewpoint image and second viewpoint image), a conversion that enlarges the difference between the viewpoint images for each pixel and emphasizes the parallax.
- the viewpoint change processing unit 155 generates a plurality of corrected viewpoint images (first corrected viewpoint image and second corrected viewpoint image) by the parallax enhancement processing.
- the viewpoint change processing unit 155 enlarges the difference between the first viewpoint image A0 (j, i) and the second viewpoint image B0 (j, i) according to Equation (6) and Equation (7), generating the first corrected viewpoint image A (j, i) and the second corrected viewpoint image B (j, i).
- k (0 ≤ k ≤ 1) and α (0 ≤ α ≤ 1) are real numbers.
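Since Equations (6) and (7) themselves are not reproduced in this text, the following is only a hypothetical sketch of a difference-enlarging conversion of the kind described (the real expressions also involve the parameter α):

```python
import numpy as np

def enhance_parallax(a0, b0, k=0.5):
    """Hypothetical sketch of the parallax enhancement of Equations (6)-(7),
    which are not reproduced here. A simple linear form is assumed: each
    corrected viewpoint image is pushed away from the other by k times
    their difference (0 <= k <= 1)."""
    d = k * (a0 - b0)
    a = a0 + d   # first corrected viewpoint image A
    b = b0 - d   # second corrected viewpoint image B
    return a, b
```

As in FIG. 10, pixels where the viewpoint images already agree are left unchanged, while pixels where they differ are pushed further apart, emphasizing the parallax.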
- FIG. 10 shows an example in which the difference between the viewpoint images is enlarged at a predetermined position by the parallax enhancement processing.
- examples of the first viewpoint image A0 (101) and the second viewpoint image B0 (102) before the parallax enhancement processing are indicated by broken lines, and examples of the first corrected viewpoint image A (103) and the second corrected viewpoint image B (104) after the parallax enhancement processing by Equation (6) and Equation (7) are indicated by solid lines.
- the horizontal axis indicates the 1152nd to 1156th pixels in units of sub-pixels, and the vertical axis indicates the magnitude of the parallax at each pixel.
- portions where the difference between the viewpoint images is small change little (for example, near the 1154th pixel), while portions where the difference is large are further enlarged (for example, near the 1153rd and 1155th pixels), so the parallax is emphasized.
- in this way, the viewpoint change processing unit 155 generates, from the plurality of viewpoint images, a plurality of corrected viewpoint images in which the differences between the viewpoint images are enlarged and the parallax is emphasized. The viewpoint change processing unit 155 can suppress the load of the parallax enhancement processing by calculating with the signals of the plurality of sub-pixels included in each pixel, as in Equation (6) and Equation (7).
- in Equation (6), when the value of k is increased to strengthen the parallax enhancement, the parallax between the plurality of corrected viewpoint images (first corrected viewpoint image and second corrected viewpoint image) increases. Therefore, by increasing k, the effective F value in the division direction can be darkened (increased) and the effective depth of focus in the division direction can be deepened. However, if the parallax enhancement is made excessively strong, the noise of the corrected viewpoint images increases and the S/N decreases.
- in the present embodiment, the strength of the parallax enhancement conversion is adjusted region-adaptively based on the contrast distribution C (j, i). For example, in high-contrast regions, the viewpoint change processing unit 155 strengthens the parallax enhancement to increase the parallax and darken (increase) the effective F value in the division direction. On the other hand, in low-contrast regions, the strength of the parallax enhancement is weakened to maintain the S/N and suppress its decrease.
- this increases the parallax between the plurality of corrected viewpoint images (first corrected viewpoint image and second corrected viewpoint image), darkens (increases) the effective F value in the division direction, and deepens the effective depth of focus in the division direction.
- by generating a refocus image using the plurality of corrected viewpoint images (first corrected viewpoint image and second corrected viewpoint image), the refocus effect (the change in the image due to refocusing) can be emphasized.
- if necessary, the viewpoint change processing unit 155 can also suppress the decrease in S/N by, for example, making the strength of the parallax enhancement greater in high-luminance regions of the captured image than in low-luminance regions. Similarly, the decrease in S/N can be suppressed by making the strength of the parallax enhancement greater in regions where the high-frequency component of the captured image is large than in regions where it is small.
- FIG. 11 schematically shows the first corrected viewpoint image Ai and the second corrected viewpoint image Bi consisting of the signals of the i-th pixels in the column direction of the image sensor 107 arranged on the imaging surface 800.
- the first corrected viewpoint image Ai contains the light reception signal of the light beam incident on the i-th pixel at the principal ray angle θa (corresponding to the pupil partial region 501 in FIG. 8A).
- the second corrected viewpoint image Bi contains the light reception signal of the light beam incident on the i-th pixel at the principal ray angle θb (corresponding to the pupil partial region 502 in FIG. 8A). That is, the first corrected viewpoint image Ai and the second corrected viewpoint image Bi have incident angle information in addition to light intensity distribution information.
- therefore, the refocus processing unit 156 can generate a refocus image on a given virtual imaging plane. Specifically, the refocus processing unit 156 translates the first corrected viewpoint image Ai along the angle θa and the second corrected viewpoint image Bi along the angle θb to the virtual imaging plane 810, and adds the translated corrected viewpoint images for each pixel to generate the refocus image on the virtual imaging plane 810. In the example of FIG. 11, translating the first corrected viewpoint image Ai along the angle θa to the virtual imaging plane 810 corresponds to shifting it by +0.5 pixels in the column direction.
- likewise, translating the second corrected viewpoint image Bi along the angle θb to the virtual imaging plane 810 corresponds to shifting it by -0.5 pixels in the column direction. That is, in the example of FIG. 11, the alignment of the first corrected viewpoint image Ai and the second corrected viewpoint image Bi on the virtual imaging plane 810 is obtained by relatively shifting the two by +1 pixel. Therefore, the refocus image on the virtual imaging plane 810 can be generated by adding the first corrected viewpoint image Ai and the shifted second corrected viewpoint image Bi+1 for each pixel.
- more generally, the refocus processing unit 156 shift-adds the first corrected viewpoint image A and the second corrected viewpoint image B with an integer shift amount s according to Equation (8), thereby generating the refocus image I (j, i; s) on the virtual imaging plane corresponding to each integer shift amount s.
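The shift-add of Equation (8) can be sketched as follows; the boundary handling (a circular roll) is an assumption:

```python
import numpy as np

def refocus(a, b, s):
    """Sketch of Equation (8): shift-add the corrected viewpoint images with
    integer shift amount s to form the refocus image on a virtual imaging
    plane, I(j, i; s) = A(j, i) + B(j, i + s). Circular boundary handling
    via np.roll is an assumption for this sketch."""
    return a + np.roll(b, -s, axis=1)
```

Choosing the s that aligns a subject's features in A and B brings that subject into focus on the corresponding virtual imaging plane.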
- if necessary, the refocus processing unit 156 may first apply demosaicing processing to the first corrected viewpoint image A and the second corrected viewpoint image B and perform the shift addition using the demosaiced corrected viewpoint images. The refocus processing unit 156 may also generate interpolated signals between the pixels of the first corrected viewpoint image A and the second corrected viewpoint image B and generate a refocus image corresponding to a non-integer shift amount. In this way, refocus images can be generated with the position of the virtual imaging plane changed at a finer granularity.
- the first corrected viewpoint image A and the second corrected viewpoint image B are shift-added to generate a refocus image on the virtual imaging plane. Since the images of the first corrected viewpoint image A and the second corrected viewpoint image B are shifted by shift addition, the relative shift amount (also referred to as image shift amount) with respect to the image before the refocus processing can be known.
- the integer shift amount s by the refocus processing described above corresponds to this image shift amount.
- the refocus processing unit 156 can realize the contour enhancement of the subject in the refocus image by performing the sharpness process on the region corresponding to the image shift amount s.
- unsharp mask processing applies a blurring filter to the local region (original signal) centered on the pixel of interest, and realizes contour enhancement by reflecting the difference between the pixel values before and after the blurring in the pixel value of the pixel of interest.
- the unsharp mask process for the pixel value P to be processed is calculated according to Equation (9).
- P ′ is a pixel value after application of the processing
- R is a radius of the blurring filter
- T is an application amount (%).
- F (i, j, R) is a pixel value obtained by applying a blurring filter having a radius R to the pixel P (i, j).
- a known method such as Gaussian blur can be used.
- Gaussian blur is a process of averaging by applying weighting according to a Gaussian distribution according to the distance from the pixel to be processed, and a natural processing result can be obtained.
- the size of the radius R of the blurring filter is related to the wavelength of the frequency on the image to which the sharpness processing is to be applied. That is, as R is smaller, a finer pattern is emphasized, and as R is larger, a gentler pattern is emphasized.
- the application amount T (i, j) changes the amount of contour enhancement applied by the unsharp mask processing according to the image shift distribution. Specifically, letting pred (i, j) be the image shift amount at the position of each pixel and s be the shift amount applied by the refocus processing, the application amount T is increased in regions where the difference between pred (i, j) and s is small (within a predetermined number of pixels), that is, regions that are in focus on the virtual imaging plane.
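A sketch of Equation (9), P' = P + (P - F) * T / 100, with the region-adaptive application amount T just described. A box blur stands in for the Gaussian blur F(i, j, R), and t_max and tol are assumed parameters:

```python
import numpy as np

def unsharp_mask(img, pred, s, radius=2, t_max=100.0, tol=1.0):
    """Equation (9) sketch: contour enhancement applied only where the image
    shift pred(i, j) is within tol pixels of the refocus shift s, i.e. the
    in-focus regions on the virtual imaging plane."""
    k = 2 * radius + 1
    pad = np.pad(img, radius, mode="edge")
    # Box blur as a stand-in for the Gaussian blur F(i, j, R).
    blur = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            blur += pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    blur /= k * k
    # Region-adaptive application amount T(i, j) in percent.
    t = np.where(np.abs(pred - s) <= tol, t_max, 0.0)
    return img + (img - blur) * t / 100.0
```

Out-of-focus regions (|pred - s| large) receive T = 0 and are passed through unchanged, preserving their blur.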
- the refocusable range represents a range of a focus position that can be changed by the refocus process.
- FIG. 13 schematically illustrates a refocusable range according to the present embodiment.
- let the allowable circle of confusion be δ and the aperture value of the imaging optical system be F.
- the depth of field at the aperture value F is ±Fδ.
- in contrast, the effective depth of field of each first corrected viewpoint image (second corrected viewpoint image) is ±NH·F·δ, which is NH times deeper, and the in-focus range widens NH times.
- the refocus processing unit 156 can readjust (refocus) the focus position after shooting by the refocus processing of translating the first corrected viewpoint image (second corrected viewpoint image) along the principal ray angle θa (θb) shown in FIG. 13.
- the defocus amount d from the imaging surface for which the focus position can be readjusted (refocused) after shooting is limited, and the refocusable range of the defocus amount d is approximately the range of Equation (10).
- here the allowable circle of confusion is given by δ = 2ΔX (ΔX is the pixel period, and 2ΔX is the reciprocal of the Nyquist frequency 1/(2ΔX)).
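A small numeric sketch of the bound described by Equation (10) under the stated δ = 2ΔX; the concrete values are illustrative assumptions:

```python
def refocusable_range(f_number, dx_um, nh=2):
    """Sketch of Equation (10): the refocusable defocus amount satisfies
    roughly |d| <= NH * F * delta, with allowable circle of confusion
    delta = 2 * dx (dx = pixel period). Units here are micrometers;
    the example values below are illustrative only."""
    delta = 2.0 * dx_um
    return nh * f_number * delta  # maximum |d| in micrometers

# e.g. F2.8 with 4 um pixels and NH = 2 horizontal pupil divisions
```

Stopping down (larger F) or coarser pixels (larger ΔX) both widen the refocusable range, consistent with the depth-of-field discussion above.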
- FIGS. 14A to 14C show the principle of the viewpoint movement processing.
- an image sensor 107 (not shown) is arranged on the imaging surface 600, and the exit pupil of the imaging optical system is divided into two, the pupil partial region 501 and the pupil partial region 502, as in FIG.
- FIG. 14A shows an example in which the blurred image Γ1 + Γ2 of the foreground subject q2 is photographed overlapping the in-focus image p1 of the main subject q1 (so-called foreground blur covering the main subject).
- FIGS. 14B and 14C show the example of FIG. 14A separated into the light beam passing through the pupil partial region 501 and the light beam passing through the pupil partial region 502 of the imaging optical system.
- the light beam from the main subject q1 passes through the pupil partial region 501 and forms an image p1 in a focused state.
- the light beam from the foreground subject q2 passes through the pupil partial region 501 and spreads into the blurred image Γ1 in the defocused state.
- Each light beam is received by the sub-pixel 201 of a different pixel of the image sensor 107, and a first viewpoint image is generated.
- in the first viewpoint image, the image p1 of the main subject q1 and the blurred image Γ1 of the foreground subject q2 are received without overlapping. This is because, in the predetermined region (near the image p1 of the subject q1), the first viewpoint image is the one among the plurality of viewpoint images (first viewpoint image and second viewpoint image) in which the closest subject (the blurred image Γ1 of the subject q2) is photographed in the narrowest range.
- in other words, the blurred image Γ1 of the subject q2 appears least in the first viewpoint image, so it is the viewpoint image with the highest contrast evaluation value.
- the light beam from the main subject q1 passes through the pupil partial region 502 and forms an image p1 in a focused state.
- the light beam from the foreground subject q2 passes through the pupil partial region 502 and spreads into the blurred image Γ2 in the defocused state.
- Each light beam is received by the sub-pixel 202 of each pixel of the image sensor 107, and a second viewpoint image is generated.
- in the second viewpoint image, the image p1 of the main subject q1 and the blurred image Γ2 of the foreground subject q2 are received overlapping each other.
- the viewpoint change processing unit 155 inputs the above-described first viewpoint image A (j, i) and second viewpoint image B (j, i).
- a table function T (j, i) corresponding to the predetermined region R and the boundary width σ of the region is calculated.
- the table function T (j, i) takes the value 1 inside the predetermined region R and 0 outside it, and changes almost continuously from 1 to 0 over the boundary width σ of the predetermined region R.
- the viewpoint change processing unit 155 may set the predetermined area to be circular or any other shape as necessary, and may set a plurality of predetermined areas and a plurality of boundary widths.
- the viewpoint change processing unit 155 calculates the first weight coefficient Wa (j, i) of the first viewpoint image A (j, i) from the real coefficient w (-1 ≤ w ≤ 1) according to Equation (12A). Similarly, the viewpoint change processing unit 155 calculates the second weight coefficient Wb (j, i) of the second viewpoint image B (j, i) according to Equation (12B).
- the viewpoint change processing unit 155 generates an output image I (j, i) according to Equation (13), using the first viewpoint image A (j, i), the second viewpoint image B (j, i), the first weighting coefficient Wa (j, i), and the second weighting coefficient Wb (j, i).
- the viewpoint change processing unit 155 may generate the output image Is (j, i) according to the equation (14A) or the equation (14B) in combination with the refocus processing using the shift amount s.
- the output image Is (j, i) generated in this way is an image whose viewpoint has been moved and whose focus position has been readjusted (refocused).
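Equations (12A), (12B), and (13) are not reproduced in this text, so the following sketch assumes a simple form consistent with the description: weights that deviate from 1 by ±w·T inside the predetermined region, so the viewpoint shifts only there:

```python
import numpy as np

def move_viewpoint(a, b, w, t):
    """Hypothetical sketch of Equations (12A), (12B), and (13), which are not
    reproduced here. Assumed form: Wa = 1 - w*T and Wb = 1 + w*T, with real
    coefficient -1 <= w <= 1 and table function T in [0, 1] (1 inside the
    predetermined region R, 0 outside). Output I = Wa*A + Wb*B."""
    wa = 1.0 - w * t   # first weighting coefficient Wa(j, i)
    wb = 1.0 + w * t   # second weighting coefficient Wb(j, i)
    return wa * a + wb * b
```

Outside the region (T = 0) the output reduces to the plain sum A + B, so the blur shape of the imaging optical system is unchanged there, as the text requires.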
- in this way, a plurality of viewpoint images are weighted and combined to generate the output image. That is, when reducing the foreground blur covering the main subject using Equation (13), the viewpoint change processing unit 155 sets the first weighting coefficient Wa of the first viewpoint image, in which the overlap between the image p1 and the blurred image Γ1 near p1 is small, larger than the second weighting coefficient Wb of the second viewpoint image, in which the overlap between the image p1 and the blurred image Γ2 is large, and generates the output image.
- in other words, in the predetermined region of the image, the viewpoint change processing unit 155 minimizes the weighting coefficient of the viewpoint image in which the closest subject is photographed in the widest range, or maximizes the weighting coefficient of the viewpoint image in which the closest subject is photographed in the narrowest range. Likewise, in the predetermined region of the output image, the viewpoint change processing unit 155 minimizes the weighting coefficient of the viewpoint image with the smallest contrast evaluation value or maximizes the weighting coefficient of the viewpoint image with the largest contrast evaluation value.
- in regions other than the predetermined region, where the viewpoint movement processing is not performed, the viewpoint change processing unit 155 may generate the output image by adding the first weighting coefficient and the second weighting coefficient substantially evenly so as not to change the blur shape of the imaging optical system. A method for generating an output image in which the weighting coefficients (that is, the addition ratio) are changed according to the user's designation will be described later; the user may also designate the predetermined region in which the viewpoint movement processing is performed.
- FIGS. 15A to 15C show the relationship between the pupil partial regions (501, 502) received by the sub-pixel 201 and the sub-pixel 202 of each pixel and the exit pupil 400 of the imaging optical system.
- FIG. 15A shows a case where the exit pupil distance Dl of the imaging optical system and the set pupil distance Ds of the image sensor 107 are the same.
- the exit pupil 400 of the imaging optical system is divided into pupils approximately equally by the pupil partial area 501 and the pupil partial area 502 in the same manner for both the central image height and the peripheral image height.
- FIG. 15B shows a case where the exit pupil distance Dl of the imaging optical system is shorter than the set pupil distance Ds of the image sensor 107.
- in this case, at the peripheral image height, the exit pupil 400 of the imaging optical system is divided non-uniformly by the pupil partial region 501 and the pupil partial region 502.
- at one peripheral image height, the effective aperture value of the first viewpoint image corresponding to the pupil partial region 501 becomes smaller (brighter) than the effective aperture value of the second viewpoint image corresponding to the pupil partial region 502; at the opposite image height, the effective aperture value of the first viewpoint image corresponding to the pupil partial region 501 conversely becomes larger (darker) than that of the second viewpoint image corresponding to the pupil partial region 502.
- FIG. 15C shows the case where the exit pupil distance Dl of the imaging optical system is longer than the set pupil distance Ds of the image sensor 107. Also in this case, at the peripheral image height, the exit pupil 400 of the imaging optical system is non-uniformly pupil-divided by the pupil partial area 501 and the pupil partial area 502.
- at one peripheral image height, the effective aperture value of the first viewpoint image corresponding to the pupil partial region 501 becomes larger (darker) than the effective aperture value of the second viewpoint image corresponding to the pupil partial region 502; at the opposite image height, it conversely becomes smaller (brighter) than that of the second viewpoint image corresponding to the pupil partial region 502.
- when the pupil division becomes non-uniform at the peripheral image height, the effective F values of the first viewpoint image and the second viewpoint image also become non-uniform. As a result, the blur of one of the first viewpoint image and the second viewpoint image spreads more, and the blur of the other spreads less.
- therefore, in the predetermined region of the output image, it is desirable that the viewpoint change processing unit 155, as necessary, minimize the weighting coefficient of the viewpoint image with the smallest effective aperture value or maximize the weighting coefficient of the viewpoint image with the largest effective aperture value. By such viewpoint movement processing, the foreground blur covering the main subject can be reduced.
- each viewpoint image is an image obtained through half of the original pupil region. Therefore, when the pupil region is divided into two in the horizontal direction, the aperture diameter in the horizontal direction is halved, so the depth of field in the horizontal direction is quadrupled.
- as a result, the first viewpoint image and the second viewpoint image each have, on average over the vertical and horizontal directions, a depth of field roughly twice that of the image (A + B image) obtained by combining the first viewpoint image and the second viewpoint image.
- the viewpoint change processing unit 155 can therefore generate an image with an enlarged depth of field by generating a composite image in which the addition ratio of the first viewpoint image and the second viewpoint image is changed from 1:1. Furthermore, by applying the above-described unsharp mask processing using the contrast distribution and the image shift distribution to the image with the changed addition ratio, the viewpoint change processing unit 155 can generate a composite image whose depth of field is effectively enlarged further.
- in step S101, the image sensor 107 performs imaging in accordance with an instruction from the control unit 121.
- the image sensor 107 outputs parallax image data. Specifically, the image sensor 107 outputs the above-described viewpoint images (A + B image and A image) as image data of one file format.
- the recording medium 133 temporarily stores image data output from the image sensor 107.
- the image processing unit 125 reads the parallax image data in accordance with an instruction from the control unit 121. For example, the image processing unit 125 acquires the image data stored in the recording medium 133 using the image acquisition unit 151. At this time, the image processing unit 125 generates a B image from the A + B image and the A image, obtaining a first viewpoint image (A image), which is the image of the left viewpoint, and a second viewpoint image (B image), which is the image of the right viewpoint.
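The extraction of the two viewpoint images from the recorded A + B image and A image reduces to a subtraction:

```python
import numpy as np

def split_viewpoints(a_plus_b, a):
    """Sketch of the viewpoint extraction described in the text: the file
    stores the combined A+B image and the A image; the B (right-viewpoint)
    image is recovered by subtraction."""
    b = a_plus_b - a
    return a, b
```

Storing A+B and A (rather than A and B) lets the A+B image double as the ordinary captured image while still allowing both viewpoints to be recovered.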
- the control unit 121 controls the operation unit 132 and the output of the image processing unit 125 to perform viewpoint image operation processing described later, that is, viewpoint movement and focus adjustment for the captured image.
- when the viewpoint image operation processing is completed, the control unit 121 ends the series of processes.
- control unit 121 causes the display unit 131 to display a user interface (hereinafter simply referred to as UI) including a viewpoint movement UI and a focus adjustment UI, and a captured image.
- the control unit 121 determines whether or not to perform viewpoint movement based on the user operation input via the operation unit 132.
- if the input user operation indicates that viewpoint movement is to be performed, the control unit 121 determines that viewpoint movement is performed and advances the process to S203.
- otherwise, the control unit 121 determines that viewpoint movement is not performed and advances the process to S207.
- the control unit 121 further acquires a user operation for operating the viewpoint movement UI via the operation unit 132.
- FIG. 19A shows an example of the viewpoint movement UI displayed on the display unit 131.
- an image (a captured image or a viewpoint image) is displayed in a part of the area 1000 that forms the UI.
- the viewpoint image is generated using only the left and right viewpoint images.
- the viewpoint movement UI arranges the slider 1001 and the slider bar 1002 in the horizontal direction so that the user can operate the operation member in the direction in which the viewpoint changes. As a result, the user can more intuitively operate the viewpoint movement.
- in S204, the control unit 121 uses the image processing unit 125 to generate a composite image in which the addition ratio of the viewpoint images is changed. Specifically, the image processing unit 125 acquires the position of the slider 1001 designated in S203 via the operation information acquisition unit 154, then changes the addition ratio of the first viewpoint image and the second viewpoint image according to the position of the slider 1001 and combines them (that is, performs viewpoint movement processing), generating an image whose viewpoint has been moved.
- when the value at the right end of the slider bar 1002 is defined as 1, the value at the center as 0, and the value at the left end as -1, the image processing unit 125 changes the ratio between the first viewpoint image and the second viewpoint image to (1 + x):(1 - x) when the slider 1001 is at position x.
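The slider-to-ratio mapping of S204 can be sketched directly:

```python
def slider_to_ratio(x):
    """Sketch of the S204 mapping: slider position x in [-1, 1] (right end 1,
    center 0, left end -1) gives the addition ratio (1 + x):(1 - x) between
    the first and second viewpoint images, returned here as normalized
    weights summing to 1."""
    return (1.0 + x) / 2.0, (1.0 - x) / 2.0
```

At the center (x = 0) both viewpoint images contribute equally, reproducing the ordinary A + B image; the extremes select one viewpoint image alone.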
- the control unit 121 uses the image processing unit 125 to apply development processing to the image synthesized in S204.
- the development processing will be described later with reference to the flowchart of FIG.
- the control unit 121 displays, on the display unit 131, the image to which the development processing was applied in S205.
- the control unit 121 determines whether or not to perform focus adjustment based on a user operation input via the operation unit 132. If the input user operation indicates that focus adjustment is to be performed, the control unit 121 determines that the focus adjustment is to be performed, and advances the processing to S208. On the other hand, if the input user operation does not indicate that the focus adjustment is to be performed, the control unit 121 determines that the focus adjustment is not performed and ends the series of processes.
- FIG. 19A shows an example of the focus adjustment UI.
- a slider bar is set in the direction in which the viewpoint is moved.
- the focus adjustment UI is installed in a direction different from the viewpoint movement (with a different angle).
- the slider bar 1003 and the slider 1004 of the focus adjustment UI are set in a direction orthogonal to the direction of the slider bar 1002 that is the viewpoint movement UI (that is, the vertical direction).
- the control unit 121 controls focus adjustment so that the back-focus state becomes stronger when the slider 1004 is moved upward, and the front-focus state becomes stronger when the slider 1004 is moved downward.
- the focus adjustment range corresponds to the refocusable range described above, and is calculated according to Equation (10).
- step S209 the control unit 121 calculates the focus adjustment position based on the slider position designated in step S208 using the image processing unit 125, and performs the above-described refocus processing.
- the image processing unit 125 determines the defocus amount (or shift amount) within the refocusable range based on the position of the slider 1004 on the slider bar 1003.
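The mapping from slider position to defocus amount can be sketched as follows. Expression (10) is not reproduced in this text, so the bound |d| ≤ NH·F·δ used below (NH: horizontal pupil-division count, F: aperture value, δ: allowable circle of confusion) and the linear interpolation over the slider range are assumptions for illustration:

```python
def slider_to_defocus(slider_pos, n_h, f_number, delta):
    """Map a focus-slider position in [-1, 1] to a defocus amount d.

    Assumes the refocusable range of Expression (10) is approximately
    |d| <= n_h * f_number * delta, and maps the slider linearly over
    that range (an illustrative choice, not the patent's formula).
    """
    if not -1.0 <= slider_pos <= 1.0:
        raise ValueError("slider position must lie in [-1, 1]")
    d_max = n_h * f_number * delta
    return slider_pos * d_max

# Example: N_H = 2, F = 2.8, delta = 2 * (4.3 um pixel period).
d = slider_to_defocus(0.5, 2, 2.8, 2 * 4.3e-6)
```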
- in step S210, the control unit 121 performs development processing using the image processing unit 125. Then, in step S211, the control unit 121 displays the developed image on the display unit 131, ends the series of operations related to the parallax image operation processing, and returns the processing to the caller.
- in step S301, the image processing unit 125 performs white balance processing by applying a gain to each of the R, G, and B colors so that R, G, and B in a white region become equal.
- in step S302, the image processing unit 125 performs demosaicing processing. Specifically, the image processing unit 125 interpolates the input image in each prescribed direction and then performs direction selection, thereby generating, for each pixel, a color image signal of the three primary colors R, G, and B as the interpolation result.
- the image processing unit 125 performs gamma processing.
- the image processing unit 125 performs various color adjustment processes such as noise reduction, saturation enhancement, hue correction, and edge enhancement to improve the appearance of the image.
- the image processing unit 125 performs compression processing on the color image signal color-adjusted in step S304 using a predetermined method such as JPEG, and outputs compressed image data.
- the control unit 121 records the image data output from the image processing unit 125 on the recording medium 133, ends a series of operations related to the development processing, and returns the processing to the caller.
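The development steps S301 to S305 can be sketched as a short chain of array operations. The gain values and gamma exponent below are placeholder assumptions, and the demosaicing step is elided (the input is assumed to be an already three-channel image); this is an illustration of the pipeline order, not the patent's implementation:

```python
import numpy as np

def develop(rgb, wb_gains=(2.0, 1.0, 1.5), gamma=1.0 / 2.2):
    """Toy development pipeline: white balance -> gamma -> 8-bit encode.

    rgb: float array of shape (H, W, 3) with values in [0, 1].
    wb_gains: per-channel gains chosen so that a white patch comes out
    with equal R, G, B (the values here are arbitrary placeholders).
    """
    out = np.asarray(rgb, dtype=np.float64) * np.asarray(wb_gains)
    out = np.clip(out, 0.0, 1.0) ** gamma          # gamma processing
    return (out * 255.0 + 0.5).astype(np.uint8)    # quantize for encoding

# A "white" patch as recorded by the sensor (per-channel imbalance that
# the white balance gains above exactly cancel):
patch = np.full((1, 1, 3), 1.0) * np.array([0.5, 1.0, 2.0 / 3.0])
developed = develop(patch)
```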
- FIG. 19B shows an example in which a composite image that has been subjected to the refocus processing by operating the focus adjustment UI is displayed.
- in response to a user operation that moves the slider 1004 downward, the control unit 121 performs focus adjustment (refocus processing) using the image processing unit 125 so that the front-focus state becomes stronger, and displays the output image.
- FIG. 19C shows an example in which the slider 1001 of the viewpoint movement UI is further moved to the right with respect to FIG. 19B.
- the control unit 121 displays an output image in which the depth of field is expanded by performing viewpoint movement processing (using the image processing unit 125).
- FIG. 19D shows an example in which the viewpoint movement process is performed by moving the slider 1001 of the viewpoint movement UI to the left with respect to FIG. 19B to display a composite image in which the depth of field is expanded.
- FIG. 20 shows an example in which a UI that can change the degree of enhancement in the above-described parallax enhancement processing and sharpness processing is further added.
- the control unit 121 arranges a slider 1005 and a slider bar 1006 that can change the degree of emphasis.
- the operation on the slider 1005 changes the parameter corresponding to the variable k in the above-described parallax enhancement processing or the application amount T in the sharpness processing, and changes the degree of enhancement in the displayed composite image.
- the viewpoint movement UI is arranged in the horizontal direction in order to perform the viewpoint movement processing in the horizontal direction based on the signal obtained by the image sensor in which each pixel is divided in the horizontal direction.
- the direction in which the viewpoint movement UI is disposed may be aligned with the dividing direction (for example, the vertical direction).
- the focus adjustment UI may be arranged in a direction different from the viewpoint movement UI so that the two UIs are more clearly distinguished, or it may be kept in the vertical direction so that the focus position can be operated more intuitively.
- an operation for moving the viewpoint and an operation for adjusting the focus position are received, and a composite image corresponding to the operations is generated and displayed.
- the user can perform viewpoint movement, depth of field expansion, and focus position adjustment (refocus) in parallel.
- an operation for displaying a viewpoint-changed image and an operation for displaying an image with a changed focus position can be performed in parallel.
- the viewpoint movement UI is arranged to be operable in the horizontal direction. In this way, the direction in which the viewpoint can be moved matches the direction in which the user can operate, so that the user can operate more intuitively.
- Embodiment 2: Next, Embodiment 2 will be described.
- the configuration of the digital camera 100 of the present embodiment is the same as that of the first embodiment, and a part of the parallax image operation processing is different. For this reason, the same reference numerals are assigned to the same components, and redundant descriptions are omitted, and differences will be mainly described.
- in step S401, the control unit 121 determines whether to display the image data in the vertical position. For example, the control unit 121 refers to the metadata of the input image to determine whether the image was captured in the vertical position. If the control unit 121 determines that the image was captured in the vertical position, it advances the processing to S402 to perform vertical position display. On the other hand, if it determines that the image was not captured in the vertical position, the processing proceeds to S403 to perform horizontal position display.
- the determination in S401 may instead select vertical position display when the user has set it via a button or the like of the operation unit 132, or information indicating the pixel division direction of the image sensor may be acquired from the metadata and the display orientation determined according to that division direction.
- control unit 121 displays the image in the vertical position, displays the viewpoint movement UI so as to be changeable in the vertical direction, and displays the focus adjustment UI so as to be changeable in the horizontal direction.
- the control unit 121 displays the image in the horizontal position, displays the viewpoint movement UI so as to be changeable in the horizontal direction, and displays the focus adjustment UI so as to be changeable in the vertical direction.
- the control unit 121 performs the processing related to S202 to S211 in the same manner as in the first embodiment, and returns the processing to the caller.
- the viewpoint movement UI and the focus adjustment UI are dynamically switched according to whether the input image is the vertical position display or the horizontal position display. In this way, the user can perform an operation according to a direction in which the viewpoint of the captured image can be moved even when there are captured images with different display directions.
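The orientation-dependent switching described above can be sketched as a simple decision function. The return values and their dictionary form are an illustrative convention assumed here, not the patent's API:

```python
def choose_ui_layout(is_portrait):
    """Pick UI directions per Embodiment 2's S401-S403 decision.

    Vertical (portrait) display: viewpoint movement UI changeable in
    the vertical direction, focus adjustment UI in the horizontal
    direction; horizontal (landscape) display: the reverse.
    """
    if is_portrait:
        return {"viewpoint_ui": "vertical", "focus_ui": "horizontal"}
    return {"viewpoint_ui": "horizontal", "focus_ui": "vertical"}

layout = choose_ui_layout(is_portrait=True)
```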
- Embodiment 3: Next, Embodiment 3 will be described.
- the third embodiment is different in that an image sensor in which each pixel is divided into two in the horizontal direction and the vertical direction is used. For this reason, the configuration of the digital camera 100 other than this point is the same as that of the first embodiment. Accordingly, the same components are denoted by the same reference numerals and redundant description is omitted, and differences will be mainly described.
- FIG. 22 shows an array of pixels in a range of 4 columns ⁇ 4 rows and an array of subpixels in a range of 8 columns ⁇ 8 rows for the image sensor 107 of the present embodiment.
- each pixel includes sub-pixels 221 to 224 arranged in 2 columns ⁇ 2 rows.
- the image sensor 107 has a large number of 4 columns ⁇ 4 rows of pixels (8 columns ⁇ 8 rows of sub-pixels) shown in FIG. 22 arranged on the surface so that a captured image (sub-pixel signal) can be acquired.
- FIG. 23A shows a plan view of one pixel 200G shown in FIG. 22 viewed from the light receiving surface side (+z side) of the image sensor 107, and FIG. 23B shows a cross-sectional view of the a-a cross section of FIG. 23A viewed from the −y side.
- photoelectric conversion units 2301 to 2304, which are divided into NH (two) parts in the x direction and NV (two) parts in the y direction, are formed.
- the photoelectric conversion units 2301 to 2304 correspond to the subpixels 221 to 224, respectively.
- the first viewpoint image is generated by collecting the light reception signals of the sub-pixels 221 of the respective pixels. Similarly, the second, third, and fourth viewpoint images are generated by collecting the light reception signals of the sub-pixels 222, 223, and 224 of the respective pixels, respectively.
- the first to fourth viewpoint images are Bayer-array images, and demosaicing processing may be applied to the first to fourth viewpoint images as necessary.
- the viewpoint change processing unit 155 performs contrast processing as in the first embodiment. That is, the luminance Y (j, i) is calculated according to the equation (1) for the captured image I (j, i) with the Bayer array. In addition, the viewpoint change processing unit 155 calculates a high frequency component dY (j, i), a high frequency component dZ (j, i), and a contrast distribution C (j, i).
- the viewpoint change processing unit 155 performs parallax enhancement processing of the viewpoint image.
- the viewpoint change processing unit 155 performs a conversion for parallax enhancement that enlarges the differences among the first viewpoint image A0(j, i) to the fourth viewpoint image D0(j, i) according to Expressions (15) and (16).
- by this processing, the viewpoint change processing unit 155 generates the first corrected viewpoint image A(j, i) through the fourth corrected viewpoint image D(j, i).
- the refocus processing unit 156 performs refocus processing using the corrected viewpoint images output by the viewpoint change processing unit 155. Specifically, the refocus processing unit 156 shifts and adds the first corrected viewpoint image A through the fourth corrected viewpoint image D according to Expression (17), with s as an integer shift amount.
- the refocus image I (j, i; s) is generated while maintaining the Bayer array.
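The shift addition can be sketched as follows. The exact shift pattern of Expression (17) is not reproduced in this text; the version below shifts each corrected viewpoint image according to its pupil position (B horizontally, C vertically, D both), which is one plausible reading and should be treated as an assumption:

```python
import numpy as np

def refocus(a, b, c, d, s):
    """Shift-add four corrected viewpoint images by integer shift s.

    a..d: 2-D arrays (first to fourth corrected viewpoint images).
    The shift pattern (b shifted horizontally, c vertically, d both)
    is an illustrative stand-in for Expression (17); np.roll wraps at
    the borders, which a real implementation would handle explicitly.
    """
    b_s = np.roll(b, s, axis=1)
    c_s = np.roll(c, s, axis=0)
    d_s = np.roll(d, s, axis=(0, 1))
    return a + b_s + c_s + d_s

img = np.arange(16.0).reshape(4, 4)
in_focus = refocus(img, img, img, img, 0)   # s = 0: plain A+B+C+D
```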
- the image processing unit 125 performs a demosaicing process on the generated refocus image I (j, i; s).
- alternatively, the demosaicing process may be applied to the first through fourth corrected viewpoint images, and the refocus processing unit 156 may generate a refocus image by shift-adding the demosaiced first through fourth corrected viewpoint images. Further, the refocus processing unit 156 may generate interpolated signals between the pixels of the first through fourth corrected viewpoint images as necessary, and generate a refocus image corresponding to a non-integer shift amount.
- the viewpoint change processing unit 155 shifts the two-dimensional image by k pixels only in the vertical pupil division direction and takes the difference between corresponding pixels of the first viewpoint image A0 and the third viewpoint image C0; the correlation calculation equation, accumulated over a plurality of rows, is defined by Expression (18).
- A0ij and C0ij represent the luminance of the i-th pixel in the j-th column of the first viewpoint image A0 and the third viewpoint image C0, respectively.
- ni represents the number of pixels used in the calculation, and nj represents the number, in the column direction, of the pair of images subjected to the correlation calculation.
- the viewpoint change processing unit 155 calculates the k that minimizes COR′(k) shown in Expression (18) as the image shift amount. Note that the subscript k is added only to j and is independent of i; this corresponds to performing the correlation calculation while moving the two-dimensional image only in the vertical pupil division direction. As described above, the viewpoint change processing unit 155 can generate an image shift distribution by calculating the image shift amount of each region of the first viewpoint image A0 and the third viewpoint image C0. Although A0 and C0 are used in this embodiment, the correlation calculation may be performed using B0 and D0, or using a signal obtained by adding A0 and B0 and a signal obtained by adding C0 and D0.
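The minimization of COR′(k) amounts to a sum of absolute differences accumulated while sliding one image vertically. A small sketch, in which the search range and border handling are assumptions:

```python
import numpy as np

def image_shift(a0, c0, k_max=3):
    """Estimate the vertical image shift between two viewpoint images.

    Returns the k in [-k_max, k_max] minimizing the mean absolute
    difference between a0 shifted by k rows and c0, a stand-in for
    minimizing COR'(k) of Expression (18) over the valid overlap.
    """
    best_k, best_cor = 0, np.inf
    rows = a0.shape[0]
    for k in range(-k_max, k_max + 1):
        lo, hi = max(0, -k), min(rows, rows - k)
        cor = np.abs(a0[lo + k:hi + k] - c0[lo:hi]).mean()
        if cor < best_cor:
            best_k, best_cor = k, cor
    return best_k

# A row pattern displaced by 2 rows is recovered as k = 2.
base = np.zeros((8, 4)); base[1] = 1.0
shifted = np.zeros((8, 4)); shifted[3] = 1.0
```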
- the viewpoint change processing unit 155 calculates the weight coefficient of each viewpoint image according to Expressions (19A) to (19D), with the real coefficient w (−1 ≤ w ≤ 1).
- Wa (j, i) is the first weighting factor of the first viewpoint image A (j, i)
- Wb (j, i) is the second weighting factor of the second viewpoint image B (j, i)
- Wc (j, i) is the third weighting coefficient of the third viewpoint image C (j, i)
- Wd (j, i) is the fourth weighting coefficient of the fourth viewpoint image D (j, i).
- the viewpoint change processing unit 155 generates an output image I (j, i) according to the equation (20) from the weighting coefficient corresponding to each viewpoint image.
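Expression (20) combines the four viewpoint images using the per-image weights. The exact form of Expressions (19A) to (19D) is not reproduced in this text; the bilinear weights below, driven by two coefficients wx and wy in [−1, 1], are an illustrative assumption chosen so that the weights always sum to 4 and the center position reproduces the plain sum:

```python
import numpy as np

def combine_viewpoints(a, b, c, d, wx, wy):
    """Weighted combination of four viewpoint images (cf. Expression (20)).

    Bilinear weights from (wx, wy) in [-1, 1]^2 are an illustrative
    stand-in for Expressions (19A)-(19D); they sum to 4, so
    wx = wy = 0 reproduces the plain sum A + B + C + D.
    """
    wa = (1 + wx) * (1 + wy)
    wb = (1 - wx) * (1 + wy)
    wc = (1 + wx) * (1 - wy)
    wd = (1 - wx) * (1 - wy)
    return wa * a + wb * b + wc * c + wd * d

a = np.full((2, 2), 1.0)
b = np.full((2, 2), 2.0)
c = np.full((2, 2), 3.0)
d = np.full((2, 2), 4.0)
center = combine_viewpoints(a, b, c, d, 0.0, 0.0)   # equals a+b+c+d
```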
- the configuration of the viewpoint movement UI and the focus adjustment UI according to the present embodiment will be described with reference to FIG.
- since the pupil is divided in two directions, horizontal and vertical, the user can move the viewpoint in both the vertical direction and the horizontal direction.
- two-axis sliders and slider bars are provided for the user to operate in each direction.
- a horizontal slider bar 3001 and a slider 3002 are arranged for horizontal viewpoint movement, and a vertical slider bar 4001 and a slider 4002 are arranged for vertical viewpoint movement.
- for the focus adjustment UI, the slider bar 5001 and the slider 5002 are arranged in a direction different from the viewpoint movement UI.
- the focus adjustment UI is arranged so as to pass through the intersection of the viewpoint movement UIs arranged in a cross shape, but may be arranged at other positions. In this way, by moving the sliders in the two directions of viewpoint movement, the weighting coefficients of the first to fourth viewpoint images can be changed, and images with different viewpoints can be generated.
- the viewpoint movement UI and the focus adjustment UI can be operated in parallel (simultaneously).
- an operation for moving the viewpoint in two dimensions (the horizontal and vertical directions) and an operation on the focus position are received, and a composite image corresponding to the operations is generated and displayed.
- it becomes possible for the user to perform two-dimensional viewpoint movement and focus position adjustment (refocus) in parallel.
- Embodiment 4: Next, Embodiment 4 will be described.
- in the present embodiment, the notation of the UI for operating the viewpoint image indicates, for example, the direction in which the viewpoint of the viewpoint image changes according to the operation.
- the configuration of the digital camera 100 of the present embodiment is the same as that of the first embodiment, and a part of the parallax image operation processing is different. For this reason, the same reference numerals are assigned to the same components, and redundant descriptions are omitted, and differences will be mainly described.
- the control unit 121 determines whether to display the input image in the vertical position, in order to match the notation of the UI for operating the parallax image to the vertical position display of the image. For example, the control unit 121 refers to the metadata of the input image to determine whether it is an image captured in the vertical position. If the control unit 121 determines that the input image is to be displayed in the vertical position, it advances the processing to S502 in order to use the UI notation for the vertical position.
- step S502 the control unit 121 further determines the rotation angle of the image.
- the control unit 121 determines the rotation angle (for example, the vertical position rotated 90 degrees to the right or rotated 90 degrees to the left) with reference to, for example, the metadata of the input image. If the control unit 121 determines that the input image was captured with a 90-degree rightward rotation, it advances the processing to S504 in order to display it rotated 90 degrees to the right on the display unit 131; otherwise, it advances the processing to S505 in order to display it rotated 90 degrees to the left.
- the determinations in S501 and S502 may treat the image as the vertical position when the user has set vertical position display via a button or the like of the operation unit 132, or information indicating the pixel division direction of the image sensor may be acquired and the UI notation for operating the parallax image determined according to that division direction.
- step S503 the control unit 121 displays the image in the horizontal position, and displays the UI for performing the viewpoint movement operation as the horizontal position (left and right) (FIG. 26).
- step S504 the control unit 121 displays the image at the vertical position rotated 90 degrees rightward, and displays the UI notation for performing the viewpoint movement operation with the left side of the slider on the upper side and the right side on the lower side (FIG. 27).
- in step S505, the control unit 121 displays the image at the vertical position rotated 90 degrees leftward, and displays the UI notation for performing the viewpoint movement operation with the left side of the slider at the bottom and the right side at the top (FIG. 28).
- in this way, the control unit 121 can switch the notation of the UI for performing the viewpoint movement operation in accordance with the direction of the viewpoint movement UI, and can also switch the notation in accordance with the rotation angle even for the same vertical position.
- the control unit 121 thereafter performs the processes according to S202 to S211 shown in FIG. 25B in the same manner as in the first embodiment, and returns the process to the caller.
- the notation of the viewpoint movement UI is dynamically switched according to whether the input image is the vertical position display or the horizontal position display and the rotation angle. In this way, the user can perform an operation according to a direction in which the viewpoint of the captured image can be moved even when there are captured images with different display directions.
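The rotation-dependent notation switching of S503 to S505 can be sketched as a decision function. The returned strings are an illustrative convention assumed here, not the patent's API:

```python
def viewpoint_ui_notation(is_portrait, rotation_deg=0):
    """Pick how the viewpoint-movement slider labels are drawn.

    Mirrors S503-S505: landscape display uses left/right labels; a
    portrait image rotated 90 degrees to the right puts the slider's
    "left" label at the top, and a 90-degree left rotation puts it
    at the bottom.
    """
    if not is_portrait:
        return "left-right"
    if rotation_deg == 90:      # rotated 90 degrees to the right
        return "left-on-top"
    if rotation_deg == -90:     # rotated 90 degrees to the left
        return "left-on-bottom"
    raise ValueError("unsupported rotation for portrait display")
```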
- the present invention can also be realized by processing in which a program that realizes one or more functions of the above-described embodiments is supplied to a system or apparatus via a network or a storage medium, and one or more processors in a computer of the system or apparatus read and execute the program. It can also be realized by a circuit (for example, an ASIC) that realizes one or more functions.
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Computing Systems (AREA)
- Theoretical Computer Science (AREA)
- Studio Devices (AREA)
- Automatic Focus Adjustment (AREA)
Abstract
Description
Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the drawings. In the following, an example in which the present invention is applied to an arbitrary digital camera capable of acquiring LF data is described as an example of an image processing apparatus. However, the present invention is not limited to a digital camera and can be applied to any device capable of processing acquired LF data. Such devices may include, for example, mobile phones, game machines, tablet terminals, personal computers, watch-type or eyeglass-type information terminals, surveillance systems, in-vehicle systems, medical systems such as endoscopes, and robots capable of providing images. A configuration may also be included in which an arbitrary device transmits LF data and operation contents to a server device (including a virtual machine) having processing means such as a processor on the Internet or a local network, and part or all of the processing on the LF data is executed on the server device; in this case, the device may receive the processing result from the server device and display it.
(Embodiment 1)
(Overall configuration of digital camera 100)
FIG. 1 is a block diagram showing an example of the functional configuration of a digital camera 100 as an example of the image processing apparatus of the present embodiment. One or more of the functional blocks shown in FIG. 1 may be realized by hardware such as an ASIC or a programmable logic array (PLA), or by a programmable processor such as a CPU or MPU executing software; they may also be realized by a combination of software and hardware. Therefore, in the following description, even where different functional blocks are described as the subjects of operations, they may be realized by the same hardware.
(Configuration of the image processing unit 125)
Next, the configuration of the image processing unit 125 will be described with reference to FIG. 2. The image acquisition unit 151 stores image data read from the recording medium 133. The image data consists of an image obtained by combining a first viewpoint image and a second viewpoint image, described later (also referred to as an A+B image), and the first viewpoint image.
(Configuration of the image sensor 107)
The arrangement of the pixels and sub-pixels of the image sensor 107 according to the present embodiment will be described with reference to FIG. 3. FIG. 3 shows the two-dimensionally arranged pixel array in a range of 4 columns × 4 rows, and also shows the sub-pixel array contained in each pixel in a range of 8 columns × 4 rows.
(Relationship between the pixel structure of the image sensor 107 and pupil division)
Next, the correspondence between the pixel structure of the image sensor 107 shown in FIGS. 4A and 4B and pupil division will be described with reference to FIG. 5. FIG. 5 shows the correspondence between a cross-sectional view of the a-a cross section of the pixel 200G shown in FIG. 4A viewed from the +y side and the exit pupil plane of the imaging optical system. Note that, in FIG. 5, the x and y axes of the cross-sectional view of the pixel 200G are inverted with respect to FIGS. 4A and 4B so as to correspond to the coordinate axes of the exit pupil plane.
(Relationship between defocus amount and image shift amount between parallax images)
Next, the relationship between the defocus amount and the image shift amount between the first viewpoint image and the second viewpoint image output from the image sensor 107 will be described. FIG. 8B schematically shows the relationship between the defocus amount of the first and second viewpoint images and the image shift amount between them. The image sensor 107 (not shown in FIG. 8B) is arranged on the imaging surface 800, and, as in FIGS. 5, 8A, and 8B, the exit pupil of the imaging optical system is divided into two regions, the pupil partial region 501 and the pupil partial region 502.
(Viewpoint image correction and refocus)
Next, the viewpoint image correction processing and refocus processing according to the present embodiment will be described. In the refocus processing of the present embodiment, as a first stage, the viewpoint change processing unit 155 calculates a contrast distribution representing the level of contrast based on each pixel value of the captured image. As a second stage, based on the calculated contrast distribution, the viewpoint change processing unit 155 performs, for each pixel, a conversion that enlarges the differences between the plurality of viewpoint images (the first and second viewpoint images) to enhance the parallax, and generates a plurality of corrected viewpoint images (a first corrected viewpoint image and a second corrected viewpoint image). As a third stage, the refocus processing unit 156 relatively shifts and adds the plurality of corrected viewpoint images to generate a refocus image.
(First stage: calculation of contrast distribution)
The viewpoint change processing unit 155 calculates the luminance Y(j, i) of the Bayer-array captured image I(j, i) according to Expression (1), matching the color centroids of R, G, and B at each position (j, i).
(Second stage: parallax enhancement processing of parallax images)
Next, the parallax enhancement processing of the parallax images will be described. In the parallax enhancement processing, the image shift distribution of the viewpoint images is first calculated. The image shift distribution is obtained by performing a correlation operation on the pair of images consisting of the first viewpoint image A0 and the second viewpoint image B0 and calculating the relative positional shift amount of the pair. Various known methods exist for the correlation operation; for example, the viewpoint change processing unit 155 can obtain the correlation value of the images by adding the absolute values of the differences of the pair of images as shown in Expression (4).
(Third stage: refocus processing)
Refocus processing in the pupil division direction (the column direction, or horizontal direction) using the plurality of corrected viewpoint images (the first and second corrected viewpoint images) will be described with reference to FIG. 11. FIG. 11 schematically shows a first corrected viewpoint image Ai and a second corrected viewpoint image Bi containing the signal of the i-th pixel in the column direction of the image sensor 107 arranged on the imaging surface 800. The first corrected viewpoint image Ai contains the light reception signal of the light flux incident on the i-th pixel at the chief ray angle θa (corresponding to the pupil partial region 501 in FIG. 8A). The second corrected viewpoint image Bi contains the light reception signal of the light flux incident on the i-th pixel at the chief ray angle θb (corresponding to the pupil partial region 502 in FIG. 8A). That is, the first corrected viewpoint image Ai and the second corrected viewpoint image Bi have incident angle information in addition to light intensity distribution information.
(Sharpness processing)
As described above, in the refocus processing, the first corrected viewpoint image A and the second corrected viewpoint image B are shift-added to generate a refocus image on the virtual imaging plane. Since the shift addition shifts the images of the first corrected viewpoint image A and the second corrected viewpoint image B, the shift amount relative to the image before the refocus processing (also referred to as the image shift amount) is known.
(Calculation of refocusable range)
The refocusable range represents the range of focus positions that can be changed by the refocus processing. For example, FIG. 13 schematically shows the refocusable range according to the present embodiment. Let δ be the allowable circle of confusion and F the aperture value of the imaging optical system; the depth of field at the aperture value F is then ±Fδ. In contrast, the effective horizontal aperture value F01 (F02) of the pupil partial region 501 (502), narrowed by the NH × NV (2 × 1) division, becomes darker: F01 = NH·F. The effective depth of field of each first corrected viewpoint image (second corrected viewpoint image) becomes ±NH·Fδ, NH times deeper, and the in-focus range widens NH times. That is, within the effective depth of field ±NH·Fδ, a subject image in focus is acquired in each first corrected viewpoint image (second corrected viewpoint image). Therefore, the refocus processing unit 156 can readjust (refocus) the focus position after shooting by refocus processing that translates the first corrected viewpoint image (second corrected viewpoint image) along the chief ray angle θa (θb) shown in FIG. 11. In other words, the defocus amount d from the imaging surface that can be refocused after shooting is limited, and the refocusable range of the defocus amount d is approximately the range of Expression (10). The allowable circle of confusion δ is defined by, for example, δ = 2ΔX (the reciprocal of the Nyquist frequency 1/(2ΔX) of the pixel period ΔX).
(Viewpoint movement processing)
Next, the viewpoint movement processing according to the present embodiment, executed by the viewpoint change processing unit 155, will be described. The viewpoint movement processing is processing for reducing the blur caused by a non-main subject on the near side when that blur overlaps the main subject.
(Viewpoint movement processing for pupil misalignment)
Next, pupil shift at the peripheral image height of the image sensor 107 will be described. FIGS. 15A to 15C show the relationship between the pupil partial regions (501, 502) from which the sub-pixel 201 and the sub-pixel 202 of each pixel respectively receive light and the exit pupil 400 of the imaging optical system.
(Depth of field expansion process)
Next, the depth-of-field expansion processing performed by the viewpoint change processing unit 155 will be described with reference again to FIG. 14B. As described above, in FIG. 14B, the image that has passed through the pupil partial region 501 is the first viewpoint image, and the image that has passed through the pupil partial region 502 is the second viewpoint image. As is clear from the figure, each viewpoint image is obtained through half of the original pupil region, so with pupil division into two in the horizontal direction the horizontal aperture diameter is halved. For this reason, the horizontal depth of field is quadrupled. On the other hand, since the present embodiment does not divide the pupil in the vertical direction, the vertical depth of field does not change. Therefore, the first viewpoint image (or the second viewpoint image) has, averaged over the vertical and horizontal directions, a depth of field twice that of the image obtained by combining the first and second viewpoint images (the A+B image).
(A series of operations related to viewpoint movement and focus adjustment operations on captured images)
Next, a series of operations related to the viewpoint movement and focus adjustment operations on a captured image will be described with reference to FIG. 16. This processing is started when, for example, a release switch included in the operation unit 132 is pressed by the user. This processing is realized by the control unit 121 loading a program stored in a ROM (not shown) into the work area of the RAM and executing it, and controlling each unit such as the image processing unit 125.
(A series of operations related to viewpoint image operation processing)
Next, a series of operations related to the viewpoint image operation processing in S104 will be described with reference to the flowchart shown in FIG. 17. The following description shows an example in which the operation on the user interface for moving the viewpoint (and changing the depth of field) (the viewpoint movement UI) is performed before the operation on the user interface for focus adjustment (the focus adjustment UI). However, the focus adjustment UI operation may be performed before the viewpoint movement UI operation.
(A series of operations related to development processing)
Next, the development processing in S205 and S210 will be described with reference to FIG. 18. In S301, the image processing unit 125 performs white balance processing by applying a gain to each of the R, G, and B colors so that R, G, and B in a white region become equal. In S302, the image processing unit 125 performs demosaicing processing. Specifically, the image processing unit 125 interpolates the input image in each prescribed direction and then performs direction selection, thereby generating, for each pixel, a color image signal of the three primary colors R, G, and B as the interpolation result.
(Example of viewpoint movement UI and focus adjustment UI)
Next, operation examples of the viewpoint movement UI and the focus adjustment UI described above, and examples of the composite image after operation, will be described with reference to FIGS. 19B to 19E. FIG. 19B shows an example in which a composite image subjected to refocus processing by operating the focus adjustment UI is displayed. In response to a user operation that moves the slider 1004 downward, the control unit 121 performs focus adjustment (refocus processing) using the image processing unit 125 so that the front-focus state becomes stronger, and displays the output image.
(Embodiment 2)
Next, Embodiment 2 will be described. Embodiment 2 describes an example in which the viewpoint image is operated while the image display is switched between the vertical position and the horizontal position. The configuration of the digital camera 100 of the present embodiment is the same as that of Embodiment 1, and part of the parallax image operation processing differs. For this reason, identical components are given identical reference numerals, duplicate descriptions are omitted, and the differences are mainly described.
(A series of operations related to viewpoint image manipulation processing)
The viewpoint image operation processing according to this embodiment will be described with reference to FIG. 21. In S401, the control unit 121 determines whether to display the image data in the vertical position. For example, the control unit 121 refers to the metadata of the input image to determine whether the image was captured in the vertical position. If the control unit 121 determines that the image was captured in the vertical position, it advances the processing to S402 to perform vertical position display; otherwise, it advances the processing to S403 to perform horizontal position display. Note that the determination in S401 may instead be made so that vertical position display is performed when the user has selected it via a button or the like of the operation unit 132, or information indicating the pixel division direction of the image sensor may be obtained from the metadata and the display orientation determined according to that division direction.
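The S401 branching can be sketched as a small decision function. The metadata field names below are hypothetical stand-ins (the patent does not name them); the logic mirrors the order described: an explicit user setting wins, otherwise the capture orientation from metadata decides.

```python
def display_orientation(metadata):
    """Decide vertical vs. horizontal display, mirroring the S401 check.

    `metadata` is a plain dict standing in for the image's recorded
    metadata; the key names here are illustrative assumptions.
    A user override set via the operation unit, if present, takes priority
    over the capture orientation.
    """
    if metadata.get("user_display") in ("vertical", "horizontal"):
        return metadata["user_display"]           # user-set via operation unit 132
    if metadata.get("captured_vertical", False):
        return "vertical"                         # -> S402
    return "horizontal"                           # -> S403
```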
(Embodiment 3)
Next, Embodiment 3 will be described. Embodiment 3 differs in that it uses an image sensor in which each pixel is divided into two in each of the horizontal and vertical directions. Except for this point, the configuration of the digital camera 100 is the same as that of Embodiment 1. Therefore, the same components are denoted by the same reference numerals, duplicate descriptions are omitted, and the description focuses on the differences.
(Viewpoint image correction and refocus)
As in Embodiment 1, the viewpoint change processing unit 155 performs contrast processing. That is, it calculates the luminance Y(j, i) from the Bayer-array captured image I(j, i) according to equation (1). The viewpoint change processing unit 155 also calculates the high-frequency component dY(j, i), the high-frequency component dZ(j, i), and the contrast distribution C(j, i).
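Equation (1) itself is not reproduced in this excerpt, so the sketch below substitutes a common Bayer-to-luminance reduction as an assumption: each 2x2 RGGB quad is collapsed to one Y value with the standard weights Y = 0.299 R + 0.587 G + 0.114 B, averaging the two G samples.

```python
import numpy as np

def bayer_luminance(img):
    """Luminance map from an RGGB Bayer mosaic (one Y per 2x2 quad).

    Stand-in for the patent's equation (1), which is not reproduced here;
    the 0.299/0.587/0.114 weights are the conventional luma coefficients,
    assumed for illustration only.
    """
    r = img[0::2, 0::2]
    g = (img[0::2, 1::2] + img[1::2, 0::2]) / 2.0  # average the two greens
    b = img[1::2, 1::2]
    return 0.299 * r + 0.587 * g + 0.114 * b

# One RGGB quad: R=1.0, G=0.5 (both samples), B=0.0.
quad = np.array([[1.0, 0.5],
                 [0.5, 0.0]])
y = bayer_luminance(quad)
```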
(Image shift distribution)
Furthermore, the image shift distribution of the viewpoint images in this embodiment will be described. Since the image shift distribution in the horizontal pupil division direction is the same as in Embodiment 1, its description is omitted, and the image shift distribution in the vertical pupil division direction is described instead.
(Depth of field expansion processing)
The viewpoint change processing unit 155 calculates the weight coefficient of each viewpoint image according to equations (19A) to (19D), using a real coefficient w (-1 ≤ w ≤ 1).
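Equations (19A)–(19D) are not reproduced in this excerpt. As a loosely hedged sketch of the constraint they must satisfy, the function below derives four viewpoint weights from a single coefficient w in [-1, 1] that always sum to 1, so the composite keeps the same overall exposure; biasing the weights narrows the effective pupil and thereby deepens the depth of field. The particular left/right biasing form is a hypothetical illustration, not the patent's formula.

```python
def viewpoint_weights(w):
    """Derive four sub-pixel viewpoint weights from one coefficient w.

    Hypothetical stand-in for equations (19A)-(19D): the weights bias the
    left vs. right viewpoint pairs while summing to 1. w = 0 reproduces
    the uniform (full-pupil) combination.
    """
    assert -1.0 <= w <= 1.0
    wl = (1.0 + w) / 4.0   # weight of each left-side viewpoint
    wr = (1.0 - w) / 4.0   # weight of each right-side viewpoint
    # Order: (top-left, bottom-left, top-right, bottom-right)
    return (wl, wl, wr, wr)

weights = viewpoint_weights(0.5)
```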
(Example of viewpoint movement UI and focus adjustment UI)
Further, the configuration of the viewpoint movement UI and focus adjustment UI according to this embodiment will be described with reference to FIG. 24. In this embodiment, since the pupil is divided in two directions, horizontal and vertical, the user can move the viewpoint both vertically and horizontally. For this reason, this embodiment provides two-axis sliders and slider bars for the user to operate in each direction.
(Embodiment 4)
Next, Embodiment 4 will be described. Embodiment 4 describes an example of controlling the UI notation (the display attached to the viewpoint movement UI and the like described above) when viewpoint images are operated while the image is switched between vertical position display and horizontal position display. The notation of the UI for operating viewpoint images indicates, for example, the direction in which the viewpoint of the viewpoint image is changed in response to the operation. The configuration of the digital camera 100 of this embodiment is the same as that of Embodiment 1, and only part of the parallax image operation processing differs. Therefore, the same components are denoted by the same reference numerals, duplicate descriptions are omitted, and the description focuses on the differences.
(A series of operations related to viewpoint image manipulation processing)
The viewpoint image operation processing according to this embodiment will be described with reference to FIGS. 25A and 25B. In S501 of FIG. 25A, the control unit 121 determines whether to display the input image in the vertical position, in order to determine whether to match the notation of the UI for operating the parallax images to the vertical position display of the image. For example, the control unit 121 refers to the metadata of the input image to determine whether the input image was captured in the vertical position; if it determines that the input image is to be displayed in the vertical position, it advances the processing to S502 to use the vertical position notation. On the other hand, if it determines that the image was not captured in the vertical position, it advances the processing to S503 to use the horizontal position notation. In S502, the control unit 121 further determines the rotation angle of the image. For example, the control unit 121 refers to the metadata of the input image to determine the angle of the captured image (for example, a vertical position rotated 90 degrees to the right, or a vertical position rotated 90 degrees to the left). If the control unit 121 determines that the input image was captured rotated 90 degrees to the right, it advances the processing to S504 to display the 90-degrees-right-rotated notation on the display unit 131. On the other hand, if it determines that the captured image was not rotated 90 degrees to the right (and was captured for vertical position display), it advances the processing to S505 to display the 90-degrees-left-rotated notation. Note that the determinations in S501 and S502 may instead be made so that the vertical position notation is used when the user has selected vertical position display via a button or the like of the operation unit 132, or information indicating the pixel division direction of the image sensor may be obtained from the metadata and the notation of the UI for operating the parallax images determined according to that division direction.
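The S501/S502 branching above reduces to a three-way choice of notation. The sketch below mirrors that flow; the metadata key names and notation labels are hypothetical stand-ins, since the patent specifies the branches but not the data representation.

```python
def ui_notation(metadata):
    """Pick the UI notation per the S501/S502/S503/S504/S505 branching.

    `metadata` keys ("captured_vertical", "rotation") are illustrative
    assumptions standing in for the orientation information the control
    unit 121 reads from the input image.
    """
    if not metadata.get("captured_vertical", False):
        return "horizontal"             # S503: horizontal position notation
    if metadata.get("rotation") == "90_right":
        return "vertical_90_right"      # S504: 90-degrees-right notation
    return "vertical_90_left"           # S505: 90-degrees-left notation
```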
(Other embodiments)
The present invention can also be realized by processing in which a program that implements one or more functions of the above-described embodiments is supplied to a system or apparatus via a network or a storage medium, and one or more processors in a computer of the system or apparatus read and execute the program. It can also be realized by a circuit (for example, an ASIC) that implements one or more functions.
Claims (18)
- An image processing apparatus comprising: acquisition means for acquiring an image signal including intensity and angle information of incident light rays; operation means for accepting an operation for changing a viewpoint and an operation for changing a focus position; and processing means for generating, based on a plurality of viewpoint images obtained from the image signal, a display image whose viewpoint is changed according to the operation for changing the viewpoint and whose focus position is changed according to the operation for changing the focus position.
- The image processing apparatus according to claim 1, wherein the operation means accepts the operation for changing the viewpoint along a predetermined direction when the display image whose viewpoint has been changed in the predetermined direction can be generated based on the image signal.
- The image processing apparatus according to claim 1, wherein the operation means accepts the operation for changing the focus position along a direction different from a predetermined direction when the display image whose viewpoint has been changed in the predetermined direction can be generated based on the image signal.
- The image processing apparatus according to claim 2 or 3, wherein the operation means accepts the operation for changing the viewpoint such that the change of the viewpoint toward the predetermined direction becomes larger the farther the operation is from a predetermined position along the predetermined direction, and the change of the viewpoint toward the opposite direction becomes larger the farther the operation is from the predetermined position along the opposite direction.
- The image processing apparatus according to any one of claims 2 to 4, wherein the predetermined direction is a horizontal direction or a vertical direction.
- The image processing apparatus according to claim 5, wherein, when the display image whose viewpoint can be changed in the horizontal direction in the horizontal position is displayed in the vertical position, the operation means accepts the operation for changing the viewpoint in the vertical direction.
- The image processing apparatus according to claim 6, further comprising display means for displaying a notation indicating the direction in which the viewpoint is changed according to the operation for changing the viewpoint, wherein the display means changes the notation according to the direction in which the operation means accepts the operation for changing the viewpoint.
- The image processing apparatus according to any one of claims 2 to 7, wherein the operation means accepts the operation for changing the focus position such that the change of the focus position becomes larger the farther the operation is from a predetermined position along the direction in which the operation is accepted.
- The image processing apparatus according to any one of claims 1 to 8, wherein the operation means accepts the operation for changing the viewpoint and the operation for changing the focus position while a display image based on the image signal is displayed.
- The image processing apparatus according to any one of claims 1 to 9, wherein the operation means further accepts an operation for changing a degree of edge enhancement in the display image, and the processing means performs edge enhancement on an in-focus area in the display image according to the degree of edge enhancement.
- The image processing apparatus according to any one of claims 1 to 10, wherein the processing means varies weights for adding the plurality of viewpoint images according to the operation for changing the viewpoint, and adds the plurality of viewpoint images using the weights to generate the display image.
- The image processing apparatus according to any one of claims 1 to 11, wherein the processing means changes a shift amount between the plurality of viewpoint images according to the operation for changing the focus position, and adds the plurality of parallax images shifted by the shift amount to generate the display image.
- An imaging apparatus comprising: an image sensor in which pixels each including a plurality of sub-pixels are arranged two-dimensionally and which outputs an image signal including intensity and angle information of incident light rays based on signals output from the sub-pixels; operation means for accepting an operation for changing a viewpoint and an operation for changing a focus position; and processing means for generating, based on a plurality of viewpoint images obtained from the image signal, a display image whose viewpoint is changed according to the operation for changing the viewpoint and whose focus position is changed according to the operation for changing the focus position, wherein, in the image sensor, a plurality of the sub-pixels are arranged in a predetermined direction within each pixel, and the operation means accepts the operation for changing the viewpoint along the predetermined direction when the display image whose viewpoint has been changed in the predetermined direction can be generated based on the image signal.
- The imaging apparatus according to claim 13, wherein, when displaying the image signal captured in the horizontal position using the imaging apparatus as the display image in the horizontal position, the operation means accepts the operation for changing the viewpoint along the predetermined direction, and when displaying the image signal captured in the horizontal position as the display image in the vertical position, the operation means accepts the operation for changing the viewpoint along a direction perpendicular to the predetermined direction.
- The imaging apparatus according to claim 14, further comprising display means for displaying a notation indicating the direction in which the viewpoint is changed according to the operation for changing the viewpoint, wherein the display means changes the notation according to a rotation angle, from the vertical position or the horizontal position, at which the image signal was captured.
- A control method for an image processing apparatus, comprising: an acquisition step in which acquisition means acquires an image signal including intensity and angle information of incident light rays; an operation step in which operation means accepts an operation for changing a viewpoint and an operation for changing a focus position; and a processing step in which processing means generates, based on a plurality of viewpoint images obtained from the image signal, a display image whose viewpoint is changed according to the operation for changing the viewpoint and whose focus position is changed according to the operation for changing the focus position.
- A control method for an imaging apparatus having an image sensor in which pixels each including a plurality of sub-pixels are arranged two-dimensionally and which outputs an image signal including intensity and angle information of incident light rays based on signals output from the sub-pixels, the method comprising: an acquisition step in which acquisition means acquires the image signal output from the image sensor; an operation step in which operation means accepts an operation for changing a viewpoint and an operation for changing a focus position; and a processing step in which processing means generates, based on a plurality of viewpoint images obtained from the image signal, a display image whose viewpoint is changed according to the operation for changing the viewpoint and whose focus position is changed according to the operation for changing the focus position, wherein, in the image sensor, a plurality of the sub-pixels are arranged in a predetermined direction within each pixel, and in the operation step, the operation for changing the viewpoint is accepted along the predetermined direction when the display image whose viewpoint has been changed in the predetermined direction can be generated based on the image signal.
- A program for causing a computer to execute each step of the control method for an image processing apparatus according to claim 16.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201780019656.XA CN108886581B (en) | 2016-03-24 | 2017-01-25 | Image processing apparatus, image pickup apparatus, and control method thereof |
RU2018137046A RU2707066C1 (en) | 2016-03-24 | 2017-01-25 | Image processing device, image creation device, their control methods and program |
SG11201808036XA SG11201808036XA (en) | 2016-03-24 | 2017-01-25 | Image processing apparatus, imaging apparatus, and control methods thereof |
KR1020187029815A KR102157491B1 (en) | 2016-03-24 | 2017-01-25 | Image processing apparatus, imaging apparatus and control method thereof, storage medium |
DE112017001458.1T DE112017001458T5 (en) | 2016-03-24 | 2017-01-25 | Image processing apparatus, imaging apparatus and control method thereof |
PH12018502032A PH12018502032A1 (en) | 2016-03-24 | 2018-09-21 | Image processing apparatus, imaging apparatus, and control methods thereof |
US16/137,801 US10924665B2 (en) | 2016-03-24 | 2018-09-21 | Image processing apparatus, imaging apparatus, control methods thereof, and storage medium for generating a display image based on a plurality of viewpoint images |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016060897 | 2016-03-24 | ||
JP2016-060897 | 2016-03-24 | ||
JP2016-112103 | 2016-06-03 | ||
JP2016112103A JP6757184B2 (en) | 2016-03-24 | 2016-06-03 | Image processing equipment, imaging equipment and their control methods and programs |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/137,801 Continuation US10924665B2 (en) | 2016-03-24 | 2018-09-21 | Image processing apparatus, imaging apparatus, control methods thereof, and storage medium for generating a display image based on a plurality of viewpoint images |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017163588A1 (en) | 2017-09-28 |
Family
ID=59900206
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/002504 WO2017163588A1 (en) | 2016-03-24 | 2017-01-25 | Image processing apparatus, image pickup apparatus, and control methods therefor, and program |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2017163588A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110388880A (en) * | 2018-04-20 | 2019-10-29 | 株式会社基恩士 | Form measuring instrument and form measuring method |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009211335A (en) * | 2008-03-04 | 2009-09-17 | Nippon Telegr & Teleph Corp <Ntt> | Virtual viewpoint image generation method, virtual viewpoint image generation apparatus, virtual viewpoint image generation program, and recording medium from which same recorded program can be read by computer |
JP2013110556A (en) * | 2011-11-21 | 2013-06-06 | Olympus Corp | Plenoptic camera |
JP2015115818A (en) * | 2013-12-12 | 2015-06-22 | キヤノン株式会社 | Image processing apparatus, image processing method, and program |
JP2015198340A (en) * | 2014-04-01 | 2015-11-09 | キヤノン株式会社 | Image processing system and control method therefor, and program |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6757184B2 (en) | Image processing equipment, imaging equipment and their control methods and programs | |
US10681286B2 (en) | Image processing apparatus, imaging apparatus, image processing method, and recording medium | |
JP6789833B2 (en) | Image processing equipment, imaging equipment, image processing methods and programs | |
CN107465866B (en) | Image processing apparatus and method, image capturing apparatus, and computer-readable storage medium | |
JP6972266B2 (en) | Image processing method, image processing device, and image pickup device | |
CN107431755B (en) | Image processing apparatus, image capturing apparatus, image processing method, and storage medium | |
JP6516510B2 (en) | Image processing apparatus, imaging apparatus, image processing method, image processing program, and storage medium | |
JP6254843B2 (en) | Image processing apparatus and control method thereof | |
JP6976754B2 (en) | Image processing equipment and image processing methods, imaging equipment, programs | |
US10868953B2 (en) | Image processing device capable of notifying an effect of emphasizing a subject before performing imaging, control method thereof, and medium | |
JP7204357B2 (en) | Imaging device and its control method | |
WO2017163588A1 (en) | Image processing apparatus, image pickup apparatus, and control methods therefor, and program | |
WO2016143913A1 (en) | Image processing method, image processing device, and image pickup apparatus | |
JP6800648B2 (en) | Image processing device and its control method, program and imaging device | |
JP2020171050A (en) | Image processing apparatus, imaging apparatus, image processing method, and storage medium | |
JP6817855B2 (en) | Image processing equipment, imaging equipment, image processing methods, and programs | |
US10964739B2 (en) | Imaging apparatus and control method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 11201808036X Country of ref document: SG |
|
ENP | Entry into the national phase |
Ref document number: 20187029815 Country of ref document: KR Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17769635 Country of ref document: EP Kind code of ref document: A1 |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 17769635 Country of ref document: EP Kind code of ref document: A1 |