WO2014045741A1 - Image processing device, imaging device, image processing method, and image processing program - Google Patents


Info

Publication number
WO2014045741A1
WO2014045741A1 (PCT/JP2013/071183)
Authority
WO
WIPO (PCT)
Prior art keywords
image
display
unit
pixel
instruction
Prior art date
Application number
PCT/JP2013/071183
Other languages
French (fr)
Japanese (ja)
Inventor
智行 河合
沖川 満
林 淳司
Original Assignee
FUJIFILM Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FUJIFILM Corporation
Publication of WO2014045741A1 publication Critical patent/WO2014045741A1/en

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/28 Systems for automatic generation of focusing signals
    • G02B 7/34 Systems for automatic generation of focusing signals using different areas in a pupil plane
    • G02B 7/346 Systems for automatic generation of focusing signals using different areas in a pupil plane using horizontal and vertical areas in the pupil plane, i.e. wide area autofocusing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N 23/633 Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H04N 23/635 Region indicators; Field of view indicators
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 Camera processing pipelines; Components thereof
    • H04N 23/84 Camera processing pipelines; Components thereof for processing colour signals
    • H04N 23/843 Demosaicing, e.g. interpolating colour pixel values
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/10 Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N 25/11 Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N 25/13 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N 25/134 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B 13/32 Means for focusing
    • G03B 13/34 Power focusing
    • G03B 13/36 Autofocus systems
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 17/00 Details of cameras or camera bodies; Accessories therefor
    • G03B 17/18 Signals indicating condition of a camera member or suitability of light

Definitions

  • the present invention relates to an image processing device, an imaging device, an image processing method, and an image processing program.
  • a digital camera having a so-called manual focus mode in which a user can manually perform focus adjustment in addition to auto-focus using a phase difference detection method or a contrast detection method is widely known.
  • a method using a split micro prism screen that displays a phase difference visually by providing a reflex mirror so that focus adjustment can be performed while checking a subject is known.
  • a method employing a method for visually confirming contrast is also known.
  • a split image is displayed in a live view image (also referred to as a through image) in order to make it easier for the operator to focus on the subject in the manual focus mode.
  • the split image is, for example, an image divided into two parts (for example, divided in the vertical direction); the two parts are shifted relative to each other in the parallax generation direction (for example, the horizontal direction) according to the amount of defocus, and in the in-focus state the shift in the parallax generation direction disappears.
  • An operator (for example, a photographer) adjusts the focus by operating the manual focus ring so that the split image (for example, each image divided in the vertical direction) is no longer displaced.
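The split-image behaviour described above, two half-images that align only when the subject is in focus, can be illustrated with a short sketch. This is hypothetical Python code, not part of the patent; defocus is simulated as a simple horizontal shift (parallax) of one pupil-divided image.

```python
def make_split_image(left, right):
    """Compose a two-way split image: upper-half rows come from the
    left-pupil image, lower-half rows from the right-pupil image.
    In focus, the two halves line up; defocused, they are shifted
    horizontally relative to each other by the parallax amount."""
    h = len(left)
    return [row[:] for row in left[:h // 2]] + [row[:] for row in right[h // 2:]]

def shift_rows(img, dx):
    """Simulate defocus parallax as a circular horizontal shift of dx pixels."""
    return [row[dx:] + row[:dx] for row in img]

base = [list(range(16)) for _ in range(8)]   # 8x16 gradient test image
defocused = shift_rows(base, 3)              # 3-pixel parallax shift

in_focus_split = make_split_image(base, base)        # halves aligned
defocused_split = make_split_image(base, defocused)  # visible seam at row 4

print(in_focus_split == base)                 # True: no visible seam
print(defocused_split[:4] == base[:4])        # True: top half unchanged
print(defocused_split[4:] == defocused[4:])   # True: bottom half shifted
```

Turning the focus ring reduces the parallax `dx`; when the two halves coincide, the subject is in focus.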
  • An imaging apparatus described in a first prior-art document (Patent Document 1) includes aperture moving means for moving an aperture stop on the subject optical path within a plane perpendicular to the optical axis, and storage means for storing two subject images respectively captured at the two distance measuring positions to which the aperture stop is moved.
  • That image processing apparatus includes a display unit that outputs a split image obtained by combining the two subject images and displays whether the focus state is appropriate.
  • An imaging apparatus described in Japanese Patent Laid-Open No. 2009-147665 (hereinafter referred to as Patent Document 2) photoelectrically converts a first subject image and a second subject image, formed by light beams divided by a pupil division unit out of the light beams from an imaging optical system, to generate a first image and a second image. A split image is generated using the first and second images, and a third image is generated by photoelectric conversion of a third subject image formed by the light beam that is not divided by the pupil division unit. The third image is displayed on the display unit, the generated split image is displayed within the third image, and color information extracted from the third image is added to the split image. Adding the color information extracted from the third image to the split image in this way can improve the visibility of the split image.
  • An imaging apparatus described in Japanese Patent Laid-Open No. 2009-163220 (hereinafter referred to as Patent Document 3) includes processing means for displaying a superimposed image obtained by superimposing a first image and a second image obtained by division by a pupil division unit.
  • A digital camera described in Japanese Patent Laid-Open No. 2001-309210 (hereinafter referred to as Patent Document 4) includes means for detecting the shift amount between the focus position and the subject position and changing the display content of the split image according to the shift amount.
  • The imaging apparatus described in Japanese Patent Application Laid-Open No. 2009-237214 (hereinafter referred to as Patent Document 5) is configured so that, in the manual focus mode, the split image is erased from the screen while the aperture button is operated and is displayed while the aperture button is not operated.
  • the split image may continue to be displayed even after the point at which the user no longer needs it.
  • conversely, the split image may not be displayed even when the point at which the user needs it has arrived.
  • all of the techniques described in Patent Documents 1 to 5 have a problem that it is difficult to switch between the split image display state and the non-display state at an appropriate time.
  • the present invention has been proposed in view of such circumstances, and an object thereof is to provide an image processing apparatus, an imaging apparatus, an image processing method, and an image processing program that can switch between display and non-display of an image used for in-focus confirmation at an appropriate time.
  • An image processing apparatus according to a first aspect of the present invention includes: a generation unit that generates a first display image based on an image signal output from an imaging element including first and second pixel groups on which a subject image having passed through first and second regions of a photographing lens is formed after pupil division, and that generates, based on first and second image signals output from the first and second pixel groups, a second display image used for in-focus confirmation; a first instruction unit that issues a non-holding-type instruction to display the first display image; a second instruction unit that instructs display of the second display image; a display unit that displays images; and a display control unit that, when display of the first display image is instructed by the first instruction unit while display of the second display image is instructed by the second instruction unit, performs control to display on the display unit the first display image generated by the generation unit without displaying the second display image, and that, when the display instruction of the first display image by the first instruction unit is canceled while display of the second display image is instructed, performs control to display on the display unit the first display image generated by the generation unit and to display the second display image generated by the generation unit within the display area of the first display image.
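As a non-authoritative sketch of the display control rule in this aspect, the decision can be modelled as a small function of the two instruction states (function and return-value names are illustrative, not from the patent):

```python
def frames_to_display(first_instructed: bool, second_instructed: bool):
    """Decision rule of the display control unit (illustrative sketch).

    - while the non-holding first instruction is active, the normal
      (first) display image is shown and the split image is suppressed;
    - when the first instruction is released while the second (split
      image) instruction still holds, the split image is shown inside
      the display area of the first display image.
    """
    if not second_instructed:
        return ("first",)                      # no split-image request at all
    if first_instructed:
        return ("first",)                      # split image suppressed
    return ("first", "second_in_first_area")   # split image embedded

print(frames_to_display(first_instructed=True, second_instructed=True))
# ('first',)
print(frames_to_display(first_instructed=False, second_instructed=True))
# ('first', 'second_in_first_area')
```

Because the first instruction is non-holding, releasing it immediately flips the output from the first case to the second, which is the "appropriate time" switching the invention aims at.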
  • the instruction by the second instruction unit may be a holding instruction.
  • the imaging element may further include a third pixel group on which the subject image transmitted through the photographing lens is formed without pupil division and which outputs a third image, and the generation unit may generate the first display image based on the third image output from the third pixel group.
  • the generation unit may interpolate the first and second images based on the third image, and may generate the first display image based on the third image.
  • the image quality of the first display image can be further improved with a simple structure.
  • An imaging device according to an aspect of the present invention includes the image processing device according to any one of the first to fourth aspects of the present invention, an imaging element having the first and second pixel groups, and a storage unit that stores images output from the imaging element. Thereby, display and non-display of the image used for focus confirmation can be switched at an appropriate time as compared with the case where the present configuration is not provided.
  • the first instruction unit may be a release switch that can be moved to a first instruction position for instructing adjustment of an imaging condition and to a second instruction position for instructing the start of imaging. Display of the first display image is instructed while the release switch is held at the first instruction position, and the display instruction may be canceled when the hold at the first instruction position is released. Accordingly, the second display image can be quickly displayed at an appropriate time as compared with the case where the present configuration is not provided.
  • an imaging operation may be started when the release switch is held at the second instruction position, and the display control unit may perform control so that, when the release switch is held at the second instruction position, the display instruction of the second display image by the second instruction unit is canceled and the first display image is displayed on the display unit after the imaging operation is finished.
  • alternatively, the case where the release switch is held at the second instruction position and an imaging operation is started may be treated as the case where the display instruction of the first display image is canceled.
  • the first instruction unit may be a setting unit that sets an aperture value of the photographing lens. In this case, display of the first display image is instructed when a specific aperture value is set by the setting unit, and the display instruction of the first display image is canceled when the specific aperture value is not set by the setting unit. Accordingly, the second display image can be displayed at an appropriate time according to the aperture value as compared with the case where the present configuration is not provided.
  • the display unit may include an electronic viewfinder capable of displaying the first display image and the second display image, and the first instruction unit may include a detection unit that detects use of the electronic viewfinder. In this case, display of the first display image is instructed when the detection unit does not detect use of the electronic viewfinder, and the display instruction of the first display image is canceled when use of the electronic viewfinder is detected. Accordingly, the second display image can be displayed at an appropriate time according to the usage state of the electronic viewfinder, as compared with the case where the present configuration is not provided.
  • the imaging element may further include a third pixel group on which a subject image having passed through the photographing lens is formed without pupil division, and each pixel included in the first pixel group and each pixel included in the second pixel group may be adjacent to each other in a predetermined direction.
  • each pixel included in the first and second pixel groups may further be arranged between pixels included in the third pixel group, and a color filter provided over the first to third pixel groups, which assigns a specific primary color to each pixel included in the first and second pixel groups and to the pixels of the third pixel group adjacent to them, may be further included.
  • in the thirteenth aspect of the present invention, the specific primary color may be green.
  • in the thirteenth or fourteenth aspect of the present invention, the primary color array of the color filter may be a Bayer array.
  • each pixel included in the first and second pixel groups may further be arranged, together with the pixels included in the third pixel group, in a matrix shifted by half a pixel; the constituent pixels of the imaging element may be classified into two types to obtain fourth and fifth pixel groups whose constituent pixels are alternately shifted by half a pixel in the row direction and the column direction; and a color filter that assigns the primary colors of the Bayer array to each of the fourth and fifth pixel groups may be further included.
  • the first and second pixel groups may be arranged with respect to a green filter in the color filter.
  • the image quality can be further improved with a simple configuration as compared with the case without this configuration.
  • An image processing method according to an aspect of the present invention includes: a generation step of generating a first display image based on an image signal output from an imaging element including first and second pixel groups on which a subject image having passed through first and second regions of a photographing lens is formed after pupil division, and of generating, based on first and second image signals output from the first and second pixel groups, a second display image used for in-focus confirmation; a first instruction step of issuing a non-holding-type instruction to display the first display image; a second instruction step of instructing display of the second display image; and a display control step of, when display of the first display image is instructed by the first instruction step while display of the second display image is instructed by the second instruction step, performing control to display on the display unit the first display image generated by the generation step without displaying the second display image generated by the generation step, and of, when the display instruction of the first display image in the first instruction step is canceled while display of the second display image is instructed by the second instruction step, performing control to display on the display unit the first display image generated by the generation step and to display the second display image generated by the generation step within the display area of the first display image.
  • An image processing program according to an aspect of the present invention causes a computer to function as the generation unit and the display control unit in the image processing device according to any one of claims 1 to 4. Thereby, display and non-display of the image used for focus confirmation can be switched at an appropriate time as compared with the case where the present configuration is not provided.
  • FIG. 2 is a schematic layout diagram illustrating an example of a layout of color filters provided in an image sensor included in the imaging device according to the first embodiment.
  • FIG. 5 is a diagram for explaining a method of determining a correlation direction from the pixel values of the 2 × 2 G pixels included in the color filter illustrated in FIG. 4.
  • A figure illustrates the concept of a basic array pattern.
  • Another figure illustrates the concept of the basic array pattern contained in the color filter shown in FIG.
  • A schematic block diagram shows an example of an arrangement.
  • FIG. 6 is a screen diagram illustrating an example of a live view image that is displayed on a display unit of the imaging apparatus according to the first embodiment and is out of focus.
  • FIG. 6 is a screen diagram illustrating an example of a live view image that is displayed on a display unit of the imaging apparatus according to the first embodiment and is in a focused state.
  • A schematic block diagram shows a modification of the arrangement.
  • FIG. 20 is a schematic configuration diagram illustrating an example of a configuration of phase difference pixels (first pixel and second pixel) included in the imaging element illustrated in FIG. 19.
  • FIG. 20 is a schematic configuration diagram illustrating a modified example of the arrangement of phase difference pixels included in the image sensor illustrated in FIG. 19.
  • FIG. 6 is a schematic diagram illustrating an example of a split image according to the first embodiment, in which the first image and the second image are divided into odd lines and even lines and arranged alternately.
  • FIG. 9 is a schematic diagram showing an example of a split image according to the first embodiment, which is an example of a split image divided by oblique dividing lines inclined with respect to the horizontal direction.
  • FIG. 10 is a schematic diagram illustrating an example of a split image according to the first embodiment, which is an example of a split image divided by a grid-like dividing line.
  • FIG. 10 is a schematic diagram illustrating an example of a split image formed in a checkered pattern, which is a modification of the split image according to the first embodiment.
  • Flowcharts show examples of the flow of the image output processing according to the second, third, and fourth embodiments.
  • A perspective view shows an example of the appearance of a smartphone according to a fifth embodiment, and a block diagram shows an example of the main configuration of the electrical system of that smartphone.
  • FIG. 1 is a perspective view illustrating an example of an appearance of the imaging apparatus 100 according to the first embodiment
  • FIG. 2 is a rear view of the imaging apparatus 100 illustrated in FIG.
  • the imaging apparatus 100 is an interchangeable-lens camera: a digital camera that includes a camera body 200 and an interchangeable lens 300 (a photographing lens including a focus lens 302) replaceably attached to the camera body 200, and that omits a reflex mirror.
  • the camera body 200 is provided with a hybrid finder (registered trademark) 220.
  • the hybrid viewfinder 220 here refers to a viewfinder in which, for example, an optical viewfinder (hereinafter referred to as “OVF”) and an electronic viewfinder (hereinafter referred to as “EVF”) are selectively used.
  • the camera body 200 and the interchangeable lens 300 are interchangeably mounted by combining a mount 256 provided in the camera body 200 and a mount 346 (see FIG. 3) on the interchangeable lens 300 side corresponding to the mount 256.
  • the lens barrel of the interchangeable lens 300 is provided with a focus ring; the focus lens 302 is moved in the optical axis direction in accordance with the rotation operation of the focus ring, so that subject light can be imaged on an imaging element 20 (described later; see FIG. 3) at the in-focus position corresponding to the subject distance.
  • the front of the camera body 200 is provided with an OVF viewfinder window 241 included in the hybrid viewfinder 220.
  • a finder switching lever (finder switching unit) 214 is provided on the front surface of the camera body 200. When the viewfinder switching lever 214 is rotated in the direction of the arrow SW, it switches between an optical image that can be viewed with OVF and an electronic image (live view image) that can be viewed with EVF (described later).
  • the optical axis L2 of the OVF is an optical axis different from the optical axis L1 of the interchangeable lens 300.
  • a release switch 211 and a dial 212 for setting a shooting mode, a playback mode, and the like are mainly provided on the upper surface of the camera body 200.
  • the release switch 211 can be pressed from a standby position to an intermediate position (half-pressed position), which is an example of the first indicated position, and further to a final pressed position (fully-pressed position) beyond the intermediate position, which is an example of the second indicated position.
  • hereinafter, the state pressed from the standby position to the half-pressed position is referred to as the "half-pressed state",
  • and the state pressed from the standby position to the fully-pressed position is referred to as the "fully-pressed state".
  • the shooting conditions are adjusted by pressing the release switch 211 halfway; exposure (shooting) is then performed when the release switch 211 is subsequently fully pressed.
  • the “imaging condition” referred to here indicates, for example, at least one of an exposure state and a focused state.
  • here, the exposure state and the focus state are adjusted: by pressing the release switch 211 halfway, the AE (Automatic Exposure) function is activated to set the exposure state (shutter speed and aperture state), and then the AF function is activated to control the focus.
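The half-press / full-press sequence described above can be sketched as a toy state machine. All class and method names here are hypothetical; the patent describes behaviour, not code.

```python
class ReleaseSwitch:
    """Sketch of the two-stage release switch: half-press adjusts the
    imaging conditions (AE, then AF); full press performs the exposure."""

    def __init__(self):
        self.log = []  # record of actions, for illustration

    def _auto_exposure(self):
        self.log.append("AE: set shutter speed and aperture")

    def _auto_focus(self):
        self.log.append("AF: drive focus lens to in-focus position")

    def half_press(self):
        # Moving from the standby position to the intermediate
        # (half-pressed) position adjusts the shooting conditions.
        self._auto_exposure()
        self._auto_focus()

    def full_press(self):
        # Pressing beyond the intermediate position to the final
        # (fully-pressed) position performs the exposure.
        self.log.append("exposure: capture still image")

sw = ReleaseSwitch()
sw.half_press()
sw.full_press()
print(sw.log)
```

The ordering matters: AE fixes shutter speed and aperture first, then AF runs, then the full press exposes with those settings.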
  • On the back of the camera body 200, an OVF viewfinder eyepiece 242, a display unit 213, a cross key 222, a MENU/OK key 224, and a BACK/DISP button 225 are provided.
  • the cross key 222 functions as a multi-function key that outputs various command signals such as menu selection, zoom and frame advance.
  • the MENU/OK key 224 is an operation key that combines a function as a menu button for instructing display of a menu on the screen of the display unit 213 and a function as an OK button for instructing confirmation and execution of the selected content.
  • the BACK / DISP button 225 is used for deleting a desired object such as a selection item, canceling a designated content, or returning to the previous operation state.
  • the display unit 213 is realized by, for example, an LCD, and is used to display a live view image (through image) that is an example of a continuous frame image obtained by capturing a continuous frame in the shooting mode.
  • the display unit 213 is also used to display a still image that is an example of a single frame image obtained by capturing a single frame when a still image shooting instruction is given.
  • the display unit 213 is also used for displaying a playback image and a menu screen in the playback mode.
  • FIG. 3 is a block diagram showing an example of the electrical configuration (internal configuration) of the imaging apparatus 100 according to the first embodiment.
  • the imaging device 100 is a digital camera that records captured still images and moving images, and the overall operation of the camera is controlled by a CPU (central processing unit) 12.
  • the imaging apparatus 100 includes an operation unit 14 that is an example of a first instruction unit, a second instruction unit, and a setting unit according to the present invention.
  • the imaging apparatus 100 includes an interface unit 24, a memory 26, and an encoder 34.
  • the imaging apparatus 100 includes display control units 36A and 36B, which are examples of the display control unit according to the present invention.
  • the imaging apparatus 100 includes an eyepiece detection unit 37, which is an example of the detection unit according to the present invention.
  • the imaging apparatus 100 includes an image processing unit 28 that is an example of a generation unit according to the present invention.
  • in the first embodiment, the display control unit 36 is provided as a hardware configuration separate from the image processing unit 28; however, the present invention is not limited to this, and the image processing unit 28 may have the same function as the display control unit 36, in which case the display control unit 36 is unnecessary.
  • the CPU 12, the operation unit 14, the interface unit 24, the memory 26 (an example of the storage unit), the image processing unit 28, the encoder 34, the display control units 36A and 36B, the eyepiece detection unit 37, and the external interface (I/F) 39 are connected to one another via a bus 40.
  • the memory 26 includes a non-volatile storage area (such as an EEPROM) that stores parameters, programs, and the like, and a volatile storage area (such as an SDRAM) that temporarily stores various information such as images.
  • the CPU 12 performs focusing control by driving and controlling the focus adjustment motor so that the contrast value of the image obtained by imaging is maximized. Further, the CPU 12 calculates AE information that is a physical quantity indicating the brightness of an image obtained by imaging. When the release switch 211 is half-pressed, the CPU 12 derives the shutter speed and F value corresponding to the brightness of the image indicated by the AE information. Then, the exposure state is set by controlling each related part so that the derived shutter speed and F value are obtained.
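The contrast-maximizing focus control mentioned here is commonly implemented as a hill-climbing search over the focus motor position. The following is a minimal illustrative sketch with a toy contrast metric, not the patent's actual algorithm; in a real camera the contrast value would be computed from each captured frame.

```python
def contrast_value(focus_position, best=42.0):
    """Toy contrast metric that peaks at the in-focus position `best`.
    A real implementation would measure image sharpness, e.g. the sum
    of squared differences between adjacent pixels."""
    return -(focus_position - best) ** 2

def contrast_af(start=0.0, step=8.0, min_step=0.5):
    """Hill-climbing search: keep moving the focus position while the
    contrast increases; on a drop, reverse direction and halve the step."""
    pos = start
    current = contrast_value(pos)
    while abs(step) >= min_step:
        candidate = pos + step
        c = contrast_value(candidate)
        if c > current:
            pos, current = candidate, c   # contrast improved: keep going
        else:
            step = -step / 2              # overshoot: reverse and refine
    return pos

focus = contrast_af()
print(focus)  # converges close to the peak at 42.0
```

The halving step gives a coarse-to-fine search, mirroring how contrast AF first sweeps quickly and then fine-tunes around the contrast peak.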
  • the operation unit 14 is a user interface operated by the operator when giving various instructions to the imaging apparatus 100. Various instructions received by the operation unit 14 are output as operation signals to the CPU 12, and the CPU 12 executes processing according to the operation signals input from the operation unit 14.
  • the operation unit 14 includes a release switch 211, a dial (focus mode switching unit) 212 for selecting a shooting mode, a display unit 213, a viewfinder switching lever 214, a cross key 222, a MENU / OK key 224, and a BACK / DISP button 225. .
  • the operation unit 14 also includes a touch panel that accepts various types of information. This touch panel is overlaid on the display screen of the display unit 213, for example.
  • the operation unit 14 also includes a depth-of-field confirmation button that is pressed when confirming the depth of field on the screen of the display unit 213, for example.
  • by pressing the depth-of-field confirmation button, the F value of the photographing lens is adjusted, and the predetermined F value used when capturing a still image is set.
  • image light representing the subject is formed on the light receiving surface of a color imaging element (for example, a CMOS sensor) 20 via the photographing lens 16, which includes a manually movable focus lens, and the shutter 18.
  • the signal charge accumulated in the image sensor 20 is sequentially read out as a digital signal corresponding to the signal charge (voltage) by a read signal applied from the device control unit 22.
  • the imaging element 20 has a so-called electronic shutter function, and controls the charge accumulation time (shutter speed) of each photosensor according to the timing of the readout signal by using the electronic shutter function.
  • the image sensor 20 according to the first embodiment is a CMOS image sensor, but is not limited thereto, and may be a CCD image sensor.
  • the image sensor 20 is provided with a color filter 21 shown in FIG. 4 as an example.
  • FIG. 4 schematically shows an example of the arrangement of the color filters 21.
  • 4896 × 3264 pixels are adopted as an example of the number of pixels, and 3:2 is adopted as the aspect ratio;
  • the number of pixels and the aspect ratio are not limited to these.
  • the color filter 21 includes a first filter G corresponding to G (green), which contributes most to obtaining the luminance signal, a second filter R corresponding to R (red), and a third filter B corresponding to B (blue).
  • the arrangement pattern of the first filter G (hereinafter referred to as the G filter), the second filter R (hereinafter referred to as the R filter), and the third filter B (hereinafter referred to as the B filter) is classified into a first array pattern A and a second array pattern B.
  • the G filters are arranged on the four corners and the center pixel of the 3 ⁇ 3 pixel square array.
  • the R filter is arranged on the central vertical line in the horizontal direction (an example of the row direction) of the square arrangement.
  • the B filter is arranged on the central horizontal line in the vertical direction (an example of the column direction) of the square arrangement.
  • the second array pattern B is a pattern in which the arrangement of the G filter is the same as in the first array pattern A and the arrangements of the R filter and the B filter are interchanged.
  • the color filter 21 includes a basic array pattern C composed of a square array pattern corresponding to 6 ⁇ 6 pixels.
  • the basic array pattern C is a 6 × 6 pixel pattern in which the first array pattern A and the second array pattern B are arranged point-symmetrically, and the basic array pattern C is repeatedly arranged in the horizontal and vertical directions.
  • that is, the filters of the R, G, and B colors (the R filter, G filter, and B filter) are arranged with a predetermined periodicity.
  • the color filter array of the reduced image after thinning processing can be made the same as the color filter array before the thinning processing, so that a common processing circuit can be used.
  • the color filter 21 has a G filter, corresponding to the color that contributes most to obtaining a luminance signal (the G color in the first embodiment), arranged in each horizontal, vertical, and diagonal line of the color filter array. Therefore, the reproduction accuracy of the synchronization processing in the high-frequency region can be improved regardless of the direction of the high frequency.
  • the color filter 21 includes an R filter and a B filter, corresponding to two or more other colors than the G color (the R and B colors in the first embodiment), arranged within each horizontal and vertical line of the color filter array. For this reason, the occurrence of color moire (false color) is suppressed, so that an optical low-pass filter for suppressing the occurrence of false color need not be arranged in the optical path from the incident surface of the optical system to the imaging surface. Even when an optical low-pass filter is applied, one with a weak function of cutting the high-frequency components that cause false color can be used, so that resolution is not impaired.
  • the basic array pattern C can also be understood as an array in which the 3 × 3 pixel first array pattern A surrounded by the broken-line frame and the 3 × 3 pixel second array pattern B surrounded by the one-dot chain-line frame are alternately arranged in the horizontal and vertical directions.
  • in each of the first array pattern A and the second array pattern B, the G filters, which are luminance pixels, are arranged at the four corners and the center, that is, on both diagonal lines.
  • in the first array pattern A, the B filters are arranged in the horizontal direction and the R filters are arranged in the vertical direction with the central G filter interposed therebetween.
  • in the second array pattern B, the R filters are arranged in the horizontal direction and the B filters are arranged in the vertical direction across the central G filter. That is, in the first array pattern A and the second array pattern B, the positional relationship between the R filter and the B filter is reversed, but the other arrangements are the same.
  • since the first array pattern A and the second array pattern B are alternately arranged in the horizontal and vertical directions as shown in FIG. 5 as an example, the G filters at the four corners of the first array pattern A and the second array pattern B form a square array of G filters corresponding to 2 × 2 pixels.
  • using the 2 × 2 pixels of G filters, the absolute difference of the pixel values of the G pixels in the horizontal direction, the absolute difference of the pixel values of the G pixels in the vertical direction, and the absolute differences of the pixel values of the G pixels in the diagonal directions (upper-right diagonal and upper-left diagonal) are calculated. Thereby, it can be determined that there is a correlation in the direction with the smallest absolute difference.
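The direction determination based on these absolute differences can be sketched as follows (the pixel values and the rule of taking the smallest difference as the correlation direction are illustrative assumptions, not values from the embodiment):

```python
# Illustrative sketch: given the pixel values of the square array of
# 2x2 G pixels (top-left, top-right, bottom-left, bottom-right),
# compute the absolute differences in the horizontal, vertical, and
# diagonal (upper-right, upper-left) directions.  The direction with
# the smallest difference can be treated as the correlation direction.
def g_correlation_direction(tl, tr, bl, br):
    diffs = {
        "horizontal": abs(tl - tr) + abs(bl - br),
        "vertical": abs(tl - bl) + abs(tr - br),
        "diag_upper_right": abs(bl - tr),
        "diag_upper_left": abs(br - tl),
    }
    return min(diffs, key=diffs.get)

# Columns have nearly equal values -> vertical correlation.
print(g_correlation_direction(100, 150, 101, 149))  # vertical
```

Such a direction estimate is what makes demosaicing interpolation along edges possible with only G-pixel information.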
  • the basic array pattern C of the color filter 21 is arranged point-symmetrically with respect to the center of the basic array pattern C (the centers of the four G filters).
  • the first array pattern A and the second array pattern B in the basic array pattern C are also arranged point-symmetrically with respect to the central G filter, so that the circuit scale of the subsequent processing circuit can be reduced or simplified.
  • the color filter array of the first and third lines of the first to sixth lines in the horizontal direction is GRGGBG.
  • the color filter array of the second line is BGBRGR.
  • the color filter array of the fourth and sixth lines is GBGGRG.
  • the color filter array of the fifth line is RGRBGB.
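The row sequences above follow directly from the A/B sub-pattern definitions; this can be checked with a short sketch (Python is used purely for illustration; the pattern layout is taken from the description above):

```python
# Sketch of the 6x6 basic array pattern C, built from the 3x3 first
# array pattern A (G at the corners and center, R on the center column,
# B on the center row) and the second array pattern B (R and B swapped).
A = ["GRG",
     "BGB",
     "GRG"]
B = [row.translate(str.maketrans("RB", "BR")) for row in A]

# A and B are arranged point-symmetrically: A beside B on the upper
# three rows, B beside A on the lower three rows.
C = [a + b for a, b in zip(A, B)] + [b + a for a, b in zip(A, B)]

for row in C:
    print(row)
# GRGGBG / BGBRGR / GRGGBG / GBGGRG / RGRBGB / GBGGRG
```

The sketch also makes the point symmetry of the basic array pattern easy to verify: the color at position (i, j) equals the color at (5 − i, 5 − j).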
  • basic array patterns C, C ′, and C ′′ are shown.
  • the basic array pattern C ′ indicates a pattern obtained by shifting the basic array pattern C by one pixel in the horizontal direction and the vertical direction
  • the basic array pattern C ′′ indicates a pattern obtained by shifting the basic array pattern C by two pixels each in the horizontal direction and the vertical direction.
  • the color filter 21 has the same color filter array even if the basic array patterns C ′ and C ′′ are repeatedly arranged in the horizontal direction and the vertical direction.
  • the imaging apparatus 100 has a phase difference AF function.
  • the image sensor 20 includes a plurality of phase difference detection pixels used when the phase difference AF function is activated.
  • the plurality of phase difference detection pixels are arranged in a predetermined pattern.
  • FIG. 7 schematically shows an example of a correspondence relationship between a part of the color filter 21 and a part of the pixels for detecting the phase difference.
  • the phase difference detection pixels are either first pixels L, in which the left half of the pixel in the horizontal direction is shielded, or second pixels R, in which the right half of the pixel in the horizontal direction is shielded.
  • hereinafter, when it is not necessary to distinguish between the first pixel L and the second pixel R, they are referred to as "phase difference pixels".
  • FIG. 8 shows an example of the first pixel L and the second pixel R arranged in the image sensor 20.
  • the first pixel L has a light shielding member 20A
  • the second pixel R has a light shielding member 20B.
  • the light shielding member 20A is provided on the front side (microlens L side) of the photodiode PD, and shields the left half of the light receiving surface.
  • the light shielding member 20B is provided on the front side of the photodiode PD and shields the right half of the light receiving surface.
  • the microlens L and the light shielding members 20A and 20B function as a pupil dividing unit: the first pixel L receives only the left side of the optical axis of the light beam passing through the exit pupil of the photographing lens 16, and the second pixel R receives only the right side of the optical axis of the light beam passing through the exit pupil of the photographing lens 16. In this way, the light beam passing through the exit pupil is divided into left and right by the microlens L and the light shielding members 20A and 20B, which serve as pupil dividing portions, and the divided beams enter the first pixel L and the second pixel R, respectively.
  • of the subject image corresponding to the left half of the light beam passing through the exit pupil of the photographing lens 16 and the subject image corresponding to the right half of the light beam, the in-focus portions form images at the same position on the image sensor 20, whereas the front-focused or back-focused portions are incident on different positions on the image sensor 20 (the phase is shifted).
  • the subject image corresponding to the left half light beam and the subject image corresponding to the right half light beam can be acquired as parallax images (left eye image and right eye image) having different parallaxes.
  • the imaging apparatus 100 detects a phase shift amount based on the pixel value of the first pixel L and the pixel value of the second pixel R by using the phase difference AF function. Then, the focal position of the photographing lens is adjusted based on the detected phase shift amount.
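The description does not spell out the detection algorithm; one common formulation is to slide the second-pixel signal against the first-pixel signal and take the displacement that minimizes the sum of absolute differences (SAD). A minimal sketch under that assumption:

```python
# Hypothetical sketch of phase-shift detection between a line of first
# (left) pixel values and second (right) pixel values: the shift that
# minimizes the mean absolute difference is taken as the phase shift.
# A shift of 0 corresponds to the in-focus state; the actual method
# used by the phase difference detection unit is not specified here.
def detect_phase_shift(left, right, max_shift=4):
    best_shift, best_score = 0, float("inf")
    n = len(left)
    for s in range(-max_shift, max_shift + 1):
        pairs = [(left[i], right[i + s]) for i in range(n) if 0 <= i + s < n]
        score = sum(abs(a - b) for a, b in pairs) / len(pairs)
        if score < best_score:
            best_shift, best_score = s, score
    return best_shift

left = [10, 20, 80, 90, 40, 30, 20, 10]
right = [20, 80, 90, 40, 30, 20, 10, 10]  # same pattern, displaced
print(detect_phase_shift(left, left))    # 0 (no phase shift)
print(detect_phase_shift(left, right))   # -1 (one-pixel displacement)
```

The sign and magnitude of the resulting shift correspond to the defocus direction and amount used for the focus adjustment described below in this section.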
  • hereinafter, when it is not necessary to distinguish between them, the light shielding members 20A and 20B are referred to as "light shielding members" without reference numerals.
  • the image sensor 20 is classified into a first pixel group, a second pixel group, and a third pixel group.
  • the first pixel group refers to a plurality of first pixels L, for example.
  • the second pixel group refers to a plurality of second pixels R, for example.
  • the third pixel group refers to, for example, a plurality of normal pixels (an example of a third pixel).
  • the “normal pixel” here refers to, for example, a pixel other than the phase difference pixel (for example, a pixel having no light shielding members 20A and 20B).
  • the RAW image output from the first pixel group is referred to as a “first image”
  • the RAW image output from the second pixel group is referred to as a “second image”
  • the RAW image output from the third pixel group is referred to as a "third image".
  • Each pixel included in the first and second pixel groups is arranged at a position where the positions in the horizontal direction are aligned within one pixel between the first pixel group and the second pixel group.
  • each pixel included in the first and second pixel groups is arranged at a position where the positions in the vertical direction are aligned within the one pixel between the first pixel group and the second pixel group.
  • the first pixels L and the second pixels R are alternately arranged linearly in the horizontal direction and the vertical direction at intervals of a plurality of pixels.
  • in the example shown in FIG. 7, the positions of the pixels included in the first and second pixel groups are aligned within one pixel in each of the horizontal and vertical directions, but it suffices that the positions fall within a predetermined number of pixels (for example, within two pixels) in at least one of the horizontal and vertical directions. However, it is preferable that the positions of the pixels included in the first and second pixel groups be aligned within one pixel in each of the horizontal and vertical directions, as shown in FIG. 7 as an example.
  • the phase difference pixels are provided for pixels of the G filters in the square array corresponding to 2 × 2 pixels. That is, in the example shown in FIG. 7, the pixel in the upper right corner (in the front view of the figure) of each 2 × 2 pixel group of G filters is assigned to a phase difference pixel, and the remaining pixels are assigned to normal pixels, so that normal pixels are arranged between the phase difference pixels. Further, in the example shown in FIG. 7, rows of phase difference pixels in which the first pixels L and the second pixels R are alternately arranged in the horizontal direction are grouped in sets of two rows, and the sets are arranged in the vertical direction at intervals of a predetermined number of pixels (eight pixels in the example shown in FIG. 7).
  • in this way, since the light shielding member is provided for the pixel in the upper right corner of each 2 × 2 pixel group of G filters and the phase difference pixels are spaced apart by a plurality of pixels in both the vertical and horizontal directions, the interpolation accuracy when interpolating the pixel values of the phase difference pixels from the pixel values of the normal pixels can be improved.
  • moreover, since the pixels included in the first to third pixel groups are arranged so that the normal pixels used for interpolation do not overlap between the phase difference pixels, a further improvement in interpolation accuracy can be expected.
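As a rough illustration of the interpolation described above (the actual interpolation filter and neighborhood are not specified in this description; simple averaging of surrounding non-phase-difference pixels is assumed here):

```python
# Illustrative sketch: estimate the value at a phase difference pixel
# from the average of surrounding normal pixels.  A real implementation
# would restrict the neighborhood to same-color (G filter) pixels
# according to the color filter array; this sketch assumes the given
# neighbors are already same-color normal pixels.
def interpolate_pd_pixel(img, r, c, pd_positions, radius=1):
    h, w = len(img), len(img[0])
    vals = [img[i][j]
            for i in range(max(0, r - radius), min(h, r + radius + 1))
            for j in range(max(0, c - radius), min(w, c + radius + 1))
            if (i, j) != (r, c) and (i, j) not in pd_positions]
    return sum(vals) / len(vals)

img = [[100, 100, 100],
       [100,   0, 100],  # center pixel is a phase difference pixel
       [100, 100, 100]]
print(interpolate_pd_pixel(img, 1, 1, {(1, 1)}))  # 100.0
```

Excluding other phase difference pixels from `pd_positions` mirrors the arrangement property above: the more normal pixels surround each phase difference pixel, the better the estimate.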
  • the image sensor 20 outputs a first image (a digital signal indicating the pixel value of each first pixel) from the first pixel group, and outputs a second image (a digital signal indicating the pixel value of each second pixel) from the second pixel group. Further, the image sensor 20 outputs a third image (a digital signal indicating the pixel value of each normal pixel) from the third pixel group. Note that the third image output from the third pixel group is a chromatic image, for example, a color image having the same color array as the array of the normal pixels.
  • the first image, the second image, and the third image output from the image sensor 20 are temporarily stored in a volatile storage area in the memory 26 via the interface unit 24.
  • the image processing unit 28 includes a normal processing unit 30.
  • the normal processing unit 30 processes the R, G, and B signals corresponding to the third pixel group to generate a chromatic color normal image that is an example of the first display image.
  • the image processing unit 28 includes a split image processing unit 32.
  • the split image processing unit 32 processes the G signal corresponding to the first pixel group and the second pixel group to generate an achromatic split image that is an example of a second display image.
  • the image processing unit 28 according to the first embodiment is realized by an ASIC (Application Specific Integrated Circuit), which is an integrated circuit in which a plurality of functions related to image processing are integrated into one.
  • the hardware configuration is not limited to this, and may be another hardware configuration such as a programmable logic device or a computer including a CPU, a ROM, and a RAM.
  • the hybrid viewfinder 220 has an LCD 247 that displays an electronic image.
  • the number of pixels in a predetermined direction on the LCD 247 (for example, the number of pixels in the horizontal direction, which is a parallax generation direction) is smaller than the number of pixels in the same direction on the display unit 213.
  • the display control unit 36A is connected to the display unit 213, and the display control unit 36B is connected to the LCD 247. By selectively controlling the LCD 247 and the display unit 213, an image is displayed on the LCD 247 or the display unit 213.
  • the display unit 213 and the LCD 247 are referred to as “display devices” when it is not necessary to distinguish between them.
  • the imaging apparatus 100 is configured to be able to switch between a manual focus mode and an autofocus mode by a dial 212 (focus mode switching unit).
  • the display control unit 36 causes the display device to display a live view image in which the split image is combined.
  • the CPU 12 operates as a phase difference detection unit and an automatic focus adjustment unit.
  • the phase difference detection unit detects a phase difference between the first image output from the first pixel group and the second image output from the second pixel group.
  • the automatic focus adjustment unit controls a lens driving unit (not shown) from the device control unit 22 via the mounts 256 and 346 so that the defocus amount of the photographing lens 16 becomes zero based on the detected phase difference, thereby moving the photographing lens 16 to the in-focus position.
  • the above “defocus amount” refers to, for example, the amount of phase shift between the first image and the second image.
  • the eyepiece detection unit 37 detects that a person (for example, a photographer) has looked into the viewfinder eyepiece 242 and outputs the detection result to the CPU 12. Therefore, the CPU 12 can grasp whether or not the finder eyepiece unit 242 is used based on the detection result of the eyepiece detection unit 37.
  • the external I / F 39 is connected to a communication network such as a LAN (Local Area Network) or the Internet, and controls transmission / reception of various information between the external device (for example, a printer) and the CPU 12 via the communication network. Therefore, when a printer is connected as an external device, the imaging apparatus 100 can output a captured still image to the printer for printing. Further, when a display is connected as an external device, the imaging apparatus 100 can output and display a captured still image or live view image on the display.
  • FIG. 9 is a functional block diagram illustrating an example of main functions of the imaging apparatus 100 according to the first embodiment.
  • the same components as those described above are denoted by the same reference numerals, and description thereof is omitted.
  • the normal processing unit 30 and the split image processing unit 32 each have a WB gain unit, a gamma correction unit, and a synchronization processing unit (not shown), and each processing unit sequentially performs signal processing on the original digital signal (RAW image) temporarily stored in the memory 26. That is, the WB gain unit executes white balance (WB) correction by adjusting the gains of the R, G, and B signals.
  • the gamma correction unit performs gamma correction on each of the R, G, and B signals that have been subjected to WB by the WB gain unit.
  • the synchronization processing unit performs color interpolation processing corresponding to the color filter array of the image sensor 20, and generates synchronized R, G, B signals.
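The per-pixel arithmetic of the first two stages can be sketched as follows (the gain and gamma values are illustrative assumptions; the embodiment does not give numeric values):

```python
# Illustrative sketch of the WB gain and gamma correction stages for a
# single RGB triple.  Gains and gamma are example values only.
def wb_gain(r, g, b, gains=(1.8, 1.0, 1.5)):
    # White balance: adjust the gain of each of the R, G, B signals.
    return r * gains[0], g * gains[1], b * gains[2]

def gamma_correct(value, gamma=2.2, max_val=255.0):
    # Gamma correction applied to each WB-corrected signal.
    return max_val * (min(value, max_val) / max_val) ** (1.0 / gamma)

r, g, b = wb_gain(60.0, 120.0, 70.0)
print([round(gamma_correct(v)) for v in (r, g, b)])  # [173, 181, 170]
```

The synchronization (demosaicing) stage then fills in the missing colors at each pixel by color interpolation according to the color filter array, as stated above.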
  • the normal processing unit 30 and the split image processing unit 32 perform image processing on the RAW image in parallel every time a RAW image for one screen is acquired by the image sensor 20.
  • the normal processing unit 30 receives the R, G, and B RAW images from the interface unit 24, and generates a normal image by interpolating the pixels at the positions of the first and second pixel groups with peripheral pixels of the same color in the third pixel group (for example, adjacent G pixels), as shown in FIG. 10 as an example. Thereby, a normal image for recording can be generated based on the third image output from the third pixel group.
  • the normal processing unit 30 outputs the generated image data of the normal image for recording to the encoder 34.
  • the R, G, B signals processed by the normal processing unit 30 are converted (encoded) into recording signals by the encoder 34 and recorded in the recording unit 40 (see FIG. 7).
  • a normal image for display that is an image based on the third image processed by the normal processing unit 30 is output to the display control unit 36.
  • hereinafter, when it is not necessary to distinguish between the normal image for recording and the normal image for display, the terms "for recording" and "for display" are omitted, and each is simply referred to as a "normal image".
  • the image sensor 20 can change the exposure conditions (as an example, the shutter speed by the electronic shutter) of each of the first pixel group and the second pixel group, and thereby can simultaneously acquire images having different exposure conditions. Therefore, the image processing unit 28 can generate an image with a wide dynamic range based on the images with different exposure conditions. In addition, a plurality of images can be simultaneously acquired under the same exposure condition, and by adding these images, a high-sensitivity image with little noise or a high-resolution image can be generated.
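The wide-dynamic-range idea can be sketched as follows (the blend rule, exposure ratio, and clip level are illustrative assumptions; the embodiment only states that images with different exposure conditions can be combined):

```python
# Hypothetical sketch of combining two simultaneously captured pixel
# values with different exposure conditions into a wider dynamic range:
# where the long exposure clips, substitute the rescaled short exposure.
def merge_exposures(long_px, short_px, exposure_ratio=4.0, clip=250.0):
    if long_px >= clip:                   # long exposure saturated here
        return short_px * exposure_ratio  # rescale the short exposure
    return long_px

print(merge_exposures(255.0, 80.0))  # 320.0 (beyond the 8-bit range)
print(merge_exposures(120.0, 30.0))  # 120.0 (long exposure usable)
```

Averaging (adding) multiple same-exposure captures instead would reduce noise, which is the high-sensitivity case mentioned above.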
  • the split image processing unit 32 extracts the G signals of the first pixel group and the second pixel group from the RAW image once stored in the memory 26, and generates an achromatic split image based on the G signals of the first pixel group and the second pixel group.
  • Each of the first pixel group and the second pixel group extracted from the RAW image is a pixel group including G filter pixels as described above. Therefore, the split image processing unit 32 can generate an achromatic left parallax image and an achromatic right parallax image based on the G signals of the first pixel group and the second pixel group.
  • the above “achromatic left parallax image” is referred to as a “left eye image”
  • the above “achromatic right parallax image” is referred to as a “right eye image”.
  • the split image processing unit 32 combines the left-eye image based on the first image output from the first pixel group and the right-eye image based on the second image output from the second pixel group to split the image. Generate an image.
  • the generated split image data is output to the display control unit 36.
  • the display control unit 36 generates display image data based on the recording image data corresponding to the third pixel group input from the normal processing unit 30 and the image data of the split image corresponding to the first and second pixel groups input from the split image processing unit 32. For example, the display control unit 36 combines the split image indicated by the image data input from the split image processing unit 32 into the display area of the normal image indicated by the recording image data corresponding to the third pixel group input from the normal processing unit 30. The image data obtained by the synthesis is then output to the display device; that is, the display control unit 36A outputs the image data to the display unit 213, and the display control unit 36B outputs the image data to the LCD 247.
  • the split image generated by the split image processing unit 32 is a multi-divided image obtained by combining a part of the left eye image and a part of the right eye image.
  • Examples of the “multi-divided image” mentioned here include split images shown in FIGS. 13A and 13B.
  • the split image shown in FIGS. 13A and 13B is an image obtained by combining the upper half image of the left-eye image and the lower half image of the right-eye image, and the two images divided in the vertical direction are shifted in a predetermined direction (for example, the parallax generation direction) according to the in-focus state.
  • the form of the split image is not limited to the examples shown in FIGS. 13A and 13B.
  • for example, the split image may be an image obtained by combining a part of the left-eye image and a part of the right-eye image at positions corresponding to the position of a predetermined area of the display unit 213. In this case, for example, an image divided into four in the vertical direction is shifted in a predetermined direction (for example, the parallax generation direction) according to the in-focus state.
  • the method of combining the split image with the normal image is not limited to the combining method of fitting the split image in place of a part of the normal image.
  • a synthesis method in which a split image is superimposed on a normal image may be used.
  • a combining method may be used in which the transmittance of a part of the normal image on which the split image is superimposed and the split image are appropriately adjusted and superimposed.
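The two composition methods above, fitting the split image into the normal image and superimposing it with adjustable transmittance, can be sketched as follows (grayscale images as nested lists; purely illustrative):

```python
# Illustrative sketch of the two composition methods described above.
def inset_split_image(normal, split, top, left):
    # Fit the split image in place of part of the normal image.
    out = [row[:] for row in normal]
    for i, row in enumerate(split):
        out[top + i][left:left + len(row)] = row
    return out

def superimpose_split_image(normal, split, top, left, alpha=0.5):
    # Superimpose with adjustable transmittance (alpha blending).
    out = [row[:] for row in normal]
    for i, row in enumerate(split):
        for j, v in enumerate(row):
            out[top + i][left + j] = (1 - alpha) * out[top + i][left + j] + alpha * v
    return out

normal = [[10] * 4 for _ in range(4)]
split = [[200, 200], [200, 200]]
print(inset_split_image(normal, split, 1, 1)[1])        # [10, 200, 200, 10]
print(superimpose_split_image(normal, split, 1, 1)[1])  # [10, 105.0, 105.0, 10]
```

With `alpha` close to 1 the superimposition approaches the insetting method; smaller values leave the normal image partially visible through the split image.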
  • the hybrid finder 220 includes an OVF 240 and an EVF 248.
  • the OVF 240 is an inverse Galileo finder having an objective lens 244 and an eyepiece 246, and the EVF 248 has an LCD 247, a prism 245, and an eyepiece 246.
  • a liquid crystal shutter 243 is disposed in front of the objective lens 244, and the liquid crystal shutter 243 shields light so that an optical image does not enter the objective lens 244 when the EVF 248 is used.
  • the prism 245 reflects an electronic image or various information displayed on the LCD 247 and guides it to the eyepiece 246, and combines the optical image and information (electronic image and various information) displayed on the LCD 247.
  • each time the switching lever is rotated, the mode is alternately switched between an OVF mode, in which an optical image can be visually recognized by the OVF 240, and an EVF mode, in which an electronic image can be visually recognized by the EVF 248.
  • in the OVF mode, the display control unit 36B controls the liquid crystal shutter 243 to be in a non-light-shielding state so that an optical image can be visually recognized from the eyepiece unit, and causes the LCD 247 to display only the split image. Thereby, a finder image in which the split image is superimposed on a part of the optical image can be displayed.
  • in the EVF mode, the display control unit 36B controls the liquid crystal shutter 243 to be in a light-shielding state so that only the electronic image displayed on the LCD 247 can be visually recognized from the eyepiece unit.
  • image data equivalent to the image data combined with the split image that is output to the display unit 213 is input to the LCD 247, so that an electronic image in which the split image is combined with a part of the normal image can be displayed in the same manner as on the display unit 213.
  • FIG. 11 shows an example of display areas of the normal image and the split image in the display device.
  • when the normal image and the split image are input, the display device displays the input split image within the rectangular frame at the center of the screen (the split image display area), and displays the input normal image in the outer peripheral area of the split image (the normal image display area).
  • when the normal image is input and the split image is not input, the display device displays the input normal image in the entire screen area (full screen display).
  • when the split image is input and the normal image is not input, the display device displays the input split image within the rectangular frame at the center of the screen shown in FIG. 11 as an example, and leaves the outer peripheral area blank. Note that the edge line representing the rectangular frame at the center of the screen is not actually displayed, but is drawn in FIG. 11 for convenience of explanation.
  • next, an image output process performed by the CPU 12 each time a normal image and a split image are generated by the image processing unit 28 (for example, each time a live view image for one frame is generated) will be described. The image output process is performed by the imaging apparatus 100 when the CPU 12 executes an image output processing program.
  • the image output processing program is stored in a predetermined storage area (for example, the memory 26).
  • the image processing unit 28 may execute the image output process.
  • first, in step 300, the CPU 12 determines whether or not display of the split image generated by the image processing unit 28 has been instructed via the operation unit 14.
  • the instruction by the operation unit 14 is preferably a holding type instruction.
  • the instruction by the operation unit 14 may be an instruction by a hard key or an instruction by a soft key.
  • as the hard key, an alternate operation type switch (holding type switch) is preferably applied; this refers to a switch that, when pushed into a predetermined position, is maintained in an operating state (in this case, a state in which display of the split image is instructed) until a release operation is performed.
  • the alternate operation type switch may be a lock type that is held in the pushed-in position when the pressing operation is released and returns to the original state when the pressing operation is performed again, or may be a non-lock type that returns freely to the original position when the pressing operation is released.
  • the CPU 12 causes the display unit 213 to display a soft key (an example of a second instruction unit) that is pressed when instructing display of the split image.
  • when the soft key is operated, it is determined that display of the split image has been instructed. If display of the split image is instructed via the operation unit 14 in step 300, the determination is affirmative and the routine proceeds to step 302; if not, the determination is negative and the process proceeds to step 312.
  • in step 302, the CPU 12 determines whether or not display of the normal image generated by the image processing unit 28 has been instructed via the operation unit 14.
  • the instruction by the operation unit 14 is preferably a non-holding type instruction.
  • the instruction by the operation unit 14 may be an instruction by a hard key or an instruction by a soft key.
  • as the hard key, a momentary operation type switch (non-holding type switch) is preferably applied.
  • it refers to a switch that maintains an operating state (here, as an example, a state instructing display of a normal image) only while being pushed into a predetermined position.
  • the momentary operation type switch may be a push-pull type that is pressed when instructing display of a normal image and is pulled out when canceling the instruction, or may be of other types.
  • a switch that realizes the same function as a hard key in a software configuration may be applied.
  • the CPU 12 causes the display unit 213 to display a soft key (an example of a first instruction unit) that is pressed when instructing display of a normal image. Then, it is determined that the display of the normal image is instructed when the soft key is operated. If the display of the normal image is instructed via the operation unit 14 in step 302, the determination is affirmed and the process proceeds to step 306. If the display of the normal image is not instructed via the operation unit 14 in step 302, the determination is negative and the process proceeds to step 308.
  • in step 306, the CPU 12 controls the image processing unit 28 to output the normal image to the display control units 36A and 36B and to discard the split image, and then proceeds to step 310.
  • when this step 306 is executed, the image processing unit 28 outputs the generated normal image to the display control units 36A and 36B, and discards the generated split image.
  • the display control unit 36A outputs the input normal image to the display unit 213, thereby causing the display unit 213 to display the normal image. In this case, the display unit 213 displays a normal image in the entire area of the screen.
  • the display control unit 36B outputs the input normal image to the LCD 247 to display the normal image on the LCD 247.
  • the LCD 247 displays a normal image in the entire area of the screen.
  • in step 308, the CPU 12 controls the image processing unit 28 to output the normal image and the split image to the display control units 36A and 36B, and then proceeds to step 310.
  • the image processing unit 28 outputs the generated normal image and split image to the display control units 36A and 36B.
  • the display control unit 36A causes the display unit 213 to display the input normal image and split image.
  • the display unit 213 displays a normal image in the normal image display area shown in FIG. 11 as an example, and displays the split image in the split image display area shown in FIG. 11 as an example.
  • the display control unit 36B causes the LCD 247 to display the input normal image and split image.
  • the LCD 247 displays the normal image in the normal image display area shown in FIG. 11 as an example, and displays the split image in the split image display area shown in FIG. 11 as an example.
  • in step 312, the CPU 12 performs the same processing as in step 306, and then proceeds to step 310.
  • in step 310, the CPU 12 determines whether or not a condition for ending the image output process (end condition) is satisfied.
  • examples of the "end condition" here include a condition that an instruction to end the image output process has been given via the operation unit 14, and a condition that the normal image and the split image are no longer generated by the image processing unit 28. If the end condition is not satisfied in step 310, the determination is negative and the process returns to step 300; if the end condition is satisfied, the determination is affirmative and the image output process ends.
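The branching of steps 300 to 312 can be summarized as a single decision function (names are illustrative; the mapping to the step numbers follows the description above):

```python
# Illustrative sketch of the per-frame image output decision
# (steps 300-312): given whether split image display and normal image
# display are instructed, decide what is sent to the display control units.
def decide_output(split_instructed, normal_instructed):
    if not split_instructed:
        return "normal only"   # step 312: same handling as step 306
    if normal_instructed:
        return "normal only"   # step 306: output normal, discard split
    return "normal + split"    # step 308: output both

print(decide_output(True, False))   # normal + split (live view with split)
print(decide_output(True, True))    # normal only
print(decide_output(False, False))  # normal only
```

The split image is therefore shown only when its display is instructed and display of the normal image alone is not simultaneously instructed.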
  • when step 308 is executed by the CPU 12, a live view image is displayed on the display unit 213 and the hybrid viewfinder 220, as shown in FIGS. 13A and 13B as an example.
  • the split image is displayed in the inner area of the frame 60 at the center of the screen, corresponding to the split image display area shown in FIG. 11, and the normal image is displayed in the outer area of the frame 60, corresponding to the normal image display area.
  • the split image is composed of the image (parallax image) of the upper half 60A of the frame 60 in the left-eye image corresponding to the first image output from the first pixel group, and the image (parallax image) of the lower half 60B of the frame 60 in the right-eye image corresponding to the second image output from the second pixel group.
  • when the photographing lens 16 is not focused on the subject corresponding to the image in the frame 60, the images at the boundary between the parallax image of the upper half 60A and the parallax image of the lower half 60B of the split image are shifted in the parallax generation direction (for example, the horizontal direction), as shown in FIG. 13A. The images at the boundary between the normal image and the split image are also shifted in the parallax generation direction. This indicates that a phase difference has occurred, and the photographer can visually recognize the phase difference and the parallax generation direction through the split image.
•   On the other hand, when the photographing lens 16 is focused on the subject corresponding to the image in the frame 60, as shown in FIG. 13B, the parallax image of the upper half 60A and the parallax image of the lower half 60B match at their boundary. The images at the boundary between the normal image and the split image also match. This indicates that no phase difference has occurred, and the photographer can visually recognize through the split image that there is no phase difference.
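•   The split image described above can be illustrated with a short sketch: the upper half is cut from the left-eye parallax image and the lower half from the right-eye parallax image, so any horizontal offset between the two becomes visible at the seam. The function name and the use of NumPy are illustrative assumptions:

```python
import numpy as np

def compose_split_image(left_eye: np.ndarray, right_eye: np.ndarray) -> np.ndarray:
    """Upper half 60A from the left-eye image, lower half 60B from the
    right-eye image; out-of-focus subjects appear shifted at the boundary."""
    h = left_eye.shape[0]
    return np.vstack([left_eye[: h // 2], right_eye[h // 2 :]])
```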
•   In this way, the photographer can check the in-focus state of the photographing lens 16 by means of the split image displayed on the display device, and can reduce the focus shift amount (defocus amount) to zero by manually operating the focus ring 302 of the photographing lens 16.
  • the normal image and the split image can be displayed as color images without color misregistration, and the manual focus adjustment by the photographer can be supported by the color split image.
•   As described above, while the display of the split image is instructed, the display device does not display the split image when the display of the normal image is instructed, and displays both the normal image and the split image when the display of the normal image is not instructed. The case where the display of the normal image is not instructed refers, for example, to the case where the display instruction for the normal image has been cancelled. Therefore, the imaging apparatus 100 according to the present embodiment can switch between displaying and hiding the split image at a more appropriate time than when this configuration is not provided.
•   In the present embodiment, the normal image and the split image are output from the image processing unit 28 to the display control unit 36, but they may instead be output via the external I/F 39 to an external device connected to the external I/F 39. When the external device is a storage device (for example, a server device), the normal image and the split image can be stored in the storage device. When the external device is an external display, the normal image and the split image can be displayed on the external display in the same manner as on the display device described above.
•   As described above, the imaging apparatus 100 includes, as an example, first and second pixel groups on which the subject images that have passed through the first and second regions, on the left and right sides of the optical axis of the light beam passing through the exit pupil of the photographing lens 16, are formed by pupil division.
•   The image processing unit 28, which is an example of a generation unit, generates a normal image, which is an example of a first display image, based on the image output from the image sensor 20 having the first and second pixel groups. Further, the operation unit 14 receives the instruction to display the normal image and the instruction to display the split image (steps 300 and 302).
•   When the display of the normal image is instructed via the operation unit 14 while the display of the split image is instructed, the image processing unit 28 outputs the generated normal image without outputting the generated split image (step 306).
•   When the normal image display instruction is cancelled via the operation unit 14 while the display of the split image is instructed, the image processing unit 28 outputs both the generated split image and the generated normal image (step 308).
•   The display control unit 36 controls the display device so as to display the normal image and to display the split image within the display area of the normal image. Thereby, the display and non-display of the split image can be switched at a more appropriate time than when this configuration is not provided. Further, since the split image is output when the instruction to display the normal image via the operation unit 14 is cancelled (step 302: N), the split image can be displayed more quickly than when this configuration is not provided.
•   In the imaging apparatus 100, a non-holding type instruction is applied as the instruction to display the normal image, while a holding type instruction is applied as the instruction to display the split image. This allows the display and non-display of the split image to be switched more quickly than when this configuration is not provided.
•   The imaging device 20 includes a third pixel group on which the subject image that has passed through the photographing lens 16 is formed without pupil division and which outputs a third image. The image processing unit 28 generates the normal image based on the third image output from the third pixel group. Thereby, the image quality of the normal image can be improved compared with a case without this configuration.
•   In each of the above embodiments, a single phase difference pixel is arranged with respect to a 2 × 2 pixel G filter, but the present invention is not limited to this. For example, a pair of a first pixel L and a second pixel R may be arranged with respect to a 2 × 2 pixel G filter: a pair of first and second pixels L and R adjacent in the horizontal direction may be arranged, or a pair of first and second pixels L and R adjacent in the vertical direction may be arranged.
•   In either case, it is preferable that the positions of the first pixel L and the second pixel R be aligned, in at least one of the vertical and horizontal directions, within a predetermined number of pixels between the first pixel group and the second pixel group. FIGS. 14 and 15 show an example in which the first pixel L and the second pixel R are arranged at positions aligned within one pixel in both the vertical and horizontal directions between the first pixel group and the second pixel group.
•   In each of the above embodiments, the color filter 21 having the basic arrangement pattern C is exemplified, but the present invention is not limited to this. For example, the arrangement of the primary colors (R filter, G filter, B filter) of the color filter may be a Bayer arrangement. In this case, phase difference pixels are arranged for the G filters.
•   In the color filter 21A shown in FIG. 16, as an example, a phase difference pixel is disposed at the center of an arrangement pattern G1, in which G filters are placed at the four corners and the center of a 3 × 3 pixel square matrix.
•   The first pixel L and the second pixel R are alternately arranged, skipping one pixel's worth of G filter in each of the horizontal and vertical directions (that is, with one pixel's worth of G filter in between). As a result, the first pixel L and the second pixel R are arranged at positions aligned within one pixel in the vertical and horizontal directions between the first pixel group and the second pixel group.
•   Since an image based on the phase difference pixel at the center of the arrangement pattern G1 can be interpolated using the images based on the normal pixels at the four corners of the arrangement pattern G1, interpolation accuracy can be improved compared with a case without this configuration.
•   Moreover, the arrangement patterns G1 are positioned so as not to overlap one another. That is, the first pixel L and the second pixel R are arranged at positions where the pixels of the first and second images, which are interpolated with the third image using pixels of the third pixel group adjacent to the pixels of the first and second pixel groups, do not overlap in pixel units. Therefore, an image based on one phase difference pixel is never interpolated using an image based on a normal pixel that is also used in the interpolation of an image based on another phase difference pixel, so a further improvement in interpolation accuracy can be expected.
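•   The interpolation described here can be illustrated as follows: the value at the central phase difference pixel of a pattern G1 is estimated from the four normal G pixels at the pattern's corners. This is a minimal sketch; the simple averaging scheme and the function name are assumptions, as the patent only states that the corner pixels can be used for interpolation:

```python
import numpy as np

def interpolate_center_pixel(raw: np.ndarray, y: int, x: int) -> float:
    """Estimate the G value at the phase difference pixel (y, x), the center
    of a 3x3 arrangement pattern G1, from the four corner normal pixels."""
    corners = (raw[y - 1, x - 1], raw[y - 1, x + 1],
               raw[y + 1, x - 1], raw[y + 1, x + 1])
    return float(np.mean(corners))
```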
•   In the color filter 21B shown in FIG. 17, as an example, phase difference pixels are arranged at the center and (in front view of the drawing) the lower-right corner of the arrangement pattern G1. Further, the first pixel L and the second pixel R are alternately arranged, skipping two pixels' worth of G filters in each of the horizontal and vertical directions (that is, with two pixels' worth of G filters in between). Thereby, the first pixel L and the second pixel R are arranged at positions aligned within one pixel in the vertical and horizontal directions between the first pixel group and the second pixel group, and the first pixel L and the second pixel R can be adjacent to each other. Therefore, image shift caused by factors other than defocus can be suppressed.
•   Moreover, the arrangement patterns G1 are positioned so as not to overlap one another. That is, the first pixel L and the second pixel R are arranged at positions where the pixels of the first and second images, which are interpolated with the third image using pixels of the third pixel group adjacent to the pixels of the first and second pixel groups, do not overlap in units of pixel pairs. Here, a “pixel pair” refers, for example, to the first pixel L and the second pixel R (a pair of phase difference pixels) included in the same arrangement pattern G1.
•   In the example shown in FIG. 18, the first pixel L is arranged skipping two pixels' worth of G filters in each of the horizontal and vertical directions, and the second pixel R is likewise arranged skipping two pixels' worth of G filters in each direction. Thereby, the first pixel L and the second pixel R are arranged at positions aligned within two pixels in the vertical and horizontal directions between the first pixel group and the second pixel group, and can be adjacent to each other. Therefore, image shift caused by factors other than defocus can be suppressed.
•   In the example shown in FIG. 18, as in the example shown in FIG. 17, the arrangement patterns G1 do not overlap one another. Therefore, an image based on one phase difference pixel is never interpolated using an image based on a normal pixel that is also used in the interpolation of an image based on another phase difference pixel, so a further improvement in interpolation accuracy can be expected.
  • FIG. 19 schematically shows an example of the arrangement of primary colors (R filter, G filter, B filter) of the color filter 21D provided in the image sensor 20 and the arrangement of the light shielding members.
•   In the example shown in FIG. 19, the constituent pixels of the image sensor 20 are classified into two types of pixel groups: a pixel group on the A plane, which is an example of a fourth pixel group, and a pixel group on the B plane, which is an example of a fifth pixel group. The pixel group on the A plane includes the first pixel group, the pixel group on the B plane includes the second pixel group, and each of the A-plane and B-plane pixel groups includes a third pixel group.
•   The A-plane pixel group and the B-plane pixel group are pixel groups whose constituent pixels are arranged alternately in the horizontal and vertical directions within the image sensor 20. Each of the A-plane and B-plane pixel groups is assigned primary colors in a Bayer arrangement by the color filter 21D, and the two pixel groups are shifted from each other by half a pixel (half pitch) in both the horizontal and vertical directions.
  • FIG. 20 schematically shows an example of the first pixel group and the second pixel group in the image sensor 20 shown in FIG.
  • the arrangement of the R filter, G filter, and B filter of the phase difference pixels in each of the first pixel group and the second pixel group is a Bayer arrangement.
•   Further, the first pixel L and the second pixel R are disposed adjacent to each other (at the minimum pitch) in pairs. Thereby, the phase difference between the first pixel group and the second pixel group can be calculated with higher accuracy than when this configuration is not provided.
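•   The phase difference between paired L/R pixel rows can be estimated, for example, by a simple correlation search. The following sketch is an assumed illustration, not taken from the patent: it finds the horizontal shift that minimizes the mean absolute difference between the two rows:

```python
import numpy as np

def estimate_phase_difference(line_l, line_r, max_shift: int = 8) -> int:
    """Return the shift s (in pixels) such that line_r[i] best matches
    line_l[i - s], by minimizing the mean absolute difference."""
    line_l = np.asarray(line_l, dtype=float)
    line_r = np.asarray(line_r, dtype=float)
    n = len(line_l)
    best_s, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:
            a, b = line_l[: n - s], line_r[s:]
        else:
            a, b = line_l[-s:], line_r[: n + s]
        err = np.mean(np.abs(a - b))
        if err < best_err:
            best_s, best_err = s, err
    return best_s
```

A zero result corresponds to the in-focus state; a non-zero result corresponds to a defocus-induced parallax.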
•   In the above example, the arrangement of the R, G, and B filters of the phase difference pixels in each of the first and second pixel groups is a Bayer arrangement, but the invention is not limited to this; some of the first pixels L in the first pixel group and some of the second pixels R in the second pixel group may be arranged adjacent to each other. Since a pixel provided with a G filter is more sensitive than pixels provided with filters of other colors, interpolation accuracy can be increased, and interpolation is easier because the G filters are arranged with continuity.
•   In each of the above embodiments, a split image divided in two in the vertical direction is exemplified, but the present invention is not limited to this; an image divided into a plurality of parts in the horizontal or diagonal direction may also be applied as the split image.
•   For example, the split image 66a shown in FIG. 22 is divided into odd-numbered and even-numbered lines by a plurality of dividing lines 63a parallel to the horizontal direction. In the split image 66a, line-shaped (for example, strip-shaped) phase difference images 66La generated based on the output signals from the first pixel group are displayed on the odd-numbered lines (the even-numbered lines are also acceptable), and line-shaped (for example, strip-shaped) phase difference images 66Ra generated based on the output signals from the second pixel group are displayed on the even-numbered lines.
•   The split image 66b shown in FIG. 23 is divided into two by a dividing line 63b inclined with respect to the horizontal direction (for example, a diagonal line of the split image 66b). In the split image 66b, the phase difference image 66Lb generated based on the output signals from the first pixel group is displayed in one region, and the phase difference image 66Rb generated based on the output signals from the second pixel group is displayed in the other region.
•   The split image 66c shown in FIGS. 24A and 24B is divided by lattice-shaped dividing lines 63c parallel to the horizontal and vertical directions. In the split image 66c, the phase difference images 66Lc generated based on the output signals from the first pixel group are displayed in a checkered (checker) pattern, and the phase difference images 66Rc generated based on the output signals from the second pixel group are displayed in the complementary checkered pattern.
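•   The checkered display of FIGS. 24A and 24B can be sketched as follows. The use of NumPy and the `cell` parameter (size of the checker squares) are illustrative assumptions:

```python
import numpy as np

def checker_split(img_l: np.ndarray, img_r: np.ndarray, cell: int = 2) -> np.ndarray:
    """Interleave the two phase difference images in a checkered pattern:
    img_l fills the 'white' cells, img_r fills the 'black' cells."""
    yy, xx = np.indices(img_l.shape)
    mask = ((yy // cell + xx // cell) % 2) == 0
    return np.where(mask, img_l, img_r)
```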
•   The split image is not limited to these examples; another focus confirmation image may be generated from the two phase difference images and displayed. For example, the two phase difference images may be superimposed and displayed as a composite image, so that the subject appears as a double image when out of focus and appears clearly when in focus.
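•   The superimposed composite described above can be sketched as a simple blend. The 50/50 weighting is an assumption; the text only says that the two images are superimposed:

```python
import numpy as np

def superimpose(img_l: np.ndarray, img_r: np.ndarray, alpha: float = 0.5) -> np.ndarray:
    """Blend the two phase difference images. A defocused subject, shifted
    between img_l and img_r, shows as a double image; when in focus, the
    two images coincide and the subject appears sharp."""
    return alpha * img_l + (1.0 - alpha) * img_r
```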
•   In the first embodiment, the output content of the image processing unit 28 is changed depending on whether or not the display of the normal image is instructed. In the second embodiment, a case will be described in which the output content of the image processing unit 28 is changed according to the operation state of the release switch 211.
•   In the second embodiment, a non-holding type switch is applied as an example of the release switch 211; an example of the non-holding type switch is a momentary operation type switch. Alternatively, a software key realizing the same function as the release switch 211 may be displayed on the display unit 213 by a software configuration, and the displayed software key may be operated by the user via the touch panel.
•   In the following, components identical to those of the first embodiment are denoted by the same reference numerals, and their description is omitted.
  • FIG. 25 is a flowchart showing an example of the flow of image output processing according to the second embodiment.
•   In the following, only the steps that differ from the flowchart shown in FIG. 12 are described; identical steps are denoted by the same step numbers, and their description is omitted.
  • the flowchart shown in FIG. 25 differs from the flowchart shown in FIG. 12 in that step 302A is applied instead of step 302 and that steps 350 to 360 are newly provided.
•   In step 302A, the CPU 12 determines whether or not the release switch 211 is in the half-pressed state (for example, held at the half-pressed position). If the release switch 211 is half-pressed in step 302A, the determination is affirmative and the process proceeds to step 306. If the release switch 211 is not half-pressed in step 302A (for example, if it is held at the standby position), the determination is negative and the process proceeds to step 308.
•   Here, “the release switch 211 is not half-pressed” includes, for example, the case where the release switch 211 is not pressed at all. However, the determination is not limited to this; it may instead be determined whether the release switch 211 has been half-pressed and an in-focus state has been reached. In that case, the case where the release switch 211 is half-pressed and an in-focus state is reached corresponds to the case where the display of the first display image is instructed by the first instruction unit according to the present invention.
•   In step 350, the CPU 12 determines whether or not the release switch 211 is in the fully-pressed state (for example, held at the fully-pressed position). If the release switch 211 is fully pressed in step 350, the determination is affirmative and the process proceeds to step 352. If the release switch 211 is not fully pressed in step 350 (for example, if the half-pressed state is maintained or the half-press has been released), the determination is negative and the process returns to step 302A.
•   In step 352, the CPU 12 performs control to start capturing a still image. When the imaging performed in accordance with step 352 is completed, the process proceeds to step 354.
•   In step 354, the CPU 12 determines whether or not the split image display instruction given via the operation unit 14 (an instruction unit other than the release switch 211, as an example) has been cancelled. If the split image display instruction given via the operation unit 14 has been cancelled, the determination is affirmative and the process proceeds to step 356. If it has not been cancelled, the determination is negative and the process proceeds to step 360.
•   In step 356, the CPU 12 performs the same processing as in step 306, and then proceeds to step 310.
•   In step 360, the CPU 12 performs the same processing as in step 308, and then proceeds to step 310.
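•   The per-frame decision of FIG. 25 can be sketched as follows. This is an illustrative state model with assumed names; 'standby', 'half', and 'full' stand for the positions of the release switch 211:

```python
def frame_output(switch_state: str, split_instructed: bool) -> tuple:
    """Sketch of steps 302A and 354-360: images output for one frame.

    switch_state:     'standby', 'half', or 'full' (release switch 211)
    split_instructed: split-image display instruction via the operation unit 14
    """
    if switch_state == "half":
        return ("normal",)          # step 302A -> step 306: split hidden
    if not split_instructed:
        return ("normal",)          # step 354 -> step 356 (after capture)
    return ("normal", "split")      # steps 308 / 360: both displayed
```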
•   As described above, when the image output process according to the second embodiment is performed by the CPU 12, the split image is not displayed on the display device while the release switch 211 is half-pressed, and only the normal image is displayed. Further, since the split image is displayed when the operation state of the release switch 211 is released while the split image is hidden (step 358: Y), the split image can be displayed quickly and at a more appropriate time than when this configuration is not provided.
•   In the example shown in FIG. 25, the display is switched when the state in which the release switch 211 is held at the half-pressed position is released. That is, when the release switch 211 is held at the fully-pressed position, imaging is performed (step 352), and when the split image display instruction given via the operation unit 14 is then cancelled, the CPU 12 controls the display device to display the normal image without the split image (step 356).
•   AF (autofocus)
•   In the example shown in FIG. 25, the process proceeds to step 356 when the determination in step 354 is affirmative and to step 360 when it is negative, but the invention is not limited to this. A step (additional step A) in which the CPU 12 determines whether or not the operation on the release switch 211 has been released may be inserted between step 354 and step 356, and a step performing similar processing (additional step B) may be inserted between step 354 and step 360.
•   Here, “operation on the release switch 211” refers, for example, to full-pressing or half-pressing of the release switch 211. That is, the state in which the operation on the release switch 211 has been released means, for example, a state in which the release switch 211 is neither fully pressed nor half-pressed.
•   In additional step A, if the release switch 211 is fully pressed, the determination is negative and the process proceeds to step 354. When the release switch 211 transitions from the fully-pressed state to the unoperated state, the determination is affirmative and the process proceeds to step 356. In this case, when the operation state of the release switch 211 is released, a live view image based on the normal image can be displayed on the display device.
•   Likewise, in additional step B, when the release switch 211 transitions from the fully-pressed state to the unoperated state, the determination is affirmative and the process proceeds to step 360. In this case, the split image can be displayed on the display device together with the live view image based on the normal image.
•   The present invention is not limited to the image output process described above; the CPU 12 may execute an image output process in which steps 354 and 356 are omitted from the image output process according to the second embodiment. In that case, when the release switch 211 is held at the fully-pressed position while the split image display instruction is in effect, the CPU 12 may perform control such that the normal image display instruction is cancelled and the split image is accordingly displayed on the display device together with the normal image. In this way, for example, in a scene where a plurality of images of the same subject are captured in succession, the setting for performing manual focus adjustment using the split image can be maintained even after imaging, which contributes to improved convenience for the user.
•   As described above, when the release switch 211 is held at the fully-pressed position (an example of a second instruction position) or at the standby position (that is, when the release switch 211 is not operated during live view imaging), the normal image is displayed and the split image is displayed in the normal image display area, so that the split image can be displayed at an appropriate time according to the operation state of the release switch 211.
•   In the first embodiment, the output content of the image processing unit 28 is changed depending on whether or not the display of the normal image is instructed. In the third embodiment, a case will be described in which the output content of the image processing unit 28 is changed according to the F value. In the following, only the differences from the first embodiment are described, and identical components are denoted by the same reference numerals, with their description omitted.
  • FIG. 26 is a flowchart showing an example of the flow of image output processing according to the third embodiment.
•   In the following, only the steps that differ from the flowchart shown in FIG. 12 are described; identical steps are denoted by the same step numbers, and their description is omitted.
  • the flowchart shown in FIG. 26 is different from the flowchart shown in FIG. 12 in that step 302B is applied instead of step 302.
•   In step 302B, the CPU 12 determines whether or not a predetermined F value (an example of a specific aperture value) has been set via the operation unit 14 (for example, a depth-of-field confirmation button) as the F value to be used when capturing a still image. If the predetermined F value has been set via the operation unit 14 in step 302B, the determination is affirmative and the process proceeds to step 306. If the predetermined F value has not been set, the determination is negative and the process proceeds to step 308.
•   When the image output process according to the third embodiment is performed by the CPU 12, the split image is not displayed on the display device and only the normal image is displayed while an F value set in advance as the F value to be used when capturing a still image has been set via the operation unit 14. When such an F value has not been set via the operation unit 14, both the normal image and the split image are displayed on the display device.
•   In the first embodiment, the output content of the image processing unit 28 is changed depending on whether or not the display of the normal image is instructed. In the fourth embodiment, a case will be described in which the output content of the image processing unit 28 is changed according to whether or not the EVF 248 is being used.
•   FIG. 27 is a flowchart showing an example of the flow of the image output process according to the fourth embodiment. In the following, only the steps that differ from the flowchart shown in FIG. 12 are described; identical steps are denoted by the same step numbers, and their description is omitted. The flowchart shown in FIG. 27 differs from the flowchart shown in FIG. 12 in that step 302C is applied instead of step 302.
•   In step 302C, the CPU 12 determines whether or not the user is using the EVF 248. This determination is made, for example, based on whether the detection result of the eyepiece detection unit 37 indicates that the viewfinder eyepiece unit 242 is in use. That is, when the viewfinder eyepiece unit 242 is determined to be in use, the user is determined to be using the EVF 248; when it is determined not to be in use, the user is determined not to be using the EVF 248. If the user is using the EVF 248 in step 302C, the determination is affirmative and the process proceeds to step 306. If the user is not using the EVF 248, the determination is negative and the process proceeds to step 308.
•   When the image output process according to the fourth embodiment is performed by the CPU 12, the normal image is displayed without the split image while the EVF 248 is in use, and both the normal image and the split image are displayed while the EVF 248 is not in use. Thereby, the split image can be displayed at an appropriate time according to the usage state of the EVF 248, compared with a case without this configuration.
•   The image output process according to the fourth embodiment may be performed in parallel with at least one of the image output processes according to the first to third embodiments, or at least two of the image output processes according to the first to third embodiments may be performed in parallel. In this case, for example, when the determination in at least one of steps 302, 302A, 302B, and 302C is affirmative, the normal image is displayed and the split image is discarded; when all of the determinations are negative, both the normal image and the split image are displayed.
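•   When the embodiments are combined in this way, the overall determination reduces to a logical OR. The following sketch is illustrative; the parameter names are assumptions mapping to steps 302, 302A, 302B, and 302C:

```python
def split_image_hidden(normal_instructed: bool, half_pressed: bool,
                       preset_f_set: bool, evf_in_use: bool) -> bool:
    """True when any of the determinations of steps 302, 302A, 302B, and
    302C is affirmative, i.e. the split image is discarded and only the
    normal image is displayed."""
    return normal_instructed or half_pressed or preset_f_set or evf_in_use
```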
•   Each process included in the image output processes described in the first to fourth embodiments may be realized by a software configuration, in which a computer executes a program, by a hardware configuration, or by a combination of both.
•   In the case of the software configuration, the program may be stored in advance in a predetermined storage area (for example, the memory 26), but it need not be stored in the memory 26 from the beginning. For example, the program may first be stored on an arbitrary "portable storage medium" connected to the computer, such as a flexible disk (so-called FD), a CD-ROM, a DVD disc, a magneto-optical disc, or an IC card, and the computer may then acquire the program from the portable storage medium and execute it. Alternatively, the program may be stored in another computer or a server device connected to the computer via the Internet, a LAN (Local Area Network), or the like, and the computer may acquire the program from there and execute it.
•   In each of the above embodiments, the imaging apparatus 100 is illustrated, but mobile terminal devices that are modifications of the imaging apparatus 100 include, for example, mobile phones and smartphones having a camera function, PDAs (Personal Digital Assistants), and portable game machines. In the following, a smartphone is taken as an example and described in detail with reference to the drawings.
  • FIG. 28 is a perspective view showing an example of the appearance of the smartphone 500.
•   The smartphone 500 illustrated in FIG. 28 has a flat housing 502 and includes, on one surface of the housing 502, a display input unit 520 in which a display panel 521 serving as a display unit and an operation panel 522 serving as an input unit are integrated. The housing 502 also includes a speaker 531, a microphone 532, an operation unit 540, and a camera unit 541. Note that the configuration of the housing 502 is not limited to this; for example, a configuration in which the display unit and the input unit are independent, or a configuration having a folding or sliding structure, may also be employed.
  • FIG. 29 is a block diagram showing an example of the configuration of the smartphone 500 shown in FIG.
•   As shown in FIG. 29, the main components of the smartphone 500 include a wireless communication unit 510, the display input unit 520, a call unit 530, the operation unit 540, the camera unit 541, a storage unit 550, and an external input/output unit 560. The smartphone 500 further includes a GPS (Global Positioning System) receiving unit 570, a motion sensor unit 580, a power supply unit 590, and a main control unit 501. In addition, the smartphone 500 has, as a main function, a wireless communication function for performing mobile wireless communication via a base station device BS and a mobile communication network NW.
•   The wireless communication unit 510 performs wireless communication with the base station device BS accommodated in the mobile communication network NW according to instructions from the main control unit 501. Using this wireless communication, it transmits and receives various file data such as audio data and image data, as well as e-mail data, and receives Web data, streaming data, and the like.
•   The display input unit 520 is a so-called touch panel, and includes the display panel 521 and the operation panel 522. Under the control of the main control unit 501, the display input unit 520 displays images (still images and moving images), character information, and the like to visually convey information to the user, and detects user operations on the displayed information. When viewing generated 3D content, the display panel 521 is preferably a 3D display panel.
  • the display panel 521 uses an LCD, OELD (Organic Electro-Luminescence Display), or the like as a display device.
•   The operation panel 522 is a device that is placed so that the image displayed on the display surface of the display panel 521 is visible, and that detects one or more coordinates operated by a user's finger or a stylus. When such a device is operated by a user's finger or a stylus, a detection signal generated by the operation is output to the main control unit 501. The main control unit 501 then detects the operation position (coordinates) on the display panel 521 based on the received detection signal.
  • the display panel 521 and the operation panel 522 of the smartphone 500 integrally form the display input unit 520, but the operation panel 522 is disposed so as to completely cover the display panel 521. ing.
  • the operation panel 522 may have a function of detecting a user operation even in an area outside the display panel 521.
  • the operation panel 522 includes a detection area (hereinafter referred to as a display area) for an overlapping portion that overlaps the display panel 521 and a detection area (hereinafter, a non-display area) for an outer edge portion that does not overlap the other display panel 521. May be included).
  • In other words, the operation panel 522 may have two sensitive regions: the outer edge portion and the inner portion. The width of the outer edge portion is designed as appropriate according to the size of the housing 502 and other factors.
  • Examples of the position detection method employed in the operation panel 522 include a matrix switch method, a resistive film method, a surface acoustic wave method, an infrared method, an electromagnetic induction method, and a capacitance method, any of which may be employed.
  • the call unit 530 includes a speaker 531 and a microphone 532.
  • the call unit 530 converts the user's voice input through the microphone 532 into voice data that can be processed by the main control unit 501 and outputs the voice data to the main control unit 501. Further, the call unit 530 decodes the audio data received by the wireless communication unit 510 or the external input / output unit 560 and outputs it from the speaker 531.
  • the speaker 531 can be mounted on the same surface as the surface on which the display input unit 520 is provided, and the microphone 532 can be mounted on the side surface of the housing 502.
  • the operation unit 540 is a hardware key using a key switch or the like, and receives an instruction from the user.
  • For example, the operation unit 540 is a push-button switch that is mounted on the side surface of the housing 502 of the smartphone 500, turned on when pressed with a finger or the like, and turned off by the restoring force of a spring or the like when the finger is released.
  • the storage unit 550 stores the control program and control data of the main control unit 501, application software, address data that associates the name and telephone number of the communication partner, and transmitted / received e-mail data.
  • the storage unit 550 stores Web data downloaded by Web browsing and downloaded content data.
  • the storage unit 550 temporarily stores streaming data and the like.
  • The storage unit 550 includes an internal storage unit 551 built into the smartphone and an external storage unit 552 having a removable external memory slot.
  • Each of the internal storage unit 551 and the external storage unit 552 constituting the storage unit 550 is realized using a storage medium such as a flash memory type, hard disk type, multimedia card micro type, or card type memory (for example, MicroSD (registered trademark) memory), or using RAM (Random Access Memory) or ROM (Read Only Memory).
  • The external input / output unit 560 serves as an interface with all external devices connected to the smartphone 500, and is used to connect directly or indirectly to other external devices through communication or a network. Examples of communication with other external devices include universal serial bus (USB) and IEEE 1394. Examples of the network include the Internet, wireless LAN, Bluetooth (registered trademark), RFID (Radio Frequency Identification), and infrared communication (Infrared Data Association: IrDA (registered trademark)). Other examples of the network include UWB (Ultra Wideband (registered trademark)) and ZigBee (registered trademark).
  • Examples of the external device connected to the smartphone 500 include a wired / wireless headset, wired / wireless external charger, wired / wireless data port, and a memory card connected via a card socket.
  • Other examples of external devices include SIM (Subscriber Identity Module) / UIM (User Identity Module) cards, and external audio / video devices connected via audio / video I / O (Input / Output) terminals.
  • an external audio / video device that is wirelessly connected can be used.
  • The external input / output unit can transmit data received from such an external device to each component inside the smartphone 500, and can transmit data inside the smartphone 500 to the external device.
  • The GPS receiving unit 570 receives GPS signals transmitted from the GPS satellites ST1 to STn in accordance with an instruction from the main control unit 501, performs positioning calculation processing based on the plurality of received GPS signals, and detects the position of the smartphone 500 consisting of latitude, longitude, and altitude.
  • When the GPS receiving unit 570 can acquire position information from the wireless communication unit 510 or the external input / output unit 560 (for example, a wireless LAN), it can also detect the position using that position information.
  • The motion sensor unit 580 includes, for example, a triaxial acceleration sensor, and detects the physical movement of the smartphone 500 in accordance with an instruction from the main control unit 501, thereby detecting the moving direction and acceleration of the smartphone 500. The detection result is output to the main control unit 501.
  • the power supply unit 590 supplies power stored in a battery (not shown) to each unit of the smartphone 500 in accordance with an instruction from the main control unit 501.
  • the main control unit 501 includes a microprocessor, operates according to a control program and control data stored in the storage unit 550, and controls each unit of the smartphone 500 in an integrated manner. Further, the main control unit 501 includes a mobile communication control function for controlling each unit of the communication system and an application processing function in order to perform voice communication and data communication through the wireless communication unit 510.
  • the application processing function is realized by the main control unit 501 operating according to the application software stored in the storage unit 550.
  • Examples of the application processing function include an infrared communication function that controls the external input / output unit 560 to perform data communication with a counterpart device, an e-mail function that transmits and receives e-mail, and a Web browsing function that browses Web pages.
  • the main control unit 501 has an image processing function such as displaying video on the display input unit 520 based on image data (still image data or moving image data) such as received data or downloaded streaming data.
  • the image processing function is a function in which the main control unit 501 decodes the image data, performs image processing on the decoding result, and displays an image on the display input unit 520.
  • the main control unit 501 executes display control for the display panel 521 and operation detection control for detecting a user operation through the operation unit 540 and the operation panel 522.
  • By executing the display control, the main control unit 501 displays icons for starting application software, software keys such as scroll bars, and windows for creating e-mail.
  • the scroll bar refers to a software key for accepting an instruction to move a display portion of an image, such as a large image that does not fit in the display area of the display panel 521.
  • By executing the operation detection control, the main control unit 501 detects a user operation through the operation unit 540, accepts an operation on an icon or an input of a character string in an input field of a window through the operation panel 522, and accepts a scroll request for the displayed image through the scroll bar.
  • Furthermore, the main control unit 501 has a touch panel control function that determines whether an operation position on the operation panel 522 is in the portion overlapping the display panel 521 (display region) or in the outer edge portion not overlapping the display panel 521 (non-display region), and that controls the sensitive region of the operation panel 522 and the display positions of the software keys.
  • the main control unit 501 can also detect a gesture operation on the operation panel 522 and execute a preset function according to the detected gesture operation.
  • A gesture operation means not a conventional simple touch operation but an operation of drawing a trajectory with a finger or the like, designating a plurality of positions simultaneously, or, as a combination of these, drawing a trajectory from at least one of a plurality of positions.
  • the camera unit 541 is a digital camera that captures an image using an image sensor such as a CMOS or a CCD, and has the same function as the image capturing apparatus 100 shown in FIG.
  • the camera unit 541 can switch between a manual focus mode and an autofocus mode.
  • the photographing lens of the camera unit 541 can be focused by operating a focus icon button or the like displayed on the operation unit 540 or the display input unit 520.
  • In the manual focus mode, a live view image combined with the split image is displayed on the display panel 521 so that the in-focus state during manual focusing can be confirmed.
  • The camera unit 541 converts the image data obtained by imaging into compressed image data such as JPEG (Joint Photographic Experts Group) under the control of the main control unit 501.
  • The converted image data can be recorded in the storage unit 550 or output through the external input / output unit 560 or the wireless communication unit 510.
  • the camera unit 541 is mounted on the same surface as the display input unit 520, but the mounting position of the camera unit 541 is not limited to this, and the camera unit 541 may be mounted on the back surface of the display input unit 520.
  • A plurality of camera units 541 may be mounted. When a plurality of camera units 541 are mounted, imaging may be performed with a single camera unit 541 selected by switching, or with a plurality of camera units 541 used simultaneously.
  • the camera unit 541 can be used for various functions of the smartphone 500.
  • an image acquired by the camera unit 541 can be displayed on the display panel 521, or the image of the camera unit 541 can be used as one of operation inputs of the operation panel 522.
  • When the GPS receiving unit 570 detects the position, the position can also be detected with reference to an image from the camera unit 541.
  • Furthermore, with reference to an image from the camera unit 541, the optical axis direction of the camera unit 541 of the smartphone 500 can be determined without using the triaxial acceleration sensor, or in combination with the triaxial acceleration sensor, and the current usage environment can also be determined.
  • the image from the camera unit 541 can be used in the application software.
  • Furthermore, various information can be added to still image or moving image data, and the result can be recorded in the storage unit 550 or output through the external input / output unit 560 or the wireless communication unit 510.
  • Examples of the “various information” here include position information acquired by the GPS receiving unit 570 for the still image or moving image, and audio information acquired by the microphone 532 (which may be converted into text information by speech-to-text conversion performed by the main control unit or the like).
  • posture information acquired by the motion sensor unit 580 may be used.
  • In each of the above embodiments, the image pickup device 20 having the first to third pixel groups has been illustrated, but the present invention is not limited to this; an image sensor consisting of only the first pixel group and the second pixel group may be used. A digital camera having this type of image sensor can generate a three-dimensional image (3D image) based on the first image output from the first pixel group and the second image output from the second pixel group, and can also generate a two-dimensional image (2D image). In this case, the two-dimensional image is generated, for example, by performing interpolation processing between pixels of the same color in the first image and the second image.
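The same-color interpolation described above can be illustrated with a minimal sketch. This is not the patented implementation; the function name and the assumption that the two images are aligned arrays of same-color pixel values are introduced here for illustration only:

```python
import numpy as np

def interpolate_2d(first_image, second_image):
    # Hypothetical sketch: average each pair of same-color pixels from
    # the first (left-pupil) and second (right-pupil) images to obtain
    # one ordinary 2D pixel value.
    a = first_image.astype(np.float32)
    b = second_image.astype(np.float32)
    return (a + b) / 2.0
```

Averaging recombines the two half-pupil exposures, which is one simple way such interpolation between same-color pixels could be realized.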
  • Moreover, the split image is not limited to the above examples; when an image sensor consisting only of a phase difference pixel group (for example, the first pixel group and the second pixel group) is used, or when an image sensor in which phase difference pixels are arranged at a predetermined ratio with respect to the normal pixels is used, a split image based on the images output from the phase difference pixel groups (for example, the first image output from the first pixel group and the second image output from the second pixel group) may be employed.
  • The present invention is also not limited to a mode in which both the normal image and the split image are simultaneously displayed on the same screen of the display device; when the display instruction for the normal image is canceled while the display of the split image is instructed, the display control unit 36 may perform control to display the split image without displaying the normal image on the display device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)
  • Automatic Focus Adjustment (AREA)

Abstract

Provided are an image processing device, an imaging device, an image processing method, and an image processing program with which switching between displaying and not displaying an image used to confirm focus can be performed at an appropriate time. An operation unit instructs the output of a normal image and instructs the output of a split image (steps 300, 302). When the display instruction for the normal image is cancelled by the operation unit while the operation unit instructs the output of the split image, an image processing unit outputs the generated split image (step 308). Furthermore, when the output of the normal image is instructed by the operation unit while the operation unit instructs the output of the split image, the image processing unit outputs the generated normal image without outputting the generated split image (step 306).
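The selection among steps 300-308 in the abstract can be modeled as a small decision function. This is an informal sketch with invented names, not the claimed apparatus:

```python
def select_output(split_instructed, normal_instructed, normal_image, split_image):
    # While split-image output is instructed (steps 300, 302):
    #  - instructing normal-image display yields the normal image only (step 306);
    #  - cancelling that instruction yields the split image (step 308).
    if split_instructed and not normal_instructed:
        return split_image
    return normal_image
```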

Description

Image processing apparatus, imaging apparatus, image processing method, and image processing program
 The present invention relates to an image processing apparatus, an imaging apparatus, an image processing method, and an image processing program.
 Digital cameras are widely known that, in addition to autofocus using a phase difference detection method or a contrast detection method, have a so-called manual focus mode in which the user can manually adjust the focus.
 As digital cameras having a manual focus mode, there are known cameras that employ a method using a split microprism screen in which a reflex mirror is provided so that focus adjustment can be performed while checking the subject, and the phase difference is displayed visually. Also known are cameras that employ a method of visually confirming contrast.
 Incidentally, in digital cameras without a reflex mirror, which have become widespread in recent years, there is no way to check the subject image while displaying the phase difference since there is no reflex mirror, and it has been necessary to rely on the contrast detection method. In this case, however, contrast cannot be displayed at a resolution higher than that of a display device such as an LCD (liquid crystal display), and methods such as partially enlarging the display have had to be adopted.
 Therefore, in recent years, a split image has been displayed within a live view image (also referred to as a through image) in order to make it easier for the operator to focus on the subject in the manual focus mode. Here, a split image refers to, for example, a divided image divided into two (for example, images divided in the vertical direction); the divided images are shifted in the parallax generation direction (for example, the horizontal direction) according to the focus shift, and the shift in the parallax generation direction disappears when in focus. The operator (for example, the photographer) adjusts the focus by operating the manual focus ring so that the shift between the split images (for example, the images divided in the vertical direction) disappears.
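A two-way split image of the kind described above can be sketched as stacking halves of the two parallax images; a defocused subject then appears laterally offset at the seam. The function and array names are assumptions for illustration, not this document's implementation:

```python
import numpy as np

def make_split_image(left_view, right_view):
    # Upper half from the left-pupil image, lower half from the
    # right-pupil image; when in focus, the seam shows no lateral shift.
    h = left_view.shape[0] // 2
    return np.vstack([left_view[:h], right_view[h:]])
```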
 A manual focus device described in Japanese Patent Application Laid-Open No. 2004-40740 (hereinafter referred to as Patent Document 1) has aperture moving means for moving an aperture stop on the subject optical path within a plane perpendicular to the optical axis, and storage means for storing two subject images respectively captured at two distance measuring positions to which the aperture stop is moved. It also has display means that outputs a split image in which the two subject images are combined and displays whether the focus state is appropriate.
 An imaging apparatus described in Japanese Patent Application Laid-Open No. 2009-147665 (hereinafter referred to as Patent Document 2) generates a first image and a second image by photoelectrically converting a first subject image and a second subject image formed by light beams divided by a pupil dividing unit out of the light beams from the imaging optical system. A split image is generated using the first and second images, and a third image is generated by photoelectrically converting a third subject image formed by a light beam not divided by the pupil dividing unit. The third image is displayed on the display unit, the generated split image is displayed within the third image, and color information extracted from the third image is added to the split image. By adding the color information extracted from the third image to the split image in this way, the visibility of the split image can be improved.
 An imaging apparatus described in Japanese Patent Application Laid-Open No. 2009-163220 (hereinafter referred to as Patent Document 3) includes processing means for causing display means to display a superimposed image in which a first image and a second image obtained by division by a pupil dividing unit are superimposed.
 A digital camera described in Japanese Patent Application Laid-Open No. 2001-309210 (hereinafter referred to as Patent Document 4) includes display changing means for detecting the amount of deviation between the focus position and the subject position and changing the display content of the split image according to the amount of deviation.
 An imaging apparatus described in Japanese Patent Application Laid-Open No. 2009-237214 (hereinafter referred to as Patent Document 5) has a configuration in which, in the manual focus mode, the split image is erased from the screen when the aperture preview button is operated, and the split image is displayed when the aperture preview button is not operated.
 However, with all of the techniques described in Patent Documents 1 to 5, the split image may continue to be displayed even though a time has arrived when the user no longer needs the split image. Conversely, the split image may not be displayed even though a time has arrived when the user needs the split image. Thus, the techniques described in Patent Documents 1 to 5 all have the problem that it is difficult to switch between the display state and the non-display state of the split image at an appropriate time.
 The present invention has been proposed in view of such circumstances, and an object thereof is to provide an image processing apparatus, an imaging apparatus, an image processing method, and an image processing program capable of switching between display and non-display of an image used for focus confirmation at an appropriate time.
 An image processing apparatus according to a first aspect of the present invention includes: a generation unit that generates a first display image based on an image signal output from an image sensor having first and second pixel groups on which subject images that have passed through first and second regions of a photographing lens are pupil-divided and respectively formed, and that generates a second display image used for focus confirmation based on first and second image signals output from the first and second pixel groups; a first instruction unit that instructs display of the first display image and performs a non-holding type instruction; a second instruction unit that instructs display of the second display image; a display unit that displays images; and a display control unit that, when display of the first display image is instructed by the first instruction unit while display of the second display image is instructed by the second instruction unit, performs control to cause the display unit to display the first display image generated by the generation unit without displaying the second display image generated by the generation unit, and that, when the display instruction of the first display image by the first instruction unit is canceled while display of the second display image is instructed by the second instruction unit, performs control to cause the display unit to display the first display image generated by the generation unit and to display the second display image generated by the generation unit within the display area of the first display image. This makes it possible to switch between display and non-display of the image used for focus confirmation at an appropriate time, compared with a case without this configuration.
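The display control of the first aspect can be sketched as follows. This is a toy model with invented names (not the claimed implementation): while the second (focus-confirmation) image is instructed, a held first-image instruction shows the first image alone, and releasing it embeds the second image within the first image's display area:

```python
import numpy as np

def control_display(first_img, second_img, first_instructed):
    # Sketch of the first aspect's branch logic, assuming the second
    # display image is embedded centered in the first image's area.
    out = first_img.copy()
    if not first_instructed:
        h, w = second_img.shape[:2]
        H, W = first_img.shape[:2]
        top, left = (H - h) // 2, (W - w) // 2
        out[top:top + h, left:left + w] = second_img
    return out
```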
 According to a second aspect of the present invention, in the first aspect, the instruction by the second instruction unit may be a holding type instruction. This makes it possible to reduce the effort of switching between display and non-display of the second display image, compared with a case without this configuration.
 According to a third aspect of the present invention, in the first or second aspect, the image sensor may further have a third pixel group on which a subject image transmitted through the photographing lens is formed without pupil division and which outputs a third image, and the generation unit may generate the first display image based on the third image output from the third pixel group. This makes it possible to improve the image quality of the first display image with a simple configuration, compared with a case without this configuration.
 According to a fourth aspect of the present invention, in the third aspect, the generation unit may generate the first display image based on the third image and on an image obtained by interpolating the first and second images based on the third image. This makes it possible to further improve the image quality of the first display image with a simple configuration, compared with a case without this configuration.
 An imaging apparatus according to a fifth aspect of the present invention includes: the image processing apparatus according to any one of the first to fourth aspects of the present invention; an image sensor having the first and second pixel groups; and a storage unit that stores images output from the image sensor. This makes it possible to switch between display and non-display of the image used for focus confirmation at an appropriate time, compared with a case without this configuration.
 According to a sixth aspect of the present invention, in the fifth aspect, the first instruction unit may be a release switch movable to a first instruction position for instructing adjustment of imaging conditions and a second instruction position for instructing the start of imaging; the case where the release switch is held at the first instruction position may be regarded as the case where display of the first display image is instructed, and the display may be switched when the state in which the release switch is held at the first instruction position is released. This makes it possible to quickly display the second display image at an appropriate time, compared with a case without this configuration.
 According to a seventh aspect of the present invention, in the sixth aspect, an imaging operation may be started when the release switch is held at the second instruction position, and the display control unit may perform control such that, when the release switch is held at the second instruction position, the display instruction of the second display image by the second instruction unit is canceled and the first display image is displayed on the display unit after the imaging operation ends. This improves convenience for a user who wishes to cancel the display of the second display image and restore the first display image after imaging, compared with a case without this configuration.
 According to an eighth aspect of the present invention, in the sixth aspect, an imaging operation may be started when the release switch is held at the second instruction position, and the case where the release switch is held at the second instruction position may be regarded as the case where the display instruction of the first display image is canceled. This improves convenience for a user who wishes to retain the display of the second display image even after imaging, compared with a case without this configuration.
 According to a ninth aspect of the present invention, in any one of the fifth to eighth aspects, the first instruction unit may be a setting unit that sets the aperture value of the photographing lens; the case where a specific aperture value is set by the setting unit may be regarded as the case where display of the first display image is instructed, and the case where the specific aperture value is not set by the setting unit may be regarded as the case where the display instruction of the first display image is canceled. This makes it possible to display the second display image at an appropriate time according to the aperture value, compared with a case without this configuration.
 According to a tenth aspect of the present invention, in any one of the fifth to ninth aspects, the imaging apparatus may further include an electronic viewfinder capable of displaying the first display image and the second display image; the first instruction unit may have a detection unit that detects use of the electronic viewfinder, and the case where use of the electronic viewfinder is not detected by the detection unit may be regarded as the case where display of the first display image is instructed, while the case where use of the electronic viewfinder is detected may be regarded as the case where the display instruction of the first display image is canceled. This makes it possible to display the second display image at an appropriate time according to the usage state of the electronic viewfinder, compared with a case without this configuration.
 According to an eleventh aspect of the present invention, in any one of the fifth to tenth aspects, the image sensor may further have a third pixel group on which a subject image transmitted through the photographing lens is formed without pupil division, and in the image sensor, the pixels included in the first and second pixel groups may be arranged in a matrix together with the third pixel group, and arranged at positions where the positional difference between the first pixel group and the second pixel group in at least one of the row direction and the column direction falls within a predetermined number of pixels. This makes it possible to improve the image quality, compared with a case without this configuration.
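Purely as an illustration of the placement constraint in the eleventh aspect (function and parameter names are invented), a check that paired pixels of the first and second pixel groups fall within a predetermined number of pixels of each other might look like:

```python
def within_limit(pos1, pos2, limit):
    # pos1, pos2: (row, column) of a first-group pixel and its paired
    # second-group pixel; both offsets must fit within `limit` pixels.
    (r1, c1), (r2, c2) = pos1, pos2
    return abs(r1 - r2) <= limit and abs(c1 - c2) <= limit
```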
 According to a twelfth aspect of the present invention, in the eleventh aspect, in the imaging element, the pixels included in the first and second pixel groups may further be arranged so that the first pixel group and the second pixel group are adjacent to each other, pixel by pixel, in a predetermined direction. Accordingly, image quality can be further improved with a simple configuration compared with a configuration lacking this feature.
 According to a thirteenth aspect of the present invention, in the eleventh or twelfth aspect, in the imaging element, the pixels included in the first and second pixel groups may further be arranged between pixels included in the third pixel group, and the device may further include a color filter that is provided over the first to third pixel groups and that assigns a specific primary color to the first and second pixel groups and to the pixels of the third pixel group adjacent to the pixels included in the first and second pixel groups. Accordingly, image quality can be further improved with a simple configuration compared with a configuration lacking this feature.
 According to a fourteenth aspect of the present invention, in the thirteenth aspect, the specific primary color may be green. Accordingly, image quality can be further improved with a simple configuration compared with a configuration lacking this feature.
 According to a fifteenth aspect of the present invention, in the thirteenth or fourteenth aspect, the primary colors of the color filter may be arranged in a Bayer array. Accordingly, image quality can be further improved with a simple configuration compared with a configuration lacking this feature.
 According to a sixteenth aspect of the present invention, in the eleventh or twelfth aspect, in the imaging element, the pixels included in the first and second pixel groups may further be arranged, together with the third pixel group, in a matrix shifted by half a pixel, and between the pixels included in the third pixel group; the device may further include a color filter that is provided over fourth and fifth pixel groups obtained by classifying the constituent pixels of the imaging element into two types, the constituent pixels being arranged alternately shifted by half a pixel in the row direction and the column direction, and that assigns Bayer-array primary colors to each of the fourth and fifth pixel groups. Accordingly, image quality can be further improved with a simple configuration compared with a configuration lacking this feature.
 According to a seventeenth aspect of the present invention, in the sixteenth aspect, the first and second pixel groups may be arranged at the positions of green filters in the color filter. Accordingly, image quality can be further improved with a simple configuration compared with a configuration lacking this feature.
 An image processing method according to an eighteenth aspect of the present invention includes: a generation step of generating a first display image based on an image signal output from an imaging element having first and second pixel groups on which subject images that have passed through first and second regions of a photographing lens are respectively formed after being pupil-divided, and generating, based on first and second image signals output from the first and second pixel groups, a second display image used for focus confirmation; a first instruction step of instructing display of the first display image, the instruction being of a non-holding type; a second instruction step of instructing display of the second display image; and a display control step of performing control, when display of the first display image is instructed by the first instruction step in a state where display of the second display image is instructed by the second instruction step, to cause a display unit that displays images to display the first display image generated by the generation step without displaying the second display image generated by the generation step, and performing control, when the display instruction for the first display image by the first instruction step is canceled in a state where display of the second display image is instructed by the second instruction step, to cause the display unit to display the first display image generated by the generation step and to display, within the display area of the first display image, the second display image generated by the generation step. Accordingly, compared with a configuration lacking this feature, display and non-display of the image used for focus confirmation can be switched at an appropriate time.
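The display-control branching of this method can be summarized in a small sketch. The function name and return values below are illustrative assumptions, not the patent's implementation; states the text does not specify (for example, neither instruction active) default here to the normal live view.

```python
def select_display(first_instructed: bool, second_instructed: bool):
    """Return which images the display unit shows, per the eighteenth aspect.

    first_instructed  -- non-holding instruction to display the first display image
    second_instructed -- instruction to display the second display image

    While the second display image is instructed:
      * first instruction active   -> show only the first display image
      * first instruction released -> show the first display image with the
        second display image inside its display area
    """
    if not second_instructed:
        return ("first",)               # nothing to overlay
    if first_instructed:
        return ("first",)               # suppress the focus-confirmation image
    return ("first", "second-in-area")  # overlay the split image for focus check
```

The point of the rule is that the non-holding first instruction temporarily hides the focus-confirmation image, which reappears as soon as that instruction is released.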
 An image processing program according to a nineteenth aspect of the present invention causes a computer to function as the generation unit and the display unit of the image processing device according to any one of claims 1 to 4. Accordingly, compared with a configuration lacking this feature, display and non-display of the image used for focus confirmation can be switched at an appropriate time.
 According to the present invention, an effect is obtained in which display and non-display of an image used for focus confirmation can be switched at an appropriate time.
A perspective view showing an example of the external appearance of an imaging device, which is an interchangeable-lens camera, according to a first embodiment.
A rear view showing the back side of the imaging device shown in FIG. 1.
A block diagram showing an example of the configuration of the electrical system of the imaging device according to the first embodiment.
A schematic layout diagram showing an example of the arrangement of color filters provided on an imaging element included in the imaging device according to the first embodiment.
A diagram for explaining a method of determining a correlation direction from the pixel values of 2 × 2 G pixels included in the color filter shown in FIG. 4.
A diagram for explaining the concept of a basic array pattern included in the color filter shown in FIG. 4.
A schematic configuration diagram showing an example of the arrangement of phase-difference pixels in the imaging element included in the imaging device according to the first embodiment.
A schematic configuration diagram showing an example of the configuration of the phase-difference pixels (first pixel and second pixel) included in the imaging element of the imaging device according to the first embodiment.
A block diagram showing an example of the main functions of the imaging device according to the first embodiment.
A schematic diagram showing an example of a manner in which phase-difference pixels in the imaging element included in the imaging device according to the first embodiment are interpolated with normal pixels.
A schematic diagram showing an example of the positions of the split-image display area and the normal-image display area in the display device included in the imaging device according to the first embodiment.
A flowchart showing an example of the flow of image output processing according to the first embodiment.
A screen diagram showing an example of a live view image displayed on the display unit of the imaging device according to the first embodiment in an out-of-focus state.
A screen diagram showing an example of a live view image displayed on the display unit of the imaging device according to the first embodiment in an in-focus state.
A schematic configuration diagram showing a modification of the arrangement of phase-difference pixels in the imaging element included in the imaging device according to the first embodiment.
A schematic configuration diagram showing a modification of the arrangement of phase-difference pixels in the imaging element included in the imaging device according to the first embodiment.
A schematic configuration diagram showing a modification of the arrangement of phase-difference pixels in the imaging element included in the imaging device according to the first embodiment.
A schematic configuration diagram showing a modification of the arrangement of phase-difference pixels in a case where the primary colors of the color filter applied to the imaging element included in the imaging device according to the first embodiment are arranged in a Bayer array.
A schematic configuration diagram showing a modification of the arrangement of phase-difference pixels in a case where the primary colors of the color filter applied to the imaging element included in the imaging device according to the first embodiment are arranged in a Bayer array.
A schematic configuration diagram showing a modification of the arrangement of phase-difference pixels in the imaging element included in the imaging device according to the first embodiment.
A schematic configuration diagram showing an example of the configuration of the phase-difference pixels (first pixel and second pixel) included in the imaging element shown in FIG. 19.
A schematic configuration diagram showing a modification of the arrangement of the phase-difference pixels included in the imaging element shown in FIG. 19.
A schematic diagram showing a modification of the split image according to the first embodiment, in which a first image and a second image are divided into odd lines and even lines and arranged alternately.
A schematic diagram showing a modification of the split image according to the first embodiment, which is divided by an oblique dividing line inclined with respect to the horizontal direction.
A schematic diagram showing a modification of the split image according to the first embodiment, which is divided by grid-like dividing lines.
A schematic diagram showing a modification of the split image according to the first embodiment, which is formed in a checkered pattern.
A flowchart showing an example of the flow of image output processing according to a second embodiment.
A flowchart showing an example of the flow of image output processing according to a third embodiment.
A flowchart showing an example of the flow of image output processing according to a fourth embodiment.
A perspective view showing an example of the external appearance of a smartphone according to a fifth embodiment.
A block diagram showing an example of the configuration of the main part of the electrical system of the smartphone according to the fifth embodiment.
 Hereinafter, an example of an embodiment of an imaging device according to the present invention will be described with reference to the accompanying drawings.
 [First Embodiment]
 FIG. 1 is a perspective view showing an example of the external appearance of the imaging device 100 according to the first embodiment, and FIG. 2 is a rear view of the imaging device 100 shown in FIG. 1.
 The imaging device 100 is an interchangeable-lens digital camera that includes a camera body 200 and an interchangeable lens 300 (photographing lens, focus lens 302) replaceably attached to the camera body 200, and that omits a reflex mirror. The camera body 200 is provided with a hybrid finder (registered trademark) 220. The hybrid finder 220 here refers to a finder in which, for example, an optical viewfinder (hereinafter, "OVF") and an electronic viewfinder (hereinafter, "EVF") are selectively used.
 The camera body 200 and the interchangeable lens 300 are replaceably mounted by coupling a mount 256 provided on the camera body 200 with a mount 346 (see FIG. 3) on the interchangeable lens 300 side corresponding to the mount 256. The lens barrel of the interchangeable lens 300 is provided with a focus ring; as the focus ring is rotated, the focus lens 302 moves in the optical-axis direction, so that subject light can be formed into an image on an imaging element 20 (see FIG. 3), described later, at a focus position corresponding to the subject distance.
 An OVF finder window 241 included in the hybrid finder 220 is provided on the front surface of the camera body 200. A finder switching lever (finder switching unit) 214 is also provided on the front surface of the camera body 200. When the finder switching lever 214 is rotated in the direction of arrow SW, the display switches between an optical image viewable with the OVF and an electronic image (live view image) viewable with the EVF (described later). The optical axis L2 of the OVF is different from the optical axis L1 of the interchangeable lens 300. A release switch 211 and a dial 212 for setting the shooting mode, playback mode, and the like are mainly provided on the upper surface of the camera body 200.
 The release switch 211 is configured so that two stages of pressing operation can be detected: a state where it is pressed from a standby position to an intermediate position (half-pressed position), which is an example of a first instruction position, and a state where it is pressed to a final pressed position (fully pressed position) beyond the intermediate position, which is an example of a second instruction position. In the following, the state of being pressed from the standby position to the half-pressed position is referred to as the "half-pressed state", and the state of being pressed from the standby position to the fully pressed position is referred to as the "fully pressed state".
 In the imaging device 100 according to the first embodiment, shooting conditions are adjusted by setting the release switch 211 to the half-pressed state, and exposure (shooting) is then performed by continuing to the fully pressed state. The "shooting conditions" here refer to at least one of, for example, the exposure state and the focus state. In the imaging device 100 according to the first embodiment, the exposure state and the focus state are adjusted. That is, when the release switch 211 is set to the half-pressed state, the AE (Automatic Exposure) function operates to set the exposure state (shutter speed and aperture state), and then the AF function operates to perform focus control.
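The two-stage press sequence described above can be sketched as a simple event handler. The class and method names below are assumptions for illustration only; the disclosure does not specify an implementation.

```python
class ReleaseSequence:
    """Illustrative handler for the two-stage release operation:
    half-press runs AE then AF; full press performs exposure."""

    def __init__(self):
        self.log = []

    def on_half_press(self):
        # Half-pressed state: adjust shooting conditions.
        self.log.append("AE: set shutter speed / aperture")
        self.log.append("AF: focus control")

    def on_full_press(self):
        # Fully pressed state: exposure (shooting) is performed.
        self.log.append("exposure (shooting)")
```

The ordering matters: exposure parameters are fixed during the half-press before focus control runs, and actual capture happens only on the full press.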
 An OVF finder eyepiece 242, a display unit 213, a cross key 222, a MENU/OK key 224, and a BACK/DISP button 225 are provided on the back surface of the camera body 200.
 The cross key 222 functions as a multi-function key that outputs various command signals for menu selection, zoom, frame advance, and the like. The MENU/OK key 224 is an operation key that serves both as a menu button for issuing a command to display a menu on the screen of the display unit 213 and as an OK button for issuing commands such as confirming and executing a selection. The BACK/DISP button 225 is used to erase a desired item such as a selected item, cancel a designated content, or return to the previous operation state.
 The display unit 213 is realized by an LCD, for example, and is used to display a live view image (through image), which is an example of continuous-frame images obtained by capturing continuous frames in the shooting mode. The display unit 213 is also used to display a still image, which is an example of a single-frame image obtained by capturing a single frame when an instruction for still-image shooting is given. Furthermore, the display unit 213 is also used to display playback images and menu screens in the playback mode.
 FIG. 3 is a block diagram showing an example of the configuration (internal configuration) of the electrical system of the imaging device 100 according to the first embodiment.
 The imaging device 100 is a digital camera that records captured still images and moving images, and the overall operation of the camera is centrally controlled by a CPU (central processing unit) 12. In addition to the CPU 12, the imaging device 100 includes an operation unit 14, which is an example of a first instruction unit, a second instruction unit, and a setting unit according to the present invention. The imaging device 100 also includes an interface unit 24, a memory 26, and an encoder 34; display control units 36A and 36B, which are examples of a display control unit according to the present invention; an eyepiece detection unit 37, which is an example of a detection unit according to the present invention; and an image processing unit 28, which is an example of a generation unit according to the present invention. In the following, the display control units 36A and 36B are referred to as the "display control unit 36" when they need not be distinguished. In the first embodiment, the display control unit 36 is provided as a hardware configuration separate from the image processing unit 28; however, the present invention is not limited to this, and the image processing unit 28 may have the same functions as the display control unit 36, in which case the display control unit 36 is unnecessary.
 The CPU 12, the operation unit 14, the interface unit 24, the memory 26, which is an example of a storage unit, the image processing unit 28, the encoder 34, the display control units 36A and 36B, the eyepiece detection unit 37, and an external interface (I/F) 39 are connected to one another via a bus 40. The memory 26 has a nonvolatile storage area (for example, an EEPROM) in which parameters, programs, and the like are stored, and a volatile storage area (for example, an SDRAM) in which various information such as images is temporarily stored.
 In the imaging device 100 according to the first embodiment, the CPU 12 performs focus control by driving the focus adjustment motor so that the contrast value of an image obtained by imaging is maximized. The CPU 12 also calculates AE information, which is a physical quantity indicating the brightness of an image obtained by imaging. When the release switch 211 is set to the half-pressed state, the CPU 12 derives the shutter speed and F-number corresponding to the brightness of the image indicated by the AE information, and then sets the exposure state by controlling the relevant units so that the derived shutter speed and F-number are obtained.
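The contrast-based focus control mentioned here is, in essence, a search for the lens position that maximizes image contrast. The following is a minimal sketch under that assumption; `contrast_at` stands in for "capture an image at this lens position and measure its contrast", and is not an API from the disclosure.

```python
def contrast_af(contrast_at, positions):
    """Minimal contrast-AF sketch: sweep candidate focus-lens positions and
    keep the one whose measured image contrast is maximal."""
    best_pos, best_contrast = None, float("-inf")
    for pos in positions:
        c = contrast_at(pos)        # capture + contrast measurement at pos
        if c > best_contrast:
            best_pos, best_contrast = pos, c
    return best_pos
```

A real implementation would typically use a coarse-to-fine hill climb rather than an exhaustive sweep, but the objective being maximized is the same.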
 The operation unit 14 is a user interface operated by the operator when giving various instructions to the imaging device 100. The various instructions accepted by the operation unit 14 are output to the CPU 12 as operation signals, and the CPU 12 executes processing according to the operation signals input from the operation unit 14.
 The operation unit 14 includes the release switch 211, the dial (focus mode switching unit) 212 for selecting the shooting mode and the like, the display unit 213, the finder switching lever 214, the cross key 222, the MENU/OK key 224, and the BACK/DISP button 225. The operation unit 14 also includes a touch panel that accepts various kinds of information; this touch panel is overlaid on the display screen of the display unit 213, for example. The operation unit 14 further includes a depth-of-field confirmation button that is pressed when, for example, confirming the depth of field on the screen of the display unit 213. In the imaging device 100 according to the first embodiment, pressing the depth-of-field confirmation button adjusts the F-number of the photographing lens, and a predetermined F-number is set as the F-number used when capturing a still image.
 When the shooting mode is set, image light representing a subject is formed into an image on the light-receiving surface of a color imaging element (a CMOS sensor, as an example) 20 via the photographing lens 16, which includes a focus lens movable by manual operation, and a shutter 18. The signal charges accumulated in the imaging element 20 are sequentially read out as digital signals corresponding to the signal charges (voltages) by read signals applied from a device control unit 22. The imaging element 20 has a so-called electronic shutter function, and by exercising this function, it controls the charge accumulation time (shutter speed) of each photosensor according to the timing of the read signals. The imaging element 20 according to the first embodiment is a CMOS image sensor, but is not limited to this and may be a CCD image sensor.
 As an example, the imaging element 20 is provided with the color filter 21 shown in FIG. 4. FIG. 4 schematically shows an example of the arrangement of the color filter 21. In the example shown in FIG. 4, (4896 × 3264) pixels are adopted as an example of the number of pixels and 3:2 as the aspect ratio, but the number of pixels and the aspect ratio are not limited to these. As shown in FIG. 4, the color filter 21 includes a first filter G corresponding to G (green), which contributes most to obtaining a luminance signal, a second filter R corresponding to R (red), and a third filter B corresponding to B (blue). The arrangement patterns of the first filter G (hereinafter, G filter), the second filter R (hereinafter, R filter), and the third filter B (hereinafter, B filter) are classified into a first array pattern A and a second array pattern B.
 In the first array pattern A, G filters are arranged on the four corner and center pixels of a 3 × 3 pixel square array. In the first array pattern A, the R filter is arranged on the center vertical line in the horizontal direction (an example of the row direction) of the square array, and the B filter is arranged on the center horizontal line in the vertical direction (an example of the column direction) of the square array. The second array pattern B is a pattern in which the arrangement of the G filters is the same as in the first array pattern A and the arrangements of the R filter and the B filter are interchanged. The color filter 21 includes a basic array pattern C, which is a square array pattern corresponding to 6 × 6 pixels. The basic array pattern C is a 6 × 6 pixel pattern in which the first array pattern A and the second array pattern B are arranged point-symmetrically, and the basic array pattern C is repeatedly arranged in the horizontal and vertical directions. That is, in the color filter 21, the filters of the R, G, and B colors (R filter, G filter, and B filter) are arranged with a predetermined periodicity. Therefore, when performing demosaicing (interpolation) processing or the like on the R, G, and B signals read out from the color imaging element, the processing can be performed according to the repeating pattern.
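The construction of the basic array pattern C from the sub-patterns A and B can be written down directly from this description. The sketch below rebuilds the 6 × 6 pattern as nested lists (the list-of-lists representation is an assumption for illustration, not the patent's data format):

```python
A = [["G", "R", "G"],
     ["B", "G", "B"],
     ["G", "R", "G"]]   # first array pattern A: G at corners/center,
                        # R on the center column, B on the center row
B = [["G", "B", "G"],
     ["R", "G", "R"],
     ["G", "B", "G"]]   # second array pattern B: R and B interchanged

def basic_pattern_c():
    """6x6 basic array pattern C: A and B tiled alternately (A B / B A),
    which also makes the pattern point-symmetric about its center."""
    top = [a + b for a, b in zip(A, B)]
    bottom = [b + a for b, a in zip(B, A)]
    return top + bottom
```

The assertions one would check against the text hold for this construction: every row, every column, and the diagonals of the 6 × 6 pattern contain a G filter, and the pattern is point-symmetric.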
 When an image is reduced by thinning-out processing in units of the basic array pattern C, the color filter array of the thinned reduced image can be the same as the color filter array before the thinning-out processing, so a common processing circuit can be used.
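This thinning property follows because a pixel's filter color depends only on its coordinates modulo the 6 × 6 basic pattern: discarding whole pattern-sized bands leaves the reduced image with the same color filter array. The check below is illustrative; the pattern is rebuilt from the A/B sub-patterns in the text, and the particular thinning scheme (keeping every other 6-pixel band) is one example, not the patent's specific scheme.

```python
A = [["G", "R", "G"], ["B", "G", "B"], ["G", "R", "G"]]
B = [["G", "B", "G"], ["R", "G", "R"], ["G", "B", "G"]]
C = [a + b for a, b in zip(A, B)] + [b + a for b, a in zip(B, A)]  # 6x6 pattern

def color(y, x):
    """Filter color at sensor pixel (y, x) when C tiles the sensor."""
    return C[y % 6][x % 6]

def thin(height, width):
    """Keep every other 6-row and 6-column band (thinning in pattern units)
    and return the color array seen by the reduced image."""
    rows = [y for y in range(height) if (y // 6) % 2 == 0]
    cols = [x for x in range(width) if (x // 6) % 2 == 0]
    return [[color(y, x) for x in cols] for y in rows]
```

Because kept pixels retain their position modulo 6, the reduced image's color at reduced coordinate (i, j) is again `C[i % 6][j % 6]`, i.e. the same CFA as before thinning.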
 In the color filter 21, G filters corresponding to the color that contributes most to obtaining a luminance signal (the G color in the first embodiment) are arranged in every horizontal, vertical, and diagonal line of the color filter array. Therefore, the reproduction accuracy of demosaicing in the high-frequency region can be improved regardless of the direction of the high-frequency component.
 In the color filter 21, R filters and B filters corresponding to two or more colors other than the G color (the R and B colors in the first embodiment) are arranged in every horizontal and vertical line of the color filter array. Therefore, the occurrence of color moire (false color) is suppressed, which makes it possible to omit an optical low-pass filter for suppressing the occurrence of false color from the optical path extending from the incident surface of the optical system to the imaging surface. Even when an optical low-pass filter is applied, one having only a weak effect of cutting the high-frequency components responsible for false color can be applied, so that the resolution is not impaired.
 The basic array pattern C can also be regarded as an array in which the 3 × 3 pixel first array pattern A surrounded by the broken-line frame and the 3 × 3 pixel second array pattern B surrounded by the one-dot chain-line frame are alternately arranged in the horizontal and vertical directions.
 In each of the first array pattern A and the second array pattern B, the G filters, which correspond to luminance pixels, are arranged at the four corners and the center, that is, on both diagonal lines. In the first array pattern A, the B filters are arranged in the horizontal direction and the R filters in the vertical direction, with the central G filter interposed between them. In the second array pattern B, conversely, the R filters are arranged in the horizontal direction and the B filters in the vertical direction, with the central G filter interposed between them. That is, the positional relationship between the R filters and the B filters is reversed between the first array pattern A and the second array pattern B, but the rest of the arrangement is the same.
 Because the first array pattern A and the second array pattern B are alternately arranged in the horizontal and vertical directions as shown in FIG. 5 as an example, the G filters at the four corners of the two patterns form a square array of G filters corresponding to 2 × 2 pixels. For the 2 × 2 pixels composed of the G filters extracted as shown in FIG. 5 as an example, the absolute difference between the pixel values of the G pixels in the horizontal direction, the absolute difference between the pixel values of the G pixels in the vertical direction, and the absolute differences between the pixel values of the G pixels in the diagonal directions (upper-right diagonal and upper-left diagonal) are calculated. It can thereby be determined that there is a correlation in whichever of the horizontal, vertical, and diagonal directions has the smallest absolute difference. That is, the direction with the highest correlation among the horizontal, vertical, and diagonal directions is determined using the information of the G pixels at the minimum pixel interval. This determination result can be used in the processing of interpolating from surrounding pixels (synchronization processing).
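The directional determination described above can be sketched as follows. This is a minimal illustration of the absolute-difference comparison over a 2 × 2 block of G pixel values, not the patent's actual circuit; the function name and direction labels are placeholders:

```python
import numpy as np

def correlation_direction(g):
    """Pick the correlation direction for a 2x2 block of G pixel values.

    Layout:  g[0,0] g[0,1]
             g[1,0] g[1,1]
    """
    g = np.asarray(g, dtype=float)
    diffs = {
        "horizontal": abs(g[0, 0] - g[0, 1]),
        "vertical": abs(g[0, 0] - g[1, 0]),
        "diagonal_upper_left": abs(g[0, 0] - g[1, 1]),
        "diagonal_upper_right": abs(g[0, 1] - g[1, 0]),
    }
    # the direction with the smallest absolute difference is the most correlated
    return min(diffs, key=diffs.get)
```

For example, `correlation_direction([[10, 10], [50, 52]])` reports the horizontal direction, and the result can then steer which neighbors the synchronization (demosaicing) step averages.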
 The basic array pattern C of the color filter 21 is arranged point-symmetrically with respect to the center of the basic array pattern C (the center of the four G filters). The first array pattern A and the second array pattern B in the basic array pattern C are also each arranged point-symmetrically with respect to their central G filter. Therefore, the circuit scale of the processing circuits in subsequent stages can be reduced or simplified.
 As an example, as shown in FIG. 6, in the basic array pattern C, the color filter array of the first and third of the first to sixth horizontal lines is GRGGBG. The color filter array of the second line is BGBRGR. The color filter array of the fourth and sixth lines is GBGGRG. The color filter array of the fifth line is RGRBGB. In the example shown in FIG. 6, basic array patterns C, C′, and C″ are shown. The basic array pattern C′ is a pattern obtained by shifting the basic array pattern C by one pixel each in the horizontal and vertical directions, and the basic array pattern C″ is a pattern obtained by shifting the basic array pattern C by two pixels each in the horizontal and vertical directions. Thus, in the color filter 21, the same color filter array results even if the basic array pattern C′ or C″ is repeatedly arranged in the horizontal and vertical directions.
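Both the line layout and the shift property stated here can be checked numerically; a small sketch, assuming the A/B layouts given earlier in this section:

```python
import numpy as np

A = np.array([list("GRG"), list("BGB"), list("GRG")])
B = np.array([list("GBG"), list("RGR"), list("GBG")])
C = np.block([[A, B], [B, A]])

# the six horizontal lines of the basic array pattern C
assert ["".join(row) for row in C] == [
    "GRGGBG", "BGBRGR", "GRGGBG",
    "GBGGRG", "RGRBGB", "GBGGRG",
]

big = np.tile(C, (3, 3))      # 18 x 18 mosaic built from C
C1 = big[1:7, 1:7]            # basic array pattern C' (shifted by one pixel)
C2 = big[2:8, 2:8]            # basic array pattern C'' (shifted by two pixels)

# tiling C' or C'' reproduces the same color filter array (shifted as a whole)
assert np.array_equal(np.tile(C1, (2, 2)), big[1:13, 1:13])
assert np.array_equal(np.tile(C2, (2, 2)), big[2:14, 2:14])
```

Because the mosaic is doubly periodic with period 6, any 6 × 6 window serves equally well as a basic array pattern.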
 The imaging apparatus 100 has a phase difference AF function. The image sensor 20 includes a plurality of phase difference detection pixels used when the phase difference AF function is activated. The plurality of phase difference detection pixels are arranged in a predetermined pattern.
 FIG. 7 schematically shows an example of the correspondence between a part of the color filter 21 and some of the phase difference detection pixels. As an example, as shown in FIG. 7, each phase difference detection pixel is either a first pixel L in which the left half of the pixel in the horizontal direction is shielded from light, or a second pixel R in which the right half of the pixel in the horizontal direction is shielded from light. In the following, when it is not necessary to distinguish between the first pixel L and the second pixel R, they are referred to as "phase difference pixels".
 FIG. 8 shows an example of the first pixel L and the second pixel R arranged in the image sensor 20. As an example, as shown in FIG. 8, the first pixel L has a light shielding member 20A, and the second pixel R has a light shielding member 20B. The light shielding member 20A is provided on the front side (the microlens L side) of the photodiode PD and shields the left half of the light receiving surface from light. The light shielding member 20B, on the other hand, is provided on the front side of the photodiode PD and shields the right half of the light receiving surface from light.
 The microlens L and the light shielding members 20A and 20B function as a pupil dividing unit: the first pixel L receives only the portion of the light flux passing through the exit pupil of the photographing lens 16 that lies on the left side of the optical axis, and the second pixel R receives only the portion that lies on the right side of the optical axis. In this way, the light flux passing through the exit pupil is divided into left and right by the microlens L and the light shielding members 20A and 20B serving as the pupil dividing unit, and the divided fluxes enter the first pixel L and the second pixel R, respectively.
 Of the subject image corresponding to the left half of the light flux passing through the exit pupil of the photographing lens 16 and the subject image corresponding to the right half of the light flux, the portions that are in focus (in the focused state) form images at the same position on the image sensor 20. In contrast, front-focused or rear-focused portions are incident at different positions on the image sensor 20 (the phase is shifted). As a result, the subject image corresponding to the left half of the light flux and the subject image corresponding to the right half of the light flux can be acquired as parallax images (a left eye image and a right eye image) having different parallaxes.
 By activating the phase difference AF function, the imaging apparatus 100 detects the amount of phase shift based on the pixel values of the first pixels L and the pixel values of the second pixels R. The focal position of the photographing lens is then adjusted based on the detected amount of phase shift. In the following, when it is not necessary to distinguish between the light shielding members 20A and 20B, they are referred to as "light shielding members" without reference numerals.
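A minimal sketch of the phase-shift detection idea, using a sum-of-absolute-differences (SAD) search over candidate shifts between corresponding first-pixel (L) and second-pixel (R) signal lines. The patent does not specify this exact algorithm, so the function name, the SAD criterion, and the search range are illustrative:

```python
import numpy as np

def detect_phase_shift(left_line, right_line, max_shift=8):
    """Estimate the shift s (in pixels) such that right_line[i] ~ left_line[i - s].

    Assumes len(left_line) > max_shift so every candidate overlap is non-empty.
    """
    left_line = np.asarray(left_line, dtype=float)
    right_line = np.asarray(right_line, dtype=float)
    n = len(left_line)
    best_s, best_sad = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        # mean absolute difference over the overlapping samples
        if s >= 0:
            sad = np.abs(left_line[:n - s] - right_line[s:]).mean()
        else:
            sad = np.abs(left_line[-s:] - right_line[:n + s]).mean()
        if sad < best_sad:
            best_s, best_sad = s, sad
    return best_s
```

For an in-focus subject the two lines coincide and the detected shift is zero; the sign and magnitude of a nonzero shift correspond to the defocus direction and amount that the automatic focus adjustment would drive to zero.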
 The pixels of the image sensor 20 are classified into a first pixel group, a second pixel group, and a third pixel group. The first pixel group refers to, for example, the plurality of first pixels L. The second pixel group refers to, for example, the plurality of second pixels R. The third pixel group refers to, for example, a plurality of normal pixels (an example of third pixels). The term "normal pixel" here refers to, for example, a pixel other than a phase difference pixel (for example, a pixel having no light shielding member 20A or 20B). In the following, the RAW image output from the first pixel group is referred to as the "first image", the RAW image output from the second pixel group as the "second image", and the RAW image output from the third pixel group as the "third image".
 The pixels included in the first and second pixel groups are arranged at positions where the first pixel group and the second pixel group are aligned within one pixel in the horizontal direction. The pixels included in the first and second pixel groups are also arranged at positions where the first pixel group and the second pixel group are aligned within one pixel in the vertical direction. In the example shown in FIG. 7, the first pixels L and the second pixels R are alternately arranged linearly in each of the horizontal and vertical directions at intervals of a plurality of pixels.
 In the example shown in FIG. 7, the positions of the pixels included in the first and second pixel groups are aligned within one pixel in each of the horizontal and vertical directions, but they may instead be positioned within a predetermined number of pixels (for example, within two pixels) in at least one of the horizontal and vertical directions. In order to suppress as much as possible the occurrence of image shifts caused by factors other than defocus, it is preferable that the positions of the pixels included in the first and second pixel groups be aligned within one pixel in each of the horizontal and vertical directions, as shown in FIG. 7 as an example.
 As an example, as shown in FIG. 7, the phase difference pixels are provided at pixels of the square array of G filters corresponding to 2 × 2 pixels. That is, in the example shown in FIG. 7, the pixel at the upper right corner, in a front view in the drawing, of each 2 × 2 pixel G filter block is assigned to a phase difference pixel. Normal pixels are arranged between the phase difference pixels, and the remaining pixels of each 2 × 2 pixel G filter block are assigned to normal pixels. In the example shown in FIG. 7, the rows of phase difference pixels in which the first pixels L and the second pixels R are alternately arranged in the horizontal direction are grouped into sets of two rows, and the sets are arranged in the vertical direction at intervals of a predetermined number of pixels (eight pixels in the example shown in FIG. 7).
 As described above, in the color filter 21 the light shielding members are provided at the pixel in the upper right corner of each 2 × 2 pixel G filter block, and the phase difference pixels are regularly arranged at intervals of a plurality of pixels in both the vertical and horizontal directions. A relatively large number of normal pixels are therefore arranged around each phase difference pixel, so that the interpolation accuracy when interpolating the pixel value of a phase difference pixel from the pixel values of normal pixels can be improved. Moreover, since the pixels included in the first to third pixel groups are arranged so that the normal pixels used for interpolation do not overlap between phase difference pixels, a further improvement in interpolation accuracy can be expected.
 Returning to FIG. 3, the image sensor 20 outputs the first image (a digital signal indicating the pixel value of each first pixel) from the first pixel group and outputs the second image (a digital signal indicating the pixel value of each second pixel) from the second pixel group. The image sensor 20 also outputs the third image (a digital signal indicating the pixel value of each normal pixel) from the third pixel group. The third image output from the third pixel group is a chromatic image, for example, a color image having the same color array as the array of the normal pixels. The first image, the second image, and the third image output from the image sensor 20 are temporarily stored in a volatile storage area in the memory 26 via the interface unit 24.
 The image processing unit 28 includes a normal processing unit 30. The normal processing unit 30 processes the R, G, and B signals corresponding to the third pixel group to generate a chromatic normal image, which is an example of a first display image. The image processing unit 28 also includes a split image processing unit 32. The split image processing unit 32 processes the G signals corresponding to the first pixel group and the second pixel group to generate an achromatic split image, which is an example of a second display image. The image processing unit 28 according to the first embodiment is realized by an ASIC (Application Specific Integrated Circuit), an integrated circuit in which circuits for a plurality of functions related to image processing are integrated into one. However, the hardware configuration is not limited to this, and other hardware configurations, such as a programmable logic device or a computer including a CPU, a ROM, and a RAM, may be used.
 The encoder 34 converts an input signal into a signal of another format and outputs it. The hybrid viewfinder 220 has an LCD 247 that displays an electronic image. The number of pixels in a predetermined direction on the LCD 247 (as an example, the number of pixels in the horizontal direction, which is the parallax generation direction) is smaller than the number of pixels in the same direction on the display unit 213. The display control unit 36A is connected to the display unit 213, and the display control unit 36B is connected to the LCD 247; by selectively controlling the LCD 247 and the display unit 213, an image is displayed on the LCD 247 or the display unit 213. In the following, the display unit 213 and the LCD 247 are referred to as "display devices" when it is not necessary to distinguish between them.
 The imaging apparatus 100 according to the first embodiment is configured so that the manual focus mode and the autofocus mode can be switched with the dial 212 (focus mode switching unit). When either focus mode is selected, the display control unit 36 causes the display device to display a live view image into which the split image has been combined. When the autofocus mode is selected with the dial 212, the CPU 12 operates as a phase difference detection unit and an automatic focus adjustment unit. The phase difference detection unit detects the phase difference between the first image output from the first pixel group and the second image output from the second pixel group. Based on the detected phase difference, the automatic focus adjustment unit controls a lens driving unit (not shown) from the device control unit 22 via the mounts 256 and 346 so that the defocus amount of the photographing lens 16 becomes zero, and moves the photographing lens 16 to the in-focus position. The "defocus amount" mentioned above refers to, for example, the amount of phase shift between the first image and the second image.
 The eyepiece detection unit 37 detects that a person (for example, the photographer) has looked into the viewfinder eyepiece 242 and outputs the detection result to the CPU 12. The CPU 12 can therefore determine, based on the detection result of the eyepiece detection unit 37, whether or not the viewfinder eyepiece 242 is being used.
 The external I/F 39 is connected to a communication network such as a LAN (Local Area Network) or the Internet and handles the transmission and reception of various kinds of information between an external device (for example, a printer) and the CPU 12 via the communication network. Therefore, when a printer is connected as an external device, the imaging apparatus 100 can output a captured still image to the printer for printing. When a display is connected as an external device, the imaging apparatus 100 can output a captured still image or a live view image to the display for display.
 FIG. 9 is a functional block diagram showing an example of the main functions of the imaging apparatus 100 according to the first embodiment. Parts common to the block diagram shown in FIG. 3 are given the same reference numerals.
 The normal processing unit 30 and the split image processing unit 32 each have a WB gain unit, a gamma correction unit, and a synchronization processing unit (not shown), and each processing unit sequentially performs signal processing on the original digital signal (RAW image) temporarily stored in the memory 26. That is, the WB gain unit executes white balance (WB) by adjusting the gains of the R, G, and B signals. The gamma correction unit gamma-corrects each of the R, G, and B signals on which WB has been executed by the WB gain unit. The synchronization processing unit performs color interpolation processing corresponding to the color filter array of the image sensor 20 and generates synchronized R, G, and B signals. Every time a RAW image for one screen is acquired by the image sensor 20, the normal processing unit 30 and the split image processing unit 32 perform image processing on that RAW image in parallel.
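As a rough sketch of the WB-gain and gamma-correction stages of such a pipeline (the gain values and the gamma exponent are illustrative; the patent does not specify them, and the synchronization stage is omitted here):

```python
import numpy as np

def process_raw(raw_rgb, wb_gains=(2.0, 1.0, 1.5), gamma=2.2):
    """Apply per-channel WB gains and gamma correction to an RGB image in [0, 1]."""
    out = np.asarray(raw_rgb, dtype=float) * np.asarray(wb_gains)  # WB gain stage
    out = np.clip(out, 0.0, 1.0) ** (1.0 / gamma)                  # gamma correction
    return out
```

A real pipeline would run the synchronization (demosaicing) stage after these steps so that every pixel carries full R, G, and B values.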
 The normal processing unit 30 receives the R, G, and B RAW images from the interface unit 24 and generates the R, G, and B pixels of the third pixel group by interpolating them with peripheral pixels of the same color (for example, adjacent G pixels) at the positions of the first pixel group and the second pixel group, as shown in FIG. 10 as an example. A normal image for recording can thereby be generated based on the third image output from the third pixel group.
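A minimal sketch of this kind of same-color neighborhood interpolation. The offset set and function name are illustrative; a real pipeline would weight the neighbors according to the correlation direction determined from the G pixels:

```python
import numpy as np

def interpolate_same_color(img, r, c,
                           offsets=((-1, -1), (-1, 1), (1, -1), (1, 1))):
    """Estimate the value at (r, c) as the mean of surrounding same-color pixels.

    `offsets` must point at sites of the same color in the mosaic.
    """
    h, w = img.shape
    vals = [img[r + dr, c + dc] for dr, dc in offsets
            if 0 <= r + dr < h and 0 <= c + dc < w]
    return float(np.mean(vals))
```

For a pixel whose own value is unusable (for example, a phase difference pixel position), the interpolated value stands in when building the full-resolution normal image.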
 The normal processing unit 30 also outputs the image data of the generated normal image for recording to the encoder 34. The R, G, and B signals processed by the normal processing unit 30 are converted (encoded) into recording signals by the encoder 34 and recorded in the recording unit 40 (see FIG. 7). The normal image for display, which is an image based on the third image processed by the normal processing unit 30, is output to the display control unit 36. In the following, for convenience of explanation, when it is not necessary to distinguish between the above "normal image for recording" and "normal image for display", the words "for recording" and "for display" are omitted and both are referred to as the "normal image".
 The image sensor 20 can change the exposure condition of each of the first pixel group and the second pixel group (as an example, the shutter speed of an electronic shutter), and can thereby simultaneously acquire images with different exposure conditions. The image processing unit 28 can therefore generate an image with a wide dynamic range based on the images with different exposure conditions. In addition, a plurality of images can be simultaneously acquired under the same exposure condition, and by adding these images a high-sensitivity image with little noise, or a high-resolution image, can be generated.
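One common way to realize the wide-dynamic-range combination mentioned here is to scale the shorter exposure up to the longer one and substitute it wherever the long exposure saturates; a sketch under that assumption (the saturation threshold and exposure ratio are illustrative, and the patent does not prescribe this particular merge):

```python
import numpy as np

def merge_exposures(short_exp, long_exp, ratio, sat=4095):
    """Merge a short and a long exposure of the same scene.

    short_exp: image taken with the shorter shutter (darker, keeps highlights)
    long_exp:  image taken with the longer shutter (brighter, keeps shadows)
    ratio:     exposure ratio long/short
    """
    short_scaled = np.asarray(short_exp, dtype=float) * ratio  # common scale
    long_f = np.asarray(long_exp, dtype=float)
    # use the long exposure where it is not saturated, else the scaled short one
    return np.where(long_f < sat, long_f, short_scaled)
```

Averaging two frames taken under the same exposure condition instead (rather than merging different ones) reduces noise, which corresponds to the high-sensitivity case in the paragraph above.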
 The split image processing unit 32, on the other hand, extracts the G signals of the first pixel group and the second pixel group from the RAW image temporarily stored in the memory 26 and generates an achromatic split image based on the G signals of the first and second pixel groups. As described above, each of the first and second pixel groups extracted from the RAW image is a pixel group consisting of pixels with G filters. The split image processing unit 32 can therefore generate an achromatic left parallax image and an achromatic right parallax image based on the G signals of the first and second pixel groups. In the following, for convenience of explanation, the above "achromatic left parallax image" is referred to as the "left eye image" and the above "achromatic right parallax image" as the "right eye image".
 The split image processing unit 32 generates a split image by combining the left eye image based on the first image output from the first pixel group with the right eye image based on the second image output from the second pixel group. The image data of the generated split image is output to the display control unit 36.
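For the two-way split described later in this section, the combination amounts to stacking the upper half of the left eye image on the lower half of the right eye image; a minimal sketch (the function name is a placeholder):

```python
import numpy as np

def make_split_image(left_eye, right_eye):
    """Two-way split: upper half from the left eye image,
    lower half from the right eye image."""
    h = left_eye.shape[0]
    return np.vstack([left_eye[:h // 2], right_eye[h // 2:]])
```

When the subject is out of focus, the two halves show the scene shifted horizontally relative to each other; the seam lines up only at the in-focus position, which is what makes the split image useful for manual focusing.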
 The display control unit 36 generates display image data based on the recording image data corresponding to the third pixel group input from the normal processing unit 30 and the split image data corresponding to the first and second pixel groups input from the split image processing unit 32. For example, the display control unit 36 combines the split image indicated by the image data input from the split image processing unit 32 into the display area of the normal image indicated by the recording image data corresponding to the third pixel group input from the normal processing unit 30. The image data obtained by the combination is then output to the display device. That is, the display control unit 36A outputs the image data to the display unit 213, and the display control unit 36B outputs the image data to the LCD 247.
 The split image generated by the split image processing unit 32 is a multi-division image obtained by combining a part of the left eye image and a part of the right eye image. Examples of the "multi-division image" mentioned here include the split images shown in FIGS. 13A and 13B. The split image shown in FIG. 13 is an image obtained by combining the upper half of the left eye image with the lower half of the right eye image, in which the two vertically divided images are shifted relative to each other in a predetermined direction (for example, the parallax generation direction) according to the focus state. The form of the split image is not limited to the examples shown in FIGS. 13A and 13B, and it may be an image obtained by combining a part of the left eye image and a part of the right eye image at positions corresponding to the position of a predetermined area of the display unit 213. In this case, for example, the images divided into four in the vertical direction are shifted relative to one another in a predetermined direction (for example, the parallax generation direction) according to the focus state.
 The method of combining the split image with the normal image is not limited to a combining method in which the split image is fitted in place of a part of the normal image. For example, a combining method in which the split image is superimposed on the normal image may be used. When superimposing the split image, a combining method may also be used in which the transmittances of the split image and of the part of the normal image on which it is superimposed are appropriately adjusted for the superimposition. In this way, a live view image showing the continuously captured subject image is displayed on the screen of the display device, and the displayed live view image is an image in which the split image is displayed within the display area of the normal image.
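The transmittance-adjusted superimposition can be sketched as a per-pixel alpha blend over the split image region; the function name, region coordinates, and alpha value are illustrative, with `alpha=1.0` reducing to the fitting-in method and intermediate values giving the blended method:

```python
import numpy as np

def overlay_split(normal, split, top, left, alpha=1.0):
    """Superimpose `split` onto `normal` at (top, left).

    alpha=1.0 replaces the region entirely (fitting the split image in);
    0 < alpha < 1 blends it with the underlying normal image.
    """
    out = np.asarray(normal, dtype=float).copy()
    h, w = split.shape[:2]
    region = out[top:top + h, left:left + w]
    out[top:top + h, left:left + w] = (
        alpha * np.asarray(split, dtype=float) + (1.0 - alpha) * region)
    return out
```

Either way, the pixels outside the split image region continue to show the normal live view image.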
 The hybrid viewfinder 220 includes an OVF 240 and an EVF 248. The OVF 240 is an inverse Galilean viewfinder having an objective lens 244 and an eyepiece lens 246, and the EVF 248 has the LCD 247, a prism 245, and the eyepiece lens 246.
 A liquid crystal shutter 243 is disposed in front of the objective lens 244; when the EVF 248 is used, the liquid crystal shutter 243 blocks light so that the optical image does not enter the objective lens 244.
 The prism 245 reflects the electronic image or various information displayed on the LCD 247 and guides it to the eyepiece lens 246, and combines the optical image with the information (the electronic image and various information) displayed on the LCD 247.
 ここで、ファインダー切替えレバー214を図1に示す矢印SW方向に回動させると、回動させる毎にOVF240により光学像を視認することができるOVFモードと、EVF248により電子像を視認することができるEVFモードとが交互に切り替えられる。 Here, when the viewfinder switching lever 214 is rotated in the direction of the arrow SW shown in FIG. 1, each rotation switches alternately between an OVF mode, in which an optical image can be viewed through the OVF 240, and an EVF mode, in which an electronic image can be viewed through the EVF 248.
 表示制御部36Bは、OVFモードの場合、液晶シャッタ243が非遮光状態になるように制御し、接眼部から光学像が視認できるようにする。また、LCD247には、スプリットイメージのみを表示させる。これにより、光学像の一部にスプリットイメージが重畳されたファインダー像を表示させることができる。 In the OVF mode, the display control unit 36B controls the liquid crystal shutter 243 to be in a non-light-shielding state so that an optical image can be visually recognized from the eyepiece unit. Further, only the split image is displayed on the LCD 247. Thereby, a finder image in which a split image is superimposed on a part of the optical image can be displayed.
 一方、表示制御部36Bは、EVFモードの場合、液晶シャッタ243が遮光状態になるように制御し、接眼部からLCD247に表示される電子像のみが視認できるようにする。なお、LCD247には、表示部213に出力されるスプリットイメージが合成された画像データと同等の画像データが入力され、これにより、表示部213と同様に通常画像の一部にスプリットイメージが合成された電子像を表示させることができる。 On the other hand, in the EVF mode, the display control unit 36B controls the liquid crystal shutter 243 to be in a light-shielding state so that only the electronic image displayed on the LCD 247 can be viewed from the eyepiece unit. Note that image data equivalent to the split-image-combined image data output to the display unit 213 is input to the LCD 247, whereby an electronic image in which the split image is combined with a part of the normal image, as on the display unit 213, can be displayed.
 図11には表示装置における通常画像及びスプリットイメージの各々の表示領域の一例が示されている。一例として図11に示すように、表示装置は、通常画像及びスプリットイメージが入力された場合、入力されたスプリットイメージを画面中央部の矩形枠内(スプリットイメージの表示領域)に表示する。また、表示装置は、入力された通常画像をスプリットイメージの外周領域(通常画像の表示領域)に表示する。また、表示装置は、スプリットイメージが入力されずに通常画像が入力された場合、入力された通常画像を画面全領域に表示(全画面表示)する。また、表示装置は、通常画像が入力されずにスプリットイメージが入力された場合、入力されたスプリットイメージを一例として図11に示す画面中央部の矩形枠内に表示し、スプリットイメージの外周領域を空白領域とする。なお、図11に示す画面中央部の矩形枠を表す縁の線は実際には表示されないが、図11では説明の便宜上示されている。 FIG. 11 shows an example of the display areas of the normal image and the split image on the display device. As shown in FIG. 11 as an example, when a normal image and a split image are input, the display device displays the input split image within a rectangular frame (the split image display area) at the center of the screen, and displays the input normal image in the outer peripheral area of the split image (the normal image display area). When a normal image is input without a split image, the display device displays the input normal image over the entire screen area (full-screen display). When a split image is input without a normal image, the display device displays the input split image within the rectangular frame at the center of the screen shown as an example in FIG. 11, and leaves the outer peripheral area of the split image as a blank area. Note that the edge line representing the rectangular frame at the center of the screen in FIG. 11 is not actually displayed; it is shown in FIG. 11 for convenience of explanation.
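The display rules above depend only on which of the two images is input. A minimal sketch, with an illustrative function name and region labels not taken from the specification:

```python
def choose_layout(normal_input, split_input):
    """Return where each input image is shown, following the rules of FIG. 11.

    A value of None for a region means it is left blank.
    """
    if normal_input and split_input:
        # Split image inside the central rectangular frame, normal image outside.
        return {"center_frame": "split", "outer_area": "normal"}
    if normal_input:
        # Normal image only: full-screen display.
        return {"full_screen": "normal"}
    if split_input:
        # Split image only: central frame used, outer peripheral area blank.
        return {"center_frame": "split", "outer_area": None}
    return {}
```

Each branch corresponds to one of the four input combinations described for the display device.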
 次に本第1実施形態の作用として画像処理部28により通常画像及びスプリットイメージが生成された場合(例えば1コマ分のライブビュー画像が生成される毎)にCPU12で行われる画像出力処理について、図12を参照して説明する。なお、ここでは、CPU12が画像出力処理プログラムを実行することにより撮像装置100で画像出力処理が行われる。画像出力処理プログラムは所定の記憶領域(例えばメモリ26)に記憶されている。また、ここでは、画像出力処理がCPU12で行われる場合を例示するが、本発明はこれに限定されるものではなく、例えば画像処理部28が画像出力処理を実行するものとしてもよい。 Next, as an operation of the first embodiment, the image output process performed by the CPU 12 when a normal image and a split image are generated by the image processing unit 28 (for example, every time one frame of the live view image is generated) will be described with reference to FIG. 12. Here, the image output process is performed in the imaging apparatus 100 by the CPU 12 executing an image output processing program. The image output processing program is stored in a predetermined storage area (for example, the memory 26). Although the case where the image output process is performed by the CPU 12 is illustrated here, the present invention is not limited to this; for example, the image processing unit 28 may execute the image output process.
 図12において、先ず、ステップ300では、CPU12により、画像処理部28で生成されたスプリットイメージの表示が操作部14を介して指示されたか否かが判定される。なお、操作部14による指示は、保持型の指示であることが好ましい。また、操作部14による指示は、ハードキーによる指示であってもよいし、ソフトキーによる指示であってもよい。ハードキーによる指示の場合は、例えばオルタネイト動作型のスイッチ(保持型スイッチ)を適用することが好ましい。つまり、所定位置まで押し込むと解除操作が行われるまで動作状態(ここではスプリットイメージの表示を指示する状態)が維持されるスイッチを指す。オルタネイト動作型のスイッチは、押圧操作を解除すると押し込んだ位置で保持され、再び押圧操作が行われることにより元の状態に復帰するロック式であってもよいし、押圧操作が解除されると自由位置に戻るノンロック式であってもよい。一方、ソフトキーによる指示の場合はソフトウェア構成でハードキーと同様の機能を実現するスイッチを適用すればよい。この場合、例えば、CPU12が、表示部213に対して、スプリットイメージの表示を指示する際に押圧操作されるソフトキー(第2指示部の一例)を表示させる。そして、ソフトキーが操作された場合にスプリットイメージの表示が指示されたと判定する。本ステップ300においてスプリットイメージの表示が操作部14を介して指示された場合は判定が肯定されてステップ302へ移行する。本ステップ300においてスプリットイメージの表示が操作部14を介して指示されない場合は判定が否定されてステップ312へ移行する。 In FIG. 12, first, in step 300, the CPU 12 determines whether or not display of the split image generated by the image processing unit 28 has been instructed via the operation unit 14. The instruction via the operation unit 14 is preferably a holding-type instruction. The instruction via the operation unit 14 may be an instruction by a hard key or by a soft key. In the case of an instruction by a hard key, it is preferable to apply, for example, an alternate-action switch (a holding-type switch), that is, a switch that, once pushed to a predetermined position, maintains its operating state (here, the state instructing display of the split image) until a release operation is performed. The alternate-action switch may be a locking type that is held in the pushed-in position when the pressing operation is released and returns to the original state when pressed again, or a non-locking type that returns to the free position when the pressing operation is released. On the other hand, in the case of an instruction by a soft key, a switch that realizes the same function as the hard key in a software configuration may be applied.
In this case, for example, the CPU 12 causes the display unit 213 to display a soft key (an example of a second instruction unit) that is pressed when instructing display of the split image. When the soft key is operated, it is determined that the split image display is instructed. If split image display is instructed via the operation unit 14 in step 300, the determination is affirmed and the routine proceeds to step 302. If the split image display is not instructed via the operation unit 14 in step 300, the determination is negative and the process proceeds to step 312.
 ステップ302では、CPU12により、画像処理部28で生成された通常画像の表示が操作部14を介して指示されたか否かが判定される。なお、操作部14による指示は、非保持型の指示であることが好ましい。また、操作部14による指示は、ハードキーによる指示であってもよいし、ソフトキーによる指示であってもよい。ハードキーによる指示の場合は、例えばモーメンタリ動作型のスイッチ(非保持型スイッチ)を適用することが好ましい。つまり、所定位置に押し込んでいる間だけ動作状態(ここでは一例として通常画像の表示を指示する状態)を維持するスイッチを指す。モーメンタリ動作型のスイッチは、通常画像の表示を指示するときは押して、指示を解除するときは引き出すプッシュプル式であってもよいし、その他の型式のものであってもよい。一方、ソフトキーによる指示の場合はソフトウェア構成でハードキーと同様の機能を実現するスイッチを適用すればよい。この場合、例えば、CPU12が、表示部213に対して、通常画像の表示を指示する際に押圧操作されるソフトキー(第1指示部の一例)を表示させる。そして、ソフトキーが操作された場合に通常画像の表示が指示されたと判定する。本ステップ302において通常画像の表示が操作部14を介して指示された場合は判定が肯定されてステップ306へ移行する。本ステップ302において通常画像の表示が操作部14を介して指示されない場合は判定が否定されてステップ308へ移行する。 In step 302, the CPU 12 determines whether or not the display of the normal image generated by the image processing unit 28 is instructed via the operation unit 14. The instruction by the operation unit 14 is preferably a non-holding type instruction. Further, the instruction by the operation unit 14 may be an instruction by a hard key or an instruction by a soft key. In the case of an instruction by a hard key, for example, a momentary operation type switch (non-holding type switch) is preferably applied. That is, it refers to a switch that maintains an operating state (here, as an example, a state instructing display of a normal image) only while being pushed into a predetermined position. The momentary operation type switch may be a push-pull type that is pressed when instructing display of a normal image and is pulled out when canceling the instruction, or may be of other types. On the other hand, in the case of an instruction using a soft key, a switch that realizes the same function as a hard key in a software configuration may be applied. In this case, for example, the CPU 12 causes the display unit 213 to display a soft key (an example of a first instruction unit) that is pressed when instructing display of a normal image. Then, it is determined that the display of the normal image is instructed when the soft key is operated. 
If the display of the normal image is instructed via the operation unit 14 in step 302, the determination is affirmed and the process proceeds to step 306. If the display of the normal image is not instructed via the operation unit 14 in step 302, the determination is negative and the process proceeds to step 308.
 ステップ306では、CPU12により、画像処理部28に対して通常画像を表示制御部36A,36Bに出力させると共にスプリットイメージを廃棄させる制御が行われ、その後、ステップ310へ移行する。本ステップ306がCPU12によって実行されることで、画像処理部28は、生成した通常画像を表示制御部36A,36Bに出力し、生成したスプリットイメージを廃棄する。表示制御部36Aは、画像処理部28から通常画像が入力された場合、入力された通常画像を表示部213に出力することで表示部213に通常画像を表示させる。この場合、表示部213は、画面の全領域に通常画像を表示する。また、表示制御部36Bは、画像処理部28から通常画像が入力された場合、入力された通常画像をLCD247に出力することでLCD247に通常画像を表示させる。この場合、LCD247は、画面の全領域に通常画像を表示する。 In step 306, the CPU 12 controls the image processing unit 28 to output the normal image to the display control units 36A and 36B and to discard the split image, and then the process proceeds to step 310. When this step 306 is executed by the CPU 12, the image processing unit 28 outputs the generated normal image to the display control units 36A and 36B and discards the generated split image. When a normal image is input from the image processing unit 28, the display control unit 36A outputs the input normal image to the display unit 213, thereby causing the display unit 213 to display the normal image. In this case, the display unit 213 displays the normal image over the entire area of the screen. Similarly, when a normal image is input from the image processing unit 28, the display control unit 36B outputs the input normal image to the LCD 247, thereby causing the LCD 247 to display the normal image. In this case, the LCD 247 displays the normal image over the entire area of the screen.
 ステップ308では、CPU12により、画像処理部28に対して通常画像及びスプリットイメージを表示制御部36A,36Bに出力させる制御が行われ、その後、ステップ310へ移行する。本ステップ308がCPU12によって実行されることで、画像処理部28は、生成した通常画像及びスプリットイメージを表示制御部36A,36Bに出力する。表示制御部36Aは、画像処理部28から通常画像及びスプリットイメージが入力された場合、入力された通常画像及びスプリットイメージを表示部213に表示させる。この場合、表示部213は、一例として図11に示す通常画像の表示領域に通常画像を表示し、一例として図11に示すスプリットイメージの表示領域にスプリットイメージを表示する。また、表示制御部36Bは、画像処理部28から通常画像及びスプリットイメージが入力された場合、入力された通常画像及びスプリットイメージをLCD247に表示させる。この場合、LCD247は、一例として図11に示す通常画像の表示領域に通常画像を表示し、一例として図11に示すスプリットイメージの表示領域にスプリットイメージを表示する。 In step 308, the CPU 12 controls the image processing unit 28 to output the normal image and the split image to the display control units 36A and 36B, and then proceeds to step 310. As the step 308 is executed by the CPU 12, the image processing unit 28 outputs the generated normal image and split image to the display control units 36A and 36B. When the normal image and the split image are input from the image processing unit 28, the display control unit 36A causes the display unit 213 to display the input normal image and split image. In this case, the display unit 213 displays a normal image in the normal image display area shown in FIG. 11 as an example, and displays the split image in the split image display area shown in FIG. 11 as an example. In addition, when the normal image and the split image are input from the image processing unit 28, the display control unit 36B causes the LCD 247 to display the input normal image and split image. In this case, the LCD 247 displays the normal image in the normal image display area shown in FIG. 11 as an example, and displays the split image in the split image display area shown in FIG. 11 as an example.
 ステップ312では、CPU12により上記ステップ306と同様の処理が行われ、その後、ステップ310へ移行する。ステップ310では、CPU12により、本画像出力処理を終了する条件(終了条件)を満足したか否かが判定される。ここで言う「終了条件」としては、例えば操作部14を介して本画像出力処理を終了する指示が与えられたとの条件や画像処理部28により通常画像及びスプリットイメージが生成されなくなったとの条件が挙げられる。本ステップ310において終了条件を満足しない場合はステップ300へ戻る。本ステップ310において終了条件を満足した場合は判定が肯定されて本画像出力処理を終了する。 In step 312, the CPU 12 performs the same processing as in step 306, and then proceeds to step 310. In step 310, the CPU 12 determines whether or not a condition (end condition) for ending the main image output process is satisfied. Examples of the “end condition” here include a condition that an instruction to end the main image output process is given via the operation unit 14 and a condition that the normal image and the split image are no longer generated by the image processing unit 28. Can be mentioned. If the end condition is not satisfied in step 310, the process returns to step 300. If the end condition is satisfied in step 310, the determination is affirmed and the main image output process is ended.
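The branch structure of steps 300 to 312 above can be sketched as one pass of the process of FIG. 12. The function name and the boolean flags are illustrative assumptions; the sketch only reproduces the decision logic described in the text.

```python
def image_output_step(split_display_requested, normal_display_requested):
    """One pass of the image output process of FIG. 12.

    Returns which generated images are output to the display controllers.
    """
    if not split_display_requested:
        # Step 300: No -> step 312 (same processing as step 306).
        return ("normal",)
    if normal_display_requested:
        # Step 302: Yes -> step 306 (output normal image, discard split image).
        return ("normal",)
    # Step 302: No -> step 308 (output both normal image and split image).
    return ("normal", "split")
```

Note that the normal image is output in every branch; only the split image's output depends on the two instructions.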
 以上のように上記ステップ308がCPU12によって実行されると、一例として図13A及び図13Bに示すように表示部213やハイブリッドファインダー220にライブビュー画像が表示される。図13A及び図13Bに示す例では、一例として図11に示すスプリットイメージの表示領域に相当する画面中央の枠60の内側領域にスプリットイメージが表示されており、通常画像の表示領域に相当する枠60の外側領域に通常画像が表示されている。 As described above, when step 308 is executed by the CPU 12, a live view image is displayed on the display unit 213 and the hybrid viewfinder 220 as shown in FIGS. 13A and 13B as an example. In the examples shown in FIGS. 13A and 13B, the split image is displayed in the area inside the frame 60 at the center of the screen, which corresponds to the split image display area shown as an example in FIG. 11, and the normal image is displayed in the area outside the frame 60, which corresponds to the normal image display area.
 すなわち、第1及び第2の画素群は、枠60のサイズに対応して設けられている。スプリットイメージは、第1の画素群から出力された第1の画像に対応する左目画像における枠60の上半分60Aの画像(視差画像)と、第2の画素群から出力された第2の画像に対応する右目画像における枠60の下半分60Bの画像(視差画像)とに大別される。 That is, the first and second pixel groups are provided corresponding to the size of the frame 60. The split image is roughly divided into an image (parallax image) of the upper half 60A of the frame 60 in the left-eye image corresponding to the first image output from the first pixel group, and an image (parallax image) of the lower half 60B of the frame 60 in the right-eye image corresponding to the second image output from the second pixel group.
 ここで、枠60内の画像に対応する被写体に対して、撮影レンズ16のピントが合っていない場合は、図13Aに示すようにスプリットイメージの上半分60Aの視差画像と、下半分60Bの視差画像との境界の画像が視差発生方向(一例として水平方向)にずれる。また、通常画像とスプリットイメージとの境界の画像も視差発生方向にずれる。これは、位相差が生じていることを表しており、撮影者はスプリットイメージを通して視覚的に位相差が生じていること及び視差発生方向を認識することができる。 Here, when the photographing lens 16 is not focused on the subject corresponding to the image in the frame 60, the image at the boundary between the parallax image of the upper half 60A of the split image and the parallax image of the lower half 60B is shifted in the parallax generation direction (for example, the horizontal direction), as shown in FIG. 13A. The image at the boundary between the normal image and the split image is also shifted in the parallax generation direction. This indicates that a phase difference has occurred, and through the split image the photographer can visually recognize that a phase difference has occurred and the direction in which the parallax is generated.
 一方、枠60内の画像に対応する被写体に対して、撮影レンズ16のピントが合っている場合は、図13Bに示すようにスプリットイメージの上半分60Aの視差画像と、下半分60Bの視差画像との境界の画像が一致する。また、通常画像とスプリットイメージとの境界の画像も一致する。これは、位相差が生じていないことを表しており、撮影者はスプリットイメージを通して視覚的に位相差が生じていないことを認識することができる。 On the other hand, when the photographing lens 16 is focused on the subject corresponding to the image in the frame 60, the images at the boundary between the parallax image of the upper half 60A of the split image and the parallax image of the lower half 60B coincide, as shown in FIG. 13B. The images at the boundary between the normal image and the split image also coincide. This indicates that no phase difference has occurred, and the photographer can visually recognize through the split image that no phase difference has occurred.
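The boundary shift that the photographer judges by eye can also be estimated numerically. The specification gives no shift-estimation algorithm, so the following is only an illustrative sketch: a brute-force sum-of-squared-differences search over horizontal offsets between two 1-D luminance profiles taken just above and below the split boundary. The sign of the returned shift depends on orientation conventions; a value of zero corresponds to the in-focus state of FIG. 13B.

```python
import numpy as np

def horizontal_shift(upper_row, lower_row, max_shift=8):
    """Estimate the horizontal displacement between two luminance profiles
    from the upper-half and lower-half parallax images at the boundary."""
    upper = np.asarray(upper_row, dtype=np.float32)
    lower = np.asarray(lower_row, dtype=np.float32)
    best, best_err = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        # Overlapping windows of the two rows at relative offset s.
        a = upper[max(0, s):len(upper) + min(0, s)]
        b = lower[max(0, -s):len(lower) + min(0, -s)]
        err = np.mean((a - b) ** 2)
        if err < best_err:
            best, best_err = s, err
    return best
```

An offset of zero error at a nonzero shift indicates a phase difference in the parallax generation direction, analogous to the visible misalignment in FIG. 13A.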
 このように、撮影者は、表示装置に表示されるスプリットイメージにより撮影レンズ16の合焦状態を確認することができる。また、マニュアルフォーカスモード時には、撮影レンズ16のフォーカスリング302を手動操作することによりピントのずれ量(デフォーカス量)をゼロにすることができる。また、通常画像とスプリットイメージとをそれぞれ色ずれのないカラー画像で表示することができ、撮影者の手動によるフォーカス調整をカラーのスプリットイメージで支援することができる。 As described above, the photographer can check the in-focus state of the photographing lens 16 by the split image displayed on the display device. In the manual focus mode, the focus shift amount (defocus amount) can be reduced to zero by manually operating the focus ring 302 of the photographing lens 16. Further, the normal image and the split image can be displayed as color images without color misregistration, and the manual focus adjustment by the photographer can be supported by the color split image.
 また、表示装置は、スプリットイメージの表示が指示されている状態で、通常画像の表示が指示された場合はスプリットイメージを表示せず、通常画像の表示が指示されない場合は通常画像及びスプリットイメージを表示する。通常画像の表示が指示されない場合とは例えば通常画像の表示指示が解除された場合を指す。従って、本実施形態に係る撮像装置100は、本構成を有しない場合に比べ、スプリットイメージの表示と非表示とを適切な時期に切り替えることができる。 In addition, while display of the split image is instructed, the display device does not display the split image when display of the normal image is instructed, and displays the normal image and the split image when display of the normal image is not instructed. The case where display of the normal image is not instructed refers, for example, to the case where the instruction to display the normal image has been canceled. Therefore, the imaging apparatus 100 according to the present embodiment can switch between displaying and hiding the split image at an appropriate time, compared with a configuration that does not have this feature.
 また、本第1実施形態では、画像処理部28から表示制御部36に通常画像及びスプリットイメージを出力する態様を例示したが、外部I/F39に接続されている外部装置に通常画像及びスプリットイメージを外部I/F39を介して出力してもよい。この場合、外部装置が記憶装置(例えばサーバ装置)であれば、記憶装置に対して通常画像及びスプリットイメージを記憶させることができる。また、外部装置が外部ディスプレイであれば、外部ディスプレイに対して、上述した表示装置と同様に通常画像及びスプリットイメージを表示させることができる。 In the first embodiment, the mode in which the normal image and the split image are output from the image processing unit 28 to the display control unit 36 has been exemplified; however, the normal image and the split image may instead be output via the external I/F 39 to an external device connected to the external I/F 39. In this case, if the external device is a storage device (for example, a server device), the normal image and the split image can be stored in the storage device. If the external device is an external display, the normal image and the split image can be displayed on the external display in the same manner as on the display device described above.
 以上に説明したように、本第1実施形態に係る撮像装置100は、撮影レンズ16の射出瞳を通過する光束の光軸の左側及び右側(第1及び第2の領域を通過した被写体像の一例)が瞳分割されてそれぞれ結像される第1及び第2の画素群を含む。また、生成部の一例である画像処理部28は、第1及び第2の画素群を有する撮像素子20から出力された画像に基づいて第1の表示用画像の一例である通常画像を生成する。また、操作部14が、通常画像の表示を指示すると共にスプリットイメージの表示を指示する(ステップ300,302)。また、操作部14によりスプリットイメージの表示が指示されている状態で操作部14により通常画像の表示が指示された場合、画像処理部28が、生成したスプリットイメージを出力せずに、生成した通常画像を出力する(ステップ306)。また、操作部14によりスプリットイメージの表示が指示されている状態で操作部14により通常画像の出力指示が解除された場合、画像処理部28が、生成したスプリットイメージ及び通常画像を出力する(ステップ308)。 As described above, the imaging apparatus 100 according to the first embodiment includes first and second pixel groups on which the left and right sides, relative to the optical axis, of the light beam passing through the exit pupil of the photographing lens 16 (an example of subject images that have passed through first and second regions) are respectively imaged after pupil division. The image processing unit 28, which is an example of a generation unit, generates a normal image, which is an example of a first display image, based on the image output from the image sensor 20 having the first and second pixel groups. The operation unit 14 instructs display of the normal image and also instructs display of the split image (steps 300 and 302). When display of the normal image is instructed by the operation unit 14 while display of the split image is instructed by the operation unit 14, the image processing unit 28 outputs the generated normal image without outputting the generated split image (step 306). When the instruction to output the normal image is canceled through the operation unit 14 while display of the split image is instructed by the operation unit 14, the image processing unit 28 outputs the generated split image and normal image (step 308).
When the normal image and the split image are input, the display control unit 36 controls the display device to display the normal image and to display the split image within the display area of the normal image. Thereby, the display and non-display of the split image can be switched at an appropriate time compared with a configuration that does not have this feature. Further, since the split image is output when the instruction to display the normal image via the operation unit 14 is canceled (step 302: N), the split image can be displayed more quickly than in a configuration that does not have this feature.
 また、本第1実施形態に係る撮像装置100では、通常画像の表示の指示として、非保持型の指示を適用している。これにより、本構成を有しない場合に比べ、スプリットイメージの表示と非表示とを迅速に切り替えることができる。 Further, in the imaging apparatus 100 according to the first embodiment, a non-holding type instruction is applied as an instruction to display a normal image. Thereby, the display and non-display of a split image can be switched quickly compared with the case where this configuration is not provided.
 また、本第1実施形態に係る撮像装置100では、スプリットイメージの表示の指示として、保持型の指示を適用している。これにより、本構成を有しない場合に比べ、スプリットイメージの表示と非表示とを切り替える手間を軽減することができる。 Further, in the imaging apparatus 100 according to the first embodiment, a holding type instruction is applied as an instruction to display a split image. Thereby, compared with the case where this configuration is not provided, it is possible to reduce the trouble of switching between split image display and non-display.
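The two instruction types above, the non-holding (momentary) instruction for the normal image and the holding (alternate) instruction for the split image, differ only in how their state persists. A minimal sketch, with class and attribute names chosen for illustration only:

```python
class SplitImageToggle:
    """Holding-type (alternate-action) instruction: each press flips
    the state, which is then held until the next press."""
    def __init__(self):
        self.display_requested = False

    def press(self):
        self.display_requested = not self.display_requested


class NormalImageMomentary:
    """Non-holding (momentary) instruction: active only while pressed."""
    def __init__(self):
        self.display_requested = False

    def press(self):
        self.display_requested = True

    def release(self):
        self.display_requested = False
```

The momentary behavior is what allows the split image to reappear as soon as the normal-image button is released, while the toggle keeps split-image display requested without the operator holding anything down.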
 また、本第1実施形態に係る撮像装置100では、撮像素子20が、撮影レンズ16を透過した被写体像が瞳分割されずに結像されて第3の画像を出力する第3の画素群を有する。また、画像処理部28が、第3の画素群から出力された第3の画像に基づいて通常画像を生成する。これにより、本構成を有しない場合に比べ、通常画像の画質を高めることができる。 Further, in the imaging apparatus 100 according to the first embodiment, the imaging device 20 includes a third pixel group that outputs a third image by forming a subject image that has passed through the photographing lens 16 without being divided into pupils. Have. In addition, the image processing unit 28 generates a normal image based on the third image output from the third pixel group. Thereby, compared with the case where this structure is not provided, the image quality of a normal image can be improved.
 なお、上記第1実施形態では、2×2画素のGフィルタに対して単一の位相差画素を配置する形態例を挙げて説明したが、これに限らず、例えば2×2画素のGフィルタに対して一対の第1画素L及び第2画素Rを配置してもよい。例えば図14に示すように、2×2画素のGフィルタに対して水平方向に隣接した一対の第1画素L及び第2画素Rを配置してもよい。また、例えば図15に示すように、2×2画素のGフィルタに対して垂直方向に隣接した一対の第1画素L及び第2画素Rを配置してもよい。いずれの場合も上記第1実施形態で説明したように、第1画素L及び第2画素Rの位置を、第1の画素群と第2の画素群との間で垂直方向及び水平方向の少なくとも一方について所定画素数内で揃えることが好ましい。なお、図14及び図15には、第1画素Lと第2画素Rとを、第1の画素群と第2の画素群との間で垂直方向及び水平方向の各々についての位置を1画素内で揃えた位置に配置した例が示されている。 In the first embodiment described above, an example in which a single phase difference pixel is arranged for a 2 × 2 pixel G filter has been described; however, the present invention is not limited to this, and, for example, a pair of a first pixel L and a second pixel R may be arranged for the 2 × 2 pixel G filter. For example, as shown in FIG. 14, a pair of a first pixel L and a second pixel R adjacent in the horizontal direction may be arranged for the 2 × 2 pixel G filter. Alternatively, as shown in FIG. 15, a pair of a first pixel L and a second pixel R adjacent in the vertical direction may be arranged for the 2 × 2 pixel G filter. In either case, as described in the first embodiment, it is preferable that the positions of the first pixel L and the second pixel R be aligned within a predetermined number of pixels, between the first pixel group and the second pixel group, in at least one of the vertical and horizontal directions. FIGS. 14 and 15 show an example in which the first pixel L and the second pixel R are arranged at positions aligned within one pixel in each of the vertical and horizontal directions between the first pixel group and the second pixel group.
 また、上記第1実施形態では、基本配列パターンCを有するカラーフィルタ21を例示したが、本発明はこれに限定されるものではない。例えば図16~18に示すようにカラーフィルタの原色(Rフィルタ,Gフィルタ,Bフィルタ)の配列をベイヤ配列としてもよい。図16~図18に示す例では、Gフィルタに対して位相差画素が配置されている。 In the first embodiment, the color filter 21 having the basic arrangement pattern C is exemplified, but the present invention is not limited to this. For example, as shown in FIGS. 16 to 18, the arrangement of the primary colors (R filter, G filter, B filter) of the color filter may be a Bayer arrangement. In the examples shown in FIGS. 16 to 18, phase difference pixels are arranged for the G filter.
 一例として図16に示すカラーフィルタ21Aは、3×3画素の正方行列の四隅及び中央がGフィルタとされた配列パターンG1の中央に位相差画素が配置されている。また、第1の画素Lと第2の画素Rとが水平方向及び垂直方向の各々について1画素分のGフィルタを飛ばして(1画素分のGフィルタを間に置いて)交互に配置されている。また、第1画素Lと第2画素Rとが、第1の画素群と第2の画素群との間で垂直方向及び水平方向の各々についての位置を1画素内で揃えた位置に配置されている。これにより、配列パターンG1の中央の位相差画素に基づく画像は配列パターンG1の四隅の通常画素に基づく画像を利用して補間可能となるため、本構成を有しない場合に比べ、補間精度を向上させることができる。 As an example, in the color filter 21A shown in FIG. 16, a phase difference pixel is arranged at the center of an arrangement pattern G1 in which the four corners and the center of a 3 × 3 pixel square matrix are G filters. The first pixels L and the second pixels R are arranged alternately, skipping one pixel's worth of G filter in each of the horizontal and vertical directions (with one pixel's worth of G filter in between). The first pixel L and the second pixel R are arranged at positions aligned within one pixel in each of the vertical and horizontal directions between the first pixel group and the second pixel group. As a result, an image based on the phase difference pixel at the center of the arrangement pattern G1 can be interpolated using images based on the normal pixels at the four corners of the arrangement pattern G1, so that the interpolation accuracy can be improved compared with a configuration that does not have this feature.
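The specification states that the center phase-difference pixel of the 3 × 3 arrangement pattern G1 can be interpolated from the four corner normal pixels, but gives no formula; a plain average of the corners is assumed here purely for illustration.

```python
import numpy as np

def interpolate_center_from_corners(patch):
    """Interpolate the value at the center of a 3x3 patch (where the
    phase difference pixel sits) from the four corner G pixels."""
    patch = np.asarray(patch, dtype=np.float32)
    corners = [patch[0, 0], patch[0, 2], patch[2, 0], patch[2, 2]]
    return float(np.mean(corners))
```

Because the four corners of each pattern are ordinary G pixels that are never shared with another pattern, the interpolation never reuses a normal pixel already consumed by another phase-difference pixel, which is the property the text credits for the improved accuracy.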
 しかも、配列パターンG1の各々は互いに位置が重複していない。つまり、第1の画素L及び第2の画素Rは、第1及び第2の画素群に含まれる各画素に隣接する第3の画素群に含まれる画素による第3の画像で補間する第1及び第2の画像に含まれる画素が画素単位で重複しない位置に配置されている。そのため、位相差画素に基づく画像が他の位相差画素に基づく画像の補間で利用された通常画素に基づく画像で補間されることを回避することができる。よって、補間精度のより一層の向上が期待できる。 Moreover, the arrangement patterns G1 do not overlap one another in position. That is, the first pixels L and the second pixels R are arranged at positions where the pixels of the first and second images, which are interpolated using the third image from the pixels of the third pixel group adjacent to each pixel of the first and second pixel groups, do not overlap in pixel units. Therefore, it is possible to avoid a situation in which an image based on a phase difference pixel is interpolated using an image based on a normal pixel that has already been used for interpolating an image based on another phase difference pixel. Accordingly, a further improvement in interpolation accuracy can be expected.
 一例として図17に示すカラーフィルタ21Bは、配列パターンG1の中央及び図中正面視右下角に位相差画素が配置されている。また、第1の画素Lと第2の画素Rとが水平方向及び垂直方向の各々について2画素分のGフィルタを飛ばして(2画素分のGフィルタを間に置いて)交互に配置されている。これにより、第1画素Lと第2画素Rとが、第1の画素群と第2の画素群との間で垂直方向及び水平方向の各々についての位置を1画素内で揃えた位置に配置され、第1の画素L及び第2の画素Rを隣接させることができる。よって、ピントずれ以外の要因で画像ずれが発生するのを抑制することができる。 As an example, in the color filter 21B shown in FIG. 17, phase difference pixels are arranged at the center and at the lower-right corner, in front view in the drawing, of the arrangement pattern G1. The first pixels L and the second pixels R are arranged alternately, skipping two pixels' worth of G filters in each of the horizontal and vertical directions (with two pixels' worth of G filters in between). Thereby, the first pixel L and the second pixel R are arranged at positions aligned within one pixel in each of the vertical and horizontal directions between the first pixel group and the second pixel group, and the first pixel L and the second pixel R can be adjacent to each other. Therefore, it is possible to suppress the occurrence of image shift due to factors other than defocus.
 しかも、各位相差画素には同色のフィルタ(Gフィルタ)が設けられた通常画素が隣接するので、補間精度を向上させることができる。その上、配列パターンG1の各々は互いに位置が重複していない。つまり、第1の画素L及び第2の画素Rは、第1及び第2の画素群に含まれる各画素に隣接する第3の画素群に含まれる画素による第3の画像で補間する第1及び第2の画像に含まれる画素が一対の画素単位で重複しない位置に配置されている。ここで言う「一対の画素」とは、例えば各配列パターンG1に含まれる第1の画素L及び第2の画素R(一対の位相差画素)を指す。そのため、一対の位相差画素に基づく画像が他の一対の位相差画素に基づく画像の補間で利用された通常画素に基づく画像で補間されることを回避することができる。よって、補間精度のより一層の向上が期待できる。 In addition, since each phase difference pixel is adjacent to a normal pixel provided with a filter of the same color (G filter), the interpolation accuracy can be improved. Furthermore, the arrangement patterns G1 do not overlap one another in position. That is, the first pixels L and the second pixels R are arranged at positions where the pixels of the first and second images, which are interpolated using the third image from the pixels of the third pixel group adjacent to each pixel of the first and second pixel groups, do not overlap in units of pairs of pixels. Here, a "pair of pixels" refers, for example, to the first pixel L and the second pixel R (a pair of phase difference pixels) included in each arrangement pattern G1. Therefore, it is possible to avoid a situation in which an image based on a pair of phase difference pixels is interpolated using an image based on a normal pixel that has already been used for interpolating an image based on another pair of phase difference pixels. Accordingly, a further improvement in interpolation accuracy can be expected.
 一例として図18に示すカラーフィルタ21Cは、配列パターンG1の中央に第1の画素Lが、図中正面視右下角に第2の画素Rが配置されている。また、第1の画素Lは水平方向及び垂直方向の各々について2画素分のGフィルタを飛ばして配置されており、第2の画素Rも水平方向及び垂直方向の各々について2画素分のGフィルタを飛ばして配置されている。これにより、第1画素Lと第2画素Rとが、第1の画素群と第2の画素群との間で垂直方向及び水平方向の各々についての位置を2画素内で揃えた位置に配置され、第1の画素L及び第2の画素Rを隣接させることができる。よって、ピントずれ以外の要因で画像ずれが発生するのを抑制することができる。 As an example, in the color filter 21C shown in FIG. 18, a first pixel L is arranged at the center of the arrangement pattern G1, and a second pixel R is arranged at the lower-right corner in front view in the drawing. The first pixels L are arranged with two pixels' worth of G filters skipped in each of the horizontal and vertical directions, and the second pixels R are likewise arranged with two pixels' worth of G filters skipped in each of the horizontal and vertical directions. Thereby, the first pixel L and the second pixel R are arranged at positions aligned within two pixels in each of the vertical and horizontal directions between the first pixel group and the second pixel group, and the first pixel L and the second pixel R can be adjacent to each other. Therefore, it is possible to suppress the occurrence of image shift due to factors other than defocus.
 しかも、図18に示す例においても図17に示す例と同様に配列パターンG1の各々は互いに位置が重複していない。そのため、位相差画素に基づく画像が他の位相差画素に基づく画像の補間で利用された通常画素に基づく画像で補間されることを回避することができる。よって、補間精度のより一層の向上が期待できる。 Moreover, in the example shown in FIG. 18, as in the example shown in FIG. 17, the arrangement patterns G1 do not overlap one another in position. Therefore, it is possible to avoid a situation in which an image based on a phase difference pixel is interpolated using an image based on a normal pixel that has already been used for interpolating an image based on another phase difference pixel. Accordingly, a further improvement in interpolation accuracy can be expected.
 また、カラーフィルタの他の構成例としては例えば図19に示すカラーフィルタ21Dが挙げられる。図19には撮像素子20に設けられているカラーフィルタ21Dの原色(Rフィルタ,Gフィルタ,Bフィルタ)の配列及び遮光部材の配置の一例が模式的に示されている。図19に示すように撮像素子20の構成画素は、2類型の画素群に分類される。2類型の画素群の一例としては、第4の画素群の一例であるA面の画素群及び第5の画素群の一例であるB面の画素群が挙げられる。A面の画素群は第1の画素群を含み、B面の画素群は第2の画素群を含む。また、A面の画素群及びB面の画素群の各々は、それぞれ第3の画素群を含む。A面の画素群及びB面の画素群は、撮像素子20における構成画素が水平方向及び垂直方向について交互に配置された画素群である。A面の画素群及びB面の画素群の各々にはカラーフィルタ21Dによりベイヤ配列の原色が割り当てられており、A面の画素群とB面の画素群とは、水平方向及び垂直方向に互いに半画素分(半ピッチ)ずれて配置されている。 As another configuration example of the color filter, a color filter 21D shown in FIG. 19 can be cited, for example. FIG. 19 schematically shows an example of the arrangement of the primary colors (R filter, G filter, B filter) of the color filter 21D provided in the image sensor 20 and the arrangement of the light-shielding members. As shown in FIG. 19, the constituent pixels of the image sensor 20 are classified into two types of pixel groups: a pixel group on the A plane, which is an example of a fourth pixel group, and a pixel group on the B plane, which is an example of a fifth pixel group. The A-plane pixel group includes the first pixel group, and the B-plane pixel group includes the second pixel group. Each of the A-plane pixel group and the B-plane pixel group also includes a third pixel group. The A-plane pixel group and the B-plane pixel group are pixel groups in which the constituent pixels of the image sensor 20 are arranged alternately in the horizontal and vertical directions. Primary colors in a Bayer arrangement are assigned to each of the A-plane and B-plane pixel groups by the color filter 21D, and the A-plane pixel group and the B-plane pixel group are shifted from each other by half a pixel (half pitch) in the horizontal and vertical directions.
 図20には、図19に示す撮像素子20における第1の画素群及び第2の画素群の一例が模式的に示されている。図19に示すように、第1の画素群及び第2の画素群の各々における位相差画素のRフィルタ、Gフィルタ及びBフィルタの配列は、ベイヤ配列とされている。第1の画素L及び第2の画素Rは、一例として図19に示すように、互いに隣接して(最小ピッチで)対になって配設されている。これにより、第1の画素群と第2の画素群との位相差は、本構成を有しない場合に比べ、高精度に算出される。 FIG. 20 schematically shows an example of the first pixel group and the second pixel group in the image sensor 20 shown in FIG. As shown in FIG. 19, the arrangement of the R filter, G filter, and B filter of the phase difference pixels in each of the first pixel group and the second pixel group is a Bayer arrangement. As shown in FIG. 19 as an example, the first pixel L and the second pixel R are disposed adjacent to each other (with a minimum pitch) in pairs. Thereby, the phase difference between the first pixel group and the second pixel group is calculated with higher accuracy than in the case where the present configuration is not provided.
 なお、図19及び図20に示す例では、第1の画素群及び第2の画素群の各々における位相差画素のRフィルタ、Gフィルタ及びBフィルタの配列がベイヤ配列とされているが、本発明はこれに限定されるものではない。例えば図21に示すように2画素分のGフィルタについて第1の画素群における一部の第1の画素Lと第2の画素群における一部の第2の画素Rとを隣接させて配置してもよい。Gフィルタが設けられた画素は他色のフィルタが設けられた画素に比べ感度が良いため、補間精度を高めることができ、しかも、Gフィルタが設けられた画素は、他色のフィルタに比べ、Gフィルタに連続性があるため、補間がし易い。 In the examples shown in FIGS. 19 and 20, the arrangement of the R, G, and B filters of the phase difference pixels in each of the first and second pixel groups is a Bayer array, but the present invention is not limited to this. For example, as shown in FIG. 21, with respect to the G filters for two pixels, some of the first pixels L in the first pixel group and some of the second pixels R in the second pixel group may be arranged adjacent to each other. Since a pixel provided with a G filter has better sensitivity than a pixel provided with a filter of another color, the interpolation accuracy can be increased; moreover, because the G filters have continuity compared with the filters of the other colors, pixels provided with G filters are easy to interpolate.
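As a hedged illustration of the interpolation discussed above, the following sketch estimates a display value for a phase-difference pixel by averaging neighboring same-color normal pixels. This is a hypothetical simplification, not the procedure specified in the patent; the function name and the choice of neighbors are assumptions.

```python
def interpolate_phase_pixel(img, y, x, neighbor_offsets):
    # img: 2D list of pixel values of one color plane.
    # (y, x): position of a phase-difference pixel to be interpolated.
    # neighbor_offsets: relative positions of same-color normal pixels to
    # average (a hypothetical choice, not taken from the patent).
    vals = [img[y + dy][x + dx] for dy, dx in neighbor_offsets]
    return sum(vals) / len(vals)
```

For G pixels, same-color neighbors are typically available at the minimum pitch in several directions, which is one way to read the statement that the continuity of the G filters makes interpolation easier.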
 また、上記第1実施形態では、上下方向に2分割されたスプリットイメージを例示したが、これに限らず、左右方向又は斜め方向に複数分割された画像をスプリットイメージとして適用してもよい。 In the first embodiment, a split image divided into two in the vertical direction has been exemplified; however, the present invention is not limited to this, and an image divided into a plurality of parts in the horizontal direction or an oblique direction may be applied as the split image.
 例えば、図22に示すスプリットイメージ66aは、水平方向に平行な複数の分割線63aにより奇数ラインと偶数ラインとに分割されている。このスプリットイメージ66aでは、第1の画素群から出力された出力信号に基づいて生成されたライン状(一例として短冊状)の位相差画像66Laが奇数ライン(偶数ラインでも可)に表示される。また、第2の画素群から出力された出力信号に基づき生成されたライン状(一例として短冊状)の位相差画像66Raが偶数ラインに表示される。 For example, the split image 66a shown in FIG. 22 is divided into odd lines and even lines by a plurality of dividing lines 63a parallel to the horizontal direction. In this split image 66a, a line-shaped (for example, strip-shaped) phase difference image 66La generated based on the output signal output from the first pixel group is displayed on the odd lines (the even lines are also acceptable). A line-shaped (for example, strip-shaped) phase difference image 66Ra generated based on the output signal output from the second pixel group is displayed on the even lines.
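The odd/even-line layout above can be sketched as follows. This is a minimal hypothetical illustration; the function name, argument convention, and band size are assumptions, not part of the patent.

```python
def compose_split_image(left_img, right_img, band_height=1):
    # left_img / right_img: equal-sized 2D lists of rows generated from the
    # first and second pixel groups. Alternating horizontal bands are taken
    # from each image; each band boundary corresponds to a dividing line 63a.
    assert len(left_img) == len(right_img)
    out = []
    for y, (l_row, r_row) in enumerate(zip(left_img, right_img)):
        use_left = (y // band_height) % 2 == 0
        out.append(list(l_row if use_left else r_row))
    return out
```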
 また、図23に示すスプリットイメージ66bは、水平方向に傾き角を有する分割線63b(例えば、スプリットイメージ66bの対角線)により2分割されている。このスプリットイメージ66bでは、第1の画素群から出力された出力信号に基づき生成された位相差画像66Lbが一方の領域に表示される。また、第2の画素群から出力された出力信号に基づき生成された位相差画像66Rbが他方の領域に表示される。 Further, the split image 66b shown in FIG. 23 is divided into two by a dividing line 63b (for example, a diagonal line of the split image 66b) having an inclination angle in the horizontal direction. In the split image 66b, the phase difference image 66Lb generated based on the output signal output from the first pixel group is displayed in one area. Further, the phase difference image 66Rb generated based on the output signal output from the second pixel group is displayed in the other region.
 また、図24A及び図24Bに示すスプリットイメージ66cは、水平方向及び垂直方向にそれぞれ平行な格子状の分割線63cにより分割されている。スプリットイメージ66cでは、第1の画素群から出力された出力信号に基づき生成された位相差画像66Lcが市松模様(チェッカーパターン)状に並べられて表示される。また、第2の画素群から出力された出力信号に基づき生成された位相差画像66Rcが市松模様状に並べられて表示される。 Also, the split image 66c shown in FIGS. 24A and 24B is divided by grid-like dividing lines 63c parallel to the horizontal direction and the vertical direction, respectively. In the split image 66c, the phase difference image 66Lc generated based on the output signal output from the first pixel group is displayed in a checkered pattern (checker pattern). Further, the phase difference image 66Rc generated based on the output signal output from the second pixel group is displayed in a checkered pattern.
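The checkered arrangement of FIGS. 24A and 24B can be sketched in the same spirit. This is a hypothetical sketch; the cell size and names are assumed, and the patent does not prescribe this code.

```python
def compose_checker_split(left_img, right_img, cell):
    # Pixels fall into a checkerboard of cell x cell blocks. Blocks whose
    # block-coordinate parity is even come from the first-pixel-group image
    # and the others from the second-pixel-group image; the grid-like
    # dividing lines 63c lie on the block boundaries.
    out = []
    for y, (l_row, r_row) in enumerate(zip(left_img, right_img)):
        row = []
        for x, (lp, rp) in enumerate(zip(l_row, r_row)):
            use_left = ((y // cell) + (x // cell)) % 2 == 0
            row.append(lp if use_left else rp)
        out.append(row)
    return out
```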
 更に、スプリットイメージに限らず、2つの位相差画像から他の合焦確認画像を生成し、合焦確認画像を表示するようにしてもよい。例えば、2つの位相差画像を重畳して合成表示し、ピントがずれている場合は2重像として表示され、ピントが合った状態ではクリアに画像が表示されるようにしてもよい。 Furthermore, not limited to the split image, another focus confirmation image may be generated from the two phase difference images, and the focus confirmation image may be displayed. For example, two phase difference images may be superimposed and displayed as a composite image. If the image is out of focus, the image may be displayed as a double image, and the image may be clearly displayed when the image is in focus.
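One simple way to realize the superimposed focus-confirmation display described above is an equal-weight blend of the two phase difference images. This is a hypothetical sketch, not the patent's specified method; out-of-focus regions show doubled edges because the two pupil images are laterally shifted.

```python
def blend_focus_confirmation(left_img, right_img):
    # Average the two phase difference images pixel by pixel. Where the
    # images coincide (in focus) the result looks clear; where they are
    # shifted (out of focus) edges appear as a double image.
    return [[(lp + rp) / 2 for lp, rp in zip(l_row, r_row)]
            for l_row, r_row in zip(left_img, right_img)]
```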
 [第2実施形態]
 上記第1実施形態では、通常画像の表示が指示されたか否かによって画像処理部28の出力内容を変更したが、本第2実施形態では、レリーズスイッチ211の操作状態に応じて画像処理部28の出力内容を変更する場合について説明する。また、本第2実施形態では、レリーズスイッチ211の一例として非保持型スイッチを適用している。非保持型スイッチの一例としては、モーメンタリ動作型のスイッチが挙げられる。また、レリーズスイッチ211に代えて、ソフトウェア構成によりレリーズスイッチ211と同様の機能を実現するソフトキーを表示部213に表示させ、表示されたソフトキーをタッチパネルを介してユーザに操作させてもよい。なお、以下では、上記第1実施形態と異なる箇所について説明する。また、同一の構成部材については同一の符号を付し、その説明を省略する。
[Second Embodiment]
In the first embodiment, the output content of the image processing unit 28 is changed depending on whether or not display of the normal image has been instructed. In the second embodiment, a case will be described in which the output content of the image processing unit 28 is changed according to the operation state of the release switch 211. In the second embodiment, a non-holding type switch is applied as an example of the release switch 211. An example of the non-holding type switch is a momentary operation type switch. Instead of the release switch 211, a soft key that realizes the same function as the release switch 211 by a software configuration may be displayed on the display unit 213, and the user may operate the displayed soft key via the touch panel. In the following, differences from the first embodiment will be described. The same constituent members are denoted by the same reference numerals, and their description will be omitted.
 図25には、本第2実施形態に係る画像出力処理の流れの一例を示すフローチャートが示されている。なお、以下では、図12に示すフローチャートと異なるステップを説明し、同一のステップについては同一のステップ番号を付してその説明を省略する。図25に示すフローチャートは、図12に示すフローチャートに比べ、ステップ302に代えてステップ302Aを適用した点及びステップ350~360を新たに設けた点が異なっている。 FIG. 25 is a flowchart showing an example of the flow of image output processing according to the second embodiment. In the following, steps different from the flowchart shown in FIG. 12 will be described, and the same steps will be denoted by the same step numbers and description thereof will be omitted. The flowchart shown in FIG. 25 differs from the flowchart shown in FIG. 12 in that step 302A is applied instead of step 302 and that steps 350 to 360 are newly provided.
 ステップ302Aでは、CPU12により、レリーズスイッチ211が半押し状態とされたか(例えば半押し状態で保持されたか)否かが判定される。本ステップ302Aにおいてレリーズスイッチ211が半押し状態とされた場合は判定が肯定されてステップ306へ移行する。本ステップ302Aにおいてレリーズスイッチ211が半押し状態とされていない場合(例えば待機位置に保持された場合)は判定が否定されてステップ308へ移行する。ここで言う「レリーズスイッチ211が半押し状態とされていない場合」とは、例えばレリーズスイッチ211の押圧操作が行われていない場合が挙げられる。なお、ここでは、レリーズスイッチ211が半押し状態とされたか否かを判定する例を挙げて説明したが、本発明はこれに限定されるものではなく、レリーズスイッチ211が半押し状態とされて合焦状態となったか否かを判定してもよい。つまり、この場合、レリーズスイッチ211が半押し状態とされて合焦状態となった場合が、本発明に係る第1指示部による第1の表示用画像の表示が指示された場合とされる。 In step 302A, the CPU 12 determines whether or not the release switch 211 has been brought into a half-pressed state (for example, held in the half-pressed state). If the release switch 211 has been half-pressed in step 302A, the determination is affirmative and the process proceeds to step 306. If the release switch 211 has not been half-pressed in step 302A (for example, if it is held at the standby position), the determination is negative and the process proceeds to step 308. Here, "the case where the release switch 211 is not half-pressed" includes, for example, the case where no pressing operation is performed on the release switch 211. Although an example has been described here in which it is determined whether or not the release switch 211 has been half-pressed, the present invention is not limited to this; it may instead be determined whether the release switch 211 has been half-pressed and an in-focus state has been reached. In other words, in this case, the case where the release switch 211 is half-pressed and an in-focus state is reached corresponds to the case where display of the first display image is instructed by the first instruction unit according to the present invention.
 一方、本第2実施形態に係る画像出力処理では、CPU12により、ステップ306の処理が行われた後、ステップ350へ移行する。ステップ350では、CPU12により、レリーズスイッチ211が全押し状態とされたか(例えば全押し状態で保持されたか)否かが判定される。本ステップ350においてレリーズスイッチ211が全押し状態とされた場合は判定が肯定されてステップ352へ移行する。本ステップ350においてレリーズスイッチ211が全押し状態とされていない場合(例えば半押し状態が保持された場合又は半押し状態が解除された場合)は判定が否定されてステップ302Aへ移行する。 On the other hand, in the image output process according to the second embodiment, after the process of step 306 is performed by the CPU 12, the process proceeds to step 350. In step 350, the CPU 12 determines whether or not the release switch 211 is fully pressed (for example, held in the fully pressed state). If the release switch 211 is fully pressed in step 350, the determination is affirmed and the routine proceeds to step 352. If the release switch 211 is not fully pressed in this step 350 (for example, if the half-pressed state is maintained or if the half-pressed state is released), the determination is negative and the routine proceeds to step 302A.
 ステップ352では、CPU12により、静止画像の撮像が開始されるように制御が行われる。そして、本ステップ352の処理に応じて静止画像の撮像が開始される。撮像が終了すると、ステップ354へ移行する。 In step 352, the CPU 12 performs control so as to start capturing a still image. Then, in accordance with the processing in step 352, still image capturing is started. When the imaging is completed, the process proceeds to step 354.
 ステップ354では、操作部14(一例としてレリーズスイッチ211とは異なる指示部)によるスプリットイメージの表示指示が解除されたか否かがCPU12により判定される。本ステップ354において操作部14を介して行われたスプリットイメージの表示指示が解除された場合は判定が肯定されてステップ356へ移行する。本ステップ354において操作部14を介して行われたスプリットイメージの表示指示が解除されていない場合は判定が否定されてステップ360へ移行する。 In step 354, the CPU 12 determines whether or not the split image display instruction by the operation unit 14 (an instruction unit different from the release switch 211 as an example) has been released. If the split image display instruction issued via the operation unit 14 in step 354 is canceled, the determination is affirmed and the process proceeds to step 356. If the split image display instruction given via the operation unit 14 in step 354 has not been canceled, the determination is negative and the routine proceeds to step 360.
 ステップ356では、CPU12により、上記ステップ306の処理と同様の処理が行われ、その後、ステップ310へ移行する。 In step 356, the CPU 12 performs the same processing as the processing in step 306, and then proceeds to step 310.
 ステップ360では、CPU12により上記ステップ308の処理と同様の処理が行われ、その後、ステップ310へ移行する。 In step 360, the CPU 12 performs the same processing as the processing in step 308, and then proceeds to step 310.
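The branching of steps 302A, 350, 354, 356, and 360 described above can be condensed into a small decision function. This is a hypothetical sketch of the FIG. 25 flow; the state names and return convention are inventions of this sketch, not part of the patent.

```python
def images_to_output(release_state, split_instruction_active):
    # release_state: 'idle' (not operated), 'half', or 'full'.
    # split_instruction_active: whether the split-image display instruction
    # issued via the operation unit 14 is still in effect (checked in 354).
    if release_state == 'half':
        return ('normal',)              # step 302A: Yes -> step 306
    if release_state == 'full':
        # steps 350/352: a still image is captured here, then step 354
        # checks whether the split-image instruction has been cancelled.
        if split_instruction_active:
            return ('normal', 'split')  # step 354: No  -> step 360
        return ('normal',)              # step 354: Yes -> step 356
    return ('normal', 'split')          # step 302A: No -> step 308
```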
 本第2実施形態に係る画像出力処理がCPU12によって行われると、レリーズスイッチ211が半押し位置(第1指示位置の一例)に保持された場合は表示装置にスプリットイメージが表示されずに通常画像が表示される。また、スプリットイメージが表示されていない状態でレリーズスイッチ211の操作状態が解除された場合(ステップ358:Y)にスプリットイメージが表示されるので、本構成を有しない場合に比べ、スプリットイメージを適切な時期に迅速に表示させることができる。 When the image output processing according to the second embodiment is performed by the CPU 12, if the release switch 211 is held at the half-pressed position (an example of the first instruction position), the normal image is displayed on the display device without the split image being displayed. Further, since the split image is displayed when the operation state of the release switch 211 is released while the split image is not displayed (step 358: Y), the split image can be displayed quickly at an appropriate time compared with the case where this configuration is not provided.
 また、本第2実施形態に係る画像出力処理によれば、レリーズスイッチ211が半押し位置に保持された状態が解除された場合に表示が切り替わる。すなわち、図25に示す例では、レリーズスイッチ211が全押し位置に保持された場合、撮像が実施され(ステップ352)、その後、操作部14を介して行われたスプリットイメージの表示指示が解除されることで(ステップ354:Y)、表示装置に対して通常画像を表示させる制御がCPU12によって行われる(ステップ356)。この場合、撮像後にスプリットイメージの表示を解除して通常画像によるライブビュー画像の表示に戻すことができるので(ステップ352,354,356)、例えば通常時はオートフォーカス(AF)を頼りにした撮像を行うことを望む一方で、自分の思い通りにピント調整を行いたい被写体の場合のみスプリットイメージによるマニュアルフォーカス調整を行いたい場面において、ユーザにとっての利便性の向上に寄与することができる。 Further, according to the image output processing of the second embodiment, the display is switched when the state in which the release switch 211 is held at the half-pressed position is released. That is, in the example shown in FIG. 25, when the release switch 211 is held at the fully-pressed position, imaging is performed (step 352), and thereafter the split-image display instruction issued via the operation unit 14 is cancelled (step 354: Y), whereupon the CPU 12 performs control to display the normal image on the display device (step 356). In this case, the split-image display can be cancelled after imaging to return to the live view display based on the normal image (steps 352, 354, and 356). This contributes to improved convenience for a user who normally wants to shoot relying on autofocus (AF) but wants to perform manual focus adjustment using the split image only for subjects whose focus the user wishes to adjust exactly as desired.
 なお、本第2実施形態に係る画像出力処理では、ステップ354において、判定が肯定されるとステップ356へ移行し、判定が否定されるとステップ360へ移行する例を挙げて説明したが、本発明はこれに限定されるものではない。例えば、ステップ354とステップ356との間にレリーズスイッチ211に対する操作が解除されたか否かをCPU12に判定させるステップ(追加ステップA)を挿入し、同様の処理を行うステップ(追加ステップB)をステップ354とステップ360との間に挿入してもよい。ここで言う「レリーズスイッチ211に対する操作」とは、例えばレリーズスイッチ211の全押し及び半押しを指す。すなわち、レリーズスイッチ211に対する操作が解除された状態とは、例えばレリーズスイッチ211が全押し状態でも半押し状態でもなく、レリーズスイッチ211が解放された状態を意味する。 In the image output processing according to the second embodiment, an example has been described in which the process proceeds to step 356 when the determination in step 354 is affirmative and to step 360 when the determination is negative, but the present invention is not limited to this. For example, a step (additional step A) of causing the CPU 12 to determine whether or not the operation on the release switch 211 has been released may be inserted between step 354 and step 356, and a step performing the same processing (additional step B) may be inserted between step 354 and step 360. The "operation on the release switch 211" here refers to, for example, full-pressing and half-pressing of the release switch 211. That is, the state in which the operation on the release switch 211 has been released means, for example, a state in which the release switch 211 is neither fully pressed nor half-pressed but is released.
 よって、追加ステップAにおいて、レリーズスイッチ211が全押し状態の場合は判定が否定されてステップ354へ移行する。追加ステップAにおいて、レリーズスイッチ211が全押し状態からレリーズスイッチ211が操作されていない状態へ遷移すると肯定が判定されてステップ356へ移行する。これにより、レリーズスイッチ211の操作状態が解除された場合に通常画像によるライブビュー画像を表示装置に表示させることが可能となる。 Therefore, in the additional step A, if the release switch 211 is fully pressed, the determination is denied and the process proceeds to step 354. In addition step A, when the release switch 211 is shifted from the fully pressed state to the state where the release switch 211 is not operated, an affirmative determination is made and the routine proceeds to step 356. As a result, when the operation state of the release switch 211 is released, a live view image based on a normal image can be displayed on the display device.
 また、追加ステップBにおいて、レリーズスイッチ211が全押し状態の場合は判定が否定されてステップ354へ移行する。追加ステップBにおいて、レリーズスイッチ211が全押し状態からレリーズスイッチ211が操作されていない状態へ遷移すると肯定が判定されてステップ360へ移行する。これにより、レリーズスイッチ211の操作状態が解除された場合に通常画像によるライブビュー画像と共にスプリットイメージを表示装置に表示させることが可能となる。 In addition, if the release switch 211 is fully pressed in the additional step B, the determination is denied and the process proceeds to step 354. In addition step B, when the release switch 211 is shifted from the fully pressed state to a state where the release switch 211 is not operated, an affirmative determination is made and the routine proceeds to step 360. As a result, when the operation state of the release switch 211 is released, the split image can be displayed on the display device together with the live view image based on the normal image.
 なお、本発明は上述した画像出力処理に限定されるものではなく、本第2実施形態に係る画像出力処理からステップ354,356を省略した画像出力処理をCPU12に実行させてもよい。すなわち、スプリットイメージの表示指示が行われている状態でレリーズスイッチ211が全押し位置に保持された場合に通常画像の表示指示が解除され、これに応じて通常画像と共にスプリットイメージも表示装置に表示させる制御をCPU12に行わせてもよい。この場合、例えば同じ被写体について構図を変えて複数枚撮像する場面において、撮像後もスプリットイメージによるマニュアルフォーカス調整を行うという設定を保持することができるので、ユーザにとっての利便性の向上に寄与することができる。このように、レリーズスイッチ211が全押し位置(第2指示位置の一例)又は待機位置に保持された場合(ライブビュー画像の撮像時においてレリーズスイッチ211が操作されていない場合)は表示装置に対して通常画像を表示させると共に通常画像の表示領域内にスプリットイメージを表示させることで、レリーズスイッチ211の操作状態に応じて、スプリットイメージを適切な時期に表示させることができる。 Note that the present invention is not limited to the image output processing described above, and the CPU 12 may execute image output processing in which steps 354 and 356 are omitted from the image output processing according to the second embodiment. That is, when the release switch 211 is held at the fully-pressed position while a split-image display instruction is in effect, the normal-image display instruction may be cancelled, and the CPU 12 may accordingly perform control to display the split image together with the normal image on the display device. In this case, for example, in a scene where a plurality of images of the same subject are captured with different compositions, the setting of performing manual focus adjustment using the split image can be retained even after imaging, which contributes to improved convenience for the user. In this way, when the release switch 211 is held at the fully-pressed position (an example of the second instruction position) or at the standby position (when the release switch 211 is not operated at the time of capturing the live view image), the normal image is displayed on the display device and the split image is displayed within the display area of the normal image, so that the split image can be displayed at an appropriate time according to the operation state of the release switch 211.
 [第3実施形態]
 上記第1実施形態では、通常画像の表示が指示されたか否かによって画像処理部28の出力内容を変更したが、本第3実施形態では、F値に応じて画像処理部28の出力内容を変更する場合について説明する。なお、以下では、上記第1実施形態と異なる箇所について説明する。また、同一の構成部材については同一の符号を付し、その説明を省略する。
[Third Embodiment]
In the first embodiment, the output content of the image processing unit 28 is changed depending on whether or not display of the normal image has been instructed. In the third embodiment, a case will be described in which the output content of the image processing unit 28 is changed according to the F value. In the following, differences from the first embodiment will be described. The same constituent members are denoted by the same reference numerals, and their description will be omitted.
 図26には、本第3実施形態に係る画像出力処理の流れの一例を示すフローチャートが示されている。なお、以下では、図12に示すフローチャートと異なるステップを説明し、同一のステップについては同一のステップ番号を付してその説明を省略する。図26に示すフローチャートは、図12に示すフローチャートに比べ、ステップ302に代えてステップ302Bを適用した点が異なっている。 FIG. 26 is a flowchart showing an example of the flow of image output processing according to the third embodiment. In the following, steps different from the flowchart shown in FIG. 12 will be described, and the same steps will be denoted by the same step numbers and description thereof will be omitted. The flowchart shown in FIG. 26 is different from the flowchart shown in FIG. 12 in that step 302B is applied instead of step 302.
 ステップ302Bでは、CPU12により、静止画像の撮像時に用いるF値として予め定められたF値(特定の絞り値の一例)が操作部14(例えば被写界深度確認ボタン)を介して設定されたか否かが判定される。本ステップ302Bにおいて予め定められたF値が操作部14を介して設定された場合は判定が肯定されてステップ306へ移行する。本ステップ302Bにおいて予め定められたF値が操作部14を介して設定されていない場合は判定が否定されてステップ308へ移行する。 In step 302B, the CPU 12 determines whether or not a predetermined F value (an example of a specific aperture value) has been set via the operation unit 14 (for example, a depth-of-field confirmation button) as the F value to be used when capturing a still image. If the predetermined F value has been set via the operation unit 14 in step 302B, the determination is affirmative and the process proceeds to step 306. If the predetermined F value has not been set via the operation unit 14 in step 302B, the determination is negative and the process proceeds to step 308.
 本第3実施形態に係る画像出力処理がCPU12によって行われると、静止画像の撮像時に用いるF値として予め定められたF値が操作部14を介して設定された場合は表示装置にスプリットイメージが表示されずに通常画像が表示される。また、静止画像の撮像時に用いるF値として予め定められたF値が操作部14を介して設定されない場合は表示装置に通常画像及びスプリットイメージが表示される。これにより、本構成を有しない場合に比べ、F値に応じて、スプリットイメージを適切な時期に表示させることができる。 When the image output processing according to the third embodiment is performed by the CPU 12, if the predetermined F value to be used when capturing a still image has been set via the operation unit 14, the normal image is displayed on the display device without the split image being displayed. If the predetermined F value has not been set via the operation unit 14, the normal image and the split image are displayed on the display device. As a result, the split image can be displayed at an appropriate time according to the F value, compared with the case where this configuration is not provided.
 [第4実施形態]
 上記第1実施形態では、通常画像の表示が指示されたか否かによって画像処理部28の出力内容を変更したが、本第4実施形態では、EVF248の使用状態に応じて画像処理部28の出力内容を変更する場合について説明する。なお、以下では、上記第1実施形態と異なる箇所について説明する。また、同一の構成部材については同一の符号を付し、その説明を省略する。
[Fourth Embodiment]
In the first embodiment, the output content of the image processing unit 28 is changed depending on whether or not display of the normal image has been instructed. In the fourth embodiment, a case will be described in which the output content of the image processing unit 28 is changed according to the usage state of the EVF 248. In the following, differences from the first embodiment will be described. The same constituent members are denoted by the same reference numerals, and their description will be omitted.
 図27には、本第4実施形態に係る画像出力処理の流れの一例を示すフローチャートが示されている。なお、以下では、図12に示すフローチャートと異なるステップを説明し、同一のステップについては同一のステップ番号を付してその説明を省略する。図27に示すフローチャートは、図12に示すフローチャートに比べ、ステップ302に代えてステップ302Cを適用した点が異なっている。 FIG. 27 is a flowchart showing an example of the flow of the image output processing according to the fourth embodiment. In the following, steps different from those in the flowchart shown in FIG. 12 will be described; the same steps are denoted by the same step numbers and their description will be omitted. The flowchart shown in FIG. 27 differs from the flowchart shown in FIG. 12 in that step 302C is applied instead of step 302.
 ステップ302Cでは、CPU12により、ユーザがEVF248を使用しているか否かが判定される。ユーザがEVF248を使用しているか否かは、例えば接眼検出部37での検出結果に基づいてファインダー接眼部242が使用されていると判定されたか否かにより判定される。つまり、ファインダー接眼部242が使用されていると判定された場合はユーザがEVF248を使用していると判定され、ファインダー接眼部242が使用されていないと判定された場合はユーザがEVF248を使用していないと判定される。本ステップ302CにおいてユーザがEVF248を使用している場合は判定が肯定されてステップ306へ移行する。本ステップ302CにおいてユーザがEVF248を使用していない場合は判定が否定されてステップ308へ移行する。 In step 302C, the CPU 12 determines whether or not the user is using the EVF 248. Whether or not the user is using the EVF 248 is determined, for example, by whether or not the finder eyepiece unit 242 has been determined to be in use based on the detection result of the eyepiece detection unit 37. That is, when it is determined that the finder eyepiece unit 242 is in use, it is determined that the user is using the EVF 248; when it is determined that the finder eyepiece unit 242 is not in use, it is determined that the user is not using the EVF 248. If the user is using the EVF 248 in step 302C, the determination is affirmative and the process proceeds to step 306. If the user is not using the EVF 248 in step 302C, the determination is negative and the process proceeds to step 308.
 本第4実施形態に係る画像出力処理がCPU12によって行われると、EVF248を使用している場合は表示装置にスプリットイメージが表示されずに通常画像が表示される。また、EVF248を使用していない場合は表示装置に通常画像及びスプリットイメージが表示される。これにより、本構成を有しない場合に比べ、EVF248の使用状態に応じて、スプリットイメージを適切な時期に表示させることができる。 When the image output process according to the fourth embodiment is performed by the CPU 12, when the EVF 248 is used, the normal image is displayed without displaying the split image on the display device. When the EVF 248 is not used, a normal image and a split image are displayed on the display device. As a result, the split image can be displayed at an appropriate time according to the usage state of the EVF 248 as compared with the case where this configuration is not provided.
 また、本第4実施形態に係る画像出力処理と上記第1~第3実施形態に係る画像出力処理の少なくとも1つとを並行して実施してもよいし、上記第1~第3実施形態に係る画像出力処理の少なくとも2つ以上を並行してもよい。この場合、例えば、上記ステップ302,302A,302B,302Cの少なくとも1つが肯定判定となった場合に通常画像を表示すると共にスプリットイメージを廃棄する処理を行う。また、少なくとも1つが否定判定となった場合に通常画像及びスプリットイメージを表示する処理を行う。 The image output processing according to the fourth embodiment and at least one of the image output processings according to the first to third embodiments may be performed in parallel, or at least two of the image output processings according to the first to third embodiments may be performed in parallel. In this case, for example, when at least one of the above steps 302, 302A, 302B, and 302C results in an affirmative determination, processing is performed to display the normal image and discard the split image. When at least one results in a negative determination, processing is performed to display the normal image and the split image.
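Combining the determinations in parallel as described can be sketched as a simple OR of the four conditions. The parameter names are hypothetical labels for steps 302, 302A, 302B, and 302C; the patent defines the conditions, not this code.

```python
def split_image_suppressed(normal_display_instructed, half_pressed,
                           preview_f_value_set, evf_in_use):
    # Steps 302 / 302A / 302B / 302C respectively: if any one determination
    # is affirmative, only the normal image is displayed and the split
    # image is discarded; otherwise both are displayed.
    return any((normal_display_instructed, half_pressed,
                preview_f_value_set, evf_in_use))

def images_to_display(**conditions):
    return ('normal',) if split_image_suppressed(**conditions) \
        else ('normal', 'split')
```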
 なお、上記第1~第4実施形態で説明した画像出力処理の流れ(図12,図25~27参照)はあくまでも一例である。従って、主旨を逸脱しない範囲内において不要なステップを削除したり、新たなステップを追加したり、処理順序を入れ替えたりしてもよいことは言うまでもない。また、上記第1~第4実施形態で説明した画像出力処理に含まれる各処理は、プログラムを実行することにより、コンピュータを利用してソフトウェア構成により実現されてもよいし、ハードウェア構成で実現されてもよい。また、ハードウェア構成とソフトウェア構成の組み合わせによって実現してもよい。 Note that the flows of the image output processing described in the first to fourth embodiments (see FIGS. 12 and 25 to 27) are merely examples. Accordingly, it goes without saying that unnecessary steps may be deleted, new steps may be added, and the processing order may be changed without departing from the spirit of the invention. Each process included in the image output processing described in the first to fourth embodiments may be realized by a software configuration using a computer executing a program, by a hardware configuration, or by a combination of a hardware configuration and a software configuration.
 上記第1~第4実施形態で説明した画像出力処理を、コンピュータによりプログラムを実行することにより実現する場合は、プログラムを所定の記憶領域(例えばメモリ26)に予め記憶しておけばよい。なお、必ずしも最初からメモリ26に記憶させておく必要はない。例えば、コンピュータに接続されて使用されるフレキシブルディスク、いわゆるFD、CD-ROM、DVDディスク、光磁気ディスク、ICカードなどの任意の「可搬型の記憶媒体」に先ずはプログラムを記憶させておいてもよい。そして、コンピュータがこれらの可搬型の記憶媒体からプログラムを取得して実行するようにしてもよい。また、インターネットやLAN(Local Area Network)などを介してコンピュータに接続される他のコンピュータまたはサーバ装置などに各プログラムを記憶させておき、コンピュータがこれらからプログラムを取得して実行するようにしてもよい。 When the image output processing described in the first to fourth embodiments is realized by a computer executing a program, the program may be stored in advance in a predetermined storage area (for example, the memory 26). It is not always necessary to store the program in the memory 26 from the beginning. For example, the program may first be stored in an arbitrary "portable storage medium" used by being connected to a computer, such as a flexible disk (so-called FD), a CD-ROM, a DVD disk, a magneto-optical disk, or an IC card, and the computer may then acquire the program from the portable storage medium and execute it. Alternatively, each program may be stored in another computer, a server device, or the like connected to the computer via the Internet, a LAN (Local Area Network), or the like, and the computer may acquire the program from these and execute it.
 [第5実施形態] [Fifth embodiment]
 上記第1実施形態では、撮像装置100を例示したが、撮像装置100の変形例である携帯端末装置としては、例えばカメラ機能を有する携帯電話機やスマートフォン、PDA(Personal Digital Assistants)、携帯型ゲーム機などが挙げられる。以下、スマートフォンを例に挙げ、図面を参照しつつ、詳細に説明する。 In the first embodiment, the imaging device 100 has been exemplified, but examples of a portable terminal device that is a modification of the imaging device 100 include a mobile phone or smartphone having a camera function, a PDA (Personal Digital Assistant), and a portable game machine. In the following, a smartphone will be taken as an example and described in detail with reference to the drawings.
 図28は、スマートフォン500の外観の一例を示す斜視図である。図28に示すスマートフォン500は、平板状の筐体502を有し、筐体502の一方の面に表示部としての表示パネル521と、入力部としての操作パネル522とが一体となった表示入力部520を備えている。また、筐体502は、スピーカ531と、マイクロホン532と、操作部540と、カメラ部541とを備えている。なお、筐体502の構成はこれに限定されず、例えば、表示部と入力部とが独立した構成を採用したり、折り畳み構造やスライド構造を有する構成を採用することもできる。 FIG. 28 is a perspective view showing an example of the appearance of the smartphone 500. The smartphone 500 shown in FIG. 28 has a flat housing 502 and includes, on one surface of the housing 502, a display input unit 520 in which a display panel 521 as a display unit and an operation panel 522 as an input unit are integrated. The housing 502 also includes a speaker 531, a microphone 532, an operation unit 540, and a camera unit 541. The configuration of the housing 502 is not limited to this; for example, a configuration in which the display unit and the input unit are independent, or a configuration having a folding structure or a slide structure, may also be adopted.
 図29は、図28に示すスマートフォン500の構成の一例を示すブロック図である。図29に示すように、スマートフォン500の主たる構成要素として、無線通信部510と、表示入力部520と、通話部530と、操作部540と、カメラ部541と、記憶部550と、外部入出力部560と、を備える。また、スマートフォン500の主たる構成要素として、GPS(Global Positioning System)受信部570と、モーションセンサ部580と、電源部590と、主制御部501と、を備える。また、スマートフォン500の主たる機能として、基地局装置BSと移動通信網NWとを介した移動無線通信を行う無線通信機能を備える。 FIG. 29 is a block diagram showing an example of the configuration of the smartphone 500 shown in FIG. 28. As shown in FIG. 29, the main components of the smartphone 500 include a wireless communication unit 510, a display input unit 520, a call unit 530, an operation unit 540, a camera unit 541, a storage unit 550, and an external input/output unit 560. The smartphone 500 further includes, as main components, a GPS (Global Positioning System) receiving unit 570, a motion sensor unit 580, a power supply unit 590, and a main control unit 501. As a main function, the smartphone 500 has a wireless communication function for performing mobile wireless communication via a base station device BS and a mobile communication network NW.
 無線通信部510は、主制御部501の指示に従って、移動通信網NWに収容された基地局装置BSに対して無線通信を行うものである。この無線通信を使用して、音声データ、画像データ等の各種ファイルデータ、電子メールデータなどの送受信や、Webデータやストリーミングデータなどの受信を行う。 The wireless communication unit 510 performs wireless communication with the base station apparatus BS accommodated in the mobile communication network NW according to an instruction from the main control unit 501. Using this wireless communication, transmission and reception of various file data such as audio data and image data, e-mail data, and reception of Web data and streaming data are performed.
 表示入力部520は、いわゆるタッチパネルであって、表示パネル521と、操作パネル522とを備える。そのため、表示入力部520は、主制御部501の制御により、画像(静止画像および動画像)や文字情報などを表示して視覚的にユーザに情報を伝達するとともに、表示した情報に対するユーザ操作を検出する。なお、生成された3Dを鑑賞する場合には、表示パネル521は、3D表示パネルであることが好ましい。 The display input unit 520 is a so-called touch panel and includes a display panel 521 and an operation panel 522. Under the control of the main control unit 501, the display input unit 520 displays images (still images and moving images), character information, and the like to visually convey information to the user, and also detects user operations on the displayed information. When viewing generated 3D content, the display panel 521 is preferably a 3D display panel.
 表示パネル521は、LCD、OELD(Organic Electro-Luminescence Display)などを表示デバイスとして用いたものである。操作パネル522は、表示パネル521の表示面上に表示される画像を視認可能に載置され、ユーザの指や尖筆によって操作される一又は複数の座標を検出するデバイスである。係るデバイスをユーザの指や尖筆によって操作すると、操作に起因して発生する検出信号を主制御部501に出力する。次いで、主制御部501は、受信した検出信号に基づいて、表示パネル521上の操作位置(座標)を検出する。 The display panel 521 uses an LCD, OELD (Organic Electro-Luminescence Display), or the like as a display device. The operation panel 522 is a device that is placed so that an image displayed on the display surface of the display panel 521 is visible and detects one or a plurality of coordinates operated by a user's finger or stylus. When such a device is operated by a user's finger or stylus, a detection signal generated due to the operation is output to the main control unit 501. Next, the main control unit 501 detects an operation position (coordinates) on the display panel 521 based on the received detection signal.
 As shown in FIG. 28, the display panel 521 and the operation panel 522 of the smartphone 500 are integrated to constitute the display input unit 520, with the operation panel 522 arranged so as to completely cover the display panel 521. With this arrangement, the operation panel 522 may also have a function of detecting user operations in the area outside the display panel 521. In other words, the operation panel 522 may include a detection area for the portion overlapping the display panel 521 (hereinafter referred to as the display area) and a detection area for the outer edge portion not overlapping the display panel 521 (hereinafter referred to as the non-display area).
 The size of the display area may completely match the size of the display panel 521, but the two need not necessarily match. The operation panel 522 may also include two sensitive regions: the outer edge portion and the inner portion other than it. Further, the width of the outer edge portion is designed as appropriate according to the size of the housing 502 and other factors. Furthermore, position detection methods that may be employed in the operation panel 522 include a matrix switch method, a resistive film method, a surface acoustic wave method, an infrared method, an electromagnetic induction method, and a capacitance method; any of these methods may be adopted.
 The call unit 530 includes a speaker 531 and a microphone 532. The call unit 530 converts the user's voice input through the microphone 532 into audio data that can be processed by the main control unit 501 and outputs the audio data to the main control unit 501. The call unit 530 also decodes audio data received by the wireless communication unit 510 or the external input/output unit 560 and outputs it from the speaker 531. As shown in FIG. 28, for example, the speaker 531 can be mounted on the same surface as the display input unit 520, and the microphone 532 can be mounted on a side surface of the housing 502.
 The operation unit 540 is a hardware key using a key switch or the like, and receives instructions from the user. For example, as shown in FIG. 28, the operation unit 540 is a push-button switch that is mounted on a side surface of the housing 502 of the smartphone 500, turns on when pressed with a finger or the like, and turns off by the restoring force of a spring or the like when the finger is released.
 The storage unit 550 stores the control program and control data of the main control unit 501, application software, address data associating names and telephone numbers of communication partners, and data of transmitted and received e-mails. The storage unit 550 also stores Web data downloaded by Web browsing and downloaded content data, and temporarily stores streaming data and the like. The storage unit 550 includes an internal storage unit 551 built into the smartphone and an external storage unit 552 having a removable external memory slot. Each of the internal storage unit 551 and the external storage unit 552 constituting the storage unit 550 is realized using a storage medium such as a flash memory type or a hard disk type. Other examples of the storage medium include a multimedia card micro type, a card type memory (for example, a MicroSD (registered trademark) memory), a RAM (Random Access Memory), and a ROM (Read Only Memory).
 The external input/output unit 560 serves as an interface with all external devices connected to the smartphone 500, and is used to connect directly or indirectly to other external devices through communication or the like, or through a network. Examples of communication with other external devices include Universal Serial Bus (USB) and IEEE 1394. Examples of the network include the Internet, wireless LAN, Bluetooth (registered trademark), RFID (Radio Frequency Identification), and infrared communication (Infrared Data Association: IrDA (registered trademark)). Other examples of the network include UWB (Ultra Wideband (registered trademark)) and ZigBee (registered trademark).
 Examples of external devices connected to the smartphone 500 include a wired/wireless headset, a wired/wireless external charger, a wired/wireless data port, and a memory card connected via a card socket. Other examples of external devices include a SIM (Subscriber Identity Module) card / UIM (User Identity Module) card, and external audio/video devices connected via audio/video I/O (Input/Output) terminals. Besides such external audio/video devices, wirelessly connected external audio/video devices may be used. In place of external audio/video devices, for example, a smartphone with a wired/wireless connection, a personal computer with a wired/wireless connection, a PDA with a wired/wireless connection, earphones, and the like are also applicable.
 The external input/output unit can transmit data received from such external devices to each component inside the smartphone 500, and can allow data inside the smartphone 500 to be transmitted to the external devices.
 The GPS receiving unit 570 receives GPS signals transmitted from GPS satellites ST1 to STn in accordance with instructions from the main control unit 501, executes positioning calculation processing based on the plurality of received GPS signals, and detects the position of the smartphone 500 in terms of latitude, longitude, and altitude. When the GPS receiving unit 570 can acquire position information from the wireless communication unit 510 or the external input/output unit 560 (for example, a wireless LAN), it can also detect the position using that position information.
 The motion sensor unit 580 includes, for example, a three-axis acceleration sensor, and detects the physical movement of the smartphone 500 in accordance with instructions from the main control unit 501. By detecting the physical movement of the smartphone 500, the moving direction and acceleration of the smartphone 500 are detected. The detection result is output to the main control unit 501.
 The power supply unit 590 supplies power stored in a battery (not shown) to each unit of the smartphone 500 in accordance with instructions from the main control unit 501.
 The main control unit 501 includes a microprocessor, operates in accordance with the control program and control data stored in the storage unit 550, and controls each unit of the smartphone 500 in an integrated manner. The main control unit 501 also includes a mobile communication control function for controlling each unit of the communication system in order to perform voice communication and data communication through the wireless communication unit 510, and an application processing function.
 The application processing function is realized by the main control unit 501 operating in accordance with the application software stored in the storage unit 550. Examples of the application processing function include an infrared communication function that controls the external input/output unit 560 to perform data communication with a counterpart device, an e-mail function that transmits and receives e-mails, and a Web browsing function that browses Web pages.
 The main control unit 501 also has an image processing function such as displaying video on the display input unit 520 based on image data (data of still images or moving images) such as received data or downloaded streaming data. The image processing function refers to a function in which the main control unit 501 decodes the image data, performs image processing on the decoding result, and displays the image on the display input unit 520.
 Further, the main control unit 501 executes display control for the display panel 521 and operation detection control for detecting user operations through the operation unit 540 and the operation panel 522.
 By executing the display control, the main control unit 501 displays icons for starting application software and software keys such as scroll bars, or displays a window for composing an e-mail. A scroll bar refers to a software key for receiving an instruction to move the displayed portion of an image, such as a large image that does not fit in the display area of the display panel 521.
 By executing the operation detection control, the main control unit 501 detects user operations through the operation unit 540, and receives, through the operation panel 522, operations on the above icons and input of character strings into the input fields of the above window. By executing the operation detection control, the main control unit 501 also receives requests to scroll the displayed image through the scroll bar.
 Further, by executing the operation detection control, the main control unit 501 determines whether the operation position on the operation panel 522 falls in the overlapping portion that overlaps the display panel 521 (the display area) or in the outer edge portion that does not overlap the display panel 521 (the non-display area). The main control unit 501 also has a touch panel control function that, based on this determination result, controls the sensitive region of the operation panel 522 and the display positions of software keys.
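The determination described above, classifying an operation position as belonging to the display area or the non-display area, can be sketched as follows. This is a minimal illustration in which the display panel is assumed to be anchored at the operation panel's top-left corner; the function name and coordinate convention are assumptions, not part of the embodiment.

```python
# Minimal sketch of the display/non-display area determination described
# above. Assumes the display panel 521 is anchored at the operation
# panel's top-left corner; real devices may offset or center it.

def classify_touch(x, y, panel_w, panel_h, display_w, display_h):
    """Classify an operation position (x, y) on the operation panel 522."""
    if 0 <= x < display_w and 0 <= y < display_h:
        return "display"      # overlaps the display panel 521
    if 0 <= x < panel_w and 0 <= y < panel_h:
        return "non-display"  # outer edge not overlapping the display panel
    return "outside"          # not on the operation panel at all
```

Based on such a result, the sensitive region of the operation panel and the display positions of software keys could then be adjusted.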
 The main control unit 501 can also detect gesture operations on the operation panel 522 and execute preset functions in accordance with the detected gesture operation. A gesture operation is not a conventional simple touch operation, but an operation of drawing a trajectory with a finger or the like, designating a plurality of positions simultaneously, or combining these to draw a trajectory from a plurality of positions for at least one of them.
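The gesture kinds enumerated above (a simple touch, a drawn trajectory, simultaneously designated positions, and a combination of these) can be distinguished in a minimal sketch from per-finger coordinate sequences. The input shape and labels are assumptions made for illustration only.

```python
# Hedged sketch distinguishing the gesture kinds named above.
# `tracks` is assumed to be a list of coordinate sequences, one per finger.

def classify_gesture(tracks):
    moving = [t for t in tracks if len(t) > 1]  # fingers that drew a trajectory
    if len(tracks) == 1:
        return "trajectory" if moving else "simple touch"
    # multiple positions designated at the same time
    return "multi-touch trajectory" if moving else "simultaneous positions"
```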
 The camera unit 541 is a digital camera that captures images using an image sensor such as a CMOS or a CCD, and has functions equivalent to those of the imaging device 100 shown in FIG. 1 and other figures.
 The camera unit 541 can switch between a manual focus mode and an autofocus mode. When the manual focus mode is selected, the photographing lens of the camera unit 541 can be focused by operating the operation unit 540 or a focus icon button or the like displayed on the display input unit 520. In the manual focus mode, a live view image combined with a split image is displayed on the display panel 521 so that the in-focus state during manual focusing can be confirmed. The hybrid finder 220 shown in FIG. 9 may also be provided in the smartphone 500.
 Under the control of the main control unit 501, the camera unit 541 converts image data obtained by imaging into compressed image data such as JPEG (Joint Photographic Experts Group), and can record the converted image data in the storage unit 550 or output it through the input/output unit 560 or the wireless communication unit 510. In the smartphone 500 shown in FIG. 28, the camera unit 541 is mounted on the same surface as the display input unit 520, but the mounting position of the camera unit 541 is not limited to this; it may be mounted on the back surface of the display input unit 520, or a plurality of camera units 541 may be mounted. When a plurality of camera units 541 are mounted, imaging can be performed with a single camera unit 541 by switching the camera unit 541 used for imaging, or with a plurality of camera units 541 used simultaneously.
 The camera unit 541 can also be used for various functions of the smartphone 500. For example, an image acquired by the camera unit 541 can be displayed on the display panel 521, or the image from the camera unit 541 can be used as one of the operation inputs of the operation panel 522. When the GPS receiving unit 570 detects the position, the position can also be detected with reference to an image from the camera unit 541. Furthermore, by referring to an image from the camera unit 541, the optical axis direction of the camera unit 541 of the smartphone 500 and the current usage environment can be determined without using the three-axis acceleration sensor, or in combination with the three-axis acceleration sensor. Of course, images from the camera unit 541 can also be used within the application software.
 In addition, various information can be added to the image data of a still image or moving image and recorded in the storage unit 550, or output through the input/output unit 560 or the wireless communication unit 510. Examples of such "various information" include position information acquired by the GPS receiving unit 570 and audio information acquired by the microphone 532 (which may have been converted into text information by speech-to-text conversion performed by the main control unit or the like) added to the image data of the still image or moving image. Besides these, posture information acquired by the motion sensor unit 580 may also be used.
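Attaching the "various information" above to image data can be sketched as a simple record before recording or output; the dictionary layout is an illustrative assumption, not a format defined by the embodiment.

```python
# Illustrative sketch only: bundle an image with position, audio, and
# posture information. The dict layout is an assumption, not a format
# defined by the embodiment.

def attach_metadata(image_data, position=None, audio=None, posture=None):
    record = {"image": image_data}
    if position is not None:
        record["position"] = position  # e.g. from the GPS receiving unit 570
    if audio is not None:
        record["audio"] = audio        # from the microphone 532 (or its text)
    if posture is not None:
        record["posture"] = posture    # from the motion sensor unit 580
    return record
```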
 In each of the above embodiments, the image sensor 20 having the first to third pixel groups has been illustrated, but the present invention is not limited to this; an image sensor consisting of only the first pixel group and the second pixel group may be used. A digital camera having this type of image sensor can generate a three-dimensional image (3D image) based on the first image output from the first pixel group and the second image output from the second pixel group, and can also generate a two-dimensional image (2D image). In this case, the generation of the two-dimensional image is realized, for example, by performing interpolation processing between pixels of the same color in the first image and the second image. Alternatively, the first image or the second image may be adopted as the two-dimensional image without performing interpolation processing.
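The interpolation between same-color pixels of the first and second images mentioned above can be sketched, in its simplest form, as averaging the values at corresponding positions. This is one possible interpolation assumed for illustration, not the embodiment's exact rule; images are represented here as plain nested lists of pixel values.

```python
# Simplest possible sketch of interpolating between the first and second
# images: average same-position (same-color) pixel values. This is an
# assumption for illustration, not the embodiment's exact processing.

def interpolate_2d(first_image, second_image):
    return [
        [(a + b) / 2 for a, b in zip(row1, row2)]
        for row1, row2 in zip(first_image, second_image)
    ]

# Two 1x2 images; each output pixel is the mean of the corresponding pair.
print(interpolate_2d([[10, 20]], [[30, 40]]))  # → [[20.0, 30.0]]
```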
 In each of the above embodiments, an example in which the split image is displayed within the display area of the normal image has been described, but the present invention is not limited to this; the split image (second display image) may be displayed on the display device without displaying the normal image (an example of the first display image), or the split image may be displayed using the entire screen. The "split image" referred to here can be exemplified by a split image based on images output from the phase difference pixel groups (for example, the first image output from the first pixel group and the second image output from the second pixel group), both in the case of using an image sensor consisting of only phase difference pixel groups (for example, the first pixel group and the second pixel group) and in the case of using an image sensor in which phase difference pixels (for example, the first pixel group and the second pixel group) are arranged at a predetermined ratio with respect to normal pixels. Thus, the present invention is not limited to a mode in which both the normal image and the split image are displayed simultaneously on the same screen of the display device; when the display instruction for the normal image is released while display of the split image is instructed, the display control unit 36 may perform control to cause the display device to display the split image without displaying the normal image.
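The display control behaviour described above can be sketched as a small decision function. The return values are illustrative labels only, and the case where both instructions are active follows the embodiment in which the split image is shown within the normal image's display area.

```python
# Hedged sketch of the display control described above; the return values
# are illustrative labels, not an API of the embodiment.

def select_display(split_instructed, normal_instructed):
    if normal_instructed and split_instructed:
        return "normal image with split image in its display area"
    if split_instructed:
        # normal-image display instruction released while split is instructed
        return "split image without normal image"
    return "normal image only" if normal_instructed else "no live view"
```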
12 CPU
20 Image sensor
26 Memory
28 Image processing unit
30 Normal processing unit
32 Split image processing unit
36A, 36B Display control unit
100 Imaging device
213 Display unit
247 LCD

Claims (19)

  1.  An image processing device comprising:
     a generation unit that generates a first display image based on an image signal output from an image sensor having first and second pixel groups on which a subject image that has passed through first and second regions of a photographing lens is pupil-divided and respectively formed, and that generates a second display image used for focus confirmation based on first and second image signals output from the first and second pixel groups;
     a first instruction unit that instructs display of the first display image, the first instruction unit issuing a non-holding type instruction;
     a second instruction unit that instructs display of the second display image;
     a display unit that displays an image; and
     a display control unit that, when display of the first display image is instructed by the first instruction unit while display of the second display image is instructed by the second instruction unit, performs control to cause the display unit to display the first display image generated by the generation unit without displaying the second display image generated by the generation unit, and that, when the display instruction of the first display image by the first instruction unit is released while display of the second display image is instructed by the second instruction unit, performs control to cause the display unit to display the first display image generated by the generation unit and to display the second display image generated by the generation unit within a display area of the first display image.
  2.  The image processing device according to claim 1, wherein the instruction by the second instruction unit is a holding type instruction.
  3.  The image processing device according to claim 1 or claim 2, wherein the image sensor further has a third pixel group on which the subject image transmitted through the photographing lens is formed without being pupil-divided and which outputs a third image, and
     the generation unit generates the first display image based on the third image output from the third pixel group.
  4.  The image processing device according to claim 3, wherein the generation unit generates the first display image based on an image obtained by interpolating the first and second images based on the third image, and on the third image.
  5.  An imaging device comprising:
     the image processing device according to any one of claims 1 to 4;
     an image sensor having the first and second pixel groups; and
     a storage unit that stores an image output from the image sensor.
  6.  The imaging device according to claim 5, wherein the first instruction unit is a release switch movable to a first instruction position for instructing adjustment of imaging conditions and a second instruction position for instructing the start of imaging,
     a case where the release switch is held at the first instruction position is defined as a case where display of the first display image is instructed, and
     the display is switched when the state in which the release switch is held at the first instruction position is released.
  7.  The imaging device according to claim 6, wherein an imaging operation is started when the release switch is held at the second instruction position, and
     when the release switch is held at the second instruction position, the display instruction of the second display image by the second instruction unit is released, whereby the display control unit performs control to cause the display unit to display the first display image after the imaging operation ends.
  8.  The imaging device according to claim 6, wherein an imaging operation is started when the release switch is held at the second instruction position, and
     a case where the release switch is held at the second instruction position is defined as a case where the display instruction of the first display image is released.
  9.  The imaging device according to any one of claims 5 to 8, wherein the first instruction unit is a setting unit that sets an aperture value of the photographing lens,
     a case where display of the first display image is instructed is defined as a case where a specific aperture value is set by the setting unit, and
     a case where the specific aperture value is not set by the setting unit is defined as a case where the display instruction of the first display image is released.
  10.  The imaging device according to any one of claims 5 to 9, further comprising an electronic viewfinder capable of displaying the first display image and the second display image, wherein
     the first instruction unit has a detection unit that detects use of the electronic viewfinder,
     a case where use of the electronic viewfinder is not detected by the detection unit is defined as a case where display of the first display image is instructed, and
     a case where use of the electronic viewfinder is detected is defined as a case where the display instruction of the first display image is released.
  11.  The imaging device according to any one of claims 5 to 10, wherein the image sensor further has a third pixel group on which the subject image transmitted through the photographing lens is formed without being pupil-divided, and
     in the image sensor, the pixels included in the first and second pixel groups are arranged in a matrix together with the third pixel group, and are arranged at positions such that the positional displacement between the first pixel group and the second pixel group in at least one of the row direction and the column direction falls within a predetermined number of pixels.
  12.  The imaging device according to claim 11, wherein, in the image sensor, the pixels included in the first and second pixel groups are further arranged so that the first pixel group and the second pixel group are adjacent to each other in a predetermined direction on a pixel-by-pixel basis.
  13.  The imaging device according to claim 11 or claim 12, wherein, in the image sensor, the pixels included in the first and second pixel groups are further arranged between the pixels included in the third pixel group, and
     the imaging device further comprises a color filter that is provided on the first to third pixel groups and assigns a specific primary color to the first and second pixel groups and to the pixels of the third pixel group that are adjacent to the pixels included in the first and second pixel groups.
  14.  The imaging device according to claim 13, wherein the specific primary color is green.
  15.  The imaging device according to claim 13 or claim 14, wherein the primary color array of the color filter is a Bayer array.
  16.  The imaging device according to claim 11 or claim 12, wherein, in the image sensor, the pixels included in the first and second pixel groups are further arranged, together with the third pixel group, in a matrix shifted by half a pixel, and are arranged between the pixels included in the third pixel group, and
     the imaging device further comprises a color filter that is provided on fourth and fifth pixel groups, obtained by classifying the constituent pixels of the image sensor into two types and arranged so that the constituent pixels are alternately shifted by half a pixel in the row direction and the column direction, and that assigns Bayer array primary colors to each of the fourth and fifth pixel groups.
  17.  The imaging device according to claim 16, wherein the first and second pixel groups are arranged to correspond to the green filters of the color filter.
  18.  撮影レンズにおける第1及び第2の領域を通過した被写体像が瞳分割されてそれぞれ結像される第1及び第2の画素群を有する撮像素子から出力された画像信号に基づいて第1の表示用画像を生成し、かつ、前記第1及び第2の画素群から出力された第1及び第2の画像信号に基づいて合焦確認に使用する第2の表示用画像を生成する生成工程と、
     前記第1の表示用画像の表示を指示する第1指示工程であって非保持型の指示を行う第1指示工程と、
     前記第2の表示用画像の表示を指示する第2指示工程と、
     前記第2指示工程により前記第2の表示用画像の表示が指示されている状態で前記第1指示工程により前記第1の表示用画像の表示が指示された場合は、画像を表示する表示部に対して前記生成工程により生成された前記第2の表示用画像を表示させずに前記生成工程により生成された前記第1の表示用画像を表示させる制御を行い、前記第2指示工程により前記第2の表示用画像の表示が指示されている状態で前記第1指示工程による前記第1の表示用画像の表示指示が解除された場合は前記表示部に対して前記生成工程により生成された前記第1の表示用画像を表示させ、かつ、該第1の表示用画像の表示領域内に前記生成工程により生成された前記第2の表示用画像を表示させる制御を行う表示制御工程と、
     を含む画像処理方法。
    18. An image processing method comprising:
    a generation step of generating a first display image based on an image signal output from an imaging element having first and second pixel groups on which a subject image that has passed through first and second regions of a photographing lens is pupil-divided and respectively formed, and of generating, based on first and second image signals output from the first and second pixel groups, a second display image used for focus confirmation;
    a first instruction step of instructing display of the first display image, the instruction being a non-holding (momentary) instruction;
    a second instruction step of instructing display of the second display image; and
    a display control step of performing control, when display of the first display image is instructed by the first instruction step while display of the second display image is instructed by the second instruction step, to cause a display unit that displays images to display the first display image generated in the generation step without displaying the second display image generated in the generation step, and, when the display instruction for the first display image by the first instruction step is canceled while display of the second display image is instructed by the second instruction step, to cause the display unit to display the first display image generated in the generation step and to display, within a display area of the first display image, the second display image generated in the generation step.
  19.  コンピュータを、
     請求項1~請求項4の何れか1項に記載の画像処理装置における前記生成部及び前記表示制御部として機能させるための画像処理プログラム。
    19. An image processing program for causing a computer to function as the generation unit and the display control unit of the image processing apparatus according to any one of claims 1 to 4.
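The display control recited in claim 18 reduces to a small decision rule: while the split-image (focus-confirmation) display is enabled, a held momentary instruction shows only the normal display image, and releasing it restores the normal image with the split-image inside its display area. The following Python sketch is illustrative only; the function and field names are not taken from the publication, and the non-holding instruction is modeled as a simple "pressed" boolean:

```python
def select_display(normal_pressed: bool, split_enabled: bool) -> dict:
    """Decide which display images to show, following the claim-18 logic.

    normal_pressed: state of the non-holding (momentary) instruction for
                    the first (normal) display image.
    split_enabled:  whether display of the second (split / focus-
                    confirmation) image has been instructed.
    """
    if split_enabled and normal_pressed:
        # While the momentary instruction is held, suppress the
        # split-image and show only the normal display image.
        return {"first_image": True, "second_image": False}
    if split_enabled:
        # When the instruction is released, show the normal image and
        # the split-image within its display area.
        return {"first_image": True, "second_image": True}
    # Split-image display not instructed: normal image only.
    return {"first_image": True, "second_image": False}
```

For example, `select_display(normal_pressed=True, split_enabled=True)` suppresses the split-image, while `select_display(normal_pressed=False, split_enabled=True)` shows both.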
PCT/JP2013/071183 2012-09-19 2013-08-05 Image processing device, imaging device, image processing method, and image processing program WO2014045741A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-205908 2012-09-19
JP2012205908 2012-09-19

Publications (1)

Publication Number Publication Date
WO2014045741A1 true WO2014045741A1 (en) 2014-03-27

Family

ID=50341065

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/071183 WO2014045741A1 (en) 2012-09-19 2013-08-05 Image processing device, imaging device, image processing method, and image processing program

Country Status (1)

Country Link
WO (1) WO2014045741A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004040740A (en) * 2002-07-08 2004-02-05 Fuji Photo Film Co Ltd Manual focus equipment
JP2005025055A (en) * 2003-07-04 2005-01-27 Olympus Corp Digital single-lens reflex camera
JP2007248852A (en) * 2006-03-16 2007-09-27 Olympus Imaging Corp Focusing device for camera
JP2009147665A (en) * 2007-12-13 2009-07-02 Canon Inc Image-pickup apparatus
JP2009237214A (en) * 2008-03-27 2009-10-15 Canon Inc Imaging apparatus
WO2012002297A1 * 2010-06-30 2012-01-05 Fujifilm Corporation Imaging device and imaging method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112640437A (en) * 2018-08-31 2021-04-09 富士胶片株式会社 Imaging element, imaging device, image data processing method, and program
CN112640437B (en) * 2018-08-31 2024-05-14 富士胶片株式会社 Imaging element, imaging device, image data processing method, and storage medium

Similar Documents

Publication Publication Date Title
JP6033454B2 (en) Image processing apparatus, imaging apparatus, image processing method, and image processing program
JP5681329B2 (en) Imaging apparatus and image display method
JP5931206B2 (en) Image processing apparatus, imaging apparatus, program, and image processing method
JP6158340B2 (en) Image processing apparatus, imaging apparatus, image processing method, and image processing program
JP5960286B2 (en) Image processing apparatus, imaging apparatus, image processing method, and image processing program
JP5889441B2 (en) Image processing apparatus, imaging apparatus, image processing method, and image processing program
JP5901801B2 (en) Image processing apparatus, imaging apparatus, program, and image processing method
JP5901782B2 (en) Image processing apparatus, imaging apparatus, image processing method, and image processing program
JP5901781B2 (en) Image processing apparatus, imaging apparatus, image processing method, and image processing program
JP5753323B2 (en) Imaging apparatus and image display method
JP5833254B2 (en) Image processing apparatus, imaging apparatus, image processing method, and image processing program
JP6086975B2 (en) Image processing apparatus, imaging apparatus, image processing method, and image processing program
JP6000446B2 (en) Image processing apparatus, imaging apparatus, image processing method, and image processing program
JP5955417B2 (en) Image processing apparatus, imaging apparatus, program, and image processing method
JP5972485B2 (en) Image processing apparatus, imaging apparatus, image processing method, and image processing program
JP5901780B2 (en) Image processing apparatus, imaging apparatus, image processing method, and image processing program
WO2014045741A1 (en) Image processing device, imaging device, image processing method, and image processing program
JP5934844B2 (en) Image processing apparatus, imaging apparatus, image processing method, and image processing program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13839134

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13839134

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP