WO2013190906A1 - Display control device, imaging device, and display control method - Google Patents

Display control device, imaging device, and display control method

Info

Publication number
WO2013190906A1
WO2013190906A1 (PCT/JP2013/062149; JP2013062149W)
Authority
WO
WIPO (PCT)
Prior art keywords
display
user
display unit
monitor
finder
Prior art date
Application number
PCT/JP2013/062149
Other languages
English (en)
Japanese (ja)
Inventor
熊木 甚洋
Original Assignee
Sony Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation
Publication of WO2013190906A1

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/18Signals indicating condition of a camera member or suitability of light
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N23/632Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters for displaying or modifying preview images prior to image capturing, e.g. variety of image resolutions or capturing parameters
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1423Digital output to display device ; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/633Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H04N23/635Region indicators; Field of view indicators

Definitions

  • the present disclosure relates to a display control device, an imaging device, and a display control method. More specifically, the present disclosure relates to a display control device, an imaging device, and a display control method that do not require an arrangement of a proximity sensor on a finder.
  • a digital camera has a monitor that displays an image obtained by imaging. Therefore, a camera user (hereinafter, simply referred to as “user” as appropriate) can confirm on the spot whether a desired image has been obtained by displaying a captured image on a monitor.
  • a camera equipped with both a monitor and a finder is also known.
  • By using the finder, the user can hold the camera stably during shooting. For this reason, use of the finder makes it easier to capture the composition desired by the user.
  • When the camera has both a monitor and a viewfinder and the display of the monitor remains turned on while the user is using the viewfinder, the light emitted from the monitor may enter the eyes of the user looking into the viewfinder, or power may be consumed wastefully.
  • In Patent Document 1, it is proposed to dispose a proximity sensor on a finder and to invalidate operations on a touch panel (also referred to as a "touch screen") based on the detection result of the proximity sensor.
  • In this method, however, a dedicated component is required to detect the use of the finder by the user. Providing the camera with a dedicated part for detecting the use of the finder complicates the structure of the camera.
  • A first preferred embodiment of the present disclosure is a display control device including a first display unit, a second display unit, and a control unit.
  • the first display unit accepts user operations and presents information to the user.
  • The control unit switches between display by the first display unit and display by the second display unit based on input from the first display unit.
  • A second preferred embodiment of the present disclosure is an imaging device including an imaging element, a first display unit, a second display unit, and a control unit.
  • the image sensor is disposed inside the housing.
  • The first display unit is connected to the housing, accepts user operations, and presents information to the user.
  • the display direction of the second display unit is set to be substantially the same as the display direction of the first display unit.
  • The control unit switches between display by the first display unit and display by the second display unit based on input from the first display unit.
  • A third preferred embodiment of the present disclosure is a display control method that includes switching between display by a first display unit and display by a second display unit based on input from the first display unit, which accepts user operations and presents information to the user.
  • Here, the term "touch panel" does not refer to a single input element, but to a device that also has a function of displaying information.
  • According to at least one embodiment of the present disclosure, user convenience can be improved while suppressing an increase in manufacturing cost.
  • FIG. 1A is a rear view illustrating an example of an imaging apparatus according to an embodiment of the present disclosure.
  • FIG. 1B is a schematic diagram illustrating an example of a usage state of the imaging apparatus.
  • FIG. 1C is a schematic diagram illustrating an example of an image observed by the user when the user looks into the viewfinder.
  • FIG. 2A is a functional block diagram illustrating a configuration example of the display control apparatus according to the embodiment of the present disclosure.
  • FIG. 2B is a functional block diagram illustrating a configuration example of the imaging device according to the embodiment of the present disclosure.
  • FIGS. 3A and 3B are schematic diagrams illustrating an example of an input operation to the monitor by the user.
  • FIG. 4A is a schematic diagram illustrating a state in which the user looks into the viewfinder of the imaging apparatus with the bottom surface of the imaging apparatus being substantially horizontal.
  • FIG. 4B is a schematic diagram illustrating an example of weighting for lattice points corresponding to the arrangement of electrodes.
  • FIGS. 5A and 5B are schematic diagrams illustrating an example of an input operation to the monitor by the user.
  • FIG. 6A is a schematic diagram illustrating a state where the user looks into the viewfinder of the imaging apparatus with the bottom surface of the imaging apparatus being substantially horizontal.
  • FIG. 6B is a schematic diagram illustrating an example of weighting according to the high possibility of user contact with the monitor.
  • FIG. 7A is a schematic diagram schematically showing an area of the monitor display area where the user's nose, lips or cheeks are likely to approach when the user looks into the viewfinder of the imaging apparatus with his right eye.
  • FIG. 7B is a schematic diagram schematically showing an area of the monitor display area where the user's nose, lips or cheeks are likely to approach when the user looks through the viewfinder of the imaging device with his left eye.
  • FIG. 8A is a schematic diagram showing, in the display area of the monitor, the region where the user's nose, lips, or cheeks are likely to approach when the user looks into the finder of the imaging apparatus with the right eye and the region where they are likely to approach when the user looks into the finder with the left eye.
  • FIG. 8B schematically shows an area where the user's nose, lips or cheeks are likely to approach when the user looks through the viewfinder of the imaging apparatus in the vertical state with his right eye.
  • FIG. 9A schematically illustrates an area of the monitor display area where the user's nose, lips, or cheeks are likely to approach when the user looks into the viewfinder of the imaging apparatus in a vertical state with the left eye.
  • FIG. 9B is a schematic diagram showing, in the display area of the monitor, the region where the user's nose, lips, or cheeks are likely to approach when the user looks into the finder of the imaging device in the vertical state with the right eye and the region where they are likely to approach when the user looks into the finder with the left eye.
  • FIG. 10A is a schematic diagram illustrating a state in which the user looks through the viewfinder of the imaging apparatus in the vertical state with the right eye.
  • FIG. 10B is a schematic diagram illustrating a state in which the user looks through the viewfinder of the imaging apparatus in the vertical state with the left eye.
  • FIG. 11 is a flowchart illustrating an example of processing according to the embodiment of the present disclosure.
  • FIG. 12A is a flowchart illustrating an example of an evaluation value calculation process.
  • FIG. 12B is a flowchart illustrating an example of threshold selection processing.
  • FIG. 13A is a side view showing a second modification of the imaging apparatus.
  • FIG. 13B is a side view showing a state in which the finder unit is removed from the main body of the imaging apparatus.
  • FIGS. 14A and 14B are rear views illustrating a third modification of the imaging apparatus.
  • FIG. 15A and FIG. 15B are rear views illustrating an example of an imaging device in which function icons are displayed in an area that does not overlap with the user's face when the user uses the viewfinder in the display area of the monitor.
  • <1. Embodiment>
  • [1-1. Outline of configuration and operation of imaging apparatus]
  • [1-2. Configuration example of display control apparatus and imaging apparatus]
  • (1-2-1. Configuration Example of Display Control Device)
  • (1-2-2. Configuration Example of Imaging Device)
  • [1-3. Estimating the object that touches or approaches]
  • [1-4. Judgment based on object contact]
  • (1-4-1. Evaluation value based on the number of contact points)
  • (1-4-2. Evaluation value based on relative positional relationship between a plurality of contact points)
  • (1-4-3. Evaluation value based on contact area)
  • [1-5. Judgment based on object contact or approach]
  • [1-6.
  • The embodiment described below is a preferred specific example of a display control device, an imaging device, and a display control method, and various technically preferable limitations are given in the following description. However, unless otherwise specified, the present disclosure is not limited to the examples of the display control device, the imaging device, and the display control method shown below.
  • members having the same or substantially the same configuration are denoted by common reference symbols.
  • <1. Embodiment> Hereinafter, taking a digital camera as an example, preferred specific examples of the display control apparatus, the imaging apparatus, and the display control method of the present disclosure will be described. As will be apparent from the following description, application examples of the display control device, the imaging device, and the display control method of the present disclosure are not limited to digital cameras.
  • FIG. 1A is a rear view illustrating an example of an imaging apparatus according to an embodiment of the present disclosure.
  • FIG. 1B is a schematic diagram illustrating an example of a usage state of the imaging apparatus.
  • The imaging apparatus 10 is specifically a digital camera in which an imaging element is arranged inside a housing 100. The imaging apparatus 10 is provided with a monitor 11m on, for example, the rear surface of the housing 100.
  • the monitor 11m has a function as an input device that accepts user operations and also has a function as a display device that presents information to the user. That is, the monitor 11m is specifically configured as a touch panel including a display element and an input element (detection element). In FIG. 1A, the user's finger touching the monitor 11m is schematically shown by a two-dot chain line. The same applies to the following description.
  • the monitor 11m displays an image related to the subject acquired by the image sensor. Further, on the monitor 11m, information related to parameters used for shooting is displayed superimposed on an image related to the subject as necessary. Examples of information relating to parameters used for shooting include shutter speed, aperture opening, ISO sensitivity, and the like. These pieces of information are presented to the user in the form of icons, indicators, characters, pictograms, and the like.
  • FIG. 1A shows an example in which an icon Ic for switching face detection ON / OFF is displayed on the monitor 11m.
  • FIG. 1A shows an example in which face detection is turned on by a user operation.
  • FIG. 1A shows a state in which an indicator Id indicating that the imaging apparatus 10 recognizes the face of the subject is displayed superimposed on the image relating to the subject.
  • the imaging apparatus 10 further includes a finder 12f for confirming an image obtained by the imaging element. Therefore, as shown in FIG. 1B, the user can check the subject and the composition using the finder 12f.
  • the user's face looking into the finder 12f is schematically shown as a projection diagram by a two-dot chain line. The same applies to the following description.
  • FIG. 1B is a diagram schematically showing a state where the user is looking through the viewfinder with the right eye.
  • the user can use the monitor 11m or the finder 12f depending on the scene at the time of shooting and the user's preference.
  • FIG. 1C is a schematic diagram illustrating an example of an image observed by the user when the user looks into the viewfinder.
  • When the user looks into the finder 12f, the imaging apparatus 10 presents to the user information similar to the icons and indicators displayed on the monitor 11m when the user confirms an image displayed on the monitor 11m or operates the monitor 11m as a touch panel.
  • the information observed by the user through the finder 12f and the information presented by the monitor 11m may not be the same.
  • the amount of information to be displayed and the form of display may differ between when the user uses the finder 12f and when the user uses the monitor 11m.
  • FIG. 1C shows an example in which information regarding parameters used for shooting, a face detection indicator Id, and an indicator If indicating the center and range of an image obtained by shooting are observed by the user together with the subject image.
  • the finder 12f of the imaging apparatus 10 is configured as an electronic finder or an optical finder.
  • When the finder 12f of the imaging apparatus 10 is configured as an electronic finder, a display element such as an organic EL (Electro Luminescence) display or a liquid crystal display (LCD) is disposed inside the finder 12f. In this case, the image of the subject acquired by the imaging element and the information related to the parameters used for shooting are displayed superimposed on the display element arranged inside the finder 12f.
  • When the finder 12f of the imaging device 10 is configured as an optical finder, for example, an optical system including a pentaprism and a display element such as a liquid crystal display are arranged inside the finder 12f.
  • In this case, the user simultaneously observes the information about the parameters used for shooting, displayed on the display element arranged inside the finder 12f, and the image of the subject obtained through the optical system such as the pentaprism. Therefore, the user can check both the information related to the parameters used for shooting and the image of the subject in the same manner as in the case of the electronic finder.
  • Hereinafter, the display element disposed inside the finder 12f is referred to as the "in-finder monitor".
  • When the user looks into the finder 12f, both the display content of the monitor 11m and the display content of the in-finder monitor are changed. Specifically, for example, almost the entire display area of the monitor 11m is changed to a "black" display, and information corresponding to the information displayed on the monitor 11m is displayed on the in-finder monitor. When the user moves the face away from the finder 12f, the display content on the monitor 11m is returned to the state before the user looked into the finder 12f, and the display on the in-finder monitor is changed to "black".
  • Alternatively, when the user looks into the finder 12f, the display of the monitor 11m may simply be turned off and the display of the in-finder monitor turned on. When the user moves the face away from the finder 12f, the display on the in-finder monitor is turned off and the display on the monitor 11m is turned on again.
  • In this way, the display of the monitor 11m is switched on or off and the display of the in-finder monitor is switched on or off. When the user is not looking into the finder 12f, the display of the monitor 11m is turned on as shown in FIG. 1A, and the display of the in-finder monitor is turned off. Conversely, when the user is looking into the finder 12f, the display of the in-finder monitor is turned on, whereas the display of the monitor 11m is turned off.
  • Since the display element on the side that is not being used is turned off, wasteful power consumption is suppressed.
  • In addition, since the light emitted from the monitor 11m does not enter the eyes of the user looking through the finder 12f, the user's shooting is not hindered.
  • the display of the monitor 11m and the display of the monitor in the finder are switched depending on whether or not the user is looking into the finder 12f.
  • whether or not the user is looking through the finder 12f is determined based on whether or not a human face is in contact with or close to the monitor 11m. Whether a human face is in contact with or close to the monitor 11m is determined based on an input from the monitor 11m configured as a touch panel. The determination of whether or not a person's face is in contact with or approaching the monitor 11m, and switching between the display of the monitor 11m and the display of the monitor in the finder are performed by, for example, a control unit described later.
  • In this way, whether or not a person's face is in contact with or approaching the monitor is determined based on input from the monitor configured as a touch panel, and therefore dedicated parts such as a proximity sensor for detecting the use of the finder by the user can be dispensed with.
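  • As an illustration of this idea (a minimal sketch, not the patent's actual implementation; the class names, the set_enabled method, and the simple three-contact face test are assumptions introduced here), the following Python code shows how a control unit could toggle the rear monitor and the in-finder monitor from a face/no-face decision derived purely from touch-panel input:

```python
class DisplayController:
    """Illustrative sketch: switch the rear monitor and the in-finder
    monitor based on whether the touch-panel input looks like a face."""

    def __init__(self, monitor, finder_monitor, is_face_fn):
        self.monitor = monitor                # rear touch-panel display (first display unit)
        self.finder_monitor = finder_monitor  # display element inside the finder (second display unit)
        self.is_face_fn = is_face_fn          # callable: touch points -> bool

    def update(self, touch_points):
        """Called whenever the touch panel reports contact or approach."""
        if self.is_face_fn(touch_points):
            # The user is presumed to be looking into the finder:
            # turn the rear monitor off and the in-finder monitor on.
            self.monitor.set_enabled(False)
            self.finder_monitor.set_enabled(True)
        else:
            # Normal touch operation: keep the rear monitor active.
            self.monitor.set_enabled(True)
            self.finder_monitor.set_enabled(False)


class FakeDisplay:
    """Stand-in display object used only for this sketch."""

    def __init__(self, name):
        self.name = name
        self.enabled = True

    def set_enabled(self, on):
        self.enabled = on
        print(f"{self.name}: {'on' if on else 'off'}")


if __name__ == "__main__":
    controller = DisplayController(
        FakeDisplay("monitor 11m"),
        FakeDisplay("in-finder monitor"),
        is_face_fn=lambda pts: len(pts) >= 3,  # placeholder test; see the evaluation methods below
    )
    controller.update([(10, 20)])                  # one finger: rear monitor stays on
    controller.update([(5, 5), (6, 30), (8, 55)])  # three contacts: switch to the finder
```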
  • FIG. 2A is a functional block diagram illustrating a configuration example of the display control apparatus according to the embodiment of the present disclosure.
  • the display control device 1 includes at least a first display unit 11, a second display unit 12, and a control unit 13.
  • the first display unit 11 accepts user operations and presents information to the user.
  • the monitor 11m shown in FIGS. 1A and 1B corresponds to the first display unit 11
  • the finder 12f shown in FIGS. 1A and 1B corresponds to the second display unit 12.
  • the control unit 13 switches between the display by the first display unit 11 and the display by the second display unit 12 based on the input from the first display unit 11.
  • The control unit 13 determines whether or not the object approaching the first display unit 11 is a human face based on the input from the first display unit 11. Details of this determination will be described later.
  • The control unit 13 switches between the display by the first display unit 11 and the display by the second display unit 12 according to the result of that determination.
  • FIG. 2B is a functional block diagram illustrating a configuration example of the imaging device according to the embodiment of the present disclosure.
  • the imaging device 10 includes at least an imaging element 15, a first display unit 11, a second display unit 12, and a control unit 13.
  • the imaging optical system 111 is an optical system including one or more lenses.
  • the imaging optical system 111 forms an image of the subject on the imaging surface of the imaging element 15 disposed inside the housing 100 of the imaging device 10.
  • At least one of the one or more lenses arranged in the imaging optical system 111 is movable for autofocusing, for example. These lenses are moved by a lens driving mechanism.
  • a control signal for the lens driving mechanism is supplied from the control unit 13 described later. Further, information regarding the amount of movement of the lens is supplied to the control unit 13 described later.
  • the imaging optical system 111 includes one or more mirrors as necessary.
  • When the imaging apparatus is a single-lens reflex camera, light from the subject is reflected by a mirror disposed inside the housing and guided to the optical system of the finder disposed above the housing.
  • When the imaging device is a camera with a pellicle mirror, a part of the light from the subject is reflected by the semi-transparent mirror disposed inside the housing and is guided to an autofocus sensor placed on the upper or lower portion of the housing.
  • The present disclosure can also be applied when the imaging apparatus 10 does not include a mirror in the imaging optical system 111. That is, the present disclosure can be applied even if the imaging apparatus 10 is a digital camera called a so-called "mirrorless single-lens camera" or the like. Thus, the present disclosure does not depend on the presence or absence of a mirror in the imaging optical system 111.
  • (Imaging element 15) Light from the subject enters the imaging element 15 via the imaging optical system 111.
  • the image sensor 15 outputs an image signal related to the subject in accordance with a timing signal supplied under the control of the control unit 13 described later.
  • the image signal regarding the subject acquired by the image sensor 15 is output to the A (Analog) / D (Digital) conversion unit 113.
  • Examples of the imaging element 15 include a CMOS (Complementary Metal-Oxide Semiconductor) and a CCD (Charge-Coupled Device).
  • the A / D conversion unit 113 performs A / D (Analog-Digital) conversion on an analog signal output from the image sensor 15 according to control of the control unit 13 described later, and outputs a digital signal.
  • the signal output from the A / D conversion unit 113 is supplied to the preprocessing unit 115.
  • the preprocessing unit 115 performs predetermined signal processing on the image signal related to the subject in accordance with the control of the control unit 13 described later.
  • Examples of signal processing for an image signal related to a subject include digital gain adjustment, gamma correction, color correction, and contrast correction.
  • the encoder / decoder 117 performs encoding (encoding) using a coding method for the signal subjected to signal processing by the preprocessing unit 115 under the control of the control unit 13 described later.
  • the encoder / decoder 117 compresses the image signal subjected to the signal processing by the preprocessing unit 115 by a compression encoding method such as JPEG (Joint Photographic Experts Group), and generates compressed data.
  • Data obtained by encoding is stored in, for example, a storage unit 119 described later.
  • the encoder / decoder 117 performs decoding (decoding) of data read from the storage unit 119 described later.
  • the decrypted information is supplied to the control unit 13 described later, for example.
  • the storage unit 119 includes, for example, an external storage device that can be freely attached to and detached from the imaging device 10 and an internal storage device that is fixed inside the imaging device 10.
  • Examples of the storage medium applied to the external storage device or the internal storage device include a flash memory, an optical disk, a hard disk, a magneto-optical disk, an MRAM (Magnetic Resistive Random Access Memory), and the like.
  • the image data obtained by shooting is stored in the storage unit 119 via a reader / writer, for example. Whether the image data is stored in the external storage device or the internal storage device can be arbitrarily set by the user, for example.
  • the operation unit 121 includes various input elements such as a group of function buttons and a release button arranged on the housing 100 of the imaging apparatus 10.
  • the operation unit 121 functions as a user interface for operating the imaging device 10.
  • the operation unit 121 may include an external control device such as a remote controller. An operation signal received by the operation unit 121 and corresponding to a user input operation is output to the control unit 13 described later.
  • the imaging device 10 includes one or more sensors 19 for detecting the posture of the imaging device 10 and vibrations applied to the imaging device 10 as necessary. Information regarding the orientation of the imaging device 10 at the time of shooting is added to the image data as metadata, for example, by the control unit 13 described later.
  • Detecting elements for detecting the attitude of the imaging device 10 include a vibration gyroscope and an acceleration sensor.
  • the angle of the imaging device 10 may be detected by an angle sensor or a so-called vertical / horizontal sensor, or the posture of the imaging device 10 may be detected from a load on the actuator of the lens driving mechanism.
  • the vertical / horizontal sensor is a sensor that detects the angle of the casing 100 of the imaging device 10 by detecting which surface inside the slot is in contact with the ball disposed inside the slot.
  • the imaging device 10 includes one or more camera shake correction mechanisms 17 as necessary.
  • When the camera shake correction mechanism 17 is an optical correction mechanism, the image sensor 15 itself or a correction lens disposed in the imaging optical system 111 can be moved, according to the vibration applied to the imaging device 10, in a direction that cancels the influence of the vibration.
  • the vibration applied to the imaging device 10 is detected by, for example, a vibration gyroscope that is disposed as the sensor 19 inside the housing 100 of the imaging device 10.
  • the camera shake correction may be executed by the control unit 13 described later. In this case, correction is performed by calculation on an image signal obtained from the image sensor 15. Of course, physical camera shake correction for the optical system and electrical camera shake correction for the image signal may be combined.
  • The control unit 13 is a processing device including a processor, and is configured as, for example, a digital signal processor (DSP) or a CPU (Central Processing Unit).
  • the control unit 13 determines, for example, whether or not the object approaching the first display unit 11 is a human face. In response, the display by the first display unit 11 and the display by the second display unit 12 are switched. The control unit 13 controls each unit of the imaging apparatus 10 and outputs a processing result corresponding to an input from the operation unit 121 or the first display unit 11 described later, for example.
  • Programs for various arithmetic processes and for controlling each unit of the imaging apparatus 10 are stored in, for example, a ROM (Read-Only Memory) 123, a RAM (Random Access Memory) 125, or the storage unit 119 connected to the control unit 13.
  • the control unit 13 controls reading and writing of data in the storage unit 119, and acquires a program stored in an external storage device, for example, as necessary.
  • In this way, a program or software for controlling the imaging device 10 can be updated.
  • the first display unit 11 is a display device arranged on the back surface of the housing 100 of the imaging device 10.
  • the 1st display part 11 is connected with the housing 100 of the imaging device 10 via a hinge etc., for example.
  • the first display unit 11 accepts user operations and presents information to the user.
  • the first display unit 11 is specifically configured as a touch panel, and the touch panel includes, for example, a set of display elements and input elements.
  • Examples of the display element constituting the first display unit 11 include an organic EL display and a liquid crystal display.
  • the display elements constituting the first display unit 11 are driven under the control of the control unit 13. Therefore, the first display unit 11 displays, for example, an image relating to the subject obtained by the photoelectric conversion action of the image sensor 15 and a result of processing for input from the user. Further, for example, the first display unit 11 displays one or more icons for changing setting values of various parameters used for shooting, one or more icons for menu operations, and the like.
  • the input element as the detection element is laminated on the surface of the display element constituting the first display unit 11, for example.
  • the input element as the detection element is disposed on the outer peripheral portion of the display element constituting the first display unit 11.
  • The first display unit 11 may also be configured as a so-called "in-cell" touch panel, in which the function of the input element is built into the display element.
  • The first display unit 11 detects, for example, the user's contact with or approach to the first display unit 11; the input element as a detection element only needs to be able to detect at least the contact or approach of an object, and it is not necessary for the first display unit 11 to selectively detect only a specific target.
  • the input element that constitutes the first display unit 11 detects, for example, a contact of a user's finger or the like with respect to an icon displayed on the display element that constitutes the first display unit 11.
  • the input element that constitutes the first display unit 11 outputs to the control unit 13 information regarding which part of the display area of the first display unit 11 the user has touched.
  • The control unit 13 receives the output from the first display unit 11 and, for the icon displayed at the part of the first display unit 11 touched by the user, gives an instruction to each unit of the imaging apparatus 10 to execute the corresponding function or changes the setting value of the parameter corresponding to that icon.
  • The first display unit 11 can further detect not only the user's contact with the first display unit 11 but also the approach of the user's body to the first display unit 11. In particular, it is preferable that the first display unit 11 can detect the approach of the user's face to the first display unit 11. Note that there does not need to be a direct relationship between the approach of the user's face to the first display unit 11 and the contents of the screen (icons, indicators, and the like) displayed on the display element constituting the first display unit 11.
  • the first display unit 11 only needs to be able to detect at least either the contact or approach of the user's body to the first display unit 11, and the detection method in the first display unit 11 is not particularly limited.
  • Specific examples of detection methods in the first display unit 11 include a projected capacitance method, a surface capacitance method (analog capacitive coupling method), an image recognition method, an infrared scanning method, an infrared retroreflection method, a resistive film method, an ultrasonic surface acoustic wave method, an acoustic pulse recognition method, and a vibration detection method. Two or more of these may be used in combination.
  • The input element constituting the first display unit 11 is preferably capable of multi-point detection, and more preferably is also capable of detecting the approach of an object to the first display unit 11.
  • When the projected capacitance method is selected as the detection method in the first display unit 11, it is also possible, for example, to combine a plurality of adjacent electrodes to increase the effective electrode area, so that the first display unit 11 can detect the approach of the user's body even when the user's body is not in contact with the first display unit 11.
  • The method of detecting the contact or approach of the user's body to the first display unit 11 is not limited to the projected capacitance method. For example, if a plurality of infrared-scanning input elements are stacked on the outer periphery of the display element constituting the first display unit 11, the contact and approach of the user's body to the first display unit 11 can be detected by the first display unit 11.
  • In this case, a wider degree of design freedom can be given to the determination, described later, of whether or not the object approaching the first display unit 11 is a human face.
  • the second display unit 12 includes an eyepiece and a display element such as an organic EL display or a liquid crystal display.
  • a display element as a monitor in the finder is disposed inside the finder 12f.
  • the display element arranged inside the finder 12f is driven under the control of the control unit 13 and displays at least information related to photographing.
  • the display of various types of information related to photographing can be switched by an input operation to the operation unit 121 by the user. Therefore, the user can perform photographing by selectively displaying information of a desired type on the display element in the finder 12f.
  • the housing 100 of the imaging device 10 and the second display unit 12 are integrally configured.
  • the second display unit 12 is connected to the housing 100 of the imaging device 10.
  • the second display unit 12 is disposed above the first display unit 11, for example.
  • The display direction of the second display unit 12 is set to be substantially the same as the display direction of the first display unit 11. Therefore, when the user observes the information displayed on the display element in the finder 12f through the eyepiece, the display surface of the monitor 11m faces the user's face.
  • The determination as to whether or not the object that touches or approaches the first display unit 11 is a human face is made, for example, by comparing an evaluation value calculated based on the input from the first display unit 11 with a reference value (hereinafter simply referred to as a "threshold" as appropriate) for determining whether or not the object approaching the first display unit 11 is a human face.
  • That is, when contact or approach of an object is detected by the first display unit 11, an evaluation value for determining whether or not the object touching or approaching the first display unit 11 is a human face is first calculated from the output of the first display unit 11. Next, the calculated evaluation value is compared with a threshold prepared in advance. For example, when the evaluation value exceeds the threshold, it is determined that the object approaching the first display unit 11 is a human face, and the display by the first display unit 11 and the display by the second display unit 12 are switched.
  • The evaluation value is calculated based on, for example, the number of contact points on the first display unit 11, the relative positional relationship between a plurality of contact points on the first display unit 11, the contact area on the first display unit 11, and the like. The evaluation value can also be calculated based not only on contact with the first display unit 11 but also on the degree of approach to the first display unit 11, or on both the degree of contact with and the degree of approach to the first display unit 11.
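  • The evaluate-then-compare flow can be sketched as follows (a hedged illustration: the function names, the sample count-based evaluation, and the threshold value are assumptions, not the patent's implementation):

```python
from typing import Callable, Sequence, Tuple

Point = Tuple[float, float]  # (x, y) contact coordinate reported by the touch panel


def is_face(touches: Sequence[Point],
            evaluate: Callable[[Sequence[Point]], float],
            threshold: float) -> bool:
    """Compute an evaluation value from the touch-panel output and
    compare it with a threshold prepared in advance."""
    return evaluate(touches) > threshold


def contact_count(touches: Sequence[Point]) -> float:
    """One possible evaluation value: the number of contact points
    (fingers give at most about two, while a face pressed against the
    monitor is expected to give three or more)."""
    return float(len(touches))


if __name__ == "__main__":
    pinch = [(10.0, 20.0), (30.0, 25.0)]               # two fingers
    face = [(12.0, 40.0), (14.0, 20.0), (35.0, 30.0)]  # e.g. nose, lips, cheek
    print(is_face(pinch, contact_count, threshold=2.5))  # False
    print(is_face(face, contact_count, threshold=2.5))   # True
```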
  • (1-4-1. Evaluation value based on the number of contact points) FIGS. 3A and 3B are schematic diagrams illustrating an example of an input operation to the monitor by the user.
  • the monitor 11m has a function as a touch panel. Therefore, the imaging device 10 can detect the contact of the user's body with the monitor 11m by the output from the monitor 11m.
  • FIG. 3A shows a state where the user touches the monitor 11m with the index finger.
  • the user often touches the monitor with one finger in scenes such as selecting menu items displayed on the monitor, instructing execution of shooting, or changing the zoom magnification.
  • Examples of such an operation for input include a touch operation, a tracing operation, a so-called “flick” operation, a so-called “tap” operation, and the like.
  • the contact between the user's body and the monitor 11m can be roughly regarded as a contact at one point.
  • In FIG. 3A, the contact point between the user's body and the monitor 11m is schematically shown by a shaded circle. The same applies to the following description.
  • FIG. 3B shows a state where the user touches the monitor 11m with the index finger and the thumb.
  • the user can also instruct the imaging apparatus 10 to enlarge or reduce the image displayed on the monitor 11m, rotate, or the like by, for example, a so-called “pinch” operation.
  • the user often touches the monitor with two fingers.
  • the contact between the user's body and the monitor 11m can be roughly regarded as a contact at two points.
  • FIG. 4A is a schematic diagram illustrating a state in which the user looks into the viewfinder of the imaging apparatus with the bottom surface of the imaging apparatus being substantially horizontal.
  • the direction in which the light related to the subject image is directed from the eyepiece lens of the finder 12f to the user's eyes is substantially parallel to the display direction of the monitor 11m. Therefore, when the user uses the finder 12f, the user's face and the display surface of the monitor 11m face each other.
  • Since the user's nose, lips, and right cheek protrude relatively toward the display surface of the monitor 11m, as shown in FIG. 4A, three points of the user's face, namely the nose, lips, and right cheek, come into contact with the monitor 11m. That is, while the number of contact points when the user operates the monitor 11m as a touch panel with a finger is about two at most, the number of contact points when the user looks into the finder 12f is believed to be three or more.
  • Therefore, by having the first display unit 11 detect the user's contact with the first display unit 11 and checking the number of contact points based on the output from the first display unit 11, it can be determined whether or not the object approaching the first display unit 11 is a human face.
  • In other words, the imaging apparatus 10 can determine whether or not the user is looking into the finder 12f.
  • Alternatively, the first display unit 11 may detect the coordinates of the user's contact points on the first display unit 11, and by examining the positions of the plurality of contact points based on these inputs, it can be determined whether or not the object approaching the first display unit 11 is a human face.
  • For example, the control unit 13 may compare a graphic pattern obtained by connecting a plurality of contact points on the first display unit 11 configured as a touch panel with a triangular pattern obtained by connecting three points corresponding to the user's nose, lips, and cheek. For example, based on the input from the first display unit 11, the control unit 13 calculates the similarity between the two patterns as the evaluation value. By having the control unit 13 compare the evaluation value with a threshold prepared in advance, it can be determined whether or not the object approaching the first display unit 11 is a human face.
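  • A crude sketch of such a pattern comparison is shown below (assumed, simplified logic: the similarity here just compares scale-normalized side lengths of the contact triangle against a reference nose-lips-cheek triangle, and all coordinate values and the threshold are illustrative):

```python
import math
from itertools import combinations


def side_lengths(points):
    """Sorted side lengths of the triangle defined by three points."""
    return sorted(math.dist(a, b) for a, b in combinations(points, 2))


def triangle_similarity(contacts, reference):
    """Rough similarity in [0, 1] between the triangle of contact points
    and a reference triangle (nose, lips, cheek)."""
    if len(contacts) != 3:
        return 0.0
    s1, s2 = side_lengths(contacts), side_lengths(reference)
    s1 = [v / s1[-1] for v in s1]  # normalize by the longest side
    s2 = [v / s2[-1] for v in s2]
    diff = sum(abs(a - b) for a, b in zip(s1, s2)) / 3.0
    return max(0.0, 1.0 - diff)


if __name__ == "__main__":
    # Reference triangle in monitor coordinates; values are purely illustrative.
    reference = [(40.0, 30.0), (40.0, 10.0), (15.0, 22.0)]  # nose, lips, right cheek
    contacts = [(42.0, 31.0), (41.0, 12.0), (18.0, 20.0)]   # detected contact points
    value = triangle_similarity(contacts, reference)
    print("similarity:", round(value, 3), "face?", value > 0.8)  # 0.8 is an illustrative threshold
```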
  • Alternatively, weighting may be performed in advance for each contact position on the monitor 11m, and the control unit 13 may calculate a weighted evaluation value.
  • FIG. 4B is a schematic diagram showing an example of weighting for lattice points corresponding to the arrangement of electrodes.
  • For example, grid points are set in correspondence with the arrangement of a plurality of electrodes arranged in the horizontal and vertical directions of the monitor 11m, and a weight W_i (the subscript i distinguishes the individual grid points) is set for each grid point.
  • In FIG. 4B, the positions of a plurality of electrodes arranged in the horizontal direction and the vertical direction of the monitor 11m are schematically shown by broken lines.
  • the arrangement direction of the electrodes may be arbitrarily set in a plane parallel to the display surface of the monitor 11m, and a plurality of electrodes arranged in the horizontal direction of the monitor 11m and a plurality of electrodes arranged in the vertical direction are: It does not necessarily have to be orthogonal.
  • The weight W_i set for each grid point is set, for example, according to how likely that part of the monitor 11m is to be contacted by the user's face when the user looks into the finder 12f.
  • In FIG. 4B, the contact points between the user's body and the monitor 11m, schematically indicated by the shaded circles in FIG. 4A, are indicated by two-dot chain lines.
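  • A minimal sketch of such a weighted evaluation is given below (the grid size, the weight values, and the threshold are illustrative assumptions; a real mapping would follow the actual electrode layout):

```python
def weighted_evaluation(contact_grid, weights):
    """Sum the pre-assigned weight W_i of every grid point that reports contact.

    contact_grid and weights are 2-D lists of the same shape; contact_grid
    holds 0/1 detection flags per electrode intersection (grid point)."""
    return sum(w * c
               for w_row, c_row in zip(weights, contact_grid)
               for w, c in zip(w_row, c_row))


if __name__ == "__main__":
    # 4 x 6 grid of illustrative weights: larger where the nose, lips,
    # or cheek would be expected to press when the finder is used.
    weights = [
        [0, 0, 1, 1, 0, 0],
        [0, 1, 3, 3, 1, 0],
        [0, 1, 3, 3, 1, 0],
        [0, 0, 1, 1, 0, 0],
    ]
    # Detection flags from the touch panel (1 = contact at that grid point).
    contact_grid = [
        [0, 0, 0, 0, 0, 0],
        [0, 0, 1, 1, 0, 0],
        [0, 0, 1, 1, 0, 0],
        [0, 0, 0, 0, 0, 0],
    ]
    value = weighted_evaluation(contact_grid, weights)
    print("evaluation value:", value, "face?", value > 8)  # threshold 8 is illustrative
```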
  • In the determination based on the relative positional relationship between a plurality of contact points, whether or not the object approaching the first display unit 11 is a human face is determined by simple pattern matching.
  • Therefore, the risk of malfunction of the imaging apparatus 10 is reduced. For example, assume that the user touches the monitor 11m with three fingers. Even in this case, compared with the determination based on the number of contact points, the determination based on the relative positional relationship between the plurality of contact points is less likely to trigger a switching process not intended by the user. In particular, if the display positions of icons are adjusted to locations away from the positions where the user's nose, lips, and cheeks come into contact, the possibility of malfunction can be further reduced.
  • For example, if grid points are set in correspondence with the arrangement of a plurality of electrodes arranged in the horizontal and vertical directions of the monitor 11m, the contact area of the user's body with respect to the monitor 11m (the area ratio of the portion touched by the user to the entire display area of the monitor 11m) may be estimated. This is equivalent to setting the weights corresponding to all the grid points equal in the above-described determination based on the relative positional relationship between a plurality of contact points.
  • The contact area with respect to the monitor 11m is considered to be larger when the user looks into the finder 12f than when the user operates the monitor 11m with a finger. Therefore, by using the contact area with the monitor 11m as the evaluation value and comparing it with a threshold, it is possible to easily determine whether or not the user is looking through the finder 12f.
  • Like the above-described determination based on the number of contact points, the determination based on the contact area is easy to implement.
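  • For the contact-area approach, a sketch could look like this (the grid, values, and threshold are illustrative assumptions; the ratio of contacted grid points stands in for the true contact area ratio):

```python
def contact_area_ratio(contact_grid):
    """Ratio of grid points reporting contact to all grid points."""
    cells = [c for row in contact_grid for c in row]
    return sum(cells) / len(cells)


if __name__ == "__main__":
    finger_touch = [[0, 0, 0, 0],
                    [0, 1, 0, 0],
                    [0, 0, 0, 0]]
    face_touch = [[0, 1, 1, 0],
                  [1, 1, 1, 1],
                  [0, 1, 1, 0]]
    threshold = 0.25  # illustrative value
    for name, grid in (("finger", finger_touch), ("face", face_touch)):
        ratio = contact_area_ratio(grid)
        print(f"{name}: ratio={ratio:.2f}, looking into finder? {ratio > threshold}")
```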
  • the first display unit 11 may further detect not only the contact of the object with the first display unit 11 but also the approach of the object with respect to the first display unit 11.
  • In this case, the control unit 13 can also determine the degree of approach of the user's body to the monitor 11m from multipoint detection of the approaching object and from the increase or decrease in capacitance.
  • FIGS. 5A and 5B are schematic diagrams illustrating an example of an input operation to the monitor by the user.
  • FIG. 5A shows a state where the user touches the monitor 11m with the index finger.
  • FIG. 5B shows a state where the user touches the monitor 11m with the index finger and the thumb.
  • When the user touches the monitor 11m with two fingers, the area of the portion where the user's body and the monitor 11m come close to each other increases compared with when the user touches the monitor 11m with one finger.
  • the portion of the display area of the monitor 11m where the user's body and the monitor 11m approach each other is schematically shown by a cross line. The same applies to the following description.
  • FIG. 6A is a schematic diagram illustrating a state in which the user looks into the viewfinder of the imaging apparatus with the bottom surface of the imaging apparatus being substantially horizontal.
  • By using the area over which the user's body and the monitor 11m approach each other as the evaluation value, the control unit 13 can determine whether the user is looking through the finder 12f or the user is operating the monitor 11m configured as a touch panel.
  • Alternatively, the control unit 13 may determine whether or not the object approaching the first display unit 11 is a human face based on the number of portions where the user's body is close to the monitor 11m and on their relative positional relationship. Further, a weighted evaluation value may be used in the process of determining whether or not the object approaching the first display unit 11 is a human face.
  • FIG. 6B is a schematic diagram illustrating an example of weighting according to the possibility of user contact with the monitor.
  • When the user looks into the finder 12f, the distance between the monitor 11m and each of the user's nose, lips, and cheeks, which protrude toward the monitor 11m, is smaller than the distance between the monitor 11m and other parts of the user's face.
  • Therefore, the weight for calculating the evaluation value may be set according to the likelihood that the user's body contacts the monitor 11m. For example, in the display area of the monitor 11m, the largest weight is set for the area where the user's nose, lips, or cheeks are most likely to make contact, and progressively smaller weights are set as the distance from that area increases.
  • In FIG. 6B, the area of the display area of the monitor 11m where the user's nose, lips, or cheeks are most likely to touch is shown in black. Further, the area where the user's nose, lips, or cheeks are next most likely to come into contact is indicated by dark shading, and the area where contact is next most likely after that is indicated by light shading.
  • a weight corresponding to the distance between each part of the user's face and the monitor 11m may be set.
  • By performing such weighting, it is possible to increase the accuracy of the determination of whether or not the object approaching the first display unit 11 is a human face.
  • FIG. 7A is a schematic diagram schematically showing an area of the monitor display area where the user's nose, lips or cheeks are likely to approach when the user looks into the viewfinder of the imaging device with his right eye.
  • FIG. 7B is a schematic diagram schematically showing an area of the monitor display area where the user's nose, lips or cheeks are likely to approach when the user looks through the viewfinder of the imaging device with his left eye.
  • As shown in FIGS. 7A and 7B, the area of the display area of the monitor 11m where the user's nose, lips, or cheek is likely to approach when the user looks into the finder 12f of the imaging device 10 depends on which eye the user uses to look into the finder 12f.
  • In FIGS. 7A and 7B, the area of the display area of the monitor 11m where the user's nose, lips, or cheek is likely to approach while the user looks into the finder 12f of the imaging device 10 is indicated by shading.
  • FIG. 8A is a schematic diagram showing, in the display area of the monitor, the region where the user's nose, lips, or cheeks are likely to approach when the user looks into the finder of the imaging apparatus with the right eye and the region where they are likely to approach when the user looks into the finder with the left eye.
  • the region Rn illustrated in FIG. 8A is a region where the user's nose is likely to approach when the user looks into the finder 12f with the right eye.
  • a region Rm illustrated in FIG. 8A is a region where the user's lips are likely to approach when the user looks into the finder 12f with the right eye.
  • a region Rc illustrated in FIG. 8A is a region where the user's right cheek is likely to approach when the user looks into the finder 12f with the right eye.
  • the region Ln shown in FIG. 8A is a region where the user's nose is likely to approach when the user looks into the finder 12f with the left eye.
  • a region Lm illustrated in FIG. 8A is a region where the user's lips are likely to approach when the user looks through the finder 12f with the left eye.
  • a region Lc illustrated in FIG. 8A is a region where the user's left cheek is likely to approach when the user looks into the finder 12f with the left eye.
  • The regions C1, C2, and C3, indicated by hatching and cross lines in FIG. 8A, are regions where the user's nose, lips, or cheeks are highly likely to approach regardless of which eye the user uses to look through the finder 12f of the imaging apparatus.
  • For example, the weight W_0 in the regions C1, C2, and C3 is set larger than the weight W_1 (W_1 > 0) in the regions Rn, Rm, Rc, Ln, Lm, and Lc, and the weight W_2 in the remaining region is set to 0.
  • With such weighting, it can be determined whether or not the object approaching the first display unit 11 is a human face regardless of which eye the user uses to look into the finder 12f. For example, if grid points are set corresponding to the arrangement of a plurality of electrodes arranged in the horizontal and vertical directions of the monitor and the contact or approach of an object to the monitor is detected for each grid point, it is also possible to make the determination using the area ratio. That is, if the presence or absence of detection at each grid point is taken as a unit and integrated over the entire display area of the monitor, the ratio of the portion where the object touches the monitor to the entire display area of the monitor can be calculated.
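  • The following sketch illustrates such region-dependent weights (the region map, the weight values W_0 > W_1 > W_2 = 0, and the threshold are all illustrative assumptions laid out on a coarse grid):

```python
# Illustrative region map for a coarse 4 x 8 grid over the monitor:
#   'C'       -> likely to be approached with either eye (weight W0)
#   'R' / 'L' -> likely only in the right-eye / left-eye case (weight W1)
#   '.'       -> other regions (weight W2 = 0)
REGION_MAP = [
    "..CCCC..",
    ".RLCCRL.",
    ".RLCCRL.",
    "........",
]
WEIGHTS = {"C": 3.0, "R": 1.0, "L": 1.0, ".": 0.0}  # W0 = 3, W1 = 1, W2 = 0 (illustrative)


def region_weighted_value(detected_points):
    """Sum region weights over detected (row, col) grid points."""
    return sum(WEIGHTS[REGION_MAP[r][c]] for r, c in detected_points)


if __name__ == "__main__":
    # Contacts roughly where a nose, lips, and cheek would land (right-eye case).
    face_like = [(1, 3), (2, 3), (2, 1)]
    value = region_weighted_value(face_like)
    print("evaluation value:", value, "face?", value > 5.0)  # threshold is illustrative
```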
  • the user can also shoot with the bottom surface of the imaging apparatus being substantially horizontal, but the user can also shoot with the bottom surface of the imaging apparatus being substantially vertical, for example. Therefore, it is also assumed that the user looks into the viewfinder of the imaging apparatus with the bottom surface of the imaging apparatus being substantially vertical.
  • the state in which the bottom surface of the imaging device is substantially parallel to the horizontal plane will be referred to as “horizontal state” as appropriate, and the state in which the bottom surface of the imaging device is substantially perpendicular to the horizontal plane will be referred to as “vertical state”. As appropriate.
  • FIG. 8B schematically shows an area where the user's nose, lips or cheeks are likely to approach when the user looks through the viewfinder of the imaging apparatus in the vertical state with his right eye.
  • FIG. 9A schematically illustrates an area of the monitor display area where the user's nose, lips, or cheeks are likely to approach when the user looks through the viewfinder of the imaging apparatus in a vertical state with the left eye.
  • FIG. 9B is a schematic diagram showing, in the display area of the monitor, the region where the user's nose, lips, or cheeks are likely to approach when the user looks into the finder of the imaging device in the vertical state with the right eye and the region where they are likely to approach when the user looks into the finder with the left eye.
  • FIG. 10A is a schematic diagram illustrating a state in which the user looks through the viewfinder of the imaging apparatus in the vertical state with the right eye.
  • FIG. 10B is a schematic diagram illustrating a state where the user looks through the viewfinder of the imaging apparatus in the vertical state with the left eye.
  • FIGS. 8B, 9A, and 9B correspond to FIGS. 7A, 7B, and 8A, respectively.
  • FIGS. 10A and 10B correspond to FIG. 6A.
  • In FIGS. 10A and 10B, the areas where the user's nose, lips, or cheeks are closer to or in contact with the monitor, in comparison with other areas, while the user is looking through the viewfinder of the imaging device are indicated by shading and cross lines.
  • As shown in FIG. 8B, FIG. 9A, and FIG. 9B, when the imaging device 10 is in the vertical state, the area of the display area of the monitor 11m where the user's nose, lips, or cheek is likely to approach differs greatly from the case where the imaging apparatus 10 is in the horizontal state.
  • Further, as shown in FIGS. 10A and 10B, when the imaging apparatus 10 is in the vertical state, the area over which the user's face and the display surface of the monitor 11m face each other differs greatly depending on which eye the user uses to look into the finder 12f.
  • Therefore, the display control device 1 may execute processing according to the posture of the imaging device 10 in order to prevent display switching that is not intended by the user. Specifically, for example, when the imaging device 10 is in the horizontal state, weighting is performed according to a mapping such as that illustrated in FIG. 8A, and when the imaging device 10 is in the vertical state, weighting is performed according to a mapping such as that illustrated in FIG. 9B.
  • the accuracy of the determination as to whether or not the object approaching the first display unit is a human face is further increased. be able to.
  • the case where the user holds the imaging device vertically or horizontally is shown.
  • a plurality of mappings corresponding to other postures are further prepared. May be.
  • the threshold value is appropriately set according to the orientation of the imaging device. You may change it.
  • a digital camera generally has a plurality of modes. That is, the digital camera includes, for example, a “shooting mode” and a “reproduction mode”.
  • The shooting mode is a mode in which an image of the subject is displayed on the in-finder monitor or the monitor on the back of the housing while the apparatus waits for an image recording instruction from the user.
  • the reproduction mode is a mode in which an image obtained by imaging is displayed on the monitor in the finder or the monitor on the back of the casing, and image data can be edited as necessary.
  • Since the main operation is switching settings via a menu, it is often sufficient to detect about one finger. This is because an imaging apparatus equipped with a finder is designed mainly for intermediate and advanced users.
  • the ease of switching between the display by the first display unit 11 and the display by the second display unit 12 may be changed according to the mode of the imaging device 10. More specifically, for example, the control unit 13 may select a threshold value corresponding to each mode depending on whether the imaging apparatus 10 is in the shooting mode or the playback mode.
  • For example, the threshold value used in the shooting mode for determining whether or not an object approaching the first display unit 11 is a human face is set smaller than the threshold value used in the playback mode.
  • With the smaller threshold value in the shooting mode, it is easy to determine that the user is looking into the viewfinder 12f when an object touches or approaches the monitor 11m.
  • Conversely, when the imaging device 10 is in the reproduction mode, it is difficult to determine that the user is looking into the finder 12f even when an object touches or approaches the monitor 11m. For this reason, display switching unintended by the user is prevented when the user performs an input operation or the like with a plurality of fingers.
  • When a threshold value corresponding to the mode of the imaging device is selected in this way, it is not necessary to change the weighting mapping according to the mode of the imaging device.
  • Alternatively, a weighting mapping corresponding to the mode of the imaging apparatus may be selected. In this case as well, the same effect as when a mode-dependent threshold value is selected can be obtained.
  • FIG. 11 is a flowchart illustrating an example of processing according to the embodiment of the present disclosure.
  • In this example, the display by the first display unit 11 and the display by the second display unit 12 are switched according to the result of determining whether or not the user is looking through the finder 12f.
  • A program for the series of processes described below with reference to FIG. 11 is stored in, for example, the ROM 123, the RAM 125, or the storage unit 119, and the series of processes is executed by the control unit 13.
  • In step St1, the output from the first display unit 11 is acquired by the control unit 13.
  • Specifically, the control unit 13 receives the output from the monitor 11m as an input and obtains, for example, information on the number of contact points and the coordinates of the individual contact points.
  • In step St2, an evaluation value EV for determining whether or not the object approaching the first display unit 11 is a human face is calculated based on the input from the first display unit 11.
  • As the evaluation value EV, for example, the number of contact points of the user's body on the monitor 11m, the relative positional relationship between the plurality of contact points (similarity of the shape pattern), the contact area, or the like can be used.
  • In the following description, the number of contact points of the user's body on the monitor 11m is used as the evaluation value EV.
  • At this time, a process for calculating an evaluation value EV whose magnitude corresponds to the posture of the imaging apparatus 10 may be executed.
  • For that purpose, it suffices to apply the mapping corresponding to the posture of the imaging device 10 as the weighting mapping.
  • In step St3, a threshold value TH for determining whether or not the object approaching the first display unit 11 is a human face is read into the control unit 13 from, for example, the ROM 123.
  • When a plurality of threshold values are prepared, an appropriate threshold value is selected from them, for example, according to the mode of the imaging device 10.
  • In step St4, the evaluation value EV calculated in step St2 is compared with the threshold value TH selected in step St3.
  • When the comparison indicates that the object approaching the first display unit 11 is a human face, the process proceeds to step St5, in which, for example, the display of the first display unit 11 is turned off and the display of the second display unit 12 is turned on.
  • Otherwise, the process proceeds to step St6, in which the display of the second display unit 12 is turned off and the display of the first display unit 11 is turned on.
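  • The flow of steps St1 to St6 can be summarized by the following sketch, using the contact count as the evaluation value EV. The function name, the data passed in, and the return values are assumptions made only for illustration.

        # Sketch of the FIG. 11 flow: read the touch output (St1), derive EV (St2),
        # take the threshold TH (St3), compare them (St4) and decide which display
        # should be active (St5/St6).
        def process_touch_frame(contact_points, threshold):
            """contact_points: list of (x, y) contacts reported by the monitor."""
            ev = len(contact_points)       # St2: evaluation value EV (contact count)
            if ev >= threshold:            # St4: compare EV with TH from St3
                return "finder"            # St5: rear monitor off, in-finder display on
            return "rear_monitor"          # St6: in-finder display off, rear monitor on

        # Example: three simultaneous contacts (e.g. nose, lips, cheek) versus TH = 3
        active_display = process_touch_frame([(120, 40), (130, 90), (160, 70)], threshold=3)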
  • FIG. 12A is a flowchart illustrating an example of an evaluation value calculation process.
  • FIG. 12A shows an example of a process for calculating an evaluation value EV having a magnitude corresponding to the attitude of the imaging apparatus.
  • It is assumed that a weighting mapping used when the user holds the imaging device 10 horizontally (hereinafter referred to as the “horizontal state mapping” as appropriate) and a weighting mapping used when the user holds the imaging device 10 vertically (hereinafter referred to as the “vertical state mapping” as appropriate) are prepared.
  • FIGS. 8A and 9B correspond to horizontal state mapping and vertical state mapping, respectively.
  • Each data of the horizontal state mapping and the vertical state mapping is stored in, for example, the ROM 123, the RAM 125, the storage unit 119, and the like.
  • In step St21, it is determined whether or not the imaging apparatus 10 is in the horizontal state. This determination can be made based on an input from the sensor 19.
  • If the imaging apparatus 10 is in the horizontal state, the horizontal state mapping is selected in step St22.
  • Otherwise, the vertical state mapping is selected in step St23.
  • The evaluation value EV is then calculated based on either the horizontal state mapping or the vertical state mapping. For example, by calculating the product sum ΣWi·Bi as the evaluation value EV, where Wi is the weight assigned to grid point i and Bi indicates the presence or absence of detection at grid point i, an evaluation value EV whose magnitude corresponds to the posture of the imaging device 10 is obtained.
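  • A minimal sketch of this posture-dependent calculation is shown below. The two mappings are small placeholder grids; the real weights would follow FIGS. 8A and 9B, and all names and values are illustrative assumptions.

        # Sketch of FIG. 12A: choose the weighting mapping according to posture
        # (St21-St23), then compute EV as the product sum of weights Wi and
        # detections Bi over all grid points.
        HORIZONTAL_MAPPING = [[1.5, 1.0], [1.5, 0.5]]   # placeholder weights Wi (horizontal state)
        VERTICAL_MAPPING   = [[0.5, 1.5], [1.0, 1.5]]   # placeholder weights Wi (vertical state)

        def evaluation_value(detections, is_horizontal):
            """detections: grid of 0/1 values Bi; is_horizontal: from the posture sensor."""
            weights = HORIZONTAL_MAPPING if is_horizontal else VERTICAL_MAPPING
            return sum(w * b
                       for w_row, b_row in zip(weights, detections)
                       for w, b in zip(w_row, b_row))

        ev = evaluation_value([[1, 0], [1, 1]], is_horizontal=True)   # 1.5 + 1.5 + 0.5 = 3.5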
  • FIG. 12B is a flowchart illustrating an example of threshold selection processing.
  • FIG. 12B shows an example of processing for setting a threshold value TH having a magnitude corresponding to the mode of the imaging apparatus.
  • It is assumed that a threshold value used when the imaging device 10 is in the shooting mode (hereinafter referred to as the “shooting mode threshold” as appropriate) and a threshold value used when the imaging device 10 is in the playback mode (hereinafter referred to as the “playback mode threshold” as appropriate) are prepared in advance.
  • the shooting mode threshold and the playback mode threshold are stored in, for example, the ROM 123, the RAM 125, the storage unit 119, and the like.
  • In step St31, it is determined whether or not the imaging device 10 is in the shooting mode. Whether or not the imaging device 10 is in the shooting mode can easily be confirmed by referring to, for example, a variable stored in the RAM 125.
  • If the imaging device 10 is in the shooting mode, the shooting mode threshold is read into the control unit 13 in step St32.
  • Otherwise, the playback mode threshold is read into the control unit 13 in step St33. In this way, a threshold value suitable for determining whether or not the object approaching the first display unit 11 is a human face is selected according to the mode of the imaging device 10.
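  • The selection in FIG. 12B reduces to reading one of two stored values. A minimal sketch follows, in which the numerical thresholds are arbitrary assumptions; only the relation that the shooting mode threshold is smaller than the playback mode threshold reflects the description above.

        SHOOTING_MODE_THRESHOLD = 3   # smaller: the finder determination triggers more readily
        PLAYBACK_MODE_THRESHOLD = 5   # larger: multi-finger menu operation does not switch the display

        def select_threshold(is_shooting_mode):
            # St31: check the mode flag; St32/St33: read the corresponding threshold
            return SHOOTING_MODE_THRESHOLD if is_shooting_mode else PLAYBACK_MODE_THRESHOLD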
  • The process of calculating an evaluation value whose magnitude corresponds to the posture of the imaging device and the process of setting a threshold value whose magnitude corresponds to the mode of the imaging device may each be executed as necessary.
  • Conversely, a process of setting a threshold value corresponding to the posture of the imaging device and a process of calculating an evaluation value corresponding to the mode of the imaging device may also be performed as necessary.
  • The triangular pattern obtained by connecting the three points at which the user's nose, lips, and cheek contact or approach the monitor 11m is almost unique to each user. Therefore, it is preferable that calibration can be performed for each user.
  • In calibration, for example, each user brings his or her face close to the monitor 11m by looking into the finder 12f.
  • With the user's face held close to the monitor 11m, the imaging device 10 detects the contact or approach using the monitor 11m configured as a touch panel.
  • The imaging apparatus 10 then corrects the threshold value or the like used to determine whether or not the object approaching the first display unit 11 is a human face, based on the information detected by the monitor 11m.
  • the control unit 13 executes creation or correction of a pattern or map corresponding to the shape of the user's face.
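  • As one way to realize such a per-user calibration, the contact counts recorded while the user's face is held against the finder could be turned into a user-specific threshold, as in the sketch below. The sampling procedure, the margin, and all names are assumptions, not part of the present disclosure.

        # Sketch of per-user calibration: sample how many contacts the monitor
        # reports while the user looks into the finder, then set the face
        # threshold slightly below the weakest observed contact pattern.
        def calibrate_threshold(recorded_contact_counts, margin=1):
            typical = min(recorded_contact_counts)   # weakest face contact observed
            return max(1, typical - margin)          # corrected threshold for this user

        user_threshold = calibrate_threshold([3, 4, 3])   # e.g. nose, lips and cheek contacts -> 2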
  • In the present disclosure, since the contact or approach of an object is detected by the first display unit itself, it is possible to detect whether or not the user is looking into the viewfinder without using a dedicated component. Therefore, according to the present disclosure, the display by the first display unit and the display by the second display unit can be switched automatically while suppressing both the cost of components and the mounting cost.
  • Whether or not a person's face is approaching the monitor is determined based on the input from the monitor configured as a touch panel.
  • As a result, a dedicated detection component such as a proximity sensor can be eliminated.
  • Furthermore, since it is not necessary to arrange a proximity sensor near the viewfinder, a large monitor can be arranged in the imaging apparatus.
  • According to the present disclosure, it is estimated from the input from the first display unit which of the first display unit and the second display unit the user is using, and switching between the display by the first display unit and the display by the second display unit is executed automatically. Because the switching is automatic, wasteful power consumption is suppressed, the user does not need to perform complicated input operations to switch the display, and convenience for the user is improved.
  • Depending on the shooting environment, the display control device 1 may also be set so that it is more easily determined that the user is looking through the viewfinder 12f.
  • the viewfinder 12f is preferably used when the user shoots in a bright environment where it is difficult to confirm the display on the monitor 11m, such as outdoor shooting in fine weather.
  • the finder 12f may be used even when shooting is performed in a dark environment where the light emitted from the monitor 11m feels dazzling to the user or others.
  • the user can use the finder 12f to check the subject more easily than using the monitor 11m.
  • the light emitted from the monitor 11m does not cause discomfort to the user, other spectators, and performers.
  • In such environments, it is therefore preferable that the display control device 1 is set so that it is easily determined that the user is looking into the finder 12f.
  • When determining whether or not the shooting environment is suitable for use of the finder 12f, the control unit 13 can obtain information about the brightness around the user from, for example, the output of an optical sensor used for adjusting the brightness of the monitor 11m or the output of a photometric sensor used for automatic exposure.
  • If the imaging apparatus 10 has an automatic scene recognition function, information regarding the environment at the time of shooting can also be obtained from that function.
  • If the imaging apparatus 10 includes a GPS (global positioning system) receiver, for example, the control unit 13 can also easily obtain information regarding the location of the user.
  • When the environment is judged to be suited to the finder 12f, the threshold value is set small, or the evaluation value is calculated to be large.
  • To make the evaluation value larger, for example, a weighting mapping is prepared in which each unit of detection is multiplied by a weight greater than 1. When it is determined that the environment at the time of shooting is suitable for use of the finder 12f, this mapping is applied, so that the evaluation value in such an environment becomes larger than in other cases.
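  • A minimal sketch of such an environment-dependent boost is given below. The lux thresholds, the weight of 1.5, and the function names are all assumptions chosen only to illustrate the idea of multiplying each detection unit by a weight greater than 1 in finder-friendly environments.

        BRIGHT_LUX = 10000.0   # e.g. outdoor shooting in fine weather
        DARK_LUX = 10.0        # e.g. a dim concert hall

        def environment_weight(ambient_lux):
            # Bright or very dark surroundings favour the finder, so each unit of
            # detection is boosted; otherwise the evaluation is left unchanged.
            if ambient_lux >= BRIGHT_LUX or ambient_lux <= DARK_LUX:
                return 1.5
            return 1.0

        def weighted_evaluation(contact_count, ambient_lux):
            return environment_weight(ambient_lux) * contact_count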
  • There are also other situations in which the user can perform better shooting by using the finder 12f.
  • When a lens with a long focal length is used, or when shooting is performed at a zoom position with a long focal length, it is more advantageous to use the finder 12f than the monitor 11m.
  • This is because, when the subject is checked on the monitor 11m, the user needs to hold the imaging device 10 away from his or her face to some extent.
  • In contrast, when the subject is checked through the finder 12f, the user can hold the imaging device 10 close to the face with the elbows tucked against the body. Therefore, when the focal length at the time of shooting is long, the user can hold the imaging device 10 more stably by using the finder 12f to check the subject, thereby suppressing blur in the resulting image.
  • The control unit 13 can easily obtain information regarding the focal length as control information.
  • Therefore, when the focal length is long, it is preferable that the display control device 1 is set so that it is easily determined that the user is looking into the viewfinder 12f.
  • When determining whether or not the user is shooting handheld, the control unit 13 can obtain information on the magnitude of the vibration applied to the imaging device 10 from the output of the sensor 19, which includes, for example, a vibration gyroscope and an acceleration sensor. Based on this information, the control unit 13 can determine whether or not the user is shooting handheld.
  • In addition, the degree of camera shake correction may be adjusted according to the magnitude of the vibration applied to the imaging device 10.
  • Here, “adjusting the degree of camera shake correction” means that the operation of one or more camera shake correction mechanisms 17 is sequentially enabled or disabled according to the magnitude of the vibration applied to the imaging apparatus 10, or that the correction amount of a camera shake correction mechanism is changed stepwise or continuously.
  • the imaging apparatus 10 includes a physical camera shake correction mechanism for an optical system and an electrical camera shake correction mechanism for an image signal.
  • As physical camera shake correction for the optical system, the imaging apparatus 10 includes a correction mechanism that moves the image sensor 15 and a correction mechanism that moves a correction lens in the imaging optical system 111.
  • While the vibration applied to the imaging apparatus 10 is small, for example, only one of the correction mechanism that moves the image sensor 15, the correction mechanism that moves the correction lens, and the electrical camera shake correction for the image signal is turned on.
  • As the vibration becomes larger, for example, the correction ratio for the low-frequency region is increased.
  • As the vibration becomes larger still, the remaining mechanisms and the electrical camera shake correction for the image signal are sequentially turned on.
  • When the electrical camera shake correction for the image signal is additionally turned on, camera shake can be corrected over a larger angular range.
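  • The stepwise activation described above can be sketched as follows. The vibration thresholds and mechanism names are illustrative assumptions; only the idea of enabling additional correction mechanisms as the measured vibration grows comes from the description.

        # Sketch of stepwise camera-shake handling: as the vibration level grows,
        # further correction mechanisms are enabled in order (sensor shift, lens
        # shift, then electrical correction of the image signal).
        def active_corrections(vibration_level):
            mechanisms = []
            if vibration_level > 0.2:
                mechanisms.append("sensor_shift")   # physical correction of the optical system
            if vibration_level > 0.5:
                mechanisms.append("lens_shift")     # additional physical correction
            if vibration_level > 0.8:
                mechanisms.append("electronic")     # electrical correction of the image signal
            return mechanisms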
  • When it is determined that the user is shooting handheld, the threshold value may be set to a small value or a weighting mapping that yields a larger evaluation value may be used, and the degree of camera shake correction may be adjusted according to how easily camera shake of the imaging device 10 occurs.
  • In these ways, the display control device 1 may be set so that it is easily determined that the user is looking into the finder 12f. Smoother display switching is thus realized in scenes where the user is likely to use the finder 12f, while malfunction in a normal environment is prevented.
  • FIG. 13A is a side view showing a second modification of the imaging apparatus.
  • FIG. 13B is a side view showing a state in which the finder unit is removed from the main body of the imaging apparatus.
  • The imaging apparatus 20 shown in FIG. 13A is the same as the imaging apparatus 10 described above in that it includes the imaging element 15, the monitor 11m, and the control unit 13.
  • The imaging device 20 includes, for example, a main body portion 20b in which the imaging element 15 is disposed, a lens unit 20r that can be freely attached to and detached from the main body portion 20b, and a finder unit 22f that can be freely attached to and detached from the main body portion 20b. The finder unit 22f shown in FIGS. 13A and 13B corresponds to the second display unit 12.
  • the second modification is different from the imaging device 10 described above in that the second display unit 12 is detachable from the housing 200 of the main body 20b of the imaging device 20, for example.
  • an in-finder monitor 22m is arranged inside the finder unit 22f.
  • On the in-finder monitor 22m, for example, information on parameters used for shooting is displayed.
  • When the finder unit 22f is configured as an electronic finder, an image signal related to the subject and an image signal related to the display of parameters used for shooting are supplied from the control unit 13 to the in-finder monitor 22m via the connector 23.
  • the finder unit 22f may be configured as an optical finder. In this case, an image signal related to display of parameters used for shooting is supplied from the control unit 13 to the in-finder monitor 22m via the connector 23.
  • the display direction of the finder unit 22f can be set to at least substantially the same direction as the display direction of the monitor 11m disposed on the rear surface of the housing 200 or the like. This is because when the user uses the finder unit 22f, the display surface of the monitor 11m disposed on the rear surface of the housing 200 or the like faces the user's face.
  • the second display unit 12 may be configured as a detachable display device.
  • If a finder unit whose display direction can be set substantially the same as the display direction of the monitor is made detachable, a display switching function can be added to an imaging device that includes a monitor configured as a touch panel.
  • The third modification is an example in which, when the user uses the finder, an area of the monitor that does not overlap the user's face is used as an area for input operations.
  • FIG. 14A shows a state where the user looks at the finder 12f of the imaging device 30 with his right eye while the bottom surface of the imaging device 30 is substantially horizontal. As shown in FIG. 14A, when the user looks into the finder 12f of the imaging device 30 with the right eye, the lower right area of the display area of the monitor 11m is unlikely to overlap the user's face.
  • the display by the monitor 11m is turned off, but the monitor 11m configured as a touch panel can be used as an input element.
  • That is, the monitor 11m can be made to detect contact or approach of the user's body in the area of its display area that does not overlap the user's face.
  • the user can touch the area of the monitor 11m that does not overlap the user's face with the thumb of the right hand.
  • In the figure, the region where the user's face and the monitor 11m overlap is schematically indicated by cross-hatching.
  • When the monitor 11m detects an input operation in the area that does not overlap the user's face, the imaging apparatus 30 executes a function assigned in advance to that input operation.
  • To input operations on this area, it is possible to assign, for example, auto-exposure lock, auto-focus lock, switching face detection on and off, use or non-use of the flash, or switching between shooting modes such as an “aperture priority mode” and a “shutter speed priority mode”.
  • A molded marker 301, such as unevenness or Braille-like bumps in the shape of a pictogram, may be provided on the housing 300 of the imaging device 30 so that the user can recognize by touch the area used for input operations while using the finder 12f.
  • Alternatively, the texture of the area of the monitor display area used for input operations while using the finder 12f may be made different from that of other areas.
  • the detection result in the area that does not overlap the user's face is excluded from the determination as to whether or not the object approaching the first display unit 11 is a human face.
  • the detection result in the region that does not overlap with the user's face is not used to calculate an evaluation value for determining whether or not the object approaching the first display unit 11 is a human face.
  • The assignment of functions to input operations on the area that does not overlap the user's face is canceled when the user moves the face away from the finder 12f.
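  • One possible handling of touches in such an area is sketched below. The normalized region, the assigned function name, and the return values are assumptions for illustration; they only show how a touch in the reserved area could trigger the assigned function while being excluded from the face-detection evaluation.

        # Sketch: while the finder is in use, touches inside a reserved corner
        # region trigger the assigned function and do not count toward the
        # face-detection evaluation value.
        INPUT_REGION = (0.6, 0.6, 1.0, 1.0)        # normalized (x0, y0, x1, y1) corner area
        ASSIGNED_FUNCTION = "lock_auto_exposure"   # e.g. AE lock, AF lock, flash on/off ...

        def handle_touch(x, y, finder_in_use):
            """x, y: normalized touch coordinates. Returns (function_to_run, counts_for_ev)."""
            x0, y0, x1, y1 = INPUT_REGION
            in_region = x0 <= x <= x1 and y0 <= y <= y1
            if finder_in_use and in_region:
                return ASSIGNED_FUNCTION, False    # run the assigned function, ignore for EV
            return None, True                      # ordinary contact: counts toward EV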
  • When the user uses the finder 12f, by using the area that does not overlap the user's face as an input operation area, the user can change shooting parameters even while looking through the finder 12f.
  • For example, by instructing the imaging apparatus 30 to lock the automatic exposure, the user can change the composition while looking into the finder 12f without changing the exposure setting.
  • Furthermore, the entire area of the monitor 11m can be used effectively as a detection element, the number of buttons arranged on the housing of the imaging apparatus can be reduced, and the design of the imaging device can be improved.
  • FIG. 15A and FIG. 15B are rear views illustrating an example of an imaging device in which function icons are displayed in an area that does not overlap with the user's face when the user uses the viewfinder in the display area of the monitor.
  • FIGS. 14A and 14B show an example in which the display on the monitor 11m is turned off when the user uses the finder 12f.
  • Alternatively, the display content of the monitor 11m may be changed when the user uses the finder 12f.
  • For example, the brightness of the display in the area overlapping the user's face is reduced, and a function icon 303 is displayed in the area that does not overlap the user's face.
  • In FIG. 15B, the region where the user's face and the monitor 11m overlap is schematically indicated by cross-hatching.
  • That is, the function icon 303 is displayed in an area that does not overlap the user's face, a touch operation on the function icon 303 is detected by the monitor 11m, and the function assigned to that icon in advance may then be executed. In this case as well, the area near the function icon 303 is not used for detecting the approach of the user's body to the monitor 11m.
  • the present invention is not limited to this example.
  • the same technique can be applied when the user holds the imaging device 30 vertically or when the user looks into the viewfinder 12f of the imaging device 30 with his left eye.
  • the monitor disposed on the back surface of the housing of the imaging device is configured as a touch panel, and the contact or approach of an object is detected by the monitor. Therefore, according to the present disclosure, it is possible to detect contact or approach of the user's face while eliminating the need for dedicated parts.
  • Whether or not the user's face has touched or approached the monitor is determined based on the output from the monitor configured as a touch panel. Therefore, according to the present disclosure, switching between the display of information on the monitor disposed inside the finder and the display of information on the monitor disposed on the rear surface of the housing can be realized without a dedicated component.
  • the present disclosure can be applied not only to still image shooting but also to moving image shooting.
  • the digital camera is taken as an example, but the present disclosure can be applied to a video camera, a microscope, a viewer of image data, and the like.
  • the above-described method for calculating the evaluation value and the method for determining whether or not the user is looking through the viewfinder are merely examples, and the above-described methods may be combined with each other.
  • the present invention is not limited to this example.
  • the user's input operation on the monitor may be disabled, and only the contact or approach of an object to the monitor may be detected by the monitor.
  • The user may also be allowed to select whether invalidating input operations or switching the display is executed with priority.
  • The present disclosure can also adopt the following configurations.
  • (1) A display control apparatus including: a first display unit that accepts a user operation and presents information to the user; a second display unit; and a control unit that switches between display by the first display unit and display by the second display unit based on an input from the first display unit.
  • The display control apparatus according to (1), wherein the control unit determines whether or not an object that contacts or approaches the first display unit is a human face, and switches between display by the first display unit and display by the second display unit according to the determination result.
  • (9) An imaging apparatus including: an image sensor disposed inside a housing; a first display unit coupled to the housing, the first display unit accepting a user operation and presenting information to the user; a second display unit whose display direction is set to substantially the same direction as the display direction of the first display unit; and a control unit that switches between display by the first display unit and display by the second display unit based on an input from the first display unit.
  • (10) The imaging apparatus according to (9), wherein the second display unit is detachable from the housing.
  • (11) The imaging apparatus according to (9) or (10), wherein the control unit determines whether or not an object that contacts or approaches the first display unit is a human face by comparing an evaluation value calculated from the input with one of one or more threshold values prepared in advance.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • General Engineering & Computer Science (AREA)
  • Exposure Control For Cameras (AREA)
  • Details Of Cameras Including Film Mechanisms (AREA)
  • Camera Bodies And Camera Details Or Accessories (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Viewfinders (AREA)
  • Indication In Cameras, And Counting Of Exposures (AREA)
  • Adjustment Of Camera Lenses (AREA)

Abstract

The present invention relates to a display control device comprising: a first display module that receives a command from a user and presents information to the user; a second display module; and a controller that switches between the display of the first display module and the display of the second display module on the basis of an input transmitted by the first display module.
PCT/JP2013/062149 2012-06-18 2013-04-18 Dispositif de contrôle d'affichage, dispositif de formation d'image, et procédé de contrôle d'affichage WO2013190906A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012136732A JP2014003422A (ja) 2012-06-18 2012-06-18 表示制御装置および撮像装置ならびに表示制御方法
JP2012-136732 2012-06-18

Publications (1)

Publication Number Publication Date
WO2013190906A1 true WO2013190906A1 (fr) 2013-12-27

Family

ID=49768513

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/062149 WO2013190906A1 (fr) 2012-06-18 2013-04-18 Dispositif de contrôle d'affichage, dispositif de formation d'image, et procédé de contrôle d'affichage

Country Status (2)

Country Link
JP (1) JP2014003422A (fr)
WO (1) WO2013190906A1 (fr)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6463135B2 (ja) * 2015-01-07 2019-01-30 キヤノン株式会社 電子機器及び表示制御方法
WO2016157923A1 (fr) * 2015-03-30 2016-10-06 ソニー株式会社 Dispositif de traitement d'informations et procédé de traitement d'informations
JP6742730B2 (ja) * 2016-01-05 2020-08-19 キヤノン株式会社 電子機器及びその制御方法
CN113794822A (zh) 2016-07-29 2021-12-14 麦克赛尔株式会社 摄像装置
JP6708516B2 (ja) * 2016-08-05 2020-06-10 キヤノン株式会社 電子装置、その制御方法およびプログラム
JP6701027B2 (ja) * 2016-08-09 2020-05-27 キヤノン株式会社 撮像装置、その制御方法およびプログラム
JP7120235B2 (ja) * 2017-07-03 2022-08-17 ソニーグループ株式会社 撮像装置、撮像装置の制御方法、プログラム
JP7051344B2 (ja) * 2017-09-14 2022-04-11 キヤノン株式会社 電子機器
JP7298200B2 (ja) * 2019-03-07 2023-06-27 株式会社リコー 電子黒板および映像補正方法
JP7003193B2 (ja) * 2020-07-30 2022-01-20 マクセル株式会社 撮像装置

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005189654A (ja) * 2003-12-26 2005-07-14 Konica Minolta Photo Imaging Inc 手振れ補正機構を備えたカメラ
JP2007086460A (ja) * 2005-09-22 2007-04-05 Nikon Corp カメラ
JP2008245056A (ja) * 2007-03-28 2008-10-09 Fujifilm Corp 撮影装置
JP2009159380A (ja) * 2007-12-27 2009-07-16 Samsung Techwin Co Ltd 撮像装置及び撮像方法
JP2010134587A (ja) * 2008-12-03 2010-06-17 Sony Corp 情報処理装置および撮像装置
JP2010263425A (ja) * 2009-05-07 2010-11-18 Olympus Imaging Corp 撮像装置および撮像装置におけるモード切換え方法
JP2010283619A (ja) * 2009-06-04 2010-12-16 Olympus Imaging Corp 撮像装置

Also Published As

Publication number Publication date
JP2014003422A (ja) 2014-01-09

Similar Documents

Publication Publication Date Title
WO2013190906A1 (fr) Dispositif de contrôle d'affichage, dispositif de formation d'image, et procédé de contrôle d'affichage
US9678657B2 (en) Imaging apparatus, imaging method, and computer-readable storage medium providing a touch panel display user interface
US9001255B2 (en) Imaging apparatus, imaging method, and computer-readable storage medium for trimming and enlarging a portion of a subject image based on touch panel inputs
JP5775659B2 (ja) 撮像装置および撮像装置におけるモード切換え方法
JP5136669B2 (ja) 画像処理装置、画像処理方法及びプログラム
JP5109803B2 (ja) 画像処理装置、画像処理方法及び画像処理プログラム
JP2010044567A (ja) 情報処理装置
KR20170042491A (ko) 전자기기 및 그 제어 방법
US9071760B2 (en) Image pickup apparatus
US11381736B2 (en) Image capture apparatus and control method
CN115706850A (zh) 图像拍摄的方法、设备、存储介质和程序产品
JP5370555B2 (ja) 撮像装置、撮像方法及びプログラム
JP2012147100A (ja) 撮像装置
JP7490372B2 (ja) 撮像制御装置及びその制御方法
CN113364945A (zh) 电子装置、控制方法和计算机可读介质
US11934083B2 (en) Imaging apparatus
JP2022018244A (ja) 電子機器およびその制御方法
JP2015171116A (ja) カメラの表示装置
JP6465239B2 (ja) 撮像装置、撮像装置の制御方法およびプログラム
JP2016048306A (ja) 制御装置
US10924680B2 (en) Image capture control apparatus and method of controlling the same
JP2016034135A (ja) 撮像装置および撮像装置におけるモード切換え方法
JP6708516B2 (ja) 電子装置、その制御方法およびプログラム
US11438512B2 (en) Electronic apparatus and control method thereof
JP7466304B2 (ja) 撮像装置及びその制御方法、プログラム、記憶媒体

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13806088

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13806088

Country of ref document: EP

Kind code of ref document: A1