CN105827985A - Imaging device and imaging method - Google Patents

Imaging device and imaging method

Info

Publication number
CN105827985A
CN105827985A (Application No. CN201610326097.4A)
Authority
CN
China
Prior art keywords
imaging device
main subject
subject
image
image data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610326097.4A
Other languages
Chinese (zh)
Other versions
CN105827985B (en)
Inventor
Haruyuki Ishihara
Yoshiyuki Fukuya
Osamu Nonaka
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aozhixin Digital Technology Co., Ltd.
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corp filed Critical Olympus Corp
Publication of CN105827985A publication Critical patent/CN105827985A/en
Application granted granted Critical
Publication of CN105827985B publication Critical patent/CN105827985B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95Computational photography systems, e.g. light-field imaging systems
    • H04N23/951Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/14Picture signal circuitry for video frequency region
    • H04N5/144Movement detection
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Indication In Cameras, And Counting Of Exposures (AREA)

Abstract

The invention provides an imaging device and an imaging method capable of assisting a photographer in obtaining an image with the desired composition even under conditions where the subject moves around. The imaging device comprises: an imaging unit that continuously generates image data of a subject; a moving-direction determination unit that determines the direction in which the imaging device is moving; a movement detection unit that detects movement of the subject contained in the image data; and a main subject setting unit that sets the subject as the main subject when the subject is tracked by the imaging region of the imaging device.

Description

Imaging device and imaging method
This application is a divisional application of Chinese patent application No. 201210210469.9, filed on June 20, 2012, entitled "Imaging device and imaging method".
Technical field
The present invention relates to an imaging device and an imaging method that photoelectrically convert light from a subject to generate electronic image data.
Background Art
In recent years, a technique has been known in imaging devices such as digital still cameras and digital video cameras by which an image is captured automatically when the subject assumes a specific expression or posture (see Patent Document 1). In this technique, the subject's face is extracted from the live view images sequentially generated by the imaging unit, and an image is captured when the extracted face matches a specific pattern.
[Patent Document 1] Japanese Unexamined Patent Application Publication No. 2004-294498
In the above technique, however, when the subject moves back and forth during shooting, the position of the subject's face in the live view image and the shooting composition change continuously. As a result, the composition at the moment the extracted face matches the specific pattern varies, and the main subject cannot necessarily be captured with the intended composition.
Summary of the invention
The present invention has been made in view of the above, and an object of the invention is to provide an imaging device and an imaging method with which the photographer can obtain an image of the main subject captured with the intended composition even when the main subject, as the photography target, moves around during shooting.
To solve the above problem and achieve the object, an imaging device according to the present invention comprises: an imaging unit that continuously generates image data of a subject; a moving-direction determination unit that determines the direction in which the imaging device is moving; a movement detection unit that detects movement of the subject contained in the image data; and a main subject setting unit that sets the subject as the main subject when the subject is tracked by the imaging region of the imaging device.
An imaging method according to the present invention is executed by an imaging device and comprises the following steps: an imaging step of continuously generating image data of a subject; a moving-direction determination step of determining the direction in which the imaging device is moving; a movement detection step of detecting movement of the subject contained in the image data; and a main subject setting step of setting the subject as the main subject when the subject is tracked by the imaging region of the imaging device.
According to the present invention, a subject is set as the main subject when it is tracked by the imaging region of the imaging device. This yields the effect that, even when the photographer shoots a main subject that moves around, an image in which the main subject is captured with the intended composition can be extracted.
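As a rough illustration, the claimed method steps can be sketched in Python. The patent describes hardware units rather than code, so everything below — the function names, the per-frame displacement representation, and the 50% agreement heuristic — is a hypothetical assumption for illustration only.

```python
# Hypothetical sketch of the claimed method: a subject is promoted to
# "main subject" only when the imaging region (the device's own motion)
# tracks the subject's motion. All names and thresholds are invented.

def sign(v):
    return (v > 0) - (v < 0)

def tracks_subject(device_motion, subject_motion):
    """device_motion, subject_motion: per-frame signed horizontal
    displacements. True when the device moves in the subject's
    direction in at least half of the frames (assumed heuristic)."""
    agree = sum(1 for d, s in zip(device_motion, subject_motion)
                if d != 0 and sign(d) == sign(s))
    return agree >= 0.5 * len(device_motion)

def set_main_subject(device_motion, subject_motion):
    # Main subject setting step: promote the subject when tracked.
    if tracks_subject(device_motion, subject_motion):
        return 'main subject'
    return None

# Photographer pans right, then left, following a swaying subject.
device  = [+2, +2, -1, -2, -2, +1]
subject = [+2, +1, -1, -2, -1, +1]
print(set_main_subject(device, subject))  # → main subject
```

The key design point mirrors the claim: the decision is driven by agreement between the device's movement and the subject's movement, not by the subject's appearance.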
Brief Description of the Drawings
Fig. 1 is a diagram showing the structure of the subject-facing side of the imaging device according to Embodiment 1 of the present invention.
Fig. 2 is a diagram showing the structure of the photographer-facing side of the imaging device according to Embodiment 1 of the present invention.
Fig. 3 is a block diagram showing the structure of the imaging device according to Embodiment 1 of the present invention.
Fig. 4 is a flowchart outlining the processing performed by the imaging device according to Embodiment 1 of the present invention.
Fig. 5 is a diagram showing a situation in which the photographer shoots using the imaging device.
Fig. 6 is a flowchart outlining the main subject candidate determination processing of Fig. 4.
Fig. 7 is a diagram schematically showing a situation in which the main subject setting unit determines the main subject.
Fig. 8 is a diagram showing examples of images corresponding to the image data generated by the imaging unit in the situation shown in Fig. 7.
Fig. 9 is a diagram showing the relationship between the subject and the imaging region of the imaging device in the situation shown in Fig. 7.
Fig. 10 is a flowchart outlining the slide image display processing shown in Fig. 4.
Fig. 11 is a diagram showing an example of an image displayed by the display unit.
Fig. 12 is a block diagram showing the structure of the imaging device according to Embodiment 2 of the present invention.
Fig. 13 is a flowchart outlining the main subject candidate determination processing performed by the imaging device according to Embodiment 2 of the present invention.
Fig. 14 is a diagram schematically showing a situation in which the moving-direction determination unit determines the movement state of the imaging device.
Fig. 15 is a top view of Fig. 14 as seen in direction A.
Fig. 16 is a side view of Fig. 14 as seen in direction B.
Fig. 17 is a diagram schematically showing the relationship between the speed of the imaging device and the acceleration detected by the acceleration detection unit when the photographer moves the imaging device.
Fig. 18 is a diagram schematically showing a situation in which the moving-direction determination unit determines the movement state of the imaging device.
Fig. 19 is a diagram schematically showing the relationship between the speed of the imaging device and the detection result of the azimuth detection unit when the photographer moves the imaging device.
Fig. 20 is a diagram schematically showing a situation in which the moving-direction determination unit determines the movement state of the imaging device.
Fig. 21 is a diagram schematically showing the relationship between the gravitational acceleration components in the optical-axis direction and the vertical direction of the imaging device when the photographer moves the imaging device.
Fig. 22 is a block diagram showing the structure of the imaging device according to Embodiment 3 of the present invention.
Fig. 23 is a flowchart outlining the operation performed by the imaging device according to Embodiment 3 of the present invention.
Fig. 24 is a diagram schematically showing a situation in which the contrast detection unit detects the contrast of the image data generated by the imaging unit.
Fig. 25 is a diagram showing the relationship between the contrast of the image data detected by the contrast detection unit and the shooting distance from the imaging device to the subject in the situation shown in Fig. 24.
Fig. 26 is a flowchart outlining the main subject candidate determination processing of Fig. 23.
Description of Reference Numerals
1, 100, 200: imaging device; 2: imaging unit; 3: acceleration detection unit; 4: timer; 5: light emitting unit; 6: operation input unit; 7: display unit; 8: touch panel; 9, 209: storage unit; 10, 210: control unit; 21: lens unit; 22: lens drive unit; 23: aperture; 24: aperture drive unit; 25: shutter; 26: shutter drive unit; 27: image sensor; 28: imaging drive unit; 29: signal processing unit; 61: power switch; 62: release switch; 63: shooting mode switch; 64: menu switch; 91: image data storage unit; 92: program storage unit; 93: temporary storage unit; 94: contrast storage unit; 101: image processing unit; 102: moving-direction determination unit; 103: main subject candidate detection unit; 104: main subject setting unit; 105: image detection unit; 106: information addition unit; 107: imaging control unit; 108: display control unit; 110: azimuth detection unit; 211: contrast detection unit.
Detailed description of the invention
Hereinafter, modes for carrying out the present invention (hereinafter referred to as "embodiments") will be described with reference to the accompanying drawings. The present invention is not limited to the embodiments described below. In the drawings, identical parts are given the same reference numerals.
(embodiment 1)
Fig. 1 is a diagram showing the structure of the subject-facing side (front side) of the imaging device 1 according to Embodiment 1 of the present invention. Fig. 2 is a diagram showing the structure of the photographer-facing side (rear side) of the imaging device 1 according to Embodiment 1. Fig. 3 is a block diagram showing the structure of the imaging device 1 according to Embodiment 1.
As shown in Figs. 1 to 3, the imaging device 1 has an imaging unit 2, an acceleration detection unit 3, a timer 4, a light emitting unit 5, an operation input unit 6, a display unit 7, a touch panel 8, a storage unit 9, and a control unit 10.
The imaging unit 2 images a predetermined field of view and generates image data. The imaging unit 2 has a lens unit 21, a lens drive unit 22, an aperture 23, an aperture drive unit 24, a shutter 25, a shutter drive unit 26, an image sensor 27, an imaging drive unit 28, and a signal processing unit 29.
The lens unit 21 is composed of a plurality of lens groups capable of focusing and zooming, and collects light from a predetermined field of view. The lens drive unit 22 is composed of a stepping motor or a DC motor, and changes the focal position, focal length, and the like of the lens unit 21 by moving its lens groups along the optical axis O1.
The aperture 23 adjusts exposure by limiting the amount of incident light collected by the lens unit 21. The aperture drive unit 24 is composed of a motor or the like and drives the aperture 23.
The shutter 25 sets the state of the image sensor 27 to an exposure state or a light-shielded state. The shutter drive unit 26 is composed of a motor or the like and drives the shutter 25 in accordance with a release signal.
The image sensor 27 is composed of a CCD (Charge Coupled Device), a CMOS (Complementary Metal Oxide Semiconductor) sensor, or the like. The image sensor 27 receives the light collected by the lens unit 21 and photoelectrically converts it into an electrical signal (analog signal). The imaging drive unit 28 generates the timing pulses that drive the image sensor 27, and outputs the photoelectrically converted electrical signal from the image sensor 27 to the signal processing unit 29.
The signal processing unit 29 is composed of an analog amplifier, an A/D converter, and the like. The signal processing unit 29 performs signal processing such as amplification (gain adjustment) on the electrical signal output from the image sensor 27, then A/D-converts it into digital image data and outputs the data to the control unit 10.
The acceleration detection unit 3 is composed of capacitive acceleration sensors formed by a MEMS (Micro Electro Mechanical Systems) process. The acceleration detection unit 3 has three acceleration sensors whose detection directions are mutually orthogonal. Specifically, as a coordinate system intrinsic to the imaging device 1, an x-axis parallel to the width direction of the imaging device 1, a y-axis parallel to its vertical direction, and a z-axis parallel to the optical axis O1 of the imaging unit 2 are taken, and the three acceleration sensors, each detecting one axial component of acceleration, are arranged at predetermined positions of the imaging device 1. With this structure, when the photographer moves the imaging device 1 toward the subject (z direction), in the horizontal direction (x direction), or in the vertical direction (y direction), the acceleration produced by the movement can be detected with high accuracy. Furthermore, the acceleration detection unit 3 detects the acceleration in each direction of the imaging device 1 when the lateral direction of the image displayed by the display unit 7 and the horizontal direction of the imaging device 1 are roughly aligned.
The timer 4 has a clock function and a function of determining the shooting date and time. The timer 4 outputs date-time data to the control unit 10 so that the date-time data can be appended to captured image data.
The light emitting unit 5 is composed of a xenon lamp, an LED (Light Emitting Diode), or the like. The light emitting unit 5 irradiates the field of view imaged by the imaging device 1 with flash light as auxiliary light.
The operation input unit 6 has: a power switch 61 that switches the power state of the imaging device 1 between the on state and the off state; a release switch 62 that accepts input of a release signal instructing shooting; a shooting mode switch 63 that switches among the various shooting modes set in the imaging device 1; and a menu switch 64 for setting various parameters of the imaging device 1.
The display unit 7 is realized by a display panel composed of liquid crystal, organic EL (Electro Luminescence), or the like. The display unit 7 displays images corresponding to image data, as well as operation information of the imaging device 1 and information related to shooting.
The touch panel 8 is provided on the display screen of the display unit 7. The touch panel 8 detects the position at which the user touches it, based on the information displayed on the display unit 7, and accepts input of an instruction signal corresponding to the detected contact position. Typical touch panels include resistive, capacitive, and optical types; in Embodiment 1, a touch panel of any type can be applied. In Embodiment 1, the touch panel 8 functions as an input unit.
The storage unit 9 is realized by semiconductor memory fixed inside the imaging device 1, such as flash memory or DRAM (Dynamic Random Access Memory). The storage unit 9 has: an image data storage unit 91 that stores image data; a program storage unit 92 that stores the various programs executed by the imaging device 1 and the data and parameters used during their execution; and a temporary storage unit 93 that temporarily stores the multiple image data continuously generated by the imaging unit 2 and the various contents being processed. The storage unit 9 may also include a computer-readable storage medium such as an externally mounted memory card.
The control unit 10 is composed of a CPU (Central Processing Unit) or the like. In accordance with instruction signals and switching signals from the operation input unit 6 and the touch panel 8, the control unit 10 issues instructions and transfers data to each unit constituting the imaging device 1, thereby centrally controlling the operation of the imaging device 1.
The detailed structure of the control unit 10 will now be described. The control unit 10 has an image processing unit 101, a moving-direction determination unit 102, a main subject candidate detection unit 103, a main subject setting unit 104, an image detection unit 105, an information addition unit 106, an imaging control unit 107, and a display control unit 108.
The image processing unit 101 performs various kinds of image processing on image data. Specifically, the image processing unit 101 applies image processing including edge enhancement, white balance, and gamma correction to the image data. The image processing unit 101 also compresses and decompresses image data according to the JPEG (Joint Photographic Experts Group) format or the like.
The moving-direction determination unit 102 determines the direction in which the imaging device 1 is moving, based on the detection result of the acceleration detection unit 3. Specifically, the moving-direction determination unit 102 determines whether the imaging device 1 is moving in the horizontal direction according to the change in horizontal acceleration detected by the acceleration detection unit 3.
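The determination described here can be sketched as follows, as a minimal illustration only: the patent does not specify thresholds or units, so the gravity constant, noise floor, and sample format below are all assumptions.

```python
# Sketch of moving-direction determination from 3-axis accelerometer
# samples in camera coordinates (x: width, y: vertical, z: optical axis).
# Gravity acts on the y-axis when the device is held level; a change
# beyond an assumed noise floor indicates movement.

GRAVITY = 9.8      # m/s^2, assumed
THRESHOLD = 0.5    # m/s^2, assumed noise floor

def moving_direction(samples):
    """samples: list of (ax, ay, az) accelerations in m/s^2.
    Returns 'horizontal', 'vertical', or None when no movement
    beyond the constantly applied gravity is detected."""
    for ax, ay, az in samples:
        if abs(ax) > THRESHOLD:
            return 'horizontal'           # movement along device width
        if abs(ay - GRAVITY) > THRESHOLD:
            return 'vertical'             # change beyond gravity
    return None

print(moving_direction([(0.1, 9.8, 0.0), (1.2, 9.8, 0.0)]))  # → horizontal
```

A real implementation would filter the signal over a window rather than react to single samples; the one-sample check above is kept only for brevity.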
The main subject candidate detection unit 103 detects, as a main subject candidate, a subject that moves within the frame over time, based on changes in the image information contained in each of the continuously generated image data. Here, the image information is edge information, color information, luminance information, and gradation information. The main subject candidate detection unit 103 applies predetermined processing, such as edge detection and binarization, to each of the continuously generated image data, and detects the subject moving within the frame over time as the candidate for the main subject determined according to the moving direction. Specifically, the main subject candidate detection unit 103 detects the moving subject from the position or distance over which the image information of each pixel, for example the color information, moves over time. Alternatively, the main subject candidate detection unit 103 may detect the moving subject as a main subject candidate using pattern matching or another known technique.
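The inter-frame differencing and binarization named above can be sketched compactly. This is a hedged illustration with an assumed threshold; pixel grids are plain nested lists standing in for luminance data.

```python
# Sketch of moving-subject detection by inter-frame differencing and
# binarization: pixels whose luminance changes by more than a threshold
# between two consecutive frames form the moving-subject region.

DIFF_THRESHOLD = 30  # assumed binarization threshold, 0-255 luminance

def moving_region(frame_a, frame_b):
    """Return the set of (row, col) pixels whose luminance changed
    by more than the threshold between two consecutive frames."""
    moved = set()
    for r, (row_a, row_b) in enumerate(zip(frame_a, frame_b)):
        for c, (pa, pb) in enumerate(zip(row_a, row_b)):
            if abs(pa - pb) > DIFF_THRESHOLD:
                moved.add((r, c))
    return moved

f1 = [[10, 10, 10],
      [10, 200, 10],
      [10, 10, 10]]
f2 = [[10, 10, 10],
      [10, 10, 200],   # bright subject shifted one pixel to the right
      [10, 10, 10]]
print(moving_region(f1, f2))  # → {(1, 1), (1, 2)} (a set; order may vary)
```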
When the main subject candidate detected by the main subject candidate detection unit 103 moves in the moving direction of the imaging device 1 determined by the moving-direction determination unit 102, the main subject setting unit 104 sets the main subject candidate as the main subject.
The image detection unit 105 detects, from the image group temporarily stored in the temporary storage unit 93, images in which the main subject set by the main subject setting unit 104 is located in a predetermined region, for example in the approximate center of the frame.
The information addition unit 106 adds a flag to the image data corresponding to the centered image detected by the image detection unit 105, as information indicating a centered image.
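The behavior of the image detection unit 105 and the information addition unit 106 together can be sketched as below. The central-region margin, the dictionary representation of an image record, and the `center_flag` field name are all hypothetical.

```python
# Sketch of center detection and flag addition: keep track of which
# buffered frames have the main subject near the frame center, and
# tag the corresponding image records with a flag.

def is_centered(subject_center, frame_size, margin=0.2):
    """True when the subject center lies within the central region,
    here the middle (1 - 2*margin) fraction of each dimension."""
    x, y = subject_center
    w, h = frame_size
    return (margin * w <= x <= (1 - margin) * w and
            margin * h <= y <= (1 - margin) * h)

def tag_centered(images, frame_size):
    """images: list of dicts, each with a 'subject_center' entry.
    Adds a boolean 'center_flag' to each record and returns the list."""
    for img in images:
        img['center_flag'] = is_centered(img['subject_center'], frame_size)
    return images

frames = [{'subject_center': (320, 240)},   # centered in a 640x480 frame
          {'subject_center': (20, 240)}]    # near the left edge
tag_centered(frames, (640, 480))
print([f['center_flag'] for f in frames])   # → [True, False]
```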
When a release signal is input, the imaging control unit 107 performs control to start the shooting operation of the imaging device 1. Here, the shooting operation of the imaging device 1 refers to the operation in which the signal processing unit 29 and the image processing unit 101 apply predetermined processing to the image data output by the image sensor 27 driven by the imaging drive unit 28. The image data processed in this way is stored in the image data storage unit 91 under the control of the imaging control unit 107. When no release signal is input via the release switch 62, the imaging control unit 107 sequentially stores the image data continuously output by the imaging unit 2 in the temporary storage unit 93.
The display control unit 108 causes the display unit 7 to display images corresponding to the image data generated by the imaging unit 2. The display control unit 108 causes the display unit 7 to display the photograph corresponding to captured image data, and also causes it to sequentially display at least some of the images contained in the image data group stored in the temporary storage unit 93.
The imaging device 1 having the above structure may further be provided with a sound input/output function, a detachable electronic viewfinder (EVF), and a communication unit capable of two-way communication with external processing devices (not shown), such as personal computers, via the Internet.
Next, the processing performed by the imaging device 1 according to Embodiment 1 will be described. Fig. 4 is a flowchart outlining the processing performed by the imaging device 1.
In Fig. 4, the case where the imaging device 1 is set to the shooting mode is described (step S101: Yes). In this case, under the control of the imaging control unit 107, the imaging device 1 causes the imaging unit 2 to image the predetermined field of view and generate image data (step S102), and temporarily stores the generated image data in the temporary storage unit 93 (step S103).
Next, the display control unit 108 causes the display unit 7 to display a live view image corresponding to the image data generated by the imaging unit 2 (step S104).
Fig. 5 is a diagram showing a situation in which the photographer shoots using the imaging device 1. As shown in Fig. 5, the photographer K1 determines the composition for shooting the subject A1 (a flower) while, for example, watching the live view image displayed on the display unit 7.
After step S104, the moving-direction determination unit 102 determines whether the imaging device 1 has moved (step S105). Specifically, when the acceleration detection unit 3 detects a change in acceleration in the vertical direction (y-axis) or the horizontal direction (x-axis), beyond the constantly applied gravitational acceleration, caused by the photographer moving the device, the moving-direction determination unit 102 determines that the imaging device 1 has moved. When the moving-direction determination unit 102 determines that the imaging device 1 has moved (step S105: Yes), the imaging device 1 proceeds to step S106 described later. On the other hand, when the moving-direction determination unit 102 determines that the imaging device 1 has not moved (step S105: No), the imaging device 1 proceeds to step S111 described later.
In step S106, the imaging device 1 detects candidates for the main subject and executes the main subject candidate determination processing, which determines the characteristics related to the movement of the detected candidates.
Fig. 6 is a flowchart outlining the main subject candidate determination processing of step S106 in Fig. 4.
As shown in Fig. 6, based on the detection result of the acceleration detection unit 3, the moving-direction determination unit 102 determines the moving direction from the change in the acceleration acting on the imaging device 1 (step S201), and determines whether the imaging device 1 is moving in the horizontal direction (step S202). Specifically, the moving-direction determination unit 102 determines whether the acceleration detection unit 3 has detected horizontal acceleration. When the moving-direction determination unit 102 determines that the imaging device 1 is moving in the horizontal direction (step S202: Yes), the imaging device 1 proceeds to step S203 described later. On the other hand, when the moving-direction determination unit 102 determines that the imaging device 1 is not moving in the horizontal direction (step S202: No), the imaging device 1 returns to the main routine shown in Fig. 4. Note that, when the photographer has quick motor reflexes, the movement of the camera may match the movement of the subject; in that case, the subject appears stationary within the frame.
In step S203, the main subject candidate detection unit 103 detects a subject moving in the live view image displayed by the display unit 7 as a main subject candidate. Specifically, the main subject candidate detection unit 103 detects, as a moving subject region, a region whose image information changes roughly periodically between consecutive live view images. For example, the main subject candidate detection unit 103 divides the live view image into predetermined regions (for example, 9 divisions), and detects as a moving subject region any divided region whose image information changes within a predetermined frequency band. This predetermined frequency band is 2 Hz to 5 Hz.
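The per-region periodicity test can be sketched as below. The frame rate and the zero-crossing frequency estimator are assumptions for illustration; only the 2 Hz to 5 Hz band comes from the text, and a real implementation might use a band-pass filter or FFT instead.

```python
# Sketch of the periodic-motion test for one divided region: estimate
# the sway frequency from zero crossings of the region's mean luminance
# over time, and keep the region only if that frequency lies in the
# stated 2 Hz - 5 Hz band.
import math

FRAME_RATE = 30.0  # frames per second, assumed

def is_swaying(region_means, lo=2.0, hi=5.0):
    """region_means: per-frame mean luminance of one divided region."""
    mean = sum(region_means) / len(region_means)
    centered = [v - mean for v in region_means]
    # Each full sway cycle crosses the mean level twice.
    crossings = sum(1 for a, b in zip(centered, centered[1:]) if a * b < 0)
    duration = len(region_means) / FRAME_RATE
    freq = crossings / (2.0 * duration)
    return lo <= freq <= hi

# A region whose luminance sways at roughly 3 Hz (inside the band).
t = [k / FRAME_RATE for k in range(60)]
sway = [100 + 20 * math.sin(2 * math.pi * 3.0 * s + 0.5) for s in t]
print(is_swaying(sway))   # → True
```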
Next, the main subject setting unit 104 determines whether the main subject candidate detected by the main subject candidate detection unit 103 moves in the moving direction of the imaging device 1 determined by the moving-direction determination unit 102 (step S204). When the main subject candidate moves in the moving direction of the imaging device 1 (step S204: Yes), the main subject setting unit 104 causes the temporary storage unit 93 to store the features of the main subject candidate (step S205). The imaging device 1 then returns to the main routine of Fig. 4. On the other hand, when the main subject candidate does not move in the moving direction of the imaging device 1 (step S204: No), the imaging device 1 returns to the main routine shown in Fig. 4.
Fig. 7 is a diagram schematically showing a situation in which the main subject setting unit 104 determines the main subject. Fig. 8 is a diagram showing examples of images corresponding to the image data generated by the imaging unit 2 in the situation shown in Fig. 7. Fig. 9 is a diagram showing the relationship between the subject A1 and the imaging region of the imaging device 1 in the situation shown in Fig. 7. In Fig. 7, the imaging region of the imaging device 1 is indicated by a dash-dot line. In Fig. 9, the horizontal axis t represents time, and the vertical axis D represents the displacement of the subject A1 and the imaging device 1 from their rest positions; displacement toward the right in Fig. 7 is taken as positive. Curve L1 represents the displacement of the center of the subject A1, and curve L2 represents the displacement of the center of the imaging region of the imaging device 1.
As shown in Fig. 7, when the photographer K1 wants to shoot the subject A1 as the main subject (Fig. 7(a)) and a gust of wind blows to the right (arrow Y1), the subject A1 tilts and sways to the right (Fig. 7(b)). To keep the subject A1 at the approximate center of the imaging region, the photographer K1 moves the imaging device 1 in accordance with the movement of the subject A1 (a panning operation). As a result, the imaging region of the imaging device 1 changes from imaging region F1 to imaging region F2.
Afterwards, when the gust stops, the subject A1 moves to the left (arrow Y2) to return toward its initial state (Fig. 7(c)), so the photographer K1 moves the imaging device 1 in the opposite direction (to the left). As a result, the imaging region of the imaging device 1 changes from imaging region F2 to imaging region F3.
Then, due to inertia, the subject A1 moves further to the left (arrow Y2) beyond its initial position (Fig. 7(d)). The photographer K1 therefore moves the imaging device 1 further, and the imaging region of the imaging device 1 changes from imaging region F3 to imaging region F4.
In this operation of photographing while tracking subject A1, a time lag arises because the photographer K1 acts while watching the live view image displayed on the display unit 7. Specifically, as shown in Fig. 9, a delay occurs between the movement of subject A1 and the photographer K1 moving the imaging device 1 (time T1 in Fig. 9). Moreover, it is difficult for the photographer K1 to keep the wind-blown subject A1 fixed at the center of the imaging field, so the moment at which the imaging field reaches its maximum displacement is also delayed (T2 in Fig. 9). For example, as shown in Figs. 8(a) and 8(b), as subject A1 moves to the right (image W1 → image W2), the photographer K1 moves the imaging device 1 to track the movement of subject A1, shifting the imaging field (times t1 to t2 in Fig. 9).
Afterwards, as shown in Figs. 8(b) and 8(c), when the gust stops and subject A1 moves back to the left toward its initial state (times t2 to t3 in Fig. 9), if the displacement of the imaging field, which the photographer K1 is still moving to the right, momentarily coincides with the displacement of subject A1 (their directions of travel being opposite), an image is obtained in which subject A1 lies approximately at the center (image W3).
Then, as shown in Figs. 8(d) and 8(e), subject A1 moves further to the left (arrow b2) and then back to the right (image W3 → image W4 → image W5), and the photographer K1 moves the imaging device 1 to track this movement, shifting the imaging field (times t3 to t6 in Fig. 9). Because the center position of subject A1 and the center position of the imaging field of the imaging device 1 intersect again (time t6 in Fig. 9), an image in which subject A1 lies approximately at the center is captured (image W5; see Fig. 8(e)). The displacement of subject A1 and the displacement of the imaging field as the photographer K1 moves the imaging device 1 are thus both approximately periodic. Here, "approximately periodic" is a general expression meaning that the displacement reciprocates between positive and negative values.
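The "reciprocation between positive and negative displacement" described above can be tested with something as simple as counting sign changes in the displacement trace. The following is a minimal sketch under that assumption; the function name, the zero-crossing criterion, and the threshold are all illustrative, not from the patent.

```python
def is_roughly_periodic(displacements, min_crossings=3):
    """Judge approximate periodicity as reciprocation between positive and
    negative displacement: count sign changes (zero crossings) in the trace.
    The min_crossings threshold is an assumed parameter."""
    signs = [d > 0 for d in displacements if d != 0]
    crossings = sum(1 for a, b in zip(signs, signs[1:]) if a != b)
    return crossings >= min_crossings

# A reciprocating (sway-like) displacement trace crosses zero repeatedly,
# while a one-way pan does not.
wave = [0.0, 0.8, 1.0, 0.5, -0.4, -1.0, -0.6, 0.3, 0.9, 0.2, -0.7]
drift = [0.1, 0.3, 0.6, 0.9, 1.2, 1.5]

print(is_roughly_periodic(wave))   # True
print(is_roughly_periodic(drift))  # False
```

In practice the displacement trace would be derived from the acceleration detection unit 3, but any signal that swings about zero would satisfy this test.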
In this way, when the photographer K1 moves the imaging device 1 back and forth in the horizontal direction with a predetermined amplitude, and there is a subject A1 moving along the moving direction of the imaging device 1, the main subject setting unit 104 sets subject A1 as the main subject the photographer K1 intends to photograph. Consequently, even in a situation where subject A1 is being blown about violently by wind or the like, once the composition has been decided the main subject setting unit 104 can recognize the tracking action of the photographer K1 and set the intended main subject as the object to be tracked during photographing.
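The setting rule just described can be reduced to a small decision: the device must itself be reciprocating, and a candidate must move along the device's moving direction. The sketch below assumes a simplified data model (string directions, dict candidates) that the patent does not specify.

```python
def set_main_subject(camera_motion_periodic, candidates, camera_direction):
    """Simplified sketch of the main-subject setting rule: when the device
    reciprocates and a candidate moves along the device's moving direction,
    that candidate is taken to be the subject the photographer is tracking."""
    if not camera_motion_periodic:
        return None                       # no tracking action detected
    for c in candidates:
        if c["direction"] == camera_direction:
            return c["id"]                # candidate moving with the device
    return None

candidates = [{"id": "bird", "direction": "left"},
              {"id": "flower", "direction": "right"}]
print(set_main_subject(True, candidates, "right"))   # flower
print(set_main_subject(False, candidates, "right"))  # None
```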
Returning to Fig. 4, the description continues from step S107. In step S107, the main subject setting unit 104 judges whether the main subject candidates detected by the main subject candidate detection unit 103 are identical. If they are identical (step S107: Yes), the main subject setting unit 104 sets the candidate detected by the main subject candidate detection unit 103 as the main subject (step S108).
Next, the image detection unit 105 judges whether the main subject is captured in the central region of the live view image (step S109). If the image detection unit 105 judges that the main subject is captured in the central region of the live view image (step S109: Yes), the information attaching unit 106 attaches a tag to the image data corresponding to the live view image currently displayed on the display unit 7, as information that makes it identifiable among the image data group stored in the temporary storage unit 93 (step S110). Afterwards, the imaging device 1 proceeds to step S111, described later. On the other hand, if the image detection unit 105 judges that the main subject is not captured in the central region of the live view image (step S109: No), the imaging device 1 proceeds directly to step S111.
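The central-region test and the tagging in steps S109 and S110 can be sketched as follows. The margin fraction defining the "central region", the bounding-box representation, and the helper names are assumptions for illustration.

```python
def in_central_region(bbox, frame_w, frame_h, margin=0.25):
    """Return True when the subject's bounding-box centre falls inside the
    central region of the frame (margin fraction per side is an assumption)."""
    x, y, w, h = bbox
    cx, cy = x + w / 2, y + h / 2
    return (frame_w * margin <= cx <= frame_w * (1 - margin)
            and frame_h * margin <= cy <= frame_h * (1 - margin))

def tag_frames(frames, frame_w=640, frame_h=480):
    """Attach an identifying flag to each temporarily stored frame whose
    main subject is framed centrally (the role of unit 106 in step S110)."""
    return [dict(f, tagged=in_central_region(f["bbox"], frame_w, frame_h))
            for f in frames]

frames = [{"id": 1, "bbox": (300, 220, 40, 40)},   # centred subject
          {"id": 2, "bbox": (10, 10, 40, 40)}]     # subject in a corner
tagged = tag_frames(frames)
print([f["tagged"] for f in tagged])  # [True, False]
```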
Next, when a release signal has been input by operation of the release switch 62 (step S111: Yes), the imaging device 1 performs photographing under the control of the imaging control unit 107 (step S112) and stores the image data generated by the imaging unit 2 in the image data storage unit 91 (step S113).
Afterwards, the control unit 10 judges whether any image data remains in the temporary storage unit 93 (step S114). If the control unit 10 judges that the temporary storage unit 93 holds image data (step S114: Yes), the imaging device 1 proceeds to step S115, described later. If not (step S114: No), the imaging device 1 proceeds to step S118, described later.
In step S115, the control unit 10 judges whether the image data group stored in the temporary storage unit 93 contains image data to which a tag has been attached. If it does (step S115: Yes), the imaging device 1 performs the slide image display processing, in which the images of the image data group stored in the temporary storage unit 93 are displayed on the display unit 7 in sequence while sliding across its display screen (step S116). The details of the slide image display processing are described later. After step S116, the imaging device 1 proceeds to step S117.
Next, the control unit 10 judges whether the power of the imaging device 1 has been turned off by operation of the power switch 61 (step S117). If the control unit 10 judges that the power of the imaging device 1 is off (step S117: Yes), the imaging device 1 ends this processing. Otherwise (step S117: No), the imaging device 1 returns to step S101.
Now consider the case where the control unit 10 judges in step S114 that no image data is temporarily stored in the temporary storage unit 93 (step S114: No), or judges in step S115 that the image data group stored in the temporary storage unit 93 contains no tagged image data (step S115: No). In either case, the display control unit 108 causes the display unit 7 to perform rec-view display of the image corresponding to the captured image data for a predetermined time (for example, 2 seconds) (step S118). Afterwards, the imaging device 1 proceeds to step S117.
If no release signal is input via the release switch 62 in step S111 (step S111: No), the imaging device 1 proceeds to step S117.
If in step S107 the main subject setting unit 104 judges that the main subject candidates detected by the main subject candidate detection unit 103 are not identical (step S107: No), the imaging device 1 proceeds to step S111.
Next, consider the case where the imaging device 1 is not set to the shooting mode (step S101: No) but is set to the playback mode (step S119: Yes). In this case, the display control unit 108 causes the display unit 7 to display a list of reduced images (thumbnails) obtained by shrinking the images corresponding to the image data stored in the image data storage unit 91 (step S120).
Next, when an image to be enlarged is selected from the thumbnail list via the operation input unit 6 or the touch panel 8 (step S121: Yes), the display control unit 108 causes the display unit 7 to display the selected image full-screen (step S122).
Afterwards, if an image switching operation is performed via the operation input unit 6 or the touch panel 8 (step S123: Yes), the imaging device 1 returns to step S120. Otherwise (step S123: No), the imaging device 1 returns to step S122.
Now consider the case where, in step S121, no image to be enlarged is selected from the thumbnail list via the operation input unit 6 or the touch panel 8 (step S121: No). In this case, the control unit 10 judges whether a predetermined time (for example, 3 seconds) has elapsed since the display unit 7 began displaying the thumbnail list (step S124). If it has (step S124: Yes), the imaging device 1 proceeds to step S117; if not (step S124: No), the imaging device 1 returns to step S120.
If in step S119 the imaging device 1 is not set to the playback mode (step S119: No), it proceeds to step S117.
Next, the outline of the slide image display processing of step S116 in Fig. 4 is described. Figure 10 is a flowchart showing the outline of the slide image display processing.
As shown in Fig. 10, the display control unit 108 causes the display unit 7 to perform rec-view display of the photographed image corresponding to the captured image data (step S301). Specifically, as shown in Fig. 11(a), the display control unit 108 displays the photographed image W11 in the right region of the display unit 7.
Next, the display control unit 108 starts displaying the image group corresponding to the image data group temporarily stored in the temporary storage unit 93 (step S302). Specifically, as shown in Figs. 11(b) and 11(c), the display control unit 108 slides each image Wn (n being a natural number) corresponding to the image data temporarily stored in the temporary storage unit 93 in from the left side of the display unit 7 into the region below the rec-view of the photographed image W11 (arrow (a)), displaying them in sequence (image Wn → image Wn+1 → image Wn+2 → ... → image Wn+3).
Afterwards, the control unit 10 judges whether the image group currently sliding across the display unit 7 contains a tagged image to which the information attaching unit 106 attached a tag (step S303). If it does (step S303: Yes), the display control unit 108 causes the display unit 7 to highlight the tagged image (step S304). Specifically, as shown in Fig. 11(d), the display control unit 108 enlarges the tagged image Wn+3 into the display region at the top of the display unit 7 so that it can be compared with the photographed image W11. The photographer K1 can thus examine the image Wn+3, in which subject A1 is captured in the approximately central region, side by side with the photographed image W11 captured by the photographing action. After step S304, the imaging device 1 proceeds to step S305, described later.
If in step S303 the control unit 10 judges that the image group sliding across the display unit 7 contains no tagged image (step S303: No), the imaging device 1 proceeds to step S305, described later.
In step S305, the control unit 10 judges whether a selection operation has been performed on the image group sliding across the display unit 7 to select an image to be stored in the image data storage unit 91. Specifically, the control unit 10 judges whether a selection signal selecting an image has been input from the menu switch 64 or the touch panel 8. If the control unit 10 judges that a selection operation has been performed on the sliding image group (step S305: Yes), the control unit 10 stores the image data of the selected image, for example image Wn+3, in the image data storage unit 91 (step S306). Afterwards, the imaging device 1 proceeds to step S307, described later.
If in step S305 the control unit 10 judges that no selection operation has been performed on the sliding image group via the menu switch 64 or the touch panel 8 within a predetermined time (for example, 5 seconds) (step S305: No), the imaging device 1 proceeds to step S307, described later.
In step S307, the control unit 10 judges whether display of the image data group temporarily stored in the temporary storage unit 93 has been completed. If it has (step S307: Yes), the control unit 10 deletes the entire image data group stored in the temporary storage unit 93 (step S308). Afterwards, the imaging device 1 returns to the main routine of Fig. 4.
If in step S307 the control unit 10 judges that display of the image data group temporarily stored in the temporary storage unit 93 has not yet been completed (step S307: No), the imaging device 1 returns to step S303.
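The loop of steps S303 to S308 can be sketched in miniature as follows. The data model (dicts with `tagged` flags, a `select` callback standing in for the menu switch or touch panel input) is hypothetical; only the control flow mirrors Fig. 10.

```python
def slide_display(temp_images, select):
    """Miniature of the slide image display loop: highlight tagged frames
    (step S304), let the caller keep selected ones (step S306), and flush
    the temporary buffer once every frame has been shown (step S308)."""
    kept, highlighted = [], []
    for img in temp_images:
        if img.get("tagged"):
            highlighted.append(img["id"])   # emphasised display
        if select(img):
            kept.append(img["id"])          # stored to unit 91
    temp_images.clear()                      # delete the temporary group
    return kept, highlighted

buf = [{"id": "W1", "tagged": False},
       {"id": "W2", "tagged": True}]
kept, shown = slide_display(buf, select=lambda i: i.get("tagged"))
print(kept, shown, len(buf))  # ['W2'] ['W2'] 0
```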
According to Embodiment 1 of the present invention described above, when a main subject candidate detected by the main subject candidate detection unit 103 is moving along the moving direction of the imaging device 1 determined by the moving direction determination unit 102, the main subject setting unit 104 sets that candidate as the main subject, and the image detection unit 105 detects, from the image group stored in the temporary storage unit 93, the images in which the main subject set by the main subject setting unit 104 lies in the predetermined region. Thus, even in a situation where the main subject to be photographed is moving, images that capture the main subject in the intended composition can be extracted during photographing.
Furthermore, according to Embodiment 1, the display control unit 108 causes the display unit 7 to display the photographed image together with the images corresponding to the image data group stored in the temporary storage unit 93. The images from before and after the moment of photographing can thus be presented to the photographer.
Furthermore, according to Embodiment 1, the display control unit 108 causes the display unit 7 to display in sequence, in an identifiable manner, the tagged images to which the information attaching unit 106 attached a tag, that is, the images in which the main subject is captured approximately at the center. Images that capture the main subject in the composition the photographer intended can thus be presented.
(Embodiment 2)
Next, Embodiment 2 of the present invention is described. The imaging device according to Embodiment 2 further has an azimuth detection unit that detects azimuth, and its main subject candidate determination processing differs from that of Embodiment 1 described above. In the following, the structure of the azimuth detection unit is described first, followed by the main subject candidate determination processing in the operation of the imaging device according to Embodiment 2. In the drawings, identical parts are given identical reference numerals.
Figure 12 is a block diagram showing the structure of the imaging device 100 according to Embodiment 2. As shown in Fig. 12, the imaging device 100 has an imaging unit 2, an acceleration detection unit 3, a timer 4, a light emitting unit 5, an operation input unit 6, a display unit 7, a touch panel 8, a storage unit 9, a control unit 10, and an azimuth detection unit 110.
The azimuth detection unit 110 is constituted by a geomagnetic sensor and detects a designated azimuth set in advance in the imaging device 100. Specifically, by detecting the vertical and horizontal components of the earth's magnetic field when the lateral direction of the image displayed on the display unit 7 is approximately aligned with the horizontal direction of the imaging device 100, the azimuth detection unit 110 detects the azimuth of the imaging device 100 with the optical axis O1 of the imaging unit 2 as the reference bearing.
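A heading angle can be derived from the two horizontal-plane components of the geomagnetic field with a textbook computation; the following is a sketch standing in for unit 110 under that assumption (tilt compensation and calibration, which a real device would need, are omitted).

```python
import math

def azimuth_degrees(mag_x, mag_y):
    """Approximate the device azimuth, in degrees from the reference bearing,
    from the two horizontal-plane geomagnetic components."""
    return math.degrees(math.atan2(mag_y, mag_x)) % 360.0

print(azimuth_degrees(1.0, 0.0))  # 0.0  (the reference bearing)
print(azimuth_degrees(0.0, 1.0))  # 90.0
```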
Next, the main subject candidate determination processing performed by the imaging device 100 according to Embodiment 2 is described. Figure 13 is a flowchart showing the outline of the main subject candidate determination processing (step S106 of Fig. 4) performed by the imaging device 100.
As shown in Fig. 13, the moving direction determination unit 102 determines the acceleration produced in the imaging device 100 from the detection result of the acceleration detection unit 3 (step S401). When the moving direction determination unit 102 judges that there is no acceleration in the z direction (the optical axis O1 direction) (step S402: Yes), that the acceleration in the y direction (vertical direction) is larger than the acceleration in the x direction (horizontal direction) (step S403: Yes), and that the detected acceleration in the x or y direction varies periodically (step S404: Yes), the main subject candidate detection unit 103 detects the moving subject in the live view image displayed on the display unit 7 as the main subject candidate (step S405).
Figure 14 schematically shows the situation when the moving direction determination unit 102 judges the movement state of the imaging device 100. Figure 15 is a top view of the subject seen from direction A in Fig. 14, and Fig. 16 is a side view of the subject seen from direction B in Fig. 14. Figure 17 schematically shows the relation between the speed of the imaging device 100 and the acceleration detected by the acceleration detection unit 3 while the photographer K1 moves the imaging device 100. In Fig. 17(a), the horizontal axis t represents time, the vertical axis v represents the speed of the imaging device 100, and curve L11 represents the speed variation of the imaging device 100. In Fig. 17(b), the horizontal axis t represents time and the vertical axis a represents acceleration; curve Lx1 represents the acceleration of the imaging device 100 in the horizontal direction, curve Ly1 the acceleration in the vertical direction, and curve Lz1 the acceleration along the optical axis. In Figs. 14 and 15, movement of the imaging device 100 to the right is taken as positive and movement to the left as negative.
As shown in Figs. 14 to 16, when subject A1 moves irregularly, the photographer K1 tracks subject A1 while watching the live view image displayed on the display unit 7, so the vertical acceleration produced in the imaging device 100 shows larger values than the horizontal acceleration. Specifically, as shown in Fig. 17, both while the photographer K1 moves the imaging device 100 following the movement of subject A1 (times t1 to t2 and t5 to t6) and while the photographer K1 holds the imaging device 100 still (times t3 to t4 and t7 to t8), the acceleration detection unit 3 detects the horizontal and vertical accelerations, with the vertical value larger than the horizontal value. Moreover, because the photographer K1 moves the imaging device 100 following the approximately periodic movement of subject A1, the horizontal and vertical accelerations detected by the acceleration detection unit 3 vary periodically.
As described above, when there is no acceleration along the optical axis O1, the vertical acceleration is larger than the horizontal acceleration, and the detection results for the horizontal and vertical accelerations are periodic, the main subject setting unit 104 sets the main subject candidate detected by the main subject candidate detection unit 103 in the live view image displayed on the display unit 7 as the main subject.
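The decision chain of steps S402 to S404 can be sketched as a short predicate. The epsilon threshold for "no acceleration" is an assumption; the patent only states the qualitative conditions.

```python
def should_detect_candidate(ax, ay, az, periodic, eps=0.05):
    """Sketch of the Fig. 13 decision chain: no optical-axis (z) acceleration,
    vertical acceleration exceeding horizontal, and a periodic trace together
    indicate the photographer is tracking a moving subject."""
    if abs(az) >= eps:        # movement along the optical axis: bail out
        return False
    if abs(ay) <= abs(ax):    # vertical component must dominate
        return False
    return periodic            # step S404: the trace must be periodic

print(should_detect_candidate(0.1, 0.4, 0.0, True))  # True
print(should_detect_candidate(0.1, 0.4, 0.3, True))  # False: z movement
```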
After step S405, steps S406 and S407 perform the same processing as steps S204 and S205 described above, so their description is omitted. After step S407, the imaging device 100 returns to the main routine shown in Fig. 4.
Now consider the case where, in step S404, the moving direction determination unit 102 judges that the detected acceleration of the imaging device 100 in the x or y direction is not periodic (step S404: No). In this case, the moving direction determination unit 102 judges whether the detection result of the azimuth detection unit 110 is periodic (step S408). If the moving direction determination unit 102 judges that it is periodic (step S408: Yes), the imaging device 100 proceeds to step S405. If not (step S408: No), the imaging device 100 returns to the main routine shown in Fig. 4.
Figure 18 schematically shows the situation when the moving direction determination unit 102 judges the movement state of the imaging device 100. Figure 19 schematically shows the relation between the speed of the imaging device 100 and the detection result of the azimuth detection unit 110 while the photographer K1 moves the imaging device 100. In Fig. 19(a), the horizontal axis t represents time, the vertical axis represents speed, and curve L11 represents the speed variation of the imaging device 100. In Fig. 19(b), the horizontal axis represents time, the vertical axis θ represents the detection result of the azimuth detection unit 110, and curve L12 represents the variation of that detection result. In Fig. 18, movement of the imaging device 100 to the right is taken as positive.
As shown in Figs. 18 and 19, when subject A1 moves approximately periodically, the photographer K1 tracks subject A1 while watching the live view image displayed on the display unit 7, so the detection result of the azimuth detection unit 110 is also approximately periodic. Specifically, as shown in Fig. 19, while the photographer K1 moves the imaging device 100 approximately periodically following the movement of subject A1 (times t1 to t2 and t5 to t6), the value detected by the azimuth detection unit 110 varies approximately periodically.
Thus, when the detected acceleration of the imaging device 100 in the x or y direction is not approximately periodic but the detection result of the azimuth detection unit 110 varies approximately periodically, the main subject setting unit 104 sets the main subject candidate detected by the main subject candidate detection unit 103 in the live view image displayed on the display unit 7 as the main subject.
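The fallback test of step S408 — treating the azimuth trace as periodic when it oscillates — can be sketched by counting crossings of the trace's mean value. The crossing threshold is an assumed parameter, not from the patent.

```python
def azimuth_is_periodic(azimuths, min_crossings=3):
    """Sketch of the step S408 fallback: the azimuth readings (degrees) are
    treated as approximately periodic when they oscillate about their mean."""
    mean = sum(azimuths) / len(azimuths)
    above = [a > mean for a in azimuths]
    crossings = sum(1 for p, q in zip(above, above[1:]) if p != q)
    return crossings >= min_crossings

pan = [10, 20, 10, 0, 10, 20, 10, 0]   # swinging back and forth about 10°
turn = [0, 5, 10, 15, 20, 25]          # one-way rotation
print(azimuth_is_periodic(pan))   # True
print(azimuth_is_periodic(turn))  # False
```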
Finally, consider the case where in step S402 the moving direction determination unit 102 judges that there is an acceleration change in the z direction detected by the acceleration detection unit 3 (step S402: No), or where in step S403 it judges that the acceleration in the y direction detected by the acceleration detection unit 3 is not larger than the acceleration in the x direction (step S403: No). In either case, the imaging device 100 returns to the main routine of Fig. 4.
According to Embodiment 2 of the present invention described above, when a main subject candidate detected by the main subject candidate detection unit 103 is moving along the moving direction of the imaging device 100 determined by the moving direction determination unit 102, the main subject setting unit 104 sets that candidate as the main subject to be tracked during photographing, and the images in which the main subject set by the main subject setting unit 104 lies in the predetermined region are detected from the image group stored in the temporary storage unit 93. Thus, even in a situation where the main subject to be photographed is moving, images that capture the main subject in the intended composition can be extracted during photographing.
Furthermore, according to Embodiment 2, when the detected horizontal or vertical acceleration is not periodic but the detection result of the azimuth detection unit 110 varies periodically, the main subject setting unit 104 sets the main subject candidate detected by the main subject candidate detection unit 103 in the live view image displayed on the display unit 7 as the main subject, and the images in which the main subject set by the main subject setting unit 104 lies in the predetermined region are detected from the image group stored in the temporary storage unit 93. An image with the composition the photographer intended can thus be obtained reliably.
(Variation 1 of Embodiment 2)
In Embodiment 2 above, the photographer K1 moves the imaging device 100 parallel to the horizontal and vertical directions, but the invention is also applicable when, for example, the photographer K1 moves the imaging device 100 periodically along an arc without changing the extension of the arms.
Figure 20 schematically shows the situation when the moving direction determination unit 102 judges the movement state of the imaging device 100. In Fig. 20, as coordinate axes fixed to the earth, the X axis is taken in the horizontal direction, the Y axis in the vertical direction (vertically downward being positive), and the Z axis in the direction perpendicular to the X and Y axes. Figure 21 schematically shows the relation between the accelerations of the imaging device 100 along the optical axis O1 and in the vertical direction while the photographer K1 moves the imaging device 100. In Fig. 21, the horizontal axis t represents time and the vertical axis a represents the acceleration of the imaging device 100; curve LY2 represents the vertical acceleration referred to the earth-fixed coordinate axes, and curve LZ2 represents the acceleration along the optical axis O1 referred to the same axes.
As shown in Figs. 20 and 21, when subject A1 moves approximately periodically in the vertical direction at a certain height, the photographer K1 tracks subject A1 while watching the live view image displayed on the display unit 7, moving the imaging device 100 up and down; the accelerations along the optical axis O1 and in the vertical direction therefore vary periodically.
Thus, when the accelerations along the optical axis O1 and in the vertical direction vary periodically, the main subject setting unit 104 sets the main subject candidate detected by the main subject candidate detection unit 103 in the live view image displayed on the display unit 7 as the main subject.
According to this variation of Embodiment 2 described above, when the detected acceleration along the optical axis or in the vertical direction varies periodically, the main subject setting unit 104 sets the main subject candidate detected by the main subject candidate detection unit 103 in the live view image displayed on the display unit 7 as the main subject, and the images in which the main subject set by the main subject setting unit 104 lies in the predetermined region are detected from the image group stored in the temporary storage unit 93. An image with the composition the photographer intended can thus be obtained reliably.
(Embodiment 3)
Next, Embodiment 3 of the present invention is described. The imaging device according to Embodiment 3 differs from the imaging devices described above in the structure of its storage unit and control unit, and the operation it performs differs from that of the embodiments above. In the following, the structure that differs from the above embodiments is described first, followed by the operation of the imaging device according to Embodiment 3. In the drawings, identical parts are given identical reference numerals.
Figure 22 is a block diagram showing the structure of the imaging device 200 according to Embodiment 3. As shown in Fig. 22, the imaging device 200 has a storage unit 209 that includes an image data storage unit 91, a program storage unit 92, a temporary storage unit 93, and a contrast storage unit 94.
The contrast storage unit 94 stores an AF evaluation value that associates the contrast of the image data, detected by the contrast detection unit 211 (described later) for each displacement through which the imaging device 200 has moved, with the focus position of the imaging unit 2. Here, the AF evaluation value is a value that associates the maximum contrast of the image data generated by the imaging unit 2 with the focus position of the imaging unit 2.
The control unit 210 has an image processing unit 101, a moving direction determination unit 102, a main subject candidate detection unit 103, a main subject setting unit 104, an image detection unit 105, an information attaching unit 106, an imaging control unit 107, a display control unit 108, and a contrast detection unit 211.
While the imaging device 200 is moving, the contrast detection unit 211 detects the contrast of the image data generated by the imaging unit 2. Specifically, the contrast detection unit 211 detects the contrast of the image data generated by the imaging unit 2 at fixed intervals (for example, 60 fps) while the imaging device 200 moves, and outputs the detected contrast to the contrast storage unit 94.
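The patent does not specify how the contrast value is computed; a common stand-in is a gradient-energy measure over the grayscale frame. The sketch below uses the sum of absolute horizontal and vertical pixel differences purely as an illustrative metric.

```python
def frame_contrast(gray):
    """Illustrative contrast measure (sum of absolute horizontal and vertical
    pixel differences) standing in for the value unit 211 derives per frame.
    `gray` is a 2-D list of grayscale values."""
    h, w = len(gray), len(gray[0])
    c = 0
    for y in range(h):
        for x in range(w):
            if x + 1 < w:
                c += abs(gray[y][x + 1] - gray[y][x])
            if y + 1 < h:
                c += abs(gray[y + 1][x] - gray[y][x])
    return c

sharp = [[0, 255, 0], [255, 0, 255], [0, 255, 0]]        # strong edges
blurry = [[100, 120, 100], [120, 100, 120], [100, 120, 100]]  # weak edges
print(frame_contrast(sharp) > frame_contrast(blurry))  # True
```

An in-focus frame has stronger edges, so its value is larger; this is the property the peak search below relies on.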
The action carrying out the camera head 200 with above structure illustrates.Figure 23 is the flow chart of the summary of the action illustrating that the camera head 200 that embodiments of the present invention 3 relate to carries out.
Step S501~step S505 correspond respectively to step S101 shown in Fig. 4~step S105.
In step S506, moving direction detection unit 102 judges whether camera head 200 moves up in optical axis O1 side.Specifically, moving direction detection unit 102 judges whether acceleration detecting section 3 goes out acceleration to optical axis O1 angle detecting.Moving direction detection unit 102 be judged to camera head 200 carried out on optical axis O1 direction mobile in the case of (step S506: yes), step S507 described later transferred to by camera head 200.On the other hand, being judged to camera head 200 (step S506: no) not in the case of optical axis O1 side moves up at moving direction detection unit 102, step S514 described later transferred to by camera head 200.
In step S507, the contrast detection unit 211 detects the contrast of the image data generated by the image pickup unit 2, and stores the detected contrast in the contrast storage unit 94 (step S508).
Fig. 24 schematically shows the situation in which the contrast detection unit 211 detects the contrast of the image data generated by the image pickup unit 2. Fig. 25 shows the relation, under the situation shown in Fig. 24, between the contrast of the image data detected by the contrast detection unit 211 and the shooting distance from the imaging device 200 to the subject A2. In Fig. 24, the focus lens of the lens unit 21 of the image pickup unit 2 is assumed to be stopped (fixed) at a predetermined position on the optical axis O1 of the image pickup unit 2, for example at the nearest-side position. In Fig. 25, the horizontal axis d represents the shooting distance between the imaging device 200 and the subject A2, the vertical axis c represents contrast, and the curve L21 represents the contrast.
Under the situation shown in Fig. 24, the photographer K1 brings the imaging device 200 closer to the subject A2 with the focus distance D2 of the image pickup unit 2 fixed (Figs. 24(a) to 24(c)); the contrast detected by the contrast detection unit 211 therefore changes along the curve L21 shown in Fig. 25. Specifically, as shown in Fig. 25, at the shooting distance D2 between the imaging device 200 and the subject A2, the contrast of the image data detected by the contrast detection unit 211 reaches its peak C2 at point P2. By photographing at this peak C2, the imaging device 200 can obtain image data in which the subject A2 is in focus. That is, when the lens unit 21 is fixed on the optical axis O1 of the image pickup unit 2, the photographer K1 can capture in-focus image data simply by moving the imaging device 200 toward or away from the subject A2.
Further, while the imaging device 200 moves in the optical axis O1 direction, the photography control unit 107 compares the contrast of the image data newly detected by the contrast detection unit 211 with the contrast stored in the contrast storage unit 94; when the contrast of the image data decreases continuously, the position immediately before the contrast began to decrease is judged to be the peak of the contrast of the image data. Specifically, in Fig. 25, when the contrast detection unit 211 detects that the contrast decreases from point P2 (contrast C2), which follows point P1, to point P3 (contrast C3), the photography control unit 107 judges point P2 to be the peak of the contrast of the image data.
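The peak decision described above can be sketched as a scan over the contrast samples in arrival order. The function name and the `drops_required` noise-tolerance parameter are assumptions for illustration; the patent only describes detecting a continuing decrease and taking the position just before it as the peak.

```python
# Sketch (illustrative only) of the peak judgment of the photography
# control unit 107: once the contrast has dropped for `drops_required`
# consecutive samples, the sample just before the drop began is reported
# as the peak (e.g. P2 in Fig. 25, once P3 and the next sample are lower).

def detect_peak(contrasts, drops_required=2):
    """Return the index of the sample judged to be the contrast peak,
    or None if no sustained decrease has been observed yet."""
    drops = 0
    for i in range(1, len(contrasts)):
        if contrasts[i] < contrasts[i - 1]:
            drops += 1
            if drops >= drops_required:
                return i - drops  # position before the decrease began
        else:
            drops = 0
    return None


print(detect_peak([0.3, 0.6, 0.9, 0.7, 0.5]))  # 2 (the 0.9 sample)
```

Requiring more than one consecutive drop is a common way to avoid declaring a peak on a single noisy frame; with `drops_required=1` the function mirrors the minimal P2-to-P3 decision literally.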
In step S509, the photography control unit 107 determines whether the contrast detected by the contrast detection unit 211 is a peak (maximum). When the photography control unit 107 determines that the detected contrast is a peak (step S509: Yes), the photography control unit 107 stores the peak contrast in the contrast storage unit 94 (step S510). The imaging device 200 then proceeds to step S511 described later. On the other hand, when the photography control unit 107 determines that the detected contrast is not a peak (step S509: No), the imaging device 200 proceeds to step S514 described later.
In step S511, the imaging device 200 performs main subject candidate determination processing for determining a main subject candidate that can become the main subject in the photographic composition set by the photographer.
Fig. 26 is a flowchart illustrating an overview of the main subject candidate determination processing of step S511 in Fig. 23.
As shown in Fig. 26, the moving direction determination unit 102 determines, from the change in acceleration detected by the acceleration detection unit 3, whether the imaging device 200 has moved in the optical axis O1 direction (step S601). When the moving direction determination unit 102 determines that the imaging device 200 has moved in the optical axis O1 direction (step S601: Yes), the imaging device 200 proceeds to step S602 described later. On the other hand, when the moving direction determination unit 102 determines that the imaging device 200 has not moved in the optical axis O1 direction (step S601: No), the imaging device 200 returns to the main routine of Fig. 23.
In step S602, the main subject candidate detection unit 103 detects, as a main subject candidate, a subject whose region changes within the live view image displayed on the display unit 7. Specifically, the main subject candidate detection unit 103 detects, as main subject candidates, subjects whose area shrinks or expands between consecutive live view images.
Next, the main subject setting unit 104 determines whether the main subject candidate detected by the main subject candidate detection unit 103 changes along the moving direction of the imaging device 200 determined by the moving direction determination unit 102 (step S603). When the main subject candidate detected by the main subject candidate detection unit 103 changes along the moving direction of the imaging device 200 determined by the moving direction determination unit 102 (step S603: Yes), the main subject setting unit 104 stores the feature of the main subject candidate in the temporary storage unit 93 (step S604). The imaging device 200 then returns to the main routine of Fig. 23. On the other hand, when the main subject candidate detected by the main subject candidate detection unit 103 does not change along the moving direction of the imaging device 200 determined by the moving direction determination unit 102 (step S603: No), the imaging device 200 returns to the main routine of Fig. 23.
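The candidate determination flow of Fig. 26 (steps S601 to S604) can be sketched as follows. All names and the area-comparison representation are hypothetical; the patent describes the logic only in prose.

```python
# Illustrative sketch of the flow of Fig. 26: skip unless the device moved
# along the optical axis (S601); detect subjects whose area changed between
# consecutive live view frames (S602); keep only candidates whose change is
# consistent with the device's moving direction (S603/S604).

def determine_main_subject_candidates(moved_along_axis, regions_prev,
                                      regions_curr, approaching):
    """regions_prev / regions_curr map a subject id to its area in two
    consecutive live view frames; `approaching` is True when the device
    moves toward the scene (a tracked subject should expand) and False
    when it moves away (a tracked subject should shrink)."""
    if not moved_along_axis:            # step S601: No -> back to main routine
        return []
    candidates = []
    for subject, area in regions_curr.items():
        prev = regions_prev.get(subject)
        if prev is None or area == prev:
            continue                    # no region change: not a candidate (S602)
        expanding = area > prev
        if expanding == approaching:    # S603: change matches moving direction
            candidates.append(subject)  # S604: keep this candidate's feature
    return candidates
```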
Returning to Fig. 23, the description continues from step S512. In step S512, when the live view image displayed on the display unit 7 is image data near the peak of the contrast and is consistent with the feature of the main subject set by the main subject setting unit 104 (step S512: Yes), the photography control unit 107 stores the image data of the live view image displayed on the display unit 7 in the image data storage unit 91 (step S513). The imaging device 200 then proceeds to step S514 described later. On the other hand, when the live view image displayed on the display unit 7 is image data near the peak of the contrast but is not consistent with the feature of the main subject set by the main subject setting unit 104 (step S512: No), the imaging device 200 proceeds to step S514 described later.
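The two-part condition of step S512 amounts to a conjunction. The helper below is a hypothetical sketch; in particular, the `peak_tolerance` threshold for "near the peak" and the precomputed `feature_match` flag are assumptions, since the patent does not define how closeness to the peak or feature consistency is measured.

```python
# Sketch of the step-S512 decision: a live view frame is stored only when
# it lies near the contrast peak AND its subject matches the stored main
# subject feature. Threshold and flag names are illustrative assumptions.

def should_store_frame(contrast, peak_contrast, feature_match,
                       peak_tolerance=0.05):
    near_peak = abs(contrast - peak_contrast) <= peak_tolerance
    return near_peak and feature_match
```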
Since steps S514 to S518 correspond to steps S111 to S113, S117, and S118 shown in Fig. 4, respectively, their description is omitted.
Next, the case where the imaging device 200 is not set to the photography mode (step S501: No) but is set to the reproduction mode (step S519: Yes) will be described. In this case, the imaging device 200 performs steps S520 to S524. Since steps S520 to S524 correspond to steps S120 to S124 shown in Fig. 4, respectively, their description is omitted.
The case where the imaging device 200 is not set to the reproduction mode in step S519 (step S519: No) will now be described. In this case, the imaging device 200 proceeds to step S518.
According to Embodiment 3 of the present invention described above, when the peak of the contrast detected by the contrast detection unit 211 is detected, the main subject setting unit 104 sets, as the main subject, the main subject candidate detected by the main subject candidate detection unit 103 in the live view image displayed on the display unit 7, and an image in which the main subject set by the main subject setting unit 104 is located in a predetermined region is detected from the image group stored in the temporary storage unit 93. Thus, an image with the photographic composition desired by the photographer can be reliably obtained, and an image in which the main subject is in focus can be obtained.
(Other Embodiments)
Further, in the above embodiments, terms such as "first" and "next" are used for convenience in describing the operation flows in the claims, the description, and the drawings, but this does not mean that the operations must be performed in that order.
Further, in the above embodiments, a digital camera is described as the imaging device; however, the invention can also be applied to electronic equipment such as a digital single-lens reflex camera, a digital video camera, a portable phone having two kinds of camera functions, and a tablet portable device.

Claims (13)

1. An imaging device, characterized in that the imaging device comprises:
an image pickup unit that continuously generates image data of a subject;
a moving direction determination unit that determines a moving direction in which the imaging device moves;
a movement detection unit that detects movement of the subject included in the image data; and
a main subject setting unit that sets the subject as a main subject when the subject is tracked by the photographing region of the imaging device.
2. The imaging device according to claim 1, characterized in that the imaging device further comprises:
a main subject candidate detection unit that detects the subject as a main subject candidate according to a change in image information included in the plurality of continuously generated pieces of image data,
wherein the main subject setting unit is further configured to set the main subject candidate as the main subject when the moving direction determined by the moving direction determination unit is consistent with the movement of the main subject candidate detected by the main subject candidate detection unit.
3. The imaging device according to claim 1, characterized in that
the main subject setting unit performs the setting of the main subject when the imaging device is moved in accordance with the movement of the subject.
4. The imaging device according to claim 1, characterized in that the main subject setting unit is further configured to set the subject as the main subject when the imaging device has reciprocated with a predetermined amplitude and there is a subject moving toward the moving direction of the imaging device.
5. The imaging device according to claim 1, characterized in that the imaging device further comprises:
a temporary storage unit that temporarily stores the image data continuously generated by the image pickup unit; and
an image detection unit that detects, from images corresponding to the image data stored in the temporary storage unit, an image in which the main subject set by the main subject setting unit is located in a predetermined region.
6. The imaging device according to claim 2, characterized in that
when the main subject candidate detected by the main subject candidate detection unit remains the same, the main subject setting unit sets the main subject candidate as the main subject.
7. The imaging device according to claim 1, characterized in that the imaging device further comprises:
a display unit that displays, in an identifiable manner, an image in which the subject set as the main subject by the main subject setting unit is located in a predetermined portion of the screen.
8. The imaging device according to claim 7, characterized in that the imaging device further comprises:
an input unit that accepts input of a selection signal for selecting at least part of the plurality of images displayed by the display unit; and
an image data storage unit that stores the image data corresponding to the image corresponding to the selection signal accepted by the input unit.
9. The imaging device according to claim 7, characterized in that the imaging device further comprises:
a temporary storage unit that temporarily stores the image data continuously generated by the image pickup unit; and
a display control unit that causes the display unit to sequentially display at least part of the images corresponding to the image data stored in the temporary storage unit.
10. The imaging device according to claim 1, characterized in that
the imaging device further comprises a contrast detection unit that detects contrast from the image data, and
the moving direction determination unit determines the moving direction according to a change in the contrast detected by the contrast detection unit.
11. The imaging device according to claim 2, characterized in that
the main subject candidate detection unit is further configured to detect, as a moving subject region, a region that changes approximately periodically between corresponding images in the continuously generated image data, according to the change in the image information.
12. An imaging method performed by an imaging device, characterized in that the imaging method comprises the following steps:
an image pickup step of continuously generating image data of a subject;
a moving direction determination step of determining a moving direction in which the imaging device moves;
a movement detection step of detecting movement of the subject included in the image data; and
a main subject setting step of setting the subject as a main subject when the subject is tracked by the photographing region of the imaging device.
13. The imaging method according to claim 12, characterized in that the imaging method further comprises:
a main subject candidate detection step of detecting the subject as a main subject candidate according to a change in image information included in the plurality of continuously generated pieces of image data,
wherein, in the main subject setting step, the main subject candidate is set as the main subject when the moving direction determined in the moving direction determination step is consistent with the movement of the main subject candidate detected in the main subject candidate detection step.
CN201610326097.4A 2011-06-24 2012-06-20 Photographic device, image capture method Expired - Fee Related CN105827985B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2011-141146 2011-06-24
JP2011141146A JP5800600B2 (en) 2011-06-24 2011-06-24 Imaging apparatus, imaging method, and program
CN201210210469.9A CN102843512B (en) 2011-06-24 2012-06-20 Camera head, image capture method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201210210469.9A Division CN102843512B (en) 2011-06-24 2012-06-20 Camera head, image capture method

Publications (2)

Publication Number Publication Date
CN105827985A true CN105827985A (en) 2016-08-03
CN105827985B CN105827985B (en) 2019-02-05

Family

ID=47370543

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201210210469.9A Expired - Fee Related CN102843512B (en) 2011-06-24 2012-06-20 Camera head, image capture method
CN201610326097.4A Expired - Fee Related CN105827985B (en) 2011-06-24 2012-06-20 Photographic device, image capture method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201210210469.9A Expired - Fee Related CN102843512B (en) 2011-06-24 2012-06-20 Camera head, image capture method

Country Status (2)

Country Link
JP (1) JP5800600B2 (en)
CN (2) CN102843512B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5929774B2 (en) * 2013-02-08 2016-06-08 カシオ計算機株式会社 Image acquisition method, apparatus, and program
JP6249788B2 (en) * 2014-01-17 2017-12-20 オリンパス株式会社 Display device, display method and program
JP6128109B2 (en) * 2014-12-12 2017-05-17 カシオ計算機株式会社 Image capturing apparatus, image capturing direction control method, and program

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1641467A (en) * 2003-11-27 2005-07-20 索尼株式会社 Photographing apparatus and method, supervising system, program and recording medium
CN101014095A (en) * 2006-01-31 2007-08-08 佳能株式会社 Method for displaying an identified region together with an image, and image pick-up apparatus
CN101739551A (en) * 2009-02-11 2010-06-16 北京智安邦科技有限公司 Method and system for identifying moving objects
CN101867725A (en) * 2009-01-23 2010-10-20 卡西欧计算机株式会社 Camera head and reference object tracking
US20100321505A1 (en) * 2009-06-18 2010-12-23 Kokubun Hideaki Target tracking apparatus, image tracking apparatus, methods of controlling operation of same, and digital camera
CN101931746A (en) * 2009-06-18 2010-12-29 奥林巴斯映像株式会社 Camera head and image capture method
CN101998054A (en) * 2009-08-18 2011-03-30 佳能株式会社 Focus adjustment apparatus and focus adjustment method

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3860552B2 (en) * 2003-03-25 2006-12-20 富士通株式会社 Imaging device
JP4398346B2 (en) * 2004-10-29 2010-01-13 オリンパス株式会社 Camera, photographing sensitivity control method, shutter speed control method, and photographing sensitivity control program
JP2008278480A (en) * 2007-04-02 2008-11-13 Sharp Corp Photographing apparatus, photographing method, photographing apparatus control program and computer readable recording medium with the program recorded thereon
JP5065060B2 (en) * 2008-01-16 2012-10-31 キヤノン株式会社 Imaging apparatus and control method thereof
JP2009232276A (en) * 2008-03-24 2009-10-08 Olympus Imaging Corp Image pickup device
JP2010021598A (en) * 2008-07-08 2010-01-28 Victor Co Of Japan Ltd Image capturing apparatus and method
JP2010157851A (en) * 2008-12-26 2010-07-15 Olympus Imaging Corp Camera and camera system
JP5397078B2 (en) * 2009-08-11 2014-01-22 株式会社ニコン Imaging device
JP2011082770A (en) * 2009-10-06 2011-04-21 Canon Inc Data generation apparatus, method of controlling the same, and program
JP2011114487A (en) * 2009-11-25 2011-06-09 Olympus Imaging Corp Imaging apparatus


Also Published As

Publication number Publication date
CN102843512A (en) 2012-12-26
CN105827985B (en) 2019-02-05
JP5800600B2 (en) 2015-10-28
JP2013009204A (en) 2013-01-10
CN102843512B (en) 2016-05-25


Legal Events

Date Code Title Description
- C06 / PB01: Publication
- C10 / SE01: Entry into substantive examination / Entry into force of request for substantive examination
- GR01: Patent grant
- TR01: Transfer of patent right
  - Effective date of registration: 20211215
  - Address after: Tokyo, Japan; patentee after: Aozhixin Digital Technology Co., Ltd.
  - Address before: Tokyo, Japan; patentee before: OLYMPUS Corp.
- CF01: Termination of patent right due to non-payment of annual fee
  - Granted publication date: 20190205