CN103002223A - Photographic equipment - Google Patents

Photographic equipment

Info

Publication number
CN103002223A
CN103002223A (application CN201210327398A)
Authority
CN
China
Prior art keywords
image
processing
camera
exposure
mentioned
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2012103273980A
Other languages
Chinese (zh)
Inventor
新谷浩一
平田真宽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Imaging Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Imaging Corp filed Critical Olympus Imaging Corp
Publication of CN103002223A

Landscapes

  • Studio Devices (AREA)
  • Indication In Cameras, And Counting Of Exposures (AREA)

Abstract

The present invention provides photographic equipment that improves the visibility of the live-view image display. As a solution, the photographic equipment comprises: an imaging unit (12) that photographs a subject at a predetermined period; a display unit (23) that displays the images obtained by the imaging unit; and a display control unit (11) that controls the display unit. The display control unit switches between a first display mode, which successively displays the images obtained by the imaging unit at the predetermined period, and a second display mode, which adds together a plurality of images obtained by the imaging unit at the predetermined period and displays the image resulting from the addition.

Description

Photographic equipment
Technical field
The present invention relates to photographic equipment, and more specifically to photographic equipment with which a subject can be observed through a live-view image even in dark locations.
Background technology
Conventionally, in photographic equipment such as digital cameras that use an image pickup device, it has become common practice to use the display unit attached to the equipment as an image viewing device (viewfinder) by performing a so-called live-view display: image data are acquired with the image pickup device while, at the same time, images based on the acquired data are continuously displayed on the screen of the display unit as a moving image.
In recent photographic equipment, benefiting from improvements in the processing speed of image processing circuits, the results of various kinds of image processing can be reflected in the live-view image displayed before photographic recording is carried out.
For example, in existing photographic equipment, when photographing with a long exposure (so-called bulb photography), the display unit can show a live-view image for monitoring the result of the exposure in progress; various schemes have been proposed, for example in Japanese Laid-Open Patent Publication No. 2011-103499, and have been put to practical use.
The means disclosed in Japanese Laid-Open Patent Publication No. 2011-103499 performs the processing for live-view image display independently of the photographic recording processing during a bulb exposure. During the bulb exposure, a plurality of dark (under-exposed) images are acquired and added together. Meanwhile, a live-view image is displayed so that the progress of the exposure can be monitored. In this case, display processing (addition processing) different from the recording processing of the images is performed, so that a more natural live-view display can be provided during bulb photography.
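The addition of under-exposed frames described here can be sketched as follows. This is a minimal illustration under assumed conditions (8-bit pixels, identically exposed frames, clipping at the top of the range), not the implementation of the cited publication:

```python
def add_frames(frames, bit_depth=8):
    """Sum under-exposed frames pixel by pixel and clip to the output range.

    Each individual frame is too dark to view on its own, but the running
    sum approaches a correctly exposed image, as in bulb-photography
    live view.
    """
    limit = 2 ** bit_depth - 1
    acc = [[0] * len(frames[0][0]) for _ in frames[0]]
    for frame in frames:
        for y, row in enumerate(frame):
            for x, v in enumerate(row):
                acc[y][x] += v
    return [[min(v, limit) for v in row] for row in acc]

# Eight frames at roughly 1/8 of correct exposure sum to a bright image.
dark = [[[30, 30], [30, 30]] for _ in range(8)]
bright = add_frames(dark)
```

In practice a wider accumulator than the output bit depth is essential, as above, so that the sum does not wrap around before clipping.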
On the other hand, with existing photographic equipment of the above kind, when the display unit is used as a viewfinder and the surroundings are dark so that the subject brightness is low — for example when photographing in dark locations such as an outdoor ski slope at night or an indoor party venue under low lighting — the image shown on the screen of the display unit may be so under-exposed that the subtle expressions and postures of the subject, or even the framing itself, cannot be adequately confirmed.
In addition, with such existing equipment, when a power-zoom action is performed during live-view display, if the zoom action takes place while a live-view frame is being acquired (during the exposure of the image pickup device), the acquired image may be distorted.
Furthermore, in such existing equipment, when an autofocus (AF) action is performed during live-view display, the live-view image darkens while the AF action is in progress. The reason is that, to speed up the AF action, the period of the vertical synchronizing signal is shortened during the action, which in turn shortens the exposure time of each frame.
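The darkening during AF can be quantified: if the vertical-sync period is shortened by a factor k, each frame loses log2(k) EV of exposure. A quick sketch with assumed example periods (the 1/30 s and 1/120 s values are illustrative, not from the patent):

```python
import math

def ev_loss(normal_period_s, fast_period_s):
    """EV lost per frame when the vsync period is shortened for fast AF."""
    return math.log2(normal_period_s / fast_period_s)

# Assumed example: 1/30 s live-view frames sped up to 1/120 s during AF.
loss = ev_loss(1 / 30, 1 / 120)   # each frame is 2 EV darker
frames_to_recover = 2 ** loss     # adding 4 such frames restores brightness
```

This is the arithmetic motivation for applying frame addition during AF: the brightness lost to a k-times-faster sync period is recovered by summing k frames.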
It is therefore conceivable to apply image addition processing such as that disclosed in Japanese Laid-Open Patent Publication No. 2011-103499 to live-view image display processing.
[Patent Document 1] Japanese Laid-Open Patent Publication No. 2011-103499
The means disclosed in Patent Document 1 is particularly effective when performing long exposures (so-called bulb photography) with the photographic equipment fixed to a tripod or other fixture.
In the use scenes mentioned above — dark surroundings such as an outdoor ski slope at night or an indoor party venue under low lighting — photography is usually performed with auxiliary light such as a flash light emission device.
However, auxiliary light is effective only at the moment of the actual photographing action and has no effect on the live-view display. With existing photographic equipment, therefore, live-view display in dark locations cannot be used sufficiently and reliably to a degree that satisfies the user.
Summary of the invention
The present invention has been made in view of the above circumstances. Its object is to provide photographic equipment that, by using image addition processing, can display the shooting range including the subject as the photography target in good condition even in dark locations or during AF, zoom, and similar actions, and thereby improves the visibility of the live-view image display.
To achieve the above object, photographic equipment according to one aspect of the present invention comprises: an imaging unit that photographs a subject at a predetermined period; a display unit that displays the images obtained by the imaging unit; and a display control unit that controls the display unit. The display control unit switches between a first display mode, which successively displays the images obtained by the imaging unit at the predetermined period, and a second display mode, which adds together a plurality of images obtained by the imaging unit at the predetermined period and displays the image obtained as the result of the addition.
According to the present invention, it is possible to provide photographic equipment that, by using image addition processing, can display the shooting range including the subject as the photography target in good condition even in dark locations or during AF, zoom, and similar actions, and thereby improves the visibility of the live-view image display.
Description of drawings
Fig. 1 is a block diagram showing the main internal configuration of the photographic equipment (camera) of the first embodiment of the present invention.
Fig. 2 is an explanatory diagram showing a situation in which the photographic equipment (camera) of Fig. 1 is used for photography.
Fig. 3 is a diagram mainly showing the display screen of the photographic equipment (camera) in the state of Fig. 2.
Fig. 4 is a diagram showing an example of the displayed image obtained as the result of a touch operation performed in the states of Fig. 1 and Fig. 2.
Fig. 5 is a flowchart showing the processing sequence of the camera control executed by the photographic equipment (camera) of Fig. 1.
Fig. 6 is a flowchart of the frame addition display processing subroutine (step S15 of Fig. 5) in the processing sequence of Fig. 5.
Fig. 7 is a flowchart showing the processing sequence of the camera control of the photographic equipment (camera) of the second embodiment of the present invention.
Fig. 8 is a flowchart of the live-view image display control subroutine (step S47 of Fig. 7) in the processing sequence of Fig. 7.
Fig. 9 is a flowchart of the multi-frame composition processing subroutine (step S59 of Fig. 8) in the processing sequence of Fig. 8.
Fig. 10 is a flowchart of the subroutine (step S62 of Fig. 9) in the processing sequence of Fig. 9 that applies shake correction processing to the image used for the previous display.
Figs. 11A to 11H are timing charts showing the operation of the photographic equipment (camera) of the second embodiment when an AF action is performed during live-view image display.
Fig. 12 is a diagram showing the photometry regions on the effective light-receiving surface of the image pickup device in the photographic equipment (camera) of the second embodiment.
Figs. 13A and 13B are exposure control program diagrams of the photographic equipment (camera) of the second embodiment.
Figs. 14A to 14I are timing charts showing the operation of the photographic equipment (camera) of the second embodiment when a zoom action is performed during live-view image display.
Fig. 15 is an explanatory diagram of the shake correction processing performed in the photographic equipment (camera) of the second embodiment, showing the concept of the size of the light-receiving surface of the image pickup device and the size of the effective image region (display region).
Fig. 16 is an explanatory diagram of the shake correction processing performed in the photographic equipment (camera) of the second embodiment, schematically showing the shaken image region on the light-receiving surface.
Fig. 17 is an explanatory diagram of the shake correction processing performed in the photographic equipment (camera) of the second embodiment, schematically showing the image data obtained after the shake correction processing.
Fig. 18 is a diagram showing the concept of enlarging composition processing in the photographic equipment (camera) of the second embodiment.
Fig. 19 is a diagram showing the concept of reducing composition processing in the photographic equipment (camera) of the second embodiment.
Label declaration
1 camera; 10 camera body; 11 camera control unit; 12 image pickup device; 12a effective light-receiving surface; 13 temporary memory; 14 exposure control unit; 15 image processing unit; 16 operation control unit; 17 operation unit; 18 camera communication unit; 19 SDRAM; 20 memory interface; 21 recording medium; 22 display driver; 23 display unit; 23a effective image region; 24 touch panel driver; 25 touch panel; 26 angular velocity sensor; 30 photographing lens barrel; 31 photographic lens; 32 lens holding frame; 33 aperture device; 34 driver; 35 lens control unit; 36 lens communication unit; 39a, 39b communication contacts.
Embodiment
[First Embodiment]
Figs. 1 to 6 show the first embodiment of the present invention. Fig. 1 is a block diagram showing the main internal configuration of the photographic equipment (camera) of the first embodiment. Fig. 2 is an explanatory diagram showing a situation in which the camera of the present embodiment is used for photography. Fig. 3 is a diagram mainly showing the display screen of the camera in the state of Fig. 2. Fig. 4 is a diagram showing an example of the displayed image obtained as the result of a touch operation performed in the states of Fig. 1 and Fig. 2. Fig. 5 is a flowchart showing the processing sequence of the camera control executed by the camera of the present embodiment. Fig. 6 is a flowchart of the frame addition display processing subroutine (step S15 of Fig. 5) in the processing sequence of Fig. 5.
The first embodiment of the present invention shows, as an example of photographic equipment, a digital camera (hereinafter simply "camera") configured as follows: an optical image formed by an optical lens is photoelectrically converted by, for example, a solid-state image pickup device; the image signal obtained in this way is converted into digital image data representing a still image or a moving image; the digital image data thus generated are recorded on a recording medium; and still images or moving images can be reproduced and displayed on a display unit from the digital image data recorded on the recording medium.
In the drawings used in the following description, each constituent element may be drawn at a different scale so that it is large enough to be recognized in the figure. Accordingly, the number of constituent elements, their shapes, their size ratios, and their relative positional relationships shown in the drawings do not limit the present invention to the illustrated modes.
First, the main internal configuration of the photographic equipment (camera) of the first embodiment of the present invention will be described with reference to Fig. 1.
As shown in Fig. 1, the photographic equipment of the present embodiment, i.e. the camera 1, mainly comprises a camera body 10 and a photographing lens barrel 30.
The camera body 10 consists of a housing that accommodates the various components described later, and constitutes the basic structure of the camera 1. The photographing lens barrel 30 is provided with a photographic optical system and the like, and is provided to receive the light flux from the subject and form an optical subject image. The camera 1 is configured as a so-called interchangeable-lens camera, with the photographing lens barrel 30 detachably mounted on the front surface of the camera body 10.
Inside the camera body 10 are provided: a camera control unit 11 (labelled "CPU" in Fig. 1), composed of electronic components such as a CPU, which performs unified electronic control of the camera 1; and the various components that operate under its control, namely the image pickup device 12, a temporary memory 13 such as a flash memory, an exposure control unit 14, an image processing unit 15, an operation control unit 16, an operation unit 17, a camera communication unit 18, an SDRAM 19, a memory interface (I/F) 20, a recording medium 21, a display driver 22, a display unit 23 such as an LCD, a touch panel driver 24, a touch panel 25, an angular velocity sensor 26, and so on.
The image pickup device 12 is an imaging unit composed of a photoelectric conversion element and its drive circuits. It receives the optical image of the subject formed by the photographic optical system of the photographing lens barrel 30, periodically performs photoelectric conversion on it, and generates electronic image data. The image pickup device 12 used in the present embodiment is a solid-state image pickup device using CMOS (Complementary Metal Oxide Semiconductor) technology, i.e. a MOS image sensor. This MOS image sensor employs a sequential readout scheme (rolling shutter) in which the exposure timing is staggered row by row and the rows are exposed and read out in order.
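The row-staggered exposure of a rolling shutter can be sketched as per-row timing windows. The numbers below are assumed purely for illustration; they are not the patent's sensor parameters:

```python
def row_exposure_windows(n_rows, exposure_s, line_interval_s):
    """Start/end times of each row's exposure under a rolling shutter.

    Row r begins exposing line_interval_s later than row r-1, so no two
    rows expose over exactly the same interval. This is why a zoom or
    camera motion during readout can distort the acquired image, as
    noted in the background section.
    """
    return [(r * line_interval_s, r * line_interval_s + exposure_s)
            for r in range(n_rows)]

# Assumed example: 4 rows, 10 ms exposure per row, 1 ms line interval.
windows = row_exposure_windows(4, 0.010, 0.001)
skew = windows[-1][0] - windows[0][0]  # total top-to-bottom readout skew
```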
The temporary memory 13 is composed of a flash memory or the like, and is an internal memory area that temporarily stores programs, data, and the like used when various kinds of image processing are executed.
The exposure control unit 14 is a control circuit that performs photometry of the subject image from the output signal of the image pickup device 12 or from a signal from a separately provided photometry unit (not shown), sets a correct exposure value from the photometry result, and, according to the set value, controls a shutter mechanism (not shown; the so-called electronic shutter contained in the image pickup device 12, or a mechanical shutter separately arranged immediately in front of the image pickup device 12), an aperture mechanism (reference numeral 33; described later), the image pickup device 12 itself (sensitivity adjustment, electronic shutter, etc.), and so on. Images at the correct exposure are thus obtained, and the obtained images are output successively to the display unit 23 and displayed.
The image processing unit 15 is a circuit unit that, in addition to the frame addition processing performed in the camera 1 of the present embodiment, also performs the various kinds of image signal processing performed in conventional cameras.
The operation unit 17 comprises the various operating members needed to operate the camera 1.
The operation control unit 16 is a control circuit that receives instruction signals from the operation unit 17 and passes them to the camera control unit 11, and also passes instructions from the camera control unit 11 on to the control of the operation unit 17 and the like.
The camera communication unit 18 communicates with the photographing lens barrel 30 to exchange control signals and the like. For this purpose, communication contacts 39a, 39b are provided on the camera body 10 and the photographing lens barrel 30 respectively; when the two are coupled, the contacts 39a, 39b touch and conduct.
The SDRAM (Synchronous Dynamic Random Access Memory) 19 is a working storage area that temporarily stores the various control and data processing programs prestored in a ROM (not shown) and the like.
The memory interface 20 mediates between the camera control unit 11 and the recording medium 21, assisting in recording data to and reading data from the recording medium 21.
The recording medium 21 is a storage unit for recording the image data obtained by the camera 1, for example a compact medium such as a semiconductor memory card or a card-type HDD. The recording medium 21 is detachable from the camera 1; media fixedly built into the camera body 10 as internal memory are also included.
The display driver 22 drives and controls the display unit 23 under the control of the camera control unit 11 so that images, various kinds of information, and so on are displayed appropriately, as required, in a visually recognizable manner.
The display unit 23 is a panel-type display device such as an LCD (Liquid Crystal Display), and is a display section for showing images, various information, and so on in a visually recognizable manner. The display unit 23 is driven and controlled by the display driver 22 under the control of the camera control unit 11.
The touch panel driver 24 drives and controls the touch panel 25 under the control of the camera control unit 11, detects touch operations, and determines the operation input corresponding to the detection result.
The touch panel 25 is laid over the display surface of the display unit 23, and is part of an operation input unit with which the user performs touch, touch-and-slide, and similar operations on the panel surface to make various designation inputs according to these operations.
The angular velocity sensor 26 is a shake detection element that detects the posture of the camera 1 (tilt from the horizontal, etc.) and the shake applied to the equipment. The angular velocity sensor 26 is, for example, an electronic component forming part of an image shake correction function.
The photographing lens barrel 30, for its part, mainly comprises the photographic lens 31, the lens holding frame 32, the aperture device 33, the driver 34, the lens control unit 35 (labelled "lens CPU" in Fig. 1), the lens communication unit 36, and so on.
The photographic lens 31 is composed of a plurality of optical lenses, and is provided to receive the light flux from the subject and form an optical image.
The lens holding frame 32 is provided to hold the individual optical lenses of the photographic lens 31.
The aperture device 33 is provided to adjust the amount of light transmitted through the photographic lens 31, and is composed, for example, of aperture blades and a drive motor that drives them.
The lens communication unit 36 communicates with the camera body 10 to exchange control signals and the like. As described above, the photographing lens barrel 30 is provided with the communication contact 39b corresponding to the contact 39a of the camera body 10; when the two are coupled, the contacts 39a, 39b likewise touch and conduct.
The camera 1 of the present embodiment also has other functions such as a face detection function and a subject tracking function. The face detection function and the subject tracking function are image processing functions executed, under the control of the camera control unit 11, on the image data generated from the output signal of the image pickup device 12. These functions are the same as those used in existing cameras, so their detailed description is omitted. The remaining configuration is substantially the same as that of existing cameras.
The outline of the operation of the camera 1 of the present embodiment configured as above will now be described briefly with reference to Figs. 2 to 4.
First, the power of the camera 1 is switched on to start the camera 1. After this power-on operation, the camera 1 starts in a mode in which a photographing action can be performed (photographing mode) and the display unit 23 can be used as a viewer for observing the subject (live-view mode).
As shown in Fig. 2, the user 100 holds the camera 1 in this state pointed at the subject that is the desired photography target. If the surrounding environment is dark at this time — for example at a dark location such as an outdoor ski slope at night or an indoor party venue under low lighting — the subject brightness of the photography target is low, and as shown in Fig. 3 the live-view image on the screen of the display unit 23 may be entirely dark due to under-exposure, making it difficult to distinguish the desired subject on the screen.
Therefore, in the camera 1 of the present embodiment, the user 100 performs the operation shown in Figs. 2 and 3: the desired subject on the screen of the display unit 23 is indicated with a finger 101 by a touch operation, whereupon addition of frame images is performed until the image region containing the touched position becomes bright enough to be visually recognized. In the example of Fig. 3, the person in roughly the central region of the screen of the display unit 23 is indicated by the touch operation.
As a result, as shown for example in Fig. 4, the subject (person) at the indicated position on the screen of the display unit 23 is displayed in a visually recognizable state.
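The "add frames until the touched region is bright enough" behaviour can be sketched as a loop over the accumulated mean of the touched region. The threshold and frame values below are assumptions for illustration only:

```python
def frames_needed(region_mean, target_mean, max_frames=16):
    """Accumulate identically exposed frames until the touched region's
    mean pixel value reaches the target (capped at max_frames).

    Returns the number of frames whose addition makes the region
    visually recognizable, mirroring the brighten-on-touch behaviour.
    """
    total = 0.0
    for n in range(1, max_frames + 1):
        total += region_mean
        if total >= target_mean:
            return n
    return max_frames

# Assumed example: touched region averages 20/255; aim for about 120/255.
n = frames_needed(20, 120)   # six frames of addition suffice here
```

A real implementation would recompute the region mean from the live accumulator each frame; the fixed per-frame mean here simply keeps the sketch deterministic.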
Next, the processing sequence of the camera control of the camera of the present embodiment, which realizes the above operation of the camera 1, will be described with reference to the flowcharts of Figs. 5 and 6.
First, in step S11 of Fig. 5, the camera control unit 11 monitors the output signal from the operation unit 17 through the operation control unit 16 and checks whether a power-on signal has been detected. Monitoring continues until this signal is detected; when the power-on signal is detected, the processing proceeds to the next step S12.
In step S12, the camera control unit 11 performs predetermined communication processing with the lens control unit 35 of the photographing lens barrel 30 via the camera communication unit 18 and the lens communication unit 36, which are connected through the communication contacts 39a, 39b.
Next, in step S13, the camera control unit 11 checks whether the camera 1 is set to photographing mode. If it is confirmed that photographing mode is set, the processing proceeds to step S14; if it is confirmed that a mode other than photographing mode is set, the processing proceeds to step S20.
In step S14, the camera control unit 11 controls the driver 34 via the lens control unit 35 to drive the photographic lens 31, the aperture device 33, and so on, and controls the image pickup device 12, the display unit 23, and so on to perform imaging processing and temporary recording processing, while performing display processing on the display unit 23 using the same image data. The live-view image display on the display unit 23 is thus performed. The result of the temporary recording processing (one frame of image data) is temporarily recorded in turn in the temporary memory 13 or the like.
Next, in step S15, the camera control unit 11 determines whether to perform addition of frame images (hereinafter simply "frame addition processing"). The details of this step S15 are shown in Fig. 6.
That is, in step S31 of Fig. 6, the camera control unit 11 monitors the input signal from the touch panel 25 via the touch panel driver 24 and checks whether an input signal based on a touch operation has been detected. If a touch signal is detected, the processing proceeds to the next step S32; if no touch signal is detected, the processing proceeds to step S17 of Fig. 5.
When a touch signal has been detected in step S31 and the processing has advanced to step S32, the camera control unit 11 has the exposure control unit 14 judge the exposure of a predetermined image region containing the touch-indicated position in the image displayed on the display unit 23. Here it is checked whether the subject brightness of this region is, for example, at least 10% of the correct exposure (a deficit equivalent to approximately 3 EV). If the exposure is 10% or more, the processing proceeds to step S33; if the exposure is less than 10%, the processing proceeds to step S17 of Fig. 5.
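The equivalence between the 10% threshold and roughly 3 EV follows because EV is a base-2 logarithm of exposure. A quick check of that arithmetic:

```python
import math

def ev_below_correct(fraction_of_correct):
    """How many EV below correct exposure a given exposure fraction lies."""
    return -math.log2(fraction_of_correct)

# An exposure at 10% of correct is about 3.3 EV down, i.e. roughly 3 EV.
ev_at_10_percent = ev_below_correct(0.10)
```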
Next, in step S33, the camera control unit 11 checks whether a face image is contained within a predetermined range containing the position indicated by the touch operation (for example, a rectangular region occupying about 1/10 of the whole screen area). If a face image can be obtained, the processing proceeds to step S34; if no face image can be obtained, the processing proceeds to step S17 of Fig. 5.
When a face image has been obtained in step S33 and the processing has advanced to step S34, the camera control unit 11 judges whether tracking processing can be performed for the obtained face image. If it is judged that tracking processing can be performed, the processing proceeds to step S16 of Fig. 5; if it is judged that tracking processing cannot be performed, the processing proceeds to step S17 of Fig. 5.
The tracking processing referred to here is processing that keeps following the position of the designated subject (here, the face image) within the screen even as it moves. Tracking processing is in general practical use in existing cameras, so its detailed description is omitted.
The result of this determination of whether to perform frame addition processing is then returned to the sequence of Fig. 5.
When the determination indicates that frame addition processing is to be performed and the processing advances to step S16 of Fig. 5, the camera control unit 11 performs frame addition processing that adds the new imaging result to the frame image data temporarily recorded in step S14, temporarily records the image data resulting from the addition, and performs display processing that outputs this image data to the display unit 23 via the display driver 22 for display. The processing then proceeds to step S18.
The display mode at this time — in which a plurality of images obtained at the predetermined period by the image pickup device 12 are added together and the image obtained as the result of the addition is displayed — is called the second display mode.
On the other hand, in the processing sequence at above-mentioned Fig. 6, make expression and do not carried out the judgement that the frame addition is processed, and when entering into the processing of step S17 of Fig. 5, in this step S17, camera control part 11 is carried out via showing will output to display unit 23 by the frame image data of placeholder record with driver 22 in the temporary transient processing of above-mentioned steps S14, the Graphics Processing that shows in display unit 23.After this enter the processing of step S18.
The display mode at this time, namely the normal live view display in which the display unit 23 successively displays the images obtained at the predetermined period by the imaging apparatus 12, is referred to as the first display mode.
In step S18, the camera control part 11 monitors the operating portion 17 through the operation control part 16 and monitors the touch panel 25 through the touch panel driver 24 to confirm whether an operation for executing a photographing action, namely a release operation, has been performed. Specifically, it confirms, for example, whether a release signal has been generated from a release member (not shown) included in the operating portion 17 or from the touch panel 25. When the generation of a release signal is confirmed, processing proceeds to the next step S19; when it is not confirmed, processing returns to step S11 and the subsequent processing is repeated.
In step S19, the camera control part 11 controls the driver 34 through the lens control section 35 to drive the photographic lens 31, performs an automatic focusing (AF) action in accordance with the output from the imaging apparatus 12, and performs, through the exposure control part 14, a photometry action that sets an exposure value (including the correct exposure value for the case where supplementary light is emitted using a flash light emission device, not shown). Then, in accordance with the set values obtained by the AF action and the photometry action, it controls the driver 34 through the lens control section 35 to drive the photographic lens 31, the aperture device 33 and so on, drives and controls the imaging apparatus 12 and so on, and executes photographing processing. It performs display processing that outputs the image data obtained as the result of this photographing processing to the display unit 23 via the display driver 22 for display, and simultaneously performs recording processing that outputs this obtained image data to the recording medium 21 via the memory interface 20 for recording. Processing then returns to step S1 and the subsequent processing is repeated.
On the other hand, when it is judged in step S13 that a mode other than the photographing mode has been set and processing proceeds to step S20, in this step S20 the camera control part 11 confirms whether the mode of the camera 1 is set to the reproduction mode. When it is confirmed that the reproduction mode has been set, processing proceeds to the next step S21; when the set mode is not the reproduction mode, processing returns to step S1 and the subsequent processing is repeated.
In general, camera modes are broadly divided into a photographing mode and a reproduction mode, but some cameras also have modes other than these. In that case, a further branch step may be provided after the processing of step S20 to confirm the mode. However, since the effects of modes other than the photographing mode are not directly related to the present invention, their explanation and illustration are omitted. In the processing sequence of Fig. 5, when the currently set mode is neither the photographing mode nor the reproduction mode, processing simply returns to the initial step S1.
In step S21, the camera control part 11 controls the recording medium 21 through the memory interface 20 to read image data from the recording area of the recording medium 21, controls the display unit 23 through the display driver 22, and performs predetermined image reproduction processing.
Next, in step S22, the camera control part 11 confirms whether image change processing is to be performed. Image change processing refers, for example, to so-called in-camera image editing processing that changes various parameters of image data recorded in the recording medium 21 to generate images of different forms. This image change processing is performed by the image processing part 15 under the control of the camera control part 11.
When an operation instruction signal indicating that image change processing should be performed is detected in the processing of step S22, processing proceeds to step S23, where predetermined image change processing is performed. Processing then returns to step S1 and the subsequent processing is repeated.
When no operation instruction signal indicating that image change processing should be performed is detected in the processing of step S22, processing returns to step S1 and the subsequent processing is repeated.
As described above, according to the first embodiment, switching control is performed between the normal live view display mode (the first display mode), in which the images obtained at the predetermined period by the imaging apparatus 12 are displayed successively, and the second display mode, in which a plurality of images obtained at the predetermined period by the imaging apparatus 12 are added together and the image obtained as that result is displayed.
Therefore, for example, when a live view image of a dark scene is displayed and an arbitrary part of the display screen is designated by a touch operation, frame addition processing is performed automatically so that the subject in the designated image area reaches a brightness at which it can be identified; consequently, even for a dark scene, a live view image can be displayed at a level sufficient to identify the desired subject designated by the operation.
Specifically, when photographing a dark scene under low illumination, such as an outdoor ski slope at night or an indoor party venue, fine expressions and postures of the subject, the photographing range and so on can be confirmed reliably and sufficiently, so that the live view display can be used effectively.
[Second Embodiment]
A photographic apparatus (camera) according to the second embodiment of the present invention will now be described with reference to Figs. 7 to 19.
The basic configuration of the camera of the present embodiment is the same as that of the first embodiment; only its processing sequence differs somewhat.
In the camera 1 of the first embodiment, frame addition processing is performed so that live view display of a dark scene can be carried out reliably.
In the camera of the present embodiment, frame addition processing is performed at the following times:
(1) When live view display of a dark scene is performed, as in the first embodiment;
(2) When an automatic focusing (AF) action is performed during live view display;
(3) When a power zoom (automatic zoom) action is performed during live view display.
The case of (2) above is an example of frame addition processing that copes with the problem of the live view image becoming dark during an AF action. The case of (3) above is an example of frame addition processing performed in accordance with the change in image magnification caused by zoom action control during live view display.
Fig. 7 is a flowchart showing the camera control processing sequence of the photographic apparatus (camera) of the second embodiment of the present invention. Fig. 8 is a flowchart showing the subroutine of the live view display control (step S47 of Fig. 7) in the processing sequence of Fig. 7. Fig. 9 is a flowchart showing the subroutine of the multiple-image composition processing (step S59 of Fig. 8) in the processing sequence of Fig. 8. Fig. 10 is a flowchart showing the subroutine of the jitter correction processing applied to the image used in the previous display (step S62 of Fig. 9) in the processing sequence of Fig. 9.
First, assume that the power supply of the camera 1 is in the on state and that the camera has been started up and is operable. In this state, in step S41 of Fig. 7, the camera control part 11 confirms whether the action of the camera 1 is set to the reproduction mode. When it is confirmed that the reproduction mode is set, processing proceeds to step S42; when the set mode is not the reproduction mode, processing proceeds to step S43.
In step S42, the camera control part 11 controls the recording medium 21 through the memory interface 20 to read image data from the recording area of the recording medium 21, controls the display unit 23 through the display driver 22, and performs predetermined image reproduction processing. Processing then returns to step S41 and the subsequent processing is repeated.
On the other hand, in step S43, the camera control part 11 confirms whether live view display processing or dynamic image recording processing is in progress, that is, whether the camera is in a state of performing continuous frame image processing on the signal continuously output from the imaging apparatus 12. When either live view display processing or dynamic image recording processing is in progress, processing proceeds to step S44; when it is confirmed that neither is in progress, processing returns to step S41 and the subsequent processing is repeated.
Next, in step S44, the camera control part 11 performs predetermined communication processing with the lens control section 35 of the photographing lens barrel 30 by using the camera communication part 18 and the lens communication part 36 connected through the communication contacts 39a and 39b.
Next, in step S45, the camera control part 11 controls the exposure control part 14 to perform exposure calculation processing. This exposure calculation processing receives the output signal from the imaging apparatus 12, performs photometry, and calculates a suitable exposure value. The exposure calculation processing is the same as the ordinary processing performed in existing cameras.
Further, in step S46, the camera control part 11 controls the exposure control part 14 to perform exposure setting processing in accordance with the correct exposure value obtained by the exposure calculation processing of step S45. This exposure setting processing controls the shutter speed of the rolling shutter of the imaging apparatus 12, and controls the diaphragm of the aperture device 33 and so on by communicating with the lens control section 35 of the photographing lens barrel 30. Processing then proceeds to step S47.
In step S47, the camera control part 11 controls the imaging apparatus 12, the image processing part 15 and so on to perform live view display control processing, namely frame addition processing. Fig. 8, described later, shows the detailed flowchart of this live view display processing (frame addition processing).
In step S48, the camera control part 11 confirms whether a dynamic image recording operation is in progress. When it is confirmed that a dynamic image recording operation is in progress, processing proceeds to step S49; when it is confirmed that no dynamic image recording operation is in progress, that is, that a live view display action is in progress, processing proceeds to step S50.
In step S49, the camera control part 11 performs predetermined dynamic image recording processing. Processing then proceeds to step S50.
In step S50, the camera control part 11 performs live view display processing in accordance with the image data obtained as the result of the frame addition processing performed in step S47. Processing then returns to step S41 and the subsequent processing is repeated.
The details of the live view display control processing (frame addition processing) of step S47 will now be described with reference to the flowchart of Fig. 8.
In step S51 of Fig. 8, the camera control part 11 performs exposure processing through the exposure control part 14 in accordance with the output from the imaging apparatus 12, and performs processing that temporarily stores the obtained frame image data (RAW data) in the scratchpad memory 13. Processing then proceeds to step S52.
In step S52, the camera control part 11 confirms whether communication with the lens communication part 36 of the photographing lens barrel 30 is being performed through the camera communication part 18 and whether lens drive control for an AF action is being carried out, that is, whether an AF action has begun. This confirmation checks, in accordance with the output signal of the imaging apparatus 12, whether the camera is in a state of taking in frame image data for AF. The frame image data for AF corresponds to the image data in the photometry region shown by the symbol T1 in Fig. 12, described later, that is, the image data in the AF ranging region.
When it is confirmed in the processing of step S52 that the camera is in a state of taking in frame image data for AF, processing proceeds to step S54; when it is not in a state of taking in frame image data for AF, processing proceeds to step S53.
In step S53, the camera control part 11 performs image processing for normal live view display, in which the display unit 23 successively displays the images periodically obtained by the imaging apparatus 12. Processing then reverts to the original processing sequence of Fig. 7.
On the other hand, when it is judged in the processing of step S52 that an AF action has begun and processing proceeds to step S54, in this step S54 the camera control part 11 controls the image processing part 15 to perform focusing-subject feature extraction processing, which extracts the characteristics of the target subject brought into focus by the AF action of step S52. This focusing-subject feature extraction processing is, for example, face detection processing.
Next, in step S55, the camera control part 11 compares the exposure obtained by the AF action with the normal exposure temporarily stored in the processing of step S51.
Here, the phenomenon in which the live view image becomes darker during an AF action than at normal times (during live view display) will be explained briefly.
Fig. 13 shows the exposure control program diagrams of the camera 1 of the present embodiment. Fig. 13A is the exposure control program diagram at normal times (during live view display action), and Fig. 13B is the exposure control program diagram during an AF action.
As described above, during an AF action, control is performed so as to speed up the cycle of the vertical synchronizing signal (VD). In this case, if the frame rate at normal times (during live view display action) is 30 fps, the frame rate during the AF action is raised to, for example, 240 fps.
As shown in Fig. 13A, the exposure control program at normal times (during live view display action) can cope on the low-luminance side with, for example, the open aperture value f2 (AV2) and the longest shutter speed of 1/30 second (TV5) (index point Pa).
On the other hand, during an AF action the frame rate is 240 fps as described above, so the longest shutter speed corresponding to it is 1/250 second. Therefore, the exposure control program shown in Fig. 13B copes on the low-luminance side with, for example, the open aperture value f2 (AV2) and the longest shutter speed of 1/250 second (TV8) (index point Pb).
Thus, it can be seen that during an AF action the exposure on the low-luminance side may be up to three steps lower in brightness.
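The three-step figure follows directly from APEX arithmetic: moving the longest shutter time from 1/30 s (TV5) to 1/250 s (TV8) removes three time-value steps. A quick check:

```python
import math

def tv(shutter_s):
    """APEX time value: TV = log2(1 / shutter time in seconds)."""
    return math.log2(1.0 / shutter_s)

tv_live = tv(1 / 30)    # ~4.9, rounded to TV5 in the program diagram
tv_af   = tv(1 / 250)   # ~8.0, i.e. TV8
print(round(tv_af) - round(tv_live))   # -> 3 steps less exposure
```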
Returning to Fig. 8, when the condition "exposure at normal times < exposure during AF" holds in the processing of step S55, processing proceeds to step S56; when it does not hold, processing proceeds to step S57.
In step S56, the camera control part 11 controls the image processing part 15 to perform image correction processing based on signal processing such as digital gain and gamma correction. This image correction processing is applied to the image in the AF frame image area. Processing then reverts to the original processing sequence of Fig. 7.
In step S57, the camera control part 11 confirms whether two or more frames of obtained and temporarily stored frame image data (obtained frames) exist. When two or more obtained frames exist, processing proceeds to step S58; when there are fewer than two, processing proceeds to step S56 (described above).
In step S58, the camera control part 11 tracks the subject in accordance with the result of the focusing-subject feature extraction processing of step S54, and confirms whether the motion vector of the subject is large, that is, equal to or greater than a predetermined amount of movement. When the motion vector is judged to be large, processing proceeds to step S56 (described above); when it is judged to be small, processing proceeds to step S59.
In step S59, the camera control part 11 performs multiple-image composition processing (specifically, the subroutine shown in Fig. 9). Processing then proceeds to step S56 (described above).
Next, the details of the multiple-image composition processing (the processing of step S59 of Fig. 8) will be described with reference to the flowchart of Fig. 9.
In step S61 of Fig. 9, the camera control part 11 detects the amount of jitter in accordance with the signal of the angular velocity sensor 26 and so on, and confirms whether this amount of jitter is within the range of correctable jitter. When the jitter is correctable, processing proceeds to the next step S62; when jitter correction cannot be performed, processing proceeds to step S74.
In step S62, the camera control part 11 performs jitter correction processing (the subroutine shown in Fig. 10) in accordance with the image used in the previous display. Processing then proceeds to step S63.
The outline of the image data used for the jitter correction processing will be explained with reference to Figs. 15 to 17. Figs. 15 to 17 are explanatory diagrams of the jitter correction processing performed in the photographic apparatus (camera) of the present embodiment. Fig. 15 is a diagram showing the concept of the size of the light-receiving surface of the imaging apparatus and the size of the effective image region (display region). Fig. 16 is a schematic diagram showing a jittered image region on the light-receiving surface. Fig. 17 is a schematic diagram showing the image data obtained after the jitter correction processing.
In Fig. 15, the symbol 12a denotes the effective light-receiving surface of the imaging apparatus 12, and the symbol 23a denotes the effective image region, that is, the display region shown in the display frame of the display unit 23. Here the effective light-receiving surface 12a is shown with height H and width W, and the effective image region 23a with height Ha and width Wa.
When the center points of the rectangle shown by the symbol 12a and the rectangle shown by the symbol 23a are made to coincide and overlap, the situation shown in the lower part of Fig. 15 results. Here, taking the width direction as the X direction and the height direction as the Y direction, the symbol Xa denotes the amount of jitter that can be corrected in the X direction, and the symbol Ya denotes the amount of jitter that can be corrected in the Y direction. That is,
W = Wa + 2Xa
H = Ha + 2Ya.
In the camera 1 of the present embodiment having this imaging apparatus 12, jitter such as that shown in Fig. 16, for example, may occur. The amount of jitter at this time can be represented by the jitter amount X1 in the X direction and the jitter amount Y1 in the Y direction. Accordingly, in the case of the jitter amounts (X1, Y1), it can be derived that the rectangle of the effective image region 23a has moved within the region of the effective light-receiving surface 12a along the direction of arrow B in Fig. 16, by a length equivalent to this arrow, due to the jitter.
Therefore, the image data at this time becomes as shown in Fig. 17, that is, image data of height H1 and width W1, where
H1 = Ha + (Ya - Y1)
W1 = Wa + (Xa - X1).
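With the definitions above, the per-side correction margins and the size of the image data remaining after a jitter of (X1, Y1) follow arithmetically. A sketch with illustrative pixel counts (the concrete numbers are assumptions, not values from the disclosure):

```python
def correction_margins(W, H, Wa, Ha):
    """Per-side margins Xa, Ya between the effective light-receiving
    surface (W x H) and the effective image region (Wa x Ha):
    W = Wa + 2*Xa, H = Ha + 2*Ya (Fig. 15)."""
    return (W - Wa) // 2, (H - Ha) // 2

def corrected_size(Wa, Ha, Xa, Ya, X1, Y1):
    """Image data left after a jitter of (X1, Y1), per Fig. 17:
    W1 = Wa + (Xa - X1), H1 = Ha + (Ya - Y1)."""
    return Wa + (Xa - X1), Ha + (Ya - Y1)

Xa, Ya = correction_margins(W=4000, H=3000, Wa=3840, Ha=2880)
print(Xa, Ya)                                            # -> 80 60
print(corrected_size(3840, 2880, Xa, Ya, X1=30, Y1=20))  # -> (3890, 2920)
```

As long as X1 ≤ Xa and Y1 ≤ Ya, the remaining data still covers the full display region, which is exactly the "correctable jitter" range checked in step S61.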
Jitter correction processing is performed using the image data obtained as above together with the image data obtained immediately before this jitter.
The processing of step S62 (jitter correction processing) will be described with reference to the subroutine of Fig. 10.
First, in step S81, the camera control part 11 updates the entire image region (the effective light-receiving surface 12a).
Next, in step S82, the camera control part 11 updates the display region (the effective image region 23a). Processing then reverts to the original processing sequence of Fig. 9.
Returning to Fig. 9, in step S63 the camera control part 11 confirms whether a zoom magnification change operation has been performed. When a zoom operation is confirmed, processing proceeds to step S67; when no zoom operation has been performed, processing proceeds to step S64.
In step S64, the camera control part 11 uses the image used in the previous display (the symbol A of Fig. 18) as the image for composition. Fig. 18 is a diagram showing the concept of composition processing during magnification.
Next, in step S65, the camera control part 11 performs composition processing of the frame image obtained this time (the symbol B of Fig. 18, the symbol E of Fig. 19) and the above image for composition (at this time, the symbol A of Fig. 18). Fig. 19 is a diagram showing the concept of composition processing during reduction.
Next, in step S66, the camera control part 11 uses the image generated in the composition processing of step S65 (the symbol D of Fig. 18, the symbol D of Fig. 19) as the image used in the current display. The series of processing then ends, and processing reverts to the original processing sequence of Fig. 8.
On the other hand, when a zoom magnification change operation is confirmed in the processing of step S63 and processing proceeds to step S67, in this step S67 the camera control part 11 confirms whether this zoom magnification change operation is an operation performed using the zoom button among the operating members included in the operating portion 17. This confirmation is performed by monitoring, through the operation control part 16, the signal of the zoom switch linked to the zoom button of the operating portion 17. When a zoom button operation is confirmed, it is judged that a power zoom action is being performed, and processing proceeds to step S68; when it is not a zoom button operation, it is judged that a manual zoom operation has been performed, and processing proceeds to step S74.
In step S68, the camera control part 11 obtains the magnification ratio of the image realized by the above zoom operation (that is, calculated by dividing the current zoom magnification by the previous zoom magnification). Processing then proceeds to step S69.
Next, in step S69, the camera control part 11 confirms whether the magnification ratio is greater than 1 (magnification ratio > 1), that is, whether a zoom operation in the picture-magnifying direction has been performed. When magnification ratio > 1, processing proceeds to the next step S70; otherwise, processing proceeds to step S71.
In step S70, the camera control part 11 uses, as the image for composition, the image obtained by applying magnification processing to the image used in the previous display (the symbol C of Fig. 18). Processing then proceeds to step S65.
In step S71, the camera control part 11 obtains the possible reduction ratio of the image realized by the above zoom operation (that is, calculated by dividing the pixel count of the effective image region 23a by the pixel count of the entire image region (the effective light-receiving surface 12a)). Processing then proceeds to step S72.
Next, in step S72, the camera control part 11 confirms whether the magnification ratio is equal to or less than the possible reduction ratio (magnification ratio ≤ possible reduction ratio). When magnification ratio ≤ possible reduction ratio, processing proceeds to the next step S73; otherwise, processing proceeds to step S74.
In step S73, the camera control part 11 uses, as the image for composition, the image obtained by reducing the image used in the previous display (the symbol F of Fig. 19). Processing then proceeds to step S65.
In step S74, the camera control part 11 sets the obtained-frame count at this point to 1.
Next, in step S75, the camera control part 11 uses the frame image obtained this time as the image used in the current display. Processing then reverts to the original processing sequence of Fig. 8.
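The branch of steps S63 through S75 reduces to choosing what to composite with the newly obtained frame: the previous display image as-is when there was no zoom, a magnified copy when the magnification ratio exceeds 1, a reduced copy when the ratio stays within the possible reduction ratio, and the current frame alone otherwise. A schematic sketch of that decision (the function and variable names are illustrative only):

```python
def choose_composition(prev_zoom, cur_zoom, power_zoom,
                       display_px, sensor_px):
    """Return which image to composite with the current frame,
    following steps S63-S75 of Fig. 9."""
    if cur_zoom == prev_zoom:                    # S63: no zoom operation
        return "previous image as-is"            # S64
    if not power_zoom:                           # S67: manual zoom
        return "current frame only"              # S74/S75
    ratio = cur_zoom / prev_zoom                 # S68
    if ratio > 1:                                # S69 -> S70
        return "previous image, magnified"
    min_ratio = display_px / sensor_px           # S71: possible reduction
    if ratio <= min_ratio:                       # S72 -> S73
        return "previous image, reduced"
    return "current frame only"                  # S74/S75

print(choose_composition(2.0, 3.0, True, 2_000_000, 16_000_000))
# -> previous image, magnified
```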
The outline of the frame image addition processing (frame addition processing) performed when an AF action is carried out during live view display action in the photographic apparatus (camera) of the present embodiment, as described above, will now be explained briefly with reference to Fig. 11.
Fig. 11 is a timing chart of the action when an AF action is performed during live view display action in the photographic apparatus (camera) of the present embodiment.
Fig. 11A shows the vertical synchronizing signal VD of the imaging apparatus 12. As shown in the figure, suppose that at the timing shown by the symbol A, for example, a first-stage release operation has been performed and an instruction signal for starting an AF action (an instruction signal from the first-stage release switch) has been generated. In response, the camera control part 11 controls the imaging apparatus 12 so as to speed up the cycle of the vertical synchronizing signal VD.
Fig. 11B shows the exposure of each frame. The diamonds shown by the symbols LV(a), LV(b), ... represent the exposure of each frame during live view display action, and the diamond-shaped areas shown by the symbols AF(a), AF(b), ... represent the exposure of each frame during the AF action. The representation of each frame's exposure by a diamond reflects the line-sequential readout scheme (rolling shutter) of the exposure. The exposure of one frame takes the center moment of its exposure as the timing of the vertical synchronizing signal VD.
As described above, immediately after the AF action begins, the cycle of the vertical synchronizing signal (VD) is sped up, so the exposure period of one frame is shorter, and the exposure smaller, than at normal times (during live view display). In Fig. 11B, the length along the horizontal axis represents the exposure period, and the area of the diamond represents the exposure. As shown in the figure, the relation exposure LV(x) > exposure AF(x) holds (where (x) is (a), (b), ...).
Fig. 11C shows the lens communication synchronizing signal. This lens communication synchronizing signal is a signal that synchronizes the body-side frame rate and the lens-side frame rate, and is synchronized with the vertical synchronizing signal (VD) of Fig. 11A.
Fig. 11D shows the lens position acquisition signal. This lens position acquisition signal is a signal transmitted from the body side to the lens side at the timing of the exposure center. Accordingly, the output of the lens position acquisition signal of Fig. 11D inverts at the exposure midpoint of each frame shown in Fig. 11B. At the output timing of this lens position acquisition signal, the camera control part 11 acquires the state of the lens side (for example, the aperture value, the lens position and so on).
Fig. 11E shows the body (B)-lens (L) communication timing. This B-L communication timing is the timing of the mutual communication performed between the body side and the lens side. Through this mutual communication, the state information of both the body and the lens is exchanged for every frame.
Fig. 11F shows the AE control timing. Here, AE control is control for reflecting the exposure value obtained from the exposure action of the previous frame in the exposure action of the next frame. The symbols AE(b), AE(c), AE(d), ... shown in Fig. 11F indicate that the exposure values obtained in the exposure actions of the respective preceding frames are reflected in the exposure actions of the respective corresponding next frames.
For example, the exposure value of the symbol AE(b) of Fig. 11F is applied to the frame exposure action shown by the symbol LV(b) of Fig. 11B. Similarly, the exposure value of the symbol AE(c) of Fig. 11F is applied to the frame exposure action shown by the symbol AF(a) of Fig. 11B, and the exposure value of the symbol AE(d) of Fig. 11F is applied to the frame exposure action shown by the symbol AF(b) of Fig. 11B.
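The one-frame delay of Fig. 11F, in which each frame is exposed with the value metered from the frame before it, can be modeled as a simple pipeline. This is a schematic only; the metering itself is abstracted away and the labels are illustrative:

```python
def ae_pipeline(metered_values):
    """Each frame's exposure uses the value metered from the
    previous frame (Fig. 11F); the first frame has no feedback yet."""
    applied = [None]                   # no prior frame for frame 0
    applied += metered_values[:-1]     # value from frame n drives frame n+1
    return applied

# Values metered from frames LV(a), LV(b), AF(a), AF(b):
metered = ["AE(a)", "AE(b)", "AE(c)", "AE(d)"]
print(ae_pipeline(metered))
# -> [None, 'AE(a)', 'AE(b)', 'AE(c)']
```

This delay is what makes the first AF frame still expose under normal conditions while later AF frames expose under AF conditions, as the description of timings P3 and P4 below illustrates.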
Here, as described above, when the AF action begins at the timing of the symbol A of Fig. 11A, the cycle of the vertical synchronizing signal (VD) is sped up from the moment of the symbol A (Fig. 11A), and the exposure period is accordingly shortened (Fig. 11B).
When the AF action begins, the AE control, that is, the automatic exposure control including the photometry action, becomes different from the control at normal times (during live view display).
For example, regarding the photometry region, as shown in Fig. 12, the photometry region during live view display differs from the photometry region during the AF action. Fig. 12 is a diagram showing the photometry regions within the effective light-receiving surface 12a of the imaging apparatus 12.
The substantially elliptical region shown by the symbol T2 of Fig. 12 serves as the photometry region during live view display, and the rectangular region shown by the symbol T1 of Fig. 12 serves as the photometry region during the AF action. That is, the photometry region T2 during live view display is set to roughly the whole of the effective light-receiving surface 12a of the imaging apparatus 12 except for the vicinity of the four peripheral corners, whereas the photometry region T1 during the AF action is limited to a predetermined area including only the substantially central section of the effective light-receiving surface 12a. The photometry region T1 during the AF action corresponds, for example, to the AF ranging region set when the AF action is performed.
As described above, during live view display photometry is performed with roughly the whole of the effective image region as the photometry object, whereas during the AF action photometry is performed with the AF image region as the photometry object, in accordance with the image information of this narrower region. Therefore, the exposure values obtained by the respective photometry actions during live view display and during the AF action sometimes differ. In that case, the lightness of the display image becomes uneven between live view display and the AF action.
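The disagreement between the two exposure values can be seen by metering the same sensor data over the two regions, roughly the whole surface (T2) versus only the central AF rectangle (T1). A toy sketch with a scene that is bright in the center and dark at the edges; the region shapes are simplified to rectangles (the real T2 is substantially elliptical), and the pixel values are invented:

```python
import numpy as np

def mean_luma(frame, region):
    """Average luminance over a rectangular region (y0, y1, x0, x1)."""
    y0, y1, x0, x1 = region
    return float(frame[y0:y1, x0:x1].mean())

frame = np.full((8, 8), 20.0)      # dark surroundings
frame[3:5, 3:5] = 200.0            # bright subject in the center

t2 = (0, 8, 0, 8)                  # live view: ~whole light-receiving surface
t1 = (3, 5, 3, 5)                  # AF: central ranging region only
print(mean_luma(frame, t2), mean_luma(frame, t1))
# -> 31.25 200.0  (the two photometry actions disagree)
```

A camera metering on T1 would expose far less than one metering on T2, which is the unevenness of display lightness that the frame addition below is meant to smooth over.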
So the demonstration image during the 1 couple of AF of the camera of present embodiment action carries out the addition of two field picture to be processed, so that the demonstration image of live view image when moving with AF when showing becomes even.
Figure 11G shows the image-processing timings at which frame images are generated, denoted by symbols P1, P2, P3, P4, P5, ....
Figure 11H shows the display timings of the display images, denoted by symbols D1, D2, D3, D4, D5, ....
The frame images processed at timings P1 and P2 of Figure 11G are based on the image data of the exposure operations LV(a) and LV(b) of Figure 11B. The exposure operations LV(a) and LV(b) are performed using the exposure value obtained from the previous frame image. Therefore, the frame images processed at timings P1 and P2 of Figure 11G are formed under the normal exposure condition, and display processing is performed on them directly at the timings indicated by symbols D1 and D2 of Figure 11H. The AF operation then begins at a subsequent point.
The frame image processed at timing P3 of Figure 11G is based on the exposure operation AF(a) of Figure 11B. The exposure operation AF(a) is performed using the exposure value obtained from the preceding frame image LV(b), so the frame image processed at timing P3 is also formed under the normal exposure condition, and display processing is performed on it directly at the timing indicated by symbol D3 of Figure 11H.
Next, the frame image processed at timing P4 of Figure 11G is based on the exposure operation AF(b) of Figure 11B. Here, the exposure operation AF(b) is performed using the exposure value obtained from the previous frame image AF(a). Accordingly, the frame images processed from timing P4 of Figure 11G onward are all formed under the AF-operation exposure condition, so the exposure value may change. Therefore, in the display processing performed at the timing indicated by symbol D4 of Figure 11H, the frame image processed at the corresponding timing P4 is added to the previous frame image (timing P3).
Similarly, the frame image processed at timing P5 of Figure 11G undergoes display processing at timing D5 of Figure 11H. In this display processing, the frame image processed at the corresponding timing P5 is added to the previous frame image (timing P4). The same processing is performed thereafter.
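The addition scheme just described — frames exposed under the normal condition are displayed directly, and from the second AF-condition frame onward each display image is the sum of the current frame and the previous frame — can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name, the `af_start` index, and the 8-bit clipping are assumptions for the sketch.

```python
import numpy as np

def display_frames(frames, af_start):
    """Produce display images from a sequence of frame images.

    Frames with index <= af_start (normal-exposure frames, e.g. P1..P3)
    are displayed directly; later frames (AF-exposure frames, e.g. P4,
    P5, ...) are each added to the previous frame image before display,
    with the sum clipped back to the 8-bit range.
    """
    out = []
    for i, frame in enumerate(frames):
        if i <= af_start:
            # Normal exposure condition: display the frame as-is.
            out.append(frame)
        else:
            # AF exposure condition: add current frame to the previous
            # *captured* frame (not the previous displayed sum).
            summed = frame.astype(np.uint16) + frames[i - 1].astype(np.uint16)
            out.append(np.clip(summed, 0, 255).astype(np.uint8))
    return out
```

Note that each sum uses the previous captured frame rather than the previous displayed result, matching the pairwise addition (P4+P3, P5+P4, ...) described above.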
The number of frame images to add in the frame-image addition processing described above can be obtained as follows.
The exposure difference BVerr between the exposure BVa during live view image display and the exposure BVb during the AF operation can be expressed as BVerr = BVa - BVb.
The AF frame-image composite number (ConbAfFrameNum; CAFN) is then CAFN = 2^BVerr. The number of frame images to add is set according to this AF frame-image composite number.
Further, in a situation where adding only the acquired AF frame images (frame images obtained during the AF operation) is insufficient, LV frame images (frame images obtained during normal live view image display before the AF operation started) may be added as well.
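Numerically, the relation CAFN = 2^BVerr means each 1 EV of exposure difference doubles the number of frames to add; the fallback to LV frames covers any shortfall in acquired AF frames. A small sketch, with illustrative function names (the patent does not define this API):

```python
def af_composite_number(bva, bvb):
    """AF frame-image composite number CAFN = 2^BVerr, where
    BVerr = BVa - BVb is the exposure difference in EV between
    live view display (BVa) and the AF operation (BVb)."""
    bverr = bva - bvb
    # With no exposure difference, a single frame suffices (no addition).
    return max(1, 2 ** bverr)

def frames_to_add(cafn, af_frames_available):
    """Split the composite number between acquired AF frames and,
    when those are insufficient, pre-AF LV frames."""
    af = min(cafn, af_frames_available)
    lv = cafn - af  # shortfall covered by live view frames
    return af, lv
```

For example, a 2 EV difference gives CAFN = 4; if only 2 AF frames have been acquired so far, 2 LV frames make up the remainder.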
Next, as a further operation of the photographic equipment (camera) of the present embodiment, the frame-image addition processing (frame addition processing) performed when a zoom operation occurs during the live view image display operation is briefly described below using Figure 14.
Figure 14 is a timing chart showing the operation of the photographic equipment (camera) of the present embodiment when a zoom operation is performed during the live view image display operation.
Figure 14A shows the vertical synchronizing signal VD of the image sensor 12. Figure 14B shows the exposure of each frame, depicting in detail the rhombuses denoted LV(a), LV(b), ... in Figure 11B. During the live view image display operation, each frame is exposed by resetting, exposing, and reading out row by row; therefore, as time passes, a lag arises between the readout moment of the first row and the readout moment of the last row.
Figure 14C shows the operation instruction signal output in response to operation of the zoom button included among the operation members of the operation section 17. The example of Figure 14C shows the zoom button operation beginning at symbol Z1 and continuing until the moment of symbol Z2.
Figure 14D shows the zoom-lens drive timing linked to the zoom button operation. In this case, the zoom lens receives the ON signal of Figure 14C and starts driving at the moment of symbol L1, then stops at the moment of symbol L2. During the interval from symbol L2 to symbol L3, although the zoom button operation continues as shown in Figure 14C, zoom-lens driving is stopped (inhibited). Driving resumes at the moment of symbol L3 of Figure 14D, and at the moment of symbol L4 the end of the zoom button operation (the instruction signal stops; symbol Z2) is received and zoom-lens driving stops as well.
In this case, the interval from symbol L2 to symbol L3 of Figure 14D coincides, as shown in Figure 14B, with the second exposure period in the figure. Accordingly, as shown in Figure 14D, zoom-lens driving is stopped (inhibited) within this period.
Figure 14E shows the change in zoom magnification. First, the interval indicated by symbol A in the initial stage holds the magnification in effect before the zoom operation; since the zoom button operation of Figure 14C has not yet occurred at this point, the zoom magnification does not change. The next interval, indicated by symbol B, is the first zoom-drive interval. The magnification begins to change simultaneously with the start of the first zoom drive, i.e. at symbol L1 of Figure 14D, and the magnification change ends at the moment of symbol L2. This is because, at the moment of symbol L2 of Figure 14D, the exposure of the second frame begins and zoom-lens driving stops.
Then, in the interval indicated by symbol C of Figure 14E, the magnification reached at the end of the first zoom operation is maintained and the zoom magnification does not change. This corresponds to the zoom-inhibited interval from symbol L2 to symbol L3 of Figure 14D.
Next, the interval indicated by symbol D of Figure 14E is the second zoom-drive interval, corresponding to the interval from symbol L3 to symbol L4 of Figure 14D.
The interval indicated by symbol E of Figure 14E maintains the zoom magnification reached after the second zoom drive ends.
Figure 14F shows the timing of the image enlargement and reduction processing. In the example of this figure, the image is enlarged or reduced according to the zoom magnification changed as a result of the first zoom drive (symbol C of Figure 14E) relative to the magnification before the zoom operation (symbol A of Figure 14E).
Figure 14G shows the timing of the frame-image addition processing. In the example of this figure, the frame-image addition processing adds the frame image of the first exposure (the image before the zoom operation) to the frame image of the second exposure (the image whose zoom magnification was changed by the first zoom drive).
Figure 14H shows the image-processing timings, and Figure 14I shows the image-display timings. The image processing performed at the timing indicated by symbol P1 of Figure 14H is based on the image data obtained by the first exposure of Figure 14B; this is the normal image processing performed before the zoom operation. The result of this image processing is displayed at the timing indicated by symbol H1 of Figure 14I.
The image processing performed at the timing indicated by symbol P2 of Figure 14H is based on the image data obtained by the second exposure of Figure 14B. Here, the image obtained by the frame-image addition processing of Figure 14G, i.e. the image whose zoom magnification was changed as a result of the first zoom operation, is processed. The result of this image processing is displayed at the timing indicated by symbol H2 of Figure 14I.
Thus, when an automatic zoom operation based on a zoom button operation is performed during the live view image display operation, the zoom operation is stopped within the exposure period of the image sensor 12 even if the button operation continues.
The zoom operation is then restarted before the next exposure period, and frame addition is performed by appropriately applying enlargement/reduction image compositing to the frame images obtained before and after the period in which the zoom operation was stopped.
As described above, according to the second embodiment, when an AF operation or a zoom operation is performed during the live view image display operation, display is switched from the first display mode (the normal live view image display mode) to the second display mode (the live view image display mode accompanied by frame addition processing), so that a uniform live view image can be displayed continuously at all times.
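The compositing described here — scale the pre-zoom frame to the post-zoom magnification, then add it to the post-zoom frame — can be sketched as follows. The nearest-neighbour centre crop/resize below is only a stand-in for the camera's enlargement processing, and the function names are illustrative, not from the patent.

```python
import numpy as np

def magnify(frame, ratio):
    """Enlarge a grayscale frame by centre-cropping 1/ratio of it and
    resizing back to the original size with nearest-neighbour sampling
    (a stand-in for the camera's enlargement/reduction processing)."""
    h, w = frame.shape
    ch, cw = max(1, int(h / ratio)), max(1, int(w / ratio))
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    crop = frame[y0:y0 + ch, x0:x0 + cw]
    # Nearest-neighbour index maps back to the full (h, w) size.
    yi = np.arange(h) * ch // h
    xi = np.arange(w) * cw // w
    return crop[np.ix_(yi, xi)]

def zoom_frame_addition(pre_zoom, post_zoom, ratio):
    """Scale the pre-zoom frame to the new magnification, then add it
    to the post-zoom frame, clipping the sum to the 8-bit range."""
    scaled = magnify(pre_zoom, ratio).astype(np.uint16)
    summed = scaled + post_zoom.astype(np.uint16)
    return np.clip(summed, 0, 255).astype(np.uint8)
```

Scaling before addition keeps the two frames geometrically aligned, so the added result does not show a doubled outline at the new zoom magnification.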
Regarding the processing sequences described in each of the above embodiments, changes to their steps are permitted as long as the nature of the processing is not violated. The above processing sequences may therefore be carried out, for example, with the execution order of the steps changed, with a plurality of processing steps executed simultaneously, or with the order of the steps differing each time a series of processing sequences is executed.
Further, the present invention is not limited to the above embodiments, and various modifications and applications can of course be implemented within a scope that does not depart from the spirit of the invention. Furthermore, the above embodiments include inventions at various stages, and various inventions can be extracted by appropriately combining the disclosed constituent elements. For example, if the problem the invention seeks to solve can still be solved and the effect of the invention still obtained even when some constituent elements are deleted from all the constituent elements shown in the above embodiments, the configuration with those constituent elements deleted can also be extracted as an invention.
The present invention is not limited to photographic equipment such as digital cameras, i.e. electronic equipment specialized for the imaging function, and can also be applied to other forms of electronic equipment having an imaging function, for example mobile phones, audio recording equipment, electronic notebooks, personal computers, game machines, televisions, clocks, and navigation devices using GPS (Global Positioning System), and other such electronic equipment with an imaging function.

Claims (4)

1. Photographic equipment, characterized by comprising:
an image pickup unit that photographs a subject at a predetermined period;
a display unit that displays images obtained by the above-mentioned image pickup unit; and
a display control unit that controls the above-mentioned display unit,
wherein the above-mentioned display control unit performs switching control between a first display mode and a second display mode, the first display mode successively displaying the images obtained at the predetermined period by the above-mentioned image pickup unit, and the second display mode adding a plurality of images obtained at the predetermined period by the above-mentioned image pickup unit and displaying the image obtained as the result of the addition.
2. The photographic equipment according to claim 1, characterized in that, when performing display control based on the above-mentioned second display mode, the above-mentioned display control unit changes the images to be added according to an operation instruction signal input to the equipment.
3. The photographic equipment according to claim 2, characterized in that the above-mentioned operation instruction signal is an operation instruction signal instructing an AF operation.
4. The photographic equipment according to claim 2, characterized in that the above-mentioned operation instruction signal is an operation instruction signal instructing a zoom operation.
CN2012103273980A 2011-09-08 2012-09-06 Photographic equipment Pending CN103002223A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011-196259 2011-09-08
JP2011196259A JP2013058923A (en) 2011-09-08 2011-09-08 Photographing apparatus

Publications (1)

Publication Number Publication Date
CN103002223A true CN103002223A (en) 2013-03-27

Family

ID=47930309

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2012103273980A Pending CN103002223A (en) 2011-09-08 2012-09-06 Photographic equipment

Country Status (2)

Country Link
JP (1) JP2013058923A (en)
CN (1) CN103002223A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104144294A (en) * 2013-05-10 2014-11-12 奥林巴斯株式会社 Image pickup apparatus, and image correction method
CN105100594A (en) * 2014-05-12 2015-11-25 奥林巴斯株式会社 Imaging device and imaging method
CN109522861A (en) * 2018-11-28 2019-03-26 西南石油大学 A kind of micro- expression recognition method of face multiclass

Families Citing this family (3)

Publication number Priority date Publication date Assignee Title
JP6227083B2 (en) * 2016-10-04 2017-11-08 オリンパス株式会社 Imaging apparatus and imaging method
JP7450408B2 (en) 2020-03-05 2024-03-15 キヤノン株式会社 Electronic devices, their control methods, programs and storage media
WO2021226990A1 (en) * 2020-05-15 2021-11-18 深圳市大疆创新科技有限公司 Photographic device and photographing method applicable to photographing extreme scenario

Citations (4)

Publication number Priority date Publication date Assignee Title
CN1196632A (en) * 1998-01-22 1998-10-21 张琪 Optimization method for software of camera pickup at low light level static state
CN101594464A (en) * 2008-05-26 2009-12-02 奥林巴斯映像株式会社 Imaging device and formation method
CN101617339A (en) * 2007-02-15 2009-12-30 索尼株式会社 Image processing apparatus and image processing method
JP2010252006A (en) * 2009-04-15 2010-11-04 Casio Computer Co Ltd Imaging apparatus, photographing method, and program thereof

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
JP4794963B2 (en) * 2005-06-28 2011-10-19 キヤノン株式会社 Imaging apparatus and imaging program
JP2007189639A (en) * 2006-01-16 2007-07-26 Matsushita Electric Ind Co Ltd Digital camera
JP2010160311A (en) * 2009-01-08 2010-07-22 Panasonic Corp Imaging apparatus


Cited By (5)

Publication number Priority date Publication date Assignee Title
CN104144294A (en) * 2013-05-10 2014-11-12 奥林巴斯株式会社 Image pickup apparatus, and image correction method
CN104144294B (en) * 2013-05-10 2018-01-23 奥林巴斯株式会社 Capture apparatus and method for correcting image
CN105100594A (en) * 2014-05-12 2015-11-25 奥林巴斯株式会社 Imaging device and imaging method
CN105100594B (en) * 2014-05-12 2018-10-12 奥林巴斯株式会社 Photographic device and image capture method
CN109522861A (en) * 2018-11-28 2019-03-26 西南石油大学 A kind of micro- expression recognition method of face multiclass

Also Published As

Publication number Publication date
JP2013058923A (en) 2013-03-28

Similar Documents

Publication Publication Date Title
CN101202841B (en) Imaging apparatus and exposal control method for the same
CN103248813B (en) Photographic equipment and method of controlling operation thereof thereof
CN101355655B (en) Image pickup apparatus
US7852401B2 (en) Photographing apparatus and photographing method for exposure control during continuous photographing mode
CN103002211B (en) Photographic equipment
CN100576054C (en) Imaging device and light shielding member
CN103002223A (en) Photographic equipment
CN102957864A (en) Imaging device and control method thereof
JP5022758B2 (en) Imaging apparatus, imaging system, and driving method of imaging apparatus
JPS60143330A (en) Photographic device
CN104980644A (en) Shooting method and device
JP2009225072A (en) Imaging apparatus
JP2019161272A (en) Imaging apparatus, imaging method, program and recording medium
CN104079837A (en) Focusing method and device based on image sensor
CN109691085A (en) Photographic device and camera shooting control method
JP2011135185A (en) Imaging device
JP2015154409A (en) Imaging apparatus, control method of imaging apparatus, program, and storage medium
JP2015144346A (en) Imaging apparatus, imaging method and program
JP2009164892A (en) Image processing apparatus, control method, and program
CN102081294A (en) Image pickup apparatus
CN106101521A (en) Camera head and the control method of camera head
JP2018006828A (en) Imaging apparatus, control method therefor and program
JP6300569B2 (en) Imaging apparatus and control method thereof
CN106303212B (en) Filming apparatus and image pickup method
CN101403846B (en) Imaging device, and control method for imaging device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C41 Transfer of patent application or patent right or utility model
TA01 Transfer of patent application right

Effective date of registration: 20151215

Address after: Tokyo, Japan

Applicant after: Olympus Corporation

Address before: Tokyo, Japan

Applicant before: Olympus Imaging Corp.

RJ01 Rejection of invention patent application after publication

Application publication date: 20130327