CN102209196B - Image processing apparatus and image evaluation method - Google Patents

Image processing apparatus and image evaluation method

Info

Publication number
CN102209196B
CN102209196B (Application CN201110083627.4A)
Authority
CN
China
Prior art keywords
image
factor
evaluation
sets
eliminating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201110083627.4A
Other languages
Chinese (zh)
Other versions
CN102209196A (en)
Inventor
坂本浩一
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nikon Corp
Original Assignee
Nikon Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2011046188A (JP4998630B2)
Application filed by Nikon Corp
Priority to CN201610416864.0A (CN105939456A)
Publication of CN102209196A
Application granted
Publication of CN102209196B
Legal status: Active


Landscapes

  • Studio Devices (AREA)

Abstract

An image processing apparatus and an image evaluation method capable of quickly selecting the best shooting candidate images are provided. The image processing apparatus (1) comprises: a first computing unit (11) that calculates, for each of a plurality of images contained in an input image group, an evaluation value based on first factors; an exclusion unit (11) that, based on the computation results of the first computing unit (11), excludes from the image group those images whose computation results satisfy a prescribed condition; a second computing unit (11) that calculates, for each image contained in the image group after the exclusion processing, an evaluation value based on second factors, the second factors including factors different from the first factors; and an evaluation processing unit (11) that, based on the computation results of the second computing unit (11), ranks the images in the image group after the exclusion processing.

Description

Image processing apparatus and image evaluation method
Technical field
The present invention relates to an image processing apparatus and an image evaluation method.
Background Art
A camera that selects the best shot from a plurality of images is known (see Patent Document 1). In this prior art, images free of image shake are first selected, and the image with optimum exposure is then selected based on the histograms of those shake-free images.
Patent Document 1: Japanese Unexamined Patent Application Publication No. 2006-311340
In the prior art, the presence or absence of image shake is judged for every image before the best shooting candidate is selected, so the processing is time-consuming. That is, the judgment based on the optimum factor (absence of image shake) is carried out on all images.
Summary of the invention
The image processing apparatus of the present invention is characterized by comprising: a first computing unit that calculates, for each of a plurality of images contained in an input image group, an evaluation value based on first factors; an exclusion unit that, based on the computation results of the first computing unit, excludes from the image group those images whose computation results satisfy a prescribed condition; a second computing unit that calculates, for each image contained in the image group after the exclusion processing, an evaluation value based on second factors, the second factors including factors different from the first factors; and an evaluation processing unit that, based on the computation results of the second computing unit, ranks the images in the image group after the exclusion processing.
The image evaluation method of the present invention is characterized by including: a first step of inputting an image group containing a plurality of images; a second step of calculating, for each image contained in the image group, an evaluation value based on first factors; a third step of excluding from the image group, based on the computation results of the second step, those images whose computation results satisfy a prescribed condition; a fourth step of calculating, for each image contained in the image group after the third step, an evaluation value based on second factors, the second factors including factors different from the first factors; and a fifth step of ranking, based on the computation results of the fourth step, the images in the image group after the third step.
Invention effect
According to the present invention, the best shooting candidate images can be selected quickly.
Brief description of the drawings
Fig. 1 is a block diagram showing the main configuration of an electronic camera according to one embodiment of the present invention.
Fig. 2 is a diagram explaining the timing of acquiring images in the pre-capture shooting mode.
Fig. 3 is a flowchart showing the flow of the image evaluation processing executed by the host CPU.
Fig. 4 is a table showing the relationship between the flags Case_PF and Case_MV and the corresponding evaluation processing items.
Fig. 5(a) is a diagram showing the frame-out region for a landscape orientation, and Fig. 5(b) is a diagram showing the frame-out region for a portrait orientation.
Fig. 6 is a diagram showing the AF area and the central small area.
Fig. 7 is a diagram showing evaluation target regions.
Fig. 8 is a diagram explaining calculation of the evaluation value for composition optimality.
Fig. 9 is a diagram illustrating a computer device.
Fig. 10 is a diagram showing a display screen of the LCD monitor.
Description of reference numerals:
1 electronic camera
11 host CPU
14 display image generation circuit
15 buffer memory
18 operation member
19 LCD monitor
22 imaging apparatus
Detailed description of the invention
Hereinafter, an embodiment of the present invention will be described with reference to the drawings. Fig. 1 is a block diagram showing the main configuration of an electronic camera 1 according to one embodiment of the present invention. The electronic camera 1 is controlled by a host CPU 11.
A shooting lens 21 forms a subject image on the imaging surface of an imaging apparatus 22. The imaging apparatus 22 is composed of a CCD image sensor or a CMOS image sensor, captures the subject image on the imaging surface, and outputs an image pickup signal to an imaging circuit 23. The imaging circuit 23 performs analog processing (gain control and the like) on the photoelectric conversion signal output from the imaging apparatus 22, and also converts the analog image pickup signal into digital data with a built-in A/D conversion circuit.
The host CPU 11 receives the signals output from each block, performs prescribed computations, and outputs control signals based on the computation results to each block. An image processing circuit 12 is constituted by, for example, an ASIC (Application Specific Integrated Circuit), and performs image processing on the digital image signal input from the imaging circuit 23. The image processing includes, for example, contour emphasis, color temperature adjustment (white balance adjustment), and format conversion of the image signal.
An image compression circuit 13 performs image compression processing at a prescribed compression ratio, for example in JPEG format, on the image signal processed by the image processing circuit 12. A display image generation circuit 14 generates a display signal for displaying the captured image on an LCD monitor 19.
The LCD monitor 19 is constituted by a liquid crystal panel and displays images, operation menu screens, and the like based on the display signal input from the display image generation circuit 14. An image output circuit 20 generates and outputs, based on the display signal input from the display image generation circuit 14, a video signal for displaying images, operation menu screens, and the like on an external display device.
A buffer memory 15 temporarily stores data before, during, and after image processing, and is also used to store image files before they are recorded on a recording medium 30 and image files read from the recording medium 30. In the present embodiment, it is also used to temporarily store pre-capture images acquired by the imaging apparatus 22 at a prescribed frame rate before the shooting instruction (before the full-press operation of the release button). The pre-capture images will be described later.
A flash memory 16 stores programs executed by the host CPU 11, data required for the processing performed by the host CPU 11, and the like. The contents of the programs and data stored in the flash memory 16 can be added to or changed according to instructions from the host CPU 11.
A card interface (I/F) 17 has a connector (not shown) to which a recording medium 30 such as a memory card is connected. The card interface 17 writes data to the connected recording medium 30 and reads data from the recording medium 30 according to instructions from the host CPU 11. The recording medium 30 is constituted by a memory card with a built-in semiconductor memory, a hard disk drive, or the like.
An operation member 18 includes various buttons and switches of the electronic camera 1, and outputs operation signals corresponding to the operation content of each member, such as switching of a mode selector switch, to the host CPU 11. A half-press switch 18a and a full-press switch 18b are interlocked with the press operation of a release button (not shown), and each outputs an ON signal to the host CPU 11. The ON signal from the half-press switch 18a (half-press operation signal) is output when the release button is pressed down to about half of its normal stroke, and the output is released when the half-stroke press is released. The ON signal from the full-press switch 18b (full-press operation signal) is output when the release button is pressed down to its normal stroke, and the output is released when the normal-stroke press is released. The half-press operation signal instructs the host CPU 11 to start shooting preparation. The full-press operation signal instructs the host CPU 11 to start acquiring images for recording.
<Shooting modes>
The electronic camera 1 has: a normal shooting mode, in which a shot image is acquired frame by frame according to the full-press operation signal described above and recorded on the recording medium 30; and a pre-capture shooting mode, in which, upon receiving the half-press operation signal described above, still images are continuously shot at 120 frames per second (120 fps) with a fast shutter speed (for example, faster than 1/125 second) to acquire continuously shot images of a plurality of frames, and, upon receiving the full-press operation signal described above, prescribed frame images before and after the moment when the full-press operation signal was received are recorded on the recording medium 30. The shooting modes can be switched according to operation signals from the operation member 18.
<Playback mode>
In the playback mode, the electronic camera 1 plays back and displays, on the LCD monitor 19, the images recorded in the above shooting modes, either frame by frame or by a specified number of frames.
Since the present embodiment is characterized by the pre-capture shooting mode described above, the following description focuses on the pre-capture shooting mode. Fig. 2 is a diagram explaining the timing of acquiring images in the pre-capture shooting mode.
<Pre-capture shooting>
In Fig. 2, when the half-press operation signal is input at time t0, the host CPU 11 starts release standby processing. In the release standby processing, the subject image is shot at a frame rate of, for example, 120 frames per second (120 fps), exposure computation and focus adjustment are performed, and the acquired image data are stored in the buffer memory 15 one after another.
Sufficient capacity of the buffer memory 15 is secured in advance for pre-capture shooting. After time t0, when the number of frame images stored in the buffer memory 15 reaches a prescribed number (for example, A frames), the host CPU 11 overwrites the frame images successively, starting from the oldest one. This makes it possible to limit the buffer memory 15 capacity used for pre-capture shooting.
When the full-press operation signal is input at time t1, the host CPU 11 starts release processing. In the release processing, the (A+B) frame images obtained by adding the A frame images shot before time t1 and the B frame images shot after time t1 are treated as recording candidate images for the recording medium 30.
The host CPU 11 selects, from the (A+B) frame images, C frame images (C < A+B), the number being specified in advance by menu operation, associates these C frame images with one another, and stores them on the recording medium 30. The black band in Fig. 2 represents the interval during which the recording candidate images, i.e. the (A+B) frame images, are acquired. The hatched band represents the interval of frame images that were acquired and temporarily stored in the buffer memory 15 but were eliminated by overwriting.
In the present embodiment, the recording mode can be switched, according to an operation signal from the operation member 18, between a first recording mode in which C frame images out of the (A+B) frame images are recorded on the recording medium 30 and a second recording mode in which all of the (A+B) frame images are recorded on the recording medium 30. In the present embodiment, the case where the first recording mode is selected is described.
The host CPU 11 selects C frame images from the (A+B) frame images as described below. Fig. 3 is a flowchart showing the flow of the image evaluation processing executed by the host CPU 11. The host CPU 11 starts the processing of Fig. 3 when the (A+B) frame images are input from the buffer memory 15.
In step S10 of Fig. 3, the host CPU 11 judges whether a person's "face" is contained in the frame images. The host CPU 11 performs "face" detection processing successively on the (A+B) = N frame images. Since the "face" detection processing is well known, its description is omitted. When the host CPU 11 judges that a "face" is contained in the frame images, it makes an affirmative judgment in step S10 and proceeds to step S20. When the host CPU 11 judges that no "face" is contained in the frame images, it makes a negative judgment in step S10 and proceeds to step S90.
In step S20, the host CPU 11 judges whether the person is the main subject. For example, in the n-th frame image (where 1 ≤ n ≤ N), when the number of pixels making up the detected "face" is equal to or greater than a specified number, a counter h(n) is set to 1; when no "face" is detected in the n-th frame image or when the number of pixels making up the detected "face" does not reach the specified number, the counter h(n) is set to 0. The host CPU 11 counts PF = Σ h(n) over the N frame images, and when PF > th_pf holds (where th_pf is a prescribed decision threshold), it makes an affirmative judgment in step S20 and proceeds to step S30. In this case, the person is judged to be the main subject. When PF < th_pf holds, the host CPU 11 makes a negative judgment in step S20 and proceeds to step S90. In this case, the person is judged not to be the main subject.
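A minimal Python sketch of this main-subject judgment of step S20 is shown below; the list `face_pixel_counts` and the parameter names are illustrative assumptions, not names taken from the patent.

```python
def is_person_main_subject(face_pixel_counts, min_face_pixels, th_pf):
    """Step S20 sketch: face_pixel_counts[n] is the pixel count of the
    largest "face" detected in frame image n (0 when no face is detected)."""
    # h(n) = 1 when the detected face is large enough, otherwise 0
    h = [1 if count >= min_face_pixels else 0 for count in face_pixel_counts]
    pf = sum(h)              # PF = sum of h(n) over the N frame images
    return pf > th_pf        # affirmative judgment -> step S30 (Case_A)
```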
In step S30, the host CPU 11 treats the case where the person is judged to be the main subject as "Case_A". In this case, the host CPU 11 sets a flag Case_PF to 1 and a flag Case_MV to 1, and proceeds to step S40. The flag Case_PF is 1 when a person is the main subject and 0 when a person is not the main subject. The flag Case_MV is a flag indicating whether a difference arises between frame images; when a person is the main subject, it is set to 1 for convenience.
In step S90, the host CPU 11 judges whether the subject is moving. In general, a difference arises between frame images when the subject moves, when a zoom operation changes the angle of view, and when the electronic camera 1 is panned. In the present embodiment, the differences between corresponding image data of the first frame and the N-th frame are calculated pixel by pixel, and their sum is obtained (formula (1) below).
Del_FR = Σx Σy |I(x, y, 1) − I(x, y, N)|   …(1)
Here, I(x, y, 1) denotes the image data of the first frame at position (x, y), and I(x, y, N) denotes the image data of the N-th frame at position (x, y).
When the computation result of formula (1) satisfies Del_FR > th_fr (where th_fr is a prescribed decision threshold), the host CPU 11 makes an affirmative judgment in step S90 and proceeds to step S100. In this case, at least one of subject movement, change of the angle of view, and panning has occurred, and it is judged that there is movement between frames. When Del_FR < th_fr holds, the host CPU 11 makes a negative judgment in step S90 and proceeds to step S110. In this case, it is judged that there is no movement and that the scene is substantially stationary between frames.
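The movement judgment of step S90 and formula (1) can be sketched as follows, assuming the frame images are available as 2-D numpy arrays of equal shape; the function name and the use of numpy are assumptions for illustration.

```python
import numpy as np

def has_motion(first_frame, nth_frame, th_fr):
    """Step S90 sketch of formula (1): the sum of absolute differences
    between the first and the N-th frame, compared with threshold th_fr."""
    del_fr = np.abs(first_frame.astype(np.int64)
                    - nth_frame.astype(np.int64)).sum()
    return del_fr > th_fr   # True -> step S100 (Case_B), False -> step S110 (Case_C)
```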
In step S100, the host CPU 11 treats the case where a person is not the main subject and there is movement between frame images as "Case_B". In this case, the host CPU 11 sets the flag Case_PF to 0 and the flag Case_MV to 1, and proceeds to step S40. The flag Case_MV indicates whether a difference arises between frame images. Note that in "Case_B", for example, the subject at the closest distance, or the subject in the area selected as the focusing area by the user or by the camera, is judged to be the "main subject", and the subsequent processing is performed accordingly.
In step S110, the host CPU 11 treats the case where a person is not the main subject and there is no movement between frame images as "Case_C". In this case, the host CPU 11 sets the flag Case_PF to 0 and the flag Case_MV to 0, and proceeds to step S40.
In step S40, the host CPU 11 determines the evaluation processing items according to the values of the flags Case_PF and Case_MV (that is, according to whether the case is "Case_A", "Case_B", or "Case_C"), and proceeds to step S50. Fig. 4 is a table showing the relationship between the flags Case_PF and Case_MV and the corresponding evaluation processing items. In the present embodiment, evaluation values are calculated for the evaluation processing items marked "ON" in the table, and calculation of evaluation values is omitted for the evaluation processing items marked "OFF".
In the present embodiment, evaluation processing items 41 to 46 are evaluation factors used to calculate the evaluation values for excluding X frame images from the N frame images so as to retain C frame images, and are therefore referred to as useless factors. When X is more than N/2, the burden of the ranking processing described later is reduced, which is preferable. Evaluation processing items 45 to 50 are evaluation factors used to calculate the evaluation values for ranking the C frame images by quality, and are therefore referred to as necessary factors (or optimality evaluation factors). Evaluation processing items 45 and 46 are shared between the useless factors and the necessary factors. Each evaluation factor is described later.
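Fig. 4 itself is not reproduced in the text, so the following Python sketch only assumes the ON/OFF grouping that can be read off the RMV and OPT formulas given later (items 41 to 46 for the exclusion processing, items 45 to 50 for the ranking processing); the exact entries of the table are an assumption.

```python
# Hypothetical reconstruction of the item selection of step S40.
# Keys are (Case_PF, Case_MV); values are the evaluation items switched "ON".
EXCLUSION_ITEMS = {                     # useless factors, used in steps S50/S60
    (1, 1): [41, 42, 43, 44, 45, 46],   # Case_A: a person is the main subject
    (0, 1): [44, 45, 46],               # Case_B: no person, movement present
    (0, 0): [],                         # Case_C: exclusion is skipped
}
RANKING_ITEMS = {                       # necessary factors, used in steps S70/S80
    (1, 1): [45, 46, 47, 48, 50],       # Case_A
    (0, 1): [45, 46, 49, 50],           # Case_B
    (0, 0): [45, 46],                   # Case_C
}

def items_for(case_pf, case_mv):
    key = (case_pf, case_mv)
    return EXCLUSION_ITEMS[key], RANKING_ITEMS[key]
```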
In step S50, the host CPU 11 calculates evaluation values using the evaluation factors corresponding to the evaluation processing items determined in step S40, and proceeds to step S60. In step S60, the host CPU 11 selects X frame images (unwanted images) in order starting from the frame images with low evaluation values (high degree of uselessness), excludes them from the N frame images so as to retain C frame images, and proceeds to step S70.
In step S70, the host CPU 11 calculates evaluation values using the evaluation factors corresponding to the evaluation processing items determined in step S40, and proceeds to step S80. In step S80, the host CPU 11 ranks the C frame images in order starting from the frame image with the highest evaluation value, and ends the processing of Fig. 3.
The evaluation factors mentioned above are described in detail below.
<Detection of proximity between multiple persons>
In evaluation processing item 41, an evaluation value H1(n) representing the proximity between multiple persons in the evaluation target image is calculated. For each frame image n (where 1 ≤ n ≤ N) making up the N frame images, when the area S of a detected "face" region exceeds a prescribed area set in advance, the host CPU 11 obtains the following values. The area is represented by, for example, the number of pixels making up the "face" region.
The host CPU 11 counts the "faces" detected in frame image n whose area S exceeds the prescribed area, and sets the counted number of "faces" (i.e. the number of persons) to KS(n). Furthermore, the area of the k-th "face" region detected in frame image n is denoted Sk(n), the coordinates of the center of that "face" region in frame image n are denoted (x_k(n), y_k(n)), the horizontal side of that "face" region is denoted Lx_k(n), and the vertical side of that "face" region is denoted Ly_k(n). Here, n is the frame number of a frame image making up the N continuously shot images, and k is the number of a "face" region in frame image n whose area exceeds the prescribed area.
When KS(n) ≤ 1, proximity between persons need not be considered, so the host CPU 11 sets the evaluation value H1(n) representing the proximity between multiple persons to 1. On the other hand, when KS(n) > 1, the following loop processing is repeated m times, where m = Combination(KS(n), 2), i.e. the number of pairs of "faces".
For example, in the case of KS(n) = 2:
the horizontal distance between the centers of the "face" regions is calculated as R1x = |x_1(n) − x_2(n)|,
the vertical distance between the centers of the "face" regions is calculated as R1y = |y_1(n) − y_2(n)|,
half the horizontal sides of the "face" regions are added as R2x = (Lx_1(n) + Lx_2(n)) / 2, and
half the vertical sides of the "face" regions are added as R2y = (Ly_1(n) + Ly_2(n)) / 2.
Then, when both R1x < R2x and R1y < R2y hold, hm(n) = 1; otherwise, hm(n) = 0. This is repeated m times over all combinations of the counted "faces" (i.e. persons).
As described above, regarding the sum of hm(n), when Σ hm(n) > 0 holds, the proximity evaluation value for frame image n is set to H1(n) = 0. On the other hand, when Σ hm(n) = 0 holds, the proximity evaluation value is set to H1(n) = 1.
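A minimal Python sketch of this proximity evaluation is shown below; the dictionary-based representation of the detected "face" regions is an assumption for illustration.

```python
from itertools import combinations

def proximity_h1(faces):
    """Evaluation item 41 sketch. `faces` lists the "face" regions whose area
    exceeds the prescribed area, each as a dict with center ('x', 'y') and
    side lengths ('lx' horizontal, 'ly' vertical)."""
    if len(faces) <= 1:
        return 1                          # proximity is not considered
    overlapping_pairs = 0
    for a, b in combinations(faces, 2):   # m = Combination(KS(n), 2) pairs
        r1x = abs(a['x'] - b['x'])        # horizontal distance between centers
        r1y = abs(a['y'] - b['y'])        # vertical distance between centers
        r2x = (a['lx'] + b['lx']) / 2     # half horizontal sides added
        r2y = (a['ly'] + b['ly']) / 2     # half vertical sides added
        if r1x < r2x and r1y < r2y:       # hm(n) = 1 for this pair
            overlapping_pairs += 1
    return 0 if overlapping_pairs > 0 else 1   # H1(n)
```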
<Face detection reliability>
In evaluation processing item 42, an evaluation value H2(n) representing the reliability of the "face" detection in the evaluation target image is calculated. For each frame image n (where 1 ≤ n ≤ N) making up the N frame images, the host CPU 11 obtains, for the detected "face" region with the largest area S, the reliability information (minimum 0 to maximum 1) obtained during the "face" detection processing. The reliability information may also be an index such as "face" similarity or "face" contour curvature. In the case of "face" similarity, various feature quantities (color information, information on the positional relationship of the facial parts (eyes, nose, mouth, etc.), and the like) are extracted from the image of the "face" region obtained by face detection, and they are comprehensively judged to measure a "face similarity" representing the reliability of the "face" detection. In the case of "face" contour curvature, the reliability of the "face" detection is calculated by examining whether the curvature of the facial contour extracted by a method such as edge detection changes unnaturally. For example, a curve (or an ellipse) is fitted to the extracted edge (contour), and when the edge changes sharply (the contour is not smooth), it is presumed that hair, a hand, or the like is covering the face, and processing to lower the reliability (evaluation value H2(n)) is performed. The host CPU 11 sets the face failure evaluation value for frame image n to H2(n) = reliability information.
<Closed-eye detection>
In evaluation processing item 43, an evaluation value H3(n) representing the presence or absence of closed eyes in the evaluation target image is calculated. For each frame image n (where 1 ≤ n ≤ N) making up the N frame images, the host CPU 11 performs closed-eye detection on the detected "face" region with the largest area S. Since closed-eye detection processing is well known, its description is omitted. Using the closed-eye detection result, the host CPU 11 sets the closed-eye evaluation value for frame image n to H3(n) = 0 when closed eyes are present, and to H3(n) = 1 when closed eyes are not present.
<Frame-out detection>
In evaluation processing item 44, in "Case_A" an evaluation value H4(n) representing whether the "face" region in the evaluation target image deviates from the screen is calculated, and in "Case_B" an evaluation value H4(n) representing whether the region of the "main subject" presumed to be the main subject other than a person deviates from the screen is calculated. For each frame image n (where 1 ≤ n ≤ N) making up the N frame images, among the detected "face" ("main subject") regions whose area S exceeds the prescribed area, the host CPU 11 obtains the center coordinates (F0_x, F0_y) of the "face" ("main subject") region farthest from the center of the screen. Then, the frame-out regions illustrated in Fig. 5(a) and Fig. 5(b) are set, and the frame-out evaluation value H4(n) is determined as described below according to whether the obtained (F0_x, F0_y) are included in the frame-out region.
Fig. 5(a) is a diagram showing the frame-out region when frame image n is in landscape orientation. When the above center coordinates (F0_x, F0_y) are included in the hatched portion 51, this corresponds to a frame-out, and the host CPU 11 sets the frame-out evaluation value for frame image n to H4(n) = 0. On the other hand, when the above center coordinates (F0_x, F0_y) are not included in the hatched portion 51, this corresponds to no frame-out, and the frame-out evaluation value is set to H4(n) = 1. Fig. 5(b) is a diagram showing the frame-out region when frame image n is in portrait orientation. As in the landscape case, the host CPU 11 determines the frame-out evaluation value H4(n) according to whether the above center coordinates (F0_x, F0_y) are included in the hatched portion 52. The extent of the hatched portions 51 and 52 in the frame image is, for example, 1/8 of the length of the horizontal side on each of the left and right, and 1/8 of the length of the vertical side from the bottom.
Note that instead of determining whether a frame-out has occurred using the center coordinates (F0_x, F0_y) of the "face" ("main subject") region farthest from the center of the screen, the coordinates (FP_x, FP_y) corresponding to the focus point used for focusing (the center of the focal area (AF area)) that is farthest from the center of the screen may be used, and whether a frame-out has occurred may be determined according to whether these coordinates (FP_x, FP_y) are included in the hatched portions 51, 52 of Fig. 5(a) and Fig. 5(b).
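The frame-out judgment can be sketched as below, using the 1/8 margins of Fig. 5; the coordinate convention (origin at the top-left, y increasing downward) is an assumption.

```python
def frameout_h4(f0_x, f0_y, width, height):
    """Evaluation item 44 sketch: H4(n) = 0 when the center (F0_x, F0_y) of
    the farthest "face" ("main subject") region lies in the frame-out region
    (left/right 1/8 of the horizontal side, bottom 1/8 of the vertical side)."""
    in_left = f0_x < width / 8
    in_right = f0_x > width * 7 / 8
    in_bottom = f0_y > height * 7 / 8     # assumes y grows downward
    return 0 if (in_left or in_right or in_bottom) else 1
```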
<Shake detection and sharpness in the AF area / face area>
In evaluation processing item 45, an evaluation value H50(n) representing the degree of shake in the evaluation target image and an evaluation value H51(n) representing the sharpness of the evaluation target image are calculated. For each frame image n (where 1 ≤ n ≤ N) making up the N frame images, the host CPU 11 obtains the average value HPF_av(n) of the output of an HPF (high-pass filter) applied to the detected "face" region with the largest area S (in "Case_A"). The average value HPF_av(n) is obtained by extracting the high-frequency components from the pixel data contained in the "face" region and averaging them, and is therefore suitable for judging whether the contrast is high or low.
When HPF_av(n) < HPF_k holds, the host CPU 11 sets the shake evaluation value for frame image n to H50(n) = HPF_av(n) / HPF_k. On the other hand, when HPF_av(n) ≥ HPF_k holds, the shake evaluation value is set to H50(n) = 1. Here, HPF_k is a set value, for example the average value HPF_av(n) of the HPF output calculated at the time of pre-focusing.
In addition, the host CPU 11 sets the sharpness evaluation value for frame image n to H51(n) = HPF_av(n) / HPF_k, where HPF_k is the set value described above.
Note that in the above description, the shake evaluation value H50(n) and the sharpness evaluation value H51(n) of the "face" are obtained from the average output HPF_av(n) of the HPF (high-pass filter) applied to the image data contained in the "face" region with the largest area S. Instead, the average HPF output HPF_av(n) of the pixel data contained in the focal area (AF area) used for focusing may be obtained ("Case_B" and "Case_C" use this method). By obtaining the average HPF output of the pixel data contained in the AF area, the shake evaluation value H50(n) and the sharpness evaluation value H51(n) for the focused subject are obtained.
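A sketch of the shake and sharpness evaluation follows; since the patent does not specify the high-pass filter, a simple Laplacian is used as a stand-in, and the region is assumed to be a 2-D numpy array.

```python
import numpy as np

def shake_and_sharpness(region, hpf_k):
    """Evaluation item 45 sketch. `region` is the pixel data of the largest
    "face" region (Case_A) or of the AF area (Case_B/Case_C)."""
    r = region.astype(np.float64)
    # 4-neighbour Laplacian as a stand-in high-pass filter
    hpf = np.abs(4 * r[1:-1, 1:-1] - r[:-2, 1:-1] - r[2:, 1:-1]
                 - r[1:-1, :-2] - r[1:-1, 2:])
    hpf_av = hpf.mean()                                 # HPF_av(n)
    h50 = hpf_av / hpf_k if hpf_av < hpf_k else 1.0     # shake evaluation H50(n)
    h51 = hpf_av / hpf_k                                # sharpness evaluation H51(n)
    return h50, h51
```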
<Overall brightness histogram>
In evaluation processing item 46, an evaluation value H6(n) representing the frequency of overexposure or underexposure in the evaluation target image is calculated. For each frame image n (where 1 ≤ n ≤ N) making up the N frame images, the host CPU 11 calculates the brightness histogram of the whole image (using all the pixel data making up the image). Let HL be the frequency of the maximum gradation value (255 in the case of 8-bit data) and LL be the frequency of the minimum gradation value (0). The host CPU 11 sets the evaluation value to H6(n) = (HL + LL) / (total number of pixels), i.e. the normalized frequency proportion of overexposure and underexposure.
Note that HL may instead be the frequency of data equal to or greater than a prescribed first decision threshold, and LL may be the frequency of data equal to or less than a prescribed second decision threshold.
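A minimal sketch of this exposure evaluation is shown below, assuming an 8-bit brightness plane; the optional decision thresholds of the preceding note are exposed as parameters.

```python
import numpy as np

def exposure_h6(luma, first_threshold=255, second_threshold=0):
    """Evaluation item 46 sketch. HL counts pixels at or above the first
    threshold (over-exposure), LL counts pixels at or below the second
    threshold (under-exposure); H6(n) is their proportion of all pixels."""
    hl = np.count_nonzero(luma >= first_threshold)
    ll = np.count_nonzero(luma <= second_threshold)
    return (hl + ll) / luma.size          # H6(n) = (HL + LL) / total pixels
```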
<Face area detection>
In evaluation processing item 47, an evaluation value H7(n) representing the maximum "face" area in the evaluation target image is calculated. For each frame image n (where 1 ≤ n ≤ N) making up the N frame images, the host CPU 11 calculates the area S(n) of the detected "face" region with the largest area S. As described above, the area is represented by the number of pixels making up the "face" region. The host CPU 11 sets the face area evaluation value to H7(n) = S(n).
<Smile level detection>
In evaluation processing item 48, an evaluation value H8(n) representing the smile level of the "face" in the evaluation target image is calculated. The host CPU 11 obtains the smile level of the "face" detected in each frame image n (where 1 ≤ n ≤ N) making up the N frame images. The smile level is judged at the time of the "face" detection described above. The smile level is divided into, for example, three levels: level 3 (big laugh), level 2 (laugh), and level 1 (smile). The host CPU 11 sets the smile evaluation value to H8(n) = smile level.
<Subject size detection>
In evaluation processing item 49, an evaluation value H9(n) representing the size of the subject in the evaluation target image is calculated. For each frame image n (where 1 ≤ n ≤ N) making up the N frame images, the host CPU 11 obtains the average brightness Y0 and the average color (Cr0, Cb0) of a central small area (for example, 3 × 3 pixels) of the focal area (AF area) used for focusing. Fig. 6 is a diagram showing the AF area 61 and the central small area 62. The central small area 62 is the reference area for the region analysis described below.
Using the average brightness Y0 and the average color (Cr0, Cb0) for each frame image n, the host CPU 11 calculates the subject size evaluation value H9(n) in the following order.
I mark, to center tiny area, is set to 1 by ().
(ii), when pixel value around is within feasible value rn (wherein, n=1:Y, 2:Cr, 3:Cb) of regulation, mark is carried out set.Specifically,
(| when Y (x, y)-Y0 | > r1) sets up, set becomes Flag1=0, and when being false, set becomes Flag1=1.
(| when Cr (x, y)-Cr0 | > r2) sets up, set becomes Flag2=0, and when being false, set becomes Flag2=1.
(| when Cb (x, y)-Cb0 | > r3) sets up, set becomes Flag3=0, and when being false, set becomes Flag3=1.
About above-mentioned Flag1~Flag3, when (Flag1*Flag2*Flag3=1) sets up, set becomes Flag=1, and when being false, set becomes Flag=0.
(iii) make evaluation object region from the beginning of surrounding the reference area i.e. limit of tiny area 62, making limit about increase (the first evaluation region 71 → the second evaluation region the 72 → the 3rd evaluation region 73 ...) step by step, the moment entirely becoming 0 at Flag terminates to process.Fig. 7 is the figure illustrating the evaluation object region 71~73 increased.
(iv) calculate Flag and become the area D of 1.Host CPU 11 makes Size Evaluation value H9 (the n)=D of the subject under the n-th image.
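A minimal Python sketch of this region-growing procedure, under the assumption that the Y/Cr/Cb planes are available as 2-D numpy arrays, is as follows; the safety bound `max_half` is an illustrative addition, not part of the patent.

```python
import numpy as np

def subject_size_h9(y, cr, cb, cx, cy, r1, r2, r3, max_half=50):
    """Evaluation item 49 sketch: grow square rings outward from the central
    3x3 small area and stop when an entire ring is outside the tolerances."""
    ref = lambda plane: plane[cy - 1:cy + 2, cx - 1:cx + 2].mean()
    y0, cr0, cb0 = ref(y), ref(cr), ref(cb)     # reference brightness / color

    def flag(px, py):      # Flag = 1 when all three tolerances are satisfied
        return (abs(y[py, px] - y0) <= r1 and
                abs(cr[py, px] - cr0) <= r2 and
                abs(cb[py, px] - cb0) <= r3)

    area_d = 9                                  # the central 3x3 area has Flag = 1
    for half in range(2, max_half):             # expanding evaluation regions
        ring = [(cx + dx, cy + dy)
                for dx in range(-half, half + 1)
                for dy in range(-half, half + 1)
                if max(abs(dx), abs(dy)) == half
                and 0 <= cx + dx < y.shape[1] and 0 <= cy + dy < y.shape[0]]
        flags = [flag(px, py) for px, py in ring]
        if not any(flags):                      # whole ring became Flag = 0
            break
        area_d += sum(flags)
    return area_d                               # H9(n) = D
```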
<Composition optimality detection>
In evaluation processing item 50, an evaluation value H10(n) representing the optimality of the composition of the evaluation target image is calculated. For each frame image n (where 1 ≤ n ≤ N) making up the N frame images, the host CPU 11 obtains the center coordinates (Qx, Qy) of the focal area (AF area) used for focusing or of the detected "face" region.
As illustrated in Fig. 8, the host CPU 11 sets five composition evaluation coordinate points P1 to P5 in the screen in advance, and calculates the distance KS(m) (where m = 1, 2, ..., 5) between each of these five points and the above center coordinates (Qx, Qy) by the following formula (2).
KS(m) = √( (P(m, x) − Qx)² + (P(m, y) − Qy)² )   …(2)
The host CPU 11 sets the composition evaluation value for the n-th image to H10(n) = Min(KS(m)), the minimum of the five distances KS(m).
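A minimal sketch of formula (2) follows; the actual coordinates of the composition evaluation points P1 to P5 are not given in the text, so they are passed in as parameters.

```python
import math

def composition_h10(qx, qy, points):
    """Evaluation item 50 sketch. `points` holds the five composition
    evaluation points P1..P5 as (x, y) tuples; (qx, qy) is the center of the
    AF area or of the detected "face" region."""
    distances = [math.hypot(px - qx, py - qy) for px, py in points]  # KS(m)
    return min(distances)                                            # H10(n)
```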
The details of the processing of excluding the X frame images in step S60 described above (that is, the processing of selecting the C best-candidate images) are now described. The host CPU 11 calculates an unwanted-image evaluation value RMV as described below and selects the best-candidate images. Here, arn, brn, and crn are predetermined coefficients.
In the case of "Case_A":
RMV(n) = ar1*H1(n) + ar2*H2(n) + ar3*H3(n) + ar4*H4(n) + ar5*H50(n) + ar6*H6(n)
In the case of "Case_B":
RMV(n) = br4*H4(n) + br5*H50(n) + br6*H6(n)
In the case of "Case_C", calculation of the unwanted-image evaluation value RMV is omitted.
The degree of uselessness rises as the unwanted-image evaluation value RMV approaches 0. For all the input images (N frames), the host CPU 11 obtains RMV(n) for the applicable case (Case), counts off X frames in descending order of uselessness, and makes X variable so that the C frames remaining after excluding the X frames from the N frames become a prescribed number (for example, any value from 5 to 10). This C is the number of best-candidate images.
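The exclusion processing of step S60 can be sketched as follows; the dictionary representation of the per-frame evaluation values and of the coefficients is an assumption for illustration.

```python
def select_best_candidates(frame_values, coeffs, c_target):
    """Step S60 sketch. frame_values[n] maps keys such as 'H1', 'H2', 'H3',
    'H4', 'H50', 'H6' to the evaluation values of frame n; coeffs maps the
    same keys to the Case-dependent coefficients (ar*, br*). Frames whose RMV
    is closest to 0 are the most "useless" and are excluded first."""
    def rmv(values):
        return sum(coeffs.get(k, 0) * values.get(k, 0) for k in coeffs)
    order = sorted(range(len(frame_values)), key=lambda n: rmv(frame_values[n]))
    x = max(0, len(frame_values) - c_target)   # number of frames to exclude
    excluded = set(order[:x])                  # lowest RMV = highest uselessness
    return [n for n in range(len(frame_values)) if n not in excluded]
```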
The details of the ranking processing (that is, the rank setting processing) in step S80 described above are now described. For the C best-candidate images, the host CPU 11 calculates an optimal-image evaluation value OPT as described below, establishes the ranking, and selects the representative (best) image. Here, aon, bon, and con are predetermined coefficients.
In the case of "Case_A":
OPT(n) = ao5*H51(n) + ao6*H6(n) + ao7*H7(n) + ao8*H8(n) + ao10*H10(n)
In the case of "Case_B":
OPT(n) = bo5*H51(n) + bo6*H6(n) + bo9*H9(n) + bo10*H10(n)
In the case of "Case_C":
OPT(n) = co5*H51(n) + co6*H6(n)
For the selected C images (any value from 5 to 10 in this example), the host CPU 11 obtains OPT(n) for the applicable case (Case), ranks the images in descending order of this value, and takes the image with the highest value as the representative (best) image of the best-candidate images.
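The ranking of step S80 can be sketched in the same style; again, the dictionary representation and coefficient names are assumptions.

```python
def rank_candidates(candidate_values, coeffs):
    """Step S80 sketch. candidate_values[n] maps keys such as 'H51', 'H6',
    'H7', 'H8', 'H9', 'H10' to the evaluation values of candidate frame n;
    coeffs holds the Case-dependent coefficients (ao*, bo*, co*)."""
    def opt(values):
        return sum(coeffs.get(k, 0) * values.get(k, 0) for k in coeffs)
    ranking = sorted(candidate_values,
                     key=lambda n: opt(candidate_values[n]), reverse=True)
    return ranking          # ranking[0] is the representative (best) image
```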
According to the embodiment described above, the following operational effects can be obtained.
(1) The electronic camera 1 includes the host CPU 11, which calculates, for each of the plurality of images contained in the input image group, an evaluation value based on the useless factors; excludes from the image group, based on those computation results, the images whose computation results satisfy a prescribed condition; calculates, for each image contained in the image group after the exclusion processing, an evaluation value based on the necessary factors, which include factors different from the useless factors; and ranks, based on those computation results, the images in the image group after the exclusion processing. The best shooting candidate images can therefore be selected quickly from the image group.
(2) Since the image group contains a plurality of continuously shot images, the best shooting candidate images can be selected quickly from the continuously shot image group.
(3) Since the host CPU 11 changes the content of the useless factors according to the shooting scene at the time the image group is shot, an evaluation value suited to the scene can be calculated, and the best shooting candidate images can be selected more appropriately than when the evaluation value is calculated based on the same useless factors regardless of the scene.
(4) In (3) above, the host CPU 11 changes the content of the useless factors according to at least one of whether the subject image contained in the image group is moving and whether a person is contained in the subject image, so the best shooting candidate images can be selected appropriately in consideration of subject movement or the presence of a person.
(5) Since the host CPU 11 excludes more than half of the images from the image group based on the computation results of the useless factors, the processing load after the exclusion is reduced, and the best shooting candidate images can be selected more quickly than when no exclusion is performed.
(6) Since some of the useless factors are shared with some of the necessary factors, the overall processing time can be shortened compared with the case where no factors are shared.
(7) Since the necessary factors include at least one of a factor relating to the facial expression of the person contained in the subject image in the image group, a factor relating to that person's face area, and a factor relating to shake of the subject image, suitable best shooting candidate images can be selected.
(variation 1)
The number of evaluation processing items executed in step S50 may be configured as follows: when the useless factors are used, it is not necessary to perform all of the above items 41 to 46, and items may be selected from them as appropriate. Likewise, the number of evaluation processing items executed in step S70 may be configured as follows: when the necessary factors are used, it is not necessary to perform all of the above items 45 to 50, and items may be selected from them as appropriate.
(variation 2)
When the C images recorded on the recording medium 30 in the first recording mode described above are played back and displayed on the LCD monitor 19, if the host CPU 11 is instructed to play back only the representative (best) image among the C images, only the representative (best) image is played back and displayed. On the other hand, if it is instructed to play back all of the C images, the images are played back and displayed one by one according to the ranking order, or a plurality of images are displayed together as a list along with the ranking information.
In addition, when the (A+B) images recorded on the recording medium 30 in the second recording mode described above are played back and displayed on the LCD monitor 19, if the host CPU 11 is instructed to play back only the representative (best) image among the (A+B) images, only the representative (best) image is played back and displayed. On the other hand, if it is instructed to play back the C images among the (A+B) images, the images are played back and displayed one by one according to the ranking order, or a plurality of images are displayed together as a list along with the ranking information.
(variation 3)
An image processing apparatus may also be constituted by having the computer device 100 shown in Fig. 9 execute an image evaluation processing program that performs the processing of Fig. 3. When the image evaluation processing program is loaded into a personal computer 100 for use, the program is loaded into the data storage device of the personal computer 100, and the personal computer 100 is then used as the image processing apparatus by executing the program.
The program may be loaded into the personal computer 100 by inserting a recording medium 104 such as a CD-ROM storing the program into the personal computer 100, or by a method of loading it into the personal computer 100 via a communication line 101 such as a network. When the communication line 101 is used, the program is stored in advance in a hard disk device 103 or the like of a server (computer) 102 connected to the communication line 101. The image evaluation processing program can thus be supplied as a computer program product in various forms, such as provision via the recording medium 104 or the communication line 101.
(variation 4)
In the flowchart illustrated in Fig. 3 above, after step S80, the X frame images with low evaluation values (high degree of uselessness), i.e. the unwanted images, may also be displayed on the LCD monitor 19. In variation 4, the first recording mode, in which C frame images out of the (A+B) frame images are recorded on the recording medium 30, is selected as the recording mode, and the case of N = (A+B) = 20 and C = 5 is described as an example. That is, when 5 of the 20 frame images are recorded on the recording medium 30, the 15 (= 20 − 5) frame images with low evaluation values (high degree of uselessness) (the unwanted images) are displayed on the LCD monitor 19 so that the user can confirm them.
The host CPU 11 of variation 4 sends an instruction to the display image generation circuit 14 to perform the display illustrated in Fig. 10 on the LCD monitor 19. Fig. 10 is a diagram showing the display screen of the LCD monitor 19. In Fig. 10, 15 frame images are displayed as thumbnails in a region P11 as exclusion candidates. On the other hand, 5 frame images are displayed as thumbnails in a region Q11 as saving candidates. The numbers 1 to 5 attached above the 5 frame images represent the ranking added in step S80.
The host CPU 11 of variation 4 displays a cursor P12 on one of the frame images displayed in the region P11 (for example, the frame image displayed at the upper left of the region P11). When, for example, a cross key switch (not shown) constituting the operation member 18 is pressed, the host CPU 11 moves the cursor P12 onto the frame image located in the operation direction of the cross key. When an operation instructing "select as saving candidate" is performed by pressing the operation member 18, the host CPU 11 moves the frame image designated by the cursor P12 at that moment to the region Q11, including it among the saving candidates. In addition, the host CPU 11 moves the lowest-ranked frame image among the frame images displayed in the region Q11 to the region P11, including it among the exclusion candidates.
Note that, according to the pressing of the operation member 18, a frame image to be moved to the region P11 (that is, to be included among the exclusion candidates) may also be designated from among the frame images (saving candidates) displayed in the region Q11. Furthermore, when the operation instructing "select as saving candidate" is performed, the number of frame images (saving candidates) already displayed in the region Q11 may be left unreduced, and the number of images in the region Q11 serving as saving candidates may simply be increased.
When an OK switch (not shown) constituting the operation member 18 is pressed, the host CPU 11 of variation 4 ends the processing of the flowchart illustrated in Fig. 3. At that time, the host CPU 11 records the frame images displayed in the region Q11 on the recording medium 30. According to variation 4, even a frame image included among the exclusion candidates can be saved when the user wishes to save it. In addition, since the exclusion candidate images and the saving candidate images are displayed in separate regions, the display is easy for the user to understand.
(variation 5)
In the above description, an example was described in which the value of the number C of images, out of the N images, to be saved on the recording medium 30 (for example, C = 5) is set in advance by menu operation. In variation 5, not only the value of C but also the value of N (where N > C), or not only the value of C but also the values of A and B (where (A+B) > C), can be set by menu operation. Thus, the number of saving candidates is not limited to 5 and may be, for example, 10, and the numbers of frame images shot before and after time t1 (the time when the full-press operation signal, i.e. the shooting instruction, is input) can be set to values desired by the user.
(variation 6)
Variation 6 describes a case where no difference arises in the evaluation values RMV calculated using the useless factors, so that superiority or inferiority cannot be determined. Suppose that, when selecting 15 frame images as exclusion candidates from the N = 20 frame images described above, and counting the frame images from the one with the lowest evaluation value RMV (highest degree of uselessness), the evaluation values RMV from the 14th to the 18th are the same. In this case, if only the evaluation values RMV based on the useless factors are compared, it is difficult to select 15 exclusion candidates. To select exactly 15, it is necessary to count up from the lowest evaluation value RMV based on the useless factors and to pick two out of the five frame images from the 14th to the 18th.
Therefore, for the 14th to 18th frame images described above, the host CPU 11 of variation 6 selects two frame images in order starting from the one with the smallest value of, for example, the evaluation value H1(n) representing the proximity between multiple persons among the useless factors, and takes these two images as the 14th and 15th frame images. In this way, when the evaluation values RMV based on the useless factors are equal and it is difficult to select 15 frame images, 15 frame images can be reliably selected as exclusion candidates by selecting the frame images with small values of a predetermined evaluation value based on the useless factors (for example, H1(n) above). Note that when the evaluation values H1(n) are also equal and superiority or inferiority still cannot be determined, the selection proceeds in order from the frame image with the smallest value of another evaluation value based on the useless factors, for example the evaluation value H2(n) representing the reliability of the "face" detection.
(variation 7)
Variation 7 describes another example of the case where no difference arises in the evaluation values RMV calculated using the useless factors, so that superiority or inferiority cannot be determined. As before, suppose that, when selecting 15 frame images as exclusion candidates from the N = 20 frame images and counting from the lowest evaluation value RMV (highest degree of uselessness), the evaluation values RMV from the 14th to the 18th are the same. The host CPU 11 of variation 7 removes from the exclusion candidates the frame images around the 15th whose superiority or inferiority cannot be determined, and sets the number of exclusion-candidate frame images to 13. That is, the 14th to 18th frame images, whose superiority or inferiority cannot be determined by comparing only the evaluation values RMV based on the useless factors, are included among the saving candidates, and the 7 images from the 14th to the 20th are treated as saving candidates.
In this way, when the evaluation values RMV based on the useless factors are equal and it is difficult to select 15 frame images, the number of exclusion candidates is reduced (in other words, the number of saving candidates is increased). However, from the increased saving candidate images, the host CPU 11 records only the top C (5 in this example) frame images on the recording medium 30 according to the ranking determined in step S80.
(variation 8)
Variation 8 describes yet another example of the case where no difference arises in the evaluation values RMV calculated using the useless factors, so that superiority or inferiority cannot be determined. As before, suppose that, when selecting 15 frame images as exclusion candidates from the N = 20 frame images and counting from the lowest evaluation value RMV (highest degree of uselessness), the evaluation values RMV from the 14th to the 18th are the same. The host CPU 11 of variation 8 stops selecting exclusion-candidate frame images and treats all N = 20 frame images as saving candidates. Note that this is not limited to the case where superiority or inferiority cannot be determined by comparing only the evaluation values RMV based on the useless factors; even when the evaluation values RMV of all N = 20 frame images are below a predetermined reference value, all N = 20 frame images may be treated as saving candidates without selecting exclusion candidates.
From the 20 saving candidate images, the host CPU 11 of variation 8 records the top C (5 in this example) frame images on the recording medium 30 according to the ranking determined in step S80.
(variation 9)
As in variation 7 or variation 8 described above, there are cases where the ranking is assigned to more frame images than the number C (5 in this example) of images to be saved on the recording medium 30. In such cases, the host CPU 11 records the top C (5 in this example) frame images on the recording medium 30 according to the ranking determined in step S80. However, there are also cases where no difference arises in the optimal-image evaluation values OPT calculated using the necessary factors, so that superiority or inferiority cannot be determined.
For example, suppose that, when counting the frame images from the one with the highest optimal-image evaluation value OPT, the optimal-image evaluation values OPT from the 3rd to the 7th are the same. In this case, it is difficult to select 5 images by comparing only the optimal-image evaluation values OPT. To select exactly 5, it is necessary to count down from the highest optimal-image evaluation value OPT based on the necessary factors and to pick three out of the five frame images from the 3rd to the 7th.
Therefore, for the 3rd to 7th frame images described above, the host CPU 11 of variation 9 selects three frame images in order starting from the one with the largest value of, for example, the evaluation value H7(n) representing the maximum "face" area among the necessary factors, and takes these as the 3rd to 5th frame images. In this way, when the optimal-image evaluation values OPT based on the necessary factors are equal and it is difficult to select 5 frame images, the 5 frame images can be reliably selected by picking the frame images with large values of a predetermined evaluation value based on the necessary factors (for example, H7(n) above). Note that when the evaluation values H7(n) are also equal and superiority or inferiority still cannot be determined, the selection proceeds in order from the frame image with the largest value of another evaluation value based on the necessary factors, for example the evaluation value H8(n) representing the smile level of the "face". If superiority or inferiority still cannot be determined, the selection proceeds in order from the frame image with the largest value of yet another evaluation value based on the necessary factors, i.e. the evaluation value H9(n) representing the subject size, and likewise, if still undetermined, from the largest value of the evaluation value H10(n) representing the composition optimality. According to variation 9, the 5 frame images to be saved on the recording medium 30 can be selected appropriately.
(variation 10)
Variation 10 describes another example of the case where no difference arises in the optimal-image evaluation values OPT calculated using the necessary factors, so that superiority or inferiority cannot be determined. When superiority or inferiority cannot be determined even using each of the evaluation values H7(n) to H10(n) described above, the host CPU 11 selects 5 frame images as recording candidates as follows. That is, from the plurality of frame images serving as recording candidates, the host CPU 11 selects, for each of the four evaluation values H7(n) to H10(n) based on the necessary factors, the frame image ranked first for that evaluation value. The host CPU 11 further adds, as a recording candidate, the frame image acquired (shot) closest to the time (time t1) when the full-press operation signal (i.e. the shooting instruction) was input, for a total of 5 images. According to variation 10, the 5 frame images to be saved on the recording medium 30 can be selected appropriately.
(variation 11)
The useless factor described above, the necessary factor is added after can also.Such as, replace the program of storage in flash memories 16 originally, for the useless factor and at least one party of the necessary factor, will newly add or change calculate formula after the program more new record that promotes of version in flash memories 16.
Specifically, after the version-upgraded program has been loaded into the data storage device of the personal computer 100 illustrated in Fig. 9, the personal computer and the electronic camera 1 are connected via a communication cable (not shown), such as a USB cable. The version-update program may be input to the personal computer 100 by loading a storage medium 104, such as a CD-ROM storing the version-update program, into the personal computer 100, or by a method of inputting it to the personal computer 100 via the communication line 101 such as a network.
The version-update program can be supplied from the personal computer 100 to the electronic camera 1 via the above communication cable. The host CPU 11 updates and records the supplied version-update program in the flash memory 16. Alternatively, after the personal computer 100 has recorded the version-update program in the recording medium 30, this recording medium 30 may be mounted in the card interface (I/F) 17 of the electronic camera 1, and the host CPU 11 may read the version-update program recorded in the recording medium 30 and update and record it in the flash memory 16. According to variation 11, evaluation processing using the useless factors and necessary factors based on the latest evaluation methods included in the version-update program can be performed. Further, if the version-update program includes useless factors and necessary factors that match the user's preferences, evaluation processing matching the user's preferences can be performed.
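For illustration only, a minimal Python sketch of the two supply paths of variation 11 (cable transfer and recording medium); the file paths and function names are assumptions and do not describe the camera's actual firmware interface:

```python
# Hypothetical sketch: record a version-update program in place of the
# evaluation program held in the flash memory 16.
import shutil
from pathlib import Path

FLASH_PROGRAM = Path("flash16/evaluation_program.bin")   # assumed stand-in


def update_from_cable(received_bytes: bytes) -> None:
    """Record a version-update program received over the cable connection."""
    FLASH_PROGRAM.parent.mkdir(parents=True, exist_ok=True)
    FLASH_PROGRAM.write_bytes(received_bytes)


def update_from_card(card_root: Path) -> None:
    """Record a version-update program read from the recording medium."""
    FLASH_PROGRAM.parent.mkdir(parents=True, exist_ok=True)
    shutil.copyfile(card_root / "version_update.bin", FLASH_PROGRAM)
```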
The above description is merely an example, and the present invention is not limited to the configurations of the above-described embodiment.

Claims (31)

1. An image processing apparatus, characterized by comprising:
a first operation unit that calculates, for each of a plurality of images included in an input image group, an evaluation value based on a first factor;
an exclusion unit that, based on the operation results of the first operation unit, performs exclusion processing on images in the image group whose operation results satisfy a predetermined condition;
a second operation unit that calculates, for each image not excluded from the image group by the exclusion unit, an evaluation value based on a second factor, the second factor including a factor different from the first factor; and
an evaluation processing unit that ranks, based on the operation results of the second operation unit, each image of the image group not excluded by the exclusion unit.
2. The image processing apparatus according to claim 1, characterized in that
the image group includes a plurality of images captured by continuous shooting.
3. The image processing apparatus according to claim 1 or 2, characterized in that
the first operation unit changes the content of the first factor according to the shooting scene at the time the image group is captured.
4. The image processing apparatus according to claim 3, characterized in that
the first operation unit changes the content of the first factor according to at least one of whether the subject image included in the image group is moving and whether the subject image includes a person.
5. The image processing apparatus according to claim 1 or 2, characterized in that
the exclusion unit excludes more than half of the images from the image group based on the operation results of the first operation unit.
6. The image processing apparatus according to claim 1 or 2, characterized in that
a part of the first factor is shared with a part of the second factor.
7. The image processing apparatus according to claim 6, characterized in that
the shared factor includes at least one of sharpness information of a predetermined region of each image constituting the image group and luminance distribution information of the entire region of each image.
8. The image processing apparatus according to claim 1 or 2, characterized in that
the second factor includes at least one of a factor related to the expression of a person in the subject image included in the image group, a factor related to the face area of the person, and a factor related to the shake of the subject image.
9. The image processing apparatus according to claim 1, characterized by
further comprising a display control unit that displays the images excluded by the exclusion unit on a display device.
10. The image processing apparatus according to claim 9, characterized by
further comprising a control unit that controls the exclusion unit so that a predetermined image among the images displayed on the display device by the display control unit is not excluded.
11. The image processing apparatus according to claim 1, characterized by
further comprising a factor adding unit for adding a factor to at least one of the first factor and the second factor.
12. The image processing apparatus according to claim 5, characterized in that
the exclusion unit excludes a predetermined number of images from the image group, and
the predetermined number can be changed.
13. The image processing apparatus according to claim 12, characterized in that
the exclusion unit does not perform the exclusion when the predetermined number of images cannot be excluded based on the operation results of the first operation unit,
when the exclusion unit does not perform the exclusion, the second operation unit calculates an evaluation value based on the second factor for each image of the entire input image group, and
the evaluation processing unit ranks each image of the entire input image group based on the operation results of the second operation unit.
14. The image processing apparatus according to claim 12, characterized in that
the exclusion unit excludes a second predetermined number of images, smaller than the predetermined number, from the image group when the predetermined number of images cannot be excluded based on the operation results of the first operation unit,
when the exclusion unit has excluded the second predetermined number of images, the second operation unit calculates an evaluation value based on the second factor for each image included in the image group after the exclusion processing, the second factor including a factor different from the first factor, and
the evaluation processing unit ranks each image of the image group after the exclusion processing based on the operation results of the second operation unit.
15. The image processing apparatus according to claim 13 or 14, characterized by
further comprising a selection unit that selects an image to be saved based on the ranking assigned by the evaluation processing unit.
16. The image processing apparatus according to claim 15, characterized in that
when there is no difference in the ranking, the selection unit selects the image obtained closest in time to the shooting instruction and having the highest evaluation value based on the second factor.
17. An image evaluation method, characterized by comprising:
a first step of inputting an image group including a plurality of images;
a second step of calculating an evaluation value based on a first factor for each image included in the image group;
a third step of excluding, from the image group, images whose operation results satisfy a predetermined condition, based on the operation results of the second step;
a fourth step of calculating an evaluation value based on a second factor for each image not excluded from the image group by the third step, the second factor including a factor different from the first factor; and
a fifth step of ranking each image of the image group not excluded by the third step, based on the operation results of the fourth step.
18. The image evaluation method according to claim 17, characterized in that
the image group includes a plurality of images captured by continuous shooting.
19. The image evaluation method according to claim 17 or 18, characterized in that
the second step changes the content of the first factor according to the shooting scene at the time the image group is captured.
20. The image evaluation method according to claim 19, characterized in that
the second step changes the content of the first factor according to at least one of whether the subject included in the image group is moving and whether the subject image includes a person.
21. The image evaluation method according to claim 17 or 18, characterized in that
a part of the first factor is shared with a part of the second factor.
22. The image evaluation method according to claim 21, characterized in that
the shared factor includes at least one of sharpness information of a predetermined region of each image constituting the image group and luminance distribution information of the entire region of each image.
23. The image evaluation method according to claim 17 or 18, characterized in that
the second factor includes at least one of a factor related to the expression of a person in the subject image included in the image group, a factor related to the face area of the person, and a factor related to the shake of the subject image.
24. The image evaluation method according to claim 17, characterized in that
a sixth step of displaying the images excluded by the third step on a display device is also performed.
25. The image evaluation method according to claim 24, characterized in that
a predetermined image among the images displayed on the display device by the sixth step is not excluded by the third step.
26. The image evaluation method according to claim 17, characterized in that
the third step excludes a predetermined number of images from the image group based on the operation results of the second step, and
the predetermined number can be changed.
27. The image evaluation method according to claim 26, characterized in that
the third step does not perform the exclusion when the predetermined number of images cannot be excluded based on the operation results of the second step,
when the third step does not perform the exclusion, the fourth step calculates an evaluation value based on the second factor for each image of the entire image group input by the first step, and
the fifth step ranks each image of the entire input image group based on the operation results of the fourth step.
28. The image evaluation method according to claim 26, characterized in that
the third step excludes a second predetermined number of images, smaller than the predetermined number, from the image group when the predetermined number of images cannot be excluded based on the operation results of the second step,
when the third step has excluded the second predetermined number of images, the fourth step calculates an evaluation value based on a second factor for each image included in the image group after the exclusion processing, the second factor including a factor different from the first factor, and
the fifth step ranks each image of the image group after the exclusion processing based on the operation results of the fourth step.
29. The image evaluation method according to claim 27 or 28, characterized in that
a seventh step of selecting an image to be saved based on the ranking assigned by the fifth step is also performed.
30. The image evaluation method according to claim 29, characterized in that
when there is no difference in the ranking, the seventh step selects the image obtained closest in time to the shooting instruction and having the highest evaluation value based on the second factor.
31. An image processing apparatus, characterized by
comprising a computer capable of executing the image evaluation method according to any one of claims 17 to 30,
wherein the image group is input from a recording medium on which a plurality of images are recorded, and each step of the image evaluation method is executed.
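For readers approaching the claims from an implementation angle, the following minimal Python sketch mirrors the five steps of the image evaluation method recited in claim 17; the scoring functions, the exclusion condition, and all names are illustrative placeholders rather than the claimed implementation:

```python
# Hypothetical sketch of the two-stage evaluation of claim 17:
# first-factor scoring, exclusion, second-factor scoring, ranking.
from typing import Callable, List, Sequence, Tuple


def evaluate_and_rank(
    images: Sequence,                                # first step: input image group
    evaluate_factor1: Callable[[object], float],
    evaluate_factor2: Callable[[object], float],
    exclude_if: Callable[[float], bool],
) -> List[Tuple[object, float]]:
    # Second step: evaluation value based on the first factor for every image.
    first_scores = [(img, evaluate_factor1(img)) for img in images]
    # Third step: exclude images whose result satisfies the predetermined condition.
    kept = [img for img, score in first_scores if not exclude_if(score)]
    # Fourth step: evaluation value based on the second factor for the remainder.
    second_scores = [(img, evaluate_factor2(img)) for img in kept]
    # Fifth step: rank the non-excluded images by the second-factor value.
    return sorted(second_scores, key=lambda pair: pair[1], reverse=True)


# Usage with toy stand-ins for the two factors (all values are illustrative):
ranking = evaluate_and_rank(
    images=["a", "bb", "ccc", "dddd"],
    evaluate_factor1=len,                         # placeholder first-factor score
    evaluate_factor2=lambda img: img.count("d"),  # placeholder second-factor score
    exclude_if=lambda score: score < 3,           # placeholder exclusion condition
)
print(ranking)  # -> [('dddd', 4), ('ccc', 0)]
```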
CN201110083627.4A 2010-03-30 2011-03-30 Image processing apparatus and image evaluation method Active CN102209196B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610416864.0A CN105939456A (en) 2010-03-30 2011-03-30 Image processing device and image estimating method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2010077746 2010-03-30
JP2010-077746 2010-03-30
JP2011-046188 2011-03-03
JP2011046188A JP4998630B2 (en) 2010-03-30 2011-03-03 Image processing apparatus and image evaluation program

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN201610416864.0A Division CN105939456A (en) 2010-03-30 2011-03-30 Image processing device and image estimating method

Publications (2)

Publication Number Publication Date
CN102209196A CN102209196A (en) 2011-10-05
CN102209196B true CN102209196B (en) 2016-08-03

Family

ID=44697837

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201110083627.4A Active CN102209196B (en) 2010-03-30 2011-03-30 Image processing apparatus and image evaluation method

Country Status (1)

Country Link
CN (1) CN102209196B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101949218B1 (en) * 2012-09-25 2019-02-18 삼성전자 주식회사 Method and apparatus for photographing in portable terminal
US9690980B2 (en) * 2012-11-09 2017-06-27 Google Inc. Automatic curation of digital images
JP5888614B2 (en) * 2013-03-21 2016-03-22 カシオ計算機株式会社 IMAGING DEVICE, VIDEO CONTENT GENERATION METHOD, AND PROGRAM
US20150071547A1 (en) 2013-09-09 2015-03-12 Apple Inc. Automated Selection Of Keeper Images From A Burst Photo Captured Set
CN105654470B (en) * 2015-12-24 2018-12-11 小米科技有限责任公司 Image choosing method, apparatus and system
CN107590459A (en) * 2017-09-11 2018-01-16 广东欧珀移动通信有限公司 The method and apparatus for delivering evaluation
CN107680386A (en) * 2017-11-07 2018-02-09 潘柏霖 A kind of intelligent traffic monitoring system
CN110099207B (en) * 2018-01-31 2020-12-01 成都极米科技股份有限公司 Effective image calculation method for overcoming camera instability
CN108513068B (en) * 2018-03-30 2021-03-02 Oppo广东移动通信有限公司 Image selection method and device, storage medium and electronic equipment
CN108900774A (en) * 2018-08-03 2018-11-27 崔跃 A kind of automatic shooting system with auxiliary shooting function
WO2020155052A1 (en) * 2019-01-31 2020-08-06 华为技术有限公司 Method for selecting images based on continuous shooting and electronic device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101136066A (en) * 2006-07-25 2008-03-05 富士胶片株式会社 System for and method of taking image and computer program
CN101534394A (en) * 2008-02-08 2009-09-16 卡西欧计算机株式会社 Imaging apparatus and imaging method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4347498B2 (en) * 2000-05-10 2009-10-21 オリンパス株式会社 Electronic camera and image processing apparatus
JP5304002B2 (en) * 2008-04-11 2013-10-02 株式会社ニコン Imaging apparatus, image selection method, and program

Also Published As

Publication number Publication date
CN102209196A (en) 2011-10-05

Similar Documents

Publication Publication Date Title
CN102209196B (en) Image processing apparatus and image evaluation method
CN105939456A (en) Image processing device and image estimating method
CN102300049B (en) Image signal processing apparatus
CN101582989B (en) Image capture apparatus
CN1871847B (en) Signal processing system, signal processing method
US20050200722A1 (en) Image capturing apparatus, image capturing method, and machine readable medium storing thereon image capturing program
US20120206619A1 (en) Image processing apparatus, image capturing apparatus and recording medium
US8244059B2 (en) Image processing method, apparatus, recording medium, and image pickup apparatus
CN101621631B (en) Imaging apparatus
JP4468734B2 (en) Video signal processing apparatus and video signal processing program
US8335399B2 (en) Image processing apparatus, control method therefor, and program
KR20140016401A (en) Method and apparatus for capturing images
CN101888478B (en) Image capturing apparatus, data generating apparatus, and data structure
CN109672819A (en) Image processing method, device, electronic equipment and computer readable storage medium
CN108520493A (en) Processing method, device, storage medium and the electronic equipment that image is replaced
CN104796600B (en) Image synthesizer and image combining method
CN104883480A (en) Imaging Device And Imaging Method
KR100926133B1 (en) Method and apparatus for producing and taking digital contents
CN104243804B (en) Picture pick-up device, image processing equipment and its control method
US9294685B2 (en) Image processing apparatus, electronic camera, and medium storing image processing program
JP5862071B2 (en) Image processing apparatus, imaging apparatus, and program
CN107483809A (en) A kind of image capturing method, terminal and computer-readable recording medium
CN102090054B (en) Imaging device, image processing program, image processing device, and image processing method
JP7308376B2 (en) Electronics
JP2003333381A (en) Imaging apparatus with image evaluation function

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant