WO2014125682A1 - Imaging apparatus and focusing control method
- Publication number
- WO2014125682A1 (PCT/JP2013/079888; JP2013079888W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- focus
- evaluation value
- focus lens
- sharpness
- evaluation values
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
- G02B7/36—Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
- G02B7/38—Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals measured at different points on the optical axis, e.g. focussing on two or more planes and comparing image data
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B13/00—Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
- G03B13/32—Means for focusing
- G03B13/34—Power focusing
- G03B13/36—Autofocus systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/673—Focus control based on electronic image sensor signals based on contrast or high frequency components of image signals, e.g. hill climbing method
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/72—Combination of two or more compensation controls
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/84—Camera processing pipelines; Components thereof for processing colour signals
- H04N23/88—Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
Definitions
- the present invention relates to an imaging apparatus having an autofocus function.
- In recent years, with the increase in the resolution of solid-state imaging elements such as CCD (Charge Coupled Device) image sensors and CMOS (Complementary Metal Oxide Semiconductor) image sensors, demand has been rapidly increasing for information devices having a photographing function, such as digital still cameras, digital video cameras, mobile phones including smartphones, and PDAs (Personal Digital Assistants; mobile information terminals). In this description, an information device having such an imaging function is referred to as an imaging apparatus.
- These imaging apparatuses employ a contrast AF (Auto Focus) method or a phase difference AF method as a focusing control method for focusing on a main subject.
- The contrast AF method acquires the contrast of the captured image obtained at each driving step as an evaluation value while driving the focus lens along the optical axis direction, and sets the lens position giving the highest evaluation value as the in-focus position.
- the focus lens is a lens that adjusts the focal length of the photographing optical system by moving in the optical axis direction.
- In a lens unit composed of a plurality of lenses, the focus lens may be a single lens that adjusts the focal position, or the entire lens group may be moved as the focus lens.
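As an illustration of the contrast AF scan described above, here is a minimal sketch in Python. The specification does not fix a particular contrast formula; the sum of absolute differences between adjacent pixels used here is one common choice and is an assumption.

```python
def af_evaluation_value(image_rows):
    # Contrast-type AF evaluation value: sum of absolute luminance
    # differences between horizontally adjacent pixels. A sharper
    # (better focused) image yields a larger value.
    return sum(abs(row[i + 1] - row[i])
               for row in image_rows
               for i in range(len(row) - 1))

def scan_for_peak(positions, frames):
    # Hill-climbing scan: evaluate the image captured at each focus lens
    # position and return the position with the highest evaluation value,
    # together with the full list of evaluation values.
    values = [af_evaluation_value(f) for f in frames]
    best = max(range(len(values)), key=values.__getitem__)
    return positions[best], values
```

In the embodiment, the peak evaluation value and its two neighbors on the scanned curve are then used for the in-focus position calculation.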
- Patent Document 2 describes a method of obtaining a function of a curve passing through the above three points and calculating a lens position corresponding to the maximum point of the evaluation value curve from this function.
- a spline function or a Bezier function is used as the function.
- Depending on the subject, the vicinity of the maximum point of the evaluation value curve may have a steep mountain shape.
- In particular, when the solid-state imaging element is not equipped with an optical low-pass filter, the captured image contains more high-frequency components, so the vicinity of the maximum point of the evaluation value curve tends to become steep.
- Moreover, many recent imaging apparatuses are equipped with a wide-angle lens, and the vicinity of the maximum point of the evaluation value curve also tends to be steep under the influence of the wide-angle lens. Likewise, since recent cameras often have a bright (fast) lens, the vicinity of the maximum point of the evaluation value curve tends to be steep.
- the method of obtaining the focus position using the slope of the straight line can obtain the focus position with a certain degree of accuracy regardless of the shape of the evaluation value curve.
- the error tends to increase as the evaluation value sampling number is decreased in order to improve the AF speed.
- The present invention has been made in view of the above circumstances, and an object thereof is to provide an imaging apparatus and a focusing control method capable of performing AF with high accuracy regardless of the subject to be photographed while realizing high-speed AF.
- The imaging apparatus of the present invention includes: a focus lens that can move in an optical axis direction; an imaging element that captures an image of a subject through the focus lens; an evaluation value calculation unit that calculates an evaluation value for focusing, for each position of the focus lens while the focus lens is moved, using a captured image signal obtained by imaging with the imaging element; a sharpness calculation unit that calculates the sharpness near the maximum point of an evaluation value curve indicating the relationship between the evaluation value and the position of the focus lens, using at least three evaluation values calculated by the evaluation value calculation unit and information on the position of the focus lens corresponding to each of the at least three evaluation values; a focus position calculation unit that selects, according to the sharpness calculated by the sharpness calculation unit, at least one of a plurality of calculation methods, each of which uses the at least three evaluation values and the corresponding focus lens position information, and calculates, by the selected calculation method, the position of the focus lens corresponding to the maximum point of the evaluation value curve as an in-focus position; and a focus control unit that performs focus control to move the focus lens to the in-focus position.
- The plurality of calculation methods include a first calculation method that calculates a multi-order function indicating the evaluation value curve from the at least three evaluation values and the focus lens position information corresponding to each of them and calculates the in-focus position using the multi-order function, and a second calculation method that calculates first-order (linear) functions from the same data and calculates the in-focus position using the first-order functions.
- The at least three evaluation values include the maximum value among the evaluation values calculated by the evaluation value calculation unit and the evaluation values calculated for the focus lens positions before and after the position corresponding to that maximum value.
- The focusing control method of the present invention is a focusing control method for an imaging apparatus having an imaging element that captures an image of a subject through a focus lens movable in an optical axis direction, and includes: an evaluation value calculating step of calculating an evaluation value for focusing, for each position of the focus lens while the focus lens is moved, using a captured image signal obtained by imaging with the imaging element; a sharpness calculating step of calculating the sharpness near the maximum point of an evaluation value curve indicating the relationship between the evaluation value and the position of the focus lens, using at least three evaluation values calculated in the evaluation value calculating step and information on the position of the focus lens corresponding to each of the at least three evaluation values; a focus position calculating step of selecting, according to the sharpness calculated in the sharpness calculating step, at least one of a plurality of calculation methods, each of which uses the at least three evaluation values and the corresponding focus lens position information, and calculating, by the selected calculation method, the position of the focus lens corresponding to the maximum point of the evaluation value curve as an in-focus position; and a focus control step of performing focus control to move the focus lens to the in-focus position.
- The plurality of calculation methods include a first calculation method that calculates a multi-order function indicating the evaluation value curve from the at least three evaluation values and the focus lens position information corresponding to each of them and calculates the in-focus position using the multi-order function, and a second calculation method that calculates first-order functions from the same data and calculates the in-focus position using the first-order functions. The at least three evaluation values include the maximum value among the evaluation values calculated in the evaluation value calculating step and the evaluation values calculated for the positions before and after the focus lens position corresponding to that maximum value.
- According to the present invention, it is possible to provide an imaging apparatus and a focusing control method capable of performing AF with high accuracy regardless of the subject to be photographed while realizing high-speed AF.
- FIG. 4 is a diagram for explaining the calculation error of the in-focus position by the first calculation method and the second calculation method when the AF evaluation value curve is as shown in FIG.
- Flowchart for explaining the AF operation of the digital camera shown in FIG.
- Flowchart for explaining a modification of the AF operation of the digital camera shown in FIG.
- Diagram explaining a smartphone as an imaging apparatus
- Internal block diagram of the smartphone of FIG.
- FIG. 1 is a diagram showing a schematic configuration of a digital camera as an example of an imaging apparatus for explaining an embodiment of the present invention.
- the imaging system of the digital camera shown in FIG. 1 includes an imaging optical system (including a photographing lens 1 and a diaphragm 2) and a solid-state imaging device 5 such as a CCD type or a CMOS type.
- An imaging optical system including the photographing lens 1 and the diaphragm 2 is detachable or fixed to the camera body.
- the photographing lens 1 includes a focus lens that can move in the optical axis direction.
- the solid-state imaging device 5 is not equipped with an optical low-pass filter, and thereby achieves high resolution.
- the system control unit 11 that controls the entire electric control system of the digital camera controls the flash light emitting unit 12 and the light receiving unit 13. Further, the system control unit 11 controls the lens driving unit 8 to adjust the position of the focus lens included in the photographing lens 1. Further, the system control unit 11 adjusts the exposure amount by controlling the aperture amount of the aperture 2 via the aperture drive unit 9.
- system control unit 11 drives the solid-state imaging device 5 via the imaging device driving unit 10 and outputs a subject image captured through the photographing lens 1 as a captured image signal.
- An instruction signal from the user is input to the system control unit 11 through the operation unit 14.
- The electric control system of the digital camera further includes an analog signal processing unit 6, connected to the output of the solid-state imaging element 5, that performs analog signal processing such as correlated double sampling, and an A/D conversion circuit 7 that converts the analog signal output from the analog signal processing unit 6 into a digital signal.
- the analog signal processing unit 6 and the A / D conversion circuit 7 are controlled by the system control unit 11.
- the analog signal processing unit 6 and the A / D conversion circuit 7 may be built in the solid-state imaging device 5.
- The electric control system of the digital camera further includes a main memory 16, a memory control unit 15 connected to the main memory 16, a digital signal processing unit 17 that performs interpolation calculation, gamma correction, RGB/YC conversion processing, and the like on the captured image signal output from the A/D conversion circuit 7 to generate captured image data, and a processing unit that compresses the captured image data generated by the digital signal processing unit 17 into JPEG format or expands the compressed image data.
- FIG. 2 is a functional block diagram of the contrast AF processing unit 19 in the digital camera shown in FIG.
- the contrast AF processing unit 19 includes an AF evaluation value calculation unit 191, a sharpness calculation unit 192, and an in-focus position calculation unit 193. These functional blocks are formed when a processor included in the system control unit 11 executes a program.
- The sharpness calculation unit 192 uses at least three AF evaluation values, including the maximum value among the AF evaluation values calculated by the AF evaluation value calculation unit 191 and the two AF evaluation values calculated for the focus lens positions before and after the position corresponding to that maximum value, together with information on the focus lens position corresponding to each of the at least three AF evaluation values, to calculate the sharpness in the vicinity of the maximum point (the point at which the evaluation value becomes maximum) of the evaluation value curve showing the relationship between focus lens position and AF evaluation value for the subject being shot.
- FIGS. 3 and 4 are diagrams showing examples of the evaluation value curve.
- the evaluation value curve 30 has a low sharpness near the maximum point.
- the evaluation value curve 40 has a high sharpness near the maximum point.
- FIG. 3 shows at least three AF evaluation values (y0, y1, y2) and focus lens position information (x0, x1, x2) corresponding to each of the at least three AF evaluation values.
- the AF evaluation value y1 is the maximum value among the AF evaluation values calculated by the AF evaluation value calculation unit 191.
- the sharpness calculation unit 192 calculates the sharpness S as an index indicating the sharpness near the maximum point of the evaluation value curve.
- Alternatively, the sharpness calculation unit 192 calculates, instead of the sharpness S, a skewness (skew) representing the degree of left-right asymmetry of the data distribution around its mean value, as an index indicating the sharpness near the maximum point of the evaluation value curve.
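The specification's exact formulas for the sharpness S and the skewness are not reproduced in this text, so the following is only an illustrative sketch of comparable indices computed from the three sampled points; both formulas here are assumptions.

```python
def sharpness_index(y0, y1, y2):
    # Illustrative sharpness index S: normalized discrete curvature at the
    # peak sample (y1 is the maximum AF evaluation value). Larger values
    # mean a steeper mountain shape near the maximum point.
    return (2.0 * y1 - y0 - y2) / y1

def skewness(xs, ys):
    # Skewness of the lens-position distribution weighted by the AF
    # evaluation values: third standardized moment about the weighted
    # mean, indicating left-right asymmetry of the evaluation value curve.
    total = sum(ys)
    mean = sum(x * y for x, y in zip(xs, ys)) / total
    var = sum(y * (x - mean) ** 2 for x, y in zip(xs, ys)) / total
    sd = var ** 0.5
    return sum(y * ((x - mean) / sd) ** 3 for x, y in zip(xs, ys)) / total
```

A left-right symmetric curve gives a skewness near zero; a curve rising more steeply on one side gives a nonzero value whose sign indicates the direction of the asymmetry.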
- The in-focus position calculation unit 193 selects, according to the sharpness calculated by the sharpness calculation unit 192, at least one of a plurality of calculation methods, each of which uses the at least three AF evaluation values (y0, y1, y2) described above and the focus lens position information (x0, x1, x2) corresponding to each of them, and calculates the focus lens position corresponding to the maximum point of the evaluation value curve as the in-focus position by the selected calculation method.
- The first calculation method uses (x0, y0), (x1, y1), and (x2, y2) to calculate a quadratic function indicating an evaluation value curve passing through these three points, and calculates the in-focus position using the calculated quadratic function.
- Specifically, the in-focus position calculation unit 193 sets up simultaneous equations for a parabola passing through the three points (x0, y0), (x1, y1), and (x2, y2), and obtains the quadratic function indicating this parabola by solving the equations.
- The vertex of the parabola (corresponding to the maximum point of the evaluation value curve) is then found by differentiating the obtained quadratic function and setting the derivative to zero, and its position is taken as the in-focus position.
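The first calculation method can be sketched as follows: solving the simultaneous equations for the parabola and setting its derivative to zero reduces, after algebra, to a closed-form expression for the vertex.

```python
def parabola_peak(x0, y0, x1, y1, x2, y2):
    # Vertex x coordinate of the parabola through the three
    # (focus lens position, AF evaluation value) samples: this is where
    # the derivative of the fitted quadratic function is zero, i.e. the
    # maximum point of the evaluation value curve.
    num = y0 * (x1**2 - x2**2) + y1 * (x2**2 - x0**2) + y2 * (x0**2 - x1**2)
    den = 2.0 * (y0 * (x1 - x2) + y1 * (x2 - x0) + y2 * (x0 - x1))
    return num / den
```

For the samples (0, 0), (1, 1), (2, 0), which lie on y = -x^2 + 2x, the function returns 1.0, the true maximum.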
- The second calculation method also uses (x0, y0), (x1, y1), and (x2, y2): it calculates a linear function passing through two of the points, then calculates a linear function passing through the remaining point whose slope has the same magnitude but the opposite sign, and calculates the in-focus position using the two calculated linear functions.
- Specifically, the in-focus position calculation unit 193 first calculates a linear function indicating a straight line (reference numeral 31 in FIG. 3) passing through (x0, y0) and (x1, y1), and then calculates a linear function indicating a straight line (reference numeral 32 in FIG. 3) passing through (x1, y1) and (x2, y2).
- The in-focus position calculation unit 193 then calculates, as the in-focus position, the x coordinate (xg in FIG. 3) of the point where the straight line 31 and the straight line 33 intersect, using the linear functions indicating the straight line 31 and the straight line 33.
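A sketch of the second calculation method, under one reading of the description above: the second straight line passes through the remaining point with the slope's sign reversed. Since the original figure is not available here, that pairing is an assumption.

```python
def line_intersection_peak(x0, y0, x1, y1, x2, y2):
    # Slope of the straight line through the two samples on the rising
    # side of the peak (line 31 in FIG. 3).
    m = (y1 - y0) / (x1 - x0)
    # A second straight line is drawn through the remaining sample
    # (x2, y2) with the slope's sign reversed (line 33). Setting
    #   m*(x - x0) + y0 == -m*(x - x2) + y2
    # and solving for x gives the intersection, taken as the in-focus
    # position xg.
    return (x0 + x2) / 2.0 + (y2 - y0) / (2.0 * m)
```

For a symmetric tent-shaped curve sampled at (0, 0), (1, 10), (2, 0), the two lines cross at x = 1, the true peak.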
- The first calculation method is advantageous for increasing the AF speed because the AF accuracy can be maintained even if the number of AF evaluation value samplings is reduced.
- However, when the evaluation value curve is steep near the maximum point, the error of the in-focus position calculated by the first calculation method becomes large.
- FIG. 5 is a diagram showing the in-focus position x(1) calculated by the first calculation method and the in-focus position x(2) calculated by the second calculation method in the case of the evaluation value curve shown in FIG.
- the in-focus position obtained by the second calculation method is closer to an accurate value.
- The result shown in FIG. 5 is an example; depending on conditions, the difference between the in-focus positions calculated by the first and second calculation methods may be even larger. Even with a difference of the degree shown in FIG. 5, the focus shift affects image quality, so it is better to use the second calculation method in this case.
- the system control unit 11 moves the focus lens from the MOD end to the INF end. While the focus lens is moving, imaging is performed by the solid-state imaging device 5 every predetermined time, and a captured image signal obtained by this imaging is sent to the contrast AF processing unit 19.
- In step S3, if the sharpness exceeds the threshold value TH1 (step S3: YES), the in-focus position calculation unit 193 calculates the in-focus position by the second calculation method using the data of the point before the peak, the peak point, and the point after the peak (step S5).
- When the in-focus position has been calculated in step S4 or S5, the system control unit 11 performs control to move the focus lens to the in-focus position in accordance with the calculated in-focus position information (step S6), and the focusing process ends.
- As described above, this digital camera can calculate the sharpness near the maximum point of the evaluation value curve and can improve AF accuracy by switching the calculation method according to this sharpness. For this reason, there is no need to improve AF accuracy by slowing the movement of the focus lens and increasing the number of AF evaluation value samplings; the AF accuracy can therefore be improved while increasing the AF speed.
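The switching between the two calculation methods can be summarized in a self-contained sketch. The threshold TH1, the sharpness formula, and the two per-method formulas here are illustrative assumptions standing in for the embodiment's flowchart.

```python
def calculate_in_focus_position(x0, y0, x1, y1, x2, y2, th1=0.5):
    # Illustrative sharpness index (normalized curvature at the peak).
    sharpness = (2.0 * y1 - y0 - y2) / y1
    if sharpness > th1:
        # Steep peak (step S3: YES) -> second calculation method:
        # intersection of two straight lines.
        m = (y1 - y0) / (x1 - x0)
        return (x0 + x2) / 2.0 + (y2 - y0) / (2.0 * m)
    # Gentle peak (step S3: NO) -> first calculation method:
    # vertex of the quadratic through the three samples.
    num = y0 * (x1**2 - x2**2) + y1 * (x2**2 - x0**2) + y2 * (x0**2 - x1**2)
    den = 2.0 * (y0 * (x1 - x2) + y1 * (x2 - x0) + y2 * (x0 - x1))
    return num / den
```

Either branch returns the focus lens position corresponding to the maximum point of the evaluation value curve; only the interpolation model differs.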
- The curve function used in the first calculation method above is a quadratic function indicating a left-right symmetric parabola.
- However, the evaluation value curve may not be left-right symmetric, for example depending on the moving speed of the subject.
- In such a case, it is preferable to calculate the in-focus position using not only a quadratic function but also a higher-order function (for example, a spline curve function or a Bezier curve function) as the curve function.
- When the determination in step S3 is YES, the in-focus position calculation unit 193 determines the amount of noise included in the AF evaluation values calculated in step S1 (step S10); if the noise amount exceeds the threshold value TH2, the process of step S4 is performed, and if the noise amount is equal to or less than TH2, the process of step S5 is performed.
- This can prevent the second calculation method, whose error becomes relatively large when the AF evaluation values contain much noise, from being employed even though the evaluation value curve is as shown in FIG.
- When panning, if the moving speed of the subject (synonymous with the moving speed of the camera that follows it) is high, the evaluation value curve does not have a left-right symmetric shape. In such a shooting situation, where a left-right asymmetric evaluation value curve is obtained, the error is smaller when the in-focus position is calculated using a cubic or higher-order curve function such as a spline curve function or a Bezier curve function than when it is calculated using a quadratic function or linear functions.
- step S2 the process after step S2 is performed.
- step S10 may be added as in FIG.
- Depending on the subject, the evaluation value curve may have a high sharpness as shown in FIG. 4 even when an optical low-pass filter is present. For this reason, it is effective to adopt the AF control described in the present embodiment even when a solid-state imaging element equipped with an optical low-pass filter is used as the solid-state imaging element 5.
- FIG. 10 is a block diagram showing a configuration of the smartphone 200 shown in FIG.
- The main components of the smartphone include a wireless communication unit 210, a display input unit 204, a call unit 211, an operation unit 207, a camera unit 208, a storage unit 212, an external input/output unit 213, a GPS (Global Positioning System) receiving unit 214, a motion sensor unit 215, a power supply unit 216, and a main control unit 220.
- The smartphone 200 also has a wireless communication function for performing mobile wireless communication via a base station device BS (not shown) and a mobile communication network NW (not shown).
- the wireless communication unit 210 performs wireless communication with the base station apparatus BS accommodated in the mobile communication network NW according to an instruction from the main control unit 220. Using this wireless communication, transmission and reception of various file data such as audio data and image data, e-mail data, and reception of Web data and streaming data are performed.
- the display panel 202 uses an LCD (Liquid Crystal Display), an OELD (Organic Electro-Luminescence Display), or the like as a display device.
- the display panel 202 and the operation panel 203 of the smartphone 200 exemplified as an embodiment of the photographing apparatus of the present invention integrally constitute a display input unit 204.
- The operation panel 203 is arranged so as to completely cover the display panel 202.
- the operation panel 203 may have a function of detecting a user operation even in an area outside the display panel 202.
- In this case, the operation panel 203 may include a detection area (hereinafter referred to as a display area) for the portion overlapping the display panel 202 and a detection area (hereinafter referred to as a non-display area) for the outer edge portion that does not overlap the display panel 202.
- In other words, the operation panel 203 may have two sensitive areas: the outer edge portion and the inner portion. The width of the outer edge portion is designed as appropriate according to the size of the housing 201 and the like.
- Examples of the position detection method employed in the operation panel 203 include a matrix switch method, a resistive film method, a surface acoustic wave method, an infrared method, an electromagnetic induction method, and a capacitance method; any of these methods may be adopted.
- The call unit 211 includes a speaker 205 and a microphone 206; it converts the user's voice input through the microphone 206 into voice data that can be processed by the main control unit 220 and outputs it to the main control unit 220, and it decodes voice data received by the wireless communication unit 210 or the external input/output unit 213 and outputs it from the speaker 205.
- the speaker 205 can be mounted on the same surface as the display input unit 204 and the microphone 206 can be mounted on the side surface of the housing 201.
- the operation unit 207 is a hardware key using a key switch or the like, and receives an instruction from the user.
- For example, the operation unit 207 is a push-button switch that is mounted on a side surface of the housing 201 of the smartphone 200, is turned on when pressed with a finger or the like, and is turned off by the restoring force of a spring or the like when the finger is released.
- the storage unit 212 includes a control program and control data of the main control unit 220, application software, address data that associates the name and telephone number of a communication partner, transmitted / received e-mail data, Web data downloaded by Web browsing, The downloaded content data is stored, and streaming data and the like are temporarily stored.
- the storage unit 212 includes an internal storage unit 217 built in the smartphone and an external storage unit 218 having a removable external memory slot.
- Each of the internal storage unit 217 and the external storage unit 218 constituting the storage unit 212 is realized using a storage medium such as a flash memory type, hard disk type, multimedia card micro type, or card type memory (for example, MicroSD (registered trademark) memory), RAM (Random Access Memory), or ROM (Read Only Memory).
- The external input/output unit 213 serves as an interface with all external devices connected to the smartphone 200, and connects to other external devices directly or indirectly by communication (for example, universal serial bus (USB) or IEEE 1394) or via a network (for example, the Internet, wireless LAN, Bluetooth (registered trademark), RFID (Radio Frequency Identification), Infrared Data Association (IrDA) (registered trademark), UWB (Ultra Wideband) (registered trademark), or ZigBee (registered trademark)).
- Examples of external devices connected to the smartphone 200 include a wired/wireless headset, a wired/wireless external charger, a wired/wireless data port, a memory card or SIM (Subscriber Identity Module) / UIM (User Identity Module) card connected via a card socket, external audio/video equipment connected via an audio/video I/O (Input/Output) terminal, and wirelessly connected external audio/video equipment.
- The external input/output unit 213 can transmit data received from such external devices to each component inside the smartphone 200, and can transmit data inside the smartphone 200 to external devices.
- The GPS receiving unit 214 receives GPS signals transmitted from GPS satellites ST1 to STn in accordance with an instruction from the main control unit 220, executes positioning calculation processing based on the received GPS signals, and detects the position of the smartphone 200 consisting of latitude, longitude, and altitude.
- When the GPS receiving unit 214 can acquire position information from the wireless communication unit 210 or the external input/output unit 213 (for example, via a wireless LAN), it can also detect the position using that position information.
- the motion sensor unit 215 includes, for example, a three-axis acceleration sensor, and detects the physical movement of the smartphone 200 in accordance with an instruction from the main control unit 220. By detecting the physical movement of the smartphone 200, the moving direction and acceleration of the smartphone 200 are detected. The detection result is output to the main control unit 220.
- the power supply unit 216 supplies power stored in a battery (not shown) to each unit of the smartphone 200 in accordance with an instruction from the main control unit 220.
- the main control unit 220 includes a microprocessor, operates according to a control program and control data stored in the storage unit 212, and controls each unit of the smartphone 200 in an integrated manner.
- the main control unit 220 includes a mobile communication control function that controls each unit of the communication system and an application processing function in order to perform voice communication and data communication through the wireless communication unit 210.
- the application processing function is realized by the main control unit 220 operating according to the application software stored in the storage unit 212.
- Examples of the application processing function include an infrared communication function for controlling the external input / output unit 213 to perform data communication with the opposite device, an e-mail function for transmitting / receiving e-mails, and a web browsing function for browsing web pages. .
- the main control unit 220 executes display control for the display panel 202 and operation detection control for detecting a user operation through the operation unit 207 and the operation panel 203.
- For example, by executing the display control, the main control unit 220 displays an icon for starting application software, a software key such as a scroll bar, or a window for creating an e-mail on the display panel 202.
- Note that the scroll bar refers to a software key for accepting an instruction to move the displayed portion of an image that is too large to fit in the display area of the display panel 202.
- the camera unit 208 includes configurations other than the external memory control unit 20, the recording medium 21, the display control unit 22, the display unit 23, and the operation unit 14 in the digital camera shown in FIG.
- The captured image data generated by the camera unit 208 can be recorded in the storage unit 212 or output through the external input/output unit 213 or the wireless communication unit 210.
- The camera unit 208 is mounted on the same surface as the display input unit 204, but the mounting position of the camera unit 208 is not limited to this; it may be mounted on the back surface of the display input unit 204.
- the camera unit 208 can be used for various functions of the smartphone 200.
- an image acquired by the camera unit 208 can be displayed on the display panel 202, or the image of the camera unit 208 can be used as one of operation inputs of the operation panel 203.
- when the GPS receiving unit 214 detects a position, the position can also be detected with reference to an image from the camera unit 208.
- further, with reference to an image from the camera unit 208, the optical axis direction of the camera unit 208 of the smartphone 200 can be determined without using the triaxial acceleration sensor, or in combination with the triaxial acceleration sensor, and the current usage environment can also be determined.
- the image from the camera unit 208 can also be used in the application software.
- position information acquired by the GPS receiver 214, voice information acquired by the microphone 206 (which may have been converted into text information by the main control unit or the like), posture information acquired by the motion sensor unit 215, and the like can be added to the image data of a still image or a moving image and recorded in the storage unit 212, or output through the input / output unit 213 or the wireless communication unit 210.
- the camera unit 208 is provided with the contrast AF processing unit 19 of FIG.
- the disclosed imaging device includes: a focus lens movable in an optical axis direction; an imaging element that images a subject through the focus lens; an evaluation value calculation unit that calculates, for each position of the focus lens while the focus lens is moved, an evaluation value for focusing using a captured image signal obtained by imaging with the imaging element; a sharpness calculation unit that calculates the sharpness near the maximum point of an evaluation value curve, indicating the relationship between the evaluation value and the position of the focus lens, using at least three evaluation values calculated by the evaluation value calculation unit and information on the position of the focus lens corresponding to each of the at least three evaluation values; a focus position calculation unit that selects one from among a plurality of calculation methods, each using the at least three evaluation values and the corresponding focus lens position information, at least according to the sharpness calculated by the sharpness calculation unit, and calculates, by the selected calculation method, the position of the focus lens corresponding to the maximum point of the evaluation value curve as a focus position; and a focus control unit that performs focus control to move the focus lens to the focus position.
- in the disclosed imaging device, a higher-order function representing the evaluation value curve is calculated, and the higher-order function is used to determine the focus position.
- in the disclosed imaging device, the at least three evaluation values calculated by the evaluation value calculation unit include the maximum value among the calculated evaluation values and the evaluation values calculated for the focus lens positions immediately before and after the position corresponding to that maximum value.
- in the disclosed imaging device, the focus position calculation unit selects the second calculation method when the sharpness exceeds a threshold value, and selects the first calculation method when the sharpness is equal to or less than the threshold value.
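The selection rule above can be sketched as follows (a minimal sketch; `threshold` is a hypothetical tuning value — the disclosure specifies only the comparison, not its magnitude):

```python
def select_calculation_method(sharpness, threshold=0.05):
    """Pick the focus-position calculation method from the sharpness near
    the peak of the evaluation value curve. A sharp peak is handled by the
    second (first-order) method, a gentle peak by the first (higher-order)
    method. `threshold` is a hypothetical tuning value."""
    if sharpness > threshold:
        return "second"  # first-order (line-based) calculation
    return "first"       # higher-order (curve-fit) calculation
```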
- in the disclosed imaging device, when panning is being performed and the moving speed of the imaging device or the speed of a moving object included in the subject being shot exceeds a third threshold value, the focus position calculation unit calculates the focus position using the at least three evaluation values, the focus lens position information corresponding to each of the at least three evaluation values, and a function of third order or higher, regardless of the sharpness.
- in the disclosed imaging device, the imaging element has no optical low-pass filter mounted.
- the disclosed focus control method is a focus control method performed by an imaging apparatus having an imaging element that images a subject through a focus lens movable in an optical axis direction, and includes: an evaluation value calculation step of calculating, for each position of the focus lens while the focus lens is moved, an evaluation value for focusing using a captured image signal obtained by imaging with the imaging element; and a sharpness calculation step of calculating the sharpness near the maximum point of an evaluation value curve, indicating the relationship between the evaluation value and the position of the focus lens, using at least three evaluation values calculated in the evaluation value calculation step and information on the position of the focus lens corresponding to each of the at least three evaluation values.
- in the disclosed focus control method, the plurality of calculation methods include a first calculation method of calculating a higher-order function representing the evaluation value curve and using it to calculate the focus position, and a second calculation method of calculating a first-order function using the at least three evaluation values and the information on the position of the focus lens corresponding to each of them, and using the first-order function to calculate the focus position.
- the at least three evaluation values include the maximum value among the evaluation values calculated in the evaluation value calculation step and the evaluation values calculated for the positions before and after the position of the focus lens corresponding to the maximum value.
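As an illustration of the two calculation methods, here is a minimal sketch under the assumption that the higher-order function is a quadratic through the three samples and that the first-order method is the classic symmetric line-intersection interpolation — neither of which is specified in this detail by the disclosure:

```python
def peak_by_quadratic(x0, y0, x1, y1, x2, y2):
    """First method (sketch): vertex of the parabola through the three
    (lens position, evaluation value) samples, y1 being the maximum."""
    num = (x1 - x0) ** 2 * (y1 - y2) - (x1 - x2) ** 2 * (y1 - y0)
    den = (x1 - x0) * (y1 - y2) - (x1 - x2) * (y1 - y0)
    return x1 - 0.5 * num / den

def peak_by_lines(x0, y0, x1, y1, x2, y2):
    """Second method (sketch): line-intersection interpolation. A line of
    slope +s through the left sample and a line of slope -s through the
    right sample (s = steeper of the two side slopes) cross at the
    estimated peak."""
    s = max(abs((y1 - y0) / (x1 - x0)), abs((y2 - y1) / (x2 - x1)))
    return (x0 + x2) / 2 + (y2 - y0) / (2 * s)
```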
- the present invention is particularly convenient and effective when applied to a digital camera or the like.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Optics & Photonics (AREA)
- Studio Devices (AREA)
- Automatic Focus Adjustment (AREA)
- Focusing (AREA)
Abstract
Description
y0 = c0 ... (2)
is obtained. Likewise,
y1 = c0 + c1(x1 - x0) ... (3)
is obtained. Subtracting (2) from (3) gives y1 - y0 = c1(x1 - x0), and therefore
c1 = (y1 - y0)/(x1 - x0).
Next,
y2 = c0 + c1(x2 - x0) + c2(x2 - x0)(x2 - x1) ... (4)
is obtained. Subtracting (3) from (4) gives
y2 - y1 = c1(x2 - x1) + c2(x2 - x0)(x2 - x1)
= {c1 + c2(x2 - x0)}(x2 - x1),
and therefore
c2 = {(y2 - y1)/(x2 - x1) - c1}/(x2 - x0).
The inventors found that the sharpness is well evaluated by taking, as the sharpness S, the value of c2 multiplied by the normalization factor
{|x1 - x0| × |x1 - x2|}/y1,
that is,
S = {|c2| × |x1 - x0| × |x1 - x2|}/y1 ... (5).
With this index, the spread of the parabola can be judged with a single variable even when the magnitude of the AF evaluation value or the interval between sampling points differs, so the sharpness can be determined more accurately.
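The divided-difference derivation above, equations (2) through (5), can be written directly as code (a minimal sketch; variable names follow the equations):

```python
def sharpness(x0, y0, x1, y1, x2, y2):
    """Sharpness S near the peak of the evaluation value curve, per
    equations (2)-(5): Newton divided-difference coefficients of the
    parabola through the three (lens position, AF value) samples, with
    c2 normalized by the sampling intervals and the peak value y1."""
    c1 = (y1 - y0) / (x1 - x0)                     # first divided difference
    c2 = ((y2 - y1) / (x2 - x1) - c1) / (x2 - x0)  # second divided difference
    return abs(c2) * abs(x1 - x0) * abs(x1 - x2) / y1  # equation (5)
```

Note how the normalization works: scaling all evaluation values by a constant, or widening the sampling interval, leaves S unchanged, which is exactly the single-variable judgment described above.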
This application is based on Japanese Patent Application No. 2013-026862 filed on February 14, 2013, the contents of which are incorporated herein.
5 solid-state imaging element
19 contrast AF processing unit
191 AF evaluation value calculation unit
192 sharpness calculation unit
193 focus position calculation unit
Claims (7)
- A focus lens movable in an optical axis direction;
an imaging element that images a subject through the focus lens;
an evaluation value calculation unit that calculates, for each position of the focus lens while moving the focus lens, an evaluation value for focusing using a captured image signal obtained by imaging with the imaging element;
a sharpness calculation unit that calculates a sharpness near the maximum point of an evaluation value curve indicating the relationship of the evaluation value to the position of the focus lens, using at least three of the evaluation values calculated by the evaluation value calculation unit and information on the position of the focus lens corresponding to each of the at least three evaluation values;
a focus position calculation unit that selects one from among a plurality of types of calculation methods, each using the at least three evaluation values and the information on the position of the focus lens corresponding to each of the at least three evaluation values, at least according to the sharpness calculated by the sharpness calculation unit, and calculates, by the selected calculation method, the position of the focus lens corresponding to the maximum point of the evaluation value curve as a focus position; and
a focus control unit that performs focus control to move the focus lens to the focus position,
wherein the plurality of types of calculation methods include a first calculation method of calculating a higher-order function representing the evaluation value curve using the at least three evaluation values and the information on the position of the focus lens corresponding to each of the at least three evaluation values, and calculating the focus position using the higher-order function, and a second calculation method of calculating a first-order function using the at least three evaluation values and the information on the position of the focus lens corresponding to each of the at least three evaluation values, and calculating the focus position using the first-order function, and
the at least three evaluation values include the maximum value among the evaluation values calculated by the evaluation value calculation unit and the evaluation values calculated for positions before and after the position of the focus lens corresponding to the maximum value.
- The imaging device according to claim 1, wherein the focus position calculation unit selects one of the plurality of types of calculation methods according to the sharpness and an amount of noise included in the evaluation values.
- The imaging device according to claim 2, wherein the focus position calculation unit selects the second calculation method when the sharpness exceeds a first threshold value and the noise amount is equal to or less than a second threshold value, and selects the first calculation method when the sharpness exceeds the first threshold value and the noise amount exceeds the second threshold value, or when the sharpness is equal to or less than the first threshold value.
- The imaging device according to claim 1, wherein the focus position calculation unit selects the second calculation method when the sharpness exceeds a threshold value, and selects the first calculation method when the sharpness is equal to or less than the threshold value.
- The imaging device according to any one of claims 1 to 4, wherein, when panning is being performed and the moving speed of the imaging device or the speed of a moving object included in the subject being shot exceeds a third threshold value, the focus position calculation unit calculates the focus position using the at least three evaluation values, the information on the position of the focus lens corresponding to each of the at least three evaluation values, and a function of third order or higher, regardless of the sharpness.
- The imaging device according to any one of claims 1 to 5, wherein the imaging element has no optical low-pass filter mounted.
- A focus control method by an imaging apparatus having an imaging element that images a subject through a focus lens movable in an optical axis direction, the method comprising:
an evaluation value calculation step of calculating, for each position of the focus lens while moving the focus lens, an evaluation value for focusing using a captured image signal obtained by imaging with the imaging element;
a sharpness calculation step of calculating a sharpness near the maximum point of an evaluation value curve indicating the relationship of the evaluation value to the position of the focus lens, using at least three of the evaluation values calculated in the evaluation value calculation step and information on the position of the focus lens corresponding to each of the at least three evaluation values;
a focus position calculation step of selecting one from among a plurality of types of calculation methods, each using the at least three evaluation values and the information on the position of the focus lens corresponding to each of the at least three evaluation values, at least according to the sharpness calculated in the sharpness calculation step, and calculating, by the selected calculation method, the position of the focus lens corresponding to the maximum point of the evaluation value curve as a focus position; and
a focus control step of performing focus control to move the focus lens to the focus position,
wherein the plurality of types of calculation methods include a first calculation method of calculating a higher-order function representing the evaluation value curve using the at least three evaluation values and the information on the position of the focus lens corresponding to each of the at least three evaluation values, and calculating the focus position using the higher-order function, and a second calculation method of calculating a first-order function using the at least three evaluation values and the information on the position of the focus lens corresponding to each of the at least three evaluation values, and calculating the focus position using the first-order function, and
the at least three evaluation values include the maximum value among the evaluation values calculated in the evaluation value calculation step and the evaluation values calculated for positions before and after the position of the focus lens corresponding to the maximum value.
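Putting the claimed steps together, the scan-and-select flow might look like this (a minimal sketch, not the patented implementation; `evaluate` stands in for imaging with the sensor and computing a contrast evaluation value at a lens position, `threshold` is a hypothetical tuning value, and the higher-order function is assumed quadratic):

```python
def autofocus_scan(lens_positions, evaluate, threshold=0.05):
    """Sketch of the claimed flow: scan the lens, collect one AF
    evaluation value per position, take the maximum sample and its two
    neighbors, compute the sharpness S, and pick the peak-estimation
    method accordingly. Endpoint maxima are not handled here."""
    values = [evaluate(p) for p in lens_positions]        # evaluation value step
    i = max(range(1, len(values) - 1), key=values.__getitem__)
    (x0, y0), (x1, y1), (x2, y2) = [
        (lens_positions[j], values[j]) for j in (i - 1, i, i + 1)]
    c1 = (y1 - y0) / (x1 - x0)
    c2 = ((y2 - y1) / (x2 - x1) - c1) / (x2 - x0)
    s = abs(c2) * abs(x1 - x0) * abs(x1 - x2) / y1        # sharpness step
    if s > threshold:  # sharp peak: first-order (line-intersection) method
        slope = max(abs(c1), abs((y2 - y1) / (x2 - x1)))
        return (x0 + x2) / 2 + (y2 - y0) / (2 * slope)
    # gentle peak: higher-order (here quadratic) fit
    num = (x1 - x0) ** 2 * (y1 - y2) - (x1 - x2) ** 2 * (y1 - y0)
    den = (x1 - x0) * (y1 - y2) - (x1 - x2) * (y1 - y0)
    return x1 - 0.5 * num / den
```

The return value would then be handed to the focus control step, which moves the focus lens to the computed position.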
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201380073094.9A CN105190392B (zh) | 2013-02-14 | 2013-11-05 | 摄像装置和对焦控制方法 |
JP2015500091A JP5750558B2 (ja) | 2013-02-14 | 2013-11-05 | 撮像装置及び合焦制御方法 |
US14/825,678 US9703070B2 (en) | 2013-02-14 | 2015-08-13 | Imaging apparatus and focusing control method |
US15/614,216 US10095004B2 (en) | 2013-02-14 | 2017-06-05 | Imaging apparatus and focusing control method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-026862 | 2013-02-14 | ||
JP2013026862 | 2013-02-14 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/825,678 Continuation US9703070B2 (en) | 2013-02-14 | 2015-08-13 | Imaging apparatus and focusing control method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014125682A1 true WO2014125682A1 (ja) | 2014-08-21 |
Family
ID=51353701
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/079888 WO2014125682A1 (ja) | 2013-02-14 | 2013-11-05 | 撮像装置及び合焦制御方法 |
Country Status (4)
Country | Link |
---|---|
US (2) | US9703070B2 (ja) |
JP (1) | JP5750558B2 (ja) |
CN (1) | CN105190392B (ja) |
WO (1) | WO2014125682A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108139564A (zh) * | 2015-09-30 | 2018-06-08 | 富士胶片株式会社 | 对焦控制装置、摄像装置、对焦控制方法及对焦控制程序 |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6799924B2 (ja) * | 2016-02-16 | 2020-12-16 | 株式会社Screenホールディングス | 細胞観察装置および細胞観察方法 |
JP6602954B2 (ja) | 2016-03-18 | 2019-11-06 | 富士フイルム株式会社 | 合焦位置検出装置及び合焦位置検出方法 |
CN106101565B (zh) * | 2016-08-22 | 2020-04-10 | 浙江宇视科技有限公司 | 一种电动镜头聚焦方法及装置 |
CN106501917B (zh) * | 2016-12-07 | 2019-12-27 | 歌尔科技有限公司 | 一种镜头辅助调焦方法和装置 |
CN108345085A (zh) * | 2017-01-25 | 2018-07-31 | 广州康昕瑞基因健康科技有限公司 | 聚焦方法和聚焦*** |
CN113888761A (zh) * | 2017-05-19 | 2022-01-04 | 谷歌有限责任公司 | 使用环境传感器数据的高效的图像分析 |
CN109584198B (zh) * | 2017-09-26 | 2022-12-23 | 浙江宇视科技有限公司 | 一种人脸图像质量评价方法、装置及计算机可读存储介质 |
JP2020020991A (ja) * | 2018-08-02 | 2020-02-06 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd | 制御装置、方法およびプログラム |
KR102667875B1 (ko) * | 2018-10-29 | 2024-05-22 | 한화비전 주식회사 | 오토 포커스를 수행하는 이미지 촬영 장치 |
CN109480766A (zh) * | 2018-11-14 | 2019-03-19 | 深圳盛达同泽科技有限公司 | 视网膜自动对焦方法、装置、***及眼底相机 |
CN113703321B (zh) * | 2021-08-27 | 2024-05-14 | 西安应用光学研究所 | 用于车载光电伺服控制***的贝塞尔曲线缓动处理方法 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005345695A (ja) * | 2004-06-02 | 2005-12-15 | Nikon Corp | カメラ |
JP2009027212A (ja) * | 2007-06-20 | 2009-02-05 | Ricoh Co Ltd | 撮像装置 |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3103587B2 (ja) * | 1990-04-25 | 2000-10-30 | オリンパス光学工業株式会社 | 自動合焦装置 |
JP4265233B2 (ja) * | 2003-02-13 | 2009-05-20 | 株式会社ニコン | カメラ |
JP2004279721A (ja) * | 2003-03-14 | 2004-10-07 | Ricoh Co Ltd | 自動合焦装置 |
US7880800B2 (en) * | 2004-12-08 | 2011-02-01 | Fujifilm Corporation | Auto focus system that controls focusing speeds and movements based on image conditions |
JP4769616B2 (ja) * | 2006-03-27 | 2011-09-07 | 富士フイルム株式会社 | 駆動制御装置 |
JP4678603B2 (ja) * | 2007-04-20 | 2011-04-27 | 富士フイルム株式会社 | 撮像装置及び撮像方法 |
EP2007135B1 (en) | 2007-06-20 | 2012-05-23 | Ricoh Company, Ltd. | Imaging apparatus |
JP4518131B2 (ja) * | 2007-10-05 | 2010-08-04 | 富士フイルム株式会社 | 撮像方法及び装置 |
TWI374664B (en) * | 2007-12-05 | 2012-10-11 | Quanta Comp Inc | Focusing apparatus and method |
JP5328526B2 (ja) * | 2009-07-03 | 2013-10-30 | キヤノン株式会社 | 撮像装置 |
KR101593995B1 (ko) * | 2009-09-22 | 2016-02-15 | 삼성전자주식회사 | 자동 초점 조절 방법, 상기 방법을 기록한 기록 매체, 및 상기 방법을 실행하는 자동 초점 조절 장치 |
JP5445150B2 (ja) * | 2010-01-12 | 2014-03-19 | 株式会社リコー | 自動合焦制御装置、電子撮像装置及びデジタルスチルカメラ |
US8643730B2 (en) * | 2010-12-20 | 2014-02-04 | Samsung Electronics Co., Ltd | Imaging device and image capturing method |
-
2013
- 2013-11-05 WO PCT/JP2013/079888 patent/WO2014125682A1/ja active Application Filing
- 2013-11-05 JP JP2015500091A patent/JP5750558B2/ja active Active
- 2013-11-05 CN CN201380073094.9A patent/CN105190392B/zh active Active
-
2015
- 2015-08-13 US US14/825,678 patent/US9703070B2/en active Active
-
2017
- 2017-06-05 US US15/614,216 patent/US10095004B2/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005345695A (ja) * | 2004-06-02 | 2005-12-15 | Nikon Corp | カメラ |
JP2009027212A (ja) * | 2007-06-20 | 2009-02-05 | Ricoh Co Ltd | 撮像装置 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108139564A (zh) * | 2015-09-30 | 2018-06-08 | 富士胶片株式会社 | 对焦控制装置、摄像装置、对焦控制方法及对焦控制程序 |
US10795118B2 (en) | 2015-09-30 | 2020-10-06 | Fujifilm Corporation | Focusing control device, imaging device, focusing control method, and focusing control program |
Also Published As
Publication number | Publication date |
---|---|
US20150346585A1 (en) | 2015-12-03 |
CN105190392A (zh) | 2015-12-23 |
US10095004B2 (en) | 2018-10-09 |
JPWO2014125682A1 (ja) | 2017-02-02 |
CN105190392B (zh) | 2018-02-13 |
US20170269326A1 (en) | 2017-09-21 |
JP5750558B2 (ja) | 2015-07-22 |
US9703070B2 (en) | 2017-07-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5750558B2 (ja) | 撮像装置及び合焦制御方法 | |
US9235916B2 (en) | Image processing device, imaging device, computer-readable storage medium, and image processing method | |
JP5657182B2 (ja) | 撮像装置及び信号補正方法 | |
US20170004603A1 (en) | Image processing device, imaging device, image processing method, and image processing program | |
JP6186521B2 (ja) | 合焦制御装置、撮像装置、合焦制御方法、及び合焦制御プログラム | |
JP5982601B2 (ja) | 撮像装置及び合焦制御方法 | |
US9826150B2 (en) | Signal processing device, imaging apparatus, parameter generating method, signal processing method, and program | |
JP6307526B2 (ja) | 撮像装置及び合焦制御方法 | |
JP6028112B2 (ja) | 撮像装置及び合焦制御方法 | |
JP5677625B2 (ja) | 信号処理装置、撮像装置、信号補正方法 | |
JPWO2016080538A1 (ja) | 撮像装置及び撮像方法 | |
JP6255540B2 (ja) | 合焦制御装置、撮像装置、合焦制御方法、及び合焦制御プログラム | |
JP5982600B2 (ja) | 撮像装置及び合焦制御方法 | |
JP5990665B2 (ja) | 撮像装置及び合焦制御方法 | |
JP6171105B2 (ja) | 撮像装置及び合焦制御方法 | |
JP5789725B2 (ja) | 撮像装置及びその合焦方法と合焦制御プログラム | |
CN113454706B (zh) | 显示控制装置、摄像装置、显示控制方法 | |
WO2017057072A1 (ja) | 合焦制御装置、合焦制御方法、合焦制御プログラム、レンズ装置、撮像装置 | |
WO2014073441A1 (ja) | 撮像装置およびその動作制御方法 | |
WO2020158200A1 (ja) | 撮像装置の制御装置、撮像装置、撮像装置の制御方法、撮像装置の制御プログラム | |
WO2013145887A1 (ja) | 撮像装置及び撮像方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201380073094.9 Country of ref document: CN |
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13875041 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2015500091 Country of ref document: JP Kind code of ref document: A |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 13875041 Country of ref document: EP Kind code of ref document: A1 |