GB2358536A - Autofocus with subject selection - Google Patents

Autofocus with subject selection

Info

Publication number
GB2358536A
Authority
GB
United Kingdom
Prior art keywords
subject
emitting
focus
angle
incident angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB0029878A
Other versions
GB2358536B (en)
GB0029878D0 (en)
Inventor
Yujiro Ito
Susumu Kurita
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Publication of GB0029878D0 publication Critical patent/GB0029878D0/en
Publication of GB2358536A publication Critical patent/GB2358536A/en
Application granted granted Critical
Publication of GB2358536B publication Critical patent/GB2358536B/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/30Systems for automatic generation of focusing signals using parallactic triangle with a base line
    • G02B7/32Systems for automatic generation of focusing signals using parallactic triangle with a base line using active means, e.g. light emitter
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/671Focus control based on electronic image sensor signals in combination with active ranging signals, e.g. using light or sound signals emitted toward objects

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)
  • Focusing (AREA)
  • Automatic Focus Adjustment (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The auto-focus apparatus emits an eye-safe infra-red beam which it scans across a subject. Reflected light is incident on a position sensitive detector diode. Based on the emitting angle and the incident angle the system determines whether or not to focus. The system may focus only on objects close to the camera's optical axis. The camera may be a video camera.

Description

2358536 AUTO-FOCUS APPARATUS, FOCUS ADJUSTING METHOD, IMAGE CAPTURING
APPARATUS AND IMAGE CAPTURING METHOD
BACKGROUND OF THE INVENTION
FIELD OF THE INVENTION
The present invention relates to an auto-focus apparatus, a focus adjusting method, an image capturing apparatus and an image capturing method, and may be suitably applied, for example, to a video camera.
DESCRIPTION OF THE RELATED ART
Conventionally, video cameras contain a so-called auto-focus function which automatically performs a focusing operation of a lens in accordance with the distance to a subject (subject distance). For realizing such an auto-focus function, a variety of focus detecting methods have been devised for detecting a defocused state, and among others, an image processing method, an infrared method and a phase difference detecting method are representative.
The image processing method picks up a central region from an image captured by an imaging device (CCD: Charge Coupled Device), extracts high frequency components from the picked-up region, and adds the high frequency components to generate a value which is used as an evaluation value for detecting the focus. This evaluation value becomes higher as an image of a subject being captured approaches a focused state, and becomes lower as the image moves further away from the focused state after it presents the highest value at the position of the focused state. Therefore, the image processing method moves the focus to examine whether the evaluation value increases or decreases, and adjusts the focus while moving the focus in a direction in which the evaluation value becomes higher until the focused state is reached. In other words, the image processing method performs a so-called hill climbing operation.
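A minimal Python sketch (not from the patent) of the evaluation-value and hill-climbing idea described above, assuming a grayscale frame held as a NumPy array and a simple discrete Laplacian as the high-frequency extractor; both choices are assumptions made only for illustration.

```python
import numpy as np

def focus_evaluation_value(frame: np.ndarray) -> float:
    """Sum of high-frequency energy in the central region of a grayscale frame,
    standing in for the evaluation value described above."""
    h, w = frame.shape
    region = frame[h // 4: 3 * h // 4, w // 4: 3 * w // 4].astype(float)
    # Discrete Laplacian as a simple high-pass filter.
    lap = (4 * region[1:-1, 1:-1]
           - region[:-2, 1:-1] - region[2:, 1:-1]
           - region[1:-1, :-2] - region[1:-1, 2:])
    return float(np.abs(lap).sum())

def hill_climb_direction(prev_value: float, new_value: float, direction: int) -> int:
    """Keep moving the focus in the current direction while the evaluation value
    rises; reverse when it falls (the 'hill climbing' operation)."""
    return direction if new_value >= prev_value else -direction
```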
This image processing method is advantageous in that it can realize an auto-focus function without modifying or adding to the design of the optical system such as lenses, and can improve the sensitivity to a focus error since the focus is adjusted using an image captured by the CCD.
Next, the infrared method applies the principles of triangulation to calculate the subject distance. Specifically, the infrared method irradiates an infrared ray from a video camera to a subject, detects an incident angle of return light reflected by the subject and returning to the video camera, and then calculates the subject distance based on the detected incident angle of the return light. The infrared method is advantageous in that the subject distance can be sufficiently measured, even if the subject is dark, as long as the amount of return light from the subject exceeds a predetermined amount, since the infrared ray emitted from the video camera itself is irradiated to the subject.
Further, the phase difference detecting method provides two sets of lens groups, each comprised of a small lens and a line sensor for detecting the position of light, in a lens optical system of a camera, and disposes the two sets of lens groups with their optical axes shifted from each other to realize the aforementioned triangulation. This phase difference detecting method is advantageous in that the capability of detecting a focusing state is constant irrespective of the subject distance.
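For reference, both the infrared method and the phase difference detecting method reduce to the same parallactic-triangle relation. The patent states the principle but gives no formula, so the following is a sketch under an assumed convention: the two optical axes are parallel, separated by a baseline B, and the ray from the subject arrives at the receiving side at an angle θ measured from the receiving axis.

```latex
% Parallactic triangle (sketch; angle convention is an assumption, not stated in the patent).
D = \frac{B}{\tan\theta}
```

The infrared method measures θ actively from the return of its own beam, while the phase difference method infers it passively from the shift between the two line-sensor images.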
The aforementioned image processing method, however, cannot detect a focusing state unless the focus is moved to examine a change in evaluation value. Also, since the evaluation value varies in response to a small movement of a subject in the vertical direction with respect to the optical axis, the focused position can be erroneously detected. Therefore, the image processing method experiences difficulties in making the focus smoothly follow movements of the subject in the direction of the optical axis.
As a solution for the problems of the image processing method, the infrared method and the phase difference detecting method have been proposed. Since these methods can reveal a focusing state without moving the focus, they need not move the focus to examine the focusing state. In addition, even if a subject moves in the vertical direction with respect to the optical axis, these methods will never erroneously measure the subject distance. However, because of its limited ability of measuring a distance of only about 10m or less, the infrared method is not suitable for a business-use video camera which may capture a subject, for example, at a distance exceeding 10m with a small depth of field (a range centered on the subject in which the subject is in focus).
Also, in the infrared method, since an optical system for emitting an infrared ray is generally disposed external to a camera, the optical axis of the video camera cannot be aligned with the optical axis of the infrared ray, causing a problem of discrepancy between an actual screen range and a range viewed in a view finder, i.e., parallax.
Fig. 1 shows the principles as to how the parallax occurs.
As shown in Fig. 1, a video camera 1 comprises a camera body 1A, a camera lens 1B, an infrared ray emitter 1C and a return light incident angle detector 1D mounted on the camera body 1A, and applies the principles of triangulation to measure the subject distance.
Since the camera lens 1B is spaced from the infrared ray generator 1C by a predetermined distance, the optical axis A1 of an infrared ray is not coaxial with the optical axis A2 of the camera. In the video camera 1, since the optical axis A1 of the infrared ray is offset from the optical axis A2 of the camera in this way, even if a subject B1 to be captured is located on the optical axis A2 of the camera, the infrared ray may be irradiated to a subject B2 which is located on an axis offset from the optical axis A2 of the camera, i.e., on the optical axis A1 of the infrared ray.
In this event, the video camera 1 detects return light Ll from the subject B2, which is not to be captured, to measure the distance to the subject B2, and fails to measure the distance to the subject B1 to be captured.
The phase difference detecting method, on the other hand, suffers from a lower ability of measuring the subject distance as an iris of a camera lens is reduced. Specifically, since a video camera performs an auto-focus operation and an imaging operation simultaneously, it cannot open the iris for the auto-focus operation and reduce the iris for the imaging operation, as a still camera can, which performs the auto-focus operation and the imaging operation separately. Thus, due to the requirement of adjusting the iris during the imaging operation, the video camera cannot avoid a degradation in the ability of measuring the distance to a subject, resulting from the reduced iris.
SUMMARY OF THE INVENTION
In view of the foregoing, the present invention aims to provide an auto-focus apparatus, a focus adjusting method, an image capturing apparatus and an image capturing method which are capable of accurately adjusting the focus on a subject to be captured.
The present invention provides an auto-focus apparatus, a focus adjusting method, an image capturing apparatus and an image capturing method in which an irradiation wave is emitted from emitting means for irradiation to a subject while changing an emitting angle of the irradiation wave, an incident angle of a reflected wave of the irradiation wave reflected by the subject, incident on light receiving means positioned corresponding to the emitting means, is detected, whether or not the subject is a subject for which the focus should be adjusted is determined based on the emitting angle and the incident angle, and the focus is adjusted on the subject when determining that the subject is the subject for which the focus should be adjusted, thereby making it possible to accurately adjust the focus on the subject for which the focus should be adjusted.
Further particular and preferred aspects of the present invention are set out in the accompanying independent and dependent claims. Features of the dependent claims may be combined with features of the independent claims as appropriate, and in combinations other than those explicitly set out in the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will be described further, by way of example only, with reference to preferred embodiments thereof as illustrated in the accompanying drawings, in which:
Fig. 1 is a schematic diagram used for explaining the principles as to how parallax occurs;
Fig. 2 is a schematic diagram used for explaining a video camera according to an embodiment of the present invention;
Fig. 3 is a schematic diagram used for explaining how the distance to a subject is measured;
Fig. 4 is a block diagram illustrating the circuit configuration of the video camera;
Fig. 5 is a schematic diagram used for explaining how the distance to a subject is measured;
Fig. 6 is a schematic diagram showing the relationship between an emitting angle of an infrared ray and an incident angle of return light;
Fig. 7 is a schematic diagram showing the relationship between an emitting angle of an infrared ray and an incident angle of return light; and
Fig. 8 is a flow chart illustrating a focus adjustment processing procedure.
DETAILED DESCRIPTION OF THE EMBODIMENT
Preferred embodiments of this invention will be described with reference to the accompanying drawings:
Fig. 2 illustrates the configuration of a video camera, generally designated by reference numeral 10, which comprises a camera body 10A, and a camera lens 10B, an infrared ray emitter 10C as emitting means, and a return light incident angle detector 10D as incident light detecting means mounted at predetermined positions of the camera body 10A. The video camera employs the infrared method as a focus detecting method, and applies the principles of triangulation to measure the distance from the camera lens 10B to a subject, i.e., the subject distance.
In this embodiment, for measuring the subject distance, the infrared ray emitter 10C can vary (scan) the optical axis A10 of an infrared ray, i.e., the orientation of the infrared ray, over an emitting angle in a range of θl to θu in the vertical direction.
In this event, an infrared ray scanning period is set, for example, to 1/60 seconds, so that the infrared ray emitter 10C scans in a range of the emitting angle from θl to θu once for every 1/60 seconds. Also, the infrared ray emitter 10C irradiates the infrared ray at a position on the optical axis A11 of the camera spaced by a distance 0.8m from the camera lens 10B when the infrared ray is emitted at angle θl, and irradiates the infrared ray at a position on the optical axis A11 of the camera spaced by a distance 30m from the camera lens 10B when the infrared ray is emitted at angle θu. In this way, the infrared ray emitter 10C can irradiate the infrared ray to a subject which is located on the optical axis A11 of the camera and distanced from the camera lens 10B in a range of 0.8 to 30m.
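As a hedged illustration of the mapping just described (the patent gives the 0.8m and 30m end points but no formula), assume the emitter 10C sits at a vertical offset h from the camera axis A11 and that the emitting angle θ is measured from the direction of A11; neither h nor the reference direction is given in the text. The beam then crosses A11 at the distance

```latex
d(\theta) = \frac{h}{\tan\theta}, \qquad
\theta_l = \arctan\frac{h}{0.8\,\mathrm{m}}, \qquad
\theta_u = \arctan\frac{h}{30\,\mathrm{m}}
```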
Fig. 3 shows how to measure the subject distance when a subject B10 is located on the optical axis A11 of the camera. In this event, the infrared ray emitter 10C irradiates an infrared ray L10 to the subject B10 at an emitting angle θu. The infrared ray L10 is reflected by the subject B10, and return light L11 from the subject B10 is incident on the return light incident angle detector 10D at an incident angle θub.
When the subject B10 is located on the optical axis A11 of the camera as described above, the subject distance is uniquely determined when the incident angle θub of the return light is determined. Therefore, the return light incident angle detector 10D measures the incident angle θub of the return light L11 to calculate the subject distance Lu based on the incident angle θub and the distance between the camera lens 10B and the return light incident angle detector 10D. Then, the video camera 10 adjusts the focus in accordance with the thus calculated subject distance Lu.
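A hedged sketch of the triangulation behind Lu (the formula is not spelled out in the patent): taking B as the distance between the camera lens 10B and the return light incident angle detector 10D, and assuming θub is measured from the detector's boresight, taken parallel to the camera axis A11,

```latex
% B: lens-to-detector distance (stated above as known); angle convention assumed.
L_u = \frac{B}{\tan\theta_{ub}}
```

With B fixed by the camera body, the incident angle alone therefore determines the distance of an on-axis subject, which is the uniqueness property relied on above.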
Here, the circuit configuration of the video camera 10 is illustrated in Fig. 4. A Central Processing Unit (CPU) 15 is mounted in the camera body 10A to control the general operation of the video camera 10. The CPU 15 periodically (for example, every 30msec) checks whether a push switch 16, disposed external to the video camera 10, is pressed, and generates instruction data S1 for emitting an infrared ray when it detects that the push switch 16 is pressed.
The CPU 15 sends the instruction data S1 to a digital-to-analog (DA) converter circuit 17 which converts the instruction data S1 to an analog instruction signal S2 which is then sent to a Laser Diode (LD) driving circuit 18 in the infrared ray emitter 10C. The LD driving circuit 18, in response to the instruction signal S2, drives an LD 19 as a light emitting element, and applies the LD 19 with a current. Then, the LD 19 emits laser light L15 in accordance with the applied current.
The laser light L15 emitted from the LD 19 is transformed into a diverging beam L16 having a diverging angle proximal to a parallel beam by a lens 20, reflected by a mirror 21 comprised of a flat mirror, and irradiated to the space as an infrared ray L17.
In this embodiment, the infrared ray emitter 10C employs, as a light emitting element, the LD 19, referred to as an eye-safe laser diode, which is highly safe to eyes and oscillates in a 1400nm band of wavelength, and irradiates the laser light L15 with large power exceeding 200mW. The video camera 10, therefore, ensures the safety of the eyes of the user, as well as can extend a measurable distance to 30m, i.e., approximately three times that of the conventional light emitting diode which is measurable up to approximately 10m with power of 20mW, by way of example.
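The "approximately three times" figure is consistent with a rough scaling argument, which is an assumption and not an analysis taken from the patent: for a collimated beam on a diffusely reflecting subject, the return power reaching the detector falls roughly as 1/d², so the maximum measurable distance grows roughly as the square root of the emission power:

```latex
\frac{d_{\max}(\mathrm{LD},\,200\,\mathrm{mW})}{d_{\max}(\mathrm{LED},\,20\,\mathrm{mW})}
\approx \sqrt{\frac{200\,\mathrm{mW}}{20\,\mathrm{mW}}} \approx 3.2
```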
The CPU 15, upon detection of the pressed push switch 16, also generates driving data S5 for driving a motor 22 coupled to the mirror 21 in the infrared ray emitter 10C. The CPU 15 sends the driving data S5 to a digital-to-analog (DA) converter circuit 25 which converts the driving data S5 to an analog driving signal S6 which is then sent to a motor driving circuit 26 in the infrared ray emitter 10C. The motor driving circuit 26 drives the motor 22 based on the driving signal S6 to change the inclination of the mirror 21, thereby allowing the infrared ray L17 to be irradiated to a subject which is located on the optical axis A11 of the camera (Fig. 2) at a distance of 0.8 to 30m from the camera lens 10B.
An inclination sensor 27 is provided near the mirror 21, such that an inclination sensor detector circuit 28 detects the inclination of the mirror 21 through the inclination sensor 27 to generate a mirror inclination signal S8. Then, the inclination sensor detector circuit 28 sends the mirror inclination signal S8 to an analog-to-digital (AD) converter circuit 29 which converts the mirror inclination signal S8 to digital mirror inclination data S9 which is sent to the CPU 15. Therefore, the CPU 15 can know the inclination of the mirror 21 based on the mirror inclination data S9. In this way, the CPU 15 measures the orientation of the infrared ray L17 emitted from the infrared ray emitter 10C while changing the orientation of the infrared ray L17.
Also, the CPU 15 controls the power of the emitted infrared ray L17 in accordance with a change in the orientation of the infrared ray L17, i.e., a change in the distance from the camera lens 10B to a position to be measured, to reduce the power consumption and extend the lifetime of the LD 19. Specifically, for example, the CPU 15 sets the emission power at 200mW when the infrared ray L17 is irradiated at a distance of 30m from the camera lens 10B, while it sets the emission power at 2mW when the infrared ray L17 is irradiated at a distance of 3m from the camera lens 10B, thereby controlling the return light incident on the return light incident angle detector 10D to a constant amount.
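The two operating points quoted above (2mW at 3m, 200mW at 30m) happen to follow a square-of-distance law. The sketch below assumes that law and those limits, neither of which is mandated by the patent text, to show how a power setpoint per scan position might be computed.

```python
def emission_power_mw(target_distance_m: float,
                      p_ref_mw: float = 2.0,
                      d_ref_m: float = 3.0,
                      p_max_mw: float = 200.0) -> float:
    """Power setpoint for the current scan position.

    Assumes the return power falls as 1/d^2, so the setpoint is scaled with d^2
    from the reference point (2mW at 3m) and clipped at the 200mW maximum quoted
    for the eye-safe LD; the scaling law itself is an assumption.
    """
    p = p_ref_mw * (target_distance_m / d_ref_m) ** 2
    return min(max(p, 0.0), p_max_mw)

# emission_power_mw(3.0) -> 2.0 and emission_power_mw(30.0) -> 200.0,
# matching the two operating points given in the text.
```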
The return light L20 from a subject is incident on a lens 35 in the return light incident angle detector 10D, and is converged by the lens 35 on a light receiving surface of a Position Sensitive Diode (PSD) 36 as a position detecting element. The PSD 36 generates a current in accordance with the centroid of the magnitude of the return light L20 converged on the light receiving surface, and sends the current to a return light incident angle detector circuit 37.
The PSD 36 has the light receiving surface in alignment with the focal plane of the lens 35, so that the incident angle of the return light L20 is uniquely determined when the position of the return light L20 converged on the light receiving surface of the PSD 36 is determined. Therefore, the return light incident angle detector circuit 37, serving as detecting means, detects the incident angle of the return light L20 based on the current supplied from the PSD 36. Then, the return light incident angle detector circuit 37 sends the detected incident angle of the return light L20 to an analog-to-digital (AD) converter circuit 38 of the camera body 10A as a return light incident angle signal S12.
The AD converter circuit 38 converts the return light incident angle signal S12 to digital return light incident angle data S13 which is sent to the CPU 15.
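A minimal model of this angle recovery (the symbols x and f are assumptions, not from the patent): with the PSD light receiving surface in the focal plane of lens 35, of focal length f, and the converged spot offset by x from the point where a ray along the boresight would land, the incident angle is

```latex
\theta = \arctan\frac{x}{f}
```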
The CPU 15 calculates the subject distance based on the emitting angle of the infrared ray L17 derived from the mirror inclination data S9 and the incident angle of the return light L20 derived from the return light incident angle data S13, and sends the subject distance to the camera lens 10B as subject distance data S15 to adjust the focus such that the focus of the camera lens 10B is coincident with the subject distance. For reference, the CPU 15 sends the subject distance data S15 to the camera lens 10B every 1/60 seconds to make the focus smoothly follow movements of the subject.
Here, Fig. 5 shows a situation in which a subject B10 to be imaged is located on the optical axis A11 of the camera, and a subject B11 not to be imaged is located at a position apart from the optical axis A11 of the camera, wherein an infrared ray L20 is irradiated to the subject B11 not to be imaged while the infrared ray L20 is being scanned.
In this situation, return light L21 reflected by the subject B11 is incident on the return light incident angle detector 10D at an incident angle θxb. In this event, a conventional video camera calculates a subject distance based only on the incident angle θxb of the return light L21. As a result, the video camera disadvantageously determines that the subject exists at a position of a point Pb on the optical axis A11 of the camera, and focuses at the position of the point Pb on the optical axis A11 of the camera.
To eliminate this disadvantage, the CPU 15 (Fig. 4) of the video camera 10 previously holds in an internal memory incident/emitting relation data (Fig. 6) indicative of the relationship between the emitting angle of the infrared ray and the incident angle of return light when a subject exists on the optical axis A11 of the camera, and determines whether or not the detected emitting angle of the infrared ray and incident angle of the return light match the incident/emitting relation data.
As a result, when determining that the detected emitting angle of the infrared ray and incident angle of the return light match the incident/emitting relation data, the CPU 15 determines that the subject exists on the optical axis A11 of the camera, and calculates the distance to the subject. On the other hand, when determining that the detected emitting angle of the infrared ray and incident angle of the return light do not match the incident/emitting relation data, the CPU 15 determines that the subject does not exist on the optical axis A11 of the camera, and that the subject is not to be imaged.
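A minimal Python sketch of this on-axis test, assuming the incident/emitting relation data is stored as a table of (emitting angle, expected incident angle) pairs sorted by emitting angle, and that "match" means agreement within a small tolerance; the table form and the tolerance are assumptions, not specified in the patent.

```python
import bisect

def expected_incident_angle(relation: list[tuple[float, float]],
                            emit_angle: float) -> float:
    """Interpolate the stored incident/emitting relation data: for a given emitting
    angle, the incident angle that return light would have if the subject sat on
    the camera axis A11. `relation` is assumed sorted by emitting angle."""
    angles = [e for e, _ in relation]
    i = min(max(bisect.bisect_left(angles, emit_angle), 1), len(relation) - 1)
    (e0, a0), (e1, a1) = relation[i - 1], relation[i]
    t = (emit_angle - e0) / (e1 - e0)
    return a0 + t * (a1 - a0)

def is_on_axis(relation: list[tuple[float, float]],
               emit_angle: float,
               incident_angle: float,
               tol: float = 1e-3) -> bool:
    """Treat the subject as on-axis only if the measured (emitting, incident) pair
    matches the stored relation within a tolerance (same angular units as the table)."""
    return abs(incident_angle - expected_incident_angle(relation, emit_angle)) <= tol
```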
For example, in Fig. 5, when the video camera 10 changes the emitting angle of the infrared ray L20 from θ1 to θ2, the infrared ray L20 is first irradiated to the subject B11, and next to the subject B10. In this event, the return light incident angle detector 10D receives return light L21 from the subject B11 to detect an incident angle θBb, and next receives return light from the subject B10 to detect an incident angle θAb, as shown in Fig. 7.
Thus, the CPU 15 (Fig. 4) is provided with position data PB (θB, θBb) of the subject B11 and position data PA (θA, θAb) of the subject B10, and determines based on the aforementioned incident/emitting relation data that only the subject B10 is located on the optical axis A11 of the camera. Eventually, the CPU 15 determines the subject B10 as a subject to be imaged, and calculates the distance to the subject B10 to adjust the focus.
As described above, in Fig. 8, the CPU 15, when entering a focus adjustment processing procedure RT1, proceeds to step SP1 to initiate the procedure every 1/60 seconds, and determines whether or not the push switch 16 is pressed at subsequent step SP2.
When an affirmative result is returned at step SP2, this means that the push switch 16 is pressed by the user, in which case, the CPU 15 proceeds to step SP3, where the CPU 15 forces the LD 19 to emit laser light L15. Conversely, when a negative result is returned at step SP2, this means that the push switch 16 is not pressed by the user, in which case, the CPU 15 proceeds to step SP8 to terminate the processing procedure.
Then, the CPU 15 proceeds to step SP4, where the CPU 15 drives the motor 22 through the motor driving circuit 26 to change the inclination of the mirror 21 to change the orientation of the infrared ray, and periodically samples the inclination of the mirror 21 derived from the infrared ray emitter 10C and the incident angle of return light derived from the return light incident angle detector 10D to store in the internal memory sampling data comprised of the emitting angle of the infrared ray indicated by the inclination of the mirror 21 and the incident angle of the return light thus derived.
Then, the CPU 15 turns off the LD 19 at step SP5, and proceeds to step SP6, where the CPU 15 acts as determining means to search the sampling data for one indicative of return light from a subject located on the optical axis A11 of the camera, and to calculate the subject distance based on the incident angle of the found return light.
Next, the CPU 15 acts as adjusting means at step SP7, where the CPU 15 notifies the camera lens 10B of the calculated subject distance to adjust the focus such that the focus of the camera lens 10B matches the subject distance. Subsequently, the CPU 15 proceeds to step SP8 to terminate the processing procedure.

In the foregoing configuration, the infrared ray emitter 10C changes the emitting angle of the infrared ray under the control of the CPU 15 to emit the infrared ray, detects the emitting angle, and notifies the CPU 15 of the detected emitting angle. The infrared ray emitted from the infrared ray emitter 10C is reflected by a subject and incident on the return light incident angle detector 10D. The return light incident angle detector 10D detects the incident angle of the return light from the subject, and notifies the CPU 15 of the detected incident angle.
The CPU 15 determines, based on the emitting angle of the infrared ray and the incident angle of the return light, detected when the infrared ray is irradiated to the subject, whether or not the subject exists on the optical axis A11 of the camera, and calculates the subject distance based on the incident angle of the return light from the subject when determining that the subject exists on the optical axis A11 of the camera. Then, the CPU 15 adjusts the focus of the camera lens 10B using the calculated subject distance.
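Pulling the steps together, a hedged end-to-end sketch of the operation just described: every name here (the hw facade and its methods, the baseline constant) is hypothetical, standing in for the hardware of Fig. 4; angles are assumed to be in radians, and is_on_axis is the helper from the earlier relation-matching sketch.

```python
import math

BASELINE_M = 0.05  # assumed lens-to-detector baseline; not given in the patent

def distance_from_incident_angle(incident_angle_rad: float) -> float:
    """Triangulation for an on-axis subject, as sketched earlier: Lu = B / tan(theta_ub)."""
    return BASELINE_M / math.tan(incident_angle_rad)

def focus_adjustment_pass(hw, relation, period_s=1.0 / 60.0):
    """One pass of the Fig. 8 style procedure (hypothetical hardware facade `hw`)."""
    if not hw.push_switch_pressed():                    # SP2: switch not pressed, nothing to do
        return
    hw.laser_on()                                       # SP3: emit the infrared ray
    samples = []
    for _ in hw.scan_mirror(period_s):                  # SP4: sweep the mirror once
        samples.append((hw.read_emit_angle(),
                        hw.read_incident_angle()))      #      and sample both angles
    hw.laser_off()                                      # SP5: turn the LD off
    on_axis = [(e, a) for e, a in samples
               if is_on_axis(relation, e, a)]           # SP6: keep only on-axis returns
    if on_axis:
        _, incident = on_axis[0]
        hw.set_focus(distance_from_incident_angle(incident))  # SP7: drive the lens
```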
Consequently, the video camera 10 prevents the calculation of the subject distance based on the incident angle of return light reflected by and returned from a subject which does not exist on the optical axis A11 of the camera, i.e., a subject not to be imaged.
According to the foregoing configuration, the infrared ray is emitted as the emitting angle thereof is changed, and it is determined based on the emitting angle of an infrared ray and an incident angle of its return light, which are detected when the infrared ray is irradiated to a subject, whether or not the subject exists on the optical axis of the camera before the subject distance is calculated, thereby making it possible to accurately calculate the subject distance and accomplish more accurate focus adjustments as compared with the prior art.
While the foregoing embodiment has been described in connection with a flat mirror which is employed as the mirror 21 for changing the emitting angle of the infrared ray, embodiments of the present invention are not limited to this particular mirror, but can employ a mirror in any of other various shapes, such as a polygon mirror, by way of example.
Also, while the foregoing embodiment has been described in connection with the PSD 36 employed as the position detecting element, embodiments of the present invention are not limited to this particular element, but can employ any of other various position detecting elements such as a bisected PIN photodiode, by way of example.

Further, while the foregoing embodiment has been described for a specific configuration in which the infrared ray is emitted as its orientation is changed, embodiments of the present invention are not limited to this configuration. Alternatively, a plurality of infrared ray emitters for emitting infrared rays in different orientations from one another can be provided such that a desired one is selected and lit from the plurality of infrared ray emitters as required.
Further, while the foregoing embodiment has been described for a specific configuration in which an infrared ray is irradiated to a subject to calculate the subject distance, embodiments of the present invention are not limited to this configuration. Alternatively, any of other various irradiation waves such as ultrasonic waves, by way of example, can be irradiated to a subject to calculate the subject distance.
Further, while the foregoing embodiment has been described in connection with the video camera 10 to which an embodiment of the present invention is applied, embodiments of the present invention are not limited to the video camera but can be applied widely to a variety of other auto-focus apparatus which contain an auto-focus function such as a still camera for photographing a still image, by way of example.
As described above, according to embodiments of the present invention, the auto-focus apparatus irradiates an irradiation wave from emitting means to a subject while changing an emitting angle of the irradiation wave, detects an incident angle of a reflected wave of the irradiation wave reflected by the subject, incident on light receiving means positioned corresponding to the emitting means, determines based on the emitting angle and the incident angle whether or not the subject is a subject for which the focus should be adjusted, and adjusts the focus on the subject when determining that the subject is the subject for which the focus should be adjusted, thereby making it possible to accurately adjust the focus on the subject for which the focus should be adjusted.
In so far as the embodiments of the invention described above are implemented, at least in part, using software-controlled data processing apparatus, it will be appreciated that a computer program providing such software control and a storage medium by which such a computer program is stored are envisaged as aspects of the present invention.
Although particular embodiments have been described herein, it will be appreciated that the invention is not limited thereto and that many modifications and additions thereto may be made within the scope of the invention. For example, various combinations of the features of the following dependent claims can be made with the features of the independent claims without departing from the scope of the present invention.

Claims (24)

1. An auto-focus apparatus comprising: emitting means for emitting an irradiation wave for irradiation to a subject while changing an emitting angle of said irradiation wave; detecting means for detecting an incident angle of a reflected wave of said irradiation wave reflected by said subject, incident on light receiving means positioned corresponding to said emitting means; determining means for determining based on said emitting angle and said incident angle whether or not said subject is a subject for which the focus should be adjusted; and adjusting means for adjusting the focus on said subject when determining that said subject is the subject for which the focus should be adjusted.
2. The auto-focus apparatus according to claim 1, wherein said emitting means emits an infrared ray emitted from an eye safe laser diode.
3. The auto-focus apparatus according to claim 1, wherein said emitting means controls emission power of said irradiation wave in accordance with a change in the emitting angle of said irradiation wave.
4. The auto-focus apparatus according to claim 1, wherein said determining means comprises a storage means for storing sampling data of said emitting angle and said incident angle.
5. The autofocus apparatus according to claim 4, wherein said determining means comprises a storage means for storing correspondence data of said emitting angle and said corresponding incident angle.
6. A focus adjusting method comprising the steps of: emitting an irradiation wave from irradiating means for irradiation to a subject while changing an emitting angle of said irradiation wave; detecting an incident angle of a reflected wave of said irradiation wave reflected by said subject, incident on light receiving means positioned corresponding to said emitting means; determining based on said emitting angle and said incident angle whether or not said subject is a subject for which the focus should be adjusted; and adjusting the focus on said subject when determining that said subject is the subject for which the focus should be adjusted.
7. The focus adjusting method according to claim 6, wherein said emitting means emits an infrared ray emitted from an eye safe laser diode.
8. The focus adjusting method according to claim 6, wherein said emitting means controls emission power of said irradiation wave in accordance with a change in the emitting angle of said irradiation wave.
9. The focus adjusting method according to claim 6, wherein said determination is made based upon stored sampling data of said emitting angle and said incident angle.
10. The focus adjusting method according to claim 9, wherein in said determination, said sampling data is selected based upon stored correspondence data of said emitting angle and said corresponding incident angle.
11. An image capturing apparatus comprising:
emitting means for emitting an irradiation wave for irradiation to a subject while changing an emitting angle of said irradiation wave; detecting means for detecting an incident angle of a reflected wave of said irradiation wave reflected by said subject, incident on light receiving means positioned corresponding to said emitting means; determining means for determining based on said emitting angle and said incident angle whether or not said subject is a subject for which the focus should be adjusted; and adjusting means for adjusting the focus on said subject when determining that said subject is the subject for which the focus should be adjusted.
12. The image capturing apparatus according to claim 11, wherein said emitting means emits an infrared ray emitted from an eye safe laser diode.
13. The image capturing apparatus according to claim 11, wherein said emitting means controls emission power of said irradiation wave in accordance with a change in the emitting angle of said irradiation wave.
14. The image capturing apparatus according to claim 11, wherein said determining means comprises a storage means for storing sampling data of said emitting angle and said incident angle.
15. The image capturing apparatus according to claim 14, wherein said determining means comprises a storage means for storing correspondence data of said emitting angle and said corresponding incident angle.
16. An image capturing method comprising the steps of: emitting an irradiation wave from irradiating means for irradiation to a subject while changing an emitting angle of said irradiation wave; detecting an incident angle of a reflected wave of said irradiation wave reflected by said subject, incident on light receiving means positioned corresponding to said emitting means; determining based on said emitting angle and said incident angle whether or not said subject is a subject for which the focus should be adjusted; and adjusting the focus on said subject when determining that said subject is the subject for which the focus should be adjusted.
17. The image capturing method according to claim 16, wherein said emitting means emits an infrared ray emitted from an eye safe laser diode.
18. The image capturing method according to claim 16, wherein said emitting means controls emission power of said irradiation wave in accordance with a change in the emitting angle of said irradiation wave.
19. The image capturing method according to claim 16, wherein said determination is made based upon stored sampling data of said emitting angle and said incident angle.
20. The image capturing method according to claim 19, wherein in said determination, said sampling data is selected based upon stored correspondence data of said emitting angle and said corresponding incident angle.
21. An auto-focus apparatus, substantially as hereinbefore described with reference to Figures 2 to 8.
22. A focus adjusting method, substantially as hereinbefore described with reference to Figures 2 to 8.
23. An image capturing apparatus, substantially as hereinbefore described with reference to Figures 2 to 8.
24. An image capturing method, substantially as hereinbefore described with reference to Figures 2 to 8.
GB0029878A 1999-12-08 2000-12-07 Auto-focus apparatus focus adjusting method image capturing apparatus and image capturing method Expired - Fee Related GB2358536B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP34883999A JP3381233B2 (en) 1999-12-08 1999-12-08 Autofocus device and focus adjustment method

Publications (3)

Publication Number Publication Date
GB0029878D0 GB0029878D0 (en) 2001-01-24
GB2358536A true GB2358536A (en) 2001-07-25
GB2358536B GB2358536B (en) 2004-03-10

Family

ID=18399736

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0029878A Expired - Fee Related GB2358536B (en) 1999-12-08 2000-12-07 Auto-focus apparatus focus adjusting method image capturing apparatus and image capturing method

Country Status (3)

Country Link
US (3) US6917386B2 (en)
JP (1) JP3381233B2 (en)
GB (1) GB2358536B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6975361B2 (en) * 2000-02-22 2005-12-13 Minolta Co., Ltd. Imaging system, two-dimensional photographing device and three-dimensional measuring device
JP2003196870A (en) * 2001-12-26 2003-07-11 Tdk Corp Optical disk drive device, optical pickup and their manufacturing method and adjusting method
JP4532865B2 (en) * 2003-09-09 2010-08-25 キヤノン株式会社 Imaging device and focus control method of imaging device
JP4467958B2 (en) 2003-11-19 2010-05-26 キヤノン株式会社 Imaging device and focus control method of imaging device
TWI263444B (en) * 2004-12-10 2006-10-01 Hon Hai Prec Ind Co Ltd System and method for focusing images automatically
TWI250793B (en) * 2004-12-15 2006-03-01 Asia Optical Co Inc Range-finding type digital camera
JP2012049773A (en) * 2010-08-26 2012-03-08 Sony Corp Imaging apparatus and method, and program
DE102012003255B8 (en) * 2012-02-21 2014-01-16 Testo Ag Device for non-contact temperature measurement and temperature measurement method
US9648223B2 (en) * 2015-09-04 2017-05-09 Microvision, Inc. Laser beam scanning assisted autofocus

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4470681A (en) * 1982-09-07 1984-09-11 Polaroid Corporation Method of and apparatus for detecting range using multiple range readings
US5006700A (en) * 1988-04-11 1991-04-09 Nikon Corporation Distance-measuring apparatus for camera
US5137350A (en) * 1989-04-14 1992-08-11 Asahi Kogaku Kogyo Kabushiki Kaisha Distance measuring device
US5255045A (en) * 1991-02-25 1993-10-19 Olympus Optical Co., Ltd. Camera system having distance measuring apparatus
US5264892A (en) * 1990-07-04 1993-11-23 Olympus Optical Co., Ltd. Camera distance measuring apparatus
US5666566A (en) * 1994-06-10 1997-09-09 Samsung Aerospace Industries, Ltd. Camera having an automatic focus system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4623237A (en) * 1984-07-07 1986-11-18 Canon Kabushiki Kaisha Automatic focusing device
JPS6170407A (en) * 1984-08-08 1986-04-11 Canon Inc Instrument for measuring distance


Also Published As

Publication number Publication date
JP2001166199A (en) 2001-06-22
US6917386B2 (en) 2005-07-12
US7453512B2 (en) 2008-11-18
GB2358536B (en) 2004-03-10
US20050185085A1 (en) 2005-08-25
JP3381233B2 (en) 2003-02-24
US20010003465A1 (en) 2001-06-14
GB0029878D0 (en) 2001-01-24
US20050190285A1 (en) 2005-09-01
US7408585B2 (en) 2008-08-05

Similar Documents

Publication Publication Date Title
US7453512B2 (en) Auto-focus apparatus, focus adjusting method, image capturing apparatus and image capturing method
CN105607074B (en) Beacon self-adaptive optical system based on pulse laser
US7405762B2 (en) Camera having AF function
US10634785B2 (en) Optoelectronic sensor and method of detecting object information
CA2354614C (en) Autofocus sensor
CN1065343C (en) A camera having an automatic focus system
KR20210059314A (en) Multi-lateration laser tracking apparatus and method using initial position sensing function
US5680648A (en) Light projection type distance measuring device for auto-focusing in camera
JP2000097629A (en) Optical sensor
JP3794670B2 (en) Microscope autofocus method and apparatus
CN209783873U (en) stray light detection device of TOF camera
JPH0616483B2 (en) Projection optics
JP2000321482A (en) Automatic focusing device
JP2002139311A (en) Light beam irradiation measuring device
KR100820118B1 (en) Auto focusing system of vision system
JPH0560962A (en) Focusing device
JPH07284003A (en) Image pickup device
JP2002072058A (en) Device and method for adjusting camera
JPS6363910A (en) Distance measuring apparatus
JP2969536B2 (en) Distance measuring device
CN109922254A (en) Picture pick-up device and its control method
JP2004126614A (en) Range-finding device
JPH0772090A (en) Setting method for observation depth in crystal defect detector
JPH06148503A (en) Focus adjusting device for infrared camera
JPH0755921A (en) Automatic tracking apparatus for moving body

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20091207