CN106210497B - Object distance calculation method and object distance calculation device - Google Patents

Object distance calculation method and object distance calculation device

Info

Publication number
CN106210497B
CN106210497B CN201510228262.8A CN201510228262A
Authority
CN
China
Prior art keywords
image
pixel
group
class
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510228262.8A
Other languages
Chinese (zh)
Other versions
CN106210497A (en)
Inventor
许恩峯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pixart Imaging Inc
Original Assignee
Pixart Imaging Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pixart Imaging Inc filed Critical Pixart Imaging Inc
Priority to CN201510228262.8A priority Critical patent/CN106210497B/en
Priority to CN201910263646.1A priority patent/CN109905606B/en
Publication of CN106210497A publication Critical patent/CN106210497A/en
Application granted granted Critical
Publication of CN106210497B publication Critical patent/CN106210497B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Measurement Of Optical Distance (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses an object distance calculation method implemented on an image sensor. The image sensor includes a plurality of pixels of a first type, divided into a first group and a second group. The method includes: masking a first part of each first-type pixel in the first group and a second part of each first-type pixel in the second group; capturing a first image of an object with the unmasked part of each first-type pixel in the first group; capturing a second image of the object with the unmasked part of each first-type pixel in the second group; combining the first image and the second image into a first composite image; and calculating first distance information between the object and the image sensor according to the degree of blur of the first composite image.

Description

Object distance calculation method and object distance calculation device
Technical field
The present invention relates to an object distance calculation method and an object distance calculation device, and in particular to an object distance calculation method and an object distance calculation device that can enlarge the range over which the distance information of an object can be calculated.
Background art
In recent years, smart electronic devices have become increasingly common. Besides smartphones and tablet computers, smart TVs are also common smart electronic devices. Unlike a traditional TV, a smart TV can be controlled by gestures alone, without a remote control. Such gestures are not performed directly on a touch screen, as on a smartphone or tablet computer, but at a distance from the smart TV. In this situation, 3D image detection becomes particularly important for accurately recognizing more complex gestures. For example, in FIG. 1A the user U performs a grabbing motion toward the TV, while in FIG. 1B the user U opens the hand and then moves the palm backward. Such motions are usually difficult to distinguish without 3D image detection.
Unlike an ordinary flat image, 3D image detection also requires a distance parameter, so a mechanism for detecting the distance of an object is needed. In the prior art, object distance is detected by emitting light from a light source and then calculating the distance from the light reflected back by the object. Because this requires an additional light source plus devices to control the light source and receive the reflected light, the cost is higher and the occupied space is larger.
Summary of the invention
Therefore, an object of the invention is to provide an object distance calculation method and an object distance calculation device.
One embodiment of the invention discloses an object distance calculation method implemented on an image sensor. The image sensor includes a plurality of first-type pixels, which comprise first-type pixels of a first group and first-type pixels of a second group. The object distance calculation method includes: masking a first part of each first-type pixel in the first group, and masking a second part of each first-type pixel in the second group; capturing a first image of an object with the unmasked part of each first-type pixel in the first group; capturing a second image of the object with the unmasked part of each first-type pixel in the second group; combining the first image and the second image into a first composite image; and calculating first distance information between the object and the image sensor according to the degree of blur of the first composite image.
One embodiment of the invention discloses an object distance calculation device, which includes an image sensor and a control unit. The image sensor includes: a plurality of first-type pixels, which comprise first-type pixels of a first group and first-type pixels of a second group; and a shielding layer, which masks a first part of each first-type pixel in the first group and masks a second part of each first-type pixel in the second group. The image sensor captures a first image of an object with the unmasked part of each first-type pixel in the first group, captures a second image of the object with the unmasked part of each first-type pixel in the second group, and combines the first image and the second image into a first composite image. The control unit calculates first distance information between the object and the image sensor according to the degree of blur of the first composite image.
Another embodiment of the invention discloses an object distance calculation method implemented on an object distance calculation device, wherein the object distance calculation device includes an image sensor and a movable lens, and the distance between the movable lens and the image sensor is a first distance. The object distance calculation method includes: capturing a first sensed image of an object with the image sensor through the movable lens; calculating first distance information between the object and the image sensor according to the degree of blur of the first sensed image; adjusting the position of the movable lens according to the degree of blur of the first sensed image so that the distance between the movable lens and the image sensor becomes a second distance, and capturing a second sensed image of the object with the image sensor; and calculating second distance information between the object and the image sensor according to the degree of blur of the second sensed image.
Another embodiment of the invention discloses an object distance calculation device, which includes: a movable lens; an image sensor whose distance from the movable lens is a first distance and which captures a first sensed image of an object through the movable lens; a lens driver; and a control unit that calculates first distance information between the object and the image sensor according to the degree of blur of the first sensed image. The control unit further makes the lens driver adjust the position of the movable lens according to the degree of blur of the first sensed image so that the distance between the movable lens and the image sensor becomes a second distance, makes the image sensor capture a second sensed image of the object through the movable lens, and calculates second distance information between the object and the image sensor according to the degree of blur of the second sensed image.
The invention applies the focusing mechanism of the prior art to the calculation of distance, so the distance information of an object can be calculated without additional components or cumbersome calculations. In addition, calculating distance information with multiple types of pixels enlarges the range over which the distance information of the object can be calculated, and adjusting the position of the movable lens according to the degree of blur of the composite image also enlarges the range over which the distance information of the object can be calculated.
Brief description of the drawings
FIG. 1A and FIG. 1B are schematic diagrams of a user performing relatively complex gestures toward a TV in the prior art.
FIG. 2 is a schematic diagram of a prior-art focusing operation using phase detection.
FIG. 3 and FIG. 4 are schematic diagrams of calculating an object distance using phase detection according to an embodiment of the invention.
FIG. 5 and FIG. 6 are schematic diagrams of calculating an object distance using phase detection according to another embodiment of the invention.
FIG. 7 is a flowchart of an object distance calculation method according to an embodiment of the invention.
FIG. 8 depicts an object distance calculation device according to an embodiment of the invention.
FIG. 9 and FIG. 10 are schematic diagrams of an object distance calculation method according to another embodiment of the invention.
The implementation, functions and advantages of the invention are further described below with reference to the accompanying drawings and embodiments.
Specific embodiment
The distance sensing method provided by the invention is introduced below through different embodiments. Because the distance detection method provided by the invention is related to phase detection in images, the concept of phase detection and related background are introduced first.
A prior-art camera usually has to perform a focusing operation before capturing an image of an object in order to obtain a clear image. Focusing can be done manually by the user or automatically by the camera. There are many auto-focusing methods, one of which performs the focusing operation using phase detection.
FIG. 2 is a schematic diagram of a prior-art focusing operation using phase detection. As shown in FIG. 2, light RL reflected by an object Ob passes through a lens L_b, is refracted, and then enters microlenses L_m1 and L_m2. After being refracted by the microlenses L_m1 and L_m2, the light enters pixels PI_1 and PI_2, which capture it and generate image signals. Parts of the pixels PI_1 and PI_2 are covered by shielding layers b_1 and b_2 respectively; the unmasked parts of the pixels PI_1 and PI_2 generate a first image and a second image respectively from the received light reflected by the object Ob, and the first image and the second image can be combined into a composite image. When the object Ob is at the focal point, the composite image is clear. However, when the object Ob is not at the focal point, the composite image is blurred (that is, the first image and the second image have a phase difference). The camera can perform auto-focusing according to whether this composite image is clear or blurred.
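A minimal sketch of how such a phase difference could be estimated, assuming a simple sum-of-absolute-differences comparison of averaged row profiles (the prior art described here does not specify how the shift is measured, and the function name is an assumption):

```python
import numpy as np

def phase_shift(first_image: np.ndarray, second_image: np.ndarray,
                max_shift: int = 8) -> int:
    """Estimate the horizontal shift between the two half-masked images by
    minimizing the mean absolute difference of their averaged row profiles
    over a small range of candidate shifts."""
    p1 = first_image.mean(axis=0)   # average each column into a 1-D profile
    p2 = second_image.mean(axis=0)
    best_shift, best_err = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        a = p1[max(0, s): len(p1) + min(0, s)]
        b = p2[max(0, -s): len(p2) + min(0, -s)]
        err = float(np.mean(np.abs(a - b)))
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift  # close to zero when the object Ob is in focus
```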
FIG. 3 and FIG. 4 are schematic diagrams of calculating an object distance using phase detection according to an embodiment of the invention. As shown in FIG. 3, the image sensor includes a plurality of R pixels (red pixels) PI_r1, PI_r2 and a plurality of B pixels (blue pixels) PI_b1, PI_b2. Note that, for ease of understanding, only some of the R pixels and B pixels are labeled. The R pixels are divided into R pixels PI_r1 of a first group and R pixels PI_r2 of a second group; likewise, the B pixels are divided into B pixels PI_b1 of the first group and B pixels PI_b2 of the second group. As can be seen from FIG. 3, the left half of each R pixel PI_r1 of the first group is masked, so only its right half can sense the image, while the right half of each R pixel PI_r2 of the second group is masked, so only its left half can sense the image. The R pixels PI_r1 of the first group generate a first image, the R pixels PI_r2 of the second group generate a second image, and the first image and the second image are combined into a first composite image. As mentioned above, when the object is at the focal point the image is clear. Therefore, according to the degree of blur of the first composite image, first distance information between the object and the image sensor can be calculated. In another embodiment, the first distance information between the object and the image sensor can also be calculated directly according to the degree of blur of at least one of the first image and the second image; the composite image is used as the example below, but the invention is not limited thereto.
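The calculation above relies on mapping the degree of blur of the first composite image to a distance. A minimal sketch of one way to do this, assuming a gradient-energy sharpness measure and a pre-measured blur-to-distance calibration curve (neither of which is prescribed here):

```python
import numpy as np

def blur_degree(image: np.ndarray) -> float:
    """Quantify blur as the inverse of mean gradient energy
    (one possible metric; no particular metric is prescribed)."""
    gy, gx = np.gradient(image.astype(np.float64))
    return 1.0 / (np.mean(gx ** 2 + gy ** 2) + 1e-12)

def first_distance_info(first_image: np.ndarray, second_image: np.ndarray,
                        blur_to_distance) -> float:
    """Combine the two half-pixel images into the first composite image and
    convert its degree of blur into a distance via a calibration function
    (hypothetical; it must be measured for a given lens and sensor)."""
    composite = first_image.astype(np.float64) + second_image.astype(np.float64)
    return blur_to_distance(blur_degree(composite))
```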
Likewise, the left half of each B pixel PI_b1 of the first group is masked, so only its right half can sense the image, while the right half of each B pixel PI_b2 of the second group is masked, so only its left half can sense the image. The B pixels PI_b1 of the first group generate a third image, the B pixels PI_b2 of the second group generate a fourth image, and the third image and the fourth image are combined into a second composite image. As mentioned above, when the object is at the focal point the image is clear. Therefore, according to the degree of blur of the second composite image, second distance information between the object and the image sensor can be calculated.
Please note that it is not necessary to mask the left half of the first-group pixels and the right half of the second-group pixels to perform phase detection; other different parts of the first-group pixels and the second-group pixels can be masked instead. For example, the upper half of the first-group pixels and the lower half of the second-group pixels can be masked to perform phase detection (that is, different halves are masked respectively). In another embodiment, different parts of the first-group pixels and the second-group pixels, each smaller than half a pixel, can also be masked to perform phase detection.
In one embodiment, the distance information is calculated with only one type of pixel, for example only the R pixels or only the B pixels. In another embodiment, the distance information can be calculated with two types of pixels at the same time, which widens the computable range of the object distance. Referring to FIG. 4, the focal point of blue light is at F_b, so its computable range is Cr_b. That is, if the object lies outside the computable range Cr_b, the object is too far from the focal range and the second composite image captured by the image sensor is too blurred for the distance information of the object to be calculated. The focal point of red light is at F_r, so its computable range is Cr_r; therefore, if the R pixels are also used to calculate the distance information, the distance information can be calculated as long as the object lies within the computable range Cr_b or Cr_r. The computable range is thus larger than when a single type of pixel is used.
In the embodiment of FIG. 4, either the first distance information or the second distance information can be arbitrarily selected as the basis for judging the distance between the object and the image sensor. In one embodiment, the degree of compensation needed to compensate the first composite image and the second composite image into sharp images can be calculated; that is, according to the degrees of blur of the first composite image and the second composite image, it is decided whether the first distance information or the second distance information is used as the basis for judging the distance between the object and the image sensor. For example, the distance information corresponding to the clearer image may be selected as the basis for judging the distance between the object and the image sensor.
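A minimal sketch of this selection, assuming that the distance derived from the less blurred composite image (the one needing the least compensation to become sharp) is the one to keep; the comparison criterion is an illustrative choice:

```python
def choose_distance(first_distance: float, second_distance: float,
                    first_blur: float, second_blur: float) -> float:
    """Pick the distance information derived from the less blurred
    composite image as the basis for judging the object distance."""
    return first_distance if first_blur <= second_blur else second_distance
```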
Referring again to FIG. 4, a structural schematic diagram of the image sensor 400 corresponding to FIG. 3 is also shown. The image sensor 400 includes a pixel array PIM, a shielding layer BL, a filter layer CL and a microlens layer L_m. The pixel array PIM contains a plurality of pixels, and the shielding layer BL masks each pixel as shown in FIG. 3. The filter layer CL only allows light of specific wavelengths to pass and filters out unwanted light. For example, the red filter layer CL_r in the filter layer CL only allows red light to pass, so the R pixels PI_r can only receive red light. Likewise, the blue filter layer CL_b in the filter layer CL only allows blue light to pass, so the B pixels PI_b can only receive blue light. The microlens layer L_m contains the microlenses L_m1 and L_m2 shown in FIG. 2 and refracts light onto the pixel array PIM.
Please note, however, that the scope of the invention is not limited to the pixel arrangement shown in FIG. 3, nor to the structure of the image sensor 400 shown in FIG. 4. Moreover, the scope of the invention is not limited to having only one or two types of pixels. For example, in the embodiment of FIG. 5, besides the R pixels and B pixels, G pixels (green pixels) PI_g1, PI_g2 are further included. Like the R pixels and B pixels, the G pixels are divided into G pixels PI_g1 of the first group and G pixels PI_g2 of the second group (only some of them are labeled). The left half of each G pixel PI_g1 of the first group is masked, so only its right half can sense the image, while the right half of each G pixel PI_g2 of the second group is masked, so only its left half can sense the image. The G pixels PI_g1 of the first group generate a fifth image, and the G pixels PI_g2 of the second group generate a sixth image. The fifth image and the sixth image can accordingly be combined into a third composite image, and third distance information can be calculated according to the third composite image.
FIG. 6 depicts a structural schematic diagram of the image sensor 600 corresponding to FIG. 5. As shown in FIG. 6, the image sensor 600 also includes a pixel array PIM, a shielding layer BL, a filter layer CL and a microlens layer L_m. Compared with FIG. 4, the image sensor 600 of FIG. 6 additionally has a green filter layer CL_g, which only allows green light to pass, so the G pixels PI_g can only receive green light. The functions of the other layers of the image sensor 600 are the same as in the embodiment of FIG. 4 and are not described again here. As shown in FIG. 6, the focal points of red light, green light and blue light fall at different positions F_r, F_g and F_b, and the computable ranges are Cr_r, Cr_g and Cr_b respectively. Therefore, as long as the object falls within any of the computable ranges Cr_r, Cr_g and Cr_b, its distance can be calculated, so the computable range is enlarged further compared with the embodiment of FIG. 3.
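The enlarged range can be viewed as the union of the per-channel computable ranges Cr_r, Cr_g and Cr_b. A minimal sketch with made-up range bounds (real values depend on the lens and sensor geometry):

```python
# Hypothetical computable ranges (in metres) standing in for Cr_r, Cr_g
# and Cr_b in FIG. 6; the actual bounds must be measured for the device.
COMPUTABLE_RANGES = {"R": (0.3, 1.2), "G": (0.9, 2.0), "B": (1.6, 3.0)}

def valid_distances(candidates: dict) -> dict:
    """Keep only the per-channel distance estimates that fall inside that
    channel's computable range; any surviving channel can supply the
    object distance."""
    return {ch: d for ch, d in candidates.items()
            if COMPUTABLE_RANGES[ch][0] <= d <= COMPUTABLE_RANGES[ch][1]}
```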
In the embodiment of FIG. 6, any of the first distance information generated from the R pixels, the second distance information generated from the B pixels and the third distance information generated from the G pixels can be arbitrarily selected as the basis for judging the distance between the object and the image sensor. In one embodiment, the degree of compensation needed to compensate the first composite image, the second composite image and the third composite image into sharp images can be calculated; that is, according to the degrees of blur of the first composite image, the second composite image and the third composite image, it is decided which of the first distance information, the second distance information and the third distance information is used as the basis for judging the distance between the object and the image sensor.
FIG. 7 is a flowchart of an object distance calculation method according to an embodiment of the invention, implemented on an image sensor (such as the image sensor 400 of FIG. 4 or 600 of FIG. 6). The image sensor includes a plurality of first-type pixels (such as R pixels). The first-type pixels include first-type pixels of a first group (such as PI_r1 in FIG. 4) and first-type pixels of a second group (such as PI_r2 in FIG. 4). The object distance calculation method comprises the following steps:
Step 701
Mask a first part (e.g. the left half) of each first-type pixel in the first group, and mask a second part (e.g. the right half) of each first-type pixel in the second group.
Step 703
Capture a first image of an object with the unmasked part of each first-type pixel in the first group.
Step 705
Capture a second image of the object with the unmasked part of each first-type pixel in the second group.
Step 707
Combine the first image and the second image into a first composite image.
Step 709
Calculate first distance information between the object and the image sensor according to the degree of blur of the first composite image.
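A minimal sketch of steps 701 to 709, assuming the left/right half masking of FIG. 3 is modeled by software masks, and that capture_raw_frame and blur_to_distance are hypothetical stand-ins for the sensor readout and the blur-to-distance calibration:

```python
import numpy as np

def object_distance(capture_raw_frame, first_group_mask: np.ndarray,
                    second_group_mask: np.ndarray, blur_to_distance) -> float:
    """Steps 701-709 in one pass. The masks stand in for the physical
    shielding layer: 1 over the unmasked half of each pixel of a group,
    0 elsewhere."""
    raw = capture_raw_frame()                      # sensor readout (hypothetical)
    first_image = raw * first_group_mask           # step 703: first image
    second_image = raw * second_group_mask         # step 705: second image
    composite = first_image + second_image         # step 707: first composite image
    gy, gx = np.gradient(composite.astype(np.float64))
    blur = 1.0 / (np.mean(gx ** 2 + gy ** 2) + 1e-12)  # step 709: degree of blur
    return blur_to_distance(blur)                  # step 709: first distance information
```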
In the embodiment of FIG. 7, the distance information is calculated with only a single type of pixel, but as shown in FIG. 3 to FIG. 6, the distance information can also be calculated with multiple types of pixels to enlarge the computable range. The related steps can be derived from the foregoing description and are not repeated here. These multiple types of pixels can be R pixels, G pixels and B pixels as described above (that is, pixels that receive light of different wavelengths), but can also be pixels of other different types.
FIG. 8 depicts an object distance calculation device according to an embodiment of the invention. As shown in FIG. 8, the object distance calculation device 800 includes an image sensor 801 and a control unit 803. The image sensor 801 has the structure shown in FIG. 3 or FIG. 5, and the control unit 803 performs the distance-information-calculation steps of the foregoing embodiments. Please note, however, that the image sensor 801 is not limited to the structure shown in FIG. 3 or FIG. 5; for example, it does not necessarily include a lens layer, and the pixels do not necessarily need a filter layer to obtain the required light. Therefore, the object distance calculation device according to the invention can be expressed as follows: an object distance calculation device includes an image sensor and a control unit (such as 801 and 803 in FIG. 8). The image sensor includes: a plurality of first-type pixels, which comprise first-type pixels of a first group and first-type pixels of a second group (such as the R pixels in FIG. 4); a plurality of second-type pixels; and a shielding layer (such as BL in FIG. 4) that masks a first part of each first-type pixel in the first group and masks a second part of each first-type pixel in the second group. The image sensor captures a first image of an object with the unmasked part of each first-type pixel in the first group, captures a second image of the object with the unmasked part of each first-type pixel in the second group, and combines the first image and the second image into a first composite image. The control unit calculates first distance information between the object and the image sensor according to the degree of blur of the first composite image.
Besides the foregoing embodiments, the computable distance range can also be enlarged with only a single type of pixel. FIG. 9 and FIG. 10 depict an object distance calculation device according to another embodiment of the invention. Referring to FIG. 9, the object distance calculation device 900 further includes, in addition to an image sensor 901 and a control unit 903, a movable lens 905 and a lens driver 907. The image sensor 901 has a structure such as that of FIG. 3 or FIG. 5, that is, the pixels are divided into a first group and a second group, and different parts of the pixels of the different groups are masked. Please note, however, that the image sensor 901 is not limited to including two or more types of pixels and may include only one type of pixel. The lens driver 907 adjusts the position of the movable lens 905. The image sensor 901 can capture out-of-phase images of the object through the movable lens 905 and generate a composite image from these images. The control unit 903 can then calculate the distance information of the object according to the degree of blur of the composite image, as described above.
In the example of FIG. 9, the distance between the movable lens 905 and the image sensor 901 is d_1, the focal point is F_1, and the computable range is Cr_1. Therefore, if the object falls outside the computable range Cr_1, the image is too blurred for the control unit 903 to calculate the distance of the object. In this case, the control unit 903 can issue a command to the lens driver 907 so that the lens driver 907 moves the movable lens 905. For example, in the example of FIG. 10 the movable lens 905 is moved to a position at a distance d_2 from the image sensor 901; the focal point becomes F_2 and the computable range becomes Cr_2. In this case the image sensor 901 captures out-of-phase images of the object through the movable lens 905 and generates a composite image from these images, and the control unit 903 calculates the distance information of the object according to the degree of blur of the composite image, as described above. If the distance information can be calculated from the composite image obtained in FIG. 10, the position of the movable lens 905 does not need to be adjusted again. However, if the composite image obtained in FIG. 10 is still too blurred for the distance information to be calculated, the foregoing steps can be repeated, moving the movable lens 905 until a clearer composite image is obtained. With the approach of FIG. 9 and FIG. 10, since the movable lens can be moved repeatedly to change the focal point, the computable range is also enlarged.
Please note that in the foregoing example the position of the movable lens is adjusted only when the composite image is too blurred for the distance information to be calculated, but the scope of the invention is not limited thereto. For example, the control unit can move the movable lens whenever the degree of blur of the composite image exceeds a predetermined degree, regardless of whether the distance information can be calculated; this avoids calculating erroneous distance information. Alternatively, multiple pieces of distance information can be calculated by moving the movable lens 905 and their average taken, so that the distance information is more accurate. Therefore, the embodiments of FIG. 9 and FIG. 10 can be regarded as moving the movable lens according to the degree of blur of the composite image.
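A minimal sketch of this movable-lens procedure, assuming hypothetical capture_composite, blur_degree and blur_to_distance helpers and a predetermined blur threshold; the lens is stepped through successive positions and the valid distance estimates are averaged, as described above:

```python
def distance_with_movable_lens(lens_positions, capture_composite, blur_degree,
                               blur_to_distance, blur_threshold: float) -> float:
    """Move the lens through successive positions (e.g. d_1, d_2, ...),
    keep the distance estimates whose composite image is sharp enough,
    and average them for a more accurate result."""
    estimates = []
    for position in lens_positions:
        composite = capture_composite(position)   # move the lens, then capture
        blur = blur_degree(composite)
        if blur <= blur_threshold:                 # otherwise keep moving the lens
            estimates.append(blur_to_distance(blur, position))
    if not estimates:
        raise ValueError("object outside the computable range at every lens position")
    return sum(estimates) / len(estimates)
```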
Please note that the foregoing embodiments are described with the image sensor 901 having the structure of FIG. 3 or FIG. 5, but this is not intended to limit the invention. In one embodiment, the image sensor 901 includes ordinary pixels, that is, the pixels have no masked part. In this embodiment, the operation of each component in FIG. 9 can be expressed as follows: in the state of FIG. 9, the distance between the movable lens 905 and the image sensor 901 is a first distance d1. The image sensor 901 captures a first sensed image of the object through the movable lens (note that, unlike in the previous embodiments, this first sensed image is not a composite of two images). The control unit 903 calculates first distance information between the object and the image sensor 901 according to the degree of blur of the first sensed image. The control unit 903 further makes the lens driver 907 adjust the position of the movable lens 905 according to the degree of blur of the first sensed image so that the distance between the movable lens 905 and the image sensor 901 becomes a second distance d2 (as in FIG. 10), makes the image sensor 901 capture a second sensed image of the object through the movable lens 905 (again not a composite of two images), and calculates second distance information between the object and the image sensor according to the degree of blur of the second sensed image.
Likewise, in such an embodiment the movable lens can be moved only when the first sensed image is too blurred for the distance information to be calculated, or it can be moved whenever the degree of blur of the first sensed image exceeds a predetermined degree, regardless of whether the distance information can be calculated; this avoids calculating erroneous distance information. Alternatively, multiple pieces of distance information can be calculated by moving the movable lens 905 and their average taken, so that the distance information is more accurate.
In conclusion the focusing mechanism in known techniques is used in the calculating of distance by the present invention, other groups can be not necessary to Part and cumbersome calculating can calculate the range information of object.In addition, calculating range information using the pixel of multiclass, can increase Add object range information can computer capacity.And the way of movable lens position is adjusted by image blur degree, also may be used Increase object range information can computer capacity.
The above are only preferred embodiments of the invention; all equivalent changes and modifications made within the scope of the patent claims of the invention shall fall within the scope of the invention.

Claims (15)

1. An object distance calculation method, implemented on an image sensor, wherein the image sensor comprises a plurality of first-type pixels, the first-type pixels comprising first-type pixels of a first group and first-type pixels of a second group, the object distance calculation method comprising:
masking a first part of each first-type pixel in the first group, and masking a second part of each first-type pixel in the second group;
capturing a first image of an object with the unmasked part of each first-type pixel in the first group;
capturing a second image of the object with the unmasked part of each first-type pixel in the second group;
combining the first image and the second image into a first composite image, and calculating first distance information between the object and the image sensor according to the degree of blur of the first composite image or according to the degree of blur of at least one of the first image and the second image;
wherein the image sensor further comprises a plurality of second-type pixels, the second-type pixels comprising second-type pixels of a first group and second-type pixels of a second group, and the object distance calculation method further comprises:
masking a first part of each second-type pixel in the first group, and masking a second part of each second-type pixel in the second group;
capturing a third image of the object with the unmasked part of each second-type pixel in the first group;
capturing a fourth image of the object with the unmasked part of each second-type pixel in the second group;
combining the third image and the fourth image into a second composite image; and
calculating second distance information between the object and the image sensor according to the degree of blur of the second composite image.
2. The object distance calculation method of claim 1, wherein the first-type pixels and the second-type pixels respectively sense light of different wavelengths.
3. The object distance calculation method of claim 1, further comprising:
deciding, according to the degrees of blur of the first composite image and the second composite image, which one of the first distance information and the second distance information is used as the basis for judging the distance between the object and the image sensor.
4. The object distance calculation method of claim 1, wherein the image sensor further comprises a plurality of third-type pixels, the third-type pixels comprising third-type pixels of a first group and third-type pixels of a second group, and the object distance calculation method further comprises:
masking a first part of each third-type pixel in the first group, and masking a second part of each third-type pixel in the second group;
capturing a fifth image of the object with the unmasked part of each third-type pixel in the first group;
capturing a sixth image of the object with the unmasked part of each third-type pixel in the second group;
combining the fifth image and the sixth image into a third composite image; and
calculating third distance information between the object and the image sensor according to the degree of blur of the third composite image.
5. The object distance calculation method of claim 4, further comprising:
deciding, according to the degrees of blur of the first composite image, the second composite image and the third composite image, which one of the first distance information, the second distance information and the third distance information is used as the basis for judging the distance between the object and the image sensor.
6. The object distance calculation method of claim 1, wherein the first part is one half of the first-type pixel, and the second part is the other half of the first-type pixel, different from the first part.
7. The object distance calculation method of claim 1, implemented on an object distance calculation device, wherein the object distance calculation device comprises the image sensor and a movable lens, the distance between the movable lens and the image sensor is a first distance, and the image sensor captures the first image and the second image through the movable lens, the object distance calculation method further comprising:
adjusting the position of the movable lens according to the degree of blur of the first composite image so that the distance between the movable lens and the image sensor becomes a second distance, capturing a third image and a fourth image of the object with the image sensor, and forming a second composite image from the third image and the fourth image; and
calculating second distance information between the object and the image sensor according to the degree of blur of the second composite image.
8. An object distance calculation device, comprising:
an image sensor, comprising:
a plurality of first-type pixels, comprising first-type pixels of a first group and first-type pixels of a second group; and
a shielding layer masking a first part of each first-type pixel in the first group and masking a second part of each first-type pixel in the second group;
wherein the image sensor captures a first image of an object with the unmasked part of each first-type pixel in the first group, captures a second image of the object with the unmasked part of each first-type pixel in the second group, and combines the first image and the second image into a first composite image; and
a control unit that calculates first distance information between the object and the image sensor according to the degree of blur of the first composite image.
9. The object distance calculation device of claim 8, wherein the image sensor further comprises a plurality of second-type pixels, the second-type pixels comprising second-type pixels of a first group and second-type pixels of a second group, and the control unit further performs the following steps:
masking a first part of each second-type pixel in the first group, and masking a second part of each second-type pixel in the second group;
capturing a third image of the object with the unmasked part of each second-type pixel in the first group;
capturing a fourth image of the object with the unmasked part of each second-type pixel in the second group;
combining the third image and the fourth image into a second composite image; and
calculating second distance information between the object and the image sensor according to the degree of blur of the second composite image.
10. The object distance calculation device of claim 9, wherein the first-type pixels and the second-type pixels respectively sense light of different wavelengths.
11. The object distance calculation device of claim 9, wherein the control unit decides, according to the degrees of blur of the first composite image and the second composite image, which one of the first distance information and the second distance information is used as the basis for judging the distance between the object and the image sensor.
12. The object distance calculation device of claim 9, wherein the image sensor further comprises a plurality of third-type pixels, the third-type pixels comprising third-type pixels of a first group and third-type pixels of a second group, and the control unit further performs the following steps:
masking a first part of each third-type pixel in the first group, and masking a second part of each third-type pixel in the second group;
capturing a fifth image of the object with the unmasked part of each third-type pixel in the first group;
capturing a sixth image of the object with the unmasked part of each third-type pixel in the second group;
combining the fifth image and the sixth image into a third composite image; and
calculating third distance information between the object and the image sensor according to the degree of blur of the third composite image.
13. The object distance calculation device of claim 12, wherein the control unit decides, according to the degrees of blur of the first composite image, the second composite image and the third composite image, which one of the first distance information, the second distance information and the third distance information is used as the basis for judging the distance between the object and the image sensor.
14. The object distance calculation device of claim 8, wherein the first part is one half of the first-type pixel, and the second part is the other half of the first-type pixel, different from the first part.
15. The object distance calculation device of claim 8, further comprising a movable lens and a lens driver, wherein the distance between the movable lens and the image sensor is a first distance, and the image sensor captures the first image and the second image through the movable lens;
the control unit uses the lens driver to adjust the position of the movable lens according to the degree of blur of the first composite image so that the distance between the movable lens and the image sensor becomes a second distance, the image sensor captures a third image and a fourth image of the object, and the image sensor forms a second composite image from the third image and the fourth image;
and the control unit calculates second distance information between the object and the image sensor according to the degree of blur of the second composite image.
CN201510228262.8A 2015-05-07 2015-05-07 Object distance calculation method and object distance calculation device Active CN106210497B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201510228262.8A CN106210497B (en) 2015-05-07 2015-05-07 Object distance calculation method and object distance calculation device
CN201910263646.1A CN109905606B (en) 2015-05-07 2015-05-07 Object distance calculation method and object distance calculation device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510228262.8A CN106210497B (en) 2015-05-07 2015-05-07 Object distance calculation method and object distance calculation device

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN201910263646.1A Division CN109905606B (en) 2015-05-07 2015-05-07 Object distance calculation method and object distance calculation device

Publications (2)

Publication Number Publication Date
CN106210497A CN106210497A (en) 2016-12-07
CN106210497B true CN106210497B (en) 2019-05-07

Family

ID=57459597

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201910263646.1A Active CN109905606B (en) 2015-05-07 2015-05-07 Object distance calculation method and object distance calculation device
CN201510228262.8A Active CN106210497B (en) 2015-05-07 2015-05-07 Object distance calculation method and object distance calculation device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201910263646.1A Active CN109905606B (en) 2015-05-07 2015-05-07 Object distance calculation method and object distance calculation device

Country Status (1)

Country Link
CN (2) CN109905606B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108495115B (en) * 2018-04-17 2019-09-10 德淮半导体有限公司 Imaging sensor and its pixel group and pixel array, the method for obtaining image information

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011095027A (en) * 2009-10-28 2011-05-12 Kyocera Corp Imaging device
CN103379277A (en) * 2012-04-19 2013-10-30 佳能株式会社 Ranging apparatus, ranging method and imaging system
CN103453881A (en) * 2012-02-24 2013-12-18 株式会社理光 Distance measuring device and distance measuring method
CN103828361A (en) * 2011-09-21 2014-05-28 富士胶片株式会社 Image processing device, method, program and recording medium, stereoscopic image capture device, portable electronic apparatus, printer, and stereoscopic image player device
CN103856772A (en) * 2012-12-03 2014-06-11 北京大学 Method for shielding parameter calibration

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010022772A (en) * 2008-07-24 2010-02-04 Olympus Medical Systems Corp Endoscope and system for in-vivo observation
JP4927182B2 (en) * 2009-06-22 2012-05-09 株式会社 ニコンビジョン Laser distance meter
TWI407081B (en) * 2009-09-23 2013-09-01 Pixart Imaging Inc Distance-measuring device by means of difference of imaging location and calibrating method thereof
US8773570B2 (en) * 2010-06-17 2014-07-08 Panasonic Corporation Image processing apparatus and image processing method
JP5354105B2 (en) * 2010-07-23 2013-11-27 トヨタ自動車株式会社 Distance measuring device and distance measuring method
CN103037173B (en) * 2011-09-28 2015-07-08 原相科技股份有限公司 Image system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011095027A (en) * 2009-10-28 2011-05-12 Kyocera Corp Imaging device
CN103828361A (en) * 2011-09-21 2014-05-28 富士胶片株式会社 Image processing device, method, program and recording medium, stereoscopic image capture device, portable electronic apparatus, printer, and stereoscopic image player device
CN103453881A (en) * 2012-02-24 2013-12-18 株式会社理光 Distance measuring device and distance measuring method
CN103379277A (en) * 2012-04-19 2013-10-30 佳能株式会社 Ranging apparatus, ranging method and imaging system
CN103856772A (en) * 2012-12-03 2014-06-11 北京大学 Method for shielding parameter calibration

Also Published As

Publication number Publication date
CN106210497A (en) 2016-12-07
CN109905606A (en) 2019-06-18
CN109905606B (en) 2020-12-22

Similar Documents

Publication Publication Date Title
US9754422B2 (en) Systems and method for performing depth based image editing
CN106454090B (en) Atomatic focusing method and system based on depth camera
CN102959970B (en) Device, method, and program for determining obstacle within imaging range when capturing images displayed in three-dimensional view
US20120105590A1 (en) Electronic equipment
JP6173156B2 (en) Image processing apparatus, imaging apparatus, and image processing method
CN104604215A (en) Image capture apparatus, image capture method and program
CN107113415A (en) The method and apparatus for obtaining and merging for many technology depth maps
CN103384998A (en) Imaging device, imaging method, program, and program storage medium
CN104270560A (en) Multi-point focusing method and device
CN101959020A (en) Imaging device and formation method
US20190253608A1 (en) Artificial optical state simulation
CN104205827B (en) Image processing apparatus and method and camera head
CN108028887A (en) Focusing method of taking pictures, device and the equipment of a kind of terminal
CN106060376A (en) Display control apparatus, display control method, and image capturing apparatus
CN108600638B (en) Automatic focusing system and method for camera
CN103379294B (en) Image processing equipment, the picture pick-up device with this equipment and image processing method
CN104885440A (en) Image processing device, imaging device, image processing method, and image processing program
JP7378219B2 (en) Imaging device, image processing device, control method, and program
JPWO2014155813A1 (en) Image processing apparatus, imaging apparatus, image processing method, and image processing program
US20130050565A1 (en) Image focusing
US10429632B2 (en) Microscopy system, microscopy method, and computer-readable recording medium
CN106210497B (en) Object distance calculating method and object are apart from computing device
TWI539139B (en) Object distance computing method and object distance computing apparatus
US20150146072A1 (en) Image focusing
CN104125385B (en) Image editing method and image processor

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant