CN114638872A - Method for obtaining depth information, scanning device and application

Method for obtaining depth information, scanning device and application

Info

Publication number
CN114638872A
Authority
CN
China
Prior art keywords
light
extreme value
depth information
light intensity
measured surface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210290915.5A
Other languages
Chinese (zh)
Inventor
吴笛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Shengxiang Industrial Testing Technology Co ltd
Original Assignee
Shanghai Shengxiang Industrial Testing Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Shengxiang Industrial Testing Technology Co ltd filed Critical Shanghai Shengxiang Industrial Testing Technology Co ltd
Priority to CN202210290915.5A
Publication of CN114638872A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/514 Depth or shape recovery from specularities
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a method for acquiring depth information, which comprises the following steps: a light generator projects the generated light onto a measured surface for scanning, and the light source undergoes a predictable change within a period; an image sensor photographs the measured surface to obtain the light intensity of the light reflected from each area of the measured surface; under the periodic change of the light source, the light intensity of the reflected light acquired by the photosensitive device array varies; an extreme value moment calculation unit determines, from the light intensity information acquired by the photosensitive device array, the moment at which the light intensity irradiating each area of the measured surface reaches its extreme value; the plane containing the light ray projected onto the photosensitive device corresponding to each area is determined from the extreme value moment; the straight line on which the photosensitive device lies is determined; and the intersection point of the straight line and the plane is calculated to determine the spatial coordinates of the measured surface. The invention also discloses a scanning device for obtaining depth information and applications of the method. The invention has the advantages of a small amount of subsequent calculation, high resolution, and strong resistance to ambient light interference.

Description

Method for obtaining depth information, scanning device and application
Technical Field
The invention belongs to the technical field of three-dimensional image scanning, and particularly relates to a method for acquiring depth information, a scanning device and application.
Background
Three-dimensional scanning technology is widely applied in fields such as human-computer interaction, object recognition, face recognition, product quality detection, and three-dimensional modeling. Classified by the basic physical principle of measurement, the methods mainly include triangulation, time-of-flight, depth from focus, confocal methods, and interferometry.
Triangulation is currently the most widely used near-field three-dimensional scanning technology, and mainly includes active structured light and passive stereo vision. Structured light reconstructs the three-dimensional topography of an object's surface by photographing the measured surface while known light patterns or pattern sequences are projected onto it. As an active three-dimensional reconstruction technology, structured light adapts better to object surfaces of different reflectivities and textures than passive stereo vision does. It is therefore more widely used in industry and in other settings with high requirements on data reliability and integrity.
Structured light technologies can be divided into point structured light, line structured light, and area structured light according to the dimensionality of the projected light. Since point and line structured light are usually realized with laser sources in practice, the corresponding sensors are generally called point laser sensors and line laser sensors, and both are widely applied in industrial on-line inspection. A point laser sensor is typically used to measure the height of a single point, while a line laser sensor measures the profile of a cross-section of an object and is therefore also called a two-dimensional profile sensor.
A line laser sensor can obtain area-array three-dimensional data only by moving either the measured object or the sensor. Therefore, for applications that require area-array three-dimensional data, such as object recognition and three-dimensional size inspection, line laser technology scans more slowly and requires a more complex system than area-array structured light technology.
According to the projected pattern, area-array structured light technology can be divided into random projection, coded projection, or a mixture of the two. Randomly projected area-array structured light is a stereo vision technique in which an active light source enhances the surface texture of the object so as to improve the matching success rate. Randomly projected area-array structured light can complete a three-dimensional reconstruction with as little as one projection and one shot, but because of the discrete nature and uncertainty of the texture, the effective resolution, completeness, and accuracy of the three-dimensional data are poor. Coded-projection area-array structured light encodes the coordinates of the projection units and, through multiple projections and shots, projects them onto the object surface in a time-shared manner; after shooting, the pixels are decoded and restored into coordinate information. Typical coding schemes include binary codes and phase-modulated fringes. The multiple exposures and shots of coded projection carry very rich information; if a phase-shifting method is adopted, phase information finer than the pixel resolution can be obtained, and the effective resolution, completeness, and accuracy of this approach are generally higher than those of randomly projected area-array structured light. However, because multiple shots are needed, coded-projection area-array structured light cannot capture objects moving at high speed, its shooting speed is far lower than that of random projection, and the reconstruction calculation takes longer.
Most of recent innovations on area array structured light technology mainly focus on the following aspects:
(1) improving the projection patterns and the solution algorithms;
(2) changing a computing platform and an electronic hardware architecture to improve processing speed and accuracy;
(3) attempts have been made to mix random projections with coded projections.
PETERSEN JENS et al. provide a structured light technique, WO2017220595 (A1), in which a random projection pattern is rotated over multiple shots and a probability-based calculation method allows the balance between accuracy and speed to be adjusted flexibly. Although the method greatly improves the measurement accuracy achievable with random projection and brings more flexibility to the system, its precision/time-cost ratio in high-precision measurement is still lower than that of the coded projection method.
VOIGT RAINER proposes a method, DE102016113228 (A1), that combines random projection and coded projection and uses multiple projections and shots. The method effectively improves the completeness and reliability of the data, but the shooting speed drops markedly, and the ability of random projection to capture moving objects is lost.
SCHUMAN-OLSEN HENRIK et al. improve the calculation speed of coded-projection area-array structured light through GPU parallel computing and a pipeline method, WO2017125507 (A1). The performance of this method depends on the computer hardware, which increases its cost and volume.
SONG ZHANG proposes replacing sinusoidal fringes with defocused binary fringes to break through the speed limit of the projection device when projecting multi-bit-depth pictures, raising the shooting frame rate of coded-projection area-array structured light to several hundred FPS (Song Zhang, "High-resolution 3D profile with binary phase-shifting methods," Appl. Opt. 50, 1753-1757 (2011)). The method clearly sacrifices the measurement depth of field and the accuracy of the system, and still needs multiple exposures to complete one calculation. Moreover, the phase unwrapping depends on the continuity of the measured surface; if a scene containing several discontinuous surfaces is to be measured, the number of projected fringes still has to be increased, which reduces the speed advantage.
The above improvements usually raise one performance index at the cost of another, or improve the engineering aspects of hardware and algorithms, and the achievable gain is relatively limited. In essence they tune area-array structured light technology so that its characteristics meet the needs of a particular application scenario, but they cannot break through the limitations inherent in the random projection and coded projection principles.
At present, no structured light technology can achieve high-resolution, high-precision three-dimensional scanning of dynamic objects.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides a method for acquiring depth information, a scanning device for acquiring depth information, and applications of the method, which achieve high-precision three-dimensional scanning of dynamic objects without reducing the pixel resolution of the three-dimensional data and which have wider applicability.
The technical scheme adopted by the invention to solve the technical problem is as follows: a method of obtaining depth information, comprising the steps of:
the light generator projects the generated light onto the measured surface for scanning, and the light source undergoes a predictable change within a period;
the image sensor photographs the measured surface, and the light reflected from a plurality of areas of the measured surface is projected onto the photosensitive device array so as to obtain the light intensity of the reflected light of each area of the measured surface;
under the periodic change of the light source, the light intensity of the reflected light acquired by the photosensitive device array varies;
the extreme value moment calculation unit determines, from the light intensity information acquired by the photosensitive device array, the moment at which the light intensity irradiating each area of the measured surface reaches its extreme value, the extreme value being the maximum or minimum of the light intensity;
the plane containing the light ray projected onto the photosensitive device corresponding to each area is determined from the extreme value moment;
the straight line on which the photosensitive device lies is determined;
and the intersection point of the straight line and the plane is calculated to determine the spatial coordinates of the measured surface.
Further, the light is projected on the reflecting surface;
the reflecting surface rotates about an axis along the straight line where the plane of the light generator and the plane of the reflecting surface intersect, and the light is projected onto the measured surface at different angles for scanning.
Further, the light source is a linear light source moving along the surface to be measured.
Further, the linear light source may be a bright line moving in any direction, a bright band of a certain width moving in any direction, a dark line on a bright background moving in any direction, or a dark band of a certain width on a bright background moving in any direction.
Further, the light source is a variable pattern controlled by the area array light controller.
Furthermore, the extreme value moment calculation unit performs continuous analog-to-digital conversion through an analog-to-digital conversion unit to obtain a discrete sequence of the light intensity values of the photosensitive device at different moments during the light scanning, obtains the index nMax of the light intensity extreme value by analyzing the discrete sequence, and obtains the moment at which the extreme value occurs from the time interval of the analog-to-digital conversion.
Further, the extreme value moment calculation unit is implemented by an analog circuit. The analog circuit comprises a comparator U1, a follower U2, a first switch tube Q1 and a second switch tube Q2 respectively connected to the output terminal of the comparator U1, an extreme value voltage storage capacitor C1, an extreme value reference voltage storage capacitor C2, and a reference voltage source Vref; the reference voltage source Vref is generated by a controllable voltage source and its voltage is proportional to time; the input terminal of the comparator U1 is connected to the output voltage of the photosensitive device and to the extreme value voltage storage capacitor C1.
Further, the extreme value moment calculation unit determines the moment at which the light intensity extreme value occurs by determining the moment at which a peak or a trough occurs, or by determining the position of the center of gravity of the region formed by the signal curve and the time axis.
The invention also discloses a scanning device for obtaining depth information, which comprises:
the light generator is used for projecting light which changes periodically to the measured surface so as to scan;
the image sensor is used for shooting a measured surface irradiated with periodically-changed light to acquire reflected light of a plurality of areas and comprises a photosensitive device array;
the photosensitive device array is used for acquiring the light intensity of the reflected light of each area of the measured surface, the light intensity of the reflected light varying under the periodic change of the light source, and for determining the straight line on which each photosensitive device lies;
and the extreme value moment calculation unit judges the moment of occurrence of the extreme value of the light intensity irradiating each area of the measured surface according to the light intensity information acquired by the photosensitive device array, and determines the plane where the light projected to the photosensitive device corresponding to each area is located according to the moment of occurrence of the extreme value.
Furthermore, the light generator comprises a generator for generating linear light and a reflecting surface; the reflecting surface can rotate about an axis along the straight line where the plane of the generator and the plane of the reflecting surface intersect, so that the light generated by the generator is projected onto the measured surface at different angles.
Further, the photosensitive devices correspond to the plurality of regions one to one; the extreme value moment calculation units correspond to the photosensitive devices one to one, or a plurality of photosensitive devices correspond to one extreme value moment calculation unit.
Further, the extreme value moment calculation unit and the image sensor are packaged into a whole.
The invention also discloses application of the method for acquiring the depth information, which is applied in the field of human-computer interaction, or object recognition, or face recognition, or product quality detection, or three-dimensional modeling.
Based on the principle of triangulation, the invention provides a novel method for acquiring image data containing depth information and a corresponding scanning device. The basic principle is that a pattern that is predictable but varies with time is projected onto the scene to be measured by a light source, the scene is photographed by an image sensor, and the three-dimensional coordinates of each pixel are calculated from the moment at which the light intensity extreme value acquired by that pixel occurs within a shooting period. At least one light source irradiates the measured scene, which may be a fixed or a moving object surface; the position or pattern of the light source on the measured scene undergoes a predictable continuous or intermittent change within a shooting period; at least one depth image sensor photographs the scene while the light source irradiates it, and the depth image sensor can acquire the moment at which the brightness of each irradiated pixel reaches its maximum or minimum. The extreme value moment information obtained by the depth image sensor can then be used to calculate the coordinate information of the measured scene. The invention can be used for three-dimensional scanning, i.e. acquiring the three-dimensional shape information of an object.
The invention has the following advantages: (1) the light source has a simple structure and costs less than the complex light source system needed for coded projection; (2) the amount of subsequent calculation is smaller than for random projection or coded projection; (3) a 3D data resolution equal to the native pixel resolution of the sensor can be achieved; (4) the shooting speed is increased while keeping a high pixel resolution of the three-dimensional data, and the resistance to ambient light interference is strong; (5) the method is suitable for high-precision scanning of dynamic objects; (6) compared with conventional structured light methods, it offers better technical indexes and wider applicability.
Drawings
FIG. 1 is a schematic view of a scanning device according to the present invention.
FIG. 2 is a schematic diagram showing the relationship between the projection brightness and the ambient light and the background noise.
Fig. 3 is a schematic diagram of an extreme value time calculation unit implemented by an analog circuit according to the present invention.
FIG. 4 is a schematic diagram of a scanning method according to the present invention.
In the figures: 1, generator; 2, image sensor; 3, measured surface; 4, reflecting surface; A, area of the measured surface; D, photosensitive device; P, pixel.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1, a scanning device for acquiring depth information includes a light generator, an image sensor, a photosensor array, and an extreme value time calculation unit.
In the present embodiment, the light generator is disposed above the measured surface 3, and the image sensor is disposed obliquely above the measured surface 3, opposite to the light generator.
The light generator projects periodically varying light toward the measured surface 3 to scan it. The light source is either a linear light source moving along the measured surface 3 or a varying pattern controlled by an area-array light controller.
In this embodiment, one implementation manner of the light generator is to control the light by an area array light controller such as a DMD or an LCOS, so that the pattern of the light source irradiated on the detected scene is controllably changed with time.
In another implementation, the light generator comprises a generator 1 that produces a linear light source and a reflecting surface 4. The reflecting surface 4 can rotate about the straight line (passing through point O and perpendicular to the paper plane in Fig. 1) along which the plane of the generator 1 and the plane of the reflecting surface 4 intersect, so that the light generated by the generator 1 is projected onto the measured surface 3 at different angles; in this arrangement the light emitted by the generator 1 is parallel to the horizontal plane. More specifically, the laser light generated by a linear laser generator 1 irradiates the reflecting surface 4, and the reflecting surface 4 can rotate about the straight line where the plane of the linear laser and the plane of the reflecting surface 4 intersect, so that the projection angle of the laser line changes with the angle of the reflecting surface 4, thereby scanning the laser over the measured scene, which may be fixed or moving. The rotary motion of the reflecting surface 4 is controllable, and the reflecting surface 4 may be a micro laser galvanometer based on MEMS technology, a motor-driven laser galvanometer, or another device.
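As an illustration only of how the mirror angle determines the projected light plane, the following minimal sketch reflects the incident beam direction across the mirror normal and forms the plane spanned by the reflected direction and the laser-line direction. The coordinate axes, the pivot point O at the origin, and the function names are assumptions made for this example and are not prescribed by the invention.

import numpy as np

def reflect(d, n):
    """Reflect a propagation direction d across a mirror with unit normal n."""
    n = n / np.linalg.norm(n)
    return d - 2.0 * np.dot(d, n) * n

def light_plane(theta, O=np.zeros(3)):
    """Plane of the projected laser sheet for a mirror angle theta (radians).

    Assumed layout (illustrative only): the incident beam travels along +x and
    is horizontal, the laser line is parallel to the y axis, and the mirror
    pivots about the y axis through point O; theta is the tilt of the mirror
    normal in the x-z plane.  Returns (point_on_plane, unit_normal).
    """
    d_in = np.array([1.0, 0.0, 0.0])                         # incident beam direction
    n_mirror = np.array([np.cos(theta), 0.0, np.sin(theta)])  # mirror normal at angle theta
    d_out = reflect(d_in, n_mirror)                           # reflected beam direction
    line_dir = np.array([0.0, 1.0, 0.0])                      # direction of the laser line
    n_plane = np.cross(d_out, line_dir)                       # normal of the projected light plane
    return O, n_plane / np.linalg.norm(n_plane)

Because of the law of reflection, rotating the mirror by a given angle rotates the reflected sheet by twice that angle, which is why the projection angle follows the mirror angle as described above.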
The periodically changing projection of the light source, i.e. the light projection pattern, may take various forms over time: a bright line moving in any direction, a bright band of a certain width moving in any direction, a dark line on a bright background moving in any direction, or a dark band of a certain width on a bright background moving in any direction. A line pattern has the advantage of high scanning accuracy, while a band pattern has the advantage of a large dynamic range. A bright pattern is better suited to objects whose surface has low reflectivity, whereas a dark pattern is better suited to objects whose surface has high reflectivity. The different pattern variations belong to different implementations of the invention.
The image sensor 2 photographs the measured surface 3 irradiated by the periodically varying light source to obtain the light reflected from a plurality of areas A. The image sensor comprises a photosensitive device array; the array, formed by a plurality of photosensitive devices D, acquires the light intensity of the light reflected from each area A of the measured surface 3. This light intensity varies under the periodic change of the light source and is used to determine the equation of the straight line associated with each photosensitive device D.
More specifically, one individual photosensitive device D is referred to as one pixel P. Through the optical device (lens) 21, the light reflected from a small area A of the measured scene is transmitted to the corresponding pixel P, so that the photosensitive device D of that pixel P senses the light intensity of the corresponding area of the measured surface 3; the light intensity measured by the photosensitive device D is denoted I(t).
As shown in Fig. 2, the light intensity I(t) sensed by each pixel of the image sensor varies with time during a shooting period. It consists of the background noise In(t), the ambient light Is(t), and the projection brightness Il(t) of the light source in the scene area corresponding to pixel P. When the bright part of the light source projection pattern reaches area A, Il(t) increases; when the bright part leaves area A, Il(t) decreases. The background noise In(t) and the ambient light Is(t) should be suppressed to such a level that, as the projection pattern moves, the variation in I(t) caused by the variation in Il(t) is significant.
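To make the relationship of Fig. 2 concrete, the following minimal sketch builds a synthetic pixel signal from the three components named above. The pulse shape, noise level, ambient level, and the crossing instant t_true are all assumptions chosen only for illustration.

import numpy as np

# Synthetic pixel signal: background noise In(t) + ambient light Is(t) +
# projection brightness Il(t) of the moving bright line (assumed Gaussian pulse).
t = np.linspace(0.0, 1.0, 1000)                      # one shooting period (arbitrary units)
In = 0.02 * np.random.rand(t.size)                   # background noise In(t)
Is = 0.10 * np.ones_like(t)                          # ambient light Is(t)
t_true = 0.37                                        # instant the bright line crosses area A
Il = np.exp(-((t - t_true) / 0.01) ** 2)             # projection brightness Il(t)
I = In + Is + Il                                     # intensity I(t) seen by device D

# The extremum stands out as long as In(t) and Is(t) are suppressed relative to
# the swing caused by Il(t), as the description requires.
print("estimated extremum time:", t[np.argmax(I)])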
The extreme value moment calculation unit determines, from the light intensity information acquired by the photosensitive device array, the moment at which the light intensity irradiating each area of the measured surface 3 reaches its extreme value, and determines from that moment the equation of the plane containing the light ray projected onto the photosensitive device D corresponding to each area A.
Specifically, the light intensity signal I(t) of the pixel is connected to the extreme value moment calculation unit. The extreme value moment calculation unit may be included in the image sensor 2 and packaged with it as a whole, or may be independent. The photosensitive devices D correspond one-to-one to the plurality of areas A of the measured surface 3. Each pixel P may be provided with its own extreme value moment calculation unit, i.e. the calculation units correspond one-to-one to the photosensitive devices D, or several pixels P may share one calculation unit by time-division switching, i.e. several photosensitive devices D correspond to one calculation unit. In one implementation, each pixel P has a separate extreme value moment calculation unit, and all the calculation units are packaged inside the image sensor 2.
The extreme value moment calculation unit determines the moment tc at which the light intensity I'(t) irradiating the photosensitive device D reaches its extreme value, based on the light intensity I(t) measured by the photosensitive device D. Because the light intensity I'(t) impinging on the photosensitive device D may be so strong that D saturates and the signal is clipped, I(t) is not always equal to I'(t). To work even when I(t) is clipped, the extreme value moment calculation unit may determine tc as the central position of the segment where the I(t) signal crosses a threshold line Ith. Other implementations of the extreme value moment calculation unit include, but are not limited to, determining the moment at which a peak or a trough occurs, or determining the position of the center of gravity of the region formed by the signal curve and the time axis.
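The two clipping-tolerant estimators mentioned above can be sketched as follows; the function names, the baseline parameter, and the assumption that t and I are sampled at matching instants are choices made for this illustration, not part of the invention.

import numpy as np

def tc_threshold_midpoint(t, I, Ith):
    """Midpoint of the span where I(t) is at or above the threshold line Ith.

    Usable even when I(t) is clipped by saturation, because only the
    threshold crossings are used."""
    above = np.flatnonzero(np.asarray(I) >= Ith)
    if above.size == 0:
        return None
    return 0.5 * (t[above[0]] + t[above[-1]])

def tc_center_of_gravity(t, I, baseline=0.0):
    """Center of gravity of the region between the signal curve and the time axis."""
    w = np.clip(np.asarray(I, dtype=float) - baseline, 0.0, None)
    return float(np.sum(w * t) / np.sum(w))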
In this embodiment, one implementation of the extreme value moment calculation unit performs continuous analog-to-digital conversion through an analog-to-digital conversion unit, obtains a discrete sequence of the light intensity values of pixel P at different moments during the light source scan, obtains the index nMax of the light intensity extreme value by analyzing the discrete sequence, and obtains the extreme value moment from the time interval of the analog-to-digital conversion.
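A minimal sketch of this digital implementation is given below; the function name and the option to search for a minimum instead of a maximum are assumptions, and a practical unit might additionally interpolate between neighbouring samples.

import numpy as np

def extremum_time_adc(samples, dt, find_max=True):
    """Moment of the light-intensity extremum from a discrete ADC sequence.

    samples: intensity values digitized at a fixed interval dt during one scan.
    Returns nMax * dt, i.e. the index of the extremum converted to a time.
    """
    samples = np.asarray(samples)
    nMax = int(np.argmax(samples)) if find_max else int(np.argmin(samples))
    return nMax * dt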
Another implementation of the extreme value moment calculation unit uses an analog circuit. As shown in Fig. 3, the analog circuit comprises a comparator U1, a follower U2, a first switch tube Q1 and a second switch tube Q2 respectively connected to the output terminal of the comparator U1, an extreme value voltage storage capacitor C1, an extreme value reference voltage storage capacitor C2, and a reference voltage source Vref. The reference voltage source Vref is generated by a controllable voltage source, and its voltage is proportional to time. The input terminal of the comparator U1 is connected to the output voltage of the photosensitive device D and to the extreme value voltage storage capacitor C1. When the output voltage of the photosensitive device D exceeds the voltage on the extreme value voltage storage capacitor C1, the comparator U1 turns on the switch tube Q1, and the output voltage of the photosensitive device D, buffered by the follower U2, charges the capacitor C1, so that the voltage on C1 always holds the maximum of the output voltage of the photosensitive device D during the light source scan. When the first switch Q1 is turned on, the second switch Q2 is turned on at the same time, and the current instantaneous value of the reference voltage source Vref is stored on the extreme value reference voltage storage capacitor C2. The voltage on C2 may be used directly as the output value, or a digital signal may be output after analog-to-digital conversion. Since the voltage of Vref is proportional to time, the voltage on C2 represents the moment at which the output of the photosensitive device D reached its peak.
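The behaviour of this circuit can be mimicked in software as a check of the description. The sketch below is a behavioural abstraction only; the initial value held on C1, the assumption that the waveforms are available as matching sample sequences, and the function name are choices made for this example, not a circuit-level model.

def simulate_peak_hold(v_pixel, v_ref):
    """Behavioural sketch of the analog extremum-time circuit of Fig. 3.

    v_pixel: sampled output voltage of photosensitive device D over one scan.
    v_ref:   sampled reference voltage Vref, assumed proportional to time.
    Whenever v_pixel exceeds the voltage held on C1, comparator U1 turns on Q1
    (C1 is charged to the new maximum through follower U2) and Q2 stores the
    instantaneous Vref on C2, so that C2 finally encodes the peak time.
    """
    c1 = float("-inf")        # extreme value voltage storage capacitor C1
    c2 = 0.0                  # extreme value reference voltage storage capacitor C2
    for vp, vr in zip(v_pixel, v_ref):
        if vp > c1:           # comparator U1 output goes high
            c1 = vp           # Q1 on: C1 follows the new maximum
            c2 = vr           # Q2 on: C2 stores the current Vref
    return c1, c2             # c2 divided by the slope of Vref gives tc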
The output value of the extreme value moment calculation unit is the extreme value moment tc, which indicates the time at which the center of the bright part of the light source irradiates the center of the pixel. The extreme value is computed in the time domain, so the resolution is not limited by the spacing between pixels. Because the change of the light source projection pattern over time is determined in advance and can be predicted, the equation of the plane containing the light ray that irradiates the pixel can be determined from the extreme value moment tc. The camera parameters are calibrated in advance, so the equation of the straight line on which the pixel lies can be calculated; the intersection of this line with the plane is then computed, and that point is the spatial coordinate of the surface of the measured scene imaged by the pixel.
As shown in fig. 4, a method for acquiring depth information is performed by the scanning device, and the method includes the following steps:
the light generator projects the generated light onto the measured surface 3 to scan it, and the light source undergoes a predictable change within a period;
the light source may be realized by controlling light with an area-array light controller such as a DMD or an LCOS, so that the pattern of the light source on the measured scene changes controllably with time; alternatively, the generator 1 may project the generated light onto the reflecting surface 4, which rotates about the straight line where the plane of the generator 1 and the plane of the reflecting surface 4 intersect, projecting the light onto the measured surface 3 at different angles for scanning.
Under the periodically-changed projection of the light source, the image sensor shoots the measured surface 3, and the reflected light rays of a plurality of areas A of the measured surface 3 are respectively projected to the photosensor array so as to obtain the light intensity of the reflected light rays of each area of the measured surface 3;
the light intensity of the reflected light acquired by the photosensitive device array changes due to the periodic change of the light source;
the extreme value time calculation unit judges the occurrence time tc of the extreme value of the light intensity irradiated in each area of the measured surface 3 according to the light intensity information acquired by the photosensitive device array, wherein the extreme value is the maximum value or the minimum value of the light intensity;
determining a plane equation where the light rays projected to the photosensitive device D corresponding to each area A are located according to the extreme value occurrence time tc;
the parameters of the camera are calibrated in advance, so that the linear equation of the photosensitive device D, namely the pixel P, can be determined;
from the above line equation and plane equation, the intersection point of the two is calculated, for example by the vector method; this point is the spatial coordinate of the part of the measured surface imaged by pixel P, and thus the spatial coordinate of the measured surface 3 is determined.
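One way to carry out this last step is the vector form of the line-plane intersection sketched below; the argument names (camera center c, ray direction d, plane point p0, plane normal n) and the tolerance eps are assumptions made for this example.

import numpy as np

def line_plane_intersection(c, d, p0, n, eps=1e-12):
    """Vector-method intersection of a pixel's line of sight with the light plane.

    c:  camera center from calibration; d: direction of the line through pixel P;
    p0: a point on the light plane (e.g. a point on the rotation axis);
    n:  normal of the light plane determined from the extremum moment tc.
    Returns the 3D intersection point, i.e. the spatial coordinate of the
    measured surface, or None if the line is parallel to the plane.
    """
    c, d, p0, n = (np.asarray(v, dtype=float) for v in (c, d, p0, n))
    denom = np.dot(n, d)
    if abs(denom) < eps:
        return None
    s = np.dot(n, p0 - c) / denom
    return c + s * d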
The method for obtaining depth information can be applied in the fields of human-computer interaction, object recognition, face recognition, product quality detection, or three-dimensional modeling.
The foregoing detailed description is intended to illustrate rather than limit the invention; any changes and modifications that fall within the true spirit and scope of the invention are intended to be covered by the appended claims.

Claims (13)

1. A method of obtaining depth information, comprising the steps of:
the light generator projects the generated light onto the measured surface for scanning, and the light source undergoes a predictable change within a period;
the image sensor photographs the measured surface, and the light reflected from a plurality of areas of the measured surface is projected onto the photosensitive device array so as to obtain the light intensity of the reflected light of each area of the measured surface;
under the periodic change of the light source, the light intensity of the reflected light acquired by the photosensitive device array varies;
the extreme value moment calculation unit determines, from the light intensity information acquired by the photosensitive device array, the moment at which the light intensity irradiating each area of the measured surface reaches its extreme value, the extreme value being the maximum or minimum of the light intensity;
the plane containing the light ray projected onto the photosensitive device corresponding to each area is determined from the extreme value moment;
the straight line on which the photosensitive device lies is determined;
and the intersection point of the straight line and the plane is calculated to determine the spatial coordinates of the measured surface.
2. The method of obtaining depth information of claim 1, wherein:
light is projected on the reflecting surface;
the reflecting surface rotates about an axis along the straight line where the plane of the light generator and the plane of the reflecting surface intersect, and the light is projected onto the measured surface at different angles for scanning.
3. The method of obtaining depth information of claim 2, wherein: the light source is a linear light source moving along the measured surface.
4. The method of obtaining depth information of claim 3, wherein: the linear light source is a bright line moving in any direction, a bright band of a certain width moving in any direction, a dark line on a bright background moving in any direction, or a dark band of a certain width on a bright background moving in any direction.
5. The method of obtaining depth information of claim 1, wherein: the light source is a variable pattern controlled by the area array light controller.
6. The method of obtaining depth information of claim 1, wherein: the extreme value moment calculation unit performs continuous analog-to-digital conversion through an analog-to-digital conversion unit, obtains a discrete sequence of the light intensity values of the photosensitive device at different moments during the light scanning, obtains the index nMax of the light intensity extreme value by analyzing the discrete sequence, and obtains the moment at which the light intensity extreme value occurs from the time interval of the analog-to-digital conversion.
7. The method of obtaining depth information of claim 1, wherein: the extreme value moment calculation unit is implemented by an analog circuit; the analog circuit comprises a comparator U1, a follower U2, a first switch tube Q1 and a second switch tube Q2 respectively connected to the output terminal of the comparator U1, an extreme value voltage storage capacitor C1, an extreme value reference voltage storage capacitor C2, and a reference voltage source Vref, wherein the reference voltage source Vref is generated by a controllable voltage source and its voltage is proportional to time, and the input terminal of the comparator U1 is connected to the output voltage of the photosensitive device and to the extreme value voltage storage capacitor C1.
8. The method of obtaining depth information of claim 1, wherein: the extreme value moment calculation unit determines the moment of the light intensity extreme value by determining the moment at which a peak or a trough occurs, or by determining the position of the center of gravity of the region formed by the signal curve and the time axis.
9. A scanning device for obtaining depth information, comprising:
the light generator is used for projecting light which changes periodically to the measured surface so as to scan;
the image sensor is used for shooting a measured surface irradiated with periodically-changed light to acquire reflected light of a plurality of areas and comprises a photosensitive device array;
the photosensitive device array is used for acquiring the light intensity of the reflected light of each area of the measured surface, the light intensity of the reflected light varying under the periodic change of the light source, and for determining the straight line on which each photosensitive device lies;
and the extreme value moment calculation unit judges the moment of occurrence of the extreme value of the light intensity irradiating each area of the measured surface according to the light intensity information acquired by the photosensitive device array, and determines the plane where the light projected to the photosensitive device corresponding to each area is located according to the moment of occurrence of the extreme value.
10. The scanning device for obtaining depth information according to claim 9, wherein: the light generator comprises a generator for generating linear light and a reflecting surface; the reflecting surface can rotate about an axis along the straight line where the plane of the generator and the plane of the reflecting surface intersect, so that the light generated by the generator is projected onto the measured surface at different angles.
11. The scanning device for obtaining depth information according to claim 9, wherein: the photosensitive devices correspond to the plurality of areas one by one; the extreme value moment calculation units correspond to the photosensitive devices one to one, or a plurality of photosensitive devices correspond to one extreme value moment calculation unit.
12. The scanning device for obtaining depth information according to claim 8, wherein: and the extreme value moment calculation unit and the image sensor are packaged into a whole.
13. An application of a method for obtaining depth information, characterized by: the method is applied to the field of human-computer interaction, or object recognition, or face recognition, or product quality detection, or three-dimensional modeling.
CN202210290915.5A 2022-03-23 2022-03-23 Method for obtaining depth information, scanning device and application Pending CN114638872A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210290915.5A CN114638872A (en) 2022-03-23 2022-03-23 Method for obtaining depth information, scanning device and application

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210290915.5A CN114638872A (en) 2022-03-23 2022-03-23 Method for obtaining depth information, scanning device and application

Publications (1)

Publication Number Publication Date
CN114638872A true CN114638872A (en) 2022-06-17

Family

ID=81949320

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210290915.5A Pending CN114638872A (en) 2022-03-23 2022-03-23 Method for obtaining depth information, scanning device and application

Country Status (1)

Country Link
CN (1) CN114638872A (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102589476A (en) * 2012-02-13 2012-07-18 天津大学 High-speed scanning and overall imaging three-dimensional (3D) measurement method
CN107726053A (en) * 2016-08-12 2018-02-23 通用电气公司 Probe system and detection method
CN106597463A (en) * 2016-12-29 2017-04-26 天津师范大学 Photoelectric proximity sensor based on dynamic vision sensor (DVS) chip, and detection method
CN112945086A (en) * 2020-11-21 2021-06-11 重庆大学 Structured light coding method based on space sequence and light intensity threshold segmentation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WANG Hui, "Digital Holographic Three-Dimensional Display and Detection" (数字化全息三维显示与检测), Shanghai Jiao Tong University Press, 30 November 2013, pages 175-178 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination