US6373557B1 - Method and apparatus for picking up a three-dimensional range image - Google Patents
- Publication number
- US6373557B1 (granted from application US09/581,091)
- Authority
- US
- United States
- Prior art keywords
- sensor
- pixel
- integration
- light
- integration time
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Lifetime
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60N—SEATS SPECIALLY ADAPTED FOR VEHICLES; VEHICLE PASSENGER ACCOMMODATION NOT OTHERWISE PROVIDED FOR
- B60N2/00—Seats specially adapted for vehicles; Arrangement or mounting of seats in vehicles
- B60N2/002—Seats provided with an occupancy detection means mounted therein or thereon
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R21/00—Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
- B60R21/01—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
- B60R21/015—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting the presence or position of passengers, passenger seats or child seats, and the related safety parameters therefor, e.g. speed or timing of airbag inflation in relation to occupant position or seat belt use
- B60R21/01512—Passenger detection systems
- B60R21/0153—Passenger detection systems using field detection presence sensors
- B60R21/01538—Passenger detection systems using field detection presence sensors for image processing, e.g. cameras or sensor arrays
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/04—Systems determining the presence of a target
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
- G01S17/894—3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/4802—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/483—Details of pulse systems
- G01S7/486—Receivers
- G01S7/487—Extracting wanted echo signals, e.g. pulse detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
Definitions
- the present invention relates to a method and a device for picking up a three-dimensional range image of spatial objects.
- Three-dimensional sensing and image-processing systems are becoming increasingly important for a variety of tasks in industrial technology.
- Known optical radar systems such as laser radar are based either on the principle of measuring laser pulse transit times or on the determination of the phase difference of modulated laser light for the purpose of deriving the object's distance. Additional mechanical scanning devices are necessary in order to build a three-dimensional imaging system. This leads to a relatively expensive electronic and mechanical outlay, which limits the use of such three-dimensional systems to a few specific applications.
- a method for picking up a three-dimensional range image of spatial objects using an optoelectronic sensor with pixel resolution having electronic short-time integrators for each pixel element within the sensor, wherein an integration time can be adjusted.
- the method includes the steps of illuminating an object having a plurality of object points with one or more light pulses, each having a predetermined period τ L . Light pulses that have been backscattered by object points of the object are then sensed with the sensor at corresponding pixels within a predetermined short integration time τ A , where τ A ≦ τ L .
- a time instant for the beginning of the predetermined short integration time τ A precedes the incidence of the first backscattered light pulse at the sensor, which corresponds to the nearest object point.
- intensities of each of the sensed light pulses that have been backscattered by the object points are registered and distance values are computed from different registered intensities of the backscattered light pulses resulting from their different transit times.
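The conversion of registered intensities into distance values can be sketched as follows. The linear charge model, the function name, and the numeric example are assumptions for illustration; the patent does not give this closed form:

```python
C = 3.0e8  # speed of light in m/s

def distance_from_charges(q_a, q_b, tau_a, tau_l, t_d=0.0):
    # q_a: charge integrated within the short window tau_a
    # q_b: reference charge for the full pulse length tau_l
    # Assumption: charge grows linearly with the overlap between the
    # backscattered pulse and the integration window.
    overlap = (q_a / q_b) * tau_l      # time the pulse overlapped the window
    t_transit = tau_a + t_d - overlap  # round-trip transit time 2R/c
    return C * t_transit / 2.0         # one-way distance R

# An object 1.5 m away delays the pulse by 10 ns round trip; with a
# 50 ns pulse and a 50 ns window, only 40/50 of the charge is collected.
print(distance_from_charges(0.8, 1.0, 50e-9, 50e-9))
```

The farther the object, the later the pulse arrives and the less charge falls inside the fixed window, which is exactly the transit-time-to-charge conversion described above.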
- a method for picking up a three-dimensional range image of spatial objects using an optoelectronic sensor with pixel resolution includes the steps of first picking up and integrating the sensor signal of the sensor from the beginning of the picking up and integration to a defined integration time T 2 .
- This integration represents dark current and environmental light.
- an object is illuminated by an illumination device simultaneous to the beginning of the picking up and integration of the sensor signal at the sensor.
- the integration occurs during a light intensity rise of the light received at the sensor up to an integration time T 1 , where T 1 < T 2 .
- the object is then repeatedly illuminated by the illumination device with simultaneous starting of the picking up and integration of the sensor signal at the sensor, wherein integration occurs within the light intensity rise of the light received at the sensor up to the integration time T 2 . The respectively integrated values of the sensor signal for all pixels are read out and stored at times T 1 and T 2 .
- a transit time T 0 of the light from the illumination device to the sensor via the object and a corresponding distance value based on the stored integrated values is calculated for each pixel.
- an apparatus for picking up a three-dimensional range image including an illuminating device that emits light pulses onto an object via a first optical system.
- An optoelectronic sensor with a second optical system is configured to sense received light pulses backscattered by the object within an adjustable integration time and is comprised of a plurality of pixel elements to provide a pixel resolution, the pixel elements being randomly readable and configured to adjust the integration time pixel by pixel.
- a triggering mechanism is included that is configured to provide time synchronization between the illumination device and the sensor.
- a computing unit is included to calculate a three-dimensional image from corresponding charges of pixel elements of the sensor that have been charged by the received light pulses.
- the present invention is based on the recognition that an extremely fast registration of a three-dimensional range image is possible using a randomly readable optoelectronic sensor with pixel resolution whose integration time can be adjusted point by point.
- an object is illuminated with one or more very short light pulses, whereupon light impulses of the same length are backscattered by the object.
- These backscattered light pulses are conducted to the optoelectronic chip via a corresponding optical system. Owing to the difference in the distances of different points of the object from the sensor, backscattered light pulses that correspond to respective locations will arrive at the sensor at different times.
- a time measuring window is opened for ranging whose duration corresponds to a predeterminable integration time.
- the integration time is less than or equal to the length of the emitted and, thus, of the reflected light pulses. Hence, it is guaranteed that a uniform cutoff of the backscattered light pulses occurs at the sensor at the end of the integration time.
- the light pulses of each pixel element that arrive with a time delay are cut off in back, so that the different transit times can be converted into charge differences based on the different charges in the raster of the optoelectronic sensor.
- a three-dimensional range image can be computed in this way.
- a mere light intensity rise having a steep edge is used, which is correspondingly registered and evaluated at the sensor.
- the measurement result becomes independent of the course of the trailing edge of the light pulse.
- the influence of a dark current, which is generated by the operating heat of a sensor element, and the environmental light (unwanted light) portion can be exactly compensated for each pixel.
- the dark current and the environmental light are acquired by a total of three consecutive measurements.
- the light quantities that are reflected by the object and received at the sensor are integrated in the form of a sensor signal in the context of an illumination, this process then being repeated with a longer integration time. From this, the transit time of the light can be computed for each object point by a corresponding interpolation. This allows the use of lower light powers while at the same time affording more precise measurement of the transit time and, thus, of the distance to the object.
- all light pulses are registered simultaneous with the above described measurement process using a very long integration time or are registered after this with their full length at a time offset. This is used for normalizing, so that differences in the reflection behavior of the object can be detected and compensated.
- the essential advantages of the invention are that mechanical shutters are not used, for example. Extremely short image pick-up times can, thus, be realized.
- the utilized optoelectronic sensor is generally referred to as a CMOS sensor, though this is merely the technical term for the semiconductor component. Using this type of sensor, minimum integration times of 30 to 50 ns can be realized (jitter at less than 0.1%). Ongoing technical development can be expected to shorten the achievable integration times further.
- FIG. 1 illustrates the functional principle for acquiring a three-dimensional range image using a CMOS sensor
- FIG. 2 is a schematic representation of a time shift of two light impulses whose pertaining object points are different distances from the CMOS sensor, relative to integration windows;
- FIG. 3 shows two variants of the sensor for simultaneously acquiring three-dimensional range images and intensity or gray value images, respectively, using a CMOS sensor;
- FIG. 4 shows the schematic representation of vehicle interior surveillance using a three-dimensional CMOS sensor
- FIG. 5 shows the ranging process using an integrated CMOS image sensor, with representation of the signal of the laser diode at the transmit side and of the sensor signals at the receive side;
- FIG. 6 shows the ranging process using an integrated CMOS image sensor, with FIG. 6 a representing the operation of a laser diode at the transmit side and FIG. 6 b representing the sensor signals that are achieved by continuous integration at the sensor;
- FIG. 7 is a time correlation showing the relations between illumination at the transmit side and detection of a laser impulse at the receive side, with measuring signals in the contexts of a short integration time and a very long integration time being represented at the bottom of the Figure;
- FIG. 8 shows the time correlation of the transmit-side and receive-side representation of a laser pulse, with two different short integration times provided in connection with the illumination control of the sensor.
- FIG. 9 shows a schematic arrangement of light sources illuminating an object in predetermined areas, the light then being detected by sensors.
- a method for the serial and simultaneous acquisition or generation of an intensity and a three-dimensional range image of a spatial object using an optoelectronic sensor under short-term illumination.
- the method exploits the transit time differences between the light pulses that are backscattered by the three-dimensional objects in the pixel-synchronous detection at the sensor within short integration times.
- a CMOS sensor is used. This sensor has a photosensitivity of 1 mLux, for example. Furthermore, it has a high intensity dynamic of up to 10 7 , a random access to the individual pixels and an adjustable integration time (sample & hold) for measuring the charge quantity Q(t) given illumination at the individual pixel.
- CMOS does not require expensive mechanical shutters, and high-powered laser light sources need not be used for the temporary illumination.
- the method is particularly suited to detecting persons and movement sequences in surveillance applications, for instance for monitoring the interior or exterior of a vehicle, for crane automation, and for navigation.
- the spatial objects that are to be captured are illuminated using short light pulses (e.g., ≦100 ns).
- the illumination can be performed with laser light, for instance with a pulsed laser diode or with light sources such as a pulsed LED diode.
- the method is independent of the angle of the illumination, which need not necessarily occur centrally in relation to the general direction of detection. It is also conceivable to use a ring light for coaxial illumination and detection.
- the arrangement represented in FIG. 1 serves only for the schematic illustration of the functional principle.
- a first image pick-up A is performed with a short integration time τ A at the CMOS sensor.
- the light pulses 3 of the length τ L (≦100 ns) that are backscattered by the object points G of the three-dimensional scene 1 are acquired at the pixels 9 of the CMOS sensor within a set short integration time τ A ≦ τ L .
- An electronic trigger pulse from electronic trigger emitter 8 produces a fixed time relation between the emitted light pulse 2 and the opening of the integration time window at the CMOS sensor. Due to the transit time of the light, there is a different time shift depending on object distance R of t = 2R/c (the round trip at the speed of light c), where:
- I 0 represents the intensity of the emitted light impulse;
- O R represents the surface reflection coefficient at the object point G;
- t D is the trigger point time delay between the emitted light pulse and the start of the integration window at the CMOS sensor.
- For object points G with the same surface reflection coefficient O R , a different charge Q A is measured at the corresponding pixel of the CMOS sensor depending on their distance R. In this way, small differences in the transit time of the light pulses are transformed into charge differences Q A , so that an integrated charge is representative of a respective object point G and its respective distance R. In a CMOS sensor these can be detected with a high degree of sensitivity and with a high dynamic. Objects of a three-dimensional scene, however, usually have differing surface reflection coefficients.
- a second image pick-up Q B is performed, which is dependent only on the surface reflection of the objects of the three-dimensional scene.
- a second image pick-up B with a long integration time τ B serves for normalizing the surface reflection of the three-dimensional scene, where, in principle, the customary intensity image or gray value image is used.
- a second integration time τ B is set at the CMOS sensor, which is quite large compared to the length of an illumination pulse (τ B ≫ τ L , e.g., 1 μs).
- the image obtained is dependent only on the illumination intensity I 0 , the coefficient of surface reflection O R of the corresponding object point, and the length τ L of the light pulse.
- the three-dimensional range image Q R is generated from the difference and normalization of image pick-ups A and B, or respectively, Q A and Q B , according to the condition:
- This value can be output directly as range image Q R subsequent to the readout, digitization, and additional scaling for all pixels. If the trigger delay time t D does not equal 0, then a constant offset is added to all points of the range image Q R :
- R D is the distance value corresponding to t D (charge offset).
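The constant offset R D follows directly from the trigger delay and the speed of light; a minimal sketch (the function name is an assumption for illustration):

```python
C = 3.0e8  # speed of light in m/s

def trigger_delay_offset(t_d):
    # A trigger delay t_d shifts every measured round-trip time by t_d,
    # which corresponds to a constant one-way distance offset R_D that
    # must be added to all points of the range image.
    return C * t_d / 2.0

print(trigger_delay_offset(10e-9))  # a 10 ns delay offsets all ranges by 1.5 m
```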
- the simultaneous pick-up of intensity images and three-dimensional images is related to an execution of a spatially and chronologically parallel acquisition of intensity values and distance values.
- a chip architecture and a pixel-related integration time are selected such that directly adjacent pixels A and B corresponding to FIG. 3 pick up the backscattered light impulses 3 of the three-dimensional scene on the CMOS sensor with a short integration time τ A ≦ τ L (for pixel A) and simultaneously acquire these impulses with a long integration time τ B ≫ τ L (for pixel B).
- the three-dimensional range image of the allocated pixels A and B can then be directly computed according to the equation:
- FIG. 3 is a schematic of two possible arrangements on the CMOS sensor for the parallel detection of intensity and the three-dimensional range image. Further variants are possible for this.
- the simultaneous detection of intensity and the three-dimensional range image is important, particularly for the analysis of moving three-dimensional scenes, such as the detection of human gestures or for tracking an object.
- an additional normalizing of the three-dimensional range image with respect to environmental light may be executed.
- the charge of a pixel is first acquired with short and long integration times without illumination of the three-dimensional scene or the object, respectively, and subtracted from charges Q A and Q B that are measured under illumination. This is followed by the calculation of the range image.
- the robustness of the method against noise can be increased, given low backscattered light intensities, by forming a time average of the signals of several light impulses.
- the measurement uncertainty for the distance determination depends on the signal/noise ratio of the CMOS sensor. Transit time differences as low as 0.1 ns should still be detectable. This results in a measurement uncertainty of less than 3 cm for the distance determination.
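The figure quoted above can be checked by a one-line conversion from transit-time resolution to one-way distance uncertainty:

```python
C = 3.0e8  # speed of light in m/s

def range_uncertainty(dt):
    # One-way distance uncertainty for a round-trip transit-time
    # resolution dt: the light covers the distance twice.
    return C * dt / 2.0

# A 0.1 ns transit-time resolution gives 1.5 cm, comfortably inside
# the "less than 3 cm" bound stated above.
print(range_uncertainty(0.1e-9))
```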
- the applications of a preferred embodiment of the method and the apparatus of the present invention relate to the monitoring of interior spaces, particularly in vehicles, in connection with volumetric evaluation methods.
- An object of the optical surveillance of interior space in vehicles is to detect seat occupancy (e.g., people, child seats, or other objects), to register the seat position of people, and to provide security against theft (i.e., registering unauthorized penetration into the interior of the vehicle from the outside).
- the detection of people and their seat position is extremely important in terms of safety for the gradual release of an airbag (smart airbag) and must be performed very reliably and in short measuring times.
- the present invention satisfies these requirements by a rapid and reliable generation of a three-dimensional range image Q R in the vehicle interior, wherein volumetric evaluation methods are employed.
- the portions of net volume in the vehicle's interior that are occupied by objects 1 are determined from the distance values R in a solid angle element Ω as the difference relative to the distance values when the vehicle's interior is unoccupied (see FIG. 4).
- the present method and apparatus deliver other significant advantages, such as fast, global detection of the current seat occupancy by forming the difference of a three-dimensional range image of the vehicle interior without objects (three-dimensional reference image Q RO ) and the three-dimensional range image, which is currently being evaluated, with a person or some other object Q RP on a seat.
- R 0 stands for the distance values without a person or other object
- R p stands for the distance values with a person or other object on the seat
- dF is a differential area
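With R 0 , R p and dF defined as above, the volume difference can be sketched as a discrete sum over pixels. The function name and the uniform area element per pixel are assumptions made for illustration:

```python
def net_volume(r_empty, r_occupied, d_f):
    # Discrete version of the volume difference: sum of (R_0 - R_p) * dF
    # over all pixels.  r_empty and r_occupied are per-pixel distance
    # values (same length); d_f is an assumed uniform area element.
    return sum((r0 - rp) * d_f for r0, rp in zip(r_empty, r_occupied))

# Two pixels whose distances shrink by 0.5 m and 1.0 m when a person
# sits down, each subtending an area element of 0.01 m^2:
print(net_volume([2.0, 2.0], [1.5, 1.0], 0.01))
```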
- the adaptive detection of the seat occupancy from the calculation of the relative distance changes before and after a person enters the vehicle can also be effected by the present invention.
- the reliability of the difference determination can be increased further by applying regressive and stochastic evaluation methods.
- Determination of the size of detected objects and the global discrimination of objects via volume comparison classes are also possible with the present invention.
- volumetric tracking of movement sequences in the space, given chronologically consecutive image pick-ups and difference formation, is possible, as is the recognition of persons and gestures from the motion analysis.
- This integral volume observation makes possible a global detection of objects and positions in space and is not reliant on the determination of features such as contours, corners, or edges in the image in order to recognize the object.
- the evaluation times can be under 10 ms for the three-dimensional image pick-up and the volumetric evaluation.
- a vehicle interior is one particular field of application of the described method and apparatus.
- an object is illuminated with LED impulses of, for instance, 50 ns (nanoseconds) for the three-dimensional image pick-up.
- the integration times at the CMOS sensor are selected at 50 ns for the image pick-up Q A and at 0.5 μs for the image pick-up Q B .
- the scene dynamic, which is to be detected, in the vehicle interior should equal 200:1.
- the digital acquisition of the three-dimensional range image Q R is thus guaranteed by a 12-bit A/D converter.
- a maximum of 10 4 read operations are necessary for the image pick-ups A with a short integration time and B with a long integration time.
- These read operations lead to a total image pick-up time for the three-dimensional range image of 5 ms at the most, given read frequencies of 2 MHz, for example.
- the calculation of the difference volumes from the 2500 distance values can be performed in another 5 ms without difficulty using a fast processor such as a Pentium® operating at 200 MHz.
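The 5 ms pick-up time quoted above follows directly from the stated figures; a back-of-the-envelope check:

```python
# Values taken from the text: at most 10**4 read operations for the
# two image pick-ups, at a 2 MHz read frequency.
reads = 10**4
read_frequency_hz = 2e6
pickup_time_s = reads / read_frequency_hz
print(pickup_time_s)  # 0.005 s, i.e. the quoted 5 ms upper bound
```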
- FIG. 4 shows a schematic of an application of the invention in vehicle interiors.
- the arrows with broken lines represent an unoccupied seat, and those with solid lines represent a seat that is occupied by a person.
- the surrounding portion of net volume is determined from the three-dimensional distance data given an occupied and unoccupied vehicle.
- the net volume V P of a person or some other object on the car seat is calculated according to equation (7).
- T 0 transit time of the light
- τ A integration time;
- U ges : measuring signal for τ B minus the dark current portion for τ B ;
- U P = U ges − (measuring signal portion for τ A − dark current portion for τ A ).
- In FIG. 8, the registered measuring signals at the sensor are evaluated by means of an interpolation method.
- the transit time of the light from the light source to the sensor via the object is given by the intersection of the curve of the measuring signal in FIG. 8, as it is crossed by the curve of the dark current portion.
- Three-dimensional image acquisition, which is required in a number of industrial applications of image processing, is necessary particularly for the automatic surveillance of spaces, such as a car interior, for example. Overly high requirements are not placed on the precision of the range image; range images with some 1000 pixels would already suffice for spatial surveillance in most cases. Conventional triangulation methods suffer from high cost as well as from the large measurement base that they require.
- Both the illumination type corresponding to FIG. 7 and the type corresponding to FIG. 8 can realize a fast and cost-effective pick-up of a three-dimensional range image.
- the transit time of the light which is necessary for the evaluation, is achieved for each image element point of the sensor 4 via the interpolation.
- Instead of a light pulse with a definite length, only a rise in light intensity with a steep edge is evaluated.
- the laser pulse reflected by the object is cut off by two different integration times.
- the measuring signal becomes independent of the course of the trailing edge of the light pulse, and on the other hand, it is possible to precisely compensate for the influence of the dark current, which arises by virtue of the operating temperatures of a sensor, for example, and the influence of the environmental light for each pixel.
- FIG. 5 shows the distance measurement with an integrated CMOS image sensor.
- the laser diode illuminating at the transmit side is represented by its rectangular light impulse.
- the measuring signals picked up at the receive side are represented.
- the solid line that leads generally from the origin of the coordinate system to the voltage U D is the first measurement performed and contains a dark current portion plus an extraneous light portion.
- U D is picked up at integration time T 2 , which is greater than another integration time T 1 .
- In this first measurement, the object 1 shown in FIG. 1 is not yet illuminated with the laser diode, so that only the dark currents in the individual pixels, together with the extraneous light portion, are integrated.
- In the second measurement, under illumination, the measuring signal rises from time T 0 more sharply in correspondence to the brightness of the respective pixel.
- At time T 1 , the voltage U 1 for all pixels is read out and stored.
- A further illuminated measurement integrates up to the integration time T 2 that is already known from the first dark current measurement, yielding the voltage U 2 .
- For example, T 1 equals 30 ns and T 2 equals 60 ns.
- the light transit time T 0 can be computed according to the formula represented in FIG. 5 . If a line is extended through the points U 2 and U 1 , then further down this line intersects the line representing the dark current, which runs between the origin of the coordinate system and the voltage U D . At the intersection, the light transit time T 0 can be read off. All values for U 1 and U 2 , or for ΔU, are likewise read out and stored for all pixels. The transit time T 0 can be precisely and unambiguously computed for every pixel from the voltages U D , U 1 , U 2 and ΔU that are stored for each pixel, in connection with the predetermined integration times T 1 and T 2 , even when there are relatively high dark current portions U D . The following relation applies:
- T 0 = (U 1 ·ΔT − ΔU·T 1 ) / (U D ·ΔT/T 2 − ΔU), where ΔT = T 2 − T 1 and ΔU = U 2 − U 1 .
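The interpolation described above, intersecting the illuminated signal line with the dark-current line, can be sketched directly in code. The synthetic voltages in the usage example are constructed from assumed dark-current and signal slopes, not taken from the patent:

```python
def transit_time(u_d, u_1, u_2, t_1, t_2):
    # Intersect the line through (T1, U1) and (T2, U2) with the
    # dark-current line through the origin and (T2, U_D); the
    # intersection time is the light transit time T0.
    d_t = t_2 - t_1
    d_u = u_2 - u_1
    return (u_1 * d_t - d_u * t_1) / (u_d * d_t / t_2 - d_u)

# Synthetic example (assumed slopes): dark current 0.01 V/ns, signal
# 0.05 V/ns starting at T0 = 10 ns, with T1 = 30 ns and T2 = 60 ns.
u_d = 0.01 * 60                     # 0.6 V of dark current at T2
u_1 = 0.01 * 30 + 0.05 * (30 - 10)  # 1.3 V at T1
u_2 = 0.01 * 60 + 0.05 * (60 - 10)  # 3.1 V at T2
print(transit_time(u_d, u_1, u_2, 30.0, 60.0))  # recovers T0 = 10 ns
```

Because the dark-current slope is measured separately and enters the denominator, the recovered T 0 is unaffected by the dark current and ambient light levels, which is the compensation property claimed above.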
- a preferred embodiment provides that, in order to reduce the laser power, which is extremely critical for reasons of cost, the above described process is repeated several times in succession, and the resulting values for U 1 , U D , and ΔU are read out and digitized only at the end of the multiple illumination of the CMOS sensor. See FIGS. 6 a and 6 b in this regard.
- An analog average value formation for the multiple illumination on the CMOS sensor also avoids the relatively long readout times of a later digital averaging.
- the described steps make it possible to calculate the light transit time T 0 precisely in the presence of dark current and environmental light, to read out the signal from the CMOS sensor only after the multiple illumination, whereupon the digitization follows, and to adjust the multiple illumination adaptively to the reflectivity of the object.
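The benefit of reading out only after the pulse burst can be sketched as follows (a simplified model with hypothetical numbers and names, not taken from the patent; the point is that readout and digitization noise enter once per burst rather than once per pulse):

```python
import random

def burst_measurement(n_pulses, pulse_signal, readout_sigma, rng):
    # Each laser pulse adds its integrated photo-signal to the pixel
    # storage; the accumulated value is read out and digitized only
    # once, at the end of the burst, so the readout noise is added once.
    accumulated = n_pulses * pulse_signal
    return accumulated + rng.gauss(0.0, readout_sigma)
```

Dividing the result by the number of pulses gives a per-pulse estimate whose readout-noise contribution shrinks by that factor, which is what permits a lower laser power for the same accuracy.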
- the previously required laser power can be reduced by a factor of 10 to 20, or alternatively the accuracy can be increased.
- the sensor principle used in the image sensor is an integrated method, based on use of an n + -p photodiode, for example.
- This photodiode is a component of an electronic short-term integrator, which also comprises a capacitor and several transistors. The circuit is configured such that the capacitor is discharged depending on the light that strikes the photodiode; this is controlled via what is known as a shutter transistor. The potential remaining in the capacitor is then read out, for example.
- the time control of the electronic short-term integrator generates what is known as a strobe signal for controlling a light source.
- An electronic short-term integrator (electronic shutter) such as this is used for each pixel element 9 of the sensor 4 .
- instead of the potential remaining in the capacitor at the end of a measurement, the potential that has already been drained off can also be used as the measurement value.
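Such a short-term integrator can be modeled in a few lines (a sketch with hypothetical currents and capacitance; the sign convention here reports the integrated charge as a positive measuring signal):

```python
def integrated_voltage(t_int, t0, i_photo, i_dark, cap):
    """Model of an electronic short-term integrator: the dark current
    (plus extraneous light) integrates over the whole shutter window of
    length t_int, while the photocurrent contributes only from the light
    arrival time t0 until the shutter closes."""
    charge = i_dark * t_int + i_photo * max(0.0, t_int - t0)
    return charge / cap
```

This reproduces the straight-line segments of FIG. 5: below T 0 the signal slope is i_dark/cap, above it (i_dark + i_photo)/cap.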
- FIG. 6 a shows several laser pulses that are switched in succession at the transmit side.
- the integration time T 1 is represented in FIG. 6 b in connection with the respective voltage U 1 and the dark current portion U D .
- the same can be represented for T 2 , U 2 and U D .
- by contrasting FIGS. 7 and 8, it can be seen that the interpolation method corresponding to FIG. 8 has shorter illumination times.
- the short shutter times of, for example, 30 ns and 60 ns as represented in FIG. 8 , and of 60 ns as represented in connection with a very long laser pulse period in FIG. 7 , define the integration times at the sensor.
- the time relation between the illumination at the transmit side and the arrival of the laser pulse at the receive side is shown.
- the embodiments represented in FIGS. 5 to 8 do not have a trigger delay time; this means that the measurement window at the receive side is opened with the beginning of the transmitted light pulse.
- for the representation in FIG. 7, the short-term shutter (60 ns) cuts off the received laser pulse (related to an object point or image element point) at time Δ A .
- the duration of the light pulse is Δ L at both the transmit and receive sides.
- the electronic short-term integrator at the sensor delivers as measurement value a potential that is integrated, depending on the transit time, from the time T 0 to the shutter end Δ A .
- the integration time Δ B is used to compensate for reflectivity differences at the object 1 . In addition, a dark current and an extraneous light portion are determined, which can be correspondingly subtracted from the measuring signal.
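One common evaluation for such a shutter pair, sketched here as an assumption rather than quoted from the patent, divides the cut-off signal by the full-pulse signal so that reflectivity and laser power cancel: with a pulse of duration Δ L arriving at T 0 and cut off at Δ A , the photo-signal U A is proportional to Δ A − T 0 , while the full-pulse signal U B (captured within the longer window Δ B ) is proportional to Δ L , giving T 0 = Δ A − (U A /U B )·Δ L :

```python
def transit_time_from_ratio(u_a, u_b, delta_a, delta_l):
    """u_a: photo-signal integrated up to the shutter cutoff delta_a
    (proportional to delta_a - T0); u_b: photo-signal of the complete
    pulse (proportional to delta_l). Both are assumed to have the dark
    current and extraneous light portion already subtracted, as
    described above. The ratio cancels the object reflectivity."""
    return delta_a - (u_a / u_b) * delta_l
```

For instance, with Δ A = 60 ns, Δ L = 100 ns, and a pulse arriving at T 0 = 10 ns, the cut-off window captures half the pulse (U A /U B = 0.5), and the relation returns 10 ns.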
- FIG. 8 shows a diagram that corresponds to FIG. 7, the upper graph of which is identical to that of FIG. 7 .
- two short-term shutter times are shown, corresponding to the integration times T 1 and T 2 . These are used to cut off the laser pulses that impinge at the sensor 4 , as in FIG. 7 .
- the measuring signal has a dark current and extraneous light portion as shown in FIGS. 7 and 8.
- the measuring signal is thus the result of the addition of the photocurrent portion to the dark current and the extraneous light portion.
- the photocurrent portion can be computed by subtracting the dark current and extraneous light portion from the measuring signal.
- the transit time T 0 of the light emerges as the point on the time axis at which, given an incident reflected light pulse, the measuring signal diverges from the normal course of the dark current and extraneous light portions, because the photocurrent portion is no longer zero.
- the evaluation which yields the light transit time T 0 was described in connection with FIG. 5 .
- a measurement object is partially illuminated in series. Illumination and evaluation occur simultaneously.
- an object 1 is partially illuminated and respectively evaluated in series, wherein a specific part of the object 1 is allocated to one or more light sources 10 , respectively.
- the rise time of the intensity of a light source 10 , for instance a laser, can be significantly shortened, possibly to 0.1 ns.
- FIG. 9 shows a schematic arrangement of three light sources 10 , that respectively illuminate an object 1 in predetermined areas 11 .
- the sensor 4 receives the reflected light portions that correspond to the partial areas 11 on the object 1 and processes them.
- This development allows the limitation, for instance, of the laser power of an illumination unit that has a laser.
- the serial illumination and detection can be realized cost-effectively, and it is easy to remain below the maximum laser powers prescribed by specific standards.
- the rise time of the laser intensity can be shortened considerably, for instance to 0.1 nsec.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Electromagnetism (AREA)
- Mechanical Engineering (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Aviation & Aerospace Engineering (AREA)
- Transportation (AREA)
- Optical Radar Systems And Details Thereof (AREA)
- Length Measuring Devices By Optical Means (AREA)
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE19757595 | 1997-12-23 | ||
DE19757595A DE19757595C2 (de) | 1997-12-23 | 1997-12-23 | Verfahren und Vorrichtung zur Aufnahme eines dreidimensionalen Abstandsbildes |
DE19833207A DE19833207A1 (de) | 1998-07-23 | 1998-07-23 | Verfahren und Vorrichtung zur Aufnahme eines dreidimensionalen Abstandsbildes |
DE19833207 | 1998-07-23 | ||
PCT/DE1998/003344 WO1999034235A1 (fr) | 1997-12-23 | 1998-11-14 | Procede et dispositif pour la prise d'une vue tridimensionnelle permettant la mesure d'une distance |
Publications (1)
Publication Number | Publication Date |
---|---|
US6373557B1 true US6373557B1 (en) | 2002-04-16 |
Family
ID=26042787
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/581,091 Expired - Lifetime US6373557B1 (en) | 1997-12-23 | 1998-11-14 | Method and apparatus for picking up a three-dimensional range image |
Country Status (6)
Country | Link |
---|---|
US (1) | US6373557B1 (fr) |
EP (1) | EP1040366B1 (fr) |
JP (1) | JP3860412B2 (fr) |
KR (1) | KR100508277B1 (fr) |
DE (1) | DE59809883D1 (fr) |
WO (1) | WO1999034235A1 (fr) |
Cited By (73)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020096624A1 (en) * | 2001-01-24 | 2002-07-25 | Mckee Bret A. | Method and apparatus for gathering three dimensional data with a digital imaging system |
US6581961B1 (en) | 1999-12-17 | 2003-06-24 | Trw Vehicle Safety Systems Inc. | Deactivation of second stage of air bag inflator |
US20030188089A1 (en) * | 1999-12-21 | 2003-10-02 | Ronald S. Perloff | Hash cam having a reduced width comparison circuitry and its application |
US20040021771A1 (en) * | 2002-07-16 | 2004-02-05 | Xenogen Corporation | Method and apparatus for 3-D imaging of internal light sources |
WO2004086087A1 (fr) * | 2003-03-25 | 2004-10-07 | Thomson Licensing Sas | Detection d'un rayonnement electromagnetique |
US6810135B1 (en) | 2000-06-29 | 2004-10-26 | Trw Inc. | Optimized human presence detection through elimination of background interference |
US6822687B1 (en) * | 1999-07-08 | 2004-11-23 | Pentax Corporation | Three-dimensional image capturing device and its laser emitting device |
US6822681B1 (en) * | 1998-07-02 | 2004-11-23 | Pentax Corporation | Image capturing device for capturing a shape of a measurement subject |
US20050040632A1 (en) * | 2003-08-20 | 2005-02-24 | Kabushiki Kaisha Toshiba | Distance detecting apparatus, air bag system controlling apparatus, and method of detecting distance |
US20050078297A1 (en) * | 2001-12-21 | 2005-04-14 | Gunter Doemens | Device for monitoring spatial areas |
EP1544535A1 (fr) * | 2003-12-20 | 2005-06-22 | Leuze lumiflex GmbH + Co. KG | Dispositif de surveillance de la zone dans la portée d'un outil de travail |
US20050151721A1 (en) * | 2004-01-09 | 2005-07-14 | Cheah Chiang S. | Image acquisition timing system and method |
US20050162638A1 (en) * | 2004-01-28 | 2005-07-28 | Denso Corporation | Apparatus, method, and program for generating range-image-data |
US20050203641A1 (en) * | 2002-03-18 | 2005-09-15 | Sick Ag | Sensor-machine interface and method for operation thereof |
US20060006309A1 (en) * | 2004-07-06 | 2006-01-12 | Jerry Dimsdale | Method and apparatus for high resolution 3D imaging |
US20060007422A1 (en) * | 2004-07-06 | 2006-01-12 | Jerry Dimsdale | System and method for determining range in 3D imaging systems |
WO2006030989A1 (fr) * | 2004-09-17 | 2006-03-23 | Matsushita Electric Works, Ltd. | Capteur d'image telemetrique |
FR2877279A1 (fr) * | 2004-11-03 | 2006-05-05 | Faurecia Sieges Automobile | Systeme de detection de presence d'un occupant sur un siege de vehicule automobile, siege adapte a ce systeme et vehicule comportant un tel systeme |
US20060262971A1 (en) * | 2005-05-18 | 2006-11-23 | Scott Foes | Transient defect detection algorithm |
US20060268153A1 (en) * | 2005-05-11 | 2006-11-30 | Xenogen Corporation | Surface contruction using combined photographic and structured light information |
EP1748304A1 (fr) * | 2005-07-27 | 2007-01-31 | IEE International Electronics & Engineering S.A.R.L. | Procédé de fonctionnement d'un pixel d'imageur de temps de vol |
WO2007031102A1 (fr) * | 2005-09-15 | 2007-03-22 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Detection d'un rayonnement optique |
US7274815B1 (en) | 2003-10-09 | 2007-09-25 | Sandia Corporation | Parallel phase-sensitive three-dimensional imaging camera |
US20070237363A1 (en) * | 2004-07-30 | 2007-10-11 | Matsushita Electric Works, Ltd. | Image Processing Device |
US20070253908A1 (en) * | 2002-07-16 | 2007-11-01 | Xenogen Corporation | Fluorescent light tomography |
US7298415B2 (en) | 2001-07-13 | 2007-11-20 | Xenogen Corporation | Structured light imaging apparatus |
US20070270697A1 (en) * | 2001-05-17 | 2007-11-22 | Xenogen Corporation | Method and apparatus for determining target depth, brightness and size within a body region |
US20070273687A1 (en) * | 2003-10-15 | 2007-11-29 | Ron Daniel | Device for Scanning Three-Dimensional Objects |
US20080052052A1 (en) * | 2006-08-24 | 2008-02-28 | Xenogen Corporation | Apparatus and methods for determining optical tissue properties |
US20080143085A1 (en) * | 1992-05-05 | 2008-06-19 | Automotive Technologies International, Inc. | Vehicular Occupant Sensing Techniques |
US20080181487A1 (en) * | 2003-04-18 | 2008-07-31 | Stephen Charles Hsu | Method and apparatus for automatic registration and visualization of occluded targets using ladar data |
US20080186475A1 (en) * | 2006-11-21 | 2008-08-07 | Tadashi Kawata | Method and Apparatus for Position Judgment |
EP1956570A1 (fr) * | 2007-02-09 | 2008-08-13 | Siemens Aktiengesellschaft | Procédé de surveillance centrale et agencement de réception, d'évaluation et d'affichage sélectif d'images de personnes inactives |
US20080245952A1 (en) * | 2007-04-03 | 2008-10-09 | Troxell John R | Synchronous imaging using segmented illumination |
US20080273758A1 (en) * | 2005-11-14 | 2008-11-06 | Oliver Fuchs | Apparatus and method for monitoring a spatial area, in particular for safeguarding a hazardous area of an automatically operated installation |
WO2009025373A1 (fr) | 2007-08-22 | 2009-02-26 | Hamamatsu Photonics K.K. | Dispositif d'imagerie à semi-conducteurs et dispositif de mesure d'image de distance |
US20090123061A1 (en) * | 2007-11-13 | 2009-05-14 | Samsung Electronics Co., Ltd. | Depth image generating method and apparatus |
US20090135405A1 (en) * | 2005-09-30 | 2009-05-28 | Marc Fischer | Device and Method for Recording Distance-Measuring Images |
WO2009074584A1 (fr) * | 2007-12-10 | 2009-06-18 | Selex Sensors And Airborne Systems Limited | Système d'imagerie |
US7554652B1 (en) | 2008-02-29 | 2009-06-30 | Institut National D'optique | Light-integrating rangefinding device and method |
WO2009105857A1 (fr) | 2008-02-29 | 2009-09-03 | Institut National D'optique | Dispositif et procédé de télémétrie à intégration de lumière |
CN1954236B (zh) * | 2004-07-30 | 2010-05-05 | 松下电工株式会社 | 图像处理设备 |
US20110134222A1 (en) * | 2008-08-03 | 2011-06-09 | Microsoft International Holdings B.V. | Rolling Camera System |
US20110176146A1 (en) * | 2008-09-02 | 2011-07-21 | Cristina Alvarez Diez | Device and method for measuring a surface |
US20120268727A1 (en) * | 2008-04-14 | 2012-10-25 | Olaf Schrey | Optical Distance Measuring Device and Method for Optical Distance Measurement |
US20140022349A1 (en) * | 2012-07-20 | 2014-01-23 | Samsung Electronics Co., Ltd. | Image processing apparatus and method |
EP2702464A2 (fr) * | 2011-04-25 | 2014-03-05 | Microsoft Corporation | Modes de diodes laser |
US9000350B1 (en) * | 2012-09-11 | 2015-04-07 | Rockwell Collins, Inc. | Time-domain overlap imagery detecting system and method of using same |
US20150096375A1 (en) * | 2013-10-07 | 2015-04-09 | Avaya Inc. | Device proximity detection |
WO2016054670A1 (fr) * | 2014-10-09 | 2016-04-14 | Trumpf Maschinen Austria Gmbh & Co. Kg. | Dispositif de mesure d'angle de cintrage |
US9360554B2 (en) | 2014-04-11 | 2016-06-07 | Facet Technology Corp. | Methods and apparatus for object detection and identification in a multiple detector lidar array |
CN106133552A (zh) * | 2013-11-14 | 2016-11-16 | 欧都思影像公司 | 用于照明物体的方法 |
US9723233B2 (en) | 2012-04-18 | 2017-08-01 | Brightway Vision Ltd. | Controllable gated sensor |
US9866816B2 (en) | 2016-03-03 | 2018-01-09 | 4D Intellectual Properties, Llc | Methods and apparatus for an active pulsed 4D camera for image acquisition and analysis |
DE102017106073B3 (de) | 2017-03-21 | 2018-03-29 | Elmos Semiconductor Aktiengesellschaft | Verfahren zur Erfassung der Verzögerung zwischen einem Spannungspuls an einem Leuchtmittel und einem Shutter-an-Signal zur Verbesserung von Verfahren und Vorrichtungen zur Messung der Lichtlaufzeit |
DE102017106071B3 (de) | 2017-03-21 | 2018-03-29 | Elmos Semiconductor Aktiengesellschaft | Verfahren zur Erfassung der Verzögerung zwischen einem Lichtpuls und einem Shutter-an-Signal zur Verbesserung von Verfahren und Vorrichtungen zur Messung der Lichtlaufzeit |
DE102017106072B3 (de) | 2017-03-21 | 2018-03-29 | Elmos Semiconductor Aktiengesellschaft | Verfahren zur Erfassung der Verzögerung zwischen einem Strompuls an einem Leuchtmittel und einem Shutter-an-Signal zur Verbesserung von Verfahren und Vorrichtungen zur Messung der Lichtlaufzeit |
DE102017106077B3 (de) | 2017-03-21 | 2018-07-26 | Elmos Semiconductor Aktiengesellschaft | Vorrichtung zur Signalverzögerung unter Benutzung eines Quarz-Oszillators und deren Anwendung in einer TOF-Kamera |
DE102017106078B3 (de) | 2017-03-21 | 2018-07-26 | Elmos Semiconductor Aktiengesellschaft | Vorrichtung zur Signalverzögerung unter Benutzung eines MEMS Oszillators und deren Anwendung in einer TOF-Kamera |
DE102017106076B3 (de) | 2017-03-21 | 2018-07-26 | Elmos Semiconductor Aktiengesellschaft | Vorrichtung zur Signalverzögerung unter Benutzung eines Referenzoszillators und deren Anwendung in einer TOF-Kamera |
US10036801B2 (en) | 2015-03-05 | 2018-07-31 | Big Sky Financial Corporation | Methods and apparatus for increased precision and improved range in a multiple detector LiDAR array |
US10073164B2 (en) * | 2013-06-26 | 2018-09-11 | Panasonic Intellectual Property Management Co., Ltd. | Distance-measuring/imaging apparatus, distance measuring method of the same, and solid imaging element |
US20180321363A1 (en) * | 2017-05-02 | 2018-11-08 | Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. | Device for determining a distance from an object, and corresponding method |
US10203399B2 (en) | 2013-11-12 | 2019-02-12 | Big Sky Financial Corporation | Methods and apparatus for array based LiDAR systems with reduced interference |
US10401483B2 (en) | 2014-12-02 | 2019-09-03 | Odos Imaging Ltd. | Distance measuring device and method for determining a distance |
US10422859B2 (en) | 2013-06-27 | 2019-09-24 | Panasonic Intellectual Property Management Co, Ltd. | Distance measuring device and solid-state image sensor |
CN110509055A (zh) * | 2019-09-04 | 2019-11-29 | 恩坦华汽车零部件(镇江)有限公司 | 一种汽车玻璃升降器微型直流电机齿轮壳装配方法 |
DE112011101123B4 (de) | 2010-03-31 | 2019-12-12 | Honda Motor Co., Ltd. | Festkörper-Bildgerät |
US10557703B2 (en) | 2014-11-21 | 2020-02-11 | Rockwell Automation Limited | Distance measuring device and method for determining a distance |
US10908288B2 (en) | 2015-03-26 | 2021-02-02 | Fujifilm Corporation | Distance image acquisition apparatus and distance image acquisition method |
US11092678B2 (en) | 2018-06-21 | 2021-08-17 | Analog Devices, Inc. | Measuring and removing the corruption of time-of-flight depth images due to internal scattering |
WO2021253308A1 (fr) * | 2020-06-18 | 2021-12-23 | 深圳市汇顶科技股份有限公司 | Appareil d'acquisition d'image |
US11730370B2 (en) | 2006-08-24 | 2023-08-22 | Xenogen Corporation | Spectral unmixing for in-vivo imaging |
Families Citing this family (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE19908215A1 (de) * | 1999-02-25 | 2000-09-21 | Siemens Ag | Vorrichtung und Verfahren zum optischen Erfassen eines Objektes oder einer Person im Innenraum eines Fahrzeugs |
US6323942B1 (en) * | 1999-04-30 | 2001-11-27 | Canesta, Inc. | CMOS-compatible three-dimensional image sensor IC |
JP4810052B2 (ja) * | 2000-06-15 | 2011-11-09 | オートモーティブ システムズ ラボラトリー インコーポレーテッド | 乗員センサ |
DE10051505C2 (de) * | 2000-10-17 | 2002-09-12 | Astrium Gmbh | Verfahren und Vorrichtung zum Erzeugen von 3D-Entfernungsbildern |
WO2002041031A1 (fr) * | 2000-11-14 | 2002-05-23 | Siemens Aktiengesellschaft | Dispositif de traitement de donnees image et procede y relatif |
DE50208355D1 (de) * | 2001-08-06 | 2006-11-16 | Siemens Ag | Verfahren und vorrichtung zur aufnahme eines dreidimensionalen abstandsbildes |
JP2003149717A (ja) * | 2001-11-19 | 2003-05-21 | Mitsubishi Heavy Ind Ltd | 撮像方法及び撮像装置 |
DE10253437B4 (de) * | 2002-11-12 | 2007-02-15 | Iris-Gmbh Infrared & Intelligent Sensors | Vorrichtung und Verfahren zum Erfassen einer Topografie in drei Dimensionen |
JP4645177B2 (ja) * | 2004-11-30 | 2011-03-09 | パナソニック電工株式会社 | 計測装置 |
DE102004038302A1 (de) * | 2004-08-04 | 2006-03-16 | Iris-Gmbh Infrared & Intelligent Sensors | Expertensystem und mobiles Assistenzgerät |
DE102005016556A1 (de) | 2005-04-11 | 2006-10-12 | Sick Ag | Verfahren zum Betrieb eines optoelektronischen Sensors |
KR100665321B1 (ko) * | 2005-06-03 | 2007-01-09 | 주식회사 삼성산업 | Spc가 피복된 강관의 구조 |
DE102006016026A1 (de) | 2006-04-05 | 2007-10-11 | Sick Ag | Distanzmessvorrichtung |
DE102006029025A1 (de) * | 2006-06-14 | 2007-12-27 | Iris-Gmbh Infrared & Intelligent Sensors | Vorrichtung und Verfahren zur Abstandsbestimmung |
JP5266636B2 (ja) * | 2006-12-12 | 2013-08-21 | 株式会社デンソー | 光センサ、および距離検出装置 |
DE102007009244A1 (de) * | 2007-02-22 | 2008-08-28 | Sick Ag | Verfahren zur Überprüfung der Funktionsweise und/oder Justierung einer optoelektronischen Sensoranordnung und optoelektronische Sensoranordnung |
JP4831760B2 (ja) * | 2007-03-29 | 2011-12-07 | 日本放送協会 | 3次元情報検出方法及びその装置 |
JP2009047475A (ja) * | 2007-08-15 | 2009-03-05 | Hamamatsu Photonics Kk | 固体撮像素子 |
DE102007046562A1 (de) | 2007-09-28 | 2009-04-02 | Siemens Ag | Verfahren und Vorrichtung zum Bestimmen eines Abstands mittels eines optoelektronischen Bildsensors |
DE102008006449A1 (de) * | 2008-01-29 | 2009-07-30 | Kaba Gallenschütz GmbH | Verfahren und Vorrichtung zur Überwachung eines Raumvolumens |
FR2940463B1 (fr) * | 2008-12-23 | 2012-07-27 | Thales Sa | Systeme d'imagerie passive equipe d'un telemetre |
US20120154535A1 (en) * | 2010-12-15 | 2012-06-21 | Microsoft Corporation | Capturing gated and ungated light in the same frame on the same photosurface |
JP5453230B2 (ja) * | 2010-12-15 | 2014-03-26 | 本田技研工業株式会社 | 乗員検知装置 |
DE102011010102B4 (de) * | 2011-02-01 | 2012-09-13 | Diehl Bgt Defence Gmbh & Co. Kg | Verfahren zum Messen einer Entfernung und Vorrichtung zur Durchführung des Verfahrens |
TWI575494B (zh) * | 2011-08-19 | 2017-03-21 | 半導體能源研究所股份有限公司 | 半導體裝置的驅動方法 |
JP5781918B2 (ja) * | 2011-12-28 | 2015-09-24 | 浜松ホトニクス株式会社 | 距離測定装置 |
DE102013003515A1 (de) * | 2013-03-04 | 2014-09-04 | Jenoptik Robot Gmbh | Verfahren und Vorrichtung zur Geschwindigkeitsmessung von Fahrzeugen mittels einer Lasereinrichtung |
DE102013007961B4 (de) * | 2013-05-10 | 2023-06-22 | Audi Ag | Optisches Messsystem für ein Fahrzeug |
DE102013108824A1 (de) * | 2013-08-14 | 2015-02-19 | Huf Hülsbeck & Fürst Gmbh & Co. Kg | Sensoranordnung zur Erfassung von Bediengesten an Fahrzeugen |
EP2890125B1 (fr) | 2013-12-24 | 2021-10-13 | Sony Depthsensing Solutions | Système de caméra à capture de la durée de déplacement de la lumière |
DE102014013099B4 (de) | 2014-09-03 | 2019-11-14 | Basler Aktiengesellschaft | Verfahren und Vorrichtung zur vereinfachten Erfassung eines Tiefenbildes |
DE102014220322B4 (de) * | 2014-10-07 | 2020-02-13 | Conti Temic Microelectronic Gmbh | Bilderfassungsvorrichtung und Verfahren zum Erfassen eines optischen Bildes |
DE102015201901B4 (de) * | 2015-02-04 | 2021-07-22 | Volkswagen Aktiengesellschaft | Bestimmung einer Position eines fahrzeugfremden Objekts in einem Fahrzeug |
DE102016014851A1 (de) * | 2016-12-14 | 2018-06-14 | Alexander Zielbach | Umrisspulsmodulationsmessung |
DE102017008666A1 (de) | 2017-09-14 | 2018-03-01 | Daimler Ag | Verfahren zur Erfassung der Umgebungshelligkeit |
EP3640677B1 (fr) | 2018-10-17 | 2023-08-02 | Trimble Jena GmbH | Suiveur d'appareil d'étude de suivi pour suivre une cible |
EP3640590B1 (fr) | 2018-10-17 | 2021-12-01 | Trimble Jena GmbH | Appareil d'arpentage pour examiner un objet |
EP3696498A1 (fr) | 2019-02-15 | 2020-08-19 | Trimble Jena GmbH | Instrument de surveillance et procédé d'étalonnage d'un instrument de surveillance |
DE102020103597A1 (de) | 2020-02-12 | 2021-08-12 | Saf-Holland Gmbh | Verfahren und System zum Ermitteln einer Ausrichtung eines Anhängers gegenüber einem Zugfahrzeug |
DE102021100913A1 (de) | 2020-03-10 | 2021-09-16 | Elmos Semiconductor Se | Pixel für ein bildgebendes Lichtlaufzeitmesssystem mit einer verbesserten Fertigungsausbeute |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3608547A (en) * | 1967-07-29 | 1971-09-28 | Olympus Optical Co | Method for determining the distance of an object from an edoscope |
EP0151257A2 (fr) | 1984-02-08 | 1985-08-14 | Dornier Gmbh | Procédé pour la production d'images codées en distance |
EP0363735A2 (fr) | 1988-10-12 | 1990-04-18 | Kaman Aerospace Corporation | Système lidar à représentation utilisant de la lumière non visible |
DE3839513A1 (de) | 1988-11-23 | 1990-05-31 | Messerschmitt Boelkow Blohm | Bildsensor |
EP0465806A2 (fr) | 1990-07-12 | 1992-01-15 | Ball Corporation | Détecteur de distances à intégration de charges |
US5446529A (en) * | 1992-03-23 | 1995-08-29 | Advanced Scientific Concepts, Inc. | 3D imaging underwater laser radar |
WO1997011353A1 (fr) | 1995-09-18 | 1997-03-27 | Daedalus Enterprises, Inc. | Systeme de detection de cibles utilisant des criteres optiques multiples |
1998
- 1998-11-14 KR KR10-2000-7007058A patent/KR100508277B1/ko not_active IP Right Cessation
- 1998-11-14 WO PCT/DE1998/003344 patent/WO1999034235A1/fr active IP Right Grant
- 1998-11-14 EP EP98962257A patent/EP1040366B1/fr not_active Expired - Lifetime
- 1998-11-14 JP JP2000526831A patent/JP3860412B2/ja not_active Expired - Lifetime
- 1998-11-14 DE DE59809883T patent/DE59809883D1/de not_active Expired - Lifetime
- 1998-11-14 US US09/581,091 patent/US6373557B1/en not_active Expired - Lifetime
Non-Patent Citations (1)
Title |
---|
Scientific CMOS CID imagers, Zarnowski et al., Proc. SPIE vol. 2654, p. 29-37, Solid State Sensor Arrays and CCD Cameras, Mar. 1996. * |
Cited By (157)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080143085A1 (en) * | 1992-05-05 | 2008-06-19 | Automotive Technologies International, Inc. | Vehicular Occupant Sensing Techniques |
US8152198B2 (en) | 1992-05-05 | 2012-04-10 | Automotive Technologies International, Inc. | Vehicular occupant sensing techniques |
US6822681B1 (en) * | 1998-07-02 | 2004-11-23 | Pentax Corporation | Image capturing device for capturing a shape of a measurement subject |
US6822687B1 (en) * | 1999-07-08 | 2004-11-23 | Pentax Corporation | Three-dimensional image capturing device and its laser emitting device |
US7116371B2 (en) | 1999-07-08 | 2006-10-03 | Pentax Corporation | Three dimensional image capturing device and its laser emitting device |
US20050041143A1 (en) * | 1999-07-08 | 2005-02-24 | Pentax Corporation | Three dimensional image capturing device and its laser emitting device |
US6581961B1 (en) | 1999-12-17 | 2003-06-24 | Trw Vehicle Safety Systems Inc. | Deactivation of second stage of air bag inflator |
US20030188089A1 (en) * | 1999-12-21 | 2003-10-02 | Ronald S. Perloff | Hash cam having a reduced width comparison circuitry and its application |
US6810135B1 (en) | 2000-06-29 | 2004-10-26 | Trw Inc. | Optimized human presence detection through elimination of background interference |
US20020096624A1 (en) * | 2001-01-24 | 2002-07-25 | Mckee Bret A. | Method and apparatus for gathering three dimensional data with a digital imaging system |
US6950135B2 (en) * | 2001-01-24 | 2005-09-27 | Hewlett-Packard Development Company, L.P. | Method and apparatus for gathering three dimensional data with a digital imaging system |
US20100262019A1 (en) * | 2001-05-17 | 2010-10-14 | Xenogen Corporation | Method and apparatus for determining target depth, brightness and size within a body region |
US8180435B2 (en) | 2001-05-17 | 2012-05-15 | Xenogen Corporation | Method and apparatus for determining target depth, brightness and size within a body region |
US8825140B2 (en) | 2001-05-17 | 2014-09-02 | Xenogen Corporation | Imaging system |
US7403812B2 (en) | 2001-05-17 | 2008-07-22 | Xenogen Corporation | Method and apparatus for determining target depth, brightness and size within a body region |
US7764986B2 (en) | 2001-05-17 | 2010-07-27 | Xenogen Corporation | Method and apparatus for determining target depth, brightness and size within a body region |
US20070270697A1 (en) * | 2001-05-17 | 2007-11-22 | Xenogen Corporation | Method and apparatus for determining target depth, brightness and size within a body region |
US8279334B2 (en) | 2001-07-13 | 2012-10-02 | Xenogen Corporation | Structured light imaging apparatus |
US7298415B2 (en) | 2001-07-13 | 2007-11-20 | Xenogen Corporation | Structured light imaging apparatus |
US20080079802A1 (en) * | 2001-07-13 | 2008-04-03 | Xenogen Corporation | Structured light imaging apparatus |
US7274438B2 (en) * | 2001-12-21 | 2007-09-25 | Siemens Aktiengesellschaft | Device for monitoring spatial areas |
US20050078297A1 (en) * | 2001-12-21 | 2005-04-14 | Gunter Doemens | Device for monitoring spatial areas |
US8050778B2 (en) * | 2002-03-18 | 2011-11-01 | Sick Ag | Sensor-machine interface and method for operation thereof |
US20050203641A1 (en) * | 2002-03-18 | 2005-09-15 | Sick Ag | Sensor-machine interface and method for operation thereof |
US20110090316A1 (en) * | 2002-07-16 | 2011-04-21 | Xenogen Corporation | Method and apparatus for 3-d imaging of internal light sources |
US7599731B2 (en) | 2002-07-16 | 2009-10-06 | Xenogen Corporation | Fluorescent light tomography |
US7797034B2 (en) | 2002-07-16 | 2010-09-14 | Xenogen Corporation | 3-D in-vivo imaging and topography using structured light |
US20080018899A1 (en) * | 2002-07-16 | 2008-01-24 | Xenogen Corporation | Method and apparatus for 3-d imaging of internal light sources |
US20100022872A1 (en) * | 2002-07-16 | 2010-01-28 | Xenogen Corporation | Method and apparatus for 3-d imaging of internal light sources |
US20050201614A1 (en) * | 2002-07-16 | 2005-09-15 | Xenogen Corporation | 3-D in-vivo imaging and topography using structured light |
US7860549B2 (en) | 2002-07-16 | 2010-12-28 | Xenogen Corporation | Method and apparatus for 3-D imaging of internal light sources |
US7555332B2 (en) | 2002-07-16 | 2009-06-30 | Xenogen Corporation | Fluorescent light tomography |
US7603167B2 (en) | 2002-07-16 | 2009-10-13 | Xenogen Corporation | Method and apparatus for 3-D imaging of internal light sources |
US8909326B2 (en) | 2002-07-16 | 2014-12-09 | Xenogen Corporation | Method and apparatus for 3-D imaging of internal light sources |
US20070253908A1 (en) * | 2002-07-16 | 2007-11-01 | Xenogen Corporation | Fluorescent light tomography |
US20080031494A1 (en) * | 2002-07-16 | 2008-02-07 | Xenogen Corporation | Fluorescent light tomography |
US7616985B2 (en) | 2002-07-16 | 2009-11-10 | Xenogen Corporation | Method and apparatus for 3-D imaging of internal light sources |
US20040021771A1 (en) * | 2002-07-16 | 2004-02-05 | Xenogen Corporation | Method and apparatus for 3-D imaging of internal light sources |
US20060131483A1 (en) * | 2003-03-25 | 2006-06-22 | Olaf Schrey | Detection of electromagnetic radiation |
WO2004086087A1 (fr) * | 2003-03-25 | 2004-10-07 | Thomson Licensing Sas | Detection d'un rayonnement electromagnetique |
US7495202B2 (en) | 2003-03-25 | 2009-02-24 | Thomson Licensing | Device for detecting electromagnetic radiation |
US20080181487A1 (en) * | 2003-04-18 | 2008-07-31 | Stephen Charles Hsu | Method and apparatus for automatic registration and visualization of occluded targets using ladar data |
US20050040632A1 (en) * | 2003-08-20 | 2005-02-24 | Kabushiki Kaisha Toshiba | Distance detecting apparatus, air bag system controlling apparatus, and method of detecting distance |
US20080001386A1 (en) * | 2003-08-20 | 2008-01-03 | Kabushiki Kaisha Toshiba | Distance detecting apparatus, air bag system controlling apparatus, and method of detecting distance |
US7311326B2 (en) * | 2003-08-20 | 2007-12-25 | Kabushiki Kaisha Toshiba | Distance detecting apparatus, air bag system controlling apparatus, and method of detecting distance |
US7274815B1 (en) | 2003-10-09 | 2007-09-25 | Sandia Corporation | Parallel phase-sensitive three-dimensional imaging camera |
US20070273687A1 (en) * | 2003-10-15 | 2007-11-29 | Ron Daniel | Device for Scanning Three-Dimensional Objects |
US7620235B2 (en) * | 2003-10-15 | 2009-11-17 | Isis Innovation Ltd. | Device for scanning three-dimensional objects |
EP1544535A1 (fr) * | 2003-12-20 | 2005-06-22 | Leuze lumiflex GmbH + Co. KG | Device for monitoring an area of coverage on a work tool |
US8698893B2 (en) | 2003-12-20 | 2014-04-15 | Leuze Lumiflex Gmbh & Co. Kg | Device for monitoring an area of coverage on a work tool |
US20050151721A1 (en) * | 2004-01-09 | 2005-07-14 | Cheah Chiang S. | Image acquisition timing system and method |
US7652241B2 (en) * | 2004-01-09 | 2010-01-26 | Aptina Imaging Corporation | Image acquisition timing system and method |
US7230685B2 (en) * | 2004-01-28 | 2007-06-12 | Denso Corporation | Apparatus, method, and program for generating range-image-data |
US20050162638A1 (en) * | 2004-01-28 | 2005-07-28 | Denso Corporation | Apparatus, method, and program for generating range-image-data |
US7697748B2 (en) | 2004-07-06 | 2010-04-13 | Dimsdale Engineering, Llc | Method and apparatus for high resolution 3D imaging as a function of camera position, camera trajectory and range |
US7991222B2 (en) | 2004-07-06 | 2011-08-02 | Topcon Positioning Systems, Inc. | Method and apparatus for high resolution 3D imaging as a function of camera position, camera trajectory and range |
US20070252974A1 (en) * | 2004-07-06 | 2007-11-01 | Dimsdale Engineering, Llc. | System and method for determining range in 3d imaging systems |
US20090076758A1 (en) * | 2004-07-06 | 2009-03-19 | Dimsdale Engineering, Llc. | System and method for determining range in 3d imaging systems |
US20060007422A1 (en) * | 2004-07-06 | 2006-01-12 | Jerry Dimsdale | System and method for determining range in 3D imaging systems |
US8547532B2 (en) | 2004-07-06 | 2013-10-01 | Topcon Positioning Systems, Inc. | System and method for determining range in 3D imaging systems |
US7236235B2 (en) | 2004-07-06 | 2007-06-26 | Dimsdale Engineering, Llc | System and method for determining range in 3D imaging systems |
US20100188504A1 (en) * | 2004-07-06 | 2010-07-29 | Dimsdale Engineering, Llc | Method and apparatus for high resolution 3d imaging as a function of camera position, camera trajectory and range |
US20060006309A1 (en) * | 2004-07-06 | 2006-01-12 | Jerry Dimsdale | Method and apparatus for high resolution 3D imaging |
US7453553B2 (en) | 2004-07-06 | 2008-11-18 | Dimsdale Engineering, Llc | System and method for determining range in 3D imaging systems |
EP1771749B1 (fr) * | 2004-07-30 | 2011-08-24 | Panasonic Electric Works Co., Ltd. | Image processing device |
CN1954236B (zh) * | 2004-07-30 | 2010-05-05 | 松下电工株式会社 | Image processing device |
EP2312336A1 (fr) * | 2004-07-30 | 2011-04-20 | Panasonic Electric Works Co., Ltd | Image processing device |
US20070237363A1 (en) * | 2004-07-30 | 2007-10-11 | Matsushita Electric Works, Ltd. | Image Processing Device |
US7834305B2 (en) | 2004-07-30 | 2010-11-16 | Panasonic Electric Works Co., Ltd. | Image processing device |
US20070057209A1 (en) * | 2004-09-17 | 2007-03-15 | Matsushita Electric Works, Ltd. | Range image sensor |
US7362419B2 (en) | 2004-09-17 | 2008-04-22 | Matsushita Electric Works, Ltd. | Range image sensor |
WO2006030989A1 (fr) * | 2004-09-17 | 2006-03-23 | Matsushita Electric Works, Ltd. | Range image sensor |
FR2877279A1 (fr) * | 2004-11-03 | 2006-05-05 | Faurecia Sieges Automobile | System for detecting the presence of an occupant on a motor vehicle seat, seat adapted to this system, and vehicle comprising such a system |
US8044996B2 (en) | 2005-05-11 | 2011-10-25 | Xenogen Corporation | Surface construction using combined photographic and structured light information |
US20060268153A1 (en) * | 2005-05-11 | 2006-11-30 | Xenogen Corporation | Surface construction using combined photographic and structured light information |
US20060262971A1 (en) * | 2005-05-18 | 2006-11-23 | Scott Foes | Transient defect detection algorithm |
US7591583B2 (en) * | 2005-05-18 | 2009-09-22 | Federal-Mogul World Wide, Inc. | Transient defect detection algorithm |
US20090135404A1 (en) * | 2005-07-27 | 2009-05-28 | Iee International Electronics & Engineering | Method for operating a time-of-flight imager pixel |
EP1748304A1 (fr) * | 2005-07-27 | 2007-01-31 | IEE International Electronics & Engineering S.A.R.L. | Method for operating a time-of-flight imager pixel |
WO2007014818A1 (fr) * | 2005-07-27 | 2007-02-08 | Iee International Electronics & Engineering S.A. | Method for operating a time-of-flight imager pixel |
US7907257B2 (en) | 2005-07-27 | 2011-03-15 | Iee International Electronics & Engineering S.A. | Method for operating a time-of-flight imager pixel |
US7947939B2 (en) | 2005-09-15 | 2011-05-24 | Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. | Detection of optical radiation using a photodiode structure |
US20080258044A1 (en) * | 2005-09-15 | 2008-10-23 | Fraunhofer-Gesellschaft Zur Forderung Der Angewandten Forschung E.V. | Detection of Optical Radiation |
WO2007031102A1 (fr) * | 2005-09-15 | 2007-03-22 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Detection of optical radiation |
US7791714B2 (en) * | 2005-09-30 | 2010-09-07 | Siemens Aktiengesellschaft | Device and method for recording distance-measuring images |
US20090135405A1 (en) * | 2005-09-30 | 2009-05-28 | Marc Fischer | Device and Method for Recording Distance-Measuring Images |
US20080273758A1 (en) * | 2005-11-14 | 2008-11-06 | Oliver Fuchs | Apparatus and method for monitoring a spatial area, in particular for safeguarding a hazardous area of an automatically operated installation |
US8224032B2 (en) | 2005-11-14 | 2012-07-17 | Pilz Gmbh & Co. Kg | Apparatus and method for monitoring a spatial area, in particular for safeguarding a hazardous area of an automatically operated installation |
CN101310194B (zh) * | 2005-11-14 | 2012-05-30 | 皮尔茨公司 | Apparatus and method for monitoring a spatial area, in particular for safeguarding a hazardous area of an automatically operated installation |
US10775308B2 (en) | 2006-08-24 | 2020-09-15 | Xenogen Corporation | Apparatus and methods for determining optical tissue properties |
US11730370B2 (en) | 2006-08-24 | 2023-08-22 | Xenogen Corporation | Spectral unmixing for in-vivo imaging |
US20080052052A1 (en) * | 2006-08-24 | 2008-02-28 | Xenogen Corporation | Apparatus and methods for determining optical tissue properties |
US20080186475A1 (en) * | 2006-11-21 | 2008-08-07 | Tadashi Kawata | Method and Apparatus for Position Judgment |
US8120761B2 (en) * | 2006-11-21 | 2012-02-21 | Stanley Electric Co., Ltd. | Method and apparatus for position judgment |
EP1956570A1 (fr) * | 2007-02-09 | 2008-08-13 | Siemens Aktiengesellschaft | Central monitoring method and arrangement for receiving, evaluating, and selectively displaying images of inactive persons |
US20080245952A1 (en) * | 2007-04-03 | 2008-10-09 | Troxell John R | Synchronous imaging using segmented illumination |
US7745771B2 (en) * | 2007-04-03 | 2010-06-29 | Delphi Technologies, Inc. | Synchronous imaging using segmented illumination |
WO2009025373A1 (fr) | 2007-08-22 | 2009-02-26 | Hamamatsu Photonics K.K. | Solid state imaging device and distance image measurement device |
US8767189B2 (en) | 2007-08-22 | 2014-07-01 | Hamamatsu Photonics K.K. | Solid state imaging device and distance image measurement device |
US20090123061A1 (en) * | 2007-11-13 | 2009-05-14 | Samsung Electronics Co., Ltd. | Depth image generating method and apparatus |
US8305562B2 (en) * | 2007-11-13 | 2012-11-06 | Samsung Electronics Co., Ltd. | Depth image generating method and apparatus |
WO2009074584A1 (fr) * | 2007-12-10 | 2009-06-18 | Selex Sensors And Airborne Systems Limited | Imaging system |
US20100252736A1 (en) * | 2007-12-10 | 2010-10-07 | Selex Galileo Limited | Imaging system |
US7554652B1 (en) | 2008-02-29 | 2009-06-30 | Institut National D'optique | Light-integrating rangefinding device and method |
WO2009105857A1 (fr) | 2008-02-29 | 2009-09-03 | Institut National D'optique | Light-integrating rangefinding device and method |
US20120268727A1 (en) * | 2008-04-14 | 2012-10-25 | Olaf Schrey | Optical Distance Measuring Device and Method for Optical Distance Measurement |
US10101155B2 (en) | 2008-04-14 | 2018-10-16 | Volkswagen Aktiengesellschaft | Optical distance measuring device and method for optical distance measurement |
US9395440B2 (en) * | 2008-04-14 | 2016-07-19 | Volkswagen Aktiengesellschaft | Optical distance measuring device and method for optical distance measurement |
US20110134222A1 (en) * | 2008-08-03 | 2011-06-09 | Microsoft International Holdings B.V. | Rolling Camera System |
US8593507B2 (en) | 2008-08-03 | 2013-11-26 | Microsoft International Holdings B.V. | Rolling camera system |
US8836955B2 (en) * | 2008-09-02 | 2014-09-16 | Carl Zeiss Ag | Device and method for measuring a surface |
US20110176146A1 (en) * | 2008-09-02 | 2011-07-21 | Cristina Alvarez Diez | Device and method for measuring a surface |
DE112011101123B4 (de) | 2010-03-31 | 2019-12-12 | Honda Motor Co., Ltd. | Solid-state imaging device |
EP2702464A2 (fr) * | 2011-04-25 | 2014-03-05 | Microsoft Corporation | Laser diode modes |
EP2702464A4 (fr) * | 2011-04-25 | 2014-09-17 | Microsoft Corp | Laser diode modes |
US9723233B2 (en) | 2012-04-18 | 2017-08-01 | Brightway Vision Ltd. | Controllable gated sensor |
US10324033B2 (en) * | 2012-07-20 | 2019-06-18 | Samsung Electronics Co., Ltd. | Image processing apparatus and method for correcting an error in depth |
US20140022349A1 (en) * | 2012-07-20 | 2014-01-23 | Samsung Electronics Co., Ltd. | Image processing apparatus and method |
US9000350B1 (en) * | 2012-09-11 | 2015-04-07 | Rockwell Collins, Inc. | Time-domain overlap imagery detecting system and method of using same |
US10073164B2 (en) * | 2013-06-26 | 2018-09-11 | Panasonic Intellectual Property Management Co., Ltd. | Distance-measuring/imaging apparatus, distance measuring method of the same, and solid imaging element |
US10422859B2 (en) | 2013-06-27 | 2019-09-24 | Panasonic Intellectual Property Management Co, Ltd. | Distance measuring device and solid-state image sensor |
US9507015B2 (en) * | 2013-10-07 | 2016-11-29 | Avaya Inc. | Device proximity detection |
US20150096375A1 (en) * | 2013-10-07 | 2015-04-09 | Avaya Inc. | Device proximity detection |
US11131755B2 (en) | 2013-11-12 | 2021-09-28 | Big Sky Financial Corporation | Methods and apparatus for array based LiDAR systems with reduced interference |
US10203399B2 (en) | 2013-11-12 | 2019-02-12 | Big Sky Financial Corporation | Methods and apparatus for array based LiDAR systems with reduced interference |
CN106133552A (zh) * | 2013-11-14 | 2016-11-16 | 欧都思影像公司 | Method for illuminating an object |
US10241204B2 (en) | 2013-11-14 | 2019-03-26 | Odos Imaging Ltd. | Method for illuminating an object |
US9360554B2 (en) | 2014-04-11 | 2016-06-07 | Facet Technology Corp. | Methods and apparatus for object detection and identification in a multiple detector lidar array |
US10585175B2 (en) | 2014-04-11 | 2020-03-10 | Big Sky Financial Corporation | Methods and apparatus for object detection and identification in a multiple detector lidar array |
US11860314B2 (en) | 2014-04-11 | 2024-01-02 | Big Sky Financial Corporation | Methods and apparatus for object detection and identification in a multiple detector lidar array |
WO2016054670A1 (fr) * | 2014-10-09 | 2016-04-14 | Trumpf Maschinen Austria Gmbh & Co. Kg. | Bending angle measuring device |
US10557703B2 (en) | 2014-11-21 | 2020-02-11 | Rockwell Automation Limited | Distance measuring device and method for determining a distance |
US10401483B2 (en) | 2014-12-02 | 2019-09-03 | Odos Imaging Ltd. | Distance measuring device and method for determining a distance |
US11226398B2 (en) | 2015-03-05 | 2022-01-18 | Big Sky Financial Corporation | Methods and apparatus for increased precision and improved range in a multiple detector LiDAR array |
US10036801B2 (en) | 2015-03-05 | 2018-07-31 | Big Sky Financial Corporation | Methods and apparatus for increased precision and improved range in a multiple detector LiDAR array |
US10908288B2 (en) | 2015-03-26 | 2021-02-02 | Fujifilm Corporation | Distance image acquisition apparatus and distance image acquisition method |
US10623716B2 (en) | 2016-03-03 | 2020-04-14 | 4D Intellectual Properties, Llc | Object identification and material assessment using optical profiles |
US10873738B2 (en) | 2016-03-03 | 2020-12-22 | 4D Intellectual Properties, Llc | Multi-frame range gating for lighting-invariant depth maps for in-motion applications and attenuating environments |
US11477363B2 (en) | 2016-03-03 | 2022-10-18 | 4D Intellectual Properties, Llc | Intelligent control module for utilizing exterior lighting in an active imaging system |
US10298908B2 (en) | 2016-03-03 | 2019-05-21 | 4D Intellectual Properties, Llc | Vehicle display system for low visibility objects and adverse environmental conditions |
US11838626B2 (en) | 2016-03-03 | 2023-12-05 | 4D Intellectual Properties, Llc | Methods and apparatus for an active pulsed 4D camera for image acquisition and analysis |
US10382742B2 (en) | 2016-03-03 | 2019-08-13 | 4D Intellectual Properties, Llc | Methods and apparatus for a lighting-invariant image sensor for automated object detection and vision systems |
US9866816B2 (en) | 2016-03-03 | 2018-01-09 | 4D Intellectual Properties, Llc | Methods and apparatus for an active pulsed 4D camera for image acquisition and analysis |
DE102017106071B3 (de) | 2017-03-21 | 2018-03-29 | Elmos Semiconductor Aktiengesellschaft | Method for detecting the delay between a light pulse and a shutter-on signal, for improving methods and devices for measuring the light propagation time |
DE102017106072B3 (de) | 2017-03-21 | 2018-03-29 | Elmos Semiconductor Aktiengesellschaft | Method for detecting the delay between a current pulse at a light source and a shutter-on signal, for improving methods and devices for measuring the light propagation time |
DE102017106073B3 (de) | 2017-03-21 | 2018-03-29 | Elmos Semiconductor Aktiengesellschaft | Method for detecting the delay between a voltage pulse at a light source and a shutter-on signal, for improving methods and devices for measuring the light propagation time |
DE102017106077B3 (de) | 2017-03-21 | 2018-07-26 | Elmos Semiconductor Aktiengesellschaft | Device for signal delay using a quartz oscillator, and its application in a TOF camera |
DE102017106078B3 (de) | 2017-03-21 | 2018-07-26 | Elmos Semiconductor Aktiengesellschaft | Device for signal delay using a MEMS oscillator, and its application in a TOF camera |
DE102017106076B3 (de) | 2017-03-21 | 2018-07-26 | Elmos Semiconductor Aktiengesellschaft | Device for signal delay using a reference oscillator, and its application in a TOF camera |
US20180321363A1 (en) * | 2017-05-02 | 2018-11-08 | Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. | Device for determining a distance from an object, and corresponding method |
US11016184B2 (en) * | 2017-05-02 | 2021-05-25 | Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. | Device for determining a distance from an object, and corresponding method |
CN108802753B (zh) * | 2017-05-02 | 2022-05-31 | 弗劳恩霍夫应用研究促进协会 | Device for determining a distance from an object, and corresponding method |
CN108802753A (zh) * | 2017-05-02 | 2018-11-13 | 弗劳恩霍夫应用研究促进协会 | Device for determining a distance from an object, and corresponding method |
US11092678B2 (en) | 2018-06-21 | 2021-08-17 | Analog Devices, Inc. | Measuring and removing the corruption of time-of-flight depth images due to internal scattering |
CN110509055A (zh) * | 2019-09-04 | 2019-11-29 | 恩坦华汽车零部件(镇江)有限公司 | Method for assembling a gear housing of a miniature DC motor for an automobile window regulator |
CN110509055B (zh) * | 2019-09-04 | 2021-04-06 | 恩坦华汽车零部件(镇江)有限公司 | Method for assembling a gear housing of a miniature DC motor for an automobile window regulator |
WO2021253308A1 (fr) * | 2020-06-18 | 2021-12-23 | 深圳市汇顶科技股份有限公司 | Image acquisition apparatus |
Also Published As
Publication number | Publication date |
---|---|
JP2002500367A (ja) | 2002-01-08 |
WO1999034235A1 (fr) | 1999-07-08 |
EP1040366A1 (fr) | 2000-10-04 |
JP3860412B2 (ja) | 2006-12-20 |
DE59809883D1 (de) | 2003-11-13 |
KR100508277B1 (ko) | 2005-08-17 |
KR20010033549A (ko) | 2001-04-25 |
EP1040366B1 (fr) | 2003-10-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6373557B1 (en) | Method and apparatus for picking up a three-dimensional range image | |
US6392747B1 (en) | Method and device for identifying an object and determining its location | |
US6535275B2 (en) | High resolution 3-D imaging range finder | |
EP3789792B1 (fr) | Method for attenuating background illumination from an exposure value of a pixel in an imaging array, and pixel for use therein | |
US7212278B2 (en) | Method and device for recording a three-dimensional distance-measuring image | |
US10948575B2 (en) | Optoelectronic sensor and method of measuring the distance from an object | |
DE19757595C2 (de) | Method and apparatus for picking up a three-dimensional range image | |
US7554652B1 (en) | Light-integrating rangefinding device and method | |
EP1147370B1 (fr) | Self-gating photosensitive surface | |
US7667598B2 (en) | Method and apparatus for detecting presence and range of a target object using a common detector | |
US20090185159A1 (en) | Distance measuring method and distance measuring element for detecting the spatial dimension of a target | |
EP2260325B1 (fr) | Light-integrating rangefinding device and method | |
EP4016124A1 (fr) | Time-of-flight calculation with inter-bin delta estimation | |
US6480265B2 (en) | Active target distance measurement | |
Mengel et al. | Fast range imaging by CMOS sensor array through multiple double short time integration (MDSI) | |
US7791714B2 (en) | Device and method for recording distance-measuring images | |
US20230194666A1 (en) | Object Reflectivity Estimation in a LIDAR System | |
US20230375678A1 (en) | Photoreceiver having thresholded detection | |
CN109309794B (zh) | Method for detecting distance | |
US20220091236A1 (en) | Techniques for detecting and mitigating interference among multiple lidar sensors | |
CN113597534B (zh) | Distance-measuring imaging ***, distance-measuring imaging method, and program | |
JP2022551427A (ja) | Method and apparatus for determining a distance to a scene | |
JP2020079756A (ja) | Distance information acquisition apparatus and distance information acquisition method | |
US20230273316A1 (en) | Imaging apparatus and method | |
CN115657064A (zh) | Laser material identification method, ***, laser radar, and radar products |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SIEMENS AKTIENGESELLSCHAFT, GERMANY
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MENGEL, PETER;DOEMENS, GUENTER;REEL/FRAME:010930/0295
Effective date: 19981112
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
FPAY | Fee payment |
Year of fee payment: 8 |
|
FPAY | Fee payment |
Year of fee payment: 12 |