CN114690155A - Photoelectric detection device and electronic equipment


Info

Publication number: CN114690155A
Application number: CN202210090126.7A
Authority: CN (China)
Prior art keywords: sensing, light, module, emitting, light beam
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 汪浩, 王小明, 李佳鹏
Current and original assignee: Shenzhen Fushi Technology Co Ltd
Application filed by Shenzhen Fushi Technology Co Ltd


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/481Constructional features, e.g. arrangements of optical elements
    • G01S7/4811Constructional features, e.g. arrangements of optical elements common to transmitter and receiver

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The application provides a photoelectric detection device, including a transmission module, a receiving module and a processing module. The emission module comprises emission optics and a plurality of light-emitting units, wherein the optics are configured to modulate the light beams emitted by the light-emitting units into a plurality of sensing light beams each having a preset emission direction and to emit them toward a target scene, the preset emission direction of each sensing light beam being different from those of the other sensing light beams. The receiving module includes receiving optics configured to transmit photons from the target scene to a plurality of light-sensitive pixels, which are configured to respond to the received photons and output corresponding light-sensing signals; the number of light-sensitive pixels is less than the number of light-emitting units. The processing module is configured to process the light-sensing signals to obtain the times at which the sensing light beams reflected back by objects in the target scene are received by the light-sensitive pixels. The present application further provides an electronic device comprising the photoelectric detection device.

Description

Photoelectric detection device and electronic equipment
Technical Field
The application belongs to the photoelectric detection field, and particularly relates to a photoelectric detection device and an electronic device.
Background
The Time of Flight (ToF) measurement principle calculates the distance of an object, or three-dimensional information such as the depth of the object's surface, by measuring the time of flight of an optical signal in a target scene. ToF measurement has the advantages of long sensing distance, high precision and low energy consumption, and is widely applied in the fields of consumer electronics, intelligent driving, AR/VR and the like.
The photoelectric detection device using the ToF principle comprises a transmitting module and a receiving module. The transmitting module emits a sensing light beam toward the target scene, the receiving module receives the sensing light beam reflected by an object in the target scene, and the three-dimensional information of the object is sensed according to the time the sensing light beam spends in flight in the target scene between emission and reception.
The current photoelectric detection device generally obtains the coordinate information of the object by converting the pixel coordinates at which the sensing light beam is received by the receiving module into world coordinates. However, this requires the pixel resolution of the receiving module to be sufficiently high; otherwise, object coordinate information meeting the accuracy requirement cannot be obtained. A higher pixel resolution, in turn, undoubtedly increases the cost of the receiving module.
Disclosure of Invention
In view of the above, the present application provides a photodetecting device and an electronic apparatus capable of alleviating the above problems of the prior art.
In a first aspect, the present application provides a photoelectric detection apparatus, which includes a transmitting module, a receiving module, and a processing module. The emission module comprises emission optics and a plurality of light-emitting units, wherein the optics are configured to modulate the light beams emitted by the light-emitting units into a plurality of sensing light beams each having a preset emission direction and to emit them toward a target scene, the preset emission direction of each sensing light beam being different from those of the other sensing light beams. The receiving module includes receiving optics configured to transmit photons from the target scene to a plurality of light-sensitive pixels, which are configured to respond to the received photons and output corresponding light-sensing signals; the number of light-sensitive pixels is less than the number of light-emitting units. The processing module is configured to process the light-sensing signals to obtain the times at which the sensing light beams reflected back by objects in the target scene are received by the light-sensitive pixels.
Further, the processing module is configured to determine a time of flight of the sensing beam according to a difference between a receiving time and an emitting time of the sensing beam reflected from the target scene, and determine coordinate information of an object reflecting the sensing beam in the target scene according to the emitting direction and the time of flight of the sensing beam.
Further, the emission direction of the sensing light beam is defined by parameter values of an emission module coordinate system.
Furthermore, the emission module coordinate system is established such that the outward direction perpendicular to the light-emitting surface of the emission module is the positive z-axis direction, and the x-axis and y-axis lie in the light-emitting surface of the emission module. The angle between the emission direction of a sensing light beam and the positive z-axis is the polar angle of that emission direction, and the angle between the projection of the emission direction onto the xy plane of the emission module coordinate system and the positive x-axis is the azimuth angle of that emission direction. The emission direction of each sensing light beam is thus defined by its polar angle and azimuth angle in the emission module coordinate system.
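For reference, an emission direction with polar angle θ and azimuth angle φ defined in this way corresponds to the unit vector (a standard spherical-coordinate identity, stated here for clarity rather than taken from the application itself):

v(θ, φ) = (sinθ × cosφ, sinθ × sinφ, cosθ)

so that a point at distance D along the sensing light beam has coordinates D × v(θ, φ), which is the relation used in the coordinate formulas later in the description.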
Further, the processing module is configured to obtain a flight time of a corresponding sensing light beam according to the light sensing signal, obtain distance information between an object reflecting the sensing light beam and the emitting module according to the flight time of the sensing light beam, and determine coordinate information of the object reflecting the sensing light beam in an emitting module coordinate system according to the distance information between the object and the emitting module and an emitting direction of the sensing light beam.
Further, the device also comprises a control module, wherein the control module is configured to control the emission module to emit the multiple sensing light beams with different emission directions respectively in different periods.
Furthermore, each photosensitive pixel has a corresponding sensing area in the target scene, the sensing light beam reflected back in the sensing area is transmitted to the corresponding photosensitive pixel for reception, and the control module is configured to control the emission module to emit, within the same time period, one sensing light beam illuminating each corresponding sensing area.
Furthermore, a plurality of sensing light beams with different emission directions emitted by the emission module in the same time period are respectively emitted into a plurality of different sensing areas in a one-to-one correspondence manner.
Further, each of the photosensitive pixels has a corresponding sensing region in the target scene, the sensing beams reflected back in the sensing region are transmitted to the corresponding photosensitive pixels for receiving, the control module is configured to control the emission module to correspondingly emit a plurality of sensing beams with different emission directions to a plurality of sensing sub-regions at different positions in the same sensing region in a plurality of different time periods, respectively, and the plurality of sensing sub-regions are arranged over the entire sensing region.
In a second aspect, the present application provides an electronic device comprising a photodetection apparatus as described above. The electronic equipment further comprises an application module which is configured to realize corresponding functions according to the coordinate information of the object obtained by the photoelectric detection device.
Beneficial effects of the present application:
According to the embodiments of the present application, by defining the three-dimensional coordinates of an object through the preset emission direction and the flight time of the sensing light beam emitted by the emission module, a three-dimensional information image with relatively high resolution can be obtained using photosensitive pixels with relatively low resolution, thereby improving the detection resolution of the photoelectric detection device.
While multiple embodiments are disclosed, including variations thereof, other embodiments of the disclosure will be apparent to those skilled in the art from the following detailed description, which shows and describes illustrative embodiments of the disclosure. It will be recognized that the present disclosure is capable of modification in various obvious respects, all without departing from the spirit and scope of the present invention. Accordingly, the drawings and detailed description are to be regarded as illustrative in nature and not as restrictive.
Drawings
The features and advantages of the present invention will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings.
FIG. 1 is a schematic functional module diagram of an electronic device according to an embodiment of the present application;
FIG. 2 is a functional block diagram of an embodiment of the photodetection device in FIG. 1;
FIG. 3 is a schematic diagram illustrating the photoelectric detection apparatus of FIG. 1 detecting coordinates of an object in a target scene;
FIG. 4 is a schematic diagram of the processing module of FIG. 2 obtaining a statistical histogram;
FIG. 5 is a schematic structural diagram of an embodiment of the transmitter module and the receiver module shown in FIG. 1;
FIG. 6 is a schematic diagram of an embodiment of the light source shown in FIG. 5;
FIGS. 7-10 are schematic views of the projection areas of the photodetection device shown in FIG. 5 at different time periods;
FIG. 11 is a schematic diagram showing the composition of three-dimensional information maps obtained by the photodetection device shown in FIG. 5 at different time intervals;
FIG. 12 is a schematic structural diagram of another embodiment of the transmitter module and the receiver module shown in FIG. 2;
FIGS. 13-16 are schematic views of the projection areas of the photodetecting device shown in FIG. 12 respectively at different time periods;
FIG. 17 is a schematic diagram showing the composition of three-dimensional information maps obtained by the photodetection device shown in FIG. 12 at different time intervals;
FIG. 18 is a schematic structural diagram of a transmitting module and a receiving module according to another embodiment of FIG. 2;
FIGS. 19-22 are schematic views of the projection regions of the photodetecting device shown in FIG. 18 respectively at different time periods;
FIG. 23 is a schematic diagram showing the composition of three-dimensional information maps obtained by the photodetection device shown in FIG. 18 at different time periods.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or to elements having the same or similar function. The embodiments described below with reference to the accompanying drawings are illustrative, are only for the purpose of explaining the present application, and are not to be construed as limiting the present application. In the description of the present application, it is to be understood that the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number or order of the indicated technical features. Thus, features defined as "first" and "second" may explicitly or implicitly include one or more of the described features. In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
In the description of the present application, it should be noted that, unless explicitly stated or limited otherwise, the terms "mounted," "connected," and "connected" are to be construed broadly, e.g., as meaning either a fixed connection, a removable connection, or an integral connection; either mechanically or electrically or in communication with each other; either directly or indirectly through intervening media, either internally or in any other relationship or combination of two or more elements. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art as appropriate.
The following disclosure provides many different embodiments, or examples, for implementing different features of the application. In order to simplify the disclosure of the present application, only the components and settings of a specific example are described below. Of course, they are merely examples and are not intended to limit the present application. Moreover, the present application may repeat reference numerals and/or letters in the various examples, such repeat use is intended to provide a simplified and clear description of the present application and may not in itself dictate a particular relationship between the various embodiments and/or configurations discussed. In addition, the various specific processes and materials provided in the following description of the present application are only examples of implementing the technical solutions of the present application, but one of ordinary skill in the art should recognize that the technical solutions of the present application can also be implemented by other processes and/or other materials not described below.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to provide a thorough understanding of embodiments of the application. One skilled in the relevant art will recognize, however, that the subject technology can be practiced without one or more of the specific details, or with other structures, components, and so forth. In other instances, well-known structures or operations are not shown or described in detail to avoid obscuring the focus of the application.
The embodiment of the application provides a photoelectric detection device, which comprises a transmitting module, a receiving module and a processing module. The emission module comprises emission optics and a plurality of light-emitting units, wherein the optics are configured to modulate the light beams emitted by the light-emitting units into a plurality of sensing light beams each having a preset emission direction and to emit them toward a target scene, the preset emission direction of each sensing light beam being different from those of the other sensing light beams. The receiving module includes receiving optics configured to transmit photons from the target scene to a plurality of light-sensitive pixels, which are configured to respond to the received photons and output corresponding light-sensing signals; the number of light-sensitive pixels is less than the number of light-emitting units. The processing module is configured to process the light-sensing signals to obtain the times at which the sensing light beams reflected back by objects in the target scene are received by the light-sensitive pixels.
It will be appreciated that the photons from the target scene include photons of ambient light in the target scene and photons of the sensing beam in the target scene.
Optionally, in some embodiments, the processing module is configured to determine a time of flight of the sensing beam according to a difference between a receiving time and an emitting time of the sensing beam reflected back from the target scene, and determine coordinate information of an object reflecting the sensing beam in the target scene according to an emitting direction and the time of flight of the sensing beam.
Optionally, in some embodiments, the emission direction of the sensing beam is defined by a parameter value of an emission module coordinate system.
Optionally, in some embodiments, the emission module coordinate system is established such that the outward direction perpendicular to the light-emitting surface of the emission module is the positive z-axis direction, and the x-axis and y-axis lie in the light-emitting surface of the emission module. The angle between the emission direction of the sensing light beam and the positive z-axis is the polar angle of the emission direction, and the angle between the projection of the emission direction onto the xy plane of the emission module coordinate system and the positive x-axis is the azimuth angle of the emission direction; the emission direction of the sensing light beam is defined by its polar angle and azimuth angle in the emission module coordinate system.
Optionally, in some embodiments, the processing module is configured to obtain a flight time of a corresponding sensing light beam according to the light sensing signal, obtain distance information between an object reflecting the sensing light beam and the emission module according to the flight time of the sensing light beam, and determine coordinate information of the object reflecting the sensing light beam in an emission module coordinate system according to the distance information between the object and the emission module and an emission direction of the sensing light beam.
Optionally, in some embodiments, the photoelectric detection apparatus further includes a control module configured to control the emission module to emit the plurality of sensing light beams with different emission directions respectively in different time periods.
Optionally, in some embodiments, each of the photosensitive pixels has a corresponding one of the sensing regions in the target scene, the sensing light beam reflected back in the sensing region is transmitted to the corresponding photosensitive pixel for reception, and the control module is configured to control the emission module to emit one of the sensing light beams corresponding to one of the sensing regions for illumination in the same time period.
Optionally, in some embodiments, multiple sensing light beams with different emission directions emitted by the emission module in the same time period are respectively emitted into multiple different sensing regions in a one-to-one correspondence.
Optionally, in some embodiments, each of the photosensitive pixels has a corresponding sensing area in the target scene, the sensing beams reflected back in the sensing area are transmitted to the corresponding photosensitive pixel for reception, and the control module is configured to control the emission module to correspondingly emit a plurality of sensing beams having different emission directions to a plurality of sensing sub-areas at different positions in the same sensing area respectively in a plurality of different time periods, where the plurality of sensing sub-areas are arranged over the entire sensing area.
Embodiments of the present application further provide an electronic device, which includes the photodetection apparatus. The electronic device realizes corresponding functions according to the three-dimensional information obtained by the photodetection apparatus. The three-dimensional information is, for example, one or more of proximity information, depth information, distance information, coordinate information, and other related information of the object in the target scene. The three-dimensional information may be used, for example, in the fields of 3D modeling, face recognition, automatic driving, machine vision, monitoring, unmanned aerial vehicle control, Augmented Reality (AR)/Virtual Reality (VR), Simultaneous Localization and Mapping (SLAM), object proximity determination, and the like, which is not limited in the present application.
The photodetection device may be, for example, a lidar configured to obtain three-dimensional information of objects in a target scene. Lidar is applied in fields such as intelligent driving vehicles, intelligent piloted aircraft, 3D printing, VR, AR, and service robots. Taking an intelligent driving vehicle as an example, a lidar arranged on the vehicle can scan the surrounding environment by rapidly and repeatedly emitting laser beams, obtaining point cloud data that reflects the appearance, position, and motion of one or more objects in the surrounding environment. Specifically, the lidar emits a laser beam toward the surrounding environment, receives the echo beams reflected by the objects in the surrounding environment, and determines the distance/depth information of each object by calculating the time delay (i.e., the time of flight) between the emission time of the laser beam and the return time of the echo beam. Meanwhile, the lidar can also determine angle information describing the emission direction of the laser beam in the target scene, combine the distance/depth information of each object with the angle information of the laser beam to generate a three-dimensional map of the scanned surrounding environment including each object, and use this three-dimensional map to guide the intelligent driving of the unmanned vehicle.
Hereinafter, an embodiment in which the photodetecting device is applied to the electronic apparatus will be described in detail with reference to the drawings.
Fig. 1 is a schematic diagram of functional modules of a photodetection device 10 applied to an electronic device 1 according to an embodiment of the present application. Fig. 2 is a schematic functional block diagram of the photodetecting device 10 according to the embodiment of the present application.
Referring to fig. 1 and 2, the electronic device 1 comprises a photodetection device 10. The photodetection device 10 can detect the object 2 in the target scene to obtain three-dimensional information of the object 2, such as, but not limited to, one or more of proximity information of the object 2, depth information of the surface of the object 2, distance information of the object 2, and coordinate information of the object 2 in the target scene.
The electronic device 1 may further include an application module 20, and the application module 20 may implement corresponding functions according to the obtained three-dimensional information of the object 2, such as but not limited to: whether the object 2 appears in a preset range in front of the electronic equipment 1 can be judged according to the proximity information of the object 2; or, the electronic device 1 may be controlled to avoid the obstacle according to the distance information of the object 2; alternatively, 3D modeling, face recognition, machine vision, etc. may be implemented according to depth information of the surface of the object 2. The electronic device may further include a storage medium 30, and the storage medium 30 may provide support for storage requirements of the photodetecting apparatus 10 during operation.
Optionally, in some embodiments, the photodetection device 10 is, for example, a direct Time of Flight (dToF) measurement device. The dToF measurement device 10 may perform three-dimensional information sensing based on the direct time-of-flight detection principle. For example, the dToF measuring device 10 may emit a sensing beam toward a target scene and receive the sensing beam reflected back by an object 2 in the target scene. The time difference between the emission time and the reception time of the reflected sensing beam is referred to as the time of flight t of the sensing beam, and the distance information of the object 2 may be obtained by calculating the distance that the sensing beam travels within the time of flight t:

D = c × t / 2

where c is the speed of light.
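As a minimal sketch of this relation (illustrative only, not part of the application; the function name and the sample time-of-flight value are assumptions):

C = 299_792_458.0  # speed of light c in meters per second

def distance_from_tof(t: float) -> float:
    # D = c * t / 2: the beam travels the path twice, out and back.
    return C * t / 2.0

# Example: an echo received 20 nanoseconds after emission is ~3 m away.
print(distance_from_tof(20e-9))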
Alternatively, in some other embodiments, the photodetection device 10 may also be an indirect Time of Flight (iToF) measurement device. The iToF measurement device 10 performs depth information sensing based on the indirect time-of-flight detection principle. The iToF measuring device 10 obtains three-dimensional information of the object 2 by comparing the phase difference between the emitted sensing beam and the received reflected sensing beam.
In the following embodiments of the present application, the photoelectric detection device 10 is mainly described as a dToF measuring device.
Optionally, as shown in fig. 2, the photodetection device 10 includes a transmitting module 12, a receiving module 14, and a processing module 15. The emitting module 12 is configured to emit a sensing beam toward the target scene to detect three-dimensional information of an object in the target scene. At least part of the sensing beam is reflected back by the object 2 in the target scene; the sensing beam reflected by the object 2 carries three-dimensional information of the object 2, and at least part of the reflected sensing beam can be received by the receiving module 14 to obtain the three-dimensional information of the object 2. The receiving module 14 is configured to receive the light signal from the target scene and output a corresponding light-sensing signal. It will be appreciated that the optical signal received by the receiving module 14 may be photons, for example photons of the sensing beam reflected back by the object 2 in the target scene and photons of the ambient light in the target scene. The processing module 15 is configured to obtain the three-dimensional information of the object 2 from the time difference between when the sensing beam is emitted and when the reflected sensing beam is received.
The processing module 15 may be disposed on the photodetecting device 10. Optionally, in some other embodiments, all or a part of the processing module 15 may also be disposed on the electronic device 1.
As shown in fig. 3, the sensing light beam emitted by the emission module 12 has a predetermined emission direction. Optionally, in some embodiments, an emission module coordinate system is established such that the outward direction perpendicular to the light-emitting surface of the emission module 12 is the positive z-axis direction, and the x-axis and y-axis lie in the light-emitting surface of the emission module 12. The emission direction of the sensing beam can then be defined by parameter values in the emission module coordinate system, for example: the angle between the emission direction of the sensing beam and the positive z-axis of the emission module coordinate system is the polar angle θ of the emission direction, and the angle between the projection of the emission direction onto the xy plane of the emission module coordinate system and the positive x-axis is the azimuth angle φ of the emission direction. The emission direction of the sensing beam can thereby be defined by its polar angle θ and azimuth angle φ in the emission module coordinate system.
Alternatively, the emission direction of the sensing light beam may also be defined in other suitable manners, as long as the emission direction of the sensing light beam can be accurately described in a quantifiable manner, which is not specifically limited in this application. For example, in some other embodiments, the emitting direction of the sensing light beam may also be defined by the angle between the emitting direction and the x, y, and z axes of the emitting module rectangular coordinate system.
Optionally, the sensing beam is a laser pulse with a preset frequency. The emitting module 12 is configured to periodically emit the laser pulses as a sensing beam at a preset frequency within a detection frame.
Optionally, the sensing light beam is, for example, visible light, infrared light or near-infrared light, with a wavelength range of, for example, 390 nm-780 nm, 700 nm-1400 nm, or 800 nm-1000 nm.
Referring to fig. 2 and 4, in some embodiments, the processing module 15 may include a counting unit 152, a statistical unit 154, a time-of-flight obtaining unit 156, and a distance obtaining unit 158. The counting unit 152 is configured to cumulatively count, in the corresponding time bins, the light-sensing signals output according to the light signals received by the receiving module 14, where a time bin is the minimum time interval Δt resolvable by the Time-to-Digital Converter (TDC) that records the generation time of the light-sensing signals. That is, each time the receiving module 14 receives an optical signal, the counting unit 152 adds one to the count of the time bin corresponding to the generation time of that optical signal.
Optionally, in some embodiments, the statistical unit 154 may be configured to perform statistics on the light sensing signal counts in each corresponding time bin to generate a corresponding statistical histogram. The abscissa of the statistical histogram represents the timestamp of each corresponding time bin, and the ordinate of the statistical histogram represents the light-induced signal count value accumulated in each corresponding time bin. Alternatively, the statistical unit 154 may be a histogram circuit.
In the sensing process, a large number of photons of the ambient light are also received by the receiving module 14 and generate corresponding light-sensing signal counts. The probability that these ambient light photons leave a count in any given time bin tends to be the same, constituting a noise floor (Noise Level) of the target scene; the measured noise floor is relatively high in a scene with high ambient light intensity and relatively low in a scene with low ambient light. On this basis, when the sensing light beam reflected from the object 2 is received, the corresponding light-sensing signal counts are superimposed on the noise floor, so that the count in the time bin corresponding to the reception moment of the sensing light beam is significantly higher than the counts in other time bins, forming a prominent signal peak. It is understood that the height of the signal peak may be influenced by the emission power of the sensing light beam, the reflectivity of the object 2, the detection range of the photodetection device 10, etc., and the width of the signal peak may be influenced by the pulse width of the emitted sensing light beam and by the time jitter of the photoelectric conversion elements and the TDC of the receiving module 14, etc. Thus, the time-of-flight obtaining unit 156 can obtain the flight time of the sensing light beam reflected by the object 2 and received by the receiving module 14 from the time difference between the timestamp t1 of the time bin corresponding to the signal peak and the emission time t0 (not shown) of the sensing light beam that produced the signal peak. The distance obtaining unit 158 may be configured to obtain, from the flight time determined via the statistical histogram, the distance information between the object 2 reflecting the sensing light beam and the emission module 12, for example the straight-line distance between the object 2 in the target scene and the position on the emission module 12 from which the sensing light beam was emitted.
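The peak-finding step described above can be sketched as follows (a simplified, hypothetical illustration; the bin width, the histogram contents and the emission time t0 are assumptions, and a practical implementation would also subtract the noise floor and handle ambiguous peaks):

DELTA_T = 1e-9  # time bin width, i.e. the assumed TDC resolution of 1 ns

def time_of_flight(histogram: list[int], t0: float = 0.0) -> float:
    # Locate the time bin with the highest count (the signal peak) and
    # return its timestamp t1 minus the emission time t0 of the beam.
    peak_bin = max(range(len(histogram)), key=lambda i: histogram[i])
    t1 = peak_bin * DELTA_T
    return t1 - t0

# A flat ambient-light noise floor with a signal peak in bin 20:
counts = [3] * 64
counts[20] = 57
print(time_of_flight(counts))  # 2e-08 s, i.e. 20 ns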
It should be understood that the emitting module 12 and the receiving module 14 are adjacently disposed side by side, the light emitting surface of the emitting module 12 and the light incident surface of the receiving module 14 both face the same side of the photodetection device 10, and the distance between the emitting module 12 and the receiving module 14 may range from 2 millimeters (mm) to 20mm, for example. Because the emitting module 12 and the receiving module 14 are relatively close to each other, as shown in fig. 3, although the emitting path of the sensing beam from the emitting module 12 to the object and the returning path of the sensing beam from the object to the receiving module 14 after reflection are not completely equal, both paths are far larger than the distance between the emitting module 12 and the receiving module 14, and may be considered to be approximately equal. Thus, the distance information between the object and the emitting module 12 can be calculated according to the product of half of the flight time t of the sensing light beam reflected by the object and the speed of light c.
Optionally, in some embodiments, the processing module 15 may further include a coordinate acquisition unit 159. The coordinate acquisition unit 159 is configured to determine the coordinate information, in the emission module coordinate system, of an object reflecting the sensing light beam according to the emission direction in which the emission module 12 emits the sensing light beam and the distance information obtained by detecting the flight time of the sensing light beam. As shown in fig. 3, if the emission direction of the sensing beam is defined by the polar angle θ and the azimuth angle φ of the emission module coordinate system, and the distance information obtained by detecting the time of flight t of the sensing beam is denoted D, the coordinate values x, y and z of the object reflecting the sensing beam in the target scene, in the emission module coordinate system, can be calculated by the following formulas:

x = D × sinθ × cosφ

y = D × sinθ × sinφ

z = D × cosθ
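A direct transcription of these formulas (an illustrative sketch; the function name and the sample values are assumptions):

import math

def object_coordinates(D: float, theta: float, phi: float):
    # Spherical-to-Cartesian conversion in the emission module coordinate
    # system: D from the time of flight, theta/phi from calibration.
    x = D * math.sin(theta) * math.cos(phi)
    y = D * math.sin(theta) * math.sin(phi)
    z = D * math.cos(theta)
    return x, y, z

# Example: a beam 10 degrees off the z-axis at azimuth 45 degrees,
# reflected by an object 3.0 m away along the beam.
print(object_coordinates(3.0, math.radians(10), math.radians(45)))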
it is understood that the emitting direction of the emitting module 12 emitting the sensing light beam may be preset before factory shipment and the relevant parameter value may be obtained through calibration. Therefore, the photoelectric detection device 10 can detect the coordinate value of the position of the object in the target scene by combining the direction of the emitting module 12 emitting the sensing light beam with the flight time t of the sensing light beam.
The emission module 12 is configured to emit at least one sensing light beam having a preset emission direction toward the target scene. Optionally, in some embodiments, the control module 18 is configured to control the emission module 12 to emit a plurality of sensing light beams toward the target scene, where each sensing light beam has an emission direction different from those of the other sensing light beams, so as to detect objects located in different directions in the target scene and improve the resolution with which the photoelectric detection apparatus 10 performs three-dimensional information detection of the target scene. Alternatively, the control module 18 may be configured to control the emission module 12 to simultaneously emit a plurality of sensing light beams having different emission directions. Optionally, the control module 18 may also be configured to control the emission module 12 to emit multiple sensing light beams with different emission directions in different time periods, and the number of sensing light beams emitted in each time period may be the same or different. Alternatively, in some other embodiments, the photoelectric detection device 10 can also improve the spatial resolution of three-dimensional information detection without increasing the number of emitted sensing light beams, by adjusting the emission directions of the sensing light beams to scan different directions in the target scene.
As shown in fig. 5, in some embodiments, the emitting module 12 includes a light source 120, and the light source 120 may include a plurality of light emitting units 122, and the light emitting units 122 are configured to emit the sensing light beams. Alternatively, the plurality of light emitting units 122 may be arranged in an array.
Optionally, in some embodiments, the emission module 12 may further include emission optics 124. The emitting optics 124 are disposed on the light emitting side of the light source 120, and the emitting optics 124 are configured to modulate the light beams emitted by the light source 120 into a plurality of sensing light beams respectively having preset emitting directions and emit the sensing light beams toward a target scene.
The photodetection apparatus 10 further includes a control module 18, and the control module 18 is configured to control part or all of the light-emitting units 122 to emit light. That is, if the light source 120 includes N light-emitting units 122, where N is a positive integer greater than 1, the control module 18 may control M light-emitting units 122 of the N light-emitting units 122 to emit light, where M is a positive integer less than or equal to N. Each light-emitting unit 122 can emit light independently, without being affected by the other light-emitting units 122.
Optionally, in some embodiments, the control module 18 may be configured to control one or several of the plurality of light-emitting units 122 to emit light simultaneously. That is, if the light source 120 includes N light-emitting units 122, where N is a positive integer greater than 1, the control module 18 can control M light-emitting units 122 of the N light-emitting units 122 to emit light simultaneously, where M is a positive integer less than or equal to N. When M equals N, the control module 18 controls all the light-emitting units 122 of the light source 120 to emit light simultaneously.
Optionally, in some embodiments, the control module 18 may be configured to control each of the plurality of light-emitting units 122 to emit light in different time periods. For example, if the light source includes 4 light-emitting units, the control module 18 may control the 4 light-emitting units 122 to emit light in the different periods T1, T2, T3 and T4, respectively.
Optionally, in some embodiments, the control module 18 may be configured to control several of the light-emitting units 122 to emit light in different time periods, and the number of light-emitting units 122 lit in each period may be the same or different. That is, if the light source includes N light-emitting units, where N is a positive integer greater than 1, the control module may control M light-emitting units of the N light-emitting units to emit light in T different periods, where M is a positive integer greater than 1 and less than or equal to N, and T is a positive integer greater than 1 and less than or equal to M. For example, if N is 4, M is 3, and T is 2, the light source includes 4 light-emitting units, and the control module may control 1 of the 4 light-emitting units to emit light during period T1 and 2 of the 4 light-emitting units to emit light during period T2.
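This control pattern can be sketched as a simple schedule (hypothetical; the grouping below merely mirrors the N = 4, M = 3, T = 2 example above):

# N = 4 light-emitting units (indices 0-3); M = 3 of them are lit
# across T = 2 periods. The concrete grouping is an assumed example.
schedule = {
    "T1": [0],     # 1 of the 4 units emits during period T1
    "T2": [1, 2],  # 2 of the 4 units emit during period T2
}

def units_lit(period: str) -> list[int]:
    # Return the indices of the light-emitting units lit in a period.
    return schedule.get(period, [])

print(units_lit("T1"), units_lit("T2"))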
As shown in fig. 6, in some embodiments, the light emitting unit 122 may further include a plurality of light emitting sub-units 1220, and one or several light emitting sub-units 1220 in the same light emitting unit 122 emit light together in the same period to form a sensing light beam emitted by the light emitting unit 122. Thus, the control module 18 can adjust the light emission power of the light emitting unit 122 by controlling the number of the light emitting sub-units 1220 that are lit up together in the same period.
Alternatively, the light-emitting unit 122 may be a light source in the form of a Vertical Cavity Surface Emitting Laser (VCSEL), an Edge Emitting Laser (EEL), a Light Emitting Diode (LED), a Laser Diode (LD), or the like. The edge-emitting laser may be a Fabry-Perot (FP) laser, a Distributed Feedback (DFB) laser, an Electro-absorption Modulated Laser (EML), or the like, which is not limited in the embodiments of the present application.
Optionally, in some embodiments, the receiving module 14 may include a photosensor 140. The photosensor 140 includes, for example, a single photosensitive pixel 142 or a photosensitive pixel array composed of a plurality of photosensitive pixels 142. The photosensitive pixels 142 are configured to receive the light signal returned from outside the photodetection device 10 and output a corresponding light-sensing signal. The photosensitive pixels 142 are, for example, Single-Photon Avalanche Diodes (SPADs), Avalanche Photodiodes (APDs), Silicon Photomultipliers (SiPMs) formed by a plurality of SPADs connected in parallel, and/or other suitable photoelectric conversion elements. Alternatively, each photosensitive pixel 142 may include a single SPAD or a combination of several SPADs.
Optionally, in some embodiments, the receiving module 14 may further include a peripheral circuit (not shown) including one or more of a signal amplifier, a TDC, an Analog-to-Digital Converter (ADC), and the like, which are connected to the photosensor 140. The peripheral circuitry may be configured to record the times at which the light-sensitive pixels receive light signals to generate corresponding light-sensitive signals. Optionally, the peripheral circuit may be partially or wholly integrated in the photosensor 140.
Optionally, the receiving module 14 may further include a receiving optical device 144. The receiving optics 144 are disposed on the light entrance side of the photosensor 140 and are configured to transmit photons from the target scene to a plurality of light-sensitive pixels 142 on the photosensor 140. For example, in some embodiments, the receiving optics 144 include a receiving lens. Alternatively, the receiving lens 144 may include one or more lenses. It should be understood that the photons in the target scene that are transmitted to the light-sensitive pixels 142 via the receiving optics 144 include photons of the sensing beam that are reflected back by objects in the target scene.
As shown in fig. 7, the photosensitive pixels 142 on the photosensor 140 have corresponding sensing regions 20 in the target scene, and the sensing light beams projected into the sensing regions 20 are reflected and transmitted through the receiving optics 144 to the corresponding photosensitive pixels 142 for reception. For example, in some embodiments, the photosensitive pixels 142 of the photosensor 140 are arranged in a 2 × 2 array, comprising the number 1-4 photosensitive pixels 142. The number I sensing region 20 corresponds to the number 1 photosensitive pixel 142, the number II sensing region 20 corresponds to the number 2 photosensitive pixel 142, the number III sensing region 20 corresponds to the number 3 photosensitive pixel 142, and the number IV sensing region 20 corresponds to the number 4 photosensitive pixel 142. The sensing light beam emitted by the emitting module 12 is projected onto an object located in a sensing region 20 to form a light spot 126; the reflected sensing light beam generated at the light spot 126 is received by the corresponding photosensitive pixel 142, which outputs a corresponding light-sensing signal. The light-sensing signal can be used to obtain the flight time corresponding to the position of the light spot 126, and thus the three-dimensional information corresponding to the position of the light spot 126.
Optionally, in some embodiments, the control module 18 is configured to control the emission module 12 to emit the multiple sensing light beams with different emission directions toward the target scene in different time periods, where one or more sensing light beams may be emitted in each time period and each emitted sensing light beam has an emission direction different from those of the other sensing light beams. Correspondingly, the control module 18 is configured to control the photosensitive pixels 142 to receive the reflected sensing light beams in the corresponding time periods and to output corresponding light-sensing signals. For example, in some embodiments, the control module 18 is configured to control the emission module 12 to emit, within the same time period, one sensing light beam illuminating each corresponding sensing region, and the plurality of sensing light beams with different emission directions emitted by the emission module 12 within the same time period are emitted into a plurality of different sensing regions in a one-to-one correspondence. The control module 18 is further configured to control the emission module 12 to emit, in different time periods, a plurality of sensing light beams with different emission directions toward a plurality of sensing sub-regions at different positions in the same sensing region. Correspondingly, one photosensitive pixel 142 receives, within one time period, one sensing light beam with one preset emission direction, and the sensing light beams received by the same photosensitive pixel 142 in a plurality of different time periods have a plurality of different preset emission directions. Therefore, with the preset emission direction of the sensing light beam received by each photosensitive pixel 142 in each of the time periods calibrated in advance, the processing module 15 may determine the emission direction of a received sensing light beam from the time period in which the photosensitive pixel 142 receives it, and may obtain the coordinate values, in the emission module coordinate system, of the object reflecting the sensing light beam in the target scene by combining this with the detected flight time of the sensing light beam.
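Under this scheme, obtaining coordinates amounts to a calibrated lookup of the preset emission direction keyed by (photosensitive pixel, time period), combined with the measured time of flight. A minimal sketch, assuming placeholder calibration values (all angles below are invented for illustration):

import math

C = 299_792_458.0  # speed of light in m/s

# Calibrated emission direction (theta, phi) received by a photosensitive
# pixel in a given time period; the values here are assumed placeholders.
DIRECTIONS = {
    (1, "T1"): (math.radians(12.0), math.radians(30.0)),
    (1, "T2"): (math.radians(9.0), math.radians(42.0)),
}

def coordinates(pixel: int, period: str, tof: float):
    # Look up the preset direction, convert the time of flight to a
    # distance, and project onto the emission module coordinate axes.
    theta, phi = DIRECTIONS[(pixel, period)]
    D = C * tof / 2.0
    return (D * math.sin(theta) * math.cos(phi),
            D * math.sin(theta) * math.sin(phi),
            D * math.cos(theta))

print(coordinates(1, "T1", 20e-9))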
Optionally, in some embodiments, the sum of the numbers of the sensing light beams respectively emitted by the emitting module 12 in a plurality of different periods is greater than the number of the photosensitive pixels 142 of the receiving module 14. Thus, integrating three-dimensional information of different locations in the target scene obtained by the multiple sensing beams at different time intervals can obtain a spatial resolution higher than the number of photosensitive pixels 142.
Optionally, in some embodiments, the number of sensing light beams emitted by the emission module 12 in one time period is less than or equal to the number of photosensitive pixels 142 of the receiving module 14. Therefore, every sensing light beam emitted by the emission module 12 within a time period can be paired with a corresponding photosensitive pixel for three-dimensional detection, and no sensing light beam is wasted.
Referring to fig. 5 and 7-10 together, in some embodiments, the emission optics 124 include a projection lens. The emission module 12 includes a light source 120 and a projection lens 124, the light source 120 includes a plurality of light emitting units 122, and the projection lens 124 is configured to project light beams emitted by the light emitting units 122 toward a target scene along different preset emission directions, respectively, to form a plurality of corresponding sensing light beams. The light source 120 includes, for example, 16 light emitting units 122, and the light emitting units 122 are arranged in a 4 × 4 array, and are respectively marked with numbers 1 to 16. Optionally, in some embodiments, the emitting optics 124 may further include a collimator 127 (see fig. 12), and the collimator 127 may be disposed between the light source 120 and the projection lens 124 to collimate the light beam emitted from the light source 120 and then emit the light beam through the projection lens 124.
The receiving module 14 includes a photosensor 140 and a receiving lens 144, and the sensing light beam reflected by the object 2 in the target scene is transmitted through the receiving lens 144 to the photosensitive pixels 142 on the photosensor 140 for reception. The photosensor 140 includes, for example, 4 photosensitive pixels 142 arranged in a 2 × 2 array, labeled No. 1 to 4. The photosensitive pixels 142 each have a corresponding sensing region 20 in the target scene, and the 4 sensing regions 20 are correspondingly arranged in a 2 × 2 array in the target scene, labeled No. I, II, III and IV respectively.
As shown in fig. 7, the control module 18 is configured to control the No. 1, 3, 9 and 11 light-emitting units 122 to emit sensing light beams with different emission directions respectively during the period T1. The sensing light beam emitted by the No. 1 light-emitting unit 122 irradiates the No. 1 sensing sub-region in the No. I sensing region 20 and forms a corresponding light spot 126, the sensing light beam emitted by the No. 3 light-emitting unit 122 irradiates the No. 3 sensing sub-region in the No. II sensing region 20 and forms a corresponding light spot 126, the sensing light beam emitted by the No. 9 light-emitting unit 122 irradiates the No. 9 sensing sub-region in the No. III sensing region 20 and forms a corresponding light spot 126, and the sensing light beam emitted by the No. 11 light-emitting unit 122 irradiates the No. 11 sensing sub-region in the No. IV sensing region 20 and forms a corresponding light spot 126.
Correspondingly, the control module 18 is configured to control the No. 1 light-sensitive pixel 142 to detect the flight time of the sensing light beam emitted by the No. 1 light-emitting unit 122, the No. 2 light-sensitive pixel 142 to detect the flight time of the sensing light beam emitted by the No. 3 light-emitting unit 122, the No. 3 light-sensitive pixel 142 to detect the flight time of the sensing light beam emitted by the No. 9 light-emitting unit 122, and the No. 4 light-sensitive pixel 142 to detect the flight time of the sensing light beam emitted by the No. 11 light-emitting unit 122 during a period T1. The processing module 15 is configured to obtain three-dimensional information of the No. 1, 3, 9 and 11 sensing sub-regions of the T1 time period according to the emitting direction and the flight time of the sensing light beam emitted by the No. 1, 3, 9 and 11 light-emitting units 122, respectively, to form a three-dimensional information map of the T1 time period. It can be understood that the target scene resolution of the three-dimensional information map obtained in this T1 time period is 2 × 2.
As shown in fig. 8, the control module 18 is configured to control the No. 2, 4, 10 and 12 light-emitting units 122 to emit sensing light beams having different emission directions, respectively, during the period T2. The sensing light beam emitted by the No. 2 light-emitting unit 122 irradiates the No. 2 sensing sub-region in the No. I sensing region 20 and forms a corresponding light spot 126, the sensing light beam emitted by the No. 4 light-emitting unit 122 irradiates the No. 4 sensing sub-region in the No. II sensing region 20 and forms a corresponding light spot 126, the sensing light beam emitted by the No. 10 light-emitting unit 122 irradiates the No. 10 sensing sub-region in the No. III sensing region 20 and forms a corresponding light spot 126, and the sensing light beam emitted by the No. 12 light-emitting unit 122 irradiates the No. 12 sensing sub-region in the No. IV sensing region 20 and forms a corresponding light spot 126.
Correspondingly, the control module 18 is configured to control the No. 1 light-sensitive pixel 142 to detect the flight time of the sensing light beam emitted by the No. 2 light-emitting unit 122, the No. 2 light-sensitive pixel 142 to detect the flight time of the sensing light beam emitted by the No. 4 light-emitting unit 122, the No. 3 light-sensitive pixel 142 to detect the flight time of the sensing light beam emitted by the No. 10 light-emitting unit 122, and the No. 4 light-sensitive pixel 142 to detect the flight time of the sensing light beam emitted by the No. 12 light-emitting unit 122 during a period T2. The processing module is configured to correspondingly obtain three-dimensional information of No. 2, 4, 10 and 12 sensing subregions of the T2 time period according to the emission direction and the flight time of the sensing light beam emitted by the No. 2, 4, 10 and 12 light-emitting units 122 respectively, so as to form a three-dimensional information map of the T2 time period. It can be understood that the target scene resolution of the three-dimensional information map obtained in this T2 period is 2 × 2.
As shown in fig. 9, the control module 18 is configured to control the No. 5, 7, 13 and 15 light-emitting units 122 to emit sensing light beams having different emission directions, respectively, during the period T3. The sensing light beam emitted by the No. 5 light-emitting unit 122 irradiates the No. 5 sensing sub-region in the No. I sensing region 20 and forms a corresponding light spot 126, the sensing light beam emitted by the No. 7 light-emitting unit 122 irradiates the No. 7 sensing sub-region in the No. II sensing region 20 and forms a corresponding light spot 126, the sensing light beam emitted by the No. 13 light-emitting unit 122 irradiates the No. 13 sensing sub-region in the No. III sensing region 20 and forms a corresponding light spot 126, and the sensing light beam emitted by the No. 15 light-emitting unit 122 irradiates the No. 15 sensing sub-region in the No. IV sensing region 20 and forms a corresponding light spot 126.
Correspondingly, the control module 18 is configured to control the No. 1 light-sensing pixel 142 to detect the flight time of the sensing light beam emitted by the No. 5 light-emitting unit 122, the No. 2 light-sensing pixel 142 to detect the flight time of the sensing light beam emitted by the No. 7 light-emitting unit 122, the No. 3 light-sensing pixel 142 to detect the flight time of the sensing light beam emitted by the No. 13 light-emitting unit 122, and the No. 4 light-sensing pixel 142 to detect the flight time of the sensing light beam emitted by the No. 15 light-emitting unit 122 during a period T3. The processing module 15 is configured to correspondingly obtain three-dimensional information of the No. 5, 7, 13 and 15 sensing sub-regions in the target scene during the T3 time period according to the emitting direction and the flight time of the sensing light beam emitted by the No. 5, 7, 13 and 15 light-emitting units 122, respectively, so as to form a three-dimensional information map during the T3 time period. It can be understood that the target scene resolution of the three-dimensional information map obtained in this T3 period is 2 × 2.
As shown in fig. 10, the control module 18 is configured to control the No. 6, 8, 14 and 16 light-emitting units 122 to emit sensing light beams having different emission directions, respectively, during the period T4. The sensing light beam emitted by the No. 6 light-emitting unit 122 irradiates the No. 6 sensing sub-region in the No. I sensing region 20 and forms a corresponding light spot 126, the sensing light beam emitted by the No. 8 light-emitting unit 122 irradiates the No. 8 sensing sub-region in the No. II sensing region 20 and forms a corresponding light spot 126, the sensing light beam emitted by the No. 14 light-emitting unit 122 irradiates the No. 14 sensing sub-region in the No. III sensing region 20 and forms a corresponding light spot 126, and the sensing light beam emitted by the No. 16 light-emitting unit 122 irradiates the No. 16 sensing sub-region in the No. IV sensing region 20 and forms a corresponding light spot 126.
Correspondingly, the control module 18 is configured to control the No. 1 light-sensitive pixel 142 to detect the flight time of the sensing light beam emitted by the No. 6 light-emitting unit 122, the No. 2 light-sensitive pixel 142 to detect the flight time of the sensing light beam emitted by the No. 8 light-emitting unit 122, the No. 3 light-sensitive pixel 142 to detect the flight time of the sensing light beam emitted by the No. 14 light-emitting unit 122, and the No. 4 light-sensitive pixel 142 to detect the flight time of the sensing light beam emitted by the No. 16 light-emitting unit 122 during the period T4. The processing module 15 is configured to correspondingly obtain three-dimensional information of the No. 6, 8, 14 and 16 sensing sub-regions in the target scene during the T4 period according to the emission direction and the flight time of the sensing light beams emitted by the No. 6, 8, 14 and 16 light-emitting units 122, respectively, so as to form a three-dimensional information map of the T4 period. It can be understood that the target scene resolution of the three-dimensional information map obtained in this T4 period is 2 × 2.
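For illustration only (this sketch is not part of the claimed subject matter), the time-division control described above can be summarized as a schedule mapping each period to the light-emitting units 122 fired in that period and to the light-sensitive pixel 142 assigned to time each echo; the T1 assignment is inferred from the sub-region layout of fig. 11 below, and fire_unit and read_pixel are hypothetical stand-ins for the control and readout interfaces:

    # Hypothetical sketch of the T1-T4 activation schedule described above.
    # Each period maps an active light-emitting unit (Nos. 1-16) to the
    # light-sensitive pixel (Nos. 1-4) assigned to time its echo.
    SCHEDULE = {
        "T1": {1: 1, 3: 2, 9: 3, 11: 4},
        "T2": {2: 1, 4: 2, 10: 3, 12: 4},
        "T3": {5: 1, 7: 2, 13: 3, 15: 4},
        "T4": {6: 1, 8: 2, 14: 3, 16: 4},
    }

    def run_detection_frame(fire_unit, read_pixel):
        """Drive one detection frame: fire each scheduled light-emitting
        unit and record the flight time reported by its assigned pixel."""
        tof = {}  # light-emitting unit number -> measured flight time
        for period, assignment in SCHEDULE.items():
            for unit, pixel in assignment.items():
                fire_unit(unit)                # emit along the unit's preset direction
                tof[unit] = read_pixel(pixel)  # pixel gated during this period
        return tof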
As shown in fig. 11, since the sensing light beams emitted by the No. 1, 2, 5 and 6 light-emitting units 122 have different emission directions, they correspondingly illuminate the No. 1, 2, 5 and 6 sensing sub-regions at different positions on the No. I sensing region in the different periods. Similarly, the No. 3, 4, 7 and 8 light-emitting units 122 correspondingly illuminate the No. 3, 4, 7 and 8 sensing sub-regions at different positions on the No. II sensing region in the different periods. The No. 9, 10, 13 and 14 light-emitting units 122 correspondingly illuminate the No. 9, 10, 13 and 14 sensing sub-regions at different positions on the No. III sensing region in the different periods. The No. 11, 12, 15 and 16 light-emitting units 122 correspondingly illuminate the No. 11, 12, 15 and 16 sensing sub-regions at different positions on the No. IV sensing region in the different periods.
It can be understood that the sensing light beams having different emission directions are respectively projected, in the plurality of periods, to a plurality of sensing sub-regions at different positions within the sensing region 20, and that these sensing sub-regions together cover the whole sensing region 20. For example, the No. 1-16 light-emitting units 122 of the light source 120 are all configured to emit sensing light beams along different preset emission directions, which correspondingly irradiate 16 non-overlapping sensing sub-regions in the target scene during the different periods T1-T4. Thus, the processing module 15 can synthesize the four three-dimensional information maps with the lower 2 × 2 resolution, obtained in the periods T1, T2, T3 and T4 respectively, into one three-dimensional information map with the higher 4 × 4 resolution.
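A minimal sketch of this synthesis step, assuming (consistent with the sub-region layout of figs. 7-11) that within each photosensitive pixel's 2 × 2 quadrant the periods T1-T4 fill the top-left, top-right, bottom-left and bottom-right sensing sub-regions respectively; the function and variable names are illustrative:

    import numpy as np

    # Offsets of each period's sub-region inside its 2x2 quadrant
    # (row offset, column offset); an assumption matching figs. 7-11.
    OFFSETS = {"T1": (0, 0), "T2": (0, 1), "T3": (1, 0), "T4": (1, 1)}

    def synthesize(subframes):
        """subframes: dict mapping each period to a 2x2 array of depths,
        indexed by sensing region (row, col), i.e. regions I/II on the
        top row and III/IV on the bottom row. Returns the 4x4 map."""
        full = np.zeros((4, 4))
        for period, (dr, dc) in OFFSETS.items():
            for r in range(2):      # sensing region row
                for c in range(2):  # sensing region column
                    full[2 * r + dr, 2 * c + dc] = subframes[period][r, c]
        return full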
It can be understood that the different periods T1, T2, T3 and T4 together constitute one detection frame of the photoelectric detection device 10.
SPAD arrays have limited resolution because each pixel requires complex quenching, timing, memory and readout elements. In the embodiments of the present application, because the three-dimensional coordinates of an object are defined by the preset emission direction and the flight time of the sensing light beam emitted by the emission module 12, a three-dimensional information map with relatively high resolution can be obtained using photosensitive pixels 142 of relatively low resolution, which improves the detection resolution of the photoelectric detection device 10 that uses SPADs as the photosensitive pixels 142.
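For illustration, the coordinate computation referred to here can be sketched as follows, using the emission module coordinate system later set out in claims 3-5 (z axis normal to the light-emitting surface, the emission direction given by a polar angle and an azimuth angle); the function name is an assumption:

    import math

    C = 299_792_458.0  # speed of light in m/s

    def object_coordinates(theta, phi, tof):
        """theta: polar angle between the preset emission direction and
        the z axis; phi: azimuth angle in the xy plane; tof: measured
        round-trip flight time in seconds. Returns the (x, y, z)
        coordinates of the reflecting object."""
        d = C * tof / 2.0  # one-way distance derived from the flight time
        return (d * math.sin(theta) * math.cos(phi),
                d * math.sin(theta) * math.sin(phi),
                d * math.cos(theta))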
As shown in fig. 12, in some embodiments the emission optics 124 include a beam splitter 125, so that the emission module 12 includes a light source 120 and the beam splitter 125. The light source 120 is configured to emit at least one sensing light beam. The beam splitter 125 is configured to split one sensing light beam into a plurality of sensing light beams respectively having different emission directions. The beam splitter 125 is, for example, a cylindrical lens, a grating, a microlens array, a Diffractive Optical Element (DOE), or the like. The multiple sensing light beams split by the beam splitter 125 may be arranged along one dimension or on a two-dimensional plane, which is not specifically limited in this application. Optionally, the beam splitter 125 may be made of a resin material, a glass material, or a combination of both. Thus, the number of sensing light beams with different emission directions can be increased by the beam splitter 125, thereby improving the resolution of three-dimensional detection of the target scene.
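As a rough illustrative sketch only (a small-angle approximation, not the patent's actual optical design; the function name and parameters are assumptions), a 1-into-4 beam splitter of the kind described above can be modeled as adding a fixed angular offset per split order to the direction of the incoming beam:

    def fanout_2x2(ax, ay, pitch):
        """(ax, ay): angular offsets (radians) of the incoming collimated
        beam from the optical axis; pitch: angular separation between the
        split orders. Returns the four outgoing beam directions as
        (ax, ay) pairs forming a 2x2 fan around the incoming direction."""
        return [(ax + mx * pitch / 2, ay + my * pitch / 2)
                for mx in (-1, 1) for my in (-1, 1)]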
Optionally, the emission optics 124 may also include a collimator 127. The collimator 127 may be disposed between the light source 120 and the beam splitter 125, and is used for collimating the sensing light beam emitted from the light source 120. The collimated sensing beam enters the beam splitter 125 for beam splitting. The collimator 127 is, for example, a collimator lens. Optionally, the collimator lens may include one or more lenses (not shown).
It can be understood that a corresponding collimator 127 and/or beam splitter 125 may be provided for each light-emitting unit 122, or for each group of several light-emitting units 122 among the plurality of light-emitting units 122, to increase the flexibility of modulating the emitted sensing light beams.
Referring to fig. 13-16, the light source 120 includes, for example, 4 light-emitting units 122, and the light-emitting units 122 may be arranged in a 2 × 2 array, respectively labeled as Nos. 1-4. The light beam emitted by each light-emitting unit 122 is split by the beam splitter 125 into 4 sensing light beams with different preset emission directions. The light beam emitted by the No. 1 light-emitting unit 122 is split into the No. 11, 12, 13 and 14 sensing light beams, the light beam emitted by the No. 2 light-emitting unit 122 is split into the No. 21, 22, 23 and 24 sensing light beams, the light beam emitted by the No. 3 light-emitting unit 122 is split into the No. 31, 32, 33 and 34 sensing light beams, and the light beam emitted by the No. 4 light-emitting unit 122 is split into the No. 41, 42, 43 and 44 sensing light beams.
The photosensor 140 includes, for example, 4 photosensitive pixels 142, and the photosensitive pixels 142 are arranged in a 2 × 2 array, respectively labeled as Nos. 1-4. The photosensitive pixels 142 respectively have corresponding sensing regions 20 in the target scene, and the 4 sensing regions 20 are correspondingly arranged in a 2 × 2 array in the target scene, respectively labeled as Nos. I, II, III and IV.
The control module 18 is configured to light the No. 1 light-emitting unit 122 during the period T1, and the emitted light beam is split by the beam splitter 125 into the No. 11, 12, 13 and 14 sensing light beams with different emission directions. The No. 11 sensing light beam irradiates the No. 11 sensing sub-region in the No. I sensing region 20 and forms a corresponding light spot 126, the No. 12 sensing light beam irradiates the No. 12 sensing sub-region in the No. II sensing region 20 and forms a corresponding light spot 126, the No. 13 sensing light beam irradiates the No. 13 sensing sub-region in the No. III sensing region 20 and forms a corresponding light spot 126, and the No. 14 sensing light beam irradiates the No. 14 sensing sub-region in the No. IV sensing region 20 and forms a corresponding light spot 126.
Correspondingly, the control module 18 is configured to control the No. 1 light-sensitive pixel 142 to detect the flight time of the No. 11 sensing light beam, the No. 2 light-sensitive pixel 142 to detect the flight time of the No. 12 sensing light beam, the No. 3 light-sensitive pixel 142 to detect the flight time of the No. 13 sensing light beam, and the No. 4 light-sensitive pixel 142 to detect the flight time of the No. 14 sensing light beam during the period T1. The processing module 15 is configured to correspondingly obtain three-dimensional information of the No. 11, 12, 13 and 14 sensing sub-regions during the T1 period according to the emission directions and the flight times of the No. 11, 12, 13 and 14 sensing light beams, respectively, so as to form a three-dimensional information map of the T1 period. It can be understood that the target scene resolution of the three-dimensional information map obtained in this T1 period is 2 × 2.
As shown in fig. 14, the control module 18 is configured to light the No. 2 light-emitting unit 122 during the period T2, and the emitted light beam is split by the beam splitter 125 into the No. 21, 22, 23 and 24 sensing light beams with different emission directions. The No. 21 sensing light beam irradiates the No. 21 sensing sub-region in the No. I sensing region 20 and forms a corresponding light spot 126, the No. 22 sensing light beam irradiates the No. 22 sensing sub-region in the No. II sensing region 20 and forms a corresponding light spot 126, the No. 23 sensing light beam irradiates the No. 23 sensing sub-region in the No. III sensing region 20 and forms a corresponding light spot 126, and the No. 24 sensing light beam irradiates the No. 24 sensing sub-region in the No. IV sensing region 20 and forms a corresponding light spot 126.
Correspondingly, the control module 18 is configured to control the No. 1 light-sensitive pixel 142 to detect the flight time of the No. 21 sensing light beam, the No. 2 light-sensitive pixel 142 to detect the flight time of the No. 22 sensing light beam, the No. 3 light-sensitive pixel 142 to detect the flight time of the No. 23 sensing light beam, and the No. 4 light-sensitive pixel 142 to detect the flight time of the No. 24 sensing light beam during the period T2. The processing module 15 is configured to correspondingly obtain three-dimensional information of the No. 21, 22, 23 and 24 sensing sub-regions during the T2 period according to the emission directions and the flight times of the No. 21, 22, 23 and 24 sensing light beams, respectively, so as to form a three-dimensional information map of the T2 period. It can be understood that the target scene resolution of the three-dimensional information map obtained in this T2 period is 2 × 2.
As shown in fig. 15, the control module 18 is configured to light the No. 3 light-emitting unit 122 during the period T3, and the emitted light beam is split by the beam splitter 125 into the No. 31, 32, 33 and 34 sensing light beams with different emission directions. The No. 31 sensing light beam irradiates the No. 31 sensing sub-region in the No. I sensing region 20 and forms a corresponding light spot 126, the No. 32 sensing light beam irradiates the No. 32 sensing sub-region in the No. II sensing region 20 and forms a corresponding light spot 126, the No. 33 sensing light beam irradiates the No. 33 sensing sub-region in the No. III sensing region 20 and forms a corresponding light spot 126, and the No. 34 sensing light beam irradiates the No. 34 sensing sub-region in the No. IV sensing region 20 and forms a corresponding light spot 126.
Correspondingly, the control module 18 is configured to control the No. 1 light-sensitive pixel 142 to detect the flight time of the No. 31 sensing light beam, the No. 2 light-sensitive pixel 142 to detect the flight time of the No. 32 sensing light beam, the No. 3 light-sensitive pixel 142 to detect the flight time of the No. 33 sensing light beam, and the No. 4 light-sensitive pixel 142 to detect the flight time of the No. 34 sensing light beam during the period T3. The processing module 15 is configured to correspondingly obtain three-dimensional information of the No. 31, 32, 33 and 34 sensing sub-regions in the target scene during the T3 period according to the emission directions and the flight times of the No. 31, 32, 33 and 34 sensing light beams, respectively, so as to form a three-dimensional information map of the T3 period. It can be understood that the target scene resolution of the three-dimensional information map obtained in this T3 period is 2 × 2.
As shown in fig. 16, the control module 18 is configured to light the No. 4 light-emitting unit 122 during the period T4, and the emitted light beam is split by the beam splitter 125 into the No. 41, 42, 43 and 44 sensing light beams with different emission directions. The No. 41 sensing light beam irradiates the No. 41 sensing sub-region in the No. I sensing region 20 and forms a corresponding light spot 126, the No. 42 sensing light beam irradiates the No. 42 sensing sub-region in the No. II sensing region 20 and forms a corresponding light spot 126, the No. 43 sensing light beam irradiates the No. 43 sensing sub-region in the No. III sensing region 20 and forms a corresponding light spot 126, and the No. 44 sensing light beam irradiates the No. 44 sensing sub-region in the No. IV sensing region 20 and forms a corresponding light spot 126.
Correspondingly, the control module 18 is configured to control the No. 1 light-sensitive pixel 142 to detect the flight time of the No. 41 sensing light beam, the No. 2 light-sensitive pixel 142 to detect the flight time of the No. 42 sensing light beam, the No. 3 light-sensitive pixel 142 to detect the flight time of the No. 43 sensing light beam, and the No. 4 light-sensitive pixel 142 to detect the flight time of the No. 44 sensing light beam during the period T4. The processing module 15 is configured to correspondingly obtain three-dimensional information of the No. 41, 42, 43 and 44 sensing sub-regions in the target scene during the T4 period according to the emission directions and the flight times of the No. 41, 42, 43 and 44 sensing light beams, respectively, so as to form a three-dimensional information map of the T4 period. It can be understood that the target scene resolution of the three-dimensional information map obtained in this T4 period is 2 × 2.
As shown in fig. 17, the No. 11, 21, 31 and 41 sensing light beams have different emission directions and therefore irradiate the No. 11, 21, 31 and 41 sensing sub-regions at different positions on the No. I sensing region in the above different periods. Similarly, the No. 12, 22, 32 and 42 sensing light beams respectively irradiate the No. 12, 22, 32 and 42 sensing sub-regions at different positions on the No. II sensing region in the different periods. The No. 13, 23, 33 and 43 sensing light beams respectively irradiate the No. 13, 23, 33 and 43 sensing sub-regions at different positions on the No. III sensing region 20 in the different periods. The No. 14, 24, 34 and 44 sensing light beams respectively irradiate the No. 14, 24, 34 and 44 sensing sub-regions at different positions on the No. IV sensing region 20 in the different periods.
Since the sensing light beams are configured to be emitted along different preset emission directions, 16 non-overlapping sensing sub-regions in the target scene are irradiated during the different periods T1-T4, and the sensing sub-regions at different positions in each sensing region 20 together cover the whole sensing region 20. The processing module 15 may synthesize the four three-dimensional information maps with the lower 2 × 2 resolution, obtained in the periods T1, T2, T3 and T4 respectively, into one three-dimensional information map with the higher 4 × 4 resolution. It can be understood that the different periods T1, T2, T3 and T4 together constitute one detection frame of the photoelectric detection device 10.
The above embodiment of the present application defines the three-dimensional coordinates of an object by the preset emission direction and the flight time of the sensing light beam emitted by the emission module 12, and increases the number of sensing light beams through the beam splitter 125. A three-dimensional information map with relatively high resolution can therefore be obtained using a smaller number of light-emitting units 122 and photosensitive pixels 142 of relatively low resolution, which improves the detection resolution of the photoelectric detection device 10 using SPADs as the photosensitive pixels 142 and reduces the required device cost.
As shown in fig. 18, in some embodiments the emission optics 124 include a scanner 128, so that the emission module 12 includes a light source 120 and the scanner 128. The light source 120 is configured to emit at least one sensing light beam. The scanner 128 is configured to deflect the emission direction of the sensing light beam so that the sensing light beam can scan different positions in the target scene, thereby improving the resolution of the photoelectric detection device 10. The scanner 128 is, for example, a Micro-Electro-Mechanical System (MEMS) galvanometer, which reflects the sensing light beam emitted by the light source 120 into different emission directions by deflecting its micro-mirror. Alternatively, the scanner 128 is, for example, an Optical Phased Array (OPA), which adjusts the emission direction of the sensing light beam by controlling the orientation of its equiphase surface. The control module 18 is configured to control the scanner 128 to deflect the sensing light beam to have different emission directions in different periods.
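A minimal sketch of such a per-period scan schedule (illustrative only; set_scanner_offset, fire_all_units and read_pixels are hypothetical stand-ins for the interfaces of the scanner 128, light source 120 and photosensor 140, and the offsets assume the sub-region layout of figs. 19-23 described below):

    def scan_frame(delta, set_scanner_offset, fire_all_units, read_pixels):
        """delta: angular pitch between adjacent sensing sub-regions.
        Each period deflects every beam by one of four offsets so that
        the beams visit the four sub-regions of their sensing regions
        in turn over the periods T1-T4."""
        offsets = {"T1": (0.0, 0.0),      # No. 11/21/31/41 sensing sub-regions
                   "T2": (0.0, delta),    # No. 12/22/32/42
                   "T3": (delta, 0.0),    # No. 13/23/33/43
                   "T4": (delta, delta)}  # No. 14/24/34/44
        frame = {}
        for period, (dx, dy) in offsets.items():
            set_scanner_offset(dx, dy)     # deflect the sensing light beams
            fire_all_units()               # Nos. 1-4 light-emitting units emit
            frame[period] = read_pixels()  # flight time per pixel this period
        return frame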
As shown in fig. 19, the light source 120 includes, for example, 4 light-emitting units 122, and the light-emitting units 122 may be arranged in a 2 × 2 array, respectively labeled as Nos. 1-4. The photosensor 140 includes, for example, 4 photosensitive pixels 142, and the photosensitive pixels 142 are arranged in a 2 × 2 array, respectively labeled as Nos. 1-4. The photosensitive pixels 142 respectively have corresponding sensing regions 20 in the target scene, and the 4 sensing regions 20 are correspondingly arranged in a 2 × 2 array in the target scene, respectively labeled as Nos. I, II, III and IV.
The sensing light beam emitted by the No. 1 light-emitting unit 122 is deflected in emission direction by the scanner 128 so as to respectively irradiate the No. 11, 12, 13 and 14 sensing sub-regions in the No. I sensing region 20 and form corresponding light spots 126, and the No. 11, 12, 13 and 14 sensing sub-regions do not overlap with one another. The sensing light beam emitted by the No. 2 light-emitting unit 122 is deflected in emission direction by the scanner 128 so as to respectively irradiate the No. 21, 22, 23 and 24 sensing sub-regions in the No. II sensing region 20 and form corresponding light spots 126, and the No. 21, 22, 23 and 24 sensing sub-regions do not overlap with one another. The sensing light beam emitted by the No. 3 light-emitting unit 122 is deflected in emission direction by the scanner 128 so as to respectively irradiate the No. 31, 32, 33 and 34 sensing sub-regions in the No. III sensing region 20 and form corresponding light spots 126, and the No. 31, 32, 33 and 34 sensing sub-regions do not overlap with one another. The sensing light beam emitted by the No. 4 light-emitting unit 122 is deflected in emission direction by the scanner 128 so as to respectively irradiate the No. 41, 42, 43 and 44 sensing sub-regions in the No. IV sensing region 20 and form corresponding light spots 126, and the No. 41, 42, 43 and 44 sensing sub-regions do not overlap with one another.
The control module 18 is configured, during the period T1, to control the scanner 128 to deflect the sensing light beam emitted by the No. 1 light-emitting unit 122 to irradiate the No. 11 sensing sub-region within the No. I sensing region 20, to deflect the sensing light beam emitted by the No. 2 light-emitting unit 122 to irradiate the No. 21 sensing sub-region within the No. II sensing region 20, to deflect the sensing light beam emitted by the No. 3 light-emitting unit 122 to irradiate the No. 31 sensing sub-region within the No. III sensing region 20, and to deflect the sensing light beam emitted by the No. 4 light-emitting unit 122 to irradiate the No. 41 sensing sub-region within the No. IV sensing region 20.
Correspondingly, the control module 18 is configured to control the No. 1 light-sensitive pixel 142 to detect the flight time of the sensing light beam irradiating the No. 11 sensing sub-region, the No. 2 light-sensitive pixel 142 to detect the flight time of the sensing light beam irradiating the No. 21 sensing sub-region, the No. 3 light-sensitive pixel 142 to detect the flight time of the sensing light beam irradiating the No. 31 sensing sub-region, and the No. 4 light-sensitive pixel 142 to detect the flight time of the sensing light beam irradiating the No. 41 sensing sub-region during the period T1. The processing module 15 is configured to correspondingly obtain three-dimensional information of the No. 11, 21, 31 and 41 sensing sub-regions during the T1 period according to the emission direction and the flight time of the sensing light beam received by each of the No. 1-4 photosensitive pixels 142, so as to form a three-dimensional information map of the T1 period. It can be understood that the target scene resolution of the three-dimensional information map obtained in this T1 period is 2 × 2.
As shown in fig. 20, the control module 18 is configured, during the period T2, to control the scanner 128 to deflect the sensing light beam emitted by the No. 1 light-emitting unit 122 to irradiate the No. 12 sensing sub-region within the No. I sensing region 20, to deflect the sensing light beam emitted by the No. 2 light-emitting unit 122 to irradiate the No. 22 sensing sub-region within the No. II sensing region 20, to deflect the sensing light beam emitted by the No. 3 light-emitting unit 122 to irradiate the No. 32 sensing sub-region within the No. III sensing region 20, and to deflect the sensing light beam emitted by the No. 4 light-emitting unit 122 to irradiate the No. 42 sensing sub-region within the No. IV sensing region 20.
Correspondingly, the control module 18 is configured to control the No. 1 light-sensitive pixel 142 to detect the flight time of the sensing light beam irradiating the No. 12 sensing sub-region, the No. 2 light-sensitive pixel 142 to detect the flight time of the sensing light beam irradiating the No. 22 sensing sub-region, the No. 3 light-sensitive pixel 142 to detect the flight time of the sensing light beam irradiating the No. 32 sensing sub-region, and the No. 4 light-sensitive pixel 142 to detect the flight time of the sensing light beam irradiating the No. 42 sensing sub-region during the period T2. The processing module 15 is configured to correspondingly obtain three-dimensional information of the No. 12, 22, 32 and 42 sensing sub-regions during the T2 period according to the emission direction and the flight time of the sensing light beam received by each of the No. 1-4 photosensitive pixels 142, so as to form a three-dimensional information map of the T2 period. It can be understood that the target scene resolution of the three-dimensional information map obtained in this T2 period is 2 × 2.
As shown in fig. 21, the control module 18 is configured, during the period T3, to control the scanner 128 to deflect the sensing light beam emitted by the No. 1 light-emitting unit 122 to irradiate the No. 13 sensing sub-region within the No. I sensing region 20, to deflect the sensing light beam emitted by the No. 2 light-emitting unit 122 to irradiate the No. 23 sensing sub-region within the No. II sensing region 20, to deflect the sensing light beam emitted by the No. 3 light-emitting unit 122 to irradiate the No. 33 sensing sub-region within the No. III sensing region 20, and to deflect the sensing light beam emitted by the No. 4 light-emitting unit 122 to irradiate the No. 43 sensing sub-region within the No. IV sensing region 20.
Correspondingly, the control module 18 is configured to control the No. 1 light-sensitive pixel 142 to detect the flight time of the sensing light beam irradiating the No. 13 sensing sub-region, the No. 2 light-sensitive pixel 142 to detect the flight time of the sensing light beam irradiating the No. 23 sensing sub-region, the No. 3 light-sensitive pixel 142 to detect the flight time of the sensing light beam irradiating the No. 33 sensing sub-region, and the No. 4 light-sensitive pixel 142 to detect the flight time of the sensing light beam irradiating the No. 43 sensing sub-region during the period T3. The processing module 15 is configured to correspondingly obtain three-dimensional information of the No. 13, 23, 33 and 43 sensing sub-regions during the T3 period according to the emission direction and the flight time of the sensing light beam received by each of the No. 1-4 photosensitive pixels 142, so as to form a three-dimensional information map of the T3 period. It can be understood that the target scene resolution of the three-dimensional information map obtained in this T3 period is 2 × 2.
As shown in fig. 22, the control module 18 is configured, during the period T4, to control the scanner 128 to deflect the sensing light beam emitted by the No. 1 light-emitting unit 122 to irradiate the No. 14 sensing sub-region within the No. I sensing region 20, to deflect the sensing light beam emitted by the No. 2 light-emitting unit 122 to irradiate the No. 24 sensing sub-region within the No. II sensing region 20, to deflect the sensing light beam emitted by the No. 3 light-emitting unit 122 to irradiate the No. 34 sensing sub-region within the No. III sensing region 20, and to deflect the sensing light beam emitted by the No. 4 light-emitting unit 122 to irradiate the No. 44 sensing sub-region within the No. IV sensing region 20.
Correspondingly, the control module 18 is configured to control the No. 1 light-sensitive pixel 142 to detect the flight time of the sensing light beam irradiating the No. 14 sensing sub-region, the No. 2 light-sensitive pixel 142 to detect the flight time of the sensing light beam irradiating the No. 24 sensing sub-region, the No. 3 light-sensitive pixel 142 to detect the flight time of the sensing light beam irradiating the No. 34 sensing sub-region, and the No. 4 light-sensitive pixel 142 to detect the flight time of the sensing light beam irradiating the No. 44 sensing sub-region during the period T4. The processing module 15 is configured to correspondingly obtain three-dimensional information of the No. 14, 24, 34 and 44 sensing sub-regions during the T4 period according to the emission direction and the flight time of the sensing light beam received by each of the No. 1-4 photosensitive pixels 142, so as to form a three-dimensional information map of the T4 period. It can be understood that the target scene resolution of the three-dimensional information map obtained in this T4 period is 2 × 2.
As shown in fig. 23, since the sensing light beams are deflected to be emitted along different preset emission directions in different periods, the 16 sensing sub-regions correspondingly illuminated in the different periods do not overlap one another in the target scene, and the sensing sub-regions at different positions in each sensing region 20 together cover the whole sensing region 20. The processing module 15 may synthesize the four three-dimensional information maps with the lower 2 × 2 resolution, obtained in the periods T1, T2, T3 and T4 respectively, into one three-dimensional information map with the higher 4 × 4 resolution. It can be understood that the different periods T1, T2, T3 and T4 together constitute one detection frame of the photoelectric detection device 10.
In the above embodiment of the present application, the three-dimensional coordinates of an object are defined by the preset emission direction and the flight time of the sensing light beam emitted by the emission module 12, and the emission direction of the sensing light beam is changed by the scanner 128 in different periods. A three-dimensional information map with relatively high resolution can therefore be obtained using a small number of light-emitting units 122 and photosensitive pixels 142 of relatively low resolution, which improves the detection resolution of the photoelectric detection device 10 using SPADs as the photosensitive pixels 142 and reduces the required device cost.
Alternatively, in some embodiments, all or some of the functional units in the control module 18 and/or the processing module 15 may be firmware embedded in the storage medium 30 or computer software code stored in the storage medium 30, executed by one or more corresponding processors 40 to control the relevant components to implement the corresponding functions. Examples of the processor 40 include, but are not limited to, an Application Processor (AP), a Central Processing Unit (CPU), a Micro Controller Unit (MCU), and the like. The storage medium 30 includes, but is not limited to, a Flash Memory, an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Programmable Read-Only Memory (PROM), a hard disk, and the like.
Optionally, in some embodiments, the processor 40 and/or the storage medium 30 may be disposed in the photodetecting device 10, such as: integrated on the same circuit board as the transmission module 12 or the reception module 14. Optionally, in some other embodiments, the processor 40 and/or the storage medium 30 may also be disposed at other positions of the electronic device 1, such as: on the main circuit board of the electronic device 1.
Optionally, in some embodiments, part or all of the functional units of the control module 18 and/or the processing module 15 may also be implemented by hardware, for example, by any one or a combination of the following technologies: a discrete logic circuit having logic gates for implementing logic functions on data signals, an application-specific integrated circuit having appropriate combinational logic gates, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like. It can be appreciated that the hardware described above for implementing the functions of the control module 18 and/or the processing module 15 may be provided within the photoelectric detection device 10, or may be disposed at other positions of the electronic device 1, such as on the main circuit board of the electronic device 1.
It should be noted that the technical solutions claimed in the present application may satisfy only one of the above embodiments or satisfy a plurality of the above embodiments at the same time, that is, an embodiment in which one or more of the above embodiments are combined also belongs to the protection scope of the present application.
In the description herein, references to the description of the terms "one embodiment," "certain embodiments," "an illustrative embodiment," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
The logic and/or steps represented in the flowcharts or otherwise described herein may be considered as an ordered listing of executable instructions for implementing logical functions, and can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the "computer-readable medium" include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Further, the "computer-readable medium" may even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory.
It should be understood that portions of embodiments of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in a storage medium and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (10)

1. A photodetection device, characterized in that it comprises:
an emission module comprising an emission optical device and a plurality of light-emitting units, wherein the emission optical device is configured to modulate the light beams emitted by the light-emitting units into a plurality of sensing light beams respectively having preset emission directions and to emit the sensing light beams to a target scene, and the preset emission direction of each sensing light beam is different from the preset emission directions of the other sensing light beams;
a receiving module including receiving optics and a plurality of light-sensitive pixels, the receiving optics configured to transmit photons from the target scene to the plurality of light-sensitive pixels, the light-sensitive pixels configured to respond to the received photons and output corresponding light-sensitive signals, the number of light-sensitive pixels being less than the number of light-emitting units; and
a processing module configured to process the light-induced signals to obtain times at which the sensing beams reflected back by objects in a target scene are received by the light-sensitive pixels.
2. The photodetection device according to claim 1, wherein the processing module is configured to determine the flight time of the sensing light beam according to the difference between the time of receipt and the time of emission of the sensing light beam reflected back from the target scene, and to determine the coordinate information of the object reflecting the sensing light beam in the target scene according to the emission direction and the flight time of the sensing light beam.
3. The photodetection device according to claim 2, wherein the emission direction of the sensing light beam is defined by parameter values in a coordinate system of the emission module.
4. The photodetection device according to claim 3, wherein the emission module coordinate system is established such that the direction perpendicular to the light-emitting surface of the emission module is the positive z-axis direction and the x axis and the y axis are located in the light-emitting surface of the emission module, the included angle between the emission direction of the sensing light beam and the positive z-axis direction is the polar angle of the emission direction, the included angle between the projection of the emission direction of the sensing light beam on the xy plane of the emission module coordinate system and the positive x-axis direction is the azimuth angle of the emission direction, and the emission direction of the sensing light beam is defined by its polar angle and azimuth angle in the emission module coordinate system.
5. The photodetection device according to claim 2, wherein the processing module is configured to obtain distance information between the object reflecting the sensing light beam and the emission module according to the flight time of the sensing light beam, and to determine coordinate information of the object reflecting the sensing light beam in the emission module coordinate system according to the distance information between the object and the emission module and the emission direction of the sensing light beam.
6. The photodetection device according to claim 1, further comprising a control module configured to control the emission module to emit the plurality of sensing light beams having different emission directions respectively in different periods.
7. The photodetection device according to claim 6, wherein each of the photosensitive pixels has a corresponding sensing area in the target scene, the sensing light beam reflected back from the sensing area is transmitted to the corresponding photosensitive pixel for reception, and the control module is configured to control the emission module to emit, in the same period, a sensing light beam that correspondingly illuminates one of the sensing areas.
8. The photodetection device according to claim 7, wherein a plurality of sensing light beams having different emission directions emitted by the emission module in the same period are respectively emitted into a plurality of different sensing areas in a one-to-one correspondence.
9. The photodetection device according to claim 6, wherein each of the photosensitive pixels has a corresponding sensing area in the target scene, the sensing light beams reflected back from the sensing areas are transmitted to the corresponding photosensitive pixels for reception, and the control module is configured to control the emission module to emit a plurality of sensing light beams with different emission directions to a plurality of sensing sub-areas at different positions in the same sensing area respectively in a plurality of different periods, the plurality of sensing sub-areas being arranged over the entire sensing area.
10. An electronic device, comprising the photodetection device according to any one of claims 1-9, wherein the electronic device further comprises an application module configured to implement a corresponding function according to the coordinate information of the object obtained by the photodetection device.
CN202210090126.7A 2022-01-25 2022-01-25 Photoelectric detection device and electronic equipment Pending CN114690155A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210090126.7A CN114690155A (en) 2022-01-25 2022-01-25 Photoelectric detection device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210090126.7A CN114690155A (en) 2022-01-25 2022-01-25 Photoelectric detection device and electronic equipment

Publications (1)

Publication Number Publication Date
CN114690155A true CN114690155A (en) 2022-07-01

Family

ID=82137599

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210090126.7A Pending CN114690155A (en) 2022-01-25 2022-01-25 Photoelectric detection device and electronic equipment

Country Status (1)

Country Link
CN (1) CN114690155A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118091609A (en) * 2024-04-29 2024-05-28 深圳阜时科技有限公司 Receiving module, MEMS galvanometer laser radar system and electronic equipment
CN118091608A (en) * 2024-04-29 2024-05-28 深圳阜时科技有限公司 Transmitting module, MEMS galvanometer laser radar system and electronic equipment


Similar Documents

Publication Publication Date Title
CN108885263B (en) LIDAR-based 3D imaging with variable pulse repetition
CN111722241B (en) Multi-line scanning distance measuring system, method and electronic equipment
CN111830530B (en) Distance measuring method, system and computer readable storage medium
CN109557522A (en) Multi-beam laser scanner
CN112805595B (en) Laser radar system
CN114690155A (en) Photoelectric detection device and electronic equipment
CN111487639A (en) Laser ranging device and method
EP4206726A1 (en) Laser light source, light emission unit, and lidar
CN111796295A (en) Collector, manufacturing method of collector and distance measuring system
CN112558105A (en) Laser radar system and control method of laser radar system
CN114236496A (en) Emission module, optical detection device and electronic equipment
CN108828559B (en) Laser radar device and laser radar system
CN114966620B (en) Photoelectric detection device and electronic equipment
CN114935742B (en) Emission module, photoelectric detection device and electronic equipment
CN114935743B (en) Emission module, photoelectric detection device and electronic equipment
CN111487603B (en) Laser emission unit and manufacturing method thereof
KR102567502B1 (en) Time of flight apparatus
CN210835244U (en) 3D imaging device and electronic equipment based on synchronous ToF discrete point cloud
CN111796296A (en) Distance measuring method, system and computer readable storage medium
CN114720959A (en) Photoelectric detection device, electronic equipment and three-dimensional information detection method
CN213903798U (en) Distance measuring system with dual light-emitting modes
CN114924257B (en) Receiving module, photoelectric detection device and electronic equipment
US20230072058A1 (en) Omni-view peripheral scanning system with integrated mems spiral scanner
CN115453548A (en) Laser radar detection method and laser radar
CN114236504A (en) dToF-based detection system and light source adjusting method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination