CN114866703A - Active exposure method and device based on TOF imaging system and electronic equipment - Google Patents


Info

Publication number
CN114866703A
Authority
CN
China
Prior art keywords
frame
automatic exposure
target scene
exposure time
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110146847.0A
Other languages
Chinese (zh)
Inventor
章炳刚
郑梁超
张定乾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Sunny Optical Intelligent Technology Co Ltd
Original Assignee
Zhejiang Sunny Optical Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Sunny Optical Intelligent Technology Co Ltd filed Critical Zhejiang Sunny Optical Intelligent Technology Co Ltd
Priority to CN202110146847.0A priority Critical patent/CN114866703A/en
Publication of CN114866703A publication Critical patent/CN114866703A/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application provides an automatic exposure method based on a TOF imaging system. The method comprises the following steps: acquiring gray information of the target scene image of a calculation frame within the current frame; calculating an automatic exposure time according to the acquired gray information; and adjusting the exposure time of a plurality of subframes following the calculation frame within the current frame according to the automatic exposure time. A TOF imaging system adopting this automatic exposure method can calculate the automatic exposure time from the target scene of the current frame and apply it within that same frame during imaging, which helps improve measurement precision and imaging accuracy.

Description

Active exposure method and device based on TOF imaging system and electronic equipment
Technical Field
The present application relates to the field of machine vision, and more particularly, to an automatic exposure method, an automatic exposure apparatus, an electronic device, a computer-readable medium, and a TOF imaging system based on a TOF imaging system.
Background
At present, more and more mobile robots are equipped with visual sensors such as Time-of-Flight (TOF) cameras to sense the external environment, so as to realize automatic navigation and obstacle avoidance.
A TOF camera obtains its distance from the target scene by emitting light toward the scene, receiving the light reflected back with an image sensor, and calculating the round-trip (flight) time of the emitted and reflected light. An image taken by a TOF camera therefore carries, in addition to conventional two-dimensional image information, depth information describing the distance from the camera to the target scene, from which a three-dimensional image of the scene can be generated.
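The basic relation between flight time and distance can be sketched as follows; the helper name is illustrative, but the formula d = c·t/2 follows directly from the round trip described above:

```python
# Distance from round-trip flight time: the emitted light travels to the
# target and back, so the one-way distance is c * t / 2.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def distance_from_flight_time(round_trip_s: float) -> float:
    """Convert a measured round-trip time in seconds to a one-way distance in meters."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0

# A 10 ns round trip corresponds to roughly 1.5 m.
distance_m = distance_from_flight_time(10e-9)
```

For example, a 10 ns round trip corresponds to about 1.5 m, which gives a sense of the timing precision a TOF sensor must achieve.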
As the moving speed of the robot increases and scenes switch rapidly, the image generated by the TOF camera may be overexposed or underexposed, which can cause the robot to collide with an object in the target scene.
Disclosure of Invention
The present application provides an automatic exposure method based on a TOF imaging system, an automatic exposure apparatus, an electronic device, a computer-readable medium, and a TOF imaging system that may at least partially solve the above technical problems.
An aspect of the present application provides an automatic exposure method. The method comprises the following steps: acquiring gray information of the target scene image of a calculation frame within the current frame; calculating an automatic exposure time according to the acquired gray information; and adjusting the exposure time of a plurality of subframes following the calculation frame within the current frame according to the automatic exposure time.
In one embodiment, the calculation frame may include an active frame and an inactive frame following the active frame, and the step of acquiring the gray information may include: acquiring gray information of the target scene image of the active frame within the calculation frame.
In one embodiment, the exposure time within the inactive frame may be zero.
In one embodiment, the exposure time within the inactive frame may be a non-zero constant.
In one embodiment, the step of acquiring the gray information may include: acquiring the gray information of the calculation frame within the current frame according to the amplitude information of the target scene image of the calculation frame within the current frame.
In one embodiment, the gray information of the target scene image may include gray values of a plurality of pixels of the target scene image, and the step of calculating the automatic exposure time may include: calculating the automatic exposure time from the average of the gray values of the plurality of pixels.
In one embodiment, the method further comprises: adjusting the exposure time of the calculation frame in the next frame according to the automatic exposure time.
In one embodiment, the plurality of subframes following the calculation frame in the current frame are four subframes.
In one embodiment, the step of adjusting the exposure time of the plurality of subframes may comprise: each of the plurality of sub-frames is adjusted according to the same automatic exposure time.
In one embodiment, there may be a time interval between the current frame and the next frame.
Another aspect of the present application provides an automatic exposure apparatus, including: the acquisition module is used for acquiring the gray information of the target scene image of the calculation frame in the current frame; the calculation module is used for calculating the automatic exposure time according to the acquired gray information; and the adjusting module is used for adjusting the exposure time of a plurality of sub-frames after the calculation frame in the current frame according to the automatic exposure time.
In one embodiment, the calculation frame may include an active frame and an inactive frame following the active frame, and the gray scale information may be from a target scene image of the active frame in the calculation frame.
In one embodiment, the exposure time within the inactive frame may be zero.
In one embodiment, the exposure time within the inactive frame may be a non-zero constant.
In one embodiment, the adjusting module is further configured to adjust the exposure time of the calculation frame in the next frame according to the automatic exposure time.
Another aspect of the present application also provides an electronic device, including: a memory for storing a program; a processor for executing the program to perform any of the embodiments of the auto-exposure method as described above.
Another aspect of the present application also provides a computer-readable medium, on which a computer program is stored which, when executed by a processor, carries out any of the embodiments of the automatic exposure method described above.
Another aspect of the present application also provides a TOF imaging system. The TOF imaging system comprises: a light source module for emitting light to a target scene; a light sensing module for receiving light reflected via a target scene; and the working module is used for controlling the light source module and the photosensitive module and processing the reflected light received by the photosensitive module to execute any embodiment of the automatic exposure method.
A TOF imaging system adopting this automatic exposure method can calculate the automatic exposure time from the target scene of the current frame and apply it within that same frame during imaging, which helps improve measurement precision and imaging quality.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is a functional schematic diagram of a TOF imaging system according to an embodiment of the present application;
FIG. 2 is a block diagram schematic diagram of the components of a TOF imaging system according to an embodiment of the present application;
FIG. 3 is a flow chart of a method of automatic exposure based on a TOF imaging system according to an embodiment of the present application;
FIG. 4 is a diagram of a current frame format and a next frame format according to an embodiment of the application;
FIG. 5 is a diagram of a current frame format and a subsequent frame format in the prior art; and
fig. 6 is a schematic structural diagram of an electronic apparatus suitable for implementing the automatic exposure method according to the embodiment of the present application.
Detailed Description
For a better understanding of the present application, various aspects of the present application will be described in more detail with reference to the accompanying drawings. It should be understood that the detailed description is merely illustrative of exemplary embodiments of the present application and does not limit the scope of the present application in any way.
The terminology used herein is for the purpose of describing particular example embodiments and is not intended to be limiting. The terms "comprises," "comprising," "includes" and/or "including," when used in this specification, specify the presence of stated features, integers, elements, components, and/or groups thereof, but do not preclude the presence or addition of one or more other features, integers, elements, components, and/or groups thereof.
This description is made with reference to schematic illustrations of exemplary embodiments. The exemplary embodiments disclosed herein should not be construed as limited to the particular shapes and dimensions shown, but are to include various equivalent structures capable of performing the same function, as well as deviations in shapes and dimensions that result, for example, from manufacturing. The locations shown in the drawings are schematic in nature and are not intended to limit the location of the various components.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs. Terms such as those defined in commonly used dictionaries should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
A TOF imaging system can acquire the distance from an observation point (the TOF imaging system itself) to a target scene, and thereby obtain depth information for the image of the target scene to generate a three-dimensional image. TOF imaging systems also offer high real-time performance, stable precision, and a small footprint; they can be implemented as devices such as cameras and are widely applied in machine vision fields such as object recognition, three-dimensional reconstruction, and mobile robot navigation.
Fig. 1 is a schematic diagram of the operation of a TOF imaging system 100 according to an embodiment of the present application. As shown in FIG. 1, the TOF imaging system 100 is adapted to acquire images of a target scene 200. Unlike conventional imaging systems, the TOF imaging system 100 can acquire three-dimensional image information of the target scene 200 according to the time-of-flight (TOF) method. In simplified terms, light reflected from the target scene 200 enters the TOF imaging system 100 and is received by its image sensor; after a preliminary analysis of the incoming light, the image sensor converts the reflected light carrying image information into three-dimensional image information of the target scene 200, completing the capture of the target scene 200. During imaging, the exposure performance of the TOF imaging system 100 directly or indirectly affects the imaging quality of the target scene 200, especially when the scene switches too quickly relative to the TOF imaging system 100.
Fig. 2 is a block diagram schematic diagram of the components of a TOF imaging system 100 according to an embodiment of the application. As shown in FIG. 2, the TOF imaging system 100 can include a light source module 110, a light sensing module 120, and a working module 130.
The light source module 110 may actively emit light toward the target scene, which serves as the optical signal for detecting the target scene. The light source module 110 may include at least one light source, such as a laser light source or an LED light source. In some embodiments, the light source may be implemented as a Vertical Cavity Surface Emitting Laser (VCSEL). A core parameter of the light source is its center wavelength; infrared light at 850 nm or 940 nm can be used. Sunlight is strong at the 850 nm wavelength but weak at 940 nm, so using 940 nm infrared light as the light source reduces interference from outdoor sunlight acting as ambient light.
The optical signal emitted by the light source module 110 may be a modulated optical pulse or a continuous wave, and the frequency of the modulated optical pulse may reach 100 MHz.
The light sensing module 120 may receive reflected light formed after being reflected by the target scene, and a reflected light signal carried by the reflected light may include distance information between the target scene 200 and the TOF imaging system 100, and depth information of an image of the target scene 200 may be obtained by performing corresponding processing on the distance information.
The light sensing module 120 may include at least one image sensor, which may be, for example, a CCD image sensor or a CMOS image sensor having a high-speed shutter and exposure accumulation function. The image sensor may include a plurality of pixels arranged in a two-dimensional array; unlike a conventional image sensor, the image sensor applied to the TOF imaging system 100 further includes a modulation control unit, an A/D conversion unit, and the like for demodulating the reflected light signal and performing preliminary data processing. As a result, a single pixel of an image sensor applied to the TOF imaging system 100 is larger than that of a conventional image sensor; for example, a pixel of the image sensor of the TOF imaging system 100 may be 20 μm in size.
In some embodiments, the light sensing module 120 may further include a lens (not shown) disposed in a light sensing path of the image sensor for collecting light reflected via the target scene. The photosensitive module 120 may further include a filter element (not shown), such as an infrared band pass filter. The filter element is disposed between the image sensor and the lens, and it can be ensured that only reflected light having the same wavelength as the emitted light from the light source module 110 enters the image sensor. The filtering element improves the imaging quality of the TOF imaging system 100 by filtering out stray light.
The working module 130 is communicably connected to the light source module 110 and the light sensing module 120 to control the light source module 110 and the light sensing module 120 and perform data interaction. It should be understood that the communication connection is not limited to being implemented as a wired or wireless connection, such as an electrical connection, a signal connection, and the like.
In some embodiments, the operational module 130 may include a control unit 131. The control unit 131 may control the operating states of the light source module 110 and the light sensing module 120 by transmitting control signals to the light source module 110 and the light sensing module 120. For example, the control unit 131 may control the light source of the light source module 110 to emit the emission light having a preset wavelength and frequency or may control the exposure time of the light sensing module 120 to receive the reflected light. In other words, the light source module 110 and the light sensing module 120 can change their operating states by the operating module 130. Alternatively, the control unit 131 may transmit control signals to the light source module 110 and the light sensing module 120 synchronously, so that the light source module 110 and the light sensing module 120 operate synchronously, for improving the accuracy of the depth information of the target scene image acquired by the TOF imaging system 100.
In some embodiments, the work module 130 may further include a processing unit 132. The processing unit 132 is communicatively connected to the photosensitive module 120, and is configured to receive the image information of the target scene collected by the photosensitive module 120 for further processing to generate gray scale information or depth information about the image of the target scene, and further generate a three-dimensional image about the target scene.
The processing unit 132 is further communicably connected to the control unit 131, and is configured to derive control signals for the light source module 110 and the light sensing module 120 after performing corresponding calculations on the collected partial information about the target scene image, and to transmit those control signals to the control unit 131. For example, according to the control requirements of automatic exposure, the processing unit 132 may calculate and generate a control signal for performing an automatic exposure operation according to a certain automatic exposure method, and transmit it to the control unit 131, so that the control unit 131 controls the photosensitive module 120 to perform the automatic exposure operation.
The operation of the TOF imaging system 100 can be described as follows. The light source module 110 emits light having a predetermined wavelength, and this emitted light is reflected after encountering the surface of an object to be measured in the target scene. The light sensing module 120 can be controlled to rapidly receive the reflected light during different time periods to obtain gray information and depth information of the target scene. Specifically, after the light source module 110 emits the light, the photosensitive module 120 is rapidly exposed to acquire several pictures, from which the time difference or phase difference between the emitted and reflected light is calculated to obtain the distance, i.e., depth, information of the target scene object from the TOF imaging system 100. The gray information of the target scene can be acquired at the same time as the depth information, and integrating the depth information and the gray information yields a three-dimensional image of the target scene.
It should be understood that ranging methods can be divided into direct TOF measurement and indirect TOF measurement according to the modulation mode of the light source signal. In direct TOF measurement, the light signal is pulsed, and the distance from the target scene to the image sensor is obtained by measuring the time from emission to reception of a light pulse. In indirect TOF measurement, the emitted light signal is modulated into a periodic signal, such as a sinusoidal signal; its phase upon reaching the image sensor is shifted by the round-trip delay, and this phase shift is measured with a phase discrimination technique to obtain the distance from the target scene to the image sensor.
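For the indirect method, the phase-to-distance mapping can be sketched as follows; the function name and the 100 MHz example frequency (mentioned earlier for the light source) are illustrative:

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def distance_from_phase(phase_rad: float, mod_freq_hz: float) -> float:
    """Indirect TOF: map a measured phase shift of the modulated signal to a
    one-way distance. The unambiguous range is c / (2 * f_mod)."""
    return (phase_rad / (2.0 * math.pi)) * SPEED_OF_LIGHT / (2.0 * mod_freq_hz)

# At 100 MHz modulation the unambiguous range is about 1.5 m;
# a phase shift of pi lands halfway through that range.
d = distance_from_phase(math.pi, 100e6)
```

One design consequence visible here: a higher modulation frequency improves distance resolution but shortens the unambiguous range, which is why some TOF systems combine measurements at several modulation frequencies.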
The present application provides an automatic exposure method 1000, and the various steps are described in detail below with reference to the TOF imaging system 100 shown in fig. 2. The working module 130 may control the exposure time of the photosensitive module 120 through the method 1000, so that the photosensitive module 120 obtains the target scene image in an automatic exposure manner. The specific processing and cooperation of the various modules of the TOF imaging system 100 shown in fig. 2 can be further explained by describing the method 1000.
Fig. 3 is a flow chart of an automatic exposure method 1000 based on the TOF imaging system 100 according to an embodiment of the application. As shown in fig. 3, the method 1000 may include the steps of:
step S110, obtaining gray scale information of the target scene image of the calculation frame in the current frame.
And step S120, calculating the automatic exposure time according to the acquired gray scale information.
Step S130, adjusting the exposure time of a plurality of sub-frames after the calculation frame in the current frame according to the automatic exposure time.
In step S110, the light source module 110 actively emits light toward the target scene; the emitted light is reflected by the target scene, and the light sensing module 120 receives the reflected light to obtain gray information of the target scene image. The operation of the TOF imaging system 100 in acquiring an image of a target scene, and the functions of the hardware during that operation, are described in detail above and are not repeated here.
TOF imaging system 100 can acquire images of a target scene at a frame rate, and as the frame rate increases, TOF imaging system 100 can generate three-dimensional images of the target scene in real-time. Illustratively, a device such as a camera embedded in the TOF imaging system 100 may be mounted on a mobile robot, and based on SLAM (Simultaneous localization and mapping) technology, navigation and obstacle avoidance functions of the mobile robot are implemented.
Each frame (e.g., the current frame and the next frame) may be divided into a calculation frame and a plurality of subframes following the calculation frame. The calculation frame may carry gray information of the current frame's target scene image. The plurality of subframes may carry a set of related measurement values, which can be used to obtain the depth information of the target scene image of the current frame. It should be understood that the current frame described herein may be any one of the successive target scene images that the TOF imaging system 100 generates continuously during operation.
In some embodiments, the emitted light of the TOF imaging system 100 may be a periodic light signal with a preset wavelength, such as a sinusoidal signal. In that case, as in the ranging principle described above, the round-trip time of the optical signal may be calculated indirectly from the phase difference between the emitted and reflected light signals, thereby yielding the distance of the TOF imaging system 100 from the target scene. In other words, the depth information of the target scene image can be calculated from image information acquired from the reflected light signal at different phases. Therefore, by exposing at different phases of the periodic reflected light signal for a certain exposure time, target scene image information at a plurality of phase frames (subframes) can be acquired and the depth information of the target scene image calculated. For a sinusoidal light signal, four measurement values at reflected-signal phase intervals of 90 degrees can be collected, so the current frame can resolve the target scene depth information from the measurement values of four subframes. In other words, the plurality of subframes following the calculation frame of the current frame may be four subframes.
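A common way to combine four such 90-degree samples is the "four-bucket" demodulation sketched below; sign conventions vary between sensors, so this is an illustrative sketch rather than the specific method claimed in the application. It recovers the phase delay, an amplitude usable as gray information, and the ambient offset:

```python
import math

def four_phase_demodulate(a0, a90, a180, a270):
    """Recover phase delay, amplitude, and DC offset from four samples of a
    sinusoidal reflected signal taken 90 degrees apart."""
    i = a0 - a180
    q = a90 - a270
    phase = math.atan2(q, i) % (2.0 * math.pi)  # round-trip phase delay
    amplitude = math.hypot(i, q) / 2.0          # reflection strength, usable as gray info
    offset = (a0 + a90 + a180 + a270) / 4.0     # ambient light + DC component
    return phase, amplitude, offset
```

With ideal samples a_k = B + A·cos(kπ/2 − φ), this returns exactly (φ, A, B), so the same four subframe measurements yield both depth (via the phase) and gray information (via the amplitude).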
In some embodiments, the calculation frame may include an active frame for acquiring the gray information of the target scene image, and an inactive frame following the active frame. The gray information obtained in the calculation frame needs further processing to calculate the automatic exposure time for the current frame. Since the emitted light signal generally needs to be modulated at a high frequency, the time interval between the active frame and the plurality of subframes within the calculation frame is very short; adding the inactive frame after the active frame and before the subframes reserves the time required for the calculation in step S120.
In some embodiments, the exposure time within the inactive frame may be zero. In other words, the photosensitive module 120 performs no exposure operation during the inactive frame, which then simply serves as a time interval between the active frame and the following subframes.
In some embodiments, the exposure time within the inactive frame may be a non-zero constant. The inactive frame then performs its exposure not with the automatic exposure time of the current frame calculated from the gray information acquired by the active frame, but with a preset fixed exposure time. With this variant, the target scene image information acquired during the inactive frame can be used to detect ambient light, while sufficient time is still reserved for the calculation in step S120.
In addition, the manner of obtaining the gray scale information of the target scene may be to set the TOF imaging system 100 in a gray scale mode, so that two-dimensional image information of the target scene image is collected in the calculation frame, and the gray scale information of the target scene image of the calculation frame of the current frame is obtained by using the two-dimensional image information. Alternatively, when the TOF imaging system emits a periodic light signal, such as a sinusoidal light signal, the reflected light signal reflected via the target scene may also be periodic and the amplitude of the reflected light signal may be different due to the different reflectivity of the individual objects in the target scene. Therefore, the gray scale information of the target scene image of the calculation frame of the current frame can be obtained according to the amplitude information of the target scene image of the calculation frame of the current frame.
In step S120, the gray information of an image may be the set of gray values of its pixels. Gray values divide the range between black and white into levels (often on a logarithmic scale) and generally range from 0 to 255, where white is 255 and black is 0.
To keep measurement accuracy sufficiently stable when a dynamic target scene spans a large range of distances, the exposure time can be controlled. Exposure manifests visually as a change in gray level: higher exposure yields higher gray values and a brighter image, while lower exposure yields lower gray values and a darker image. Thus, the gray information of the target scene image of the calculation frame of the current frame can be supplied to the processing unit 132 of the TOF imaging system 100 to determine a matching automatic exposure time.
In some embodiments, the automatic exposure time of the current frame may be calculated according to an average value of gray values of respective pixels of the image, which serves as a reference for calculating the automatic exposure time. It should be understood that the method for calculating the automatic exposure time according to the embodiment of the present application is not limited thereto, and other methods for calculating the automatic exposure time, such as a local sampling method, may be used to calculate the automatic exposure time of the current frame.
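A minimal sketch of such a mean-gray rule follows; the proportional update, the target gray level of 128, and the clamping limits are illustrative assumptions, since the application only specifies that the average gray value serves as the reference:

```python
def auto_exposure_time(gray_values, current_exposure_us,
                       target_gray=128.0, min_us=10.0, max_us=2000.0):
    """Scale the exposure time so that the mean gray level moves toward a
    target value, clamped to a usable range (all limits here are
    illustrative assumptions, not values from the application)."""
    mean_gray = sum(gray_values) / len(gray_values)
    if mean_gray <= 0.0:
        return max_us  # fully dark frame: expose as long as allowed
    proposed = current_exposure_us * target_gray / mean_gray
    return max(min_us, min(max_us, proposed))

# An underexposed frame (mean gray 64) doubles the exposure time.
next_exposure = auto_exposure_time([64.0] * 100, current_exposure_us=100.0)
```

A local sampling method, as mentioned above, would differ only in which pixels contribute to the average.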
In step S130, the control unit 131 of the working module 130 adjusts the exposure time of each of the plurality of subframes of the current frame according to the automatic exposure time of the current frame calculated in step S120.
In some embodiments, the automatic exposure time calculated in step S120 can also be used for the exposure operation of the active frame in the next frame, so that the exposure time of the active frame in each frame is variable. Specifically, the exposure time of the active frame in the next frame may be adjusted according to the gray information of the target scene image obtained by the active frame in the current frame, so that the active frame of the next frame acquires the gray information of the next frame's target scene image with a more accurate exposure time.
In some embodiments, there may be a time interval between the current frame and the next frame, which may enable the TOF imaging system 100 to maintain a preset frame rate (e.g., 10fps) operation.
In some embodiments, when the exposure process is performed over the plurality of subframes, each subframe may be adjusted according to the same automatic exposure time acquired in step S120. This simplifies control of the exposure time of the subframes within the current frame, and using the same automatic exposure time also helps keep the measurement values acquired by the individual subframes consistent and accurate.
FIG. 4 shows a schematic of the format of the current frame F_k and the following frame F_{k+1} according to an embodiment of the present application. As shown in FIG. 4, the current frame F_k may include a calculation frame F_c and four subframes (first subframe F_1, second subframe F_2, third subframe F_3, and fourth subframe F_4). Over the complete current frame F_k, the TOF imaging system may acquire the target scene image information used to generate a three-dimensional image; this information may include depth information D and gray information G of the image. In the calculation frame F_c, the light sensing module 120 may obtain the gray information G of the target scene image; the gray information G may then be transmitted to the processing unit 132 in the working module 130, which applies an algorithm such as global gray averaging to the gray information G generated within the calculation frame F_c and calculates the automatic exposure time E_k suited to the current frame.
Illustratively, the calculation frame F_c may be further divided into an effective frame F_a and a following invalid frame F_na. Within the effective frame F_a, the light sensing module 120 may acquire the gray scale information G of the target scene image, while the invalid frame F_na provides a time interval for calculating the automatic exposure time E_k suitable for the current frame F_k.
Illustratively, there may be a time interval T between the current frame F_k and the next frame F_k+1, which allows the TOF imaging system 100 to maintain a certain frame rate during operation.
Within the first subframe F_1 through the fourth subframe F_4, the image sensor in the light sensing module 120 may be exposed for the automatic exposure time E_k to obtain four measurements used to resolve the depth information D of the target scene image. The depth information D of the target scene can be obtained by solving these four measurements. Combined with the gray scale information G of the target scene image acquired within the calculation frame F_c, a three-dimensional image of the target scene can then be generated by the processing unit 132 of the work module 130 over the complete current frame F_k.
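The four-measurement depth solution referred to above is not detailed in this passage; the sketch below shows a common four-phase continuous-wave ToF formulation for illustration only. The modulation frequency F_MOD and the function name are assumptions, not values from this application.

```python
# Illustrative four-phase CW-ToF depth solution (not necessarily the one
# used in this application). q0..q3 are the four correlation measurements
# sampled at phase offsets of 0, 90, 180, and 270 degrees.
import math

C = 299_792_458.0   # speed of light, m/s
F_MOD = 20e6        # modulation frequency in Hz (assumed)

def depth_from_measurements(q0, q1, q2, q3):
    """Recover the phase delay of the reflected light, then the distance."""
    phase = math.atan2(q3 - q1, q0 - q2)  # phase delay in (-pi, pi]
    phase %= 2 * math.pi                  # wrap into [0, 2*pi)
    # Unambiguous measurement range is C / (2 * F_MOD).
    return C * phase / (4 * math.pi * F_MOD)
```

A phase delay of zero (e.g., measurements 2, 1, 0, 1) maps to zero distance, and the distance grows linearly with the recovered phase up to the ambiguity range.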
It should be appreciated that, when the TOF imaging system 100 is in operation, it can continuously generate three-dimensional images of the target scene frame by frame, and the format of each frame can be the same. Thus, the above operations are repeated within each frame, and the light sensing module 120 is controlled to acquire the target scene image information in an automatic exposure manner, thereby realizing the automatic exposure function of the TOF imaging system 100.
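The frame format described above (effective frame, invalid frame, four subframes exposed with the same E_k, then the interval T) can be sketched as a simple timeline. All durations below are illustrative assumptions, not values taken from this application.

```python
# Hypothetical timeline of one frame F_k per the format described above.
# Durations are in microseconds and purely illustrative.
def frame_schedule(e_k_us, effective_us=500, invalid_us=1500, interval_us=90000):
    """Return (label, duration_us) pairs for F_a, F_na, F_1..F_4, and T."""
    events = [("F_a: gray acquisition", effective_us),
              ("F_na: E_k computation", invalid_us)]
    # All four depth subframes use the same automatic exposure time E_k.
    events += [(f"F_{i}: depth subframe", e_k_us) for i in range(1, 5)]
    events.append(("T: inter-frame interval", interval_us))
    return events

total_us = sum(duration for _, duration in frame_schedule(800))
```

With an 800 us exposure, the sketch yields seven timeline entries totalling 95.2 ms, consistent with a frame rate on the order of 10 fps when the interval T dominates.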
FIG. 5 shows a schematic representation of the formats of a current frame F'_k and a following frame F'_k+1 in an existing automatic exposure method. As shown in FIG. 5, in the conventional automatic exposure method, the current frame F'_k has four subframes (a first subframe F'_1, a second subframe F'_2, a third subframe F'_3, and a fourth subframe F'_4); that is, the target scene image information (gray scale information G' and depth information D') of the current frame is acquired through the four subframes, and the automatic exposure time E'_k of the current frame F'_k must be determined from the gray scale information of the target scene image information of the previous frame (not shown). In other words, the gray scale information G' of the target scene image information of the current frame F'_k is used to determine the automatic exposure time E'_k+1 of the following frame F'_k+1. Furthermore, a time interval T' may be set between the current frame F'_k and the following frame F'_k+1, which provides sufficient computation time for the automatic exposure time E'_k+1 of the following frame F'_k+1 while maintaining a certain frame rate for the TOF imaging system 100 during normal operation.
According to the existing automatic exposure method, the automatic exposure time E'_k executed in the current frame F'_k is determined from the target scene image of the previous frame F'_k-1. As a result, the executed automatic exposure time E'_k lags the acquired gray scale information G' of the target scene image. For example, when the TOF imaging system continuously generates three-dimensional images of a target scene at 10 fps, the automatic exposure time E'_k executed in the current frame F'_k lags the acquired gray scale information G' of the target scene image by 100 ms.
When a device embedding the TOF imaging system 100 is mounted on a mobile robot for obstacle avoidance and automatic navigation, three-dimensional images of the target scene must be generated continuously. When the relative pose between the mobile robot (i.e., the TOF imaging system) and the target scene changes rapidly, for example when an obstacle approaches quickly or the robot rotates rapidly, or when the frame rate of the TOF imaging system is low and the time interval between the current frame and the next frame is long, the target scene images acquired in adjacent frames differ greatly. In this case, an automatic exposure time for the current frame that is determined from the gray scale information of the target scene image acquired in the previous frame is not suitable for exposing the current frame. In other words, if the gray scale information of the target scene image of the previous frame is used to determine the automatic exposure time of the current frame, the target scene image generated in the current frame may be overexposed or underexposed, which also affects the accuracy of the depth information of the target scene image.
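The lag discussed above is simply one frame period of the system. A minimal sketch of this arithmetic, assuming the conventional scheme in which the previous frame's gray scale information drives the current frame's exposure:

```python
# Age of the previous frame's gray scale information at the moment the
# current frame's exposure executes, under the conventional scheme.
def exposure_lag_ms(frame_rate_fps):
    """One frame period, in milliseconds."""
    return 1000.0 / frame_rate_fps
```

At 10 fps this gives the 100 ms lag cited above; a lower frame rate makes the stale gray scale information even older, which is why fast scene changes break the conventional method.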
The automatic exposure method provided by the present application exposes the current frame using an automatic exposure time determined from the gray scale information of the target scene image of the current frame itself. The time lag in executing the automatic exposure time of the current frame is thus avoided, so that the target scene image of the current frame is correctly exposed and the accuracy of the depth information of the current frame's target scene image is improved.
Fig. 6 is a schematic structural diagram of an electronic device 300 suitable for implementing the automatic exposure method according to an embodiment of the present application. It should be understood that the electronic device shown in Fig. 6 is only an example and should not impose any limitation on the function and scope of use of the embodiments of the present application.
As shown in Fig. 6, the electronic device 300 includes one or more processors 301 (e.g., CPUs) that can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 302 or a program loaded from a storage section 306 into a random access memory (RAM) 303. The RAM 303 also stores various programs and data necessary for the operation of the electronic device 300. The processor 301, the ROM 302, and the RAM 303 are connected to one another via a bus 304. An input/output (I/O) interface 305 is also connected to the bus 304.
The following components are connected to the I/O interface 305: a storage section 306 such as a hard disk; and a communication section 307 including a network interface card such as a LAN card or a modem. The communication section 307 performs communication processing via a network such as the Internet. A drive 308 is also connected to the I/O interface 305 as necessary. A removable medium 309, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 308 as necessary, so that a computer program read from it can be installed into the storage section 306 as needed.
Those skilled in the art will appreciate that all or part of the steps of the methods in the above embodiments may be implemented by a program instructing related hardware. The program may be stored in a computer-readable storage medium and, when executed, performs the following steps: S110, acquiring the gray scale information of the target scene image of the calculation frame in the current frame; S120, calculating the automatic exposure time according to the acquired gray scale information; and S130, adjusting the exposure time of the plurality of subframes following the calculation frame in the current frame according to the automatic exposure time. The storage medium may be, for example, a ROM/RAM, a magnetic disk, or an optical disk.
The computer program may be divided into a plurality of modules, which are stored in the memory and executed by the processor. Each module may be a series of computer program instruction segments capable of performing a specific function; the instruction segments describe the execution of the computer program in the automatic exposure apparatus. For example, the computer program may be divided into an acquisition module, a calculation module, and an adjustment module, whose specific functions are as follows:
the acquisition module, configured to acquire the gray scale information of the target scene image of the calculation frame in the current frame;
the calculation module, configured to calculate the automatic exposure time according to the acquired gray scale information; and
the adjustment module, configured to adjust the exposure time of the plurality of subframes following the calculation frame in the current frame according to the automatic exposure time.
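The three-module division above can be sketched as follows. The class name, target gray level, and exposure bounds are hypothetical; the calculation step follows the global gray averaging strategy described earlier in this application.

```python
# Hypothetical sketch of the acquisition / calculation / adjustment modules.
# Target gray level and exposure bounds are assumed, not from the patent.
import numpy as np

class AutoExposureDevice:
    def __init__(self, target_gray=128.0, e_min_us=50, e_max_us=2000):
        self.target_gray = target_gray  # desired mean gray level (assumed)
        self.e_min_us = e_min_us        # exposure bounds in microseconds (assumed)
        self.e_max_us = e_max_us

    def acquire(self, calc_frame_image):
        """Acquisition module (S110): gray info of the calculation frame."""
        return float(np.mean(calc_frame_image))

    def calculate(self, mean_gray, prev_exposure_us):
        """Calculation module (S120): scale exposure toward the target gray."""
        scale = self.target_gray / max(mean_gray, 1.0)  # avoid division by zero
        e = prev_exposure_us * scale
        return int(min(max(e, self.e_min_us), self.e_max_us))

    def adjust(self, exposure_us, n_subframes=4):
        """Adjustment module (S130): same E_k for all subframes after F_c."""
        return [exposure_us] * n_subframes
```

For a calculation-frame image whose mean gray is half the target, the sketch doubles the previous exposure and applies that single value to all four subframes, matching the single-E_k behavior described above.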
The above description covers only preferred embodiments of the present application and illustrates the technical principles employed. Those skilled in the art will appreciate that the scope of the invention referred to in the present application is not limited to embodiments formed by the specific combination of the above technical features, and also covers other embodiments formed by any combination of the above technical features or their equivalents without departing from the inventive concept. For example, the above features may be replaced by (but are not limited to) technical features with similar functions disclosed in the present application.

Claims (10)

1. An automatic exposure method based on a TOF imaging system, characterized by comprising the following steps:
acquiring gray scale information of a target scene image of a calculation frame in a current frame;
calculating an automatic exposure time according to the acquired gray scale information; and
adjusting exposure times of a plurality of subframes following the calculation frame in the current frame according to the automatic exposure time.
2. The automatic exposure method according to claim 1, wherein the calculation frame comprises an effective frame and an invalid frame following the effective frame, and wherein the step of acquiring the gray scale information comprises:
acquiring the gray scale information of the target scene image of the effective frame within the calculation frame.
3. The automatic exposure method according to claim 2, wherein the exposure time within the invalid frame is zero.
4. The automatic exposure method according to claim 2, wherein the exposure time within the invalid frame is a non-zero constant.
5. The automatic exposure method according to any one of claims 1 to 4, wherein the step of acquiring the gray scale information comprises:
acquiring the gray scale information of the calculation frame in the current frame according to amplitude information of the target scene image of the calculation frame in the current frame.
6. The automatic exposure method according to any one of claims 1 to 4, wherein the gray scale information of the target scene image comprises gray scale values of a plurality of pixels of the target scene image, and wherein the step of calculating the automatic exposure time comprises:
calculating the automatic exposure time according to an average value of the gray scale values of the plurality of pixels.
7. An automatic exposure apparatus, characterized by comprising:
an acquisition module, configured to acquire gray scale information of a target scene image of a calculation frame in a current frame;
a calculation module, configured to calculate an automatic exposure time according to the acquired gray scale information; and
an adjustment module, configured to adjust exposure times of a plurality of subframes following the calculation frame in the current frame according to the automatic exposure time.
8. An electronic device, comprising:
a memory for storing a program;
a processor for performing the method of any one of claims 1 to 6 when executing the program.
9. A computer-readable medium, characterized in that a computer program is stored thereon which, when executed by a processor, implements the method according to any one of claims 1 to 6.
10. A TOF imaging system, characterized by comprising:
a light source module for emitting light toward a target scene;
a light sensing module for receiving the light reflected by the target scene; and
a work module for controlling the light source module and the light sensing module, and for processing the reflected light received by the light sensing module to perform the method of any one of claims 1 to 6.
CN202110146847.0A 2021-02-03 2021-02-03 Active exposure method and device based on TOF imaging system and electronic equipment Pending CN114866703A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110146847.0A CN114866703A (en) 2021-02-03 2021-02-03 Active exposure method and device based on TOF imaging system and electronic equipment

Publications (1)

Publication Number Publication Date
CN114866703A 2022-08-05

Family

ID=82623692

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110146847.0A Pending CN114866703A (en) 2021-02-03 2021-02-03 Active exposure method and device based on TOF imaging system and electronic equipment

Country Status (1)

Country Link
CN (1) CN114866703A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105635597A (en) * 2015-12-21 2016-06-01 湖北工业大学 Auto-exposure method and system for vehicle-mounted camera
CN106851123A (en) * 2017-03-09 2017-06-13 广东欧珀移动通信有限公司 Exposal control method, exposure-control device and electronic installation
CN109819174A (en) * 2017-11-22 2019-05-28 浙江舜宇智能光学技术有限公司 Automatic explosion method and automatic exposure time calculation method and TOF camera based on TOF imaging system
CN109903324A (en) * 2019-04-08 2019-06-18 京东方科技集团股份有限公司 A kind of depth image acquisition method and device
CN111372005A (en) * 2018-12-25 2020-07-03 浙江舜宇智能光学技术有限公司 Automatic exposure compensation method and system for TOF camera module

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117154530A (en) * 2023-11-01 2023-12-01 江苏博睿光电股份有限公司 High-power VCSEL laser and manufacturing method thereof
CN117154530B (en) * 2023-11-01 2024-02-02 江苏博睿光电股份有限公司 High-power VCSEL laser and manufacturing method thereof


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination