WO2021170114A1 - Method and device for acquiring a depth image, and display device - Google Patents
- Publication number: WO2021170114A1
- Application: PCT/CN2021/078271
- Authority: WIPO (PCT)
Classifications
- H04N23/55 — Optical parts specially adapted for electronic image sensors; mounting thereof (H: Electricity; H04N: Pictorial communication, e.g. television; H04N23/00: Cameras or camera modules comprising electronic image sensors, and control thereof)
- G06T7/55 — Depth or shape recovery from multiple images (G: Physics; G06T: Image data processing or generation, in general; G06T7/00: Image analysis)
- H04N23/57 — Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
Definitions
- the present disclosure relates to the field of display technology, and in particular to a method, device and display device for acquiring a depth image.
- a depth camera can obtain the depth information of the scene in an image (that is, the distance between the scene and the camera lens) while shooting the image. Because of this advantage, depth cameras are widely used in technical fields such as gesture interaction, stereoscopic display, machine vision, and satellite remote sensing.
- the present disclosure provides a method, device and display device for acquiring a depth image.
- a method for acquiring a depth image for use in an image depth sensing component, wherein a filter layer is provided outside the image depth sensing component, and the method includes:
- emitting reference light with a preset phase toward the filter layer;
- acquiring at least four output images by receiving reflected light, wherein the reception phases of the at least four output images have different phase differences from the preset phase, and the reflected light is the light reflected by an object after the reference light passes through the filter layer;
- the method further includes:
- obtaining the filter parameters of the filter layer includes:
- the undetermined filter parameter formula includes:
- H is the transfer function matrix of the filter layer
- H* is the adjoint matrix of the transfer function matrix H
- H_w is the undetermined filter parameter
- γ is the undetermined coefficient
- the at least four output images are respectively convolved with the undetermined filter parameter to obtain at least four grayscale images, and the at least four first undetermined coefficients corresponding to the highest sharpness of the at least four grayscale images are determined;
- the target coefficient is substituted, as the undetermined coefficient, into the undetermined filter parameter formula to obtain the filter parameter.
- the obtaining at least four grayscale images according to the filtering parameters and the at least four output images includes:
- the at least four output images are respectively convolved with the filter parameter to obtain the at least four grayscale images.
- the number of the at least four output images is four, and the four output images include a first output image, a second output image, a third output image, and a fourth output image, wherein:
- the phase difference between the reception phase of the first output image and the preset phase is 0°
- the at least four grayscale images are a first grayscale image corresponding to the first output image, a second grayscale image corresponding to the second output image, a third grayscale image corresponding to the third output image, and a fourth grayscale image corresponding to the fourth output image; obtaining a depth image according to the at least four grayscale images includes:
- the depth image is obtained according to the gray value of each pixel and the depth image formula, and the depth image formula includes:
- A_1, A_2, A_3 and A_4 are respectively the gray values of the pixels at the same position in the first, second, third, and fourth grayscale images
- L is the depth of field parameter corresponding to each pixel in the depth image
- Δφ is the phase difference corresponding to each pixel in the depth image
- C is the speed of light
- π is the ratio of a circle's circumference to its diameter
- f is the frequency of the reference light.
- the method further includes:
- the undetermined filter parameter formula includes:
- H is the transfer function matrix of the filter layer
- H* is the adjoint matrix of the transfer function matrix H
- H_w is the undetermined filter parameter
- γ is the undetermined coefficient
- the at least four output images are respectively convolved with the undetermined filter parameter to obtain at least four grayscale images, and the at least four first undetermined coefficients corresponding to the highest sharpness of the at least four grayscale images are determined;
- the obtaining at least four grayscale images according to the filtering parameters and the at least four output images includes:
- the number of the at least four output images is four, and the four output images include a first output image, a second output image, a third output image, and a fourth output image, wherein:
- the phase difference between the reception phase of the first output image and the preset phase is 0°
- the at least four grayscale images are a first grayscale image corresponding to the first output image, a second grayscale image corresponding to the second output image, a third grayscale image corresponding to the third output image, and a fourth grayscale image corresponding to the fourth output image; obtaining a depth image according to the at least four grayscale images includes:
- the depth image is obtained according to the gray value of each pixel and the depth image formula, and the depth image formula includes:
- A_1, A_2, A_3 and A_4 are respectively the gray values of the pixels at the same position in the first, second, third, and fourth grayscale images
- L is the depth of field parameter corresponding to each pixel in the depth image
- Δφ is the phase difference corresponding to each pixel in the depth image
- C is the speed of light
- π is the ratio of a circle's circumference to its diameter
- f is the frequency of the reference light.
- in another aspect, a display device includes a display panel and an image depth sensing component; the display panel includes a filter layer, and the image depth sensing component is configured to perform the above depth image acquisition method.
- the image depth sensing component is located outside the display panel.
- the image depth sensing component is located inside the display panel.
- the display panel includes a laminated transparent substrate, the filter layer, and a transparent packaging film layer, and the image depth sensing component is located on a side of the transparent substrate away from the transparent packaging film layer.
- the filter layer includes a stacked light-emitting unit layer and a mask layer, and the mask layer is located on a side of the filter layer close to the image depth sensing component.
- the material of the mask layer includes a molybdenum metal material.
- the display panel further includes an anode conductive layer, and the anode conductive layer and the mask layer are an integral structure.
- a depth image acquisition device which includes an image depth sensing component and a filter layer located outside the image depth sensing component;
- the image depth sensing component is configured to emit reference light with a preset phase to the filter layer
- the image depth sensing component is configured to obtain at least four output images by receiving reflected light, wherein the reception phases of the at least four output images have different phase differences from the preset phase, and the reflected light is the light reflected by the object after the reference light passes through the filter layer;
- the image depth sensing component is configured to obtain at least four grayscale images according to the filter parameters of the filter layer and the at least four output images;
- the image depth sensing component is used to obtain a depth image based on the at least four gray-scale images.
- the image depth sensing component is used to obtain an undetermined filter parameter formula
- the undetermined filter parameter formula includes:
- H is the transfer function matrix of the filter layer
- H* is the adjoint matrix of the transfer function matrix H
- H_w is the undetermined filter parameter
- γ is the undetermined coefficient
- the image depth sensing component is configured to convolve the at least four output images with the undetermined filter parameter to obtain at least four grayscale images, and to determine the at least four first undetermined coefficients corresponding to the highest sharpness of the at least four grayscale images;
- the image depth sensing component is used to determine the average value of the at least four first undetermined coefficients as the target coefficient
- the image depth sensing component is used to substitute the target coefficient, as the undetermined coefficient, into the undetermined filter parameter formula to obtain the filter parameter.
- the number of the at least four output images is four, and the four output images include a first output image, a second output image, a third output image, and a fourth output image, wherein:
- the phase difference between the reception phase of the first output image and the preset phase is 0°
- the at least four grayscale images are a first grayscale image corresponding to the first output image, a second grayscale image corresponding to the second output image, a third grayscale image corresponding to the third output image, and a fourth grayscale image corresponding to the fourth output image;
- the image depth sensing component is used to obtain the gray value of each pixel in each of the first grayscale image, the second grayscale image, the third grayscale image, and the fourth grayscale image.
- the image depth sensing component is configured to obtain the depth image according to the gray value of each pixel and a depth image formula, and the depth image formula includes:
- A_1, A_2, A_3 and A_4 are respectively the gray values of the pixels at the same position in the first, second, third, and fourth grayscale images
- L is the depth of field parameter corresponding to each pixel in the depth image
- Δφ is the phase difference corresponding to each pixel in the depth image
- C is the speed of light
- π is the ratio of a circle's circumference to its diameter
- f is the frequency of the reference light.
- the image depth sensing component is configured to convolve the at least four output images with the filter parameters to obtain the at least four grayscale images.
- FIG. 1 is a flowchart of a method for acquiring a depth image provided by an embodiment of the disclosure.
- Fig. 2 is another flow chart of a method for acquiring a depth image provided by an embodiment of the disclosure.
- FIG. 3 is a diagram of the main circuit composition of an image depth sensing component provided by an embodiment of the disclosure.
- FIG. 4 is a waveform diagram of receiving pixels of an image depth sensing component provided by an embodiment of the disclosure.
- Fig. 5 is a flow chart of obtaining filter parameters of a filter layer in the embodiment shown in Fig. 2.
- FIG. 6 is a schematic diagram of a display device provided by an embodiment of the disclosure.
- FIG. 7 is a top view of a pattern of a mask layer in a display device provided by an embodiment of the disclosure.
- Referring to FIG. 1, it is a flowchart of a method for acquiring a depth image provided by an embodiment of the present disclosure, which is used in an image depth sensing component (the image depth sensing component may also be referred to as an image depth sensing device or an image depth sensing structure; the embodiments of the present disclosure do not limit this).
- a filter layer is provided outside the image depth sensing component, and the method includes:
- in step 101, reference light of a preset phase is emitted to the filter layer.
- the reference light is used to pass through the filter layer and be directed toward the object, and the object may be an object located on the side of the filter layer away from the image depth sensing component.
- in step 102, at least four output images are acquired by receiving the reflected light.
- the reflected light is the light reflected by the object after the reference light passes through the filter layer, and the received phases and the preset phases of the at least four output images have different phase differences respectively.
- in step 103, at least four grayscale images are obtained according to the filter parameters of the filter layer and the at least four output images.
- in step 104, a depth image is obtained based on the at least four grayscale images.
- In summary, the depth image acquisition method emits reference light with a preset phase through the image depth sensing component, receives the reflected light produced when the reference light is reflected by an object, and acquires at least four output images whose reception phases have different phase differences from the preset phase of the reference light. The at least four output images are then restored according to the filter parameters of the filter layer to obtain at least four grayscale images, and the depth image is obtained from the at least four grayscale images. Since this method replaces the optical lens in a traditional depth camera with a filter layer, no complicated optical lens set is needed, and a depth image can be obtained with a relatively simple structure.
- the depth image acquisition method can be conveniently applied to the display device.
- FIG. 2 is a flowchart of a method for acquiring a depth image provided by an embodiment of the present disclosure, which is used in an image depth sensing component, and a filter layer is provided outside the image depth sensing component, and the method includes:
- in step 201, the image depth sensing component emits reference light of a preset phase to the filter layer.
- the reference light is used to pass through the filter layer and be directed toward the object, and the object may be an object located on the side of the filter layer away from the image depth sensing component.
- the reference light is emitted by the image depth sensing component; emitting light is one of the functions of the image depth sensing component.
- the preset phase can be determined in advance; once determined, the reference light is phase-modulated in advance so that it has the preset phase, which facilitates the subsequent phase-difference calculation.
- the method provided by the embodiments of the present disclosure can be applied to a depth image acquisition device, which includes the above-mentioned image depth sensing component and a filter layer.
- the object within the range that the reference light can irradiate can be the target object that is expected to obtain depth information.
- the depth of a certain point on the object refers to the distance between that point and the depth image acquisition device. By analyzing the different depths of multiple points, the posture of the object can be obtained; by analyzing how the depths of multiple points change over time, the movement trend of the object can be obtained, so as to analyze changes in the overall state of the object.
- in step 202, the image depth sensing component acquires at least four output images by receiving the reflected light.
- the reflected light is the light reflected by the object after the reference light passes through the filter layer, and the received phases of at least four output images and the preset phases of the reference light have different phase differences.
- the preset phase refers to the specific phase obtained by modulating the reference light when the image depth sensing component emits the reference light, and is the phase of the reference light.
- the reception phase refers to the specific phase of the light received by the image depth sensor component, and the image depth sensor component can be set to only receive light whose phase is the reception phase.
- the reference light is reflected by the object and returns to the depth image acquisition device; after being filtered by the filter layer, it is received by the image depth sensing component and forms an output image. In theory, because the reception phase can vary, countless different output images could be received, so the reception phases of these output images have different phase differences from the preset phase of the reference light.
- if the number of acquired output images is less than four, only approximate calculations can be used to obtain grayscale images from the output images and the depth image from the grayscale images, and the accuracy of the final depth image is not high.
- the method provided by the embodiment of the present disclosure selects to acquire at least four output images, and the received phases of the at least four output images and the preset phases of the reference light have different phase differences.
- the different phase differences between the received phases of the at least four output images and the preset phases of the reference light need to meet a certain distribution law, so that a grayscale image can be obtained from the output image, and a depth image can be obtained from the grayscale image.
- the number of at least four output images is four, and the four output images include the first output image, the second output image, the third output image, and the fourth output image, where:
- the phase difference between the reception phase of the first output image and the preset phase is 0°, and the phase differences corresponding to the second, third, and fourth output images are 90°, 180°, and 270°, respectively.
- the four output images are all grayscale images.
- the step of acquiring the output image in step 202 may be completed by the photosensitive circuit in the image depth sensing component.
- Each pixel of the image depth sensing component is equipped with a photosensitive circuit, and the photosensitive circuit is equipped with a capacitor.
- the change in capacitance value reflects the change in the amplitude of the light signal at each pixel, which is reflected in the grayscale image; from it, the gray value of each pixel is calculated.
- the structure of the photosensitive circuit of the image depth sensing component may be as shown in FIG. 3. DM1 and DM2 are two photoelectric switches arranged on one pixel in the image depth sensing component, which can sense light; the capacitors Ca and Cb correspond to the switches DM1 and DM2. When the gate voltage of DM1 is at the first level and the gate voltage of DM2 is at the second level (the first level being high relative to the second level), DM1 is turned on and DM2 is turned off, so the current is stored in Ca; conversely, it is stored in Cb. The voltage change generated on the capacitor Ca is a, and the voltage change generated on the capacitor Cb is b.
- Vrst in Figure 3 is the reset voltage, and Mr1 and Mr2 are transistors used for voltage reset.
- Fig. 3 also includes a constant current source CCS and a ground terminal GND.
- the receiving pixel waveform diagram of the image depth sensor component is shown in Figure 4.
- measurements of the transmit/receive timing fall into four cases: transmission and reception completely synchronized (that is, a phase difference of 0 degrees), and transmission and reception differing by 90 degrees, 180 degrees, and 270 degrees.
- a_0 − b_0 = (V_Aa + V_0a + V_R − Ga·2cos(T_rt/T)) − (V_Ab + V_0b + V_R + Gb·2cos(T_rt/T)).
- a_180 − b_180 = (V_Aa + V_0a + V_R + Ga·2cos(T_rt/T)) − (V_Ab + V_0b + V_R − Gb·2cos(T_rt/T)).
- a_90 − b_90 = (V_Aa + V_0a + V_R − Ga·2sin(T_rt/T)) − (V_Ab + V_0b + V_R + Gb·2sin(T_rt/T)).
- a_270 − b_270 = (V_Aa + V_0a + V_R + Ga·2sin(T_rt/T)) − (V_Ab + V_0b + V_R − Gb·2sin(T_rt/T)).
- a_0 is the voltage change generated on the capacitor Ca when transmission and reception differ by 0 degrees
- a_90 is the voltage change generated on the capacitor Ca when transmission and reception differ by 90 degrees
- a_180 is the voltage change generated on the capacitor Ca when transmission and reception differ by 180 degrees
- a_270 is the voltage change generated on the capacitor Ca when transmission and reception differ by 270 degrees
- b_0 is the voltage change generated on the capacitor Cb when transmission and reception differ by 0 degrees
- b_90 is the voltage change generated on the capacitor Cb when transmission and reception differ by 90 degrees
- b_180 is the voltage change generated on the capacitor Cb when transmission and reception differ by 180 degrees
- b_270 is the voltage change generated on the capacitor Cb when transmission and reception differ by 270 degrees
- V_Aa and V_Ab are the ambient light variations, V_0a and V_0b are the initial biases, V_R is the initial potential (equal to the value of Vrst at reset), Ga and Gb are the signal gains, T_rt is the time stored on the capacitor after the photoelectric switch is turned on, and T is the clk clock period.
- a_0 − b_0 = (V_Aa + V_0a + V_R) − (V_Ab + V_0b + V_R) − (Ga + Gb)·2cos(T_rt/T)
- a_180 − b_180 = (V_Aa + V_0a + V_R) − (V_Ab + V_0b + V_R) + (Ga + Gb)·2cos(T_rt/T)
- a_90 − b_90 = (V_Aa + V_0a + V_R) − (V_Ab + V_0b + V_R) − (Ga + Gb)·2sin(T_rt/T)
- a_270 − b_270 = (V_Aa + V_0a + V_R) − (V_Ab + V_0b + V_R) + (Ga + Gb)·2sin(T_rt/T)
- a_0 − b_0 is the gray value of a given pixel on the first output image when transmission and reception differ by 0 degrees, that is, A_1.
- a_90 − b_90 is the gray value of a given pixel on the second output image when transmission and reception differ by 90 degrees, that is, A_2.
- a_180 − b_180 is the gray value of a given pixel on the third output image when transmission and reception differ by 180 degrees, that is, A_3.
- a_270 − b_270 is the gray value of a given pixel on the fourth output image when transmission and reception differ by 270 degrees, that is, A_4.
- the gray value of each pixel in each output image can be obtained through the photosensitive circuit in each pixel.
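As a sketch of the differential readout just described, the gray values of one output image follow from the per-pixel difference a − b, in which the common-mode terms (ambient light, initial bias, reset potential) cancel. The array shapes and values below are illustrative, not taken from the patent.

```python
import numpy as np

def gray_image(a, b):
    """Gray values of one output image from the per-pixel voltage
    changes a (capacitor Ca) and b (capacitor Cb) for one transmit/
    receive phase offset; common-mode terms cancel in the difference."""
    return a - b

# Illustrative 2x2 pixel readouts for the 0-degree phase offset.
a0 = np.array([[1.30, 1.10],
               [1.20, 1.25]])
b0 = np.array([[1.00, 1.00],
               [1.00, 1.00]])
A1 = gray_image(a0, b0)  # gray values of the first output image
```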
- in step 203, the image depth sensing component obtains the filter parameters of the filter layer.
- obtaining the filter parameters of the filter layer in step 203 includes:
- in step 2031, the image depth sensing component obtains the undetermined filter parameter formula.
- the undetermined filter parameter formula includes:
- H is the transfer function matrix of the filter layer
- H* is the adjoint matrix of the transfer function matrix H
- H_w is the undetermined filter parameter
- γ is the undetermined coefficient.
- the transfer function matrix H in the undetermined filter parameter formula is obtained by actually measuring the filter layer after the filter layer in the depth image acquisition device is set.
- thus, the transfer function matrix H in the undetermined filter parameter formula is a known quantity.
- the adjoint matrix H* of the transfer function matrix H can be calculated from H, so the adjoint matrix H* in the undetermined filter parameter formula is also a known quantity.
- the other fixed quantities in the undetermined filter parameter formula are likewise known. Therefore, the only unknown quantities are the undetermined coefficient γ and the undetermined filter parameter H_w, and the above formula expresses the undetermined filter parameter in terms of the undetermined coefficient γ and known quantities.
- in step 2032, the image depth sensing component convolves the at least four output images with the undetermined filter parameter to obtain at least four grayscale images, and determines the at least four first undetermined coefficients corresponding to the highest sharpness of the at least four grayscale images.
- specifically, the undetermined coefficient γ can first be assigned values iteratively; each assigned value of γ is used to express the undetermined filter parameter, which is convolved with the at least four output images to obtain at least four grayscale images, and the sharpness of the grayscale images corresponding to each assignment is calculated.
- the sharpness can be calculated using a gradient algorithm.
- different values of the coefficient γ yield grayscale images of different sharpness after convolution.
- the at least four coefficients corresponding to the sharpest grayscale images are the first undetermined coefficients. It can be understood that the number of first undetermined coefficients is the same as the number of output images.
- in step 2033, the image depth sensing component determines the average value of the at least four first undetermined coefficients as the target coefficient.
- the average value may be an arithmetic average or geometric average of at least four first undetermined coefficients.
- in step 2034, the image depth sensing component substitutes the target coefficient into the undetermined filter parameter formula to obtain the filter parameter.
- because the clearest grayscale image obtained through a first undetermined coefficient is, in effect, a suitably sharpened version of the output image (for example, with sharpened edges), the clearest grayscale image may have some distortion compared with the real image.
- the clearest grayscale image may therefore not be the most suitable for analysis and for obtaining the depth image. For this reason, the at least four first undetermined coefficients can be averaged to obtain the final target coefficient, and the target coefficient is used as the coefficient in the undetermined filter parameter formula, which ensures the accuracy of the depth image while maintaining the sharpness of the grayscale images.
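Steps 2031 to 2034 can be sketched as a grid search over the undetermined coefficient γ. The patent's undetermined filter parameter formula appears only as an image in the original text; the Wiener-style form H_w = H*/(H·H* + γ), applied pointwise in the frequency domain, and the gradient-energy sharpness metric used below are assumptions for illustration, as are all names.

```python
import numpy as np

def restore(D, H, gamma):
    """Restore one output image D with the candidate filter parameter
    H_w(gamma); assumed Wiener-like form, applied in the frequency domain."""
    Hf = np.fft.fft2(H, s=D.shape)                # transfer function of the filter layer
    Hw = np.conj(Hf) / (np.abs(Hf) ** 2 + gamma)  # assumed H_w = H* / (H H* + gamma)
    return np.real(np.fft.ifft2(np.fft.fft2(D) * Hw))

def sharpness(img):
    """One simple gradient algorithm: the sum of squared gradients."""
    gy, gx = np.gradient(img)
    return float(np.sum(gx ** 2 + gy ** 2))

def target_coefficient(outputs, H, gammas):
    """Step 2032: per output image, the first undetermined coefficient is
    the gamma whose restored grayscale image is sharpest.
    Step 2033: the target coefficient is their arithmetic mean."""
    firsts = [max(gammas, key=lambda g: sharpness(restore(D, H, g)))
              for D in outputs]
    return sum(firsts) / len(firsts)

# Illustrative usage with a trivial (delta) transfer function, for which
# the smallest gamma always gives the sharpest restoration.
H = np.array([[1.0]])
outputs = [np.arange(16.0).reshape(4, 4) + k for k in range(4)]
gamma_t = target_coefficient(outputs, H, [0.01, 0.1, 1.0])
```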
- in step 204, the image depth sensing component obtains at least four grayscale images according to the filter parameters and the at least four output images.
- obtaining at least four grayscale images according to the filter parameters and at least four output images includes:
- At least four output images are convolved with the filter parameters to obtain at least four grayscale images.
- D represents the output image
- Hw represents the filter parameter
- S represents the grayscale image
- the embodiment of the present disclosure uses a filter layer to replace the traditional optical lens set. Therefore, the light reflected from the actual scene passes through the filter layer to obtain the resulting image.
- the above formula indicates that the filter parameter H_w of the filter layer is used to restore the resulting output image D and obtain the grayscale image S; it can be understood that the filter parameter H_w of the filter layer is effectively the inverse transfer function of the filter layer.
- the image restoration process in step 204 can be completed by using a processor integrated in the image depth sensing component without using other optical elements for optical transformation.
- the grayscale value of each pixel in each grayscale image is also obtained at the same time, so that these grayscale values can be used to calculate the depth value to obtain the depth image.
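The restoration in step 204 can be sketched as a plain spatial convolution of each output image with the filter parameter. The 3×3 identity kernel and the image values below are purely illustrative (a real filter parameter would come from step 203); the identity kernel is chosen only so the result is easy to check.

```python
import numpy as np

def convolve2d_same(D, Hw):
    """'Same'-size 2-D convolution of an output image D with the filter
    parameter kernel Hw, using zero padding at the borders."""
    kh, kw = Hw.shape
    ph, pw = kh // 2, kw // 2
    Dp = np.pad(D, ((ph, ph), (pw, pw)))
    Hr = Hw[::-1, ::-1]  # flip the kernel: convolution, not correlation
    S = np.empty_like(D, dtype=float)
    for i in range(D.shape[0]):
        for j in range(D.shape[1]):
            S[i, j] = np.sum(Dp[i:i + kh, j:j + kw] * Hr)
    return S

# The same filter parameter restores all four output images.
Hw = np.array([[0.0, 0.0, 0.0],
               [0.0, 1.0, 0.0],
               [0.0, 0.0, 0.0]])  # identity kernel, illustrative only
outputs = [np.arange(9.0).reshape(3, 3) + k for k in range(4)]
grays = [convolve2d_same(D, Hw) for D in outputs]
```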
- in step 205, the image depth sensing component obtains a depth image based on the at least four grayscale images.
- the at least four grayscale images are a first grayscale image corresponding to the first output image, a second grayscale image corresponding to the second output image, a third grayscale image corresponding to the third output image, and a fourth grayscale image corresponding to the fourth output image; obtaining a depth image based on the at least four grayscale images includes:
- the depth image is obtained according to the gray value of each pixel and the depth image formula.
- the depth image formula includes:
- ⁇ is the intermediate quantity
- A_1, A_2, A_3 and A_4 are respectively the gray values of the pixels at the same position in the first, second, third, and fourth grayscale images.
- L is the depth of field parameter corresponding to each pixel in the depth image
- Δφ is the phase difference corresponding to each pixel in the depth image
- C is the speed of light
- π is the ratio of a circle's circumference to its diameter
- f is the frequency of the reference light.
- step 205 is also a calculation step implemented by the processor in the image depth sensing component, and the photosensitive circuit in the image depth sensing component does not need to be used.
- the photosensitive circuit in the image depth sensing component plays its role in step 202.
- the display device includes a display panel
- the depth sensing component and the display panel can be powered by two separate driving power sources.
- the depth sensing component and the display panel may also share the same driving power supply, and the depth sensing component and the display panel may respectively correspond to two different power supply modules in the same driving power supply.
- the depth image acquisition method can emit reference light with a preset phase through the image depth sensing component, receive the reflected light produced when the reference light is reflected by an object, and obtain at least four output images whose received phases have different phase differences from the preset phase of the reference light. The at least four output images are then restored according to the filter parameters of the filter layer to obtain at least four grayscale images, and the depth image is obtained from the at least four grayscale images. Since this method replaces the optical lens in a traditional depth camera with a filter layer, no complicated optical lens set is needed, so the method can obtain a depth image with a relatively simple structure.
- the depth image acquisition method can be conveniently applied to the display device.
- FIG. 6 it is a structural diagram of a display device provided by an embodiment of the present disclosure.
- the display device includes a display panel 31 and an image depth sensing component 32.
- the display panel 31 includes a filter layer 311, and the image depth sensing component 32 is used to perform the depth image acquisition method.
- the image depth sensing component 32 is located outside the display panel 31.
- the image depth sensing component 32 can be directly arranged inside the display panel 31 and integrated with the display panel 31.
- the display panel 31 includes a transparent substrate 312, a filter layer 311 and a transparent packaging film layer 313 which are stacked, and the image depth sensor component 32 is located on the side of the transparent substrate 312 away from the transparent packaging film layer 313.
- the depth sensing component 32 and the display panel 31 may be powered by two separate driving power sources.
- the depth sensing component 32 may also share the same driving power source with the display panel 31, and the depth sensing component 32 and the display panel 31 respectively correspond to two different power modules in the same driving power source.
- the filter layer 311 includes a stacked light-emitting unit layer 3111 and a mask layer 3112, and the mask layer 3112 is located on the side of the filter layer 311 close to the image depth sensing component 32. Disposing the mask layer 3112 on this side can increase the light output rate of the display device and prevent the mask layer 3112 from blocking light emitted from the light-emitting unit layer 3111 toward the light-emitting side of the display panel 31.
- since the transparent substrate 312 and the transparent encapsulation film layer 313 in the display panel 31 are made of transparent materials, light transmittance is improved, so that the light reflected from the object can reach the image depth sensing component 32 more effectively.
- the material of the mask layer 3112 includes a molybdenum metal material.
- the specific pattern of the mask layer 3112 is shown in FIG. 7, and FIG. 7 is a top view of the mask layer 3112.
- the black area in FIG. 7 is a blocking area, which can block light
- the white area is a hollowed-out area, which can transmit light. Since the mask layer 3112 has the pattern shown in FIG. 7, the mask layer 3112 filters the light and encodes the transmitted light.
- the specific pattern can be set according to actual filtering and coding requirements, so that the mask layer 3112 has a specific coding function.
- the light-emitting unit layer 3111 may include at least one organic light-emitting diode (OLED) 31111, and the light-emitting unit layer 3111 is used to emit red light, green light, or blue light.
- OLED organic light-emitting diode
- the number of image depth sensors included in the image depth sensing assembly 32 is the same as the number of organic light emitting diodes 31111.
- the filter layer 311 is an incompletely transparent layered structure in the display panel and has a filtering effect. After the light reflected from the object enters the display panel, it first passes through the light-emitting unit layer 3111, where it is encoded once, and then passes through the mask layer 3112, where it is encoded a second time. After all the light reflected from the object has been encoded twice, the output image can be obtained. A suitable light-emitting unit layer 3111 and mask layer 3112 can be selected according to actual imaging requirements, so that their combination has filter coding characteristics suitable for imaging and sufficient coding capability.
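The two-stage encoding described above can be modeled, as a simplifying assumption, as two successive convolutions, so the filter layer's overall transfer function is the product of the two layers' individual transfer functions. A minimal sketch under that assumption (all function names and point spread functions are illustrative, not taken from the patent):

```python
import numpy as np

def combined_transfer(psf_emitter, psf_mask, shape):
    """Overall transfer function of two stacked coding layers, modeled
    as successive circular convolutions of their point spread
    functions: H = H_emitter * H_mask in the frequency domain."""
    return np.fft.fft2(psf_emitter, shape) * np.fft.fft2(psf_mask, shape)

def encode(scene, H):
    """Apply a transfer function to a scene (circular convolution)."""
    return np.real(np.fft.ifft2(np.fft.fft2(scene) * H))
```

Under this model, encoding a scene with the combined transfer function gives the same result as passing it through the two layers one after the other, which is why a single transfer function matrix can describe the whole filter layer during restoration.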
- since the shape of the organic light-emitting diode 31111 is relatively fixed, the light-emitting unit layer 3111 alone has only a limited, passive encoding capability. Based on this, the embodiment of the present disclosure also provides a mask layer 3112 that cooperates with the light-emitting unit layer 3111 to encode the light effectively and controllably.
- the light reflected by the object is generally divergent and chaotic, and cannot be directly collected effectively by photosensitive equipment. Therefore, a set of optical lenses is provided in a traditional camera to converge and encode the divergent, chaotic light reflected by the object; the image obtained after passing through the optical lenses is then restored to obtain the desired image.
- the filter layer 311 can also encode chaotic light, which is a good substitute for the optical lens in a traditional camera, so that the display device can have a higher degree of integration and a smaller volume.
- the encoding characteristics of the filter layer 311 are expressed by a transfer function matrix when performing image restoration. Different filter layers 311 can have different transfer function matrices. In an optional manner, a filter layer 311 with a corresponding transfer function matrix can be selected according to actual imaging requirements.
- the display panel further includes an anode conductive layer, and the anode conductive layer and the mask layer 3112 form an integrated structure; that is, the anode conductive layer and the mask layer 3112 may be one and the same structure, or the anode conductive layer and the mask layer 3112 may be two connected structures.
- using the anode conductive layer directly as the mask layer 3112 can reduce the number of layers of the display device, simplify its structure, and reduce cost.
- the working process of the display device may include:
- the image depth sensor component 32 emits reference light, passes through the display panel 31, illuminates the object and returns.
- the returned light passes through the display panel 31 and is collected by the image depth sensor component 32.
- the filter layer 311 replaces the optical lens in the traditional camera, so that the display device can have a higher integration level and a smaller volume.
- the photosensitive circuit of the image depth sensing component 32 can only obtain unrestored output images, but the image depth sensing component 32 also has the function of restoring images: it further restores the output images to obtain grayscale images, and determines the depth information according to the grayscale images. That is, although the image depth sensing component 32 has collected the output images, the final depth information is determined based on the restored grayscale images.
- the two steps of restoring the image and obtaining the grayscale image are both completed by the processor integrated in the image depth sensing component 32.
- the display device provided by the present disclosure includes a display panel 31 and an image depth sensing component 32; the display panel 31 includes a filter layer 311, and the image depth sensing component 32 is used to implement the depth image acquisition method. Since the filter layer 311 replaces the optical lens in a traditional depth camera, there is no need to provide a complicated optical lens group, so the filter layer and the image depth sensing component can be well integrated in the display device, allowing the display device to be used to obtain depth images.
- embodiments of the present disclosure also provide a depth image acquisition device, including an image depth sensing component and a filter layer located outside the image depth sensing component; the depth image acquisition device can be used to implement the depth image acquisition method provided in the above embodiments.
- the image depth sensing component is used to emit reference light with a preset phase to the filter layer
- the image depth sensing component is used to obtain at least four output images by receiving the reflected light.
- the received phases of the at least four output images respectively have different phase differences from the preset phase, and the reflected light is the light reflected by an object after the reference light passes through the filter layer.
- the image depth sensing component is used to obtain at least four grayscale images according to the filter parameters of the filter layer and at least four output images;
- the image depth sensing component is used to obtain a depth image based on at least four gray-scale images.
- the image depth sensing component is used to obtain the undetermined filter parameter formula
- the undetermined filter parameter formula includes:
- H is the transfer function matrix of the filter layer
- H * is the adjoint matrix of the transfer function matrix H
- H w is the undetermined filter parameter
- ⁇ is the undetermined coefficient
- the image depth sensing component is used to convolve the at least four output images with the undetermined filter parameter to obtain at least four grayscale images, and to determine the at least four first undetermined coefficients corresponding to the case where the sharpness of each of the at least four grayscale images is highest;
- the image depth sensing component is used to determine the average value of at least four first undetermined coefficients as the target coefficient
- the image depth sensing component is used to substitute the target coefficient, as the undetermined coefficient, into the undetermined filter parameter formula to obtain the filter parameter.
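The calibration procedure above can be sketched as follows. The undetermined filter parameter formula is not reproduced in the source, so this sketch assumes a Wiener-style form H w = H*/(|H|² + ε), and uses the variance of a discrete Laplacian as a stand-in sharpness metric; both of these, and all names below, are illustrative assumptions rather than the patent's own definitions:

```python
import numpy as np

def sharpness(img):
    # Variance of a discrete Laplacian as an assumed sharpness score.
    lap = (-4 * img
           + np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1))
    return lap.var()

def filter_parameter(H, eps):
    # Assumed Wiener-style form: H_w = conj(H) / (|H|^2 + eps).
    return np.conj(H) / (np.abs(H) ** 2 + eps)

def restore(output_img, H_w):
    # Convolve an output image with the filter parameter in the
    # frequency domain to recover a grayscale image.
    return np.real(np.fft.ifft2(np.fft.fft2(output_img) * H_w))

def calibrate(outputs, H, candidates):
    """For each output image pick the first undetermined coefficient
    giving the sharpest restoration, average those into the target
    coefficient, and return it with the resulting filter parameter."""
    firsts = [max(candidates,
                  key=lambda e: sharpness(restore(img, filter_parameter(H, e))))
              for img in outputs]
    target = float(np.mean(firsts))
    return target, filter_parameter(H, target)
```

Averaging the four per-image coefficients yields a single filter parameter that can then be convolved with all four output images, matching the structure of the steps described above.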
- the number of at least four output images is four, and the four output images include the first output image, the second output image, the third output image, and the fourth output image, where:
- the phase difference between the received phase of the first output image and the preset phase is 0°, the phase difference between the received phase of the second output image and the preset phase is 90°, the phase difference between the received phase of the third output image and the preset phase is 180°, and the phase difference between the received phase of the fourth output image and the preset phase is 270°
- the at least four grayscale images are a first grayscale image corresponding to the first output image, a second grayscale image corresponding to the second output image, a third grayscale image corresponding to the third output image, and a fourth grayscale image corresponding to the fourth output image.
- the image depth sensing component is used to obtain the gray value of each pixel in each gray image in the first gray image, the second gray image, the third gray image, and the fourth gray image;
- the image depth sensing component is used to obtain the depth image according to the gray value of each pixel and the depth image formula.
- the depth image formula includes:
- A 1 , A 2 , A 3 and A 4 are respectively the gray values of the pixels at the same position in the first grayscale image, the second grayscale image, the third grayscale image, and the fourth grayscale image
- L is the depth of field parameter corresponding to each pixel in the depth image
- φ is the phase difference corresponding to each pixel in the depth image
- C is the speed of light
- π is pi, the ratio of a circle's circumference to its diameter
- f is the frequency of the reference light.
- the image depth sensing component is used to convolve at least four output images with filter parameters to obtain at least four grayscale images.
- an embodiment of the present disclosure also provides a computer storage medium. The computer storage medium stores at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the steps of the depth image acquisition method provided in the above embodiments, for example, step 103 to step 104 in the embodiment shown in FIG. 1 and step 203 to step 205 in the embodiment shown in FIG. 2.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
Abstract
Description
Claims (20)
- A depth image acquisition method, used in an image depth sensing component, wherein a filter layer is provided outside the image depth sensing component, the method comprising: emitting reference light with a preset phase toward the filter layer; obtaining at least four output images by receiving reflected light, wherein the received phases of the at least four output images respectively have different phase differences from the preset phase, and the reflected light is the light reflected by an object after the reference light passes through the filter layer; obtaining at least four grayscale images according to filter parameters of the filter layer and the at least four output images; and obtaining a depth image according to the at least four grayscale images.
- The method according to claim 1, wherein before obtaining the at least four grayscale images according to the filter parameters of the filter layer and the at least four output images, the method further comprises: acquiring the filter parameters of the filter layer.
- The method according to any one of claims 1-3, wherein obtaining the at least four grayscale images according to the filter parameters and the at least four output images comprises: convolving the at least four output images respectively with the filter parameters to obtain the at least four grayscale images.
- The method according to any one of claims 1-4, wherein the different phase differences between the received phases of the at least four output images and the preset phase form an arithmetic sequence with a common difference of x, where x×n=360° and n is the total number of output images.
- The method according to claim 5, wherein the number of the at least four output images is four, and the four output images include a first output image, a second output image, a third output image, and a fourth output image, wherein: the phase difference between the received phase of the first output image and the preset phase is 0°, the phase difference between the received phase of the second output image and the preset phase is 90°, the phase difference between the received phase of the third output image and the preset phase is 180°, and the phase difference between the received phase of the fourth output image and the preset phase is 270°.
- The method according to claim 6, wherein the at least four grayscale images are a first grayscale image corresponding to the first output image, a second grayscale image corresponding to the second output image, a third grayscale image corresponding to the third output image, and a fourth grayscale image corresponding to the fourth output image, and obtaining the depth image according to the at least four grayscale images comprises: acquiring the gray value of each pixel in each of the first grayscale image, the second grayscale image, the third grayscale image, and the fourth grayscale image; and obtaining the depth image according to the gray value of each pixel and a depth image formula, the depth image formula comprising: where A 1 , A 2 , A 3 and A 4 are respectively the gray values of the pixels at the same position in the first grayscale image, the second grayscale image, the third grayscale image, and the fourth grayscale image, L is the depth of field parameter corresponding to each pixel in the depth image, φ is the phase difference corresponding to each pixel in the depth image, C is the speed of light, π is pi, and f is the frequency of the reference light.
- The method according to claim 1, wherein before obtaining the at least four grayscale images according to the filter parameters of the filter layer and the at least four output images, the method further comprises: acquiring an undetermined filter parameter formula, the undetermined filter parameter formula comprising: where H is the transfer function matrix of the filter layer, H * is the adjoint matrix of the transfer function matrix H, H w is the undetermined filter parameter, and ε is an undetermined coefficient; convolving the at least four output images respectively with the undetermined filter parameter to obtain at least four grayscale images, and determining the at least four first undetermined coefficients corresponding to the case where the sharpness of each of the at least four grayscale images is highest; determining the average value of the at least four first undetermined coefficients as a target coefficient; and substituting the target coefficient, as the undetermined coefficient, into the undetermined filter parameter formula to obtain the filter parameters. Obtaining the at least four grayscale images according to the filter parameters and the at least four output images comprises: convolving the at least four output images respectively with the filter parameters to obtain the at least four grayscale images. The number of the at least four output images is four, and the four output images include a first output image, a second output image, a third output image, and a fourth output image, wherein: the phase difference between the received phase of the first output image and the preset phase is 0°, the phase difference between the received phase of the second output image and the preset phase is 90°, the phase difference between the received phase of the third output image and the preset phase is 180°, and the phase difference between the received phase of the fourth output image and the preset phase is 270°. The at least four grayscale images are a first grayscale image corresponding to the first output image, a second grayscale image corresponding to the second output image, a third grayscale image corresponding to the third output image, and a fourth grayscale image corresponding to the fourth output image, and obtaining the depth image according to the at least four grayscale images comprises: acquiring the gray value of each pixel in each of the first grayscale image, the second grayscale image, the third grayscale image, and the fourth grayscale image; and obtaining the depth image according to the gray value of each pixel and a depth image formula, the depth image formula comprising: where A 1 , A 2 , A 3 and A 4 are respectively the gray values of the pixels at the same position in the first grayscale image, the second grayscale image, the third grayscale image, and the fourth grayscale image, L is the depth of field parameter corresponding to each pixel in the depth image, φ is the phase difference corresponding to each pixel in the depth image, C is the speed of light, π is pi, and f is the frequency of the reference light.
- A display device, comprising a display panel and an image depth sensing component, wherein the display panel comprises a filter layer, and the image depth sensing component is configured to perform the depth image acquisition method according to any one of claims 1-8.
- The display device according to claim 9, wherein the image depth sensing component is located outside the display panel.
- The display device according to claim 9, wherein the image depth sensing component is located inside the display panel.
- The display device according to claim 10, wherein the display panel comprises a stacked transparent substrate, the filter layer, and a transparent packaging film layer, and the image depth sensing component is located on the side of the transparent substrate away from the transparent packaging film layer.
- The display device according to claim 12, wherein the filter layer comprises a stacked light-emitting unit layer and a mask layer, and the mask layer is located on the side of the filter layer close to the image depth sensing component.
- The display device according to claim 12, wherein the material of the mask layer comprises a molybdenum metal material.
- The display device according to claim 13, wherein the display panel further comprises an anode conductive layer, and the anode conductive layer and the mask layer form an integrated structure.
- A depth image acquisition device, comprising an image depth sensing component and a filter layer located outside the image depth sensing component, wherein: the image depth sensing component is configured to emit reference light with a preset phase toward the filter layer; the image depth sensing component is configured to obtain at least four output images by receiving reflected light, wherein the received phases of the at least four output images respectively have different phase differences from the preset phase, and the reflected light is the light reflected by an object after the reference light passes through the filter layer; the image depth sensing component is configured to obtain at least four grayscale images according to filter parameters of the filter layer and the at least four output images; and the image depth sensing component is configured to obtain a depth image according to the at least four grayscale images.
- The depth image acquisition device according to claim 17, wherein the number of the at least four output images is four, and the four output images include a first output image, a second output image, a third output image, and a fourth output image, wherein: the phase difference between the received phase of the first output image and the preset phase is 0°, the phase difference between the received phase of the second output image and the preset phase is 90°, the phase difference between the received phase of the third output image and the preset phase is 180°, and the phase difference between the received phase of the fourth output image and the preset phase is 270°.
- The depth image acquisition device according to claim 18, wherein the at least four grayscale images are a first grayscale image corresponding to the first output image, a second grayscale image corresponding to the second output image, a third grayscale image corresponding to the third output image, and a fourth grayscale image corresponding to the fourth output image; the image depth sensing component is configured to acquire the gray value of each pixel in each of the first grayscale image, the second grayscale image, the third grayscale image, and the fourth grayscale image; and the image depth sensing component is configured to obtain the depth image according to the gray value of each pixel and a depth image formula, the depth image formula comprising: where A 1 , A 2 , A 3 and A 4 are respectively the gray values of the pixels at the same position in the first grayscale image, the second grayscale image, the third grayscale image, and the fourth grayscale image, L is the depth of field parameter corresponding to each pixel in the depth image, φ is the phase difference corresponding to each pixel in the depth image, C is the speed of light, π is pi, and f is the frequency of the reference light.
- The depth image acquisition device according to any one of claims 16-19, wherein the image depth sensing component is configured to convolve the at least four output images respectively with the filter parameters to obtain the at least four grayscale images.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010124155.1 | 2020-02-27 | ||
CN202010124155.1A CN111464721B (zh) | 2020-02-27 | 2020-02-27 | 深度图像的获取方法及显示装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021170114A1 true WO2021170114A1 (zh) | 2021-09-02 |
Family
ID=71679999
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/078271 WO2021170114A1 (zh) | 2020-02-27 | 2021-02-26 | 深度图像的获取方法、装置及显示装置 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN111464721B (zh) |
WO (1) | WO2021170114A1 (zh) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111464721B (zh) * | 2020-02-27 | 2022-02-08 | 京东方科技集团股份有限公司 | 深度图像的获取方法及显示装置 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160078628A1 (en) * | 2012-01-03 | 2016-03-17 | Samsung Electronics Co., Ltd. | Display apparatus and method for estimating depth |
CN108616735A (zh) * | 2016-12-02 | 2018-10-02 | 北京三星通信技术研究有限公司 | 用于获得对象的三维深度图像的装置和方法 |
CN109215604A (zh) * | 2018-11-07 | 2019-01-15 | 京东方科技集团股份有限公司 | 显示装置及其纹路识别方法、实现该方法的产品、纹路识别器件 |
CN109508683A (zh) * | 2018-11-21 | 2019-03-22 | 京东方科技集团股份有限公司 | 用于显示装置的纹路识别方法、纹路检测芯片及显示装置 |
CN111464721A (zh) * | 2020-02-27 | 2020-07-28 | 京东方科技集团股份有限公司 | 深度图像的获取方法及显示装置 |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102566290B (zh) * | 2010-12-22 | 2014-06-18 | 上海微电子装备有限公司 | 投影式斜坡曝光光刻机装置与方法 |
JP5414752B2 (ja) * | 2011-08-08 | 2014-02-12 | キヤノン株式会社 | 画像処理方法、画像処理装置、撮像装置、および、画像処理プログラム |
CN102679950B (zh) * | 2012-05-18 | 2014-06-18 | 中国航空工业集团公司北京长城计量测试技术研究所 | 一种基于三波长飞秒激光的测距装置及方法 |
WO2014100942A1 (zh) * | 2012-12-24 | 2014-07-03 | 华为技术有限公司 | 一种激光光源输出装置及激光输出*** |
US9584790B2 (en) * | 2013-06-03 | 2017-02-28 | Microsoft Technology Licensing, Llc | Edge preserving depth filtering |
KR20170098089A (ko) * | 2016-02-19 | 2017-08-29 | 삼성전자주식회사 | 전자 장치 및 그의 동작 방법 |
CN107068886B (zh) * | 2017-05-12 | 2019-01-25 | 京东方科技集团股份有限公司 | 有机电致发光器件及其制作方法、发光装置 |
CN109962091B (zh) * | 2019-03-29 | 2021-01-26 | 京东方科技集团股份有限公司 | 一种电致发光显示面板及显示装置 |
CN112098976B (zh) * | 2019-10-25 | 2021-05-11 | 深圳煜炜光学科技有限公司 | 一种具有同步并行扫描功能的多线激光雷达和控制方法 |
-
2020
- 2020-02-27 CN CN202010124155.1A patent/CN111464721B/zh active Active
-
2021
- 2021-02-26 WO PCT/CN2021/078271 patent/WO2021170114A1/zh active Application Filing
Also Published As
Publication number | Publication date |
---|---|
CN111464721B (zh) | 2022-02-08 |
CN111464721A (zh) | 2020-07-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210356572A1 (en) | Systems and methods for augmentation of sensor systems and imaging systems with polarization | |
KR102456875B1 (ko) | 심도 촬상 장치, 방법 및 응용 | |
JP7208156B2 (ja) | ピクセルワイズイメージングの方法及びシステム | |
KR102497704B1 (ko) | 광 검출기 | |
Heredia Conde | Compressive sensing for the photonic mixer device | |
US20090268045A1 (en) | Apparatus and methods for configuration and optimization of image sensors for gaze tracking applications | |
CN111727602B (zh) | 单芯片rgb-d相机 | |
JP4900723B2 (ja) | 画像処理装置、画像処理プログラムおよび表示装置 | |
US9131221B2 (en) | Stereoscopic image capturing method, system and camera | |
EP3918415B1 (en) | Apparatus integrated with display panel for tof 3d spatial positioning | |
JP2010071976A (ja) | 距離推定装置、距離推定方法、プログラム、集積回路およびカメラ | |
JP2016529491A (ja) | 飛行時間型カメラシステム | |
JP2016510396A (ja) | 異なる奥行き撮像技術を用いて生成された奥行き画像をマージするための方法および装置 | |
CN108921781A (zh) | 一种基于深度的光场拼接方法 | |
KR20170094968A (ko) | 피사체 거리 측정 부재, 이를 갖는 카메라 모듈 | |
CN109697957B (zh) | 图像像素校正方法及*** | |
CN111678457A (zh) | 一种OLED透明屏下ToF装置及测距方法 | |
US20210310803A1 (en) | Sensing element having pixels exposed by aperture increasing along predetermined direction | |
WO2021170114A1 (zh) | 深度图像的获取方法、装置及显示装置 | |
CN109788216A (zh) | 用于tof的抗干扰方法、装置及tof传感器芯片 | |
US11610339B2 (en) | Imaging processing apparatus and method extracting a second RGB ToF feature points having a correlation between the first RGB and TOF feature points | |
CN115170810B (zh) | 一种可见光红外图像融合目标检测实例分割方法 | |
CN111654626A (zh) | 一种包含深度信息的高分辨率相机 | |
Conde et al. | Low-light image enhancement for multiaperture and multitap systems | |
US20220295038A1 (en) | Multi-modal and multi-spectral stereo camera arrays |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21761181 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21761181 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 12/04/2023) |
|