CN114283089A - Jump acceleration based depth recovery method, electronic device, and storage medium - Google Patents

Jump acceleration based depth recovery method, electronic device, and storage medium Download PDF

Info

Publication number
CN114283089A
CN114283089A
Authority
CN
China
Prior art keywords
parallax
value
disparity
seed point
speckle pattern
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111603078.9A
Other languages
Chinese (zh)
Other versions
CN114283089B (en)
Inventor
李东洋
化雪诚
王海彬
刘祺昌
户磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hefei Dilusense Technology Co Ltd
Original Assignee
Beijing Dilusense Technology Co Ltd
Hefei Dilusense Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Dilusense Technology Co Ltd, Hefei Dilusense Technology Co Ltd filed Critical Beijing Dilusense Technology Co Ltd
Priority to CN202111603078.9A priority Critical patent/CN114283089B/en
Publication of CN114283089A publication Critical patent/CN114283089A/en
Application granted granted Critical
Publication of CN114283089B publication Critical patent/CN114283089B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Landscapes

  • Image Processing (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The embodiment of the invention relates to the field of image processing and discloses a depth recovery method based on jump acceleration, an electronic device, and a storage medium, wherein the method comprises the following steps: for the preprocessed object speckle pattern and reference speckle pattern, selecting from the object speckle pattern a plurality of candidate seed points and a first disparity search range corresponding to each candidate seed point; for each candidate seed point, selecting by jumping a plurality of disparity values from the corresponding first disparity search range to perform disparity search, determining whether the candidate seed point is a seed point based on the obtained matching cost value corresponding to each disparity value, and obtaining the disparity value of the seed point; determining the disparity values of the object speckle pattern and the reference speckle pattern from the seed points and their disparity values by a region growing method; and recovering depth information based on the disparity values of the object speckle pattern and the reference speckle pattern. With this scheme, the depth recovery process can be accelerated while the accuracy of the recovered image is effectively ensured.

Description

Jump acceleration based depth recovery method, electronic device, and storage medium
Technical Field
The present invention relates to the field of image processing, and in particular, to a depth recovery method based on jump acceleration, an electronic device, and a storage medium.
Background
At present, depth perception is one of the most active technical branches in the field of machine vision, and speckle structured light technology is an important part of depth perception technology. As the most common active stereoscopic vision technology, speckle structured light is widely applied in fields such as face recognition, automatic driving, and security monitoring. A speckle structured light system projects pseudo-random speckles onto the photographed object and then performs feature matching of the speckles according to a specific algorithm to obtain disparity information, from which the depth information of the scene is further obtained.
However, brute-force speckle matching is computationally intensive and time consuming. Currently, a number of different techniques are used in the industry to address this problem. For example, image binarization may be used with the Hamming distance as the similarity measure during matching, which greatly reduces the amount of calculation and the time consumption, but its accuracy is generally lower than that of methods based on image local gray normalization (LCN) and the zero-mean normalized cross correlation (ZNCC) measure. Alternatively, the approximate range of the depth may be estimated in some way so as to narrow the disparity search range, which reduces time consumption but is limited by the business scenario in which it can be used. Alternatively, neural networks may be adopted, but they are limited by the amount of data required for training and by memory constraints on model size, and their black-box nature is not easily interpretable.
Disclosure of Invention
An object of embodiments of the present invention is to provide a depth recovery method based on jump acceleration, an electronic device, and a storage medium, which can accelerate a depth recovery process on the basis of effectively ensuring the accuracy of a recovered image.
In order to solve the above technical problem, an embodiment of the present invention provides a depth recovery method based on jump acceleration, including:
aiming at the preprocessed object speckle pattern and the reference speckle pattern, selecting a plurality of candidate seed points from the object speckle pattern and a first parallax search range corresponding to each candidate seed point;
for each candidate seed point, jumping and selecting a plurality of parallax values from the corresponding first parallax search range to perform parallax search in the preprocessed reference speckle pattern, determining whether the candidate seed point is a seed point or not based on the matching cost value corresponding to each parallax value obtained by the parallax search, and obtaining the parallax value of the seed point;
determining the disparity value of the object speckle pattern and the reference speckle pattern by using the seed points and the disparity values thereof and adopting a region growing method;
recovering depth information based on disparity values of the object speckle pattern and the reference speckle pattern.
An embodiment of the present invention also provides an electronic device, including:
at least one processor; and the number of the first and second groups,
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform a jump acceleration based depth restoration method as described above.
Embodiments of the present invention also provide a computer-readable storage medium storing a computer program which, when executed by a processor, implements a jump acceleration based depth recovery method as described above.
Compared with the prior art, in the embodiments of the present invention, for the preprocessed object speckle pattern and reference speckle pattern, a plurality of candidate seed points and a first disparity search range corresponding to each candidate seed point are selected from the object speckle pattern; for each candidate seed point, a plurality of disparity values are selected by jumping within the corresponding first disparity search range to perform disparity search in the preprocessed reference speckle pattern, whether the candidate seed point is a seed point is determined based on the matching cost value corresponding to each disparity value obtained by the disparity search, and the disparity value of the seed point is obtained; the disparity values between the object speckle pattern and the reference speckle pattern are determined from the seed points and their disparity values by a region growing method; and the depth information is recovered based on the disparity values of the object speckle pattern and the reference speckle pattern. In this scheme, when the large-range disparity matching of the candidate seed points is performed, the search is not carried out continuously over the preset disparity search range; instead, the matching search is performed in a jumping manner at a certain interval. After a seed point is successfully searched and determined, it is grown until the growth over the image region is finished, the disparity values between the object speckle pattern and the reference speckle pattern are determined, and the depth information of the object is then recovered based on the disparity map, so that the depth recovery process is accelerated while the accuracy of the recovered image is effectively ensured.
Drawings
Fig. 1 is a specific flowchart one of a depth recovery method based on jump acceleration according to an embodiment of the present invention;
FIG. 2 is a diagram of a candidate seed point selection according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a triangulation principle to calculate depth according to an embodiment of the invention;
FIG. 4 is a specific flowchart II of a depth recovery method based on jump acceleration according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the embodiments of the present invention will be described in detail below with reference to the accompanying drawings. Those of ordinary skill in the art will appreciate that numerous technical details are set forth in the embodiments in order to provide a better understanding of the present application. However, the technical solutions claimed in the present application can be implemented without some of these technical details, and with various changes and modifications based on the following embodiments.
The invention relates to a depth recovery method based on jump acceleration, which is suitable for an image processing scene for recovering a depth image of a target object by using a speckle pattern of the target object. As shown in fig. 1, the depth recovery method based on jump acceleration provided by this embodiment includes the following steps.
Step 101: and aiming at the preprocessed object speckle pattern and the reference speckle pattern, selecting a plurality of candidate seed points from the object speckle pattern and a first parallax search range corresponding to each candidate seed point.
Specifically, a speckle pattern of a target object can be photographed by a structured light camera (simply referred to as "camera") as an object speckle pattern; the reference speckle pattern is a planar speckle pattern of known distance. The object speckle pattern and the reference speckle pattern are preprocessed to improve the light-dark contrast and the brightness balance effect of the speckle.
In one example, pre-processing the object speckle pattern and the reference speckle pattern may include: and sequentially carrying out histogram equalization processing and local binarization processing on the object speckle pattern and the reference speckle pattern.
Specifically, histogram equalization processing is performed on the speckle pattern (a grayscale image) to balance the brightness of the image and avoid images that are too bright or too dark. The specific operation is as follows.
For the speckle image, count the number m_i of pixels having each pixel value i, where i ∈ [0, 255]. If the speckle image resolution is R (the number of pixels contained in the image), the processed pixel value i' corresponding to the pixel value i is calculated by formula (1):
i' = round( (255 / R) · Σ_{j=0}^{i} m_j )        (1)
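As a concrete illustration, the histogram equalization step can be sketched in Python/NumPy as follows. This is a minimal sketch assuming an 8-bit grayscale speckle image; the function and variable names are illustrative, not taken from the patent.

```python
import numpy as np

def equalize_histogram(img: np.ndarray) -> np.ndarray:
    """Map each pixel value i to i' per formula (1): i' = round(255 * cumulative_count(i) / R)."""
    hist = np.bincount(img.ravel(), minlength=256)     # m_i for i in [0, 255]
    cdf = np.cumsum(hist)                              # sum_{j <= i} m_j
    r = img.size                                       # R: total number of pixels
    lut = np.round(255.0 * cdf / r).astype(np.uint8)   # look-up table implementing formula (1)
    return lut[img]
```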
Then, local binarization processing is performed on the speckle image after histogram equalization. For a pixel point P at any coordinate (x, y) on the speckle image, let its gray value be G(x, y); take a neighborhood window of size k × k centered on the pixel point P, and calculate the mean avg and the standard deviation std of the gray values within the neighborhood window (the maximum standard deviation over the image is recorded as stdmax); the binarization threshold is then given by formula (2).
thre(x, y) = avg · ( 1 + Δ · ( std / stdmax - 1 ) )        (2)
where thre(x, y) is the local binarization threshold of the pixel point P, and Δ is a hyper-parameter with value range [-1, 1].
And then carrying out binarization through a formula (3) to obtain a binarized image m (x, y).
m(x, y) = 1 if G(x, y) ≥ thre(x, y), otherwise m(x, y) = 0        (3)
This local binarization method adapts to the brightness and contrast within the speckle image; compared with a fixed binarization threshold or a mean threshold, it increases the tolerance to speckle image quality and the robustness of the algorithm.
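A minimal sketch of this binarization stage is given below. It assumes the threshold form reconstructed as formula (2) above (the original formula is only available as an equation image); the window size k, the value of Δ, and the use of SciPy's uniform_filter are illustrative choices, not prescribed by the patent.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_binarize(img: np.ndarray, k: int = 11, delta: float = 0.2) -> np.ndarray:
    """Threshold each pixel with the assumed form of formula (2), then apply formula (3)."""
    g = img.astype(np.float64)
    avg = uniform_filter(g, size=k)                      # local mean over the k x k window
    sq_avg = uniform_filter(g * g, size=k)
    std = np.sqrt(np.maximum(sq_avg - avg * avg, 0.0))   # local standard deviation
    std_max = std.max() if std.max() > 0 else 1.0        # stdmax over the image
    thre = avg * (1.0 + delta * (std / std_max - 1.0))   # formula (2), assumed form
    return (g >= thre).astype(np.uint8)                  # formula (3): binarized image m(x, y)
```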
For the preprocessed object speckle pattern and reference speckle pattern, a plurality of candidate seed points, together with a first disparity search range corresponding to each candidate seed point, can be selected from the object speckle pattern.
Specifically, the premise of depth recovery using a region growing algorithm is that the depth of the scene has a certain continuity; the growing step is equivalent to the cost aggregation part of the depth recovery process. Therefore, a plurality of pixel points are selected from the object speckle pattern on a grid at a certain interval as candidate seed points (such as the solid points in fig. 2), forming a queue of candidate seed points. The candidate seed points in the queue are selected in sequence, and for each candidate seed point an initial disparity search range, namely the first disparity search range, is determined for the disparity search. The first disparity search range here may be the full disparity search range or a partially continuous range.
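As an illustration, the grid-style selection of candidate seed points can be sketched as follows; the grid spacing and the bounds of the first disparity search range are placeholder values, not values specified by the patent.

```python
import numpy as np

def select_candidate_seeds(height: int, width: int, step: int = 16,
                           d_l: int = 64, d_r: int = 64):
    """Return a queue of (y, x) candidate seed points and the shared first disparity search range."""
    ys, xs = np.mgrid[step // 2:height:step, step // 2:width:step]
    candidates = list(zip(ys.ravel().tolist(), xs.ravel().tolist()))
    first_range = (-d_l, d_r)      # first disparity search range [-d_l, d_r]
    return candidates, first_range
```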
Step 102: and for each candidate seed point, jumping and selecting a plurality of parallax values from the corresponding first parallax search range to perform parallax search in the preprocessed reference speckle pattern, determining whether the candidate seed point is a seed point or not based on the matching cost value corresponding to each parallax value obtained by the parallax search, and obtaining the parallax value of the seed point.
Specifically, a conventional disparity search for candidate seed points is a full-range search. To alleviate the excessive amount of calculation at this stage, the disparity search in this embodiment is performed in a jump matching manner: a plurality of disparity values are selected by jumping within the initially determined first disparity search range, and the disparity search is performed in the preprocessed reference speckle pattern. The jump selection can use a fixed jump step or a variable jump step, where a fixed jump step has a value greater than 1. In other words, only part of the disparity values in the first disparity search range are used for the disparity search. After the plurality of disparity values are selected from the corresponding first disparity search range for each candidate seed point, local image block matching can be performed between the matching points corresponding to these disparity values in the preprocessed reference speckle pattern and the corresponding candidate seed point, so as to obtain the matching cost value corresponding to each disparity value. Finally, whether a candidate seed point is a seed point is determined according to the matching cost values corresponding to its disparity values, and when it is determined to be a seed point, its disparity value is obtained. When making this determination, the smaller the matching cost values corresponding to the disparity values, the higher the possibility that the candidate seed point is determined to be a seed point. After a candidate seed point is determined to be a seed point, its disparity value may be selected from the searched disparity values based on their matching cost values, or may be further calculated from those disparity values.
Step 103: and determining the parallax value of the object speckle pattern and the reference speckle pattern by using the seed points and the parallax value thereof and adopting a region growing method.
Specifically, if a candidate seed point is successfully judged to be a seed point and its disparity value is obtained, the growing stage of the seed point is entered. Growth is performed around the seed point. For each neighborhood point around the seed point, a disparity search within [-2, 2] around the disparity value of the seed point is performed, and the matching method and cost calculation are the same as those used for the original seed point. If the matching cost value is smaller than the growth threshold, a matched disparity value is considered to have been found. In this case, the disparity value corresponding to that matching cost value may be used directly as the disparity value of the current matching point, or its sub-pixel level disparity may be calculated with reference to formula (5) as the final disparity value. The current neighborhood point is then taken as a new seed point and region growing is performed around it, so as to obtain the disparity values of further neighborhood points. If the matching cost value is not smaller than the growth threshold, the method proceeds to the next neighborhood point to be processed.
For each seed point, the disparities of its neighborhood points are searched iteratively by the region growing method, and the disparity values of the pixel points between the object speckle pattern and the reference speckle pattern are finally determined to form a disparity map.
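The growing stage described above can be sketched as follows. Here match_cost is assumed to be a callable that returns the matching cost of pixel (y, x) at a given disparity (for instance the Hamming cost described later), and the use of a 4-neighborhood and a single growth threshold is illustrative.

```python
from collections import deque

def grow_regions(seeds, seed_disps, match_cost, shape, grow_thresh):
    """Region growing: propagate disparities from seed points to their neighborhoods."""
    h, w = shape
    disparity = {}                                    # (y, x) -> disparity value
    queue = deque()
    for (y, x), d in zip(seeds, seed_disps):
        disparity[(y, x)] = d
        queue.append((y, x))
    while queue:
        y, x = queue.popleft()
        d0 = disparity[(y, x)]
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if not (0 <= ny < h and 0 <= nx < w) or (ny, nx) in disparity:
                continue
            # search within [-2, 2] around the seed's disparity value
            candidates = [(match_cost(ny, nx, d0 + o), d0 + o) for o in range(-2, 3)]
            best_cost, best_d = min(candidates)
            if best_cost < grow_thresh:               # matched: record and keep growing
                disparity[(ny, nx)] = best_d
                queue.append((ny, nx))
    return disparity
```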
Step 104: and recovering the depth information based on the parallax values of the object speckle pattern and the reference speckle pattern.
Specifically, after the image growth is completed, the depth Z is calculated according to the triangulation principle shown in fig. 3 by using the parallax values d of all the pixel points, and the calculation formula is as follows:
Z = ( f · l · z_0 ) / ( f · l + z_0 · d )        (4)
where z_0 is the reference plane distance in mm, and f and l are the calibrated focal length of the camera and the baseline distance, respectively.
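A sketch of this depth calculation follows. Note that formula (4) above is reconstructed from the surrounding text, so the sign convention of the disparity d in the denominator is an assumption and may need to be flipped for a different disparity convention.

```python
import numpy as np

def disparity_to_depth(disp: np.ndarray, z0: float, f: float, baseline: float) -> np.ndarray:
    """z0: reference plane distance (mm); f: calibrated focal length (px); baseline: l (mm)."""
    denom = f * baseline + z0 * disp.astype(np.float64)
    depth = np.zeros_like(denom)
    valid = denom != 0
    depth[valid] = (f * baseline * z0) / denom[valid]    # formula (4), assumed sign convention
    return depth
```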
After the depth map is obtained, post-processing, such as median filtering, may be performed on the depth map to remove residual noise and output a high-precision depth map.
Compared with the prior art, in this embodiment of the invention, for the preprocessed object speckle pattern and reference speckle pattern, a plurality of candidate seed points and a first disparity search range corresponding to each candidate seed point are selected from the object speckle pattern; for each candidate seed point, a plurality of disparity values are selected by jumping within the corresponding first disparity search range to perform disparity search in the preprocessed reference speckle pattern, whether the candidate seed point is a seed point is determined based on the matching cost value corresponding to each disparity value obtained by the disparity search, and the disparity value of the seed point is obtained; the disparity values between the object speckle pattern and the reference speckle pattern are determined from the seed points and their disparity values by a region growing method; and the depth information is recovered based on the disparity values of the object speckle pattern and the reference speckle pattern. In this scheme, when the large-range disparity matching of the candidate seed points is performed, the search is not carried out continuously over the preset disparity search range; instead, the matching search is performed in a jumping manner at a certain interval. After a seed point is successfully searched and determined, it is grown until the growth over the image region is finished, the disparity values between the object speckle pattern and the reference speckle pattern are determined, and the depth information of the object is then recovered based on the disparity map, so that the depth recovery process is accelerated while the accuracy of the recovered image is effectively ensured.
Another embodiment of the present invention relates to a jump acceleration based depth recovery method, as shown in fig. 4, which refines the method shown in fig. 1 by detailing the process of determining a seed point and obtaining the disparity value of the seed point. As shown in fig. 4, step 102 may include the following sub-steps.
Substep 1021: for the current candidate seed point, jumping and selecting a plurality of disparity values from any end boundary value in the corresponding first disparity search range along the direction towards the other end boundary value; or, a plurality of disparity values are selected in a jumping manner along the direction towards the two end boundary values from the position with the disparity value of 0 in the corresponding first disparity search range.
Specifically, after the first disparity search range corresponding to the current candidate seed point, such as [-d_l, d_r], is determined, a plurality of disparity values can be selected by jumping within [-d_l, d_r] in different ways. For example, disparity values may be selected by jumping from either end boundary value of the first disparity search range in the direction toward the other end. For instance, -d_l may be taken as the first selected disparity value, and further disparity values are then selected with a fixed or variable jump step extending toward d_r; or d_r may be taken as the first selected disparity value, and further disparity values are then selected with a fixed or variable jump step extending toward -d_l. For another example, disparity values may be selected by jumping from the position where the disparity value is 0 in the first disparity search range toward both end boundary values: the disparity value 0 is taken as the first selected disparity value, and further disparity values are then selected with a fixed or variable jump step extending toward -d_l and d_r in both directions. This manner applies when -d_l < 0 < d_r.
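The two jump-selection orders described above can be sketched as follows; the step s is passed in explicitly, and the next example in the text sets it to the speckle radius.

```python
def jump_from_boundary(d_l: int, d_r: int, s: int):
    """Start at one end boundary (-d_l) and jump with step s toward the other end (d_r)."""
    return list(range(-d_l, d_r + 1, s))

def jump_from_zero(d_l: int, d_r: int, s: int):
    """Start at disparity 0 and jump with step s alternately toward both end boundaries."""
    values = [0]
    k = s
    while k <= d_r or -k >= -d_l:
        if k <= d_r:
            values.append(k)
        if -k >= -d_l:
            values.append(-k)
        k += s
    return values
```

For example, jump_from_zero(64, 64, 2) yields 0, 2, -2, 4, -4, and so on out to the range boundaries.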
In one example, the jump step size used when the jump selects the plurality of disparity values is the radius of the speckle.
Specifically, the principle behind the jump matching employed in this embodiment is that, when the reference image block is slid relative to the object image block along the disparity direction for matching, the matching degree becomes relatively good once the match enters a small range around the ground-truth (GT) disparity, and this range coincides with the radius of the speckle point. Using this characteristic, if a point with a high matching degree is found by jump search, the position at or near that point can be considered the best disparity position. Therefore, in this embodiment, the jump step is set to the radius of the speckle. For example, when the speckle point diameter is 3 to 5 pixels, the jump step is set to 2.
Substep 1022: and carrying out parallax search on the selected multiple parallax values in the preprocessed reference speckle pattern, and determining the matching cost value corresponding to each parallax value.
Specifically, after a plurality of disparity values are selected for each candidate seed point, disparity search may be performed on the selected plurality of disparity values in the preprocessed reference speckle pattern, and a matching cost value corresponding to each disparity value is determined. In this embodiment, the matching cost algorithm in the parallax search is not limited.
In one example, a neighborhood window may be utilized to calculate a hamming distance between the object image block and the reference image block corresponding to each disparity value, and the hamming distance is used as a matching cost value corresponding to the corresponding disparity value.
Specifically, in the information theory, Hamming Distance (Hamming Distance) represents the number of different characters in corresponding positions of two character strings of equal length. In this embodiment, the hamming distance between the object image block and the reference image block is defined as the number of pixel points with different gray values at the same position in the object image block and the reference image block, and the larger the hamming distance is, the more the number of pixel points with different gray values at the same position is, the worse the matching degree between the object image block and the reference image block is. The Hamming distance between the object image block and the reference image block corresponding to each parallax value is used as the matching cost value corresponding to the corresponding parallax value, so that the matching degree between the two image blocks under the parallax value can be represented, and the quality of the parallax value can be evaluated.
For example, the candidate seed points in the queue are selected in order, and the coordinates of the current point are recorded as (x_p, y_p); with this point as the center, the object image block with window length w is I_w(x, y). The coordinate at matching disparity position d is (x_p + d, y_p); with that point as the center, the reference image block with window length w is J_w(x, y). The disparity value d ∈ [-d_l, d_r]. Dividing the length of the disparity range by the jump step s gives the number of jumps, and hence the disparity value selected at each jump. The Hamming distance between the object image block and the reference image block under this disparity (i.e., the number of differing points in the two windows) is calculated by the XOR operator in bit operations.
In one example, in order to reduce the amount of calculation, image compression may be performed in advance on the preprocessed object speckle pattern and reference speckle pattern to obtain integer maps whose size is 1/32 of the original images (the speckle images after local binarization). Accordingly, in step 1022, for the object image block and the reference image block corresponding to each disparity value, the XOR operation may be performed on the corresponding integer image blocks in the integer maps to obtain the Hamming distance, which is used as the matching cost value corresponding to the respective disparity value.
The advantage of this processing is that a typical integer is 32 bits while a binarized value is Boolean and occupies only 1 bit, so the Hamming distance can subsequently be computed through integer XOR operations with roughly 1/32 of the computation; this is also why depth recovery can be realized quickly using the local binarization method.
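A sketch of the bit-packed Hamming-distance cost is shown below. Each binarized row is packed into 32-bit words so that a single XOR plus a popcount compares 32 pixels at once; the packing layout and the handling of matching windows are illustrative assumptions, not the patent's exact memory layout. A w × w matching window would additionally require bit-aligned extraction from the packed rows, which is omitted here.

```python
import numpy as np

def pack_rows_uint32(binary: np.ndarray) -> np.ndarray:
    """Pack a 0/1 image row-wise into 32-bit words (width padded up to a multiple of 32)."""
    h, w = binary.shape
    pad = (-w) % 32
    padded = np.pad(binary.astype(np.uint8), ((0, 0), (0, pad)), constant_values=0)
    packed_bytes = np.packbits(padded, axis=1)                  # 8 pixels per byte
    return packed_bytes.reshape(h, -1, 4).view(">u4")[:, :, 0]  # 4 bytes -> one 32-bit word

POPCNT8 = np.array([bin(i).count("1") for i in range(256)], dtype=np.uint8)

def hamming_cost(words_obj: np.ndarray, words_ref: np.ndarray) -> int:
    """Hamming distance between two equally shaped blocks of packed words (XOR + popcount)."""
    diff = np.bitwise_xor(words_obj, words_ref)
    return int(POPCNT8[diff.view(np.uint8)].sum())
```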
Substep 1023: and determining whether the current candidate seed point is a seed point or not based on the matching cost value corresponding to each parallax value, and acquiring the parallax value of the seed point.
In one example, this sub-step 1023 can be implemented by the following steps.
The method comprises the following steps: determining a second parallax searching range based on the parallax value with the minimum matching cost value in all the parallax values, and performing continuous parallax searching on the second parallax searching range in the preprocessed reference speckle pattern to obtain a plurality of first matching cost values of the current candidate seed points; the length of the second disparity search range is smaller than the length of the first disparity search range.
Specifically, after the matching cost values corresponding to the disparity values selected in the jumping manner are determined for the current candidate seed point, the disparity value with the smallest matching cost value may be selected from them, and the second disparity search range is determined based on this disparity value. The length of the second disparity search range is smaller than that of the first disparity search range. For example, the second disparity search range may be obtained by superimposing [-s, s] on the disparity value with the smallest cost, where s is the jump step (such as the speckle radius) used when jump-selecting the disparity values. A continuous disparity search is then performed over the second disparity search range in the preprocessed reference speckle pattern to obtain the matching cost value corresponding to each disparity d in that range for the current candidate seed point. To distinguish them from the matching cost values determined by jumping within the first disparity search range, the matching cost values determined continuously within the second disparity search range are denoted as the first matching cost values.
In this embodiment, the first matching cost value of the parallax in the second parallax search range is calculated by using a hamming distance method, which is the same as the method for calculating the matching cost value of the parallax in the first parallax search range.
Step two: and if the minimum value of the first matching cost values is smaller than the set threshold value, taking the current candidate seed point as a seed point, and taking the parallax value corresponding to the minimum value of the first matching cost values as the parallax value of the seed point.
Specifically, when the minimum first matching cost value obtained by the disparity search in the reference speckle pattern over the second disparity search range is smaller than the set threshold, it is considered that a good disparity value for the current candidate seed point has been found within the current search range. In this case, the current candidate seed point can be directly determined as a seed point, and the disparity value with the minimum first matching cost value in the current search range is taken as the disparity value of the seed point.
Step three: if the minimum value of the plurality of first matching cost values is not less than the matching threshold, the current candidate seed point is discarded.
Specifically, when the minimum first matching cost value obtained by the disparity search in the reference speckle pattern over the second disparity search range is not smaller than the set threshold, it is determined that no good disparity value for the current candidate seed point has been found in the second disparity search range. In this case, the matching quality of the candidate seed point itself may be poor, so the current candidate seed point is judged to have failed, and the judging process for other candidate seed points can then proceed from sub-step 1021.
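Steps one to three can be sketched together as below. Here match_cost is again an assumed callable giving the current candidate seed point's matching cost at a given disparity, and accept_thresh stands for the set threshold; both names are placeholders.

```python
def refine_candidate(d_min_jump: int, s: int, d_l: int, d_r: int,
                     match_cost, accept_thresh: float):
    """Continuous search over the second range [d_min_jump - s, d_min_jump + s], clipped to [-d_l, d_r]."""
    lo = max(-d_l, d_min_jump - s)
    hi = min(d_r, d_min_jump + s)
    first_costs = {d: match_cost(d) for d in range(lo, hi + 1)}   # first matching cost values
    d_best = min(first_costs, key=first_costs.get)
    if first_costs[d_best] < accept_thresh:
        return True, d_best, first_costs      # step two: candidate becomes a seed point
    return False, None, first_costs           # step three: candidate discarded
```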
In an example, when step two is satisfied, that is, the minimum value of the plurality of first matching cost values is smaller than the set threshold, the original disparity can further be replaced by a calculated sub-pixel level disparity to increase the disparity accuracy. The processing includes the following steps.
Step four: for the disparity value d corresponding to the minimum value of the first matching cost values, determine the matching cost value C_{d-1} of the adjacent disparity value d-1 and the matching cost value C_{d+1} of the adjacent disparity value d+1.
Specifically, the matching cost values C_{d-1} and C_{d+1} are calculated in the same way as C_d, i.e., by computing the Hamming distance, which is not repeated here.
Step five: calculate the sub-pixel level disparity d' of the disparity value d by the following formula, and replace the disparity value d with the sub-pixel level disparity d' as the disparity value of the seed point:
d' = d + ( L - R ) / ( 2 · ( L + R ) )        (5)
where L = C_{d-1} - C_d and R = C_{d+1} - C_d.
Specifically, for a candidate seed point (x, y) determined to be a seed point, after its minimum first matching cost value C_d and the corresponding disparity value d are obtained, the matching cost values C_{d-1} and C_{d+1} of the two adjacent disparity values d-1 and d+1 can be calculated. Setting L = C_{d-1} - C_d and R = C_{d+1} - C_d, the sub-pixel level disparity d' is calculated by formula (5). Finally, the disparity value d is replaced by the sub-pixel level disparity d' as the disparity value of the current seed point.
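A direct transcription of formula (5) as reconstructed above is shown below; costs is assumed to contain the matching cost at d - 1, d, and d + 1 (for example the dictionary returned by the refinement sketch earlier).

```python
def subpixel_disparity(costs: dict, d: int) -> float:
    """Sub-pixel refinement per formula (5): d' = d + (L - R) / (2 * (L + R))."""
    c_d, c_left, c_right = costs[d], costs[d - 1], costs[d + 1]
    L = c_left - c_d
    R = c_right - c_d
    if L + R == 0:                 # flat cost curve: keep the integer disparity
        return float(d)
    return d + (L - R) / (2.0 * (L + R))
```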
Compared with the related art, in this embodiment, for the current candidate seed point, a plurality of disparity values are selected by jumping from either end boundary value of the corresponding first disparity search range toward the other end boundary value, or by jumping from the position where the disparity value is 0 in the corresponding first disparity search range toward both end boundary values; the disparity search is performed in the preprocessed reference speckle pattern for the selected disparity values, and the matching cost value corresponding to each disparity value is determined; whether the current candidate seed point is a seed point is then determined based on the matching cost values corresponding to the disparity values, and the disparity value of the seed point is obtained. In this way, the seed points and their disparity values are determined quickly and accurately based on jump selection of the disparity values to be matched.
Another embodiment of the invention relates to an electronic device, as shown in FIG. 5, comprising at least one processor 202; and a memory 201 communicatively coupled to the at least one processor 202; wherein the memory 201 stores instructions executable by the at least one processor 202, the instructions being executable by the at least one processor 202 to enable the at least one processor 202 to perform any of the method embodiments described above.
Where the memory 201 and the processor 202 are coupled in a bus, the bus may comprise any number of interconnected buses and bridges that couple one or more of the various circuits of the processor 202 and the memory 201 together. The bus may also connect various other circuits such as peripherals, voltage regulators, power management circuits, and the like, which are well known in the art, and therefore, will not be described any further herein. A bus interface provides an interface between the bus and the transceiver. The transceiver may be one element or a plurality of elements, such as a plurality of receivers and transmitters, providing a means for communicating with various other apparatus over a transmission medium. The data processed by the processor 202 is transmitted over a wireless medium through an antenna, which further receives the data and transmits the data to the processor 202.
The processor 202 is responsible for managing the bus and general processing and may also provide various functions including timing, peripheral interfaces, voltage regulation, power management, and other control functions. And memory 201 may be used to store data used by processor 202 in performing operations.
Another embodiment of the present invention relates to a computer-readable storage medium storing a computer program. The computer program realizes any of the above-described method embodiments when executed by a processor.
That is, as can be understood by those skilled in the art, all or part of the steps of the methods in the embodiments described above may be implemented by a program instructing related hardware, where the program is stored in a storage medium and includes several instructions that enable a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and other media capable of storing program codes.
It will be understood by those of ordinary skill in the art that the foregoing embodiments are specific examples for carrying out the invention, and that various changes in form and details may be made therein without departing from the spirit and scope of the invention in practice.

Claims (10)

1. A depth recovery method based on jump acceleration is characterized by comprising the following steps:
aiming at the preprocessed object speckle pattern and the reference speckle pattern, selecting a plurality of candidate seed points from the object speckle pattern and a first parallax search range corresponding to each candidate seed point;
for each candidate seed point, jumping and selecting a plurality of parallax values from the corresponding first parallax search range to perform parallax search in the preprocessed reference speckle pattern, determining whether the candidate seed point is a seed point or not based on the matching cost value corresponding to each parallax value obtained by the parallax search, and obtaining the parallax value of the seed point;
determining the disparity value of the object speckle pattern and the reference speckle pattern by using the seed points and the disparity values thereof and adopting a region growing method;
recovering depth information based on disparity values of the object speckle pattern and the reference speckle pattern.
2. The method of claim 1, wherein pre-processing the object speckle pattern and the reference speckle pattern comprises:
and sequentially carrying out histogram equalization processing and local binarization processing on the object speckle pattern and the reference speckle pattern.
3. The method of claim 2, wherein for each candidate seed point, skipping from the corresponding first disparity search range to select a plurality of disparity values for performing disparity search in the reference speckle pattern, determining whether the candidate seed point is a seed point based on a matching cost value corresponding to each disparity value obtained by the disparity search, and obtaining the disparity value of the seed point comprises:
for the current candidate seed point, jumping and selecting a plurality of disparity values from any end boundary value in the corresponding first disparity search range along the direction towards the other end boundary value; or, jumping and selecting a plurality of parallax values along the direction towards the boundary values at the two ends from the position where the parallax value is 0 in the corresponding first parallax search range;
performing parallax search on the selected multiple parallax values in the preprocessed reference speckle pattern, and determining a matching cost value corresponding to each parallax value;
and determining whether the current candidate seed point is a seed point or not based on the matching cost value corresponding to each parallax value, and acquiring the parallax value of the seed point.
4. The method of claim 3, wherein the step size of the jump is a radius of the speckle when the jump selects the plurality of disparity values.
5. The method of claim 3, wherein performing a disparity search on the selected disparity values in the preprocessed reference speckle pattern to determine a matching cost value corresponding to each disparity value comprises:
and calculating the Hamming distance between the object image block and the reference image block corresponding to each parallax value by using a neighborhood window, and taking the Hamming distance as the matching cost value corresponding to the corresponding parallax value.
6. The method of claim 5, further comprising:
performing image compression on the preprocessed object speckle pattern and the reference speckle pattern to obtain integer maps whose size is 1/32 of the original images;
the calculating a hamming distance between the object image block and the reference image block corresponding to each parallax value by using the neighborhood window, and taking the hamming distance as a matching cost value corresponding to the parallax value, includes:
and carrying out XOR operation on the object image blocks and the reference image blocks corresponding to the parallax values in the integer image blocks corresponding to the integer image block to obtain the Hamming distance, and taking the Hamming distance as the matching cost value corresponding to the parallax values.
7. The method according to any one of claims 3 to 6, wherein the determining whether the current candidate seed point is a seed point based on the matching cost value corresponding to each of the disparity values and obtaining the disparity value of the seed point comprises:
determining a second parallax search range based on the parallax value with the minimum matching cost value in each parallax value, and performing continuous parallax search on the second parallax search range in the preprocessed reference speckle pattern to obtain a plurality of first matching cost values of the current candidate seed points; the length of the second parallax search range is smaller than the length of the first parallax search range;
if the minimum value in the first matching cost values is smaller than a set threshold value, taking the current candidate seed point as a seed point, and taking the parallax value corresponding to the minimum value of the first matching cost values as the parallax value of the seed point;
discarding the current candidate seed point if a minimum value of the plurality of first matching cost values is not less than a matching threshold.
8. The method of claim 7, wherein when a minimum value of the plurality of first matching cost values is less than a set threshold, the method further comprises:
for the plurality of first matchesDetermining the matching cost value C of the adjacent parallax value d-1 corresponding to the parallax value dd-1Matching cost value C with adjacent disparity value d +1d+1
Calculating a sub-pixel level disparity d 'of the disparity value d by adopting the following formula to replace the disparity value d by the sub-pixel level disparity d' as the disparity value of the seed point:
Figure FDA0003432506580000021
wherein L ═ Cd-1-Cd,R=Cd+1-Cd
9. An electronic device, comprising:
at least one processor; and the number of the first and second groups,
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform a jump acceleration based depth restoration method as claimed in any one of claims 1 to 8.
10. A computer-readable storage medium, storing a computer program, wherein the computer program, when executed by a processor, implements the jump acceleration based depth recovery method according to any one of claims 1 to 8.
CN202111603078.9A 2021-12-24 2021-12-24 Jump acceleration based depth recovery method, electronic device, and storage medium Active CN114283089B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111603078.9A CN114283089B (en) 2021-12-24 2021-12-24 Jump acceleration based depth recovery method, electronic device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111603078.9A CN114283089B (en) 2021-12-24 2021-12-24 Jump acceleration based depth recovery method, electronic device, and storage medium

Publications (2)

Publication Number Publication Date
CN114283089A true CN114283089A (en) 2022-04-05
CN114283089B CN114283089B (en) 2023-01-31

Family

ID=80875334

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111603078.9A Active CN114283089B (en) 2021-12-24 2021-12-24 Jump acceleration based depth recovery method, electronic device, and storage medium

Country Status (1)

Country Link
CN (1) CN114283089B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114820393A (en) * 2022-06-28 2022-07-29 合肥的卢深视科技有限公司 Depth recovery method for fusion hole repair, electronic device and storage medium
CN115423808A (en) * 2022-11-04 2022-12-02 合肥的卢深视科技有限公司 Quality detection method for speckle projector, electronic device, and storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102903096A (en) * 2012-07-04 2013-01-30 北京航空航天大学 Monocular video based object depth extraction method
CN104268871A (en) * 2014-09-23 2015-01-07 清华大学 Method and device for depth estimation based on near-infrared laser speckles
CN108734776A (en) * 2018-05-23 2018-11-02 四川川大智胜软件股份有限公司 A kind of three-dimensional facial reconstruction method and equipment based on speckle
US20200126246A1 (en) * 2018-10-19 2020-04-23 Samsung Electronics Co., Ltd. Method and apparatus for active depth sensing and calibration method thereof
CN111325782A (en) * 2020-02-18 2020-06-23 南京航空航天大学 Unsupervised monocular view depth estimation method based on multi-scale unification
CN111402313A (en) * 2020-03-13 2020-07-10 合肥的卢深视科技有限公司 Image depth recovery method and device
WO2020206666A1 (en) * 2019-04-12 2020-10-15 深圳市汇顶科技股份有限公司 Depth estimation method and apparatus employing speckle image and face recognition system
CN112070819A (en) * 2020-11-11 2020-12-11 湖南极点智能科技有限公司 Face depth image construction method and device based on embedded system
CN113674335A (en) * 2021-08-19 2021-11-19 北京的卢深视科技有限公司 Depth imaging method, electronic device, and storage medium

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102903096A (en) * 2012-07-04 2013-01-30 北京航空航天大学 Monocular video based object depth extraction method
CN104268871A (en) * 2014-09-23 2015-01-07 清华大学 Method and device for depth estimation based on near-infrared laser speckles
CN108734776A (en) * 2018-05-23 2018-11-02 四川川大智胜软件股份有限公司 A kind of three-dimensional facial reconstruction method and equipment based on speckle
US20200126246A1 (en) * 2018-10-19 2020-04-23 Samsung Electronics Co., Ltd. Method and apparatus for active depth sensing and calibration method thereof
WO2020206666A1 (en) * 2019-04-12 2020-10-15 深圳市汇顶科技股份有限公司 Depth estimation method and apparatus employing speckle image and face recognition system
CN112771573A (en) * 2019-04-12 2021-05-07 深圳市汇顶科技股份有限公司 Depth estimation method and device based on speckle images and face recognition system
CN111325782A (en) * 2020-02-18 2020-06-23 南京航空航天大学 Unsupervised monocular view depth estimation method based on multi-scale unification
CN111402313A (en) * 2020-03-13 2020-07-10 合肥的卢深视科技有限公司 Image depth recovery method and device
CN112070819A (en) * 2020-11-11 2020-12-11 湖南极点智能科技有限公司 Face depth image construction method and device based on embedded system
CN113674335A (en) * 2021-08-19 2021-11-19 北京的卢深视科技有限公司 Depth imaging method, electronic device, and storage medium

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
CHEN ZHANG等: "Binocular Depth Estimation Based on Diffractive Optical Elements and the Semiglobal Matching Algorithm", 《2018 IEEE 3RD INTERNATIONAL CONFERENCE ON IMAGE, VISION AND COMPUTING (ICIVC)》 *
ZHANG Yanfeng et al.: "Fast stereo matching of speckle images based on progressive reliable point growing", Computer Science *
XIONG Wei et al.: "Binocular speckle 3D reconstruction based on a heterogeneous multi-core architecture", Advanced Engineering Sciences *
WANG Mengwei et al.: "Real-time scene depth recovery based on projected speckle", Journal of Computer-Aided Design & Computer Graphics *
XIE Yijiang et al.: "Fast 3D face reconstruction based on speckle stereo matching", Journal of Optoelectronics·Laser *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114820393A (en) * 2022-06-28 2022-07-29 合肥的卢深视科技有限公司 Depth recovery method for fusion hole repair, electronic device and storage medium
CN115423808A (en) * 2022-11-04 2022-12-02 合肥的卢深视科技有限公司 Quality detection method for speckle projector, electronic device, and storage medium

Also Published As

Publication number Publication date
CN114283089B (en) 2023-01-31

Similar Documents

Publication Publication Date Title
CN110084757B (en) Infrared depth image enhancement method based on generation countermeasure network
CN114283089B (en) Jump acceleration based depth recovery method, electronic device, and storage medium
US20110176722A1 (en) System and method of processing stereo images
US9105091B2 (en) Watermark detection using a propagation map
US9406140B2 (en) Method and apparatus for generating depth information
CN113674335B (en) Depth imaging method, electronic device and storage medium
CN111105452A (en) High-low resolution fusion stereo matching method based on binocular vision
CN116029996A (en) Stereo matching method and device and electronic equipment
CN110334652B (en) Image processing method, electronic device, and storage medium
JP4296617B2 (en) Image processing apparatus, image processing method, and recording medium
CN112580434A (en) Face false detection optimization method and system based on depth camera and face detection equipment
CN110188640B (en) Face recognition method, face recognition device, server and computer readable medium
CN110120012B (en) Video stitching method for synchronous key frame extraction based on binocular camera
CN113763449B (en) Depth recovery method and device, electronic equipment and storage medium
CN114331919B (en) Depth recovery method, electronic device, and storage medium
CN114283081B (en) Depth recovery method based on pyramid acceleration, electronic device and storage medium
CN114693546B (en) Image denoising method and device, electronic equipment and computer readable storage medium
JP2019020839A (en) Image processing apparatus, image processing method and program
CN113936316B (en) DOE (DOE-out-of-state) detection method, electronic device and computer-readable storage medium
CN113822818B (en) Speckle extraction method, device, electronic device, and storage medium
CN111630569B (en) Binocular matching method, visual imaging device and device with storage function
CN113965697B (en) Parallax imaging method based on continuous frame information, electronic device and storage medium
CN113379816B (en) Structure change detection method, electronic device, and storage medium
CN112233164B (en) Method for identifying and correcting error points of disparity map
CN110751163A (en) Target positioning method and device, computer readable storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220601

Address after: 230091 room 611-217, R & D center building, China (Hefei) international intelligent voice Industrial Park, 3333 Xiyou Road, high tech Zone, Hefei, Anhui Province

Applicant after: Hefei lushenshi Technology Co.,Ltd.

Address before: 100083 room 3032, North B, bungalow, building 2, A5 Xueyuan Road, Haidian District, Beijing

Applicant before: BEIJING DILUSENSE TECHNOLOGY CO.,LTD.

Applicant before: Hefei lushenshi Technology Co.,Ltd.

TA01 Transfer of patent application right
GR01 Patent grant
GR01 Patent grant