CN111489383A - Depth image up-sampling method and system based on depth edge point and color image - Google Patents


Info

Publication number
CN111489383A
CN111489383A (application CN202010280991.9A; granted publication CN111489383B)
Authority
CN
China
Prior art keywords
depth
edge
resolution
pixel
depth map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010280991.9A
Other languages
Chinese (zh)
Other versions
CN111489383B (en)
Inventor
王春兴
祖兰晶
万文博
任艳楠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Normal University
Original Assignee
Shandong Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong Normal University filed Critical Shandong Normal University
Priority to CN202010280991.9A
Publication of CN111489383A
Application granted
Publication of CN111489383B
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/50 - Depth or shape recovery
    • G06T7/529 - Depth or shape recovery from texture
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/90 - Determination of colour characteristics
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10016 - Video; Image sequence
    • G06T2207/10021 - Stereoscopic video; Stereoscopic image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The present disclosure provides a depth image up-sampling method and system based on depth edge points and color-image guidance. The method obtains a low-resolution depth map and a high-resolution color map; performs edge detection on the low-resolution depth map and divides its pixel region into a flat region and an edge region according to the low-resolution depth edge map; marks the unreliable pixel points of the low-resolution depth map; corrects the unreliable pixel points, initializes the edge-enhanced low-resolution depth map, judges the structural consistency of the initialized depth map and the high-resolution color map, completing the classification of the pixel points in the initialized depth map, and searches for the true edge pixel points of the depth-reliable pixel region; maps the true edge pixel points of the initialized depth map into the edge-enhanced low-resolution depth map and completes the depth-value correction of the depth-reliable pixel region of the initialized depth map based on the influence factors; and obtains a high-resolution depth map.

Description

Depth image up-sampling method and system based on depth edge point and color image
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to a depth image upsampling method and system based on depth edge points and color images.
Background
The statements in this section merely provide background information related to the present disclosure and may not constitute prior art.
As 3DTV and 3D movies enter daily life ever more widely, high-quality visual perception has become an expectation of consumers, and the visual-quality requirements that people place on images and videos keep rising. A 3DTV system must simultaneously take as input 2D color video and 2D depth data of the same scene. The depth data describes the positions of objects in the scene, and stereoscopic display technology uses it to present 3D stereo vision to the viewer. Depth data is therefore an essential input to 3DTV systems, and the acquisition of high-quality depth information is of great interest.
Depth information can be obtained in a direct manner or an indirect manner. The hardware used for direct depth capture has severe limitations: for example, it cannot effectively suppress noise interference, and it is expensive, so it cannot satisfy the consumer demand for directly acquired depth information. Indirect acquisition of depth information, namely the depth up-sampling algorithm, has therefore become an increasingly active research direction.
In the course of implementing the present disclosure, the inventors found that the following technical problems exist in the prior art:
In recent years, depth up-sampling algorithms have attracted wide attention from researchers at home and abroad. Kopf et al. proposed the Joint Bilateral Upsampling (JBU) algorithm based on bilateral filtering, which ignores the problem of mismatch between the two images of an image pair. Yang et al. proposed joint bilateral filtering with a depth hypothesis to refine the output high-resolution depth map. Liu et al. proposed a weighted analysis representation model for guided depth image enhancement, which dynamically adjusts the guidance to update the depth image. Other work applies adaptive autoregressive models guided by the high-resolution color image. These color-guided methods, however, rely heavily on the assumption that depth discontinuities coincide with color edges; where that assumption fails, the up-sampled depth map suffers from texture-copying artifacts and blurred depth edges.
Disclosure of Invention
In order to address the deficiencies of the prior art, the present disclosure provides a depth image up-sampling method and system based on depth edge points and a color image, which yield a high-resolution depth image with enhanced depth-discontinuity regions and a clear edge structure.
In a first aspect, the present disclosure provides a depth image upsampling method based on depth edge points and color images;
the depth image up-sampling method based on the depth marginal point and color image guidance comprises the following steps:
acquiring a low-resolution depth map and a high-resolution color map in the same scene; carrying out edge detection on the low-resolution depth map to obtain a low-resolution depth edge map; dividing a pixel region of the low-resolution depth map into a flat region and an edge region according to the low-resolution depth edge map; marking unreliable pixel points of the low-resolution depth map based on the flat region and the edge region; correcting unreliable pixel points of the low-resolution depth map to obtain an edge-enhanced low-resolution depth map;
initializing the edge-enhanced low-resolution depth map to obtain an initialized depth map; carrying out structural consistency judgment on the initialized depth map and the high-resolution color map, finishing the classification of pixel points in the initialized depth map, and obtaining a depth reliable pixel area and a depth unreliable pixel area;
searching real edge pixel points of a depth-reliable pixel area from the initialized depth map according to a high-resolution gradient matrix corresponding to the high-resolution color image; mapping real edge pixel points in the initialized depth map into an edge-enhanced low-resolution depth map, and setting influence factors generated due to space position constraint on pixel points at different positions in a pixel block;
completing the depth value correction of the depth reliable pixel area of the initialized depth map based on the influence factor; completing the correction of the depth value of the depth unreliable pixel area of the initialized depth map; and obtaining a corrected high-resolution depth map.
In a second aspect, the present disclosure also provides a depth image upsampling system based on depth edge points and color images;
depth image upsampling system based on depth edge points and color image guidance, comprising:
an acquisition module configured to: acquiring a low-resolution depth map and a high-resolution color map in the same scene; carrying out edge detection on the low-resolution depth map to obtain a low-resolution depth edge map; dividing a pixel region of the low-resolution depth map into a flat region and an edge region according to the low-resolution depth edge map; marking unreliable pixel points of the low-resolution depth map based on the flat region and the edge region; correcting unreliable pixel points of the low-resolution depth map to obtain an edge-enhanced low-resolution depth map;
a determination module configured to: initializing the edge-enhanced low-resolution depth map to obtain an initialized depth map; carrying out structural consistency judgment on the initialized depth map and the high-resolution color map, finishing the classification of pixel points in the initialized depth map, and obtaining a depth reliable pixel area and a depth unreliable pixel area;
a setup module configured to: according to a high-resolution gradient matrix corresponding to the high-resolution color image, searching real edge pixel points of a depth-reliable pixel region from the initialized depth map; mapping real edge pixel points in the initialized depth map into an edge-enhanced low-resolution depth map, and setting influence factors generated by space position constraint on pixel points at different positions in a pixel block;
a correction module configured to: completing the depth value correction of the depth reliable pixel area of the initialized depth map based on the influence factor; completing the correction of the depth value of the depth unreliable pixel area of the initialized depth map; and obtaining a corrected high-resolution depth map.
In a third aspect, the present disclosure also provides an electronic device comprising a memory and a processor, and computer instructions stored on the memory and executed on the processor, wherein the computer instructions, when executed by the processor, perform the steps of the method of the first aspect.
In a fourth aspect, the present disclosure also provides a computer-readable storage medium for storing computer instructions which, when executed by a processor, perform the steps of the method of the first aspect.
Compared with the prior art, the beneficial effect of this disclosure is:
1. Because a low-resolution depth image may be of low quality, it can contain pixels with wrong depth values as well as holes; if it were used directly for depth up-sampling, even more erroneous depth pixels would appear in the high-resolution depth image. The present disclosure therefore first marks the unreliable pixel points of the low-resolution depth image and corrects them, ensuring the accuracy of the depths in the low-resolution depth image.
2. The scene seen by a person's two eyes divides into near and far parts, and the views of the left and right eyes differ considerably. The sense of depth in the front-back direction and the scene difference in the left-right direction are reflected in the 8-neighborhood of any pixel in a depth map, where large changes between neighboring depth values reveal differences in object position. The method therefore takes a 3 × 3 pixel block as the unit and combines the edge-point distributions of the high-resolution depth map, the low-resolution depth map, and the gradient map of the color image to judge the structural consistency of the edge regions in depth space and color space. Pixels in edge regions are divided into two classes, valid pixels and unreliable pixels, and the regions correspondingly into valid depth regions and unreliable depth regions. This effectively avoids the artifacts that structural inconsistency between the depth map and the color map would otherwise produce in the output depth map.
3. Considering that correlated pixels at different positions exert different influences on one another, the neighborhood pixels in the 8 directions around a central pixel can be used to constrain it. An influence factor expressing this spatial-position constraint is therefore set for the interaction between pixels, and the influence factors generated by the spatial-position constraints within a 3 × 3 pixel block are used to complete the depth correction of the valid depth region, which effectively improves depth accuracy.
4. Because pixels are correlated with one another, using only the pixel depth in the vertical direction would leave a single valid depth available for correction and would ignore the fact that pixels at different positions carry different regional characteristics. The range of usable valid pixels is therefore widened, and the mutual influence of the 8 neighborhood pixels around a central pixel is used to complete the correction of the depth value, which effectively improves depth accuracy.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate exemplary embodiments of the application and, together with the description, serve to explain the application and are not intended to limit the application.
FIG. 1 is a general flow chart of an implementation of the present disclosure;
FIG. 2 is a graph of the relationship of pixel points between different images for determining the structural consistency of a depth map and a color map in the present disclosure;
FIG. 3(a) is a high resolution color map of an Art image;
FIG. 3(b) is a high resolution depth map of an Art image;
FIG. 3(c) is a high resolution color map of a Reindeer image;
FIG. 3(d) is a high resolution depth map of a Reindeer image;
fig. 4(a) is a high-resolution depth image obtained by up-sampling 4 times by the Bicubic method;
fig. 4(b) is a high-resolution depth image obtained by up-sampling 4 times by the JBU method;
fig. 4(c) is a high resolution depth image obtained by 4 times up-sampling by the TGV method;
fig. 4(d) is a high resolution depth image obtained by 4 times up-sampling by the EGDU method;
FIG. 4(e) is a high resolution depth image obtained by 4-fold upsampling by the method of the present disclosure;
fig. 5(a) is a high-resolution depth image obtained by up-sampling 4 times by the Bicubic method;
fig. 5(b) is a high-resolution depth image obtained by up-sampling 4 times by the JBU method;
fig. 5(c) is a high resolution depth image obtained by 4 times up-sampling by the TGV method;
fig. 5(d) is a high resolution depth image obtained by 4 times up-sampling by the EGDU method;
fig. 5(e) is a high resolution depth image obtained by 4-fold upsampling by the method of the present disclosure.
Detailed Description
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments according to the present application. As used herein, the singular is intended to include the plural unless the context clearly dictates otherwise, and it should be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of features, steps, operations, devices, components, and/or combinations thereof.
In the first embodiment, the present embodiment provides a depth image upsampling method based on depth edge points and color images;
as shown in fig. 1, the depth image upsampling method based on depth edge point and color image guide includes:
s100: acquiring a low-resolution depth map and a high-resolution color map in the same scene;
s101: carrying out edge detection on the low-resolution depth map to obtain a low-resolution depth edge map;
s102: dividing a pixel area of the low-resolution depth map into a flat area and an edge area according to the low-resolution depth edge map;
s103: marking unreliable pixel points of the low-resolution depth map based on the flat region and the edge region;
s104: correcting unreliable pixel points of the low-resolution depth map to obtain an edge-enhanced low-resolution depth map;
s105: initializing the edge-enhanced low-resolution depth map to obtain an initialized depth map;
s106: carrying out structural consistency judgment on the initialized depth map and the high-resolution color map, and finishing classification of pixel points in the initialized depth map to obtain a depth reliable pixel area and a depth unreliable pixel area;
s107: according to a high-resolution gradient matrix corresponding to the high-resolution color image, searching real edge pixel points of a depth-reliable pixel region from the initialized depth map;
s108: mapping real edge pixel points in the initialized depth map into an edge-enhanced low-resolution depth map, and setting influence factors generated by space position constraint on pixel points at different positions in a pixel block;
s109: completing the depth value correction of the depth reliable pixel area of the initialized depth map based on the influence factor; completing the correction of the depth value of the depth unreliable pixel area of the initialized depth map; and obtaining the corrected high-resolution depth map.
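As a concrete illustration, the front half of the pipeline (S100-S105) can be sketched in Python as below. The function names, the edge threshold, and the median-filter stand-in for the patent's region-aware hole correction are all assumptions made for this sketch, not the patent's exact procedure; steps S106-S109 are omitted here.

```python
import numpy as np
from scipy import ndimage

def upsample_depth(depth_lr, factor, edge_thresh=40.0):
    """Hedged sketch of steps S100-S105 (thresholds are assumptions)."""
    d = depth_lr.astype(float)
    # S101: Sobel edge detection on the low-resolution depth map
    mag = np.hypot(ndimage.sobel(d, axis=1), ndimage.sobel(d, axis=0))
    edge_lr = mag > edge_thresh                  # S102: edge vs. flat region
    # S103: mark unreliable pixels (zero-depth holes as a minimal proxy)
    unreliable = d == 0
    # S104: correct unreliable pixels from their neighbourhood (median filter
    # as a stand-in for the patent's region-aware bicubic/mean filling)
    med = ndimage.median_filter(d, size=3)
    d[unreliable] = med[unreliable]
    # S105: bicubic initialisation to the target resolution (order=3 = bicubic)
    return ndimage.zoom(d, factor, order=3), edge_lr
```

With a synthetic step-shaped depth map, the returned initialized depth map has `factor` times the resolution and the edge mask fires only at the depth discontinuity.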
In one or more embodiments, in S100, the low-resolution depth map is obtained by down-sampling the real 1110 × 1370 high-resolution depth map by a sampling factor of 2, 4, or 8.
The high-resolution color image is a high-resolution color image corresponding to the real high-resolution depth image and has the size of 1110 × 1370.
As one or more embodiments, in S101, performing edge detection on the low-resolution depth map to obtain a low-resolution depth edge map; the method comprises the following specific steps:
and (4) extracting edge points of the low-resolution depth map by using a Sobel operator to obtain the low-resolution depth edge map.
It will be appreciated that edge points are extracted from the low-resolution depth map D_L with the Sobel operator to obtain the low-resolution depth edge map (its symbol is given as an equation image in the original).
As one or more embodiments, in S102, the pixel region of the low-resolution depth map is divided into a flat region and an edge region according to the low-resolution depth edge map; the specific steps are as follows: the Sobel edge points of the low-resolution depth map are extracted; the extracted edge points constitute the edge region, and the points not extracted constitute the flat region.
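Steps S101-S102 can be sketched as follows; the gradient-magnitude threshold is an assumption of this sketch, since the patent does not state its value.

```python
import numpy as np
from scipy import ndimage

def depth_edge_regions(depth_lr, thresh=40.0):
    """Split the low-resolution depth map into edge and flat regions
    using the Sobel gradient magnitude (threshold is assumed)."""
    d = depth_lr.astype(float)
    gx = ndimage.sobel(d, axis=1)   # horizontal derivative
    gy = ndimage.sobel(d, axis=0)   # vertical derivative
    mag = np.hypot(gx, gy)
    edge = mag > thresh             # edge region: extracted Sobel edge points
    flat = ~edge                    # flat region: all remaining points
    return edge, flat
```

On a synthetic two-level depth map the boundary columns land in the edge region and the interiors in the flat region.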
As one or more embodiments, in S103, based on the flat region and the edge region, the unreliable pixel points of the low-resolution depth map are marked; the method comprises the following specific steps:
firstly, marking the pixel point with the depth value of 0 as an unreliable pixel point;
secondly, 3 × 3 image blocks are taken in the low-resolution depth map D_L. When a block lies in a flat region, if the number of neighborhood pixels whose depth value differs from that of the central pixel by at least 3 exceeds t_1 (here t_1 is set to 3), the central pixel is marked as an unreliable pixel;
when the block lies in an edge region, the same rule applies: if the number of neighborhood pixels whose depth value differs from that of the central pixel by at least 3 exceeds t_1 (t_1 = 3), the central pixel is marked as an unreliable pixel;
when the block straddles both a flat region and an edge region, the low-resolution depth edge map is used to identify, pixel by pixel, the edge-region part of the block. Each edge-region pixel of the block is compared with its neighbors inside the edge region, and if the number of neighbors whose depth value differs by at least 3 exceeds t_2 (here t_2 is set to 2), it is marked as an unreliable pixel; likewise, each flat-region pixel of the block is compared with its neighbors inside the flat region, and if the number of neighbors whose depth value differs by at least 3 exceeds t_2 (t_2 = 2), it is marked as an unreliable pixel.
In the above manner, the marking of unreliable pixel points is completed in sequence over the whole low-resolution depth map D_L.
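The marking rule above can be sketched as follows. This sketch implements only the hole rule and the t_1 rule for a block lying in a single region; the additional t_2 rule for blocks straddling both regions is omitted, and the function name is an assumption.

```python
import numpy as np

def mark_unreliable(depth_lr, diff_thresh=3.0, t1=3):
    """Mark zero-depth holes, plus centre pixels of 3x3 blocks where more
    than t1 neighbours differ from the centre by >= diff_thresh."""
    d = depth_lr.astype(float)
    unreliable = d == 0                     # holes are always unreliable
    h, w = d.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            block = d[y - 1:y + 2, x - 1:x + 2]
            # count neighbours whose depth differs from the centre enough;
            # the centre itself contributes 0 to the count
            n_diff = int((np.abs(block - d[y, x]) >= diff_thresh).sum())
            if n_diff > t1:
                unreliable[y, x] = True
    return unreliable
```

A single depth outlier in a uniform map is marked (all 8 neighbours differ), while its own neighbours are not (only 1 differing neighbour each).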
It should be understood that the depth image D_L contains pixels with missing or wrong depth values, and the interpolation process would generate still more erroneous pixels, leaving the interpolated image with blurred edges and an obvious sawtooth effect. The unreliable pixels are therefore marked first.
As one or more embodiments, in S104, the unreliable pixel point of the low-resolution depth map is corrected to obtain an edge-enhanced low-resolution depth map; the method comprises the following specific steps:
for unreliable pixel points in a flat area or an edge area, filling the depth value by utilizing bicubic interpolation of 8 neighborhood reliable pixel points according to the edge distribution of a low-resolution depth edge map;
and for the pixel points of which the 8 neighborhoods are not in the flat area or the edge area at the same time, filling the pixel points by using the average value of the depth values of the adjacent reliable pixel points in the corresponding area to obtain an edge-enhanced low-resolution depth map.
It should be understood that the result is an edge-enhanced low-resolution depth map with complete depth information (its symbol is given as an equation image in the original).
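A minimal stand-in for the filling step in S104 is sketched below: each unreliable pixel takes the mean of the reliable pixels in its 3 × 3 neighbourhood. The patent uses region-aware bicubic interpolation and per-region means; this mean-only version is a simplification, and the function name is an assumption.

```python
import numpy as np

def fill_unreliable(depth_lr, unreliable):
    """Fill each unreliable pixel with the mean depth of the reliable
    pixels in its 3x3 neighbourhood (simplified version of S104)."""
    filled = depth_lr.astype(float).copy()
    h, w = filled.shape
    for y, x in zip(*np.nonzero(unreliable)):
        ys = slice(max(0, y - 1), min(h, y + 2))
        xs = slice(max(0, x - 1), min(w, x + 2))
        nb = filled[ys, xs][~unreliable[ys, xs]]   # reliable neighbours only
        if nb.size:
            filled[y, x] = nb.mean()
    return filled
```

A hole inside a constant-depth region is filled with exactly that depth, and reliable pixels are left untouched.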
As one or more embodiments, in S105, initializing the edge-enhanced low-resolution depth map to obtain an initialized depth map; the method comprises the following specific steps:
and carrying out bicubic interpolation on the edge-enhanced low-resolution depth map to obtain an initialized depth map.
It should be appreciated that, because the depth image up-sampling method based on depth edge points and color-image guidance corrects a pseudo (initialized) high-resolution depth matrix, the low-resolution depth map must first be initialized.
As one or more embodiments, in S106, the structural consistency of the initialized depth map and the high-resolution color map is determined, and classification of the pixels in the initialized depth map is completed to obtain a depth-reliable pixel region and a depth-unreliable pixel region; the method comprises the following specific steps:
firstly, selecting a pixel block A of 3 × 3 with a certain pixel point O as the center in a high-resolution depth space, and respectively determining the belonging area of each pixel point in the pixel block A, wherein the belonging area comprises an edge area and a flat area;
secondly, finding a pixel block B corresponding to the pixel block A in the high-resolution gradient space, similarly determining regions where 9 pixel points of the pixel block B are located, performing consistency judgment on the region characteristics of the pixel points in the two pixel blocks, and judging as a depth-reliable pixel point if the region characteristics of the pixel points in the two pixel blocks are the same; otherwise, projecting the pixel points with inconsistent area characteristics in the two pixel blocks into the low-resolution depth map, and judging.
The criterion for the second judgment is as follows: suppose pixel block A is the 3 × 3 pixel block centered on pixel (x_h, j); a pixel that satisfies the 8 conditions of formula (1) is a depth-reliable pixel, and otherwise it is a depth-unreliable pixel.
(Formula (1) is given as an equation image in the original.)
where T = {T_i | 1 ≤ i ≤ 8} is a set of experimentally chosen thresholds, Q_i (1 ≤ i ≤ 8) denotes the 8 conditions to be satisfied, and formula (1) involves the depth of the central pixel (x_h, j) of block A mapped into the depth-corrected low-resolution depth map together with its 8 neighborhood pixels (these symbols are given as equation images in the original).
Judging the structural consistency of the initialized depth map and the high-resolution color map completes the classification of the pixel points in the initialized depth map D_0; the regions of D_0 are correspondingly divided into depth-reliable pixel regions and depth-unreliable pixel regions.
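The block-level consistency test of S106 can be sketched as a comparison of edge/flat labels between a 3 × 3 block in the depth space and the corresponding block in the color-gradient space. Formula (1)'s eight per-neighbour threshold conditions are not reproduced here (they are equation images in the original), so this all-nine-labels-agree test is a simplified reading.

```python
import numpy as np

def consistent_block(depth_labels, gradient_labels, y, x):
    """Return True when the edge/flat labels of pixel block A (3x3 in the
    depth space, centred at (y, x)) match those of the corresponding
    block B in the colour-gradient space, i.e. the centre pixel is
    judged depth-reliable."""
    a = depth_labels[y - 1:y + 2, x - 1:x + 2]      # block A, depth space
    b = gradient_labels[y - 1:y + 2, x - 1:x + 2]   # block B, gradient space
    return bool(np.array_equal(a, b))
```

Identical label blocks pass the test; flipping a single label in block B fails it, sending the centre pixel to the depth-unreliable class.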
It will be appreciated that, in the high-resolution space, part of the image is selected for illustration, as shown in FIG. 2. Assume the imaged pixels occupy rows x_1 to x_m and columns j-1 to j+1; any such pixel is expressed by the formula:
P = {(x, y) | x_1 ≤ x ≤ x_m, j-1 ≤ y ≤ j+1}   (2)
where pixels (x_h, j) and (x_{m-1}, j+1) have the maximum gradient values in columns j and j+1 respectively; these pixels project into the low-resolution depth space (the symbols of the projected points are given as equation images in the original).
it should be understood that the consistency determination is in units of 3 × 3 pixel blocks, initializing the depth map to D0
As one or more embodiments, in S107, according to a high-resolution gradient matrix corresponding to the high-resolution color image, a true edge pixel point of a depth-reliable pixel region is found from the initialized depth map; the method comprises the following specific steps:
in the initialized depth map D_0, an edge region with consistent structure generally contains at least two valid depth values, and the high-resolution gradient matrix G corresponding to the high-resolution color image is used to determine the true edge pixel points;
in the high-resolution gradient space, assume that pixel (x_h, y) has the largest gradient value; it is a true edge point or a pixel in the neighborhood of the true edge point;
the set of effective depth edge pixels is:
Ω_G = {(x, y) | |G(x, y) - G(x_h, y)| < T_G, (x, y) ∈ P}   (3)
where P denotes the set of pixels satisfying formula (2), G(x, y) denotes the gradient value of the color map's gradient matrix at coordinates (x, y), G(x_h, y) denotes the gradient at coordinates (x_h, y), and T_G is the threshold on the absolute difference between the gradient of any pixel in the set P and G(x_h, y).
The difference between the pixel value of each pixel (x, y) in the set and the pixel values of its neighborhood is calculated; the abscissa of the pixel with the largest such difference is denoted x_e. The formula is as follows:
(Formula (4) is given as an equation image in the original.)
where the abscissa of the true edge pixel is defined as x_f and is determined by x_e and x_h, and the ordinate of the true edge pixel is y; the value ranges of x and y are determined by the set Ω_G given by formula (3), and I(x, y) denotes the pixel value at coordinates (x, y) in the high-resolution color map.
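The true-edge search of S107 can be sketched for one column as below. Equation (3) is implemented directly; since the original formula (4) is only an equation image, the "largest intensity jump to a vertical neighbour" argmax used here is an assumed reading, and the function name is an assumption.

```python
import numpy as np

def true_edge_row(G, I, y, x_h, T_G=10.0):
    """Within column y, keep the rows whose gradient is within T_G of the
    maximal-gradient row x_h (the set Omega_G, eq. (3)); among them return
    x_e, the row with the largest intensity jump to a vertical neighbour."""
    rows = np.arange(G.shape[0])
    omega = rows[np.abs(G[:, y] - G[x_h, y]) < T_G]   # eq. (3)
    best, x_e = -1.0, x_h
    for x in omega:
        lo, hi = max(0, x - 1), min(G.shape[0] - 1, x + 1)
        jump = max(abs(I[x, y] - I[lo, y]), abs(I[x, y] - I[hi, y]))
        if jump > best:
            best, x_e = jump, x
    return x_e
```

With a synthetic column whose gradient peaks at rows 2 and 3 and whose intensity steps between them, the search returns the first row of the step.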
As one or more embodiments, in S108, mapping real edge pixel points in the initialized depth map to the edge-enhanced low-resolution depth map, and setting influence factors generated due to spatial position constraints on pixel points at different positions in a pixel block; the method comprises the following specific steps:
first, the true edge pixel (x_f, y) is mapped into the low-resolution depth space (the mapped coordinates are given as an equation image in the original), and the 3 × 3 pixel block centered on the mapped pixel is selected;
secondly, setting influence factors generated by space position constraint on pixel points at different positions in a pixel block by utilizing the correlation among pixels;
wherein, the setting of the influence factor is as follows:
assuming a low resolution depth image
Figure BDA0002446566310000131
Initialized depth map D with size of l × n and magnification of K times0Has a size of L× N, i.e.
Figure BDA0002446566310000132
Wherein the low resolution depth image
Figure BDA0002446566310000139
Coordinates (x) of any pixel point in the spaceL,yL) And the initialized depth map D0The coordinate mapping formula between the coordinates (x, y) of any pixel point is as follows:
Figure BDA0002446566310000133
Figure BDA0002446566310000134
The depth values of pixel (x_L, y_L) in the low-resolution depth image and of the 8 pixels nearest to (x_L, y_L) are used to calculate the parameters of the depth value at position (x, y) in the initialized depth map D_0. The influence factors α of the 9 pixels in the 3 × 3 pixel block are obtained with a spatial-constraint function, and the depth value at position (x, y) of D_0 is then computed. In accordance with the two-dimensional character of pixels, the spatial-position constraint function is computed separately over rows and columns; the spatial constraint function in the x direction is:
(The x-direction spatial constraint function is given as an equation image in the original.)
where C_i denotes the influence factor generated in the x direction, m1 takes the value 3 or 4, m2 takes the value 1 or 2, i = 1, 2, 3, and r_2 denotes the transverse distance difference in the x direction.
C_j is obtained analogously:

[formula rendered as an image in the source]

where C_j denotes the influence factor generated in the y direction, j = 1, 2, 3, m1 takes the value 3 or 4, m2 takes the value 1 or 2, and r_1 denotes the longitudinal distance difference in the y direction.
The transverse distance r_2 in the x direction and the longitudinal distance r_1 in the y direction are given respectively by:

[formulas rendered as images in the source]

where x and y denote the horizontal and vertical coordinates of a pixel point in the initialized depth map, and x_L, y_L denote respectively the horizontal and vertical coordinates, in the depth-corrected low-resolution depth map, of the point to which x and y are mapped; l and n denote the size of the depth-corrected low-resolution depth map, and K is the sampling factor of the depth up-sampling.
In the window region of 3 × 3, the influence factor α of a certain pixel is:
α_{i,j} = C_i(r_2) · C_j(r_1),  i = 1, 2, 3,  j = 1, 2, 3   (11)
where α_{i,j} denotes the influence factor at the position combining the x and y directions, C_i denotes the influence factor generated in the x direction, and C_j denotes the influence factor generated in the y direction; i and j index the positions of the 9 pixel points in the 3 × 3 window area and grow with x_L and y_L: i = 1, j = 1 denotes the position with coordinates (x_L − 1, y_L − 1), and i = 2, j = 2 denotes the position (x_L, y_L).
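The separable construction α_{i,j} = C_i(r_2)·C_j(r_1) of equation (11) can be sketched as follows; the patent's piecewise constraint functions C_i and C_j are rendered only as images in the source, so a generic Gaussian-style distance kernel stands in for them here:

```python
import numpy as np

def axis_factors(r):
    """1-D influence factors for offsets -1, 0, +1 around the mapped
    position. The true piecewise constraint function C is an image in
    the source, so a simple distance kernel exp(-d^2) over the
    fractional offset r is assumed here."""
    d = np.array([-1.0, 0.0, 1.0]) - r
    return np.exp(-d ** 2)

def influence_matrix(r1, r2):
    """alpha[i, j] = C_i(r2) * C_j(r1): separable product of the x- and
    y-direction factors, normalized to sum to 1."""
    alpha = np.outer(axis_factors(r2), axis_factors(r1))
    return alpha / alpha.sum()

alpha = influence_matrix(0.0, 0.0)
print(alpha.shape)                 # (3, 3)
print(alpha[1, 1] == alpha.max())  # True: the center weight dominates
```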
In S109, the depth value correction of the depth-reliable pixel region of the initialized depth map is performed based on the influence factors; the specific steps are as follows:
the formula for completing the depth correction of the depth-reliable pixel region by using the real edge points and the effective depth mapped to the low-resolution space is as follows:
[depth-correction formula rendered as an image in the source]

where D_H(x, y) denotes the depth value at (x, y) in the finally output high-resolution depth map; α_{i,j} denotes the influence factor set for the combined position (i, j) in the x and y directions, with i, j = 1, 2, 3; s_1, t_1, s_2, t_2 are integers; and the remaining terms (rendered as images in the source) are depth values taken from the depth-corrected low-resolution depth map. x_1, x_m and x_f denote respectively the abscissa of the first pixel point in a given column, the abscissa of the last pixel point in that column, and the abscissa of the true edge point; ∇_1 and ∇_m denote the gradient differences between the true edge point and, respectively, the first and last pixel points of the column.
In one or more embodiments, in S109, completing depth value correction of the depth unreliable pixel region of the initialized depth map; the method comprises the following specific steps:
Assume the erroneous pixel point in the initialized depth map and its projection point in the low-resolution space are (x, y) and (x_L, y_L), respectively. The set of pixels in the 8-neighborhood of the pixel point (x_L, y_L) that carry a valid depth value is:

[formula rendered as an image in the source]

where Ω is the set of valid depths, s and t are integers, and D_L(x_L + s, y_L + t) denotes the depth value of a pixel point in the 8-neighborhood of the pixel point (x_L, y_L).
Correcting unreliable pixel points:
[formula rendered as an image in the source]

where D_H(x, y) denotes the depth value at (x, y) in the finally output high-resolution depth map and d ranges over the depth values in the set Ω; the depth value whose difference from the depth value of the pixel point (x, y) in the initialized depth map D_0 is smallest is selected as the final depth value at (x, y) of the output high-resolution depth map D_H.
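The minimum-difference selection over the valid 8-neighborhood depths Ω can be sketched as follows; treating a zero depth as invalid is an assumption, and the names are illustrative:

```python
import numpy as np

def correct_unreliable(D0_val, depth_lr, xL, yL):
    """Pick, among the valid (non-zero) depths in the 8-neighborhood of
    (xL, yL) in the low-resolution map, the value closest to the
    initialized depth D0(x, y). Zero is assumed to mark an invalid depth."""
    omega = []
    for s in (-1, 0, 1):
        for t in (-1, 0, 1):
            if s == 0 and t == 0:
                continue                   # center pixel is excluded
            d = depth_lr[xL + s, yL + t]
            if d > 0:                      # keep valid depths only
                omega.append(d)
    return min(omega, key=lambda d: abs(d - D0_val))

depth_lr = np.array([[10., 0., 12.],
                     [ 9., 50., 11.],
                     [ 0., 13., 10.]])
print(correct_unreliable(11.4, depth_lr, 1, 1))  # 11.0
```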
First, the unreliable pixels of the low-resolution depth map are corrected. The edge map of the low-resolution depth image and the gradient map G of the high-resolution color image are obtained with the Sobel edge detection operator, and the high-resolution depth edge map is then obtained through the mapping relation between the low-resolution and high-resolution maps.
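The Sobel edge-extraction step can be sketched as follows; the binarization threshold is an assumption, as the patent does not state one:

```python
import numpy as np

def sobel_edges(depth, thresh=4.0):
    """Sobel gradient magnitude followed by a threshold, as used to
    obtain the depth edge map. The threshold value is an assumption."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = depth.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for x in range(1, h - 1):              # interior pixels only
        for y in range(1, w - 1):
            blk = depth[x - 1:x + 2, y - 1:y + 2]
            gx[x, y] = (kx * blk).sum()
            gy[x, y] = (ky * blk).sum()
    return np.hypot(gx, gy) > thresh

d = np.zeros((6, 6))
d[:, 3:] = 10.0                            # vertical step edge at column 3
edges = sobel_edges(d)
print(edges[2, 2], edges[2, 1])            # True False
```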
Secondly, by utilizing the correlation among pixels and taking a 3 × 3 pixel block as a unit, judging the structural consistency of the edge regions of the initialized depth map and the color map, dividing the pixel depth into a depth-reliable pixel point region and a depth-unreliable pixel point region, and finishing pixel point classification.
Finally, for pixel areas with consistent edge structures, setting influence factors on the existing effective depth by utilizing space position constraint to complete depth correction;
for pixel areas with inconsistent edge structures, effective depth is searched by using the pixel characteristics of a low-resolution depth space, and finally a high-quality high-resolution depth map D is outputH
Since the output high-resolution depth image D_H has the same size as the initialized depth map D_0, completing the correction of the initialized depth map D_0 outputs a high-quality high-resolution depth image with a clear edge structure.
Description of the experiments
1. Simulation conditions are as follows:
Simulations were performed on an Intel(R) Core(TM) i7-8700 CPU @ 3.20 GHz under the Windows 10 system, on the Matlab R2018a platform.
The present disclosure selects two test images for simulation, shown in fig. 3(a)-3(d): fig. 3(a) is the high-resolution color map of the Art image, fig. 3(b) the high-resolution depth map of the Art image, fig. 3(c) the high-resolution color map of the Reindeer image, and fig. 3(d) the high-resolution depth map of the Reindeer image.
Before the experiment begins, 2 times, 4 times and 8 times of downsampling processing are respectively carried out on the high-resolution depth image provided in the test set, and a low-resolution depth image to be upsampled is obtained.
2. The simulation method comprises the following steps:
① bicubic interpolation;
② the joint bilateral filtering up-sampling (JBU) method proposed by Kopf;
③ the TGV method proposed by Ferstl, which realizes depth up-sampling with an anisotropic diffusion tensor;
④ the depth up-sampling EGDU method proposed by Ren, based on corner-point and gradient assistance;
⑤ the depth-edge and color-image-guided depth up-sampling method of the present disclosure.
3. Simulation content:
Simulation 1: the Art and Reindeer images in fig. 3(a)-3(d) were up-sampled 4 times using Bicubic, JBU, TGV, EGDU and the method of the present disclosure, respectively; the results are shown in fig. 4(a)-4(e) and fig. 5(a)-5(e), where:
fig. 4(a) is a high-resolution depth image obtained by up-sampling 4 times by the Bicubic method; fig. 4(b) is a high-resolution depth image obtained by up-sampling 4 times by the JBU method; fig. 4(c) is a high resolution depth image obtained by 4 times up-sampling by the TGV method; fig. 4(d) is a high resolution depth image obtained by 4 times up-sampling by the EGDU method; FIG. 4(e) is a high resolution depth image obtained by 4-fold upsampling by the method of the present disclosure;
fig. 5(a) is a high-resolution depth image obtained by up-sampling 4 times by the Bicubic method; fig. 5(b) is a high-resolution depth image obtained by up-sampling 4 times by the JBU method; fig. 5(c) is a high resolution depth image obtained by 4 times up-sampling by the TGV method; fig. 5(d) is a high resolution depth image obtained by 4 times up-sampling by the EGDU method; FIG. 5(e) is a high resolution depth image obtained by 4-fold upsampling by the method of the present disclosure;
and (4) comparing the results:
The high-resolution depth images output by the Bicubic and JBU algorithms suffer from image blurring, expansion of depth-value-missing regions, local edge loss and similar problems; the experimental outputs of TGV and EGDU show that these two algorithms can neither effectively enhance detail edge regions nor effectively reconstruct depth-missing regions. As seen from fig. 4(e) and fig. 5(e), the images produced by the present method are clear and rich in detail: the method not only enhances edge details but also repairs the deep black holes in the low-resolution depth image, outputting an accurate high-resolution depth image with complete depth information. Comparing the depth maps output by the 5 methods, the images obtained by the first 4 methods show blurring and artifacts, and edge mixing also appears. In terms of subjective effect, the depth images generated by the present method are clearer, detail edge regions are effectively enhanced, and the reconstruction of depth-missing parts is completed, so the proposed algorithm performs better subjectively.
Simulation 2: the Art test image shown in fig. 3(a) was up-sampled by factors of 2, 4 and 8 using Bicubic, JBU, TGV, EGDU and the method of the present disclosure, and the experimental results were analyzed against the evaluation index.
1) BPR (bad pixel rate)
Data analysis was performed for this evaluation index, and the results are shown in table 1. From table 1 it can be seen that, for up-sampling factors from 2 to 8, the proposed algorithm always attains the minimum BPR value, which means its performance is stable and its experimental results are more reliable than those of the other algorithms. The smaller the value of the image-quality index BPR, the smaller the difference between the algorithm's output image and the true high-resolution depth map; the data in table 1 thus show the high effectiveness of the algorithm. The disclosed method not only brings a good subjective visual effect but also holds a very clear advantage on the objective BPR evaluation index.
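The BPR (bad pixel rate) index can be sketched as follows; the error threshold is an assumption, as the patent does not state the value used in its experiments:

```python
import numpy as np

def bad_pixel_rate(pred, gt, thresh=1.0):
    """BPR: fraction of pixels whose absolute depth error exceeds a
    threshold. The threshold used in the patent's experiments is not
    stated, so 1.0 is a placeholder."""
    err = np.abs(pred.astype(float) - gt.astype(float))
    return float(np.mean(err > thresh))

gt = np.zeros((4, 4))
pred = gt.copy()
pred[0, :] = 5.0                 # one bad row out of four
print(bad_pixel_rate(pred, gt))  # 0.25
```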
Table 1 BPR analysis of the up-sampling results on the low-resolution depth image of test image Art, for 5 typical methods and the method of the present disclosure

Art                        2×        4×        8×
Bilinear                   0.1911    0.3373    0.5282
Bicubic                    0.1870    0.3255    0.5276
JBU                        0.1677    0.3117    0.6113
TGV                        0.1577    0.1247    0.3096
EGDU                       0.0279    0.0314    0.0606
Proposed (this disclosure) 0.0164    0.0299    0.0411
In a second embodiment, a depth image up-sampling system based on depth edge points and color images is further provided;
depth image upsampling system based on depth edge points and color image guidance, comprising:
an acquisition module configured to: acquiring a low-resolution depth map and a high-resolution color map in the same scene; carrying out edge detection on the low-resolution depth map to obtain a low-resolution depth edge map; dividing a pixel region of the low-resolution depth map into a flat region and an edge region according to the low-resolution depth edge map; marking unreliable pixel points of the low-resolution depth map based on the flat region and the edge region; correcting unreliable pixel points of the low-resolution depth map to obtain an edge-enhanced low-resolution depth map;
a determination module configured to: initializing the edge-enhanced low-resolution depth map to obtain an initialized depth map; carrying out structural consistency judgment on the initialized depth map and the high-resolution color map, finishing the classification of pixel points in the initialized depth map, and obtaining a depth reliable pixel area and a depth unreliable pixel area;
a setup module configured to: according to a high-resolution gradient matrix corresponding to the high-resolution color image, searching real edge pixel points of a depth-reliable pixel region from the initialized depth map; mapping real edge pixel points in the initialized depth map into an edge-enhanced low-resolution depth map, and setting influence factors generated by space position constraint on pixel points at different positions in a pixel block;
a correction module configured to: completing the depth value correction of the depth reliable pixel area of the initialized depth map based on the influence factor; completing the correction of the depth value of the depth unreliable pixel area of the initialized depth map; and obtaining a corrected high-resolution depth map.
In a third embodiment, an electronic device is further provided, comprising a memory, a processor, and computer instructions stored in the memory and executable on the processor; the computer instructions, when executed by the processor, implement the steps of the method of the first embodiment.
In a fourth embodiment, a computer-readable storage medium is further provided for storing computer instructions; the computer instructions, when executed by a processor, perform the steps of the method of the first embodiment.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A depth image up-sampling method based on depth edge points and color image guidance, characterized by comprising the following steps:
acquiring a low-resolution depth map and a high-resolution color map in the same scene;
carrying out edge detection on the low-resolution depth map to obtain a low-resolution depth edge map; dividing a pixel area of the low-resolution depth map into a flat area and an edge area according to the low-resolution depth edge map; marking unreliable pixel points of the low-resolution depth map based on the flat region and the edge region; correcting unreliable pixel points of the low-resolution depth map to obtain an edge-enhanced low-resolution depth map;
initializing the edge-enhanced low-resolution depth map to obtain an initialized depth map; carrying out structural consistency judgment on the initialized depth map and the high-resolution color map, finishing the classification of pixel points in the initialized depth map, and obtaining a depth reliable pixel area and a depth unreliable pixel area;
according to a high-resolution gradient matrix corresponding to the high-resolution color image, searching real edge pixel points of a depth-reliable pixel region from the initialized depth map; mapping real edge pixel points in the initialized depth map into an edge-enhanced low-resolution depth map, and setting influence factors generated by space position constraint on pixel points at different positions in a pixel block;
completing the depth value correction of the depth reliable pixel area of the initialized depth map based on the influence factor; completing the correction of the depth value of the depth unreliable pixel area of the initialized depth map; and obtaining a corrected high-resolution depth map.
2. The method of claim 1, wherein edge detection is performed on the low resolution depth map to obtain a low resolution depth edge map; the method comprises the following specific steps:
and (4) extracting edge points of the low-resolution depth map by using a Sobel operator to obtain the low-resolution depth edge map.
3. The method of claim 1, wherein the low resolution depth edge map is divided into a flat region and an edge region; the method comprises the following specific steps: and extracting Sobel edge points of the low-resolution depth map, wherein the extracted edge points are edge regions, and the points which are not extracted are flat regions.
4. The method of claim 1, wherein unreliable pixels of the low resolution depth map are marked based on the flat region and the edge region; the method comprises the following specific steps:
firstly, marking the pixel point with the depth value of 0 as an unreliable pixel point;
secondly, for a 3 × 3 image block of the low-resolution depth map D_L: when the image block lies in a flat region, if the number of times the depth-value difference between the central pixel point and its neighborhood pixel points is not less than 3 exceeds t_1, with t_1 set to 3, the central pixel point is marked as an unreliable pixel point;
when the image block lies in an edge region, if the number of times the depth-value difference between the central pixel point and its neighborhood pixel points is not less than 3 exceeds t_1, with t_1 set to 3, the central pixel point is marked as an unreliable pixel point;
when the image block lies in both a flat region and an edge region, the low-resolution depth edge map is used to identify the edge-region part of the image block; pixel points in the edge-region part are compared with their adjacent edge-region pixel points, and if the number of times the depth-value difference is not less than 3 is greater than t_2, with t_2 set to 2, they are marked as unreliable pixel points; likewise, pixel points in the flat-region part are compared with their adjacent flat-region pixel points, and if the number of times the depth-value difference is not less than 3 is greater than t_2, with t_2 set to 2, they are marked as unreliable pixel points.
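The marking rule of this claim can be sketched as follows; for brevity the sketch applies the count threshold t_1 = 3 uniformly to interior pixels rather than distinguishing mixed flat/edge blocks:

```python
import numpy as np

def mark_unreliable(depth_lr, t1=3, diff=3):
    """Mark unreliable pixels: a zero depth is unreliable, and an
    interior pixel whose depth differs by at least `diff` from more
    than t1 of its 8 neighbors is unreliable. Mixed flat/edge blocks
    are not treated separately in this sketch."""
    h, w = depth_lr.shape
    bad = (depth_lr == 0)
    for x in range(1, h - 1):
        for y in range(1, w - 1):
            c = depth_lr[x, y]
            n = np.abs(depth_lr[x - 1:x + 2, y - 1:y + 2] - c) >= diff
            if n.sum() > t1:     # the center contributes 0 to the count
                bad[x, y] = True
    return bad

d = np.full((5, 5), 10.0)
d[2, 2] = 100.0                  # isolated outlier
d[0, 0] = 0.0                    # missing depth
bad = mark_unreliable(d)
print(bad[2, 2], bad[0, 0], bad[1, 1])  # True True False
```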
5. The method of claim 1, wherein unreliable pixels of the low resolution depth map are modified to obtain an edge enhanced low resolution depth map; the method comprises the following specific steps:
for unreliable pixel points in a flat area or an edge area, filling the depth value by utilizing bicubic interpolation of 8 neighborhood reliable pixel points according to the edge distribution of a low-resolution depth edge map;
and for the pixel points of which the 8 neighborhoods are not in the flat area or the edge area at the same time, filling the pixel points by using the average value of the depth values of the adjacent reliable pixel points in the corresponding area to obtain an edge-enhanced low-resolution depth map.
6. The method of claim 1, wherein initializing the edge-enhanced low resolution depth map results in an initialized depth map; the method comprises the following specific steps:
and carrying out bicubic interpolation on the edge-enhanced low-resolution depth map to obtain an initialized depth map.
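The initialization step can be sketched as follows; separable linear interpolation stands in here for the bicubic interpolation named in the claim, to keep the sketch short:

```python
import numpy as np

def upsample(img, K):
    """Integer-factor up-sampling by separable linear interpolation.
    Bilinear stands in for the bicubic interpolation named in the
    claim; the interpolation structure is otherwise the same."""
    h, w = img.shape
    ys = np.linspace(0, h - 1, h * K)          # sample rows in input grid
    xs = np.linspace(0, w - 1, w * K)          # sample cols in input grid
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 2)
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 2)
    fy = (ys - y0)[:, None]                    # fractional row offsets
    fx = (xs - x0)[None, :]                    # fractional col offsets
    a = img[np.ix_(y0, x0)]
    b = img[np.ix_(y0, x0 + 1)]
    c = img[np.ix_(y0 + 1, x0)]
    d = img[np.ix_(y0 + 1, x0 + 1)]
    return (a * (1 - fy) * (1 - fx) + b * (1 - fy) * fx
            + c * fy * (1 - fx) + d * fy * fx)

d_lr = np.arange(16, dtype=float).reshape(4, 4)
d0 = upsample(d_lr, 2)
print(d0.shape)   # (8, 8)
```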
7. The method as claimed in claim 1, wherein the structural consistency judgment is performed on the initialized depth map and the high-resolution color map, and the classification of pixel points in the initialized depth map is completed to obtain a depth-reliable pixel region and a depth-unreliable pixel region; the method comprises the following specific steps:
firstly, selecting a pixel block A of 3 × 3 with a certain pixel point O as the center in a high-resolution depth space, and respectively determining the belonging area of each pixel point in the pixel block A, wherein the belonging area comprises an edge area and a flat area;
secondly, finding a pixel block B corresponding to the pixel block A in the high-resolution gradient space, similarly determining regions where 9 pixel points of the pixel block B are located, performing consistency judgment on the region characteristics of the pixel points in the two pixel blocks, and judging as a depth-reliable pixel point if the region characteristics of the pixel points in the two pixel blocks are the same; otherwise, projecting the pixel points with inconsistent area characteristics in the two pixel blocks into the low-resolution depth map, and judging.
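The blockwise consistency judgment can be sketched as follows, assuming binary edge/flat labels for both maps; the names are illustrative:

```python
import numpy as np

def reliable_mask(depth_edge, color_edge):
    """Per-pixel structural consistency between binary edge maps of the
    initialized depth map and the high-resolution color gradient map,
    checked on 3x3 blocks; interior pixels only in this sketch."""
    h, w = depth_edge.shape
    mask = np.zeros((h, w), dtype=bool)
    for x in range(1, h - 1):
        for y in range(1, w - 1):
            mask[x, y] = np.array_equal(
                depth_edge[x - 1:x + 2, y - 1:y + 2],
                color_edge[x - 1:x + 2, y - 1:y + 2])
    return mask

depth_edge = np.zeros((5, 5), dtype=bool)
depth_edge[:, 2] = True                 # a vertical edge
color_edge = depth_edge.copy()
color_edge[4, 2] = False                # one disagreeing label
mask = reliable_mask(depth_edge, color_edge)
print(mask[1, 1], mask[3, 2])           # True False
```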
8. A depth image up-sampling system based on depth edge points and color image guidance, characterized by comprising:
an acquisition module configured to: acquiring a low-resolution depth map and a high-resolution color map in the same scene; carrying out edge detection on the low-resolution depth map to obtain a low-resolution depth edge map; dividing a pixel area of the low-resolution depth map into a flat area and an edge area according to the low-resolution depth edge map; marking unreliable pixel points of the low-resolution depth map based on the flat region and the edge region; correcting unreliable pixel points of the low-resolution depth map to obtain an edge-enhanced low-resolution depth map;
a determination module configured to: initializing the edge-enhanced low-resolution depth map to obtain an initialized depth map; carrying out structural consistency judgment on the initialized depth map and the high-resolution color map, finishing the classification of pixel points in the initialized depth map, and obtaining a depth reliable pixel area and a depth unreliable pixel area;
a setup module configured to: according to a high-resolution gradient matrix corresponding to the high-resolution color image, searching real edge pixel points of a depth-reliable pixel region from the initialized depth map; mapping real edge pixel points in the initialized depth map into an edge-enhanced low-resolution depth map, and setting influence factors generated by space position constraint on pixel points at different positions in a pixel block;
a correction module configured to: completing the depth value correction of the depth reliable pixel area of the initialized depth map based on the influence factor; completing the correction of the depth value of the depth unreliable pixel area of the initialized depth map; and obtaining a corrected high-resolution depth map.
9. An electronic device comprising a memory and a processor and computer instructions stored on the memory and executable on the processor, the computer instructions when executed by the processor performing the method of any of claims 1-7.
10. A computer-readable storage medium storing computer instructions which, when executed by a processor, perform the method of any one of claims 1 to 7.
CN202010280991.9A 2020-04-10 2020-04-10 Depth image up-sampling method and system based on depth marginal point and color image Expired - Fee Related CN111489383B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010280991.9A CN111489383B (en) 2020-04-10 2020-04-10 Depth image up-sampling method and system based on depth marginal point and color image

Publications (2)

Publication Number Publication Date
CN111489383A true CN111489383A (en) 2020-08-04
CN111489383B CN111489383B (en) 2022-06-10

Family

ID=71798244

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010280991.9A Expired - Fee Related CN111489383B (en) 2020-04-10 2020-04-10 Depth image up-sampling method and system based on depth marginal point and color image

Country Status (1)

Country Link
CN (1) CN111489383B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102609977A (en) * 2012-01-12 2012-07-25 浙江大学 Depth integration and curved-surface evolution based multi-viewpoint three-dimensional reconstruction method
CN103854257A (en) * 2012-12-07 2014-06-11 山东财经大学 Depth image enhancement method based on self-adaptation trilateral filtering
US20140219547A1 (en) * 2013-02-01 2014-08-07 Mitsubishi Electric Research Laboratories, Inc Method for Increasing Resolutions of Depth Images
WO2016193393A1 (en) * 2015-06-05 2016-12-08 Université Du Luxembourg Real-time temporal filtering and super-resolution of depth image sequences
CN106651938A (en) * 2017-01-17 2017-05-10 湖南优象科技有限公司 Depth map enhancement method blending high-resolution color image
CN107563963A (en) * 2017-08-11 2018-01-09 北京航空航天大学 A kind of method based on individual depth map super-resolution rebuilding
CN107689050A (en) * 2017-08-15 2018-02-13 武汉科技大学 A kind of depth image top sampling method based on Color Image Edge guiding
CN108293136A (en) * 2015-09-23 2018-07-17 诺基亚技术有限公司 Method, apparatus and computer program product for encoding 360 degree of panoramic videos
CN110866882A (en) * 2019-11-21 2020-03-06 湖南工程学院 Layered joint bilateral filtering depth map restoration algorithm based on depth confidence
WO2020056769A1 (en) * 2018-09-21 2020-03-26 Intel Corporation Method and system of facial resolution upsampling for image processing

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
刘继忠等: "基于像素滤波和中值滤波的深度图像修复方法", 《光电子激光》 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112233191A (en) * 2020-09-18 2021-01-15 南京理工大学 Depth map colorizing method
CN113284081A (en) * 2021-07-20 2021-08-20 杭州小影创新科技股份有限公司 Depth map super-resolution optimization method and device, processing equipment and storage medium
CN113284081B (en) * 2021-07-20 2021-10-22 杭州小影创新科技股份有限公司 Depth map super-resolution optimization method and device, processing equipment and storage medium
CN116883255A (en) * 2023-05-22 2023-10-13 北京拙河科技有限公司 Boundary correction method and device for high-precision light field image
CN116883255B (en) * 2023-05-22 2024-05-24 北京拙河科技有限公司 Boundary correction method and device for high-precision light field image

Also Published As

Publication number Publication date
CN111489383B (en) 2022-06-10

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20220610