CN107292822B - Image splicing method and device

Publication number: CN107292822B
Application number: CN201710494989.XA
Authority: CN (China)
Legal status: Active (granted)
Other versions: CN107292822A (application)
Language: Chinese (zh)
Inventors: 王琳, 王西颖
Current assignee: Nanjing Qiyuan Technology Co., Ltd.
Original assignee / applicant: Beijing QIYI Century Science and Technology Co., Ltd.
Priority: CN201710494989.XA
Publication of application: CN107292822A
Publication of grant: CN107292822B

Classifications

    • G06T 3/4038: Geometric image transformation in the plane of the image; scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
    • G06T 7/33: Image analysis; determination of transform parameters for the alignment of images (image registration) using feature-based methods
    • G06T 2200/32: Indexing scheme for image data processing or generation involving image mosaicing

Abstract

The invention provides an image stitching method and device. A difference matrix is constructed from the offset values of the corresponding pixels in a first overlapping region and a second overlapping region, together with pre-computed color difference values and edge difference values of those pixels. The offset value is the main feature that distinguishes near objects from distant ones: the offset corresponding to a distant object is small, while the offset corresponding to a near object is large. When the two overlapping regions are stitched according to the difference matrix, a seam with small difference values is selected; small difference values imply small offset values, so the optimal seam passes through distant objects in the image rather than near ones, and the stitching looks natural.

Description

Image splicing method and device
Technical Field
The invention relates to the technical field of image processing, in particular to an image splicing method and device.
Background
With the popularization of intelligent display devices, panoramic photography with multi-view cameras has become an essential function of many such hardware devices.
In the prior art, when two images captured by a multi-view camera are to be stitched along their overlapping regions, the procedure is as follows. First, a gray value and an edge value (a Sobel or Canny edge response) are obtained for every pixel in the overlapping region of each image. The difference between the gray values of corresponding pixels in the two overlapping regions is taken as the color difference value, and the difference between their edge values as the edge difference value. A difference matrix is then constructed from these color and edge difference values, an optimal stitching seam is derived from the matrix, and the two images are stitched along it.
With this approach, when a near object lies in the overlapping region of the two images, the optimal seam usually cuts through it, and the stitched result looks unnatural.
A method is therefore needed that stitches images more naturally when a near object is present in the overlapping region of the two images.
Disclosure of Invention
In view of the above, the present invention provides an image stitching method and apparatus to solve the prior-art problem that stitching looks unnatural when a near object is present in the overlapping region of the two images.
In order to solve the technical problems, the invention adopts the following technical scheme:
an image stitching method comprises the following steps:
calculating the offset values of the corresponding pixel points in the first overlapping area and the second overlapping area;
constructing a difference matrix according to the offset values of the corresponding pixel points in the first overlapping area and the second overlapping area and the color difference value and the edge difference value of the corresponding pixel points in the first overlapping area and the second overlapping area which are obtained by pre-calculation;
and splicing the first overlapping area and the second overlapping area according to the difference matrix.
Preferably, calculating the offset values of the corresponding pixel points in the first overlapping area and the second overlapping area includes:
matching the plurality of first feature points extracted in the first overlapping area with the plurality of second feature points extracted in the second overlapping area to obtain a plurality of matched feature point pairs;
calculating the difference vector of each matched feature point pair according to the spatial position coordinates of the first feature point and of the second feature point in that pair;
matching the plurality of first preset feature points extracted from the first overlapping area with the plurality of second preset feature points extracted from the second overlapping area to obtain a plurality of preset matched feature point pairs;
and calculating the offset values of the corresponding pixels in the first overlapping area and the second overlapping area according to the difference vector of each matched feature point pair and the preset difference vector of each preset matched feature point pair.
Preferably, before matching the plurality of first feature points extracted in the first overlapping area with the plurality of second feature points extracted in the second overlapping area to obtain a plurality of matched feature point pairs, the method further includes:
respectively acquiring the first overlapping area and the second overlapping area;
uniformly dividing the first overlap region and the second overlap region into n regions, respectively, wherein n is a positive integer and n > 1;
extracting a preset number of first feature points for each region of the first overlap region division and extracting the preset number of second feature points for each region of the second overlap region division.
Preferably, matching the plurality of first feature points extracted in the first overlapping region with the plurality of second feature points extracted in the second overlapping region to obtain a plurality of matched feature point pairs includes:
and performing SIFT feature matching on the plurality of first feature points extracted in the first overlapping region and the plurality of second feature points extracted in the second overlapping region by adopting a scale-invariant feature transform (SIFT) algorithm to obtain a plurality of matched feature point pairs.
Preferably, calculating the difference vector of each matched feature point pair according to the spatial position coordinates of the first feature point and of the second feature point in that pair includes:
taking the difference between the spatial position coordinates of the first feature point and those of the second feature point in each matched feature point pair as the difference vector of that pair.
Preferably, calculating the offset values of the corresponding pixels in the first overlapping area and the second overlapping area according to the difference vector of each matched feature point pair and the preset difference vector of each preset matched feature point pair includes:
dividing, by a Delaunay triangulation algorithm, the topological structure corresponding to the first overlapping area into m triangles, where each vertex of each resulting triangle is a first feature point of a matched feature point pair or a first preset feature point of a preset matched feature point pair, each triangle contains exactly one pixel, and m is a positive integer greater than 1;
calculating, by a dual difference algorithm, the offset value of the pixel located inside each triangle according to the difference vectors of the matched feature point pairs, or the preset difference vectors of the preset matched feature point pairs, corresponding to the three vertices of that triangle;
and calculating, by a dual difference algorithm, the offset value of each pixel located on a triangle edge according to the difference vectors of the matched feature point pairs, or the preset difference vectors of the preset matched feature point pairs, corresponding to the two vertices of that edge.
Preferably, constructing a difference matrix according to the offset values of the corresponding pixels in the first overlapping area and the second overlapping area, and the color difference value and the edge difference value of the corresponding pixels in the first overlapping area and the second overlapping area, which are obtained by pre-calculation, includes:
calculating difference values of corresponding pixel points in the first overlapping area and the second overlapping area according to the offset value, the color difference value and the edge difference value of the corresponding pixel points in the first overlapping area and the second overlapping area;
and constructing a difference matrix according to the difference values of the corresponding pixel points in the first overlapping area and the second overlapping area.
An apparatus for stitching images, comprising:
the calculation module is used for calculating the offset values of the corresponding pixel points in the first overlapping area and the second overlapping area;
a construction module, configured to construct a difference matrix according to offset values of corresponding pixels in the first overlapping area and the second overlapping area, and color difference values and edge difference values of corresponding pixels in the first overlapping area and the second overlapping area, which are obtained through pre-calculation;
and the splicing module is used for splicing the first overlapping area and the second overlapping area according to the difference matrix.
Preferably, the calculation module comprises:
a first matching sub-module, configured to match the plurality of first feature points extracted in the first overlapping region with the plurality of second feature points extracted in the second overlapping region, so as to obtain a plurality of matched feature point pairs;
the first calculation submodule is used for calculating the difference vector of each matched feature point pair according to the spatial position coordinates of the first feature point and of the second feature point in that pair;
the second matching submodule is used for matching the plurality of first preset feature points extracted from the first overlapping area with the plurality of second preset feature points extracted from the second overlapping area to obtain a plurality of preset matched feature point pairs;
and the second calculating submodule is used for calculating the offset values of the corresponding pixels in the first overlapping area and the second overlapping area according to the difference vector of each matched feature point pair and the preset difference vector of each preset matched feature point pair.
Preferably, the method further comprises the following steps:
an obtaining module, configured to, before the first matching sub-module matches the plurality of first feature points extracted in the first overlapping area with the plurality of second feature points extracted in the second overlapping area to obtain a plurality of matched feature point pairs, obtain the first overlapping area and the second overlapping area respectively;
a dividing module, configured to uniformly divide the first overlapping area and the second overlapping area into n areas, where n is a positive integer and n > 1;
an extracting module, configured to extract a preset number of first feature points for each region divided by the first overlapping region and extract the preset number of second feature points for each region divided by the second overlapping region.
Preferably, the first matching sub-module includes:
and the matching unit is used for carrying out SIFT feature matching on the plurality of first feature points extracted from the first overlapping region and the plurality of second feature points extracted from the second overlapping region by adopting a Scale Invariant Feature Transform (SIFT) algorithm to obtain a plurality of matched feature point pairs.
Preferably, the first calculation submodule includes:
and the calculating unit is used for taking the difference between the spatial position coordinates of the first feature point and those of the second feature point in each matched feature point pair as the difference vector of that pair.
Preferably, the second calculation submodule includes:
the triangle dividing unit is used for dividing, by a Delaunay triangulation algorithm, the topological structure corresponding to the first overlapping area into m triangles, where each vertex of each resulting triangle is a first feature point of a matched feature point pair or a first preset feature point of a preset matched feature point pair, each triangle contains exactly one pixel, and m is a positive integer greater than 1;
the first offset calculating unit is used for calculating, by a dual difference algorithm, the offset value of the pixel located inside each triangle according to the difference vectors of the matched feature point pairs, or the preset difference vectors of the preset matched feature point pairs, corresponding to the three vertices of that triangle;
and the second offset calculating unit is used for calculating, by a dual difference algorithm, the offset value of each pixel located on a triangle edge according to the difference vectors of the matched feature point pairs, or the preset difference vectors of the preset matched feature point pairs, corresponding to the two vertices of that edge.
Preferably, the building block comprises:
a difference value calculating submodule, configured to calculate difference values of corresponding pixel points in the first overlapping area and the second overlapping area according to offset values, color difference values, and edge difference values of the corresponding pixel points in the first overlapping area and the second overlapping area;
and the construction submodule is used for constructing a difference matrix according to the difference values of the corresponding pixel points in the first overlapping area and the second overlapping area.
Compared with the prior art, the invention has the following beneficial effects:
the invention provides a method and a device for splicing images, and a difference matrix is constructed according to offset values of corresponding pixel points in a first overlapping area and a second overlapping area and color difference values and edge difference values of the corresponding pixel points in the first overlapping area and the second overlapping area which are obtained through pre-calculation. Since the offset value of the corresponding pixel point is a main characteristic for distinguishing the near view object from the distant view object, the offset value corresponding to the distant view object is small, and the offset value corresponding to the near view object is large. In addition, when the first overlapping area and the second overlapping area are spliced according to the difference matrix, a splicing seam with a small difference value is selected, and when the difference value is small, the corresponding deviation value is small, at the moment, the optimal splicing seam can be formed at a distant view object in the image and cannot be formed at a close view object in the image, and therefore the splicing effect is natural.
Drawings
To illustrate the embodiments of the present invention or the prior-art solutions more clearly, the drawings used in their description are briefly introduced below. The drawings described below show only embodiments of the present invention; those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a flowchart of a method for stitching images according to the present invention;
FIG. 2 is a flow chart of another image stitching method provided by the present invention;
FIG. 3 is a flowchart of a method for stitching images according to another embodiment of the present invention;
FIG. 4 is a schematic structural diagram of an image stitching apparatus according to the present invention;
fig. 5 is a schematic structural diagram of another image stitching apparatus provided by the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The embodiment of the invention provides an image splicing method, and with reference to fig. 1, the method comprises the following steps:
s11, calculating the offset values of the corresponding pixel points in the first overlapping area and the second overlapping area;
Both the first overlapping area and the second overlapping area contain a number of pixels, and the pixels of the two areas are in correspondence: two pixels that image the same photographed object are corresponding pixels. For example, if a pixel in the first overlapping area and a pixel in the second overlapping area both image the tip of the same tree, those two pixels are corresponding pixels.
S12, constructing a difference matrix according to the offset values of the corresponding pixel points in the first overlapping area and the second overlapping area and the color difference value and the edge difference value of the corresponding pixel points in the first overlapping area and the second overlapping area which are obtained through pre-calculation;
specifically, step S12 includes:
1) calculating difference values of corresponding pixel points in the first overlapping area and the second overlapping area according to the offset value, the color difference value and the edge difference value of the corresponding pixel points in the first overlapping area and the second overlapping area;
wherein, the difference value of the corresponding pixel point represents the difference of the corresponding pixel point.
Specifically, when calculating the difference values of the corresponding pixels in the first and second overlapping areas, a weight is set for each of the offset value, the color difference value, and the edge difference value, with the largest weight assigned to the offset value.
It should be noted that the color difference values and edge difference values of the corresponding pixels in the first and second overlapping areas must be calculated in advance: the difference between the gray values of corresponding pixels is taken as the color difference value, and the difference between their edge values as the edge difference value.
The offset value, color difference value, and edge difference value of each pair of corresponding pixels are multiplied by their respective weights and summed to obtain the difference value of that pixel pair.
2) And constructing a difference matrix according to the difference values of the corresponding pixel points in the first overlapping area and the second overlapping area.
And arranging the difference values of the corresponding pixel points in the first overlapping area and the second overlapping area according to the position relation of the pixel points to obtain a difference matrix.
In this step, the difference values of the corresponding pixels in the first overlapping area and the second overlapping area are obtained, and then a numerical basis is provided for constructing the difference matrix.
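The two steps above can be sketched as follows. The specific weights and the function name are illustrative assumptions; the description only requires that the offset value carry the largest weight:

```python
import numpy as np

def build_difference_matrix(offsets, color_diffs, edge_diffs,
                            w_offset=0.6, w_color=0.2, w_edge=0.2):
    """Weighted per-pixel combination of offset, color and edge differences.

    The offset weight is the largest, so the later seam search is
    dominated by the near/far distinction. Inputs are arrays arranged
    by pixel position, so the result is already the difference matrix.
    """
    assert w_offset > w_color and w_offset > w_edge
    return w_offset * offsets + w_color * color_diffs + w_edge * edge_diffs

# toy 4x3 overlap region with random per-pixel values
rng = np.random.default_rng(0)
offsets = rng.random((4, 3))
colors = rng.random((4, 3))
edges = rng.random((4, 3))
D = build_difference_matrix(offsets, colors, edges)
```

Because the inputs are laid out by pixel position, no separate "arrangement" step is needed: the weighted sum is the difference matrix.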
And S13, splicing the first overlapping area and the second overlapping area according to the difference matrix.
According to the difference matrix, a dynamic programming algorithm can be adopted to splice the first overlapping area and the second overlapping area.
Specifically, according to the difference matrix, an optimal splicing seam is obtained by adopting a dynamic programming algorithm, and after the optimal splicing seam is obtained, the first overlapping area and the second overlapping area are spliced according to the optimal splicing seam.
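A minimal sketch of the seam search: dynamic programming over the difference matrix, accumulating the cheapest top-to-bottom path and backtracking. This is the standard seam formulation, not necessarily the patent's exact implementation:

```python
import numpy as np

def optimal_seam(diff):
    """Vertical seam of minimum cumulative difference, one column index per row."""
    h, w = diff.shape
    cost = diff.astype(float).copy()
    # forward pass: each cell adds the cheapest of its three upper neighbours
    for y in range(1, h):
        for x in range(w):
            lo, hi = max(x - 1, 0), min(x + 2, w)
            cost[y, x] += cost[y - 1, lo:hi].min()
    # backtrack from the cheapest cell in the bottom row
    seam = [int(np.argmin(cost[-1]))]
    for y in range(h - 2, -1, -1):
        x = seam[-1]
        lo, hi = max(x - 1, 0), min(x + 2, w)
        seam.append(lo + int(np.argmin(cost[y, lo:hi])))
    return seam[::-1]

# middle column has zero difference, so the seam should follow it
seam = optimal_seam(np.array([[1, 0, 1], [1, 0, 1], [1, 0, 1]]))
```

With the offset value weighted heavily into the matrix, low-difference cells coincide with distant-object regions, which is exactly where this path search routes the seam.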
In this embodiment, a difference matrix is constructed from the offset values of the corresponding pixels in the first and second overlapping areas, together with the pre-computed color difference values and edge difference values of those pixels. The offset value is the main feature that distinguishes near objects from distant ones: the offset corresponding to a distant object is small, while the offset corresponding to a near object is large. When the two overlapping areas are stitched according to the difference matrix, a seam with small difference values is selected; small difference values imply small offset values, so the optimal seam passes through distant objects in the image rather than near ones, and the stitching looks natural.
On the basis of the above embodiment, step S11 includes:
s21, matching the plurality of first characteristic points extracted in the first overlapping area with the plurality of second characteristic points extracted in the second overlapping area to obtain a plurality of matched characteristic point pairs;
the first overlapping area is an overlapping area in a first image of the two images to be spliced; the second overlapping area is an overlapping area in a second image of the two images to be stitched.
Specifically, S21 includes:
and performing SIFT feature matching on the plurality of first feature points extracted in the first overlapping region and the plurality of second feature points extracted in the second overlapping region by adopting a scale-invariant feature transform (SIFT) algorithm to obtain a plurality of matched feature point pairs.
It should be noted that, after the plurality of matched feature point pairs is obtained, matching singular points can be removed with the random sample consensus (RANSAC) algorithm. A matching singular point is a matched pair with a poor degree of match among the obtained pairs; such pairs degrade the later stitching result and therefore need to be eliminated.
In this step, the manner of obtaining a plurality of matching feature point pairs by using the scale invariant feature transform SIFT algorithm is an embodiment of obtaining a plurality of matching feature point pairs, and any embodiment that can obtain a plurality of matching feature point pairs is within the scope of the present invention.
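To illustrate just the matching step, here is a NumPy-only sketch of brute-force nearest-neighbour descriptor matching with Lowe's ratio test, a common way to screen out ambiguous matches before RANSAC. A real pipeline would use an actual SIFT implementation such as OpenCV's; the descriptors below are synthetic:

```python
import numpy as np

def match_descriptors(desc1, desc2, ratio=0.75):
    """Match each descriptor in desc1 to its nearest neighbour in desc2,
    keeping only matches that pass the ratio test (nearest distance must
    be clearly smaller than the second-nearest)."""
    pairs = []
    for i, d in enumerate(desc1):
        dists = np.linalg.norm(desc2 - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        if dists[best] < ratio * dists[second]:  # unambiguous match only
            pairs.append((i, int(best)))
    return pairs

# two synthetic 2-D "descriptors" per side; the third row of desc2 is a distractor
desc1 = np.array([[1.0, 0.0], [0.0, 1.0]])
desc2 = np.array([[1.0, 0.05], [0.0, 1.1], [5.0, 5.0]])
pairs = match_descriptors(desc1, desc2)
```

The surviving pairs would then go through RANSAC, which the description names, to discard geometrically inconsistent matches.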
Optionally, on the basis of this embodiment, before step S21, the method further includes:
1) respectively acquiring the first overlapping area and the second overlapping area;
2) uniformly dividing the first overlapping area and the second overlapping area into n areas respectively, wherein n is a positive integer and n > 1;
2) a preset number of first feature points are extracted for each region of the first overlap region division and a preset number of second feature points are extracted for each region of the second overlap region division.
Specifically, when the first or second feature points are extracted, pixels with higher brightness or stronger edge response are selected. If such pixels are concentrated in one part of the region, the extracted feature points cluster there, which makes the offset values of the corresponding pixels in the first and second overlapping regions hard to compute at a later stage; the extracted feature points therefore need to be distributed uniformly.
The acquired first and second overlapping areas are each divided uniformly into n regions; the division may use triangles, rectangles, polygons, and so on, as long as it is uniform. Uniform division ensures that the extracted feature points are spread over the whole area, avoiding the situation in which clustered feature points make the offset values of the corresponding pixels hard to compute.
Thereafter, a preset number of first feature points are extracted for each region of the first overlap region division and a preset number of second feature points are extracted for each region of the second overlap region division.
The preset number is chosen by the operator according to the complexity of the two images to be stitched.
In this embodiment, the acquired first overlap region and the acquired second overlap region are uniformly divided into n regions, a preset number of first feature points are extracted for each region divided by the first overlap region, and a preset number of second feature points are extracted for each region divided by the second overlap region. The extracted feature points can be ensured to be uniformly distributed in the first overlapping area and the second overlapping area.
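A minimal sketch of this per-cell extraction, assuming a per-pixel response map (e.g. brightness or edge strength) is already available; the function name and grid shape are illustrative:

```python
import numpy as np

def grid_keypoints(response, rows, cols, per_cell):
    """Split the region into rows*cols cells and keep the per_cell
    strongest responses in each cell, so keypoints stay evenly spread
    instead of clustering in bright or high-gradient areas."""
    h, w = response.shape
    kps = []
    for r in range(rows):
        for c in range(cols):
            y0, y1 = r * h // rows, (r + 1) * h // rows
            x0, x1 = c * w // cols, (c + 1) * w // cols
            cell = response[y0:y1, x0:x1]
            # indices of the strongest responses within this cell
            flat = np.argsort(cell, axis=None)[::-1][:per_cell]
            for idx in flat:
                y, x = divmod(int(idx), cell.shape[1])
                kps.append((y0 + y, x0 + x))
    return kps

# toy 4x4 response map: one strong pixel in two different cells
resp = np.zeros((4, 4))
resp[0, 0] = 1.0
resp[3, 3] = 2.0
kps = grid_keypoints(resp, rows=2, cols=2, per_cell=1)
```

Each of the n (here 4) cells contributes the same number of keypoints, which is the uniformity property the embodiment asks for.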
S22, calculating a difference vector of each matched feature point pair according to the space position coordinate of the first feature point and the space position coordinate of the second feature point in each matched feature point pair;
specifically, step S22 includes:
and taking the difference value of the space position coordinate of the first characteristic point and the space position coordinate of the second characteristic point in each matching characteristic point pair as the difference vector of each matching characteristic point pair.
In detail, the first feature point of each matched pair has spatial position coordinates in the first overlapping region, and the second feature point has spatial position coordinates in the second overlapping region; the difference between the two sets of coordinates is taken as the difference vector of that pair.
The difference vector thus represents the spatial position difference of each matched feature point pair.
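As a concrete illustration of this coordinate subtraction (the coordinate values are made up):

```python
import numpy as np

# difference (disparity) vector of one matched pair:
# coordinates of the first feature point minus those of the second
p1 = np.array([120.0, 45.0])  # first feature point (x, y), illustrative
p2 = np.array([117.5, 44.0])  # matched second feature point
diff_vec = p1 - p2
```

A large vector means the photographed object shifted noticeably between the two views, i.e. it is near the camera; a small vector indicates a distant object.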
S23, matching the plurality of first preset feature points extracted in the first overlapping area with the plurality of second preset feature points extracted in the second overlapping area to obtain a plurality of preset matching feature point pairs;
the first overlapping area is provided with a plurality of first preset characteristic points, the second overlapping area is provided with a plurality of second preset characteristic points, and the plurality of first preset characteristic points and the plurality of second preset characteristic points form a plurality of preset matching characteristic point pairs correspondingly.
It should be noted that the preset difference vector of each preset matched feature point pair is (∞, ∞). The purpose of selecting the preset matched feature point pairs is to ensure that, at final stitching, the optimal seam does not fall outside the first and second overlapping areas.
S24, calculating the offset values of the corresponding pixel points in the first overlapping area and the second overlapping area according to the disparity vector of each matched feature point pair and the preset disparity vector of each preset matching feature point pair.

Specifically, the disparity vector of each selected matched feature point pair and the preset disparity vector of each preset matching feature point pair are known, but the disparity values of the remaining corresponding pixel points in the first and second overlapping areas are not. The disparity vector and the disparity value can be derived from each other.

The disparity value depends on the offset values of the corresponding pixel points in the first overlapping area and the second overlapping area, so these offset values must be calculated first.

In this embodiment, the offset values of the corresponding pixel points in the first and second overlapping areas are calculated from the disparity vectors of the matched feature point pairs and the preset disparity vectors of the preset matching feature point pairs, which provides the data needed to construct the difference matrix.
Optionally, on the basis of any of the foregoing embodiments, referring to fig. 3, step S24 includes:
S31, dividing the topological structure corresponding to the first overlapping area into m triangles by a Delaunay triangulation algorithm;

Each vertex of every resulting triangle is either a first feature point of a matched feature point pair or a first preset feature point of a preset matching feature point pair, each triangle contains exactly one pixel point in its interior, and m is a positive integer greater than 1.

It should be noted that the topological structure of the first overlapping area is identical to that of the second overlapping area, so the second overlapping area can likewise be divided into m triangles by the same Delaunay triangulation algorithm.
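Under the assumption that SciPy is available, the triangulation itself is a one-liner; feeding only matched and preset feature points guarantees the vertex condition above by construction (the points here are made up):

```python
from scipy.spatial import Delaunay

# Hypothetical feature points: four preset border points plus one matched point.
feature_pts = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0), (4.0, 4.0), (2.0, 1.5)]
tri = Delaunay(feature_pts)
# tri.simplices holds m rows of three vertex indices, one row per triangle.
m = len(tri.simplices)
```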
S32, calculating the offset value of the pixel point inside each triangle by a dual difference algorithm, according to the disparity vectors of the matched feature point pairs, or the preset disparity vectors of the preset matching feature point pairs, corresponding to the three vertices of each triangle obtained by the division;

When a pixel point lies inside a triangle, its offset value is calculated with the dual difference algorithm from the Δx and Δy components of the three vertex vectors, each of the form (Δx, Δy), i.e. the disparity vectors of the matched feature point pairs or the preset disparity vectors of the preset matching feature point pairs at the triangle's vertices.
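The patent does not spell out the dual difference formula; a common stand-in for interpolating a per-pixel offset from three vertex vectors is barycentric interpolation, sketched here under that assumption:

```python
def offset_inside(p, tri, d):
    """Interpolate the (dx, dy) offset at interior pixel p from the disparity
    vectors d[0..2] attached to the triangle's three vertices tri[0..2]."""
    (x, y) = p
    (x1, y1), (x2, y2), (x3, y3) = tri
    det = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
    w1 = ((y2 - y3) * (x - x3) + (x3 - x2) * (y - y3)) / det
    w2 = ((y3 - y1) * (x - x3) + (x1 - x3) * (y - y3)) / det
    w3 = 1.0 - w1 - w2
    # each barycentric weight scales the matching vertex's disparity vector
    return (w1 * d[0][0] + w2 * d[1][0] + w3 * d[2][0],
            w1 * d[0][1] + w2 * d[1][1] + w3 * d[2][1])
```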
S33, calculating the offset value of each pixel point lying on a triangle edge by the dual difference algorithm, according to the disparity vectors of the matched feature point pairs, or the preset disparity vectors of the preset matching feature point pairs, corresponding to the two vertices of the edge on which the pixel point lies.

When a pixel point lies on a triangle edge, it may sit on the common edge of two triangles at once, so the interior-pixel method above is ambiguous: there is no way to know which triangle's vertices to use. Therefore, the offset value of a pixel point on a triangle edge is calculated with the dual difference algorithm from only the two vertices of that edge, using their disparity vectors or preset disparity vectors.
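For a pixel on an edge, the interpolation degenerates to a linear blend between the edge's two endpoints; the formula below is an assumed concrete form of that rule:

```python
def offset_on_edge(p, a, b, da, db):
    """Offset at pixel p lying on edge a-b, linearly interpolated from the
    endpoint disparity vectors da and db (p is assumed collinear with a, b)."""
    ax, ay = a
    bx, by = b
    # parameter t in [0, 1] giving p's position along the segment a -> b
    t = ((p[0] - ax) * (bx - ax) + (p[1] - ay) * (by - ay)) / \
        ((bx - ax) ** 2 + (by - ay) ** 2)
    return (da[0] + t * (db[0] - da[0]), da[1] + t * (db[1] - da[1]))
```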
In this embodiment, the topological structure corresponding to the first overlapping area is divided into m triangles by a Delaunay triangulation algorithm, after which the offset values of the pixel points on the edges and in the interior of each triangle can be calculated with the dual difference algorithm, providing a basis for constructing the difference matrix.
Optionally, another embodiment of the present invention provides an image stitching apparatus, referring to fig. 4, including:
a calculating module 101, configured to calculate offset values of corresponding pixel points in the first overlapping area and the second overlapping area;
a constructing module 102, configured to construct a difference matrix according to the offset values of the corresponding pixels in the first overlapping area and the second overlapping area, and the color difference value and the edge difference value of the corresponding pixels in the first overlapping area and the second overlapping area, which are obtained through pre-calculation;
a splicing module 103, configured to splice the first overlapping area and the second overlapping area according to the difference matrix.
Optionally, further, the construction module 102 includes:
a difference value calculating submodule, configured to calculate difference values of corresponding pixel points in the first overlapping area and the second overlapping area according to offset values, color difference values, and edge difference values of the corresponding pixel points in the first overlapping area and the second overlapping area;
a construction submodule, configured to construct a difference matrix according to the difference values of the corresponding pixel points in the first overlapping area and the second overlapping area.
In this embodiment, a difference matrix is constructed from the offset values of the corresponding pixel points in the first and second overlapping areas together with their pre-calculated color difference values and edge difference values. The offset value is the main feature separating near objects from distant ones: distant objects have small offset values and near objects have large ones. When the first and second overlapping areas are spliced according to the difference matrix, a splicing seam with small difference values is selected; small difference values imply small offset values, so the optimal splicing seam passes through distant objects in the image rather than near ones, and the splicing result looks natural.
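The patent gives no explicit formula for combining the three terms or for the seam search; one plausible sketch uses a weighted per-pixel sum and a column-wise dynamic-programming seam (the weights `w` and the top-to-bottom seam direction are assumptions):

```python
def difference_matrix(offset, color, edge, w=(1.0, 1.0, 1.0)):
    """Per-pixel weighted combination of offset, color, and edge differences."""
    h, wd = len(offset), len(offset[0])
    return [[w[0] * offset[y][x] + w[1] * color[y][x] + w[2] * edge[y][x]
             for x in range(wd)] for y in range(h)]

def min_seam(D):
    """Top-to-bottom seam of minimal accumulated difference (8-connected)."""
    h, wd = len(D), len(D[0])
    cost = [row[:] for row in D]
    for y in range(1, h):
        for x in range(wd):
            lo, hi = max(0, x - 1), min(wd - 1, x + 1)
            cost[y][x] += min(cost[y - 1][lo:hi + 1])
    # backtrack from the cheapest bottom cell
    x = min(range(wd), key=lambda i: cost[h - 1][i])
    seam = [x]
    for y in range(h - 1, 0, -1):
        lo, hi = max(0, x - 1), min(wd - 1, x + 1)
        x = min(range(lo, hi + 1), key=lambda i: cost[y - 1][i])
        seam.append(x)
    seam.reverse()
    return seam
```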
It should be noted that, for the working process of each module in this embodiment, please refer to the corresponding parts in the above embodiments, which are not described herein again.
Optionally, in another embodiment of the present invention, referring to fig. 5, the calculating module 101 includes:
a first matching sub-module 1011, configured to match the plurality of first feature points extracted in the first overlapping region with the plurality of second feature points extracted in the second overlapping region, so as to obtain a plurality of matching feature point pairs;
a first calculating sub-module 1012, configured to calculate a disparity vector for each matching feature point pair according to a spatial position coordinate of a first feature point and a spatial position coordinate of a second feature point in each matching feature point pair;
a second matching sub-module 1013, configured to match the plurality of first preset feature points extracted in the first overlapping region with the plurality of second preset feature points extracted in the second overlapping region, so as to obtain a plurality of preset matching feature point pairs;
the second calculating sub-module 1014 is configured to calculate, according to the disparity vector of each matching feature point pair and the preset disparity vector of each preset matching feature point pair, an offset value of a corresponding pixel point in the first overlapping area and the second overlapping area.
Optionally, further, the apparatus further includes:
an obtaining module, configured to, before the first matching sub-module matches the plurality of first feature points extracted in the first overlapping area with the plurality of second feature points extracted in the second overlapping area to obtain a plurality of matched feature point pairs, obtain the first overlapping area and the second overlapping area respectively;
a dividing module, configured to uniformly divide the first overlapping area and the second overlapping area into n areas, where n is a positive integer and n > 1;
an extracting module, configured to extract a preset number of first feature points for each region divided by the first overlapping region and extract the preset number of second feature points for each region divided by the second overlapping region.
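The uniform-division step can be sketched as grid bucketing that keeps a fixed feature budget per cell (the grid shape and the "strength" field are illustrative assumptions, not part of the patent):

```python
def pick_per_region(features, width, height, nx, ny, k):
    """features: list of (x, y, strength); keep the k strongest per grid cell
    so that the n = nx * ny regions contribute feature points evenly."""
    cells = {}
    for f in features:
        cx = min(int(f[0] * nx / width), nx - 1)
        cy = min(int(f[1] * ny / height), ny - 1)
        cells.setdefault((cx, cy), []).append(f)
    kept = []
    for fs in cells.values():
        kept.extend(sorted(fs, key=lambda f: -f[2])[:k])
    return kept
```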
Optionally, further, the first matching sub-module 1011 includes:
a matching unit, configured to perform SIFT feature matching between the plurality of first feature points extracted in the first overlapping region and the plurality of second feature points extracted in the second overlapping region using a Scale-Invariant Feature Transform (SIFT) algorithm, so as to obtain the plurality of matched feature point pairs.
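SIFT descriptor extraction itself needs a library such as OpenCV, but the matching stage can be illustrated with Lowe's ratio test on precomputed descriptors (the descriptors below are toy values, not real SIFT output):

```python
def ratio_match(desc1, desc2, ratio=0.75):
    """Match each descriptor in desc1 to its nearest neighbour in desc2,
    accepted only when the best squared distance beats ratio^2 times the
    second-best (Lowe's ratio test)."""
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    matches = []
    for i, d1 in enumerate(desc1):
        ranked = sorted(range(len(desc2)), key=lambda j: dist2(d1, desc2[j]))
        if len(ranked) > 1 and \
                dist2(d1, desc2[ranked[0]]) < (ratio ** 2) * dist2(d1, desc2[ranked[1]]):
            matches.append((i, ranked[0]))
    return matches
```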
Optionally, further, the first computing submodule 1012 includes:
a calculating unit, configured to take the difference between the spatial position coordinates of the first feature point and the spatial position coordinates of the second feature point in each matched feature point pair as that pair's disparity vector.
In this embodiment, the offset value of the corresponding pixel point in the first overlapping region and the second overlapping region can be calculated through the difference vector of each matching feature point pair and the preset difference vector of each preset matching feature point pair, so that data support can be provided for constructing the difference matrix.
It should be noted that, for the working process of each module or unit in this embodiment, please refer to the corresponding description in the foregoing embodiment, which is not described herein again.
Optionally, in another embodiment of the present invention, the second computing submodule 1014 includes:
a triangle dividing unit, configured to divide the topological structure corresponding to the first overlapping area into m triangles by a Delaunay triangulation algorithm, where each vertex of every resulting triangle is either a first feature point of a matched feature point pair or a first preset feature point of a preset matching feature point pair, each triangle contains exactly one pixel point in its interior, and m is a positive integer greater than 1;

a first offset calculating unit, configured to calculate the offset value of the pixel point inside each triangle by a dual difference algorithm, according to the disparity vectors of the matched feature point pairs, or the preset disparity vectors of the preset matching feature point pairs, corresponding to the three vertices of each triangle obtained by the division;

a second offset calculating unit, configured to calculate the offset value of each pixel point lying on a triangle edge by the dual difference algorithm, according to the disparity vectors of the matched feature point pairs, or the preset disparity vectors of the preset matching feature point pairs, corresponding to the two vertices of the edge on which the pixel point lies.
In this embodiment, the topological structure corresponding to the first overlapping area is divided into m triangles by a Delaunay triangulation algorithm, after which the offset values of the pixel points on the edges and in the interior of each triangle can be calculated with the dual difference algorithm, providing a basis for constructing the difference matrix.
It should be noted that, for the working process of each module or unit in this embodiment, please refer to the corresponding description in the foregoing embodiment, which is not described herein again.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (12)

1. An image stitching method is characterized by comprising the following steps:
calculating the offset values of the corresponding pixel points in the first overlapping area and the second overlapping area;
constructing a difference matrix according to the offset values of the corresponding pixel points in the first overlapping area and the second overlapping area and the pre-calculated color difference values and edge difference values of the corresponding pixel points in the first overlapping area and the second overlapping area, wherein constructing the difference matrix comprises: calculating difference values of the corresponding pixel points in the first overlapping area and the second overlapping area according to the offset values, the color difference values and the edge difference values of the corresponding pixel points in the first overlapping area and the second overlapping area; and constructing the difference matrix according to the difference values of the corresponding pixel points in the first overlapping area and the second overlapping area;
and splicing the first overlapping area and the second overlapping area according to the difference matrix.
2. The stitching method of claim 1, wherein calculating the offset values of the corresponding pixels in the first overlapping region and the second overlapping region comprises:
matching the plurality of first feature points extracted in the first overlapping area with the plurality of second feature points extracted in the second overlapping area to obtain a plurality of matched feature point pairs;
calculating a difference vector of each matched feature point pair according to the space position coordinate of the first feature point and the space position coordinate of the second feature point in each matched feature point pair;
matching the plurality of first preset feature points extracted from the first overlapping area with the plurality of second preset feature points extracted from the second overlapping area to obtain a plurality of preset matching feature point pairs;
and calculating the offset values of the corresponding pixel points in the first overlapping area and the second overlapping area according to the difference vector of each matched characteristic point pair and the preset difference vector of each preset matched characteristic point pair.
3. The stitching method according to claim 2, wherein before matching the plurality of first feature points extracted in the first overlapping area with the plurality of second feature points extracted in the second overlapping area to obtain a plurality of matched feature point pairs, the stitching method further comprises:
respectively acquiring the first overlapping area and the second overlapping area;
uniformly dividing the first overlap region and the second overlap region into n regions, respectively, wherein n is a positive integer and n > 1;
extracting a preset number of first feature points for each region of the first overlap region division and extracting the preset number of second feature points for each region of the second overlap region division.
4. The stitching method according to claim 2, wherein matching a plurality of first feature points extracted in the first overlapping region with a plurality of second feature points extracted in the second overlapping region to obtain a plurality of matched feature point pairs comprises:
and performing SIFT feature matching on the plurality of first feature points extracted in the first overlapping region and the plurality of second feature points extracted in the second overlapping region by adopting a scale-invariant feature transform (SIFT) algorithm to obtain a plurality of matched feature point pairs.
5. The stitching method according to claim 2, wherein calculating the disparity vector for each of the pairs of matched feature points based on the spatial position coordinates of the first feature point and the spatial position coordinates of the second feature point in each of the pairs of matched feature points comprises:
taking the difference between the spatial position coordinates of the first feature point and the spatial position coordinates of the second feature point in each matched feature point pair as the difference vector of that pair.
6. The stitching method according to claim 2, wherein calculating the offset values of the corresponding pixels in the first overlapping region and the second overlapping region according to the disparity vector of each matching feature point pair and the preset disparity vector of each preset matching feature point pair comprises:
dividing the topological structure corresponding to the first overlapping area into m triangles by a Delaunay triangulation algorithm, wherein each vertex of every resulting triangle is a first feature point in a matching feature point pair or a first preset feature point in a preset matching feature point pair, each triangle contains exactly one pixel point in its interior, and m is a positive integer greater than 1;

calculating the offset value of the pixel point inside each triangle by a dual difference algorithm according to the difference vectors of the matching feature point pairs, or the preset difference vectors of the preset matching feature point pairs, corresponding to the three vertices of each triangle obtained by the division; and

calculating the offset value of each pixel point located on a triangle edge by the dual difference algorithm according to the difference vectors of the matching feature point pairs, or the preset difference vectors of the preset matching feature point pairs, corresponding to the two vertices of the triangle edge on which the pixel point is located.
7. An apparatus for stitching images, comprising:
the calculation module is used for calculating the offset values of the corresponding pixel points in the first overlapping area and the second overlapping area;
the construction module is configured to construct a difference matrix according to the offset values of the corresponding pixel points in the first overlapping area and the second overlapping area and the pre-calculated color difference values and edge difference values of the corresponding pixel points in the first overlapping area and the second overlapping area, and comprises a difference value calculation submodule and a construction submodule, wherein the difference value calculation submodule is configured to calculate the difference values of the corresponding pixel points in the first overlapping area and the second overlapping area according to the offset values, the color difference values and the edge difference values of the corresponding pixel points; and the construction submodule is configured to construct the difference matrix according to the difference values of the corresponding pixel points in the first overlapping area and the second overlapping area;
and the splicing module is used for splicing the first overlapping area and the second overlapping area according to the difference matrix.
8. The splicing device according to claim 7, wherein the calculation module comprises:
a first matching sub-module, configured to match the plurality of first feature points extracted in the first overlapping region with the plurality of second feature points extracted in the second overlapping region, so as to obtain a plurality of matched feature point pairs;
the first calculation submodule is used for calculating the difference vector of each matched characteristic point pair according to the space position coordinate of the first characteristic point and the space position coordinate of the second characteristic point in each matched characteristic point pair;
the second matching submodule is used for matching the plurality of first preset feature points extracted from the first overlapping area with the plurality of second preset feature points extracted from the second overlapping area to obtain a plurality of preset matching feature point pairs;
and the second calculating submodule is used for calculating the offset values of the corresponding pixel points in the first overlapping area and the second overlapping area according to the difference vector of each matched characteristic point pair and the preset difference vector of each preset matched characteristic point pair.
9. The splicing device of claim 8, further comprising:
an obtaining module, configured to, before the first matching sub-module matches the plurality of first feature points extracted in the first overlapping area with the plurality of second feature points extracted in the second overlapping area to obtain a plurality of matched feature point pairs, obtain the first overlapping area and the second overlapping area respectively;
a dividing module, configured to uniformly divide the first overlapping area and the second overlapping area into n areas, where n is a positive integer and n > 1;
an extracting module, configured to extract a preset number of first feature points for each region divided by the first overlapping region and extract the preset number of second feature points for each region divided by the second overlapping region.
10. The splicing device of claim 8, wherein the first matching sub-module comprises:
a matching unit, configured to perform SIFT feature matching between the plurality of first feature points extracted in the first overlapping region and the plurality of second feature points extracted in the second overlapping region using a Scale-Invariant Feature Transform (SIFT) algorithm, so as to obtain the plurality of matched feature point pairs.
11. The splicing apparatus of claim 8, wherein the first computing submodule comprises:
a calculating unit, configured to take the difference between the spatial position coordinates of the first feature point and the spatial position coordinates of the second feature point in each matching feature point pair as the difference vector of that pair.
12. The splicing apparatus of claim 8, wherein the second computing submodule comprises:
a triangle dividing unit, configured to divide the topological structure corresponding to the first overlapping area into m triangles by a Delaunay triangulation algorithm, wherein each vertex of every resulting triangle is a first feature point in a matching feature point pair or a first preset feature point in a preset matching feature point pair, each triangle contains exactly one pixel point in its interior, and m is a positive integer greater than 1;

a first offset calculating unit, configured to calculate the offset value of the pixel point inside each triangle by a dual difference algorithm according to the difference vectors of the matching feature point pairs, or the preset difference vectors of the preset matching feature point pairs, corresponding to the three vertices of each triangle obtained by the division;

a second offset calculating unit, configured to calculate the offset value of each pixel point located on a triangle edge by the dual difference algorithm according to the difference vectors of the matching feature point pairs, or the preset difference vectors of the preset matching feature point pairs, corresponding to the two vertices of the triangle edge on which the pixel point is located.
CN201710494989.XA 2017-06-26 2017-06-26 Image splicing method and device Active CN107292822B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710494989.XA CN107292822B (en) 2017-06-26 2017-06-26 Image splicing method and device


Publications (2)

Publication Number Publication Date
CN107292822A CN107292822A (en) 2017-10-24
CN107292822B true CN107292822B (en) 2020-08-28

Family

ID=60099436

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710494989.XA Active CN107292822B (en) 2017-06-26 2017-06-26 Image splicing method and device

Country Status (1)

Country Link
CN (1) CN107292822B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108765292B (en) * 2018-05-30 2022-04-29 中国人民解放军军事科学院国防科技创新研究院 Image splicing method based on space triangular patch fitting
CN108898550B (en) * 2018-05-30 2022-05-17 中国人民解放军军事科学院国防科技创新研究院 Image splicing method based on space triangular patch fitting
TWI743477B (en) * 2019-05-07 2021-10-21 威盛電子股份有限公司 Image processing device and method for image processing

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101593350A (en) * 2008-05-30 2009-12-02 日电(中国)有限公司 The methods, devices and systems of depth adaptive video-splicing
US8041147B2 (en) * 2007-07-18 2011-10-18 3DHISTECH Kft; Method for realistic stitching image blocks of an electronically recorded multipart image
US8917951B1 (en) * 2013-07-19 2014-12-23 Hong Kong Applied Science and Technology Research Institute Company Limited Method of on-the-fly image stitching
CN105389774A (en) * 2014-09-05 2016-03-09 华为技术有限公司 Method and device for aligning images
CN105389787A (en) * 2015-09-30 2016-03-09 华为技术有限公司 Panorama image stitching method and device
WO2016048014A1 (en) * 2014-09-22 2016-03-31 Samsung Electronics Co., Ltd. Image stitching for three-dimensional video
CN105894451A (en) * 2016-03-30 2016-08-24 沈阳泰科易科技有限公司 Method and device for splicing panoramic image
CN106815802A (en) * 2016-12-23 2017-06-09 深圳超多维科技有限公司 A kind of image split-joint method and device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7375745B2 (en) * 2004-09-03 2008-05-20 Seiko Epson Corporation Method for digital image stitching and apparatus for performing the same


Also Published As

Publication number Publication date
CN107292822A (en) 2017-10-24


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20210930

Address after: 210000 room 1103, building C, Xingzhi science and Technology Park, Nanjing Economic and Technological Development Zone, Nanjing, Jiangsu Province

Patentee after: Nanjing iqiyi Intelligent Technology Co., Ltd

Address before: 10 / F and 11 / F, iqiyi innovation building, No.2 Beiyi street, Haidian District, Beijing 100080

Patentee before: BEIJING QIYI CENTURY SCIENCE & TECHNOLOGY Co.,Ltd.

CP01 Change in the name or title of a patent holder

Address after: 210000 room 1103, building C, Xingzhi science and Technology Park, Nanjing Economic and Technological Development Zone, Nanjing, Jiangsu Province

Patentee after: Nanjing Qiyuan Technology Co.,Ltd.

Address before: 210000 room 1103, building C, Xingzhi science and Technology Park, Nanjing Economic and Technological Development Zone, Nanjing, Jiangsu Province

Patentee before: Nanjing iqiyi Intelligent Technology Co.,Ltd.