CN103714561A - Structure preserving texture synthesis method based on Chamfer distance - Google Patents

Structure preserving texture synthesis method based on Chamfer distance

Info

Publication number
CN103714561A
Authority
CN
China
Prior art keywords
distance
texture
chamfer
edge region
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201310738949.7A
Other languages
Chinese (zh)
Inventor
汤颖
史晓颖
范菁
肖廷哲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University of Technology ZJUT
Original Assignee
Zhejiang University of Technology ZJUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University of Technology ZJUT filed Critical Zhejiang University of Technology ZJUT
Priority to CN201310738949.7A
Publication of CN103714561A
Legal status: Pending


Landscapes

  • Image Analysis (AREA)

Abstract

A structure-preserving texture synthesis method based on the Chamfer distance comprises the following steps. A texture sample image S and its texture feature map Fs are merged into a four-channel image S' whose pixels are stored in RGBA format; the RGB channels of S' hold the color values of the sample image and the alpha channel holds the label values of the corresponding structural feature map. An output image R' is initialized by copying a randomly selected region of S' to its upper-left corner; R' likewise has four channels, with the RGB channels holding the color values of the output texture image and the alpha channel holding the label values of its structural feature map. The blocks of R' are then synthesized one by one in scanline order. For the block currently being synthesized, the distance between its edge region and the edge region of each block in the sample image is computed, and the sample blocks whose distance satisfies a threshold are taken as candidate blocks. One block is selected at random from the candidates and placed at the position to be synthesized, forming an overlap region with the already synthesized area. A minimum-error stitching path is found in the overlap region, and the newly selected block is stitched into the output texture along this path.

Description

Structure-preserving texture synthesis method based on Chamfer distance
Technical field
The present invention relates to sample-based (exemplar-based) texture synthesis.
Background art
Sample-based texture synthesis generates, from an input texture sample, a texture image of arbitrary size that is visually similar to the sample; it is one of the research hotspots of computer graphics. Sample-based texture synthesis is widely used in texture mapping, image editing, image analogies, and related applications.
Sample-based texture synthesis methods are mainly divided into pixel-based and block-based synthesis. Pixel-based synthesis copies one pixel from the sample image to the target texture at a time, whereas block-based synthesis copies a contiguous texture region at a time. Block-based methods generally preserve the structure of contiguous texture regions better than pixel-based methods and are also more efficient, so the present invention adopts a block-based approach. Block-based texture synthesis uses the content of the adjacent, overlapping edge region as a constraint and searches the sample texture for the most similar texture block. Previous methods search for the matching block under a color metric only, and then resolve the discontinuities and gaps between adjacent overlap regions by finding a minimum-error cut or by color blending. These methods work well for stochastic textures and can reduce discontinuities between blocks to some extent, but for textures containing strongly structured line elements they still produce discontinuous line structures in some synthesized regions, which degrades the quality of the result.
Summary of the invention
The present invention seeks to overcome the above shortcomings of the prior art by providing a new block-based texture synthesis method that considers not only the color information of the texture but also its structural features when searching for matching blocks. The method solves the line-structure discontinuities that arise when synthesizing textures with salient structure, thereby improving the quality of the synthesized texture.
The structure-preserving texture synthesis method based on the Chamfer distance of the present invention comprises the following steps:
Step 1: merge the texture sample image S and the texture feature map F_s into a four-channel image S' (that is, the format of each pixel is RGBA). The RGB channels of S' store the color values of the sample image, and the alpha channel stores the label values of the structural feature map corresponding to the sample image.
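For illustration only, the following is a minimal C++ host-code sketch of how the merge of Step 1 might be carried out; the image types, the 8-bit channel depth, and the function name packSample are assumptions for the example and are not specified by the patent.

```cpp
// Sketch: pack the RGB sample image S and the single-channel structural
// feature map Fs into one RGBA image S'.  The RGB channels keep the sample
// colors; the alpha channel keeps the structure label (illustrative layout).
#include <cstddef>
#include <cstdint>
#include <vector>

struct RGBA { uint8_t r, g, b, a; };

struct ImageRGB  { int w, h; std::vector<uint8_t> px; };  // 3 bytes per pixel
struct ImageGray { int w, h; std::vector<uint8_t> px; };  // 1 byte per pixel (label)

std::vector<RGBA> packSample(const ImageRGB& S, const ImageGray& Fs) {
    std::vector<RGBA> Sp(static_cast<std::size_t>(S.w) * S.h);
    for (int y = 0; y < S.h; ++y) {
        for (int x = 0; x < S.w; ++x) {
            const std::size_t i = static_cast<std::size_t>(y) * S.w + x;
            Sp[i] = { S.px[3 * i], S.px[3 * i + 1], S.px[3 * i + 2],  // color
                      Fs.px[i] };                                     // structure label
        }
    }
    return Sp;
}
```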
Step 2: initialize the output image R' by randomly choosing a region of S' and copying it to the upper-left corner of R', as shown in Fig. 3. R' likewise has four color channels: the RGB channels store the color values of the output texture image, and the alpha channel stores the label values of the structural feature map corresponding to the output texture image.
Step 3: synthesize the output image R' block by block in scanline order, specifically comprising the following sub-steps:
3.1 Collect the edge region L_r of the block currently being synthesized and the edge regions {L_s | L_s ∈ S} of all possible blocks in the sample image;
3.2 Compute the distance between the edge region L_r and every edge region {L_s | L_s ∈ S} in the sample image. The innovation of the present invention is that this distance computation evaluates not only the Euclidean distance between L_r and L_s but also the Chamfer distance between them. The Euclidean distance measures the color similarity of the edge regions and is computed from the values of the RGB channels; the Chamfer distance measures the structural similarity of the edge regions and is computed from the values of the alpha channel.
The Euclidean distance between edge regions used by the present invention is expressed by formula (1):
$d_E(L_s, L_r) = \sum_{(p, q)} \sqrt{(R(p) - R(q))^2 + (G(p) - G(q))^2 + (B(p) - B(q))^2}$    (1)
In formula (1), p and q denote corresponding pixels of L_s and L_r respectively, and R(·), G(·), B(·) are the three color channels, so d_E(L_s, L_r) accumulates the color differences between corresponding pixels of L_s and L_r over the whole edge region. The smaller d_E(L_s, L_r) is, the more similar the colors of L_r and L_s are.
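The color term can be illustrated with a short sketch; the container types and the per-pixel Euclidean norm follow the textual description of formula (1) above, and the function name colorDistance is an assumption.

```cpp
// Sketch of the color term d_E between two corresponding edge regions:
// sum over corresponding pixels of the RGB (Euclidean) color difference.
#include <cmath>
#include <cstddef>
#include <cstdint>
#include <vector>

struct RGBA { uint8_t r, g, b, a; };

double colorDistance(const std::vector<RGBA>& Ls, const std::vector<RGBA>& Lr) {
    double dE = 0.0;
    for (std::size_t k = 0; k < Ls.size() && k < Lr.size(); ++k) {
        const double dr = double(Ls[k].r) - double(Lr[k].r);
        const double dg = double(Ls[k].g) - double(Lr[k].g);
        const double db = double(Ls[k].b) - double(Lr[k].b);
        dE += std::sqrt(dr * dr + dg * dg + db * db);  // per-pixel Euclidean difference
    }
    return dE;  // smaller dE means more similar colors
}
```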
The present invention uses the Chamfer distance to measure the structural similarity of edge regions. Write the edge regions L_r and L_s as vectors of pixels: L_s = {p_00, p_01, ..., p_xy, ..., p_mn} and L_r = {q_00, q_01, ..., q_ij, ..., q_mn}, where ij and xy are pixel coordinates within the edge region, q_ij and p_xy are the label values of the corresponding pixels (the values of the alpha channel), and m and n are the length and width of the edge region. The Chamfer distance d_c'(L_r, L_s) from L_r to L_s is defined as the sum, over all pixels of L_r, of the minimum L∞ distance to a pixel of L_s with the same label value, as shown in formula (3); if L_s contains no pixel with the same label value, the distance for that pixel is set to twice the edge-region size. In formula (3), max(·) takes the larger of two values, min{·} takes the smallest value in a set, and d_xy is the distance associated with the pixel at coordinate (x, y) of the edge region. The L∞ distance between two pixels, d_{L∞}(q_ij, p_xy), is given by formula (2): it is the larger of the absolute differences of the corresponding coordinate components. By this definition the Chamfer distance is asymmetric; adding the Chamfer distance d_c'(L_r, L_s) from L_r to L_s and the Chamfer distance d_c'(L_s, L_r) from L_s to L_r gives the symmetric Chamfer distance d_c(L_s, L_r) of formula (4), which expresses the structural similarity of L_s and L_r.
$d_{L_\infty}(q_{ij}, p_{xy}) = \max(|i - x|,\ |j - y|)$    (2)
$d_c'(L_r, L_s) = \sum_{i=0}^{m} \sum_{j=0}^{n} \min\{\, d_{xy} \mid [\, d_{xy} = d_{L_\infty}(q_{ij}, p_{xy}) \cap q_{ij} = p_{xy} \,] \cup [\, d_{xy} = 2 \times \max(m, n) \cap q_{ij} \neq p_{xy} \,],\ 0 \le x \le m,\ 0 \le y \le n \,\}$    (3)
$d_c(L_s, L_r) = d_c'(L_r, L_s) + d_c'(L_s, L_r)$    (4)
The edge-region distance used by the present invention is d(L_s, L_r) = d_E(L_s, L_r) + w × d_c(L_s, L_r), where w is the weight of the Chamfer distance.
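The structural term and the combined distance can likewise be sketched directly from formulas (2)-(4); the row-major label layout and the function names are assumptions, and the brute-force nearest-same-label search mirrors the definition rather than an optimized implementation.

```cpp
// Sketch of the asymmetric Chamfer distance (formulas (2)-(3)), its
// symmetric form (4), and the combined distance d = d_E + w * d_c.
// Edge regions are assumed to be (m+1) x (n+1) label grids, row-major.
#include <algorithm>
#include <cstdint>
#include <cstdlib>
#include <vector>

using LabelGrid = std::vector<uint8_t>;  // alpha-channel labels, row-major

// d_c'(Lr, Ls): for every pixel of Lr, the L-infinity distance to the nearest
// pixel of Ls with the same label, or 2*max(m, n) when no such pixel exists.
double chamferAsym(const LabelGrid& Lr, const LabelGrid& Ls, int m, int n) {
    double sum = 0.0;
    for (int i = 0; i <= m; ++i) {
        for (int j = 0; j <= n; ++j) {
            const uint8_t q = Lr[i * (n + 1) + j];
            double best = 2.0 * std::max(m, n);      // fallback: no matching label
            for (int x = 0; x <= m; ++x) {
                for (int y = 0; y <= n; ++y) {
                    if (Ls[x * (n + 1) + y] == q) {   // same label value
                        // formula (2): L-infinity distance between the two pixels
                        const double d = std::max(std::abs(i - x), std::abs(j - y));
                        best = std::min(best, d);
                    }
                }
            }
            sum += best;
        }
    }
    return sum;
}

// Symmetric Chamfer distance (formula (4)) and the combined edge-region
// distance used to screen candidate blocks.
double chamferSym(const LabelGrid& Lr, const LabelGrid& Ls, int m, int n) {
    return chamferAsym(Lr, Ls, m, n) + chamferAsym(Ls, Lr, m, n);
}

double combinedDistance(double dE, double dC, double w) {
    return dE + w * dC;
}
```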
3.3 Screen the texture blocks of the sample image by the distance between edge regions, taking the texture blocks that satisfy a threshold as candidate blocks. Then select one block at random from this group of candidates and place it at the position to be synthesized, where it forms an overlap region with the already synthesized area.
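A minimal sketch of this screening follows; the patent states only that candidate blocks "satisfy a threshold", so the tolerance-over-best-distance rule used here is an assumption.

```cpp
// Sketch of step 3.3: keep the sample blocks whose edge-region distance to the
// block being synthesized is within a tolerance of the best distance, then
// pick one candidate at random.  Assumes dist is non-empty.
#include <algorithm>
#include <cstddef>
#include <cstdlib>
#include <limits>
#include <vector>

std::size_t pickCandidate(const std::vector<double>& dist, double tolerance) {
    double best = std::numeric_limits<double>::max();
    for (double d : dist) best = std::min(best, d);

    std::vector<std::size_t> candidates;
    for (std::size_t i = 0; i < dist.size(); ++i)
        if (dist[i] <= best * (1.0 + tolerance))      // assumed threshold rule
            candidates.push_back(i);

    return candidates[std::rand() % candidates.size()];  // random candidate block
}
```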
3.4 Find a minimum-error stitching path in the overlap region and stitch the newly selected block into the output texture image along this path.
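Step 3.4 can be illustrated with a dynamic-programming search for a least-error vertical seam, in the spirit of image-quilting minimum-error cuts; the patent does not specify the exact seam algorithm, so this formulation is an assumption.

```cpp
// Sketch: least-error vertical seam through an overlap region via dynamic
// programming.  err[r][c] is the per-pixel overlap error (e.g. squared color
// difference); err is assumed non-empty and rectangular.
#include <algorithm>
#include <vector>

std::vector<int> minErrorSeam(const std::vector<std::vector<double>>& err) {
    const int rows = static_cast<int>(err.size());
    const int cols = static_cast<int>(err[0].size());
    std::vector<std::vector<double>> cost = err;     // cumulative cost table

    // Accumulate: each cell adds the cheapest of the three cells above it.
    for (int r = 1; r < rows; ++r)
        for (int c = 0; c < cols; ++c) {
            double up = cost[r - 1][c];
            if (c > 0)        up = std::min(up, cost[r - 1][c - 1]);
            if (c + 1 < cols) up = std::min(up, cost[r - 1][c + 1]);
            cost[r][c] += up;
        }

    // Backtrack from the cheapest cell in the last row.
    std::vector<int> seam(rows);
    seam[rows - 1] = static_cast<int>(
        std::min_element(cost[rows - 1].begin(), cost[rows - 1].end()) -
        cost[rows - 1].begin());
    for (int r = rows - 2; r >= 0; --r) {
        int c = seam[r + 1], bestC = c;
        if (c > 0        && cost[r][c - 1] < cost[r][bestC]) bestC = c - 1;
        if (c + 1 < cols && cost[r][c + 1] < cost[r][bestC]) bestC = c + 1;
        seam[r] = bestC;                            // seam column for this row
    }
    return seam;
}
```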
The invention thus describes a texture synthesis method based on the Chamfer distance. It improves the search for similar texture blocks in block-based synthesis by considering not only the color information but also the structural information of the texture. Using the Chamfer distance to measure structural similarity makes the search for similar blocks more accurate, improves the quality of block-based synthesis, and solves the line-structure discontinuities that occur when synthesizing textures with salient structure.
The advantages of the present invention are as follows:
(1) Novel approach. Using the Chamfer distance to measure the structural similarity of textures makes the search for similar texture blocks more accurate and improves the quality of block-based synthesis.
(2) Fast computation. The similarity computation between texture blocks is implemented with CUDA and accelerated by the parallel computing power of the GPU, so the output texture can be synthesized quickly.
(3) Convenient to use. The user only needs to provide a small texture sample image and the output texture size; running the algorithm generates a texture image of the requested size that is visually similar to the sample image.
Brief description of the drawings
Fig. 1 shows the block-based texture synthesis scheme of the present invention; the blue portion of the output represents the already synthesized part, and the edge region of the block to be synthesized overlaps the synthesized part.
Fig. 2 shows the texture synthesis process of the present invention combined with the feature map.
Fig. 3 shows the initialization of the output image R' of the present invention.
Embodiment
With reference to Figures 1-3:
The block-based texture synthesis strategy adopted by the present invention is shown in Fig. 1: the target texture image is synthesized block by block in scanline order. For the block currently being synthesized, the similarity between it and the edge region of each block in the sample image is computed, and the similar blocks that satisfy a threshold are taken as candidate blocks. One block is then selected at random from this group of candidates and placed at the position to be synthesized, forming an overlap region with the already synthesized area (the overlap region is the edge region used to constrain block similarity). A minimum-error stitching path is found in the overlap region, and the newly selected block is stitched into the output texture along this path. The method focuses on improving the search for similar candidate blocks by considering both the color and the structural similarity of the texture, which improves the quality of the synthesized texture.
The inputs and outputs of the present invention are introduced with reference to Fig. 2. Because the candidate-block search considers both the color and the structural similarity of the texture, the inputs are the texture sample image S and the structural feature map F_s. F_s is a binary map corresponding to the sample image S that encodes its texture structure information. A concrete sample image and its corresponding structural feature map are shown in the two left images of Fig. 2. The two right images of Fig. 2 are the results generated by the present invention: the output texture image R and its corresponding structural feature map F_r.
The structure-preserving texture synthesis method based on the Chamfer distance of the present invention comprises the following steps:
Step 1: merge the texture sample image S and the texture feature map F_s into a four-channel image S' (that is, the format of each pixel is RGBA). The RGB channels of S' store the color values of the sample image, and the alpha channel stores the label values of the structural feature map corresponding to the sample image.
Step 2: initialize the output image R' by randomly choosing a region of S' and copying it to the upper-left corner of R', as shown in Fig. 3. R' likewise has four color channels: the RGB channels store the color values of the output texture image, and the alpha channel stores the label values of the structural feature map corresponding to the output texture image.
Step 3: synthesize the output image R' block by block in scanline order, specifically comprising the following sub-steps:
3.1 Collect the edge region L_r of the block currently being synthesized and the edge regions {L_s | L_s ∈ S} of all possible blocks in the sample image;
3.2 Compute the distance between the edge region L_r and every edge region {L_s | L_s ∈ S} in the sample image. The innovation of the present invention is that this distance computation evaluates not only the Euclidean distance between L_r and L_s but also the Chamfer distance between them. The Euclidean distance measures the color similarity of the edge regions and is computed from the values of the RGB channels; the Chamfer distance measures the structural similarity of the edge regions and is computed from the values of the alpha channel.
The Euclidean distance between edge regions used by the present invention is expressed by formula (1). In formula (1), p and q denote corresponding pixels of L_s and L_r respectively, and R(·), G(·), B(·) are the three color channels. The smaller d_E(L_s, L_r) is, the more similar the colors of L_r and L_s are.
The present invention uses the Chamfer distance to measure the structural similarity of edge regions. Write the edge regions L_r and L_s as vectors of pixels: L_s = {p_00, p_01, ..., p_xy, ..., p_mn} and L_r = {q_00, q_01, ..., q_ij, ..., q_mn}, where ij and xy are pixel coordinates within the edge region, q_ij and p_xy are the label values of the corresponding pixels (the values of the alpha channel), and m and n are the length and width of the edge region. The Chamfer distance d_c'(L_r, L_s) from L_r to L_s is defined as the sum, over all pixels of L_r, of the minimum L∞ distance to a pixel of L_s with the same label value, as shown in formula (3); if L_s contains no pixel with the same label value, the distance for that pixel is set to twice the edge-region size. In formula (3), max(·) takes the larger of two values, min{·} takes the smallest value in a set, and d_xy is the distance associated with the pixel at coordinate (x, y) of the edge region. The L∞ distance between two pixels, d_{L∞}(q_ij, p_xy), is given by formula (2): it is the larger of the absolute differences of the corresponding coordinate components. By this definition the Chamfer distance is asymmetric; adding the Chamfer distance d_c'(L_r, L_s) from L_r to L_s and the Chamfer distance d_c'(L_s, L_r) from L_s to L_r gives the symmetric Chamfer distance d_c(L_s, L_r) of formula (4), which expresses the structural similarity of L_s and L_r.
$d_{L_\infty}(q_{ij}, p_{xy}) = \max(|i - x|,\ |j - y|)$    (2)
$d_c'(L_r, L_s) = \sum_{i=0}^{m} \sum_{j=0}^{n} \min\{\, d_{xy} \mid [\, d_{xy} = d_{L_\infty}(q_{ij}, p_{xy}) \cap q_{ij} = p_{xy} \,] \cup [\, d_{xy} = 2 \times \max(m, n) \cap q_{ij} \neq p_{xy} \,],\ 0 \le x \le m,\ 0 \le y \le n \,\}$    (3)
$d_c(L_s, L_r) = d_c'(L_r, L_s) + d_c'(L_s, L_r)$    (4)
The edge-region distance used by the present invention is d(L_s, L_r) = d_E(L_s, L_r) + w × d_c(L_s, L_r), where w is the weight of the Chamfer distance.
3.3 Screen the texture blocks of the sample image by the distance between edge regions, taking the texture blocks that satisfy a threshold as candidate blocks. Then select one block at random from this group of candidates and place it at the position to be synthesized, where it forms an overlap region with the already synthesized area.
3.4 Find a minimum-error stitching path in the overlap region and stitch the newly selected block into the output texture image along this path.
The distance computation in step 3.2 is very time-consuming, so the present invention uses the parallel computing power of the GPU to accelerate it. Suppose the edge region L_r of the block currently being synthesized consists of n pixels, L_r = {q_1, q_2, ..., q_n}, and that there are m possible block edge regions in the sample image, each likewise consisting of n pixels. The distance computation then produces m distance values {d_1, d_2, ..., d_m}. The pseudocode of the general (serial) distance computation is shown in Table 1; this serial computation is very time-consuming.
Table 1: pseudocode of the serial distance computation (given as an image in the original document).
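Since Table 1 is reproduced only as an image, the following is a plausible reconstruction of the serial loop it describes; the EdgeRegion type and the colorDistance/chamferSym signatures are illustrative stand-ins for the per-region color and Chamfer terms sketched earlier (declared here for brevity).

```cpp
// Plausible reconstruction of the serial distance computation of Table 1:
// one distance value d_k per candidate edge region in the sample image.
#include <vector>

struct EdgeRegion {
    std::vector<unsigned char> rgb;     // interleaved R,G,B values of the region
    std::vector<unsigned char> labels;  // alpha-channel label values of the region
    int m = 0, n = 0;                   // grid extents (coordinates 0..m, 0..n)
};

double colorDistance(const EdgeRegion& Ls, const EdgeRegion& Lr);  // formula (1) term
double chamferSym(const EdgeRegion& Ls, const EdgeRegion& Lr);     // formulas (2)-(4) term

std::vector<double> serialDistances(const EdgeRegion& Lr,
                                    const std::vector<EdgeRegion>& sample,
                                    double w) {
    std::vector<double> d;
    d.reserve(sample.size());
    for (const EdgeRegion& Ls : sample)            // outer loop over the m candidates
        d.push_back(colorDistance(Ls, Lr) + w * chamferSym(Ls, Lr));
    return d;
}
```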
Computing the similarity of one pair of blocks is independent of computing that of any other pair, and computing the Euclidean distance and the L∞ distance of one pixel pair is independent of all other pixels, so the computation can be accelerated in parallel on the GPU. The CPU serial code is converted into CUDA kernel functions for parallel execution by unrolling the loops. The Chamfer-distance pseudocode of Table 1 contains three nested loops; the present invention unrolls the outermost and middle loops so that each thread of the CUDA kernel computes the distance of one pair of pixels. The similarity between blocks is represented by the distance between their edge regions, and the distance between two edge regions is the sum of the distances of all corresponding pixels, which the present invention computes with a parallel reduction (accumulation). The GPU computation of the present invention is shown in the pseudocode of Table 2 and is implemented with three CUDA kernel functions. The first kernel computes the minimum L∞ distance of each pixel; to guarantee the symmetry of the Chamfer distance, each GPU thread computes the minimum L∞ distances of a pair of pixels. The second kernel computes the Chamfer distance between edge regions, using a parallel reduction to sum the minimum L∞ distances of all pixels of an edge region. The third kernel computes the Euclidean distance between edge regions; only the outer loop is unrolled, so each GPU thread computes the Euclidean distance of one pair of edge regions.
Table 2: pseudocode of the parallel similarity computation, implemented with three CUDA kernel functions (given as an image in the original document).
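Since Table 2 is likewise reproduced only as an image, the following CUDA sketch illustrates the three-kernel scheme described above; the kernel signatures, thread/block mapping, and memory layout are assumptions, not the patent's exact pseudocode.

```cuda
// Minimal CUDA sketch of the three-kernel similarity computation.
#include <cuda_runtime.h>
#include <cstddef>
#include <cstdint>

// Kernel 1: for one candidate region, each thread handles one pixel position
// (i, j) and writes the minimum L-infinity distance to a same-label pixel in
// both directions (Lr -> Ls and Ls -> Lr), keeping the Chamfer distance symmetric.
__global__ void minLinfKernel(const uint8_t* lr, const uint8_t* ls, int m, int n,
                              float* minRtoS, float* minStoR) {
    int i = blockIdx.y * blockDim.y + threadIdx.y;     // row inside the edge region
    int j = blockIdx.x * blockDim.x + threadIdx.x;     // column inside the edge region
    if (i > m || j > n) return;
    int idx = i * (n + 1) + j;
    float fallback = 2.0f * max(m, n);                 // no same-label pixel found
    float bestR = fallback, bestS = fallback;
    for (int x = 0; x <= m; ++x)
        for (int y = 0; y <= n; ++y) {
            float d = (float)max(abs(i - x), abs(j - y));   // formula (2)
            int other = x * (n + 1) + y;
            if (ls[other] == lr[idx] && d < bestR) bestR = d;  // Lr -> Ls
            if (lr[other] == ls[idx] && d < bestS) bestS = d;  // Ls -> Lr
        }
    minRtoS[idx] = bestR;
    minStoR[idx] = bestS;
}

// Kernel 2: parallel reduction that sums the per-pixel minima of one candidate
// region into its symmetric Chamfer distance (formulas (3)-(4)).  Launch with one
// block, a power-of-two block size, and blockDim.x * sizeof(float) shared memory.
__global__ void chamferReduceKernel(const float* minRtoS, const float* minStoR,
                                    int count, float* dC) {
    extern __shared__ float partial[];
    int tid = threadIdx.x;
    float s = 0.0f;
    for (int k = tid; k < count; k += blockDim.x)
        s += minRtoS[k] + minStoR[k];
    partial[tid] = s;
    __syncthreads();
    for (int stride = blockDim.x / 2; stride > 0; stride >>= 1) {
        if (tid < stride) partial[tid] += partial[tid + stride];
        __syncthreads();
    }
    if (tid == 0) *dC = partial[0];
}

// Kernel 3: the Euclidean color term of formula (1); only the outer loop is
// unrolled, so each thread handles one pair of edge regions.
__global__ void euclideanKernel(const uint8_t* lrRGB, const uint8_t* sampleRGB,
                                int pixelsPerRegion, int numRegions, float* dE) {
    int r = blockIdx.x * blockDim.x + threadIdx.x;     // one thread per candidate region
    if (r >= numRegions) return;
    const uint8_t* ls = sampleRGB + (size_t)r * pixelsPerRegion * 3;
    float sum = 0.0f;
    for (int k = 0; k < pixelsPerRegion; ++k) {
        float dr = (float)lrRGB[3 * k]     - (float)ls[3 * k];
        float dg = (float)lrRGB[3 * k + 1] - (float)ls[3 * k + 1];
        float db = (float)lrRGB[3 * k + 2] - (float)ls[3 * k + 2];
        sum += sqrtf(dr * dr + dg * dg + db * db);     // per-pixel color difference
    }
    dE[r] = sum;
}
```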
The present invention proposes that when searching for matching blocks, not only the color information of the texture but also the structural features of the line elements it contains should be considered. The invention uses the Euclidean distance to compute color similarity and the Chamfer distance to compute structural similarity. The Chamfer distance discriminates shapes and edges well and captures the structural features of the texture, which ensures that the most similar block under the Chamfer distance preserves texture structure better. However, the Chamfer distance is very expensive to compute, so the present invention uses programmable graphics hardware (the GPU) to accelerate it and improve synthesis efficiency.

Claims (1)

1. A structure-preserving texture synthesis method based on the Chamfer distance, comprising the following steps:
Step 1: merge the texture sample image S and the texture feature map F_s into a four-channel image S', the format of each pixel being RGBA; the RGB channels of S' store the color values of the sample image, and the alpha channel stores the label values of the structural feature map corresponding to the sample image;
Step 2: initialize the output image R' by randomly choosing a region of S' and copying it to the upper-left corner of R'; R' likewise has four color channels: the RGB channels store the color values of the output texture image, and the alpha channel stores the label values of the structural feature map corresponding to the output texture image;
Step 3: synthesize the output image R' block by block in scanline order:
3.1 Collect the edge region L_r of the block currently being synthesized and the edge regions {L_s | L_s ∈ S} of all possible blocks in the sample image;
3.2 Compute the distance between the edge region L_r and every edge region {L_s | L_s ∈ S} in the sample image; the distance computation comprises computing the Euclidean distance and the Chamfer distance between L_r and L_s; the Euclidean distance measures the color similarity of the edge regions and is computed from the values of the RGB channels; the Chamfer distance measures the structural similarity of the edge regions and is computed from the values of the alpha channel;
The Euclidean distance d_E(L_s, L_r) between edge regions is expressed by formula (1):
$d_E(L_s, L_r) = \sum_{(p, q)} \sqrt{(R(p) - R(q))^2 + (G(p) - G(q))^2 + (B(p) - B(q))^2}$    (1)
In formula (1), p and q denote corresponding pixels of L_s and L_r respectively, and R(·), G(·), B(·) are the three color channels, so d_E(L_s, L_r) accumulates the color differences between corresponding pixels of L_s and L_r; the smaller d_E(L_s, L_r) is, the more similar the colors of L_r and L_s are.
The Chamfer distance measures the structural similarity of edge regions; write the edge regions L_r and L_s as vectors of pixels: L_s = {p_00, p_01, ..., p_xy, ..., p_mn} and L_r = {q_00, q_01, ..., q_ij, ..., q_mn}, where ij and xy are pixel coordinates within the edge region, q_ij and p_xy are the label values of the corresponding pixels (the values of the alpha channel), and m and n are the length and width of the edge region; the Chamfer distance d_c'(L_r, L_s) from L_r to L_s is defined as the sum, over all pixels of L_r, of the minimum L∞ distance to a pixel of L_s with the same label value, as shown in formula (3); if L_s contains no pixel with the same label value, the distance for that pixel is set to twice the edge-region size; in formula (3), max(·) takes the larger of two values, min{·} takes the smallest value in a set, and d_xy is the distance associated with the pixel at coordinate (x, y) of the edge region; the L∞ distance between two pixels, d_{L∞}(q_ij, p_xy), is given by formula (2), that is, the larger of the absolute differences of the corresponding coordinate components; by this definition the Chamfer distance is asymmetric, and adding the Chamfer distance d_c'(L_r, L_s) from L_r to L_s and the Chamfer distance d_c'(L_s, L_r) from L_s to L_r gives the symmetric Chamfer distance d_c(L_s, L_r) of formula (4); the symmetric Chamfer distance d_c(L_s, L_r) expresses the structural similarity of L_s and L_r;
$d_{L_\infty}(q_{ij}, p_{xy}) = \max(|i - x|,\ |j - y|)$    (2)
$d_c'(L_r, L_s) = \sum_{i=0}^{m} \sum_{j=0}^{n} \min\{\, d_{xy} \mid [\, d_{xy} = d_{L_\infty}(q_{ij}, p_{xy}) \cap q_{ij} = p_{xy} \,] \cup [\, d_{xy} = 2 \times \max(m, n) \cap q_{ij} \neq p_{xy} \,],\ 0 \le x \le m,\ 0 \le y \le n \,\}$    (3)
$d_c(L_s, L_r) = d_c'(L_r, L_s) + d_c'(L_s, L_r)$    (4)
The edge-region distance is computed as d(L_s, L_r) = d_E(L_s, L_r) + w × d_c(L_s, L_r), where w is the weight of the Chamfer distance;
3.3 Screen the texture blocks of the sample image by the distance between edge regions, taking the texture blocks that satisfy a threshold as candidate blocks; then select one block at random from this group of candidates and place it at the position to be synthesized, where it forms an overlap region with the already synthesized area;
3.4 Find a minimum-error stitching path in the overlap region and stitch the newly selected block into the output texture image along this path.
CN201310738949.7A 2013-12-27 2013-12-27 Structure preserving texture synthesis method based on Chamfer distance Pending CN103714561A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310738949.7A CN103714561A (en) 2013-12-27 2013-12-27 Structure preserving texture synthesis method based on Chamfer distance

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310738949.7A CN103714561A (en) 2013-12-27 2013-12-27 Structure preserving texture synthesis method based on Chamfer distance

Publications (1)

Publication Number Publication Date
CN103714561A true CN103714561A (en) 2014-04-09

Family

ID=50407502

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310738949.7A Pending CN103714561A (en) 2013-12-27 2013-12-27 Structure preserving texture synthesis method based on Chamfer distance

Country Status (1)

Country Link
CN (1) CN103714561A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106815879A (en) * 2017-01-17 2017-06-09 湖南优象科技有限公司 A kind of quick texture synthesis method based on LBP features
CN108010095A (en) * 2017-11-14 2018-05-08 阿里巴巴集团控股有限公司 The method, apparatus and equipment of a kind of textures synthesis
CN108364276A (en) * 2018-03-13 2018-08-03 重庆大学 Texture image synthetic method based on tag database
CN111915702A (en) * 2019-05-10 2020-11-10 浙江大学 Image processing method and device
CN111986284A (en) * 2020-08-14 2020-11-24 中国人民解放军战略支援部队信息工程大学 Image texture synthesis method and device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030206176A1 (en) * 2001-08-03 2003-11-06 Ritter Bradford A. System and method for performing texture synthesis
US20060028481A1 (en) * 2003-10-31 2006-02-09 Jingdan Zhang Synthesis of progressively-variant textures and application to arbitrary surfaces
CN102063705B (en) * 2010-12-02 2012-08-08 天津大学 Method for synthesizing large-area non-uniform texture
CN102867290A (en) * 2012-08-28 2013-01-09 浙江工业大学 Texture optimization-based non-homogeneous image synthesis method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030206176A1 (en) * 2001-08-03 2003-11-06 Ritter Bradford A. System and method for performing texture synthesis
US20060028481A1 (en) * 2003-10-31 2006-02-09 Jingdan Zhang Synthesis of progressively-variant textures and application to arbitrary surfaces
CN102063705B (en) * 2010-12-02 2012-08-08 天津大学 Method for synthesizing large-area non-uniform texture
CN102867290A (en) * 2012-08-28 2013-01-09 浙江工业大学 Texture optimization-based non-homogeneous image synthesis method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
FAN JING et al.: "An optimized texture-by-numbers synthesis method and its visual applications", Science China Information Sciences *
朱文浩等 [Zhu Wenhao et al.]: "基于样本的纹理合成技术综述" [Survey of sample-based texture synthesis techniques], 《中国图象图形学报》 [Journal of Image and Graphics] *
肖春霞等 [Xiao Chunxia et al.]: "结合图像细节特征的全局优化纹理合成" [Globally optimized texture synthesis combining image detail features], 《计算机学报》 [Chinese Journal of Computers] *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106815879A (en) * 2017-01-17 2017-06-09 湖南优象科技有限公司 A kind of quick texture synthesis method based on LBP features
CN106815879B (en) * 2017-01-17 2019-11-05 湖南优象科技有限公司 A kind of quick texture synthesis method based on LBP feature
CN108010095A (en) * 2017-11-14 2018-05-08 阿里巴巴集团控股有限公司 The method, apparatus and equipment of a kind of textures synthesis
CN108364276A (en) * 2018-03-13 2018-08-03 重庆大学 Texture image synthetic method based on tag database
CN108364276B (en) * 2018-03-13 2021-12-03 重庆大学 Texture image synthesis method based on label database
CN111915702A (en) * 2019-05-10 2020-11-10 浙江大学 Image processing method and device
CN111986284A (en) * 2020-08-14 2020-11-24 中国人民解放军战略支援部队信息工程大学 Image texture synthesis method and device
CN111986284B (en) * 2020-08-14 2024-04-05 中国人民解放军战略支援部队信息工程大学 Texture synthesis method and device for image

Similar Documents

Publication Publication Date Title
US11455712B2 (en) Method and apparatus for enhancing stereo vision
CN103714561A (en) Structure preserving texture synthesis method based on Chamfer distance
Yuan et al. Superpixel-based seamless image stitching for UAV images
Hickson et al. Efficient hierarchical graph-based segmentation of RGBD videos
Wang et al. Video tooning
Zhang et al. Style transfer via image component analysis
CN105321177B (en) A kind of level atlas based on image importance pieces method together automatically
MXPA03001171A (en) Image conversion and encoding techniques.
KR20150032176A (en) Color video processing system and method, and corresponding computer program
CN105046689B (en) A kind of interactive stereo-picture fast partition method based on multi-level graph structure
Dong et al. Summarization-based image resizing by intelligent object carving
CN108829711A (en) A kind of image search method based on multi-feature fusion
CN108629809A (en) A kind of accurate efficient solid matching method
Du et al. Double-channel guided generative adversarial network for image colorization
JP5463269B2 (en) Feature figure addition method, feature figure detection method, feature figure addition device, feature figure detection device, and program
CN102572305B (en) Method of video image processing and system
Shesh et al. Efficient and dynamic simplification of line drawings
Han et al. Optimal multiple surfaces searching for video/image resizing-a graph-theoretic approach
CN112258561A (en) Matching point acquisition method for image stitching
Li et al. Short-long-term propagation-based video inpainting
CN109785367A (en) Exterior point filtering method and device in threedimensional model tracking
Kowdle et al. Scribble based interactive 3d reconstruction via scene co-segmentation
Hall Nonphotorealistic Rendering by Q‐mapping
CN110009654B (en) Three-dimensional volume data segmentation method based on maximum flow strategy
CN102930569B (en) Generation method for irregular-scale photomosaic

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20140409