CN1570977A - Grain transferring method based on multiple master drawings - Google Patents


Info

Publication number
CN1570977A
Authority
CN
China
Prior art keywords
texture
block
image
brightness
vein
Prior art date
Legal status
Granted
Application number
CN 200410034726
Other languages
Chinese (zh)
Other versions
CN1256706C (en)
Inventor
齐越
赵沁平
Current Assignee
Beihang University
Original Assignee
Beihang University
Priority date
Filing date
Publication date
Application filed by Beihang University
Priority to CN 200410034726 (patent CN1256706C)
Publication of CN1570977A
Application granted
Publication of CN1256706C
Expired - Fee Related

Landscapes

  • Image Generation (AREA)

Abstract

This invention is a texture transfer method based on multiple sample textures. With user participation, the method divides the luminance range of the target image into several levels. It then synthesizes texture for each luminance level from a different sample texture using a patch-based method. Finally, it computes the luminance variation of each synthesized texture block; if the variation exceeds a user-supplied threshold, the block is subdivided into smaller blocks until the luminance variation of every sub-block falls below the threshold. Texture for these subdivided blocks is synthesized with the L-shaped neighborhood search method.

Description

Texture transfer method based on multiple sample textures
Technical field
The invention belongs to the field of computer virtual reality; specifically, it is a method that uses existing texture images to generate new texture images, for use in constructing and rendering realistic virtual environments.
Background technology
In the real world, object surfaces carry very rich texture detail, which lets people distinguish identically shaped but different objects; simulating surface texture detail therefore plays an important role in photorealistic rendering. How to show such detail effectively in computer-generated images has long been a hot topic in computer graphics and in the construction and rendering of virtual environments.
There are two main ways to simulate texture detail on computer-generated objects. One is to model the surface detail with polygons or other geometric primitives, but when the detail is very fine this becomes far too complex to be practical. The other is to map an image onto the surface of the object, a technique called texture mapping. The image used for mapping is normally rectangular and is called a texture map or texture. Textures can simulate various surface properties, such as color, reflection, and transparency. Since Catmull first used texture mapping to model surface detail in the 1970s, the technique has been widely studied and applied.
Texture mapping has been very successful in computer graphics, but three serious problems remain. First, texture sources are limited: textures are mainly obtained from hand-drawn pictures or scanned photographs. Hand-drawn pictures can satisfy artistic needs but are hard to make as realistic as photographs, while photographs used as textures are usually too small to cover an entire object surface, and simply tiling them produces visible artificial seams. Second, there is no natural mapping from texture space to the surface, so mapping can introduce severe distortion. Third, textures consume large amounts of memory and bandwidth.
For these reasons, texture synthesis techniques have been proposed. A texture generated by synthesis can be of any size and visually free of repetition; with suitable handling of boundary conditions, synthesis can also generate tileable images.
The goal of texture synthesis is to generate a new texture that visually appears to have been produced by the same underlying process as the original.
Texture synthesis takes two forms: generating texture from a model, and generating new texture from an existing texture image (also called a sample texture). Texture synthesis must solve two problems. The first is modeling: how to estimate the texture-generating process from a given, limited sample. The estimate must capture both the structured and the random parts of the input texture; its success is judged by how realistic the synthesized texture looks relative to the given sample. The second is sampling: how to develop an efficient sampling procedure that generates new texture from the model. The efficiency of the sampling procedure directly determines the computational cost of synthesis.
Wei [1] proposed the method of L-shaped neighborhood search, using multiresolution synthesis to reduce the search range and tree-structured vector quantization for acceleration. Ashikhmin [2] improved Wei's method with a coherence principle that limits the search to candidates derived from the current neighborhood, thereby improving search speed.
The L-shaped texture synthesis method works as follows: for each location in the region being synthesized, find the pixel in the sample with maximum neighborhood similarity and copy it. In detail:
The similarity of two identically shaped neighborhoods N1 and N2 is measured by:

d(N1, N2) = Σ_{p∈N1, q∈N2} [ (R(p) − R(q))² + (G(p) − G(q))² + (B(p) − B(q))² ]    (formula 1)
The smaller the value of d, the more similar the two neighborhoods. Here the functions R(·), G(·) and B(·) are the R (red), G (green) and B (blue) values of a pixel, and p and q are corresponding pixels of N1 and N2. The shape of the neighborhood and the search order are therefore very important: they directly affect the result of local texture matching.
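As a concrete illustration, the neighborhood distance can be sketched in a few lines of Python (the function name and the flat list-of-tuples pixel representation are our own, not from the patent):

```python
def neighborhood_distance(n1, n2):
    """Sum of squared RGB differences over corresponding pixels of two
    identically shaped neighborhoods, each a list of (R, G, B) tuples."""
    return sum((r1 - r2) ** 2 + (g1 - g2) ** 2 + (b1 - b2) ** 2
               for (r1, g1, b1), (r2, g2, b2) in zip(n1, n2))
```

Identical neighborhoods give d = 0; the best match minimizes d.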
Ashikhmin [2] used a coherence principle to limit the search range to candidates derived from the current neighborhood, thereby improving search speed, as in Fig. 2. For each pixel already placed in the L-shaped neighborhood, its source position in the input image (the pixel indicated by the arrow in Fig. 2) is shifted by the corresponding offset to yield a candidate pixel. For example, let Q be the upper-right neighbor of the current output pixel P, and let Q1 be the match of Q in the input image. Set up a coordinate system with the origin at the upper-left corner of the image, the positive x axis pointing right and the positive y axis pointing down; then P and Q differ by −1 horizontally and +1 vertically, so P = Q + (−1, 1), and the candidate pixel is obtained as Q1 + (−1, 1). All candidate pixels are computed in this way (the black dots in Fig. 2). The L-shaped neighborhood of each candidate is compared with that of the current pixel P, the similarity is computed with (formula 1), and the candidate with minimum d is selected for the current pixel.
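The offset arithmetic in this example can be sketched as follows (a minimal illustration assuming each already-synthesized neighbor's source position is known; names are ours):

```python
def coherence_candidates(source_of, width, height):
    """source_of maps each synthesized neighbor's offset (dx, dy) from the
    current pixel P to the input-image position its value was copied from.
    As in the text: if neighbor Q = P + (dx, dy) was copied from Q1, the
    candidate for P is Q1 - (dx, dy).  Candidates outside the input image
    (width x height) are discarded."""
    candidates = set()
    for (dx, dy), (sx, sy) in source_of.items():
        cx, cy = sx - dx, sy - dy
        if 0 <= cx < width and 0 <= cy < height:
            candidates.add((cx, cy))
    return candidates
```

For the upper-right neighbor Q at offset (1, −1) whose source is (5, 3), the candidate is (5, 3) + (−1, 1) = (4, 4), matching the P = Q + (−1, 1) relation above.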
The L-shaped texture synthesis algorithm is as follows:
(1) Record in a buffer, for each output pixel, the position in the input image from which it was copied.
(2) For each pixel of the output image, in scanline order:
● consider the L-shaped neighborhood of the specified size around the current pixel in the output image, as in Fig. 2;
● for each pixel in that neighborhood, look up its recorded source position in the buffer and apply the corresponding offset to obtain a candidate pixel;
● remove duplicate candidates;
● from the candidate list, choose the pixel whose L-shaped neighborhood has minimum error against that of the current output pixel;
● copy the value of the selected pixel from the input image to the current output pixel, and record its source position in the buffer.
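The steps above can be sketched end to end on a grayscale image. This is an illustrative toy (single resolution, coherence candidates with a random fallback), not the accelerated method of [1]; all names are ours:

```python
import random

def synthesize(inp, out_h, out_w, seed=0):
    """Toy L-shaped (causal) neighborhood synthesis.  `inp` is a grayscale
    image as a list of rows; returns an out_h x out_w synthesized image."""
    rng = random.Random(seed)
    h, w = len(inp), len(inp[0])
    causal = [(-1, -1), (0, -1), (1, -1), (-1, 0)]  # L-shaped offsets
    out = [[0] * out_w for _ in range(out_h)]
    src = {}  # output position -> input position it was copied from

    def err(cx, cy, x, y):
        # neighborhood error of input candidate (cx, cy) vs output pixel (x, y)
        e = 0
        for dx, dy in causal:
            ox, oy = x + dx, y + dy
            ix, iy = cx + dx, cy + dy
            if (ox, oy) in src and 0 <= ix < w and 0 <= iy < h:
                e += (out[oy][ox] - inp[iy][ix]) ** 2
        return e

    for y in range(out_h):
        for x in range(out_w):
            # coherence candidates: shift each neighbor's source back by its offset
            cands = set()
            for dx, dy in causal:
                prev = (x + dx, y + dy)
                if prev in src:
                    sx, sy = src[prev]
                    c = (sx - dx, sy - dy)
                    if 0 <= c[0] < w and 0 <= c[1] < h:
                        cands.add(c)
            if not cands:  # no usable candidates: fall back to a random pixel
                cands = {(rng.randrange(w), rng.randrange(h))}
            cx, cy = min(cands, key=lambda c: err(c[0], c[1], x, y))
            out[y][x] = inp[cy][cx]
            src[(x, y)] = (cx, cy)
    return out
```

A constant input necessarily yields a constant output, which is a quick sanity check on the copy-and-record loop.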
Xu [3], Liang [4] and Efros [5] adopted patch-based techniques, whose distinguishing feature is fast synthesis; Liang in particular achieved real-time texture synthesis. The patch-based texture synthesis method is described below. For convenience of narration, the symbols used in this method are first defined:
I_in — the input sample texture
I_out — the synthesized texture
B_k — a square texture block in I_in, to be pasted into I_out
w_B — the width of B_k
E_Bk — the boundary region where B_k overlaps the already-synthesized texture in I_out
E_out^k — the boundary region of the blocks already pasted into I_out
w_E — the width of E_out^k
B_(x,y) — the texture block in I_in whose lower-left corner is at (x, y)
Ψ_B — the set of all texture blocks of I_in whose boundary region matches E_out^k
Let I_R1 and I_R2 be two texture blocks of identical shape and size. Definition: if d(I_R1, I_R2) < δ, then I_R1 and I_R2 match. Here d(·) denotes the distance between two texture blocks and δ is a constant.
The patch-based sampling algorithm estimates the local conditional MRF (FRAME or Gibbs) density p(I_R | I_∂R) with a nonparametric empirical histogram. The boundary region of a texture block I_R is ∂R, with width w_E. When the texture I_∂R on the boundary region is known, the conditional distribution of the unknown texture block I_R can be estimated. No model need be constructed: for every block whose boundary region matches I_∂R, a direct query over the entire input sample I_in is made, and the query results form an empirical histogram Ψ for I_R. To synthesize I_R, simply select an arbitrary element from Ψ. Mathematically, the estimated conditional MRF density is:

p(I_R | I_∂R) = Σ_i α_i δ(I_R − I_R^(i)),

where I_R^(i) is a sample block of I_R whose boundary region I_∂R^(i) matches the boundary region I_∂R, and the normalized weight α_i acts as a scale factor.
To synthesize an arbitrary texture I_out, construct the set Ψ_B:

Ψ_B = { B_(x,y) | d(E_B(x,y), E_out^k) < d_max, B_(x,y) ∈ I_in }
Here d_max is the distance tolerance for boundary regions. Any texture block in Ψ_B may be selected as the k-th block to paste. For a given d_max, the set Ψ_B may be empty; in that case, select the block B_k of I_in with minimum distance d(E_Bk, E_out^k). The patch-based texture synthesis process is shown in Fig. 3. The specific algorithm is as follows:
(1) Select an arbitrary w_B × w_B texture block B_0 from I_in and paste it into the lower-left corner of I_out; set k = 1.
(2) Construct the set Ψ_B of all texture blocks of I_in whose boundary region matches E_out^k.
(3) If Ψ_B is empty, set Ψ_B = {B_min}, where B_min is the block whose boundary region is closest to E_out^k.
(4) Select an arbitrary element of Ψ_B as the k-th texture block B_k and paste B_k into the output texture I_out; set k = k + 1.
(5) Repeat (2), (3) and (4) until I_out is completely covered.
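Steps (2)–(3) — building Ψ_B with the B_min fallback — can be sketched as follows (the overlap_err callback stands in for the boundary distance d(E_B(x,y), E_out^k); names and the row-major coordinates are ours):

```python
def candidate_blocks(inp, block_w, overlap_err, d_max):
    """Build the set Psi_B of block positions (x, y) in `inp` whose
    boundary-region distance overlap_err(x, y) is below d_max.  If no
    block qualifies, fall back to the single closest block B_min."""
    h, w = len(inp), len(inp[0])
    positions = [(x, y) for y in range(h - block_w + 1)
                 for x in range(w - block_w + 1)]
    psi = [(x, y) for x, y in positions if overlap_err(x, y) < d_max]
    if not psi:  # Psi_B empty: use the block with minimum boundary distance
        psi = [min(positions, key=lambda p: overlap_err(*p))]
    return psi
```

Pasting then picks any element of the returned list, which is what gives patch-based synthesis its speed and its slight randomness.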
Fusion on the boundary region is done as follows: let the blending factor be a real number d whose value ramps gradually from 1 to 0 according to pixel position. If the pixel values of the overlapping parts of the two textures are stored in buffers image1 and image2, and the fused result is stored in buffer image, the blending formula is: image = d*image1 + (1 − d)*image2.
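The fusion formula can be sketched for a one-dimensional overlap (a minimal illustration; a real implementation ramps d per pixel across the two-dimensional overlap region):

```python
def blend_overlap(strip1, strip2):
    """Feather two overlapping strips of equal length: the factor d ramps
    linearly from 1 to 0 across the overlap, so each fused value is
    d*strip1 + (1 - d)*strip2."""
    n = len(strip1)
    return [a if n == 1 else (1 - i / (n - 1)) * a + (i / (n - 1)) * b
            for i, (a, b) in enumerate(zip(strip1, strip2))]
```

At one end the result equals strip1, at the other strip2, with a smooth transition between, which hides the seam between pasted blocks.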
Patch-based texture synthesis is fast, but the images it can synthesize are limited. The L-shaped neighborhood search method is slow, but its synthesis process is easy to control and can produce new, unexpected texture images from the sample. Existing texture transfer methods are all based on a single sample texture and use L-shaped synthesis, so they are very slow.
Summary of the invention
To overcome the above shortcomings, the object of the present invention is to provide a texture transfer method based on multiple sample textures. The method synthesizes ("generates") new texture images from existing ones, broadening the sources of texture and satisfying the demands of realistic rendering in virtual environments.
The present invention uses multiple sample textures to realize texture transfer. The specific practice is: first, the target image is coarsely partitioned into blocks by luminance, a different sample texture is adopted for each luminance level, and texture transfer is realized with the patch-based texture synthesis method; then, every block whose luminance variation exceeds a user-supplied threshold is subdivided until the variation falls below the threshold, and texture transfer for the subdivided blocks is realized with the L-shaped neighborhood synthesis method.
Compared with the prior art, the beneficial effects of the present invention are: new texture can be generated from existing texture; the transfer proceeds from coarse to fine with progressive refinement and is fast; and the user can control which sample textures realize the transfer.
Description of drawings
Fig. 1 is the main flowchart of the texture transfer method based on multiple sample textures of the present invention;
Fig. 2 illustrates the L-shaped texture synthesis method adopted by the present invention;
Fig. 3 illustrates the patch-based texture synthesis process adopted by the present invention.
Embodiment
The present invention is elaborated below in conjunction with the drawings and a specific embodiment.
Previous texture transfer methods were mostly based on a single sample texture. The present invention transfers texture based on multiple samples, fully combining Liang's patch-based method and Ashikhmin's L-shaped neighborhood search method, and proceeding from coarse to fine with progressive refinement.
The texture transfer problem for multiple samples can be summarized as the following problem:

I_1 + I_2 + … + I_n + T = R

where I_1, I_2, …, I_n are different sample textures, T is the target image, and R is the synthesized texture image. That is, each sample I_i synthesizes texture onto the target image T according to the user's constraint, generating a new image. In the present invention, the user's constraint is the luminance variation of the target image: different sample textures are used to synthesize texture for different luminance ranges.
The main flowchart of the present invention is shown in Fig. 1; the concrete steps are as follows:
(1) Realize texture transfer with the patch-based method
First the luminance of the target image is analyzed, and a correspondence is established between the luminance levels and the sample textures. Most images store color in RGB form (R the red value, G the green value, B the blue value), but RGB values do not reflect luminance variation well, so we adopt the YIQ form (Y luminance, I the in-phase signal, Q the quadrature signal). The conversion from RGB to YIQ is carried out according to the following formula:
[Y]   [0.299  0.587  0.114] [R]
[I] = [0.596 −0.274 −0.322] [G]
[Q]   [0.211 −0.523  0.312] [B]
The luminance analysis of the image is then carried out on the luminance value Y.
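The conversion can be sketched as a direct transcription of the matrix above (the function name is ours):

```python
def rgb_to_yiq(r, g, b):
    """RGB -> YIQ per the conversion matrix above; the Y component alone
    suffices for the luminance analysis."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    i = 0.596 * r - 0.274 * g - 0.322 * b
    q = 0.211 * r - 0.523 * g + 0.312 * b
    return y, i, q
```

White (255, 255, 255) maps to full luminance Y = 255, and black to 0, since the Y-row coefficients sum to 1.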
The multiple-sample texture transfer method is illustrated below with the two-sample case as an example.
According to a user-supplied luminance threshold Y, the target image is divided into two levels. The basic classification unit of the image is an m × m block; experience shows that for an image of resolution n × n, m = n/. For the two levels, two different sample textures are used respectively, and texture is synthesized with the patch-based method, realizing a preliminary texture transfer.
(2) Realize texture transfer with the L-shaped neighborhood search method
The texture generated by the patch-based method is still coarse and needs further refinement. For each m × m texture block of each luminance level in the preliminary composite image, compute the difference between the maximum and minimum pixel luminance in the block. If it is below the user-supplied threshold, keep the block; otherwise, subdivide the m × m block into (m/2) × (m/2) blocks and compare luminance values again, repeating until the luminance variation of the pixels within every block is below the threshold. For these low-variation blocks, texture is synthesized with the L-shaped neighborhood search method, using the sample texture assigned to each block's luminance level.
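The subdivision loop can be sketched recursively (a sketch assuming a square, power-of-two block size over a grayscale luminance image; names are ours):

```python
def subdivide(img, x, y, size, threshold, min_size=1):
    """Recursively split a size x size block at (x, y) into four half-size
    blocks until the luminance range (max - min) within each block is
    below threshold.  Returns the kept blocks as (x, y, size) triples."""
    vals = [img[y + j][x + i] for j in range(size) for i in range(size)]
    if max(vals) - min(vals) < threshold or size <= min_size:
        return [(x, y, size)]
    h = size // 2
    blocks = []
    for ox, oy in ((0, 0), (h, 0), (0, h), (h, h)):
        blocks += subdivide(img, x + ox, y + oy, h, threshold, min_size)
    return blocks
```

A uniform block is kept whole; a block with one bright quadrant splits once into four uniform sub-blocks.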
The patch-based texture synthesis method and the L-shaped texture synthesis method used in this example are known techniques described in the background section, so they are not repeated here.
The present invention can also use more sample textures. For example, with 3 sample textures the user supplies two luminance thresholds Y1 and Y2, with Y2 > Y1; according to the luminance value Y of the target image (Y ≤ Y1; Y1 < Y < Y2; Y ≥ Y2), the target image is divided into three levels, and different sample textures are then used to transfer texture for each level as described above.
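The three-level classification described here can be sketched as follows (boundary handling follows the text exactly: ≤ Y1, strictly between, ≥ Y2; the function name is ours):

```python
def luminance_level(y, y1, y2):
    """Map luminance y to a sample-texture index for two thresholds
    Y1 < Y2: y <= Y1 -> 0, Y1 < y < Y2 -> 1, y >= Y2 -> 2."""
    assert y2 > y1, "thresholds must satisfy Y2 > Y1"
    if y <= y1:
        return 0
    return 1 if y < y2 else 2
```

Each index selects which sample texture synthesizes that region of the target image.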
The foregoing embodiment only illustrates the steps of the present invention and does not limit its scope of patent protection; any improvement made according to the method of the invention falls within the claims of the present invention.
References
[1] Li-Yi Wei, Marc Levoy. Fast Texture Synthesis using Tree-structured Vector Quantization. In: Proceedings of SIGGRAPH. Los Angeles: ACM Press, 2000. 479–488
[2] M. Ashikhmin. Synthesizing natural textures. In: 2001 ACM Symposium on Interactive 3D Graphics. Los Angeles: ACM Press, 2001. 217–226
[3] Y. Xu, B. Guo, and H.-Y. Shum. Chaos Mosaic: Fast and Memory Efficient Texture Synthesis. Tech. Rep. MSR-TR-2000-32, Microsoft Research, 2000
[4] L. Liang, C. Liu, Y. Xu, B. Guo, et al. Real-time texture synthesis by patch-based sampling. Technical Report MSR-TR-2001-40, Microsoft Research, March 2001
[5] Alexei A. Efros, William T. Freeman. Image Quilting for Texture Synthesis and Transfer. In: Proceedings of SIGGRAPH. Los Angeles: ACM Press, 2001. 341–347

Claims (2)

1. A texture transfer method based on multiple sample textures, characterized in that it comprises the following steps:
(1) Perform a preliminary texture transfer
Subdivide the target image into image blocks; determine n − 1 luminance classification thresholds according to the luminance of the target image; classify the image blocks into n levels by these thresholds; use a different one of n sample textures for each level; and synthesize each image block into a texture block with the patch-based method, thereby realizing a preliminary texture transfer;
(2) Subdivide texture blocks by luminance difference
For each texture block, compute the difference between its maximum and minimum luminance values; if the difference is below the specified threshold, keep the texture block; otherwise subdivide it and compare the luminance difference of the subdivided blocks again, repeating until the luminance difference of every texture block is below the specified threshold;
(3) Texture synthesis
For each texture block whose luminance difference is below the specified threshold, use the sample texture corresponding to its luminance level and synthesize texture with the L-shaped texture synthesis method.
2. The texture transfer method based on multiple sample textures of claim 1, characterized in that the luminance of each pixel of the target image in step (1) is obtained by the following formula:

Y = 0.299R + 0.587G + 0.114B

where R, G, B are the color values of the image and Y is the luminance value.
CN 200410034726 2004-05-09 2004-05-09 Grain transferring method based on multiple master drawings Expired - Fee Related CN1256706C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN 200410034726 CN1256706C (en) 2004-05-09 2004-05-09 Grain transferring method based on multiple master drawings


Publications (2)

Publication Number Publication Date
CN1570977A (en) 2005-01-26
CN1256706C CN1256706C (en) 2006-05-17

Family

ID=34481544



Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101013501B (en) * 2006-01-31 2011-02-23 索尼株式会社 Image processing apparatus
CN103229213A (en) * 2010-11-29 2013-07-31 汤姆逊许可公司 Method and device for reconstructing a self-similar textured region of an image
US9367932B2 (en) 2010-11-29 2016-06-14 Thomson Licensing Method and device for reconstructing a self-similar textured region of an image
CN103229213B (en) * 2010-11-29 2016-08-10 汤姆逊许可公司 The method and apparatus rebuilding the self similarity texture region of image
CN105354281A (en) * 2014-02-03 2016-02-24 株式会社隆创 Image inspection apparatus and image inspection procedure
CN105354281B (en) * 2014-02-03 2018-12-07 株式会社隆创 Image testing device and image checking method
CN106780701A (en) * 2016-11-23 2017-05-31 深圳大学 The synthesis control method of non-homogeneous texture image, device, storage medium and equipment
CN106780701B (en) * 2016-11-23 2020-03-13 深圳大学 Non-uniform texture image synthesis control method, device, storage medium and equipment
CN109520972A (en) * 2018-12-04 2019-03-26 青岛理工大学 Hierarchical visibility measuring method and device
CN112235476A (en) * 2020-09-15 2021-01-15 南京航空航天大学 Test data generation method based on fusion variation

Also Published As

Publication number Publication date
CN1256706C (en) 2006-05-17


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20060517

Termination date: 20120509