CN111507946A - Element data driven flower type pattern rapid generation method based on similarity sample - Google Patents
- Publication number: CN111507946A (application CN202010255136.2A; authority: CN, China)
- Legal status: Pending
- Prior art keywords: pattern, similarity, elements, image, sample
Classifications
- G06T7/001 — Industrial image inspection using an image reference approach
- G06T3/608 — Rotation of whole images or parts thereof by skew deformation, e.g. two-pass or three-pass rotation
- G06T5/80 — Geometric correction
- G06T5/90 — Dynamic range modification of images or parts thereof
- G06T7/11 — Region-based segmentation
- G06T7/13 — Edge detection
- G06T7/187 — Segmentation involving region growing, region merging, or connected component labelling
- G06T7/194 — Foreground-background segmentation
- G06T2207/20112 — Image segmentation details
- G06T2207/20164 — Salient point detection; corner detection
- G06T2207/30108 — Industrial image inspection
- G06T2207/30124 — Fabrics; textile; paper
Abstract
The invention discloses an element-data-driven method for rapidly generating flower type patterns based on a similarity sample, comprising the following steps: perform cyclic-unit detection and splicing-mode judgment on the input pattern, apply gray-level enhancement processing to the unit picture, and extract similarity elements; segment the element objects of the pattern image by region and remove the background and irrelevant parts; generate a new similar pattern by comparing element similarity against an element library, replacing elements of the same type, and combining the replaced element layers to obtain the similar pattern. The method ensures, to the maximum extent, that the newly generated pattern shares the style and layout of the original input sample pattern, is not limited to one specific style or layout, and has broad practicability and operability.
Description
Technical Field
The invention relates to a rapid generation method of a flower type pattern, belongs to the technical field of computer aided design, and particularly relates to a rapid generation method of a flower type pattern driven by element data based on a similarity sample.
Background
The textile and clothing industry is labor-intensive, and large quantities of patterned fabrics must be generated to meet people's daily-life and aesthetic needs. Textile pattern design varies with fashion trends and mass-market preferences. Patterns produced in industrial mass production exhibit, among others, the following characteristics: the style positioning and design theme must be very clear, the response time of pattern design must be very fast, the content of pattern design must change in very diverse ways, and the required richness of design elements keeps growing. Research and statistical analysis show that most new fabric products with pattern designs developed in actual textile and clothing production are completed by reusing or slightly modifying existing pattern elements. Therefore, pattern design in industrial production increasingly depends on the efficiency of computer-aided design methods and on the abundant element material on the Internet to improve design efficiency and reduce design cost. Many researchers have attempted to generate patterns from different perspectives using computer-aided design methods, including pattern generation from topological structures, generation methods for a fixed style, fractal pattern design, evolutionary-art wallpaper pattern design, and printed-pattern generation based on repetitive pattern discovery.
The outline of the pattern content generated by existing methods is fixed or must be customized by the user, so it is difficult to satisfy the demand for rich pattern modeling in actual production, and only patterns of one specific style can be generated. If the user needs an ideal style, a great deal of time must be spent on manual modification, and neither the efficiency nor the effect of the design meets the requirements for quick change and quick response in actual production. Consequently, no mature related software product on the market has been adopted and popularized in the actual production of the textile and clothing industry.
Disclosure of Invention
The invention aims to solve the defects of the prior art and provide a similarity sample-based element data-driven pattern rapid generation method for rapidly generating pattern patterns with similarity association, which can be applied to, but not limited to, pattern design of industrially produced clothes, wall cloth, home textiles, packaging and other products.
In order to achieve the purpose, the invention discloses a method for quickly generating a pattern based on element data drive of a similarity sample, which comprises the following steps:
step 1, generating an element layer: inputting a sample pattern, detecting a cyclic unit, performing gray level enhancement processing on the unit picture, and extracting similarity elements.
Step 2, element object segmentation processing: and (4) segmenting the extracted element objects in the sample pattern according to the regions, and removing the background and irrelevant parts.
And 3, replacing similar elements: and performing similarity comparison on the segmented elements in an element library, performing element replacement on the elements of the same type, and combining the replaced element layers to obtain similar flower type patterns.
Compared with the prior art, the method has the following beneficial effects: it realizes rapid generation of new patterns based on existing patterns, simplifies the complicated pattern analysis and design process, and reduces the time and cost of manually analyzing pattern cycles, elements, splicing, and combination. For the textile pattern design industry, characterized by high-volume production, diversity, short cycles, speed, and fashion, it can markedly reduce workers' labor intensity and enterprises' labor costs. The method is convenient to operate, the time required to generate a new pattern with a specific style and theme is very short, and the user does not need to perform complicated editing and operations on the pattern outline. The method ensures, to the maximum extent, that the newly generated pattern shares the style and layout of the original input sample pattern, is not limited to one specific style or layout, and has broad practicability and operability; it can be applied directly in pattern-design industries such as textiles, clothing, packaging, printing, and decoration, and can markedly reduce the time and cost of pattern design and development.
Drawings
FIG. 1 is a schematic process flow diagram of the present invention.
Detailed Description
The invention is further illustrated with reference to the following figures and examples.
As shown in FIG. 1, the method for quickly generating the element data-driven pattern based on the similarity sample provided by the invention comprises the following steps:
step 1, generating an element layer: inputting a sample pattern, detecting a cyclic unit, performing gray level enhancement processing on the unit picture, and extracting similarity elements.
Step 2, element object segmentation processing: and (4) segmenting the extracted element objects in the sample pattern according to the regions, and removing the background and irrelevant parts.
And 3, replacing similar elements: and performing similarity comparison on the segmented elements in an element library, performing element replacement on the elements of the same type, and combining the replaced element layers to obtain similar flower type patterns.
The step 1 comprises the following steps:
Step 1-1, cyclic unit detection: the pattern of industrial mass production can be regarded as a periodically repeated pattern. The content of such patterns is varied and usually relatively sparse rather than dense, so that the patterns show rich visual variation. Processing the full pattern content directly would be computationally very expensive, and the pattern used in actual production needs to be a complete cycle unit. Therefore, the input sample pattern must be analyzed to find a complete cycle unit. Because of the diversity of patterns, element objects of the same theme usually show different degrees of similarity, and sometimes it is difficult for the human eye to quickly recognize a complete cycle unit. If the pattern is distorted or deformed, manually correcting and splicing the images is time-consuming and labor-intensive. The cyclic-unit detection methods proposed in the existing literature for diverse pattern content generally need manual supervision and do not consider complex pattern splicing forms such as jump joints. However, jump-joint splicing is very common in pattern design, for example in the printing and wall-cloth industries. The purpose of the cyclic unit detection proposed herein is to cut out unit images automatically and without supervision, to analyze cyclic unit images with jump-joint splicing, to reduce the time required for subsequent processing, and to automatically correct positional tilt or physical deformation present in the fabric pattern, so that the detected unit can be used directly in actual generation without further cutting or splicing.
Input a pattern image I_rgb of a fabric sample. Shi-Tomasi corner detection, an improvement on Harris corner detection, is adopted; it is more robust than traditional corner-detection methods. Local regions of the image I are processed to obtain corner position information, with the horizontal and vertical coordinates of the feature points denoted (i, j). The pattern structure of mass production can be regarded as a periodically repeating unit structure comprising two modes, butt joint and jump joint. Let e_i^1, e_i^2 be the corresponding sides, along the i direction, of adjacent quadrilaterals of corner feature points, and e_j^1, e_j^2 the corresponding sides along the j direction. The spatial consistency of the repeating-unit quadrilaterals can be described as:

E = (||e_i^1 − e_i^2||_2 + ||e_j^1 − e_j^2||_2) / (||e_i^1||_2 + ||e_j^1||_2)

wherein E represents a normalized error function and ||·||_2 denotes the L2 norm. If E is smaller than a given threshold, the quadrilaterals approximately satisfy the repeating rule; the smaller the value, the more similar the repeating units. Candidates for adjacent repeating units can be found using a cross-correlation function.
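As a minimal numpy sketch (the grid layout, function name, and side symbols e_i, e_j are assumptions for illustration, not taken from the patent), the normalized spatial-consistency error of the repeating-unit quadrilaterals can be computed over a detected corner grid:

```python
import numpy as np

def consistency_error(corners):
    """Normalized error E of repeating-unit quadrilaterals.

    corners: (rows, cols, 2) array of corner feature points (i, j).
    Compares corresponding side vectors of adjacent quadrilaterals
    along the i and j directions; E -> 0 for a perfectly periodic grid.
    """
    e_i = np.diff(corners, axis=1)   # side vectors along the i direction
    e_j = np.diff(corners, axis=0)   # side vectors along the j direction
    # differences between corresponding sides of adjacent quadrilaterals
    d_i = np.linalg.norm(np.diff(e_i, axis=1), axis=-1).sum()
    d_j = np.linalg.norm(np.diff(e_j, axis=0), axis=-1).sum()
    scale = np.linalg.norm(e_i, axis=-1).sum() + np.linalg.norm(e_j, axis=-1).sum()
    return (d_i + d_j) / scale

# a perfectly regular 4x4 grid of corner points with unit spacing
ii, jj = np.meshgrid(np.arange(4), np.arange(4), indexing="ij")
grid = np.stack([ii, jj], axis=-1).astype(float)
print(consistency_error(grid))  # 0.0 for a periodic grid
```

A distorted grid yields E > 0, so thresholding E separates regular repeating units from deformed candidates.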
The distribution information of the corner feature points of the repeated quadrilateral units in the sample is modeled with a Markov random field:

P(x | z) ∝ ∏_s Φ(x_s, z_s) · ∏_(s,t) Ψ(x_s, x_t)

wherein x represents the repeating units, z represents image feature distances, and Ψ and Φ respectively represent the spatial-constraint pairwise relationship function of the repeating units and the function of approximate repeating-unit values at a given position s. A belief propagation (BP) algorithm is used to detect the repeating units in the fabric sample picture and to evaluate their degree of tilt and distortion, and a mean-shift algorithm is used to solve the quadrilateral texture rule of the Markov random field model, obtaining the quadrilateral texture primitives in the fabric sample picture.
The tilt angle of the sample picture is then calculated. For each detected candidate quadrilateral, the included angles between its sides in the i and j directions and the horizontal and vertical axes are denoted H_α and V_β respectively; the rotation angle of the sample picture is judged from H_α or V_β.
Next, the splicing mode of the sample pattern is judged. In the textile industry, the splicing mode of the sample cyclic unit must be determined when a fabric sample pattern is designed and produced. The splicing mode P of the sample pattern is judged as:

P = 0 if S = 0;  P = 1 if S > 0

where P = 0 indicates that the sample is butt-jointed and P = 1 indicates that the sample is jump-jointed (e.g. 1/2, 1/3, 1/4 jump joints). S is the stitching step between adjacent cyclic units. When S = 0, the splicing mode of the sample cyclic unit is butt joint, and the upper, lower, left, and right adjacent units share all edges and vertices of the quadrilateral mesh. When S > 0, the splicing mode is jump joint, and adjacent units do not share all edges and mesh vertices. The simplest butt-joint case is expressed as: P = 0 ∧ θ = 90° ∧ H_α = 0 ∧ V_β = 0, where θ is the included angle between two sides of the quadrilateral.
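A minimal sketch of the splicing-mode judgment (the function and parameter names are my own; the patent only states the conditions P = 0 for butt joint when S = 0 and P = 1 for jump joint when S > 0):

```python
def splicing_mode(S):
    """Judge the splicing mode P of adjacent cyclic units.

    S is the stitching step between adjacent cyclic units:
    S == 0 -> butt joint (P = 0): adjacent units share all edges
              and quadrilateral-mesh vertices;
    S > 0  -> jump joint (P = 1), e.g. 1/2, 1/3, 1/4 jump joints.
    """
    return 0 if S == 0 else 1

def is_simplest_butt(S, theta, H_alpha, V_beta):
    """Simplest butt joint: P = 0, theta = 90 deg, H_alpha = V_beta = 0."""
    return splicing_mode(S) == 0 and theta == 90 and H_alpha == 0 and V_beta == 0

print(splicing_mode(0), splicing_mode(3))  # 0 1
print(is_simplest_butt(0, 90, 0, 0))       # True
```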
Conversion between jump joint and butt joint. When S = 0, not every butt-joint splicing can be converted into a jump-joint splicing; for example, the simplest butt joint cannot. When S > 0, every pattern cyclic unit can also be represented in butt-joint form, i.e. the jump-joint splicing can be converted into a butt-joint splicing. From the detection result of the parallelogram grid model proposed herein, the characteristics of the various jump-joint and butt-joint modes and the conversion relations between them can be analyzed conveniently. For example, the 1/2 jump joint and its butt-joint form are expressed as follows:
wherein P = 1/2 denotes the 1/2 jump-joint splicing mode. The side lengths of the cyclic unit for butt joint and 1/2 jump joint are calculated from the detected quadrilateral grid: the formula gives the side lengths of the two sides of the butt-joint quadrilateral and the lengths of the adjacent sides of the 1/2 jump joint. Similarly, from the detection result of the parallelogram grid model, the characteristics of the 1/3, 1/4, 1/5, and other jump-joint modes and of the butt-joint modes can be analyzed, yielding the various conversion relations between jump joint and butt joint.
Distortion correction of the pattern cyclic unit. After multiple iterations, detected quadrilateral edges that are twisted or deformed can be corrected to a regular quadrilateral with a spline warping algorithm; the image inside the quadrilateral region is then obtained via the side-length calculation of the cyclic unit (computing the image of the jump-joint cyclic unit where needed) and is denoted I_Unit-rgb.
Step 1-2, contrast enhancement of the gray-level image: extracting similarity features first requires converting the color pattern of the cyclic unit picture from RGB to a gray-level image. When a color image is converted to grayscale, color information is lost; some elements in the pattern become hard to distinguish from the pattern image, which hampers identification and segmentation of the pattern elements, increases the error of the image-processing result, and reduces accuracy. The existing similarity-feature-extraction and image-registration literature proposes methods such as color invariants, color descriptors, color affine transformation, texture-feature fusion, and adaptive processing, but these have problems with real-time performance and adaptability to diverse samples. The method here adopts a recent color-contrast-enhancement algorithm to reduce the RGB three-dimensional space of a fabric picture to one dimension: the image I_Unit-rgb obtained in step 1-1 is reduced in dimension to obtain a picture denoted I_Unit-d1. The image color is reduced from three dimensions to one as a weighted projection:

l = w_r · r + w_g · g + w_b · b

where r, g, b represent the color values of the input image I in the three channels R, G, B. P denotes the paired local and non-local candidate set, and g_x − g_y represents the difference between the gray values of image elements g_x and g_y. The contrast-enhancement model is defined over the candidate set P.
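The exact WpmDecolor energy is not reproduced in the source, so the following is only an illustrative stand-in under stated assumptions: a linear RGB-to-gray projection whose weights are chosen from a discrete candidate grid to best preserve the color contrast of randomly sampled pixel pairs (the sampling scheme and weight grid are my simplifications, not the patent's algorithm):

```python
import numpy as np

def contrast_preserving_gray(img, n_pairs=200, seed=0):
    """Reduce an RGB image to one dimension by a linear projection
    l = w_r*r + w_g*g + w_b*b, picking the candidate weight triple
    that best preserves color differences over random pixel pairs.
    Simplified stand-in for the decolorization of step 1-2."""
    rng = np.random.default_rng(seed)
    h, w, _ = img.shape
    flat = img.reshape(-1, 3).astype(float)
    a = rng.integers(0, h * w, n_pairs)
    b = rng.integers(0, h * w, n_pairs)
    # target contrast: Euclidean color difference of each sampled pair
    target = np.linalg.norm(flat[a] - flat[b], axis=1)
    # discrete candidate weights summing to 1 (common in decolor solvers)
    grid = np.arange(0, 11) / 10.0
    best, best_w = None, None
    for wr in grid:
        for wg in grid:
            wb = 1.0 - wr - wg
            if wb < 0:
                continue
            gdiff = np.abs((flat[a] - flat[b]) @ np.array([wr, wg, wb]))
            err = np.abs(gdiff - target).sum()
            if best is None or err < best:
                best, best_w = err, (wr, wg, wb)
    return (flat @ np.array(best_w)).reshape(h, w), best_w

# demo: a 4x4 image whose left half is pure red
demo = np.zeros((4, 4, 3))
demo[:, :2, 0] = 255
gray, weights = contrast_preserving_gray(demo)
```

Because the weights are non-negative and sum to 1, the output stays in the input value range, as a grayscale image must.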
the details of the algorithm and its solving method are found in the documents 2: L iu Q, L i S, Xiong J, Qin B, WpmDecolor: weighted projection resolution, the Visual computer 2019, 35: 205-21.
Step 1-3, similarity element extraction: similar patterns within the fabric sample pattern are matched with a feature-extraction method based on the SURF operator. SURF is an accelerated robust-feature algorithm based on SIFT (scale-invariant feature transform); it runs 3–5 times faster than SIFT without loss of accuracy. The SURF algorithm converts the color image into a gray-level image and then performs template feature matching and registration. The existing literature mainly relies on color-information approximation or conversion, color adaptation, texture-feature fusion, and similar processing, but suffers from poor real-time performance and difficulty adapting to diverse samples. With the contrast-enhanced gray-level picture of step 1-2, real-time calculation is achieved, the color-contrast information of the color picture elements is enhanced, and the result is well suited to SURF feature matching. Because of the diversity and complexity of textile patterns, decomposing a pattern involves semantic segmentation, image content understanding, and so on. Considering different practical requirements, the user may roughly frame-select an element object region of interest T_E from the image I_Unit-d1 obtained in step 1-2; this region serves as the template image for SURF feature matching. For the template image T_E and the unit image I_Unit-d1 to be recognized, an integral image is established, in which the value ii(i, j) at any point (i, j) is the sum of gray values over the rectangular region between the upper-left corner of the original image and the point (i, j):

ii(i, j) = Σ_{r ≤ i} Σ_{c ≤ j} p(r, c)

wherein p(r, c) represents the value of point (r, c) in the image I_Unit-d1. The SURF algorithm extracts feature points through the Hessian matrix:

H(x, σ) = [ L_xx(x, y, σ), L_xy(x, y, σ); L_xy(x, y, σ), L_yy(x, y, σ) ]

wherein L_xx(x, y, σ), L_xy(x, y, σ), L_yy(x, y, σ) are respectively the convolutions of the second-order Gaussian partial derivatives ∂²g/∂x², ∂²g/∂x∂y, ∂²g/∂y² with the image I_Unit-d1(x, y) at point (x, y), and the two-dimensional Gaussian function is g(x, y, σ) = (1 / 2πσ²) · exp(−(x² + y²) / 2σ²). For computational convenience, an approximation of the Hessian determinant is used: det(H) = D_xx · D_yy − (ω · D_xy)², where ω is a compensation parameter, typically 0.9, and D_xx, D_xy, D_yy are the convolutions of box filters of different sizes with the image I_Unit-d1(x, y). Scale-invariant feature-point extraction and feature matching are described in document 3: Bay H, Ess A, Tuytelaars T, Van Gool L. Speeded-up robust features (SURF). Computer Vision and Image Understanding 2008; 110(3): 346–59. Because T_E may be repeated many times in I_Unit-d1, feature-point selection for matching becomes ambiguous, so a feature-point matching scheme based on an iterative loop is proposed: for the i-th T_E element object to be detected, feature matching is performed on the updated residual feature image I_Unit-d1^i, which is specifically described as follows:
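The integral image and the Hessian-determinant approximation above can be sketched in numpy (the 3x3 discrete derivative masks here are simplified stand-ins for SURF's larger box filters; the function names are my own):

```python
import numpy as np

def integral_image(p):
    """ii(i, j) = sum of p(r, c) over all r <= i, c <= j."""
    return p.astype(float).cumsum(axis=0).cumsum(axis=1)

def hessian_det(img, omega=0.9):
    """Approximate det(H) = Dxx*Dyy - (omega*Dxy)^2 with small discrete
    second-derivative filters (a stand-in for SURF's box filters)."""
    f = img.astype(float)
    Dxx = np.zeros_like(f)
    Dyy = np.zeros_like(f)
    Dxy = np.zeros_like(f)
    Dxx[:, 1:-1] = f[:, 2:] - 2 * f[:, 1:-1] + f[:, :-2]
    Dyy[1:-1, :] = f[2:, :] - 2 * f[1:-1, :] + f[:-2, :]
    Dxy[1:-1, 1:-1] = (f[2:, 2:] - f[2:, :-2] - f[:-2, 2:] + f[:-2, :-2]) / 4.0
    return Dxx * Dyy - (omega * Dxy) ** 2

p = np.arange(12).reshape(3, 4)
ii = integral_image(p)
print(ii[-1, -1])  # 66.0, the sum of all pixel values
```

With the integral image, the sum over any rectangle costs four lookups, which is what makes the box-filter convolutions of SURF fast at every scale.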
the characteristic point matching mode of the above iterative loop can well solve the problem of interference of the component characteristic points in the complex image on the matching judgment of the integral element characteristic points.
The step 2 comprises the following steps:
Step 2-1, element object region segmentation: considering cost and time as well as actual market-preference factors such as currently popular elements, themes, and styles, similarity elements are generally searched from an existing element library as inspiration material for re-creation. An enterprise's element object library usually contains element objects with various background colors and attached band regions or stray points. To reduce the interference of this redundant information with element-object similarity matching and improve the accuracy and success rate of similarity comparison, a multi-region fuzzy competitive segmentation method is applied to segment the element objects T_Ei detected in step 1-3 by region. The goal of element object region segmentation is to build a clean element-object image with a transparent background, reducing the influence of irrelevant regions and differently colored backgrounds on the comparison of similarity elements. This processing is essential in the actual comparison process. For example, the edges of textile pattern content objects are generally difficult to extract directly and need manual supervision and correction; the color levels are generally rich and close (adjacent colors are often used for coordinated color matching); and the difference in image similarity between a black background and a white background very visibly distorts the comparison result. Redundant irrelevant background regions or other irrelevant parts also noticeably affect the pattern composition and distribution, and thus the comparison result. The element object T_Ei is segmented by region as follows:

min_u Σ_{i=1}^{N} ∫_Ω ( |∇u_i| + λ e_i(f) v_i + (1/2θ)(u_i − v_i)² ) dx

where N is the number of region classes, Ω is the image domain of the input element object T_Ei, u is the fuzzy membership of the region segmentation, ∇ is the gradient operator, and λ is the control parameter of the region data term. e_i(f) is an error function characterizing the membership-level fluctuation in a region; its expression is ||f − c_i||², wherein c_i is a constant. v is an auxiliary variable and θ is a control parameter small enough that u_i and v_i are approximately equal in value. Through variable separation, the above equation can be solved with the Chambolle fast projection algorithm and an alternating minimization method, that is:
where υ is a Lagrange multiplier with a positive value. p* is solved by letting the projection iterate:

p^{k+1} = ( p^k + τ ∇(div p^k − u/θ) ) / ( 1 + τ |∇(div p^k − u/θ)| )

where τ is a parameter with a small value that ensures convergence of the iteration.
Step 2-2, removal of element background and irrelevant parts: obtaining a high-quality element segmentation result requires two conditions: (1) the segmentation time is real-time or quasi-real-time; (2) the segmented elements are complete, and irrelevant features or mottled areas are minimized or even eliminated completely. The segmentation model of step 2-1 is a convergent convex functional with a fast solver, so the segmentation time is short and the time requirement is met. Because the element object T_Ei is a partial region detected from the original sample image, the number of colors and regions is generally confined to a relatively small range. This guarantees the applicability of the segmentation method of step 2-1, which generally applies for a number of segments N ∈ [2, 6] when the area ratios and color distinctions of the regions are reasonably evident; this is the essential principle of the fuzzy region competition model for color images.

According to that principle, if the element image T_Ei involves more color regions, the number of segments N chosen to remove the background is typically between 4 and 6; for a simple element image T_Ei whose background color and elements are clearly distinguished, N ∈ [2, 3]. The user only needs to select a relatively complete part of the element region as the output result according to the segmentation result. For the region of interest in the result u of step 2-1, u is binarized into a binary image; the number of connected regions and the circumscribed rectangular outline of each connected region are marked with the 8-connectivity method, and the centroid coordinates (x, y) of the region enclosed by each outline are calculated as x = Σ j·x_{i,j} / Σ x_{i,j}, y = Σ i·x_{i,j} / Σ x_{i,j}, where x_{i,j} ∈ {0, 1} is the value of the pixel at coordinate (i, j). Background and irrelevant parts are then cleared from each segmentation result: (1) the segmentation result u is binarized and displayed for the element object T_Ei, generating a MAP matrix of the same size as T_Ei with values in {0, 1}, where 0 represents the background (shown as a mosaic pattern for ease of viewing) and 1 represents the displayed content; (2) whether irrelevant parts must be removed is decided from the circumscribed rectangular outline of the region of interest. Because the segmentation of step 2-1 forces small regions into N ∈ [2, 6] classes, the number of irrelevant components is usually already small, and the user can easily choose to remove irrelevant parts outside the circumscribed rectangular outline of the region of interest as required.
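The centroid and bounding-outline computations of step 2-2 can be sketched on a binary MAP matrix as follows (8-connectivity labelling is omitted for brevity; a single connected region is assumed, and the function names are my own):

```python
import numpy as np

def region_centroid(mask):
    """Centroid (x, y) of a binary region, per
    x = sum(j * x_ij) / sum(x_ij),  y = sum(i * x_ij) / sum(x_ij),
    where x_ij in {0, 1} is the pixel value at (i, j)."""
    ii, jj = np.nonzero(mask)
    total = mask.sum()
    return jj.sum() / total, ii.sum() / total

def bounding_box(mask):
    """Circumscribed rectangular outline (i_min, j_min, i_max, j_max)."""
    ii, jj = np.nonzero(mask)
    return ii.min(), jj.min(), ii.max(), jj.max()

# a filled rectangle at rows 2..4, cols 1..5 of an 8x8 MAP matrix
mask = np.zeros((8, 8), dtype=int)
mask[2:5, 1:6] = 1
print(region_centroid(mask))  # (3.0, 3.0): column 3, row 3
```

Pixels outside the returned bounding box are the "irrelevant parts" a user would clear in step (2).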
(3) The image within the circumscribed rectangular outline is displayed, and a MAP_TE matrix of the same size as the circumscribed rectangular outline is generated, in which the value 0 denotes the background (displayed as a mosaic pattern) and 1 denotes the element after the background and irrelevant parts have been removed.
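The centroid formula and circumscribed-rectangle computation of step 2-2 can be sketched in NumPy as follows (a minimal sketch; the function names are illustrative, not from the patent, and a production system would use a connected-components routine to handle multiple regions):

```python
import numpy as np

def centroid_of_mask(mask):
    """Centroid (x, y) of a binary mask, following the patent's formula:
    x = sum(j * x_ij) / sum(x_ij), y = sum(i * x_ij) / sum(x_ij),
    where x_ij in {0, 1} and (i, j) are row/column indices."""
    m = np.asarray(mask, dtype=float)
    total = m.sum()
    if total == 0:
        raise ValueError("empty mask")
    ii, jj = np.indices(m.shape)
    x = (jj * m).sum() / total  # column index -> x coordinate
    y = (ii * m).sum() / total  # row index -> y coordinate
    return x, y

def bounding_box(mask):
    """Circumscribed rectangle (row0, col0, row1, col1) of the nonzero region."""
    m = np.asarray(mask)
    rows = np.any(m, axis=1)
    cols = np.any(m, axis=0)
    r0, r1 = np.where(rows)[0][[0, -1]]
    c0, c1 = np.where(cols)[0][[0, -1]]
    return r0, c0, r1, c1
```

With a 3×2 block of ones inside a 5×5 mask, `centroid_of_mask` returns the block's center of mass and `bounding_box` its enclosing rectangle.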
The step 3 comprises the following steps:
Step 3-1, element object similarity comparison, which is divided into 3 sub-steps.
Step 3-1-1, extracting element neural-network features: the neural-network features of the element pictures in the element database are extracted with a convolutional neural network (CNN). The elements in the element library are elements of existing patterns; depending on the element styles, content, and semantic classifications actually used in an enterprise, the number of existing elements is usually between tens of thousands and hundreds of thousands. Following reference 5 (Dong W, Zhou N, Paul J-C, Zhang X. Optimized image resizing using seam carving and scaling. ACM Transactions on Graphics 2009; 28(5):1-10), the image seam carving algorithm is applied to the elements to reduce the influence of redundant blank areas on subsequent feature extraction and element picture synthesis. Convolution is performed on the resulting seam-carved image, the color image, and the grayscale image to obtain feature maps; the picture feature extraction is implemented with the convolutional and pooling layers of a CNN, and the convolution-kernel parameters are finally determined by the back-propagation algorithm to obtain the final element neural-network features.
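The convolution-and-pooling feature extraction described above can be illustrated with a minimal NumPy sketch. In the actual method the kernels are learned by back-propagation; here fixed kernels stand in for them, and all function names are hypothetical:

```python
import numpy as np

def conv2d(img, kernel):
    """Valid-mode 2-D cross-correlation (the 'convolution' used in CNNs)."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(fmap, size=2):
    """Non-overlapping max pooling over size x size windows."""
    h, w = fmap.shape
    h, w = h - h % size, w - w % size
    return fmap[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

def feature_vector(img, kernels):
    """Flatten ReLU-activated, pooled feature maps from several kernels
    into a single descriptor vector."""
    maps = [max_pool(np.maximum(conv2d(img, k), 0)) for k in kernels]
    return np.concatenate([m.ravel() for m in maps])
```

A 6×6 input convolved with a 3×3 kernel yields a 4×4 feature map, which 2×2 pooling reduces to 2×2; one kernel therefore contributes a length-4 descriptor.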
Step 3-1-2, generating a temporary feature library: from the element layer set {T_Ei}_{i=1:M}, where M is the number of element objects segmented in step 2, the required element layers are selected, seam carving is applied to the elements with the image seam carving algorithm, and neural-network features are extracted from the resulting element pictures following the method of step 3-1-1.
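The core step of the seam carving used here, removing one minimum-energy vertical seam, can be sketched as follows (a minimal sketch with gradient magnitude as the energy function; the function name is illustrative and the cited reference also adds scaling, which is omitted):

```python
import numpy as np

def remove_vertical_seam(gray):
    """Remove one minimum-energy vertical seam from a grayscale image."""
    h, w = gray.shape
    # Energy: L1 gradient magnitude.
    energy = np.abs(np.gradient(gray, axis=0)) + np.abs(np.gradient(gray, axis=1))
    # Dynamic programming: cumulative cost of the cheapest seam reaching each pixel.
    cost = energy.copy()
    for i in range(1, h):
        left = np.r_[np.inf, cost[i - 1, :-1]]
        right = np.r_[cost[i - 1, 1:], np.inf]
        cost[i] += np.minimum(np.minimum(left, cost[i - 1]), right)
    # Backtrack the seam from the cheapest pixel in the bottom row.
    seam = np.zeros(h, dtype=int)
    seam[-1] = int(np.argmin(cost[-1]))
    for i in range(h - 2, -1, -1):
        j = seam[i + 1]
        lo, hi = max(j - 1, 0), min(j + 2, w)
        seam[i] = lo + int(np.argmin(cost[i, lo:hi]))
    # Delete the seam pixel from every row; width shrinks by one.
    return np.array([np.delete(gray[i], seam[i]) for i in range(h)])
```

Repeatedly calling this (and its transposed counterpart for horizontal seams) shrinks the blank margins around an element while preserving its content.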
Step 3-1-3, element similarity comparison: a certain proportion of features is extracted from the temporary feature library and combined according to actual requirements, the distances between them and the element neural-network features in the element library are computed, and the results are sorted by distance value in ascending order. The weights of the neural-network features of the seam-carved image, the color image, and the grayscale image are w1, w2, w3, each taking a value in [0, 1]. When the distance is computed, the corresponding image features in the element library are weighted according to the temporary feature-combination weights. The distance function is the Euclidean distance. The top-ranked elements are similar to the full-content elements of step 2. Filtering the sorted results by the semantic classification labels in the element library, such as flower, bird, or piglet, can improve the success rate of the element similarity comparison.
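The weighted Euclidean distance over the three feature groups (seam-carved, color, grayscale) and the ascending-distance ranking with semantic-label filtering might be sketched as follows (a minimal sketch; function names, the library record layout, and the label filter are assumptions for illustration):

```python
import numpy as np

def weighted_distance(query, candidate, weights):
    """Weighted Euclidean distance across feature groups: each group's
    squared distance is scaled by its weight w in [0, 1] before summing."""
    d = 0.0
    for q, c, w in zip(query, candidate, weights):
        d += w * np.linalg.norm(np.asarray(q) - np.asarray(c)) ** 2
    return np.sqrt(d)

def rank_similar(query, library, weights, wanted=None):
    """Sort library entries (name, feature_groups, label) by ascending
    distance; optionally keep only a semantic class, e.g. 'flower'."""
    scored = []
    for name, feats, label in library:
        if wanted is not None and label != wanted:
            continue  # semantic-label screening of the sorted result
        scored.append((weighted_distance(query, feats, weights), name))
    return [name for _, name in sorted(scored)]
```

With all three weights set to 1 the measure reduces to the plain Euclidean distance over the concatenated features; lowering a weight de-emphasizes that feature group.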
Step 3-2, element replacement, which has four sub-steps.
Step 3-2-1, layer association: the elements of the element atlas {T_Ei}_{i=1:M} are selected in turn and compared for similarity as in step 3-1. For each comparison result that meets the requirements, a blank image of the same size as the input sample image is created, and the position and angle of the compared element within the blank image are determined from the position and angle of the original element in the original image: the rotation angle of the matched template of the same element computed in step 1-3, and the x, y coordinates of the centroid computed in step 2-2. The compared similar elements corresponding to each element to be replaced form a similarity layer association set {I_SE1, I_SE2, I_SE3, ..., I_SEn}, where n is the number of layers in the similarity layer association set.
Step 3-2-2, layer combination: for each element of the original image to be replaced, the similarity layer association set {I_SE1, I_SE2, I_SE3, ..., I_SEn} to be used is configured as required, and the k elements ranked highest by the similarity of step 3-1 are taken, with k ≤ n; the smaller k is, the higher the degree of repetition. The layers in the similarity layer association sets corresponding to different elements of the original image are then combined.
Step 3-2-3, merging layers: the layer-combination results of step 3-2-2 are filled into the blank layer successively in order of layer number; the content displayed at any given pixel is determined by letting the content of a later layer cover that of an earlier one.
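The later-covers-earlier merging of step 3-2-3 can be sketched as follows (a minimal grayscale sketch; the function name and the (top, left) placement convention are assumptions, and a real implementation would also apply the rotation angle of step 3-2-1):

```python
import numpy as np

def merge_layers(canvas_shape, layers):
    """Paint (image, mask, (top, left)) layers onto a blank canvas in order;
    wherever a later layer's mask is 1, it overwrites earlier content."""
    canvas = np.zeros(canvas_shape, dtype=float)
    for image, mask, (top, left) in layers:
        h, w = image.shape
        region = canvas[top:top + h, left:left + w]  # view into the canvas
        region[mask == 1] = image[mask == 1]         # later layer covers earlier
    return canvas
```

Masked assignment means a layer's background pixels (mask 0) leave whatever is already on the canvas visible, so only real element content covers the layers below.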
Step 3-2-4, shape and color adjustment: the user may apply appropriate editing operations to the replacement elements according to actual requirements, such as scaling, cropping, and moving, and may change the color of local objects, to obtain the similarity pattern.
Claims (1)
1. An element-data-driven method for rapidly generating flower-type patterns based on similarity samples, comprising the following steps:
Step 1, element layer generation: a sample pattern is input, cyclic units are detected, grayscale enhancement is applied to the unit picture, and similarity elements are extracted.
Step 2, element object segmentation: the element objects extracted from the sample pattern are segmented by region, and the background and irrelevant parts are removed.
Step 3, similar element replacement: the segmented elements are compared for similarity against an element library, elements of the same type are replaced, and the replaced element layers are combined to obtain a similar flower-type pattern.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010255136.2A CN111507946A (en) | 2020-04-02 | 2020-04-02 | Element data driven flower type pattern rapid generation method based on similarity sample |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111507946A true CN111507946A (en) | 2020-08-07 |
Family
ID=71869060
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010255136.2A Pending CN111507946A (en) | 2020-04-02 | 2020-04-02 | Element data driven flower type pattern rapid generation method based on similarity sample |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111507946A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112712576A (en) * | 2021-01-19 | 2021-04-27 | 东华大学 | Intelligent pattern design generation method |
CN114139904A (en) * | 2021-11-23 | 2022-03-04 | 南京林业大学 | Waste reduction packaging design control system for big data monitoring service |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20080063070A (en) * | 2006-12-28 | 2008-07-03 | 가부시키가이샤 시마세이키 세이사쿠쇼 | Design apparatus of patterns for woven fabrics, design methods, and design program |
CN106709171A (en) * | 2016-12-13 | 2017-05-24 | 南京大学 | Repeat mode discovery-based printed pattern generation method |
CN106778881A (en) * | 2016-12-23 | 2017-05-31 | 中国科学院深圳先进技术研究院 | Digital printing method and device |
CN106934846A (en) * | 2015-12-29 | 2017-07-07 | 深圳先进技术研究院 | A kind of cloth image processing method and system |
CN107025457A (en) * | 2017-03-29 | 2017-08-08 | 腾讯科技(深圳)有限公司 | A kind of image processing method and device |
CN107369068A (en) * | 2017-07-02 | 2017-11-21 | 绍兴原色数码科技有限公司 | Based on internet printing pattern designing, color separation and digit printing fabric custom-built system |
CN109242858A (en) * | 2018-07-18 | 2019-01-18 | 浙江理工大学 | Pattern primitive dividing method is recycled based on the matched textile printing of adaptive template |
Non-Patent Citations (6)
Title |
---|
BAY H et al.: "Speeded-Up Robust Features (SURF)", Computer Vision and Image Understanding * |
FANG LI et al.: "A Multiphase Image Segmentation Method Based on Fuzzy Region Competition" * |
MINWOO PARK et al.: "Deformed Lattice Detection in Real-World Images Using Mean-Shift Belief Propagation", IEEE Transactions on Pattern Analysis and Machine Intelligence * |
QIEGEN LIU et al.: "WpmDecolor: weighted projection maximum solver for contrast-preserving decolorization", The Visual Computer * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20210342585A1 (en) | Systems and methods for extracting and vectorizing features of satellite imagery | |
CN105279787B (en) | The method that three-dimensional house type is generated based on the floor plan identification taken pictures | |
CN114529925B (en) | Method for identifying table structure of whole line table | |
CN110428428A (en) | A kind of image, semantic dividing method, electronic equipment and readable storage medium storing program for executing | |
Liu et al. | Recognition methods for coal and coal gangue based on deep learning | |
CN110021028B (en) | Automatic clothing making method based on clothing style drawing | |
CN111507946A (en) | Element data driven flower type pattern rapid generation method based on similarity sample | |
CN113298809B (en) | Composite material ultrasonic image defect detection method based on deep learning and superpixel segmentation | |
CN113435240A (en) | End-to-end table detection and structure identification method and system | |
CN114820471A (en) | Visual inspection method for surface defects of intelligent manufacturing microscopic structure | |
Yao et al. | Manga vectorization and manipulation with procedural simple screentone | |
CN111161213A (en) | Industrial product defect image classification method based on knowledge graph | |
CN115830359A (en) | Workpiece identification and counting method based on target detection and template matching in complex scene | |
Pierre et al. | Recent approaches for image colorization | |
Fang et al. | Hand-drawn grayscale image colorful colorization based on natural image | |
CN109785283A (en) | A kind of textural characteristics matching process and device for fabric segmentation | |
Vacavant et al. | A combined multi-scale/irregular algorithm for the vectorization of noisy digital contours | |
CN112508007B (en) | Space target 6D attitude estimation method based on image segmentation Mask and neural rendering | |
CN114723601A (en) | Model structured modeling and rapid rendering method under virtual scene | |
CN109918783B (en) | Intelligent clothing design system | |
CN113033656B (en) | Interactive hole detection data expansion method based on generation countermeasure network | |
CN116704128B (en) | Method and system for generating 3D model by single drawing based on deep learning | |
CN113610066B (en) | Red date data identification method based on artificial intelligence | |
Ye et al. | Deep learning-based human head detection and extraction for robotic portrait drawing | |
CN117152476A (en) | Automatic extraction method for multi-level transformation information of design image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20200807 |