WO2008152931A1 - Knitwear simulation method, apparatus therefor, and storage medium - Google Patents
Knitwear simulation method, apparatus therefor, and storage medium
- Publication number
- WO2008152931A1 (PCT/JP2008/060100; JP2008060100W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- knitwear
- human body
- body model
- viewpoint
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A41—WEARING APPAREL
- A41H—APPLIANCES OR METHODS FOR MAKING CLOTHES, e.g. FOR DRESS-MAKING OR FOR TAILORING, NOT OTHERWISE PROVIDED FOR
- A41H3/00—Patterns for cutting-out; Methods of drafting or marking-out such patterns, e.g. on the cloth
- A41H3/007—Methods of drafting or marking-out patterns using computers
-
- D—TEXTILES; PAPER
- D04—BRAIDING; LACE-MAKING; KNITTING; TRIMMINGS; NON-WOVEN FABRICS
- D04B—KNITTING
- D04B37/00—Auxiliary apparatus or devices for use with knitting machines
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/20—Design optimisation, verification or simulation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/50—Lighting effects
- G06T15/60—Shadow generation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2113/00—Details relating to the application field
- G06F2113/12—Cloth
Definitions
- The present invention relates to the simulation of a state in which knitwear is worn on a human body model or the like, and in particular to a simulation that expresses details of the yarn, such as fluff, and the shadow that the knitwear casts on the human body model. Background art
- The inventor has previously developed a knitwear simulation method.
- The simulation is performed in 3D and reproduces the state in which the knitwear is worn over the human body model; a realistic simulation expresses details of the yarn, such as fluff, and the shadow that the knitwear casts on the human body model and the like.
- Yarn can be represented by a pipe.
- Fluff can be represented by fine pipes protruding from the yarn body.
- This, however, increases the time required for the simulation.
- The knitwear casts a shadow on the human body model, but it is difficult to obtain this shadow by ray tracing against the individual yarns of the knitwear.
- Patent Documents 1 and 2 each disclose a technique for expressing the fluff of yarn in a 2D simulation of knitwear.
- Patent Document 3 discloses virtually fitting knitwear on a human body model and simulating the knitwear in 3D. Disclosure of the invention
- An object of the present invention is to realistically simulate knitwear with fluffy stitches, to simulate the shadow that the knitwear casts on the human body model, and to perform these simulations at high speed. Means for solving the problem
- The present invention relates to a method for simulating a state in which a human body model wears knitwear, using knitwear design data, yarn data, and a three-dimensional image of the human body model.
- The method further includes a step of preparing an image of the fabric that the human body model wears underneath the knitwear.
- The color image of the human body model and the color image of the fabric are darkened in the shadow part of the knitwear, with a position resolution coarser than individual stitches,
- and the color image of the human body model, the color image of the fabric, and the two-dimensional color image of the knitwear are synthesized using the opacity image of the knitwear.
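The synthesis step described above is, in effect, back-to-front alpha blending. A minimal single-pixel sketch in Python (the layer values, the opaque-fabric assumption, and the function name are illustrative, not taken from the source):

```python
def composite_pixel(body, fabric, knit, fabric_mask, knit_alpha):
    """Blend one pixel back to front: human body model first,
    then fabric where its mask is set, then knitwear weighted
    by its opacity image alpha (illustrative model)."""
    color = body                       # backmost layer
    if fabric_mask:                    # fabric assumed opaque here
        color = fabric
    # semi-transparent knitwear lets the layers behind show through
    return knit_alpha * knit + (1.0 - knit_alpha) * color

# knit alpha 0.4 over opaque fabric 0.6 -> 0.4*1.0 + 0.6*0.6 = 0.76
print(composite_pixel(body=0.2, fabric=0.6, knit=1.0,
                      fabric_mask=1, knit_alpha=0.4))
```

Applied per pixel over the whole image, this yields the human body model showing through the semi-transparent knit.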
- The fabric image is a 3D image for fabrics that are easy to simulate; for a knitted fabric, it is a 2D color image together with a 2D opacity image, as for the knitwear.
- Preferably, a viewpoint position and a light source position are set, and a two-dimensional color image of the human body model with an image showing its position in the depth direction from the viewpoint, and a two-dimensional color image of the fabric with an image showing its position in the depth direction from the viewpoint, are prepared.
- The range where the knitwear image overlaps the human body model image is determined as the shadow part of the knitwear,
- and the image synthesis is performed by obtaining the front-rear relationship of the human body model, the fabric, and the knitwear with respect to the viewpoint, based on the three images indicating the positions in the depth direction.
- Also preferably, a two-dimensional color image of the human body model with an image showing its position in the depth direction from the viewpoint, and a two-dimensional color image of the fabric with an image showing its position in the depth direction from the viewpoint, are prepared.
- The part where the knitwear blocks the light from the light source to the human body model is determined as the shadow part,
- and the image synthesis is performed by obtaining the front-rear relationship of the human body model, the fabric, and the knitwear with respect to the viewpoint, based on the three images indicating the positions in the depth direction.
- Also preferably, the step of obtaining the shadow part of the knitwear on the human body model and the fabric is performed based on the positions of the stitches in the yarn data.
- A two-dimensional color image of the human body model with an image showing its position in the depth direction from the viewpoint, and a two-dimensional color image of the fabric with an image showing its position in the depth direction from the viewpoint, are prepared,
- and the image synthesis is performed by obtaining the front-rear relationship of the human body model, the fabric, and the knitwear with respect to the viewpoint, based on the three images indicating the positions in the depth direction.
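The front-rear relationship derived from the three depth images can be sketched as a per-pixel far-to-near sort followed by blending. A minimal illustration (the layer tuples and the larger-Z-is-farther convention are assumptions, not stated in the source):

```python
def composite_by_depth(layers):
    """layers: list of (color, alpha, z) for one pixel.
    Sort far to near by z (larger z assumed farther from the
    viewpoint) and blend each nearer layer over the result."""
    color = 0.0                        # background
    for c, a, z in sorted(layers, key=lambda t: t[2], reverse=True):
        color = a * c + (1.0 - a) * color
    return color

# human body model (farthest, opaque), fabric (opaque),
# knitwear (nearest, semi-transparent)
pixel = composite_by_depth([(1.0, 0.5, 1.0),
                            (0.6, 1.0, 2.0),
                            (0.2, 1.0, 3.0)])
```

Because the ordering is recomputed from the Z images, the same routine works whether the fabric lies in front of or behind a given knitwear pixel.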
- The present invention also provides an apparatus for simulating a state in which a human body model wears knitwear, based on knitwear design data, yarn data, and a three-dimensional image of the human body model.
- The present invention further provides a computer-readable storage medium storing a program for simulating a state in which the human body model wears the knitwear, based on the knitwear design data, the yarn data, and the three-dimensional image of the human body model.
- The program causes the computer to perform the steps described above, including a step of displaying the composite image.
- the description related to the simulation method applies to the simulation apparatus and the simulation program as it is, and the description related to the simulation apparatus also applies to the simulation method and the simulation program.
- Fabrics include clothing, scarves, and other woven fabrics and knits.
- The human body model wears the fabric, and the state in which the knitwear is put on over the fabric is simulated. The invention's effect
- The stitches of the knitwear are expressed not by a 3D model but by a 2D model composed of a yarn body and fluff.
- Fluff is represented by a simple two-dimensional image, not by a tube and its polygons in a three-dimensional model.
- The shadows that the knitwear casts on the human body model are expressed not by ray tracing the shadows of individual yarns but by an average shadow of the knitwear.
- The human body model image and the knitwear image are combined using the opacity of the knitwear as a parameter.
- The simulation can be performed similarly when an image of another fabric is arranged between the knitwear and the human body model.
- The shadow of the knitwear is cast on the color image of the human body model and the color image of the fabric,
- and the color image of the human body model, the color image of the fabric, and the two-dimensional color image of the knitwear can be combined using the opacity image of the knitwear.
- Since the images of the knitwear, the human body model, and the fabric are each processed as two-dimensional color images, image composition is easy.
- Since an image showing the position in the depth direction is provided for each of the knitwear, the human body model, and the fabric, their front-rear relations can be processed easily.
- The easiest way to determine the range where the shadow is cast is to determine the overlap between a 2D image of the knitwear, such as its color image, and a 2D image of the human body model, such as its color image.
- This is a model in which the part covered by the knitwear is shaded.
- Shadows may also be obtained from the three images showing the positions of the knitwear, the human body model, and the fabric in the depth direction.
- For the knitwear, especially its contour, the area where the light from the light source is blocked is found. This is a model that uses ray tracing to determine the shadow created by semi-transparent knitwear, with a position resolution coarser than individual yarns.
- The shadow of the knitwear may also be obtained at the stage of the 3D data of the yarn data, the human body model, and the fabric. At this stage, shadows of individual yarns are not required; only the range of the shadow of the knitwear needs to be determined.
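The simplest shadow model above — darkening wherever the knitwear's 2D silhouette overlaps the human body model — can be sketched as follows (the mask arrays and the `strength` factor are illustrative assumptions):

```python
def knit_shadow(knit_alpha, body_mask, strength=0.5):
    """Per-pixel shadow amount: the knitwear silhouette (its
    opacity image) intersected with the body mask, scaled by a
    chosen shadow strength.  The resolution is the image grid,
    i.e. coarser than individual yarns."""
    return [[strength * ka * bm for ka, bm in zip(arow, brow)]
            for arow, brow in zip(knit_alpha, body_mask)]

knit = [[0.0, 0.8],
        [1.0, 0.0]]
body = [[1, 1],
        [0, 1]]
shadow = knit_shadow(knit, body)   # darkens only where both overlap
```

The resulting shadow values would then be subtracted from (or multiplied into) the body and fabric color layers before composition.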
- FIG. 1 is a block diagram of the simulation apparatus of the embodiment.
- FIG. 2 is a block diagram of the simulation method of the embodiment.
- FIG. 3 is a diagram showing the process from creation of a color image of the knitwear to shadowing in the embodiment.
- FIG. 4 is a diagram showing the superposition of the knit structure, the fabric, and the human body model in the example.
- FIG. 5 is a diagram showing simulation images obtained in the example.
- a) to d) show examples in which the knitting data of the knit product is common and only the yarn data is changed; e) to h) are partially enlarged views of images a) to d).
- 2 is a simulation device
- 4 is a bus
- 6 is a stylus
- 8 is a mouse
- 10 is a keyboard
- Manual input may include a trackball or joystick
- 12 is a color monitor
- 14 is a color printer
- 16 is a communication unit, which communicates over a LAN or the Internet.
- 18 is a disk drive, which reads from and writes to appropriate disks.
- 20 is a design data editing unit, which generates knitwear design data using the stylus 6, mouse 8, keyboard 10 or the like.
- the data conversion unit converts the design data into knitting data for driving a knitting machine such as a flat knitting machine.
- The thread line data generator 24 converts the knitting data into thread line data (hereinafter, yarn data); this data is three-dimensional
- and shows the connection relations between stitches.
- The stitch position is specified by one or more 3D positions representing the stitch.
- The stitch type is added as an attribute.
- When the stitch position is specified by the coordinates of multiple points of the stitch, the stitch type can be determined from the stitch shape, so there is no need to store the stitch type.
- the scene setting unit 26 sets the light source position and viewpoint position in the simulation.
- the human body model storage unit 30 stores a three-dimensional color image of the human body model
- The fabric data storage unit 32 stores a color image of the clothing, made of fabric, that the human body model wears under the knitwear (hereinafter simply called the "fabric").
- The yarn data storage unit 34 stores the color image and opacity of the yarn used in the knitwear; the color image of the yarn includes the yarn body portion and the fluff portion.
- Reference numeral 35 denotes a dressing processing unit that dresses a virtual knitwear on a human body model or fabric using the yarn data, in accordance with WO2005/082185.
- In WO2005/082185 the fabric is not considered, but, for example, the fabric is first worn on the human body model, and the worn fabric deforms by colliding with a rigid body or with knit stitches.
- The wearing of the knitwear can then be simulated in the same way as in WO2005/082185.
- The knit drawing unit 38 creates, for the stitches in the knitwear's yarn data that have not been removed by hidden surface erasure, a loop image viewed from the viewpoint position, and connects the loop images to obtain the part of the knitwear visible from the viewpoint.
- The two-dimensional color image of the visible part includes the image of the yarn body and the image of the fluff, and an image indicating the depth-direction position (Z coordinate) with respect to the viewpoint and an image indicating the opacity α of the yarn body and fluff are added.
- The image data are, for example, RGB, Z, and α.
- The knit drawing unit 38 connects the individual loop images according to the yarn data, creates a two-dimensional color image of the knitwear (knitwear layer), and further creates a depth image (Z image) and an opacity image of α (mask image).
- The shadow image creation unit 40 creates an average shadow image that the knitwear casts on the human body model or fabric.
- The ray-tracing unit 41 performs ray tracing on the human body model, fabric, and knit images. However, ray tracing of how the knitwear blocks light to the human body model is not performed.
- The fabric is assumed to be opaque, and no ray tracing is performed to determine whether the fabric blocks light to the human body model.
- the cloth may be translucent and the shadow of the cloth on the human body model may be obtained.
- the shadow of the cloth on the human body model may be obtained in the same manner as the shadow image creation unit 40 obtains the shadow of the knitwear.
- the color image of the human body model is converted into a two-dimensional color image viewed from the viewpoint. These two-dimensional color images are called layers.
- an image showing the position in the depth direction with respect to the viewpoint is added from the 3D data, and ray tracing and shadowing are performed.
- The two-dimensional color image of the human body model and the color image of the fabric data are obtained and stored in the layer storage unit 42.
- The images indicating the positions of the human body model and the fabric data in the depth direction are also stored in the layer storage unit 42.
- The layers created are the two-dimensional color images of the knitwear, the fabric, and the human body model, the opacity image (mask image) of the knitwear, the shadow image of the knitwear, and the mask image of the fabric.
- The image composition unit 44 synthesizes four images: knitwear, human body model, fabric, and background. In the synthesis, the opacity image α of the knitwear is used together with the depth-position data with respect to the viewpoint, so that the human body model can be seen through the semi-transparent knit image.
- The obtained image is stored in the image memory 50, displayed on the color monitor 12, and output from the color printer 14 or the communication unit 16.
- Figure 2 shows the data flow in the example. The design data is converted into knitting data, which is converted into yarn data.
- The yarn data expresses the stitch position by specifying one to multiple 3D positions for each stitch, and has the stitch type and connection relations as attributes.
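A possible in-memory layout for such yarn data (field names and types are hypothetical; the source only states which pieces of information each stitch carries):

```python
from dataclasses import dataclass, field

@dataclass
class Stitch:
    """One stitch in the yarn data: one or more 3D positions,
    a stitch type, and connections to neighboring stitches."""
    positions: list                     # [(x, y, z), ...] one or more points
    stitch_type: str                    # attribute, e.g. "knit"
    connections: list = field(default_factory=list)  # indices of linked stitches

s = Stitch(positions=[(0.0, 0.0, 0.0), (0.1, 0.2, 0.0)],
           stitch_type="knit", connections=[1, 2])
```

As the text notes, when several points per stitch are stored, the type field could instead be inferred from the stitch shape.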
- Both the human body model and fabric data are 3D color images.
- The stitches of the yarn data are arranged around the human body model and the fabric.
- The positions of the viewpoint and the light source are set; the portion of the knitwear hidden by the human body model is erased, and the portion of the fabric hidden by the human body model is deleted. The 3D color image of the human body model is converted into a 2D color image viewed from the viewpoint.
- The color image of the fabric is converted into a two-dimensional color image of the fabric viewed from the viewpoint, and the yarn data of the knitwear is converted into a two-dimensional color image viewed from the viewpoint.
- The target stitches are those that have not been removed by hidden surface erasure; a color image of the yarn body and fluff is pasted along the stitch line, and images of their opacities are created at the same time. The color images of the yarn body and fluff and their opacities are stored in the yarn data storage unit 34.
- A stitch image is created as the stitch is viewed from the front; the stitch image is then rotated into the direction seen from the viewpoint, and where the yarn overlaps itself in the depth direction due to the rotation, the image is synthesized according to the opacity, increasing the opacity in the overlapping area. Since the 3D positions of the stitches are described in the yarn data, the depth-direction coordinate Z of each stitch with respect to the viewpoint is generated from them. Next, ray tracing is performed within the range of the knitwear so that, where yarns overlap, the yarn farther from the light source becomes darker.
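One standard way to "increase the opacity in the overlapping area" — the source does not spell out a formula — is to treat the two yarn passes as independent absorbers whose transmittances multiply:

```python
def combined_opacity(a_front, a_back):
    """Opacity where the rotated stitch makes the yarn overlap
    itself: light must pass both layers, so the transmittances
    (1 - a) multiply.  This is an assumed model, not the patent's
    stated formula."""
    return 1.0 - (1.0 - a_front) * (1.0 - a_back)

print(combined_opacity(0.5, 0.5))   # 0.75: denser than either pass alone
```

The same rule extends to any number of overlapping yarn layers by folding it over the stack.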
- Ray tracing of the light from the light source is performed for the human body model while ignoring the fabric and the knitwear, and for the fabric while considering the human body model and ignoring the knitwear.
- The shadow of the knitwear is then added to the color image of the human body model and the color image of the fabric. This shadow does not represent the shadows of individual yarns in the knitwear but is the average shadow of the knitwear. After applying the shadow of the knitwear, the images of the layers of the knitwear, the human body model, and the fabric are combined.
- The image of the stitch as seen from the viewpoint can be obtained by rotating the image of stitch 62 viewed from the front, or alternatively by allocating the images of the yarn body and fluff along the yarn as seen from the viewpoint from the beginning.
- The yarn body and fluff are given opacities at the yarn data stage, and where the yarn overlaps due to the rotation, the opacity is increased in that portion.
- A two-dimensional color image for one stitch viewed from the viewpoint 65, a depth image, and an opacity image are thus obtained.
- A plurality of stitches in the yarn data are then superimposed, blending where stitches overlap according to their Z values and opacities α.
- In this way, a two-dimensional color image of the knitwear viewed from the viewpoint 65 is obtained.
- FIG. 4 shows the shadow model.
- 70 is the color image layer of the knitwear, and the value of the image is P1.
- 71 is the mask image layer of the knitwear, whose image value is α; 72 is the shadow image layer of the knitwear, whose image value is β.
- 75 is the human body model layer,
- and the value of the image is P3. The positional relationship of these layers with respect to the viewpoint has already been processed by the hidden surface removal unit 36: the frontmost layer is the knitwear, the fabric is in the middle, and the human body model is at the back.
- When the shadow image is monochrome, β is one-dimensional; when the shadow image is color, β is three-dimensional data with RGB components.
- A color shadow has color values obtained by blurring the color values of the knitwear.
- a composite image can be easily obtained by the image composition unit 44.
- By specifying the intensity of the shadow and the degree of blurring, the shadow can be expressed realistically, and the direction of the light source can be expressed by sliding the shadow.
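Sliding and blurring the shadow can be sketched on a single image row (the shift amount, the box-blur radius, and the function name are illustrative parameters, not from the source):

```python
def slide_and_blur(row, dx, radius):
    """Shift a row of shadow values by dx pixels (suggesting the
    light direction), then soften its edges with a simple box blur
    of the given radius."""
    n = len(row)
    shifted = [row[i - dx] if 0 <= i - dx < n else 0.0
               for i in range(n)]
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(shifted[lo:hi]) / (hi - lo))
    return out

soft = slide_and_blur([0.0, 1.0, 1.0, 0.0, 0.0], dx=1, radius=1)
```

A larger radius gives a softer, more diffuse-looking shadow; a larger dx suggests a more oblique light source.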
- Fig. 5 a) to h) show simulation images in the embodiment.
- the images of the yarn used are shown in the upper left of a) to d).
- the human body model and fabric are common, and the design data of knitwear is also common.
- the simulation images are shown enlarged in e) to h), and the brightness of the human body model seen through the knitwear changes depending on the thickness of the thread.
- the wearing state of the knitwear with fluff can be simulated. In the embodiment, the following effects can be obtained.
- Only the knitwear layer, the human body model layer, and the fabric layer need to be combined, so the composition is simple.
- the simulation apparatus 2 is realized by installing a simulation program in a computer, and the simulation program is read into the computer via a storage medium such as a CD-ROM or a carrier wave.
- Fig. 2 corresponds to the block diagram of the simulation program.
- the simulation program executes each process in Fig. 2.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Textile Engineering (AREA)
- General Physics & Mathematics (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Evolutionary Computation (AREA)
- Geometry (AREA)
- General Engineering & Computer Science (AREA)
- Processing Or Creating Images (AREA)
- Image Generation (AREA)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP08753121.6A EP2159755A4 (en) | 2007-06-13 | 2008-05-26 | KNIT SIMULATION METHOD, APPARATUS FOR THE METHOD, AND STORAGE MEDIUM |
JP2009519222A JP5161213B2 (ja) | 2007-06-13 | 2008-05-26 | ニットウェアのシミュレーション方法とその装置及び記憶媒体 |
CN2008800197794A CN101689307B (zh) | 2007-06-13 | 2008-05-26 | 针织品模拟方法及其装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2007-155912 | 2007-06-13 | ||
JP2007155912 | 2007-06-13 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2008152931A1 true WO2008152931A1 (ja) | 2008-12-18 |
Family
ID=40129541
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2008/060100 WO2008152931A1 (ja) | 2007-06-13 | 2008-05-26 | ニットウェアのシミュレーション方法とその装置及び記憶媒体 |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP2159755A4 (ja) |
JP (1) | JP5161213B2 (ja) |
CN (1) | CN101689307B (ja) |
WO (1) | WO2008152931A1 (ja) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4792521B2 (ja) * | 2009-12-15 | 2011-10-12 | 株式会社アイ.エス.テイ | 布製品識別装置および布製品把持システム |
DE102011106401A1 (de) * | 2011-07-02 | 2013-01-03 | H. Stoll Gmbh & Co. Kg | Verfahren und Vorrichtung zur Maschendarstellung |
CN106608201B (zh) | 2015-10-26 | 2019-04-19 | 比亚迪股份有限公司 | 电动车辆及其主动安全控制***和方法 |
CN110644128B (zh) * | 2018-09-27 | 2022-02-22 | 北京大豪科技股份有限公司 | 手套机机头控制方法、装置、设备及存储介质 |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005082185A1 (ja) * | 2004-02-26 | 2005-09-09 | Shima Seiki Manufacturing, Ltd. | 人体モデルへのニットガーメントの着装シミュレーション方法とその装置、並びにそのプログラム |
JP2005258537A (ja) * | 2004-03-09 | 2005-09-22 | Nippon Telegr & Teleph Corp <Ntt> | 3次元モデル生成方法と生成装置およびプログラムと記録媒体 |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5557527A (en) * | 1993-08-31 | 1996-09-17 | Shima Seiki Manufacturing Ltd. | Knit design system and a method for designing knit fabrics |
-
2008
- 2008-05-26 WO PCT/JP2008/060100 patent/WO2008152931A1/ja active Application Filing
- 2008-05-26 CN CN2008800197794A patent/CN101689307B/zh active Active
- 2008-05-26 JP JP2009519222A patent/JP5161213B2/ja active Active
- 2008-05-26 EP EP08753121.6A patent/EP2159755A4/en not_active Withdrawn
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005082185A1 (ja) * | 2004-02-26 | 2005-09-09 | Shima Seiki Manufacturing, Ltd. | 人体モデルへのニットガーメントの着装シミュレーション方法とその装置、並びにそのプログラム |
JP2005258537A (ja) * | 2004-03-09 | 2005-09-22 | Nippon Telegr & Teleph Corp <Ntt> | 3次元モデル生成方法と生成装置およびプログラムと記録媒体 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2159755A4 * |
Also Published As
Publication number | Publication date |
---|---|
JP5161213B2 (ja) | 2013-03-13 |
JPWO2008152931A1 (ja) | 2010-08-26 |
EP2159755A4 (en) | 2014-05-07 |
CN101689307B (zh) | 2012-02-29 |
CN101689307A (zh) | 2010-03-31 |
EP2159755A1 (en) | 2010-03-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20240193834A1 (en) | Techniques and Workflows for Computer Graphics Animation System | |
EP3129978B1 (en) | Graphics processing enhancement by tracking object and/or primitive identifiers | |
EP1918881B1 (en) | Techniques and workflows for computer graphics animation system | |
EP1452985B1 (en) | Knit design method and device | |
JP4966003B2 (ja) | 布帛パターンの作成装置と作成方法、作成プログラム | |
KR101078215B1 (ko) | 니트 디자인 방법 및 장치 | |
JP2023553507A (ja) | 特注仕様製品の合成データ表示の高品質レンダリング表示を得るためのシステムおよびその方法 | |
CN112131724A (zh) | 一种针织成形产品的三维设计仿真***与方法 | |
JP5431173B2 (ja) | 着装シミュレーション装置とシミュレーションプログラム | |
JP5161213B2 (ja) | ニットウェアのシミュレーション方法とその装置及び記憶媒体 | |
JP5161229B2 (ja) | 着装シミュレーション装置とシミュレーション方法、シミュレーションプログラム | |
JP5208130B2 (ja) | ニットシミュレーション装置とニットシミュレーションでの糸の捻れ修正方法 | |
JP2003036449A (ja) | 繊維製品の3次元画像生成用マップデータの自動生成方法、及び、繊維製品の3次元シミュレーション画像生成方法 | |
JP2002056405A (ja) | テクスチャマッピング処理装置 | |
JP3749373B2 (ja) | 三次元立体構造体の二次元表示方法 | |
JP5079786B2 (ja) | ニット製品のシミュレーション装置とシミュレーション方法 | |
JPWO2008105529A1 (ja) | ニット製品のシミュレーション装置とシミュレーション方法 | |
Lazunin et al. | Interactive visualization of multi-layered clothing | |
MacDonald | 3-D Drawing |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200880019779.4 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 08753121 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2009519222 Country of ref document: JP Kind code of ref document: A |
|
REEP | Request for entry into the european phase |
Ref document number: 2008753121 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2008753121 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |