WO2013111195A1 - Drawing data generation device and image drawing device - Google Patents
Drawing data generation device and image drawing device
- Publication number
- WO2013111195A1 (PCT/JP2012/000531)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- texture
- polygon
- node
- group
- image
- Prior art date
Classifications
- G06T 11/203 — 2D image generation; drawing from basic elements, e.g. lines or circles; drawing of straight lines or curves
- G06T 11/001 — 2D image generation; texturing; colouring; generation of texture or colour
- G06T 15/04 — 3D image rendering; texture mapping
- G06T 17/20 — 3D modelling, e.g. data description of 3D objects; finite element generation, e.g. wire-frame surface description, tessellation

(All classifications fall under G — Physics; G06 — Computing, calculating or counting; G06T — Image data processing or generation, in general.)
Definitions
- The present invention relates to a drawing data generation device and an image drawing device that express a two-dimensional or three-dimensional shape using polygon groups which are managed in a tree structure and which represent a model at a plurality of levels of detail.
- A polygon model is widely used as a method for expressing a two-dimensional or three-dimensional shape.
- A triangle is mainly used as the unit shape, and a shape is expressed by a combination of triangles.
- Texture mapping, in which a two-dimensional texture image is associated with and mapped onto the surface of a polygon before drawing, is also widely used.
- A polygon drawing command is issued after a command selecting the texture image to be used has been issued to a polygon drawing device such as a GPU. Since the processing time of this selection command is particularly long, a texture atlas, in which a plurality of texture images are combined in advance as shown in FIG. 1, is generally used to shorten the drawing time.
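- The effect of a texture atlas on the number of issued commands can be sketched as follows (an illustrative Python sketch; the function and command names are hypothetical, not taken from the patent):

```python
# Illustrative sketch (hypothetical names): counting the commands sent to
# a polygon drawing device with and without a texture atlas.

def commands_without_atlas(polygon_groups):
    """One texture-selection command plus one draw command per group."""
    cmds = []
    for group in polygon_groups:
        cmds.append(("select_texture", group))
        cmds.append(("draw", group))
    return cmds

def commands_with_atlas(polygon_groups):
    """A single texture-selection command for the shared atlas, then one
    draw command per group."""
    cmds = [("select_texture", "atlas")]
    for group in polygon_groups:
        cmds.append(("draw", group))
    return cmds

groups = ["poly1", "poly2", "poly3", "poly4"]
print(len(commands_without_atlas(groups)))  # 8
print(len(commands_with_atlas(groups)))     # 5
```

Because the selection command is the expensive one, collapsing many selections into a single atlas bind is what shortens the drawing time, which mirrors the flows of FIG. 2(a) and FIG. 2(b).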
- FIG. 1(a) shows drawing of polygon groups without using a texture atlas, and FIG. 1(b) shows drawing of polygon groups using a texture atlas.
- FIG. 2(a) shows the drawing flow of FIG. 1(a), and FIG. 2(b) shows the drawing flow of FIG. 1(b).
- Because each polygon drawing device limits the image size that can be used as a texture image, a large number of texture images cannot all be contained in one texture atlas, and a plurality of texture atlases are generated.
- To reduce the drawing load, a level-of-detail (LOD) technique is widely used, and such model groups are often managed in a tree structure.
- Non-Patent Document 1 discloses a technique in which each node of a tree structure is associated with one polygon model, and a parent node manages a simplified model obtained by combining the models of its child nodes.
- FIG. 3 shows an example of a tree structure and a polygon group and a texture image group corresponding to each node.
- FIG. 3A shows a tree structure showing the relationship between polygon groups
- FIG. 3B shows a polygon group corresponding to each node
- FIG. 3C shows a texture corresponding to each node.
- The LOD technique selects and draws nodes appropriately so that the same model is not drawn at different levels of detail at the same time.
- FIG. 4A shows an example of the result of selecting drawing target nodes from the tree structure of FIG. 3, and FIG. 4B shows the result of drawing the polygon groups corresponding to the selected nodes.
- When a texture atlas is generated by combining the texture images used in the LOD described above, a plurality of texture atlases must be generated if the tree structure has many nodes. At any time, the nodes to be drawn are only a part of all nodes. Therefore, to draw at high speed, the texture atlases need to be generated and used appropriately so that the number of texture image designation commands issued at the time of drawing is reduced.
- Patent documents: JP-A-8-293041; JP-A-10-172003.
- The present invention has been made to solve the above-described problems, and its object is to provide a drawing data generation device and an image drawing device capable of generating texture atlases such that fewer texture image selection commands are used at the time of drawing.
- The drawing data generation device receives, as inputs, polygon groups which represent a model at a plurality of levels of detail whose relationship is expressed by a tree structure, and a texture image group uniquely assigned to each polygon group. It includes a node collection unit that determines the nodes corresponding to the texture images to be combined, and a texture atlas generation unit that combines the texture images into a texture atlas group using the node information generated by the node collection unit and converts the texture coordinates of the vertices of the polygon groups in correspondence with their drawing positions.
- Since the drawing data generation device determines the nodes corresponding to the texture images to be combined and generates the texture atlas group from this node information, texture atlases can be generated such that fewer texture image selection commands are executed.
- FIG. 5 is a configuration diagram illustrating the image drawing apparatus according to the first embodiment.
- the image drawing apparatus includes a preprocessing unit 1, a runtime processing unit 2, an HDD (hard disk device) 3, and a polygon drawing device 4.
- The preprocessing unit 1 constitutes the drawing data generation device; it generates a tree structure, polygon groups, and a texture atlas group from an input tree structure, polygon groups, and texture image group, and includes a node collection unit 11 and a texture atlas generation unit 12.
- The runtime processing unit 2 issues drawing commands to the polygon drawing device 4 based on the tree structure, polygon groups, and texture atlas group generated by the preprocessing unit 1, and includes a drawing node determination unit 21, a drawing list generation unit 22, and a drawing unit 23.
- the HDD 3 is a storage device that stores the generation result of the preprocessing unit 1.
- the polygon drawing device 4 is a device that is made of a GPU or the like and performs drawing in accordance with a drawing command from the runtime processing unit 2.
- The node collection unit 11 of the preprocessing unit 1 is a processing unit that receives, as inputs, polygon groups representing a model at a plurality of levels of detail whose relationship is expressed by a tree structure and the texture image group uniquely assigned to each of these polygon groups, and determines the nodes corresponding to the texture images to be combined.
- The texture atlas generation unit 12 is a processing unit that combines the texture images into a texture atlas group using the node information generated by the node collection unit 11, and converts the texture coordinates of the polygon vertices in correspondence with their drawing positions.
- The drawing node determination unit 21 in the runtime processing unit 2 is a processing unit that determines the polygon groups to be drawn, using the tree structure, polygon groups, and texture atlas group output from the texture atlas generation unit 12 and at least viewpoint position information.
- the drawing list generation unit 22 is a processing unit that generates a list indicating the drawing order for the drawing target polygon group determined by the drawing node determination unit 21.
- the drawing unit 23 is a processing unit that issues a command for drawing a drawing target polygon group to the polygon drawing device 4 using the list generated by the drawing list generation unit 22.
- The preprocessing unit 1 receives the polygon groups, the texture image group corresponding to each polygon group, and the tree structure indicating the relationship between the polygon groups, generates a small number of texture atlases from the texture image group, appropriately converts the texture coordinates of the polygon vertices, and outputs the tree structure, polygon groups, and texture atlases to the HDD 3.
- The runtime processing unit 2 reads the tree structure, texture atlas group, and polygon groups output by the preprocessing unit 1 from the HDD 3, determines the polygon groups to be drawn based on input information such as the viewpoint position, and issues commands to draw them to the polygon drawing device 4.
- FIG. 6 is a flowchart showing the operation of the node collection unit 11.
- the node collection unit 11 refers to the input tree structure and texture image group, and applies the following processing to each node group at the same depth in the tree structure.
- Each node at the depth being processed is initially placed in its own set.
- All nodes one level closer to the root than the depth being processed are set as ancestor nodes.
- Nodes at the depth being processed that share the same ancestor node are collected and set as a range that can be integrated (step ST111).
- FIG. 7 shows the result of applying step ST111 to the tree structure and texture image group of FIG. 3, processing the node group at the leaf depth. In FIGS. 7 to 15, a dotted frame indicates a set, a dashed frame indicates a range that can be integrated, and a solid frame indicates a range that cannot be integrated.
- Next, one integrable range is selected (step ST112). Then, the two sets whose corresponding texture images have the smallest total area are selected from the sets in that range (step ST113), and it is determined whether they can be integrated (step ST114).
- Integration is possible when all the texture images of the two selected sets can be combined into one image that fits within the texture size usable by the hardware. Whether the combined result fits within the texture size can be determined by solving a two-dimensional bin-packing problem that packs each texture image into a rectangle of the usable texture size. If integration is impossible, the range is marked as a non-integrable range and the process proceeds to step ST117. If integration is possible, the two selected sets are integrated (step ST115), and it is determined whether two or more sets remain in the range (step ST116).
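- The feasibility test of step ST114 can be approximated with a simple first-fit shelf-packing heuristic, sketched below. The patent leaves the bin-packing method open, so this is only one conservative possibility, and all names are illustrative:

```python
def fits_in_texture(images, max_w, max_h):
    """Conservative shelf-packing check: place each (w, h) image left to
    right on horizontal shelves, opening a new shelf when a row is full.
    A heuristic stand-in for the 2-D bin-packing test of step ST114."""
    # Tallest-first ordering improves shelf utilisation.
    images = sorted(images, key=lambda wh: wh[1], reverse=True)
    x = y = shelf_h = 0
    for w, h in images:
        if w > max_w or h > max_h:
            return False           # a single image exceeds the texture
        if x + w > max_w:          # current shelf is full: start a new one
            y += shelf_h
            x, shelf_h = 0, 0
        if y + h > max_h:          # ran out of vertical space
            return False
        x += w
        shelf_h = max(shelf_h, h)
    return True

print(fits_in_texture([(512, 512)] * 4, 1024, 1024))  # True
print(fits_in_texture([(512, 512)] * 5, 1024, 1024))  # False
```

Because shelf packing never finds a layout that true optimal packing would miss only in pathological cases, a `True` here guarantees the combined atlas fits; a `False` may occasionally be over-cautious, which only costs an extra atlas.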
- FIG. 8 shows the result of selecting the leftmost range from the state of FIG. 7 and integrating the two sets.
- FIG. 9 shows the state in which all sets within the leftmost range have been integrated into one set. If only one set remains in the range, it is determined whether all integrable ranges have been selected in step ST112 (step ST117). If an unselected range remains, the process returns to step ST112, selects it, and repeats the same processing.
- FIG. 10 shows a state when all the ranges that can be integrated are selected and processed in step ST112, and the nodes in all the ranges are integrated into one.
- In step ST117, when all the ranges have already been selected in step ST112 and the set integration processing has been applied to them, it is determined whether two or more integrable ranges remain; if not, the processing ends (step ST118). Otherwise, the ancestor depth is moved one level closer to the root, integrable ranges having the same ancestor node are merged (step ST118), and the process returns to step ST112.
- FIG. 11 shows the state resulting from bringing the ancestor nodes one level closer to the root from the state of FIG. 10 and merging the integrable ranges having the same ancestor node.
- FIG. 12 and FIG. 13 show the state transitions in this process.
- FIG. 14 shows the state of the nodes when the processing of the node collection unit 11 is completed. This figure shows the result of the processing ending after it is determined in step ST114 that the set including nodes 5 to 12 and the set including nodes 13 to 20 cannot be integrated. Finally, the node collection unit 11 outputs the generated sets together with the input tree structure, polygon groups, and texture image group.
- FIG. 15 shows an example of a result of applying the processing of the node collection unit 11 to nodes of all depths. In the figure, for the following explanation, unique IDs 0 to 3 are assigned to each set.
- In this way, the node collection unit 11 collects nodes at the same depth of the tree structure in order of closest kinship, and generates sets of nodes representing sets of texture images from which a texture atlas of the size closest to the maximum size usable by the polygon drawing device can be generated.
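- Under the simplifying assumption that total texture area stands in for the full bin-packing test of step ST114, a single-depth pass of the collection procedure might look like the sketch below (illustrative; it omits the ancestor-widening loop of step ST118, and all names are hypothetical):

```python
from collections import defaultdict

def collect_sets(nodes, ancestor, area, max_area):
    """Sketch of one same-depth pass of the node collection (FIG. 6).
    nodes: node ids at the depth being processed; ancestor[n]: the node
    one level closer to the root; area[n]: texture area of node n.
    Total area stands in for the bin-packing test of step ST114."""
    ranges = defaultdict(list)
    for n in nodes:                     # step ST111: group by ancestor
        ranges[ancestor[n]].append({n})
    result = []
    for sets in ranges.values():
        while len(sets) >= 2:
            # step ST113: pick the two sets with the smallest total area
            sets.sort(key=lambda s: sum(area[n] for n in s))
            merged = sets[0] | sets[1]
            if sum(area[n] for n in merged) > max_area:
                break                   # step ST114: cannot be integrated
            sets = [merged] + sets[2:]  # step ST115: integrate the two
        result.extend(sets)
    return result

# Hypothetical leaf depth: nodes 5-6 share ancestor 2, nodes 7-8 share 3.
print(collect_sets([5, 6, 7, 8],
                   {5: 2, 6: 2, 7: 3, 8: 3},
                   {5: 1, 6: 1, 7: 3, 8: 3},
                   max_area=4))  # [{5, 6}, {7}, {8}]
```

Smallest-pair-first merging keeps each resulting set as close as possible to the capacity limit, which is what lets the later atlas generation approach the maximum usable texture size.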
- the texture atlas generation unit 12 combines the texture images corresponding to the nodes in the set generated by the node collection unit 11 to form one texture atlas.
- the method for generating the texture atlas is arbitrary, but can be realized by, for example, solving the two-dimensional bin packing problem in the same manner as the processing in step ST114 in the node collection unit 11.
- FIG. 16 shows the result of generating a texture atlas corresponding to each set of FIG.
- the texture atlas generation unit 12 appropriately updates the texture coordinates of the vertices of each polygon.
- FIG. 17A shows the range of texture coordinates before generating the texture atlas
- FIG. 17B shows the range of texture coordinates after generating the texture atlas. That is, FIG. 17A shows the texture image mapped onto polygon group 2 in FIG. 3, whose polygon vertices are assigned texture coordinates in the range (0.0, 0.0) to (1.0, 1.0).
- In the texture atlas, this texture image occupies the range from (0.5, 0.5) to (1.0, 1.0).
- Therefore, the transformation
- U′ = 0.5 + 0.5 × U (1)
- V′ = 0.5 + 0.5 × V (2)
- is applied.
- (U, V) represents the texture coordinates of the vertex before conversion
- (U ′, V ′) represents the texture coordinates after conversion.
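- Equations (1) and (2) are an instance of the general remapping U′ = offset + scale × U into the sub-rectangle an image occupies inside the atlas. A minimal sketch (function and parameter names are illustrative):

```python
def remap_uv(uv, offset_u, offset_v, scale_u, scale_v):
    """Map a vertex's per-image texture coordinates into the sub-rectangle
    that the image occupies inside the atlas:
    U' = offset_u + scale_u * U,  V' = offset_v + scale_v * V."""
    u, v = uv
    return (offset_u + scale_u * u, offset_v + scale_v * v)

# The example from the text: the image occupies (0.5, 0.5)-(1.0, 1.0)
# of the atlas, so offset = scale = 0.5 on both axes (eqs. (1), (2)).
print(remap_uv((0.0, 0.0), 0.5, 0.5, 0.5, 0.5))  # (0.5, 0.5)
print(remap_uv((1.0, 1.0), 0.5, 0.5, 0.5, 0.5))  # (1.0, 1.0)
```

The offsets and scales come directly from the rectangle each image was assigned during atlas packing, so the conversion is a pure per-vertex affine transform.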
- the texture atlas generation unit 12 records the input tree structure, the polygon group whose vertex texture coordinates are updated, and the generated texture atlas to the HDD 3.
- The drawing node determination unit 21 reads the tree structure, texture atlases, and polygon groups recorded by the preprocessing unit 1 from the HDD 3, and determines the polygon groups to be drawn according to input information such as the viewpoint position. The determination ensures that the same model is not drawn at different levels of detail at the same time.
- the method for determining the drawing target is arbitrary, but for example, a threshold value can be set for each node of the tree structure, and can be determined according to the relationship between the distance from the viewpoint and the threshold value.
- the root node is set as a temporary drawing target node, and if the distance from the viewpoint to the polygon group corresponding to the root node is equal to or greater than the threshold, the polygon group corresponding to the root node is set as the drawing target. On the other hand, if it is less than the threshold, the root node is excluded from the drawing target, and the child of the root node is set as a temporary drawing target node. Thereafter, the drawing target can be determined by repeating the same determination for the temporary drawing target node.
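- The threshold-based selection described above can be sketched recursively (illustrative Python; the leaf fallback for nodes without children is an added assumption):

```python
def select_draw_nodes(node, children, threshold, distance):
    """Select drawing-target nodes: if the viewpoint distance to a node's
    polygon group is at or above its threshold (or the node is a leaf),
    draw that node; otherwise recurse into its children, so the same
    model is never drawn at two levels of detail at once."""
    if distance(node) >= threshold[node] or not children.get(node):
        return [node]
    selected = []
    for child in children[node]:
        selected += select_draw_nodes(child, children, threshold, distance)
    return selected

# Hypothetical two-level tree: root 1 with leaf children 2 and 3.
children = {1: [2, 3], 2: [], 3: []}
threshold = {1: 10.0, 2: 5.0, 3: 5.0}
print(select_draw_nodes(1, children, threshold, lambda n: 8.0))   # [2, 3]
print(select_draw_nodes(1, children, threshold, lambda n: 12.0))  # [1]
```

A far viewpoint satisfies the root's threshold and selects the coarse model; a near viewpoint descends the tree and selects the more detailed children instead, never both.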
- FIG. 18 shows an example in which a drawing target node is determined for the tree structure of FIG. In the figure, shaded numbers indicate drawing target nodes.
- the drawing list generation unit 22 generates a list in which IDs of drawing target nodes are collected for each set.
- FIG. 19 shows an example of the lists generated from the state of the tree structure shown in FIG. 18. That is, set 1 lists nodes 3 and 4 as drawing targets, and set 2 lists nodes 5, 6, 7, 8, 9, 10, 11, and 12.
- The drawing unit 23 selects a non-empty list from the lists generated by the drawing list generation unit 22 (step ST231) and transmits a command selecting the texture atlas corresponding to the selected list to the polygon drawing device 4 (step ST232). Next, one polygon group corresponding to a node ID included in the selected list is selected (step ST233), and a command to draw the selected polygon group is transmitted to the polygon drawing device 4 (step ST234). It is then determined whether steps ST233 and ST234 have been applied to all node IDs in the list (step ST235); if not, the process returns to step ST233.
- If they have been applied, it is determined whether steps ST231 to ST235 have been applied to all lists (step ST236); if not, the process returns to step ST231.
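- The drawing loop of steps ST231 to ST236 issues one texture-selection command per non-empty list. A sketch with a mock device interface (all names are illustrative, not part of the patent):

```python
def issue_draw_commands(lists, device):
    """Drawing-unit loop (steps ST231-ST236): for each non-empty per-set
    list, bind that set's atlas once, then draw every polygon group in
    the list. `device` stands in for the polygon drawing device."""
    for set_id, node_ids in lists.items():
        if not node_ids:                 # skip empty lists
            continue
        device.select_texture(set_id)    # step ST232: one bind per set
        for node_id in node_ids:
            device.draw_polygon_group(node_id)   # step ST234

class CountingDevice:
    """Minimal mock that just counts issued commands."""
    def __init__(self):
        self.binds = 0
        self.draws = 0
    def select_texture(self, atlas_id):
        self.binds += 1
    def draw_polygon_group(self, node_id):
        self.draws += 1

dev = CountingDevice()
# The example of FIG. 19: set 1 holds nodes 3-4, set 2 holds nodes 5-12.
issue_draw_commands({1: [3, 4], 2: list(range(5, 13)), 3: []}, dev)
print(dev.binds, dev.draws)  # 2 10
```

Ten polygon groups are drawn with only two texture-selection commands, which is the command reduction the per-set lists are designed to achieve.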
- the polygon drawing process in the polygon drawing apparatus 4 is the same as a general polygon drawing method, and is therefore omitted.
- As described above, according to Embodiment 1, polygon groups representing a model at a plurality of levels of detail whose relationship is expressed by a tree structure, and the texture image group uniquely assigned to each of them, are input; the node collection unit determines the nodes corresponding to the texture images to be combined, and the texture atlas generation unit combines the texture images into texture atlases using the node information generated by the node collection unit and converts the texture coordinates of the polygon vertices according to the drawing positions. Texture atlases can therefore be generated such that fewer texture image selection commands are used at the time of drawing.
- Further, the node collection unit collects nodes at the same depth of the tree structure in order of closest kinship and generates sets of nodes representing sets of texture images from which texture atlases of the size closest to the maximum size usable by the polygon drawing device can be generated, so the number of texture image selection commands used at the time of drawing is minimized and high-speed drawing can be performed.
- The image drawing apparatus includes: the node collection unit, which receives as inputs polygon groups representing a model at a plurality of levels of detail whose relationship is expressed by a tree structure and the texture image group uniquely assigned to each polygon group, and determines the nodes corresponding to the texture images to be combined; the texture atlas generation unit, which combines the texture images into a texture atlas group using the node information generated by the node collection unit and converts the texture coordinates of the polygon vertices according to the drawing positions; the drawing node determination unit, which determines the polygon groups to be drawn using the tree structure, polygon groups, and texture atlas group output by the texture atlas generation unit and at least viewpoint position information; the drawing list generation unit, which generates lists indicating the drawing order for the drawing-target polygon groups determined by the drawing node determination unit; and the drawing unit, which issues commands to draw the drawing-target polygon groups to the polygon drawing device using the lists generated by the drawing list generation unit. Since the texture atlases are generated by selecting combinations of texture images such that fewer texture image selection commands are used at the time of drawing, high-speed drawing can be performed.
- Since the drawing node determination unit determines the nodes to be drawn according to the positional relationship between the viewpoint and the threshold set for each node of the tree structure, so that the same polygon group expressed at different levels of detail is not selected, the drawing time can be shortened.
- In addition, the drawing list generation unit generates, for each set generated by the node collection unit, a list representing the polygon groups to be drawn, so high-speed drawing can be performed.
- Further, the drawing unit refers to each list generated by the drawing list generation unit, issues a texture image designation command only once for each non-empty list, and issues a command to draw each polygon group in the list, so high-speed drawing can be performed.
- any component of the embodiment can be modified or any component of the embodiment can be omitted within the scope of the invention.
- As described above, the drawing data generation device and the image drawing device combine texture images that are highly likely to be drawn at the same time into texture atlases, thereby reducing the number of texture image selections used at the time of drawing, and draw together the polygon models corresponding to each texture atlas; they are therefore suitable for use in computer graphics and the like.
- 1 preprocessing unit, 2 runtime processing unit, 3 HDD, 4 polygon drawing device, 11 node collection unit, 12 texture atlas generation unit, 21 drawing node determination unit, 22 drawing list generation unit, 23 drawing unit.
Claims (6)
- 1. A drawing data generation device comprising: a node collection unit that receives, as inputs, polygon groups which represent a model at a plurality of levels of detail whose relationship is expressed by a tree structure and a texture image group uniquely assigned to each of the polygon groups, and that determines the nodes corresponding to the texture images to be combined; and a texture atlas generation unit that combines the texture images into a texture atlas group using the node information generated by the node collection unit, and converts the texture coordinates of the vertices of the polygon groups in correspondence with their drawing positions.
- 2. The drawing data generation device according to claim 1, wherein the node collection unit collects nodes at the same depth of the tree structure in order of closest kinship, and generates sets of nodes representing sets of texture images from which a texture atlas of the size closest to the maximum size usable by the polygon drawing device performing the polygon drawing can be generated.
- 3. An image drawing apparatus comprising: a node collection unit that receives, as inputs, polygon groups which represent a model at a plurality of levels of detail whose relationship is expressed by a tree structure and a texture image group uniquely assigned to each of the polygon groups, and that determines the nodes corresponding to the texture images to be combined; a texture atlas generation unit that combines the texture images into a texture atlas group using the node information generated by the node collection unit, and converts the texture coordinates of the vertices of the polygon groups in correspondence with their drawing positions; a drawing node determination unit that determines the polygon groups to be drawn, using the tree structure, polygon groups, and texture atlas group output by the texture atlas generation unit and at least viewpoint position information; a drawing list generation unit that generates lists indicating the drawing order for the drawing-target polygon groups determined by the drawing node determination unit; and a drawing unit that issues, to a polygon drawing device, commands to draw the drawing-target polygon groups using the lists generated by the drawing list generation unit.
- 4. The image drawing apparatus according to claim 3, wherein the drawing node determination unit determines the nodes to be drawn according to the positional relationship between the viewpoint and a threshold set for each node of the tree structure, so that the same polygon group expressed at different levels of detail is not selected.
- 5. The image drawing apparatus according to claim 3, wherein the drawing list generation unit generates, for each set generated by the node collection unit, a list representing the polygon groups to be drawn.
- 6. The image drawing apparatus according to claim 3, wherein the drawing unit refers to each list generated by the drawing list generation unit, issues a texture image designation command only once for each non-empty list, and issues a command to draw each polygon group in the list.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013554997A JP5653541B2 (ja) | 2012-01-27 | 2012-01-27 | 描画データ生成装置及び画像描画装置 |
DE112012005770.8T DE112012005770T5 (de) | 2012-01-27 | 2012-01-27 | Zeichnungsdaten-Erzeugungsvorrichtung und Bildzeichnungsvorrichtung |
CN201280066320.6A CN104054112A (zh) | 2012-01-27 | 2012-01-27 | 描绘数据生成装置以及图像描绘装置 |
PCT/JP2012/000531 WO2013111195A1 (ja) | 2012-01-27 | 2012-01-27 | 描画データ生成装置及び画像描画装置 |
US14/360,790 US20150235392A1 (en) | 2012-01-27 | 2012-01-27 | Drawing data generation device and image drawing device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2012/000531 WO2013111195A1 (ja) | 2012-01-27 | 2012-01-27 | 描画データ生成装置及び画像描画装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013111195A1 true WO2013111195A1 (ja) | 2013-08-01 |
Family
ID=48872975
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/000531 WO2013111195A1 (ja) | 2012-01-27 | 2012-01-27 | 描画データ生成装置及び画像描画装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20150235392A1 (ja) |
JP (1) | JP5653541B2 (ja) |
CN (1) | CN104054112A (ja) |
DE (1) | DE112012005770T5 (ja) |
WO (1) | WO2013111195A1 (ja) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108780583A (zh) * | 2016-03-15 | 2018-11-09 | 三菱电机株式会社 | 纹理映射装置和纹理映射程序 |
CN107194982B (zh) * | 2016-03-15 | 2021-07-27 | 斑马智行网络(香港)有限公司 | 创建纹理图集和纹理图集等待集合的方法、装置和设备 |
WO2018039936A1 (en) * | 2016-08-30 | 2018-03-08 | Microsoft Technology Licensing, Llc. | Fast uv atlas generation and texture mapping |
JP2018170448A (ja) * | 2017-03-30 | 2018-11-01 | 株式会社ニューフレアテクノロジー | 描画データ作成方法 |
CN107248187B (zh) * | 2017-05-22 | 2020-12-08 | 武汉地大信息工程股份有限公司 | 一种快速三维模型纹理切割重组的方法 |
CN107463398B (zh) * | 2017-07-21 | 2018-08-17 | 腾讯科技(深圳)有限公司 | 游戏渲染方法、装置、存储设备及终端 |
CN108460826B (zh) * | 2017-12-28 | 2022-04-15 | 深圳市创梦天地科技有限公司 | 一种3d模型的处理方法及终端 |
US11315321B2 (en) * | 2018-09-07 | 2022-04-26 | Intel Corporation | View dependent 3D reconstruction mechanism |
US11741093B1 (en) | 2021-07-21 | 2023-08-29 | T-Mobile Usa, Inc. | Intermediate communication layer to translate a request between a user of a database and the database |
US11924711B1 (en) | 2021-08-20 | 2024-03-05 | T-Mobile Usa, Inc. | Self-mapping listeners for location tracking in wireless personal area networks |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001118056A (ja) * | 1999-10-20 | 2001-04-27 | Sony Corp | 画像処理装置 |
JP2008077604A (ja) * | 2006-09-25 | 2008-04-03 | Toshiba Corp | テクスチャフィルタリング装置、テクスチャマッピング装置、方法およびプログラム |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0740272A2 (en) * | 1995-04-28 | 1996-10-30 | Sun Microsystems, Inc. | Method and apparatus for fast rendering of three-dimensional objects |
US6525722B1 (en) * | 1995-08-04 | 2003-02-25 | Sun Microsystems, Inc. | Geometry compression for regular and irregular mesh structures |
US7940279B2 (en) * | 2007-03-27 | 2011-05-10 | Utah State University | System and method for rendering of texel imagery |
US9024959B2 (en) * | 2009-12-21 | 2015-05-05 | Intel Corporation | Demand-paged textures |
US8872839B2 (en) * | 2011-09-09 | 2014-10-28 | Microsoft Corporation | Real-time atlasing of graphics data |
2012:
- 2012-01-27 US US14/360,790 patent/US20150235392A1/en not_active Abandoned
- 2012-01-27 DE DE112012005770.8T patent/DE112012005770T5/de not_active Ceased
- 2012-01-27 CN CN201280066320.6A patent/CN104054112A/zh active Pending
- 2012-01-27 WO PCT/JP2012/000531 patent/WO2013111195A1/ja active Application Filing
- 2012-01-27 JP JP2013554997A patent/JP5653541B2/ja active Active
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015146517A1 (ja) * | 2014-03-27 | 2015-10-01 | 株式会社ジオ技術研究所 | 画像表示システム |
JP2015187795A (ja) * | 2014-03-27 | 2015-10-29 | 株式会社ジオ技術研究所 | 画像表示システム |
CN105474268A (zh) * | 2014-03-27 | 2016-04-06 | 吉欧技术研究所股份有限公司 | 图像显示*** |
WO2016046890A1 (ja) * | 2014-09-22 | 2016-03-31 | 三菱電機株式会社 | 情報表示制御システムおよびアトラス画像作成方法 |
JPWO2016046890A1 (ja) * | 2014-09-22 | 2017-04-27 | 三菱電機株式会社 | 情報表示制御システムおよびアトラス画像作成方法 |
US10109261B2 (en) | 2014-09-22 | 2018-10-23 | Mitsubishi Electric Corporation | Information display control system and method of mapping elemental images into a texture atlas |
CN106157353A (zh) * | 2015-04-28 | 2016-11-23 | Tcl集团股份有限公司 | 一种文字渲染方法和文字渲染装置 |
CN106157353B (zh) * | 2015-04-28 | 2019-05-24 | Tcl集团股份有限公司 | 一种文字渲染方法和文字渲染装置 |
JP2019159379A (ja) * | 2018-03-07 | 2019-09-19 | 五洋建設株式会社 | 三次元画像生成システム |
JP7079926B2 (ja) | 2018-03-07 | 2022-06-03 | 五洋建設株式会社 | 三次元画像生成システム |
JP2019159904A (ja) * | 2018-03-14 | 2019-09-19 | 日本ユニシス株式会社 | テクスチャマッピング装置およびテクスチャマッピング用プログラム |
Also Published As
Publication number | Publication date |
---|---|
JPWO2013111195A1 (ja) | 2015-05-11 |
JP5653541B2 (ja) | 2015-01-14 |
DE112012005770T5 (de) | 2014-10-23 |
CN104054112A (zh) | 2014-09-17 |
US20150235392A1 (en) | 2015-08-20 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| ENP | Entry into the national phase | Ref document number: 2013554997; Country of ref document: JP; Kind code of ref document: A
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 12866967; Country of ref document: EP; Kind code of ref document: A1
| WWE | Wipo information: entry into national phase | Ref document number: 14360790; Country of ref document: US
| WWE | Wipo information: entry into national phase | Ref document number: 112012005770; Country of ref document: DE; Ref document number: 1120120057708; Country of ref document: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 12866967; Country of ref document: EP; Kind code of ref document: A1