KR101665039B1 - Computer system and method for exporting data for 3D printer based on figure customizing function - Google Patents

Computer system and method for exporting data for 3D printer based on figure customizing function

Info

Publication number
KR101665039B1
Authority
KR
South Korea
Prior art keywords
character
data
background object
printer
user
Prior art date
Application number
KR1020150059499A
Other languages
Korean (ko)
Inventor
김시진
Original Assignee
주식회사 엔씨소프트
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 엔씨소프트
Priority to KR1020150059499A
Application granted
Publication of KR101665039B1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 - Animation
    • G06T13/20 - 3D [Three Dimensional] animation
    • G06T13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/12 - Digital output to print unit, e.g. line printer, chain printer
    • G06F3/1201 - Dedicated interfaces to print systems
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A data creation method for a 3D printer using figure customization is disclosed. The method comprises: a step of extracting 3D modeling data of a character and a background object selected by a user; a step of setting a position and a posture of the character and the background object; and a step of merging the 3D modeling data of the character and the background object to export graphic data for 3D printer output. The present invention may be implemented as a method executed on a computer or a computer executing the method.

Description

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a method and system for generating data for a 3D printer using figure customization, and more particularly, to a method for generating data for a 3D printer within software that processes 3D graphics.

3D printer technology was recognized as one of the top 10 technologies by the World Economic Forum (WEF) in 2012 and is regarded as one of the technologies that will change human civilization in the future.

Commercially available 3D printers currently use either a rapid prototyping method, in which a three-dimensional shape is formed by stacking many layers of hardened powder or liquid, or a method in which a product is made by cutting away a large block of synthetic resin with a round blade.

The range of applications of 3D printer technology is virtually unlimited, and there is great interest in the technology in the plastic model and figure industries.

In the past, it was common for an artist to sculpt a work by hand, to produce a resin kit from it using resin or other materials, or to produce a plastic product by making a mold and casting plastic material.

However, when a 3D printer is used, it is possible to easily produce various products using 3D graphic data generated in a predetermined format.

By using 3D printer technology, which is well suited to producing small quantities of a wide variety of products, it becomes possible to make figures and similar items in response to the diverse demands of users.

In particular, if content built from 3D graphic data is converted into data in a format that a 3D printer can recognize, the demands of users who want to own figures related to specific content can be satisfied.

However, since expert knowledge is required to modify or convert 3D graphic data, users who want to print their desired figures on a 3D printer inevitably face far greater difficulty and cost than simply purchasing a ready-made article.

1. Korean Patent Laid-Open Publication No. 10-2014-0061340, "Game screen shot management device using EXIF metadata and method thereof"
2. Korean Patent Registration No. 10-0771839, "Online game screen capture and character position confirmation providing system and method"
3. Korean Patent Registration No. 10-0682455, "Game scrap system, game scrap method, and computer readable recording medium recording a program for executing the method"

The present invention proposes a method for exporting a specific scene from software, such as a game, that renders and displays 3D graphic data in real time, so that the scene can be output directly on a 3D printer.

In particular, a method is proposed for generating data for a 3D printer simply by setting terrain, items, an action, and the like in the game for a character that the user possesses in the game.

According to an aspect of the present invention, there is provided a method of generating data for a 3D printer using figure customization, the method comprising:

Retrieving 3D modeling data of a character and a background object selected by a user;

Setting a position and a posture of the character and the background object;

and merging the 3D modeling data of the character and the background object to export graphic data for 3D printer output.

At this time, in the step of fetching the 3D modeling data, an interface is provided for selecting one of the characters possessed by the user's account, and the 3D modeling data of the character is retrieved when the user selects one of them.

At this time, in the step of setting the position and the posture, an in-game action or skill motion of the character is reproduced and displayed in the form of a moving picture; when the user selects a pause function, the motion of the character is stopped and the posture of the character is set accordingly.

In this case, however, graphical effects produced by the in-game action or skill are excluded from the graphic data for 3D printer output.

In addition, the user can select an in-game object other than the character and the background object, such as an item or a piece of equipment. In this case, the 3D modeling data of the other object selected by the user is fetched, and the object is attached to or worn by the character or placed at a predetermined position.

If the other object does not touch the character or background object, an extension line connecting the other object and the character or background object is generated, and the generated extension line is further included in the 3D printer output graphic data.

At this time, the extension line may be generated along a path corresponding to the shortest distance from the center of gravity of at least one of the two corresponding objects to the remaining one.

Meanwhile, in the step of fetching the 3D modeling data, 3D modeling data including polygons and mapping data of the object is acquired, color information is reconstructed using the mapping data, and the reconstructed color information is included in the graphic data exported for 3D printer output.

According to another aspect of the present invention, there is provided a data generating system for a 3D printer using figure customization, the system including a display and a processor.

The display displays a UI for selection of a character and a background object on the execution screen of the 3D software according to the processing of the processor,

The processor fetches the 3D modeling data of the character and the background object selected by the user, processes the position and posture setting of the character and the background object according to user manipulation, and merges the 3D modeling data of the character and the background object to export graphic data for 3D printer output.

FIG. 1 is a block diagram illustrating a computer system in which the present invention is implemented;
FIG. 2 is a flowchart illustrating a data generating method for a 3D printer using figure customization according to the present invention;
FIG. 3 is a diagram illustrating a user selecting a character held in the account and selecting a background object for positioning the character;
FIG. 4 is a diagram for explaining a process of setting a posture by selecting a pause function during the skill reproduction of a character;
FIG. 5 is a diagram for explaining an extension line connecting objects that do not touch each other;
FIG. 6 is a view for explaining a process of filling a void generated in a graphic data crop;
FIG. 7 is a diagram illustrating an output process by the 3D printer;
FIG. 8 is a diagram illustrating an output result by the 3D printer.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Hereinafter, the present invention will be described in detail with reference to preferred embodiments of the present invention and the accompanying drawings, wherein like reference numerals refer to like elements.

It is to be understood that when an element is described as "comprising" another element in the description of the invention or in the claims, this is not to be construed as limiting it to only that element; other elements may be further included.

Also, in the description of the invention or the claims, the components named "means", "parts", "modules", or "blocks" refer to units that process at least one function or operation, each of which may be implemented in software, in hardware, or in a combination of the two.

Hereinafter, 3D software collectively refers to software in which 3D graphic data, including polygons and mapping data, is rendered in real time and displayed on a screen, for example a 3D game program or graphic software for 3D modeling.

Hereinafter, a 3D object refers to an object rendered in real time by the 3D software, such as a character, a background, or an attachment to a character or background. A 3D object occupies a predetermined region in the coordinate system of the 3D software and may be composed of a plurality of polygons or may be a vector object. In addition, each 3D object may include texture mapping data applied to its exterior.

Hereinafter, a 3D printer refers to a printer that outputs a three-dimensional object from a prepared material according to 3D graphic data input in a predetermined format. Any device that satisfies this definition should be construed as a 3D printer, including well-known methods such as rapid prototyping, in which powder or liquid plastic is cured layer by layer, or machining of a block of synthetic resin.

Hereinafter, graphic data for 3D printer output refers to data describing a three-dimensional shape in a format that can be recognized by a 3D printer, that is, by the driver that drives the 3D printer, and that can be input to the 3D printer so that the 3D printer produces the three-dimensional shape.

Hereinafter, a character means one type of 3D object that is rendered in 3D form. In the case of a game program, a character usually corresponds to an object operated directly by the user in the game, and it can be changed, for example by directly setting its appearance or by wearing or replacing equipment.

Hereinafter, a background object corresponds to the background on which the character is displayed. Preferably, the character is displayed on top of the background object while in contact with it.

The present invention relates to a method and system for generating data for a 3D printer using figure customization, and can be implemented in the form of a computer system, a method executed on a computer, or a computer program.

FIG. 1 is a block diagram illustrating a computer system in which the present invention is implemented.

Referring to FIG. 1, the computer system 100 has a processor 101, a display 102, a memory 103, and a data storage device 104.

Processor 101 is a means for executing instructions and may take the form of a chipset, such as a CPU.

The display 102 may be in the form of an LCD monitor or the like as a means for visually displaying information.

The memory 103 may be in the form of RAM as a means for temporarily storing information in a volatile manner.

The data storage device 104 may be a hard disk drive, a solid state drive (SSD), or the like.

The 3D software is stored in the data storage device 104; the processor 101 loads the 3D software into the memory 103 and then executes its instructions to display it on the display 102.

When the user connects to the game, for example an online game, the processor 101 displays on the display 102 a user interface that allows the user to select a character held in the user's account.

In addition, a user interface for selecting a background object 2 on which the character is to be positioned is displayed on the display 102.

As the user selects the character 1 and the background object 2, the processor 101 provides a user interface so that the user can set the position and posture of the selected character 1 and the background object 2.

When the setting of the position and the posture is completed, the processor 101 selects the objects to be included in the 3D printer output data and obtains the modeling data of the 3D objects to be finally exported through predetermined post-processing.

These objects are then merged and converted into a single file format to generate data for 3D printer output, and the converted data is stored back in the data storage device 104.
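
As a purely illustrative sketch of this merge-and-export step, the following Python fragment concatenates several simple triangle meshes and writes them out as ASCII STL, one format commonly accepted by 3D printers; the Mesh container and the function names are hypothetical placeholders, not the implementation described in the patent.

    from dataclasses import dataclass, field
    from typing import List, Tuple

    Vertex = Tuple[float, float, float]
    Face = Tuple[int, int, int]

    @dataclass
    class Mesh:                      # hypothetical container for one 3D object
        vertices: List[Vertex] = field(default_factory=list)
        faces: List[Face] = field(default_factory=list)

    def merge_meshes(meshes: List[Mesh]) -> Mesh:
        """Concatenate the selected objects into one mesh, re-indexing faces."""
        merged = Mesh()
        for mesh in meshes:
            offset = len(merged.vertices)
            merged.vertices.extend(mesh.vertices)
            merged.faces.extend((a + offset, b + offset, c + offset)
                                for a, b, c in mesh.faces)
        return merged

    def export_for_printer(meshes: List[Mesh], path: str) -> None:
        """Merge the character, background object, etc. and write ASCII STL."""
        merged = merge_meshes(meshes)
        with open(path, "w") as out:
            out.write("solid figure\n")
            for a, b, c in merged.faces:
                out.write("  facet normal 0 0 0\n    outer loop\n")
                for i in (a, b, c):
                    x, y, z = merged.vertices[i]
                    out.write(f"      vertex {x} {y} {z}\n")
                out.write("    endloop\n  endfacet\n")
            out.write("endsolid figure\n")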

FIG. 2 is a flowchart illustrating a data generating method for a 3D printer using figure customization according to the present invention.

The 3D printer data generation method using figure customization according to the present invention can be executed on the computer system 100 as described above.

From a micro perspective, the method can be understood as being executed in such a manner that the processor 101 displays the user interface on the display 102, executes instructions loaded in the memory 103, and stores data in the data storage device 104.

Referring to FIG. 2, the computer system 100 executes the figure customizing system in the game as the user connects to the game.

The figure customizing system preferably means a service or software in which the user selects a character 1 possessed by his or her account in the game, selects a background object 2 on which the character 1 is to be positioned, and then exports graphic data for 3D printer output.

The computer system 100 may run the figure customizing system either in standalone mode or online by connecting to a server (not shown).

FIG. 3 is a diagram for explaining how a user selects a character 1 held in the account and selects a background object 2 on which the character 1 is to be placed.

As the figure customizing system is executed, the computer system 100 reads information on the characters 1 that the user holds in the game account and prompts the user to select any one of them (S110).

The user can have one or more characters (1) in his account and select one of them to create graphic data for 3D printer output.

It is needless to say that the selection does not always have to be restricted to characters 1 included in the user's account; restricting it in this way, however, gives the user an incentive to hold and nurture a larger number of characters 1.

On the other hand, once the character 1 is selected, a user interface is provided so that the background object 2 can then be selected.

FIG. 3 (b) illustrates the selection of one of the plurality of background objects 2.

Background objects 2 of various kinds, such as a footstool, a mount, a flowerpot, or a chair, can be prepared, and the user selects any one of them and positions the character 1 on it.

As the user selects the character 1 and the background object 2, the computer system 100 fetches the 3D modeling data of the character 1 and the background object 2 selected by the user (S120).

FIG. 4 illustrates the character 1 placed on the background object 2.

Thereafter, the computer system 100 provides a user interface through which the user sets the position and posture of the character 1 and the background object 2, and adjusts the character 1 and the background object 2 according to the user's operation (S130).

For example, the user can move the character 1 to an arbitrary position on the background object 2 by dragging the mouse, or set the position and the posture by moving the arms and legs of the character 1.

If the background object 2 is also an object whose posture can be changed by the user's manipulation, it can likewise be adjusted in an appropriate manner, such as by a mouse drag.

Setting the position of the character 1 or the background object 2 in this way poses no problem; however, it is very inconvenient to finely control the motion of the character 1 or the background object 2 with an input device such as a mouse, and it is otherwise cumbersome and difficult to create the desired pose of the character 1.

Accordingly, an action or skill of the character 1 that is prepared in advance in the game can be reproduced, and the motion can be stopped at an arbitrary point in time to select the desired posture.

That is, in step S130, the computer system 100 fetches the operation or skill information of the character 1 selected by the user, and displays it on the screen in the form of a list so that the user can select any one of them.

The operation or skill information of the character 1 is predefined in the game software for each type of the character 1, and the posture of the character 1 is easily set by utilizing it.

When the user selects any one action or skill, the computer system 100 reproduces the in-game action or skill action of the character 1 and displays it in the form of a moving picture.

At this time, when the user selects the pause function, the computer system 100 stops the movement of the character 1.

Fig. 4 (a) illustrates the state of the character 1 in operation, and Fig. 4 (b) illustrates the state of the character 1 stopped during the operation as the user selects the pause function.

That is, the 3D modeling data at the time when the movement of the character 1 stops is directly generated as 3D printer output graphic data.
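
To illustrate how a pose could be frozen at the paused frame, the following sketch linearly interpolates joint angles between the two keyframes surrounding the pause time; the keyframe layout and joint naming are assumptions made for this example only, not the game engine's actual animation format.

    from bisect import bisect_left
    from typing import Dict, List, Tuple

    Keyframe = Tuple[float, Dict[str, float]]   # (time in seconds, joint name -> angle)

    def pose_at_pause(keyframes: List[Keyframe], pause_time: float) -> Dict[str, float]:
        """Return the joint angles at the moment the user presses pause."""
        times = [t for t, _ in keyframes]
        i = bisect_left(times, pause_time)
        if i == 0:                              # paused before the first keyframe
            return dict(keyframes[0][1])
        if i >= len(keyframes):                 # paused after the last keyframe
            return dict(keyframes[-1][1])
        (t0, p0), (t1, p1) = keyframes[i - 1], keyframes[i]
        w = (pause_time - t0) / (t1 - t0)       # blend factor between the two frames
        return {joint: (1 - w) * p0[joint] + w * p1[joint] for joint in p0}

The frozen pose would then be applied to the character's skeleton before its mesh is exported.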

On the other hand, effects that use various texture mappings or additional 3D objects are displayed when the in-game character 1 performs an action or activates a skill.

However, such effects, for example a plurality of particles that are not connected to each other, are generally not suitable for inclusion as 3D modeling data; therefore, if an effect was active at the moment the movement of the character 1 was stopped, the texture mappings and 3D objects associated with the effect are excluded from the graphic data for 3D printer output by default.

On the other hand, in addition to the operation of the character 1 and the position on the background object 2, it is also possible to select the equipment of the character 1, clothes, items, and so on, and to include them in the 3D printer output graphic data.

To this end, in step S130, the computer system 100 allows the user to select various in-game 3D objects other than the character 1 and the background object 2, such as the equipment of the character 1 (hereinafter referred to as "other objects").

When the user selects any other object, the computer system 100 fetches the 3D modeling data of the other object 3 selected by the user, attaches it to the character 1, has the character wear it, or places it at a predetermined position.

At this time, each other object 3 has an attribute value, and the computer system 100 determines where and how it is displayed according to the attribute value of the other object 3 selected by the user.

For example, a sword or a shield is held in the hand of the character 1, and armor is worn by the character 1. If, on the other hand, the object is not of a type that is worn or directly used in the game, it is positioned at the coordinates selected by the user.
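
A minimal sketch of this attribute-driven placement is shown below; the attribute names ("weapon", "armor") and the dictionary-based scene representation are assumptions made for illustration, not the patent's actual data model.

    def place_other_object(character: dict, other: dict, user_coords=(0.0, 0.0, 0.0)) -> None:
        """Decide where an extra object goes based on its attribute value."""
        kind = other.get("attribute")
        if kind == "weapon":                         # e.g. a sword or shield
            character.setdefault("held", []).append(other)
        elif kind == "armor":                        # worn on the body
            character.setdefault("worn", []).append(other)
        else:                                        # neither worn nor used directly
            other["position"] = user_coords          # place at the user-chosen coordinates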

After the position and posture setting of the character 1 and the background object 2 is completed as described above, when the user selects the export function, the computer system 100 merges the 3D modeling data of the character 1 and the background object 2, converts them into a single 3D object, and then exports it as graphic data for 3D printer output (S140).

When the user has selected and included another object 3, the other object 3 is merged together with the character 1 and the background object 2.

If any of the objects is a vector object, it is converted into polygons before the merge is performed.

The 3D modeling data of each object may further include texture mapping data in addition to the polygon. The computer system 100 may generate color information for each region of the merged object using the texture mapping data.

Thereafter, the computer system 100 uses a conversion tool to export the graphic data for 3D printer output; that is, the merged data is converted into a file format that can be recognized by the 3D printer.
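
As one way to reconstruct color information from the texture mapping data, the sketch below samples a per-vertex color from a texture image by nearest-texel lookup; the array shapes and the choice of a vertex-color output format (for example PLY or OBJ with vertex colors) are assumptions made for this illustration.

    import numpy as np

    def bake_vertex_colors(texture: np.ndarray, uvs: np.ndarray) -> np.ndarray:
        """texture: (H, W, 3) image array; uvs: (N, 2) coordinates in [0, 1].
        Returns an (N, 3) array with one color per vertex."""
        h, w = texture.shape[:2]
        cols = np.clip(np.round(uvs[:, 0] * (w - 1)).astype(int), 0, w - 1)
        rows = np.clip(np.round((1.0 - uvs[:, 1]) * (h - 1)).astype(int), 0, h - 1)
        return texture[rows, cols]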

On the other hand, there are cases in which the objects whose position or posture the user has set are difficult to export directly as graphic data for 3D printer output.

This happens, for example, when two or more objects that do not touch each other are included, or when the background object 2 is excessively large so that only a part of it needs to be exported as graphic data for 3D printer output.

The processing for these cases is described further below.

1. Selection of objects

In step S140, the computer system 100 selects objects to be excluded from the 3D printer output graphic data, as opposed to objects to be included in 3D printer output graphic data.

Since the 3D printer output is preferably expressed as a single connected piece, particle-like objects that are not in contact with each other are excluded from the objects to be included in the graphic data for 3D printer output.

In particular, all graphical effects generated while reproducing an action or skill of the character 1 are excluded.
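
A sketch of this selection step might look as follows; the is_effect flag and the "particles" kind are hypothetical fields standing in for whatever metadata the engine attaches to its objects.

    def select_printable(scene_objects: list) -> list:
        """Keep only objects suitable for a single solid print."""
        printable = []
        for obj in scene_objects:
            if obj.get("is_effect"):            # action/skill visual effects are dropped
                continue
            if obj.get("kind") == "particles":  # disconnected particle clusters are dropped
                continue
            printable.append(obj)
        return printable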

2. Creation of extension line

When the user selects and includes other objects 3 that are not in contact with the character 1, the following processing is performed.

In the example of FIG. 5, the character 1 stands on the background object 2, and a bird, which is the other object 3, is displayed above it. The computer system 100 generates an extension line connecting the background object 2 or the character 1 with the other object 3, the bird.

The extension line can basically be created either by finding the line corresponding to the shortest distance between the selected objects or by obtaining a line extending vertically from the other object 3 down to the background object 2.

When generating an extension line corresponding to the shortest distance, the shortest distance from the center of gravity of one of the two objects to be connected to the other object is obtained, and the extension line 4 is generated along that path.

In the example of FIG. 5, the shortest distance from the center of gravity of the other object 3, which is not in contact with the character 1 or the background object 2, to the background object 2 or the character 1 is obtained, and the extension line 4 can be generated along it.

A known algorithm can be used as an algorithm for finding the path of the shortest distance.

When the extension line 4 is thus obtained, the obtained extension line 4 is converted into a 3D object having a predetermined thickness and is included in 3D printer output graphic data.
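
Under the simplifying assumptions that each object is given as an (N, 3) array of vertex positions, that the center of gravity is approximated by the vertex mean, and that the shortest path is the straight segment from that centroid to the nearest vertex of the supporting object, the extension line could be computed as in the sketch below; turning the returned segment into a cylinder of the chosen thickness is left to the mesh generator.

    import numpy as np

    def extension_line(floating_obj: np.ndarray, support_obj: np.ndarray,
                       thickness: float = 2.0):
        """Return (start, end, thickness) of a straight strut connecting a
        floating object to its supporting object along the shortest distance."""
        centroid = floating_obj.mean(axis=0)                    # center of gravity
        distances = np.linalg.norm(support_obj - centroid, axis=1)
        nearest = support_obj[np.argmin(distances)]             # closest point on the support
        return centroid, nearest, thickness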

On the other hand, the thickness and the number of the extension lines 4 can be determined dynamically according to the volumes of the background object 2 and of the other object 3 that is remote from the character 1.

The thickness of the extension line 4 may be determined so as to have a sufficient supporting force depending on the material used in the 3D printer.

When the volume of the other object 3 is equal to or larger than a predetermined value, it is judged that the object is difficult to support with a single extension line 4, and a plurality of extension lines 4 connecting it to the background object 2 or the character 1 may be generated.

On the other hand, when the other object 3 is to be connected to the character 1, the ratio between the volume of the other object 3 and the volume of the character 1 is considered; if the volume of the other object 3 is larger by more than a certain extent, it is determined that the character 1 cannot support the other object 3, and an extension line 4 connecting the other object 3 and the background object 2 is generated in place of the extension line 4 connecting the other object 3 and the character 1.

In the case where the volume of the other object 3 exceeds that of the background object 2 by a certain degree or more, it is impossible to support the other object 3 with an extension line 4 at all; in this case, it is needless to say that no extension line 4 is generated.
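
The volume-based decisions described in the preceding paragraphs could be sketched as below; the thresholds and the returned fields are invented for illustration and are not values given in the patent.

    def plan_struts(other_volume: float, character_volume: float,
                    background_volume: float, base_thickness: float = 2.0):
        """Decide whether, where, and how thickly to attach extension lines."""
        if other_volume > background_volume * 2.0:
            return None                              # cannot be supported: no strut at all
        target = "character"
        if other_volume > character_volume * 1.5:
            target = "background"                    # character cannot bear it; anchor to the base
        count = 1 if other_volume < 50.0 else 2      # bulky objects get extra struts
        thickness = base_thickness * (1.0 + other_volume / 100.0)
        return {"target": target, "count": count, "thickness": thickness}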

As shown in FIG. 5, the 3D objects can thus be merged into graphic data for a single 3D printer output; in the resulting graphic data, the extension line 4 running from the character 1 to the other object 3 appears as a three-dimensional shape having a predetermined thickness.

In this way, it is possible to export graphic data for 3D printer output including a plurality of 3D objects that do not touch each other.

3. Treatment of cavities

It may happen that the background object 2 is excessively large and only a part of the background object 2 is selected or cropped.

When the background object 2 is cropped, the polygons constituting the background object 2 lie only on the outer surface of the 3D object, and an empty cavity remains in its interior; as a result, the output produced by the 3D printer may not have a sufficiently solid three-dimensional shape.

FIG. 6 conceptually illustrates a process of filling voids generated when graphic data is cropped.

The left side of FIG. 6 shows a cross section of the background object 2 that has been cropped along the outline of the ellipse. It can be seen that the bottom is empty, and it consists of a surface that is rounded upwards. In other words, it can be seen that a cavity is formed in the bottom surface portion of the background object 2.

In this case, the computer system 100 creates a virtual face connecting the cropped outline of the 3D object.

This face is then included in the graphic data for 3D printer output, or the interior space enclosed by the face and the cropped background object 2 is filled and included in the graphic data for 3D printer output.

The right side of Fig. 6 shows a background object 2 having a three-dimensional shape filled with an inner cavity.
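
One simple way to realize the virtual face, assuming the crop leaves a single ordered boundary loop of vertex positions, is to close that loop with a triangle fan around its centroid, as sketched below.

    import numpy as np

    def cap_boundary_loop(loop: np.ndarray):
        """loop: (N, 3) ordered boundary vertices of the cropped opening.
        Returns (vertices, faces) of a face that seals the opening."""
        center = loop.mean(axis=0)                        # centroid of the opening
        vertices = np.vstack([loop, center[None, :]])
        c = len(loop)                                     # index of the added center vertex
        faces = [(i, (i + 1) % c, c) for i in range(c)]   # triangle fan around the center
        return vertices, faces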

When the 3D printer output graphic data is exported, the user can store it in a portable storage device or the like, or transmit it to the 3D printer 10 via the network and output it in a stereoscopic form.

FIG. 7 illustrates an output process by the 3D printer 10.

The 3D printer 10 receiving the 3D printer output graphic data outputs the result including the background object 2 and the character 1 in a three-dimensional form.

FIG. 8 is a diagram illustrating an output result by the 3D printer.

It is possible to confirm that the objects selected by the user are transformed so that they can be outputted integrally through a predetermined post-process, and then outputted in a three-dimensional shape.

Meanwhile, the 3D printer data generation method using figure customization according to an embodiment of the present invention may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be those specially designed and constructed for the present invention or those known and available to those skilled in the art of computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine language code such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed exemplary embodiments; various modifications that do not depart from the gist of the invention also belong to the scope of its rights.

The present invention can be applied to the fields of 3D graphics software technology and 3D printer application technology.

1: Character
2: Background object
3: Other object
4: Extension line
10: 3D printer
100: Computer system
101: Processor
102: Display
103: Memory
104: Data storage device

Claims (19)

A method for generating data for a 3D printer using figure customization, executed by a computer system, the method comprising:
Retrieving 3D modeling data of a character and a background object selected by a user;
Setting a position and a posture of the character and the background object;
and merging the 3D modeling data of the character and the background object to export graphic data for 3D printer output,
wherein, in the step of fetching the 3D modeling data, when the user selects only a part of the background object, the polygons of the background object that belong to the selected part are cropped and included in the graphic data for 3D printer output.
A method for generating data for a 3D printer using figure customization, executed by a computer system, the method comprising:
Retrieving 3D modeling data of a character and a background object selected by a user;
Setting a position and a posture of the character and the background object;
and merging the 3D modeling data of the character and the background object to export graphic data for 3D printer output,
wherein, in the step of fetching the 3D modeling data, when the user selects only a part of the background object, the selected part of the background object is cropped into a predetermined shape and included in the graphic data for 3D printer output.
3. The method according to claim 1 or 2,
wherein, in the step of fetching the 3D modeling data, an interface is provided for selecting one of the characters possessed by the user's account, and the 3D modeling data of the character selected by the user is retrieved.
3. The method according to claim 1 or 2,
wherein, in the step of setting the position and the posture, an in-game action or skill motion of the character is reproduced and displayed in the form of a moving picture, and when the user selects a pause function, the motion of the character is stopped and the posture of the character is set accordingly.
5. The method of claim 4,
wherein, in the step of setting the position and the posture, when the pause function is selected during reproduction of the in-game action or skill of the character, effects resulting from the in-game action or skill are excluded from the graphic data for 3D printer output.
3. The method according to claim 1 or 2,
wherein, in the step of setting the position and the posture, 3D modeling data of another object selected by the user is fetched and the object is attached to or worn by the character, or placed at a predetermined position, and
in exporting the graphic data for 3D printer output, the 3D modeling data of the character, the background object, and the other object are merged.
The method according to claim 6,
wherein, when the other object does not contact the character or the background object, an extension line connecting the other object with the character or the background object is generated, and the generated extension line is further included in the graphic data for 3D printer output.
8. The method of claim 7,
wherein the extension line is formed along a path corresponding to the shortest distance from the center of gravity of at least one of the two objects to the other of the two objects.
8. The method of claim 7,
wherein the thickness or the number of the extension lines is determined according to an attribute of the object positioned above, among the two objects connected by the extension line.
The method according to claim 1,
wherein, in the step of fetching the 3D modeling data, if the interior of the cropped background object is empty, a virtual face connecting the cropped outline of the cropped background object is created and included in the graphic data for 3D printer output.
3. The method according to claim 1 or 2,
wherein, in the step of fetching the 3D modeling data, 3D modeling data including polygons and mapping data of the object is acquired.
12. The method of claim 11,
wherein color information is reconstructed using the mapping data and the reconstructed color information is included in the graphic data exported for 3D printer output.
A computer program stored on a medium for executing, in a data generating system for a 3D printer using figure customization, a method comprising:
providing a user with a UI for selecting a character and a background object;
retrieving 3D modeling data of the character and the background object selected by the user;
processing position and posture setting of the character and the background object;
and merging the 3D modeling data of the character and the background object to export graphic data for 3D printer output,
wherein, in the step of fetching the 3D modeling data, when the user selects only a part of the background object, the polygons of the background object that belong to the selected area are cropped and included in the graphic data for 3D printer output.
A computer program stored on a medium for executing, in a data generating system for a 3D printer using figure customization, a method comprising:
providing a user with a UI for selecting a character and a background object;
retrieving 3D modeling data of the character and the background object selected by the user;
processing position and posture setting of the character and the background object;
and merging the 3D modeling data of the character and the background object to export graphic data for 3D printer output,
wherein, in the step of fetching the 3D modeling data, when the user selects only a part of the background object, the selected part of the background object is cropped into a predetermined shape and included in the graphic data for 3D printer output.
A data generating system for a 3D printer using figure customization, comprising a display and a processor,
Wherein the display displays a UI for selection of a character and a background object on an execution screen of the 3D software according to a process of the processor,
wherein the processor fetches the 3D modeling data of the character and the background object selected by the user, processes the position and posture setting of the character and the background object according to user manipulation, and merges the 3D modeling data of the character and the background object to export graphic data for 3D printer output, and
wherein, in the process of fetching the 3D modeling data, when the user selects only a part of the background object, the processor crops the polygons of the background object that belong to the selected area and includes them in the graphic data for 3D printer output.
A data generating system for a 3D printer using figure customization, comprising a display and a processor,
Wherein the display displays a UI for selection of a character and a background object on an execution screen of the 3D software according to a process of the processor,
wherein the processor fetches the 3D modeling data of the character and the background object selected by the user, processes the position and posture setting of the character and the background object according to user manipulation, and merges the 3D modeling data of the character and the background object to export graphic data for 3D printer output, and
wherein, in the process of fetching the 3D modeling data, when the user selects only a part of the background object, the processor crops the selected part of the background object into a predetermined shape and includes it in the graphic data for 3D printer output.
17. The system according to claim 15 or 16,
wherein the processor reproduces the action or skill of the character in the form of a moving picture as the user selects one of the in-game actions or skills of the character, and, when the user selects the pause function, stops the movement of the character and sets the posture of the character accordingly.
18. The system of claim 17,
wherein the processor excludes effects resulting from the in-game action or skill from the graphic data for 3D printer output when the pause function is selected during reproduction of the in-game action or skill of the character.
19. The system of claim 18,
wherein the processor fetches 3D modeling data of another object selected by the user, attaches it to or has it worn by the character, or places it at a predetermined position, and merges the 3D modeling data of the character, the background object, and the other object.
KR1020150059499A 2015-04-28 2015-04-28 Computer system and method for exporting data for 3D printer based on figure customizing function KR101665039B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150059499A KR101665039B1 (en) 2015-04-28 2015-04-28 Computer system and method for exporting data for 3D printer based on figure customizing function

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150059499A KR101665039B1 (en) 2015-04-28 2015-04-28 Computer system and method for exporting data for 3D printer based on figure customizing function

Publications (1)

Publication Number Publication Date
KR101665039B1 true KR101665039B1 (en) 2016-10-11

Family

ID=57161883

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150059499A KR101665039B1 (en) 2015-04-28 2015-04-28 Computer system and method for exporting data for 3D printer based on figure customizing function

Country Status (1)

Country Link
KR (1) KR101665039B1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20070007799A (en) * 2004-02-12 2007-01-16 비숀 알리반디 System and method for producing merchandise from a virtual environment
KR100682455B1 (en) 2005-03-17 2007-02-15 엔에이치엔(주) Game scrap system, game scrap method, and computer readable recording medium recording program for implementing the method
KR100771839B1 (en) 2007-02-06 2007-10-30 여호진 Online game picture capturing and character position system and method
KR20090075926A (en) * 2008-01-07 2009-07-13 이호철 3-d character producting method and 3-d character service method
KR20140061340A (en) 2014-04-25 2014-05-21 주식회사 엔씨소프트 Apparatus and method of managing game screenshot based on exif meta-data

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020105871A1 (en) * 2018-11-22 2020-05-28 삼성전자주식회사 Electronic device and control method thereof
US11501487B2 (en) 2018-11-22 2022-11-15 Samsung Electronics Co., Ltd. Electronic device and control method thereof
KR20200071870A (en) * 2018-12-05 2020-06-22 주식회사 레이젠 3d printer for education and method for coding training and 3d printing
KR102082340B1 (en) 2019-07-19 2020-02-27 김현일 Figure and voice service platform using the same
KR20210010302A (en) 2020-02-20 2021-01-27 김현일 Figure and voice service platform using the same
KR102357983B1 (en) 2020-12-29 2022-02-08 주식회사 스쿱 Apparatus and method of preprocessing data for format transforming 3d design mesh data
KR102553653B1 (en) 2023-02-03 2023-07-11 공주대학교 산학협력단 Apparatus and method for supplemental modeling of artifact shape

Similar Documents

Publication Publication Date Title
KR101665039B1 (en) Computer system and method for exporting data for 3D printer based on figure customizing function
EP3086291B1 (en) Device and method of generating a model for subsequent 3d printing
US10860838B1 (en) Universal facial expression translation and character rendering system
CN101156175B (en) Depth image-based representation method for 3d object, modeling method and apparatus, and rendering method and apparatus using the same
US20160257077A1 (en) System, device and method of 3d printing
US20200122406A1 (en) System and method of 3d print modelling
KR20130080442A (en) Real-time animation of facial expressions
KR101052805B1 (en) 3D model object authoring method and system in augmented reality environment
JP2023071722A (en) Data set for learning function using image as input
GB2564401A (en) System and method of enhancing a 3D printed model
JP6860776B2 (en) Virtual space controller, its control method, and program
CN111643899A (en) Virtual article display method and device, electronic equipment and storage medium
US9196076B1 (en) Method for producing two-dimensional animated characters
KR101977893B1 (en) Digital actor managing method for image contents
US10460497B1 (en) Generating content using a virtual environment
KR101597940B1 (en) Method of generating gesture of an avatar and computing device for performing the same
JP6376591B2 (en) Data output device, data output method, and three-dimensional object manufacturing system
JP7364702B2 (en) Animated face using texture manipulation
KR101653802B1 (en) 3 dimenstional graphic data capturing system and method thereof
KR101780496B1 (en) Method for producing 3D digital actor image based on character modelling by computer graphic tool
US11769346B2 (en) Video reenactment with hair shape and motion transfer
Eitsuka et al. Authoring animations of virtual objects in augmented reality-based 3d space
GB2543097A (en) Device and method of generating 3D printing data for a model
Fukusato et al. View-dependent formulation of 2.5 d cartoon models
US11568352B2 (en) Immersive packaging system and method

Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant
FPAY Annual fee payment

Payment date: 20190904

Year of fee payment: 4