WO2001080188A1 - Three-dimensional model processing method and device, medium, and program - Google Patents
Three-dimensional model processing method and device, medium, and program
- Publication number
- WO2001080188A1 (PCT/JP2001/003328)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- tool
- target object
- dimensional model
- editing
- processing
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8058—Virtual breeding, e.g. tamagotchi
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2012—Colour editing, changing, or manipulating; Use of colour codes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2021—Shape modification
Definitions
- the present invention relates to a three-dimensional model processing apparatus, a three-dimensional model processing method, and a program providing medium for performing processing such as deformation and coloring on a three-dimensional model displayed on a display of a personal computer or the like. More specifically, by operating various editing tools, such as a target object tool serving as a virtual object corresponding to the three-dimensional model displayed on the display, a shape changing tool, and a coloring tool, the operator can change various attributes such as the shape and color of the displayed three-dimensional model, and the changes are displayed in accordance with the operator's operations.
- the present invention relates to such a three-dimensional model processing device, a three-dimensional model processing method, and a program providing medium.
- Fig. 1 shows a configuration example of a display object shape change processing device using a glove-type manipulator.
- the operator wears a head-mounted display 10 and observes the object 30 on the display.
- a glove-type manipulator 11 is attached to the hands of the operator.
- the manipulator 11 is provided with a pressure sensor or a magnetic sensor for detecting hand and finger movement, and detects hand movement during operation.
- the detection signal is input to the controller 20 via the I/O 22, and the CPU 21 executes processing according to the program stored in the ROM 24 or the RAM 25. In response to the detection signal, a display parameter changing process for the display object 30 is executed to generate new three-dimensional shape data, and the three-dimensional object 30 based on the new display parameters is displayed on the head-mounted display 10 via the display control means 23.
- the sensor of the manipulator 11 detects, for example, a scraping operation of the manipulator 11, and the CPU 21 changes the display parameters of the display object 30 on the basis of the detection signal input via the I/O 22 and displays the changed display object 30 on the head-mounted display 10.
- Input by a three-dimensional digitizer or a three-dimensional scanner is useful for inputting the shape of a virtual object, but is not suitable for the deformation processing of an input object. These input devices are expensive.
- input processing using the glove-type manipulator allows the operator to intuitively perform specific operations such as "pushing" or "pulling" an object, but such input devices are also quite expensive.
- DISCLOSURE OF THE INVENTION. The present invention has been made in view of the above-described problems of the related art, and aims to provide a three-dimensional model processing device, a three-dimensional model processing method, and a program providing medium that enable an operator to perform various processes, such as shape change and surface coloring, on a three-dimensionally displayed object by operating a virtual target object and editing tools corresponding to each process, in a manner closer to actual handwork.
- the present invention has been made in consideration of the above problems, and includes a display device that three-dimensionally displays an object to be edited, a target object tool whose movement and posture can be changed, and an editing tool whose relative distance to the target object tool can be changed.
- the relative position between the target object tool and the editing tool is detected, and the processing set for the editing tool is executed based on the detected relative position information and displayed on the display device.
- the processing means further detects the changed position information of the target object tool based on the movement and posture change of the target object tool, and changes the attribute information of the edit target object displayed on the display device based on the detected position information.
- the processing means includes a configuration capable of executing a plurality of different processes according to the type of the editing tool.
- the processing means includes a configuration to execute the processing set corresponding to the editing tool based on relative position information including at least one of a relative distance between the target object tool and the editing tool and a relative angle between the target object tool and the editing tool.
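- as a rough illustration only, the relative position information described above (distance and angle between the two tools) could be computed from sensor readings along the following lines. This is a minimal sketch; the function name and the 3-vector pose representation are assumptions, not part of the patent.

```python
import math

def relative_position(target_pos, target_dir, tool_pos, tool_dir):
    """Return (relative distance, relative angle) between the target
    object tool and an editing tool, each given as a position 3-vector
    and an orientation 3-vector (hypothetical representation)."""
    # Relative distance between the two tool positions.
    delta = [t - s for s, t in zip(target_pos, tool_pos)]
    distance = math.sqrt(sum(c * c for c in delta))
    # Relative angle between the two orientation vectors.
    dot = sum(a * b for a, b in zip(target_dir, tool_dir))
    norm = (math.sqrt(sum(a * a for a in target_dir))
            * math.sqrt(sum(b * b for b in tool_dir)))
    angle = math.acos(max(-1.0, min(1.0, dot / norm)))
    return distance, angle
```

either quantity (or both) would then drive the processing set for the editing tool.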
- the attribute information of the edit target object to be changed by the processing means is any one of attribute information relating to the shape, color, or sound of the edit target object displayed on the display device.
- the processing means includes a configuration for executing a functional operation of the object to be edited displayed on the display device as a process set corresponding to the editing tool.
- the present invention is a three-dimensional model processing method for executing various processes on an edit target object three-dimensionally displayed on a display device.
- the method includes a step of detecting a relative position between a target object tool whose movement and posture can be changed and an editing tool whose relative distance to the target object tool can be changed.
- the present invention further includes a step of detecting the position information of the target object tool changed based on the movement and posture change of the target object tool, and a step of changing the attribute information of the edit target object displayed on the display device based on the detected position information.
- the present invention further includes a step of determining a type of the editing tool, and executes a process according to the type of the determined editing tool.
- the step of detecting the relative position detects a relative position including at least one of a relative distance between the target object tool and the editing tool or a relative angle between the target object tool and the editing tool.
- the attribute information is any one of attribute information relating to the shape, color, or sound of the object to be edited displayed on the display means.
- the functional operation of the object to be edited displayed on the display device is executed as a process set corresponding to the editing tool.
- the present invention is a program providing medium for providing a computer program for executing, on a computer system, a three-dimensional model process for executing various processes on an edit target object three-dimensionally displayed on a display device.
- the computer program recorded on the medium includes a step of detecting a relative position between a target object tool whose movement and posture can be changed and an editing tool whose relative distance to the target object tool can be changed, and a step of executing the processing set for the editing tool based on the detected relative position information and changing the attribute information of the edit target object displayed on the display means.
- the program providing medium on which the computer program is recorded is, for example, a medium that provides a computer program in a computer-readable format to a general-purpose computer system capable of executing various programs.
- the form of the medium is not particularly limited, such as a storage medium such as a CD, FD, or MO, or a transmission medium such as a network.
- such a program providing medium defines a structural or functional cooperative relationship between the computer program and the providing medium for realizing predetermined program functions on a computer system. In other words, a cooperative action is exerted on the computer system by installing the computer program on the computer system via the providing medium.
- FIG. 1 is a block diagram illustrating a three-dimensional model processing configuration using a conventional glove-type manipulator.
- FIG. 2 is a block diagram showing the configuration of the three-dimensional model processing device according to the present invention.
- FIG. 3 is a diagram illustrating an outline of processing in the three-dimensional model processing device according to the present invention.
- FIG. 4 is a diagram showing a configuration example using a tablet as an input device used in the three-dimensional model processing device according to the present invention.
- FIG. 5 is a diagram illustrating a configuration example using a magnetic sensor as an input device used in the three-dimensional model processing device according to the present invention.
- FIG. 6 is a diagram showing a configuration example using an ultrasonic sensor as an input device used in the three-dimensional model processing device according to the present invention.
- FIG. 7 is a diagram showing a configuration example using an ultrasonic transponder as an input device used in the three-dimensional model processing device according to the present invention.
- FIG. 8 is a diagram showing a process in the three-dimensional model processing device according to the present invention as a flowchart.
- FIG. 9 is a flowchart showing a process in a case where the trowel is used as an editing tool in the three-dimensional model processing apparatus according to the present invention.
- FIG. 10 is a diagram showing, as a data flow diagram, the processing when the trowel is used as an editing tool in the three-dimensional model processing apparatus according to the present invention.
- FIG. 11 is a diagram showing, as a flowchart, a process in a case where a pinch is used as an editing tool in the three-dimensional model processing apparatus according to the present invention.
- FIG. 12 is a data flow diagram showing the processing when a pinch is used as an editing tool in the three-dimensional model processing apparatus according to the present invention.
- FIG. 13 is a diagram showing, as a flowchart, processing when a brush is used as an editing tool in the three-dimensional model processing apparatus according to the present invention.
- FIG. 14 is a data flow diagram showing processing when a brush is used as an editing tool in the three-dimensional model processing apparatus according to the present invention.
- FIG. 15 is a flowchart showing a process when a spray is used as an editing tool in the three-dimensional model processing apparatus according to the present invention.
- FIG. 16 is a diagram showing, as a data flow diagram, a process when a spray is used as an editing tool in the three-dimensional model processing apparatus according to the present invention.
- FIGS. 17A and 17B are diagrams showing, respectively, the action area and the coloring range when a spray is used as an editing tool in the three-dimensional model processing apparatus according to the present invention.
- FIGS. 18A, 18B, and 18C are diagrams illustrating a specific configuration example using the three-dimensional model processing device according to the present invention.
- FIG. 19 is a view for explaining a specific configuration example using the three-dimensional model processing device according to the present invention.
- FIG. 2 is a block diagram showing one embodiment of the three-dimensional model processing apparatus according to the present invention.
- the three-dimensional model processing apparatus 100 includes an arithmetic processing circuit (CPU) 101 that executes a processing program, a program memory 102 that stores the processing program, a data memory 103 that stores processing data and attribute information such as the position, posture, shape, and color of the object to be edited and of the editing tools, a frame memory 104 that stores image information to be displayed on the image display device 110, such as the object to be edited, the editing tools, and instructions to the user, an input device 105 for inputting various instructions for the object to be edited, an external storage device 106 for storing observation information or processing results, and a bus 107 that enables data transfer between these devices.
- the image display device 110 is connected via the frame memory 104, and the image data stored in the frame memory 104 is displayed.
- the input device 105, which is a characteristic means of the present invention, includes a group of tools for performing various processes, such as deformation and surface color change, on the three-dimensionally shaped edit target object displayed on the image display device 110.
- the three-dimensional model processing apparatus uses a target object tool 301 corresponding to the edit target object displayed on the display, together with various editing tools as shown in FIG. 3, to execute processes such as deformation and coloring of the edit target object 308 displayed on the display 307.
- Figure 3 shows the editing tools that act on the object to be edited: a pressing tool 302 with a "trowel" shape that performs a "push" action; a pulling tool 303 with a "pinch" shape that performs a "pull" action; a brush tool 304 with a "brush" shape that performs a "fine coloring" action; a spray tool 305 with a "colored spray" shape that performs a "coloring" action; and a cutting tool with a "knife" shape that performs a "cut" action on the object to be edited.
- the target object tool 301 is operated by the operator within a predetermined action area (sensor detection area), as indicated by 201 to 204 in FIG. 3. The processing corresponding to each tool, that is, the deformation of the object to be edited by the "push" action of the pressing tool 302 or the "pull" action of the pulling tool 303, or the change of the surface color of the object to be edited by the brush tool 304 and the spray tool 305, is applied to the edit target object 308 displayed on the display 307.
- the target object tool 301 serves as a metaphor for the clay model that is the object of operations such as deformation and coloring, and supplies input information from sensors that detect its position and orientation in three-dimensional space. As shown in FIG. 3, the target object tool 301 may have a cubic shape or a spherical shape, or it may have a shape similar to the edit target object shown on the display.
- the target object tool 301 is held by the operator in one hand and can be freely moved and rotated.
- the editing tools that deform or color the object to be edited are likewise metaphors, supplying input parameters from sensors that detect position and orientation in three-dimensional space. Each editing tool has a shape that allows its operating style to be intuitively associated with its processing mode: as shown in Fig. 3, the pressing tool 302 has the shape of a "trowel", the pulling tool 303 the shape of a "pinch", the brush tool 304 the shape of a "brush", and the spray tool 305 the shape of a "spray".
- the target object tool 301 and the editing tools 302 to 305 correspond to the input device 105 in FIG. 2; specific configuration examples of this input device will be described with reference to FIGS. 4 to 7. In FIGS. 4 to 7, the editing tools 302 to 305 are not classified in detail, but are roughly classified into a deformation tool and a coloring tool.
- FIG. 4 shows an example in which an input device is configured using a tablet 404.
- the target object tool 401, deformation tool 402, and coloring tool 403 are operated on the tablet 404, and the position information and posture of each tool are detected and input. Like the input pen attached to a conventional tablet, the target object tool 401, deformation tool 402, and coloring tool 403 each have a chip-and-coil detection function, through which the position information and posture of each tool are detected. For example, if coils that yield different detection values are installed on each of the surfaces 1 to 6 of the hexahedral target object tool 301 shown in Fig. 3, the surface resting on the tablet can be identified, and from this it becomes possible to obtain three-dimensional information. Alternatively, the XY-axis two-dimensional information obtained from the tablet may be combined with other optical sensors and Z-axis information obtained from a magnetic sensor to identify the three-dimensional position and obtain three-dimensional data.
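- the face-identification idea above can be sketched as follows. This is illustrative only: the face numbering, the face-to-normal table, and the function names are assumptions. Given which face's coil the tablet detects, a coarse orientation of the hexahedral tool follows, and the tablet's (x, y) combined with a separate z reading yields a 3D position.

```python
# Hypothetical outward normals for faces 1-6 of the hexahedral target
# object tool, in the tool's own coordinate frame (assumed numbering).
FACE_NORMALS = {
    1: (0, 0, -1), 2: (0, 0, 1),
    3: (-1, 0, 0), 4: (1, 0, 0),
    5: (0, -1, 0), 6: (0, 1, 0),
}

def tool_pose(detected_face, tablet_xy, z_from_magnetic_sensor):
    """Fuse the tablet's (x, y) with a magnetic-sensor z into a 3D
    position, and report the face normal resting on the tablet."""
    x, y = tablet_xy
    return (x, y, z_from_magnetic_sensor), FACE_NORMALS[detected_face]
```

in a real device the per-face coil signals and the z reading would come from the sensor interface rather than being passed in directly.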
- FIG. 5 shows an example in which an input device is configured using a magnetic sensor.
- the inside of the magnetic field generated by the magnetic source 504 is set as the action area, and the target object tool 501, deformation tool 502, and coloring tool 503 are operated within it.
- a magnetic sensor is attached to each of the target object tool 501, the deformation tool 502, and the coloring tool 503, and is set in the magnetic field generated from the magnetic source 504.
- the magnetic sensors of the target object tool 501, deformation tool 502, and coloring tool 503 output their magnetic displacement data to the dedicated interface hardware 505.
- the dedicated interface hardware 505 calculates the position information and posture information of each tool based on these data. The dedicated interface hardware 505 shown in FIG. 5 can also be replaced by a software calculation processing program.
- FIG. 6 is an example in which an input device is configured using an ultrasonic sensor.
- the area in which the ultrasonic waves emitted from the ultrasonic transmission unit 604 can be detected is set as the action area, and the target object tool 601, deformation tool 602, and coloring tool 603 are operated within it.
- An ultrasonic sensor is attached to each of the target object tool 601, the deformation tool 602, and the coloring tool 603, and detects the ultrasonic waves emitted from the ultrasonic transmission unit 604.
- the ultrasonic sensors of the target object tool 601, deformation tool 602, and coloring tool 603 within the action area detect the arrival time of the ultrasonic waves from their source, the interference of the ultrasonic waves, and so on.
- the dedicated interface hardware 605 calculates the position information and posture information of each tool based on these data.
- the dedicated interface hardware 605 shown in FIG. 6 can be substituted by a software calculation processing program.
- FIG. 7 is a second example in which an input device is configured using an ultrasonic sensor.
- the ultrasonic transmission / reception unit 704 transmits and receives ultrasonic waves.
- the area in which the ultrasonic waves emitted from the ultrasonic transmission/reception unit 704 can be detected is defined as the action area, and the target object tool 701, deformation tool 702, and coloring tool 703 are operated within it.
- each of the target object tool 701, deformation tool 702, and coloring tool 703 is fitted with a transponder that receives the ultrasonic waves transmitted from the ultrasonic transmission/reception unit 704 and sends them back.
- the ultrasonic transmission/reception unit 704 receives the ultrasonic waves sent back from each tool, detects the round-trip time of the ultrasonic waves or their interference, and calculates the position information and posture information of each tool based on the detection.
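- the round-trip measurement described above reduces to a time-of-flight distance calculation; a sketch under the assumption of sound travelling at roughly 343 m/s in air (the constant and function name are illustrative):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C (assumed)

def round_trip_distance(elapsed_s):
    """Distance from the transmission/reception unit to a tool, given
    the measured round-trip time of an ultrasonic pulse."""
    # The pulse travels to the transponder and back, so halve the path.
    return SPEED_OF_SOUND * elapsed_s / 2.0
```

with three or more transmitter/receiver positions, the tool's 3D position could then be recovered from such distances by trilateration.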
- the input device 105 shown in FIG. 2 is realized as an input device using various sensors as described above.
- information input/output between the sensor of each tool and the three-dimensional model processing apparatus 100 may use either a wired data line or wireless data transmission/reception. The input device 105 may also be configured as a combination of the above-described various sensors.
- the external storage device 106 is preferably a randomly accessible storage medium such as a hard disk drive (HDD) or an optical disk. Alternatively, it may be a tape streamer, or a nonvolatile semiconductor memory typified by a Memory Stick. An external storage device of another system connected via a network can also be used.
- the arithmetic processing circuit (CPU) 101 executes processing according to a processing program recorded in the program memory 102.
- the processing flow shown in FIG. 8 is processing executed according to a processing program recorded in the program memory 102.
- in the processing flow of FIG. 8, an example will be described in which the pressing tool 302, the pulling tool 303, the brush tool 304 as a coloring tool, and the spray tool 305 described with reference to FIG. 3 are used as the editing tools.
- in step S801 of FIG. 8, data such as the position and orientation of the object to be edited in three-dimensional space are acquired. These data are stored in the data memory 103 shown in FIG. 2 and correspond to attribute data such as the position, posture, shape, and color of the object to be edited displayed on the image display device 110.
- the initial position of the object object tool 301 shown in FIG. 3 is set as the position and orientation corresponding to these attribute data.
- the position and posture are detected by the sensor output of the target object tool 301.
- when the target object tool is moved, these attributes are rewritten according to the movement, and the object to be edited is displayed on the image display device 110 based on the rewritten attributes.
- in step S802, it is checked whether or not the tools, that is, the target object tool and the editing tools for deformation, coloring, and the like, are within the action area.
- the action area is the detectable area of the various sensors used as the input device: the tablet area when a tablet is used, the detectable area of the magnetic sensor when a magnetic sensor is used, and the detectable area of the ultrasonic sensor when an ultrasonic sensor is used.
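- the action-area check of step S802 amounts to a containment test against the sensor's detectable volume. A sketch that models the volume as an axis-aligned box (the box model and names are assumptions; a real magnetic or ultrasonic field would have a different shape):

```python
def in_action_area(pos, area_min, area_max):
    """True when a tool position lies inside the sensor detection
    volume, modelled here as an axis-aligned box."""
    # Every coordinate must lie within the detectable range on its axis.
    return all(lo <= p <= hi for p, lo, hi in zip(pos, area_min, area_max))
```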
- if it is determined in step S803 that no tool is in the action area, the process jumps to step S814. If it is determined that a tool is within the action area, the flow advances to step S804 to determine the type (ID) of the tool.
- the determination of the tool type here is a determination of the type of editing tool, such as a deformation tool or a coloring tool.
- the operator may specify the tool to be used and input identification data (ID) indicating the tool type, or each editing tool in the action area may be configured to output an identification signal (ID), in which case the tool is determined based on that identification signal.
- in steps S805 to S812, the processing branches according to the tool type: steps S805 to S806 apply when the editing tool is the pressing tool, steps S807 to S808 when it is the pulling tool, steps S809 to S810 when it is the brush tool, and steps S811 to S812 when it is the spray tool.
- the processing of steps S806, S808, S810, and S812 is a subroutine corresponding to each tool, called and executed when the corresponding editing tool is identified. If none of the tool types applies, the flow proceeds to step S813, and a warning is issued.
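- the branch on tool type in steps S805 to S813 is essentially a dispatch on the tool ID. A sketch with placeholder subroutines (all names are hypothetical; each stub stands in for the corresponding subroutine of steps S806, S808, S810, and S812):

```python
def press_subroutine(log): log.append("press")  # stands in for S806
def pull_subroutine(log):  log.append("pull")   # stands in for S808
def brush_subroutine(log): log.append("brush")  # stands in for S810
def spray_subroutine(log): log.append("spray")  # stands in for S812

HANDLERS = {
    "press": press_subroutine,
    "pull":  pull_subroutine,
    "brush": brush_subroutine,
    "spray": spray_subroutine,
}

def handle_tool(tool_id, log):
    """Dispatch on the detected tool ID; unknown IDs raise a warning,
    mirroring step S813."""
    handler = HANDLERS.get(tool_id)
    if handler is None:
        log.append("warning")
    else:
        handler(log)
```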
- the warning is processing such as displaying a warning message on the image display device 110 shown in FIG. 2, outputting a warning sound, or the like.
- the subroutine processing of steps S806, S808, S810, and S812, that is, the processing corresponding to each editing tool, will be described later with reference to FIGS. 9 to 16.
- each editing tool is displayed on the image display device 110, together with the edit target object processed by that editing tool.
- the display mode of each editing tool is based on the shape data set in advance for each editing tool.
- the shape of a trowel is used for a pressing tool
- the shape of a spray is used for a spray tool.
- the object to be edited is displayed as deformed or colored by each editing tool; that is, the displayed object reflects attributes such as position, posture, shape, and color as changed by each editing tool.
- the attribute of the object to be edited rewritten by the operation of each editing tool by the operator is stored in the data memory 103 shown in FIG.
- in step S815, the end of the process is determined.
- the end of processing may be triggered by an input from the operator, by a rule specified in the processing application (for example, a game-over event in a game program), or by a determination based on hardware or software restrictions such as a full memory.
- if the end is not determined in step S815, the process returns to the beginning of the processing flow and repeats the same processing.
- FIG. 9 is a flowchart explaining the process in which the pressing tool is applied to the object to be edited to deform it.
- step S901 is a step of obtaining the position and posture in three-dimensional space of the "trowel" used as the pressing tool. As described above, these data are obtained from the sensor output according to the type of input device used, that is, the tablet, the magnetic sensor, or the ultrasonic sensor.
- step S902 is the process of calculating the spatial relationship and relative position between the object to be edited and the editing tool, from the position and orientation of the object to be edited already acquired in step S801 of FIG. 8 and the position and orientation data of the editing tool "trowel" obtained in step S901.
- in step S903, it is determined whether or not to execute the deformation processing of the object to be edited using the editing tool "trowel". Specifically, this is determined by whether or not the position data of the editing tool detected by the various sensors has entered the inside of the object to be edited. In this case, the editing tools 302 to 360 need not actually enter the inside of the target object tool 301 shown in FIG. 3; the processing may be configured to determine that the editing tool has entered the inside of the object to be edited when the editing tool merely approaches it.
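The entry determination described above can be sketched as a simple proximity test. This is an illustrative Python sketch only: the function name, the spherical approximation of the target object, and the `margin` parameter (which lets a mere approach count as entry) are assumptions, not part of the disclosure.

```python
import math

def tool_entered_object(tool_pos, object_pos, object_radius, margin=0.0):
    """Return True when the editing tool's sensed position counts as being
    inside the target object, approximated here as a sphere of object_radius.
    A positive margin lets an approach count as entry, as described for the
    case where the tool cannot physically enter the target object tool."""
    return math.dist(tool_pos, object_pos) <= object_radius + margin

# Tool tip 12 cm from the centre of a 10 cm object: no entry without a margin...
print(tool_entered_object((0.12, 0.0, 0.0), (0.0, 0.0, 0.0), 0.10))
# ...but treated as entry with a 3 cm approach margin.
print(tool_entered_object((0.12, 0.0, 0.0), (0.0, 0.0, 0.0), 0.10, margin=0.03))
```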
- the target object tool 301 can be made of a deformable material such as a sponge-like material so that the editing tool can actually enter the inside of the target object tool 301.
- in that case, the deformation of the object to be edited displayed on the image display device 110 can be executed in accordance with the actual position data of the editing tool and the target object tool.
- the deformation processing of the object to be edited displayed on the image display device 110 in step S904 can be realized, if the display data is polygon data, by moving the positions of the vertices of the polygons. In this case, if neighbouring vertices become largely separated, a process of generating a new vertex is executed. If the object to be edited is represented as a parametric surface, the control points are moved. If it is represented by voxels, the deformation can be realized by applying a Boolean operation. The deformation process thus depends on the data representation, but in any case it is a process based on the displacement data of the relative distance between the target object tool and the editing tool.
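The polygon-vertex variant of this deformation can be sketched as follows. This is a hypothetical Python sketch: the function name, the linear fall-off of influence, and the influence radius are illustrative assumptions; re-meshing (inserting new vertices when neighbours separate too far) is omitted.

```python
import math

def push_vertices(vertices, tool_pos, push_vec, radius):
    """Move polygon vertices lying within `radius` of the tool tip by the
    tool's displacement vector, with influence falling off linearly to the
    edge of the radius. Sketches only the vertex-moving deformation."""
    out = []
    for v in vertices:
        d = math.dist(v, tool_pos)
        if d < radius:
            w = 1.0 - d / radius          # linear fall-off of influence
            out.append(tuple(c + w * p for c, p in zip(v, push_vec)))
        else:
            out.append(v)                  # outside the tool's influence
    return out

verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
# Press straight down at the first vertex; the distant vertex is unaffected.
print(push_vertices(verts, (0.0, 0.0, 0.0), (0.0, 0.0, -0.2), radius=0.5))
```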
- after performing the deformation processing of the object to be edited in step S904, in step S905 the rewritten attribute data of the object to be edited resulting from the deformation is stored in the data memory 103 shown in FIG. 2 or in the external storage device 106.
- the data flow diagram shows the flow of data from a data source inside an object, via processes that convert the data, to targets inside other objects.
- it consists of processes that transform the data (indicated by ellipses), data flows that carry the data (indicated by arrows), actor objects that produce and consume the data (indicated by rectangles), and passive data-store objects that store the data (displayed between two straight lines).
- the process shown in Fig. 10 is a process using a pressing tool “trowel”.
- the processing is started by the operator operating the target object tool (sensor) 1001 as an actor object and the trowel tool (sensor) 1002 as an actor object.
- a deformation degree calculation process 1003 is executed. Based on this calculation, a process of changing attribute information, that is, a deformation process 1004 is executed.
- a rendering process 1005 is executed as a process of generating a display image for the image display device 110, and the rendering result is stored in the display memory, that is, the frame memory 104 in FIG. 2, and displayed on the image display device 110.
- FIG. 11 is a flowchart illustrating a process in which a tool “pinch” is applied to an object to be edited to apply a deformation.
- Step S1101 is a step of obtaining the position and posture of the "pinch" used as a pulling tool in three-dimensional space. As described above, these data are obtained from the sensor output according to the type of input device used, that is, the tablet, the magnetic sensor, or the ultrasonic sensor.
- in step S1102, it is determined whether or not the switch of the pinch is pressed. In the present embodiment, the pulling tool "pinch" is provided with a switch used to determine the start point and end point of the pulling operation: the position at which the switch is pressed is set as the start point ("on"), and the position at which it is released as the end point ("off"), of the pulling operation on the object to be edited. If it is determined in step S1102 that the switch has not been pressed, in step S1109 the processing for turning off the "previous pressed flag" is executed, and then the processing returns to the flow start position.
- if it is determined in step S1102 that the switch has been pressed, the process proceeds to step S1103, where it is determined whether the "previous pressed flag" is on. If the flag is not on, the current switch press is determined to be the start position of the pulling operation, and in step S1108 the position information of the pulling tool "pinch" at the switch-press position is stored in the data memory 103, after which the process returns to the flow start position.
- if it is determined in step S1103 that the "previous pressed flag" is on, the interval since the previous switch press is determined to be the execution period of the pulling operation.
- in step S1104, the position information of the switch-press position, that is, the position information of the end position of the pulling operation, is stored in the data memory 103.
- in step S1105, the movement amount of the tool "pinch" from the previous switch-press point to the current switch-press point is calculated. This movement amount defines the effective processing section of the pulling operation.
- in step S1106, the target object is deformed based on the movement amount determined in step S1105.
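The movement-amount calculation and the deformation based on it (steps S1105 and S1106) can be sketched as follows. This is an illustrative Python sketch: the function names, the grab radius, and the rigid drag of nearby vertices are assumptions, not the disclosed implementation.

```python
import math

def pinch_movement(switch_on_pos, switch_off_pos):
    """Movement amount of the pulling tool 'pinch': the vector from the
    position stored when the switch was pressed (start of the pull) to the
    position when it was released (end of the pull)."""
    return tuple(b - a for a, b in zip(switch_on_pos, switch_off_pos))

def apply_pull(vertices, grab_pos, movement, radius):
    """Drag vertices near the grab point by the movement amount."""
    out = []
    for v in vertices:
        if math.dist(v, grab_pos) <= radius:
            out.append(tuple(c + m for c, m in zip(v, movement)))
        else:
            out.append(v)
    return out

move = pinch_movement((0.0, 0.0, 0.0), (0.0, 0.0, 0.1))  # pulled 10 cm outward
print(apply_pull([(0.0, 0.0, 0.0)], (0.0, 0.0, 0.0), move, radius=0.05))
```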
- in step S1107, the new attribute information obtained by the deformation process, that is, the shape data of the target object, is stored in the memory. The target object is displayed on the image display device 110 based on the new attribute information.
- posture information and position information of the target object are acquired from the target object tool 1201, and two pieces of position information, the current position and the previous position of the tool, are acquired from the pinch tool 1202. These correspond to the start and end positions of the pinch pulling action.
- the process 1203 of calculating the movement amount of the tool is executed, and the process 1204 of calculating the deformation amount of the target object is executed based on the calculated movement amount and the position and posture information of the target object.
- a process of changing the attribute information, that is, a deformation process 1205, is executed, and then a rendering process 1206 is executed as a process of generating a display image for the image display device 110; the rendering result is stored in the display memory, that is, the frame memory 104 in FIG. 2, and displayed on the image display device 110.
- FIG. 13 is a flowchart illustrating a process for applying color to the object to be edited using a brush as a coloring tool.
- Step S1301 is a step of obtaining the position and posture of the "brush" used as a coloring tool in three-dimensional space. As described above, these data are obtained from the sensor output according to the type of input device used, that is, the tablet, the magnetic sensor, or the ultrasonic sensor.
- in step S1302, the spatial relationship and relative position between the object to be edited and the editing tool are calculated from the position and orientation data of the object to be edited obtained in step S801 of FIG. 8 and of the editing tool "brush" obtained in step S1301.
- in step S1303, it is determined whether or not to execute the coloring processing of the object to be edited using the editing tool "brush". This can be done, for example, by providing a switch on the brush and detecting its depression, as with the pulling tool "pinch" described above, or coloring with the brush may be performed when the distance between the object to be edited and the editing tool determined in step S1302 becomes equal to or less than a predetermined distance.
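The two trigger conditions just described, an explicit switch press or the brush tip coming within a predetermined distance of the surface, can be sketched as follows. The function name, parameters, and precedence of the switch over the distance test are illustrative assumptions.

```python
import math

def brush_should_color(brush_pos, surface_point, threshold, switch_pressed=None):
    """Decide whether to execute the coloring processing: either an explicit
    switch press on the brush (as with the 'pinch' tool), or the brush tip
    coming within `threshold` of the nearest surface point."""
    if switch_pressed is not None:        # switch-equipped brush: use the switch
        return switch_pressed
    return math.dist(brush_pos, surface_point) <= threshold

# Approach trigger: brush tip 8 mm from the surface, 1 cm threshold.
print(brush_should_color((0.0, 0.0, 0.008), (0.0, 0.0, 0.0), threshold=0.01))
# Explicit switch takes precedence when provided, regardless of distance.
print(brush_should_color((0.0, 0.0, 0.1), (0.0, 0.0, 0.0), threshold=0.01, switch_pressed=True))
```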
- in step S1304, a coloring process is performed on the target object.
- the color at this time is set in advance by the operator.
- in step S1305, the rewritten attribute data of the object to be edited resulting from the coloring is stored in the data memory 103 shown in FIG. 2, or in the external storage device 106.
- the posture information and position information of the target object are acquired from the target object tool 1401, and the position information of the tool is acquired from the brush tool 1402.
- the coloring range calculation process 1403 is executed, and the attribute information change process, that is, the coloring process 1404, is executed based on the calculation result and the surface attribute information of the target object.
- a rendering process 1405 is executed as a process of generating a display image for the image display device 110; the rendering result is stored in the display memory, that is, the frame memory 104 in FIG. 2, and displayed on the image display device 110.
- Step S1501 is a step of obtaining the position and posture of the "spray" used as a coloring tool in three-dimensional space. As described above, these data are obtained from the sensor output according to the type of input device used, that is, the tablet, the magnetic sensor, or the ultrasonic sensor.
- in step S1502, the positional relationship between the object to be edited and the action range of the spray is determined from the position and orientation data of the object to be edited obtained in step S801 of FIG. 8 and of the editing tool "spray" obtained in step S1501.
- in step S1503, it is determined whether or not the object to be edited lies within the action area of the editing tool "spray", based on the positional relationship determined in step S1502.
- an area that can be colored by the spray is preset as a tool attribute; for the object to be edited to be colored, the positional relationship between the object to be edited and the action range of the spray must place the object within the action area of the editing tool "spray".
- the action area of the spray is set in advance as a tool attribute.
- for example, the action area is set as a conical area 1701 having its vertex at a predetermined point of the spray tool, and the coloring range 1702 for the target object is set within the action area on the surface of the object 1704, as shown in FIG. 17B.
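The containment test for such a conical action area can be sketched as follows. This is an illustrative Python sketch: the function name, the half-angle and range parameters, and their values are assumptions used only to show the geometry.

```python
import math

def in_spray_cone(point, apex, axis, half_angle_deg, max_range):
    """Test whether a surface point lies inside the conical action area
    whose vertex is at the spray nozzle (`apex`) and which opens along
    `axis`, limited to max_range from the nozzle."""
    v = tuple(p - a for p, a in zip(point, apex))
    dist = math.sqrt(sum(c * c for c in v))
    if dist == 0 or dist > max_range:
        return False
    axis_len = math.sqrt(sum(c * c for c in axis))
    # Angle between the apex-to-point vector and the cone axis.
    cos_angle = sum(vc * ac for vc, ac in zip(v, axis)) / (dist * axis_len)
    return cos_angle >= math.cos(math.radians(half_angle_deg))

# Point straight ahead of the nozzle, inside a 20-degree cone of 1 m range.
print(in_spray_cone((0.0, 0.0, 0.5), (0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 20, 1.0))
# Point far off-axis falls outside the action area.
print(in_spray_cone((0.5, 0.0, 0.1), (0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 20, 1.0))
```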
- the spray tool is provided with a push button 1703, and the coloring range 1702 is set based on the positional relationship when the push button 1703 is pressed.
- if it is determined in step S1503 that the object is outside the action area of the editing tool "spray", the coloring processing by spraying is not executed, the processing ends, and the process returns to the flow start position. If it is determined in step S1503 that the object is within the action area of the editing tool "spray", the action area is displayed on the image display device 110 in step S1504.
- in step S1505, it is determined whether or not the switch of the editing tool "spray" has been turned on. On condition that the switch has been turned on, a coloring process is performed on the target object in step S1506. The color at this time is set in advance by the operator. After performing the coloring processing of the object to be edited in step S1506, the rewritten attribute data of the object to be edited resulting from the coloring is stored, in step S1507, in the data memory 103 shown in FIG. 2 or in the external storage device 106.
- the processing in the case of using the coloring tool “spray” will be further described with reference to the data flow diagram of FIG.
- the process starts with the operation of the target object tool (sensor) 1601 as an actor object and the spray tool (sensor) 1602 as an actor object.
- posture information and position information of the target object are obtained, and from the spray tool 1602, position information and posture information of the tool are obtained.
- a calculation process 1603 of an action area that can be colored by spraying is executed.
- a coloring range calculation process 1604 is performed, and based on this calculation, a process of changing attribute information, that is, a coloring process 1605 is executed.
- a rendering process 1606 is executed as a process of generating a display image for the image display device 110; the rendering result is stored in the display memory, that is, the frame memory 104 in FIG. 2, and displayed on the image display device 110.
- the operator uses the target object tool corresponding to the object to be edited and various dedicated tools to change the relative position between the two tools.
- the processing device executes processing set in advance according to the dedicated tool.
- the object to be edited displayed on the image display device is processed according to each editing tool, for example deformed or colored; that is, its attributes are changed and the result is displayed. The operator can therefore execute various processes as if directly processing the object to be edited on the screen.
- FIGS. 18A to 18C show examples in which the three-dimensional model processing apparatus of the present invention is applied to online shopping using a communication network such as the Internet.
- the three-dimensional model processing apparatus of the present invention is configured in a user's personal computer, for example, and displays a product from a provider that provides online shopping on a display 1801.
- This product display can be performed by storing the three-dimensional image data of the product provided by the provider via the network in a storage device such as a memory or a hard disk of the user's computer, and reading out the data.
- the display may be performed using data stored on media such as a CD and a DVD provided by the product provider.
- a user considering a product displays a three-dimensional image of the product on the display 1801, holds a target object tool corresponding to the displayed product (product metaphor) 1802 in one hand, and holds an operation tool (tool metaphor) 1803 in the other hand to execute various processes on the product at hand.
- the user examining the product can operate the product metaphor 1802 and the tool metaphor 1803 relative to each other to execute the process corresponding to each tool metaphor on the product displayed on the display. For example, by changing the angle and posture of the product metaphor 1802, the product displayed on the display is shown with its angle and posture changed, as from FIG. 18A to FIG. 18B, so that the user can observe the product from all angles.
- when the operation buttons of the product metaphor corresponding to a video camera as the product are pressed, as shown in FIG. 18C, the corresponding operation buttons of the video camera on the display 1801 are operated, and the processing corresponding to each pressed button is executed on the display.
- this is realized by a configuration in which the position of each switch of the object to be edited is identified from the position information of the pressing tool, and a processing program corresponding to the identified switch is executed. By executing the program corresponding to the switch, the operation of the product is shown on the display 1801. In this way, through the relative operation of the product metaphor and the tool metaphor, the user can experience the sensation of actually picking up and operating the product by hand.
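The switch identification and program dispatch just described can be sketched as follows. This is a hypothetical Python sketch: the function name, the distance tolerance, and the button table are illustrative assumptions, not the disclosed configuration.

```python
import math

def dispatch_button(press_pos, buttons, tolerance=0.01):
    """Identify which switch on the product metaphor was pressed from the
    pressing tool's sensed position, and run the sub-program registered
    for it. `buttons` maps a switch name to (position, handler)."""
    for name, (pos, handler) in buttons.items():
        if math.dist(press_pos, pos) <= tolerance:
            return handler()               # execute the switch's program
    return None                            # no switch at this position

buttons = {
    "rec":  ((0.00, 0.02, 0.0), lambda: "start recording on display"),
    "zoom": ((0.03, 0.02, 0.0), lambda: "zoom in on display"),
}
# Press lands within 1 cm of the 'zoom' switch on the metaphor.
print(dispatch_button((0.031, 0.021, 0.0), buttons))
```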
- FIG. 19 shows a case where the three-dimensional model processing apparatus of the present invention is configured in, for example, a user's personal computer: a virtual pet is displayed on the display 1901 of the personal computer, the target object tool 1902 corresponding to the displayed virtual pet is held in one hand, various editing tools 1903 are held in the other hand, and various processes are executed on the target object tool 1902 corresponding to the pet by means of the editing tools 1903.
- for example, the displayed pet can produce a voice through a speaker.
- this is realized by executing a sub-program set in correspondence with the microphone tool when the microphone tool and the target object tool 1902 approach within a predetermined distance, thereby executing a process of producing a voice. It may further be set so as to change the voice tone based on the posture information of the target object tool at that time.
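The proximity-triggered sub-program with posture-modulated tone can be sketched as follows. This is an illustrative Python sketch: the function name, the trigger distance, and the mapping from tilt to pitch are assumptions chosen only to show one possible configuration.

```python
import math

def maybe_speak(mic_pos, object_pos, object_tilt_deg, trigger_dist=0.05):
    """When the microphone tool comes within trigger_dist of the target
    object tool, run the voice sub-program; the voice tone (pitch) is
    modulated from the object's posture (tilt), as one hypothetical mapping."""
    if math.dist(mic_pos, object_pos) > trigger_dist:
        return None                        # too far: no sub-program triggered
    pitch = 1.0 + object_tilt_deg / 90.0   # tilting the tool raises the tone
    return {"action": "speak", "pitch": round(pitch, 2)}

# Microphone 4 cm away, object tilted 45 degrees: speak with a raised tone.
print(maybe_speak((0.0, 0.0, 0.04), (0.0, 0.0, 0.0), object_tilt_deg=45.0))
```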
- in the three-dimensional model processing apparatus of the present invention, the processing programs shown in, for example, FIGS. 8, 9, 11, 13, and 15 may be stored in the program memory 102 shown in FIG. 2, or may be stored in various storage media such as CD-ROM, CD-R, CD-RW, DVD-RAM, DVD-RW, DVD+RW, MO, hard disk, and floppy disk.
- the processing program may also be stored on a CD that stores the three-dimensional data of a product as a product catalog, and be provided to the user together with it.
- as described above, the configuration is such that the operator operates various editing tools to execute the processing set according to each tool; during the operation, it becomes possible to execute various processes as if the user were directly operating the object to be edited displayed on the display.
- although the present invention has been described with reference to specific embodiments, it is obvious that those skilled in the art can modify or substitute the embodiments without departing from the gist of the present invention.
- that is, the present invention has been disclosed by way of example, and should not be construed as limiting.
- configurations obtained by appropriately combining the above-described embodiments are also included in the scope of the present invention; to determine the gist of the present invention, the claims described at the beginning should be referred to. INDUSTRIAL APPLICABILITY
- in the three-dimensional model processing device and the three-dimensional model processing method according to the present invention, the operator operates various editing tools and the processing set according to each tool is executed, so that the operator can execute various processes as if directly operating the object to be edited displayed on the display.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Architecture (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Processing Or Creating Images (AREA)
- Position Input By Displaying (AREA)
Description
Claims
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/019,575 US7023436B2 (en) | 2000-04-19 | 2001-04-18 | Three-dimensional model processing device, three-dimensional model processing method, program providing medium |
EP01921890A EP1211647A4 (en) | 2000-04-19 | 2001-04-18 | METHOD AND DEVICE FOR PROCESSING A THREE-DIMENSIONAL MODEL, MEDIUM AND PROGRAM |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2000-117849 | 2000-04-19 | ||
JP2000117849A JP2001307134A (ja) | 2000-04-19 | 2000-04-19 | 三次元モデル処理装置および三次元モデル処理方法、並びにプログラム提供媒体 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2001080188A1 true WO2001080188A1 (fr) | 2001-10-25 |
Family
ID=18629097
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2001/003328 WO2001080188A1 (fr) | 2000-04-19 | 2001-04-18 | Procede et dispositif de traitement de modele tridimensionnel, moyen et programme |
Country Status (4)
Country | Link |
---|---|
US (1) | US7023436B2 (ja) |
EP (1) | EP1211647A4 (ja) |
JP (1) | JP2001307134A (ja) |
WO (1) | WO2001080188A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7724250B2 (en) * | 2002-12-19 | 2010-05-25 | Sony Corporation | Apparatus, method, and program for processing information |
Families Citing this family (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4085918B2 (ja) | 2003-07-18 | 2008-05-14 | ソニー株式会社 | 3次元モデル処理装置、および3次元モデル処理方法、並びにコンピュータ・プログラム |
JP4017166B2 (ja) | 2004-04-20 | 2007-12-05 | 日本アイ・ビー・エム株式会社 | 編集装置、編集方法、プログラム、及び記録媒体 |
FR2871596B1 (fr) * | 2004-06-09 | 2016-01-08 | Giat Ind Sa | Methode de construction d'une session de formation |
EP1605420A3 (fr) * | 2004-06-09 | 2010-05-05 | Nexter Training | Système de formation à l'exploitation, l'utilisation ou la maintenance d'un cadre de travail dans un environnement de realité virtuelle |
US7821513B2 (en) * | 2006-05-09 | 2010-10-26 | Inus Technology, Inc. | System and method for analyzing modeling accuracy while performing reverse engineering with 3D scan data |
US8771071B2 (en) * | 2006-11-22 | 2014-07-08 | Sony Computer Entertainment America Llc | System and method of rendering controller information |
US20090109236A1 (en) * | 2007-10-30 | 2009-04-30 | Microsoft Corporation | Localized color transfer |
US8542907B2 (en) * | 2007-12-17 | 2013-09-24 | Sony Computer Entertainment America Llc | Dynamic three-dimensional object mapping for user-defined control device |
US8933876B2 (en) | 2010-12-13 | 2015-01-13 | Apple Inc. | Three dimensional user interface session control |
JP4793422B2 (ja) * | 2008-10-10 | 2011-10-12 | ソニー株式会社 | 情報処理装置、情報処理方法、情報処理システムおよび情報処理用プログラム |
TWI370418B (en) | 2008-10-27 | 2012-08-11 | Ind Tech Res Inst | Computer system and controlling method thereof |
WO2010103482A2 (en) * | 2009-03-13 | 2010-09-16 | Primesense Ltd. | Enhanced 3d interfacing for remote devices |
EP2241988B1 (en) | 2009-04-14 | 2018-07-25 | Dassault Systèmes | Method, program and product edition system for visualizing objects displayed on a computer screen |
US20110164032A1 (en) * | 2010-01-07 | 2011-07-07 | Prime Sense Ltd. | Three-Dimensional User Interface |
US9201501B2 (en) | 2010-07-20 | 2015-12-01 | Apple Inc. | Adaptive projector |
CN102959616B (zh) | 2010-07-20 | 2015-06-10 | 苹果公司 | 自然交互的交互真实性增强 |
US8959013B2 (en) | 2010-09-27 | 2015-02-17 | Apple Inc. | Virtual keyboard for a non-tactile three dimensional user interface |
US8872762B2 (en) | 2010-12-08 | 2014-10-28 | Primesense Ltd. | Three dimensional user interface cursor control |
WO2012107892A2 (en) | 2011-02-09 | 2012-08-16 | Primesense Ltd. | Gaze detection in a 3d mapping environment |
JP5670255B2 (ja) * | 2011-05-27 | 2015-02-18 | 京セラ株式会社 | 表示機器 |
JP5864144B2 (ja) | 2011-06-28 | 2016-02-17 | 京セラ株式会社 | 表示機器 |
US9377865B2 (en) | 2011-07-05 | 2016-06-28 | Apple Inc. | Zoom-based gesture user interface |
US8881051B2 (en) | 2011-07-05 | 2014-11-04 | Primesense Ltd | Zoom-based gesture user interface |
US9459758B2 (en) | 2011-07-05 | 2016-10-04 | Apple Inc. | Gesture-based interface with enhanced features |
AU2011205223C1 (en) | 2011-08-09 | 2013-03-28 | Microsoft Technology Licensing, Llc | Physical interaction with virtual objects for DRM |
US9030498B2 (en) | 2011-08-15 | 2015-05-12 | Apple Inc. | Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface |
US9153195B2 (en) | 2011-08-17 | 2015-10-06 | Microsoft Technology Licensing, Llc | Providing contextual personal information by a mixed reality device |
US10019962B2 (en) | 2011-08-17 | 2018-07-10 | Microsoft Technology Licensing, Llc | Context adaptive user interface for augmented reality display |
WO2013028908A1 (en) | 2011-08-24 | 2013-02-28 | Microsoft Corporation | Touch and social cues as inputs into a computer |
US9122311B2 (en) | 2011-08-24 | 2015-09-01 | Apple Inc. | Visual feedback for tactile and non-tactile user interfaces |
US9218063B2 (en) | 2011-08-24 | 2015-12-22 | Apple Inc. | Sessionless pointing user interface |
US9229534B2 (en) | 2012-02-28 | 2016-01-05 | Apple Inc. | Asymmetric mapping for tactile and non-tactile user interfaces |
WO2013144807A1 (en) | 2012-03-26 | 2013-10-03 | Primesense Ltd. | Enhanced virtual touchpad and touchscreen |
JP5902346B2 (ja) * | 2012-03-29 | 2016-04-13 | インテル コーポレイション | ジェスチャーを用いた3次元グラフィックスの作成 |
CN103472985B (zh) * | 2013-06-17 | 2017-12-26 | 展讯通信(上海)有限公司 | 一种三维购物平台显示界面的用户编辑方法 |
US20160184724A1 (en) * | 2014-08-31 | 2016-06-30 | Andrew Butler | Dynamic App Programming Environment with Physical Object Interaction |
US9710156B2 (en) * | 2014-09-25 | 2017-07-18 | Disney Enterprises, Inc. | Three-dimensional object sculpting and deformation on a mobile device |
DE102015119806A1 (de) * | 2015-11-16 | 2017-05-18 | Grob-Werke Gmbh & Co. Kg | Verfahren zur Darstellung der Bearbeitung in einer Werkzeugmaschine |
JP7072378B2 (ja) * | 2017-12-13 | 2022-05-20 | キヤノン株式会社 | 画像生成装置およびその制御方法、画像生成システム、プログラム |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6428719A (en) * | 1987-07-24 | 1989-01-31 | Hitachi Ltd | Three-dimension information input device |
JPH05224853A (ja) * | 1992-02-13 | 1993-09-03 | Hitachi Ltd | 電子ボード |
JPH05282426A (ja) * | 1992-04-02 | 1993-10-29 | Matsushita Electric Ind Co Ltd | 視点・光源機能の属性変更直接操作システム |
JPH07282115A (ja) * | 1994-04-06 | 1995-10-27 | Matsushita Electric Ind Co Ltd | 操作性評価装置 |
JPH1020914A (ja) * | 1996-07-08 | 1998-01-23 | Kawasaki Heavy Ind Ltd | 模擬加工方法および装置 |
JP2000194736A (ja) * | 1998-12-25 | 2000-07-14 | Kawasaki Heavy Ind Ltd | 模擬加工方法および装置 |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5237647A (en) * | 1989-09-15 | 1993-08-17 | Massachusetts Institute Of Technology | Computer aided drawing in three dimensions |
US5184319A (en) * | 1990-02-02 | 1993-02-02 | Kramer James F | Force feedback and textures simulating interface device |
US5396265A (en) * | 1990-09-17 | 1995-03-07 | Massachusetts Institute Of Technology | Three-dimensional tactile computer input device |
US5418712A (en) * | 1993-06-04 | 1995-05-23 | Matsushita Electric Industrial Co., Ltd. | Manipulation performance evaluating apparatus for evaluating manipulation performance of a commodity having operating parts |
US5625576A (en) * | 1993-10-01 | 1997-04-29 | Massachusetts Institute Of Technology | Force reflecting haptic interface |
US5659493A (en) * | 1995-03-03 | 1997-08-19 | Ford Motor Company | Virtual machining techniques for modifying computer models of parts |
US5882206A (en) * | 1995-03-29 | 1999-03-16 | Gillio; Robert G. | Virtual surgery system |
US5991693A (en) * | 1996-02-23 | 1999-11-23 | Mindcraft Technologies, Inc. | Wireless I/O apparatus and method of computer-assisted instruction |
US5802353A (en) * | 1996-06-12 | 1998-09-01 | General Electric Company | Haptic computer modeling system |
US5973678A (en) * | 1997-08-29 | 1999-10-26 | Ford Global Technologies, Inc. | Method and system for manipulating a three-dimensional object utilizing a force feedback interface |
US6421048B1 (en) * | 1998-07-17 | 2002-07-16 | Sensable Technologies, Inc. | Systems and methods for interacting with virtual objects in a haptic virtual reality environment |
-
2000
- 2000-04-19 JP JP2000117849A patent/JP2001307134A/ja not_active Abandoned
-
2001
- 2001-04-18 US US10/019,575 patent/US7023436B2/en not_active Expired - Fee Related
- 2001-04-18 EP EP01921890A patent/EP1211647A4/en not_active Withdrawn
- 2001-04-18 WO PCT/JP2001/003328 patent/WO2001080188A1/ja active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6428719A (en) * | 1987-07-24 | 1989-01-31 | Hitachi Ltd | Three-dimension information input device |
JPH05224853A (ja) * | 1992-02-13 | 1993-09-03 | Hitachi Ltd | 電子ボード |
JPH05282426A (ja) * | 1992-04-02 | 1993-10-29 | Matsushita Electric Ind Co Ltd | 視点・光源機能の属性変更直接操作システム |
JPH07282115A (ja) * | 1994-04-06 | 1995-10-27 | Matsushita Electric Ind Co Ltd | 操作性評価装置 |
JPH1020914A (ja) * | 1996-07-08 | 1998-01-23 | Kawasaki Heavy Ind Ltd | 模擬加工方法および装置 |
JP2000194736A (ja) * | 1998-12-25 | 2000-07-14 | Kawasaki Heavy Ind Ltd | 模擬加工方法および装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1211647A4 * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7724250B2 (en) * | 2002-12-19 | 2010-05-25 | Sony Corporation | Apparatus, method, and program for processing information |
Also Published As
Publication number | Publication date |
---|---|
US7023436B2 (en) | 2006-04-04 |
US20020149583A1 (en) | 2002-10-17 |
JP2001307134A (ja) | 2001-11-02 |
EP1211647A4 (en) | 2006-04-19 |
EP1211647A1 (en) | 2002-06-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2001080188A1 (fr) | Procede et dispositif de traitement de modele tridimensionnel, moyen et programme | |
CN110603509B (zh) | 计算机介导的现实环境中直接和间接交互的联合 | |
US11714492B2 (en) | Three-dimensional perceptions in haptic systems | |
US20150309575A1 (en) | Stereo interactive method, display device, operating stick and system | |
JP2004062657A (ja) | 座標入力装置及びその制御方法、座標入力指示具、プログラム | |
JP3242079U (ja) | 浮遊画像表示装置、及び浮遊画像表示システム | |
US10846943B2 (en) | Optimizing viewing assets | |
JP2001325614A (ja) | 3次元モデル処理装置および3次元モデル処理方法、並びにプログラム提供媒体 | |
US7289121B1 (en) | System for creating and modifying curves and surfaces | |
JP4484570B2 (ja) | 音響情報処理装置、音響情報提供方法 | |
CN111080757B (zh) | 基于惯性测量单元的绘画方法及其绘画***和计算*** | |
JP2001338306A (ja) | 編集ツール属性変更処理装置、編集ツール属性変更処理方法、および3次元モデル処理装置、3次元モデル処理方法、並びにプログラム提供媒体 | |
JP4389350B2 (ja) | オブジェクト属性変更処理装置、オブジェクト属性変更処理方法、および3次元モデル処理装置、3次元モデル処理方法、並びにプログラム提供媒体 | |
US20100073360A1 (en) | Input device for graphics | |
JP2001291118A (ja) | 3次元モデル処理装置および3次元モデル処理方法、並びにプログラム提供媒体 | |
JPWO2017061178A1 (ja) | 触覚再現装置 | |
JP2006048386A (ja) | 力覚提示装置、仮想オブジェクト算出方法、および仮想オブジェクト算出プログラム | |
JP2010271821A (ja) | ポリライン生成方法及びポリライン生成システム | |
JPH09138867A (ja) | 3次元造形方法および3次元データ入力装置 | |
KR20170085836A (ko) | 3차원 형상 디자인을 위한 정보입력장치 및 이를 이용한 3차원 이미지 생성 방법 | |
JP2002032786A (ja) | 処理パラメータ制御装置、処理パラメータ制御方法、および3次元モデル処理装置、並びにプログラム提供媒体 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): US |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2001921890 Country of ref document: EP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 10019575 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 2001921890 Country of ref document: EP |