WO2006041097A1 - Three-dimensional pointing method, three-dimensional display control method, three-dimensional pointing device, three-dimensional display control device, three-dimensional pointing program, and three-dimensional display control program
- Publication number
- WO2006041097A1 (PCT/JP2005/018799)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pointer
- dimensional
- pointing
- pen
- input
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03545—Pens or stylus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
- Three-dimensional pointing method, three-dimensional display control method, three-dimensional pointing device, three-dimensional display control device, three-dimensional pointing program, and three-dimensional display control program
- The present invention relates to a technique for performing three-dimensional pointing, and more particularly to a technique for pointing in a three-dimensional space represented on a display device using an input device having a pen-type operation unit.
- the present invention relates to a technique for three-dimensionally pointing an object placed (displayed) in a three-dimensional space represented on a display device with a pointer.
- the present invention relates to a 3D display control technique for selecting an object arranged and displayed in a three-dimensional space or pointing an object arranged and displayed in a three-dimensional space.
- GUI: graphical user interface
- PC: personal computer
- With the enhancement of graphics functions, operations have in many cases become more complicated than before, and they become an obstacle that prevents the operator from carrying out a desired action efficiently.
- One idea for improving this situation is to present and manipulate information using a three-dimensional space.
- This is often called a 3D GUI: a mechanism that places objects three-dimensionally in a 3D space and manipulates them using a specified input device.
- A similar mechanism is used in CAD and CG systems that perform design in a 3D space, and from the viewpoint of manipulating and pointing at 3D objects the issues are similar, so the discussion here proceeds with the 3D GUI as an example. With a 3D GUI, objects that used to be arranged two-dimensionally or stacked on top of each other can be arranged three-dimensionally, so the work space can be used efficiently. Moreover, because the real world that surrounds us is a three-dimensional space, a three-dimensional GUI can be handled more intuitively than a two-dimensional GUI.
- Among input devices, the pen-type input device has a form that is familiar to us.
- Pen tablets have frequently been used for pointing and object operations in conventional two-dimensional GUIs.
- Electromagnetic-induction pen tablets are portable because of the simplicity of the device, allow the screen to be pointed at directly with a pen, and can acquire richer information than a mouse (two-dimensional position, pen pressure, the angle of the pen barrel, the state of buttons or a wheel on the pen, and so on).
- As an input device, the pen tablet is installed not only in PCs but also in PDAs (Personal Digital Assistants) and, more recently, mobile phones.
- Pen tablets are thus representative pen-type input devices.
- There are also cases where pen-type devices are used as 3D input devices.
- For example, a pen-type device that can acquire the tilt and 3D position of the pen is held in the air, and a virtual pen tip (pointer) is displayed in the display device at the tip of the pen.
- With this technique, the space where the operator works and the space where the pointer is actually displayed are separated, but the virtually displayed pen tip feels as if it were part of the operator's own pen, so the technique surpasses conventional technology in intuitiveness.
- The three-dimensional desktop has the advantage that icons and windows can be arranged functionally because the degree of freedom in the depth direction is expanded.
- However, even if the desktop is made three-dimensional, pointing with a pointer is still constrained to two-dimensional movement, so it is difficult to make full use of the degree of freedom in the depth direction.
- Moreover, in a three-dimensional space, just as the pointer can be hidden behind an object and become unrecognizable, if another object (a back object) is hidden behind an object displayed in front (a front object), the back object cannot be recognized directly. Therefore, to recognize the position of the back object or manipulate it, the front object must be moved, or its display area reduced, or it must be hidden; afterwards, an operation is required to return the front object to its original size or display state in order to recognize its display contents. This is inconvenient for the operator.
- Patent Document 1: JP-A-5-073208
- Patent Document 2: JP-A-6-75693
- Patent Document 3: JP-A-8-248938
- Non-Patent Document 2: Keita Watanabe, Michiyasu Yasumura, "RUI: Realizable User Interface - Information Realization Using a Cursor", Proceedings of Human Interface Symposium 2003, 2003, pp. 541-544
- Non-Patent Document 3: George Robertson and 7 others, "The Task Gallery: A 3D Window Manager", Proceedings of CHI 2000, 1-6 April 2000, pp. 494-501
- A first object of the present invention is to provide a 3D pointing technique with which the operator can point at a pointer or an object at any position in the 3D space of a 3D GUI efficiently and intuitively, using a device familiar to the operator, with as little fatigue as possible.
- A second object of the present invention is to provide a pointing technique that lets the operator recognize intuitively and accurately the depth position of the pointer and the pointed position when pointing with a pointer in a 3D GUI.
- A third object of the present invention is to provide a technique by which the position of the pointer can be recognized easily even when the pointer in the three-dimensional space represented on the display device is moved three-dimensionally to a position behind another object.
- The third object of the present invention also includes making it easier to recognize and operate another object hidden behind an object displayed in front in the three-dimensional space represented on the display device, thereby improving convenience for the operator.
- The first object is achieved by a three-dimensional pointing method in which, based on the two-dimensional coordinates of the position pointed at when a point on a predetermined detection surface is pointed at with the pen tip of an input pen, and on the pen pressure, which is the pressure applied to the pen tip of the input pen, an extension line of the axis of the input pen into the three-dimensional space is obtained, a three-dimensional pointer is displayed on that extension line, and the coordinate of the three-dimensional pointer along the extension direction in the 3D space is changed and displayed according to the pen pressure of the input pen.
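As a rough illustration of this first method, the sketch below places the pointer on the pen-axis extension line at a depth proportional to pen pressure. It is a minimal sketch in Python, not the patent's implementation; the function name, the normalized pressure range, and the `max_depth` scale are all assumptions introduced here.

```python
import numpy as np

def pointer_position(pen_xy, pen_dir, pressure, max_depth=10.0):
    """Place the 3D pointer on the extension line of the pen axis.

    pen_xy    : (x, y) contact point on the detection surface (z = 0).
    pen_dir   : unit vector of the pen axis pointing into the scene.
    pressure  : pen pressure normalized to [0, 1] (an assumption here).
    max_depth : depth reached at full pressure (arbitrary scale).
    """
    contact = np.array([pen_xy[0], pen_xy[1], 0.0])
    # Pressing harder pushes the pointer deeper along the pen axis,
    # while the 2D coordinates of the contact point stay fixed.
    return contact + (pressure * max_depth) * np.asarray(pen_dir, dtype=float)
```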
- The first object is also achieved by a three-dimensional pointing method for pointing at a desired point in a three-dimensional space represented on a display device based on the two-dimensional coordinates of the position pointed at with the pen tip of the input pen on the predetermined detection surface, and on the time during which the pen tip continues pointing or on the operation of operation means provided on the input pen, the method being characterized in that the coordinate in the depth direction of the three-dimensional pointer displayed in the three-dimensional space is changed and displayed according to that time or that operation.
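For the time-based variant, a sketch of the depth update might look as follows; the hold-time measurement, the rate constant, and the depth clamp are assumptions made for illustration, not values from the patent.

```python
def depth_from_hold(hold_seconds, rate=2.0, max_depth=10.0):
    """Depth coordinate of the 3D pointer while the pen tip keeps pointing.

    The longer the pen tip stays on the detection surface, the deeper
    the pointer moves, up to max_depth (all units are arbitrary here).
    """
    return min(hold_seconds * rate, max_depth)
```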
- The first object is also achieved by a three-dimensional pointing method for pointing at a desired point in a three-dimensional space represented on a display device based on the two-dimensional coordinates of the position pointed at with the pen tip of the input pen on the predetermined detection surface, on the operation of operation means provided on the input pen, on the tilt angle of the input pen, which is the angle formed by the axis of the input pen and the detection surface, and on the azimuth angle of the input pen, which is the angle formed by the projection of the axis of the input pen onto the detection surface and a predetermined straight line on that surface.
- In this method, an extension line of the axis of the input pen into the three-dimensional space is obtained based on the tilt angle and the azimuth angle, the three-dimensional pointer is displayed on that extension line, and the coordinates of the three-dimensional pointer in the three-dimensional space are changed according to the time during which the pen tip of the input pen continues pointing or the operation of the operation means of the input pen.
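The extension-line direction can be derived from the two angles this method names. The sketch below is one plausible parameterization, assuming the tilt angle is measured from the detection surface, the azimuth from a reference line in that surface, and a coordinate frame in which z grows toward the viewer; none of these conventions are fixed by the text.

```python
import math

def pen_axis_direction(tilt_deg, azimuth_deg):
    """Unit vector of the pen-axis extension line into the 3D scene.

    tilt_deg    : angle between the pen axis and the detection surface.
    azimuth_deg : angle between the projection of the pen axis onto the
                  surface and a predetermined reference line on it.
    """
    t = math.radians(tilt_deg)
    a = math.radians(azimuth_deg)
    # The in-plane component shrinks as the pen stands more upright;
    # the extension continues through the contact point into the screen
    # (negative z, assuming z increases toward the viewer).
    return (math.cos(t) * math.cos(a),
            math.cos(t) * math.sin(a),
            -math.sin(t))
```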
- When an object displayed in the three-dimensional space is pointed at and an operation for starting an operation, editing, or processing on that object is performed, the pointed object may be displayed two-dimensionally on the surface of the display device closest to the operator, and two-dimensional operation, editing, or processing by the input pen may then be accepted for the two-dimensionally displayed object.
- Further, the coordinate in the depth direction of the three-dimensional pointer may be changed while the two-dimensional coordinates of the pointed point are kept fixed.
- Further, the present invention can be configured as a three-dimensional pointing device that generates a pointer based on the two-dimensional coordinates of the position pointed at when a point on the predetermined detection surface is pointed at with the pen tip of the input pen and on the pen pressure, which is the pressure applied to the pen tip, displays the generated pointer at a desired point in the three-dimensional space represented on a display device, and performs pointing.
- The device comprises: input information acquisition means for acquiring the two-dimensional coordinates and the pen-pressure information from the input pen; pointer position / rotation angle calculation means for calculating, based on the information acquired by the input information acquisition means, the position and rotation angle at which the pointer is displayed in the three-dimensional space represented on the display device; pointer generation means for generating the pointer based on the calculation result of the pointer position / rotation angle calculation means; and pointing determination means for determining whether or not there is an object pointed at by the generated pointer in the represented three-dimensional space.
- The input pen may have a structure in which the length of the pen tip is shortened according to the pen pressure, and the three-dimensional pointer may have the same shape as the pen tip of the input pen, or the same shape as a part of the pen tip.
- Further, the present invention can be configured as a three-dimensional pointing device that generates a pointer based on the two-dimensional coordinates of the position pointed at when a point on the predetermined detection surface is pointed at with the pen tip of the input pen, and on the time during which the point is pointed at or the operation of operation means provided on the input pen, displays the pointer at a desired point in the three-dimensional space represented on the display device, and performs pointing.
- The device comprises: pointer position / rotation angle calculation means for calculating the position at which the pointer is displayed in the three-dimensional space represented on the display device; pointer generation means for generating the pointer based on the calculation result of the pointer position / rotation angle calculation means; pointing determination means for determining whether or not there is an object pointed at by the generated pointer in the three-dimensional space represented on the display device; object generation means for generating an object to be displayed in the three-dimensional space; and display control means for displaying the generated pointer and object in the three-dimensional space represented on the display device.
- In this device, the pointer position / rotation angle calculation means calculates the three-dimensional pointer to be displayed in the three-dimensional space while changing its coordinate in the depth direction according to the time pointed at by the pen tip or the operation of the operation means of the input pen.
- The input pen may have a structure in which the length of the pen tip is shortened according to the time pointed at by the pen tip or the operation of the operation means, and the three-dimensional pointer may have a shape equivalent to the pen tip of the input pen or to a part of the pen tip.
- The present invention can also be configured as a three-dimensional pointing program that causes a computer to execute the processing of each means in each of the above three-dimensional pointing devices.
- The second object is achieved by a three-dimensional pointing method for pointing at a desired point in a three-dimensional space by moving a pointer displayed in the three-dimensional space of a display device capable of representing the three-dimensional space.
- The method comprises: step 1 of moving or rotating the pointer in a two-dimensional plane orthogonal to the depth direction of the three-dimensional space of the display device and moving it in the depth direction; step 2 of moving the pointing portion of the pointer in the depth direction while keeping the shape and size of the pointer constant; and step 3 of displaying the pointer moved in steps 1 and 2 on the display device.
- In step 2, the pointer may be rotated about a predetermined center point or center axis on the surface of or inside the pointer, excluding the pointing portion of the pointer.
- The second object is also achieved by a three-dimensional pointing method for pointing at a desired point in a three-dimensional space by moving the pointer displayed in the three-dimensional space of a display device capable of representing the three-dimensional space.
- The method comprises: step 1 of moving or rotating the pointer in a two-dimensional plane orthogonal to the depth direction of the three-dimensional space of the display device and moving it in the depth direction; step 4 of moving the pointing portion of the pointer in the depth direction while keeping constant the depth position of one predetermined point on the pointer other than the pointing portion, together with the shape and size of the pointer; and step 3 of displaying the pointer moved in steps 1 and 4 on the display device.
- In step 4, the pointer may be rotated about a predetermined center point or center axis on the surface of or inside the pointer, excluding the pointing portion of the pointer.
- The center point or center axis about which the pointer rotates may also be moved according to the rotation angle when the pointer rotates, as sketched below.
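One way to realize such a rotation, keeping an anchor point fixed and the pointer rigid while the tip tilts into depth, is a rotation about an axis lying in the display plane. The sketch below is an illustration under those assumptions; the function and parameter names are invented here, and the patent does not prescribe this particular formula.

```python
import numpy as np

def tilt_pointer(points, anchor, axis_xy, angle_rad):
    """Rotate pointer vertices about an in-plane axis through `anchor`.

    A rigid rotation keeps the pointer's shape and size constant and the
    anchor's depth fixed, while the pointing portion moves in depth.
    points  : (N, 3) array of pointer vertices.
    anchor  : 3-vector; the point whose position stays unchanged.
    axis_xy : unit 2-vector in the display plane (the rotation axis).
    """
    u = np.array([axis_xy[0], axis_xy[1], 0.0])
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    # Rodrigues' rotation formula in matrix form for the unit axis u.
    K = np.array([[0.0, -u[2], u[1]],
                  [u[2], 0.0, -u[0]],
                  [-u[1], u[0], 0.0]])
    R = c * np.eye(3) + s * K + (1.0 - c) * np.outer(u, u)
    return (np.asarray(points, dtype=float) - anchor) @ R.T + anchor
```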
- The pointer may be composed of a first portion whose position in the depth direction, position in the two-dimensional plane, shape, and size are fixed, a second portion in which at least the position in the depth direction changes, and a third portion connecting the first portion and the second portion, and step 4 may be performed by moving the second portion of the pointer in the depth direction. Further, step 4 may move the second portion in the depth direction while changing the position, shape, or size of the second portion of the three-dimensional pointer in the two-dimensional plane.
- When part of the pointer protrudes beyond the depth range that the display device can represent, the protruding part may be projected onto, or folded back and displayed on, a two-dimensional plane close to the protruding part.
- The display device may also display, together with the pointer, a reference pointer whose position in the depth direction is fixed.
- Further, the present invention can be configured as a three-dimensional pointing device that displays a pointer in a three-dimensional space represented on a display device capable of representing the three-dimensional space, and moves the pointer three-dimensionally based on input information from an input device to point at an arbitrary point in the three-dimensional space.
- The device comprises: input information acquisition means for acquiring the input information from the input device;
- pointer position / deformation amount calculation means for calculating the display position and deformation amount of the pointer based on the acquired input information; pointer generation means for generating the pointer to be displayed at the display position calculated by the pointer position / deformation amount calculation means;
- pointing determination means for determining whether or not there is an object pointed at by the pointer; object generation means for changing an object to the pointed state when the pointing determination means determines that a pointed object exists; and
- display control means for displaying the pointer generated by the pointer generation means and the object generated by the object generation means on the display device. Further, the present invention can be configured as a three-dimensional pointing program that causes a computer to execute the processing of each means in the above three-dimensional pointing device.
- The third object is achieved by a three-dimensional display control method that displays a pointer and one or more objects in a three-dimensional space represented on a display device capable of representing the three-dimensional space, and
- controls the display state of the pointer and the objects when the pointer is moved three-dimensionally based on input information from an input device to point at an arbitrary point in the three-dimensional space. The method comprises: step 1 of calculating the display position of the pointer based on the input information; step 2 of displaying the pointer at the display position calculated in step 1; and step 3 of determining, from the display position of the pointer calculated in step 1, whether or not there is an object in front of the depth position of the pointer, and transparently displaying any object in front of the depth position of the pointer.
- Step 3 of the three-dimensional display control method may be configured to transparently display, among the objects located in front of the depth position of the pointer, only the objects that overlap the pointer.
- In step 3, it is also possible to transparently display the objects in front of the depth position of the pointer while excluding objects that are specified or selected based on predetermined input information from the input device.
- The transparency of an object in front of the depth position of the pointer may be changed according to the distance in the depth direction between the object and the pointer,
- with objects at a greater distance displayed at greater transparency.
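A distance-dependent transparency rule of this kind can be sketched as below. The linear fade, the z-axis convention (z grows toward the viewer), and the `max_dist` / `min_alpha` parameters are assumptions made for illustration.

```python
def object_alpha(obj_z, pointer_z, max_dist=5.0, min_alpha=0.1):
    """Opacity of an object relative to the pointer's depth position.

    Objects behind (or level with) the pointer stay fully opaque;
    objects in front fade out more the farther they are from the
    pointer in depth, so the pointer and its surroundings stay visible.
    """
    d = obj_z - pointer_z              # > 0: the object is in front
    if d <= 0:
        return 1.0
    t = min(d / max_dist, 1.0)
    return 1.0 - t * (1.0 - min_alpha)
```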
- In step 3, only an area of an arbitrary shape centered on the point on the object that overlaps the point at which the pointer is pointing may be displayed transparently.
- The arbitrary shape to be made transparent may also be enlarged as the distance in the depth direction between the pointer and the object increases.
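For the region-based variant, taking a circle as the arbitrary shape, the sketch below makes only a see-through disc on the front object, with a radius that grows with the depth gap. The radius law and the hard transparent/opaque edge are illustrative assumptions.

```python
import math

def hole_radius(depth_gap, base_radius=1.0, growth=0.5):
    """Radius of the see-through circle on an object in front of the pointer.

    The circle is centered where the pointer tip overlaps the object,
    and grows as the depth gap between pointer and object increases.
    """
    return base_radius + growth * max(depth_gap, 0.0)

def pixel_alpha(pixel_xy, center_xy, radius):
    """Fully transparent inside the circle, opaque outside (hard edge)."""
    dx = pixel_xy[0] - center_xy[0]
    dy = pixel_xy[1] - center_xy[1]
    return 0.0 if math.hypot(dx, dy) <= radius else 1.0
```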
- Step 3 may also include a step of returning a transparently displayed object to the opaque state it had before being made transparent when the pointer remains stationary for a predetermined time.
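The stationary-pointer rule can be expressed with a simple timer, sketched below. The class, the `alpha` attribute on objects, and the rest threshold are hypothetical names introduced only for this illustration.

```python
import time

class TransparencyRestorer:
    """Restore transparent objects to opaque once the pointer rests."""

    def __init__(self, rest_seconds=2.0):
        self.rest_seconds = rest_seconds
        self._last_move = time.monotonic()
        self._last_pos = None

    def update(self, pointer_pos, transparent_objects):
        """Call once per frame with the pointer position (a tuple)."""
        if pointer_pos != self._last_pos:
            self._last_pos = pointer_pos
            self._last_move = time.monotonic()
        elif time.monotonic() - self._last_move >= self.rest_seconds:
            for obj in transparent_objects:
                obj.alpha = 1.0    # back to the original opaque display
            transparent_objects.clear()
```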
- Further, the present invention can be configured as a three-dimensional display control device that displays a pointer and one or more objects in a three-dimensional space represented on a display device capable of representing the three-dimensional space, and
- controls the display state of the pointer and the objects when the pointer is moved three-dimensionally based on input information from an input device to point at an arbitrary point in the three-dimensional space. The device comprises:
- input information acquisition means for acquiring the input information from the input device;
- pointer position calculation means for calculating the display position of the pointer based on the input information acquired by the input information acquisition means;
- pointer generation means for generating the pointer to be displayed at the display position calculated by the pointer position calculation means;
- object change determination means for determining, based on the display position of the pointer calculated by the pointer position calculation means, whether or not there is an object in front of the depth position of the pointer and whether or not to make such an object transparent; object generation / transparency means for generating the objects or making them transparent; and
- display control means for displaying the objects generated or made transparent by the object generation / transparency means, together with the pointer, on the display device.
- Further, the present invention can be configured as a 3D display control program that causes a computer to execute the processing of each means of the above 3D display control device.
- In the present invention, a pointer reflecting information such as the position of the pen tip of the input pen, the pen pressure, the time during which the pen tip has continued pointing or the operation of the operation means of the input pen (a button, wheel, slide bar, etc.), and the tilt and orientation of the input pen is generated and displayed on the display device, thereby pointing at an arbitrary point in the three-dimensional space represented on the display device.
- The input pen is, for example, the operation pen (electronic pen) of a pen tablet or a stylus for operating a touch panel, and can be operated with its pen tip in contact with a predetermined detection surface. Accurate pointing operations are therefore easy, and fatigue from long pointing operations can be reduced.
- Further, by overlapping the detection means (digitizer) of the pen tablet with the display surface of the display device, or by using a touch panel, the pointing operation can be performed with the input pen in contact with the display surface of the display device.
- This enables more accurate and intuitive 3D pointing operations.
- In addition, an object can be operated in accordance with the change of the pointer.
- In the present invention, the pointer is moved by changing the depth position of the pointing portion of the pointer while keeping the depth position of one point on the pointer constant, so the operator recognizes the change in the depth direction of that portion. In this case, the operator can recognize the pointed depth position accurately and intuitively from the state of the portion of the pointer whose depth position is kept constant.
- Further, when the pointer is rotated, its inclination in the depth direction changes with the rotation, so the depth position can be recognized easily.
- the center point or the center axis for rotating the pointer may be fixed, or may be moved according to a rotation angle when the pointer rotates.
- When the pointer is composed of the first portion, the second portion, and the third portion,
- only the pointing portion of the pointer bends in the depth direction. Therefore, the pointed depth position can easily be recognized from the states of the first portion, the second portion, and the third portion.
- Moreover, by moving the second portion in the depth direction while changing its position, shape, or size in the two-dimensional plane, the depth position can be recognized accurately.
- In the present invention, objects in front of the depth position of the pointer are displayed transparently. Therefore, when the pointer is moved three-dimensionally, it is not hidden behind an object in front of it, and its position can be recognized easily even when it moves to a position behind an object. Further, by making the objects in front of the pointer transparent, the position of another object hidden behind a transparent object can be recognized easily and pointed at. Also, when the pointer is moved forward, objects that were made transparent return to their original opaque display, so the display contents of those objects can be recognized easily. Since the transparency and opacity of objects can thus be controlled simply by moving the pointer in the depth direction, convenience for the pointer operator is improved.
- Further, by not making a selected object transparent even when it is in front of the pointer, the selected object can be recognized easily. Also, by increasing the transparency of objects with a greater distance from the pointer in the depth direction, the depth position of the pointer and the objects near the pointer can be recognized easily.
- Further, instead of making the entire object transparent, only an area of an arbitrary shape, such as a circle, ellipse, or polygon, centered on the point that overlaps the point at which the pointer is pointing is made transparent,
- so that visual discomfort when switching between transparency and opacity occurs continuously can be reduced.
- Further, when the pointer remains stationary for a predetermined time, the transparent object is returned to its original opaque state and displayed, so the display contents of the object that was made transparent can be recognized without moving the pointer in front of it, further improving convenience for the operator.
- FIG. 1 is a schematic diagram for explaining an overview of the three-dimensional pointing method of the first embodiment, and shows a configuration example of a system that realizes the pointing method of the first embodiment.
- FIG. 2 is a schematic diagram for explaining the outline of the three-dimensional pointing method of the first and second embodiments, and explains the principle of the three-dimensional pointing method of the first and second embodiments.
- FIG. 3 is a schematic diagram for explaining the outline of the three-dimensional pointing method of the first embodiment, and shows a configuration example of an input pen used in the three-dimensional pointing method of the first embodiment.
- FIG. 4A is a schematic diagram for explaining the three-dimensional pointing method in Examples 1-1 and 2-1, and is a front view, a right side view, and a bottom view showing an example of a three-dimensional space represented on the display device.
- FIG. 4B is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-1 and 2-1, and is a bird's-eye view showing an example of a three-dimensional space represented on the display device.
- FIG. 5A is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-1 and 2-1, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 5B is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-1 and 2-1, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 5C is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-1 and 2-1, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 6A is a schematic diagram for explaining the three-dimensional pointing method in Examples 1-1 and 2-1, and is a front view and a right side view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 6B is a schematic diagram for explaining the three-dimensional pointing method in Examples 1-1 and 2-1, and is a front view and a right side view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 6C is a schematic diagram for explaining the three-dimensional pointing method in Examples 1-1 and 2-1, and is a front view and a right side view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 7 is a schematic diagram for explaining the three-dimensional pointing method of Example 1-1, and is a flowchart showing the processing procedure of the three-dimensional pointing method of Example 1-1.
- FIG. 8A is a schematic diagram for explaining a modification of the three-dimensional pointing method in Examples 1-1 and 2-1, and shows the shape of a pointer to be displayed.
- FIG. 8B is a schematic diagram for explaining a modification of the three-dimensional pointing method in Examples 1-1 and 2-1, and shows the shape of a pointer to be displayed.
- FIG. 8C is a schematic diagram for explaining a modification of the three-dimensional pointing method in Examples 1-1 and 2-1, and shows the shape of a pointer to be displayed.
- FIG. 8D is a schematic diagram for explaining a modification of the three-dimensional pointing method in Examples 1-1 and 2-1, and shows the shape of a pointer to be displayed.
- FIG. 9A is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-2 and 2-2, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 9B is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-2 and 2-2, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 9C is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-2 and 2-2, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 10A is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-2 and 2-2, and is a front view and a right side view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 10B is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-2 and 2-2, and is a front view and a right side view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 10C is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-2 and 2-2, and is a front view and a right side view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 11 is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-3 and 2-3, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 12 is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-3 and 2-3, and is a front view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 13 is a schematic diagram for explaining the three-dimensional pointing method of Example 1-3 according to the present invention, and is a flowchart showing the processing procedure of the three-dimensional pointing method of Example 1-3.
- FIG. 14A is a schematic diagram for explaining the three-dimensional pointing method in Examples 1-4 and 2-4, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 14B is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-4 and 2-4, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 14C is a schematic diagram for explaining the three-dimensional pointing method in Examples 1-4 and 2-4, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 15A is a schematic diagram for explaining the three-dimensional pointing method in Examples 1-4 and 2-4, and is a front view and a right side view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 15B is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-4 and 2-4, and is a front view and a right side view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 15C is a schematic diagram for explaining the three-dimensional pointing method in Examples 1-4 and 2-4, and is a front view and a right side view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 16A is a schematic diagram for explaining the three-dimensional pointing method in Examples 1-5 and 2-5, and is a diagram for explaining the principle of the display device (DFD) used in Examples 1-5 and 2-5.
- FIG. 16B is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-5 and 2-5, and is a diagram for explaining the principle of the display device (DFD) used in Examples 1-5 and 2-5.
- FIG. 17A is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-5 and 2-5, and is a front view, a right side view, and a bottom view showing an example of a three-dimensional space represented on the display device.
- FIG. 17B is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-5 and 2-5, and is a bird's-eye view showing an example of a three-dimensional space expressed on the display device.
- FIG. 18A is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-5 and 2-5, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 18B is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-5 and 2-5, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 18C is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-5 and 2-5, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 18D is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-5 and 2-5, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 19A is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-5 and 2-5, and is a front view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 19B is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-5 and 2-5, and is a front view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 19C is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-5 and 2-5, and is a front view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 19D is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-5 and 2-5, and is a front view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 20 is a schematic diagram for explaining the three-dimensional pointing method of Example 1-5, and is a flowchart showing the processing procedure of the three-dimensional pointing method of Example 1-5.
- FIG. 21A is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-6 and 2-6, and is a front view, a right side view, and a bottom view showing an example of a three-dimensional space represented on the display device.
- FIG. 21B is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-6 and 2-6, and is a bird's-eye view showing an example of a three-dimensional space expressed on the display device.
- FIG. 22A is a schematic diagram for explaining the three-dimensional pointing method in Examples 1-6 and 2-6, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 22B is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-6 and 2-6, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 22C is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-6 and 2-6, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 23A is a schematic diagram for explaining the three-dimensional pointing method in Examples 1-6 and 2-6, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 23B is a schematic diagram for explaining the three-dimensional pointing method in Examples 1-6 and 2-6, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 23C is a schematic diagram for explaining the three-dimensional pointing method in Examples 1-6 and 2-6, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 24A is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-6 and 2-6, and is a front view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 24B is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-6 and 2-6, and is a front view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 24C is a schematic diagram for explaining the three-dimensional pointing method in Examples 1-6 and 2-6, and is a front view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 25A is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-6 and 2-6, and is a front view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 25B is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-6 and 2-6, and is a front view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 25C is a schematic diagram for explaining the three-dimensional pointing method in Examples 1-6 and 2-6, and is a front view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 26 is a schematic diagram for explaining the three-dimensional pointing method of Example 1-6, and is a flowchart showing the processing procedure of the three-dimensional pointing method of Example 1-6.
- FIG. 27 is a schematic diagram for explaining the three-dimensional pointing method of Example 1-6, and is a flowchart showing a modification of the processing procedure of the three-dimensional pointing method of Example 1-6.
- FIG. 28A is a schematic diagram for explaining a method of deleting an object by the three-dimensional pointing method of Examples 1-6 and 2-6, and is a front view, a right side view, and a bottom view showing an example of a three-dimensional space represented on the display device.
- FIG. 28B is a schematic diagram for explaining a method of deleting an object using the three-dimensional pointing method in Examples 1-6 and 2-6, and is a bird's-eye view showing an example of a 3D space represented on the display device.
- FIG. 29A is a schematic diagram for explaining a method of deleting an object using the three-dimensional pointing method in Examples 1-6 and 2-6, and is a bird's-eye view showing a state in a three-dimensional space when operated with an input pen.
- FIG. 29B is a schematic diagram for explaining a method of deleting an object using the three-dimensional pointing method in Examples 1-6 and 2-6, and is a bird's-eye view showing a state in a three-dimensional space when operated with an input pen.
- FIG. 30A is a schematic diagram for explaining a method of deleting an object using the three-dimensional pointing method in Examples 1-6 and 2-6, and is a bird's-eye view showing a state in a three-dimensional space when operated with an input pen.
- FIG. 30B is a schematic diagram for explaining a method of deleting an object by the three-dimensional pointing method in Examples 1-6 and 2-6, and is a bird's-eye view showing a state in a three-dimensional space when operated with an input pen.
- FIG. 30C is a schematic diagram for explaining a method of deleting an object using the three-dimensional pointing method in Examples 1-6 and 2-6, and is a bird's-eye view showing a state in a three-dimensional space when operated with an input pen.
- FIG. 31A is a schematic diagram for explaining a method of deleting an object by the three-dimensional pointing method in Examples 1-6 and 2-6, and is a front view showing the state in the three-dimensional space when operated with an input pen.
- FIG. 31B is a schematic diagram for explaining a method of deleting an object by the three-dimensional pointing method in Examples 1-6 and 2-6, and is a front view showing the state in the three-dimensional space when operated with an input pen.
- FIG. 32A is a schematic diagram for explaining a method of deleting an object by the three-dimensional pointing method in Examples 1-6 and 2-6, and is a front view showing the state in the three-dimensional space when operated with an input pen.
- FIG. 32B is a schematic diagram for explaining a method of deleting an object by the three-dimensional pointing method in Examples 1-6 and 2-6, and is a front view showing the state in the three-dimensional space when operated with an input pen.
- FIG. 32C is a schematic diagram for explaining a method of deleting an object by the three-dimensional pointing method in Examples 1-6 and 2-6, and is a front view showing the state in the three-dimensional space when operated with an input pen.
- FIG. 33 is a schematic diagram for explaining the three-dimensional pointing method of Example 1-7, and shows a configuration example of the input pen used in the three-dimensional pointing method of Example 1-7.
- FIG. 34 is a schematic diagram for explaining the three-dimensional pointing method of Example 1-7, and shows a configuration example of an input pen used in the three-dimensional pointing method of Example 1-7.
- FIG. 35A is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-7 and 2-7, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 35B is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-7 and 2-7, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 35C is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-7 and 2-7, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 36A is a schematic diagram for explaining the three-dimensional pointing method in Examples 1-7 and 2-7, and is a front view and a right side view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 36B is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-7 and 2-7, and is a front view and a right side view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 36C is a schematic diagram for explaining the three-dimensional pointing method in Examples 1-7 and 2-7, and is a front view and a right side view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 37A is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-8 and 2-8, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 37B is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-8 and 2-8, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 37C is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-8 and 2-8, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 38A is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-8 and 2-8, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 38B is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-8 and 2-8, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 38C is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-8 and 2-8, and is a bird's-eye view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 39A is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-8 and 2-8, and is a front view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 39B is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-8 and 2-8, and is a front view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 39C is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-8 and 2-8, and is a front view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 40A is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-8 and 2-8, and is a front view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 40B is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-8 and 2-8, and is a front view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 40C is a schematic diagram for explaining the three-dimensional pointing method of Examples 1-8 and 2-8, and is a front view showing a state in the three-dimensional space when operated with an input pen.
- FIG. 41 is a schematic diagram for explaining the outline of the three-dimensional pointing method of the second embodiment, and shows a configuration example of a system that realizes the pointing method of the second embodiment.
- FIG. 42 is a schematic diagram for explaining the outline of the 3D pointing method of the second embodiment, and shows a configuration example of an input pen used in the 3D pointing method of the second embodiment.
- FIG. 43 is a schematic diagram for explaining the three-dimensional pointing method of Example 2-1, and is a flowchart showing the processing procedure of the three-dimensional pointing method of Example 2-1.
- FIG. 44 is a schematic diagram for explaining the 3D pointing method of Example 2-2, and is a flowchart showing the processing procedure of the 3D pointing method of Example 2-2.
- FIG. 45 is a schematic diagram for explaining the three-dimensional pointing method of Example 2-3, and is a flowchart showing the processing procedure of the three-dimensional pointing method of Example 2-3.
- FIG. 46 is a schematic diagram for explaining the three-dimensional pointing method of Example 2-5, and is a flowchart showing the processing procedure of the three-dimensional pointing method of Example 2-5.
- FIG. 47 is a schematic diagram for explaining the three-dimensional pointing method of Example 2-6, and is a flowchart showing the processing procedure of the three-dimensional pointing method of Example 2-6.
- FIG. 48 is a schematic diagram for explaining the three-dimensional pointing method of Example 2-7, and shows a configuration example of an input pen used in the three-dimensional pointing method of Example 2-7.
- FIG. 49 is a schematic diagram for explaining the three-dimensional pointing method of Example 2-7, and shows a configuration example of the input pen used in the three-dimensional pointing method of Example 2-7.
- FIG. 50 is a schematic diagram showing a configuration example of a system for realizing the three-dimensional pointing method of the third embodiment.
- FIG. 51 is a schematic diagram for explaining the three-dimensional pointing method of Example 3-1, and is a diagram for explaining the operation method of the pointer.
- FIG. 52 is a schematic diagram for explaining the three-dimensional pointing method of Example 3-1, and shows a change in the three-dimensional space when an object located behind the pointer is pointed.
- FIG. 53 is a schematic diagram for explaining the three-dimensional pointing method of Example 3-1, and is a perspective view of changes in the three-dimensional space of FIG. 52.
- FIG. 54 is a schematic diagram for explaining the three-dimensional pointing method in Example 3-1, and shows a front view and a right side showing a change in the three-dimensional space when pointing an object in front of the pointer.
- FIG. 54 is a schematic diagram for explaining the three-dimensional pointing method in Example 3-1, and shows a front view and a right side showing a change in the three-dimensional space when pointing an object in front of the pointer.
- FIG. 55 is a schematic diagram for explaining the three-dimensional pointing method of Example 3-1, and is a flowchart for explaining the processing procedure when the three-dimensional pointing method of Example 3-1 is executed by the system control device (pointing device).
- FIG. 56A This is a schematic diagram showing a modification of the shape of the pointer, and shows a triangular pointer.
- FIG. 56B is a schematic diagram showing a modification of the shape of the pointer, and shows a pointer in the shape of a human hand.
- FIG. 56C is a schematic diagram showing a modification of the shape of the pointer, and is a diagram showing a hook-shaped pointer.
- FIG. 56D is a schematic diagram showing a modification of the shape of the pointer, and is a diagram showing a cross-shaped pointer.
- FIG. 57 is a diagram showing an example of displaying a reference in the three-dimensional pointing method of the embodiment 3-1.
- FIG. 58 is a schematic diagram for explaining a modified example of the three-dimensional pointing method of the embodiment 3-1, and is a front view and a right side view showing a change in the three-dimensional space when an object behind the pointer is pointed at.
- FIG. 59 is a schematic diagram for explaining a first application example of the three-dimensional pointing method of the embodiment 3-1, and is a front view and a right side view showing changes in the three-dimensional space when an object behind the pointer is pointed at.
- FIG. 60 is a schematic diagram for explaining a second application example of the three-dimensional pointing method of the embodiment 3-1, and is a front view and a right side view showing a change in the three-dimensional space when an object behind the pointer is pointed at.
- FIG. 61 is a schematic diagram for explaining a third application example of the three-dimensional pointing method of the embodiment 3-1 and shows a configuration example of the system.
- FIG. 62A is a schematic diagram for explaining a third application example of the three-dimensional pointing method of the embodiment 3-1, and is a diagram for explaining the operating principle of the DFD.
- FIG. 62B is a schematic diagram for explaining a third application example of the three-dimensional pointing method of the embodiment 3-1, and is a diagram for explaining the operating principle of the DFD.
- FIG. 63A is a schematic diagram for explaining a fourth application example of the three-dimensional pointing method of the embodiment 3-1 and shows an application example in the case of following a linear locus.
- FIG. 63B is a schematic diagram for explaining a fourth application example of the three-dimensional pointing method of the embodiment 3-1 and shows an application example in the case of following a linear locus.
- FIG. 64A is a schematic diagram for explaining a fourth application example of the three-dimensional pointing method of the embodiment 3-1 and shows an application example in the case of following a linear locus.
- FIG. 64B is a schematic diagram for explaining a fourth application example of the three-dimensional pointing method of the embodiment 3-1 and shows an application example in the case of following a linear locus.
- FIG. 65A is a schematic diagram for explaining a fourth application example of the three-dimensional pointing method of the embodiment 3-1 and shows an application example in the case of following an arcuate locus.
- FIG. 65B is a schematic diagram for explaining a fourth application example of the three-dimensional pointing method of the embodiment 3-1 and shows an application example in the case of following an arcuate locus.
- FIG. 66A is a schematic diagram for explaining a fourth application example of the three-dimensional pointing method of the embodiment 3-1 and shows an application example in the case of following an arcuate locus.
- FIG. 66B is a schematic diagram for explaining a fourth application example of the three-dimensional pointing method of the embodiment 3-1 and shows an application example in the case of following an arcuate locus.
- FIG. 67 is a schematic diagram for explaining the three-dimensional pointing method of Example 3-2, and shows a change in the three-dimensional space when an object located behind the pointer is pointed.
- FIG. 68A is a schematic diagram for explaining the three-dimensional pointing method of Example 3-2, and is a diagram for explaining a problem in the three-dimensional pointing method of Example 3-2.
- FIG. 68B is a schematic diagram for explaining the three-dimensional pointing method of Example 3-2.
- FIG. 69 is a schematic diagram for explaining the three-dimensional pointing method of Example 3-2.
- FIG. 70A is a schematic diagram for explaining an application example of the three-dimensional pointing method of the embodiment 3-2, and shows an example in which the point serving as the center of rotation is fixed in the three-dimensional space.
- FIG. 70B is a schematic diagram for explaining an application example of the three-dimensional pointing method of the embodiment 3-2, and shows an example in which the point serving as the center of rotation moves in the three-dimensional space.
- FIG. 71 is a schematic diagram for explaining the three-dimensional pointing method of Example 3-3, and shows a change in the three-dimensional space when an object located behind the pointer is pointed.
- FIG. 72 is a schematic diagram for explaining the three-dimensional pointing method of Example 3-3, and is a perspective view of changes in the three-dimensional space of FIG. 71.
- FIG. 73 is a schematic diagram for explaining the 3D pointing method of Example 3-3, and is a flowchart for explaining the processing procedure when the 3D pointing method of Example 3-3 is executed by the system control device (pointing device).
- FIG. 74A is a schematic diagram for explaining a pointer coupling method in the three-dimensional pointing method of the third embodiment, and shows a pointer coupling method as viewed from the XZ plane side.
- FIG. 74B is a schematic diagram for explaining a pointer connection method in the three-dimensional pointing method of the third embodiment, and shows a pointer connection method as viewed from the XZ plane side.
- FIG. 74C is a schematic diagram for explaining a pointer coupling method in the three-dimensional pointing method of the third embodiment, and shows a pointer coupling method as viewed from the XZ plane side.
- FIG. 74D is a schematic diagram for explaining the pointer connection method in the three-dimensional pointing method of the third embodiment, and shows the pointer connection method as viewed from the XZ plane side.
- FIG. 75 is a schematic diagram for explaining an application example of the three-dimensional pointing method of Example 3-3, and is a diagram showing an application example in the case of moving while maintaining the shape of the part to be pointed at.
- FIG. 76A is a schematic diagram for explaining an application example of the three-dimensional pointing method of the embodiment 3-3, and is a diagram showing an application example in the case of moving while keeping the shape of the part to be pointed.
- FIG. 76B is a schematic diagram for explaining an application example of the three-dimensional pointing method of the embodiment 3-3, and is a diagram showing an application example in the case of moving while maintaining the shape of the part to be pointed.
- FIG. 77A is a schematic diagram for explaining an application example of the three-dimensional pointing method of the embodiment 3-3, and shows an application example in the case of changing the shape of the part to be pointed.
- FIG. 77B is a schematic diagram for explaining an application example of the three-dimensional pointing method of the embodiment 3-3, and is a diagram showing an application example in a case where the shape of the part to be pointed is changed.
- FIG. 78A is a schematic diagram for explaining an application example of the three-dimensional pointing method of the embodiment 3-3, and is a diagram showing an application example when the shape of the part to be pointed is changed.
- FIG. 78B is a schematic diagram for explaining an application example of the three-dimensional pointing method of Example 3-3, and is a diagram showing an application example when changing the shape of the part to be pointed.
- FIG. 79 is a diagram for explaining an example of a combination of the first and second embodiments with the third embodiment.
- FIG. 80 is a schematic diagram for explaining the outline of the 3D display control method of the fourth embodiment, and shows a schematic configuration of a computer system to which the 3D display control method of the fourth embodiment is applied.
- FIG. 81 is a schematic diagram for explaining the outline of the three-dimensional display control method of the fourth embodiment, and is a diagram for explaining the operation principle of a display device (DFD) capable of expressing a three-dimensional space.
- FIG. 82 is a schematic diagram for explaining the outline of the 3D display control method of the fourth embodiment, and is a front view and a right side view showing an example of a 3D space expressed on the display device.
- FIG. 83 is a schematic diagram for explaining the outline of the 3D display control method of the fourth embodiment, and is a perspective view (bird's-eye view) showing an example of a 3D space represented on the display device.
- FIG. 84 is a schematic diagram for explaining the outline of the three-dimensional display control method of the fourth embodiment, and is a diagram showing an example of a pointer operation method.
- FIG. 85 is a schematic diagram for explaining the three-dimensional display control method of Example 4-1, and shows how the three-dimensional space changes when the display control method of Example 4-1 is applied.
- FIG. 86 is a schematic diagram for explaining the three-dimensional display control method of Example 4-1, and shows how the three-dimensional space changes when the display control method of Example 4-1 is applied.
- FIG. 87 is a schematic diagram for explaining the 3D display control method of Example 4-1, and shows a conventional 3D space control method for comparison with the display control method of Example 4-1.
- FIG. 88 is a flowchart showing a processing procedure in the apparatus for realizing the three-dimensional display control method of Embodiment 4-1.
- FIG. 89 is a schematic diagram for explaining an application example of the three-dimensional display control method of the embodiment 4-1, and shows a change in the three-dimensional space when the application example is applied.
- FIG. 90 is a schematic diagram for explaining an application example of the three-dimensional display control method of the embodiment 4-1, and is a flowchart showing a processing procedure in the apparatus for realizing the application example.
- FIG. 91 is a schematic diagram for explaining the three-dimensional display control method of Example 4-2, and is a diagram showing a change in the three-dimensional space when the display control method of Example 4-2 is applied.
- FIG. 92 is a schematic diagram for explaining the three-dimensional display control method of Example 4-2, and is a diagram showing an example of a method for determining transparency.
- FIG. 93 is a schematic diagram for explaining the 3D display method of Example 4-2, and is a flowchart showing the processing procedure in the apparatus for realizing the 3D display method of Example 4-2.
- FIG. 94 is a schematic diagram for explaining the 3D display control method of Example 4-3, and shows a change in the 3D space when the display control method of Example 4-3 is applied.
- FIG. 95A is a schematic diagram for explaining the three-dimensional display control method of Example 4-3, and is a diagram showing an example of a transparency determination method.
- FIG. 95B is a schematic diagram for explaining the three-dimensional display control method of Example 4-3, and shows an example of a transparency determination method.
- FIG. 96 is a schematic diagram for explaining the 3D display control method of Example 4-4, and shows how the 3D space changes when the display control method of Example 4-4 is applied.
- FIG. 97 is a schematic diagram for explaining the 3D display control method of Embodiment 4-4, and is a flowchart showing the processing procedure in the apparatus for realizing the 3D display method of Embodiment 4-4.
- FIG. 98 is a schematic diagram for explaining the three-dimensional display control method of Example 4-5, and is a diagram showing a method for selecting an object.
- FIG. 99 is a schematic diagram for explaining the three-dimensional display control method of Example 4-5, and shows how the three-dimensional space changes when the display control method of Example 4-5 is applied.
- FIG. 100 is a schematic diagram for explaining the 3D display control method of Example 4-5, and is a flowchart showing the processing procedure in the apparatus for realizing the 3D display method of Example 4-5.
- 101 Input information acquisition means, 102 Pointer position/rotation angle calculation means
- Operation means (button, wheel, slide bar)
- 201P Pen tip of the input pen
- Trash can object
- 107 Processing control means
- 101 Input information acquisition means, 102 Pointer position calculation means
- the first embodiment corresponds to the first object of the present invention.
- the three-dimensional pointing method uses a pen-shaped input device to point at an object in a three-dimensional space represented on a display device capable of three-dimensional display, and to operate the pointed object.
- the pen-shaped input device is, for example, a device such as a pen tablet comprising a pen-shaped operation means (hereinafter referred to as an input pen) operated by the operator who performs pointing at and operation of the object, and
- a detection means for detecting information such as the position of the pen tip of the input pen, the writing pressure, and the direction of the axis.
- The position and other attributes of the pointer are determined based on the information detected by the detection means, and the pointer is displayed in the three-dimensional space represented on the display device. In this way, the operator can point at an object in the three-dimensional space represented on the display device with the pen tip of the input pen in contact with the detection surface of the detection means, so the operator's fatigue can be reduced when pointing at and operating objects for a long time.
- Further, the writing pressure of the input pen is made to correspond to the movement or deformation of the pointer in the depth direction, making it possible to point at a point in the three-dimensional space represented on the display device.
- In this way, the operator can feel the pointer displayed in the three-dimensional space represented on the display device as if it were a part of the pen tip of his or her own input pen, and can point at a three-dimensional object easily and intuitively.
- In addition, after pointing at an object, the pointed object can be subjected to two-dimensional GUI-like operations such as editing and processing.
- the object on the two-dimensional GUI can be operated by operating the input pen.
- When the editing or processing of the object is completed, the object is again treated as a three-dimensional object so that it can be moved to a three-dimensional position desired by the operator.
- In this way, the operation of a 3D object can be realized with the same operations as in a conventional 2D GUI using an existing pen-shaped input device, so there is no need to learn a new 3D operation of the input pen.
- FIGS. 1 to 3 are schematic diagrams for explaining the outline of the three-dimensional pointing method of the first embodiment; FIG. 1 shows a configuration example of a system that realizes the three-dimensional pointing method of the first embodiment,
- FIG. 2 is a diagram for explaining the principle of the three-dimensional pointing method of the first embodiment, and
- FIG. 3 is a diagram showing a configuration example of the input pen used in the three-dimensional pointing method of the first embodiment.
- In FIG. 1, 1 is a system control device, 101 is input information acquisition means, 102 is pointer position/rotation angle calculation means, 103 is pointer generation means, 104 is pointing determination means, 105 is object generation means, 106 is display control means, 107 is processing control means, 108 is storage means, 2 is an input device, and 3 is a display device.
- 201P is the pen tip of the input pen
- 201X is the axis of the input pen housing.
- 201 is an input pen
- 201A is a coil
- 201B is a rotation angle detection coil
- 201C is a writing pressure sensing unit.
- The three-dimensional pointing method of the first embodiment is a pointing method that is preferably applied when a pointer is operated in a three-dimensional space represented on a display device connected to a system control device such as a PC, and an object is pointed at and manipulated three-dimensionally, using the pen-shaped input device connected to the system control device.
- As shown in FIG. 1, the system control device 1 comprises: input information acquisition means 101 for acquiring input information input from the input device 2; pointer position/rotation angle calculation means 102 for calculating the movement direction, movement amount, rotation direction, and rotation angle of the pointer based on the input information acquired by the input information acquisition means 101; pointer generation means 103 for generating a pointer based on the calculation result of the pointer position/rotation angle calculation means 102; pointing determination means 104 for determining whether or not there is an object pointed at by the pointer generated by the pointer generation means 103; object generation means 105 for, when there is a pointed object, for example, changing the color of the object or generating an object whose position and orientation follow the movement or rotation of the pointer; and display control means 106 for displaying on the display device 3 the pointer generated by the pointer generation means 103 and the object generated by the object generation means 105.
- The system control device 1 is a device, such as the PC, that activates and operates software or controls other devices in accordance with input information from the input device 2, and is provided with processing control means 107 for controlling processing such as software activation, and storage means 108 for storing data used in the processing by the processing control means 107.
- The input information acquisition means 101 passes the acquired information to the processing control means 107 and causes the system control device 1 to execute processing corresponding to the information. Therefore, in addition to the pointer and the object, the display control means 106 is also a means that can display on the display device 3 the contents of the processing being executed by the system control device 1 (processing control means 107) and the results of the processing.
- The input device 2 includes, for example, a pen-shaped operation means (input pen) held by the operator who operates the pointer and the object, and a detection means having a detection surface for detecting information such as the position of the pen tip of the input pen, the pressure applied to the pen tip (writing pressure), and the inclination, azimuth, and rotation angle of the input pen.
- Assuming that a Cartesian coordinate system XYZ corresponding to the three-dimensional space represented on the display device 3 is taken on the detection surface of the detection means, with the XY plane of the Cartesian coordinate system XYZ as the detection surface, the detection means detects, when the pen tip 201P of the input pen comes into contact with the detection surface (XY plane), information such as the coordinates (x, y) of the pen tip 201P, the azimuth α of the axis 201X of the input pen housing (for example, 0 degrees ≤ α < 360 degrees), the inclination β (for example, 0 degrees ≤ β ≤ 90 degrees), the rotation γ around the axis (for example, 0 degrees ≤ γ < 360 degrees), and the writing pressure.
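- To make the detected quantities concrete, here is a minimal sketch of the per-sample information the detection means could report; the class and field names are hypothetical illustrations, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class PenSample:
    """One report from the detection surface (hypothetical structure)."""
    x: float          # pen-tip coordinate on the detection surface (XY plane)
    y: float
    pressure: float   # writing pressure, normalized here to 0.0..1.0
    azimuth: float    # alpha: direction of the pen axis, 0 <= alpha < 360 degrees
    tilt: float       # beta: inclination of the pen, 0 <= beta <= 90 degrees
    roll: float       # gamma: rotation around the pen axis, 0 <= gamma < 360 degrees
```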
- A configuration of the input device 2 capable of detecting information such as the coordinates of the pen tip 201P, the azimuth α of the axis 201X of the input pen housing, the inclination β, the rotation γ around the axis, and the writing pressure can be easily inferred from, and easily realized based on, the contents described in Reference 1 (Yuji Mitani, "Basics and Applications of Touch Panel," Techno Times, Inc., 2001.) and Reference 2 (catalog of intuos2 manufactured by WACOM Co., Ltd.). However, the angle of the rotation γ around the axis 201X of the input pen housing cannot be obtained with the structures described in Reference 1 or Reference 2.
- However, as shown in FIG. 3, by adding, inside the input pen 201, another coil 201B for detecting the rotation γ around the axis in addition to the coil 201A of the coordinate indicator described in Reference 1 and the writing pressure sensing unit 201C, and obtaining the change in the magnetic flux linking both coils 201A and 201B to calculate the amount of rotation, such detection is easily conceivable and realizable by those skilled in the art.
- Note that the input pen 201 used in the three-dimensional pointing method of the present embodiment does not have to have the mechanism for detecting the angle of the rotation γ around the axis shown in FIG. 3.
- The input device 2 is not limited to a device in which the input pen and the detection means are separated, such as a pen tablet or a combination of a touch panel and a stylus pen; it may be an input device, such as a pen-shaped mouse, in which the detection means is incorporated in the housing of the input pen.
- The display device 3 may be any display device that can represent a three-dimensional space: for example, a two-dimensional display device that projects and displays a three-dimensional object on a two-dimensional plane, such as a CRT display or a liquid crystal display, or a display device capable of displaying a three-dimensional stereoscopic image, such as an HMD (Head Mounted Display) or a DFD (Depth Fused 3D; details of the DFD will be described later). That is, the display device 3 may be any display device on which the operator can perceive the displayed pointer and object three-dimensionally.
- The display device 3 and the detection means of the input device 2 can also take an integrated form (for example, see JP-A No. 5-073208). When a pen tablet is used as the input device 2, the detection means can be integrated with the display device 3 by overlapping it with the display surface of the display device 3. Similarly, a form in which a touch panel and a stylus pen are combined can also be applied. In this way, the operator can perform pointing by bringing the input pen into contact with the display surface of the display device 3, such as a liquid crystal display, which allows more intuitive operation than when the detection means and the display device 3 are separated.
- However, this does not limit the configurations of the detection means of the input device 2 and the display device 3; like a general pen tablet, the detection means and the display device 3 do not have to be physically integrated.
- FIGS. 4A to 7 are schematic diagrams for explaining the three-dimensional pointing method of the embodiment 1-1 according to the present invention.
- FIG. 4A is a front view showing an example of the three-dimensional space represented on the display device,
- and FIG. 4B is a bird's-eye view showing an example of the three-dimensional space represented on the display device.
- FIGS. 5A, 5B, and 5C are bird's-eye views showing the state in the three-dimensional space when operated with the input pen,
- and FIGS. 6A, 6B, and 6C are a front view and a right side view showing the state in the three-dimensional space when operated with the input pen, respectively.
- FIG. 7 is a flowchart showing the processing procedure of the three-dimensional pointing method of the embodiment 1-1.
- Note that FIGS. 6A, 6B, and 6C correspond to FIGS. 5A, 5B, and 5C, respectively.
- The three-dimensional pointing method of Example 1-1 is a method of pointing at an object in the depth direction as viewed from the operator in the three-dimensional space by changing the writing pressure of the input pen 201.
- an electromagnetic induction pen tablet is used as the input device 2, and a liquid crystal display is used as the display device 3 that can display the three-dimensional space. Further, the detection means (digitizer) of the input device 2 is overlapped with the display surface of the liquid crystal display 3, and the pointing can be performed by directly operating the input pen on the display screen. Further, it is assumed that the input device 2 and the display device 3 are connected to the system control device 1 configured as shown in FIG.
- In Example 1-1, as shown in FIGS. 4A and 4B, a coordinate system XYZ corresponding to the coordinate system XYZ shown in FIG. 1 is assumed to be set in the three-dimensional space 301 expressed on the liquid crystal display 3, and the object 302 is arranged at a position of z < 0 in the three-dimensional space 301. Further, it is assumed that the operator who operates the input pen 201 of the input device 2 observes the XY plane of the three-dimensional space 301 from the direction of z > 0.
- In addition, the XY plane at z = 0 in the three-dimensional space 301, that is, the surface closest to the operator, is the display surface of the liquid crystal display, and is assumed to also serve as the detection surface of the detection means of the input device 2.
- In this case, when the operator brings the pen tip 201P of the input pen 201 into contact with the display surface of the liquid crystal display 3, the detection means overlapped with the display surface detects the position (coordinates), writing pressure, and the like of the pen tip 201P.
- Then, the system control device 1 acquires, by the input information acquisition means 101, information such as the position (coordinates) and writing pressure of the pen tip 201P detected by the detection means, and, by the pointer position/rotation angle calculation means 102 and the pointer generation means 103, generates a pointer to be displayed at the position in the three-dimensional space 301 corresponding to the position where the pen tip 201P is in contact (for example, a position on the extension line of the axis of the input pen in the three-dimensional space). When the pointer is generated, a pointer display signal is sent from the display control means 106 to the display device 3, and, for example, as shown in FIGS. 5A and 6A, a pointer 303 reflecting the position and writing pressure of the pen tip 201P is displayed in the three-dimensional space 301 expressed on the display device.
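- As one possible reading of this step, the sketch below places the pointer tip on the extension line of the pen axis, with its reach into the scene scaled by the writing pressure; the function name, the max_depth scale, and the axis conventions are assumptions for illustration, not the patent's own formulation.

```python
import math

def pointer_tip(x, y, pressure, azimuth_deg, tilt_deg, max_depth=10.0):
    """Place the pointer tip on the extension of the pen axis into the scene.

    The pen contacts the display at (x, y, 0); the reach into the scene grows
    linearly with writing pressure (pressure in 0..1, max_depth is an assumed
    scale). tilt_deg is the pen's inclination above the XY plane.
    """
    length = pressure * max_depth
    a = math.radians(azimuth_deg)
    t = math.radians(tilt_deg)
    # Unit vector of the pen-axis extension into the screen (z < 0 is deeper).
    dx = math.cos(a) * math.cos(t)
    dy = math.sin(a) * math.cos(t)
    dz = -math.sin(t)
    return (x + length * dx, y + length * dy, length * dz)

# Example: half pressure, pen pointing "north-east" at 60 degrees inclination.
print(pointer_tip(4.0, 2.0, 0.5, azimuth_deg=45.0, tilt_deg=60.0))
```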
- Further, the display of the pointer 303 is updated continuously, almost simultaneously with the operator's operation of increasing the writing pressure on the input pen 201, so that the operator can feel as if the pointer extends in the three-dimensional depth direction (z < 0) according to the applied writing pressure. Further, although not shown in the figure, when the pointer 303 reaches the object 302, the object generation means 105 performs processing such as changing the color of the object 302 and switching the display of the object 302 displayed in the three-dimensional space 301, whereby the operator can be notified that the object 302 in the three-dimensional depth direction (z < 0) has been successfully pointed at.
- After the pointer 303 is displayed extended in the 3D depth direction (z < 0), when the writing pressure is lowered, the pointer 303 may be returned to a depth position reflecting the lowered writing pressure, or may be fixed at the depth position reached before the pressure was lowered.
- In order to realize such a three-dimensional pointing method, the system control device 1 may execute the processing from step 401 to step 406 as shown in FIG. 7.
- the display control means 106 displays the pointer 303 and the object 302 in the three-dimensional space 301 expressed on the display device (liquid crystal display) 3 (step 401).
- the pointer 303 is displayed at an arbitrary position.
- the input information acquisition unit 101 is in a state in which information detected by the detection unit of the input device 2 can be acquired.
- the detection means detects the position (coordinates), writing pressure, and the like of the pen tip 201P.
- the input information acquisition means 101 acquires information such as the detected position (coordinates) and writing pressure of the pen tip 201P (step 402). In the three-dimensional pointing method of Embodiment 1-1 of the present invention, it is only necessary to obtain the position (coordinates) and writing pressure information of the pen tip 201P.
- However, in addition to the position (coordinates) of the pen tip 201P and the writing pressure, the detection means can also detect the azimuth α and inclination β of the input pen 201, the rotation γ around the axis, and the like. Therefore, information on the azimuth α, inclination β, and rotation γ around the axis of the input pen 201 may be acquired together with the information on the position (coordinates) of the pen tip 201P and the writing pressure.
- Next, the pointer position/rotation angle calculation means 102 uses, out of the acquired information, the position (coordinates) of the pen tip 201P and the writing pressure information to calculate the position, orientation, length, and the like of a pointer reflecting this information (step 403). In Example 1-1, the position (coordinates) on the XY plane of the three-dimensional space represented on the display device 3 corresponding to the position (coordinates) of the pen tip 201P and a length proportional to the writing pressure are calculated.
- After step 403, the pointer generation means 103 generates a pointer based on the calculation result of the pointer position/rotation angle calculation means 102, information about the generated pointer is sent from the display control means 106 to the display device 3, and the pointer is displayed in the three-dimensional space 301 (step 404).
- Further, the pointing determination means 104 determines whether or not there is an object being pointed at at the position corresponding to the position (coordinates) on the XY plane of the three-dimensional space and the depth position calculated by the pointer position/rotation angle calculation means 102 (step 405). At this time, if there is no pointed object, only the display control of the pointer 303 is performed, and the process returns to step 402 and waits until the next input information is acquired.
- If there is a pointed object, the object generation means 105 generates, for example, an object in which the color of the pointed object is changed, information about the generated object is sent from the display control means 106 to the display device 3, and the object is displayed in the three-dimensional space 301 (step 406). Thereafter, the process returns to step 402 and waits until the next input information is acquired.
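- Steps 401 to 406 can be pictured as the following event loop; all callback names are hypothetical stand-ins for the means described above (input information acquisition, pointer generation, pointing determination, object generation), sketched under the linear pressure-to-depth assumption of Example 1-1.

```python
def pointing_loop(get_sample, hit_test, draw_pointer, highlight, max_depth=10.0):
    """Skeleton of steps 401-406 (hypothetical callback names).

    get_sample()  -> (x, y, pressure) or None while the pen is not in contact
    hit_test(p)   -> the object at 3D point p, or None
    draw_pointer(p), highlight(obj) -> display-side callbacks
    """
    while True:
        sample = get_sample()              # step 402: acquire pen-tip information
        if sample is None:
            continue                       # wait for the next input
        x, y, pressure = sample
        depth = -pressure * max_depth      # step 403: writing pressure -> depth (z < 0)
        pointer_pos = (x, y, depth)
        draw_pointer(pointer_pos)          # step 404: display the pointer
        obj = hit_test(pointer_pos)        # step 405: is an object being pointed at?
        if obj is not None:
            highlight(obj)                 # step 406: e.g. change the object's color
```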
- As described above, in the three-dimensional pointing method of Embodiment 1-1 of the present invention, information on the position (coordinates) and writing pressure of the pen tip 201P of the input pen 201 is acquired, the position (coordinates) on the XY plane of the three-dimensional space represented on the display device 3 corresponding to the position (coordinates) of the pen tip 201P and the depth position corresponding to the writing pressure are calculated, and a pointer to be displayed at the calculated position and depth position is generated and displayed, whereby an arbitrary point in the three-dimensional space 301 represented on the display device 3 can be pointed at.
- In addition, since a general pen tablet is used as the input device 2 and the pointing position in the depth direction of the pointer 303 can be changed while the pen tip 201P of the input pen 201 remains in contact with the detection means, operator fatigue can be reduced.
- Further, if the detection means of the input device 2 is overlapped with the display surface of the display device (liquid crystal display) 3, the operator can operate the input pen on the display surface. By doing so, a visual effect is obtained as if the pointer 303 were a part of the pen tip 201P of the input pen 201, which facilitates accurate pointing at the object 302 and enables intuitive pointing.
- In Example 1-1, a pointer 303 whose length changes according to the writing pressure of the input pen 201 is displayed, but the pointer is not limited to this; any change is possible as long as the 3D depth direction (z < 0) can be pointed at, such as a pointer whose shape changes in the 3D depth direction or a pointer whose tilt changes in the 3D depth direction. In addition, the length of the pointer may be proportional to the writing pressure, or may be proportional to a power or a root of the writing pressure.
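- The alternative length mappings mentioned here can be captured in a single function; the exponent and max_depth parameters and the normalization to 0..1 are illustrative assumptions.

```python
def depth_from_pressure(pressure, max_depth=10.0, exponent=1.0):
    """Map writing pressure (0..1) to pointer depth (illustrative variants).

    exponent == 1.0 gives the linear (proportional) mapping; exponent > 1
    (a power) gives fine control at shallow depths, while exponent < 1
    (a root) reaches deep targets with less pressure.
    """
    clamped = max(0.0, min(1.0, pressure))
    return clamped ** exponent * max_depth

# The same half pressure under three mappings: linear, squared, square root.
for e in (1.0, 2.0, 0.5):
    print(e, depth_from_pressure(0.5, exponent=e))
```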
- Further, in Example 1-1, an electromagnetic induction pen tablet is used as the input device 2, and the case where the detection means (digitizer) of the pen tablet is superimposed on the display surface of the display device (liquid crystal display) 3 has been described as an example; however, the configuration is not limited to this, and the two may be in different positions.
- Further, the combination of the input device 2 and the display device 3 has been described by taking the electromagnetic induction pen tablet and the liquid crystal display as an example, but it is not limited to this; for example, it may be a combination of a touch panel and a stylus pen as used in a PDA or the like.
- FIGS. 8A to 8D are schematic diagrams for explaining a modification of the three-dimensional pointing method of the embodiment 1-1; FIGS. 8A, 8B, 8C, and 8D respectively show shapes of pointers to be displayed.
- In Example 1-1, a flat arrow-shaped pointer 303a as shown in FIG. 8A is used as the pointer 303, but the shape of the pointer 303 is not limited to this; any shape may be used as long as the pointing position is visually clear.
- Examples of the shape of such a pointer include a three-dimensional arrow-shaped pointer 303b in which a cylinder is connected to the bottom of a cone as shown in FIG. 8B, a cone-shaped pointer 303c as shown in FIG. 8C, and a hand-shaped pointer 303d with the index finger pointing at the object as shown in FIG. 8D.
- In addition, in Example 1-1, the point at which the pointer 303 points is the tip of the arrow-shaped pointer (the tip of the arrow), but the pointing point may be any part of the pointer, and may be set to a part other than the tip of the pointer.
- Further, a folder-shaped object has been taken as an example of the object 302, but the object 302 is not limited to this and may have any shape.
- In addition, the origin of the 3D space can be anywhere, and the coordinate system does not need to be Cartesian; for example, a cylindrical coordinate system or a spherical coordinate system may be used.
- FIGS. 9A to 10C are schematic diagrams for explaining the three-dimensional pointing method of Example 1-2 according to the present invention.
- FIGS. 9A, 9B, and 9C are bird's-eye views showing the state in the three-dimensional space when operated with the input pen,
- and FIGS. 10A, 10B, and 10C are a front view and a right side view showing the state in the three-dimensional space when operated with the input pen, respectively. Note that FIGS. 10A, 10B, and 10C correspond to FIGS. 9A, 9B, and 9C, respectively.
- The three-dimensional pointing method of Example 1-2 is a method of pointing, from various directions, at an object in the depth direction as viewed from the operator in the three-dimensional space by changing the direction of the input pen 201.
- the input device 2 and the display device 3 are assumed to use an electromagnetic induction type pen tablet and a liquid crystal display, respectively, as in the case of Example 1-1. Further, it is assumed that the detection means (digitizer) of the pen tablet 2 is superimposed on the display surface of the liquid crystal display 3.
- In Example 1-2, the system control device 1 acquires, by the input information acquisition means 101, information such as the position (coordinates) of the pen tip 201P of the input pen 201 detected by the detection means (digitizer) and the azimuth α and inclination β of the input pen 201, and generates a pointer to be displayed at the corresponding position in the three-dimensional space 301 (for example, a position on the extension line of the axis of the input pen in the three-dimensional space).
- When the pointer is generated, a pointer display signal is sent from the display control means 106 to the display device 3, and, for example, a pointer 303 as shown in FIGS. 9A and 10A is displayed.
- Next, when the operator, keeping the writing pressure almost constant, changes the direction of the input pen 201 to, for example, the direction shown in FIGS. 9B and 10B or FIGS. 9C and 10C, the pointer position/rotation angle calculation means 102 and the pointer generation means 103 calculate a new direction of the pointer from the new azimuth α and inclination β of the input pen 201 and generate a pointer pointing in that direction. When the newly generated pointer display signal is sent from the display control means 106 to the display device 3, the pointer 303 as shown in FIGS. 9B and 10B or FIGS. 9C and 10C is displayed.
- The display of the pointer 303 is updated continuously, almost simultaneously with the operator's operation of changing the direction of the input pen 201, so that the operator can feel as if the pointer 303 is displayed tilted on the extension line of the pen tip 201P in the direction in which the pen is tilted.
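- A sketch of how the new pointer direction could be derived from the reported azimuth α and inclination β, with the pointer pivoting about the fixed pen-tip contact point while the length (constant writing pressure) stays the same; the axis conventions match the earlier sketch and are assumptions, not the patent's formulation.

```python
import math

def pointer_direction(azimuth_deg, tilt_deg):
    """Unit vector of the pen-axis extension into the scene for a pen pose.

    While the writing pressure (and hence the pointer length) is held
    constant, only this direction changes as the operator re-orients the
    pen, so the pointer appears to pivot about the pen-tip contact point.
    """
    a = math.radians(azimuth_deg)
    t = math.radians(tilt_deg)
    return (math.cos(a) * math.cos(t),
            math.sin(a) * math.cos(t),
            -math.sin(t))

# The same contact point under two pen orientations, fixed length L.
tip, L = (4.0, 2.0, 0.0), 5.0
for az, tl in [(30.0, 60.0), (120.0, 45.0)]:
    dx, dy, dz = pointer_direction(az, tl)
    print((tip[0] + L * dx, tip[1] + L * dy, tip[2] + L * dz))
```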
- In addition, when there is a pointed object, the object generation means 105 performs processing such as changing the color of the object and switching the display of the object 302 displayed in the three-dimensional space 301, whereby the operator can be informed that the object in the three-dimensional depth direction (z < 0) has been pointed at.
- Further, the pointer position/rotation angle calculation means 102 and the pointer generation means 103 may, for example, calculate not only the direction of the pointer but also a length proportional to the writing pressure as described in Example 1-1, and generate a pointer reflecting the calculation result.
- In order to realize such a three-dimensional pointing method, the system control device 1 has only to execute the processing from step 401 to step 406 as shown in FIG. 7; since the basic processing is the same as in Example 1-1, a detailed description is omitted.
- However, in step 402, in addition to the position (coordinates) of the pen tip 201P and the writing pressure, it is necessary to acquire information on the azimuth α, inclination β, and rotation γ around the axis of the input pen 201,
- and in step 403, in addition to the position (coordinates) on the XY plane of the three-dimensional space represented on the display device 3 corresponding to the position (coordinates) of the pen tip 201P and the length proportional to the writing pressure, the direction of the pointer corresponding to the azimuth α and inclination β of the input pen 201 is calculated.
- As described above, in the three-dimensional pointing method of Example 1-2, in addition to the information on the position (coordinates) and writing pressure of the pen tip 201P of the input pen 201, information on the azimuth α, inclination β, and rotation γ around the axis of the input pen 201 is acquired, the position (coordinates) on the XY plane of the three-dimensional space represented on the display device 3 corresponding to the position (coordinates) of the pen tip 201P and the direction and length of the pointer are calculated, and a pointer based on the calculation result is generated and displayed, whereby an arbitrary point in the three-dimensional space 301 represented on the display device 3 can be pointed at.
- In addition, since a general pen tablet is used as the input device 2 and the pointing position in the depth direction of the pointer 303 can be changed while the pen tip 201P of the input pen 201 remains in contact with the detection means, operator fatigue can be reduced.
- Further, if the detection means of the input device 2 is overlapped with the display surface of the display device (liquid crystal display) 3, the operator can operate the input pen on the display surface. By doing so, a visual effect is obtained as if the pointer 303 were a part of the pen tip 201P of the input pen 201, which facilitates accurate pointing at the object 302 and enables intuitive pointing.
- In addition, the tilt, azimuth, and rotation of the pointer are not limited to changing in proportion to the tilt, azimuth, and rotation of the input pen 201; any of them may instead be proportional to a power or a root of the tilt, azimuth, or rotation of the input pen 201.
- Further, in Example 1-2, an electromagnetic induction pen tablet is used as the input device 2, and the case where the detection means (digitizer) of the pen tablet is superimposed on the display surface of the display device (liquid crystal display) 3 has been described as an example; however, the configuration is not limited to this, and the two may be in different positions. In addition, the combination of the input device 2 and the display device 3 is not limited to the electromagnetic induction pen tablet and the liquid crystal display taken as an example; it may be, for example, a combination of a touch panel and a stylus pen as used in a PDA or the like.
- Further, the shape of the pointer may be any shape as long as the pointing position is visually clear: for example, the flat arrow-shaped pointer 303a shown in FIG. 8A, a three-dimensional arrow-shaped pointer 303b in which a cylinder is connected to the bottom of a cone as shown in FIG. 8B, a conical pointer 303c as shown in FIG. 8C, or a hand-shaped pointer 303d with the index finger pointing at the object as shown in FIG. 8D. In addition, the point at which the pointer 303 points is the tip of the arrow-shaped pointer (the tip of the arrow), but the pointing point may be any part of the pointer and may be set to a part other than the tip of the pointer.
- Further, a folder-shaped object has been taken as an example of the object 302, but the object 302 is not limited to this and may have any shape.
- FIGS. 11 to 13 are schematic diagrams for explaining the three-dimensional pointing method of Example 1-3 according to the present invention; FIG. 11 is a diagram showing the state in the three-dimensional space when operated with the input pen, FIG. 12 is a front view showing the state in the three-dimensional space when operated with the input pen, and FIG. 13 is a flowchart showing the processing procedure of the three-dimensional pointing method of the embodiment 1-3.
- In Example 1-1 and Example 1-2, the pointing methods related to the display control of the pointer and of the pointed object in the three-dimensional space 301 represented on the display device 3 in accordance with the operation of the input pen 201 of the input device 2 have been described.
- In Embodiment 1-3, it is assumed that the input device 2 and the display device 3 use an electromagnetic induction pen tablet and a liquid crystal display, respectively, as in Embodiment 1-1. Further, it is assumed that the detection means (digitizer) of the pen tablet 2 is overlapped with the display surface of the liquid crystal display 3.
- In addition, the method of pointing at the object in the three-dimensional space 301 is the same as that described in the embodiment 1-1 and the embodiment 1-2, and therefore its explanation is omitted.
- After the object 302 is pointed at, as shown in FIGS. 11 and 12, by the same method as in Example 1-1 or Example 1-2, the operator who has confirmed the pointing performs an operation of holding the object 302, such as pressing the button 201D provided on the input pen 201. Then, when the input pen 201 is moved while the button 201D of the input pen 201 is pressed and the pen tip 201P of the input pen 201 is in contact with the display surface (the detection surface of the detection means) of the display device 3, the object 302 moves in the three-dimensional space 301 following the movement of the input pen 201, as shown in FIGS. 11 and 12. In this way, the object can be moved from its original position to a target position in the three-dimensional space.
- At this time, the input information acquisition means 101 of the system control device 1 acquires, from the detection means (digitizer) of the input device 2, the information indicating that the button 201D is pressed, together with the detection information of the position (coordinates) of the pen tip of the input pen 201, the writing pressure, and the azimuth α, inclination β, and rotation γ around the axis of the input pen 201. Since the button 201D is pressed, the system control device 1 can know that the operator is performing an operation of moving the object 302. Therefore, if the pointing determination means 104 and the object generation means 105 generate an object that follows the movement of the input pen 201 and display it on the display device 3, the moving operation of the object 302 as described above becomes possible.
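- The grab-and-drag behaviour described here can be summarized as a small state machine; the sample fields, state dictionary, and callback names are hypothetical illustrations, not part of the patent.

```python
def handle_sample(sample, state, move_object, stop_drag):
    """Minimal sketch of the button-hold drag of Example 1-3.

    sample: dict with 'pos' (3D pointer position, computed as sketched
    earlier), 'button' (True while 201D is held), and 'target' (the pointed
    object or None). state: mutable dict carrying drag status between samples.
    """
    if sample["button"]:
        if not state.get("dragging") and sample["target"] is not None:
            state["dragging"] = True            # button pressed on an object: grab it
            state["object"] = sample["target"]
        if state.get("dragging"):
            move_object(state["object"], sample["pos"])  # follow the pen
    elif state.get("dragging"):
        stop_drag(state["object"])              # button released: end the move
        state["dragging"] = False
```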
- In order to realize such an operation, the system control device 1 may execute the processing from step 401 to step 410 as shown in FIG. 13.
- The processing from step 401 to step 406 is the processing up to the pointing of the object 302 in the three-dimensional space 301, and is the same as the processing from step 401 to step 406 shown in FIG. 7. Since the processing from step 401 to step 406 is the same as that described in the embodiment 1-1 or the embodiment 1-2, detailed description thereof is omitted.
- After the object is pointed at in step 406 and the color of the object is changed and displayed, the system control device 1 does not return to step 402 but determines whether or not the input information corresponding to the operation of moving the object, such as the pressing of the button 201D, has been acquired (step 407). When such input information has been acquired, the input information acquisition means 101 acquires the information from the input pen (step 408).
- The information acquired at this time includes the position (coordinates) of the pen tip 201P of the input pen 201, the writing pressure, and the azimuth α, inclination β, and the like of the input pen 201 described in the embodiments 1-1 and 1-2.
- Next, the pointer position/rotation angle calculation means 102 calculates the position, orientation, length, and the like of the pointer based on the acquired information, and also calculates the position and orientation of the object (step 409). Since the calculation of the position, orientation, length, and the like of the pointer is as described in Example 1-1 or Example 1-2, detailed description is omitted. As for the position and orientation of the object, for example, from the relative positional relationship between the reference position of the object at the time of pointing and the pointer, a position and orientation that preserve that relative relationship with respect to the pointer position calculated in step 409 are calculated.
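- One way to read this calculation: record the object-to-pointer offset once at grab time, then reapply it to every new pointer position; the helper names are illustrative.

```python
def grab_offset(object_pos, pointer_pos):
    """Record the object-to-pointer offset at the moment of pointing."""
    return tuple(o - p for o, p in zip(object_pos, pointer_pos))

def dragged_object_pos(pointer_pos, offset):
    """Step 409: keep the stored relative relationship as the pointer moves."""
    return tuple(p + d for p, d in zip(pointer_pos, offset))

# Usage: the offset is computed once at grab time, then reapplied per sample.
offset = grab_offset((5.0, 3.0, -2.0), (4.0, 3.0, -1.0))
print(dragged_object_pos((6.0, 4.0, -3.0), offset))  # -> (7.0, 4.0, -4.0)
```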
- Then, the pointer generation means 103 generates a pointer based on the calculated position, orientation, and length of the pointer, the object generation means 105 generates an object based on the calculated position and orientation of the object, and the display signals are sent from the display control means 106 to the display device 3 to display the pointer and the object (step 410).
- When the pointer and the object are displayed in step 410, the process returns to step 407, and if the button 201D of the input pen 201 is continuously pressed, the processing from step 408 to step 410 is repeated. Then, when the operator releases the button 201D, the moving operation of the pointer and the object ends.
- As described above, according to the three-dimensional pointing method of Example 1-3, the pointed object can be translated in accordance with the movement of the input pen.
- In addition, if the detection means of the input device 2 is overlapped with the display surface of the display device (liquid crystal display) 3, the operator can operate the input pen on the display surface. By doing so, a visual effect is obtained as if the pointer 303 were a part of the pen tip 201P of the input pen 201, which facilitates accurate pointing at the object 302 and enables intuitive pointing.
- Further, in Example 1-3, an electromagnetic induction pen tablet is used as the input device 2, and the case where the detection means (digitizer) of the pen tablet is superimposed on the display surface of the display device (liquid crystal display) 3 has been described as an example; however, the configuration is not limited to this, and the two may be in different positions. In addition, the combination of the input device 2 and the display device 3 is not limited to the electromagnetic induction pen tablet and the liquid crystal display taken as an example; it may be, for example, a combination of a touch panel and a stylus pen as used in a PDA or the like.
- Further, the shape of the pointer may be any shape as long as the pointing position is visually clear: for example, the flat arrow-shaped pointer 303a shown in FIG. 8A, a three-dimensional arrow-shaped pointer 303b in which a cylinder is connected to the bottom of a cone as shown in FIG. 8B, a conical pointer 303c as shown in FIG. 8C, or a hand-shaped pointer 303d with the index finger pointing at the object as shown in FIG. 8D. In addition, the point at which the pointer 303 points is the tip of the arrow-shaped pointer (the tip of the arrow), but the pointing point may be any part of the pointer and may be set to a part other than the tip of the pointer.
- Further, in Example 1-3 of the present invention, a folder-shaped object has been taken as an example of the object 302, but the object 302 is not limited to this and may have any shape.
- Further, in Example 1-3, the object is moved by moving the input pen 201 while pressing the button 201D of the input pen 201, but the method is not limited to this; for example, the object may be moved by a method such as moving the input pen 201 while pressing a specific key or another switch.
- FIGS. 14A to 15C are schematic diagrams for explaining the three-dimensional pointing method of the embodiment 1-4 according to the present invention.
- FIGS. 14A, 14B, and 14C are bird's-eye views showing the state in the three-dimensional space when operated with the input pen,
- FIG. 15A, FIG. 15B, and FIG. 15C are a front view and a right side view showing a state in the three-dimensional space when operated with an input pen, respectively.
- Note that FIGS. 15A, 15B, and 15C correspond to FIGS. 14A, 14B, and 14C, respectively.
- In Example 1-3, the method in which the object 302 in the three-dimensional space 301 represented on the display device 3 is pointed at and the pointed object 302 is then translated has been described. In Example 1-4, a method of rotating the pointed object 302 or tilting it in the depth direction will be described.
- the input device 2 and the display device 3 are respectively an electromagnetic induction type pen tablet and a liquid crystal display, as in the case of Example 1-1. Further, it is assumed that the detection means (digitizer) of the pen tablet 2 is superimposed on the display surface of the liquid crystal display 3.
- In addition, the method of pointing at the object in the three-dimensional space 301 is the same as that described in the embodiment 1-1 or the embodiment 1-2, and therefore its explanation is omitted.
- After the object 302 is pointed at, for example, as shown in FIGS. 14A and 15A, by the same method as in the embodiment 1-1 or the embodiment 1-2, when the direction of the input pen 201 is changed while the button 201D of the input pen 201 is pressed and the pen tip 201P of the input pen 201 is in contact with the display surface (the detection surface of the detection means) of the display device 3, the object 302 follows the change in the direction of the input pen 201 and tilts in the depth direction in the 3D space 301. In this way, the object 302 can be tilted in any direction in the three-dimensional space 301.
- At this time, the input information acquisition means 101 of the system control device 1 acquires, from the detection means (digitizer) of the input device 2, the information indicating that the button 201D is pressed, together with the detection information of the position (coordinates) of the pen tip of the input pen 201, the writing pressure, and the azimuth α, inclination β, and rotation γ around the axis of the input pen 201. The system control device 1 can thereby know that the operator is performing an operation of moving the object 302. Therefore, if the pointing determination means 104 and the object generation means 105 generate an object that follows the change in the direction of the input pen 201 and display it on the display device 3, the moving operation of the object 302 as described above becomes possible.
- the system control apparatus 1 may execute the processing from step 401 to step 410 as shown in Fig. 13, for example.
- an object pointing operation and a tilting operation in the depth direction are possible.
- As described above, according to the three-dimensional pointing method of Example 1-4, the pointed object can be rotated within the same XY plane or tilted in the depth direction in accordance with the change in the direction of the input pen 201.
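- As an illustration of this follow behaviour, the sketch below maps a change in pen azimuth to an in-plane rotation of the object and a change in pen tilt to an inclination in the depth direction; the angle names and the gain parameter are assumptions for illustration.

```python
def follow_pen_orientation(obj_yaw, obj_pitch, d_azimuth, d_tilt, gain=1.0):
    """Apply a pen orientation change to the held object (illustrative only).

    A change in pen azimuth rotates the object within the XY plane (yaw);
    a change in pen tilt inclines it in the depth direction (pitch).
    gain = 1.0 is direct proportionality; other gains, powers, or roots are
    equally possible, as the text notes for the pointer in Example 1-2.
    """
    return (obj_yaw + gain * d_azimuth) % 360.0, obj_pitch + gain * d_tilt

# The operator swings the pen 15 degrees in azimuth and 10 degrees in tilt.
print(follow_pen_orientation(350.0, 5.0, d_azimuth=15.0, d_tilt=10.0))
```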
- In addition, in Example 1-4, if the detection means of the input device 2 is superimposed on the display surface of the display device (liquid crystal display) 3, the operator can operate the input pen on the display surface. By doing so, a visual effect is obtained as if the pointer 303 were a part of the pen tip 201P of the input pen 201, which facilitates accurate pointing at the object 302 and enables intuitive pointing.
- Further, in Example 1-4, an electromagnetic induction pen tablet is used as the input device 2, and the case where the detection means (digitizer) of the pen tablet is superimposed on the display surface of the display device (liquid crystal display) 3 has been described as an example; however, the configuration is not limited to this, and the two may be in different positions. In addition, the combination of the input device 2 and the display device 3 is not limited to the electromagnetic induction pen tablet and the liquid crystal display taken as an example; it may be, for example, a combination of a touch panel and a stylus pen as used in a PDA or the like.
- Further, the shape of the pointer may be any shape as long as the pointing position is visually clear: for example, the flat arrow-shaped pointer 303a shown in FIG. 8A, a three-dimensional arrow-shaped pointer 303b in which a cylinder is connected to the bottom of a cone as shown in FIG. 8B, a conical pointer 303c as shown in FIG. 8C, or a hand-shaped pointer 303d with the index finger pointing at the object as shown in FIG. 8D. In addition, the point at which the pointer 303 points is the tip of the arrow-shaped pointer (the tip of the arrow), but the pointing point may be any part of the pointer and may be set to a part other than the tip of the pointer.
- Further, a folder-shaped object has been taken as an example of the object 302, but the object 302 is not limited to this and may have any shape.
- Further, in Example 1-4, the object is operated by changing the direction of the input pen 201 while pressing the button 201D of the input pen 201, but the method is not limited to this; for example, the object may be operated by a method such as moving the input pen 201 while pressing a specific key or another switch. In this case, in step 407 shown in FIG. 13, it is determined whether or not the input information corresponding to the operation of moving the object has been acquired.
- In Example 1-4, the operation method for rotating the pointed object 302 within the same XY plane or tilting it in the depth direction has been described; by adding the operation method for parallel movement described in Example 1-3, a wider variety of object operations becomes possible.
- FIGS. 16A to 20 are schematic diagrams for explaining the three-dimensional pointing method of Embodiments 1-5 according to the present invention.
- FIGS. 16A and 16B show the principle of the display device (DFD) used in Embodiment 1-5 of the present invention.
- FIG. 17A is a front view, a right side view, and a bottom view showing an example of a three-dimensional space represented on the display device
- FIG. 17B is a bird's-eye view showing an example of the three-dimensional space represented on the display device
- FIGS. 18A, 18B, 18C, and 18D are bird's-eye views showing the state in the three-dimensional space when operated with the input pen,
- and FIGS. 19A, 19B, 19C, and 19D are diagrams showing the state in the three-dimensional space when operated with the input pen.
- FIG. 20 is a flowchart showing the processing procedure of the three-dimensional pointing method according to the embodiment 1-5 of the present invention.
- Note that FIGS. 19A, 19B, 19C, and 19D correspond to FIGS. 18A, 18B, 18C, and 18D, respectively.
- In Embodiments 1-3 and 1-4, three-dimensional pointing methods have been described in which the object 302 in the three-dimensional space 301 represented on the display device 3 is pointed at by the method described in Embodiment 1-1 or 1-2, and the pointed object 302 can subsequently be moved or rotated in the XY plane or tilted in the depth direction. However, after pointing at the object 302 in the three-dimensional space 301, the operator may want to continuously perform not only movement and rotation but also operations such as editing and deformation, which have conventionally been performed as two-dimensional GUI operations.
- With the methods described in Embodiments 1-3 and 1-4, when pointing at and operating, for example, an object located at the back (far side) of the three-dimensional space 301 as viewed from the operator, the operator must operate the input pen 201 while maintaining an increased pen pressure in order to keep the object pointed at. Therefore, in this Embodiment 1-5, a three-dimensional pointing method is described in which, after an object is pointed at, the pointed object is automatically moved to a position where two-dimensional GUI-like operations can be applied, and, after the operator has edited or deformed the target object, the object can be returned to a three-dimensional position desired by the operator.
- In this Embodiment 1-5, the pointing method and the subsequent object operation method are explained taking as an example the case where the same electromagnetic induction pen tablet as in each of the above embodiments is used as the input device 2, and a DFD is used as the display device 3 capable of expressing the three-dimensional space.
- The DFD is a display device in which two or more display surfaces are arranged so as to overlap each other when viewed from the observer (operator), for example, as shown in FIGS. 16A and 16B.
- The detailed configuration and operating principle of the DFD are described in, for example, the specifications of Japanese Patent No. 3022558 and Japanese Patent No. 3460671, so a detailed description is omitted here; only the basic operating principle of the DFD is explained.
- the pointer 302 and the object 303 displayed on the DFD are displayed on both the front display surface 3A and the back display surface 3B as seen from the operator, for example, as shown in FIG. 16A.
- When the DFD is of the luminance modulation type, if the object 303A on the front display surface 3A is displayed with luminance LA and the object 303B on the back display surface 3B is displayed with luminance LB, the object 303 is displayed at the depth position at which the ratio of the distance from the front display surface 3A to the distance from the back display surface 3B is LB : LA.
- Also, by continuously changing the luminance within the area of the displayed object, a single object 303 can be displayed tilted in the depth direction. In the example of FIGS. 16A and 16B, the luminance of the object 303A on the front display surface 3A increases from the top toward the bottom of the figure, while the luminance of the object 303B on the back display surface 3B increases from the bottom toward the top of the figure. Therefore, the operator can observe a three-dimensional object 303 whose upper side is tilted toward the back and whose lower side is tilted toward the front.
- When the DFD is of the transmission type, by adjusting the transparency of each point (pixel) in the area where the object 303A on the front display surface 3A is displayed, a stereoscopic image of the pointer 302 and the object 303 can likewise be displayed at an arbitrary depth position between the front display surface 3A and the back display surface 3B, as with the luminance modulation type DFD.
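- The following Python fragment is a minimal illustrative sketch of this luminance-modulation principle (the function and the loop are assumptions for illustration, not part of the patent): a pixel's luminance is split between the two surfaces so that the fused image appears at a desired normalized depth, and a linear gradient of that depth produces a tilted object as in FIG. 16B.
```python
# Sketch of the luminance-modulation principle: splitting a pixel's
# luminance between the front surface (LA) and back surface (LB) so the
# fused image appears at normalized depth d (0 = front, 1 = back), with
# distance ratio front:back = LB:LA as described above.
def split_luminance(luminance, depth):
    if not 0.0 <= depth <= 1.0:
        raise ValueError("depth must be in [0, 1]")
    front = luminance * (1.0 - depth)  # LA: bright when the image is near the front
    back = luminance * depth           # LB: bright when the image is near the back
    return front, back

# Tilted object as in FIG. 16B: the depth varies linearly from the top row
# (far) to the bottom row (near), so the front-surface luminance grows
# toward the bottom and the back-surface luminance grows toward the top.
for row in range(5):
    d = 1.0 - row / 4                  # row 0 deepest, row 4 nearest
    la, lb = split_luminance(1.0, d)
    print(f"row {row}: LA={la:.2f} LB={lb:.2f}")
```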
- When displaying a three-dimensional space on a general two-dimensional display, a process of projecting the three-dimensional space onto a two-dimensional plane is necessary; in the case of a three-dimensional display device such as the DFD, it is only necessary to set the luminance ratio of the points (pixels) on each display surface according to the position in the depth direction of the three-dimensional space, so the load on the display control apparatus can be reduced.
- Also, with a display in which the three-dimensional space is projected onto a two-dimensional plane, some operators cannot perceive it with the same feeling as real space, whereas with a three-dimensional display device such as the DFD, a pointing operation can be performed with a sense closer to that of real space. For these reasons, by using a three-dimensional display device such as the DFD, the operator can point at a three-dimensional depth more accurately and quickly than with a pointing operation on a general two-dimensional display.
- Also in the case of this Embodiment 1-5, the detection means (digitizer) of the input device (pen tablet) 2 can be superimposed on the display surfaces of the DFD, as described in the above embodiments. Further, in the case of the electromagnetic induction pen tablet, since there is a detectable range extending a certain height above the detection surface of the detection means, information such as the position, tilt, and orientation of the pen can be detected even when the pen tip of the input pen is not in contact with the detection surface. Therefore, even when there is a space between the display surfaces, as with a DFD, information such as the pen position, tilt, and orientation can be acquired if the detection means is placed behind the DFD display device.
- In such a case, the detection means is often arranged on the back side of the display surfaces; however, if the detection means is a transparent electrode, it can also be arranged on the front side of the display surfaces. By superimposing the detection means on the display surface of the DFD in this way, the input pen can be operated on the front display surface of the DFD, enabling direct pointing. Therefore, in this Embodiment 1-5 as well, it is assumed that the detection means of the electromagnetic induction pen tablet 2 and the display surfaces of the DFD are superimposed.
- Also, in this Embodiment 1-5, the DFD 3 is assumed to have two display surfaces, and, as shown in FIGS. 17A and 17B, a coordinate system XYZ corresponding to the coordinate system XYZ shown in FIG. 2 is set in the three-dimensional space 301 represented in the DFD 3, with the object 302 placed at a position of z < 0 in the three-dimensional space 301. In addition, it is assumed that the operator who operates the input pen 201 of the input device 2 observes the XY plane of the three-dimensional space 301 from the direction of z > 0.
- At this time, suppose that the operator points at the object 302 displayed three-dimensionally in the three-dimensional space 301 by the method described in Embodiment 1-1 or 1-2, as shown in FIGS. 18A and 19A. Then, when the operator, having confirmed from the change in the color of the object 302 that the object 302 is pointed at, presses the button 201D provided on the input pen 201 once, for example, the pointed object 302 is displayed as a two-dimensional object on the front display surface of the DFD and the pointer 303 disappears, as shown in FIGS. 18B and 19B. In FIGS. 18B and 19B, the object 302 is an object with no thickness in the z direction; however, even if the object 302 is a solid body with a thickness in the z direction, three-dimensional display using the DFD is not performed in this process, and the object is displayed as a two-dimensional projection image on the front display surface of the DFD.
- In this state, the object 302 can be handled by two-dimensional GUI operations, so the operator performs the desired operation, such as writing characters on the object 302, as an operation of the two-dimensional GUI.
- Then, when the two-dimensional GUI-like operation is finished and the button 201D of the input pen 201 is pressed again, the pointer 303 appears again, and, by the procedures described in Embodiments 1-3 and 1-4, the object 302 can be moved to a three-dimensional position desired by the operator or tilted in the depth direction.
- In order to realize such a three-dimensional pointing method, the system control device 1 executes the processing from step 401 to step 415 as shown in FIG. 20.
- Specifically, the system control device 1 first performs the processing from step 401 to step 406 described in Embodiments 1-1 and 1-2 to point at the object 302 in the three-dimensional space 301 represented on the display device 3. Then, while the object 302 pointed at in step 406 is displayed in a changed color, it is determined whether or not the operator has performed an operation for starting the two-dimensional GUI-like operation/editing of the pointed object 302, such as pressing the button 201D of the input pen 201 (step 411). If the operation for starting the two-dimensional GUI-like operation/editing processing has not been performed, the process returns to step 402 and waits until the next input information is acquired.
- If the starting operation has been performed, then through step 412 and step 413 the object 302 is brought into a state in which it can be manipulated and edited in a two-dimensional GUI manner.
- After that, a two-dimensional GUI-like operation/edit from the input pen 201 is received and executed (step 414).
- Next, it is determined whether or not the operator has performed the operation for ending the two-dimensional GUI-like operation/editing, such as pressing the button 201D of the input pen 201 again (step 415). At this time, if the operation for ending the two-dimensional GUI-like operation/editing processing has not been performed, the process returns to step 414 to receive and execute other two-dimensional GUI-like operation/editing processing.
- If the ending operation has been performed, the system returns to the state in which three-dimensional pointing operations are possible, and the pointed object 302 can be moved, rotated, or tilted (deformed) three-dimensionally.
- As described above, in this Embodiment 1-5, when a specific operation such as pressing the button 201D of the input pen 201 is performed, the mode for performing the three-dimensional pointing operation is switched to the mode for the two-dimensional GUI-like operation/editing processing of the object.
- In that mode, the information acquired by the input information acquisition means 101 of the system control device 1 is processed as information for performing the two-dimensional GUI-like operation/editing processing of the object.
- Therefore, when performing two-dimensional GUI-like operation/editing processing after pointing at an object located at the back as viewed from the operator, it is not necessary to maintain a state in which the pen pressure of the input pen 201 is increased. As a result, the operator's fatigue can be reduced.
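- As a rough sketch only (the class, event names, and helper stubs below are hypothetical, not the patent's API), the mode switch of steps 411 to 415 can be pictured as follows: one press of the button 201D moves the pointed object into a 2D-GUI editing state, and a second press returns to three-dimensional pointing.
```python
# Hypothetical sketch of the steps 411-415 mode switch.
class PointingModeController:
    def __init__(self):
        self.mode = "3d_pointing"   # or "2d_gui"
        self.pointed = None         # the currently pointed object, if any

    def on_button_201d(self):
        if self.mode == "3d_pointing" and self.pointed is not None:
            # steps 412-413: hide the pointer, show the object as 2D
            # on the front display surface
            self.mode = "2d_gui"
        elif self.mode == "2d_gui":
            # step 415: end the 2D GUI operation/editing
            self.mode = "3d_pointing"

    def on_pen_event(self, event):
        if self.mode == "2d_gui":
            self.handle_2d_gui(event)      # step 414: ordinary 2D GUI handling
        else:
            self.update_3d_pointer(event)  # steps 402-406: pointer control

    # Stubs keep the sketch self-contained; real handlers depend on the system.
    def handle_2d_gui(self, event): pass
    def update_3d_pointer(self, event): pass
```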
- Also, in this Embodiment 1-5, the case where the detection means (digitizer) of the electromagnetic induction pen tablet serving as the input device 2 is superimposed on the display surface of the DFD serving as the display device 3 capable of displaying a three-dimensional space has been given as an example; however, the configuration is not limited to this, and the two may be in different positions.
- Also, although the DFD is used as the display device 3 in this Embodiment 1-5, the display device is not limited to this; a display device such as the liquid crystal display described in Embodiments 1-1 to 1-4 may be used.
- Also, although an object with a two-dimensional quadrangular shape has been mentioned as an example of the target object 302, the object is not limited to this, and the object 302 may have any shape.
- Also, although the object is moved by moving the input pen 201 while pressing the button 201D of the input pen 201, the operation is not limited to this; the object may be moved by another method, such as moving the input pen 201 while pressing a specific key on a keyboard or another switch.
- Also, in this Embodiment 1-5, the example of directly editing the object 302, such as writing characters on it, has been given; however, if the object 302 is an object representing a file and a two-dimensional GUI operation is performed after pointing at it, the file opens, the operator edits its contents on the two-dimensional GUI, and after the file is closed, the object can be moved to a desired three-dimensional position.
- FIGS. 21A to 27 are schematic diagrams for explaining the three-dimensional pointing method of Embodiment 1-6 according to the present invention.
- FIG. 21A is a front view and a right side view showing an example of a three-dimensional space represented on the display device.
- Fig. 21B is a bird's-eye view showing an example of a three-dimensional space represented on the display device.
- FIGS. 22A, 22B, 22C, 23A, 23B, and 23C are bird's-eye views showing the state in the three-dimensional space when operated with the input pen.
- FIG. 26 is a flowchart showing the processing procedure of the three-dimensional pointing method of Embodiment 1-6
- FIG. 27 is a flowchart showing a modification of the processing procedure of the three-dimensional pointing method of Embodiment 1-6.
- Note that FIGS. 24A, 24B, and 24C correspond to FIGS. 22A, 22B, and 22C, respectively, and FIGS. 25A, 25B, and 25C correspond to FIGS. 23A, 23B, and 23C, respectively.
- The three-dimensional pointing method of Embodiment 1-6 is one application example of the three-dimensional pointing method described in Embodiment 1-5: after the object 302 in the three-dimensional space 301 is pointed at by the method described in Embodiment 1-1 or 1-2, the pointed object 302 is automatically moved to a position where the operator can easily operate it, that is, a position where the two-dimensional GUI-like operations described above can be applied, and the operator performs the desired editing and processing there. When the editing and processing are finished, the object 302 moves in the three-dimensional depth direction as viewed from the operator until it interferes with another object in the three-dimensional depth direction; this is a pointing method in which, after the interference, the state of the moved object is changed according to the attribute of the other object.
- In this Embodiment 1-6, as in Embodiment 1-5, the case where an electromagnetic induction pen tablet is used as the input device 2 and a DFD is used as the display device 3 is taken as an example to describe the pointing method and the object operation method.
- the detection means (digitizer) of the input device (pen tablet) 2 is provided integrally with the display surface of the display device (DFD) 3.
- As shown in FIGS. 21A and 21B, it is assumed that a coordinate system XYZ is set in the three-dimensional space 301 expressed between the two display surfaces of the DFD 3, and that the object 302 and the window 304 are arranged at positions of z < 0 in the three-dimensional space 301.
- At this time, an operator who wants to operate the object 302 first points at the object 302 by the method described in Embodiment 1-1 or 1-2. Then, when the operator, having confirmed from the change in the color of the object 302 that the pointing has been made, presses the button 201D provided on the input pen 201 once, for example, the pointed object 302 is displayed as a two-dimensional object on the front display surface of the DFD and the pointer 303 disappears, as shown in FIGS. 22B and 24B.
- In FIGS. 22B and 24B, the object 302 is an object with no thickness in the z direction; however, even if the object 302 is a solid object with a thickness in the z direction, three-dimensional display using the DFD is not performed in this process, and the object is displayed as a two-dimensional projection image on the front display surface of the DFD.
- When the operator has performed the desired editing and processing and then, for example, presses the button 201D of the input pen 201 once, the object 302 moves in the three-dimensional depth direction (z < 0) as viewed from the operator until it interferes with the window 304, as shown in FIGS. 23B and 25B. At this time, if an animation is shown in which the z coordinate of the object 302 in the three-dimensional depth direction decreases stepwise, the movement process is easily conveyed to the operator. When the object 302 interferes with the window 304, the operation of moving the object 302 onto the window is executed as an attribute of the window 304.
- In order to realize such a three-dimensional pointing method, the system control device 1 executes the processing from step 401 to step 406 and from step 411 to step 419 as shown in FIG. 26. Specifically, the system control device 1 first performs the processing from step 401 to step 406 described in Embodiments 1-1 and 1-2 to point at the object 302 in the three-dimensional space 301 represented on the display device 3.
- Next, it is determined whether or not the operator has performed an operation for starting the two-dimensional GUI-like operation/editing of the pointed object 302, such as pressing the button 201D of the input pen 201 (step 411). If the operation for starting the two-dimensional GUI-like operation/editing has not been performed, the process returns to step 402 and waits until the next input information is acquired.
- If the starting operation has been performed, then through step 412 and step 413 the object 302 is brought into a state in which it can be manipulated and edited in a two-dimensional GUI manner.
- After the two-dimensional GUI-like operation/editing processing becomes possible, a two-dimensional GUI-like operation/edit from the input pen 201 is received and executed (step 414). After the processing of step 414, it is determined whether or not the operator has performed the operation for ending the two-dimensional GUI-like operation/editing, such as pressing the button 201D of the input pen 201 again (step 415). At this time, if the operation for ending the two-dimensional GUI-like operation/editing processing has not been performed, the process returns to step 414 to receive and execute other two-dimensional GUI-like operation/editing processing.
- If the ending operation has been performed, the object 302 moves in the three-dimensional depth direction, and the system control device 1 determines whether or not there is another object, such as the window 304, that interferes with the object 302 (step 417). If there is an interfering object, the object 302 stops moving in the three-dimensional depth direction at the time of the interference, and the attribute that the interfering object has is executed on the object 302 (step 418).
- If there is no interfering object, the object 302 is moved to and displayed at a predetermined depth position, for example, the same depth position as before the processing from step 411 onward was performed (step 419).
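- As an illustrative sketch only (the object and scene model below are assumptions, not the patent's data structures), the depth movement with interference handling of steps 417 to 419 can be pictured as follows: the object recedes stepwise; if another object interferes, its attribute is executed; otherwise the original depth is restored.
```python
# Hypothetical sketch of the stepwise depth movement with interference check
# (steps 417-419). Objects are dicts with a z coordinate and an optional
# attribute callback.
def recede(obj, others, original_z, step=0.05):
    z = 0.0                                   # start at the front surface
    while z > original_z:
        z -= step                             # stepwise animation of the z coordinate
        obj["z"] = z
        for other in others:                  # step 417: interference check
            if abs(other["z"] - z) < step and other.get("attribute"):
                other["attribute"](obj)       # step 418: execute the attribute
                return
    obj["z"] = original_z                     # step 419: restore the original depth

# Example: a window whose attribute takes the object in.
window = {"z": -0.4, "attribute": lambda o: print("object moved onto window")}
recede({"z": 0.0}, [window], original_z=-0.8)
```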
- In the processing procedure described above, the object 302 is moved in the depth direction automatically after step 415, that is, after returning from the state in which two-dimensional GUI operations are possible to the state in which three-dimensional pointing operations are possible; however, the method is not limited to this, and the operator may move the object 302 in the depth direction by operating the input pen 201 using the methods described in Embodiments 1-3 and 1-4.
- In that case, as shown in FIG. 27, the system control device 1 may execute the processing from step 401 to step 415, step 417, step 418, step 420, and step 421.
- The processing from pointing at the object 302 in the three-dimensional space 301 represented in the DFD 3 to performing the two-dimensional GUI-like operations (step 401 to step 406 and step 411 to step 415) may be the same as the processing procedure shown in FIG. 26, so its description is omitted.
- After step 415, that is, after confirming an operation for returning from the state in which two-dimensional GUI operations are possible to the state in which three-dimensional pointing operations are possible, such as pressing the button 201D of the input pen 201 once, the object 302 is pointed at by the procedure described in Embodiments 1-1 and 1-2 (step 402' to step 406').
- After the object 302 is pointed at, it is next checked whether or not the button 201D of the input pen 201 has been pressed (step 407).
- If it has been pressed, the processing from step 408 to step 410 described in Embodiments 1-3 and 1-4 is subsequently performed in accordance with the operation of the input pen 201, and the object 302 is moved, rotated, or deformed three-dimensionally.
- While performing the processing from step 408 to step 410, the system control device 1 checks whether or not there is another object that interferes with the object 302 while the object 302 is being moved, rotated, or deformed three-dimensionally (step 417). If there is an interfering object, for example, the color of the interfering object is changed and displayed (step 420), and it is confirmed whether or not the button 201D of the input pen 201 is still being pressed (step 421). If the button 201D is being pressed, the processing from step 408 is continued, and the three-dimensional movement, rotation, or deformation of the object 302 continues; if the button 201D has been released, the attribute that the interfering object has is executed on the object 302 (step 418).
- By such a procedure, the pointed object 302 is automatically moved to a position where the operator can easily operate it, the target editing and processing are performed with conventional two-dimensional GUI operations, and, after the operations are finished, the state of the moved object 302 can be changed according to the attribute of the other object with which it interferes.
- Also in this Embodiment 1-6, when a specific operation such as pressing the button 201D of the input pen 201 is performed, the mode for performing the three-dimensional pointing operation is switched to the mode for the two-dimensional GUI-like operation/editing processing of the object.
- In that mode, the information acquired by the input information acquisition means 101 of the system control device 1 is processed as information for performing the two-dimensional GUI-like operation/editing processing of the object.
- Therefore, when performing two-dimensional GUI-like operation/editing processing after pointing at an object located at the back as viewed from the operator, it is not necessary to maintain a state in which the pen pressure of the input pen 201 is increased. As a result, the operator's fatigue can be reduced.
- Also, in this Embodiment 1-6, the case where the detection means (digitizer) of the electromagnetic induction pen tablet serving as the input device 2 is superimposed on the display surface of the DFD serving as the display device 3 capable of displaying a three-dimensional space has been given as an example; however, the configuration is not limited to this, and the two may be in different positions.
- Also, although the DFD is used as the display device 3, the display device is not limited to this; a display device such as the liquid crystal display described in Embodiments 1-1 to 1-4 may be used.
- Also, although an object with a two-dimensional quadrangular shape has been taken as an example of the target object 302, the object is not limited to this, and the shape of the object 302 may be any shape.
- Also, although the object is moved by moving the input pen 201 while pressing the button 201D of the input pen 201, the operation is not limited to this; the object may be moved by another method, such as moving the input pen 201 while pressing a specific key on a keyboard or another switch. In that case, in step 407, step 411, and step 415 shown in FIG. 20, it is sufficient to determine whether or not the input information corresponding to each operation has been acquired.
- Also, in this Embodiment 1-6, the example of directly editing the object 302, such as writing the letter "B" as shown in FIG. 22C, has been given; however, if the object 302 is an object representing a file and a two-dimensional GUI operation is performed after pointing at it, the file opens, the operator edits its contents on the two-dimensional GUI, and after the file is closed, the object can be moved to a desired three-dimensional position.
- Also, in this Embodiment 1-6, the window 304 has been given as an example of the other object, and the case of the attribute of moving a file onto the window when the object interferes with the window 304 has been described; however, the other object is not limited to a window.
- FIGS. 28A to 32C are schematic diagrams for explaining a method of deleting an object by the three-dimensional pointing method of Embodiment 1-6. FIG. 28A is a front view showing an example of the three-dimensional space represented on the display device, and FIG. 28B is a bird's-eye view showing an example of the three-dimensional space represented on the display device.
- FIGS. 29A, 29B, 30A, 30B, and 30C are bird's-eye views showing the state in the three-dimensional space when operated with the input pen, and FIGS. 31A, 31B, 32A, 32B, and 32C are front views showing the state in the three-dimensional space when operated with the input pen.
- Note that FIGS. 31A and 31B correspond to FIGS. 29A and 29B, respectively, and FIGS. 32A, 32B, and 32C correspond to FIGS. 30A, 30B, and 30C, respectively.
- In the same manner as simply moving the object 302 onto the window 304, the object 302 can also be moved to a trash can object and deleted. Here, as shown in FIGS. 28A and 28B, the procedure for deleting the object 302 will be described taking as an example the case where a coordinate system XYZ is set in the three-dimensional space 301 expressed between the two display surfaces of the DFD 3, and the object 302 and the trash can 305 are arranged at positions of z < 0 in the three-dimensional space 301.
- At this time, the operator who wishes to delete the object 302 first operates the input pen 201 to point at the object 302. Then, with the object 302 to be deleted pointed at, when a specific operation such as pressing the button 201D of the input pen 201 is performed once, the pointed object 302 moves to the front display surface and changes to a state in which two-dimensional GUI operations are possible, as shown in FIGS. 29B and 31B.
- Here, when the operator moves the pointed object 302 over the trash can 305 and presses the button 201D of the input pen 201 once again, the state in which two-dimensional GUI operations are possible returns to the state in which three-dimensional pointing is possible. In the case of the procedure shown in FIG. 26, the object 302 then automatically moves in the three-dimensional depth direction (z < 0); when the object 302 interferes with the trash can 305, the display of the object 302 disappears and the trash can 305 switches to a display showing trash (the object) inside it, as shown in FIGS. 30B and 32B.
- On the other hand, in the case where the processing performed by the system control device 1 follows the procedure shown in FIG. 27, after returning to the state in which three-dimensional pointing is possible, the operator moves the input pen 201 to move the object 302 in the three-dimensional depth direction. Then, when the object 302 interferes with the trash can, the display of the object 302 disappears and the trash can 305 switches to a display showing trash (the object) inside it, as shown in FIGS. 30C and 32C.
- In this way, the other object with which the object 302 interferes is not limited to a window or a trash can; any object having an attribute that can be executed on the object 302 may be used.
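- For illustration only (the attribute callbacks and object kinds below are hypothetical examples, not from the patent), such attributes can be pictured as per-object handlers executed when the moved object interferes with them:
```python
# Hypothetical sketch of per-object attributes: a window takes the moved
# object in, a trash can deletes it.
def window_attribute(obj, scene):
    print(f"{obj} is moved onto the window")

def trash_attribute(obj, scene):
    scene.remove(obj)                       # the object's display disappears
    print("trash can is shown with trash inside")

ATTRIBUTES = {"window": window_attribute, "trash": trash_attribute}

def on_interference(other_kind, obj, scene):
    handler = ATTRIBUTES.get(other_kind)
    if handler is not None:
        handler(obj, scene)                 # execute the interfering object's attribute

# Example: deleting object 302 by dropping it on the trash can.
scene = ["object302"]
on_interference("trash", "object302", scene)
```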
- FIGS. 33 to 36C are schematic diagrams for explaining the three-dimensional pointing method of Embodiment 1-7 according to the present invention. FIGS. 33 and 34 are diagrams showing configuration examples of the input pen used in the three-dimensional pointing method of Embodiment 1-7, and FIGS. 35A, 35B, 35C, 36A, 36B, and 36C are views (front views and right side views) showing the state in the three-dimensional space when operated with the input pen. Note that FIGS. 36A, 36B, and 36C correspond to FIGS. 35A, 35B, and 35C, respectively.
- In this Embodiment 1-7, a pointing method will be described in which the three-dimensional pointing methods described in Embodiments 1-1 to 1-6 are performed using an input pen whose pen tip 201P is pushed into the housing of the input pen 201 in accordance with the pen pressure.
- As shown in FIG. 33, the input pen 201 used in this Embodiment 1-7 has a spring 201E inside the housing, and has a structure in which, when the operator applies pressure to the input pen 201, the pressure detection means 201F senses the pressure from the repulsive force of the spring 201E.
- However, the configuration of the input pen 201 is not limited to the configuration shown in FIG. 33; for example, as shown in FIG. 34, it may be a structure in which a pneumatic piston 201G is used and the applied pressure is sensed by an air pressure sensing unit 201H.
- The pressure detected by the input pen 201 having the structure shown in FIG. 33 or FIG. 34 is processed by the system control device 1, and, for example, a pointer 303 with a length proportional to the pressure applied by the operator is displayed on the display device 3.
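- As a minimal sketch (the normalization and the maximum length are assumed values, not from the patent), the mapping from the sensed pen pressure to the displayed pointer length can be as simple as a clamped proportional function:
```python
# Hypothetical proportional mapping from pen pressure to pointer length.
MAX_PRESSURE = 1.0   # full pen pressure, normalized (assumption)
MAX_LENGTH = 200.0   # pointer length in pixels at full pressure (assumption)

def pointer_length(pressure):
    pressure = max(0.0, min(pressure, MAX_PRESSURE))  # clamp sensor noise
    return MAX_LENGTH * pressure / MAX_PRESSURE       # length proportional to pressure
```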
- In this Embodiment 1-7, as an example of pointing in the three-dimensional space represented on the display device 3 using the input pen 201 with the configuration shown in FIG. 33 or FIG. 34, the case of performing the pointing method described in Embodiment 1-1 will be described.
- At this time, it is assumed that an electromagnetic induction pen tablet and a liquid crystal display are used as the input device 2 and the display device 3, respectively, as in the case of Embodiment 1-1, and that the detection means (digitizer) of the input device 2 is superimposed on the display surface of the liquid crystal display 3.
- In this case, the size of the conical pointer 303 changes in accordance with how far the pen tip 201P is pushed into the housing of the input pen 201, so the visual effect that the pointer 303 is a part of the pen tip 201P is further enhanced compared with the method described in Embodiment 1-1.
- Also, in this Embodiment 1-7, input pens with mechanisms using a spring or air pressure, as shown in FIGS. 33 and 34, have been given as examples of the input pen 201 with a structure in which the pen tip is pushed in; however, an input pen with another mechanism may be used as long as the same effect can be obtained. For example, instead of providing the pressure detection means 201F or the air pressure sensing unit 201H inside the pen housing, a mechanism for measuring the amount of movement of the pen tip 201P may be provided, as long as a similar effect is obtained.
- Also, in this Embodiment 1-7, the case of performing the three-dimensional pointing method described in Embodiment 1-1 has been taken as an example; however, the input pen 201 with the configuration shown in FIG. 33 or FIG. 34 can also be used for the same pointing methods as described in Embodiments 1-2 to 1-6.
- FIGS. 37A to 40C are schematic diagrams for explaining the three-dimensional pointing method of Embodiment 1-8 according to the present invention. FIGS. 37A, 37B, 37C, 38A, 38B, and 38C are bird's-eye views showing the state in the three-dimensional space when operated with the input pen, and FIGS. 39A, 39B, 39C, 40A, 40B, and 40C are front views showing the state in the three-dimensional space when operated with the input pen. Note that FIGS. 39A, 39B, and 39C correspond to FIGS. 37A, 37B, and 37C, respectively, and FIGS. 40A, 40B, and 40C correspond to FIGS. 38A, 38B, and 38C, respectively.
- In this Embodiment 1-8, as a specific usage scene of the three-dimensional pointing methods described in Embodiments 1-1 to 1-6, an example is given of a remote control that has a display screen capable of three-dimensional display and built-in pen tablet detection means, and that is used to operate a music playback function.
- Note that the display control procedure for the pointer and the objects on the display screen of the remote control when the input pen 201 is operated may be the same as the procedures described in Embodiments 1-1 to 1-7, so a detailed description is omitted.
- In this Embodiment 1-8, the operator operates the objects displayed in the three-dimensional space 301 of the remote control using the input pen 201 whose pen tip 201P is pushed into the housing, as described in Embodiment 1-7. First, for example, as shown in FIGS. 37A and 39A, when the playback button 302a is pointed at with the input pen 201, the display switches to a state in which the playback button 302a is pressed, and playback of the music starts.
- Also, for example, when the input pen 201 is operated to point at the volume knob 302b, and the input pen 201 is moved so as to rotate the knob 302b while it is pointed at, the volume of the music being played can be raised or lowered.
- The operation for raising or lowering the volume is not limited to the operation shown in FIGS. 37C and 39C; for example, as shown in FIGS. 38A and 40A, the volume may be raised or lowered by pointing at the knob 302b and then rotating the input pen 201 about its axis to rotate the knob 302b.
- Further, for example, when an area 302c displaying information related to the music being played is pointed at with the input pen 201 as shown in FIGS. 38B and 40B, and the area 302c is switched to a two-dimensional display in which two-dimensional GUI operations are possible as shown in FIGS. 38C and 40C, then, in combination with a handwritten character recognition function, for example, the track number of the music to be played can be entered in the area 302c and playback can skip to the desired track number.
- As described above, in this Embodiment 1-8, an example of operating a music device with a remote control has been shown; however, the present invention is not limited to this, and it can also be applied to the operation of devices that can take a similar form, such as PDAs, mobile phones, kiosk terminals, and ATMs, allowing each device to be operated more intuitively.
- Also, in this Embodiment 1-8, the operations of playing music, raising the volume, and changing the track have been described; however, the operations are not limited to these, and any operation that can be associated with an operation of the input pen 201 is possible.
- Also, although handwriting recognition is used to input the track number, any method that can be realized with a two-dimensional GUI may be used; for example, the track numbers may be displayed in a pull-down menu, and a track number may be selected with the input pen 201.
- Also, the three-dimensional pointing device does not have to be a dedicated device specialized for realizing the three-dimensional pointing method; it can also be realized by, for example, a computer such as a PC and a three-dimensional pointing program that causes the computer to execute the three-dimensional pointing method described in each of the above embodiments.
- In this case, the three-dimensional pointing program may be recorded on any of magnetic, electrical, and optical recording media, as long as it is recorded in a state that can be read by the computer.
- Also, the three-dimensional pointing program can be provided not only by being recorded on such a recording medium but also through a network such as the Internet.
- The second embodiment corresponds to the first object of the present invention, as does the first embodiment. Whereas in the first embodiment the pointer is controlled in accordance with the pen pressure or tilt of the input pen, in the second embodiment the pointer is controlled in accordance with the contact time of the pen tip of the input pen or the amount of operation of the operating means provided on the input pen. The first embodiment and the second embodiment aim at the same goal but differ in the input pen operations for controlling the pointer, so the second embodiment will be described with reference to the drawings of the first embodiment where appropriate.
- The three-dimensional pointing method of the second embodiment is a method of using a pen-type input device (means) to point at an object in a three-dimensional space represented on a display device capable of three-dimensional display, and to operate the pointed object.
- The pen-type input device is, for example, a device such as a pen tablet, comprising an input pen (electronic pen) operated by the operator who points at and operates the object, and detection means for detecting information such as the presence or absence of contact of the pen tip of the input pen, its position, the direction of the pen axis, and the operation information of the operating means provided on the input pen.
- Based on the information detected by the detection means, the pointer is displayed in the three-dimensional space represented on the display device.
- In this way, the operator can point at an object in the three-dimensional space represented on the display device while keeping the pen tip of the input pen in contact with the detection surface of the detection means, and the operator's fatigue during long pointing and object operation sessions can be reduced.
- Also, by making the time during which the pen tip of the input pen remains in contact, or the operation state of the operating means of the input pen, correspond to the movement or deformation of the pointer in the depth direction, it is made possible to point at an arbitrary point in the three-dimensional space represented on the display device.
- In this way, the operator can feel as if the pointer displayed in the three-dimensional space represented on the display device is a part of the pen tip of the input pen, and can easily and intuitively point at a three-dimensional object.
- Also, by performing an operation of selecting or grasping the pointed object, the object can be edited and processed in a state in which two-dimensional GUI operations are applicable; after the processing (operation) such as editing and processing of the object is completed, the object is treated as a three-dimensional object again so that it can be moved to a three-dimensional position desired by the operator.
- In this way, the operation of a three-dimensional object can be realized with the same operations as a conventional two-dimensional GUI using an existing pen-type input device, so the operator does not need to newly learn, for example, a dedicated three-dimensional input pen operation for operating the object.
- FIG. 41 is a diagram showing a configuration example of a system that realizes the three-dimensional pointing method of the present embodiment.
- In FIG. 41, 1 is a system control device, 101 is input information acquisition means, 102 is pointer position/rotation angle calculation means, 103 is pointer generation means, 104 is pointing determination means, 105 is object generation means, 106 is display control means, 107 is processing control means, 108 is storage means, 2 is an input device, and 3 is a display device.
- FIG. 42 is a diagram showing a configuration example of an input pen used in the three-dimensional pointing method of the present embodiment.
- In FIG. 42, 201 is an input pen, 201A is a coil, 201B is a coil for detecting the rotation angle, and 201D is operating means (buttons). Of the operating means, 201D1 and 201D2 are seesaw-type buttons, wheels, or slide bars, and 201D3 is a push button. In the following explanation, unless otherwise specified, 201D1 and 201D2 are treated as seesaw-type buttons and 201D3 as a push button. However, the input pen 201 is not limited to the above mechanism as long as the same functions are performed.
- The three-dimensional pointing method of this embodiment is a pointing method that is preferably applied when, using a pen-type input device connected to a system control device such as a PC, a pointer in a three-dimensional space represented on a display device connected to the system control device is operated three-dimensionally, or an object pointed at with the pointer is operated three-dimensionally.
- As shown in FIG. 41, the system control device 1 comprises: input information acquisition means 101 for acquiring the input information input from the input device 2; input information processing means 109 for, when the input information acquired by the input information acquisition means 101 is information related to the control of the pointer, calculating quantities such as the contact time of the pen tip based on that input information; pointer position/rotation angle calculation means 102 for calculating the movement direction and amount, and the rotation direction and rotation angle, of the pointer based on the input information; pointer generation means 103 for generating the pointer based on the calculation result of the pointer position/rotation angle calculation means 102; pointing determination means 104 for determining whether or not there is an object pointed at by the pointer generated by the pointer generation means 103; object generation means 105 for, when there is a pointed object, for example, changing the color of the object or generating the object at a position and orientation that follow the movement or rotation of the pointer; and display control means 106 for displaying the pointer generated by the pointer generation means 103 and the object generated by the object generation means 105 on the display device 3.
- The system control device 1 is, for example, a device such as a PC that starts and operates software or controls other devices in accordance with the input information from the input device 2, and, as shown in FIG. 41, it is provided with processing control means 107 for controlling processing such as software activation, and storage means 108 in which data used in the processing by the processing control means 107 and the like are stored.
- In addition to the pointer and the object, the display control means 106 can cause the display device 3 to display the contents of the processing being executed by the system control device 1 (processing control means 107) and the results of that processing.
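- Purely as an illustrative sketch (all class and method names below are hypothetical; the patent defines only the functional means), one pass of input through the means 101 to 106 and 109 of FIG. 41 can be pictured as the following pipeline:
```python
# Hypothetical sketch of one processing pass through the means of FIG. 41.
class SystemController:
    def on_input(self, raw):
        info = self.acquire(raw)              # input information acquisition means 101
        timing = self.process(info)           # input information processing means 109
                                              #   (e.g. contact time, button press time)
        pose = self.calc_pose(info, timing)   # pointer position/rotation angle
                                              #   calculation means 102
        pointer = self.make_pointer(pose)     # pointer generation means 103
        target = self.hit_test(pointer)       # pointing determination means 104
        objects = self.make_objects(target)   # object generation means 105
        self.show(pointer, objects)           # display control means 106

    # Stubs keep the sketch self-contained; each embodiment fills them in.
    def acquire(self, raw): return raw
    def process(self, info): return 0.0
    def calc_pose(self, info, timing): return (0.0, 0.0, 0.0)
    def make_pointer(self, pose): return pose
    def hit_test(self, pointer): return None
    def make_objects(self, target): return []
    def show(self, pointer, objects): pass
```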
- The input device 2 is assumed to comprise, for example, an input pen (electronic pen) held by the operator who operates the pointer and the object, and detection means having a detection surface for detecting information such as whether or not the pen tip of the input pen is in contact, its position, the state of the operating means such as the buttons provided on the input pen, and the tilt, orientation, and rotation angle of the input pen.
- For example, a Cartesian coordinate system XYZ corresponding to the three-dimensional space represented on the display device 3 is taken on the detection surface of the detection means, with the XY plane of the coordinate system XYZ being the detection surface; when the pen tip 201P of the input pen comes into contact with the detection surface (XY plane), the detection means detects information such as whether or not the pen tip 201P is in contact and its position.
- The configuration of the input device 2 capable of detecting information such as the presence or absence of contact of the pen tip 201P of the input pen, the coordinates (x, y) of the contact position, the azimuth α and tilt β of the axis 201X of the input pen housing, and the rotation γ about the axis can easily be inferred and realized by those skilled in the art from, for example, Reference 1 (Yuji Mitani, "Basics and Applications of Touch Panel," Techno Times, 2001) and Reference 2 (the catalog of the intuos2 manufactured by WACOM Co., Ltd.).
- However, the angle of the rotation γ about the axis 201X of the input pen housing cannot be obtained with the structures described in Reference 1 or Reference 2.
- However, for example, as shown in FIG. 42, by adding inside the input pen 201, in addition to the coil 201A of the coordinate indicator described in Reference 1, another coil 201B for detecting the rotation γ about the axis, obtaining the changes in the magnetic flux linking each of the coils 201A and 201B, and calculating the amount of rotation, the detection of the rotation γ can easily be conceived and realized by those skilled in the art.
- However, the input pen 201 used in the three-dimensional pointing method of this embodiment does not have to be provided with a mechanism for detecting the angle of the rotation γ about the axis as shown in FIG. 42.
- Also, the input device 2 is not limited to a device in which the input pen and the detection means are separated, such as a pen tablet or a combination of a touch panel and a stylus pen; it may be an input device in which the detection means is incorporated in the housing of the input pen.
- The display device 3 may be any display device that can represent a three-dimensional space: for example, a two-dimensional display device that projects and displays a three-dimensional object on a two-dimensional plane, such as a CRT display or a liquid crystal display, or a display device capable of displaying a three-dimensional stereoscopic image, such as an HMD (Head Mounted Display) or a DFD. That is, the display device 3 may be any display device on which the operator can perceive the displayed pointer and object three-dimensionally.
- Also, the detection means of the input device 2 and the display device 3 can take an integrated form (see, for example, JP-A-5-073208); the detection means can be integrated with the display device 3 so as to overlap the display surface of the display device 3. As a similar form, a combination of a touch panel and a stylus pen can also be applied. In this way, the operator can point by bringing the input pen into contact with the display surface of the display device 3, such as a liquid crystal display, which allows more intuitive operation than when the detection means and the display device 3 are separated. However, this does not limit the configurations of the detection means of the input device 2 and the display device 3; as in a general pen tablet, the detection means and the display device 3 do not have to be physically integrated.
- Embodiment 2-1 is an example in which an object is pointed at and selected or grasped in the same manner as in Embodiment 1-1, and it will be described with reference to FIGS. 4A to 6C.
- The processing procedure of Embodiment 2-1 will also be described with reference to the flowchart. Note that FIGS. 5B and 6B contain an arrow indicating the direction in which pressure is applied to the input pen; this arrow does not apply to this embodiment.
- The three-dimensional pointing method of Embodiment 2-1 is a method of pointing three-dimensionally by using the seesaw-type buttons 201D1 and 201D2 with the pen tip 201P of the input pen 201 in contact with the detection surface of the detection means.
- In Embodiment 2-1, an electromagnetic induction pen tablet is used as the input device 2, and a liquid crystal display is used as the display device 3 capable of displaying the three-dimensional space. It is also assumed that the detection means (digitizer) of the input device 2 is superimposed on the display surface of the liquid crystal display 3, so that pointing, and selecting or grasping, can be performed by operating the input pen directly on the display screen. Further, it is assumed that the input device 2 and the display device 3 are connected to the system control device 1 configured as shown in FIG. 41.
- In Embodiment 2-1, as in Embodiment 1-1, it is assumed that, as shown in FIGS. 4A and 4B, a coordinate system XYZ corresponding to the coordinate system XYZ taken on the detection surface is set in the three-dimensional space 301 represented on the liquid crystal display 3, and that the object 302 is arranged at a position of z < 0 in the three-dimensional space 301. At this time, it is assumed that the operator who operates the input pen 201 of the input device 2 observes the XY plane of the three-dimensional space 301 from the direction of z > 0.
- At this time, when the operator brings the pen tip 201P of the input pen 201 into contact with the display surface of the liquid crystal display 3, for example, as shown in FIGS. 5A and 6A, the detection means superimposed on the display surface detects the position (coordinates) of the pen tip 201P, and a pointer 303 reflecting it is displayed in the three-dimensional space 301.
- Then, when the operator presses either of the seesaw-type buttons 201D1 and 201D2 while keeping the pen tip 201P of the input pen 201 in contact with the display surface of the liquid crystal display 3, a pointer with a shape corresponding to the time during which (or the number of times) the button is pressed is generated and displayed.
- For example, if the arrow-shaped pointer is lengthened in proportion to the time during which (or the number of times) the seesaw-type button 201D1 has been pressed, the pointer 303 becomes longer than the pointer shown in FIGS. 5A and 6A, as shown in FIGS. 5B and 6B.
- If the seesaw-type button 201D1 is pressed for an even longer time (or more times), the pointer 303 becomes still longer, as shown in FIGS. 5C and 6C.
- Conversely, if the seesaw-type button 201D2 is pressed, the pointer 303 is shortened in accordance with the time during which (or the number of times) it is pressed.
- The display of the pointer 303 is updated continuously (or in predetermined steps) almost simultaneously with the operator's operation of pressing the seesaw-type buttons 201D1 and 201D2 while the input pen 201 is in contact.
- the maximum and minimum values of the length of the pointer 303 may be set by the operator, or may be set in advance by the system.
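- The following is a minimal sketch of this length control (the rate and the limits are assumed settings, not values from the patent): pressing the seesaw-type button 201D1 lengthens the pointer, pressing 201D2 shortens it, and the result is clamped to the configured minimum and maximum.
```python
# Hypothetical seesaw-button length control for the pointer 303.
MIN_LENGTH, MAX_LENGTH = 10.0, 300.0  # operator- or system-set limits (assumed)
RATE = 80.0                           # pixels per second of button press (assumed)

def update_length(length, pressed, dt):
    """Advance the pointer length over one polling interval dt."""
    if pressed == "201D1":            # extend while 201D1 is held
        length += RATE * dt
    elif pressed == "201D2":          # retract while 201D2 is held
        length -= RATE * dt
    return max(MIN_LENGTH, min(length, MAX_LENGTH))
```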
- Then, although not shown, when the tip of the pointer 303 reaches the vicinity of the object 302 to be pointed at, the object generation means 105 performs processing such as changing the color of the object 302, thereby informing the operator that the object 302 is pointed at.
- When, in this state, the operator presses the push button 201D3 of the input pen 201, for example, the pointing determination means 104 of the system control device 1 determines that an operation for selecting or grasping the object 302 has been performed, and maintains the state in which the color of the object is changed.
- Note that when the pen tip 201P is released (lifted) from the detection surface, the pointer 303 may, for example, be shortened at a constant speed from the state immediately before the release according to the time since the pen tip 201P was released, or it may be kept fixed in the state immediately before the pen tip was released from the detection surface.
- In order to realize such a three-dimensional pointing method, the system control device 1 may execute the processing from step 401 to step 410.
- the display control means 106 displays the object 302 on the three-dimensional space 301 expressed in the display device (liquid crystal display) 3 (step 401).
- the input information acquisition means 101 is set in a state where the information detected by the detection means of the input device 2 can be acquired.
- When the operator brings the pen tip 201P of the input pen 201 into contact with the detection surface, the detection means detects the position (coordinates) and the like of the pen tip 201P, and the input information acquisition means 101 acquires information representing the state of the input pen 201, such as the detected position (coordinates) of the pen tip 201P (step 402).
- The detection means can also detect the azimuth α and tilt β of the input pen 201, the rotation γ about the axis, and the like, so information on the azimuth (α), tilt (β), and rotation about the axis (γ) of the input pen 201 may be acquired together with the information on the position (coordinates) of the pen tip 201P.
- Next, the input information processing means 109 determines whether either of the seesaw-type buttons 201D1 and 201D2 of the input pen 201 is pressed, and if one of them is pressed, the information is output to the pointer position/rotation angle calculation means 102. In this process, it is first determined whether or not the button 201D1 is pressed (step 403a), and if it is pressed, that information is output to the pointer position/rotation angle calculation means 102 (step 404a); if the button 201D1 is not pressed, it is then determined whether or not the button 201D2 is pressed (step 403b), and if it is pressed, that information is output to the pointer position/rotation angle calculation means 102 (step 404b).
- Then, using the information on the position (coordinates) of the pen tip 201P and the information on whether or not each button is pressed, the position, orientation, length, and the like of the pointer reflecting these pieces of information are calculated (step 405). In step 405, the position (coordinates) on the XY plane of the three-dimensional space 301 represented on the display device 3 corresponding to the position (coordinates) of the pen tip 201P, and a length proportional to the time during which (or the number of times) either of the seesaw-type buttons 201D1 and 201D2 was pressed, are calculated.
- After the calculation processing of step 405 is completed, the pointer generation means 103 generates a pointer with a shape based on the calculation result of the pointer position/rotation angle calculation means 102, and information on the generated pointer is sent from the display control means 106 to the display device 3 and displayed in the three-dimensional space 301 (step 406).
- Also, in parallel with the display of the pointer, the pointing determination means 104 determines whether or not there is an object being pointed at at the position corresponding to the position (coordinates) on the XY plane and the depth position of the three-dimensional space calculated by the pointer position/rotation angle calculation means 102 (step 407). The determination in step 407 is made, for example, as shown in FIG. 43, based on whether or not an object is within 10 pixels of the position at which the pointer is pointing. Although the range is 10 pixels in FIG. 43, it is not limited to this. Whether or not there is an object being pointed at can also be determined by the method described here in the first embodiment. If there is no object being pointed at, only the display control of the pointer 303 is performed.
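- The determination of step 407 itself is straightforward; the following sketch (with a hypothetical object representation) checks whether any object lies within the 10-pixel range of the position the pointer is pointing at, as in FIG. 43:
```python
# Sketch of the step-407 determination: an object counts as pointed at if it
# lies within 10 pixels of the point designated by the pointer tip.
import math

def find_pointed_object(tip, objects, radius=10.0):
    """tip and each object's 'pos' are (x, y, z) positions in pixels."""
    for obj in objects:
        if math.dist(tip, obj["pos"]) <= radius:
            return obj
    return None

# Example: the object at (125, 83, -42) is within 10 pixels of the tip.
print(find_pointed_object((120.0, 80.0, -40.0), [{"pos": (125.0, 83.0, -42.0)}]))
```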
- If there is an object being pointed at, the object generation means 105 generates, for example, an object in which the color of the pointed object is changed, and the display control means 106 sends information on the generated object to the display device 3 and displays it in the three-dimensional space 301 (step 408). Further, the pointing determination means 104 determines whether or not the push button 201D3 of the input pen 201 has been pressed (step 409). If the push button 201D3 has been pressed, the pointed object 302 can be selected or grasped (step 410).
- If the push button 201D3 is not pressed, the process returns to step 402 and the next input information is acquired.
- Also, when the processing from step 401 to step 406 has already been performed, that is, when the pointer is already displayed on the display device 3, the length of the pointer is updated in the calculation of step 405: for example, when information indicating that the button 201D1 is pressed is input from the input information processing means 109, the pointer position/rotation angle calculation means 102 increases the length of the pointer by a predetermined length, and when information indicating that the button 201D2 is pressed is input from the input information processing means 109, the pointer position/rotation angle calculation means 102 shortens the length of the pointer by a predetermined length. By repeatedly executing such a processing loop, a pointer with a length corresponding to the time during which (or the number of times) the seesaw-type buttons 201D1 and 201D2 have been pressed is displayed.
- As described above, in the three-dimensional pointing method of Embodiment 2-1, information on the position (coordinates) of the pen tip 201P of the input pen 201 on the detection surface of the detection means is acquired; the position (coordinates) on the XY plane in the three-dimensional space represented on the display device 3 corresponding to the position (coordinates) of the pen tip 201P, and the depth position corresponding to the time during which (or the number of times) the seesaw-type buttons 201D1 and 201D2 of the input pen 201 were pressed, are calculated; and a pointer pointing at the calculated position is generated and displayed, so that an arbitrary point in the three-dimensional space 301 represented on the display device 3 can be pointed at. Furthermore, by determining whether or not the push button 201D3 of the input pen 201 is pressed, the object pointed at by the pointer can be selected or grasped.
- Also, since a general pen tablet can be used as the input device 2 and the pointing position of the pointer 303 in the depth direction is changed while the pen tip 201P of the input pen 201 remains in contact with the detection means, operator fatigue can be reduced.
- Also, if the detection means of the input device 2 is superimposed on the display surface of the display device (liquid crystal display) 3, the operator can operate the input pen 201 on the display surface. In this way, a visual effect can be obtained as if the pointer 303 were a part of the pen tip 201P of the input pen 201, which facilitates accurate pointing of the object 302 and enables intuitive pointing.
- Also, in Embodiment 2-1, the seesaw-type buttons 201D1 and 201D2 of the input pen 201 are used; however, wheels or slide bars may be used instead, as long as the length of the pointer can be controlled in the same way.
- Also, in Embodiment 2-1, the case where an electromagnetic induction pen tablet is used as the input device 2 and the detection means (digitizer) of the pen tablet is superimposed on the display surface of the display device (liquid crystal display) 3 has been given as an example; however, the configuration is not limited to this, and the two may be in different positions. Further, although the combination of the electromagnetic induction pen tablet and the liquid crystal display has been taken as an example of the combination of the input device 2 and the display device 3, the combination is not limited to this; a combination of a touch panel and a stylus pen as used in a PDA or the like may also be used.
- Also, in Embodiment 2-1, whether or not the tip of the pointer 303 is in the vicinity of the target object 302 is determined by whether or not the object is within a range of 10 pixels from the tip of the pointer 303; however, this range can be arbitrarily set and changed by the system administrator or the operator.
- Also, in Embodiment 2-1, as in Embodiment 1-1, pointers of various shapes can be adopted, as shown in FIGS. 8A, 8B, 8C, and 8D. In Embodiment 2-1, for example, a flat arrow-shaped pointer 303a as shown in FIG. 8A is generated and displayed as the pointer 303.
- However, the shape is not limited to this; any shape may be used as long as the pointing position is visually clear. Conceivable examples of such pointer shapes include a three-dimensional arrow-shaped pointer 303b in which a cylinder is connected to the bottom of a cone as shown in FIG. 8B, a conical pointer 303c as shown in FIG. 8C, and a hand-shaped pointer 303d of a person pointing at an object with the index finger as shown in FIG. 8D. A polygonal-pyramid-shaped pointer similar to the conical pointer 303c shown in FIG. 8C may also be used.
- Also, although the point at which the pointer 303 points has been described as the tip of the arrow-shaped pointer (the tip of the arrow), the pointing point is not limited to this; it is also possible to make some other part of the pointer the pointing point.
- Also, the object 302 is not limited to the example described; the object 302 may have any shape.
- Also, in Embodiment 2-1, the coordinate system of the three-dimensional space represented on the display device 3 is set so that the display surface corresponds to z = 0, as shown in FIG. 4A; however, as long as the three-dimensional space can be expressed, the three-dimensional origin may be anywhere, and the coordinate system need not be a Cartesian coordinate system; for example, it may be a cylindrical or spherical coordinate system.
- In Embodiment 2-2, as in Embodiment 1-2, a method will be described in which, by changing the orientation of the input pen 201, an object in the depth direction as viewed from the operator in the three-dimensional space can be pointed at from various directions. The method will be described with reference to FIGS. 9A to 10C, and the processing procedure of this embodiment will also be described with reference to the flowchart.
- In Embodiment 2-2, it is assumed that an electromagnetic induction pen tablet and a liquid crystal display are used as the input device 2 and the display device 3, respectively, as in Embodiment 2-1. It is further assumed that the detection means (digitizer) of the pen tablet 2 is superimposed on the display surface of the liquid crystal display 3, and that the pen tablet 2 and the display device 3 are connected to the system control device 1 configured as shown in FIG. 41.
- In addition, the way the coordinate system of the three-dimensional space expressed on the liquid crystal display 3 is taken, the method of operating the input pen 201 of the pen tablet 2, and the like are as described in Example 2-1.
- In Example 2-2, in place of the seesaw-type buttons 201D and 201D and the push button 201D of the input pen 201 used in Example 2-1, an input method is used in which the object is pointed at on the basis of the operation of the pen itself and the pointed object can then be selected or grasped.
- First, the system control device 1 causes the input information acquisition means 101 to acquire information such as the position (coordinates) of the pen tip 201P of the input pen 201 detected by the detection means (digitizer), the azimuth α and inclination β of the input pen 201, and the rotation γ around the pen axis.
- Next, the input information processing means 109 calculates the time during which the pen tip 201P of the input pen 201 is in contact with the detection surface.
- Then, the input information processing means 109, the pointer position/rotation angle calculation means 102, and the pointer generation means 103 generate a pointer to be displayed at the position in the three-dimensional space 301 corresponding to the position where the pen tip 201P is in contact.
- a pointer display signal is sent from the display control means 106 to the display device 3, and for example, a pointer 303 as shown in FIG. 9B and FIG. 10B is displayed.
- When the operator changes the direction of the input pen 201, the pointer position/rotation angle calculation means 102 and the pointer generation means 103 calculate the new direction of the pointer from the new azimuth α, inclination β, and rotation γ around the axis of the input pen 201, and generate a pointer based on the calculation result. When the display signal of the newly generated pointer is sent from the display control means 106 to the display device 3, the pointer 303 as shown in FIGS. 9C and 10C is displayed.
- At this time, since the display of the pointer 303 is updated continuously and almost simultaneously with the operator's operation of changing the direction of the input pen 201, the operator can feel as if the pointer 303 displayed on the extension line of the pen tip 201P tilts in whichever direction the pen is tilted.
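- As a rough illustration of how the pointer pose can track the pen pose on every input event, the following sketch recomputes the pointer's direction from the pen's azimuth α, inclination β, and axial rotation γ. The data structures, field names, and proportionality constant are assumptions for illustration; the example does not prescribe a concrete implementation.

```python
from dataclasses import dataclass

@dataclass
class PenState:
    x: float          # pen-tip position on the detection surface
    y: float
    azimuth: float    # alpha: orientation in the XY plane (degrees)
    tilt: float       # beta: inclination from the surface (degrees)
    roll: float       # gamma: rotation around the pen axis (degrees)

@dataclass
class PointerPose:
    x: float
    y: float
    azimuth: float
    tilt: float
    roll: float

def pointer_from_pen(pen: PenState, k: float = 1.0) -> PointerPose:
    """Derive the pointer pose from the pen pose. With k = 1 the pointer
    direction changes in direct proportion to the pen direction, so the
    pointer appears as an extension of the pen tip."""
    return PointerPose(pen.x, pen.y, k * pen.azimuth, k * pen.tilt, k * pen.roll)

# Example: tilting the pen immediately tilts the pointer the same way.
print(pointer_from_pen(PenState(120.0, 40.0, 30.0, 60.0, 0.0)))
```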
- the length of the pointer 303 can be adjusted according to the presence / absence of contact between the pen tip 201P of the input pen 201 and the detection surface and the contact time.
- For example, when the operator brings the pen tip 201P of the input pen 201 into contact with the detection surface, releases it, and touches the detection surface again within a certain period of time (for example, within 0.5 seconds) (a tap operation), the length of the pointer 303 starts to increase. If the tap operation is performed again while the pointer 303 is extending, the extension of the pointer 303 stops, and the operator can handle it as a pointer having the length at that time.
- Further, if the tap operation is performed when the length of the pointer 303 is not changing, the pointer 303 can be extended further.
- On the other hand, when the operator performs the tap operation twice in succession (a double tap operation), the length of the pointer 303 starts to shrink. If the tap operation is then performed while the pointer 303 is contracting, the contraction of the pointer 303 stops, and the operator can handle it as a pointer having the length at that time. Further, if the double tap operation is performed when the length of the pointer 303 is not changing, the pointer 303 can be contracted further.
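- This extend/shrink behaviour can be read as a small state machine: a tap toggles between an idle and an extending state, a double tap toggles between idle and shrinking, and while the pen tip stays on the detection surface the length grows or shrinks with the contact time. The sketch below is one possible reading under assumed names and rates, not the embodiment's required implementation.

```python
IDLE, EXTENDING, SHRINKING = "idle", "extending", "shrinking"

class PointerLengthController:
    """Tracks a pointer-state variable (the variable s in the text) and
    updates the pointer length in proportion to the time the pen tip
    stays on the detection surface."""

    def __init__(self, length=50.0, rate=40.0):
        self.state = IDLE        # corresponds to the variable s
        self.length = length     # current pointer length (display units)
        self.rate = rate         # assumed growth/shrink rate per second

    def on_tap(self):
        # A tap starts extension, or stops whichever change is in progress.
        self.state = EXTENDING if self.state == IDLE else IDLE

    def on_double_tap(self):
        # A double tap starts contraction, or stops an ongoing change.
        self.state = SHRINKING if self.state == IDLE else IDLE

    def on_contact(self, dt):
        # Called while the pen tip is in contact; dt is elapsed seconds.
        if self.state == EXTENDING:
            self.length += self.rate * dt
        elif self.state == SHRINKING:
            self.length = max(0.0, self.length - self.rate * dt)

ctrl = PointerLengthController()
ctrl.on_tap()          # start extending
ctrl.on_contact(0.5)   # pen held down 0.5 s -> length grows by 20 units
ctrl.on_tap()          # stop; pointer keeps its current length
print(ctrl.length)
```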
- When the tip of the pointer 303 comes near the target object 302, the system control device 1 performs processing such as changing the color of the object by the object generation means 105 and switching the display of the object 302 on the display device 3, thereby informing the operator that the object 302 can be selected or grasped. If the state in which the length of the pointer 303 is unchanged and the color of the object 302 has changed so that it can be selected or grasped continues, for example, for 0.5 seconds or longer, the operator can select or grasp the object 302.
- At this time, the input information processing means 109, the pointer position/rotation angle calculation means 102, and the pointer generation means 103 can generate, for example, a pointer that reflects not only the direction of the pointer but also the length the pointer had immediately before the direction of the input pen 201 was changed.
- the system control apparatus 1 may execute processing as shown in FIG.
- First, the display control means 106 displays the object 302 in the three-dimensional space 301 expressed on the display device (liquid crystal display) 3, and a variable s indicating the state of the pointer is set.
- the input information acquisition means 101 is in a state where it can acquire the information detected by the detection means of the input device 2.
- When the operator brings the pen tip 201P into contact with the detection surface, the detection means detects the position (coordinates) of the pen tip 201P and the like, and the input information acquisition means 101 acquires information representing the state of the input pen 201, such as the detected position (coordinates) of the pen tip 201P (step 402).
- In addition to the information on the position (coordinates) of the pen tip 201P, information on the azimuth α, the inclination β, and the rotation γ around the axis of the input pen 201 is acquired.
- Next, in step 404c, the input information processing means 109 determines, from the variable s indicating the pointer state and the state of contact between the pen tip 201P of the input pen 201 and the detection surface, whether or not the length of the pointer is to be changed, and outputs that information.
- Then, using, among the acquired information, the position (coordinates) of the pen tip 201P and the value of the variable s indicating the state of the pointer, the pointer position/rotation angle calculation means 102 calculates the position, orientation, length, and the like of the pointer reflecting this information (step 405), and the pointer generation means 103 generates a pointer having a shape based on the calculation result of the pointer position/rotation angle calculation means 102. Information on the generated pointer is then sent from the display control means 106 to the display device 3 and displayed in the three-dimensional space 301 (step 406).
- In step 405, the position (coordinates) on the XY plane of the three-dimensional space 301 expressed on the display device 3 corresponding to the position (coordinates) of the pen tip 201P, and a length proportional to the time during which the pen tip 201P has been in contact with the detection surface after a tap operation or double tap operation was performed, are calculated.
- For example, when the variable s is in the state indicating extension, the length of the pointer increases in proportion to the time during which the pen tip 201P is in contact with the detection surface.
- the system control apparatus 1 performs the processing of step 407 and step 408 described in the embodiment 2-1 in parallel with the processing of step 406.
- In step 407, it is determined whether or not there is an object within 10 pixels of the tip of the pointer, in other words, of the position at which the pointer is pointing. Although the range is set to 10 pixels in FIG. 44, it is not limited to this. If there is no pointed object, only the display control of the pointer 303 is performed, and the process returns to step 402 to acquire the next input information.
- When there is a pointed object, the corresponding processing is performed, and the process then likewise returns to step 402 to acquire the next input information.
- Note that the duration of the state in which the object is pointed at is not limited to 0.5 seconds or longer and may be set to another time.
- The processing procedures shown in FIGS. 43 and 44 correspond to the processing procedure shown in FIG. 7 in the first embodiment; FIGS. 43 and 44 show the details of that processing.
- As described above, according to the three-dimensional pointing method of Example 2-2, it is not necessary to provide the input pen 201 with operation means such as a button for changing the depth position of the pointer or changing the pointing direction, so a conventional, general-purpose pen tablet and input pen can be used as the input device 2. Further, the pointing position of the pointer 303 in the depth direction can be changed while the pen tip 201P of the input pen 201 remains in contact with the detection means, which reduces the operator's fatigue.
- Also, in Example 2-2, if the detection means of the input device 2 is superimposed on the display surface of the display device (liquid crystal display) 3, the operator can operate the input pen on the display surface. This provides a visual effect as if the pointer 303 were a part of the pen tip 201P of the input pen 201, facilitates accurate pointing of the object 302, and makes intuitive pointing possible. [0316] Also, in Example 2-2, the direction, inclination, and rotation around the axis of the pointer change in proportion to the azimuth α, inclination β, and rotation γ around the axis of the input pen 201, respectively.
- However, the manner of change is not limited to this; any change may be used as long as, for example, the shape of the pointer changes in the three-dimensional depth direction or the tilt of the pointer changes in the three-dimensional depth direction.
- In addition, the present invention is not limited to changing the tilt, azimuth, and rotation of the pointer in proportion to the tilt, azimuth, and rotation of the input pen 201; they may instead be made proportional, for example, to a power or a root of the tilt, azimuth, and rotation of the input pen 201.
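- The relation between pen angle and pointer angle therefore need not be linear. A hedged sketch of such transfer functions, with assumed exponents and gains:

```python
def pointer_angle(pen_angle_deg, mode="linear", k=1.0, exponent=2.0):
    """Map a pen angle to a pointer angle. 'linear' is simple
    proportionality; 'power' and 'root' illustrate the alternative
    mappings mentioned in the text (exponent values are assumptions)."""
    a = abs(pen_angle_deg)
    sign = 1.0 if pen_angle_deg >= 0 else -1.0
    if mode == "power":
        return sign * k * (a ** exponent)
    if mode == "root":
        return sign * k * (a ** (1.0 / exponent))
    return k * pen_angle_deg

# A power mapping exaggerates large tilts; a root mapping compresses them.
print(pointer_angle(30.0, "linear"),
      pointer_angle(30.0, "power", k=0.05),
      pointer_angle(30.0, "root", k=10.0))
```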
- Also, in Example 2-2, an electromagnetic induction type pen tablet was used as the input device 2, and the case where the detection means (digitizer) of the pen tablet is superimposed on the display surface of the display device (liquid crystal display) 3 was given as an example; however, the configuration is not limited to this, and the two may be placed at different positions. Further, although the combination of an electromagnetic induction type pen tablet and a liquid crystal display was described as an example of the input device 2 and the display device 3, the combination is not limited to this; a combination of a touch panel and a stylus pen as used in a PDA or the like may be employed.
- Also, in Example 2-2, the presence or absence of contact between the pen tip 201P and the detection surface was taken as an example of the operation of the input pen 201; however, as long as the same effect is obtained, the length of the pointer 303 may instead be changed according to, for example, the time (or the number of times) a button of the input pen 201 is pressed, the amount and direction of rotation of a wheel, or the amount and direction of movement of a slide bar.
- In addition, the maximum and minimum values of the length of the pointer 303 may be set by the operator, or may be set by the system control device 1.
- Also, in Example 2-2, whether or not the pointer 303 (the pointing point of the pointer) is in the vicinity of the target object 302 is determined based on whether or not it is within a range of 10 pixels; however, this range can be arbitrarily set and changed by a system administrator or operator. Similarly, the system administrator or operator can arbitrarily set and change the length of the pointing time in step 409'.
- Also, in Example 2-2, the shape of the pointer may be any shape as long as the pointed position is visually clear; it is not limited to the flat arrow-shaped pointer 303a shown in FIG. 8A, and may be, for example, a three-dimensional arrow-shaped pointer 303b in which a cylinder is connected to the bottom of a cone as shown in FIG. 8B, a conical pointer 303c as shown in FIG. 8C, or a hand-shaped pointer 303d of a person pointing at an object with the index finger as shown in FIG. 8D.
- Also, although the tip of the arrow-shaped pointer (the point of the arrow) is used as the point at which the pointer 303 points, the pointing point is not limited to this; any other part of the pointer may be used as the pointing point.
- Also, although a folder-shaped object was taken as an example of the object 302, the object 302 is not limited to this and may have any shape.
- Also, in Example 2-2, attention was paid to the azimuth α, the inclination β, and the rotation γ around the axis of the input pen 201; however, it is not necessary to use all of the azimuth α, inclination β, and rotation γ of the input pen 201. For example, only the depth position may be changed, or only the azimuth α and the inclination β of the input pen 201 may be used.
- In Example 2-1 and Example 2-2, the pointing methods relating to the display control of the pointer in the three-dimensional space 301 expressed on the display device 3 in accordance with the operation of the input pen 201 of the input device 2, and to the display control of the pointed object, were described.
- In Example 2-3, it is assumed that an electromagnetic induction type pen tablet and a liquid crystal display are used as the input device 2 and the display device 3, respectively, as in Example 2-1 above. It is further assumed that the detection means (digitizer) of the pen tablet 2 is superimposed on the display surface of the liquid crystal display 3.
- In Example 2-3, the method of selecting or grasping the object after pointing at it in the three-dimensional space 301 is as described in Example 2-1 and Example 2-2; since that method may be used, its description is omitted.
- By the same method as in Example 2-1 or Example 2-2, for example, after the object 302 is pointed at as shown in FIGS. 11 and 12, the color of the object 302 changes. The operator, having confirmed from this that the object 302 is pointed at, performs the operation of selecting or grasping the object 302 using the method of Example 2-1 or Example 2-2.
- When the operator moves the input pen 201 while the object 302 is selected or grasped and the pen tip 201P of the input pen 201 remains in contact with the display surface of the display device 3 (the detection surface of the detection means), the object 302 moves in the three-dimensional space 301 following the movement of the input pen 201, as shown in FIGS.
- Then, when a predetermined operation is performed, such as pressing the push button 201D of the input pen 201 once, the position of the object 302 is fixed, and the object 302 remains displayed at that position even after the input pen 201 is moved or the pen tip 201P is lifted. In this way, the object 302 can be moved from its original position to a target position in the three-dimensional space.
- At this time, the input information acquisition means 101 of the system control device 1 acquires, from the detection means (digitizer) of the input device 2, not only the information necessary for the operation of selecting or grasping the object described in Example 2-1 and Example 2-2, but also, for example, the information that the push button 201D has been pressed. The system control device 1 can therefore know that the operator is performing the operation of moving the object 302. Accordingly, if the pointing determination means 104 and the object generation means 105 generate an object that follows the movement of the input pen 201 and display it on the display device 3, the moving operation of the object 302 as described above becomes possible.
- the system control apparatus 1 may execute a process as shown in FIG.
- the first process (step 420) in FIG. 45 is a process until the object 302 on the three-dimensional space 301 is selected or grasped, and has been described in the embodiment 2-1 or the embodiment 2-2. The detailed explanation is omitted.
- After the system control device 1 selects or grasps the object 302 by the procedure described in Example 2-1 or Example 2-2 (step 420), the input information acquisition means 101 acquires information on the pointer (step 421).
- The information acquired in step 421 is information such as the position (coordinates) of the pen tip 201P of the input pen 201 and the azimuth α, inclination β, and rotation γ around the axis of the input pen 201.
- Based on the acquired information, the input information processing means 109 and the pointer position/rotation angle calculation means 102 then calculate the position, orientation, length, and the like of the pointer 303, and also calculate the position and orientation of the object 302 (step 422). The calculation of the position, orientation, length, and the like of the pointer is as described in Example 2-1 and Example 2-2, and its detailed description is omitted. As for the position and orientation of the object, for example, a position and orientation are calculated at which the relative positional relationship between the reference position of the object at the time it was pointed at and the position pointed at by the pointer is preserved at the position calculated in step 422.
- Note that, in the case of Example 2-1, only the information on the position (coordinates) of the pen tip 201P of the input pen 201 is used, and the azimuth α, inclination β, and rotation γ around the axis of the input pen 201 are not used, so the pointer is displayed with its direction always constant; it is therefore not necessary to calculate the direction of the pointer in step 422. Similarly, when the movement of the object 302 is only a translation as in this Example 2-3, the object is displayed with its orientation always constant, so there is no need to calculate the orientation of the object.
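- One way to read the calculation in step 422 for a pure translation is: record the offset between the object's reference position and the pointed position at the moment of grasping, then re-apply that offset to every new pointed position. The sketch below assumes this reading; the names are illustrative.

```python
def grab_offset(object_pos, pointed_pos):
    """Offset from the pointed position to the object's reference position,
    recorded once when the object is selected or grasped."""
    return tuple(o - p for o, p in zip(object_pos, pointed_pos))

def follow(pointed_pos, offset):
    """New object position that preserves the recorded relative offset,
    so the object follows the pointer as the pen moves."""
    return tuple(p + d for p, d in zip(pointed_pos, offset))

# Grasp an object at (50, 40, -30) while pointing at (48, 38, -30), then
# move the pen so the pointed position becomes (60, 45, -30).
offset = grab_offset((50.0, 40.0, -30.0), (48.0, 38.0, -30.0))
print(follow((60.0, 45.0, -30.0), offset))   # object translated with the pen
```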
- Next, in addition to the pointer generation means 103 generating a pointer based on the calculated position, orientation, and length of the pointer, the object generation means 105 generates an object based on the calculated position and orientation of the object; these display signals are sent from the display control means 106 to the display device 3, and the pointer and the object are displayed (step 423).
- When the pointer and the object are displayed in step 423, the input information processing means 109 determines, for example, whether or not the push button 201D of the input pen 201 has been pressed (step 424). If it has not been pressed, the state of the input pen is acquired again (step 421), and the processing of steps 422 and 423 is continued. On the other hand, when the push button 201D has been pressed, the selected or grasped state of the object is released, and the object 302 is displayed fixed at its position at that time.
- By executing the above processing, the object pointing operation and moving operation as shown in FIGS. 11 and 12 become possible: the pointed object is selected or grasped, and by moving the input pen, the object can be translated.
- Further, in Example 2-3, a general pen tablet is used as the input device 2, and the object can be pointed at, selected or grasped, and moved while the pen tip 201P of the input pen 201 remains in contact with the detection means, so the operator's fatigue can be reduced.
- In addition, when the detection means of the input device 2 is superimposed on the display surface of the display device (liquid crystal display) 3, the operator can operate the input pen on the display surface. This provides a visual effect as if the pointer 303 were a part of the pen tip 201P of the input pen 201, facilitates accurate pointing and movement of the object 302, and makes intuitive pointing and movement possible.
- Also, in Example 2-3, an electromagnetic induction type pen tablet was used as the input device 2, and the case where the detection means (digitizer) of the pen tablet is superimposed on the display surface of the display device (liquid crystal display) 3 was given as an example; however, the configuration is not limited to this, and the two may be placed at different positions. Further, although the combination of an electromagnetic induction type pen tablet and a liquid crystal display was described as an example of the input device 2 and the display device 3, the combination is not limited thereto; a combination of a touch panel and a stylus pen as used in a PDA or the like may be employed.
- Also, the shape of the pointer may be any shape as long as the pointed position is visually clear; it is not limited to the flat arrow-shaped pointer 303a shown in FIG. 8A, and may be a three-dimensional arrow-shaped pointer 303b in which a cylinder is connected to the bottom of a cone as shown in FIG. 8B, a conical pointer 303c as shown in FIG. 8C, or a hand-shaped pointer 303d of a person pointing at an object with the index finger as shown in FIG. 8D.
- Also, although the tip of the arrow-shaped pointer is used as the point at which the pointer 303 points, the pointing point is not limited to this; it is possible to point with any part of the pointer, including a part other than the tip.
- Also, although a folder-shaped object was taken as an example of the object 302, the object 302 is not limited to this and may have any shape.
- Also, in Example 2-3, the coordinate system of the three-dimensional space expressed on the display device 3 is taken in the same way as in Example 2-1, that is, as shown in FIG.
- In addition, in Example 2-3, the moving operation of the selected or grasped object 302 is ended by moving the object to the desired position and then pressing the push button 201D of the input pen 201 once; however, the operation is not limited to this, and, for example, a specific key on the keyboard or another switch may be used instead of the push button 201D as long as the same effect is obtained.
- Next, Example 2-4 will be described.
- In Example 2-3, the three-dimensional pointing method was described in which, after the object 302 in the three-dimensional space 301 expressed on the display device 3 is pointed at, the pointed object 302 is selected or grasped and can then be translated. In Example 2-4, as in Example 1-4, a three-dimensional pointing method will be described in which, after an object is pointed at and selected or grasped, the pointed object is subsequently tilted in the depth direction in the three-dimensional space.
- This will be described with reference to FIGS. 14A to 15C used in Example 1-4.
- In Example 2-4, it is assumed that an electromagnetic induction type pen tablet and a liquid crystal display are used as the input device 2 and the display device 3, respectively, as in Example 2-1, and that the detection means (digitizer) of the pen tablet 2 is superimposed on the display surface of the liquid crystal display 3.
- Also, in Example 2-4, the method of selecting or grasping the object after pointing at it in the three-dimensional space 301 is as described in Example 2-1 or Example 2-2; since that method may be used, its description is omitted.
- In the same manner as in Example 2-1 and Example 2-2, for example, as shown in FIGS. 14A and 15A, the operator points at the object 302 and selects or grasps it. Then, when the operator turns the input pen 201 in a desired direction while the object 302 is selected or grasped, the object 302 tilts in the depth direction in the three-dimensional space 301 following the direction of the input pen, as shown in FIGS. 14B and 15B, or FIGS. 14C and 15C. In this way, the object 302 can be tilted in any direction in the three-dimensional space 301.
- At this time, the input information acquisition means 101 of the system control device 1 acquires the information necessary for the operation of selecting or grasping the object as in Example 2-1 or Example 2-2. Since the operation of selecting or grasping the object is performed as in Example 2-1 or Example 2-2, the system control device 1 can know that the operator is performing the operation of moving the object 302. Therefore, if the pointing determination means 104 and the object generation means 105 generate an object that follows the change in the direction of the input pen 201 and display it on the display device 3, the moving operation of the object 302 as described above becomes possible.
- the system control apparatus 1 may execute, for example, a process as shown in FIG.
- When the operations shown in FIGS. 14A to 14C are executed, the position and orientation of the object are calculated in step 422 so that, for example, the relative positional relationship between the object being pointed at and the pointer is entirely preserved. In this way, the object can be rotated in the three-dimensional space, or tilted in the depth direction, following the change in the direction of the input pen 201.
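- For the tilting operation of Example 2-4, preserving the whole relative positional relationship can be sketched as rotating the object about the pointed position by the same rotation the pen direction has undergone. The sketch below handles a single axis (a planar rotation); this is an assumption-laden simplification of the full three-axis case, with illustrative names.

```python
import math

def rotate_about_pivot(point, pivot, angle_deg):
    """Rotate `point` about `pivot` in the plane by angle_deg. In the full
    method this would be a 3D rotation derived from the change in the pen's
    azimuth, inclination, and axial rotation; one axis suffices to show the
    idea of preserving the object-pointer relationship."""
    a = math.radians(angle_deg)
    dx, dy = point[0] - pivot[0], point[1] - pivot[1]
    return (pivot[0] + dx * math.cos(a) - dy * math.sin(a),
            pivot[1] + dx * math.sin(a) + dy * math.cos(a))

# Tilting the pen by 15 degrees rotates the grasped object's reference
# point by the same 15 degrees about the pointed position.
print(rotate_about_pivot((52.0, 40.0), (50.0, 40.0), 15.0))
```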
- Also, in Example 2-4, when the detection means of the input device 2 is superimposed on the display surface of the display device 3, the operator can operate the input pen on the display surface. This provides a visual effect as if the pointer 303 were a part of the pen tip 201P of the input pen 201, facilitates accurate pointing of the object 302, and makes intuitive pointing possible.
- Also, in Example 2-4, an electromagnetic induction type pen tablet was used as the input device 2, and the case where the detection means (digitizer) of the pen tablet is superimposed on the display surface of the display device (liquid crystal display) 3 was given as an example; however, the configuration is not limited to this, and the two may be placed at different positions. Further, although the combination of an electromagnetic induction type pen tablet and a liquid crystal display was described as an example of the input device 2 and the display device 3, the combination is not limited to this; for example, a combination of a touch panel and a stylus pen as used in a PDA or the like may be employed.
- Also, in Example 2-4, the shape of the pointer may be any shape as long as the pointed position is visually clear; it is not limited to the flat arrow-shaped pointer 303a shown in FIG. 8A, and may be, for example, a three-dimensional arrow-shaped pointer 303b in which a cylinder is connected to the bottom of a cone as shown in FIG. 8B, a conical pointer 303c as shown in FIG. 8C, or a hand-shaped pointer 303d of a person pointing at an object with the index finger as shown in FIG. 8D.
- Also, although the tip of the arrow-shaped pointer (the point of the arrow) is used as the point at which the pointer 303 points, the pointing point is not limited to this; any part of the pointer, including a part other than the tip, may be used as the pointing point.
- Also, although a folder-shaped object was taken as an example of the object 302, the object 302 is not limited to this and may have any shape.
- In addition, in Example 2-4, after the object 302 is selected or grasped and moved by the method of Example 2-1 or Example 2-2, the moving operation is ended, as described in Example 2-3, by pressing the push button 201D of the input pen 201; however, the operation is not limited to this, and, for example, a specific key on the keyboard or another switch may be used instead of the push button 201D as long as the same effect is obtained.
- In Example 2-4, the operation method for rotating the pointed object 302 in the three-dimensional space or tilting it in the depth direction was described.
- In Example 2-3 and Example 2-4, three-dimensional pointing methods were described in which, after an object 302 is pointed at and selected or grasped by the methods described in Example 2-1 and Example 2-2, the pointed object 302 is translated, rotated, or tilted in the depth direction. However, the operations that the operator may want to perform on the object are not limited to those that can be achieved only by moving and rotating the object 302; they also include editing, deformation, and the like. In Example 2-5, therefore, a three-dimensional pointing method is described in which, after the object is selected or grasped, it is automatically moved to a position where two-dimensional GUI-like operations can be applied, and after the operator performs the desired editing or deformation, the object can be returned to a three-dimensional position desired by the operator.
- This will be described with reference to FIGS. 16A to 19D used in Example 1-5, and the processing procedure of Example 2-5 will be described with reference to FIG. 46. The processing procedure shown in FIG. 46 corresponds to the processing procedure shown in FIG. 20 in the first embodiment.
- In Example 2-5, the pointing method and the subsequent object operation method will be described taking as an example the case where the same electromagnetic induction type pen tablet as in each of the above examples is used as the input device 2 and a DFD is used as the display device 3 capable of expressing a three-dimensional space.
- The detection means (digitizer) of the input device (pen tablet) 2 can be superimposed on the display surface of the DFD. Further, in the case of an electromagnetic induction type pen tablet, since there is a detectable range above the detection surface of the detection means, information such as the position, inclination, and azimuth of the input pen can be detected even when the pen tip of the input pen is not in contact with the detection surface. Therefore, even if the detection means is arranged on the back side of the DFD display device, information such as the position, tilt, and orientation of the pen can be acquired. While the detection means is generally arranged behind the display surface in this way, if the detection means is a transparent electrode, it can also be arranged in front of the display surface rather than behind it. By thus superimposing the detection means on the display surface of the DFD, it is possible to operate the input pen on the front display surface of the DFD and perform direct pointing.
- In Example 2-5, it is assumed that the detection means of the electromagnetic induction type pen tablet 2 is superimposed on the display surface of the DFD 3.
- First, the operator points at the object 302 displayed stereoscopically in the three-dimensional space 301, for example, by the method described in Example 2-1 or Example 2-2, and selects or grasps it. Then, the pointed object 302 is displayed as a two-dimensional object on the front display surface 3A of the DFD, for example, as shown in FIGS. 18B and 19B, and the pointer 303 disappears.
- In FIGS. 18B and 19B, the object 302 is displayed as an object having no thickness in the z direction; however, even if the object is a solid object having a thickness in the z direction, three-dimensional stereoscopic display using the DFD is not performed at this point, and the object is displayed two-dimensionally as a projected image on the front display surface 3A of the DFD.
- In this state, the operator performs the desired operation, such as writing characters on the object 302, as a two-dimensional GUI operation. Then, after the two-dimensional GUI operation is finished, for example, when the push button 201D of the input pen 201 is pressed, the pointer 303 appears again, and thereafter, by the procedure described in Example 2-3 and Example 2-4, the object 302 can be moved or tilted in the depth direction, for example, as shown in FIGS. 18C and 19C, or FIGS. 18D and 19D.
- the system control apparatus 1 may execute processing as shown in FIG.
- the first process (step 420) in FIG. 46 is a process until the object 302 on the three-dimensional space 301 is selected or grasped, and has been described in the embodiment 2-1 or the embodiment 2-2. The detailed explanation is omitted.
- When the processing of selecting or grasping the object 302 is performed, the system control device 1 determines that an operation for starting two-dimensional GUI operation, editing, or processing has been performed. Therefore, next, for example, the pointer 303 displayed on the display device 3 is hidden (step 426), and the projected image of the selected or grasped object 302 is displayed on the display surface 3A nearest to the operator (step 427). As a result, the object 302 can be operated, edited, and processed in a two-dimensional GUI manner.
- Next, two-dimensional GUI operations from the input pen 201 are received and executed (step 428). Then, it is determined whether or not an operation for ending the two-dimensional GUI operation, editing, or processing has been performed, for example, by the operator pressing the push button 201D of the input pen 201 once (step 429). At this time, if the ending operation has not been performed, other two-dimensional GUI operations, editing, or processing are accepted and executed. On the other hand, when the ending operation has been performed, the mode returns from the mode for performing two-dimensional GUI operations to the mode for performing three-dimensional pointing as described in Example 2-3 or Example 2-4, and the object can be translated, rotated, or tilted according to the procedure described in Example 2-3 or Example 2-4 (step 430).
- The processing of step 430 may be performed according to the procedure described in Example 2-3 or Example 2-4, and its detailed description is therefore omitted.
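- The flow of steps 420 and 426 to 430 can be summarized as a two-mode controller: selecting or grasping an object switches from the 3D pointing mode to a 2D GUI mode (pointer hidden, object projected onto the front surface), and the ending operation (here, a button press) switches back. The sketch below uses assumed method names purely to make the control flow concrete.

```python
MODE_3D, MODE_2D = "3d-pointing", "2d-gui"

class PointingModeController:
    def __init__(self):
        self.mode = MODE_3D
        self.pointer_visible = True

    def on_object_grasped(self):
        # Steps 426-427: hide the pointer and show the object's projection
        # on the front display surface for 2D GUI editing.
        self.mode = MODE_2D
        self.pointer_visible = False

    def on_end_operation(self):
        # Steps 429-430: a button press ends 2D editing and returns to the
        # 3D pointing mode, where the object can be moved or tilted again.
        if self.mode == MODE_2D:
            self.mode = MODE_3D
            self.pointer_visible = True

ctrl = PointingModeController()
ctrl.on_object_grasped()
print(ctrl.mode)        # 2d-gui: 2D operations such as writing are accepted
ctrl.on_end_operation()
print(ctrl.mode)        # back to 3d-pointing
```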
- By the system control device 1 performing such processing, operations such as the pointer display control and the three-dimensional movement of the object shown in FIGS. 18A, 18B, 18C, and 18D become possible, and the object can also be operated in a two-dimensional GUI manner.
- In addition, in Example 2-5, after the object is selected or grasped by the method of Example 2-1 or Example 2-2, the acquired information is processed as information for two-dimensional GUI operations, editing, and processing; however, the method of switching is not limited to this. For example, after the operation of pointing at the object, the mode may be switched to a mode for performing two-dimensional GUI operations, editing, and processing on the object by operating a specific key on the keyboard or another switch.
- Also, in Example 2-5, the case where the detection means (digitizer) of the electromagnetic induction type pen tablet serving as the input device 2 is superimposed on the display surface of the DFD serving as the display device 3 capable of displaying a three-dimensional space was given as an example; however, the present invention is not limited to this, and a configuration in which the two are at different positions may be used.
- Also, in Example 2-5, a DFD is used as the display device 3, but the present invention is not limited to this; a display device such as the liquid crystal display described in Example 2-1 to Example 2-4 may be used.
- Also, as long as the three-dimensional space can be expressed, the origin of the three-dimensional coordinate system may be anywhere, and the coordinate system need not be a Cartesian coordinate system; for example, a cylindrical coordinate system or a spherical coordinate system may be used.
- Also, in Example 2-5, the mode for two-dimensional GUI operation, editing, and processing is ended by operating the push button 201D of the input pen 201; however, the present invention is not limited to this, and, for example, a specific key on the keyboard or another switch may be used instead of the push button 201D as long as the same effect is obtained.
- Also, in Example 2-5, as an example of performing two-dimensional GUI-like editing on the object 302, the case where the letter "A" is written as shown in FIG. 18C was given; however, the operation is not limited to this. For example, when the object 302 is an object representing a file, a two-dimensional GUI operation performed after pointing at it may open the file, its contents may be edited on the two-dimensional GUI, the file may be closed, and the object may then be moved to a three-dimensional position desired by the operator.
- The three-dimensional pointing method of Example 2-6 is one application example of the three-dimensional pointing method described in Example 2-5: after the object 302 in the three-dimensional space 301 is pointed at and selected or grasped by the method of Example 2-1 or Example 2-2, the object 302 is automatically moved to a position where the operator can easily operate it, that is, a position to which conventional two-dimensional GUI operations can be applied, and the operator then performs the desired editing and processing.
- Example 2-6 will be described with reference to FIGS. 21A to 25C used in Example 1-6.
- the processing procedure of Example 2-6 will be described with reference to FIG.
- the processing procedure shown in FIG. 47 corresponds to the processing procedure shown in FIG. 26 in the first embodiment.
- In Example 2-6, as in Example 2-5, the pointing method and the object operation method will be described taking as an example the case where an electromagnetic induction type pen tablet is used as the input device 2 and a DFD is used as the display device 3. [0390] Further, it is assumed that the detection means (digitizer) of the input device (pen tablet) 2 is provided integrally with the display surface of the display device (DFD) 3.
- the coordinate system XYZ is set in the three-dimensional space 301, and the object 302 and the window 304 are arranged at the position of z in the three-dimensional space 301.
- At this time, an operator who wants to operate the object 302 first points at the object 302 by the method described in Example 2-1 or Example 2-2 and selects or grasps it. The selected or grasped object 302 is then displayed as a two-dimensional object on the front display surface 3A of the DFD, and the pointer 303 disappears.
- Even if the object 302 is a solid object having a thickness in the z direction, three-dimensional stereoscopic display using the DFD is not performed at this point, and the object is displayed two-dimensionally as a projected image on the front display surface 3A of the DFD.
- In this state, the operator can perform the desired operation, such as writing characters on the object 302, as a two-dimensional GUI operation using the input pen 201.
- Then, as shown in FIGS. 23A and 25A, the operator moves the object 302 to the desired position by a two-dimensional operation and, for example, presses the push button 201D of the input pen 201 once; the object 302 is then automatically moved in the three-dimensional depth direction, as shown in FIGS. 23B and 25B, to a position where it interferes with the window 304.
- the system control apparatus 1 may execute processing as shown in FIG. Note that the first process (step 420) in FIG. 47 is a process until the object 302 on the three-dimensional space 301 is selected or grasped, and has been described in the embodiment 2-1 or the embodiment 2-2. The detailed explanation is omitted.
- When the processing of selecting or grasping the object 302 (step 420) is performed according to the procedure described in Example 2-1 or Example 2-2, the system control device 1 determines that an operation for starting two-dimensional GUI operation, editing, or processing has been performed. Therefore, next, for example, the pointer 303 displayed on the display device 3 is hidden (step 426), and the projected image of the selected or grasped object 302 is displayed on the display surface 3A nearest to the operator (step 427). As a result, the object 302 can be operated, edited, and processed in a two-dimensional GUI manner.
- Next, two-dimensional GUI operations from the input pen 201 are received and executed (step 428). Then, it is determined whether or not an operation for ending the two-dimensional GUI operation, editing, or processing has been performed, for example, by the operator pressing the push button 201D of the input pen 201 once (step 429). At this time, if the ending operation has not been performed, other two-dimensional GUI operations, editing, or processing are accepted and executed. On the other hand, when the ending operation has been performed, the mode returns from the mode for performing two-dimensional GUI operations to the three-dimensional pointing mode described in Example 2-1 to Example 2-4. The system control device 1 then determines whether or not there is another object, such as the window 304, that interferes with the object 302 in the depth direction of the object 302 (step 431).
- When there is another object that interferes, the object 302 is moved and displayed in the three-dimensional depth direction until it interferes with that other object (the window 304) (step 432). When the object 302 interferes with the other object, the movement of the object 302 in the three-dimensional depth direction is stopped, and the attribute possessed by the interfering object is executed on the object 302 (step 433).
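- Steps 431 to 433 amount to stepping the object backwards in depth until it first overlaps another object, then invoking that object's attribute (for example, the window's "move file" behaviour). A minimal sketch, with an assumed interval-overlap test standing in for real interference detection and an assumed scene layout:

```python
def first_interference(obj_depth, others, step=1.0, max_steps=1000):
    """Move obj_depth step by step in the depth direction and return the
    first object whose depth interval it enters, or None. `others` is a
    list of (name, near_z, far_z, attribute) tuples; this layout is an
    assumption for illustration only."""
    z = obj_depth
    for _ in range(max_steps):
        z -= step                      # move one step deeper (step 432)
        for name, near_z, far_z, attribute in others:
            if far_z <= z <= near_z:   # interference: stop and fire attribute
                return name, attribute, z
    return None

# A window occupying depths -10 .. -40 with a "move file" attribute
# (the attribute executed in step 433 when the object reaches it).
scene = [("window 304", -10.0, -40.0, "move file")]
print(first_interference(0.0, scene))
```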
- By the system control device 1 performing the processing shown in FIG. 47, a series of operations such as the pointer display control and the three-dimensional movement of objects shown in FIGS. 22A, 22B, and 22C and FIGS. 23A, 23B, and 23C, as well as two-dimensional GUI operations on the objects, become possible.
- In this way, in the three-dimensional pointing method of Example 2-6, the object 302 pointed at and selected or grasped by the operator in the three-dimensional space 301 is automatically moved to a position where it is easy to operate; the desired operation, editing, processing, and the like are performed by conventional two-dimensional GUI operations; after the desired operation, the object 302 is moved in the depth direction; and, when there is another object that interferes with the moved object 302, the state of the object 302 can be changed according to the attribute of that other object.
- Also, in Example 2-6, after the object is selected or grasped, the mode is switched from the mode for performing three-dimensional pointing operations to the mode for performing two-dimensional GUI-like operation, editing, and processing of the object, and the information acquired by the input information acquisition means 101 of the system control device 1 is processed as information for performing the two-dimensional GUI operation, editing, and processing of the object; however, the method is not limited to this. For example, after the operation of pointing at the object, the mode may be switched to a mode in which the object is operated, edited, and processed in a two-dimensional GUI manner by operating a specific key on the keyboard or another switch.
- Also, in Example 2-6, the case where the detection means (digitizer) of the electromagnetic induction type pen tablet serving as the input device 2 is superimposed on the display surface of the DFD serving as the display device 3 capable of displaying a three-dimensional space was given as an example; however, the present invention is not limited to this, and a configuration in which the two are at different positions may be used.
- Also, in Example 2-6, a DFD is used as the display device 3, but the present invention is not limited to this; a display device such as the liquid crystal display described in Example 2-1 to Example 2-4 may be used.
- Also, in Example 2-6, a two-dimensional quadrangular object was mentioned as an example of the target object 302; however, the object is not limited to this, and the object 302 may have any shape.
- Also, in Example 2-6, the mode for two-dimensional GUI operation, editing, and processing is ended by operating the push button 201D of the input pen 201; however, the present invention is not limited to this, and, for example, a specific key on the keyboard or another switch may be used instead of the push button 201D as long as the same effect is obtained.
- Also, in Example 2-6, as an example of performing two-dimensional GUI editing on the object 302, the case where the letter "B" is written as shown in FIG. 22C was given; however, the operation is not limited to this. For example, when the object 302 is an object representing a file, a two-dimensional GUI operation performed after pointing at it may open the file, its contents may be edited on the two-dimensional GUI, the file may be closed, and the object may then be moved to a three-dimensional position desired by the operator.
- Also, in Example 2-6, the window 304 was given as an example of the other object, and the case where the attribute of moving a file is executed when the object interferes with the window 304 was described.
- This will be described with reference to FIGS. 28A to 32C used in the first embodiment.
- For example, an operator who wants to delete the object 302 first operates the input pen 201 to point at the object 302 to be deleted and selects or grasps it.
- When this is done, the pointed object 302 moves to the front display surface 3A and changes to a state in which two-dimensional GUI operations are possible.
- Next, the operator moves the pointed object 302 onto the trash can object 305 and, for example, presses the push button 201D of the input pen 201 once.
- When this is done, the state in which two-dimensional GUI operations are possible returns to the state in which three-dimensional pointing is possible. In the case of the procedure shown in FIG. 47, after the processing performed by the system control device 1 returns to the state in which three-dimensional pointing is possible, the object 302 automatically moves in the three-dimensional depth direction, and when it interferes with the trash can object 305, the display of the object 302 disappears and the trash can 305 switches to a display showing that trash (the object) is contained in it, as shown in FIGS. 30B and 32B.
- Note that the other object that interferes with the object 302 may be any object having an attribute that can be executed on the object 302.
- Next, Example 2-7 will be described. FIGS. 48 and 49 are diagrams showing configuration examples of the input pen used in the three-dimensional pointing method of Example 2-7.
- In Example 2-7, a pointing method will be described that uses, as the input pen 201 for performing the three-dimensional pointing methods described in Example 2-1 to Example 2-6, an input pen having a structure in which the pen tip 201P moves into and out of the housing of the input pen 201 according to the amount of rotation of a wheel provided on the input pen 201 or the amount of movement of a slide bar.
- As shown in FIG. 48, the input pen 201 used in Example 2-7 has, for example, a structure in which a wheel 201D attached to the housing 201E engages a gear (or screw) portion 201F provided on the inner part of the pen tip 201P, so that when the wheel 201D is turned, the pen tip 201P retracts into the housing 201E. Alternatively, as shown in FIG. 49, a structure may be used in which a slide bar 201D engages the gear (or screw) portion 201F of the pen tip 201P, so that the pen tip 201P retracts into the housing 201E when the slide bar 201D is moved.
- Note that FIGS. 48 and 49 show only configuration examples of the input pen 201; any structure may be used as long as the pen tip 201P retracts into the housing 201E when the wheel or slide bar 201D is rotated or moved.
- In Example 2-7, as an example of pointing in the three-dimensional space expressed on the display device 3 using the input pen 201 configured as shown in FIG. 48 or FIG. 49, the case where the pointing method described in Example 2-1 is performed will be described. At this time, it is assumed that an electromagnetic induction type pen tablet and a liquid crystal display are used as the input device 2 and the display device 3, respectively, as in Example 2-1, and that the detection means (digitizer) of the input device 2 is superimposed on the display surface of the liquid crystal display 3.
- When the operator brings the pen tip 201P of the input pen 201 into contact with the display surface of the liquid crystal display 3 and presses the pen against it, a pointer 303 having a shape reflecting the shape of the pen tip 201P, such as a conical shape, is displayed in the three-dimensional space 301 of the liquid crystal display 3, as shown in FIGS. 35A and 36A. Since the size of the conical pointer 303 changes according to the amount by which the pen tip 201P of the input pen 201 is pushed into the housing, the visual effect that the pointer 303 is a part of the pen tip 201P is further enhanced compared with the method described in Example 2-1.
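- The visual trick can be summarized as: the deeper the pen tip retracts into the housing, the longer (or larger) the conical pointer drawn beyond the display surface, so the physical pen tip and the virtual pointer read as one continuous pen. A sketch under the assumption of simple linear scaling with a clamp:

```python
def pointer_length_from_retraction(retraction_mm, scale=4.0, max_len=80.0):
    """Map the pen-tip retraction into the housing (mm) to the length of
    the conical pointer drawn into the 3D scene. Linear scaling and the
    clamp value are assumptions; the example only requires that the
    pointer grow as the tip is pushed in, as if the pen continued past
    the screen."""
    return min(max_len, max(0.0, retraction_mm * scale))

for r in (0.0, 2.5, 10.0, 30.0):
    length = pointer_length_from_retraction(r)
    print(f"retraction {r:5.1f} mm -> pointer length {length:5.1f}")
```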
- In Example 2-7, the case where the three-dimensional pointing method described in Example 2-1 is performed was given as an example; however, if an input pen is used that is configured, for example, to detect the presence or absence of contact of the pen tip 201P with the detection surface and to control the movement of the pen tip 201P into and out of the housing 201E by an electric mechanism, pointing based on the presence or absence of contact between the pen tip 201P and the detection surface, as described in Example 2-2, is of course also possible.
- Next, Example 2-8 will be described. In Example 2-8, a specific usage scene similar to that in Example 1-8 will be described with reference to FIGS. 37A to 40C used in Example 1-8.
- As a specific use scene of the three-dimensional pointing methods described in Example 2-1 to Example 2-7, an example is a remote control for operating a music playback function that has a display screen capable of three-dimensional display and incorporates the detection means of a pen tablet.
- The procedure for display control of the pointer and the objects on the display screen of the remote control when the input pen 201 is operated may be the same as the procedures described in Example 2-1 to Example 2-7, so its detailed description is omitted.
- In Example 2-8, the operator operates the objects displayed in the three-dimensional space 301 of the remote control, as shown in FIGS. 37A and 39A, using the input pen 201 whose pen tip 201P can be pushed into the housing as described in Example 2-7. For example, when the play button 302a is pointed at, the display switches to one in which the play button 302a is pressed, and music playback starts.
- Further, for example, when the input pen 201 is operated to point at the volume knob 302b and the input pen is moved so as to rotate the knob 302b, the volume of the music being played can be raised or lowered.
- Note that the operation of turning the volume knob is not limited to the operation shown in FIGS. 37C and 39C; for example, the knob 302b may also be rotated to raise or lower the volume by pointing near the center of the knob 302b and rotating the input pen 201 around its axis.
- Further, as shown in FIGS. 38B and 40B, when the area 302c displaying information about the music being played is pointed at with the input pen 201 and, as shown in FIGS. 38C and 40C, the area 302c is switched to a two-dimensional display so that two-dimensional GUI operations are possible, then, for example, in combination with a handwritten character recognition function, a track number can be entered in the area 302c and playback can skip to the desired track number.
- In Example 2-8, the operation of a music device using a remote control was shown as an example; however, the present invention is not limited to this, and can also be applied to the operation of devices that can take a similar form, such as PDAs, mobile phones, kiosk terminals, and ATMs, making more intuitive operation of each device possible.
- Also, in Example 2-8, music is played, the volume is raised, and the track is changed; however, the operations are not limited to these, and any operation that can be associated with an operation of the input pen 201 is possible.
- Also, although handwriting recognition was used to input the track number, any method that can be realized on a two-dimensional GUI may be used; for example, an input method in which track numbers are displayed in a pull-down menu and a number is selected with the input pen 201 may be used.
- The three-dimensional pointing device that realizes the three-dimensional pointing described in each of the examples need not be a dedicated device specialized for realizing the three-dimensional pointing method; as shown in FIG., it can also be realized by a computer (system control device 1) such as a PC and a three-dimensional pointing program that causes the computer to execute the three-dimensional pointing methods described in the examples.
- In this case, the three-dimensional pointing program may be recorded on any of magnetic, electric, and optical recording media as long as it is recorded in a state readable by the computer.
- Also, the three-dimensional pointing program is not limited to being provided recorded on a recording medium; it can also be provided through a network such as the Internet.
- Next, a third embodiment of the present invention will be described. The third embodiment corresponds to the second object of the present invention. In the third embodiment, a pointer and an object are displayed in the three-dimensional space expressed on a display device, and when an arbitrary point in the three-dimensional space is pointed at with the pointer, the part of the pointer that performs the pointing is moved in the depth direction while the depth position, shape, and size of the parts of the pointer other than the pointing part are kept constant. If an object is present at the three-dimensional space position at which the pointing part is pointing, the object is changed to a state indicating that it is being pointed at. In this way, the operator of the pointer can easily and accurately recognize the depth position of the pointer and the position at which the pointer is pointing.
- FIG. 50 is a schematic diagram showing a configuration example of a system for realizing the three-dimensional pointing method of the present invention.
- In FIG. 50, 1 is a pointing device (system control device), 101 is input information acquisition means, 102 is pointing position/deformation amount calculation means, 103 is pointer generation means, 104 is display control means, 105 is pointing determination means, 106 is object generation means, 107 is processing control means, 108 is storage means, 2 is an input device, and 3 is a display device.
- The three-dimensional pointing method of this third embodiment is preferably applied to the case where a pointer in the three-dimensional space expressed on a display device connected to a system control device such as a PC is operated three-dimensionally using an input device connected to the system control device, to point at an arbitrary position in the three-dimensional space.
- As shown in FIG. 50, the system control device 1 includes the input information acquisition means 101 for acquiring the input information input from the input device 2; the pointing position/deformation amount calculation means 102 which, when the input information acquired by the input information acquisition means 101 is information related to the operation of the pointer (pointer operation information), calculates the moving direction and moving amount of the pointing point based on the input information and then calculates the pointing position and the deformation amount of the pointer; the pointer generation means 103 for generating a pointer based on the calculation result; and the display control means 104 for displaying the generated pointer on the display device 3.
- Further, as shown in FIG. 50, the system control device 1 includes the pointing determination means 105 for determining whether or not there is an object being pointed at by the pointer generated based on the calculation result of the pointing position/deformation amount calculation means 102, and the object generation means 106 for generating the object.
- The system control device 1 is, for example, a device such as a PC that activates and operates software according to the input information from the input device 2 and controls other devices; as shown in FIG. 50, it includes the processing control means 107 and the storage means 108 in addition to the above means. When the information acquired by the input information acquisition means 101 is information other than pointer operation information, the processing control means 107 executes processing according to the acquired information.
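- Read as software, the means 101 to 108 form a small event-processing pipeline: input information is classified as pointer operation information or other information, the former drives pointer calculation, generation, and display, and the latter goes to the processing control means. The following sketch only mirrors that routing; the numbering comments follow the reference signs above, and the method bodies are placeholder assumptions.

```python
class SystemControlDevice:
    """Skeleton of the dataflow among means 101-108 in FIG. 50. Only the
    routing reflects the description; the bodies are stubs."""

    def handle_input(self, info):
        # 101: input information acquisition means
        if self.is_pointer_operation(info):
            pos, deform = self.calc_position_and_deformation(info)  # 102
            pointer = self.generate_pointer(pos, deform)            # 103
            target = self.pointing_determination(pos)               # 105
            if target is not None:
                self.generate_object(target)                        # 106
            self.display(pointer)                                   # 104
        else:
            self.process_control(info)                              # 107

    # --- stubs standing in for the individual means -------------------
    def is_pointer_operation(self, info): return info.get("kind") == "pointer"
    def calc_position_and_deformation(self, info): return info["pos"], 0.0
    def generate_pointer(self, pos, deform): return {"pos": pos, "deform": deform}
    def pointing_determination(self, pos): return None
    def generate_object(self, target): pass
    def display(self, pointer): print("display:", pointer)
    def process_control(self, info): print("other processing:", info)

SystemControlDevice().handle_input({"kind": "pointer", "pos": (10, 20, -5)})
```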
- That is, the three-dimensional pointing method of this embodiment can be realized using an existing system control device 1 as shown in FIG. 50, and does not require a special three-dimensional pointing device.
- The input device 2 is not limited to input devices generally connected to the system control device 1 (PC), such as a keyboard and a mouse; it may be a pen tablet, a joystick (joypad), or the like. Further, the pointer may be operated with one kind of input device, such as a mouse or a keyboard, or with a combination of two or more kinds of input devices, for example, a mouse operation combined with pressing a specific key on the keyboard.
- Further, the input device 2 may be integrated with the display surface of the display device 3 (for example, see JP-A-5-73208). In this case, the pointer operation information can be input by touching the display screen of the display device 3 with a pen or a fingertip.
- The display device 3 may be any display device that can express a three-dimensional space; for example, it may be a two-dimensional display device that projects a three-dimensional object onto a two-dimensional plane, such as a CRT display or a liquid crystal display, or a display device capable of displaying a three-dimensional stereoscopic image, such as a DFD (for example, see Japanese Patent No. 3022558 and Japanese Patent No. 3460671). That is, the display device 3 may be any display device on which the operator (observer) can three-dimensionally recognize (perceive) the position and shape of the displayed pointer and object.
- FIGS. 51 to 55 illustrate the three-dimensional pointing method of the embodiment 3-1 according to the present invention.
- FIG. 51 is a diagram for explaining the operation method of the pointer
- FIG. 52 is a front view and a right side view showing a change in the three-dimensional space when an object behind the pointer is pointed.
- FIG. 53 is a perspective view of the change in the three-dimensional space of FIG. 52,
- FIG. 54 is a front view and right side view showing the change in the three-dimensional space when pointing to an object in front of the pointer
- and FIG. 55 is a flowchart for explaining the processing procedure when the three-dimensional pointing method of Example 3-1 is executed by the system control device (pointing device).
- FIG. 52 shows three states in the three-dimensional space in its upper, middle, and lower parts, and shows how the state in the three-dimensional space changes from the upper part to the middle part, and from the middle part to the lower part, when the operations shown between the parts are performed.
- FIG. 53 likewise shows three states in its upper, middle, and lower parts, which are perspective views of the corresponding states in FIG. 52.
- FIG. 54 shows three states in the three-dimensional space in its upper, middle, and lower parts, and shows how the state in the three-dimensional space changes from the upper part to the middle part, and from the middle part to the lower part, when the operations shown between the parts are performed.
- 1 is a system control device
- 201 is a keyboard
- 202 is a mouse
- 3 is a display device (two-dimensional display device)
- 301 is a display surface
- 4 is a pointer.
- in Example 3-1, as shown in FIG. 51, a keyboard 201 and a mouse 202 are used as the input device 2, and a two-dimensional display device such as a liquid crystal display is used as the display device 3.
- the moving direction and moving distance of the pointer 4 in the XY plane of the displayed three-dimensional space are calculated (determined) based on the moving direction and moving distance when the mouse 202 body is moved two-dimensionally on a plane such as a desktop.
- the moving direction and moving distance in the depth direction are calculated (determined) based on the rotation direction and rotation angle of the wheel 202A of the mouse 202 when the wheel is rotated while a predetermined key 201A, such as the control key (Ctrl key) of the keyboard 201 shown in FIG. 51, is held down.
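- as a concrete illustration of this input mapping, the following is a minimal sketch in Python (not taken from the patent; the scale factors, event fields, and function name are assumptions for illustration):

```python
XY_SCALE = 1.0          # pointer motion per unit of mouse motion (assumed)
Z_PER_WHEEL_STEP = 5.0  # depth units per wheel detent (assumed)

def update_pointing_point(pos, mouse_dx, mouse_dy, wheel_steps, ctrl_held):
    """Return the new pointing point (x, y, z) for one input event."""
    x, y, z = pos
    # Two-dimensional mouse motion moves the pointer in the XY plane.
    x += XY_SCALE * mouse_dx
    y += XY_SCALE * mouse_dy
    # Wheel rotation with Ctrl held moves the pointing point in depth:
    # +steps -> +Z (away from the operator), -steps -> -Z (toward the operator).
    if ctrl_held and wheel_steps != 0:
        z += Z_PER_WHEEL_STEP * wheel_steps
    return (x, y, z)
```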
- when the mouse wheel 202A is rotated in the +Z direction while the control key 201A is held down, the pointing portion of the pointer moves in the +Z direction of the three-dimensional space, that is, toward the back as viewed from the operator. When the mouse wheel 202A is rotated in the -Z direction, the pointing portion of the pointer moves in the -Z direction of the three-dimensional space, that is, toward the front as viewed from the operator.
- the pointer 4 has an arrow shape, and the tip of the arrow represents the point (x, y, z) being pointed at.
- to tilt the pointer 4, the arrow portion is moved in the depth direction while the depth position of the end opposite to the tip of the arrow is kept constant.
- at this stage, since the tip of the pointer and the object 5 are at different depth positions, the object 5 is not yet pointed at.
- the three-dimensional pointing method of this embodiment 3-1 does not restrict the tilting of the pointer 4 to the manner shown in the middle parts of FIG. 52 and FIG. 53; the pointer may be tilted in any way as long as the depth position of the end opposite to the tip of the arrow is kept constant.
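- the geometry of this tilt can be sketched as follows (a minimal illustration, not from the patent; the parameterization by a tilt angle theta and the sign convention are assumptions):

```python
import math

def tilted_tip(base, length, theta):
    """Tip position of a pointer of fixed total length, tilted out of the XY plane.

    base: (x, y, z) of the end whose depth stays constant.
    theta: tilt angle; 0 keeps the pointer in the XY plane, positive values
    swing the tip toward +Z (the back, as viewed from the operator).
    """
    bx, by, bz = base
    # The XY-projected length shrinks as the pointer tilts, which is why the
    # on-screen arrow looks shorter (and relatively wider) when tilted.
    return (bx + length * math.cos(theta), by, bz + length * math.sin(theta))
```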
- in this way, the operator can recognize that the pointer 4 is tilted further in the depth direction (+Z direction) and, at the same time, from the shape of the pointer 4, that it is pointing further toward the back than before the operation.
- when the object 5 is pointed at by the pointer 4, the color of the object 5 is changed to indicate the pointing state, for example as shown in the lower parts of FIG. 52 and FIG. 53. In this way, the operator can intuitively and accurately recognize the depth position of the pointer 4 and the depth position at which it is pointing. In addition, by changing the color of the object 5 when it is pointed at, the operator can intuitively and accurately recognize whether the pointer 4 merely overlaps the object 5 or is actually pointing at it.
- FIGS. 52 and 53 show the case where the pointer 4 is in front of the depth position of the object 5 as viewed from the operator and is tilted toward the back (+Z direction); conversely, the pointer 4 can also be tilted toward the front (-Z direction).
- in this case too, the width of the arrow portion of the pointer is displayed wider.
- in this way, the operator can recognize that the pointer 4 is tilted toward the front (-Z direction) and, at the same time, from the shape of the pointer 4, that it is pointing nearer than the position before the operation.
- when the mouse wheel 202A is rotated further in the -Z direction while the control key 201A of the keyboard is again held down from the state shown in the middle part of FIG. 54, the tip of the arrow of the pointer 4 moves further in the -Z direction while the depth position of the end opposite to the tip is kept constant, and the arrow portion is tilted further toward the front as viewed from the operator.
- since the pointer is rotated while maintaining its three-dimensional shape and size, the width of the arrow portion of the pointer is displayed wider.
- in this way, the operator can recognize that the pointer 4 is tilted further in the depth direction (-Z direction) and, at the same time, from the shape of the pointer 4, that it is pointing further toward the front than before the operation.
- when the tip of the pointer 4 reaches the target position and its xyz coordinates (x, y, z) coincide with the xyz coordinates of an arbitrary point on or inside the surface of the object 5, the object 5 is pointed at by the pointer 4. Therefore, for example as shown in the lower part of FIG. 54, the color of the object 5 is changed to indicate the pointing state. In this way, the operator can intuitively and accurately recognize, from the shape of the pointer 4, the depth position of the pointer 4 and the depth position at which it is pointing. In addition, by changing the color of the object 5 when it is pointed at, the operator can intuitively and accurately recognize whether the pointer 4 overlapping the object 5 is actually pointing at it.
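- the pointing test described here can be sketched as follows, approximating the object by an axis-aligned box (a minimal illustration under assumed names; the tolerance eps stands in for the "predetermined range" of step 607):

```python
def is_pointed(tip, box_min, box_max, eps=0.5):
    """True if the pointer tip lies on/inside the object's bounding box.

    tip: (x, y, z) of the arrow tip; box_min/box_max: opposite box corners;
    eps: tolerance standing in for the 'predetermined range'.
    """
    return all(lo - eps <= t <= hi + eps
               for t, lo, hi in zip(tip, box_min, box_max))
```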
- when the pointer 4 is tilted in the depth direction (+Z direction or -Z direction) in accordance with the rotation of the mouse wheel 202A, the pointer 4 may be displayed so as to tilt continuously in accordance with the total rotation angle of the wheel 202A, or, in synchronization with the rotation steps of the wheel 202A, so as to tilt stepwise by a predetermined angle each time the wheel 202A is rotated by one step.
- when the three-dimensional pointing method of this embodiment 3-1 is executed by the system control device 1 (pointing device), the processing from step 601 to step 608 shown in FIG. 55 may be executed.
- the system control device 1 first displays the object 5 and the pointer 4 on the display device 3 using the display control means 104 (step 601).
- a plurality of the objects 5 may be displayed.
- the object 5 and the pointer 4 can be displayed at any position within the three-dimensional space represented by the display device 3.
- the input information acquired by the input information acquisition means 101 includes not only information related to the operation of the pointer 4 (pointer operation information) but also input information such as application software activation; here, suppose that information related to the operation of the pointer 4 is acquired (step 602).
- it is then determined whether the input information acquired by the input information acquisition means 101 is pointer operation information, for example movement information (operation information) of the mouse 202 body or the wheel 202A.
- if it is, the input information acquisition means 101 passes the input information (pointer operation information) to the pointing position/deformation amount calculation means 102, which calculates the pointing position and the deformation amount of the pointer.
- the pointing position/deformation amount calculation means 102 first calculates the moving direction, moving amount, and the like of the pointer based on the pointer operation information, following the procedure shown in FIG. 55 (step 603).
- in step 603, for example, the moving direction, moving amount, or rotation angle of the pointer 4 in the XY plane of the three-dimensional space represented by the display device is calculated from the information on the two-dimensional moving direction and moving amount of the mouse body.
- the pointer 4 displayed on the display device 3 is then moved and displayed based on the calculation result (step 604).
- in step 604, after the pointer generation means 103 generates the moved pointer 4 based on the moving direction, moving amount, and the like of the pointer 4 in the XY plane, the display control means 104 displays the generated pointer 4 on the display device 3.
- if the pointer operation information does not include information for moving or rotating the pointer 4 in the XY plane, the operation in step 604 is omitted and the processing proceeds to the next step 605.
- while the pointer generation means 103 and the display control means 104 perform the processing of step 604, the pointing position/deformation amount calculation means 102 calculates the direction and amount of tilt of the pointer 4 based on the pointer operation information (step 605).
- the tilting direction is determined, for example, from the information on the rotation direction of the wheel 202A of the mouse 202, and the amount of tilt is calculated, for example, from the rotation amount of the wheel 202A.
- the pointer 4 displayed on the display device 3 is tilted and displayed based on the calculation result (step 606).
- in step 606, after the pointer generation means 103 generates the tilted pointer 4 based on the calculated tilt direction and amount, the display control means 104 displays the tilted pointer 4 on the display device 3, for example as shown in the middle part of FIG. 52.
- if the pointer operation information does not include information for tilting the pointer 4, the operation in step 606 is omitted and the processing proceeds to the next step 607.
- the pointing position Z deformation amount calculation means 102 performs the processing of step 603 and step 606, and then passes the calculation result to the pointer generation means 103 to generate a pointer, and also performs the pointing determination.
- the calculation result is also passed to means 105.
- the pointing determination means 105 determines from the received calculation result whether there is an object pointed at by the pointer 4 after the operation, that is, whether the xyz coordinates of the pointing point of the pointer 4 are within a predetermined range from the xyz coordinates of any point on or inside the surface of an object (step 607). If there is no pointed object, the process returns to step 602 and waits until the next input information (pointer operation information) is acquired.
- if there is a pointed object, the pointing determination means 105 causes the object generation means 106 to generate an object whose color has been changed, and displays it on the display device 3 using the display control means 104 (step 608). After the recolored object is displayed, the process returns to step 602 and waits until the next input information (pointer operation information) is acquired.
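- the overall step 601 to 608 flow of FIG. 55 can be condensed into the following loop sketch (the `ui` facade and its method names are hypothetical placeholders bundling the means 101 to 106, not identifiers from the patent):

```python
def pointing_loop(ui):
    ui.display_scene()                        # step 601: show objects and pointer
    while True:
        info = ui.acquire_input()             # step 602: wait for input
        if not ui.is_pointer_operation(info):
            ui.process_other_input(info)      # e.g. application activation
            continue
        move = ui.calc_move(info)             # step 603: XY direction/amount
        if move is not None:
            ui.display_moved_pointer(move)    # step 604 (skipped if no XY motion)
        tilt = ui.calc_tilt(info)             # step 605: tilt direction/amount
        if tilt is not None:
            ui.display_tilted_pointer(tilt)   # step 606 (skipped if no tilt)
        obj = ui.find_pointed_object()        # step 607: range test on the tip
        if obj is not None:
            ui.display_recolored(obj)         # step 608: change the object's color
```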
- when input information other than pointer operation information is acquired, the processing control means 107 performs processing according to the acquired input information.
- examples of input information other than pointer operation information include activation of software associated with the object 5 pointed at by the pointer 4 and input of numerical values or character strings.
- the processing control means 107 performs processing such as activation of software associated with the object 5 based on the input information.
- in that case, the object generation means 106 generates an object related to the processing result, and the display control means 104 displays the processing result object on the display device 3.
- as described above, in this embodiment 3-1, the pointer 4 is tilted in the depth direction and displayed on the display device 3 while the depth position of the end opposite to the pointing point (the tip of the arrow) is kept constant, so that the operator viewing the pointer 4 can intuitively and accurately recognize the depth position of the pointer 4 and the depth position at which it is pointing.
- further, since the pointer 4 is tilted in the depth direction with its entire length maintained when pointing at the object 5, the three-dimensional length of the pointer 4 does not change, and a natural display closer to a real object can be presented to the operator.
- in this embodiment 3-1, the pointer 4 is tilted in the depth direction by combining the control key 201A of the keyboard with the rotation operation of the mouse wheel 202A; however, the combination is not limited to this, and another key of the keyboard 201 may be combined with the wheel 202A, or a cursor key (direction key) of the keyboard 201 may be used instead of the wheel 202A.
- the pointer may also be tilted when a predetermined operation is performed with a pen tablet, a touch panel, a joystick, or the like.
- in this embodiment 3-1, an arrow-shaped pointer has been given as an example of the pointer 4; however, the present invention is not limited to this, and any shape can be used as long as the tilting direction when tilted in the depth direction and the pointing point (position) can be visually recognized.
- FIGS. 56A to 56D are schematic diagrams showing modifications of the shape of the pointer
- FIG. 56A is a diagram showing a triangular pointer
- FIG. 56B is a diagram showing a pointer in the shape of a human hand
- FIG. 56C is a diagram showing a saddle-shaped pointer
- FIG. 56D is a diagram showing a cross-shaped pointer.
- the pointer 4 is not limited to the arrow shape as shown in the upper part of FIG. 52, and may be a triangular pointer 4A as shown in FIG. 56A, for example.
- in the case of the triangular pointer 4A, the vertex points at the point (x, y, z), and the pointer is tilted in the depth direction while the position of the base is kept constant.
- in the case of the pointer 4B in the shape of a human hand, for example, the tip of the index finger points at the point (x, y, z).
- in the case of the cross-shaped pointer 4D, the intersection point (x, y, z) is the pointing point, and the pointer is tilted while the depth position of the end of one of the four axes extending from the intersection is kept constant.
- the object is not limited to the folder-icon type shown in FIG. 51 and the like; it may have any shape that can be pointed at with the pointer, such as an icon, a shortcut, or a window.
- in this embodiment 3-1, the pointing position of the pointer 4 in the depth direction can be recognized from the change in the visual shape of the pointer 4. In addition, a reference serving as an index for allowing the operator to recognize how far the pointer 4 is tilted in the space may be displayed on the display device 3.
- FIG. 57 is a diagram showing an example in which a reference is displayed in the three-dimensional pointing method of the embodiment 3-1.
- in this embodiment 3-1, the operator recognizes the pointing position of the pointer 4 in the depth direction from the change in its visual shape when it is tilted in the depth direction. However, if the pointer keeps pointing at the same depth position for a long time, there is no change in the visual shape, which may make the recognition of the pointing position of the pointer 4 in the depth direction ambiguous.
- therefore, for example as shown in FIG. 57, an xyz coordinate axis (reference) 7 reflecting the XYZ coordinate system set in the three-dimensional space is displayed in the three-dimensional space displayed on the display device 3.
- the reference 7 may be fixed at a specific position in the three-dimensional space, or may move along with the pointer 4 and be fixed in place while the pointer 4 tilts. Further, the operator may be allowed to set the reference 7 at any position in the three-dimensional space represented by the display device 3. In FIG. 57, a display object representing the xyz coordinate axes is used as the reference 7.
- however, the reference is not limited to this; any display object can be used as an index for allowing the operator to recognize the degree of tilt of the pointer 4. For example, it may be a semi-transparent display object similar in shape to the pointer with a tilt of 0 (parallel to the XY plane).
- FIG. 58 is a schematic diagram for explaining a modified example of the three-dimensional pointing method of the embodiment 3-1: a front view and a right side view showing the change in the three-dimensional space when an object behind the pointer is pointed at.
- its upper, middle, and lower parts show three states in the three-dimensional space, and the figure shows how the state changes from the upper part to the middle part, and from the middle part to the lower part, when the operations shown between the parts are performed.
- in the operation examples so far, the pointer 4 is tilted in the depth direction with its entire length, shape, and so on kept constant. Therefore, the objects that can be pointed at by the pointer 4 at a certain depth position are limited to objects whose distance in the depth direction is within the length of the pointer 4.
- however, the three-dimensional pointing method of this embodiment 3-1 is not limited to the tilting shown in the lower part of FIG. 52. This method lets the operator recognize the tilt of the pointer and the pointing position by comparing the depth position of one end of the pointer (the side opposite the tip of the arrow) with the depth position of the tip of the pointer (the tip of the arrow); therefore, any tilting method can be used as long as the depth position of a part different from the pointing part is kept constant.
- note that this change in the displayed width of the pointer differs from "changing the size on the display", a psychological drawing technique for presenting a three-dimensional stereoscopic effect when the pointer 4 is displayed in the three-dimensional space represented on the display device 3, in which a pointer farther from the operator is displayed smaller and a pointer closer to the operator is displayed larger.
- in the three-dimensional pointing method of this embodiment 3-1, as shown in FIGS. 52 and 53, objects at different depth positions are pointed at by tilting the pointer 4; in addition to simply tilting the pointer 4, an operation of translating it in the depth direction while maintaining its shape may be added, as shown in FIG. 59.
- FIG. 59 is a schematic diagram for explaining a first application example of the three-dimensional pointing method of the embodiment 3-1: a front view and a right side view showing the change in the three-dimensional space when pointing at an object behind the pointer.
- its upper, middle, and lower parts show three states in the three-dimensional space, and the figure shows how the state changes from the upper part to the middle part, and from the middle part to the lower part, when the operations shown between the parts are performed.
- if the entire length of the pointer is allowed to change as it tilts in the depth direction, even objects at a large distance in the depth direction can be pointed at. Moreover, even with the tilting shown in the lower part of FIG. 52, if the pointer 4 is moved in the Z-axis direction while maintaining its shape so that the distance in the depth direction between the pointer 4 and the object 5 becomes shorter than the entire length of the pointer, every object in the three-dimensional space represented on the display device 3 can be pointed at.
- the translation of the pointer 4 in the Z-axis direction may be performed, for example, by rotating the mouse wheel 202A while pressing a key other than the control key (Ctrl key) 201A of the keyboard, such as the shift key (Shift key): if the mouse wheel 202A is rotated in the +Z direction while the shift key is held down, the pointer 4 is translated in the +Z direction of the three-dimensional space, and if the mouse wheel 202A is rotated in the -Z direction while the shift key is held down, the pointer 4 is translated in the -Z direction.
- consider the case where the pointer 4 and the object 5 are displayed at different depth positions in the three-dimensional space as viewed from the operator's viewpoint. At this time, from the viewpoint of the operator looking at the three-dimensional space from the -Z direction, the pointer 4 appears to overlap the object 5, but the point (x, y, z) at which the pointer 4 is pointing and the object 5 are at different depth positions.
- when the pointer 4 is translated in the Z-axis direction in accordance with the rotation of the mouse wheel 202A, the pointer 4 may be displayed so as to move continuously in accordance with the total rotation angle of the wheel 202A, or, in synchronization with the rotation steps of the wheel 202A, so as to move stepwise by a predetermined distance each time the wheel 202A is rotated by one step.
- in the pointing method described so far, the two-dimensional movement of the mouse 202 body is reflected in the movement of the pointer 4 in the XY plane, and the combination of the control key 201A of the keyboard 201 with the rotation operation of the wheel 202A of the mouse 202 is reflected in the tilt of the pointer 4 in the depth direction. In the pointing method shown in FIG. 59, the combination of the shift key of the keyboard 201 with the rotation operation of the wheel 202A of the mouse 202 is additionally reflected in the translational movement of the pointer 4 in the depth direction.
- it is also possible to rotate the pointer 4 in the XY plane by combining a key other than the control key (Ctrl key) 201A or the shift key (Shift key) of the keyboard 201 with the rotation operation of the mouse wheel 202A.
- for example, while a key other than the control key (Ctrl key) or the shift key (Shift key) of the keyboard 201, such as the alt key (Alt key), is held down, rotating the mouse wheel 202A in the +Z direction may rotate the pointer 4 clockwise, and rotating the mouse wheel 202A in the -Z direction may rotate the pointer 4 counterclockwise.
- FIG. 60 is a schematic diagram for explaining a second application example of the three-dimensional pointing method of the embodiment 3-1: a front view and a right side view showing the change in the three-dimensional space when pointing at an object behind the pointer.
- its upper, middle, and lower parts show three states in the three-dimensional space, and the figure shows how the state changes from the upper part to the middle part, and from the middle part to the lower part, when the operations shown between the parts are performed.
- when the pointer 4 is rotated and displayed in the XY plane in accordance with the rotation of the mouse wheel 202A, the pointer 4 may be displayed so as to rotate continuously in accordance with the total rotation angle of the wheel 202A, or, in synchronization with the rotation steps of the wheel 202A, so as to rotate stepwise by a predetermined angle each time the wheel 202A is rotated by one step.
- in Example 3-1, as shown in FIG. 51, the case where a two-dimensional display device such as a liquid crystal display is used as the display device 3 has been described as an example; with a three-dimensional display device capable of stereoscopic display, such as a DFD, the depth position of the pointer can be recognized even more accurately and intuitively.
- FIG. 61 and FIGS. 62A and B are schematic diagrams for explaining a third application example of the three-dimensional pointing method of the embodiment 3-1
- FIG. 61 is a diagram showing a configuration example of the system
- FIGS. 62A and 62B are diagrams for explaining the operating principle of the DFD.
- as described above, the display device 3 may be any display device that can represent a three-dimensional space; however, it is preferable to use a three-dimensional display device (display) such as a DFD.
- the DFD is a display device in which a plurality of display surfaces overlap in the depth direction as viewed from the operator (observer) (for example, see Japanese Patent Nos. 3022558 and 3460671).
- the operating principle of the DFD is the same as described in the first embodiment. To simplify the explanation, it is assumed that two display surfaces 301A and 301B overlap, as shown in FIGS. 62A and 62B. At this time, the pointer 4 and the object 5 are displayed reflecting their depth positions in the three-dimensional space between the two display surfaces 301A and 301B.
- the pointer 4 and the object 5 displayed on the DFD are displayed on both the front display surface 301A and the back display surface 301B as viewed from the operator.
- when the DFD is a luminance modulation type, the object 5A on the front display surface 301A is displayed with luminance LA and the object 5B on the back display surface 301B with luminance LB; the object 5 then appears at the depth position at which the ratio of the distance from the front display surface 301A to the distance from the back display surface 301B corresponds to the ratio of the luminances LA and LB.
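- the luminance split can be sketched as follows (a minimal illustration of the ratio rule above under an assumed linear model; the normalization of depth to 0..1 is an assumption, not from the patent):

```python
def dfd_luminances(total_luminance, depth):
    """Split one pixel's luminance between the two DFD surfaces.

    depth: 0.0 at the front surface 301A, 1.0 at the back surface 301B.
    Returns (front_luminance, back_luminance); putting more light on the
    front surface pulls the perceived image toward the operator.
    """
    depth = min(max(depth, 0.0), 1.0)
    return (total_luminance * (1.0 - depth), total_luminance * depth)
```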
- by giving the luminance a distribution within one object, a single object 5 can be displayed tilted in the depth direction. In the example shown, the luminance of the object 5A on the front display surface 301A increases from the top toward the bottom of the page, while the luminance of the object 5B on the back display surface 301B increases from the bottom toward the top.
- when the DFD is a transmissive type, by adjusting the transparency of each point (pixel) in the area displaying the object 5A on the front display surface 301A, the stereoscopic image of the pointer 4 and the object 5 can be displayed at an arbitrary depth position between the front display surface 301A and the back display surface 301B, as with the luminance modulation type DFD.
- in this way, the operator does not recognize the depth merely from the displayed width of the pointer 4; by comparing the depth position of the end of the pointer 4 whose depth does not change with the depth position of the tip, the depth position of the pointer 4 can be recognized intuitively and accurately. Moreover, when a DFD is used, if the end whose depth does not change is placed at the depth position of the front or back display surface of the DFD, the operator can recognize the depth position of the pointer 4 even more accurately and intuitively, which is a great advantage.
- when a two-dimensional display device is used, a process for projecting the three-dimensional space to be expressed onto a two-dimensional plane is necessary, whereas with a three-dimensional display device it is only necessary to set the luminance ratio of the points (pixels) on each display surface according to the position in the depth direction of the three-dimensional space, which reduces the load on the system control device 1. In addition, while a two-dimensional display shows the three-dimensional space projected onto a two-dimensional plane, a three-dimensional display device such as the DFD allows a pointing operation with a feeling closer to real space. For these reasons, by using a three-dimensional display device such as the DFD, the operator can point at a three-dimensional depth more accurately and faster than with a pointing operation using a general two-dimensional display.
- FIG. 61 shows a case where a keyboard 201 and a pen tablet are used in combination as the input device 2; the technique of the first embodiment can then be applied. That is, the pen tablet is an input device that detects the movement of the pen tip of the input pen (electronic pen) 203B, the writing pressure, and so on, when the input pen is operated on the detection means (digitizer) 203A. Therefore, for example, by reflecting the movement of the pen tip of the input pen 203B in the movement of the pointer 4 in the XY plane, and the amount of writing pressure in the movement in the Z direction, pointing with the pointer 4 is possible with the same operational feeling as when using the mouse 202.
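- this pen-based mapping can be sketched as follows (an illustrative assumption: the pressure-to-depth curve, the normalization of pressure to 0..1, and the depth range are not specified in the text):

```python
MAX_DEPTH = 100.0  # depth of the represented 3D space (assumed units)

def pen_to_pointing_point(pen_x, pen_y, pressure):
    """Map pen-tip position and writing pressure to a pointing point.

    pressure: normalized 0.0 (light touch) .. 1.0 (full pressure);
    pressing harder moves the pointing point deeper (+Z direction).
    """
    z = MAX_DEPTH * pressure
    return (pen_x, pen_y, z)
```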
- the technique of the second embodiment can also be applied.
- in that case, the amount by which the pointer 4 tilts is determined, for example, according to the number of times the detection means 203A is pressed with the input pen 203B.
- further, if the detection means (digitizer) 203A of the pen tablet is superimposed on the display surfaces 301A and 301B of the display device 3 (DFD), the operator can operate the input pen 203B on the display surfaces 301A and 301B to perform pointing, so the operator can recognize the depth position of the pointer 4 more accurately and intuitively.
- the three-dimensional pointing method of this embodiment 3-1 can also be performed when pointing with an input device that can be integrated with the display device 3, such as a touch panel, instead of the pen tablet.
- with the touch panel, for example, the pointer 4 can be operated by touching the screen of the display device 3 with the operator's finger instead of the input pen 203B, so the pointer 4 can be operated even more intuitively than with a pen tablet using the input pen 203B.
- in the operation examples so far, y is kept constant and the pointing point moves along a straight line extending in the depth direction; however, the pointer may be tilted so that the pointing point follows various other trajectories.
- FIGS. 63A to 66B are schematic diagrams for explaining a fourth application example of the three-dimensional pointing method of the embodiment 3-1: FIGS. 63A, 63B, 64A, and 64B show application examples in which the pointing point follows a straight-line trajectory, and FIGS. 65A, 65B, 66A, and 66B show application examples in which it follows a circular-arc trajectory.
- FIGS. 63A to 66B each show the three-dimensional space viewed from the XZ-plane side (right side views) to make the trajectory of the pointing point easier to see.
- for example, as shown in FIGS. 63A and 63B, the pointing point may follow a trajectory that moves in the +X direction in proportion to the movement in the +Z direction, or a trajectory that moves in the -X direction in proportion to the movement in the +Z direction.
- in these examples the pointer is tilted while maintaining its entire length; however, the method is not limited to this, and, as shown in FIGS. 64A and 64B, the pointing point may be moved in the depth direction with the three-dimensional position of one end of the pointer fixed, tilting the pointer in the depth direction while the entire length of the pointer changes.
- the trajectory is not limited to a straight line; for example, as shown in FIGS. 65A and 65B, the pointing point may follow an arc of radius r drawn in the XZ plane around a certain point (x, y, z) in the three-dimensional space.
- in FIGS. 65A and 65B the pointer is tilted while maintaining its entire length; however, as shown in FIGS. 66A and 66B, the pointing point may also be moved in the depth direction with the three-dimensional position of one end of the pointer fixed, tilting the pointer in the depth direction while the entire length of the pointer changes.
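- the arc trajectory of FIGS. 65A to 66B can be sketched as follows (a minimal illustration; the parameterization by an angle phi is an assumption):

```python
import math

def arc_point(center, r, phi):
    """Point on an arc of radius r in the XZ plane around center (cx, y, cz).

    phi: angle along the arc in radians; y stays constant, so the pointing
    point sweeps through depth (Z) while shifting in X.
    """
    cx, y, cz = center
    return (cx + r * math.cos(phi), y, cz + r * math.sin(phi))
```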
- the trajectory is not limited to those shown in the drawings, and may be any trajectory.
- FIGS. 67 to 69 are schematic diagrams for explaining the three-dimensional pointing method according to the embodiment 3-2 of the present invention.
- FIG. 67 shows a front view and a right side view of the change in the three-dimensional space when an object behind the pointer is pointed at.
- FIG. 68A is a diagram for explaining a problem in the three-dimensional pointing method of Example 3-2
- FIG. 68B is a diagram for explaining a method for solving the problem shown in FIG. 68A.
- FIG. 69 is a diagram for explaining a modification of the solution shown in FIG. 68B.
- FIG. 67 shows three states in the three-dimensional space in its upper, middle, and lower parts, and shows how the state in the three-dimensional space changes from the upper part to the middle part, and from the middle part to the lower part, when the operations shown between the parts are performed.
- in Example 3-1, the pointer 4 is tilted while the depth position of the end opposite to the pointing point is kept constant. In Example 3-2, as an example of a tilting method different from Example 3-1, a three-dimensional pointing method is described in which the pointer 4 is rotated about a certain point (x, y, z) in the three-dimensional space while the distance between each point on the pointer 4 and that center is kept constant, allowing the operator to recognize the tilt of the pointer 4 and the pointing position.
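- this rotation rule can be sketched as follows (a minimal illustration; representing the pointer as a list of sample points and rotating in the XZ plane are assumptions for clarity):

```python
import math

def rotate_about_center(points, center, alpha):
    """Rotate pointer sample points by angle alpha in the XZ plane.

    Every point keeps its distance to center (cx, cy, cz), matching the
    constant-distance condition of Example 3-2; y is left unchanged.
    """
    cx, cy, cz = center
    ca, sa = math.cos(alpha), math.sin(alpha)
    out = []
    for x, y, z in points:
        dx, dz = x - cx, z - cz
        out.append((cx + dx * ca - dz * sa, y, cz + dx * sa + dz * ca))
    return out
```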
- the input device 2 may be a keyboard, a mouse, a pen tablet, a touch panel, a joystick, or the like; here, as shown in FIG. 51, it is assumed that a keyboard 201 and a mouse 202 are used.
- the display device 3 may be a two-dimensional display device such as a CRT or a liquid crystal display, or a three-dimensional display device such as a DFD; here, as shown in FIG. 51, it is assumed that a liquid crystal display (two-dimensional display) is used.
- the moving direction and moving distance of the pointer 4 in the XY plane of the displayed three-dimensional space are calculated (determined) based on the moving direction and moving distance when the mouse 202 body is moved two-dimensionally on a plane such as a desktop.
- the moving direction and moving distance in the depth direction (Z direction) are calculated (determined) based on the rotation direction and rotation angle of the wheel 202A of the mouse 202 when the wheel is rotated while a predetermined key, such as the control key (Ctrl key) of the keyboard 201 shown in FIG. 51, is held down.
- when the mouse wheel 202A is rotated in the +Z direction, the pointing portion of the pointer moves in the +Z direction of the three-dimensional space, that is, toward the back as viewed from the operator; when the mouse wheel 202A is rotated in the -Z direction, the pointing portion of the pointer moves in the -Z direction of the three-dimensional space, that is, toward the front as viewed from the operator.
- as in Example 3-1, the pointer has an arrow shape, and the tip of the arrow represents the point (x, y, z) being pointed at.
- in this way, the operator can recognize that the pointer 4 is tilted further in the depth direction (+Z direction) and, at the same time, from the shape of the pointer 4, that it is pointing further toward the back than before the operation.
- the pointer 4 is rotated by a certain angle about the center point so that the pointing point moves toward the back (+Z direction) or toward the front (-Z direction). Therefore, as shown in the middle part of FIG. 67, when the pointer 4 is tilted toward the back, the depth position of part of the pointer 4 after tilting (the part opposite to the tip of the arrow) comes in front of the depth position before tilting. At this time, if the depth position of the pointer is sufficiently deep, the part opposite to the tip of the arrow still lies within the three-dimensional space represented on the display device 3 and can be displayed. However, as shown for example in FIG. 68A, if the pointer 4 is on the display surface 301, in other words on the boundary surface between the three-dimensional space represented by the display device and the real space where the operator is, the part opposite to the tip of the arrow falls outside the three-dimensional space when the pointer 4 is rotated and tilted, and cannot be displayed; as a result, the shape of the pointer 4 on the side opposite to the tip of the arrow appears cut off.
- to prevent this, as shown in FIG. 68B, a pointer in which the protruding portion is bent along the boundary surface (XY plane) of the three-dimensional space may be generated and displayed. In that case, in step 606, where the pointer is tilted and displayed, it is determined based on the calculation result of step 605 whether any part of the pointer protrudes from the three-dimensional space when tilted; if there is a protruding part, a pointer in which the protruding part is projected or folded onto the XY plane (boundary surface) is generated and displayed on the display device 3.
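- the simpler "project" variant of this handling can be sketched as follows (assumed conventions, not from the patent: the represented space is z >= 0 with the boundary surface at the z = 0 XY plane, and the pointer is represented as sample points):

```python
def project_protrusion(points, z_boundary=0.0):
    """Map pointer sample points that protrude past the boundary back onto it.

    Points with z < z_boundary would fall outside the represented space
    (toward the operator), so they are clamped onto the boundary plane.
    """
    return [(x, y, max(z, z_boundary)) for x, y, z in points]
```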
- FIG. 67 illustrates an operation when the pointer 4 is in front of the depth position of the object 5 as viewed from the operator and the pointer 4 is tilted back (+ Z direction).
- conversely, the pointer 4 can also be tilted toward the front (-Z direction).
- the operator who has seen the pointer 4 can intuitively and accurately recognize the depth position of the pointer 4 and the depth position at which the pointer 4 is pointing.
- in this Example 3-2, the keyboard 201 and the mouse 202 are used as the input device 2, and the pointer 4 is tilted in the depth direction by combining the control key 201A of the keyboard with the rotation operation of the mouse wheel 202A; however, the combination is not limited to this, and another key of the keyboard 201 may be combined with the wheel 202A, or a cursor key (direction key) of the keyboard 201 may be used instead of the wheel 202A.
- the pointer may also be tilted when a predetermined operation is performed with a pen tablet, a touch panel, a joystick, or the like.
- the display device 3 is not limited to a two-dimensional display device such as a liquid crystal display, but may be a three-dimensional display device such as DFD.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Architecture (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
- Position Input By Displaying (AREA)
Abstract
Description
Claims
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/586,447 US7880726B2 (en) | 2004-10-12 | 2005-10-12 | 3D pointing method, 3D display control method, 3D pointing device, 3D display control device, 3D pointing program, and 3D display control program |
JP2006540952A JP4515458B2 (ja) | 2004-10-12 | 2005-10-12 | 3次元ポインティング方法、3次元ポインティング装置、及び3次元ポインティングプログラム |
EP05793698A EP1821182B1 (en) | 2004-10-12 | 2005-10-12 | 3d pointing method, 3d display control method, 3d pointing device, 3d display control device, 3d pointing program, and 3d display control program |
Applications Claiming Priority (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004297927 | 2004-10-12 | ||
JP2004-297927 | 2004-10-12 | ||
JP2004306636 | 2004-10-21 | ||
JP2004-306636 | 2004-10-21 | ||
JP2004309667 | 2004-10-25 | ||
JP2004-309667 | 2004-10-25 | ||
JP2005185131 | 2005-06-24 | ||
JP2005-185131 | 2005-06-24 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006041097A1 true WO2006041097A1 (ja) | 2006-04-20 |
Family
ID=36148384
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/018799 WO2006041097A1 (ja) | 2004-10-12 | 2005-10-12 | 3次元ポインティング方法、3次元表示制御方法、3次元ポインティング装置、3次元表示制御装置、3次元ポインティングプログラム、及び3次元表示制御プログラム |
Country Status (6)
Country | Link |
---|---|
US (1) | US7880726B2 (ja) |
EP (1) | EP1821182B1 (ja) |
JP (3) | JP4515458B2 (ja) |
KR (1) | KR100832355B1 (ja) |
CN (3) | CN101308441B (ja) |
WO (1) | WO2006041097A1 (ja) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007317050A (ja) * | 2006-05-29 | 2007-12-06 | Nippon Telegr & Teleph Corp <Ntt> | 3次元表示を用いたユーザインタフェースシステム |
WO2008024072A1 (en) * | 2006-08-25 | 2008-02-28 | Weike (S) Pte Ltd | Virtual gaming system and method |
JP2009157908A (ja) * | 2007-12-07 | 2009-07-16 | Sony Corp | 情報表示端末、情報表示方法、およびプログラム |
JP2009245239A (ja) * | 2008-03-31 | 2009-10-22 | Sony Corp | ポインタ表示装置、ポインタ表示検出方法、ポインタ表示検出プログラム及び情報機器 |
US20090309854A1 (en) * | 2008-06-13 | 2009-12-17 | Polyvision Corporation | Input devices with multiple operating modes |
JP2011113107A (ja) * | 2009-11-24 | 2011-06-09 | Konami Digital Entertainment Co Ltd | 入力受付装置、入力判定方法、および、プログラム |
JP2011170901A (ja) * | 2011-06-09 | 2011-09-01 | Sony Corp | ポインタ表示装置、ポインタ表示検出方法及び情報機器 |
JP2012018559A (ja) * | 2010-07-08 | 2012-01-26 | Sony Corp | 情報処理装置、情報処理方法およびプログラム |
JP2012113389A (ja) * | 2010-11-22 | 2012-06-14 | Internatl Business Mach Corp <Ibm> | タッチパネルにおけるドラッグ操作でオブジェクトを移動させる方法、装置及びコンピュータプログラム |
WO2013054583A1 (ja) * | 2011-10-11 | 2013-04-18 | インターナショナル・ビジネス・マシーンズ・コーポレーション | オブジェクト指示方法、装置及びコンピュータ・プログラム |
JP2013190926A (ja) * | 2012-03-13 | 2013-09-26 | Nikon Corp | 入力装置、及び表示装置 |
WO2014054317A1 (ja) * | 2012-10-05 | 2014-04-10 | Necソフト株式会社 | ユーザインタフェース装置及びユーザインタフェース方法 |
JP2018106499A (ja) * | 2016-12-27 | 2018-07-05 | 株式会社コロプラ | 仮想空間における画像の表示を制御するためにコンピュータによって実行される方法、当該方法をコンピュータに実現させるためのプログラム、および、コンピュータ装置 |
Families Citing this family (173)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7961909B2 (en) | 2006-03-08 | 2011-06-14 | Electronic Scripting Products, Inc. | Computer interface employing a manipulated object with absolute pose detection component and a display |
US9229540B2 (en) | 2004-01-30 | 2016-01-05 | Electronic Scripting Products, Inc. | Deriving input from six degrees of freedom interfaces |
JP4832826B2 (ja) * | 2005-07-26 | 2011-12-07 | 任天堂株式会社 | オブジェクト制御プログラムおよび情報処理装置 |
US7509588B2 (en) | 2005-12-30 | 2009-03-24 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
JP2007272067A (ja) * | 2006-03-31 | 2007-10-18 | Brother Ind Ltd | 画像表示装置 |
KR100811954B1 (ko) * | 2006-07-13 | 2008-03-10 | 현대자동차주식회사 | 물체상 표시방법 |
US10313505B2 (en) | 2006-09-06 | 2019-06-04 | Apple Inc. | Portable multifunction device, method, and graphical user interface for configuring and displaying widgets |
WO2008029467A1 (fr) * | 2006-09-07 | 2008-03-13 | Osaka Electro-Communication University | Système, procédé et programme d'entrée de mouvement |
US8564543B2 (en) * | 2006-09-11 | 2013-10-22 | Apple Inc. | Media player with imaged based browsing |
KR101299682B1 (ko) * | 2006-10-16 | 2013-08-22 | 삼성전자주식회사 | 범용 입력장치 |
US8519964B2 (en) | 2007-01-07 | 2013-08-27 | Apple Inc. | Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display |
US8619038B2 (en) | 2007-09-04 | 2013-12-31 | Apple Inc. | Editing interface |
JP4442683B2 (ja) * | 2007-11-27 | 2010-03-31 | セイコーエプソン株式会社 | 表示システム、表示装置及びプログラム |
US20090167702A1 (en) * | 2008-01-02 | 2009-07-02 | Nokia Corporation | Pointing device detection |
KR101499546B1 (ko) * | 2008-01-17 | 2015-03-09 | 삼성전자주식회사 | 터치 스크린 장치의 디스플레이 영역 제어 방법, 장치, 및기록매체 |
US8169414B2 (en) * | 2008-07-12 | 2012-05-01 | Lim Seung E | Control of electronic games via finger angle using a high dimensional touchpad (HDTP) touch user interface |
US8508475B2 (en) * | 2008-10-24 | 2013-08-13 | Microsoft Corporation | User interface elements positioned for display |
US8984431B2 (en) | 2009-03-16 | 2015-03-17 | Apple Inc. | Device, method, and graphical user interface for moving a current position in content at a variable scrubbing rate |
TWI378370B (en) * | 2009-05-07 | 2012-12-01 | Wistron Corp | Buffered stylus |
KR20110004027A (ko) * | 2009-07-07 | 2011-01-13 | 삼성전자주식회사 | 펜형 입력 장치 및 이를 이용한 입력 방법 |
CN101599182B (zh) * | 2009-07-29 | 2012-10-03 | 威盛电子股份有限公司 | 三维物件旋转方法和与其对应的计算机*** |
JP4701424B2 (ja) * | 2009-08-12 | 2011-06-15 | 島根県 | 画像認識装置および操作判定方法並びにプログラム |
KR100940744B1 (ko) * | 2009-09-24 | 2010-02-04 | (주)디지탈아리아 | 임베디드 및 모바일 기기에서 3차원 질감을 가진 표면의 접촉 그래픽 효과를 구현하는 방법 |
WO2011047618A1 (zh) * | 2009-10-20 | 2011-04-28 | Tuan Hsi-Ching | 鼠标笔及其光电控制开关 |
US9681112B2 (en) * | 2009-11-05 | 2017-06-13 | Lg Electronics Inc. | Image display apparatus and method for controlling the image display apparatus |
CN101739202B (zh) * | 2009-11-06 | 2011-11-30 | 谢达 | 一种局部会自动透视的用户界面显示方法 |
FI20090434A (fi) * | 2009-11-17 | 2011-05-18 | Tampereen Yliopisto | Menetelmä, tietokoneohjelma ja laite tietokoneen kanssa tapahtuvaan vuorovaikutukseen |
JP5898842B2 (ja) * | 2010-01-14 | 2016-04-06 | 任天堂株式会社 | 携帯型情報処理装置、携帯型ゲーム装置 |
EP2355526A3 (en) * | 2010-01-14 | 2012-10-31 | Nintendo Co., Ltd. | Computer-readable storage medium having stored therein display control program, display control apparatus, display control system, and display control method |
US10007393B2 (en) * | 2010-01-19 | 2018-06-26 | Apple Inc. | 3D view of file structure |
JP2011177203A (ja) * | 2010-02-26 | 2011-09-15 | Nintendo Co Ltd | オブジェクト制御プログラムおよびオブジェクト制御装置 |
JP5800501B2 (ja) | 2010-03-12 | 2015-10-28 | 任天堂株式会社 | 表示制御プログラム、表示制御装置、表示制御システム、及び、表示制御方法 |
US20130021288A1 (en) * | 2010-03-31 | 2013-01-24 | Nokia Corporation | Apparatuses, Methods and Computer Programs for a Virtual Stylus |
US8826184B2 (en) * | 2010-04-05 | 2014-09-02 | Lg Electronics Inc. | Mobile terminal and image display controlling method thereof |
US9170708B2 (en) | 2010-04-07 | 2015-10-27 | Apple Inc. | Device, method, and graphical user interface for managing folders |
US10788976B2 (en) | 2010-04-07 | 2020-09-29 | Apple Inc. | Device, method, and graphical user interface for managing folders with multiple pages |
JP5229273B2 (ja) * | 2010-05-28 | 2013-07-03 | 株式会社Jvcケンウッド | タッチパネルを有する電子機器、及び動作制御方法 |
US8384770B2 (en) | 2010-06-02 | 2013-02-26 | Nintendo Co., Ltd. | Image display system, image display apparatus, and image display method |
JP5520698B2 (ja) | 2010-06-07 | 2014-06-11 | 任天堂株式会社 | ゲーム装置、ゲームプログラムおよびゲームシステム |
US11068149B2 (en) * | 2010-06-09 | 2021-07-20 | Microsoft Technology Licensing, Llc | Indirect user interaction with desktop using touch-sensitive control surface |
JP5617375B2 (ja) * | 2010-06-22 | 2014-11-05 | ソニー株式会社 | 画像表示装置、表示制御方法及びプログラム |
JP5574849B2 (ja) * | 2010-06-28 | 2014-08-20 | キヤノン株式会社 | 情報処理装置及びその制御方法、プログラム |
JP5676608B2 (ja) * | 2010-06-29 | 2015-02-25 | 富士フイルム株式会社 | 立体表示装置、立体撮影装置、および指示判定方法 |
WO2012025159A1 (en) * | 2010-08-27 | 2012-03-01 | Brainlab Ag | Multiple-layer pointing position determination on a medical display |
JP2012058896A (ja) | 2010-09-07 | 2012-03-22 | Sony Corp | 情報処理装置、プログラムおよび情報処理方法 |
US9207859B2 (en) | 2010-09-14 | 2015-12-08 | Lg Electronics Inc. | Method and mobile terminal for displaying fixed objects independent of shifting background images on a touchscreen |
KR101708696B1 (ko) * | 2010-09-15 | 2017-02-21 | 엘지전자 주식회사 | 휴대 단말기 및 그 동작 제어방법 |
JP5655478B2 (ja) * | 2010-10-01 | 2015-01-21 | ソニー株式会社 | 情報処理装置、情報処理方法 |
US9001053B2 (en) * | 2010-10-28 | 2015-04-07 | Honeywell International Inc. | Display system for controlling a selector symbol within an image |
US10146426B2 (en) * | 2010-11-09 | 2018-12-04 | Nokia Technologies Oy | Apparatus and method for user input for controlling displayed information |
JP2012103980A (ja) * | 2010-11-11 | 2012-05-31 | Sony Corp | 画像処理装置、画像処理方法及びプログラム |
US20120119990A1 (en) * | 2010-11-12 | 2012-05-17 | Kye Systmes Corp. | Pointer control device, system and method |
KR101788049B1 (ko) * | 2010-12-15 | 2017-10-19 | 엘지전자 주식회사 | 이동 단말기 및 그 제어방법 |
JP6021296B2 (ja) | 2010-12-16 | 2016-11-09 | 任天堂株式会社 | 表示制御プログラム、表示制御装置、表示制御システム、および、表示制御方法 |
KR101763263B1 (ko) * | 2010-12-24 | 2017-07-31 | 삼성전자주식회사 | 3d 디스플레이 단말 장치 및 그 조작 방법 |
US9519357B2 (en) * | 2011-01-30 | 2016-12-13 | Lg Electronics Inc. | Image display apparatus and method for operating the same in 2D and 3D modes |
US9271027B2 (en) * | 2011-01-30 | 2016-02-23 | Lg Electronics Inc. | Image display apparatus and method for operating the same |
US20120206419A1 (en) * | 2011-02-11 | 2012-08-16 | Massachusetts Institute Of Technology | Collapsible input device |
US20120223935A1 (en) * | 2011-03-01 | 2012-09-06 | Nokia Corporation | Methods and apparatuses for facilitating interaction with a three-dimensional user interface |
JP2012190184A (ja) * | 2011-03-09 | 2012-10-04 | Sony Corp | 画像処理装置および方法、並びにプログラム |
JP2012190183A (ja) * | 2011-03-09 | 2012-10-04 | Sony Corp | 画像処理装置および方法、並びにプログラム |
KR101781908B1 (ko) * | 2011-03-24 | 2017-09-26 | 엘지전자 주식회사 | 이동 단말기 및 그것의 제어방법 |
US8314790B1 (en) * | 2011-03-29 | 2012-11-20 | Google Inc. | Layer opacity adjustment for a three-dimensional object |
US20120272180A1 (en) * | 2011-04-20 | 2012-10-25 | Nokia Corporation | Method and apparatus for providing content flipping based on a scrolling operation |
US8933913B2 (en) | 2011-06-28 | 2015-01-13 | Microsoft Corporation | Electromagnetic 3D stylus |
JP2013016018A (ja) * | 2011-07-04 | 2013-01-24 | Canon Inc | 表示制御装置、制御方法及びプログラム |
WO2013008121A1 (en) * | 2011-07-13 | 2013-01-17 | Koninklijke Philips Electronics N.V. | Method for automatically adjusting a focal plane of a digital pathology image |
KR101941644B1 (ko) * | 2011-07-19 | 2019-01-23 | 삼성전자 주식회사 | 휴대 단말기의 피드백 제공 방법 및 장치 |
CN103946773A (zh) * | 2011-08-29 | 2014-07-23 | S·瓦利切克 | 多功能笔输入***计算机控制器 |
CN102981743B (zh) * | 2011-09-05 | 2016-05-25 | 联想(北京)有限公司 | 控制操作对象的方法及电子设备 |
US9519350B2 (en) | 2011-09-19 | 2016-12-13 | Samsung Electronics Co., Ltd. | Interface controlling apparatus and method using force |
US9501098B2 (en) | 2011-09-19 | 2016-11-22 | Samsung Electronics Co., Ltd. | Interface controlling apparatus and method using force |
CN102508563B (zh) * | 2011-11-03 | 2013-08-07 | 深圳超多维光电子有限公司 | 一种立体交互方法以及***作设备 |
CN102426486B (zh) * | 2011-11-03 | 2013-08-07 | 深圳超多维光电子有限公司 | 一种立体交互方法及***作设备 |
US20130117717A1 (en) * | 2011-11-03 | 2013-05-09 | Shenzhen Super Perfect Optics Limited | 3d user interaction system and method |
CN102508562B (zh) * | 2011-11-03 | 2013-04-10 | 深圳超多维光电子有限公司 | 一种立体交互*** |
CN102508561B (zh) * | 2011-11-03 | 2013-11-06 | 深圳超多维光电子有限公司 | 一种操作棒 |
JP2013118468A (ja) * | 2011-12-02 | 2013-06-13 | Sony Corp | 画像処理装置および画像処理方法 |
JP2013125247A (ja) * | 2011-12-16 | 2013-06-24 | Sony Corp | ヘッドマウントディスプレイ及び情報表示装置 |
WO2013135270A1 (en) * | 2012-03-13 | 2013-09-19 | Telefonaktiebolaget L M Ericsson (Publ) | An apparatus and method for navigating on a touch sensitive screen thereof |
AU2013259613B2 (en) | 2012-05-09 | 2016-07-21 | Apple Inc. | Device, method, and graphical user interface for providing tactile feedback for operations performed in a user interface |
CN104487928B (zh) | 2012-05-09 | 2018-07-06 | 苹果公司 | 用于响应于手势而在显示状态之间进行过渡的设备、方法和图形用户界面 |
WO2013169851A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for facilitating user interaction with controls in a user interface |
WO2013169846A1 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for displaying additional information in response to a user contact |
WO2013169865A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for moving a user interface object based on an intensity of a press input |
WO2013169842A2 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for selecting object within a group of objects |
AU2013259642A1 (en) | 2012-05-09 | 2014-12-04 | Apple Inc. | Device, method, and graphical user interface for moving and dropping a user interface object |
WO2013169849A2 (en) | 2012-05-09 | 2013-11-14 | Industries Llc Yknots | Device, method, and graphical user interface for displaying user interface objects corresponding to an application |
WO2013169843A1 (en) | 2012-05-09 | 2013-11-14 | Yknots Industries Llc | Device, method, and graphical user interface for manipulating framed graphical objects |
CN104471521B (zh) | 2012-05-09 | 2018-10-23 | 苹果公司 | 用于针对改变用户界面对象的激活状态来提供反馈的设备、方法和图形用户界面 |
CN106201316B (zh) | 2012-05-09 | 2020-09-29 | 苹果公司 | 用于选择用户界面对象的设备、方法和图形用户界面 |
JP5808712B2 (ja) * | 2012-05-23 | 2015-11-10 | 日立マクセル株式会社 | 映像表示装置 |
US9098516B2 (en) * | 2012-07-18 | 2015-08-04 | DS Zodiac, Inc. | Multi-dimensional file system |
KR101986218B1 (ko) * | 2012-08-02 | 2019-06-05 | 삼성전자주식회사 | 디스플레이 장치 및 방법 |
JP6096473B2 (ja) * | 2012-11-01 | 2017-03-15 | 東芝メディカルシステムズ株式会社 | 医用画像診断装置、および、医用画像処理方法 |
US10241638B2 (en) * | 2012-11-02 | 2019-03-26 | Atheer, Inc. | Method and apparatus for a three dimensional interface |
EP2912542B1 (en) | 2012-12-29 | 2022-07-13 | Apple Inc. | Device and method for forgoing generation of tactile output for a multi-contact gesture |
EP2939095B1 (en) * | 2012-12-29 | 2018-10-03 | Apple Inc. | Device, method, and graphical user interface for moving a cursor according to a change in an appearance of a control icon with simulated three-dimensional characteristics |
KR101812329B1 (ko) | 2012-12-29 | 2017-12-26 | 애플 인크. | 콘텐츠를 스크롤할지 선택할지 결정하기 위한 디바이스, 방법 및 그래픽 사용자 인터페이스 |
US9075464B2 (en) | 2013-01-30 | 2015-07-07 | Blackberry Limited | Stylus based object modification on a touch-sensitive display |
EP2763019A1 (en) * | 2013-01-30 | 2014-08-06 | BlackBerry Limited | Stylus based object modification on a touch-sensitive display |
CN104077013B (zh) * | 2013-03-28 | 2019-02-05 | 联想(北京)有限公司 | 指令识别方法和电子设备 |
TWI502459B (zh) * | 2013-07-08 | 2015-10-01 | Acer Inc | 電子裝置及其觸控操作方法 |
JP5616503B1 (ja) * | 2013-08-30 | 2014-10-29 | ヤフー株式会社 | 配信装置、端末装置、配信方法及び配信プログラム |
KR101531169B1 (ko) * | 2013-09-23 | 2015-06-24 | 삼성전자주식회사 | 사용자 단말에서 3차원 객체를 그리는 방법 및 이를 수행하는 사용자 단말 |
KR102129594B1 (ko) | 2013-10-30 | 2020-07-03 | 애플 인크. | 관련 사용자 인터페이스 객체를 표시 |
US10430017B2 (en) * | 2013-12-04 | 2019-10-01 | City University Of Hong Kong | Target pointing system making use of velocity dependent cursor |
US10423245B2 (en) * | 2014-01-31 | 2019-09-24 | Qualcomm Incorporated | Techniques for providing user input to a device |
JP2015149634A (ja) | 2014-02-07 | 2015-08-20 | ソニー株式会社 | 画像表示装置および方法 |
CN106030460B (zh) * | 2014-02-18 | 2017-12-19 | 三菱电机株式会社 | 移动体用手势引导装置、移动体用手势引导***及移动体用手势引导方法 |
JP2015156135A (ja) * | 2014-02-20 | 2015-08-27 | 株式会社東芝 | 表示装置、方法及びプログラム |
CN103984425B (zh) * | 2014-04-25 | 2017-01-04 | 深圳超多维光电子有限公司 | 用于立体显示交互的操作棒和操作棒的控制方法 |
US9827714B1 (en) | 2014-05-16 | 2017-11-28 | Google Llc | Method and system for 3-D printing of 3-D object models in interactive content items |
JP6112618B2 (ja) * | 2014-07-17 | 2017-04-12 | Necプラットフォームズ株式会社 | 情報処理システム |
CN110072131A (zh) | 2014-09-02 | 2019-07-30 | 苹果公司 | 音乐用户界面 |
US9766724B2 (en) * | 2014-11-03 | 2017-09-19 | Lenovo (Singapore) Pte. Ltd. | Orientation dependent stylus button function |
CN105787402B (zh) | 2014-12-16 | 2019-07-05 | 阿里巴巴集团控股有限公司 | 一种信息展示方法及装置 |
KR101652973B1 (ko) * | 2014-12-17 | 2016-09-05 | 주식회사 트레이스 | 스타일러스 펜의 기울기 정보를 이용하는 디지타이저 |
US9632664B2 (en) | 2015-03-08 | 2017-04-25 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
US10048757B2 (en) | 2015-03-08 | 2018-08-14 | Apple Inc. | Devices and methods for controlling media presentation |
US9645732B2 (en) | 2015-03-08 | 2017-05-09 | Apple Inc. | Devices, methods, and graphical user interfaces for displaying and using menus |
US10095396B2 (en) | 2015-03-08 | 2018-10-09 | Apple Inc. | Devices, methods, and graphical user interfaces for interacting with a control object while dragging another object |
US9639184B2 (en) | 2015-03-19 | 2017-05-02 | Apple Inc. | Touch input cursor manipulation |
EP3273331A4 (en) * | 2015-03-20 | 2018-04-18 | Ricoh Company, Ltd. | Display apparatus, display control method, display control program, and display system |
JP6651297B2 (ja) | 2015-03-27 | 2020-02-19 | ユニバーシティ・オブ・タンペレUniversity of Tampere | ハプティック・スタイラス |
US20170045981A1 (en) | 2015-08-10 | 2017-02-16 | Apple Inc. | Devices and Methods for Processing Touch Inputs Based on Their Intensities |
KR102465804B1 (ko) * | 2015-05-12 | 2022-11-10 | 엘지전자 주식회사 | 영상표시장치 및 그것의 제어방법 |
US9860451B2 (en) | 2015-06-07 | 2018-01-02 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US9830048B2 (en) | 2015-06-07 | 2017-11-28 | Apple Inc. | Devices and methods for processing touch inputs with instructions in a web page |
US9891811B2 (en) | 2015-06-07 | 2018-02-13 | Apple Inc. | Devices and methods for navigating between user interfaces |
US10200598B2 (en) | 2015-06-07 | 2019-02-05 | Apple Inc. | Devices and methods for capturing and interacting with enhanced digital images |
US10346030B2 (en) | 2015-06-07 | 2019-07-09 | Apple Inc. | Devices and methods for navigating between user interfaces |
US9658704B2 (en) | 2015-06-10 | 2017-05-23 | Apple Inc. | Devices and methods for manipulating user interfaces with a stylus |
EP3308288A4 (en) * | 2015-06-12 | 2019-01-23 | Nureva Inc. | METHOD AND DEVICE FOR ADMINISTERING AND ORGANIZING OBJECTS IN A VIRTUAL MEMORY |
US10235035B2 (en) | 2015-08-10 | 2019-03-19 | Apple Inc. | Devices, methods, and graphical user interfaces for content navigation and manipulation |
US10416800B2 (en) | 2015-08-10 | 2019-09-17 | Apple Inc. | Devices, methods, and graphical user interfaces for adjusting user interface objects |
US9880735B2 (en) | 2015-08-10 | 2018-01-30 | Apple Inc. | Devices, methods, and graphical user interfaces for manipulating user interface objects with visual and/or haptic feedback |
KR102449838B1 (ko) | 2015-09-01 | 2022-09-30 | Samsung Electronics Co., Ltd. | Method and apparatus for processing a three-dimensional object based on user interaction |
JP5997824B1 (ja) * | 2015-11-10 | 2016-09-28 | Optim Corporation | Remote terminal, remote instruction method, and program for remote terminal |
US9684391B1 (en) * | 2015-12-11 | 2017-06-20 | Logomark, Inc. | Telescopic mechanism and touch tool for writing instrument or the like |
CN105427824B (zh) * | 2016-01-05 | 2016-11-30 | BOE Technology Group Co., Ltd. | GOA circuit with leakage-compensation module, array substrate, and display panel |
KR102307215B1 (ko) | 2016-02-04 | 2021-10-01 | Huawei Technologies Co., Ltd. | Data processing method and electronic device |
US11577159B2 (en) | 2016-05-26 | 2023-02-14 | Electronic Scripting Products Inc. | Realistic virtual/augmented/mixed reality viewing and interactions |
DK201670595A1 (en) | 2016-06-11 | 2018-01-22 | Apple Inc | Configuring context-specific user interfaces |
US11816325B2 (en) | 2016-06-12 | 2023-11-14 | Apple Inc. | Application shortcuts for carplay |
CN107621893B (zh) * | 2016-07-15 | 2020-11-20 | Apple Inc. | Content creation using electronic input devices on non-electronic surfaces |
US10915185B2 (en) | 2016-10-31 | 2021-02-09 | Hewlett-Packard Development Company, L.P. | Generating a three-dimensional image using tilt angle of a digital pen |
KR102674463B1 (ko) * | 2016-12-23 | 2024-06-13 | Hyundai Motor Company | Vehicle and method for controlling the same |
EP3595424B1 (en) * | 2017-03-08 | 2022-04-27 | Fuji Corporation | Three-dimensional mounting device and three-dimensional mounting method |
CN107102750B (zh) * | 2017-04-23 | 2019-07-26 | Jilin University | Method for selecting a target in virtual three-dimensional space based on a pen-type interaction system |
US11431836B2 (en) | 2017-05-02 | 2022-08-30 | Apple Inc. | Methods and interfaces for initiating media playback |
US10992795B2 (en) | 2017-05-16 | 2021-04-27 | Apple Inc. | Methods and interfaces for home media control |
US10417827B2 (en) * | 2017-05-04 | 2019-09-17 | Microsoft Technology Licensing, Llc | Syndication of direct and indirect interactions in a computer-mediated reality environment |
US10928980B2 (en) | 2017-05-12 | 2021-02-23 | Apple Inc. | User interfaces for playing and managing audio items |
US20220279063A1 (en) | 2017-05-16 | 2022-09-01 | Apple Inc. | Methods and interfaces for home media control |
CN111343060B (zh) | 2017-05-16 | 2022-02-11 | Apple Inc. | Methods and interfaces for home media control |
EP3574387A4 (en) * | 2017-07-18 | 2020-09-30 | Hewlett-Packard Development Company, L.P. | PROJECTION OF INPUTS ON THREE-DIMENSIONAL OBJECT REPRESENTATIONS |
JP7134656B2 (ja) * | 2018-03-15 | 2022-09-12 | Sony Olympus Medical Solutions Inc. | Medical display control device and display control method |
US10592013B2 (en) * | 2018-04-25 | 2020-03-17 | Microsoft Technology Licensing, Llc | Systems and methods for unifying two-dimensional and three-dimensional interfaces |
KR20190140657A (ko) * | 2018-06-12 | 2019-12-20 | Samsung Electronics Co., Ltd. | Electronic device and system for generating an object |
CN108984262B (zh) * | 2018-07-12 | 2021-04-13 | Ningbo Shiruidi Photoelectric Co., Ltd. | Method, device, and electronic apparatus for creating a three-dimensional pointer |
EP3675063A1 (en) * | 2018-12-29 | 2020-07-01 | Dassault Systèmes | Forming a dataset for inference of solid cad features |
EP3675062A1 (en) | 2018-12-29 | 2020-07-01 | Dassault Systèmes | Learning a neural network for inference of solid cad features |
US11675476B2 (en) | 2019-05-05 | 2023-06-13 | Apple Inc. | User interfaces for widgets |
US11010121B2 (en) | 2019-05-31 | 2021-05-18 | Apple Inc. | User interfaces for audio media control |
EP4231124A1 (en) | 2019-05-31 | 2023-08-23 | Apple Inc. | User interfaces for audio media control |
US11042220B2 (en) * | 2019-08-02 | 2021-06-22 | Gustavo Mantovani Ruggiero | Three-dimensional input device |
KR102221172B1 (ko) * | 2019-10-04 | 2021-03-02 | I.S.V. Co., Ltd. | Method for controlling virtual clicks of a pen mouse |
US11946996B2 (en) | 2020-06-30 | 2024-04-02 | Apple, Inc. | Ultra-accurate object tracking using radar in multi-object environment |
US20220004299A1 (en) * | 2020-07-01 | 2022-01-06 | Wacom Co., Ltd. | Systems and methods for dynamic shape sketching |
US11392291B2 (en) | 2020-09-25 | 2022-07-19 | Apple Inc. | Methods and interfaces for media control with dynamic feedback |
US11681835B2 (en) * | 2020-12-31 | 2023-06-20 | Hexagon Technology Center Gmbh | Rapid positioning drawing system and method |
US11614806B1 (en) | 2021-05-12 | 2023-03-28 | Apple Inc. | Input device with self-mixing interferometry sensors |
Family Cites Families (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6022359B2 (ja) * | 1981-12-10 | 1985-06-01 | Matsushita Electric Industrial Co., Ltd. | Display input device for a stereoscopic display device |
US6590573B1 (en) * | 1983-05-09 | 2003-07-08 | David Michael Geshwind | Interactive computer system for creating three-dimensional image information and for converting two-dimensional image information for three-dimensional display systems |
US4808979A (en) * | 1987-04-02 | 1989-02-28 | Tektronix, Inc. | Cursor for use in 3-D imaging systems |
JPH01134521A (ja) * | 1987-11-19 | 1989-05-26 | Toppan Printing Co Ltd | Three-dimensional coordinate input and correction device |
JPH031217A (ja) * | 1989-05-29 | 1991-01-07 | Olympus Optical Co Ltd | Stereoscopic image processing device |
JPH0573208A (ja) | 1991-09-13 | 1993-03-26 | Wacom Co Ltd | Coordinate detection device with display, with a separate control unit |
JPH0675693A (ja) | 1992-08-25 | 1994-03-18 | Toshiba Corp | Three-dimensional pointing device |
JPH0816137A (ja) * | 1994-06-29 | 1996-01-19 | Nec Corp | Three-dimensional coordinate input device and cursor display control system |
JPH08248938A (ja) | 1995-03-13 | 1996-09-27 | Mitsubishi Electric Corp | Pointing device |
JP3741764B2 (ja) * | 1996-02-09 | 2006-02-01 | Sharp Corporation | Three-dimensional image creation device |
JP2976879B2 (ja) * | 1996-03-22 | 1999-11-10 | NEC Corporation | Window display system with depth |
US5880733A (en) | 1996-04-30 | 1999-03-09 | Microsoft Corporation | Display system and method for displaying windows of an operating system to provide a three-dimensional workspace for a computer system |
JP3539839B2 (ja) * | 1996-06-18 | 2004-07-07 | Konami Corporation | Method for displaying a cursor in a pseudo-three-dimensionally displayed field, game system, and recording medium |
JP3476651B2 (ja) * | 1997-05-09 | 2003-12-10 | Sharp Corporation | Data display device and computer-readable recording medium storing a data display program |
JPH117372A (ja) * | 1997-06-17 | 1999-01-12 | Hitachi Ltd | Graphic selection device |
JPH11341456A (ja) * | 1998-05-29 | 1999-12-10 | Matsushita Graphic Communication Systems Inc | Home multimedia communication system |
JP2000099232A (ja) * | 1998-09-17 | 2000-04-07 | Nippon Telegraph & Telephone Corp (NTT) | Method and apparatus for simultaneously displaying and browsing multiple Web pages, and recording medium storing a program therefor |
JP2000185179A (ja) | 1998-12-24 | 2000-07-04 | Capcom Co., Ltd. | Image processing device and recording medium |
EP1109091A1 (en) | 1999-12-14 | 2001-06-20 | Sun Microsystems, Inc. | Display of selectable objects using visual feedback |
JP4582863B2 (ja) | 2000-05-22 | 2010-11-17 | Bandai Namco Games Inc. | Stereoscopic video display device and information storage medium |
JP2002196874A (ja) * | 2000-12-27 | 2002-07-12 | NTT Docomo Inc | Handwritten data input device and method, and personal authentication device and method |
CN1369863A (zh) * | 2001-02-15 | 2002-09-18 | Silicon Integrated Systems Corp. | Method for improving three-dimensional computer graphics rendering through pre-sorting |
NZ511120A (en) | 2001-04-12 | 2003-05-30 | Deep Video Imaging Ltd | Multi screen display, each screen being spaced physically apart from the other |
JP2003076719A (ja) * | 2001-06-22 | 2003-03-14 | Sony Computer Entertainment Inc | Information browsing program, recording medium storing the information browsing program, information browsing device and method, information generation program, recording medium storing the information generation program, information generation device and method, and information generation and browsing system |
AU2003214910A1 (en) * | 2002-01-25 | 2003-10-13 | Silicon Graphics, Inc. | Three dimensional volumetric display input and output configurations |
JP3982288B2 (ja) * | 2002-03-12 | 2007-09-26 | NEC Corporation | Three-dimensional window display device, three-dimensional window display method, and three-dimensional window display program |
JP2004054777A (ja) * | 2002-07-23 | 2004-02-19 | Minolta Co Ltd | File management device |
EP1400924B1 (en) * | 2002-09-20 | 2008-12-31 | Nippon Telegraph and Telephone Corporation | Pseudo three dimensional image generating apparatus |
JP2004362408A (ja) | 2003-06-06 | 2004-12-24 | Canon Inc | Three-dimensional data display and operation device |
US7480873B2 (en) | 2003-09-15 | 2009-01-20 | Sun Microsystems, Inc. | Method and apparatus for manipulating two-dimensional windows within a three-dimensional display model |
US8542219B2 (en) * | 2004-01-30 | 2013-09-24 | Electronic Scripting Products, Inc. | Processing pose data derived from the pose of an elongate object |
2005
- 2005-10-12 CN CN2008101256414A patent/CN101308441B/zh not_active Expired - Fee Related
- 2005-10-12 EP EP05793698A patent/EP1821182B1/en not_active Expired - Fee Related
- 2005-10-12 CN CN2005800026763A patent/CN100407118C/zh not_active Expired - Fee Related
- 2005-10-12 KR KR1020067014265A patent/KR100832355B1/ko not_active IP Right Cessation
- 2005-10-12 US US10/586,447 patent/US7880726B2/en not_active Expired - Fee Related
- 2005-10-12 WO PCT/JP2005/018799 patent/WO2006041097A1/ja active Application Filing
- 2005-10-12 CN CN2008101256429A patent/CN101308442B/zh not_active Expired - Fee Related
- 2005-10-12 JP JP2006540952A patent/JP4515458B2/ja not_active Expired - Fee Related

2009
- 2009-11-06 JP JP2009255526A patent/JP5074475B2/ja active Active

2011
- 2011-11-21 JP JP2011254231A patent/JP5230787B2/ja active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003085590A (ja) * | 2001-09-13 | 2003-03-20 | Nippon Telegraph & Telephone Corp (NTT) | Three-dimensional information manipulation method and apparatus, three-dimensional information manipulation program, and recording medium storing the program |
JP2004070920A (ja) * | 2002-06-11 | 2004-03-04 | Sony Computer Entertainment Inc | Information processing program, computer-readable recording medium storing the information processing program, information processing method, and information processing apparatus |
Non-Patent Citations (1)
Title |
---|
See also references of EP1821182A4 * |
Cited By (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007317050A (ja) * | 2006-05-29 | 2007-12-06 | Nippon Telegraph & Telephone Corp (NTT) | User interface system using three-dimensional display |
WO2008024072A1 (en) * | 2006-08-25 | 2008-02-28 | Weike (S) Pte Ltd | Virtual gaming system and method |
JP2009157908A (ja) * | 2007-12-07 | 2009-07-16 | Sony Corp | Information display terminal, information display method, and program |
US8711104B2 (en) | 2008-03-31 | 2014-04-29 | Sony Corporation | Pointer display device, pointer display/detection method, pointer display/detection program and information apparatus |
JP2009245239A (ja) * | 2008-03-31 | 2009-10-22 | Sony Corp | Pointer display device, pointer display detection method, pointer display detection program, and information apparatus |
US9710085B2 (en) | 2008-03-31 | 2017-07-18 | Sony Corporation | Pointer display device, pointer display/detection method, pointer display/detection program and information apparatus |
US8913030B2 (en) | 2008-03-31 | 2014-12-16 | Sony Corporation | Pointer display device, pointer display/detection method, pointer display/detection program and information apparatus |
US11029775B2 (en) | 2008-03-31 | 2021-06-08 | Sony Corporation | Pointer display device, pointer display detection method, pointer display detection program and information apparatus |
US10191573B2 (en) | 2008-03-31 | 2019-01-29 | Sony Corporation | Pointer display device, pointer display/detection method, pointer display/detection program and information apparatus |
US20090309854A1 (en) * | 2008-06-13 | 2009-12-17 | Polyvision Corporation | Input devices with multiple operating modes |
JP2011113107A (ja) * | 2009-11-24 | 2011-06-09 | Konami Digital Entertainment Co Ltd | Input reception device, input determination method, and program |
JP2012018559A (ja) * | 2010-07-08 | 2012-01-26 | Sony Corp | Information processing device, information processing method, and program |
US10656821B2 (en) | 2010-11-22 | 2020-05-19 | International Business Machines Corporation | Moving an object by drag operation on a touch panel |
US10379727B2 (en) | 2010-11-22 | 2019-08-13 | International Business Machines Corporation | Moving an object by drag operation on a touch panel |
US10140010B2 (en) | 2010-11-22 | 2018-11-27 | International Business Machines Corporation | Moving an object by drag operation on a touch panel |
US9898181B2 (en) | 2010-11-22 | 2018-02-20 | International Business Machines Corporation | Moving an object by drag operation on a touch panel |
US9298359B2 (en) | 2010-11-22 | 2016-03-29 | International Business Machines Corporation | Moving an object by drag operation on a touch panel |
JP2012113389A (ja) * | 2010-11-22 | 2012-06-14 | International Business Machines Corp (IBM) | Method, device, and computer program for moving an object by a drag operation on a touch panel |
US9041664B2 (en) | 2010-11-22 | 2015-05-26 | International Business Machines Corporation | Moving an object by drag operation on a touch panel |
US9875011B2 (en) | 2010-11-22 | 2018-01-23 | International Business Machines Corporation | Moving an object by drag operation on a touch panel |
JP2011170901A (ja) * | 2011-06-09 | 2011-09-01 | Sony Corp | Pointer display device, pointer display detection method, and information apparatus |
US8860679B2 (en) | 2011-10-11 | 2014-10-14 | International Business Machines Corporation | Pointing to a desired object displayed on a touchscreen |
CN103842945B (zh) * | 2011-10-11 | 2016-09-28 | International Business Machines Corp | Object designation method and device |
GB2509651B (en) * | 2011-10-11 | 2015-07-08 | Ibm | Object designation method, device and computer program |
JP5576571B2 (ja) * | 2011-10-11 | 2014-08-20 | International Business Machines Corporation | Object designation method, device, and computer program |
GB2509651A (en) * | 2011-10-11 | 2014-07-09 | Ibm | Object designation method, device and computer program |
CN103842945A (zh) * | 2011-10-11 | 2014-06-04 | International Business Machines Corp | Object designation method, device, and computer program |
WO2013054583A1 (ja) * | 2011-10-11 | 2013-04-18 | International Business Machines Corporation | Object designation method, device, and computer program |
JP2013190926A (ja) * | 2012-03-13 | 2013-09-26 | Nikon Corp | Input device and display device |
JP5863984B2 (ja) * | 2012-10-05 | 2016-02-17 | NEC Solution Innovators, Ltd. | User interface device and user interface method |
US9760180B2 (en) | 2012-10-05 | 2017-09-12 | Nec Solution Innovators, Ltd. | User interface device and user interface method |
WO2014054317A1 (ja) * | 2012-10-05 | 2014-04-10 | NEC Soft, Ltd. | User interface device and user interface method |
JP2018106499A (ja) * | 2016-12-27 | 2018-07-05 | Colopl Inc. | Computer-implemented method for controlling display of an image in a virtual space, program for causing a computer to implement the method, and computer device |
Also Published As
Publication number | Publication date |
---|---|
JP5230787B2 (ja) | 2013-07-10 |
JP5074475B2 (ja) | 2012-11-14 |
EP1821182A4 (en) | 2011-02-23 |
JP2010055627A (ja) | 2010-03-11 |
KR100832355B1 (ko) | 2008-05-26 |
EP1821182A1 (en) | 2007-08-22 |
KR20070039868A (ko) | 2007-04-13 |
EP1821182B1 (en) | 2013-03-27 |
US7880726B2 (en) | 2011-02-01 |
JP4515458B2 (ja) | 2010-07-28 |
CN101308442A (zh) | 2008-11-19 |
US20080225007A1 (en) | 2008-09-18 |
CN100407118C (zh) | 2008-07-30 |
CN101308441A (zh) | 2008-11-19 |
CN101308441B (zh) | 2010-09-22 |
JPWO2006041097A1 (ja) | 2008-05-15 |
JP2012079328A (ja) | 2012-04-19 |
CN1910543A (zh) | 2007-02-07 |
CN101308442B (zh) | 2012-04-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5230787B2 (ja) | Three-dimensional pointing method, three-dimensional pointing device, and three-dimensional pointing program | |
CN110603509B (zh) | Syndication of direct and indirect interactions in a computer-mediated reality environment | |
Fishkin et al. | Embodied user interfaces for really direct manipulation | |
US10031608B2 (en) | Organizational tools on a multi-touch display device | |
US7480873B2 (en) | Method and apparatus for manipulating two-dimensional windows within a three-dimensional display model | |
US7898529B2 (en) | User interface having a placement and layout suitable for pen-based computers | |
US20170228138A1 (en) | System and method for spatial interaction for viewing and manipulating off-screen content | |
KR101794498B1 (ko) | Method and system for duplicating an object using a touch-sensitive display | |
US10180714B1 (en) | Two-handed multi-stroke marking menus for multi-touch devices | |
Hachet et al. | Navidget for easy 3d camera positioning from 2d inputs | |
US20050204306A1 (en) | Enhancements for manipulating two-dimensional windows within a three-dimensional display model | |
US9035877B2 (en) | 3D computer cursor | |
Mine et al. | Making VR work: building a real-world immersive modeling application in the virtual world | |
JP2006500676A (ja) | Graphical user interface navigation method and apparatus | |
JPH06282368A (ja) | Position information input system for an information processing device | |
Brasier et al. | AR-enhanced Widgets for Smartphone-centric Interaction | |
Zeleznik et al. | Look-that-there: Exploiting gaze in virtual reality interactions | |
JP4907156B2 (ja) | Three-dimensional pointing method, three-dimensional pointing device, and three-dimensional pointing program | |
Olwal et al. | Unit-A Modular Framework for Interaction Technique Design, Development and Implementation | |
Declec et al. | Tech-note: Scruticam: Camera manipulation technique for 3d objects inspection | |
KR102118046B1 (ko) | Portable device and method for controlling the same | |
KR20100119599A (ko) | Method for controlling touch and cursor of a mobile terminal, and mobile terminal applying the same | |
Bierwirth | 3D Interaction Widget: A Metaphor for 2D and 3D Lens Interaction | |
Schulze | Look-that-there: Exploiting gaze in virtual reality interactions | |
Denter | PyMT: New and Advanced Interaction Widgets |
Legal Events
Code | Title | Description
---|---|---
AK | Designated states | Kind code of ref document: A1. Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV LY MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW
AL | Designated countries for regional patents | Kind code of ref document: A1. Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG
WWE | WIPO information: entry into national phase | Ref document number: 2006540952. Country of ref document: JP
WWE | WIPO information: entry into national phase | Ref document number: 2005793698. Country of ref document: EP
WWE | WIPO information: entry into national phase | Ref document number: 1020067014265. Country of ref document: KR
WWE | WIPO information: entry into national phase | Ref document number: 10586447. Country of ref document: US. Ref document number: 200580002676.3. Country of ref document: CN
NENP | Non-entry into the national phase | Ref country code: DE
WWP | WIPO information: published in national office | Ref document number: 1020067014265. Country of ref document: KR
WWP | WIPO information: published in national office | Ref document number: 2005793698. Country of ref document: EP