GB2376397A - Virtual or augmented reality - Google Patents

Virtual or augmented reality

Info

Publication number
GB2376397A
GB2376397A (Application GB0113559A)
Authority
GB
United Kingdom
Prior art keywords
user
virtual
physical space
physical
scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0113559A
Other versions
GB0113559D0 (en)
Inventor
Stephen Philip Cheatle
Stephen Bernard Pollard
Maurizio Pilu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
HP Inc
Original Assignee
Hewlett Packard Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett Packard Co filed Critical Hewlett Packard Co
Priority to GB0113559A
Publication of GB0113559D0
Publication of GB2376397A
Legal status: Withdrawn


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/34Stereoscopes providing a stereoscopic pair of separated images corresponding to parallactically displaced views of the same object, e.g. 3D slide viewers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/275Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • H04N13/279Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/344Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/373Image reproducers using viewer tracking for tracking forward-backward translational head movements, i.e. longitudinal movements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/376Image reproducers using viewer tracking for tracking left-right translational head movements, i.e. lateral movements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/378Image reproducers using viewer tracking for tracking rotational head movements around an axis perpendicular to the screen
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/38Image reproducers using viewer tracking for tracking vertical translational head movements
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B2027/0192Supplementary details
    • G02B2027/0198System for aligning or maintaining alignment of an image in a predetermined direction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/286Image signal generators having separate monoscopic and stereoscopic modes
    • H04N13/289Switching between monoscopic and stereoscopic modes

Abstract

A portable head mounted display, HMD, (10) displays a virtual or augmented reality operating area, e.g. for use in game playing. Two cameras (12, 14) mounted on the HMD form a stereo-vision depth-sensing system which recognises obstacles in the user's field of view and determines their distance. Real obstacles may be 'cloaked' in the display by forming virtual objects around them. The user can therefore move around the physical environment, using only the displayed reality as a guide, without fear of colliding with real objects.

Description

VIRTUAL REALITY

Field of the Invention

This invention relates to virtual reality and, in particular, to creating virtual reality environments for use in virtual reality games and similar arrangements.
Background to the Invention

It is, of course, well known to provide computer game apparatus comprising a two-dimensional display screen on which a game area is graphically defined and can be viewed by a player. The player can notionally move through the game area using controls such as those provided by a joystick, computer keyboard, cursor control device or the like. However, irrespective of how realistically the game area is portrayed on the screen, the fact that it is two-dimensional effectively prevents the player from feeling as if they are actually within the game area itself and, in any event, the screen/control interface involves player movements which are very different to the movements being simulated in the game: for example, the user presses a button to 'jump' within the game.
In order to provide a more realistic game playing environment, virtual reality systems have been developed which generally comprise a head mounted display (HMD) through which a virtual world is three-dimensionally defined and can be viewed as such, giving the player a sensation of actually being present within the game area. The head mounted display typically has some form of motion sensor which causes the display to change as the player moves a part of their body, such as their head, thereby giving the user the sensation of looking around the virtual world which makes up the game area.
Some virtual reality systems permit the player to move around a physical area, such movement being translated into movement within the virtual game area. For example, US Patent No. 5,913,727 describes an interactive contact and simulation game in which a player and a three-dimensional computer-generated image interact in simulated physical contact. The apparatus includes control means for generating a simulated image of the player and displaying that image within the computer-generated game area, and a number of position sensing means and impact generating means are secured to various parts of the player's body. The player moves around a predetermined physical space, and the position sensing means determine the player's position accordingly, in response to which the computer-generated image is moved, giving the player the sensation of actually moving through the game area. When simulated contact between the player's image and virtual objects in the game area is determined by the control means, the impact generating means positioned at the apparent point of contact on the player's body is activated to apply pressure to the player at that point, thereby simulating contact between the player and the virtual game area.
One of the main disadvantages associated with such a system is that the player must be positioned in a relatively large, substantially empty physical space to prevent them from colliding with objects as they move around. In practice, this problem is generally overcome by providing some form of hand control which allows the user to simulate movement through the virtual reality game area by turning in the direction in which they wish to move and then using the hand control to give a 'move forward' command.
The concept of"Z-keying"is known in the prior art as a technique for merging multiple graphics and video streams. The streams to be merged are assumed to come from coincident viewpoints. Each stream has a corresponding depth value. Pixel values for frames in the resulting merged stream are taken from the corresponding pixel location of the source stream with the lowest depth (z) value. As a result, virtual objects can be combined into real world scenes.
Augmented reality systems also exist whereby the head mounted display is effectively see-through so that the user can see computer-generated graphical objects which are superimposed on physical objects present within the physical space in which the player is located, the computer-generated objects being substantially the same as the true physical objects in size, shape and nature. Such systems have the obvious advantage that the player can see the surrounding physical environment and is therefore free to move around without the fear of colliding with any objects which are present therein. However, one of the major disadvantages is of course that the true physical environment (say a living room or the like) and the objects within it are not as stimulating and exciting as a typical graphical games environment.
We have now devised an arrangement which overcomes the problems outlined above.
Summary of the Invention

Thus, in accordance with the present invention, there is provided interactive image display apparatus, comprising portable display means for displaying a virtual scene, means for generating an at least partially virtual scene including one or more virtual objects, and means for rendering said virtual scene in said display means for viewing by a user, wherein the apparent distance of any point in the user's field of view within the virtual scene to the nearest virtual object in said virtual scene is no further than the actual distance in the same direction of view to the nearest physical object within the physical space surrounding the user.
In one embodiment of the present invention, the means for generating a virtual scene may comprise means for determining a depth map from the user's viewpoint of the physical space surrounding said user, and means for generating said virtual scene such that its virtual depth is no greater than the corresponding depth of said depth map of said physical space. The depth map preferably comprises a pixel image of the physical scene within the user's field of view, wherein each pixel represents the distance to the nearest physical object within said field of view.
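This constraint reduces to a per-pixel comparison between the rendered virtual depth and the measured physical depth. The following sketch is our own illustration; the function name and the safety margin are assumptions, not features disclosed in the patent:

```python
import numpy as np

def depth_constraint_violations(virtual_depth, physical_depth, margin=0.2):
    """Per-pixel check of the constraint above: the virtual scene must
    never appear further away than the nearest physical object in the
    same direction of view. Both inputs are H x W depth maps in metres.
    Returns a boolean mask of violating pixels (the safety margin is
    an assumed parameter).
    """
    return virtual_depth > (physical_depth - margin)
```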
In contrast to the concept of "Z-keying", the apparatus of an exemplary embodiment of the present invention operates in a different manner in that it uses the determined depth information relating to the surrounding physical world to cover at least parts of that physical world completely. In one embodiment, the apparatus operates to cover up the real world completely by explicitly generating a graphical world which is always closer than the real world equivalent.
The apparatus beneficially comprises means for generating a new virtual scene in response to a change of the user's viewpoint, the new virtual scene preferably being displayed on the display means as a smooth perturbation of the preceding rendered virtual scene from the preceding user viewpoint.
A preferred embodiment of the present invention comprises a motion sensor for providing a sensory input of the relative change of position of the user, the sensed change of position being translated into a corresponding change in the virtual position of the viewpoint of the virtual scene, the motion sensor preferably being located on a user's head, when in use.
In another embodiment of the invention, the means for generating a virtual scene comprises means for generating an (at least approximate) three-dimensional model of the physical space surrounding the user from the user's viewpoint, a three-dimensional virtual world being generated such that virtual objects appear to surround all of the physical objects within said physical space, and the virtual scene rendered in the display for view by the user provides a view of the said virtual world in accordance with the position of said user. The three-dimensional model of the physical space may be predetermined for a given physical space.
Alternatively, the apparatus may comprise means for determining depth information from a user's viewpoint at any given time, said depth information relating to the physical space surrounding said user, and sensor means carried or worn by the user for determining the physical location and/or orientation of said user, said depth information and said information relating to the physical location and/or orientation of the user being used to generate a three-dimensional model of the physical space from the user's viewpoint.
In any event, the apparatus preferably comprises sensor means for determining the user's position and orientation relative to the surrounding physical space, and/or imaging means for determining the user's position relative to the surrounding physical space. Such imaging means may be arranged to identify one or more markers carried or worn by a user so that their position can be determined accordingly, or such imaging means may comprise image capturing means carried or worn by the user, and include means for recognising one or more elements or locations within an image captured thereby, means for determining its relative location within a surrounding physical space and means for determining its relative location within a corresponding virtual scene by identifying the corresponding one or more elements or locations within said virtual scene. In any event, the imaging means may comprise stereo imaging means.
In one exemplary embodiment of the present invention, a three-dimensional model of the physical space surrounding the user can be updated to take into account changes in the surrounding physical space since said three-dimensional model was generated, and, optionally, the first three-dimensional model can be stored and retrieved for use when required. In this case, the three-dimensional virtual model corresponding to said first three-dimensional model of the surrounding physical space can beneficially also be stored and retrieved when required.
In a first exemplary embodiment of the present invention, the virtual scene may comprise an entirely virtual environment which includes virtual obstacles or entities that appear to occupy at least the same space as the physical obstacles or entities within the surrounding physical space which a user would wish to avoid colliding with.
In another exemplary embodiment of the present invention, the virtual scene may comprise an augmented scene, with only moving entities (such as people) within the physical space being represented within the displayed scene as virtual entities. In this case, the display means may be substantially opaque, the scene rendered in said display means for view by the user comprising an image of the surrounding physical space captured by image capturing means carried or worn by the user. Alternatively, the display means may be substantially transparent with opaque pixels being rendered in the display means in the form of a partial virtual scene for masking one or more areas of the surrounding physical space viewed by the user through the display means.
In all cases, the display means preferably comprises a head mounted display (HMD).
If required, the virtual obstacles or entities may be, or appear to be, larger than the physical obstacles or entities they are intended to represent, and optionally, additional virtual obstacles or entities may be provided within the displayed scene which do not represent obstacles or entities within the surrounding physical space.
The apparatus preferably comprises means arranged to adapt said displayed scene, in real time, to the three-dimensional physical space in which a user is moving, and may include warning means, such as an audio or tactile signal, arranged to alert the user of any potential collision with a physical object or entity.
The apparatus may include means (particularly where the apparatus comprises apparatus for playing games) for providing virtual moving entities within said displayed scene, said entities being perceived to move through the virtual environment but being constrained by the same physical environment as is the user. In the case where there are two or more users operating within the same virtual and physical environments (and each having their own display means), each user is preferably represented in the virtual scene by a different virtual character, the position of which within the virtual scene is determined by the user's position within the surrounding physical space. In this case, the apparatus is preferably arranged such that the or each user can select to be represented within the operating area by one of a plurality of virtual characters, such as monsters, robots, animals, soldiers, etc., according to the nature of the game.
The apparatus beneficially comprises means for replacing one virtual scene with another virtual scene which is adapted to the same physical layout of the physical space in which the apparatus is being operated.
Further, the apparatus may comprise motion sensing means for sensing motion of a user within the surrounding physical space, means for magnifying said motion and means for translating such motion to motion within said virtual space such that motion of the user within said surrounding physical space appears as substantially faster motion within said virtual scene.
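As a hedged illustration of such motion magnification, the sketch below simply scales a sensed physical displacement by a gain before it is applied to the virtual viewpoint; the function name and gain value are assumed for illustration, not taken from the patent:

```python
def magnified_virtual_step(physical_step, gain=3.0):
    """Map a sensed physical displacement (dx, dy, dz in metres) to a
    magnified virtual displacement, so modest real movement reads as
    substantially faster motion in the virtual scene. The gain is an
    assumed tuning parameter.
    """
    return tuple(gain * component for component in physical_step)
```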
It will be apparent that, unlike the situation with totally augmented reality systems, registration between real physical objects or entities and the virtual reality cloaking objects or entities intended to mark their presence and position need not be exact, nor is the speed of rendering the virtual reality cloaking objects or entities quite as critical as it is with augmented reality alignment. It will be appreciated that it is sufficient to ensure that the virtual reality cloaking objects or entities appear to present virtual surfaces which effectively surround the physical objects or entities with a large enough safety margin to prevent collision. In practice, it is really only necessary to ensure that the clear space perceived by the user in the virtual scene is in fact clear within the physical environment in which they are moving. This would at least give the user an opportunity to safely move through the physical environment by ensuring that all physical obstacles or entities are marked as such in the displayed scene.
Of course, there is nothing to prevent the user from attempting to move through the virtual surface (thinking, perhaps, that it is simply a virtual object or entity which does not cloak a physical object or entity) and, therefore, colliding with a physical obstacle or entity in their path. For this reason, warning means, such as an audio or tactile signal, may be provided to alert the user of any potential collision with a physical object or entity.
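One plausible trigger for such a warning is sketched below, under our own assumptions (a central region of interest in the depth map and a fixed distance threshold, neither of which is specified in the patent):

```python
import numpy as np

def collision_imminent(depth_map, threshold_m=0.5):
    """Return True when an audio/tactile warning should fire.

    Checks whether any physical surface in the central portion of the
    user's view is closer than threshold_m metres. depth_map is an
    H x W array of measured distances; NaNs mark pixels with no match.
    """
    h, w = depth_map.shape
    centre = depth_map[h // 4 : 3 * h // 4, w // 4 : 3 * w // 4]
    return bool(np.nanmin(centre) < threshold_m)
```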
Many different types of virtual environment are envisaged. In the case where the apparatus forms part of a games system, the virtual scene could comprise, for example, one or more caves, tunnels or rooms in a building. The apparatus provides a high degree of realism in this case, because the user can move freely within their physical environment (without fear of collision with any obstacles), and such movement is translated into simulated movement through the virtual environment which is displayed in three dimensions. Any games might include the provision of virtual characters, such as monsters, enemies, etc., which are also perceived to move through the virtual environment but which are preferably constrained by the same physical environment as is the user.
The present invention can, by its very nature, be used in a variety of different physical environments, including within a residential building, garden, garage, etc. In the case that the user only has use of a single confined space, such as a living room or bedroom, the apparatus may comprise means for replacing one virtual scene with another virtual scene, as mentioned above, the second virtual scene being adapted to the same physical layout of the physical space in which the apparatus is being operated, i.e., both operating areas or virtual environments including virtual images which cloak the same physical environment layout.
Brief Description of the Drawings

An embodiment of the present invention will now be described, by way of example only, with reference to the accompanying drawing, which is a perspective view of a head mounted display system for use in apparatus according to an exemplary embodiment of the present invention.
Detailed Description of the Invention

In order to effectively 'cloak' obstacles present in the physical space surrounding the apparatus of the present invention with virtual reality images appearing in appropriate positions within the virtual area as viewed by a user during use, there are three main issues to be considered.
Firstly, the three-dimensional free space around the user at any one time (in the direction in which they are nominally looking) needs to be mapped. Secondly, a three-dimensional virtual environment needs to be generated with its free space lying within the known free space of the surrounding physical space. Finally, any movement by the user within the physical environment needs to be monitored and fed back so that the virtual environment can be appropriately updated (while still conforming to the available free space within the surrounding physical environment).
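These three issues suggest a per-frame loop along the following lines. This is purely an illustrative sketch: the callables and the world object are assumed interfaces for the three steps above, not components defined in the patent:

```python
def run_frame(capture_stereo_pair, depth_from_stereo, read_motion, world, render):
    """One frame of a hedged per-frame loop.

    1) Map the free space currently in view (stereo depth map).
    2) Fold the user's sensed motion into the virtual viewpoint.
    3) Re-fit the virtual environment so that its free space stays
       inside the measured physical free space, then render it.
    """
    left, right = capture_stereo_pair()
    depth_map = depth_from_stereo(left, right)   # physical free space in view
    world.move_viewpoint(read_motion())          # feedback from user movement
    world.constrain_to(depth_map)                # virtual depth <= physical depth
    render(world.current_view())
```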
Referring now to Figure 1 of the drawings, a head mounted display 10 for use in an exemplary embodiment of the present invention comprises a depth sensor which can determine the definite free space in the field of view of the player. Depth sensors for mapping the profile of a predetermined environment or area, and therefore determining the definite free space therein (using, for example, sonar, laser range-finding or stereo vision camera systems) are well known.
In the head mounted display of Figure 1, a stereo vision system is employed and, as such, two cameras 12, 14 are mounted on opposing sides of the front panel 16 of the head mounted display 10. The cameras 12, 14 capture images within their field of view, and a plurality of points on the images captured by both cameras 12, 14 are searched until two points are found (one from each image captured by a respective camera) which correspond to the same point on a three-dimensional object appearing in the images. The position coordinates of these corresponding points on the images and the relative positions of the two cameras 12, 14 are used to provide data as to the specific point on the object in three-dimensional space. By identifying a plurality of such corresponding points, the system can recognise the position, shape and size of a three-dimensional object within its field of view. Using this method, all of the obstacles within the user's field of view can be recognised and the remaining free space can be determined. An operating area is generated having the same perceived free space as is available within the surrounding physical environment, any obstacles being marked or 'cloaked' with virtual images which are in keeping with the theme of the displayed scene.
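For a rectified two-camera rig of this kind, the depth of a matched point pair follows directly from its disparity. A minimal sketch of the triangulation step (the intrinsic parameters here are assumed example inputs; a real system would first calibrate and rectify the cameras):

```python
import numpy as np

def triangulate_point(xl, xr, y, focal_px, baseline_m, cx, cy):
    """Recover the 3D position of a matched point pair.

    xl, xr: x-coordinates of the match in the left/right images;
    y: shared row in the rectified images; focal_px: focal length in
    pixels; baseline_m: camera separation in metres; (cx, cy):
    principal point of the left camera.
    """
    disparity = xl - xr                       # larger for nearer points
    if disparity <= 0:
        raise ValueError("no depth: point at infinity or bad match")
    z = focal_px * baseline_m / disparity     # depth along the optical axis
    x = (xl - cx) * z / focal_px
    y3 = (y - cy) * z / focal_px
    return np.array([x, y3, z])               # metres, left-camera frame
```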
A relative position sensor 18, comprising for example an accelerometer or gyroscope, is mounted generally centrally on the front panel 16 of the head mounted display 10. The relative position sensor measures relative motion of the user's head with respect to the evolving three-dimensional virtual environment. An (absolute) orientation sensor 20 is also mounted generally centrally on the front panel 16 of the head mounted display 10 to measure the orientation of the cameras 12, 14 with respect to gravity.
The head mounted display 10 is retained on a user's head (so that the front panel 16 thereof covers the user's eyes) by means of a pair of side arms 22, similar to those used to retain conventional spectacles in position during use.
Consider a simple exemplary embodiment of the present invention comprising a game based on the exploration of a plurality of tunnels. Within each of the tunnels is a number of rocks, and aliens (or the like) appear from behind the rocks. The player is required to shoot the aliens as they appear.
The stereo vision camera system can be used to determine the largest area of free space available (in any one view) and the orientation of its ground plane within the physical space surrounding the apparatus, and the apparatus then creates, virtually, the largest tunnel corresponding thereto and populates it with rocks and aliens. As the wearer moves their head and body, the relative motion is measured and the position of the previously-constructed tunnel is updated and evolved to agree with the updated depth map created by the stereo vision camera system according to the updated head position. As the user 'moves' through the tunnel, additional tunnel elements (such as branches, additional rocks, extensions, etc.) can be added according to a simple, predefined tunnel modelling scheme and the evolving free space around the user. It should be noted that the evolution of the tunnels need not be entirely geometrically consistent in order to give the impression of inhabiting a virtual world.
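A deliberately crude sketch of how the tunnel's extent might be bounded by the evolving depth map, so that the perceived clear space is genuinely clear (our illustration only; the patent leaves the tunnel modelling scheme open):

```python
import numpy as np

def usable_tunnel_depth(depth_map, margin=0.3):
    """How far the virtual tunnel may extend in the current view.

    The tunnel must end nearer than the nearest physical surface in
    any viewing direction, less a safety margin, so that the perceived
    clear space is genuinely clear. The margin value is assumed.
    depth_map is an H x W array of measured distances in metres.
    """
    return float(max(np.nanmin(depth_map) - margin, 0.0))
```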
In the case where there are two or more players, each wearing their own head mounted display 10, each head mounted display 10 could be provided with an additional imaging system (similar to the stereo vision camera system described above and possibly sharing common elements therewith). Each player could be provided with an arrangement of markers (worn above the head, say) which are recognisable by the imaging system. Such an arrangement of markers would have a known geometry such that the imaging system could be used to determine the position and three-dimensional orientation of the markers (and therefore the associated player) with respect thereto (as is the case in automated motion capture systems).
Using this information, the additional players could be 'cloaked' within the virtual environment by the generation of respective virtual images having the same approximate poses as the additional players. A similar marking and imaging system could be applied to cloak key pieces of equipment. For example, a stick used by a player could be portrayed in the virtual environment as a sword or similar weapon.
In an alternative embodiment of the invention, the game could be played within an augmented world, with only the players being cloaked. In other words, instead of generating a virtual environment, the players effectively see the real world through, for example, a head mounted video and display system, and within this environment, real players are replaced in real time by cloaked counterparts using substantially the same techniques as are described above.
In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be apparent to a person skilled in the art that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention as set forth in the appended claims. Accordingly, the specification and drawings are to be regarded in an illustrative, rather than a restrictive, sense.

Claims (33)

1. Interactive image display apparatus, comprising portable display means for displaying a virtual scene, means for generating an at least partially virtual scene including one or more virtual objects, and means for rendering said virtual scene in said display means for viewing by a user, wherein the apparent distance of any point in the user's field of view within the virtual scene to the nearest virtual object in said virtual scene is no further than the actual distance in the same direction of view to the nearest physical object within the physical space surrounding the user.
2. Apparatus according to claim 1, wherein said means for generating a virtual scene comprises means for determining a depth map from the user's viewpoint of the physical space surrounding said user, and means for generating said virtual scene such that its virtual depth is no greater than the corresponding depth of said depth map of said physical space.
3. Apparatus according to claim 2, wherein said depth map comprises a pixel image of the physical scene within the user's field of view, wherein each pixel represents the distance to the nearest physical object within said field of view.
4. Apparatus according to any one of claims 1 to 3, comprising means for generating a new virtual scene in response to a change of the user's viewpoint.
5. Apparatus according to claim 4, wherein said new virtual scene is displayed on said display means as a smooth perturbation of the preceding rendered virtual scene from the preceding user viewpoint.
6. Apparatus according to claim 4 or claim 5, comprising a motion sensor for providing a sensory input of the relative change of position of the user, the sensed change of position being translated into a corresponding change in the virtual position of the viewpoint of the virtual scene.
7. Apparatus according to claim 6, wherein said motion sensor is located on a user's head, when in use.
8. Apparatus according to claim 1, wherein said means for generating a virtual scene comprises means for generating an (at least approximate) three-dimensional model of the physical space surrounding the user from the user's viewpoint, a three-dimensional virtual world being generated such that virtual objects appear to surround all of the physical objects within said physical space, and the virtual scene rendered in the display for view by the user provides a view of the said virtual world in accordance with the position of said user.
9. Apparatus according to claim 8, wherein said three-dimensional model of said physical space is predetermined for a given physical space.
10. Apparatus according to claim 8, comprising means for determining depth information from a user's viewpoint at any given time, said depth information relating to the physical space surrounding said user, and sensor means carried or worn by the user for determining the physical location and/or orientation of said user, said depth information and said information relating to the physical location and/or orientation of the user being used to generate a three-dimensional model of the physical space from the user's viewpoint.
11. Apparatus according to claim 8, comprising sensor means for determining the user's position and orientation relative to the surrounding physical space.
12. Apparatus according to claim 8, comprising imaging means for determining the user's position relative to the surrounding physical space.
13. Apparatus according to claim 12, wherein said imaging means is arranged to identify one or more markers carried or worn by a user so that their position can be determined accordingly.
14. Apparatus according to claim 12, wherein said imaging means comprises image capturing means carried or worn by the user, and includes means for recognising one or more elements or locations within an image captured thereby, means for determining its relative location within a surrounding physical space and means for determining its relative location within a corresponding virtual scene by identifying the corresponding one or more elements or locations within said virtual scene.
15. Apparatus according to claim 12, wherein said imaging means comprises stereo imaging means.
16. Apparatus according to any one of claims 8 to 13, wherein a three-dimensional model of the physical space surrounding the user can be updated to take into account changes in the surrounding physical space since said three-dimensional model was generated.
17. Apparatus according to claim 14, wherein said first three-dimensional model can be stored and retrieved for use when required.
18. Apparatus according to claim 15, wherein the three-dimensional virtual model corresponding to said first three-dimensional model of the surrounding physical space can also be stored and retrieved when required.
19. Apparatus according to any one of the preceding claims, wherein said virtual scene comprises an entirely virtual environment which includes virtual obstacles or entities that appear to occupy at least the same space as the physical obstacles or entities within the surrounding physical space which a user would wish to avoid colliding with.
20. Apparatus according to claim 1, wherein said virtual scene comprises an augmented scene, with only moving entities (such as people) within the physical space being represented within the displayed scene as virtual entities.
21. Apparatus according to claim 20, wherein said display means is substantially opaque, the scene rendered in said display means for view by the user comprising an image of the surrounding physical space captured by image capturing means carried or worn by the user, said apparatus including means for modifying said image after capture by replacing pixels in one or more areas thereof with their virtual image equivalents, prior to display.
22. Apparatus according to claim 20, wherein said display means is substantially transparent with opaque pixels being rendered in the display means in the form of a partial virtual scene for masking one or more areas of the surrounding physical space viewed by the user through the display means.
23. Apparatus according to any one of the preceding claims, wherein said display means comprises a head mounted display (HMD).
24. Apparatus according to any one of the preceding claims, wherein the virtual obstacles or entities are, or appear to be, larger than the physical obstacles or entities they are intended to represent.
25. Apparatus according to any one of the preceding claims, wherein there are additional virtual obstacles or entities within said displayed operating area which do not represent obstacles or entities within the surrounding physical space.
26. Apparatus according to any one of the preceding claims, arranged to adapt said displayed scene, in real time, to the three-dimensional physical space in which a user is moving.
27. Apparatus according to any one of the preceding claims, including warning means, such as an audio or tactile signal, arranged to alert the user of any potential collision with a physical object or entity.
28. Apparatus according to any one of the preceding claims, including means for providing virtual moving entities within said displayed scene, said entities being perceived to move through the virtual environment but being constrained by the same physical environment as is the user.
29. Apparatus according to any one of the preceding claims, wherein, in the case where there are two or more users operating within the same virtual and physical environments (and each having their own display means), each user is represented in the virtual scene by a different virtual character, the position of which within the virtual scene is determined by the user's position within the surrounding physical space.
30. Apparatus according to claim 29, arranged such that the or each user can select to be represented within the operating area by one of a plurality of virtual characters.
31. Apparatus according to any one of the preceding claims, comprising means for replacing one virtual scene with another virtual scene which is adapted to the same physical layout of the physical space in which the apparatus is being operated.
32. Apparatus according to any one of the preceding claims, comprising motion sensing means for sensing motion of a user within the surrounding physical space, means for magnifying said motion and means for translating such motion to motion within said virtual space such that motion of the user within said surrounding physical space appears as substantially faster motion within said virtual scene.
33. Interactive image display apparatus substantially as herein described with reference to the accompanying drawing.
GB0113559A 2001-06-04 2001-06-04 Virtual or augmented reality Withdrawn GB2376397A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB0113559A GB2376397A (en) 2001-06-04 2001-06-04 Virtual or augmented reality

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB0113559A GB2376397A (en) 2001-06-04 2001-06-04 Virtual or augmented reality

Publications (2)

Publication Number Publication Date
GB0113559D0 GB0113559D0 (en) 2001-07-25
GB2376397A true GB2376397A (en) 2002-12-11

Family

ID=9915867

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0113559A Withdrawn GB2376397A (en) 2001-06-04 2001-06-04 Virtual or augmented reality

Country Status (1)

Country Link
GB (1) GB2376397A (en)

Cited By (82)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004107272A1 (en) * 2003-05-29 2004-12-09 Sony Computer Entertainment Inc. System and method for providing a real-time three-dimensional interactive environment
WO2005017729A2 (en) * 2003-08-19 2005-02-24 Luigi Giubbolini Interface method and device between man and machine realised by manipulating virtual objects
EP1521165A2 (en) * 2003-09-30 2005-04-06 Canon Kabushiki Kaisha Data conversion method and apparatus, and orientation measurement apparatus
WO2005064440A2 (en) * 2003-12-23 2005-07-14 Siemens Aktiengesellschaft Device and method for the superposition of the real field of vision in a precisely positioned manner
WO2007090660A1 (en) * 2006-02-08 2007-08-16 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Method and arrangement for blending location-related information into a visual representation or view of a scene
EP1873617A2 (en) * 2006-06-28 2008-01-02 Canon Kabushiki Kaisha Image processing apparatus and image processing method
WO2009146234A1 (en) 2008-05-30 2009-12-03 Sony Computer Entertainment America Inc. Determination of controller three-dimensional location using image analysis and ultrasonic communication
US7646372B2 (en) 2003-09-15 2010-01-12 Sony Computer Entertainment Inc. Methods and systems for enabling direction detection when interfacing with a computer program
US7663689B2 (en) 2004-01-16 2010-02-16 Sony Computer Entertainment Inc. Method and apparatus for optimizing capture device settings through depth information
US7760248B2 (en) 2002-07-27 2010-07-20 Sony Computer Entertainment Inc. Selective sound source listening in conjunction with computer interactive processing
US7803050B2 (en) 2002-07-27 2010-09-28 Sony Computer Entertainment Inc. Tracking device with sound emitter for use in obtaining information for controlling game program execution
US7850526B2 (en) 2002-07-27 2010-12-14 Sony Computer Entertainment America Inc. System for tracking user manipulations within an environment
US7854655B2 (en) 2002-07-27 2010-12-21 Sony Computer Entertainment America Inc. Obtaining input for controlling execution of a game program
US7874917B2 (en) 2003-09-15 2011-01-25 Sony Computer Entertainment Inc. Methods and systems for enabling depth and direction detection when interfacing with a computer program
US7883415B2 (en) 2003-09-15 2011-02-08 Sony Computer Entertainment Inc. Method and apparatus for adjusting a view of a scene being displayed according to tracked head motion
US7918733B2 (en) 2002-07-27 2011-04-05 Sony Computer Entertainment America Inc. Multi-input game control mixer
CN102141885A (en) * 2010-02-02 2011-08-03 索尼公司 Image processing device, image processing method, and program
US8035629B2 (en) 2002-07-18 2011-10-11 Sony Computer Entertainment Inc. Hand-held computer interactive device
US8139793B2 (en) 2003-08-27 2012-03-20 Sony Computer Entertainment Inc. Methods and apparatus for capturing audio signals based on a visual image
US8142288B2 (en) 2009-05-08 2012-03-27 Sony Computer Entertainment America Llc Base station movement detection and compensation
US8160269B2 (en) 2003-08-27 2012-04-17 Sony Computer Entertainment Inc. Methods and apparatuses for adjusting a listening area for capturing sounds
US8188968B2 (en) 2002-07-27 2012-05-29 Sony Computer Entertainment Inc. Methods for interfacing with a program using a light input device
US8233642B2 (en) 2003-08-27 2012-07-31 Sony Computer Entertainment Inc. Methods and apparatuses for capturing an audio signal based on a location of the signal
EP2485692A1 (en) * 2009-10-09 2012-08-15 National ICT Australia Limited Vision enhancement for a vision impaired user
US8287373B2 (en) 2008-12-05 2012-10-16 Sony Computer Entertainment Inc. Control device for communicating visual information
US8310656B2 (en) 2006-09-28 2012-11-13 Sony Computer Entertainment America Llc Mapping movements of a hand-held controller to the two-dimensional image plane of a display screen
US8313380B2 (en) 2002-07-27 2012-11-20 Sony Computer Entertainment America Llc Scheme for translating movements of a hand-held controller into inputs for a system
US8342963B2 (en) 2009-04-10 2013-01-01 Sony Computer Entertainment America Inc. Methods and systems for enabling control of artificial intelligence game characters
US8368753B2 (en) 2008-03-17 2013-02-05 Sony Computer Entertainment America Llc Controller with an integrated depth camera
US8393964B2 (en) 2009-05-08 2013-03-12 Sony Computer Entertainment America Llc Base station for position location
EP2579128A1 (en) * 2011-10-05 2013-04-10 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Portable device, virtual reality system and method
US8527657B2 (en) 2009-03-20 2013-09-03 Sony Computer Entertainment America Llc Methods and systems for dynamically adjusting update rates in multi-player network gaming
US8542907B2 (en) 2007-12-17 2013-09-24 Sony Computer Entertainment America Llc Dynamic three-dimensional object mapping for user-defined control device
US8547401B2 (en) 2004-08-19 2013-10-01 Sony Computer Entertainment Inc. Portable augmented reality device and method
US8570378B2 (en) 2002-07-27 2013-10-29 Sony Computer Entertainment Inc. Method and apparatus for tracking three-dimensional movements of an object using a depth sensing camera
EP2660643A2 (en) * 2012-05-04 2013-11-06 Sony Computer Entertainment Europe Limited Head mountable display system
WO2014042320A1 (en) 2012-09-14 2014-03-20 Lg Electronics Inc. Apparatus and method of providing user interface on head mounted display and head mounted display thereof
US8686939B2 (en) 2002-07-27 2014-04-01 Sony Computer Entertainment Inc. System, method, and apparatus for three-dimensional input control
US8781151B2 (en) 2006-09-28 2014-07-15 Sony Computer Entertainment Inc. Object detection using video input combined with tilt angle information
US8797260B2 (en) 2002-07-27 2014-08-05 Sony Computer Entertainment Inc. Inertially trackable hand-held controller
US8840470B2 (en) 2008-02-27 2014-09-23 Sony Computer Entertainment America Llc Methods for capturing depth data of a scene and applying computer actions
WO2014156033A1 (en) * 2013-03-26 2014-10-02 Seiko Epson Corporation Head-mounted display device, control method of head-mounted display device, and display system
EP2816519A1 (en) * 2013-06-17 2014-12-24 Spreadtrum Communications (Shanghai) Co., Ltd. Three-dimensional shopping platform displaying system
US8947347B2 (en) 2003-08-27 2015-02-03 Sony Computer Entertainment Inc. Controlling actions in a video game unit
CN104345455A (en) * 2013-07-29 2015-02-11 索尼公司 Information presentation apparatus and information processing system
US8961313B2 (en) 2009-05-29 2015-02-24 Sony Computer Entertainment America Llc Multi-positional three-dimensional controller
EP2697792A4 (en) * 2011-04-12 2015-06-03 Yuval Boger Apparatus, systems and methods for providing motion tracking using a personal viewing device
GB2524269A (en) * 2014-03-17 2015-09-23 Sony Comp Entertainment Europe Virtual reality
US9177387B2 (en) 2003-02-11 2015-11-03 Sony Computer Entertainment Inc. Method and apparatus for real time motion capture
US9174119B2 (en) 2002-07-27 2015-11-03 Sony Computer Entertainement America, LLC Controller for providing inputs to control execution of a program when inputs are combined
US9275626B2 (en) 2012-05-04 2016-03-01 Sony Computer Entertainment Europe Limited Audio system
CN105446310A (en) * 2015-12-31 2016-03-30 广东美的制冷设备有限公司 Air environment manufacturing method and device
US9393487B2 (en) 2002-07-27 2016-07-19 Sony Interactive Entertainment Inc. Method for mapping movements of a hand-held controller to game commands
WO2016120510A1 (en) * 2015-01-28 2016-08-04 Pablo Abad Rubio System for adapting virtual reality glasses for surround stereoscopic augmented reality visualisation
WO2016135472A1 (en) * 2015-02-25 2016-09-01 Bae Systems Plc Immersive vehicle simulator apparatus and method
US9474968B2 (en) 2002-07-27 2016-10-25 Sony Interactive Entertainment America Llc Method and system for applying gearing effects to visual tracking
US9573056B2 (en) 2005-10-26 2017-02-21 Sony Interactive Entertainment Inc. Expandable control device via hardware attachment
EP3136372A1 (en) * 2015-08-28 2017-03-01 BAE Systems PLC Immersive vehicle simulator apparatus and method
EP2526527A4 (en) * 2010-01-22 2017-03-15 Sony Computer Entertainment America, Inc. Capturing views and movements of actors performing within generated scenes
DE102015012291A1 (en) * 2015-09-23 2017-03-23 Audi Ag Method for operating a virtual reality system and virtual reality system
US9669321B2 (en) 2015-09-21 2017-06-06 Figment Productions Limited System for providing a virtual reality experience
US9682319B2 (en) 2002-07-31 2017-06-20 Sony Interactive Entertainment Inc. Combiner method for altering game gearing
DE102016000627A1 (en) * 2016-01-22 2017-07-27 Audi Ag Method for operating a virtual reality system and virtual reality system
US9720231B2 (en) 2012-09-26 2017-08-01 Dolby Laboratories Licensing Corporation Display, imaging system and controller for eyewear display device
WO2017172982A1 (en) * 2016-03-31 2017-10-05 Magic Leap, Inc. Interactions with 3d virtual objects using poses and multiple-dof controllers
US9881422B2 (en) 2014-12-04 2018-01-30 Htc Corporation Virtual reality system and method for controlling operation modes of virtual reality system
WO2018039070A1 (en) * 2016-08-23 2018-03-01 Google Llc System and method for placement of virtual characters in an augmented/virtual reality environment
US10096166B2 (en) 2014-11-19 2018-10-09 Bae Systems Plc Apparatus and method for selectively displaying an operational environment
US10108013B2 (en) 2016-08-22 2018-10-23 Microsoft Technology Licensing, Llc Indirect-view augmented reality display system
WO2019027530A1 (en) * 2017-08-02 2019-02-07 Google Llc Depth sensor aided estimation of virtual reality environment boundaries
US10216273B2 (en) 2015-02-25 2019-02-26 Bae Systems Plc Apparatus and method for effecting a control action in respect of system functions
US10262465B2 (en) 2014-11-19 2019-04-16 Bae Systems Plc Interactive control station
US10279254B2 (en) 2005-10-26 2019-05-07 Sony Interactive Entertainment Inc. Controller having visually trackable object for interfacing with a gaming system
US10908421B2 (en) 2006-11-02 2021-02-02 Razer (Asia-Pacific) Pte. Ltd. Systems and methods for personal viewing devices
USRE48417E1 (en) 2006-09-28 2021-02-02 Sony Interactive Entertainment Inc. Object direction using video input combined with tilt angle information
US10942617B2 (en) 2019-01-08 2021-03-09 International Business Machines Corporation Runtime adaptation of augmented reality gaming content based on context of surrounding physical environment
WO2021154433A1 (en) * 2020-01-27 2021-08-05 Facebook Technologies, Llc Systems, methods, and media for displaying real-time visualization of physical environment in artificial reality
US11200745B2 (en) 2020-01-27 2021-12-14 Facebook Technologies, Llc. Systems, methods, and media for automatically triggering real-time visualization of physical environment in artificial reality
US11210860B2 (en) 2020-01-27 2021-12-28 Facebook Technologies, Llc. Systems, methods, and media for visualizing occluded physical objects reconstructed in artificial reality
US11410387B1 (en) 2020-01-17 2022-08-09 Facebook Technologies, Llc. Systems, methods, and media for generating visualization of physical environment in artificial reality
US11451758B1 (en) 2020-02-12 2022-09-20 Meta Platforms Technologies, Llc Systems, methods, and media for colorizing grayscale images
US11501488B2 (en) 2020-01-27 2022-11-15 Meta Platforms Technologies, Llc Systems, methods, and media for generating visualization of physical environment in artificial reality

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106484085B (en) * 2015-08-31 2019-07-23 北京三星通信技术研究有限公司 The method and its head-mounted display of real-world object are shown in head-mounted display
JP6675209B2 (en) * 2016-01-20 2020-04-01 株式会社ソニー・インタラクティブエンタテインメント Information processing apparatus and user guide presentation method

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5796991A (en) * 1994-05-16 1998-08-18 Fujitsu Limited Image synthesis and display apparatus and simulation system using same
US6084557A (en) * 1997-05-23 2000-07-04 Minolta Co., Ltd. System for displaying combined imagery

Cited By (142)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8035629B2 (en) 2002-07-18 2011-10-11 Sony Computer Entertainment Inc. Hand-held computer interactive device
US9682320B2 (en) 2002-07-22 2017-06-20 Sony Interactive Entertainment Inc. Inertially trackable hand-held controller
US8797260B2 (en) 2002-07-27 2014-08-05 Sony Computer Entertainment Inc. Inertially trackable hand-held controller
US8675915B2 (en) 2002-07-27 2014-03-18 Sony Computer Entertainment America Llc System for tracking user manipulations within an environment
US8313380B2 (en) 2002-07-27 2012-11-20 Sony Computer Entertainment America Llc Scheme for translating movements of a hand-held controller into inputs for a system
US8188968B2 (en) 2002-07-27 2012-05-29 Sony Computer Entertainment Inc. Methods for interfacing with a program using a light input device
US9381424B2 (en) 2002-07-27 2016-07-05 Sony Interactive Entertainment America Llc Scheme for translating movements of a hand-held controller into inputs for a system
US8303405B2 (en) 2002-07-27 2012-11-06 Sony Computer Entertainment America Llc Controller for providing inputs to control execution of a program when inputs are combined
US10099130B2 (en) 2002-07-27 2018-10-16 Sony Interactive Entertainment America Llc Method and system for applying gearing effects to visual tracking
US10406433B2 (en) 2002-07-27 2019-09-10 Sony Interactive Entertainment America Llc Method and system for applying gearing effects to visual tracking
US9174119B2 (en) 2002-07-27 2015-11-03 Sony Computer Entertainement America, LLC Controller for providing inputs to control execution of a program when inputs are combined
US8686939B2 (en) 2002-07-27 2014-04-01 Sony Computer Entertainment Inc. System, method, and apparatus for three-dimensional input control
US10220302B2 (en) 2002-07-27 2019-03-05 Sony Interactive Entertainment Inc. Method and apparatus for tracking three-dimensional movements of an object using a depth sensing camera
US7760248B2 (en) 2002-07-27 2010-07-20 Sony Computer Entertainment Inc. Selective sound source listening in conjunction with computer interactive processing
US8570378B2 (en) 2002-07-27 2013-10-29 Sony Computer Entertainment Inc. Method and apparatus for tracking three-dimensional movements of an object using a depth sensing camera
US7918733B2 (en) 2002-07-27 2011-04-05 Sony Computer Entertainment America Inc. Multi-input game control mixer
US7854655B2 (en) 2002-07-27 2010-12-21 Sony Computer Entertainment America Inc. Obtaining input for controlling execution of a game program
US8976265B2 (en) 2002-07-27 2015-03-10 Sony Computer Entertainment Inc. Apparatus for image and sound capture in a game environment
US9474968B2 (en) 2002-07-27 2016-10-25 Sony Interactive Entertainment America Llc Method and system for applying gearing effects to visual tracking
US7850526B2 (en) 2002-07-27 2010-12-14 Sony Computer Entertainment America Inc. System for tracking user manipulations within an environment
US9393487B2 (en) 2002-07-27 2016-07-19 Sony Interactive Entertainment Inc. Method for mapping movements of a hand-held controller to game commands
US7803050B2 (en) 2002-07-27 2010-09-28 Sony Computer Entertainment Inc. Tracking device with sound emitter for use in obtaining information for controlling game program execution
US9682319B2 (en) 2002-07-31 2017-06-20 Sony Interactive Entertainment Inc. Combiner method for altering game gearing
US9177387B2 (en) 2003-02-11 2015-11-03 Sony Computer Entertainment Inc. Method and apparatus for real time motion capture
WO2004107272A1 (en) * 2003-05-29 2004-12-09 Sony Computer Entertainment Inc. System and method for providing a real-time three-dimensional interactive environment
US8072470B2 (en) 2003-05-29 2011-12-06 Sony Computer Entertainment Inc. System and method for providing a real-time three-dimensional interactive environment
US11010971B2 (en) 2003-05-29 2021-05-18 Sony Interactive Entertainment Inc. User-driven three-dimensional interactive gaming environment
WO2005017729A2 (en) * 2003-08-19 2005-02-24 Luigi Giubbolini Interface method and device between man and machine realised by manipulating virtual objects
WO2005017729A3 (en) * 2003-08-19 2005-06-16 Luigi Giubbolini Interface method and device between man and machine realised by manipulating virtual objects
US8139793B2 (en) 2003-08-27 2012-03-20 Sony Computer Entertainment Inc. Methods and apparatus for capturing audio signals based on a visual image
US8233642B2 (en) 2003-08-27 2012-07-31 Sony Computer Entertainment Inc. Methods and apparatuses for capturing an audio signal based on a location of the signal
US8947347B2 (en) 2003-08-27 2015-02-03 Sony Computer Entertainment Inc. Controlling actions in a video game unit
US8160269B2 (en) 2003-08-27 2012-04-17 Sony Computer Entertainment Inc. Methods and apparatuses for adjusting a listening area for capturing sounds
US7646372B2 (en) 2003-09-15 2010-01-12 Sony Computer Entertainment Inc. Methods and systems for enabling direction detection when interfacing with a computer program
US7883415B2 (en) 2003-09-15 2011-02-08 Sony Computer Entertainment Inc. Method and apparatus for adjusting a view of a scene being displayed according to tracked head motion
US7874917B2 (en) 2003-09-15 2011-01-25 Sony Computer Entertainment Inc. Methods and systems for enabling depth and direction detection when interfacing with a computer program
US8303411B2 (en) 2003-09-15 2012-11-06 Sony Computer Entertainment Inc. Methods and systems for enabling depth and direction detection when interfacing with a computer program
US8758132B2 (en) 2003-09-15 2014-06-24 Sony Computer Entertainment Inc. Methods and systems for enabling depth and direction detection when interfacing with a computer program
US8251820B2 (en) 2003-09-15 2012-08-28 Sony Computer Entertainment Inc. Methods and systems for enabling depth and direction detection when interfacing with a computer program
EP1521165A2 (en) * 2003-09-30 2005-04-06 Canon Kabushiki Kaisha Data conversion method and apparatus, and orientation measurement apparatus
US7414596B2 (en) 2003-09-30 2008-08-19 Canon Kabushiki Kaisha Data conversion method and apparatus, and orientation measurement apparatus
EP1521165A3 (en) * 2003-09-30 2007-10-03 Canon Kabushiki Kaisha Data conversion method and apparatus, and orientation measurement apparatus
WO2005064440A3 (en) * 2003-12-23 2006-01-26 Siemens Ag Device and method for the superposition of the real field of vision in a precisely positioned manner
WO2005064440A2 (en) * 2003-12-23 2005-07-14 Siemens Aktiengesellschaft Device and method for the superposition of the real field of vision in a precisely positioned manner
US7663689B2 (en) 2004-01-16 2010-02-16 Sony Computer Entertainment Inc. Method and apparatus for optimizing capture device settings through depth information
US8547401B2 (en) 2004-08-19 2013-10-01 Sony Computer Entertainment Inc. Portable augmented reality device and method
US10099147B2 (en) 2004-08-19 2018-10-16 Sony Interactive Entertainment Inc. Using a portable device to interface with a video game rendered on a main display
US10279254B2 (en) 2005-10-26 2019-05-07 Sony Interactive Entertainment Inc. Controller having visually trackable object for interfacing with a gaming system
US9573056B2 (en) 2005-10-26 2017-02-21 Sony Interactive Entertainment Inc. Expandable control device via hardware attachment
WO2007090660A1 (en) * 2006-02-08 2007-08-16 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Method and arrangement for blending location-related information into a visual representation or view of a scene
US8633871B2 (en) 2006-06-28 2014-01-21 Canon Kabushiki Kaisha Image processing apparatus and image processing method
EP1873617A2 (en) * 2006-06-28 2008-01-02 Canon Kabushiki Kaisha Image processing apparatus and image processing method
EP1873617A3 (en) * 2006-06-28 2013-02-27 Canon Kabushiki Kaisha Image processing apparatus and image processing method
US8310656B2 (en) 2006-09-28 2012-11-13 Sony Computer Entertainment America Llc Mapping movements of a hand-held controller to the two-dimensional image plane of a display screen
USRE48417E1 (en) 2006-09-28 2021-02-02 Sony Interactive Entertainment Inc. Object direction using video input combined with tilt angle information
US8781151B2 (en) 2006-09-28 2014-07-15 Sony Computer Entertainment Inc. Object detection using video input combined with tilt angle information
US9891435B2 (en) 2006-11-02 2018-02-13 Sensics, Inc. Apparatus, systems and methods for providing motion tracking using a personal viewing device
US10908421B2 (en) 2006-11-02 2021-02-02 Razer (Asia-Pacific) Pte. Ltd. Systems and methods for personal viewing devices
US8542907B2 (en) 2007-12-17 2013-09-24 Sony Computer Entertainment America Llc Dynamic three-dimensional object mapping for user-defined control device
US8840470B2 (en) 2008-02-27 2014-09-23 Sony Computer Entertainment America Llc Methods for capturing depth data of a scene and applying computer actions
US8368753B2 (en) 2008-03-17 2013-02-05 Sony Computer Entertainment America Llc Controller with an integrated depth camera
EP2303422A4 (en) * 2008-05-30 2017-06-07 Sony Computer Entertainment America LLC Determination of controller three-dimensional location using image analysis and ultrasonic communication
US8323106B2 (en) 2008-05-30 2012-12-04 Sony Computer Entertainment America Llc Determination of controller three-dimensional location using image analysis and ultrasonic communication
WO2009146234A1 (en) 2008-05-30 2009-12-03 Sony Computer Entertainment America Inc. Determination of controller three-dimensional location using image analysis and ultrasonic communication
US8287373B2 (en) 2008-12-05 2012-10-16 Sony Computer Entertainment Inc. Control device for communicating visual information
US8527657B2 (en) 2009-03-20 2013-09-03 Sony Computer Entertainment America Llc Methods and systems for dynamically adjusting update rates in multi-player network gaming
US8342963B2 (en) 2009-04-10 2013-01-01 Sony Computer Entertainment America Inc. Methods and systems for enabling control of artificial intelligence game characters
US8393964B2 (en) 2009-05-08 2013-03-12 Sony Computer Entertainment America Llc Base station for position location
US8142288B2 (en) 2009-05-08 2012-03-27 Sony Computer Entertainment America Llc Base station movement detection and compensation
US8961313B2 (en) 2009-05-29 2015-02-24 Sony Computer Entertainment America Llc Multi-positional three-dimensional controller
US9162061B2 (en) 2009-10-09 2015-10-20 National Ict Australia Limited Vision enhancement for a vision impaired user
EP2485692A4 (en) * 2009-10-09 2013-06-12 Nat Ict Australia Ltd Vision enhancement for a vision impaired user
EP2485692A1 (en) * 2009-10-09 2012-08-15 National ICT Australia Limited Vision enhancement for a vision impaired user
EP2526527A4 (en) * 2010-01-22 2017-03-15 Sony Computer Entertainment America, Inc. Capturing views and movements of actors performing within generated scenes
US10810803B2 (en) 2010-02-02 2020-10-20 Sony Corporation Image processing device, image processing method, and program
US11651574B2 (en) 2010-02-02 2023-05-16 Sony Corporation Image processing device, image processing method, and program
US10515488B2 (en) 2010-02-02 2019-12-24 Sony Corporation Image processing device, image processing method, and program
CN102141885B (en) * 2010-02-02 2013-10-30 Sony Corporation Image processing device and image processing method
US10037628B2 (en) 2010-02-02 2018-07-31 Sony Corporation Image processing device, image processing method, and program
US10223837B2 (en) 2010-02-02 2019-03-05 Sony Corporation Image processing device, image processing method, and program
US9805513B2 (en) 2010-02-02 2017-10-31 Sony Corporation Image processing device, image processing method, and program
US9754418B2 (en) 2010-02-02 2017-09-05 Sony Corporation Image processing device, image processing method, and program
CN102141885A (en) * 2010-02-02 2011-08-03 Sony Corporation Image processing device, image processing method, and program
US11189105B2 (en) 2010-02-02 2021-11-30 Sony Corporation Image processing device, image processing method, and program
EP2697792A4 (en) * 2011-04-12 2015-06-03 Yuval Boger Apparatus, systems and methods for providing motion tracking using a personal viewing device
KR20140083015A (en) * 2011-10-05 2014-07-03 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Portable device, virtual reality system and method
EP2579128A1 (en) * 2011-10-05 2013-04-10 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Portable device, virtual reality system and method
US9216347B2 (en) 2011-10-05 2015-12-22 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Portable device, virtual reality system and method
KR101670147B1 (en) * 2011-10-05 2016-11-09 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Portable device, virtual reality system and method
WO2013050473A1 (en) * 2011-10-05 2013-04-11 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Portable device, virtual reality system and method
CN104024984A (en) * 2011-10-05 2014-09-03 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Portable device, virtual reality system and method
CN104024984B (en) * 2011-10-05 2017-06-09 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Portable device, virtual reality system and method
JP2015502584A (en) * 2011-10-05 2015-01-22 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Portable device, virtual reality system and method
US9275626B2 (en) 2012-05-04 2016-03-01 Sony Computer Entertainment Europe Limited Audio system
US9310884B2 (en) 2012-05-04 2016-04-12 Sony Computer Entertainment Europe Limited Head mountable display system
EP2660643A2 (en) * 2012-05-04 2013-11-06 Sony Computer Entertainment Europe Limited Head mountable display system
EP2660643A3 (en) * 2012-05-04 2013-12-25 Sony Computer Entertainment Europe Limited Head mountable display system
US9448624B2 (en) 2012-09-14 2016-09-20 Lg Electronics Inc. Apparatus and method of providing user interface on head mounted display and head mounted display thereof
EP2896205A4 (en) * 2012-09-14 2016-07-27 Lg Electronics Inc Apparatus and method of providing user interface on head mounted display and head mounted display thereof
WO2014042458A1 (en) 2012-09-14 2014-03-20 Lg Electronics Inc. Apparatus and method of providing user interface on head mounted display and head mounted display thereof
WO2014042320A1 (en) 2012-09-14 2014-03-20 Lg Electronics Inc. Apparatus and method of providing user interface on head mounted display and head mounted display thereof
EP2895911A4 (en) * 2012-09-14 2016-07-27 Lg Electronics Inc Apparatus and method of providing user interface on head mounted display and head mounted display thereof
US9720231B2 (en) 2012-09-26 2017-08-01 Dolby Laboratories Licensing Corporation Display, imaging system and controller for eyewear display device
WO2014156033A1 (en) * 2013-03-26 2014-10-02 Seiko Epson Corporation Head-mounted display device, control method of head-mounted display device, and display system
US11054650B2 (en) 2013-03-26 2021-07-06 Seiko Epson Corporation Head-mounted display device, control method of head-mounted display device, and display system
EP2816519A1 (en) * 2013-06-17 2014-12-24 Spreadtrum Communications (Shanghai) Co., Ltd. Three-dimensional shopping platform displaying system
CN104345455A (en) * 2013-07-29 2015-02-11 Sony Corporation Information presentation apparatus and information processing system
GB2524269B (en) * 2014-03-17 2021-04-14 Sony Interactive Entertainment Europe Ltd Virtual reality
GB2524269A (en) * 2014-03-17 2015-09-23 Sony Computer Entertainment Europe Ltd Virtual reality
US10096166B2 (en) 2014-11-19 2018-10-09 Bae Systems Plc Apparatus and method for selectively displaying an operational environment
US10262465B2 (en) 2014-11-19 2019-04-16 Bae Systems Plc Interactive control station
US9881422B2 (en) 2014-12-04 2018-01-30 Htc Corporation Virtual reality system and method for controlling operation modes of virtual reality system
EP3029552B1 (en) * 2014-12-04 2018-10-17 HTC Corporation Virtual reality system and method for controlling operation modes of virtual reality system
WO2016120510A1 (en) * 2015-01-28 2016-08-04 Pablo Abad Rubio System for adapting virtual reality glasses for surround stereoscopic augmented reality visualisation
WO2016135472A1 (en) * 2015-02-25 2016-09-01 Bae Systems Plc Immersive vehicle simulator apparatus and method
US10216273B2 (en) 2015-02-25 2019-02-26 Bae Systems Plc Apparatus and method for effecting a control action in respect of system functions
EP3136372A1 (en) * 2015-08-28 2017-03-01 BAE Systems PLC Immersive vehicle simulator apparatus and method
US9669321B2 (en) 2015-09-21 2017-06-06 Figment Productions Limited System for providing a virtual reality experience
DE102015012291A1 (en) * 2015-09-23 2017-03-23 Audi Ag Method for operating a virtual reality system and virtual reality system
CN105446310B (en) * 2015-12-31 2018-07-03 Guangdong Midea Refrigeration Equipment Co., Ltd. Air environment manufacturing method and device
CN105446310A (en) * 2015-12-31 2016-03-30 Guangdong Midea Refrigeration Equipment Co., Ltd. Air environment manufacturing method and device
DE102016000627B4 (en) 2016-01-22 2024-03-28 Audi Ag Method for operating a virtual reality system and virtual reality system
DE102016000627A1 (en) * 2016-01-22 2017-07-27 Audi Ag Method for operating a virtual reality system and virtual reality system
US11049328B2 (en) 2016-03-31 2021-06-29 Magic Leap, Inc. Interactions with 3D virtual objects using poses and multiple-DOF controllers
US10078919B2 (en) 2016-03-31 2018-09-18 Magic Leap, Inc. Interactions with 3D virtual objects using poses and multiple-DOF controllers
WO2017172982A1 (en) * 2016-03-31 2017-10-05 Magic Leap, Inc. Interactions with 3d virtual objects using poses and multiple-dof controllers
US10733806B2 (en) 2016-03-31 2020-08-04 Magic Leap, Inc. Interactions with 3D virtual objects using poses and multiple-dof controllers
US10417831B2 (en) 2016-03-31 2019-09-17 Magic Leap, Inc. Interactions with 3D virtual objects using poses and multiple-DOF controllers
US11657579B2 (en) 2016-03-31 2023-05-23 Magic Leap, Inc. Interactions with 3D virtual objects using poses and multiple-DOF controllers
US10510191B2 (en) 2016-03-31 2019-12-17 Magic Leap, Inc. Interactions with 3D virtual objects using poses and multiple-DOF controllers
US10108013B2 (en) 2016-08-22 2018-10-23 Microsoft Technology Licensing, Llc Indirect-view augmented reality display system
WO2018039070A1 (en) * 2016-08-23 2018-03-01 Google Llc System and method for placement of virtual characters in an augmented/virtual reality environment
US10803663B2 (en) 2017-08-02 2020-10-13 Google Llc Depth sensor aided estimation of virtual reality environment boundaries
WO2019027530A1 (en) * 2017-08-02 2019-02-07 Google Llc Depth sensor aided estimation of virtual reality environment boundaries
US10942617B2 (en) 2019-01-08 2021-03-09 International Business Machines Corporation Runtime adaptation of augmented reality gaming content based on context of surrounding physical environment
US11410387B1 (en) 2020-01-17 2022-08-09 Facebook Technologies, Llc Systems, methods, and media for generating visualization of physical environment in artificial reality
US11210860B2 (en) 2020-01-27 2021-12-28 Facebook Technologies, Llc Systems, methods, and media for visualizing occluded physical objects reconstructed in artificial reality
US11501488B2 (en) 2020-01-27 2022-11-15 Meta Platforms Technologies, Llc Systems, methods, and media for generating visualization of physical environment in artificial reality
US11200745B2 (en) 2020-01-27 2021-12-14 Facebook Technologies, Llc Systems, methods, and media for automatically triggering real-time visualization of physical environment in artificial reality
US11113891B2 (en) 2020-01-27 2021-09-07 Facebook Technologies, Llc Systems, methods, and media for displaying real-time visualization of physical environment in artificial reality
WO2021154433A1 (en) * 2020-01-27 2021-08-05 Facebook Technologies, Llc Systems, methods, and media for displaying real-time visualization of physical environment in artificial reality
US11451758B1 (en) 2020-02-12 2022-09-20 Meta Platforms Technologies, Llc Systems, methods, and media for colorizing grayscale images

Also Published As

Publication number Publication date
GB0113559D0 (en) 2001-07-25

Similar Documents

Publication Publication Date Title
GB2376397A (en) Virtual or augmented reality
CN110199325B (en) Simulation system, processing method, and information storage medium
US10372209B2 (en) Eye tracking enabling 3D viewing
US11865453B2 (en) Simulation system, process method, and information storage medium
JP6373920B2 (en) Simulation system and program
JP6684559B2 (en) Program and image generation device
KR101480994B1 (en) Method and system for generating augmented reality with a display of a motor vehicle
KR100812624B1 (en) Stereovision-Based Virtual Reality Device
JP6761340B2 (en) Simulation system and program
JP3530772B2 (en) Mixed reality device and mixed reality space image generation method
JP7361502B2 (en) System and method for adjusting stereoscopic effects
JP2000350859A (en) Marker arrangement method and mixed reality apparatus
CN105359063A (en) Head mounted display with tracking
CA2951058A1 (en) Autostereoscopic virtual reality platform
KR20140043522A (en) Apparatus and method for controlling a transparent double-sided display
JP7144796B2 (en) Simulation system and program
KR20130052021A (en) Image display apparatus, game program, and method of controlling game
US20190143223A1 (en) Simulation system and game system
JP5123353B2 (en) A virtual flashlight for illuminating and discovering real-time scenes
Lou et al. Reducing cybersickness by geometry deformation
JP4282112B2 (en) Virtual object control method, virtual object control apparatus, and recording medium
JP7351638B2 (en) Image generation device, image display system, and information presentation method
CN113574591A (en) Boundary setting device, boundary setting method, and program
JP7104539B2 (en) Simulation system and program
JP6918189B2 (en) Simulation system and program

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)