US10252162B2 - Method of providing a virtual space, medium for causing a computer to execute the method, and system for providing a virtual space

Info

Publication number
US10252162B2
Authority
US
United States
Prior art keywords
user
virtual
virtual space
field
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US15/619,333
Other languages
English (en)
Other versions
US20170354882A1 (en)
Inventor
Yuki Kono
Naruatsu Baba
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Colopl Inc
Original Assignee
Colopl Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2016116437A (JP6177965B1)
Priority claimed from JP2016116444A (JP6126273B1)
Application filed by Colopl Inc filed Critical Colopl Inc
Assigned to COLOPL, INC. Assignors: KONO, YUKI; BABA, NARUATSU
Publication of US20170354882A1
Priority to US16/376,376 (US10589176B2)
Application granted
Publication of US10252162B2
Legal status: Active
Anticipated expiration

Classifications

    • A63F 13/5258: Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
    • A63F 13/212: Input arrangements for video game devices characterised by their sensors, purposes or types, using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A63F 13/25: Output arrangements for video game devices
    • A63F 13/428: Processing input control signals of video game devices by mapping the input signals into game commands, involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A63F 13/211: Input arrangements for video game devices using inertial sensors, e.g. accelerometers or gyroscopes
    • A63F 13/213: Input arrangements for video game devices comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F 13/577: Simulating properties, behaviour or motion of objects in the game world, using determination of contact between game characters or objects, e.g. to avoid collision between virtual racing cars
    • A63F 13/803: Special adaptations for executing a specific game genre or game mode; driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
    • A63F 2250/02: Miscellaneous game characteristics having an effect on the human senses
    • A63F 2300/8017: Features of games specially adapted for executing a specific type of game; driving on land or water; flying
    • G02B 27/017: Head-up displays; head mounted
    • G02B 2027/0187: Display position adjusting means not related to the information to be displayed, slaved to motion of at least a part of the body of the user, e.g. head, eye
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/014: Hand-worn input/output arrangements, e.g. data gloves
    • G06F 2203/012: Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
    • G06T 15/20: 3D image rendering; geometric effects; perspective computation
    • G06T 19/003: Manipulating 3D models or images for computer graphics; navigation within 3D models or images

Definitions

  • This disclosure relates to a technology of providing virtual reality, and more specifically, to a technology of helping to reduce a visually induced motion sickness in virtual reality.
  • HMD: head mounted display
  • VR: virtual reality
  • Patent Literature 1 Japanese Patent No. 5869177
  • In Patent Literature 1, there is disclosed a technology of "generating an image while suppressing the amount of information to be visually recognized by a wearer of a head mounted display when a visual-field image of a virtual space in which a user is immersed is provided to the HMD" (see [Abstract]).
  • Patent Literature 2 Japanese Patent Application Laid-open No. 2007-116309
  • In virtual reality, there are games that utilize a motion of a user wearing the HMD device, for example, a motion of a hand. Therefore, a technology of helping to reduce the visually induced motion sickness in accordance with the motion of the user is desired.
  • This disclosure has been made to help solve the problem described above, and at least one embodiment has an object, in one aspect, to provide a method of reducing a visually induced motion sickness (VR sickness) when virtual reality is provided.
  • This disclosure, in at least one embodiment, further has an object, in another aspect, to provide a medium storing instructions for reducing a visually induced motion sickness when virtual reality is provided.
  • This disclosure, in at least one embodiment, further has an object, in another aspect, to provide a system for reducing a visually induced motion sickness when virtual reality is provided.
  • A method of providing a virtual space to a head mounted display device by a computer includes defining a virtual space by a processor of the computer. The method further includes determining, by the processor, a flying direction of an object that flies in the virtual space in accordance with a motion of a user of the head mounted display device, based on the motion of the user. The method further includes causing, by the processor, the head mounted display device to display a field of view of the user in the virtual space so that the field of view is moved in the flying direction.
  • FIG. 1 A diagram of a configuration of a head mounted display (HMD) system according to at least one embodiment of this disclosure.
  • FIG. 2 A block diagram of a hardware configuration of a computer according to at least one embodiment of this disclosure.
  • FIG. 3 A schematic diagram of a UVW visual-field coordinate system to be set for an HMD device of at least one embodiment of this disclosure.
  • FIG. 4 A schematic diagram of a mode of expressing a virtual space of at least one embodiment of this disclosure.
  • FIG. 5 A diagram of a head of a user wearing an HMD device of at least one embodiment of this disclosure.
  • FIG. 6 A cross-sectional diagram of a YZ cross section obtained by viewing a field-of-view region from an X direction in the virtual space according to at least one embodiment of this disclosure.
  • FIG. 7 A cross-sectional diagram of an XZ cross section obtained by viewing the field-of-view region from a Y direction in the virtual space according to at least one embodiment of this disclosure.
  • FIG. 8 A block diagram of a functional configuration to be achieved by the computer of at least one embodiment of this disclosure.
  • FIG. 9 A flowchart of processing to be executed by the HMD system according to at least one embodiment of this disclosure.
  • FIG. 10 A flow chart of processing to be executed by the computer when a virtual user throws a lasso or other objects to a target in the virtual space according to at least one embodiment of this disclosure.
  • FIG. 11A A diagram of a state in which a mountain and a tree are present in the virtual space according to at least one embodiment of this disclosure.
  • FIG. 11B A diagram of a state in which a mountain and a tree are present in the virtual space according to at least one embodiment of this disclosure.
  • FIG. 12A A diagram of a state when the virtual user throws a rope toward a tree according to at least one embodiment of this disclosure.
  • FIG. 12B A diagram of a state when the virtual user throws a rope toward a tree according to at least one embodiment of this disclosure.
  • FIG. 13A A diagram of the virtual user having moved to the vicinity of the tree in response to a loop of the rope being caught on the tree according to at least one embodiment of this disclosure.
  • FIG. 13B A diagram of the virtual user having moved to the vicinity of the tree in response to a loop of the rope being caught on the tree according to at least one embodiment of this disclosure.
  • FIG. 14A A diagram of a state in which a mountain is present in the virtual space and a horse is running according to at least one embodiment of this disclosure.
  • FIG. 14B A diagram of a state in which a mountain is present in the virtual space and a horse is running according to at least one embodiment of this disclosure.
  • FIG. 15A A diagram of an image indicating a state in which the target has moved in the virtual space according to at least one embodiment of this disclosure.
  • FIG. 15B A diagram of an image indicating a state in which the target has moved in the virtual space according to at least one embodiment of this disclosure.
  • FIG. 16A A diagram of a state in which, in one aspect, the virtual user has moved to a moving object at high speed according to at least one embodiment of this disclosure.
  • FIG. 16B A diagram of a state in which, in one aspect, the virtual user has moved to a moving object at high speed according to at least one embodiment of this disclosure.
  • FIG. 17 A diagram of a configuration of an HMD system according to at least one embodiment of this disclosure.
  • FIG. 18 A block diagram of a hardware configuration of a computer according to at least one embodiment of this disclosure.
  • FIG. 19 A block diagram of the computer of at least one embodiment of this disclosure as a module configuration.
  • FIG. 20 A flow chart of processing to be executed in the HMD system according to at least one embodiment of this disclosure.
  • FIG. 21 A flow chart of processing to be executed by the computer in accordance with a motion of the user sitting on a board according to at least one embodiment of this disclosure.
  • FIG. 22A A diagram of a mode in which the user is sitting on the board of at least one embodiment of this disclosure.
  • FIG. 22C A diagram of a mode in which the user is sitting on the board of at least one embodiment of this disclosure.
  • FIG. 23A A diagram of a mode in which the user is standing on the board of at least one embodiment of this disclosure.
  • FIG. 23C A diagram of a mode in which the user is standing on the board of at least one embodiment of this disclosure.
  • FIG. 24B A diagram of a mode in which the user is sitting on a sled of at least one embodiment of this disclosure.
  • FIG. 24C A diagram of a mode in which the user is sitting on a sled of at least one embodiment of this disclosure.
  • FIG. 25A A diagram of a case where the board or the sled on which the user is sitting or standing is maintained in a horizontal state according to at least one embodiment of this disclosure.
  • FIG. 26A A diagram of a case where the user maintaining a horizontal state leans to the right side to incline the board or the sled to the right side according to at least one embodiment of this disclosure.
  • FIG. 26B A diagram of a case where the user maintaining a horizontal state leans to the right side to incline the board or the sled to the right side according to at least one embodiment of this disclosure.
  • FIG. 27A A diagram of a case where the user maintaining the horizontal state leans to the left side to incline the board or the sled to the left side according to at least one embodiment of this disclosure.
  • FIG. 27B A diagram of a case where the user maintaining the horizontal state leans to the left side to incline the board or the sled to the left side according to at least one embodiment of this disclosure.
  • FIG. 28A A diagram of a case where the user maintaining the horizontal state leans forward to incline the board forward according to at least one embodiment of this disclosure.
  • FIG. 28B A diagram of a case where the user maintaining the horizontal state leans forward to incline the board forward according to at least one embodiment of this disclosure.
  • FIG. 29A A diagram of a case where the user maintaining the horizontal state leans backward to incline the board backward according to at least one embodiment of this disclosure.
  • FIG. 29B A diagram of a case where the user maintaining the horizontal state leans backward to incline the board backward according to at least one embodiment of this disclosure.
  • FIG. 1 is a diagram of the configuration of the HMD system 100 according to at least one embodiment of this disclosure.
  • The HMD system 100 is provided as a system for household use or a system for professional use.
  • The HMD system 100 includes an HMD device 110, an HMD sensor 120, a controller 160, and a computer 200.
  • The HMD device 110 includes a monitor 112 and an eye gaze sensor 140.
  • The controller 160 includes a motion sensor 130.
  • The computer 200 can be connected to a network 19 and can communicate to/from a server 150 connected to the network 19.
  • The HMD device 110 may include a sensor 114 instead of the HMD sensor 120.
  • The HMD device 110 of at least one embodiment of this disclosure may be worn on a head of a user to provide a virtual space to the user during operation. More specifically, the HMD device 110 displays each of a right-eye image and a left-eye image on the monitor 112. When each eye of the user visually recognizes each image, the user may recognize the image as a three-dimensional image based on the parallax of both eyes.
  • The monitor 112 is achieved as, for example, a non-transmissive display device or a partially transmissive display device.
  • The monitor 112 is arranged on a main body of the HMD device 110 so as to be positioned in front of both eyes of the user. Therefore, when the user visually recognizes the three-dimensional image displayed on the monitor 112, the user can be immersed in the virtual space.
  • The virtual space includes, for example, a background, objects that can be operated by the user, and menu images that can be selected by the user.
  • The monitor 112 may be achieved as a liquid crystal monitor or an organic electroluminescence (EL) monitor included in a so-called smartphone or other information display terminals.
  • EL: organic electroluminescence
  • The monitor 112 may include a sub-monitor for displaying a right-eye image and a sub-monitor for displaying a left-eye image. In at least one aspect, the monitor 112 may be configured to integrally display the right-eye image and the left-eye image. In this case, the monitor 112 includes a high-speed shutter. The high-speed shutter operates so as to enable alternate display of the right-eye image and the left-eye image so that only one of the eyes can recognize the image at any single point in time.
  • The HMD sensor 120 includes a plurality of light sources (not shown). Each light source is achieved by, for example, an LED configured to emit an infrared ray.
  • The HMD sensor 120 has a position tracking function for detecting the movement of the HMD device 110. The HMD sensor 120 uses this function to detect the position and the inclination of the HMD device 110 in a real space.
  • The HMD sensor 120 may be achieved by a camera.
  • The HMD sensor 120 may use image information of the HMD device 110 output from the camera to execute image analysis processing, to thereby enable detection of the position and the inclination of the HMD device 110.
  • The HMD device 110 may include the sensor 114 instead of the HMD sensor 120 as a position detector.
  • The HMD device 110 may use the sensor 114 to detect the position and the inclination of the HMD device 110 itself.
  • The sensor 114 is, for example, an angular velocity sensor, a geomagnetic sensor, an acceleration sensor, or a gyrosensor. The HMD device 110 may use any of those sensors instead of the HMD sensor 120 to detect the position and the inclination of the HMD device 110.
  • When the sensor 114 is an angular velocity sensor, the sensor detects, over time, the angular velocity about each of three axes of the HMD device 110 in the real space. The HMD device 110 calculates a temporal change of the angle about each of the three axes based on each angular velocity, and further calculates an inclination of the HMD device 110 based on the temporal change of the angles.
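The angle calculation described above is, in essence, a numerical integration of the angular velocity over time. A minimal sketch in Python; the sampling rate, values, and function name are illustrative assumptions, not part of this disclosure:

```python
def integrate_gyro(angles, omega, dt):
    """One Euler-integration step: accumulate the angular velocity (rad/s)
    about each of the three axes into rotation angles (rad)."""
    return tuple(a + w * dt for a, w in zip(angles, omega))

# Hypothetical gyro stream: 100 samples at 100 Hz, a steady 0.5 rad/s yaw
angles = (0.0, 0.0, 0.0)
for _ in range(100):
    angles = integrate_gyro(angles, (0.0, 0.5, 0.0), dt=0.01)
# After 1 s, the head has turned approximately 0.5 rad about the y axis
```

Here the temporal change of each angle is the per-step increment `w * dt`; summing those increments over time yields the inclination of the HMD device 110, as described above.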
  • The motion sensor 130 is mounted on the hand of the user to detect the movement of the hand of the user.
  • The motion sensor 130 detects a rotational speed and the number of rotations of the hand.
  • The detection signal is transmitted to the computer 200.
  • The motion sensor 130 is provided to, for example, the glove-type controller 160.
  • The controller 160 is mounted on an object that does not easily fly away, for example, a glove-type object worn on a hand of a user 190.
  • A sensor that is not mounted on the user 190 may detect the movement of the hand of the user 190.
  • A signal of a camera that captures images of the user 190 may be input to the computer 200 as a signal representing the motion of the user 190.
  • The motion sensor 130 and the computer 200 are connected to each other through wired or wireless communication.
  • The communication mode is not particularly limited, and, for example, Bluetooth® or other known communication methods may be used.
  • The eye gaze sensor 140 is configured to detect a direction (line-of-sight direction) in which the lines of sight of the right eye and the left eye of the user 190 are directed.
  • The direction is detected by, for example, a known eye tracking function.
  • The eye gaze sensor 140 is achieved by a sensor having the eye tracking function.
  • The eye gaze sensor 140 includes a right-eye sensor and a left-eye sensor.
  • The eye gaze sensor 140 may be, for example, a sensor configured to irradiate the right eye and the left eye of the user 190 with infrared light, and to receive reflection light from the cornea and the iris with respect to the irradiation light, to thereby detect a rotational angle of each eyeball.
  • The eye gaze sensor 140 can detect the line-of-sight direction of the user 190 based on each detected rotational angle.
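As a rough illustration of that last step, the detected eyeball rotation angles can be mapped to a unit line-of-sight vector. This sketch assumes a simple yaw/pitch model with hypothetical names; the actual computation of the eye gaze sensor 140 is not specified in this disclosure:

```python
import math

def gaze_direction(yaw, pitch):
    """Convert eyeball rotation angles (radians) into a unit line-of-sight
    vector in the head's coordinate frame.
    yaw: rotation about the vertical axis; pitch: about the horizontal axis."""
    return (math.sin(yaw) * math.cos(pitch),
            math.sin(pitch),
            math.cos(yaw) * math.cos(pitch))

# Eyes looking straight ahead: the line of sight is the front (z) axis
front = gaze_direction(0.0, 0.0)  # (0.0, 0.0, 1.0)
```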
  • The server 150 may transmit instructions to the computer 200.
  • The server 150 may communicate to/from another computer 200 for providing virtual reality to an HMD device used by another user. For example, when a plurality of users play a participatory game in an amusement facility, each computer 200 communicates to/from another computer 200 with a signal based on the motion of each user, to thereby enable the plurality of users to enjoy a common game in the same virtual space.
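To make that signal exchange concrete, here is a minimal sketch of how one computer 200 might serialize its user's motion and a peer might apply it to its copy of the shared virtual space. The JSON wire format and all names are illustrative assumptions; the disclosure does not specify a protocol:

```python
import json

def encode_motion(user_id, hand_position):
    """Serialize one user's motion sample for transmission to peer computers."""
    return json.dumps({"user": user_id, "pos": hand_position})

def apply_motion(shared_state, message):
    """Update the local copy of the shared virtual space from a peer's signal."""
    sample = json.loads(message)
    shared_state[sample["user"]] = sample["pos"]
    return shared_state

# Computer A sends its user's hand position; computer B applies it
state = apply_motion({}, encode_motion("user-2", [0.1, 1.4, 0.3]))
```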
  • FIG. 2 is a block diagram of the hardware configuration of the computer 200 in at least one embodiment of this disclosure.
  • The computer 200 includes a processor 10, a memory 11, a storage 12, an input/output interface 13, and a communication interface 14. Each component is connected to a bus 15.
  • The processor 10 is configured to execute a series of commands included in a program stored in the memory 11 or the storage 12 based on a signal transmitted to the computer 200 or on satisfaction of a condition determined in advance.
  • The processor 10 is achieved as a central processing unit (CPU), a micro-processor unit (MPU), a field-programmable gate array (FPGA), or other devices.
  • The memory 11 temporarily stores programs and data. Temporary storage of data is performed in a non-transitory manner, such that retrieval of the stored data is possible.
  • The programs are loaded from, for example, the storage 12.
  • The data includes data input to the computer 200 and data generated by the processor 10.
  • The memory 11 is achieved as a random access memory (RAM) or other volatile memories.
  • The storage 12 permanently stores programs and data.
  • The storage 12 is achieved as, for example, a read-only memory (ROM), a hard disk device, a flash memory, or other non-volatile storage devices.
  • The programs stored in the storage 12 include programs for providing a virtual space in the HMD system 100, simulation programs, game programs, user authentication programs, and programs for achieving communication to/from other computers 200.
  • The data stored in the storage 12 includes data and objects for defining the virtual space.
  • The storage 12 may be achieved as a removable storage device such as a memory card.
  • A configuration that uses programs and data stored in an external storage device may be used instead of the storage 12 built into the computer 200. With such a configuration, for example, in a situation where a plurality of HMD systems 100 are used, as in an amusement facility, the programs and the data can be collectively updated.
  • The input/output interface 13 is configured to allow communication of signals among the HMD device 110, the HMD sensor 120, the motion sensor 130, and the server 150.
  • The input/output interface 13 is achieved with use of a universal serial bus (USB) interface, a digital visual interface (DVI), a high-definition multimedia interface (HDMI®), or other terminals.
  • USB: universal serial bus
  • DVI: digital visual interface
  • HDMI: high-definition multimedia interface
  • The input/output interface 13 is not limited to the examples described above.
  • The communication interface 14 is connected to the network 19 to communicate to/from other computers (for example, the server 150) connected to the network 19.
  • The communication interface 14 is achieved as, for example, a local area network (LAN) interface or other wired communication interfaces, or as wireless fidelity (WiFi), Bluetooth®, near field communication (NFC), or other wireless communication interfaces.
  • LAN: local area network
  • WiFi: wireless fidelity
  • NFC: near field communication
  • The communication interface 14 is not limited to the examples described above.
  • The processor 10 loads one or more programs stored in the storage 12 into the memory 11 to execute the series of commands included in each program.
  • The one or more programs may include an operating system of the computer 200, an application program for providing a virtual space, and game software that can be executed in the virtual space with use of the controller 160.
  • The processor 10 transmits a signal for providing a virtual space to the HMD device 110 via the input/output interface 13.
  • The HMD device 110 displays a video on the monitor 112 based on the signal.
  • The computer 200 is provided outside of the HMD device 110, but in at least one aspect, the computer 200 may be built into the HMD device 110.
  • A portable information communication terminal (for example, a smartphone) that includes the monitor 112 may function as the computer 200.
  • The computer 200 may be used in common among a plurality of HMD devices 110.
  • With such a configuration, the same virtual space can be provided to a plurality of users, and hence each user can enjoy the same application with other users in the same virtual space.
  • A global coordinate system is set in advance.
  • The global coordinate system has three reference directions (axes) that are respectively parallel to a vertical direction, a horizontal direction orthogonal to the vertical direction, and a front-rear direction orthogonal to both the vertical direction and the horizontal direction in a real space.
  • The global coordinate system is one type of point-of-view coordinate system.
  • The horizontal direction, the vertical direction (up-down direction), and the front-rear direction in the global coordinate system are defined as an x axis, a y axis, and a z axis, respectively.
  • The x axis of the global coordinate system is parallel to the horizontal direction of the real space, the y axis thereof is parallel to the vertical direction of the real space, and the z axis thereof is parallel to the front-rear direction of the real space.
  • the HMD sensor 120 includes an infrared sensor.
  • the infrared sensor detects the infrared ray emitted from each light source of the HMD device 110 .
  • the infrared sensor detects the presence of the HMD device 110 .
  • the HMD sensor 120 further detects the position and the inclination of the HMD device 110 in the real space in accordance with the movement of the user wearing the HMD device 110 based on the value of each point (each coordinate value in the global coordinate system).
  • the HMD sensor 120 can detect the temporal change of the position and the inclination of the HMD device 110 with use of each value detected over time.
  • the global coordinate system is parallel to a coordinate system of the real space. Therefore, each inclination of the HMD device 110 detected by the HMD sensor 120 corresponds to each inclination about each of the three axes of the HMD device 110 in the global coordinate system.
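The detection of the temporal change of the position over time, mentioned above, can be sketched as a simple finite difference over timestamped samples; the sampling format and the helper name are illustrative assumptions, not part of this disclosure:

```python
def temporal_change(samples):
    """Estimate the rate of change of the HMD position from values detected
    over time by the HMD sensor 120. `samples` is a list of (time, position)
    pairs, positions given as (x, y, z) global-coordinate-system values."""
    changes = []
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        dt = t1 - t0
        # Component-wise finite difference between consecutive samples.
        changes.append(tuple((b - a) / dt for a, b in zip(p0, p1)))
    return changes
```

The same finite difference can be applied to the detected inclination values to obtain the change of inclination between samples.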
  • the HMD sensor 120 sets a UVW visual-field coordinate system to the HMD device 110 based on the inclination of the HMD device 110 in the global coordinate system.
  • the UVW visual-field coordinate system set to the HMD device 110 corresponds to a point-of-view coordinate system used when the user wearing the HMD device 110 views an object in the virtual space.
  • a method of providing a virtual space 2 to the HMD device 110 by the computer 200 includes processing of defining the virtual space 2 by the processor 10 of the computer 200 .
  • the method further includes processing of determining, by the processor 10 , a flying direction of an object that flies in the virtual space 2 in accordance with a motion of the user 190 of the HMD device 110 , based on the motion of the user 190 .
  • the method further includes processing of causing, by the processor 10 , the HMD device 110 to display a field of view of the user 190 in the virtual space 2 so that the field of view is moved in the flying direction.
  • the processing of determining a flying direction includes processing of determining a flying distance of the object in the virtual space 2 based on the motion of the user 190 .
  • the processing of causing the HMD device 110 to display a field of view includes processing of causing the monitor 112 to display the field of view obtained from a position after movement in accordance with the flying distance in the virtual space 2 .
  • the method further includes processing of moving, by the processor 10 , when the object reaches a target in the virtual space 2 , the user 190 in the virtual space 2 to the target.
  • the processing of moving the user 190 to the target includes moving, when the object reaches a stationary object, the user 190 to the stationary object in the virtual space 2 .
  • the processing of moving the user 190 to the target includes moving, when the object reaches a moving object, the user 190 in a traveling direction of the moving object.
  • the processing of moving the user 190 to the target includes causing the HMD device 110 to display the field of view so that a landscape around the user 190 in the virtual space 2 approaches the user 190 .
  • the method further includes processing of causing, by the processor 10 , when the object reaches a target in the virtual space 2 , the HMD device 110 to display the field of view of the user 190 so that the target is attracted toward the user 190 .
  • the processing of determining a flying direction of an object includes determining the flying direction of the object based on a physical quantity corresponding to a movement of a hand of the user 190 or on an operation performed on a controller connected to the computer 200 .
  • the processing of determining a flying distance of an object includes determining the flying distance of the object based on a physical quantity corresponding to a movement of a hand of the user 190 or on an operation performed on a controller connected to the computer 200 .
  • the HMD system 100 includes the HMD device 110 and the computer 200 configured to provide the virtual space 2 to the HMD device 110 .
  • the computer 200 includes the memory configured to store a series of commands and the processor 10 configured to execute the series of commands.
  • the processor 10 is configured to, when the processor 10 executes the series of commands, define the virtual space 2 .
  • the processor 10 is further configured to, when the processor 10 executes the series of commands, determine the flying direction of the object that flies in the virtual space 2 in accordance with the motion of the user 190 of the HMD device 110 , based on the motion of the user 190 .
  • the processor 10 is further configured to, when the processor 10 executes the series of commands, cause the HMD device 110 to display the field of view of the user 190 in the virtual space 2 so that the field of view is moved in the flying direction.
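The determination of the flying direction and flying distance from a physical quantity of the user's hand movement, described above, may be sketched as follows. Mapping the detected hand velocity to a unit direction vector and a speed-proportional distance is an illustrative assumption; the disclosure leaves the concrete mapping open:

```python
import math

def determine_flight(hand_velocity, speed_to_distance=0.5):
    """Map a detected hand velocity (vx, vy, vz) to a flying direction
    (unit vector) and a flying distance in the virtual space 2.
    `speed_to_distance` is a tuning assumption, not a value from the patent."""
    vx, vy, vz = hand_velocity
    speed = math.sqrt(vx * vx + vy * vy + vz * vz)
    if speed == 0.0:
        return (0.0, 0.0, 0.0), 0.0  # no motion detected: the object does not fly
    direction = (vx / speed, vy / speed, vz / speed)
    distance = speed * speed_to_distance  # faster swing -> longer flight
    return direction, distance
```

An operation on the controller 160 could feed the same function by synthesizing a velocity from stick or button input.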
  • FIG. 3 is a schematic diagram of a UVW visual-field coordinate system to be set for the HMD device 110 of at least one embodiment of this disclosure.
  • the HMD sensor 120 detects the position and the inclination of the HMD device 110 in the global coordinate system when the HMD device 110 is activated.
  • the processor 10 sets the UVW visual-field coordinate system to the HMD device 110 based on the detected values.
  • the HMD device 110 sets a three-dimensional UVW visual-field coordinate system defining the head of the user wearing the HMD device 110 as a center (origin). More specifically, the HMD device 110 sets three directions newly obtained by inclining the horizontal direction, the vertical direction, and the front-rear direction (x axis, y axis, and z axis), which define the global coordinate system, about the respective axes by the inclinations about the respective axes of the HMD device 110 in the global coordinate system as a pitch direction (u axis), a yaw direction (v axis), and a roll direction (w axis) of the UVW visual-field coordinate system in the HMD device 110 .
  • the processor 10 sets the UVW visual-field coordinate system that is parallel to the global coordinate system to the HMD device 110 .
  • the horizontal direction (x axis), the vertical direction (y axis), and the front-rear direction (z axis) of the global coordinate system directly match with the pitch direction (u axis), the yaw direction (v axis), and the roll direction (w axis) of the UVW visual-field coordinate system in the HMD device 110 .
  • the HMD sensor 120 can detect the inclination (change amount of the inclination) of the HMD device 110 in the UVW visual-field coordinate system that is set based on the movement of the HMD device 110 .
  • the HMD sensor 120 detects, as the inclination of the HMD device 110 , each of a pitch angle ( ⁇ u), a yaw angle ( ⁇ v), and a roll angle ( ⁇ w) of the HMD device 110 in the UVW visual-field coordinate system.
  • the pitch angle ( ⁇ u) represents an inclination angle of the HMD device 110 about the pitch direction in the UVW visual-field coordinate system.
  • the yaw angle ( ⁇ v) represents an inclination angle of the HMD device 110 about the yaw direction in the UVW visual-field coordinate system.
  • the roll angle ( ⁇ w) represents an inclination angle of the HMD device 110 about the roll direction in the UVW visual-field coordinate system.
  • the HMD sensor 120 sets, to the HMD device 110 , the UVW visual-field coordinate system of the HMD device 110 obtained after the movement of the HMD device 110 based on the detected inclination angle of the HMD device 110 .
  • the relationship between the HMD device 110 and the UVW visual-field coordinate system of the HMD device 110 is constant regardless of the position and the inclination of the HMD device 110 .
  • when the position and the inclination of the HMD device 110 change, the position and the inclination of the UVW visual-field coordinate system of the HMD device 110 in the global coordinate system change in synchronization with the change of the position and the inclination.
  • the HMD sensor 120 may specify the position of the HMD device 110 in the real space as a position relative to the HMD sensor 120 based on the light intensity of the infrared ray or a relative positional relationship between a plurality of detection points (for example, a distance between the detection points), which is acquired based on output from the infrared sensor. Further, the processor 10 may determine the origin of the UVW visual-field coordinate system of the HMD device 110 in the real space (global coordinate system) based on the specified relative position.
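Setting the UVW visual-field coordinate system by inclining the global x, y, and z axes about the respective axes, as described above, can be illustrated with plain rotation matrices. The composition order yaw, then pitch, then roll is an assumption for illustration; the disclosure does not fix an order:

```python
import math

def rot_x(t):
    c, s = math.cos(t), math.sin(t)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(t):
    c, s = math.cos(t), math.sin(t)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(t):
    c, s = math.cos(t), math.sin(t)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def uvw_axes(pitch, yaw, roll):
    """Rotate the global x, y, z axes by the detected inclinations to obtain
    the pitch (u), yaw (v), and roll (w) axes of the UVW visual-field
    coordinate system. With zero inclination the UVW system is parallel
    to the global coordinate system, as stated above."""
    r = mat_mul(mat_mul(rot_y(yaw), rot_x(pitch)), rot_z(roll))
    # The columns of r are the rotated x, y, z axes, i.e. u, v, w.
    u = [r[i][0] for i in range(3)]
    v = [r[i][1] for i in range(3)]
    w = [r[i][2] for i in range(3)]
    return u, v, w
```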
  • FIG. 4 is a diagram of a mode of expressing the virtual space of at least one embodiment of this disclosure.
  • the virtual space 2 has a structure with an entire celestial sphere shape covering a center 21 in all 360-degree directions.
  • in FIG. 4, for the sake of simplicity, only the upper-half celestial sphere of the virtual space 2 is exemplified, although one of ordinary skill would recognize that the virtual space 2 includes a lower-half celestial sphere as well.
  • Each mesh section is defined in the virtual space 2 .
  • the position of each mesh section is defined in advance as coordinate values in an XYZ coordinate system defined in the virtual space 2 .
  • the computer 200 associates each partial image forming content (for example, still image or moving image) that can be developed in the virtual space 2 with each corresponding mesh section in the virtual space 2 , to thereby provide, to the user, the virtual space 2 in which a virtual space image 22 that can be visually recognized by the user is developed.
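One common way to associate each viewing direction from the center 21 with a location in the developed content is an equirectangular mapping; this particular layout is an illustrative assumption, since the disclosure only states that partial images are associated with mesh sections:

```python
import math

def direction_to_uv(x, y, z):
    """Map a viewing direction (x, y, z) from the center 21 of the celestial
    sphere to normalized (u, v) coordinates on an equirectangular panorama
    image forming the virtual space image 22."""
    lon = math.atan2(x, z)  # longitude, -pi..pi around the vertical Y axis
    lat = math.asin(y / math.sqrt(x * x + y * y + z * z))  # latitude, -pi/2..pi/2
    u = (lon / math.pi + 1.0) / 2.0        # 0..1 across the image width
    v = (lat / (math.pi / 2) + 1.0) / 2.0  # 0..1 across the image height
    return u, v
```

A mesh section's partial image would then be the sub-rectangle of the panorama covering the (u, v) range of that section.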
  • an XYZ spatial coordinate system having the center 21 as the origin is defined.
  • the XYZ coordinate system is, for example, parallel to the global coordinate system.
  • the XYZ coordinate system is one type of the point-of-view coordinate system, and hence the horizontal direction, the vertical direction (up-down direction), and the front-rear direction of the XYZ coordinate system are defined as an X axis, a Y axis, and a Z axis, respectively.
  • the X axis (horizontal direction) of the XYZ coordinate system is parallel to the x axis of the global coordinate system
  • the Y axis (vertical direction) of the XYZ coordinate system is parallel to the y axis of the global coordinate system
  • the Z axis (front-rear direction) of the XYZ coordinate system is parallel to the z axis of the global coordinate system.
  • a virtual camera 1 is arranged at the center 21 of the virtual space 2 .
  • when the HMD device 110 moves in the real space, the virtual camera 1 similarly moves in the virtual space 2 . With this, the change in position and direction of the HMD device 110 in the real space is reproduced similarly in the virtual space 2 .
  • the UVW visual-field coordinate system is defined in the virtual camera 1 similarly to the HMD device 110 .
  • the UVW visual-field coordinate system of the virtual camera in the virtual space 2 is defined to be synchronized with the UVW visual-field coordinate system of the HMD device 110 in the real space (global coordinate system). Therefore, when the inclination of the HMD device 110 changes, the inclination of the virtual camera 1 also changes in synchronization therewith.
  • the virtual camera 1 can also move in the virtual space 2 in synchronization with the movement of the user wearing the HMD device 110 in the real space.
  • the direction of the virtual camera 1 is determined based on the position and the inclination of the virtual camera 1 , and hence a line of sight (reference line of sight 5 ) serving as a reference when the user visually recognizes the virtual space image 22 is determined based on the direction of the virtual camera 1 .
  • the processor 10 of the computer 200 defines a field-of-view region 23 in the virtual space 2 based on the reference line of sight 5 .
  • the field-of-view region 23 corresponds to a field of view of the user wearing the HMD device 110 in the virtual space 2 .
  • the line-of-sight direction of the user 190 detected by the eye gaze sensor 140 is a direction in the point-of-view coordinate system obtained when the user 190 visually recognizes an object.
  • the UVW visual-field coordinate system of the HMD device 110 is equal to the point-of-view coordinate system used when the user 190 visually recognizes the monitor 112 .
  • the UVW visual-field coordinate system of the virtual camera 1 is synchronized with the UVW visual-field coordinate system of the HMD device 110 . Therefore, in the HMD system 100 in at least one aspect, the line-of-sight direction of the user 190 detected by the eye gaze sensor 140 can be regarded as the user's line-of-sight direction in the UVW visual-field coordinate system of the virtual camera 1 .
  • FIG. 5 is a diagram of the head of the user wearing the HMD device of at least one embodiment of this disclosure.
  • the eye gaze sensor 140 detects lines of sight of the right eye and the left eye of the user 190 . In at least one aspect, when the user 190 is looking at a near place, the eye gaze sensor 140 detects lines of sight R 1 and L 1 . In at least one aspect, when the user 190 is looking at a far place, the eye gaze sensor 140 detects lines of sight R 2 and L 2 . In this case, the angles formed by the lines of sight R 2 and L 2 with respect to the roll direction w are smaller than the angles formed by the lines of sight R 1 and L 1 with respect to the roll direction w. The eye gaze sensor 140 transmits the detection results to the computer 200 .
  • when the computer 200 receives the detection values of the lines of sight R 1 and L 1 from the eye gaze sensor 140 as the detection results of the lines of sight, the computer 200 specifies a point of gaze N 1 being an intersection of both the lines of sight R 1 and L 1 based on the detection values. Meanwhile, when the computer 200 receives the detection values of the lines of sight R 2 and L 2 from the eye gaze sensor 140 , the computer 200 specifies an intersection of both the lines of sight R 2 and L 2 as the point of gaze. The computer 200 identifies a line-of-sight direction N 0 of the user 190 based on the specified point of gaze N 1 .
  • the computer 200 detects, for example, an extension direction of a straight line that passes through the point of gaze N 1 and a midpoint of a straight line connecting a right eye R and a left eye L of the user 190 to each other as the line-of-sight direction N 0 .
  • the line-of-sight direction N 0 is a direction in which the user 190 actually directs his or her lines of sight with both eyes. Further, the line-of-sight direction N 0 corresponds to a direction in which the user 190 actually directs his or her lines of sight with respect to the field-of-view region 23 .
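The specification of the point of gaze N 1 and the line-of-sight direction N 0 described above can be sketched in a planar (top-down) simplification; the 2D reduction and the function name are assumptions for illustration:

```python
def gaze_direction(right_eye, right_dir, left_eye, left_dir):
    """Compute the point of gaze N1 as the intersection of the two lines of
    sight, then the line-of-sight direction N0 from the midpoint between the
    right eye R and the left eye L through N1. All inputs are 2D points or
    direction vectors in a horizontal cross-section."""
    (rx, ry), (rdx, rdy) = right_eye, right_dir
    (lx, ly), (ldx, ldy) = left_eye, left_dir
    # Solve right_eye + t * right_dir == left_eye + s * left_dir for t.
    denom = rdx * ldy - rdy * ldx
    if denom == 0:
        return None, None  # parallel lines of sight: no finite gaze point
    t = ((lx - rx) * ldy - (ly - ry) * ldx) / denom
    n1 = (rx + t * rdx, ry + t * rdy)
    mid = ((rx + lx) / 2.0, (ry + ly) / 2.0)
    dx, dy = n1[0] - mid[0], n1[1] - mid[1]
    norm = (dx * dx + dy * dy) ** 0.5
    return n1, (dx / norm, dy / norm)
```

A nearer gaze point yields gaze vectors that converge more steeply, matching the angle comparison between R 1 /L 1 and R 2 /L 2 above.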
  • the HMD system 100 may include microphones and speakers in any part constructing the HMD system 100 .
  • when the user speaks into the microphone, an instruction can be given to the virtual space 2 with voice.
  • the HMD system 100 may include a television broadcast reception tuner. With such a configuration, the HMD system 100 can display a television program in the virtual space 2 .
  • the HMD system 100 may include a communication circuit for connecting to the Internet or have a verbal communication function for connecting to a telephone line.
  • FIG. 6 is a cross-sectional diagram of a YZ cross section obtained by viewing the field-of-view region 23 from an X direction in the virtual space 2 according to at least one embodiment of this disclosure.
  • FIG. 7 is a cross-sectional diagram of an XZ cross section obtained by viewing the field-of-view region 23 from a Y direction in the virtual space 2 according to at least one embodiment of this disclosure.
  • the field-of-view region 23 in the YZ cross section includes a region 24 .
  • the region 24 is defined by the reference line of sight 5 of the virtual camera 1 and the YZ cross section of the virtual space 2 .
  • the processor 10 defines a range of a polar angle ⁇ or more from the reference line of sight 5 serving as the center in the virtual space as the region 24 .
  • the field-of-view region 23 in the XZ cross section includes a region 25 .
  • the region 25 is defined by the reference line of sight 5 and the XZ cross section of the virtual space 2 .
  • the processor 10 defines a range of a polar angle ⁇ or more from the reference line of sight 5 serving as the center in the virtual space 2 as the region 25 .
  • the HMD system 100 causes the monitor 112 to display a field-of-view image 26 based on the signal from the computer 200 , to thereby provide the virtual space to the user 190 .
  • the field-of-view image 26 corresponds to a part of the virtual space image 22 , which is superimposed on the field-of-view region 23 .
  • when the user wearing the HMD device 110 moves, the virtual camera 1 is also moved in synchronization with the movement. As a result, the position of the field-of-view region 23 in the virtual space 2 is changed.
  • the field-of-view image 26 displayed on the monitor 112 is updated to an image that is superimposed on the field-of-view region 23 of the virtual space image 22 in a direction in which the user faces in the virtual space 2 .
  • the user can visually recognize a desired direction in the virtual space 2 .
  • the HMD system 100 can provide a high sense of immersion in the virtual space 2 to the user.
  • the processor 10 may move the virtual camera 1 in the virtual space 2 in synchronization with the movement in the real space of the user 190 wearing the HMD device 110 .
  • the processor 10 specifies an image region to be projected on the monitor 112 of the HMD device 110 (that is, the field-of-view region 23 in the virtual space 2 ) based on the position and the direction of the virtual camera 1 in the virtual space 2 .
  • in at least one embodiment, the virtual camera 1 desirably includes two virtual cameras, that is, a virtual camera for providing a right-eye image and a virtual camera for providing a left-eye image. Further, in at least one embodiment, an appropriate parallax is set for the two virtual cameras so that the user 190 can recognize the three-dimensional virtual space 2 .
  • a technical idea of this disclosure is exemplified assuming that the virtual camera 1 includes two virtual cameras, and the roll directions of the two virtual cameras are synthesized so that the generated roll direction (w) is adapted to the roll direction (w) of the HMD device 110 .
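Setting the parallax for the two virtual cameras can be sketched by offsetting them from the virtual camera 1 position along the camera's right (pitch, u) axis; the 64 mm default interpupillary distance is a common assumption, not a value from this disclosure:

```python
def stereo_camera_positions(center, right_axis, ipd=0.064):
    """Place the right-eye and left-eye virtual cameras half the
    interpupillary distance to each side of the virtual camera 1 position
    `center`, along the unit vector `right_axis` (the u axis of the
    camera's UVW visual-field coordinate system)."""
    cx, cy, cz = center
    ux, uy, uz = right_axis
    half = ipd / 2.0
    left = (cx - ux * half, cy - uy * half, cz - uz * half)
    right = (cx + ux * half, cy + uy * half, cz + uz * half)
    return left, right
```

Because both cameras share the same roll (w) direction, their synthesized roll direction matches the roll direction of the HMD device 110, as assumed above.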
  • FIG. 8 is a block diagram of the computer 200 of at least one embodiment of this disclosure as a module configuration.
  • the computer 200 includes a display control module 220 , a virtual space control module 230 , a memory module 240 , and a communication control module 250 .
  • the display control module 220 includes, as sub-modules, a virtual camera control module 221 , a field-of-view region determining module 222 , a field-of-view image generating module 223 , and a reference line-of-sight specifying module 224 .
  • the virtual space control module 230 includes, as sub-modules, a virtual space defining module 231 , a virtual object generating module 232 , and an object control module 233 .
  • the memory module 240 stores space information 241 , object information 242 , and user information 243 .
  • the display control module 220 and the virtual space control module 230 are achieved by the processor 10 .
  • a plurality of processors 10 may be combined to function as the display control module 220 and the virtual space control module 230 .
  • the memory module 240 includes the memory 11 or the storage 12 .
  • the communication control module 250 includes the communication interface 14 .
  • the display control module 220 is configured to control the image display on the monitor 112 of the HMD device 110 .
  • the virtual camera control module 221 is configured to arrange the virtual camera 1 in the virtual space 2 , and control the behavior, the direction, and the like of the virtual camera 1 .
  • the field-of-view region determining module 222 is configured to define the field-of-view region 23 .
  • the field-of-view image generating module 223 is configured to generate the field-of-view image 26 to be displayed on the monitor 112 based on the determined field-of-view region 23 .
  • the reference line-of-sight specifying module 224 is configured to specify the line of sight of the user 190 based on the signal from the eye gaze sensor 140 .
  • the virtual space control module 230 is configured to control the virtual space 2 to be provided to the user 190 .
  • the virtual space defining module 231 is configured to generate virtual space data representing the virtual space 2 to define the virtual space 2 in the HMD system 100 .
  • the virtual object generating module 232 is configured to generate a target to be displayed in the virtual space 2 .
  • Examples of the target include forests, mountains, other landscapes, and animals to be displayed in accordance with the progression of the story of the game.
  • the object control module 233 is configured to control the motion of the object held by the user in the virtual space 2 .
  • Examples of the object may include ropes, stones, lassos, and other throwing objects to be thrown in the virtual space 2 in synchronization with the motion of the user 190 in the real space.
  • the memory module 240 stores data to be used for providing the virtual space 2 to the user 190 by the computer 200 .
  • the memory module 240 stores the space information 241 , the object information 242 , and the user information 243 .
  • the space information 241 stores one or more templates defined for providing the virtual space 2 .
  • the object information 242 stores content to be played in the virtual space 2 , and information for displaying an object to be used in the content. Examples of the content may include a game and content representing a landscape similar to that of the real society.
  • the user information 243 stores a program for causing the computer 200 to function as the control device of the HMD system 100 , an application program that uses each piece of content stored in the object information 242 , and the like.
  • the data and programs stored in the memory module 240 are input by the user of the HMD device 110 .
  • the computer 200 downloads the data and programs from a computer (for example, the server 150 ) that is managed by an external operator providing the content, to thereby store the data and programs in the memory module 240 .
  • the communication control module 250 may communicate to/from the server 150 or other information communication devices via the network 19 .
  • the display control module 220 and the virtual space control module 230 may be achieved with use of Unity® provided by Unity Technologies.
  • FIG. 9 is a flow chart of processing to be executed by the HMD system 100 according to at least one embodiment of this disclosure.
  • in Step S 910 , the processor 10 of the computer 200 serves as the virtual space defining module 231 to specify the virtual space image data.
  • in Step S 920 , the processor 10 initializes the virtual camera 1 .
  • the processor 10 arranges the virtual camera 1 at the center point defined in advance in the virtual space 2 , and directs the line of sight of the virtual camera 1 to a direction in which the user 190 faces.
  • in Step S 930 , the processor 10 serves as the field-of-view image generating module 223 to generate an initial field-of-view image.
  • the generated field-of-view image is transmitted to the HMD device 110 by the communication control module 250 via the field-of-view image generating module 223 .
  • in Step S 932 , the monitor 112 of the HMD device 110 displays the field-of-view image based on the signal received from the computer 200 .
  • the user 190 wearing the HMD device 110 may recognize the virtual space 2 through visual recognition of the field-of-view image.
  • in Step S 934 , the HMD sensor 120 detects the position and the inclination of the HMD device 110 based on a plurality of infrared beams emitted from the HMD device 110 .
  • the detection result is transmitted to the computer 200 as movement detection data.
  • in Step S 940 , the processor 10 specifies the field-of-view direction of the user 190 wearing the HMD device 110 based on the position and the inclination of the HMD device 110 .
  • the processor 10 executes an application program to display an object in the virtual space 2 based on the command included in the application program.
  • the user 190 enjoys the content that can be visually recognized in the virtual space 2 through execution of the application program.
  • examples of the content include ball games, lassos, other play games, and guided tours of sightseeing spots.
  • in Step S 942 , the motion sensor 130 detects the movement of the hand of the user 190 .
  • the signal representing the detected movement is transmitted to the computer 200 .
  • the signal includes a rotational speed and an acceleration of the hand.
  • when an application program of a ball game is executed, there may be a scene in which a virtual user present in the virtual space 2 throws a ball in accordance with the swing of the arm of the user 190 . In this case, when the user 190 actually moves his or her arm, the rotational direction and the speed of the arm are detected.
  • the virtual user in the virtual space 2 may function as a sightseeing guide. In this case, the movement of the right arm and the direction indicated by the right arm when the sightseeing guide in the virtual space 2 says “please look on the right side” to the tourist are detected.
  • in Step S 950 , the processor 10 specifies a flying direction based on the signal output from the motion sensor 130 .
  • the flying direction includes a direction in which balls, rings, stones, or other virtual objects fly in the virtual space 2 , or a direction indicated by an arm, a finger, a pointer, or other objects of the virtual user.
  • in Step S 960 , the processor 10 determines the field of view obtained after the movement of the virtual user based on the specified flying direction.
  • in Step S 970 , the processor 10 determines the direction of the virtual camera 1 based on the specified flying direction.
  • in Step S 980 , the processor 10 determines the field-of-view region based on the determined direction of the virtual camera 1 .
  • the field-of-view region represents a range that the virtual user can visually recognize in the virtual space 2 .
  • in Step S 990 , the computer 200 generates the field-of-view image data for displaying the field-of-view image in accordance with the determined field-of-view region, and outputs the generated field-of-view image data to the HMD device 110 .
  • in Step S 992 , the monitor 112 of the HMD device 110 updates the field-of-view image based on the received field-of-view image data, and displays the updated field-of-view image.
  • the user 190 can recognize the updated field-of-view image, that is, the field of view obtained after the line of sight has been moved in the flying direction. While the above description refers to processor 10 , one of ordinary skill in the art would recognize that the functions of processor 10 could be separated among a plurality of processors.
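The per-frame portion of the flow above (Steps S 934 through S 992) can be sketched as a loop; the `sensor` and `renderer` callables are hypothetical stand-ins for the HMD sensor 120 and the monitor 112, introduced only for illustration:

```python
def hmd_frame_loop(sensor, renderer, frames=1):
    """Run one pass per frame: detect the HMD position and inclination
    (S934), specify the field-of-view direction from them (S940, shown
    here as an identity mapping for simplicity), and render the updated
    field-of-view image (S990/S992). Returns the rendered images."""
    images = []
    for _ in range(frames):
        position, inclination = sensor()       # S934: movement detection data
        view_direction = inclination           # S940: simplified specification
        images.append(renderer(position, view_direction))  # S990/S992: display
    return images
```

In a real implementation the loop would also consume the motion sensor 130 signal (S 942) to update the flying direction each frame.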
  • FIG. 10 is a flow chart of processing to be executed by the computer 200 when the virtual user throws a lasso or other objects to a target in the virtual space 2 according to at least one embodiment of this disclosure.
  • in Step S 1010 , the processor 10 serves as the virtual space defining module 231 to define the virtual space 2 in the memory 11 .
  • in Step S 1020 , the processor 10 detects the motion of the user 190 wearing the HMD device 110 based on the signal from the motion sensor 130 .
  • the processor 10 determines the flying direction of the object that flies in the virtual space 2 based on the motion of the user 190 .
  • the processor 10 determines parameters such as an initial speed and the flying direction when the object flies based on the rotational speed and the throwing direction of the right hand of the user 190 .
  • the processor 10 may determine the parameters based on the number of rotations and the throwing direction of the right hand of the user 190 .
  • with this, control can more easily be performed so that the object flies before the controller 160 separates from the hand of the user 190 , and hence the HMD device 110 can be used more safely.
  • in Step S 1040 , the processor 10 serves as the virtual object generating module 232 to generate the field-of-view image data for displaying the field-of-view image on the HMD device 110 so that the field of view of the user 190 in the virtual space 2 is moved in the flying direction.
  • the processor 10 transmits the generated field-of-view image data to the HMD device 110 to cause the monitor 112 to display the field-of-view image based on the field-of-view image data.
  • in Step S 1050 , the processor 10 determines whether or not the object has reached the target of the virtual space 2 based on a virtual distance between the virtual user and the target in the virtual space 2 and on the rotational motion of the arm of the user 190 .
  • the target may be a tree, a rock, or other stationary objects displayed in the virtual space 2 , or may be moving objects that run in the virtual space 2 such as an automobile, a dog, or a horse.
  • Whether or not the object has reached the target may be determined based on, for example, the initial speed and the flying direction of the object in the virtual space 2 and the virtual distance from the virtual user to the target.
  • when the processor 10 determines in Step S 1050 that the object has reached the target of the virtual space 2 (YES in Step S 1050 ), the processor 10 switches the control to Step S 1060 . Otherwise (NO in Step S 1050 ), the processor 10 returns the control to Step S 1020 .
  • in Step S 1060 , the processor 10 serves as the object control module 233 to move the position of the virtual user in the virtual space 2 to the target.
  • in the determination in Step S 1050 , physical calculation to be executed in game software may be taken into consideration.
  • the size and the mass of the object, the gravity or the air resistance in the virtual space 2 , and the like may be taken into consideration. In this manner, a more enjoyable experience may be given in the virtual reality.
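The reach determination with physical calculation mentioned above may be sketched by comparing a ballistic range with the virtual distance to the target; the flat-ground range formula and the tolerance value are illustrative assumptions standing in for the fuller physics (air resistance, object mass) the disclosure allows:

```python
import math

def object_reaches_target(initial_speed, launch_angle, distance_to_target,
                          gravity=9.8, tolerance=0.5):
    """Decide whether the thrown object reaches the target in the virtual
    space 2 by comparing the projectile range v^2 * sin(2*theta) / g with
    the virtual distance from the virtual user to the target."""
    flight_range = initial_speed ** 2 * math.sin(2 * launch_angle) / gravity
    return flight_range + tolerance >= distance_to_target
```

The initial speed and launch angle would come from the parameters determined from the rotational speed and throwing direction of the hand of the user 190.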
  • the processor 10 may execute processing of specifying a target that the object has reached.
  • examples of the target include a tree, a rock, a ground, other stationary objects, an animal, a vehicle, and other moving objects.
  • the processor 10 may generate the field-of-view image so that the virtual user is moved to the target at high speed in the virtual space 2 . In this manner, even in a situation of high-speed movement, which is a unique situation in the virtual reality, the VR sickness can be reduced or prevented from occurring to the user 190 . While the above description refers to processor 10 , one of ordinary skill in the art would recognize that the functions of processor 10 could be separated among a plurality of processors.
  • a tree is exemplified as the stationary object, but the stationary object is not limited to the exemplified object.
  • the stationary object may be at least an object that can be visually recognized by the user in the virtual space 2 , for example, a building or the other shore of a river.
  • FIGS. 11A and 11B are diagrams of a state in which a mountain and a tree are present in the virtual space 2 according to at least one embodiment of this disclosure.
  • the user recognizes a tree on the front right side and recognizes a mountain on the front left side.
  • FIG. 11A is an image 1100 that is visually recognized in the virtual space 2 by the user 190 wearing the HMD device 110 .
  • FIG. 11B is a diagram of a state in which the virtual space 2 is viewed from above (Y direction).
  • the HMD system 100 causes the virtual user to throw a rope in the virtual space 2 so that the virtual user is moved to a position at which the rope is caught.
  • the user 190 wearing the HMD device 110 mounts a sensor on his or her right hand.
  • the sensor detects the movement (rotation or throwing motion) of the right hand, and the detection result is input to the computer 200 .
  • the computer 200 executes an application program for implementing the scenario, and transmits a signal for displaying the image to the monitor 112 .
  • When the monitor 112 displays the image in accordance with the signal, the user 190 recognizes the image.
  • the virtual user in the virtual space 2 visually recognizes a mountain and a tree 1110 .
  • a rope 1120 is displayed.
  • FIGS. 12A and 12B are diagrams of a state when the virtual user throws the rope 1120 toward the tree 1110 according to at least one embodiment of this disclosure.
  • FIG. 12A is a diagram of a state in which the rope 1120 flies toward the tree 1110 .
  • FIG. 12B is a diagram of a state in which the virtual camera 1 is directed toward the tree 1110 in accordance with the thrown rope 1120 .
  • the processor 10 switches the image to be displayed on the monitor 112 based on the signal output from the sensor mounted to the right hand. More specifically, the field-of-view image in the virtual space 2 is switched to an image having the tree 1110 at the center.
  • the virtual user may visually recognize the field-of-view image as if the line of sight is switched in the direction of the tree 1110 .
  • the user 190 can predict the direction in which the rope 1120 flies, and hence a so-called visually induced motion sickness (VR sickness) can be reduced or prevented without reducing the user's sense of immersion.
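The switch described above, in which the field-of-view image is reoriented so that the tree 1110 sits at the center, amounts to pointing the virtual camera 1 at the target. A minimal sketch, assuming a convention of X to the right and Z forward (the helper name and coordinate convention are illustrative, not from the disclosure):

```python
import math

def yaw_to_center(camera_pos, target_pos):
    """Yaw angle (radians) that points the virtual camera 1 at the target,
    so the target (e.g. tree 1110) appears at the center of the
    field-of-view image. Assumes X-right, Z-forward coordinates."""
    dx = target_pos[0] - camera_pos[0]
    dz = target_pos[2] - camera_pos[2]
    return math.atan2(dx, dz)
```

For a target straight ahead the yaw is 0; for a target directly to the right it is +90 degrees, matching the intuition that the camera turns in the direction the rope was thrown.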
  • FIGS. 13A and 13B are diagrams of the virtual user having moved to the vicinity of the tree 1110 in response to a loop of the rope 1120 being caught on the tree 1110 according to at least one embodiment of this disclosure.
  • FIG. 13A is a diagram of an image that is visually recognized by the virtual user that has moved to the vicinity of the tree 1110 .
  • FIG. 13B is a diagram of the position and the direction of the virtual camera 1 in the virtual space 2 .
  • an image 1300 displays a state in which the virtual user has moved to the vicinity of the tree 1110 .
  • the virtual user may move at high speed toward the tree 1110 in the virtual space 2 . More specifically, the virtual user may move at higher speed than the movement speed in accordance with the temporal change of the position of the HMD device 110 detected by the HMD sensor 120 .
  • the field-of-view image may be displayed so that the landscape around the virtual user and the tree 1110 moves at high speed toward the virtual user.
  • the position of the virtual camera 1 is instantaneously moved to the vicinity of the tree 1110 , and hence the user 190 may visually recognize the image as if the virtual user has instantaneously moved to the tree 1110 .
  • the field-of-view image visually recognized by the virtual user is switched to a direction in which the object flies, that is, a direction in which the field of view of the virtual user moves.
  • the motion of the user 190 includes the rotation of the arm and the gesture of throwing an object.
  • the direction to switch the field-of-view image corresponds to such motions. Therefore, the user 190 can visually recognize the field-of-view image in the virtual space 2 in accordance with his or her motion, and hence the visually induced motion sickness may be suppressed or avoided.
  • a horse is exemplified as the moving object, but the moving object is not limited to the exemplified object.
  • the moving object may be objects that move in the real space, such as other animals, automobiles, other vehicles, birds, airplanes, and rockets.
  • FIGS. 14A and 14B are diagrams of a state in which a mountain is present in the virtual space 2 and a horse is running according to at least one embodiment of this disclosure.
  • FIG. 14A is a diagram of a field-of-view image 1400 that is visually recognized in the virtual space 2 by the user 190 wearing the HMD device 110 .
  • FIG. 14B is a diagram of a state in which the virtual space 2 is viewed from above (Y direction).
  • the user 190 wearing the HMD device 110 visually recognizes the field-of-view image 1400 as the virtual user.
  • the field-of-view image 1400 displays a mountain and a horse 1410 .
  • the processor 10 of the HMD device 110 displays a rope 1120 in the field-of-view image 1400 .
  • the motion sensor 130 detects the movement of the hand, and the processor 10 displays, on the monitor 112 , such an image that the rope 1120 flies toward the running horse 1410 .
  • the monitor 112 displays the image so that the horse 1410 is positioned at the center of the field-of-view image 1400 in accordance with the motion of the user 190 . Therefore, the direction in which the user 190 performs the motion and the direction displayed in the field-of-view image 1400 are substantially the same, and hence the VR sickness may be suppressed or avoided.
  • FIGS. 15A and 15B are diagrams of an image 1500 displaying a state in which the target has moved in the virtual space 2 according to at least one embodiment of this disclosure.
  • FIG. 15A is a diagram of a state in which the horse 1410 has run and thus the location has changed in the virtual space 2 .
  • FIG. 15B is a diagram of a state in which the virtual space 2 is viewed from above (Y direction).
  • the motion sensor 130 detects the motion to detect the movement direction. For example, when the horse 1410 is running from the right to the left in the image 1500 , the user 190 performs a motion of swinging his or her right hand to the front left side. Then, the processor 10 generates a signal for switching the field-of-view image to the left direction, and transmits the signal to the monitor 112 .
  • the direction of the virtual camera 1 corresponding to the virtual user is switched to the direction in which the horse 1410 is positioned at the center.
  • When the monitor 112 displays the image under this state, the user 190 can visually recognize the horse 1410 at the center of the image in the virtual space 2 as the virtual user.
  • When the rope 1120 is caught on the horse 1410, the user 190 performs a motion of pulling the rope 1120, and thus the virtual user can be moved at high speed toward the horse 1410.
  • FIGS. 16A and 16B are diagrams of a state in which, in one aspect, the virtual user has moved to a moving object at high speed according to at least one embodiment of this disclosure.
  • FIG. 16A is a diagram of a field-of-view image 1600 to be visually recognized by the virtual user.
  • FIG. 16B is a diagram of a state in which the virtual space 2 is viewed from above (Y direction).
  • the virtual user can move at high speed to the target in the virtual space 2 .
  • the processor 10 displays, on the monitor 112 , such a screen that the virtual user approaches the horse 1410 at high speed.
  • such an image that the landscape around the virtual user passes behind the virtual user at high speed is displayed as the field-of-view image.
  • the user 190 that visually recognizes the field-of-view image can predict the movement direction that is based on his or her motion also in the virtual space 2 , and hence the VR sickness may be reduced or prevented from occurring.
  • the movement direction of the virtual user in the virtual space 2 is displayed based on the motion of the user 190 in the real space, and hence the motion of the user 190 and the movement direction in the virtual space 2 are synchronized with each other.
  • the VR sickness may be reduced or prevented from occurring to the user 190 that has visually recognized the image at that time.
  • FIG. 17 is a diagram of an overview of the configuration of the HMD system 100 A according to at least one embodiment of this disclosure.
  • the HMD system 100 A is provided as a system for household use or a system for professional use.
  • Components having the same configurations as those of the above-mentioned HMD system 100 are denoted by the same reference symbols, and redundant descriptions of the functions of those components are not repeated.
  • the HMD system 100 A includes the above-mentioned HMD device 110 , the above-mentioned HMD sensor 120 , a board 180 , and a computer 200 A.
  • the HMD device 110 includes the above-mentioned monitor 112 and the above-mentioned eye gaze sensor 140 .
  • the board 180 includes an inclination sensor 170 .
  • the computer 200 A can be connected to the network 19 , and can communicate to/from the server 150 connected to the network 19 .
  • the HMD device 110 may include the above-mentioned sensor 114 instead of the HMD sensor 120 .
  • the HMD system 100 A may further include a controller 160 A.
  • the controller 160 A may include a motion sensor 130 A.
  • the controller 160 A is configured to receive input of a command from the user 190 to the computer 200 A.
  • the controller 160 A can be held by the user 190 .
  • the controller 160 A can be mounted to the body or a part of the clothes of the user 190 .
  • the controller 160 A may be configured to output at least one of a vibration, a sound, or light based on the signal transmitted from the computer 200 A.
  • the motion sensor 130 A is mounted on the hand of the user to detect the movement of the hand of the user.
  • the motion sensor 130 A detects the rotational speed and the number of rotations of the hand.
  • the detection signal is transmitted to the computer 200 A.
  • the motion sensor 130 A is provided to, for example, the glove-type controller 160 A.
  • the controller 160 A is desirably mounted on an object that does not easily fly away, such as a glove-type object worn on the hand of the user 190 .
  • a sensor that is not mounted on the user 190 may detect the movement of the hand of the user 190 .
  • a signal of a camera that photographs the user 190 may be input to the computer 200 A as a signal representing the motion of the user 190 .
  • the motion sensor 130 A and the computer 200 A are connected to each other through wired or wireless communication.
  • the communication mode is not particularly limited, and for example, Bluetooth® or other known communication methods may be used.
  • the inclination sensor 170 is achieved by, for example, an acceleration sensor or a touch sensor.
  • an acceleration sensor is used as the inclination sensor 170 .
  • a touch sensor may be arranged on each of the right and left side walls as the inclination sensor in addition to the acceleration sensor or instead of the acceleration sensor. In this case, the equipment may be detected to be inclined toward the side wall touched by the user 190 .
  • the board 180 is configured to incline in accordance with a load applied to its upper surface.
  • the board 180 has a bottom surface that is formed to have an arc-shaped inclination.
  • the board 180 includes a plurality of springs or other elastic members arranged so as to transmit the applied load to the floor on which the board 180 is placed.
  • FIG. 18 is a block diagram of an example of the hardware configuration of the computer 200 A according to at least one embodiment of this disclosure.
  • the computer 200 A includes the above-mentioned processor 10 , the above-mentioned memory 11 , the above-mentioned storage 12 , the above-mentioned input/output interface 13 , and the above-mentioned communication interface 14 . Each component is connected to the above-mentioned bus 15 .
  • the input/output interface 13 is configured to allow communication of signals among the HMD device 110 , the HMD sensor 120 , the motion sensor 130 A, and the inclination sensor 170 .
  • the input/output interface 13 is achieved with use of a universal serial bus (USB) interface, a digital visual interface (DVI), a high-definition multimedia interface (HDMI) (trademark), or other terminals.
  • the input/output interface 13 is not limited to the ones described above.
  • the input/output interface 13 may further communicate to/from the controller 160 A.
  • the input/output interface 13 receives input of a signal output from the motion sensor 130 A.
  • the input/output interface 13 transmits a command output from the processor 10 to the controller 160 A.
  • the command instructs the controller 160 A to vibrate, output a sound, emit light, or the like.
  • the controller 160 A executes any one of vibration, sound output, and light emission in accordance with the command.
  • the processor 10 loads one or more programs stored in the storage 12 to the memory 11 to execute a series of commands included in the program.
  • the one or more programs may include an operating system of the computer 200 A, an application program for providing a virtual space, and game software that can be executed in the virtual space with use of the controller 160 A.
  • the processor 10 transmits a signal for providing a virtual space to the HMD device 110 via the input/output interface 13 .
  • the HMD device 110 displays a video on the monitor 112 based on the signal.
  • a method of providing a virtual space to a head mounted display device by a computer includes defining the virtual space by at least one processor.
  • the method further includes detecting, by the at least one processor, a direction in which a user of the head mounted display device is inclined (for example, one of right-left direction and front-back direction).
  • the method further includes determining, by the at least one processor, a movement direction of a virtual user corresponding to the user 190 in the virtual space based on the direction in which the user 190 is inclined.
  • the method further includes causing, by the at least one processor, the head mounted display device to display a field of view of the virtual user in the virtual space so that the field of view is moved in the determined movement direction of the virtual user.
  • the step of determining a movement direction includes determining the movement direction in accordance with a time for which the inclination of the user 190 is continued.
  • the step of determining a movement direction includes determining the movement direction in accordance with a degree of the inclination of the user 190 .
  • the method further includes a step of determining, by the at least one processor, a movement distance of the virtual user in the virtual space based on the detected inclination of the user 190 .
  • the step of causing the head mounted display device to display a field of view includes causing the head mounted display device to display the field of view of the virtual user so that the field of view is moved by the movement distance in the movement direction.
  • a method of providing a virtual space to a head mounted display device by a computer includes defining a virtual space by at least one processor.
  • the method further includes detecting, by the at least one processor, a direction in which a user of the head mounted display device is inclined.
  • the method further includes determining, by the at least one processor, a movement distance of a virtual user in the virtual space based on the detected inclination of the user 190 .
  • the method further includes causing, by the at least one processor, the head mounted display device to display a field of view of the virtual user in the virtual space so that the field of view is moved by the determined movement distance of the virtual user.
  • the step of determining a movement distance includes determining the movement distance in accordance with a time for which the inclination of the user 190 is continued.
  • the step of determining a movement distance includes determining the movement distance in accordance with a degree of the inclination of the user 190 .
  • the step of detecting a direction in which the user 190 is inclined includes a step of detecting an acceleration based on a posture or a motion performed by the user 190 through weight shift.
  • the step of detecting a direction in which the user 190 is inclined includes detecting a load applied from the user 190 .
  • the processor is configured to define a virtual space, detect a direction in which a user of the head mounted display device is inclined, determine a movement direction of a virtual user in the virtual space based on the detected direction, and cause the head mounted display device to display a field of view of the virtual user in the virtual space so that the field of view is moved in the determined movement direction of the virtual user.
  • the processor is further configured to determine a movement distance of the virtual user in the virtual space based on the detected inclination of the user 190 .
  • Causing the head mounted display device to display a field of view includes causing the head mounted display device to display the field of view of the virtual user so that the field of view is moved by the movement distance in the movement direction.
  • the processor is configured to define a virtual space, detect a direction in which a user of the head mounted display device is inclined, determine a movement distance of a virtual user in the virtual space based on the detected direction, and cause the head mounted display device to display a field of view of the virtual user in the virtual space so that the field of view is moved by the determined movement distance of the virtual user.
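The summarized method, once the movement direction and movement distance are determined, updates the virtual user's position so that the field of view is moved accordingly. A minimal sketch in the horizontal (X-Z) plane, with an assumed convention of 0 degrees meaning forward (+Z) and positive angles meaning to the right:

```python
import math

def move_virtual_user(pos, direction_deg, distance):
    """Apply the determined movement direction and distance to the
    virtual user's position in the horizontal plane (assumed convention:
    0 deg = +Z forward, positive = to the right)."""
    rad = math.radians(direction_deg)
    return (pos[0] + distance * math.sin(rad),
            pos[1],
            pos[2] + distance * math.cos(rad))
```

The virtual camera 1 would then be placed at the returned position, and the field-of-view region recomputed from it, as in Steps S970 to S980.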
  • FIG. 19 is a block diagram of the computer 200 A of one embodiment of this disclosure as a module configuration according to at least one embodiment of this disclosure.
  • the computer 200 A includes the above-mentioned display control module 220 , the above-mentioned virtual space control module 230 , the above-mentioned memory module 240 , and the above-mentioned communication control module 250 .
  • the display control module 220 includes, as sub-modules, the above-mentioned virtual camera control module 221 , the above-mentioned field-of-view region determining module 222 , the above-mentioned field-of-view image generating module 223 , and the above-mentioned reference line-of-sight specifying module 224 .
  • the virtual space control module 230 includes, as sub-modules, the above-mentioned virtual space defining module 231 , the above-mentioned virtual object generating module 232 , the above-mentioned object control module 233 , a movement direction determining module 234 , and a movement distance determining module 235 .
  • the movement direction determining module 234 is configured to determine the movement direction of the virtual user in the virtual space 2 based on the output from the inclination sensor 170 . According to at least one embodiment of this disclosure, the movement direction determining module 234 is configured to detect the direction in which a load is applied by the user 190 based on the signal output from the inclination sensor 170 , to thereby determine the direction as a direction to move the virtual user.
  • the movement distance determining module 235 is configured to determine the movement distance of the virtual user based on the output from the inclination sensor 170 . For example, when the board 180 in a horizontal state is inclined, the movement distance determining module 235 determines the movement distance in accordance with an inclination angle. In at least one aspect, the movement distance determining module 235 is configured to determine the movement distance of the virtual user based on a virtual movement distance per degree of inclination angle defined in advance in the program executed by the processor 10 , and on the inclination angle. According to at least one embodiment of this disclosure, the movement distance determining module 235 is configured to determine the movement distance of the virtual user based on the time for which the inclination is continued. For example, the movement distance determining module 235 may determine the movement distance of the virtual user based on a product of the virtual movement distance per time defined in advance in the program executed by the processor 10 and the time for which the inclination is continued.
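The two rules described for the movement distance determining module 235 (a virtual distance per degree of inclination angle, or a virtual distance per unit time of continued inclination) might be sketched as follows. The per-degree and per-second constants are assumed defaults for illustration, not values from the disclosure.

```python
def movement_distance(angle_deg=None, duration_s=None,
                      dist_per_degree=10.0, dist_per_second=5.0):
    """Movement distance of the virtual user (hypothetical sketch of
    module 235): either proportional to the inclination angle, or to the
    time for which the inclination is continued."""
    if angle_deg is not None:
        # Rule 1: virtual movement distance per degree of inclination.
        return dist_per_degree * angle_deg
    # Rule 2: virtual movement distance per second of continued inclination.
    return dist_per_second * duration_s
```

With the assumed 10 units per degree, an inclination of 5 degrees yields a movement of 50 units, consistent with the forward-inclination example given later in this description.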
  • FIG. 20 is a flow chart of processing to be executed by the HMD system 100 A according to at least one embodiment of this disclosure.
  • the processing other than Step S 942 A is similar to the processing described above, and hence redundant description is not repeated.
  • In Step S940, the processor 10 specifies the field-of-view direction of the user 190 wearing the HMD device 110 based on the position and the inclination of the HMD device 110 .
  • the processor 10 executes an application program to display an object in the virtual space 2 based on the command included in the application program.
  • the user 190 enjoys the content that can be visually recognized in the virtual space 2 through execution of the application program.
  • examples of the content include bobsled, other sports using sleds, skiing, and snowboarding.
  • the user 190 may sit on the board 180 and change his or her posture in accordance with the transition of the image displayed on the monitor 112 .
  • In Step S942A, the inclination sensor 170 detects the direction in which the user 190 is inclined.
  • the signal representing the detected inclination is transmitted to the computer 200 A.
  • the signal includes an inclination angle of the board 180 .
  • the inclination sensor 170 detects the direction of the inclination.
  • In Step S950, the processor 10 determines the movement direction and the movement distance of the virtual user based on the signal output from the inclination sensor 170 .
  • the movement direction includes a direction in which the virtual user or a sled or other types of equipment on which the virtual user is sitting travels in the virtual space 2 .
  • the processor 10 may serve as the movement direction determining module 234 to determine only the movement direction.
  • the processor 10 may serve as the movement distance determining module 235 to determine only the movement distance.
  • the processor 10 may be configured to determine the movement direction and a movement speed of the virtual user based on the signal output from the inclination sensor 170 .
  • the movement speed of the virtual user may be set to be increased as the inclination angle of the board 180 is increased.
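The speed rule suggested for Step S950, in which the movement speed of the virtual user increases with the inclination angle of the board 180, could be sketched as below. The base speed, gain, and cap are assumed constants introduced for illustration.

```python
def movement_speed(angle_deg, base_speed=1.0, gain=0.5, max_speed=20.0):
    """Movement speed of the virtual user (hypothetical sketch): the
    larger the inclination angle of the board 180, the higher the speed,
    up to an assumed cap that keeps the motion controllable."""
    return min(base_speed + gain * angle_deg, max_speed)
```

Capping the speed is a design choice of this sketch; without it, extreme inclinations would produce arbitrarily fast movement.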
  • In Step S960, the processor 10 determines the field of view obtained after the movement of the virtual user based on the determined movement direction and movement distance.
  • In Step S970, the processor 10 determines the position and the direction of the virtual camera 1 in the virtual space 2 based on the determined movement direction and movement distance.
  • In Step S980, the processor 10 determines the field-of-view region based on the determined position and direction of the virtual camera 1 .
  • the field-of-view region represents a range that the virtual user can visually recognize in the virtual space 2 .
  • In Step S990, the computer 200 A generates the field-of-view image data for displaying the field-of-view image in accordance with the determined field-of-view region, and outputs the generated field-of-view image data to the HMD device 110 .
  • the field-of-view image data is configured to enable display of a mode in which a landscape around the virtual user approaches the virtual user at high speed so that the virtual user moves at high speed in the determined movement direction.
  • In Step S992, the monitor 112 of the HMD device 110 updates the field-of-view image based on the received field-of-view image data, and displays the updated field-of-view image.
  • the user 190 can recognize the updated field-of-view image, that is, the field of view obtained after the line of sight has been moved in the movement direction. While the above description refers to processor 10 , one of ordinary skill in the art would recognize that the functions of processor 10 could be separated among a plurality of processors.
  • FIG. 21 is a flow chart of the processing to be executed by the computer 200 A in accordance with the motion of the user 190 sitting on the board 180 according to at least one embodiment of this disclosure.
  • In Step S1010A, the processor 10 serves as the virtual space defining module 231 to define the virtual space 2 in the memory 11 .
  • In Step S1020A, the processor 10 detects the direction in which the user 190 wearing the HMD device 110 and sitting on the board 180 is inclined, specifically, the right, left, front, or back direction, based on the signal from the inclination sensor 170 .
  • In Step S1030A, the processor 10 determines the movement direction of the virtual user traveling in the virtual space 2 based on the inclination of the user 190 . For example, when the user 190 inclines his or her body to the right side for a certain time period, the processor 10 determines that the virtual user or a vehicle on which the virtual user is sitting is moved to the right side. In at least one aspect, the processor 10 determines the direction in which the virtual user is moved to the right side in accordance with the number of times that the user 190 performs the motion of temporarily inclining his or her body.
  • For example, each time the motion is performed once, the processor 10 determines that the virtual user is moved by one degree to the right side with respect to the traveling direction.
  • When the motion is performed ten times, the processor 10 determines that the virtual user is moved by 10 degrees to the right side.
  • Similarly, when the user 190 inclines his or her body to the left side, the processor 10 determines the movement direction to the left side.
  • the inclination sensor 170 may detect that the user 190 has inclined his or her body forward.
  • the processor 10 may detect that the user 190 has inclined his or her body forward in accordance with the signal from the inclination sensor 170 , and may determine the movement distance based on the inclination angle. For example, in a case where the program executed by the processor 10 predetermines to move the virtual user by 10 meters when the forward inclination of one degree is detected, when the processor 10 detects that the user 190 has inclined forward by 5 degrees, the processor 10 determines to move the virtual user by 50 meters in the virtual space 2 .
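The direction and distance rules exemplified above (one degree of heading change per incline motion, and 10 meters of forward movement per degree of forward inclination) could be sketched as follows; the function names and the symmetric left/right treatment are assumptions of this illustration.

```python
def turn_angle(times_inclined_right, times_inclined_left, deg_per_motion=1.0):
    """Heading change of the virtual user: one degree to the right per
    right-incline motion, and symmetrically to the left, per the example
    above. Negative results mean a turn to the left."""
    return deg_per_motion * (times_inclined_right - times_inclined_left)

def forward_distance(forward_angle_deg, meters_per_degree=10.0):
    """Forward movement distance: 10 meters per degree of forward
    inclination, per the example above (5 degrees -> 50 meters)."""
    return meters_per_degree * forward_angle_deg
```

So ten right-incline motions turn the virtual user 10 degrees to the right, and a 5-degree forward lean moves the virtual user 50 meters in the virtual space 2.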
  • In Step S1040A, the processor 10 serves as the virtual object generating module 232 to generate the field-of-view image data for displaying the field-of-view image on the HMD device 110 so that the field of view of the virtual user in the virtual space 2 is moved in the movement direction determined in Step S1030A.
  • the processor 10 generates the field-of-view image data in which, in addition to the movement direction, the movement distance obtained based on the time for which the inclination of the user 190 is continued is reflected.
  • the processor 10 transmits the generated field-of-view image data to the HMD device 110 , and the HMD device 110 displays the field-of-view image on the monitor 112 based on the field-of-view image data. While the above description refers to processor 10 , one of ordinary skill in the art would recognize that the functions of processor 10 could be separated among a plurality of processors.
  • FIGS. 22A, 22B and 22C are diagrams of a mode in which the user 190 is sitting on the board 180 of at least one embodiment of this disclosure.
  • FIG. 22A is a diagram of a mode in which the user 190 wearing the HMD device 110 (not shown) is sitting on the board 180 while maintaining the horizontal state.
  • FIG. 22B is a diagram of a mode in which the user 190 shifts his or her weight to incline the board 180 to the left side as viewed from the user 190 .
  • FIG. 22C is a diagram of a state in which the user shifts his or her weight to incline the board 180 to the right side as viewed from the user 190 .
  • the inclination sensor 170 outputs a signal corresponding to the direction.
  • the output signal is input to the computer 200 A.
  • the direction in which the inclination is detected is not limited to the left as in FIG. 22B or to the right as in FIG. 22C .
  • the inclination sensor 170 may detect the forward inclination when the user 190 shifts his or her weight forward or the backward inclination when the user 190 shifts his or her weight backward.
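The weight-shift detection illustrated in FIGS. 22A to 22C, where the inclination sensor 170 reports left, right, forward, or backward inclination (or none), might be classified from a two-axis acceleration reading as sketched below. The axis convention, threshold, and direction labels are assumptions of this illustration, since the disclosure does not specify the sensor's output format.

```python
def inclination_direction(ax, az, threshold=0.2):
    """Classify the weight-shift direction from an assumed 2-axis reading
    of the inclination sensor 170: +ax = right, +az = forward. The
    threshold filters out sensor noise around the horizontal state."""
    if abs(ax) < threshold and abs(az) < threshold:
        return "horizontal"   # FIG. 22A: board 180 kept level
    if abs(ax) >= abs(az):
        return "right" if ax > 0 else "left"   # FIGS. 22B/22C
    return "forward" if az > 0 else "backward"
```

The dominant axis decides the direction, so a slight diagonal lean is treated as a lean along whichever axis moved more.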
  • FIGS. 23A, 23B and 23C are diagrams of a mode in which the user 190 is standing on the board 180 of at least one embodiment of this disclosure.
  • FIG. 23A is a diagram of a mode in which the user 190 wearing the HMD device 110 (not shown) is standing on the board 180 while maintaining the horizontal state.
  • FIG. 23B is a diagram of a mode in which the user 190 shifts his or her weight to incline the board 180 to the left side as viewed from the user 190 .
  • FIG. 23C is a diagram of a state in which the user shifts his or her weight to incline the board 180 to the right side as viewed from the user 190 .
  • the inclination sensor 170 outputs a signal corresponding to the direction.
  • the output signal is input to the computer 200 A.
  • the direction in which the inclination is detected is not limited to the left as in FIG. 23B or to the right as in FIG. 23C .
  • the inclination sensor 170 may detect the forward inclination when the user 190 shifts his or her weight forward or the backward inclination when the user 190 shifts his or her weight backward.
  • the user 190 can predict the movement direction of the virtual user in the virtual space 2 in accordance with his or her weight shift, and hence a so-called visually induced motion sickness (VR sickness) can be reduced or prevented without reducing the user's sense of immersion.
  • FIGS. 24A, 24B and 24C are diagrams of a mode in which the user 190 is sitting on a sled 1300 A in at least one embodiment of this disclosure.
  • touch sensors 1370 and 1371 are arranged on both side surfaces of the sled 1300 A.
  • the output of the touch sensors 1370 and 1371 is connected to an input interface of the computer 200 A.
  • one or more sleds 1300 A may be provided in a playground.
  • FIG. 24A is a diagram of a mode in which the user 190 wearing the HMD device 110 (not shown) is sitting on the sled 1300 A while maintaining the horizontal state. At this time, the user 190 is not touching the touch sensor 1370 or 1371 .
  • FIG. 24B is a diagram of a mode in which the user 190 shifts his or her weight to incline the sled 1300 A to the right side as viewed from the user 190 .
  • a signal output from the touch sensor 1370 is input to the computer 200 A.
  • the processor 10 detects that the sled 1300 A has inclined to the right side based on the signal. Further, the processor 10 generates the field-of-view image data so that the field of view from the front right side of the virtual user in the virtual space 2 may be recognized by the user 190 .
  • the processor 10 transmits the field-of-view image data to the HMD device 110 , and the HMD device 110 displays the image based on the field-of-view image data on the monitor 112 .
  • When the user 190 views the image, the user 190 feels as if he or she is moving in the right direction in the virtual space 2 .
  • FIG. 24C is a diagram of a state in which the user 190 shifts his or her weight to incline the sled 1300 A to the left side as viewed from the user 190 .
  • a signal output from the touch sensor 1371 is input to the computer 200 A.
  • the processor 10 detects that the sled 1300 A has inclined to the left side based on the signal. Further, the processor 10 generates the field-of-view image data so that the field of view from the front left side of the virtual user in the virtual space 2 may be recognized by the user 190 .
  • the processor 10 transmits the field-of-view image data to the HMD device 110 to display the image based on the field-of-view image data on the monitor 112 .
  • when the user 190 views the image, the user 190 feels as if the user is moving in the left direction in the virtual space 2 .
  • the inclination sensor 170 may be arranged on the bottom portion of the sled 1300 A instead of the touch sensors 1370 and 1371 .
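The touch-sensor arrangement described above can be sketched in code. This is a hypothetical illustration, not the patent's implementation: the function name and boolean inputs are assumptions standing in for the signals the touch sensors 1370 and 1371 output to the computer 200A.

```python
def detect_incline(right_sensor_active: bool, left_sensor_active: bool) -> str:
    """Map touch-sensor contact on the sled's side surfaces to an incline
    direction, as the processor does from the sensors' output signals."""
    if right_sensor_active and not left_sensor_active:
        return "right"       # weight shifted right: show view from front right
    if left_sensor_active and not right_sensor_active:
        return "left"        # weight shifted left: show view from front left
    return "horizontal"      # no (or ambiguous) contact: keep the level view
```

With the sensors replaced by an inclination sensor on the sled's bottom portion, the same mapping would be driven by a measured angle rather than contact signals.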
  • the movement of the virtual user in the virtual space 2 is described.
  • the user 190 wearing the HMD device 110 visually recognizes a field-of-view image 1400 A in the virtual space 2 as the virtual user.
  • FIGS. 25A and 25B are diagrams of a case where the board 180 or the sled 1300 A on which the user 190 is sitting or standing is maintained in a horizontal state according to at least one embodiment of this disclosure.
  • the processor 10 determines that the user 190 is maintaining the horizontal state based on the signal output from the inclination sensor 170 or the touch sensors 1370 and 1371 . Further, the processor 10 generates such field-of-view image data that the image in the horizontal state is visually recognized in the virtual space 2 , and transmits the field-of-view image data to the monitor 112 of the HMD device 110 .
  • the user 190 recognizes the field-of-view image 1400 A as the virtual user.
  • the virtual camera 1 corresponding to the virtual user captures the field-of-view region 23 corresponding to the field of view of the virtual user.
  • FIGS. 26A and 26B are diagrams of a case where the user 190 maintaining a horizontal state leans to the right side to incline the board 180 or the sled 1300 A to the right side according to at least one embodiment of this disclosure.
  • the processor 10 determines that the user 190 is inclined to the right side based on the signal output from the inclination sensor 170 or the touch sensors 1370 and 1371 . Further, the processor 10 generates the field-of-view image data in which the virtual user in the virtual space 2 is recognized as having moved to the right side from the original location (for example, see FIGS. 25A and 25B ).
  • the processor 10 generates the field-of-view image data for displaying the field-of-view image so that the image that is recognized in the virtual space 2 flows from the right to the left, and transmits the field-of-view image data to the HMD device 110 .
  • the monitor 112 displays a field-of-view image 1500 A based on the field-of-view image data.
  • a tree 1410 A displayed on the right in the field-of-view image 1400 A is moved to the center of the field-of-view image 1500 A.
  • such movement corresponds to, for example, movement of the virtual camera 1 assumed in the virtual space 2 to the front of the tree 1410 A.
  • the processor 10 generates the field-of-view image data in order to display an image similar to the image recognized in accordance with the movement of the virtual camera 1 .
  • the transition from the field-of-view image 1400 A to the field-of-view image 1500 A is performed based on the shift of the weight of the user 190 to the right side.
  • the user 190 can recognize the movement in the virtual space 2 due to the transition of the field-of-view image in accordance with his or her motion corresponding to the weight shift to the right side, and hence the inclination of the user 190 and the movement direction in the virtual space 2 are synchronized with each other.
  • occurrence of a so-called VR sickness may be suppressed or avoided.
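The lateral movement described above, in which scenery on the leaning side (for example, the tree 1410A) drifts toward the center of the field-of-view image, can be sketched as a lateral shift of the virtual camera 1. The step constant and function name are illustrative assumptions, not values from the patent.

```python
LATERAL_STEP = 0.5  # assumed camera movement per frame, in virtual-space units

def update_camera_x(camera_x: float, incline: str) -> float:
    """Shift the virtual camera laterally based on the detected incline,
    so the field-of-view image appears to flow opposite the lean."""
    if incline == "right":
        return camera_x + LATERAL_STEP   # image flows from right to left
    if incline == "left":
        return camera_x - LATERAL_STEP   # image flows from left to right
    return camera_x                      # horizontal: camera stays in place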
  • FIGS. 27A and 27B are diagrams of a case where the user 190 maintaining the horizontal state leans to the left side to incline the board 180 or the sled 1300 A to the left side according to at least one embodiment of this disclosure.
  • the processor 10 determines that the user 190 is inclined to the left side based on the signal output from the inclination sensor 170 or the touch sensors 1370 and 1371 . Further, the processor 10 generates the field-of-view image data in which the virtual user in the virtual space 2 is recognized as having moved to the left side from the original location (for example, see FIGS. 25A and 25B ).
  • the processor 10 generates the field-of-view image data for displaying the field-of-view image so that the image that is recognized in the virtual space 2 flows from the left to the right, and transmits the field-of-view image data to the HMD device 110 .
  • the monitor 112 displays a field-of-view image 1600 A based on the field-of-view image data.
  • a mountain 1420 A that has been displayed on the left in the field-of-view image 1400 A is moved to the center of the field-of-view image 1600 A.
  • such movement corresponds to, for example, movement of the virtual camera 1 assumed in the virtual space 2 to the front of the mountain 1420 A.
  • the processor 10 generates the field-of-view image data in order to display an image similar to the image recognized in accordance with the movement of the virtual camera 1 .
  • the transition from the field-of-view image 1400 A to the field-of-view image 1600 A is performed based on the shift of the weight of the user 190 to the left side.
  • the user 190 can recognize the movement in the virtual space 2 due to the transition of the field-of-view image in accordance with his or her motion corresponding to the weight shift to the left side, and hence the inclination of the user 190 and the movement direction in the virtual space 2 are synchronized with each other.
  • occurrence of a so-called VR sickness may be suppressed or avoided.
  • FIGS. 28A and 28B are diagrams of a case where the user 190 maintaining the horizontal state leans forward to incline the board 180 forward according to at least one embodiment of this disclosure.
  • the processor 10 determines that the user 190 is inclined to the forward side based on the signal output from the inclination sensor 170 . Further, the processor 10 generates the field-of-view image data in which the virtual user in the virtual space 2 is recognized as having moved to the forward side from the original location (for example, see FIGS. 25A and 25B ). For example, the processor 10 generates the field-of-view image data for displaying the field-of-view image so that the image that is recognized in the virtual space 2 is approaching at high speed, and transmits the field-of-view image data to the HMD device 110 .
  • the monitor 112 displays a field-of-view image 1700 based on the field-of-view image data.
  • the tree 1410 A and the mountain 1420 A that have been displayed small at far places in the field-of-view image 1400 A are displayed large in accordance with the movement distance in the virtual space 2 .
  • the movement distance in the virtual space 2 may be calculated based on, for example, the angle of the forward inclination of the user 190 and on the movement distance per unit angle determined in advance.
  • the movement distance may be calculated based on the time for which the state of the forward inclination is continued and on the movement distance per unit time determined in advance.
  • the movement of the field-of-view image in the virtual space 2 corresponds to, for example, movement of the virtual camera 1 assumed in the virtual space 2 at high speed toward the tree 1410 A or the mountain 1420 A.
  • the processor 10 generates the field-of-view image data in order to display an image similar to the image recognized in accordance with the movement of the virtual camera 1 .
  • the transition from the field-of-view image 1400 A to the field-of-view image 1700 is performed based on the shift of the weight of the user 190 to the front side.
  • the user 190 can recognize the movement in the virtual space 2 in accordance with his or her motion corresponding to the weight shift, and hence the inclination of the user 190 and the movement direction in the virtual space 2 are synchronized with each other. As a result, occurrence of a so-called VR sickness may be suppressed or avoided.
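The two distance rules mentioned above can be written out directly: the movement distance grows either with the angle of the forward inclination or with the time for which the inclination is continued. The per-unit constants are assumptions chosen for illustration; the patent only states that they are determined in advance.

```python
DIST_PER_DEGREE = 0.1   # assumed movement distance per unit angle of incline
DIST_PER_SECOND = 2.0   # assumed movement distance per unit time of lean

def distance_from_angle(angle_deg: float) -> float:
    """Movement distance from the angle of the user's forward inclination."""
    return angle_deg * DIST_PER_DEGREE

def distance_from_duration(seconds: float) -> float:
    """Movement distance from how long the inclined state is continued."""
    return seconds * DIST_PER_SECOND
```

The same two rules apply, with the sign reversed, to the backward inclination described next.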
  • FIGS. 29A and 29B are diagrams of a case where the user 190 maintaining the horizontal state leans backward to incline the board 180 backward according to at least one embodiment of this disclosure.
  • the processor 10 determines that the user 190 is inclined to the backward side based on the signal output from the inclination sensor 170 . Further, the processor 10 generates the field-of-view image data in which the virtual user in the virtual space 2 is recognized as having moved to the backward side from the original location (for example, see FIGS. 25A and 25B ). For example, the processor 10 generates the field-of-view image data for displaying the field-of-view image so that the image that is recognized in the virtual space 2 is moving away, and transmits the field-of-view image data to the HMD device 110 .
  • the monitor 112 displays a field-of-view image 1800 based on the field-of-view image data.
  • the tree 1410 A and the mountain 1420 A that have been displayed in the field-of-view image 1400 A are displayed smaller as compared to the case illustrated in FIGS. 25A and 25B .
  • the tree 1410 A and the mountain 1420 A are displayed smaller in accordance with the movement distance in the virtual space 2 .
  • the movement distance in the virtual space 2 may be calculated based on, for example, the angle of the backward inclination of the user 190 and on the movement distance per unit angle determined in advance.
  • the movement distance may be calculated based on the time for which the state of the backward inclination is continued and on the movement distance per unit time determined in advance.
  • the movement of the field-of-view image in the virtual space 2 corresponds to, for example, movement of the virtual camera 1 assumed in the virtual space 2 at high speed in a direction departing from the tree 1410 A or the mountain 1420 A.
  • the processor 10 generates the field-of-view image data in order to display an image similar to the image recognized in accordance with the movement of the virtual camera 1 .
  • the transition from the field-of-view image 1400 A to the field-of-view image 1800 is performed based on the shift of the weight of the user 190 to the back side.
  • the user 190 can recognize the movement in the virtual space 2 in accordance with his or her motion corresponding to the weight shift, and hence the inclination of the user 190 and the movement direction in the virtual space 2 are synchronized with each other similarly to the case of the forward inclination. As a result, occurrence of a so-called VR sickness may be suppressed or avoided.
  • the movement direction of the virtual user in the virtual space 2 is determined based on the direction in which the user 190 is inclined in the real space, and hence the inclination of the user 190 and the movement direction in the virtual space 2 are synchronized with each other. As a result, even when the virtual user is moved in the virtual space 2 , the VR sickness may be reduced or prevented from occurring to the user 190 that visually recognizes the image at this time.
  • the features described in the embodiments above may be combined as appropriate.
  • the movement distance may be calculated so that the virtual user moves forward or backward in the direction of the reference line of sight 5 of the virtual camera 1 in the virtual space 2 , and the virtual camera 1 may be rotated about the yaw direction (v axis) in accordance with the degree of inclination in which the user inclines the board 180 to the right side or the left side.
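The combination described above can be sketched as follows: forward or backward lean moves the virtual user along the reference line of sight 5, while a lean to the right or left side yaws the virtual camera 1 about its v axis. This is a minimal sketch under assumed conventions (yaw in radians, an assumed gain constant), not the patent's implementation.

```python
import math

YAW_GAIN = 0.02  # assumed radians of yaw per degree of lateral incline

def step(pos_x, pos_z, yaw, forward_dist, lateral_incline_deg):
    """Advance the virtual user by forward_dist along the current line of
    sight, then rotate the camera's yaw in proportion to the lateral lean."""
    pos_x += forward_dist * math.sin(yaw)
    pos_z += forward_dist * math.cos(yaw)
    yaw += lateral_incline_deg * YAW_GAIN
    return pos_x, pos_z, yaw
```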
  • a method of providing a virtual space to a head mounted display device by a computer includes defining a virtual space.
  • the method further includes detecting a direction in which a user of the head mounted display device is inclined.
  • the method further includes determining a movement direction of the user in the virtual space based on the detected direction in which the user is inclined.
  • the method further includes causing the head mounted display device to display a field of view of the user in the virtual space so that the field of view is moved in the determined movement direction of the user.
  • a method in which the determining of the movement direction includes determining the movement direction in accordance with a time for which inclination of the user is continued.
  • a method in which the determining of the movement direction includes determining the movement direction in accordance with the direction in which the user is inclined.
  • a method according to any one of Items 1 to 3, further including determining a movement distance of the user in the virtual space based on detected inclination of the user.
  • the causing of the head mounted display device to display a field of view includes causing the head mounted display device to display the field of view of the user so that the field of view is moved by the movement distance in the movement direction.
  • a method of providing a virtual space to a head mounted display device by a computer includes defining a virtual space.
  • the method further includes detecting inclination of a user of the head mounted display device.
  • the method further includes determining a movement distance of the user in the virtual space based on the detected inclination of the user.
  • the method further includes causing the head mounted display device to display a field of view of the user in the virtual space so that the field of view is moved by the determined movement distance of the user.
  • a method according to Item 4 or 5, in which the determining of the movement distance includes determining the movement distance in accordance with a time for which the inclination of the user is continued.
  • a method according to any one of Items 1 to 7, in which the detecting of the inclination of the user includes a step of detecting an acceleration based on a posture or a motion performed by the user through weight shift.
  • a non-transitory computer readable medium configured to store instructions for causing a computer to execute the method of any one of Items 1 to 9.
  • a system including a head mounted display device. The system further includes a computer configured to provide a virtual space to the head mounted display device.
  • the system further includes a sensor configured to detect that a user of the head mounted display device is inclined.
  • the computer includes a memory configured to store a series of commands.
  • the computer further includes a processor configured to execute the series of commands.
  • the processor is configured to, when the processor executes the series of commands, define a virtual space.
  • the processor is further configured to detect a direction in which the user of the head mounted display device is inclined.
  • the processor is further configured to determine a movement direction of the user in the virtual space based on the direction in which the user is inclined.
  • the processor is further configured to cause the head mounted display device to display a field of view of the user in the virtual space so that the field of view is moved in the determined movement direction of the user.
  • a system in which the processor is further configured to determine a movement distance of the user in the virtual space based on detected inclination of the user.
  • the processor is further configured to cause the head mounted display device to display a field of view by causing the head mounted display device to display the field of view of the user so that the field of view is moved by the movement distance in the movement direction.
  • a system including a head mounted display device. The system further includes a computer configured to provide a virtual space to the head mounted display device.
  • the system further includes a sensor configured to detect inclination of a user of the head mounted display device.
  • the computer includes a memory configured to store a series of commands.
  • the computer includes a processor configured to execute the series of commands.
  • the processor is configured to, when the processor executes the series of commands, define a virtual space.
  • the processor is further configured to detect the inclination of the user of the head mounted display device.
  • the processor is further configured to determine a movement distance of the user in the virtual space based on the detected inclination of the user.
  • the processor is further configured to cause the head mounted display device to display a field of view of the user in the virtual space so that the field of view is moved by the determined movement distance of the user.
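The method and system items above can be sketched as one per-frame loop: detect the inclination, determine the movement direction and distance, and update the displayed field of view. All function names here are illustrative stand-ins (none come from the patent) for the sensor, movement logic, and HMD rendering.

```python
def frame(read_incline, move_user, render_view):
    """One frame of the method: detect incline, determine movement, display."""
    direction, angle = read_incline()        # e.g. from the inclination sensor
    position = move_user(direction, angle)   # movement direction and distance
    render_view(position)                    # display the field of view on the HMD
    return position
```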

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computer Graphics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Cardiology (AREA)
  • Remote Sensing (AREA)
  • Geometry (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Computer Hardware Design (AREA)
  • General Health & Medical Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Computing Systems (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)
US15/619,333 2016-06-10 2017-06-09 Method of providing a virtual space, medium for causing a computer to execute the method, and system for providing a virtual space Active US10252162B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/376,376 US10589176B2 (en) 2016-06-10 2019-04-05 Method of providing a virtual space, medium for causing a computer to execute the method, and system for providing a virtual space

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2016116437A JP6177965B1 (ja) 2016-06-10 2016-06-10 仮想空間を提供するための方法、当該方法をコンピュータに実現させるためのプログラム、および仮想空間を提供するためのシステム
JP2016-116444 2016-06-10
JP2016-116437 2016-06-10
JP2016116444A JP6126273B1 (ja) 2016-06-10 2016-06-10 仮想空間を提供するための方法、当該方法をコンピュータに実現させるためのプログラム、および仮想空間を提供するためのシステム

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/376,376 Continuation US10589176B2 (en) 2016-06-10 2019-04-05 Method of providing a virtual space, medium for causing a computer to execute the method, and system for providing a virtual space

Publications (2)

Publication Number Publication Date
US20170354882A1 US20170354882A1 (en) 2017-12-14
US10252162B2 true US10252162B2 (en) 2019-04-09

Family

ID=60573510

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/619,333 Active US10252162B2 (en) 2016-06-10 2017-06-09 Method of providing a virtual space, medium for causing a computer to execute the method, and system for providing a virtual space
US16/376,376 Active US10589176B2 (en) 2016-06-10 2019-04-05 Method of providing a virtual space, medium for causing a computer to execute the method, and system for providing a virtual space

Family Applications After (1)

Application Number Title Priority Date Filing Date
US16/376,376 Active US10589176B2 (en) 2016-06-10 2019-04-05 Method of providing a virtual space, medium for causing a computer to execute the method, and system for providing a virtual space

Country Status (3)

Country Link
US (2) US10252162B2 (ja)
CN (1) CN109069927A (ja)
WO (1) WO2017213218A1 (ja)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108463787B (zh) 2016-01-05 2021-11-30 瑞尔D斯帕克有限责任公司 多视角图像的注视校正
EP3561570A1 (en) * 2016-12-22 2019-10-30 Shenzhen Royole Technologies Co., Ltd. Head-mounted display apparatus, and visual-aid providing method thereof
EP3665553B1 (en) 2017-08-08 2023-12-13 RealD Spark, LLC Adjusting a digital representation of a head region
JP6340464B1 (ja) * 2017-09-27 2018-06-06 株式会社Cygames プログラム、情報処理方法、情報処理システム、頭部装着型表示装置、情報処理装置
US11017575B2 (en) * 2018-02-26 2021-05-25 Reald Spark, Llc Method and system for generating data to provide an animated visual representation
CN109298779B (zh) * 2018-08-10 2021-10-12 济南奥维信息科技有限公司济宁分公司 基于虚拟代理交互的虚拟训练***与方法
CN114144753A (zh) * 2019-07-30 2022-03-04 索尼集团公司 图像处理装置、图像处理方法和记录介质
DE102019124386A1 (de) * 2019-09-11 2021-03-11 Audi Ag Verfahren zum Betreiben einer Virtual-Reality-Brille in einem Fahrzeug sowie Virtual-Reality-System mit einer Virtual-Reality-Brille und einem Fahrzeug
CN113722644B (zh) * 2021-09-03 2023-07-21 如你所视(北京)科技有限公司 基于外接设备在虚拟空间选取浏览点位的方法及其装置
CN113730905A (zh) * 2021-09-03 2021-12-03 北京房江湖科技有限公司 一种在虚拟空间中实现自由游走的方法及其装置
CN116501175B (zh) * 2023-06-25 2023-09-22 江西格如灵科技股份有限公司 虚拟角色移动方法、装置、计算机设备及介质

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007116309A (ja) 2005-10-19 2007-05-10 Seiko Epson Corp 画像情報再生装置
US20100245365A1 (en) 2009-03-30 2010-09-30 Namco Bandai Games Inc. Image generation system, image generation method, and computer program product
US20120262558A1 (en) * 2006-11-02 2012-10-18 Sensics, Inc. Apparatus, systems and methods for providing motion tracking using a personal viewing device
JP2013039225A (ja) 2011-08-16 2013-02-28 Konami Digital Entertainment Co Ltd ゲーム装置、ゲーム装置の制御方法、及びプログラム
US20130201082A1 (en) * 2008-06-11 2013-08-08 Honeywell International Inc. Method and system for operating a near-to-eye display
JP2014038403A (ja) 2012-08-13 2014-02-27 Konami Digital Entertainment Co Ltd 表示制御装置、表示制御装置の制御方法、表示制御システム、表示制御システムの制御方法、及びプログラム
JP5869177B1 (ja) 2015-09-16 2016-02-24 株式会社コロプラ 仮想現実空間映像表示方法、及び、プログラム
US20160252729A1 (en) 2015-02-27 2016-09-01 Sony Computer Entertainment Inc. Display control apparatus, display control method, and recording medium

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5629594A (en) * 1992-12-02 1997-05-13 Cybernet Systems Corporation Force feedback system
US6753828B2 (en) * 2000-09-25 2004-06-22 Siemens Corporated Research, Inc. System and method for calibrating a stereo optical see-through head-mounted display system for augmented reality
US7667700B1 (en) * 2004-03-05 2010-02-23 Hrl Laboratories, Llc System and method for navigating operating in a virtual environment
JP4673570B2 (ja) * 2004-03-31 2011-04-20 株式会社セガ 画像生成装置、画像表示方法及びプログラム
US8913009B2 (en) * 2010-02-03 2014-12-16 Nintendo Co., Ltd. Spatially-correlated multi-display human-machine interface
JP6217747B2 (ja) * 2013-04-16 2017-10-25 ソニー株式会社 情報処理装置及び情報処理方法
US20150185825A1 (en) * 2013-12-30 2015-07-02 Daqri, Llc Assigning a virtual user interface to a physical object
CN103877726B (zh) * 2014-04-10 2017-09-26 北京蚁视科技有限公司 一种虚拟现实组件***
CN204864894U (zh) * 2015-08-19 2015-12-16 天津先驱领域科技有限公司 一种用于虚拟现实游戏的头盔
CN105222761A (zh) * 2015-10-29 2016-01-06 哈尔滨工业大学 借助虚拟现实及双目视觉技术实现的第一人称沉浸式无人机驾驶***及驾驶方法

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007116309A (ja) 2005-10-19 2007-05-10 Seiko Epson Corp 画像情報再生装置
US20120262558A1 (en) * 2006-11-02 2012-10-18 Sensics, Inc. Apparatus, systems and methods for providing motion tracking using a personal viewing device
US20130201082A1 (en) * 2008-06-11 2013-08-08 Honeywell International Inc. Method and system for operating a near-to-eye display
US20100245365A1 (en) 2009-03-30 2010-09-30 Namco Bandai Games Inc. Image generation system, image generation method, and computer program product
JP2010237882A (ja) 2009-03-30 2010-10-21 Namco Bandai Games Inc プログラム、情報記憶媒体及び画像生成システム
JP2013039225A (ja) 2011-08-16 2013-02-28 Konami Digital Entertainment Co Ltd ゲーム装置、ゲーム装置の制御方法、及びプログラム
JP2014038403A (ja) 2012-08-13 2014-02-27 Konami Digital Entertainment Co Ltd 表示制御装置、表示制御装置の制御方法、表示制御システム、表示制御システムの制御方法、及びプログラム
US20160252729A1 (en) 2015-02-27 2016-09-01 Sony Computer Entertainment Inc. Display control apparatus, display control method, and recording medium
JP2016158794A (ja) 2015-02-27 2016-09-05 株式会社ソニー・インタラクティブエンタテインメント 表示制御プログラム、表示制御装置、及び表示制御方法
JP5869177B1 (ja) 2015-09-16 2016-02-24 株式会社コロプラ 仮想現実空間映像表示方法、及び、プログラム

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Earnshaw, R., Gigante, M., Jones, H., Virtual Reality Systems, Mar. 1993, Academic Press, 1st Edition, pp. 1-282. *
Notice of Allowance in JP Application No. 2016-116444, dated Mar. 14, 2017.
Office Action in JP Application No. 2016-116437, dated Feb. 21, 2017.
Office Action in JP Application No. 2016-116437, dated Oct. 11, 2016.
Office Action in JP Application No. 2016-116444, dated Oct. 18, 2016.

Also Published As

Publication number Publication date
CN109069927A (zh) 2018-12-21
US20190232165A1 (en) 2019-08-01
US20170354882A1 (en) 2017-12-14
WO2017213218A1 (ja) 2017-12-14
US10589176B2 (en) 2020-03-17

Similar Documents

Publication Publication Date Title
US10589176B2 (en) Method of providing a virtual space, medium for causing a computer to execute the method, and system for providing a virtual space
JP6342038B1 (ja) 仮想空間を提供するためのプログラム、当該プログラムを実行するための情報処理装置、および仮想空間を提供するための方法
US10223064B2 (en) Method for providing virtual space, program and apparatus therefor
US20180357817A1 (en) Information processing method, program, and computer
JP6290467B1 (ja) 情報処理方法、装置、および当該情報処理方法をコンピュータに実行させるプログラム
JP2017220224A (ja) 仮想空間を提供するための方法、当該方法をコンピュータに実現させるためのプログラム、および仮想空間を提供するためのシステム
JP6126273B1 (ja) 仮想空間を提供するための方法、当該方法をコンピュータに実現させるためのプログラム、および仮想空間を提供するためのシステム
JP2018125003A (ja) 情報処理方法、装置、および当該情報処理方法をコンピュータに実行させるプログラム
JP6425846B1 (ja) プログラム、情報処理装置、及び情報処理方法
JP6513241B1 (ja) プログラム、情報処理装置、及び情報処理方法
US20190114841A1 (en) Method, program and apparatus for providing virtual experience
JP6839046B2 (ja) 情報処理方法、装置、情報処理システム、および当該情報処理方法をコンピュータに実行させるプログラム
JP6382928B2 (ja) 仮想空間における画像の表示を制御するためにコンピュータによって実行される方法、当該方法をコンピュータに実現させるためのプログラム、および、コンピュータ装置
JP6934383B2 (ja) 仮想空間を提供するための方法、当該方法をコンピュータに実現させるためのプログラム、および仮想空間を提供するためのシステム
US20180239420A1 (en) Method executed on computer for providing virtual space to head mount device, program for executing the method on the computer, and computer apparatus
JP2019168962A (ja) プログラム、情報処理装置、及び情報処理方法
JP6220473B1 (ja) 仮想空間を提供するための方法、当該方法をコンピュータに実現させるためのプログラム、当該プログラムを記録した記録媒体、および仮想空間を提供するためのシステム
JP6965304B2 (ja) プログラム、情報処理装置、及び情報処理方法
JP6177965B1 (ja) 仮想空間を提供するための方法、当該方法をコンピュータに実現させるためのプログラム、および仮想空間を提供するためのシステム
CN114007707A (zh) 游戏程序、游戏方法以及信息终端装置
JP2019179434A (ja) プログラム、情報処理装置、及び情報処理方法
JP7037467B2 (ja) プログラム、情報処理装置、及び情報処理方法
JP2019211865A (ja) コンピュータプログラム、情報処理装置および情報処理方法
JP2018106697A (ja) 仮想空間を提供するための方法、プログラム、および、装置
JP6946084B2 (ja) 情報処理方法、装置、および当該情報処理方法をコンピュータに実行させるためのプログラム

Legal Events

Date Code Title Description
AS Assignment

Owner name: COLOPL, INC., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KONO, YUKI;BABA, NARUATSU;SIGNING DATES FROM 20170715 TO 20170803;REEL/FRAME:043549/0878

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4