US20120196684A1 - Combining motion capture and timing to create a virtual gaming experience - Google Patents

Combining motion capture and timing to create a virtual gaming experience

Info

Publication number
US20120196684A1
US20120196684A1 (application US 13/275,212)
Authority
US
United States
Prior art keywords
player
action
timing
players
gaming system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/275,212
Inventor
David Richardson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US 13/275,212
Publication of US20120196684A1
Current status: Abandoned

Classifications

    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/42: Processing input control signals by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/28: Output arrangements responding to control signals for affecting ambient conditions, e.g. vibrating players' seats, activating scent dispensers or affecting temperature or light
    • A63F 13/211: Input arrangements using inertial sensors, e.g. accelerometers or gyroscopes
    • A63F 13/44: Processing input control signals involving timing of operations, e.g. performing an action within a time slot
    • A63F 13/812: Special adaptations for ball games, e.g. soccer or baseball
    • A63F 2300/105: Input arrangements using inertial sensors, e.g. accelerometers, gyroscopes
    • A63F 2300/5573: Player registration data management, player location
    • A63F 2300/6045: Mapping control signals received from the input arrangement into game commands
    • A63F 2300/8082: Virtual reality


Abstract

A method for creating a virtual gaming experience without avatars is disclosed. The method described herein may be a computer-implemented software process. Conventional virtual games may require an avatar on a screen with which the player interacts. The present invention provides for direct interaction between two or more players, without the need for a separate visual representation on a screen or television. The two or more gamers may be within view of each other, thus obviating the need for a separate visual representation. The process described herein may use basic motion capture (using accelerometer technology) and timing to represent gaming maneuvers and situations, creating a virtual gaming experience in real time. This way, people are able to physically react to the other people, and to their actions and reactions, involved in this experience.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application claims the benefit of priority of U.S. provisional patent application number 61/438,379, filed Feb. 1, 2011, the contents of which are herein incorporated by reference.
  • BACKGROUND OF THE INVENTION
  • The present invention generally relates to games and, more particularly, to a virtual gaming experience combining motion capture and timing.
  • Conventional video gaming uses an avatar on a screen with which the player reacts and interacts. This type of gaming requires the use of a separate visual representation on a screen or television and, in multi-player games, the gamers do not directly interact with each other.
  • As can be seen, there is a need for a video game system that permits the players to directly interact without the need for a separate visual representation.
  • SUMMARY OF THE INVENTION
  • In one aspect of the present invention, a method for creating a virtual gaming experience comprises establishing a connection between all players of the game and a gaming system, each player being within view of each other player; receiving input of a player action by the gaming system, the player action pertaining to the virtual gaming experience, wherein the action equates to a physical motion combined with a timed response; determining whether the player action has an effect on a previous action, if any, by another player; if the player action has an effect on the previous action, if any, by another player, then assessing whether the timing of the response was within timing tolerance limits; and determining consequences for said player action and previous action, if any.
  • In another aspect of the present invention, a gaming system comprises one or more motion sensors controlled by one or more players; and a processor adapted to detect motion of the one or more motion sensors and create a player action pertaining to the virtual gaming experience, wherein the player action equates to a physical motion combined with a timed response, wherein the processor determines whether the player action has an effect on a previous action, if any, by another player and if the player action has an effect on the previous action, if any, by another player, then assessing whether the timing of the response was within timing tolerance limits; and the processor determines consequences for said player action and previous action, if any.
  • These and other features, aspects and advantages of the present invention will become better understood with reference to the following drawings, description and claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates a flow chart describing a method for creating a virtual gaming experience in accordance with one embodiment of the present invention; and
  • FIG. 1A illustrates a flow chart method describing action logic for creating a virtual gaming experience in accordance with one embodiment of the present invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • The following detailed description is of the best currently contemplated modes of carrying out exemplary embodiments of the invention. The description is not to be taken in a limiting sense, but is made merely for the purpose of illustrating the general principles of the invention, since the scope of the invention is best defined by the appended claims.
  • Broadly, an embodiment of the present invention generally provides a method for creating a virtual gaming experience without avatars. The method described herein may be a computer-implemented software process.
  • Prior art virtual games may require an avatar on a screen with which the player interacts. The present invention provides for direct interaction between two or more players, without the need for a separate visual representation on a screen or television. The two or more gamers may be within view of each other, thus obviating the need for a separate visual representation. The process described herein may use basic motion capture (using accelerometer technology) and timing to represent gaming maneuvers and situations, creating a virtual gaming experience in real time. This way, people are able to physically react to the other people, and to their actions and reactions, involved in this experience. The data generated by the motion capture devices may be delivered to a processor of a gaming system. Each player may use one or more such devices; for some games, a motion capture device may be incorporated into another device, such as a baseball bat, and in some embodiments the players may attach motion capture devices to various body parts to capture the player's motion.
  • Referring now to FIG. 1, illustrated is a flow chart describing a method for creating a virtual gaming experience in accordance with one embodiment of the present invention. At step 1, a connection may need to be established between all users so that the actions of different users can be processed on a real-time basis with no substantial lag or delay between the actions and reactions of the different users.
  • After the connection is established at step 1, then at step 2, game play may begin. The game play in step 2 may involve a number of additional steps: a step 4, motion detection and action, where the action may equate to some sort of physical motion combined with a timed response or timing parameter; a step 5, where a response may include some programmed reaction representing an activity; and a step 6, where each of these actions and responses requires a timing element to allow the interactive experience to be fully realized. Additional details for these steps, as well as the game play, are described further below.
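The step 1 through step 6 flow above can be sketched as a minimal game loop. All function and field names here are illustrative assumptions for the sketch, not taken from the patent.

```python
import time

def establish_connection(players):
    """Step 1: register every player with the gaming system (stub)."""
    return {p: {"connected": True} for p in players}

def game_loop(players, detect_motion, resolve_action):
    """Steps 2 and 4-6: detect motion, attach a timestamp, and hand the
    resulting action to the game's resolution logic."""
    establish_connection(players)           # step 1: connect all users
    for player in players:                  # step 2: game play begins
        motion = detect_motion(player)      # step 4: motion detection
        if motion is not None:
            action = {
                "player": player,
                "motion": motion,           # step 5: programmed reaction
                "time": time.monotonic(),   # step 6: timing element
            }
            resolve_action(action)
```

A caller might supply a sensor-polling callback for `detect_motion` and the game's action logic for `resolve_action`; here both are stand-ins.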
  • The addition of a global positioning system could enhance the experience by determining the users' locations so that the program can react accordingly (mainly in conjunction with the optional compass element described immediately hereinafter). The addition of a compass, or directional information, can enhance the experience by determining the users' direction and possible intent so that the program can react accordingly. The addition of sound can further enhance the experience; for example, gunshots in battle situations, cracks of the bat in baseball games, or a virtual ping pong ball hitting the table can help a player determine when to react, such as when to swing a virtual paddle. Vibration can also add to the experience. A vibration could allow a player to sense when they are hit with a bullet or a sword. It could also let the player know the magnitude of damage they are taking with such a hit. Sound and vibration could further help players anticipate the timing element. For example, in a baseball game, a fastball would make a higher-pitched sound than a change-up, helping the batter determine when to swing the bat. In a ping pong game, the players may hear (with sound) and feel (with vibration) the ball bounce off the table, which can help them judge when to swing in order to return the ball. Lights and lighting effects can also add to the experience. They may allow a player to see a reaction; for example, a light may illuminate when a user strikes a ping pong ball. Lighting effects may also describe closeness to an object. Using the ping pong game as an example, a series of lights may light up as the virtual ball approaches the virtual paddle; as the ball gets closer, more lights illuminate. Lights and lighting effects may be used in other ways to enhance the gaming experience.
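The light-based proximity cue can be illustrated with a simple mapping from distance to the number of lit lights. The linear mapping and the five-light count are assumptions of this sketch; the patent does not specify either.

```python
def lights_for_distance(distance, max_distance, num_lights=5):
    """Illuminate more lights as the virtual ball nears the virtual paddle
    (linear mapping assumed for illustration)."""
    if distance >= max_distance:
        return 0                                  # ball still far away
    fraction = 1.0 - distance / max_distance      # 0.0 far .. 1.0 at paddle
    return min(num_lights, int(fraction * num_lights) + 1)
```

At full range no lights are lit; all five illuminate as the virtual ball reaches the paddle.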
  • Referring now to FIG. 1A, illustrated is a flow chart method describing action logic for creating a virtual gaming experience in accordance with one embodiment of the present invention. When two or more users decide that they want to play this game, they stand in the vicinity of, and within eyesight of, each other and begin. At that point, each user becomes the avatar and makes the motions he would use in real life; for example, if he were playing baseball, he would make the motion of throwing a ball or swinging a bat; for a game involving guns, he would make a physical motion representing firing a gun; a sword would be a relatively simple motion to understand.
  • According to this action logic, a first user may perform an action at step 20. The action is any action that the game allows, and may equate to some physical motion combined with a timed response. The response may be a programmed reaction representing an activity. Some examples of such responses may include, but are not limited to, hitting a ball, shooting a gun, dodging an attack, blocking with a sword, attacking with a magical fireball, or throwing a football. Each of the actions and responses requires a timing element. This element may allow the interactive experience to be fully realized. For example, each action may need some sort of timing parameters in order to allow the other player(s) to respond or not respond.
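An action pairing a physical motion with its timing data could be represented as follows. The field names, and making the response window an explicit per-action value, are assumptions of this sketch.

```python
from dataclasses import dataclass

@dataclass
class PlayerAction:
    """A game action: a recognized physical motion plus its timing data."""
    player: str
    motion: str             # e.g. "pitch", "swing", "block" (illustrative)
    timestamp: float        # when the motion was detected
    response_window: float  # seconds other players have to respond

def may_respond(action: PlayerAction, now: float) -> bool:
    """True while other players can still react to this action."""
    return now - action.timestamp <= action.response_window
```

The timing-parameter idea from the text maps directly onto `response_window`: a pitch might allow a half-second swing window, a sword thrust less.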
  • After seeing the first user perform an action, a second or subsequent user may have the ability to react to the first user's action with another action. If done correctly and with accurate timing, this subsequent action may take effect and cause the desired result. At step 40, it is determined whether the second or subsequent user performs an action. If not, then at step 50, the first action is completed as directed by the first user.
  • As an alternative to performing no action, the second or subsequent user may perform such an action at step 30. For example, if player one attacks, a second player may block. Also by way of example, if one player pitches a baseball, a second player may swing a bat. Once the game has started, depending upon the game and the implementation of the actions, the order may change with each game.
  • If, at step 40, it was determined that the second or subsequent user performed an action, then, at step 60, it is determined whether this latest action has any effect on the first user's actions. If not, then at step 50, the first action is completed as directed by the first user.
  • If it is determined, at step 60, that the latest action does have an effect on the first user's actions, then, at step 70, it is determined whether the latest action was within the timing parameters or limits set for response. If, at step 70, it is determined that the latest action was outside the timing parameters/limits set for response, then, at step 50, the first action may be completed as directed by the first user.
  • If it was within such timing limits at step 70, then, at step 80, the result of the first action is changed based on the second or subsequent user's latest action. The process described herein may include a pre-set end to a particular action that would compel the action to end. For example, if a user runs out of energy, the user dies in the game. If time runs out, the game is over. If the user is killed with a sword, the user dies. If someone fails to return a ping pong ball, the point is won.
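The step 40 through step 80 decision logic can be condensed into one function. The dictionary shape, the `affects` predicate, and the returned strings are assumptions of this sketch.

```python
def resolve(first_action, counter_action, affects, tolerance):
    """Decide the outcome of a first action given an optional counter-action.

    first_action / counter_action: dicts with a "time" field (seconds);
    affects: predicate saying whether the counter touches the first action;
    tolerance: timing limit for a valid response (seconds).
    """
    # Step 40: did a second or subsequent user act at all?
    if counter_action is None:
        return "first action completes"                # step 50
    # Step 60: does the counter-action affect the first action?
    if not affects(counter_action, first_action):
        return "first action completes"                # step 50
    # Step 70: was the counter-action within the timing limits?
    if counter_action["time"] - first_action["time"] > tolerance:
        return "first action completes"                # step 50
    # Step 80: the counter-action changes the first action's result.
    return "result changed by counter-action"
```

For a pitch-and-swing exchange, a swing inside the tolerance changes the pitch's result (a hit); a late swing lets the pitch complete (a strike).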
  • Referring back to FIG. 1, at step 3, once all factors are assessed, the consequences of said actions of FIG. 1A may take place. For example, one of the characters may lose his health, a user may hit a home run, a user may catch a football pass, or someone may be killed in battle, just to name a few possible consequences.
  • The method described herein may be used with phones running iPhone® or Android® OS software as well as other phones. The method described herein may also be used with mobile applications, certain wireless controllers, desktop computers, laptop computers, iPads® and other tablets and hardware. The method may also be used with other devices that can capture human motion. The present method could be used in places like the stock market exchange to facilitate communication and trading. Many possible other uses also exist.
  • In an alternate embodiment of the present invention, the system described above may be used to track the movements of a single player. For example, if the player holds the device as a golf club, they can practice their swing. The app or hardware could compare the motion data of the swing with standards and tell the player how they did. In some embodiments, the motion detection device could be attached to a golf club during play, and a user may input swing results into the system; this could help teach, or calibrate, the device.
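The single-player swing comparison could score captured accelerometer samples against a stored reference. The mean-absolute-difference metric is an assumption of this sketch; the patent does not specify how the comparison is made.

```python
def swing_error(swing, reference):
    """Mean absolute difference between a captured swing and a reference
    swing, both given as aligned lists of accelerometer readings
    (assumed metric; lower means closer to the reference)."""
    if len(swing) != len(reference):
        raise ValueError("swing and reference must be the same length")
    return sum(abs(a - b) for a, b in zip(swing, reference)) / len(swing)
```

The calibration idea from the text would fit here too: labeled real swings could be averaged into the `reference` list over time.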
  • Other elements that could be used to enhance the experience include HMDs (Head Mounted Displays) as well as various forms of Augmented Reality. These technologies could allow the user to view the existing world while simultaneously overlaying and superimposing images of gameplay effects. As a user swings a golf club, the HMD would have the ability to display a representation of the ball in flight. In a game with magicians, as you wave a wand toward an enemy, you can see both the person you are playing against in real life and the fireball being thrown at her. One way this effect can be accomplished in HMD hardware is with lenses incorporating mirrors and/or semi-transparent mirrors, which allow these computer-generated images to be superimposed onto a real-world view. This would allow other effects, such as simulating different battle scenarios: watching an arrow fly at a target in a cowboys and Indians game, or viewing the muzzle flash and smoke overlaid onto the real scene behind it.
  • It should be understood, of course, that the foregoing relates to exemplary embodiments of the invention and that modifications may be made without departing from the spirit and scope of the invention as set forth in the following claims.

Claims (12)

1. A method for creating a virtual gaming experience, comprising:
establishing a connection between all players of the game and a gaming system, each player being within view of each other player;
receiving input of a player action by the gaming system, the player action pertaining to the virtual gaming experience, wherein the action equates to a physical motion combined with timed responses and/or timing parameters;
determining whether the player action has an effect on a previous action, if any, by another player;
if the player action has an effect on the previous action, if any, by another player, then assessing whether the timing of the response was within timing tolerance limits or parameters; and
determining consequences for said player action and previous action, if any.
2. The method of claim 1, further comprising obtaining directional information on each of the players.
3. The method of claim 1, further comprising creating at least one of a vibration, a sound and a light detectable by the player, the vibration, the sound and the light being responsive to the player action.
4. The method of claim 1, further comprising detecting a location of each player with a global positioning device.
5. The method of claim 1, further comprising recording actions made by each of the players for later analysis.
6. The method of claim 1, further comprising utilizing an augmented reality device to provide the player with gameplay effects.
7. The method of claim 1, further comprising utilizing an HMD (Head or Helmet Mounted Display) to provide the player with gameplay effects.
8. A gaming system comprising:
one or more motion sensors controlled by one or more players; and
a processor adapted to detect motion of the one or more motion sensors and create a player action pertaining to the virtual gaming experience, wherein the player action equates to a physical motion combined with timed responses and/or timing parameters, wherein
the processor determines whether the player action has an effect on a previous action, if any, by another player and if the player action has an effect on the previous action, if any, by another player, then assessing whether the timing of the response was within timing tolerance limits; and
the processor determines consequences for said player action and previous action, if any.
9. The gaming system of claim 8, further comprising one or more effects generators, the effects generators adapted to create at least one of a vibration, a sound and a light.
10. The gaming system of claim 8, wherein the gaming system does not include a separate visual representation of the players.
11. The gaming system of claim 8, further comprising at least one augmented reality device adapted to provide the players with gameplay effects.
12. The gaming system of claim 8, further comprising at least one HMD (Head or Helmet Mounted Device) adapted to provide the players with gameplay effects.
US13/275,212 2011-02-01 2011-10-17 Combining motion capture and timing to create a virtual gaming experience Abandoned US20120196684A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/275,212 US20120196684A1 (en) 2011-02-01 2011-10-17 Combining motion capture and timing to create a virtual gaming experience

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201161438379P 2011-02-01 2011-02-01
US13/275,212 US20120196684A1 (en) 2011-02-01 2011-10-17 Combining motion capture and timing to create a virtual gaming experience

Publications (1)

Publication Number Publication Date
US20120196684A1 true US20120196684A1 (en) 2012-08-02

Family

ID=46577783

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/275,212 Abandoned US20120196684A1 (en) 2011-02-01 2011-10-17 Combining motion capture and timing to create a virtual gaming experience

Country Status (1)

Country Link
US (1) US20120196684A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103617614A (en) * 2013-11-26 2014-03-05 新奥特(北京)视频技术有限公司 Method and system for determining ping-pong ball drop point data in video images
US20140302919A1 (en) * 2013-04-05 2014-10-09 Mark J. Ladd Systems and methods for sensor-based mobile gaming
US20170168556A1 (en) * 2015-12-11 2017-06-15 Disney Enterprises, Inc. Launching virtual objects using a rail device
EP3831454A4 (en) * 2018-07-30 2022-03-30 Sony Interactive Entertainment Inc. Game device, and golf game control method

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6757068B2 (en) * 2000-01-28 2004-06-29 Intersense, Inc. Self-referenced tracking
US20090143882A1 (en) * 2007-12-03 2009-06-04 Julius Young Machine and Method for Caddying and Golf Instruction
US20090158220A1 (en) * 2007-12-17 2009-06-18 Sony Computer Entertainment America Dynamic three-dimensional object mapping for user-defined control device
US20100103075A1 (en) * 2008-10-24 2010-04-29 Yahoo! Inc. Reconfiguring reality using a reality overlay device
US20100306825A1 (en) * 2009-05-27 2010-12-02 Lucid Ventures, Inc. System and method for facilitating user interaction with a simulated object associated with a physical location
US20100302143A1 (en) * 2009-05-27 2010-12-02 Lucid Ventures, Inc. System and method for control of a simulated object that is associated with a physical location in the real world environment
US20110077065A1 (en) * 2009-09-29 2011-03-31 Rudell Design, Llc Game set with wirelessly coupled game units
US20110121953A1 (en) * 2009-11-24 2011-05-26 Immersion Corporation Handheld Computer Interface with Haptic Feedback
US20110151955A1 (en) * 2009-12-23 2011-06-23 Exent Technologies, Ltd. Multi-player augmented reality combat
US20110216002A1 (en) * 2010-03-05 2011-09-08 Sony Computer Entertainment America Llc Calibration of Portable Devices in a Shared Virtual Space


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140302919A1 (en) * 2013-04-05 2014-10-09 Mark J. Ladd Systems and methods for sensor-based mobile gaming
US10092835B2 (en) 2013-04-05 2018-10-09 LyteShot Inc. Systems and methods for sensor-based mobile gaming
CN103617614A (en) * 2013-11-26 2014-03-05 新奥特(北京)视频技术有限公司 Method and system for determining ping-pong ball drop point data in video images
US20170168556A1 (en) * 2015-12-11 2017-06-15 Disney Enterprises, Inc. Launching virtual objects using a rail device
US9904357B2 (en) * 2015-12-11 2018-02-27 Disney Enterprises, Inc. Launching virtual objects using a rail device
EP3831454A4 (en) * 2018-07-30 2022-03-30 Sony Interactive Entertainment Inc. Game device, and golf game control method
US11845003B2 (en) 2018-07-30 2023-12-19 Sony Interactive Entertainment Inc. Game device and golf game control method

Similar Documents

Publication Publication Date Title
US10821347B2 (en) Virtual reality sports training systems and methods
US20240058691A1 (en) Method and system for using sensors of a control device for control of a game
US11826628B2 (en) Virtual reality sports training systems and methods
Miles et al. A review of virtual environments for training in ball sports
JP5965089B1 (en) Screen baseball system competition method
US20210283487A1 (en) Swing alert system and method
US20200376381A1 (en) Posture adjustment method and apparatus, storage medium, and electronic device
US10653945B1 (en) Action or position triggers in a game play mode
CN107469343B (en) Virtual reality interaction method, device and system
JP2021514753A (en) Statistically defined game channels
US20100245365A1 (en) Image generation system, image generation method, and computer program product
KR20090003337A (en) Method for automatically adapting virtual equipment model
JP6889944B2 (en) Game controls, game systems and programs
US20230009354A1 (en) Sporting sensor-based apparatus, system, method, and computer program product
US20120196684A1 (en) Combining motion capture and timing to create a virtual gaming experience
Yeo et al. Augmented learning for sports using wearable head-worn and wrist-worn devices
JP2012101026A (en) Program, information storage medium, game device, and server system
JP5864406B2 (en) GAME DEVICE, GAME CONTROL PROGRAM, AND GAME CONTROL DEVICE
JP6722320B1 (en) Game program, game method, and information terminal device
JP6230132B2 (en) GAME DEVICE, GAME CONTROL PROGRAM, AND GAME CONTROL DEVICE
JP6770603B2 (en) Game programs, game methods, and information terminals
JP6501814B2 (en) Game program, method, and information processing apparatus
RU2719103C1 (en) Method of training young sportsmen in opponent sports
JP2019051372A (en) Game apparatus and game control program
WO2021144847A1 (en) Exercise learning device

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION