NL2014976B1 - Gesture game controlling. - Google Patents

Gesture game controlling.

Info

Publication number
NL2014976B1
Authority
NL
Netherlands
Prior art keywords
interactive system
scene
user
body part
trajectory
Prior art date
Application number
NL2014976A
Other languages
Dutch (nl)
Inventor
Mark Thomas Gertruda Beumers
Jacobus Josephus Adrianus Groen In 't Wout
Original Assignee
Lagotronics Projects B V
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lagotronics Projects B V filed Critical Lagotronics Projects B V
Priority to NL2014976A priority Critical patent/NL2014976B1/en
Priority to PCT/NL2016/050434 priority patent/WO2016204617A2/en
Application granted granted Critical
Publication of NL2014976B1 publication Critical patent/NL2014976B1/en


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20: Input arrangements for video game devices
    • A63F 13/21: Input arrangements characterised by their sensors, purposes or types
    • A63F 13/213: Input arrangements comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F 13/219: Input arrangements for aiming at specific areas on the display, e.g. light-guns
    • A63F 13/23: Input arrangements for interfacing with the game device, e.g. specific interfaces between game controller and console
    • A63F 13/235: Input arrangements for interfacing with the game device using a wireless connection, e.g. infrared or piconet
    • A63F 13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42: Processing input control signals by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention generally relates to a system for interaction of at least one user with a virtual environment. In particular the present invention relates to gesture game controlling to shoot virtual projectiles towards an object displayed on a display screen. In a first aspect of the invention a system is provided for interaction of at least one user with a virtual environment, the system comprising at least one computer device, arranged for processing and outputting a video signal to at least one display screen for displaying the virtual environment and moving at least one object within the virtual environment; at least one human interface device, arranged for generating control signals corresponding to user input from the user and for communicating the control signals towards the at least one computer device for manipulating movement of at least one object on the display screen, characterized in that the at least one human interface device comprises a motion controller device, and in particular an optical motion controller device, arranged for generating the control signals based on position and movement of at least one body part of the user, and wherein the at least one computer device is arranged to display movement of the at least one object on the display screen in a trajectory defined as an extrapolation of a trajectory defined by the position and movement of the at least one body part of the user.

Description

Title: Gesture game controlling
Description
FIELD OF THE INVENTION
The present invention generally relates to an interactive system for interaction of multiple users with multiple objects within a scene. In particular the present invention relates to gesture game controlling to shoot virtual projectiles towards an object displayed on a display screen.
BACKGROUND OF THE INVENTION
Interactive systems are typically known as professional entertainment systems present in theme/amusement parks but also widely and increasingly used at other locations such as public entertainment environments, temporary game locations, at exhibitions or even at home or work locations. Most entertainment systems are non-interactive and only display pre-programmed information which users cannot manipulate or control.
The concept of an interactive system is that users can interact with the scene and that the objects present in the scene can be manipulated. The manipulation is typically performed with a dedicated device known as a game controller. In home use, i.e. small-screen home entertainment systems, these controllers are for example gamepads, mice, or pointing devices such as a laser gun shooter.
Both home and professional entertainment systems are controlled by a processing device such as a dedicated central processing unit, i.e. a computer. The computer receives the input signals from the input devices, e.g. the gamepads, and processes these signals for controlling manipulation of certain physical objects in the scene or of objects of a virtual environment shown on the display.
In professional environments such as an amusement ride of an amusement/theme park the number of players playing in the virtual environment is significantly higher than in home environments. A central computer device, or a cluster of central computer devices, is used to calculate and process all the input and output signals of the system to enable a multi-user interactive environment.
Traditional entertainment systems consist of game controllers that are able to manipulate objects within the virtual environment shown on the screen, such as a projectile shot from a shooter, by relative movement of the projectile. To that end the game controller for example consists of a series of action buttons, e.g. to trigger the launch of the projectile, and a direction controller known as a joy-pad or d-pad, to control the direction of the projectile.
Modern amusement parks are subject to a public that demands amusement rides that are ever more spectacular and offer more realistic virtual environment experiences. Traditional game controllers are not arranged for realistic interactive environment experiences since the control of for example projectiles or other objects is not realistic. Moreover, traditional game controllers such as gamepads are restricted to detecting only a limited amount of input from the user.
SUMMARY OF THE INVENTION
It is an object of the present invention to provide an improved professional interactive system wherein at least some of the above-mentioned drawbacks are removed.
It is a further object of the present invention to provide an improved professional interactive system that can be used as a professional entertainment or infotainment system capable of processing multiple simultaneous users such that they can all interact with objects of the scene.
Yet another object of the present invention is to provide an improved professional interactive system that can be used as a professional entertainment or infotainment system capable of processing multiple simultaneous users such that they can all interact with objects of the scene with an increased realistic experience for the users and a large degree of freedom of control.
In a first aspect of the invention there is provided an interactive system for interaction of at least one user with at least one object within a scene, the system comprising: at least one human interface device arranged for generating control signals corresponding to a user input from the at least one user; at least one computer device arranged for receiving and processing the control signals from the at least one human interface device, and for controlling manipulation of the at least one object within the scene, characterized in that the at least one human interface device is an optical motion sensor controller device arranged for determining motion and generating the control signals for controlling the manipulation of the at least one object within the scene in correspondence with the determined motion, and wherein the manipulation of the at least one object within the scene is further controlled by a determined position of the motion controller device in respect of the at least one object.
Known professional interactive systems for interaction of one or, mostly, a plurality of simultaneous users comprise at least one computer device, one or more (large) screen displays for displaying a virtual environment and at least one human interface device per user for manipulating objects within the virtual environment.
The computer device is the central device within the system; it consists of a single computer or a cluster of computers that performs all calculations, processes all input control signals from the human interface devices and controls manipulation of the objects in the scene in response to the input signals.
The human interface device or devices are arranged to generate the input control signals for the computer device based on input of the user and to communicate these control signals towards the computer device, which calculates in accordance therewith the movement of the object, for example on the display screen. The human interface device of the present invention is a motion controller device, and in particular an optical motion controller device, which is arranged to generate the control signals based on position and movement of at least one body part of the user.
The motion controller is arranged to generate a depth map of the image, i.e. of the scene with the user(s). This can be done in several manners, such as by imaging, e.g. optical motion detection, by radio frequency signal reflection, e.g. RF motion detection, or by laser motion detection. In the present invention reference is made to an optical motion controller, however only by way of example. The invention is not restricted to such optical motion detection only, but is also applicable to radio frequency or laser motion detection, in all examples demonstrated below.
Optical motion detection can for example be performed by capturing the scene and determining differences in depth within the scene on the basis of differences in colour and/or texture. This way objects can be distinguished from their backgrounds and depths can be determined. An alternative method of distinguishing objects and determining depth perspectives is by transmitting light, e.g. invisible (near-)infrared light (or laser light, or even RF signals), towards the scene and its objects. Then a camera or other type of RF or laser sensor can measure the time-of-flight of the light after it reflects off the objects. Since some objects are further away than others, the time-of-flight of the light reflecting from these objects is larger; hence, depth can be determined.
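The time-of-flight principle described above reduces to a one-line computation: the emitted light travels to the object and back, so the one-way depth is half the round-trip distance. A minimal sketch in Python; only the physics is standard, the sensor read-out itself is left abstract:

```python
# Time-of-flight depth: a minimal sketch of the principle described above.
# The round-trip halving and the constant are standard physics; how the
# round-trip time is obtained from the sensor is not modelled here.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_depth(round_trip_seconds: float) -> float:
    """Depth of a reflecting object from its measured round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A reflection measured after ~20 nanoseconds lies about 3 m away.
print(f"{tof_depth(20e-9):.2f} m")  # -> 3.00 m
```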
In yet another alternative manner the (optical) motion controller can be arranged to transmit a light pattern to the scene, i.e. illuminate the scene with a pattern. This pattern is comprised of elements that can be individually distinguished; thus each element is unique in one way or another. This uniqueness can be determined by the shape of the element, e.g. unique characters can be used that all have different shapes, or unique colours can be used wherein each element has the same shape but a unique colour, or it can be determined by the element's orientation or by its positioning with respect to other elements. Examples of such patterns are an image having plural unique characters, plural dots each in a unique colour, or dots with a "random" but unique position within the picture (like the stars at night: each star appears in the sky as a dot and most dots are alike, yet due to the position of a star with respect to other stars one can distinguish the individual stars).
By illuminating the scene and the objects therein, e.g. the user(s), the user (and body parts thereof), other objects and background elements are all constantly illuminated with this pattern of uniquely identifiable elements such as dots, characters or coloured dots. Then the difference between the observed and expected element positions can be determined by a camera placed at an offset relative to the light source, i.e. a (near-)infrared transmitter. The difference can be used to calculate the depth at each element.
The camera can also be a stereoscopic camera setup wherein two cameras are placed at an offset relative to each other and at an offset relative to the light source. Both can distinguish each element transmitted by the infrared transmitter, and on the basis of triangulation the depth can be calculated for each object of the scene, i.e. for each element illuminated thereon.
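For a rectified stereoscopic setup, the triangulation mentioned above reduces to the standard relation depth = focal length × baseline / disparity. A minimal sketch, with hypothetical camera parameters, of how the depth of a single illuminated pattern element could be computed:

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of one pattern element from a rectified stereo pair.

    focal_px     focal length of the cameras, in pixels
    baseline_m   offset between the two cameras, in metres
    disparity_px horizontal shift of the same element between the
                 left and the right image, in pixels
    """
    if disparity_px <= 0:
        raise ValueError("element must be visible in both images")
    return focal_px * baseline_m / disparity_px

# Hypothetical values: 700 px focal length, 12 cm baseline, 28 px disparity.
print(f"{stereo_depth(700.0, 0.12, 28.0):.2f} m")  # -> 3.00 m
```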
The depth information is used to generate a depth map of the scene, which in turn is used to calculate a skeletal model of any objects of interest within the scene, which preferably are the users. Each dot, i.e. illuminated element, of the skeletal model is matched against a large database to determine which body part it belongs to and/or whether it belongs to the background of the scene. Then the dots of the skeletal model are grouped, i.e. clustered, into these body parts such that body parts of the users can be determined from the scene and motion tracking of these body parts, i.e. movement of the clusters of the skeletal model from one captured frame (video still) to another, can be performed.
In this manner motion tracking of the user, and in particular of one or more body parts of the user within the scene, is accomplished. That motion tracking is used to control elements of the scene such as objects physically present therein, or objects shown within a virtual environment on a display screen. For example, the virtual environment can be an interactive shooter game wherein users can shoot a projectile towards a target displayed on the large screen. The motion-captured body part movement determines the trajectory of the projectile: the determined body part movement of the user, i.e. the determined gesture, defines a trajectory, and the object within the virtual environment then moves within the virtual environment in a corresponding trajectory.
Known gesture-based game controlling, however, is only able to control by relative movement. Thus in a shooter game wherein a projectile is fired, the trajectory of the projectile displayed on the screen always starts from the same position. That position is for example a position at the bottom centre of the display, where an end part of a gun or the like is permanently displayed. The user aims at a target somewhere on the screen, and when the fire button or trigger is pushed, the projectile starts its trajectory in a straight line from the bottom centre of the display towards the aimed position on the display, hopefully hitting the target if the aimed position and target position correspond. The trajectory of the gesture then defines the direction and trajectory of the projectile on the large screen, however always starting from the same initial starting point, e.g. from the bottom centre of the large screen where the end part of the gun is displayed.
Such could be sufficient in a single-user environment wherein only one user interacts with the virtual environment, but in a professional environment such as an amusement park ride wherein plural users simultaneously interact with the virtual environment this is insufficient. In case of for example 8 users, these will all fire projectiles either from the same end part of the gun at the bottom centre of the large screen, or 8 different guns have to be displayed, or the large screen has to be divided into 8 individual small screens, each displaying an end part of a gun.
The invention is based on the insight that in order to really satisfy the need for interaction of multiple simultaneous users with a scene or virtual environment, with an increased realistic experience for the users and a large degree of freedom of control, the position of each user in the scene has to be determined and used to define the manipulation of the objects in the scene, and in particular a trajectory of a projectile in the virtual environment starting from the position of the user towards the aimed position.
In for example an amusement park ride a carriage can ride over a track through a scene wherein the users in the carriage can interact with several objects physically placed within the scene or displayed as virtual objects within the virtual environment displayed on a large screen in the scene. Each user then has a different position within the scene and with respect to the object (physical or virtual). Thus when a user sitting in the carriage at a position at the left side of the object fires a projectile by making a gesture with a body part, the trajectory defined by the gesture will define the trajectory towards the object; in particular, as displayed on the screen, it does not start from the bottom centre of the screen but from the left side of the screen, since this is the position of the user relative to the screen. Thus the trajectory of the projectile on the screen is an extrapolation of the trajectory defined by the gesture, which preferably comprises the start point of the gesture, the end point of the gesture, the trajectory of the gesture from start to end point, and the relative position of the body part making the gesture relative to the large display screen. Since all users have different positions in the scene, i.e. relative to the large display screen, each projectile trajectory on the screen starts from a different point, by which the projectiles of the individual users are manipulated, i.e. moved, in a realistic manner and each user can be distinguished.
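As a rough illustration of this insight, the sketch below extrapolates the straight line through the start and end points of a gesture until it reaches the display plane, yielding a per-user entry point on the screen. The coordinate convention (screen in the plane Z = 0, X horizontal, Y vertical, following the axes of Fig. 1) and the numbers are assumptions for illustration, not taken from the patent:

```python
from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]

def screen_entry_point(gesture_start: Vec3, gesture_end: Vec3) -> Optional[Vec3]:
    """Extend the line through the gesture's start and end points until it
    reaches the display plane Z = 0; returns None if the gesture points
    away from the screen.  Both points are body-part positions in scene
    coordinates, i.e. already combined with the controller's position."""
    sx, sy, sz = gesture_start
    ex, ey, ez = gesture_end
    dx, dy, dz = ex - sx, ey - sy, ez - sz
    if dz >= 0:              # hand not moving towards the screen
        return None
    t = -ez / dz             # parameter at which the line reaches Z = 0
    if t < 0:
        return None          # hand already behind the display plane
    return (ex + t * dx, ey + t * dy, 0.0)

# A user 3 m from the screen and 1 m left of its centre: the projectile
# enters the screen left of centre, not at a fixed bottom-centre position.
print(screen_entry_point((-1.0, 1.2, 3.0), (-1.0, 1.25, 2.7)))  # (-1.0, 1.7, 0.0)
```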
In an example the manipulation of the at least one object within the scene is controlled if the at least one object is located in a trajectory defined as an extrapolation of a trajectory defined by the determined position and movement of the at least one body part of the user.
In one example according to the invention the optical motion controller is arranged to manipulate an object in a relative manner, i.e. if the motion controller determines that the body part of the user is moved forward, the object moves forward as well, for example. This example is in particular useful when using a large display screen on which the object can be shown and manipulated, e.g. moved.
In another example according to the invention the object is controlled in a different manner, more in accordance with a shooter game. If the object is for example a target object within the shooter game, the body part can act as a gun. When the gun, as determined by the optical motion controller, is aimed at the object, e.g. moved towards the direction of the object, the shot can be counted as a hit. To determine if the object is correctly aimed at, a trajectory is determined from the movement of the body part. From that trajectory an extrapolated projectile trajectory is determined and the computer calculates if the object is within that projectile trajectory. If that is the case, the shot can be considered a hit.
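A minimal sketch of such a hit test: the gesture is extrapolated into a ray, and the shot counts as a hit when the target lies within some tolerance of that ray. The tolerance, standing in for the target's size, and the point representation are assumptions for illustration:

```python
import math
from typing import Tuple

Vec3 = Tuple[float, float, float]

def is_hit(start: Vec3, end: Vec3, target: Vec3, hit_radius: float) -> bool:
    """True if the target lies within hit_radius of the ray obtained by
    extrapolating the gesture from its start point through its end point."""
    d = tuple(e - s for s, e in zip(start, end))        # gesture direction
    v = tuple(t - e for e, t in zip(end, target))       # end point -> target
    d_len2 = sum(c * c for c in d)
    if d_len2 == 0.0:
        return False                                    # no movement, no shot
    t = sum(dc * vc for dc, vc in zip(d, v)) / d_len2   # project onto the ray
    if t < 0.0:
        return False                                    # target behind the gesture
    closest = tuple(e + t * dc for e, dc in zip(end, d))
    return math.dist(closest, target) <= hit_radius

# Aiming straight ahead at a target roughly 5 m in front of the hand:
print(is_hit((0, 1, 3), (0, 1, 2.7), (0.1, 1.0, -2.0), hit_radius=0.3))  # True
```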
In an example the at least one object within the scene is a physical object.
In an example the at least one object within the scene is a virtual object within a virtual environment displayed on at least one display screen.
In an example the manipulation comprises providing any one or more of the group consisting of: a visual signal, an audio signal, change of position of the at least one object, change of appearance of the at least one object.
As indicated, the object can be a physical object located somewhere within the scene, or a virtual object displayed on a display screen. In the first option, the manipulation of the object is to be understood as for example an audio signal or a visual signal such as a blinking light by which the object indicates that it has been hit. In the other option, of a virtual object, manipulation is to be understood in a broad sense: the virtual object can for example explode, move, fall over or whatever is suitable in the game setting in which it is used.
In an example the at least one computer device is arranged to display a projectile on the at least one display screen in a projectile trajectory defined by the determined position and movement of the at least one body part of the user.
In yet another example of a shooter game, the object can be a target object and a further object can be displayed on the display screen in the form of a projectile that is launched towards the target object. That projectile is displayed on the screen in a trajectory that corresponds to the position and movement of the body part.
In an example the control signal comprises a trigger signal for triggering manipulation of the at least one object, in particular generated by at least one of the group comprising: a gun trigger, a fire button and a pull cord.
The control signals comprise the movement of the body part and the position of the motion controller with respect to the object or the large display screen. The control signals can further comprise a trigger signal to trigger the firing of the projectile. This can be done by a separate trigger unit, by a trigger unit on a hand-held device, e.g. a gun trigger, fire button or pull cord, or by a particular movement of the body or body part that is pre-defined as being the trigger gesture.
In an example the at least one computer device is arranged to determine the position of the at least one body part by determining a position of the optical motion controller device in respect of the object, by receiving an identification value comprised in the control signals and determining a corresponding pre-defined position stored within a memory of the at least one computer device.
The system preferably comprises multiple optical motion controller devices, for example 1 per user, or 1 per 2 users, or 1 per 4 or 8 users, etc. The optical motion controller can determine movement of a body part or multiple body parts, and the computer device can calculate the trajectory of the projectile therefrom by determining the position of the optical motion controller device, for example by a position table stored in a memory of the computer device wherein each motion controller device is identified by a unique identification value and a corresponding position in the scene, i.e. relative to the large screen or the physical object. When the computer device receives control signals from the optical motion controller device it can distinguish each optical motion controller device by an identification value comprised in the control signals. The computer device can then access the memory to determine which position belongs to that motion controller device, as sketched below.
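A minimal sketch of such a lookup; the position table contents and the control-signal field name are hypothetical:

```python
from typing import Dict, Tuple

Vec3 = Tuple[float, float, float]

# Hypothetical position table: each optical motion controller device is
# identified by a unique ID mapped to its position in the scene, expressed
# relative to the large screen (X, Y, Z in metres).
CONTROLLER_POSITIONS: Dict[int, Vec3] = {
    0x01: (-1.5, 0.0, 3.0),   # leftmost seat of the carriage
    0x02: (-0.5, 0.0, 3.0),
    0x03: ( 0.5, 0.0, 3.0),
    0x04: ( 1.5, 0.0, 3.0),   # rightmost seat
}

def controller_position(control_signal: dict) -> Vec3:
    """Resolve the sender's scene position from the identification value
    comprised in the control signals (field name is an assumption)."""
    return CONTROLLER_POSITIONS[control_signal["controller_id"]]

print(controller_position({"controller_id": 0x02}))  # -> (-0.5, 0.0, 3.0)
```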
In an example the at least one computer device is arranged to determine the position of the at least one body part by determining a position of the optical motion controller device in respect of the object and a position of the at least one body part of the user in respect of the optical motion controller device.
If the computer device can determine the relative position of each optical motion controller device with respect to the large display screen or the physical object, and each optical motion controller device can determine the relative position of the body part with respect to the controller, the computer can determine the relative position of the user and the body part with respect to the large display screen or the object on the basis of the sum of both positions.
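Once both positions are expressed in the same coordinate frame, that sum is a plain vector addition; a one-function sketch:

```python
from typing import Tuple

Vec3 = Tuple[float, float, float]

def body_part_in_scene(controller_pos: Vec3, body_part_rel: Vec3) -> Vec3:
    """Body-part position relative to the screen/object: the controller's
    position plus the body part's position relative to the controller."""
    cx, cy, cz = controller_pos
    bx, by, bz = body_part_rel
    return (cx + bx, cy + by, cz + bz)

# Controller 1.5 m left of the screen centre, hand 0.4 m right of it:
print(body_part_in_scene((-1.5, 0.0, 3.0), (0.4, 1.1, -0.2)))  # (-1.1, 1.1, 2.8)
```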
In an example the at least one computer device is arranged to determine the position of the at least one body part in respect of the object by determining a position of a visual and/or RF marker provided on the optical motion controller device.
In an example the at least one computer device is arranged to determine the position of the at least one body part in respect of the object by determining a position of a visual and/or RF marker provided on the user.
In an example the at least one computer device is arranged to determine the position of the at least one body part in respect of the object by determining a position of a visual and/or RF marker provided on the body part of the user.
In an alternative, the computer device can also determine the position by visual and/or RF recognition of the optical motion controller device, or in particular of the user or the body part thereof. This has the advantage that the optical motion controller device does not have to be stationary and can be moved within the scene, since the computer device can determine the current position of each motion controller device at any time.
The computer device to which the optical motion controllers are connected preferably knows the position of the individual controllers and knows the position of the display or the physical objects, for example by a predefined display location position and a predefined controller position location stored in the computer or remotely. In an alternative example, one or both of the positions of the controller and the display or objects are determined at certain (calibration) moments in time, for example when a carriage of a ride enters the scene where the users can see the display and can start the game. As a second alternative, the calibration methods of determining one of the controller and display or object positions (or both) can be performed on a real-time or near real-time basis.
In an example the interactive system is arranged for stereoscopy, and in particular wherein the system comprises at least one pair of 3D glasses for the at least one user, and wherein the display screen is arranged to display the virtual environment with a depth perspective.
In another example multiple dimensions are added. Thus, not only the X and Y dimensions, but also the Z dimension, e.g. the Z position, being the distance from display to controller, is a variable in manipulating the object (projectile). Multiple-dimension based input determination is in particular more realistic if a three-dimensional representation is used. Preferably, the optical motion controller device is arranged not only for X, Y, and Z position and movement, i.e. motion, but also for so-called six degrees of freedom motion detection, which refers to the freedom of movement of the body part in three-dimensional space. Specifically, the detected body parts are free to move forward/backward (Z), up/down (Y) and left/right (X) (translation along three perpendicular axes), combined with rotation about three perpendicular axes, known as pitch, yaw, and roll. As such, the system, i.e. computer, and display are arranged for 3D. The trajectory of the projectile can then be displayed as a 3D representation of the projectile starting from the body part, in an extrapolated trajectory towards a certain target location on the large screen, all in 3D. The person skilled in the art will understand what 3D display methods are applicable, such as a stereoscopic projector, a large screen 3D display, etc. The system can also be arranged for both 2D and 3D representation, by use of a 3D capable display or projector that can switch between 2D and 3D, for example by turning off a single lamp in the projector.
In an example the optical motion controller device is arranged for wireless communication with the at least one computer via one or more of the group comprising Bluetooth, Wifi, Ethernet IEEE 802.3, Zigbee, RS422, RS485 and CAN.
The computer is in communicative connection with the optical motion controller device, either wired or wireless. The person skilled in the art will understand which wired or wireless communication standards are applicable, e.g. peer-to-peer, Ethernet, 802.11 wireless, 802.15 wireless PAN, Bluetooth, IEEE 802.3, Zigbee, RS422, RS485 and CAN.
In an example the interactive system comprises at least 2 optical motion controller devices, and in particular at least 4, more in particular at least 8, and even more in particular at least 16 or at least 32.
In an example each of the optical motion controller devices is arranged to generate the control signals based on movement of the body part or body parts of at least 1 user, or at least 2 users, and in particular at least 4 users, and more in particular at least 8, and even more in particular at least 16 or at least 32 users simultaneously.
In an example each of the motion controller devices is arranged to generate the control signals based on position and movement of at least one body part of at least 2 users, and in particular at least 4 users, and more in particular at least 8, and even more in particular at least 16 or at least 32 users simultaneously.
In an example the scene is a scene of a shooter game, and wherein the at least one object is a target object to be shot with a projectile.
In an example the at least one computer device is arranged to communicate with a plurality of optical motion controller devices and wherein each respective position thereof is determined by the at least one computer device.
In an example a speed of movement of the at least one body part is determined by the optical motion controller device and wherein a speed of movement of the at least one object on the display screen corresponds to the determined speed of movement of the at least one body part.
In a second aspect of the invention there is provided an optical motion controller device arranged to be used as a human interface device in an interactive system for interaction of multiple users with a scene according to any of the previous descriptions.
In a third aspect of the invention there is provided a computer device arranged to be used as a computer in an interactive system for interaction of multiple users with a scene according to any of the previous descriptions.
In a fourth aspect of the invention there is provided an interactive amusement ride comprising a scene and an interactive system according to any of the previous descriptions, and wherein the interactive amusement ride in particular comprises a track and at least one car or carriage for moving users over the track, wherein the scene, and in particular the car or carriage, comprises a plurality of the optical motion controller devices.
As indicated above, all examples of the present invention can be applied in a plurality of different virtual environments. The preferred example or embodiment is a virtual environment such as a virtual game environment, and in particular a shooter game. The projectile of such a shooter game is thus the projectile used in the shooter: that could be a bullet from a gun, an arrow from a bow, a javelin, a spear, etc. The projectile can also be a snowball from a snowball cannon in a snowball shooter game as available from the applicant of the present invention.
In a preferred example of the invention the scene comprises a virtual environment having a large screen display. This is, however, only by way of example. The invention also applies to static displays, i.e. a scene in which the objects are physical objects, e.g. the targets of the shooter game, and the system is comprised of one or more computers, one or more optical motion controller devices and one or more physical target objects positioned somewhere in the scene. The trajectory of the projectile is then not displayed on a screen, due to the absence of the screen, but the system is still arranged to determine a virtual trajectory as an extrapolation of the trajectory defined by the gesture. The system, i.e. the computer thereof, then determines the extrapolation of the trajectory defined by the gesture, the position of the optical motion controller device and the position of the physical target object, and then calculates whether the projectile hits or misses the target object. The user, i.e. player, is informed of a hit or miss by visual and/or audio and/or force feedback information on the optical motion controller device or on an additional device such as a scoreboard, speaker, etc.
The above-mentioned and other features and advantages of the invention are illustrated in the appended figures and detailed description which are provided by way of illustration only and which are not limitative to the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 shows a setup of a system according to a first aspect of the invention with a computer, large screen display and multiple users and optical motion controller devices.
Figure 2 shows an illustration according to an example of the invention of a trajectory of a projectile defined by a gesture detected by the optical motion controller device and the corresponding trajectory of the projectile on the large screen display.
Figure 3 shows other illustrations according to an example of the invention of trajectories of a projectile defined by detected gestures.
DETAILED DESCRIPTION OF THE DRAWINGS
In the subsequent description of the drawings reference is made to the following:
100 Scene
110 Large screen display
111 Active area of large screen display
120 Computer device
121 Wired communication with large screen display
122 Wired communication with projector
131 First user
132 Second user
141 First target object in virtual environment
142 Second target object in virtual environment
143 Third object in virtual environment
151 First body part of first user
152 First body part of second user
153 Optical motion sensor controller device
160 Projector arrangement
161 First projector of stereoscopic projector arrangement
162 Second projector of stereoscopic projector arrangement
171 Camera system
181, 182 Stereo audio speaker system
191 Fan
200 Example of a trajectory on a large display screen
211 First start point
212 Second end point
213 Trajectory
261 Trajectory start point on large display screen
262 Trajectory end point on large display screen
263 Trajectory on large display screen
271 Virtual extrapolated trajectory
300 Another example of a trajectory on a large display screen
311 First start point
312 Second end point
313 Trajectory
321 Trajectory start point on large display screen
322 Trajectory end point on large display screen
323 Trajectory on large display screen
331 Virtual extrapolated trajectory
371 First start point
372 Second end point
373 Trajectory
381 First movement
382 Second movement
X Horizontal axis of large display screen
Y Vertical axis of large display screen
Z Depth axis of large display screen
In Fig. 1 a scene 100 is illustrated in accordance with a first aspect of the invention. The scene 100 is in this example an interactive stationary ride for multiple users, in which users 131, 132 can interact with and manipulate objects 141, 142, 143 in the virtual environment. The interactive virtual environment illustrated in this example is a shooter game, and in particular a game wherein multiple users 131, 132 can each manipulate the objects within the virtual environment on the basis of detected gestures, i.e. position, orientation and motion of their body parts 151, 152. The game illustrated here is a game wherein snowballs are fired by motion of the body parts 151, 152, for example as if they would start from the body, or the body part in particular.
The first user 131 manipulates the object 141 or objects 141, 142, 143 on the screen by moving his or her arm 151. The movement, i.e. motion, defines a trajectory that corresponds with either the trajectory of the object moving on the screen, or with a projectile trajectory towards the object. Thus, in one example the gesture is detected and used to manipulate the object, i.e. as a target object, and in the other example the gesture is detected and used to manipulate a projectile towards the object, i.e. the target object. In the latter, there are two objects to be recognised, one being the target object, the other being the projectile, e.g. a snowball.
Thus, in case the gesture trajectory corresponds to the projectile trajectory, the trajectory of the projectile, e.g. the snowball, is shown on the large screen display 110, and in particular on the active part of the display 111, towards a first target object 141 of the plurality of objects 141, 142, 143 of the virtual environment. Moreover, the snowball trajectory is not only defined by relative movement, i.e. the movement of the body part in relation to the motion sensor 153, but also by the absolute movement, i.e. by the sum of the movement of the body part in relation to the motion sensor 153 and the position of the motion sensor 153 in relation to the large screen display 110 or particular objects, hence its absolute position within the scene 100.
Fig. 1 further shows a fan 191 which can be activated to create a flow of air, and a computer device 120. In this example a single computer device is shown; however, the computer can also be comprised of a plurality of computers, for example a cluster of computers, whatever is sufficient to process all the data of the system. The computer device 120 can be attached to an active large screen display 110, for example a large computer screen or large-size television. In the example of Fig. 1, however, the large screen display is a passive display and the actual image of the virtual environment on the screen is generated by a projector arrangement 160. This can either be a single mono projector for displaying a two-dimensional, 2D, virtual environment, wherein only a single projector 161 and corresponding lens of the projector arrangement 160 is used, or a three-dimensional, 3D, virtual environment can be displayed, wherein two projectors 161 and 162 and corresponding lenses are used to generate a stereoscopic image with depth perspective. In case of a projector 160 that produces the images, the projector will be connected with the computer device 120 via wired communication 122. As the skilled person will understand, the present invention is not restricted to the form of communication, i.e. wired or wireless, and is applicable to both forms and implementations thereof.
The scene 100 of Fig. 1 further shows additional speakers 181, 182 which are arranged to add an audio component to the virtual environment. The scene 100 of Fig. 1 shows, by way of example, a two-speaker set-up. The invention, as the skilled person will understand, is not restricted to merely a two speaker stereo setup, and is applicable to all audio set-ups, i.e. mono, stereo, and surround sound setups.
Fig. 1 further shows a camera system 171 which can for example be used to record the users 131, 132 and to determine for example whether or not a pointing device 151, 152 should be enabled or disabled in the game when a user 131, 132 is detected who is operating the pointing device. The camera system 171 can further be used for example to trigger the start of the game upon detection of movement of any person within the scene, or to recognise users, or to extract images of the users such that these images can be used in the virtual environment, e.g. as avatars.
Known game controllers are only able to control the aspects of a game or virtual environment by relative movement. Thus in the snowball shooter game according to Fig. 1, wherein a projectile is fired in the form of a snowball, the trajectory of the snowball displayed on the screen always starts from the same position. That position is for example a position at the bottom centre of the display, where an end part of a gun, or applicable here, a snowball cannon, is permanently displayed. The user aims at a target somewhere on the screen, and when the fire button or trigger is pushed, the snowball starts its trajectory in a straight line from the bottom centre of the display, at the end of the cannon, towards the aimed position on the display, hopefully hitting the target, e.g. target object 141, if the aimed position and target object position 141 correspond.
With a motion detection system 151, 152, 153 according to the invention, the trajectory is different, since the starting position of the snowball on the display is not static but corresponds to the relative position between the users 131, 132, or at least their body parts 151, 152, and the display 110. If the body part 151 is located at a certain distance away from the display, illustrated by a position on the Z depth axis of the large display screen 110 as illustrated in Fig. 1, but positioned at the centre of the display, i.e. in the centre of the width of the display and thus at the origin of the X horizontal axis of the large screen display 110 as illustrated in Fig. 1, the trajectory of the projectile starts at the middle of the display, from the bottom towards the aimed position on the display, e.g. target object 141. But if the body part 151 is located at the right side of the display 110, thus at a position away from the origin on the X horizontal axis, the trajectory of the snowball starts from the bottom right corner of the screen 110. In this way, the motion detection system 153 is able to manipulate an object in a virtual environment, e.g. a trajectory of a snowball in a snowball shooter game or the target object, not only on the basis of the aimed direction but also on the basis of the relative position of the body part 151 in relation to the display 110.
The motion sensor device 153 is in particular arranged to define or calculate a trajectory of the projectile by a first, starting position of the body part, a second, end position of the body part, and a trajectory defined from the first to the second position.
The example shown in Fig. 1 is an example wherein the scene 100 comprises a large screen display 110 connected to the computer device 120. On the display the objects 141, 142, 143 are shown as virtual objects within a virtual environment. The invention however also relates to a scene wherein the objects are physical objects within the scene 100. These objects can be target objects towards which the users should aim the projectiles. If the projectile hits the target object, the object interacts by for example an audio signal, a visual signal or the like. The interaction can also be in the form of an additional score board or other device from which the user can determine if the shot was a hit or miss.
In all examples of the invention, the computer device 120 to which the motion sensor device 153 is connected knows the position of each body part 151, 152 and knows the position of the display 110, for example by a predefined device location position value stored locally or remotely. In an alternative, one or both of the positions of the body parts 151, 152 and the display 110 are determined at certain (calibration) moments in time, for example when a carriage of a ride (not shown) enters the scene where the users 131, 132 can see the display 110 and can start the game. As a second alternative, the calibration methods of determining the positions of the body parts 151, 152 and/or the display 110 can be performed on a real-time or near real-time basis.
Thus, in an example wherein the motion sensor device 153 is arranged to define or calculate a trajectory of the projectile, that trajectory is defined by the first position of the body part 151, the second position thereof and the trajectory from first to second. The computer device then calculates a further virtual trajectory based on the trajectory determined thereby. The further virtual trajectory is thus an extrapolation of the determined trajectory. If the position of the display is somewhere within that further virtual trajectory, the projectile is shown on the display. If, however, the extrapolated trajectory does not cross the display, the projectile is not shown on the display. In the first option, wherein the display is in the extrapolated trajectory, the computer device can determine on the basis of the position of the display and the defined virtual trajectory, i.e. the extrapolated trajectory, how the projectile should be displayed: what the starting position of the projectile on the display would have to be, and what trajectory the projectile would then follow from there on. As such, if the trajectory on the display, i.e. as an extrapolation of the trajectory defined by the gesture, via the virtual trajectory between the body part and display also as an extrapolation of the first, crosses the target on the display, the computer can count the "shot" as a "hit".
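A minimal sketch of that decision, assuming the entry point on the display plane has already been computed by extrapolation (e.g. as in the earlier screen_entry_point sketch) and that the display's active area is a rectangle centred on the X axis with its lower edge at Y = 0; this layout is an assumption for illustration, not taken from the patent:

```python
from typing import Optional, Tuple

Vec2 = Tuple[float, float]

def projectile_on_display(entry_xy: Optional[Vec2],
                          display_width: float,
                          display_height: float) -> bool:
    """True if the extrapolated trajectory crosses the display, i.e. its
    entry point on the display plane falls within the active area; a None
    entry point means the gesture pointed away from the screen."""
    if entry_xy is None:
        return False
    x, y = entry_xy
    return abs(x) <= display_width / 2 and 0.0 <= y <= display_height

# A 6 m x 3 m screen: the first entry point is on screen, so the projectile
# is drawn starting there; the second misses the screen and is not shown.
print(projectile_on_display((-1.0, 1.7), 6.0, 3.0))  # True
print(projectile_on_display((4.0, 1.7), 6.0, 3.0))   # False
```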
In Fig. 2 and 3 the different trajectories are shown. In Fig. 2 the large screen display 110, with active area 111, shows (besides other elements of the game) two objects 141, 142 that can be manipulated. These are the targets that are aimed at by the two users 131, 132. The first user 131 with first body part 151 is on the left side of the screen 110, thus, as shown in Fig. 1, on the left side of the X-axis. That first user 131 generates a trajectory 211-212-213 with his hand, i.e. the body part 151. A first start point 211, a second end point 212 and the trajectory 213 from first to second define the trajectory. These variables are communicated to the computer device, which adds the location position variable/value thereto, and on the basis thereof, as well as on the basis of the position of the display 110, the virtual trajectory 271 is calculated. That virtual trajectory 271 is an extrapolation of the trajectory 211-212-213. The virtual trajectory, i.e. the extrapolated trajectory 213, continues on the screen 110 at start point 261, and the trajectory 263 is further continued, as an extrapolation of the trajectory 211-212-213 and the virtual trajectory 271, towards the end point 262 on the screen. If trajectory 263 displayed on the screen 110 crosses a target object 141, 142, a hit is counted; otherwise, the shot is counted as a miss.
In Fig. 3 yet another trajectory is shown. In this figure the trajectory 311-312-313 of the body part, i.e. the movement of the hand 151, is not a straight line but is curved 381. The extrapolation 331 of that trajectory 311-312-313 is thus also curved, with a corresponding radius. In the example shown here this results in a miss of the target object 141, since the extrapolation 331 of the trajectory as displayed on the screen 110 starts at point 321 and runs towards end point 322 via the displayed trajectory 323. This trajectory 323 does not cross a target object 141 and thus the shot is registered by the computer as a missed shot.
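One simple way to extrapolate a curved gesture with a corresponding curvature is to fit a low-order polynomial to the sampled hand positions and evaluate it beyond the last sample. The quadratic model below is an illustrative choice under that assumption, not the patent's prescribed method:

```python
import numpy as np

def extend_curved_path(points: np.ndarray, steps_ahead: int) -> np.ndarray:
    """Extrapolate a curved gesture: fit a quadratic to the sampled hand
    positions (shape (n, 2), n >= 3) and evaluate it for the next
    steps_ahead frames, so the extension keeps curving the same way."""
    n = len(points)
    t = np.arange(n)
    cx = np.polyfit(t, points[:, 0], 2)   # quadratic fit of x over time
    cy = np.polyfit(t, points[:, 1], 2)   # quadratic fit of y over time
    t_future = np.arange(n, n + steps_ahead)
    return np.column_stack((np.polyval(cx, t_future), np.polyval(cy, t_future)))

# A hand path curving to the right keeps curving in the extrapolation.
path = np.array([[0.0, 0.0], [0.1, 0.3], [0.3, 0.55], [0.6, 0.75]])
print(extend_curved_path(path, 3))
```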
The figure also shows an example of relative movement wherein the position of the hand 151 in relation to the display 110 is not relevant and is not determined. In this situation the movement 382 of the hand from starting position 371 towards end position 372 defines a trajectory 373. That trajectory defines the control signal and thus the trajectory in which the object is moved on the screen, i.e. the trajectory 323 on the screen from start position 321 towards end position 322.
The skilled person will appreciate that the invention has been described in the foregoing with reference to the described examples. A skilled person may provide modifications and additions to the examples disclosed, which modifications and additions are all comprised by the scope of the appended claims.

Claims (22)

1. Een interactief systeem voor interactie van ten minste één gebruiker met ten minste één object binnen een scène, het systeem omvattende: ten minste één human interface-inrichting ingericht voor het genereren van stuursignalen overeenkomstig een gebruikersinvoer van de ten minste ene gebruiker; ten minste één computerinrichting ingericht voor het ontvangen en verwerken van de stuursignalen van de ten minste ene human interface-inrichting en voor het regelen van manipulatie van het ten minste ene object binnen de scène, met het kenmerk dat, de ten minste ene human interface-inrichting een bewegingssensorregelinrichting betreft, en in het bijzonder een optische bewegingssensorregelinrichting, ingericht voor het vaststellen van beweging en voor het genereren van de stuursignalen voor het regelen van de manipulatie van het ten minste ene object binnen de scène overeenkomstig de vastgestelde beweging, en waarbij de manipulatie van het ten minste ene object binnen de scène verder geregeld wordt door een vastgestelde positie van de ten minste ene bewegingsbesturingsinrichting ten opzichte van het ten minste ene object.An interactive system for interacting at least one user with at least one object within a scene, the system comprising: at least one human interface device adapted to generate control signals in accordance with a user input of the at least one user; at least one computer device adapted to receive and process the control signals from the at least one human interface device and to control manipulation of the at least one object within the scene, characterized in that the at least one human interface device device is a motion sensor control device, and in particular an optical motion sensor control device, arranged for determining movement and for generating the control signals for controlling the manipulation of the at least one object within the scene in accordance with the determined movement, and wherein the manipulation of the at least one object within the scene is further controlled by a predetermined position of the at least one motion control device relative to the at least one object. 2. Het interactieve systeem volgens conclusie 1, waarbij de manipulatie van het ten minste ene object binnen de scène aangestuurd wordt wanneer het ten minste ene object zich bevindt in een traject gedefinieerd als een extrapolatie van een traject gedefinieerd door de vastgestelde positie en beweging van het ten minste ene lichaamsdeel van de gebruiker.The interactive system of claim 1, wherein the manipulation of the at least one object within the scene is controlled when the at least one object is in a trajectory defined as an extrapolation of a trajectory defined by the determined position and movement of the at least one body part of the user. 3. Het interactieve systeem volgens conclusie 1 of 2, waarbij het ten minste ene object binnen de scène een fysiek object is.The interactive system according to claim 1 or 2, wherein the at least one object within the scene is a physical object. 4. Het interactieve systeem volgens conclusie 1 of 2 waarbij het ten minste ene object binnen de scène een virtueel object is binnen een virtuele omgeving dat op ten minste een beeldscherm wordt weergegeven.The interactive system according to claim 1 or 2, wherein the at least one object within the scene is a virtual object within a virtual environment that is displayed on at least one screen. 5. 
The interactive system according to any of the preceding claims, the manipulation comprising providing one or more of the group consisting of: a visual signal, an audio signal, a change of position of the at least one object, and a change of appearance of the at least one object.

6. The interactive system according to any of the preceding claims 4 or 5, wherein the at least one computer device is adapted to display a projectile on the at least one screen along a projectile trajectory defined by the determined position and movement of the at least one body part of the user.

7. The interactive system according to any of the preceding claims, wherein the control signal comprises a trigger signal for triggering manipulation of the at least one object, generated in particular by one or more of the group comprising: a gun trigger, a fire button and a pull cord.

8. The interactive system according to any of the preceding claims, wherein the at least one computer device is adapted to determine the position of the at least one body part by determining a position of the optical motion control device relative to the object, by receiving an identification value included in the control signals and determining a corresponding predetermined position stored in a memory of the at least one computer device.

9. The interactive system according to any of the preceding claims, wherein the at least one computer device is adapted to determine the position of the at least one body part by determining a position of the optical motion control device relative to the object and a position of the at least one body part of the user relative to the optical motion control device.

10. The interactive system according to any of the preceding claims, wherein the at least one computer device is adapted to determine the position of the at least one body part relative to the object by determining a position of a visual and/or high-frequency marker provided on the optical motion control device.

11. The interactive system according to any of the preceding claims, wherein the at least one computer device is adapted to determine the position of the at least one body part relative to the object by determining a position of a visual and/or high-frequency marker provided on the user.

12. The interactive system according to any of the preceding claims, wherein the at least one computer device is adapted to determine the position of the at least one body part relative to the object by determining a position of a visual and/or high-frequency marker provided on the body part of the user.

13. The interactive system according to any of the preceding claims, wherein the interactive system is arranged for stereoscopy, and in particular wherein the system comprises at least one pair of 3D glasses for the at least one user, and wherein the display is arranged for displaying a virtual environment with a depth perspective.

14. The interactive system according to any of the preceding claims, wherein the optical motion control device is adapted for wireless communication with at least one computer via one or more of the group comprising Bluetooth, WiFi, Ethernet IEEE 802.3, ZigBee, RS422, RS485 and CAN.

15. The interactive system according to any of the preceding claims, wherein the interactive system comprises at least 2 optical motion control devices, and in particular at least 4, more in particular at least 8, and even more in particular at least 16 or at least 32.

16. The interactive system according to any of the preceding claims, wherein each of the motion control devices is adapted to simultaneously generate the control signals for positions and movements of at least one body part of at least 2 users, and in particular at least 4 users, more in particular at least 8, and even more in particular at least 16 or at least 32 users.

17. The interactive system according to any of the preceding claims, wherein the scene is a scene of a shooting game, and wherein the at least one object is a target object that can be shot at with a projectile.

18. The interactive system according to any of the preceding claims, wherein the at least one computer device is adapted to communicate with a plurality of optical motion control devices and wherein each respective position thereof is determined by the at least one computer device.

19. The interactive system according to any of the preceding claims, wherein a speed of movement of the at least one body part is determined by the optical motion control device and wherein a speed of movement of the at least one object on the display corresponds to the determined speed of movement of the at least one body part.

20. An optical motion control device adapted to be used as a human interface device in an interactive system for interaction of multiple users with a scene according to any of the preceding claims 1-19.

21. A computer device adapted to be used as a computer in an interactive system for interaction of multiple users with a scene according to any of the preceding claims 1-19.

22. An interactive amusement attraction comprising a scene and an interactive system according to any of the preceding claims 1-18, wherein the interactive amusement attraction in particular comprises a track and at least one trolley or cart for moving the users along the track, and wherein the scene, and in particular the trolley or cart, comprises a plurality of the optical motion control devices.
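Claims 6 and 19 tie an on-screen projectile to the determined position and movement of the user's body part: the projectile follows a trajectory defined by that measurement, and its on-screen speed corresponds to the measured speed. A minimal sketch of such a mapping, assuming a simple ballistic model, is given below; Vec3, sample_trajectory and the gravity constant are illustrative names chosen for this example, not taken from the patent.

    from dataclasses import dataclass

    @dataclass
    class Vec3:
        x: float
        y: float
        z: float

        def scaled(self, s: float) -> "Vec3":
            return Vec3(self.x * s, self.y * s, self.z * s)

        def plus(self, other: "Vec3") -> "Vec3":
            return Vec3(self.x + other.x, self.y + other.y, self.z + other.z)

    GRAVITY = Vec3(0.0, -9.81, 0.0)  # assumed ballistic model, not specified by the patent

    def sample_trajectory(launch_pos: Vec3, body_velocity: Vec3,
                          duration: float, steps: int) -> list:
        """Sample a projectile path whose initial velocity is taken directly
        from the measured body-part movement, so that on-screen speed
        corresponds to the determined speed (cf. claim 19)."""
        points = []
        for i in range(1, steps + 1):
            t = duration * i / steps
            # p(t) = p0 + v * t + 0.5 * g * t^2
            p = launch_pos.plus(body_velocity.scaled(t)).plus(
                GRAVITY.scaled(0.5 * t * t))
            points.append(p)
        return points

A game loop would feed each sampled point to the renderer until the trajectory intersects a target object of the kind named in claim 17.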
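Claim 8 resolves the controller's position from an identification value carried in the control signals, looked up against predetermined positions held in memory, which suits installations where each device has a fixed mounting point. The table contents and signal layout in the sketch below are assumptions made for the example.

    # One fixed mounting position per device id, e.g. one entry per seat on a
    # ride vehicle; ids and coordinates here are invented for illustration.
    PREDETERMINED_POSITIONS = {
        0x01: (0.0, 1.2, 0.5),
        0x02: (0.6, 1.2, 0.5),
    }

    def position_from_control_signal(signal: dict) -> tuple:
        """Resolve the device position from the identification value
        included in the control signal (cf. claim 8)."""
        device_id = signal["id"]
        try:
            return PREDETERMINED_POSITIONS[device_id]
        except KeyError:
            raise ValueError(f"unknown device id {device_id:#x}") from None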
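Claims 9 to 12 locate the body part relative to the object by chaining marker-derived relations: object to controller, then controller to body part. The sketch below composes the two offsets; the yaw-only rotation and plain-tuple poses are simplifications chosen for brevity, not the patent's method.

    import math

    def rotate_yaw(v: tuple, yaw_rad: float) -> tuple:
        """Rotate a 3D offset about the vertical axis (simplified pose model)."""
        x, y, z = v
        c, s = math.cos(yaw_rad), math.sin(yaw_rad)
        return (c * x + s * z, y, -s * x + c * z)

    def body_part_rel_object(controller_pos: tuple, controller_yaw: float,
                             body_part_rel_controller: tuple) -> tuple:
        """Chain the two measured relations of claim 9: the controller's
        marker-derived pose relative to the object, then the body part's
        offset relative to the controller."""
        off = rotate_yaw(body_part_rel_controller, controller_yaw)
        return tuple(p + o for p, o in zip(controller_pos, off))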
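Claim 16 has each motion control device generating control signals for several users simultaneously. One frame of such a loop might look like the sketch below; ControlSignal and the tracked_users structure are placeholders, since the patent does not specify a signal format.

    from dataclasses import dataclass

    @dataclass
    class ControlSignal:
        device_id: int
        user_index: int
        position: tuple   # body-part position in device coordinates
        velocity: tuple   # body-part movement, cf. claim 19

    def signals_for_frame(device_id: int, tracked_users: list) -> list:
        """Emit one control signal per tracked user in a single frame,
        i.e. simultaneous generation for at least 2 users (cf. claim 16)."""
        return [
            ControlSignal(device_id, i, u["position"], u["velocity"])
            for i, u in enumerate(tracked_users)
        ]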
NL2014976A 2015-06-17 2015-06-17 Gesture game controlling. NL2014976B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
NL2014976A NL2014976B1 (en) 2015-06-17 2015-06-17 Gesture game controlling.
PCT/NL2016/050434 WO2016204617A2 (en) 2015-06-17 2016-06-17 Game controller

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
NL2014976A NL2014976B1 (en) 2015-06-17 2015-06-17 Gesture game controlling.

Publications (1)

Publication Number Publication Date
NL2014976B1 true NL2014976B1 (en) 2016-09-26

Family

ID=53901083

Family Applications (1)

Application Number Title Priority Date Filing Date
NL2014976A NL2014976B1 (en) 2015-06-17 2015-06-17 Gesture game controlling.

Country Status (1)

Country Link
NL (1) NL2014976B1 (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5734807A (en) * 1994-07-21 1998-03-31 Kabushiki Kaisha Sega Enterprises Image processing devices and methods

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ANONYMOUS: "Kinect - Wikipedia, the free encyclopedia", 19 March 2015 (2015-03-19), XP055177978, Retrieved from the Internet <URL:http://en.wikipedia.org/wiki/Kinect> [retrieved on 20150319] *
ANONYMOUS: "PlayStation Move - Wikipedia, the free encyclopedia", 27 June 2010 (2010-06-27), XP055125047, Retrieved from the Internet <URL:http://en.wikipedia.org/w/index.php?title=PlayStation_Move&oldid=370488813> [retrieved on 20140624] *

Similar Documents

Publication Publication Date Title
US9555337B2 (en) Method for tracking physical play objects by virtual players in online environments
US9684369B2 (en) Interactive virtual reality systems and methods
US8882559B2 (en) Mixed reality remote control toy and methods therfor
US9542011B2 (en) Interactive virtual reality systems and methods
WO2016204617A2 (en) Game controller
CN107469343B (en) Virtual reality interaction method, device and system
US9244525B2 (en) System and method for providing user interaction with projected three-dimensional environments
KR101366444B1 (en) Virtual reality shooting system for real time interaction
US9511290B2 (en) Gaming system with moveable display
US10928915B2 (en) Distributed storytelling environment
US20170056783A1 (en) System for Obtaining Authentic Reflection of a Real-Time Playing Scene of a Connected Toy Device and Method of Use
EP3129111A2 (en) Interactive virtual reality systems and methods
JP2022141691A (en) Immersive and reactive game play range, system and process
US9550129B2 (en) Multiplayer game platform for toys fleet controlled by mobile electronic device
ES2774390T3 (en) Method and game system to project volumetric images in a physical scene
CN206444151U (en) A kind of immersive VR shoots interaction platform
US11707668B2 (en) Screen shooting range and method of playing screen shooting game using artificial intelligence technology
Wolf BattleZone and the Origins of First-Person Shooting Games
US10369487B2 (en) Storytelling environment: mapping virtual settings to physical locations
NL2014976B1 (en) Gesture game controlling.
KR101586651B1 (en) Multiplayer Robot Game System using Augmented Reality
US20220258062A1 (en) 4d screen shooting range and playing method using thereof
NL2014974B1 (en) Hand held controller.
NL2014664B1 (en) Touchscreen game controller.
WO2016167664A2 (en) Game controller