WO2022013561A1 - Extended reality emulation of a real-life sporting event - Google Patents

Extended reality emulation of a real-life sporting event

Info

Publication number
WO2022013561A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
sporting
real
sporting event
life
Prior art date
Application number
PCT/GB2021/051823
Other languages
French (fr)
Inventor
Jack TIPTON
Original Assignee
Generation Vr Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Generation Vr Ltd filed Critical Generation Vr Ltd
Publication of WO2022013561A1 publication Critical patent/WO2022013561A1/en

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/65 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0021 Tracking a path or terminating locations
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00 Training appliances or apparatus for special sports
    • A63B69/0015 Training appliances or apparatus for special sports for cricket
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25 Output arrangements for video game devices
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/812 Ball games, e.g. soccer or baseball
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00 Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0003 Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • A63B24/0006 Computerised comparison for qualitative assessment of motion sequences or the course of a movement
    • A63B2024/0012 Comparing movements or motion sequences with a registered reference
    • A63B2024/0015 Comparing movements or motion sequences with computerised simulations of movements or motion sequences, e.g. for generating an ideal template as reference to be achieved by the user
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619 Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0622 Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • A63B2071/0638 Displaying moving images of recorded environment, e.g. virtual environment

Definitions

  • the invention relates to a method and a system for generating an extended reality emulation based on a real-life sporting event that has taken place in a real-life sporting environment.
  • the user can view the motion of the cricket ball that has occurred during a real cricket match, perhaps as delivered by a famous bowler, and can subsequently interact with the ball as if they were the batsman who actually received the real-life delivery.
  • perception-action coupling is essential to how people learn to play sport, such that a tool which facilitates a more realistic perception of a sporting event has pedagogical advantages.
  • a method for generating an extended reality, XR, emulation based on a real-life sporting event that has taken place in a real-life sporting environment comprises retrieving first sporting event XR data corresponding to a real-life motion of a first object in the real-life sporting environment, wherein the first sporting event XR data comprises volumetric data; displaying an XR representation of the first object to a user based on the first sporting event XR data; and facilitating interaction by the user with the representation of the first object.
  • the volumetric data may comprise trajectory data and/or at least one further property.
  • the method may comprise modifying the trajectory data and/or the at least one further property, optionally by the user, wherein the displayed XR representation of the first object is representative of that modification.
  • the modifying may be restricted to being within a predetermined range such that the modified XR representation of the first object is realistic.
  • the at least one further property may comprise rotation (i.e., spin) data.
  • Retrieving the first sporting event XR data may comprise obtaining pre-recorded first sporting event XR data.
  • the method may comprise retrieving sporting environment XR data corresponding to the real-life sporting environment, the sporting environment XR data having a first temporal relationship with the first sporting event XR data; and displaying an XR representation of the sporting environment to the user based on the sporting environment XR data according to the first temporal relationship.
  • Retrieving the sporting environment XR data may comprise obtaining pre-recorded sporting environment XR data.
  • the method may comprise retrieving second sporting event XR data corresponding to a real-life motion of a second object in the real-life sporting environment, the second sporting event XR data having a second temporal relationship with the first sporting event XR data; and displaying an XR representation of the second object to the user based on the second sporting event XR data according to the second temporal relationship.
  • Retrieving the second sporting event XR data may comprise obtaining pre-recorded second sporting event XR data.
  • the second object may be a person.
  • the second sporting event XR data may be volumetric data.
  • Retrieving the first sporting event XR data may comprise selecting from a plurality of sporting event XR data each corresponding to a real-life motion of the first object in the real-life sporting environment.
  • the method may comprise selecting a perspective from which the XR emulation is displayed.
  • a system for generating an extended reality, XR, emulation based on a real-life sporting event that has taken place in a real-life sporting environment comprising a processor configured to retrieve first sporting event XR data corresponding to a real-life motion of a first object in the real-life sporting environment, wherein the first sporting event XR data comprises volumetric data; a display configured to display an XR representation of the first object to a user based on the first sporting event XR data; and a user interface configured to facilitate interaction by the user with the representation of the first object.
  • Figure 1 schematically depicts a method for generating an extended reality, XR, emulation based on a real-life sporting event that has taken place in a real-life sporting environment according to an exemplary embodiment
  • Figure 2A schematically depicts an XR representation of a first object and a second object as displayed to a user according to an exemplary embodiment
  • Figure 2B schematically depicts an XR representation of a first object and a second object as displayed to a user according to an exemplary embodiment
  • Figure 3 schematically depicts a system for generating an XR emulation based on a real-life sporting event that has taken place in a real-life sporting environment according to an exemplary embodiment
  • the term “comprising” or “comprises” means including the component(s) specified but not to the exclusion of the presence of other components.
  • the term “consisting essentially of” or “consists essentially of” means including the components specified but excluding other components except for materials present as impurities, unavoidable materials present as a result of processes used to provide the components, and components added for a purpose other than achieving the technical effect of the invention, such as colourants, and the like.
  • Figure 1 schematically depicts a method for generating an extended reality, XR, emulation based on a real-life sporting event that has taken place in a real-life sporting environment according to an exemplary embodiment.
  • a real-life sporting event is a sporting event that has taken place for which a motion of at least a first object was tracked to provide volumetric data for the first object.
  • volumetric data comprises, or is defined or described as comprising, spatial coordinates in three dimensions (i.e., (x, y, z)) plus another property.
  • the use of volumetric data is a subtle but powerful approach in the context of the invention.
  • Use of volumetric data allows for far more accurate and realistic tracking and recording of objects in a 3D space than other approaches, and it has been realised that this is an extremely powerful tool for use in emulating a sporting event, and, importantly, for allowing a user to accurately and realistically interact with a moving object in that emulation.
  • This is far more accurate and representative of real-life positions, poses, and movements than, for instance, 2D to 3D conversion, motion tracking and related user interactions.
  • the first object may be an object projected or propelled by a sportsperson in the real-life sporting event such that the volumetric data for the first object comprises trajectory data.
  • the real-life sporting event may be the delivery of a cricket ball by a bowler in an ICC Cricket World Cup match, the kicking of a ball by a footballer to take a penalty in a FIFA World Cup tournament or a serve in a Grand Slam tennis tournament.
  • the first object is the ball, but the first object of the real-life sporting event may be any sporting projectile, for example a shuttlecock.
  • the first object may comprise a plurality of sub-objects.
  • the other property may be imparted to the first object.
  • a cricket ball, tennis ball, or football may be imparted with spin by a bowler, server or player.
  • the ball may rotate about one of its axes as it moves, for example along a trajectory, during the real-life sporting event.
  • this motion corresponds to the real-life sporting event that has taken place, and is herein termed real-life motion.
  • the method uses this real-life motion obtained from the real-life sporting event rather than any type of reconstructed motion, such as that used in video games.
  • Other properties might include swerve of the object, which may relate to spin, or be separate to spin, for example due to surface features of the object, or atmospheric conditions.
  • Bounce of a ball might also be one such property, and this could be related to spin or be separate to spin, for example due to features of the surface which the object impacts during a bounce or due to atmospheric conditions. In other instances, bounce may be closely related to, or even be, the trajectory. Depending on how ‘trajectory’ is understood or implemented, another property might simply be the speed of the object along that trajectory. For instance, the user may face an object that has the same path but at a faster or slower speed. This feature might, again, assist with training.
  • a real-life sporting environment is the environment, for example the surroundings, in which the sporting event took place.
  • the sporting environment is the ground or stadium or the like where the sporting event occurred.
  • the method comprises a step of retrieving S1 first sporting event XR data corresponding to the real-life motion of the first object in the real-life sporting environment.
  • retrieving S1 first sporting event XR data may comprise communicating with a server to download stored first sporting event XR data that has been pre-recorded for the real-life sporting event. This data might typically have been stored some time ago, for example days, weeks or months. However, as technology improves, this data could be retrieved in almost real-time.
  • As is known, X Reality (XR or Extended or Cross Reality) is defined as: a form of “mixed reality environment that comes from the fusion (union) of ... ubiquitous sensor/actuator networks and shared online virtual worlds....”. It encompasses a wide spectrum of hardware and software, including sensory interfaces, applications, and infrastructures, that enable content creation for or use of virtual reality (VR), mixed reality (MR) and cinematic reality (CR). That is, XR is a broad term that encompasses VR, MR, CR and so on. With these tools, users generate or interact with new forms of reality by bringing digital objects into the physical world and bringing physical world objects into the digital world. Its meaning has more recently been broadened to include technologies of extended (“cyborg” or wearable) intelligence in regards to the IEEE Council on Extended Intelligence (CXI).
  • retrieving S1 the first sporting event XR data may comprise selecting from a plurality of first sporting event XR data, each corresponding to a real-life motion of the first object in the real-life sporting environment. For example, this could involve choosing one or more of the fastest or most difficult cricket bowling deliveries of all time, or the fastest tennis serves. This selection may be performed by a user or may be random. For example, first sporting event XR data may be retrieved corresponding to one of multiple deliveries by a particular bowler at the same ground during a test match, or, as above, might have a far wider range of choice or variation for the user.
  • the method comprises displaying S2 an XR representation of the first object to the user based on the first sporting event XR data.
  • the cricket ball is shown to the user moving as occurred in the sporting event that took place.
  • Figures 2A and 2B which are discussed in more detail below, include a representation of the first object where the first object is a ball 100 with a trajectory 130, 230 and spin 120, 220.
  • the method may comprise modifying the trajectory data and/or the at least one further property.
  • This modification may be performed by the user or could be introduced by the implementing system (e.g., randomly).
  • the displayed XR representation of the first object is representative of that modification.
  • the user may modify the spin of a cricket ball delivery so that the displayed first sporting event XR data on which the display is based is modified such that the trajectory of the cricket ball is the same as the real-life motion, but the spin of the cricket ball is different to real-life motion.
  • This modification gives the user an experience of a delivery that is very close to real-life but subtly different. Therefore, this modification gives a realistic, yet new, experience, which may be advantageous for training and enjoyment purposes.
  • the modifying may be restricted to being within a predetermined range such that the modified XR representation of the first object is realistic. For instance, only certain types of spin may be possible or there may be a threshold placing restrictions on how the trajectory of the first object can be changed (e.g., a maximum/minimum height of the trajectory or a maximum/minimum velocity of the first object).
  • the method comprises facilitating S3 interaction by the user with the representation of the first object.
  • the first sporting event data may be appended with simulated data according to an input of the user to change the first sporting event data (e.g., the trajectory and spin) by, for example, performing a movement of an electronic device.
  • An XR representation of the first object may be displayed representative of the first sporting event data changed according to the input of the user by the movement of the electronic device.
  • the user may input data by performing a movement of the electronic device, for example responding to the delivery of a cricket ball as if they were the batsman receiving the delivery.
  • the ball may be shown changing direction based on the input.
  • the simulated data may cause the ball to rebound in a particular direction as if struck by a cricket bat.
  • the user will typically not type in data or similar. Instead, the user input may be based on motion sensing of the user or a tool (e.g., the electronic device) that the user is using in order to map user input with the displayed first object, allowing for interaction to take place.
  • the method may comprise retrieving sporting environment XR data corresponding to the real-life sporting environment, the sporting environment XR data having a first temporal relationship with the first sporting event XR data.
  • Sporting environment XR data is data relating to the real-life sporting environment, such as images of the crowds in a stadium or images of weather conditions.
  • the sporting environment XR data may include sound data, for example crowd noise or noise from sportspersons participating in the real-life sporting event.
  • the sporting environment XR data could even be very simplistic and relate to a basic reference frame in which the first object is moving.
  • Retrieving the sporting environment XR data may comprise obtaining pre-recorded sporting environment XR data, for example by communicating with a server.
  • volumetric data for the sporting environment may typically have to have been obtained simultaneously with the volumetric data for the first object. For instance, the entire event is recorded using volumetric tracking or processing, and this includes the first object and the sporting environment data.
  • alternatively, the sporting environment XR data may be a static image.
  • the method may comprise displaying an XR representation of the sporting environment to the user based on the sporting environment XR data according to the first temporal relationship.
  • the user is presented with the motion of the first object and the sporting environment in which this motion occurred in synchronicity.
  • the display may show changes in lighting conditions in the real-life sporting environment during the motion of the first object.
  • the method may comprise retrieving second sporting event XR data corresponding to a real-life motion of a second object in the real-life sporting environment, the second sporting event XR data having a second temporal relationship with the first sporting event XR data.
  • Retrieving the second sporting event XR data may comprise obtaining pre-recorded second sporting event XR data.
  • the second object may be the sportsperson who projected the first object.
  • the motion of the sportsperson, for example, is tracked during the real-life sporting event simultaneous with the tracking of the motion of the first object, meaning that there is no need to retrospectively match the motion of the first object with the motion of the second object (e.g., there is no need to match the motion of a ball with the motion of a sportsperson at the moment they release the ball from their hand).
  • Such retrospective matching frequently results in inaccuracies.
  • the second sporting event XR data may be volumetric data, again typically obtained at the same time (in the same obtaining step or phase) as for the first sporting event XR data.
  • Figures 2A and 2B which are described in more detail below, include a sportsperson 200.
  • the method may comprise displaying an XR representation of the second object to the user based on the second sporting event XR data according to the second temporal relationship.
  • a user can see the motion of the first object and the second object according to their motion in the real-life sporting environment.
  • a user can see the motion of the bowler before, during and after the motion of the ball.
  • This facility is particularly useful for learning to anticipate the motion of the cricket ball according to the body shape and/or hand/finger position (e.g., grip) and/or facial expression (e.g., eye movement) of the bowler, for example.
  • the length of time the ball is gripped by the bowler can also be indicative of the ball’s motion.
  • first sporting event XR data, second sporting event XR data and sporting environment XR data may include colour/visual information. This might facilitate the reconstruction of actual visual/colour information, as well as positional data, so that the reconstruction contains accurate markings, colours and details that correspond to the actual event, as opposed to just positional data, with a generic ball to represent the projectile path.
  • the method may comprise modifying the motion of the second object.
  • This modification may be performed by the user or could be introduced by the implementing system (e.g., randomly).
  • the displayed XR representation of the second object is representative of that modification.
  • the user may modify the running speed, gait, posture or grip of a sportsperson projecting the first object.
  • the displayed motion of the first object may be affected by modification of the motion of the second object.
  • the user can further enhance their understanding of how the, for example, motion of the bowler affects the motion of the ball.
  • An even more realistic experience may be provided by displaying the sporting environment XR data at the same time as the first sporting event XR data and the second sporting event XR data in the case that the first temporal relationship and the second temporal relationship are the same.
  • Figure 2A schematically depicts an XR representation of a first object and a second object as displayed to a user 10 according to an exemplary embodiment.
  • the first object is a ball 100.
  • the second object is a person 200.
  • the person 200 is throwing a ball using a hand in the manner of a bowler delivering a cricket ball to a batsman, causing the ball to follow a trajectory 130 such that there is a temporal relationship between the first object and the second object as described.
  • the motion of the first object may be caused by another body part (e.g., a foot) or by sporting equipment (e.g., a racquet).
  • the ball 100 is shown having a top spin 120 as the ball leaves the hand of the person 200 to travel along the trajectory.
  • an XR representation of the sporting environment may also be displayed, for example, in the case of cricket, the crease, the wicket, the wickets, the stadium, the boundary, etc.
  • a representation of the user, or any equipment associated with the user may be shown in the XR environment.
  • at least a part of the user’s body may be shown, for example arms or hands, and/or a bat or racket, and so on.
  • Figure 2B schematically depicts an XR representation of a first object and a second object as displayed to the user 10 according to an exemplary embodiment.
  • Figure 2B corresponds to Figure 2A save that the trajectory 230 of the ball 100 has been modified as described and the ball 100 is travelling along the trajectory with backspin 220 rather than topspin 120. This might involve the trajectory itself being modified, in terms of modifying from retrieved to represented, or the spin may be modified which then impacts on the trajectory.
  • the method may comprise selecting a perspective from which the XR emulation is displayed.
  • the motion of the XR emulation may be displayed as if it is moving towards the user or from above.
  • in Figures 2A and 2B, the XR simulation is shown in a side view.
  • the user may be located within the simulation at a default location that generally corresponds with a perspective of a user (e.g., player) in the real-world or real- life sporting event from which data was obtained. This may give a more accurate or useful training or enjoyment experience.
  • the perspective might be changed from this default location, either before or during the emulation.
  • the user may enjoy the experience using known technology, for example using a set of VR goggles or similar.
  • the user may interact with the first object in a known XR manner, for example with a user input being based on sensing of the user, or a tool that the user is using, in order to map user input with the displayed object, allowing for interaction to take place.
  • This might typically involve equipment, for example a sensed bat, or a bat with sensors, but could involve the use of audio input (e.g., sensing the word “hit” or “strike” or something more complex), and/or eye gaze tracking, and/or gesture control.
  • Figure 3 schematically depicts a system 500 for generating an XR emulation based on a real-life sporting event that has taken place in a real-life sporting environment according to an exemplary embodiment.
  • the system 500 comprises a processor 511, a display 512 and a user interface 513.
  • the processor 511 is configured to retrieve first sporting event XR data corresponding to a real-life motion of a first object in the real-life sporting environment, wherein the first sporting event XR data comprises volumetric data.
  • the processor 511 may be communicable with a server or other data store (permanent or transient in nature).
  • the processor 511 may comprise or provide a receiver.
  • the processor 511 may be configured to retrieve sporting environment XR data and/or second sporting event XR data in addition to first sporting event XR data.
  • the volumetric data may comprise trajectory data and/or a further property.
  • the display 512 is configured to display an XR representation of the first object to a user based on the first sporting event XR data.
  • the display 512 may be configured to display an XR representation of the sporting environment and/or the second object to the user.
  • the display could be a screen or similar that is fixed, relative to any movement of the user, or be moveable with the user (e.g., via a headset) or a combination of both.
  • the user interface 513 is configured to facilitate interaction by the user with the representation of the first object.
  • the system 500 may comprise an electronic device configured to receive a user input such that the user can interact with the representation of the first object and/or modify the volumetric data.
  • the user interface may be configured to enable selection of a perspective from which the XR emulation is displayed.
  • the system 500 may comprise a speaker to play sound data recorded for the real-life sporting environment.
  • the invention provides an XR emulation of a real-life sporting event that can be used for training because of the real-life information used to generate the XR emulation and the facility of the user being able to interact with objects in the XR emulation that generally follow a real-life motion.
  • An accurate and realistic simulator is provided, which may also be used simply for entertainment purposes.
  • the volumetric data may typically be obtained from pre-recorded data, so that a user can interact with and ‘play’ a part of a prior, real event.
  • the prerecorded data may be from a long time ago, for example days, weeks, or even years.
  • the time-frame may be shorter, so that a user can quickly interact with and ‘play’ a part of a real event that has only just happened.
  • allowing for processing resources in-between the live event and the user’s location, or accidental or even deliberate time-delays, it might be possible for the user to interact with what appears to be the live event itself.
  • Pre-recorded data would typically be in a data or memory store, for example in a server or similar.
  • the invention might extend to actually obtaining the data from the live event itself. This might give a greater, or desired, degree of control over the nature of data that is obtained, for subsequent processing, and interaction with by the user.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Processing Or Creating Images (AREA)

Abstract

There is provided a method for generating an extended reality, XR, emulation based on a real-life sporting event that has taken place in a real-life sporting environment, the method comprising: retrieving (S1) first sporting event XR data corresponding to a real-life motion of a first object in the real-life sporting environment, wherein the first sporting event XR data comprises volumetric data; displaying (S2) an XR representation of the first object to a user based on the first sporting event XR data; and facilitating (S3) interaction by the user with the representation of the first object.

Description

EXTENDED REALITY EMULATION OF A REAL-LIFE SPORTING EVENT
Field
The invention relates to a method and a system for generating an extended reality emulation based on a real-life sporting event that has taken place in a real-life sporting environment.
Background to the Invention
It is known to provide a simulation of a sport. For example, there exist video games in which a user participates in a simulation of a sport. However, video games do not use data corresponding to the real-life motion of objects which may be included in the video game simulation. For example, in a video game of a sport in which a ball is used, the motion of a ball is typically entirely simulated. Consequently, video games are not useful for any purpose requiring real-life motion of objects, such as sports training, or real-life sporting experiences.
Typically, technologies that replay real-life sporting events, for example for the purposes of checking a referee’s or an umpire’s decision, are confined to showing the motion of, for example, the ball (continuing with the example of a sport in which a ball is used). Players are not shown and the user of the technology cannot affect the ball before, during or after the event (i.e. the user cannot interact with the ball). Most technologies in which players are shown combine 2D video capture footage of players with ball tracking data. However, in this case the ball release position obtained from the 2D video does not typically match the tracked ball trajectory accurately or realistically. Again, these drawbacks preclude the use of the technology for training purposes or accurate real-life sporting experiences.
Resultingly, there is a need for a method and system enabling emulation of real-life sporting events in which a user can interact with objects involved in the event and in which the objects’ position is accurately represented throughout the duration of the emulation. In this way, a user of the method and system can improve their understanding and enjoyment of the sport. For example, such an emulation is useful for training, because a user can experience responding to, for example, the real-life delivery of a cricket ball as actually bowled by a real-life bowler or a slight or subtle realistic variation of that delivery. In other words, in this example, the user can view the motion of the cricket ball that has occurred during a real cricket match, perhaps as delivered by a famous bowler, and can subsequently interact with the ball as if they were the batsman who actually received the real-life delivery. In short, perception-action coupling is essential to how people learn to play sport, such that a tool which facilitates a more realistic perception of a sporting event has pedagogical advantages.
Summary of the Invention
It is one aim of the present invention, amongst others, to provide a method and system which at least partially obviate or mitigate at least some of the disadvantages of the prior art, whether identified herein or elsewhere, or to provide an alternative approach. For instance, it is an aim of embodiments of the invention to provide a method for generating an emulation of a real-life sporting event. For instance, it is an aim of embodiments of the invention to provide a method enabling interaction with an object, the movement of which occurred during a real-life sporting event.
According to the present invention there is provided a method and a system, as set forth in the appended claims. Other features of the invention will be apparent from the dependent claims, and the description that follows.
According to a first aspect there is provided a method for generating an extended reality, XR, emulation based on a real-life sporting event that has taken place in a real-life sporting environment. The method comprises retrieving first sporting event XR data corresponding to a real-life motion of a first object in the real-life sporting environment, wherein the first sporting event XR data comprises volumetric data; displaying an XR representation of the first object to a user based on the first sporting event XR data; and facilitating interaction by the user with the representation of the first object.
The volumetric data may comprise trajectory data and/or at least one further property.
The method may comprise modifying the trajectory data and/or the at least one further property, optionally by the user, wherein the displayed XR representation of the first object is representative of that modification.
The modifying may be restricted to being within a predetermined range such that the modified XR representation of the first object is realistic.
The at least one further property may comprise rotation (i.e., spin) data.
Retrieving the first sporting event XR data may comprise obtaining pre-recorded first sporting event XR data.
The method may comprise retrieving sporting environment XR data corresponding to the real-life sporting environment, the sporting environment XR data having a first temporal relationship with the first sporting event XR data; and displaying an XR representation of the sporting environment to the user based on the sporting environment XR data according to the first temporal relationship.
Retrieving the sporting environment XR data may comprise obtaining pre-recorded sporting environment XR data.
The method may comprise retrieving second sporting event XR data corresponding to a real-life motion of a second object in the real-life sporting environment, the second sporting event XR data having a second temporal relationship with the first sporting event XR data; and displaying an XR representation of the second object to the user based on the second sporting event XR data according to the second temporal relationship.
Retrieving the second sporting event XR data may comprise obtaining pre-recorded second sporting event XR data.
The second object may be a person.
The second sporting event XR data may be volumetric data.
Retrieving the first sporting event XR data may comprise selecting from a plurality of sporting event XR data each corresponding to a real-life motion of the first object in the real-life sporting environment.
The method may comprise selecting a perspective from which the XR emulation is displayed.
According to a second aspect there is provided a system for generating an extended reality, XR, emulation based on a real-life sporting event that has taken place in a real-life sporting environment, the system comprising a processor configured to retrieve first sporting event XR data corresponding to a real-life motion of a first object in the real-life sporting environment, wherein the first sporting event XR data comprises volumetric data; a display configured to display an XR representation of the first object to a user based on the first sporting event XR data; and a user interface configured to facilitate interaction by the user with the representation of the first object.
Brief Description of the Drawings
For a better understanding of the invention, and to show how exemplary embodiments of the same may be brought into effect, reference will be made, by way of example only, to the accompanying diagrammatic Figures, in which:
Figure 1 schematically depicts a method for generating an extended reality, XR, emulation based on a real-life sporting event that has taken place in a real-life sporting environment according to an exemplary embodiment;
Figure 2A schematically depicts an XR representation of a first object and a second object as displayed to a user according to an exemplary embodiment;
Figure 2B schematically depicts an XR representation of a first object and a second object as displayed to a user according to an exemplary embodiment; and
Figure 3 schematically depicts a system for generating an XR emulation based on a real-life sporting event that has taken place in a real-life sporting environment according to an exemplary embodiment.
Detailed Description
Throughout this specification, the term “comprising” or “comprises” means including the component(s) specified but not to the exclusion of the presence of other components. The term “consisting essentially of” or “consists essentially of” means including the components specified but excluding other components except for materials present as impurities, unavoidable materials present as a result of processes used to provide the components, and components added for a purpose other than achieving the technical effect of the invention, such as colourants, and the like.
The term “consisting of” or “consists of” means including the components specified but excluding other components.
Whenever appropriate, depending upon the context, the use of the term “comprises” or “comprising” may also be taken to include the meaning “consists essentially of” or “consisting essentially of”, and may also be taken to include the meaning “consists of” or “consisting of”.
The optional features set out herein may be used either individually or in combination with each other where appropriate and particularly in the combinations as set out in the accompanying claims. The optional features for each aspect or exemplary embodiment of the invention, as set out herein are also applicable to all other aspects or exemplary embodiments of the invention, where appropriate. In other words, the skilled person reading this specification should consider the optional features for each aspect or exemplary embodiment of the invention as interchangeable and combinable between different aspects and exemplary embodiments.
Figure 1 schematically depicts a method for generating an extended reality, XR, emulation based on a real-life sporting event that has taken place in a real-life sporting environment according to an exemplary embodiment.
A real-life sporting event is a sporting event that has taken place for which a motion of at least a first object was tracked to provide volumetric data for the first object. In one example, volumetric data comprises, or is defined or described as comprising, spatial coordinates in three dimensions (i.e., (x, y, z)) plus another property.
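Purely as an illustration of what such volumetric data might look like in practice, the following minimal Python sketch records each tracked sample as a timestamp, a 3D position and one further property; the field names, units and the choice of spin as the further property are assumptions made for the example rather than details taken from the application.
```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class VolumetricSample:
    """One tracked sample of the first object: a 3D position plus a further property."""
    t: float                                  # seconds since the start of the recorded motion
    position: Tuple[float, float, float]      # (x, y, z) in metres, in a ground-fixed frame
    spin: Tuple[float, float, float]          # angular velocity about each axis, in rad/s

# A recorded real-life motion is then simply a time-ordered list of samples.
Trajectory = List[VolumetricSample]
```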
The use of volumetric data is a subtle but powerful approach in the context of the invention. Use of volumetric data allows for far more accurate and realistic tracking and recording of objects in a 3D space than other approaches, and it has been realised that this is an extremely powerful tool for use in emulating a sporting event, and, importantly, for allowing a user to accurately and realistically interact with a moving object in that emulation. This is far more accurate and representative of real-life positions, poses, and movements than, for instance, 2D to 3D conversion, motion tracking and related user interactions.
The first object may be an object projected or propelled by a sportsperson in the real-life sporting event such that the volumetric data for the first object comprises trajectory data. For example, the real-life sporting event may be the delivery of a cricket ball by a bowler in an ICC Cricket World Cup match, the kicking of a ball by a footballer to take a penalty in a FIFA World Cup tournament or a serve in a Grand Slam tennis tournament. In each of these examples, the first object is the ball, but the first object of the real-life sporting event may be any sporting projectile, for example a shuttlecock. The first object may comprise a plurality of sub-objects.
As the first object is set in motion, the other property may be imparted to the first object. For example, a cricket ball, tennis ball, or football may be imparted with spin by a bowler, server or player. In other words, the ball may rotate about one of its axes as it moves, for example along a trajectory, during the real-life sporting event. As this motion corresponds to the real-life sporting event that has taken place, it is herein termed real-life motion. The method uses this real-life motion obtained from the real-life sporting event rather than any type of reconstructed motion, such as that used in video games. Other properties might include swerve of the object, which may relate to spin, or be separate to spin, for example due to surface features of the object, or atmospheric conditions. Bounce of a ball might also be one such property, and this could be related to spin or be separate to spin, for example due to features of the surface which the object impacts during a bounce or due to atmospheric conditions. In other instances, bounce may be closely related to, or even be, the trajectory. Depending on how ‘trajectory’ is understood or implemented, another property might simply be the speed of the object along that trajectory. For instance, the user may face an object that has the same path but at a faster or slower speed. This feature might, again, assist with training.
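For instance, the ‘same path, faster or slower’ variation mentioned above could be produced simply by re-timing the recorded samples while leaving the spatial path untouched. This is a hedged sketch only, reusing the hypothetical VolumetricSample structure from the earlier example.
```python
def retime(trajectory: Trajectory, speed_factor: float) -> Trajectory:
    """Replay the same spatial path faster (speed_factor > 1) or slower (speed_factor < 1)."""
    if speed_factor <= 0:
        raise ValueError("speed_factor must be positive")
    t0 = trajectory[0].t
    return [
        VolumetricSample(
            t=t0 + (sample.t - t0) / speed_factor,   # compress or stretch the timeline
            position=sample.position,                # spatial path is unchanged
            spin=sample.spin,
        )
        for sample in trajectory
    ]
```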
A real-life sporting environment is the environment, for example the surroundings, in which the sporting event took place. For instance, the sporting environment is the ground or stadium or the like where the sporting event occurred.
The method comprises a step of retrieving S1 first sporting event XR data corresponding to the real-life motion of the first object in the real-life sporting environment. For example, retrieving S1 first sporting event XR data may comprise communicating with a server to download stored first sporting event XR data that has been pre-recorded for the real-life sporting event. This data might typically have been stored some time ago, for example days, weeks or months. However, as technology improves, this data could be retrieved in almost real-time.
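As one possible concrete form of this retrieval step, the sketch below downloads a pre-recorded delivery from a server as JSON and converts it into the sample structure used in the earlier sketch; the endpoint, file format and field names are invented for illustration and are not taken from the application.
```python
import json
import urllib.request

def retrieve_first_sporting_event_xr_data(
    event_id: str,
    server: str = "https://example.invalid/xr-recordings",   # placeholder address
) -> Trajectory:
    """Download pre-recorded first sporting event XR data from a hypothetical store."""
    with urllib.request.urlopen(f"{server}/{event_id}.json") as response:
        raw = json.load(response)   # assumed: a list of {"t": ..., "position": [...], "spin": [...]}
    return [
        VolumetricSample(t=item["t"],
                         position=tuple(item["position"]),
                         spin=tuple(item["spin"]))
        for item in raw
    ]
```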
As is known, X Reality (XR or Extended or Cross Reality) is defined as: a form of “mixed reality environment that comes from the fusion (union) of ... ubiquitous sensor/actuator networks and shared online virtual worlds....”. It encompasses a wide spectrum of hardware and software, including sensory interfaces, applications, and infrastructures, that enable content creation for or use of virtual reality (VR), mixed reality (MR) and cinematic reality (CR). That is, XR is a broad term that encompasses VR, MR, CR and so on. With these tools, users generate or interact with new forms of reality by bringing digital objects into the physical world and bringing physical world objects into the digital world. Its meaning has more recently been broadened to include technologies of extended ("cyborg" or wearable) intelligence in regards to the IEEE Council on Extended Intelligence (CXI).
Referring back to Figure 1, retrieving S1 the first sporting event XR data may comprise selecting from a plurality of first sporting event XR data, each corresponding to a real-life motion of the first object in the real-life sporting environment. For example, this could involve choosing one or more of the fastest or most difficult cricket bowling deliveries of all time, or the fastest tennis serves. This selection may be performed by a user or may be random. For example, first sporting event XR data may be retrieved corresponding to one of multiple deliveries by a particular bowler at the same ground during a test match, or, as above, might have a far wider range of choice or variation for the user.
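Selection from a plurality of recordings, whether made by the user or at random, could then sit as a thin layer on top of that retrieval step; the catalogue, labels and identifiers below are made up for the sketch.
```python
import random
from typing import Dict, Optional

def select_delivery(catalogue: Dict[str, str], choice: Optional[str] = None) -> Trajectory:
    """Pick one recorded delivery from a catalogue of {label: event_id}; random if no choice is given."""
    label = choice if choice is not None else random.choice(list(catalogue))
    return retrieve_first_sporting_event_xr_data(catalogue[label])

# Example usage (labels and identifiers are invented):
# fastest = select_delivery({"fastest delivery": "wc2019-over34-ball2"}, choice="fastest delivery")
```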
The method comprises displaying S2 an XR representation of the first object to the user based on the first sporting event XR data. In other words, taking the example of the sporting event being the delivery of a cricket ball, the cricket ball is shown to the user moving as occurred in the sporting event that took place. Figures 2A and 2B, which are discussed in more detail below, include a representation of the first object where the first object is a ball 100 with a trajectory 130, 230 and spin 120, 220.
The method may comprise modifying the trajectory data and/or the at least one further property. This modification may be performed by the user or could be introduced by the implementing system (e.g., randomly). Resultingly, the displayed XR representation of the first object is representative of that modification. For example, the user may modify the spin of a cricket ball delivery so that the displayed first sporting event XR data on which the display is based is modified such that the trajectory of the cricket ball is the same as the real-life motion, but the spin of the cricket ball is different to real-life motion. This modification gives the user an experience of a delivery that is very close to real-life but subtly different. Therefore, this modification gives a realistic, yet new, experience, which may be advantageous for training and enjoyment purposes.
The modifying may be restricted to being within a predetermined range such that the modified XR representation of the first object is realistic. For instance, only certain types of spin may be possible or there may be a threshold placing restrictions on how the trajectory of the first object can be changed (e.g., a maximum/minimum height of the trajectory or a maximum/minimum velocity of the first object).
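One simple way to keep such a modification within a predetermined range, so that the modified representation stays realistic, is to clamp the requested change before applying it; the limits below are arbitrary placeholders rather than values from the application, and the sketch again reuses the hypothetical sample structure from above.
```python
def modify_spin(trajectory: Trajectory, spin_scale: float,
                min_scale: float = 0.5, max_scale: float = 1.5) -> Trajectory:
    """Scale the recorded spin, clamped to a predetermined range so the result remains realistic."""
    clamped = max(min_scale, min(max_scale, spin_scale))
    return [
        VolumetricSample(t=sample.t,
                         position=sample.position,
                         spin=tuple(component * clamped for component in sample.spin))
        for sample in trajectory
    ]
```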
The method comprises facilitating S3 interaction by the user with the representation of the first object. For example, the first sporting event data may be appended with simulated data according to an input of the user to change the first sporting event data (e.g., the trajectory and spin) by, for example, performing a movement of an electronic device. An XR representation of the first object may be displayed representative of the first sporting event data changed according to the input of the user by the movement of the electronic device. For instance, following the displaying S2, the user may input data by performing a movement of the electronic device, for example responding to the delivery of a cricket ball as if they were the batsman receiving the delivery. The ball may be shown changing direction based on the input. For instance, the simulated data may cause the ball to rebound in a particular direction as if struck by a cricket bat. As alluded to above, the user will typically not type in data or similar. Instead, the user input may be based on motion sensing of the user or a tool (e.g., the electronic device) that the user is using in order to map user input with the displayed first object, allowing for interaction to take place.
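The idea of appending simulated data once the user intervenes could be sketched as follows: the real recorded samples are kept up to the point where a sensed bat reaches the ball, after which crude simulated rebound samples are appended. The contact test, rebound direction and all thresholds are placeholders for illustration only, and the types come from the earlier sketches.
```python
import math

def apply_bat_contact(trajectory: Trajectory,
                      bat_position: Tuple[float, float, float],
                      contact_radius: float = 0.15,
                      rebound_speed: float = 20.0) -> Trajectory:
    """Keep real-life samples up to bat contact, then append simulated rebound samples."""
    for index, sample in enumerate(trajectory):
        if math.dist(sample.position, bat_position) <= contact_radius:   # bat meets ball
            x, y, z = sample.position
            rebound = [
                VolumetricSample(t=sample.t + k * 0.01,
                                 position=(x, y + rebound_speed * k * 0.01, z),  # hit straight back
                                 spin=sample.spin)
                for k in range(1, 50)
            ]
            return trajectory[:index + 1] + rebound   # real data up to contact, simulated after
    return trajectory                                 # no contact: unchanged real-life motion
```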
The method may comprise retrieving sporting environment XR data corresponding to the real-life sporting environment, the sporting environment XR data having a first temporal relationship with the first sporting event XR data. Sporting environment XR data is data relating to the real-life sporting environment, such as images of the crowds in a stadium or images of weather conditions. The sporting environment XR data may include sound data, for example crowd noise or noise from sportspersons participating in the real-life sporting event. The sporting environment XR data could even be very simplistic and relate to a basic reference frame in which the first object is moving.
Retrieving the sporting environment XR data may comprise obtaining pre-recorded sporting environment XR data, for example by communicating with a server. To obtain sporting environment XR data that has a strong, for example accurate, temporal relationship with the first sporting event XR data, volumetric data for the sporting environment may typically have to have been obtained simultaneously with the volumetric data for the first object. For instance, the entire event is recorded using volumetric tracking or processing, and this includes the first object and the sporting environment data. Alternatively, the sporting environment XR data may be a static image.
The method may comprise displaying an XR representation of the sporting environment to the user based on the sporting environment XR data according to the first temporal relationship. In this way, the user is presented with the motion of the first object and the sporting environment in which this motion occurred in synchronicity. For example, the display may show changes in lighting conditions in the real-life sporting environment during the motion of the first object.
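A toy version of displaying two data sets according to such a temporal relationship is to step a shared clock and, at each tick, pick whichever object sample and environment frame are due. The print call stands in for real rendering, and the environment frames are assumed here to be time-sorted (timestamp, frame) pairs.
```python
def play_back(object_samples: Trajectory,
              environment_frames: list,      # assumed: time-sorted list of (timestamp, frame) pairs
              time_step: float = 1 / 60) -> None:
    """Replay object and environment data against one shared clock (placeholder 'rendering')."""
    t = 0.0
    t_end = max(object_samples[-1].t, environment_frames[-1][0])
    obj_i = env_i = 0
    while t <= t_end:
        while obj_i + 1 < len(object_samples) and object_samples[obj_i + 1].t <= t:
            obj_i += 1
        while env_i + 1 < len(environment_frames) and environment_frames[env_i + 1][0] <= t:
            env_i += 1
        print(f"t={t:.3f}s  ball at {object_samples[obj_i].position}  environment frame #{env_i}")
        t += time_step
```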
The method may comprise retrieving second sporting event XR data corresponding to a real-life motion of a second object in the real-life sporting environment, the second sporting event XR data having a second temporal relationship with the first sporting event XR data. Retrieving the second sporting event XR data may comprise obtaining pre-recorded second sporting event XR data.
The second object may be the sportsperson who projected the first object. The motion of the sportsperson, for example, is tracked during the real-life sporting event simultaneous with the tracking of the motion of the first object, meaning that there is no need to retrospectively match the motion of the first object with the motion of the second object (e.g., there is no need to match the motion of a ball with the motion of a sportsperson at the moment they release the ball from their hand). Such retrospective matching frequently results in inaccuracies. The second sporting event XR data may be volumetric data, again typically obtained at the same time (in the same obtaining step or phase) as for the first sporting event XR data. Figures 2A and 2B, which are described in more detail below, include a sportsperson 200.
The method may comprise displaying an XR representation of the second object to the user based on the second sporting event XR data according to the second temporal relationship. In this way, a user can see the motion of the first object and the second object according to their motion in the real-life sporting environment. In the example of cricket, a user can see the motion of the bowler before, during and after the motion of the ball. This facility is particularly useful for learning to anticipate the motion of the cricket ball according to the body shape and/or hand/finger position (e.g., grip) and/or facial expression (e.g., eye movement) of the bowler, for example. The length of time the ball is gripped by the bowler can also be indicative of the ball’s motion. The same or analogous is true for a footballer taking a penalty kick or a tennis player going through a serving motion. It is well known that such motions are key in the receiving or facing player predicting the movement of the incoming object (i.e., sporting projectile), and hence an accurate, useful training aid is facilitated by the invention. Just displaying the motion of the first object does not fully allow the user to build up their understanding of the relationship between, for example, the motion of the ball and the motion of the bowler.
One or more of first sporting event XR data, second sporting event XR data and sporting environment XR data may include colour/visual information. This might facilitate the reconstruction of actual visual/colour information, as well as positional data, so that the reconstruction contains accurate markings, colours and details that correspond to the actual event, as opposed to just positional data, with a generic ball to represent the projectile path.
The method may comprise modifying the motion of the second object. This modification may be performed by the user or could be introduced by the implementing system (e.g., randomly). Resultingly, the displayed XR representation of the second object is representative of that modification. For example, the user may modify the running speed, gait, posture or grip of a sportsperson projecting the first object. The displayed motion of the first object may be affected by modification of the motion of the second object. In this way, the user can further enhance their understanding of how the, for example, motion of the bowler affects the motion of the ball. An even more realistic experience may be provided by displaying the sporting environment XR data at the same time as the first sporting event XR data and the second sporting event XR data in the case that the first temporal relationship and the second temporal relationship are the same.
Figure 2A schematically depicts an XR representation of a first object and a second object as displayed to a user 10 according to an exemplary embodiment. The first object is a ball 100. The second object is a person 200. In the case of Figure 2A, the person 200 is throwing a ball using a hand in the manner of a bowler delivering a cricket ball to a batsman, causing the ball to follow a trajectory 130 such that there is a temporal relationship between the first object and the second object as described. For other sports the motion of the first object may be caused by another body part (e.g., a foot) or by sporting equipment (e.g., a racquet). The ball 100 is shown having a top spin 120 as the ball leaves the hand of the person 200 to travel along the trajectory. As described, an XR representation of the sporting environment may also be displayed, for example, in the case of cricket, the crease, the wicket, the wickets, the stadium, the boundary, etc.
In other examples, a representation of the user, or any equipment associated with the user, may be shown in the XR environment. For example, at least a part of the user’s body may be shown, for example arms or hands, and/or a bat or racket, and so on.
Figure 2B schematically depicts an XR representation of a first object and a second object as displayed to the user 10 according to an exemplary embodiment. Figure 2B corresponds to Figure 2A save that the trajectory 230 of the ball 100 has been modified as described and the ball 100 is travelling along the trajectory with backspin 220 rather than topspin 120. This might involve the trajectory itself being modified, in terms of modifying from retrieved to represented, or the spin may be modified which then impacts on the trajectory.
The method may comprise selecting a perspective from which the XR emulation is displayed. For example, the motion of the XR emulation may be displayed as if it is moving towards the user or from above. In Figures 2A and 2B the XR simulation is shown in a side view.
It will be appreciated that the user may be located within the simulation at a default location that generally corresponds with a perspective of a user (e.g., player) in the real-world or real-life sporting event from which data was obtained. This may give a more accurate or useful training or enjoyment experience. The perspective might be changed from this default location, either before or during the emulation. The user may enjoy the experience using known technology, for example using a set of VR goggles or similar.
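A minimal sketch of such perspective selection is shown below, with a set of preset viewpoints and the batsman's stance as the default location. The coordinate values and viewpoint names are placeholders assumed for illustration.

```python
# Sketch only: preset viewpoints for the emulation, defaulting to the position
# a batsman would occupy in the real-life event. All coordinates are assumed
# placeholder values in a pitch-aligned frame (metres, origin at the crease).
PERSPECTIVES = {
    "batsman":   {"position": (0.0, 0.0, 1.7),   "look_at": (0.0, 20.0, 1.8)},
    "bowler":    {"position": (0.0, 20.1, 1.9),  "look_at": (0.0, 0.0, 0.7)},
    "side_view": {"position": (12.0, 10.0, 1.5), "look_at": (0.0, 10.0, 1.0)},
    "overhead":  {"position": (0.0, 10.0, 25.0), "look_at": (0.0, 10.0, 0.0)},
}


def select_perspective(name: str = "batsman") -> dict:
    """Return the requested viewpoint, falling back to the default location."""
    return PERSPECTIVES.get(name, PERSPECTIVES["batsman"])
```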
The user may interact with the first object in a known XR manner, for example with a user input being based on sensing of the user, or a tool that the user is using, in order to map the user input to the displayed object, allowing for interaction to take place. This might typically involve equipment, for example a sensed bat, or a bat with sensors, but could involve the use of audio input (e.g., sensing the word “hit” or “strike” or something more complex), and/or eye gaze tracking, and/or gesture control.
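Purely as a sketch of the kind of mapping involved, the following checks whether a sensed bat, tracked as a handle-to-toe segment, overlaps the displayed ball during a frame. The geometry and the effective radius are assumptions made for illustration.

```python
# Purely illustrative sketch: registering an interaction between a sensed bat
# and the displayed ball. The bat is approximated as a handle-to-toe segment
# with an assumed effective radius; the geometry is not from the disclosure.
import numpy as np


def point_segment_distance(p, a, b) -> float:
    """Distance from point p to the line segment a-b."""
    p, a, b = (np.asarray(x, dtype=float) for x in (p, a, b))
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return float(np.linalg.norm(p - (a + t * ab)))


def bat_hits_ball(ball_pos, bat_handle, bat_toe, effective_radius=0.08) -> bool:
    """True if the tracked blade overlaps the ball in the current frame."""
    return point_segment_distance(ball_pos, bat_handle, bat_toe) < effective_radius
```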
Figure 3 schematically depicts a system 500 for generating an XR emulation based on a real-life sporting event that has taken place in a real-life sporting environment according to an exemplary embodiment. The system 500 comprises a processor 511, a display 512 and a user interface 513.
The processor 511 is configured to retrieve first sporting event XR data corresponding to a real-life motion of a first object in the real-life sporting environment, wherein the first sporting event XR data comprises volumetric data. The processor 511 may be communicable with a server or other data store (permanent or transient in nature). The processor 511 may comprise or provide a receiver. The processor 511 may be configured to retrieve sporting environment XR data and/or second sporting event XR data in addition to first sporting event XR data. As discussed in relation to the method, the volumetric data may comprise trajectory data and/or a further property.
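A minimal sketch of how such retrieved volumetric data might be structured and fetched from a data store is given below. The field names and the JSON file store are assumptions made for illustration only; any server or transient store could stand in its place.

```python
# Minimal sketch of how retrieved first sporting event XR data might be laid
# out and fetched. The field names and the JSON file store are assumptions
# made for illustration; a server or transient store could stand in its place.
import json
from dataclasses import dataclass


@dataclass
class FirstObjectXRData:
    event_id: str
    samples: list            # [(t, x, y, z), ...]: trajectory data
    rotation_rpm: float      # a further property, e.g. spin rate
    seam_axis: tuple         # a further property, e.g. seam orientation


def retrieve_first_object_data(path: str) -> FirstObjectXRData:
    """Stand-in for the processor retrieving volumetric data from a store."""
    with open(path) as f:
        raw = json.load(f)
    return FirstObjectXRData(
        event_id=raw["event_id"],
        samples=[tuple(s) for s in raw["samples"]],
        rotation_rpm=raw["rotation_rpm"],
        seam_axis=tuple(raw["seam_axis"]),
    )
```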
The display 512 is configured to display an XR representation of the first object to a user based on the first sporting event XR data. The display 512 may be configured to display an XR representation of the sporting environment and/or the second object to the user. The display could be a screen or similar that is fixed, relative to any movement of the user, or be moveable with the user (e.g., via a headset) or a combination of both.
The user interface 513 is configured to facilitate interaction by the user with the representation of the first object. The system 500 may comprise an electronic device configured to receive a user input such that the user can interact with the representation of the first object and/or modify the volumetric data. The user interface may be configured to enable selection of a perspective from which the XR emulation is displayed.
The system 500 may comprise a speaker to play sound data recorded from the real-life sporting environment. In summary, the invention provides an XR emulation of a real-life sporting event that can be used for training because of the real-life information used to generate the XR emulation and the facility for the user to interact with objects in the XR emulation that generally follow a real-life motion. An accurate and realistic simulator is provided, which may also be used simply for entertainment purposes.
As discussed above, the volumetric data may typically be obtained from pre-recorded data, so that a user can interact with and ‘play’ a part of a prior, real event. In one example, the pre-recorded data may be from a long time ago, for example days, weeks, or even years. However, in other examples, the time-frame may be shorter, so that a user can quickly interact with and ‘play’ a part of a real event that has only just happened. Depending on the processing resources between the live event and the user’s location, and on any accidental or even deliberate time delays, it might be possible for the user to interact with what appears to be the live event itself.
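One illustrative way a deliberate time delay could be handled is sketched below: incoming samples are buffered and only released once they are at least a configured number of seconds old, so the emulation tracks the live event with a fixed lag. The class and parameter names are assumptions made for this sketch.

```python
# Illustrative sketch: buffering a near-live feed so the emulation tracks the
# real event with a fixed, possibly deliberate, lag. Class and parameter names
# are assumptions made for this sketch.
import time
from collections import deque


class DelayedFeed:
    def __init__(self, delay_s: float = 5.0):
        self.delay_s = delay_s
        self.buffer = deque()                    # (arrival_time, sample)

    def push(self, sample) -> None:
        """Record a sample as it arrives from the live event."""
        self.buffer.append((time.monotonic(), sample))

    def ready(self) -> list:
        """Return (and drop) every buffered sample old enough to display."""
        cutoff = time.monotonic() - self.delay_s
        out = []
        while self.buffer and self.buffer[0][0] <= cutoff:
            out.append(self.buffer.popleft()[1])
        return out
```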
Pre-recorded data would typically be held in a data or memory store, for example in a server or similar. In a related example, the invention might extend to obtaining the data from the live event itself. This might give a greater, or desired, degree of control over the nature of the data that is obtained for subsequent processing and interaction by the user.
Although a preferred embodiment has been shown and described, it will be appreciated by those skilled in the art that various changes and modifications might be made without departing from the scope of the invention, as defined in the appended claims and as described above.
Attention is directed to all papers and documents which are filed concurrently with or previous to this specification in connection with this application and which are open to public inspection with this specification, and the contents of all such papers and documents are incorporated herein by reference.
All of the features disclosed in this specification (including any accompanying claims and drawings), and/or all of the steps of any method or process so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps are mutually exclusive.
Each feature disclosed in this specification (including any accompanying claims, and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features. The invention is not restricted to the details of the foregoing embodiment(s). The invention extends to any novel one, or any novel combination, of the features disclosed in this specification (including any accompanying claims and drawings), or to any novel one, or any novel combination, of the steps of any method or process so disclosed.

Claims

1. A method for generating an extended reality, XR, emulation based on a real-life sporting event that has taken place in a real-life sporting environment, the method comprising:
retrieving first sporting event XR data corresponding to a real-life motion of a first object in the real-life sporting environment, wherein the first sporting event XR data comprises volumetric data;
displaying an XR representation of the first object to a user based on the first sporting event XR data; and
facilitating interaction by the user with the representation of the first object.
2. The method of claim 1, wherein the volumetric data comprises trajectory data and/or at least one further property.
3. The method of claim 2, wherein the method comprises: modifying the trajectory data and/or the at least one further property, optionally by the user, wherein the displayed XR representation of the first object is representative of that modification.
4. The method of claim 3, wherein the modifying is restricted to being within a predetermined range such that the modified XR representation of the first object is realistic.
5. The method of claim 2, 3 or 4, wherein the at least one further property comprises rotation data.
6. The method of any one of claims 1 to 5, wherein retrieving the first sporting event XR data comprises obtaining pre-recorded first sporting event XR data.
7. The method of any one of claims 1 to 6, comprising:
retrieving sporting environment XR data corresponding to the real-life sporting environment, the sporting environment XR data having a first temporal relationship with the first sporting event XR data; and
displaying an XR representation of the sporting environment to the user based on the sporting environment XR data according to the first temporal relationship.
8. The method of claim 7, wherein retrieving the sporting environment XR data comprises obtaining pre-recorded sporting environment XR data.
9. The method of any one of claims 1 to 8, comprising:
retrieving second sporting event XR data corresponding to a real-life motion of a second object in the real-life sporting environment, the second sporting event XR data having a second temporal relationship with the first sporting event XR data; and
displaying an XR representation of the second object to the user based on the second sporting event XR data according to the second temporal relationship.
10. The method of claim 9, wherein retrieving the second sporting event XR data comprises obtaining pre-recorded second sporting event XR data.
11. The method of claim 9 or 10, wherein the second object is a person.
12. The method of any one of claims 9 to 11, wherein the second sporting event XR data is volumetric data.
13. The method of any one of claims 1 to 12, wherein retrieving the first sporting event XR data comprises selecting from a plurality of sporting event XR data each corresponding to a real-life motion of the first object in the real-life sporting environment.
14. The method of any one of claims 1 to 13, comprising: selecting a perspective from which the XR emulation is displayed.
15. A system for generating an extended reality, XR, emulation based on a real-life sporting event that has taken place in a real-life sporting environment, the system comprising:
a processor configured to retrieve first sporting event XR data corresponding to a real-life motion of a first object in the real-life sporting environment, wherein the first sporting event XR data comprises volumetric data;
a display configured to display an XR representation of the first object to a user based on the first sporting event XR data; and
a user interface configured to facilitate interaction by the user with the representation of the first object.
PCT/GB2021/051823 2020-07-16 2021-07-15 Extended reality emulation of a real-life sporting event WO2022013561A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB2010978.1 2020-07-16
GB2010978.1A GB2597108A (en) 2020-07-16 2020-07-16 Extended reality emulation of a real-life sporting event

Publications (1)

Publication Number Publication Date
WO2022013561A1 true WO2022013561A1 (en) 2022-01-20

Family

ID=72338861

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2021/051823 WO2022013561A1 (en) 2020-07-16 2021-07-15 Extended reality emulation of a real-life sporting event

Country Status (2)

Country Link
GB (1) GB2597108A (en)
WO (1) WO2022013561A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10713494B2 (en) * 2014-02-28 2020-07-14 Second Spectrum, Inc. Data processing systems and methods for generating and interactive user interfaces and interactive game systems based on spatiotemporal analysis of video content
US20180311554A1 (en) * 2017-04-27 2018-11-01 TrinityVR, Inc. Baseball Pitch Simulation and Swing Analysis Using Virtual Reality Device and System
US20180369699A1 (en) * 2017-06-22 2018-12-27 Centurion VR, LLC Virtual reality simulation of a live-action sequence

Also Published As

Publication number Publication date
GB2597108A (en) 2022-01-19
GB202010978D0 (en) 2020-09-02

Similar Documents

Publication Publication Date Title
US11836929B2 (en) Systems and methods for determining trajectories of basketball shots for display
US20240100445A1 (en) Virtual reality simulation of a live-action sequence
Miles et al. A review of virtual environments for training in ball sports
KR100870307B1 (en) Computer-readable recording medium having program for controlling progress of game, and method for controlling progress of game
US20090029754A1 (en) Tracking and Interactive Simulation of Real Sports Equipment
CN104394949A (en) Web-based game platform with mobile device motion sensor input
CN101991949A (en) Computer based control method and system of motion of virtual table tennis
WO2020235339A1 (en) Play analyzing device, and play analyzing method
US20220401841A1 (en) Use of projectile data to create a virtual reality simulation of a live-action sequence
WO2022013561A1 (en) Extended reality emulation of a real-life sporting event
JP7502957B2 (en) Haptic metadata generating device, video-haptic interlocking system, and program
JP2020195551A (en) Physical activity supporting system, method and program
US12033332B2 (en) Systems and methods for evaluating performance of players at sporting events using trajectory predictions
Richard et al. Modeling dynamic interaction in virtual environments and the evaluation of dynamic virtual fixtures
O'Connor et al. Interactive games for preservation and promotion of sporting movements
Min et al. Development of a virtual pitching system in screen baseball game
CN117065363A (en) Football game and match real-time prediction system based on deep learning
TW202419135A (en) System for analyzing user swing to determine ball trajectory and method thereof
Justham et al. The use of virtual reality and automatic training devices in sport: A review of technology within cricket and related disciplines
JP2021082954A (en) Tactile metadata generation device, video tactile interlocking system, and program
NZ753620B2 (en) Virtual reality simulation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21748651

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21748651

Country of ref document: EP

Kind code of ref document: A1