WO2019165501A1 - Virtual locomotion device - Google Patents

Virtual locomotion device

Info

Publication number
WO2019165501A1
Authority
WO
WIPO (PCT)
Prior art keywords
platform
interface device
user
tilt
base
Prior art date
Application number
PCT/AU2019/050166
Other languages
French (fr)
Inventor
Peter Puya Abolfathi
Robert James SEGAL
Original Assignee
Visospace Holdings Pty Ltd
Priority date
Filing date
Publication date
Priority claimed from AU2018900655A external-priority patent/AU2018900655A0/en
Application filed by Visospace Holdings Pty Ltd filed Critical Visospace Holdings Pty Ltd
Publication of WO2019165501A1 publication Critical patent/WO2019165501A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0338 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of limited linear or angular displacement of an operating part of the device from a neutral position, e.g. isotonic or isometric joysticks
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211 Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25 Output arrangements for video game devices
    • A63F13/28 Output arrangements for video game devices responding to control signals received from the game device for affecting ambient conditions, e.g. for vibrating players' seats, activating scent dispensers or affecting temperature or light
    • A63F13/285 Generating tactile feedback signals via the game input device, e.g. force feedback
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082 Virtual reality
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C9/00 Measuring inclination, e.g. by clinometers, by levels
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 Indexing scheme relating to G06F3/01
    • G06F2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0334 Foot operated pointing devices

Definitions

  • the present invention relates to a virtual locomotion device.
  • the present invention relates to a virtual locomotion device which includes a platform configured to accommodate a user, and which includes a base which is at least partly formed of resilient material and which supports the platform.
  • HMDs: head-mount displays
  • VR: Virtual Reality
  • AR: Augmented Reality
  • However, the usefulness of these technologies is limited by current user interfaces.
  • Head-mount displays act as both input and output devices allowing the user to be perceptually immersed in a virtual world.
  • Using stereoscopic audio-visual presentation, the HMD ‘tricks’ the user’s brain into believing he/she is in the virtual world. This is reinforced by the fact that, as the user moves his/her head, the visual input is modified accordingly.
  • The HMD’s head tracking may afford the user up to six degrees of freedom of movement (6DoF), allowing full rotation as well as translation of the head and the body.
  • 6DoF: six degrees of freedom of movement
  • an interface device for virtual locomotion including: a platform configured to accommodate a user; a base supporting the platform, the base having a lower surface configured to rest upon a substrate surface, wherein the base is configured to allow tilting of the platform by the user; and a first sensor configured to detect the tilting of the platform.
  • the platform is configured to accommodate a user in one of a standing, sitting, or kneeling position.
  • the substrate surface is a floor or ground surface supporting the interface device.
  • the base is configured to allow tilting of the platform relative to the substrate surface.
  • the substrate surface is a floor or ground surface.
  • the base is configured to allow tilting of the platform by the user’s weight.
  • the base includes a resilient member.
  • the first sensor is fixed to the platform.
  • the first sensor is configured to measure an angle of tilt of the platform relative to a horizontal plane.
  • the first sensor is further configured to measure a direction of tilt of the platform relative to the horizontal plane.
  • the interface device further includes a second sensor configured to measure the distance between the platform and the substrate surface.
  • the second sensor is fixed to a lower surface of the platform.
  • the interface device further includes a haptic feedback device configured to provide vibrations or a force to the user.
  • the haptic feedback device is fixed to the platform.
  • the interface device further includes a stabiliser.
  • the stabiliser is located on a rim of a lower surface of the platform.
  • the stabiliser includes a resilient material.
  • an interface device for virtual locomotion including: a platform configured to accommodate a user; a base supporting the platform, the base being configured to be supported by a substrate surface, wherein the base pivotally connects the platform to the substrate surface; and a sensor configured to detect a tilt angle of the platform relative to the substrate surface, and a tilt orientation of the platform relative to the substrate surface.
  • a system for virtual locomotion including: the interface device as described above; and a processing system configured to: obtain data relating to the tilt of the platform; and convert the data relating to the tilt of the platform into movement commands in the virtual environment.
  • the system further includes a display device.
  • the display device is a virtual reality headset.
  • the system further includes a hand-held controller.
  • the hand-held controller is a virtual reality joystick.
  • the data relating to the tilt of the platform includes an angle of tilt of the platform relative to a horizontal surface.
  • the data relating to the tilt of the platform includes an orientation of tilt of the platform in a horizontal surface.
  • the processing system is further configured to stimulate forces or vibrations in the platform.
  • the processing system is further configured to: determine a location and orientation of the interface device relative to the user; and create an image of the interface device in the virtual environment, wherein the image has a location and orientation relative to the user corresponding to the location and orientation of the interface device relative to the user.
  • a method of controlling movement in a virtual environment including the steps of: tilting a platform relative to a horizontal plane; obtaining data relating to the tilt of the platform; and converting the data relating to the tilt of the platform into movement commands in the virtual environment.
  • the step of obtaining data relating to the tilt of the platform includes the steps of: measuring an angle of tilt of the platform relative to the horizontal surface; and measuring an orientation of tilt of the platform in the horizontal surface.
  • the step of converting the data relating to the tilt of the platform into movement commands in the virtual environment includes the steps of: converting the angle of tilt into an acceleration magnitude in the virtual environment; and converting the orientation of tilt into a direction of acceleration in the virtual environment.
  • the method further includes the step of detecting the presence of a user on the platform.
  • the step of detecting the presence of a user on the platform includes the steps of: measuring a distance between the platform and a substrate surface; and measuring an angle of tilt of the platform.
  • the interface device for virtual locomotion includes:
  • a platform configured to accommodate a user;
  • a base supporting the platform, having a lower surface configured to rest upon a substrate surface, wherein the base is at least partially formed of resilient material and is configured to allow tilting of the platform by the user; and
  • at least one sensor configured to detect the tilting of the platform.
  • the platform is substantially rigid and adapted to support a person standing thereon.
  • said platform is of substantially circular shape.
  • said platform includes an upwardly facing curved upper surface.
  • said surface of said platform is substantially concave in shape.
  • said base includes an inner portion and an outer portion, wherein said outer portion is at least partially formed of resilient material which is less resilient/softer than the resilient material of the inner portion of the base.
  • said inner portion of said base is adapted to contact a substrate surface at a predetermined diameter from a centre point of said platform.
  • said inner portion of said base includes a plurality of shaped protrusions formed of resilient material which are adapted to contact said substrate surface at a plurality of spaced apart inner base contact positions.
  • said inner portion of said base includes eight substantially equidistant spaced protrusions.
  • said outer portion of said base is adapted to contact a substrate surface at a predetermined diameter from a centre point of said platform.
  • said outer portion of said base is adapted to contact said substrate surface proximal to a peripheral edge of said platform.
  • said outer portion of said base is formed as a downwardly extending protrusion which extends substantially around the platform.
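The claims above describe converting a tilt angle into an acceleration magnitude and a tilt orientation into a direction of acceleration. The patent itself contains no code; the following Python sketch shows one way such a mapping could work. The dead zone, the linear scaling, and all tuning constants are illustrative assumptions, not values from the patent.

```python
import math

def tilt_to_movement(tilt_angle_deg, tilt_orientation_deg,
                     dead_zone_deg=2.0, max_angle_deg=15.0, max_accel=2.0):
    """Map platform tilt to an acceleration command in the virtual environment.

    tilt_angle_deg: angle of tilt relative to the horizontal plane.
    tilt_orientation_deg: direction of tilt in the horizontal plane
    (0 = forward, 90 = sideways). All tuning constants are illustrative.
    Returns (forward acceleration, sideways acceleration).
    """
    # Ignore tiny tilts so the user can stand still without drifting.
    if tilt_angle_deg < dead_zone_deg:
        return 0.0, 0.0

    # Scale the tilt angle linearly into an acceleration magnitude,
    # clamped at a maximum angle.
    fraction = min(tilt_angle_deg, max_angle_deg) / max_angle_deg
    magnitude = fraction * max_accel

    # Resolve the tilt orientation into forward/sideways components.
    rad = math.radians(tilt_orientation_deg)
    return magnitude * math.cos(rad), magnitude * math.sin(rad)
```

A processing system receiving tilt data from the device could call such a function every frame to drive the user's avatar.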
  • Figure 1 illustrates a three-dimensional view of an example interface device for virtual locomotion;
  • Figure 2 illustrates a front view of the example interface device of Figure 1, where a user’s centre of gravity is aligned with a centre of a platform of the interface device;
  • Figure 3 illustrates a front view of the example interface device of Figure 1, where a user’s centre of gravity (COG) is offset from a centre of a platform of the interface device;
  • Figure 4 illustrates a bottom view of an example interface device;
  • Figure 5 illustrates a cross-section of an example interface device along a vertical plane, where the interface device is in an equilibrium state;
  • Figure 6 illustrates a cross-section of an example interface device along a vertical plane, where the interface device is in a tilted state;
  • Figure 7 illustrates a cross-section of an example interface device along a vertical plane, where haptic feedback devices of the interface device are producing vibrations in the interface device;
  • Figure 8 illustrates a cross-section of an example interface device along a vertical plane, with a user standing alongside the interface device;
  • Figure 9 illustrates a cross-section of the interface device of Figure 8 along a vertical plane, with the user standing on top of the interface device;
  • Figure 10 illustrates an example system architecture for an example interface device;
  • Figure 11 illustrates example configurations of an example interface device and Bluetooth dongle with a PC or mobile device;
  • Figure 12 illustrates communications between an example interface device and the digital experience;
  • Figure 13 illustrates example data sent by the interface device;
  • Figure 14 illustrates example data received by the interface device;
  • Figure 15 illustrates an example system for virtual locomotion;
  • Figure 16 illustrates an example processing system for use in the system of Figure 15;
  • Figure 17 illustrates an example method of controlling movement in a virtual environment;
  • Figure 18 illustrates details of a preferred embodiment of a virtual locomotion platform design in accordance with the present invention;
  • Figure 19 schematically illustrates details of the underside of the base of the embodiment of the platform shown in Figure 18;
  • Figure 20 illustrates an example of the responsive tilt of the virtual locomotion platform due to a shift in the centre of gravity;
  • Figure 21 graphically illustrates the rest angle of a platform after a shift of the centre of gravity; and
  • Figure 22 graphically illustrates the dynamic transit response to a change of angle of the platform of the present invention.
  • Referring to Figure 1, there is illustrated an interface device 100 for virtual locomotion.
  • Interface device 100 includes a platform, or board, 110 configured to accommodate a user 120.
  • Interface device 100 further includes a base 130 supporting the platform and configured to allow tilting of the platform by user 120.
  • Interface device 100 further includes a sensor (not shown) configured to detect the tilting of the platform.
  • platform 110 is configured to accommodate user 120 in a standing, or upright, position, where the user positions their feet on top of platform 110 such that the user’s whole body weight is exerted onto interface device 100.
  • platform 110 is further configured to accommodate user 120 in a sitting position, such as user 120 being seated cross-legged onto platform 110.
  • platform 110 is configured to accommodate user 120 in any position.
  • platform 110 is configured to accommodate user 120 in any position such that the user’s whole weight is exerted onto interface device 100.
  • Platform 110 has a circular shape although other shapes, such as an oval, square or any other shape, may also be suitable.
  • Platform 110 has an upper surface, on which user 120 stands, and a lower surface connected to base 130.
  • Base 130 may be attached to platform 110, or it may be integrally formed with platform 110.
  • platform 110 includes a flexible or elastic layer with sufficient tension to support user 120 above a floor or ground surface.
  • Base 130 has a lower surface 132 configured to rest upon a substrate surface 140.
  • Substrate surface 140 supports interface device 100 and user 120.
  • substrate surface 140 is a floor or ground surface, such as the floor of a room where user 120 is using interface device 100.
  • Base 130 is configured to allow tilting of platform 110 relative to lower surface 132. During normal operation of interface device 100, where interface device 100 is stationary relative to substrate surface 140, lower surface 132 abuts and is substantially parallel to substrate surface 140. Therefore, in some examples, base 130 is configured to allow tilting of platform 110 relative to lower surface 132 or substrate surface 140.
  • interface device 100 includes platform 110 and base 130 supporting platform 110, wherein base 130 is configured to rest on substrate surface 140.
  • Base 130 elevates platform 110 above substrate surface 140.
  • Base 130 further provides a pivotal connection between platform 110 and substrate surface 140.
  • Base 130 is configured to allow tilting of platform 110 by the user’s weight.
  • base 130 is configured to allow tilting of platform 110 by the user shifting their weight, or centre of gravity (COG) relative to platform 110.
  • User 120 may shift their centre of gravity through motion relative to platform 110 (e.g. by stepping to an edge of platform 110), or by leaning their body at an angle relative to platform 110.
  • base 130 allows tilting of platform 110 by including one or more resilient members, or one or more spring members.
  • base 130 is located in a centre of platform 110, in order to improve stability of interface device 100.
  • the resilient members are located in a symmetrical arrangement with respect to a lower surface of platform 110.
  • the resilient members may be circumferentially evenly spaced on the lower surface of platform 110.
  • the resilient members are spaced at regular intervals on the lower surface of platform 110.
  • base 130 is configured to provide a restoring force to return platform 110 from a tilted position to a non-tilted, or equilibrium, position.
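The restoring behaviour described above can be pictured as a damped torsional spring: the resilient base supplies a restoring torque proportional to the tilt angle, while the stabiliser and material losses supply damping. The following simulation sketch is not from the patent; the spring, damping, and inertia constants are arbitrary illustrative values.

```python
def simulate_return_to_rest(theta0, k=40.0, c=6.0, inertia=1.0,
                            dt=0.01, steps=1000):
    """Simulate the platform tilt angle theta (radians) returning to
    equilibrium after the user steps back to centre.

    k: spring constant of the resilient base (restoring torque per radian).
    c: damping coefficient (stabiliser plus material losses).
    All constants are illustrative, not measured values.
    """
    theta, omega = theta0, 0.0
    for _ in range(steps):
        torque = -k * theta - c * omega   # restoring + damping torque
        omega += (torque / inertia) * dt  # semi-implicit Euler update
        theta += omega * dt
    return theta
```

With these constants the motion is underdamped and settles back to the equilibrium (non-tilted) angle within a few seconds of simulated time, which is the qualitative behaviour the base and stabiliser are described as providing.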
  • interface device 100 which includes an electronic module 150 located at or near the centre of the lower surface of platform 110.
  • Electronic module 150 includes a first sensor 151, a second sensor 152, a communication unit 154, and a power source 156.
  • Electronic module 150 may be contained within a casing or housing, attached to the lower surface of platform 110.
  • Interface device 100 further includes haptic feedback devices 158.
  • Power source 156 is configured to power electronic module 150. Power source 156 may further be configured to power other electrical devices of interface device 100, such as haptic feedback devices 158. Preferably, though not necessarily, power source 156 is a rechargeable battery. In other examples, power source 156 may be any other source of electrical energy, including a wired connection to a mains or external power supply.
  • Communication unit 154 is configured to transmit output data, or processed data derived from first sensor 151, second sensor 152, haptic feedback devices 158, and/or any other electronic device of interface device 100. Communication unit 154 is further configured to receive input data (such as configuration data) to first sensor 151, second sensor 152, haptic feedback devices 158, and/or any other electronic device of interface device 100.
  • Interface device 100 further includes a removable antenna 155 stationed on electronic module 150. Removable antenna 155 is configured to be disconnected from electronic module 150 and to be connected to a processing system. Removable antenna 155 is then configured to receive data from and/or transmit data to communication unit 154.
  • In some examples, communication unit 154 is configured to communicate with a processing system (e.g. a smart device) directly, without the need for removable antenna 155. Therefore, in some examples, communication unit 154 includes two modes of operation: in a first mode, communication unit 154 communicates with smart phones or other smart devices; in a second mode, communication unit 154 communicates with PCs or console devices through an additional dongle plugged into the PC or console device. This dongle may be removable antenna 155.
  • removable antenna 155 is a Bluetooth dongle connected to a USB port of electronic module 150.
  • removable antenna is a device, such as a transmitter, a receiver, or a transceiver, configured to provide a wireless communication interface with communication unit 154.
  • communication unit 154 is a Bluetooth chip.
  • communication unit 154 may be any other type of wired or wireless communication unit (e.g. WiFi capability).
  • Interface device 100 further includes a stabiliser 160 connected to the lower surface of platform 110.
  • Stabiliser 160 includes a ring mounted to a rim of the lower surface of platform 110.
  • stabiliser 160 includes an elastic or resilient material. Stabiliser 160 stabilises platform 110 when it is tilted by user 120, helping user 120 to maintain their balance.
  • stabiliser 160 maintains, preserves, sustains, or secures platform 110 when it is tilted, steadying interface device 100.
  • stabiliser 160 is a damping member that dampens or at least partially absorbs shock from platform 110 being tilted and impacting substrate surface 140.
  • stabiliser 160 is a cushioning, padding, or buffering member that protects platform 110 from damage when tilted, and provides a smoother user experience. Therefore, the stabilising effect that stabiliser 160 has on interface device 100 improves the accuracy of tilting platform 110.
  • In Figure 5, there is illustrated user 120 standing on platform 110, where platform 110 is in a rest state, or equilibrium state, such that it is not tilted relative to substrate surface 140. When platform 110 is in a state of equilibrium, stabiliser 160 may or may not be in contact with substrate surface 140.
  • user 120 and interface device 100 are mainly supported by base 130. In some examples, user 120 and interface device 100 are supported by base 130 and stabiliser 160.
  • In Figure 6, there is illustrated user 120 standing on platform 110, where platform 110 is in a tilted state.
  • When in a tilted state, stabiliser 160 is configured to contact substrate surface 140 to provide a cushioning or damping effect on the motion of user 120, aiding user 120 in maintaining their balance as platform 110 is tilted.
  • stabiliser 160 is biased against the tilting of platform 110, and therefore provides a restoring force to aid user 120 in returning platform 110 to its non-tilted, rest, or equilibrium state.
  • haptic feedback devices 158 are configured to selectively provide haptic feedback (such as forces or vibrations) to user 120 through platform 110. Therefore, the haptic feedback devices are configured to stimulate a tactile sensation perceptible to user 120. In some examples, the tactile stimulation is indicative of a virtual object or a virtual motion within a virtual environment.
  • Interface device 100 may include any number, such as one, two, or more, haptic feedback devices 158.
  • haptic feedback devices 158 are located on the lower surface of platform 110. In some examples, haptic feedback devices 158 are located within, or embedded into, platform 110. In other examples, haptic feedback devices 158 are located in any location of platform 110 or interface device 100 to allow user 120 to experience haptic feedback (such as on an upper surface of platform 110). Haptic feedback devices 158 may be evenly circumferentially spaced on platform 110, or they may be provided in any other arrangement.
  • First sensor 151 is configured to detect and/or measure the tilting (or a tilt angle) of platform 110.
  • first sensor 151 measures a tilt orientation, being the orientation of tilt of platform 110 relative to a plane parallel to substrate surface 140.
  • first sensor 151 also measures the tilt angle of platform 110 relative to a plane parallel to substrate surface 140.
  • first sensor 151 is an inertial measurement unit (IMU).
  • the IMU includes an accelerometer and a gyroscope.
  • interface device 100 includes one or more sensors configured to detect and/or measure the tilting (or a tilt angle) of platform 110.
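Where first sensor 151 is an IMU, the tilt angle and tilt orientation can be recovered from the accelerometer's gravity vector while the platform is at rest or moving slowly. The following is a minimal sketch, not taken from the patent, assuming the accelerometer axes are fixed to the platform with z normal to its surface:

```python
import math

def tilt_from_accelerometer(ax, ay, az):
    """Derive tilt from the gravity vector reported by an accelerometer
    fixed to the platform (valid when the platform is quasi-static).

    ax, ay lie in the platform plane; az is normal to it.
    Returns (tilt_angle_deg, tilt_orientation_deg).
    """
    # Tilt angle: angle between the platform normal and the vertical.
    tilt_angle = math.degrees(math.atan2(math.hypot(ax, ay), az))
    # Tilt orientation: direction of the tilt within the platform plane.
    tilt_orientation = math.degrees(math.atan2(ay, ax)) % 360.0
    return tilt_angle, tilt_orientation
```

With the platform level, gravity lies entirely along z and the tilt angle is zero; as the platform tilts, the in-plane components grow and fix both the angle and the direction of tilt. In practice the gyroscope in the IMU would be fused in to reject the accelerations caused by the user's movement.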
  • Second sensor 152 is configured to detect and/or measure a distance between second sensor 152 and substrate surface 140.
  • second sensor 152 may be used to detect the presence or absence of user 120 onto platform 110 by measuring the distance between second sensor 152 and substrate surface 140.
  • In Figure 8, there is illustrated a first example where user 120 is not standing on platform 110, and second sensor 152 measures a first distance 801 between second sensor 152 and substrate surface 140.
  • In Figure 9, there is illustrated a second example where user 120 is standing on platform 110, and second sensor 152 measures a second distance 802 between second sensor 152 and substrate surface 140.
  • Second distance 802 is smaller than first distance 801 due to the weight of user 120 that compresses base 130 (and possibly pushes down on elastic platform 110).
  • a processing system is able to detect the presence (or absence) of user 120 onto platform 110 and therefore commence operation (or terminate operation) of interface device 100.
  • a user state of interface device 100 is determined by considering the information provided by one or more sensors. In one example, if second sensor 152 measures first distance 801 and first sensor 151 detects no tilt of platform 110, the user state of interface device 100 is determined to be OFF. Conversely, if second sensor 152 measures a reduction in distance between second sensor 152 and substrate surface 140, and first sensor 151 detects a tilt of platform 110, the user state of interface device 100 is determined to be ON. However, if the second sensor 152 measures second distance 802 and first sensor 151 detects no tilt of platform 110 (due to the user’s centre of gravity being aligned with the centre of platform 110), the user state of interface device 100 may still be determined to be ON.
  • second sensor 152 may be enough to determine a user state of interface device 100 by detecting a reduction in distance due to an elastic depression of platform 110 and/or base 130.
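The user-state logic described above, which combines the distance measurement from second sensor 152 with the tilt measurement from first sensor 151, might be sketched as follows. The threshold values are illustrative assumptions, not figures from the patent:

```python
def user_state(distance, tilt_angle_deg,
               unloaded_distance=50.0, load_threshold=3.0,
               tilt_threshold=1.0):
    """Infer whether a user is on the platform ('ON') or not ('OFF').

    distance: measured mm between the underside sensor and the floor.
    unloaded_distance: calibrated distance with nobody aboard.
    All thresholds are illustrative assumptions.
    """
    # The user's weight compresses the resilient base, reducing the distance.
    compressed = (unloaded_distance - distance) > load_threshold
    tilted = tilt_angle_deg > tilt_threshold
    # A compressed base indicates a user even with no tilt (COG centred
    # on the platform); a tilt alone also indicates a user shifting weight.
    return "ON" if (compressed or tilted) else "OFF"
```

A processing system could poll this state to commence operation when the user mounts and terminate it when the user dismounts.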
  • The locomotion device (i.e. the interface device) and the dongle are separate components of the complete solution. Figure 11 shows how the two can be configured to be used with both a PC and a mobile smart device.
  • If the dongle is plugged into the locomotion device, it advertises by Bluetooth Low Energy for mobile devices to connect with. If the dongle is plugged into a PC, then it is automatically paired to the locomotion device, blocking other devices from connecting.
  • Figure 12 illustrates example communication between the locomotion device (i.e. the interface device) and the digital experience on the PC/standalone VR device/smart device. Communication is bidirectional, allowing the user to interact with the experience and the experience to interact with the device.
  • Figures 13 and 14 show example data sent and received by the interface device.
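Figures 13 and 14 are not reproduced in this text, so the exact data fields are unknown. A hypothetical message schema consistent with the surrounding description (tilt data and user state sent by the device; haptic commands received by it) might look like:

```python
from dataclasses import dataclass

# Hypothetical message shapes for the bidirectional link between the
# locomotion device and the digital experience. Field names and units
# are illustrative assumptions, not the contents of Figures 13 and 14.

@dataclass
class DeviceToHost:
    tilt_angle_deg: float        # from the IMU (first sensor)
    tilt_orientation_deg: float  # direction of tilt in the platform plane
    floor_distance_mm: float     # from the second (distance) sensor
    user_on_board: bool          # derived user state (ON/OFF)

@dataclass
class HostToDevice:
    haptic_intensity: float      # 0.0-1.0 drive level for haptic devices
    haptic_duration_ms: int      # how long the vibration should last
```

The experience would decode `DeviceToHost` messages into movement commands and send `HostToDevice` messages to trigger forces or vibrations in the platform.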
  • a virtual locomotion control input device where a user can step onto the device (or mount the device) and step off the device (or dismount the device) during a virtual reality experience, as part of the virtual reality experience, without interrupting the virtual reality experience.
  • Example applications of the interface device include, but are not limited to, locomotion around real estate, simulation of skiing, snowboarding or skateboarding, educational experiences in museums, *** maps-like locomotion, locomotion for tours and travel, locomotion for sports, an interface for manipulating a UI, meditation, flying carpet, and stepmania games.
  • the interface device may be used in conjunction with smart devices (smart phones and tablets) to play games on smart apps (not necessarily VR).
  • the interface device may be used as an input device for control of augmented reality (AR) experiences.
  • the interface device may be used as an input method to control robotic devices such as vehicles and drones.
  • the interface device is used to navigate, move within, or interact with, a virtual environment.
  • the interface device is used to control, or remotely control, unmanned vehicles (e.g. drones, driverless cars) or any other system or apparatus whose movement is able to be controlled.
  • a system for virtual locomotion including a processing system 200 and an interface device 210.
  • Processing system 200 is configured to obtain data relating to the tilt of the platform, and to convert the data relating to the tilt of the platform into movement commands in the virtual environment.
  • interface device 210 is an interface device for virtual locomotion as described above.
  • Interface device 210 is configured to be mounted by a user 220.
  • User 220 interacts with interface device 210 by tilting the platform of interface device 210 using their body weight.
  • Processing system 200 obtains data relating to the tilt of the platform.
  • the data relating to the tilt of the platform includes an angle of tilt of the platform relative to a horizontal surface (e.g. a substrate surface such as a ground or floor surface onto which interface device 210 is placed).
  • the data relating to the tilt of the platform includes an orientation, or direction, of tilt of the platform in a horizontal surface (e.g. a substrate surface such as a ground or floor surface onto which interface device 210 is placed).
  • a surface here is deemed to be horizontal if it is orthogonal to (or approximately orthogonal to) the Earth’s gravitational field.
  • Processing system 200 further converts the data relating to the tilt of the platform into movement commands in the virtual environment.
  • the system further includes additional devices for executing the movement commands. Therefore, in some examples, the system further includes a display device 230.
  • display device 230 is a wearable display such as a virtual reality (VR) headset, an augmented reality (AR) headset, or a head-mount display (HMD).
  • the display device could be any other type of display device such as a screen.
  • the system further includes hand-held controllers 240 configured to be held by a user to provide additional commands (e.g. movement commands or other types of commands) in the virtual environment.
  • hand-held controllers 240 are virtual reality joysticks.
  • hand-held controllers 240 and display device 230 interact together to provide user 220 with a seamless virtual reality experience.
  • the system further includes a substrate surface 250, which may be a floor or ground surface of a room, or other location or object external to interface device 210.
  • surface 250 is part of a virtual reality experience of user 220.
  • user 220 may selectively mount onto or dismount from interface device 210 without interrupting the virtual reality experience.
  • user 220 may walk or otherwise move around substrate surface 250 while still being within the virtual reality experience.
  • user 220 may mount interface device 210 from surface 250 without interrupting their virtual reality experience.
  • Interface device 210 may include a communication unit configured to communicate with processing system 200. Furthermore, the communication unit may be configured to communicate with display device 230 and/or hand-held controllers 240. Processing system 200 may further be configured to transmit data relating to the virtual environment to interface device 210. In some examples, processing system 200 interacts with user 220 through interface device 210. For example, processing system 200 may further be configured to stimulate forces, vibrations, or other tactile effects in the platform or in interface device 210. In some examples, processing system provides commands to tactile feedback devices, or operates, or activates tactile feedback devices of interface device 210.
  • the system further includes other devices or components for playing a game, or for interacting with a virtual environment.
  • the system may further include handles, guns, and/or snowboarding or skateboarding boards.
  • processing system 200 is further configured to track interface device 210, or the platform of interface device 210. That is, processing system 200 may be configured to track or determine the location and/or orientation of interface device 210. In some examples, the processing system is further configured to display an image of interface device 210 in the virtual environment, where the location and/or orientation of the image in the virtual environment corresponds (or directly corresponds in a one-to-one mapping or transformation) to the actual, physical location and/or orientation of interface device 210 within a real environment of user 220. In some examples, processing system 200 creates the virtual environment. In some examples, tracking can be done using existing six degree of freedom (6DoF) trackers, or can be camera based, or the location can be calibrated prior to use so that the VR experience can be presented virtually around the physical location/orientation of the device.
  • the processing system 200 generally includes at least one processor 302, or processing unit or plurality of processors, memory 304, at least one input device 306 and at least one output device 308, coupled together via a bus or group of buses 310.
  • input device 306 and output device 308 could be the same device.
  • An interface 312 can also be provided for coupling the processing system 200 to one or more peripheral devices, for example interface 312 could be a PCI card or PC card.
  • At least one storage device 314 which houses at least one database 316 can also be provided.
  • the memory 304 can be any form of memory device, for example, volatile or non-volatile memory, solid state storage devices, magnetic devices, etc.
  • the processor 302 could include more than one distinct processing device, for example to handle different functions within the processing system 200.
  • Input device 306 receives input data 318 and can include, for example, a keyboard, a pointer device such as a pen-like device or a mouse, audio receiving device for voice controlled activation such as a microphone, data receiver or antenna such as a modem or wireless data adaptor, data acquisition card, etc.
  • Input data 318 could come from different sources, for example keyboard instructions in conjunction with data received via a network.
  • Output device 308 produces or generates output data 320 and can include, for example, a display device or monitor in which case output data 320 is visual, a printer in which case output data 320 is printed, a port for example a USB port, a peripheral component adaptor, a data transmitter or antenna such as a modem or wireless network adaptor, etc.
  • Output data 320 could be distinct and derived from different output devices, for example a visual display on a monitor in conjunction with data transmitted to a network. A user could view data output, or an interpretation of the data output, on, for example, a monitor or using a printer.
  • the storage device 314 can be any form of data or information storage means, for example, volatile or non-volatile memory, solid state storage devices, magnetic devices, etc.
  • the processing system 200 is adapted to allow data or information to be stored in and/or retrieved from, via wired or wireless communication means, the at least one database 316.
  • the interface 312 may allow wired and/or wireless communication between the processing unit 302 and peripheral components that may serve a specialised purpose.
  • the processor 302 receives instructions as input data 318 via input device 306 and can display processed results or other output to a user by utilising output device 308. More than one input device 306 and/or output device 308 can be provided. It should be appreciated that the processing system 200 may be any form of terminal, server, specialised hardware, or the like.
  • Method 400 includes step 410 of tilting a platform relative to a horizontal plane, and step 420 of obtaining data relating to the tilt of the platform.
  • Method 400 further includes step 430 of converting the data relating to the tilt of the platform into movement commands in the virtual environment.
  • step 420 of obtaining data relating to the tilt of the platform includes a step of measuring an angle of tilt of the platform relative to the horizontal surface, and a step of measuring an orientation of tilt of the platform in the horizontal surface.
  • step 430 of converting the data relating to the tilt of the platform into simulated or virtual experience movement commands in the virtual environment includes a step of converting the angle of tilt into a simulated acceleration magnitude in the virtual environment, and a step of converting the orientation of tilt into a direction of simulated acceleration in the virtual environment.
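The two conversion steps above can be sketched as follows. This is a minimal Python sketch under stated assumptions: the five-degree tilt limit echoes the example given elsewhere in this description, while the peak acceleration `MAX_ACCEL` and the linear scaling law are illustrative assumptions, not part of the disclosure.

```python
import math

MAX_TILT_DEG = 5.0   # assumed maximum platform tilt
MAX_ACCEL = 2.0      # assumed peak simulated acceleration (illustrative units)

def tilt_to_acceleration(tilt_deg, orientation_deg):
    """Convert (angle of tilt, orientation of tilt) into a simulated
    acceleration vector in the virtual environment's horizontal plane."""
    # Step 1: angle of tilt -> simulated acceleration magnitude (linear law).
    magnitude = MAX_ACCEL * min(tilt_deg, MAX_TILT_DEG) / MAX_TILT_DEG
    # Step 2: orientation of tilt -> direction of simulated acceleration.
    phi = math.radians(orientation_deg)
    return magnitude * math.cos(phi), magnitude * math.sin(phi)

# Half tilt (2.5 degrees) pointing along the 90-degree direction:
ax, ay = tilt_to_acceleration(2.5, 90.0)
```

A proportional mapping of this kind gives the user fine speed control: a slight lean produces gentle virtual acceleration, while a full lean saturates at the platform's tilt limit.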
  • method 400 further includes the step of detecting the presence of a user on the platform.
  • the step of detecting the presence of a user on the platform includes the step of measuring a distance between the platform and a substrate surface, and the step of measuring an angle of tilt of the platform.
  • the substrate surface is a ground or floor surface supporting the platform.
  • method 400 is a method of virtual locomotion, further including a step of converting the data relating to the tilt of the platform into simulated movement, or movement commands, in the virtual environment.
  • Figure 18 illustrates an exemplary embodiment of a virtual locomotion device 500 including an upper surface or platform 501, and a base 502, in which the base 502 is at least partially formed of resilient material.
  • Figure 18(a) illustrates an isometric view of the device 500, whilst Figure 18 (b) shows the device 500 partially cut-away, and figure 18(c) shows details of the cross section of Figure 18(b).
  • the upper surface of the platform 501 is preferably substantially rigid and is adapted to support at least one foot of a person. Most preferably the platform 501 is adapted to support a person standing thereon.
  • the platform 501 may be of any desired shape, but in a preferred embodiment of the invention, the platform 501 is preferably of substantially circular shape.
  • the upper surface of the platform may be ‘flat’ or planar, as perhaps best illustrated in Figure 18(c).
  • the upper surface of the platform may, in a preferred embodiment, be formed as an upwardly facing curved upper surface 501, which is preferably slightly concave in shape.
  • a slight upwardly facing concave curvature of the upper surface 501 helps a person standing on the platform 501 to readily position themselves in the centre of the device 500, and/or may assist their sense of balance.
  • the base 502 preferably includes an inner portion 504 and an outer portion 503.
  • the outer portion 503 is preferably at least partially formed of resilient material which is of lower resilience or softer than the resilient material of the inner portion 504 of the base 502.
  • the inner portion 504 of the base 502 is adapted to contact a substrate surface at a predetermined diameter from a centre point of said device 500.
  • the inner portion 504 of said base 502 includes a plurality of shaped protrusions 505 formed of resilient material which are adapted to contact said substrate surface at a plurality of spaced-apart inner base contact positions.
  • protrusions 505 are preferably symmetrically spaced about the base 502, in the four quadrants thereof.
  • the inner portion 504 of said base 502 may preferably include four, eight or twelve protrusions, or, some other multiple of four protrusions.
  • eight substantially equidistant spaced protrusions 505 are provided.
  • the outer portion 503 of the base 502 is preferably adapted to contact a substrate surface at a predetermined diameter from a centre point of said device 500.
  • the outer portion 503 of said base 502 is adapted to contact said substrate surface proximal to a peripheral edge of said device 500.
  • the outer portion of said base is formed as a downwardly extending protrusion 506 which extends substantially around the base 502 of the device 500.
  • the virtual locomotion platform of the present invention works by presenting an experiential illusion to the user. This experience relies on audio-visual inputs as well as passive and active physical responses of the platform on which the user stands.
  • the passive inputs include the way the platform tilts when the user shifts his/her weight around the platform.
  • the active response is derived from vibrotactile haptic motors under the platform which are ‘felt’ by the user’s feet.
  • the passive response of the platform can be grouped into dynamic and static.
  • the static response is the angle at which the platform rests when the user stabilises his/her posture and centre of gravity (CoG) above the platform, such as illustrated in Figure 20.
  • the dynamic response is given by the viscoelasticity and inertia of the movement of the platform during the movement of the user. This latter attribute affects the speed profile and stability of the response.
  • Figures 21 and 22 show graphs of the profile of the responses in measurable attributes.
  • the shapes of these plots describe the static response of each platform once the construction elements have been curated and assembled.
  • the shapes of these plots are related to how responsive the platform is as a sensor and how effective it is as an agent of the illusion that informs the user they are floating on a moving platform. This experience is in turn very important for comfort and stability.
  • the interface device enables simulated locomotion in digital or virtual spaces, in an intuitive and immersive manner.
  • the interface device eliminates or reduces simulation sickness (i.e. motion sickness) in virtual reality (VR).
  • VR virtual reality
  • the interface device therefore provides a method of locomotion in digital spaces including VR. It is done through a physical board that a user stands on and shifts their body weight, through leaning or stepping. This translates to the board mildly tilting (in some examples, up to five degrees in any given direction), which is measured by the hardware and processed as the user’s intention to move virtually, in a particular direction by a proportioned level of intensity (e.g. speed or acceleration).
  • the software processes the virtual outcome and returns data back to the device informing it of the state of its virtual representation (e.g. its speed or whether it has impacted something).
  • the return data is processed by the interface device and presented back to the user as vibrotactile haptic feedback. Using this method, the user can “feel” the ground or impacts under his/her feet.
  • the interface device provides a fun, engaging, versatile method for locomotion in VR.
  • the interface device is low cost (compared to certain alternative solutions for virtual locomotion) and is simple, from a developer’s perspective, to create content for.
  • the interface device is a peripheral platform in VR that allows users to step onto the locomotion platform and get off without needing to stop the experience or remove the headset.
  • the interface device leaves the hands of the user free to engage with the virtual world in an unencumbered way.
  • the haptic feedback under feet enhances engagement and effectiveness of the experience.
  • the interface device enables locomotion in a virtual environment by processing movement control inputs from the actions and movements of the user. If the user wants to move in a particular direction in the horizontal plane, he/she will shift their centre of mass on the board towards that direction. This leads to a physical tilting of the board in that direction. The magnitude of this tilt is proportional to the amount by which the user moves his/her centre of gravity on the board.
  • This control input is sent to a computer handling the experience in a simple format: namely direction and magnitude of an angle of tilt of the platform.
  • the user will perceive virtual movement following his/her shifts/tilts.
  • This perceived motion is accompanied by vibrotactile feedback in harmony with the experience. For example, movement on the ground leads to a humming vibration on the board whose amplitude and frequency relates to the perceived velocity. A collision with a virtual object leads to a brief but distinct/deliberate vibration on a second channel of haptic feedback.
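The two-channel haptic scheme just described can be sketched as follows. The linear amplitude law, the 40–120 Hz frequency sweep, and the pulse parameters are illustrative assumptions; the description does not specify these values.

```python
def hum_for_velocity(v, v_max=5.0):
    """Map perceived speed v to the ground-hum vibration (first channel).

    Returns (amplitude, frequency_hz): both rise with perceived velocity.
    """
    s = max(0.0, min(v, v_max)) / v_max   # normalised speed, 0..1
    return s, 40.0 + 80.0 * s             # louder, faster hum at higher speed

def collision_pulse():
    """A brief but distinct burst on the second haptic channel."""
    return {"channel": 2, "amplitude": 1.0, "duration_s": 0.1}
```

Keeping collisions on a separate channel, as the description suggests, lets an impact cut through the continuous ground hum rather than blending into it.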
  • the user is able to enjoy additional depth and nuance in the experience by going on and off the board during the experience. Off the board, the user is able to walk around in a 1:1 room-scale scenario. However, when on the board, the board is activated as a ‘personal transport device’.
  • the haptic feedback under the feet gives an extra layer of sensory input to the user enhancing his/her immersion and making the experience more believable.
  • the device may be battery operated and Bluetooth-compatible, making it portable.
  • Physical tilting provides, in some examples, unlimited movement range within the constraints of the virtual environment, without physical constraints such as the physical room layout or the maximum range of an electrical cable. This provides increased immersion, more akin to a natural experience, therefore reducing motion sickness in some cases.
  • the interface device involves an intuitive method of locomotion in digital spaces. This is done with a physical platform that the user stands on. The platform tilts downwards in the direction that the user leans/steps or shifts their body weight. This tilt is translated into locomotion in the digital space, such as virtual reality, augmented reality, or mixed reality.
  • the interface device also provides vibration haptic feedback to the user to increase the immersion of the experience in the digital space.
  • the interface device also includes the methods of employing the physical device for locomotion in the digital space. There is also a method of detecting when the user is on/off the board.
  • a method of virtual locomotion includes the way the physical tilt of the platform is translated into digital locomotion in the software/experience. This includes how it is implemented in the physics engine of the development environment (Unity, Unreal Engine).
  • the board tilts about a fulcrum point at the centre of the bottom of the board (or multiple fulcrum points about the centre of the bottom of the board).
  • the tilting mechanism employs a combination of elastic and damping materials. This is to increase the comfort of the user when leaning and quickly changing positions/angles.
  • the user steps onto/off the board to utilise the locomotion in the digital spaces.
  • the user physically moves/leans/steps to shift their body weight in the direction that they want to move in the digital space.
  • the user can sit on the board. In some examples, the user can use it for a balancing application.
  • an angle of tilt of the platform is measured by an IMU.
  • the IMU uses a combination of an electronic accelerometer and gyroscope.
  • the accelerometer measures the angle of tilt (in an x, y, z reference frame) with respect to gravity.
  • the gyroscope measures the rotational speed (in an x, y, z reference frame).
  • the sensor signals are combined into one angle tilt function.
  • Example sensor fusion formula: current combined pitch angle = 0.95 × (previous combined pitch angle + gyroscope value × time increment) + 0.05 × accelerometer pitch.
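The sensor fusion formula above is a standard complementary filter. A minimal sketch, using the 0.95/0.05 weights from the example (the 0.01 s time step in the usage example below is an assumption):

```python
def complementary_pitch(prev_pitch, gyro_rate, accel_pitch, dt, alpha=0.95):
    """One complementary-filter update, as in the formula above:

    pitch = alpha * (previous pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch

    gyro_rate is the gyroscope angular velocity (deg/s) and accel_pitch
    is the pitch angle derived from the accelerometer (deg).
    """
    return alpha * (prev_pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch

# A stationary board whose accelerometer reads a steady 2-degree pitch:
# the combined estimate converges towards 2 degrees over repeated updates.
pitch = 0.0
for _ in range(100):
    pitch = complementary_pitch(pitch, gyro_rate=0.0, accel_pitch=2.0, dt=0.01)
```

The gyroscope term tracks fast changes while the small accelerometer weight slowly corrects the drift that integrating the gyroscope alone would accumulate.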
  • the method further includes the step of using the combined pitch and roll angles to calculate the magnitude and direction using trigonometry.
  • Magnitude data is the angle of tilt made with respect to the gravity vector.
  • the direction data is the 0–359 degree position with respect to the horizontal plane. The 0 degree direction is defined at start-up.
  • the angle tilt value is used to calculate the direction and magnitude values, for locomotion around the digital space.
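The trigonometric conversion from the combined pitch and roll angles into magnitude and direction values can be sketched as below. The planar small-angle approximation is an assumption, reasonable for tilts of only a few degrees.

```python
import math

def tilt_magnitude_direction(pitch_deg, roll_deg):
    """Combine fused pitch and roll into a tilt magnitude and a 0-359
    degree direction in the horizontal plane (zero fixed at start-up).

    For tilts of only a few degrees this planar approximation is adequate.
    """
    magnitude = math.hypot(pitch_deg, roll_deg)   # angle from the gravity vector
    direction = math.degrees(math.atan2(roll_deg, pitch_deg)) % 360.0
    return magnitude, direction

# A 3-degree pitch combined with a 4-degree roll:
mag, direction = tilt_magnitude_direction(3.0, 4.0)
```

The `% 360.0` wrap keeps the direction in the 0–359 range described above regardless of which quadrant the user leans towards.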
  • the interface device is able to determine if the user is on or off the board.
  • the proximity sensor is fixed to the base of the board, which sits at a distance from the floor. This distance changes depending on the weight on the board. The proximity signal is then used to determine whether someone is on or off the board, and also whether a new user has stepped onto the board.
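The on/off detection described above can be sketched as a simple threshold classifier. The millimetre thresholds and the hysteresis margin are illustrative assumptions, since the actual compression depends on the resilient material used in the base.

```python
def board_occupancy(distance_mm, unloaded_mm=20.0, loaded_mm=15.0):
    """Classify the board state from the base-to-floor proximity reading.

    The resilient base compresses under a user's weight, so the measured
    distance drops when someone steps on. Thresholds here are assumptions.
    """
    if distance_mm >= unloaded_mm - 1.0:
        return "off"          # board at (or near) its resting height
    if distance_mm <= loaded_mm:
        return "on"           # fully compressed: a user is aboard
    return "transition"       # partially compressed: stepping on or off
```

In practice the reading would be low-pass filtered, and a change from "off" to "on" after a period of rest could flag that a new user has mounted the board.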
  • the interface device is capable of wireless communication.
  • Bluetooth low energy is used as the primary communication method between the board and a PC or mobile device.
  • a custom HID GATT profile is used for the communications.
  • the interface device includes modules or devices for vibration haptic feedback.
  • haptic feedback is delivered to the user through the use of vibration on the board to the user’s feet or other part of the body that is in direct contact with the board.
  • this vibration is created using vibration motors (e.g. eccentric mass motors). This vibration gives the user haptic feedback for events such as collisions, or for different states that the user may be in within the digital/virtual space, such as a hover sensation or movement across rough terrain.
  • the method or software accounts for responsiveness to physical movement, speed and momentum in the digital space, the on/off boarding experience (e.g. slow start/stop when
  • the interface device eliminates the difficulty of developing methods to simulate walking.
  • the interface device acts like a virtual hoverboard, and the user more readily accepts that they are hovering around in a virtual world.
  • the interface device offers increased immersion in the virtual experience and in most cases a reduction in the feelings of motion sickness caused by the locomotion joystick methods.
  • the interface device provides a more intuitive method of physically moving on the board and also the ability to deliver haptic feedback to the user.
  • the interface device allows users to mount and dismount the platform during use (e.g. during a VR experience).
  • in order to provide such functionality, the interface device should be presented in the virtual experience as placed in the same physical vicinity and orientation relative to the user, so that the user can walk towards it and knowingly get on the board without having to take the HMD off.
  • Optional embodiments may also be said to broadly include the parts, elements, steps and/or features referred to or indicated herein, individually or in any combination of two or more of the parts, elements, steps and/or features, and wherein specific integers are mentioned which have known equivalents in the art to which the invention relates, such known equivalents are deemed to be incorporated herein as if individually set forth.

Abstract

An interface device (100) for virtual locomotion configured to accommodate a user (120). The interface device (100) includes a platform (110) and a base (130) to support the platform (110). The base (130) includes a lower surface (132) which is at least partly formed of resilient material and adapted to rest upon a substrate surface (140). The base (130) is configured to allow tilting of the platform (110) by the user (120). The interface device further includes at least one sensor (151, 152) configured to detect the tilting of the platform (110). Also disclosed is a system for virtual locomotion, and a method of controlling simulated movement in a virtual environment.

Description

VIRTUAL LOCOMOTION DEVICE
Technical Field
[001] The present invention relates to a virtual locomotion device.
[002] In particular, the present invention relates to a virtual locomotion device which includes a platform configured to accommodate a user, and which includes a base which is at least partly formed of resilient material and which supports the platform.
Background of the Invention
[003] The reference in this specification to any prior publication (or information derived from the prior publication), or to any matter which is known, is not, and should not be taken as an acknowledgment or admission or any form of suggestion that the prior publication (or information derived from the prior publication) or known matter forms part of the common general knowledge in the field of endeavour to which this specification relates.
[004] As promising and exciting as Virtual Reality (VR) and Augmented Reality (AR) are as emerging mediums of the future, their usefulness is limited by current user interfaces. Head-mount displays (HMDs) act as both input and output devices allowing the user to be perceptually immersed in a virtual world. Using stereoscopic audio-visual presentation, the HMD ‘tricks’ the user’s brain into believing he/she is in the virtual world. This is reinforced by the fact that as the user moves his/her head, the visual input is modified accordingly. Depending on the sophistication of the HMD technology, its head tracking may afford the user up to six degrees of freedom of movement (6DoF) allowing full rotation as well as translation of the head and the body. However, for current systems, which are mostly tethered to a computer, this is only possible up to a point.
[005] Normally, a user can only move a maximum of two to three metres within an allocated space before reaching either the end of the tether, or an obstacle such as a wall or furniture. Safety is especially compromised if the user moves too far beyond the previously cleared allocated space. Hence, when using VR experiences (games or non-games) that require movement of the user’s point of view, there is a need for a device or system to make this movement possible beyond just a few metres. This movement may be digitally abstracted using a joystick or other peripheral (e.g. a hand controller). However, this presents its own problems.
[006] For example, smooth virtual motion of a user, perceived by the visual cortex as movement, without matching perception by the inner ear (vestibular system), causes an incongruence leading to discomfort and nausea in many users. The alternative of non-smooth, discrete motion in the form of ‘teleportation’ from one location to another is more comfortable but often leads to disorientation and the “breaking of immersion” within the experience. This is undesirable for many users. In addition, the use of hand controllers for locomotion prevents the hands from being used for intuitive interaction with the virtual world and its objects.
[007] Therefore, there is a need to alleviate one or more of the above-mentioned problems or at least provide a useful alternative.
Summary of the Invention
[008] This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Preferred Embodiments. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
[009] According to one example aspect, there is provided an interface device for virtual locomotion including: a platform configured to accommodate a user; a base supporting the platform, the base having a lower surface configured to rest upon a substrate surface, wherein the base is configured to allow tilting of the platform by the user; and a first sensor configured to detect the tilting of the platform.
[010] In one form, the platform is configured to accommodate a user in one of a standing, sitting, or kneeling position.
[011] In one form, the substrate surface is a floor or ground surface supporting the interface device.
[012] In one form, the base is configured to allow tilting of the platform relative to the substrate surface.
[013] In one form, the substrate surface is a floor or ground surface.
[014] In one form, the base is configured to allow tilting of the platform by the user’s weight.
[015] In one form, the base includes a resilient member.
[016] In one form, the first sensor is fixed to the platform.
[017] In one form, the first sensor is configured to measure an angle of tilt of the platform relative to a horizontal plane.
[018] In one form, the first sensor is further configured to measure a direction of tilt of the platform relative to the horizontal plane.
[019] In one form, the interface device further includes a second sensor configured to measure the distance between the platform and the substrate surface.
[020] In one form, the second sensor is fixed to a lower surface of the platform.
[021] In one form, the interface device further includes a haptic feedback device configured to provide vibrations or a force to the user.
[022] In one form, the haptic feedback device is fixed to the platform.
[023] In one form, the interface device further includes a stabiliser.
[024] In one form, the stabiliser is located on a rim of a lower surface of the platform.
[025] In one form, the stabiliser includes a resilient material.
[026] In another example aspect, there is provided an interface device for virtual locomotion including: a platform configured to accommodate a user; a base supporting the platform, the base being configured to be supported by a substrate surface, wherein the base pivotally connects the platform to the substrate surface; and a sensor configured to detect a tilt angle of the platform relative to the substrate surface, and a tilt orientation of the platform relative to the substrate surface.
[027] In another example aspect, there is provided a system for virtual locomotion including: the interface device as described above; and a processing system configured to: obtain data relating to the tilt of the platform; and convert the data relating to the tilt of the platform into movement commands in the virtual environment.
[028] In one form, the system further includes a display device.
[029] In one form, the display device is a virtual reality headset.
[030] In one form, the system further includes a hand-held controller.
[031] In one form, the hand-held controller is a virtual reality joystick.
[032] In one form, the data relating to the tilt of the platform includes an angle of tilt of the platform relative to a horizontal surface.
[033] In one form, the data relating to the tilt of the platform includes an orientation of tilt of the platform in a horizontal surface.
[034] In one form, the processing system is further configured to stimulate forces or vibrations in the platform.
[035] In one form, the processing system is further configured to: determine a location and orientation of the interface device relative to the user; and create an image of the interface device in the virtual environment, wherein the image has a location and orientation relative to the user corresponding to the location and orientation of the interface device relative to the user.
[036] In another example aspect, there is provided a method of controlling movement in a virtual environment including the steps of: tilting a platform relative to a horizontal plane; obtaining data relating to the tilt of the platform; and converting the data relating to the tilt of the platform into movement commands in the virtual environment.
[037] In one form, the step of obtaining data relating to the tilt of the platform includes the steps of: measuring an angle of tilt of the platform relative to the horizontal surface; and measuring an orientation of tilt of the platform in the horizontal surface.
[038] In one form, the step of converting the data relating to the tilt of the platform into movement commands in the virtual environment includes the steps of: converting the angle of tilt into an acceleration magnitude in the virtual environment; and converting the orientation of tilt into a direction of acceleration in the virtual environment.
[039] In one form, the method further includes the step of detecting the presence of a user on the platform.
[040] In one form, the step of detecting the presence of a user on the platform includes the steps of: measuring a distance between the platform and a substrate surface; and measuring an angle of tilt of the platform.
[041] In one form, the interface device for virtual locomotion includes:
a platform configured to accommodate a user;
a base supporting the platform, the base having a lower surface configured to rest upon a substrate surface, wherein the base is at least partially formed of resilient material, and configured to allow tilting of the platform by the user; and,
at least one sensor configured to detect the tilting of the platform.
[042] In one preferred form, the platform is substantially rigid and adapted to support a person standing thereon.
[043] In a preferred form, said platform is of substantially circular shape.
[044] In a preferred form, said platform includes an upwardly facing curved upper surface.
[045] In a preferred form, said surface of said platform is substantially concave in shape.
[046] In a preferred form, said base includes an inner portion and an outer portion, wherein said outer portion is at least partially formed of resilient material which is less resilient, or softer, than the resilient material of the inner portion of the base.
[047] In a preferred form, said inner portion of said base is adapted to contact a substrate surface at a predetermined diameter from a centre point of said platform.
[048] In a preferred form, said inner portion of said base includes a plurality of shaped protrusions formed of resilient material which are adapted to contact said substrate surface at a plurality of spaced apart inner base contact positions.
[049] In a preferred form, said inner portion of said base includes eight substantially equidistantly spaced protrusions.
[050] In a preferred form, said outer portion of said base is adapted to contact a substrate surface at a predetermined diameter from a centre point of said platform.
[051] In a preferred form, said outer portion of said base is adapted to contact said substrate surface proximal to a peripheral edge of said platform.
[052] In a preferred form, said outer portion of said base is formed as a downwardly extending protrusion which extends substantially around the platform.
Brief Description of Figures
[053] Example embodiments are apparent from the following description, which is given by way of example only, of at least one preferred but non-limiting embodiment, described in connection with the accompanying figures, wherein:
[054] Figure 1 illustrates a three-dimensional view of an example interface device for virtual locomotion;
[055] Figure 2 illustrates a front view of the example interface device of Figure 1, where a user’s centre of gravity is aligned with a centre of a platform of the interface device;
[056] Figure 3 illustrates a front view of the example interface device of Figure 1, where a user’s centre of gravity (COG) is offset from a centre of a platform of the interface device;
[057] Figure 4 illustrates a bottom view of an example interface device;
[058] Figure 5 illustrates a cross-section of an example interface device along a vertical plane, where the interface device is in an equilibrium state;
[059] Figure 6 illustrates a cross-section of an example interface device along a vertical plane, where the interface device is in a tilted state;
[060] Figure 7 illustrates a cross-section of an example interface device along a vertical plane, where haptic feedback devices of the interface devices are producing vibrations in the interface device;
[061] Figure 8 illustrates a cross-section of an example interface device along a vertical plane, with a user standing alongside the interface device;
[062] Figure 9 illustrates a cross-section of the interface device of Figure 8 along a vertical plane, with the user standing on top of the interface device;
[063] Figure 10 illustrates an example system architecture for an example interface device;
[064] Figure 11 illustrates example configurations of an example interface device and Bluetooth dongle with a PC or mobile device;
[065] Figure 12 illustrates communications between an example interface device and the digital experience;
[066] Figure 13 illustrates example data sent by the interface device;
[067] Figure 14 illustrates example data received by the interface device;
[068] Figure 15 illustrates an example system for virtual locomotion;
[069] Figure 16 illustrates an example processing system for use in the system of Figure 15;
[070] Figure 17 illustrates an example method of controlling movement in a virtual environment;
[071] Figure 18 illustrates details of a preferred embodiment of a virtual locomotion platform design in accordance with the present invention;
[072] Figure 19 schematically illustrates details of the underside of the base of the embodiment of the platform shown in Figure 18;
[073] Figure 20 illustrates an example of the responsive tilt of the virtual locomotion platform due to a shift in the centre of gravity;
[074] Figure 21 graphically illustrates the rest angle of a platform after a shift of the centre of gravity; and,
[075] Figure 22 graphically illustrates the dynamic transit response to a change of angle of the platform of the present invention.
Detailed Description of Preferred Embodiments
[076] The following modes, given by way of example only, are described in order to provide a more precise understanding of the subject matter of a preferred embodiment or embodiments. In the figures, incorporated to illustrate features of an example embodiment, like reference numerals are used to identify like parts throughout the figures.
[077] Referring to Figures 1 to 3, there is illustrated an example interface device 100 for virtual locomotion. Interface device 100 includes a platform, or board, 110 configured to accommodate a user 120. Interface device 100 further includes a base 130 supporting the platform and configured to allow tilting of the platform by user 120. Interface device 100 further includes a sensor (not shown) configured to detect the tilting of the platform.
[078] Preferably, though not necessarily, platform 110 is configured to accommodate user 120 in a standing, or upright, position, where the user positions their feet on top of platform 110 such that the user’s whole body weight is exerted onto interface device 100. In some examples, platform 110 is further configured to accommodate user 120 in a sitting position, such as user 120 being seated cross-legged on platform 110. In some examples, platform 110 is configured to accommodate user 120 in any position. Preferably, though not necessarily, platform 110 is configured to accommodate user 120 in any position such that the user’s whole weight is exerted onto interface device 100.
[079] Platform 110 has a circular shape although other shapes, such as an oval, square or any other shape, may also be suitable. Platform 110 has an upper surface, on which user 120 stands, and a lower surface connected to base 130. Base 130 may be attached to platform 110, or it may be integrally formed with platform 110. Preferably, though not necessarily, platform 110 includes a flexible or elastic layer with sufficient tension to support user 120 above a floor or ground surface.
[080] Base 130 has a lower surface 132 configured to rest upon a substrate surface 140. Substrate surface 140 supports interface device 100 and user 120. In some examples, substrate surface 140 is a floor or ground surface, such as the floor of a room where user 120 is using interface device 100.
[081] Base 130 is configured to allow tilting of platform 110 relative to lower surface 132. During normal operation of interface device 100, where interface device 100 is stationary relative to substrate surface 140, lower surface 132 abuts and is substantially parallel to substrate surface 140. Therefore, in some examples, base 130 is configured to allow tilting of platform 110 relative to lower surface 132 or substrate surface 140.
[082] Therefore, interface device 100 includes platform 110 and base 130 supporting platform 110, wherein base 130 is configured to rest on substrate surface 140. Base 130 elevates platform 110 above substrate surface 140. Base 130 further provides a pivotal connection between platform 110 and substrate surface 140.
[083] Base 130 is configured to allow tilting of platform 110 by the user’s weight. Referring to Figures 2 and 3, base 130 is configured to allow tilting of platform 110 by the user shifting their weight, or centre of gravity (COG) relative to platform 110. User 120 may shift their centre of gravity through motion relative to platform 110 (e.g. by stepping to an edge of platform 110), or by leaning their body at an angle relative to platform 110.
[084] In some examples, base 130 allows tilting of platform 110 by including one or more resilient members, or one or more spring members. Preferably, though not necessarily, base 130 is located in a centre of platform 110, in order to improve stability of interface device 100. In some examples, where base 130 comprises a plurality (such as two or more) of resilient members, the resilient members are located in a symmetrical arrangement with respect to a lower surface of platform 110. For example, the resilient members may be circumferentially evenly spaced on the lower surface of platform 110. In some examples, the resilient members are spaced at regular intervals on the lower surface of platform 110. In some examples, base 130 is configured to provide a restoring force to return platform 110 from a tilted position to a non-tilted, or equilibrium, position.
[085] Referring to Figure 4, there is illustrated a bottom view of interface device 100, which includes an electronic module 150 located at or near the centre of the lower surface of platform 110. Electronic module 150 includes a first sensor 151, a second sensor 152, a communication unit 154, and a power source 156. Electronic module 150 may be contained within a casing or housing, attached to the lower surface of platform 110. Interface device 100 further includes haptic feedback devices 158.
[086] Power source 156 is configured to power electronic module 150. Power source 156 may further be configured to power other electrical devices of interface device 100, such as haptic feedback devices 158. Preferably, though not necessarily, power source 156 is a rechargeable battery. In other examples, power source 156 may be any other source of electrical energy, including a wired connection to a mains or external power supply.
[087] Communication unit 154 is configured to transmit output data, or processed data, derived from first sensor 151, second sensor 152, haptic feedback devices 158, and/or any other electronic device of interface device 100. Communication unit 154 is further configured to receive input data (such as configuration data) for first sensor 151, second sensor 152, haptic feedback devices 158, and/or any other electronic device of interface device 100.
[088] Interface device 100 further includes a removable antenna 155 stationed on electronic module 150. Removable antenna 155 is configured to be disconnected from electronic module 150 and to be connected to a processing system. Removable antenna 155 is then configured to receive data from and/or transmit data to communication unit 154. In some examples, communication unit 154 is configured to communicate with the processing system (e.g. a PC) through removable antenna 155. In some examples, communication unit 154 is configured to communicate with a processing system (e.g. a smart device) directly, without the need for removable antenna 155. Therefore, in some examples, communication unit 154 includes two modes of operation: in a first mode, communication unit 154 communicates with smart phones or other smart devices; in a second mode, communication unit 154 communicates with PCs or console devices through an additional dongle plugged into the PC or console device. This dongle may be removable antenna 155.
[089] Preferably, though not necessarily, removable antenna 155 is a Bluetooth dongle connected to a USB port of electronic module 150. In other examples, removable antenna is a device, such as a transmitter, a receiver, or a transceiver, configured to provide a wireless communication interface with communication unit 154. Preferably, though not necessarily, communication unit 154 is a Bluetooth chip. In other examples, communication unit 154 may be any other type of wired or wireless communication unit (e.g. WiFi capability).
[090] Interface device 100 further includes a stabiliser 160 connected to the lower surface of platform 110. Stabiliser 160 includes a ring mounted to a rim of the lower surface of platform 110. Preferably, though not necessarily, stabiliser 160 includes an elastic or resilient material. Stabiliser 160 stabilises platform 110 when it is tilted by user 120, helping user 120 maintain their balance. In some examples, stabiliser 160 maintains, preserves, sustains, or secures platform 110 when it is tilted, steadying interface device 100. In some examples, stabiliser 160 is a damping member that dampens or at least partially absorbs shock from platform 110 being tilted and impacting substrate surface 140. In some examples, stabiliser 160 is a cushioning, padding, or buffering member that protects platform 110 from damage when tilted, and provides a smoother user experience. Therefore, the stabilising effect that stabiliser 160 has on interface device 100 improves the accuracy of tilting platform 110.
[091] Referring to Figure 5, there is illustrated user 120 standing on platform 110, where platform 110 is in a rest state, or equilibrium state, such that it is not tilted relative to substrate surface 140. When platform 110 is in a state of equilibrium, stabiliser 160 may or may not be in contact with substrate surface 140. Preferably, though not necessarily, user 120 and interface device 100 are mainly supported by base 130. In some examples, user 120 and interface device 100 are supported by base 130 and stabiliser 160.
[092] Referring to Figure 6, there is illustrated user 120 standing on platform 110, where platform 110 is in a tilted state. When in a tilted state, stabiliser 160 is configured to contact substrate surface 140 to provide a cushioning or damping effect on the motion of user 120, aiding user 120 in maintaining their balance as platform 110 is tilted. In some examples, stabiliser 160 is biased against the tilting of platform 110, and therefore provides a restoring force to aid user 120 in returning platform 110 to its non-tilted, rest, or equilibrium state.
[093] Referring to Figure 7, there is illustrated user 120 standing on platform 110, wherein platform 110 is in a tilted state. Also shown in Figure 7 are haptic feedback devices 158. Haptic feedback devices 158 are configured to selectively provide haptic feedback (such as forces or vibrations) to user 120 through platform 110. Therefore, the haptic feedback devices are configured to stimulate a tactile sensation perceptible to user 120. In some examples, the tactile stimulation is indicative of a virtual object or a virtual motion within a virtual environment. Interface device 100 may include any number (such as one, two, or more) of haptic feedback devices 158.
[094] In some examples, haptic feedback devices 158 are located on the lower surface of platform 110. In some examples, haptic feedback devices 158 are located within, or embedded into, platform 110. In other examples, haptic feedback devices 158 are located in any location of platform 110 or interface device 100 to allow user 120 to experience haptic feedback (such as on an upper surface of platform 110). Haptic feedback devices 158 may be evenly circumferentially spaced on platform 110, or they may be provided in any other arrangement.
[095] First sensor 151 is configured to detect and/or measure the tilting (or a tilt angle) of platform 110. Preferably, though not necessarily, first sensor 151 measures a tilt orientation, being the orientation of tilt of platform 110 relative to a plane parallel to substrate surface 140. Preferably, though not necessarily, first sensor 151 also measures the tilt angle of platform 110 relative to a plane parallel to substrate surface 140. In some examples, first sensor 151 is an inertial measurement unit (IMU). In some examples, the IMU includes an accelerometer and a gyroscope. In other examples, interface device 100 includes one or more sensors configured to detect and/or measure the tilting (or a tilt angle) of platform 110.
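Where the first sensor is an IMU, the tilt angle and tilt orientation can be estimated from a static accelerometer reading, since gravity is the only acceleration when the platform is at rest. A sketch follows; the axis conventions (sensor z-axis pointing up out of the platform) are assumptions.

```python
import math

def tilt_from_accelerometer(ax, ay, az):
    """Estimate tilt angle and tilt orientation from a 3-axis accelerometer.

    Assumes the reading is dominated by gravity (platform approximately
    static) and the sensor z-axis is normal to the platform surface.
    """
    g = math.sqrt(ax * ax + ay * ay + az * az)  # magnitude of gravity
    # Angle between the sensor z-axis and the gravity vector: 0 when level.
    cos_tilt = max(-1.0, min(1.0, az / g))
    tilt_angle = math.degrees(math.acos(cos_tilt))
    # Direction of the horizontal component gives the orientation of the lean.
    tilt_orientation = math.degrees(math.atan2(ay, ax))
    return tilt_angle, tilt_orientation
```

In practice the gyroscope of the IMU would be fused with the accelerometer (e.g. via a complementary filter) to reject the extra accelerations the user introduces by moving.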
[096] Second sensor 152 is configured to detect and/or measure a distance between second sensor 152 and substrate surface 140. In particular, second sensor 152 may be used to detect the presence or absence of user 120 on platform 110 by measuring the distance between second sensor 152 and substrate surface 140.
[097] Referring to Figure 8, there is illustrated a first example where user 120 is not standing on platform 110, and second sensor 152 measures a first distance 801 between second sensor 152 and substrate surface 140. Referring to Figure 9, there is illustrated a second example where user 120 is standing on platform 110, and second sensor 152 measures a second distance 802 between second sensor 152 and substrate surface 140. Second distance 802 is smaller than first distance 801 due to the weight of user 120 that compresses base 130 (and possibly pushes down on elastic platform 110). By comparing second distance 802 to first distance 801, a processing system is able to detect the presence (or absence) of user 120 on platform 110 and therefore commence operation (or terminate operation) of interface device 100.
[098] In some examples, a user state of interface device 100 is determined by considering the information provided by one or more sensors. In one example, if second sensor 152 measures first distance 801 and first sensor 151 detects no tilt of platform 110, the user state of interface device 100 is determined to be OFF. Conversely, if second sensor 152 measures a reduction in distance between second sensor 152 and substrate surface 140, and first sensor 151 detects a tilt of platform 110, the user state of interface device 100 is determined to be ON. However, if second sensor 152 measures second distance 802 and first sensor 151 detects no tilt of platform 110 (due to the user’s centre of gravity being aligned with the centre of platform 110), the user state of interface device 100 may still be determined to be ON. Therefore, second sensor 152 alone may be sufficient to determine a user state of interface device 100 by detecting a reduction in distance due to an elastic depression of platform 110 and/or base 130.
[099] Referring to Figure 10, there is illustrated an example architecture of the electronic components and their respective connections. The locomotion device (i.e. interface device) and dongle are separate components of the complete solution, and Figure 11 shows how the two can be configured to be used with both a PC and a mobile smart device.
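The ON/OFF determination described above can be summarised in a few lines: the distance measurement alone decides whether the base is compressed, while a detected tilt is a secondary indicator. A sketch, with illustrative threshold values only:

```python
def user_state(distance_mm, tilt_detected,
               unloaded_distance_mm=60.0,
               compression_threshold_mm=3.0):
    """Combine the two sensors into an ON/OFF user state.

    OFF: distance at its unloaded value and no tilt detected.
    ON: base compressed by the user's weight, or any tilt detected
    (a centred user compresses the base but produces no tilt).
    """
    compressed = (unloaded_distance_mm - distance_mm) >= compression_threshold_mm
    if compressed or tilt_detected:
        return "ON"
    return "OFF"
```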
[0100] If the dongle is plugged into the locomotion device it advertises by Bluetooth low energy for mobile devices to connect with. If the dongle is plugged into a PC then it is automatically paired to the locomotion device, blocking other devices from connecting.
[0101] Figure 12 illustrates example communication between the locomotion device (i.e. interface device) and the digital experience on the PC/standalone VR device/smart device. The communication is bidirectional, allowing the user to interact with the experience and the experience to interact with the device. Figures 13 and 14 show example data sent and received by the interface device.
[0102] In some examples, there is provided a virtual locomotion control input device where a user can step onto the device (or mount the device) and step off the device (or dismount the device) during a virtual reality experience, as part of the virtual reality experience, without interrupting the virtual reality experience.
[0103] Example applications of the interface device include, but are not limited to, locomotion around real estate, simulation of skiing or snowboarding or skateboarding, educational experiences in museums, *** maps-like locomotion, locomotion for tours and travel, locomotion for sports, an interface for manipulating a UI, meditation, flying carpet, and stepmania games. In some examples, the interface device may be used in conjunction with smart devices (smart phones and tablets) to play games on smart apps (not necessarily VR). In some examples, the interface device may be used as an input device for control of augmented reality (AR) experiences. In some examples, the interface device may be used as an input method to control robotic devices such as vehicles and drones. In some examples, the interface device is used to navigate, move within, or interact with, a virtual environment. In other examples, the interface device is used to control, or remotely control, unmanned vehicles (e.g. drones, driverless cars) or any other system or apparatus whose movement is able to be controlled.
[0104] Referring to Figure 15, there is illustrated a system for virtual locomotion including a processing system 200 and an interface device 210. Processing system 200 is configured to obtain data relating to the tilt of the platform, and to convert the data relating to the tilt of the platform into movement commands in the virtual environment.
[0105] In some examples, interface device 210 is an interface device for virtual locomotion as described above. Interface device 210 is configured to be mounted by a user 220. User 220 interacts with interface device 210 by tilting the platform of interface device 210 using their body weight.
[0106] Processing system 200 obtains data relating to the tilt of the platform. In some examples, the data relating to the tilt of the platform includes an angle of tilt of the platform relative to a horizontal surface (e.g. a substrate surface such as a ground or floor surface onto which interface device 210 is placed). In some examples, the data relating to the tilt of the platform includes an orientation, or direction, of tilt of the platform in a horizontal surface (e.g. a substrate surface such as a ground or floor surface onto which interface device 210 is placed). A surface here is deemed to be horizontal if it is orthogonal to (or approximately orthogonal to) the Earth’s gravitational field.
[0107] Processing system 200 further converts the data relating to the tilt of the platform into movement commands in the virtual environment. In some examples, the system further includes additional devices for executing the movement commands. Therefore, in some examples, the system further includes a display device 230. In some examples, display device 230 is a wearable display such as a virtual reality (VR) headset, an augmented reality (AR) headset, or a head-mount display (HMD). In other examples, the display device could be any other type of display device such as a screen.
[0108] The system further includes hand-held controllers 240 configured to be held by a user to provide additional commands (e.g. movement commands or other types of commands) in the virtual environment. In some examples, hand-held controllers 240 are virtual reality joysticks. In some examples, hand-held controllers 240 and display device 230 interact together to provide user 220 with a seamless virtual reality experience.
[0109] In some examples, the system further includes a substrate surface 250, which may be a floor or ground surface of a room, or other location or object external to interface device 210. In some examples, surface 250 is part of a virtual reality experience of user 220. During the experience, user 220 may selectively mount onto or dismount from interface device 210 without interrupting the virtual reality experience. When dismounting interface device 210, user 220 may walk or otherwise move around substrate surface 250 while still being within the virtual reality experience. Furthermore, user 220 may mount interface device 210 from surface 250 without interrupting their virtual reality experience.
[0110] Interface device 210 may include a communication unit configured to communicate with processing system 200. Furthermore, the communication unit may be configured to communicate with display device 230 and/or hand-held controllers 240. Processing system 200 may further be configured to transmit data relating to the virtual environment to interface device 210. In some examples, processing system 200 interacts with user 220 through interface device 210. For example, processing system 200 may further be configured to stimulate forces, vibrations, or other tactile effects in the platform or in interface device 210. In some examples, processing system 200 provides commands to, operates, or activates tactile feedback devices of interface device 210.
[0111] In some examples, the system further includes other devices or components for playing a game, or for interacting with a virtual environment. For example, the system may further include handles, guns, and/or snowboarding or skateboarding boards.
[0112] In some examples, processing system 200 is further configured to track interface device 210, or the platform of interface device 210. That is, processing system 200 may be configured to track or determine the location and/or orientation of interface device 210. In some examples, the processing system is further configured to display an image of interface device 210 in the virtual environment, where the location and/or orientation of the image in the virtual environment corresponds (or directly corresponds in a one-to-one mapping or transformation) to the actual, physical location and/or orientation of interface device 210 within a real environment of user 220. In some examples, processing system 200 creates the virtual environment.
[0113] In some examples, tracking can be done using existing six degree of freedom (6DoF) trackers, or be camera based, or the location can be calibrated prior to use so that the VR experience can be presented virtually around the physical location/orientation of the device.
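The one-to-one mapping mentioned above can be sketched as reproducing the device's pose relative to the user around the virtual camera. The frame conventions here (planar positions with a yaw-only orientation, virtual camera at the origin) are simplifying assumptions.

```python
import math

def device_image_pose(device_pos, device_yaw_deg, user_pos, user_yaw_deg):
    """Place the device's image so that its pose relative to the virtual
    camera matches the real device's pose relative to the user."""
    # Offset of the device from the user, expressed in the user's frame
    # (rotate the world-frame offset by minus the user's yaw).
    dx = device_pos[0] - user_pos[0]
    dy = device_pos[1] - user_pos[1]
    c = math.cos(math.radians(-user_yaw_deg))
    s = math.sin(math.radians(-user_yaw_deg))
    rel = (c * dx - s * dy, s * dx + c * dy, device_pos[2] - user_pos[2])
    rel_yaw = device_yaw_deg - user_yaw_deg
    # The same offset and relative yaw are applied around the virtual camera.
    return rel, rel_yaw
```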
[0114] Referring to Figure 16, there is illustrated an example processing system 200, for use in combination with the interface device. In particular, the processing system 200 generally includes at least one processor 302, or processing unit or plurality of processors, memory 304, at least one input device 306 and at least one output device 308, coupled together via a bus or group of buses 310. In certain embodiments, input device 306 and output device 308 could be the same device. An interface 312 can also be provided for coupling the processing system 200 to one or more peripheral devices, for example interface 312 could be a PCI card or PC card. At least one storage device 314 which houses at least one database 316 can also be provided. The memory 304 can be any form of memory device, for example, volatile or non-volatile memory, solid state storage devices, magnetic devices, etc. The processor 302 could include more than one distinct processing device, for example to handle different functions within the processing system 200.
[0115] Input device 306 receives input data 318 and can include, for example, a keyboard, a pointer device such as a pen-like device or a mouse, audio receiving device for voice controlled activation such as a microphone, data receiver or antenna such as a modem or wireless data adaptor, data acquisition card, etc. Input data 318 could come from different sources, for example keyboard instructions in conjunction with data received via a network. Output device 308 produces or generates output data 320 and can include, for example, a display device or monitor in which case output data 320 is visual, a printer in which case output data 320 is printed, a port for example a USB port, a peripheral component adaptor, a data transmitter or antenna such as a modem or wireless network adaptor, etc. Output data 320 could be distinct and derived from different output devices, for example a visual display on a monitor in conjunction with data transmitted to a network. A user could view data output, or an interpretation of the data output, on, for example, a monitor or using a printer. The storage device 314 can be any form of data or information storage means, for example, volatile or non-volatile memory, solid state storage devices, magnetic devices, etc.
[0116] In use, the processing system 200 is adapted to allow data or information to be stored in and/or retrieved from, via wired or wireless communication means, the at least one database 316. The interface 312 may allow wired and/or wireless communication between the processing unit 302 and peripheral components that may serve a specialised purpose. The processor 302 receives instructions as input data 318 via input device 306 and can display processed results or other output to a user by utilising output device 308. More than one input device 306 and/or output device 308 can be provided. It should be appreciated that the processing system 200 may be any form of terminal, server, specialised hardware, or the like.
[0117] Referring to Figure 17, there is illustrated a method 400 of controlling movement in a virtual environment. Method 400 includes step 410 of tilting a platform relative to a horizontal plane, and step 420 of obtaining data relating to the tilt of the platform. Method 400 further includes step 430 of converting the data relating to the tilt of the platform into movement commands in the virtual environment.
[0118] In some examples, step 420 of obtaining data relating to the tilt of the platform includes a step of measuring an angle of tilt of the platform relative to the horizontal surface, and a step of measuring an orientation of tilt of the platform in the horizontal surface.
[0119] In some examples, step 430 of converting the data relating to the tilt of the platform into movement commands in the virtual environment includes a step of converting the angle of tilt into a simulated acceleration magnitude in the virtual environment, and a step of converting the orientation of tilt into a direction of simulated acceleration in the virtual environment.
[0120] In some examples, method 400 further includes the step of detecting the presence of a user on the platform. In some examples, the step of detecting the presence of a user on the platform includes the step of measuring a distance between the platform and a substrate surface, and the step of measuring an angle of tilt of the platform. In some examples, the substrate surface is a ground or floor surface supporting the platform.
[0121] In some examples, method 400 is a method of virtual locomotion, further including a step of converting the data relating to the tilt of the platform into simulated movement, or movement commands, in the virtual environment.
[0122] Figure 18 illustrates an exemplary embodiment of a virtual locomotion device 500 including an upper surface or platform 501, and a base 502, in which the base 502 is at least partially formed of resilient material.
[0123] Figure 18(a) illustrates an isometric view of the device 500, whilst Figure 18(b) shows the device 500 partially cut-away, and Figure 18(c) shows details of the cross-section of Figure 18(b).
[0124] The upper surface of the platform 501 is preferably substantially rigid and is adapted to support at least one foot of a person. Most preferably the platform 501 is adapted to support a person standing thereon.
[0125] The platform 501 may be of any desired shape, but in a preferred embodiment of the invention, the platform 501 is preferably of substantially circular shape.
[0126] Whilst the upper surface of the platform may be ‘flat’ or planar, as perhaps best illustrated in Figure 18(c), the upper surface of the platform may, in a preferred embodiment, be formed as an upwardly facing curved upper surface 501, which is preferably slightly concave in shape. A slight upwardly facing concave curvature of the upper surface 501 helps a person standing on the device 500 to readily position themselves in the centre of the device 500, and/or may assist their sense of balance.
[0127] The base 502 preferably includes an inner portion 504 and an outer portion 503. The outer portion 503 is preferably at least partially formed of resilient material which is of lower resilience or softer than the resilient material of the inner portion 504 of the base 502.
[0128] The inner portion 504 of the base 502 is adapted to contact a substrate surface at a predetermined diameter from a centre point of said device 500. In a preferred embodiment, the inner portion 504 of said base 502 includes a plurality of shaped protrusions 505 formed of resilient material which are adapted to contact said substrate surface at a plurality of spaced-apart inner base contact positions.
[0129] Any number of protrusions may be provided, but in a preferred embodiment, as shown in Figure 19, the protrusions 505 are preferably symmetrically spaced about the base 502, in the four quadrants thereof. For example the inner portion 504 of said base 502 may preferably include four, eight or twelve protrusions, or, some other multiple of four protrusions. In a preferred embodiment eight substantially equidistant spaced protrusions 505 are provided.
[0130] The outer portion 503 of the base 502 is preferably adapted to contact a substrate surface at a predetermined diameter from a centre point of said device 500. In a preferred embodiment, the outer portion 503 of said base 502 is adapted to contact said substrate surface proximal to a peripheral edge of said device 500. Preferably, the outer portion of said base is formed as a downwardly extending protrusion 506 which extends substantially around the base 502 of the device 500.
[0131] The virtual locomotion platform of the present invention works by presenting an experiential illusion to the user. This experience relies on audio-visual inputs as well as passive and active physical responses of the platform on which the user stands. The passive inputs include the way the platform tilts when the user shifts his/her weight around the platform. The active response is derived from vibrotactile haptic motors under the platform which are 'felt' by the user's feet.
[0132] The passive response of the platform can be grouped into static and dynamic responses. The static response is the angle at which the platform rests when the user stabilises his/her posture and centre of gravity (CoG) above the platform, as illustrated in Figure 20. The dynamic response is given by the viscoelasticity and inertia of the movement of the platform during the movement of the user. This latter attribute affects the speed profile and stability of the response. In order to quantify aspects of this experience objectively, Figures 21 and 22 show graphs of the profile of the responses in measurable attributes.
[0133] The shapes of these plots describe the static response of each platform once the construction elements have been curated and assembled. The shapes of these plots relate to how responsive the platform is as a sensor, and how effective it is as an agent of the illusion that informs the user they are floating on a moving platform. This experience is in turn very important for comfort and stability.
[0134] In some examples, the interface device enables simulated locomotion in digital or virtual spaces in an intuitive and immersive manner. In some examples, the interface device eliminates or reduces simulation sickness (i.e. motion sickness) in virtual reality (VR). [0135] In some examples, the interface device therefore provides a method of locomotion in digital spaces including VR. It is done through a physical board that a user stands on and shifts their body weight, through leaning or stepping. This translates to the board mildly tilting (in some examples, up to five degrees in any given direction), which is measured by the hardware and processed as the user's intention to move virtually in a particular direction with a proportional level of intensity (e.g. speed or acceleration).
[0136] As the interface device transmits the user's control inputs back into the experience, the software processes the virtual outcome and returns data to the device, informing it of the state of its virtual representation (e.g. its speed, or whether it has impacted something). The return data is processed by the interface device and presented back to the user as vibrotactile haptic feedback. Using this method, the user can "feel" the ground or impacts under his/her feet.
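The two-channel feedback loop described above can be sketched as follows. The amplitude and frequency ranges and the pulse duration are illustrative assumptions, not values from the specification:

```python
def haptic_from_state(speed, max_speed=10.0, collided=False):
    """Translate return data from the experience into two haptic channels.

    Channel 1: a continuous hum whose amplitude and frequency track the
    perceived speed of the virtual representation.
    Channel 2: a brief, distinct pulse on collision events.
    All units and ranges here are assumed for illustration.
    """
    ratio = max(0.0, min(speed / max_speed, 1.0))
    hum = {"amplitude": ratio, "frequency_hz": 40 + 120 * ratio}
    pulse = {"amplitude": 1.0, "duration_ms": 80} if collided else None
    return hum, pulse
```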
[0137] In some examples, the interface device provides a fun, engaging, versatile method for locomotion in VR. In some examples, the interface device is low cost (compared to certain alternative solutions for virtual locomotion) and is simple, from a developer's perspective, to create content for. In some examples, the interface device is a peripheral platform in VR that allows users to step onto the locomotion platform and get off without needing to stop the experience or remove the headset. In some examples, the interface device leaves the hands of the user free to engage with the virtual world in an unencumbered way. In some examples, the haptic feedback under the feet enhances engagement and the effectiveness of the experience.
[0138] In some examples, the interface device enables locomotion in a virtual environment by processing movement control inputs from the actions and movements of the user. If the user wants to move in a particular direction in the horizontal plane, he/she will shift their centre of mass on the board towards that direction. This leads to a physical tilting of the board in that direction. The magnitude of this tilt is proportional to the amount by which the user moves his/her centre of gravity on the board. This control input is sent to a computer handling the experience in a simple format: namely direction and magnitude of an angle of tilt of the platform. [0139] Depending on the parameters and context of the digital experience as designed by the developer of the content, the user will perceive virtual movement following his/her shifts/tilts. This perceived motion is accompanied by vibrotactile feedback in harmony with the experience. For example, movement on the ground leads to a humming vibration on the board whose amplitude and frequency relates to the perceived velocity. A collision with a virtual object leads to a brief but distinct/deliberate vibration on a second channel of haptic feedback.
[0140] Research on simulation sickness has indicated that the phenomenon is complex and not purely based on simple reflexes. The cause of sim-sickness is likely to be a composite of physiological, cognitive and psychological factors that can all contribute to excite and/or inhibit the effect. In some trials, an example interface device as described herein has been shown to allow VR locomotion while significantly reducing the chances of triggering simulation sickness. It is believed that this is because the user is engaging his/her body to control movement, and because the response to this movement is "believed" as an experience by the brain of the user. This is why it is preferable, in some examples, for the user to be standing on the board during use.
[0141] Some advantages of the interface device as described herein include but are not limited to:
1) The user is upright and standing on a board that is flexible as a platform and that, in the perception of the virtual world, feels as though it is floating.
2) The user is able to see the board in the same location and in the same form in both real and virtual spaces. The user can thus engage the board and mount it without having to remove the head mount display.
3) The user is able to enjoy additional depth and nuance in the experience by going on and off the board during the experience. Off the board, the user is able to walk around in a '1 to 1' room-scale scenario. However, when on the board, the board is activated as a 'personal transport device'.
4) The haptic feedback under the feet gives an extra layer of sensory input to the user enhancing his/her immersion and making the experience more believable.
5) The device may be battery operated and Bluetooth-compatible, making it portable.
This allows it to be used for multiple platforms of VR devices as well as standard smart tablets and phones for AR and for non-AR/VR applications. 6) Physical tilting provides, in some examples, unlimited movement range within the constraints of the virtual environment, without physical constraints such as the physical room layout or the maximum range of an electrical cable. This provides increased immersion, more akin to a natural experience, therefore reducing motion sickness in some cases.
[0142] In some examples, the interface device involves an intuitive method of locomotion in digital spaces. This is done with a physical platform that the user stands on. The platform tilts downwards in the direction in which the user leans, steps, or shifts their body weight. This tilt is translated into locomotion in the digital space, such as virtual reality, augmented reality, or mixed reality. The interface device also provides vibration haptic feedback to the user to increase the immersion of the experience in the digital space. Also provided are methods of employing the physical device for locomotion in the digital space, and a method of detecting when the user is on or off the board.
[0143] In some examples, a method of virtual locomotion includes the way the physical tilt of the platform is translated into digital locomotion in the software/experience. This includes how it is implemented in the physics engine of the development environment (Unity, Unreal Engine).
[0144] In some examples, the board tilts about a fulcrum point at the centre of the bottom of the board (or multiple fulcrum points about the centre of the bottom of the board). In some examples, the tilting mechanism employs a combination of elastic and damping materials. This is to increase the comfort of the user when leaning and quickly changing positions/angles. In some examples, the user steps onto/off the board to utilise the locomotion in the digital spaces. In some examples, the user physically moves/leans/steps to shift their body weight in the direction that they want to move in the digital space. In some examples, the user can sit on the board. In some examples, the user can use it for a balancing application.
[0145] In some examples, an angle of tilt of the platform is measured by an IMU. In some examples, the IMU uses a combination of an electronic accelerometer and gyroscope. In some examples, the accelerometer measures the angle of tilt (in an x, y, z reference frame) with respect to gravity. In some examples, the gyroscope measures the rotational speed (in an x, y, z reference frame). In some examples, the sensor signals are combined into one angle tilt function. In some examples, a method for combining the sensor signals into one angle tilt function includes the steps of: calculating the accelerometer pitch and roll angles from the sensor x-y-z output using trigonometric functions; and calculating the gyroscope pitch and roll angles from the sensor's degrees/second output, for example: current pitch angle = previous pitch angle + gyroscope value x increment in time. Example sensor fusion formula: current combined pitch angle = 0.95 x (previous combined pitch angle + gyroscope value x time increment) + 0.05 x accelerometer pitch angle. The method further includes the step of using the combined pitch and roll angles to calculate the magnitude and direction of tilt using trigonometry. The magnitude is the angle of tilt made with respect to the gravity vector. The direction is the 0-359 degree position with respect to the horizontal plane, with the 0 direction defined at start-up. The angle tilt value is used to calculate the direction and magnitude values for locomotion around the digital space.
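A minimal sketch of the described sensor fusion, using the example 0.95/0.05 complementary-filter weighting from the text. The function names, and the small-angle combination of pitch and roll into a single magnitude and direction, are assumptions of the sketch rather than part of the disclosure:

```python
import math

ALPHA = 0.95  # weighting from the example sensor fusion formula in the text

def accel_angles(ax, ay, az):
    """Pitch and roll (degrees) from accelerometer x-y-z, relative to gravity."""
    pitch = math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, math.sqrt(ax * ax + az * az)))
    return pitch, roll

def fuse(prev_pitch, prev_roll, gyro_pitch_rate, gyro_roll_rate,
         ax, ay, az, dt):
    """Complementary filter: integrate gyro rates, correct with accelerometer."""
    acc_pitch, acc_roll = accel_angles(ax, ay, az)
    pitch = ALPHA * (prev_pitch + gyro_pitch_rate * dt) + (1 - ALPHA) * acc_pitch
    roll = ALPHA * (prev_roll + gyro_roll_rate * dt) + (1 - ALPHA) * acc_roll
    return pitch, roll

def magnitude_direction(pitch, roll):
    """Tilt magnitude (degrees from the gravity vector, small-angle
    approximation) and 0-359 degree direction in the horizontal plane."""
    magnitude = math.hypot(pitch, roll)
    direction = math.degrees(math.atan2(roll, pitch)) % 360
    return magnitude, direction
```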
[0146] In some examples, the interface device is able to determine if the user is on or off the board. In some examples, a proximity sensor is fixed to the base of the board, at a distance from the floor. This distance changes depending on the weight on the board. The proximity signal is then used to determine whether someone is on or off the board, and also whether a new user has stepped onto the board.
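The on/off detection described above can be sketched as thresholding of the proximity reading. The millimetre thresholds and the hysteresis band are illustrative assumptions only:

```python
class PresenceDetector:
    """Infer on/off-board state from a base-to-floor proximity reading.

    The gap shrinks when weight compresses the resilient base, so a reading
    below `on_mm` suggests a user is aboard. The separate `off_mm` threshold
    (hysteresis) avoids flickering near the boundary. Values are assumed.
    """

    def __init__(self, on_mm=8.0, off_mm=12.0):
        self.on_mm = on_mm
        self.off_mm = off_mm
        self.on_board = False

    def update(self, distance_mm):
        if not self.on_board and distance_mm < self.on_mm:
            self.on_board = True   # base compressed: user stepped on
        elif self.on_board and distance_mm > self.off_mm:
            self.on_board = False  # base relaxed: user stepped off
        return self.on_board
```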
[0147] In some examples, the interface device is capable of wireless communication. In some examples, Bluetooth Low Energy is used as the primary communication method between the board and a PC or mobile device. In some examples, a custom HID GATT profile is used for the communications.
[0148] In some examples, the interface device includes modules or devices for vibration haptic feedback. In some examples, haptic feedback is delivered through vibration of the board to the user's feet, or to another part of the body that is in direct contact with the board. In some examples, this vibration is created using vibration motors (e.g. eccentric mass motors). This vibration gives the user haptic feedback for events such as collisions, or for different states that the user may be in within the digital/virtual space, such as a hover sensation or movement across rough terrain. [0149] In some examples, there is provided a method or a software implementation of converting physical movement or input on the interface device into movement in a virtual environment. In some examples, the method or software accounts for responsiveness to physical movement, speed and momentum in the digital space, the on/off boarding experience (e.g. slow start/stop when getting on or off the board), and utilising translational and rotational movement.
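The responsiveness, momentum, and slow start/stop behaviour mentioned in paragraph [0149] can be sketched as a first-order lag on the commanded velocity; the function name and time constant are assumed tuning values, not part of the disclosure:

```python
def step_velocity(current, target, dt, tau=0.5):
    """Ease the virtual velocity toward the commanded (tilt-derived) velocity.

    A first-order lag with time constant `tau` (an assumed tuning value)
    gives the virtual movement momentum, and a gentle start/stop when the
    user mounts or dismounts (the target velocity snapping to zero).
    """
    k = min(dt / tau, 1.0)  # fraction of the gap closed this frame
    return current + (target - current) * k
```

Called once per frame, this smooths abrupt changes in the control input into gradual changes in perceived motion.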
[0150] In some examples, it is desirable to recreate natural sensations in virtual environments to increase the immersion of the user. The interface device avoids the difficulty of developing methods to simulate walking: it acts like a virtual hoverboard, and the user more readily accepts that they are hovering around in a virtual world. In addition, in some examples, the interface device offers increased immersion in the virtual experience and, in most cases, a reduction in the feelings of motion sickness caused by joystick-based locomotion methods. In some examples, the interface device provides a more intuitive method of physically moving on the board, together with the ability to deliver haptic feedback to the user.
[0151] In some examples, the interface device allows users to mount and dismount the platform during use (e.g. during a VR experience). In some examples, in order to provide such functionality, the interface device should be presented in the virtual experience in the same physical location and orientation relative to the user, so that the user can walk towards it and knowingly step onto the board without having to take the HMD off.
[0152] Optional embodiments may also be said to broadly include the parts, elements, steps and/or features referred to or indicated herein, individually or in any combination of two or more of the parts, elements, steps and/or features, and wherein specific integers are mentioned which have known equivalents in the art to which the invention relates, such known equivalents are deemed to be incorporated herein as if individually set forth.
[0153] Although a preferred embodiment has been described in detail, it should be understood that many modifications, changes, substitutions or alterations will be apparent to those skilled in the art without departing from the scope of the present invention. [0154] Throughout this specification and the claims which follow, unless the context requires otherwise, the word "comprise", and variations such as "comprises" or "comprising", will be understood to imply the inclusion of a stated integer or step or group of integers or steps but not the exclusion of any other integer or step or group of integers or steps.

Claims

THE CLAIMS:
1. An interface device for virtual locomotion including:
a platform configured to accommodate a user;
a base supporting the platform, the base having a lower surface configured to rest upon a substrate surface, wherein the base is configured to allow tilting of the platform by the user; and
a first sensor configured to detect the tilting of the platform.
2. The interface device of claim 1, wherein the platform is configured to accommodate a user in one of a standing, sitting, or kneeling position.
3. The interface device of claim 1 or 2, wherein the substrate surface is a floor or ground surface supporting the interface device.
4. The interface device of any one of claims 1 to 3, wherein the base is configured to allow tilting of the platform relative to the substrate surface.
5. The interface device of any one of claims 1 to 4, wherein the substrate surface is a floor or ground surface.
6. The interface device of any one of claims 1 to 5, wherein the base is configured to allow tilting of the platform by the user’s weight.
7. The interface device of any one of claims 1 to 6, wherein the base includes a resilient member.
8. The interface device of any one of claims 1 to 7, wherein the first sensor is fixed to the platform.
9. The interface device of any one of claims 1 to 8, wherein the first sensor is configured to measure an angle of tilt of the platform relative to a horizontal plane.
10. The interface device of claim 9, wherein the first sensor is further configured to measure a direction of tilt of the platform relative to the horizontal plane.
11. The interface device of any one of claims 1 to 10, further including a second sensor configured to measure the distance between the platform and the substrate surface.
12. The interface device of claim 11, wherein the second sensor is fixed to a lower surface of the platform.
13. The interface device of any one of claims 1 to 12, further including a haptic feedback device configured to provide vibrations or a force to the user.
14. The interface device of claim 13, wherein the haptic feedback device is fixed to the platform.
15. The interface device of any one of claims 1 to 14, further including a stabiliser.
16. The interface device of claim 15, wherein the stabiliser is located on a rim of a lower surface of the platform.
17. The interface device of claim 15 or 16, wherein the stabiliser includes a resilient material.
18. An interface device for virtual locomotion including:
a platform configured to accommodate a user;
a base supporting the platform, the base being configured to be supported by a substrate surface, wherein the base pivotally connects the platform to the substrate surface; and
a sensor configured to detect a tilt angle of the platform relative to the substrate surface, and a tilt orientation of the platform relative to the substrate surface.
19. A system for virtual locomotion including:
the interface device of any one of claims 1 to 18; and
a processing system configured to: obtain data relating to the tilt of the platform; and
convert the data relating to the tilt of the platform into movement commands in the virtual environment.
20. The system of claim 19, further including a display device.
21. The system of claim 20, wherein the display device is a virtual reality headset.
22. The system of any one of claims 19 to 21, further including a hand-held controller.
23. The system of claim 22, wherein the hand-held controller is a virtual reality joystick.
24. The system of any one of claims 19 to 23, wherein the data relating to the tilt of the platform includes an angle of tilt of the platform relative to a horizontal surface.
25. The system of any one of claims 19 to 24, wherein the data relating to the tilt of the platform includes an orientation of tilt of the platform in a horizontal surface.
26. The system of any one of claims 19 to 25, wherein the processing system is further configured to stimulate forces or vibrations in the platform.
27. The system of any one of claims 19 to 26, wherein the processing system is further configured to:
determine a location and orientation of the interface device relative to the user; and create an image of the interface device in the virtual environment, wherein the image has a location and orientation relative to the user corresponding to the location and orientation of the interface device relative to the user.
28. A method of controlling movement in a virtual environment including the steps of: tilting a platform relative to a horizontal plane;
obtaining data relating to the tilt of the platform; and
converting the data relating to the tilt of the platform into movement commands in the virtual environment.
29. The method of claim 28, wherein the step of obtaining data relating to the tilt of the platform includes the steps of:
measuring an angle of tilt of the platform relative to the horizontal surface; and measuring an orientation of tilt of the platform in the horizontal surface.
30. The method of claim 29, wherein the step of converting the data relating to the tilt of the platform into movement commands in the virtual environment includes the steps of: converting the angle of tilt into an acceleration magnitude in the virtual environment; and
converting the orientation of tilt into a direction of acceleration in the virtual environment.
31. The method of any one of claims 28 to 30, wherein the method further includes the step of detecting the presence of a user on the platform.
32. The method of any one of claims 28 to 31, wherein the step of detecting the presence of a user on the platform includes the steps of:
measuring a distance between the platform and a substrate surface; and
measuring an angle of tilt of the platform.
33. An interface device for virtual locomotion including:
a platform configured to accommodate a user;
a base supporting the platform, the base having a lower surface configured to rest upon a substrate surface, wherein the base is at least partially formed of resilient material and configured to allow tilting of the platform by the user; and,
at least one sensor configured to detect the tilting of the platform.
34. The interface device as claimed in claim 33, wherein the platform is substantially rigid and adapted to support a person standing thereon.
35. The interface device as claimed in claim 33 or 34, wherein said platform is of substantially circular shape.
36. The interface device as claimed in any one of claims 33 to 35 wherein said platform includes an upwardly facing curved upper surface.
37. The interface device as claimed in any one of claims 33 to 36 wherein said platform is slightly concave in shape.
38. The interface device as claimed in any one of claims 33 to 37 wherein said base includes an inner portion and an outer portion, wherein said outer portion is at least partially formed of resilient material which is less resilient/softer than the resilient material of the inner portion of the base.
39. The interface device as claimed in any one of claims 33 to 38 wherein said inner portion of said base is adapted to contact a substrate surface at a predetermined diameter from a centre point of said platform.
40. The interface device as claimed in any one of claims 33 to 39 wherein said inner portion of said base includes a plurality of shaped protrusions formed of resilient material which are adapted to contact said substrate surface at a plurality of spaced apart inner base contact positions.
41. The interface device as claimed in any one of claims 33 to 40, wherein said inner portion of said base includes eight substantially equidistant spaced protrusions.
42. The interface device as claimed in any one of claims 33 to 41, wherein said outer portion of said base is adapted to contact a substrate surface at a predetermined diameter from a centre point of said platform.
43. The interface device as claimed in any one of claims 33 to 42, wherein said outer portion of said base is adapted to contact said substrate surface proximal to a peripheral edge of said platform.
44. The interface device as claimed in any one of claims 33 to 43, wherein said outer portion of said base is formed as a downwardly extending protrusion which extends substantially around the platform.
PCT/AU2019/050166 2018-02-28 2019-02-27 Virtual locomotion device WO2019165501A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
AU2018900655 2018-02-28
AU2018900655A AU2018900655A0 (en) 2018-02-28 Interface device, system, and method for virtual locomotion
AU2019900314A AU2019900314A0 (en) 2019-02-01 Virtual locomotion device
AU2019900314 2019-02-01

Publications (1)

Publication Number Publication Date
WO2019165501A1 true WO2019165501A1 (en) 2019-09-06

Family

ID=67804800

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2019/050166 WO2019165501A1 (en) 2018-02-28 2019-02-27 Virtual locomotion device

Country Status (1)

Country Link
WO (1) WO2019165501A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4817950A (en) * 1987-05-08 1989-04-04 Goo Paul E Video game control unit and attitude sensor
US20040224824A1 (en) * 2003-05-05 2004-11-11 Brett Lickle Balance training device and method of use
US20050195128A1 (en) * 2004-03-03 2005-09-08 Sefton Robert T. Virtual reality system
US7542040B2 (en) * 2004-08-11 2009-06-02 The United States Of America As Represented By The Secretary Of The Navy Simulated locomotion method and apparatus
US20110009241A1 (en) * 2009-04-10 2011-01-13 Sovoz, Inc. Virtual locomotion controller apparatus and methods
US20140035888A1 (en) * 2011-01-05 2014-02-06 Stelulu Technology Inc. Foot-operated controller for controlling a machine


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111184993A (en) * 2019-12-30 2020-05-22 河南水利与环境职业学院 Virtual reality self-service gaming device
CN111184993B (en) * 2019-12-30 2021-06-29 河南水利与环境职业学院 Virtual reality self-service gaming device
KR20220140152A (en) * 2021-04-09 2022-10-18 손영범 Board type controller for virtual reality application program
KR102657001B1 (en) * 2021-04-09 2024-04-26 주식회사 미디어큐빗 Board type controller for virtual reality application program


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19760819

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19760819

Country of ref document: EP

Kind code of ref document: A1