CN113260427A - Drop detection system and method - Google Patents

Drop detection system and method

Info

Publication number
CN113260427A
Authority
CN
China
Prior art keywords
visualization device
wearable visualization
processor
response
wearable
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202080008665.0A
Other languages
Chinese (zh)
Inventor
P·J·格尔根
T·M·特鲁吉罗
M·E·格拉厄姆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Universal City Studios LLC
Original Assignee
Universal City Studios LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Universal City Studios LLC
Publication of CN113260427A
Legal status: Pending

Classifications

    • A63F13/211: Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • A63F13/212: Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A63F13/213: Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F13/24: Constructional details thereof, e.g. game controllers with detachable joystick handles
    • A63F13/25: Output arrangements for video game devices
    • A63F13/655: Generating or modifying game content before or while executing the game program, automatically by game devices or servers from real world data, by importing photos, e.g. of the player
    • A63F13/73: Authorising game programs or game devices, e.g. checking authenticity
    • A63F13/75: Enforcing rules, e.g. detecting foul play or generating lists of cheating players
    • A63F13/90: Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
    • A63F13/98: Accessories, i.e. detachable arrangements optional for the use of the video game device, e.g. grip supports of game controllers
    • A63G31/16: Amusement arrangements creating illusions of travel
    • G01P15/18: Measuring acceleration; measuring deceleration; measuring shock, i.e. sudden change of acceleration, in two or more dimensions
    • G02B27/0093: Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G02B27/0101: Head-up displays characterised by optical features
    • G02B27/0172: Head-up displays, head mounted, characterised by optical features
    • G02B27/0176: Head-up displays, head mounted, characterised by mechanical features
    • G06F1/1626: Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G06F1/1637: Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F1/1694: Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012: Head tracking input arrangements
    • G06F3/013: Eye tracking input arrangements
    • G06F3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G06T19/006: Mixed reality
    • G06T5/80: Geometric correction
    • G06T7/0002: Inspection of images, e.g. flaw detection
    • H04N23/54: Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N23/57: Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N23/90: Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • G02B2027/0138: Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G02B2027/014: Head-up displays characterised by optical features comprising information/image processing systems
    • G06T2207/30168: Image quality inspection


Abstract

A detection system (50) is configured to detect improper manipulation of a wearable visualization device (12). The detection system (50) includes a sensor (40) coupled to the wearable visualization device (12), a light emitter (42) coupled to the wearable visualization device (12), and a processor (54) configured to receive a signal from the sensor (40). The processor (54) is further configured to determine whether the signal indicates improper manipulation of the wearable visualization device (12), and to direct illumination of the light emitter (42) in response to determining that the signal indicates improper manipulation of the wearable visualization device (12).

Description

Drop detection system and method
Cross reference to related applications
This application claims priority to and the benefit of U.S. Provisional Application No. 62/791,735, entitled "AUGMENTED REALITY (AR) HEADSET FOR HIGH THROUGHPUT ATTRACTIONS," filed January 11, 2019, which is hereby incorporated by reference in its entirety for all purposes.
Background
This section is intended to introduce the reader to various aspects of art, which may be related to various aspects of the present technology that are described and/or claimed below. This discussion is believed to be helpful in providing the reader with background information to facilitate a better understanding of the various aspects of the present disclosure. Accordingly, it should be understood that these statements are to be read in this light, and not as admissions of prior art.
Amusement and/or theme parks are designed to provide entertainment to guests. Areas of the amusement park may have different themes that are specifically targeted to certain audiences. For example, some areas may include themes that are traditionally of interest to children, while other areas may include themes that are traditionally of interest to more mature audiences. Generally, such themed areas may be referred to as attractions or themed attractions. It is recognized that it may be desirable to enhance the immersive experience of guests in such attractions, such as by augmenting the themes with virtual features.
Disclosure of Invention
The following sets forth a summary of certain embodiments disclosed herein. It should be understood that these aspects are presented merely to provide the reader with a brief summary of these certain embodiments and that these aspects are not intended to limit the scope of this disclosure. Indeed, this disclosure may encompass a variety of aspects that may not be set forth below.
In one embodiment, a detection system is configured to detect improper manipulation of a wearable visualization device. The detection system includes a sensor coupled to the wearable visualization device, a light emitter coupled to the wearable visualization device, and a processor configured to receive a signal from the sensor. The processor is further configured to determine whether the signal indicates improper manipulation of the wearable visualization device, and to direct illumination of the light emitter in response to determining that the signal indicates improper manipulation of the wearable visualization device.
In one embodiment, a wearable visualization device includes a housing, a sensor supported by the housing and configured to detect motion of the wearable visualization device, a light emitter supported by the housing, and a processor configured to receive a signal from the sensor, determine whether the signal indicates that the wearable visualization device has been dropped or thrown based on the detected motion of the wearable visualization device, and direct illumination of the light emitter in response to determining that the signal indicates that the wearable visualization device has been dropped.
In one embodiment, a method of detecting improper manipulation of a wearable visualization device using a detection system includes receiving, at a processor, a signal from a sensor coupled to the wearable visualization device. The method also includes determining, using the processor, that the signal indicates improper manipulation of the wearable visualization device. The method also includes counting, using the processor, a number of improper-manipulation events of the wearable visualization device over time. The method also includes directing, using the processor, illumination of a light emitter in response to determining that the number of events exceeds a count threshold.
Various refinements of the features noted above may be made with respect to various aspects of the present disclosure. Additional features may also be incorporated in these various aspects. These refinements and additional features may exist individually or in any combination.
Drawings
These and other features, aspects, and advantages of the present disclosure will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
FIG. 1 is a perspective view of a wearable visualization device and an interface device of an Augmented Reality (AR), Virtual Reality (VR), and/or mixed reality (combination of AR and VR) system (AR/VR system) in an engaged configuration, according to an embodiment;
FIG. 2 is a perspective view of the wearable visualization device and the interface device of FIG. 1 in a disengaged configuration, according to the present embodiment;
FIG. 3 is a schematic illustration of components of a detection system for the wearable visualization device of FIG. 1, according to the present embodiment;
FIG. 4 is a perspective view of a portion of an attraction in which the AR/VR system of FIG. 1 may be utilized, in accordance with the present embodiments;
FIG. 5 illustrates a method of using the AR/VR system of FIG. 1, in accordance with the present embodiments; and
FIG. 6 is a schematic illustration of a question that may be presented via the wearable visualization device of FIG. 1, according to the present embodiment, wherein the wearable visualization device enables a user to respond to the question with gesture inputs.
Detailed Description
One or more specific embodiments will be described below. In an effort to provide a concise description of these embodiments, not all features of an actual implementation are described in the specification. It should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another. Moreover, it should be appreciated that such a development effort might be complex and time consuming, but would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.
When introducing elements of various embodiments of the present disclosure, the articles "a," "an," and "the" are intended to mean that there are one or more of the elements. The terms "comprising," "including," and "having" are intended to be inclusive and mean that there may be additional elements other than the listed elements. In addition, it should be understood that references to "one embodiment" or "an embodiment" of the present disclosure are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features.
An amusement park may include an Augmented Reality (AR), Virtual Reality (VR), and/or mixed reality (a combination of AR and VR) system (e.g., an AR/VR system) configured to enhance a guest experience of an amusement park attraction by providing guests with an AR/VR experience (e.g., an AR experience, a VR experience, or both). Indeed, combinations of certain hardware configurations, software configurations (e.g., algorithmic structures and/or modeled responses), and certain attraction features may be utilized to provide guests with AR/VR experiences that may be customizable, personalized, and/or interactive. For example, the AR/VR system may include a wearable visualization device, such as a head mounted display (e.g., electronic goggles or displays, eyewear), which may be worn by a guest and may be configured to enable the guest to view virtual features. In particular, the wearable visualization device may be utilized to enhance the guest experience by overlaying virtual features onto the real-world environment of the amusement park, by providing an adjustable virtual environment to provide different experiences in an attraction, and so forth.
Advantageously, the disclosed embodiments provide a detection system (e.g., a drop detection system) configured to monitor whether a wearable visualization device has been improperly manipulated (e.g., has experienced an adverse or potentially damaging event, such as a drop or a throw). In particular, the detection system may include a sensor (e.g., an inertial measurement unit [IMU]) coupled to the wearable visualization device and configured to monitor one or more parameters (e.g., acceleration and/or deceleration) of the wearable visualization device that are indicative of improper manipulation. The sensor may provide a signal indicative of the parameter to a controller (e.g., an electronic controller), which may process the signal to determine whether the wearable visualization device has been improperly manipulated and may carry out one or more actions in response to determining that the wearable visualization device has been improperly manipulated. For example, the controller may cause illumination of lights (e.g., light emitters; light emitting diodes [LEDs]) on the wearable visualization device, on a ride vehicle of the attraction, or at an operator station of the attraction, or may otherwise provide a notification that the wearable visualization device has been improperly manipulated. In some embodiments, the controller may count the number of times the wearable visualization device has been improperly manipulated over time (e.g., the number of times the acceleration of the wearable visualization device has exceeded an acceleration threshold, as indicated by the signal from the sensor), and the controller may carry out one or more actions in response to the number of times the wearable visualization device has been improperly manipulated exceeding a count threshold. In this way, the detection system may facilitate efficient removal of any wearable visualization device that may be damaged due to improper manipulation, and may facilitate operation of the AR/VR system so that guests are able to experience the attraction with functioning wearable visualization devices.
With the foregoing in mind, FIG. 1 is a perspective view of an embodiment of an AR/VR system 10 (e.g., a wearable visualization system) configured to enable a user (e.g., a guest, an amusement park employee, an operator of an attraction, a passenger of a ride vehicle) to experience (e.g., view, interact with) AR/VR scenes. As shown, the AR/VR system 10 includes a wearable visualization device 12 (e.g., a head mounted display) and a guest interface device 14, which may be removably coupleable to one another to facilitate use of the AR/VR system 10.
In the illustrated embodiment, the wearable visualization device 12 includes a lens portion 16 coupled to a housing 18 of the wearable visualization device 12. The lens portion 16 may include one or more lenses 20 (e.g., displays; transparent, semi-transparent, or opaque). In some embodiments, the lens 20 may enable the user to view a real-world environment 22 (e.g., physical structures in an attraction) through the lens 20, with certain virtual features 24 (e.g., AR features) overlaid onto the lens 20 such that the user perceives the virtual features 24 as being integrated into the real-world environment 22. That is, the lens portion 16 may at least partially control the user's field of view by overlaying the virtual features 24 onto the user's line of sight. In this way, the wearable visualization device 12 may enable the user to visualize and perceive a hyper-reality environment 26 (e.g., a gaming environment) having certain virtual features 24 overlaid onto the real-world environment 22 that is viewable by the user through the lens 20.
By way of non-limiting example, the lens 20 may include a transparent (e.g., see-through) Light Emitting Diode (LED) display or a transparent (e.g., see-through) Organic Light Emitting Diode (OLED) display. In some embodiments, the lens portion 16 may be of a one-piece construction that spans a certain distance so as to display an image to both eyes of the user. That is, in such embodiments, the lenses 20 (e.g., a first lens 28, a second lens 30) may be formed from a single, continuous piece of material, where the first lens 28 may be aligned with a first eye of the user and the second lens 30 may be aligned with a second eye of the user. In other embodiments, the lens portion 16 may be of a multi-piece construction formed from two or more separate lenses 20.
In some embodiments, the wearable visualization device 12 may fully control the user's field of view (e.g., using an opaque viewing surface). That is, the lens 20 may include an opaque or non-transparent display configured to display the virtual features 24 (e.g., VR features) to the user. Thus, the hyper-reality environment 26 viewable by the user may be, for example, a real-time video that includes real-world images of the real-world environment 22 electronically merged with one or more virtual features 24. Accordingly, when wearing the wearable visualization device 12, the user may feel completely surrounded by the hyper-reality environment 26 and may perceive the hyper-reality environment 26 to be the real-world environment 22 including certain virtual features 24. In some embodiments, the wearable visualization device 12 may include features, such as light projection features, configured to project light into one or both eyes of the user such that certain virtual features 24 are superimposed over real-world objects viewable by the user. Such a wearable visualization device 12 may be considered to include a retinal display.
Thus, it should be understood that the hyper-reality environment 26 may include an AR experience, a VR experience, a mixed reality experience, a computer-mediated reality experience, a combination thereof, or another similar hyper-reality environment. Further, it should be understood that the wearable visualization device 12 may be used alone or in combination with other features to create the hyper-reality environment 26. Indeed, as discussed below, the user may wear the wearable visualization device 12 throughout the duration of a ride of an attraction in an amusement park or at another time (such as during a game, throughout a particular area or attraction of the amusement park, during a ride to a hotel associated with the amusement park, at the hotel, and so forth). In some embodiments, the wearable visualization device 12 may be physically coupled (e.g., tethered via a cable 32) to a structure (e.g., a ride vehicle) to block separation of the wearable visualization device 12 from the structure, and/or may be electronically coupled (e.g., via the cable 32) to a computing system to facilitate operation of the wearable visualization device 12 (e.g., to display the virtual features 24; to monitor whether the wearable visualization device 12 has been improperly manipulated and to provide related notifications).
As shown, the wearable visualization device 12 is removably coupleable to the guest interface device 14 (e.g., coupleable without tools; detachable without tools and without destroying components of the wearable visualization device 12 or the guest interface device 14) to enable the wearable visualization device 12 to be quickly transitioned between an engaged configuration 34, in which the wearable visualization device 12 is coupled to the guest interface device 14, and a disengaged configuration 36 (see, e.g., FIG. 2), in which the wearable visualization device 12 is decoupled from the guest interface device 14. In the illustrated embodiment, the guest interface device 14 is configured to be secured to the head of the user, and thus enables the user to comfortably wear the wearable visualization device 12 throughout various attractions or while traversing certain amusement park environments. For example, the guest interface device 14 may include a head strap assembly 38 that is configured to span around a circumference of the head of the user and configured to be tightened (e.g., constricted) on the head of the user. In this manner, the head strap assembly 38 facilitates affixing the guest interface device 14 to the head of the user, such that the guest interface device 14 may be utilized to retain the wearable visualization device 12 on the user (e.g., when the wearable visualization device 12 is in the engaged configuration 34).
Such a configuration may enable the user or another person (e.g., an operator, a maintenance technician) to efficiently couple and decouple the wearable visualization device 12 from the guest interface device 14 (e.g., upon determining that the wearable visualization device 12 should be serviced, such as due to improper manipulation). However, it should be understood that the wearable visualization device 12 and/or the guest interface device 14 may have any of a variety of forms or structures that enable the wearable visualization device 12 to function in the manner described herein. For example, the wearable visualization device 12 may be used without a separate guest interface device 14, and/or the wearable visualization device 12 may be integrally formed with the guest interface device 14. As shown, the wearable visualization device 12 may include a sensor 40 (e.g., an IMU) and/or one or more lights 42 (e.g., LEDs). As discussed in more detail below, the sensor 40 may be configured to monitor one or more parameters (e.g., acceleration and/or deceleration) that indicate that the wearable visualization device 12 has been improperly manipulated, and the lights 42 may be configured to illuminate, such as in response to a determination (e.g., by a controller) that the wearable visualization device 12 has been improperly manipulated. In this manner, even if the wearable visualization device 12 does not appear to be damaged (e.g., upon visual inspection), the wearable visualization device 12 may be identified as potentially damaged and may be flagged for maintenance operations.
FIG. 2 is a perspective view of an embodiment of the AR/VR system 10, illustrating the wearable visualization device 12 and the guest interface device 14 in the disengaged configuration 36. In some embodiments, the housing 18 may be assembled from multiple plates (e.g., housing sections; molded and/or machined plates), such as a cover 44, a base plate 46, and a lens mount 48 (e.g., a plate configured to support the lens portion 16), which may collectively form the housing 18. As discussed below, some or all of the plates may include component-mating features (e.g., machined and/or molded features on surfaces of the plates) configured to receive and/or couple to various subcomponents of the wearable visualization device 12 (e.g., the sensor 40; the lights 42; other electronic components, such as a controller).
As discussed below, after the subcomponents are assembled onto one or more of the plates, the plates may be assembled (e.g., coupled to one another via fasteners, adhesives, and/or other techniques) to form the housing 18. Accordingly, the housing 18 may support and/or encase the subcomponents to substantially seal (e.g., hermetically seal) at least portions of the subcomponents within the housing 18, thereby shielding these subcomponents from direct exposure to environmental elements (e.g., moisture) surrounding the wearable visualization device 12. It should be understood that, in other embodiments, the housing 18 may be assembled from more or fewer plates than the cover 44, the base plate 46, and the lens mount 48. Indeed, in certain embodiments, the housing 18 may include 1, 2, 3, 4, 5, 6, or more than six separate plates that may collectively form the housing 18 in an assembled configuration.
It should also be understood that the sensor 40 may be positioned at any location on the wearable visualization device 12, and/or any number (e.g., 1, 2, 3, 4, or more) of sensors 40 may be provided. As non-limiting examples, the sensor 40 may be a position sensor and/or an impact sensor, such as an accelerometer, a magnetometer, a gyroscope, a global positioning system receiver, a motion tracking sensor, an electromagnetic or solid-state motion tracking sensor, and/or an IMU. When the sensor 40 is an IMU, the IMU may include a nine degree of freedom system-on-chip equipped with an accelerometer, a gyroscope, a magnetometer, and a processor that executes a sensor fusion algorithm. Thus, the signals from the IMU may be used to determine the acceleration and/or orientation of the wearable visualization device 12 (e.g., relative to the gravity vector). The wearable visualization device 12 may include different types of sensors 40 that detect different parameters (e.g., an IMU that detects the acceleration of the wearable visualization device 12 and one or more impact sensors that detect locations of impacts on the wearable visualization device 12).
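As a rough sketch of how such IMU output might be used (the function names, the sample values, and the 0.3 g free-fall cue are illustrative assumptions, not part of the disclosure), the acceleration magnitude and the orientation relative to the gravity vector may be derived from a raw three-axis sample as follows:

```python
import math

GRAVITY = 9.81  # m/s^2

def acceleration_magnitude(ax: float, ay: float, az: float) -> float:
    """Magnitude of a raw 3-axis accelerometer sample, in m/s^2."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def tilt_from_gravity(ax: float, ay: float, az: float) -> float:
    """Angle (radians) between the device z-axis and the gravity vector.
    Only meaningful when the device is quasi-static, so the accelerometer
    mostly measures gravity."""
    mag = acceleration_magnitude(ax, ay, az)
    if mag == 0.0:
        raise ValueError("zero-magnitude sample")
    cos_angle = max(-1.0, min(1.0, az / mag))
    return math.acos(cos_angle)

# A device in free fall measures roughly zero proper acceleration, so a
# magnitude far below 1 g is itself a useful drop cue:
sample = (0.3, -0.2, 0.4)  # hypothetical free-fall reading, m/s^2
print(acceleration_magnitude(*sample) < 0.3 * GRAVITY)  # True
```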
Similarly, the lights 42 may be positioned at any location on the wearable visualization device 12, and/or any number (e.g., 1, 2, 3, 4, or more) of lights 42 may be provided. The lights 42 may be positioned so as to be visible when the wearable visualization device 12 is coupled to the guest interface device 14, visible when the wearable visualization device 12 is docked (e.g., coupled to or stored on a structure, such as a ride vehicle), visible to the user while the wearable visualization device 12 is worn by the user, and/or visible to an operator (e.g., a person other than the user), to facilitate visualization of the lights 42 when the lights 42 are illuminated.
FIG. 3 is a schematic diagram of components of a detection system 50 (e.g., a drop detection system) for the wearable visualization device 12. As shown, the detection system 50 may include the sensor 40 and the lights 42 of the wearable visualization device 12. The detection system 50 may also include a controller 52 having a processor 54 and a memory device 56. As shown, the controller 52 is positioned on the wearable visualization device 12; however, it should be understood that the controller 52 may instead be located off of the wearable visualization device 12, such as on a ride vehicle or on a system located remotely from the wearable visualization device 12. Further, the functions and processing steps described herein as being carried out by the controller 52 may be distributed between the controller 52 and any other suitable controller or processing system (e.g., the sensor 40, the ride vehicle, a system located remotely from the wearable visualization device 12; the controller 52 may be, or may be part of, a distributed control system having multiple processors). For example, the sensor 40 may be an IMU having a first processor configured to count the number of accelerations above an acceleration threshold, and the sensor 40 may provide this count to a second processor for further processing and/or to enable the second processor to carry out certain actions, such as illuminating the lights 42. Thus, the processor 54 may include one or more processors located in any suitable location(s), and the memory device 56 may include one or more memory devices located in any suitable location(s).
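A minimal sketch of this division of labor, in which a first (IMU-side) processor tallies over-threshold accelerations and a second processor polls the tally and drives a light; the class names, threshold values, and sample data are assumptions for illustration only:

```python
from dataclasses import dataclass

@dataclass
class ImuSideCounter:
    """Stand-in for a first processor on the IMU that tallies
    over-threshold accelerations."""
    threshold: float  # m/s^2; above this, a sample counts toward an event
    count: int = 0

    def on_sample(self, magnitude: float) -> None:
        if magnitude > self.threshold:
            self.count += 1

@dataclass
class HostProcessor:
    """Stand-in for a second processor that polls the IMU-side count."""
    count_threshold: int
    light_on: bool = False

    def poll(self, imu: ImuSideCounter) -> None:
        if imu.count >= self.count_threshold:
            self.light_on = True  # placeholder for driving the lights 42

imu = ImuSideCounter(threshold=50.0)
host = HostProcessor(count_threshold=3)
for magnitude in [9.8, 120.0, 9.7, 140.0, 95.0]:  # fabricated magnitudes
    imu.on_sample(magnitude)
host.poll(imu)
print(imu.count, host.light_on)  # 3 True
```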
The memory device 56 may include one or more tangible, non-transitory, computer-readable media that store instructions executable by the processor 54 and/or data (e.g., parameters; numbers of events) to be processed by the processor 54. For example, the memory device 56 may include random access memory (RAM), read only memory (ROM), rewritable non-volatile memory (such as flash memory), hard drives, optical discs, and/or the like. Additionally, the processor 54 may include one or more general purpose microprocessors, one or more application-specific integrated circuits (ASICs), one or more field programmable gate arrays (FPGAs), or any combination thereof. Further, the memory device 56 may store instructions executable by the processor 54 to carry out the methods and control actions described herein. The controller 52 may also include a communication device 58 that enables communication with other devices or systems, such as an operator system 60 (e.g., having a computing system with a processor and a memory device) and/or an attraction system 62 (e.g., having a computing system with a processor and a memory device), via a communication network.
The sensor 40 may be configured to detect one or more parameters that indicate improper manipulation of the wearable visualization device 12. For example, if the user drops the wearable visualization device 12 (e.g., such that it free-falls toward the ground along the gravity vector), the sensor 40 may detect an acceleration (e.g., a sudden acceleration or deceleration). The sensor 40 may provide a signal to the processor 54, and the processor 54 may process the signal by comparing the acceleration (e.g., a maximum acceleration value) to a threshold (e.g., an acceleration threshold). The processor 54 may be configured to determine that the wearable visualization device 12 has been dropped in response to determining that the acceleration exceeds the acceleration threshold. It should be understood that acceleration is a broad term that encompasses various ways of detecting a drop and/or a throw; thus, the acceleration may be negative, the acceleration threshold may be a negative acceleration threshold (e.g., due to a drop), or the acceleration threshold may be considered a deceleration threshold (e.g., due to a sudden stop upon impact). The processor 54 may also determine and analyze the acceleration and/or other parameters over time (e.g., acceleration patterns or signatures) to determine whether the wearable visualization device 12 has been improperly manipulated (e.g., and to characterize the event, as discussed below).
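The threshold comparison described above may be illustrated as follows; the 8 g threshold and the sample window are placeholders, not values from the disclosure:

```python
ACCEL_THRESHOLD = 8.0 * 9.81  # assumed 8 g impact spike; placeholder value

def indicates_drop(magnitudes: list[float]) -> bool:
    """True if any sample in the window exceeds the acceleration threshold."""
    return max(magnitudes, default=0.0) > ACCEL_THRESHOLD

# Fabricated window: quiet wear, a free-fall dip, then an impact spike.
window = [9.8, 9.9, 1.2, 0.4, 0.3, 160.0, 9.8]
print(indicates_drop(window))  # True
```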
An acceleration above the acceleration threshold may generally indicate that the severity (e.g., severity level) of the drop exceeds a severity threshold (e.g., the motion of the wearable visualization device 12 is sufficient to be considered a drop that may potentially damage the wearable visualization device 12). Thus, the acceleration threshold may represent a severity threshold. In some embodiments, the processor 54 may compare the acceleration to multiple acceleration thresholds, each of which may represent a different severity threshold, which may enable the processor 54 to more precisely determine the severity of the drop. For example, if the acceleration is above a first acceleration threshold and below a second acceleration threshold, the processor 54 may determine that a drop occurred and has a first, lower severity level; if the acceleration is above both the first acceleration threshold and the second acceleration threshold, the processor 54 may determine that a drop occurred and has a second, higher severity level. The processor 54 may be configured to determine that the wearable visualization device 12 has been thrown, and to determine the severity of the throw, in a similar manner (e.g., by comparison to one or more acceleration thresholds). It should be understood that the sensor 40 may additionally or alternatively detect various other parameters, such as the orientation, angular rate, and/or deceleration of the wearable visualization device 12 (e.g., relative to the gravity vector). The processor 54 may process the signals from the sensor 40 in a similar manner (e.g., by comparison to one or more thresholds) to determine whether the wearable visualization device 12 has been dropped or otherwise improperly manipulated, as well as the associated severity level.
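A sketch of the two-threshold severity determination, again with placeholder threshold values:

```python
G = 9.81

def classify_severity(peak_accel: float,
                      first_threshold: float = 8 * G,
                      second_threshold: float = 20 * G):
    """Return None (no event), 'lower', or 'higher' severity."""
    if peak_accel <= first_threshold:
        return None      # below the first threshold: not treated as a drop
    if peak_accel <= second_threshold:
        return "lower"   # first, lower severity level
    return "higher"      # second, higher severity level

print(classify_severity(120.0))  # lower
print(classify_severity(250.0))  # higher
```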
In some embodiments, regardless of the parameter(s) and regardless of the number of parameters, the processor 54 may process the signals from the sensor 40 to determine characteristics of the motion of the wearable visualization device 12 (e.g., to characterize the event and/or the improper manipulation, such as to characterize the event as a drop or a throw). For example, the processor 54 may determine that the signal indicates that the wearable visualization device 12 was dropped, the speed of the wearable visualization device 12 during the drop, the time and/or distance traveled during the drop, that the wearable visualization device 12 was thrown, the speed at which the wearable visualization device 12 was thrown, the time and/or distance of the throw, the location of an impact, and/or the like. A drop may generally have a lower acceleration, as well as other differing parameters, relative to a throw. Accordingly, the processor 54 may characterize an event as a drop or a throw based on a comparison of the parameter(s) to known parameters (e.g., stored in the memory device 56) associated with drops and throws.
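A sketch of this drop-versus-throw characterization, comparing observed parameters with a stored reference profile; the reference numbers are invented for illustration:

```python
# Stored reference parameters, analogous to the known drop/throw
# parameters the text says may reside in the memory device 56.
REFERENCE = {
    # event type: (max typical peak acceleration m/s^2, max typical speed m/s)
    "drop": (150.0, 6.0),
}

def characterize(peak_accel: float, speed: float) -> str:
    """Label an event as a drop or a throw by comparing its parameters
    with the stored drop profile; anything faster or harder is a throw."""
    max_accel, max_speed = REFERENCE["drop"]
    if peak_accel <= max_accel and speed <= max_speed:
        return "drop"
    return "throw"

print(characterize(peak_accel=120.0, speed=4.0))   # drop
print(characterize(peak_accel=300.0, speed=15.0))  # throw
```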
As noted above, in some embodiments, the processor 54 may be configured to compare the parameter(s) and/or characteristic(s) to respective thresholds (e.g., one or more acceleration thresholds, one or more speed thresholds, one or more time thresholds, one or more distance thresholds) to determine the severity of the improper manipulation and/or event. For example, a short drop with a lower acceleration may be less severe than a high-speed throw with a higher acceleration. In some cases, the processor 54 may be configured to input the parameter(s) and/or characteristic(s) into a model configured to output a severity, or to otherwise classify (e.g., categorize) the event and/or the improper manipulation based on the parameter(s) and/or characteristic(s). For example, the model may account for certain combinations of parameters that have historically caused damage or compromised the operation of similar wearable visualization devices 12. In some embodiments, the processor 54 may consider the location of the impact (e.g., based on signals from the impact sensor) to determine severity, as an impact at the lens 20 may be more severe than an impact at the housing 18 (FIG. 1) and may be more likely to cause damage. The processor 54 may also be configured to determine the motion of the wearable visualization device 12 relative to the ride vehicle (e.g., to decouple the motion of the wearable visualization device 12 from the motion of the ride vehicle, such as from an expected or known motion or acceleration of the ride vehicle along the ride path, and/or from the motion or acceleration of the ride vehicle along the ride path as detected by a ride vehicle sensor configured to monitor the motion of the ride vehicle). In this way, sudden movements or accelerations of the ride vehicle (e.g., at portions of the ride designed to move the ride vehicle in this manner) may be ignored or discounted by the processor 54, rather than being identified as improper manipulation.
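A sketch of decoupling the motion of the wearable visualization device from the motion of the ride vehicle by subtracting the vehicle's expected acceleration profile before thresholding; all data below is fabricated:

```python
def residual_accels(headset: list[float], vehicle: list[float]) -> list[float]:
    """Element-wise difference between headset and ride-vehicle acceleration
    magnitudes sampled on the same clock."""
    return [h - v for h, v in zip(headset, vehicle)]

headset_mag = [9.8, 35.0, 36.0, 9.8]  # spike during a scripted launch
vehicle_mag = [9.8, 34.0, 35.5, 9.8]  # vehicle profile explains the spike
residual = residual_accels(headset_mag, vehicle_mag)
print(max(residual))  # 1.0 -> well below any drop threshold, so no event
```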
In response to determining that the wearable visualization device 12 has been dropped or otherwise improperly manipulated (e.g., with a severity that exceeds the severity threshold), the processor 54 may then carry out one or more actions, such as illuminating at least one of the lights 42. Illumination of at least one of the lights 42 may prompt the user or the operator to carry out a maintenance operation, such as inspecting the wearable visualization device 12, running a test of the wearable visualization device 12, decoupling the wearable visualization device 12 from the guest interface device 14, decoupling the wearable visualization device 12 from any structure (e.g., a ride vehicle), replacing the wearable visualization device 12, and/or sending the wearable visualization device 12 to a maintenance technician for repair. In some cases, the controller 52 may direct the lights 42 to illuminate in a particular color based on the parameters, characteristics, and/or severity of the event. For example, a short drop with a lower acceleration may cause the lights 42 to illuminate in a yellow color, while a high-speed throw with a higher acceleration may cause the lights 42 to illuminate in a red color. Any number of colors may be utilized to communicate various types of events (e.g., yellow to indicate a drop; red to indicate a throw) and/or severities (e.g., yellow to indicate an acceleration below a first acceleration threshold; red to indicate an acceleration above the first acceleration threshold). In some embodiments, the lights 42 may be capable of illuminating in different colors, and/or multiple different lights may be provided.
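A sketch of the color signaling described above; the yellow and red assignments follow the examples in the text, while the RGB values and function name are assumptions:

```python
COLOR_FOR_EVENT = {
    "drop":  (255, 255, 0),  # yellow, per the example in the text
    "throw": (255, 0, 0),    # red, per the example in the text
}

def light_color(event_type: str) -> tuple:
    """Color for the lights 42; defaults to yellow for unrecognized events."""
    return COLOR_FOR_EVENT.get(event_type, (255, 255, 0))

print(light_color("throw"))  # (255, 0, 0)
```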
In some embodiments, the processor 54 may be configured to count the number of events over time (e.g., the number of events in which the wearable visualization device 12 has been improperly manipulated). For example, the processor 54 may direct at least one of the lights 42 to illuminate once a certain number of drops or throws is reached (e.g., each having an acceleration above the acceleration threshold; each having a severity above the severity threshold). In some cases, the processor 54 may direct one light 42 to illuminate for each event. For example, the wearable visualization device 12 may include five lights; a first light may illuminate upon a first drop of the wearable visualization device 12, a second light may illuminate upon a second drop of the wearable visualization device 12, a third light may illuminate upon a throw of the wearable visualization device 12, and so on. In some embodiments, the processor 54 may direct one or more lights 42 to illuminate for each event, and the number of lights 42 may be based on the severity of each event. For example, the wearable visualization device 12 may include five lights; a first light may illuminate upon a first, short drop of the wearable visualization device 12, a second light and a third light may illuminate upon a high-speed throw of the wearable visualization device 12, and so on. Then, when a certain number (e.g., all) of the lights 42 of the wearable visualization device 12 are illuminated, the operator may be notified (e.g., by viewing the lights 42) to take an action. In some embodiments, the wearable visualization device 12 may include a speaker, and the one or more actions may include providing an audible output via the speaker.
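A sketch of the per-event counting behavior, in which each event lights one or more of five lights (more for higher severity) and the device is flagged for service when all are lit; the structure and names are illustrative:

```python
class EventLights:
    """Tracks how many of the device's lights are lit; one light per
    lower-severity event, two per higher-severity event."""

    def __init__(self, num_lights: int = 5):
        self.num_lights = num_lights
        self.lit = 0

    def record_event(self, severity: str) -> None:
        lights_for_event = 2 if severity == "higher" else 1
        self.lit = min(self.num_lights, self.lit + lights_for_event)

    @property
    def needs_service(self) -> bool:
        return self.lit >= self.num_lights

lights = EventLights()
for severity in ["lower", "higher", "lower", "higher"]:  # drops and throws
    lights.record_event(severity)
print(lights.lit, lights.needs_service)  # 5 True (capped at five lights)
```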
In addition to, or as an alternative to, illuminating at least one light 42, the processor 54 may take one or more other actions, such as sending a notification to the operator system 60 and/or the attraction system 62. Various actions (e.g., automated actions) are envisioned. For example, upon determining that an event has occurred (e.g., the event has a severity above the severity threshold; a certain number of such events have occurred), the processor 54 may turn off the wearable visualization device 12 or at least some features of the wearable visualization device 12 (e.g., turn off the lens 20; block display of virtual features on the lens 20). In some embodiments, the processor 54 may block display of virtual features on the lens 20 in response to determining that the event has a first, higher severity (e.g., a high-speed throw; a first, higher acceleration), but the processor 54 may continue to enable display of virtual features on the lens 20 in response to determining that the event has a second, lower severity (e.g., a short drop; a second, lower acceleration).
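A sketch of the severity-gated response, disabling the display for a higher-severity event while keeping it enabled for a lower-severity one; the attribute names are placeholders, not a real device API:

```python
from types import SimpleNamespace

def respond(severity: str, device) -> None:
    """Disable the display for a higher-severity event; keep it enabled,
    but still log the event, for a lower-severity one."""
    if severity == "higher":
        device.display_enabled = False  # stop rendering virtual features
    device.event_log.append(severity)

device = SimpleNamespace(display_enabled=True, event_log=[])
respond("higher", device)
print(device.display_enabled, device.event_log)  # False ['higher']
```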
In some embodiments, the wearable visualization device 12 may be coupled (e.g., removably coupled; temporarily locked) to the guest interface device 14 and/or to a structure, such as a ride vehicle. For example, the wearable visualization device 12 may be locked to the guest interface device 14 via an electromagnetic system. In such cases, in response to determining that an event has occurred, power to the electromagnetic system may be blocked (e.g., an electromagnet may be deactivated), thereby enabling separation of the wearable visualization device 12 from the guest interface device 14. In some such cases, power to the electromagnetic system may be blocked only while the ride vehicle is in a loading/unloading zone and/or while the ride vehicle is stationary. Similarly, a locking device that couples the wearable visualization device 12 to the ride vehicle may be unlocked in response to determining that an event has occurred and/or while the ride vehicle is in the loading/unloading zone and/or while the ride vehicle is stationary. The wearable visualization device 12 may then be coupled to the guest interface device 14 and/or to the structure only via a mechanical connection (e.g., a hook, a key/slot interface) that can be quickly, manually disconnected. Such techniques may enable a wearable visualization device 12 that has experienced an event to be quickly removed for maintenance operations and replaced with another wearable visualization device 12 without slowing throughput (e.g., unloading and loading of users) at the attraction. As another example, the processor 54 may be configured to initiate (e.g., run) a test (e.g., a health test) in response to determining that an event has occurred. The test may include displaying an image (e.g., a pattern, lines) on the lens 20 of the wearable visualization device 12 and using a camera 64 of the wearable visualization device 12 to determine whether the image is properly displayed on the lens 20. The processor 54 may receive images from the camera 64 and may process the images (e.g., via template or pattern matching) to determine whether the wearable visualization device 12 is functioning properly after the event. The test may additionally include providing information (e.g., questions and/or images) on the lens 20 for visualization by the user and then receiving gesture inputs from the user (e.g., a nod of the user's head) detected by the sensor 40, as discussed in more detail below with respect to FIG. 6.
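A sketch of the camera-based health test: a known pattern stands in for the image displayed on the lens, a noisy capture is simulated, and the two are compared via normalized cross-correlation; the pattern, noise model, and pass threshold are assumptions:

```python
import numpy as np

def pattern_match_score(expected: np.ndarray, captured: np.ndarray) -> float:
    """Normalized cross-correlation in [-1, 1]; 1.0 is a perfect match."""
    e = expected - expected.mean()
    c = captured - captured.mean()
    denom = float(np.linalg.norm(e) * np.linalg.norm(c))
    return float((e * c).sum() / denom) if denom else 0.0

# Known test pattern "displayed" on the lens, plus a simulated noisy capture.
expected = np.tile(np.array([[0.0, 1.0], [1.0, 0.0]]), (8, 8))
captured = expected + np.random.default_rng(0).normal(0.0, 0.05, expected.shape)

healthy = pattern_match_score(expected, captured) > 0.9  # assumed pass mark
print(healthy)  # True for this low-noise simulated capture
```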
In some embodiments, the processor 54 may be configured to send an indication (e.g., via the communication device 58) to the operator system 60, which may be located remotely from the wearable visualization device 12 (e.g., a tablet held by an operator of the attraction, a computer accessed by an operator overseeing operation of the amusement park). The indication may include a text message or another notification (e.g., illumination of a light) that the wearable visualization device 12 has been improperly manipulated. The indication may also include data related to the parameters, characteristics, and/or severity of the event.
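A sketch of the indication sent to the operator system, carrying parameter, characteristic, and severity data; the message schema is invented for illustration:

```python
import json
import time

def build_notification(device_id: str, event_type: str, severity: str,
                       peak_accel: float) -> str:
    """Serialize an event indication for transmission to the operator system."""
    return json.dumps({
        "device_id": device_id,        # which wearable visualization device
        "event": event_type,           # "drop" or "throw"
        "severity": severity,          # e.g., "lower" / "higher"
        "peak_accel_ms2": peak_accel,  # parameter data accompanying the event
        "timestamp": time.time(),
    })

print(build_notification("wvd-0042", "drop", "lower", 95.3))
```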
It should be appreciated that the data related to the number of events, together with the data related to the parameters, characteristics, and/or severity of each event, may be used to generate an event report (e.g., a table) for each wearable visualization device 12 and/or may enable an operator of the amusement park to track the reliability and/or durability of the wearable visualization devices 12. For example, if the wearable visualization devices 12 used in attractions typically experience impaired functionality after only a few short drops, the operator may choose to focus on improving reliability and/or durability (even in the presence of drops) and/or on taking measures to reduce drops. If the wearable visualization devices 12 experience many severe drops and/or throws, the operator may choose to focus on taking steps to reduce the drops and/or throws. Furthermore, if a wearable visualization device 12 experiences impaired functionality without any drops, the operator may choose to focus on improving other features of the wearable visualization device 12 and/or on seeking replacement under a warranty.
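A sketch of a per-device event report aggregating counts by event type and severity; the event history is fabricated:

```python
from collections import Counter

# Fabricated per-device event history: (event type, severity) tuples.
events = [("drop", "lower"), ("drop", "lower"), ("throw", "higher")]

report = Counter(events)
for (event_type, severity), count in sorted(report.items()):
    print(f"{event_type:<6} {severity:<7} x{count}")
```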
In some embodiments, the processor 54 may be configured to send instructions to an attraction system 62 (e.g., via the communication device 58) to cause the attraction system 62 to illuminate lights (e.g., on a ride vehicle) and/or to adjust operation of features of the attraction, such as adjusting the path or movement of the ride vehicle. For example, in response to determining that an event has occurred, the attraction system 62 may divert the ride vehicle (e.g., to a maintenance bay and/or a loading/unloading zone) to facilitate maintenance operations. The diversion may occur during the ride so that the user does not experience the ride with a potentially malfunctioning wearable visualization device 12. In that case, the user or an operator may inspect, repair, and/or replace the wearable visualization device 12, and/or the user may unload from the ride vehicle and reload into another ride vehicle with a properly functioning wearable visualization device 12, enabling the user to enjoy the AR/VR experience throughout the remainder of the ride. The diversion may instead occur after the ride so that the wearable visualization device 12 can be inspected, repaired, and/or replaced between ride cycles and/or between users, again avoiding a user experiencing the ride with a potentially malfunctioning wearable visualization device 12. The diversion may include preventing the ride vehicle from moving forward out of the loading/unloading zone until the wearable visualization device 12 is inspected or otherwise processed. In some embodiments, in response to determining that an event has occurred, the attraction system 62 may be configured to enhance physical features on the ride vehicle and/or within the attraction, such as a display, an animatronic figure, a light show, and so forth (e.g., to enable the user to view text or images, such as on the display, and to generally enjoy the attraction even in the absence of a properly functioning wearable visualization device 12).
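The disclosure leaves the mapping from a detected event to attraction-side actions open; one hedged way to picture it is a small dispatch function. All action names below are invented placeholders.

```python
# Hedged sketch of attraction-side dispatch; action strings are invented.
def attraction_actions(severity: str, ride_state: str) -> list:
    """Pick illustrative attraction-system responses to a device event."""
    actions = ["illuminate_vehicle_light", "notify_operator"]
    if severity == "severe":
        # Divert for maintenance so the user does not ride with a
        # potentially malfunctioning device.
        actions.append("divert_vehicle_to_maintenance_bay")
    if ride_state == "loading_unloading":
        # Hold the vehicle until the device is inspected or swapped.
        actions.append("hold_vehicle_in_zone")
    return actions
```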
FIG. 4 is a perspective view of an attraction 70 in which the AR/VR system 10 may be employed. As shown, a user 72 is positioned within a ride vehicle 74 that travels along a path 76. At least at certain times during the ride, the user 72 may be able to view a physical structure 78 in the real-world environment 22 through the lens of the wearable visualization device 12. At least at certain times during the ride, the user 72 may be able to view virtual features 24 on the lens of the wearable visualization device 12. As represented in fig. 4, the virtual features 24 may be overlaid onto the real-world environment 22 such that the user is able to view both the physical structure 78 and the virtual features 24 in the real-world environment 22 simultaneously. Each user 72 may be presented with different virtual features 24 such that each user 72 has a different experience on the ride. The user 72 may board the ride vehicle 74 in a loading zone and exit the ride vehicle 74 in an unloading zone (e.g., loading/unloading zone 80). However, in the excitement of the ride, the user 72 may drop the wearable visualization device 12, or the wearable visualization device 12 may otherwise fall off of the user 72. It is also possible that the user 72 may throw the wearable visualization device 12 and/or that the wearable visualization device 12 may otherwise be improperly manipulated.
Referring to fig. 3 and 4, each wearable visualization device 12 may include components that are part of a detection system 50, which may monitor whether the wearable visualization device 12 is improperly manipulated during the ride. In some embodiments, during the ride, the detection system 50 may illuminate the at least one light 42, provide a notification to the operator system 60, and/or cause action to be taken by the attraction system 62. Additionally or alternatively, the detection system 50 may count or record events within the memory device 56. Additionally or alternatively, the detection system 50 may illuminate the at least one light 42, provide a notification to the operator system 60, and/or cause action to be taken by the attraction system 62 only after the ride has finished (e.g., in the loading/unloading zone 80) so as not to interrupt the ride.
In some embodiments, the processor 54 may count the total number of events and/or may periodically cause one or more actions based on the event(s), such as after a period of time (e.g., hourly, daily, weekly), whenever the wearable visualization device 12 is coupled to or decoupled from the guest interface device 14, whenever the wearable visualization device 12 is docked to a structure (e.g., docked to the ride vehicle 74, which may be detected via a position sensor), whenever the ride vehicle 74 is in the loading/unloading zone 80 (e.g., after each ride cycle), and/or in response to a request by a user or other person (e.g., an operator, a maintenance technician). Although fig. 4 illustrates the attraction 70 with ride vehicles 74, it should be understood that the attraction 70 may not include ride vehicles 74. Instead, the attraction 70 may include a path over which the user 72 walks while wearing the wearable visualization device 12, a theater in which the user 72 sits or stands while wearing the wearable visualization device 12, or any other suitable type of attraction. Further, the attraction 70 may be configured to enable the user 72 to wear and/or carry the wearable visualization device 12 outside of the ride vehicle 74 (such as while waiting in line to board the ride vehicle 74, after unloading from the ride vehicle 74, and so forth). Thus, it may be possible for the user 72 or another person (e.g., an operator, a maintenance technician) to drop the wearable visualization device 12 at other locations relative to the ride vehicle 74 and/or at other times outside of the ride. The detection system 50 may be configured to detect, count, and/or cause one or more of the actions disclosed herein while the wearable visualization device 12 is at other locations relative to the ride vehicle 74 and/or at other times outside of the ride.
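The batched counting behavior described at the start of the preceding paragraph might be sketched as a counter that buffers events and reports them only when one of the listed triggers fires; the trigger names below are assumptions.

```python
# Hedged sketch of trigger-based event reporting; trigger names invented.
class EventCounter:
    """Counts improper-manipulation events and reports them on a trigger."""

    REPORT_TRIGGERS = {"ride_cycle_end", "device_docked",
                       "interface_decoupled", "operator_request"}

    def __init__(self):
        self.unreported = 0   # events since the last report
        self.total = 0        # lifetime count for this device

    def record_event(self):
        self.unreported += 1
        self.total += 1

    def on_trigger(self, trigger: str):
        """Return the pending count when a reporting trigger fires."""
        if trigger in self.REPORT_TRIGGERS and self.unreported:
            pending, self.unreported = self.unreported, 0
            return pending
        return None
```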
FIG. 5 is a flow diagram of a method 90 for using the detection system 50 to monitor for an event (e.g., improper manipulation) of the wearable visualization device 12. The method 90 disclosed herein includes various steps represented by blocks. It should be noted that at least some of the steps of the method 90 may be performed as an automated procedure by a system, such as any of the detection systems 50 disclosed herein. Although the flow diagram illustrates the steps in a certain order, it should be understood that the steps may be performed in any suitable order, and certain steps may be performed simultaneously, where appropriate. Additionally, steps may be added to or omitted from the method 90.
As shown, the method 90 may begin in step 92 with receiving a signal (e.g., from the sensor 40, at the processor 54) indicative of an event involving the wearable visualization device 12. As discussed above, the processor 54 may be configured to receive and process the signal to determine that an event has occurred and/or to characterize the event (e.g., type, time, distance, speed, severity, location of impact). In step 94, the processor 54 may count the number of events over time. For example, data relating to the parameters, characteristics, severity, and/or number of events may be stored in the memory device 56.
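One common way to characterize a drop from an accelerometer signal, offered here only as a minimal sketch, is to look for a free-fall interval (acceleration magnitude near zero g) followed by an impact spike. The thresholds, sample rate, and function name below are illustrative assumptions, not values from the disclosure.

```python
# Hedged sketch of drop detection from an acceleration-magnitude trace.
import numpy as np

def characterize_drop(accel_mag_g: np.ndarray, sample_rate_hz: float = 100.0,
                      freefall_g: float = 0.3, impact_g: float = 4.0):
    """Return None if no drop is found, else an estimated fall time,
    fall distance, and peak impact acceleration (all illustrative)."""
    in_freefall = accel_mag_g < freefall_g
    # Find the longest contiguous free-fall run (sentinel closes a
    # run that reaches the end of the trace).
    best_start, best_len, run_start = None, 0, None
    for i, flag in enumerate(np.append(in_freefall, False)):
        if flag and run_start is None:
            run_start = i
        elif not flag and run_start is not None:
            if i - run_start > best_len:
                best_start, best_len = run_start, i - run_start
            run_start = None
    if best_start is None or best_len < 0.05 * sample_rate_hz:
        return None  # no plausible free-fall interval
    fall_time = best_len / sample_rate_hz
    fall_distance = 0.5 * 9.81 * fall_time ** 2  # s = (1/2) g t^2
    end = best_start + best_len
    impact_window = accel_mag_g[end:end + int(0.2 * sample_rate_hz)]
    peak = float(impact_window.max()) if impact_window.size else 0.0
    if peak < impact_g:
        return None  # free fall without a clear impact
    return {"fall_time_s": fall_time,
            "fall_distance_m": fall_distance,
            "peak_impact_g": peak}
```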
In step 96, the processor 54 may direct the at least one light 42 on the wearable visualization device 12 to illuminate. For example, the processor 54 may direct the at least one light 42 to illuminate in response to detection of a drop having a severity above a severity threshold and/or in response to detection of a number of drops above a count threshold. In step 98, the processor 54 may provide a notification to the operator system 60, which may be located remotely from the wearable visualization device 12. In step 100, the processor 54 may communicate with the attraction system 62, which may cause the attraction system 62 to adjust features of the attraction, such as by illuminating lights on the ride vehicle, adjusting the path of the ride vehicle, and so forth.
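Putting steps 92 through 100 together, the decision logic might resemble the sketch below, which builds on the characterize_drop and EventCounter sketches above. The thresholds, color choices, and injected callbacks are all assumptions.

```python
# Hedged end-to-end sketch of method 90; thresholds and names are invented.
SEVERITY_THRESHOLD_G = 6.0   # assumed peak-impact level for a "severe" drop
COUNT_THRESHOLD = 3          # assumed drop count before lighting the LED

def run_detection_cycle(accel_mag_g, event_counter, set_led, notify_operator):
    """One pass of method 90: detect, count, then act (illustrative)."""
    event = characterize_drop(accel_mag_g)          # step 92: receive/process
    if event is None:
        return
    event_counter.record_event()                    # step 94: count over time
    severe = event["peak_impact_g"] >= SEVERITY_THRESHOLD_G
    if severe or event_counter.total >= COUNT_THRESHOLD:
        set_led("red" if severe else "yellow")      # step 96: illuminate light
        notify_operator({"type": "drop",            # step 98: notify operator
                         "severity": "severe" if severe else "minor",
                         "peak_g": event["peak_impact_g"]})
```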
The sensor 40 of the wearable visualization device 12 may enable the user to provide gesture inputs. With this in mind, fig. 6 is a schematic illustration of a question that may be presented on the lens 20 of the wearable visualization device 12. For example, the question may be "Can you see the image below?" followed by an image, which may be a geometric shape or other image. The user may nod their head up and down to answer "yes," and the user may shake their head left and right to answer "no."
Referring to fig. 3 and 6, the sensor 40 may be capable of detecting the motion of the user's head while the wearable visualization device 12 is worn by the user. The sensor 40 may provide a signal indicative of the motion to the processor 54, and the processor 54 may determine a response or answer from the user based on the signal. In this case, the processor 54 may characterize the response based on a comparison of one or more parameters of the signal to known parameters (e.g., stored in the memory device 56) associated with a "yes" motion or a "no" motion. The illustrated example may be used as part of a test to verify whether the wearable visualization device 12 is functional, such as after being improperly manipulated. The test may be initiated automatically by the processor 54 in response to determining that the wearable visualization device 12 has been improperly manipulated. For example, if the user responds with a "yes," the processor 54 may determine that the wearable visualization device 12 is functional after being improperly manipulated. However, if the user responds with a "no," the processor 54 may determine that the wearable visualization device 12 is not functioning properly after being improperly manipulated. In such a case, the processor 54 may take one or more actions, including one or more of the actions disclosed herein (e.g., illuminate the light 42; notify the operator system 60 and/or the attraction system 62). It should be understood that the test may instead be initiated in response to an input (e.g., by a user or an operator), prior to the ride vehicle exiting the loading zone of the ride (e.g., in response to coupling the wearable visualization device 12 to the guest interface device 14), or at any other time.
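A hedged sketch of the yes/no classification: compare activity on the pitch (nod) and yaw (shake) angular-rate channels of an IMU. The threshold is an assumption, and a real classifier would likely also gate on gesture duration.

```python
# Hedged sketch: classify a head gesture from angular-rate traces (rad/s).
import numpy as np

def classify_yes_no(pitch_rate: np.ndarray, yaw_rate: np.ndarray,
                    rate_threshold: float = 1.0):
    """Nodding dominates the pitch axis ("yes"); shaking dominates the
    yaw axis ("no"). Returns None when no clear gesture is present."""
    pitch_activity = int(np.count_nonzero(np.abs(pitch_rate) > rate_threshold))
    yaw_activity = int(np.count_nonzero(np.abs(yaw_rate) > rate_threshold))
    if pitch_activity == 0 and yaw_activity == 0:
        return None  # head was essentially still
    return "yes" if pitch_activity > yaw_activity else "no"
```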
The gesture inputs may be used to provide various responses to various questions or other prompts, may be used as part of a game, and/or may be used to control the wearable visualization device 12 and/or other aspects of the attraction. In practice, different motions of the user's head may correspond to different responses or inputs. For example, moving the user's head in one way may provide one input (e.g., to brighten the image on the lens 20, to cause display of one image as part of a game, or to adjust the motion of the ride vehicle in one way), and moving the user's head in another way may provide another input (e.g., to darken the image on the lens 20, to cause display of another image as part of a game, or to adjust the motion of the ride vehicle in another way).
The gesture inputs may also be used to enable an operator and/or a maintenance technician to unlock certain features of the wearable visualization device 12 (e.g., by moving the wearable visualization device 12 in a certain manner and/or in a certain pattern of movements). The gesture inputs may enable an operator and/or a maintenance technician to interact with the wearable visualization device 12 and/or the attraction (e.g., a game) in order to diagnose problems and/or to see information not available to guests. The gesture inputs may enable an operator and/or a maintenance technician to access a menu (e.g., a menu viewable on the lens 20 of the wearable visualization device 12; a menu viewable on a display connected to the wearable visualization device 12, such as a display on a ride vehicle), move through the menu, make selections on the menu, and/or perform maintenance tests and/or procedures using only gesture inputs (e.g., only gesture inputs and motion of the wearable visualization device 12, without auxiliary devices such as a mouse or keyboard). In some cases, the gesture inputs may enable an operator and/or a maintenance technician to perform maintenance and/or to provide inputs to a computing system coupled to the wearable visualization device 12, such as a computing system of a ride vehicle (e.g., the attraction system 62 of fig. 3), to thereby adjust the operation of the computing system.
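Unlocking maintenance features via a pattern of movements could be sketched as matching a short sequence of recognized gestures; the particular pattern and its length below are invented for illustration.

```python
# Hedged sketch of a gesture-pattern unlock; the pattern is invented.
class GestureUnlock:
    """Unlock maintenance mode after a specific gesture sequence."""

    def __init__(self, pattern=("yes", "yes", "no")):  # assumed pattern
        self.pattern = list(pattern)
        self.history = []

    def observe(self, gesture: str) -> bool:
        """Feed one recognized gesture; True when the pattern completes."""
        self.history.append(gesture)
        self.history = self.history[-len(self.pattern):]  # sliding window
        return self.history == self.pattern
```

In use, each output of a classifier such as classify_yes_no above would be passed to observe(), and a True result would reveal the maintenance menu.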
The sensor 40 of the wearable visualization device 12 may also enable other operations, such as tracking of the user's head. The sensor 40 (e.g., an IMU) may be used to obtain data indicative of the manner in which the user's head travels through space. However, in some settings, the user may be positioned on a moving ride vehicle (e.g., one that translates and/or rotates relative to the ground). Accordingly, the AR/VR system 10 may include additional features and/or be configured to perform processing steps to decouple the motion of the user's head from the motion of the ride vehicle. For example, the AR/VR system 10 may primarily use a solid-state cabin tracking system and may secondarily use the sensor 40 (e.g., if needed) as an additional input to a prediction algorithm (e.g., a Kalman filter).
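The decoupling step can be pictured as differencing synchronized head and vehicle measurements before any Kalman-style fusion; the sketch below shows only that intuition and is not the cabin tracking system itself.

```python
# Hedged sketch: remove vehicle motion from the headset's IMU reading.
import numpy as np

def head_motion_relative_to_vehicle(head_accel: np.ndarray,
                                    vehicle_accel: np.ndarray) -> np.ndarray:
    """Subtract the ride vehicle's acceleration from the headset IMU's.

    Both inputs are (N, 3) arrays of synchronized accelerations expressed
    in the same world frame; a production system would time-align the
    streams and fuse them with a prediction algorithm (e.g., a Kalman
    filter) rather than simple differencing.
    """
    return head_accel - vehicle_accel
```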
The sensor 40 may also be used for off-board development (e.g., desktop development), as it provides a low-cost way to have head tracking in the wearable visualization device 12. Developers can take advantage of the basic tracking provided by the sensor 40 to look around a virtual scene; however, a developer may not need to align the virtual scene with the real world in order to create the virtual scene. Thus, a developer may not need to utilize a ride vehicle/cabin tracking system, which may be more expensive than the sensor 40, use many devices, and be time consuming to set up, whereas the sensor 40 may operate to obtain data indicative of the motion of the user's head when plugged into a cable (e.g., a USB cable; the cable 32).
As set forth above, embodiments of the present disclosure may provide one or more technical effects useful for facilitating performance of maintenance activities on wearable visualization devices and for facilitating integration of wearable visualization devices in amusement parks. It should be understood that the technical effects and technical problems in the specification are exemplary and not restrictive. Indeed, it should be noted that the embodiments described in the specification may have other technical effects and may be capable of solving other technical problems.
While the embodiments set forth in this disclosure may be susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and have been described in detail herein. It should be understood, however, that the disclosure is not intended to be limited to the particular forms disclosed. The disclosure is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the disclosure as defined by the following appended claims.
The techniques presented and claimed herein are referenced and applied to material objects and concrete examples of a practical nature that demonstrably improve the present technical field and, as such, are not abstract, intangible, or purely theoretical. Further, if any claims appended to the end of this specification contain one or more elements designated as "means for [perform]ing [a function]…" or "step for [perform]ing [a function]…," it is intended that such elements are to be interpreted under 35 U.S.C. § 112(f). However, for any claims containing elements designated in any other manner, it is intended that such elements are not to be interpreted under 35 U.S.C. § 112(f).

Claims (20)

1. A detection system configured to detect improper manipulation of a wearable visualization device, the detection system comprising:
a sensor coupled to the wearable visualization device;
a light emitter coupled to the wearable visualization device; and
a processor configured to receive a signal from the sensor, determine whether the signal indicates improper manipulation of the wearable visualization device, and direct illumination of the light emitter in response to determining that the signal indicates improper manipulation of the wearable visualization device.
2. The detection system of claim 1, wherein the signal is indicative of an acceleration of the wearable visualization device, and the processor is configured to compare the acceleration to an acceleration threshold and to determine that the signal is indicative of improper manipulation of the wearable visualization device in response to the acceleration exceeding the acceleration threshold.
3. The detection system of claim 2, wherein the processor is configured to determine the acceleration of the wearable visualization device relative to a ride vehicle based on the signal to determine that the signal indicates improper manipulation of the wearable visualization device.
4. The detection system of claim 1, wherein the processor is configured to determine a number of events of improper manipulation of the wearable visualization device over time, and direct illumination of the light emitter in response to determining that the number of events exceeds a count threshold.
5. The detection system of claim 1, wherein the processor is configured to determine a severity of the improper manipulation based on the signal.
6. The detection system of claim 5, wherein the processor is configured to direct illumination of the light emitter in a first color in response to determining that the severity is a first level, and direct illumination of the light emitter in a second color in response to determining that the severity is a second level.
7. The detection system of claim 5, wherein the processor is configured to determine the severity by comparing a parameter of the signal to one or more parameter thresholds.
8. The detection system of claim 1, wherein the processor is configured to stop operation of at least one component of the wearable visualization device in response to determining that the signal indicates improper manipulation of the wearable visualization device.
9. The detection system of claim 1, wherein the processor is configured to determine a type of improper manipulation of the wearable visualization device based on the signal.
10. The detection system of claim 9, wherein the processor is configured to direct illumination of the light emitter in a first color in response to determining that the type relates to dropping the wearable visualization device, and direct illumination of the light emitter in a second color in response to determining that the type relates to throwing the wearable visualization device.
11. The detection system of claim 1, wherein the sensor comprises an inertial measurement unit.
12. The detection system of claim 1, wherein the processor is configured to provide a notification to a remote system located remotely from the wearable visualization device in response to determining that the signal indicates improper manipulation of the wearable visualization device.
13. The detection system of claim 12, wherein the remote system comprises an attraction system, and the notification is configured to cause the attraction system to adjust a feature of an attraction.
14. A wearable visualization device, comprising:
a housing;
a sensor supported by the housing and configured to detect movement of the wearable visualization device;
a light emitter supported by the housing; and
a processor configured to receive a signal from the sensor, determine whether the signal indicates that the wearable visualization device has been dropped or thrown based on the detected motion of the wearable visualization device, and direct illumination of the light emitter in response to determining that the signal indicates that the wearable visualization device has been dropped or thrown.
15. The wearable visualization device of claim 14, wherein the processor is configured to determine a number of times the wearable visualization device has been dropped or thrown over time, and direct illumination of the light emitter in response to determining that the number exceeds a count threshold.
16. The wearable visualization device of claim 14, wherein the processor is configured to determine a severity of a drop or a throw of the wearable visualization device based on the signal.
17. The wearable visualization device of claim 16, wherein the processor is configured to direct illumination of the light emitter in a first color in response to determining that the severity is a first level, and direct illumination of the light emitter in a second color in response to determining that the severity is a second level.
18. A method of detecting improper manipulation of a wearable visualization device using a detection system, the method comprising:
receiving, at a processor, a signal from a sensor coupled to the wearable visualization device;
determining, using the processor, that the signal is indicative of improper manipulation of the wearable visualization device;
counting, using the processor, a number of events of improper manipulation of the wearable visualization device over time; and
directing, using the processor, illumination of a light emitter in response to determining that the number of events exceeds a count threshold.
19. The method of claim 18, further comprising ceasing, via the processor, operation of at least one component of the wearable visualization device in response to determining that the signal indicates improper manipulation of the wearable visualization device.
20. The method of claim 18, further comprising providing, via the processor, a notification to a remote system located remotely from the wearable visualization device in response to determining that the signal indicates improper manipulation of the wearable visualization device.
CN202080008665.0A 2019-01-11 2020-01-10 Drop detection system and method Pending CN113260427A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201962791735P 2019-01-11 2019-01-11
US62/791735 2019-01-11
US16/738908 2020-01-09
US16/738,908 US11200656B2 (en) 2019-01-11 2020-01-09 Drop detection systems and methods
PCT/US2020/013163 WO2020146783A1 (en) 2019-01-11 2020-01-10 Drop detection systems and methods

Publications (1)

Publication Number Publication Date
CN113260427A 2021-08-13

Family

ID=71516768

Family Applications (3)

Application Number Title Priority Date Filing Date
CN202080008636.4A Pending CN113226498A (en) 2019-01-11 2020-01-10 Wearable visualization device system and method
CN202080008663.1A Active CN113226499B (en) 2019-01-11 2020-01-10 Wearable visualization system and method
CN202080008665.0A Pending CN113260427A (en) 2019-01-11 2020-01-10 Drop detection system and method

Family Applications Before (2)

Application Number Title Priority Date Filing Date
CN202080008636.4A Pending CN113226498A (en) 2019-01-11 2020-01-10 Wearable visualization device system and method
CN202080008663.1A Active CN113226499B (en) 2019-01-11 2020-01-10 Wearable visualization system and method

Country Status (9)

Country Link
US (3) US11210772B2 (en)
EP (3) EP3908382B1 (en)
JP (2) JP2022517227A (en)
KR (3) KR20210113641A (en)
CN (3) CN113226498A (en)
CA (3) CA3125222A1 (en)
ES (1) ES2966264T3 (en)
SG (3) SG11202106742XA (en)
WO (3) WO2020146785A1 (en)

Family Cites Families (262)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7355807B2 (en) * 2006-04-28 2008-04-08 Hewlett-Packard Development Company, L.P. Hard disk drive protection system and method
JP3672586B2 (en) * 1994-03-24 2005-07-20 株式会社半導体エネルギー研究所 Correction system and operation method thereof
TW290678B (en) * 1994-12-22 1996-11-11 Handotai Energy Kenkyusho Kk
KR100189178B1 (en) * 1995-05-19 1999-06-01 오우라 히로시 Lcd panel test apparatus having means for correcting data difference among test apparatus
US6259565B1 (en) 1996-11-19 2001-07-10 Sony Corporation Display apparatus
JPH11143379A (en) * 1997-09-03 1999-05-28 Semiconductor Energy Lab Co Ltd Semiconductor display device correcting system and its method
JP3854763B2 (en) 1999-11-19 2006-12-06 キヤノン株式会社 Image display device
JP3583684B2 (en) * 2000-01-12 2004-11-04 シャープ株式会社 Image defect detection apparatus and image defect detection method
US7734101B2 (en) * 2000-10-11 2010-06-08 The United States Of America As Represented By The Secretary Of The Army Apparatus and system for testing an image produced by a helmet-mounted display
US7129975B2 (en) * 2001-02-07 2006-10-31 Dialog Imaging System Gmbh Addressable imager with real time defect detection and substitution
GB0121067D0 (en) * 2001-08-31 2001-10-24 Ibm Drop detection device
US7012756B2 (en) 2001-11-14 2006-03-14 Canon Kabushiki Kaisha Display optical system, image display apparatus, image taking optical system, and image taking apparatus
US7019909B2 (en) 2001-11-14 2006-03-28 Canon Kabushiki Kaisha Optical system, image display apparatus, and image taking apparatus
US20030215129A1 (en) * 2002-05-15 2003-11-20 Three-Five Systems, Inc. Testing liquid crystal microdisplays
US7495638B2 (en) 2003-05-13 2009-02-24 Research Triangle Institute Visual display with increased field of view
US20160098095A1 (en) 2004-01-30 2016-04-07 Electronic Scripting Products, Inc. Deriving Input from Six Degrees of Freedom Interfaces
US7059182B1 (en) * 2004-03-03 2006-06-13 Gary Dean Ragner Active impact protection system
ITTO20040436A1 (en) 2004-06-28 2004-09-28 St Microelectronics Srl FREE FALL DETECTION DEVICE FOR THE PROTECTION OF PORTABLE APPLIANCES.
JP4965800B2 (en) 2004-10-01 2012-07-04 キヤノン株式会社 Image display system
KR100618866B1 (en) 2004-10-02 2006-08-31 삼성전자주식회사 Method and apparatus for detecting free falling of an electronic device
US7350394B1 (en) 2004-12-03 2008-04-01 Maxtor Corporation Zero-g offset identification of an accelerometer employed in a hard disk drive
US7639260B2 (en) * 2004-12-15 2009-12-29 Xerox Corporation Camera-based system for calibrating color displays
TWI255342B (en) * 2005-02-04 2006-05-21 Benq Corp Portable electronic device with an impact-detecting function
JP4364157B2 (en) * 2005-04-22 2009-11-11 トレックス・セミコンダクター株式会社 Fall detection device
US20060250322A1 (en) 2005-05-09 2006-11-09 Optics 1, Inc. Dynamic vergence and focus control for head-mounted displays
US7364306B2 (en) * 2005-06-20 2008-04-29 Digital Display Innovations, Llc Field sequential light source modulation for a digital display system
WO2007000178A1 (en) * 2005-06-29 2007-01-04 Bayerische Motoren Werke Aktiengesellschaft Method for a distortion-free display
KR100723494B1 (en) * 2005-07-23 2007-06-04 삼성전자주식회사 Method of detecting free fall of a mobile device, apparatus for the same and recording medium for the same
KR100630762B1 (en) 2005-08-26 2006-10-04 삼성전자주식회사 Method for detecting the free fall of disk drives and apparatus thereof
JP2007264088A (en) * 2006-03-27 2007-10-11 Funai Electric Co Ltd Display device, image persistence correction system, and image persistence correction method
US7477469B2 (en) * 2006-04-27 2009-01-13 Seagate Technology Llc Active protection system
KR100818984B1 (en) * 2006-08-01 2008-04-04 삼성전자주식회사 Image compensation system and operating method for the same
US8406562B2 (en) * 2006-08-11 2013-03-26 Geo Semiconductor Inc. System and method for automated calibration and correction of display geometry and color
JP2010002181A (en) 2006-10-06 2010-01-07 T & D:Kk Apparatus and method for recording fall
US9667954B2 (en) 2006-10-13 2017-05-30 Apple Inc. Enhanced image display in head-mounted displays
US8212859B2 (en) 2006-10-13 2012-07-03 Apple Inc. Peripheral treatment for head-mounted displays
WO2008129451A1 (en) 2007-04-19 2008-10-30 Koninklijke Philips Electronics N.V. Fall detection system
WO2009037970A1 (en) * 2007-09-21 2009-03-26 Murata Manufacturing Co., Ltd. Drop detection device, magnetic disc device, and mobile electronic device
KR100920225B1 (en) * 2007-12-17 2009-10-05 한국전자통신연구원 Method and apparatus for accuracy measuring of?3d graphical model by using image
US8511827B2 (en) 2008-01-22 2013-08-20 The Arizona Board Of Regents On Behalf Of The University Of Arizona Head-mounted projection display using reflective microdisplays
US8025581B2 (en) 2008-05-09 2011-09-27 Disney Enterprises, Inc. Interactive interface mounting assembly for amusement and theme park rides
JP4905592B2 (en) * 2008-07-23 2012-03-28 株式会社村田製作所 Fall detection device, magnetic disk device, and portable electronic device
CN101377918B (en) * 2008-09-19 2013-02-27 李鑫 Electronic display screen system, method and system for correcting electronic display screen brightness
US8061182B2 (en) * 2009-06-22 2011-11-22 Research In Motion Limited Portable electronic device and method of measuring drop impact at the portable electronic device
JP2011077960A (en) * 2009-09-30 2011-04-14 Brother Industries Ltd Head mount display
BR112012012229A2 (en) * 2009-11-25 2017-12-26 Koninl Philips Electronics Nv fall detector for use in a user's fall detection method for estimating a vertical velocity and / or vertical displacement of an object comprising an accelerometer, method for use in a fall detection user's use of a fall detector comprising an accelerometer and computer program product
US8330305B2 (en) * 2010-02-11 2012-12-11 Amazon Technologies, Inc. Protecting devices from impact damage
US20130278631A1 (en) * 2010-02-28 2013-10-24 Osterhout Group, Inc. 3d positioning of augmented reality information
US8964298B2 (en) 2010-02-28 2015-02-24 Microsoft Corporation Video display modification based on sensor input for a see-through near-to-eye display
US8477425B2 (en) 2010-02-28 2013-07-02 Osterhout Group, Inc. See-through near-eye display glasses including a partially reflective, partially transmitting optical element
JP5499854B2 (en) 2010-04-08 2014-05-21 ソニー株式会社 Optical position adjustment method for head mounted display
US10031576B2 (en) * 2010-06-09 2018-07-24 Dynavox Systems Llc Speech generation device with a head mounted display unit
EP2619749A4 (en) 2010-09-21 2017-11-15 4IIII Innovations Inc. Head-mounted peripheral vision display systems and methods
US8941559B2 (en) 2010-09-21 2015-01-27 Microsoft Corporation Opacity filter for display device
US10359545B2 (en) 2010-10-21 2019-07-23 Lockheed Martin Corporation Fresnel lens with reduced draft facet visibility
IT1403374B1 (en) * 2010-10-29 2013-10-17 Bosco System Lab S P A APPARATUS FOR THE TRANSMISSION OF LOCALIZED VIBRATIONS, IN PARTICULAR TO MUSCLES OF A USER.
US9292973B2 (en) 2010-11-08 2016-03-22 Microsoft Technology Licensing, Llc Automatic variable virtual focus for augmented reality displays
US8576276B2 (en) 2010-11-18 2013-11-05 Microsoft Corporation Head-mounted display device which provides surround video
US8694251B2 (en) * 2010-11-25 2014-04-08 Texas Instruments Incorporated Attitude estimation for pedestrian navigation using low cost mems accelerometer in mobile applications, and processing methods, apparatus and systems
JP2012141461A (en) 2010-12-29 2012-07-26 Sony Corp Head mount display
JP5830546B2 (en) * 2011-02-25 2015-12-09 フラウンホッファー−ゲゼルシャフト ツァ フェルダールング デァ アンゲヴァンテン フォアシュンク エー.ファオ Determination of model parameters based on model transformation of objects
WO2012133890A1 (en) * 2011-04-01 2012-10-04 シャープ株式会社 Display panel unevenness correction method, correction system
JP5790187B2 (en) 2011-06-16 2015-10-07 ソニー株式会社 Display device
US9638711B2 (en) * 2011-06-17 2017-05-02 Verizon Telematics Inc. Method and system for discerning a false positive in a fall detection signal
CN102231016B (en) * 2011-06-28 2013-03-20 青岛海信电器股份有限公司 Method, device and system for compensating brightness of liquid crystal module
US8810482B2 (en) 2011-06-29 2014-08-19 Recon Instruments Inc. Modular heads-up display systems
AU2011204946C1 (en) 2011-07-22 2012-07-26 Microsoft Technology Licensing, Llc Automatic text scrolling on a head-mounted display
US9638836B1 (en) 2011-08-17 2017-05-02 Lockheed Martin Corporation Lenses having astigmatism correcting inside reflective surface
US9342610B2 (en) 2011-08-25 2016-05-17 Microsoft Technology Licensing, Llc Portals: registered objects as virtualized, personalized displays
US9402568B2 (en) 2011-08-29 2016-08-02 Verizon Telematics Inc. Method and system for detecting a fall based on comparing data to criteria derived from multiple fall data sets
JP5774228B2 (en) * 2011-09-16 2015-09-09 クアルコム,インコーポレイテッド Detecting that a mobile device is in the vehicle
US9342108B2 (en) 2011-09-16 2016-05-17 Apple Inc. Protecting an electronic device
US9285871B2 (en) 2011-09-30 2016-03-15 Microsoft Technology Licensing, Llc Personal audio/visual system for providing an adaptable augmented reality environment
US8866870B1 (en) 2011-10-20 2014-10-21 Lockheed Martin Corporation Methods, apparatus, and systems for controlling from a first location a laser at a second location
US8610781B2 (en) * 2011-11-02 2013-12-17 Stmicroelectronics, Inc. System and method for light compensation in a video panel display
CN105974587B (en) 2011-11-24 2018-09-28 松下知识产权经营株式会社 Head-mounted display device
US20130137076A1 (en) 2011-11-30 2013-05-30 Kathryn Stone Perez Head-mounted display based education and instruction
US20130141419A1 (en) 2011-12-01 2013-06-06 Brian Mount Augmented reality with realistic occlusion
US8705177B1 (en) 2011-12-05 2014-04-22 Google Inc. Integrated near-to-eye display module
US9497501B2 (en) 2011-12-06 2016-11-15 Microsoft Technology Licensing, Llc Augmented reality virtual monitor
CN102402005B (en) 2011-12-06 2015-11-25 北京理工大学 Bifocal-surface monocular stereo helmet-mounted display device with free-form surfaces
US20150312561A1 (en) 2011-12-06 2015-10-29 Microsoft Technology Licensing, Llc Virtual 3d monitor
US9451915B1 (en) * 2012-02-29 2016-09-27 Google Inc. Performance of a diagnostic procedure using a wearable computing device
US8905177B2 (en) * 2012-03-27 2014-12-09 Vitaly Grossman Wheeled platform powered by a cargo tracked vehicle and method of propulsion control thereof
US9454007B1 (en) 2012-05-07 2016-09-27 Lockheed Martin Corporation Free-space lens design and lenses therefrom
US9146397B2 (en) * 2012-05-30 2015-09-29 Microsoft Technology Licensing, Llc Customized see-through, electronic display device
US9253524B2 (en) 2012-07-20 2016-02-02 Intel Corporation Selective post-processing of decoded video frames based on focus point determination
US9088787B1 (en) 2012-08-13 2015-07-21 Lockheed Martin Corporation System, method and computer software product for providing visual remote assistance through computing systems
US10073201B2 (en) 2012-10-26 2018-09-11 Qualcomm Incorporated See through near-eye display
US20140146394A1 (en) 2012-11-28 2014-05-29 Nigel David Tout Peripheral display for a near-eye display device
US8867139B2 (en) 2012-11-30 2014-10-21 Google Inc. Dual axis internal optical beam tilt for eyepiece of an HMD
US20140168264A1 (en) 2012-12-19 2014-06-19 Lockheed Martin Corporation System, method and computer program product for real-time alignment of an augmented reality device
WO2014106823A2 (en) 2013-01-03 2014-07-10 Meta Company Extramissive spatial imaging digital eye glass apparatuses, methods and systems for virtual or augmediated vision, manipulation, creation, or interaction with objects, materials, or other entities
US9788714B2 (en) 2014-07-08 2017-10-17 Iarmourholdings, Inc. Systems and methods using virtual reality or augmented reality environments for the measurement and/or improvement of human vestibulo-ocular performance
US20140253702A1 (en) * 2013-03-10 2014-09-11 OrCam Technologies, Ltd. Apparatus and method for executing system commands based on captured image data
US9432492B2 (en) * 2013-03-11 2016-08-30 Apple Inc. Drop countermeasures for electronic device
US9519144B2 (en) 2013-05-17 2016-12-13 Nvidia Corporation System, method, and computer program product to produce images for a near-eye light field display having a defect
US9582922B2 (en) 2013-05-17 2017-02-28 Nvidia Corporation System, method, and computer program product to produce images for a near-eye light field display
US10137361B2 (en) 2013-06-07 2018-11-27 Sony Interactive Entertainment America Llc Systems and methods for using reduced hops to generate an augmented virtual reality scene within a head mounted system
US10905943B2 (en) 2013-06-07 2021-02-02 Sony Interactive Entertainment LLC Systems and methods for reducing hops associated with a head mounted system
JP2014238731A (en) * 2013-06-07 2014-12-18 株式会社ソニー・コンピュータエンタテインメント Image processor, image processing system, and image processing method
US9874749B2 (en) 2013-11-27 2018-01-23 Magic Leap, Inc. Virtual and augmented reality systems and methods
US20150003819A1 (en) 2013-06-28 2015-01-01 Nathan Ackerman Camera auto-focus based on eye gaze
US9152022B2 (en) * 2013-07-11 2015-10-06 Intel Corporation Techniques for adjusting a projected image
US9264702B2 (en) * 2013-08-19 2016-02-16 Qualcomm Incorporated Automatic calibration of scene camera for optical see-through head mounted display
JP6237000B2 (en) 2013-08-29 2017-11-29 セイコーエプソン株式会社 Head-mounted display device
WO2015051660A1 (en) 2013-10-13 2015-04-16 北京蚁视科技有限公司 Head-mounted stereoscopic display
US9569886B2 (en) 2013-12-19 2017-02-14 Intel Corporation Variable shading
US9319622B2 (en) * 2014-01-09 2016-04-19 International Business Machines Corporation Video projector with automated image enhancement
US9696552B1 (en) 2014-01-10 2017-07-04 Lockheed Martin Corporation System and method for providing an augmented reality lightweight clip-on wearable device
US9310610B2 (en) 2014-01-21 2016-04-12 Osterhout Group, Inc. See-through computer display systems
US9651784B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US9494800B2 (en) 2014-01-21 2016-11-15 Osterhout Group, Inc. See-through computer display systems
US9389423B2 (en) 2014-03-11 2016-07-12 Google Inc. Head wearable display with adjustable transparency
US9037125B1 (en) * 2014-04-07 2015-05-19 Google Inc. Detecting driving with a wearable computing device
US9715257B2 (en) * 2014-04-18 2017-07-25 Apple Inc. Active screen protection for electronic device
JP2015228050A (en) 2014-05-30 2015-12-17 ソニー株式会社 Information processing device and information processing method
US9360671B1 (en) 2014-06-09 2016-06-07 Google Inc. Systems and methods for image zoom
US9690375B2 (en) 2014-08-18 2017-06-27 Universal City Studios Llc Systems and methods for generating augmented and virtual reality images
US10261579B2 (en) 2014-09-01 2019-04-16 Samsung Electronics Co., Ltd. Head-mounted display apparatus
KR102230076B1 (en) 2014-09-01 2021-03-19 삼성전자 주식회사 Head-mounted display apparatus
US20160093230A1 (en) 2014-09-30 2016-03-31 Lockheed Martin Corporation Domeless simulator
US9984505B2 (en) 2014-09-30 2018-05-29 Sony Interactive Entertainment Inc. Display of text information on a head-mounted display
US20160097929A1 (en) 2014-10-01 2016-04-07 Dawson Yee See-through display optic structure
US20160097930A1 (en) 2014-10-06 2016-04-07 Steven John Robbins Microdisplay optical system having two microlens arrays
WO2016061447A1 (en) 2014-10-17 2016-04-21 Lockheed Martin Corporation Head-wearable ultra-wide field of view display device
CN107003523B (en) 2014-10-24 2019-05-14 埃马金公司 Immersion based on micro-display wears view device
WO2016081664A1 (en) 2014-11-18 2016-05-26 Meta Company Wide field of view head mounted display apparatuses, methods and systems
US10495884B2 (en) 2014-12-14 2019-12-03 Elbit Systems Ltd. Visual perception enhancement of displayed color symbology
US9581819B1 (en) 2014-12-17 2017-02-28 Lockheed Martin Corporation See-through augmented reality system
WO2016100933A1 (en) * 2014-12-18 2016-06-23 Oculus Vr, Llc System, device and method for providing user interface for a virtual reality environment
EP3234920A1 (en) 2014-12-23 2017-10-25 Meta Company Apparatuses, methods and systems coupling visual accommodation and visual convergence to the same plane at any depth of an object of interest
US10241328B2 (en) 2014-12-26 2019-03-26 Cy Vision Inc. Near-to-eye display device with variable resolution
US10108832B2 (en) 2014-12-30 2018-10-23 Hand Held Products, Inc. Augmented reality vision barcode scanning system and method
US20170352226A1 (en) 2015-01-05 2017-12-07 Sony Corporation Information processing device, information processing method, and program
KR102261876B1 (en) * 2015-01-06 2021-06-07 삼성디스플레이 주식회사 Display device and driving method thereof
US10740971B2 (en) 2015-01-20 2020-08-11 Microsoft Technology Licensing, Llc Augmented reality field of view object follower
US9472025B2 (en) * 2015-01-21 2016-10-18 Oculus Vr, Llc Compressible eyecup assemblies in a virtual reality headset
US20160225191A1 (en) * 2015-02-02 2016-08-04 Daqri, Llc Head mounted display calibration
US10054797B2 (en) 2015-02-12 2018-08-21 Google Llc Combining a high resolution narrow field display and a mid resolution wide field display
WO2016141054A1 (en) 2015-03-02 2016-09-09 Lockheed Martin Corporation Wearable display system
US9726885B2 (en) * 2015-03-31 2017-08-08 Timothy A. Cummings System for virtual display and method of use
US9690374B2 (en) 2015-04-27 2017-06-27 Google Inc. Virtual/augmented reality transition system and method
US20160349509A1 (en) * 2015-05-26 2016-12-01 Microsoft Technology Licensing, Llc Mixed-reality headset
US10038887B2 (en) 2015-05-27 2018-07-31 Google Llc Capture and render of panoramic virtual reality content
US20170363949A1 (en) 2015-05-27 2017-12-21 Google Inc Multi-tier camera rig for stereoscopic image capture
US9877016B2 (en) 2015-05-27 2018-01-23 Google Llc Omnistereo capture and render of panoramic virtual reality content
US10366534B2 (en) 2015-06-10 2019-07-30 Microsoft Technology Licensing, Llc Selective surface mesh regeneration for 3-dimensional renderings
US9638921B2 (en) 2015-06-11 2017-05-02 Oculus Vr, Llc Strap system for head-mounted displays
US9977493B2 (en) 2015-06-17 2018-05-22 Microsoft Technology Licensing, Llc Hybrid display system
DE112016002904T5 (en) 2015-06-23 2018-03-08 Google Llc Head mounted display with curved dual displays
CA2991644C (en) 2015-07-06 2022-03-01 Frank Jones Methods and devices for demountable head mounted displays
US9454010B1 (en) 2015-08-07 2016-09-27 Ariadne's Thread (Usa), Inc. Wide field-of-view head mounted display system
US9606362B2 (en) 2015-08-07 2017-03-28 Ariadne's Thread (Usa), Inc. Peripheral field-of-view illumination system for a head mounted display
JP6565465B2 (en) * 2015-08-12 2019-08-28 セイコーエプソン株式会社 Image display device, computer program, and image display system
US10235808B2 (en) 2015-08-20 2019-03-19 Microsoft Technology Licensing, Llc Communication system
US20170053445A1 (en) 2015-08-20 2017-02-23 Microsoft Technology Licensing, Llc Augmented Reality
US10169917B2 (en) 2015-08-20 2019-01-01 Microsoft Technology Licensing, Llc Augmented reality
US10565446B2 (en) * 2015-09-24 2020-02-18 Tobii Ab Eye-tracking enabled wearable devices
KR102411738B1 (en) * 2015-09-25 2022-06-21 삼성전자 주식회사 Fall detection device and control method thereof
US10082865B1 (en) * 2015-09-29 2018-09-25 Rockwell Collins, Inc. Dynamic distortion mapping in a worn display
WO2017061677A1 (en) 2015-10-08 2017-04-13 Lg Electronics Inc. Head mount display device
US10545717B2 (en) 2015-10-08 2020-01-28 Pcms Holdings, Inc. Methods and systems of automatic calibration for dynamic display configurations
US10088685B1 (en) 2015-10-19 2018-10-02 Meta Company Apparatuses, methods and systems for multiple focal distance display
US10754156B2 (en) 2015-10-20 2020-08-25 Lockheed Martin Corporation Multiple-eye, single-display, ultrawide-field-of-view optical see-through augmented reality system
US20170116950A1 (en) 2015-10-22 2017-04-27 Google Inc. Liquid crystal display with variable drive voltage
US9741125B2 (en) 2015-10-28 2017-08-22 Intel Corporation Method and system of background-foreground segmentation for image processing
KR102592980B1 (en) 2015-11-04 2023-10-20 매직 립, 인코포레이티드 Optical field display metrology
US10001683B2 (en) 2015-11-06 2018-06-19 Microsoft Technology Licensing, Llc Low-profile microdisplay module
USD812612S1 (en) 2015-11-18 2018-03-13 Meta Company Optical head mounted display with transparent visor
US10424117B2 (en) * 2015-12-02 2019-09-24 Seiko Epson Corporation Controlling a display of a head-mounted display device
US10173588B2 (en) * 2015-12-04 2019-01-08 Karl Lenker Systems and methods for motorbike collision avoidance
US10445860B2 (en) * 2015-12-08 2019-10-15 Facebook Technologies, Llc Autofocus virtual reality headset
US9763342B2 (en) 2015-12-09 2017-09-12 Oculus Vr, Llc Cable retention assembly for a head mounted display
US10147235B2 (en) 2015-12-10 2018-12-04 Microsoft Technology Licensing, Llc AR display with adjustable stereo overlap zone
US20170176747A1 (en) 2015-12-21 2017-06-22 Tuomas Heikki Sakari Vallius Multi-Pupil Display System for Head-Mounted Display Device
US10229540B2 (en) 2015-12-22 2019-03-12 Google Llc Adjusting video rendering rate of virtual reality content and processing of a stereoscopic image
WO2017112958A1 (en) 2015-12-24 2017-06-29 Meta Company Optical engine for creating wide-field of view fovea-based display
US10026232B2 (en) 2016-01-04 2018-07-17 Meta Compnay Apparatuses, methods and systems for application of forces within a 3D virtual environment
WO2017120552A1 (en) 2016-01-06 2017-07-13 Meta Company Apparatuses, methods and systems for pre-warping images for a display system with a distorting optical component
IL301884A (en) * 2016-01-19 2023-06-01 Magic Leap Inc Augmented reality systems and methods utilizing reflections
WO2017127190A1 (en) 2016-01-21 2017-07-27 Google Inc. Global command interface for a hybrid display, corresponding method, and head mounted display
US10229541B2 (en) 2016-01-28 2019-03-12 Sony Interactive Entertainment America Llc Methods and systems for navigation within virtual reality space using head mounted display
US10209785B2 (en) 2016-02-02 2019-02-19 Microsoft Technology Licensing, Llc Volatility based cursor tethering
US11112266B2 (en) 2016-02-12 2021-09-07 Disney Enterprises, Inc. Method for motion-synchronized AR or VR entertainment experience
US10665020B2 (en) 2016-02-15 2020-05-26 Meta View, Inc. Apparatuses, methods and systems for tethering 3-D virtual elements to digital content
USD776111S1 (en) 2016-02-15 2017-01-10 Meta Company Forehead pad
USD776110S1 (en) 2016-02-15 2017-01-10 Meta Company Head-mounted display
US10169922B2 (en) 2016-02-16 2019-01-01 Microsoft Technology Licensing, Llc Reality mixer for mixed reality
WO2017143303A1 (en) 2016-02-17 2017-08-24 Meta Company Apparatuses, methods and systems for sharing virtual elements
US10473933B2 (en) 2016-02-19 2019-11-12 Microsoft Technology Licensing, Llc Waveguide pupil relay
US9933624B1 (en) 2016-03-01 2018-04-03 Daryl White System and method for providing individualized virtual reality for an amusement attraction
US9778467B1 (en) 2016-03-01 2017-10-03 Daryl White Head mounted display
US10212517B1 (en) 2016-03-02 2019-02-19 Meta Company Head-mounted display system with a surround sound system
US9984510B1 (en) 2016-03-02 2018-05-29 Meta Company System and method for modifying virtual elements in a virtual environment using hierarchical anchors incorporated into virtual elements
US9928661B1 (en) 2016-03-02 2018-03-27 Meta Company System and method for simulating user interaction with virtual objects in an interactive space
US10168768B1 (en) 2016-03-02 2019-01-01 Meta Company Systems and methods to facilitate interactions in an interactive space
US9964767B2 (en) 2016-03-03 2018-05-08 Google Llc Display with reflected LED micro-display panels
KR102448919B1 (en) 2016-03-16 2022-10-04 삼성디스플레이 주식회사 Display device
US10133345B2 (en) 2016-03-22 2018-11-20 Microsoft Technology Licensing, Llc Virtual-reality navigation
US10003726B2 (en) * 2016-03-25 2018-06-19 Microsoft Technology Licensing, Llc Illumination module for near eye-to-eye display system
US10175487B2 (en) 2016-03-29 2019-01-08 Microsoft Technology Licensing, Llc Peripheral display for head mounted display device
US9946074B2 (en) 2016-04-07 2018-04-17 Google Llc See-through curved eyepiece with patterned optical combiner
US20170305083A1 (en) 2016-04-26 2017-10-26 David A Smith Systems and methods for forming optical lenses with an undercut
US9965899B2 (en) 2016-04-28 2018-05-08 Verizon Patent And Licensing Inc. Methods and systems for minimizing pixel data transmission in a network-based virtual reality media delivery configuration
US20170323482A1 (en) 2016-05-05 2017-11-09 Universal City Studios Llc Systems and methods for generating stereoscopic, augmented, and virtual reality images
US20170323416A1 (en) 2016-05-09 2017-11-09 Intel Corporation Processing image fragments from one frame in separate image processing pipes based on image analysis
US10186088B2 (en) 2016-05-13 2019-01-22 Meta Company System and method for managing interactive virtual frames for virtual objects in a virtual environment
US9990779B2 (en) 2016-05-13 2018-06-05 Meta Company System and method for modifying virtual objects in a virtual environment in response to user interactions
US10496156B2 (en) 2016-05-17 2019-12-03 Google Llc Techniques to change location of objects in a virtual/augmented reality system
US10502363B2 (en) 2016-05-17 2019-12-10 Occipital, Inc. Self-contained mixed reality head mounted display
US9983697B1 (en) 2016-05-18 2018-05-29 Meta Company System and method for facilitating virtual interactions with a three-dimensional virtual environment in response to sensor input into a control device having sensors
US10151927B2 (en) 2016-05-31 2018-12-11 Falcon's Treehouse, Llc Virtual reality and augmented reality head set for ride vehicle
US9979956B1 (en) * 2016-06-09 2018-05-22 Oculus Vr, Llc Sharpness and blemish quality test subsystem for eyecup assemblies of head mounted displays
US10226204B2 (en) 2016-06-17 2019-03-12 Philips North America Llc Method for detecting and responding to falls by residents within a facility
US10075685B1 (en) * 2016-06-19 2018-09-11 Oculus Vr, Llc Virtual image distance test subsystem for eyecup assemblies of head mounted displays
US9773438B1 (en) 2016-06-29 2017-09-26 Meta Company System and method for providing views of virtual content in an augmented reality environment
CN109661194B (en) 2016-07-14 2022-02-25 奇跃公司 Iris boundary estimation using corneal curvature
US10614653B2 (en) * 2016-07-22 2020-04-07 Universal Entertainment Corporation Gaming machine including projection-type display screen
CN106019597A (en) 2016-07-27 2016-10-12 北京小米移动软件有限公司 Virtual reality glasses
CN109690327A (en) * 2016-08-17 2019-04-26 雷辛奥普蒂克斯有限公司 Mobile device surge protection
KR102217789B1 (en) * 2016-08-22 2021-02-19 매직 립, 인코포레이티드 Nanograting method and apparatus
CN106154555B (en) 2016-08-24 2020-08-21 北京小米移动软件有限公司 Virtual reality glasses
TWI622803B (en) 2016-08-30 2018-05-01 廣達電腦股份有限公司 Head-mounted display device
US9958951B1 (en) 2016-09-12 2018-05-01 Meta Company System and method for providing views of virtual content in an augmented reality environment
US10026231B1 (en) 2016-09-12 2018-07-17 Meta Company System and method for providing views of virtual content in an augmented reality environment
DE202017105948U1 (en) 2016-10-03 2018-03-07 Google LLC (n.d.Ges.d. Staates Delaware) Augmented reality and / or virtual reality headset
US10425636B2 (en) * 2016-10-03 2019-09-24 Microsoft Technology Licensing, Llc Automatic detection and correction of binocular misalignment in a display device
US10334240B2 (en) * 2016-10-28 2019-06-25 Daqri, Llc Efficient augmented reality display calibration
US20180126116A1 (en) * 2016-11-07 2018-05-10 Blw Ip, Llc Integrated Stroboscopic Eyewear For Sensory Training
US10459236B2 (en) 2016-12-09 2019-10-29 Lg Electronics Inc. Head-mounted display device with earbud holder
US10805767B2 (en) * 2016-12-15 2020-10-13 Philips North America Llc Method for tracking the location of a resident within a facility
WO2018116380A1 (en) * 2016-12-20 2018-06-28 堺ディスプレイプロダクト株式会社 Correction system, display panel, display device, correction method, and program
JP6215441B1 (en) * 2016-12-27 2017-10-18 株式会社コロプラ Method for providing virtual space, program for causing computer to realize the method, and computer apparatus
EP3750004A1 (en) * 2017-01-05 2020-12-16 Philipp K. Lang Improved accuracy of displayed virtual data with optical head mount displays for mixed reality
US10127727B1 (en) 2017-01-10 2018-11-13 Meta Company Systems and methods to provide an interactive environment over an expanded field-of-view
KR102623391B1 (en) * 2017-01-10 2024-01-11 삼성전자주식회사 Method for Outputting Image and the Electronic Device supporting the same
US20180196262A1 (en) 2017-01-12 2018-07-12 Artavious Cage Virtual reality system
CN106652347A (en) * 2017-01-24 2017-05-10 Shenzhen Qianhai Zero Distance Internet of Things Technology Co., Ltd. Smart helmet fall detection method and smart helmet
CN106693338B (en) * 2017-02-13 2022-04-19 China Astronaut Research and Training Center Visual virtual surge exercise protection training device
JP6801073B2 (en) * 2017-02-24 2020-12-16 Fujifilm Corporation Additional information display device, method, and program
US10785471B1 (en) * 2017-05-12 2020-09-22 Facebook Technologies, LLC Upsampling content for head-mounted displays
JP2020521174A (en) 2017-05-18 2020-07-16 Arizona Board of Regents on Behalf of the University of Arizona Multi-layer high dynamic range head-mounted display
GB2562758B (en) 2017-05-24 2021-05-12 Sony Interactive Entertainment Inc Input device and method
US10168789B1 (en) 2017-05-31 2019-01-01 Meta Company Systems and methods to facilitate user interactions with virtual content having two-dimensional representations and/or three-dimensional representations
US10241545B1 (en) * 2017-06-01 2019-03-26 Facebook Technologies, LLC Dynamic distortion correction for optical compensation
US10277893B1 (en) * 2017-06-22 2019-04-30 Facebook Technologies, LLC Characterization of optical distortion in a head mounted display
US10430939B1 (en) * 2017-08-28 2019-10-01 Facebook Technologies, LLC Full display panel grid-based virtual image distance test subsystem for eyecup assemblies of head mounted displays
US10522110B1 (en) * 2017-08-30 2019-12-31 Facebook Technologies, LLC Apparatuses, systems, and methods for measuring and adjusting the luminance of a head-mounted display
JP6953247B2 (en) * 2017-09-08 2021-10-27 Lapis Semiconductor Co., Ltd. Goggle-type display device, line-of-sight detection method, and line-of-sight detection system
AU2018373975B2 (en) * 2017-11-27 2020-12-17 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Electronic device
US10663738B2 (en) * 2017-12-04 2020-05-26 Samsung Electronics Co., Ltd. System and method for HMD configurable for various mobile device sizes
EP3729182B1 (en) * 2017-12-21 2023-11-22 BAE SYSTEMS plc Eye tracking for head-worn display
US10692473B1 (en) * 2018-03-13 2020-06-23 Facebook Technologies, LLC Display pixel correction using compression
US10948983B2 (en) * 2018-03-21 2021-03-16 Samsung Electronics Co., Ltd. System and method for utilizing gaze tracking and focal point tracking
US10902820B2 (en) 2018-04-16 2021-01-26 Facebook Technologies, LLC Display device with dynamic resolution enhancement
US10591737B2 (en) * 2018-04-20 2020-03-17 Dell Products L.P. Systems and methods for automatic adjustment of head mounted display straps
US10491890B1 (en) * 2018-05-14 2019-11-26 Dell Products L.P. Systems and methods for automatic adjustment for vertical and rotational imbalance in augmented and virtual reality head-mounted displays
US10685515B2 (en) * 2018-09-10 2020-06-16 Ford Global Technologies, LLC In-vehicle location uncertainty management for passive start
JP7143736B2 (en) * 2018-11-20 2022-09-29 Toyota Motor Corporation Driving support device, wearable device, driving support method, and program
US11009943B2 (en) * 2018-12-02 2021-05-18 Vigo Technologies, Inc. On/off detection in wearable electronic devices
US11042034B2 (en) * 2018-12-27 2021-06-22 Facebook Technologies, LLC Head mounted display calibration using portable docking station with calibration target

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050279165A1 (en) * 2003-09-18 2005-12-22 Tokyo Electron Limited Drop detection device or abnormality detection device and portable apparatus equipped with said device
CN101382559A (en) * 2008-09-04 2009-03-11 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Mobile terminal and drop alarm method and device for objects
CN102610056A (en) * 2012-03-16 2012-07-25 Tsinghua University Mobile phone wearing mode-oriented falling event detection system and method
CN104508538A (en) * 2012-07-24 2015-04-08 Sony Corporation Image display device and image display method
WO2014017348A1 (en) * 2012-07-24 2014-01-30 Sony Corporation Image display device and image display method
US20140188426A1 (en) * 2012-12-27 2014-07-03 Steven Fastert Monitoring hit count for impact events
CN103472257A (en) * 2013-09-12 2013-12-25 Tianjin Samsung Communication Technology Research Co., Ltd. Method and system for detecting acceleration of portable terminal
CN106029190A (en) * 2014-08-11 2016-10-12 Mack Rides GmbH & Co. KG Method for operating a device, in particular an amusement ride, transport means, a fitness device or similar
CN105139596A (en) * 2015-09-15 2015-12-09 Guangdong Genius Technology Co., Ltd. Method and system for fall-off reminding based on a wearable device
CN105389957A (en) * 2015-10-29 2016-03-09 Xiaomi Inc. Communication method, device and system of wearable device
CN105208219A (en) * 2015-10-30 2015-12-30 Nubia Technology Co., Ltd. Mobile terminal falling reminding method and device
CN108781320A (en) * 2016-03-14 2018-11-09 Sonova AG Wireless body-worn personal device with loss detection function
US20170270464A1 (en) * 2016-03-16 2017-09-21 Triax Technologies, Inc. System and interfaces for managing workplace events
CN106355828A (en) * 2016-08-26 2017-01-25 Shenzhen Water World Co., Ltd. Method and device for detecting wearable device disengagement

Also Published As

Publication number Publication date
ES2966264T3 (en) 2024-04-19
US20200225715A1 (en) 2020-07-16
US20200226838A1 (en) 2020-07-16
US11200656B2 (en) 2021-12-14
WO2020146783A1 (en) 2020-07-16
EP3908381A1 (en) 2021-11-17
EP3908380A1 (en) 2021-11-17
WO2020146780A1 (en) 2020-07-16
SG11202106745SA (en) 2021-07-29
JP2022517945A (en) 2022-03-11
KR20210113641A (en) 2021-09-16
CN113226498A (en) 2021-08-06
SG11202106742XA (en) 2021-07-29
WO2020146785A1 (en) 2020-07-16
CA3124892A1 (en) 2020-07-16
JP2022517775A (en) 2022-03-10
US11200655B2 (en) 2021-12-14
US11210772B2 (en) 2021-12-28
EP3908382A1 (en) 2021-11-17
EP3908382B1 (en) 2023-08-23
JP2022517227A (en) 2022-03-07
SG11202106819TA (en) 2021-07-29
KR20210114023A (en) 2021-09-17
CA3125222A1 (en) 2020-07-16
CA3125224A1 (en) 2020-07-16
KR20210114439A (en) 2021-09-23
US20200226738A1 (en) 2020-07-16
CN113226499A (en) 2021-08-06
CN113226499B (en) 2023-09-26

Similar Documents

Publication Publication Date Title
CN113260427A (en) Drop detection system and method
US9692990B2 (en) Infrared tracking system
CN106462232A (en) Determining coordinate frames in a dynamic environment
CN104885144A (en) Mixed reality display accommodation
CN107107834A (en) Projection display device, electronic equipment, driver's visual-recognition image sharing method and driver's visual-recognition image sharing program
US11839829B2 (en) Wearable visualization device with a retractable cover
US20230182031A1 (en) Amusement content processing systems and methods
US11112886B2 (en) Model and detachable controller for augmented reality / virtual reality experience
WO2023107542A1 (en) Amusement content processing systems and methods
WO2023163958A1 (en) Head-mounted display testing system and method
JP2020201301A (en) Training system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code
Ref country code: HK
Ref legal event code: DE
Ref document number: 40058509
Country of ref document: HK