WO2006087709A1 - Personal navigation system - Google Patents

Personal navigation system

Info

Publication number
WO2006087709A1
Authority
WO
WIPO (PCT)
Prior art keywords
navigation system
personal navigation
display
processor
head
Prior art date
Application number
PCT/IL2006/000195
Other languages
French (fr)
Inventor
Zvi Lapidot
Abraham Aharoni
Original Assignee
Lumus Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lumus Ltd. filed Critical Lumus Ltd.
Priority to EP06711177A priority Critical patent/EP1848966A1/en
Priority to US11/816,520 priority patent/US8140197B2/en
Publication of WO2006087709A1 publication Critical patent/WO2006087709A1/en
Priority to US13/396,306 priority patent/US8301319B2/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C23/00 Combined instruments indicating more than one navigational value, e.g. for aircraft; Combined measuring devices for measuring two or more variables of movement, e.g. distance, speed or acceleration
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/163 Wearable computers, e.g. on a belt
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B2027/0187 Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Optics & Photonics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

There is provided a personal navigation system, including a head-mounted orientation sensor, a coordinate position sensor, a head-mounted display, and a processor receiving an input from the head-mounted orientation sensor and an input from the coordinate position sensor and providing a visually sensible output for displaying on the head-mounted display.

Description

PERSONAL NAVIGATION SYSTEM
Field of the Invention
The present invention relates to navigation systems generally and more particularly to visual navigational aids. Specifically, the present invention is concerned with personal aviation navigation systems.
Background of the Invention
The following U.S. Patents are believed to represent the current state of the art: 6,809,724; 6,798,391; 6,771,423; 6,757,068; 6,747,611; 6,727,865; 6,681,629; 6,661,438; 6,608,884; 6,563,648; 6,559,872; 6,474,159; 6,408,257; 6,384,982; 6,380,923; 6,359,609; 6,356,392; 6,353,503; 6,304,303; 6,222,677; 6,204,974; 6,157,533; 6,140,981; 6,127,990; 6,108,197; 6,094,242; 6,057,966; 6,050,717; 5,886,822; 5,880,773; 5,844,656; 5,798,733; 5,764,280; 5,581,492 and 5,757,339.
The following patent publications are also believed to be of interest:
Published PCT Applications: WO04015369; WO02086590; WO0180736; WO0156007; WO0116929; WO0109636; WO0023815; WO0010156; WO9733270; WO9725646; WO9637798; WO9636898; WO9607947; WO9605532; WO9600406; WO9521395; WO9510106; WO9510061; WO9424658; WO9414152; WO9411855; WO9407161 and WO9301683.
Foreign Patent Publications: EP1310859; EP1280457; EP1267197; EP1248180; EP1223729; EP1220080; EP1185142; EP1176449; EP1135765; EP1042698; EP1022644; EP0935183; EP0904560; EP0902312; EP0889346; EP0825470; EP0802440; EP0775327; EP0772790; EP0771433; EP0724758; EP0721614; EP0716329; EP0694196; EP0670537; EP0672286; EP0627644; EP0592591 and EP0344881.
Summary of the Invention
The present invention relates to a personal navigation system including a head- mounted orientation sensor, a coordinate position sensor, a head-mounted display and a processor receiving an input from the head-mounted orientation sensor and an input from the coordinate position sensor and providing a visually sensible output for displaying on the head-mounted display.
Preferably, the display is at least partially transparent. Additionally or alternatively, the coordinate position sensor is a portable coordinate position sensor. Additionally, the portable coordinate position sensor includes at least one of a user-worn sensor and a user- carried sensor.
Preferably, the processor does not receive any inputs from navigational instrumentation of an aircraft. Alternatively, the processor receives input from a coordinate position sensor onboard a carrier vehicle such as an aircraft, in addition to the inputs from the coordinate position sensor.
Preferably, the processor is a portable processor. Additionally or alternatively, the portable processor includes at least one of a user-worn processor and a user-carried processor.
Preferably, the coordinate position sensor includes a GPS receiver. Additionally or alternatively, the personal aviation system is mounted onto a headset.
Preferably, the head-mounted orientation sensor includes an inertial sensor.
Preferably, the processor and the display provide location indication functionality. Additionally, the location indication functionality includes landing strip designation functionality.
Alternatively or additionally, the processor and the display provide approaching aircraft warning and designation functionality. Additionally, the approaching aircraft warning and designation functionality includes an audio warning. Additionally, the approaching aircraft warning and designation functionality includes a visual warning. Additionally, the visual warning includes a flashing visual warning.
Preferably, the processor and the display provide airport approach path designation functionality. Additionally or alternatively, the processor and the display provide temporary flight restriction zone designation functionality. Additionally or alternatively, the processor and the display provide user head turning direction designation functionality.
Preferably, the head-mounted display is operative to display visually sensible navigation outputs overlaid on a view of a scene, which is seen through the head-mounted display. Additionally or alternatively, the head-mounted display is operative to display a visual symbol, which can be used to designate a location on the ground in the line-of-sight of a pilot, and thereby identify the designated location.
Brief Description of the Drawings
The invention will now be described in connection with certain preferred embodiments with reference to the following illustrative figures so that it may be more fully understood.
With specific reference now to the figures in detail, it is stressed that the particulars shown are by way of example and for purpose of illustrative discussion of the preferred embodiments of the present invention only and are presented in the cause of providing what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the invention. In this regard, no attempt is made to show structural details of the invention in more detail than is necessary for a fundamental understanding of the invention, the description taken with the drawings making apparent to those skilled in the art how the several forms of the invention may be embodied in practice.
In the drawings:
Figs. 1A and 1B are illustrations of a user-carried personal aviation navigation and orientation system, constructed and operative in accordance with a preferred embodiment of the present invention;
Figs. 2A and 2B are illustrations of an application of the personal aviation navigation and orientation system of Fig. 1, constructed and operative in accordance with a preferred embodiment of the present invention, shown in two alternative operative orientations;
Figs. 3A and 3B are illustrations of another application of the personal aviation navigation and orientation system of Figs. 1A and 1B, constructed and operative in accordance with a preferred embodiment of the present invention, shown in two alternative operative orientations;
Figs. 4A and 4B are illustrations of yet another application of the personal aviation navigation and orientation system of Figs. 1A and 1B, constructed and operative in accordance with a preferred embodiment of the present invention, shown in two alternative operative orientations; and
Figs. 5A and 5B are pictorial illustrations of yet another application of the personal aviation navigation and orientation system of Figs. 1A and 1B, constructed and operative in accordance with a preferred embodiment of the present invention, shown in two alternative operative orientations.
Detailed Description of the Preferred Embodiments
Reference is made to Figs. 1A and 1B, which are pictorial illustrations of a preferred embodiment of a user carrying and using a personal navigation and orientation system 100, e.g., a personal aviation navigation and orientation system. For further understanding, the following description will specifically relate to aviation navigation, which merely constitutes a non-limiting example of the most common use of the subject navigational system, which is also utilizable on land and at sea. The personal aviation navigation and orientation system 100 is mountable on a user's head and preferably comprises a head position and orientation sensor 102 and a display 104, which displays visually sensible navigation outputs overlaid or superimposed on a view of a scene 105, seen through display 104. The orientation sensor 102 is firmly mounted onto the display 104, thereby allowing the user to readjust the personal aviation navigation and orientation system 100 on the head, while maintaining the accuracy of orientation sensor measurement of the line-of-sight through the display to the desired location.
The display 104 may additionally display flight information, such as a Primary Flight Display including artificial horizon, air speed and height, as well as secondary information including squawk frequency, airfield data and the like. Additionally, the personal aviation navigation and orientation system 100 may have access to database information, such as the locations of airstrips and the landing procedures for each, as well as major landscape waypoints, such that identification information may be generated for display on display 104. Moreover, the system 100 is preferably connected to a collision avoidance system (not shown), such as a Traffic Collision Avoidance System (TCAS), for generating information relating to a possible collision situation. The system may further be connected to an active information system (not shown) capable of dynamically generating information concerning Temporary Flight Restriction (TFR) zones in close proximity to the aircraft.
The visually sensible navigation outputs displayed on display 104 are provided by a processor 106, preferably a portable processor, which is advantageously head mounted, but may alternatively be carried or worn by the user. The processor 106 receives a head orientation output from sensor 102 and also receives a user location input from a coordinate position sensor 108, such as a GPS receiver. Coordinate position sensor 108 is preferably a portable coordinate position sensor, such as a head-mounted, user-carried or user-worn GPS receiver. Alternatively or additionally, personal aviation navigation and orientation system 100 may receive user location input from an onboard aircraft coordinate position sensor.
It is a particular feature of the present invention that the visually sensible navigation outputs displayed on display 104 do not require any interface with the navigation or other instrumentation of an aircraft, thus substantially simplifying the design of system 100, alleviating the need for installation expertise and certification, and thereby lowering its cost. It is appreciated, however, that where user location inputs from aircraft-installed instrumentation are available, this data can additionally or alternatively be obtained through an interface to the aircraft-installed instrumentation.
In accordance with a preferred embodiment of the present invention, the personal aviation navigation and orientation system 100 is mounted onto a conventional pilot's headset 110. The sensor 102 is preferably a commercially available inertial sensor, such as an InertiaCube, commercially available from Intersense Inc. of Bedford, Massachusetts, U.S.A. The display 104 is preferably an at least partially transparent display, and the processor generated visually sensible navigation outputs are superimposed over an external view seen by the user. The display is preferably a display of the type described in any of the following published patent documents of the applicant/assignee, the disclosures of which are hereby incorporated by reference: U.S. Patent No. 6,829,095, European Patent Publications EP1485747A1 and EP1295163A2, and published PCT applications of the applicant/assignee.
Alternatively, display 104 may be opaque, and personal aviation navigation and orientation system 100 may be operative to generate a display of an external scene 105 that would be seen by the user with the visually sensible navigation outputs superimposed thereupon.
The processor 106 is preferably a commercially available microcontroller and is operative to combine the inputs of sensor 102 and of coordinate position sensor 108 and to provide visually sensible navigation indications, having the functionality described hereinbelow with reference to Figs. 2A to 5B. It is noted that although in the illustrated embodiment the processor 106 is shown to communicate with coordinate position sensor 108 via a wired communication link, communication between processor 106 and coordinate position sensor 108 may instead be via a wireless communication link. It is further noted that although in the illustrated embodiment processor 106 and coordinate position sensor 108 are shown as individual components, processor 106 and coordinate position sensor 108 may be incorporated into a single component, which may be either head mounted or user carried or worn.
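As a concrete illustration of this data flow, the following is a minimal sketch in Python, not the patent's implementation: a processing loop repeatedly reads the head orientation and the coordinate position and hands the combined result to the display. The read_orientation, read_position and render functions are hypothetical placeholders for the sensor and display interfaces.

```python
import time

def read_orientation():
    # Placeholder for the head-mounted inertial sensor interface (e.g. sensor 102).
    return {"yaw_deg": 0.0, "pitch_deg": 0.0}

def read_position():
    # Placeholder for the coordinate position sensor (e.g. GPS receiver 108),
    # whether connected over a wired or a wireless link.
    return {"lat": 32.0, "lon": 34.8, "alt_m": 500.0}

def render(symbols):
    # Placeholder for drawing crosshairs, arrows and labels on display 104.
    pass

def navigation_loop(compute_symbols, rate_hz=20):
    """Combine the two sensor inputs into display symbology at a fixed rate."""
    period = 1.0 / rate_hz
    while True:
        head = read_orientation()
        position = read_position()
        render(compute_symbols(head, position))
        time.sleep(period)
```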
Referring to Figs. 2A and 2B, in the illustrated embodiment, the processor 106 is operative to generate a location indicator symbol, such as crosshairs or frames, designated by reference numeral 112, to coincide with the line-of-sight of the user to a desired location and thereby indicate a desired location, such as a landing strip 114. Fig. 2A shows what the user sees when the desired location is positioned within the field-of-view of the display 104. When the desired location is outside the field-of-view of the display 104, as illustrated in Fig. 2B, a suitable symbol or text, such as an arrow, designated by reference numeral 116, is generated on display 104 indicating the direction in which the user should turn his head in order to see the desired location, here indicated by crosshairs 112 in phantom lines. In addition to the symbolic indication of the desired position, a suitable label, such as the name and/or ground coordinates of the position, may also be displayed on display 104.
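A minimal sketch of how such symbol placement could be computed follows; it is an illustrative assumption, not the patent's algorithm. It derives the bearing and elevation from the sensed position to a known target such as landing strip 114, compares them with the head yaw and pitch, and returns either normalised display coordinates for crosshairs 112 or a turn direction for arrow 116. The flat-earth approximation, field-of-view values and all names are assumed for illustration.

```python
import math

EARTH_RADIUS_M = 6371000.0

def target_direction(user_lat, user_lon, user_alt, tgt_lat, tgt_lon, tgt_alt):
    """Bearing and elevation (degrees) from the user to the target, flat-earth approximation."""
    d_north = math.radians(tgt_lat - user_lat) * EARTH_RADIUS_M
    d_east = math.radians(tgt_lon - user_lon) * EARTH_RADIUS_M * math.cos(math.radians(user_lat))
    d_up = tgt_alt - user_alt
    azimuth = math.degrees(math.atan2(d_east, d_north)) % 360.0
    elevation = math.degrees(math.atan2(d_up, math.hypot(d_north, d_east)))
    return azimuth, elevation

def symbol_placement(head_yaw, head_pitch, azimuth, elevation, fov_h=30.0, fov_v=20.0):
    """Place the location indicator symbol, or return a head-turn cue when it is off-screen."""
    d_yaw = (azimuth - head_yaw + 180.0) % 360.0 - 180.0   # signed offset in degrees
    d_pitch = elevation - head_pitch
    if abs(d_yaw) <= fov_h / 2 and abs(d_pitch) <= fov_v / 2:
        # Normalised display coordinates in [-1, 1]: draw crosshairs 112 here.
        return {"on_screen": True, "x": d_yaw / (fov_h / 2), "y": d_pitch / (fov_v / 2)}
    # Outside the field of view: show arrow 116 pointing the way to turn.
    if abs(d_pitch) > abs(d_yaw):
        arrow = "up" if d_pitch > 0 else "down"
    else:
        arrow = "right" if d_yaw > 0 else "left"
    return {"on_screen": False, "arrow": arrow}
```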
In addition to the mode of operation described above, personal aviation navigation and orientation system 100 of Fig. 1 may also be operative in a locate mode. In this mode, the system is operative to provide information concerning a user-designated location. In this mode of operation, processor 106 is operative to generate a location indicator symbol, such as crosshairs 112, at the center of display 104. The user then moves his head to overlay the line-of-sight of the location indicator symbol over a desired location on the ground, and activates processor 106 to identify this location. Processor 106 uses the coordinates from the coordinate position sensor 108 and the orientation from the orientation sensor 102 to calculate the location to which the user is pointing and provides the user with information on the location to which he has pointed.
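A minimal sketch of the locate-mode geometry, under a flat-earth assumption and not taken from the patent: the line of sight defined by the sensed position and the head yaw and pitch is intersected with the ground to estimate the coordinates of the designated point. The height-above-ground input and all names are illustrative.

```python
import math

EARTH_RADIUS_M = 6371000.0

def designated_ground_point(lat, lon, height_agl_m, head_yaw_deg, head_pitch_deg):
    """Return (lat, lon) of the ground point under the centred crosshairs, or None if looking up."""
    pitch = math.radians(head_pitch_deg)
    if pitch >= 0.0:
        return None  # the line of sight does not reach the ground
    ground_range = height_agl_m / math.tan(-pitch)   # horizontal distance to the intersection
    yaw = math.radians(head_yaw_deg)
    d_north = ground_range * math.cos(yaw)
    d_east = ground_range * math.sin(yaw)
    d_lat = math.degrees(d_north / EARTH_RADIUS_M)
    d_lon = math.degrees(d_east / (EARTH_RADIUS_M * math.cos(math.radians(lat))))
    return lat + d_lat, lon + d_lon
```

For example, designated_ground_point(32.0, 34.8, 500.0, 90.0, -10.0) returns a point roughly 2.8 km due east of the user, the intersection of a 10-degree downward line of sight with the ground from 500 m above it.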
It is noted that system 100 is preferably calibrated automatically by correlating the time-evolved measurements of the coordinate position sensor 108 with the measurements obtained by the head-mounted orientation sensor 102. Alternatively, the system 100 may be calibrated manually by entering a calibration mode upon startup. In the calibration mode, similar to the locate mode described above, a location indicator symbol is positioned at the center of display 104. The user then positions the location indicator symbol, preferably by turning his head, on at least one, and preferably three or more, distant preprogrammed locations spread over a wide azimuthal range, approving each location as it is acquired by processor 106. The system processor 106 can then analyze the data to average out measurement errors and provide a baseline orientation for orientation sensor 102. Performing such a calibration within the cockpit of the plane can serve to alleviate any potential interference in the measurement of orientation sensor 102 as a result of magnetic effects due to the plane's structural metal components.
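A minimal sketch of this manual calibration step, with assumed helper names and sample values: for each approved preprogrammed landmark, the difference between the yaw reported by orientation sensor 102 and the true bearing computed from the known positions is recorded, and the average of those differences is taken as the baseline offset.

```python
def signed_diff_deg(a, b):
    """Signed angular difference a - b, wrapped to the range [-180, 180) degrees."""
    return (a - b + 180.0) % 360.0 - 180.0

def calibrate_yaw_offset(samples):
    """samples: list of (sensor_yaw_deg, true_bearing_deg) pairs, one per approved landmark."""
    errors = [signed_diff_deg(sensor_yaw, true_bearing) for sensor_yaw, true_bearing in samples]
    return sum(errors) / len(errors)   # averaging washes out per-sighting measurement error

# Example with three landmarks spread over a wide azimuthal range (made-up numbers):
offset_deg = calibrate_yaw_offset([(91.5, 90.0), (184.0, 182.5), (272.0, 271.0)])
# Subsequent sensor headings would then be corrected by subtracting offset_deg.
```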
Reference is now made to Figs. 3A and 3B, which are simplified pictorial illustrations of another application of the personal aviation navigation and orientation system 100 of Fig. 1, constructed and operative in accordance with a preferred embodiment of the present invention, shown in two alternative operative orientations. In the illustrated embodiment, the processor 106 is operative to generate a location indicator, such as a series of symbols, designated by reference numeral 118, to indicate a Temporary Flight Restriction (TFR) zone boundary. Fig. 3A shows what the user sees on display 104 when the TFR zone boundary is positioned within the field-of-view of the display 104. When the TFR zone boundary is outside the field-of-view of the display 104, as illustrated in Fig. 3B, a suitable symbol or text, such as an arrow, designated by reference numeral 120, is generated on display 104 indicating the direction in which the user should turn his head in order to see the boundary of the TFR zone, here indicated by a series of symbols 118 shown in phantom. In addition to the symbolic indication of the TFR zone, a suitable label, such as the name and/or coordinate location of the TFR zone and other information, such as its validity times, may also be displayed on display 104.
Reference is now made to Figs. 4A and 4B, which are simplified pictorial illustrations of yet another application of the personal aviation navigation and orientation system 100 of Fig. 1, constructed and operative in accordance with yet another preferred embodiment of the present invention, shown in two alternative operative orientations. In the illustrated embodiment, the processor 106 is operative to generate a location indicator symbol, such as a pair of chevrons, designated by reference numeral 122, to indicate on display 104 an approaching aircraft and a potentially dangerous collision situation. Fig. 4A shows what the user sees on display 104 when the approaching aircraft is positioned within the field-of-view of the display 104. When the approaching aircraft is outside the field-of-view of the display 104, as illustrated in Fig. 4B, a suitable symbol or text, such as an arrow, designated by reference numeral 124, is generated on display 104 indicating the direction in which the user should turn his head in order to see the approaching aircraft, here indicated by chevrons 122 shown in phantom. Typically, audio and other visual warnings, such as flashing symbols 126, may also be provided to alert the user to the potential danger of an approaching aircraft. In addition to the symbolic indication of the approaching aircraft, a suitable label, such as the name and/or communication channels relevant to the approaching aircraft, can also be displayed on display 104.
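One way such warnings could be graded, sketched below as an illustrative assumption with arbitrary thresholds rather than certified collision-avoidance logic, is to estimate the closest point of approach from the intruder's relative position and velocity (e.g. as reported by a TCAS unit) and escalate from chevron designation to flashing symbols plus an audio cue.

```python
import math

def closest_approach(rel_pos_m, rel_vel_mps):
    """Time (s) and miss distance (m) at closest approach for a straight-line relative track."""
    speed_sq = sum(v * v for v in rel_vel_mps)
    if speed_sq == 0.0:
        return 0.0, math.sqrt(sum(p * p for p in rel_pos_m))
    t_cpa = max(0.0, -sum(p * v for p, v in zip(rel_pos_m, rel_vel_mps)) / speed_sq)
    miss = math.sqrt(sum((p + v * t_cpa) ** 2 for p, v in zip(rel_pos_m, rel_vel_mps)))
    return t_cpa, miss

def warning_level(rel_pos_m, rel_vel_mps):
    """Escalate from chevrons 122 to flashing symbols 126 plus an audio warning (illustrative thresholds)."""
    t_cpa, miss = closest_approach(rel_pos_m, rel_vel_mps)
    if miss < 1000.0 and t_cpa < 30.0:
        return "flashing symbol and audio warning"
    if miss < 2000.0 and t_cpa < 60.0:
        return "chevron designation"
    return "no warning"
```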
Reference is now made to Figs. 5A and 5B, which are simplified pictorial illustrations of still another application of the personal aviation navigation and orientation system 100 of Fig. 1, constructed and operative in accordance with a preferred embodiment of the present invention, shown in two alternative operative orientations. In the illustrated embodiment, the processor 106 is operative to generate a location indicator symbol, such as a series of converging rectangular frames, designated by reference numeral 130, to indicate a landing approach path to an airport. Fig. 5A shows what the user sees on display 104 when the direction of the approach path is positioned within the field-of-view of the display 104. When the direction of the approach path is outside the field-of-view of the display 104, as illustrated in Fig. 5B, a suitable symbol or text, such as an arrow, designated by reference numeral 132, is generated on display 104 indicating the direction in which the user should turn his head in order to see the approach path, here indicated by frames 130 shown in phantom. In addition to the symbolic indication of the approach, the name, calling frequencies and other relevant information may also be displayed on display 104.
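A minimal sketch, under illustrative assumptions only, of how the series of converging frames 130 could be positioned: sample equally spaced points on a nominal glide path extending back from the runway threshold, then project each point onto the display with the same symbol-placement logic sketched earlier. The glide-slope and approach-length parameters are assumed values.

```python
import math

def approach_frames(threshold_neu_m, runway_heading_deg, glide_slope_deg=3.0,
                    approach_length_m=5000.0, n_frames=6):
    """threshold_neu_m: (north, east, up) of the runway threshold in a local frame, metres.
    Returns frame centre positions ordered from the farthest frame to the nearest."""
    heading = math.radians(runway_heading_deg)
    frames = []
    for i in range(n_frames, 0, -1):
        back = approach_length_m * i / n_frames          # distance back along the approach
        north = threshold_neu_m[0] - back * math.cos(heading)
        east = threshold_neu_m[1] - back * math.sin(heading)
        up = threshold_neu_m[2] + back * math.tan(math.radians(glide_slope_deg))
        frames.append((north, east, up))
    return frames
```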
It is noted that although the examples presented hereinabove are oriented towards pilots, the present invention could be equally useful for other functional operators in airborne applications, such as equipment operators, gunners and the like, as well as for users of other navigational applications, such as those found in sailing, coordinated operations in safety and military applications, and audio-visual personal guides.
It is also noted that in the system of the present invention the position of the aircraft is received from coordinate position sensor measurements, while the orientation of the user's line-of-sight is measured at the user's head. Alternatively, the coordinate position measurements may be combined with an input from the Wide Area Augmentation System (WAAS) to provide improved height measurement accuracy compared to standard GPS measurements. Alternatively or additionally, the coordinate position sensor measurements may be combined with data from a Pitot tube of the aircraft for improved height data, or coordinate position measurements may be received from instruments within the aircraft. Additionally, as a consequence of the large distance to the points of interest, the effect of possible separation between the location of the coordinate position sensor 108 and the orientation sensor 102 is negligible. In other cases, where the separation of the orientation sensor 102 and the coordinate position sensor 108 is significant, it can be accounted for or measured by introducing additional sensors.
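Where that separation matters, a simple lever-arm correction is one way it could be accounted for; the sketch below is a simplifying assumption rather than the patent's method. The known body-frame offset from the position sensor to the head sensor is rotated by the measured heading and added to the reported position; a yaw-only rotation and all names are assumed for illustration.

```python
import math

def corrected_position(gps_north_m, gps_east_m, gps_up_m, offset_body_m, yaw_deg):
    """offset_body_m: (forward, right, up) offset, in metres, from the position sensor to the head sensor."""
    yaw = math.radians(yaw_deg)
    forward, right, up = offset_body_m
    d_north = forward * math.cos(yaw) - right * math.sin(yaw)
    d_east = forward * math.sin(yaw) + right * math.cos(yaw)
    return gps_north_m + d_north, gps_east_m + d_east, gps_up_m + up
```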
It is further noted that in the system of the present invention a user may enter a series of waypoints to assist in navigation along a planned route. The waypoints can be given names and can be called up in sequence onto display 104 to assist navigation. Navigation may also be assisted by use of database information, such as displaying the location of the nearest possible landing strip in a case of emergency landing.
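A minimal sketch of such a waypoint aid, with an assumed interface and made-up route data: a named, ordered route that is stepped through in sequence, with the active waypoint handed to the display logic.

```python
class WaypointRoute:
    """An ordered list of named waypoints that can be called up in sequence."""

    def __init__(self, waypoints):
        # waypoints: list of (name, lat_deg, lon_deg, alt_m) tuples in route order
        self.waypoints = list(waypoints)
        self.index = 0

    def current(self):
        """The waypoint currently shown on the display, or None when the route is finished."""
        return self.waypoints[self.index] if self.index < len(self.waypoints) else None

    def advance(self):
        """Call when the current waypoint is reached; returns the next waypoint (or None)."""
        if self.index < len(self.waypoints):
            self.index += 1
        return self.current()

# Example route; names and coordinates are invented for illustration.
route = WaypointRoute([("ALPHA", 32.10, 34.78, 450.0),
                       ("BRAVO", 32.30, 34.90, 600.0),
                       ("FINAL", 32.81, 35.04, 0.0)])
active = route.current()   # the name and coordinates would be shown on display 104
```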
It will be appreciated by persons skilled in the art that the present invention is not limited by what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and sub-combinations of the various features described hereinabove, as well as modifications thereof which would occur to a person of ordinary skill in the art upon reading the foregoing description, and which are not in the prior art.

Claims

WHAT IS CLAIMED IS:
1. A personal navigation system, comprising: a head-mounted orientation sensor; a coordinate position sensor; a head-mounted display; and a processor receiving an input from said head-mounted orientation sensor and an input from said coordinate position sensor and providing a visually sensible output for displaying on said head-mounted display.
2. The personal navigation system according to claim 1, wherein said display is an at least partially transparent display.
3. The personal navigation system according to claim 1, wherein said coordinate position sensor is a portable coordinate position sensor.
4. The personal navigation system according to claim 3, wherein said portable coordinate position sensor includes at least one of a user-worn sensor and a user-carried sensor.
5. The personal navigation system according to claim 1, wherein said processor operates autonomously from any inputs from navigational instrumentation of a carrier vehicle.
6. The personal navigation system according to claim 1, wherein said coordinate position sensor is an onboard carrier vehicle coordinate position sensor.
7. The personal navigation system according to claim 6, wherein said carrier vehicle is an aircraft and said system is an aviation navigation system.
8. The personal navigation system according to claim 1, wherein said processor comprises a portable processor.
9. The personal navigation system according to claim 8, wherein said portable processor comprises at least one of a user-worn processor and a user-carried processor.
10. The personal navigation system according to claim 1, wherein said coordinate position sensor comprises a GPS receiver.
11. The personal navigation system according to claim 1, wherein said personal system is coupled with a headset.
12. The personal navigation system according to claim 1, wherein said head-mounted orientation sensor comprises an inertial sensor.
13. The personal navigation system according to claim 1, wherein said processor and said display provide location indication functionality.
14. The personal navigation system according to claim 13, wherein said location indication functionality comprises landing strip designation functionality.
15. The personal navigation system according to claim 1, wherein said processor and said display provide approaching vehicle warning and designation functionality.
16. The personal navigation system according to claim 15, wherein said approaching vehicle warning and designation functionality comprises an audio warning.
17. The personal navigation system according to claim 15, wherein said approaching vehicle warning and designation functionality comprises a visual warning.
18. The personal navigation system according to claim 17, wherein said visual warning comprises a flashing visual warning.
19. The personal navigation system according to claim 1, wherein said processor and said display provide airport approach path designation functionality.
20. The personal navigation system according to claim 1, wherein said processor and said display provide temporary flight restriction zone designation functionality.
21. The personal navigation system according to claim 1, wherein said processor and said display provide user head-turning direction designation functionality.
22. The personal navigation system according to claim 1, wherein said head-mounted display is operative to display visually sensible navigation outputs superimposed on a view of a scene seen through said head-mounted display.
23. The personal navigation system according to claim 1, wherein said head-mounted display is operative to display a visual symbol utilizable to designate a location on the ground in the line-of-sight of a pilot, for identifying a designated location.
24. The personal navigation system according to claim 11, wherein said head-mounted display is configured to be located in front of an eye of a user.
PCT/IL2006/000195 2005-02-17 2006-02-15 Personal navigation system WO2006087709A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP06711177A EP1848966A1 (en) 2005-02-17 2006-02-15 Personal navigation system
US11/816,520 US8140197B2 (en) 2005-02-17 2006-02-15 Personal navigation system
US13/396,306 US8301319B2 (en) 2005-02-17 2012-02-14 Personal navigation system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IL16698305 2005-02-17
IL166983 2005-02-17

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US11/816,520 A-371-Of-International US8140197B2 (en) 2005-02-17 2006-02-15 Personal navigation system
US13/396,306 Continuation US8301319B2 (en) 2005-02-17 2012-02-14 Personal navigation system

Publications (1)

Publication Number Publication Date
WO2006087709A1 true WO2006087709A1 (en) 2006-08-24

Family

ID=36588205

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2006/000195 WO2006087709A1 (en) 2005-02-17 2006-02-15 Personal navigation system

Country Status (3)

Country Link
US (2) US8140197B2 (en)
EP (1) EP1848966A1 (en)
WO (1) WO2006087709A1 (en)

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2227676A2 (en) 2007-12-21 2010-09-15 BAE Systems PLC Apparatus and method for landing a rotary wing aircraft
GB2477787A (en) * 2010-02-15 2011-08-17 Marcus Alexander Mawson Cavalier Data Overlay Generation Using Portable Electronic Device With Head-Mounted Display
FR2994736A1 (en) * 2012-08-24 2014-02-28 Thales Sa VISUALIZATION SYSTEM, IN PARTICULAR FOR AIRCRAFT, TRANSPARENT SCREEN AND PERIPHERAL SYMBOLOGY
JP2014505897A (en) * 2010-11-18 2014-03-06 マイクロソフト コーポレーション Improved autofocus for augmented reality display
WO2015162611A1 (en) 2014-04-23 2015-10-29 Lumus Ltd. Compact head-mounted display system
US9292973B2 (en) 2010-11-08 2016-03-22 Microsoft Technology Licensing, Llc Automatic variable virtual focus for augmented reality displays
US9323325B2 (en) 2011-08-30 2016-04-26 Microsoft Technology Licensing, Llc Enhancing an object of interest in a see-through, mixed reality display device
WO2016069672A1 (en) * 2014-10-31 2016-05-06 Microsoft Technology Licensing, Llc Facilitating interaction between users and their environments using a headset having input mechanisms
WO2016075689A1 (en) 2014-11-11 2016-05-19 Lumus Ltd. Compact head-mounted display system protected by a hyperfine structure
WO2016132347A1 (en) 2015-02-19 2016-08-25 Lumus Ltd. Compact head-mounted display system having uniform image
WO2017199232A1 (en) 2016-05-18 2017-11-23 Lumus Ltd. Head-mounted imaging device
WO2018104732A1 (en) * 2016-12-09 2018-06-14 Sony Interactive Entertainment Inc. Head mounted display with user head rotation guidance
US10261321B2 (en) 2005-11-08 2019-04-16 Lumus Ltd. Polarizing optical system
US10302835B2 (en) 2017-02-22 2019-05-28 Lumus Ltd. Light guide optical assembly
US10437031B2 (en) 2016-11-08 2019-10-08 Lumus Ltd. Light-guide device with optical cutoff edge and corresponding production methods
US10481319B2 (en) 2017-03-22 2019-11-19 Lumus Ltd. Overlapping facets
US10551544B2 (en) 2018-01-21 2020-02-04 Lumus Ltd. Light-guide optical element with multiple-axis internal aperture expansion
US10564417B2 (en) 2016-10-09 2020-02-18 Lumus Ltd. Aperture multiplier using a rectangular waveguide
US10649214B2 (en) 2005-02-10 2020-05-12 Lumus Ltd. Substrate-guide optical device
EP3715935A1 (en) 2014-12-25 2020-09-30 Lumus Ltd. Substrate-guided optical device
US10895679B2 (en) 2017-04-06 2021-01-19 Lumus Ltd. Light-guide optical element and method of its manufacture
EP3796069A1 (en) 2012-05-21 2021-03-24 Lumus Ltd Head-mounted display with an eyeball-tracker integrated system
US11243434B2 (en) 2017-07-19 2022-02-08 Lumus Ltd. LCOS illumination via LOE
US11262587B2 (en) 2018-05-22 2022-03-01 Lumus Ltd. Optical system and method for improvement of light field uniformity
US11415812B2 (en) 2018-06-26 2022-08-16 Lumus Ltd. Compact collimating optical device and system
US11448816B2 (en) 2019-01-24 2022-09-20 Lumus Ltd. Optical systems including light-guide optical elements with two-dimensional expansion
US11789264B2 (en) 2021-07-04 2023-10-17 Lumus Ltd. Display with stacked light-guide elements providing different parts of field of view
US11796729B2 (en) 2021-02-25 2023-10-24 Lumus Ltd. Optical aperture multipliers having a rectangular waveguide
US11914161B2 (en) 2019-06-27 2024-02-27 Lumus Ltd. Apparatus and methods for eye tracking based on eye imaging via light-guide optical element

Families Citing this family (84)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IL166799A (en) 2005-02-10 2014-09-30 Lumus Ltd Substrate-guided optical device utilizing beam splitters
CN101632033B (en) * 2007-01-12 2013-07-31 寇平公司 Helmet type monocular display device
US9217868B2 (en) * 2007-01-12 2015-12-22 Kopin Corporation Monocular display device
CA2685947A1 (en) 2007-05-14 2008-11-27 Kopin Corporation Mobile wireless display for accessing data from a host and method for controlling
US8825468B2 (en) * 2007-07-31 2014-09-02 Kopin Corporation Mobile wireless display providing speech to speech translation and avatar simulating human attributes
US8355671B2 (en) * 2008-01-04 2013-01-15 Kopin Corporation Method and apparatus for transporting video signal over Bluetooth wireless interface
US9886231B2 (en) 2008-03-28 2018-02-06 Kopin Corporation Head worn wireless computer having high-resolution display suitable for use as a mobile internet device
US8350726B2 (en) * 2009-05-19 2013-01-08 Honeywell International Inc. Gaze-based touchdown point selection system and method
KR101648339B1 (en) * 2009-09-24 2016-08-17 삼성전자주식회사 Apparatus and method for providing service using a sensor and image recognition in portable terminal
WO2011097564A1 (en) * 2010-02-05 2011-08-11 Kopin Corporation Touch sensor for controlling eyewear
US20110196598A1 (en) * 2010-02-09 2011-08-11 Honeywell International Inc. System and methods for rendering taxiway and runway signage in a synthetic display of an airport field
US8456328B2 (en) * 2010-02-17 2013-06-04 Honeywell International Inc. System and method for informing an aircraft operator about a temporary flight restriction in perspective view
US9316827B2 (en) 2010-09-20 2016-04-19 Kopin Corporation LifeBoard—series of home pages for head mounted displays (HMD) that respond to head tracking
US10013976B2 (en) 2010-09-20 2018-07-03 Kopin Corporation Context sensitive overlays in voice controlled headset computer displays
US9377862B2 (en) 2010-09-20 2016-06-28 Kopin Corporation Searchlight navigation using headtracker to reveal hidden or extra document data
US8736516B2 (en) 2010-09-20 2014-05-27 Kopin Corporation Bluetooth or other wireless interface with power management for head mounted display
US8706170B2 (en) 2010-09-20 2014-04-22 Kopin Corporation Miniature communications gateway for head mounted display
US10359545B2 (en) 2010-10-21 2019-07-23 Lockheed Martin Corporation Fresnel lens with reduced draft facet visibility
US9632315B2 (en) 2010-10-21 2017-04-25 Lockheed Martin Corporation Head-mounted display apparatus employing one or more fresnel lenses
US8678282B1 (en) * 2010-11-29 2014-03-25 Lockheed Martin Corporation Aim assist head-mounted display apparatus
US9720228B2 (en) 2010-12-16 2017-08-01 Lockheed Martin Corporation Collimating display with pixel lenses
US10627860B2 (en) 2011-05-10 2020-04-21 Kopin Corporation Headset computer that uses motion and voice commands to control information display and remote devices
US9909878B2 (en) * 2012-03-05 2018-03-06 Here Global B.V. Method and apparatus for triggering conveyance of guidance information
US8929954B2 (en) 2012-04-25 2015-01-06 Kopin Corporation Headset computer (HSC) as auxiliary display with ASR and HT input
US9442290B2 (en) 2012-05-10 2016-09-13 Kopin Corporation Headset computer operation using vehicle sensor feedback for remote control vehicle
US9378028B2 (en) 2012-05-31 2016-06-28 Kopin Corporation Headset computer (HSC) with docking station and dual personality
JP6201990B2 (en) * 2012-06-18 2017-09-27 ソニー株式会社 Image display device, image display program, and image display method
US20140002629A1 (en) * 2012-06-29 2014-01-02 Joshua J. Ratcliff Enhanced peripheral vision eyewear and methods using the same
USD713406S1 (en) 2012-11-30 2014-09-16 Kopin Corporation Headset computer with reversible display
US9160064B2 (en) 2012-12-28 2015-10-13 Kopin Corporation Spatially diverse antennas for a headset computer
JP6423799B2 (en) 2013-01-04 2018-11-14 コピン コーポレーション Ad hoc network
US9134793B2 (en) 2013-01-04 2015-09-15 Kopin Corporation Headset computer with head tracking input used for inertial control
CN104246864B (en) 2013-02-22 2016-06-29 索尼公司 Head mounted display and image display device
US9838506B1 (en) 2013-03-15 2017-12-05 Sony Interactive Entertainment America Llc Virtual reality universe representation changes viewing based upon client side parameters
US20140280506A1 (en) 2013-03-15 2014-09-18 John Cronin Virtual reality enhanced through browser connections
US20140280644A1 (en) 2013-03-15 2014-09-18 John Cronin Real time unified communications interaction of a predefined location in a virtual reality location
US20140280502A1 (en) 2013-03-15 2014-09-18 John Cronin Crowd and cloud enabled virtual reality distributed location network
US20140280503A1 (en) 2013-03-15 2014-09-18 John Cronin System and methods for effective virtual reality visitor interface
US20140267581A1 (en) 2013-03-15 2014-09-18 John Cronin Real time virtual reality leveraging web cams and ip cams and web cam and ip cam networks
US9129430B2 (en) 2013-06-25 2015-09-08 Microsoft Technology Licensing, Llc Indicating out-of-view augmented reality images
US9563331B2 (en) * 2013-06-28 2017-02-07 Microsoft Technology Licensing, Llc Web-like hierarchical menu display configuration for a near-eye display
KR20150084200A (en) * 2014-01-13 2015-07-22 엘지전자 주식회사 A head mounted display and the method of controlling thereof
US9588343B2 (en) 2014-01-25 2017-03-07 Sony Interactive Entertainment America Llc Menu navigation in a head-mounted display
FR3020688B1 (en) * 2014-04-30 2016-05-06 Thales Sa HEAD VISUALIZATION SYSTEM COMPRISING CAP SELECTING MEANS AND ASSOCIATED SELECTION METHOD
US10684476B2 (en) 2014-10-17 2020-06-16 Lockheed Martin Corporation Head-wearable ultra-wide field of view display device
WO2016141054A1 (en) 2015-03-02 2016-09-09 Lockheed Martin Corporation Wearable display system
US10114127B2 (en) * 2015-05-11 2018-10-30 The United States Of America, As Represented By The Secretary Of The Navy Augmented reality visualization system
US10754156B2 (en) 2015-10-20 2020-08-25 Lockheed Martin Corporation Multiple-eye, single-display, ultrawide-field-of-view optical see-through augmented reality system
IL243422B (en) * 2015-12-30 2018-04-30 Elbit Systems Ltd Managing displayed information according to user gaze directions
US9995936B1 (en) 2016-04-29 2018-06-12 Lockheed Martin Corporation Augmented reality systems having a virtual image overlaying an infrared portion of a live scene
US20170336631A1 (en) * 2016-05-18 2017-11-23 Rockwell Collins, Inc. Dynamic Vergence for Binocular Display Device
US11500143B2 (en) 2017-01-28 2022-11-15 Lumus Ltd. Augmented reality imaging system
US10712159B2 (en) * 2017-04-10 2020-07-14 Martha Grabowski Critical system operations and simulations using wearable immersive augmented reality technology
FR3068481B1 (en) 2017-06-29 2019-07-26 Airbus Operations (S.A.S.) Display system and method for an aircraft
US10267630B2 (en) 2017-08-28 2019-04-23 Freefall Data Systems Llc Visual altimeter for skydiving
KR102561362B1 (en) 2017-09-29 2023-07-28 Lumus Ltd. Augmented reality display
KR20200077511A (en) 2017-10-22 2020-06-30 Lumus Ltd. Head mounted augmented reality device using optical bench
CN111373296B (en) 2017-11-21 2023-02-28 Lumus Ltd. Optical aperture expansion arrangement for near-eye displays
WO2019106636A1 (en) 2017-12-03 2019-06-06 Lumus Ltd. Optical device testing method and apparatus
CN111417883B (en) 2017-12-03 2022-06-17 Lumus Ltd. Optical equipment alignment method
TWI791728B (en) 2018-01-02 2023-02-11 Lumus Ltd. Augmented reality display with active alignment
CN112005091B (en) 2018-04-08 2023-08-11 Lumus Ltd. Apparatus and method for optically testing a sample of optical material, and controller operatively connected to the apparatus
EP3625617B1 (en) 2018-05-14 2023-09-06 Lumus Ltd. Projector configuration with subdivided optical aperture for near-eye displays, and corresponding optical systems
US11442273B2 (en) 2018-05-17 2022-09-13 Lumus Ltd. Near-eye display having overlapping projector assemblies
WO2019224764A1 (en) 2018-05-23 2019-11-28 Lumus Ltd. Optical system including light-guide optical element with partially-reflective internal surfaces
EP3807620B1 (en) 2018-06-21 2023-08-09 Lumus Ltd. Measurement technique for refractive index inhomogeneity between plates of a lightguide optical element
TWI830753B (en) 2018-07-16 2024-02-01 Lumus Ltd. Light-guide optical element and display for providing image to eye of observer
US10446041B1 (en) * 2018-08-23 2019-10-15 Kitty Hawk Corporation User interfaces for mutually exclusive three dimensional flying spaces
US10438495B1 (en) 2018-08-23 2019-10-08 Kitty Hawk Corporation Mutually exclusive three dimensional flying spaces
CN116184667A (en) 2018-09-09 2023-05-30 Lumus Ltd. Optical system comprising a light-guiding optical element with two-dimensional expansion
TWM642752U (en) 2018-11-08 2023-06-21 Lumus Ltd. Light-guide display with reflector
KR20210090622A (en) 2018-11-08 2021-07-20 Lumus Ltd. Optical devices and systems having a dichroic beamsplitter color combiner
JP3226277U (en) 2018-11-11 2020-05-14 Lumus Ltd. Near eye display with intermediate window
FR3089672B1 (en) * 2018-12-05 2021-12-03 Thales Sa Method and system for display and interaction on board a cockpit
TWI800657B (en) 2019-03-12 2023-05-01 Lumus Ltd. Image projector
WO2021105982A1 (en) 2019-11-25 2021-06-03 Lumus Ltd. Method of polishing a surface of a waveguide
IL270991B (en) 2019-11-27 2020-07-30 Lumus Ltd Lightguide optical element for polarization scrambling
BR112022009872A2 (en) 2019-12-05 2022-08-09 Lumus Ltd OPTICAL DEVICE AND METHOD TO MANUFACTURE AN OPTICAL DEVICE
IL290719B2 (en) 2019-12-08 2023-09-01 Lumus Ltd Optical systems with compact image projector
CA3163674A1 (en) 2019-12-25 2021-07-01 Lumus Ltd. Optical systems and methods for eye tracking based on redirecting light from eye using an optical arrangement associated with a light-guide optical element
DE202021104723U1 (en) 2020-09-11 2021-10-18 Lumus Ltd. Image projector coupled to an optical light guide element
TW202244552A (en) 2021-03-01 2022-11-16 Lumus Ltd. Optical system with compact coupling from a projector into a waveguide
US11892624B2 (en) 2021-04-27 2024-02-06 Microsoft Technology Licensing, Llc Indicating an off-screen target
US11789441B2 (en) 2021-09-15 2023-10-17 Beta Air, Llc System and method for defining boundaries of a simulation of an electric aircraft

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6064335A (en) * 1997-07-21 2000-05-16 Trimble Navigation Limited GPS based augmented reality collision avoidance system
US6094625A (en) * 1997-07-03 2000-07-25 Trimble Navigation Limited Augmented vision for survey work and machine control
US6297767B1 (en) * 1999-06-24 2001-10-02 Shimadzu Corporation Rescue target position indicating apparatus
EP1398601A2 (en) * 2002-09-13 2004-03-17 Canon Kabushiki Kaisha Head up display for navigation purposes in a vehicle
US20040119986A1 (en) * 2002-12-23 2004-06-24 International Business Machines Corporation Method and apparatus for retrieving information about an object of interest to an observer

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5838262A (en) * 1996-12-19 1998-11-17 Sikorsky Aircraft Corporation Aircraft virtual image display system and method for providing a real-time perspective threat coverage display
JP3052286B2 (en) * 1997-08-28 2000-06-12 Director General, Technical Research and Development Institute, Japan Defense Agency Flight system and pseudo visual field forming device for aircraft
US6798392B2 (en) * 2001-10-16 2004-09-28 Hewlett-Packard Development Company, L.P. Smart helmet
US6867753B2 (en) * 2002-10-28 2005-03-15 University Of Washington Virtual image registration in augmented display field
WO2005015333A2 (en) * 2003-03-31 2005-02-17 Sikorsky Aircraft Corporation Technical design concepts to improve helicopter obstacle avoidance and operations in 'brownout' conditions
US20050140573A1 (en) * 2003-12-01 2005-06-30 Andrew Riser Image display system and method for head-supported viewing system
US6934633B1 (en) * 2004-10-15 2005-08-23 The United States Of America As Represented By The Secretary Of The Navy Helmet-mounted parachutist navigation system

Cited By (62)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10649214B2 (en) 2005-02-10 2020-05-12 Lumus Ltd. Substrate-guide optical device
US10598937B2 (en) 2005-11-08 2020-03-24 Lumus Ltd. Polarizing optical system
US10261321B2 (en) 2005-11-08 2019-04-16 Lumus Ltd. Polarizing optical system
EP3514497A1 (en) * 2007-12-21 2019-07-24 BAE SYSTEMS plc Device for landing a rotary wing aircraft
EP2227676B1 (en) * 2007-12-21 2017-02-01 BAE Systems PLC Apparatus and method for landing a rotary wing aircraft
EP3217148A1 (en) 2007-12-21 2017-09-13 BAE SYSTEMS plc Apparatus and method for landing a rotary wing aircraft
EP2227676A2 (en) 2007-12-21 2010-09-15 BAE Systems PLC Apparatus and method for landing a rotary wing aircraft
GB2477787B (en) * 2010-02-15 2014-09-24 Marcus Alexander Mawson Cavalier Use of portable electronic devices with head-mounted display devices
GB2477787A (en) * 2010-02-15 2011-08-17 Marcus Alexander Mawson Cavalier Data Overlay Generation Using Portable Electronic Device With Head-Mounted Display
US9292973B2 (en) 2010-11-08 2016-03-22 Microsoft Technology Licensing, Llc Automatic variable virtual focus for augmented reality displays
US9588341B2 (en) 2010-11-08 2017-03-07 Microsoft Technology Licensing, Llc Automatic variable virtual focus for augmented reality displays
US9304319B2 (en) 2010-11-18 2016-04-05 Microsoft Technology Licensing, Llc Automatic focus improvement for augmented reality displays
US10055889B2 (en) 2010-11-18 2018-08-21 Microsoft Technology Licensing, Llc Automatic focus improvement for augmented reality displays
JP2014505897A (en) * 2010-11-18 2014-03-06 Microsoft Corporation Improved autofocus for augmented reality display
US9323325B2 (en) 2011-08-30 2016-04-26 Microsoft Technology Licensing, Llc Enhancing an object of interest in a see-through, mixed reality display device
EP3796069A1 (en) 2012-05-21 2021-03-24 Lumus Ltd Head-mounted display with an eyeball-tracker integrated system
FR2994736A1 (en) * 2012-08-24 2014-02-28 Thales Sa Display system, in particular for aircraft, with transparent screen and peripheral symbology
WO2015162611A1 (en) 2014-04-23 2015-10-29 Lumus Ltd. Compact head-mounted display system
EP3495870A1 (en) 2014-04-23 2019-06-12 Lumus Ltd Compact head-mounted display system
EP4242515A2 (en) 2014-04-23 2023-09-13 Lumus Ltd. Compact head-mounted display system
US10809528B2 (en) 2014-04-23 2020-10-20 Lumus Ltd. Compact head-mounted display system
CN107111472A (en) * 2014-10-31 2017-08-29 Microsoft Technology Licensing, LLC Facilitating interaction between users and their environments using a headset having input mechanisms
US10048835B2 (en) 2014-10-31 2018-08-14 Microsoft Technology Licensing, Llc User interface functionality for facilitating interaction between users and their environments
US9977573B2 (en) 2014-10-31 2018-05-22 Microsoft Technology Licensing, Llc Facilitating interaction between users and their environments using a headset having input mechanisms
US9652124B2 (en) 2014-10-31 2017-05-16 Microsoft Technology Licensing, Llc Use of beacons for assistance to users in interacting with their environments
US9612722B2 (en) 2014-10-31 2017-04-04 Microsoft Technology Licensing, Llc Facilitating interaction between users and their environments using sounds
WO2016069672A1 (en) * 2014-10-31 2016-05-06 Microsoft Technology Licensing, Llc Facilitating interaction between users and their environments using a headset having input mechanisms
CN107111472B (en) * 2014-10-31 2020-04-28 微软技术许可有限责任公司 Facilitating interaction between a user and an environment using headphones with input mechanisms
US10782532B2 (en) 2014-11-11 2020-09-22 Lumus Ltd. Compact head-mounted display system protected by a hyperfine structure
EP3654085A1 (en) 2014-11-11 2020-05-20 Lumus Ltd. Compact head-mounted display system protected by a hyperfine structure
US10520731B2 (en) 2014-11-11 2019-12-31 Lumus Ltd. Compact head-mounted display system protected by a hyperfine structure
WO2016075689A1 (en) 2014-11-11 2016-05-19 Lumus Ltd. Compact head-mounted display system protected by a hyperfine structure
EP3715935A1 (en) 2014-12-25 2020-09-30 Lumus Ltd. Substrate-guided optical device
EP3587916A1 (en) 2015-02-19 2020-01-01 Lumus Ltd. Compact head-mounted display system having uniform image
WO2016132347A1 (en) 2015-02-19 2016-08-25 Lumus Ltd. Compact head-mounted display system having uniform image
EP4235238A2 (en) 2015-02-19 2023-08-30 Lumus Ltd. Compact head-mounted display system having uniform image
EP3936762A1 (en) 2015-02-19 2022-01-12 Lumus Ltd. Compact head-mounted display system having uniform image
WO2017199232A1 (en) 2016-05-18 2017-11-23 Lumus Ltd. Head-mounted imaging device
US10739598B2 (en) 2016-05-18 2020-08-11 Lumus Ltd. Head-mounted imaging device
US10564417B2 (en) 2016-10-09 2020-02-18 Lumus Ltd. Aperture multiplier using a rectangular waveguide
US11567316B2 (en) 2016-10-09 2023-01-31 Lumus Ltd. Aperture multiplier with depolarizer
US10437031B2 (en) 2016-11-08 2019-10-08 Lumus Ltd. Light-guide device with optical cutoff edge and corresponding production methods
US11378791B2 (en) 2016-11-08 2022-07-05 Lumus Ltd. Light-guide device with optical cutoff edge and corresponding production methods
WO2018104732A1 (en) * 2016-12-09 2018-06-14 Sony Interactive Entertainment Inc. Head mounted display with user head rotation guidance
JP7177054B2 (en) 2016-12-09 2022-11-22 Sony Interactive Entertainment Inc. Head-mounted display with user head rotation guide
JP2020501263A (en) * 2016-12-09 2020-01-16 Sony Interactive Entertainment Inc. Head mounted display with user head rotation guide
US11507201B2 (en) 2016-12-09 2022-11-22 Sony Interactive Entertainment Inc. Virtual reality
US10302835B2 (en) 2017-02-22 2019-05-28 Lumus Ltd. Light guide optical assembly
US11194084B2 (en) 2017-02-22 2021-12-07 Lumus Ltd. Light guide optical assembly
US10684403B2 (en) 2017-02-22 2020-06-16 Lumus Ltd. Light guide optical assembly
US10481319B2 (en) 2017-03-22 2019-11-19 Lumus Ltd. Overlapping facets
US11125927B2 (en) 2017-03-22 2021-09-21 Lumus Ltd. Overlapping facets
US10895679B2 (en) 2017-04-06 2021-01-19 Lumus Ltd. Light-guide optical element and method of its manufacture
US11243434B2 (en) 2017-07-19 2022-02-08 Lumus Ltd. LCOS illumination via LOE
US11385393B2 (en) 2018-01-21 2022-07-12 Lumus Ltd. Light-guide optical element with multiple-axis internal aperture expansion
US10551544B2 (en) 2018-01-21 2020-02-04 Lumus Ltd. Light-guide optical element with multiple-axis internal aperture expansion
US11262587B2 (en) 2018-05-22 2022-03-01 Lumus Ltd. Optical system and method for improvement of light field uniformity
US11415812B2 (en) 2018-06-26 2022-08-16 Lumus Ltd. Compact collimating optical device and system
US11448816B2 (en) 2019-01-24 2022-09-20 Lumus Ltd. Optical systems including light-guide optical elements with two-dimensional expansion
US11914161B2 (en) 2019-06-27 2024-02-27 Lumus Ltd. Apparatus and methods for eye tracking based on eye imaging via light-guide optical element
US11796729B2 (en) 2021-02-25 2023-10-24 Lumus Ltd. Optical aperture multipliers having a rectangular waveguide
US11789264B2 (en) 2021-07-04 2023-10-17 Lumus Ltd. Display with stacked light-guide elements providing different parts of field of view

Also Published As

Publication number Publication date
EP1848966A1 (en) 2007-10-31
US8140197B2 (en) 2012-03-20
US20120179369A1 (en) 2012-07-12
US20090112469A1 (en) 2009-04-30
US8301319B2 (en) 2012-10-30

Similar Documents

Publication Title
US8140197B2 (en) Personal navigation system
US6064335A (en) GPS based augmented reality collision avoidance system
US8995678B2 (en) Tactile-based guidance system
US20170098333A1 (en) Computer-aided system for 360° heads up display of safety / mission critical data
CN105547285B (en) Indoor navigation system based on virtual reality technology
EP3125213B1 (en) Onboard aircraft systems and methods to identify moving landing platforms
US7268702B2 (en) Apparatus and methods for providing a flight display in an aircraft
US7295901B1 (en) System and method for indicating a position of an aircraft to a user
US10204453B2 (en) Aviation mask
US6714141B2 (en) Electronic cockpit vision system
US8010245B2 (en) Aircraft systems and methods for displaying a touchdown point
US7218245B2 (en) Head-down aircraft attitude display and method for displaying schematic and terrain data symbology
US20100240988A1 (en) Computer-aided system for 360 degree heads up display of safety/mission critical data
US20060253254A1 (en) Ground-based Sense-and-Avoid Display System (SAVDS) for unmanned aerial vehicles
US8700317B1 (en) Aeronautical holding pattern calculation for solving high wind and protected airspace issues
US20080208468A1 (en) Forward looking virtual imaging
US20170053453A1 (en) Avionic system comprising means for designating and marking land
EP3491341A1 (en) System and method for 3d flight path display
US10382746B1 (en) Stereoscopic augmented reality head-worn display with indicator conforming to a real-world object
US20080147320A1 (en) Aircraft airspace display
US20230145665A1 (en) Multi-platform integrated display
KR101994898B1 (en) Flight path guiding method based on augmented reality using mobile terminal
JPH0340200A (en) Planar format of three-dimensional perspective drawing for status recognition display
EP3112813B1 (en) Systems and methods for location aware augmented vision aircraft monitoring and inspection
Yeh et al. Human factors considerations in the design and evaluation of flight deck displays and controls: version 2.0

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase (Ref country code: DE)
WWE Wipo information: entry into national phase (Ref document number: 2006711177; Country of ref document: EP)
WWP Wipo information: published in national office (Ref document number: 2006711177; Country of ref document: EP)
WWE Wipo information: entry into national phase (Ref document number: 11816520; Country of ref document: US)