GB2613678A - A system and method for coordinated symbology in an augmented reality system - Google Patents


Info

Publication number
GB2613678A
GB2613678A
Authority
GB
United Kingdom
Prior art keywords
display
symbology
correct location
displayed
positionally correct
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2214694.8A
Other versions
GB202214694D0 (en)
Inventor
Peter Frederick Baker Lee
Rodney Garnham Jason
James Rendell Matthew
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BAE Systems PLC
Original Assignee
BAE Systems PLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BAE Systems PLC filed Critical BAE Systems PLC
Publication of GB202214694D0 publication Critical patent/GB202214694D0/en
Publication of GB2613678A publication Critical patent/GB2613678A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0183Adaptation to parameters characterising the motion of the vehicle


Abstract

A display system displays a view of an environment in a real world situation, on at least two different types of display. The system has a first display 200 of a first type and a second display 202 of a second type different from the first type. A controller controls how information is to be displayed between the first display and the second display. The controller determines whether there is available symbology relating to the real world situation 500 and determines a positionally correct location the symbology would occupy if visible 502. The controller determines whether at least one of the first and second displays are displaying a view that corresponds with the positionally correct location the symbology would occupy if visible 506. If so, the controller causes the symbology to be displayed in the positionally correct location on the at least one of the first and second displays 512. The first and second display may be juxtaposed and in close proximity of one another. The symbology may be displayed in respective portions between the first and second displays. The respective portions may change as the situation changes, maintaining the positionally correct location of the symbology.

Description

A SYSTEM AND METHOD FOR COORDINATED SYMBOLOGY IN AN AUGMENTED REALITY SYSTEM
FIELD
The present invention relates to a system and method for coordinating symbology in different display types.
BACKGROUND
There are many circumstances in which symbology can be displayed to assist a user to be more aware of the situation in which the user is located and viewing. Typically, the user may have different types of displays that are used to view different views. This can be particularly relevant to an aircraft pilot. The pilot is required to consider many aspects relating to the situation or environment to ensure accurate and safe manoeuvring on the ground or in the air. The pilot is thus often presented with multiple displays to view both traditional aspects of a view from the cockpit and additional and augmented data and information. The different displays may be located at different locations and provide different types of information. The pilot is often moving their view from one display to another in order to accurately determine what is occurring in the situation. This can be tiring and visually difficult after many hours of use. In addition, there is also a possibility that the pilot may miss some vital information, which may result in risk and damage.
As a result, there is a need to provide a system which enables the pilot to have an enhanced awareness of the situation and environment which is easier to use and less likely to result in accidents, risks or damage.
SUMMARY
According to an aspect of the present invention, there is provided a display system (100) for displaying on at least two different types of display a view of an environment in a real world situation, the system comprising: a first display (200) of a first type; a second display (202) of a second type which is different from the first type; a control system (100, 108) for controlling how information is to be displayed between the first display (200) and the second display (202) and configured to: determine (500) there is available symbology relating to the real world situation; determine (502) a positionally correct location the symbology would occupy if visible; determine (504) that at least one of the first display (200) and the second display (202) are displaying a view that corresponds with the positionally correct location of where the symbology would occupy if visible; cause (512) the symbology to be displayed, in the positionally correct location in the at least one of the first display (200) and the second display (202).
In an aspect, the first display (200) and the second display (202) are juxtaposed and in close proximity of one another.
In an aspect, the symbology is displayed in respective portions between the first display (200) and the second display (202).
In an aspect, the respective portions change as the situation changes maintaining the positionally correct location of the symbology.
In an aspect, the symbology is displayed using respective formatting for the first and second types.
In an aspect, the first display (200) and the second display (202) have different or separate Fields Of View "FOV".
In an aspect, the information for display is received from sensors in the environment associated with the situation.
In an aspect, the information comprises real world data and augmented reality data.
In an aspect, the second display (202) is a Head Down Display "HDD".
In an aspect, the HDD is a Large Area Display.
According to an aspect of the present invention, there is provided a method for displaying on at least two different types of display a view of an environment in a real world situation, the method comprising: determine (500) there is available symbology relating to the real world situation; determine (502) a positionally correct location the symbology would occupy if visible; determine (504) that at least one of a first display (200) and a second display (202) are displaying a view that corresponds with the positionally correct location of where the symbology would occupy if visible; cause (512) the symbology to be displayed in the positionally correct location in the at least one of the first display (200) and the second display (202).
According to an aspect of the present invention, there is provided a computer system configured to cause a processor to perform the method of a previous aspect.
BRIEF DESCRIPTION OF THE FIGURES
Aspects of the invention will now be described by way of example only with reference to the figures, in which:
Figure 1 is a simplified view of a display system usable in a cockpit of an aircraft showing the position of features in multiple displays, according to the present invention.
Figure 2 is a simplified view generated by the display system of figure 1.
Figure 3 is a simplified drawing showing symbology for display in multiple displays.
Figure 4 is a simplified diagram showing how the symbology in figure 3 would be viewed by a user.
Figure 5 is a flow chart of how the symbology is generated and displayed between the multiple displays.
DETAILED DESCRIPTION
The present invention relates to an improved display system for a vehicle which includes different types of display. Each display displays information including symbology which is often represented differently from one display to another. The present invention relates to a system and method for coordinating the presentation of information and symbology from one display to another without losing the context and position of the information within the field of view (FOV) of the images as viewed by the user of the vehicle. This will result in a greater situational awareness for the user and mitigates user fatigue.
In a typical case a pilot in a cockpit is presented with vast amounts of information to be able to fly an aircraft. The information includes real world information from sensors in the environment or surroundings and from objects and controls in the cockpit in the display FOV. In addition, the pilot is presented with virtual or augmented information associated with the real world, the cockpit, and from many other sources. Information associated with the controls in the cockpit, weather, position etc. is displayed over multiple displays in the cockpit. The multiple displays include a head up display (HUD), a head down display (HDD), a large area display (LAD) or any other display type. Possibilities include advanced see-through displays, a Helmet Mounted Display, other forms of see-through and potentially user-mounted displays, and a whole cockpit window display. A HUD is typically see-through and provides a view of the real world situation. This includes the view through the cockpit window and anything else in the FOV of the HUD. The HUD is augmented by additional information that is generated by the display system and presented in the display. The additional information includes symbology which is displayable on the HUD display. One example of a suitable HUD is the LiteHUD™ produced by BAE Systems.
An HDD is typically opaque and presents additional and/or virtual information to the pilot from the display system. The additional information is typically based on data collected from sensors that are associated with the situation or environment. The HDD can be in the form of large area display (LAD) which includes different regions that are configured to provide a specific type of information, such as weather, direction, position, movement, objects in-and-out of the field of view and anything that enables the pilot to safely manoeuvre the aircraft.
The HUD display is within a limited FOV of the pilot, but this may not be enough for the pilot to fly the aircraft. The HUD FOV covers only a small portion of the overall FOV of the pilot, particularly as the pilot is able to move their head. The present invention expands the limited FOV of the HUD by extending down into an extra area in the HDD or any other display or display type. The HDD is configured to provide additional information from the sensors to compensate for the limits of the FOV. In this way the pilot has access to a much extended FOV in which real and virtual data is presented.
In general, HUDs and HDDs are fixed relative to the aircraft, for example they could be mounted at a fixed position in the cockpit of the aircraft.
In general, HMDs are fixed to the user and so may move relative to the aircraft.
Both HUDs and HDDs are able to present symbology. However, the nature of the displays and the distance between the displays makes this far from simple, and is something the present invention is seeking to address. In addition, the invention further seeks to provide displays that are grouped together in a manner that enables presentation of information between the displays to be continuous across the different displays. This was not previously achievable due to the very large gaps between displays and, in particular, due to the different manners of operation of different displays. (In other words, the information is apportioned or split between the plurality of displays as if there were one continuous display. Information outside of the bounds of the displays may not be presented.)
Figure 1 shows a simplified overview of the display system 100 of the present invention. The system is a computer implemented system including processors, memory, sensors, controllers, peripherals, etc. The system also includes a data module 102, a first display driver 104, a second display driver 106 and a display coordination module 108.
The data module 102 receives data from one or more sensors associated with the situation of the aircraft. The sensors can be of any type, including but not limited to radiation sensors (optical and non-optical), sound sensors, climate sensors, positional sensors, velocity sensors, acceleration sensors, proximity sensors, etc. Data from the sensors is received at the data module where it is processed to produce signals to drive the first and second display drivers 104 and 106 and to control a control system 108 (also referred to as a display coordination module 108). For illustration purposes only, the first and second drivers produce an artificial horizon and a pitch ladder for display (not shown in figure 1 but shown in figure 2).
Figure 2 shows a combination of an HUD 200 and an HDD 202 as would be viewed by the pilot (not shown). The HUD 200 is positioned above the HDD 202 at a minimal separation 204. It should be noted that the minimal separation is considerably less than in prior art systems due to the nature of the displays.
The HUD 200 is driven by the first driver 104 to display content A. At the same time, the second driver 106 causes the HDD 202 to display content B. The display coordination module 108 controls the positioning of some or all of the elements displayed on either the HUD 200 or the HDD 202. The coordination will be described in greater detail below.
Referring to figure 3, the HUD 200 and the HDD 202 are shown with a pitch ladder 300 and an artificial horizon 302. The pitch ladder 300 and the horizon are shown above the displays to show all of their details. (That is to say: the ladder 300 and horizon 302 are shown as symbology continuing beyond the bounds of the displays, to illustrate the continuous symbology to be apportioned between the separate displays).
In accordance with the invention the pitch ladder 300 and artificial horizon 302 would actually be displayed as illustrated in figure 4. From figure 4, it can be seen that the system maintains the integrity of the pitch ladder and horizon even though they are viewed through different displays which do not necessarily support a common symbology (for example, they may not format symbology data in the same way).
Returning to figure 2 the details of the view presented to the pilot will now be explained. The HUD 200 is showing content A which includes a first portion 206 of the pitch ladder 300 which is displayed using the HUD symbology under the control of the display coordination module 108. The display coordination module 108 identifies that there is a portion of the pitch ladder that would be in the view of the HUD 200 based on the position thereof as determined by the display system. In response to this the display coordination module 108 causes the relevant portion of the pitch ladder to be displayed on the HUD 200 using the relevant symbology for the first display.
The HDD 202 is showing content B which includes traditional aircraft monitoring systems, such as maps 208, navigational aids 210, positional information 212, etc. In addition, there is a second portion of the pitch ladder 214 and a portion 216 of the artificial horizon 302 which is in a position which could be displayed in the HDD 202. The display coordination module 108 enables the portion of the pitch ladder 214 and the portion of the artificial horizon 216 to be displayed positionally correctly using the symbology of the HDD 202.
(Accordingly, and impliedly, there is also provided for a class of content, content C, which is the portion of the symbology (e.g. pitch ladder and artificial horizon) that is not displayed due to its falling outside of the boundaries of the displays. See the bottom, middle, and top right portions of the symbology 300 in Figure 3.) Due to the nature of the displays used by the system it is possible for the HUD 200 and HDD 202 to be juxtaposed and in close proximity, so that only a small part of the combined FOV of the two displays is not covered by one or the other of the displays. The separation between the displays could be any value and depends on the type of display. In an aspect of the invention, a separation of between about 35 mm and 100 mm exists between the two displays, ensuring that the missing image data is minimised. As there is only a short displacement between the two displays there is minimal loss of view of the pitch ladder and artificial horizon, as indicated in area 204. As a result, the pitch ladder and the artificial horizon are positionally correct and presented in a coordinated manner in the two displays at the same time.
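The apportioning of a continuous symbol between two juxtaposed displays described above can be sketched in code. The following Python fragment is illustrative only and is not taken from the patent: the angular extents, the size of the gap, and the function name are invented for the example.

```python
# Illustrative sketch (not from the patent): apportioning a continuous
# symbol, such as a pitch ladder, between two stacked displays in the
# spirit of Figures 2-4. The angular extents here are invented values.

def apportion(symbol_span, displays):
    """Clip a symbol's vertical angular span (lo, hi) against each
    display's span; return the visible portion per display."""
    lo, hi = symbol_span
    portions = {}
    for name, (d_lo, d_hi) in displays.items():
        seg_lo, seg_hi = max(lo, d_lo), min(hi, d_hi)
        if seg_lo < seg_hi:          # non-empty overlap with this display
            portions[name] = (seg_lo, seg_hi)
    return portions

# Example geometry: HUD covers +2..+22 degrees, HDD covers -30..0 degrees,
# leaving a small 2-degree gap (the separation 204 between the displays).
displays = {"HUD": (2.0, 22.0), "HDD": (-30.0, 0.0)}

# A pitch ladder spanning -10..+10 degrees is split across both displays;
# the slice falling in the 0..2 degree gap is simply not drawn ("content C").
print(apportion((-10.0, 10.0), displays))
```

As the aircraft pitches, the same clipping applied to the updated span yields the changing "respective portions" of claims 3 and 4.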
The pitch ladder 300 and horizon 302 have a "correct position" in space relative to the position and orientation of the vehicle in the real world. This correct position is referred to as positionally correct herein. The position of any symbology must be positionally correct to avoid providing misleading information to the pilot. The positioning, orientation and split of the symbology between the two displays is possible for many other types of symbology.
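As a hedged illustration of what "positionally correct" can mean in practice, the sketch below maps a world pitch angle to a pixel row on a single display, assuming a simple linear projection. The display geometry and the function name are assumptions for the example, not details from the patent.

```python
# Illustrative sketch (not from the patent): one way to place symbology
# positionally correctly, using a simple linear mapping from a world
# pitch angle to a pixel row. All geometry values are assumed.

def pitch_to_row(pitch_deg, boresight_deg, fov_deg, rows):
    """Return the pixel row where a given pitch angle appears on a display
    boresighted at boresight_deg with a vertical FOV of fov_deg, or None
    if the angle falls outside the display's FOV."""
    half = fov_deg / 2.0
    offset = pitch_deg - boresight_deg      # angle relative to display centre
    if abs(offset) > half:
        return None                         # not visible on this display
    # Row 0 is the top edge of the display; positive pitch is up.
    return round((half - offset) / fov_deg * (rows - 1))

# A HUD boresighted on the horizon with a 20-degree vertical FOV and
# 1000 pixel rows: the horizon sits at the centre row.
print(pitch_to_row(0.0, 0.0, 20.0, 1000))   # -> 500
print(pitch_to_row(15.0, 0.0, 20.0, 1000))  # -> None (outside the HUD FOV)
```

A real system would use the full vehicle attitude and display calibration rather than a single pitch axis, but the principle is the same: the symbol's screen position is derived from its correct position in space, not from the display it happens to land on.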
As the aircraft moves the system instantly updates the relative portions of the symbology to ensure integrity between the respective portions and the positionally correct location. The symbology flows from one type of display to the other in real time and represents a positionally correct image of the symbology into the FOV of the pilot. As the pilot manoeuvres the aircraft the pitch ladder and artificial horizon are always positionally correct and displayed in one or both of the first and second display. This ensures that the pilot has an enhanced view of the environment and is accordingly considerably more aware of the situation.
The display coordination module 108 operates as described with reference to figure 5. In a first step 500 the display coordination module 108 determines that there is available symbology which relates to data processed by the system. The display coordination module 108 determines the positionally correct location that the symbology would occupy if visible in step 502. The display coordination module 108 identifies the available displays in step 504. The display coordination module 108 determines, in step 506, whether any of the displays are displaying the positionally correct location of where the symbology should be (or, in other words, determines if there is a display corresponding with the positionally correct symbology). If not, the process returns to the start or stops 508. If so, the display coordination module 108 determines the type of display 510. In step 512, the display coordination module 108 causes the symbology to be displayed, in the format of the type of display and in the positionally correct location.
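The figure-5 sequence (steps 500 to 512) can be sketched as a plain control loop. The class and method names below are invented for illustration; the patent specifies only the sequence of determinations, not an API.

```python
# Illustrative sketch (not from the patent) of the figure-5 control flow.
# The Display class and its methods are assumptions; the patent defines
# only the sequence of determinations (steps 500-512).

class Display:
    def __init__(self, name, lo_deg, hi_deg, fmt):
        self.name, self.lo, self.hi, self.fmt = name, lo_deg, hi_deg, fmt
        self.drawn = []                       # symbology rendered this frame

    def view_contains(self, angle):           # step 506: view corresponds?
        return self.lo <= angle <= self.hi

    def draw(self, symbol, angle):            # step 512: native-format render
        self.drawn.append((symbol, angle, self.fmt))

def coordinate(symbols, displays):
    """symbols: (name, positionally_correct_angle) pairs (steps 500/502)."""
    for name, angle in symbols:
        for display in displays:              # step 504: identify displays
            if display.view_contains(angle):  # step 506
                display.draw(name, angle)     # steps 510/512
        # step 508: a symbol that no display covers is simply not drawn

hud = Display("HUD", 2.0, 22.0, fmt="hud-stroke")
hdd = Display("HDD", -30.0, 0.0, fmt="hdd-raster")
coordinate([("horizon", -3.0), ("waypoint", 40.0)], [hud, hdd])
print(hdd.drawn)   # the horizon is drawn on the HDD in its native format
```

Note that each display renders in its own format (step 510), which is the coordination the description attributes to module 108: the same world-referenced symbol, drawn differently per display type.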
The present invention is described with reference to a pilot operating an aircraft. It will be appreciated that the display system could be adapted to any combination of display types and for other applications than operating an aircraft. For example, the invention could be configured for other moveable platforms such as automobiles or watercraft.
The invention has been described using two types of display, it will be appreciated there is no reason for this to be a limit and further displays may be added as appropriate. It is envisaged that at least one of the displays could be in the form of a wearable display, such as a Head Worn Display (HWD).
The invention is implemented using computing systems such as a desktop, laptop or notebook computer, hand-held computing device (PDA, cell phone, palmtop, etc.), mainframe, server, client, or any other type of special or general purpose computing device as may be desirable or appropriate for a given application or environment. The computing system can include one or more processors which can be implemented using a general or special-purpose processing engine such as, for example, a microprocessor, microcontroller or other control module.
The computing system can also include a main memory, such as random access memory (RAM) or other dynamic memory, for storing information and instructions to be executed by a processor. Such a main memory also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by the processor. The computing system may likewise include a read only memory (ROM) or other static storage device for storing static information and instructions for a processor.
The computing system may also include an information storage system which may include, for example, a media drive and a removable storage interface.
The media drive may include a drive or other mechanism to support fixed or removable storage media, such as a hard disk drive, a floppy disk drive, a magnetic tape drive, an optical disk drive, a compact disc (CD) or digital video drive (DVD) read or write drive (R or RW), or other removable or fixed media drive. Storage media may include, for example, a hard disk, floppy disk, magnetic tape, optical disk, CD or DVD, or other fixed or removable medium that is read by and written to by media drive. The storage media may include a computer-readable storage medium having particular computer software or data stored therein.
In alternative aspects, an information storage system may include other similar components for allowing computer programs or other instructions or data to be loaded into the computing system. Such components may include, for example, a removable storage unit and an interface, such as a program cartridge and cartridge interface, a removable memory (for example, a flash memory or other removable memory module) and memory slot, and other removable storage units and interfaces that allow software and data to be transferred from the removable storage unit to computing system.
The computing system can also include a communications interface. Such a communications interface can be used to allow software and data to be transferred between a computing system and external devices. Examples of communications interfaces can include a modem, a network interface (such as an Ethernet or other NIC card), a communications port (such as for example, a universal serial bus (USB) port), a PCMCIA slot and card, etc. Software and data transferred via a communications interface are in the form of signals which can be electronic, electromagnetic, and optical or other signals capable of being received by a communications interface medium.
In this document, the terms 'computer program product', 'computer-readable medium' and the like may be used generally to refer to tangible media such as, for example, a memory, storage device, or storage unit. These and other forms of computer-readable media may store one or more instructions for use by the processor comprising the computer system to cause the processor to perform specified operations. Such instructions, generally referred to as computer program code' (which may be grouped in the form of computer programs or other groupings), when executed, enable the computing system to -10 -perform functions of aspects of the present invention. Note that the code may directly cause a processor to perform specified operations, be compiled to do so, and/or be combined with other software, hardware, and/or firmware elements (e.g., libraries for performing standard functions) to do so.
The non-transitory computer readable medium may comprise at least one from a group consisting of: a hard disk, a CD-ROM, an optical storage device, a magnetic storage device, a Read Only Memory, a Programmable Read Only Memory, an Erasable Programmable Read Only Memory (EPROM), an Electrically Erasable Programmable Read Only Memory and a Flash memory. In an aspect where the elements are implemented using software, the software may be stored in a computer-readable medium and loaded into the computing system using, for example, a removable storage drive. A control module (in this example, software instructions or executable computer program code), when executed by the processor in the computer system, causes the processor to perform the functions of the invention as described herein.
Furthermore, the inventive concept can be applied to any circuit for performing signal processing functionality within a network element. It is further envisaged that, for example, a semiconductor manufacturer may employ the inventive concept in a design of a stand-alone device, such as a microcontroller, a digital signal processor (DSP), or an application-specific integrated circuit (ASIC) and/or any other sub-system element.
It will be appreciated that, for clarity purposes, the above description has described aspects of the invention with reference to a single processing logic. However, the inventive concept may equally be implemented by way of a plurality of different functional units and processors to provide the signal processing functionality. Thus, references to specific functional units are only to be seen as references to suitable means for providing the described functionality, rather than indicative of a strict logical or physical structure or organisation.
Aspects of the invention may be implemented in any suitable form including hardware, software, firmware or any combination of these. The invention may optionally be implemented, at least partly, as computer software running on one or more data processors and/or digital signal processors or configurable module components such as FPGA devices. Thus, the elements and components of an aspect of the invention may be physically, functionally and logically implemented in any suitable way. Indeed, the functionality may be implemented in a single unit, in a plurality of units or as part of other functional units. Although the present invention has been described in connection with some aspects, it is not intended to be limited to the specific form set forth herein.
Rather, the scope of the present invention is limited only by the accompanying claims. Additionally, although a feature may appear to be described in connection with particular aspects, one skilled in the art would recognize that various features of the described aspects may be combined in accordance with the invention. In the claims, the term 'comprising' does not exclude the presence of other elements or steps.
Furthermore, although individually listed, a plurality of means, elements or method steps may be implemented by, for example, a single unit or processor. Additionally, although individual features may be included in different claims, these may possibly be advantageously combined, and the inclusion in different claims does not imply that a combination of features is not feasible and/or advantageous. Also, the inclusion of a feature in one category of claims does not imply a limitation to this category, but rather indicates that the feature is equally applicable to other claim categories, as appropriate.

Claims (25)

  1. A display system for displaying on at least two different types of display a view of an environment in a real world situation, the system comprising: a first display of a first type; a second display of a second type which is different from the first type; a control system for controlling how information is to be displayed between the first display and the second display and configured to: determine there is available symbology relating to the real world situation; determine a positionally correct location the symbology would occupy if visible; determine that at least one of the first display and the second display are displaying a view that corresponds with the positionally correct location of where the symbology would occupy if visible; cause the symbology to be displayed, in the positionally correct location in the at least one of the first display and the second display.
  2. The display system according to claim 1, wherein the first display and the second display are juxtaposed and in close proximity of one another.
  3. The display system according to claim 1 or claim 2, wherein the symbology is displayed in respective portions between the first display and the second display.
  4. The display system according to claim 3, wherein the respective portions change as the situation changes maintaining the positionally correct location of the symbology.
  5. The display system according to any one of the preceding claims, wherein the symbology is displayed using respective formatting for the first and second types.
  6. The display system according to any one of the preceding claims, wherein the first display and the second display have different Fields Of View "FOV".
  7. The display system according to any one of the preceding claims, wherein the first display and the second display have separate Fields Of View "FOV".
  8. The display system according to any one of the preceding claims, wherein the information for display is received from sensors in the environment associated with the situation.
  9. The display system according to any one of the preceding claims, wherein the information comprises real world data and augmented reality data.
  10. The display system according to any one of the preceding claims, wherein the first display is a Head Up Display "HUD".
  11. The display system according to any one of the preceding claims, wherein the second display is a Head Down Display "HDD".
  12. The display system according to claim 11, wherein the HDD is a Large Area Display.
  13. A method for displaying on at least two different types of display a view of an environment in a real world situation, the method comprising: determine there is available symbology relating to the real world situation; determine a positionally correct location the symbology would occupy if visible; determine that at least one of a first display and a second display are displaying a view that corresponds with the positionally correct location of where the symbology would occupy if visible; cause the symbology to be displayed in the positionally correct location in the at least one of the first display and the second display.
14. The method according to claim 13, further comprising locating the first display and the second display juxtaposed and in close proximity to one another.
15. The method according to claim 13 or claim 14, further comprising splitting the symbology to be displayed into respective portions of the first display and the second display.
16. The method according to claim 15, further comprising updating the respective portions as the situation changes to maintain the positionally correct location of the symbology.
17. The method according to any one of claims 13 to 16, further comprising displaying the symbology using respective formatting for the first and second types.
18. The method according to any one of claims 13 to 17, further comprising configuring the first display and the second display to have different Fields Of View "FOV".
19. The method according to any one of claims 13 to 18, further comprising receiving the information for display from sensors in the environment associated with the situation.
20. The method according to any one of claims 13 to 19, further comprising displaying the information as real world data and augmented reality data.
21. The method according to any one of claims 13 to 20, wherein the first display is a Head Up Display "HUD".
22. The method according to claim 21, wherein the HUD is a Head Worn Device "HWD".
23. The method according to any one of claims 13 to 22, wherein the second display is a Head Down Display "HDD".
24. The method according to claim 23, wherein the HDD is a Large Area Display.
25. A computer system configured to cause a processor to perform the method according to any one of claims 13 to 24.
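Purely as an illustrative sketch, and not part of the claimed subject matter, the splitting behaviour of claims 13 to 16 can be modelled in a few lines of Python. It assumes a shared angular coordinate frame (azimuth/elevation in degrees) covering both displays, and the names `Display` and `place_symbology` are hypothetical, introduced here only for the example:

```python
from dataclasses import dataclass


@dataclass
class Display:
    """A display surface covering a rectangular angular region of the
    combined view. Using a shared head/vehicle-referenced frame lets a
    HUD and an HDD be compared directly (claims 10, 11, 18)."""
    name: str
    az_min: float  # left edge of the field of view, degrees
    az_max: float  # right edge
    el_min: float  # bottom edge
    el_max: float  # top edge

    def overlap(self, az_lo, az_hi, el_lo, el_hi):
        """Return the portion of a symbol's bounding box that falls
        inside this display's current view, or None if none does."""
        a0, a1 = max(az_lo, self.az_min), min(az_hi, self.az_max)
        e0, e1 = max(el_lo, self.el_min), min(el_hi, self.el_max)
        if a0 >= a1 or e0 >= e1:
            return None
        return (a0, a1, e0, e1)


def place_symbology(symbol_box, displays):
    """Assign the portion of a symbol's positionally correct location
    visible on each display (claims 13 and 15): a symbol straddling the
    boundary between a juxtaposed HUD and HDD is split between them."""
    placements = {}
    for d in displays:
        portion = d.overlap(*symbol_box)
        if portion is not None:
            placements[d.name] = portion
    return placements
```

Re-running `place_symbology` each frame as the view changes would correspond to the updating step of claim 16: the respective portions shift automatically so the symbology stays in its positionally correct location.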
GB2214694.8A 2021-10-27 2022-10-06 A system and method for coordinated symbology in an augmented reality system Pending GB2613678A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GBGB2115440.6A GB202115440D0 (en) 2021-10-27 2021-10-27 A system and method for coordinated symbology in an augmented reality system

Publications (2)

Publication Number Publication Date
GB202214694D0 GB202214694D0 (en) 2022-11-23
GB2613678A true GB2613678A (en) 2023-06-14

Family

ID=78805933

Family Applications (2)

Application Number Title Priority Date Filing Date
GBGB2115440.6A Ceased GB202115440D0 (en) 2021-10-27 2021-10-27 A system and method for coordinated symbology in an augmented reality system
GB2214694.8A Pending GB2613678A (en) 2021-10-27 2022-10-06 A system and method for coordinated symbology in an augmented reality system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GBGB2115440.6A Ceased GB202115440D0 (en) 2021-10-27 2021-10-27 A system and method for coordinated symbology in an augmented reality system

Country Status (1)

Country Link
GB (2) GB202115440D0 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014002347A1 (en) * 2012-06-29 2014-01-03 Sony Computer Entertainment Inc. Video output device, 3d video observation device, video display device, and video output method
US20140313189A1 (en) * 2013-04-19 2014-10-23 Thales Hybrid display system displaying information by superimposition on the exterior
WO2016204916A1 (en) * 2015-06-17 2016-12-22 Microsoft Technology Licensing, Llc Hybrid display system
WO2018211494A1 (en) * 2017-05-15 2018-11-22 Real View Imaging Ltd. System with multiple displays and methods of use

Also Published As

Publication number Publication date
GB202214694D0 (en) 2022-11-23
GB202115440D0 (en) 2021-12-08

Similar Documents

Publication Publication Date Title
EP1775554B1 (en) Dynamic primary flight displays for unusual attitude conditions
EP2107340B1 (en) Waypoint display system
US9389097B2 (en) Aircraft display systems and methods for enhanced display of flight path information
EP2899509A1 (en) System and method for displaying flight path information in rotocraft
US20090309812A1 (en) Method and system for operating a near-to-eye display
EP1818650B1 (en) Dynamic lateral deviation display
EP3338135A1 (en) Holographic building information update
EP2154484B1 (en) Method and system for operating a display device on-board an aircraft
EP3444570B1 (en) Aircraft systems and methods for unusual attitude recovery
EP1959239A1 (en) Target zone display system and method
EP3410073B1 (en) System and method for adjusting the correlation between a visual display perspective and a flight path of an aircraft
CN109313042B (en) Displaying performance limits in aircraft displays
EP2555105A2 (en) Touch screen having adaptive input requirements
US20210239972A1 (en) Methods, systems, apparatuses, and devices for facilitating provisioning of a virtual experience
CN112109550A (en) AR-HUD-based display method, device and equipment for early warning information and vehicle
EP4174833A1 (en) A system and method for coordinated symbology in an augmented reality system
GB2613678A (en) A system and method for coordinated symbology in an augmented reality system
WO2023073341A1 (en) A system and method for coordinated symbology in an augmented reality system
US20220108502A1 (en) Method for generating an image data set for reproduction by means of an infotainment system of a motor vehicle
CN107933937B (en) System and method for generating and displaying aircraft orientation cues
EP4342802A1 (en) Pilot alerting of detected runway environment
US9341478B1 (en) Surface information display inhibiting system, device, and method
US10249097B2 (en) Method of graphical management of the symbology in a three-dimensional synthetic view of the exterior landscape in an on-board viewing system for an aircraft
CN108204823B (en) Method for graphically managing pitch scales in an on-board display system of an aircraft
EP4399587A1 (en) An image generator and method for an augmented reality system