CN110769237B - Projection system, apparatus and method for projection position adjustment using head position - Google Patents

Projection system, apparatus and method for projection position adjustment using head position

Info

Publication number
CN110769237B
Authority
CN
China
Prior art keywords
projection
head
operators
projection system
graphical image
Prior art date
Legal status
Active
Application number
CN201911058743.3A
Other languages
Chinese (zh)
Other versions
CN110769237A
Inventor
E. L. Leon
J. Zheng
X. Wang
Current Assignee
AVIC General Electric Civil Avionics System Co., Ltd.
Original Assignee
AVIC General Electric Civil Avionics System Co., Ltd.
Priority date
Filing date
Publication date
Application filed by AVIC General Electric Civil Avionics System Co., Ltd.
Priority to CN201911058743.3A
Publication of CN110769237A
Application granted
Publication of CN110769237B
Legal status: Active

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements

Abstract

A projection system, apparatus and method for projection position adjustment using head position are provided. In an exemplary embodiment, the system may include: a projection device configured to project a graphical image associated with an external target on a transparent body; a head tracking device configured to track and obtain head positions of one or more operators; and one or more processors configured to adjust a projection position of the graphical image on the transparent body based at least in part on the head position of the operator, such that the graphical image conforms to the external target with which it is associated, and to provide the adjusted image data to the projection device.

Description

Projection system, apparatus and method for projection position adjustment using head position
Technical Field
Embodiments of the present disclosure generally relate to the field of projection. More particularly, embodiments relate to systems, devices, and methods for projection position adjustment using head position in a vehicle.
Background
Reflective displays with images collimated at infinity, such as HUD (head-up display) systems, are a projection display technology currently used in aircraft. They use the principle of optical reflection to adjust the focal length of the projected image to infinity and project it onto a transparent display assembly level with the pilot's field of view. This technique has the unique advantage of ensuring conformity with the external view. However, it requires boresighting, and the eye movement range (eyebox) is limited. Here, the term "eye movement range" refers to the spatial region within which the image can be seen. The field of view is also limited to the projection surface; in general, the viewing angle is only a few tens of degrees.
Cost is also often a major obstacle, owing to the precision required of the optical equipment. The introduction of waveguide technology has recently alleviated this problem, but at the cost of a reduced field of view. The reduced field of view further limits the range of head movement available to the pilot in flight, thereby affecting flight operations and increasing the pilot's workload.
An alternative to the reflection technique is to use a transparent body of the cockpit itself as the light-emitting area. The entire surface can then be covered without limiting the visibility of the image. However, this technique loses the infinity focus and therefore introduces a slight refocusing lag when the eye moves from the symbol to the external view.
Another problem with this technique is that, without position adjustment, the image is fixed relative to the frame of the windshield. There is therefore a need for a way to ensure conformity with the external view. In some applications, the symbol also needs to remain conformal for both pilots (e.g., when indicating an intruding aircraft), which requires compensating for the parallax difference between the two pilots.
Wearable devices may be used to address the above needs. However, they bring several use limitations. First, compatibility with sunglasses or corrective eyewear is limited. Second, the field of view of such lightweight devices is only a few degrees, so the head, not just the gaze, must be moved frequently to cover a larger viewing angle. Third, active devices raise weight and balance issues; this can cause trouble in turbulence or under moderate g-loads (turns, evasive maneuvers), and fatigue and pain can occur if the device is worn for extended periods. In addition, wearable devices require a power source or power cord.
Both HUDs and wearable devices are based on projecting a reflected image onto a translucent glass. This requires a precise geometric arrangement between the projection glass, the projector, and the eyebox. Without a movable projector, however, a similar configuration would be difficult to achieve in a windshield projection system.
Disclosure of Invention
The present disclosure provides a projection system and method that utilize head position for adjustment. In an exemplary embodiment, the system may include a projection device, a head tracking device, and one or more processors. The projection device may be configured to project a graphical image associated with an external target on a transparent body. The head tracking device may be configured to track and obtain head positions of one or more operators. The one or more processors may be configured to adjust a projection position of the graphical image on the transparent body based at least in part on the head position of the operator, such that the graphical image conforms to the external target with which it is associated, and to provide the adjusted image data to the projection device.
In an exemplary embodiment, a method of the present disclosure includes: obtaining head positions of one or more operators; adjusting a projection position of a graphical image to be projected onto a transparent body based at least in part on the head position of the operator, such that the graphical image conforms to the external target with which it is associated; and outputting the adjusted graphical image to be projected on the transparent body.
Drawings
A better understanding of the present disclosure may be obtained from the following detailed description in conjunction with the following drawings, in which:
FIG. 1 shows an exemplary graphical image as viewed by a pilot in accordance with an exemplary embodiment;
FIG. 2A illustrates a position adjustment of a symbol for one pilot in accordance with an exemplary embodiment;
FIG. 2B illustrates position adjustment of symbols for two pilots in accordance with an exemplary embodiment;
FIG. 3 shows an exemplary setup of projection device(s) according to an exemplary embodiment;
FIG. 4 shows an exemplary setup of projection device(s) according to an exemplary embodiment;
FIG. 5 is a block diagram illustrating an example projection system, according to an example embodiment;
FIG. 6(A) is a block diagram illustrating an example projection system according to another example embodiment;
FIG. 6(B) shows a detailed example of a projection system according to an exemplary embodiment;
FIG. 7 shows an exemplary setup of a head tracking device according to an exemplary embodiment;
FIG. 8 illustrates an exemplary dual camera setting for one pilot in accordance with an exemplary embodiment;
FIG. 9 shows an exemplary side camera setup for one pilot in accordance with an exemplary embodiment;
FIG. 10 illustrates a pseudo line of sight for one pilot in accordance with an exemplary embodiment;
FIG. 11 depicts a projection of a symbol on the windshield for one pilot in accordance with an exemplary embodiment;
FIG. 12 depicts three-dimensional generation of symbols in accordance with an exemplary embodiment;
FIG. 13 is a schematic diagram showing the parallax difference between two pilots;
FIG. 14 shows an example of adjustment of a single symbol for two pilots in accordance with an exemplary embodiment;
FIG. 15 shows an example of adjustment of a single symbol for two pilots with different azimuth angles;
FIG. 16 depicts illuminating a transparent body with excitation light (e.g., ultraviolet light) from a light source (e.g., a projector), according to an example embodiment;
FIG. 17 depicts partitioned regions of uniform pixel density and the corresponding window images on a DLP in accordance with an exemplary embodiment;
FIG. 18 depicts a projection of a three-dimensional model of a symbol within divided pixel density regions in accordance with an exemplary embodiment; and
FIG. 19 is a flow chart of an example projection method for adjustment with head position.
Detailed Description
In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the following described embodiments of the present disclosure. It will be apparent, however, to one skilled in the art that embodiments of the present disclosure may be practiced without some of these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to avoid obscuring the underlying principles of the various embodiments of the disclosure.
It should be noted that the terms "comprises," "comprising," "has," "having," "includes," "including," "contains," "containing," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises, has, includes, or contains a list of elements is not limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by "comprises a," "has a," "includes a," or "contains a" does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises, has, includes, or contains that element. The terms "a" and "an" are defined as one or more unless explicitly stated otherwise herein.
In the present disclosure, a projection system is disclosed. The projection system may include: a projection device configured to project a graphical image (e.g., a symbol) associated with an external target; a head tracking device configured to track a head position of an operator; and one or more processors configured to adjust a projection position of the graphical image based on the head position, such that the graphical image conforms to the external target with which it is associated, and to provide the adjusted image data to the projection device. In an exemplary embodiment, the projection system may be applied to an aircraft, the operator may be a pilot, and the head tracking device is configured to track the head position of the pilot. In other embodiments, the system may be applied to another vehicle, and the head tracking device is configured to track the head position of the driver. In the present disclosure, embodiments are described with reference to aircraft applications, but those of ordinary skill in the art will appreciate that the projection system of the present disclosure may be applied to other types of vehicles. A "vehicle" as described herein refers to a machine controlled by the operator(s) therein, including, but not limited to, trucks, buses, rail vehicles (trains, trams), watercraft (ships, boats), amphibious vehicles (screw-propelled vehicles, hovercraft), aircraft (airplanes, helicopters, other vertical take-off and landing aircraft, etc.), and spacecraft.
In an exemplary embodiment, full windshield projection may be achieved. A fluorescent material can be laid on all panes of the cockpit transparency and then excited by the excitation light from the projection device. The transparent body described herein is, for example, a windshield of a cockpit, and does not require complete transparency, as long as the exterior can be at least partially observed through it. Further, by way of example and not limitation, the projection device may be based on digital light processing technology.
FIG. 1 shows an exemplary graphical image as viewed by a pilot according to an exemplary embodiment. The graphical image of the present disclosure may include a symbol (symbology). The symbol may indicate an external object or target. In an exemplary embodiment, the symbol may be displayed near or around an object or target (such as an airplane or a taxiway). In one embodiment, the symbol may be a virtual taxiway marker, as shown on the left side of FIG. 1. In another embodiment, the symbol may be a circle around an external aircraft, as shown on the right side of FIG. 1. It should be understood that the symbols may also include, but are not limited to, flight assistance information such as flight path vectors, synthetic runway or terrain, and intruder identification and recognition.
FIG. 2A illustrates position adjustment of a symbol for one pilot according to an exemplary embodiment. To ensure conformity with an external target, when the head position of the pilot changes from position A to position B, the symbol projected on the windshield needs to move from position A' to position B'. Thus, for the pilot, the symbol remains in a constant relationship to the target (near or around the target, etc.) regardless of his or her head position. In other words, the symbol stays substantially on the line between the pilot's eyes and the target, so that the symbol and its associated external target substantially coincide along the pilot's pseudo line of sight, as sketched below. Details of the pseudo line of sight are set forth further on.
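The geometry behind this adjustment can be sketched as a ray-plane intersection: the symbol is drawn where the line from the pilot's head to the external target crosses the windshield. Below is a minimal sketch in Python, assuming a flat windshield modeled as a single plane (a simplification of the real curved pane); the coordinate frame and all values are illustrative, not part of this disclosure:

```python
import numpy as np

def symbol_position(head, target, plane_point, plane_normal):
    """Intersect the head-to-target ray with the windshield plane.

    head, target: 3D points in an illustrative cockpit frame.
    plane_point, plane_normal: any point on the windshield plane and its normal.
    Returns the point on the plane where the symbol should be drawn,
    or None if the ray is parallel to the plane.
    """
    direction = target - head
    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-9:  # ray parallel to the windshield plane
        return None
    t = np.dot(plane_normal, plane_point - head) / denom
    return head + t * direction

# Head moves from A to B; the symbol moves from A' to B' (cf. FIG. 2A).
plane_p = np.array([1.0, 0.0, 0.0])   # a point on a flat "windshield" at x = 1
plane_n = np.array([1.0, 0.0, 0.0])   # plane normal
target = np.array([50.0, 5.0, 2.0])   # external target far ahead
head_a = np.array([0.0, 0.0, 0.0])
head_b = np.array([0.0, 0.1, 0.05])   # head shifted sideways and up
print(symbol_position(head_a, target, plane_p, plane_n))  # position A'
print(symbol_position(head_b, target, plane_p, plane_n))  # position B'
```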
FIG. 2B illustrates position adjustment of symbols for two pilots according to an exemplary embodiment. In an exemplary embodiment, the left pilot may be the captain and the right pilot may be the copilot. In this case, the symbol needs to remain conformal to the external target for both pilots, and the parallax difference between the two pilots must also be compensated.
Exemplary projection System
FIGS. 3 and 4 show exemplary arrangements of projection device(s) according to exemplary embodiments. In one embodiment, the projection device may include multiple projectors in the cockpit. In one embodiment, the projectors may be located in a projection device bay above each pilot, as shown in FIG. 3. In another embodiment, the projectors may be mounted under the glare shield, depending mainly on the space constraints of the cockpit.
Referring to FIG. 4, in an exemplary embodiment, six transparencies are provided in the cockpit, including left and right windshields, left and right front windows, and left and right rear windows. For example, six projectors may be arranged in the projection device bay above the pilots, each projector aimed at a respective transparent body. Each pod has at least one projector aimed at a respective independent transparent body around each pilot's side for full windshield projection. In an exemplary embodiment, the projectors have a short throw. Redundant pod designs are sometimes required to prevent obstruction by the pilot's body or the overhead panels.
FIG. 5 is a block diagram illustrating an example projection system 500 according to an example embodiment. As shown in FIG. 5, projection system 500 includes head-tracking device(s) 11, processor(s) 12, and projection device(s) 14.
The head tracking device 11 is configured to obtain the head position, e.g., spatial coordinates, of the pilot. In some embodiments, the head tracking device 11 also obtains the azimuth of the pilot's head. The processor(s) 12 are configured to generate an original image, adjust the projection position of a graphical image to be projected on the corresponding transparent body(s), or a partial surface thereof, based on the head position of the pilot obtained by the head tracking device 11, so that the graphical image conforms to the external target with which it is associated, and output the adjusted image to the projection device 14.
The head tracking device 11 may include one or more cameras, such as a front camera (e.g., in the glare shield) and/or a side camera, configured to perform tracking of the pilot's head, such as three-dimensional pose estimation. The head tracking device 11 may also perform eye tracking.
FIG. 6(A) is a block diagram illustrating an example projection system 600 according to another example embodiment. FIG. 6(B) shows a detailed example of the projection system 600. In this embodiment, projection system 600 includes head tracking device(s) 11, processor(s) 12, and projection device(s) 14. The processor 12 includes adjustment unit(s) 122, such as a CPU (central processing unit), and a graphics processing unit 124, such as a GPU. The adjustment unit 122 is configured to adjust the projection position of the graphical image to be projected on the respective transparent body(s), or a partial surface thereof, based on the head position of the pilot obtained by the head tracking device 11, such that the graphical image conforms to the external target with which it is associated. The graphics processing unit 124 is configured to generate and provide adjusted image data (e.g., graphics output) to the projection device 14 based on the output of the adjustment unit 122, for example through vertex lists and shader programs, for projecting symbols on the respective transparent body(s) or partial surfaces thereof. The number of GPUs may depend on the throughput of each individual GPU and its ability to generate multiple outputs. Alternatively, multiple CPUs may be coupled to multiple GPUs in a one-to-one correspondence. In some embodiments, the functionality of the GPU may be incorporated into the CPU or into the projection device. In FIG. 6(B), the projection device 14 is shown as projector #4 equipped with a UV light source. However, in some embodiments, the UV light source may be logically and/or physically separate from, and coupled to, one or more projectors. Further, projector #4 includes an optical system, as shown in FIG. 6(B). The optical system will be described in detail below.
FIG. 7 shows an exemplary setup of the head tracking device 11 according to an exemplary embodiment. In an embodiment, the head tracking device 11 may comprise a camera mounted in front of each pilot, such as in the glare shield of the cockpit. A single camera with a sufficient field of view in front of each pilot may be used to locate the y and z coordinates of the head. Here, the y-coordinate represents the left-right position of the pilot's head relative to the camera, and the z-coordinate represents the up-down position. A dual-camera setup with lateral separation can be used to locate the x-coordinate, which represents the near-far position of the pilot's head relative to the cameras, by parallax measurement.
FIG. 8 illustrates an exemplary dual-camera setup for one pilot according to an exemplary embodiment. FIG. 9 shows an exemplary side-camera setup for one pilot according to an exemplary embodiment. As shown in FIG. 8, the two cameras 11a, 11b may be mounted in the glare shield in front of one pilot with a lateral clearance from each other. The x-coordinate of the pilot's head may then be calculated from the lateral clearance and the parallax angle measured by the cameras, because the parallax angle changes as the pilot's head position changes in the x direction. Although FIG. 8 shows two cameras 11a, 11b for one pilot, two more cameras may be provided for the other pilot. Other numbers of cameras may be chosen depending on the accuracy and performance required. In another embodiment, the cameras may be located on the side posts, as shown in FIG. 9, for example one camera for each pilot, to determine the x-coordinate of the pilot's head. In other embodiments, the cameras may be mounted at other locations around the pilot.
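The x-coordinate recovery from the two laterally separated cameras follows standard stereo triangulation. A minimal sketch, assuming an idealized, rectified camera pair with a known baseline and focal length (all numbers illustrative):

```python
def head_depth(baseline_m, focal_px, disparity_px):
    """Standard stereo triangulation: depth = baseline * focal / disparity.

    baseline_m: lateral clearance between the two cameras (meters).
    focal_px: camera focal length expressed in pixels.
    disparity_px: horizontal pixel offset of the head between the two images.
    Returns the x-coordinate (camera-to-head distance) in meters.
    """
    if disparity_px <= 0:
        raise ValueError("head must be visible in both cameras")
    return baseline_m * focal_px / disparity_px

# As the head moves in the x direction, the disparity changes (cf. FIG. 8):
print(head_depth(0.30, 800.0, 400.0))  # 0.6 m from the cameras
print(head_depth(0.30, 800.0, 300.0))  # 0.8 m -- head has moved farther away
```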
For each pilot, the obtained anthropometric data can be used to improve the performance of the device(s) in the cockpit. This may be accomplished during or prior to flight by requiring each pilot to be temporarily positioned at a preset eye position, or the acquired anthropometric data may be stored on a digital medium and uploaded to a vision computer. In addition, the projection system may also be designed to actively avoid recording video beyond the algorithm's needs.
As described above, the head tracking device (e.g., the dual-camera setup) may also be used for eye tracking in order to perform drowsiness monitoring. This function may also be disabled to allow a crew member to rest briefly in the seat. The head tracking device may also be configured to remove the windshield symbol when it is not used or when tunnel vision is detected. Tunnel vision, as described herein, refers to a dangerous state in which the pilot concentrates on certain data (tasks) and temporarily ignores other data (tasks). As an example, if an aircraft enters a high-altitude overspeed condition and the pilot takes over manual control, there is a risk of collision with other flying objects if the pilot, focusing too intently on the recovery procedure, overlooks even a loud collision warning. Tunnel vision as described herein may also occur when a pilot attends to non-time-critical data in a critical situation. The camera may also be configured to determine whether one pilot or two or more pilots are present in the cockpit, or to assist in detecting a low-alertness state. Here, a low-alertness state refers to any state of consciousness in which a human is awake by physiological indices but has a severely reduced ability to handle an emergency or a task requiring concentration. This may be related to extreme fatigue, an extremely routine environment, lack of regular sensory interaction, and the like. The worst case is a micro-sleep episode, in which the brain can exhibit sleep characteristics within a few seconds. Overcoming this phenomenon is one of the major challenges for long-range flight as well as single-pilot operation. The low-alertness state may be measured, for example, by the percentage of eyelid closure over time (PERCLOS), pulse, and other biological measurements. For this purpose, the camera may be used to obtain the pilot's eye movement characteristics, head movement characteristics, and so on. Eye movement characteristics include, for example, blink amplitude, blink frequency, and PERCLOS. Head movement characteristics include, for example, the positions of the eyes, nose tip, and mouth corners. By combining the positions of the eyes, nose tip, and mouth corners with eyeball tracking, the pilot's direction of attention can be obtained, and it can be judged whether the pilot's attention is distracted.
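As one concrete form of such a measurement, PERCLOS is commonly computed as the fraction of time within a sliding window during which the eyelids are at least about 80% closed. The sketch below assumes a per-frame eyelid-closure ratio is already available from the camera pipeline; the threshold and window length are illustrative choices, not values specified by this disclosure:

```python
from collections import deque

class PerclosMonitor:
    """Rolling PERCLOS estimate over the most recent camera frames."""

    def __init__(self, window_frames=1800, closed_threshold=0.8):
        self.frames = deque(maxlen=window_frames)  # e.g. 60 s at 30 fps
        self.closed_threshold = closed_threshold

    def update(self, eyelid_closure):
        """eyelid_closure: per-frame closure ratio in [0, 1] from eye tracking."""
        self.frames.append(eyelid_closure >= self.closed_threshold)
        return self.perclos()

    def perclos(self):
        return sum(self.frames) / len(self.frames) if self.frames else 0.0

monitor = PerclosMonitor()
for closure in (0.1, 0.2, 0.9, 0.95, 0.3):  # simulated frames
    level = monitor.update(closure)
print(f"PERCLOS over window: {level:.2f}")  # 0.40 -- 2 of 5 frames closed
```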
Once the position (e.g., coordinates) of the head is obtained, a pseudo line of sight may be calculated based on the head position. The term "pseudo" is used herein because the line of sight exists whether or not the pilot is actually looking along it; it is the orientation in which the symbol should be displayed. In an exemplary embodiment, the pseudo line of sight may be determined based on the head position and either the current position of the target (if any) or the head azimuth. The current position of the target may be obtained by any known method. In an exemplary embodiment, a pseudo line of sight may be calculated from the direction (e.g., azimuth angle) of the head detected and/or calculated by the camera, even if no target is present.
FIG. 10 illustrates a pseudo line of sight for one pilot in accordance with an exemplary embodiment. In fig. 10, the pseudo line of sight extends from the head of the pilot towards an external target. If no target is present, a pseudo line of sight may be determined based on the position and azimuth of the head, as shown.
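Computationally, a pseudo line of sight reduces to a unit direction vector anchored at the head: toward the target when one is known, otherwise built from the head azimuth (and, optionally, elevation) reported by the head tracking device. A minimal sketch under those assumptions; the angle convention is illustrative:

```python
import numpy as np

def pseudo_line_of_sight(head, target=None, azimuth_rad=None, elevation_rad=0.0):
    """Return (origin, unit direction) of the pseudo line of sight.

    With a known target, the line runs from the head toward the target.
    Without one, it is built from the measured head azimuth/elevation
    (radians; azimuth 0 = straight ahead along +x, positive toward +y).
    """
    head = np.asarray(head, float)
    if target is not None:
        direction = np.asarray(target, float) - head
    elif azimuth_rad is not None:
        direction = np.array([
            np.cos(elevation_rad) * np.cos(azimuth_rad),
            np.cos(elevation_rad) * np.sin(azimuth_rad),
            np.sin(elevation_rad),
        ])
    else:
        raise ValueError("need either a target position or a head azimuth")
    return head, direction / np.linalg.norm(direction)

head = [0.0, 0.0, 1.2]
print(pseudo_line_of_sight(head, target=[80.0, 10.0, 1.0]))    # target known
print(pseudo_line_of_sight(head, azimuth_rad=np.deg2rad(15)))  # azimuth only
```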
As described above, the symbol of the present disclosure may be a symbol near an external target. The symbol may be a circle around the target, such as an aircraft or taxiway, a virtual taxiway marker, a flight path vector, synthetic runway or terrain, or an intruder marker and identification, and the like. In some embodiments, the symbol may be represented as a planar circle or any shape surrounding the target. In another embodiment, the symbol may include alphanumeric information. In the absence of external targets, the symbols may include flight assistance information, such as flight path vectors, synthetic runways, or terrain, among others.
FIG. 11 depicts the projection of a symbol on the windshield for one pilot according to an exemplary embodiment. The size of a symbol may be defined by its viewing angle. In some embodiments, the symbol may be generated from a three-dimensional model of the symbol. Most three-dimensional graphics APIs define polygon or spline graphics based on a list of three-dimensional vertices with Cartesian coordinates rather than angular representations. This is why FIG. 11 shows the figure as lying on an arbitrary plane perpendicular to the line of sight: it shows the internal representation of the symbols in the software. A general-purpose graphics API has a standard method of computing and rendering an image based on the assumptions shown in FIG. 12.
FIG. 12 depicts three-dimensional generation of symbols in accordance with an exemplary embodiment. As shown, the volume rendered into the final image may be viewed as a truncated pyramid (view frustum) with its apex located at the eye of the potential viewer, bounded by a near clipping plane and a far clipping plane. The surface on which the image is drawn is a plane perpendicular to the observer's line of sight, which corresponds approximately to the display plane (e.g., windshield) for the normal position and orientation of the actual observer's head. Thus, based on the current pilot's head position (with the head as the apex), one or more contiguous view frustums are generated using elements such as the pixels, solid angles, and normal of the windshield plane within the projection range of each projector. Based on this frustum, the three-dimensional model of the depicted symbol (at its actual location) is then projected as a two-dimensional image and fused into the original bitmap to generate the projected image.
However, at least two aspects of the projection planes for the projection devices in the cockpit do not match the above assumptions. First, the surface of the windshield may not be perpendicular to the line of sight; its angle is determined by, for example, aerodynamics and bird-strike resistance. Second, the surface of the windshield may not be flat (e.g., Boeing 747 and 787, Bombardier CRJ, etc.).
The first problem can be solved by replacing the standard perspective projection matrix (e.g., glFrustum) with a customized view and projection matrix; any known or future-developed algorithm may be used. The second problem can be solved by approximating the surface of the windshield with multiple planes and projecting the scene individually onto each of these planes. Multiple frustums can be used to compensate for the curvature of the transparent body or a lateral offset of the projector, thereby reducing the uneven distribution of pixels per unit surface of the windshield. Compensation for curvature will be described in detail below.
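The customized matrix is typically an asymmetric (off-axis) frustum: the same matrix form glFrustum produces, but with the left/right/bottom/top bounds recomputed every frame from the tracked head position relative to the projection plane. A sketch of that standard matrix form, with the per-frame bounds left to the caller (the numbers below are illustrative):

```python
import numpy as np

def off_axis_projection(left, right, bottom, top, near, far):
    """Asymmetric view-frustum matrix (same form as OpenGL's glFrustum).

    left/right/bottom/top are the frustum bounds on the near plane; for
    head-coupled projection they are recomputed each frame from the head
    position relative to the projection plane, making the frustum off-axis.
    """
    return np.array([
        [2*near/(right-left), 0.0, (right+left)/(right-left), 0.0],
        [0.0, 2*near/(top-bottom), (top+bottom)/(top-bottom), 0.0],
        [0.0, 0.0, -(far+near)/(far-near), -2*far*near/(far-near)],
        [0.0, 0.0, -1.0, 0.0],
    ])

# Head displaced to one side: the frustum is asymmetric about the view axis.
P = off_axis_projection(left=-0.3, right=0.5, bottom=-0.2, top=0.3,
                        near=0.5, far=200.0)
vertex = np.array([1.0, 0.5, -10.0, 1.0])  # a symbol vertex in eye space
clip = P @ vertex
print(clip[:3] / clip[3])  # normalized device coordinates
```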
However, as mentioned above, in situations where the symbol needs to provide a conformal image to more than one pilot (such as the captain and the copilot), the parallax differences between these pilots need to be compensated.
FIG. 13 is a schematic diagram showing the parallax difference between two pilots. As shown in FIG. 13, if the above projection scheme for one pilot were applied to two pilots, the same external target could be projected as two symbols, one for each pilot. This may lead to pilot distraction or even confusion, particularly because, unlike with a standard HUD, both pilots can see both symbols at the same time. The root cause is that the two pilots' positions are spatially separated; yet, for attention cueing and/or information notification to have the desired effect, the generated symbol should be conformal to the external view for each pilot.
In an exemplary embodiment, to mitigate this inconvenience for both pilots, additional information is added to the symbol, such as what is being looked at (e.g., an aircraft call sign or a runway/taxiway identifier in alphanumeric form) and/or who should look at it (captain or copilot). The additional information may help the pilots interpret the symbols correctly. Such information can also be helpful in the single-pilot case.
In another embodiment, the parallax difference compensation can also be implemented with a single symbol. FIG. 14 shows an example of the adjustment of a single symbol for two pilots according to an exemplary embodiment. In this embodiment, a three-dimensional model of the symbol may be built around an average pseudo line of sight based on the midpoint between the two pilots, with the corresponding symbol varying as a function of azimuth (θ). The average pseudo line of sight may be calculated from the midpoint of the head positions of the two pilots and the position of the external target, as shown in FIG. 14.
FIG. 15 shows an example of the adjustment of a single symbol for two pilots with different azimuth angles. As shown in FIG. 15, when the target is located directly in front of the aircraft at a reasonable distance, the two pseudo lines of sight will be substantially parallel, and separate symbols are therefore rendered, one for each pilot. In this case, the shape of the symbol may clearly indicate whether it is intended for the pilot looking at it or for his or her copilot.
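The single-symbol compensation of FIGS. 14-15 can be sketched as follows: build the symbol around the average pseudo line of sight from the midpoint of the two head positions, and drive the symbol's shape by the azimuth difference θ between the two individual lines of sight. The helper functions and values below are illustrative:

```python
import numpy as np

def average_pseudo_los(head_captain, head_copilot, target):
    """Average pseudo line of sight from the midpoint of both heads (FIG. 14)."""
    midpoint = (head_captain + head_copilot) / 2.0
    direction = target - midpoint
    return midpoint, direction / np.linalg.norm(direction)

def azimuth_difference(head_a, head_b, target):
    """Angle between the two pilots' individual pseudo lines of sight."""
    da = (target - head_a) / np.linalg.norm(target - head_a)
    db = (target - head_b) / np.linalg.norm(target - head_b)
    return np.arccos(np.clip(np.dot(da, db), -1.0, 1.0))

captain = np.array([0.0, -0.5, 1.2])
copilot = np.array([0.0, 0.5, 1.2])
near_target = np.array([30.0, 2.0, 1.0])
far_target = np.array([3000.0, 0.0, 50.0])

print(average_pseudo_los(captain, copilot, near_target))
# For a distant target the individual lines are nearly parallel (cf. FIG. 15):
print(np.rad2deg(azimuth_difference(captain, copilot, far_target)))   # ~0 deg
print(np.rad2deg(azimuth_difference(captain, copilot, near_target)))  # larger
```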
FIG. 16 depicts illuminating a transparent body so that it self-emits, using excitation light (e.g., ultraviolet light) from a light source (e.g., a projector), according to an example embodiment. The transparent body 21, such as the left and right windshields, left and right front windows, and left and right rear windows shown in FIG. 4, may receive excitation light from a light output such as the UV DLP projector 22. The received excitation light may be absorbed by a luminescent material (e.g., a phosphor layer) on the transparent body 21. When the luminescent material receives the excitation light, it emits visible light (fluorescence). Accordingly, an image (e.g., a symbol) can be created on the transparent body 21 by selectively illuminating it with excitation light.
According to an embodiment of the present disclosure, the excitation light may be ultraviolet light. If the excitation light is ultraviolet light, a down-conversion physical phenomenon occurs when the luminescent material emits visible light in response to the ultraviolet light. Specifically, ultraviolet light has a shorter wavelength and higher energy than visible light. Accordingly, when the luminescent material absorbs ultraviolet light and emits lower-energy visible light, the ultraviolet light is down-converted into visible light because the energy level of the ultraviolet light is lowered when it is converted into visible light. In some embodiments, the luminescent material is a fluorescent material.
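As a worked example of this energy bookkeeping (standard photophysics, not values specified by this disclosure): photon energy is E = hc/λ, or roughly 1240 eV·nm divided by the wavelength, so a 365 nm UV photon carries about 3.4 eV while a 550 nm green photon carries about 2.25 eV. The difference of roughly 1.15 eV is dissipated in the phosphor (largely as heat), which is why a single UV photon always supplies enough energy to yield one visible photon.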
According to embodiments of the present disclosure, the excitation light may alternatively be infrared (IR) light. If the excitation light is infrared light, an up-conversion physical phenomenon occurs when the luminescent material emits visible light in response to the infrared light. Specifically, infrared light has a longer wavelength and lower energy than visible light. Accordingly, when the luminescent material absorbs infrared light and emits higher-energy visible light, the infrared light is up-converted into visible light because its energy level rises in the conversion. In some embodiments, the luminescent material is a fluorescent material. In up-conversion physics, the absorption of more than one infrared photon is necessary for the emission of each visible photon. Those skilled in the art will appreciate that this need for multiphoton absorption makes infrared light a less desirable choice for the excitation light than ultraviolet light.
In the exemplary embodiment shown in FIG. 16, the excitation light may be output by a digital projector. In some embodiments, the projector may be a micro-mirror array (MMA) projector (e.g., a digital light processing (DLP) projector). An MMA projector outputting ultraviolet light, i.e., the UV DLP projector 22, may be similar to an MMA projector outputting visible light, except that its color wheel has filters tailored to the ultraviolet spectrum. Those skilled in the art will appreciate that the excitation light used by the projector may vary depending on the luminescent material (phosphor layer) on the transparent body 21.
One aspect of the present disclosure is to utilize a UV or IR projector in an aircraft to achieve full windshield projection without the restricted field of view of a HUD.
In the embodiment shown in FIG. 16, the transparent body may comprise a UV filter between the glass and the luminescent material for blocking UV light from the outside, so that the luminescent material is not excited by undesired UV light from outside the cabin. A variety of fluorescent materials are currently available in industry. A common characteristic of these materials is that the fluorescent particles are very small, typically nanoparticles or molecules between 0.5 nm and 500 nm in size, which scatter little light and thus preserve the visible transparency of the screen. Such materials include, for example: inorganic nano-sized phosphors; organic molecules and dyes; semiconductor-based nanoparticles; and organometallic molecules.
For down-conversion, the following materials may be used to form a fluorescence conversion (FC) display screen:
1. Inorganic or ceramic phosphors or nanoparticles, including but not limited to metal oxides, metal halides, metal chalcogenides (e.g., metal sulfides), or mixtures thereof, such as metal oxy-halides and metal oxy-chalcogenides. These inorganic phosphors are widely used in fluorescent lamps and electronic monitors. They can convert shorter-wavelength photons (e.g., UV and blue light) into longer-wavelength visible light and can be easily deposited on, or dispersed in, a display screen.
2. Laser dyes and small organic molecules, as well as fluorescent organic polymers. These also convert shorter-wavelength laser photons (e.g., UV and blue light) into longer-wavelength visible light and can be easily and uniformly applied to the display screen. Because their solids are in a molecular state, screen transparency is maintained, as there is no particle scattering.
3. Nanoparticles of semiconductors, such as II-VI or III-V compound semiconductors, for example fluorescent quantum dots. Their addition to the screen likewise does not affect optical transparency.
4. Organometallic molecules. Such a molecule includes at least a metal center, such as a rare earth element (e.g., Eu, Tb, Ce, Er, Tm, Pr, Ho), a transition metal element (e.g., Cr, Mn, Zn, Ir, Ru, V), or a main-group element (e.g., B, Al, Ga). The metal element is chemically bonded to organic groups to prevent quenching of the fluorescence by the host or solvent. Screens filled with such organometallic compounds do not scatter light or affect the transparency of the screen, unlike micrometer-sized particles.
Among the down-converting FC materials or molecules described above, those that can be excited by long-wavelength UV (e.g., >300 nm) to blue (<500 nm) lasers and produce visible light emission can be used in embodiments of the present application. For example, the phosphor may be from the garnet series of Ce-doped phosphors: (YmA1-m)3(AlnB1-n)5O12, where 0 ≤ m, n ≤ 1, A includes other rare earth elements, and B includes B and Ga. In addition, phosphors comprising metal silicate, metal borate, metal phosphate, and metal aluminate hosts are preferably applied to FC displays. Nanoparticle phosphors are also preferably used in FC displays; these nanoparticles comprise common rare earth elements (e.g., Eu, Tb, Ce, Dy, Er, Pr, Tm) and transition or main-group elements (e.g., Mn, Cr, Ti, Ag, Cu, Zn, Bi, Pb, Sn, Tl) as fluorescence activators. Finally, some undoped materials (e.g., tungstates of metals such as Ca, Zn, Cd; metal vanadates; ZnO; etc.) are also preferred FC display materials.
Commercial laser dyes are another exemplary FC display material. A variety of commercial laser dyes are available from laser dye suppliers, including Lambda Physik and Exciton. Some preferred classes of laser dyes include pyrromethenes, coumarins, rhodamines, fluoresceins, and other aromatic hydrocarbons and their derivatives. In addition, a variety of polymers containing unsaturated carbon-carbon bonds can also be used as fluorescent materials and have a variety of optical and fluorescence applications. For example, MEH-PPV and similar polymers have been used in optoelectronic devices such as polymer light-emitting diodes (PLEDs). Such fluorescent polymers can be used directly as the fluorescent layer of a transparent 2-D display screen. Furthermore, recently developed semiconductor nanoparticles (e.g., quantum dots) are also preferred FC display materials. The term "semiconductor nanoparticles" refers to inorganic crystallites having a diameter between 1 nm and 1000 nm, preferably between 2 nm and 50 nm. Semiconductor nanoparticles are capable of emitting electromagnetic radiation upon excitation (i.e., they are luminescent). A nanoparticle may be a uniform nanocrystal or may include multiple shells; for example, it may comprise one or more "cores" of a first semiconductor material surrounded by a "shell" of a second semiconductor material. The core and/or shell may be a semiconductor material including, but not limited to, group II-VI materials (ZnS, ZnSe, ZnTe, CdS, CdSe, CdTe, HgS, HgSe, HgTe, MgS, MgSe, MgTe, CaS, CaSe, CaTe, SrS, SrSe, SrTe, BaS, BaSe, BaTe, etc.), group III-V materials (GaN, GaP, GaAs, GaSb, InN, InP, InAs, InSb, etc.), and group IV materials (Ge, Si, etc.), as well as alloys or mixtures thereof.
Finally, fluorescent organometallic molecules containing rare earth or transition element cations may also be used in the down-conversion phosphor screen. Such a molecule comprises a metal center of a rare earth element (including Eu, Tb, Er, Tm, Ce) protected externally by organic chelating groups. The metal center may also include transition elements such as Zn, Mn, Cr, Ir, etc., and main-group elements such as B, Al, Ga. Such organometallic molecules can readily dissolve in liquid or transparent solid host media and form transparent phosphor screens for 2-D transparent displays with minimal light scattering. Some examples of such fluorescent organometallic molecules include: 1. tris(dibenzoylmethane)mono(phenanthroline)europium(III); 2. tris(8-hydroxyquinoline)erbium; 3. tris(1-phenyl-3-methyl-4-(2,2-dimethylpropan-1-oyl)pyrazolin-5-one)terbium(III); 4. bis(2-methyl-8-hydroxyquinolinato)zinc; 5. diphenylborane-8-hydroxyquinolinate.
Up-converting phosphors are similar in chemical composition to the down-converting fluorescent materials discussed above. Other aspects of fluorescent materials, including up-converting materials, are described in a number of references, such as U.S. Patent Application Publication No. 2010/0254019 A1, published October 7, 2010, which is incorporated herein by reference.
Referring back to FIG. 6, in the projection device, a projector such as a UV DLP projector may be equipped with an optical system that corrects for windshield curvature so that the image appears focused over the largest possible area of the windshield. In other embodiments, the light sources may be centralized outside the pod and directed into the projector pod using clusters of UV diodes and light-conducting materials (e.g., optical fibers), to provide more room for the flight crew and avoid excessive heat dissipation in the area.
In an exemplary embodiment, the projector may not be arranged with its projection axis perpendicular to the projection transparency, nor at a long distance from the transparency. This affects the optical design of the projector. For example, if the optical system is axisymmetric, the angle between the projector's optical axis and the projection surface will cause the image to be out of focus over most of the screen and to exhibit severe keystone (trapezoidal) distortion. This focusing problem is further complicated by the fact that the projection surface may not even be flat (the curved-surface problem mentioned above).
To solve this problem, the position of the optical system of the projector relative to the projection surface can be fixed, so that the lens can be designed such that the focal plane is as close to the projection surface as possible and the pixels will be distributed as uniformly as possible.
The actual distribution of pixels on the screen is an important property, because standard rendering methods usually assume that pixels are evenly distributed over the windshield, which is difficult to achieve here. Furthermore, each processor (or graphics processing unit (GPU)) will drive digital light processor(s) (DLPs) to process a rectangular pixel map that will not be drawn onto the windshield as-is. It is worth noting that a small level of distortion may be acceptable: a circle drawn on a wall is optically perceived as an ellipse (unless viewed head-on), yet is still recognized as a circle after the brain's perspective correction.
In an exemplary embodiment, the windshield may be divided into a plurality of regions of substantially uniform pixel density, each region mapped to a pixel rectangle on the DLP. FIG. 17 depicts the partitioned regions of uniform pixel density and the corresponding rectangular window images on the DLP according to an example embodiment. For each region, the three-dimensional model of the symbol may be projected into the corresponding pixel rectangle on the DLP projector and rendered in the generic manner using the oblique projection method described above. FIG. 18 depicts the projection of a three-dimensional model of a symbol within the partitioned regions of uniform pixel density according to an exemplary embodiment. As shown in FIG. 18, the windshield is divided into two regions, the upper region having a pixel density of 60 pixels per inch and the lower region 45 pixels per inch, thereby compensating for the optical distortion caused by the curved windshield. As shown in FIG. 18, this may result in overlapping regions on the DLP bitmap, but an accumulation buffer or any other typical method may be used to fuse the overlapping portions together. In some embodiments, the symbol may therefore be a grayscale bitmap to speed up rendering. It is also necessary to estimate the pixel density distribution of the projected image. In one exemplary embodiment, a mathematical analysis of the optical system is employed, although in a lens system the optimized result is more likely to be obtained by numerical simulation. In another embodiment, test patterns are projected onto the DLP over different display areas (inspecting different pixel sizes), and the equivalent pixels per inch for displaying the image on the actual window are identified.
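One way to realize this region-to-rectangle mapping is a simple lookup from windshield regions, each with its measured equivalent pixels per inch, to viewport rectangles on the DLP bitmap, so that each region is rendered in its own pass. A minimal data-structure sketch matching the two-region example of FIG. 18; region boundaries, densities, and resolutions are illustrative:

```python
from dataclasses import dataclass

@dataclass
class Region:
    name: str
    ppi: float        # measured equivalent pixels per inch on the glass
    dlp_rect: tuple   # (x0, y0, x1, y1) pixel rectangle on the DLP bitmap

# Two-region split as in FIG. 18 (all numbers illustrative):
regions = [
    Region("upper", ppi=60.0, dlp_rect=(0, 0, 1280, 420)),
    Region("lower", ppi=45.0, dlp_rect=(0, 400, 1280, 768)),  # overlaps upper
]

def region_for(windshield_height_in):
    """Pick the region (and thus its DLP rectangle) for a point on the glass,
    keyed here on a vertical position in inches for simplicity. Overlapping
    rectangles are fused afterwards, e.g. with an accumulation buffer."""
    return regions[0] if windshield_height_in > 12.0 else regions[1]

r = region_for(15.0)
print(r.name, r.ppi, r.dlp_rect)  # upper region, 60 ppi
r = region_for(5.0)
print(r.name, r.ppi, r.dlp_rect)  # lower region, 45 ppi
```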
Exemplary projection method
FIG. 19 is a flow chart of an example projection method for adjustment with head position. The method may be performed by the projection system described above. The method steps presented in this section may be performed in any suitable order and combination, and may be modified by any other features and characteristics of the present disclosure.
At 1901, head position(s) of the pilot(s) are obtained. In some embodiments, output from the head tracking device is received to locate the head position(s) of the pilot(s) in the cockpit. In some embodiments, other information may also be obtained, such as the three-dimensional position and model of the target object to which the symbol is to conform (e.g., aircraft position information from ADS-B or an airport map database) and its relative position calculated from the aircraft's current position.
At 1902, a pseudo line of sight for each pilot is calculated based on the pilot's head position, and a symbol is created. In some embodiments, the actual symbol is generated from a three-dimensional model of the symbol, as shown in FIG. 12. For one pilot, the symbol may be created based on that pilot's head position, with the adjustments shown in FIGS. 10-11 performed so that the projected symbol conforms to the corresponding target. For two or more pilots, symbols may be created based on the pilots' head positions, performing the adjustments shown in FIGS. 13-15 so that the projected symbols conform to the corresponding targets and also remain conformal between pilots.
At 1903, the transparent body on which the symbol is to be projected is determined based on the calculated pseudo line of sight of the pilot. If the transparent body has multiple planes or is not flat, the sub-surface(s) onto which the symbol is to be projected can be determined, at 1904, based on the calculated pseudo line of sight. Here, the dashed boxes in FIG. 19 indicate that the steps therein are optional.
At 1905, the symbols are output to be rendered and projected onto the target transparent body or the target sub-surface of the transparent body. Referring back to FIG. 6, the output from the GPU(s) is transmitted to the projector. In some embodiments, the three-dimensional model of the symbol is output as a central data segment generated by the algorithm described above.
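Putting the steps together, one frame of the method of FIG. 19 can be sketched as below, reusing the ray-plane geometry outlined earlier in this description. Every name here (head_tracker, projectors, draw, etc.) is an illustrative placeholder, not an interface defined by this disclosure:

```python
import numpy as np

def projection_frame(head_tracker, targets, windshields, projectors):
    """One iteration of the method of FIG. 19 (illustrative sketch).

    head_tracker.heads() -> list of head positions (step 1901);
    targets: external target positions, e.g. from ADS-B or an airport map;
    windshields: list of (plane_point, plane_normal, projector_index) tuples;
    projectors[i].draw(point) renders a symbol at a point on that pane (1905).
    """
    heads = head_tracker.heads()                                # step 1901
    for target in targets:
        for head in heads:
            los = target - head                                 # step 1902
            los = los / np.linalg.norm(los)
            for plane_p, plane_n, proj_idx in windshields:      # steps 1903/1904
                denom = np.dot(plane_n, los)
                if abs(denom) < 1e-9:
                    continue                                    # pane parallel to LOS
                t = np.dot(plane_n, plane_p - head) / denom
                if t > 0:                                       # pane lies ahead
                    projectors[proj_idx].draw(head + t * los)   # step 1905
                    break

class _StubTracker:
    def heads(self):
        return [np.array([0.0, 0.0, 1.2])]

class _StubProjector:
    def draw(self, point):
        print("symbol at", np.round(point, 2))

projection_frame(_StubTracker(),
                 targets=[np.array([60.0, 4.0, 2.0])],
                 windshields=[(np.array([1.0, 0.0, 0.0]),
                               np.array([1.0, 0.0, 0.0]), 0)],
                 projectors=[_StubProjector()])
```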
The projection system of the present disclosure is not limited by the HUD field of view. For example, this technique may be very useful during a crosswind approach. Furthermore, it can visually track or point to objects located outside the field of view. In some embodiments, by tracking head position and calculating the pseudo line of sight, a visual alert can be created in the main field of view whenever the pilot looks out of the window, regardless of head orientation. In addition, the projection system of the present disclosure can present a complete and conformal composite image outside the window, which is not possible or practical for field-of-view-limited head-up displays (HUD) and head-down displays (HDD). Furthermore, the crew does not need any additional head-mounted equipment.
The symbol generation of the present disclosure may be used, for example but not limited to, for visual acquisition of airspace traffic (traffic advisories or ADS-B) and for virtual signage displayed during taxiing based on taxi clearance guidance algorithms at complex airports. This technique can also be used to generate straight-ahead symbology, or can be combined with a conventional HUD (e.g., a waveguide HUD) if it is desired to keep the focus at infinity.
Embodiments of the mechanisms disclosed herein may be implemented in hardware, software, firmware, or a combination of such implementations. Embodiments of the invention may be implemented as computer programs or program code executing on programmable systems comprising at least one processor, a storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
It will be appreciated that the steps of the methods described herein may be implemented by one or more general or special purpose processors (or processing device (s)), such as microprocessors, digital signal processors, custom processors and Field Programmable Gate Arrays (FPGAs), and unique stored program instructions (including both software and firmware) that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of the methods and/or apparatus described herein.
In some embodiments, integrity enhancement module(s) coupled to processor(s) (computing devices) that perform the steps of the methods described herein may be provided to achieve higher system integrity and security. In some embodiments, the integrity enhancement module may monitor the performance and/or output of the steps of the methods described herein. In some embodiments, the integrity enhancement module may perform cross checking with the main processor(s) by performing the same steps. In some embodiments, the integrity enhancement module may include one or more units, each unit performing the steps of the methods described herein to vote from the results of the outputs from each unit and the host processor(s), such as by majority decision. The integrity enhancement module may be implemented by hardware such as microprocessors, digital signal processors, custom processors, and Field Programmable Gate Arrays (FPGAs), as well as uniquely stored program instructions, including both software and firmware, so long as it is separate from the main processor(s).
Alternatively, some or all functions could be implemented by a state machine that has no stored program instructions, or in one or more Application Specific Integrated Circuits (ASICs), in which various functions or some combinations of certain of the functions are implemented as custom logic. Of course, a combination of these two approaches may also be used.
Furthermore, the methods described herein may be implemented as a computer-readable storage medium having computer-readable code stored thereon for programming a computer (e.g., including a processor) to perform the methods as described and claimed herein. Examples of such computer-readable storage media include, but are not limited to, hard disks, CD-ROMs, optical memory devices, magnetic memory devices, ROMs (read-only memories), PROMs (programmable read-only memories), EPROMs (erasable programmable read-only memories), EEPROMs (electrically erasable programmable read-only memories), and flash memories. Moreover, it is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein will be readily capable of generating such software instructions and programs and ICs with minimal experimentation.
Specific embodiments have been described in the foregoing specification. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the disclosure as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of present teachings. Furthermore, the described embodiments/examples/implementations should not be construed as mutually exclusive, but rather should be understood as potentially combinable if such combination is allowed in any way. In other words, any feature disclosed in any of the foregoing embodiments/examples/implementations may be included in any of the other foregoing embodiments/examples/implementations.
The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature or element of any or all the claims. The disclosure is defined solely by the appended claims including any amendments made during the pendency of this application and all equivalents of those claims as issued.
The Abstract of the disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. This Abstract is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. Furthermore, in the foregoing detailed description, it can be seen that various features are grouped together in various embodiments for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the detailed description, with each claim standing on its own as a separately claimed subject matter.

Claims (28)

1. A projection system, comprising:
a projection device configured to project a graphical image associated with an external target on a transparent body;
a head tracking device configured to track and obtain head positions of one or more operators; and
one or more processors configured to adjust a projection position of the graphical image on the transparent body based at least in part on the head position of the operator such that the graphical image is conformal to the external target with which it is associated, and to provide the adjusted image data to the projection device,
wherein the transparent body comprises:
a windshield comprising glass;
a UV filter for blocking external UV light; and
a phosphor layer capable of being excited to emit light by the excitation light from the projection device.
2. The projection system of claim 1, wherein the projection system is disposed on a vehicle and the operator is located within the vehicle.
3. The projection system of claim 1, wherein the head tracking device comprises one or more cameras mounted near each of the one or more operators, the cameras configured to obtain coordinates of the operator's head as the head position.
4. The projection system of claim 1, wherein the processor is further configured to calculate, for each of the one or more operators, a pseudo line of sight based at least in part on a head position of the operator and a position of an external target.
5. The projection system of claim 1, wherein the head-tracking device is further configured to obtain a head azimuth angle for the one or more operators, and
wherein the processor is further configured to calculate, for each of the one or more operators, a pseudo line of sight based at least in part on the head position and the head azimuth for that operator.
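As a hedged illustration of claim 5 (the names and the angle convention are the editor's assumptions, not taken from the patent), a pseudo line of sight can be represented as an origin plus a unit direction derived from the tracked head azimuth:

import numpy as np

def pseudo_los(head_pos, head_azimuth_deg):
    # Assumed convention: azimuth in degrees from the vehicle's
    # forward (+x) axis, positive toward +y; elevation is ignored.
    az = np.radians(head_azimuth_deg)
    return head_pos, np.array([np.cos(az), np.sin(az), 0.0])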
6. The projection system of claim 4 or 5, wherein the processor is further configured to determine which transparent body the graphical image is to be projected onto based at least in part on the pseudo line of sight, and to adjust a projection position of the graphical image on that transparent body for each of the one or more operators.
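A sketch of the selection step in claim 6, under the same simplifying assumptions (each transparent body modeled as a bounded plane; the data layout is hypothetical):

import numpy as np

def select_transparency(head_pos, los_dir, panels):
    # 'panels' is an assumed list of dicts, each with a reference
    # 'point', a surface 'normal', and an 'in_bounds' membership test.
    for panel in panels:
        denom = float(np.dot(panel["normal"], los_dir))
        if abs(denom) < 1e-9:
            continue  # pseudo line of sight parallel to this panel
        t = float(np.dot(panel["normal"], panel["point"] - head_pos)) / denom
        if t > 0:
            hit = head_pos + t * los_dir
            if panel["in_bounds"](hit):
                return panel, hit  # project onto this transparent body
    return None, None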
7. The projection system of claim 4, wherein the processor is further configured to:
calculating an average pseudo line of sight based at least in part on a midpoint of the head positions of the two operators and the position of the external target, and
adjusting a projection position of the graphical image to be conformal for the two operators based at least in part on the average pseudo line of sight and the pseudo lines of sight of the two operators.
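Claim 7's compromise for a two-operator cockpit can be sketched as follows (illustrative only; the midpoint rule is the only detail taken from the claim itself):

import numpy as np

def average_pseudo_los(head_a, head_b, target_pos):
    # Anchor the averaged line of sight at the midpoint between the
    # two operators' heads and aim it at the shared external target.
    midpoint = (head_a + head_b) / 2.0
    direction = target_pos - midpoint
    return midpoint, direction / np.linalg.norm(direction)

A projection position derived from this averaged ray is a compromise: the graphic is not exactly conformal for either operator alone, but the residual error is shared between the two.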
8. The projection system of claim 1, wherein the processor is further configured to remove at least a portion of the graphical image in response to detecting an unused graphical image or detecting tunnel vision.
9. The projection system of claim 1, wherein the graphical image is generated from a three-dimensional model of a symbol.
10. The projection system of claim 2, wherein the projection device comprises:
a plurality of projectors mounted within the vehicle;
a light source coupled to the projection device and outputting UV light or IR light; and
an optical system configured to focus the graphic image on a maximum area of the transparent body and to distribute the graphic image uniformly.
11. The projection system of claim 1, wherein the projection device is a UV Digital Light Processing (DLP) projector and uses UV light as the excitation light.
12. The projection system of claim 11, wherein the processor is further configured to modify a projection matrix in the UV DLP projector for oblique projection.
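One conventional way to realize the oblique projection of claim 12 is to post-multiply the projector's 4x4 projection matrix by a depth-dependent shear. The sketch below assumes a numpy 4x4 matrix and editor-chosen skew factors; the patent does not specify this particular decomposition:

import numpy as np

def oblique_projection(base_projection, skew_x, skew_y):
    # Shear x and y as a function of depth (z) so the view volume is
    # tilted to match a projector mounted off-axis to the windshield.
    shear = np.eye(4)
    shear[0, 2] = skew_x
    shear[1, 2] = skew_y
    return base_projection @ shear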
13. The projection system of claim 11 or 12, wherein the UV DLP projector comprises a plurality of pixel rectangles, each of which corresponds to a sub-region of the transparent body, and
the processor is further configured to control the UV DLP projector such that the graphical image on each of the sub-regions is corrected for the curvature of that sub-region.
14. The projection system of claim 13, wherein at least two of the plurality of pixel rectangles have different pixel densities.
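Claims 13 and 14 describe a piecewise correction: the projector raster is divided into pixel rectangles, each pre-warped for the curvature of its own sub-region of the transparent body, with different pixel densities where more correction is needed. A hedged sketch, assuming each sub-region's warp has been calibrated as a 3x3 homography (the calibration itself is outside this snippet):

import numpy as np

def correct_subregion(pixels, homography):
    # 'pixels' is an (N, 2) array of coordinates inside one pixel
    # rectangle; 'homography' is that sub-region's calibrated 3x3
    # pre-warp. Returns the warped (N, 2) coordinates.
    ones = np.ones((pixels.shape[0], 1))
    warped = np.hstack([pixels, ones]) @ homography.T
    return warped[:, :2] / warped[:, 2:3]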
15. A method of projection, comprising:
obtaining head positions of one or more operators;
adjusting a projection position of a graphic image to be projected on a transparent body based at least in part on the head position of the operator such that the graphic image conforms to an external target with which it is associated; and
outputting an adjusted graphical image to be projected on the transparent body,
wherein the transparent body comprises:
a windshield comprising glass;
a UV filter for blocking external UV light; and
a phosphor layer capable of being excited to emit light by excitation light from a projection device.
16. The method of claim 15, wherein the projection method is for an aircraft and the operator is a pilot.
17. The method of claim 15, further comprising: calculating, for each of the one or more operators, a pseudo line of sight based at least in part on a head position of the operator and a position of an external target.
18. The method of claim 17, further comprising:
obtaining a head azimuth of the one or more operators, and calculating, for each of the one or more operators, a pseudo line of sight based at least in part on the head position and the head azimuth of that operator.
19. The method of claim 17 or 18, further comprising: determining which transparent body the graphical image is to be projected onto based at least in part on the pseudo line of sight, and adjusting, for each of the one or more operators, a projection position of the graphical image on that transparent body.
20. The method of claim 17, further comprising:
calculating an average pseudo line of sight based at least in part on a midpoint of the head positions of the two operators and the position of the external target, and
adjusting a projection position of the graphical image to be conformal for the two operators based at least in part on the average pseudo line of sight and the pseudo lines of sight of the two operators.
21. The method of claim 15, further comprising removing at least a portion of the graphical image in response to detecting an unused graphical image or detecting tunnel vision.
22. The method of claim 15, wherein the graphical image is generated from a three-dimensional model of a symbol.
23. The method of claim 15, further comprising modifying a projection matrix in a UV Digital Light Processing (DLP) projector for oblique projection.
24. The method of claim 23, wherein the UV DLP projector comprises a plurality of pixel rectangles, each of the plurality of pixel rectangles corresponding to a sub-region of the transparent body, and the method further comprises:
correcting the graphical image on each of the sub-regions for the curvature of that sub-region.
25. A projection system, comprising:
one or more processors configured to perform the steps of the method of any one of claims 15-24.
26. A projection system, comprising:
a processor configured to perform the steps of the method of any one of claims 15-24; and
at least one integrity enhancement module coupled to the processor and configured to perform at least one step of the method of any of claims 15-24.
27. The projection system of claim 26, wherein results of respective steps performed by the processor are monitored based on results of each step performed by each of the at least one integrity enhancement module.
28. A non-transitory computer-readable storage medium comprising a plurality of instructions which, when executed by one or more processors, cause the processors to perform the steps of the method of any one of claims 15-24.
CN201911058743.3A 2019-11-01 2019-11-01 Projection system, apparatus and method for projection position adjustment using head position Active CN110769237B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911058743.3A CN110769237B (en) 2019-11-01 2019-11-01 Projection system, apparatus and method for projection position adjustment using head position

Publications (2)

Publication Number Publication Date
CN110769237A CN110769237A (en) 2020-02-07
CN110769237B (en) 2022-04-12

Family

ID=69335578

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911058743.3A Active CN110769237B (en) 2019-11-01 2019-11-01 Projection system, apparatus and method for projection position adjustment using head position

Country Status (1)

Country Link
CN (1) CN110769237B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112312126A (en) * 2020-10-29 2021-02-02 中国航空工业集团公司洛阳电光设备研究所 Target correction method and target correction equipment for airborne head-up vision system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101682789A (en) * 2007-03-31 2010-03-24 索尼德国有限责任公司 Method for image projection, image projection apparatus and image projection screen
CN102682653A (en) * 2011-08-15 2012-09-19 上海华博信息服务有限公司 Laser based dynamic projection tracing sand plate system, and application of system
CN105353999A (en) * 2014-05-27 2016-02-24 空中客车集团有限公司 Method for projecting virtual data and device enabling said projection
US10254544B1 (en) * 2015-05-13 2019-04-09 Rockwell Collins, Inc. Head tracking accuracy and reducing latency in dynamic environments
CN107018404A (en) * 2015-12-02 2017-08-04 罗克韦尔柯林斯公司 Use the Wearable display of all edge views

Also Published As

Publication number Publication date
CN110769237A (en) 2020-02-07

Similar Documents

Publication Publication Date Title
EP2549329B1 (en) Aircraft display system
US8098171B1 (en) Traffic visibility in poor viewing conditions on full windshield head-up display
US9566946B2 (en) Systems, methods, and computer readable media for protecting an operator against glare
JP5279733B2 (en) Electrically dimmable combiner for head-up displays
US8098170B1 (en) Full-windshield head-up display interface for social networking
US7196329B1 (en) Head-down enhanced vision system
DE102010013356B4 (en) Driver fatigue alarm on head-up indicator for entire windshield
US8933819B1 (en) Exterior aircraft display system
CN103387056B (en) Airborne vehicle and the method showing visual information that flight parameter is relevant to its operator
JP2019217941A (en) Video display system, video display method, program, and moving body
JP2017223933A (en) Image projector including screen and light source using light emitting quantum rod
US20180195691A1 (en) Electromagnetic radiation shielding assembly
CN110769237B (en) Projection system, apparatus and method for projection position adjustment using head position
CN102442252A (en) Vehicle-mounted head-up projection display equipment
US9625716B2 (en) Omnidirectional HUD
Wood et al. Head-up display
JP2008056046A (en) Aircraft landing assistant device at poor visual field
CN202463698U (en) Vehicle-mounted head-up projection display device
CN210666205U (en) Head-up display device, imaging system and vehicle
JP2002182305A (en) Projection type display device
CN112444976A (en) Head-up display device, imaging system and vehicle
US11556019B1 (en) Viewing protection system for laser threats against aircraft
Muensterer et al. Integration and flight testing of a DVE system on the H145
Wisely Wide-angle head-up display system for enhanced and synthetic vision applications
Wisely The design of wide angle head up displays for synthetic vision

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant