WO2020031549A1 - Virtual image display device - Google Patents

Virtual image display device

Info

Publication number
WO2020031549A1
WO2020031549A1 (PCT/JP2019/026011)
Authority
WO
WIPO (PCT)
Prior art keywords
virtual image
viewpoint
image
image display
display device
Prior art date
Application number
PCT/JP2019/026011
Other languages
English (en)
Japanese (ja)
Inventor
和幸 石原
安藤 浩
Original Assignee
株式会社デンソー
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社デンソー (DENSO CORPORATION)
Publication of WO2020031549A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B3/00Simple or compound lenses
    • G02B3/02Simple or compound lenses with non-spherical faces
    • G02B3/06Simple or compound lenses with non-spherical faces with cylindrical or toric faces
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses

Definitions

  • the present disclosure relates to a virtual image display device.
  • the projection unit displays a stereoscopic image in which left-eye images and right-eye images are alternately arranged.
  • the parallax barrier causes the left-eye image to enter the left eye of the observer and the right-eye image to enter the right eye of the observer.
  • In Patent Literature 1, the position of the virtual image does not match the display position of the stereoscopic image. This causes problems such as a decrease in the resolution of the stereoscopic image due to the observer's eyes being out of focus, and a delay in the time needed to recognize the stereoscopic image.
  • the present disclosure has an object to provide a virtual image display device that realizes high visibility in a stereoscopic view of a virtual image.
  • According to one aspect of the present disclosure, a virtual image display device configured to display a virtual image that is visually recognizable from a visual recognition area by reflecting display light onto a projection unit includes: a viewpoint dividing unit having a plurality of viewpoint dividing elements that divide the viewpoints so that a plurality of viewpoints are arranged in the visual recognition area;
  • and a parallax image display unit that has a screen emitting the display light to be transmitted through the viewpoint dividing unit, and that displays a parallax image associated with each viewpoint in each area on the screen individually corresponding to each viewpoint dividing element.
  • Here, Pd denotes the pitch of the arrangement of the viewpoint dividing elements in the virtual image of the viewpoint dividing unit, Pe denotes the interval of the arrangement of the viewpoints in the visual recognition area, and L denotes the distance from the visual recognition area to the virtual image of the viewpoint dividing unit; these quantities are set so as to satisfy a prescribed relationship.
  • FIG. 1 is a diagram illustrating a state where the virtual image display device according to the first embodiment is mounted on a vehicle;
  • FIG. 2 is a perspective view illustrating a schematic configuration of the virtual image display device according to the first embodiment;
  • FIG. 3 is a diagram schematically illustrating the correspondence relationship between the cylindrical lenses, the facing image areas, and the parallax image areas in the first embodiment;
  • FIG. 4 is a block diagram illustrating the image control unit according to the first embodiment;
  • FIG. 5 is a schematic diagram for explaining the relationship between the viewpoints and the stereoscopic video in the first embodiment;
  • FIG. 6 is a schematic diagram for explaining the effect of diffraction in the first embodiment;
  • FIG. 7 is a table showing the numerical values set in the first embodiment;
  • FIG. 8 is a graph for explaining the degree of dispersion of light rays;
  • FIG. 9 is a perspective view showing a schematic configuration of the virtual image display device of the second embodiment;
  • FIG. 10 is a perspective view showing a schematic configuration of the virtual image display device of the third embodiment.
  • As illustrated in FIG. 1, the virtual image display device according to the first embodiment of the present disclosure is used in a vehicle 1 and is mounted on the vehicle 1 by being housed in an instrument panel 2 of the vehicle 1. The illustrated virtual image display device is a head-up display device (hereinafter, HUD device) 100.
  • The HUD device 100 displays a virtual image that can be visually recognized by an occupant, as the observer, by projecting an image onto a projection unit 3a set on the windshield 3 of the vehicle 1. That is, when the display light of the image reflected by the projection unit 3a reaches the visual recognition area EB provided in the cabin of the vehicle 1, an occupant whose eyeball 5 is located in the visual recognition area EB perceives the display light as a virtual image.
  • By means of the virtual image, the occupant recognizes the stereoscopic video SI, and can recognize various types of information from the stereoscopic video SI.
  • the various information to be displayed includes, for example, information indicating the state of the vehicle 1 such as the speed of the vehicle 1 and the remaining fuel amount, or navigation information such as view assist information and road information.
  • the windshield 3 of the vehicle 1 is formed in a translucent plate shape by, for example, glass or synthetic resin, and is disposed above the instrument panel 2.
  • the windshield 3 is arranged so as to be inclined away from the instrument panel 2 toward the rear.
  • the windshield 3 has a projection portion 3a on which the display light is projected, formed in a curved surface having a smooth concave curve.
  • The incident angle of the display light on the projection unit 3a is often set to 20° or more and less than 90°, and is set to 65°, for example.
  • the projection unit 3a may not be provided on the windshield 3.
  • For example, a combiner that is separate from the windshield 3 may be installed in the vehicle 1, and the projection unit 3a may be provided on the combiner.
  • the visual recognition area EB is a space area where the virtual image displayed by the HUD device 100 can be visually recognized so as to satisfy a predetermined standard (for example, the virtual image can be visually recognized at a predetermined luminance or higher), and is also referred to as an eye box.
  • The visual recognition area EB is typically set so as to overlap with the eyellipse set for the vehicle 1.
  • The eyellipse is set in an ellipsoidal shape based on an eye range that statistically represents the spatial distribution of the eye points (that is, the positions of the eyeballs 5) of occupants sitting on the seat 4. The viewing area EB is set in a space rearward of the projection unit 3a.
  • the HUD device 100 includes a housing 10, a parallax image display unit 12, a lenticular lens 20, an image control unit 30, and the like.
  • the housing 10 is formed in a hollow shape having a light-shielding property, for example, with a synthetic resin or metal, and houses the parallax image display unit 12 and the lenticular lens 20.
  • The housing 10 has a window 10a that is optically open upward and faces the projection unit 3a.
  • the window 10a is closed by a dustproof sheet 10b having a light transmitting property or a semi-light transmitting property.
  • the parallax image display unit 12 is provided relatively low in the housing 10.
  • the parallax image display unit 12 of the present embodiment is a liquid crystal display.
  • the parallax image display unit 12 has an image display panel 13 and a backlight 14, and is formed by, for example, housing these in a box-shaped casing.
  • Various configurations such as a so-called edge-type backlight and a direct-type backlight can be adopted as the backlight 14.
  • the image display panel 13 is a flat display element for displaying a real image of an image.
  • the image display panel 13 is a TFT liquid crystal panel using thin film transistors (Thin Film Transistors, TFTs), and is an active matrix type in which a display screen 13a is formed by a plurality of pixels arranged in a two-dimensional direction.
  • a transmissive liquid crystal panel is employed.
  • the image display panel 13 has a rectangular shape having a longitudinal direction and a lateral direction. Since the pixels are arranged along the longitudinal direction and the lateral direction, the display screen 13a also has a rectangular shape.
  • the image display panel 13 includes a pair of polarizing plates, a liquid crystal layer sandwiched between the pair of polarizing plates, and the like.
  • the pair of polarizing plates are arranged so that their polarization axes are substantially orthogonal to each other.
  • the polarization direction of light transmitted through the liquid crystal layer can be rotated according to the applied voltage. In this way, it is possible to control the transmittance of light transmitted through the polarizing plate on the display screen side for each display pixel.
  • The image display panel 13 can display an image on the display screen 13a by illuminating, with the backlight 14, the illumination target surface that is the surface on the backlight 14 side, and by controlling the transmittance for each pixel. Adjacent pixels are provided with color filters of different colors (for example, red, green, and blue), and various colors are expressed by combining them.
  • The display screen 13a is arranged so that its longitudinal direction is along the left-right direction and so that it faces the windshield 3 above, so that display light is emitted upward from each pixel.
  • the lenticular lens 20 is formed of, for example, glass or a synthetic resin, and has translucency.
  • the lenticular lens 20 is formed in a plate shape on which a plurality of cylindrical lenses 21 are arranged.
  • the lenticular lens 20 is arranged on the optical path of the display light, and is arranged, for example, so as to be in contact with the display screen 13a, and thus is configured integrally with the image display panel 13.
  • As the display light is projected onto the projection unit 3a, a virtual image VI1 of the lenticular lens 20 is formed on the opposite side of the viewing area EB with respect to the projection unit 3a, that is, in the space outside the vehicle ahead of the windshield 3.
  • The virtual image VI1 of the lenticular lens 20 is distinct from the image that the display light forms and that is recognized by the occupant. If the parallax image display unit 12 is powered off and the lenticular lens 20 is sufficiently illuminated by external light such as sunlight, the virtual image VI1 of the lenticular lens 20 may actually be confirmed from the visual recognition area EB as an image of the lens shape; otherwise, it is difficult to confirm it directly because of the luminance contrast with the stereoscopic video SI produced by the display light.
  • The cylindrical lenses 21 are disposed so as to extend in the up-down direction in the virtual image VI1 of the lenticular lens 20. Since the virtual image VI1 of the lenticular lens 20 is formed by reflection at the inclined projection unit 3a as described above, the cylindrical lenses 21 as real objects extend in the front-rear direction. The arrangement direction of the cylindrical lenses 21 as real objects is along the left-right direction, and the arrangement direction of the cylindrical lenses 21 appearing in the virtual image VI1 is also along the left-right direction. In FIG. 2, only some of the cylindrical lenses 21 are denoted by a reference numeral.
  • The pitch of the arrangement of the cylindrical lenses 21 is formed so as to be equalized, and more preferably equal, in the virtual image VI1 of the lenticular lens 20. That is, the pitch of the array of the actual cylindrical lenses 21 is modulated in consideration of the curved shape of the projection unit 3a. More specifically, in consideration of the distortion that may occur in the virtual image VI1 of the lenticular lens 20 due to the reflection of the display light at the projection unit 3a, the pitch of the array of the actual cylindrical lenses 21 is set in advance so as to, for example, increase from the center of the lenticular lens 20 toward the left and right. Note that "equalized" in the present embodiment means that the disturbance of the pitch Pd in the virtual image VI1 is improved so as to become smaller than in the case where the pitch of the actual cylindrical lenses 21 is made equal. A minimal sketch of this pre-distortion of the physical pitch is given below.
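  • As an editorial illustration only (a minimal sketch under assumptions introduced here, not the design procedure disclosed in the patent), the pre-distortion described above can be expressed as follows; the function name `warp` and its simple polynomial model of the projection-unit distortion are assumptions for illustration:

        # Python sketch: choose physical lens-boundary positions so that the lens
        # pitch appears uniform in the virtual image VI1.
        import numpy as np

        def physical_boundaries(warp, n_lenses, virtual_width_mm):
            """Physical x-positions of lens boundaries whose images in VI1 are equally spaced."""
            # Equally spaced boundaries in the virtual image (uniform pitch Pd).
            virtual_edges = np.linspace(-virtual_width_mm / 2, virtual_width_mm / 2, n_lenses + 1)
            # Numerically invert the physical-to-virtual mapping on a dense sample.
            xs = np.linspace(-0.6 * virtual_width_mm, 0.6 * virtual_width_mm, 20001)
            return np.interp(virtual_edges, warp(xs), xs)

        # Assumed warp: magnification falls off toward the edges, so the physical
        # pitch must grow from the center outward, as described above.
        warp = lambda x: x * (1.0 - 2e-6 * x ** 2)
        edges = physical_boundaries(warp, n_lenses=200, virtual_width_mm=200.0)
        pitches = np.diff(edges)
        print(pitches[len(pitches) // 2], pitches[0])  # physical pitch at the center vs. at the edge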
  • Each of the cylindrical lenses 21 has, for example, a surface 21a on the display screen 13a side formed as a plane common to the entire lenticular lens 20, and a surface 21b on the opposite side formed as a convex cylindrical surface curved in a longitudinal section that includes the arrangement direction.
  • Each facing image area CGA is virtually set as an area facing the paired cylindrical lens 21 (in other words, as a virtual area for image control).
  • each facing image area CGA is an elongated rectangular area whose longitudinal direction is along the direction in which the cylindrical lens 21 extends, and whose transverse direction is the arrangement direction of the cylindrical lenses 21. Since the cylindrical lenses 21 are arranged without gaps without overlapping each other, the opposed image areas CGA are also arranged without gaps without overlapping each other.
  • The display light emitted from each facing image area CGA is individually incident on the corresponding cylindrical lens 21. Due to the refraction effect of the cylindrical lens 21, the display light rays that belong to the same facing image area CGA and are emitted from pixels at positions shifted from each other in the arrangement direction are refracted in mutually different directions, in the longitudinal section including the arrangement direction, when transmitted through the lenticular lens 20.
  • Each cylindrical lens 21 of the lenticular lens 20 preferably has an optical power that adjusts the imaging position so that the virtual image VI2 of the display screen 13a is formed at a distance of 3 m or more and 5 m or less from the viewing area EB.
  • the image control unit 30 shown in FIG. 4 is a so-called computer, and is mainly configured by an electronic circuit including at least one processor, a memory, and an input / output interface.
  • the processor is an arithmetic circuit that executes a computer program stored in the memory.
  • The memory is a non-transitory tangible storage medium, provided by, for example, a semiconductor memory, that non-temporarily stores computer programs and data readable by the processor.
  • the image control unit 30 is communicably connected to the parallax image display unit 12 and controls an image displayed on the display screen 13a.
  • the image control unit 30 is configured to be able to acquire various information from the vehicle 1 by inputting an electric signal using communication. Note that the communication between the image control unit 30 and each element may employ various suitable communication methods regardless of wired communication or wireless communication.
  • the image control unit 30 has a parallax image control unit 31 and a feedback control unit 32 as functional blocks.
  • the parallax image control unit 31 virtually sets each opposing image area CGA of the display screen 13a, and further virtually sets a parallax image area PGA in which each opposing image area CGA is divided in the array direction of the cylindrical lenses 21. Then, the image displayed on the display screen 13a is controlled.
  • the parallax image areas PGA belonging to the same opposed image area CGA are provided so as to have the same number as the total number of viewpoints VP set in the viewing area EB where the occupant's eyeballs 5 are located.
  • FIG. 3 illustrates the parallax image area PGA with the number of divisions smaller than the actual number, from the viewpoint of the visibility of the figure.
  • In each parallax image area PGA belonging to the same facing image area CGA, a parallax image individually associated with each viewpoint set in the viewing area EB is displayed.
  • a parallax image that represents parallax for an interval between two adjacent viewpoints (hereinafter, referred to as a viewpoint interval Pe ) in the viewing area EB is displayed.
  • The display light from each parallax image displayed in each parallax image area PGA is refracted in different directions by the cylindrical lens 21, and is subsequently reflected by the projection unit 3a toward the individually corresponding position of the viewpoint VP in the visual recognition area EB.
  • one parallax image in each of the opposed image areas CGA individually corresponds to one viewpoint VP. That is, a plurality of parallax images belonging to different facing image areas CGA correspond to one viewpoint VP. More specifically, display light from a predetermined parallax image via each cylindrical lens 21 reaches one viewpoint VP.
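  • As an editorial illustration only (a minimal sketch, not the method disclosed in the patent), the correspondence described above — each facing image area CGA divided into one parallax image area PGA per viewpoint — can be expressed, for the horizontal-parallax case, as a column-wise interleaving of the per-viewpoint parallax images; the names and the simplified one-column-per-PGA layout are assumptions introduced here:

        # Python sketch: interleave N per-viewpoint parallax images so that, within
        # each facing image area CGA, the k-th parallax image area PGA carries the
        # image associated with viewpoint k.
        import numpy as np

        def interleave(parallax_images):
            """parallax_images: (N, H, W) array, one image per viewpoint VP.
            Returns an (H, W*N) panel whose columns cycle through the N viewpoints,
            so each group of N adjacent columns forms one facing image area CGA."""
            n, h, w = parallax_images.shape
            panel = np.empty((h, w * n), dtype=parallax_images.dtype)
            for k in range(n):
                panel[:, k::n] = parallax_images[k]  # viewpoint k -> one PGA per CGA
            return panel

        views = np.stack([np.full((4, 6), k) for k in range(8)])  # 8 dummy viewpoint images
        print(interleave(views)[0, :16])  # first row: columns cycle 0, 1, ..., 7, 0, 1, ...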
  • the lenticular lens 20 functions as a viewpoint dividing unit that divides the viewpoint VP in the viewing area EB into a plurality of parts by the cylindrical lens 21 as the viewpoint dividing element.
  • the cylindrical lenses 21 constituting the viewpoint dividing element are arranged in the left-right direction in the virtual image VI1 of the lenticular lens 20, so that the viewpoints VP in the viewing area EB are also arranged in the left-right direction.
  • Although the viewpoint VP divided by the cylindrical lenses 21 is expressed as a "point", the viewpoints VP are actually set in a "line" shape extending in the up-down direction, corresponding to the direction in which the cylindrical lenses 21 extend. That is, the viewpoint VP of the present embodiment means a "position" for viewing rather than a "point" as a shape. In FIGS. 2 and 5, some of the viewpoints VP are shown schematically.
  • The parallax generated between the parallax images visually recognized from the plurality of viewpoints VP affects the occupant's perception. For example, when the viewpoint VP overlapping the right eye and the viewpoint VP overlapping the left eye differ, the parallax created between the parallax image recognized from the viewpoint VP overlapping the right eye and the parallax image recognized from the viewpoint VP overlapping the left eye affects the occupant's perception.
  • Under the influence of this parallax, the image that the occupant recognizes from the display light is not the virtual image VI2 of the parallax image display unit 12 itself; rather, the stereoscopic video SI emerges at a distance different from that of the virtual image VI2. This stereoscopic video SI and the occupant's eyeball 5 have a conjugate relationship.
  • The viewpoint interval Pe is set to be equal to or less than the occupant's pupil diameter, and preferably equal to or less than half the pupil diameter. That is, at least two viewpoints VP are set for the right eye 5, and at least two viewpoints VP are set for the left eye 5. For this reason, the number of divisions of the viewpoints VP in the present embodiment is at least four, and in practice a large number of viewpoints VP are set in the viewing area EB.
  • The distribution of the light rays of the display light at the display distance Z, assuming that the accommodation of the occupant's eyes is focused at Z, is represented by the following Equation 2. Here, Pd is the pitch of the arrangement of the cylindrical lenses 21 in the virtual image VI1 of the lenticular lens 20, L is the distance from the viewing area EB to the virtual image VI1 of the lenticular lens 20, m is the number of viewpoints entering one pupil, θ is the viewing angle, PSF is a point spread function for taking the effect of diffraction into account, g is the light emission width of one pixel in the virtual image VI2 of the parallax image display unit 12, and n is the number of the element lens (that is, the cylindrical lens 21) of the lenticular lens 20.
  • Equation 3 represents the phase difference between the viewpoints VP at the display position of the stereoscopic video SI. The absolute value of the phase difference represented by Equation 3 should be set smaller than 1: if the absolute value of the phase difference is 1, the phases match between the viewpoints VP entering the pupil of one eyeball 5. From this requirement, the condition of the following Expression 4 is derived.
  • Furthermore, the condition of the following Expression 5 may be satisfied. In the present embodiment, the viewpoint interval Pe, the pitch Pd, and the distance L are set so as to satisfy Expression 5.
  • The display light of the parallax image displayed on the display screen 13a is refracted by the cylindrical lenses 21 according to geometrical optics, and is condensed and imaged at a position corresponding to the optical power of the cylindrical lenses 21.
  • However, if the pitch Pd is not set appropriately, the array of cylindrical lenses 21 functions as a diffraction grating, and the display light subjected to this diffraction acquires an intensity distribution that spreads on the viewing area EB side (the hatched portion in FIG. 6). Then, not only does the parallax image become blurred, but the occupant may also perceive the virtual image VI1 itself as being formed at a closer position.
  • Since L is usually set to about 1 m, the pitch Pd is preferably set to 1 mm or more. A stricter condition is obtained as follows.
  • The centroid wavelength of the display light when the parallax image display unit 12 displays white (that is, when the red, green, and blue sub-pixels of each pixel are all lit) is defined as λg.
  • The centroid wavelength λg is the value obtained by weighting each wavelength by its emission intensity. Based on this centroid wavelength λg, it is preferable that the following Equation 6 be satisfied.
  • Equation 6 is based on the spot diameter at the condensing position when a Gaussian beam of wavelength λg is condensed at an F-number of L/Pd. The spot diameter here is the width at which the light intensity becomes 1/e² of the peak intensity. That is, Equation 6 means that, assuming the light emitted from the lenticular lens 20 is condensed at the viewing area EB, the spot diameter becomes smaller than the pitch Pd.
  • From Equation 6, the following condition of Equation 7 is obtained as the setting condition for the pitch Pd of the arrangement of the cylindrical lenses 21 in the virtual image VI1 of the lenticular lens 20.
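  • Equations 6 and 7 are not reproduced in this text; as an editorial reconstruction (an assumption, not the published formulas) consistent with the description above, a Gaussian beam of wavelength λg condensed at an F-number of L/Pd has a 1/e² spot diameter of (4/π)·λg·(L/Pd), so one plausible form is

        \[
            \frac{4\,\lambda_g\,L}{\pi\,P_d} \;<\; P_d
            \qquad\Longleftrightarrow\qquad
            P_d \;>\; 2\sqrt{\frac{\lambda_g\,L}{\pi}} .
        \]

    With λg ≈ 550 nm and L ≈ 1 m, the right-hand side evaluates to roughly 0.84 mm, which is consistent with the rough guideline above that Pd be set to 1 mm or more.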
  • the parallax image display unit 12 displays a parallax image corresponding to each parallax in the left-right direction, that is, the horizontal direction based on the occupant.
  • the display light emitted from the parallax image display unit 12 is guided by the lenticular lens 20 so as to form a plurality of viewpoints VP arranged in the left-right direction (horizontal direction of the occupant) in the viewing area EB.
  • Since the stereoscopic video SI is displayed at a position different from the virtual image VI1 of the lenticular lens 20, motion parallax, convergence, and binocular parallax in the left-right direction (the occupant's horizontal direction) are ensured in the stereoscopic video SI.
  • The feedback control unit 32 can acquire the position of the eyeball 5 from the head information detection unit 7.
  • the head information detection unit 7 is realized by, for example, a driver status monitor (hereinafter, DSM) mounted on the vehicle 1.
  • the feedback control unit 32 corrects each parallax image based on the acquired position of the occupant's eyeball 5, particularly, a component in the vertical direction (vertical direction of the occupant).
  • Each of the corrected parallax images is displayed on the display screen 13a by the parallax image control unit 31.
  • The position of the occupant's eyeball 5 detected by the head information detection unit 7 may be calculated from feature-point recognition information of an image of the occupant's head captured by the DSM. For example, the position of the eyeball 5 may be detected by adding information about the average human interpupillary distance to the head center position calculated from the feature points of the head image.
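  • As an editorial illustration only (a minimal sketch; the names, the coordinate frame, and the average interpupillary distance value are assumptions introduced here, not the DSM's actual interface), the eye-position estimate described above can be expressed as:

        # Python sketch: place the two pupils symmetrically about the detected head
        # centre along the lateral axis, using an average interpupillary distance.
        from dataclasses import dataclass

        AVERAGE_IPD_MM = 63.0  # assumed average adult interpupillary distance

        @dataclass
        class Point3D:
            x: float  # lateral (left-right) position, mm
            y: float  # vertical (up-down) position, mm
            z: float  # longitudinal (front-rear) position, mm

        def estimate_eye_positions(head_center: Point3D) -> tuple:
            half = AVERAGE_IPD_MM / 2.0
            left = Point3D(head_center.x - half, head_center.y, head_center.z)
            right = Point3D(head_center.x + half, head_center.y, head_center.z)
            return left, right

        left_eye, right_eye = estimate_eye_positions(Point3D(0.0, 1250.0, 800.0))
        print(left_eye, right_eye)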
  • In the first embodiment, each numerical value is set as shown in FIG. 7.
  • The pupil diameter varies depending on the brightness of the surrounding environment and differs between individuals. However, if Pe is set to 700 μm, for example, about six viewpoints VP can be placed within one pupil when the pupil diameter is 4 mm on average, and about three viewpoints VP can still be placed within one pupil even if the surroundings become bright and the pupil diameter shrinks to 2 mm.
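  • As a quick arithmetic check of these figures (an editorial worked example, not from the original), with Pe = 0.7 mm:

        \[
            \frac{4\ \text{mm}}{0.7\ \text{mm}} \approx 5.7,
            \qquad
            \frac{2\ \text{mm}}{0.7\ \text{mm}} \approx 2.9,
        \]

    so up to about six viewpoint positions can fall within a 4 mm pupil and up to about three within a 2 mm pupil, in line with the values stated above.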
  • FIG. 8 shows a graph in which the degree of dispersion of light rays at each viewpoint VP when the display distance Z of the stereoscopic image SI is changed under the conditions of the first embodiment is plotted.
  • Each solid line in FIG. 8 indicates the position of one pixel corresponding to one viewpoint VP. However, the position is defined by a direction in which the one pixel is viewed with reference to the center of the pupil (this is called a viewing angle in the present embodiment).
  • According to the first embodiment described above, the viewpoints VP divided by the lenticular lens 20 as the viewpoint dividing unit are arranged in the visual recognition area EB. Since the phases of the two or more viewpoints VP entering the pupil of one eyeball 5 of the observer are prevented from matching, the virtual light rays obtained by extending, in the opposite direction, the light rays of the display light reaching each viewpoint VP are separated from each other.
  • The positions of the pixels recognized from the respective viewpoints VP entering one pupil are shifted from each other, so the number of pixels substantially recognized by the observer is secured according to the number of viewpoints VP. Therefore, the resolution of the stereoscopic video SI can be increased. As described above, high visibility of the virtual image in stereoscopic viewing can be realized.
  • In addition, since Pd is 1 mm or more, the condition of Expression 4 above can easily be established even when the virtual image VI1 of the lenticular lens 20 is set at a near position, so the optical paths of the HUD device 100 can be arranged compactly.
  • the effect of diffraction by the cylindrical lens 21 as the viewpoint splitting element is reduced, and the phenomenon in which the stereoscopic image SI is blurred can be suppressed, so that the visibility of the virtual image in the stereoscopic view can be improved.
  • Further, since the pitch Pd of the cylindrical lenses 21 is larger than the diffraction-limited spot of the display light, diffraction by the cylindrical lenses 21 can be reduced further. For this reason, the phenomenon in which the stereoscopic video SI is blurred can be suppressed, and the visibility of the virtual image in stereoscopic viewing can be improved.
  • the cylindrical lens 21 extending along the vertical direction of the vehicle 1 in the virtual image VI1 of the lenticular lens 20 is employed as the viewpoint dividing element.
  • the viewpoints VP are arranged in the left-right direction of the vehicle 1, in other words, in the horizontal direction of the observer, so that binocular parallax can be secured.
  • the HUD device 100 further includes the feedback control unit 32 that corrects the parallax image based on the position of the observer's eye 5.
  • Further, the pitch of the cylindrical lenses 21 in the actual lenticular lens 20 is modulated so that the pitch Pd in the virtual image VI1 is equalized. Since the pitch Pd in the virtual image VI1 is equalized, the inequality of Expression 4 can easily be established over a wide range of the virtual image VI1, so a high resolution can be achieved over a wide range of the stereoscopic video SI.
  • The lenticular lens 20 may have an optical power for adjusting the imaging position so that the virtual image VI2 of the parallax image display unit 12 is formed at a distance of 3 m or more and 5 m or less from the viewing area EB. In that case, the position of the virtual image VI2 of the parallax image display unit 12 can be brought closer to the display position of the stereoscopic video SI, whose resolution is increased in the present embodiment in the range of 3 m ≤ Z ≤ 5 m. The delay in recognition time can also be suppressed, so the visibility of the virtual image in stereoscopic viewing can be made even higher.
  • the second embodiment is a modification of the first embodiment.
  • the second embodiment will be described focusing on the differences from the first embodiment.
  • In the second embodiment, a plate-shaped microlens array 220 in which a plurality of lens elements 221 are arranged two-dimensionally (in other words, in two directions) is employed as the viewpoint dividing unit.
  • the microlens array 220 is formed of, for example, glass or a synthetic resin, and has a light transmitting property. It is arranged on the optical path of the display light, and is arranged, for example, so as to be in contact with the display screen 13a, so that it is configured integrally with the image display panel 13.
  • As the display light is projected onto the projection unit 3a, a virtual image VI1 of the microlens array 220 is formed on the opposite side of the viewing area EB across the projection unit 3a, that is, in the space outside the vehicle ahead of the windshield 3.
  • the plurality of lens elements 221 are arranged in the virtual image VI1 of the microlens array 220 so as to be arranged in two directions, that is, the vertical direction and the horizontal direction. Since the virtual image VI1 of the microlens array 220 is formed by the reflection at the inclined projection unit 3a, the lens elements 221 as a real object are arranged along two directions of the front-back direction and the left-right direction. In FIG. 9, only a part of the lens element 221 is denoted by a reference numeral.
  • The pitch of the array of the lens elements 221 in each direction is formed so as to be equalized, and more preferably equal, in the virtual image VI1 of the microlens array 220. That is, the pitch of the actual lens elements 221 in each direction is modulated in consideration of the curved shape of the projection unit 3a. More specifically, in consideration of the distortion that may occur in the virtual image VI1 of the microlens array 220 due to the reflection of the display light at the projection unit 3a, the pitch of the array of the actual lens elements 221 is set in advance so as to, for example, increase in the left-right direction from the center of the microlens array 220 and increase in the front-rear direction from the center.
  • Each lens element 221 has a surface 221a on the display screen 13a side formed in a common plane shape over the entire microlens array 220, and an opposite surface 221b formed in a convex shape curved in each vertical section including the arrangement direction thereof. Has formed.
  • the convex shape means various shapes such as a spherical shape, a rotationally symmetric aspherical shape, and a toroidal surface shape.
  • a corresponding image area CGA on the display screen 13a corresponding to each lens element 221 is set.
  • Each facing image area CGA is virtually set as an area facing the paired lens element 221, and is a rectangular area corresponding to the outer peripheral contour of the lens element 221.
  • In the second embodiment, each facing image area CGA is divided into parallax image areas PGA in two directions, the front-rear direction and the left-right direction.
  • the viewpoints VP in the viewing area EB are arranged not only in the left-right direction but also in the up-down direction (that is, arranged two-dimensionally). Therefore, even if the position of the occupant's eyeball 5 is not fed back to each parallax image by the feedback control unit 32 as in the first embodiment, parallax can be generated in both the horizontal direction and the vertical direction of the occupant. Therefore, the feedback control unit 32 is not provided in the second embodiment. In FIG. 9, the arrangement of the viewpoints VP in the vertical direction is omitted.
  • Also in the second embodiment, the pitch Pd of the arrangement of the lens elements 221 in the virtual image VI1 of the microlens array 220 is set so as to satisfy the condition of Expression 5.
  • the lens elements 221 arranged along the vertical direction and the horizontal direction of the vehicle 1 are adopted as viewpoint dividing elements.
  • The viewpoints VP are arranged in two directions, the left-right and up-down directions of the vehicle 1, in other words, the horizontal and vertical directions of the observer, so that the visibility of the stereoscopic video SI can be kept high even if the observer's eyeball 5 moves in either direction.
  • the third embodiment is a modification of the second embodiment.
  • the third embodiment will be described focusing on the differences from the first embodiment.
  • the pitch of the lens elements 321 in the actual microlens array 320 is set at substantially equal intervals without being modulated.
  • In FIG. 10, only some of the lens elements 321 are denoted by a reference numeral.
  • an additional lens 323 is provided separately from the microlens array 320.
  • the additional lens 323 is formed of, for example, glass or a synthetic resin, and has a light transmitting property.
  • the additional lens 323 is arranged between the micro lens array 320 and the projection unit 3a on the optical path, more specifically, close to the surface of the micro lens array 320 on the projection unit 3a side.
  • the additional lens 323 has a first optical surface 323a facing the microlens array 320 and a second optical surface 323b facing the projection unit 3a, and is a single lens.
  • the first optical surface 323a is formed in a planar shape
  • the second optical surface 323b is formed in a curved shape protruding toward the projection unit.
  • The second optical surface 323b of the additional lens 323 is a free-form surface conforming to the shape of the projection unit 3a, so that the distortion of the virtual image VI1 is corrected and the pitch of the lens elements 321 is equalized in the virtual image VI1 of the microlens array 320.
  • the additional lens 323 as a distortion correcting lens for correcting distortion is provided so that the pitch of the viewpoint dividing elements is equalized in the virtual image of the viewpoint dividing unit.
  • Since the pitch Pd in the virtual image VI1 is equalized, the inequality of Expression 4 can easily be established over a wide range of the virtual image VI1, so a high resolution can be achieved over a wide range of the stereoscopic video SI.
  • The parallax image may be not a color image but a monochrome image of, for example, red, green, or blue. In that case, the centroid wavelength λg can be regarded as the peak wavelength of the display light.
  • the parallax image display unit 12 is not limited to the liquid crystal display, and various displays such as an organic EL display can be adopted.
  • The display distance Z of the stereoscopic video SI does not always need to be set in the range of 3 m ≤ Z ≤ 5 m; the stereoscopic video SI can also be displayed at other distances.
  • the condition of Expression 4 may be satisfied only in one of the two directions in which the lens elements 221 and 321 are arranged.
  • The virtual image display device can be applied to various moving bodies such as an airplane or a ship, or to a non-moving housing (for example, a game machine housing).

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Optical Elements Other Than Lenses (AREA)
  • Instrument Panels (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The present invention relates to a virtual image display device for displaying a virtual image that is visually recognizable from a visual recognition area (EB) by reflecting display light at a projection unit (3a), comprising: a viewpoint dividing unit (20, 220, 320) that includes a plurality of viewpoint dividing elements (21, 221, 321) for dividing the viewpoints so that a plurality of viewpoints (VP) are arranged in the visual recognition area; and a parallax image display unit (12) that includes a screen (13a) for emitting the display light to be transmitted through the viewpoint dividing unit, and that displays a parallax image associated with each of the viewpoints in each area (CGA) on the screen individually corresponding to each viewpoint dividing element. Formula AA holds, where Pd represents the arrangement pitch of the viewpoint dividing elements in a virtual image (VI1) of the viewpoint dividing unit, Pe represents the arrangement interval between the viewpoints in the visual recognition area, and L represents the distance from the visual recognition area to the virtual image of the viewpoint dividing unit.
PCT/JP2019/026011 2018-08-08 2019-07-01 Virtual image display device WO2020031549A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-149746 2018-08-08
JP2018149746A JP7127415B2 (ja) 2018-08-08 2018-08-08 虚像表示装置 (Virtual image display device)

Publications (1)

Publication Number Publication Date
WO2020031549A1 true WO2020031549A1 (fr) 2020-02-13

Family

ID=69413480

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/026011 WO2020031549A1 (fr) 2018-08-08 2019-07-01 Virtual image display device

Country Status (2)

Country Link
JP (1) JP7127415B2 (fr)
WO (1) WO2020031549A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20230018075A (ko) 2021-07-29 2023-02-07 Samsung Electronics Co., Ltd. Apparatus and method for calibrating a parallax optical element

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005326610A (ja) * 2004-05-14 2005-11-24 Matsushita Electric Ind Co Ltd 三次元画像再生装置
JP2009008722A (ja) * 2007-06-26 2009-01-15 Univ Of Tsukuba 3次元ヘッドアップディスプレイ装置
JP2011085790A (ja) * 2009-10-16 2011-04-28 Seiko Epson Corp 電気光学装置及び電子機器
WO2015145934A1 (fr) * 2014-03-27 2015-10-01 パナソニックIpマネジメント株式会社 Appareil d'affichage d'image virtuelle, système d'affichage tête haute, et véhicule
JP2016065908A (ja) * 2014-09-24 2016-04-28 日本精機株式会社 ヘッドアップディスプレイ装置
US20180052309A1 (en) * 2016-08-19 2018-02-22 Electronics And Telecommunications Research Institute Method for expanding field of view of head-mounted display device and apparatus using the same


Also Published As

Publication number Publication date
JP2020024340A (ja) 2020-02-13
JP7127415B2 (ja) 2022-08-30


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19848007

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19848007

Country of ref document: EP

Kind code of ref document: A1