WO2023216670A1 - Stereoscopic display device and vehicle (立体显示装置和交通工具) - Google Patents

Stereoscopic display device and vehicle

Info

Publication number
WO2023216670A1
WO2023216670A1 · PCT/CN2023/076650 · CN2023076650W
Authority
WO
WIPO (PCT)
Prior art keywords
imaging
light
display device
image
image information
Prior art date
Application number
PCT/CN2023/076650
Other languages
English (en)
French (fr)
Inventor
邓宁
贺俊妮
邹冰
常泽山
黄志勇
常天海
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2023216670A1


Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/26Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
    • G02B30/33Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving directional light or back-light sources
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/34Stereoscopes providing a stereoscopic pair of separated images corresponding to parallactically displaced views of the same object, e.g. 3D slide viewers
    • G02B30/35Stereoscopes providing a stereoscopic pair of separated images corresponding to parallactically displaced views of the same object, e.g. 3D slide viewers using reflective optical elements in the optical path between the images and the observer

Definitions

  • the present application relates to the field of display, and in particular, to a three-dimensional display device and a vehicle including the three-dimensional display device.
  • Stereoscopic display provides image information with different parallaxes to the viewer's two eyes. Compared with 2D display, stereoscopic display can give viewers a better experience.
  • Naked-eye (autostereoscopic) display technology is a stereoscopic display solution in which users do not need to wear polarized glasses or shutter glasses.
  • the stereoscopic display device outputs two channels of imaging light to the user's left and right eyes respectively.
  • the stereoscopic display device can output two channels of imaging light in a time-sharing manner.
  • During a first time period, one channel of imaging light output by the stereoscopic display device irradiates one eye of the user.
  • During a second time period, the other channel of imaging light output by the stereoscopic display device irradiates the other eye of the user.
  • The first time period and the second time period alternate.
  • the two channels of imaging light carry image information with different parallaxes, thereby providing users with three-dimensional visual enjoyment.
  • Stereoscopically displayed images require a larger format. Therefore, when the distance between the user and the stereoscopic display device is short, the user experience is degraded.
  • the present application provides a stereoscopic display device and a vehicle, which can enlarge the stereoscopically displayed image through a curved mirror or lens. Therefore, when the distance between the user and the stereoscopic display device is relatively short, the user experience can be improved.
  • a first aspect of this application provides a three-dimensional display device.
  • the stereoscopic display device includes an image generating component and a curved mirror.
  • the image generation component is used to generate two channels of imaging light.
  • the two imaging lights carry image information with different parallaxes.
  • The curved mirror is used to reflect the two channels of imaging light. There is an included angle between the two reflected channels of imaging light.
  • the focal length of the curved mirror is f.
  • The distance between the image plane of the image generation component and the curved mirror is d, and d is less than f.
  • Because d is smaller than f, the curved mirror can magnify the stereoscopically displayed image. Therefore, when the distance between the user and the stereoscopic display device is relatively short, the user experience can be improved. Moreover, compared with a lens, a curved mirror can be smaller in volume, thereby reducing the volume of the stereoscopic display device.
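As a worked numeric check of the d < f condition above, the sketch below applies the standard curved-mirror imaging relation 1/d − 1/D = 1/f (textbook optics, assumed here rather than quoted from this publication) to obtain the virtual-image distance D and the resulting magnification:

```python
def virtual_image_distance(f: float, d: float) -> float:
    """Virtual-image distance D behind a curved mirror of focal length f
    when the image plane sits at distance d, assuming the standard mirror
    relation 1/d - 1/D = 1/f. A magnified virtual image needs 0 < d < f."""
    if not 0 < d < f:
        raise ValueError("magnified virtual image requires 0 < d < f")
    return f * d / (f - d)


def magnification(f: float, d: float) -> float:
    """Lateral magnification of the virtual image, D / d."""
    return virtual_image_distance(f, d) / d


# Example: f = 400 mm (the bound stated later for the first aspect),
# image plane at d = 300 mm:
# D = 400 * 300 / (400 - 300) = 1200 mm, i.e. a 4x magnified virtual image.
```

The same relation carries over to the lens of the second aspect, since a thin lens with the object inside its focal length also forms a magnified virtual image.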
  • the distance between the virtual image formed by the two reflected imaging lights and the curved mirror is D.
  • D satisfies the following formula: D = f·d / (f − d).
  • There is an included angle α1 between the two imaging lights after being reflected by the curved mirror.
  • S is the distance between the receiving position of the two imaging lights and the curved mirror.
  • the value of E ranges from 53 mm to 73 mm.
  • w is the width at the receiving position of at least one of the two imaging lights after being reflected by the curved mirror.
  • There is an included angle α2 between the two imaging lights before being reflected by the curved mirror.
  • S is the distance between the receiving position of the two imaging lights and the curved mirror.
  • E ranges from 53 mm to 73 mm.
  • w is the width at the receiving position of at least one of the two imaging lights after being reflected by the curved mirror.
  • The divergence angle of each of the two imaging lights before being reflected by the curved mirror is θ.
  • w satisfies the following constraint: w is less than 73 mm.
  • the image generating component includes a first light source component and a pixel component.
  • The first light source component is used to output, in a time-sharing manner, the first light beam and the second light beam to the pixel component in different emission directions.
  • the pixel component is used to separately modulate the first light beam and the second light beam using different image information to generate two paths of imaging light.
  • the first light source component includes a first light source device and a second light source device.
  • the first light source device and the second light source device are used to alternately output the first light beam and the second light beam in time division.
  • the image generation component further includes a timing control unit.
  • the timing control unit is used to control the first light source device and the second light source device to alternately output the first light beam and the second light beam in time division.
  • the timing control unit is also used to control the pixel component to use different image information to modulate the first light beam and the second light beam in a time-sharing manner.
  • the two channels of imaging light include a first channel of imaging light and a second channel of imaging light.
  • the image generating component includes a second light source component, a pixel component and a lens array.
  • the second light source component is used to output the third light beam to the pixel component.
  • the pixel component is used to modulate the third light beam according to different image information to generate the first imaging light and the second imaging light.
  • The lens array is used to transmit the first imaging light and the second imaging light at different angles.
  • the pixel component includes a first pixel and a second pixel.
  • the first pixel is used to modulate the third light beam according to the first image information to generate a first path of imaging light.
  • The second pixel is used to modulate the third light beam according to the second image information to generate a second path of imaging light. Due to process errors in the curved mirror, the zoom factor and imaging position of the image observed by the user will deviate from their ideal values. Such display deviations can cause dizziness and other physiological discomfort for the user, reducing the user experience. To this end, the image information can be preprocessed to compensate for the display deviations.
  • the third beam includes a first sub-beam and a second sub-beam.
  • the first pixel is used to modulate the third light beam according to the first image information and generate the first imaging light.
  • the first pixel is used to modulate the first sub-beam according to the first image information to generate the first imaging light.
  • the second pixel is used to modulate the third light beam according to the second image information and generate the second imaging light.
  • the second pixel is used to modulate the second sub-beam according to the second image information to generate the second imaging light.
  • the second light source component is used to simultaneously generate the first sub-beam and the second sub-beam.
  • the image information of different disparities includes first image information and second image information.
  • the stereoscopic display device also includes a processor. Wherein, the processor can be set inside the image generation component, or can be set outside the image generation component.
  • the processor is configured to preprocess the third image information to obtain the first image information.
  • the processor is used to preprocess the fourth image information to obtain the second image information.
  • the processor is further configured to obtain first coordinate information of the first position and/or second coordinate information of the second position.
  • One of the two imaging lights is illuminated to the first position.
  • the other imaging light of the two imaging lights is illuminated to the second position.
  • the processor is configured to preprocess the third image information including: the processor is configured to preprocess the third image information according to the first coordinate information.
  • the processor is configured to preprocess the fourth image information including: the processor is configured to preprocess the fourth image information according to the second coordinate information.
  • When both eyes of the user receive the same image information, the processor may preprocess only one piece of image information.
  • the processor may obtain coordinate information of the middle position of the user's eyes.
  • the processor preprocesses the image information according to the coordinate information of the intermediate position.
  • the processor can also preprocess different image information respectively according to the coordinate information of the intermediate position.
  • the coordinate information of the intermediate position corresponds to two correction parameters.
  • the processor can preprocess different image information respectively according to two correction parameters.
  • By using coordinate information of different positions for preprocessing, the accuracy of the preprocessing can be improved, thereby improving the user experience.
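The coordinate-based preprocessing described above can be sketched as a small lookup-plus-warp step. Everything below is illustrative: the calibration table, the linear scale "warp", and all function names are assumptions, not taken from the publication.

```python
from typing import Dict, List, Tuple

# Hypothetical calibration: mid-eye x coordinate (mm) -> a pair of
# correction parameters, one per channel of image information.
CALIBRATION: Dict[int, Tuple[float, float]] = {
    0: (1.02, 0.98),
    50: (1.05, 0.95),
}


def nearest_key(x: float) -> int:
    """Calibration entry closest to the observed mid-eye coordinate."""
    return min(CALIBRATION, key=lambda k: abs(k - x))


def preprocess(image: List[float], scale: float) -> List[float]:
    # Stand-in for a real geometric pre-distortion that would compensate
    # the curved mirror's process error.
    return [pixel * scale for pixel in image]


def preprocess_pair(left_img, right_img, mid_eye_x):
    """One mid-eye coordinate selects two correction parameters, and each
    image information channel is preprocessed with its own parameter."""
    left_corr, right_corr = CALIBRATION[nearest_key(mid_eye_x)]
    return preprocess(left_img, left_corr), preprocess(right_img, right_corr)
```

When per-eye coordinates are available, the same structure applies with two separate lookups, one keyed on each eye position, as the surrounding text suggests.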
  • f is less than 400 mm.
  • the present application can reduce the volume of the stereoscopic display device.
  • the image generating component includes a projector and a diffusion screen.
  • the projector is used to generate two channels of imaging light.
  • the diffusion screen is used to receive two channels of imaging light, diffuse the two channels of imaging light, and output the two channels of diffused imaging light.
  • The curved mirror is used to reflect the two diffused channels of imaging light.
  • a second aspect of the present application provides a three-dimensional display device.
  • the stereoscopic display device includes an image generating component and a lens.
  • the image generation component is used to generate two channels of imaging light.
  • the two imaging lights carry image information with different parallaxes.
  • The lens is used to transmit the two channels of imaging light. There is an included angle between the two transmitted channels of imaging light.
  • The focal length of the lens is f.
  • the distance between the image plane of the image generating component and the lens is d. d is less than f.
  • Because d is smaller than f, the lens can magnify the stereoscopically displayed image. Therefore, when the distance between the user and the stereoscopic display device is relatively short, the user experience can be improved.
  • The distance between the virtual image formed by the two transmitted imaging lights and the lens is D.
  • D satisfies the following formula: D = f·d / (f − d).
  • There is an included angle α1 between the two imaging lights after being transmitted by the lens.
  • S is the distance between the receiving position of the two imaging lights and the lens.
  • E ranges from 53 mm to 73 mm.
  • w is the width, at the receiving position, of at least one of the two imaging lights after being transmitted by the lens.
  • There is an included angle α2 between the two imaging lights before being transmitted by the lens.
  • S is the distance between the receiving position of the two imaging lights and the lens.
  • E ranges from 53 mm to 73 mm.
  • w is the width at the receiving position of at least one of the two imaging lights after being transmitted by the lens.
  • The divergence angle of each of the two imaging lights before being transmitted by the lens is θ.
  • w satisfies the following constraint: w is less than 73 mm.
  • the image generating component includes a first light source component and a pixel component.
  • the first light source component is used to output the first light beam and the second light beam in different emission directions to the pixel component in a time-sharing manner.
  • the pixel component is used to modulate the first light beam and the second light beam respectively according to different image information to generate two paths of imaging light.
  • the first light source component includes a first light source device and a second light source device.
  • the first light source device and the second light source device are used to alternately output the first light beam and the second light beam in time division.
  • the image generation component further includes a timing control unit.
  • the timing control unit is used to control the first light source device and the second light source device to alternately output the first light beam and the second light beam in time division.
  • the timing control unit is also used to control the pixel component to use different image information to modulate the first light beam and the second light beam in a time-sharing manner.
  • the pixel component includes a first pixel and a second pixel.
  • the first pixel is used to modulate the first light beam to obtain the first imaging light of the two imaging lights.
  • the second pixel is used to modulate the second light beam to obtain the second imaging light of the two imaging lights.
  • the two channels of imaging light include a first channel of imaging light and a second channel of imaging light.
  • the image generating component includes a second light source component, a pixel component and a lens array.
  • the second light source component is used to output the third light beam to the pixel component.
  • the pixel component is used to modulate the third light beam using different image information to generate the first imaging light and the second imaging light.
  • the lens array is used to transmit the first imaging light and the second imaging light at different angles.
  • the pixel component includes a first pixel and a second pixel.
  • the first pixel is used to modulate the third light beam according to the first image information to generate a first path of imaging light.
  • the second pixel is used to modulate the third light beam according to the second image information to generate a second path of imaging light.
  • the third beam includes a first sub-beam and a second sub-beam.
  • the first pixel is used to modulate the first sub-beam according to the first image information to generate a first path of imaging light.
  • the second pixel is used to modulate the second sub-beam according to the second image information to generate a second path of imaging light.
  • the second light source component is used to simultaneously generate the first sub-beam and the second sub-beam.
  • the image information of different disparities includes first image information and second image information.
  • the stereoscopic display device also includes a processor.
  • the processor is used to preprocess the third image information to obtain the first image information.
  • the processor is used to preprocess the fourth image information to obtain the second image information.
  • the processor is further configured to obtain first coordinate information of the first position and/or second coordinate information of the second position.
  • One of the two imaging lights is illuminated to the first position.
  • the other imaging light of the two imaging lights is illuminated to the second position.
  • the processor is configured to preprocess the third image information including: the processor is configured to preprocess the third image information according to the first coordinate information.
  • the processor is configured to preprocess the fourth image information including: the processor is configured to preprocess the fourth image information according to the second coordinate information.
  • f is less than 300 mm.
  • the image generating component includes a projector and a diffusion screen.
  • the projector is used to generate two channels of imaging light.
  • the diffusion screen is used to receive two channels of imaging light, diffuse the two channels of imaging light, and output the two channels of diffused imaging light.
  • The lens is used to transmit the two diffused channels of imaging light.
  • the third aspect of this application provides a vehicle.
  • the vehicle includes a three-dimensional display device as described in the aforementioned first aspect, any optional manner of the first aspect, the second aspect, or any optional manner of the second aspect.
  • the stereoscopic display device is installed on the vehicle.
  • Figure 1 is a first structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 2 is a schematic diagram of the first light path projection of the three-dimensional display device provided by the embodiment of the present application.
  • Figure 3 is a second structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 4 is a schematic diagram of the second optical path projection of the three-dimensional display device provided by the embodiment of the present application.
  • Figure 5 is a first structural schematic diagram of the image generation component provided by the embodiment of the present application.
  • Figure 6 is a second structural schematic diagram of the image generation component provided by the embodiment of the present application.
  • Figure 7a is a third structural schematic diagram of the image generation component provided by the embodiment of the present application.
  • Figure 7b is a fourth structural schematic diagram of the image generation component provided by the embodiment of the present application.
  • Figure 8 is a schematic structural diagram of a pixel component and a lens array provided by an embodiment of the present application.
  • Figure 9a is a third structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 9b is a fourth structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 10a is a fifth structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 10b is a sixth structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 11 is a seventh structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 12 is an eighth structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 13 is a schematic diagram of the third optical path projection of the three-dimensional display device provided by the embodiment of the present application.
  • Figure 14 is a circuit schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 15 is a schematic structural diagram of a vehicle provided by an embodiment of the present application.
  • Figure 16 is a schematic diagram of a possible functional framework of the vehicle provided by the embodiment of the present application.
  • the present application provides a stereoscopic display device and a vehicle, which can enlarge the stereoscopically displayed image through a curved mirror or lens. Therefore, when the distance between the user and the stereoscopic display device is relatively short, the user experience can be improved.
  • first, second, etc. used in this application are only used for the purpose of distinguishing descriptions, and cannot be understood as indicating or implying relative importance, nor can they be understood as indicating or implying order.
  • reference numbers and/or letters are repeated in multiple drawings of the embodiments of this application. Repetition does not imply a strictly limiting relationship between the various embodiments and/or configurations.
  • the three-dimensional display device in this application may also be called a 3D display device.
  • Stereoscopic display devices are used in the field of projection technology.
  • directional backlight devices can be used to provide users with three-dimensional visual enjoyment.
  • stereoscopically displayed images require a larger format. Therefore, when the distance between the user and the stereoscopic display device is short, the user's experience is low.
  • FIG. 1 is a first structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • the stereoscopic display device 100 includes an image generating component 101 and a curved mirror 102 .
  • the image generating component 101 is used to generate two paths of imaging light.
  • each solid line connected to the image generating component 101 represents a path of imaging light.
  • the curved mirror 102 is used to reflect two paths of imaging light. There is an angle between the two reflected imaging lights. Therefore, the two reflected imaging lights can illuminate different locations. For example, one channel of imaging light irradiates the user's left eye, and another channel of imaging light irradiates the user's right eye.
  • the two channels of imaging light carry image (pattern) information with different parallaxes, thereby providing users with three-dimensional visual enjoyment.
  • the position of the human eye can be called the viewpoint.
  • the above-mentioned three-dimensional display device can provide multiple viewpoints for viewing by multiple people.
  • the image generating component 101 can produce multiple channels of imaging light for viewing by different people. This embodiment takes a viewpoint as an example, that is, the image generation component 101 generates two paths of imaging light, to illustrate the imaging process of the stereoscopic display device.
  • the focal length of the curved mirror 102 is f.
  • the distance between the image surface (display surface of the image) of the image generation component 101 and the curved mirror 102 is d.
  • d may be the furthest vertical distance between the curved mirror 102 and the image plane of the image generating component 101 .
  • d may be the straight-line distance between the center pixel of the image surface of the image generation component 101 and the target point on the curved mirror 102 .
  • The center pixel is one or more pixels at the center of the image plane.
  • the imaging light output by the central pixel irradiates the target point on the curved mirror 102 .
  • d is less than f.
  • The curved mirror 102 can magnify the virtual image. Therefore, when the distance between the user and the stereoscopic display device 100 is relatively close, the user can see the enlarged virtual image, thereby improving the user experience.
  • FIG. 2 is a first optical path projection schematic diagram of the three-dimensional display device provided by the embodiment of the present application.
  • the image generating component 101 is used to generate two paths of imaging light.
  • The divergence angle of each of the two imaging lights is θ.
  • the dotted line in Figure 2 represents one of the two imaging lights.
  • the solid line in Figure 2 represents the other imaging light among the two imaging lights.
  • the curved mirror 102 is used to reflect two paths of imaging light.
  • D can be obtained from f and d.
  • the two reflected imaging lights can illuminate different locations.
  • the position where the two imaging lights are illuminated is also called the receiving position of the two imaging lights, such as the user's eyes.
  • the interpupillary distance (pupillary distance) of both eyes is E.
  • the value range of E can be between 53 mm and 73 mm.
  • For example, E is 53 mm or 73 mm.
  • the distance between the eyes and the curved mirror 102 is S.
  • the width of each of the two reflected imaging lights is w.
  • w is related to S, θ, and D. According to the following formula 2, w can be obtained from S, θ, and D.
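Formula 2 itself is not reproduced in this text. Under a simple small-angle geometric assumption (purely illustrative, not the publication's formula: each ray bundle appears to diverge from the virtual image, a distance D behind the mirror, and travels a further S to the viewer), the width would behave roughly like:

```python
import math


def beam_width(D_mm: float, S_mm: float, theta_rad: float) -> float:
    """Illustrative estimate of the beam width w at the receiving position.
    Assumption (not the publication's formula 2): the bundle diverges at
    full angle theta from the virtual image and propagates D + S before
    reaching the viewer, so w ~ (D + S) * tan(theta)."""
    return (D_mm + S_mm) * math.tan(theta_rad)


# A width approaching E's maximum (73 mm) risks one channel covering
# both eyes, which motivates the w < 73 mm constraint stated earlier.
```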
  • If the width w of each imaging light is too large, one imaging light may cover both of the user's eyes, resulting in beam crosstalk.
  • w can be smaller than the maximum value of E.
  • the maximum value of E is 73 mm.
  • the value of S can be 0 mm or 5000 mm.
  • the reflected two imaging lights can be illuminated at appropriate locations, such as the user's eyes.
  • The distance M between the two reflected imaging lights is related to α1, D, and S. According to the following formula 3, M can be obtained from α1, D, and S.
  • The distance M between the two reflected imaging lights is related to α2, D, and S. According to the following formula 4, M can be obtained from α2, D, and S.
  • The relationship between M, E, and w can be set. Specifically, when the value of S is between 0 mm and 5000 mm, E − w ≤ M ≤ E + 2w. Here w can be w1 or w2, where w1 is the width of the first imaging light and w2 is the width of the second imaging light. In practical applications, w can be both w1 and w2; in that case, when the value of S is between 0 mm and 5000 mm, E − w1 ≤ M ≤ E + 2w1 and E − w2 ≤ M ≤ E + 2w2.
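The inequality above lends itself to a direct numeric check. The sketch below encodes the stated condition E − w ≤ M ≤ E + 2w as given (units in millimetres; function names are illustrative):

```python
def separation_ok(M: float, E: float, w: float) -> bool:
    """True when the separation M between the two reflected channels keeps
    each channel on its own eye, given interpupillary distance E and beam
    width w, per the stated condition E - w <= M <= E + 2w."""
    return E - w <= M <= E + 2 * w


def ok_for_all_E(M: float, w: float) -> bool:
    """Check the condition at both ends of the stated E range, 53-73 mm."""
    return all(separation_ok(M, E, w) for E in (53.0, 73.0))
```

For example, a separation M of 63 mm with a beam width of 10 mm satisfies the condition across the whole 53-73 mm interpupillary range.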
  • FIG. 3 is a second structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • the stereoscopic display device 100 includes an image generating component 101 and a lens 301 .
  • the image generating component 101 is used to generate two paths of imaging light.
  • each solid line connected to the image generating component 101 represents a path of imaging light.
  • Lens 301 is used to transmit two paths of imaging light. There is an angle between the two transmitted imaging lights. Therefore, the two transmitted imaging lights can illuminate different locations.
  • one channel of imaging light irradiates the user's left eye, and another channel of imaging light irradiates the user's right eye.
  • the two channels of imaging light carry image information with different parallaxes, thereby providing users with three-dimensional visual enjoyment.
  • the focal length of the lens is f.
  • the distance between the image plane of the image generating component 101 and the lens 301 is d.
  • d is less than f.
  • The lens 301 can magnify the virtual image. Therefore, when the distance between the user and the stereoscopic display device 100 is relatively short, the user experience can be improved.
  • the parameters of the two imaging lights are the same.
  • In this case, the divergence angles of the two imaging lights are both θ.
  • Alternatively, the divergence angle of the first imaging light is θ1.
  • The divergence angle of the second imaging light is θ2.
  • Two values of w can be obtained from θ1, θ2, and the aforementioned formula 1.
  • The two values of w include the width w1 of the first imaging light and the width w2 of the second imaging light. In order to reduce or avoid beam crosstalk, both w1 and w2 can be smaller than the maximum value of E.
  • FIG. 4 is a second optical path projection schematic diagram of the three-dimensional display device provided by the embodiment of the present application.
  • the image generating component 101 is used to generate two paths of imaging light.
  • The divergence angle of each of the two imaging lights is θ.
  • the dotted line in Figure 4 represents one of the two imaging lights.
  • the solid line in Figure 4 represents the other imaging light among the two imaging lights.
  • the lens 301 is used to transmit two channels of imaging light, and the propagation directions of the two channels of imaging light are deflected after transmission.
  • the distance between the virtual image formed by the transmitted two imaging lights and the lens 301 is D.
  • FIG. 5 is a first structural schematic diagram of an image generation component provided by an embodiment of the present application.
  • the image generation component 101 includes a first light source component 501 and a pixel component 502 .
  • the first light source component 501 can be a light emitting diode (LED) light source or a laser diode (LD) light source.
  • the first light source component 501 is used to output the first light beam and the second light beam in different emission directions to the pixel component 502 in a time-sharing manner.
  • the dotted line connected to the first light source assembly 501 represents the first light beam.
  • the solid line connected to the first light source assembly 501 represents the second light beam.
  • the pixel component 502 may be a liquid crystal display (LCD), liquid crystal on silicon (LCOS), digital micro-mirror device (DMD), etc. Pixel component 502 may be referred to as an image modulator.
  • the pixel component 502 is used to respectively modulate the first light beam and the second light beam using different image information to generate two paths of imaging light.
  • the two channels of imaging light include a first channel of imaging light and a second channel of imaging light.
  • the dotted line connected to the pixel component 502 represents the first path of imaging light.
  • the solid line connected to the pixel component 502 represents the second path of imaging light.
  • the first light source assembly 501 may include multiple light source devices.
  • Figure 6 is a second structural schematic diagram of an image generation component provided by an embodiment of the present application. As shown in FIG. 6 , the first light source assembly 501 includes a first light source device 505 and a second light source device 506 . Based on FIG. 5 , the image generation component 101 also includes a timing control unit 504 . The timing control unit 504 is used to control the first light source device 505 and the second light source device 506 to alternately output the first light beam and the second light beam in time division.
• the timing control unit 504 is also used to control the pixel component 502 to alternately display (load) images with different parallaxes in a time-sharing manner. For example, in the first time period, the timing control unit 504 controls the pixel component 502 to display the left-eye image. In the second time period, the timing control unit 504 controls the first light source device 505 to output the first light beam, and the pixel component 502 modulates the first light beam with the left-eye image to obtain the first path of imaging light. In the third time period, the timing control unit 504 controls the pixel component 502 to display the right-eye image.
• in the fourth time period, the timing control unit 504 controls the second light source device 506 to output the second light beam.
  • the pixel component 502 uses the image of the right eye to modulate the second light beam to obtain a second path of imaging light.
  • the first time period, the second time period, the third time period and the fourth time period are alternately distributed.
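The four-period sequence described above can be sketched as a simple scheduler. This is an illustrative sketch only; the event and function names (`timing_cycle`, `load_image`, `enable_source`) are assumptions, not taken from the patent.

```python
def timing_cycle(frames):
    """Return the time-division event sequence described above:
    period 1: load the left-eye image into the pixel component,
    period 2: first light source device outputs the first beam (first imaging light),
    period 3: load the right-eye image,
    period 4: second light source device outputs the second beam (second imaging light)."""
    events = []
    for _ in range(frames):
        events += [("load_image", "left"),
                   ("enable_source", "first_light_source"),
                   ("load_image", "right"),
                   ("enable_source", "second_light_source")]
    return events
```

Because the four periods are alternately distributed, repeating this cycle fast enough lets each eye perceive a continuous image with its own parallax.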
• the image generating component 101 may further include a beam control unit 503 located between the first light source component 501 and the pixel component 502.
  • the beam control unit 503 may be a Fresnel screen, a cylindrical lens or a lens array, etc.
  • the beam control unit 503 is used to change the divergence angle of the first light beam and/or the second light beam, thereby improving the light utilization efficiency of the first light source assembly 501 and increasing the brightness of the generated imaging light, thus improving the brightness of the stereoscopic display device.
  • FIG. 7a is a third structural schematic diagram of an image generation component provided by an embodiment of the present application.
  • the image generation component 101 includes a second light source component 701, a pixel component 502 and a lens array 702.
  • the second light source component 701 may be an LED light source or an LD light source, or the like.
  • the second light source component 701 is used to output the third light beam to the pixel component 502 .
  • the solid line connected to the second light source assembly 701 represents the third light beam.
  • the pixel component 502 is used to modulate the third light beam according to different image information to generate a first path of imaging light and a second path of imaging light output from different directions.
  • the first imaging light and the second imaging light have certain directionality and divergence angle.
  • the dotted line connected to the pixel component 502 represents the first path of imaging light.
  • the solid line connected to the pixel component 502 represents the second path of imaging light.
• the pixel component 502 may include left-eye pixels and right-eye pixels. The left-eye pixels are used to display left-eye images, and the right-eye pixels are used to display right-eye images. The left-eye pixels modulate the third light beam and emit the first imaging light, and the right-eye pixels modulate the third light beam and emit the second imaging light.
  • the imaging light emitted by the pixel component 502 is input to the lens array 702.
• the lens array 702 is used to transmit the first imaging light and the second imaging light at different angles, so that the first imaging light and the second imaging light output by the lens array 702 have different output (propagation) directions, and the first imaging light and the second imaging light propagate to the person's left and right eyes respectively.
  • the dotted line connected to the lens array 702 represents the first path of imaging light.
  • the solid line connected to the lens array 702 represents the second imaging light.
  • the distance between the image surface of the image generation component 101 and the curved mirror 102 is d.
  • the image surface of the image generating component 101 may be a pixel component or a diffusion screen.
• the two light beams output by the second light source component 701 are light beams that do not carry image information.
  • the image plane of the image generation component 101 is the pixel component 502.
  • FIG. 7b is a fourth structural schematic diagram of an image generation component provided by an embodiment of the present application. As shown in Figure 7b, the image generation component 101 includes a projector 703, a diffusion screen 704 and a lens array 702.
  • the projector 703 outputs a third light beam, and the third light beam carries image information.
  • Diffusion screen 704 is a pixelated device.
  • the diffusion screen 704 is used to amplify the divergence angle of the third light beam output by the projector 703.
  • the third light beam can carry image information with different parallaxes in a time-sharing manner, and the diffusion screen 704 can output two channels of imaging light, and the two channels of imaging light carry image information with different parallaxes.
  • the lens array 702 is used to transmit two imaging lights at different angles.
  • the two channels of imaging light include a first channel of imaging light and a second channel of imaging light.
  • the dotted line connected to the lens array 702 represents the first path of imaging light.
  • the solid line connected to the lens array 702 represents the second imaging light.
  • the image surface of the image generation component 101 is a diffusion screen 704.
  • FIG. 8 is a schematic structural diagram of a pixel component and a lens array provided by an embodiment of the present application.
  • the pixel component 502 includes N pixel groups 801 .
  • N is an integer greater than 0.
  • Each pixel group 801 includes a first pixel and a second pixel.
  • the first pixel is used to modulate the third light beam and output a first sub-imaging light.
  • the second pixel is used to modulate the third light beam and output a second sub-imaging light.
  • the first sub-imaging light and the second sub-imaging light have certain directionality and divergence angle.
  • the dotted line connected to the first pixel represents the first sub-imaging light.
  • the solid line connected to the second pixel represents the second sub-imaging light.
  • Lens array 702 includes N lenses 802 .
  • Each lens 802 is used to transmit a first sub-imaging light and a second sub-imaging light.
  • Each lens 802 is used to output a first sub-imaging light and a second sub-imaging light in a certain direction.
  • the dotted line connected to the lens 802 represents the first sub-imaging light.
  • the solid line connected to lens 802 represents the second sub-imaging light.
  • N pixel groups 801 and N lenses 802 correspond one to one.
  • N lenses 802 are used to output N first sub-imaging lights and N second sub-imaging lights. The N first sub-imaging lights converge to form a first path of imaging light.
• N second sub-imaging lights converge to form a second path of imaging light.
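The one-to-one pairing of the N pixel groups with the N lenses can be sketched as follows. The routing function is a hypothetical illustration of how the N first and N second sub-imaging lights are collected into the two paths of imaging light; it is not an optical simulation.

```python
def route_sub_imaging(pixel_groups):
    """Each of the N pixel groups holds (first_pixel, second_pixel); lens i transmits
    both sub-imaging lights of group i in fixed directions, and the N first (resp.
    second) sub-imaging lights converge into the first (resp. second) imaging path."""
    first_path, second_path = [], []
    for lens_index, (first_pixel, second_pixel) in enumerate(pixel_groups):
        first_path.append((lens_index, first_pixel))    # toward the left eye
        second_path.append((lens_index, second_pixel))  # toward the right eye
    return first_path, second_path
```

The one-to-one correspondence means lens i only ever handles the two sub-imaging lights of pixel group i.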
  • FIG. 9a is a third structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 9b is a fourth structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • the stereoscopic display device 100 includes an image generating component 101 and a curved mirror 102 .
• for the stereoscopic display device 100, reference may be made to the relevant description of FIG. 1.
  • the first imaging light of the two imaging lights is reflected to the left eye through point A of the curved mirror 102 .
  • the second imaging light of the two imaging lights is reflected to the right eye through point B of the curved mirror 102 .
  • different process errors exist at different positions of the curved mirror 102 .
• Process errors cause the zoom factor and imaging position of the image observed by the user to deviate from the ideal values, producing display differences.
• the two virtual images observed by the user's eyes are therefore at different positions. Such display differences can cause dizziness and other physiological discomfort, degrading the user experience. Therefore, in the embodiment of the present application, image information with different parallaxes can be preprocessed, and the display differences are compensated through preprocessing to enhance the display effect.
  • the stereoscopic display device may perform one or more of the following processes on the left eye image and/or right eye image loaded by the pixel component 502:
  • Figure 10a is a fifth structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • Figure 10b is a sixth structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
  • the stereoscopic display device 100 includes a processor 1001 and an image generation component 101.
  • the processor 1001 may be a central processing unit (CPU), a network processor (NP), or a combination of CPU and NP.
  • the processor may further include a hardware chip or other general-purpose processor.
  • the above-mentioned hardware chip can be an application specific integrated circuit (ASIC), a programmable logic device (PLD) or a combination thereof.
  • the processor 1001 is used to obtain third image information, and preprocess the third image information to obtain first image information.
  • the processor 1001 is used to obtain first coordinate information of a first location.
  • the first position may be the position of the user's left eye.
  • the processor 1001 is used to obtain the mapping table.
  • the mapping table contains the corresponding relationship between coordinate information and correction parameters.
  • the processor 1001 searches the mapping table for the first correction parameter corresponding to the first coordinate information.
  • the processor 1001 preprocesses the third image information according to the first correction parameter to obtain the first image information.
  • the first correction parameter may be a translation of 2 pixels to the left.
  • the processor 1001 can control the first path of imaging light to shift to the right to obtain the stereoscopic display device shown in Figure 10a.
• before the offset, the image generating component 101 generates the first imaging light through pixel 1; after the offset, the image generating component 101 generates the first imaging light through pixel 2. The position of pixel 2 is the position of pixel 1 translated 2 pixels to the right.
  • the processor 1001 can control the second path of imaging light to shift to the left.
  • the processor 1001 can control the first imaging light to shift to the left to obtain the stereoscopic display device shown in Figure 10b.
  • the processor 1001 can control the second imaging light to shift to the right.
  • the processor 1001 may also be used to obtain fourth image information, and preprocess the fourth image information to obtain second image information.
  • the processor 1001 may be used to obtain second coordinate information of the second location.
  • the second position may be the position of the user's right eye.
  • the processor 1001 searches the mapping table for the second correction parameter corresponding to the second coordinate information, and preprocesses the fourth image information according to the second correction parameter to obtain the second image information.
  • the second correction parameter may be an overall reduction of 5%.
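A minimal sketch of the lookup-and-preprocess flow, assuming the mapping table is keyed by sampled eye coordinates and that a correction parameter holds a pixel shift and a scale factor. The values -2 pixels and 0.95 come from the examples above; the nearest-neighbour lookup and the dictionary layout are assumptions for illustration.

```python
def lookup_correction(mapping_table, coord):
    """Return the correction parameters whose table coordinate is nearest to coord."""
    nearest = min(mapping_table,
                  key=lambda c: (c[0] - coord[0]) ** 2 + (c[1] - coord[1]) ** 2)
    return mapping_table[nearest]

def preprocess(image_info, correction):
    """Attach the correction (translation in pixels, overall scale) to the image info."""
    return {"image": image_info,
            "shift_px": correction.get("shift_px", 0),   # e.g. -2: 2 pixels to the left
            "scale": correction.get("scale", 1.0)}       # e.g. 0.95: 5% reduction

mapping_table = {(0.0, 0.0): {"shift_px": -2},    # first correction parameter
                 (6.5, 0.0): {"scale": 0.95}}     # second correction parameter
first_image_info = preprocess("third_image_info",
                              lookup_correction(mapping_table, (0.2, 0.1)))
```

The same lookup is run twice per frame: once with the left-eye coordinate to derive the first image information, and once with the right-eye coordinate to derive the second.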
  • Pixel component 502 in image generation component 101 may include display circuitry and a display panel.
  • the display circuit can also be called a display controller (DC), which has a display control function.
  • the display circuit is used to receive the first image information and the second image information output by the processor 1001.
  • the display circuit is also used to control the display panel to display the first image and the second image according to the first image information and the second image information.
  • the first image information corresponds to the first image.
  • the second image information corresponds to the second image.
  • the function of the above timing control unit 504 can be implemented by a display circuit.
  • FIG. 11 is a seventh structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application.
• the first light source component 501 outputs the first light beam and the second light beam in a time-sharing manner. Specifically, LED1 in the first light source assembly 501 generates the first light beam, and LED2 in the first light source assembly 501 generates the second light beam.
• FIG. 12 is an eighth structural schematic diagram of a three-dimensional display device provided by an embodiment of the present application. As shown in FIG. 12:
  • the first light source component 501 outputs the first light beam and the second light beam in time division.
  • LED3 in the first light source assembly 501 generates a first light beam.
  • LED2 in the first light source assembly 501 generates a second light beam.
  • the first beam and the second beam pass through the beam control unit 503 and then reach the pixel component 502 .
• the pixel component 502 modulates the first light beam and the second light beam through different pixels in a time-sharing manner to obtain two paths of imaging light. Specifically, the pixel component 502 modulates the first light beam through pixel 2 to obtain the first imaging light, which is reflected by the curved mirror 102 and then enters the left eye.
  • the pixel component 502 modulates the second beam through the pixel 1 to obtain the second imaging light, which enters the right eye after being reflected by the curved mirror 102.
• the two virtual images observed by the user's eyes are at the same position, so dizziness does not occur, which improves the stereoscopic display effect.
  • the description with respect to Figure 12 is only an example.
  • the pixel component 502 may change the used pixels at the same time. Specifically, the pixel component 502 modulates the first light beam through the pixel 2 to obtain the first imaging light.
  • the pixel component 502 modulates the second light beam through the pixel 3 to obtain a second path of imaging light.
  • pixel 1, pixel 2, first pixel or second pixel, etc. may refer to one pixel point, or may refer to a set of multiple pixel points. This application does not limit this.
  • LED1 or LED2 may refer to one LED or a collection of multiple LEDs.
  • the first light source device or the second light source device may also refer to one LED, or may refer to a collection of multiple LEDs.
  • FIG. 13 is a schematic diagram of the third optical path projection of the three-dimensional display device provided by the embodiment of the present application.
  • two pixels of the image generating component 101 output two paths of imaging light.
  • the two channels of imaging light include a first channel of imaging light and a second channel of imaging light.
  • the two pixels include pixel 1 and pixel 2.
  • the coordinates of pixel 1 are (X_Oleft, Y_Oleft).
  • the coordinates of pixel 2 are (X_Oright, Y_Oright).
  • the curved mirror 102 is used to reflect the first imaging light output by the pixel 1.
  • the reflected first imaging light strikes the user's left eye.
  • the coordinates of the user's left eye are (X_left, Y_left).
  • the curved mirror 102 is also used to reflect the second path of imaging light output by the pixel 2.
  • the reflected second imaging light irradiates the user's right eye.
  • the coordinates of the user's right eye are (X_right, Y_right).
• the virtual images corresponding to pixel 1 and pixel 2 are at the same virtual image point on the virtual image plane 1301.
  • the coordinates of the virtual image point are (X_V, Y_V).
  • the virtual image corresponding to pixel 1 is on virtual image point 1 of the virtual image plane 1301.
  • the coordinates of virtual image point 1 are (X_V1, Y_V1).
  • the virtual image corresponding to pixel 2 is on virtual image point 2 of the virtual image plane 1301.
  • the coordinates of virtual image point 2 are (X_V2, Y_V2).
• the display error Δ between virtual image point 1 and virtual image point 2 is calculated.
  • the above-mentioned pixel 1 and pixel 2 are a pair of sampling points, respectively displaying the left eye image and the right eye image.
• if the display error Δ is greater than or equal to a threshold, it indicates that the processor 1001 needs to perform preprocessing.
• the threshold may be tan(2.5mrad)×S, where S is the distance between the user's eyes and the curved mirror 102.
  • the pixels included in a pair of sampling points can be changed.
  • a pair of sample points includes pixel 1 and pixel 3.
  • the virtual image of pixel 3 is projected on virtual image point 3 of virtual image plane 1301.
  • the coordinates of virtual image point 3 are (X_V3, Y_V3).
• the display error Δ between virtual image point 1 and virtual image point 3 is calculated. If the display error Δ is less than the threshold, the display error after correction is within an acceptable range, and the processor 1001 does not need to preprocess the image information. If, after preprocessing, the display error Δ is still greater than or equal to the threshold, the processor 1001 may perform further preprocessing until the display error Δ is less than the threshold.
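The acceptance test above can be written out directly: the display error Δ is the distance between the two virtual image points, and the threshold is tan(2.5 mrad) × S. This sketch assumes 2D virtual image point coordinates in the same length unit as S; the function names are illustrative.

```python
import math

def display_error(point_a, point_b):
    """Distance between two virtual image points, e.g. (X_V1, Y_V1) and (X_V2, Y_V2)."""
    return math.hypot(point_a[0] - point_b[0], point_a[1] - point_b[1])

def needs_preprocessing(point_a, point_b, S):
    """True if the display error reaches the threshold tan(2.5 mrad) * S,
    where S is the distance between the user's eyes and the curved mirror."""
    threshold = math.tan(2.5e-3) * S
    return display_error(point_a, point_b) >= threshold
```

The 2.5 mrad figure corresponds to the angular separation at which the two virtual image points become noticeably distinct to the eye at distance S.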
  • FIG. 14 is a schematic circuit diagram of a three-dimensional display device provided by an embodiment of the present application.
• the circuit in the display device mainly includes a processor 1001, an internal memory 1002, an external memory interface 1003, an audio module 1004, a video module 1005, a power module 1006, a wireless communication module 1007, an I/O interface 1008, a video interface 1009, a controller area network (Controller Area Network, CAN) transceiver 1010, a display circuit 1028, a display panel 1029, etc.
• the processor 1001 and its peripheral components, such as the memory 1002, the CAN transceiver 1010, the audio module 1004, the video module 1005, the power module 1006, the wireless communication module 1007, the I/O interface 1008, the video interface 1009, the touch unit, and the display circuit 1028, can be connected through a bus.
  • Processor 1001 may be called a front-end processor.
  • circuit diagram schematically illustrated in the embodiment of the present application does not constitute a specific limitation on the display device.
  • the display device may include more or fewer components than shown in the figures, or some components may be combined, or some components may be separated, or may be arranged differently.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 1001 includes one or more processing units.
  • the processor 1001 may include an application processor (Application Processor, AP), a modem processor, a graphics processor (Graphics Processing Unit, GPU), an image signal processing unit. (Image Signal Processor, ISP), video codec, digital signal processor (Digital Signal Processor, DSP), baseband processor, and/or neural network processor (Neural-Network Processing Unit, NPU), etc.
  • different processing units can be independent devices or integrated in one or more processors.
  • the processor 1001 may also be provided with a memory for storing instructions and data.
• the memory in the processor 1001 is a cache memory. This memory can hold instructions or data that the processor 1001 has just used or reuses cyclically. If the processor 1001 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 1001, and thus improves the efficiency of the system.
  • the functions of the processor 1001 can be implemented by a domain controller on the vehicle.
  • the display device may also include a plurality of input/output (I/O) interfaces 1008 connected to the processor 1001 .
• the interface 1008 may include, but is not limited to, an integrated circuit (Inter-Integrated Circuit, I2C) interface, an integrated circuit built-in audio (Inter-Integrated Circuit Sound, I2S) interface, a pulse code modulation (Pulse Code Modulation, PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (Mobile Industry Processor Interface, MIPI), a general-purpose input/output (General-Purpose Input/Output, GPIO) interface, a subscriber identity module (Subscriber Identity Module, SIM) interface, and/or a universal serial bus (Universal Serial Bus, USB) interface, etc.
• the above-mentioned I/O interface 1008 can be connected to devices such as a mouse, touch screen, keyboard, camera, speaker, microphone, etc., or can be connected to physical buttons on the display device (such as volume keys, brightness adjustment keys, power on/off keys, etc.).
  • Internal memory 1002 may be used to store computer executable program code, which includes instructions.
  • the memory 1002 may include a program storage area and a data storage area.
  • the stored program area can store the operating system, at least one application program required for the function (such as call function, time setting function, AR function, etc.).
  • the storage data area can store data created during use of the display device (such as phone book, world time, etc.).
  • the internal memory 1002 may include high-speed random access memory, and may also include non-volatile memory, such as at least one disk storage device, flash memory device, Universal Flash Storage (UFS), etc.
  • the processor 1001 executes instructions stored in the internal memory 1002 and/or instructions stored in a memory provided in the processor 1001 to execute various functional applications and data processing of the display device.
  • the external memory interface 1003 can be used to connect an external memory (such as a Micro SD card).
  • the external memory can store data or program instructions as needed.
  • the processor 1001 can read and write these data or program instructions through the external memory interface 1003.
  • the audio module 1004 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals.
  • the audio module 1004 can also be used to encode and decode audio signals, such as playing or recording.
  • the audio module 1004 may be disposed in the processor 1001, or some functional modules of the audio module 1004 may be disposed in the processor 1001.
  • the display device can implement audio functions through the audio module 1004 and an application processor.
• the video interface 1009 can receive external audio and video input, which can specifically be a High Definition Multimedia Interface (HDMI), a Digital Visual Interface (DVI), a Video Graphics Array (VGA) interface, a DisplayPort (DP), a Low Voltage Differential Signaling (LVDS) interface, etc.
  • the video interface 1009 can also output video.
  • the display device receives video data sent by the navigation system or receives video data sent by the domain controller through the video interface.
  • the video module 1005 can decode the video input by the video interface 1009, for example, perform H.264 decoding.
  • the video module can also encode the video collected by the display device, such as H.264 encoding of the video collected by an external camera.
  • the processor 1001 can also decode the video input from the video interface 1009, and then output the decoded image signal to the display circuit.
  • the above-mentioned display device also includes a CAN transceiver 1010, and the CAN transceiver 1010 can be connected to the CAN bus (CAN BUS) of the car.
  • the display device can communicate with the in-vehicle entertainment system (music, radio, video module), vehicle status system, etc.
  • the user can activate the car music playback function by operating the display device.
  • the vehicle status system can send vehicle status information (doors, seat belts, etc.) to the display device for display.
  • the display circuit 1010 and the display panel 1011 jointly implement the function of displaying images.
  • the display circuit 1010 receives the image signal output by the processor 1001, processes the image signal, and then inputs it into the display panel 1011 for imaging.
  • the display circuit 1010 can also control the image displayed by the display panel 1011. For example, control parameters such as display brightness or contrast.
  • the display circuit 1010 may include a driving circuit, an image control circuit, and the like.
  • the above-mentioned display circuit 1010 and display panel 1011 may be located in pixel component 502.
  • the display panel 1011 is used to modulate the light beam input from the light source according to the input image signal, thereby generating a visible image.
  • the display panel 1011 may be a silicon-based liquid crystal panel, a liquid crystal display panel or a digital micromirror device.
  • the video interface 1009 can receive input video data (also called a video source).
  • the video module 1005 decodes and/or digitizes the data and then outputs the image signal to the display circuit 1010.
  • the display circuit 1010 responds to the input image signal.
  • the display panel 1011 is driven to image the light beam emitted by the light source, thereby generating a visible image (emitting imaging light).
  • the power module 1006 is used to provide power to the processor 1001 and the light source according to the input power (eg, direct current).
  • the power module 1006 may include a rechargeable battery, and the rechargeable battery may provide power to the processor 1001 and the light source.
  • the light emitted by the light source can be transmitted to the display panel 1029 for imaging, thereby forming an image light signal (imaging light).
  • the power supply module 1006 can be connected to a power supply module (such as a power battery) of a car, and the power supply module of the car supplies power to the power supply module 1006 of the display device.
  • the wireless communication module 1007 can enable the display device to communicate wirelessly with the outside world, and can provide Wireless Local Area Networks (WLAN) (such as Wireless Fidelity (Wi-Fi) network), Bluetooth (Bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR) and other wireless communication solutions.
  • the wireless communication module 1007 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 1007 receives electromagnetic waves through the antenna, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 1001 .
  • the wireless communication module 1007 can also receive the signal to be sent from the processor 1001, frequency modulate it, amplify it, and convert it into electromagnetic waves through the antenna for radiation.
  • the video data decoded by the video module 1005 can also be received wirelessly through the wireless communication module 1007 or read from the internal memory 1002 or an external memory.
• the display device can receive video data from the terminal device or the in-vehicle entertainment system through the in-car wireless LAN, and the display device can also read the audio and video data stored in the internal memory 1002 or the external memory.
  • An embodiment of the present application also provides a vehicle equipped with any one of the aforementioned three-dimensional display devices.
  • the two imaging lights carry image information with different parallaxes.
  • the output two-way imaging light is reflected to the windshield through the reflector, and the windshield further reflects the two-way imaging light to form a virtual image.
  • the virtual image is on one side of the windshield, with the driver or passenger on the other side.
  • the reflected two-channel imaging light shines on the eyes of the driver or passenger respectively. For example, the first imaging light hits the passenger's left eye.
• the second imaging light shines on the passenger's right eye.
  • FIG. 15 is a schematic diagram of a three-dimensional display device installed on a vehicle according to an embodiment of the present application.
  • the windshield of a vehicle can be used as a curved mirror or lens in a stereoscopic display device.
  • the image generating assembly 101 and the driver or passenger are located on the same side of the windshield.
  • the image generating assembly 101 and the driver or passenger are located on different sides of the windshield.
  • the image generating component 101 is used to output two channels of imaging light. The two imaging lights carry image information with different parallaxes.
  • the windshield is used to reflect or transmit two-way imaging light to form a virtual image.
  • the virtual image is on one side of the windshield, with the driver or passenger on the other side.
• the two channels of imaging light after reflection or transmission shine on the eyes of the driver or passenger respectively.
  • the first imaging light shines on the passenger's left eye.
• the second imaging light shines on the passenger's right eye.
• the vehicle may be a car, truck, motorcycle, bus, boat, airplane, helicopter, lawn mower, recreational vehicle, playground vehicle, construction equipment, trolley, golf cart, train, handcart, etc.
• the three-dimensional display device can be installed on the instrument panel (Instrument Panel, IP) of the vehicle, at the front passenger position or the driver's position, or it can be installed on the back of a seat.
  • when the above three-dimensional display device is used in a vehicle, it may be called a head-up display (HUD), and may be used to display navigation information, vehicle speed, battery/fuel level, etc.
  • Figure 16 is a schematic diagram of a possible functional framework of the vehicle provided by the embodiment of the present application.
  • the functional framework of the vehicle may include various subsystems, such as the control system 14, the sensor system 12, one or more peripheral devices 16 (one is shown as an example), the power supply 18, the computer system 20, and the display system 32.
  • the vehicle may also include other functional systems, such as an engine system that provides power for the vehicle, etc., which is not limited in this application.
  • the sensor system 12 may include several detection devices that sense measured information and convert it, according to certain rules, into electrical signals or other required forms of information for output.
  • these detection devices may include a global positioning system (GPS), a vehicle speed sensor, an inertial measurement unit (IMU), a radar unit, a laser rangefinder, a camera device, a wheel speed sensor, a steering sensor, a gear sensor, or other elements used for automatic detection, which are not limited in this application.
  • the control system 14 may include several elements, such as the illustrated steering unit, braking unit, lighting system, automatic driving system, map navigation system, network time synchronization system and obstacle avoidance system.
  • the control system 14 may also include components such as a throttle controller and an engine controller for controlling the driving speed of the vehicle, which are not limited in this application.
  • Peripheral device 16 may include several elements, such as a communication system, a touch screen, a user interface, a microphone and a speaker as shown, among others.
  • the communication system is used to realize network communication between vehicles and other devices other than vehicles.
  • the communication system can use wireless communication technology or wired communication technology to realize network communication between vehicles and other devices.
  • the wired communication technology may refer to communication between vehicles and other devices through network cables or optical fibers.
  • the power source 18 represents a system that provides power or energy to the vehicle, which may include, but is not limited to, rechargeable lithium batteries or lead-acid batteries, etc. In practical applications, one or more battery components in the power supply are used to provide electric energy or energy for starting the vehicle. The type and material of the power supply are not limited in this application.
  • the computer system 20 may include one or more processors 2001 (one processor is shown as an example) and a memory 2002 (which may also be referred to as a storage device).
  • the memory 2002 may be inside the computer system 20 or outside the computer system 20, for example serving as a cache in the vehicle, which is not limited by this application.
  • Processor 2001 may include one or more general-purpose processors, such as a graphics processing unit (GPU).
  • the processor 2001 may be used to run relevant programs or instructions corresponding to the programs stored in the memory 2002 to implement corresponding functions of the vehicle.
  • the memory 2002 may include volatile memory, such as RAM; it may also include non-volatile memory, such as ROM, flash memory, or solid state drives (SSD); the memory 2002 may also include a combination of the above types of memory.
  • the memory 2002 can be used to store a set of program codes or instructions corresponding to the program codes, so that the processor 2001 can call the program codes or instructions stored in the memory 2002 to implement corresponding functions of the vehicle. These functions include but are not limited to some or all of the functions in the vehicle functional framework diagram shown in Figure 16. In this application, a set of program codes for vehicle control can be stored in the memory 2002, and the processor 2001 calls the program codes to control the safe driving of the vehicle; how safe driving of the vehicle is achieved is described in detail below in this application.
  • the memory 2002 may also store information such as road maps, driving routes, sensor data, and the like.
  • the computer system 20 may be combined with other elements in the vehicle functional framework diagram, such as the sensors in the sensor system, GPS, etc., to realize vehicle-related functions.
  • the computer system 20 can control the driving direction or driving speed of the vehicle based on data input from the sensor system 12 , which is not limited in this application.
  • the display system 32 may include several elements, such as a controller and the stereoscopic display device 100 described above.
  • the controller is configured to generate an image according to user instructions (for example, generate an image including vehicle status such as vehicle speed, power/fuel level, and an image of augmented reality AR content), and send the image content to the stereoscopic display device 100 .
  • the image generation module 101 in the stereoscopic display device 100 is used to output two channels of imaging light carrying different image information.
  • the curved mirror 102 in the stereoscopic display device 100 is the windshield.
  • the windshield is used to reflect or transmit two-way imaging light, so that a virtual image corresponding to the image content is presented in front of the driver or passenger.
  • the functions of some components in the display system 32 can also be implemented by other subsystems of the vehicle.
  • the controller can also be a component in the control system 14 .
  • Figure 16 of this application shows four subsystems; the sensor system 12, the control system 14, the computer system 20 and the display system 32 are only examples and do not constitute a limitation.
  • vehicles can combine several components in the vehicle according to different functions to obtain subsystems with corresponding different functions.
  • the vehicle may include more or fewer systems or components, which is not limited by this application.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Stereoscopic And Panoramic Photography (AREA)

Abstract

A stereoscopic display device (100), applied to the display field. The stereoscopic display device (100) comprises an image generation assembly (101) and a curved mirror (102). The image generation assembly (101) is configured to generate two imaging light beams, and the two imaging light beams carry image information with different parallaxes. The curved mirror (102) is configured to reflect the two imaging light beams, and an included angle exists between the two reflected imaging light beams. The focal length of the curved mirror (102) is f. The distance between the image plane of the image generation assembly (101) and the curved mirror (102) is d, and d is less than f. The curved mirror (102) can magnify the virtual image, which improves the user experience when the distance between the user and the stereoscopic display device (100) is short.

Description

Stereoscopic display device and vehicle
This application claims priority to Chinese Patent Application No. 202210505264.7, filed with the China National Intellectual Property Administration on May 10, 2022 and entitled "Stereoscopic display device and vehicle", which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the display field, and in particular to a stereoscopic display device and a vehicle including the stereoscopic display device.
Background
Stereoscopic display needs to provide the two eyes with image information carrying different parallaxes. Compared with 2D display, stereoscopic display can give people a better experience.
Glasses-free stereoscopic display technology is a stereoscopic display solution in which the user does not need to wear polarized glasses or shutter glasses. The stereoscopic display device outputs two imaging light beams to the user's left and right eyes respectively. Specifically, the stereoscopic display device may output the two imaging light beams in a time-division manner: in a first time period, one imaging light beam output by the stereoscopic display device illuminates one eye of the user; in a second time period, the other imaging light beam illuminates the user's other eye. The first time period and the second time period are distributed alternately. The two imaging light beams carry image information with different parallaxes, thereby providing the user with a stereoscopic visual experience.
However, to improve the user experience, a stereoscopically displayed image requires a large picture size. Therefore, when the distance between the user and the stereoscopic display device is short, the user experience is poor.
Summary
This application provides a stereoscopic display device and a vehicle, in which a curved mirror or a lens can magnify the stereoscopically displayed image. Therefore, the user experience is improved when the distance between the user and the stereoscopic display device is short.
A first aspect of this application provides a stereoscopic display device. The stereoscopic display device includes an image generation assembly and a curved mirror. The image generation assembly is configured to generate two imaging light beams, which carry image information with different parallaxes. The curved mirror is configured to reflect the two imaging light beams, and an included angle exists between the two reflected imaging light beams. The focal length of the curved mirror is f. The distance between the image plane of the image generation assembly and the curved mirror is d, and d is less than f.
In this application, because d is less than f, the curved mirror can magnify the stereoscopically displayed image. Therefore, the user experience is improved when the distance between the user and the stereoscopic display device is short. Moreover, compared with a lens, a curved mirror can occupy a smaller volume, which reduces the volume of the stereoscopic display device.
In an optional implementation of the first aspect, the distance between the virtual image formed by the two reflected imaging light beams and the curved mirror is D, and D satisfies the following formula: D = f·d/(f − d).
In an optional implementation of the first aspect, an included angle γ1 exists between the two imaging light beams reflected by the curved mirror. When S takes a value between 0 mm and 5000 mm, γ1 satisfies the following formula: E − w < tan(γ1)×(S + D) < E + 2w, where S is the distance between the receiving position of the two imaging light beams and the curved mirror, E ranges from 53 mm to 73 mm, and w is the width, at the receiving position, of at least one of the two imaging light beams reflected by the curved mirror. When the two imaging light beams illuminate the same eye of the user, beam crosstalk occurs and degrades the user experience. By controlling the included angle γ1, the two reflected imaging light beams can be made to illuminate the user's two eyes separately. Therefore, this application can further improve the user experience.
In an optional implementation of the first aspect, an included angle γ2 exists between the two imaging light beams before they are reflected by the curved mirror. When S takes a value between 0 mm and 5000 mm, γ2 satisfies the following formula: E − w < tan(γ2)×(d/D)×(S + D) < E + 2w, where S is the distance between the receiving position of the two imaging light beams and the curved mirror, E ranges from 53 mm to 73 mm, and w is the width, at the receiving position, of at least one of the two imaging light beams reflected by the curved mirror. By controlling the included angle γ2, the two reflected imaging light beams can be made to illuminate the user's two eyes separately. Therefore, this application can further improve the user experience.
In an optional implementation of the first aspect, the divergence angle of each of the two imaging light beams before they are reflected by the curved mirror is α. When S takes a value between 0 mm and 5000 mm, w satisfies the following formula: w = tan(α)×(S + D), where w is less than 73 mm. When the width of each imaging light beam is too large, each beam may cover both of the user's eyes, causing beam crosstalk. By controlling the size of w, one imaging light beam can be prevented from entering both of the user's eyes. Therefore, this application can further improve the user experience.
In an optional implementation of the first aspect, the image generation assembly includes a first light source assembly and a pixel assembly. The first light source assembly is configured to output, in a time-division manner, a first light beam and a second light beam with different exit directions to the pixel assembly. The pixel assembly is configured to modulate the first light beam and the second light beam respectively with different image information to generate the two imaging light beams.
In an optional implementation of the first aspect, the first light source assembly includes a first light source device and a second light source device. The first light source device and the second light source device are configured to alternately output the first light beam and the second light beam in a time-division manner.
In an optional implementation of the first aspect, the image generation assembly further includes a timing control unit. The timing control unit is configured to control the first light source device and the second light source device to alternately output the first light beam and the second light beam in a time-division manner. The timing control unit is further configured to control the pixel assembly to modulate the first light beam and the second light beam with different image information in a time-division manner.
In an optional implementation of the first aspect, the two imaging light beams include a first imaging light beam and a second imaging light beam. The image generation assembly includes a second light source assembly, a pixel assembly and a lens array. The second light source assembly is configured to output a third light beam to the pixel assembly. The pixel assembly is configured to modulate the third light beam according to different image information to generate the first imaging light beam and the second imaging light beam. The lens array is configured to transmit the first imaging light beam and the second imaging light beam at different angles. Using the third light beam can reduce the cost of the stereoscopic display device.
In an optional implementation of the first aspect, the pixel assembly includes a first pixel and a second pixel. The first pixel is configured to modulate the third light beam according to first image information to generate the first imaging light beam. The second pixel is configured to modulate the third light beam according to second image information to generate the second imaging light beam. Process errors of the curved mirror cause the magnification and imaging position of the image observed by the user to deviate from the ideal ones, producing display differences. Display differences can cause physiological discomfort such as dizziness and degrade the user experience. To compensate for the display differences, the image information can be preprocessed. When the same pixels are used to modulate the first light beam and the second light beam, the image information of different parallaxes can only be preprocessed to the same degree; preprocessing to the same degree easily introduces new display errors, which degrades the user experience. When different pixels are used to modulate the first light beam and the second light beam respectively, the image information of different parallaxes can be preprocessed to different degrees; preprocessing to different degrees can reduce or eliminate the display errors, which improves the user experience.
In an optional implementation of the first aspect, the third light beam includes a first sub-beam and a second sub-beam. That the first pixel is configured to modulate the third light beam according to the first image information to generate the first imaging light beam includes: the first pixel is configured to modulate the first sub-beam according to the first image information to generate the first imaging light beam. That the second pixel is configured to modulate the third light beam according to the second image information to generate the second imaging light beam includes: the second pixel is configured to modulate the second sub-beam according to the second image information to generate the second imaging light beam. The second light source assembly is configured to generate the first sub-beam and the second sub-beam simultaneously.
In an optional implementation of the first aspect, the image information with different parallaxes includes first image information and second image information. The stereoscopic display device further includes a processor. The processor may be disposed inside or outside the image generation assembly.
In an optional implementation of the first aspect, the processor is configured to preprocess third image information to obtain the first image information. The processor is configured to preprocess fourth image information to obtain the second image information. Preprocessing the image information of different parallaxes to different degrees can reduce or eliminate display errors, which improves the user experience.
In an optional implementation of the first aspect, the processor is further configured to obtain first coordinate information of a first position and/or second coordinate information of a second position. One of the two imaging light beams illuminates the first position, and the other illuminates the second position. That the processor is configured to preprocess the third image information includes: the processor is configured to preprocess the third image information according to the first coordinate information. That the processor is configured to preprocess the fourth image information includes: the processor is configured to preprocess the fourth image information according to the second coordinate information. In the field of 2D display, the user's two eyes receive the same image information, so the processor can only preprocess one piece of image information. Specifically, the processor may obtain coordinate information of the middle position between the user's eyes and preprocess the image information according to that coordinate information. In this application, the processor may also preprocess the different pieces of image information separately according to the coordinate information of the middle position; for example, the coordinate information of the middle position corresponds to two correction parameters, and the processor preprocesses the different pieces of image information according to the two correction parameters respectively. In this application, using coordinate information of different positions for preprocessing improves the accuracy of the preprocessing and thus the user experience.
In an optional implementation of the first aspect, f is less than 400 mm. An excessively large f makes the stereoscopic display device bulky; this application can therefore reduce the volume of the stereoscopic display device.
In an optional implementation of the first aspect, the image generation assembly includes a projector and a diffuser screen. The projector is configured to generate the two imaging light beams. The diffuser screen is configured to receive the two imaging light beams, diffuse them, and output the diffused two imaging light beams. The curved mirror is configured to reflect the diffused two imaging light beams.
A second aspect of this application provides a stereoscopic display device. The stereoscopic display device includes an image generation assembly and a lens. The image generation assembly is configured to generate two imaging light beams, which carry image information with different parallaxes. The lens is configured to transmit the two imaging light beams, and an included angle exists between the two transmitted imaging light beams. The focal length of the lens is f. The distance between the image plane of the image generation assembly and the lens is d, and d is less than f.
In this application, because d is less than f, the lens can magnify the stereoscopically displayed image. Therefore, the user experience is improved when the distance between the user and the stereoscopic display device is short.
In an optional implementation of the second aspect, the distance between the virtual image formed by the two transmitted imaging light beams and the lens is D, and D satisfies the following formula: D = f·d/(f − d).
In an optional implementation of the second aspect, an included angle γ1 exists between the two imaging light beams transmitted by the lens. When S takes a value between 0 mm and 5000 mm, γ1 satisfies the following formula: E − w < tan(γ1)×(S + D) < E + 2w, where S is the distance between the receiving position of the two imaging light beams and the lens, E ranges from 53 mm to 73 mm, and w is the width, at the receiving position, of at least one of the two imaging light beams transmitted by the lens.
In an optional implementation of the second aspect, an included angle γ2 exists between the two imaging light beams before they pass through the lens. When S takes a value between 0 mm and 5000 mm, γ2 satisfies the following formula: E − w < tan(γ2)×(d/D)×(S + D) < E + 2w, where S is the distance between the receiving position of the two imaging light beams and the lens, E ranges from 53 mm to 73 mm, and w is the width, at the receiving position, of at least one of the two imaging light beams transmitted by the lens.
In an optional implementation of the second aspect, the divergence angle of each of the two imaging light beams before they pass through the lens is α. When S takes a value between 0 mm and 5000 mm, w satisfies the following formula: w = tan(α)×(S + D), where w is less than 73 mm.
In an optional implementation of the second aspect, the image generation assembly includes a first light source assembly and a pixel assembly. The first light source assembly is configured to output, in a time-division manner, a first light beam and a second light beam with different exit directions to the pixel assembly. The pixel assembly is configured to modulate the first light beam and the second light beam respectively according to different image information to generate the two imaging light beams.
In an optional implementation of the second aspect, the first light source assembly includes a first light source device and a second light source device. The first light source device and the second light source device are configured to alternately output the first light beam and the second light beam in a time-division manner.
In an optional implementation of the second aspect, the image generation assembly further includes a timing control unit. The timing control unit is configured to control the first light source device and the second light source device to alternately output the first light beam and the second light beam in a time-division manner. The timing control unit is further configured to control the pixel assembly to modulate the first light beam and the second light beam with different image information in a time-division manner.
In an optional implementation of the second aspect, the pixel assembly includes a first pixel and a second pixel. The first pixel is configured to modulate the first light beam to obtain the first of the two imaging light beams. The second pixel is configured to modulate the second light beam to obtain the second of the two imaging light beams.
In an optional implementation of the second aspect, the two imaging light beams include a first imaging light beam and a second imaging light beam. The image generation assembly includes a second light source assembly, a pixel assembly and a lens array. The second light source assembly is configured to output a third light beam to the pixel assembly. The pixel assembly is configured to modulate the third light beam with different image information to generate the first imaging light beam and the second imaging light beam. The lens array is configured to transmit the first imaging light beam and the second imaging light beam at different angles.
In an optional implementation of the second aspect, the pixel assembly includes a first pixel and a second pixel. The first pixel is configured to modulate the third light beam according to first image information to generate the first imaging light beam. The second pixel is configured to modulate the third light beam according to second image information to generate the second imaging light beam.
In an optional implementation of the second aspect, the third light beam includes a first sub-beam and a second sub-beam. The first pixel is configured to modulate the first sub-beam according to the first image information to generate the first imaging light beam. The second pixel is configured to modulate the second sub-beam according to the second image information to generate the second imaging light beam. The second light source assembly is configured to generate the first sub-beam and the second sub-beam simultaneously.
In an optional implementation of the second aspect, the image information with different parallaxes includes first image information and second image information. The stereoscopic display device further includes a processor. The processor is configured to preprocess third image information to obtain the first image information, and to preprocess fourth image information to obtain the second image information.
In an optional implementation of the second aspect, the processor is further configured to obtain first coordinate information of a first position and/or second coordinate information of a second position. One of the two imaging light beams illuminates the first position, and the other illuminates the second position. That the processor is configured to preprocess the third image information includes: the processor is configured to preprocess the third image information according to the first coordinate information. That the processor is configured to preprocess the fourth image information includes: the processor is configured to preprocess the fourth image information according to the second coordinate information.
In an optional implementation of the second aspect, f is less than 300 mm.
In an optional implementation of the second aspect, the image generation assembly includes a projector and a diffuser screen. The projector is configured to generate the two imaging light beams. The diffuser screen is configured to receive the two imaging light beams, diffuse them, and output the diffused two imaging light beams. The lens is configured to transmit the diffused two imaging light beams.
A third aspect of this application provides a vehicle. The vehicle includes the stereoscopic display device according to the first aspect, any optional implementation of the first aspect, the second aspect, or any optional implementation of the second aspect. The stereoscopic display device is installed on the vehicle.
Brief Description of Drawings
Figure 1 is a first schematic structural diagram of a stereoscopic display device according to an embodiment of this application;
Figure 2 is a first light-path projection diagram of a stereoscopic display device according to an embodiment of this application;
Figure 3 is a second schematic structural diagram of a stereoscopic display device according to an embodiment of this application;
Figure 4 is a second light-path projection diagram of a stereoscopic display device according to an embodiment of this application;
Figure 5 is a first schematic structural diagram of an image generation assembly according to an embodiment of this application;
Figure 6 is a second schematic structural diagram of an image generation assembly according to an embodiment of this application;
Figure 7a is a third schematic structural diagram of an image generation assembly according to an embodiment of this application;
Figure 7b is a fourth schematic structural diagram of an image generation assembly according to an embodiment of this application;
Figure 8 is a schematic structural diagram of a pixel assembly and a lens array according to an embodiment of this application;
Figure 9a is a third schematic structural diagram of a stereoscopic display device according to an embodiment of this application;
Figure 9b is a fourth schematic structural diagram of a stereoscopic display device according to an embodiment of this application;
Figure 10a is a fifth schematic structural diagram of a stereoscopic display device according to an embodiment of this application;
Figure 10b is a sixth schematic structural diagram of a stereoscopic display device according to an embodiment of this application;
Figure 11 is a seventh schematic structural diagram of a stereoscopic display device according to an embodiment of this application;
Figure 12 is an eighth schematic structural diagram of a stereoscopic display device according to an embodiment of this application;
Figure 13 is a third light-path projection diagram of a stereoscopic display device according to an embodiment of this application;
Figure 14 is a schematic circuit diagram of a stereoscopic display device according to an embodiment of this application;
Figure 15 is a schematic structural diagram of a vehicle according to an embodiment of this application;
Figure 16 is a schematic diagram of a possible functional framework of a vehicle according to an embodiment of this application.
Description of Embodiments
This application provides a stereoscopic display device and a vehicle, in which a curved mirror or a lens can magnify the stereoscopically displayed image, so that the user experience is improved when the distance between the user and the stereoscopic display device is short. It should be understood that "first", "second" and the like in this application are used only to distinguish the described objects, and cannot be understood as indicating or implying relative importance or order. In addition, for brevity and clarity, reference numbers and/or letters are repeated in several figures of the embodiments of this application; the repetition does not indicate a strict limiting relationship between the various embodiments and/or configurations.
The stereoscopic display device in this application may also be called a 3D display device. The stereoscopic display device is applied to the field of projection technology, in which a directional backlight apparatus can provide the user with a stereoscopic visual experience. However, to improve the user experience, a stereoscopically displayed image requires a large picture size. Therefore, when the distance between the user and the stereoscopic display device is short, the user experience is poor.
To this end, this application provides a stereoscopic display device. Figure 1 is a first schematic structural diagram of a stereoscopic display device according to an embodiment of this application. As shown in Figure 1, the stereoscopic display device 100 includes an image generation assembly 101 and a curved mirror 102. The image generation assembly 101 is configured to generate two imaging light beams. In Figure 1, each solid line connected to the image generation assembly 101 represents one imaging light beam. The curved mirror 102 is configured to reflect the two imaging light beams. An included angle exists between the two reflected imaging light beams, so the two reflected beams can illuminate different positions; for example, one beam illuminates the user's left eye and the other illuminates the user's right eye. The two imaging light beams carry image (pattern) information with different parallaxes, thereby providing the user with a stereoscopic visual experience. The position of the human eyes may be called a viewpoint. The above stereoscopic display device can provide multiple viewpoints for viewing by multiple people; correspondingly, the image generation assembly 101 can generate multiple imaging light beams for different viewers. This embodiment takes one viewpoint as an example, that is, the image generation assembly 101 generating two imaging light beams, to describe the imaging process of the stereoscopic display device.
In this embodiment, the focal length of the curved mirror 102 is f. The distance between the image plane (the display surface of the image) of the image generation assembly 101 and the curved mirror 102 is d. Each point on the curved mirror 102 has a perpendicular distance to the image plane of the image generation assembly 101; d may be the farthest perpendicular distance between the curved mirror 102 and the image plane of the image generation assembly 101. Alternatively, d may be the straight-line distance between the central pixel of the image plane of the image generation assembly 101 and a target point on the curved mirror 102. The central pixel is one or more pixels at the central position of the image plane; the imaging light output by the central pixel illuminates the target point on the curved mirror 102. d is less than f. When d is less than f, the curved mirror 102 can magnify the virtual image. Therefore, when the distance between the user and the stereoscopic display device 100 is short, the user sees a magnified virtual image, which improves the user experience.
Figure 2 is a first light-path projection diagram of the stereoscopic display device according to an embodiment of this application. As shown in Figure 2, the image generation assembly 101 is configured to generate two imaging light beams. The divergence angle of each of the two imaging light beams is α. The dash-dotted line in Figure 2 represents one of the two imaging light beams, and the solid line represents the other. An included angle γ2 exists between the two imaging light beams. The curved mirror 102 is configured to reflect the two imaging light beams; an included angle γ1 exists between the two reflected imaging light beams. The distance between the virtual image formed by the two reflected imaging light beams and the curved mirror 102 is D. According to the following formula 1, D can be obtained from f and d:
D = f·d/(f − d)  (formula 1)
The two reflected imaging light beams can illuminate different positions. The positions they illuminate are also called the receiving positions of the two imaging light beams, for example the user's two eyes. The interpupillary distance of the eyes is E; E may range from 53 mm to 73 mm, for example E is 53 mm or 73 mm. The distance from the eyes to the curved mirror 102 is S. At the eye position, the width of each of the two reflected imaging light beams is w. w is related to S, α and D. According to the following formula 2, w can be obtained from S, α and D:
w = tan(α)×(S + D)  (formula 2)
When the width w of each imaging light beam is too large, one beam may cover both of the user's eyes, causing beam crosstalk. Therefore, to reduce or avoid beam crosstalk, when S takes a value between 0 mm and 5000 mm, w can be smaller than the maximum value of E, which is 73 mm. S may be 0 mm or 5000 mm.
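As a numerical sketch of the two relations above — assuming the standard magnifier reconstruction D = f·d/(f − d) for formula 1 and w = tan(α)·(S + D) for formula 2, since the formula images did not survive extraction — the geometry can be checked as follows (all names and example numbers are illustrative):

```python
import math

def virtual_image_distance(f_mm: float, d_mm: float) -> float:
    """Distance D of the magnified virtual image behind the mirror (requires d < f)."""
    assert d_mm < f_mm, "the image plane must lie inside the focal length"
    return f_mm * d_mm / (f_mm - d_mm)

def beam_width_at_eyes(alpha_rad: float, s_mm: float, d_virtual_mm: float) -> float:
    """Width w of one imaging beam at the receiving position (the eyes)."""
    return math.tan(alpha_rad) * (s_mm + d_virtual_mm)

# Example: f = 300 mm, image plane at d = 200 mm, eyes at S = 800 mm.
D = virtual_image_distance(300, 200)            # 600 mm: the virtual image is pushed back and magnified
w = beam_width_at_eyes(math.radians(2), 800, D) # ≈ 48.9 mm
assert D == 600.0
assert w < 73, "each beam must stay narrower than the maximum interpupillary distance E"
```

With these example values the crosstalk condition w < 73 mm holds comfortably, matching the constraint stated in the text.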
In practical applications, by setting the distance between the two reflected imaging light beams, the two reflected beams can be made to illuminate suitable positions, such as the user's two eyes. The distance M between the two reflected imaging light beams is related to the included angle γ1, D and S. According to the following formula 3, M can be obtained from γ1, D and S:
M = tan(γ1)×(S + D)  (formula 3)
Similarly, in practical applications, the distance M between the two reflected imaging light beams is related to the included angle γ2, D and S. According to the following formula 4, M can be obtained from γ2, D and S:
M = tan(γ2)×(d/D)×(S + D)  (formula 4)
To make the two reflected imaging light beams illuminate the user's two eyes separately, the relation between M and E, w can be set. Specifically, when S takes a value between 0 mm and 5000 mm, E − w < M < E + 2w. w may be w1 or w2, where w1 is the width of the first imaging light beam and w2 is the width of the second imaging light beam. In practical applications, w may be both w1 and w2; in that case, when S takes a value between 0 mm and 5000 mm, E − w1 < M < E + 2w1 and E − w2 < M < E + 2w2.
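The eye-separation condition E − w < M < E + 2w, with M taken from formula 3, can be checked directly; the following is a sketch, with example numbers that are illustrative rather than taken from the patent:

```python
import math

def separation_at_eyes(gamma1_rad: float, s_mm: float, d_virtual_mm: float) -> float:
    """Distance M between the two reflected beams at the receiving position (formula 3)."""
    return math.tan(gamma1_rad) * (s_mm + d_virtual_mm)

def beams_hit_separate_eyes(m_mm: float, e_mm: float, w_mm: float) -> bool:
    """True if each beam can land on its own eye without crosstalk: E - w < M < E + 2w."""
    return e_mm - w_mm < m_mm < e_mm + 2 * w_mm

# gamma1 = 2.6 deg, eyes 800 mm from the mirror, virtual image 600 mm behind it.
M = separation_at_eyes(math.radians(2.6), 800, 600)   # ≈ 63.6 mm, close to a typical E
assert beams_hit_separate_eyes(M, 63, 20)
assert not beams_hit_separate_eyes(30, 63, 20)        # beams too close together: crosstalk
```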
In Figures 1 and 2, the virtual image is magnified through the reflection principle of a curved mirror. In practical applications, the virtual image can also be magnified through the transmission principle of a lens. Figure 3 is a second schematic structural diagram of the stereoscopic display device according to an embodiment of this application. As shown in Figure 3, the stereoscopic display device 100 includes an image generation assembly 101 and a lens 301. The image generation assembly 101 is configured to generate two imaging light beams. In Figure 3, each solid line connected to the image generation assembly 101 represents one imaging light beam. The lens 301 is configured to transmit the two imaging light beams. An included angle exists between the two transmitted imaging light beams, so the two transmitted beams can illuminate different positions; for example, one beam illuminates the user's left eye and the other illuminates the user's right eye. The two imaging light beams carry image information with different parallaxes, thereby providing the user with a stereoscopic visual experience. The focal length of the lens is f. The distance between the image plane of the image generation assembly 101 and the lens 301 is d, and d is less than f. When d is less than f, the lens 301 can magnify the virtual image. Therefore, the user experience is improved when the distance between the user and the stereoscopic display device 100 is short.
In the embodiments of this application, for ease of description, the parameters of the two imaging light beams are the same; for example, in Figures 2 and 4, the divergence angles of both imaging light beams are α. In practical applications, the divergence angles of the two imaging light beams may deviate somewhat from each other. For example, the divergence angle of the first imaging light beam is α1 and the divergence angle of the second imaging light beam is α2. Similarly, two values of w can be obtained from α1, α2 and the foregoing formula 2: the width w1 of the first imaging light beam and the width w2 of the second imaging light beam. To reduce or avoid beam crosstalk, both w1 and w2 can be smaller than the maximum value of E.
Figure 4 is a second light-path projection diagram of the stereoscopic display device according to an embodiment of this application. As shown in Figure 4, the image generation assembly 101 is configured to generate two imaging light beams, each with a divergence angle α. The dash-dotted line in Figure 4 represents one of the two imaging light beams, and the solid line represents the other. An included angle γ2 exists between the two imaging light beams generated by the image generation assembly 101. The lens 301 is configured to transmit the two imaging light beams, and the propagation directions of the transmitted beams are deflected. An included angle γ1 exists between the two transmitted imaging light beams. The distance between the virtual image formed by the two transmitted imaging light beams and the lens 301 is D. For the descriptions of D, w, E, M, etc., refer to the foregoing related descriptions of Figure 2. As can be seen from Figures 2 and 4, the stereoscopic display device implemented through the transmission principle and the one implemented through the reflection principle have some similarities; therefore, for the description of the former, refer to the description of the latter. In the following examples, the stereoscopic display device implemented through the reflection principle is used as the example.
According to the foregoing description, the image generation assembly 101 is configured to generate two imaging light beams. The structure of the image generation assembly 101 is described below by way of example. Figure 5 is a first schematic structural diagram of the image generation assembly according to an embodiment of this application. As shown in Figure 5, the image generation assembly 101 includes a first light source assembly 501 and a pixel assembly 502. The first light source assembly 501 may be a light emitting diode (LED) light source, a laser diode (LD) light source, or the like. The first light source assembly 501 is configured to output, in a time-division manner, a first light beam and a second light beam with different exit directions to the pixel assembly 502. In Figure 5, the dash-dotted line connected to the first light source assembly 501 represents the first light beam, and the solid line connected to the first light source assembly 501 represents the second light beam. The pixel assembly 502 may be a liquid crystal display (LCD), liquid crystal on silicon (LCOS), a digital micro-mirror device (DMD), or the like, and may be called an image modulator. The pixel assembly 502 is configured to modulate the first light beam and the second light beam respectively with different image information to generate the two imaging light beams, which include a first imaging light beam and a second imaging light beam. In Figure 5, the dash-dotted line connected to the pixel assembly 502 represents the first imaging light beam, and the solid line represents the second imaging light beam.
In practical applications, to alternately output the first light beam and the second light beam in a time-division manner, the first light source assembly 501 may include multiple light source devices. Figure 6 is a second schematic structural diagram of the image generation assembly according to an embodiment of this application. As shown in Figure 6, the first light source assembly 501 includes a first light source device 505 and a second light source device 506. On the basis of Figure 5, the image generation assembly 101 further includes a timing control unit 504. The timing control unit 504 is configured to control the first light source device 505 and the second light source device 506 to alternately output the first light beam and the second light beam in a time-division manner. The timing control unit 504 is further configured to control the pixel assembly 502 to alternately display (load) images of different parallaxes in a time-division manner. For example, in a first time period, the timing control unit 504 controls the pixel assembly 502 to display the left-eye image. In a second time period, the timing control unit 504 controls the first light source device 505 to output the first light beam, and the pixel assembly 502 modulates the first light beam with the left-eye image to obtain the first imaging light beam. In a third time period, the timing control unit 504 controls the pixel assembly 502 to display the right-eye image. In a fourth time period, the timing control unit 504 controls the second light source device 506 to output the second light beam, and the pixel assembly 502 modulates the second light beam with the right-eye image to obtain the second imaging light beam. The first, second, third and fourth time periods are distributed alternately.
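The alternation enforced by the timing control unit 504 can be sketched as a simple schedule of (displayed image, active light source) pairs; the names are illustrative, and real hardware would drive the pixel assembly and the light source devices through device-specific interfaces:

```python
from itertools import cycle, islice

def frame_schedule(n_frames: int):
    """Alternate in a time-division manner: load the left-eye image and fire
    light source 1, then load the right-eye image and fire light source 2."""
    pattern = cycle([("left_image", "light_source_1"),
                     ("right_image", "light_source_2")])
    return list(islice(pattern, n_frames))

schedule = frame_schedule(4)
assert schedule == [("left_image", "light_source_1"),
                    ("right_image", "light_source_2")] * 2
```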
The second imaging light beam and the first imaging light beam have a certain directionality and divergence angle; similarly, the first light beam and the second light beam also have a certain directionality and divergence angle. In practical applications, however, the divergence angles of the first and second light beams may be too large or too small. Therefore, the image generation assembly 101 may further include a beam control unit 503 located between the first light source assembly 501 and the pixel assembly 502. The beam control unit 503 may be a Fresnel screen, a lenticular lens, a lens array, or the like. The beam control unit 503 is configured to change the divergence angle of the first light beam and/or the second light beam, thereby increasing the light utilization of the first light source assembly 501 and the brightness of the generated imaging light, and hence the brightness of the stereoscopic display device.
In the foregoing examples of Figure 5 or Figure 6, the pixel assembly 502 obtains different imaging light beams by modulating different light beams. In practical applications, the pixel assembly 502 may obtain different imaging light beams by modulating the same light beam. Figure 7a is a third schematic structural diagram of the image generation assembly according to an embodiment of this application. As shown in Figure 7a, the image generation assembly 101 includes a second light source assembly 701, a pixel assembly 502 and a lens array 702. The second light source assembly 701 may be an LED light source, an LD light source, or the like. The second light source assembly 701 is configured to output a third light beam to the pixel assembly 502. In Figure 7a, the solid line connected to the second light source assembly 701 represents the third light beam. The pixel assembly 502 is configured to modulate the third light beam according to different image information to generate a first imaging light beam and a second imaging light beam output in different directions. The first and second imaging light beams have a certain directionality and divergence angle.
In Figure 7a, the dash-dotted line connected to the pixel assembly 502 represents the first imaging light beam, and the solid line represents the second imaging light beam. The pixel assembly 502 may include left-eye pixels and right-eye pixels: the left-eye pixels are used to display the left-eye image, and the right-eye pixels are used to display the right-eye image. After modulation, the left-eye pixels emit the first imaging light beam and the right-eye pixels emit the second imaging light beam. The imaging light emitted by the pixel assembly 502 enters the lens array 702, which is configured to transmit the first and second imaging light beams at different angles, so that the first and second imaging light beams output by the lens array 702 have different output (propagation) directions and then propagate to the viewer's left and right eyes respectively. In Figure 7a, the dash-dotted line connected to the lens array 702 represents the first imaging light beam, and the solid line represents the second imaging light beam.
According to the foregoing description of Figure 1, the distance between the image plane of the image generation assembly 101 and the curved mirror 102 is d. The image plane of the image generation assembly 101 may be a pixel assembly or a diffuser screen. For example, in Figure 7a, the light beams output by the second light source assembly 701 carry no image information, and the image plane of the image generation assembly 101 is the pixel assembly 502. For another example, Figure 7b is a fourth schematic structural diagram of the image generation assembly according to an embodiment of this application. As shown in Figure 7b, the image generation assembly 101 includes a projector 703, a diffuser screen 704 and a lens array 702. The projector 703 outputs a third light beam that carries image information. The diffuser screen 704 is a pixelated device configured to enlarge the divergence angle of the third light beam output by the projector 703. The third light beam may carry image information of different parallaxes in a time-division manner, so the diffuser screen 704 can output two imaging light beams carrying image information with different parallaxes. The lens array 702 is configured to transmit the two imaging light beams at different angles. The two imaging light beams include a first imaging light beam and a second imaging light beam; in Figure 7b, the dash-dotted line connected to the lens array 702 represents the first imaging light beam and the solid line represents the second imaging light beam. In Figure 7b, the image plane of the image generation assembly 101 is the diffuser screen 704.
Figure 8 is a schematic structural diagram of the pixel assembly and the lens array according to an embodiment of this application. As shown in Figure 8, the pixel assembly 502 includes N pixel groups 801, where N is an integer greater than 0. Each pixel group 801 includes one first pixel and one second pixel. The first pixel is configured to modulate the third light beam and output a first imaging sub-beam; the second pixel is configured to modulate the third light beam and output a second imaging sub-beam. The first and second imaging sub-beams have a certain directionality and divergence angle. In Figure 8, the dash-dotted line connected to the first pixel represents the first imaging sub-beam, and the solid line connected to the second pixel represents the second imaging sub-beam. The lens array 702 includes N lenses 802. Each lens 802 is configured to transmit one first imaging sub-beam and one second imaging sub-beam, outputting them in certain directions. In Figure 8, the dash-dotted line connected to a lens 802 represents the first imaging sub-beam and the solid line represents the second imaging sub-beam. The N pixel groups 801 correspond one-to-one to the N lenses 802. The N lenses 802 output N first imaging sub-beams and N second imaging sub-beams; the N first imaging sub-beams converge to form the first imaging light beam, and the N second imaging sub-beams converge to form the second imaging light beam.
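The N pixel groups, each pairing one left-eye pixel and one right-eye pixel behind a shared lenslet, amount to interleaving the two parallax images group by group. A toy sketch with 1-D "images" (the pixel values and names are illustrative):

```python
def interleave_pixel_groups(left_pixels, right_pixels):
    """Each pixel group = (a pixel of the left-eye image, a pixel of the
    right-eye image); one lenslet of the array covers one such group."""
    assert len(left_pixels) == len(right_pixels), "both parallax images need equal size"
    return [(l, r) for l, r in zip(left_pixels, right_pixels)]

groups = interleave_pixel_groups(["L0", "L1"], ["R0", "R1"])
assert groups == [("L0", "R0"), ("L1", "R1")]
```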
In the display technology field, whether in 2D display or stereoscopic display, the imaging light received by the user's left and right eyes comes from different positions on the curved mirror or lens. Figure 9a is a third schematic structural diagram of the stereoscopic display device according to an embodiment of this application, and Figure 9b is a fourth. As shown in Figures 9a and 9b, the stereoscopic display device 100 includes an image generation assembly 101 and a curved mirror 102; for the description of the stereoscopic display device 100, refer to the related description of Figure 1. The first of the two imaging light beams is reflected to the left eye via point A of the curved mirror 102, and the second is reflected to the right eye via point B of the curved mirror 102. In practical applications, different positions of the curved mirror 102 have different process errors. The process errors cause the magnification and imaging position of the image observed by the user to deviate from the ideal ones, producing display differences. For example, in Figures 9a and 9b, the two virtual images observed by the user's two eyes are at different positions. Display differences can cause physiological discomfort such as dizziness and degrade the user experience. Therefore, in the embodiments of this application, the image information of different parallaxes can be preprocessed to compensate for the display differences and enhance the display effect. For example, the stereoscopic display device may perform one or more of the following on the left-eye image and/or the right-eye image loaded by the pixel assembly 502:
1. Translating the whole or a part of the left-eye image or the right-eye image.
2. Enlarging or shrinking the whole or a part of the left-eye image or the right-eye image.
3. Distorting the whole or a part of the left-eye image or the right-eye image.
Figure 10a is a fifth schematic structural diagram of the stereoscopic display device according to an embodiment of this application, and Figure 10b is a sixth. As shown in Figures 10a and 10b, on the basis of Figure 1, the stereoscopic display device 100 includes a processor 1001 and an image generation assembly 101. The processor 1001 may be a central processing unit (CPU), a network processor (NP), or a combination of a CPU and an NP. The processor may further include a hardware chip or another general-purpose processor. The hardware chip may be an application-specific integrated circuit (ASIC), a programmable logic device (PLD), or a combination thereof.
The processor 1001 is configured to obtain third image information and preprocess it to obtain first image information. For example, the processor 1001 obtains first coordinate information of a first position; the first position may be the position of the user's left eye. The processor 1001 obtains a mapping table that stores the correspondence between coordinate information and correction parameters, looks up the first correction parameter corresponding to the first coordinate information in the mapping table, and preprocesses the third image information according to the first correction parameter to obtain the first image information. The first correction parameter may, for example, be a translation of 2 pixels to the left. For example, for Figure 9a, the processor 1001 may control the first imaging light beam to shift to the right, obtaining the stereoscopic display device shown in Figure 10a. Before the shift, the image generation assembly 101 generates the first imaging light beam through pixel 1; after the shift, through pixel 2. The position of pixel 1 translated 2 pixels to the right is the position of pixel 2. Alternatively, the processor 1001 may control the second imaging light beam to shift to the left. Similarly, for Figure 9b, the processor 1001 may control the first imaging light beam to shift to the left, obtaining the stereoscopic display device shown in Figure 10b; alternatively, the processor 1001 may control the second imaging light beam to shift to the right.
The processor 1001 may also be configured to obtain fourth image information and preprocess it to obtain second image information. For example, the processor 1001 may obtain second coordinate information of a second position; the second position may be the position of the user's right eye. The processor 1001 looks up the second correction parameter corresponding to the second coordinate information in the mapping table, and preprocesses the fourth image information according to the second correction parameter to obtain the second image information. For example, the second correction parameter may be an overall reduction of 5%.
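The look-up-and-correct flow described above (eye coordinates → correction parameter → preprocessed image) can be sketched as follows; the mapping-table contents, the shift amount, and the scale factor are illustrative assumptions, not values from the patent:

```python
# Hypothetical mapping from (eye, eye coordinates) to a correction parameter.
correction_table = {
    ("left", (10, 0)): {"shift_px": -2},    # translate the left-eye image 2 px
    ("right", (74, 0)): {"scale": 0.95},    # shrink the right-eye image by 5%
}

def preprocess(image_cols, correction):
    """Apply a horizontal pixel shift (cyclic, for simplicity) and/or a
    uniform scale to a 1-D 'image' of column values."""
    cols = list(image_cols)
    if "shift_px" in correction:
        k = correction["shift_px"] % len(cols)
        cols = cols[-k:] + cols[:-k] if k else cols
    if "scale" in correction:
        cols = [v * correction["scale"] for v in cols]
    return cols

shifted = preprocess([1, 2, 3, 4], correction_table[("left", (10, 0))])
assert shifted == [3, 4, 1, 2]
```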
The pixel assembly 502 in the image generation assembly 101 may include a display circuit and a display panel. The display circuit may also be called a display controller (DC) and has a display control function. The display circuit is configured to receive the first image information and the second image information output by the processor 1001, and to control, according to the first and second image information, the display panel to display a first image and a second image, where the first image information corresponds to the first image and the second image information corresponds to the second image. The function of the above timing control unit 504 may be implemented by the display circuit.
In the foregoing Figure 5, the first light source assembly 501 outputs the first light beam and the second light beam in a time-division manner. Therefore, before the processor 1001 preprocesses the image information of different parallaxes, the pixel assembly 502 may modulate the first and second light beams with the same pixels to obtain the two imaging light beams corresponding to the left and right eyes. Figure 11 is a seventh schematic structural diagram of the stereoscopic display device according to an embodiment of this application. As shown in Figure 11, the first light source assembly 501 outputs the first light beam and the second light beam in a time-division manner. Specifically, LED1 in the first light source assembly 501 generates the first light beam, and LED2 in the first light source assembly 501 generates the second light beam. The first and second light beams pass through the beam control unit 503 and reach the pixel assembly 502. The pixel assembly 502 modulates the first or second light beam through pixel 1 in a time-division manner to obtain the two imaging light beams; that is, the pixel assembly 502 uses the same pixels to display the left-eye and right-eye images. After the processor 1001 preprocesses the image information of different parallaxes, the pixel assembly 502 may modulate the first and second light beams with different pixels to obtain the two imaging light beams; that is, the pixel assembly 502 uses different pixels to display the left-eye and right-eye images. Figure 12 is an eighth schematic structural diagram of the stereoscopic display device according to an embodiment of this application. As shown in Figure 12, on the basis of Figure 11, the first light source assembly 501 outputs the first light beam and the second light beam in a time-division manner. For example, LED3 in the first light source assembly 501 generates the first light beam, and LED2 generates the second light beam. The first and second light beams pass through the beam control unit 503 and reach the pixel assembly 502, which modulates the first and second light beams through different pixels in a time-division manner to obtain the two imaging light beams. Specifically, the pixel assembly 502 modulates the first light beam through pixel 2 to obtain the first imaging light beam, which is reflected by the curved mirror 102 into the left eye, and modulates the second light beam through pixel 1 to obtain the second imaging light beam, which is reflected by the curved mirror 102 into the right eye. The two virtual images observed by the user's two eyes are at the same position, no dizziness occurs, and the stereoscopic display effect is improved. It should be understood that the description of Figure 12 is only an example. For instance, in practical applications, after the display panel preprocesses the displayed images, the pixel assembly 502 may change the pixels used for both beams at the same time: the pixel assembly 502 modulates the first light beam through pixel 2 to obtain the first imaging light beam, and modulates the second light beam through pixel 3 to obtain the second imaging light beam.
It should be understood that, in the embodiments of this application, pixel 1, pixel 2, the first pixel, the second pixel, etc. may each refer to one pixel or to a set of multiple pixels; this application does not limit this. Similarly, LED1, LED2, etc. may each refer to one LED or to a set of multiple LEDs. Likewise, the first light source device or the second light source device may refer to one LED or to a set of multiple LEDs.
According to the foregoing description, this embodiment can compensate for display errors through the preprocessing of the processor 1001. In practical applications, to avoid over-processing by the processor 1001, preprocessing may be performed only when the display error is greater than a threshold. The following is an example of computing the display error. Figure 13 is a third light-path projection diagram of the stereoscopic display device according to an embodiment of this application. As shown in Figure 13, two pixels of the image generation assembly 101 output two imaging light beams, which include a first imaging light beam and a second imaging light beam. The two pixels include pixel 1 and pixel 2. The coordinates of pixel 1 are (X_Oleft, Y_Oleft), and the coordinates of pixel 2 are (X_Oright, Y_Oright). The curved mirror 102 is configured to reflect the first imaging light beam output by pixel 1; the reflected first imaging light beam illuminates the user's left eye, whose coordinates are (X_left, Y_left). The curved mirror 102 is also configured to reflect the second imaging light beam output by pixel 2; the reflected second imaging light beam illuminates the user's right eye, whose coordinates are (X_right, Y_right).
In the ideal state without display errors, the virtual images corresponding to pixel 1 and pixel 2 are at the same virtual image point on the virtual image plane 1301, whose coordinates are (X_V, Y_V). When a display error exists, the virtual image corresponding to pixel 1 is at virtual image point 1 on the virtual image plane 1301, with coordinates (X_V1, Y_V1), and the virtual image corresponding to pixel 2 is at virtual image point 2 on the virtual image plane 1301, with coordinates (X_V2, Y_V2).
In this case, the display error is computed as Δ = |(X_V1, Y_V1) − (X_V2, Y_V2)|. The above pixel 1 and pixel 2 form a pair of sampling points, displaying the left-eye image and the right-eye image respectively.
If, for multiple pairs of sampling points within a certain region, the average of the display error Δ is greater than a threshold, the processor 1001 needs to perform preprocessing. For example, the threshold may be tan(2.5 mrad)×S, where S is the distance between the user's eyes and the curved mirror 102.
The above preprocessing can change the pixels included in a pair of sampling points. For example, after preprocessing, a pair of sampling points includes pixel 1 and pixel 3. The virtual image of pixel 3 is projected onto virtual image point 3 of the virtual image plane 1301, with coordinates (X_V3, Y_V3). By a method similar to the foregoing, the display error is computed as Δ = |(X_V1, Y_V1) − (X_V3, Y_V3)|. If the display error Δ is smaller than the threshold, the corrected display error is within an acceptable range and the processor 1001 does not need to preprocess the image information. If, after preprocessing, Δ is still greater than or equal to the threshold, the processor 1001 may perform further preprocessing until Δ is smaller than the threshold.
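The sampling-pair check can be sketched as follows, taking Δ as the Euclidean distance between two virtual-image points and using the tan(2.5 mrad)×S threshold from the text (the example coordinates are illustrative):

```python
import math

def display_error(p1, p2):
    """Delta: distance between the virtual-image points of a left/right sampling pair."""
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def needs_preprocessing(pairs, s_mm: float) -> bool:
    """True if the mean error over sampled pairs exceeds tan(2.5 mrad) * S."""
    mean_err = sum(display_error(a, b) for a, b in pairs) / len(pairs)
    return mean_err > math.tan(2.5e-3) * s_mm

# Eyes 800 mm from the mirror -> threshold ~ 2.0 mm.
assert needs_preprocessing([((0.0, 0.0), (3.0, 0.0))], 800)       # 3 mm apart: correct
assert not needs_preprocessing([((0.0, 0.0), (0.5, 0.0))], 800)   # 0.5 mm: acceptable
```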
Referring to Figure 14, Figure 14 is a schematic circuit diagram of a stereoscopic display device according to an embodiment of this application.
As shown in Figure 14, the circuit in the display device mainly includes a processor 1001, an internal memory 1002, an external memory interface 1003, an audio module 1004, a video module 1005, a power module 1006, a wireless communication module 1007, an I/O interface 1008, a video interface 1009, a controller area network (CAN) transceiver 1010, a display circuit 1028 and a display panel 1029. The processor 1001 and its peripheral elements, for example the memory 1002, the CAN transceiver 1010, the audio module 1004, the video module 1005, the power module 1006, the wireless communication module 1007, the I/O interface 1008, the video interface 1009, the touch unit and the display circuit 1028, may be connected by a bus. The processor 1001 may be called a front-end processor.
In addition, the circuit diagram shown in this embodiment of this application does not constitute a specific limitation on the display device. In other embodiments of this application, the display device may include more or fewer components than shown, combine some components, split some components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 1001 includes one or more processing units. For example, the processor 1001 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be independent devices or may be integrated in one or more processors.
A memory may also be provided in the processor 1001 to store instructions and data, for example the operating system of the display device, the AR Creator software package, etc. In some embodiments, the memory in the processor 1001 is a cache. This memory may store instructions or data that the processor 1001 has just used or uses cyclically. If the processor 1001 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated accesses and reduces the waiting time of the processor 1001, thereby improving the efficiency of the system.
In addition, if the display device in this embodiment is installed on a vehicle, the functions of the processor 1001 may be implemented by a domain controller on the vehicle.
In some embodiments, the display device may further include multiple input/output (I/O) interfaces 1008 connected to the processor 1001. The interfaces 1008 may include, but are not limited to, an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc. The I/O interfaces 1008 may be connected to devices such as a mouse, a touch screen, a keyboard, a camera, a speaker and a microphone, and may also be connected to physical keys on the display device (for example, a volume key, a brightness adjustment key, and a power key).
The internal memory 1002 may be used to store computer-executable program code, which includes instructions. The memory 1002 may include a program storage area and a data storage area. The program storage area may store the operating system, applications required by at least one function (for example, a call function, a time-setting function, an AR function, etc.), and so on. The data storage area may store data created during use of the display device (for example, a phone book, world time, etc.). In addition, the internal memory 1002 may include a high-speed random access memory and may also include a non-volatile memory, for example at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), etc. By running the instructions stored in the internal memory 1002 and/or the instructions stored in the memory provided in the processor 1001, the processor 1001 executes the various functional applications and data processing of the display device.
The external memory interface 1003 may be used to connect an external memory (for example, a Micro SD card). The external memory may store data or program instructions as needed, and the processor 1001 may perform operations such as reading and writing on the data or programs through the external memory interface 1003.
The audio module 1004 is configured to convert digital audio information into an analog audio signal output, and also to convert an analog audio input into a digital audio signal. The audio module 1004 may also be configured to encode and decode audio signals, for example for playback or recording. In some embodiments, the audio module 1004 may be provided in the processor 1001, or some functional modules of the audio module 1004 may be provided in the processor 1001. The display device may implement audio functions through the audio module 1004, the application processor, and the like.
The video interface 1009 may receive externally input audio and video; it may specifically be a high definition multimedia interface (HDMI), a digital visual interface (DVI), a video graphics array (VGA), a display port (DP), a low voltage differential signaling (LVDS) interface, etc. The video interface 1009 may also output video. For example, the display device receives, through the video interface, video data sent by a navigation system or video data sent by a domain controller.
The video module 1005 may decode the video input through the video interface 1009, for example perform H.264 decoding. The video module may also encode video captured by the display device, for example perform H.264 encoding on video captured by an external camera. In addition, the processor 1001 may also decode the video input through the video interface 1009 and then output the decoded image signal to the display circuit.
Further, the display device also includes a CAN transceiver 1010, which may be connected to the CAN bus (CAN BUS) of the automobile. Through the CAN bus, the display device can communicate with the in-vehicle entertainment system (music, radio, video module), the vehicle status system, and the like. For example, the user may turn on the in-vehicle music playback function by operating the display device. The vehicle status system may send vehicle status information (doors, seat belts, etc.) to the display device for display.
The display circuit 1028 and the display panel 1029 jointly implement the function of displaying images. The display circuit 1028 receives the image signal output by the processor 1001, processes the image signal, and inputs it to the display panel 1029 for imaging. The display circuit 1028 may also control the image displayed by the display panel 1029, for example control parameters such as display brightness or contrast. The display circuit 1028 may include a driving circuit, an image control circuit, and the like. The above display circuit 1028 and display panel 1029 may be located in the pixel assembly 502.
The display panel 1029 is configured to modulate the light beam input from the light source according to the input image signal, thereby generating a visible image. The display panel 1029 may be a liquid crystal on silicon panel, a liquid crystal display panel, or a digital micro-mirror device.
In this embodiment, the video interface 1009 may receive input video data (also called a video source); the video module 1005 performs decoding and/or digital processing and outputs an image signal to the display circuit 1028; and the display circuit 1028 drives, according to the input image signal, the display panel 1029 to image the light beam emitted by the light source, thereby generating a visible image (emitting imaging light).
The power module 1006 is configured to supply power to the processor 1001 and the light source according to input power (for example, direct current). The power module 1006 may include a rechargeable battery, which may supply power to the processor 1001 and the light source. The light emitted by the light source may be transmitted to the display panel 1029 for imaging, thereby forming an image light signal (imaging light).
In addition, the power module 1006 may be connected to a power supply module of the automobile (for example, a power battery), and the power supply module of the automobile supplies power to the power module 1006 of the display device.
The wireless communication module 1007 enables the display device to communicate wirelessly with the outside world; it may provide wireless communication solutions such as a wireless local area network (WLAN) (for example, a wireless fidelity (Wi-Fi) network), Bluetooth (BT), a global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR). The wireless communication module 1007 may be one or more devices integrating at least one communication processing module. The wireless communication module 1007 receives electromagnetic waves via an antenna, performs frequency modulation and filtering on the electromagnetic wave signals, and sends the processed signals to the processor 1001. The wireless communication module 1007 may also receive signals to be sent from the processor 1001, perform frequency modulation and amplification on them, and convert them into electromagnetic waves via the antenna for radiation.
In addition, besides being input through the video interface 1009, the video data decoded by the video module 1005 may also be received wirelessly through the wireless communication module 1007 or read from the internal memory 1002 or an external memory. For example, the display device may receive video data from a terminal device or the in-vehicle entertainment system through the wireless local area network in the vehicle; the display device may also read the audio and video data stored in the internal memory 1002 or an external memory.
An embodiment of this application further provides a vehicle, on which any of the foregoing stereoscopic display devices is installed. The two imaging light beams carry image information with different parallaxes. The two output imaging light beams are reflected by a reflector to the windshield, and the windshield further reflects the two imaging light beams to form a virtual image. The virtual image is on one side of the windshield, and the driver or passenger is on the other side. The two reflected imaging light beams illuminate the driver's or passenger's two eyes respectively; for example, the first imaging light beam illuminates the passenger's left eye, and the second imaging light beam illuminates the passenger's right eye.
An embodiment of this application further provides a vehicle, on which any of the foregoing stereoscopic display devices is installed. Figure 15 is a schematic diagram of a stereoscopic display device installed on a vehicle according to an embodiment of this application. The windshield of the vehicle may serve as the curved mirror or the lens in the stereoscopic display device. When the windshield serves as the curved mirror, the image generation assembly 101 and the driver or passenger are located on the same side of the windshield. When the windshield serves as the lens, the image generation assembly 101 and the driver or passenger are located on different sides of the windshield. The image generation assembly 101 is configured to output two imaging light beams, which carry image information with different parallaxes. The windshield is configured to reflect or transmit the two imaging light beams to form a virtual image. The virtual image is on one side of the windshield, and the driver or passenger is on the other side. The two reflected or transmitted imaging light beams illuminate the driver's or passenger's two eyes respectively; for example, the first imaging light beam illuminates the passenger's left eye, and the second imaging light beam illuminates the passenger's right eye.
For example, the vehicle may be a car, truck, motorcycle, bus, boat, airplane, helicopter, lawn mower, recreational vehicle, amusement-park vehicle, construction equipment, trolley, golf cart, train, handcart, etc., which is not specifically limited in the embodiments of this application. The stereoscopic display device may be installed on the instrument panel (IP) of the vehicle, at the front-passenger or driver position, or may be installed on the back of a seat. When the above stereoscopic display device is used in a vehicle, it may be called a head-up display (HUD) and may be used to display navigation information, vehicle speed, battery/fuel level, etc.
Figure 16 is a schematic diagram of a possible functional framework of the vehicle according to an embodiment of this application.
As shown in Figure 16, the functional framework of the vehicle may include various subsystems, for example the control system 14, the sensor system 12, one or more peripheral devices 16 (one is shown as an example), the power supply 18, the computer system 20 and the display system 32 in the figure. Optionally, the vehicle may further include other functional systems, for example an engine system that provides power for the vehicle, which is not limited in this application.
The sensor system 12 may include several detection devices that sense measured information and convert it, according to certain rules, into electrical signals or other required forms of information for output. As shown in the figure, these detection devices may include a global positioning system (GPS), a vehicle speed sensor, an inertial measurement unit (IMU), a radar unit, a laser rangefinder, a camera device, a wheel speed sensor, a steering sensor, a gear sensor, or other elements used for automatic detection, which are not limited in this application.
The control system 14 may include several elements, for example the illustrated steering unit, braking unit, lighting system, autonomous driving system, map navigation system, network time synchronization system and obstacle avoidance system. Optionally, the control system 14 may also include elements such as a throttle controller and an engine controller for controlling the driving speed of the vehicle, which are not limited in this application.
The peripheral devices 16 may include several elements, for example the communication system, touch screen, user interface, microphone and speaker in the figure. The communication system is used to realize network communication between the vehicle and devices other than the vehicle. In practical applications, the communication system may use wireless communication technology or wired communication technology to realize network communication between the vehicle and other devices. The wired communication technology may mean that the vehicle communicates with other devices through network cables, optical fibers, or the like.
The power supply 18 represents a system that provides power or energy for the vehicle; it may include, but is not limited to, a rechargeable lithium battery or lead-acid battery. In practical applications, one or more battery components in the power supply provide the electric energy or power for starting the vehicle; the type and material of the power supply are not limited in this application.
Several functions of the vehicle are controlled and implemented by the computer system 20. The computer system 20 may include one or more processors 2001 (one processor is shown as an example) and a memory 2002 (which may also be called a storage device). In practical applications, the memory 2002 may be inside the computer system 20 or outside the computer system 20, for example serving as a cache in the vehicle, which is not limited in this application.
For the description of the processor 2001, refer to the foregoing description of the processor 1001. The processor 2001 may include one or more general-purpose processors, for example a graphics processing unit (GPU). The processor 2001 may be configured to run the relevant programs stored in the memory 2002, or the instructions corresponding to the programs, to implement the corresponding functions of the vehicle.
The memory 2002 may include a volatile memory, for example a RAM; it may also include a non-volatile memory, for example a ROM, a flash memory or solid state drives (SSD); the memory 2002 may also include a combination of the above types of memory. The memory 2002 may be used to store a set of program code, or instructions corresponding to the program code, so that the processor 2001 can call the program code or instructions stored in the memory 2002 to implement the corresponding functions of the vehicle. These functions include, but are not limited to, some or all of the functions in the schematic diagram of the vehicle functional framework shown in Figure 16. In this application, a set of program code for vehicle control may be stored in the memory 2002, and the processor 2001 calls the program code to control the safe driving of the vehicle; how safe driving of the vehicle is achieved is described in detail below in this application.
Optionally, in addition to program code or instructions, the memory 2002 may also store information such as road maps, driving routes and sensor data. The computer system 20 may implement vehicle-related functions in combination with other elements in the schematic diagram of the vehicle functional framework, for example the sensors in the sensor system, the GPS, and so on. For example, the computer system 20 may control the driving direction or driving speed of the vehicle based on data input from the sensor system 12, which is not limited in this application.
The display system 32 may include several elements, for example a controller and the stereoscopic display device 100 described above. The controller is configured to generate an image according to user instructions (for example, generate an image containing vehicle status such as vehicle speed and battery/fuel level, and an image of augmented reality (AR) content), and send the image content to the stereoscopic display device 100. The image generation assembly 101 in the stereoscopic display device 100 is configured to output two imaging light beams carrying different image information. The curved mirror 102 in the stereoscopic display device 100 is the windshield. The windshield is configured to reflect or transmit the two imaging light beams, so that a virtual image corresponding to the image content is presented in front of the driver or passenger. It should be noted that the functions of some elements in the display system 32 may also be implemented by other subsystems of the vehicle; for example, the controller may also be an element in the control system 14.
Figure 16 of this application shows four subsystems; the sensor system 12, the control system 14, the computer system 20 and the display system 32 are only examples and do not constitute a limitation. In practical applications, the vehicle may combine several elements in the vehicle according to different functions, thereby obtaining subsystems with correspondingly different functions. In practical applications, the vehicle may include more or fewer systems or elements, which is not limited in this application.
In the description of this specification, the specific features, structures, materials or characteristics may be combined in a suitable manner in any one or more embodiments or examples.
The foregoing are only specific implementations of this application, but the protection scope of this application is not limited thereto. Any variation or replacement readily figured out by a person skilled in the art within the technical scope disclosed in this application shall fall within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (15)

  1. A stereoscopic display device, comprising an image generation assembly and a curved mirror, wherein:
    the image generation assembly is configured to generate two imaging light beams, and the two imaging light beams carry image information with different parallaxes;
    the curved mirror is configured to reflect the two imaging light beams, an included angle exists between the two reflected imaging light beams, the focal length of the curved mirror is f, the distance between the image plane of the image generation assembly and the curved mirror is d, and d is less than f.
  2. The stereoscopic display device according to claim 1, wherein the distance between the virtual image formed by the two reflected imaging light beams and the curved mirror is D, and D satisfies the following formula: D = f·d/(f − d).
  3. The stereoscopic display device according to claim 2, wherein an included angle γ1 exists between the two reflected imaging light beams, and γ1 satisfies the following formula: E − w < tan(γ1)×(S + D) < E + 2w, wherein S is the distance between the receiving position of the two imaging light beams and the curved mirror, E ranges from 53 mm to 73 mm, and w is the width, at the receiving position, of at least one of the two imaging light beams reflected by the curved mirror.
  4. The stereoscopic display device according to claim 2, wherein an included angle γ2 exists between the two imaging light beams before they are reflected by the curved mirror, and γ2 satisfies the following formula: E − w < tan(γ2)×(d/D)×(S + D) < E + 2w, wherein S is the distance between the receiving position of the two imaging light beams and the curved mirror, E ranges from 53 mm to 73 mm, and w is the width, at the receiving position, of at least one of the two imaging light beams reflected by the curved mirror.
  5. The stereoscopic display device according to claim 3, wherein the divergence angle of each of the two imaging light beams before they are reflected by the curved mirror is α, and w satisfies the following formula: w = tan(α)×(S + D), wherein w is less than 73 mm.
  6. The stereoscopic display device according to any one of claims 1 to 5, wherein the image generation assembly comprises a first light source assembly and a pixel assembly;
    the first light source assembly is configured to output, in a time-division manner, a first light beam and a second light beam with different exit directions to the pixel assembly;
    the pixel assembly is configured to modulate the first light beam and the second light beam respectively according to different image information, to generate the two imaging light beams.
  7. The stereoscopic display device according to claim 6, wherein the first light source assembly comprises a first light source device and a second light source device, and the first light source device and the second light source device are configured to alternately output the first light beam and the second light beam in a time-division manner.
  8. The stereoscopic display device according to claim 7, wherein the image generation assembly further comprises a timing control unit;
    the timing control unit is configured to control the first light source device and the second light source device to alternately output the first light beam and the second light beam in a time-division manner;
    the timing control unit is further configured to control the pixel assembly to modulate the first light beam and the second light beam with different image information in a time-division manner.
  9. The stereoscopic display device according to any one of claims 1 to 5, wherein the two imaging light beams comprise a first imaging light beam and a second imaging light beam, and the image generation assembly comprises a second light source assembly, a pixel assembly and a lens array;
    the second light source assembly is configured to output a third light beam to the pixel assembly;
    the pixel assembly is configured to modulate the third light beam with different image information to generate the first imaging light beam and the second imaging light beam;
    the lens array is configured to transmit the first imaging light beam and the second imaging light beam at different angles.
  10. The stereoscopic display device according to claim 9, wherein the pixel assembly comprises a first pixel and a second pixel;
    the first pixel is configured to modulate the third light beam according to first image information to generate the first imaging light beam;
    the second pixel is configured to modulate the third light beam according to second image information to generate the second imaging light beam.
  11. The stereoscopic display device according to any one of claims 1 to 10, wherein the image information with different parallaxes comprises first image information and second image information;
    the stereoscopic display device further comprises a processor;
    the processor is configured to preprocess third image information to obtain the first image information;
    the processor is further configured to preprocess fourth image information to obtain the second image information.
  12. The stereoscopic display device according to claim 11, wherein
    the processor is further configured to obtain first coordinate information of a first position and second coordinate information of a second position, one of the two imaging light beams illuminates the first position, and the other of the two imaging light beams illuminates the second position;
    that the processor is configured to preprocess the third image information comprises: the processor is configured to preprocess the third image information according to the first coordinate information;
    that the processor is configured to preprocess the fourth image information comprises: the processor is configured to preprocess the fourth image information according to the second coordinate information.
  13. The stereoscopic display device according to any one of claims 1 to 12, wherein f is less than 300 mm.
  14. The stereoscopic display device according to any one of claims 1 to 13, wherein the image generation assembly comprises a projector and a diffuser screen, the projector is configured to generate the two imaging light beams, and the diffuser screen is configured to receive the two imaging light beams, diffuse the two imaging light beams, and output the diffused two imaging light beams;
    that the curved mirror is configured to reflect the two imaging light beams comprises: the curved mirror is configured to reflect the diffused two imaging light beams.
  15. A vehicle, comprising the stereoscopic display device according to any one of claims 1 to 14, wherein the stereoscopic display device is installed on the vehicle.
PCT/CN2023/076650 2022-05-10 2023-02-17 立体显示装置和交通工具 WO2023216670A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210505264.7 2022-05-10
CN202210505264.7A CN117075359A (zh) 2022-05-10 2022-05-10 立体显示装置和交通工具

Publications (1)

Publication Number Publication Date
WO2023216670A1 true WO2023216670A1 (zh) 2023-11-16

Family

ID=86469458

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/076650 WO2023216670A1 (zh) 2022-05-10 2023-02-17 立体显示装置和交通工具

Country Status (2)

Country Link
CN (2) CN117075359A (zh)
WO (1) WO2023216670A1 (zh)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019219555A (ja) * 2018-06-21 2019-12-26 創智車電股▲ふん▼有限公司Conserve&Associates,Inc. ディスプレイ装置、および、それを用いた自動車のヘッドアップディスプレイシステム(display device and automobile head−up display system using the same)
JP2021021914A (ja) * 2019-07-30 2021-02-18 怡利電子工業股▲ふん▼有限公司 裸眼3d反射型拡散片ヘッドアップディスプレイ装置
CN112526748A (zh) * 2019-09-02 2021-03-19 未来(北京)黑科技有限公司 一种抬头显示设备、成像***和车辆
CN112639581A (zh) * 2020-10-31 2021-04-09 华为技术有限公司 抬头显示器和抬头显示方法
JP2021067909A (ja) * 2019-10-28 2021-04-30 日本精機株式会社 立体表示装置及びヘッドアップディスプレイ装置
CN213457538U (zh) * 2020-09-08 2021-06-15 未来(北京)黑科技有限公司 抬头显示装置及抬头显示***
CN114137725A (zh) * 2020-09-04 2022-03-04 未来(北京)黑科技有限公司 一种可显示三维图像的抬头显示***

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130035587A (ko) * 2011-09-30 2013-04-09 엘지디스플레이 주식회사 Stereoscopic image display device and method for manufacturing the same
KR101322910B1 (ko) * 2011-12-23 2013-10-29 한국과학기술연구원 Multi-view 3D image display apparatus using dynamic viewing-zone expansion applicable to multiple observers, and method therefor
CN104536578B (zh) * 2015-01-13 2018-02-16 京东方科技集团股份有限公司 Control method and apparatus for a glasses-free 3D display device, and glasses-free 3D display device
CN105025289B (zh) * 2015-08-10 2017-08-08 重庆卓美华视光电有限公司 Stereoscopic display method and apparatus
CN105404011B (zh) * 2015-12-24 2017-12-12 深圳点石创新科技有限公司 3D image correction method for a head-up display, and head-up display
CN108663807B (zh) * 2017-03-31 2021-06-01 宁波舜宇车载光学技术有限公司 Head-up display optical system and apparatus, and imaging method thereof
JP6873850B2 (ja) * 2017-07-07 2021-05-19 京セラ株式会社 Image projection apparatus and mobile body
CN110874867A (zh) * 2018-09-03 2020-03-10 广东虚拟现实科技有限公司 Display method and apparatus, terminal device, and storage medium
CN109462750A (zh) * 2018-12-29 2019-03-12 上海玮舟微电子科技有限公司 Head-up display system, information display method, apparatus, and medium
CN113661432B (zh) * 2019-06-26 2023-07-14 Jvc建伍株式会社 Head-up display apparatus
CN114153066A (zh) * 2020-09-08 2022-03-08 未来(北京)黑科技有限公司 Head-up display apparatus and head-up display system
CN112752085A (zh) * 2020-12-29 2021-05-04 北京邮电大学 Glasses-free 3D video playback system and method based on human-eye tracking

Also Published As

Publication number Publication date
CN116184686A (zh) 2023-05-30
CN117075359A (zh) 2023-11-17

Similar Documents

Publication Publication Date Title
CN112639581B (zh) Head-up display and head-up display method
WO2021054277A1 (ja) Head-up display and image display system
WO2024021574A1 (zh) Stereoscopic projection system, projection system, and vehicle
WO2024017038A1 (zh) Image generation apparatus, display device, and vehicle
WO2021015171A1 (ja) Head-up display
WO2023216670A1 (zh) Stereoscopic display device and vehicle
US20240036311A1 (en) Head-up display
CN217360538U (zh) Projection system, display device, and vehicle
WO2024021852A1 (zh) Stereoscopic display apparatus, stereoscopic display system, and vehicle
US20230152586A1 (en) Image generation device and head-up display
WO2020218072A1 (ja) Vehicle head-up display and light source unit used therein
WO2024098828A1 (zh) Projection system, projection method, and vehicle
CN116165808B (zh) Stereoscopic display apparatus, stereoscopic display system, and vehicle
WO2023185293A1 (zh) Image generation apparatus, display device, and vehicle
JP7492971B2 (ja) Head-up display
WO2023130759A1 (zh) Display apparatus and vehicle
US20240069335A1 (en) Head-up display
CN115542644B (zh) Projection apparatus, display device, and vehicle
WO2023098228A1 (zh) Display apparatus, electronic device, and vehicle
WO2024001225A1 (zh) Virtual image display apparatus, image data generation method and apparatus, and related device
WO2024065332A1 (zh) Display module, optical display system, terminal device, and image display method
WO2024041034A1 (zh) Display module, optical display system, terminal device, and imaging method
WO2022009605A1 (ja) Image generation device and head-up display
WO2023040669A1 (zh) Head-up display device and vehicle
CN116500784A (zh) Display apparatus and vehicle

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23802446

Country of ref document: EP

Kind code of ref document: A1