WO2023071548A1 - Optical display device, display system, vehicle, and color adjustment method - Google Patents

Optical display device, display system, vehicle, and color adjustment method

Info

Publication number
WO2023071548A1
WO2023071548A1 (PCT/CN2022/117928)
Authority
WO
WIPO (PCT)
Prior art keywords
light
module
light source
color
image
Prior art date
Application number
PCT/CN2022/117928
Other languages
English (en)
French (fr)
Inventor
黄东晨
李洪辉
萧壮义
Original Assignee
Huawei Technologies Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to EP22885437.8A (published as EP4403998A1)
Publication of WO2023071548A1


Classifications

    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B27/283 Optical systems or apparatus for polarising, used for beam splitting or combining
    • G02B2027/0112 Head-up displays comprising a device for generating colour display
    • G02B2027/0114 Head-up displays comprising a device for generating colour display comprising dichroic elements
    • B60R1/22 Real-time viewing arrangements for drivers or passengers using optical image capturing systems (e.g. cameras or video systems specially adapted for use in or on vehicles) for viewing an area outside the vehicle
    • G03B21/00 Projectors or projection-type viewers; accessories therefor

Definitions

  • the present application relates to the technical field of image display, in particular to an optical display device, a display system, a vehicle and a color adjustment method.
  • the head-up display (HUD) system is a system that projects driving-related information (such as instrument information or navigation information) to the front of the driver's field of vision.
  • the driver can see the instrument information and navigation information in front of the field of vision without looking down at the steering wheel.
  • compared with looking down at the instrument panel or central control display, this can shorten the braking reaction time in emergency situations, thereby improving driving safety.
  • the driving environment of vehicles is complicated. For example, different seasons may bring a humid environment in spring, high-temperature and heavy-rain environments in summer, a strong-wind environment in autumn and a cold, snowy environment in winter, all of which may affect the components in the HUD. Therefore, the HUD needs to set a specific working mode according to the specific environment. The image (or picture) displayed by the optical display device (or light engine) in the HUD may be set to a specific brightness and color (which can be represented by color coordinates), but differences in the environment will cause the image displayed by the optical display device to exhibit color shift or brightness attenuation. Here, color shift means that the color of the actually displayed image is obviously different from the intended color of the image.
  • the present application provides an optical display device, a display system, a vehicle and a color adjustment method for reducing the color shift of an image displayed by the optical display device.
  • the present application provides an optical display device, which includes a light source module, a light splitting module, a modulation module and a sensing module.
  • the light source module is used for emitting (outputting) light, and the light is synthesized by light of at least two colors.
  • the light splitting module is used to divide the light from the light source module into the first light and the second light, transmit the first light to the sensing module, and transmit the second light to the modulation module.
  • the modulation module is used to modulate the second light to obtain image light carrying image information.
  • the sensing module is used to acquire the color information of the first light, and the color information is used by the light source module to adjust the weights of at least two colors of light.
  • the above modules may be referred to as components or modules, such as light source components, light source modules, modulation components and modulation modules.
  • the color information of the light currently emitted by the light source module can be detected through the sensor module, that is, the first light can be used as the calibration of the color of the light emitted by the light source module.
  • based on the color information detected by the sensing module, the weights of the at least two colors of light in the synthesized light are adjusted, so that color correction of the image light can be realized without adding optical structures, thus helping to reduce the color shift (also called chromatic aberration) of the image displayed by the optical display device.
  • the following exemplarily shows three possible schemes for adjusting the weights of light of at least two colors based on color information.
  • the light source module adjusts the weights of light of at least two colors based on the color information.
  • the light source module includes a processing component and a light emitting component.
  • the processing component is used to receive the color information from the sensor module, generate a control signal for adjusting the weight of light of at least two colors according to the color information, and send the control signal to the light-emitting component; the light-emitting component is used to adjust at least two colors according to the control signal. The weight of the two colors of light.
  • the processing component in the light source module can control the light emitting component to adjust the weight of the light of each color of the synthesized light based on the color information obtained from the sensing module, so as to minimize the color cast of the image.
  • since the sensing module is mainly used to detect the color information of the first light, there is no need to change the structure and operation logic of the existing sensing module, so that the scheme can be compatible with existing sensing modules.
  • the processing component is specifically configured to determine the color coordinates of the image light according to the received color information, and generate a control signal if the difference between the color coordinates of the image light and a preset target color coordinate is greater than a threshold.
  • by generating the control signal only when the difference between the color coordinates of the image light and the preset target color coordinates is greater than a threshold, frequent adjustment of the weights of each color of the synthesized light can be avoided, thereby reducing the computation load of the processing component.
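  • The following is a minimal, illustrative sketch of this threshold check and weight adjustment (the function and variable names, the target coordinates, the threshold value and the simple proportional correction rule are all assumptions for this example and are not taken from the application):

    # Illustrative sketch only; names, constants and the correction rule are assumptions.
    from math import hypot

    TARGET_XY = (0.313, 0.329)   # assumed preset target color coordinates
    THRESHOLD = 0.005            # assumed allowed difference in color coordinates

    def generate_control_signal(measured_xy, current_weights):
        """Return new (r, g, b) weights only when the color error exceeds the threshold."""
        dx = measured_xy[0] - TARGET_XY[0]
        dy = measured_xy[1] - TARGET_XY[1]
        if hypot(dx, dy) <= THRESHOLD:
            return None              # within tolerance: no control signal is generated
        r, g, b = current_weights
        r = max(0.0, r - dx)         # crude proportional correction (placeholder rule)
        g = max(0.0, g - dy)
        total = r + g + b
        return (r / total, g / total, b / total)   # renormalized so the weights sum to 1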
  • the sensing module controls the light source module to adjust the weights of light of at least two colors based on the color information.
  • the sensing module is used to acquire the color information of the first light, generate a control signal for adjusting the weights of light of at least two colors according to the color information, and send the control signal to the light source module.
  • the light source module can be controlled by the sensing module to adjust the weights of light of at least two colors.
  • the structure and operation logic of the existing light source module may not be changed, thereby being compatible with the existing light source module.
  • a processing component independent of the light source module and the sensor module can control the light source module to adjust the weights of light of at least two colors based on color information.
  • the optical display device further includes a processing module, the processing module is used to receive color information from the sensor module, generate a control signal according to the color information, and send the control signal to the light source module , the control signal is used to adjust the weight of light of at least two colors.
  • the light source module can be controlled by the processing module to adjust the weights of light of at least two colors while being compatible with the existing sensing module and light source module.
  • the above-mentioned processing component may be a processor.
  • the light source module can be used to adjust the weights of light of at least two colors according to the received control signal.
  • the control signal may include the weight of the current of the light source corresponding to the light of each color. It should be understood that a light source corresponding to light of one color refers to a light source that emits light of that color.
  • the light emitting component in the light source module includes a first light source, a second light source and a third light source.
  • the first light source is used to emit red light
  • the second light source is used to emit blue light
  • the third light source is used to emit green light.
  • the weights of red light, green light and blue light can be adjusted by controlling the weights of the currents input to the first light source, the second light source and the third light source.
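  • As a hedged illustration of this current-based weight control (the driver call set_current_ma and the maximum current value below are hypothetical and are not part of the application):

    # Illustrative sketch: mapping color weights to drive currents for the three sources.
    # The driver call set_current_ma and MAX_CURRENT_MA are hypothetical.
    MAX_CURRENT_MA = 350.0   # assumed maximum drive current per light source

    def apply_weights(weights_rgb, set_current_ma):
        """weights_rgb: (w_r, w_g, w_b) color weights; set_current_ma(channel, value) is a stub."""
        total = sum(weights_rgb)
        for channel, w in zip(("R", "G", "B"), weights_rgb):
            set_current_ma(channel, MAX_CURRENT_MA * w / total)   # current proportional to weight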
  • the light source module can also include a first dichroic mirror and a second dichroic mirror; the first dichroic mirror is used to reflect the blue light from the second light source and transmit the green light from the third light source; the second dichroic mirror is used to reflect the red light from the first light source, and to transmit the green light transmitted by the first dichroic mirror and the blue light reflected by the first dichroic mirror.
  • a dichroic mirror may also be called a color-splitting mirror or a light-combining mirror.
  • the third light source for emitting green light can be placed at the farthest position from the spectroscopic module, so that the brightness of the image can be balanced as much as possible.
  • the green light has the greatest influence on the brightness of the image, and placing the third light source at the farthest distance from the spectroscopic module can balance the influence of the red light, green light and blue light on the brightness of the image as much as possible.
  • the light splitting module includes a polarization beam splitter, configured to split the light from the light source module into first polarized light and second polarized light, transmit the first polarized light to the sensing module, and transmit the second polarized light to the modulation module.
  • the first polarized light may be P polarized light and the second polarized light S polarized light; or the first polarized light may be S polarized light and the second polarized light P polarized light.
  • the light from the light source module can be divided into the first polarized light and the second polarized light through the polarizing beam splitter.
  • the first polarized light entering the sensing module can be S polarized light, and correspondingly, the second polarized light entering the modulation module is P polarized light; or the first polarized light entering the sensing module can also be It may be P-polarized light, and correspondingly, the second polarized light incident on the modulation module is S-polarized light.
  • the light splitting module includes a transflective element, wherein the first light is the light transmitted by the transflective element and the second light is the light reflected by the transflective element; or, the first light is the light reflected by the transflective element and the second light is the light transmitted by the transflective element.
  • the light from the light source module can be divided into the first light path and the second light path through the transflective member.
  • the polarization states of the two paths of light may be the same.
  • the light source module may further include a polarizer, or the optical display device may further include a polarizer.
  • the optical display device may further include an optical lens for projecting the image light output by the modulation module to a spatial area.
  • the image light output by the modulating module can be shaped and/or homogenized through the optical lens, thereby helping to improve the quality of the image formed based on the image light.
  • the optical display device provided in the above first aspect may be called a picture generation unit (PGU) or a light engine, and the PGU may be used in various display systems, for example in a projector, a head-up display system, a head-mounted optical display device, or a desktop display.
  • the present application provides a display system, including the optical display device of the above first aspect or any possible implementation of the first aspect, and a spatial light amplification module located in a spatial area; wherein the spatial light amplification module is used to enlarge the image corresponding to the image light from the optical display device.
  • the display system may include but is not limited to a projector, a HUD system, a desktop display, or a head-mounted optical display device and the like.
  • the spatial light amplification module may include at least one curved reflector, or at least one cylindrical mirror, or a combination of at least one curved reflector and at least one cylindrical mirror.
  • the present application provides a vehicle, including the display system of the above second aspect or any possible implementation of the second aspect, and a windshield; wherein the windshield is used for reflecting and imaging the image light from the display system, for example reflecting it to within the eye movement range (eye box position) of the vehicle.
  • the present application provides a color adjustment method, which can be applied to an optical display device, and the optical display device includes a light source module, a spectroscopic module, a modulation module and a sensing module.
  • the method includes controlling the light source module to emit light, wherein the light is synthesized by light of at least two colors, and the light is divided into a first path of light and a second path of light by the light splitting module; controlling the modulation module to modulate the second path of light, obtaining image light carrying image information; controlling the sensing module to obtain color information of the first light; and controlling the light source module to adjust the weights of at least two colors of light according to the color information.
  • optionally, the color coordinates of the image light can be determined according to the color information, and if the difference between the color coordinates of the image light and the preset target color coordinates is greater than a threshold, a control signal for adjusting the weights of the light of at least two colors is generated and sent to the light source module, so as to control the light source module to adjust the weights of the light of at least two colors according to the control signal.
  • the present application provides a color adjustment device, which is used to realize the method in the above fourth aspect or any possible implementation of the fourth aspect, and includes corresponding functional modules respectively used to realize the steps in the above method.
  • the functions may be implemented by hardware, or may be implemented by executing corresponding software through hardware.
  • Hardware or software includes one or more modules corresponding to the above-mentioned functions.
  • the present application provides an optical display device, which may include a light source module, a spectroscopic module, a modulation module, and a sensing module.
  • the light source module is used for emitting light, and the light is synthesized by light of at least two colors.
  • the light splitting module is used to divide the light from the light source module into the first light and the second light, transmit the first light to the sensing module, and transmit the second light to the modulation module.
  • the modulation module is used to modulate the second light to obtain image light carrying image information.
  • the sensor module is used to obtain the color information of the first light, generate a control signal according to the color information, and send the control signal to the light source module.
  • the light source module is also used for adjusting the weights of light of at least two colors according to the control signal.
  • the present application provides an optical display device, which may include a light source module, a spectroscopic module, a modulation module, a sensing module, and a processing module.
  • the light source module is used for emitting light, and the light is synthesized by light of at least two colors.
  • the light splitting module is used to divide the light from the light source module into the first light and the second light, transmit the first light to the sensing module, and transmit the second light to the modulation module.
  • the modulation module is used to modulate the second light to obtain image light carrying image information.
  • the sensing module is used to obtain the color information of the first light, and send the color information to the processing module.
  • the processing module is used for generating a control signal according to the color information, and sending the control signal to the light source module, and the control signal is used for the light source module to adjust the weight of light of at least two colors.
  • the light source module is also used for adjusting the weights of light of at least two colors according to the control signal.
  • the present application provides a chip, including at least one processor and an interface circuit. Further, optionally, the chip may further include a memory, and the processor is used to execute computer programs or instructions stored in the memory, so that the chip performs the method in the above fourth aspect or any possible implementation of the fourth aspect.
  • the present application provides a computer-readable storage medium, in which a computer program or instructions are stored; when the computer program or instructions are executed by the control device, the control device executes the method in the above fourth aspect or any possible implementation of the fourth aspect.
  • the present application provides a computer program product, which includes a computer program or instructions; when the computer program or instructions are executed by the control device, the control device executes the method in the above fourth aspect or any possible implementation of the fourth aspect.
  • Figures 1a to 1c are schematic diagrams of a possible application scenario provided by this application.
  • FIG. 2 is a schematic diagram of another possible application scenario provided by the present application.
  • FIG. 3 is a schematic structural view of an optical display device provided by the present application.
  • Fig. 4a is a schematic structural diagram of a light source module provided by the present application.
  • Fig. 4b is a schematic structural diagram of another light source module provided by the present application.
  • Fig. 4c is a schematic structural diagram of a dodging component provided by the present application.
  • Fig. 5a is a schematic diagram of the light splitting principle of a polarizing beam splitter provided by the present application.
  • Fig. 5b is a schematic structural diagram of a transflective element provided by the present application.
  • Figure 5c is a schematic structural diagram of a liquid crystal on silicon (LCoS) device provided by the present application.
  • FIG. 6 is a schematic structural view of an optical lens provided by the present application.
  • Fig. 7a is a schematic structural diagram of another optical display device provided by the present application.
  • Fig. 7b is a schematic structural diagram of another optical display device provided by the present application.
  • Fig. 7c is a schematic structural diagram of another optical display device provided by the present application.
  • FIG. 8 is a schematic structural diagram of another optical display device provided by the present application.
  • Fig. 9a is a schematic flowchart of a method for adjusting color provided by the present application.
  • FIG. 9b is a schematic diagram of a PWM current input to a light source module provided by the present application.
  • FIG. 10 is a schematic flowchart of a method for adjusting the color initialization process of an image provided by the present application.
  • Fig. 11 is a schematic structural diagram of another optical display device provided by the present application.
  • Fig. 12a is a schematic structural diagram of another optical display device provided by the present application.
  • Fig. 12b is a schematic circuit diagram of an optical display device provided by the present application.
  • Fig. 13a is a partial structural schematic diagram of a vehicle provided by the present application.
  • Fig. 13b is a schematic diagram of a functional framework of a vehicle provided by the present application.
  • FIG. 14 is a schematic flow chart of a color adjustment method provided by the present application.
  • Fig. 15 is a schematic structural diagram of a control device provided by the present application.
  • FIG. 16 is a schematic structural diagram of a control device provided by the present application.
  • the color coordinates are also called chromaticity coordinates. Color coordinates can accurately represent a color. Usually, the horizontal axis of the color coordinate system is x and the vertical axis is y. A pair of color coordinates determines a point on the chromaticity diagram, which can be called a color point, and a color point can be represented by color coordinates (x, y).
  • the spectrum usually refers to the pattern in which the dispersed monochromatic light is arranged in sequence according to the wavelength (or frequency) after the polychromatic light is split by the dispersion system.
  • Spectral tristimulus values are numerical values of the intensities of the three stimuli used to approximately describe a color.
  • the three colors used for color mixing (or synthesis) to produce any color are called the three primary colors. Usually red, green and blue are used as the three primary colors.
  • the amounts of the three primary colors required to match an equal-energy spectral color are called the spectral tristimulus values, represented by the symbols r, g, b.
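  • For reference, standard CIE colorimetry (added here for clarity; not quoted from the application) computes the tristimulus values of a spectrum S(λ) with the color matching functions x̄(λ), ȳ(λ), z̄(λ) as X = ∫S(λ)·x̄(λ)dλ, Y = ∫S(λ)·ȳ(λ)dλ, Z = ∫S(λ)·z̄(λ)dλ, and the color coordinates are then x = X/(X+Y+Z), y = Y/(X+Y+Z).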
  • a pulse width modulation (PWM) waveform refers to a pulse waveform with a variable duty cycle.
  • in PWM, the amplitude of each pulse is equal; to change the amplitude of the equivalent output waveform, it is only necessary to change the width of each pulse according to the same proportional coefficient.
  • the principle of pulse width modulation is to control the on-off of the switching element of an inverter circuit (for example, the conduction time of the switching element can be controlled), so that a series of pulses with equal amplitude can be obtained at the output terminal, and these pulses are used to replace a sine wave or another desired waveform.
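  • As a simple worked example (illustrative, not taken from the application): for a PWM waveform with pulse amplitude U and duty cycle D, the equivalent (average) output amplitude is U_avg = D·U; for instance, U = 5 V and D = 40% give U_avg = 2 V, and scaling every pulse width by the same factor 1.5 raises D to 60% and U_avg to 3 V.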
  • Dichroic mirrors can also be called color-splitting mirrors or light-combining mirrors. They are characterized by almost completely transmitting light of certain wavelengths and almost completely reflecting light of other wavelengths.
  • the dichroic mirror 1 can transmit blue light and reflect green light; that is, blue light can almost completely pass through the dichroic mirror 1 , and green light can be almost completely reflected by the dichroic mirror 1 .
  • the dichroic mirror 2 can transmit blue light and green light, and reflect red light; that is, blue light and green light can almost completely pass through the dichroic mirror 2, and red light is almost completely reflected by it.
  • the dichroic mirror can be selected according to actual needs.
  • the three primary colors refer to the "basic colors” that cannot be obtained by mixing other colors. Generally, the three primary colors refer to red, green, and blue, namely R (Red), G (Green), and B (Blue). It can also be understood that the three primary colors are independent of each other, and any primary color cannot be matched by the other two primary colors.
  • Different colors of light can be obtained by synthesizing (or mixing) the three primary colors according to different percentages (ie weights).
  • the brightness of the synthesized light is determined by the sum of the brightness of the three primary colors, and the chromaticity (expressed by color coordinates) of the synthesized light is determined by the weights of the three primary colors.
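  • For reference, the standard center-of-gravity rule of colorimetry (added for clarity; not quoted from the application) makes this concrete: if the primaries have color points (x_i, y_i) and luminances Y_i, the mixture has luminance Y = ΣY_i and chromaticity x = Σ(Y_i·x_i/y_i) / Σ(Y_i/y_i), y = ΣY_i / Σ(Y_i/y_i), so changing the luminance weights Y_i shifts the mixed color point.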
  • the present application provides an optical display device, a display system, a vehicle and a corresponding color adjustment method.
  • the optical display device can be integrated into a projector 100a shown in FIG. 1a, and the projector 100a can project an image onto a wall or a projection screen.
  • the optical display device can be integrated in a display 100b as shown in FIG. 1b.
  • the optical display device can also be integrated in a vehicle-mounted display (such as the one shown at 100c in Fig. 1c); the installation location is not limited in this application.
  • the optical display device can also be integrated into the head-up display system shown in FIG. 2 , and the head-up display system can display driving assistance information.
  • the HUD system can project the formed image (called HUD virtual image) in the driver's front field of vision and fuse it with real road surface information, thereby enhancing the driver's perception of the actual driving environment.
  • the HUD system can superimpose the HUD virtual image carrying navigation information and/or instrument information (such as vehicle speed, rotation speed, temperature, fuel level, etc.) on the real environment outside the vehicle, so that the driver can obtain the visual effect of augmented reality.
  • the HUD system includes but is not limited to a windshield HUD (W-HUD) system, an augmented reality head up display (AR-HUD), and the like.
  • as described above, the HUD needs to set a specific working mode according to the specific environment, and the image (or picture) displayed by the optical display device in the HUD may be set to a specific color; differences in the environment will then cause problems such as color shift in the image displayed by the optical display device.
  • the present application proposes an optical display device, which can correct the color of the image to be displayed by detecting, through the sensing module, the color information of the light currently emitted by the light source module, thereby helping to reduce the color shift of the image displayed by the optical display device.
  • the optical display device proposed in the present application will be described in detail below with reference to FIG. 3 to FIG. 10.
  • the optical display device 300 may include a light source module 301 , a light splitting module 302 , a modulation module 303 and a sensor module 304 .
  • the light source module 301 is used for emitting light, and the light can be synthesized by light of at least two colors; for example, the light can be formed by mixing red light, green light and blue light.
  • the light splitting module 302 is used to divide the light from the light source module 301 into a first path of light and a second path of light, and transmit the first path of light to the sensing module 304, and transmit the second path of light to the modulation module 303; wherein, the polarization states of the first light and the second light may be the same or different, and the color information and the like are consistent.
  • the modulating module 303 is used for modulating the second path of light (for example, performing amplitude modulation and/or phase modulation) to obtain image light, wherein the image light carries image information (such as navigation information and/or instrument information, etc.).
  • the sensing module 304 is used to obtain color information of the first light, and the color information is used by the light source module to adjust the weight of at least two colors of light.
  • the color information of the light currently emitted by the light source module can be detected through the sensing module; that is to say, the first light can be used as a calibration of the color of the light emitted by the light source module. Further, based on the color information detected by the sensing module from the first light, the weights of the at least two colors of light in the synthesized light are adjusted, so that color correction of the image light can be realized without adding optical structures, thereby helping to reduce the color shift of the image displayed by the optical display device.
  • the sum of the weights of the light of each color in the at least two colors of light in the synthesized light is a fixed value (such as 100%); if the weight of the light of at least one color is changed, the weight of the light of at least one of the remaining colors will change accordingly.
  • adjusting the weights of light of at least two colors may mean changing the weight of the light of each of the at least two colors, or changing the weight of the light of some of the at least two colors (here the at least two colors may be three or more), which is not limited in this application.
  • the factor that has the greatest influence on the color shift of the image displayed by the optical display device under the influence of the environment (such as temperature) is the light source module. Therefore, monitoring the color information of the light emitted by the light source module can effectively improve the color cast of the image.
  • Each functional module shown in FIG. 3 is introduced and described below to give an exemplary specific implementation solution.
  • the light source module, spectroscopic module, modulation module and sensor module in the following are not marked with numbers.
  • the light source module may include a light emitting component.
  • the light emitting component may include at least two light sources, and one light source may emit light of one color.
  • the light source can be, for example, a laser diode (LD), a light emitting diode (LED), an organic light emitting diode (OLED), or a micro light emitting diode (micro-LED), etc.
  • the light emitting component may include a first light source, a second light source and a third light source, wherein the first light source is used to emit red light, the second light source is used to emit blue light, and the third light source is used to emit green light.
  • the light emitting component may include an R light source, a G light source and a B light source.
  • the red light emitted by the first light source, the green light emitted by the third light source and the blue light emitted by the second light source can be mixed to obtain light of different colors. For example, it can be mixed to obtain white light.
  • the possible structure of the light source module will be introduced below by taking the light emitting assembly including the first light source, the second light source and the third light source as an example.
  • FIG. 4 a is a schematic structural diagram of a light source module provided in the present application.
  • the light source module includes a light-emitting component, the light-emitting component includes a first light source, a second light source and a third light source, these three light sources are arranged in a row, and the light of three colors (RGB) emitted by the three light sources can be mixed to form a white light.
  • optionally, each light source also corresponds to a collimating mirror (such as a collimating lens or a curved reflector); that is, the first light source, the second light source and the third light source each correspond to a collimating mirror.
  • based on this light source module, the mixing of light of each color can be realized while omitting some optical elements (such as dichroic mirrors), which contributes to the miniaturization of the light source module and thus of the optical display device.
  • the spectrum of the light emitted from the light source module can be expressed as LED(λ); for the specific expression, refer to formula 1 below.
  • wherein, LED_B(λ) represents the spectrum of the blue light emitted by the second light source, LED_G(λ) represents the spectrum of the green light emitted by the third light source, and LED_R(λ) represents the spectrum of the red light emitted by the first light source.
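  • formula 1 itself is not reproduced in this extract; for this arrangement without dichroic mirrors it plausibly takes the simple additive form LED(λ) = LED_R(λ) + LED_G(λ) + LED_B(λ) (a reconstruction from the surrounding definitions, not a quotation of the application).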
  • the light source module may include a light emitting component, a first dichroic mirror and a second dichroic mirror, and the light emitting component includes a first light source, a second light source and a third light source, which emit light of three colors (RGB).
  • optionally, each light source may also correspond to a collimating mirror.
  • the spectrum LED(λ) of the light emitted from the light source module can refer to the following formula 2.
  • LED(λ) = LED_B(λ)·T_1(λ)·T_2(λ) + LED_G(λ)·R_1(λ)·T_2(λ) + LED_R(λ)·R_2(λ)    (Formula 2)
  • wherein, LED_B(λ) represents the spectrum of the blue light emitted by the second light source, LED_G(λ) represents the spectrum of the green light emitted by the third light source, LED_R(λ) represents the spectrum of the red light emitted by the first light source, T_1(λ) represents the transmission spectrum of the first dichroic mirror, R_1(λ) represents the reflection spectrum of the first dichroic mirror, T_2(λ) represents the transmission spectrum of the second dichroic mirror, and R_2(λ) represents the reflection spectrum of the second dichroic mirror.
  • the positions of the first light source, the second light source and the third light source in the light source module given above can also be interchanged.
  • for example, the second dichroic mirror can be replaced by a third dichroic mirror, which reflects blue light and transmits red light and green light; other possibilities are not listed here one by one.
  • the light source module may further include a dodging component.
  • the light formed by combining the light of the various colors first passes through the light homogenizing component to be homogenized, and is then directed to the light splitting module.
  • the homogenizing component can be a fly-eye lens (see Figure 4c) composed of a series of (such as two or more) lenses (also called sub-eyes), which compresses the angle of the light rays so that the light emitted to the light splitting module becomes more uniform.
  • the number of lenses included in the fly-eye lens shown in FIG. 4c is only an example.
  • the fly-eye lens may include more lenses than in FIG. 4c, or fewer lenses than in FIG. 4c, which is not limited in this application. It should be understood that the more sub-eyes the fly-eye lens includes, the better the light homogenizing effect will be. In addition, there may be one or more fly-eye lenses, which is not limited in this application.
  • the light splitting module can divide the light from the light source module into a first path of light and a second path of light.
  • the following exemplarily shows the structures of two possible spectroscopic modules.
  • the light splitting module can be a polarized beam splitter.
  • in this case, the polarization states of the first light and the second light are different; the first light may be called first polarized light, and the second light may be called second polarized light.
  • FIG. 5a is a schematic diagram of the light splitting principle of a polarizing beam splitter provided in the present application.
  • the polarizing beam splitter (PBS) can be formed by coating one or more layers of thin film on the oblique surface of a right-angle prism and then bonding through an adhesive layer.
  • because the transmittance of P-polarized light is 1 while the transmittance of S-polarized light is less than 1 when light is incident at Brewster's angle, after the light passes through the film layers at Brewster's angle many times the P-polarized component is completely transmitted while most (at least 90%) of the S-polarized component is reflected; the PBS is an optical element based on this principle.
  • the polarizing beam splitter may split incident light (including P-polarized light and S-polarized light) into horizontally polarized light (ie, P-polarized light) and vertically polarized light (ie, S-polarized light). Wherein, the P-polarized light completely passes through, the S-polarized light is reflected at an angle of 45 degrees, and the outgoing direction of the S-polarized light and the outgoing direction of the P-polarized light form an angle of 90 degrees.
  • the polarization beam splitter is used to split the light from the light source module into first polarized light and second polarized light; the first polarized light may be P polarized light and the second polarized light S polarized light, or the first polarized light may be S polarized light and the second polarized light P polarized light.
  • the light splitting module also reflects the first polarized light (i.e., S polarized light) to the sensing module and transmits the second polarized light (i.e., P polarized light) to the modulation module; or, it transmits the first polarized light (i.e., P polarized light) to the sensing module and reflects the second polarized light (i.e., S polarized light) to the modulation module.
  • the light splitting module is a transflective component.
  • the polarization states of the first light and the second light are the same.
  • the first light and the second light may both be P-polarized light, or both may be S-polarized light, or both may be natural light.
  • in this case, a corresponding polarizer can be added in the light source module or between the light source module and the light splitting module, and the polarizer can be configured to allow P polarized light to pass or to allow S polarized light to pass.
  • the transflective member can partially transmit the light from the light source module to obtain the first light and partially reflect it to obtain the second light; in this case, the first path of light is the light transmitted by the transflective element and the second path of light is the light reflected by the transflective element.
  • alternatively, the transflective member can reflect part of the light from the light source module to obtain the first light and partially transmit it to obtain the second light; in this case, the first path of light is the light reflected by the transflective element and the second path of light is the light transmitted by the transflective element.
  • the transflective member can be a beam splitter; the working part of the transflective member can be a plane coated with a light-splitting film (see FIG. 5b), so as to set the ratio in which the incident light is transmitted and reflected.
  • a light-splitting film may be coated on a transparent flat substrate to form a transflective member.
  • the transflective member can have the reflectivity and transmittance of the light-splitting film selected according to specific requirements; for example, the reflectivity can be higher than 50% and the transmittance lower than 50%, or the reflectivity can be lower than 50% and the transmittance higher than 50%, or the reflectivity and transmittance can both be equal to 50%.
  • This type of transflective part can also be called a semi-transparent and semi-reflective mirror (half mirror).
  • Structure 1 can be understood as splitting light based on the principle of polarization state
  • structure 2 can be understood as splitting light based on the intensity (or energy) of the light.
  • other structures that can realize that the light from the light source module is divided into the first light path and the second light path are also within the protection scope of the present application.
  • the modulation module may include an image source (also called an optical data processing (ODP) unit), which is used to modulate the received second polarized light to obtain image light carrying image information.
  • the modulation module can perform spatial phase modulation on the second polarized light to obtain image light carrying image information.
  • the polarization state of the image light is the same as that of the first polarized light. Therefore, after the modulation module reflects the image light to the beam splitting module, the image light can be transmitted to the spatial area through the beam splitting module.
  • if the second polarized light is S polarized light, the modulation module performs spatial phase modulation on the second polarized light and the obtained image light is P polarized light; if the second polarized light is P polarized light, the modulation module performs spatial phase modulation on the second polarized light and the obtained image light is S polarized light.
  • the modulation module may include, but is not limited to: an LCoS display (see the related description above), a liquid crystal display (LCD), a digital light processing (DLP) display, a laser beam scanning (LBS) display, etc.
  • FIG. 5c is a schematic structural diagram of a liquid crystal on silicon (LCoS) device provided by the present application.
  • Liquid crystal is injected between the upper glass substrate of the LCoS and the lower silicon substrate based on complementary metal oxide semiconductor (CMOS) technology to form a liquid crystal layer; electrodes are provided at the bottom of the liquid crystal layer.
  • the working principle of LCoS is as follows: when the applied voltage of a certain pixel of the liquid crystal layer is 0, the input S-polarized light passes through the liquid crystal layer without its polarization direction being deflected, reaches the bottom and is reflected back as S-polarized light, which, after being reflected by the polarizing beam splitter, returns along the original path.
  • when the applied voltage of the pixel is not 0, the input S-polarized light passes through the liquid crystal layer, its polarization direction is deflected, and it reaches the bottom and is reflected back as P-polarized light, which directly passes through the polarizing beam splitter and is coupled into the optical lens. Therefore, the direction of the long axis of the liquid crystal molecules can be changed by changing the applied voltage or current, thereby changing the refractive index of the LCoS and hence the phase of light passing through it. This is equivalent to using the phase delay to rotate the polarization state of the light and, in cooperation with the polarizing beam splitter, to modulate the light. Based on LCoS, smaller display chips can be realized, which is beneficial to the miniaturization of optical display devices.
  • the above modulation module is only an example, and other devices capable of modulating the light from the light source module to generate image light are also within the protection scope of the present application.
  • if the light splitting module is the polarizing beam splitter shown in structure 1 above, the modulation module can be, for example, an LCoS display or an LCD; if the light splitting module is the transflective element shown in structure 2 above, the modulation module may be, for example, an LCoS display, an LCD, a DLP display, or an LBS display.
  • the sensing module is used to detect color information of the first polarized light.
  • the color information can, for example, be represented by a spectrum.
  • the sensing module can detect the first spectrum of the first polarized light, and convert the information of the first spectrum into a first electrical signal. It can also be understood that the sensing module performs photoelectric conversion on the detected first spectrum of the first polarized light to obtain a first electrical signal representing information of the first spectrum of the first polarized light.
  • the first spectrum of the first polarized light detected by the sensing module can be represented by Sensor(λ). If the first polarized light is P-polarized light, Sensor(λ) can be expressed by the following formula 3; if the first polarized light is S-polarized light, Sensor(λ) can be expressed by the following formula 4.
  • wherein, Tp(λ) is the transmission spectrum of the light splitting module, and Rs(λ) is the reflection spectrum of the light splitting module.
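  • formulas 3 and 4 themselves are not reproduced in this extract; from the variable definitions they plausibly take the forms Sensor(λ) = LED(λ)·Tp(λ) (formula 3, P-polarized first light transmitted to the sensing module) and Sensor(λ) = LED(λ)·Rs(λ) (formula 4, S-polarized first light reflected to the sensing module). This is a reconstruction from the surrounding definitions, not a quotation of the application.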
  • the sensing module may include a detection component.
  • Detection components may include, but are not limited to: a photodetector (PD), a high-speed photodiode, a charge coupled device (CCD), or complementary metal-oxide-semiconductor (CMOS) phototransistors, photodiodes, etc.
  • the sensing module can be, for example, a color sensor. A color sensor can also be called a color recognition sensor, and can distinguish similar colors more accurately.
  • the color shift of the image formed based on the image light can be corrected.
  • for the specific process, please refer to the introduction of Fig. 9a and Fig. 10 below, which will not be repeated here.
  • the optical display device may further include an optical lens.
  • the optical lens can be used to project the image light output by the modulation module to the spatial area.
  • the optical lens can shape and/or homogenize and/or converge the image light output by the modulation module, and spread the shaped and/or homogenized and/or converged image light to the spatial area.
  • the optical display device is applied to a display system (such as a projector, a HUD system, a desktop display, or a head-mounted optical display device, etc.), the optical lens can transmit the shaped and/or homogenized and/or converged image light to The spatial optical amplification module located in the spatial area (for details, please refer to the following related introduction).
  • FIG. 6 is a schematic structural diagram of an optical lens provided by the present application.
  • the optical lens may include at least one lens.
  • Fig. 6 is an example including 3 lenses.
  • the application does not limit the number of lenses included in the optical lens, which may be more or fewer than shown in Figure 6 above; the application also does not limit the type of lenses, and the optical lens may also include other lenses or combinations of other lenses, such as plano-convex lenses, plano-concave lenses, etc.
  • the optical lens may be rotationally symmetric about the optical axis of the optical lens.
  • the lens in the optical lens can be a single spherical lens, or a combination of multiple spherical lenses.
  • the optical lens can also be non-rotationally symmetric.
  • the lens in the optical lens can be a single aspheric lens, or a combination of multiple aspheric lenses. The combination of multiple spherical lenses and/or aspheric lenses helps to improve the imaging quality of the optical lens and reduce the aberration of the optical lens.
  • the material of the lens in the optical lens may be an optical material such as glass, resin, or crystal.
  • if the material of the lens is resin, it helps to reduce the mass of the optical display device.
  • if the material of the lens is glass, it helps to further improve the imaging quality of the optical display device.
  • the optical lens includes at least one lens made of glass material.
  • the optical display device 700 may include a light source module 701 , a polarizing beam splitter 702 , a modulation module 703 , a sensor module 704 , and an optical lens 705 . Further, optionally, the optical display device may further include a processing module 706 .
  • the light source module 701 includes a first light source 7011, a second light source 7012, a third light source 7013, a first dichroic mirror 7014, a second dichroic mirror 7015, and a fly-eye lens 7016, as an example.
  • the first light source 7011 emits red light
  • the second light source 7012 emits blue light
  • the third light source 7013 emits green light.
  • the blue light is reflected by the first dichroic mirror 7014 to the second dichroic mirror 7015, and the green light is transmitted through the first dichroic mirror 7014 to the second dichroic mirror 7015; the green light and the blue light are transmitted through the second dichroic mirror 7015, and the red light is reflected by the second dichroic mirror 7015; the red light, the blue light and the green light are thus combined into white light, which is homogenized by the fly-eye lens 7016 and then propagates to the polarizing beam splitter 702.
  • the spectrum of the light emitted from the light source module can be expressed by the above formula 2.
  • the light is divided into the first polarized light and the second polarized light by the polarizing beam splitter 702 , and the first polarized light is transmitted to the sensing module 704 by the polarizing beam splitter 702 .
  • the first spectrum of the first polarized light received by the sensing module 704 can refer to Sensor(λ) in the above formula 3, where Tp(λ) in formula 3 is the transmission spectrum of the polarizing beam splitter.
  • the second polarized light is reflected by the polarizing beam splitter 702 to the modulation module 703; the modulation module 703 performs spatial phase modulation on the second polarized light to obtain image light (the polarization state of the image light is the same as that of the first polarized light) and reflects the image light to the polarizing beam splitter 702, which transmits the image light to the optical lens 705.
  • the second spectrum of the image light projected onto the spatial region may be expressed as Display(λ); for details, refer to formula 5 below. Wherein, Rs(λ) represents the reflection spectrum of the light splitting module (here, the polarizing beam splitter 702), LCoS(λ) represents the reflection spectrum of the modulation module 703, and a(λ) represents the transmission spectrum of the optical lens 705.
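  • formula 5 itself is not reproduced in this extract; given the variable definitions and the note below that Rs(λ) and Tp(λ) may be interchanged, it plausibly takes a multiplicative form such as Display(λ) = LED(λ)·Rs(λ)·LCoS(λ)·Tp(λ)·a(λ). This is a reconstruction from the surrounding definitions, not a quotation of the application.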
  • FIG. 7 a is illustrated by taking the first polarized light as P-polarized light and the second polarized light as S-polarized light as an example. If the first polarized light is S polarized light and the second polarized light is P polarized light, the positions of the modulation module 703 and the sensor module 704 in FIG. 7 a can be exchanged.
  • correspondingly, Rs(λ) in the above formula 5 can be replaced with Tp(λ), and Tp(λ) in formula 5 can be replaced with Rs(λ); and Rs(λ) in the above formula 6 can be replaced with Tp(λ).
  • FIG. 7b is a schematic structural diagram of another optical display device provided by the present application.
  • the optical display device 710 can replace the polarizing beam splitter 702 in FIG. 7 a with a transflective member 712 .
  • the optical display device 710 may include a light source module 711 , a transflective element 712 , a modulation module 713 , a sensor module 714 , and an optical lens 715 .
  • the optical display device may further include a processing module 716 .
  • for a more detailed introduction to each module, please refer to the above related descriptions, which will not be repeated here.
  • the light is divided into a first path of light and a second path of light through the transflective element 712 , and the first path of light is partially transmitted to the sensing module 714 through the transflective element 712 .
  • the first spectrum of the first path of light received by the sensing module 714 can refer to Sensor(λ) in the above formula 3, where Tp(λ) in formula 3 is the transmission spectrum of the transflective element 712.
  • the second path of light is reflected to the modulation module 713 by the transflective element 712; the modulation module 713 performs spatial phase modulation on the second path of light to obtain image light and reflects the image light to the transflective element 712, through which the image light is transmitted to the optical lens 715.
  • the second spectrum of the image light projected onto the spatial region can be expressed as Display(λ); for details, refer to the above formula 5, where Rs(λ) in formula 5 represents the reflection spectrum of the transflective element 712.
  • FIG. 7b is illustrated by taking the first path of light as the light transmitted by the transflective element 712 and the second path of light as the light reflected by it as an example. If the first path of light is the reflected portion and the second path of light is the transmitted portion, the positions of the modulation module 713 and the sensing module 714 in FIG. 7b can be exchanged
  • in that case, Rs(λ) in the above formula 5 is replaced with Tp(λ) and Tp(λ) with Rs(λ), with formula 5 otherwise unchanged, and Rs(λ) in the above formula 6 is replaced with Tp(λ).
  • FIG. 7c is a schematic structural diagram of another optical display device provided by the present application
  • compared with FIG. 7b, the optical display device 720 adds a polarizer 727 before or after the fly-eye lens 7116 (for example, between the fly-eye lens 7116 and the second dichroic mirror 7115), or a polarizer 727 is added between the light source module 711 and the transflective element 712 in FIG. 7b.
  • the polarizer 727 can allow P-polarized light or S-polarized light to pass through.
  • FIG. 7c shows an example in which the polarizer 727 is added between the light source module 721 and the transflector 722 and allows P-polarized light to pass through.
  • the optical display device 720 may include a light source module 721 , a transflector 722 , a modulation module 723 , a sensor module 724 , an optical lens 725 and a polarizer 727 . Further, optionally, the optical display device may further include a processing module 726 .
  • for a more detailed introduction to the other modules, refer to the related descriptions above; details are not repeated here.
  • in FIG. 7c, the first path of light is the first polarized light (P-polarized light) and the second path of light is the second polarized light (S-polarized light), as an example.
  • based on the color information, the color shift of the image formed by the image light corresponding to the second spectrum can be corrected. The specific process may be executed by a processing component; the following description is organized according to the module to which the processing component belongs.
  • the processing component belongs to the light source module.
  • the above-mentioned light source module may further include a processing component.
  • the light source module may include a light emitting component and a processing component, and further, optionally, a collimating mirror and/or a dichroic mirror and the like.
  • the optical display device may include a light source module 801 , a spectroscopic module 802 , a modulation module 803 , a sensor module 804 , and an optical lens 805 .
  • the light source module 801 includes a processing component 8011 and a light emitting component 8012 .
  • the sensing module 804 is used to send the obtained color information to the processing component 8011.
  • the processing component 8011 receives the color information from the sensing module 804, generates a control signal according to the color information, and sends the control signal to the light emitting component 8012.
  • the light emitting component 8012 is used to adjust the weights of light of at least two colors according to the control signal.
  • for the specific process, refer to the introduction of FIG. 9a below
  • for more detailed introductions of the light source module 801, the beam splitting module 802, the modulation module 803, the sensing module 804, and the optical lens 805, refer to the foregoing description; details are not repeated here.
  • the processing component may be a central processing unit (central processing unit, CPU), a general purpose processor, a digital signal processor (digital signal processor, DSP), an application specific integrated circuit (application specific integrated circuit, ASIC), a field programmable gate array (field programmable gate array, FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof.
  • the general-purpose processor may be a microprocessor, or any conventional processor.
  • FIG. 9a is a schematic flowchart of a color adjustment method provided by the present application.
  • the method can be executed by the processing component in the above-mentioned light source module, and the method includes the following steps:
  • step 901 the sensing module sends the information of the first spectrum to the processing component.
  • the processing component receives information of the first spectrum from the sensing module.
  • the information of the first spectrum can be carried in the first electrical signal. That is to say, the sensing module sends a first electrical signal to the processing component, and the first electrical signal includes information of the first spectrum.
  • Step 902 the processing component determines the second spectrum of the image light according to the information of the first spectrum and the corresponding relationship between the first spectrum and the second spectrum.
  • Step 903 the processing component determines the color coordinates of the image light according to the second spectrum and the corresponding relationship between the spectrum and the color coordinates.
  • in the above formulas, X, Y, and Z represent the spectral tristimulus values of the standard observer, x and y represent the color coordinates, and the integration interval is the visible light range, for example [380, 780] nm or [400, 700] nm
  • the color coordinates of the image light are denoted (x_real, y_real).
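  • the formulas relating the spectrum to the color coordinates are not reproduced in this extraction; the standard CIE relations that the description presumably follows (stated here as a sketch, not quoted from the original text) are:
    X = ∫ Display(λ)×x̄(λ) dλ,  Y = ∫ Display(λ)×ȳ(λ) dλ,  Z = ∫ Display(λ)×z̄(λ) dλ  (integration over the visible range, e.g. [380, 780] nm)
    x = X / (X + Y + Z),  y = Y / (X + Y + Z)
    where x̄(λ), ȳ(λ), and z̄(λ) are the color-matching functions of the standard observer.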
  • Step 904 the processing component determines whether it is necessary to adjust the weight of light of each color according to the color coordinates of the image light and the preset target color coordinates; if adjustment is required, perform the following step 905; if no adjustment is required, perform the following step 906 .
  • multiple sets of preset color coordinates may be stored in advance.
  • the optical display device may have multiple working modes (such as snow mode, rainy mode, sunny mode, night mode, day mode, etc.); one working mode may correspond to one set of preset color coordinates, and a set of preset color coordinates may be, for example, the color coordinates of white light, see Table 1. It should be understood that the preset target color coordinates may be one of the sets of preset color coordinates.
  • Table 1 Preset color coordinates
    Operating mode    Color coordinates
    snow mode         A(x_target1, y_target1)
    rain mode         B(x_target2, y_target2)
    sunny mode        C(x_target3, y_target3)
  • the user can select a certain set of preset color coordinates (the selected preset color coordinates are the preset target color coordinates), and the processing component can detect the set of preset color coordinates selected by the user.
  • the color coordinates (x_real, y_real) of the image light are compared with the preset target color coordinates to determine whether the weights of the light of each color need to be adjusted: if the difference Δx in the x coordinate or the difference Δy in the y coordinate exceeds a threshold, adjustment is required
  • the threshold may be, for example, 0.002.
  • Step 905 the processing component determines the weights of the lights of each color, and generates control signals according to the weights of the lights of each color.
  • considering the weights of light of the various colors, the second spectrum of the image light can be represented by the following LED(λ)_target, where a, m, and n are the weights of the blue light, green light, and red light, respectively
  • LED(λ)_target = a×LED_B(λ)×T_1(λ)×T_2(λ) + m×LED_G(λ)×R_1(λ)×T_2(λ) + n×LED_R(λ)×R_2(λ).
  • the second spectrum LED(λ)_target is adjusted by adjusting the weights of the red light, green light, and blue light, so as to adjust the light emitted by the light source and reduce the color shift of the image formed based on the image light.
  • the weights of light of the various colors may be adjusted by adjusting the weights of the currents of the light sources corresponding to the light of each color.
  • the control signal includes the weight of the current of each light source.
  • the light emitting component includes a first light source, a second light source and a third light source, and the processing component can adjust the weights a:m:n of light of each color by adjusting I_1:I_2:I_3.
  • the processing component can adjust the weight of the current according to a color processing algorithm, and the color processing algorithm can be preset. For example, the weight of the current input to one or several light sources is increased by a%, and the weight of the current of the remaining light sources is decreased by a%. Wherein, a% can be 0.5%, 0.2% or 1%. It should be noted that the sum of weights of light source currents corresponding to lights of various colors is 100%.
  • the current input to each light source in the light source module has an initial weight, which may be pre-stored or determined during the initialization process.
  • the light source module includes an R light source, a G light source, and a B light source as an example.
  • the duty cycle of the PWM input to the R light source, G light source, and B light source can be changed; specifically, the interval between two adjacent pulses (that is, the sizes of T_1, T_2, and T_3) can be changed
  • it is also possible to change the magnitude of the current value (I) input to the R light source, the G light source, and the B light source.
  • the weights of the input currents of the R light source, the G light source and the B light source can be adjusted.
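  • as a minimal sketch of steps 904 and 905 (the function names, variable names, and numeric values below are illustrative assumptions, not taken from the original disclosure), assuming three light sources whose current weights sum to 1:

    THRESHOLD = 0.002   # example threshold mentioned in step 904
    STEP = 0.005        # a% = 0.5%, one of the example step sizes

    def needs_adjustment(x_real, y_real, x_target, y_target, threshold=THRESHOLD):
        # Step 904: compare the color coordinates of the image light with the target.
        return abs(x_real - x_target) > threshold or abs(y_real - y_target) > threshold

    def adjust_current_weights(weights, increase_idx, step=STEP):
        # Step 905: raise the current weight of one light source by a% and lower the
        # weights of the remaining light sources so that the sum stays at 100%.
        adjusted = list(weights)
        adjusted[increase_idx] += step
        share = step / (len(adjusted) - 1)
        for i in range(len(adjusted)):
            if i != increase_idx:
                adjusted[i] -= share
        return adjusted

    # usage example: R:G:B current weights I1:I2:I3
    weights = [0.30, 0.45, 0.25]
    if needs_adjustment(0.310, 0.325, 0.313, 0.329):
        weights = adjust_current_weights(weights, increase_idx=1)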
  • Step 906 the processing component controls the light emitting component to emit light of corresponding colors according to the initialized weights of light of various colors.
  • FIG. 10 is a schematic flowchart of a method for adjusting the color initialization process of an image provided in this application.
  • the method can be executed by the processing component in the above-mentioned light source module.
  • the light source module includes the first light source, the second light source and the third light source as an example below.
  • the method includes the following steps:
  • Step 1001 the processing component acquires preset target color coordinates.
  • a certain group of preset color coordinates selected by the user may be detected, and the selected preset color coordinates are the preset target color coordinates.
  • Step 1002 the processing component generates an initial signal according to the preset correspondence between color coordinates and current weights and the acquired preset target color coordinates.
  • the initial signal includes the weight of the current input to each light source.
  • the correspondence relationship between the preset color coordinates and the current weight can be expressed in Table 2 below.
  • the current weight corresponding to the preset target color coordinates can be determined. For example, if the preset target color coordinates are A(x_target1, y_target1), the corresponding current weight ratio can be determined as I_11:I_12:I_13, and the initial signal includes the current weights I_11:I_12:I_13.
  • Table 1 and Table 2 may be two independent tables or combined together, which is not limited in this application.
  • Table 1 and Table 2 can be stored in a memory or a register, and the processing component can obtain the data in Table 1 and Table 2 by calling the memory or register.
  • for the memory or register, refer to the related introductions below; details are not repeated here.
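  • as a rough illustration of steps 1001 and 1002 (all names and numeric values below are hypothetical examples, not values from Table 1 or Table 2 of the disclosure), the lookup from working mode to preset target color coordinates and initial current weights could be organized as:

    # hypothetical stand-in for Table 1 / Table 2: working mode -> preset target
    # color coordinates and the corresponding initial current weights I1:I2:I3
    PRESET_TABLE = {
        "snow":  {"target": (0.313, 0.329), "weights": (0.32, 0.44, 0.24)},
        "rain":  {"target": (0.310, 0.325), "weights": (0.30, 0.45, 0.25)},
        "sunny": {"target": (0.315, 0.330), "weights": (0.33, 0.43, 0.24)},
    }

    def build_initial_signal(selected_mode):
        # Step 1001: acquire the preset target color coordinates selected by the user.
        # Step 1002: derive the initial current weights for the three light sources.
        entry = PRESET_TABLE[selected_mode]
        return {"target": entry["target"], "current_weights": entry["weights"]}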
  • Step 1003 the processing component sends an initial signal to the light emitting component.
  • the light emitting component can receive an initial signal from the processing component.
  • the light emitting component includes a first light source, a second light source and a third light source as an example.
  • Step 1004 the light-emitting component can emit light of a corresponding color according to the received initial signal.
  • the processing component belongs to the sensing module.
  • the above sensing module may also include a processing component.
  • the sensing module may include detection components and processing components.
  • FIG. 11 is a schematic structural diagram of another optical display device provided by the present application.
  • the optical display device may include a light source module 1101 , a light splitting module 1102 , a modulation module 1103 and a sensing module 1104 .
  • the sensing module 1104 includes a processing component 11041 and a detection component 11042 .
  • the detection component 11042 can be used to detect the color information of the first polarized light, and send the color information to the processing component 11041 .
  • the processing component 11041 is configured to generate a control signal according to the color information, and send the control signal to the light source module 1101 .
  • the light source module 1101 is used for adjusting the weight of the current of each color in the light of at least two colors according to the control signal.
  • for the light source module 801, the light splitting module 802, the modulation module 803, the sensing module 804, and the optical lens 805, refer to the foregoing description; details are not repeated here.
  • for the process of generating the control signal by the processing component in the sensing module, refer to the introduction of step 902 to step 905 in FIG. 9a; details are not repeated here.
  • the processing component is independent of the sensor module and the light source module, and here, the processing component may also be referred to as a processing module.
  • optical display device may also include a processing module.
  • the optical display device 1200 may include a light source module 1201 , a spectroscopic module 1202 , a modulation module 1203 , a sensing module 1204 , an optical lens 1205 and a processing module 1206 .
  • the sensing module 1204 is used to obtain the color information of the first polarized light and send the color information to the processing module 1206
  • the processing module 1206 is used to generate a control signal according to the color information and send the control signal to the light source module 1201
  • the light source module 1201 is also used to adjust the weights of the at least two colors of light according to the control signal.
  • for more detailed introductions of the light source module 1201, the light splitting module 1202, the modulation module 1203, the sensing module 1204, and the optical lens 1205, refer to the foregoing description; details are not repeated here
  • for the process of generating the control signal by the processing module, refer to the introduction of step 902 to step 905 in FIG. 9a; details are not repeated here.
  • in the above optical display device, by adding a sensing module without increasing the complexity of the optical system, feedback adjustment of the current of each light emitting element (such as a light source) in the light source module can be realized when the environment (such as temperature) changes. The weights of the light of each color in the synthesized light can thus be changed to ensure that the optical display device displays within a constant color range, which helps reduce the color shift of the image displayed by the optical display device.
  • Fig. 12b is a schematic circuit diagram of an optical display device provided by the present application.
  • the circuit in the optical display device mainly includes a main processor (host CPU) 1701, an external memory interface 1702, an internal memory 1703, an audio module 1704, a video module 1705, a power supply module 1706, a wireless communication module 1707, and an I/O interface 1708 , video interface 1709, display circuit 1710, modulator 1711, etc.
  • the main processor 1701 and its surrounding components such as an external memory interface 1702, an internal memory 1703, an audio module 1704, a video module 1705, a power module 1706, a wireless communication module 1707, an I/O interface 1708, a video interface 1709, and a display circuit 1710 can be connected by bus.
  • the main processor 1701 may be called a front-end processor.
  • the circuit diagrams shown in the embodiments of the present application do not constitute specific limitations on the optical display device.
  • the optical display device may include more or fewer components than shown in the illustrations, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the main processor 1701 includes one or more processing units. For example, the main processor 1701 may include an application processor (Application Processor, AP), a modem processor, a graphics processing unit (Graphics Processing Unit, GPU), an image signal processor (Image Signal Processor, ISP), a controller, a video codec, a digital signal processor (Digital Signal Processor, DSP), a baseband processor, and/or a neural network processing unit (Neural-Network Processing Unit, NPU), etc.
  • different processing units may be independent devices, or may be integrated in one or more processors.
  • a memory may also be provided in the main processor 1701 for storing instructions and data.
  • the memory in the main processor 1701 is a cache memory.
  • the memory may hold instructions or data that the main processor 1701 has just used or recycled. If the main processor 1701 needs to use the instruction or data again, it can be called directly from the memory. Repeated access is avoided, and the waiting time of the main processor 1701 is reduced, thereby improving the efficiency of the optical display device.
  • the main processor 1701 can execute the stored instructions to execute the above method for adjusting the color.
  • the optical display device may further include a plurality of input/output (Input/Output, I/O) interfaces 1708 connected to the main processor 1701 .
  • the interface 1708 may include an integrated circuit (Inter-Integrated Circuit, I2C) interface, an integrated circuit built-in audio (Inter-Integrated Circuit Sound, I2S) interface, a pulse code modulation (Pulse Code Modulation, PCM) interface, a universal asynchronous transceiver (Universal Asynchronous Receiver/Transmitter, UART) interface, mobile industry processor interface (Mobile Industry Processor Interface, MIPI), general-purpose input and output (General-Purpose Input/Output, GPIO) interface, subscriber identity module (Subscriber Identity Module, SIM) interface, And/or Universal Serial Bus (Universal Serial Bus, USB) interface, etc.
  • the above-mentioned I/O interface 1708 can be connected to devices such as a mouse, a touchpad, a keyboard, a camera, a speaker, and a microphone, and can also be connected to physical buttons on the optical display device (such as volume keys, brightness adjustment keys, and power on/off keys).
  • the external memory interface 1702 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the optical display device.
  • the external memory card communicates with the main processor 1701 through the external memory interface 1702 to realize data storage function.
  • Internal memory 1703 may be used to store computer-executable program code, which includes instructions.
  • the internal memory 1703 may include an area for storing programs and an area for storing data.
  • the stored program area can store an operating system, at least one application program required by a function (such as a call function, a time setting function, etc.) and the like.
  • the data storage area can store data (such as phone book, world time, etc.) created during the use of the optical display device.
  • the internal memory 1703 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (Universal Flash Storage, UFS), and the like.
  • the main processor 1701 executes various functional applications and data processing of the optical display device by executing instructions stored in the internal memory 1703 and/or instructions stored in the memory provided in the main processor 1701 .
  • the optical display device can implement audio functions, such as music playback and calls, through the audio module 1704 and the application processor.
  • the audio module 1704 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • the audio module 1704 can also be used for encoding and decoding audio signals, such as playing or recording.
  • the audio module 1704 can be set in the main processor 1701 , or some functional modules of the audio module 1704 can be set in the main processor 1701 .
  • Video interface 1709 can receive externally input audio and video signals, which specifically can be High Definition Multimedia Interface (High Definition Multimedia Interface, HDMI), Digital Visual Interface (Digital Visual Interface, DVI), Video Graphics Array (Video Graphics Array, VGA) , display port (Display port, DP), etc., the video interface 1709 can also output video externally.
  • the video interface 1709 can receive speed signals and power signals input from peripheral equipment, and can also receive AR video signals input from outside.
  • the video interface 1709 can receive video signals input from an external computer or terminal equipment.
  • the video module 1705 can decode the video input by the video interface 1709, for example, perform H.264 decoding.
  • the video module can also encode the video captured by the optical display device, for example, perform H.264 encoding on the video captured by the external camera.
  • the main processor 1701 can also decode the video input from the video interface 1709, and then output the decoded image signal to the display circuit 1710.
  • the display circuit 1710 and the modulator 1711 are used to display corresponding images.
  • the video interface 1709 receives an externally input video source signal, the video module 1705 outputs one or more image signals to the display circuit 1710 after decoding and/or digital processing, and the display circuit 1710 drives the modulator 1711 according to the input image signal to image the incident polarized light and then output at least two paths of image light.
  • the main processor 1701 can also output one or more image signals to the display circuit 1710 .
  • the display circuit 1710 and the modulator 1711 belong to the electronic components in the modulation unit, and the display circuit 1710 may be called a driving circuit.
  • the power module 1706 is used to provide power for the main processor 1701 and the light source 1712 according to the input power (such as DC power).
  • the light emitted by the light source 1712 can be transmitted to the modulator 1711 for imaging, thereby forming an image light signal.
  • the wireless communication module 1707 can enable the optical display device to communicate wirelessly with the outside world, which can provide wireless local area networks (Wireless Local Area Networks, WLAN) (such as wireless fidelity (Wireless Fidelity, Wi-Fi) network), Bluetooth (Bluetooth, BT) , global navigation satellite system (Global Navigation Satellite System, GNSS), frequency modulation (Frequency Modulation, FM), near field communication technology (Near Field Communication, NFC), infrared technology (Infrared, IR) and other wireless communication solutions.
  • the wireless communication module 1707 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 1707 receives electromagnetic waves through the antenna, frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the main processor 1701 .
  • the wireless communication module 1707 can also receive the signal to be sent from the main processor 1701, frequency-modulate it, amplify it, and convert it into electromagnetic wave and radiate it through the antenna.
  • the video data decoded by the video module 1705 can not only be input through the video interface 1709, but can also be received wirelessly through the wireless communication module 1707 or read from an external memory. For example, the optical display device can receive video data wirelessly from a terminal device or an in-vehicle entertainment system, and can also read the audio and video data stored in the external memory.
  • the present application also provides a display system.
  • the display system includes the optical display device shown in any one of the above embodiments and a spatial light amplification module located in a spatial area.
  • the spatial light amplification module is used for amplifying the image corresponding to the image light from the optical display device.
  • the display system may include, but not limited to, a projector, a HUD system, or a desktop display.
  • the spatial light amplification module includes any one or a combination of at least one curved reflector and at least one cylindrical mirror.
  • the cylindrical mirror can have curvature in one dimension, so that one-dimensional shaping can be realized. It can also be understood as diverging or converging light in one dimension and reflecting light in another dimension.
  • the cylindrical mirror can be, for example, a plano-convex cylindrical mirror or a plano-concave cylindrical mirror.
  • the present application may also provide a vehicle.
  • FIG. 13a is a partial structural schematic diagram of a vehicle provided by the present application.
  • the vehicle may include the display system and the windshield in any of the above embodiments, wherein the windshield is used for reflective imaging of the image light from the display system.
  • for example, taking the case where the vehicle is an automobile and the display system is a HUD system, the windshield can be used to reflect the image light from the display system into the eye movement range of the vehicle (see FIG. 2).
  • the second spectrum can be represented by the following formula 12.
  • A(λ) represents the overall spectrum contributed by each optical element that the light passes through from the optical lens until the displayed image is seen by the human eye.
  • the passing optical components include but are not limited to optical lenses, spatial light amplification modules, and windshields.
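  • the body of formula 12 is not reproduced in this extraction; a plausible form, assuming the element spectra multiply along the path from the light source to the eye (an assumption, not a formula quoted from the original text), is:
    Display_eye(λ) ≈ LED(λ)×Rs(λ)×LCoS(λ)×Tp(λ)×A(λ)
    where A(λ) collects the spectra of the optical elements passed from the optical lens to the eye, for example A(λ) = A_lens(λ)×A_amplify(λ)×A_windshield(λ).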
  • the eye movement range may also be referred to as an eyebox (eyebox), and the driver's eyes usually need to be within the eye movement range, as shown in FIG. 2 .
  • if the eyes are aligned with the center of the eyebox, a complete and sharp virtual image can be seen; if they deviate from it, the virtual image seen may appear distorted, appear in wrong colors, or even not appear at all.
  • the size of the general eye movement range is 130mm ⁇ 50mm, that is, there is a movement range of about ⁇ 50mm in the vertical direction and a movement range of about ⁇ 130mm in the horizontal direction.
  • in order to prevent the whole windshield from shattering after an impact, the windshield usually includes two layers of glass and a layer of polyvinyl butyral (polyvinyl butyral, PVB) material sandwiched between the two layers of glass.
  • the refractive index of the PVB material is the same as that of the glass.
  • the windshield can be simplified as a flat glass with a certain thickness (generally 4-5mm).
  • the windshield includes a wedge-shaped windshield or a plane windshield.
  • Vehicles may also include other devices such as steering wheels, memory, and wireless communication devices, among others.
  • FIG. 13b is a schematic diagram of a possible functional framework of a vehicle provided in this application.
  • the display system is introduced as a head-up display system as an example.
  • the functional framework of the vehicle may include various subsystems, such as the sensor system 12, the control system 14, one or more peripheral devices 16 (one is shown as an example), the power supply 18, the computer system 20, and the head-up display system 22.
  • the vehicle may also include other functional systems, such as an engine system providing power for the vehicle, etc., which are not limited in this application.
  • the sensor system 12 may include several detection devices, which can sense the measured information and convert the sensed information into electrical signals or other required forms of information output according to certain rules.
  • these detection devices may include a global positioning system (global positioning system, GPS), a vehicle speed sensor, an inertial measurement unit (inertial measurement unit, IMU), a radar unit, a laser range finder, a camera device, a wheel speed sensor, a steering sensor, a gear sensor, or other elements used for automatic detection; this application does not limit this.
  • the control system 14 may include several elements such as the illustrated steering unit, braking unit, lighting system, automatic driving system, map navigation system, network time synchronization system and obstacle avoidance system.
  • the control system 14 may also include elements such as an accelerator controller and an engine controller for controlling the driving speed of the vehicle, which are not limited in this application.
  • Peripherals 16 may include elements such as a communication system, a touch screen, a user interface, a microphone, and speakers as shown, among others.
  • the communication system is used to realize the network communication between the vehicle and other devices except the vehicle.
  • the communication system can use wireless communication technology or wired communication technology to realize network communication between vehicles and other devices.
  • the wired communication technology may refer to communication between the vehicle and other devices through network cables or optical fibers.
  • Power source 18 represents a system that provides electrical power or energy to the vehicle, which may include, but is not limited to, a rechargeable lithium or lead-acid battery, or the like. In practical applications, one or more battery components in the power supply are used to provide electric energy or energy for starting the vehicle, and the type and material of the power supply are not limited in this application.
  • the computer system 20 may include one or more processors 2001 (one processor is used as an example in the figure) and a memory 2002 (also called a storage device).
  • the memory 2002 may be inside the computer system 20 or outside the computer system 20, for example as a cache in the vehicle; this is not limited in this application.
  • the processor 2001 may include one or more general-purpose processors, such as a graphics processing unit (graphic processing unit, GPU).
  • the processor 2001 can be used to run related programs stored in the memory 2002 or instructions corresponding to the programs, so as to realize corresponding functions of the vehicle.
  • the memory 2002 may comprise volatile memory, such as RAM; the memory may also comprise non-volatile memory, such as ROM, flash memory, an HDD, or a solid state disk (SSD); the memory 2002 may also include combinations of the above types of memory.
  • the memory 2002 can be used to store a set of program codes or instructions corresponding to the program codes, so that the processor 2001 calls the program codes or instructions stored in the memory 2002 to realize corresponding functions of the vehicle. This function includes, but is not limited to, some or all of the functions in the schematic diagram of the vehicle functional framework shown in FIG. 13b. In this application, a set of program codes for vehicle control can be stored in the memory 2002, and the processor 2001 calls the program codes to control the safe driving of the vehicle. How to realize the safe driving of the vehicle will be described in detail below in this application.
  • the memory 2002 can also store information such as road maps, driving routes, and sensor data.
  • the computer system 20 can combine other components in the vehicle functional framework diagram, such as sensors in the sensor system, GPS, etc., to realize related functions of the vehicle.
  • the computer system 20 can control the driving direction or driving speed of the vehicle based on the data input from the sensor system 12 , which is not limited in this application.
  • the heads-up display system 22 may include several elements, such as a windshield as shown, a controller, and a heads-up display.
  • the controller 222 is used to generate images according to user instructions (for example, images containing vehicle status such as vehicle speed and power/fuel level, and images of augmented reality AR content) and send the images to the head-up display for display; the head-up display may include an image generation unit and a reflector, which cooperate with the front windshield to realize the optical path of the head-up display system, so that the target image is presented in front of the driver.
  • the functions of some components in the head-up display system can also be realized by other subsystems of the vehicle, for example, the controller can also be a component in the control system.
  • it should be noted that the four subsystems shown in FIG. 13b, namely the sensor system 12, the control system 14, the computer system 20, and the head-up display system 22, are only examples and do not constitute a limitation.
  • vehicles can combine several components in the vehicle according to different functions, so as to obtain subsystems with corresponding different functions.
  • the vehicle may include more or less systems or elements, which is not limited in this application.
  • vehicles can be smart cars, electric cars, digital cars, cars, trucks, motorcycles, buses, boats, airplanes, helicopters, lawn mowers, recreational vehicles, playground vehicles, construction equipment, trams , golf carts, trains, and trolleys, etc., the application is not limited to this.
  • the present application provides a color adjustment method, please refer to the introduction of FIG. 14 .
  • the color adjustment method can be applied to the optical display device shown in any one of the above-mentioned embodiments in FIG. 3 to FIG. 12a. It can also be understood that the color adjustment method can be implemented based on the optical display device shown in any one of the above-mentioned embodiments in FIG. 3 to FIG. 12a. Alternatively, the color adjustment can also be applied to the display system shown in any of the above embodiments, or can also be applied to the vehicle shown in any of the above embodiments.
  • the color adjustment method can be executed by a control device, which can belong to the optical display device (such as a main processor in the optical display device), or can also be a control device independent of the optical display device, such as a chip or a chip system.
  • for example, the control device may be a domain processor in the vehicle, or may be an electronic control unit (electronic control unit, ECU) in the vehicle, or may be the processing component in the light source module or the processing component in the sensing module described above, etc.
  • the color adjustment method includes the following steps:
  • Step 1401 control the light source module to emit light.
  • the light is synthesized by light of at least two colors.
  • the weight of the light of each color can be controlled by controlling the weight of the current of the light source corresponding to the light of each color input to the light source module.
  • the specific process can refer to the introduction of the aforementioned light source module, and will not be repeated here.
  • the light from the light source module can be divided into the first polarized light and the second polarized light by the light splitting module.
  • the light splitting module please refer to the introduction of the aforementioned light splitting module, which will not be repeated here.
  • Step 1402 controlling the modulation module to modulate the second polarized light to obtain image light carrying image information.
  • step 1402 reference may be made to the introduction of the aforementioned modulation module, and details will not be repeated here.
  • Step 1403 controlling the sensing module to obtain color information of the first polarized light.
  • step 1403 refer to the introduction of the aforementioned sensing module, and details will not be repeated here.
  • Step 1404 according to the color information, control the light source module to adjust the weights of light of at least two colors.
  • step 1404 reference may be made to the introduction of the above-mentioned FIG. 9a and FIG. 10 , which will not be repeated here.
  • each module may be controlled by sending a corresponding control signal to each module.
  • in the above method, the first polarized light detected by the sensing module is used as a calibration of the color of the light emitted by the light source module. Further, the color information obtained by the sensing module from the detection of the first polarized light is used to adjust the weights of the at least two colors of the synthesized light, thereby realizing correction of the color of the image light.
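  • a minimal closed-loop sketch of steps 1401 to 1404 follows (the module interfaces below are assumed for illustration and are not defined by the disclosure):

    def color_adjustment_loop(light_source, splitter, modulator, sensor,
                              target_xy, threshold=0.002, step=0.005):
        # Step 1401: control the light source module to emit the synthesized light.
        light_source.emit()
        # Step 1402: control the modulation module to modulate the second path of light.
        modulator.modulate(splitter.second_path())
        # Step 1403: control the sensing module to obtain the color information.
        x_real, y_real = sensor.color_coordinates(splitter.first_path())
        # Step 1404: adjust the weights of the at least two colors of light if needed.
        x_t, y_t = target_xy
        if abs(x_real - x_t) > threshold or abs(y_real - y_t) > threshold:
            weights = light_source.current_weights()
            idx = 0 if x_real < x_t else 1   # illustrative choice of which source to boost
            weights[idx] += step
            for i in range(len(weights)):
                if i != idx:
                    weights[i] -= step / (len(weights) - 1)
            light_source.set_current_weights(weights)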
  • control device includes hardware structures and/or software modules corresponding to each function.
  • the present application can be implemented in the form of hardware or a combination of hardware and computer software in combination with the modules and method steps described in the embodiments disclosed in the present application. Whether a certain function is executed by hardware or by computer software driving hardware depends on the specific application scenario and design constraints of the technical solution.
  • Fig. 15 and Fig. 16 are schematic structural diagrams of a possible control device provided in the present application. These control devices can be used to realize the functions shown in Fig. 9a, Fig. 10 or Fig. 14 in the above-mentioned method embodiments, so the beneficial effects possessed by the above-mentioned method embodiments can also be realized.
  • control device 1500 includes a processing module 1501 , and may further include a transceiver module 1502 .
  • the control device 1500 is used to implement the functions in the method embodiment shown in FIG. 14 above.
  • the processing module 1501 is used to control the light source module to emit light, control the modulation module to modulate the second polarized light to obtain image light carrying image information, control the sensing module to obtain color information of the first polarized light, and control the light source module to adjust the weights of the at least two colors of light according to the color information.
  • the transceiver module 1502 is configured to send control signals to the light source module, modulation module, sensor module and the like.
  • processing module 1501 in the embodiment of the present application may be implemented by a processor or processor-related circuit components, and the transceiver module 1502 may be implemented by an interface circuit and other related circuit components.
  • the present application further provides a control device 1600 .
  • the control device 1600 may include a processor 1601 , and further, optionally, may also include an interface circuit 1602 .
  • the processor 1601 and the interface circuit 1602 are coupled to each other. It can be understood that the interface circuit 1602 may be an input and output interface.
  • the control device 1600 may further include a memory 1603 for storing computer programs or instructions executed by the processor 1601, and the like.
  • the processor 1601 is used to execute the functions of the above-mentioned processing module 1501
  • the interface circuit 1602 is used to execute the functions of the above-mentioned transceiver module 1502 .
  • the present application provides a chip.
  • the chip may include a processor and an interface circuit. Further, optionally, the chip may also include a memory, and the processor is used to execute computer programs or instructions stored in the memory, so that the chip performs the method in any of the possible implementations of FIG. 14 above.
  • the method steps in the embodiments of the present application may be implemented by means of hardware, or may be implemented by means of a processor executing software instructions.
  • Software instructions can be composed of corresponding software modules, and software modules can be stored in random access memory (random access memory, RAM), flash memory, read-only memory (read-only memory, ROM), programmable read-only memory (programmable ROM) , PROM), erasable programmable read-only memory (erasable PROM, EPROM), electrically erasable programmable read-only memory (electrically EPROM, EEPROM), register, hard disk, mobile hard disk, CD-ROM or known in the art any other form of storage medium.
  • an exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium.
  • the storage medium may also be a component of the processor.
  • the processor and storage medium can be located in the ASIC.
  • the ASIC may be located in an optical display device, a display system, or a vehicle.
  • the processor and the storage medium can also exist in the optical display device, the display system or the vehicle as a separate module.
  • all or part of them may be implemented by software, hardware, firmware or any combination thereof.
  • software When implemented using software, it may be implemented in whole or in part in the form of a computer program product.
  • a computer program product consists of one or more computer programs or instructions. When the computer programs or instructions are loaded and executed on the computer, the processes or functions of the embodiments of the present application are executed in whole or in part.
  • the computer can be a general purpose computer, special purpose computer, a network of computers, or other programmable devices.
  • computer programs or instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, computer programs or instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired or wireless manner.
  • a computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center integrating one or more available media.
  • available media can be magnetic media, such as floppy disks, hard disks, and magnetic tapes; optical media, such as digital video discs (digital video discs, DVDs); or semiconductor media, such as solid state drives (SSDs).
  • "at least one item (piece) of a, b, or c" can mean: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b, and c can be single or multiple.
  • the character "/" generally indicates that the associated objects are in an "or" relationship
  • in a formula, the character "/" indicates that the associated objects are in a "division" relationship.
  • the symbol "(a, b)" means an open interval with a range greater than a and less than b; "[a, b]" means a closed interval with a range greater than or equal to a and less than or equal to b; "(a, b]" means a half-open, half-closed interval with a range greater than a and less than or equal to b.
  • the word “exemplarily” is used to mean an example, illustration or illustration.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Instrument Panels (AREA)

Abstract

An optical display device (300), a display system, a vehicle, and a color adjustment method, used to solve the problem of image color shift in the prior art, and applicable to fields such as manual driving and in-vehicle audio/video. The optical display device (300) includes: a light source module (301) for emitting light synthesized from light of at least two colors; a light splitting module (302) for splitting the light into a first path of light and a second path of light, propagating the first path of light to a sensing module (304) and the second path of light to a modulation module (303); the modulation module (303) for modulating the second path of light to obtain image light carrying image information; and the sensing module (304) for obtaining color information of the first path of light, the color information being used to adjust the weights of the light of the at least two colors. By using the first path of light as a calibration of the color of the emitted light, obtaining the color information based on the first path of light, and adjusting the weights of the light of the at least two colors according to the color information, color correction of the image light can be realized, which helps reduce the color shift of the image formed by the image light.

Description

Optical display device, display system, vehicle, and color adjustment method
Cross-Reference to Related Applications
This application claims priority to Chinese patent application No. 202111271734.X, filed with the China National Intellectual Property Administration on October 29, 2021 and entitled "Optical display device, display system, vehicle, and color adjustment method", which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of image display technologies, and in particular to an optical display device, a display system, a vehicle, and a color adjustment method.
Background
With the continuous development of technology, increasingly high requirements are placed on the convenience and safety of vehicles. For example, the wide application of head up display (HUD) systems can improve the safety of vehicle use. A head-up display system projects driving-related information (such as instrument information or navigation information) into the driver's forward field of view, so that the driver can see the instrument and navigation information ahead without looking down at the instrument panel below the steering wheel or at the central control display. This improves the braking reaction time in an emergency and thus improves driving safety.
The environment in which a vehicle travels is relatively complex. For example, depending on the season there may be a humid spring environment, a hot summer or rainstorm environment, a windy autumn environment, or a cold, snowy winter environment, all of which may affect components in the HUD. The HUD therefore needs to set a specific working mode according to the specific environment. In this case, the image (or picture) displayed by the optical display device (also called the light engine) in the HUD may be set to a specific brightness and color (which can be represented by color coordinates), but environmental differences may cause the image displayed by the optical display device to exhibit color shift or brightness attenuation. Color shift means that the color of the actual displayed image is clearly different from the color of the true image.
In summary, how to reduce the color shift of the image displayed by an optical display device is a technical problem that urgently needs to be solved.
Summary
This application provides an optical display device, a display system, a vehicle, and a color adjustment method, used to reduce the color shift of an image displayed by the optical display device.
According to a first aspect, this application provides an optical display device including a light source module, a light splitting module, a modulation module, and a sensing module. The light source module is configured to emit (output) light synthesized from light of at least two colors. The light splitting module is configured to split the light from the light source module into a first path of light and a second path of light, propagate the first path of light to the sensing module, and propagate the second path of light to the modulation module. The modulation module is configured to modulate the second path of light to obtain image light carrying image information. The sensing module is configured to obtain color information of the first path of light, where the color information is used by the light source module to adjust the weights of the light of the at least two colors. The foregoing modules may also be referred to as components or units, for example, a light source component, a light source unit, a modulation component, or a modulation unit.
Based on this solution, the sensing module can detect the color information of the light currently emitted by the light source module; in other words, the first path of light serves as a calibration of the color of the light emitted by the light source module. Further, based on the color information detected by the sensing module from the first path of light, the weights of the at least two colors of light in the synthesized light are adjusted. In this way, the color of the image light can be corrected without adding optical structures, which helps reduce the color shift (or chromatic aberration) of the image displayed by the optical display device.
Three possible solutions for adjusting the weights of the at least two colors of light based on the color information are described below by way of example.
Implementation 1: the light source module adjusts the weights of the at least two colors of light based on the color information.
In a possible implementation, the light source module includes a processing component and a light emitting component. The processing component is configured to receive the color information from the sensing module, generate, according to the color information, a control signal for adjusting the weights of the at least two colors of light, and send the control signal to the light emitting component; the light emitting component is configured to adjust the weights of the at least two colors of light according to the control signal.
Based on implementation 1, the processing component in the light source module can, based on the color information obtained from the sensing module, control the light emitting component to adjust the weights of the light of each color in the synthesized light, thereby minimizing the color shift of the image formed from that light. Moreover, because the sensing module is mainly used to detect the color information of the first path of light, the structure and processing logic of existing sensing modules do not need to be changed, so existing sensing modules remain compatible.
Further, optionally, the processing component is specifically configured to determine the color coordinates of the image light according to the received color information, and generate the control signal if the difference between the color coordinates of the image light and preset target color coordinates is greater than a threshold.
Generating the control signal only when the difference between the color coordinates of the image light and the preset target color coordinates exceeds the threshold helps avoid frequently adjusting the weights of the light of each color in the synthesized light, and thus reduces the computation load of the processing component.
Implementation 2: the sensing module, based on the color information, controls the light source module to adjust the weights of the at least two colors of light.
In a possible implementation, the sensing module is configured to obtain the color information of the first path of light, generate a control signal according to the color information, and send to the light source module the control signal for adjusting the weights of the at least two colors of light.
With implementation 2, the sensing module controls the light source module to adjust the weights of the at least two colors of light. The structure and processing logic of existing light source modules do not need to be changed, so existing light source modules remain compatible.
Implementation 3: a processing component independent of the light source module and the sensing module (referred to here as a processing module) controls, based on the color information, the light source module to adjust the weights of the at least two colors of light.
In a possible implementation, the optical display device further includes a processing module configured to receive the color information from the sensing module, generate a control signal according to the color information, and send the control signal to the light source module, where the control signal is used to adjust the weights of the at least two colors of light.
With implementation 3, while remaining compatible with existing sensing modules and light source modules, the processing module controls the light source module to adjust the weights of the at least two colors of light. The processing component may be a processor.
For implementation 1, implementation 2, and implementation 3, the light source module may be configured to adjust the weights of the at least two colors of light according to the received control signal.
Further, optionally, the control signal may include the weight of the current of the light source corresponding to each color of light. It should be understood that the light source corresponding to a color of light is the light source that emits light of that color.
In a possible implementation, the light emitting component in the light source module includes a first light source, a second light source, and a third light source.
Further, the first light source is configured to emit red light, the second light source to emit blue light, and the third light source to emit green light. The weights of the red, green, and blue light can be adjusted by controlling the weights of the currents input to the first, second, and third light sources.
Further, optionally, the light source module may also include a first dichroic mirror and a second dichroic mirror. The first dichroic mirror is configured to reflect the blue light from the second light source and transmit the green light from the third light source; the second dichroic mirror is configured to reflect the red light from the first light source, transmit the green light transmitted by the first dichroic mirror, and transmit the blue light reflected by the first dichroic mirror. A dichroic mirror may also be called a light-combining mirror.
With the first and second dichroic mirrors, the third light source, which emits green light, can be placed farthest from the light splitting module, which balances the brightness of the image as much as possible. Green light has the greatest influence on image brightness, so placing the third light source farthest from the light splitting module balances the influence of the red, green, and blue light on image brightness as much as possible.
In a possible implementation, the light splitting module includes a polarizing beam splitter configured to split the light from the light source module into first polarized light and second polarized light, propagate the first polarized light to the sensing module, and propagate the second polarized light to the modulation module. It should be understood that the first polarized light and the second polarized light have different polarization states but the same color information.
Further, the first polarized light is P-polarized light and the second polarized light is S-polarized light; or the first polarized light is S-polarized light and the second polarized light is P-polarized light.
The polarizing beam splitter splits the light from the light source module into the first polarized light and the second polarized light. Further, the first polarized light entering the sensing module may be S-polarized light, in which case the second polarized light entering the modulation module is P-polarized light; or the first polarized light entering the sensing module may be P-polarized light, in which case the second polarized light entering the modulation module is S-polarized light.
In another possible implementation, the light splitting module includes a transflective element (for example, a semi-transparent, semi-reflective element), where the first path of light is the light transmitted by the transflective element and the second path of light is the light reflected by it; or the first path of light is the light reflected by the transflective element and the second path of light is the light transmitted by it.
The transflective element splits the light from the light source module into the first path of light and the second path of light, and the polarization states of the two paths may be the same.
Further, optionally, if the light splitting module is a transflective element, the light source module may further include a polarizer, or the optical display device may further include a polarizer.
In a possible implementation, the optical display device may further include an optical lens configured to project the image light output by the modulation module onto a spatial region.
The optical lens can shape and/or homogenize the image light output by the modulation module, which helps improve the quality of the image formed from the image light.
The optical display device provided in the first aspect may be called a picture generation unit (PGU) or light engine, and the PGU can be used in various display systems, for example in a projector, a head-up display system, a head-mounted optical display device, or a desktop display.
According to a second aspect, this application provides a display system including the optical display device of the first aspect or any implementation of the first aspect and a spatial light amplification module located in the spatial region; the spatial light amplification module is configured to magnify the image corresponding to the image light from the optical display device.
The display system may include, but is not limited to, a projector, a HUD system, a desktop display, or a head-mounted optical display device.
For example, the spatial light amplification module may include at least one curved reflector, or at least one cylindrical mirror, or a combination of at least one curved reflector and at least one cylindrical mirror.
According to a third aspect, this application provides a vehicle including the display system of the second aspect or any implementation of the second aspect and a windshield; the windshield is configured to reflect the image light from the display system for imaging, for example to reflect it into the eye movement range (eyebox position) of the vehicle.
According to a fourth aspect, this application provides a color adjustment method applicable to an optical display device that includes a light source module, a light splitting module, a modulation module, and a sensing module.
The method includes: controlling the light source module to emit light synthesized from light of at least two colors, where the light is split by the light splitting module into a first path of light and a second path of light; controlling the modulation module to modulate the second path of light to obtain image light carrying image information; controlling the sensing module to obtain color information of the first path of light; and controlling, according to the color information, the light source module to adjust the weights of the light of the at least two colors.
In a possible implementation, the color coordinates of the image light may be determined according to the color information; if the difference between the color coordinates of the image light and preset target color coordinates is greater than a threshold, a control signal for adjusting the weights of the at least two colors of light is generated and sent to the light source module, so that the light source module adjusts the weights of the at least two colors of light according to the control signal.
According to a fifth aspect, this application provides a color adjustment apparatus configured to implement the method of the fourth aspect or any implementation of the fourth aspect, including corresponding functional modules respectively configured to implement the steps of the above method. The functions may be implemented by hardware or by hardware executing corresponding software; the hardware or software includes one or more modules corresponding to the above functions.
According to a sixth aspect, this application provides an optical display device that may include a light source module, a light splitting module, a modulation module, and a sensing module. The light source module emits light synthesized from light of at least two colors; the light splitting module splits the light from the light source module into a first path of light propagated to the sensing module and a second path of light propagated to the modulation module; the modulation module modulates the second path of light to obtain image light carrying image information; the sensing module obtains color information of the first path of light, generates a control signal according to the color information, and sends the control signal to the light source module; and the light source module further adjusts the weights of the at least two colors of light according to the control signal.
According to a seventh aspect, this application provides an optical display device that may include a light source module, a light splitting module, a modulation module, a sensing module, and a processing module. The light source module emits light synthesized from light of at least two colors; the light splitting module splits the light from the light source module into a first path of light propagated to the sensing module and a second path of light propagated to the modulation module; the modulation module modulates the second path of light to obtain image light carrying image information; the sensing module obtains color information of the first path of light and sends it to the processing module; the processing module generates a control signal according to the color information and sends it to the light source module, where the control signal is used by the light source module to adjust the weights of the at least two colors of light; and the light source module further adjusts the weights of the at least two colors of light according to the control signal.
According to an eighth aspect, this application provides a chip including at least one processor and an interface circuit; further, optionally, the chip may also include a memory, and the processor is configured to execute the computer program or instructions stored in the memory, so that the chip performs the method of the fourth aspect or any possible implementation of the fourth aspect.
According to a ninth aspect, this application provides a computer-readable storage medium storing a computer program or instructions that, when executed by a control apparatus, cause the control apparatus to perform the method of the fourth aspect or any possible implementation of the fourth aspect.
According to a tenth aspect, this application provides a computer program product including a computer program or instructions that, when executed by a control apparatus, cause the control apparatus to perform the method of the fourth aspect or any possible implementation of the fourth aspect.
For the technical effects achievable by any one of the second to tenth aspects, refer to the description of the beneficial effects in the first aspect; details are not repeated here.
Brief Description of Drawings
FIG. 1a to FIG. 1c are schematic diagrams of a possible application scenario provided by this application;
FIG. 2 is a schematic diagram of another possible application scenario provided by this application;
FIG. 3 is a schematic structural diagram of an optical display device provided by this application;
FIG. 4a is a schematic structural diagram of a light source module provided by this application;
FIG. 4b is a schematic structural diagram of another light source module provided by this application;
FIG. 4c is a schematic structural diagram of a light homogenizing component provided by this application;
FIG. 5a is a schematic diagram of the light splitting principle of a polarizing beam splitter provided by this application;
FIG. 5b is a schematic structural diagram of a transflective element provided by this application;
FIG. 5c is a schematic structural diagram of a liquid crystal on silicon (LCoS) device provided by this application;
FIG. 6 is a schematic structural diagram of an optical lens provided by this application;
FIG. 7a is a schematic structural diagram of another optical display device provided by this application;
FIG. 7b is a schematic structural diagram of yet another optical display device provided by this application;
FIG. 7c is a schematic structural diagram of yet another optical display device provided by this application;
FIG. 8 is a schematic structural diagram of another optical display device provided by this application;
FIG. 9a is a schematic flowchart of a color adjustment method provided by this application;
FIG. 9b is a schematic diagram of the PWM current input to a light source module provided by this application;
FIG. 10 is a schematic flowchart of a method for the color initialization process of an image provided by this application;
FIG. 11 is a schematic structural diagram of another optical display device provided by this application;
FIG. 12a is a schematic structural diagram of yet another optical display device provided by this application;
FIG. 12b is a schematic circuit diagram of an optical display device provided by this application;
FIG. 13a is a partial structural schematic diagram of a vehicle provided by this application;
FIG. 13b is a schematic diagram of the functional framework of a vehicle provided by this application;
FIG. 14 is a schematic flowchart of a color adjustment method provided by this application;
FIG. 15 is a schematic structural diagram of a control apparatus provided by this application;
FIG. 16 is a schematic structural diagram of a control apparatus provided by this application.
Detailed Description of Embodiments
The embodiments of this application are described in detail below with reference to the accompanying drawings.
Some terms used in this application are explained first. It should be noted that these explanations are intended to help those skilled in the art understand this application and do not limit the claimed scope of protection.
1. Color coordinates
Color coordinates are also called a colorimetric system and can represent a color precisely. Usually the horizontal axis of the color coordinates is x and the vertical axis is y. A pair of color coordinates determines a point on the chromaticity diagram; that point may be called a color point, and the color coordinates (x, y) can be used to represent a color point.
2. Spectrum
A spectrum usually refers to the pattern obtained after polychromatic light is dispersed by a dispersive system, with the separated monochromatic components arranged in order of wavelength (or frequency).
3. Spectral tristimulus values
The spectral tristimulus values are numerical values that approximately describe the three stimulus intensities of a color. In color matching, the three colors used for color mixing (or synthesis) to produce an arbitrary color are called the three primary colors; red, green, and blue are usually used as the three primaries. The amounts of the three primaries needed to match an equal-energy spectral color are called the spectral tristimulus values, denoted by the symbols r, g, and b.
4. Pulse width modulation (PWM) waveform
A pulse width modulation waveform is a pulse waveform with a variable duty cycle. In PWM, the amplitudes of the pulses are equal; to change the amplitude of the equivalent output waveform, the widths of the pulses only need to be changed by the same scale factor. The principle of pulse width modulation is to control the switching of the switching elements of an inverter circuit (for example, to control their on-time), so that the output is a series of pulses of equal amplitude, and these pulses are used to replace a sine wave or another desired waveform.
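As a simple illustration of the duty-cycle relationship described above (this expression is added for clarity and is not quoted from the original text): with equal-amplitude pulses of peak current I_peak, pulse width τ, and pulse period T, the equivalent (average) current delivered to a light source is approximately
I_avg ≈ (τ/T)×I_peak
so changing the interval between adjacent pulses (the duty cycle τ/T) changes the effective drive current without changing the pulse amplitude.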
5. Dichroic mirror
A dichroic mirror may also be called a two-color mirror or a light-combining mirror. Its characteristic is that it almost completely transmits light of certain wavelengths while almost completely reflecting light of other wavelengths. For example, dichroic mirror 1 may transmit blue light and reflect green light; that is, blue light incident on dichroic mirror 1 is almost completely transmitted, while green light incident on it is almost completely reflected. As another example, dichroic mirror 2 may transmit blue and green light and reflect red light; that is, blue and green light incident on dichroic mirror 2 are almost completely transmitted, while red light is almost completely reflected. This application does not limit which wavelengths a dichroic mirror transmits or reflects; a dichroic mirror can be chosen according to actual requirements.
6. Three primary colors
The three primary colors are the "basic colors" that cannot be obtained by mixing other colors. The three primaries are generally red, green, and blue, that is, R (Red), G (Green), and B (Blue). It can also be understood that the three primaries are mutually independent: none of them can be produced by mixing the other two.
Synthesizing (or mixing) the three primaries in different percentages (that is, weights) yields light of different colors. The brightness of the synthesized light is determined by the sum of the brightness of the three primaries, and its chromaticity (which can be represented by color coordinates) is determined by the weights of the three primaries.
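As a simple illustration consistent with formulas 1 and 2 later in this description (added here for clarity, not quoted from the original text): if S_R(λ), S_G(λ), and S_B(λ) are the spectra of the three primaries and w_R, w_G, and w_B are their weights with w_R + w_G + w_B = 1, the synthesized light can be written as
S_mix(λ) = w_R×S_R(λ) + w_G×S_G(λ) + w_B×S_B(λ)
and its chromaticity (color coordinates) depends on the ratio w_R : w_G : w_B.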
The foregoing describes some terms involved in this application; the technical solutions of this application are described below.
This application provides an optical display device, a display system, a vehicle, and a corresponding color adjustment method. The optical display device may be integrated in the projector 100a shown in FIG. 1a, which can project an image onto a wall or a projection screen. Alternatively, the optical display device may be integrated in the display 100b shown in FIG. 1b. Alternatively, it may be integrated in an in-vehicle display (for example, 100c in FIG. 1c), which may be mounted on the back of a seat or at the front passenger position of the vehicle; this application does not limit the mounting position of the in-vehicle display. Alternatively, the optical display device may be integrated in the head-up display system shown in FIG. 2, which can display driving assistance information.
Further, optionally, the description below takes as an example the case where the optical display device is integrated in a HUD system and the HUD system is applied to a vehicle. The HUD system can project the formed image (called the HUD virtual image) into the driver's forward field of view and fuse it with real road information, thereby enhancing the driver's perception of the actual driving environment. For example, the HUD system can superimpose a HUD virtual image carrying navigation information and/or instrument information (such as vehicle speed, rotation speed, temperature, and fuel level) onto the real environment outside the vehicle, giving the driver an augmented-reality visual effect. This can be applied to scenarios such as AR navigation, adaptive cruise, and lane departure warning. The HUD system includes, but is not limited to, a windshield (W)-HUD system, an augmented reality head up display (AR-HUD), and the like.
It should be understood that the above scenarios are only examples; the method and device provided by this application can also be applied in many other scenarios and are not limited to the scenarios exemplified above.
As described in the background, because the environment in which a vehicle travels is complex, the HUD needs to set a specific working mode according to the specific environment in order to adapt to different environments. This may cause the image (or picture) displayed by the optical display device in the HUD to be set to a specific color, and consequently the image displayed by the optical display device may exhibit color shift and other problems.
In view of this, this application proposes an optical display device in which a sensing module detects the color information of the light currently emitted by the light source module in order to correct the color of the image to be displayed, which helps reduce the color shift of the image displayed by the optical display device.
Based on the above, the optical display device proposed by this application is described in detail below with reference to FIG. 3 to FIG. 10.
FIG. 3 is a schematic structural diagram of an optical display device provided by this application. The optical display device 300 may include a light source module 301, a light splitting module 302, a modulation module 303, and a sensing module 304. The light source module 301 is configured to emit light, which may be synthesized from light of at least two colors; for example, the light may be a mixture of red, green, and blue light. The light splitting module 302 is configured to split the light from the light source module 301 into a first path of light and a second path of light, propagate the first path of light to the sensing module 304, and propagate the second path of light to the modulation module 303; the polarization states of the first and second paths of light may be the same or different, but their color information is the same. The modulation module 303 is configured to modulate the second path of light (for example, by amplitude modulation and/or phase modulation) to obtain image light, which is light carrying image information (such as navigation information and/or instrument information). The sensing module 304 is configured to obtain color information of the first path of light, and the color information is used by the light source module to adjust the weights of the at least two colors of light.
Based on the above optical display device, the sensing module can detect the color information of the light currently emitted by the light source module; in other words, the first path of light serves as a calibration of the color of the light emitted by the light source module. Further, based on the color information detected by the sensing module from the first path of light, the weights of the at least two colors of light in the synthesized light are adjusted, so that the color of the image light can be corrected without adding optical structures, which helps reduce the color shift of the image displayed by the optical display device.
It should be understood that the sum of the weights of the various colors of light in the synthesized light is a fixed value (for example, 100%). If the weight of at least one color of light is changed, the weight of at least one of the remaining colors changes correspondingly. In other words, adjusting the weights of the at least two colors of light may mean changing the weight of each of the at least two colors, or changing the weights of only some of the colors (in this case there may be three or more colors); this application does not limit this.
It should be noted that, in the optical display device, the factor through which the environment (such as temperature) most strongly affects the color shift of the displayed image is the light source module. Therefore, monitoring the color information of the light emitted by the light source module can effectively improve the color shift of the image.
Each functional module shown in FIG. 3 is described below to give exemplary specific implementations. For ease of description, the light source module, light splitting module, modulation module, and sensing module below are not given numeric labels.
1. Light source module
In a possible implementation, the light source module may include a light emitting component. For example, the light emitting component may include at least two light sources, each of which emits light of one color. A light source may be, for example, a laser diode (laser diode, LD), a light emitting diode (light emitting diode, LED), an organic light emitting diode (organic light emitting diode, OLED), or a micro light emitting diode (micro light emitting diode, micro-LED).
For example, the light emitting component may include a first light source, a second light source, and a third light source, where the first light source emits red light, the second light source emits blue light, and the third light source emits green light. In other words, the light emitting component may include an R light source, a G light source, and a B light source. The red light emitted by the first light source, the green light emitted by the third light source, and the blue light emitted by the second light source can be mixed to obtain light of different colors, for example white light.
Possible structures of the light source module are described below, taking a light emitting component including a first light source, a second light source, and a third light source as an example.
FIG. 4a is a schematic structural diagram of a light source module provided by this application. The light source module includes a light emitting component with a first light source, a second light source, and a third light source arranged in a row; the light of three colors (RGB) emitted by the three light sources can be mixed to form white light. Further, optionally, to improve the uniformity of the beam emitted by each light source, each light source may also correspond to a collimating mirror (for example, a collimating lens or a curved reflector). Specifically, the first light source corresponds to one collimating mirror, the second light source to another, and the third light source to yet another.
With this light source module, the light of the various colors can be mixed while omitting some optical elements (such as dichroic mirrors), which helps miniaturize the light source module and thus the optical display device.
Based on the light source module shown in FIG. 4a, the spectrum of the light emitted from the light source module can be expressed as LED(λ); see formula 1 below.
LED(λ) = LED_B(λ) + LED_G(λ) + LED_R(λ)   (formula 1)
where LED_B(λ) represents the spectrum of the blue light emitted by the second light source, LED_G(λ) the spectrum of the green light emitted by the third light source, and LED_R(λ) the spectrum of the red light emitted by the first light source.
请参阅图4b,为本申请提供的另一种光源模组的结构示意图。该光源模组可包括发光组件、第一二向镜和第二二向镜,发光组件包括第一光源、第二光源和第三光源。三个光源发射出的三种颜色(RGB)的光,第一二向镜用于反射来自第二光源的蓝光,并向第二二向镜透射来自第三光源的绿光;第二二向镜用于反射来自第一光源的红光,透射第一二向镜透射的绿光、以及透射第一二向镜反射的蓝光。也可以理解为,经第二二向镜后,来自第一光源的红光、来自第三光源的绿光和来自第二光源的蓝光混合形成光线。进一步,可选的,为了提高各个光源发出的光束的均匀度,每个光源也可以对应一个准直镜,具体可参见前述图4a的介绍,此处不再赘述。
基于图4b所示的光源模组,从光源模组出射的光线的光谱LED(λ)可参见下述公式2。
LED(λ)=LED_B(λ)×R_1(λ)×T_2(λ)+LED_G(λ)×T_1(λ)×T_2(λ)+LED_R(λ)×R_2(λ)   公式2
其中,LED_B(λ)表示第二光源发射的蓝光的光谱,LED_G(λ)表示第三光源发射的绿光的光谱,LED_R(λ)表示第一光源发射的红光的光谱,T_1(λ)表示第一二向镜的透射光谱,R_1(λ)表示第一二向镜的反射光谱,T_2(λ)表示第二二向镜的透射光谱,R_2(λ)表示第二二向镜的反射光谱。
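为便于理解公式2的计算方式,下面用离散采样光谱给出一个Python示意;其中各光源的光谱用高斯形状近似、二向镜的透射/反射光谱用阶跃函数近似,均为假设数据,并非实测结果。

```python
import numpy as np

lam = np.arange(380, 781, 1.0)                      # 波长采样,单位nm

def gaussian(center, width):                        # 假设的光源光谱形状
    return np.exp(-0.5 * ((lam - center) / width) ** 2)

LED_B, LED_G, LED_R = gaussian(455, 12), gaussian(530, 18), gaussian(625, 10)

T1 = (lam > 490).astype(float)    # 第一二向镜:假设透射绿光、反射蓝光
R1 = 1.0 - T1
T2 = (lam < 580).astype(float)    # 第二二向镜:假设透射蓝光和绿光、反射红光
R2 = 1.0 - T2

# 公式2:合成光线的光谱
LED = LED_B * R1 * T2 + LED_G * T1 * T2 + LED_R * R2
```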
需要说明的是,上述给出的光源模组中的第一光源、第二光源和第三光源的位置也可以互换。对于上述图4b所示的光源模组的结构,若第一光源和第二光源的位置互换,相应的,可用第三二向镜替换第二二向镜,第三二向镜可反射蓝光,透射红光和绿光。此处不再一一列举。
为了提高形成图像的质量(例如图像的亮度的均匀度),光源模组还可以包括匀光组件。具体的,各种颜色的光合成后形成的光线先经过匀光组件进行匀光之后,再射向分光模组。其中,匀光组件可以是由一系列(如两个或者两个以上)透镜(或称为子眼)组成的复眼透镜(可参见图4c),以压缩光线的角度,从而使射向分光模组的光线更加均匀。
需要说明的是,图4c中所示的复眼透镜包括的透镜的数量仅是示例,本申请中复眼透镜可以包括比图4c更多的透镜,也可以包括比图4c更少的透镜,本申请对此不作限定。应理解,复眼透镜包括的子眼越多,匀光效果越好。此外,复眼透镜可以是一个,也可以是多个,本申请对此也不作限定。
二、分光模组
在一种可能的实现方式中,分光模组可将来自光源模组的光线分为第一路光和第二路光。
下面示例性的示出了两种可能的分光模组的结构。
结构一,分光模组可以为偏振光分束器。
基于该结构一,第一路光和第二路光的偏振态不同,第一路光可以称为第一偏振光,第二路光可称为第二偏振光。
如图5a所示,为本申请提供的一种偏振光分束器的分光原理示意图。偏振光分束器(polarizing beam splitter,PBS)是通过在直角棱镜的斜面镀制一层或多层薄膜、再通过胶层相贴合而形成的一种光学元件。其利用光线以布鲁斯特角入射时P偏振光透射率为1而S偏振光透射率小于1的性质,在光线以布鲁斯特角多次通过薄膜以后,使得P偏振分量完全透过,而绝大部分(至少90%以上)S偏振分量被反射。也可以理解为,PBS具有透射和反射特性,通常对S偏振光的反射率在99.5%以上,对P偏振光的透过率在91%以上。示例性地,偏振光分束器可将入射光(包括P偏振光和S偏振光)分为水平偏振光(即P偏振光)和垂直偏振光(即S偏振光)。其中,P偏振光完全通过,S偏振光以45度角被反射,且S偏振光的出射方向与P偏振光的出射方向成90度角。
在一种可能的实现方式中,偏振分束器用于将来自光源模组的光线分为第一偏振光和第二偏振光,第一偏振光可为P偏振光,相应的,第二偏振光为S偏振光;或者,第一偏振光为S偏振光,第二偏振光为P偏振光。
进一步,可选地,分光模组还将第一偏振光(即S偏振光)反射至传感模组,将第二偏振光(即P偏振光)透射至调制模组;或者,将第一偏振光(即P偏振光)透射至传感模组,将第二偏振光(即S偏振光)反射至调制模组。
结构二,分光模组为透反件。
基于该结构二,第一路光和第二路光的偏振态相同。例如,第一路光和第二路光可以均是P偏光,或者也可以均是S偏光,或者也可以均是自然光。需要说明的是,若第一路光和第二路光均是P偏光或S偏光,可以在光源模组内或光源模组与分光模组之间增加对应的偏振片,对应的偏振片可以允许P偏光通过或允许S偏光通过。
在一种可能的实现方式中,透反件可以将来自光源模组的光线的部分透射得到第一路光,部分反射得到第二路光。也可以理解为,第一路光为从透反件透射的光,第二路光为从透反件反射的光。或者,透反件可以将来自光源模组的光线的部分反射得到第一路光,部分透射得到第二路光。也可以理解为,第一路光为从透反件反射的光,第二路光为从透反件透射的光。
示例性地,透反件例如可以是分光镜,其工作部分可以是镀制了分光膜(可参见图5b)的平面,以改变入射光被透射和被反射的比例。例如可以在透明的平板基板上镀上分光膜形成透反件。需要说明的是,透反件可以根据具体的需求选择分光膜的反射率和透射率,例如反射率可以高于50%、透射率低于50%;或者反射率低于50%、透射率高于50%;或者反射率和透射率均等于50%,该类透反件也可称为半透半反镜(semi-transparent and semi-reflective mirror),即半透半反镜的透射率和反射率各为50%,当入射光经过半透半反镜后,其透过的光强和反射的光强各占50%。
需要说明的是,上述给出的分光模组的结构仅是示例,结构一可以理解为是基于偏振态原理进行分光的,结构二可以理解为是基于光的强度(或称为能量)进行分光的。当然,其它可以实现来自光源模组的光线分为第一路光和第二路光的结构也在本申请的保护范围。
三、调制模组
在一种可能的实现方式中,调制模组可包括图像源(或称为光学数据处理(optical data processing,ODP)单元),用于对接收到的第二偏振光进行调制,得到携带图像信息的图像光。具体的,调制模组可对第二偏振光进行空间相位调制,得到携带图像信息的图像光。其中,图像光的偏振态与第一偏振光的偏振态相同。因此,调制模组将图像光反射至分光模组后,可经分光模组将图像光透射至空间区域。若第二偏振光为S偏振光,调制模组对第二偏振光进行空间相位调制,得到的图像光为P偏光;若第二偏振光为P偏振光,调制模组对第二偏振光进行空间相位调制,得到的图像光为S偏光。
示例性地,调制模组可以包括但不限于:LCoS(可参见前述相关描述)显示器、液晶显示器(liquid crystal display,LCD)、数字光处理(digital light processing,DLP)显示器、激光线扫描(laser beam scanning,LBS)显示器等。
如图5c所示,为本申请提供的一种硅基液晶(liquid crystal on silicon,LCoS)的结构示意图。该LCoS的上层玻璃基底和下层基于互补金属氧化物半导体(complementary metal oxide semiconductor,CMOS)工艺的硅基底之间注入液晶,形成液晶层;液晶层的底部设置有电极。LCoS的工作原理为:当液晶层的某像素的外加电压为0时,输入的S偏振光经过液晶层,偏振方向不发生偏转,到达底部反射回来输出S偏振光,经过偏振光分束器反射后,S偏振光原路返回。当此像素外部施加电压时,输入的S偏振光经过液晶层,偏振方向发生偏转,到达底部反射回来输出P偏振光,直接穿过偏振光分束器,耦合进光学镜头等。因此,可以通过改变外加电压或电流来改变液晶分子长轴的方向,以改变LCoS折射率,从而可改变光经过LCoS的相位。相当于利用相位的延迟来旋转光的偏振态,并配合偏振光分束器实现光的调制。基于LCoS,可以实现较小的显示芯片,有利于光学显示装置的小型化。
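下面给出一个高度简化的Python示意模型(并非LCoS的准确物理模型,仅用于体会"相位延迟旋转偏振态、配合偏振光分束器实现振幅调制"的思路):假设液晶像素对光引入可控的相位延迟δ,经偏振光分束器检偏后的归一化输出光强近似正比于sin²(δ/2)。

```python
import numpy as np

def lcos_pixel_intensity(phase_retardation):
    """简化模型(假设):像素引入相位延迟phase_retardation(弧度),
    经偏振光分束器检偏后,归一化输出光强近似为sin^2(delta/2)。"""
    return np.sin(phase_retardation / 2.0) ** 2

for delta in (0.0, np.pi / 2, np.pi):
    print(round(lcos_pixel_intensity(delta), 3))   # 依次输出 0.0、0.5、1.0
```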
需要说明的是,上述给出的调制模组的仅是示例,其它可以实现将来自光源模组的光线进行调制并生成图像光的装置也在本申请的保护范围。
在一种可能的实现方式中,若分光模组为上述结构一所示的偏振光分束器,调制模组例如可以是LCoS显示器或LCD;若分光模组为上述结构二所示的透反件,调制模组例如可以是LCoS显示器、LCD、DLP显示器、或LBS显示器等。
四、传感模组
在一种可能的实现方式中,传感模组用于检测第一偏振光的色彩信息。示例性地,色彩信息可用光谱表征。具体的,传感模组可检测第一偏振光的第一光谱,并将第一光谱的信息转换为第一电信号。也可以理解为,传感模组对检测到的第一偏振光的第一光谱进行光电转化,得到表示第一偏振光的第一光谱的信息的第一电信号。
此处,传感模组检测到的第一偏振光的第一光谱可用Sensor(λ)表示。若第一偏振光为P偏光,Sensor(λ)可参见下述公式3的表示;若第一偏振光为S偏光,Sensor(λ)可参见下述公式4的表示。
Sensor(λ)=LED(λ)×Tp(λ)   公式3
Sensor(λ)=LED(λ)×Rs(λ)   公式4
其中,Tp(λ)为分光模组的透射光谱,Rs(λ)表示分光模组的反射光谱。
示例性地,传感模组可以包括探测组件。探测组件可以包括但不限于:光电探测器(photodetector,PD)、高速光电二极管、电荷耦合器件(charge coupled device,CCD)、互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管、光电二极管等。传感模组例如可以为颜色传感器。颜色传感器也可称为颜色识别传感器或色彩传感器,可以较精确地区别相似的颜色。
进一步,可基于传感模组输出的携带第一光谱的信息的第一电信号,校正基于图像光的形成的图像的色偏,具体过程可参见下述图9a和图10的介绍,此处不再赘述。
本申请中,光学显示装置还可包括光学镜头。光学镜头可用于将调制模组输出的图像光投射至空间区域。具体的,光学镜头可对调制模组输出的图像光进行整形和/或匀光和/或汇聚,并将整形和/或匀光和/或汇聚后的图像光传播至空间区域。若该光学显示装置应用于显示***(如投影仪、HUD***、桌面显示器、或头戴式光学显示装置等),光学镜头可将整形和/或匀光和/或汇聚后的图像光传播至位于空间区域的空间光放大模组(具体可参见下述相关介绍)。
如图6所示,为本申请提供的一种光学镜头的结构示意图。该光学镜头可包括至少一个镜片。图6是以包括3个镜片为例示例的。需要说明的是,本申请对光学镜头包括的镜片的数量不作限定,可以比上述图6更多,或者也可以比上述图6更少,且本申请对镜片的类型也不作限定,镜片也可以包括其它透镜或其它透镜的组合,例如平凸透镜、平凹透镜等。此外,光学镜头可以是绕光学镜头的光轴旋转对称的。例如,光学镜头中的镜片可以是单片的球面透镜,也可以是多片球面透镜的组合。或者,光学镜头也可以是非旋转对称的。例如,光学镜头中的镜片可以是单片的非球面透镜,也可以是多片非球面透镜的组合。通过多片球面透镜和/或非球面透镜的组合,有助于提高光学镜头的成像质量,降低光学镜头的像差。
在一种可能的实现方式中,光学镜头中的镜片的材料可以是玻璃、树脂或者晶体等光学材料。当镜片的材料为树脂时,有助于减轻光学显示装置的质量。当镜片的材料为玻璃时,有助于进一步提高光学显示装置的成像质量。进一步,为了有效抑制温漂,光学镜头中包括至少一个玻璃材料的镜片。
基于上述内容,下面给出上述光学显示装置的三种具体的结构,以便于进一步理解上述光学显示装置改善显示的图像的色偏的过程。
如图7a所示,为本申请提供的另一种光学显示装置的结构示意图。该光学显示装置700可包括光源模组701、偏振光分束器702、调制模组703、传感模组704、光学镜头705。进一步,可选的,该光学显示装置还可包括处理模组706。图7a中以光源模组701包括第一光源7011、第二光源7012、第三光源7013、第一二向镜7014、第二二向镜7015和复眼透镜7016为例,关于其它各个模组更详细的介绍可参见前述相关描述,此处不再赘述。
基于该光学显示装置,第一光源7011发射红光,第二光源7012发射蓝光,第三光源7013发射绿光,各种颜色的光的传播光路为:绿光经第一二向镜7014透射至第二二向镜7015,蓝光经第一二向镜7014反射至第二二向镜7015,绿光和蓝光经第二二向镜7015透射,红光经第二二向镜7015反射后与蓝光和绿光合成白色光线,该光线经复眼透镜7016匀光后传播至偏振光分束器702,从光源模组射出的光线的光谱可用上述公式2表示。
进一步,该光线经偏振光分束器702分为第一偏振光和第二偏振光,第一偏振光经偏振光分束器702透射至传感模组704。基于此,传感模组704接收到的第一偏振光的第一光谱可参见上述公式3中的Sensor(λ),此处,公式3中的Tp(λ)为偏振光分束器的透射光谱。第二偏振光经偏振光分束器702反射至调制模组703,调制模组703对第二偏振光进行空间相位调制得到图像光(图像光的偏振态与第一偏振光的偏振态相同),并将图像光反射至偏振光分束器702,图像光经偏振光分束器702透射至光学镜头705。基于此,投射到空间区域的图像光的第二光谱可表示为Display(λ),具体可参见下述公式5。
Display′(λ)=LED(λ)×Rs(λ)×LCoS(λ)×Tp(λ)×A′(λ)   公式5
其中,Rs(λ)表示分光模组(此处为偏振光分束器702)的反射光谱,LCoS(λ)表示调制模组703的反射光谱,A′(λ)表示光学镜头705的透射光谱。
由此可以确定,传感模组704检测到的第一光谱与光学镜头705投射至空间区域的第二光谱的差异可用t表示,具体可参见下述公式6。
t=Display′(λ)/Sensor(λ)=Rs(λ)×LCoS(λ)×A′(λ)   公式6
其中,Rs(λ)、LCoS(λ)和A′(λ)可通过分光光度计等测试得到。
由此可以得出,传感模组704检测到的第一光谱与投射至空间区域的第二光谱的关系满足下述公式7。
Display(λ)=t*Sensor(λ)   公式7
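结合公式5至公式7,下面用Python给出由传感模组检测到的第一光谱推算第二光谱的示意;其中分光模组、调制模组与光学镜头的光谱均取假设的常数近似,实际可由分光光度计等测试得到。

```python
import numpy as np

lam = np.arange(380, 781, 1.0)

# 假设的各光学元件光谱(此处用常数近似,仅作示意)
Rs   = np.full_like(lam, 0.98)    # 分光模组的反射光谱
LCoS = np.full_like(lam, 0.80)    # 调制模组的反射光谱
A    = np.full_like(lam, 0.90)    # 光学镜头的透射光谱

def display_from_sensor(sensor_spectrum):
    """公式6、公式7:t = Rs(λ)×LCoS(λ)×A′(λ),Display(λ) = t × Sensor(λ)。"""
    t = Rs * LCoS * A
    return t * sensor_spectrum

sensor = np.exp(-0.5 * ((lam - 550) / 60) ** 2)     # 假设的第一光谱Sensor(λ)
display = display_from_sensor(sensor)               # 推算得到的第二光谱Display(λ)
```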
需要说明的是,上述图7a是以第一偏振光为P偏光,第二偏振光为S偏光为例说明。若第一偏振光为S偏光,第二偏振光为P偏光,可将图7a中调制模组703和传感模组704的位置互换。相应的,可将上述公式5中的Rs(λ)用Tp(λ)替换,Tp(λ)用Rs(λ)替换,相当于公式5维持不变;将上述公式6中的Rs(λ)用Tp(λ)替换。
如图7b所示,为本申请提供的又一种光学显示装置的结构示意图。该光学显示装置710可以将上述图7a中的偏振光分束器702用透反件712替换。具体的,该光学显示装置710可包括光源模组711、透反件712、调制模组713、传感模组714、光学镜头715。进一步,可选的,该光学显示装置还可包括处理模组716。关于各个模组更详细的介绍可参见前述相关描述,此处不再赘述。
进一步,光线经透反件712分为第一路光和第二路光,第一路光经透反件712部分透射至传感模组714。基于此,传感模组714接收到的第一路光的第一光谱可参见上述公式3中的Sensor(λ),此处,公式3中的Tp(λ)为透反件712的透射光谱。第二路光经透反件712反射至调制模组713,调制模组713对第二路光进行空间相位调制得到图像光,并将图像光反射至透反件712,图像光经透反件712透射至光学镜头715。基于此,投射到空间区域的图像光的第二光谱可表示为Display(λ),具体可参见上述公式5,此处公式5中的Rs(λ)表示透反件712的反射光谱。
由此可以确定,传感模组714检测到的第一光谱与光学镜头715投射至空间区域的第二光谱的差异可用t表示,具体可参见上述公式6,此处不再赘述。
需要说明的是,上述图7b是以第一路光为透反件712透射部分的光,第二路光为透反件712反射部分的光为例说明。若第一路光为反射部分的光,第二路光为透射部分的光, 可将图7b中调制模组713和传感模组714的位置互换。相应的,可将上述公式5中的Rs(λ)用Tp(λ)替换,Tp(λ)用Rs(λ)替换,相当于公式5维持不变;将上述公式6中的Rs(λ)用Tp(λ)替换。
如图7c所示,为本申请提供的又一种光学显示装置的结构示意图。该光学显示装置720可以在上述图7b的复眼透镜7116之前(即复眼透镜7116与第二二向镜7115之间)或之后增加偏振片727,也可以理解为,光源模组711还可包括偏振片727。或者,在上述图7b的光源模组711和透反件712之间增加偏振片727,也可以理解为,光学显示装置720还可包括偏振片727。该偏振片727可以允许P偏光或S偏光通过。图7c以偏振片727增加在光源模组721和透反件722之间示例的,以偏振片727允许P偏光通过为例。具体的,该光学显示装置720可包括光源模组721、透反件722、调制模组723、传感模组724、光学镜头725和偏振片727。进一步,可选的,该光学显示装置还可包括处理模组726。关于其它各个模组更详细的介绍可参见前述相关描述,此处不再赘述。
需要说明的是,基于上述图7c的光学显示装置的各个光谱与基于上述图7b的各个光谱的区别在于:上述图7c中进入透反件722的光谱为光源模组721出射的光谱LED(λ)×T_3(λ),其中,T_3(λ)为偏振片727的透射光谱。
为了便于方案的描述,下文的介绍中以第一路光为第一偏振光,第一偏振光为P偏光,第二路光为第二偏振光,第二偏振光为S偏光为例介绍。
基于上述第一光谱与第二光谱的关系,以及传感模组检测到的第一偏振光的第一光谱,修正第二光谱对应的图像光形成的图像的色偏,具体的过程可以由处理组件执行。如下,基于处理组件所属的模组分情形介绍。
情形一,处理组件属于光源模组。
也可以理解为,上述光源模组还可包括处理组件。换言之,光源模组可包括发光组件和处理组件,进一步,可选的,还可包括准直镜和/或二向镜等。
请参阅图8,为本申请提供的另一种光学显示装置的结构示意图。该光学显示装置可包括光源模组801、分光模组802、调制模组803、传感模组804、光学镜头805,光源模组801包括处理组件8011和发光组件8012。传感模组804用于向处理组件8011发送获取到的色彩信息,处理组件8011接收来自传感模组804的色彩信息,根据色彩信息生成控制信号,并向发光组件8012发送控制信号,发光组件8012用于根据控制信号调节至少两种颜色的光的权重,具体的过程可参见下述图9a的介绍。关于光源模组801、分光模组802、调制模组803、传感模组804、光学镜头805更详细的介绍可参见前述描述,此处不再赘述。
示例性地,处理组件可以是中央处理单元(central processing unit,CPU),通用处理器、数字信号处理器(digital signal processor,DSP)、专用集成电路(application specific integrated circuit,ASIC)、现场可编程门阵列(field programmable gate array,FPGA)或者其它可编程逻辑器件、晶体管逻辑器件,硬件部件或者其任意组合。其中,通用处理器可以是微处理器,也可以是任何常规的处理器。
如图9a所示,为本申请提供的一种调节色彩的方法流程示意图。该方法可以由上述光源模组中的处理组件执行,该方法包括以下步骤:
步骤901,传感模组向处理组件发送第一光谱的信息。相应的,处理组件接收来自传感模组的第一光谱的信息。
此处,该第一光谱的信息可携带于第一电信号。也就是说,传感模组向处理组件发送第一电信号,该第一电信号包括第一光谱的信息。
步骤902,处理组件根据第一光谱的信息以及第一光谱与第二光谱的对应关系,确定图像光的第二光谱。
第一光谱可通过传感模组检测第一偏振光得到,根据上述公式7,可得到第二光谱。即Display(λ)=t*Sensor(λ),其中,第二光谱用Display(λ)表示,第一光谱用Sensor(λ)表示。
步骤903,处理组件根据第二光谱、以及光谱与色坐标的对应关系,确定图像光的色坐标。
其中,光谱与色坐标的对应关系可参见下述公式8和公式9。
X=∫φ(λ)x̄(λ)dλ,Y=∫φ(λ)ȳ(λ)dλ,Z=∫φ(λ)z̄(λ)dλ   公式8
x=X/(X+Y+Z),y=Y/(X+Y+Z)   公式9
其中,X、Y、Z表示三刺激值,x̄(λ)、ȳ(λ)、z̄(λ)表示标准观察者光谱三刺激值函数,x、y表示色坐标,φ(λ)表示光谱强度分布,积分区间为可见光范围,例如[380,780],进一步,也可以是[400,700]。
进一步,将用第一光谱表示的第二光谱(即t*Sensor(λ))代入上述公式8和公式9,得到公式10和公式11,基于公式10和公式11可得到图像光的色坐标。
X=∫t·Sensor(λ)x̄(λ)dλ,Y=∫t·Sensor(λ)ȳ(λ)dλ,Z=∫t·Sensor(λ)z̄(λ)dλ   公式10
x_real=X/(X+Y+Z),y_real=Y/(X+Y+Z)   公式11
也可以理解为,图像光的色坐标为(x_real,y_real)。
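作为公式8至公式11的数值计算示意,下面的Python片段由离散采样的光谱计算色坐标;其中标准观察者光谱三刺激值函数x̄(λ)、ȳ(λ)、z̄(λ)此处用高斯函数粗略近似(仅作示意),实际应使用CIE 1931标准数据,t的取值也为假设值。

```python
import numpy as np

lam = np.arange(380, 781, 1.0)

def g(center, width):
    return np.exp(-0.5 * ((lam - center) / width) ** 2)

# 对CIE 1931标准观察者三刺激值函数的粗略高斯近似(仅作示意)
xbar = 1.06 * g(599, 38) + 0.36 * g(446, 19)
ybar = 1.01 * g(557, 47)
zbar = 1.78 * g(449, 22)

def chromaticity(spectrum):
    """公式8、公式9:由光谱强度分布计算色坐标(x, y)。"""
    X = np.trapz(spectrum * xbar, lam)
    Y = np.trapz(spectrum * ybar, lam)
    Z = np.trapz(spectrum * zbar, lam)
    s = X + Y + Z
    return X / s, Y / s

# 公式10、公式11:将Display(λ)=t*Sensor(λ)代入,即可得到图像光的色坐标(x_real, y_real)
sensor = np.exp(-0.5 * ((lam - 550) / 60) ** 2)     # 假设的第一光谱
x_real, y_real = chromaticity(0.7 * sensor)         # 0.7为假设的t值
```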
步骤904,处理组件根据图像光的色坐标与预设的目标色坐标,确定是否需要调节各颜色的光的权重;若需要调节,执行下述步骤905;若不需要调节,执行下述步骤906。
在一种可能的实现方式中,可预先存储多组预设的色坐标(x_target,y_target)。进一步,可选的,光学显示装置可有多种工作模式(例如雪天模式、雨天模式、晴天模式、夜间模式、白天模式等),一种工作模式可对应一组预设的色坐标,一组预设的色坐标可以是白光的色坐标,可参见表1。应理解,预设的目标色坐标可为预设的色坐标中的一个。
表1 预设的色坐标
工作模式    色坐标
雪天模式    A(x_target1,y_target1)
雨天模式    B(x_target2,y_target2)
晴天模式    C(x_target3,y_target3)
夜间模式    D(x_target4,y_target4)
白天模式    E(x_target5,y_target5)
需要说明的是,通过表的形式存储工作模式与预设的色坐标的对应关系仅是一种示例,还可以通过其它可能的形式,本申请对此不作限定。
进一步,可选的,用户可选定某一组预设的色坐标(选定的该组预设的色坐标即为预设的目标色坐标),处理组件可检测到用户选定的该组预设的目标色坐标,对比图像光的色坐标(x_real,y_real)与预设的目标色坐标(x_target,y_target),确定是否需要调节各颜色的光的权重。具体的:△x=|x_target-x_real|,△y=|y_target-y_real|;若△x和△y中至少一个大于阈值,说明图像光的色坐标与预设的目标色坐标不一致,需要调节各颜色的光的权重;若△x和△y均不大于阈值,说明图像光的色坐标与预设的目标色坐标一致,不需要调节各颜色的光的权重。其中,阈值例如可以是0.002等。
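步骤904中的判断逻辑可以用如下Python片段示意,其中阈值0.002取自上文示例,色坐标数值为假设值,均属于可配置参数。

```python
def need_adjust(real_xy, target_xy, threshold=0.002):
    """对比图像光的色坐标与预设的目标色坐标,判断是否需要调节各颜色的光的权重。"""
    dx = abs(target_xy[0] - real_xy[0])
    dy = abs(target_xy[1] - real_xy[1])
    return dx > threshold or dy > threshold

print(need_adjust((0.3130, 0.3292), (0.3127, 0.3290)))   # False:无需调节
print(need_adjust((0.3200, 0.3292), (0.3127, 0.3290)))   # True:需要调节
```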
步骤905,处理组件确定各颜色的光的权重,并根据各种颜色的光的权重生成控制信号。
以光源模组为上述图4b所示的结构为例,考虑各种颜色的光的权重后,图像光的第二光谱可用下述LED(λ)_target表示,其中,a、m、n分别为蓝光、绿光和红光的权重。
LED(λ)_target=a×LED_B(λ)×R_1(λ)×T_2(λ)+m×LED_G(λ)×T_1(λ)×T_2(λ)+n×LED_R(λ)×R_2(λ)。
也可以理解为,通过调节红光、绿光和蓝光的权重,来调节第二光谱LED(λ)_target,从而实现调节光源发射的光线,以实现减小基于图像光形成的图像的色偏。
在一种可能的实现方式中,可通过调节发射各种颜色的光对应的光源的电流的权重,以实现调节各种颜色的光的权重。也可以理解为,控制信号包括各个光源的电流的权重。例如,发光组件包括第一光源、第二光源和第三光源,处理组件可通过调节I_1:I_2:I_3实现调节各颜色的光的权重a:m:n。
进一步,可选的,处理组件可根据颜色处理算法调节电流的权重,颜色处理算法可以是预设的。例如,按a%增加输入某一个或某几个光源的电流的权重,并按a%减小剩余光源的电流的权重。其中,a%可以为0.5%、0.2%或1%等。需要说明的是,各种颜色的光对应的光源的电流的权重之和为100%。
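下面给出一个按固定步长调整电流权重、并保持各权重之和为100%的最简单示意(步长大小与调整方向的选取属于颜色处理算法的设计范畴,此处仅示例一种朴素做法)。

```python
def adjust_weights(weights, increase_idx, step=0.005):
    """weights为各光源电流权重的列表(和为1);将指定光源的权重增加step,
    其余光源按原有比例分摊相应的减少量,保持权重之和不变(仅作示意)。"""
    new = list(weights)
    new[increase_idx] += step
    others = [i for i in range(len(new)) if i != increase_idx]
    total_others = sum(weights[i] for i in others)
    for i in others:
        new[i] -= step * weights[i] / total_others
    return new

print(adjust_weights([0.3, 0.5, 0.2], increase_idx=0, step=0.005))
```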
应理解,输入光源模组的各个光源有一个初始电流权重,该初始电流权重可以是预先存储的,也可以是在初始化过程中确定的,关于初始化的过程可参见下述图10的介绍。
以输入光源模组的电流是PWM电流、且光源模组包括R光源、G光源和B光源为例。请参阅图9b,可通过调节输入R光源、G光源和B光源的PWM的占空比,具体的,可改变相邻两个脉冲之间的间隔(即调节T_1、T_2、T_3的大小);再比如,可以通过改变输入R光源、G光源和B光源的电流值(I)的大小。进而可实现调节输入R光源、G光源和B光源的电流的权重。
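将电流权重落实到PWM驱动信号时,一种可能的做法是按权重缩放各路PWM的占空比,如下Python示意;其中基准占空比为假设参数,实际取值取决于所需的总亮度。

```python
def weights_to_duty(weights, base_duty=0.6):
    """一种示意性的做法:权重最大的光源使用基准占空比base_duty,
    其余光源的占空比按权重比例缩小。"""
    w_max = max(weights)
    return [base_duty * w / w_max for w in weights]

print(weights_to_duty([0.3, 0.5, 0.2]))   # 例如输出 [0.36, 0.6, 0.24]
```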
步骤906,处理组件控制发光组件按初始化的各种颜色的光的权重发射对应颜色的光。
通过上述步骤901至步骤906可以看出,通过第一光谱与第二光谱的关系,以及传感模组检测到的第一偏振光的第一光谱,可修正第二光谱对应的图像光形成的图像的色偏,从而有助于减小光学显示装置显示的图像的色偏。也就是说,复用分光模组分出的第一偏振光,将传感模组检测到的色彩信息与预设的目标色坐标进行对比,以实现色点调节,从而可实现颜色校准。
请参阅图10,为本申请提供的一种调节图像的色彩初始化过程的方法流程示意图。该方法可以由上述光源模组中的处理组件执行。为了便于方案的说明,下面以光源模组包括第一光源、第二光源和第三光源为例说明。
该方法包括以下步骤:
步骤1001,处理组件获取预设的目标色坐标。
在一种可能的实现方式中,可以是检测到用户选定的某一组预设的色坐标,该选定的预设色坐标即为预设的目标色坐标。
步骤1002,处理组件根据预设的色坐标与电流权重的对应关系、以及获取的预设的目标色坐标,生成初始信号。
其中,初始信号包括输入各个光源的电流的权重。
在一种可能的实现方式中,预设的色坐标与电流权重的对应关系可用下述表2表示。根据预设的目标色坐标和表2,可确定出预设的目标色坐标对应的电流权重。例如,预设的目标色坐标为A(x_target1,y_target1),则可确定对应的电流权重比为I_11:I_12:I_13;初始信号包括的电流权重为I_11:I_12:I_13。查表的过程可参见下文表2之后给出的代码示意。
表2 预设的目标色坐标、电流的权重、亮度三者之间的对应关系
色坐标    电流权重    亮度
A(x_target1,y_target1)    I_11:I_12:I_13    B_1
B(x_target2,y_target2)    I_21:I_22:I_23    B_2
……
需要说明的是,通过表的形式表示预设的色坐标、电流权重以及亮度的对应关系仅是一种示例,还可以通过其它可能的形式,本申请对此不作限定。
还需要说明的是,上述表1和表2可以是两个独立的表,也可以是合并到一起的,本申请对此不作限定。此外,表1和表2可以存储于存储器或寄存器中,处理组件可通过调用该存储器或寄存器来获得表1和表2中的数据。关于存储器或寄存器的可参见下述相关介绍,此处不再赘述。
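初始化时根据预设的目标色坐标查表得到初始电流权重的过程,可用如下Python字典查表示意;其中各工作模式对应的色坐标与电流权重均为占位的假设数值,仅用于说明查表方式。

```python
# 假设的"工作模式 -> 目标色坐标/电流权重"对应关系(占位数值,对应上文表1、表2的形式)
PRESET_TABLE = {
    "白天模式": {"target_xy": (0.3127, 0.3290), "current_weights": (0.32, 0.45, 0.23)},
    "夜间模式": {"target_xy": (0.4500, 0.4100), "current_weights": (0.45, 0.40, 0.15)},
}

def initial_signal(mode):
    """步骤1001~1002:按选定的工作模式取出预设的目标色坐标及对应的初始电流权重。"""
    entry = PRESET_TABLE[mode]
    return entry["target_xy"], entry["current_weights"]

target_xy, init_weights = initial_signal("夜间模式")
```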
步骤1003,处理组件向发光组件发送初始信号。相应的,发光组件可接收来自处理组件的初始信号。
其中,发光组件以包括第一光源、第二光源和第三光源为例。
步骤1004,发光组件可根据接收到的初始信号发射对应颜色的光。
需要说明的是,上述图9a和图10的过程也适用于亮度的调节,具体可将上述电流权重用亮度替换,此处不再详细赘述。
情形二,处理组件属于传感模组。
也可以理解为,上述传感模组还可包括处理组件。换言之,传感模组可包括探测组件和处理组件。
请参阅图11,为本申请提供的另一种光学显示装置的结构示意图。该光学显示装置可包括光源模组1101、分光模组1102、调制模组1103和传感模组1104。其中,传感模组1104包括处理组件11041和探测组件11042。探测组件11042可用于检测第一偏振光的色彩信息,并向处理组件11041发送色彩信息。处理组件11041用于根据色彩信息生成控制信号,并向光源模组1101发送控制信号。相应的,光源模组1101用于根据控制信号调节至少两种颜色的光中各颜色的电流的权重。关于光源模组1101、分光模组1102、调制模组1103、传感模组1104更详细的介绍可参见前述描述,此处不再赘述。
关于传感模组中的处理组件生成控制信号的过程可参见前述图9a的步骤902至步骤905的介绍,此处不再重复赘述。
情形三,处理组件是独立于传感模组和光源模组的,此处,处理组件也可称为处理模组。
也可以理解为,光学显示装置还可包括处理模组。
请参阅图12a,为本申请提供的又一种光学显示装置的结构示意图。该光学显示装置1200可包括光源模组1201、分光模组1202、调制模组1203、传感模组1204、光学镜头1205和处理模组1206。传感模组1204用于获取第一偏振光的色彩信息,并向处理模组1206发送色彩信息;处理模组1206用于根据色彩信息生成控制信号,并向光源模组1201发送控制信号;光源模组1201还用于根据控制信号,调节至少两种颜色的光的权重。
关于光源模组1201、分光模组1202、调制模组1203、传感模组1204、光学镜头1205更详细的介绍可参见前述描述,此处不再赘述。关于处理模组生成控制信号的过程可参见前述图9a的步骤902至步骤905的介绍,此处不再重复赘述。
需要说明的是,上述图9a和图10的过程还可能是其它可能具有处理功能的模组或结构执行的,例如ODP的图像处理控制板、或车载控制器,车载控制器例如可以是独立的控制器、或交通工具中的域控制器、或交通工具中的电子控制单元(electronic control unit,ECU)等,本申请对此不作限定。
基于上述光学显示装置,在不增加光学***复杂度的情况下,通过增加传感模组,可实现当环境(如温度)变化时,反馈调节光源模组中个发光组件(如光源)的电流的权重,从而可改变合成光线的各颜色的光的权重,可保证光学显示装置显示在一个恒定的颜色范围,从而有助于减小光学显示装置显示的图像的色偏。
图12b为本申请提供的一种光学显示装置的电路示意图。该光学显示装置中的电路主要包括主处理器(host CPU)1701,外部存储器接口1702,内部存储器1703,音频模块1704,视频模块1705,电源模块1706,无线通信模块1707,I/O接口1708、视频接口1709、显示电路1710和调制器1711等。其中,主处理器1701与其周边的元件,例如外部存储器接口1702,内部存储器1703,音频模块1704,视频模块1705,电源模块1706,无线通信模块1707,I/O接口1708、视频接口1709、显示电路1710可以通过总线连接。主处理器1701可以称为前端处理器。
另外,本申请实施例示意的电路图并不构成对光学显示装置的具体限定。在本申请另一些实施例中,光学显示装置可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
其中,主处理器1701包括一个或多个处理单元,例如:主处理器1701可以包括应用处理器(Application Processor,AP),调制解调处理器,图形处理器(Graphics Processing Unit,GPU),图像信号处理器(Image Signal Processor,ISP),控制器,视频编解码器,数字信号处理器(Digital Signal Processor,DSP),基带处理器,和/或神经网络处理器(Neural-Network Processing Unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
主处理器1701中还可以设置存储器,用于存储指令和数据。在一些实施例中,主处理器1701中的存储器为高速缓冲存储器。该存储器可以保存主处理器1701刚用过或循环使用的指令或数据。如果主处理器1701需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了主处理器1701的等待时间,因而提高了光学显示装置的效率。其中,主处理器1701可以执行存储的指令,执行上述调节色彩的方法。
在一些实施例中,光学显示装置还可以包括多个连接到主处理器1701的输入输出(Input/Output,I/O)接口1708。接口1708可以包括集成电路(Inter-Integrated Circuit,I2C)接口,集成电路内置音频(Inter-Integrated Circuit Sound,I2S)接口,脉冲编码调制(Pulse Code Modulation,PCM)接口,通用异步收发传输器(Universal Asynchronous Receiver/Transmitter,UART)接口,移动产业处理器接口(Mobile Industry Processor Interface,MIPI),通用输入输出(General-Purpose Input/Output,GPIO)接口,用户标识模块(Subscriber Identity Module,SIM)接口,和/或通用串行总线(Universal Serial Bus,USB)接口等。上述I/O接口1708可以连接鼠标、触摸板、键盘、摄像头、扬声器/喇叭、麦克风等设备,也可以连接光学显示装置上的物理按键(例如音量键、亮度调节键、开关机键等)。
外部存储器接口1702可以用于连接外部存储卡,例如Micro SD卡,实现扩展光学显示装置的存储能力。外部存储卡通过外部存储器接口1702与主处理器1701通信,实现数据存储功能。
内部存储器1703可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。内部存储器1703可以包括存储程序区和存储数据区。其中,存储程序区可存储操作***,至少一个功能所需的应用程序(比如通话功能,时间设置功能等)等。存储数据区可存储光学显示装置使用过程中所创建的数据(比如电话簿,世界时间等)等。此外,内部存储器1703可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(Universal Flash Storage,UFS)等。主处理器1701通过运行存储在内部存储器1703的指令,和/或存储在设置于主处理器1701中的存储器的指令,执行光学显示装置的各种功能应用以及数据处理。
光学显示装置可以通过音频模块1704以及应用处理器等实现音频功能。例如音乐播放,通话等。
音频模块1704用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块1704还可以用于对音频信号编码和解码,例如进行放音或录音。在一些实施例中,音频模块1704可以设置于主处理器1701中,或将音频模块1704的部分功能模块设置于主处理器1701中。
视频接口1709可以接收外部输入的音视频信号,其具体可以为高清晰多媒体接口(High Definition Multimedia Interface,HDMI),数字视频接口(Digital Visual Interface,DVI),视频图形阵列(Video Graphics Array,VGA),显示端口(Display port,DP)等,视频接口1709还可以向外输出视频。当光学显示装置作为抬头显示使用时,视频接口1709可以接收周边设备输入的速度信号、电量信号,还可以接收外部输入的AR视频信号。当光学显示装置作为投影仪使用时,视频接口1709可以接收外部电脑或终端设备输入的视频信号。
视频模块1705可以对视频接口1709输入的视频进行解码,例如进行H.264解码。视频模块还可以对光学显示装置采集到的视频进行编码,例如对外接的摄像头采集到的视频进行H.264编码。此外,主处理器1701也可以对视频接口1709输入的视频进行解码,然后将解码后的图像信号输出到显示电路1710。
显示电路1710和调制器1711用于显示对应的图像。在本实施例中,视频接口1709接收外部输入的视频源信号,视频模块1705进行解码和/或数字化处理后输出一路或多路图像信号至显示电路1710,显示电路1710根据输入的图像信号驱动调制器1711将入射的偏振光进行成像,进而输出至少两路图像光。此外,主处理器1701也可以向显示电路1710输出一路或多路图像信号。
在本实施例中,显示电路1710以及调制器1711属于调制单元中的电子元件,显示电路1710可以称为驱动电路。
电源模块1706用于根据输入的电力(例如直流电)为主处理器1701和光源1712提供电源,电源模块1706中可以包括可充电电池,可充电电池可以为主处理器1701和光源1712提供电源。光源1712发出的光可以传输到调制器1711进行成像,从而形成图像光信号。
无线通信模块1707可以使得光学显示装置与外界进行无线通信,其可以提供无线局域网(Wireless Local Area Networks,WLAN)(如无线保真(Wireless Fidelity,Wi-Fi)网络),蓝牙(Bluetooth,BT),全球导航卫星***(Global Navigation Satellite System,GNSS),调频(Frequency Modulation,FM),近距离无线通信技术(Near Field Communication,NFC),红外技术(Infrared,IR)等无线通信的解决方案。无线通信模块1707可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块1707经由天线接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到主处理器1701。无线通信模块1707还可以从主处理器1701接收待发送的信号,对其进行调频,放大,经天线转为电磁波辐射出去。
另外,视频模块1705进行解码的视频数据除了通过视频接口1709输入之外,还可以通过无线通信模块1707以无线的方式接收或从外部存储器中读取,例如光学显示装置可以通过车内的无线局域网从终端设备或车载娱乐***接收视频数据,光学显示装置还可以读取外部存储器中存储的音视频数据。
基于上述描述的光学显示装置的结构和功能原理,本申请还提供一种显示***。该显示***包括上述任一实施例所示的光学显示装置以及位于空间区域的空间光放大模组。其中,空间光放大模组用于将来自光学显示装置的图像光对应的图像进行放大。示例性地,该显示***可以包括但不限于投影仪、HUD***、或桌面显示器等。
在一种可能的实现方式中,空间光放大模组包括至少一个曲面反射镜、至少一个柱面镜中的任一项或任多项的组合。其中,柱面镜可以在一个维度上有曲率,从而可以实现一维整形。也可以理解为,在一个维度上对光线进行发散或会聚,在另一个维度上对光线进行反射。柱面镜例如可以是平凸柱面镜、或平凹柱面镜等。
基于上述描述的显示***的结构和功能原理,本申请还可以提供一种交通工具。如图13a所示,为本申请提供的一种交通工具的部分结构示意图。该交通工具可以包括上述任一实施例中的显示***和风挡,其中风挡用于将来自显示***的图像光进行反射成像。
示例性地,交通工具为车辆,显示***为HUD***,风挡可用于将来自显示***的图像光反射至车辆的眼动范围内(可参见图2)。
当上述任一实施例中的光学显示装置应用于交通工具时,第二光谱可用下述公式12表示。
Display(λ)=LED(λ)×Rs(λ)×LCoS(λ)×Tp(λ)×A(λ)   公式12
其中,A(λ)表示从光学镜头至人眼看到显示图像中间经过的各个光学元件的光谱的总和。中间经过的光学元件包括但不限于光学镜头、空间光放大模组、以及风挡等。
在一种可能的实现方式中,眼动范围也可称为眼盒(eyebox),驾驶员的眼睛通常需处于眼动范围内,可参见图2。应理解,如果眼睛与眼盒的中心对齐,则可以看到完整且清晰的虚像。当眼睛向左右或上下移动时,在每个方向上的某个点处,看到的虚像可能会呈现扭曲、显色错误,甚至不显示等问题。考虑到不同驾驶员的身高差异,一般眼动范围的大小是130mm×50mm,即在横向上有约130mm、纵向上有约50mm的移动范围。
为了防止风挡被撞击之后整块破碎,通常风挡包括两层玻璃及夹在两层玻璃中间的一层聚乙烯醇缩丁醛(polyvinyl butyral,PVB)材料,PVB材料的折射率与玻璃的折射率较接近,为了方案的说明,可将风挡简化为有一定厚度(一般为4~5mm)的平面玻璃。在一种可能的实现方式中,风挡包括楔型风挡或者平面风挡。
应理解,图13a所示的交通工具结构仅是一个示例。交通工具还可以包括其他器件,例如方向盘、存储器和无线通信装置等。
请参见图13b,为本申请提供的一种交通工具的一种可能的功能框架示意图。该示例中以显示***为抬头显示***为例介绍。交通工具的功能框架中可包括各种子***,例如图示中的传感器***12、控制***14、一个或多个***设备16(图示以一个为例示出)、电源18、计算机***20和抬头显示***22。可选地,交通工具还可包括其他功能***,例如为交通工具提供动力的引擎***等等,本申请这里不作限定。
其中,传感器***12可包括若干检测装置,这些检测装置能感受到被测量的信息,并将感受到的信息按照一定规律将其转换为电信号或者其他所需形式的信息输出。如图示出,这些检测装置可包括全球定位***(global positioning system,GPS)、车速传感器、惯性测量单元(inertial measurement unit,IMU)、雷达单元、激光测距仪、摄像装置、轮速传感器、转向传感器、档位传感器、或者其他用于自动检测的元件等等,本申请并不作限定。
控制***14可包括若干元件,例如图示出的转向单元、制动单元、照明***、自动驾驶***、地图导航***、网络对时***和障碍规避***。可选地,控制***14还可包括诸如用于控制车辆行驶速度的油门控制器及发动机控制器等元件,本申请不作限定。
***设备16可包括若干元件,例如图示中的通信***、触摸屏、用户接口、麦克风以及扬声器等等。其中,通信***用于实现交通工具和除交通工具之外的其他设备之间的网络通信。在实际应用中,通信***可采用无线通信技术或有线通信技术实现交通工具和其他设备之间的网络通信。该有线通信技术可以是指车辆和其他设备之间通过网线或光纤等方式通信。
电源18代表为车辆提供电力或能源的***,其可包括但不限于再充电的锂电池或铅酸电池等。在实际应用中,电源中的一个或多个电池组件用于提供车辆启动的电能或能量,电源的种类和材料本申请并不限定。
交通工具的若干功能均由计算机***20控制实现。计算机***20可包括一个或多个处理器2001(图示以一个处理器为例示出)和存储器2002(也可称为存储装置)。在实际应用中,该存储器2002也在计算机***20内部,也可在计算机***20外部,例如作为 交通工具中的缓存等,本申请不作限定。
其中,处理器2001可包括一个或多个通用处理器,例如图形处理器(graphics processing unit,GPU)。处理器2001可用于运行存储器2002中存储的相关程序或程序对应的指令,以实现车辆的相应功能。
存储器2002可以包括易失性存储器(volatile memory),例如RAM;存储器也可以包括非易失性存储器(non-volatile memory),例如ROM、快闪存储器(flash memory)、HDD或固态硬盘SSD;存储器2002还可以包括上述种类的存储器的组合。存储器2002可用于存储一组程序代码或程序代码对应的指令,以便于处理器2001调用存储器2002中存储的程序代码或指令以实现车辆的相应功能。该功能包括但不限于图13b所示的车辆功能框架示意图中的部分功能或全部功能。本申请中,存储器2002中可存储一组用于车辆控制的程序代码,处理器2001调用该程序代码可控制车辆安全行驶,关于如何实现车辆安全行驶具体在本申请下文详述。
可选地,存储器2002除了存储程序代码或指令之外,还可存储诸如道路地图、驾驶线路、传感器数据等信息。计算机***20可以结合车辆功能框架示意图中的其他元件,例如传感器***中的传感器、GPS等,实现车辆的相关功能。例如,计算机***20可基于传感器***12的数据输入控制交通工具的行驶方向或行驶速度等,本申请不作限定。
抬头显示***22可包括若干元件,例如图示出的前挡玻璃,控制器和抬头显示器。控制器222用于根据用户指令生成图像(例如生成包含车速、电量/油量等车辆状态的图像以及增强现实AR内容的图像),并将该图像发送至抬头显示器进行显示;抬头显示器可以包括图像生成单元、反射镜组合,前挡玻璃用于配合抬头显示器以实现抬头显示***的光路,以使在驾驶员前方呈现目标图像。需要说明的是,抬头显示***中的部分元件的功能也可以由车辆的其它子***来实现,例如,控制器也可以为控制***中的元件。
其中,本申请图13b示出包括四个子***,传感器***12、控制***14、计算机***20和抬头显示***22仅为示例,并不构成限定。在实际应用中,交通工具可根据不同功能对车辆中的若干元件进行组合,从而得到相应不同功能的子***。在实际应用中,交通工具可包括更多或更少的***或元件,本申请不作限定。
示例性地,交通工具可以为智能车、电动车、数字汽车、轿车、卡车、摩托车、公共汽车、船、飞机、直升飞机、割草机、娱乐车、游乐场车辆、施工设备、电车、高尔夫球车、火车、和手推车等,本申请对此不作限定。
基于上述内容和相同的构思,本申请提供一种色彩调节方法,请参阅图14的介绍。该色彩调节方法可应用于上述图3至图12a任一实施例所示的光学显示装置。也可以理解为,可以基于上述图3至图12a任一实施例所示的光学显示装置来实现色彩调节方法。或者,该色彩调节也可以应用于上述任一实施例所示的显示***,或者也可以应用于上述任一实施例所示的交通工具。
该色彩调节方法可由控制装置执行,该控制装置可以属于光学显示装置(例如为光学显示装置中的主处理器),或者也可以是独立于光学显示装置的控制装置,例如芯片或芯片***等。当该控制装置属于交通工具时,该控制装置可以是交通工具中的域处理器,或者也可以是交通工具中的电子控制单元(electronic control unit,ECU)等,或者也可以是光源模组中的处理组件,或者也可以是传感模组中的处理组件等。
该色彩调节方法包括以下步骤:
步骤1401,控制光源模组发射光线。
其中,光线由至少两种颜色的光合成,在一种可能的实现方式中,可通过控制输入光源模组的各种颜色的光对应的光源的电流的权重,来控制各种颜色的光的权重,具体过程可参见前述光源模组的介绍,此处不再赘述。
进一步,来自光源模组的光线可通过分光模组分为第一偏振光和第二偏振光,具体可参见前述分光模组的介绍,此处不再赘述。
步骤1402,控制调制模组对第二偏振光进行调制,得到携带图像信息的图像光。
该步骤1402可参见前述调制模组的介绍,此处不再赘述。
步骤1403,控制传感模组获取第一偏振光的色彩信息。
该步骤1403可参见前述传感模组的介绍,此处不再赘述。
步骤1404,根据色彩信息,控制光源模组调节至少两种颜色的光的权重。
该步骤1404可参见上述图9a和图10的介绍,此处不再重复赘述。
在一种可能的实现方式中,可以通过向各个模组发送对应的控制信号,以实现控制各个模组。
通过上述步骤1401至步骤1404可以看出,可将传感模组检测到的第一偏振光作为对光源模组发射的光线的颜色进行标定的标定光,进一步,基于传感模组根据第一偏振光检测到的色彩信息,调节合成光线的至少两种颜色的光的权重,从而实现对图像光的色彩的校正。
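把步骤1401至步骤1404串起来,整个色彩调节过程可抽象为如下闭环示意;其中各函数均以参数形式传入(可分别代入前文各示例中的实现),函数名与调用方式均为示意性的假设,并非本申请的具体实现。

```python
def color_adjust_loop(read_spectrum, to_display, to_xy, need_adjust,
                      get_weights, adjust, set_weights, target_xy):
    """闭环示意:读取第一路光的光谱 -> 推算图像光的色坐标 ->
    与预设目标色坐标比较 -> 必要时调节至少两种颜色的光的权重。"""
    spectrum = read_spectrum()                  # 步骤1403:获取第一路光的色彩信息
    real_xy = to_xy(to_display(spectrum))       # 对应公式7、公式10、公式11
    if need_adjust(real_xy, target_xy):         # 对应步骤904的判断
        set_weights(adjust(get_weights()))      # 步骤1404:调节各颜色光的权重
```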
可以理解的是,为了实现上述方法实施例中功能,控制装置包括了执行各个功能相应的硬件结构和/或软件模块。本领域技术人员应该很容易意识到,结合本申请中所公开的实施例描述的各示例的模块及方法步骤,本申请能够以硬件或硬件和计算机软件相结合的形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用场景和设计约束条件。
基于上述内容和相同构思,图15和图16为本申请的提供的可能的控制装置的结构示意图。这些控制装置可以用于实现上述方法实施例中如图9a、图10或图14中的功能,因此也能实现上述方法实施例所具备的有益效果。
如图15所示,该控制装置1500包括处理模块1501,进一步,还可包括收发模块1502。控制装置1500用于实现上述图14中所示的方法实施例中的功能。
当控制装置1500用于实现图14所示的方法实施例的功能时:处理模块1501用于控制光源模组发射光线、控制调制模组对第二偏振光进行调制得到携带图像信息的图像光、控制传感模组获取第一偏振光的色彩信息、以及根据色彩信息控制光源模组调节至少两种颜色的光的权重。进一步,可选的,收发模块1502用于向光源模组、调制模组、传感模组等发送控制信号。
应理解,本申请实施例中的处理模块1501可以由处理器或处理器相关电路组件实现,收发模块1502可以由接口电路等相关电路组件实现。
基于上述内容和相同构思,如图16所示,本申请还提供一种控制装置1600。该控制装置1600可包括处理器1601,进一步,可选的,还可包括接口电路1602。处理器1601和接口电路1602之间相互耦合。可以理解的是,接口电路1602可以为输入输出接口。可选地,控制装置1600还可包括存储器1603,用于存储处理器1601执行的计算机程序或指 令等。
当控制装置1600用于实现图14所示的方法时,处理器1601用于执行上述处理模块1501的功能,接口电路1602用于执行上述收发模块1502的功能。
基于上述内容和相同构思,本申请提供一种芯片。该芯片可包括处理器和接口电路,进一步,可选的,该芯片还可包括存储器,处理器用于执行存储器中存储的计算机程序或指令,使得芯片执行上述图14中任意可能的实现方式中的方法。
本申请的实施例中的方法步骤可以通过硬件的方式来实现,也可以由处理器执行软件指令的方式来实现。软件指令可以由相应的软件模块组成,软件模块可以被存放于随机存取存储器(random access memory,RAM)、闪存、只读存储器(read-only memory,ROM)、可编程只读存储器(programmable ROM,PROM)、可擦除可编程只读存储器(erasable PROM,EPROM)、电可擦除可编程只读存储器(electrically EPROM,EEPROM)、寄存器、硬盘、移动硬盘、CD-ROM或者本领域熟知的任何其它形式的存储介质中。一种示例性的存储介质耦合至处理器,从而使处理器能够从该存储介质读取信息,且可向该存储介质写入信息。当然,存储介质也可以是处理器的组成部分。处理器和存储介质可以位于ASIC中。另外,该ASIC可以位于光学显示装置、显示***或交通工具中。当然,处理器和存储介质也可以作为分立模组存在于光学显示装置、显示***或交通工具中。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。计算机程序产品包括一个或多个计算机程序或指令。在计算机上加载和执行计算机程序或指令时,全部或部分地执行本申请实施例的流程或功能。计算机可以是通用计算机、专用计算机、计算机网络、或者其它可编程装置。计算机程序或指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,计算机程序或指令可以从一个网站站点、计算机、服务器或数据中心通过有线或无线方式向另一个网站站点、计算机、服务器或数据中心进行传输。计算机可读存储介质可以是计算机能够存取的任何可用介质或者是集成一个或多个可用介质的服务器、数据中心等数据存储设备。可用介质可以是磁性介质,例如,软盘、硬盘、磁带;也可以是光介质,例如,数字视频光盘(digital video disc,DVD);还可以是半导体介质,例如,固态硬盘(solid state drive,SSD)。
在本申请的各个实施例中,如果没有特殊说明以及逻辑冲突,不同的实施例之间的术语和/或描述具有一致性、且可以相互引用,不同的实施例中的技术特征根据其内在的逻辑关系可以组合形成新的实施例。
本申请中,"均匀"不是指绝对的均匀,可以允许有一定工程上的误差。"垂直"不是指绝对的垂直,可以允许有一定工程上的误差。"至少一个"是指一个或者多个,"多个"是指两个或两个以上。"和/或",描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B的情况,其中A,B可以是单数或者复数。"以下至少一项(个)"或其类似表达,是指这些项中的任意组合,包括单项(个)或复数项(个)的任意组合。例如,a,b或c中的至少一项(个),可以表示:a,b,c,"a和b","a和c","b和c",或"a和b和c",其中a,b,c可以是单个,也可以是多个。在本申请的文字描述中,字符"/",一般表示前后关联对象是一种"或"的关系。在本申请的公式中,字符"/",表示前后关联对象是一种"相除"的关系。本申请中,符号"(a,b)"表示开区间,范围为大于a且小于b;"[a,b]"表示闭区间,范围为大于或等于a且小于或等于b;"(a,b]"表示半开半闭区间,范围为大于a且小于或等于b;"[a,b)"表示半开半闭区间,范围为大于或等于a且小于b。另外,在本申请中,"示例性地"一词用于表示作例子、例证或说明。本申请中被描述为"示例"的任何实施例或设计方案不应被解释为比其它实施例或设计方案更优选或更具优势。或者可理解为,使用示例的一词旨在以具体方式呈现概念,并不对本申请构成限定。
可以理解的是,在本申请中涉及的各种数字编号仅为描述方便进行的区分,并不用来限制本申请的实施例的范围。上述各过程的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定。术语"第一"、"第二"等类似表述,是用于区别类似的对象,而不必用于描述特定的顺序或先后次序。此外,术语"包括"和"具有"以及它们的任何变形,意图在于覆盖不排他的包含,例如,包含了一系列步骤或单元。方法、系统、产品或设备不必限于清楚地列出的那些步骤或单元,而是可包括没有清楚地列出的或对于这些过程、方法、产品或设备固有的其它步骤或单元。

Claims (13)

  1. 一种光学显示装置,其特征在于,包括光源模组、分光模组、调制模组和传感模组;
    所述光源模组,用于发射光线,所述光线由至少两种颜色的光合成;
    所述分光模组,用于将来自所述光源模组的所述光线分为第一路光和第二路光,并将所述第一路光传播至所述传感模组,将所述第二路光传播至所述调制模组;
    所述调制模组,用于对所述第二路光进行调制,得到携带图像信息的图像光;
    所述传感模组,用于获取所述第一路光的色彩信息,所述色彩信息用于所述光源模组调节所述至少两种颜色的光的权重。
  2. 如权利要求1所述的装置,其特征在于,所述光源模组包括处理组件和发光组件;
    所述处理组件,用于接收来自所述传感模组的所述色彩信息,根据所述色彩信息生成控制信号,并向所述发光组件发送所述控制信号,所述控制信号用于调节所述至少两种颜色的光的权重;
    所述发光组件,用于根据所述控制信号调节所述至少两种颜色的光的权重。
  3. 如权利要求2所述的装置,其特征在于,所述处理组件具体用于:
    根据所述色彩信息确定所述图像光的色坐标;
    若所述图像光的色坐标与预设的目标色坐标的差值大于阈值,生成所述控制信号。
  4. 如权利要求1~3任一项所述的装置,其特征在于,所述分光模组包括偏振分束器;
    所述第一路光为P偏振光,所述第二路光为S偏振光;或者,
    所述第一路光为S偏振光,所述第二路光为P偏振光。
  5. 如权利要求1~3任一项所述的装置,其特征在于,所述分光模组包括透反件;
    所述第一路光为从所述透反件透射的光,所述第二路光为从所述透反件反射的光;或者,
    所述第一路光为从所述透反件反射的光,所述第二路光为从所述透反件透射的光。
  6. 如权利要求1~5任一项所述的装置,其特征在于,所述发光组件包括用于发射红光的第一光源、用于发射蓝光的第二光源和用于发射绿光的第三光源。
  7. 如权利要求6所述的装置,其特征在于,所述光源模组还包括第一二向镜和第二二向镜;
    所述第一二向镜,用于反射来自所述第二光源的蓝光,并透射来自所述第三光源的绿光;
    所述第二二向镜,用于反射来自所述第一光源的红光,并透射所述第一二向镜透射的绿光,并透射所述第一二向镜反射的蓝光。
  8. 如权利要求1~7任一项所述的装置,其特征在于,所述光学显示装置还包括光学镜头;
    所述光学镜头,用于将所述调制模组输出的图像光投射至空间区域。
  9. 一种显示***,其特征在于,包括如权利要求1~8任一项所述的光学显示装置、以及位于所述空间区域的空间光放大模组;
    所述空间光放大模组,用于将来自所述光学显示装置的所述图像光对应的图像进行放大。
  10. 如权利要求9所述的***,其特征在于,所述空间光放大模组包括以下任一项或任 多项组合:
    至少一个曲面反射镜;
    至少一个柱面镜。
  11. 一种交通工具,其特征在于,包括如权利要求9或10所述的显示***以及风挡;
    所述风挡,用于将来自所述显示***的图像光进行反射成像。
  12. 一种色彩调节方法,其特征在于,应用于光学显示装置,所述光学显示装置包括光源模组、分光模组、调制模组和传感模组;
    所述方法包括:
    控制所述光源模组发射光线,所述光线由至少两种颜色的光合成,所述光线通过所述分光模组分为第一路光和第二路光;
    控制所述调制模组对所述第二路光进行调制,得到携带图像信息的图像光;
    控制所述传感模组获取所述第一路光的色彩信息;
    根据所述色彩信息,控制所述光源模组调节所述至少两种颜色的光的权重。
  13. 如权利要求12所述的方法,其特征在于,所述根据所述色彩信息,控制所述光源模组调节所述至少两种颜色的光的权重,包括:
    根据所述色彩信息确定所述图像光的色坐标;
    若所述图像光的色坐标与预设的目标色坐标的差值大于阈值,生成控制信号,所述控制信号用于控制调节所述至少两种颜色的光的权重;
    向所述光源模组发送所述控制信号;
    控制所述光源模组根据所述控制信号,调节所述至少两种颜色的光的权重。
PCT/CN2022/117928 2021-10-29 2022-09-08 一种光学显示装置、显示***、交通工具及色彩调节方法 WO2023071548A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP22885437.8A EP4403998A1 (en) 2021-10-29 2022-09-08 Optical display apparatus, display system, vehicle, and color adjustment method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111271734.XA CN116068759A (zh) 2021-10-29 2021-10-29 一种光学显示装置、显示***、交通工具及色彩调节方法
CN202111271734.X 2021-10-29

Publications (1)

Publication Number Publication Date
WO2023071548A1 true WO2023071548A1 (zh) 2023-05-04

Family

ID=86160179

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/117928 WO2023071548A1 (zh) 2021-10-29 2022-09-08 一种光学显示装置、显示***、交通工具及色彩调节方法

Country Status (3)

Country Link
EP (1) EP4403998A1 (zh)
CN (1) CN116068759A (zh)
WO (1) WO2023071548A1 (zh)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1916691A (zh) * 2005-08-15 2007-02-21 明基电通股份有限公司 投影装置及其光源亮度调整方法
JP2007079402A (ja) * 2005-09-16 2007-03-29 Nikon Corp プロジェクタ
CN101697036A (zh) * 2009-11-13 2010-04-21 深圳市博视创电子有限公司 汽车玻璃信息显示***及其成像的方法
CN102012614A (zh) * 2010-10-28 2011-04-13 鸿富锦精密工业(深圳)有限公司 具有自动调整投影亮度功能的投影装置及方法
CN102073203A (zh) * 2010-11-17 2011-05-25 鸿富锦精密工业(深圳)有限公司 具有自动调整投影亮度功能的投影装置及方法
CN102231042A (zh) * 2011-06-22 2011-11-02 贺银波 用于反射式液晶投影显示的光引擎***
CN103003736A (zh) * 2010-05-26 2013-03-27 约翰逊控股公司 显示器,尤其是车辆的平视显示器
CN105388693A (zh) * 2015-12-31 2016-03-09 中国华录集团有限公司 一种激光投影机色彩自动控制***
JP2017038204A (ja) * 2015-08-10 2017-02-16 セイコーエプソン株式会社 プロジェクターおよび制御方法
CN107621746A (zh) * 2016-07-15 2018-01-23 深圳市光峰光电技术有限公司 发光装置及相关投影***
CN108600713A (zh) * 2018-01-24 2018-09-28 苏州佳世达光电有限公司 一种动态侦测投影装置色坐标的方法、模组和投影装置
CN113238379A (zh) * 2021-05-19 2021-08-10 上海天马微电子有限公司 抬头显示***、抬头显示***的驱动方法及交通工具

Also Published As

Publication number Publication date
EP4403998A1 (en) 2024-07-24
CN116068759A (zh) 2023-05-05

Similar Documents

Publication Publication Date Title
WO2024017038A1 (zh) 一种图像生成装置、显示设备和交通工具
WO2024021852A1 (zh) 立体显示装置、立体显示***和交通工具
WO2024007749A1 (zh) 光学***、显示设备以及交通工具
WO2024021574A1 (zh) 立体投影***、投影***和交通工具
WO2023071548A1 (zh) 一种光学显示装置、显示***、交通工具及色彩调节方法
WO2023202195A1 (zh) 一种投影装置、交通工具
CN217360538U (zh) 一种投影***、显示设备和交通工具
WO2023103492A1 (zh) 一种图像生成装置、显示设备和交通工具
WO2023040669A1 (zh) 抬头显示设备和交通工具
WO2023138138A1 (zh) 一种显示装置和交通工具
WO2023040662A1 (zh) 图像生成装置、相关设备和图像投射方法
WO2023098228A1 (zh) 显示装置、电子设备以及交通工具
CN221446439U (zh) 一种显示模组、显示***、交通工具和车载***
WO2024098828A1 (zh) 投影***、投影方法和交通工具
WO2023087739A1 (zh) 投影装置、显示设备和交通工具
WO2024045704A1 (zh) 显示装置、显示设备及交通工具
WO2024021563A1 (zh) 一种显示装置和交通工具
CN115933185B (zh) 虚像显示装置、图像数据的生成方法、装置和相关设备
CN115542644B (zh) 投影装置、显示设备及交通工具
CN221303711U (zh) 一种显示装置、处理设备、显示***和交通工具
WO2023184276A1 (zh) 一种显示方法、显示***和终端设备
WO2023193210A1 (zh) 光学发射模组、光学显示装置、终端设备及图像显示方法
WO2023185293A1 (zh) 一种图像生成装置、显示设备和交通工具
WO2024125441A1 (zh) 显示装置和交通工具
CN115561906A (zh) 显示装置以及交通工具

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22885437

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2022885437

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2022885437

Country of ref document: EP

Effective date: 20240419

NENP Non-entry into the national phase

Ref country code: DE