CN115576110A - Display method and vehicle - Google Patents

Display method and vehicle

Info

Publication number
CN115576110A
CN115576110A
Authority
CN
China
Prior art keywords
image
brightness
display
controller
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211356582.8A
Other languages
Chinese (zh)
Inventor
张淑芳
袁敏敏
米德旺
徐腊梅
殷智慧
王国瑞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhu Automotive Prospective Technology Research Institute Co ltd
Chery Automobile Co Ltd
Original Assignee
Wuhu Automotive Prospective Technology Research Institute Co ltd
Chery Automobile Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.) 2022-11-01
Filing date 2022-11-01
Publication date 2023-01-06
Application filed by Wuhu Automotive Prospective Technology Research Institute Co ltd and Chery Automobile Co Ltd
Priority to CN202211356582.8A
Publication of CN115576110A
Legal status: Pending

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/10 Intensity circuits
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/0118 Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brillance control visibility
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00 Control of display operating conditions
    • G09G2320/06 Adjustment of display parameters
    • G09G2320/0626 Adjustment of display parameters for control of overall brightness

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Instrument Panels (AREA)

Abstract

The application discloses a display method and a vehicle, and belongs to the technical field of vehicles. The display method is applied to a display system comprising an image acquisition device, a controller and a head-up display, and comprises the following steps: the image acquisition device acquires a first image, wherein the first image is an image in front of the vehicle; the image acquisition device sends the first image to the controller; the controller detects the brightness of the first image and determines a corresponding target brightness based on the brightness of the first image; the controller generates a first instruction and sends the first instruction to the head-up display, wherein the first instruction comprises the target brightness; and the head-up display displays information with the target brightness based on the first instruction, wherein the target brightness matches the brightness of the first image. By the display method, the driving safety of the vehicle is improved.

Description

Display method and vehicle
Technical Field
The application relates to the technical field of vehicles, in particular to a display method and a vehicle.
Background
To ensure that drivers can drive more safely and stably, more and more vehicles are being equipped with head-up display systems. A head-up display system is a vehicle-mounted visual driving-assistance system that projects important information needed by the driver during driving onto the front windshield of the vehicle through an optical path, so that the displayed information is fused with the real scene around the vehicle. The light carrying this fused information is reflected into the driver's eyes, so that the driver sees a virtual image superimposed on the real scene. The virtual image may include navigation information, important instrument readings, and information from an Advanced Driver Assistance System (ADAS) such as lane departure and forward collision warnings. Because the driver's eyes only need to stay on the road ahead and the driver does not need to look down at the instrument panel, the running state of the vehicle can be grasped in time, and situations in which safe driving is compromised by looking down to check information can be avoided. When the information display function of a head-up display system is used, how to better fuse the displayed information with the real scene is one of the problems that urgently needs to be solved.
In the related art, to address the above problem, a light sensor is generally mounted on the vehicle body. The light sensor collects a brightness signal of the ambient light outside the vehicle and sends the signal to the controller. The controller determines, based on the ambient-light brightness signal, a brightness value for the displayed information that matches the ambient light and sends that value to the head-up display; the head-up display then outputs the displayed information at the matching brightness, so that the driver sees, through the front windshield of the vehicle, a virtual image that is better fused with the ambient light.
However, when the ambient brightness changes sharply, for example when the vehicle exits or enters a tunnel, exits or enters a garage, or passes through the shade of trees, the brightness of the ambient light collected by the light sensor can differ greatly from the brightness of the area where the virtual image formed by the head-up display actually appears. It may then happen that the virtual image display area seen by the human eye is relatively dark (or relatively bright) while the ambient light collected by the light sensor is relatively bright (or relatively dark), so the display information output by the head-up display based on the light sensor signal is too bright (or too dark). The driver's attention is then easily drawn to the over-bright display information (or the over-dark display information cannot be seen clearly), the driver is distracted, and the driver may fail to notice people or objects around the vehicle and fail to brake in time, which easily leads to safety accidents. That is, determining the brightness of the information displayed by the head-up display using a light sensor is less safe.
Disclosure of Invention
In view of this, the present application provides a display method and a vehicle, which can avoid situations in which the brightness of the displayed information differs greatly from the brightness of the environment in front of the vehicle, so that the brightness of the displayed information does not interfere with the driver's driving behavior, and the driving safety of the vehicle is improved.
In one aspect, an embodiment of the present application provides a display method, where the display method is applied to a display system, the display system includes an image acquisition device, a controller, and a head-up display, and the display method includes:
the image acquisition device acquires a first image, wherein the first image is an image in front of a vehicle;
the image acquisition device sends the first image to the controller;
the controller detects the brightness of the first image and determines corresponding target brightness based on the brightness of the first image;
the controller generates a first instruction and sends the first instruction to the head-up display, wherein the first instruction comprises the target brightness;
the head-up display displays information having the target brightness based on the first instruction, wherein the target brightness matches the brightness of the first image.
Optionally, the field of view corresponding to the first image covers a background range of a virtual image display area, where the virtual image display area is an area where light of the heads-up display is projected onto a front windshield of the vehicle.
Optionally, the detecting, by the controller, the brightness of the first image, and determining the corresponding target brightness based on the brightness of the first image includes:
the controller intercepts a region corresponding to the background range of the virtual image display region in the first image to obtain a second image;
the controller detects the brightness of the second image and determines the corresponding target brightness based on the brightness of the second image.
Optionally, the capturing, by the controller, a region corresponding to a background range of the virtual image display region in the first image, and obtaining a second image includes:
the controller acquires coordinate information of a corresponding area of a background range of the virtual image display area on the first image;
and the controller intercepts an area corresponding to the background range of the virtual image display area in the first image based on the coordinate information to obtain the second image.
Optionally, the obtaining, by the controller, coordinate information of a corresponding area of a background range of the virtual image display area on the first image includes:
the controller determines a background range of a virtual image display region at a target time based on operation information of the vehicle, wherein the operation information includes an operation track and an operation speed of the vehicle, and the target time is a time after the current time;
the controller acquires coordinate information of a corresponding area of a background range of the virtual image display area on the first image at the target moment.
Optionally, the detecting, by the controller, the brightness of the first image, and determining the corresponding target brightness based on the brightness of the first image includes:
the controller determines an area of the first image for detecting brightness;
the controller detects the total average brightness of the areas used for detecting the brightness, and determines the corresponding target brightness based on the total average brightness.
Optionally, the detecting, by the controller, the brightness of the first image, and determining the corresponding target brightness based on the brightness of the first image includes:
the controller divides the area of the first image for detecting the brightness into a plurality of sub-areas, detects sub-average brightness of each sub-area, and determines corresponding sub-target brightness based on each sub-average brightness;
the controller generates a first instruction and sends the first instruction to the head-up display, wherein the first instruction containing the target brightness comprises:
the controller generates a first instruction and sends the first instruction to the head-up display, wherein the first instruction comprises the brightness of the plurality of sub-targets;
the head-up display displaying information having the target brightness based on the first instruction includes:
the display area of the head-up display comprises a plurality of sub-display areas, and the head-up display displays information according to the corresponding sub-target brightness in each sub-display area based on the first instruction.
Optionally, the controller determining the corresponding target brightness based on the brightness of the first image comprises:
the controller determines a target brightness corresponding to the brightness of the first image based on a brightness matching table, wherein the brightness matching table includes a plurality of candidate image brightness ranges, and each candidate image brightness range corresponds to one information display brightness.
Optionally, the display method further includes:
the controller detects the color of the first image and determines a corresponding target color based on the color of the first image;
the controller generates a second instruction and sends the second instruction to the head-up display, wherein the second instruction comprises the target color;
the heads-up display displays information having the target color based on the second instruction, wherein the target color matches a color of the first image.
In another aspect, an embodiment of the present application further provides a vehicle. The vehicle includes a display system, the display system includes an image acquisition device, a controller and a head-up display, and the image acquisition device, the controller and the head-up display are respectively used to implement any one of the display methods described above.
The display method is applied to a display system, and the display system includes an image acquisition device, a controller and a head-up display. First, a first image in front of the vehicle is acquired by the image acquisition device and sent to the controller. Then, the controller detects the brightness of the first image, determines a corresponding target brightness based on the brightness of the first image, and sends a first instruction containing the target brightness to the head-up display. Finally, the head-up display displays information at the target brightness based on the first instruction. With this display method, the brightness of the environment in front of the vehicle can be obtained in time by acquiring an image in front of the vehicle and detecting its brightness. Because the brightness displayed by the head-up display can be adjusted in time according to the brightness of the image in front of the vehicle, the situation that the displayed brightness does not match the brightness of the environment in front of the vehicle can be avoided, the brightness of the displayed information does not interfere with the driver's driving behavior, and the driving safety of the vehicle is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can obtain other drawings based on them without creative effort.
Fig. 1 is a flowchart of a display method provided by an embodiment of the present application;
Fig. 2 is a flowchart of a display method provided by an embodiment of the present application;
Fig. 3 is a schematic diagram of a display method provided by an embodiment of the present application;
Fig. 4 is a schematic diagram of a first image and a second image in a display method provided by an embodiment of the present application.
Reference numerals:
1. a driver;
2. a ground surface;
3. a front windshield;
4. a vehicle body;
5. a first image;
6. a second image.
With the above figures, there are shown specific embodiments of the present application, which will be described in more detail below. These drawings and written description are not intended to limit the scope of the inventive concepts in any manner, but rather to illustrate the inventive concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without making any creative effort belong to the protection scope of the present application.
Unless defined otherwise, all technical terms used in the examples of the present application have the same meaning as commonly understood by one of ordinary skill in the art.
In order to make the technical solutions and advantages of the present application clearer, the following will describe the embodiments of the present application in further detail with reference to the accompanying drawings.
Fig. 1 is a flowchart of a display method provided in an embodiment of the present application, where the display method is applied to a display system, and the display system includes an image capture device, a controller, and a head-up display. Referring to fig. 1, the display method includes steps 101 to 105.
In step 101, an image capturing device captures a first image, wherein the first image is an image in front of a vehicle.
It should be noted that, since the first image is an image of the area in front of the vehicle, the first image can reflect the state of the environment in front of the vehicle; that is, parameters such as the brightness or color of the first image can be used to reflect information such as the brightness or color of the environment that the driver sees when looking ahead. In some embodiments, the image acquisition device is a device with a shooting function and can be installed inside or outside the cab. For example, the image acquisition device may be a camera for capturing images in front of the vehicle, such as an ADAS camera mounted on the vehicle or a 360-degree panoramic camera.
In some embodiments, the image acquisition device captures the first image in real time, so that the state of the environment in front of the vehicle can be reflected in real time through the first image.
In step 102, the image capture device sends the first image to the controller.
It should be noted that the image capturing device is electrically connected to the controller.
In step 103, the controller detects the brightness of the first image and determines a corresponding target brightness based on the brightness of the first image.
It should be noted that the brightness of the first image is used to represent the brightness of the ambient light in front of the vehicle, and the target brightness is the brightness at which the head-up display emits light when displaying information. Therefore, the display brightness of the head-up display can be determined in real time based on the brightness of the ambient light in front of the vehicle, avoiding dangerous driving caused by a mismatch between the information display brightness and the ambient brightness.
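As an informal illustration of how such brightness detection might be implemented, the sketch below averages the V channel of an HSV-converted camera frame; the function name and the choice of brightness measure are assumptions for illustration and are not specified by the patent.

```python
import cv2
import numpy as np

def detect_image_brightness(frame_bgr: np.ndarray) -> float:
    """Return the mean brightness of a camera frame on a 0-255 scale.

    Hypothetical helper: the patent only states that the controller
    "detects the brightness of the first image"; averaging the V channel
    of the HSV color space is one common way to do that.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    return float(np.mean(hsv[:, :, 2]))
```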
In step 104, the controller generates a first command and sends the first command to the head-up display, wherein the first command includes a target brightness.
It should be noted that the head-up display may be an augmented reality head-up display.
In some embodiments, the head-up display includes an image generation unit and an optical display assembly, and the controller sends the first instruction, which includes the target brightness, to the image generation unit. The image generation unit is configured to generate and display an image based on the target brightness and the information. The image generation unit may generate and display the image using TFT-LCD (Thin Film Transistor Liquid Crystal Display) technology, DLP (Digital Light Processing) technology, or MEMS (Micro-Electro-Mechanical System) projection technology. The optical display assembly includes a plurality of optical elements, such as flat mirrors and curved mirrors, in which the light follows the ordinary laws of light propagation. The optical display assembly is used to project the light of the image generation unit onto the front windshield of the vehicle, so that the driver can see, through the front windshield, the information corresponding to the image output by the image generation unit. It can be understood that the information includes text marks and/or graphic marks. A text mark may represent the vehicle speed, such as 50 m/s; a graphic mark may represent lane information, for example a left-turn arrow when the lane in which the vehicle is located is a left-turn lane. It should be noted that the above information types and their representations are only examples; other information types and representations are possible and are not listed here.
In step 105, the head-up display displays information having the target brightness based on the first instruction.
Wherein the target brightness matches the brightness of the first image. Since the target brightness is determined based on the brightness of the first image, which is used to characterize the brightness of the ambient light in front of the vehicle, the target brightness matches the brightness of the ambient light. Therefore, the brightness of the information displayed by the head-up display matches the brightness of the ambient light in front of the vehicle, improving how well the information blends into the environment.
In some embodiments, the image generation unit of the heads-up display displays information having the target brightness based on the first instruction.
With the display method provided by the present application, even when the vehicle is in a scene where the ambient brightness changes sharply, such as exiting a tunnel, entering a tunnel, exiting a garage or entering a garage, the display brightness of the head-up display can be adjusted in real time according to the brightness of the ambient light in front of the vehicle, so that the head-up display shows information at a brightness that matches the ambient light. This avoids disturbing the driver with information that is displayed too brightly or too dimly, allowing the driver to drive with full attention and improving driving safety.
Fig. 2 is a flowchart of a display method according to an embodiment of the present application. The display method is applied to a display system, and the display system comprises an image acquisition device, a controller and a head-up display. Referring to fig. 2, the display method includes steps 201 to 208.
In step 201, an image capturing device captures a first image, which is an image in front of a vehicle.
Step 201 corresponds to step 101 described above.
In some embodiments, the field of view corresponding to the first image covers the background range of a virtual image display area, where the virtual image display area is the area onto which the light of the head-up display is projected on the front windshield of the vehicle. Note that the virtual image display area here is the area formed on the front windshield by all the light rays when the entire display screen of the head-up display is lit. It can be understood that, if the display screen is rectangular, the virtual image display area is the rectangular area formed by projecting all the light of the display screen onto the front windshield. In addition, the field of view corresponding to the first image in the embodiments of the present application is determined based on the height and the angle at which the image is acquired.
The principle of this step will be explained with reference to Fig. 3, which is a schematic diagram of a display method provided by an embodiment of the present disclosure. Referring to Fig. 3, when the driver 1 looks ahead, the driver 1 can see, on the front windshield 3 of the vehicle, the information displayed by the head-up display. It can be understood that the image generated by the head-up display corresponds to light projected onto the glass, forming a virtual image. Following the laws of optics, the light projected onto the front windshield 3 is reflected into the human eye O, so that the eye O sees the information represented by the virtual image in the virtual image display area (i.e., the area between A1 and A2) on the front windshield 3. It should be noted that the actual position of the virtual image is the position shown by CD; typically, the horizontal distance between the center of the virtual image and the driver's eyes may be 7.5 m, and this distance can be adjusted as required. A3 and A4 shown in Fig. 3 are the intersection points of the rays forming the virtual image with the ground 2, while B3 and B4 are the intersection points of the acquisition field of view of the image acquisition device with the ground 2. The region bounded by A3, A4, A6 and A5 is the background range of the virtual image display area. The image acquisition device is arranged at the front of the vehicle body 4, and the region bounded by B1, B2, B4 and B3 in the figure is the field of view corresponding to the first image acquired by the image acquisition device. Saying that the field of view corresponding to the first image covers the background range of the virtual image display area means that the region formed by B1, B2, B4 and B3 covers the region formed by A3, A4, A6 and A5. Since the field of view corresponding to the first image covers the background range of the virtual image display area, the brightness of the ambient light within that background range can be determined based on the brightness of the first image captured over this field of view. It should be noted that Fig. 3 is only a schematic diagram; in an actual scene, the virtual image display area, its background range, and the field of view corresponding to the first image are all regions with a certain area or volume.
In step 202, the image capture device sends the first image to the controller.
Step 202 corresponds to step 102 described above.
In some embodiments, the image capture device sends the first image to an image controller corresponding to the image capture device. The communication between the image capturing device and the image controller is video communication.
In step 203, the controller detects the brightness of the first image and determines a corresponding target brightness based on the brightness of the first image.
Step 203 corresponds to step 103.
In some embodiments, the image controller detects the brightness of the first image and determines a corresponding target brightness based on the brightness of the first image.
In some embodiments, step 203 comprises: the controller determines the target brightness corresponding to the brightness of the first image based on a brightness matching table, where the brightness matching table includes a plurality of candidate image brightness ranges and each candidate image brightness range corresponds to one information display brightness. It should be noted that the data in the brightness matching table are determined by technicians through calibration and testing with different people so as to meet the optimal visual requirements of the human eye, making viewing the information as comfortable as possible; the data can also be adjusted appropriately according to actual needs. The candidate image brightness ranges correspond to different brightness ranges of the ambient light around the vehicle, and each information display brightness is the light brightness at which the head-up display should display information for the matching ambient-brightness range. After the controller detects the brightness of the first image, it determines the candidate image brightness range to which that brightness belongs and takes the information display brightness corresponding to that range as the target brightness. In this way, the information display brightness can be adjusted in time according to the ambient brightness represented by the brightness of the first image, improving how well the information blends into the environment. This avoids interfering with the driver because the information display brightness is not adjusted in time when the vehicle travels through road sections with large changes in ambient brightness, and improves both viewing comfort and driving safety.
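The following is a minimal sketch of such a table lookup. The brightness ranges and output levels are invented for illustration only; as noted above, the real values would come from calibration and testing.

```python
# Hypothetical brightness matching table: each candidate image brightness
# range (on a 0-255 scale) maps to one information display brightness
# (here nominally in cd/m^2). The numbers are illustrative placeholders.
BRIGHTNESS_MATCHING_TABLE = [
    ((0, 50), 400),       # night / tunnel
    ((50, 120), 1500),    # dusk, heavy shade
    ((120, 200), 6000),   # overcast daylight
    ((200, 256), 12000),  # bright sunlight
]

def lookup_target_brightness(image_brightness: float) -> int:
    """Return the information display brightness whose candidate image
    brightness range contains the detected image brightness."""
    for (low, high), display_brightness in BRIGHTNESS_MATCHING_TABLE:
        if low <= image_brightness < high:
            return display_brightness
    return BRIGHTNESS_MATCHING_TABLE[-1][1]  # assumed fallback: brightest entry
```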
It should be noted that the controller may further extract the brightness of the first image through a preset algorithm, and further determine the target brightness corresponding to the brightness of the first image according to a brightness matching algorithm.
In some embodiments, step 203 comprises: the controller intercepts the region of the first image corresponding to the background range of the virtual image display area to obtain a second image; the controller then detects the brightness of the second image and determines the corresponding target brightness based on the brightness of the second image. As shown in Fig. 4, since the field of view corresponding to the first image covers the background range of the virtual image display area, the first image 5 contains a region corresponding to that background range, and the second image 6 is obtained by intercepting it. Because the brightness of the second image detected by the controller more accurately reflects the brightness of the ambient light within the background range of the virtual image display area, the target brightness matching that ambient brightness can be determined more accurately, the information displayed on the front windshield fuses better with the environment in front of the vehicle, and the viewing comfort of the driver and the safety of driving the vehicle are improved.
In some embodiments, the controller intercepting the region of the first image corresponding to the background range of the virtual image display area to obtain the second image includes: the controller acquires coordinate information of the region of the first image corresponding to the background range of the virtual image display area; the controller then intercepts, based on the coordinate information, the region of the first image corresponding to the background range of the virtual image display area to obtain the second image. It can be understood that intercepting the image by means of the coordinate information allows the image corresponding to the background range of the virtual image display area to be determined more accurately.
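A minimal sketch of this intercepting step, assuming the coordinate information is a rectangle given as (x_min, y_min, x_max, y_max) pixel coordinates on the first image; the patent does not fix a particular coordinate convention.

```python
import numpy as np

def intercept_second_image(first_image: np.ndarray,
                           coords: tuple[int, int, int, int]) -> np.ndarray:
    """Cut out the region of the first image that corresponds to the
    background range of the virtual image display area (the second image).

    `coords` = (x_min, y_min, x_max, y_max) is an assumed representation of
    the coordinate information described in the text.
    """
    x_min, y_min, x_max, y_max = coords
    return first_image[y_min:y_max, x_min:x_max]
```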
In some embodiments, the controller acquiring the coordinate information of the region of the first image corresponding to the background range of the virtual image display area includes: the controller determines the background range of the virtual image display area at a target moment based on running information of the vehicle, where the running information includes the running track and running speed of the vehicle and the target moment is a moment after the current moment; the controller then acquires the coordinate information of the region of the first image corresponding to the background range of the virtual image display area at the target moment. The controller subsequently intercepts, based on this coordinate information, the region of the first image corresponding to the background range of the virtual image display area at the target moment, and obtains the second image. It should be noted that, by intercepting from the first image the region corresponding to the background range of the virtual image display area at the target moment, the target brightness corresponding to the ambient brightness at the target moment can be determined in advance, which better avoids delayed adjustment. In other words, the method can obtain in advance the brightness of the background range of the virtual image display area at the next moment, improving the efficiency of adjusting the information display brightness. The time interval between the target moment and the current moment may be preset, or may be adjusted based on the driver's driving habits.
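Purely as an illustration of the look-ahead idea, the rough sketch below shifts the intercept rectangle in proportion to the distance the vehicle will travel before the target moment; a real implementation would project the predicted vehicle pose through a calibrated camera model, and the pixels-per-meter factor here is an assumed simplification, not the patent's actual computation.

```python
def predict_crop_coords(current_coords: tuple[int, int, int, int],
                        speed_mps: float,
                        lookahead_s: float,
                        pixels_per_meter: float) -> tuple[int, int, int, int]:
    """Rough look-ahead sketch: move the background region upward in the
    image by roughly the distance the vehicle will cover before the target
    moment, since that future background area lies farther ahead in the
    current frame. Illustrative only."""
    x_min, y_min, x_max, y_max = current_coords
    shift = int(speed_mps * lookahead_s * pixels_per_meter)
    return (x_min, max(0, y_min - shift), x_max, max(0, y_max - shift))
```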
In some embodiments, step 203 may be implemented in the following two ways.
The first way comprises the following two steps:
In the first step, the controller determines the region of the first image used for detecting brightness. It should be noted that the image generation unit of the head-up display includes a display screen, and when information is displayed, only the positions on the display screen that correspond to the information emit light; this light is projected onto the front windshield and forms the virtual images corresponding to the different pieces of information, so that the human eye can see the various pieces of information through the front windshield. The region of the first image used for detecting brightness therefore consists of the background ranges corresponding to the areas of the front windshield onto which the light for displaying the different pieces of information is projected. For example, the first information corresponds to a first virtual image, whose light is projected onto a first virtual image display area of the front windshield, and the region of the first image corresponding to the background range of the first virtual image display area is a first region; the second information corresponds to a second virtual image, whose light is projected onto a second virtual image display area of the front windshield, and the region of the first image corresponding to the background range of the second virtual image display area is a second region. The region of the first image used for detecting brightness then includes the union of the first region and the second region and excludes all other parts of the first image.
In the second step, the controller detects the total average brightness of the region used for detecting brightness and determines the corresponding target brightness based on that total average brightness. Determining the total average brightness of the background positions corresponding to all the positions onto which light is projected on the front windshield is equivalent to determining the total average brightness of the ambient light behind the virtual image display areas. Determining the display brightness of the information based on this total average brightness makes the information displayed on the front windshield fuse better with the environment.
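A minimal sketch of the first way, assuming each information background region is a rectangle on a grayscale version of the first image; the region representation and helper name are assumptions for illustration.

```python
import numpy as np

def total_average_brightness(first_image_gray: np.ndarray,
                             info_regions: list[tuple[int, int, int, int]]) -> float:
    """Average the brightness of every pixel that falls inside any of the
    information background regions (e.g. the first and second regions above),
    ignoring the rest of the first image."""
    mask = np.zeros(first_image_gray.shape, dtype=bool)
    for x_min, y_min, x_max, y_max in info_regions:
        mask[y_min:y_max, x_min:x_max] = True
    return float(first_image_gray[mask].mean())
```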
The second way comprises the following steps: the controller divides the region of the first image used for detecting brightness into a plurality of sub-regions, detects the sub-average brightness of each sub-region, and determines a corresponding sub-target brightness based on each sub-average brightness. The sub-regions each correspond to the background range of the virtual image display area of one piece of information. For example, suppose the head-up display is used to display first information and second information. The first information corresponds to a first virtual image, whose light is projected onto a first virtual image display area of the front windshield, and the region of the first image corresponding to the background range of the first virtual image display area is a first sub-region; the second information corresponds to a second virtual image, whose light is projected onto a second virtual image display area of the front windshield, and the region of the first image corresponding to the background range of the second virtual image display area is a second sub-region. Detecting the sub-average brightness of each sub-region then means detecting the average brightness of the first sub-region to obtain a first sub-average brightness, and detecting the average brightness of the second sub-region to obtain a second sub-average brightness. The corresponding first sub-target brightness is determined based on the first sub-average brightness, and the corresponding second sub-target brightness is determined based on the second sub-average brightness. In this way, the light brightness of each piece of information is determined based on its own display area on the front windshield, so that the brightness of each piece of information blends better with the brightness of its corresponding background. When the ambient brightness behind the virtual image display area of each piece of information differs, this method matches the light brightness of each piece of information to the brightness of its own background, improving how well the information fuses with the environment.
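A minimal sketch of the second way, reusing the hypothetical lookup_target_brightness() from the earlier brightness-table sketch; the per-information region names in the commented example are invented for illustration.

```python
import numpy as np

def sub_target_brightnesses(first_image_gray: np.ndarray,
                            sub_regions: dict[str, tuple[int, int, int, int]]) -> dict[str, int]:
    """For each sub-region (one per piece of displayed information), detect
    its sub-average brightness and map it to a sub-target brightness.

    Assumes lookup_target_brightness() from the earlier sketch is in scope.
    """
    result = {}
    for name, (x_min, y_min, x_max, y_max) in sub_regions.items():
        sub_average = float(first_image_gray[y_min:y_max, x_min:x_max].mean())
        result[name] = lookup_target_brightness(sub_average)
    return result

# Example (hypothetical regions): each piece of information gets a brightness
# matched to its own background.
# sub_target_brightnesses(gray, {"speed": (100, 300, 220, 360),
#                                "navigation": (260, 300, 380, 360)})
```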
In step 204, the controller generates a first command and sends the first command to the head-up display, wherein the first command includes a target brightness.
Step 204 corresponds to step 104.
In some embodiments, the image controller generates a first instruction and sends the first instruction to a display controller corresponding to the head-up display, and the first instruction comprises the target brightness. The display controller is used for controlling the head-up display to display information according to the instruction.
In some embodiments, if step 203 is implemented in the second way described above, step 204 includes: the controller generates a first instruction and sends it to the head-up display, where the first instruction contains the plurality of sub-target brightnesses, so that the ambient brightness can be accounted for separately for the different information display areas. Continuing the example in step 203, the first instruction contains the first sub-target brightness and the second sub-target brightness.
In step 205, the heads-up display displays information having a target brightness based on a first instruction.
Wherein the target brightness matches the brightness of the first image. It should be noted that step 205 is the same as step 105 described above.
In some embodiments, the display controller controls the image generation unit of the heads-up display to display information having a target brightness based on the first instruction, wherein the target brightness matches a brightness of the first image.
In some embodiments, if step 203 is implemented in the second way described above, step 205 includes: the display area of the head-up display comprises a plurality of sub-display areas, and the head-up display displays information in each sub-display area at the corresponding sub-target brightness based on the first instruction. Continuing the example in step 203, the display area of the head-up display includes a first sub-display area for displaying the first information and a second sub-display area for displaying the second information. Based on the first instruction, the head-up display shows the first information in the first sub-display area at the first sub-target brightness and the second information in the second sub-display area at the second sub-target brightness. In this way, the brightness of each piece of information on the front windshield better matches the ambient brightness of its background, each piece of information fuses better with the environment, the display effect is improved, and the driver's comfort and driving safety are improved.
In step 206, the controller detects the color of the first image and determines a corresponding target color based on the color of the first image.
In some embodiments, the image controller detects a color of the first image and determines a corresponding target color based on the color of the first image.
In some embodiments, step 206 comprises: the controller determines the target color corresponding to the color of the first image based on a color matching table, where the color matching table includes a plurality of candidate color ranges and each candidate color range corresponds to one information display color. It should be noted that the pairings in the color matching table are determined by technicians through calibration and testing with different people so as to meet the optimal visual requirements of the human eye. The candidate color ranges in the color matching table are different RGB value ranges, each range corresponds to one matched information display color, and the RGB value of the information display color does not lie within the RGB range of the corresponding candidate color, so that the candidate color and its information display color can be distinguished and the driver can easily read the displayed information. For example, if the candidate color is black, the information display color is white; if the candidate color is white, the information display color is blue. In this way, the color of the background range of the virtual image can be distinguished from the target color, improving the legibility of the information displayed on the front windshield so that the driver can clearly see the virtual image under different ambient light. In some embodiments, the controller obtains the RGB value of the color of the first image and takes, as the target color, the information display color corresponding to the candidate color range in the color matching table that contains that RGB value.
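A minimal sketch of such a color lookup. The candidate ranges and display colors below are illustrative placeholders; the patent only requires that each display color lie outside its matched candidate range so the two remain distinguishable.

```python
# Hypothetical color matching table: each candidate range is a per-channel
# (min, max) interval for R, G and B; each maps to one information display
# color chosen to contrast with backgrounds in that range.
COLOR_MATCHING_TABLE = [
    (((0, 60), (0, 60), (0, 60)), (255, 255, 255)),       # dark background -> white
    (((200, 256), (200, 256), (200, 256)), (0, 0, 255)),  # white background -> blue
]

def lookup_target_color(mean_rgb: tuple[float, float, float]) -> tuple[int, int, int]:
    """Return the information display color whose candidate RGB range
    contains the mean RGB value of the (relevant region of the) first image."""
    r, g, b = mean_rgb
    for ((r_lo, r_hi), (g_lo, g_hi), (b_lo, b_hi)), display_color in COLOR_MATCHING_TABLE:
        if r_lo <= r < r_hi and g_lo <= g < g_hi and b_lo <= b < b_hi:
            return display_color
    return (255, 255, 255)  # assumed fallback when no range matches
```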
In some embodiments, the controller may further extract the color parameter of the first image through a preset color algorithm, and then determine a target color corresponding to the color parameter of the first image according to the color algorithm.
In some embodiments, the color of the first image refers to the color of the region of the first image corresponding to the background range of the virtual image display area. It should be noted that the different pieces of information shown by the head-up display may use the same color or different colors. When different pieces of information use different colors, the controller divides the first image into a plurality of sub-regions, detects a color parameter for each sub-region, and determines a corresponding sub-target color based on each color parameter. Each sub-target color corresponds to the display color of one piece of information. For example, the head-up display is configured to display first information and second information, whose virtual image display areas correspond to a first virtual image display area and a second virtual image display area on the front windshield respectively, where the first virtual image display area corresponds to a first sub-region of the first image and the second virtual image display area corresponds to a second sub-region of the first image. The controller detects a first color parameter of the first sub-region and a second color parameter of the second sub-region, determines a corresponding first sub-target color based on the first color parameter, and determines a corresponding second sub-target color based on the second color parameter. The color parameters are the RGB values of the colors.
In step 207, the controller generates a second instruction and sends the second instruction to the heads-up display, the second instruction including the target color.
In some embodiments, the image controller generates a second instruction and sends the second instruction to the display controller, where the second instruction contains the target color. It should be noted that, when the target color includes a plurality of sub-target colors, the second instruction contains the plurality of sub-target colors.
In step 208, the heads-up display displays information having the target color based on the second instruction.
Wherein the target color matches a color of the first image.
In some embodiments, the display controller controls the image generation unit of the heads-up display to display information having a target color based on the second instruction, wherein the target color matches a color of the first image.
In some embodiments, the head-up display displays the information in each sub-display area in the corresponding sub-target color based on the second instruction. In this way, pieces of information with different colors can be displayed on the front windshield while the color of each piece of information still matches the color of its background, improving the flexibility and variety of the information display; the driver can also distinguish different pieces of information visually by their colors, improving the driving experience.
It should be noted that the process of determining and displaying the target color in steps 206 to 208 may be performed before, after, or simultaneously with the process of determining and displaying the target brightness in steps 203 to 205. For example, the controller may detect the brightness and the color of the first image at the same time and determine the corresponding target brightness and target color based on them simultaneously. The controller then generates a target instruction based on the target brightness and the target color and sends it to the head-up display, where the target instruction contains both the target brightness and the target color, and the head-up display displays information with the target brightness and the target color based on that target instruction.
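As a small illustration of what such a combined target instruction might carry, the sketch below groups the brightness and color results into one message object; the field names and types are assumptions, not the patent's actual data format.

```python
from dataclasses import dataclass

@dataclass
class TargetInstruction:
    """Hypothetical combined command sent to the head-up display when the
    controller determines target brightness and target color in one pass."""
    target_brightness: int | dict[str, int]                               # single value or per-sub-area
    target_color: tuple[int, int, int] | dict[str, tuple[int, int, int]]  # single color or per-sub-area
```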
It should be noted that the steps performed by the controller in the display method provided in the embodiments of the present application may be implemented by a vehicle controller of the vehicle, or, as described above, by an image controller and a display controller interacting with each other.
With this display method, the brightness and color of the ambient light in front of the vehicle can be known in time by acquiring an image in front of the vehicle and detecting its brightness and color. Because the brightness and color of the information displayed by the head-up display can be adjusted in time according to the brightness and color of the image in front of the vehicle, mismatches between the displayed information and the environment in front of the vehicle that would interfere with the driver can be avoided, and the driving safety of the vehicle is improved.
The embodiment of the application further provides a vehicle, which comprises a display system, wherein the display system comprises an image acquisition device, a controller and a head-up display, and the image acquisition device, the controller and the head-up display are respectively used for realizing any one of the display methods. The display method is the same as the display method provided in the above embodiment of the present application, and is not described herein again.
In some embodiments, the vehicle further includes a central control screen that displays an enable option and a disable option. If the enable option is selected, the controller interacts with the image acquisition device and the head-up display to implement the display method provided in the embodiments of the present application, that is, the brightness and color of the displayed information are adjusted in real time according to the brightness and color of the environment in front of the vehicle. If the disable option is selected, the controller, the image acquisition device and the head-up display do not carry out the display method provided in the embodiments of the present application, that is, the brightness and color of the displayed information are not adjusted in real time according to the environment in front of the vehicle. It should be noted that the controller and the central control screen communicate through Controller Area Network (CAN) signals.
The vehicle provided by the embodiments of the present application can implement the above display method: the brightness and color of the ambient light in front of the vehicle are known in time by acquiring an image in front of the vehicle and detecting its brightness and color. Because the brightness and color of the information displayed by the head-up display can be adjusted in time according to the brightness and color of the image in front of the vehicle, mismatches between the displayed information and the environment in front of the vehicle that would interfere with the driver can be avoided, and the driving safety of the vehicle is improved. This also avoids the high costs that dangerous accidents would cause through harm to the driver or through vehicle repairs.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the present application disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (10)

1. A display method is applied to a display system, the display system comprises an image acquisition device, a controller and a head-up display, and the display method comprises the following steps:
the image acquisition device acquires a first image, wherein the first image is an image in front of a vehicle;
the image acquisition device sends the first image to the controller;
the controller detects the brightness of the first image and determines corresponding target brightness based on the brightness of the first image;
the controller generates a first instruction and sends the first instruction to the head-up display, wherein the first instruction comprises the target brightness;
the head-up display displays information having the target brightness based on the first instruction, wherein the target brightness matches the brightness of the first image.
2. The display method according to claim 1, wherein the field of view corresponding to the first image covers a background range of a virtual image display area, wherein the virtual image display area is an area where light of the heads-up display is projected on a front windshield of the vehicle.
3. The display method of claim 2, wherein the controller detects the brightness of the first image and determines the corresponding target brightness based on the brightness of the first image comprises:
the controller intercepts a region corresponding to the background range of the virtual image display region in the first image to obtain a second image;
the controller detects the brightness of the second image and determines the corresponding target brightness based on the brightness of the second image.
4. The display method according to claim 3, wherein the controller cuts out an area corresponding to a background range of the virtual image display area in the first image, and obtains a second image including:
the controller acquires coordinate information of a corresponding area of a background range of the virtual image display area on the first image;
and the controller intercepts an area corresponding to the background range of the virtual image display area in the first image based on the coordinate information to obtain the second image.
5. The display method according to claim 4, wherein the controller acquiring coordinate information of a corresponding region of a background range of the virtual image display region on the first image comprises:
the controller determines a background range of a virtual image display region at a target time based on running information of the vehicle, wherein the running information includes a running track and a running speed of the vehicle, and the target time is a time after the current time;
the controller acquires coordinate information of a corresponding area of a background range of the virtual image display area on the first image at the target moment.
6. The display method of claim 1, wherein the controller detects a brightness of the first image and determines the corresponding target brightness based on the brightness of the first image comprises:
the controller determines an area of the first image for detecting brightness;
the controller detects the total average brightness of the areas used for detecting the brightness, and determines the corresponding target brightness based on the total average brightness.
7. The display method of claim 1, wherein the controller detects a brightness of the first image and determines the corresponding target brightness based on the brightness of the first image comprises:
the controller divides the area of the first image used for detecting the brightness into a plurality of sub-areas, detects sub-average brightness of each sub-area, and determines corresponding sub-target brightness based on each sub-average brightness;
the controller generates a first instruction and sends the first instruction to the head-up display, wherein the first instruction containing the target brightness comprises:
the controller generates a first instruction and sends the first instruction to the head-up display, wherein the first instruction comprises the brightness of the plurality of sub-targets;
the heads-up display displaying information having the target brightness based on the first instruction includes:
the display area of the head-up display comprises a plurality of sub-display areas, and the head-up display displays information according to the corresponding sub-target brightness in each sub-display area based on the first instruction.
8. The display method of claim 1, wherein the controller determining the corresponding target brightness based on the brightness of the first image comprises:
the controller determines a target brightness corresponding to the brightness of the first image based on a brightness matching table, wherein the brightness matching table includes a plurality of candidate image brightness ranges, and each candidate image brightness range corresponds to one information display brightness.
9. The display method according to claim 1, further comprising:
the controller detects the color of the first image and determines a corresponding target color based on the color of the first image;
the controller generates a second instruction and sends the second instruction to the head-up display, wherein the second instruction comprises the target color;
the heads-up display displays information having the target color based on the second instruction, wherein the target color matches a color of the first image.
10. A vehicle, characterized in that the vehicle comprises a display system comprising an image acquisition device, a controller and a head-up display, wherein the image acquisition device, the controller and the head-up display are respectively used for realizing the display method of any one of claims 1 to 9.
CN202211356582.8A 2022-11-01 2022-11-01 Display method and vehicle Pending CN115576110A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211356582.8A CN115576110A (en) 2022-11-01 2022-11-01 Display method and vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211356582.8A CN115576110A (en) 2022-11-01 2022-11-01 Display method and vehicle

Publications (1)

Publication Number Publication Date
CN115576110A true CN115576110A (en) 2023-01-06

Family

ID=84588542

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211356582.8A Pending CN115576110A (en) 2022-11-01 2022-11-01 Display method and vehicle

Country Status (1)

Country Link
CN (1) CN115576110A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination