CN113362751A - Data compensation method and data compensation device of display panel

Data compensation method and data compensation device of display panel

Info

Publication number
CN113362751A
Authority
CN
China
Prior art keywords
image capturing
image data
determining
fusion coefficient
weight fusion
Prior art date
Legal status
Granted
Application number
CN202110610337.4A
Other languages
Chinese (zh)
Other versions
CN113362751B (en)
Inventor
康小健
肖剑锋
陈辛洪
Current Assignee
Shenzhen China Star Optoelectronics Semiconductor Display Technology Co Ltd
Original Assignee
Shenzhen China Star Optoelectronics Semiconductor Display Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen China Star Optoelectronics Semiconductor Display Technology Co Ltd filed Critical Shenzhen China Star Optoelectronics Semiconductor Display Technology Co Ltd
Priority to CN202110610337.4A priority Critical patent/CN113362751B/en
Publication of CN113362751A publication Critical patent/CN113362751A/en
Application granted granted Critical
Publication of CN113362751B publication Critical patent/CN113362751B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/34Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source
    • G09G3/36Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source using liquid crystals
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/02Improving the quality of display appearance
    • G09G2320/0233Improving the luminance or brightness uniformity across the screen

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Crystallography & Structural Chemistry (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)

Abstract

The application discloses a data compensation method and a data compensation device for a display panel. The data compensation method acquires at least two sets of initial image data of the display panel at different image capturing positions, determines the fusion result of the initial image data as target image data, and generates a corresponding data compensation table, thereby alleviating the technical problem of a poor display effect at side viewing angles of the display panel. Meanwhile, because the sets of initial image data have the same brightness variation trend, the fused target image data can further improve the display effect of the display panel at different viewing angles.

Description

Data compensation method and data compensation device of display panel
Technical Field
The present application relates to the field of display technologies, and in particular, to a data compensation method and a data compensation apparatus for a display panel.
Background
A display panel involves many manufacturing processes and has a complex structure, so various visual defects are inevitable; one of them is local brightness unevenness (mura) of the display panel. Such visual defects are caused by at least one of backlight unevenness, coating unevenness during the process, film thickness unevenness, and cell gap unevenness, none of which can be completely avoided.
To eliminate mura, compensation may be performed by importing luminance-uniformizing repair data (De-mura repair). The existing data compensation method uses a single camera located at the center of the display panel to capture images, generates compensation data, and then performs the repair; this can effectively repair mura at the front viewing angle, but cannot guarantee the repair quality of mura at some side viewing angles. For example, as shown in fig. 1, when the display panel P1 has a first size, the image capture viewing angle of the camera is A1; when the display panel P2 has a second size, the image capture viewing angle of the camera is A2; and when the display panel P3 has a third size, the image capture viewing angle of the camera is A3. The first size to the third size increase at least along the width direction of the display panel, and correspondingly the image capture viewing angles A1 to A3 also increase gradually. As the image capture viewing angle increases, it becomes difficult for the image data acquired by a single camera located at the center of the display panel to ensure that the display effect of the display panel is optimized at every viewing angle.
It should be noted that the above description of the background art is provided only for the convenience of a clear and complete understanding of the technical solutions of the present application. The technical solutions mentioned above are therefore not admitted to be known to a person skilled in the art merely because they appear in the background of the present application.
Disclosure of Invention
The application provides a data compensation method and a data compensation device for a display panel, which are used to solve the problem that performing data compensation with image data acquired by a single camera easily leads to a poor display effect at side viewing angles.
In a first aspect, the present application provides a data compensation method for a display panel, which includes: acquiring at least two sets of initial image data of the display panel captured at different image capturing positions and having the same brightness variation trend, wherein the image capturing positions are positions of the image capturing device relative to the display panel; determining target image data based on the fusion result of the at least two sets of initial image data with different image capturing positions and the same brightness variation trend; and generating a corresponding data compensation table based on the target image data.
In some embodiments, the different image capturing positions include a first image capturing position, a second image capturing position and a third image capturing position, and the step of acquiring at least two sets of initial image data with different viewing angles and the same brightness variation trend of the display panel includes: determining a first image capturing position, a second image capturing position and a third image capturing position, wherein the second image capturing position is located between the first image capturing position and the third image capturing position; based on the first image capturing position, the second image capturing position and the third image capturing position, the first view angle initial image data, the second view angle initial image data and the third view angle initial image data are sequentially acquired.
In some embodiments, the step of acquiring at least two sets of initial image data with different viewing angles and the same brightness variation trend of the display panel further includes: determining a first image capturing focal length, a second image capturing focal length and a third image capturing focal length, wherein the first image capturing focal length, the second image capturing focal length and the third image capturing focal length are all the same numerical value; based on the first image capturing focal length, the second image capturing focal length and the third image capturing focal length, the first view angle initial image data, the second view angle initial image data and the third view angle initial image data are sequentially acquired.
In some embodiments, the determining the target image data based on the fusion result of the at least two sets of initial image data with different image capturing positions and the same brightness variation trend includes: determining a first weight fusion coefficient of the first view angle initial image data, a second weight fusion coefficient of the second view angle initial image data and a third weight fusion coefficient of the third view angle initial image data; fusing the first view angle initial image data, the second view angle initial image data and the third view angle initial image data based on the first weight fusion coefficient, the second weight fusion coefficient and the third weight fusion coefficient; and determining the target image data based on the fusion result of the first view angle initial image data, the second view angle initial image data and the third view angle initial image data.
In some embodiments, the step of determining a first weight fusion coefficient of the initial image data from the first perspective, a second weight fusion coefficient of the initial image data from the second perspective, and a third weight fusion coefficient of the initial image data from the third perspective includes: determining a first weight fusion coefficient based on the first image capturing position; determining a second weight fusion coefficient based on the second image capturing position; and determining a third weight fusion coefficient based on the third image capturing position.
In some embodiments, the step of determining a first weight fusion coefficient of the initial image data from the first perspective, a second weight fusion coefficient of the initial image data from the second perspective, and a third weight fusion coefficient of the initial image data from the third perspective further comprises: dividing the display width of the display panel into M equal parts, wherein M is a positive integer; determining a first image capturing position, a second image capturing position and a third image capturing position; determining a first weight fusion coefficient based on the first image capturing position and a first calculation formula; determining a third weight fusion coefficient based on the third image capturing position and a third calculation formula; and determining a second weight fusion coefficient based on the first weight fusion coefficient and the third weight fusion coefficient.
In some embodiments, the step of determining the first weight fusion coefficient based on the first image capturing position and the first calculation formula includes: determining that the first image capturing position is located at the X1-th equally-divided position of the display width, wherein X1 is a positive integer and is less than M/2; the first calculation formula is determined as Y1 = 1/(1 + exp(-X1 + M/4)), and Y1 is the first weight fusion coefficient.
In some embodiments, the step of determining the third weight fusion coefficient based on the third image capturing position and the third calculation formula includes: determining that the third image capturing position is located at the X2-th equally-divided position of the display width, wherein X2 is a positive integer and is greater than M/2; the third calculation formula is determined as Y3 = 1/(1 + exp(X2 - 3 × M/4)), and Y3 is the third weight fusion coefficient.
In some embodiments, the step of determining the second weight fusion coefficient based on the first weight fusion coefficient and the third weight fusion coefficient includes: determining the first weight fusion coefficient and the third weight fusion coefficient as Y1 and Y3 in sequence; the second calculation formula for determining the second weight fusion coefficient is Y2 = 1 - (Y1 + Y3), and Y2 is the second weight fusion coefficient.
In a second aspect, the present application provides a data compensation apparatus for a display panel, which includes an obtaining module, a fusing module, and a generating module. The obtaining module is used for acquiring at least two sets of initial image data of the display panel captured at different image capturing positions and having the same brightness variation trend, wherein the image capturing positions are positions of the image capturing device relative to the display panel; the fusing module is used for determining the target image data based on the fusion result of the at least two sets of initial image data with different image capturing positions and the same brightness variation trend; and the generating module is used for generating a corresponding data compensation table based on the target image data.
According to the data compensation method and the data compensation device for the display panel provided by the present application, acquiring at least two sets of initial image data of the display panel at different image capturing positions, determining the fusion result of the initial image data as the target image data, and generating a corresponding data compensation table can alleviate the technical problem of a poor display effect at side viewing angles of the display panel; meanwhile, because the sets of initial image data have the same brightness variation trend, the fused target image data can further improve the display effect of the display panel at different viewing angles.
Drawings
The technical solution and other advantages of the present application will become apparent from the detailed description of the embodiments of the present application with reference to the accompanying drawings.
Fig. 1 is a schematic structural diagram of image data acquisition in a conventional technical solution.
Fig. 2 is a schematic flowchart of a data compensation method of a display panel according to an embodiment of the present disclosure.
Fig. 3 is a schematic structural diagram of initial image data acquisition according to an embodiment of the present application.
Fig. 4 is a schematic diagram illustrating a comparison of luminance variation trends at different image capturing positions according to an embodiment of the present disclosure.
Fig. 5 is a schematic diagram of weight fusion coefficients provided in an embodiment of the present application.
Fig. 6 is a schematic diagram of a fusion process of initial image data according to an embodiment of the present application.
Fig. 7 is a schematic structural diagram of a data compensation apparatus according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Referring to fig. 2 to 7, as shown in fig. 2, the present embodiment provides a data compensation method for a display panel, which includes the following steps:
step S100: at least two groups of initial image data with different image capturing positions and the same brightness change trend of the display panel are obtained, wherein the image capturing positions are positions of the image capturing device relative to the display panel.
Step S200: The target image data is determined based on the fusion result of the at least two sets of initial image data with different image capturing positions and the same brightness variation trend.
Step S300: Based on the target image data, a corresponding data compensation table is generated.
It can be understood that, in the data compensation method for a display panel provided in this embodiment, acquiring at least two sets of initial image data at different image capturing positions of the display panel, determining the fusion result of the initial image data as the target image data, and generating a corresponding data compensation table can alleviate the technical problem of a poor display effect at side viewing angles of the display panel; meanwhile, because the sets of initial image data have the same brightness variation trend, the fused target image data can further improve the display effect of the display panel at different viewing angles.
It should be noted that each set of initial image data may correspond to the original image data obtained by one image capturing device and may include the coordinate data and the luminance data of each sub-pixel. On this basis, the brightness variation trend of each set of initial image data can be determined, and the brightness variation trends of different sets of initial image data can then be compared to determine whether they are the same.
It can be understood that, when different sets of initial image data have the same brightness variation trend, the fused target image data has the same or a similar brightness variation trend, and the data compensation table generated based on it can improve the display effect of the display panel at different viewing angles.
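By way of a non-limiting illustration only, the comparison of brightness variation trends described above could be sketched as follows in Python. The array layout (one luminance value per sub-pixel), the use of normalized column-averaged luminance profiles as the "trend", and the tolerance value are assumptions of this sketch and are not prescribed by the embodiment.

import numpy as np

def brightness_trend(initial_image_data):
    # Assumed trend: the column-averaged luminance profile across the display
    # width, normalized by the overall mean luminance of the data set.
    return initial_image_data.mean(axis=0) / initial_image_data.mean()

def same_brightness_trend(data_sets, tolerance=0.1):
    # Compare each set's trend against the first set; "same" here means the
    # normalized profiles never differ by more than the assumed tolerance.
    reference = brightness_trend(data_sets[0])
    return all(np.max(np.abs(brightness_trend(d) - reference)) <= tolerance
               for d in data_sets[1:])

# Hypothetical usage with three luminance arrays of shape (rows, columns):
# ok = same_brightness_trend([data_L, data_C, data_R])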
In one embodiment, the different image capturing positions include a first image capturing position, a second image capturing position and a third image capturing position, and the step of acquiring at least two sets of initial image data with different viewing angles and the same brightness variation trend of the display panel includes: determining a first image capturing position, a second image capturing position and a third image capturing position, wherein the second image capturing position is located between the first image capturing position and the third image capturing position; based on the first image capturing position, the second image capturing position and the third image capturing position, the first view angle initial image data, the second view angle initial image data and the third view angle initial image data are sequentially acquired.
It should be noted that, in this embodiment, the first perspective initial image data, the second perspective initial image data, and the third perspective initial image data may be acquired simultaneously, so that the acquisition time of the different perspective initial image data can be saved.
In one embodiment, the step of acquiring at least two sets of initial image data with different viewing angles and the same brightness variation trend of the display panel further includes: determining a first image capturing focal length, a second image capturing focal length and a third image capturing focal length, wherein the first image capturing focal length, the second image capturing focal length and the third image capturing focal length are all the same numerical value; based on the first image capturing focal length, the second image capturing focal length and the third image capturing focal length, the first view angle initial image data, the second view angle initial image data and the third view angle initial image data are sequentially acquired.
It is to be understood that the image capturing device may include an image capturing element. The image capturing focal length may be the distance from the corresponding image capturing device to the display panel. Using the same image capturing focal length for the different image capturing devices allows initial image data of a uniform specification to be acquired, which facilitates calculating the weight fusion coefficients of the different sets of initial image data.
In one embodiment, the step of determining the target image data based on the fusion result of the at least two sets of initial image data with different image capturing positions and the same brightness variation trend includes: determining a first weight fusion coefficient of the first view angle initial image data, a second weight fusion coefficient of the second view angle initial image data and a third weight fusion coefficient of the third view angle initial image data; fusing the first view angle initial image data, the second view angle initial image data and the third view angle initial image data based on the first weight fusion coefficient, the second weight fusion coefficient and the third weight fusion coefficient; and determining the target image data based on the fusion result of the first view angle initial image data, the second view angle initial image data and the third view angle initial image data.
It can be understood that, the corresponding weight fusion coefficients are configured for the initial image data acquired at different viewing angles or different image capturing positions, so that the display effects of the display panel at different viewing angles can be effectively improved.
In one embodiment, the step of determining a first weight fusion coefficient of the initial image data from the first perspective, a second weight fusion coefficient of the initial image data from the second perspective, and a third weight fusion coefficient of the initial image data from the third perspective includes: determining a first weight fusion coefficient based on the first image capturing position; determining a second weight fusion coefficient based on the second image capturing position; and determining a third weight fusion coefficient based on the third image capturing position.
In one embodiment, the step of determining a first weight fusion coefficient of the initial image data from the first perspective, a second weight fusion coefficient of the initial image data from the second perspective, and a third weight fusion coefficient of the initial image data from the third perspective further includes: dividing the display width of the display panel into M equal parts, wherein M is a positive integer; determining a first image capturing position, a second image capturing position and a third image capturing position; determining a first weight fusion coefficient based on the first image capturing position and a first calculation formula; determining a third weight fusion coefficient based on the third image capturing position and a third calculation formula; and determining a second weight fusion coefficient based on the first weight fusion coefficient and the third weight fusion coefficient.
It should be noted that the display width of the display panel may be the width of the display area of the display panel. After the display width is divided into M equal parts, the equally-divided position that each image capturing device faces along the direction perpendicular to the light-emitting surface is used to determine the relative position of that image capturing device.
In one embodiment, the step of determining the first weight fusion coefficient based on the first image capturing position and the first calculation formula includes: determining that the first image capturing position is located at the X1-th equally-divided position of the display width, wherein X1 is a positive integer and is less than M/2; the first calculation formula is determined as Y1 = 1/(1 + exp(-X1 + M/4)), and Y1 is the first weight fusion coefficient.
Note that, in the present embodiment, the starting point of the display width is the left end point of the display area when facing the display panel, and the ending point of the display width is the right end point of the display area when facing the display panel. The X1-th equally-divided position is the X1-th division point counted from the left end point.
In one embodiment, the step of determining the third weight fusion coefficient based on the third image capturing position and the third calculation formula includes: determining that the third image capturing position is located at the X2-th equally-divided position of the display width, wherein X2 is a positive integer and is greater than M/2; the third calculation formula is determined as Y3 = 1/(1 + exp(X2 - 3 × M/4)), and Y3 is the third weight fusion coefficient.
In one embodiment, the step of determining the second weight fusion coefficient based on the first weight fusion coefficient and the third weight fusion coefficient includes: determining the first weight fusion coefficient and the third weight fusion coefficient as Y1 and Y3 in sequence; the second calculation formula for determining the second weight fusion coefficient is Y2 = 1 - (Y1 + Y3), and Y2 is the second weight fusion coefficient.
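As a worked illustration of the three calculation formulas above, the following Python sketch evaluates Y1, Y3 and Y2 for a given number of equal parts M and given image capturing positions X1 and X2. Whether the formulas are evaluated once per image capturing position or once per equally-divided position along the display width is not detailed at this point in the description; the sketch simply evaluates them as stated, and the example values of M, X1 and X2 are assumptions.

import math

def weight_fusion_coefficients(M, X1, X2):
    # First calculation formula: Y1 = 1/(1 + exp(-X1 + M/4)), with X1 < M/2.
    Y1 = 1.0 / (1.0 + math.exp(-X1 + M / 4))
    # Third calculation formula: Y3 = 1/(1 + exp(X2 - 3*M/4)), with X2 > M/2.
    Y3 = 1.0 / (1.0 + math.exp(X2 - 3 * M / 4))
    # Second calculation formula: Y2 = 1 - (Y1 + Y3), so the three sum to 1.
    Y2 = 1.0 - (Y1 + Y3)
    return Y1, Y2, Y3

# Example (assumed values): M = 481 and image capturing positions X1 = 120 and
# X2 = 361, i.e. near the M/4-th and 3M/4-th equally-divided positions. This
# gives Y1 ≈ 0.44, Y3 ≈ 0.44 and Y2 ≈ 0.12, which sum to 1 by construction.
print(weight_fusion_coefficients(481, 120, 361))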
In one embodiment, the second image capturing position may be the M/2-th equally-divided position.
Based on the above embodiments, as shown in fig. 3, the image capturing devices may include a first image capturing device L, a second image capturing device C, and a third image capturing device R, wherein the first image capturing device L may be located at a first image capturing position, the second image capturing device C may be located at a second image capturing position, and the third image capturing device R may be located at a third image capturing position.
The first image capturing device L may be located at the left end point of the display width of the display panel P, the second image capturing device C may be located at the midpoint of the display width of the display panel P, and the third image capturing device R may be located at the right end point of the display width of the display panel P.
As shown in fig. 4, the abscissa represents the equally-divided position along the display width of the display panel, and the ordinate represents the gray scale of the display panel. The left graph T1 in fig. 4 corresponds to the state before data compensation. The curve group S31 shows that when the first image capturing device L and the third image capturing device R are located at a pair of positions symmetrical about the second image capturing device C, their gray scale variation trends, and correspondingly their luminance variation trends, are similar; however, a certain difference remains between the two, which indicates a viewing angle display difference between the left-side and right-side views of the screen. Similarly, the curve group S21 shows that when the first image capturing device L and the third image capturing device R are located at another pair of symmetrical positions on the left and right sides of the second image capturing device C, their gray scale variation trends, and correspondingly their luminance variation trends, are also similar, but again with a certain difference, indicating a viewing angle display difference between the left-side and right-side views. The curve S11 represents the luminance variation trend of the image data acquired by the second image capturing device C, which is consistent with the luminance variation trends of the curve groups S21 and S31.
The right graph T2 in fig. 4 corresponds to the state after data compensation. The curve group S3 shows that when the first image capturing device L and the third image capturing device R are located at a pair of positions symmetrical about the second image capturing device C, their gray scale variation trends, and correspondingly their luminance variation trends, are similar, and the difference between the two is greatly reduced, which indicates that the viewing angle display difference between the left-side and right-side views of the screen is reduced. Similarly, the curve group S2 shows that when the first image capturing device L and the third image capturing device R are located at another pair of symmetrical positions on the left and right sides of the second image capturing device C, their gray scale variation trends, and correspondingly their luminance variation trends, are similar, and the difference between the two is likewise reduced, indicating that the viewing angle display difference between the left-side and right-side views is reduced synchronously. The curve S1 represents the luminance variation trend of the image data acquired by the second image capturing device C, which is consistent with the luminance variation trends of the curve groups S2 and S3. This means that, after the sets of image data with the same brightness variation trend are fused, the generated data compensation table can improve the display effect of the display panel at different viewing angles; for example, the display effects at the left viewing angle, the right viewing angle and the front viewing angle are all improved synchronously.
As shown in fig. 5, in the above embodiment the sum of the weight fusion coefficients is always equal to 1. The curve S10 represents the variation trend of the first weight fusion coefficient of the first view angle initial image data, the curve S20 represents the variation trend of the third weight fusion coefficient of the third view angle initial image data, and the curve S30 represents the variation trend of the second weight fusion coefficient of the second view angle initial image data. The ordinate represents the value of each weight fusion coefficient, and the abscissa represents the equally-divided position along the display width of the display panel; in this drawing, M may be 481.
As shown in fig. 6, after the first view angle initial image data, the second view angle initial image data and the third view angle initial image data, acquired by the first image capturing device L, the second image capturing device C and the third image capturing device R respectively, are multiplied by the corresponding first weight fusion coefficient, second weight fusion coefficient and third weight fusion coefficient, the target image data MIX can be obtained. It is understood that the graphs shown in fig. 6 are all images displayed based on the corresponding image data.
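A minimal Python sketch of the fusion shown in fig. 6 is given below, assuming that each set of initial image data is a luminance array of the same shape and that the target image data MIX is obtained by adding the three weighted products together (fig. 6 states that each set is multiplied by its weight fusion coefficient; summing the products is an assumption of this sketch).

import numpy as np

def fuse_initial_image_data(data_L, data_C, data_R, Y1, Y2, Y3):
    # Multiply each set of view angle initial image data by its weight fusion
    # coefficient and combine the products into the target image data MIX.
    assert data_L.shape == data_C.shape == data_R.shape
    return Y1 * data_L + Y2 * data_C + Y3 * data_R

# Hypothetical usage with the coefficients from the previous sketch:
# Y1, Y2, Y3 = weight_fusion_coefficients(481, 120, 361)
# mix = fuse_initial_image_data(data_L, data_C, data_R, Y1, Y2, Y3)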
As shown in fig. 7, the present embodiment further provides a data compensation apparatus for a display panel, which includes an obtaining module 100, a fusion module 200, and a generating module 300. The obtaining module 100 is configured to acquire at least two sets of initial image data of the display panel at different image capturing positions and with the same brightness variation trend, where the image capturing positions are positions of the image capturing device relative to the display panel; the fusion module 200 is configured to determine the target image data based on the fusion result of the at least two sets of initial image data with different image capturing positions and the same brightness variation trend; and the generating module 300 is configured to generate a corresponding data compensation table based on the target image data.
It can be understood that, in the data compensation apparatus provided in this embodiment, acquiring at least two sets of initial image data of the display panel at different image capturing positions, determining the fusion result of the initial image data as the target image data, and generating a corresponding data compensation table can alleviate the technical problem of a poor display effect at side viewing angles of the display panel; meanwhile, because the sets of initial image data have the same brightness variation trend, the fused target image data can further improve the display effect of the display panel at different viewing angles.
It should be noted that the obtaining module 100 may be electrically connected to the fusion module 200, and the fusion module 200 may be electrically connected to the generating module 300.
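For illustration only, the module structure described above could be organized as in the following Python sketch. The class and method names are hypothetical, the fusion step reuses the weighted combination sketched for fig. 6, and the compensation-table step is only a placeholder, since the embodiment does not specify how the table is derived from the target image data.

class ObtainingModule:
    def acquire(self, image_capturing_devices):
        # One set of initial image data per image capturing position.
        return [device.capture() for device in image_capturing_devices]

class FusionModule:
    def __init__(self, weights):
        self.weights = weights  # (Y1, Y2, Y3)

    def fuse(self, initial_data_sets):
        # Weighted combination of the initial image data into the target image data.
        return sum(w * d for w, d in zip(self.weights, initial_data_sets))

class GeneratingModule:
    def generate(self, target_image_data):
        # Placeholder only: derive a per-sub-pixel compensation value from the
        # target image data (here, its deviation from the mean luminance).
        return target_image_data.mean() - target_image_data

# Hypothetical wiring, mirroring the connection order of the modules:
# data = ObtainingModule().acquire([camera_L, camera_C, camera_R])
# target = FusionModule((Y1, Y2, Y3)).fuse(data)
# table = GeneratingModule().generate(target)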
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
The data compensation method and the data compensation device for a display panel provided in the embodiments of the present application are described in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the description of the above embodiments is only intended to help understand the technical solutions and the core ideas of the present application. Those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications or substitutions do not depart from the spirit and scope of the present disclosure as defined by the appended claims.

Claims (10)

1. A data compensation method of a display panel is characterized by comprising the following steps:
acquiring at least two sets of initial image data of the display panel, wherein the sets of initial image data have different image capturing positions and the same brightness variation trend, and the image capturing positions are positions of an image capturing device relative to the display panel;
determining target image data based on a fusion result of the at least two sets of initial image data with different image capturing positions and the same brightness variation trend;
and generating a corresponding data compensation table based on the target image data.
2. The data compensation method as claimed in claim 1, wherein the different image capturing positions include a first image capturing position, a second image capturing position and a third image capturing position, and the step of acquiring at least two sets of initial image data with different viewing angles and the same brightness variation trend of the display panel comprises:
determining the first image capturing position, the second image capturing position and the third image capturing position, wherein the second image capturing position is located between the first image capturing position and the third image capturing position;
and sequentially acquiring the first view angle initial image data, the second view angle initial image data and the third view angle initial image data based on the first image capturing position, the second image capturing position and the third image capturing position.
3. The data compensation method of claim 2, wherein the step of obtaining at least two sets of initial image data with different viewing angles and the same brightness variation trend of the display panel further comprises:
determining a first image capturing focal length, a second image capturing focal length and a third image capturing focal length, wherein the first image capturing focal length, the second image capturing focal length and the third image capturing focal length are all the same numerical value;
and sequentially acquiring the first view angle initial image data, the second view angle initial image data and the third view angle initial image data based on the first image capturing focal length, the second image capturing focal length and the third image capturing focal length.
4. The data compensation method according to claim 2, wherein the step of determining the target image data based on the fusion result of the at least two sets of initial image data with different image capturing positions and the same brightness variation trend comprises:
determining a first weight fusion coefficient of the first perspective initial image data, a second weight fusion coefficient of the second perspective initial image data and a third weight fusion coefficient of the third perspective initial image data;
fusing the first view angle initial image data, the second view angle initial image data and the third view angle initial image data based on the first weight fusion coefficient, the second weight fusion coefficient and the third weight fusion coefficient;
and determining the target image data based on the fusion result of the first view initial image data, the second view initial image data and the third view initial image data.
5. The data compensation method of claim 4, wherein the step of determining a first weight fusion coefficient of the first perspective initial image data, a second weight fusion coefficient of the second perspective initial image data, and a third weight fusion coefficient of the third perspective initial image data comprises:
determining the first weight fusion coefficient based on the first image capturing position;
determining the second weight fusion coefficient based on the second image capturing position;
and determining the third weight fusion coefficient based on the third image capturing position.
6. The data compensation method of claim 5, wherein the step of determining a first weight fusion coefficient of the first perspective initial image data, a second weight fusion coefficient of the second perspective initial image data, and a third weight fusion coefficient of the third perspective initial image data further comprises:
dividing the display width of the display panel into M equal parts, wherein M is a positive integer;
determining the first image capturing position, the second image capturing position and the third image capturing position;
determining the first weight fusion coefficient based on the first image capturing position and a first calculation formula;
determining the third weight fusion coefficient based on the third image capturing position and a third calculation formula;
and determining the second weight fusion coefficient based on the first weight fusion coefficient and the third weight fusion coefficient.
7. The data compensation method as claimed in claim 6, wherein the step of determining the first weight fusion coefficient based on the first image capturing position and the first calculation formula comprises:
determining that the first image capturing position is located at an X1-th equally-divided position of the display width, wherein X1 is a positive integer and is less than M/2;
determining that the first calculation formula is Y1 = 1/(1 + exp(-X1 + M/4)), and Y1 is the first weight fusion coefficient.
8. The data compensation method as claimed in claim 7, wherein the step of determining the third weight fusion coefficient based on the third image capturing position and a third calculation formula comprises:
determining that the third image capturing position is located at an X2-th equally-divided position of the display width, wherein X2 is a positive integer and is greater than M/2;
determining the third calculation formula as Y3 = 1/(1 + exp(X2 - 3 × M/4)), and Y3 as the third weight fusion coefficient.
9. The data compensation method of claim 8, wherein the step of determining the second weight fusion coefficient based on the first weight fusion coefficient and the third weight fusion coefficient comprises:
determining the first weight fusion coefficient and the third weight fusion coefficient as Y1 and Y3 in sequence;
a second calculation formula for determining the second weight fusion coefficient is Y2 = 1 - (Y1 + Y3), and Y2 is the second weight fusion coefficient.
10. A data compensation apparatus for a display panel, comprising:
an obtaining module, used for acquiring at least two sets of initial image data of the display panel captured at different image capturing positions and having the same brightness variation trend, wherein the image capturing positions are positions of an image capturing device relative to the display panel;
a fusion module, used for determining target image data based on the fusion result of the at least two sets of initial image data with different image capturing positions and the same brightness variation trend; and
a generating module, used for generating a corresponding data compensation table based on the target image data.
CN202110610337.4A 2021-06-01 2021-06-01 Data compensation method and data compensation device of display panel Active CN113362751B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110610337.4A CN113362751B (en) 2021-06-01 2021-06-01 Data compensation method and data compensation device of display panel

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110610337.4A CN113362751B (en) 2021-06-01 2021-06-01 Data compensation method and data compensation device of display panel

Publications (2)

Publication Number Publication Date
CN113362751A (en) 2021-09-07
CN113362751B CN113362751B (en) 2023-11-28

Family

ID=77530952

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110610337.4A Active CN113362751B (en) 2021-06-01 2021-06-01 Data compensation method and data compensation device of display panel

Country Status (1)

Country Link
CN (1) CN113362751B (en)


Citations (8)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007121243A (en) * 2005-10-31 2007-05-17 Sharp Corp Device and method for inspecting image display panel
KR20070060965A (en) * 2005-12-09 2007-06-13 타이완 티에프티 엘씨디 오쏘시에이션 A system and a method of measuring a display at multi-angles
WO2010146732A1 (en) * 2009-06-18 2010-12-23 シャープ株式会社 Defect inspection method and defect inspection device for display panel
JP2016206383A (en) * 2015-04-21 2016-12-08 シャープ株式会社 Liquid crystal display device
CN105590604A (en) * 2016-03-09 2016-05-18 深圳市华星光电技术有限公司 Mura phenomenon compensation method
CN108922481A (en) * 2018-06-11 2018-11-30 宏祐图像科技(上海)有限公司 A kind of demura implementation method based on LCD TV side view angle
CN111491070A (en) * 2020-06-29 2020-08-04 武汉精立电子技术有限公司 Display panel multi-view angle equalization Demura method and terminal equipment
CN112767891A (en) * 2021-01-20 2021-05-07 Tcl华星光电技术有限公司 Mura compensation method, display panel and display device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115019747A (en) * 2022-05-23 2022-09-06 惠科股份有限公司 Display control method and vehicle-mounted display equipment
CN115019747B (en) * 2022-05-23 2023-07-18 惠科股份有限公司 Display control method and vehicle-mounted display device

Also Published As

Publication number Publication date
CN113362751B (en) 2023-11-28

Similar Documents

Publication Publication Date Title
JP6719579B2 (en) Compensation method for unevenness
CN105244007B (en) A kind of generation method and device of the grayscale correction chart of camber display screen
US10839729B2 (en) Apparatus for testing display panel and driving method thereof
US8134779B2 (en) 3D image display, aligning system and method thereof
US9208735B2 (en) Voltage adjustment method and apparatus of liquid crystal display panel
JP4799329B2 (en) Unevenness inspection method, display panel manufacturing method, and unevenness inspection apparatus
CN106952626B (en) The mura compensation deals method, apparatus and liquid crystal display of RGBW pixel arrangement panel
JP2017527848A (en) Method for setting gray scale value of liquid crystal panel and liquid crystal display
CN105376540A (en) Projection display system and correction method of projection area
CN106328079B (en) Image brightness compensation method and compensating module
CN103686162B (en) Method and device for testing crosstalk of three-dimensional display
CN105590606A (en) Mura phenomenon compensation method
CN103167223A (en) Mobile device with wide-angle shooting function and image acquisition method thereof
CN109616507B (en) Mura compensation device, display panel, display device and mura compensation method
CN113362751A (en) Data compensation method and data compensation device of display panel
CN111491070B (en) Display panel multi-view angle equalization Demura method and terminal equipment
KR102209953B1 (en) Mura Detecting Device
CN102411005B (en) Cell substrate inspection system and method
CN107911602B (en) Display panel Mura detection method, detection device and computer readable storage medium
TWI746201B (en) Display device and image correction method
CN112954304A (en) Mura defect evaluation method and system for display panel and readable storage medium
CN109073503B (en) Unevenness evaluation method and unevenness evaluation device
CN113903284A (en) Testing device
CN104166240A (en) Naked eye 3D display device
CN105469757A (en) Display panel scan driving method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant