TWI654873B - Screen calibration method and screen calibration system - Google Patents

Screen calibration method and screen calibration system

Info

Publication number
TWI654873B
Authority
TW
Taiwan
Prior art keywords
optical data
screen
region
target
target optical
Prior art date
Application number
TW107102449A
Other languages
Chinese (zh)
Other versions
TW201933858A (en)
Inventor
林信男
黃重裕
Original Assignee
明基電通股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 明基電通股份有限公司 filed Critical 明基電通股份有限公司
Priority to TW107102449A priority Critical patent/TWI654873B/en
Application granted granted Critical
Publication of TWI654873B publication Critical patent/TWI654873B/en
Publication of TW201933858A publication Critical patent/TW201933858A/en

Landscapes

  • Controls And Circuits For Display Device (AREA)

Abstract

A screen calibration method includes: a camera captures a full-screen image of the screen; a sensor acquires first optical data of a first region among a plurality of regions of the screen; the first optical data of the screen in the first region is corrected according to a first calibration parameter so that the color output of the first region approaches target optical data; second optical data of a second region is generated from the full-screen image and the first optical data of the first region; and a second calibration parameter is generated from the target optical data and the second optical data to correct the second optical data so that the color output of the second region also approaches the target optical data.

Description

Screen calibration method and screen calibration system

The present invention describes a screen calibration method and system, and more particularly a method and system that use a camera and a sensor to apply optical compensation to every region of a screen.

As technology advances, a wide variety of displays are in widespread use. Liquid crystal displays (LCD) and organic light-emitting diode (OLED) displays, for example, are thin, power-efficient, and radiation-free, and are now commonly found in multimedia players, mobile phones, personal digital assistants, computer monitors, flat-panel televisions, and other electronic products. However, because of manufacturing variation or user settings, the displayed picture may exhibit color shifts such as hue shift, white-balance shift, or lightness shift. These shifts often produce unpleasant or inaccurate color reproduction.

When a display exhibits color shift, the usual remedy is for the user to open the display's on-screen display (OSD) adjustment menu and manually tune its parameters. At the same time, the user must hold a calibrator against a small area of the screen and converge on the desired color by stepwise correction. With today's calibrators, however, the light sensor inside the device has a limited detection range and can only measure the optical characteristics of a single point (or a very small area) of the screen. In other words, for the large, high-resolution screens that are now mainstream, calibrating the full screen requires the user to collect the emission characteristics of every sub-region with the calibrator. This is a highly repetitive task; besides being time-consuming, the calibrator's placement over each sub-region is controlled by hand, so the calibration result is not necessarily accurate.

One embodiment of the present invention provides a screen calibration method that includes: a camera captures a full-screen image of a screen; a sensor acquires first optical data of a first region among a plurality of regions of the screen; the first optical data of the first region is corrected according to a first calibration parameter so that the color output of the first region approaches target optical data; second optical data of a second region is generated from the full-screen image and the first optical data of the first region; and a second calibration parameter is generated from the target optical data and the second optical data to correct the second optical data so that the color output of the second region approaches the target optical data.

Another embodiment of the present invention provides a screen calibration system. The system includes a screen, a camera, a sensor, and a processor. The screen comprises a plurality of regions for displaying images. The camera captures a full-screen image of the screen. The sensor is placed against the screen to acquire regional optical data. The processor is coupled to the sensor, the camera, and the screen and calibrates the screen. The sensor acquires first optical data of a first region among the regions of the screen. The processor corrects the first optical data of the first region according to a first calibration parameter so that the color output of the first region approaches target optical data, generates second optical data of a second region from the full-screen image and the first optical data of the first region, and generates a second calibration parameter from the target optical data and the second optical data to correct the second optical data so that the color output of the second region approaches the target optical data.

FIG. 1 is a block diagram of an embodiment of the screen calibration system 100 of the present invention. The screen calibration system 100 includes a screen 10, a camera 11, a sensor 12, and a processor 13. The screen 10 comprises a plurality of regions for displaying images. In this embodiment, the screen 10 may be any surface device capable of emitting light signals, such as a liquid crystal display or an organic light-emitting diode display. The regions of the screen 10 may be regions formed by sub-pixel arrays; the invention does not limit their shape, number, or size. The camera 11 captures a full-screen image of the screen 10. The camera 11 may include a lens with an image sensor (for example a charge-coupled device, CCD) whose wide-angle field of view covers the full screen, so that when the camera 11 faces the screen 10 it can capture a full-screen image. The sensor 12 is placed against the screen 10 to acquire regional optical data. The sensor 12 may be any kind of optical sensing device: when the screen 10 emits light and displays an image, the sensor can be held against the screen 10 to acquire optical data for what is essentially a point or small area. The processor 13 is coupled to the sensor 12, the camera 11, and the screen 10 and calibrates the screen 10. In this embodiment, the purpose of the screen calibration system 100 is to calibrate the entire screen from the full-screen image captured by the camera 11 and the point or small-area optical data acquired by the sensor 12, so that the color output of the whole screen ultimately reaches the target optical data set by the user. To this end, the sensor 12 acquires first optical data of a first region among the regions of the screen 10. The processor 13 corrects the first optical data of the first region according to a first calibration parameter so that the color output of the first region approaches the target optical data. The processor 13 also generates second optical data of a second region from the full-screen image and the first optical data of the first region, and generates a second calibration parameter from the target optical data and the second optical data to correct the second optical data so that the color output of the second region approaches the target optical data. Several embodiments are introduced below to explain the operation and algorithms of the screen calibration system 100.

FIG. 2A and FIG. 2B illustrate the display device 14, which integrates the screen 10, the camera 11, and the sensor 12 of the screen calibration system 100, when preparing to measure the displayed picture and while measuring it, respectively. In this embodiment, as shown in FIG. 2A and FIG. 2B, the screen calibration system 100 may integrate the camera 11, the sensor 12, and the screen 10 into the display device 14 so that they form a single unit. Alternatively, the camera 11 and the sensor 12 may be separate from the screen 10 and exchange data over a wireless link; any reasonable technical or hardware variation falls within the scope of this disclosure. In FIG. 2A and FIG. 2B, the sensor 12 can pivot on at least one bearing to the first region of the screen 10 and acquire the first optical data while held against that region. Similarly, the camera 11 can pivot on at least one bearing toward the front of the screen 10 to capture the full-screen image. The sensor 12 may, however, be moved to a particular area of the screen in any manner, for example with an articulated support arm, and the camera 11 may be moved into position on a slide rail or a flexible mount to capture the full-screen image.

FIG. 3 illustrates the camera 11 capturing a full-screen image of the screen 10 in the screen calibration system 100. As mentioned above, the screen 10 may be a surface device capable of emitting light and may be divided into multiple regions, for example regions R1 to R9. Regions R1 to R9 are located at different positions, so the light they emit also differs. Because the camera 11 must be moved to a suitable position to capture the full-screen image, there is a distance between the camera 11 and the screen 10, which can be regarded as the focusing distance at which the camera 11 can capture the full screen. Owing to this distance, every region in the full-screen image captured by the camera 11 is affected by ambient light. For convenience, the optical data below are expressed as CIE chromaticity-space coordinates (for example the CIE 1931 color space). It should be understood, however, that the algorithms and flows of this disclosure are not limited to CIE coordinates; coordinates in any converted color space, such as RGB chromaticity coordinates, may be used. Because every region in the full-screen image is affected by ambient light, the image optical data corresponding to region R1 of the screen 10 in the captured full-screen image may be (x1, y1, Y1), expressed as: (x1, y1, Y1) = (x1'+Δx1, y1'+Δy1, Y1'+ΔY1)

where (x1', y1', Y1') is the true emission characteristic of region R1 and (Δx1, Δy1, ΔY1) is the ambient light parameter associated with region R1. The image optical data (x1, y1, Y1) that the camera 11 captures for region R1 can thus be regarded as the combination of the true emission characteristic (x1', y1', Y1') and the ambient light parameter (Δx1, Δy1, ΔY1). Similarly, in the full-screen image captured by the camera 11, the image optical data corresponding to region R2 of the screen 10 may be (x2, y2, Y2), expressed as: (x2, y2, Y2) = (x2'+Δx2, y2'+Δy2, Y2'+ΔY2)

where (x2', y2', Y2') is the true emission characteristic of region R2 and (Δx2, Δy2, ΔY2) is the ambient light parameter associated with region R2. The image optical data (x2, y2, Y2) that the camera 11 captures for region R2 can be regarded as the combination of the true emission characteristic (x2', y2', Y2') and the ambient light parameter (Δx2, Δy2, ΔY2). By the same reasoning, after regions R1 to R9 are affected by ambient light, the image optical data captured by the camera are (x1, y1, Y1) through (x9, y9, Y9), respectively.
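As a minimal illustration (not part of the original disclosure), the additive model above can be sketched as follows, assuming each optical datum is a CIE (x, y, Y) tuple; all names are illustrative.

```python
# Sketch of the measurement model: what the camera sees for one region is the
# region's true emission plus an ambient-light offset.

def camera_reading(true_emission, ambient):
    """Return (xi, yi, Yi) seen by the camera for one region."""
    x_t, y_t, Y_t = true_emission   # (xi', yi', Yi')
    dx, dy, dY = ambient            # (Δxi, Δyi, ΔYi)
    return (x_t + dx, y_t + dy, Y_t + dY)
```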

FIG. 4 illustrates the sensor 12 measuring the lowest-brightness region of the screen, identified from the full-screen image, in the screen calibration system 100. As described above, the image optical data of regions R1 to R9 as captured by the camera under ambient light are (x1, y1, Y1) through (x9, y9, Y9). The processor 13 uses these image optical data to determine the region with the lowest brightness, which in this embodiment is region R1. The sensor 12 then acquires the first optical data of the lowest-brightness region R1 among the regions of the screen 10, while held against the screen 10. The first optical data acquired by the sensor 12 for region R1 can therefore be regarded as optical data unaffected by ambient light, corresponding to the true emission characteristic of region R1. Using the definitions above, the first optical data acquired for region R1 can be written as (x1', y1', Y1').
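A minimal sketch of this reference-region choice, assuming the camera data is kept as a dictionary of region id to (x, y, Y) (an illustrative layout, not the patent's data structure):

```python
# Pick the region whose camera-measured luminance Y is lowest.

def darkest_region(camera_data):
    """camera_data: {region_id: (x, y, Y)} -> region id with minimum Y."""
    return min(camera_data, key=lambda region: camera_data[region][2])

# Example: darkest_region({"R1": (0.30, 0.32, 180.0), "R2": (0.31, 0.33, 205.0)})
# returns "R1".
```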

FIG. 5 illustrates the processor calibrating the lowest-brightness region R1 and estimating the true emission characteristics of the remaining regions in the screen calibration system 100. Following the previous steps, the sensor 12 acquires the first optical data of region R1, which can be regarded as the true emission characteristic of region R1 unaffected by ambient light and is written as (x1', y1', Y1'). As defined above, the image optical data of region R1 in the full-screen image captured by the camera 11 is (x1, y1, Y1) = (x1'+Δx1, y1'+Δy1, Y1'+ΔY1). Because the first optical data (x1', y1', Y1') of region R1 can be measured by the sensor 12 and the ambient-light-affected image optical data (x1, y1, Y1) can be measured by the camera 11, a set of ambient light parameters (Δx1, Δy1, ΔY1) for region R1 can be derived from the full-screen image and the first optical data as: (Δx1, Δy1, ΔY1) = (x1−x1', y1−y1', Y1−Y1')
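A short sketch of this ambient-light estimate, assuming both readings are (x, y, Y) tuples for the same reference region (illustrative names only):

```python
# Ambient-light parameters of the reference region: component-wise difference
# between the camera reading and the sensor reading taken against the screen.

def ambient_parameters(camera_xyY, sensor_xyY):
    """(Δx, Δy, ΔY) = camera reading minus true emission measured by the sensor."""
    return tuple(c - s for c, s in zip(camera_xyY, sensor_xyY))
```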

The processor 13 also generates a first calibration parameter f_R1(x_R1, y_R1, Y_R1) from the first optical data (x1', y1', Y1') of region R1 and the target optical data (x, y, Y) set by the user. In other words, for region R1 the relationship among the target optical data (x, y, Y), the first optical data (x1', y1', Y1'), and the first calibration parameter f_R1(x_R1, y_R1, Y_R1) can be written as: (x, y, Y) = f_R1(x1', y1', Y1')

where f_R1(x_R1, y_R1, Y_R1) may be a transfer function, a recursive function, or any color-gamut projection function or matrix. If f_R1(x_R1, y_R1, Y_R1) is a gain matrix G1_RGB, the target optical data (x, y, Y) is obtained from the first optical data (x1', y1', Y1') by adjusting it with the gain matrix G1_RGB.

The first optical data (x1', y1', Y1') can also be converted to the target optical data (x, y, Y) by a recursive conversion method, for example by iteratively shifting the values of the first optical data over multiple passes until they converge to the target optical data (x, y, Y); a minimal sketch of such an iterative adjustment is given below. Any reasonable chromaticity-coordinate conversion algorithm falls within the scope of this disclosure. In FIG. 5, therefore, the emission characteristic of region R1 is ultimately calibrated to the target optical data (x, y, Y). Further, as defined above, the image optical data of region R2 in the full-screen image captured by the camera 11 is (x2, y2, Y2). Because the full-screen image includes ambient light, this can be written as (x2, y2, Y2) = (x2'+Δx2, y2'+Δy2, Y2'+ΔY2). However, only the ambient light parameters (Δx1, Δy1, ΔY1) of region R1 can be calculated precisely. Therefore, for region R2, the processor 13 can compute second optical data from the full-screen image and the first optical data; this second optical data is an estimate of the true emission characteristic of region R2: (x2', y2', Y2') ≈ (x2−Δx1, y2−Δy1, Y2−ΔY1)
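The following sketch illustrates the recursive conversion mentioned above under stated assumptions: measure() and apply_offset() are assumed callbacks that re-measure the region and adjust its output; they are placeholders, not the patent's interface.

```python
# Iteratively nudge the region's measured (x, y, Y) toward the target,
# shifting a fraction of the remaining error on each pass until it converges.

def converge_to_target(measure, apply_offset, target, step=0.25, tol=1e-3, max_iter=100):
    for _ in range(max_iter):
        current = measure()
        error = [t - c for t, c in zip(target, current)]
        if max(abs(e) for e in error) < tol:
            break
        apply_offset([step * e for e in error])  # move part of the way each pass
    return measure()
```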

In the estimate above, (x2−Δx1, y2−Δy1, Y2−ΔY1) is the second optical data of region R2 and serves as an estimate of the optical data of region R2's true emission characteristic, as explained below. As mentioned above, the optical data of the true emission characteristic of region R2 can be written as: (x2', y2', Y2') = (x2−Δx2, y2−Δy2, Y2−ΔY2)

Because the sensor measures only the emission characteristic of region R1, the processor 13 substitutes the ambient light parameters (Δx1, Δy1, ΔY1) of region R1 for the ambient light parameters (Δx2, Δy2, ΔY2) of region R2 when estimating the optical data of region R2's true emission characteristic (the second optical data). In other words, when the ambient light parameters (Δx2, Δy2, ΔY2) of region R2 are close to those of region R1 (Δx1, Δy1, ΔY1), the second optical data (x2−Δx1, y2−Δy1, Y2−ΔY1) of region R2 approaches the optical data (x2', y2', Y2') of region R2's true emission characteristic. The processor 13 then generates a second calibration parameter f_R2(x_R2, y_R2, Y_R2) from the second optical data (x2−Δx1, y2−Δy1, Y2−ΔY1) of region R2 and the target optical data (x, y, Y) set by the user. In other words, for region R2 the relationship among the target optical data (x, y, Y), the second optical data (x2−Δx1, y2−Δy1, Y2−ΔY1), and the second calibration parameter f_R2(x_R2, y_R2, Y_R2) can be written as: (x, y, Y) = f_R2(x2−Δx1, y2−Δy1, Y2−ΔY1)
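A minimal sketch of this estimation step follows; the scalar luminance gain is only an illustrative stand-in for the gain matrix G2_RGB, and the names are assumptions.

```python
# Estimate another region's true emission by subtracting the reference region's
# ambient parameters from that region's camera reading, then derive a simple
# luminance gain toward the target.

def estimate_true_emission(camera_xyY, reference_ambient):
    return tuple(c - d for c, d in zip(camera_xyY, reference_ambient))

def required_luminance_gain(target_xyY, estimated_xyY):
    return target_xyY[2] / estimated_xyY[2]  # boost needed on the Y component
```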

In this relation, f_R2(x_R2, y_R2, Y_R2) may be a transfer function, a recursive function, or any color-gamut projection function or matrix. If f_R2(x_R2, y_R2, Y_R2) is a gain matrix G2_RGB, the target optical data (x, y, Y) is obtained from the second optical data (x2−Δx1, y2−Δy1, Y2−ΔY1) by adjusting it with the gain matrix G2_RGB.

As mentioned above, the second optical data (x2−Δx1, y2−Δy1, Y2−ΔY1) approaches the optical data (x2', y2', Y2') of region R2's true emission characteristic. Therefore, the optical data (x2', y2', Y2') of region R2's true emission characteristic, after adjustment by the gain matrix G2_RGB, approximates the target optical data (x, y, Y).

Thus, in the screen calibration system 100 of this embodiment, the true emission characteristic of region R1 can be compensated with the first calibration parameter f_R1(x_R1, y_R1, Y_R1) so that it is updated to match the target optical data (x, y, Y), and the true emission characteristic of region R2 can be compensated with the second calibration parameter f_R2(x_R2, y_R2, Y_R2) so that it is updated to approach the target optical data (x, y, Y). The other regions of the screen 10 can have their color output corrected in a similar way, so that the color output of every region of the screen 10 closely matches the target optical data (x, y, Y) set by the user.

FIG. 6 illustrates the screen calibration system 100 after the processor 13 has calibrated the lowest-brightness region R1 of the screen 10 and then corrects the other regions of the screen 10. As mentioned above, the processor 13 can use the camera 11 and the sensor 12 to calculate the ambient light parameters (Δx1, Δy1, ΔY1) of region R1 precisely, and the sensor 12 is held against region R1 while collecting optical data, so region R1 can be calibrated precisely (compensated with the first calibration parameter f_R1(x_R1, y_R1, Y_R1)) so that its color output matches the target optical data (x, y, Y). For the remaining regions of the screen 10, the processor 13 uses the ambient light parameters of region R1 together with the full-screen image taken by the camera 11 to estimate their true emission characteristics, and then uses the corresponding calibration parameters to bring their color output toward the target optical data (x, y, Y). Consequently, as long as the ambient light does not fluctuate much, the image displayed on the screen 10 will largely match the target optical data (x, y, Y). In this embodiment, the processor 13 can also generate a plurality of test pictures for the screen 10 according to the target optical data set by the user, and the screen 10 displays these test pictures for calibration. If the true optical characteristic of the test picture displayed in the darkest region R1 of the screen 10 can be updated to match the target optical data (x, y, Y), then the true optical characteristics of all regions of the screen 10 can be updated to approach the target optical data (x, y, Y). The reason is that the darkest region R1 is compensated with the first calibration parameter f_R1(x_R1, y_R1, Y_R1), whose gain is the largest; if the darkest region R1 can reach a color output close to the target optical data (x, y, Y) with this large gain, regions that use calibration parameters with smaller gains (for example region R2 with its second calibration parameter f_R2(x_R2, y_R2, Y_R2)) can certainly do so as well. Therefore, if the darkest region R1 of the screen 10 is taken as the reference region and the darkest region R1 supports optical compensation, every region of the screen 10 can support the target optical data (x, y, Y) corresponding to the test pictures. The sensor 12 of the present invention is not limited to selecting the darkest region R1 of the screen 10, however; in another embodiment the sensor 12 may select a particular region, as described below.
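Before turning to that embodiment, the headroom argument above can be sketched as follows; the gain limit and per-region data are assumptions for illustration only.

```python
# The darkest region needs the largest luminance gain, so if its required gain
# fits within the panel's headroom, every other region's smaller gain does too.

def target_is_supported(estimated_Y_per_region, target_Y, max_gain=1.5):
    required = (target_Y / Y for Y in estimated_Y_per_region.values())
    return max(required) <= max_gain  # worst case comes from the darkest region
```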

FIG. 7 illustrates the sensor 12 measuring the region R5 at the center of the screen 10 in the screen calibration system 100. In this embodiment, the sensor 12 acquires the first optical data of the central region R5 of the screen 10. Because region R5 lies at the center of the screen, unlike the previous embodiment the sensor 12 can acquire the first optical data of region R5 directly by being held against the screen 10. In other words, in this embodiment the sensor 12 may acquire the first optical data of region R5 before the camera 11 captures the full-screen image of the screen 10; the user can simply move the sensor 12 to the center of the screen 10 (region R5) without using the full-screen image to identify a region of a particular brightness. Note that the first optical data is defined as the optical data of the region against which the sensor 12 is placed. In the previous embodiment the sensor 12 measured the darkest region R1 of the screen 10, so the first optical data there was the optical data (true emission characteristic) of the darkest region R1. In this embodiment the sensor 12 measures the central region R5 of the screen 10, so the first optical data here is the optical data of region R5, corresponding to the true emission characteristic of region R5 (unaffected by ambient light) and written as (x5', y5', Y5'). Further, in the full-screen image captured by the camera 11, the image optical data corresponding to region R2 of the screen 10 may be (x2, y2, Y2), expressed as: (x2, y2, Y2) = (x2'+Δx2, y2'+Δy2, Y2'+ΔY2)

where (x2', y2', Y2') is the true emission characteristic of region R2 and (Δx2, Δy2, ΔY2) is the ambient light parameter associated with region R2. The image optical data (x2, y2, Y2) that the camera 11 captures for region R2 can be regarded as the combination of the true emission characteristic (x2', y2', Y2') and the ambient light parameter (Δx2, Δy2, ΔY2). Similarly, in the full-screen image captured by the camera 11, the image optical data of the central region R5 of the screen 10 is (x5, y5, Y5), expressed as: (x5, y5, Y5) = (x5'+Δx5, y5'+Δy5, Y5'+ΔY5)

where (x5', y5', Y5') is the true emission characteristic of region R5 and (Δx5, Δy5, ΔY5) is the ambient light parameter associated with region R5. The image optical data (x5, y5, Y5) that the camera 11 captures for region R5 can be regarded as the combination of the true emission characteristic (x5', y5', Y5') and the ambient light parameter (Δx5, Δy5, ΔY5). By the same reasoning, after regions R1 to R9 are affected by ambient light, the image optical data captured by the camera are (x1, y1, Y1) through (x9, y9, Y9), respectively.

FIG. 8 illustrates the processor 13 calibrating the central region R5 of the screen 10 and estimating the true emission characteristics of the remaining regions in the screen calibration system 100. Following the previous steps, the sensor 12 acquires the first optical data of region R5, which can be regarded as the true emission characteristic unaffected by ambient light and is written as (x5', y5', Y5'). As defined above, the image optical data of region R5 in the full-screen image captured by the camera 11 is (x5, y5, Y5) = (x5'+Δx5, y5'+Δy5, Y5'+ΔY5). Because the first optical data (x5', y5', Y5') of region R5 can be measured by the sensor 12, a set of ambient light parameters (Δx5, Δy5, ΔY5) for region R5 can be obtained from the full-screen image and the first optical data (x5', y5', Y5') as: (Δx5, Δy5, ΔY5) = (x5−x5', y5−y5', Y5−Y5')

The processor 13 also generates a calibration parameter f_R5(x_R5, y_R5, Y_R5) for region R5 from the first optical data (x5', y5', Y5') of region R5 and the target optical data (x, y, Y) set by the user. In other words, for region R5 the relationship among the target optical data (x, y, Y), the first optical data (x5', y5', Y5'), and the calibration parameter f_R5(x_R5, y_R5, Y_R5) can be written as: (x, y, Y) = f_R5(x5', y5', Y5')

where f_R5(x_R5, y_R5, Y_R5) may be a transfer function, a recursive function, or any color-gamut projection function or matrix. If f_R5(x_R5, y_R5, Y_R5) is a gain matrix G5_RGB, the target optical data (x, y, Y) is obtained from the first optical data (x5', y5', Y5') by adjusting it with the gain matrix G5_RGB.

The first optical data (x5', y5', Y5') can also be converted to the target optical data (x, y, Y) by a recursive conversion method, for example by iteratively shifting the values of the first optical data over multiple passes until they converge to the target optical data (x, y, Y). Any reasonable chromaticity-coordinate conversion algorithm falls within the scope of this disclosure. In FIG. 8, therefore, the emission characteristic of region R5 is ultimately calibrated to the target optical data (x, y, Y). Further, as defined above, the image optical data of region R2 in the full-screen image captured by the camera 11 is (x2, y2, Y2), which, because the full-screen image includes ambient light, can be written as (x2, y2, Y2) = (x2'+Δx2, y2'+Δy2, Y2'+ΔY2). However, only the ambient light parameters (Δx5, Δy5, ΔY5) of region R5 can be calculated precisely. Therefore, for region R2, the processor 13 can compute second optical data from the full-screen image and the first optical data; this second optical data is an estimate of the true emission characteristic of region R2: (x2', y2', Y2') ≈ (x2−Δx5, y2−Δy5, Y2−ΔY5)

Here, (x2−Δx5, y2−Δy5, Y2−ΔY5) is the second optical data of region R2 and serves as an estimate of the optical data of region R2's true emission characteristic, as explained below. As mentioned above, the optical data of the true emission characteristic of region R2 can be written as: (x2', y2', Y2') = (x2−Δx2, y2−Δy2, Y2−ΔY2)

Because the sensor measures only the emission characteristic of region R5, the processor 13 substitutes the ambient light parameters (Δx5, Δy5, ΔY5) of region R5 for the ambient light parameters (Δx2, Δy2, ΔY2) of region R2 when estimating the optical data of region R2's true emission characteristic (the second optical data). In other words, when the ambient light parameters (Δx2, Δy2, ΔY2) of region R2 are close to those of region R5 (Δx5, Δy5, ΔY5), the second optical data (x2−Δx5, y2−Δy5, Y2−ΔY5) of region R2 approaches the optical data (x2', y2', Y2') of region R2's true emission characteristic. The processor 13 then generates a second calibration parameter f_R2(x_R2, y_R2, Y_R2) from the second optical data (x2−Δx5, y2−Δy5, Y2−ΔY5) of region R2 and the target optical data (x, y, Y) set by the user. In other words, for region R2 the relationship among the target optical data (x, y, Y), the second optical data (x2−Δx5, y2−Δy5, Y2−ΔY5), and the second calibration parameter f_R2(x_R2, y_R2, Y_R2) can be written as: (x, y, Y) = f_R2(x2−Δx5, y2−Δy5, Y2−ΔY5)

where f_R2(x_R2, y_R2, Y_R2) may be a transfer function, a recursive function, or any color-gamut projection function or matrix. If f_R2(x_R2, y_R2, Y_R2) is a gain matrix G2_RGB, the target optical data (x, y, Y) is obtained from the second optical data (x2−Δx5, y2−Δy5, Y2−ΔY5) by adjusting it with the gain matrix G2_RGB.

As mentioned above, the second optical data (x2−Δx5, y2−Δy5, Y2−ΔY5) approaches the optical data (x2', y2', Y2') of region R2's true emission characteristic. Therefore, the optical data (x2', y2', Y2') of region R2's true emission characteristic, after adjustment by the gain matrix G2_RGB, approximates the target optical data (x, y, Y).

As in the previous embodiment, in the screen calibration system 100 of this embodiment the true emission characteristic of region R5 can be compensated to match the target optical data (x, y, Y), and the true emission characteristic of region R2 can be compensated to approach the target optical data (x, y, Y). The other regions of the screen 10 can have their color output corrected in a similar way, so that the color output of every region of the screen 10 closely matches the target optical data (x, y, Y) set by the user.

FIG. 9 illustrates the screen calibration system 100 after the processor 13 has calibrated the central region R5 of the screen 10 and then corrects the other regions of the screen 10. As mentioned above, the processor 13 can use the camera 11 and the sensor 12 to calculate the ambient light parameters (Δx5, Δy5, ΔY5) of region R5 precisely, and the sensor 12 is held against region R5 while collecting optical data, so region R5 can be calibrated precisely and its color output matches the target optical data (x, y, Y). For the remaining regions of the screen 10, the processor 13 uses the ambient light parameters (Δx5, Δy5, ΔY5) of region R5 together with the full-screen image taken by the camera 11 to estimate their true emission characteristics, and then uses the corresponding calibration parameters to bring their color output toward the target optical data (x, y, Y). Consequently, as long as the ambient light does not fluctuate much, the image displayed on the screen 10 will largely match the target optical data (x, y, Y). Similarly, the processor 13 can generate a plurality of test pictures for the screen 10 according to the target optical data set by the user, and the screen 10 displays these test pictures for calibration. In this embodiment, however, even if the true optical characteristic of the test picture displayed in the central region R5 can be updated to match the target optical data (x, y, Y), there is no guarantee that the true optical characteristics of all regions of the screen 10 can be updated to approach the target optical data (x, y, Y). The reason is that the central region R5 is not necessarily the darkest region of the screen 10. Therefore, when the central region R5 is compensated with its calibration parameter so that its color output matches the target optical data (x, y, Y), some very dark regions of the screen 10 may not support the high-gain calibration parameter needed to bring their color output to the target optical data (x, y, Y). For example, suppose the target optical data (x, y, Y) set by the user corresponds to a color temperature of 6500 K and a luminance of 230 nits (cd/m²). Although the central region R5 can be calibrated to reach 230 nits, some regions of the screen 10 may be unable to reach that level. To avoid an uneven brightness distribution across the screen 10, the screen calibration system 100 therefore adjusts the target optical data (x, y, Y), for example lowering the luminance corresponding to the target optical data (x, y, Y) from 230 nits to 210 nits so that every region of the screen 10 can present a consistent brightness. In other words, when the test pictures that the processor 13 generates for the user's target optical data (x, y, Y) cannot be supported by some regions of the screen 10, the processor 13 fine-tunes the target optical data (x, y, Y), for example its luminance value, to produce adjusted target optical data. The color output of all regions of the screen 10 can then be calibrated consistently toward the adjusted target optical data, again achieving uniform color output.
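A minimal sketch of this target adjustment follows; the per-region achievable luminances are assumed inputs, not values from the patent.

```python
# If some regions cannot reach the requested luminance, lower the target to the
# brightest level every region can still support (e.g. 230 nits -> 210 nits).

def adjusted_target_luminance(requested_Y, achievable_Y_per_region):
    return min(requested_Y, min(achievable_Y_per_region.values()))

# Example: adjusted_target_luminance(230.0, {"R1": 210.0, "R5": 240.0}) -> 210.0
```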

Continuing from the above embodiments, if the test pictures generated by the processor 13 for the user's target optical data (x, y, Y) can be supported by all regions of the screen 10, the screen 10 has enough headroom to satisfy the user's setting; the target optical data (x, y, Y) is then not fine-tuned, and all regions of the screen 10 eventually display color output that matches the target optical data (x, y, Y). In the embodiments above, the first optical data acquired by the sensor 12 may be the true emission characteristic of the darkest region of the screen 10 (for example region R1) or of the central region of the screen 10 (for example region R5). The processor 13 calculates the corresponding ambient light from the first optical data acquired by the sensor 12 and the full-screen image, estimates the true emission characteristic of every region accordingly, and then compensates the color output of every region toward the level of the target optical data (x, y, Y). The invention is not limited to these choices; for example, the sensor 12 may acquire the first optical data of any region of the screen 10, after which the other regions are optically compensated. After every region of the screen 10 has been optically compensated, the user may verify that the optical data of all regions agree, either by measuring each region manually with the sensor 12 or by having the processor 13 detect the optical data of each region automatically. Once the calibration of the screen 10 is complete, the uniform color output of the screen 10 improves the user's comfort and viewing experience.
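As a small sketch of this verification step (illustrative only; read_region is an assumed callback returning (x, y, Y) for one region, and the tolerances are arbitrary):

```python
# Re-measure every region after compensation and check that the readings agree
# with the first region's reading within a per-component tolerance.

def regions_are_uniform(read_region, region_ids, tol=(0.003, 0.003, 2.0)):
    readings = [read_region(r) for r in region_ids]
    reference = readings[0]
    return all(
        abs(value - ref) <= limit
        for reading in readings
        for value, ref, limit in zip(reading, reference, tol)
    )
```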

FIG. 10 is a flowchart of the screen calibration method performed in the screen calibration system 100. The screen calibration method may include steps S101 to S105; any reasonable variation of these steps falls within the scope of this disclosure. Steps S101 to S105 are described below.

Step S101: the camera 11 captures a full-screen image of the screen 10.
Step S102: the sensor 12 acquires first optical data of a first region among a plurality of regions of the screen 10.
Step S103: the first optical data of the screen 10 in the first region is corrected according to a first calibration parameter so that the color output of the first region approaches the target optical data.
Step S104: second optical data of a second region is generated from the full-screen image and the first optical data of the first region.
Step S105: a second calibration parameter is generated from the target optical data and the second optical data to correct the second optical data so that the color output of the second region approaches the target optical data.

The operation of steps S101 to S105 has been described in detail above and is not repeated here. By following steps S101 to S105, the screen calibration system 100 achieves a convenient calibration procedure and a high-quality calibration result.
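For orientation, an end-to-end sketch following steps S101 to S105 with the darkest region as the reference is given below. The helper names (capture_full_screen, sensor_read, set_region_luminance_gain) are placeholders rather than the patent's interface, and only the luminance component of the correction is modelled.

```python
def calibrate_screen(capture_full_screen, sensor_read, set_region_luminance_gain, target):
    """target: desired (x, y, Y); per-region correction is sketched as a Y gain only."""
    camera_data = capture_full_screen()                        # S101: {region: (x, y, Y)}
    ref = min(camera_data, key=lambda r: camera_data[r][2])    # reference = darkest region
    first_optical = sensor_read(ref)                           # S102: true emission of ref
    set_region_luminance_gain(ref, target[2] / first_optical[2])          # S103
    ambient = tuple(c - s for c, s in zip(camera_data[ref], first_optical))
    for region, reading in camera_data.items():                # S104 + S105 for the rest
        if region == ref:
            continue
        estimated = tuple(c - d for c, d in zip(reading, ambient))
        set_region_luminance_gain(region, target[2] / estimated[2])
```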

In summary, the present invention describes a screen calibration method and system. The user no longer needs to repeat the optical-data collection step with the sensor for every region; it is enough to capture a full-screen image with the camera and acquire the optical data of a single region with the sensor. The screen calibration system can compute the ambient light parameters and calibration parameter of that region from its regional optical data and the full-screen image, and can then estimate the calibration parameters of the other regions step by step. The system can therefore compensate the color output of every region with its own calibration parameter, so that all regions ultimately approach the level of the target optical data set by the user. Compared with conventional screen calibration systems, the screen calibration system of the present invention has at least the following advantages. First, the user needs only a single sensor operation to acquire the optical data of one particular region, which avoids highly repetitive handling and greatly reduces the operation time. Second, because the processor automatically estimates the calibration parameter of every region and compensates its color output, the calibration errors of conventional systems, in which the calibrator's displacement over each region of the screen is controlled by hand, are avoided. The above are only preferred embodiments of the present invention, and all equivalent changes and modifications made within the scope of the claims of the present invention shall fall within the scope of the present invention.

100‧‧‧Screen calibration system

10‧‧‧Screen

11‧‧‧Camera

12‧‧‧Sensor

13‧‧‧Processor

14‧‧‧Display device

R1 to R9‧‧‧Regions

S101 to S105‧‧‧Steps

FIG. 1 is a block diagram of an embodiment of the screen calibration system of the present invention.
FIG. 2A is a schematic diagram of the display device, which integrates the screen, the camera, and the sensor of the screen calibration system of FIG. 1, when preparing to measure the displayed picture.
FIG. 2B is a schematic diagram of the display device, which integrates the screen, the camera, and the sensor of the screen calibration system of FIG. 1, while measuring the displayed picture.
FIG. 3 is a schematic diagram of the camera capturing a full-screen image of the screen in the screen calibration system of FIG. 1.
FIG. 4 is a schematic diagram of the sensor measuring the lowest-brightness region of the screen, identified from the full-screen image, in the screen calibration system of FIG. 1.
FIG. 5 is a schematic diagram of the processor calibrating the lowest-brightness region and estimating the true emission characteristics of the remaining regions in the screen calibration system of FIG. 1.
FIG. 6 is a schematic diagram of the processor correcting the other regions of the screen after calibrating the lowest-brightness region of the screen in the screen calibration system of FIG. 1.
FIG. 7 is a schematic diagram of the sensor measuring the central region of the screen in the screen calibration system of FIG. 1.
FIG. 8 is a schematic diagram of the processor calibrating the central region of the screen and estimating the true emission characteristics of the remaining regions in the screen calibration system of FIG. 1.
FIG. 9 is a schematic diagram of the processor correcting the other regions of the screen after calibrating the central region of the screen in the screen calibration system of FIG. 1.
FIG. 10 is a flowchart of the screen calibration method performed in the screen calibration system of FIG. 1.

Claims (21)

1. 一種螢幕校正方法，包含： 一相機取得一螢幕之一全螢幕影像； 一感測器取得該螢幕之複數個區域中之一第一區域的第一光學資料； 依據一第一校正參數校正該螢幕於該第一區域的該第一光學資料，以控制該第一區域的發色接近目標光學資料； 根據該全螢幕影像與該第一區域的該第一光學資料產生一第二區域的第二光學資料；及 依據該目標光學資料及該第二光學資料，產生該第二校正參數，以校正該第二光學資料，以控制該第二區域的發色接近該目標光學資料。 A screen calibration method, comprising: a camera acquiring a full-screen image of a screen; a sensor acquiring first optical data of a first region among a plurality of regions of the screen; calibrating the first optical data of the screen in the first region according to a first correction parameter, so as to control the color of the first region to approach target optical data; generating second optical data of a second region according to the full-screen image and the first optical data of the first region; and generating the second correction parameter according to the target optical data and the second optical data, so as to correct the second optical data and control the color of the second region to approach the target optical data.
2. 如請求項1所述之方法，其中該感測器取得該螢幕之該些區域中該第一區域的該第一光學資料，係為該感測器依據該全螢幕影像，取得該螢幕之該些區域中一亮度最低區域的該第一光學資料。 The method of claim 1, wherein the sensor acquiring the first optical data of the first region among the regions of the screen is the sensor acquiring, according to the full-screen image, the first optical data of a lowest-luminance region among the regions of the screen.
3. 如請求項1所述之方法，其中該第二光學資料趨近於該第二區域之一真實發光特性，該第一光學資料對應於該第一區域之一真實發光特性，該方法另包含： 依據該全螢幕影像及該第一光學資料，取得一組環境光參數。 The method of claim 1, wherein the second optical data approaches a true luminescence characteristic of the second region and the first optical data corresponds to a true luminescence characteristic of the first region, the method further comprising: acquiring a set of ambient light parameters according to the full-screen image and the first optical data.
4. 如請求項3所述之方法，其中依據該目標光學資料及該第二光學資料，產生該第二校正參數，以校正該第二光學資料使其接近該目標光學資料，係為依據該目標光學資料及該第二光學資料，產生該第二校正參數，並利用該第二校正參數將該第二區域的一真實發光特性進行補償，以使該第二區域的該真實發光特性趨近於該目標光學資料。 The method of claim 3, wherein generating the second correction parameter according to the target optical data and the second optical data to correct the second optical data toward the target optical data is generating the second correction parameter according to the target optical data and the second optical data, and compensating a true luminescence characteristic of the second region with the second correction parameter, so that the true luminescence characteristic of the second region approaches the target optical data.
5. 如請求項1所述之方法，其中該第一光學資料及該第二光學資料係為兩CIE色度空間座標的資料或兩三原色(RGB)之色度空間座標的資料。 The method of claim 1, wherein the first optical data and the second optical data are data of two sets of CIE chromaticity-space coordinates or data of two sets of RGB (three-primary-color) chromaticity-space coordinates.
6. 如請求項1所述之方法，其中取得該螢幕之該區域的該第一光學資料，係為取得該螢幕之一中心區域的該第一光學資料。 The method of claim 1, wherein acquiring the first optical data of the region of the screen is acquiring the first optical data of a central region of the screen.
7. 如請求項6所述之方法，另包含： 設定該目標光學資料； 依據該目標光學資料，產生複數個測試畫面至該螢幕；及 若該些測試畫面於該螢幕中無法符合該目標光學資料，調整該目標光學資料，以產生調整後的目標光學資料。 The method of claim 6, further comprising: setting the target optical data; generating a plurality of test pictures on the screen according to the target optical data; and if the test pictures on the screen cannot meet the target optical data, adjusting the target optical data to generate adjusted target optical data.
8. 如請求項1或6所述之方法，另包含： 設定該目標光學資料；及 依據該目標光學資料，產生複數個測試畫面至該螢幕； 其中該些測試畫面對應的該目標光學資料被該螢幕之該些區域支援。 The method of claim 1 or 6, further comprising: setting the target optical data; and generating a plurality of test pictures on the screen according to the target optical data; wherein the target optical data corresponding to the test pictures is supported by the regions of the screen.
9. 如請求項1或6所述之方法，其中該第一光學資料對應於該第一區域之一真實發光特性，該第二光學資料趨近於該第二區域之一真實發光特性，且取得該螢幕之該全螢幕影像，係在該感測器取得該螢幕之該些區域中之該第一區域的該第一光學資料之後執行。 The method of claim 1 or 6, wherein the first optical data corresponds to a true luminescence characteristic of the first region, the second optical data approaches a true luminescence characteristic of the second region, and acquiring the full-screen image of the screen is performed after the sensor acquires the first optical data of the first region among the regions of the screen.
10. 如請求項1或6所述之方法，另包含： 偵測該螢幕之該些區域中每一區域對應的光學資料，以驗證該些區域對應之光學資料是否一致。 The method of claim 1 or 6, further comprising: detecting the optical data corresponding to each of the regions of the screen, to verify whether the optical data corresponding to the regions are consistent.
11. 如請求項1所述之方法，其中該相機、該感測器及該螢幕設置於一顯示裝置上，該相機利用至少一軸承轉動至該螢幕的正面，以擷取該全螢幕影像，且該感測器利用至少一軸承轉動至該螢幕之該第一區域，以貼近該第一區域的方式取得該第一光學資料。 The method of claim 1, wherein the camera, the sensor, and the screen are disposed on a display device, the camera rotates to the front of the screen by means of at least one bearing to capture the full-screen image, and the sensor rotates to the first region of the screen by means of at least one bearing to acquire the first optical data by being placed close to the first region.
12. 一種螢幕校正系統，包含： 一螢幕，包含複數個區域，用以顯示影像； 一相機，用以取得該螢幕之一全螢幕影像； 一感測器，用以貼近該螢幕以取得區域性的光學資料；及 一處理器，耦接於該感測器、該相機及該螢幕，用以校正該螢幕； 其中該感測器取得該螢幕之該些區域中之一第一區域的第一光學資料，該處理器依據一第一校正參數校正該螢幕於該第一區域的該第一光學資料，以控制該第一區域的發色接近一目標光學資料，根據該全螢幕影像與該第一區域的該第一光學資料產生一第二區域的第二光學資料，及依據該目標光學資料及該第二光學資料，產生該第二校正參數，以校正該第二光學資料而控制該第二區域的發色接近該目標光學資料。 A screen calibration system, comprising: a screen comprising a plurality of regions and configured to display images; a camera configured to acquire a full-screen image of the screen; a sensor configured to be placed close to the screen to acquire regional optical data; and a processor, coupled to the sensor, the camera, and the screen, configured to calibrate the screen; wherein the sensor acquires first optical data of a first region among the regions of the screen, and the processor calibrates the first optical data of the screen in the first region according to a first correction parameter so as to control the color of the first region to approach a target optical data, generates second optical data of a second region according to the full-screen image and the first optical data of the first region, and generates the second correction parameter according to the target optical data and the second optical data so as to correct the second optical data and control the color of the second region to approach the target optical data.
13. 如請求項12所述之系統，其中該感測器依據該全螢幕影像，取得該螢幕之該些區域中一亮度最低區域的該第一光學資料。 The system of claim 12, wherein the sensor acquires, according to the full-screen image, the first optical data of a lowest-luminance region among the regions of the screen.
14. 如請求項12所述之系統，其中該第二光學資料趨近於該第二區域之一真實發光特性，該第一光學資料對應於該第一區域之一真實發光特性，且該處理器依據該全螢幕影像及該第一光學資料，取得一組環境光參數。 The system of claim 12, wherein the second optical data approaches a true luminescence characteristic of the second region, the first optical data corresponds to a true luminescence characteristic of the first region, and the processor acquires a set of ambient light parameters according to the full-screen image and the first optical data.
15. 如請求項14所述之系統，其中該處理器依據該目標光學資料及該第二光學資料，產生該第二校正參數，並利用該第二校正參數將該第二區域的一真實發光特性進行補償，以使該第二區域的該真實發光特性趨近於該目標光學資料。 The system of claim 14, wherein the processor generates the second correction parameter according to the target optical data and the second optical data, and compensates a true luminescence characteristic of the second region with the second correction parameter, so that the true luminescence characteristic of the second region approaches the target optical data.
16. 如請求項12所述之系統，其中該第一光學資料及該第二光學資料係為兩CIE色度空間座標的資料或兩三原色(RGB)之色度空間座標的資料。 The system of claim 12, wherein the first optical data and the second optical data are data of two sets of CIE chromaticity-space coordinates or data of two sets of RGB (three-primary-color) chromaticity-space coordinates.
17. 如請求項12所述之系統，其中該感測器取得該螢幕之一中心區域的該第一光學資料。 The system of claim 12, wherein the sensor acquires the first optical data of a central region of the screen.
18. 如請求項17所述之系統，其中該處理器設定該目標光學資料，依據該目標光學資料，產生複數個測試畫面至該螢幕，及若該些測試畫面於該螢幕中無法符合該目標光學資料，該處理器調整該目標光學資料以產生調整後的目標光學資料。 The system of claim 17, wherein the processor sets the target optical data and generates a plurality of test pictures on the screen according to the target optical data, and if the test pictures on the screen cannot meet the target optical data, the processor adjusts the target optical data to generate adjusted target optical data.
19. 如請求項12或17所述之系統，其中該處理器設定該目標光學資料，並依據該目標光學資料，產生複數個測試畫面至該螢幕，且該些測試畫面對應的該目標光學資料被該螢幕之該些區域支援。 The system of claim 12 or 17, wherein the processor sets the target optical data and generates a plurality of test pictures on the screen according to the target optical data, and the target optical data corresponding to the test pictures is supported by the regions of the screen.
20. 如請求項12或17所述之系統，其中該第一光學資料對應於該第一區域之一真實發光特性，該第二光學資料趨近於該第二區域之一真實發光特性，且該相機取得該螢幕之該全螢幕影像，係在該感測器取得該螢幕之該些區域中之該第一區域的該第一光學資料之後執行。 The system of claim 12 or 17, wherein the first optical data corresponds to a true luminescence characteristic of the first region, the second optical data approaches a true luminescence characteristic of the second region, and the camera acquiring the full-screen image of the screen is performed after the sensor acquires the first optical data of the first region among the regions of the screen.
21. 如請求項12或17所述之系統，其中該螢幕之該些區域中每一區域對應的光學資料會被偵測，以驗證該些區域對應之光學資料是否一致。 The system of claim 12 or 17, wherein the optical data corresponding to each of the regions of the screen is detected, to verify whether the optical data corresponding to the regions are consistent.
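The claims only state that the target optical data is adjusted when the test pictures cannot meet it; one plausible policy, shown below purely as an assumption and not as the claimed procedure, is to clamp the target luminance to the maximum that the weakest region can emit, so that the adjusted target is supported by all regions of the screen.

```python
import numpy as np

def adjust_target(target_Y, max_Y_by_region):
    """Clamp the target luminance to the weakest region's achievable maximum."""
    achievable = min(max_Y_by_region.values())
    if target_Y <= achievable:
        return target_Y                      # all regions already support the target
    return achievable                        # adjusted target optical data

if __name__ == "__main__":
    # Invented peak-luminance values (cd/m^2) for regions R1..R9.
    max_Y = {"R1": 250.0, "R2": 242.0, "R3": 248.0,
             "R4": 251.0, "R5": 255.0, "R6": 246.0,
             "R7": 238.0, "R8": 244.0, "R9": 240.0}
    print(adjust_target(230.0, max_Y))       # 230.0 -> already supported
    print(adjust_target(245.0, max_Y))       # 238.0 -> clamped to the weakest region
```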
TW107102449A 2018-01-24 2018-01-24 Screen calibration method and screen calibration system TWI654873B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW107102449A TWI654873B (en) 2018-01-24 2018-01-24 Screen calibration method and screen calibration system


Publications (2)

Publication Number Publication Date
TWI654873B true TWI654873B (en) 2019-03-21
TW201933858A TW201933858A (en) 2019-08-16

Family

ID=66590849

Family Applications (1)

Application Number Title Priority Date Filing Date
TW107102449A TWI654873B (en) 2018-01-24 2018-01-24 Screen calibration method and screen calibration system

Country Status (1)

Country Link
TW (1) TWI654873B (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI780449B (en) * 2020-06-22 2022-10-11 大陸商北京集創北方科技股份有限公司 Color gamut conversion method of OLED display panel and display device and information processing device using the same

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI325270B (en) 2005-08-02 2010-05-21 Kolorific Inc Method and system for automatically calibrating a color display
TWI280409B (en) 2006-04-14 2007-05-01 Asustek Comp Inc Reflective photo device, an electronic apparatus with a built-in camera using the device for providing colorimeter and ambient light sensor functions and its method
TW200908756A (en) 2007-06-11 2009-02-16 Micron Technology Inc Color correcting for ambient light
US20170302915A1 (en) 2014-09-09 2017-10-19 Hewlett-Packard Development Company, L.P. Color calibration
CN106791759A (en) 2016-12-14 2017-05-31 南京巨鲨显示科技有限公司 The bearing calibration of medical display color uniformity and correction system

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113160771A (en) * 2020-01-22 2021-07-23 苏州佳世达电通有限公司 Color correction auxiliary module, display device and color correction method
CN113160771B (en) * 2020-01-22 2022-09-02 苏州佳世达电通有限公司 Color correction auxiliary module, display device and color correction method
TWI739322B (en) * 2020-02-26 2021-09-11 佳世達科技股份有限公司 Color calibrating auxiliary module and color calibrating method

Also Published As

Publication number Publication date
TW201933858A (en) 2019-08-16

Similar Documents

Publication Publication Date Title
CN108346393B (en) Screen correction method and screen correction system
TWI600323B (en) Display device and module and method for compensating pixels of display device
JP4283297B2 (en) Image compensation method
JP5354265B2 (en) Liquid crystal display
TWI654873B (en) Screen calibration method and screen calibration system
CN109246405B (en) Method and system for adjusting uniformity of image tone
US8610781B2 (en) System and method for light compensation in a video panel display
CN112512184B (en) Color-taking illumination control method, device, system and storage medium
TW200908756A (en) Color correcting for ambient light
JP2010057149A (en) Image correction data generation system, image correction data generation method, image correction data generation program, and image correction circuit
TW200813948A (en) Apparatus and method for adaptively adjusting backlight
TW200816158A (en) Method and apparatus of looking for new color temperature point
US11056078B2 (en) Multi-screen color correction method and electronic device using the same
CN104183216A (en) Method and device for controlling brightness of display screen of displayer
JP2017120330A (en) Luminance adjusting device and luminance adjustment program
CN112189337A (en) Image processing apparatus, image processing method, and program
JP6679811B1 (en) Correction image generation system, image control program, and recording medium
JP2008147889A (en) Image processor and method thereof
CN113903306A (en) Compensation method and compensation device of display panel
JP6722366B1 (en) Correction image generation system, image control method, image control program, and recording medium
TWI693591B (en) Method for adjusting uniformity of image color tones and system thereof
JP2011150349A (en) Picture quality adjusting device and image correction data generation program
US20150181188A1 (en) Display device, electronic apparatus, and method for driving display device
KR100897652B1 (en) Output Controller of the Light Source and Method thereof
JP6732144B1 (en) Correction image generation system, image control method, image control program, and recording medium