CN116528068A - Display device, image processing method and device - Google Patents


Info

Publication number
CN116528068A
Authority
CN
China
Prior art keywords: preset, value, images, image, determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210073828.4A
Other languages
Chinese (zh)
Inventor
李广卿
丛晓东
刁玉洁
沈海杰
Current Assignee
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Priority to CN202210073828.4A priority Critical patent/CN116528068A/en
Publication of CN116528068A publication Critical patent/CN116528068A/en

Classifications

    • H — Electricity
    • H04 — Electric communication technique
    • H04N — Pictorial communication, e.g. television
    • H04N 5/144 — Movement detection (details of television systems; picture signal circuitry for the video frequency region)
    • G — Physics
    • G06 — Computing; calculating or counting
    • G06T — Image data processing or generation, in general
    • G06T 7/10 — Segmentation; edge detection (image analysis)
    • G06T 7/20 — Analysis of motion (image analysis)
    • G06T 7/90 — Determination of colour characteristics (image analysis)

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)

Abstract

The embodiments of the application provide a display device, an image processing method, and an image processing apparatus, belonging to the technical field of image processing. The display device comprises a display and a processor connected with the display, and the processor is configured to: acquire RGB gray value information of the edge pixels of each frame in a group of N frames; determine a corresponding motion level according to the RGB gray value information; determine a target overdrive table according to the motion level and preset overdrive tables; and display the next group of N frames according to the target overdrive table. Color dragging on the display device is thereby effectively avoided, and the image quality of the display device is improved.

Description

Display device, image processing method and device
Technical Field
The application relates to the technical field of image processing, and more particularly to a display apparatus, an image processing method, and an image processing device.
Background
When a display device, including a liquid crystal display device, displays a moving image, a color drag phenomenon generally occurs. Since liquid crystal deflection takes time, when the deflection time exceeds the duration of one frame, the human eye perceives the residual image of the previous frame, causing color dragging. Meanwhile, since an image is composed of red, green, and blue (RGB) gray values and different images have different RGB proportions, the response times of the R, G, and B pixels may differ when the liquid crystal display device displays a moving image; if one of them responds more slowly, the color dragging phenomenon occurs.
Currently, an overdrive (OD) algorithm is generally used to mitigate the color dragging of liquid crystal display devices. Specifically, the overdrive algorithm outputs an overdrive value for the transition from a previous gray (Pre Gray) to a target gray (Target Gray) so as to shorten the liquid crystal response time, i.e., to reach the target gray within the duration of one frame, thereby reducing the color dragging of the liquid crystal display device. However, while this reduces color dragging in slow-motion scenes, the phenomenon still appears in fast-motion scenes.
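The lookup-table form of the overdrive algorithm described above can be sketched as follows. The node grid, the table values, and the bilinear interpolation between nodes are illustrative assumptions (the patent gives no concrete numbers); real panels store similar coarse grids in the timing controller.

```python
import numpy as np

# Coarse node grid over the 8-bit gray range (illustrative values).
NODES = np.array([0, 64, 128, 192, 255])
# OD_TABLE[i][j]: drive value for a transition from NODES[i] to NODES[j].
# Rising transitions overshoot the target; falling ones undershoot it.
OD_TABLE = np.array([
    [  0,  80, 150, 210, 255],
    [  0,  64, 140, 205, 255],
    [  0,  48, 128, 200, 255],
    [  0,  32, 110, 192, 255],
    [  0,  16,  96, 180, 255],
])

def overdrive_value(pre_gray: int, target_gray: int) -> int:
    """Bilinearly interpolate the overdrive table at (pre_gray, target_gray)."""
    i = np.interp(pre_gray, NODES, np.arange(len(NODES)))
    j = np.interp(target_gray, NODES, np.arange(len(NODES)))
    i0, j0 = int(i), int(j)
    i1 = min(i0 + 1, len(NODES) - 1)
    j1 = min(j0 + 1, len(NODES) - 1)
    fi, fj = i - i0, j - j0
    top = OD_TABLE[i0, j0] * (1 - fj) + OD_TABLE[i0, j1] * fj
    bot = OD_TABLE[i1, j0] * (1 - fj) + OD_TABLE[i1, j1] * fj
    return int(round(top * (1 - fi) + bot * fi))
```

For the 0 → 128 transition this sketch drives the pixel to 150 rather than 128, modeling the overshoot (line 302 in fig. 3) that compensates for the slow liquid crystal response.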
Disclosure of Invention
The embodiment of the application provides a display device, an image processing method and an image processing device, which can effectively avoid the phenomenon of color dragging of the display device and improve the image quality of the display device.
In a first aspect, embodiments of the present application provide a display device, including:
a display;
a processor coupled to the display, the processor configured to:
acquiring RGB gray value information of edge pixels of each frame of image in the N frames of images;
determining a corresponding motion level according to the RGB gray value information;
determining a target overdrive table according to the motion level and a preset overdrive table, wherein the preset overdrive table is used for representing overdrive values corresponding to RGB gray value conversion of two adjacent frames of images under different motion levels;
And displaying the next group of N frame images of the N frame images according to the target overdrive table.
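The four processor steps above can be sketched end-to-end. The helper callables `edge_fn` and `level_fn`, and the dictionary keyed by motion level, are hypothetical names standing in for the steps the implementations below detail individually:

```python
def select_target_od_table(frames, preset_tables, edge_fn, level_fn):
    """frames: the current group of N frames.
    preset_tables: mapping of motion level -> preset overdrive table.
    Returns the target overdrive table used to display the next N frames."""
    edge_grays = [edge_fn(f) for f in frames]   # step 1: edge-pixel RGB grays
    level = level_fn(edge_grays)                # step 2: motion level
    return preset_tables[level]                 # steps 3-4: table selection
```

Usage with trivial stand-ins: `select_target_od_table(frames, {1: slow_table, 2: fast_table}, edge_fn, level_fn)` returns the table matching the detected motion level.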
In some possible implementations, the processor is specifically configured to: according to the RGB gray scale value information, determining a target gray scale value with the largest difference value in the RGB gray scale values of the edge pixels; acquiring an average value of difference values of target gray values of edge pixels of two adjacent frames of images in the N frames of images; and determining the corresponding motion level according to the average value.
In some possible implementations, the processor is specifically configured to: if the averages are all equal to the preset value, determine that the corresponding motion level is the first motion level; or, if not all of the averages equal the preset value, determine the corresponding motion level according to the number of averages that equal the preset value.
In some possible implementations, the processor is specifically configured to: acquiring RGB gray value information of pixels of each frame of image in the N frames of images; and acquiring RGB gray scale value information of edge pixels of each frame of image in the N frames of images based on a preset edge detection algorithm and the RGB gray scale value information of the pixels.
In some possible implementations, the processor is further configured to: the preset overdrive table is obtained by: acquiring images of preset videos displayed by display equipment under different motion grades and original images of the preset videos, wherein the preset videos comprise two different gray value combinations, and the original images have no color dragging phenomenon; acquiring an average value of gray value differences of corresponding pixels in an image of a preset video displayed by display equipment and an original image; and if the average value of the gray value difference values is larger than the difference value threshold value, adjusting the overdrive value of the preset video until the average value of the gray value difference values is smaller than or equal to the difference value threshold value, and obtaining a preset overdrive table of the corresponding motion level.
In a second aspect, an embodiment of the present application provides an image processing method, applied to a display device, including:
acquiring RGB gray value information of edge pixels of each frame of image in the N frames of images;
determining a corresponding motion level according to the RGB gray value information;
determining a target overdrive table according to the motion level and a preset overdrive table, wherein the preset overdrive table is used for representing overdrive values corresponding to RGB gray value conversion of two adjacent frames of images under different motion levels;
and displaying the next group of N frame images of the N frame images according to the target overdrive table.
In some possible implementations, determining the corresponding motion level according to the RGB gray value information includes: according to the RGB gray scale value information, determining a target gray scale value with the largest difference value in the RGB gray scale values of the edge pixels; acquiring an average value of difference values of target gray values of edge pixels of two adjacent frames of images in the N frames of images; and determining the corresponding motion level according to the average value.
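The motion-level determination just described can be roughly sketched: for each adjacent frame pair, each edge pixel contributes the gray difference of its most-changed channel (the target gray value), the differences are averaged, and the N−1 averages are compared against a preset value. The preset value of 0 and the final level mapping below are assumptions; the patent only states that the level follows the count of averages equal to the preset value.

```python
import numpy as np

PRESET = 0  # assumed: an average of 0 means no movement between two frames

def motion_level(edge_grays: np.ndarray) -> int:
    """Estimate the motion level of a group of N frames.

    edge_grays: array of shape (N, P, 3) holding the RGB gray values of the
    same P edge pixels in each of the N frames. Level mapping (1 = static,
    +1 per moving frame pair) is an illustrative assumption.
    """
    n = edge_grays.shape[0]
    averages = []
    for k in range(n - 1):
        diff = np.abs(edge_grays[k + 1].astype(int) - edge_grays[k].astype(int))
        averages.append(diff.max(axis=1).mean())  # per-pixel largest channel diff
    preset_count = sum(a == PRESET for a in averages)
    if preset_count == len(averages):
        return 1                       # first motion level: no movement at all
    return 1 + (len(averages) - preset_count)
```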
In some possible implementations, determining the corresponding motion level from the average values includes: if the averages are all equal to the preset value, determining that the corresponding motion level is the first motion level; or, if not all of the averages equal the preset value, determining the corresponding motion level according to the number of averages that equal the preset value.
In some possible implementations, obtaining RGB gray value information for edge pixels of each of the N frames of images includes: acquiring RGB gray value information of pixels of each frame of image in the N frames of images; and acquiring RGB gray scale value information of edge pixels of each frame of image in the N frames of images based on a preset edge detection algorithm and the RGB gray scale value information of the pixels.
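The patent leaves the "preset edge detection algorithm" open; the sketch below assumes a Sobel gradient on a luma conversion of the frame and returns the RGB gray values of pixels whose gradient magnitude exceeds a threshold:

```python
import numpy as np

def edge_pixel_grays(rgb: np.ndarray, thresh: float = 64.0) -> np.ndarray:
    """Return the RGB gray values of the edge pixels of one (H, W, 3) frame.

    Sobel-on-luma is an assumed stand-in for the patent's unspecified
    'preset edge detection algorithm'.
    """
    luma = rgb.astype(float) @ np.array([0.299, 0.587, 0.114])
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T

    def conv(img, k):
        # valid 3x3 correlation via shifted sums (no SciPy dependency)
        h, w = img.shape
        out = np.zeros((h - 2, w - 2))
        for di in range(3):
            for dj in range(3):
                out += k[di, dj] * img[di:h - 2 + di, dj:w - 2 + dj]
        return out

    gx, gy = conv(luma, kx), conv(luma, ky)
    mag = np.hypot(gx, gy)
    mask = np.zeros(luma.shape, bool)
    mask[1:-1, 1:-1] = mag > thresh
    return rgb[mask]          # (K, 3): gray values of the K edge pixels
```

On a frame with a sharp bright/dark boundary, the returned pixels cluster along that boundary, which is exactly where the patent applies its motion analysis.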
In some possible implementations, the preset overdrive table is obtained by: acquiring images of preset videos displayed by display equipment under different motion grades and original images of the preset videos, wherein the preset videos comprise two different gray value combinations, and the original images have no color dragging phenomenon; acquiring an average value of gray value differences of corresponding pixels in an image of a preset video displayed by display equipment and an original image; and if the average value of the gray value difference values is larger than the difference value threshold value, adjusting the overdrive value of the preset video until the average value of the gray value difference values is smaller than or equal to the difference value threshold value, and obtaining a preset overdrive table of the corresponding motion level.
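The calibration loop for one overdrive-table entry might be sketched as follows. The `capture` callable is a hypothetical stand-in for the measurement step the patent describes: displaying the preset video driven with a candidate overdrive value, photographing it, and comparing the captured grays against the original image.

```python
def calibrate_od_value(initial_od, target_gray, capture,
                       thresh=2.0, max_iter=512):
    """Tune one overdrive entry until the displayed gray matches the original.

    capture(od): assumed interface returning the average gray actually
    reached by the panel in one frame when driven with overdrive value od.
    While the average gray difference exceeds the threshold, the overdrive
    value is nudged toward the target, as the calibration procedure requires.
    """
    od = initial_od
    for _ in range(max_iter):
        reached = capture(od)
        if abs(reached - target_gray) <= thresh:
            break
        od += 1 if reached < target_gray else -1
    return od
```

With a simulated sluggish panel that only reaches 60% of the drive level in one frame, the loop raises the entry until the captured gray falls within the threshold of the target.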
In a third aspect, an embodiment of the present application provides an image processing apparatus, applied to a display device, including:
The first acquisition module is used for acquiring RGB gray value information of edge pixels of each frame of image in the N frames of images;
the first determining module is used for determining corresponding motion levels according to the RGB gray value information;
the second determining module is used for determining a target overdrive table according to the motion level and a preset overdrive table, wherein the preset overdrive table is used for representing overdrive values corresponding to RGB gray value conversion of two adjacent frames of images under different motion levels;
and the display module is used for displaying the next group of N frame images of the N frame images according to the target overdrive table.
In some possible implementations, the first determining module is specifically configured to: according to the RGB gray scale value information, determining a target gray scale value with the largest difference value in the RGB gray scale values of the edge pixels; acquiring an average value of difference values of target gray values of edge pixels of two adjacent frames of images in the N frames of images; and determining the corresponding motion level according to the average value.
In some possible implementations, the first determining module, when configured to determine the corresponding motion level from the average values, is specifically configured to: if the averages are all equal to the preset value, determine that the corresponding motion level is the first motion level; or, if not all of the averages equal the preset value, determine the corresponding motion level according to the number of averages that equal the preset value.
In some possible implementations, the first obtaining module is specifically configured to: acquiring RGB gray value information of pixels of each frame of image in the N frames of images; and acquiring RGB gray scale value information of edge pixels of each frame of image in the N frames of images based on a preset edge detection algorithm and the RGB gray scale value information of the pixels.
In some possible implementations, the image processing apparatus further includes a second obtaining module configured to obtain the preset overdrive table by: acquiring images of preset videos displayed by display equipment under different motion grades and original images of the preset videos, wherein the preset videos comprise two different gray value combinations, and the original images have no color dragging phenomenon; acquiring an average value of gray value differences of corresponding pixels in an image of a preset video displayed by display equipment and an original image; and if the average value of the gray value difference values is larger than the difference value threshold value, adjusting the overdrive value of the preset video until the average value of the gray value difference values is smaller than or equal to the difference value threshold value, and obtaining a preset overdrive table of the corresponding motion level.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium having stored therein computer program instructions which, when executed, implement an image processing method as described in the second aspect of the present application.
In a fifth aspect, embodiments of the present application provide a computer program product comprising a computer program which, when executed by a processor, implements the image processing method according to the second aspect of the present application.
With the display device, image processing method, and apparatus provided by the application, RGB gray value information of the edge pixels of each frame in a group of N frames is obtained; a corresponding motion level is determined from that information; a target overdrive table is determined from the motion level and preset overdrive tables; and the next group of N frames is displayed according to the target overdrive table. Because the motion level is determined from the RGB gray values of the edge pixels, and the corresponding preset overdrive table is then used to display the images, the overdrive value can be adjusted dynamically; the color dragging of the display device is thus effectively avoided and its image quality improved.
These and other aspects of the application will be apparent from and elucidated with reference to the embodiment(s) described hereinafter.
Drawings
To more clearly illustrate the embodiments of the present application or the related art, the drawings required by the embodiments or the related-art descriptions are briefly introduced below. The drawings described below are only some embodiments of the present application; other drawings may be derived from them by those of ordinary skill in the art.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a user according to an embodiment of the present application;
FIG. 2 is a block diagram of a hardware configuration of a display device according to an embodiment of the present application;
FIG. 3 is a schematic diagram illustrating gray value conversion between adjacent frame images according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a color dragging phenomenon according to an embodiment of the present disclosure;
FIG. 5 is a flowchart of an image processing method according to an embodiment of the present disclosure;
fig. 6 is a schematic diagram of a camera according to an embodiment of the present application capturing an image of a preset video;
FIG. 7 is a flowchart of an image processing method according to another embodiment of the present application;
fig. 8 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application.
Detailed Description
For clarity of the purposes, embodiments, and advantages of the present application, the exemplary embodiments are described below clearly and completely with reference to the accompanying drawings. The described exemplary embodiments are only some, not all, of the embodiments of the present application.
Based on the exemplary embodiments described herein, all other embodiments obtainable by one of ordinary skill in the art without inventive effort fall within the scope of the appended claims. Furthermore, while the disclosure is presented in the context of one or more exemplary embodiments, it should be appreciated that individual aspects of the disclosure may each constitute a complete embodiment on their own.
It should be noted that the brief description of the terms in the present application is only for convenience in understanding the embodiments described below, and is not intended to limit the embodiments of the present application. Unless otherwise indicated, these terms should be construed in their ordinary and customary meaning.
The terms "first," "second," "third," and the like in the description, the claims, and the above drawings are used to distinguish similar objects or entities and do not necessarily describe a particular sequence or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances, such that the embodiments of the application can, for example, be practiced in sequences other than those illustrated or described herein.
Furthermore, the terms "comprise" and "have," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to those elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" as used in this application refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the function associated with that element.
The term "remote control" as used in this application refers to a component of an electronic device (such as the display device disclosed in this application) that can typically be controlled wirelessly over a relatively short distance. The remote control typically connects to the electronic device using infrared and/or radio frequency (RF) signals and/or Bluetooth, and may also include functional modules such as WiFi, wireless USB, Bluetooth, and motion sensors. For example, a hand-held touch remote control replaces most of the physical built-in hard keys of a general remote control device with a touch-screen user interface.
The term "gesture" as used herein refers to a user action by a change in hand shape or hand movement, etc., used to express an intended idea, action, purpose, or result.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a user according to an embodiment of the present application. As shown in fig. 1, the user turns on the display device 200 to watch the video, and the display device 200 displays an image corresponding to the video.
As also shown in fig. 1, the display device 200 is also in data communication with the server 400 via various communication means. The display device 200 may establish communication connections through a local area network (LAN), a wireless local area network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display device 200. The server 400 may be one cluster or multiple clusters and may include one or more types of servers; web service content such as video on demand and advertising services is provided through the server 400.
The display device 200 may be a liquid crystal display, an OLED display, a projection display device. The particular display device type, size, resolution, etc. are not limited, and those skilled in the art will appreciate that the display device 200 may be modified in performance and configuration as desired. The display apparatus 200 may additionally provide a smart network television function of a computer support function, including, but not limited to, a network television, a smart television, an Internet Protocol Television (IPTV), etc., in addition to the broadcast receiving television function.
Fig. 2 is a block diagram of a hardware configuration of a display device according to an embodiment of the present application. As shown in fig. 2, in some embodiments, the display device 200 includes at least one of a controller 250, a communicator 220, a detector 230, an input/output interface 255, a display 275, a memory 260, a power supply 290, and a user interface 265.
In some embodiments, the display 275 is configured to receive image signals from the first processor output, and to display video content and images and components of the menu manipulation interface.
In some embodiments, display 275 includes a display screen assembly for presenting pictures, and a drive assembly for driving the display of images.
In some embodiments, the display 275 is used to present a user-manipulated UI interface generated in the display device 200 and used to control the display device 200.
In some embodiments, depending on the type of display 275, a drive assembly for driving the display is also included.
In some embodiments, display 275 is a projection display and may further include a projection device and a projection screen.
In some embodiments, communicator 220 is a component for communicating with external devices or external servers according to various communication protocol types. For example, the communicator 220 may include at least one of a WiFi chip, a Bluetooth communication protocol chip, a wired Ethernet communication protocol chip, other network or near-field communication protocol chips, and an infrared receiver. The WiFi chip corresponds to the WiFi module 221, which may also be referred to as a wireless module; the Bluetooth communication protocol chip corresponds to the Bluetooth module 222; the wired Ethernet communication protocol chip corresponds to the wired Ethernet module 223.
In some embodiments, the display device 200 may establish control signal and data signal transmission and reception between the communicator 220 and an external control device or a content providing device.
In some embodiments, the user interface 265 may be used to receive infrared control signals from a control device (e.g., an infrared remote control, etc.).
In some embodiments, the detector 230 is a component the display device 200 uses to capture signals from, or interact with, the external environment.
In some embodiments, the detector 230 includes an optical receiver, a sensor for capturing the intensity of ambient light, so that display parameters can be adapted to the captured ambient light.
In some embodiments, the detector 230 may further include an image collector 232, such as a camera, a video camera, etc., which may be used to collect external environmental scenes, collect attributes of a user or interact with a user, adaptively change display parameters, and recognize a user gesture to implement a function of interaction with the user.
In some embodiments, the detector 230 may also include a sound collector 231 or the like, such as a microphone, that may be used to receive the user's sound.
In some embodiments, as shown in fig. 2, the input/output interface 255 is configured to enable data transfer between the controller 250 and external other devices or other controllers 250.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored on the memory. The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command to select to display a UI object on the display 275, the controller 250 may perform an operation related to the object selected by the user command.
As shown in fig. 2, the controller 250 includes at least one of a random access memory 251 (Random Access Memory, RAM), a read-only memory 252 (Read-Only Memory, ROM), a video processor 270, an audio processor 280, other processors 253 (e.g., a graphics processor (Graphics Processing Unit, GPU)), a processor 254 (Central Processing Unit, CPU), a communication interface (Communication Interface), and a communication bus 256 (Bus) that connects the components.
In some embodiments, RAM 251 is used to store temporary data for the operating system or other running programs.
In some embodiments, ROM 252 is used to store instructions for various system boots.
In some embodiments, ROM 252 is used to store a basic input output system (Basic Input Output System, BIOS), which comprises a driver program and a boot operating system used to complete the power-on self-test of the system, the initialization of each functional module in the system, and the system's basic input/output.
In some embodiments, upon receipt of the power-on signal, the display device 200 powers up and the CPU runs the system boot instructions in ROM 252, copying the operating system's temporary data stored in memory into RAM 251 in order to start or run the operating system. After the operating system starts, the CPU copies the temporary data of the various applications in memory into RAM 251 to facilitate starting or running them.
In some embodiments, CPU processor 254 is used to execute operating system and application program instructions stored in memory. And executing various application programs, data and contents according to various interactive instructions received from the outside, so as to finally display and play various audio and video contents.
In some exemplary embodiments, the CPU processor 254 may comprise a plurality of processors, including one main processor and one or more sub-processors. The main processor performs some operations of the display apparatus 200 in the pre-power-up mode and/or displays pictures in the normal mode; the one or more sub-processors handle operations in standby mode and the like.
In some embodiments, the graphics processor 253 is configured to generate various graphic objects and includes an arithmetic unit, which performs operations according to the various interactive instructions input by the user and displays the objects according to their display attributes, and a renderer, which renders the objects produced by the arithmetic unit for display on the display.
In some embodiments, video processor 270 is configured to receive external video signals and perform video processing such as decompression, decoding, scaling, noise reduction, frame-rate conversion, resolution conversion, and image composition according to the standard codec protocol of the input signal, producing a signal that can be displayed or played directly on the display device 200.
In some embodiments, the graphics processor 253 may be integrated with the video processor or configured separately. The integrated configuration can process the graphics signals output to the display, while the separate configuration can perform different functions, for example a GPU + FRC (Frame Rate Conversion) architecture.
In some embodiments, the audio processor 280 is configured to receive an external audio signal, decompress and decode the audio signal according to a standard codec protocol of an input signal, and perform noise reduction, digital-to-analog conversion, and amplification processing, so as to obtain a sound signal that can be played in a speaker.
In some embodiments, video processor 270 may include one or more chips. The audio processor may also comprise one or more chips.
In some embodiments, video processor 270 and audio processor 280 may be separate chips or may be integrated together with the controller in one or more chips.
In some embodiments, the audio output, under the control of the controller 250, receives the sound signal output by the audio processor 280. Besides the speaker carried by the display device 200 itself, the sound signal may be output to the sound output terminal of an external device; a near-field communication module may also be included in the communication interface.
The power supply 290 supplies power input from an external source to the display device 200 under the control of the controller 250. The power supply 290 may include a built-in power circuit installed inside the display device 200, or an external power supply together with a power interface providing the external power source to the display device 200.
The user interface 265 is used to receive an input signal from a user and then transmit the received user input signal to the controller 250. The user input signal may be a remote control signal received through an infrared receiver, and various user control signals may be received through a network communication module.
The memory 260 includes memory storing various software modules for driving the display device 200.
The base module is a bottom software module for signal communication between the various hardware in the display device 200 and for sending processing and control signals to the upper modules. The detection module is used for collecting various information from various sensors or user input interfaces and carrying out digital-to-analog conversion and analysis management.
When a display device, including a liquid crystal display device, displays a moving image, a color drag phenomenon generally occurs. Taking a liquid crystal television as an example, color dragging has long been a problem of liquid crystal televisions and is caused by their optical principle: liquid crystal deflection takes time, and when the deflection time exceeds the duration of one frame, the human eye perceives the residual image of the previous frame, causing color dragging. Meanwhile, since an image is composed of RGB gray values and the RGB proportions of different images differ, the response times of the R, G, and B pixels may differ when the liquid crystal television displays a moving image; if one of them responds more slowly, the color dragging phenomenon occurs.
Currently, overdrive is generally used to improve the color dragging phenomenon of a liquid crystal display device. Illustratively, fig. 3 is a schematic diagram of gray value conversion between adjacent frame images according to an embodiment of the present application. As shown in fig. 3, line 301 represents the liquid crystal deflection process from the gray value of the Mth frame image (i.e., the initial gray) to the gray value of the M+1th frame image (i.e., the target gray). It can be understood that, when no overdrive algorithm is adopted (i.e., there is no overdrive value), the deflection of the liquid crystal takes longer than one frame, so the liquid crystal has not deflected into place when the frame time arrives. The overdrive algorithm applies a gray value higher than the target gray, called the overdrive value, so that the target gray is reached from the initial gray within one frame of image, thereby solving the color dragging problem; line 302 in fig. 3 shows this case, and the corresponding overdrive value is the optimal overdrive value. However, if the overdrive value is too high, as with line 303 in fig. 3, the gray value of the next frame of image exceeds the target gray value, and the color dragging phenomenon may still occur. The existing overdrive algorithm has only a single fixed overdrive lookup table, which cannot make all gray value conversions reach the target gray value within one frame time. For the same gray conversion scene, a fixed overdrive value may lighten the color dragging in a slow scene, but the color dragging phenomenon still occurs in a fast motion scene, so that the human eye perceives the color dragging and the user experience is poor. Illustratively, when a moving image is played, the motion edge color dragging phenomenon is more noticeable, particularly at the boundaries of bright and dark scenes.
Fig. 4 is a schematic diagram of a color dragging phenomenon according to an embodiment of the present application. As shown in fig. 4, in scene 1, a and b in the white frame 401 respectively correspond to boundaries of a bright and dark scene, and the RGB gray values and gray value differences respectively corresponding to a and b are the data corresponding to scene 1 in table 1; in scene 2, c and d in the white frame 402 respectively correspond to boundaries of a bright and dark scene, and the RGB gray values and gray value differences corresponding to c and d are, for example, the data corresponding to scene 2 in table 1. It can be appreciated that the RGB gray values at the boundaries of bright and dark scenes differ significantly, and the RGB differences themselves differ; when the image moves, the RGB gray values differ in response time, and where the gray value difference is large (e.g., the R gray value difference in scene 1) the response time is long, resulting in a color dragging phenomenon at the moving edge, where scene 1 appears to drag red and scene 2 appears to drag green.
TABLE 1
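The behavior of lines 301 to 303 in fig. 3 can be illustrated with a toy model. The first-order response below is an assumption made purely for illustration (real liquid crystal response curves are panel-specific); it shows that driving the target gray directly undershoots after one frame, a suitable overdrive value lands on the target, and too high a value overshoots:

```python
# Toy illustration of lines 301-303 in fig. 3, assuming a simple first-order
# liquid crystal response (an illustrative assumption only): each frame the
# gray value moves a fixed fraction of the way toward the driven value.

def response_after_one_frame(initial, driven, rate=0.5):
    """Gray value reached after one frame when the panel is driven to `driven`."""
    return initial + rate * (driven - initial)
```

With `rate = 0.5` and an initial gray of 64 targeting 128: driving 128 reaches only 96 (line 301, undershoot); driving 192 reaches exactly 128 (line 302, the optimal overdrive value); driving 255 reaches 159.5 (line 303, overshoot).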
Based on the above problems, the present application provides a display device, an image processing method, and an image processing apparatus, which add motion level calculation for different motion scenes and further dynamically adjust the overdrive value, so that the color dragging phenomenon of the display device can be effectively avoided and the image quality of the display device is improved.
The following uses detailed examples to illustrate how the present application performs image processing.
Fig. 5 is a flowchart of an image processing method according to an embodiment of the present application, which is applied to a display device, where the display device includes a display and a processor connected to the display. As shown in fig. 5, the processor in the display device is configured to perform the steps of:
in S501, RGB gray-scale value information of edge pixels of each of N frame images is acquired.
In the embodiment of the present application, the N frames of images are, for example, 5 frames of images. The 5 frames of images may be continuously captured and stored in an image memory of the display device. Referring to fig. 4, taking scene 1 as an example, a and b in the white frame 401 respectively correspond to the boundaries of a bright and dark scene; accordingly, the corresponding edge pixels and the RGB gray value information of the edge pixels may be determined, where the RGB gray value information includes an R gray value, a G gray value, and a B gray value. The RGB gray value information is, for example, the R gray value, G gray value, and B gray value corresponding to a in table 1. For how to obtain the RGB gray values of the edge pixels of each frame of the N frames of images, reference may be made to the following embodiments, which are not described here.
In S502, a corresponding motion level is determined according to the RGB gray value information.
For example, the motion levels are 5 motion levels, namely motion level 0, motion level 1, motion level 2, motion level 3, and motion level 4. It can be understood that different motion levels correspond to different motion speeds of edge pixels in the image. In this step, after the RGB gray value information of the edge pixels of each frame of the N frames of images is obtained, the corresponding motion level may be determined according to the RGB gray value information. For how to determine the corresponding motion level according to the RGB gray value information, reference may be made to the following embodiments, which are not described here.
In S503, a target overdrive table is determined according to the motion level and the preset overdrive table.
The preset overdrive table is used for representing overdrive values corresponding to RGB gray value conversion of two adjacent frames of images under different motion levels.
In this step, the preset overdrive table is stored in the display device in advance, and different overdrive tables correspond to different motion levels. For example, the motion levels are, for example, 5 motion levels, and accordingly, there are 5 sets of preset overdrive tables, namely overdrive table 0, overdrive table 1, overdrive table 2, overdrive table 3 and overdrive table 4, wherein overdrive table 0 can be understood as a default overdrive table of the display device. For how to obtain the preset overdrive table, reference may be made to the following embodiments, which are not repeated here. It will be appreciated that after the motion level is determined, a corresponding preset overdrive table may be obtained according to the motion level, i.e. the target overdrive table is obtained. Illustratively, if it is determined that the motion level is motion level 3, it may be determined that the target overdrive table is overdrive table 3; if it is determined that the motion level is the motion level 0, it may be determined that the target overdrive table is the overdrive table 0, i.e., a default overdrive table of the display device.
In S504, a next set of N frame images of the N frame images is displayed according to the target overdrive table.
Illustratively, the N frames of images are, for example, 5 frames of images; if the target overdrive table determined from the 5 frames of images is overdrive table 3, the next group of 5 frames of images, that is, the 6th to 10th frame images, is displayed according to the overdrive values in overdrive table 3. By analogy, steps S501 to S504 are repeatedly performed, and each next group of N frames of images is displayed according to the target overdrive table determined from the current N frames of images.
According to the image processing method, RGB gray value information of edge pixels of each frame of image in N frames of images is obtained; determining a corresponding motion level according to the RGB gray value information; determining a target overdrive table according to the motion level and a preset overdrive table; and displaying the next group of N frame images of the N frame images according to the target overdrive table. According to the method and the device for displaying the image, the corresponding motion level is determined based on the RGB gray value information of the edge pixels in the image, and then the image is displayed by using the corresponding preset overdrive table, namely, the overdrive value can be dynamically adjusted, so that the phenomenon of color dragging of the display device can be effectively avoided, and the image quality of the display device is improved.
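The S501 to S504 loop above can be sketched as follows. All function and variable names are illustrative assumptions; in the application this logic runs inside the display device's processor:

```python
# A minimal sketch of the S501-S504 loop: compute the motion level for the
# current group of N frames, then select the preset overdrive table used to
# display the NEXT group of N frames.

def select_target_od_table(frames, od_tables, compute_motion_level):
    """Return the target overdrive table for the next group of N frames.

    frames: the current group of N frame images (S501 input)
    od_tables: dict mapping motion level -> preset overdrive table (S503)
    compute_motion_level: callable implementing S501-S502
    """
    level = compute_motion_level(frames)       # S501 + S502
    # S503: overdrive table 0 is the display's default table
    return od_tables.get(level, od_tables[0])
```

A caller would invoke this once per group of N frames and apply the returned table while displaying the following group (S504).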
Optionally, on the basis of the above embodiment, the processor of the display device is further configured to: the preset overdrive table is obtained by: acquiring images of preset videos displayed by the display equipment under different motion grades and original images of the preset videos, wherein the preset videos comprise two different gray value combinations, and the original images have no color dragging phenomenon; acquiring an average value of gray value differences of corresponding pixels in an image of a preset video displayed by the display equipment and the original image; and if the average value of the gray value differences is larger than a difference threshold value, adjusting the overdrive value of the preset video until the average value of the gray value differences is smaller than or equal to the difference threshold value, and obtaining a preset overdrive table of the corresponding motion level.
Fig. 6 is a schematic diagram of a camera capturing an image of a preset video according to an embodiment of the present application. As shown in fig. 6, the background 601 of the preset video is one gray value (the gray values may be divided into n levels, n = 8, 16, 32, ..., 255), the middle mullion 602 is another gray value (likewise divided into n levels), and the mullion 602 moves from left to right at different speeds, which are divided into 5 motion levels according to the speed. Taking n = 8 as an example, 8 × 8 × 5 images can be captured by the camera (the background 601 corresponds to 8 grays, the mullion 602 corresponds to 8 grays, and there are 5 motion levels). An image captured by the camera is denoted as E and may exhibit color dragging; at the same time the camera captures the image, the television saves the current image, denoted as F. Because the image F has not been displayed by the display, it belongs to the original image and therefore shows no color dragging. Comparing the image E with the image F, the two images are subtracted pixel by pixel (pixels at the same position are subtracted; because for a gray image R = G = B, any one of the RGB gray values can be used for the calculation). Denoting the gray value difference of each pixel as Xi, the average gray value difference of the images is: Y = (X1 + X2 + ... + Xn)/n, where n here is the number of pixels. Defining the gray value difference threshold of the image as α, if Y is greater than α, the gray boundary of this motion level exhibits color dragging. In the case where Y is greater than α, the OD value under this gray value difference is automatically adjusted until no color dragging occurs.
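The comparison step above can be sketched as follows; the flat-list image layout and function names are illustrative assumptions:

```python
# Sketch of the calibration comparison above: compute the average gray value
# difference Y between the captured image E and the original image F, and flag
# the motion level for OD adjustment when Y exceeds the threshold alpha.
# Images are flat lists of single-channel gray values (R = G = B for a gray image).

def mean_gray_difference(e_pixels, f_pixels):
    """Y = (X1 + X2 + ... + Xn) / n, with Xi = |Ei - Fi| and n pixels."""
    diffs = [abs(e - f) for e, f in zip(e_pixels, f_pixels)]
    return sum(diffs) / len(diffs)

def needs_od_adjustment(e_pixels, f_pixels, alpha):
    """True when the gray boundary of this motion level shows color dragging."""
    return mean_gray_difference(e_pixels, f_pixels) > alpha
```

For example, pixel differences of 2 and 4 give Y = 3, which triggers adjustment only if the threshold α is below 3.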
By adjusting the OD values until the color dragging disappears for all combinations of motion levels and gray value differences, multiple groups of corresponding OD tables can be obtained. If there are, for example, 5 motion levels, 5 groups of OD tables are generated and stored in the display device; that is, the display device obtains the preset overdrive tables. For example, table 2 is the OD table (i.e., OD Table 0) corresponding to motion level 0, where the first vertical column is the gray value of the previous frame image and the first horizontal row is the gray value of the current frame image. For example, with 64 in the first vertical column (the gray value of the previous frame image) and 128 in the first horizontal row (the gray value of the current frame image), the gray value needs to be converted from 64 to 128 from the previous frame image to the current frame image, and the corresponding OD value is B3. Table 3 is the OD table (i.e., OD Table 4) corresponding to motion level 4, where Gain1 through Gain25 represent the OD compensation coefficients of motion level 4 relative to motion level 0.
TABLE 2
OD Table 0 0 64 128 192 255
0 A1 A2 A3 A4 A5
64 B1 B2 B3 B4 B5
128 C1 C2 C3 C4 C5
192 D1 D2 D3 D4 D5
255 E1 E2 E3 E4 E5
TABLE 3
OD Table 4 0 64 128 192 255
0 A1*Gain1 A2*Gain2 A3*Gain3 A4*Gain4 A5*Gain5
64 B1*Gain6 B2*Gain7 B3*Gain8 B4*Gain9 B5*Gain10
128 C1*Gain11 C2*Gain12 C3*Gain13 C4*Gain14 C5*Gain15
192 D1*Gain16 D2*Gain17 D3*Gain18 D4*Gain19 D5*Gain20
255 E1*Gain21 E2*Gain22 E3*Gain23 E4*Gain24 E5*Gain25
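The structure of Tables 2 and 3 can be sketched as a lookup keyed by the (previous gray, current gray) pair, with higher motion levels derived from the level-0 table via per-cell gains. The numeric entries and gains below are placeholders, since the actual values are panel-specific:

```python
# Sketch of the OD table lookup implied by Tables 2 and 3: the level-0 table is
# indexed by (previous frame gray, current frame gray); a higher motion level's
# table multiplies each level-0 entry by a per-cell compensation gain.

def od_value(table, prev_gray, cur_gray):
    """Look up the OD value for a prev -> cur gray transition.

    Real hardware would interpolate between the sampled grays
    (0, 64, 128, 192, 255); this sketch assumes exact table keys.
    """
    return table[(prev_gray, cur_gray)]

def scaled_od_table(base_table, gains):
    """Build a higher-level table (Table 3) from the level-0 table (Table 2)."""
    return {key: base_table[key] * gains[key] for key in base_table}
```

For instance, if the level-0 entry for the 64-to-128 transition (B3 in Table 2) were 150 and its gain (Gain8 in Table 3) were 1.5, the level-4 entry would be 225.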
The image processing method provided in the embodiment of the present application is described in detail below with reference to specific steps.
Fig. 7 is a flowchart of an image processing method according to another embodiment of the present application. As shown in fig. 7, the processor in the display device is configured to perform the steps of:
In this embodiment, step S501 in fig. 5 may further include two steps S701 and S702 as follows:
in S701, RGB gray-scale value information of pixels of each of N frame images is acquired.
In this step, the processor may obtain RGB gray value information of pixels of each frame of image in the N frames of images according to a preset statistical method. For the preset statistical method used, reference may be made to the related art, and the present application is not limited.
In S702, RGB gray-scale value information of an edge pixel of each frame of N frame images is obtained based on a preset edge detection algorithm and the RGB gray-scale value information of the pixel.
For example, after the RGB gray value information of the pixels of each frame of the N frames of images is obtained, a preset edge detection algorithm is adopted: for each pixel of the image, the current pixel is taken as the center pixel; the maximum gray value among the RGB gray values of the center pixel is first obtained (for example, the R gray value is the maximum); then the differences between the R gray value of the center pixel and the R gray values of the four adjacent pixels above, below, to the left of, and to the right of the center pixel are calculated, yielding four difference values; and the adjacent pixel corresponding to the largest of the four difference values is determined to be an edge pixel. For example, referring to fig. 4, in scene 1, the edge pixel corresponding to a (whose RGB gray value information is denoted as R_A, G_A, B_A) and the edge pixel corresponding to b (whose RGB gray value information is denoted as R_B, G_B, B_B) may be obtained. The gray value differences of the edge pixels corresponding to a and b range from 0 to 255. The gray value difference threshold of the edge pixels is defined as T, for example 50, and the value of T may be adjusted for the displays of different display devices. According to experiments, color dragging may occur when any of the gray value differences between R_A and R_B, G_A and G_B, or B_A and B_B is greater than 50.
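The edge detection rule described above can be sketched as follows; the image layout and function name are assumptions, not the application's actual implementation:

```python
# Runnable sketch of the preset edge detection rule above: take the center
# pixel's dominant RGB channel, difference it against the same channel of the
# four neighbors, and treat the neighbor with the largest difference as an
# edge pixel when that difference exceeds the threshold T.

T = 50  # gray value difference threshold; tunable per display panel

def find_edge_neighbor(image, x, y, threshold=T):
    """image: 2D list of (R, G, B) tuples, indexed image[y][x].

    Returns ((nx, ny), diff) for the detected edge pixel, or None if no
    neighbor's difference exceeds the threshold.
    """
    channel = max(range(3), key=lambda c: image[y][x][c])  # dominant channel
    center = image[y][x][channel]
    best = None
    for nx, ny in ((x, y - 1), (x, y + 1), (x - 1, y), (x + 1, y)):
        if 0 <= ny < len(image) and 0 <= nx < len(image[0]):
            diff = abs(center - image[ny][nx][channel])
            if best is None or diff > best[1]:
                best = ((nx, ny), diff)
    return best if best and best[1] > threshold else None
```

On a uniform region the largest neighbor difference is 0, so no edge pixel is reported; at a bright/dark boundary like a and b in fig. 4 the neighbor across the boundary is returned.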
In this embodiment, the step S502 in fig. 5 may further include the following three steps S703 to S705:
in S703, a target gradation value having the largest difference among the RGB gradation values of the edge pixels is determined based on the RGB gradation value information.
In this step, after RGB gray-scale value information of an edge pixel of each of the N frames of images is obtained, a target gray-scale value having the largest difference among the RGB gray-scale values of the edge pixels may be determined according to the RGB gray-scale value information. For example, referring to the data corresponding to scene 1 in table 1, if the gray value difference is the greatest R gray value difference, the target gray value is determined to be the R gray value.
In S704, an average value of differences in target gradation values of edge pixels of two adjacent frame images among the N frame images is acquired.
For example, if the N frames of images are 5 frames of images and the target gray value of the edge pixels is, for example, the R gray value, then the R gray value of an edge pixel of the 1st frame image among the 5 frames is denoted as R_A1, and the R gray values of the edge pixels at the same position in the 2nd to 5th frame images are denoted as R_A2, R_A3, R_A4, and R_A5 in sequence. The average value of the differences of the R gray values of the edge pixels of each pair of adjacent frame images is then calculated, so that 4 average values can be obtained.
In S705, a corresponding motion level is determined from the average value.
In this step, after the average value of the differences of the target gray values of the edge pixels of two adjacent frames of the N frames of images is obtained, the corresponding motion level may be determined according to the average value.
Further, the processor is specifically configured to: if the average values are all preset values, determining that the corresponding motion grades are first motion grades; or if the average value is not the preset value, determining the corresponding motion level according to the number of the average value which is the preset value.
For example, the preset value is 0, and the motion levels are, for example, 5 motion levels, namely motion level 0, motion level 1, motion level 2, motion level 3, and motion level 4. Based on the example in step S704, 4 average values may be obtained. If all 4 average values are 0, the corresponding motion level is determined to be motion level 0; if 3 of the average values are 0, the corresponding motion level is determined to be motion level 1; if 2 of the average values are 0, the corresponding motion level is determined to be motion level 2; if 1 of the average values is 0, the corresponding motion level is determined to be motion level 3; and if none of the 4 average values is 0, the corresponding motion level is determined to be motion level 4.
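The rule above can be sketched as follows; the function name is an illustrative assumption:

```python
# Sketch of the motion level rule above: with N = 5 frames there are 4 averages
# of adjacent-frame target gray value differences; counting how many of them
# equal the preset value 0 yields the level (4 zeros -> level 0, 3 zeros ->
# level 1, ..., no zeros -> level 4).

def motion_level(averages, preset=0):
    """averages: the 4 per-pair average differences obtained in step S704."""
    zeros = sum(1 for a in averages if a == preset)
    return len(averages) - zeros
```

Note the degenerate reading of this rule: the level is simply the number of adjacent-frame pairs whose average difference is nonzero, i.e., faster motion leaves fewer unchanged pairs.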
In S706, a target overdrive table is determined according to the motion level and the preset overdrive table.
The preset overdrive table is used for representing overdrive values corresponding to RGB gray value conversion of two adjacent frames of images under different motion levels.
The specific implementation process of this step may be referred to as S503, and will not be described herein.
In S707, the next set of N frame images of the N frame images is displayed according to the target overdrive table.
The specific implementation process of this step may be referred to as related description of S504, which is not repeated here.
According to the image processing method, RGB gray value information of pixels of each frame of image in N frames of images is obtained, and the RGB gray value information of edge pixels of each frame of image in N frames of images is obtained based on a preset edge detection algorithm and the RGB gray value information of the pixels; according to the RGB gray scale value information, determining a target gray scale value with the largest difference value in the RGB gray scale values of the edge pixels; acquiring an average value of difference values of target gray values of edge pixels of two adjacent frames of images in the N frames of images; determining a corresponding motion level according to the average value; determining a target overdrive table according to the motion level and a preset overdrive table; and displaying the next group of N frame images of the N frame images according to the target overdrive table. According to the method and the device for displaying the image, the corresponding motion level is determined based on the RGB gray value information of the edge pixels in the image, and then the image is displayed by using the corresponding preset overdrive table, namely, the overdrive value can be dynamically adjusted, so that the phenomenon of color dragging of the display device can be effectively avoided, and the image quality of the display device is improved.
The following are device embodiments of the present application, which may be used to perform method embodiments of the present application. For details not disclosed in the device embodiments of the present application, please refer to the method embodiments of the present application.
Fig. 8 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application. The image processing apparatus is applied to a display device. As shown in fig. 8, an image processing apparatus 800 provided in an embodiment of the present application includes: a first acquisition module 801, a first determination module 802, a second determination module 803, and a display module 804. Wherein:
a first obtaining module 801, configured to obtain RGB gray value information of edge pixels of each frame of N frames of images.
The first determining module 802 is configured to determine a corresponding motion level according to the RGB gray value information.
The second determining module 803 is configured to determine a target overdrive table according to the motion level and a preset overdrive table, where the preset overdrive table is used to characterize overdrive values corresponding to RGB gray value conversion of two adjacent frames of images under different motion levels.
A display module 804, configured to display a next set of N frame images of the N frame images according to the target overdrive table.
In some embodiments, the first determining module 802 may be specifically configured to: according to the RGB gray scale value information, determining a target gray scale value with the largest difference value in the RGB gray scale values of the edge pixels; acquiring an average value of difference values of target gray values of edge pixels of two adjacent frames of images in the N frames of images; and determining the corresponding motion level according to the average value.
In some embodiments, the first determining module 802, when configured to determine the corresponding motion level according to the average value, may be specifically configured to: if the average values are all preset values, determining that the corresponding motion grades are first motion grades; or if the average value is not the preset value, determining the corresponding motion level according to the number of the average value which is the preset value.
In some embodiments, the first obtaining module 801 may be specifically configured to: acquiring RGB gray value information of pixels of each frame of image in the N frames of images; and acquiring RGB gray scale value information of edge pixels of each frame of image in the N frames of images based on a preset edge detection algorithm and the RGB gray scale value information of the pixels.
In some embodiments, the image processing apparatus further includes a second obtaining module 805 configured to obtain the preset overdrive table by: acquiring images of preset videos displayed by display equipment under different motion grades and original images of the preset videos, wherein the preset videos comprise two different gray value combinations, and the original images have no color dragging phenomenon; acquiring an average value of gray value differences of corresponding pixels in an image of a preset video displayed by display equipment and an original image; and if the average value of the gray value difference values is larger than the difference value threshold value, adjusting the overdrive value of the preset video until the average value of the gray value difference values is smaller than or equal to the difference value threshold value, and obtaining a preset overdrive table of the corresponding motion level.
It should be noted that, the apparatus provided in this embodiment may be used to execute the above-mentioned image processing method, and its implementation manner and technical effects are similar, and this embodiment is not repeated here.
It should be understood that the division of the above apparatus into modules is merely a division by logical function; in actual implementation, the modules may be fully or partially integrated into one physical entity or may be physically separate. These modules may all be implemented in the form of software called by a processing element, or may all be implemented in hardware; alternatively, some modules may be implemented in the form of software called by a processing element and some modules in hardware. For example, the processing module may be a separately arranged processing element, or may be integrated in a chip of the above apparatus, or may be stored in a memory of the above apparatus in the form of program code, with the functions of the processing module called and executed by a processing element of the above apparatus. The implementation of the other modules is similar. In addition, all or part of these modules may be integrated together or implemented independently. The processing element here may be an integrated circuit with signal processing capability. In implementation, each step of the above method, or each of the above modules, may be implemented by an integrated logic circuit of hardware in a processor element or by instructions in the form of software.
For example, the modules above may be one or more integrated circuits configured to implement the methods above, such as: one or more ASICs (Application Specific Integrated Circuit, specific integrated circuits), or one or more DSPs (Digital Signal Processor, digital signal processors), or one or more FPGAs (Field Programmable Gate Array, field programmable gate arrays), etc. For another example, when a module above is implemented in the form of a processing element scheduler code, the processing element may be a general purpose processor, such as a CPU or other processor that may invoke the program code. For another example, the modules may be integrated together and implemented in the form of a System-on-a-Chip (SOC).
In the above embodiments, it may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product comprises one or more computer programs. When the computer program instructions are loaded and executed on a computer, the processes or functions in accordance with embodiments of the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer program may be stored in or transmitted from one computer readable storage medium to another, for example, a website, computer, server, or data center via a wired (e.g., coaxial cable, fiber optic, digital Subscriber Line (DSL)) or wireless (e.g., infrared, wireless, microwave, etc.). Computer readable storage media can be any available media that can be accessed by a computer or data storage devices, such as servers, data centers, etc., that contain an integration of one or more available media. Usable media may be magnetic media (e.g., floppy disks, hard disks, magnetic tape), optical media (e.g., DVD), or semiconductor media (e.g., solid State Disk (SSD)), among others.
The present application also provides a computer-readable storage medium having a computer program stored therein, which when executed by a processor implements the image processing method according to any of the method embodiments described above.
Embodiments of the present application also provide a computer program product, which includes a computer program, where the computer program is stored in a computer readable storage medium, and from which at least one processor can read the computer program, where the at least one processor can implement the image processing method according to any one of the method embodiments above when executing the computer program.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced by equivalents, and such modifications and replacements do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (13)

1. A display device, characterized by comprising:
a display;
a processor coupled to the display, the processor configured to:
acquiring RGB gray value information of edge pixels of each frame of image in the N frames of images;
determining a corresponding motion level according to the RGB gray scale value information;
determining a target overdrive table according to the motion level and a preset overdrive table, wherein the preset overdrive table is used for representing overdrive values corresponding to RGB gray value conversion of two adjacent frames of images under different motion levels;
and displaying the next group of N frame images of the N frame images according to the target overdrive table.
2. The display device of claim 1, wherein the processor is specifically configured to:
according to the RGB gray scale value information, determining a target gray scale value with the largest difference value in the RGB gray scale values of the edge pixels;
acquiring an average value of difference values of target gray values of edge pixels of two adjacent frames of images in the N frames of images;
and determining the corresponding motion level according to the average value.
3. The display device of claim 2, wherein the processor is specifically configured to:
if the average values are all preset values, determining that the corresponding motion grades are first motion grades;
or if the average value is not the preset value, determining the corresponding motion level according to the number of the average value which is the preset value.
4. A display device according to any one of claims 1 to 3, wherein the processor is specifically configured to:
acquiring RGB gray value information of pixels of each frame of image in the N frames of images;
and acquiring RGB gray scale value information of edge pixels of each frame of image in the N frames of images based on a preset edge detection algorithm and the RGB gray scale value information of the pixels.
5. The display device of any one of claims 1 to 3, wherein the processor is further configured to obtain the preset overdrive table by:
acquiring images of a preset video displayed by the display device at different motion levels, together with original images of the preset video, wherein the preset video comprises combinations of two different gray-scale values and the original images are free of color smearing;
acquiring an average value of the gray-scale value differences between corresponding pixels of an image displayed by the display device and the original image; and
if the average value of the gray-scale value differences is larger than a difference threshold, adjusting the overdrive value for the preset video until the average value of the gray-scale value differences is smaller than or equal to the difference threshold, thereby obtaining the preset overdrive table for the corresponding motion level.
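The calibration loop of claim 5 can be sketched as follows. The `render` callback standing in for "display and capture the preset video" is a hypothetical simulation; step size, threshold, and names are illustrative, not from the patent.

```python
def calibrate_overdrive(render, original, od_value=0,
                        diff_threshold=2.0, step=1, max_iters=255):
    """Raise the overdrive value until the displayed test image
    matches the original within the difference threshold.
    'render(od)' simulates displaying the preset video with
    overdrive 'od' and returns the captured gray-scale values."""
    for _ in range(max_iters):
        shown = render(od_value)
        avg_diff = sum(abs(s - o) for s, o in zip(shown, original)) / len(original)
        if avg_diff <= diff_threshold:
            return od_value  # table entry for this gray-level transition
        od_value += step
    return od_value
```

Repeating this for each pair of gray-scale values and each motion level fills in the preset overdrive tables offline.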
6. An image processing method, applied to a display device, the method comprising:
acquiring RGB gray-scale value information of edge pixels of each of N frames of images;
determining a corresponding motion level according to the RGB gray-scale value information;
determining a target overdrive table according to the motion level and a preset overdrive table, wherein the preset overdrive table represents, for different motion levels, the overdrive values corresponding to RGB gray-scale value transitions between two adjacent frames of images; and
displaying, according to the target overdrive table, the next group of N frames of images following the N frames of images.
7. The image processing method according to claim 6, wherein the determining the corresponding motion level according to the RGB gray-scale value information comprises:
determining, according to the RGB gray-scale value information, a target gray-scale value having the largest difference value among the RGB gray-scale values of each edge pixel;
acquiring an average value of the differences between the target gray-scale values of the edge pixels of every two adjacent frames of images in the N frames of images; and
determining the corresponding motion level according to the average values.
8. The image processing method according to claim 7, wherein the determining the corresponding motion level according to the average values comprises:
determining that the corresponding motion level is a first motion level if all of the average values are equal to a preset value; or,
if not all of the average values are equal to the preset value, determining the corresponding motion level according to the number of average values that are equal to the preset value.
9. The image processing method according to any one of claims 6 to 8, wherein the acquiring RGB gray-scale value information of edge pixels of each of the N frames of images comprises:
acquiring RGB gray-scale value information of the pixels of each of the N frames of images; and
acquiring the RGB gray-scale value information of the edge pixels of each of the N frames of images based on a preset edge detection algorithm and the RGB gray-scale value information of the pixels.
10. The image processing method according to any one of claims 6 to 8, wherein the preset overdrive table is obtained by:
acquiring images of a preset video displayed by the display device at different motion levels, together with original images of the preset video, wherein the preset video comprises combinations of two different gray-scale values and the original images are free of color smearing;
acquiring an average value of the gray-scale value differences between corresponding pixels of an image displayed by the display device and the original image; and
if the average value of the gray-scale value differences is larger than a difference threshold, adjusting the overdrive value for the preset video until the average value of the gray-scale value differences is smaller than or equal to the difference threshold, thereby obtaining the preset overdrive table for the corresponding motion level.
11. An image processing apparatus, applied to a display device, the apparatus comprising:
a first acquisition module, configured to acquire RGB gray-scale value information of edge pixels of each of N frames of images;
a first determining module, configured to determine a corresponding motion level according to the RGB gray-scale value information;
a second determining module, configured to determine a target overdrive table according to the motion level and a preset overdrive table, wherein the preset overdrive table represents, for different motion levels, the overdrive values corresponding to RGB gray-scale value transitions between two adjacent frames of images; and
a display module, configured to display, according to the target overdrive table, the next group of N frames of images following the N frames of images.
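The modules of claim 11 chain together as a single per-group decision. The sketch below ties the earlier steps into one lookup, under the same assumptions as above; the input layout (per frame, a list of per-edge-pixel target gray-scale values) and the level-to-table mapping are illustrative, not from the patent.

```python
def choose_overdrive_table(frames_edge_values, preset_tables, preset=0):
    """Compute per-adjacent-pair averages over a group of N frames,
    derive a motion level, and return the matching preset overdrive
    table, which is then applied when displaying the next group."""
    averages = []
    for prev, curr in zip(frames_edge_values, frames_edge_values[1:]):
        diffs = [abs(c - p) for p, c in zip(prev, curr)]
        averages.append(sum(diffs) / len(diffs))
    if all(a == preset for a in averages):
        level = 1  # static scene: first motion level
    else:
        level = min(len(preset_tables),
                    1 + sum(1 for a in averages if a != preset))
    return preset_tables[level - 1]
```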
12. A computer-readable storage medium storing computer program instructions which, when executed, implement the image processing method according to any one of claims 6 to 10.
13. A computer program product comprising a computer program which, when executed by a processor, implements the image processing method according to any one of claims 6 to 10.
CN202210073828.4A 2022-01-21 2022-01-21 Display device, image processing method and device Pending CN116528068A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210073828.4A CN116528068A (en) 2022-01-21 2022-01-21 Display device, image processing method and device


Publications (1)

Publication Number Publication Date
CN116528068A true CN116528068A (en) 2023-08-01

Family

ID=87394565




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination