CN116091392B - Image processing method, system and storage medium


Info

Publication number
CN116091392B
CN116091392B (granted publication of application CN202210983176.8A)
Authority
CN
China
Prior art keywords
image
terminal
tested
color
instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210983176.8A
Other languages
Chinese (zh)
Other versions
CN116091392A (en)
Inventor
李艳波
陈祥
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority to CN202210983176.8A
Publication of CN116091392A
Application granted
Publication of CN116091392B
Legal status: Active
Anticipated expiration

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING; G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis → 7/0002 Inspection of images, e.g. flaw detection → 7/0004 Industrial image inspection → 7/001 Industrial image inspection using an image reference approach
    • G02 — OPTICS; G02F — OPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
    • G02F 1/00 Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics → 1/01 for the control of the intensity, phase, polarisation or colour → 1/13 based on liquid crystals, e.g. single liquid crystal display cells → 1/1306 Details → 1/1309 Repairing; Testing
    • G06T 7/00 Image analysis → 7/90 Determination of colour characteristics
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement → 2207/10 Image acquisition modality → 2207/10004 Still image; Photographic image
    • G06T 2207/10 Image acquisition modality → 2207/10024 Color image
    • G06T 2207/30 Subject of image; Context of image processing → 2207/30108 Industrial image inspection
    • G06T 2207/30 Subject of image; Context of image processing → 2207/30121 CRT, LCD or plasma display


Abstract

The embodiments of this application provide an image processing method, system, and storage medium, relating to the technical field of testing. In the method, a first device sends a first instruction to a terminal to be tested, instructing it to photograph a target graphics card with its camera application. The first device receives the media data captured by the terminal to be tested and performs color analysis on an image to be processed in the media data to obtain first image color data. The first device then sends a second instruction to a second device, instructing it to collect second image color data of the image to be processed as displayed by the terminal to be tested. Finally, the first device obtains the second image color data collected by the second device and determines the screen color difference information of the terminal to be tested based on the first image color data and the second image color data. In this way, the first device avoids the influence of the photographing process on image color difference, reduces the influence of image quality on screen color difference, and improves the accuracy of the screen color difference analysis of the terminal to be tested.

Description

Image processing method, system and storage medium
Technical Field
The present application relates to the field of display technologies, and in particular, to an image processing method, system, and storage medium.
Background
With the development of display and terminal technologies, terminal devices are increasingly used as displays, and users place ever higher demands on display quality in pursuit of high-fidelity color reproduction.
At present, display color difference analysis mainly evaluates the display effect against a constructed color set to obtain a color difference analysis result for the display screen. However, this method suffers from low color difference analysis accuracy.
Disclosure of Invention
The embodiments of the present application provide an image processing method, system, and storage medium to address the low accuracy of existing color difference analysis.
A first aspect of the present application provides an image processing method, the method comprising: a first device sends a first instruction to a terminal to be tested, where the first instruction is used to instruct the terminal to be tested to photograph a target graphics card using a camera application; the first device receives the media data captured by the terminal to be tested and performs color analysis on an image to be processed in the media data to obtain first image color data; the first device sends a second instruction to a second device, where the second instruction is used to instruct the second device to collect second image color data of the image to be processed as displayed by the terminal to be tested; the first device obtains the second image color data collected by the second device and determines screen color difference information of the terminal to be tested based on the first image color data and the second image color data.
In this embodiment, the image processing method executed by the image processing system can determine how closely the color data of the displayed image matches the color data of the captured image of the terminal to be tested. This avoids the influence of the photographing process on image color difference, reduces the influence of image quality on screen color difference, and improves the accuracy of the screen color difference analysis of the terminal to be tested.
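As a hedged sketch, the control flow just described could be organized as follows in Python. All class and method names here are illustrative stand-ins (the patent defines no concrete API); the stubs merely simulate the terminal under test and the second device (e.g., a color analyzer).

```python
# Illustrative sketch only: class/method names are hypothetical, not a real SDK.

class TerminalUnderTest:
    """Stub for the terminal to be tested."""
    def __init__(self):
        self.displayed = None

    def shoot_target_chart(self):
        # In reality, the camera application photographs the target graphics card
        # in response to the first instruction; here we return a dummy file name.
        return {"image": "chart_photo.jpg"}

    def display(self, image):
        # In reality, the fourth instruction makes the terminal display the image.
        self.displayed = image

class SecondDevice:
    """Stub for the color-measurement device (the 'second device')."""
    def collect_color_data(self, terminal):
        # In reality, this measures the colors of the image shown on the screen.
        return {"source": terminal.displayed, "primaries": "measured"}

def analyze_colors(image):
    # Placeholder for the first device's color analysis of the captured photo.
    return {"source": image, "primaries": "computed"}

def run_screen_color_test(terminal, second_device):
    media = terminal.shoot_target_chart()              # first instruction
    first_color_data = analyze_colors(media["image"])  # color analysis
    terminal.display(media["image"])                   # fourth instruction
    second_color_data = second_device.collect_color_data(terminal)  # second instruction
    # Screen color difference is derived from the consistency of these two datasets.
    return first_color_data, second_color_data

first, second = run_screen_color_test(TerminalUnderTest(), SecondDevice())
```

The key property the sketch preserves is that both color datasets describe the same image to be processed, so their comparison isolates the screen's contribution to color difference.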
In one possible implementation, before the first device sends the first instruction to the terminal to be tested, the method further includes: the first device sends a third instruction to a light source, where the third instruction is used to instruct the light source to adjust the brightness and color temperature of the light it emits. To prevent different illumination environments from degrading the quality of the photos taken by the camera application, a light source is added to the image processing system; by controlling illumination brightness and color temperature through the light source, images of acceptable quality can be captured and external interference reduced, laying a foundation for improving screen color difference precision.
In one possible implementation, the first device is preset with brightness level information, color temperature level information, and a parameter adjustment interval. The method further includes: the first device permutes and combines the brightness level information with the color temperature level information to determine the number of illumination environments and the brightness parameter and color temperature parameter of each illumination environment, where the brightness parameters and/or color temperature parameters differ between different illumination environments.
The first device sending the third instruction to the light source includes: at each parameter adjustment, the first device sends the brightness parameter and color temperature parameter of one illumination environment to the light source via a third instruction, the third instruction being used to instruct the light source to emit the illumination brightness corresponding to the brightness parameter and the color temperature corresponding to the color temperature parameter, until the number of transmissions equals the number of illumination environments.
The first device sending the first instruction to the terminal to be tested includes: the first device sends the first instruction to the terminal to be tested after each transmission of the third instruction to the light source.
In this possible implementation, the first device controls the light source multiple times so that the target graphics card is placed in different illumination environments, and correspondingly controls the shooting behavior of the terminal to be tested. The terminal to be tested can thus execute the shooting process under each illumination environment, yielding media data captured under all of them; moreover, the media data is less affected by external interference and its quality can meet the requirements, laying a foundation for subsequent image analysis to obtain high-precision screen color difference information.
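The permutation-and-combination step above can be illustrated with Python's `itertools.product`: each brightness level is paired with each color temperature level to enumerate the distinct illumination environments. The level values below are illustrative assumptions, not values from the patent.

```python
from itertools import product

# Hypothetical level sets; the patent only says levels are preset on the first device.
brightness_levels = [100, 300, 600]        # e.g. illumination brightness, in lux
color_temp_levels = [2700, 4000, 6500]     # e.g. color temperature, in Kelvin

# Every (brightness, color temperature) pair is one illumination environment;
# different environments differ in brightness and/or color temperature.
environments = list(product(brightness_levels, color_temp_levels))
num_environments = len(environments)       # 3 brightness x 3 color temps = 9
```

The first device would then iterate over `environments`, sending one pair per third instruction until all `num_environments` have been sent.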
In one possible implementation, the first device is preset with brightness level information, color temperature level information, and a parameter adjustment interval. The method further includes: the first device permutes and combines the brightness level information with the color temperature level information to determine the number of illumination environments and the brightness parameter and color temperature parameter of each illumination environment, where the brightness parameters and/or color temperature parameters differ between different illumination environments.
The first device sending the third instruction to the light source includes: the first device sends the parameter adjustment interval, the number of illumination environments, and the brightness parameters and color temperature parameters of the various illumination environments to the light source via the third instruction, where the third instruction is used to instruct the light source to adjust its emitted illumination brightness and color temperature once per parameter adjustment interval, according to the brightness and color temperature parameters of the various illumination environments, until the adjustment has been performed (number of illumination environments − 1) times.
Before the first device sends the first instruction to the terminal to be tested, the method further includes: the first device sends the parameter adjustment interval and the number of illumination environments to the terminal to be tested, where the parameter adjustment interval is the time interval at which the terminal to be tested photographs the target graphics card using the camera application after receiving the second instruction, and the number of illumination environments is the number of times the terminal to be tested photographs the target graphics card using the camera application.
In this possible implementation, the first device sends preset configuration information to the light source and the terminal to be tested, so that the light source can automatically adjust the illumination it emits to present different illumination environments and the terminal to be tested can automatically execute the shooting process. This reduces delay, improves the degree of automation of image processing, and lays a foundation for subsequently determining accurate screen color difference information.
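The timing scheme described above can be sketched as follows: once configured, the terminal triggers the camera once per parameter-adjustment interval, for as many shots as there are illumination environments. The interval and environment count below are illustrative, not values specified by the patent.

```python
# Hedged sketch of the self-timed capture schedule; values are hypothetical.

def shot_schedule(start: float, interval_s: float, num_environments: int):
    """Return the timestamps (seconds) at which the terminal should trigger
    the camera application: one shot per illumination environment, spaced
    one parameter-adjustment interval apart."""
    return [start + i * interval_s for i in range(num_environments)]

# e.g. 4 illumination environments with a 5-second adjustment interval
times = shot_schedule(start=0.0, interval_s=5.0, num_environments=4)
```

Because the light source adjusts on the same interval, each shot lands in a fresh illumination environment without further instructions from the first device.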
In one possible implementation, after the first device receives the media data captured by the terminal to be tested, the method further includes: the first device classifies and renames the media data to obtain at least one class of image set, where the images in each class were captured under the same illumination environment and by the terminal with the same identifier. In this way, the first device can process the images to be processed in the different classes of image sets separately, improving data processing efficiency.
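A minimal sketch of that classify-and-rename step: group captured files by (illumination environment, terminal identifier) and assign systematic names. The metadata tuples and the naming scheme are illustrative assumptions; the patent does not specify a file format or naming convention.

```python
from collections import defaultdict

def classify_and_rename(media):
    """media: list of (filename, env_id, terminal_id) tuples (hypothetical
    metadata shape). Returns {(env_id, terminal_id): [new names]}, so each
    image set shares one illumination environment and one terminal ID."""
    image_sets = defaultdict(list)
    for filename, env_id, terminal_id in media:
        key = (env_id, terminal_id)
        index = len(image_sets[key])           # running index within the set
        new_name = f"{terminal_id}_env{env_id}_{index:03d}.jpg"
        image_sets[key].append(new_name)
    return dict(image_sets)

sets = classify_and_rename([
    ("a.jpg", 1, "T1"),
    ("b.jpg", 1, "T1"),
    ("c.jpg", 2, "T1"),
])
```

Grouping first means each class of image set can later be color-analyzed independently, which is where the efficiency gain mentioned above comes from.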
In one possible implementation, before the first device sends the second instruction to the second device, the method further includes: sending a fourth instruction to the terminal to be tested, where the fourth instruction is used to instruct the terminal to be tested to display the image to be processed in the media data. Thus, after the second device receives the second instruction from the first device, it can successfully collect the color data of the image to be processed, providing the conditions for subsequently evaluating whether the color data of the displayed image of the terminal to be tested is consistent with that of the captured image.
In one possible implementation, the first device determining the screen color difference information of the terminal to be tested based on the first image color data and the second image color data includes: determining, based on a preset color space, a first color gamut range corresponding to the first image color data and a second color gamut range corresponding to the second image color data; calculating the gamut consistency of the first color gamut range and the second color gamut range; and determining the screen color difference information of the terminal to be tested according to the gamut consistency. In the embodiment of the application, by calculating the degree of consistency between the first image color data and the second image color data, that is, by comparing the color consistency of the captured image and the displayed image, the display color accuracy of the screen of a terminal to be tested with a photographing function can be effectively evaluated; at the same time, a direction for display debugging is provided, laying a foundation for improving the display accuracy of the terminal to be tested.
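One possible gamut-consistency metric is sketched below: represent each image's color data as a triangle of (x, y) chromaticity primaries in a preset color space and compare triangle areas. The patent does not disclose its exact formula; the ratio-of-areas measure here is an illustrative assumption, and the primaries are sRGB-like example values.

```python
# Hedged sketch: the consistency measure below is an assumption, not the
# patent's disclosed formula.

def triangle_area(p1, p2, p3):
    """Shoelace formula for the area of a chromaticity triangle."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

def gamut_consistency(first_primaries, second_primaries):
    """Ratio of the smaller gamut area to the larger one, in (0, 1];
    1.0 means the captured-image and displayed-image gamuts match exactly."""
    a1 = triangle_area(*first_primaries)
    a2 = triangle_area(*second_primaries)
    return min(a1, a2) / max(a1, a2)

# Identical gamuts (sRGB-like primaries) should yield perfect consistency.
srgb_like = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
consistency = gamut_consistency(srgb_like, srgb_like)
```

A low consistency value would flag a large screen color difference and point toward the debugging direction mentioned above (e.g., which primary drifted).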
A second aspect of the present application provides an image processing apparatus, which may be an electronic device, or a chip or chip system in an electronic device. The image processing apparatus may include a processing unit and an integrated circuit (IC). When the image processing apparatus is an electronic device, the processing unit is configured to implement any of the methods related to processing in the first aspect or its possible implementations, and may be a processor. The image processing apparatus may further include a storage unit, which may be a memory; the storage unit stores instructions, and the processing unit executes the instructions stored in the storage unit so that the electronic device implements a method described in the first aspect or its possible implementations. When the image processing apparatus is a chip or chip system within an electronic device, the processing unit may likewise be a processor that executes instructions stored in a storage unit to cause the electronic device to implement such a method; in this case, the storage unit may be a storage unit within the chip (e.g., a register or cache) or a storage unit within the electronic device but outside the chip (e.g., a read-only memory or random access memory).
The image processing apparatus may further include a transmitting unit and a receiving unit;
correspondingly, the sending unit is configured to send a first instruction to the terminal to be tested, where the first instruction is used to instruct the terminal to be tested to photograph the target graphics card using a camera application; the receiving unit is configured to receive the media data captured by the terminal to be tested; and the processing unit is configured to perform color analysis on the image to be processed in the media data to obtain first image color data;
the sending unit is further configured to send a second instruction to a second device, where the second instruction is used to instruct the second device to collect second image color data of the image to be processed as displayed by the terminal to be tested; the receiving unit is further configured to obtain the second image color data collected by the second device; and the processing unit is further configured to determine the screen color difference information of the terminal to be tested based on the first image color data and the second image color data.
In a possible implementation manner, the sending unit is further configured to send a third instruction to the light source, where the third instruction is used to instruct the light source to adjust the emitted illumination brightness and color temperature.
In one possible implementation, the image processing apparatus is preset with brightness level information, color temperature level information, and a parameter adjustment interval; the processing unit is further configured to permute and combine the brightness level information with the color temperature level information to determine the number of illumination environments and the brightness parameter and color temperature parameter of each illumination environment, where the brightness parameters and/or color temperature parameters differ between different illumination environments;
the sending unit is configured to send, at each parameter adjustment, the brightness parameter and color temperature parameter of one illumination environment to the light source via a third instruction, the third instruction being used to instruct the light source to emit the illumination brightness corresponding to the brightness parameter and the color temperature corresponding to the color temperature parameter, until the number of transmissions equals the number of illumination environments;
the sending unit is further configured to send the first instruction to the terminal to be tested after each transmission of the third instruction to the light source.
In one possible implementation, the image processing apparatus is preset with brightness level information, color temperature level information, and a parameter adjustment interval; the processing unit is further configured to permute and combine the brightness level information with the color temperature level information to determine the number of illumination environments and the brightness parameter and color temperature parameter of each illumination environment, where the brightness parameters and/or color temperature parameters differ between different illumination environments;
the sending unit is configured to send the parameter adjustment interval, the number of illumination environments, and the brightness parameters and color temperature parameters of the various illumination environments to the light source via the third instruction, where the third instruction is used to instruct the light source to adjust its emitted illumination brightness and color temperature once per parameter adjustment interval, according to the brightness and color temperature parameters of the various illumination environments, until the adjustment has been performed (number of illumination environments − 1) times;
the sending unit is further configured to send, before sending the first instruction to the terminal to be tested, the parameter adjustment interval and the number of illumination environments to the terminal to be tested, where the parameter adjustment interval is the time interval at which the terminal to be tested photographs the target graphics card using a camera application after receiving the second instruction, and the number of illumination environments is the number of times the terminal to be tested photographs the target graphics card using the camera application.
In a possible implementation, the processing unit is further configured to classify and rename the media data to obtain at least one class of image set, where the images in each class were captured under the same illumination environment and by the terminal with the same identifier.
In a possible implementation manner, the sending unit is further configured to send a fourth instruction to the terminal to be tested, where the fourth instruction is used to instruct the terminal to be tested to display an image to be processed in the media data.
In a possible implementation, the processing unit is specifically configured to determine, based on a preset color space, a first color gamut range corresponding to the first image color data and a second color gamut range corresponding to the second image color data; calculate the gamut consistency of the first and second color gamut ranges; and determine the screen color difference information of the terminal to be tested according to the gamut consistency.
The third aspect of the application provides an image processing system, comprising a first device, a second device and a terminal to be tested, wherein the second device and the terminal to be tested are connected with the first device; the terminal to be tested is provided with a camera application and a display screen;
the first device is used for sending a first instruction to the terminal to be tested; the terminal to be tested is used for shooting a target graphic card by using a camera application when receiving the first instruction, storing the media data obtained by shooting, and sending the media data to the first device; the first device is further configured to perform color analysis on the received image to be processed in the media data, so as to obtain first image color data;
The first device is further configured to send a second instruction to the second device; the second device is configured to, upon receiving the second instruction, collect second image color data of the image to be processed displayed on the display screen of the terminal to be tested and send the second image color data to the first device;
the first device is further configured to determine, when the second image color data is received, screen color difference information of the terminal to be tested based on the first image color data and the second image color data.
In one possible implementation, the image processing system further comprises a light source connected to the first device;
the first device is further configured to send a third instruction to the light source before sending the first instruction to the terminal to be tested, where the third instruction is used to instruct the light source to adjust the brightness and the color temperature of the emitted light.
In one possible implementation, the light source is a light box, and the target graphics card is disposed inside the light box.
In one possible implementation, the first device is preset with brightness level information, color temperature level information, and a parameter adjustment interval; the first device is further configured to permute and combine the brightness level information with the color temperature level information to determine the number of illumination environments and the brightness parameter and color temperature parameter of each illumination environment, where the brightness parameters and/or color temperature parameters differ between different illumination environments;
the first device is specifically configured to send the brightness parameter and color temperature parameter of one illumination environment to the light source via a third instruction once per parameter adjustment interval, until the number of transmissions equals the number of illumination environments; the light source is configured to, after receiving the brightness parameter and color temperature parameter of an illumination environment, emit the illumination brightness corresponding to the brightness parameter and the color temperature corresponding to the color temperature parameter;
the first device is specifically configured to send the first instruction to the terminal to be tested after each transmission of the third instruction to the light source.
In one possible implementation, the first device is preset with brightness level information, color temperature level information, and a parameter adjustment interval; the first device is further configured to permute and combine the brightness level information with the color temperature level information to determine the number of illumination environments and the brightness parameter and color temperature parameter of each illumination environment, where the brightness parameters and/or color temperature parameters differ between different illumination environments;
the first device is further configured to send the parameter adjustment interval, the number of illumination environments, and the brightness parameters and color temperature parameters of the various illumination environments to the light source via the third instruction; the light source is further configured to, after receiving the third instruction, adjust its emitted illumination brightness and color temperature once per parameter adjustment interval according to the brightness and color temperature parameters of the illumination environments, until the adjustment has been performed (number of illumination environments − 1) times;
the first device is further configured to send the parameter adjustment interval and the number of illumination environments to the terminal to be tested; the terminal to be tested is configured to, after receiving the second instruction, photograph the target graphics card using a camera application once per parameter adjustment interval until it has done so as many times as there are illumination environments.
In one possible implementation, the first device is further configured to classify and rename the media data to obtain at least one class of image set, where the images in each class were captured under the same illumination environment and by the terminal with the same identifier.
In a possible implementation manner, the first device is further configured to send, before sending a second instruction to the second device, a fourth instruction to the terminal to be tested, where the fourth instruction is used to instruct the terminal to be tested to display an image to be processed in the media data.
A fourth aspect of the present application provides an electronic device comprising: a memory for storing a computer program and a processor for executing the computer program to perform the image processing method according to the first aspect and various possible implementations.
A fifth aspect of the application provides a computer readable storage medium storing instructions that, when executed, cause a computer to perform the image processing method of the first aspect and various possible implementations described above.
A sixth aspect of the application provides a computer program product comprising a computer program which, when run, causes an electronic device to perform the image processing method as described in the first aspect and various possible implementations.
A seventh aspect of the present application provides a chip or chip system, including at least one processor and a communication interface, the communication interface and the at least one processor being interconnected through a line; the at least one processor is configured to run a computer program or instructions to perform the image processing method of the first aspect and its various possible implementations. The communication interface in the chip may be an input/output interface, a pin, a circuit, or the like.
In one possible implementation, the chip or chip system described above further includes at least one memory, in which instructions are stored. The memory may be a storage unit within the chip, such as a register or cache, or a storage unit outside the chip (e.g., a read-only memory or random access memory).
It should be understood that the second through seventh aspects of the present application correspond to the technical solution of the first aspect, and the benefits obtained by each aspect and its corresponding possible implementations are similar, so they are not repeated here.
Drawings
FIG. 1 is a schematic diagram of an electronic device;
FIG. 2 is a schematic diagram of an image processing system according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an application scenario according to an embodiment of the present application;
FIG. 4 is a schematic diagram of another application scenario according to an embodiment of the present application;
FIG. 5 is a color gamut comparison of captured image color data and display image color data of a terminal to be tested;
fig. 6 is a flowchart of an image processing method according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a chip according to an embodiment of the present application.
Detailed Description
In embodiments of the present application, the words "first," "second," and the like are used to distinguish between identical or similar items that have substantially the same function and effect. For example, the first chip and the second chip are merely for distinguishing different chips, without limiting their order. It will be appreciated by those skilled in the art that the words "first," "second," and the like do not limit the number of items or the order of execution, and that items described as "first" and "second" are not necessarily different.
It should be noted that, in the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship. "At least one of the following items" or similar expressions refers to any combination of these items, including any combination of a single item or plural items. For example, at least one of a, b, or c may represent: a, b, c, a-b, a-c, b-c, or a-b-c, where a, b, and c may each be singular or plural.
With the development of display technology, users place increasingly high demands on the display effect of electronic devices, and faithful image reproduction at high image quality has become a mainstream trend. However, the display screen of an electronic device may exhibit inconsistent display effects due to differences in the display panel itself, for example, fluctuations in material characteristics, manufacturing processes, or parameter correction; thus, even devices of the same model may have different display effects. Therefore, before an electronic device leaves the factory, color evaluation generally needs to be performed on its display screen to determine whether the screen display effect meets the requirements.
At present, color difference analysis methods for displays mainly analyze the display effect based on a constructed color set to obtain a color difference analysis result for the display screen. By analyzing the display and measuring its display data, such a scheme can determine the color difference information of the display screen to a certain degree. However, with the development of terminal technology, terminal devices are increasingly used both as photographing devices and as image display devices, and the photographing effect of the camera application may have a certain influence on the image display effect. Therefore, the above color difference analysis scheme, which involves neither the photographing function and flow nor any calculation on photographing data, suffers from low color difference analysis accuracy.
In view of the above problems, the technical conception process of the present application is as follows: if the display screen color difference of a terminal to be tested (a display device with a photographing function) is to be analyzed, the color difference analysis can be performed based on an image photographed by the terminal to be tested. Specifically, the screen color difference information of the terminal to be tested is determined from the captured image color data and the display image color data of the terminal to be tested, so that the influence of poor image quality on the display screen analysis result can be avoided.
That is, a control device may be used to control the terminal to be tested to capture an image. On the one hand, a color analyzer may collect color data of the display image when the captured image is displayed on the terminal to be tested, and transmit the display image color data to the control device; on the other hand, the captured image may be transmitted to the control device, which performs color analysis on it to obtain captured image color data. Finally, the control device may process the display image color data collected by the color analyzer together with the captured image color data obtained by its own analysis, so as to determine the screen color difference information of the terminal to be tested.
Based on the above technical conception process, the embodiment of the present application provides a scheme in which an image processing system is constructed and used to execute an image processing method. The control device of the image processing system may acquire media data (images and/or videos) captured by the terminal to be tested. On the one hand, when the terminal to be tested displays (i.e., plays back) the media data, color data of the display image may be collected; on the other hand, the media data are transmitted to the control device for image color analysis to obtain color data of the captured image. The two results are then compared to determine the screen color difference information of the terminal to be tested.
Optionally, the image processing system may include a first device, and a second device and a terminal to be tested that are both connected to the first device. In application, the first device may control the operation of the second device and the terminal to be tested. For example, the first device may send a first instruction to the terminal to be tested to trigger it to capture a target graphics card with its camera application, receive the captured media data from the terminal to be tested, and perform color difference analysis on an image to be processed in the media data to obtain captured image color data. The first device may also send a second instruction to the second device to trigger it to collect the image color of the image to be processed displayed by the terminal to be tested, obtaining display image color data. Finally, the first device may compare the display image color data collected by the second device with the captured image color data obtained by its own analysis, thereby determining the screen color difference information of the terminal to be tested.
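As an illustration of the orchestration just described, the following sketch models the first device driving one test round. All class and method names (`FirstDevice`, `capture_target_chart`, `measure_display`) are hypothetical, and the per-channel absolute deviation stands in for whatever color difference metric a real implementation would use:

```python
# Hypothetical sketch of the control flow; not the patent's implementation.

class FirstDevice:
    def __init__(self, terminal, analyzer):
        self.terminal = terminal   # terminal to be tested (capture + display)
        self.analyzer = analyzer   # second device (color analyzer)

    def run_test(self):
        media = self.terminal.capture_target_chart()   # first instruction
        captured = self.analyze_image_color(media)     # captured image color data
        displayed = self.analyzer.measure_display()    # second instruction
        # Screen color difference: per-channel absolute deviation (a stand-in
        # metric for illustration only).
        return [abs(a - b) for a, b in zip(captured, displayed)]

    @staticmethod
    def analyze_image_color(pixels):
        # Placeholder analysis: mean RGB over the captured pixels.
        return [sum(p[c] for p in pixels) / len(pixels) for c in range(3)]


class FakeTerminal:
    def capture_target_chart(self):
        return [(200, 100, 50), (202, 98, 52)]   # two sample pixels

class FFakeAnalyzer:
    pass

class FakeAnalyzer:
    def measure_display(self):
        return (198, 101, 49)                    # measured on-screen color

controller = FirstDevice(FakeTerminal(), FakeAnalyzer())
print(controller.run_test())  # per-channel |captured - displayed|
```

The fakes make the sketch self-contained; in the described system the two data sources would be the terminal's camera pipeline and the color analyzer's measurement.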
In the embodiment of the application, based on the image processing method executed by the image processing system, the consistency degree of the color data of the display image and the color data of the shooting image of the terminal to be tested can be determined, the influence of poor quality of the image on the analysis result of the display screen is avoided, and the accuracy of the screen color difference analysis of the terminal to be tested is improved.
It may be understood that, in the embodiment of the present application, the terminal to be tested is an electronic device having a photographing function and a display screen, and the first device and the second device may also be electronic devices. An electronic device may also be referred to as a terminal (terminal), a user equipment (user equipment, UE), a mobile station (mobile station, MS), a mobile terminal (mobile terminal, MT), or the like. The electronic device may be a mobile phone, a smart television, a wearable device, a tablet (Pad), a computer with a wireless transceiving function, a virtual reality (virtual reality, VR) terminal device, an augmented reality (augmented reality, AR) terminal device, a wireless terminal in industrial control (industrial control), a wireless terminal in self-driving (self-driving), a wireless terminal in remote medical surgery (remote medical surgery), a wireless terminal in smart grid (smart grid), a wireless terminal in transportation safety (transportation safety), a wireless terminal in smart city (smart city), a wireless terminal in smart home (smart home), or the like. The embodiment of the present application does not limit the specific technology and the specific device form adopted by the electronic device.
It should be noted that the product implementation form of the embodiment of the present application is program code that is contained in machine learning and deep learning platform software and deployed on a device. The program code of the embodiment of the present application may be stored within the device. At run time, the program code runs in the host memory and/or GPU memory of the electronic device.
In order to better understand the embodiments of the present application, the following describes the structure of the electronic device according to the embodiments of the present application:
Fig. 1 is a schematic structural diagram of an electronic device. As shown in Fig. 1, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identity module (subscriber identity module, SIM) card interface 195, and the like.
The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and thus improves the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is illustrative, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Antennas in electronic device 100 may be used to cover single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. Wireless communication techniques may include global system for mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. For example, when photographing, the shutter is opened, light is transmitted through the lens to the camera's photosensitive element, the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP for processing, so that it is converted into an image visible to the naked eye. The ISP can also perform algorithm optimization on the noise and brightness of the image, and can optimize parameters such as the exposure and color temperature of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a complementary metal oxide semiconductor (complementary metal-oxide-semiconductor, CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
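The YUV-to-RGB conversion performed at the DSP stage can be illustrated with the full-range BT.601 formulas. Note that the document does not specify which color matrix the DSP uses, so the BT.601 coefficients here are an assumption:

```python
def yuv_to_rgb(y, u, v):
    # Full-range BT.601 YUV -> RGB conversion (assumed matrix; the DSP's
    # actual coefficients are not specified in this document).
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    clamp = lambda x: max(0, min(255, round(x)))
    return clamp(r), clamp(g), clamp(b)

print(yuv_to_rgb(128, 128, 128))  # neutral chroma -> mid-gray (128, 128, 128)
```

Neutral chroma (U = V = 128) maps luma straight to an equal-valued gray, which is a quick sanity check for any such matrix.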
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, or the like.
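As a sketch of the frequency-bin energy computation mentioned above, the following computes the energy of a single DFT bin directly; the function name and the 8-sample frame length are illustrative only, not part of the described device:

```python
import math
import cmath

def bin_energy(signal, k):
    # Energy of frequency bin k of an n-point DFT, computed directly
    # from the DFT definition (illustrative; a DSP would use an FFT).
    n = len(signal)
    x_k = sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
              for t in range(n))
    return abs(x_k) ** 2

# A pure tone at one cycle per 8-sample frame concentrates energy in bin 1.
tone = [math.cos(2 * math.pi * t / 8) for t in range(8)]
print(bin_energy(tone, 1) > bin_energy(tone, 2))  # True
```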
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving picture experts group (moving picture experts group, MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer-executable program code that includes instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), and the like. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, a phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (universal flash storage, UFS), and the like. The processor 110 performs various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture, among others. In the embodiment of the application, taking an Android system with a layered architecture as an example, a software structure of the electronic device 100 is illustrated.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The technical scheme of the application is described in detail through specific embodiments. It should be noted that the following embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments.
Optionally, before describing the image processing method, an image processing system to which the image processing method is applicable is first described.
Fig. 2 is a schematic structural diagram of an image processing system according to an embodiment of the present application. Fig. 3 is a schematic diagram of an application scenario according to an embodiment of the present application. As shown in fig. 2 and 3, in the present embodiment, the image processing system may include: the first device 201, the second device 202 connected with the first device 201, and the terminal 203 to be tested. The terminal to be tested 203 is provided with a camera application and a display screen.
The first device 201 is configured to send a first instruction to the terminal 203 to be tested; the terminal to be tested 203 is configured to, when receiving a first instruction, photograph a target graphic card with a camera application, store media data obtained by photographing, and send the media data to the first device 201;
the first device 201 is further configured to perform color analysis on an image to be processed in the received media data, to obtain first image color data;
the first device 201 is further configured to send a second instruction to the second device 202; the second device 202 is configured to collect, when receiving the second instruction, second image color data of an image to be processed displayed on a display screen of the terminal 203 to be tested, and send the second image color data to the first device 201;
The first device 201 is further configured to determine, upon receiving the second image color data, screen color difference information of the terminal 203 to be tested based on the first image color data and the second image color data.
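One common way to quantify the difference between the first image color data and the second image color data is the CIE76 ΔE*ab metric in CIELAB space. The document does not name a specific metric, so both the metric and the sample Lab values below are assumptions for illustration:

```python
import math

def delta_e76(lab1, lab2):
    # Euclidean distance between two CIELAB colors (CIE76 Delta-E*ab).
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

captured = (53.2, 80.1, 67.2)   # first image color data (hypothetical Lab)
displayed = (52.8, 78.9, 68.0)  # second image color data (hypothetical Lab)
print(round(delta_e76(captured, displayed), 2))
```

A small ΔE (commonly below about 2) indicates the captured and displayed colors are close; larger values would flag a screen color difference.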
In practical application, the first device 201 may be referred to as a control device or a control computer; it has global control capability and can control the working states of the terminal to be tested 203 and the second device 202. The first device 201 is further preset with a color analysis algorithm, which can perform color analysis on an image or a video frame shot by the terminal to be tested 203 to obtain the first image color data of the image to be processed.
The terminal to be tested 203 is a terminal device that has a photographing function and a display function; it is equipped with a camera application and a display screen, and can photograph the target graphics card through the camera application. It can be understood that, during photographing, in order to ensure the sharpness of the photographed image, the target graphics card in the camera application's preview screen may be focused on first, and photographing may be performed after focusing is finished.
The second device 202 is a device having a color analysis function, for example, the second device 202 may be a color analyzer, which may perform color analysis on an image to be processed displayed on a display screen of the terminal to be tested 203 to obtain second image color data of the image to be processed.
It will be appreciated that the second device 202 is a dedicated color analyzer that is capable of rapidly measuring the brightness and chromaticity of images displayed on various display screens. In this embodiment, the second device 202 can accurately measure the color data of the image to be processed presented on the display screen of the terminal 203 to be tested, which lays a foundation for the subsequent analysis of the color difference of the display screen of the terminal 203 to be tested.
In this embodiment, the image processing system is a type of control system, and the first device 201, the terminal to be tested 203, and the second device 202 included in the image processing system may each be a type of terminal device; the screen color difference information of the terminal to be tested 203 may be determined through the cooperation of the first device 201, the terminal to be tested 203, and the second device 202. The following explains, through the application scenario shown in fig. 3, how the image processing system solves the technical problem to be solved by the present application.
Illustratively, as shown in FIG. 3, an image processing system is first built. Referring to fig. 3, the application scenario may include a first device 201, a terminal to be tested 203, and a second device 202, and may further include a target graphics card.
Alternatively, the target graphics card may be a standard 24-color test card (24 color). The 24-color test card is a standard board that contains six gray-level color blocks, the additive three primary colors (red, green, and blue), the subtractive three primary colors (yellow, magenta, and cyan), and color blocks simulating the real colors of natural objects; the board carries 24 solid color blocks, numbered 1 to 24 from left to right and from top to bottom. It can be understood that the embodiment of the present application does not limit the specific implementation of the target graphics card; for example, the target color card may also be a 12-color card, a 36-color card, etc., which are not described herein.
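To illustrate how color data might be extracted per patch from a captured chart image, the sketch below averages each patch of a rows × cols grid. The assumption that the chart has been cropped to an exact 4 × 6 grid, and all function names, are hypothetical:

```python
def patch_means(image, rows=4, cols=6):
    # image: 2D list of (R, G, B) tuples cropped to the chart area.
    # Returns {patch_number: mean RGB}, numbered 1..rows*cols from left
    # to right and top to bottom, matching the 24-color card's numbering.
    h, w = len(image), len(image[0])
    ph, pw = h // rows, w // cols
    means = {}
    for r in range(rows):
        for c in range(cols):
            pix = [image[y][x]
                   for y in range(r * ph, (r + 1) * ph)
                   for x in range(c * pw, (c + 1) * pw)]
            means[r * cols + c + 1] = tuple(
                sum(p[i] for p in pix) / len(pix) for i in range(3))
    return means

# Tiny 2 x 3 example, one pixel per patch:
img = [[(255, 0, 0), (0, 255, 0), (0, 0, 255)],
       [(255, 255, 0), (255, 0, 255), (0, 255, 255)]]
print(patch_means(img, rows=2, cols=3)[5])  # patch 5 -> (255.0, 0.0, 255.0)
```

In practice each patch spans many pixels, so the per-patch mean suppresses sensor noise before any color difference comparison.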
In practical application of the present application, a tester fixes the terminal 203 to be tested on a bracket, then adjusts the distances among the terminal 203 to be tested, the target graphics card, the second device 202 and other devices, and then connects the power supply and the first device 201.
It will be appreciated that in practical applications, the first device 201 may be a control computer, the terminal to be tested 203 may be a terminal that needs to test a display screen, and the second device 202 may be a professional color analyzer.
It can be understood that in the process of setting up the image processing system, a tester may first fix the target color card, then open the camera application of the terminal to be tested 203 and start the second device 202. The position of the terminal to be tested 203 is fixed once the target color card is confirmed to lie within the camera viewfinder of the terminal to be tested 203, and the position of the second device 202 is determined once it is confirmed that the second device 202 can collect the image to be processed displayed on the display screen of the terminal to be tested 203. The positional relationship among the terminal to be tested 203, the second device 202 and the target color card is thereby determined, ensuring smooth implementation of the scheme.
In an alternative embodiment of the present application, after the image processing system is built, the first device 201 may send a first instruction to the terminal to be tested 203 based on an instruction of a tester, so that the terminal to be tested 203 photographs the target color card located in its camera viewfinder with the camera application, thereby obtaining media data about the target color card.
Alternatively, after determining that the photographing work of the terminal 203 to be tested is completed, the media data may be imported to the first device 201 so that the first device 201 processes it. Illustratively, the first device 201 may import the image to be processed in the media data into a color analysis system in the first device 201, generating first image color data for the image to be processed.
Optionally, when the image to be processed is displayed on the display screen of the terminal 203 to be tested, the first device 201 may also send a second instruction to the second device 202 based on the instruction of the tester, so that the second device 202 collects color data of the image to be processed currently displayed on the display screen of the terminal 203 to be tested, thereby obtaining color data of the second image. Accordingly, the second device 202 may transmit the second image color data of the image to be processed to the first device 201.
It will be appreciated that after the second device 202 collects the second image color data of the image to be processed in the media data, it may write the second image color data into an Excel file according to a predefined rule and store it, so that the data can subsequently be sent to the first device 201.
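The "predefined rule" for writing the measurements to a file is not specified here; the following sketch shows one plausible layout using Python's csv module as a stand-in for Excel, with hypothetical per-patch Lab columns.

```python
import csv

def save_color_data(path, rows):
    """Persist per-patch color measurements so they can later be sent to the
    first device. `rows` is a list of dicts with a patch index and measured
    Lab values; the column layout is an illustrative assumption."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["patch", "L", "a", "b"])
        writer.writeheader()
        writer.writerows(rows)

save_color_data("second_image_color.csv",
                [{"patch": 1, "L": 38.2, "a": 13.1, "b": 14.0},
                 {"patch": 2, "L": 66.4, "a": 17.3, "b": 17.6}])
```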
Optionally, after receiving the second image color data of the image to be processed from the second device 202, the first device 201 may perform consistency analysis on the second image color data and the first image color data of the image to be processed, and determine the consistency degree of the second image color data and the first image color data, so as to determine the screen color difference information of the terminal 203 to be tested based on the consistency degree.
It will be appreciated that in an embodiment of the present application, after the positions of the terminal to be tested 203 and the target color card are fixed, in an alternative embodiment the user may perform a first operation on the camera application icon on the interface; the terminal to be tested 203 starts the camera application in response to the first operation and displays a camera viewfinder on its interface. In another alternative embodiment, when a shortcut switch of the camera application is deployed on the pull-down notification interface of the terminal to be tested 203, the user may also perform the first operation on that shortcut switch, so that the terminal to be tested 203 starts the camera application in response to the first operation.
Optionally, a shooting trigger control is present on the interface where the camera viewfinder is located. Thus, the user may issue the first instruction by acting on the shooting trigger control to trigger the camera application to perform the shooting process.
The image processing system provided by the embodiment of the application comprises a first device, a second device connected with the first device, and a terminal to be tested. The first device can control the operation of the second device and the terminal to be tested: for example, the first device sends a first instruction to the terminal to be tested to trigger it to photograph the target color card with the camera application, and can also send a second instruction to the second device to instruct it to collect second image color data of the image to be processed displayed by the terminal to be tested. The first device can further receive the media data shot by the terminal to be tested, perform color analysis on the image to be processed in the media data to obtain first image color data, acquire the second image color data collected by the second device, and determine the screen color difference information of the terminal to be tested based on the first image color data and the second image color data. Therefore, based on the image processing system, the color consistency between the displayed image and the captured image of the terminal to be tested can be determined, so that the screen color difference information of the terminal to be tested can be accurately determined, the problem of low color difference analysis accuracy in the prior art is solved, and a reference basis is provided for subsequent optimization of the display screen of the terminal to be tested.
It will be appreciated that the image processing system provided by embodiments of the present application may be deployed in a scene with stable lighting, for example indoors or outdoors during a fixed period of the day. In practice, however, illumination brightness during the same time period varies with date and weather, so screen color difference information determined in the above manner may have low accuracy due to such external factors. To address this problem, the image processing system of the present embodiment may be disposed in a darkroom, with a light source added to provide the illumination environment.
Fig. 4 is a schematic diagram of another application scenario of an embodiment of the present application. As shown in fig. 4, in this application scenario, the image processing system may further comprise a light source connected to the first device 201.
Accordingly, in this embodiment, the first device 201 is further configured to send, before sending the first instruction to the terminal to be tested 203, a third instruction to the light source, where the third instruction is used to instruct the light source to adjust the brightness and the color temperature of the emitted illumination.
For example, in this embodiment, in order to ensure stable performance of the camera application, the target color card may be set in an environment with stable illumination and balanced color temperature. As shown in fig. 4, the light source may be a light box capable of emitting light at multiple brightness levels and multiple color temperature levels, and the target color card may be placed inside the light box, providing an illumination environment in which the terminal to be tested 203 captures images of the target color card with the camera application.
It can be appreciated that in the embodiment of the present application, the environment where the image processing system is located may further include other light sources, for example light sources disposed around the terminal to be tested 203, which can provide an illumination environment for the whole image processing system. The embodiment does not limit the type or number of such light sources, which may be determined according to actual scene requirements and are not described herein.
In practical applications, the first device 201 may send a third instruction to the light source to instruct it to emit the illumination brightness corresponding to a brightness level and the color temperature corresponding to a color temperature level, and then send the first instruction to the terminal to be tested 203, so that the terminal to be tested 203 photographs the target color card with the camera application.
In an embodiment of the present application, configuration information is preset in the first device 201, and the configuration information may include brightness level information, color temperature level information, and parameter adjustment interval.
Correspondingly, the first device 201 is further configured to determine the number of lighting environments and the brightness parameters and the color temperature parameters of the various lighting environments by permutation and combination of the brightness level information and the color temperature level information, where the brightness parameters and/or the color temperature parameters in different lighting environments are different.
For example, assuming the brightness levels include L1, L2 and L3 and the color temperature levels include S1, S2 and S3, the lighting environments determined from the brightness level information (L1, L2, L3) and the color temperature level information (S1, S2, S3) are the nine combinations (L1, S1), (L1, S2), (L1, S3), (L2, S1), (L2, S2), (L2, S3), (L3, S1), (L3, S2) and (L3, S3). Because each brightness level corresponds to a different brightness parameter and each color temperature level corresponds to a different color temperature parameter, the brightness parameter and/or the color temperature parameter differ between lighting environments.
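The permutation-and-combination of levels can be sketched as a Cartesian product. The numeric brightness and color temperature values below are assumed for illustration; the application does not specify them.

```python
from itertools import product

# Assumed example values: brightness in nits, color temperature in kelvin.
brightness_levels = {"L1": 100, "L2": 200, "L3": 300}
color_temp_levels = {"S1": 4000, "S2": 5000, "S3": 6500}

# Cartesian product of the two level lists: 3 x 3 = 9 lighting environments,
# each differing in brightness parameter and/or color temperature parameter.
environments = [
    {"brightness": b, "color_temp": s}
    for b, s in product(brightness_levels.values(), color_temp_levels.values())
]
```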
As an example, the first device 201 is specifically configured to send, by a third instruction, the brightness parameter and the color temperature parameter of one lighting environment to the light source once every parameter adjustment interval, until it has sent parameters as many times as there are lighting environments. Correspondingly, after receiving the brightness parameter and the color temperature parameter of a lighting environment, the light source emits the illumination brightness corresponding to the brightness parameter and the color temperature corresponding to the color temperature parameter.
That is, the first device 201 may, with the parameter adjustment interval as the time interval, sequentially send the brightness parameters and color temperature parameters of the various lighting environments to the light source through third instructions, each third instruction instructing the light source to emit the illumination brightness corresponding to the brightness parameter and the color temperature corresponding to the color temperature parameter.
For example, assume the parameter adjustment interval is Δt and the first device 201 starts sending instructions to the light source at time T1. At time T1 it sends the brightness parameter and color temperature parameter of (L1, S1), and the light source emits the brightness corresponding to level L1 and the color temperature corresponding to level S1; at time T1+Δt it sends the parameters of (L1, S2), and the light source switches to the color temperature corresponding to level S2; after another interval of Δt it sends the parameters of (L1, S3), and the light source switches to the color temperature corresponding to level S3; and so on, until parameters have been sent as many times as there are lighting environments, that is, until the parameters of all lighting environments have been sent and the light source has emitted the brightness and color temperature of every lighting environment.
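The per-interval sending loop described above can be sketched as follows; `send_third_instruction` is a hypothetical stand-in for the first device's link to the light source, not an API defined by the application.

```python
import time

def drive_light_source(send_third_instruction, environments, interval_s):
    """Send one environment's brightness/color-temperature parameters per
    parameter adjustment interval, as many times as there are environments."""
    for env in environments:
        send_third_instruction(env)   # light source applies these parameters
        time.sleep(interval_s)        # wait delta-t before the next environment

# Record what would be sent, using a short interval for demonstration.
sent = []
drive_light_source(sent.append,
                   [{"brightness": 100, "color_temp": 4000},
                    {"brightness": 100, "color_temp": 5000}],
                   interval_s=0.01)
```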
Alternatively, in this embodiment, the first device 201 is specifically configured to send the first instruction to the terminal to be tested 203 after sending the third instruction to the light source each time. In this way, the terminal 203 to be tested can perform the photographing process under different illumination environments, so that media data photographed by the terminal 203 to be tested under different illumination environments can be obtained.
In this example, the first device 201 controls the light source multiple times so that the target color card is placed in different lighting environments, and correspondingly controls the shooting behavior of the terminal to be tested 203, so that media data of adequate quality can be obtained, laying a foundation for the subsequent high-precision analysis of screen color difference information from the images.
As another example, the first device 201 is further configured to send the parameter adjustment interval, the number of lighting environments, and the brightness parameters and color temperature parameters of the various lighting environments to the light source in a single third instruction. Correspondingly, after receiving the third instruction, the light source adjusts its emitted illumination brightness and color temperature once every parameter adjustment interval according to the brightness parameters and color temperature parameters of the various lighting environments, performing (number of lighting environments − 1) adjustments in total.
For example, assuming the parameter adjustment interval is Δt, the first device 201 may send the parameter adjustment interval Δt, the number of lighting environments determined above (nine), and the brightness and color temperature parameters of the environments (L1, S1), (L1, S2), (L1, S3), (L2, S1), (L2, S2), (L2, S3), (L3, S1), (L3, S2) and (L3, S3) to the light source, so that the light source can switch the presented lighting environment at different times. For example, the light source emits the brightness corresponding to level L1 and the color temperature corresponding to level S1 at time T1; at time T1+Δt it automatically switches to the brightness corresponding to level L1 and the color temperature corresponding to level S2; after another Δt, that is, at time T1+2Δt, it emits the brightness corresponding to level L1 and the color temperature corresponding to level S3; and so on, until the light source has emitted the brightness and color temperature of all lighting environments, having performed (number of lighting environments − 1) adjustments.
Optionally, in this example, the first device 201 is further configured to send the parameter adjustment interval and the number of lighting environments to the terminal to be tested 203. Correspondingly, after receiving the second instruction, the terminal to be tested 203 photographs the target color card with the camera application once every parameter adjustment interval, until it has photographed as many times as there are lighting environments. For example, assume the time difference between the terminal to be tested 203 photographing and the light source adjusting is δt, with δt < Δt. If the light source emits the brightness corresponding to level L1 and the color temperature corresponding to level S1 at time T1, the terminal to be tested 203 may automatically photograph the target color card at time T1+δt, photograph it again at time T1+Δt+δt, repeat the photographing process every further Δt, and in total photograph as many times as there are lighting environments (nine times).
In this example, the light source can automatically adjust the illumination it emits to present different lighting environments, and the terminal to be tested 203 can automatically execute the shooting process, thereby reducing delay, improving the degree of automation of the image processing system, and laying a foundation for subsequently determining accurate screen color difference information.
In the embodiment of the present application, after receiving the media data sent by the terminal to be tested 203, the first device 201 may further classify and rename the media data before calculating the image color data of the image to be processed in the media data, so as to obtain at least one type of image set, where the illumination environment is the same when the image is shot in each type of image set, and the identification of the shooting terminal is the same.
In practical applications, the image processing system may include a plurality of terminals to be tested 203 at the same time, and different terminals to be tested 203 may be of different models with display screens that render color differently. In this embodiment, one first device 201 may therefore process media data captured by a plurality of terminals to be tested 203 simultaneously. The media data captured by each terminal to be tested 203 may carry the model of the terminal, the lighting environment information, and so on, so that after receiving the media data, the first device 201 can classify and rename it according to terminal model, lighting environment information, and the like, obtaining the image sets for the different capture conditions.
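The classify-and-rename step might look like the following sketch; the metadata field names and the renaming pattern are assumptions for illustration, since the application does not fix them.

```python
from collections import defaultdict

def classify_media(media_items):
    """Group captured images into sets keyed by (terminal id, lighting
    environment), renaming each image with a uniform pattern so that every
    set contains images shot under the same conditions by the same terminal."""
    image_sets = defaultdict(list)
    for item in media_items:
        key = (item["terminal_id"], item["environment"])
        new_name = f"{item['terminal_id']}_{item['environment']}_{len(image_sets[key]) + 1}.jpg"
        image_sets[key].append(new_name)
    return dict(image_sets)

sets = classify_media([
    {"terminal_id": "phoneA", "environment": "L1-S1"},
    {"terminal_id": "phoneA", "environment": "L1-S1"},
    {"terminal_id": "phoneB", "environment": "L2-S3"},
])
```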
It can be appreciated that the number, model, style, etc. of the terminals 203 to be tested are not limited, and may be determined according to actual testing requirements, which are not described herein.
Alternatively, in this embodiment, for the second device 202 to successfully collect the color data of the image to be processed displayed by the terminal to be tested 203, the image to be processed needs to be displayed on the interface of the terminal to be tested 203. Thus, the first device 201 is further configured to send a fourth instruction to the terminal to be tested 203 before sending the second instruction to the second device 202, the fourth instruction instructing the terminal to be tested 203 to display the image to be processed in the media data. In this way, after receiving the second instruction sent by the first device 201, the second device 202 can successfully acquire the color data of the image to be processed, providing the conditions for subsequently evaluating whether the color data of the displayed image of the terminal to be tested 203 is consistent with that of the captured image.
Optionally, in an alternative embodiment, a color computing system capable of running a color processing algorithm is preset in the first device 201, so that color analysis can be performed on the image to be processed in the media data to obtain the first image color data.
Optionally, in an alternative embodiment, the first device determines the screen color difference information of the terminal to be tested based on the first image color data and the second image color data as follows: the first device determines a first gamut range corresponding to the first image color data and a second gamut range corresponding to the second image color data based on a preset color space, then calculates the gamut consistency of the first gamut range and the second gamut range, and finally determines the screen color difference information of the terminal to be tested according to the gamut consistency.
In practical applications, a suitable color space must be selected to measure the degree of deviation between two colors. The most commonly used color space in daily life is the RGB color space, but when Euclidean distance is used to describe the difference between two colors in RGB space, there is a serious problem: the calculated distance does not correctly represent the difference actually perceived by people. Among the color space standards established by the International Commission on Illumination (CIE), distances calculated in the Lab color space are largely consistent with perceived differences, so the preset color space in the embodiment of the present application may be the Lab color space.
The Lab color space is a device-independent color model and a color-opponent space consisting of three components: one is lightness (L), while a and b are two color channels representing the opponent color dimensions. The a channel runs from green (negative values) through gray (values near zero) to magenta-red (positive values); the b channel runs from blue (negative values) through gray to yellow (positive values).
In the embodiment of the application, after the first device acquires the first image color data and the second image color data of the image to be processed, it first converts both into gamut ranges in the selected Lab color space, and then performs the color comparison based on the first gamut range corresponding to the first image color data and the second gamut range corresponding to the second image color data; that is, the screen color difference information of the terminal to be tested is evaluated based on the degree of overlap between the second gamut range and the first gamut range.
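As a minimal sketch of the kind of computation involved, the following converts 8-bit sRGB values to Lab (D65 white point) and measures their difference with the CIE76 formula, i.e. Euclidean distance in Lab space. This is standard colorimetry, not the specific algorithm of the application's color analysis system.

```python
import math

def srgb_to_lab(r, g, b):
    """Convert 8-bit sRGB to CIE Lab under the D65 white point."""
    def lin(c):  # undo the sRGB transfer curve
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = lin(r), lin(g), lin(b)
    # sRGB -> CIE XYZ (D65)
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    def f(t):  # CIE Lab companding function
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def delta_e76(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two Lab colors."""
    return math.dist(lab1, lab2)
```

For instance, pure white (255, 255, 255) maps to approximately L = 100, a = 0, b = 0, and identical colors have a ΔE of zero.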
Illustratively, the horseshoe diagram is one representation used for color comparison: a two-dimensional representation, constructed from the three primary colors, of the spectrum of colors visible to the human eye. In the display industry, the horseshoe diagram is a very important criterion for determining the color gamut range of a display screen. By way of example, the degree of consistency between the captured image color data (first image color data) and the displayed image color data (second image color data) of the terminal to be tested can be characterized by a horseshoe diagram as follows.
Fig. 5 is an exemplary color gamut comparison diagram of the captured image color data and the displayed image color data of the terminal to be tested. As shown in fig. 5, the first gamut range corresponding to the first image color data, that is, the gamut range of the captured image color data, is drawn with a thin solid line, and the second gamut range corresponding to the second image color data, that is, the gamut range of the displayed image color data, is drawn with a thick solid line. Referring to fig. 5, in the red and green dimensions the second color gamut is larger than the first, while in the blue dimension the second color gamut is slightly smaller than the first. If the screen color difference of the terminal to be tested is evaluated with the captured image as reference, it can be concluded that the screen of the terminal to be tested may be oversaturated in the green and red dimensions and slightly undersaturated in the blue dimension. In the embodiment of the application, comparing the color consistency of the captured image and the displayed image effectively evaluates the display color accuracy of the screen of a terminal to be tested with a photographing function, and at the same time indicates a direction for display debugging, laying a foundation for improving the display accuracy of the terminal to be tested.
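A coarse numeric counterpart to the visual gamut comparison in fig. 5 is the ratio of gamut-triangle areas in CIE xy chromaticity coordinates. The sRGB and P3-like primaries below are illustrative stand-ins for the captured-image and displayed-image gamuts, not measured data from the application.

```python
def triangle_area(p1, p2, p3):
    """Shoelace area of the gamut triangle spanned by the R/G/B primaries
    given as (x, y) CIE chromaticity coordinates."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Illustrative primaries: sRGB standing in for the captured-image gamut,
# DCI-P3-like primaries standing in for a wider display gamut.
captured = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
display = [(0.68, 0.32), (0.265, 0.69), (0.15, 0.06)]

# A ratio above 1 means the display gamut is wider than the captured-image
# gamut, hinting at possible oversaturation on the screen under test.
ratio = triangle_area(*display) / triangle_area(*captured)
```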
It will be appreciated that the embodiments of the present application do not limit the specific form of each device in the image processing system, which may be determined according to the actual scenario.
For example, the light source may also be referred to as a light control system, which may place the terminal to be tested in different lighting environments.
The camera applications of the terminals to be tested can be understood as an image acquisition system: each terminal to be tested can capture images, and can do so under different lighting environments, increasing the number and variety of images available for screen color difference analysis.
The second device may also be referred to as a display image color acquisition system, i.e. it is capable of acquiring display image color data of an image to be processed displayed on the interface of the terminal to be tested.
The first device can also serve as the captured-image color computing system and the color consistency evaluation system: the images shot by the terminal to be tested can be transmitted to the first device, so that the first device can calculate the captured image color data of the image to be processed with the color analysis algorithm of the captured-image color computing system, and can perform consistency analysis between the displayed image color data and the captured image color data of the image to be processed with the color consistency evaluation system, obtaining the screen color difference information of the terminal to be tested.
In the embodiment of the application, by constructing an image processing system comprising a first device, a second device connected with the first device, and a terminal to be tested, both the color data of the image captured by the terminal to be tested and the color data of the image it displays can be obtained, and consistency analysis can then be performed on the two to determine the screen color difference information of the terminal to be tested. This solves the problem of low color difference analysis precision in traditional color difference analysis methods; the scheme is simple, easy to implement, and reduces the cost of color difference analysis.
The following explains a specific implementation of the image processing method provided by the embodiment of the present application on the basis of the image processing system provided by each of the above embodiments.
Fig. 6 is a schematic flow chart of an image processing method according to an embodiment of the present application. The embodiment of the method is applied to the image processing system shown in the above-mentioned fig. 2 to 4, and as shown in fig. 2 to 4, the image processing system comprises a first device, a second device connected with the first device, and a terminal to be tested. As shown in fig. 6, the image processing method may include the steps of:
S601, the first device sends a first instruction to the terminal to be tested, where the first instruction is used to instruct the terminal to be tested to photograph a target color card with a camera application.
S602, the first device receives media data shot by the terminal to be tested, and performs color analysis on an image to be processed in the media data to obtain first image color data.
S603, the first device sends a second instruction to the second device, wherein the second instruction is used for instructing the second device to collect second image color data of the to-be-processed image displayed by the to-be-tested terminal.
S604, the first device acquires second image color data acquired by the second device, and determines screen color difference information of the terminal to be tested based on the first image color data and the second image color data.
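The S601-S604 flow can be sketched from the first device's point of view as follows; all device interfaces here are hypothetical stand-ins, not APIs defined by the application.

```python
def run_color_difference_test(terminal, second_device, analyze_colors, compare):
    """Minimal sketch of the S601-S604 control flow on the first device."""
    terminal.send_first_instruction()                # S601: shoot the color card
    media = terminal.fetch_media_data()              # S602: receive media data
    first_color = analyze_colors(media["image"])     # S602: captured-image colors
    second_device.send_second_instruction()          # S603: measure the display
    second_color = second_device.fetch_color_data()  # S604: displayed-image colors
    return compare(first_color, second_color)        # S604: screen color difference
```

A usage example would plug in a terminal driver, a color analyzer driver, the color analysis routine, and a gamut-consistency comparison in place of the four parameters.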
In this embodiment, in the assembled image processing system, the positions of the terminal to be tested and the target color card are fixed, and the first device, the second device and the terminal to be tested are connected. Therefore, after the first device receives a start instruction, the technical scheme of the application can be executed automatically.
Optionally, because what the second device collects is color data of the image to be processed captured by the terminal to be tested, the first device first sends a first instruction to the terminal to be tested, so that after receiving the first instruction the terminal to be tested starts photographing the target color card with the camera application, obtaining media data.
It can be understood that in the specific implementation process neither the number nor the category of target color cards is limited; they can be determined according to actual requirements and are not described herein.
On the one hand, the terminal to be tested can import the shot media data into the first device, so that the first device performs color analysis on the image to be processed based on a built-in image color analysis method to obtain first image color data of the image to be processed.
On the other hand, the first device may further control the terminal to be tested to display the image to be processed on the interface, and then send a second instruction to the second device, so that the second device collects second image color data of the image to be processed displayed by the terminal to be tested based on the received second instruction.
It can be understood that the first device may process a plurality of images in the media data to obtain first image color data of the plurality of images to be processed, and at the same time, the first device may also control the terminal to be tested and the second device to work cooperatively, and obtain second image color data of the plurality of images to be processed, which is collected by the second device.
Optionally, in this embodiment, the plurality of images included in the media data may be photographed under different illumination environments, and color difference information of the terminal screen to be tested may be accurately determined by performing color consistency analysis on the images under different illumination environments.
Optionally, in an embodiment of the present application, if the image processing system further includes a light source connected to the first device, before S601, the image processing method may further include: the first device sends a third instruction to the light source, the third instruction being used to instruct the light source to adjust the emitted illumination brightness and color temperature.
Optionally, brightness level information, color temperature level information and a parameter adjustment interval are preset in the first device. Correspondingly, the image processing method may further include: the first device permutes and combines the brightness level information and the color temperature level information to determine the number of lighting environments and the brightness parameters and color temperature parameters of the various lighting environments, where the brightness parameter and/or the color temperature parameter differ between different lighting environments.
It will be appreciated that in embodiments of the present application, in order to reduce interference of the external environment with the images captured by the camera application, the image processing system may be disposed in a darkroom, with the different illumination environments provided by the light source. Optionally, in this embodiment, to improve the automation of the image processing system, the light source may be connected to the first device; after the processing begins, the first device may first determine, based on the set brightness level information, color temperature level information, and the like, the lighting environments in which the terminal to be tested captures images, and then control the terminal to be tested to capture images under those different lighting environments.
It is understood that the first device may control the operating modes of the light source and/or the terminal to be tested in different ways, as follows:
In an alternative embodiment of the present application, the first device sending a third instruction to the light source includes: each time the parameter adjustment interval elapses, the first device sends the brightness parameter and color temperature parameter of one illumination environment to the light source through a third instruction, where the third instruction instructs the light source to emit the illumination brightness corresponding to that brightness parameter and the color temperature corresponding to that color temperature parameter, until the number of transmissions reaches the number of illumination environments.
Correspondingly, the first device sending a first instruction to the terminal to be tested includes: the first device sends the first instruction to the terminal to be tested after each time it sends the third instruction to the light source.
In another alternative embodiment of the present application, the first device sending a third instruction to the light source includes: the first device sends the parameter adjustment interval, the number of illumination environments, and the brightness and color temperature parameters of each illumination environment to the light source through a third instruction, where the third instruction instructs the light source to adjust the emitted illumination brightness and color temperature once per parameter adjustment interval, according to the brightness and color temperature parameters of the respective illumination environments, until the number of adjustments reaches the number of illumination environments minus one.
Correspondingly, before the first device sends the first instruction to the terminal to be tested, the image processing method further includes: the first device sends the parameter adjustment interval and the number of illumination environments to the terminal to be tested. The parameter adjustment interval is the time interval at which the terminal to be tested shoots the target graphic card using the camera application after receiving the second instruction, and the number of illumination environments is the number of times the terminal to be tested shoots the target graphic card using the camera application.
In these embodiments, the first device controls the cooperative operation of the terminal to be tested and the light source in different ways, so that the terminal to be tested can shoot media data under different illumination environments, which provides the conditions for the subsequent computation of the display-image color data and the captured-image color data of the terminal to be tested.
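As a rough illustration of the second coordination mode, in which the light source steps through the environments on a fixed interval while the terminal shoots once per interval, the schedule might be sketched as below. The callback names `set_light` and `capture` are hypothetical stand-ins for the third and first instructions; the actual device protocol is not specified here.

```python
import time

def run_capture_schedule(environments, interval_s, set_light, capture):
    """Drive the light source and the camera in lockstep.

    `environments` is the list of illumination environments (brightness
    and color temperature parameters); `interval_s` is the parameter
    adjustment interval in seconds. For each environment, the light
    source is configured first, the terminal then shoots the target
    graphic card, and the schedule waits out the interval.
    """
    shots = []
    for env in environments:
        set_light(env)              # light source applies this environment
        shots.append(capture(env))  # terminal shoots under this environment
        time.sleep(interval_s)      # wait for the adjustment interval
    return shots
```

Because both sides are driven by the same interval and environment count, the terminal ends up with exactly one capture per illumination environment.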
Optionally, in an optional embodiment of the present application, after the first device receives the media data shot by the terminal to be tested, the method further includes: the first device classifies and renames the media data to obtain at least one class of image sets, where the images in each class share the same illumination environment and the same shooting terminal identifier.
Optionally, the first device classifies and renames the acquired media data in order to make it easier to process. For example, images with the same illumination environment and the same shooting terminal identifier may be grouped into one class, yielding several classes of image sets; the images to be processed can then be handled class by class, improving data processing efficiency.
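A minimal sketch of this classify-and-rename step follows. The record keys (`env`, `terminal_id`, `path`) and the naming pattern are hypothetical choices for illustration; the patent does not prescribe a naming scheme.

```python
from collections import defaultdict

def classify_and_rename(media_items):
    """Group images by (illumination environment, shooting terminal id)
    and assign each image a systematic name within its class.

    `media_items` is a list of dicts with assumed keys 'env',
    'terminal_id', and 'path'. Returns a mapping from each
    (env, terminal_id) class to its list of new file names.
    """
    groups = defaultdict(list)
    for item in media_items:
        # Images sharing environment and terminal id fall into one class.
        groups[(item["env"], item["terminal_id"])].append(item)

    renamed = {}
    for (env, tid), items in groups.items():
        # Systematic names make class-by-class processing straightforward.
        renamed[(env, tid)] = [
            f"{tid}_env{env}_{i:03d}.jpg" for i, _ in enumerate(items)
        ]
    return renamed
```

Each class can then be fed to the color analysis stage independently, which is what enables the class-by-class processing mentioned above.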
Optionally, in an optional embodiment, before the first device sends the second instruction to the second device, the image processing method may further include: sending a fourth instruction to the terminal to be tested, where the fourth instruction instructs the terminal to be tested to display the image to be processed in the media data. In this way the second device can successfully acquire the color data of the image to be processed as displayed by the terminal to be tested, which provides the basis for the subsequent color analysis.
Optionally, in an optional embodiment, the first device determines, based on the first image color data and the second image color data, screen color difference information of the terminal to be tested by:
determining, based on a preset color space, a first color gamut range corresponding to the first image color data and a second color gamut range corresponding to the second image color data; calculating the color gamut consistency of the first color gamut range and the second color gamut range; and determining the screen color difference information of the terminal to be tested according to the color gamut consistency.
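One way to realize the gamut-consistency computation is to compare the chromaticity extents of the two sets of color data, for example by the overlap ratio of their axis-aligned ranges in a chosen color space. This is only a sketch under assumed conventions (CIE xy chromaticity points, an intersection-over-union metric); the patent does not fix a specific metric.

```python
def gamut_range(points):
    """Axis-aligned chromaticity extent (x_min, y_min, x_max, y_max)
    of a set of (x, y) points in a preset color space, e.g. CIE xy."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

def gamut_consistency(range_a, range_b):
    """Overlap area divided by union area of two extents.
    1.0 means identical ranges; 0.0 means no overlap."""
    ax0, ay0, ax1, ay1 = range_a
    bx0, by0, bx1, by1 = range_b
    # Width and height of the intersection rectangle (clamped at 0).
    iw = max(0.0, min(ax1, bx1) - max(ax0, bx0))
    ih = max(0.0, min(ay1, by1) - max(ay0, by0))
    inter = iw * ih
    area_a = (ax1 - ax0) * (ay1 - ay0)
    area_b = (bx1 - bx0) * (by1 - by0)
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

A consistency close to 1.0 would indicate that the captured image and the displayed image span nearly the same gamut, i.e. little screen color difference; lower values indicate a larger difference.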
The above describes the scheme for determining the screen color difference information of the terminal to be tested based on the image processing system. Through the cooperation of the devices in the image processing system, the influence of the shooting performance of the terminal to be tested on the measured screen color difference can be reduced, so that the screen color difference information can be determined simply and accurately, improving the accuracy of the color difference analysis.
In the embodiments of the present application, the device implementing the image processing method may be divided into functional modules according to the above method examples; for example, each function may be assigned its own module, or two or more functions may be integrated into one processing module. The integrated module may be implemented in hardware or as a software functional module. It should be noted that the division into modules in the embodiments of the present application is schematic and merely a logical division of functions; other divisions are possible in actual implementation.
Fig. 7 is a schematic structural diagram of a chip according to an embodiment of the present application. As shown in fig. 7, the chip 700 includes one or more processors 701 (for example, two), communication lines 702, a communication interface 703, and a memory 704.
In some implementations, the memory 704 stores the following elements: executable modules or data structures, or a subset thereof, or an extended set thereof.
The method described in the above embodiments of the present application may be applied to the processor 701 or implemented by the processor 701. The processor 701 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor 701 or by instructions in the form of software. The processor 701 may implement or perform the methods, steps, and logic blocks associated with the various processes disclosed in embodiments of the application.
The steps of the method disclosed in connection with the embodiments of the present application may be embodied directly as being executed by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well established in the art, such as random access memory, read-only memory, programmable read-only memory, or electrically erasable programmable read-only memory (EEPROM). The storage medium is located in the memory 704; the processor 701 reads the information in the memory 704 and, in combination with its hardware, performs the steps of the above method.
The processor 701, the memory 704 and the communication interface 703 may communicate with each other via a communication line 702.
In the above embodiments, the instructions stored by the memory for execution by the processor may be implemented in the form of a computer program product. The computer program product may be written in the memory in advance, or may be downloaded in the form of software and installed in the memory.
Embodiments of the present application also provide a computer program product comprising one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another by wired means (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless means (e.g., infrared, radio, microwave). The computer-readable storage medium may be any usable medium accessible by a computer, including a semiconductor medium (e.g., a solid state disk (SSD)).
The embodiment of the application also provides a computer readable storage medium. The methods described in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer readable media can include computer storage media and communication media and can include any medium that can transfer a computer program from one place to another. The storage media may be any target media that is accessible by a computer.
In one possible implementation, the computer-readable medium may include RAM, ROM, compact disc read-only memory (CD-ROM) or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can carry or store the desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber-optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber-optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The foregoing detailed description of the application has been presented for purposes of illustration and description, and it should be understood that the foregoing is by way of illustration and description only, and is not intended to limit the scope of the application.

Claims (11)

1. An image processing method, the method comprising:
a first device sends a first instruction to a terminal to be tested, wherein the first instruction is used for instructing the terminal to be tested to shoot a target graphic card by using a camera application;
the first equipment receives media data shot by the terminal to be tested, and performs color analysis on an image to be processed in the media data to obtain first image color data;
the first device sends a second instruction to a second device, wherein the second instruction is used for instructing the second device to acquire second image color data of the image to be processed, which is displayed by the terminal to be tested;
the first device acquires the second image color data acquired by the second device, and determines the screen color difference information of the terminal to be tested based on the first image color data and the second image color data;
wherein the first device determining the screen color difference information of the terminal to be tested based on the first image color data and the second image color data comprises:
determining a first color gamut range corresponding to the first image color data and a second color gamut range corresponding to the second image color data based on a preset color space;
calculating the color gamut consistency of the first color gamut range and the second color gamut range;
and according to the color gamut consistency, determining the screen color difference information of the terminal to be tested.
2. The method of claim 1, wherein prior to the first device sending the first instruction to the terminal under test, the method further comprises:
the first device sends a third instruction to the light source, wherein the third instruction is used for indicating the light source to adjust the emitted illumination brightness and color temperature.
3. The method according to claim 2, wherein the first device is preset with brightness level information, color temperature level information, and parameter adjustment intervals; the method further comprises the steps of:
the first equipment performs permutation and combination on the brightness level information and the color temperature level information, and determines the quantity of illumination environments and brightness parameters and color temperature parameters of various illumination environments, wherein the brightness parameters and/or the color temperature parameters in different illumination environments are different;
the first device sending a third instruction to the light source, comprising:
the first device sends a brightness parameter and a color temperature parameter of one illumination environment to the light source through a third instruction each time the parameter adjustment interval elapses, the third instruction being used to instruct the light source to emit the illumination brightness corresponding to the brightness parameter and the color temperature corresponding to the color temperature parameter, until the number of transmissions reaches the number of illumination environments;
the first device sending a first instruction to the terminal to be tested comprises:
and the first device sends the first instruction to the terminal to be tested after sending the third instruction to the light source each time.
4. The method according to claim 2, wherein the first device is preset with brightness level information, color temperature level information, and parameter adjustment intervals; the method further comprises the steps of:
the first equipment performs permutation and combination on the brightness level information and the color temperature level information to determine the quantity of illumination environments and brightness parameters and color temperature parameters of various illumination environments, and the brightness parameters and/or the color temperature parameters in different illumination environments are different;
the first device sending a third instruction to the light source comprises: the first device sends the parameter adjustment interval, the number of illumination environments, and the brightness parameters and color temperature parameters of the illumination environments to the light source through the third instruction, wherein the third instruction instructs the light source to adjust the emitted illumination brightness and color temperature once per parameter adjustment interval according to the brightness parameters and color temperature parameters of the respective illumination environments, until the number of adjustments reaches the number of illumination environments minus one;
Before the first device sends the first instruction to the terminal to be tested, the method further comprises:
the first device sends the parameter adjustment interval and the number of illumination environments to the terminal to be tested, wherein the parameter adjustment interval is the time interval at which the terminal to be tested shoots the target graphic card using the camera application after receiving the second instruction, and the number of illumination environments is the number of times the terminal to be tested shoots the target graphic card using the camera application.
5. The method according to any one of claims 1 to 4, further comprising, after the first device receives media data photographed by the terminal to be tested:
the first device classifies and renames the media data to obtain at least one type of image set, wherein the illumination environments of the images in each type of image set are the same, and the shooting terminal identifiers are the same.
6. The method of any of claims 1-4, wherein prior to the first device sending a second instruction to the second device, the method further comprises:
and sending a fourth instruction to the terminal to be tested, wherein the fourth instruction is used for indicating the terminal to be tested to display the image to be processed in the media data.
7. An image processing system is characterized by comprising a first device, a second device and a terminal to be tested, wherein the second device is connected with the first device; the terminal to be tested is provided with a camera application and a display screen;
the first device is used for sending a first instruction to the terminal to be tested;
the terminal to be tested is used for shooting the target graphic card by using a camera application when receiving the first instruction, storing media data obtained by shooting, and sending the media data to the first device;
the first device is further configured to perform color analysis on the received image to be processed in the media data, so as to obtain first image color data;
the first device is further configured to send a second instruction to the second device;
the second device is used for collecting second image color data of the image to be processed, which is displayed on the display screen of the terminal to be tested, and sending the second image color data to the first device when receiving the second instruction;
the first device is further configured to determine, when receiving the second image color data, screen color difference information of the terminal to be tested based on the first image color data and the second image color data;
wherein the first device determining the screen color difference information of the terminal to be tested based on the first image color data and the second image color data comprises:
determining a first color gamut range corresponding to the first image color data and a second color gamut range corresponding to the second image color data based on a preset color space;
calculating the color gamut consistency of the first color gamut range and the second color gamut range;
and according to the color gamut consistency, determining the screen color difference information of the terminal to be tested.
8. The system of claim 7, further comprising a light source coupled to the first device;
the first device is further configured to send a third instruction to the light source before sending the first instruction to the terminal to be tested, where the third instruction is used to instruct the light source to adjust the brightness and the color temperature of the emitted light.
9. The system of claim 8, wherein the light source is a light box and the target graphic card is disposed inside the light box.
10. An electronic device, comprising: a memory for storing a computer program and a processor for executing the computer program to perform the image processing method according to any one of the preceding claims 1 to 6.
11. A computer-readable storage medium storing instructions that, when executed, cause a computer to perform the image processing method according to any one of claims 1 to 6.
CN202210983176.8A 2022-08-16 2022-08-16 Image processing method, system and storage medium Active CN116091392B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210983176.8A CN116091392B (en) 2022-08-16 2022-08-16 Image processing method, system and storage medium


Publications (2)

Publication Number Publication Date
CN116091392A CN116091392A (en) 2023-05-09
CN116091392B true CN116091392B (en) 2023-10-20

Family

ID=86208870

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210983176.8A Active CN116091392B (en) 2022-08-16 2022-08-16 Image processing method, system and storage medium

Country Status (1)

Country Link
CN (1) CN116091392B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117857762B (en) * 2024-03-08 2024-05-14 深圳市东陆科技有限公司 Image processing method and device of display module, chip and storage medium

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1460967A (en) * 2002-05-18 2003-12-10 Lg.菲利浦Lcd株式会社 Image quality analysis method and system for display device
CN101312502A (en) * 2007-05-21 2008-11-26 富士胶片株式会社 Image taking apparatus, image reproducing apparatus, image taking method and program
CN102026003A (en) * 2010-12-24 2011-04-20 广东威创视讯科技股份有限公司 Method and device for correcting colors of multi-screen splicing wall
CN103167293A (en) * 2011-12-09 2013-06-19 夏普株式会社 Display system and computer-readable medium
CN103453994A (en) * 2012-05-29 2013-12-18 技嘉科技股份有限公司 Chromatic aberration test fixture, chromatic aberration test interface device and chromatic aberration test method
WO2016145831A1 (en) * 2015-09-08 2016-09-22 中兴通讯股份有限公司 Image acquisition method and device
CN108012133A (en) * 2017-12-12 2018-05-08 广东欧珀移动通信有限公司 Image processing method, device, computer-readable recording medium and computer equipment
CN108470334A (en) * 2018-03-20 2018-08-31 上海顺久电子科技有限公司 A kind of method and device of acquisition screen intensity and coloration
CN112712475A (en) * 2020-12-17 2021-04-27 珠海格力电器股份有限公司 Image color difference compensation method and device, electronic equipment and storage medium
CN113496477A (en) * 2020-04-07 2021-10-12 华为技术有限公司 Screen detection method and electronic equipment
CN113645494A (en) * 2021-08-10 2021-11-12 海信视像科技股份有限公司 Screen fusion method, display device, terminal device and server
CN113810601A (en) * 2021-08-12 2021-12-17 荣耀终端有限公司 Terminal image processing method and device and terminal equipment
CN114286067A (en) * 2021-12-28 2022-04-05 青岛海信激光显示股份有限公司 Projection equipment and image correction method
WO2022071413A1 (en) * 2020-10-02 2022-04-07 富士フイルム株式会社 Image processing device, endoscope system, method for operating image processing device, and program for image processing device
CN114299100A (en) * 2021-05-07 2022-04-08 海信视像科技股份有限公司 Screen positioning method, terminal device and display device
CN114359166A (en) * 2021-12-13 2022-04-15 苏州凌云视界智能设备有限责任公司 Screen color detection method, device, equipment and storage medium
CN114881886A (en) * 2022-06-02 2022-08-09 北京字跳网络技术有限公司 Image processing method, image processing device, electronic equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114677319A (en) * 2020-12-24 2022-06-28 富泰华工业(深圳)有限公司 Stem cell distribution determination method and device, electronic equipment and storage medium


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
An algorithm for online detection of colour differences in warp knitted fabrics;Guosheng Xie等;《The Journal of The Textile Institute 》;159-165 *
screen colour difference detection;Jianguo Zhang等;《2019 International Conference on Intelligent Informatics and Biomedical Sciences (ICIIBMS)》;86-92 *
A color correction method for color cameras based on a LASSO regression model; Guo Yue et al.; Imaging Science and Photochemistry; 153-161 *
Research on defect detection methods for mobile phone screen images; Liu Bo; China Masters' Theses Full-text Database, Information Science and Technology; I140-195 *

Also Published As

Publication number Publication date
CN116091392A (en) 2023-05-09


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant