CN113496477A - Screen detection method and electronic equipment

Info

Publication number
CN113496477A
Authority
CN
China
Prior art keywords
image
image sequence
screen
electronic device
sequence
Prior art date
Legal status
Pending
Application number
CN202010266843.1A
Other languages
Chinese (zh)
Inventor
肖斌
朱聪超
彭勇
吴云鹏
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202010266843.1A
Publication of CN113496477A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G06T 7/001 Industrial image inspection using an image reference approach
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence
    • G06T 2207/10024 Color image
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30108 Industrial image inspection
    • G06T 2207/30121 CRT, LCD or plasma display

Landscapes

  • Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The invention provides a screen detection method and an electronic device. The method includes: acquiring a first image sequence and a second image sequence, where the second image sequence includes screenshot images of a second electronic device, the second electronic device is the object to be detected, the first image sequence includes screen display images of a first electronic device, and the first electronic device is a reference for the object to be detected; adjusting attribute information of each first image in the first image sequence so that each first image is aligned with the second image having the same timestamp in the second image sequence; comparing the first image sequence with the second image sequence in a specified comparison mode to obtain difference information; and when the difference information satisfies a specified condition, identifying that the second electronic device has the fault corresponding to the specified comparison mode. This technical solution expands the range of scenarios covered by screen detection, improves the automation and efficiency of screen detection, and effectively reduces the cost of screen detection.

Description

Screen detection method and electronic equipment
[ technical field ]
The invention relates to the technical field of terminals, in particular to a screen detection method and electronic equipment.
[ background of the invention ]
Problems in the display device, display algorithm, display chip, or other components of an electronic device often cause screen faults such as screen splash (a garbled display) or screen flicker. Electronic devices can therefore be tested before they leave the factory in order to intercept devices with screen faults. In the existing detection approach, pixel distribution characteristics, such as the pixel variance of the image displayed by the electronic device, are calculated and analyzed to obtain a detection result for the screen display.
However, this detection approach can only be used with a solid-color picture or a preset picture. The actual usage scenarios of an electronic device are very complex, including video playback, games, application switching, and other common user scenarios. That the screen displays normally with a solid-color or preset picture does not mean it will display normally in other real scenarios, so this detection approach cannot cover the large number of real scenarios that users require, and the accuracy of the resulting detection is low. If corresponding detection objects were preset for each application scenario, a large amount of the electronic device's system resources would be consumed and its performance would be affected.
Therefore, how to detect the screen of an electronic device efficiently and practically is a technical problem that urgently needs to be solved.
[ summary of the invention ]
Embodiments of the invention provide a screen detection method and an electronic device to address the shortcomings of the screen detection approaches used in the related art.
In a first aspect, an embodiment of the present invention provides a screen detection method, including: acquiring a first image sequence and a second image sequence, wherein the second image sequence comprises a screenshot image of second electronic equipment, the second electronic equipment is an object to be detected, the first image sequence comprises a screen display image of first electronic equipment, and the first electronic equipment is a reference object of the object to be detected; adjusting attribute information of each first image in the first image sequence to align each first image with a second image with the same time stamp in the second image sequence; comparing the first image sequence with the second image sequence according to a specified comparison mode to obtain difference information; and when the difference information meets a specified condition, identifying that the second electronic equipment has a fault corresponding to the specified comparison mode.
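To make the four steps of the first aspect concrete, the following Python sketch illustrates the flow end to end for the screen-splash branch only. It is a minimal illustration and not the claimed implementation: the exact-timestamp pairing, the mean-absolute-difference metric, and the threshold value are all assumptions introduced here.

```python
import numpy as np

def detect_screen_splash(first_seq, second_seq, coefficient=1.0, threshold=12.0):
    """Minimal end-to-end sketch of the claimed flow (screen-splash mode only).

    first_seq / second_seq: lists of (timestamp, HxWx3 uint8 array) pairs for
    the reference device and the device under test; frames are assumed to be
    already the same size. The difference metric and the threshold of 12.0 are
    illustrative assumptions, not values taken from the patent.
    """
    lookup = dict(second_seq)                  # screenshots indexed by timestamp
    difference_values = []
    for ts, ref in first_seq:                  # step 2: align by timestamp ...
        test = lookup.get(ts)
        if test is None:
            continue
        gain = coefficient * test.mean() / max(ref.mean(), 1e-6)
        ref_adj = np.clip(ref.astype(np.float32) * gain, 0, 255)   # ... and brightness
        # Step 3: per-frame difference value between the aligned frames.
        difference_values.append(np.abs(ref_adj - test.astype(np.float32)).mean())
    # Step 4: any frame above the threshold flags the fault for this mode.
    return any(d > threshold for d in difference_values)
```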
In one possible design, if the attribute information includes a timestamp, the step of adjusting the attribute information of each first image in the first image sequence includes: determining, for each first image in the first image sequence, the second image in the second image sequence that has the same timestamp.
In a possible design, if the attribute information further includes an image shape, the step of adjusting the attribute information of each first image in the first image sequence further includes: performing an affine transformation on each first image in the first image sequence, taking the second image corresponding to each first image as the fitting target, so that the image shape of each first image is consistent with that of the corresponding second image.
In a possible design, the attribute information further includes image brightness, and the step of adjusting the attribute information of each first image in the first image sequence further includes: adjusting the image brightness of each first image in the first image sequence so that the ratio of the image brightness of each first image to the image brightness of the corresponding second image is a specified coefficient.
In one possible design, the specified comparison mode includes a screen-splash comparison mode and/or a screen-flicker comparison mode.
In a possible design, for the screen-splash comparison mode, the step of comparing the first image sequence and the second image sequence in the specified comparison mode to obtain difference information includes: calculating a difference value between each first image in the first image sequence and the corresponding second image. The step of identifying that the second electronic device has the fault corresponding to the specified comparison mode when the difference information satisfies the specified condition includes: identifying that screen splash has occurred on the second electronic device when any difference value is greater than a specified threshold.
In a possible design, for the screen-flicker comparison mode, the step of comparing the first image sequence and the second image sequence in the specified comparison mode to obtain difference information includes: generating a first brightness curve based on the image brightness and timestamp of each first image in the first image sequence; generating a second brightness curve based on the image brightness and timestamp of each second image in the second image sequence; and acquiring a difference interval of the second brightness curve relative to the first brightness curve. The step of identifying that the second electronic device has the fault corresponding to the specified comparison mode when the difference information satisfies the specified condition includes: identifying that the second electronic device flickers when the difference interval is a peak-jitter interval or a trough-jitter interval.
In one possible design, the step of acquiring the first image sequence and the second image sequence includes: acquiring a specified number of frames of the first image sequence and the second image sequence at specified time intervals.
In a second aspect, an embodiment of the present invention provides a screen detecting apparatus, including: the image sequence acquisition unit is used for acquiring a first image sequence and a second image sequence, wherein the second image sequence comprises a screenshot image of second electronic equipment, the second electronic equipment is an object to be detected, the first image sequence comprises a screen display image of first electronic equipment, and the first electronic equipment is a reference object of the object to be detected; the image alignment unit is used for adjusting the attribute information of each first image in the first image sequence to align each first image with a second image with the same time stamp in the second image sequence; the difference comparison unit is used for comparing the first image sequence with the second image sequence according to a specified comparison mode to obtain difference information; and the fault identification unit is used for identifying the fault corresponding to the specified comparison mode of the second electronic equipment when the difference information meets specified conditions.
In one possible design, the attribute information includes a timestamp, and the image alignment unit includes: a timestamp alignment unit, configured to determine, for each first image in the first image sequence, the second image in the second image sequence that has the same timestamp.
In one possible design, the attribute information further includes an image shape, and the image alignment unit includes: an affine transformation unit, configured to perform an affine transformation on each first image in the first image sequence, taking the second image corresponding to each first image as the fitting target, so that the image shape of each first image is consistent with that of the corresponding second image.
In one possible design, the attribute information further includes image brightness, and the image alignment unit includes: a brightness adjustment unit, configured to adjust the image brightness of each first image in the first image sequence so that the ratio of the image brightness of each first image to the image brightness of the corresponding second image is a specified coefficient.
In one possible design, the specified comparison mode includes a screen-splash comparison mode and/or a screen-flicker comparison mode.
In one possible design, the difference comparison unit includes: a difference value calculation unit, configured to calculate a difference value between each first image in the first image sequence and the corresponding second image. The fault identification unit is configured to: identify that screen splash has occurred on the second electronic device when any difference value is greater than a specified threshold.
In one possible design, the difference comparison unit includes: a brightness curve generation unit, configured to generate a first brightness curve based on the image brightness and timestamp of each first image in the first image sequence, and generate a second brightness curve based on the image brightness and timestamp of each second image in the second image sequence; and a difference interval acquisition unit, configured to acquire a difference interval of the second brightness curve relative to the first brightness curve. The fault identification unit is configured to: identify that the second electronic device flickers when the difference interval is a peak-jitter interval or a trough-jitter interval.
In one possible design, the image sequence acquisition unit is configured to: acquire a specified number of frames of the first image sequence and the second image sequence at specified time intervals.
In a third aspect, an embodiment of the present invention provides an electronic device, including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor, the instructions being arranged to perform the method of any of the first aspects above.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium storing computer-executable instructions for performing the method flow described in any one of the first aspect.
In a fifth aspect, an embodiment of the present invention provides a computer program product, which, when run on an electronic device, causes the electronic device to execute the method flow described in any one of the above first aspects.
Through the above technical solutions, the range of scenarios covered by screen detection is expanded, the automation and efficiency of screen detection are improved, and the cost of screen detection is effectively reduced.
[ description of the drawings ]
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed for the embodiments are briefly described below. The drawings described below show only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of an electronic device to which the present application relates;
Fig. 2 is a flow chart of a screen detection method according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of a screen detection scenario according to an embodiment of the present invention;
Fig. 4 is a schematic block diagram of a screen detection apparatus according to an embodiment of the present invention.
[ detailed description ]
Hereinafter, the embodiments of the present application are described in detail with reference to the accompanying drawings. In the description of the embodiments herein, unless otherwise specified, "/" means "or"; for example, A/B may mean A or B. "And/or" herein merely describes an association between associated objects and indicates that three relationships may exist; for example, A and/or B may mean that A exists alone, that both A and B exist, or that B exists alone. In addition, in the description of the embodiments of the present application, "a plurality" means two or more.
The technical solution provided in the present application is applied to an electronic device, which may be a mobile phone, a desktop computer, a tablet computer, a wearable device, an in-vehicle device, an Augmented Reality (AR)/Virtual Reality (VR) device, a notebook computer, a super-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), and any other electronic device having an operating system, and the embodiment of the present application does not limit the electronic device.
Fig. 1 shows a schematic structural diagram of an electronic device to which the present application relates.
As shown in fig. 1, the electronic device may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. It is to be understood that the illustrated structure of the present embodiment does not constitute a specific limitation to the electronic device. In other embodiments of the present application, an electronic device may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components may be used. For example, when the electronic device is a smart tv, the smart tv does not need to provide one or more of the SIM card interface 195, the camera 193, the key 190, the receiver 170B, the microphone 170C, and the earphone interface 170D. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. In some embodiments, the electronic device may also include one or more processors 110. The controller can be a neural center and a command center of the electronic device. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution. A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. This avoids repeated accesses, reduces the latency of the processor 110, and thus increases the efficiency of the electronic device.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc. The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device, may also be used to transmit data between the electronic device and a peripheral device, and may also be used to connect an earphone to play audio through the earphone.
It should be understood that the interface connection relationship between the modules according to the embodiment of the present invention is only an exemplary illustration, and does not limit the structure of the electronic device. In other embodiments of the present application, the electronic device may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like. The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in an electronic device may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier, etc. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to electronic devices, including Wireless Local Area Networks (WLAN), bluetooth, Global Navigation Satellite System (GNSS), Frequency Modulation (FM), NFC, Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of the electronic device is coupled to the mobile communication module 150 and antenna 2 is coupled to the wireless communication module 160 so that the electronic device can communicate with the network and other devices through wireless communication techniques. The wireless communication technologies may include GSM, GPRS, CDMA, WCDMA, TD-SCDMA, LTE, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The electronic device may implement the display function via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute instructions to generate or change display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device may implement a capture function via the ISP, camera 193, video codec, GPU, one or more display screens 194, and application processor, among others.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can realize applications such as intelligent cognition of electronic equipment, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, data files such as music, photos, videos, and the like are saved in the external memory card.
Internal memory 121 may be used to store one or more computer programs, including instructions. The processor 110 may execute the instructions stored in the internal memory 121, so that the electronic device performs the screen detection method provided in some embodiments of the present application, as well as various functional applications, data processing, and the like. The internal memory 121 may include a program storage area and a data storage area. The program storage area can store an operating system and one or more application programs (such as a gallery or contacts). The data storage area can store data created during use of the electronic device (such as photos and contacts). In addition, the internal memory 121 may include a high-speed random access memory and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS). In some embodiments, the processor 110 may cause the electronic device to perform the screen detection method provided in the embodiments of the present application, as well as various functional applications and data processing, by executing instructions stored in the internal memory 121 and/or instructions stored in a memory disposed in the processor 110.
The electronic device may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc. The audio module 170 is configured to convert digital audio information into an analog audio signal for output, and also configured to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "loudspeaker", is used to convert an audio electrical signal into a sound signal. The electronic device can play music or answer a hands-free call through the speaker 170A.
The receiver 170B, also called an "earpiece", is used to convert an audio electrical signal into a sound signal. When the electronic device answers a call or receives voice information, the voice can be heard by holding the receiver 170B close to the ear.
The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal to the microphone 170C by speaking with the mouth close to the microphone 170C. The electronic device may be provided with at least one microphone 170C. In other embodiments, the electronic device may be provided with two microphones 170C to implement a noise reduction function in addition to collecting sound signals. In still other embodiments, the electronic device may be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement directional recording, and the like.
The headphone interface 170D is used to connect a wired headphone. The headset interface 170D may be the USB interface 130, or may be a 3.5 mm Open Mobile Terminal Platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The sensors 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The pressure sensor 180A is used to sense a pressure signal and convert it into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensor 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates made of conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic device detects the intensity of the touch operation through the pressure sensor 180A. The electronic device may also calculate the position of the touch from the detection signal of the pressure sensor 180A. In some embodiments, touch operations applied to the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
The gyro sensor 180B may be used to determine the motion pose of the electronic device. In some embodiments, the angular velocity of the electronic device about three axes (i.e., x, y, and z axes) may be determined by the gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. Illustratively, when the shutter is pressed, the gyroscope sensor 180B detects a shake angle of the electronic device, calculates a distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device through a reverse movement, thereby achieving anti-shake. The gyro sensor 180B may also be used for navigation, body sensing game scenes, and the like.
The acceleration sensor 180E can detect the magnitude of the electronic device's acceleration in various directions (typically along three axes). When the electronic device is at rest, the magnitude and direction of gravity can be detected. The sensor can also be used to recognize the posture of the electronic device and is applied in landscape/portrait switching, pedometers, and similar applications.
A distance sensor 180F for measuring a distance. The electronic device may measure distance by infrared or laser. In some embodiments, taking a picture of a scene, the electronic device may utilize the distance sensor 180F to range to achieve fast focus.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device emits infrared light outward through the light-emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, the electronic device can determine that there is an object nearby; when insufficient reflected light is detected, it can determine that there is no object nearby. Using the proximity light sensor 180G, the electronic device can detect that it is being held close to the user's ear during a call and automatically turn off the screen to save power. The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. The electronic device may adaptively adjust the brightness of the display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device is in a pocket to prevent accidental touches.
A fingerprint sensor 180H (also referred to as a fingerprint recognizer) for collecting a fingerprint. The electronic equipment can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access to an application lock, fingerprint photographing, fingerprint incoming call answering and the like. Further description of fingerprint sensors may be found in international patent application PCT/CN2017/082773 entitled "method and electronic device for handling notifications", which is incorporated herein by reference in its entirety.
The touch sensor 180K may also be referred to as a touch panel. The touch sensor 180K may be disposed on the display screen 194; the touch sensor 180K and the display screen 194 form a touchscreen. The touch sensor 180K is used to detect a touch operation applied on or near it and can pass the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the electronic device at a position different from that of the display screen 194.
The bone conduction sensor 180M can acquire vibration signals. In some embodiments, the bone conduction sensor 180M can acquire the vibration signal of a bone vibrated by the human vocal part. The bone conduction sensor 180M can also contact the human pulse to receive blood-pressure pulse signals. In some embodiments, the bone conduction sensor 180M may also be disposed in an earphone, combined into a bone conduction earphone. The audio module 170 can parse out a voice signal based on the vibration signal acquired by the bone conduction sensor 180M from the vocal-part bone, thereby implementing a voice function. The application processor can parse heart rate information based on the blood-pressure pulse signal acquired by the bone conduction sensor 180M, thereby implementing a heart rate detection function.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device may receive a key input, and generate a key signal input related to user settings and function control of the electronic device.
The motor 191 can generate a vibration alert. The motor 191 may be used for incoming-call vibration alerts as well as touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing) may correspond to different vibration feedback effects. Touch operations applied to different areas of the display screen 194 may also correspond to different vibration feedback effects of the motor 191. Different application scenarios (such as time reminders, receiving messages, alarm clocks, and games) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. A SIM card can be attached to or detached from the electronic device by inserting it into or pulling it out of the SIM card interface 195. The electronic device can support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a standard SIM card, and the like. Multiple cards can be inserted into the same SIM card interface 195 at the same time; the cards may be of the same type or of different types. The SIM card interface 195 may also be compatible with different types of SIM cards and with external memory cards. The electronic device interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device uses an eSIM, that is, an embedded SIM card; the eSIM card can be embedded in the electronic device and cannot be separated from it.
Hereinafter, the screen detecting method provided by the present application will be described in detail by specific embodiments.
Referring to fig. 2, an embodiment of the present application provides a screen detection method, which is executed by an electronic device a and detects whether the screen of an object to be detected has faults such as screen splash and/or screen flicker. The method includes:
step 202, a first image sequence and a second image sequence are acquired.
The second image sequence comprises screenshot images of the second electronic device, which is the object to be detected; the first image sequence comprises screen display images of the first electronic device, which is the reference for the object to be detected and has good screen display capability.
The first electronic device and the second electronic device compress the image sequence before sending the image sequence to the electronic device a, and the electronic device a decompresses the compressed content correspondingly after receiving the image sequence from the first electronic device and the second electronic device, so as to obtain a final first image sequence and a final second image sequence.
The content run here is not limited to any particular application scenario; in any application scenario, the first electronic device and the second electronic device run the same content. Because the first electronic device and the second electronic device are terminals of the same type, with display screens of the same size and specification, and the first electronic device has good screen display capability, the screen display image of the first electronic device is the actual display of the running content in that application scenario. In other words, the screen display image of the first electronic device is what the second electronic device should display if its screen is free of faults.
Therefore, the electronic device a can photograph the real-time screen display of the first electronic device to obtain the first image sequence, and capture real-time screenshots of the second electronic device to obtain the second image sequence, with the first image sequence serving as the reference standard for the second image sequence.
The second electronic device is preset with an application program controllable by the electronic device a, so that the electronic device a can control this application program to capture a screenshot of the second electronic device while photographing the screen display image of the first electronic device; the resulting screenshot image shows the same content as the screen display image of the first electronic device at the same moment.
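As a rough illustration of this paired capture (an assumption on top of the text, not the patent's implementation), the sketch below photographs the reference screen with a camera attached to the controlling machine and pulls a screenshot from the device under test over adb; the use of OpenCV and the adb screencap command are illustrative choices.

```python
import subprocess
import time

import cv2
import numpy as np

def capture_pair(camera: cv2.VideoCapture):
    """Capture one (photo, screenshot) pair that shares a timestamp.

    Assumes the reference device sits in front of `camera` and the device under
    test is reachable over adb; both are illustrative choices. The patent only
    requires that the photo and the screenshot are taken at the same moment.
    """
    ts = time.time()
    ok, photo = camera.read()                  # screen display image of the reference device
    if not ok:
        raise RuntimeError("camera frame not available")
    raw = subprocess.run(                      # screenshot of the device under test
        ["adb", "exec-out", "screencap", "-p"],
        capture_output=True, check=True,
    ).stdout
    screenshot = cv2.imdecode(np.frombuffer(raw, dtype=np.uint8), cv2.IMREAD_COLOR)
    return ts, photo, screenshot
```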
Step 204, adjusting the attribute information of each first image in the first image sequence to align each first image with a second image having the same timestamp in the second image sequence.
The attribute information of an image includes, but is not limited to, a timestamp, image shape, image brightness, and the like. Because the first image is obtained by photographing with the electronic device a, factors such as the shooting environment and shooting angle make the image shape and image brightness of the first image differ to some extent from the actual screen shape and screen brightness of the first electronic device.
The second image with the same timestamp in the second image sequence is captured at the same moment as the first image and has the same content. Because the second image is generated by screenshot, its size matches the screen of the second electronic device, its shape is standard, and its brightness is not affected by the external environment. The corresponding second image can therefore serve as the reference for fine-tuning the first image; fine-tuning the attribute information of the first image reduces the negative effects of external factors such as the shooting environment and shooting angle, which further improves the accuracy of the screen detection result.
Step 204 is described in detail below.
In one possible design, the attribute information includes a timestamp and an image shape.
In another possible design, the attribute information includes a timestamp and image brightness.
In another possible design, the attribute information includes a timestamp, image brightness, and an image shape.
First, the attribute information must include a timestamp, and step 204 includes: a second image having the same time stamp is determined in the second image sequence for each first image in the first image sequence.
Before the first images are preprocessed (image brightness, image shape, and so on), the first image sequence and the second image sequence need to be put into one-to-one correspondence according to the timestamps of the images. A first image and a second image with the same timestamp were captured at the same moment and can serve as references for each other: the second image is used as the reference for adjusting the attribute information of the first image, so that the first image comes closer to the actual display effect, and the adjusted first image is then used as the comparison reference for the second image to judge whether the second electronic device that produced the second image has a screen fault.
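A minimal sketch of this pairing step is shown below. The patent pairs frames with the same timestamp; the small tolerance used here is an assumption to absorb clock jitter between the two capture paths.

```python
def pair_by_timestamp(first_seq, second_seq, tolerance=0.02):
    """Pair each photographed frame with the screenshot closest to it in time.

    first_seq / second_seq: lists of (timestamp, image) tuples. The 20 ms
    tolerance is an illustrative allowance, not a value from the patent.
    """
    pairs = []
    for ts1, img1 in first_seq:
        ts2, img2 = min(second_seq, key=lambda item: abs(item[0] - ts1))
        if abs(ts2 - ts1) <= tolerance:
            pairs.append((img1, img2))
    return pairs
```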
Next, the image shape and/or image brightness is adjusted.
The image shape is adjusted as follows: taking the second image corresponding to each first image in the first image sequence as the fitting target, an affine transformation is applied to each first image so that the image shape of each first image is consistent with that of the corresponding second image.
An affine transformation is a linear mapping from two-dimensional coordinates to two-dimensional coordinates that preserves the straightness and parallelism of two-dimensional figures. Preserving straightness means that straight lines remain straight lines and circular arcs remain circular arcs after the transformation; preserving parallelism means that the relative positional relationship between two-dimensional figures is unchanged, so parallel lines remain parallel and the intersection angles between intersecting lines are unchanged. In other words, the affine transformation can give the first image the same shape as the corresponding second image without substantially changing the pixel distribution of the first image, which facilitates the subsequent comparison with the second image.
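A possible realization of this shape fitting, sketched with OpenCV's affine helpers, is shown below. How the three reference points (for example, the screen corners) are located in the photograph is an implementation choice not fixed by the text.

```python
import cv2
import numpy as np

def fit_shape(first_img, second_img, src_pts, dst_pts):
    """Affine-warp the photographed frame onto the screenshot's geometry.

    src_pts: three reference points located in the photo (e.g. screen corners);
    dst_pts: the corresponding points in the screenshot.
    """
    m = cv2.getAffineTransform(np.float32(src_pts), np.float32(dst_pts))
    h, w = second_img.shape[:2]
    # Straight lines stay straight and parallelism is preserved, so the pixel
    # distribution of the photo is kept while its shape matches the screenshot.
    return cv2.warpAffine(first_img, m, (w, h))
```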
The image brightness is adjusted as follows: the image brightness of each first image in the first image sequence is adjusted so that the ratio of the image brightness of each first image to the image brightness of the corresponding second image is a specified coefficient.
Because the first image is obtained by photographing, its overall brightness differs from that of the second image. To reduce this difference, so that the first image restores the real display effect as closely as possible and becomes an effective reference, the brightness of the first image can be adjusted. Specifically, the image brightness of the first image may be adjusted to a level at which its ratio to the image brightness of the corresponding second image is a specified coefficient. In the brightness-curve comparison in the following steps, as long as the brightness ratio of the first image to the second image is the specified coefficient, the two will have matching brightness curves when the second electronic device has no screen fault.
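The brightness adjustment could look like the sketch below, which uses mean grayscale intensity as the brightness measure; the patent does not prescribe a specific brightness metric, so this choice is an assumption.

```python
import cv2
import numpy as np

def match_brightness(first_img, second_img, coefficient=1.0):
    """Scale the photographed frame so that its mean brightness divided by the
    screenshot's mean brightness equals `coefficient`."""
    b1 = cv2.cvtColor(first_img, cv2.COLOR_BGR2GRAY).mean()
    b2 = cv2.cvtColor(second_img, cv2.COLOR_BGR2GRAY).mean()
    if b1 == 0:
        return first_img
    gain = coefficient * b2 / b1
    return np.clip(first_img.astype(np.float32) * gain, 0, 255).astype(np.uint8)
```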
In one possible design, the specified coefficient is 1, in which case the first image and the second image have coinciding brightness curves when the second electronic device has no screen fault.
And step 206, comparing the first image sequence with the second image sequence according to a specified comparison mode to obtain difference information.
And step 208, when the difference information meets the specified condition, identifying that the second electronic equipment has a fault corresponding to the specified comparison mode.
The specified comparison mode includes a screen-splash comparison mode and/or a screen-flicker comparison mode; of course, the specified comparison mode can also include a comparison mode required to detect any other screen fault.
In a possible design, the screen-splash comparison mode compares the images in the two sequences frame by frame. Specifically, a difference value between each first image in the first image sequence and the corresponding second image may be calculated. Accordingly, the specified condition for the screen-splash comparison mode is that the difference value is greater than a specified threshold: if any difference value exceeds the specified threshold, screen splash is identified on the second electronic device.
Different colors may be assigned different values, so that for any image an image value can be calculated from its pixels of different colors; the difference between the image values of the first image and the corresponding second image is the difference value. The specified threshold is the maximum difference value that can occur when two images have the same display effect. Therefore, when the difference value between a first image and the corresponding second image is greater than the specified threshold, the display effects of the two images differ; since the first image is displayed correctly, the display problem lies with the second image, and the second electronic device that provided the second image can be judged to exhibit screen splash.
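As an illustration of this frame-by-frame check, with the mean absolute pixel difference standing in for the per-frame difference value and an assumed threshold, a sketch could be:

```python
import numpy as np

def has_screen_splash(pairs, specified_threshold=12.0):
    """Screen-splash mode: `pairs` is a list of aligned (first_img, second_img)
    uint8 arrays of equal size. The metric and the threshold of 12.0 are
    illustrative assumptions, not values taken from the patent."""
    for first_img, second_img in pairs:
        diff = np.abs(first_img.astype(np.int16) - second_img.astype(np.int16)).mean()
        if diff > specified_threshold:
            return True   # any frame exceeding the threshold flags screen splash
    return False
```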
In one possible design, the screen-flicker comparison mode checks whether the trends of the brightness curves of the two image sequences differ significantly. Specifically, a first brightness curve may be generated based on the image brightness and timestamp of each first image in the first image sequence; a second brightness curve is generated based on the image brightness and timestamp of each second image in the second image sequence; and a difference interval of the second brightness curve relative to the first brightness curve is acquired.
That is, for each image sequence, the timestamp of each image is used as the abscissa and the image brightness of each image as the ordinate to plot the corresponding brightness curve. If the screen of the second electronic device is intact, the position and trend of its brightness curve should be the same as, or substantially similar to, those of the first electronic device's brightness curve in the same coordinate system; the time interval corresponding to the positions where the two curves differ is the difference interval.
If the second electronic device has a screen-flicker fault, its brightness exhibits peak jitter or trough jitter within the difference interval. Therefore, once the difference interval is identified as a peak-jitter interval or a trough-jitter interval, a screen-flicker fault of the second electronic device can be determined.
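A minimal sketch of this curve comparison is given below. Treating any large residual between the two brightness curves as a peak- or trough-jitter interval is a simplification introduced here, and the jitter threshold is likewise an assumption.

```python
import numpy as np

def has_screen_flicker(first_curve, second_curve, jitter_threshold=20.0):
    """Screen-flicker mode: compare the two brightness curves sample by sample.

    first_curve / second_curve: equal-length lists of (timestamp, mean_brightness)
    samples for the reference device and the device under test, already
    brightness-aligned and paired by timestamp.
    """
    ref = np.array([b for _, b in first_curve])
    test = np.array([b for _, b in second_curve])
    residual = test - ref            # where the second curve departs from the first
    # A peak- or trough-jitter interval shows up as a sharp positive or negative spike.
    return bool(np.any(np.abs(residual) > jitter_threshold))
```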
At this point, by using the first electronic device as the reference for the second electronic device, it can be judged whether the second electronic device has a fault such as screen splash or screen flicker.
In this technical solution, detection is not limited to any specific application scenario, as long as the first electronic device and the second electronic device run in the same application scenario. In other words, the electronic device a may control the first electronic device and the second electronic device to run simultaneously in multiple application scenarios, so as to detect whether the second electronic device produces a screen fault in each of them. Therefore, the range of scenarios covered by screen detection is expanded, the automation and efficiency of screen detection are improved, and the cost of screen detection is effectively reduced.
Referring to fig. 3, in the screen detection scenario, the first electronic device and the second electronic device are located in a dark box. The electronic device a may control the dark box to provide specified shooting environment information and capture the screen display image of the first electronic device through a shooting device e in the dark box.
The first electronic device is provided with an application b, and the second electronic device is provided with an application c. Through applications b and c, the electronic device a can control the first electronic device and the second electronic device to run in the same actual scenario, for example, by controlling both devices to open a social application at the same time, or to run a specified game simultaneously.
The electronic device a can control the dark box to capture the screen display image of the first electronic device at a specified moment, and at the same time control the application program d to take a screenshot of the second electronic device at that moment, thereby obtaining the screenshot image of the second electronic device.
Of course, the two functions of controlling the second electronic device to run the specified content and capturing the screenshot of the second electronic device at the specified moment may be performed by two separate application programs or integrated into one application program, which is not limited herein.
The electronic device a can perform one detection of the screen of the second electronic device every specified time interval. In each detection, a specified number of frames of the screen display image of the first electronic device and a specified number of frames of the screenshot image of the second electronic device are collected, that is, a first image sequence and a second image sequence each having the specified number of frames.
The specified time interval and the specified number of frames can be set and flexibly adjusted according to the actual situation.
In one possible design, the specified time interval is 0.2s, 0.5s, or 1s, but can be any time interval that meets the actual detection requirements.
In one possible design, the specified number of frames may be 15 frames, but may be any number of frames that meets the actual detection requirements.
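One plausible reading of this acquisition step is sketched below: frame pairs are captured at the specified time interval until the specified number of frames is reached. The callables capture_reference_frame and capture_screenshot are hypothetical stand-ins for the dark-box shooting device e and the screenshot application program d; they are not APIs defined by the patent.

    import time

    SPECIFIED_INTERVAL_S = 0.5   # e.g. 0.2 s, 0.5 s or 1 s, as in the design above
    SPECIFIED_FRAMES = 15        # e.g. 15 frames, as in the design above

    def collect_sequences(capture_reference_frame, capture_screenshot):
        """Collect the first image sequence (reference device) and the second
        image sequence (device under test) with matching timestamps."""
        first_sequence, second_sequence, timestamps = [], [], []
        for _ in range(SPECIFIED_FRAMES):
            timestamps.append(time.time())
            first_sequence.append(capture_reference_frame())  # screen display image
            second_sequence.append(capture_screenshot())      # screenshot image
            time.sleep(SPECIFIED_INTERVAL_S)
        return first_sequence, second_sequence, timestamps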
Referring to fig. 4, the screen detection apparatus 400 includes: an image sequence acquisition unit 402, configured to acquire a first image sequence and a second image sequence, where the second image sequence includes a screenshot image of a second electronic device, the second electronic device is an object to be detected, the first image sequence includes a screen display image of a first electronic device, and the first electronic device is a reference object of the object to be detected; an image alignment unit 404, configured to adjust attribute information of each first image in the first image sequence to align each first image with the second image having the same timestamp in the second image sequence; a difference comparison unit 406, configured to compare the first image sequence with the second image sequence according to a specified comparison mode to obtain difference information; and a fault identification unit 408, configured to identify that a fault corresponding to the specified comparison mode occurs in the second electronic device when the difference information satisfies a specified condition.
In one possible design, where the attribute information includes a timestamp, the image alignment unit 404 includes: a timestamp alignment unit for determining, for each first image in the first image sequence, the second image in the second image sequence having the same timestamp.
In one possible design, the attribute information further includes an image shape, and the image alignment unit 404 includes: an affine transformation unit for performing affine transformation on each first image in the first image sequence, using the corresponding second image as the fitting object, so that the image shape of each first image is consistent with that of the corresponding second image.
In one possible design, the attribute information further includes image brightness, and the image alignment unit 404 includes: a brightness adjustment unit for adjusting the image brightness of each first image in the first image sequence so that the ratio of the image brightness of each first image to that of the corresponding second image is a specified coefficient.
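The three alignment sub-units could be sketched roughly as follows, assuming OpenCV and NumPy are available. The corresponding point pairs src_pts and dst_pts (for example, detected screen corners) and the value of the specified coefficient are assumptions for illustration, not details given in the patent.

    import cv2
    import numpy as np

    SPECIFIED_COEFFICIENT = 1.0  # hypothetical value of the specified brightness ratio

    def match_by_timestamp(first_sequence, second_sequence):
        """Timestamp alignment: pair each first image with the second image
        carrying the same timestamp. Both inputs are lists of (timestamp, image)."""
        second_by_ts = {ts: img for ts, img in second_sequence}
        return [(img, second_by_ts[ts]) for ts, img in first_sequence if ts in second_by_ts]

    def align_first_to_second(first_img, second_img, src_pts, dst_pts):
        """Affine transformation with the second image as the fitting object,
        followed by brightness adjustment to the specified coefficient."""
        m = cv2.getAffineTransform(np.float32(src_pts), np.float32(dst_pts))
        h, w = second_img.shape[:2]
        warped = cv2.warpAffine(first_img, m, (w, h))

        target = SPECIFIED_COEFFICIENT * float(np.mean(second_img))
        current = float(np.mean(warped)) or 1.0  # guard against division by zero
        adjusted = np.clip(warped.astype(np.float64) * (target / current), 0, 255)
        return adjusted.astype(np.uint8)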
In one possible design, the specified comparison mode includes a screen-splash comparison mode and/or a screen-flicker comparison mode.
In one possible design, the difference comparison unit 406 includes: a difference threshold calculation unit, configured to calculate a difference threshold between each first image in the first image sequence and the corresponding second image; the fault identification unit 408 is configured to: identify that screen splash occurs on the second electronic device when any difference threshold is greater than a specified threshold.
In one possible design, the difference comparison unit 406 includes: a brightness curve generation unit, configured to generate a first brightness curve based on the image brightness and the timestamp of each first image in the first image sequence, and to generate a second brightness curve based on the image brightness and the timestamp of each second image in the second image sequence; and a difference interval acquisition unit, configured to acquire a difference interval of the second brightness curve relative to the first brightness curve; the fault identification unit 408 is configured to: identify that screen flicker occurs on the second electronic device when the difference interval is a peak jitter interval or a trough jitter interval.
In one possible design, the image sequence acquisition unit is configured to: acquire a first image sequence and a second image sequence of a specified number of frames at specified time intervals.
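Putting the four units together, a schematic (and deliberately non-authoritative) pipeline for apparatus 400 might look like the following sketch, with the concrete acquisition, alignment, comparison and condition-checking behaviour injected as callables:

    class ScreenDetectionApparatus:
        """Sketch mirroring the units of apparatus 400: image sequence
        acquisition (402), image alignment (404), difference comparison (406)
        and fault identification (408). The injected callables are hypothetical
        stand-ins for the concrete implementations discussed above."""

        def __init__(self, acquire, align, compare, satisfies_condition):
            self.acquire = acquire                          # unit 402
            self.align = align                              # unit 404
            self.compare = compare                          # unit 406
            self.satisfies_condition = satisfies_condition  # specified condition for unit 408

        def detect(self):
            first_seq, second_seq = self.acquire()
            first_seq = [self.align(f, s) for f, s in zip(first_seq, second_seq)]
            difference_info = self.compare(first_seq, second_seq)
            # Fault identification: a fault corresponding to the comparison
            # mode is reported when the specified condition holds.
            return self.satisfies_condition(difference_info)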
In addition, an embodiment of the present application further provides a computer-readable storage medium storing instructions which, when executed on an electronic device, cause the electronic device to execute the screen detection method according to any one of the foregoing implementations.
The embodiment of the present application further provides a computer program product which, when run on an electronic device, causes the electronic device to execute the screen detection method according to any one of the foregoing implementations.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, the embodiments may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions described in the present application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk), among others.
In short, the above description is only an example of the technical solution of the present invention, and is not intended to limit the scope of the present invention. Any modifications, equivalents, improvements and the like made in accordance with the disclosure of the present invention are intended to be included within the scope of the present invention.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the examples of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be understood that although the terms first, second, etc. may be used to describe image sequences in embodiments of the present invention, these image sequences should not be limited to these terms. These terms are only used to distinguish image sequences from each other. For example, the first image sequence may also be referred to as the second image sequence, and similarly, the second image sequence may also be referred to as the first image sequence, without departing from the scope of embodiments of the present invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (12)

1. A screen detection method, comprising:
acquiring a first sequence of images and a second sequence of images, wherein,
the second image sequence comprises a screenshot image of second electronic equipment, the second electronic equipment is an object to be detected, the first image sequence comprises a screen display image of first electronic equipment, and the first electronic equipment is a reference object of the object to be detected;
adjusting attribute information of each first image in the first image sequence to align each first image with a second image with the same time stamp in the second image sequence;
comparing the first image sequence with the second image sequence according to a specified comparison mode to obtain difference information;
and when the difference information meets a specified condition, identifying that the second electronic equipment has a fault corresponding to the specified comparison mode.
2. The screen detection method according to claim 1,
the attribute information includes a time stamp, then
The step of adjusting the attribute information of each first image in the first image sequence includes:
determining, for each first image in the first image sequence, the second image in the second image sequence having the same time stamp.
3. The screen detection method according to claim 1,
the attribute information further includes an image shape, then
The step of adjusting the attribute information of each first image in the first image sequence further includes:
and performing affine transformation on each first image in the first image sequence by taking the second image corresponding to each first image as a fitting object, so that the image shape of each first image is consistent with that of the corresponding second image.
4. The screen detection method according to claim 2 or 3,
the attribute information further includes image brightness, then
The step of adjusting the attribute information of each first image in the first image sequence further includes:
and adjusting the image brightness of each first image in the first image sequence to enable the ratio of the image brightness of each first image to the image brightness of the corresponding second image to be a specified coefficient.
5. The screen detection method according to claim 4,
the specified comparison mode comprises a screen-splash comparison mode and/or a screen-splash comparison mode.
6. The screen detection method according to claim 5, wherein, for the screen-splash comparison mode, the step of comparing the first image sequence and the second image sequence according to the specified comparison mode to obtain difference information comprises:
calculating a difference threshold for each of the first images in the sequence of first images and the corresponding second image;
the step of identifying that the second electronic device has a fault corresponding to the specified comparison mode when the difference information meets a specified condition includes:
and in the case that any difference threshold value is larger than a specified threshold value, identifying that the second electronic equipment is subjected to screen splash.
7. The screen detection method according to claim 5, wherein, for the screen-flicker comparison mode, the step of comparing the first image sequence and the second image sequence according to the specified comparison mode to obtain difference information comprises:
generating a first luminance curve based on the image luminance and the time stamp of each first image in the first image sequence;
generating a second brightness curve based on the image brightness and the time stamp of each second image in the second image sequence;
acquiring a difference interval of the second brightness curve relative to the first brightness curve;
the step of identifying that the second electronic device has a fault corresponding to the specified comparison mode when the difference information meets a specified condition includes:
and identifying that screen flicker occurs on the second electronic device when the difference interval is a peak jitter interval or a trough jitter interval.
8. The screen detection method of claim 1, wherein the step of acquiring the first image sequence and the second image sequence comprises:
and acquiring the first image sequence and the second image sequence of a specified frame number at specified time intervals.
9. A screen detection apparatus, comprising:
an image sequence acquisition unit for acquiring a first image sequence and a second image sequence, wherein,
the second image sequence comprises a screenshot image of second electronic equipment, the second electronic equipment is an object to be detected, the first image sequence comprises a screen display image of first electronic equipment, and the first electronic equipment is a reference object of the object to be detected;
the image alignment unit is used for adjusting the attribute information of each first image in the first image sequence to align each first image with a second image with the same time stamp in the second image sequence;
the difference comparison unit is used for comparing the first image sequence with the second image sequence according to a specified comparison mode to obtain difference information;
and the fault identification unit is used for identifying the fault corresponding to the specified comparison mode of the second electronic equipment when the difference information meets specified conditions.
10. An electronic device, comprising: at least one processor; and a memory communicatively coupled to the at least one processor;
wherein the memory stores instructions executable by the at least one processor, the instructions being arranged to perform the method of any of the preceding claims 1 to 8.
11. A computer-readable storage medium having stored thereon computer-executable instructions for performing the method flow of any of claims 1-8.
12. A computer program product, characterized in that, when run on an electronic device, the computer program product causes the electronic device to execute the method of any one of claims 1 to 8.
CN202010266843.1A 2020-04-07 2020-04-07 Screen detection method and electronic equipment Pending CN113496477A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010266843.1A CN113496477A (en) 2020-04-07 2020-04-07 Screen detection method and electronic equipment

Publications (1)

Publication Number Publication Date
CN113496477A true CN113496477A (en) 2021-10-12

Family

ID=77995693

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010266843.1A Pending CN113496477A (en) 2020-04-07 2020-04-07 Screen detection method and electronic equipment

Country Status (1)

Country Link
CN (1) CN113496477A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006049165A (en) * 2004-08-06 2006-02-16 Matsushita Electric Ind Co Ltd Lighting screen inspection method of plasma display panel
JP2008216334A (en) * 2007-02-28 2008-09-18 Nippon Signal Co Ltd:The Detecting method and detecting device for screen display fault
JP2012032369A (en) * 2010-07-29 2012-02-16 Sharp Corp Defect identification method, defect identification apparatus, program, and recording medium
CN105424710A (en) * 2015-11-20 2016-03-23 上海斐讯数据通信技术有限公司 Method and device for detecting screen of electronic equipment
CN106686374A (en) * 2016-12-08 2017-05-17 广州视源电子科技股份有限公司 Method, system and device for detecting color saturation of screens
CN107909569A (en) * 2017-11-10 2018-04-13 广东欧珀移动通信有限公司 One kind flower screen detection method, flower screen detection device and electronic equipment
CN109710378A (en) * 2018-12-18 2019-05-03 广东微云科技股份有限公司 The fault detection method of virtual machine
CN110351530A (en) * 2019-07-31 2019-10-18 Tcl王牌电器(惠州)有限公司 Polyphaser realizes method, system and the computer readable storage medium of screen detection

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116091392A (en) * 2022-08-16 2023-05-09 荣耀终端有限公司 Image processing method, system and storage medium
CN116091392B (en) * 2022-08-16 2023-10-20 荣耀终端有限公司 Image processing method, system and storage medium
CN117156210A (en) * 2023-02-07 2023-12-01 荣耀终端有限公司 Method and device for detecting splash screen

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination