CN108229485B - Method and apparatus for testing user interface


Info

Publication number: CN108229485B (granted; application published as CN108229485A)
Application number: CN201810129865.6A
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 尹飞, 项金鑫, 柏馨, 张婷, 刘盼盼, 薛大伟, 邢潘红, 魏晨辉
Assignee: Baidu Online Network Technology Beijing Co Ltd
Legal status: Active

Classifications

  • G06V30/153 Segmentation of character regions using recognition of characters or words
  • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
  • G06F18/217 Validation; Performance evaluation; Active pattern learning techniques
  • G06V10/267 Segmentation of patterns in the image field by performing operations on regions, e.g. growing, shrinking or watersheds
  • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; connectivity analysis, e.g. of connected components


Abstract

Embodiments of the present application disclose a method and apparatus for testing a user interface. One embodiment of the method comprises: acquiring a screenshot of a user interface to be tested; determining whether the screenshot satisfies a preset condition; in response to determining that the preset condition is satisfied, inputting the screenshot into a pre-trained global test model to obtain a global test result for the user interface to be tested; and segmenting at least one local region from the screenshot and inputting the segmented local region(s) into a pre-trained local test model to obtain a local test result for the user interface to be tested. This embodiment tests the user interface with a deep learning method that combines a global test model and a local test model, so no manual judgment by testers is needed, which saves labor cost and improves the efficiency of user interface testing.

Description

Method and apparatus for testing user interface
Technical Field
Embodiments of the present application relate to the field of computer technology, and in particular to a method and apparatus for testing a user interface.
Background
A UI (User Interface) generally refers to the operation interface presented to a user, and covers mobile applications, web pages, smart wearable devices, and the like. UI design mainly concerns the style and aesthetics of an interface. A good UI not only gives software personality and taste, but also makes operating the software comfortable, simple, and unconstrained, fully reflecting the software's positioning and characteristics.
By testing a UI, it can be quickly determined whether the UI has defects. Existing UI testing usually depends on the manual judgment of testers, mainly in one of two ways: first, manually checking whether specific elements exist in the UI to determine whether the UI has defects; second, presetting an expected reference picture, capturing a screenshot of the UI, comparing the screenshot with the expected reference picture, and determining whether the UI has defects according to the comparison result.
Disclosure of Invention
Embodiments of the present application provide a method and apparatus for testing a user interface.
In a first aspect, an embodiment of the present application provides a method for testing a user interface, the method comprising: acquiring a screenshot of a user interface to be tested; determining whether the screenshot satisfies a preset condition; in response to determining that the preset condition is satisfied, inputting the screenshot into a pre-trained global test model to obtain a global test result for the user interface to be tested; and segmenting at least one local region from the screenshot and inputting the segmented local region(s) into a pre-trained local test model to obtain a local test result for the user interface to be tested, wherein each local region includes a character string.
In some embodiments, determining whether the screenshot of the user interface to be tested satisfies the preset condition includes: performing at least one of the following operations: recognizing at least one character string in the screenshot using Optical Character Recognition (OCR) and matching each recognized character string against a preset error character string set; determining the color types contained in the screenshot using a color model; and determining, based on the result of the at least one operation, whether the screenshot satisfies the preset condition, wherein the preset condition includes at least one of the following: no character string matches the preset error character string set, and the number of color types is greater than a preset number.
In some embodiments, segmenting at least one local region from the screenshot of the user interface to be tested includes: acquiring a Document Object Model (DOM) of the user interface to be tested, wherein the DOM includes position information of at least one first local region, each first local region containing one character string; and segmenting the at least one first local region from the screenshot, as indicated by the position information, to serve as the at least one local region.
In some embodiments, the DOM of the user interface to be tested further includes position information of at least one second local region, wherein each second local region contains a plurality of character strings; and segmenting at least one local region from the screenshot further includes: segmenting the at least one second local region from the screenshot as indicated by its position information; and, for each second local region, recognizing the character strings it contains using OCR and segmenting, from the second local region, the region in which each recognized character string is located, to serve as a plurality of local regions.
In some embodiments, the global test result includes a global type and location information of a region corresponding to the global type, and the local test result includes a local type and location information of a region corresponding to the local type.
In some embodiments, the global type includes at least one of: a global normal type, a global blank type, and a global keyboard anomaly type; and the local type includes at least one of: a character normal type, a character overlap type, a character background transition anomaly type, and a character edge shielding (occlusion) type.
In some embodiments, the global test model includes at least one of: a global blank test model and a global keyboard anomaly test model; and the local test model includes at least one of: a character overlap test model, a character background transition anomaly test model, and a character edge shielding test model.
In some embodiments, the global test model is trained as follows: acquiring screenshots of sample user interfaces of the global normal type, together with the corresponding global test results, as global positive samples; acquiring screenshots of sample user interfaces of the global blank type, together with the corresponding global test results, as first global negative samples; acquiring screenshots of sample user interfaces of the global keyboard anomaly type, together with the corresponding global test results, as second global negative samples; training a preset first convolutional neural network on the global positive samples and the first global negative samples with a deep learning method to obtain the global blank test model; and training a preset second convolutional neural network on the global positive samples and the second global negative samples with a deep learning method to obtain the global keyboard anomaly test model.
In some embodiments, the global negative samples are generated as follows: overlaying a first preset map on a screenshot of a sample user interface of the global normal type to generate a screenshot of a sample user interface of the global blank type; and overlaying a preset keyboard anomaly map on a screenshot of a sample user interface of the global normal type to generate a screenshot of a sample user interface of the global keyboard anomaly type.
In some embodiments, the method further comprises: performing graphic heterogeneous (augmentation) processing on the screenshots of sample user interfaces of the global normal type, of the global blank type, and of the global keyboard anomaly type, to generate a plurality of global positive samples, a plurality of first global negative samples, and a plurality of second global negative samples.
In some embodiments, the local test model is trained as follows: acquiring sample local regions of the character normal type, together with the corresponding local test results, as local positive samples; acquiring sample local regions of the character overlap type, together with the corresponding local test results, as first local negative samples; acquiring sample local regions of the character background transition anomaly type, together with the corresponding local test results, as second local negative samples; acquiring sample local regions of the character edge shielding type, together with the corresponding local test results, as third local negative samples; training a preset third convolutional neural network on the local positive samples and the first local negative samples with a deep learning method to obtain the character overlap test model; training a preset fourth convolutional neural network on the local positive samples and the second local negative samples with a deep learning method to obtain the character background transition anomaly test model; and training a preset fifth convolutional neural network on the local positive samples and the third local negative samples with a deep learning method to obtain the character edge shielding test model.
In some embodiments, the local negative samples are generated as follows: overlaying a preset character map on a sample local region of the character normal type to generate a sample local region of the character overlap type; overlaying a preset layer on a sample local region of the character normal type to generate a sample local region of the character background transition anomaly type; and overlaying a second preset map on the edge of a sample local region of the character normal type to generate a sample local region of the character edge shielding type.
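These overlay operations can be prototyped with ordinary image tooling. Below is a minimal sketch using Pillow; the file names, patch geometry, and alpha values are illustrative assumptions, not values given in the patent.

```python
from PIL import Image

def overlay_patch(normal_crop, patch, xy, alpha=255):
    """Overlay `patch` on a character-normal sample crop to synthesize a negative sample.

    alpha=255 pastes an opaque map (character overlap / edge shielding);
    a smaller alpha approximates the semi-transparent layer used for the
    character background transition anomaly.
    """
    out = normal_crop.convert("RGBA")
    patch = patch.convert("RGBA")
    patch.putalpha(alpha)
    out.alpha_composite(patch, dest=xy)
    return out.convert("RGB")

# Illustrative usage; all file names are assumptions.
normal = Image.open("char_normal.png")
overlap = overlay_patch(normal, Image.open("char_map.png"), xy=(4, 0))   # character overlap
layer = Image.new("RGB", normal.size, (128, 128, 128))
transition = overlay_patch(normal, layer, xy=(0, 0), alpha=96)           # background transition anomaly
edge = Image.new("RGB", (8, normal.size[1]), (0, 0, 0))
shielded = overlay_patch(normal, edge, xy=(0, 0))                        # character edge shielding
```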
In some embodiments, the method further comprises: performing graphic heterogeneous processing on sample local regions of the character normal type, the character overlap type, the character background transition anomaly type, and the character edge shielding type, to generate a plurality of local positive samples, a plurality of first local negative samples, a plurality of second local negative samples, and a plurality of third local negative samples.
In a second aspect, an embodiment of the present application provides an apparatus for testing a user interface, the apparatus comprising: an acquisition unit configured to acquire a screenshot of a user interface to be tested; a determining unit configured to determine whether the screenshot satisfies a preset condition; and a test unit configured to, in response to determining that the preset condition is satisfied, input the screenshot into a pre-trained global test model to obtain a global test result for the user interface to be tested, segment at least one local region from the screenshot, and input the segmented local region(s) into a pre-trained local test model to obtain a local test result for the user interface to be tested, wherein each local region includes a character string.
In some embodiments, the determining unit comprises: an execution module configured to perform at least one of the following operations: recognizing at least one character string in the screenshot using Optical Character Recognition (OCR) and matching each recognized character string against a preset error character string set; determining the color types contained in the screenshot using a color model; and a determining module configured to determine, based on the result of the at least one operation, whether the screenshot satisfies the preset condition, wherein the preset condition includes at least one of the following: no character string matches the preset error character string set, and the number of color types is greater than a preset number.
In some embodiments, the test unit comprises: a document object model acquisition module configured to acquire a Document Object Model (DOM) of the user interface to be tested, wherein the DOM includes position information of at least one first local region, each first local region containing one character string; and a first local region segmentation module configured to segment the at least one first local region from the screenshot, as indicated by the position information, to serve as the at least one local region.
In some embodiments, the DOM of the user interface to be tested further includes position information of at least one second local region, wherein each second local region contains a plurality of character strings; and the test unit further comprises: a second local region segmentation module configured to segment the at least one second local region from the screenshot as indicated by its position information; and a local region segmentation module configured to, for each second local region, recognize the character strings it contains using OCR and segment, from the second local region, the region in which each recognized character string is located, to serve as a plurality of local regions.
In some embodiments, the global test result includes a global type and location information of a region corresponding to the global type, and the local test result includes a local type and location information of a region corresponding to the local type.
In some embodiments, the global type includes at least one of: a global normal type, a global blank type, and a global keyboard anomaly type; and the local type includes at least one of: a character normal type, a character overlap type, a character background transition anomaly type, and a character edge shielding type.
In some embodiments, the global test model includes at least one of: a global blank test model and a global keyboard anomaly test model; and the local test model includes at least one of: a character overlap test model, a character background transition anomaly test model, and a character edge shielding test model.
In some embodiments, the apparatus further comprises a global test model training unit, the global test model training unit comprising: a global positive sample acquisition module configured to acquire screenshots of sample user interfaces of the global normal type, together with the corresponding global test results, as global positive samples; a first global negative sample acquisition module configured to acquire screenshots of sample user interfaces of the global blank type, together with the corresponding global test results, as first global negative samples; a second global negative sample acquisition module configured to acquire screenshots of sample user interfaces of the global keyboard anomaly type, together with the corresponding global test results, as second global negative samples; a global blank test model training module configured to train a preset first convolutional neural network on the global positive samples and the first global negative samples with a deep learning method to obtain the global blank test model; and a global keyboard anomaly test model training module configured to train a preset second convolutional neural network on the global positive samples and the second global negative samples with a deep learning method to obtain the global keyboard anomaly test model.
In some embodiments, the apparatus further comprises a global sample generation unit comprising: a first global negative sample generation module configured to overlay a first preset map on a screenshot of a sample user interface of the global normal type to generate a screenshot of a sample user interface of the global blank type; and a second global negative sample generation module configured to overlay a preset keyboard anomaly map on a screenshot of a sample user interface of the global normal type to generate a screenshot of a sample user interface of the global keyboard anomaly type.
In some embodiments, the global sample generation unit further comprises: a first graphic heterogeneous processing module configured to perform graphic heterogeneous processing on the screenshots of sample user interfaces of the global normal type, of the global blank type, and of the global keyboard anomaly type, to generate a plurality of global positive samples, a plurality of first global negative samples, and a plurality of second global negative samples.
In some embodiments, the apparatus further comprises a local test model training unit, the local test model training unit comprising: a local positive sample acquisition module configured to acquire sample local regions of the character normal type, together with the corresponding local test results, as local positive samples; a first local negative sample acquisition module configured to acquire sample local regions of the character overlap type, together with the corresponding local test results, as first local negative samples; a second local negative sample acquisition module configured to acquire sample local regions of the character background transition anomaly type, together with the corresponding local test results, as second local negative samples; a third local negative sample acquisition module configured to acquire sample local regions of the character edge shielding type, together with the corresponding local test results, as third local negative samples; a character overlap test model training module configured to train a preset third convolutional neural network on the local positive samples and the first local negative samples with a deep learning method to obtain the character overlap test model; a character background transition anomaly test model training module configured to train a preset fourth convolutional neural network on the local positive samples and the second local negative samples with a deep learning method to obtain the character background transition anomaly test model; and a character edge shielding test model training module configured to train a preset fifth convolutional neural network on the local positive samples and the third local negative samples with a deep learning method to obtain the character edge shielding test model.
In some embodiments, the apparatus further comprises a local sample generation unit comprising: a first local negative sample generation module configured to overlay a preset character map on a sample local region of the character normal type to generate a sample local region of the character overlap type; a second local negative sample generation module configured to overlay a preset layer on a sample local region of the character normal type to generate a sample local region of the character background transition anomaly type; and a third local negative sample generation module configured to overlay a second preset map on the edge of a sample local region of the character normal type to generate a sample local region of the character edge shielding type.
In some embodiments, the local sample generation unit further comprises: a second graphic heterogeneous processing module configured to perform graphic heterogeneous processing on sample local regions of the character normal type, the character overlap type, the character background transition anomaly type, and the character edge shielding type, to generate a plurality of local positive samples, a plurality of first local negative samples, a plurality of second local negative samples, and a plurality of third local negative samples.
In a third aspect, an embodiment of the present application provides an electronic device, comprising: one or more processors; and a storage apparatus for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method as described in any implementation of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable medium on which a computer program is stored, the program, when executed by a processor, implementing the method as described in any implementation of the first aspect.
According to the method and apparatus for testing a user interface provided by the embodiments of the present application, a screenshot of the user interface to be tested is first acquired, and it is determined whether the screenshot satisfies a preset condition; if the preset condition is satisfied, the screenshot is input into a pre-trained global test model to obtain a global test result for the user interface to be tested; then at least one local region is segmented from the screenshot and input into a pre-trained local test model to obtain a local test result for the user interface to be tested. Testing the user interface with a deep learning method that combines a global test model and a local test model requires no manual judgment by testers, which saves labor cost and improves the efficiency of user interface testing.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of a method for testing a user interface according to the present application;
FIG. 3 is a flow diagram of yet another embodiment of a method for testing a user interface according to the present application;
FIG. 4 is a flow diagram of one embodiment of a method for training a global test model according to the present application;
FIG. 5 is a flow diagram of one embodiment of a method for training a local test model according to the present application;
FIG. 6 is a schematic block diagram illustrating one embodiment of an apparatus for testing a user interface according to the present application;
FIG. 7 is a block diagram of a computer system suitable for use in implementing the electronic device of an embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the present invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 shows an exemplary system architecture 100 to which embodiments of the method for testing a user interface or the apparatus for testing a user interface of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The user may use the terminal devices 101, 102, 103 to interact with the server 105 via the network 104 to receive or send messages or the like. Various client applications, such as shopping applications, search applications, instant messaging tools, mailbox clients, social platform software, etc., may be installed on the terminal devices 101, 102, 103.
The terminal devices 101, 102, 103 may be various electronic devices including, but not limited to, smart phones, tablet computers, laptop portable computers, desktop computers, and the like.
The server 105 may be a server that provides various services, such as a user interface test server that tests a user interface. The user interface test server may analyze and perform other processing on the received screenshot of the user interface to be tested, and generate a processing result (e.g., a global test result and a local test result corresponding to the user interface to be tested).
It should be noted that the method for testing the user interface provided in the embodiment of the present application is generally performed by the server 105, and accordingly, the apparatus for testing the user interface is generally disposed in the server 105.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a method for testing a user interface in accordance with the present application is shown. The method for testing the user interface comprises the following steps:
Step 201, acquiring a screenshot of a user interface to be tested.
In this embodiment, an electronic device (for example, the server 105 shown in fig. 1) on which the method for testing a user interface runs may obtain a screenshot of the user interface to be tested from a terminal device (for example, the terminal devices 101, 102, and 103 shown in fig. 1) through a wired or wireless connection. Generally, the terminal device may be an electronic device with a screenshot function on which various client applications are installed, such as shopping applications, search applications, instant messaging tools, mailbox clients, and social platform software. While running a client application, the terminal device displays the application's user interface and can capture the displayed interface using its screenshot function. The user interface generally refers to the operation interface presented to the user, also called the human-computer interface, through which the user interacts with the client application.
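For instance, when the terminal device is an Android phone connected over USB, the server side could pull such a screenshot with adb. The following is a minimal sketch of one such acquisition path, offered only as an illustration and not as the patent's prescribed mechanism.

```python
import subprocess
from io import BytesIO
from PIL import Image

def capture_screenshot() -> Image.Image:
    """Capture the currently displayed user interface from a connected Android device."""
    # `adb exec-out screencap -p` writes a PNG of the current screen to stdout.
    png_bytes = subprocess.run(
        ["adb", "exec-out", "screencap", "-p"],
        check=True, capture_output=True,
    ).stdout
    return Image.open(BytesIO(png_bytes))

screenshot = capture_screenshot()
screenshot.save("ui_under_test.png")  # assumed file name, reused in later sketches
```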
Step 202, determining whether the screenshot of the user interface to be tested meets a preset condition.
In this embodiment, based on the screenshot obtained in step 201, the electronic device may determine whether the screenshot satisfies a preset condition; if the preset condition is satisfied, step 203' and step 203 are executed, and otherwise the flow ends. The preset condition may be any condition set in advance. Generally, if the screenshot satisfies the preset condition, the user interface has at most minor defects and is suitable for testing by this method; if it does not, the user interface has a severe defect and is not suitable for testing by this method, so the flow ends.
In some optional implementations of this embodiment, the preset condition may be empty; in that case the screenshot of the user interface to be tested is deemed to satisfy the preset condition by default, and step 203' and step 203 may be executed directly.
In some optional implementations of this embodiment, the preset condition may include, but is not limited to, at least one of the following: no character string matches the preset error character string set, and the number of color types is greater than a preset number.
Specifically, first, the electronic device may perform at least one of the following operations:
1. Recognizing at least one character string in the screenshot of the user interface to be tested using Optical Character Recognition (OCR), and matching each recognized character string against a preset error character string set. Here, OCR refers to a technique by which an electronic device examines the characters in the screenshot, determines their shapes from patterns of dark and light pixels, and translates those shapes into computer text through character recognition: the character strings in the screenshot are optically converted into a black-and-white bitmap image, and recognition software then converts the characters in the image into a text format for further editing and processing by word processing software. The character strings in the preset error character string set may be any predefined error strings. For example, they may be character strings from the source code of the user interface to be tested or garbled-text strings; if a recognized character string matches the preset error character string set, this indicates that source-code strings were not fully converted into the strings to be displayed, or that garbled text appears on the user interface. In that case the user interface has a severe defect, so the flow ends.
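As a rough illustration of this check, the sketch below uses the pytesseract OCR binding; the error-string set and the screenshot path are assumptions made for the example, not values given in the patent.

```python
import pytesseract
from PIL import Image

# Assumed examples of error strings: raw template markup and garbled text.
ERROR_STRINGS = {"{{title}}", "NullPointerException", "ï¿½"}

def matches_error_strings(screenshot_path: str) -> bool:
    """Return True if any OCR-recognized string matches the preset error string set."""
    text = pytesseract.image_to_string(Image.open(screenshot_path), lang="chi_sim+eng")
    recognized = {token for token in text.split() if token}
    return any(err in token for token in recognized for err in ERROR_STRINGS)

if matches_error_strings("ui_under_test.png"):
    print("Severe defect: error string found; ending the flow.")
```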
2. Determining the color types contained in the screenshot of the user interface to be tested using a color model. Here, the color model may be the HSV (Hue, Saturation, Value) color model, also called the hexagonal cone model (Hexcone Model), whose color parameters are hue (H), saturation (S), and value/lightness (V). The HSV model can be used to quickly determine the color types contained in the screenshot. For example, if the screenshot contains only one color, it is a solid-color picture, which indicates that there are no characters on the user interface to be tested.
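A simple way to approximate this color-type count is to quantize the hue channel in HSV space and count the occupied bins. The sketch below uses OpenCV; the bin count and occupancy threshold are illustrative assumptions.

```python
import cv2
import numpy as np

def count_color_types(screenshot_path: str, hue_bins: int = 12, min_fraction: float = 0.01) -> int:
    """Count distinct color types as hue bins covering at least `min_fraction` of pixels."""
    bgr = cv2.imread(screenshot_path)
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    hue = hsv[:, :, 0]  # OpenCV hue range is 0..179
    hist, _ = np.histogram(hue, bins=hue_bins, range=(0, 180))
    return int(np.sum(hist / hue.size >= min_fraction))

# This part of the preset condition holds if the screenshot has more than
# a preset number of color types (e.g., more than 1).
print(count_color_types("ui_under_test.png") > 1)
```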
Then, the electronic device may determine, based on the result of the at least one operation, whether the screenshot of the user interface to be tested satisfies the preset condition.
Specifically, if none of the recognized character strings in the screenshot matches the preset error character string set, and/or the number of color types contained in the screenshot is greater than the preset number (for example, 1), the screenshot is determined to satisfy the preset condition, and step 203' and step 203 are then executed; if a recognized character string matches the preset error character string set, and/or the number of color types is not greater than the preset number, the screenshot is determined not to satisfy the preset condition, and the flow ends.
Step 203', inputting the screenshot of the user interface to be tested into the global test model to obtain a global test result corresponding to the user interface to be tested.
In this embodiment, when the screenshot satisfies the preset condition, the electronic device may input it into the global test model to obtain a global test result for the user interface to be tested. The global test result may include only the global type, or both the global type and the position information of the region corresponding to the global type. The global type may include, but is not limited to, at least one of: a global normal type, a global blank type, and a global keyboard anomaly type. The global blank type may include a global black type and a global white type. For example, for a given user interface, the global test result may be the global blank type together with the position information of the blank region.
In this embodiment, the global test model may be used to represent the correspondence between a screenshot of a user interface and the global test result for that user interface. As an example, a person skilled in the art may statistically analyze a large number of user interface screenshots and their global test results to generate a correspondence table storing screenshots and their corresponding global test results, and use this table as the global test model. The electronic device may compute the similarity between the screenshot of the user interface to be tested and each screenshot in the table and, based on the similarity results, look up the global test result for the user interface to be tested: for example, find the stored screenshot most similar to the screenshot under test, and return its global test result from the table as the result for the user interface to be tested.
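A toy version of this table-lookup model might compare pixels directly. The sketch below uses a plain mean-squared-difference similarity over resized grayscale images; the metric, table contents, and file names are assumptions for illustration, not the patent's prescribed choices.

```python
import numpy as np
from PIL import Image

def _features(img: Image.Image) -> np.ndarray:
    # Normalize size and color so screenshots are directly comparable.
    return np.asarray(img.convert("L").resize((128, 256)), dtype=np.float32)

def lookup_global_result(screenshot: Image.Image, table: list[tuple[Image.Image, str]]) -> str:
    """Return the stored global test result of the most similar stored screenshot."""
    query = _features(screenshot)
    diffs = [np.mean((query - _features(ref)) ** 2) for ref, _ in table]
    return table[int(np.argmin(diffs))][1]

# Illustrative correspondence table (contents assumed):
table = [
    (Image.open("normal_ui.png"), "global normal"),
    (Image.open("blank_ui.png"), "global blank"),
]
print(lookup_global_result(Image.open("ui_under_test.png"), table))
```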
In this embodiment, the global test model may consist of one multi-class model or of several single-class models. When it consists of several single-class models, the global test model may include, but is not limited to, at least one of: a global blank test model and a global keyboard anomaly test model.
Step 203, at least one local area is segmented from the screenshot of the user interface to be tested.
In this embodiment, when the screenshot satisfies the preset condition, the electronic device may segment at least one local region from it, wherein each local region contains one character string. As an example, the electronic device may recognize at least one character string in the screenshot using OCR, and then segment, from the screenshot, the region in which each recognized character string is located. The region in which a character string is located may be the minimal rectangular region containing that character string.
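One way to obtain those minimal rectangles is pytesseract's box-level output; in the sketch below the grouping granularity (word level) and the confidence cutoff are assumptions.

```python
import pytesseract
from PIL import Image

def segment_text_regions(screenshot: Image.Image, min_conf: int = 60) -> list[Image.Image]:
    """Crop the minimal rectangle around each OCR-recognized string."""
    data = pytesseract.image_to_data(screenshot, output_type=pytesseract.Output.DICT)
    crops = []
    for text, conf, x, y, w, h in zip(
        data["text"], data["conf"], data["left"], data["top"], data["width"], data["height"]
    ):
        if text.strip() and int(conf) >= min_conf:
            crops.append(screenshot.crop((x, y, x + w, y + h)))
    return crops

local_regions = segment_text_regions(Image.open("ui_under_test.png"))
```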
Step 204, inputting the segmented local region(s) into a pre-trained local test model to obtain a local test result corresponding to the user interface to be tested.
In this embodiment, based on the at least one local region segmented in step 203, the electronic device may input each segmented local region into the local test model one by one to obtain the local test result for the user interface to be tested. The local test result may include only the local type, or both the local type and the position information of the region corresponding to the local type. The local type may include, but is not limited to, at least one of: a character normal type, a character overlap type, a character background transition anomaly type, and a character edge shielding type. For example, for a local region containing characters segmented from the screenshot, the local test result may be the character overlap type together with the position information of that local region.
In this embodiment, the local test model may be used to represent the correspondence between a local region of a user interface screenshot and the local test result for that region. As an example, a person skilled in the art may statistically analyze a large number of local regions and their local test results to generate a correspondence table storing local regions and their corresponding local test results, and use this table as the local test model. The electronic device may compute the similarity between a local region of the screenshot under test and each local region in the table and, based on the similarity results, look up the corresponding local test result: for example, find the stored local region most similar to the one under test, and return its local test result from the table as the local test result for the user interface to be tested.
In this embodiment, the local test model may consist of one multi-class model or of several single-class models. When it consists of several single-class models, the local test model may include, but is not limited to, at least one of: a character overlap test model, a character background transition anomaly test model, and a character edge shielding test model.
According to the method for testing a user interface provided by the above embodiment of the present application, a screenshot of the user interface to be tested is acquired and checked against a preset condition; if the condition is satisfied, the screenshot is first input into the global test model to obtain a global test result for the user interface to be tested; then at least one local region is segmented from the screenshot and input into a pre-trained local test model to obtain a local test result for the user interface to be tested. Testing the user interface with a deep learning method that combines the global and local test models requires no manual judgment by testers, which saves labor cost and improves the efficiency of user interface testing.
With further reference to FIG. 3, a flow 300 of yet another embodiment of a method for testing a user interface according to the present application is shown. The process 300 of the method for testing a user interface includes the steps of:
step 301, acquiring a screenshot of a user interface to be tested.
In this embodiment, an electronic device (for example, the server 105 shown in fig. 1) on which the method for testing a user interface runs may obtain a screenshot of the user interface to be tested from a terminal device (for example, the terminal devices 101, 102, and 103 shown in fig. 1) through a wired or wireless connection. In general, the terminal device may be an electronic device with a screenshot function on which various client applications are installed; while running a client application, it displays the application's user interface and can capture the displayed interface using its screenshot function.
Step 302, determining whether the screenshot of the user interface to be tested meets a preset condition.
In this embodiment, based on the screenshot obtained in step 301, the electronic device may determine whether the screenshot satisfies a preset condition; if the preset condition is satisfied, step 303' and step 303 are executed, and otherwise the flow ends. The preset condition may be any condition set in advance.
Step 303', inputting the screenshot of the user interface to be tested into the global test model to obtain a global test result corresponding to the user interface to be tested.
In this embodiment, when the screenshot satisfies the preset condition, the electronic device may input it into the global test model to obtain a global test result for the user interface to be tested. The global test result may include only the global type, or both the global type and the position information of the region corresponding to the global type. The global type may include, but is not limited to, at least one of: a global normal type, a global blank type, and a global keyboard anomaly type. The global blank type may include a global black type and a global white type.
Step 303, obtaining a Document Object Model (DOM) of the user interface to be tested.
In this embodiment, the electronic device may obtain the DOM (Document Object Model) of the user interface to be tested from the terminal device through a wired or wireless connection. On a user interface, the interface's objects are organized in a tree structure, and the standard model used to represent those objects is called the DOM of the user interface. In practice, the operating system of the terminal device may be Android (a Linux-based free and open-source operating system) or iOS (a mobile operating system). Generally, whether the operating system is Android or iOS, the DOM of the user interface to be tested contains TextView information (a TextView is a component that displays a character string), and this TextView information may include the position information of at least one first local region, wherein each first local region contains exactly one character string and may be the minimal rectangular region containing that character string.
Step 304, segmenting the at least one first local region from the screenshot of the user interface to be tested, as indicated by the position information of the at least one first local region, to serve as the at least one local region.
In this embodiment, based on the DOM obtained in step 303, the electronic device may locate the at least one first local region in the screenshot according to its position information, and then segment the at least one first local region from the screenshot to serve as the at least one local region.
In some optional implementations of this embodiment, if the operating system of the terminal device is Android, the DOM of the user interface to be tested may further include WebView information (a WebView is a web page view, i.e., a control used to display a web page). For example, a node whose class attribute is "android.webkit.WebView" corresponds to a WebView. The WebView information may include the position information of at least one second local region, wherein each second local region contains a plurality of character strings and may be the minimal rectangular region containing those character strings. The electronic device may segment the at least one second local region from the screenshot as indicated by its position information; then, for each second local region, it recognizes the character strings in that region using OCR and segments, from the second local region, the region in which each recognized character string is located, to serve as a plurality of local regions.
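On Android, one concrete form of such a DOM is the XML hierarchy dump produced by uiautomator, where each node carries a bounds attribute. The sketch below parses such a dump and crops the TextView regions; the dump path and bounds format follow the uiautomator convention, but treating this dump as the patent's DOM is an assumption.

```python
import re
import xml.etree.ElementTree as ET
from PIL import Image

BOUNDS_RE = re.compile(r"\[(\d+),(\d+)\]\[(\d+),(\d+)\]")  # e.g. "[0,0][1080,1920]"

def crop_textviews(dump_path: str, screenshot: Image.Image) -> list[Image.Image]:
    """Crop each TextView node's bounding rectangle out of the screenshot."""
    crops = []
    for node in ET.parse(dump_path).iter("node"):
        if node.get("class") == "android.widget.TextView":
            m = BOUNDS_RE.match(node.get("bounds", ""))
            if m:
                x1, y1, x2, y2 = map(int, m.groups())
                crops.append(screenshot.crop((x1, y1, x2, y2)))
    return crops

# `adb shell uiautomator dump` writes the hierarchy to /sdcard/window_dump.xml.
regions = crop_textviews("window_dump.xml", Image.open("ui_under_test.png"))
```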
Step 305, inputting the at least one segmented local region into a pre-trained local test model to obtain a local test result corresponding to the user interface to be tested.
In this embodiment, based on the at least one local region segmented in step 304, the electronic device may input each segmented local region into the local test model one by one to obtain the local test result for the user interface to be tested. The local test result may include only the local type, or both the local type and the position information of the region corresponding to the local type. The local type may include, but is not limited to, at least one of: a character normal type, a character overlap type, a character background transition anomaly type, and a character edge shielding type.
As can be seen from fig. 3, compared with the embodiment corresponding to fig. 2, the flow 300 of the method for testing a user interface in this embodiment highlights the step of segmenting the local regions. In the scheme described in this embodiment, the DOM of the user interface to be tested is analyzed so that the position information of the local regions can be determined quickly, which improves the efficiency of segmenting local regions.
With further reference to FIG. 4, a flow 400 of one embodiment of a method for training a global test model according to the present application is shown. The process 400 of the method for training a global test model includes the steps of:
Step 401, acquiring a screenshot of a sample user interface of the global normal type and the corresponding global test result as a global positive sample.
In this embodiment, an electronic device (e.g., the server 105 shown in fig. 1) on which the method for training the global test model runs may acquire a screenshot of a sample user interface of the global normal type and the corresponding global test result, and take them together as a global positive sample. The screenshot is the input information of the global positive sample, and the corresponding global test result is its output information. In general, a screenshot of a sample user interface of the global normal type is identical to the expected reference picture of that sample user interface.
Step 402, acquiring a screenshot of a sample user interface of the global blank type and the corresponding global test result as a first global negative sample.
In this embodiment, the electronic device may acquire a screenshot of a sample user interface of the global blank type and the corresponding global test result, and take them together as a first global negative sample. The screenshot is the input information of the first global negative sample, and the corresponding global test result is its output information. Generally, the screenshot of a sample user interface of the global blank type may be a screenshot captured when a global-blank defect actually occurs, or a screenshot generated from a screenshot of a sample user interface of the global normal type.
In some optional implementations of this embodiment, the electronic device may overlay the first preset map on a screenshot of a sample user interface of the global normal type to generate a screenshot of a sample user interface of the global blank type. The first preset map may be a solid-color map with a preset size and preset pixel values (e.g., RGB values): for example, a solid-color map whose area is no less than one fifth of the screenshot and whose RGB values lie in 0-10 or 250-255. Overlaying a solid-color map with RGB values in 0-10 on a global-normal screenshot generates a screenshot of the global black type; overlaying a solid-color map with RGB values in 250-255 generates a screenshot of the global white type.
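The following sketch synthesizes such global-blank negative samples with Pillow, following the stated constraints (patch area at least one fifth of the screenshot; RGB values in 0-10 or 250-255). The patch shape and its randomized placement are assumptions.

```python
import random
from PIL import Image

def make_global_blank_sample(normal_path: str, black: bool = True) -> Image.Image:
    """Overlay a solid patch covering >= 1/5 of a global-normal screenshot."""
    shot = Image.open(normal_path).convert("RGB")
    w, h = shot.size
    pw, ph = w, max(h // 5, 1)                      # full width, one fifth of the height
    value = random.randint(0, 10) if black else random.randint(250, 255)
    patch = Image.new("RGB", (pw, ph), (value, value, value))
    shot.paste(patch, (0, random.randint(0, h - ph)))
    return shot

black_negative = make_global_blank_sample("normal_ui.png", black=True)    # global black type
white_negative = make_global_blank_sample("normal_ui.png", black=False)   # global white type
```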
Step 403, training a preset first convolutional neural network based on the global positive sample and the first global negative sample by using a deep learning method to obtain a global blank test model.
In this embodiment, the electronic device may take the screenshots of the sample user interfaces belonging to the global normal type in the global positive samples and the screenshots of the sample user interfaces belonging to the global blank type in the first global negative samples as inputs, take the global test results corresponding to those sample user interfaces as the corresponding outputs, and train a preset first convolutional neural network (e.g., ResNet) to obtain the global blank test model. A convolutional neural network (CNN) is a feed-forward neural network whose artificial neurons respond only to units within a local receptive field, which gives it excellent performance on large-scale image processing. Here, the network parameters (e.g., weight parameters and bias parameters) of the first convolutional neural network may be initialized with distinct small random numbers, and may be adjusted during training using the BP (Back Propagation) algorithm or the SGD (Stochastic Gradient Descent) algorithm.
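As an illustration only, a training loop of this kind might be sketched as follows in PyTorch (an assumption; the patent does not name a framework), with `train_loader` a hypothetical iterator over (screenshot tensor, label) pairs built from the positive and negative samples:

```python
# Sketch: train a binary classifier (global normal vs. global blank) with a
# ResNet backbone, BP for the gradients and SGD for the parameter updates.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=None)          # randomly initialized ResNet
model.fc = nn.Linear(model.fc.in_features, 2)  # two classes: normal / blank

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

def train_epoch(train_loader):
    model.train()
    for screenshots, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(screenshots), labels)
        loss.backward()   # back propagation computes the gradients
        optimizer.step()  # SGD adjusts weight and bias parameters
```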
Step 402', a screenshot of the sample user interface belonging to the global keyboard exception type and a global test result corresponding to the sample user interface belonging to the global keyboard exception type are obtained as a second global negative sample.
In this embodiment, the electronic device may obtain a screenshot of a sample user interface belonging to the global keyboard exception type and the corresponding global test result, and use the two together as the second global negative sample. The screenshot of the sample user interface belonging to the global keyboard exception type is the input information in the second global negative sample, and the corresponding global test result is the output information. Generally, the screenshot of a sample user interface belonging to the global keyboard exception type may be a screenshot captured when a defect of the global keyboard exception type actually occurs, or may be a screenshot generated from a screenshot of a sample user interface belonging to the global normal type.
In some optional implementations of this embodiment, the electronic device may overlay a preset keyboard exception map on the screenshot of a sample user interface belonging to the global normal type to generate a screenshot of a sample user interface belonging to the global keyboard exception type. The preset keyboard exception map may be overlaid on the keyboard region in the screenshot of the sample user interface of the global normal type.
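A sketch of this keyboard-region overlay, under the assumption that the soft keyboard occupies roughly the bottom portion of the screen (the fraction and file names are hypothetical):

```python
# Sketch: paste a keyboard-exception map over the keyboard region
# of a globally normal screenshot to create a negative sample.
from PIL import Image

screenshot = Image.open("normal.png").convert("RGB")
keyboard_map = Image.open("keyboard_exception.png").convert("RGB")

w, h = screenshot.size
kb_top = int(h * 0.6)  # assume the keyboard covers the bottom 40% of the screen
keyboard_map = keyboard_map.resize((w, h - kb_top))
screenshot.paste(keyboard_map, (0, kb_top))
screenshot.save("keyboard_abnormal.png")
```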
Step 403', training a preset second convolutional neural network based on the global positive sample and the second global negative sample by using a deep learning method to obtain a global keyboard anomaly test model.
In this embodiment, the electronic device may take the screenshots of the sample user interfaces belonging to the global normal type in the global positive samples and the screenshots of the sample user interfaces belonging to the global keyboard exception type in the second global negative samples as inputs, take the global test results corresponding to those sample user interfaces as the corresponding outputs, and train a preset second convolutional neural network (e.g., ResNet) to obtain the global keyboard anomaly test model. Here, the network parameters of the second convolutional neural network may be initialized with distinct small random numbers, and the BP algorithm or the SGD algorithm may be used to adjust them during training.
In some optional implementations of this embodiment, the electronic device may perform graphic heterogeneous processing on the screenshots of sample user interfaces belonging to the global normal type, the global blank type, and the global keyboard exception type, so as to generate a plurality of global positive samples, a plurality of first global negative samples, and a plurality of second global negative samples. Here, the electronic device may process the screenshots with transformations such as rotation, scaling, and zooming, thereby expanding the number of screenshots of sample user interfaces.
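By way of illustration only, such expansion could be sketched as follows with Pillow (the transform parameters are arbitrary examples):

```python
# Sketch: expand one screenshot into several variants via rotation,
# scaling, and zooming, to enlarge the training set.
from PIL import Image

def augment(path):
    img = Image.open(path).convert("RGB")
    return [
        img.rotate(5, expand=True),                     # slight rotation
        img.rotate(-5, expand=True),
        img.resize((img.width // 2, img.height // 2)),  # scale down
        img.resize((img.width * 2, img.height * 2)),    # zoom up
    ]
```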
With further reference to FIG. 5, a flow 500 of one embodiment of a method for training a local test model according to the present application is illustrated. The process 500 of the method for training a local test model includes the steps of:
Step 501, a sample local area belonging to the character normal type and a local test result corresponding to the sample local area belonging to the character normal type are obtained as local positive samples.
In this embodiment, an electronic device (e.g., the server 105 shown in fig. 1) on which the method for training the local test model operates may obtain a sample local area belonging to the character normal type and the local test result corresponding to that sample local area, and take the two together as a local positive sample. The sample local area belonging to the character normal type is the input information in the local positive sample, and the corresponding local test result is the output information. In general, a sample local area belonging to the character normal type is identical to the expected reference for that local area.
Step 502, a sample local area belonging to the character overlap type and a local test result corresponding to the sample local area belonging to the character overlap type are obtained as a first local negative sample.
In this embodiment, the electronic device may acquire a sample local area belonging to the character overlap type and the local test result corresponding to that sample local area, and take the two together as a first local negative sample. The sample local area belonging to the character overlap type is the input information in the first local negative sample, and the corresponding local test result is the output information. In general, a sample local area belonging to the character overlap type may be an image captured when a defect of the character overlap type actually occurs, or may be an image generated from a sample local area belonging to the character normal type.
In some optional implementations of this embodiment, the electronic device may overlay a preset character map on a sample local area belonging to the character normal type to generate a sample local area belonging to the character overlap type.
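For example, a minimal sketch (the drawn string, offset, and default-font handling are assumptions) could be:

```python
# Sketch: draw a second character string almost on top of the existing one
# to simulate a character-overlap defect in a normal text region.
from PIL import Image, ImageDraw

region = Image.open("normal_text_region.png").convert("RGB")
draw = ImageDraw.Draw(region)
# A small offset keeps the two strings visibly entangled.
draw.text((4, 2), "overlapping text", fill=(20, 20, 20))
region.save("overlap_negative.png")
```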
Step 503, training a preset third convolutional neural network based on the local positive sample and the first local negative sample by using a deep learning method to obtain a character overlap test model.
In this embodiment, the electronic device may take the sample local areas belonging to the character normal type in the local positive samples and the sample local areas belonging to the character overlap type in the first local negative samples as inputs, take the corresponding local test results as the corresponding outputs, and train a preset third convolutional neural network (e.g., ResNet) to obtain the character overlap test model. Here, the network parameters of the third convolutional neural network may be initialized with distinct small random numbers, and the BP algorithm or the SGD algorithm may be used to adjust them during training.
Step 502', a sample local area belonging to the character background transition abnormal type and a local test result corresponding to the sample local area belonging to the character background transition abnormal type are obtained as a second local negative sample.
In this embodiment, the electronic device may obtain a sample local area belonging to the character background transition abnormal type and the local test result corresponding to that sample local area, and take the two together as a second local negative sample. The sample local area belonging to the character background transition abnormal type is the input information in the second local negative sample, and the corresponding local test result is the output information. In general, a sample local area belonging to the character background transition abnormal type may be an image captured when a defect of this type actually occurs, or may be an image generated from a sample local area belonging to the character normal type.
In some optional implementations of this embodiment, the electronic device may overlay a preset layer on a sample local area belonging to the character normal type to generate a sample local area belonging to the character background transition abnormal type. The preset layer may be a layer with a preset transparency, for example, a layer with a transparency of 50%.
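A sketch of this blending step, assuming Pillow (the gray color and 50% alpha are illustrative):

```python
# Sketch: blend a half-transparent layer over a normal text region to
# simulate an abnormal character/background transition.
from PIL import Image

region = Image.open("normal_text_region.png").convert("RGBA")
layer = Image.new("RGBA", region.size, (128, 128, 128, 128))  # 50% alpha
blended = Image.alpha_composite(region, layer).convert("RGB")
blended.save("transition_negative.png")
```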
Step 503', training a preset fourth convolutional neural network based on the local positive sample and the second local negative sample by using a deep learning method to obtain a character background transition anomaly test model.
In this embodiment, the electronic device may take the sample local areas belonging to the character normal type in the local positive samples and the sample local areas belonging to the character background transition abnormal type in the second local negative samples as inputs, take the corresponding local test results as the corresponding outputs, and train a preset fourth convolutional neural network (e.g., ResNet) to obtain the character background transition anomaly test model. Here, the network parameters of the fourth convolutional neural network may be initialized with distinct small random numbers, and the BP algorithm or the SGD algorithm may be used to adjust them during training.
Step 502'', a sample local area belonging to the character edge occlusion type and a local test result corresponding to the sample local area belonging to the character edge occlusion type are obtained as a third local negative sample.
In this embodiment, the electronic device may obtain a sample local area belonging to the character edge occlusion type and the local test result corresponding to that sample local area, and take the two together as a third local negative sample. The sample local area belonging to the character edge occlusion type is the input information in the third local negative sample, and the corresponding local test result is the output information. In general, a sample local area belonging to the character edge occlusion type may be an image captured when a defect of this type actually occurs, or may be an image generated from a sample local area belonging to the character normal type.
In some optional implementations of this embodiment, the electronic device may overlay a second preset map on an edge of a sample local area belonging to the character normal type to generate a sample local area belonging to the character edge occlusion type. The second preset map may be a solid-color map with a preset size and preset pixel values (e.g., RGB values), and may be overlaid at the edge of the character string in the sample local area.
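For instance (geometry and file names are hypothetical):

```python
# Sketch: cover the right edge of a normal text region with a solid-color
# patch so the character edges appear occluded.
from PIL import Image

region = Image.open("normal_text_region.png").convert("RGB")
w, h = region.size
patch = Image.new("RGB", (w // 6, h), (250, 250, 250))  # second preset map
region.paste(patch, (w - patch.width, 0))               # occlude the edge
region.save("edge_occlusion_negative.png")
```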
Step 503'', training a preset fifth convolutional neural network based on the local positive sample and the third local negative sample by using a deep learning method to obtain a character edge occlusion test model.
In this embodiment, the electronic device may take the sample local areas belonging to the character normal type in the local positive samples and the sample local areas belonging to the character edge occlusion type in the third local negative samples as inputs, take the corresponding local test results as the corresponding outputs, and train a preset fifth convolutional neural network (e.g., ResNet) to obtain the character edge occlusion test model. Here, the network parameters of the fifth convolutional neural network may be initialized with distinct small random numbers, and the BP algorithm or the SGD algorithm may be used to adjust them during training.
In some optional implementations of this embodiment, the electronic device may perform graphic heterogeneous processing on the sample local areas belonging to the character normal type, the character overlap type, the character background transition abnormal type, and the character edge occlusion type, so as to generate a plurality of local positive samples, a plurality of first local negative samples, a plurality of second local negative samples, and a plurality of third local negative samples. Here, the electronic device may process the sample local areas with transformations such as rotation, scaling, and zooming, thereby expanding the number of sample local areas.
With further reference to fig. 6, as an implementation of the method shown in the above figures, the present application provides an embodiment of an apparatus for testing a user interface, where the embodiment of the apparatus corresponds to the embodiment of the method shown in fig. 2, and the apparatus may be applied to various electronic devices.
As shown in fig. 6, the apparatus 600 for testing a user interface of the present embodiment may include: an acquisition unit 601, a determination unit 602, and a test unit 603. The acquisition unit 601 is configured to acquire a screenshot of a user interface to be tested; a determining unit 602 configured to determine whether a screenshot of a user interface to be tested satisfies a preset condition; the testing unit 603 is configured to input the screenshot of the user interface to be tested to the global testing model in response to determining that the preset condition is met, and obtain a global testing result corresponding to the user interface to be tested; and segmenting at least one local area from the screenshot of the user interface to be tested, inputting the segmented at least one local area into a pre-trained local test model, and obtaining a local test result corresponding to the user interface to be tested, wherein each local area comprises a character string.
In the present embodiment, in the apparatus 600 for testing a user interface: the specific processing of the obtaining unit 601, the determining unit 602, and the testing unit 603 and the technical effects thereof can refer to the related descriptions of step 201, step 202, step 203, and steps 203'-204' in the corresponding embodiment of fig. 2, which are not repeated herein.
In some optional implementations of this embodiment, the determining unit 602 may include: an execution module (not shown in the figure) configured to perform at least one of the following operations: identifying at least one character string in the screenshot of the user interface to be tested by utilizing an Optical Character Recognition (OCR) technology, and matching the identified at least one character string in a preset error character string set; determining the color type included in the screenshot of the user interface to be tested by using a color model; a determining module (not shown in the figure) configured to determine whether the screenshot of the user interface to be tested meets a preset condition based on a result of the at least one operation, wherein the preset condition includes at least one of the following: the character strings are unsuccessfully matched in the preset error character string set, and the number of the color types is larger than the preset number.
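As a rough illustration of these two checks (not the claimed implementation), the sketch below uses pytesseract for OCR and a simple color count as the color model; the error-string set and the color threshold are invented placeholders:

```python
# Sketch: precondition check combining OCR error-string matching
# with a minimum-color-diversity test on the screenshot.
from PIL import Image
import pytesseract

ERROR_STRINGS = {"404", "internal server error", "page not found"}

def passes_precondition(path, min_colors=8):
    img = Image.open(path).convert("RGB")
    # Check 1: no known error string appears in the OCR output.
    text = pytesseract.image_to_string(img).lower()
    no_error_text = not any(s in text for s in ERROR_STRINGS)
    # Check 2: a nearly monochrome screenshot suggests a rendering failure.
    colors = img.getcolors(maxcolors=2 ** 24)
    enough_colors = colors is None or len(colors) > min_colors
    return no_error_text and enough_colors
```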
In some optional implementations of this embodiment, the test unit 603 may include: a document object model obtaining module (not shown in the figure) configured to obtain a document object model DOM of a user interface to be tested, where the DOM of the user interface to be tested may include location information of at least one first local area, where each first local area includes a character string; a first partial region segmentation module (not shown in the figure) configured to segment the at least one first partial region from the screenshot of the user interface to be tested as the at least one partial region according to the indication of the position information of the at least one first partial region.
In some optional implementations of this embodiment, the DOM of the user interface to be tested may further include location information of at least one second local area, where each second local area includes a plurality of character strings; and the test unit 603 may further include: a second local area splitting module (not shown in the figure) configured to split at least one second local area from the screenshot of the user interface to be tested according to the indication of the position information of the at least one second local area; and a local region segmentation module (not shown in the figure) configured to, for each of the at least one second local region, identify a plurality of character strings in the second local region by using an OCR technology, and segment, from the second local region, a region in which each of the identified character strings is located as the plurality of local regions.
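To illustrate DOM-driven segmentation, here is a sketch that obtains element geometry through Selenium and crops the screenshot with Pillow; the URL, the XPath, and the use of Selenium itself are assumptions, since the embodiment only requires that the DOM carry per-region position information:

```python
# Sketch: crop character-string regions from a page screenshot using
# bounding boxes read from the DOM via a browser driver.
from PIL import Image
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.com/page-under-test")
driver.save_screenshot("page.png")
page = Image.open("page.png")

regions = []
for elem in driver.find_elements(By.XPATH, "//*[normalize-space(text())]"):
    loc, size = elem.location, elem.size
    if size["width"] > 0 and size["height"] > 0:
        box = (int(loc["x"]), int(loc["y"]),
               int(loc["x"] + size["width"]), int(loc["y"] + size["height"]))
        regions.append(page.crop(box))  # one local area per character string
driver.quit()
```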
In some optional implementation manners of this embodiment, the global test result may include a global type and location information of a region corresponding to the global type, and the local test result may include a local type and location information of a region corresponding to the local type.
In some optional implementations of this embodiment, the global type may include, but is not limited to, at least one of: a global normal type, a global blank type, and a global keyboard exception type; the local type may include at least one of: a character normal type, a character overlap type, a character background transition abnormal type, and a character edge occlusion type.
In some optional implementations of this embodiment, the global test model may include, but is not limited to, at least one of: a global blank test model and a global keyboard anomaly test model; the local test model may include at least one of: a character overlap test model, a character background transition anomaly test model, and a character edge occlusion test model.
In some optional implementations of this embodiment, the apparatus 600 for testing a user interface may further include a global test model training unit (not shown in the figure), and the global test model training unit may include: a global positive sample obtaining module (not shown in the figure) configured to obtain a screenshot of a sample user interface belonging to a global normal type and a global test result corresponding to the sample user interface belonging to the global normal type as a global positive sample; a first global negative sample obtaining module (not shown in the figure) configured to obtain a screenshot of a sample user interface belonging to a global blank type and a global test result corresponding to the sample user interface belonging to the global blank type as a first global negative sample; a second global negative sample obtaining module (not shown in the figure) configured to obtain a screenshot of a sample user interface belonging to the global keyboard exception type and a global test result corresponding to the sample user interface belonging to the global keyboard exception type as a second global negative sample; a global blank test model training module (not shown in the figure) configured to train a preset first convolutional neural network based on the global positive sample and the first global negative sample by using a deep learning method to obtain a global blank test model; and a global keyboard anomaly test model training module (not shown in the figure) configured to train a preset second convolutional neural network based on the global positive sample and the second global negative sample by using a deep learning method to obtain a global keyboard anomaly test model.
In some optional implementations of this embodiment, the apparatus 600 for testing a user interface may further include a global sample generation unit (not shown in the figure), and the global sample generation unit may include: a first global negative sample generation module (not shown in the figure) configured to overlay a first preset map on the screenshot of the sample user interface belonging to the global normal type, and generate the screenshot of the sample user interface belonging to the global blank type; and a second global negative example generating module (not shown in the figure) configured to overlay the preset keyboard exception map on the screenshot of the sample user interface belonging to the global normal type, and generate the screenshot of the sample user interface belonging to the global keyboard exception type.
In some optional implementations of this embodiment, the global sample generation unit may further include: a first graphic heterogeneous processing module (not shown in the figure) configured to perform graphic heterogeneous processing on the screenshot of the sample user interface belonging to the global normal type, the screenshot of the sample user interface belonging to the global blank type, and the screenshot of the sample user interface belonging to the global keyboard exception type, and generate a plurality of global positive samples, a plurality of first global negative samples, and a plurality of second global negative samples.
In some optional implementations of this embodiment, the apparatus 600 for testing a user interface may further include a local test model training unit (not shown in the figure), and the local test model training unit may include: a local positive sample obtaining module (not shown in the figure) configured to obtain a sample local area belonging to the character normal type and a local test result corresponding to the sample local area belonging to the character normal type as a local positive sample; a first local negative sample acquisition module (not shown in the figure) configured to acquire a sample local area belonging to the character overlap type and a local test result corresponding to the sample local area belonging to the character overlap type as a first local negative sample; a second local negative sample obtaining module (not shown in the figure) configured to obtain a sample local area belonging to the character background transition abnormal type and a local test result corresponding to the sample local area belonging to the character background transition abnormal type as a second local negative sample; a third local negative sample obtaining module (not shown in the figure) configured to obtain a sample local area belonging to the character edge occlusion type and a local test result corresponding to the sample local area belonging to the character edge occlusion type as a third local negative sample; a character overlap test model training module (not shown in the figure) configured to train a preset third convolutional neural network based on the local positive sample and the first local negative sample by using a deep learning method to obtain a character overlap test model; a character background transition anomaly test model training module (not shown in the figure) configured to train a preset fourth convolutional neural network based on the local positive sample and the second local negative sample by using a deep learning method to obtain a character background transition anomaly test model; and a character edge occlusion test model training module (not shown in the figure) configured to train a preset fifth convolutional neural network based on the local positive sample and the third local negative sample by using a deep learning method to obtain a character edge occlusion test model.
In some optional implementations of this embodiment, the apparatus 600 for testing a user interface may further include a local sample generation unit (not shown in the figure), and the local sample generation unit may include: a first local negative sample generation module (not shown in the figure) configured to overlay a preset character map on a sample local area belonging to the character normal type and generate a sample local area belonging to the character overlap type; a second local negative sample generation module (not shown in the figure) configured to overlay a preset layer on a sample local area belonging to the character normal type and generate a sample local area belonging to the character background transition abnormal type; and a third local negative sample generation module (not shown in the figure) configured to overlay a second preset map on an edge of a sample local area belonging to the character normal type and generate a sample local area belonging to the character edge occlusion type.
In some optional implementations of this embodiment, the local sample generation unit may further include: a second graphic heterogeneous processing module (not shown in the figure) configured to perform graphic heterogeneous processing on the sample local area belonging to the character normal type, the sample local area belonging to the character overlap type, the sample local area belonging to the character background transition abnormal type, and the sample local area belonging to the character edge occlusion type, so as to generate a plurality of local positive samples, a plurality of first local negative samples, a plurality of second local negative samples, and a plurality of third local negative samples.
Referring now to FIG. 7, shown is a block diagram of a computer system 700 suitable for use in implementing the electronic device of an embodiment of the present application. The electronic device shown in fig. 7 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 7, the computer system 700 includes a Central Processing Unit (CPU) 701, which can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 702 or a program loaded from a storage section 708 into a Random Access Memory (RAM) 703. In the RAM 703, various programs and data necessary for the operation of the system 700 are also stored. The CPU 701, the ROM 702, and the RAM 703 are connected to each other via a bus 704. An input/output (I/O) interface 705 is also connected to the bus 704.
The following components are connected to the I/O interface 705: an input portion 706 including a keyboard, a mouse, and the like; an output section 707 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section 708 including a hard disk and the like; and a communication section 709 including a network interface card such as a LAN card, a modem, or the like. The communication section 709 performs communication processing via a network such as the internet. A drive 710 is also connected to the I/O interface 705 as needed. A removable medium 711 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 710 as necessary, so that a computer program read out therefrom is mounted into the storage section 708 as necessary.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program can be downloaded and installed from a network through the communication section 709, and/or installed from the removable medium 711. The computer program, when executed by the Central Processing Unit (CPU) 701, performs the above-described functions defined in the method of the present application. It should be noted that the computer readable medium described herein can be a computer readable signal medium or a computer readable storage medium, or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. A computer readable signal medium, by contrast, may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, or C++, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, which may be described as: a processor including an acquisition unit, a determination unit, and a test unit. The names of these units do not in some cases constitute a limitation on the units themselves; for example, the acquisition unit may also be described as "a unit for acquiring a screenshot of a user interface to be tested".
As another aspect, the present application also provides a computer-readable medium, which may be contained in the electronic device described in the above embodiments; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquiring a screenshot of a user interface to be tested; determining whether the screenshot of the user interface to be tested meets a preset condition; in response to the fact that the preset conditions are met, inputting the screenshot of the user interface to be tested into the global test model to obtain a global test result corresponding to the user interface to be tested; and segmenting at least one local area from the screenshot of the user interface to be tested, inputting the segmented at least one local area into a pre-trained local test model, and obtaining a local test result corresponding to the user interface to be tested, wherein each local area comprises a character string.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (15)

1. A method for testing a user interface, comprising:
acquiring a screenshot of a user interface to be tested;
determining whether the screenshot of the user interface to be tested meets a preset condition;
in response to the fact that the preset condition is met, inputting the screenshot of the user interface to be tested into a global test model to obtain a global test result corresponding to the user interface to be tested; dividing at least one local area from the screenshot of the user interface to be tested, inputting the divided at least one local area into a pre-trained local test model, and obtaining a local test result corresponding to the user interface to be tested, wherein each local area comprises a character string;
wherein, the global test model is obtained by training the following steps:
acquiring a screenshot of a sample user interface belonging to a global normal type and a global test result corresponding to the sample user interface belonging to the global normal type as a global positive sample;
acquiring a screenshot of a sample user interface belonging to the global blank type and a global test result corresponding to the sample user interface belonging to the global blank type, and taking the screenshot and the global test result as a first global negative sample;
acquiring a screenshot of a sample user interface belonging to the global keyboard abnormal type and a global test result corresponding to the sample user interface belonging to the global keyboard abnormal type, and taking the screenshot and the global test result as a second global negative sample;
training a preset first convolution neural network based on the global positive sample and the first global negative sample by using a deep learning method to obtain a global blank test model;
and training a preset second convolutional neural network based on the global positive sample and the second global negative sample by using a deep learning method to obtain a global keyboard anomaly test model.
2. The method of claim 1, wherein the determining whether the screenshot of the user interface to be tested satisfies a preset condition comprises:
performing at least one of the following operations:
identifying at least one character string in the screenshot of the user interface to be tested by utilizing an Optical Character Recognition (OCR) technology, and matching the identified at least one character string in a preset error character string set;
determining the color type included in the screenshot of the user interface to be tested by using a color model;
determining whether the screenshot of the user interface to be tested meets a preset condition or not based on the result of at least one operation, wherein the preset condition comprises at least one of the following conditions: the character strings are unsuccessfully matched in the preset error character string set, and the number of the color types is larger than the preset number.
3. The method of claim 1, wherein said segmenting at least one local region from a screenshot of the user interface to be tested comprises:
acquiring a Document Object Model (DOM) of the user interface to be tested, wherein the DOM of the user interface to be tested comprises position information of at least one first local area, and each first local area comprises a character string;
and according to the indication of the position information of the at least one first local area, segmenting the at least one first local area from the screenshot of the user interface to be tested to serve as the at least one local area.
4. The method of claim 3, wherein the DOM of the user interface to be tested further comprises position information of at least one second local area, wherein each second local area comprises a plurality of character strings; and
the step of segmenting at least one local area from the screenshot of the user interface to be tested further comprises the following steps:
segmenting the at least one second local area from the screenshot of the user interface to be tested according to the indication of the position information of the at least one second local area;
for each second local area in the at least one second local area, a plurality of character strings in the second local area are identified by using an OCR technology, and the area where each character string in the identified character strings is located is divided from the second local area to serve as a plurality of local areas.
5. The method of claim 1, wherein the global test result includes a global type and location information of a region corresponding to the global type, and the local test result includes a local type and location information of a region corresponding to the local type.
6. The method of claim 5, wherein the global type comprises at least one of: a global normal type, a global blank type and a global keyboard exception type, and the local type comprises at least one of the following types: a character normal type, a character overlapping type, a character background transition abnormal type and a character edge shielding type.
7. The method of claim 6, wherein the global test model comprises at least one of: a global blank test model and a global keyboard anomaly test model, and the local test model comprises at least one of: a character overlapping test model, a character background transition anomaly test model and a character edge shielding test model.
8. The method of claim 1, wherein the global negative examples are generated by:
covering the first preset map on the screenshot of the sample user interface belonging to the global normal type, and generating the screenshot of the sample user interface belonging to the global blank type;
and covering the preset keyboard abnormal map on the screenshot of the sample user interface belonging to the global normal type, and generating the screenshot of the sample user interface belonging to the global keyboard abnormal type.
9. The method of claim 8, wherein the method further comprises:
and carrying out graphic heterogeneous processing on the screenshot of the sample user interface belonging to the global normal type, the screenshot of the sample user interface belonging to the global blank type and the screenshot of the sample user interface belonging to the global keyboard abnormal type to generate a plurality of global positive samples, a plurality of first global negative samples and a plurality of second global negative samples.
10. The method of claim 7, wherein the local test model is trained by:
acquiring a sample local area belonging to the character normal type and a local test result corresponding to the sample local area belonging to the character normal type as a local positive sample;
acquiring a sample local area belonging to a character overlapping type and a local test result corresponding to the sample local area belonging to the character overlapping type as a first local negative sample;
acquiring a sample local area belonging to the character background transition abnormal type and a local test result corresponding to the sample local area belonging to the character background transition abnormal type, and taking the local test result as a second local negative sample;
acquiring a sample local area belonging to a character edge shielding type and a local test result corresponding to the sample local area belonging to the character edge shielding type, and taking the local test result as a third local negative sample;
training a preset third convolutional neural network based on the local positive sample and the first local negative sample by using a deep learning method to obtain a character overlapping test model;
training a preset fourth convolutional neural network based on the local positive sample and the second local negative sample by using a deep learning method to obtain a character background transition anomaly test model;
and training a preset fifth convolutional neural network based on the local positive sample and the third local negative sample by using a deep learning method to obtain a character edge shielding test model.
11. The method of claim 10, wherein the local negative examples are generated by:
covering a preset character map on a sample local area belonging to a normal type of characters to generate a sample local area belonging to an overlapped type of characters;
covering a preset layer on a sample local area belonging to a normal type of a character to generate a sample local area belonging to an abnormal type of a background transition of the character;
and covering the second preset map on the edge of the sample local area belonging to the normal type of the character to generate the sample local area belonging to the character edge shielding type.
12. The method of claim 11, wherein the method further comprises:
and carrying out graphic heterogeneous processing on a sample local area belonging to a character normal type, a sample local area belonging to a character overlapping type, a sample local area belonging to a character background transition abnormal type and a sample local area belonging to a character edge shielding type to generate a plurality of local positive samples, a plurality of first local negative samples, a plurality of second local negative samples and a plurality of third local negative samples.
13. An apparatus for testing a user interface, comprising:
the acquisition unit is configured to acquire a screenshot of a user interface to be tested;
the determining unit is configured to determine whether the screenshot of the user interface to be tested meets a preset condition;
the test unit is configured to respond to the fact that the preset condition is met, input the screenshot of the user interface to be tested into a global test model, and obtain a global test result corresponding to the user interface to be tested; dividing at least one local area from the screenshot of the user interface to be tested, inputting the divided at least one local area into a pre-trained local test model, and obtaining a local test result corresponding to the user interface to be tested, wherein each local area comprises a character string;
wherein the apparatus further includes a global test model training unit, and the global test model training unit includes:
the global positive sample acquisition module is configured to acquire a screenshot of a sample user interface belonging to a global normal type and a global test result corresponding to the sample user interface belonging to the global normal type as a global positive sample;
the first global negative sample acquisition module is configured to acquire a screenshot of a sample user interface belonging to a global blank type and a global test result corresponding to the sample user interface belonging to the global blank type as a first global negative sample;
the second global negative sample acquisition module is configured to acquire a screenshot of a sample user interface belonging to the global keyboard exception type and a global test result corresponding to the sample user interface belonging to the global keyboard exception type as a second global negative sample;
the global blank test model training module is configured and used for training a preset first convolution neural network based on the global positive sample and the first global negative sample by using a deep learning method to obtain a global blank test model;
and the global keyboard anomaly test model training module is configured and used for training a preset second convolutional neural network based on the global positive sample and the second global negative sample by utilizing a deep learning method to obtain a global keyboard anomaly test model.
14. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-12.
15. A computer-readable medium, on which a computer program is stored, wherein the computer program, when being executed by a processor, carries out the method according to any one of claims 1-12.
CN201810129865.6A 2018-02-08 2018-02-08 Method and apparatus for testing user interface Active CN108229485B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810129865.6A CN108229485B (en) 2018-02-08 2018-02-08 Method and apparatus for testing user interface

Publications (2)

Publication Number Publication Date
CN108229485A (en) 2018-06-29
CN108229485B (en) 2022-05-17

Family

ID=62669563

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810129865.6A Active CN108229485B (en) 2018-02-08 2018-02-08 Method and apparatus for testing user interface

Country Status (1)

Country Link
CN (1) CN108229485B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109359056B (en) * 2018-12-21 2022-11-11 北京搜狗科技发展有限公司 Application program testing method and device
CN110008110B (en) * 2019-01-28 2022-07-22 创新先进技术有限公司 User interface testing method and device
CN110059596B (en) * 2019-04-03 2020-07-07 北京字节跳动网络技术有限公司 Image identification method, device, medium and electronic equipment
CN110245681A (en) * 2019-05-10 2019-09-17 北京奇艺世纪科技有限公司 Model generating method, application interface method for detecting abnormality, device, terminal device and computer readable storage medium
CN110309073B (en) * 2019-06-28 2021-07-27 上海交通大学 Method, system and terminal for automatically detecting user interface errors of mobile application program
CN110851349B (en) * 2019-10-10 2023-12-26 岳阳礼一科技股份有限公司 Page abnormity display detection method, terminal equipment and storage medium
CN111078552A (en) * 2019-12-16 2020-04-28 腾讯科技(深圳)有限公司 Method and device for detecting page display abnormity and storage medium
CN111198815B (en) * 2019-12-24 2023-11-03 中移(杭州)信息技术有限公司 Compatibility testing method and device for user interface
CN111259843B (en) * 2020-01-21 2021-09-03 敬科(深圳)机器人科技有限公司 Multimedia navigator testing method based on visual stability feature classification registration
CN116662206B (en) * 2023-07-24 2024-02-13 泰山学院 Computer software online real-time visual debugging method and device
CN116662211B (en) * 2023-07-31 2023-11-03 四川弘和数智集团有限公司 Display interface testing method, device, equipment and medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105573747A (en) * 2015-12-10 2016-05-11 小米科技有限责任公司 User interface test method and apparatus
CN107622016A (en) * 2017-09-25 2018-01-23 无线生活(杭州)信息科技有限公司 A kind of page method of testing and device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8627295B2 (en) * 2009-10-09 2014-01-07 General Electric Company Methods and apparatus for testing user interfaces
US20110214107A1 (en) * 2010-03-01 2011-09-01 Experitest, Ltd. Method and system for testing graphical user interfaces
CN107229560A (en) * 2016-03-23 2017-10-03 阿里巴巴集团控股有限公司 A kind of interface display effect testing method, image specimen page acquisition methods and device
CN106874926A (en) * 2016-08-04 2017-06-20 阿里巴巴集团控股有限公司 Service exception detection method and device based on characteristics of image
CN106502891B (en) * 2016-10-19 2019-05-17 广州视源电子科技股份有限公司 The automatic testing method and device of user interface
CN107025174B (en) * 2017-04-06 2020-01-10 网易(杭州)网络有限公司 Method, device and readable storage medium for user interface anomaly test of equipment
CN107506300B (en) * 2017-08-09 2020-10-13 百度在线网络技术(北京)有限公司 User interface testing method, device, server and storage medium

Also Published As

Publication number Publication date
CN108229485A (en) 2018-06-29

Similar Documents

Publication Publication Date Title
CN108229485B (en) Method and apparatus for testing user interface
US11244208B2 (en) Two-dimensional document processing
CN109948507B (en) Method and device for detecting table
US11941529B2 (en) Method and apparatus for processing mouth image
KR102002024B1 (en) Method for processing labeling of object and object management server
CN110222694B (en) Image processing method, image processing device, electronic equipment and computer readable medium
CN110717919A (en) Image processing method, device, medium and computing equipment
US20220036068A1 (en) Method and apparatus for recognizing image, electronic device and storage medium
US20230008696A1 (en) Method for incrementing sample image
CN114663904A (en) PDF document layout detection method, device, equipment and medium
CN112750162A (en) Target identification positioning method and device
CN108665769B (en) Network teaching method and device based on convolutional neural network
CN111738252A (en) Method and device for detecting text lines in image and computer system
CN113923474A (en) Video frame processing method and device, electronic equipment and storage medium
CN111414889B (en) Financial statement identification method and device based on character identification
US9672299B2 (en) Visualization credibility score
US9881210B2 (en) Generating a computer executable chart visualization by annotating a static image
CN115631374A (en) Control operation method, control detection model training method, device and equipment
CN115758005A (en) Large-screen image processing method, device and medium
CN115376137A (en) Optical character recognition processing and text recognition model training method and device
CN112232390B (en) High-pixel large image identification method and system
CN113468066A (en) User interface testing method and device
CN114445751A (en) Method and device for extracting video key frame image contour features
CN115050086B (en) Sample image generation method, model training method, image processing method and device
CN113886745B (en) Page picture testing method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant