WO2015039585A1 - Method and device for testing software reliability - Google Patents

Method and device for testing software reliability

Info

Publication number
WO2015039585A1
Authority
WO
WIPO (PCT)
Prior art keywords
testing
interface
client device
image
server
Prior art date
Application number
PCT/CN2014/086389
Other languages
French (fr)
Inventor
Ying Wu
Linghong LI
Original Assignee
Tencent Technology (Shenzhen) Company Limited
Priority date
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited filed Critical Tencent Technology (Shenzhen) Company Limited
Publication of WO2015039585A1 publication Critical patent/WO2015039585A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites

Definitions

  • the disclosed implementations relate generally to the field of testing technologies, and, in particular, to testing method and system for testing software reliability.
  • the conventional scheme for testing the reliability of software on mobile phones performs fully random clicks, dragging gestures, and/or the like on interfaces of the software over a long time period (e.g., with an MTTF random key test tool), counting the number of crashes and/or abnormalities with the software, so as to evaluate the reliability and stability of the software.
  • a method of testing software reliability is performed at a server (e.g. , server system 108, Figures 1-2) with one or more processors and memory.
  • the method includes obtaining, from a client device (e.g. , client device 104, Figures 1 and 3) , a first image of a first interface of a software program executed on the client device.
  • the method includes performing edge detection on the first image to obtain edge information for edges of the first interface in the first image.
  • the method includes selecting one or more testing locations in the first image based on the edge information and sending a testing request to the client device to perform one or more predefined operations at each of the one or more selected testing locations.
  • the method includes obtaining, from the client device, test results for the one or more predefined operations performed at each of the one or more selected testing locations.
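The method summarized above can be sketched as a single server-side cycle. This is an illustrative sketch only, not the patent's implementation: the function names, the dictionary-shaped testing request, and the toy 1-bit "screenshot" (nonzero cells standing in for edge pixels) are all assumptions made for the example.

```python
# Illustrative sketch of the server-side test cycle: obtain image,
# detect edges, select testing locations, build a testing request.
# All names and data shapes are hypothetical, not from the patent.

def detect_edges(image):
    # Placeholder: a real implementation would run e.g. Canny edge
    # detection; here nonzero cells of the toy grid count as edges.
    return [(x, y) for y, row in enumerate(image)
            for x, px in enumerate(row) if px]

def pick_locations(edges, n):
    # Placeholder: the patent selects points from the edge information,
    # e.g. at random; here we deterministically take every k-th point.
    step = max(1, len(edges) // n)
    return edges[::step][:n]

def build_testing_request(program_id, locations, op="tap"):
    # One predefined operation per selected testing location.
    return {"program": program_id,
            "tests": [{"op": op, "pos": loc} for loc in locations]}

# Toy 1-bit "screenshot": nonzero pixels stand in for UI edges.
image = [[0, 1, 0],
         [1, 0, 1],
         [0, 1, 0]]
edges = detect_edges(image)
request = build_testing_request("app-1", pick_locations(edges, 2))
```

The client would execute the request and return screenshots as test results, closing the loop described in the method.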
  • a computing device (e.g. , server system 108, Figures 1-2, client device 104, Figures 1 and 3, or a combination thereof) includes one or more processors and memory storing one or more programs for execution by the one or more processors, the one or more programs include instructions for performing, or controlling performance of, the operations of any of the methods described herein.
  • a non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which, when executed by a computing device (e.g., server system 108, Figures 1-2, client device 104, Figures 1 and 3, or a combination thereof) with one or more processors, cause the computing device to perform, or control performance of, the operations of any of the methods described herein.
  • a computing device includes means for performing, or controlling performance of, the operations of any of the methods described herein.
  • FIG. 1 is a block diagram of a server-client environment in accordance with some embodiments.
  • FIG. 2 is a block diagram of a server system in accordance with some embodiments.
  • Figure 3 is a block diagram of a client device in accordance with some embodiments.
  • Figure 4 is a block diagram of a test history database in accordance with some embodiments.
  • Figure 5 is a block diagram of a portion of an interface hierarchy library in accordance with some embodiments.
  • Figure 6A illustrates a screenshot of an interface for a software program in accordance with some embodiments.
  • Figure 6B illustrates a visual representation of edge information for the interface in Figure 6A in accordance with some embodiments.
  • Figures 7A-7B illustrate a flowchart diagram of a method of testing software reliability in accordance with some embodiments.
  • Figures 8A-8C illustrate a flowchart diagram of a method of testing software reliability in accordance with some embodiments.
  • Figure 9 is a block diagram of an apparatus for testing software reliability in accordance with some embodiments.
  • server-client environment 100 includes client-side processing 102-1, 102-2 (hereinafter “client-side modules 102” ) executed on a client device 104-1, 104-2, and server-side processing 106 (hereinafter “server-side module 106” ) executed on a server system 108.
  • client-side module 102 communicates with server-side module 106 through one or more networks 110.
  • Client-side module 102 provides client-side functionalities for the software testing application (e.g. , performing tests, screenshot invocation, test result transmission, etc. ) and communications with server-side module 106.
  • client-side module 102 is configured to execute as a background process concurrently with and independent of the control and operations of other applications executed in the foreground (e.g. , application (s) 326, Figure 3) , in order to capture screenshots of interfaces presented by client device 104 for the other applications.
  • the client-side module 102 is also able to perform tests with the other applications and invoke screen capture functions of the operating system, and/or other functionalities related to software testing.
  • Server-side module 106 provides server-side functionalities for the software testing application (e.g. , performing edge detection, selecting testing locations, analyzing test results, etc. ) for any number of client modules 102 each residing on a respective client device 104.
  • server-side module 106 includes one or more processors 112, test history database 114, interface hierarchy library 116, an I/O interface to one or more clients 118, and an I/O interface to one or more external services 120.
  • I/O interface to one or more clients 118 facilitates the client-facing input and output processing for server-side module 106.
  • processor (s) 112 perform edge detection on an image of an interface for a software program (i.e. , an application) to obtain edge information, select testing locations at which to perform test operations on the interface based on the edge information, send a testing request to the client device 104 including the testing locations, and analyze test results.
  • Test history database 114 stores test results and corresponding testing locations
  • interface hierarchy library 116 stores a hierarchy of interfaces for software programs that have been tested by the software testing application.
  • I/O interface to one or more external services 120 facilitates communications with one or more external services 122 (e.g. , web servers or cloud-based service providers such as video and/or image hosting and storage websites) .
  • client device 104 examples include, but are not limited to, a handheld computer, a wearable computing device, a personal digital assistant (PDA) , a tablet computer, a laptop computer, a desktop computer, a cellular telephone, a smart phone, an enhanced general packet radio service (EGPRS) mobile phone, a media player, a navigation device, a game console, a television, a remote control, or a combination of any two or more of these data processing devices or other data processing devices.
  • Examples of one or more networks 110 include local area networks (LAN) and wide area networks (WAN) such as the Internet.
  • One or more networks 110 are, optionally, implemented using any known network protocol, including various wired or wireless protocols, such as Ethernet, Universal Serial Bus (USB) , FIREWIRE, Long Term Evolution (LTE) , Global System for Mobile Communications (GSM) , Enhanced Data GSM Environment (EDGE) , code division multiple access (CDMA) , time division multiple access (TDMA) , Bluetooth, Wi-Fi, voice over Internet Protocol (VoIP) , Wi-MAX, or any other suitable communication protocol.
  • Server system 108 is implemented on one or more standalone data processing apparatuses or a distributed network of computers.
  • server system 108 also employs various virtual devices and/or services of third party service providers (e.g., third-party cloud service providers) to provide the underlying computing resources and/or infrastructure resources of server system 108.
  • examples of server system 108 include, but are not limited to, a handheld computer, a tablet computer, a laptop computer, a desktop computer, or a combination of any two or more of these data processing devices or other data processing devices.
  • Server-client environment 100 shown in Figure 1 includes both a client-side portion (e.g. , client-side module 102) and a server-side portion (e.g. , server-side module 106) .
  • data processing is implemented as a standalone application installed on client device 104.
  • client-side module 102 is a thin-client that provides only user-facing input and output processing functions, and delegates all other data processing functionalities to a backend server (e.g. , server system 108) .
  • FIG. 2 is a block diagram illustrating server system 108 in accordance with some embodiments.
  • Server system 108 typically, includes one or more processing units (CPUs) 112, one or more network interfaces 204 (e.g. , including I/O interface to one or more clients 118 and I/O interface to one or more external services 120) , memory 206, and one or more communication buses 208 for interconnecting these components (sometimes called a chipset) .
  • Memory 206 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and, optionally, includes non-volatile memory, such as one or more magnetic disk storage devices, one or more optical disk storage devices, one or more flash memory devices, or one or more other non-volatile solid state storage devices. Memory 206, optionally, includes one or more storage devices remotely located from one or more processing units 112. Memory 206, or alternatively the non-volatile memory within memory 206, includes a non-transitory computer readable storage medium. In some implementations, memory 206, or the non-transitory computer readable storage medium of memory 206, stores the following programs, modules, and data structures, or a subset or superset thereof:
  • ⁇ operating system 210 including procedures for handling various basic system services and for performing hardware dependent tasks
  • ⁇ network communication module 212 for connecting server system 108 to other computing devices (e.g. , client devices 104 and external service (s) 122) connected to one or more networks 110 via one or more network interfaces 204 (wired or wireless) ;
  • ⁇ server-side module 106 which provides server-side data processing and functionalities for the software testing application, including but not limited to:
  • ⁇ obtaining module 222 for obtaining, from a client device 104, a first image of a first interface of a software program executed on the client device 104 and for obtaining test results from the client device 104;
  • ⁇ edge detection module 224 for performing edge detection on the first image to obtain edge information for edges of the first interface in the first image
  • ⁇ identifying module 226 for identifying clusters and/or predefined shapes in the first image based on the edge information
  • ⁇ selecting module 228 for selecting one or more testing locations in the first image based on the edge information
  • ⁇ position information module 229 for identifying position information for the one or more testing locations selected by selecting module 228;
  • ⁇ requesting module 230 for sending a testing request to the client device 104 to perform one or more predefined operations (sometimes also herein called “test operations” or “testing operations” ) at each of the one or more selected testing locations;
  • ⁇ analyzing module 232 for analyzing the test results received from the client device 104;
  • ⁇ storing module 234 for storing the test results and the corresponding testing locations in test history database 114;
  • ⁇ interface mapping module 236 for generating a hierarchy of the interfaces for the software program executed on the client device 104 and storing the hierarchy in interface hierarchy library 116;
  • ⁇ server data 240 storing data for the software testing application, including but not limited to:
  • test history database 114 storing test results and corresponding testing locations
  • interface hierarchy library 116 storing a hierarchy of interfaces for software programs that have been tested by the software testing application.
  • Each of the above identified elements may be stored in one or more of the previously mentioned memory devices, and corresponds to a set of instructions for performing a function described above.
  • the above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules may be combined or otherwise rearranged in various implementations.
  • memory 206 optionally, stores a subset of the modules and data structures identified above.
  • memory 206 optionally, stores additional modules and data structures not described above.
  • FIG. 3 is a block diagram illustrating a representative client device 104 associated with a user in accordance with some embodiments.
  • Client device 104 typically, includes one or more processing units (CPUs) 302, one or more network interfaces 304, memory 306, and one or more communication buses 308 for interconnecting these components (sometimes called a chipset) .
  • Client device 104 also includes a user interface 310.
  • User interface 310 includes one or more output devices 312 that enable presentation of media content, including one or more speakers and/or one or more visual displays.
  • User interface 310 also includes one or more input devices 314, including user interface components that facilitate user input such as a keyboard, a mouse, a voice-command input unit or microphone, a touch screen display, a touch-sensitive input pad, a gesture capturing camera, or other input buttons or controls. Furthermore, some client devices 104 use a microphone and voice recognition or a camera and gesture recognition to supplement or replace the keyboard.
  • Memory 306 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and, optionally, includes non-volatile memory, such as one or more magnetic disk storage devices, one or more optical disk storage devices, one or more flash memory devices, or one or more other non-volatile solid state storage devices.
  • Memory 306, optionally, includes one or more storage devices remotely located from one or more processing units 302.
  • Memory 306, or alternatively the non-volatile memory within memory 306, includes a non-transitory computer readable storage medium.
  • memory 306, or the non-transitory computer readable storage medium of memory 306, stores the following programs, modules, and data structures, or a subset or superset thereof:
  • ⁇ operating system 316 including procedures for handling various basic system services and for performing hardware dependent tasks
  • ⁇ network communication module 318 for connecting client device 104 to other computing devices (e.g. , server system 108) connected to one or more networks 110 via one or more network interfaces 304 (wired or wireless) ;
  • ⁇ presentation module 320 for enabling presentation of information (e.g., a user interface for application (s) 326, widgets, websites and web pages thereof, and/or games, audio and/or video content, text, etc. ) at client device 104 via one or more output devices 312 (e.g. , displays, speakers, etc. ) associated with user interface 310;
  • ⁇ input processing module 322 for detecting one or more user inputs or interactions from one of the one or more input devices 314 and interpreting the detected input or interaction;
  • ⁇ web browser module 324 for navigating, requesting (e.g. , via HTTP) , and displaying websites and web pages thereof;
  • applications 326 for execution by client device 104 (e.g., games, application marketplaces, payment platforms, and/or other web or non-web based applications) ;
  • ⁇ screen capture module 328 for performing a screen shot function of operating system 316 so as to capture images of user interface for application (s) 326 executed in the foreground;
  • ⁇ client-side module 102 which provides client-side data processing and functionalities for the software testing application, including but not limited to:
  • ⁇ request handling module 332 for receiving and handling testing requests from server system 108;
  • ⁇ performing module 334 for performing the one or more predefined operations at each of the one or more selected testing locations within the first interface according to the testing request;
  • ⁇ screenshot invoking module 336 for invoking screen capture module 328 to capture one or more screenshots of the resulting interface from the one or more predefined operations at each of the one or more selected testing locations within the first interface;
  • ⁇ sending module 338 for sending to server system 108 test results (e.g., screenshots) for the one or more predefined operations performed at each of the one or more selected testing locations;
  • ⁇ client data 350 storing data associated with the software testing application, including, but not limited to:
  • ⁇ test instructions store 352 for buffering and optionally storing the one or more predefined operations to be performed at the one or more selected testing locations within the first interface according to the testing request;
  • test results store 354 for buffering and optionally storing the test results for the one or more predefined operations performed at each of the one or more selected testing locations.
  • Each of the above identified elements may be stored in one or more of the previously mentioned memory devices, and corresponds to a set of instructions for performing a function described above.
  • the above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules may be combined or otherwise rearranged in various implementations.
  • memory 306 optionally, stores a subset of the modules and data structures identified above.
  • memory 306, optionally, stores additional modules and data structures not described above.
  • At least some of the functions of the client-side module 102 are performed by server-side module 106, and the corresponding sub-modules of these functions may be located within the server-side module 106 rather than the client-side module 102. In some embodiments, at least some of the functions of the server-side module 106 are performed by client-side module 102, and the corresponding sub-modules of these functions may be located within the client-side module 102 rather than the server-side module 106. For example, in some embodiments, edge detection module 224, analyzing module 232, storing module 234, and interface mapping module 236 may be implemented at least in part on client-side module 102. Server system 108 and client device 104 shown in Figures 2-3, respectively, are merely illustrative, and different configurations of the modules for implementing the functions described herein are possible in various embodiments.
  • FIG. 4 is a block diagram of a test history database 114 in accordance with some embodiments.
  • server system 108 manages and operates a software testing application that tests the reliability of software programs such as applications and/or websites. For example, the software testing application runs automated tests to seek out bugs, errors, malfunctions, and/or abnormalities with software programs.
  • server system 108 maintains test history database 114 so as to record previously run tests and their results.
  • test history database 114 is stored local to server system 108.
  • test history database 114 is stored remotely from server system 108 (e.g. , by one of external services 122) ; however, server system 108 has access to test history database 114.
  • test history database 114 includes entries for each of a plurality of test sequences.
  • each of the test sequences corresponds to a testing request sent to a client device 104.
  • a test sequence includes a single test with a predefined operation to be performed at a specific testing location within an interface of a software program.
  • a test sequence includes two or more tests with one or more predefined operations to be performed at each of two or more testing locations within the interface of the software program.
  • an entry for a respective test sequence in test history database 114 includes: (a) a test sequence identifier 402-A (e.g. , a unique number) for the respective test sequence; (b) an image 404-A corresponding to a first interface of a software program executed on the client device 104 (e.g. , a screenshot of a home interface for an application) ; (c) edge information 406-A corresponding to the result of an edge detection algorithm run on image 404-A; (d) website/application 408-A including the name of or identifier for the software program corresponding to image 404-A; and (e) test results 410-A for the respective test sequence run by the client device 104 on the software program corresponding to image 404-A.
  • test results 410-A are linked to a plurality of sub-entries for each of the tests corresponding to test sequence identifier 402-A performed by the client device 104.
  • a respective sub-entry for a first test in the respective test sequence corresponding to test sequence identifier 402-A includes: (i) a test identifier 420-A (e.g., a unique number) corresponding to the first test; (ii) operation 422-A corresponding to the predefined operation for the first test; (iii) position information 424-A corresponding to the testing location for the first test; (iv) test result 426-A corresponding to the result of the first test (e.g., a screenshot of the interface displayed by the software program after performing operation 422-A at a location corresponding to position information 424-A); and (v) analysis 428-A corresponding to a determination by server system 108 whether the first interface of the software program includes an error or abnormality based on the first test.
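The entry and sub-entry layout of test history database 114 can be mirrored with two plain record types. The field names below are illustrative Python equivalents of the patent's reference numerals (402-A, 404-A, 420-A, etc.), not the actual schema.

```python
# Sketch of one test-history entry and its per-test sub-entries,
# mirroring the fields described for test history database 114.
# Names and types are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TestRecord:
    test_id: str                          # cf. test identifier 420-A
    operation: str                        # predefined operation, e.g. "tap"
    position: tuple                       # (x, y) testing location
    result_image: Optional[bytes] = None  # screenshot after the operation
    analysis: Optional[str] = None        # error/abnormality determination

@dataclass
class TestSequenceEntry:
    sequence_id: str                      # cf. test sequence identifier 402-A
    image: bytes                          # screenshot of the first interface
    edge_info: bytes                      # edge detection output
    program: str                          # software program name/identifier
    tests: List[TestRecord] = field(default_factory=list)

entry = TestSequenceEntry("402-A", b"...", b"...", "example-app")
entry.tests.append(TestRecord("420-A", "tap", (120, 240)))
```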
  • Figure 5 is a block diagram of a portion of an interface hierarchy library 116 in accordance with some embodiments.
  • Figure 5 shows an interface hierarchy for a respective software program tested by the software testing application.
  • the software testing application is managed and operated by server system 108, and the software testing application includes a server-side portion (e.g. , server-side module 106, Figures 1-2) and a client-side portion (e.g. , client-side module 102, Figures 1 and 3) .
  • server-side module 106 sends testing requests to client device 104 to perform tests on software programs executed by the client device 104 with client-side module 102.
  • a testing entity sends an instruction via server-side module 106 to client-side module 102 to execute the respective software program (e.g. , one of application (s) 326, Figure 3) and also to capture and return a screenshot of the home interface for the respective software program.
  • a testing entity sends the screenshot of the home interface for the respective software program to server-side module 106 via client-side module 102.
  • interface 502 is the screenshot of the home interface for the respective software application.
  • server-side module 106 performs edge detection on the screenshot of interface 502 to obtain edge information corresponding to the edges of interface 502. Based on the edge information, server-side module 106 selects one or more testing locations in interface 502 and also determines one or more predefined operations (sometimes also herein called “test operations” ) to be performed at each of the one or more selected testing locations. Subsequently, server-side module 106 sends a testing request to client-side module 102 to perform the one or more determined operations at the corresponding one or more selected testing locations.
  • the testing request is a test sequence with one or more tests, where each test includes a predefined operation and a corresponding testing location.
  • interface 502 corresponds to image 404-A ( Figure 4)
  • the testing request corresponds to test sequence identifier 402-A ( Figure 4)
  • each of the tests in the testing request corresponds to a test identifier 420 ( Figure 4) .
  • client-side module 102 performs each of the tests included in the testing request. For example, for a respective test in the testing request, client-side module 102 captures a screenshot of the interface displayed by the respective software program in response to performing a respective predefined operation at a respective testing location. After performing the tests included in the testing request, client-side module 102 sends the testing results (i.e. , subsequent images or screenshots of the interface displayed by the respective software program in response to performing the tests included in the testing request) to server-side module 106 in serial order or as a batch. For example, the received test results are associated with a corresponding test identifier and stored in test history database (e.g. , a first test in the testing request is associated with test identifier 420-A ( Figure 4) and its corresponding test result is associated with test result 426-A ( Figure 4)) .
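The client-side handling just described (perform each operation, capture a screenshot, return the results as a batch) can be sketched as follows. `perform_operation` and `capture_screenshot` are stand-ins for the platform's input-injection and screen-capture facilities; the request/result shapes are assumptions for the example.

```python
# Hypothetical sketch of client-side module 102 handling a testing
# request: run each test, capture the resulting screenshot, and
# collect the results for transmission back to the server.

def perform_operation(op, pos):
    # Placeholder for injecting a tap/drag at `pos` on the device.
    pass

def capture_screenshot():
    # Placeholder for the OS screen-capture function (cf. screen
    # capture module 328 in the description above).
    return b"screenshot-bytes"

def handle_testing_request(request):
    results = []
    for test in request["tests"]:
        perform_operation(test["op"], test["pos"])
        results.append({"test_id": test["id"],
                        "screenshot": capture_screenshot()})
    return results  # sent to the server in serial order or as a batch

results = handle_testing_request(
    {"tests": [{"id": "420-A", "op": "tap", "pos": (10, 20)},
               {"id": "420-B", "op": "tap", "pos": (30, 40)}]})
```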
  • server-side module 106 determines whether the subsequent images are interfaces that are different from the home interface. For example, if a respective test includes a predefined operation (e.g., a tap gesture) performed at a testing location corresponding to a button that displays a different interface, server-side module 106 compares the home interface with the subsequent image of the interface following the respective test. In accordance with a determination that the subsequent images include an interface different from the home interface, server-side module 106 links the subsequent images to the home interface in the interface hierarchy in order to determine the interface hierarchy of the respective software program. As such, the software testing application is able to track the interface hierarchy of the respective software program and drill down to subsequent interfaces of the respective software program so as to test those subsequent interfaces.
  • interface 502 is the home interface and interfaces 504, 506, and 508 are interfaces different from interface 502 following the testing request.
  • interfaces 504, 506, and 508 correspond to three tests performed at different testing locations (e.g. , the tests included in the testing request) in interface 502.
  • a test includes a sequence of testing locations so as to drill down the interface hierarchy and test interfaces outside of the home interface.
  • a respective testing request includes three two-part tests where the first part of each of the three tests includes a respective operation to be performed at a respective location in interface 502 so as to drill down to interface 504.
  • the second part of a first one of the three tests includes a respective operation to be performed at a first location in interface 504
  • the second part of a second one of the three tests includes a respective operation to be performed at a second location in interface 504
  • the second part of a third one of the three tests includes a respective operation to be performed at a third location in interface 504.
  • the resulting interfaces displayed by the respective software program are interfaces 510, 512, and 514, respectively, which are all different from interface 504.
  • an operation performed at a testing location in an interface may trace back to an interface that is higher up in the interface hierarchy; however, this interface is accounted for in the interface hierarchy as a new interface.
  • interface 516 is the same as interface 502, but interface 516 is considered a new interface in the interface hierarchy because interface 516 was displayed by the respective software program as a result of drilling down to interface 508 from interface 502 and performing an operation at a testing location in interface 506.
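The bookkeeping above — every interface reached by a test becomes a new node under its parent, even when its screenshot matches an earlier interface (interface 516 vs. interface 502) — can be sketched as a small tree structure. The class, its even-numbered node IDs echoing the figure, and the byte-string screenshots are illustrative assumptions.

```python
# Sketch of interface-hierarchy tracking: each reached interface is
# always recorded as a new node, even if its pixels match an earlier
# one. Structure and names are illustrative, not the patent's.
import itertools

class InterfaceHierarchy:
    def __init__(self, home_screenshot):
        self._ids = itertools.count(502, 2)  # toy numbering like Figure 5
        self.nodes = {}                      # node id -> screenshot
        self.children = {}                   # node id -> list of child ids
        self.home = self._add(home_screenshot)

    def _add(self, screenshot):
        node = next(self._ids)
        self.nodes[node] = screenshot
        self.children[node] = []
        return node

    def link(self, parent, screenshot):
        """Record an interface reached from `parent`; always a new node."""
        child = self._add(screenshot)
        self.children[parent].append(child)
        return child

h = InterfaceHierarchy(b"home")
a = h.link(h.home, b"iface-504")
b = h.link(h.home, b"iface-506")
loop = h.link(b, b"home")  # same pixels as home, still a new node
```

Treating revisited interfaces as new nodes keeps the hierarchy a tree, which makes drilling down to untested interfaces straightforward.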
  • Figure 6A is a screenshot captured by a client device 104 of an interface for a software program executed by the client device 104 in accordance with some embodiments.
  • the software program is a respective application of application(s) 326 stored by client device 104.
  • the interface is a home interface for the respective application.
  • the screenshot in Figure 6A corresponds to image 404-A in test history database 114 ( Figure 4) .
  • Figure 6B illustrates a visual representation of edge information for the interface in Figure 6A in accordance with some embodiments.
  • the visual representation in Figure 6B is the result of running an edge detection algorithm on the interface in Figure 6A.
  • the visual representation in Figure 6B corresponds to edge information 406-A in test history database 114 ( Figure 4) .
  • the edge information is used by server-side module 106 or component(s) thereof (e.g., identifying module 226 and selecting module 228, Figure 2) to identify clusters and/or predefined shapes in the screenshot in Figure 6A of the interface for the software program based on the edge information and to select one or more testing locations in the interface in Figure 6A based on the edge information.
  • Figures 7A-7B illustrate a flowchart diagram of a method 700 of testing software reliability in accordance with some embodiments.
  • method 700 is performed by a server with one or more processors.
  • method 700 is performed by server system 108 ( Figures 1-2) or a component thereof (e.g., server-side module 106, Figures 1-2) .
  • method 700 is governed by instructions that are stored in a non-transitory computer readable storage medium of the server and the instructions are executed by one or more processors of the server.
  • Optional operations are indicated by dashed lines (e.g. , boxes with dashed-line borders) .
  • server system 108 through interactions with client device 104 (e.g. , a mobile phone or a mobile terminal) , tests the reliability of a software program (e.g. , an application) running on client device 104.
  • server system 108 and client device 104 may be connected via a wired data connection or via a wireless data connection.
  • the server sends an instruction to a mobile terminal for acquiring a screenshot of a software program running on the mobile terminal.
  • the mobile terminal captures an image of the software program currently running and sends the image to the server through the data connection.
  • the image corresponds to the screenshot in Figure 6A where the screenshot corresponds to a home interface of an application executed by the mobile terminal.
  • the server acquires (702) an image of a software program running on a mobile terminal and displayed on a touch screen of the mobile terminal.
  • the server obtains (704) edge information indicating edges of the image based on an edge detection process performed on the image.
  • the server calls a relevant image edge detection algorithm to perform edge detection on the received image.
  • Figure 6B includes a visual representation of edge information for the screenshot in Figure 6A.
  • the image edge detection algorithm may be, but is not limited to, the Canny edge detection algorithm.
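As a rough illustration of how edge information can be derived from a screenshot, the sketch below implements only the gradient-magnitude stage of a Canny-style detector in pure Python. A real system would use a full library implementation (with smoothing, non-maximum suppression, and hysteresis); the threshold value and the toy 5x5 "screenshot" are assumptions for illustration, not from the disclosure.

```python
def detect_edges(image, threshold=50):
    """image: 2D list of grayscale values (0-255). Returns the set of
    (row, col) pixels whose central-difference gradient magnitude meets
    the threshold. Border pixels are skipped for simplicity."""
    rows, cols = len(image), len(image[0])
    edges = set()
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            gx = image[r][c + 1] - image[r][c - 1]   # horizontal gradient
            gy = image[r + 1][c] - image[r - 1][c]   # vertical gradient
            if (gx * gx + gy * gy) ** 0.5 >= threshold:
                edges.add((r, c))
    return edges

# A 5x5 "screenshot" with a bright 3x3 button on a dark background.
screen = [
    [0,   0,   0,   0,   0],
    [0, 200, 200, 200,   0],
    [0, 200, 200, 200,   0],
    [0, 200, 200, 200,   0],
    [0,   0,   0,   0,   0],
]
edge_pixels = detect_edges(screen)  # the button's border, not its center
```

The detector marks the button's outline while leaving its uniform interior untouched, which is exactly the property the selection steps below rely on.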
  • the server acquires (706) position information for a detection spot used for testing the reliability of the software program according to the edge information.
  • the position information may be, but is not limited to, a coordinate of a point in the touch screen region of the mobile terminal.
  • the server acquires the position information by (708) : selecting one or more points from the edge information at random; and acquiring position information for the one or more points, where respective position information corresponds to a detection spot for testing the reliability of the software program.
  • a point on an edge in the interface is the detection spot to be tested thereby improving test efficiency.
  • the server acquires the position information by (710) : acquiring one or more points according to a predetermined spacing from a respective edge of the image based on the edge information; and acquiring position information for the one or more points, where respective position information corresponds to a detection spot for testing the reliability of the software program.
  • a point near an edge in the interface is selected as the detection spot thereby improving test efficiency.
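The two selection strategies above — random points taken from the edge information (708), and points at a predetermined spacing from an edge (710) — might be sketched as follows. The function names and the rightward-offset convention for "spacing from an edge" are illustrative assumptions, not from the original disclosure.

```python
import random

def random_edge_spots(edges, count, seed=None):
    """Strategy (708): choose detection spots at random from edge pixels."""
    rng = random.Random(seed)  # seeded for reproducible test runs
    return rng.sample(sorted(edges), count)

def offset_edge_spots(edges, spacing):
    """Strategy (710): take a point at a predetermined spacing from each
    edge pixel (here, `spacing` pixels to the right; the direction is an
    arbitrary choice for illustration)."""
    return [(r, c + spacing) for (r, c) in sorted(edges)]

edge_pixels = {(1, 1), (1, 2), (1, 3)}
spots = random_edge_spots(edge_pixels, count=2, seed=0)
near_spots = offset_edge_spots(edge_pixels, spacing=2)
```

Either way, every detection spot is anchored to detected interface content rather than chosen fully at random, which is the source of the efficiency gain claimed above.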
  • the server sends (712) a detection request to the mobile terminal, where the detection request includes position information for one or more detection spots and touch screen operation instruction information for each of the one or more detection spots.
  • the touch screen operation instruction information comprises, but is not limited to, at least one of the following: click, dragging, long-press, or the like.
  • after receiving the detection request, the mobile terminal performs an operation indicated by the touch screen operation instruction information at each of the one or more detection spots corresponding to the position information. Subsequently, the mobile terminal returns the detection result obtained through the operation (s) to the server.
  • the server selects a coordinate of a character in a text string displayed by the interface as the position information of the detection spot, and sets “click” as the operation in the corresponding touch screen operation instruction information. Then, the server sends the coordinate of the character on the touch screen of the mobile terminal and the touch screen operation instruction information indicating the “click” operation to the mobile terminal. After receiving the coordinate and the touch screen operation instruction information, the mobile terminal performs the “click” operation at the coordinate, and returns the result corresponding to the operation to the server as a detection result.
  • selecting the coordinate of a whole character as the position information of the detection spot is only an example.
  • a coordinate of a stroke of the character may also be selected as the position information of the detection spot.
  • a coordinate of a point of a stroke of the character may even be selected as the position information of the detection spot. The embodiment has no limit in this aspect.
  • the server receives (714) a detection result from the mobile terminal, where the detection result is obtained by the mobile terminal which performs an operation indicated by the touch screen operation instruction information at a position indicated by the position information of a respective detection spot of the one or more detection spots.
  • the server successively receives (716) detection results from the mobile terminal which performs operations indicated by the touch screen operation instruction information at the positions indicated by the position information of one or more detection spots. For example, the mobile terminal immediately returns a detection result to the server after performing a corresponding operation, thereby enabling the server to acquire the detection result in real time.
  • the server receives (718) a test response from the mobile terminal, wherein the test response comprises detection results for operations indicated by the touch screen operation instruction information on positions indicated by the position information of each detection spot of the one or more detection spots.
  • the mobile terminal returns a batched set of detection results to the server that corresponds to a plurality of operations performed by the mobile terminal, thereby enabling the server to acquire detection results in batches.
  • the server (720) scans the detection results for a set of predetermined key words and, in accordance with a determination that one or more predetermined key words in the set of predetermined key words are included in the scanned detection results, identifies an error with the software.
  • the server records the detection results in the form of a log in a memory (e.g., test history database 114, Figures 1-2 and 4) .
  • the server searches the log for key words (e.g., “abnormality”) and determines whether the detection results include one or more of the key words.
  • the server identifies an error, abnormality, or unreliable issue with the software program.
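The key-word scan over logged detection results (720) can be sketched as a simple log filter. The key-word set and log-line format below are illustrative assumptions; the disclosure only requires that a predetermined set of key words be matched against the recorded results.

```python
KEY_WORDS = {"error", "abnormality", "crash"}  # illustrative key-word set

def scan_detection_log(log_lines, key_words=KEY_WORDS):
    """Return the log lines containing any predetermined key word,
    each of which indicates an error with the tested software."""
    return [line for line in log_lines
            if any(word in line.lower() for word in key_words)]

detection_log = [
    "spot (120, 44): click -> interface changed",
    "spot (80, 210): long-press -> Abnormality: null pointer",
    "spot (80, 260): dragging -> ok",
]
hits = scan_detection_log(detection_log)
```

Each hit ties an identified error back to the detection spot and operation that produced it, which is what makes the logged results actionable.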
  • Figures 8A-8C illustrate a flowchart diagram of a method 800 of testing software reliability in accordance with some embodiments.
  • method 800 is performed by a server with one or more processors.
  • method 800 is performed by server system 108 ( Figures 1-2) or a component thereof (e.g., server-side module 106, Figures 1-2) .
  • method 800 is governed by instructions that are stored in a non-transitory computer readable storage medium of the server and the instructions are executed by one or more processors of the server.
  • Optional operations are indicated by dashed lines (e.g. , boxes with dashed-line borders) .
  • the server obtains (802) , from a client device, a first image of a first interface of a software program executed on the client device.
  • server system 108 or a component thereof (e.g., obtaining module 222, Figure 2) obtains the first image from a respective client device 104 in response to an instruction sent by server system 108.
  • the first image is a screenshot
  • the first interface is a home interface of an application.
  • Figure 6A for example, is a screenshot of an interface for a software program executed by the respective client device 104 that is sent by the respective client device 104 to server system 108.
  • the software program is a respective application of application (s) 326 stored by the respective client device 104
  • the interface is a home interface for the respective application.
  • the server performs (804) edge detection on the first image to obtain edge information for edges of the first interface in the first image.
  • server system 108 or a component thereof (e.g., edge detection module 224, Figure 2) performs the edge detection on the first image.
  • edge detection module 224 runs an edge detection algorithm on the first image that detects all edges of user interface components, images, text, and the like in the first interface.
  • Figure 6B includes a visual representation of edge information for the screenshot in Figure 6A.
  • the server selects (806) one or more testing locations in the first image based on the edge information.
  • server system 108 or a component thereof (e.g., selecting module 228, Figure 2) selects one or more testing locations in the first image based on the edge information.
  • the first interface includes white spaces that can be invoked to perform various functions. Accordingly, in some embodiments, some testing locations are based on defined white spaces of significant size.
  • the edge information includes the location and size of significant white spaces on the UI.
  • the server selects the one or more testing locations in the first image based on the edge information by (808) : identifying at least one cluster of edges based on the edge information that satisfies one or more testing parameters; and selecting a location within a predetermined distance of a representative point of the at least one cluster as the testing location.
  • server system 108 or a component thereof (e.g., identifying module 226, Figure 2) identifies a cluster based on the edge information that satisfies the one or more testing parameters.
  • the testing parameters indicate that a cluster is required to have a preset number of edges within a preset radius (e.g., X edges within a radius of N pixels or M millimeters) .
  • the representative point is the center of a cluster of edges. In some embodiments, the representative point is a point on an edge of the cluster (e.g. , a point on an edge of a cluster including the four edges of a button) . In some embodiments, the representative point is selected based on the characteristics of the cluster in question. For example, for a cluster with well defined boundaries (e.g. , a cluster indicating a button or affordance of the first interface) , the representative point is a random point on the boundary edge of the cluster. In another example, for a cluster indicating an undefined shape, the representative point is the center of mass or weighted center of the undefined shape. Alternatively, in some embodiments, the one or more testing locations are randomly selected locations on or near an edge in the interface.
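A minimal sketch of cluster identification under such testing parameters — at least `min_edges` edge pixels within `radius`, with the centroid as the representative point — is shown below. This is an O(n²) toy implementation for illustration; the parameter values and the centroid choice are assumptions, since the disclosure also allows other representative points (boundary points, weighted centers).

```python
def find_cluster(edge_points, radius, min_edges):
    """Scan edge pixels for one that has at least `min_edges` edge pixels
    (itself included) within `radius`; return (representative, members)
    with the centroid as the representative point, or None."""
    pts = sorted(edge_points)
    for p in pts:
        members = [q for q in pts
                   if (q[0] - p[0]) ** 2 + (q[1] - p[1]) ** 2 <= radius ** 2]
        if len(members) >= min_edges:
            rep = (sum(q[0] for q in members) / len(members),
                   sum(q[1] for q in members) / len(members))
            return rep, members
    return None

# Four edge pixels forming a button-like cluster, plus one stray pixel.
edge_pixels = [(10, 10), (10, 12), (12, 10), (12, 12), (40, 40)]
cluster = find_cluster(edge_pixels, radius=3, min_edges=4)
```

The stray pixel at (40, 40) is excluded, so the testing location gravitates toward the dense group of edges that likely marks an interactive control.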
  • the server selects the one or more testing locations in the first image based on the edge information by (810) : identifying at least one predefined shape based on the edge information that satisfies one or more testing parameters; and selecting a location within a predetermined distance of a center of the at least one predefined shape as the testing location.
  • server system 108 or a component thereof identifies a predefined shape based on the edge information that satisfies the one or more testing parameters.
  • the testing parameters indicate that a cluster is required to have a predefined shape with edges that are parallel to the edges of the screen such as a rectangle or square affordance.
  • the testing parameters indicate that a cluster is required to have a predefined shape with curved edges such as a circle or oval.
  • the server selects a location that is within the predefined shape (e.g., the center of mass or weighted center of the predefined shape) or close to the perimeter of the predefined shape (e.g., just outside the predefined shape) .
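One possible sketch of the predefined-shape branch (810): checking whether a set of edge pixels lies on the border of an axis-aligned rectangle (e.g., a rectangular button affordance) and, if so, selecting its center as the testing location. A production detector would trace contours and handle curved shapes; the crude bounding-box test below is an illustrative assumption.

```python
def rectangle_center(edge_points):
    """If every edge pixel lies on the border of the points' bounding box
    (a crude test for an axis-aligned rectangular affordance), return the
    box center as a testing location; otherwise return None."""
    rows = [r for r, _ in edge_points]
    cols = [c for _, c in edge_points]
    top, bottom = min(rows), max(rows)
    left, right = min(cols), max(cols)
    if not all(r in (top, bottom) or c in (left, right)
               for r, c in edge_points):
        return None
    return ((top + bottom) / 2, (left + right) / 2)

# Edge pixels sampled from the border of a 5x5 button.
button_edges = [(5, 5), (5, 7), (5, 9), (7, 5), (7, 9), (9, 5), (9, 7), (9, 9)]
location = rectangle_center(button_edges)
```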
  • the server identifies (812) position information for the one or more selected testing locations.
  • server system 108 or a component thereof (e.g., position information module 229, Figure 2) identifies position information for the one or more testing locations selected in operations 806-810.
  • respective position information includes the pixel coordinates of a selected testing location.
  • the server sends (814) a testing request to the client device to perform one or more predefined operations at each of the one or more selected testing locations.
  • server system 108 or a component thereof (e.g., requesting module 230, Figure 2) sends a testing request to the client device 104 to perform one or more predefined operations at each of the one or more selected testing locations.
  • a respective predefined operation is one of a tap gesture, long-press gesture, swipe gesture, dragging gesture, or the like.
  • a respective predefined operation also includes selection of a hardware button in conjunction with some user interface component such as depressing a home button in concert with a long-press or tap on a touch screen affordance.
  • one or more operations on the hardware buttons can be introduced between any two consecutive predefined operations at the different testing locations.
  • the one or more predefined operations selected for a particular testing location can be based on the shape and/or distribution of the edges associated with the testing location. For example, a cluster of edges that indicates the presence of a long scroll bar may give rise to a testing location for a tap gesture, a swipe gesture, or a dragging gesture. In another example, a cluster of edges that indicates the presence of a button may give rise to a testing location for a tap gesture only.
  • the testing request includes (816) the position information for the one or more selected testing locations. In some embodiments, the testing request includes the position information identified for the one or more testing locations by position information module 229.
  • the server obtains (818) , from the client device, test results for the one or more predefined operations performed at each of the one or more selected testing locations.
  • server system 108 or a component thereof (e.g., obtaining module 222, Figure 2) obtains test results from the client device 104 in response to sending the testing request.
  • server system 108 or a component thereof (e.g., storing module 234, Figure 2) stores the test results and the corresponding testing locations and test operations in test history database 114 ( Figures 1-2 and 4) in association with the testing request.
  • each of the test sequences is associated with a test sequence identifier 402, and each test sequence corresponds to a testing request sent to a client device 104.
  • test results 410 for the test sequences are linked to a plurality of sub-entries for each of the tests corresponding to test sequence identifier 402-A performed by the client device 104.
  • a sub-entry for a respective test includes a test identifier 420, a test operation 422, position information 424 corresponding to a testing location, and test results 426 for the test performed at the client device (e.g. , a screenshot of the interface resulting from the test operation 422 performed at the testing location corresponding to the position information 424) .
  • the test results are screenshots after performing the predefined operation at the selected testing location. For example, in response to performing the predefined operation at the selected testing location, a subsequent interface is displayed or the interface changes. In some embodiments, if the test results are not received within Z seconds the server determines that an error has occurred with respect to a testing location. In some embodiments, if a screenshot of the home screen is returned after a test operation, the server may determine that the app has crashed after the test operation. In some embodiments, if the same screenshot has been returned after Y test operations (where Y is large) , the server may determine that the application is frozen.
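The timeout, home-screen, and frozen-interface heuristics above can be combined into a small outcome classifier. The fingerprint value, return strings, and the run-length threshold standing in for the unspecified Y are assumptions for illustration, not values from the disclosure.

```python
HOME_HASH = "home_screen"  # hypothetical fingerprint of the home-screen image

def classify_result(screens, timed_out, freeze_run=5):
    """Classify a test outcome from the sequence of screenshot
    fingerprints returned after each test operation."""
    if timed_out:                        # no result within the allowed time
        return "no response"
    if screens and screens[-1] == HOME_HASH:
        return "crashed"                 # app fell back to the home screen
    if len(screens) >= freeze_run and len(set(screens[-freeze_run:])) == 1:
        return "frozen"                  # same screenshot many times in a row
    return "ok"

outcome_crash = classify_result(["list", "detail", HOME_HASH], timed_out=False)
outcome_frozen = classify_result(["a"] * 6, timed_out=False)
outcome_ok = classify_result(["list", "detail"], timed_out=False)
```

Comparing screenshot fingerprints rather than raw images keeps the per-operation check cheap; any stable image hash would serve.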
  • the testing request identifies two or more testing locations
  • the server obtains, from the client device, test results by (820) : receiving first test results for a first predefined operation performed at a first testing location of the two or more testing locations; and, after receiving the first test results, receiving second test results for a second predefined operation performed at a second testing location of the two or more testing locations.
  • the test results are received in real-time, serial/successive order from the client device.
  • the real-time testing results are received by the server when the testing request includes one testing location and one predefined operation.
  • the server sends a testing request including a first testing location and corresponding test operation to the device (e.g., client device 104, Figures 1 and 3) .
  • the device performs the requested test and sends back the test result to the server.
  • the server then sends a second testing request including a second testing location and corresponding test operation to the device, the device performs the second requested test and sends back a second test result, and so on.
  • the testing request identifies two or more testing locations
  • the server obtains, from the client device, test results by (822) : receiving batched test results at least including first test results for a first predefined operation performed at a first testing location of the two or more testing locations and second test results for a second predefined operation performed at a second testing location of the two or more testing locations.
  • the testing request includes a command to return to the first interface associated with the first image after performing a predefined operation at a testing location.
  • the device (e.g., client device 104, Figures 1 and 3) running the software program to be tested receives a testing request which may include a series of testing locations and corresponding test operations.
  • the device performs each of the test operations at the specified testing locations in sequence, obtains a screenshot (test result) after each of the test operations, and, after all of the test operations are performed, sends all of the screenshots (with the sequence data) in a bundle back to the server.
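The batched request/response exchange might be modeled as below. The class and field names are hypothetical, and the `perform` callback stands in for the device-side operation-and-screenshot step, which the disclosure leaves to the client.

```python
from dataclasses import dataclass, field

@dataclass
class TestingRequest:
    """Batched testing request: (position, operation) pairs.
    Names are illustrative, not from the disclosure."""
    tests: list  # e.g., [((row, col), "tap"), ...]

@dataclass
class TestingResponse:
    screenshots: list = field(default_factory=list)  # one per test, in order

def run_batched(request, perform):
    """Client-side loop: perform each operation at its testing location in
    sequence, capture a result after each, and return them in one bundle."""
    response = TestingResponse()
    for position, operation in request.tests:
        response.screenshots.append(perform(position, operation))
    return response

# Stand-in for the device: "captures" a fake screenshot label per test.
req = TestingRequest(tests=[((10, 20), "tap"), ((30, 40), "swipe")])
resp = run_batched(req, lambda pos, op: f"{op}@{pos}")
```

Because the response preserves sequence order, the server can later match each screenshot back to the testing location and operation that produced it.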
  • the test results for the one or more predefined operations performed at each of the one or more selected testing locations include one or more subsequent images of the software program executed on the client device.
  • for a respective image of the one or more subsequent images, the server (824) : performs character recognition on the respective image to obtain text of the respective image; and, in accordance with a determination that the text of the respective image includes one or more words in a predefined set of words, records the testing location corresponding to the respective image.
  • server system 108 or a component thereof (e.g., analyzing module 232, Figure 2) analyzes the test results (i.e., the subsequent images) by performing character recognition on the subsequent images and searching the character recognition results for words in a predefined set of words. For example, after performing the predefined operation at the testing location, an error/malfunction with the software program causes an error message to appear. Thus, the server looks for words like “error, ” “abnormality, ” “crash, ” or the like.
  • the predefined set of words includes words indicating occurrence of an error or abnormal execution of the operation.
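Mapping the recognized text back to the testing locations that produced it can be sketched as below. The character-recognition step itself is abstracted away (a real system would run OCR on each screenshot); the word set, data layout, and sample strings are illustrative assumptions.

```python
ERROR_WORDS = {"error", "abnormality", "crash"}  # illustrative word set

def failing_locations(ocr_results, error_words=ERROR_WORDS):
    """`ocr_results` maps a testing location to the text recognized in the
    screenshot captured after operating at that location; return the
    locations whose screenshots contain any error word."""
    return [loc for loc, text in ocr_results.items()
            if any(word in text.lower() for word in error_words)]

ocr_results = {
    (120, 44): "Settings",
    (80, 210): "Unfortunately, the application has stopped (crash report)",
}
bad_locations = failing_locations(ocr_results)
```

Recording the offending locations, rather than just counting errors, lets testers reproduce each failure by replaying the operation at that spot.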
  • the test results for the one or more predefined operations performed at each of the one or more selected testing locations include one or more subsequent images of the software program executed on the client device.
  • for a respective image of the one or more subsequent images, the server (826) : records the testing location corresponding to the respective image in accordance with a determination that the respective image includes a second interface different from the first interface in the first image; and associates the testing location corresponding to the respective image with the second interface in an interface hierarchy for the software program executed on the client device.
  • a second interface is displayed at the client device.
  • server system 108 or a component thereof (e.g. , interface mapping module 236, Figure 2) generates a hierarchy of the interfaces for the software program executed on the client device 104 and stores the hierarchy in interface hierarchy library 116.
  • the first interface is a home interface for the software program.
  • a test sequence with testing locations at two or more interfaces in the hierarchy is used to drill down to a respective interface in the hierarchy from the first interface so as to test the respective interface and get back to the respective interface.
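A parent-pointer sketch of such an interface hierarchy, with a helper that computes the sequence of testing locations needed to drill down from the home interface to a given interface. The interface ids and coordinates are made up for illustration; the disclosure only requires that each non-home interface be associated with the testing location that revealed it.

```python
# Parent-pointer sketch of an interface hierarchy: each interface id maps
# to (parent_id, testing_location_operated_to_reach_it).
hierarchy = {
    "home": (None, None),
    "menu": ("home", (50, 600)),
    "settings": ("menu", (120, 300)),
}

def drill_down_path(hierarchy, interface_id):
    """Sequence of testing locations to operate on, starting from the
    home interface, to reach `interface_id`."""
    path = []
    node = interface_id
    while hierarchy[node][0] is not None:
        parent, location = hierarchy[node]
        path.append(location)
        node = parent
    return list(reversed(path))

path_to_settings = drill_down_path(hierarchy, "settings")
```

Prefixing a test sequence with such a path lets the server test interfaces deep in the hierarchy and then navigate back for further tests.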
  • Figure 9 is a block diagram of an apparatus for testing software reliability in accordance with some embodiments.
  • the apparatus implements method 700 in Figures 7A-7B.
  • the apparatus corresponds to server system 108 ( Figures 1-2) or a component thereof (e.g. , server-side module 106, Figures 1-2) .
  • the apparatus may be implemented in whole or in part on a device (e.g., server system 108, Figures 1-2) through software, hardware, or a combination thereof.
  • the apparatus includes: a first acquiring module 902; an edge detection module 904; a second acquiring module 906; a first transmission module 908; and a second transmission module 910.
  • first acquiring module 902 is configured to acquire an image of a software program running on a mobile terminal.
  • edge detection module 904 is configured to obtain edge information indicating edges of the image based on an edge detection process performed on the image acquired by first acquiring module 902.
  • second acquiring module 906 is configured to acquire position information for a detection spot used for testing the reliability of the software program according to the edge information.
  • the position information may be, but is not limited to, a coordinate of a point in the touch screen region of the mobile terminal.
  • second acquiring module 906 includes: a first selecting sub-unit 922; and a first acquiring sub-unit 924.
  • first selecting sub-unit 922 is configured to select one or more points from the edge information at random.
  • first acquiring sub-unit 924 is configured to acquire position information for the one or more points, where respective position information corresponds to a detection spot for testing the reliability of the software program.
  • second acquiring module 906 includes: a second acquiring sub-unit 926; and a third acquiring sub-unit 928.
  • second acquiring sub-unit 926 is configured to acquire one or more points according to a predetermined spacing from a respective edge of the image based on the edge information.
  • third acquiring sub-unit 928 is configured to acquire position information for the one or more points, where respective position information corresponds to a detection spot for testing the reliability of the software program.
  • first transmission module 908 is configured to send a detection request to the mobile terminal.
  • the detection request includes position information for one or more detection spots and a touch screen operation instruction information for each of the one or more detection spots.
  • the touch screen operation instruction information comprises, but is not limited to, at least one of the following: click, dragging, or long-press.
  • second transmission module 910 is configured to receive a detection result from the mobile terminal.
  • the detection result is obtained by the mobile terminal which performs an operation indicated by the touch screen operation instruction information at a position indicated by the position information of a respective detection spot of the one or more detection spots.
  • second transmission module 910 is configured to successively receive detection results from the mobile terminal which performs operations indicated by the touch screen operation instruction information at the positions indicated by the position information of one or more detection spots.
  • second transmission module 910 is configured to receive a test response from the mobile terminal, wherein the test response comprises detection results for operations indicated by the touch screen operation instruction information on positions indicated by the position information of each detection spot of the one or more detection spots.
  • the apparatus further includes: a scanning module 912; and a determining module 914.
  • scanning module 912 is configured to scan the detection results for a set of predetermined key words.
  • determining module 914 is configured to identify an error with the software in accordance with a determination that one or more predetermined key words in the set of predetermined key words are included in the scanned detection results.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)
  • Debugging And Monitoring (AREA)

Abstract

A method and system for testing software reliability are disclosed. A server with one or more processors and memory obtains, from a client device, a first image of a first interface of a software program executed on the client device. The server performs edge detection on the first image to obtain edge information for edges of the first interface in the first image. The server selects one or more testing locations in the first image based on the edge information and sends a testing request to the client device to perform one or more predefined operations at each of the one or more selected testing locations. In response to sending the testing request, the server obtains, from the client device, test results for the one or more predefined operations performed at each of the one or more selected testing locations.

Description

METHOD AND DEVICE FOR TESTING SOFTWARE RELIABILITY
PRIORITY CLAIM AND RELATED APPLICATION
This application claims priority to Chinese Patent Application No. 201310437097.8, entitled “Method and Apparatus for Testing Software Reliability, ” filed on September 23, 2013, which is incorporated by reference in its entirety.
TECHNICAL FIELD
The disclosed implementations relate generally to the field of testing technologies, and, in particular, to testing method and system for testing software reliability.
BACKGROUND
The conventional scheme for testing the reliability of software on mobile phones (i.e., a black-box test for software without program source code, or for third-party software) includes the full-random performance of clicks, dragging gestures, and/or the like on interfaces of the software over a long time period (e.g., with an MTTF random key test tool) to count a number of crashes and/or abnormalities with the software, so as to evaluate the reliability and stability of the software.
However, the above-mentioned technical solution has the following problem (s) : Due to the randomness of the conventional testing scheme, many operations fail to achieve truly operable control of the software (e.g., many operation events become invalid events) , which makes the conventional testing scheme unreliable. With respect to the above problem, there is presently no effective solution.
SUMMARY
In some embodiments, a method of testing software reliability is performed at a server (e.g. , server system 108, Figures 1-2) with one or more processors and memory. The method includes obtaining, from a client device (e.g. , client device 104, Figures 1 and 3) , a first image of a first interface of a software program executed on the client device. The method includes performing edge detection on the first image to obtain edge information for  edges of the first interface in the first image. The method includes selecting one or more testing locations in the first image based on the edge information and sending a testing request to the client device to perform one or more predefined operations at each of the one or more selected testing locations. In response to sending the testing request, the method includes obtaining, from the client device, test results for the one or more predefined operations performed at each of the one or more selected testing locations.
In some embodiments, a computing device (e.g. , server system 108, Figures 1-2, client device 104, Figures 1 and 3, or a combination thereof) includes one or more processors and memory storing one or more programs for execution by the one or more processors, the one or more programs include instructions for performing, or controlling performance of, the operations of any of the methods described herein. In some embodiments, a non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which, when executed by a computing device (e.g. , server system 108, Figures 1-2, client device 104, Figures 1 and 3, or a combination thereof) with one or more processors, cause the computing device to perform, or control performance of, the operations of any of the methods described herein. In some embodiments, a computing device (e.g. , server system 108, Figures 1-2, client device 104, Figures 1 and 3, or a combination thereof) includes means for performing, or controlling performance of, the operations of any of the methods described herein.
Various advantages of the present application are apparent in light of the descriptions below.
BRIEF DESCRIPTION OF DRAWINGS
The aforementioned features and advantages of the disclosed technology as well as additional features and advantages thereof will be more clearly understood hereinafter as a result of a detailed description of preferred embodiments when taken in conjunction with the drawings.
To describe the technical solutions in the embodiments of the present disclosed technology or in the prior art more clearly, the following briefly introduces the accompanying drawings required for describing the embodiments or the prior art. Apparently, the accompanying drawings in the following description show merely some embodiments of the  present disclosed technology, and persons of ordinary skill in the art may still derive other drawings from these accompanying drawings without creative efforts.
Figure 1 is a block diagram of a server-client environment in accordance with some embodiments.
Figure 2 is a block diagram of a server system in accordance with some embodiments.
Figure 3 is a block diagram of a client device in accordance with some embodiments.
Figure 4 is a block diagram of a test history database in accordance with some embodiments.
Figure 5 is a block diagram of a portion of an interface hierarchy library in accordance with some embodiments.
Figure 6A illustrates a screenshot of an interface for a software program in accordance with some embodiments.
Figure 6B illustrates a visual representation of edge information for the interface in Figure 6A in accordance with some embodiments.
Figures 7A-7B illustrate a flowchart diagram of a method of testing software reliability in accordance with some embodiments.
Figures 8A-8C illustrate a flowchart diagram of a method of testing software reliability in accordance with some embodiments.
Figure 9 is a block diagram of an apparatus for testing software reliability in accordance with some embodiments.
Like reference numerals refer to corresponding parts throughout the several views of the drawings.
DETAILED DESCRIPTION
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the subject matter presented herein. But it will be apparent to one skilled in the art that the subject matter  may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
The following clearly and completely describes the technical solutions in the embodiments of the present application with reference to the accompanying drawings in the embodiments of the present application. Apparently, the described embodiments are merely a part rather than all of the embodiments of the present application. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present application without creative efforts shall fall within the protection scope of the present application.
As shown in Figure 1, data processing for a software testing application is implemented in a server-client environment 100 in accordance with some embodiments. In accordance with some embodiments, server-client environment 100 includes client-side processing 102-1, 102-2 (hereinafter “client-side modules 102” ) executed on a client device 104-1, 104-2, and server-side processing 106 (hereinafter “server-side module 106” ) executed on a server system 108. Client-side module 102 communicates with server-side module 106 through one or more networks 110. Client-side module 102 provides client-side functionalities for the software testing application (e.g. , performing tests, screenshot invocation, test result transmission, etc. ) and communications with server-side module 106. In some embodiments, client-side module 102 is configured to execute as a background process concurrently with and independent of the control and operations of other applications executed in the foreground (e.g. , application (s) 326, Figure 3) , in order to capture screenshots of interfaces presented by client device 104 for the other applications. The client-side module 102 is also able to perform tests with the other applications and invoke screen capture functions of the operating system, and/or other functionalities related to software testing. Server-side module 106 provides server-side functionalities for the software testing application (e.g. , performing edge detection, selecting testing locations, analyzing test results, etc. ) for any number of client modules 102 each residing on a respective client device 104.
In some embodiments, server-side module 106 includes one or more processors 112, test history database 114, interface hierarchy library 116, an I/O interface to one or more clients 118, and an I/O interface to one or more external services 120. I/O interface to one or more clients 118 facilitates the client-facing input and output processing for server-side  module 106. In some embodiments, processor (s) 112 perform edge detection on an image of an interface for a software program (i.e. , an application) to obtain edge information, select testing locations at which to perform test operations on the interface based on the edge information, send a testing request to the client device 104 including the testing locations, and analyze test results. Test history database 114 stores test results and corresponding testing locations, and interface hierarchy library 116 stores a hierarchy of interfaces for software programs that have been tested by the software testing application. I/O interface to one or more external services 120 facilitates communications with one or more external services 122 (e.g. , web servers or cloud-based service providers such as video and/or image hosting and storage websites) .
Examples of client device 104 include, but are not limited to, a handheld computer, a wearable computing device, a personal digital assistant (PDA) , a tablet computer, a laptop computer, a desktop computer, a cellular telephone, a smart phone, an enhanced general packet radio service (EGPRS) mobile phone, a media player, a navigation device, a game console, a television, a remote control, or a combination of any two or more of these data processing devices or other data processing devices.
Examples of one or more networks 110 include local area networks (LAN) and wide area networks (WAN) such as the Internet. One or more networks 110 are, optionally, implemented using any known network protocol, including various wired or wireless protocols, such as Ethernet, Universal Serial Bus (USB) , FIREWIRE, Long Term Evolution (LTE) , Global System for Mobile Communications (GSM) , Enhanced Data GSM Environment (EDGE) , code division multiple access (CDMA) , time division multiple access (TDMA) , Bluetooth, Wi-Fi, voice over Internet Protocol (VoIP) , Wi-MAX, or any other suitable communication protocol.
Server system 108 is implemented on one or more standalone data processing apparatuses or a distributed network of computers. In some embodiments, server system 108 also employs various virtual devices and/or services of third party service providers (e.g., third-party cloud service providers) to provide the underlying computing resources and/or infrastructure resources of server system 108. In some embodiments, the server system 108 includes, but is not limited to, a handheld computer, a tablet computer, a laptop computer, a desktop computer, or a combination of any two or more of these data processing devices or other data processing devices.
Server-client environment 100 shown in Figure 1 includes both a client-side portion (e.g. , client-side module 102) and a server-side portion (e.g. , server-side module 106) . In some embodiments, data processing is implemented as a standalone application installed on client device 104. In addition, the division of functionalities between the client and server portions of client environment data processing can vary in different embodiments. For example, in some embodiments, client-side module 102 is a thin-client that provides only user-facing input and output processing functions, and delegates all other data processing functionalities to a backend server (e.g. , server system 108) .
Figure 2 is a block diagram illustrating server system 108 in accordance with some embodiments. Server system 108, typically, includes one or more processing units (CPUs) 112, one or more network interfaces 204 (e.g. , including I/O interface to one or more clients 118 and I/O interface to one or more external services 120) , memory 206, and one or more communication buses 208 for interconnecting these components (sometimes called a chipset) . Memory 206 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and, optionally, includes non-volatile memory, such as one or more magnetic disk storage devices, one or more optical disk storage devices, one or more flash memory devices, or one or more other non-volatile solid state storage devices. Memory 206, optionally, includes one or more storage devices remotely located from one or more processing units 112. Memory 206, or alternatively the non-volatile memory within memory 206, includes a non-transitory computer readable storage medium. In some implementations, memory 206, or the non-transitory computer readable storage medium of memory 206, stores the following programs, modules, and data structures, or a subset or superset thereof:
·operating system 210 including procedures for handling various basic system services and for performing hardware dependent tasks;
·network communication module 212 for connecting server system 108 to other computing devices (e.g. , client devices 104 and external service (s) 122) connected to one or more networks 110 via one or more network interfaces 204 (wired or wireless) ;
·server-side module 106, which provides server-side data processing and functionalities for the software testing application, including but not limited to:
ο obtaining module 222 for obtaining, from a client device 104, a first image of a first interface of a software program executed on the client device 104 and for obtaining test results from the client device 104;
ο edge detection module 224 for performing edge detection on the first image to obtain edge information for edges of the first interface in the first image;
ο identifying module 226 for identifying clusters and/or predefined shapes in the first image based on the edge information;
ο selecting module 228 for selecting one or more testing locations in the first image based on the edge information;
ο position information module 229 for identifying position information for the one or more testing locations selected by selecting module 228;
ο requesting module 230 for sending a testing request to the client device 104 to perform one or more predefined operations (sometimes also herein called “test operations” or “testing operations” ) at each of the one or more selected testing locations;
ο analyzing module 232 for analyzing the test results received from the client device 104;
ο storing module 234 for storing the test results and the corresponding testing locations in test history database 114; and
ο interface mapping module 236 for generating a hierarchy of the interfaces for the software program executed on the client device 104 and storing the hierarchy in interface hierarchy library 116; and
·server data 240 storing data for the software testing application, including but not limited to:
ο test history database 114 storing test results and corresponding testing locations; and
ο interface hierarchy library 116 storing a hierarchy of interfaces for software programs that have been tested by the software testing application.
Each of the above identified elements may be stored in one or more of the previously mentioned memory devices, and corresponds to a set of instructions for performing a function described above. The above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules may be combined or otherwise re-arranged in various implementations. In some implementations, memory 206, optionally, stores a subset of the modules and data structures identified above. Furthermore, memory 206, optionally, stores additional modules and data structures not described above.
Figure 3 is a block diagram illustrating a representative client device 104 associated with a user in accordance with some embodiments. Client device 104, typically, includes one or more processing units (CPUs) 302, one or more network interfaces 304, memory 306, and one or more communication buses 308 for interconnecting these components (sometimes called a chipset) . Client device 104 also includes a user interface 310. User interface 310 includes one or more output devices 312 that enable presentation of media content, including one or more speakers and/or one or more visual displays. User interface 310 also includes one or more input devices 314, including user interface components that facilitate user input such as a keyboard, a mouse, a voice-command input unit or microphone, a touch screen display, a touch-sensitive input pad, a gesture capturing camera, or other input buttons or controls. Furthermore, some client devices 104 use a microphone and voice recognition or a camera and gesture recognition to supplement or replace the keyboard. Memory 306 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; and, optionally, includes non-volatile memory, such as one or more magnetic disk storage devices, one or more optical disk storage devices, one or more flash memory devices, or one or more other non-volatile solid state storage devices. Memory 306, optionally, includes one or more storage devices remotely located from one or more processing units 302. Memory 306, or alternatively the non-volatile memory within memory 306, includes a non-transitory computer readable storage medium. In some implementations, memory 306, or the non-transitory computer readable storage medium of memory 306, stores the following programs, modules, and data structures, or a subset or superset thereof:
·operating system 316 including procedures for handling various basic system services and for performing hardware dependent tasks;
·network communication module 318 for connecting client device 104 to other computing devices (e.g. , server system 108) connected to one or more networks 110 via one or more network interfaces 304 (wired or wireless) ;
·presentation module 320 for enabling presentation of information (e.g., a user interface for application (s) 326, widgets, websites and web pages thereof, and/or games, audio and/or video content, text, etc. ) at client device 104 via one or more output devices 312 (e.g. , displays, speakers, etc. ) associated with user interface 310; 
·input processing module 322 for detecting one or more user inputs or interactions from one of the one or more input devices 314 and interpreting the detected input or interaction;
·web browser module 324 for navigating, requesting (e.g. , via HTTP) , and displaying websites and web pages thereof;
·one or more applications 326 for execution by client device 104 (e.g., games, application marketplaces, payment platforms, and/or other web or non-web based applications) ;
·screen capture module 328 for performing a screen shot function of operating system 316 so as to capture images of user interface for application (s) 326 executed in the foreground; and
·client-side module 102, which provides client-side data processing and functionalities for the software testing application, including but not limited to:
ο request handling module 332 for receiving and handling testing requests from server system 108;
ο performing module 334 for performing the one or more predefined operations at each of the one or more selected testing locations within the first interface according to the testing request;
ο screenshot invoking module 336 for invoking screen capture module 328 to capture one or more screenshots of the resulting interface from the one or more predefined operations at each of the one or more selected testing locations within the first interface; and
ο sending module 338 for sending to server system 108 test results (e.g., screenshots) for the one or more predefined operations performed at each of the one or more selected testing locations; and
·client data 350 storing data associated with the software testing application, including, but not limited to:
ο test instructions store 352 for buffering and optionally storing the one or more predefined operations to be performed at the one or more selected testing locations within the first interface, as specified in the testing request; and
ο test results store 354 for buffering and optionally storing the test results for the one or more predefined operations performed at each of the one or more selected testing locations.
Each of the above identified elements may be stored in one or more of the previously mentioned memory devices, and corresponds to a set of instructions for performing a function described above. The above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, modules or data structures, and thus various subsets of these modules may be combined or otherwise re-arranged in various implementations. In some implementations, memory 306, optionally, stores a subset of the modules and data structures identified above. Furthermore, memory 306, optionally, stores additional modules and data structures not described above.
In some embodiments, at least some of the functions of the client-side module 102 are performed by server-side module 106, and the corresponding sub-modules of these functions may be located within the server-side module 106 rather than the client-side module 102. In some embodiments, at least some of the functions of the server-side module 106 are performed by client-side module 102, and the corresponding sub-modules of these functions may be located within the client-side module 102 rather than the server-side module 106. For example, in some embodiments, edge detection module 224, analyzing module 232, storing module 234, and interface mapping module 236 may be implemented at least in part on client-side module 102. Server system 108 and client device 104 shown in Figures 2-3, respectively, are merely illustrative, and different configurations of the modules for implementing the functions described herein are possible in various embodiments.
Figure 4 is a block diagram of a test history database 114 in accordance with some embodiments. In some embodiments, server system 108 manages and operates a software testing application that tests the reliability of software programs such as applications and/or websites. For example, the software testing application runs automated tests to seek out bugs, errors, malfunctions, and/or abnormalities with software programs. In some embodiments, server system 108 maintains test history database 114 so as to record previously run tests and their results. In some embodiments, test history database 114 is stored local to server system 108. In some embodiments, test history database 114 is stored remotely from server system 108 (e.g. , by one of external services 122) ; however, server system 108 has access to test history database 114.
In Figure 4, test history database 114 includes entries for each of a plurality of test sequences. For example, each of the test sequences corresponds to a testing request sent to a client device 104. In some embodiments, a test sequence includes a single test with a predefined operation to be performed at a specific testing location within an interface of a software program. In some embodiments, a test sequence includes two or more tests with one or more predefined operations to be performed at each of a two or more of testing locations within the interface of the software program.
In Figure 4, an entry for a respective test sequence in test history database 114 includes: (a) a test sequence identifier 402-A (e.g. , a unique number) for the respective test sequence; (b) an image 404-A corresponding to a first interface of a software program executed on the client device 104 (e.g. , a screenshot of a home interface for an application) ; (c) edge information 406-A corresponding to the result of an edge detection algorithm run on image 404-A; (d) website/application 408-A including the name of or identifier for the software program corresponding to image 404-A; and (e) test results 410-A for the respective test sequence run by the client device 104 on the software program corresponding to image 404-A.
In Figure 4, test results 410-A are linked to a plurality of sub-entries for each of the tests corresponding to test sequence identifier 402-A performed by the client device 104. A respective sub-entry for a first test in the respective test sequence corresponding to test sequence identifier 402-A includes: (i) a test identifier 420-A (e.g., a unique number) corresponding to the first test; (ii) operation 422-A corresponding to the predefined operation for the first test (e.g., tap, click, long-press, swipe, or the like) ; (iii) position information 424-A corresponding to the testing location within the first interface at which operation 422-A is performed; (iv) test result 426-A corresponding to the result of the first test (e.g., a screenshot of the interface displayed by the software program after performing operation 422-A at a location corresponding to position information 424-A) ; and (v) analysis 428-A corresponding to a determination by server system 108 whether the first interface of the software program includes an error or abnormality based on the first test.
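The entry and sub-entry structure described above can be sketched as a simple data model. The disclosure itself specifies no code; the field names below are illustrative only:

```python
from dataclasses import dataclass, field

@dataclass
class Test:
    """One sub-entry under test results 410 (cf. items (i)-(v) above)."""
    test_id: str            # e.g. "420-A"
    operation: str          # e.g. "tap", "click", "long-press", "swipe"
    position: tuple         # testing location (x, y) within the interface
    result: str = ""        # e.g. path to the resulting screenshot
    analysis: str = ""      # server's error/abnormality determination

@dataclass
class TestSequenceEntry:
    """One entry in the test history database (cf. items (a)-(e) above)."""
    sequence_id: str        # test sequence identifier
    image: str              # screenshot of the first interface
    edge_info: str          # result of the edge-detection pass on the image
    application: str        # name/identifier of the tested software program
    tests: list = field(default_factory=list)

entry = TestSequenceEntry(sequence_id="402-A", image="home.png",
                          edge_info="edges.png", application="中国象棋")
entry.tests.append(Test(test_id="420-A", operation="tap", position=(120, 340)))
```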
Figure 5 is a block diagram of a portion of an interface hierarchy library 116 in accordance with some embodiments. Figure 5 shows an interface hierarchy for a respective software program tested by the software testing application. In some embodiments, the software testing application is managed and operated by server system 108, and the software testing application includes a server-side portion (e.g., server-side module 106, Figures 1-2) and a client-side portion (e.g., client-side module 102, Figures 1 and 3) . For example, server-side module 106 sends testing requests to client device 104 to perform tests on software programs executed by the client device 104 with client-side module 102.
For example, a testing entity sends an instruction via server-side module 106 to client-side module 102 to execute the respective software program (e.g. , one of application (s) 326, Figure 3) and also to capture and return a screenshot of the home interface for the respective software program. In another example, a testing entity sends the screenshot of the home interface for the respective software program to server-side module 106 via client-side module 102.
In Figure 5, interface 502 is the screenshot of the home interface for the respective software application. Continuing with the examples above, after receiving interface 502, server-side module 106 performs edge detection on the screenshot of interface 502 to obtain edge information corresponding to the edges of interface 502. Based on the edge information, server-side module 106 selects one or more testing locations in interface 502 and also determines one or more predefined operations (sometimes also herein called “test operations” ) to be performed at each of the one or more selected testing locations. Subsequently, server-side module 106 sends a testing request to client-side module 102 to perform the one or more determined operations at the corresponding one or more selected testing locations. In some embodiments, the testing request is a test sequence with one or more tests, where each test includes a predefined operation and a corresponding testing location. For example, interface 502 corresponds to image 404-A (Figure 4) , the testing request corresponds to test sequence identifier 402-A (Figure 4) , and each of the tests in the testing request corresponds to a test identifier 420 (Figure 4) .
In response to receiving the testing request, client-side module 102 performs each of the tests included in the testing request. For example, for a respective test in the testing request, client-side module 102 captures a screenshot of the interface displayed by the respective software program in response to performing a respective predefined operation at a respective testing location. After performing the tests included in the testing request, client-side module 102 sends the testing results (i.e., subsequent images or screenshots of the interface displayed by the respective software program in response to performing the tests included in the testing request) to server-side module 106 in serial order or as a batch. For example, the received test results are associated with a corresponding test identifier and stored in test history database 114 (e.g., a first test in the testing request is associated with test identifier 420-A (Figure 4) and its corresponding test result is associated with test result 426-A (Figure 4)) .
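The client-side flow just described — perform each test, capture the resulting screenshot, and report results serially or as a batch — can be sketched as follows. The callable parameters stand in for the real device operations and are assumptions for illustration, not part of the disclosure:

```python
def run_test_sequence(tests, perform_op, screenshot, send, batch=False):
    """Perform each test in a testing request, capture the resulting
    screenshot, and send results either one by one or as a batch.

    perform_op(operation, x, y) -- executes the predefined operation
    screenshot()                -- captures the currently displayed interface
    send(payload)               -- transmits results to the server side
    """
    results = []
    for test in tests:
        perform_op(test["operation"], test["x"], test["y"])
        results.append({"test_id": test["test_id"], "image": screenshot()})
        if not batch:
            send(results[-1])   # serial order: report each result immediately
    if batch:
        send(results)           # batch: report all results at once
    return results
```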
In response to receiving the testing results, server-side module 106 determines whether the subsequent images are interfaces that are different from the home interface. For example, if a respective test includes a predefined operation (e.g., a tap gesture) performed at a testing location corresponding to a button that opens a different interface, the home interface and the subsequent image of the interface following the respective test will differ. In accordance with a determination that the subsequent images include an interface different from the home interface, server-side module 106 links the subsequent images to the home interface in the interface hierarchy in order to determine the interface hierarchy of the respective software program. As such, the software testing application is able to track the interface hierarchy of the respective software program and drill down to subsequent interfaces of the respective software program so as to test those subsequent interfaces. In Figure 5, interface 502 is the home interface and interfaces 504, 506, and 508 are interfaces different from interface 502 following the testing request. For example, interfaces 504, 506, and 508 correspond to three tests performed at different testing locations (e.g., the tests included in the testing request) in interface 502.
In some embodiments, a test includes a sequence of testing locations so as to drill down the interface hierarchy and test interfaces beyond the home interface. For example, a respective testing request includes three two-part tests where the first part of each of the three tests includes a respective operation to be performed at a respective location in interface 502 so as to drill down to interface 504. However, the second part of a first one of the three tests includes a respective operation to be performed at a first location in interface 504, the second part of a second one of the three tests includes a respective operation to be performed at a second location in interface 504, and the second part of a third one of the three tests includes a respective operation to be performed at a third location in interface 504. In this example, the resulting interfaces displayed by the respective software program are interfaces 510, 512, and 514, respectively, which are all different from interface 504.
In some embodiments, an operation performed at a testing location in an interface may lead back to an interface that is higher up in the interface hierarchy; however, this interface is accounted for in the interface hierarchy as a new interface. For example, interface 516 is the same as interface 502, but interface 516 is considered a new interface in the interface hierarchy because interface 516 was displayed by the respective software program as a result of drilling down to interface 508 from interface 502 and performing an operation at a testing location in interface 506.
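One way to realize the linking behavior described above, including treating a revisited screen such as interface 516 as a new node, is a parent-child table keyed by node identifier. This is a minimal sketch under the assumption that two interfaces are "the same" only when compared against the immediate parent node; the class and method names are illustrative:

```python
class InterfaceHierarchy:
    """Track interfaces as nodes: every distinct screenshot reached by a
    test becomes a child of the interface the test was run on, even if it
    is identical to an interface seen higher up (cf. interface 516)."""

    def __init__(self, home_image):
        self.nodes = [home_image]   # node id -> screenshot of that interface
        self.children = {0: []}     # node id -> list of child node ids

    def record_test_result(self, parent_id, result_image):
        """Link the post-test screenshot under its parent interface.

        Returns the new node id, or None if the test left the software
        program on the same interface it started on."""
        if result_image == self.nodes[parent_id]:
            return None             # same interface: nothing new to link
        node_id = len(self.nodes)
        self.nodes.append(result_image)
        self.children[parent_id].append(node_id)
        self.children[node_id] = []
        return node_id
```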
Figure 6A is a screenshot captured by a client device 104 of an interface for a software program executed by the client device 104 in accordance with some embodiments. For example, the software program is a respective application of application (s) 326 stored by client device 104. Continuing with this example, the interface is a home interface for the respective application. For example, the screenshot in Figure 6A corresponds to image 404-A in test history database 114 (Figure 4) .
Figure 6B illustrates a visual representation of edge information for the interface in Figure 6A in accordance with some embodiments. For example, the visual representation in Figure 6B is the result of running an edge detection algorithm on the interface in Figure 6A. In some embodiments, edge information 406-A in test history database 114 (Figure 4) includes edge information corresponding to the result of running an edge detection algorithm on the interface in Figure 6A such as the visual representation in Figure 6B or other related information. For example, the edge information is used by server-side module 106 or component (s) thereof (e.g. , identifying module 226, Figure 2 and selecting module 228, Figure 2) to identify clusters and/or predefined shapes in the screenshot in Figure 6A of the interface for the software program based on the edge information and to select one or more testing locations in the interface in Figure 6A based on the edge information.
Figures 7A-7B illustrate a flowchart diagram of a method 700 of testing software reliability in accordance with some embodiments. In some embodiments, method 700 is performed by a server with one or more processors. For example, in some embodiments, method 700 is performed by server system 108 (Figures 1-2) or a component thereof (e.g., server-side module 106, Figures 1-2) . In some embodiments, method 700 is governed by instructions that are stored in a non-transitory computer readable storage medium of the server and the instructions are executed by one or more processors of the server. Optional operations are indicated by dashed lines (e.g., boxes with dashed-line borders) .
In some embodiments, server system 108, through interactions with client device 104 (e.g. , a mobile phone or a mobile terminal) , tests the reliability of a software program (e.g. , an application) running on client device 104. For example, server system 108 and client device 104 may be connected via a wired data connection or via a wireless data connection.
In some embodiments, the server sends an instruction to a mobile terminal for acquiring a screenshot of a software program running on the mobile terminal. After receiving the instruction, the mobile terminal captures an image of the software program currently running and sends the image to server through the data connection. For example, the image corresponds to the screenshot in Figure 6A where the screenshot corresponds to a home interface of an application executed by the mobile terminal.
The server acquires (702) an image of a software program running on a mobile terminal and displayed on a touch screen of the mobile terminal.
The server obtains (704) edge information indicating edges of the image based on an edge detection process performed on the image. In some embodiments, after receiving the image in operation 702, the server calls a relevant image edge detection algorithm to perform edge detection on the received image. For example, Figure 6B includes a visual representation of edge information for the screenshot in Figure 6A. For example, the image edge detection algorithm may be, but is not limited to, the Canny image edge detection algorithm.
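The Canny detector named above is one standard choice. Purely to illustrate what edge information looks like, the following is a minimal gradient-threshold sketch (deliberately simpler than Canny) over a grayscale image stored as nested lists; all names and the threshold value are illustrative:

```python
def edge_map(img, threshold=50):
    """Return a binary edge map: a pixel is marked as an edge when the
    intensity difference to its right or lower neighbor exceeds the
    threshold (a crude stand-in for a real edge detector such as Canny)."""
    h, w = len(img), len(img[0])
    edges = [[0] * w for _ in range(h)]
    for y in range(h - 1):
        for x in range(w - 1):
            gx = abs(img[y][x + 1] - img[y][x])   # horizontal gradient
            gy = abs(img[y + 1][x] - img[y][x])   # vertical gradient
            if max(gx, gy) > threshold:
                edges[y][x] = 1
    return edges

# A 4x4 grayscale image with a bright 2x2 block in the lower-right corner;
# edges appear at the transitions into the bright region.
img = [
    [0, 0,   0,   0],
    [0, 0,   0,   0],
    [0, 0, 255, 255],
    [0, 0, 255, 255],
]
edges = edge_map(img)
```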
The server acquires (706) position information for a detection spot used for testing the reliability of the software program according to the edge information. In some embodiments, the position information may be, but is not limited to, a coordinate of a point in the touch screen region of the mobile terminal.
In some embodiments, the server acquires the position information by (708) : selecting one or more points from the edge information at random; and acquiring position information for the one or more points, where respective position information corresponds to a detection spot for testing the reliability of the software program. In this embodiment, a point on an edge in the interface is selected as the detection spot to be tested, thereby improving test efficiency.
In some embodiments, the server acquires the position information by (710) : acquiring one or more points according to a predetermined spacing from a respective edge of the image based on the edge information; and acquiring position information for the one or more points, where respective position information corresponds to a detection spot for testing the reliability of the software program. In this embodiment, a point near an edge in the interface is selected as the detection spot, thereby improving test efficiency.
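The two selection strategies of operations 708 and 710 — random points on edges, and points at a predetermined spacing from an edge — might be sketched as follows, assuming the edge information is a binary edge map; the function names are illustrative:

```python
import random

def random_edge_points(edges, n, seed=None):
    """Strategy of operation 708: pick n detection spots uniformly at
    random from the pixels marked as edges."""
    rng = random.Random(seed)
    candidates = [(x, y) for y, row in enumerate(edges)
                  for x, v in enumerate(row) if v]
    return rng.sample(candidates, min(n, len(candidates)))

def offset_edge_points(edges, spacing):
    """Strategy of operation 710: pick spots a predetermined spacing to
    the right of each edge pixel, keeping only spots inside the image."""
    w = len(edges[0])
    return [(x + spacing, y) for y, row in enumerate(edges)
            for x, v in enumerate(row) if v and x + spacing < w]
```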
The server sends (712) a detection request to the mobile terminal, where the detection request includes position information for one or more detection spots and touch screen operation instruction information for each of the one or more detection spots. In some embodiments, the touch screen operation instruction information comprises, but is not limited to, at least one of the following: click, dragging, long-press, or the like.
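A detection request pairing position information with touch screen operation instruction information could be serialized, for example, as JSON; the field names below are assumptions for illustration only:

```python
import json

def build_detection_request(spots_and_ops):
    """Assemble a detection request: each entry pairs a detection spot's
    (x, y) position information with a touch screen operation instruction
    (e.g. "click", "dragging", "long-press")."""
    return json.dumps({
        "detection_spots": [
            {"x": x, "y": y, "operation": op}
            for (x, y), op in spots_and_ops
        ]
    })
```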
In some embodiments, after receiving the detection request, the mobile terminal performs an operation indicated by the touch screen operation instruction information at each of the one or more detection spots corresponding to the position information. Subsequently, the mobile terminal returns the detection result obtained through the operation (s) to the server.
An example procedure of performing the operation will be described with reference to Figures 6A-6B. For example, based on the edges shown in Figure 6B, the server selects a coordinate of the character “中” of “中国象棋” as the position information of the detection spot, and sets “click” as the operation in the corresponding touch screen operation instruction information. Then, the server sends the coordinate of the character “中” on the touch screen of the mobile terminal and the touch screen operation instruction information indicating the “click” operation to the mobile terminal. After receiving the coordinate and the touch screen operation instruction information, the mobile terminal performs the “click” operation at the coordinate, and returns the result corresponding to the operation to the server as a detection result.
As will be appreciated by one of skill in the art, selecting the coordinate of the character “中” of “中国象棋” as the position information of the detection spot is only an example. In order to improve detection precision, in another example, a coordinate of a stroke of the character “中” may also be selected as the position information of the detection spot. Furthermore, in another example, a coordinate of a point of a stroke of the character “中” may even be selected as the position information of the detection spot. The embodiments are not limited in this aspect.
In response to the detection request, the server receives (714) a detection result from the mobile terminal, where the detection result is obtained by the mobile terminal which performs an operation indicated by the touch screen operation instruction information at a position indicated by the position information of a respective detection spot of the one or more detection spots.
In some embodiments, the server successively receives (716) detection results from the mobile terminal which performs operations indicated by the touch screen operation instruction information at the positions indicated by the position information of one or more detection spots. For example, the mobile terminal immediately returns a detection result to the server after performing a corresponding operation, thereby enabling the server to acquire the detection result in real time.
In some embodiments, the server receives (718) a test response from the mobile terminal, wherein the test response comprises detection results for operations indicated by the touch screen operation instruction information on positions indicated by the position information of each detection spot of the one or more detection spots. For example, the mobile terminal returns a batched set of detection results to the server that corresponds to a plurality of operations performed by the mobile terminal, thereby enabling the server to acquire detection results in batches.
In some embodiments, after receiving the detection results, the server (720): scans the detection results for a set of predetermined key words and, in accordance with a determination that one or more predetermined key words in the set of predetermined key words are included in the scanned detection results, identifies an error with the software. For example, the server records the detection results in the form of a log in a memory (e.g., test history database 114, Figures 1-2 and 4). Continuing with this example, the server searches the log for key words (e.g., “abnormality”) and determines whether the detection results include one or more of the key words. In accordance with a determination that the search results include at least one of the key words, the server identifies an error, abnormality, or reliability issue with the software program.
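The key-word scan of operation 720 can be sketched as a simple substring search over log lines; the log representation and the particular key-word set below are illustrative assumptions, not values fixed by this disclosure:

```python
def scan_log(log_lines, key_words):
    """Return the log lines that contain any predetermined key word."""
    return [line for line in log_lines
            if any(k in line.lower() for k in key_words)]

KEY_WORDS = {"abnormality", "error", "crash"}  # assumed key-word set
log = ["click at (120, 48): ok",
       "drag at (30, 200): Abnormality detected"]
hits = scan_log(log, KEY_WORDS)
print(hits)  # only the second line matches
```

A non-empty result would correspond to the determination that the detection results include at least one predetermined key word.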
It should be understood that the particular order in which the operations in Figures 7A-7B have been described is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art would recognize various ways to reorder the operations described herein. Additionally, it should be noted that details of other processes described herein with respect to other methods described herein (e.g. , method 800) are also applicable in an analogous manner to method 700 described above with respect to Figures 7A-7B.
Figures 8A-8C illustrate a flowchart diagram of a method 800 of testing software reliability in accordance with some embodiments. In some embodiments, method 800 is performed by a server with one or more processors. For example, in some embodiments, method 800 is performed by server system 108 (Figures 1-2) or a component thereof (e.g., server-side module 106, Figures 1-2). In some embodiments, method 800 is governed by instructions that are stored in a non-transitory computer readable storage medium of the server and the instructions are executed by one or more processors of the server. Optional operations are indicated by dashed lines (e.g., boxes with dashed-line borders).
The server obtains (802), from a client device, a first image of a first interface of a software program executed on the client device. In some embodiments, server system 108 or a component thereof (e.g., obtaining module 222, Figure 2) obtains the first image from a respective client device 104 in response to an instruction sent by server system 108. For example, the first image is a screenshot, and the first interface is a home interface of an application. Figure 6A, for example, is a screenshot of an interface for a software program executed by the respective client device 104 that is sent by the respective client device 104 to server system 108. In this example, the software program is a respective application of application(s) 326 stored by the respective client device 104, and the interface is a home interface for the respective application.
The server performs (804) edge detection on the first image to obtain edge information for edges of the first interface in the first image. In some embodiments, server system 108 or a component thereof (e.g. , edge detection module 224, Figure 2) performs edge  detection on the first image to obtain edge information for edges of the first interface in the first image. For example, edge detection module 224 runs an edge detection algorithm on the first image that detects all edges of user interface components, images, text, and the like in the first interface. Figure 6B, for example, includes a visual representation of edge information for the screenshot in Figure 6A.
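As a minimal sketch of operation 804, an edge map can be derived from intensity differences between neighboring pixels; the toy gradient threshold below stands in for the edge detection algorithm run by edge detection module 224, whose actual algorithm is not specified here:

```python
def edge_map(img, threshold=1):
    """Mark a pixel as an edge when its horizontal or vertical
    intensity difference exceeds `threshold`."""
    h, w = len(img), len(img[0])
    edges = set()
    for y in range(h):
        for x in range(w):
            gx = abs(img[y][x] - img[y][x - 1]) if x > 0 else 0
            gy = abs(img[y][x] - img[y - 1][x]) if y > 0 else 0
            if max(gx, gy) > threshold:
                edges.add((x, y))
    return edges

# A 4x4 "screenshot": a dark button (0) on a light background (9).
img = [[9, 9, 9, 9],
       [9, 0, 0, 9],
       [9, 0, 0, 9],
       [9, 9, 9, 9]]
edges = edge_map(img)
print(sorted(edges))
```

The resulting set of (x, y) coordinates corresponds to the edge information used by the subsequent selection operations.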
The server selects (806) one or more testing locations in the first image based on the edge information. In some embodiments, server system 108 or a component thereof (e.g., selecting module 228, Figure 2) selects one or more testing locations in the first image based on the edge information. In some embodiments, the first interface includes white spaces that can be invoked to perform various functions; accordingly, in some embodiments, some testing locations are based on defined white spaces of significant size, and the edge information includes the location and size of such white spaces in the first interface.
In some embodiments, the server selects the one or more testing locations in the first image based on the edge information by (808): identifying at least one cluster of edges based on the edge information that satisfies one or more testing parameters; and selecting a location within a predetermined distance of a representative point of the at least one cluster as the testing location. In some embodiments, server system 108 or a component thereof (e.g., identifying module 226, Figure 2) identifies a cluster based on the edge information that satisfies the one or more testing parameters. In some embodiments, the testing parameters indicate that a cluster is required to have a preset number of edges within a preset radius (e.g., X edges within a radius of N pixels or M millimeters). In some embodiments, the representative point is the center of a cluster of edges. In some embodiments, the representative point is a point on an edge of the cluster (e.g., a point on an edge of a cluster including the four edges of a button). In some embodiments, the representative point is selected based on the characteristics of the cluster in question. For example, for a cluster with well-defined boundaries (e.g., a cluster indicating a button or affordance of the first interface), the representative point is a random point on the boundary edge of the cluster. In another example, for a cluster indicating an undefined shape, the representative point is the center of mass or weighted center of the undefined shape. Alternatively, in some embodiments, the one or more testing locations are randomly selected locations on or near an edge in the interface.
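The cluster criterion of operation 808 ("X edges within a radius of N pixels") can be sketched as follows; the neighborhood test and the centroid as representative point are illustrative choices among the alternatives described above:

```python
import math

def cluster_representative(edge_points, radius, min_edges):
    """Return the centroid of the first neighborhood that holds at least
    `min_edges` edge points, or None when no cluster qualifies."""
    for p in edge_points:
        near = [q for q in edge_points if math.dist(p, q) <= radius]
        if len(near) >= min_edges:
            return (sum(x for x, _ in near) / len(near),
                    sum(y for _, y in near) / len(near))
    return None

# Four corners of a 10x10 "button" plus one far-away stray edge point.
points = [(0, 0), (10, 0), (0, 10), (10, 10), (300, 300)]
rep = cluster_representative(points, radius=15, min_edges=4)
print(rep)  # centroid of the button corners: (5.0, 5.0)
```

A testing location would then be selected within a predetermined distance of the returned representative point.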
In some embodiments, the server selects the one or more testing locations in the first image based on the edge information by (810): identifying at least one predefined shape based on the edge information that satisfies one or more testing parameters; and selecting a location within a predetermined distance of a center of the at least one predefined shape as the testing location. In some embodiments, server system 108 or a component thereof (e.g., identifying module 226, Figure 2) identifies a predefined shape based on the edge information that satisfies the one or more testing parameters. For example, the testing parameters indicate that a cluster is required to have a predefined shape with edges that are parallel to the edges of the screen such as a rectangle or square affordance. In another example, the testing parameters indicate that a cluster is required to have a predefined shape with curved edges such as a circle or oval. In some embodiments, the server selects a location that is within the predefined shape (e.g., the center of mass or weighted center of the predefined shape) or close to the perimeter of the predefined shape (e.g., on the outside of the predefined shape).
In some embodiments, prior to sending the testing request, the server identifies (812) position information for the one or more selected testing locations. In some embodiments, server system 108 or a component thereof (e.g. , position information module 229, Figure 2) identifies position information for the one or more testing locations selected in operations 806-810. For example, respective position information includes the pixel coordinates of a selected testing location.
The server sends (814) a testing request to the client device to perform one or more predefined operations at each of the one or more selected testing locations. In some embodiments, server system 108 or a component thereof (e.g. , requesting module 230, Figure 2) sends a testing request to the client device 104 to perform one or more predefined operations at each of the one or more selected testing locations. In some embodiments, a respective predefined operation is one of a tap gesture, long-press gesture, swipe gesture, dragging gesture, or the like. In some embodiments, a respective predefined operation also includes selection of a hardware button in conjunction with some user interface component such as depressing a home button in concert with a long-press or tap on a touch screen affordance. In some embodiments, at any time during the testing, one or more operations on the hardware buttons can be introduced between any two consecutive predefined operations at the different testing locations.
In some embodiments, the one or more predefined operations selected for a particular testing location are based on the shape and/or distribution of the edges associated with the testing location. For example, a cluster of edges that indicates the presence of a long scroll bar may give rise to a testing location for a tap gesture, a swipe gesture, or a dragging gesture. In another example, a cluster of edges that indicates the presence of a button may give rise to a testing location for a tap gesture only.
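The mapping from an edge-cluster classification to candidate gestures can be sketched as a lookup table; the classification labels and the fallback gestures below are assumptions for illustration:

```python
def choose_operations(cluster_kind):
    """Map an edge-cluster classification to candidate test gestures."""
    table = {
        "button": ["tap"],
        "scroll_bar": ["tap", "swipe", "drag"],
    }
    # Unclassified clusters fall back to generic gestures (an assumption).
    return table.get(cluster_kind, ["tap", "long_press"])

print(choose_operations("scroll_bar"))  # ['tap', 'swipe', 'drag']
print(choose_operations("button"))      # ['tap']
```

In practice the table could also encode hardware-button combinations, such as a home-button press in concert with a tap.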
In some embodiments, the testing request includes (816) the position information for the one or more selected testing locations. In some embodiments, the testing request includes the position information identified for the one or more testing locations by position information module 229.
In response to sending the testing request, the server obtains (818) , from the client device, test results for the one or more predefined operations performed at each of the one or more selected testing locations. In some embodiments, server system 108 or a component thereof (e.g. , obtaining module 222, Figure 2) obtains test results from the client device 104 in response to sending the testing request. In some embodiments, server system 108 or a component thereof (e.g. , storing module 234, Figure 2) stores the test results and the corresponding testing locations and test operations in test history database 114 (Figures 1-2 and 4) in association with the testing request. For example, in Figure 4, each of the test sequences is associated with a test sequence identifier 402, and each test sequence corresponds to a testing request sent to a client device 104. Continuing with this example, in Figure 4, test results 410 for the test sequences are linked to a plurality of sub-entries for each of the tests corresponding to test sequence identifier 402-A performed by the client device 104. In this example, a sub-entry for a respective test includes a test identifier 420, a test operation 422, position information 424 corresponding to a testing location, and test results 426 for the test performed at the client device (e.g. , a screenshot of the interface resulting from the test operation 422 performed at the testing location corresponding to the position information 424) .
In some embodiments, the test results are screenshots taken after performing the predefined operation at the selected testing location. For example, in response to performing the predefined operation at the selected testing location, a subsequent interface is displayed or the interface changes. In some embodiments, if the test results are not received within Z seconds, the server determines that an error has occurred with respect to a testing location. In some embodiments, if a screenshot of the home screen is returned after a test operation, the server may determine that the application has crashed after the test operation. In some embodiments, if the same screenshot has been returned after Y test operations (where Y is large), the server may determine that the application is frozen.
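The three heuristics above (timeout, crash to home screen, frozen interface) can be sketched together; the field names and the concrete values of Z and Y are assumptions, as the disclosure leaves them unspecified:

```python
Z_SECONDS = 10   # assumed response timeout
Y_REPEATS = 50   # assumed freeze threshold

def classify_result(elapsed, screenshot, repeat_count):
    """Apply the crash/freeze/timeout heuristics to one test result."""
    if elapsed > Z_SECONDS:
        return "error: no response"
    if screenshot == "home_screen":
        return "error: application crashed"
    if repeat_count >= Y_REPEATS:
        return "error: application frozen"
    return "ok"

print(classify_result(2, "settings_screen", 1))  # ok
print(classify_result(2, "home_screen", 1))      # crash detected
```

Each classification other than "ok" would be recorded against the corresponding testing location.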
In some embodiments, the testing request identifies two or more testing locations, and the server obtains, from the client device, test results by (820): receiving first test results for a first predefined operation performed at a first testing location of the two or more testing locations; and, after receiving the first test results, receiving second test results for a second predefined operation performed at a second testing location of the two or more testing locations. In some embodiments, the test results are received in real-time, serial/successive order from the client device. For example, the real-time testing results are received by the server when the testing request includes one testing location and one predefined operation. For example, the server sends a testing request including a first testing location and corresponding test operation to the device (e.g., client device 104, Figures 1 and 3) running the software program to be tested, the device performs the requested test and sends back the test result to the server. Continuing with this example, the server sends a second testing request including a second testing location and corresponding test operation to the device, and the device performs the second requested test and sends back a second test result, and so forth.
In some embodiments, the testing request identifies two or more testing locations, and the server obtains, from the client device, test results by (822): receiving batched test results at least including first test results for a first predefined operation performed at a first testing location of the two or more testing locations and second test results for a second predefined operation performed at a second testing location of the two or more testing locations. In some embodiments, the testing request includes a command to return to the first interface associated with the first image after performing a predefined operation at a testing location. For example, with respect to the batch testing case, the device (e.g., client device 104, Figures 1 and 3) running the software program to be tested receives a testing request which may include a series of testing locations and corresponding test operations. Continuing with this example, the device performs each of the test operations at the specified testing locations in sequence, and obtains a screenshot (test result) after each of the test operations, and, after all of the test operations are performed, the device sends all of the screenshots (with the sequence data) in a bundle back to the server.
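The batch-testing flow described above can be sketched from the device side as follows; `perform` stands in for the device actually executing a gesture and capturing a screenshot, and the record fields are illustrative assumptions:

```python
def run_batch(tests, perform):
    """Perform each (location, operation) pair in sequence and bundle
    the screenshots with sequence data, as in the batch-testing case."""
    return [{"seq": i, "location": loc, "operation": op,
             "screenshot": perform(loc, op)}
            for i, (loc, op) in enumerate(tests)]

# Stand-in for the client device: "perform" just names the screenshot.
def fake_perform(loc, op):
    return f"shot_{op}_at_{loc[0]}_{loc[1]}"

batch = run_batch([((120, 48), "tap"), ((30, 200), "swipe")], fake_perform)
print([r["screenshot"] for r in batch])
```

The returned list corresponds to the bundle of screenshots (with sequence data) that the device sends back to the server after all test operations are performed.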
In some embodiments, the test results for the one or more predefined operations performed at each of the one or more selected testing locations include one or more subsequent images of the software program executed on the client device. For a respective image of the one or more subsequent images, the server (824): performs character recognition on the respective image to obtain text of the respective image; and, in accordance with a determination that the text of the respective image includes one or more words in a predefined set of words, records the testing location corresponding to the respective image. In some embodiments, after obtaining the test results, server system 108 or a component thereof (e.g., analyzing module 232, Figure 2) analyzes the test results (i.e., the subsequent images) by performing character recognition on the subsequent images and searching the character recognition results for words in a predefined set of words. For example, after performing the predefined operation at the testing location, an error/malfunction with the software program causes an error message to appear. Thus, the server is looking for words like “error,” “abnormality,” “crash,” or the like. Accordingly, the predefined set of words includes words indicating occurrence of an error or abnormal execution of the operation.
In some embodiments, the test results for the one or more predefined operations performed at each of the one or more selected testing locations include one or more subsequent images of the software program executed on the client device. For a respective image of the one or more subsequent images, the server (826) : records the testing location corresponding to the respective image in accordance with a determination that the respective image includes a second interface different from the first interface in the first image; and associates the testing location corresponding to the respective image with the second interface in an interface hierarchy for the software program executed on the client device. In some embodiments, in response to performing a test operation at a testing location of the first interface, a second interface is displayed at the client device. As such, a hierarchy/map of interfaces for the software program can be built and the various interfaces in the hierarchy can be individually tested for errors/malfunctions. In some embodiments, after obtaining the test results, server system 108 or a component thereof (e.g. , interface mapping module 236, Figure 2) generates a hierarchy of the interfaces for the software program executed on the client device 104 and stores the hierarchy in interface hierarchy library 116. For example, the first interface is a home interface for the software program. In some embodiments, a test sequence with testing locations at two or more interfaces in the hierarchy is used to drill  down to a respective interface in the hierarchy from the first interface so as to test the respective interface and get back to the respective interface.
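The interface hierarchy of operation 826 can be sketched as a mapping from a parent interface to the testing locations that lead away from it; a plain dictionary stands in for interface hierarchy library 116, and the interface names are hypothetical:

```python
def link_interface(hierarchy, parent, testing_location, new_interface):
    """Record that `testing_location` on `parent` leads to `new_interface`."""
    hierarchy.setdefault(parent, []).append(
        {"location": testing_location, "interface": new_interface})
    return hierarchy

h = {}
link_interface(h, "home", (120, 48), "settings")
link_interface(h, "settings", (40, 300), "about")
print(h["home"][0]["interface"])  # settings
```

Walking such a map from the home interface would allow a test sequence to drill down to any recorded interface and test it individually.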
It should be understood that the particular order in which the operations in Figures 8A-8C have been described is merely exemplary and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art would recognize various ways to reorder the operations described herein. Additionally, it should be noted that details of other processes described herein with respect to other methods described herein (e.g. , method 700) are also applicable in an analogous manner to method 800 described above with respect to Figures 8A-8C.
Figure 9 is a block diagram of an apparatus for testing software reliability in accordance with some embodiments. In some embodiments, the apparatus implements method 700 in Figures 7A-7B. For example, the apparatus corresponds to server system 108 (Figures 1-2) or a component thereof (e.g., server-side module 106, Figures 1-2). In some embodiments, the apparatus may be implemented in whole or in part on a device (e.g., server system 108, Figures 1-2) through software, hardware, or a combination thereof. In some embodiments, the apparatus includes: a first acquiring module 902; an edge detection module 904; a second acquiring module 906; a first transmission module 908; and a second transmission module 910.
In some embodiments, first acquiring module 902 is configured to acquire an image of a software program running on a mobile terminal.
In some embodiments, edge detection module 904 is configured to obtain edge information indicating edges of the image based on an edge detection process performed on the image acquired by first acquiring module 902.
In some embodiments, second acquiring module 906 is configured to acquire position information for a detection spot used for testing the reliability of the software program according to the edge information. In some embodiments, the position information may be, but is not limited to, a coordinate of a point in the touch screen region of the mobile terminal.
In some embodiments, second acquiring module 906 includes: a first selecting sub-unit 922; and a first acquiring sub-unit 924.
In some embodiments, first selecting sub-unit 922 is configured to select one or more points from the edge information at random.
In some embodiments, first acquiring sub-unit 924 is configured to acquire position information for the one or more points, where respective position information corresponds to a detection spot for testing the reliability of the software program.
In some embodiments, second acquiring module 906 includes: a second acquiring sub-unit 926; and a third acquiring sub-unit 928.
In some embodiments, second acquiring sub-unit 926 is configured to acquire one or more points according to a predetermined spacing from a respective edge of the image based on the edge information.
In some embodiments, third acquiring sub-unit 928 is configured to acquire position information for the one or more points, where respective position information corresponds to a detection spot for testing the reliability of the software program.
In some embodiments, first transmission module 908 is configured to send a detection request to the mobile terminal. In some embodiments, the detection request includes position information for one or more detection spots and touch screen operation instruction information for each of the one or more detection spots. In some embodiments, the touch screen operation instruction information includes, but is not limited to, at least one of the following: a click, a drag, or a long-press.
In some embodiments, second transmission module 910 is configured to receive a detection result from the mobile terminal. In some embodiments, the detection result is obtained by the mobile terminal which performs an operation indicated by the touch screen operation instruction information at a position indicated by the position information of a respective detection spot of the one or more detection spots. In some embodiments, second transmission module 910 is configured to successively receive detection results from the mobile terminal which performs operations indicated by the touch screen operation instruction information at the positions indicated by the position information of one or more detection spots. In some embodiments, second transmission module 910 is configured to receive a test response from the mobile terminal, wherein the test response comprises detection results for operations indicated by the touch screen operation instruction information on positions indicated by the position information of each detection spot of the one or more detection spots.
In some embodiments, the apparatus further includes: a scanning module 912; and a determining module 914.
In some embodiments, scanning module 912 is configured to scan the detection results for a set of predetermined key words.
In some embodiments, determining module 914 is configured to identify an error with the software in accordance with a determination that one or more predetermined key words in the set of predetermined key words are included in the scanned detection results.
While particular embodiments are described above, it will be understood it is not intended to limit the application to these particular embodiments. On the contrary, the application includes alternatives, modifications and equivalents that are within the spirit and scope of the appended claims. Numerous specific details are set forth in order to provide a thorough understanding of the subject matter presented herein. But it will be apparent to one of ordinary skill in the art that the subject matter may be practiced without these specific details. In other instances, well-known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.

Claims (20)

  1. A method of testing software reliability, comprising:
    at a server with one or more processors and memory:
    obtaining, from a client device, a first image of a first interface of a software program executed on the client device;
    performing edge detection on the first image to obtain edge information for edges of the first interface in the first image;
    selecting one or more testing locations in the first image based on the edge information;
    sending a testing request to the client device to perform one or more predefined operations at each of the one or more selected testing locations; and
    in response to sending the testing request, obtaining, from the client device, test results for the one or more predefined operations performed at each of the one or more selected testing locations.
  2. The method of claim 1, further comprising:
    prior to sending the testing request, identifying position information for the one or more selected testing locations,
    wherein the testing request includes the position information for the one or more selected testing locations.
  3. The method of any of claims 1-2, wherein selecting the one or more testing locations in the first image based on the edge information comprises:
    identifying at least one cluster of edges based on the edge information that satisfies one or more testing parameters; and
    selecting a location within a predetermined distance of a representative point of the at least one cluster as the testing location.
  4. The method of any of claims 1-2, wherein selecting the one or more testing locations in the first image based on the edge information comprises:
    identifying at least one predefined shape based on the edge information that satisfies one or more testing parameters; and
    selecting a location within a predetermined distance of a center of the at least one predefined shape as the testing location.
  5. The method of any of claims 1-4, wherein the testing request identifies two or more testing locations; and
    wherein obtaining, from the client device, test results comprises:
    receiving batched test results at least including first test results for a first predefined operation performed at a first testing location of the two or more testing locations and second test results for a second predefined operation performed at a second testing location of the two or more testing locations.
  6. The method of any of claims 1-5, wherein the test results for the one or more predefined operations performed at each of the one or more selected testing locations include one or more subsequent images of the software program executed on the client device, the method further comprising:
    for a respective image of the one or more subsequent images:
    performing character recognition on the respective image to obtain text of the respective image; and
    in accordance with a determination that the text of the respective image includes one or more words in a predefined set of words, recording the testing location corresponding to the respective image.
  7. The method of any of claims 1-5, wherein the test results for the one or more predefined operations performed at each of the one or more selected testing locations include one or more subsequent images of the software program executed on the client device, the method further comprising:
    for a respective image of the one or more subsequent images:
    in accordance with a determination that the respective image includes a second interface different from the first interface in the first image:
    recording the testing location corresponding to the respective image; and
    associating the testing location corresponding to the respective image with the second interface in an interface hierarchy for the software program executed on the client device.
  8. A server, comprising:
    one or more processors; and
    memory storing one or more programs to be executed by the one or more processors, the one or more programs comprising instructions for:
    obtaining, from a client device, a first image of a first interface of a software program executed on the client device;
    performing edge detection on the first image to obtain edge information for edges of the first interface in the first image;
    selecting one or more testing locations in the first image based on the edge information;
    sending a testing request to the client device to perform one or more predefined operations at each of the one or more selected testing locations; and
    in response to sending the testing request, obtaining, from the client device, test results for the one or more predefined operations performed at each of the one or more selected testing locations.
  9. The server of claim 8, wherein the one or more programs further comprise instructions for:
    prior to sending the testing request, identifying position information for the one or more selected testing locations,
    wherein the testing request includes the position information for the one or more selected testing locations.
  10. The server of any of claims 8-9, wherein selecting the one or more testing locations in the first image based on the edge information comprises:
    identifying at least one cluster of edges based on the edge information that satisfies one or more testing parameters; and
    selecting a location within a predetermined distance of a representative point of the at least one cluster as the testing location.
  11. The server of any of claims 8-9, wherein selecting the one or more testing locations in the first image based on the edge information comprises:
    identifying at least one predefined shape based on the edge information that satisfies one or more testing parameters; and
    selecting a location within a predetermined distance of a center of the at least one predefined shape as the testing location.
  12. The server of any of claims 8-11, wherein the testing request identifies two or more testing locations; and
    wherein obtaining, from the client device, test results comprises:
    receiving batched test results at least including first test results for a first predefined operation performed at a first testing location of the two or more testing locations and second test results for a second predefined operation performed at a second testing location of the two or more testing locations.
  13. The server of any of claims 8-12, wherein the test results for the one or more predefined operations performed at each of the one or more selected testing locations include one or more subsequent images of the software program executed on the client device, and
    wherein the one or more programs further comprise instructions for:
    for a respective image of the one or more subsequent images:
    performing character recognition on the respective image to obtain text of the respective image; and
    in accordance with a determination that the text of the respective image includes one or more words in a predefined set of words, recording the testing location corresponding to the respective image.
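Once character recognition has produced the text of a result image, the check recited in claim 13 reduces to a membership test against the predefined word set. In this sketch the failure vocabulary and the pairing of each text with its testing location are illustrative assumptions; the claim only requires "a predefined set of words".

```python
# Example vocabulary of failure indicators an OS or app might display;
# the actual predefined word set is not specified here.
FAILURE_WORDS = {"crash", "crashed", "error", "unfortunately", "stopped"}

def flag_failing_locations(ocr_results, failure_words=FAILURE_WORDS):
    """ocr_results: iterable of (testing_location, recognized_text) pairs.
    Returns the testing locations whose recognized text contains a
    failure word, i.e. the locations to be recorded per the claim."""
    flagged = []
    for location, text in ocr_results:
        words = {w.strip(".,!?") for w in text.lower().split()}
        if words & failure_words:
            flagged.append(location)   # record the testing location
    return flagged

flagged = flag_failing_locations([
    ((120, 300), "Unfortunately, the app has stopped."),
    ((40, 80), "Settings"),
])
```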
  14. The server of any of claims 8-12, wherein the test results for the one or more predefined operations performed at each of the one or more selected testing locations include one or more subsequent images of the software program executed on the client device, and
    wherein the one or more programs further comprise instructions for:
    for a respective image of the one or more subsequent images:
    in accordance with a determination that the respective image includes a second interface different from the first interface in the first image:
    recording the testing location corresponding to the respective image; and
    associating the testing location corresponding to the respective image with the second interface in an interface hierarchy for the software program executed on the client device.
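Claim 14 records a testing location when the operation performed there leads to a new interface, and files that interface in a hierarchy for the program under test. A minimal bookkeeping structure is sketched below; the dict-based layout and all names are assumptions for illustration.

```python
class InterfaceHierarchy:
    """Tracks which testing location on a parent interface led to which
    child interface, as in the claimed association step."""

    def __init__(self, root_interface):
        self.children = {root_interface: []}  # interface -> child interfaces
        self.transitions = {}                 # (parent, location) -> child

    def record_transition(self, parent, location, new_interface):
        """Record the location and associate the new interface with it."""
        if new_interface == parent:
            return False                      # not a different interface
        self.children.setdefault(parent, []).append(new_interface)
        self.children.setdefault(new_interface, [])
        self.transitions[(parent, location)] = new_interface
        return True

hierarchy = InterfaceHierarchy("main")
hierarchy.record_transition("main", (120, 300), "settings")
```

Walking such a hierarchy later would let the tester reach, and re-test, every interface discovered during the random-operation run.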
  15. A non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which, when executed by a server with one or more processors, cause the server to perform operations comprising:
    obtaining, from a client device, a first image of a first interface of a software program executed on the client device;
    performing edge detection on the first image to obtain edge information for edges of the first interface in the first image;
    selecting one or more testing locations in the first image based on the edge information;
    sending a testing request to the client device to perform one or more predefined operations at each of the one or more selected testing locations; and
    in response to sending the testing request, obtaining, from the client device, test results for the one or more predefined operations performed at each of the one or more selected testing locations.
  16. The non-transitory computer readable storage medium of claim 15, wherein the instructions cause the server to perform operations further comprising:
    prior to sending the testing request, identifying position information for the one or more selected testing locations,
    wherein the testing request includes the position information for the one or more selected testing locations.
  17. The non-transitory computer readable storage medium of any of claims 15-16, wherein selecting the one or more testing locations in the first image based on the edge information comprises:
    identifying at least one cluster of edges based on the edge information that satisfies one or more testing parameters; and
    selecting a location within a predetermined distance of a representative point of the at least one cluster as the testing location.
  18. The non-transitory computer readable storage medium of any of claims 15-16, wherein selecting the one or more testing locations in the first image based on the edge information comprises:
    identifying at least one predefined shape based on the edge information that satisfies one or more testing parameters; and
    selecting a location within a predetermined distance of a center of the at least one predefined shape as the testing location.
  19. The non-transitory computer readable storage medium of any of claims 15-18, wherein the test results for the one or more predefined operations performed at each of the one or more selected testing locations include one or more subsequent images of the software program executed on the client device, and
    wherein the instructions cause the server to perform operations further comprising:
    for a respective image of the one or more subsequent images:
    performing character recognition on the respective image to obtain text of the respective image; and
    in accordance with a determination that the text of the respective image includes one or more words in a predefined set of words, recording the testing location corresponding to the respective image.
  20. The non-transitory computer readable storage medium of any of claims 15-18, wherein the test results for the one or more predefined operations performed at each of the one or more selected testing locations include one or more subsequent images of the software program executed on the client device, and
    wherein the instructions cause the server to perform operations further comprising:
    for a respective image of the one or more subsequent images:
    in accordance with a determination that the respective image includes a second interface different from the first interface in the first image:
    recording the testing location corresponding to the respective image; and
    associating the testing location corresponding to the respective image with the second interface in an interface hierarchy for the software program executed on the client device.
PCT/CN2014/086389 2013-09-23 2014-09-12 Method and device for testing software reliability WO2015039585A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201310437097.8A CN104461857B (en) 2013-09-23 2013-09-23 The detection method and device of software reliability
CN201310437097.8 2013-09-23

Publications (1)

Publication Number Publication Date
WO2015039585A1 true WO2015039585A1 (en) 2015-03-26

Family

ID=52688237

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/086389 WO2015039585A1 (en) 2013-09-23 2014-09-12 Method and device for testing software reliability

Country Status (2)

Country Link
CN (1) CN104461857B (en)
WO (1) WO2015039585A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110830319B (en) * 2018-08-10 2021-07-20 长鑫存储技术有限公司 Management method, device and system for integrated circuit test

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7143391B1 (en) * 2002-12-11 2006-11-28 Oracle International Corporation Method and apparatus for globalization testing computer software
CN101521834A (en) * 2009-04-02 2009-09-02 深圳市茁壮网络技术有限公司 Automatic testing method, device and system
CN102420712A (en) * 2010-09-28 2012-04-18 ***通信集团公司 Testing method and equipment
CN103312850A (en) * 2013-05-10 2013-09-18 江苏科技大学 Mobile phone automation testing system and work method

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2541250A (en) * 2015-07-28 2017-02-15 Testplant Europe Ltd Method of, and apparatus for, creating reference images for an automated test of software with a graphical user interface.
US9804955B2 (en) 2015-07-28 2017-10-31 TestPlant Europe Limited Method and apparatus for creating reference images for an automated test of software with a graphical user interface
GB2541250B (en) * 2015-07-28 2019-01-02 Testplant Europe Ltd Method of, and apparatus for, creating reference images for an automated test of software with a graphical user interface.
US10810113B2 (en) 2015-07-28 2020-10-20 Eggplant Limited Method and apparatus for creating reference images for an automated test of software with a graphical user interface
CN105955861A (en) * 2016-05-19 2016-09-21 努比亚技术有限公司 Fault detection apparatus and method as well as mobile terminal

Also Published As

Publication number Publication date
CN104461857A (en) 2015-03-25
CN104461857B (en) 2018-05-18

Legal Events

Date Code Title Description
121 Ep: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 14846199

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: PCT application non-entry in European phase

Ref document number: 14846199

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: Public notification in the EP bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205N DATED 01/06/2016)
