CN110851351A - Deployment environment testing method and device, computer equipment and storage medium - Google Patents

Deployment environment testing method and device, computer equipment and storage medium

Info

Publication number
CN110851351A
CN110851351A (application CN201910968955.9A)
Authority
CN
China
Prior art keywords
data
deployment environment
expected
database
test
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910968955.9A
Other languages
Chinese (zh)
Inventor
金英俊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
OneConnect Smart Technology Co Ltd
OneConnect Financial Technology Co Ltd Shanghai
Original Assignee
OneConnect Financial Technology Co Ltd Shanghai
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by OneConnect Financial Technology Co Ltd Shanghai
Priority to CN201910968955.9A
Publication of CN110851351A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3664 Environments for testing or debugging software
    • G06F11/3668 Software testing
    • G06F11/3672 Test management
    • G06F11/3692 Test management for test results analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The embodiment of the invention discloses a deployment environment testing method, which comprises the following steps: when triggering of a test start event is detected, acquiring the expected result file pointed to by path information entered in the interface, the expected result file being expected database data and/or an expected deployment environment configuration; pulling the installed database data and/or the set deployment environment configuration of the current deployment environment, and performing a comparison test against the expected result file; if the two sets of data are identical, the test passes; if they differ, recording the abnormal fields in which they differ and returning the abnormal fields to the input interface for display. The scheme tests the system environment configuration and the installed data in the deployment environment, and checks whether the environment of the deployed system is correct before checking of the deployed software begins, so that potential environment configuration risks can be eliminated before the checking procedure and testing efficiency is improved.

Description

Deployment environment testing method and device, computer equipment and storage medium
Technical Field
The present invention relates to the field of system testing, and in particular, to a deployment environment testing method, apparatus, computer device, and storage medium.
Background
In a software development project, after each piece of software is released and deployed, functional blocking or degradation problems frequently occur because the system environment of the developer differs from the deployment environment at the client side, and technicians then need to test the client-side deployment environment. The existing test method is single-step debugging, a procedure that traces program execution step by step, monitors changes of variable values and locates the cause of a bug. However, single-step debugging does not test the system environment configuration of the deployment environment or the installed database, so the correctness of the system environment configuration and of the installed database files is not reflected in existing single-step debugging results. When the system environment configuration is improper or the installed database files contain garbled characters, the deployed software cannot run normally. How to test the correctness of the system environment configuration and the installed data files in the deployment environment is therefore a problem that urgently needs to be solved.
Disclosure of Invention
In view of this, an object of the embodiments of the present invention is to provide a deployment environment testing method, apparatus, computer device, and storage medium that test the system environment configuration and the installed data files in a deployment environment and check whether the environment of the deployed system is correct before checking of the deployed software begins, so that potential environment configuration risks and the various exceptions caused by garbled data in the database can be eliminated before the checking procedure, improving testing efficiency.
In order to achieve the above object, an embodiment of the present invention provides a deployment environment testing method, including the following steps:
creating an input interface, monitoring for a test start event triggered in the interface, and, when the test start event is triggered, acquiring the expected result file pointed to by path information entered in the interface, wherein the expected result file is expected database data and/or an expected deployment environment configuration;
pulling the installed database data and/or the set deployment environment configuration of the current deployment environment, and performing a comparison test against the expected result file;
if the two sets of data are identical, the test passes; and if they differ, recording the abnormal fields in which they differ, and returning the abnormal fields to the input interface for display.
Preferably, the step of pulling the installed database data and/or the set deployment environment configuration data of the current deployment environment and performing a comparison test with the expected result file includes:
parsing the expected database data in the expected result file, and recording the field names and expected values of each sheet of the expected database data as key-value pairs to obtain a key-value pair set;
and traversing each key-value pair in the set, taking each key as a query field, and querying one by one the correctness of the value corresponding to each key in the database data installed in the deployment environment.
Preferably, the step of pulling the installed database data and/or the set deployment environment configuration data of the current deployment environment and performing a comparison test against the expected result file further includes:
parsing the expected deployment environment configuration in the expected result file, and taking each parameter name and its corresponding parameter value in the expected deployment environment configuration as a parameter key-value pair to obtain a parameter key-value pair set;
and traversing the parameter key-value pair set, taking the key of each key-value pair in the set as a query field, and querying one by one the correctness of the value corresponding to each key in the set deployment environment configuration parameters.
Preferably, before the step of parsing the expected database data in the expected result file, recording the field names and expected values of each sheet of the expected database data as key-value pairs, and obtaining the key-value pair set, the method further includes:
connecting to the database installed in the current deployment environment, pulling the data if the connection succeeds, and displaying an error prompt box if the connection fails.
Preferably, after the step of passing the test if the two sets of data are identical and, if they differ, recording the abnormal fields in which they differ and returning the abnormal fields to the input interface for display, the method further includes:
copying the data content corresponding to the abnormal field in the expected result file, and overwriting the abnormal field in the database data installed in the current deployment environment and/or in the set deployment environment configuration data.
Preferably, the step of copying the data content corresponding to the abnormal field in the expected result file and overwriting the abnormal field in the installed database data and/or the set deployment environment configuration data of the current deployment environment includes:
copying the data content corresponding to the abnormal field in the expected result file, and allocating a new region in system memory for environment simulation;
and loading an instance module containing the data content in the simulation environment, and, if the instance module runs normally, overwriting the abnormal field in the file to be checked.
Preferably, when an abnormal field is recorded, its corresponding index value is pulled and stored with it, the index value being used to quickly locate the abnormal field in its source text.
In order to achieve the above object, an embodiment of the present invention further provides a deployment environment testing apparatus, including:
the interface module is used for creating an input interface, monitoring for a test start event triggered in the interface, and, when the test start event is triggered, acquiring the expected result file pointed to by path information entered in the interface, wherein the expected result file is expected database data and/or an expected deployment environment configuration;
the test module is used for pulling the installed database data and/or the set deployment environment configuration of the current deployment environment and performing a comparison test against the expected result file; if the two sets of data are identical, the test passes; and if they differ, recording the abnormal fields in which they differ and returning the abnormal fields to the input interface for display.
To achieve the above object, an embodiment of the present invention further provides a computer device, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the deployment environment testing method according to any one of claims 1 to 7.
To achieve the above object, an embodiment of the present invention further provides a computer-readable storage medium having a computer program stored therein, the computer program being executable by at least one processor to cause the at least one processor to perform the steps of the deployment environment testing method according to any one of claims 1 to 7.
Compared with the prior art, the deployment environment testing method, apparatus, computer device and computer-readable storage medium provided by the embodiments of the present invention test the system environment configuration and the installed data files in the deployment environment and check whether the environment of the deployed system is correct before checking of the deployed software begins, so that potential environment configuration risks and the various exceptions caused by garbled data in the database can be eliminated before the checking procedure, improving testing efficiency.
Drawings
FIG. 1 is a flow chart of the steps of a deployment environment testing method of the present invention;
FIG. 2 is a flowchart illustrating a step S200 of a deployment environment testing method according to an embodiment of the present invention;
FIG. 3 is a schematic flowchart of another embodiment of the step S200 in the first embodiment of the deployment environment testing method according to the present invention;
FIG. 4 is a flowchart illustrating a step S300 of the deployment environment testing method according to an embodiment of the present invention;
FIG. 5 is a schematic diagram illustrating program modules of a second embodiment of the deployment environment testing apparatus;
FIG. 6 is a schematic diagram of a hardware structure of a third embodiment of the computer apparatus according to the present invention.
Detailed Description
For better understanding of the technical solutions of the present invention, the following detailed description of the embodiments of the present invention is provided with reference to the accompanying drawings.
It should be understood that the described embodiments are only some embodiments of the invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the examples of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be understood that the term "and/or" as used herein merely describes an association between associated objects, meaning that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates that the objects before and after it are in an "or" relationship.
It should be understood that although the terms first, second, etc. may be used in embodiments of the present invention to describe specified keywords, the specified keywords should not be limited by these terms. These terms are only used to distinguish specified keywords from one another. For example, a first specified keyword may also be referred to as a second specified keyword, and similarly, a second specified keyword may also be referred to as a first specified keyword, without departing from the scope of the embodiments of the present invention.
The word "if" as used herein may be interpreted as referring to "at … …" or "when … …" or "corresponding to a determination" or "in response to a detection", depending on the context. Similarly, the phrase "if determined" or "if detected (a stated condition or time)" may be interpreted as "when determined" or "in response to a determination" or "when detected (a stated condition or event)" or "in response to a detection (a stated condition or event)", depending on the context.
Example one
FIG. 1 is a flowchart illustrating the steps of a deployment environment test according to an embodiment of the present invention. It is to be understood that the flowcharts in the embodiments of the present invention do not limit the order in which the steps are executed. The method comprises the following specific steps:
step S100 is to create an input interface, monitor a test start event triggered in the interface, and when it is monitored that the test start event is triggered, obtain an expected result file pointed to by the test start event through path information input in the interface, where the expected result file is expected data of a database and/or expected configuration of a deployment environment.
Creating an input interface for technicians to perform related operations is the first step of implementing the technical solution of the present invention.
Specifically, the processing unit creates a thread that runs a pre-written window program. When the program runs, a window pops up on the system desktop; the window contains at least two text input boxes and a trigger button. The text input boxes are used by a technician to enter the path of the prepared file and the paths of other verification data, and the trigger button is used to detect the technician's trigger operation for executing the deployment environment test.
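By way of illustration, a minimal sketch of such an input window is given below in Java Swing; the patent does not name a UI toolkit, so the class, field and label names here are assumptions rather than the actual implementation.

```java
import javax.swing.*;
import java.awt.*;

// Minimal sketch of the input window: two path text boxes plus a trigger button.
public class DeploymentTestWindow {
    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("Deployment Environment Test");
            JTextField expectedPathField = new JTextField(30);   // path of the expected result file
            JTextField otherDataPathField = new JTextField(30);  // paths of other verification data
            JButton startButton = new JButton("Start test");

            // The button click is the "test start event" monitored by the processing unit.
            startButton.addActionListener(e ->
                    System.out.println("Test start event, expected result file: "
                            + expectedPathField.getText().trim()));

            JPanel panel = new JPanel(new GridLayout(0, 1, 4, 4));
            panel.add(new JLabel("Expected result file path:"));
            panel.add(expectedPathField);
            panel.add(new JLabel("Other verification data path:"));
            panel.add(otherDataPathField);
            panel.add(startButton);

            frame.getContentPane().add(panel, BorderLayout.CENTER);
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.pack();
            frame.setVisible(true);
        });
    }
}
```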
The prepared file may also be referred to as the expected result file; it serves as the reference file for the subsequent comparison test and is prepared in advance by a technician. The prepared file may be copied onto the deployed hardware device by the technician, or the file data pointed to by the expected file path entered in a text input box may be pulled from an external device such as a USB flash drive. If the expected result file is on a local disk of the hardware device, the path may be, for example, "C:\\windows\\setup.ins"; if it is on an external device, taking a SanDisk USB flash drive as an example, the path may be "G:\\setup.ins".
The expected result file comes in two types. The first type is expected database data. Under different system environments, part of the database data deployed from the developer's development environment to the client-side hardware device may become garbled or change format, so that data is lost or altered and the deployed software cannot run normally. A technician therefore needs to prepare correct expected database data in advance for the comparison test against the database data in the current deployment environment, so as to learn whether the database data in the current deployment environment suffers from abnormal problems such as data loss or data alteration. For example, a technician writes a program on a PC that simulates an Android development environment with JAVA, while the customer requires that the software run normally on a mobile phone. Although both operating systems are Android, the computer and the mobile phone are different hardware devices and their Android versions may not be consistent, so differences naturally exist between the two. When the software is handed over from the technician's computer to the mobile phone to run, the database data transferred along with it may be deformed and the environment configuration may differ, and functional blocking or degradation problems naturally occur when the software is deployed.
The second type is the expected deployment environment configuration. Software generally has minimum requirements on its hardware device, software environment and so on; for example, software that only supports a 64-bit Windows system cannot run normally on a 32-bit Windows system. A technician is therefore also required to write an expected deployment environment configuration file in advance for the comparison test against the set deployment environment configuration data, so as to determine whether the currently set deployment environment configuration data supports running the deployed software.
For example, the deployment environment expected configuration data may be as follows:
TABLE 1
When the test start event triggered by the trigger button in the interface is detected, the processing unit acquires the information about the expected result file pointed to by the path entered in the input box, retrieves the expected result file pointed to by that path, and loads it into a cache so that subsequent checking steps can call it at any time.
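A minimal sketch of this loading step follows, under the assumption that the "cache" is simply an in-memory byte array held by the test program; the class, method and variable names are illustrative.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Sketch: load the expected result file named in the input box and keep it in memory.
public class ExpectedFileLoader {
    private byte[] cachedExpectedFile;   // reused by the later checking steps

    public void onTestStartEvent(String pathFromInputBox) throws IOException {
        Path expectedPath = Path.of(pathFromInputBox);   // e.g. "C:\\windows\\setup.ins" or "G:\\setup.ins"
        if (!Files.exists(expectedPath)) {
            throw new IOException("Expected result file not found: " + expectedPath);
        }
        cachedExpectedFile = Files.readAllBytes(expectedPath);   // load into the cache
    }

    public byte[] getCachedExpectedFile() {
        return cachedExpectedFile;
    }
}
```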
Step S200: pulling the installed database data and/or the set deployment environment configuration of the current deployment environment, and performing a comparison test against the expected result file; if the two sets of data are identical, the test passes; if they differ, recording the abnormal fields in which they differ and returning the abnormal fields to the input interface for display.
After the expected result file is obtained, the installed database data of the current deployment environment and/or the currently set deployment environment configuration is pulled and compared against the expected result file.
Specifically, the window of step S100 is further provided with two check boxes, corresponding respectively to the database test and the deployment environment configuration test. If the technician checks only the check box corresponding to the database test, then in this step only the installed database data of the current deployment environment is pulled; the deployment environment configuration data is neither pulled nor tested. Checking a check box thus triggers the corresponding test content.
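The sketch below shows one way the check boxes could gate which comparison runs; runDatabaseTest and runConfigurationTest are hypothetical method names, not identifiers taken from the patent.

```java
import javax.swing.JCheckBox;

// Sketch: only the checked tests are executed when the trigger button fires.
public class TestDispatcher {
    private final JCheckBox dbTestBox = new JCheckBox("Database test");
    private final JCheckBox cfgTestBox = new JCheckBox("Deployment environment configuration test");

    public void onStartTest() {
        if (dbTestBox.isSelected()) {
            runDatabaseTest();        // pull installed database data and compare
        }
        if (cfgTestBox.isSelected()) {
            runConfigurationTest();   // pull the set deployment environment configuration and compare
        }
    }

    private void runDatabaseTest() { /* see the database comparison sketches below */ }
    private void runConfigurationTest() { /* see the configuration comparison sketch below */ }
}
```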
In brief, the deployment environment is the platform on which the software runs, that is, the collection of software environment, hardware, network and data elements. The software environment refers to the environment formed by the operating system, database and other application software when the software under test runs; it is relatively complex and involves many factors, such as languages, operating system versions and the versions of each piece of coexisting software.
Taking the database test as an example, the technician presets the paths of the databases installed in the deployment environment. Generally, the installed databases include the database data supporting the deployment environment system itself and the database bundled with the deployed software. The paths of the two databases are usually fixed: the database data supporting the deployment environment system is usually located on a local disk such as drive C, while the bundled database required by the deployed software is usually placed under the installation folder generated by the installation file, and its path is also fixed, for example: C:\\security finance internal client\\mysql.
The processing unit accesses the two databases through the paths preset by the technician, pulls the data in the databases, and performs the specific data comparison test between the pulled data and the expected result file to obtain the test result.
After the comparison test is completed, the test result is output through a preset selection-structure statement: either the test passes or it does not. The criterion is whether the pulled deployment environment database data and/or deployment environment configuration data are the same as the expected result file. If they are the same, the test passes; if they differ, the abnormal fields in which they differ are recorded.
The abnormal fields are obtained from the comparison test. If the database comparison test is used alone, an abnormal field is a field in which the installed database data of the current deployment environment differs from the expected result file data. For example, the installed database data includes a table A whose first row is "borrower: Zhang San; loan amount: three million", while the expected database data in the expected result file includes an expected table A whose first row is "borrower: Zhang San; loan amount: two million". The field "loan amount: three million" in the installed database data of the tested deployment environment is then an abnormal field.
The above is merely an illustration of the abnormal field, which is intended to characterize the data that differ between the data under test and the expected data; an abnormal field may also be part of a function instance, code, an array, a class, or the like.
Compared with the prior art, the deployment environment testing method provided by the invention tests the system environment configuration and the installed data files in the deployment environment and checks whether the environment of the deployed system is correct before checking of the deployed software begins, so that potential environment configuration risks and the various exceptions caused by garbled data in the database can be eliminated before the checking procedure, improving testing efficiency.
For example, referring to FIG. 2, step S200 may further include:
s210, analyzing the expected data of the database in the expected result file, and recording the field name and the expected value of each form page of the expected data of the database as a key value pair to obtain a key value pair set.
Specifically, the database expected data is written in an excel format, the processing unit analyzes the database expected data, and records a field name and an expected field value in each form page in a file as a key-value pair set, wherein the field name is a key and the expected field value is a numerical value.
Illustratively, the expected data of the database includes two forms, the names of which are "a" and "B", respectively corresponding to the database a and the database B, and the stored data in the corresponding database is recorded in each form. The processing unit analyzes according to the sequence of the form page, firstly analyzes the form 'A', stores each field name and corresponding field value in the form 'A' as a key-value pair set by adopting a map object function, and comprises the following steps:
put ("loan amount", "three million")
Wherein, the loan amount is a key, and the loan amount is three million, which means three million.
Put ("borrower", "Zhang three")
Wherein, the 'borrower' is a key, and the 'Zhang III' is a value, which means that the borrower is Zhang III.
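A sketch of how step S210 might build this key-value pair set is given below; it assumes the expected data workbook is read with Apache POI (the description does not name a library) and that each sheet stores field names in its first row with the expected values directly beneath them.

```java
import org.apache.poi.ss.usermodel.*;

import java.io.File;
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch: parse the expected Excel workbook into a field-name -> expected-value map.
public class ExpectedDataParser {
    public static Map<String, String> parseExpectedData(String excelPath) throws Exception {
        Map<String, String> expected = new LinkedHashMap<>();
        DataFormatter formatter = new DataFormatter();
        try (Workbook workbook = WorkbookFactory.create(new File(excelPath))) {
            for (Sheet sheet : workbook) {                 // one sheet per database, e.g. "A", "B"
                Row names = sheet.getRow(0);               // assumed: field names in row 0
                Row values = sheet.getRow(1);              // assumed: expected values in row 1
                if (names == null || values == null) continue;
                for (int i = 0; i < names.getLastCellNum(); i++) {
                    String key = formatter.formatCellValue(names.getCell(i));    // e.g. "borrower"
                    String value = formatter.formatCellValue(values.getCell(i)); // e.g. "Zhang San"
                    if (!key.isEmpty()) {
                        expected.put(key, value);          // equivalent to map.put("borrower", "Zhang San")
                    }
                }
            }
        }
        return expected;
    }
}
```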
Step S220: traversing each key-value pair in the set, taking each key as a query field, and querying one by one the correctness of the value corresponding to each key in the database data installed in the deployment environment.
The processing unit obtains the key-value pair set recorded in the previous step, extracts each key in the set as a query field, looks up the value corresponding to that key in the database data file installed in the deployment environment, and compares it with the value in the expected result file to check its correctness.
Illustratively, suppose a key-value pair in the expected result file is ("borrower", "Zhang San"). Using the key "borrower" as the query field, the field names of each sheet in the database data installed in the deployment environment are traversed, and the field value found for the "borrower" field is "Li Si". Since "Zhang San" is not equal to "Li Si", the database data installed in the deployment environment is incorrect for this key-value pair, and the data corresponding to it has been altered.
In addition, when an incorrect field value and its name are recorded, the position of the field can also be obtained and an index record added, so that a technician can quickly locate the field in the original database.
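A minimal sketch of this traversal and comparison follows, assuming the installed database data has already been pulled into a map of field name to field value (the pulling itself is treated above as prior art); the ExceptionField record and the sequential index are illustrative assumptions.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Sketch of step S220: traverse the expected key-value pairs and record mismatches.
public class DatabaseComparator {

    public record ExceptionField(String fieldName, String expectedValue,
                                 String installedValue, int index) {}

    public static List<ExceptionField> compare(Map<String, String> expected,
                                               Map<String, String> installed) {
        List<ExceptionField> exceptions = new ArrayList<>();
        int index = 0;
        for (Map.Entry<String, String> entry : expected.entrySet()) {
            index++;
            String key = entry.getKey();                 // e.g. "borrower"
            String installedValue = installed.get(key);  // value found in the deployed database
            if (!entry.getValue().equals(installedValue)) {
                // e.g. expected "Zhang San" but the installed data holds "Li Si"
                exceptions.add(new ExceptionField(key, entry.getValue(), installedValue, index));
            }
        }
        return exceptions;   // an empty list means the database test passes
    }
}
```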
Optionally, referring to FIG. 3, step S200 further includes:
s240, analyzing the expected configuration of the deployment environment in the expected result file, and taking each parameter name and the corresponding parameter value in the expected configuration of the deployment environment as a parameter key-value pair to obtain a parameter key-value pair set;
s250, traversing the parameter key-value pair set, using the key of each key-value pair in the set as a query field, and querying the correctness of the value corresponding to each key in the set deployment environment configuration parameters one by one.
Specifically, the window provided by the above description is provided with a check box, so that the processing unit enters the step of contrast test of deployment environment configuration only in checking the deployment environment configuration test. For the comparison test of the configuration file, acquiring the expected configuration of the deployment environment pointed by the specified path input by the input interface, analyzing the specific parameter name and parameter value of the expected configuration, defining the expected configuration as an MAP object, and then pulling the set deployment environment configuration, wherein how to pull the set deployment environment configuration is the prior art, the invention does not repeat how to pull the configuration parameters, the pulled data may be scattered, and at this time, the scattered data is collated, and the following table can be generated:
Configuration item      Current configuration
CPU                     2.4GHZ
Memory                  256M
Hard disk               over 10G
Network card            10/100M
TABLE 2
Comparing Table 2 with Table 1 (the expected deployment environment configuration), it can be seen that the "CPU" parameter value of the set deployment environment configuration is 2.4GHZ, which does not match the expected file, so "CPU; 2.4GHZ" is an abnormal field.
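To make this concrete, the sketch below applies the same key-value comparison to the configuration of steps S240/S250, using the CPU example above; since Table 1 is not reproduced in this text, the expected CPU value in the example is a placeholder assumption.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch of steps S240/S250: compare the expected configuration against the pulled configuration.
public class ConfigurationComparator {

    public static List<String> compare(Map<String, String> expectedConfig,
                                       Map<String, String> currentConfig) {
        List<String> exceptionFields = new ArrayList<>();
        for (Map.Entry<String, String> entry : expectedConfig.entrySet()) {
            String current = currentConfig.get(entry.getKey());
            if (!entry.getValue().equals(current)) {
                // Recorded in the "parameter name; current value" form used in the example above
                exceptionFields.add(entry.getKey() + ";" + current);
            }
        }
        return exceptionFields;
    }

    public static void main(String[] args) {
        Map<String, String> expected = new LinkedHashMap<>();
        expected.put("CPU", "3.0GHZ");   // placeholder expected value; Table 1 is not reproduced here
        Map<String, String> current = new LinkedHashMap<>();
        current.put("CPU", "2.4GHZ");    // value pulled from the set deployment environment (Table 2)
        System.out.println(compare(expected, current));   // prints [CPU;2.4GHZ]
    }
}
```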
Optionally, before step S210, the method further includes:
Step 210A: a connection attempt is made to the database address; if the connection succeeds, the database data installed in the deployment environment is pulled, and if the connection fails, an error prompt box is displayed.
To pull and subsequently compare the database data installed in the current deployment environment, a connection attempt must first be made to the installed database address. If the connection attempt fails, the program stops and an error prompt box is displayed; if it succeeds, the next step is triggered and the expected database result file is parsed.
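A minimal sketch of this connection attempt is shown below, assuming the installed database is MySQL reached over JDBC (the description mentions a mysql folder but names no driver or URL); the connection URL and credentials are placeholders.

```java
import javax.swing.JOptionPane;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

// Sketch of step 210A: try to connect to the installed database before pulling its data.
public class DatabaseConnectionCheck {

    public static Connection tryConnect(String jdbcUrl, String user, String password) {
        try {
            return DriverManager.getConnection(jdbcUrl, user, password);   // connection attempt
        } catch (SQLException e) {
            // Connection failed: stop the test flow and show the error prompt box
            JOptionPane.showMessageDialog(null,
                    "Failed to connect to the installed database: " + e.getMessage(),
                    "Deployment environment test", JOptionPane.ERROR_MESSAGE);
            return null;
        }
    }

    public static void main(String[] args) {
        Connection conn = tryConnect("jdbc:mysql://localhost:3306/deploy_db", "user", "password");
        if (conn != null) {
            System.out.println("Connection succeeded, pulling installed database data...");
        }
    }
}
```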
Optionally, referring to FIG. 4, after step S300, the method further includes:
step 400 copies the data content in the expected result file corresponding to the exception field, and performs an override process on the exception field in the installed database file.
Optionally, step 400 includes:
step 410 copies the data content of the expected result file corresponding to the exception field, and allocates a new interval in the system memory for environmental simulation.
Step 420 loads an instance module containing the data content in the simulation environment, and if the instance module runs normally, the exception field in the installed database file is overwritten.
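The following is a heavily simplified sketch of steps 410 and 420, in which an isolated copy of the installed data stands in for the new memory region used for simulation; the class name, method names and the validation rule are assumptions, not the patent's implementation.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: overwrite an abnormal field only after the expected content passes a simulated check.
public class ExceptionFieldRepair {

    public static boolean repair(Map<String, String> installedData,
                                 String exceptionField, String expectedValue) {
        // Step 410: copy the expected content into an isolated simulation copy of the data
        Map<String, String> simulation = new HashMap<>(installedData);
        simulation.put(exceptionField, expectedValue);

        // Step 420: run the instance that uses the copied content in the simulation environment
        if (runsNormally(simulation)) {
            installedData.put(exceptionField, expectedValue);   // overwrite the abnormal field
            return true;
        }
        return false;   // leave the installed data untouched if the simulation fails
    }

    // Placeholder for whatever check the loaded instance module performs when it runs
    private static boolean runsNormally(Map<String, String> simulatedData) {
        return simulatedData.values().stream().noneMatch(String::isEmpty);
    }
}
```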
Specifically, as the field values are verified, the processing unit records and caches each abnormal field, and after verification is completed all abnormal fields can be written into a notepad file for display. The index page number of each abnormal field can also be written into the notepad, so that technicians can conveniently locate it.
In addition, the abnormal fields may also be displayed through a text box in the window of step S100, which is not limited by the present invention.
Illustratively, the output of the abnormal fields may be as follows:
Installed database      Expected database file    Result    Index
"system"="win7"         "system"="win10"          error     page 36
"paper"="4*3"           "paper"="4*5"             error     page 37
"color"="0,10,0"        "color"="50,0,100"        error     page 221
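A rough sketch of writing such a report to a plain text "notepad" file, with the index page recorded alongside each abnormal field, is given below; the file name and column layout are assumptions that mirror the example table above.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

// Sketch: write the recorded abnormal fields and their index pages to a text file.
public class ExceptionReportWriter {

    public static void writeReport(Path reportFile, List<String[]> rows) throws IOException {
        StringBuilder sb = new StringBuilder("installed database\texpected file\tresult\tindex\n");
        for (String[] row : rows) {
            sb.append(String.join("\t", row)).append('\n');
        }
        Files.writeString(reportFile, sb.toString());
    }

    public static void main(String[] args) throws IOException {
        List<String[]> rows = new ArrayList<>();
        rows.add(new String[]{"\"system\"=\"win7\"", "\"system\"=\"win10\"", "error", "page 36"});
        writeReport(Path.of("exception_report.txt"), rows);   // illustrative output file name
    }
}
```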
Example two
Referring to FIG. 5, a schematic diagram of the program modules of the deployment environment testing apparatus of the present invention is shown. In this embodiment, the deployment environment testing apparatus 20 may include, or be divided into, one or more program modules, which are stored in a storage medium and executed by one or more processors to implement the present invention and the deployment environment testing method described above. A program module in the embodiments of the present invention refers to a series of computer program instruction segments capable of performing specific functions, and is better suited than the program itself for describing the execution process of the deployment environment testing apparatus 20 in the storage medium. The functions of the program modules of this embodiment are described below:
the interface module 200 is configured to create an input interface, monitor for a test start event triggered in the interface, and, when the test start event is triggered, acquire the expected result file pointed to by path information entered in the interface, the expected result file being expected database data and/or an expected deployment environment configuration;
the testing module 300 is configured to pull the installed database data and/or the set deployment environment configuration of the current deployment environment and perform a comparison test against the expected result file; if the two sets of data are identical, the test passes; if they differ, record the abnormal fields in which they differ and return the abnormal fields to the input interface for display.
Example three
FIG. 6 is a schematic diagram of the hardware architecture of a computer device according to a third embodiment of the present invention. In this embodiment, the computer device 2 is a device capable of automatically performing numerical calculation and/or information processing according to preset or stored instructions. The computer device 2 may be a rack server, a blade server, a tower server or a cabinet server (including an independent server or a server cluster composed of a plurality of servers), or the like. As shown, the computer device 2 includes, but is not limited to, at least a memory 21, a processor 22, a network interface 23, and the deployment environment testing apparatus 20, which may be communicatively connected to one another via a system bus. Wherein:
In this embodiment, the memory 21 includes at least one type of computer-readable storage medium, including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a Programmable Read Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. In some embodiments, the memory 21 may be an internal storage unit of the computer device 2, such as a hard disk or internal memory of the computer device 2. In other embodiments, the memory 21 may also be an external storage device of the computer device 2, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash memory Card (Flash Card) provided on the computer device 2. Of course, the memory 21 may also include both the internal storage unit and an external storage device of the computer device 2. In this embodiment, the memory 21 is generally used to store the operating system installed on the computer device 2 and various application software, such as the program code of the deployment environment testing apparatus 20 of the second embodiment. In addition, the memory 21 may also be used to temporarily store various types of data that have been output or are to be output.
Processor 22 may be a Central Processing Unit (CPU), controller, microcontroller, microprocessor, or other data Processing chip in some embodiments. The processor 22 is typically used to control the overall operation of the computer device 2. In this embodiment, the processor 22 is configured to run the program code stored in the memory 21 or process data, for example, run the deployment environment testing apparatus 20, so as to implement the deployment environment testing method of the first embodiment.
The network interface 23 may comprise a wireless network interface or a wired network interface, and is generally used to establish a communication connection between the computer device 2 and other electronic apparatuses. For example, the network interface 23 is used to connect the computer device 2 to an external terminal through a network and to establish a data transmission channel and a communication connection between the computer device 2 and the external terminal. The network may be a wireless or wired network such as an intranet, the Internet, a Global System for Mobile Communications (GSM) network, a Wideband Code Division Multiple Access (WCDMA) network, a 4G network, a 5G network, Bluetooth, Wi-Fi, and the like.
In this embodiment, the deployment environment testing apparatus 20 stored in the memory 21 may also be divided into one or more program modules, and the one or more program modules are stored in the memory 21 and executed by one or more processors (in this embodiment, the processor 22) to complete the present invention.
For example, FIG. 5 is a schematic diagram of the program modules implementing the second embodiment of the deployment environment testing apparatus 20. In this embodiment, the deployment environment testing apparatus 20 may be divided into an interface module 200 and a testing module 300. The program modules referred to in the present invention are series of computer program instruction segments capable of performing specific functions, and are better suited than the program itself for describing the execution process of the deployment environment testing apparatus 20 in the computer device 2. The specific functions of these program modules have been described in detail in the second embodiment and are not repeated here.
Example four
The present embodiment also provides a computer-readable storage medium, such as a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, a server, an App application mall, etc., on which a computer program is stored, which when executed by a processor implements corresponding functions. The computer-readable storage medium of this embodiment is used to store the deployment environment testing apparatus 20, and when executed by the processor, the deployment environment testing method of the first embodiment is implemented.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. A method for deployment environment testing, comprising:
creating an input interface, monitoring for a test start event triggered in the interface, and, when the test start event is triggered, acquiring the expected result file pointed to by path information entered in the interface, wherein the expected result file is expected database data and/or an expected deployment environment configuration;
pulling the installed database data and/or the set deployment environment configuration of the current deployment environment, and performing a comparison test against the expected result file;
if the two sets of data are identical, the test passes; and if they differ, recording the abnormal fields in which they differ, and returning the abnormal fields to the input interface for display.
2. The deployment environment testing method according to claim 1, wherein the step of pulling the installed database data and/or the set deployment environment configuration data of the current deployment environment and performing the comparison test with the expected result file comprises:
parsing the expected database data in the expected result file, and recording the field names and expected values of each sheet of the expected database data as key-value pairs to obtain a key-value pair set;
and traversing each key-value pair in the set, taking each key as a query field, and querying one by one the correctness of the value corresponding to each key in the database data installed in the deployment environment.
3. The deployment environment testing method according to claim 1, wherein the step of pulling the installed database data and/or the set deployment environment configuration data of the current deployment environment and performing the comparison test with the expected result file further comprises:
parsing the expected deployment environment configuration in the expected result file, and taking each parameter name and its corresponding parameter value in the expected deployment environment configuration as a parameter key-value pair to obtain a parameter key-value pair set;
and traversing the parameter key-value pair set, taking the key of each key-value pair in the set as a query field, and querying one by one the correctness of the value corresponding to each key in the set deployment environment configuration parameters.
4. The deployment environment testing method according to claim 2, wherein before the step of parsing the expected database data in the expected result file, recording the field names and expected values of each sheet of the expected database data as key-value pairs, and obtaining the key-value pair set, the method further comprises:
connecting to the database installed in the current deployment environment, pulling the data if the connection succeeds, and displaying an error prompt box if the connection fails.
5. The deployment environment testing method of claim 1, wherein after the step of passing the test if the two sets of data are identical and, if they differ, recording the abnormal fields in which they differ and returning the abnormal fields to the input interface for display, the method further comprises:
copying the data content corresponding to the abnormal field in the expected result file, and overwriting the abnormal field in the database data installed in the current deployment environment and/or in the set deployment environment configuration data.
6. The deployment environment testing method according to claim 5, wherein the step of copying the data content corresponding to the abnormal field in the expected result file and overwriting the abnormal field in the installed database data and/or the set deployment environment configuration data of the current deployment environment comprises:
copying the data content corresponding to the abnormal field in the expected result file, and allocating a new region in system memory for environment simulation;
and loading an instance module containing the data content in the simulation environment, and, if the instance module runs normally, overwriting the abnormal field.
7. The deployment environment testing method according to claim 1, wherein when an abnormal field is recorded, its corresponding index value is pulled and stored with it, the index value being used to quickly locate the abnormal field in its source text.
8. A deployment environment testing apparatus, the apparatus comprising:
the interface module is used for creating an input interface, monitoring for a test start event triggered in the interface, and, when the test start event is triggered, acquiring the expected result file pointed to by path information entered in the interface, wherein the expected result file is expected database data and/or an expected deployment environment configuration;
the test module is used for pulling the installed database data and/or the set deployment environment configuration of the current deployment environment and performing a comparison test against the expected result file; if the two sets of data are identical, the test passes; and if they differ, recording the abnormal fields in which they differ, and returning the abnormal fields to the input interface for display.
9. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the computer program, when executed by the processor, implements the steps of the deployment environment testing method according to any of claims 1 to 7.
10. A computer-readable storage medium, having stored thereon a computer program executable by at least one processor to cause the at least one processor to perform the steps of the deployment environment testing method as claimed in any one of claims 1 to 7.
CN201910968955.9A 2019-10-12 2019-10-12 Deployment environment testing method and device, computer equipment and storage medium Pending CN110851351A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910968955.9A CN110851351A (en) 2019-10-12 2019-10-12 Deployment environment testing method and device, computer equipment and storage medium


Publications (1)

Publication Number Publication Date
CN110851351A true CN110851351A (en) 2020-02-28

Family

ID=69597981

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910968955.9A Pending CN110851351A (en) 2019-10-12 2019-10-12 Deployment environment testing method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110851351A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109491887A (en) * 2018-09-28 2019-03-19 深圳壹账通智能科技有限公司 Test environment dispositions method, device, computer equipment and storage medium
CN109815111A (en) * 2018-12-13 2019-05-28 深圳壹账通智能科技有限公司 Gray scale test method, device, computer equipment and storage medium
CN109828904A (en) * 2018-12-14 2019-05-31 深圳壹账通智能科技有限公司 System Authentication method, device, electronic equipment and storage medium

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022110646A1 (en) * 2020-11-24 2022-06-02 平安普惠企业管理有限公司 Configuration method and related device
CN112579376A (en) * 2020-12-16 2021-03-30 中国建设银行股份有限公司 Double-run-stage data verification method, system, equipment and storage medium
CN113052463A (en) * 2021-03-25 2021-06-29 平安银行股份有限公司 Workflow verification method and device, computer equipment and storage medium
CN113052463B (en) * 2021-03-25 2023-09-26 平安银行股份有限公司 Workflow verification method, workflow verification device, computer equipment and storage medium
CN113900677A (en) * 2021-10-18 2022-01-07 盐城金堤科技有限公司 Deployment method, device and equipment of program running environment and computer storage medium
WO2023229784A1 (en) * 2022-05-26 2023-11-30 Microsoft Technology Licensing, Llc Allow list of container images based on deployment configuration at a container orchestration service
CN115809205A (en) * 2023-02-10 2023-03-17 安徽合信国质检验检测有限公司 Software detection sample deployment system based on cloud computing technology
CN115809205B (en) * 2023-02-10 2023-04-18 安徽合信国质检验检测有限公司 Software detection sample deployment system based on cloud computing technology
CN116450400A (en) * 2023-06-19 2023-07-18 北京翼辉信息技术有限公司 Application program abnormality analysis method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN110851351A (en) Deployment environment testing method and device, computer equipment and storage medium
CN108614707B (en) Static code checking method, device, storage medium and computer equipment
US10509693B2 (en) Method for identifying a cause for a failure of a test
US9170921B2 (en) Application testing automation
US10289536B2 (en) Distinguishing public and private code in testing environments
US10599558B1 (en) System and method for identifying inputs to trigger software bugs
WO2019071891A1 (en) Code coverage analysis method and application server
CN110866258B (en) Rapid vulnerability positioning method, electronic device and storage medium
CN113448862B (en) Software version testing method and device and computer equipment
US8661414B2 (en) Method and system for testing an order management system
CN111045927A (en) Performance test evaluation method and device, computer equipment and readable storage medium
US11868465B2 (en) Binary image stack cookie protection
CN111831554B (en) Code checking method and device
CN109684205B (en) System testing method, device, electronic equipment and storage medium
CN115310011A (en) Page display method and system and readable storage medium
CN111367796B (en) Application program debugging method and device
CN113656318A (en) Software version testing method and device and computer equipment
US9792202B2 (en) Identifying a configuration element value as a potential cause of a testing operation failure
CN113238953A (en) UI automation test method and device, electronic equipment and storage medium
Xiao et al. Performing high efficiency source code static analysis with intelligent extensions
CN108845889A (en) A kind of test method and system checked BIOS option and whether meet Intel demand
CN112558982B (en) Code detection method and device and computer equipment
CN111611153B (en) Method and device for detecting overdrawing of user interface
CN112559370A (en) Front-end-based React project unit testing method and related equipment
CN115934500A (en) Pipeline construction method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination