Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be noted that the description of "first", "second", etc. in this disclosure is for descriptive purposes only and is not to be construed as indicating or implying relative importance or implying the number of technical features indicated. Thus, a feature defined with "first" or "second" may explicitly or implicitly include at least one such feature. In addition, the technical solutions of the embodiments may be combined with each other, but only on the basis that the resulting combination can be realized by those skilled in the art; when the technical solutions are contradictory or cannot be realized, the combination should be considered absent and outside the scope of protection claimed in the present invention.
The test case generation method provided by the embodiments of the invention can be applied to an application environment as shown in fig. 1, in which a client communicates with a server through a network. After the server receives a request, sent by a user terminal, to generate test cases for software to be tested, it obtains test information corresponding to the software to be tested, the test information comprising a test object and test parameters; analyzes the test type of the software to be tested according to the test object; determines a corresponding test model according to a predetermined mapping relation between test types and test models and the test type of the software to be tested; generates a first test case according to the test model and the test parameters; performs a coverage test on the first test case to obtain a second test case; and feeds the second test case back to the user terminal as a target test case.
Each step serves a purpose. Obtaining the test information after receiving the request guarantees the integrity and accuracy of the test information and provides a basis for the subsequent steps. Analyzing the test type according to the test object in the test information ensures that the subsequently generated test cases match the test object. Determining the test model from the mapping relation between test types and test models selects a suitable model for the test type, improving the efficiency and accuracy of test case generation. Generating a first test case from the test model and the obtained test parameters provides a preliminary test case as the basis for the subsequent coverage test. Finally, performing a coverage test on the first test case to obtain a second test case, and feeding the second test case back to the user, optimizes the test cases and guarantees that the finally generated test cases have a high coverage rate.
According to the invention, the corresponding test model is selected through the test type of the test object, and coverage detection is performed on the test cases generated by the selected model, so that the coverage rate of the test cases is ensured and the efficiency and accuracy of automatically generating test cases are improved. The client may be, but is not limited to, a personal computer, notebook computer, smart phone, tablet computer, or portable wearable device. The server may be implemented by a stand-alone server or a server cluster formed by a plurality of servers. The present invention will now be described in detail with reference to specific examples.
Referring to a flow chart of a test case generation method according to an embodiment of the present invention shown in fig. 2, in the embodiment of the present invention, the test case generation method includes the following steps S1 to S5:
S1, after receiving a request sent by a user terminal to generate test cases for software to be tested, test information corresponding to the software to be tested is obtained, wherein the test information comprises a test object and test parameters.
In an embodiment, after a request sent by a user terminal to generate test cases for software to be tested is received, the request is analyzed to obtain the test information corresponding to the software to be tested.
In an embodiment, the obtaining the test information corresponding to the software to be tested includes:
acquiring a test requirement document uploaded by the user terminal;
Carrying out keyword recognition on the test requirement document by using a preset keyword recognition algorithm to obtain document keywords;
And extracting test objects and test parameters from the document keywords, and taking the test objects and the test parameters as the test information.
In this embodiment, in order to simplify the user operation, the client that sends the test case request may also support the user to upload the requirement document of the software to be tested. For example, the client may include a requirement document uploading control, through which a user may trigger a requirement document uploading operation, and upload a requirement document of the software to be tested. For this case, in a specific implementation, a requirement document of the software to be tested may be obtained, and then, keyword recognition is performed on the requirement document, so as to obtain a test object and a test parameter of the software to be tested.
When the requirement document is analyzed, keyword recognition can be performed on it, and the test requirement of the software to be tested is obtained through the recognized keywords. The embodiment of the application does not particularly limit the keywords, which can be determined according to actual conditions. The analysis of the requirement document can also be completed by a large language model; in other words, the requirement document can be input into the large language model to obtain the test information of the software to be tested.
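As an illustration of the keyword-based extraction described above, the following Python sketch scans a requirement document against a preset keyword vocabulary and splits the hits into test objects and test parameters. The vocabulary, the function name `extract_test_info`, and the sample document are all hypothetical; a real implementation would use the preset keyword recognition algorithm of the embodiment.

```python
# Hypothetical keyword vocabulary; a real system would use the preset
# keyword recognition algorithm described in the embodiment.
OBJECT_KEYWORDS = {"login flow", "discount rate", "loan amount"}
PARAMETER_KEYWORDS = {"input range", "expected output"}

def extract_test_info(document: str) -> dict:
    """Scan a requirement document for known keywords and split the
    hits into test objects and test parameters."""
    text = document.lower()
    return {
        "test_object": sorted(k for k in OBJECT_KEYWORDS if k in text),
        "test_parameters": sorted(k for k in PARAMETER_KEYWORDS if k in text),
    }

doc = "The login flow must validate the input range and expected output."
info = extract_test_info(doc)
```

A large-language-model-based analyzer could replace this lookup while producing the same `{test_object, test_parameters}` structure.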
The embodiment of the application does not particularly limit the software to be tested, which may be system software, application software, embedded software, information security software, industrial software, or other software.
S2, analyzing the test type of the software to be tested according to the test object.
In an embodiment, the analyzing the test type of the software to be tested according to the test object includes:
Acquiring a test object in the test information;
judging the type of the test object, and if the test object is a flow class, judging the test type of the software to be tested as a first type;
If the test object is a numerical class, analyzing that the test object is a discrete numerical class or a continuous numerical class, if the test object is a discrete numerical class, judging that the test type of the software to be tested is a second type, and if the test object is a continuous numerical class, judging that the test type of the software to be tested is a third type;
And if the test object simultaneously comprises a numerical class and a flow class, judging that the test type of the software to be tested is a fourth type.
In this embodiment, the test objects are classified into a flow class, a numerical class, and a multi-factor combination class (comprising both a numerical class and a flow class), where the numerical class is further divided into a discrete numerical class and a continuous numerical class. A test object of the flow class generally refers to a system or function that executes according to a specific process or sequence of steps, characterized by multiple steps, state transitions and user interactions. For example, in a user registration flow, the steps include: the user accesses the registration page; fills in information such as a user name, a password and an email; submits the registration form; the system verifies the validity of the input information; the system sends a confirmation email; and the user clicks the confirmation link to complete registration. A test object of the discrete numerical class takes a specific value or a limited number of values, i.e. its values can be enumerated and counted, such as the number of product purchases or the number of users. A test object of the continuous numerical class can take any value within a specific interval, such as an annual rate of return, a stock price or a bond denomination. A test object of the multi-factor combination class involves testing a plurality of variables (factors), each of which has a plurality of levels (values); for example, in a financial loan system, the variables are the loan type, the loan amount and the loan period, and each variable has a plurality of values.
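The classification rules of step S2 can be sketched as a small dispatch function. The descriptor format (a `flow` flag plus a `numeric` field) is an assumption made for illustration; the embodiment does not prescribe a concrete data structure for the test object.

```python
def classify_test_type(test_object: dict) -> int:
    """Map a test object descriptor to one of the four test types:
    1 = flow class, 2 = discrete numerical class,
    3 = continuous numerical class, 4 = multi-factor combination."""
    has_flow = test_object.get("flow", False)
    numeric = test_object.get("numeric")  # None, "discrete" or "continuous"
    if has_flow and numeric:
        return 4  # numerical class and flow class together
    if has_flow:
        return 1
    if numeric == "discrete":
        return 2
    if numeric == "continuous":
        return 3
    raise ValueError("unrecognized test object")
```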
S3, determining a corresponding test model according to a predetermined mapping relation between the test type and the test model and the test type of the software to be tested.
In an embodiment, the determining the corresponding test model according to the predetermined mapping relationship between the test type and the test model and the test type of the software to be tested includes:
If the test type of the software to be tested is a first type, acquiring a first model corresponding to the first type as a test model corresponding to the software to be tested;
if the test type of the software to be tested is a second type, acquiring a second model corresponding to the second type as a test model corresponding to the software to be tested;
If the test type of the software to be tested is a third type, acquiring a third model corresponding to the third type as a test model corresponding to the software to be tested;
and if the test type of the software to be tested is a fourth type, acquiring a fourth model corresponding to the fourth type as a test model corresponding to the software to be tested.
In this embodiment, when the test type of the software to be tested is the first type, the first model is selected as the test model corresponding to the software to be tested. The first model is a finite state machine model, a mathematical model capable of covering all nodes and all states; it is well suited to generating test cases for software to be tested whose test object is of the flow class. The operating principle of the finite state machine model is to process external events through state transitions and the execution of actions, thereby realizing behavior control of the software to be tested.
Specifically, the step of generating the test case by the finite state machine model includes:
1. States and events are defined.
First, the state and events of the system are clarified. The state represents the condition of the system at a certain moment, and the event is an action triggering state transition.
2. And drawing a state diagram.
State diagrams are used to visualize the states and transitions of the system. A state diagram makes the states of the system and the transition relationships between them easy to see clearly.
3. And generating a test case.
And generating a test case according to the state diagram, and ensuring that each state and transition are tested. Test cases may be generated using the following method:
3.1 traverse all states and transitions.
Ensuring that each state and each transition is tested at least once. The state diagram may be traversed by a depth-first search (DFS) or a breadth-first search (BFS).
3.2 Path coverage.
Test cases are generated that cover all possible paths. This can be achieved by generating all paths from the initial state to the final state.
3.3 Boundary value test.
For each state transition, a boundary value is tested. For example, if a condition of a certain state transition is that the value of a certain variable is within a certain range, the boundary value of the range is tested.
3.4 Abnormal path test.
In addition to the normal path, an abnormal path should be tested, i.e. a state transition that may lead to a system error.
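Steps 1 through 3.4 above can be sketched as follows, using the user registration flow as the state diagram. The transition table, state names and event names are illustrative assumptions; the DFS visits each transition at most once per path, so every state and transition, including the abnormal `submit_invalid` loop, is exercised.

```python
from collections import defaultdict

# Minimal registration-flow state machine (illustrative names).
# Transitions map (state, event) -> next state.
TRANSITIONS = {
    ("start", "open_page"): "form",
    ("form", "submit_valid"): "email_sent",
    ("form", "submit_invalid"): "form",      # abnormal path: invalid input
    ("email_sent", "click_link"): "registered",
}

def generate_fsm_cases(transitions, initial="start", final="registered"):
    """Depth-first search over the state diagram, emitting every event
    path from the initial state to the final state; each path is one
    test case, and each transition is taken at most once per path."""
    graph = defaultdict(list)
    for (state, event), nxt in transitions.items():
        graph[state].append((event, nxt))
    cases = []

    def dfs(state, path, seen):
        if state == final:
            cases.append(tuple(path))
            return
        for event, nxt in graph[state]:
            edge = (state, event)
            if edge in seen:
                continue
            dfs(nxt, path + [event], seen | {edge})

    dfs(initial, [], set())
    return cases

cases = generate_fsm_cases(TRANSITIONS)
```

Boundary-value and further abnormal-path cases would be layered on top by varying the data supplied at each event.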
In this embodiment, when the test type of the software to be tested is the second type, the second model is selected as the test model corresponding to the software to be tested. The second model is a decision tree model, which can effectively process multi-condition logic branches and helps generate test cases that comprehensively cover the various condition combinations; therefore, for software to be tested whose test object is of the discrete numerical class, the decision tree model is selected as the test model.
Specifically, the step of generating the test case by the decision tree model includes:
1. defining input factors and output results
First, the input factors and output results of the system are clarified. The input factors are factors that affect the behavior of the system, and the output results are the results of the system at a particular combination of inputs.
Example:
Input factors:
Age: user age (discrete values: under 18, 18 to 65, over 65)
IsStudent: whether the user is a student (Boolean: True, False)
HasDiscount: whether a discount is available (Boolean: True, False)
Output result:
DiscountRate: discount rate (0.0, 0.1, 0.2, 0.3)
2. Constructing decision trees
Decision tree algorithms (e.g., ID3, C4.5, CART, etc.) are used to construct the decision tree model. The decision tree predicts the value of the target variable through a series of conditional decisions (decision nodes).
2.1 Selection of root node
The feature with the maximum information gain or the minimum Gini index is selected as the root node.
2.2 Recursive splitting
The data set is divided into subsets according to the selected feature, and each subset is split recursively until a stop condition is met (e.g., all samples in the subset belong to the same class, or the number of samples is less than a preset threshold).
2.3 Generation of leaf nodes
When the stop condition is satisfied, a leaf node is generated. The leaf nodes represent the final prediction result.
3. Extracting decision paths
All decision paths are extracted from the constructed decision tree. Each path from the root node to the leaf node represents a set of input conditions and corresponding output results.
4. Generating test cases
And generating test cases according to the extracted decision paths. Each path corresponds to a test case, ensuring that each condition combination is tested.
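Steps 3 and 4 (path extraction and case generation) can be sketched as follows. The hand-built tree mirrors the discount example above; in practice the tree would come from a decision tree algorithm such as CART, and the node encoding used here is an assumption for illustration.

```python
# Hand-built decision tree for the discount example: internal nodes are
# (feature, {value: subtree}) tuples, leaves are the DiscountRate output.
TREE = ("IsStudent", {
    True:  ("Age", {"under 18": 0.3, "18-65": 0.2, "over 65": 0.2}),
    False: ("HasDiscount", {True: 0.1, False: 0.0}),
})

def extract_paths(node, conditions=()):
    """Walk the tree and return one (conditions, output) pair per
    root-to-leaf path; each pair becomes one test case."""
    if not isinstance(node, tuple):          # leaf: final prediction
        return [(dict(conditions), node)]
    feature, branches = node
    cases = []
    for value, child in branches.items():
        cases.extend(extract_paths(child, conditions + ((feature, value),)))
    return cases

cases = extract_paths(TREE)
```

Each extracted pair is a test case input with its expected DiscountRate, so every condition combination on a decision path is tested exactly once.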
In this embodiment, when the test type of the software to be tested is the third type, the third model is selected as the test model corresponding to the software to be tested. The third model is an equivalence class analysis table model, whose operating principle is to divide all possible input data into several equivalence classes and then select a few representative values from each equivalence class as test cases. For software to be tested whose test object is of the continuous numerical class, the values of the test object cannot be enumerated and counted, so this embodiment selects the equivalence class analysis table as the test model; this ensures that every input interval is tested, avoids the omission of key input values, and improves the efficiency and accuracy of test case generation.
Specifically, the step of generating the test case by the equivalence class analysis table model comprises the following steps:
1. defining input factors and output results
First, the input factors and output results of the system are clarified. The input factors are factors that affect the behavior of the system, and the output results are the results of the system at a particular combination of inputs.
Example:
Input factors:
Age: user age (continuous value, range: 0 to 100)
IsStudent: whether the user is a student (Boolean: True, False)
HasDiscount: whether a discount is available (Boolean: True, False)
Output result:
DiscountRate: discount rate (0.0, 0.1, 0.2, 0.3)
2. Dividing equivalence classes
According to the characteristics of the input factors, the input values are divided into several equivalence classes. Input values within the same equivalence class behave equivalently in the system.
2.1 Determining valid equivalence classes and invalid equivalence classes
Valid equivalence class: input values that meet the system requirements.
Invalid equivalence class: input values that do not meet the system requirements.
Example equivalence class partitioning:
Age:
Valid equivalence classes: [0,18), [18,65), [65,100]
Invalid equivalence classes: [-1,0), (100,101]
IsStudent:
Valid equivalence classes: True, False
HasDiscount:
Valid equivalence classes: True, False
3. Selecting representative test cases
One or more representative values are selected from each equivalence class as test cases. Boundary values and outliers are also considered.
Example test cases:
Normal values:
Age: 10, 25, 70
IsStudent: True, False
HasDiscount: True, False
Boundary values:
Age: 0, 18, 65, 100
Outliers:
Age: -1, 101
4. generating test cases
And generating a test case according to the selected representative value, and recording an expected output result.
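The equivalence class steps above can be sketched as follows, using the Age/IsStudent/HasDiscount example. The representative, boundary and outlier values are taken from the example; combining every Age value with both Boolean factors is one possible selection strategy, not the only one.

```python
import itertools

# Representative, boundary and outlier values from the example above.
AGE_VALUES = {
    "normal": [10, 25, 70],
    "boundary": [0, 18, 65, 100],
    "outlier": [-1, 101],
}
BOOLEANS = [True, False]

def generate_equivalence_cases():
    """Combine each representative Age value with both Boolean factors;
    a case is expected to be valid only when Age lies in [0, 100]."""
    cases = []
    for kind, ages in AGE_VALUES.items():
        for age, is_student, has_discount in itertools.product(
                ages, BOOLEANS, BOOLEANS):
            cases.append({
                "Age": age,
                "IsStudent": is_student,
                "HasDiscount": has_discount,
                "class": kind,          # which value group the case came from
                "valid": 0 <= age <= 100,
            })
    return cases
```

The recorded `valid` field plays the role of the expected output noted in step 4; a full implementation would record the expected DiscountRate as well.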
In this embodiment, when the test type of the software to be tested is the fourth type, the fourth model is selected as the test model corresponding to the software to be tested. The fourth model is an orthogonal experiment table model, which designs multi-factor experiments through an orthogonal array and can achieve comprehensive test coverage with fewer experimental runs. When the test type is the fourth type, the test object is a multi-factor combination and the application scenario is complex; conventional models handle such cases with low efficiency and accuracy, so this embodiment adopts the orthogonal experiment table model as the test model of the software to be tested.
Specifically, the step of generating the test case by the orthogonal experiment table model includes:
1. defining input factors and levels
First, the input factors of the system and the possible values (levels) of each factor are specified.
Example:
Input factors:
FactorA: temperature (continuous value: 20°C, 30°C, 40°C)
FactorB: pressure (continuous value: 100 kPa, 200 kPa, 300 kPa)
FactorC: humidity (continuous value: 40%, 60%, 80%)
2. Selecting an orthogonal array
An appropriate orthogonal array is selected based on the input factors and the number of levels. An orthogonal array is a special matrix in which each level of each factor is uniformly distributed in each column.
Common orthogonal arrays are:
L4: applicable to 2 factors, each with 2 levels.
L8: applicable to 2 factors with 4 levels each, or 3 factors with 2 levels each.
L9: applicable to 3 factors, each with 3 levels.
L16: applicable to 4 factors with 2 levels each, or 2 factors with 4 levels each.
In this example, the L9 orthogonal array is selected because it can handle 3 factors with 3 levels each. (L4, L8, L9 and L16 are identifiers of orthogonal arrays: the letter "L" indicates an orthogonal array, and the number following it indicates the number of experimental runs in the array.)
3. Generating test cases
And generating a test case according to the orthogonal array, and recording an expected output value.
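The L9-based generation can be sketched as follows. The array below is the standard L9 layout restricted to three columns; the factor names and level values follow the temperature/pressure/humidity example, and each of the 9 runs becomes one test case (versus 27 for exhaustive combination).

```python
# Standard L9 orthogonal array (first three columns); entries are
# indices into each factor's level list. Every pair of columns
# contains each ordered pair of levels exactly once.
L9 = [
    (0, 0, 0), (0, 1, 1), (0, 2, 2),
    (1, 0, 1), (1, 1, 2), (1, 2, 0),
    (2, 0, 2), (2, 1, 0), (2, 2, 1),
]

FACTORS = {
    "temperature": ["20C", "30C", "40C"],
    "pressure": ["100kPa", "200kPa", "300kPa"],
    "humidity": ["40%", "60%", "80%"],
}

def generate_orthogonal_cases():
    """Translate each L9 row into a concrete factor/level assignment;
    each of the 9 rows is one test case."""
    names = list(FACTORS)
    return [
        {name: FACTORS[name][idx] for name, idx in zip(names, row)}
        for row in L9
    ]
```

Because the array is balanced, every level of every factor appears in exactly three of the nine runs.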
According to this embodiment, the corresponding test model is selected for the test type of the software to be tested through the predetermined mapping relation between test types and test models, so that the generated test cases comprehensively cover the various conditions of the test object and the efficiency of test case generation is improved.
S4, generating a first test case according to the test model and the test parameters.
In an embodiment, the generating a first test case according to the test model and the test parameters includes:
Acquiring an input parameter value and an expected parameter value in the test parameters;
constructing an initial test case according to the input parameter value, and executing the initial test case to obtain an output parameter value;
Judging whether the output parameter value falls within the expected parameter value;
If the output parameter value is not contained in the expected parameter value, judging that the initial test case is unqualified;
and if the output parameter value is contained in the expected parameter value, taking the initial test case as the first test case.
In this embodiment, the input parameter value refers to the input data set for executing a test case. The expected parameter value refers to the output parameter value that should be obtained after the input parameter value is supplied to the test case; it is usually a range of values, and the test case is qualified when the output parameter value lies within that range. If the output parameter value is not within the range of the expected parameter value, the test case is judged unqualified, and the initial test case is reconstructed until the obtained output parameter value lies within the range of the expected parameter value.
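A minimal sketch of this qualification check follows. The `run` callable stands in for actually executing the software under test, and the discount computation used as the system under test is purely hypothetical.

```python
def validate_case(initial_case, run, expected_range):
    """Execute the initial test case and keep it as a first test case
    only when the output parameter value lies in the expected range."""
    low, high = expected_range
    output = run(initial_case)
    return {
        "case": initial_case,
        "output": output,
        "qualified": low <= output <= high,  # within expected parameter range
    }

# Hypothetical system under test: a student discount computation.
result = validate_case(
    {"Age": 25, "IsStudent": True},
    run=lambda case: 0.2 if case["IsStudent"] else 0.0,
    expected_range=(0.1, 0.3),
)
```

An unqualified result would trigger reconstruction of the initial case, as described above.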
According to this embodiment, a preliminary first test case is generated through the determined test model and the acquired test parameters, which facilitates the subsequent coverage test.
S5, performing coverage rate test on the first test case to obtain a second test case, and feeding the second test case back to the user terminal as a target test case.
In an embodiment, the performing the coverage rate test on the first test case to obtain a second test case includes:
executing the first test case by using a preset test framework to obtain a test result;
calculating a coverage value of the test result by using a preset coverage tool, and comparing the coverage value with a preset coverage threshold;
if the coverage value is greater than or equal to the coverage threshold, the first test case is used as the second test case;
And if the coverage value is smaller than the coverage threshold, modifying the first test case, and recalculating the coverage value of the modified first test case until the calculated coverage value is larger than or equal to the coverage threshold, and taking the test case corresponding to the coverage value larger than or equal to the coverage threshold as the second test case.
In this embodiment, modifying the first test case includes modifying the title and description, updating the preconditions, adjusting the test steps, or updating the expected result, etc. The preset test framework is used to execute the first test case and obtain a test result after execution is completed; the coverage tool then analyzes the test result to obtain a coverage report, and the coverage value corresponding to the first test case is calculated from the coverage report. The coverage report typically includes the number of executed code lines and the total number of lines, the coverage of branches and conditions, the areas of code that have not been covered, and the number of relevant lines.
The calculation of the coverage value comprises statement coverage, branch coverage, path coverage and condition coverage.
Statement coverage: the proportion of code lines covered by the test cases to the total number of code lines.
Statement coverage = (number of executed code lines / total number of code lines) × 100%.
Branch coverage: the proportion of conditional branches covered by the test cases to the total number of conditional branches.
Branch coverage = (number of covered conditional branches / total number of conditional branches) × 100%.
Path coverage: the proportion of code paths covered by the test cases to the total number of paths.
Path coverage = (number of covered code paths / total number of code paths) × 100%.
Condition coverage: the proportion of conditional expression outcomes (true/false) covered by the test cases to the total number of conditional expression outcomes.
Condition coverage = (number of covered conditional expression outcomes / total number of conditional expression outcomes) × 100%.
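The ratios can be computed directly; a sketch with statement and branch coverage, plus the threshold comparison of step S5, is shown below. The 80% threshold is an assumed example value, not one prescribed by the embodiment.

```python
def statement_coverage(executed_lines: int, total_lines: int) -> float:
    """Percentage of executable code lines exercised by the test cases."""
    return executed_lines / total_lines * 100

def branch_coverage(covered_branches: int, total_branches: int) -> float:
    """Percentage of conditional branches exercised by the test cases."""
    return covered_branches / total_branches * 100

def meets_threshold(coverage: float, threshold: float = 80.0) -> bool:
    """Compare a coverage value with the preset coverage threshold."""
    return coverage >= threshold
```

Path coverage and condition coverage follow the same covered-over-total pattern with their respective counts.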
According to this embodiment, the first test case is executed using the preset test framework to obtain a test result, the coverage value of the test result is calculated and compared with the preset coverage threshold, and the first test case is modified according to the comparison result to ensure that the coverage value of the test case is greater than or equal to the preset coverage threshold, thereby obtaining a better second test case. Performing the coverage test on the first test case optimizes it into the second test case and ensures that the finally generated test case has a higher coverage rate.
In an embodiment, after the second test case is fed back to the user terminal as the target test case, the method further includes:
judging whether a test case similar to the target test case is stored in a preset database or not;
If the database stores a test case similar to the target test case, the similar test case is replaced with the target test case in the database;
And if the database does not store the test cases similar to the target test case, storing the target test case into the database.
In this embodiment, after the target test case is fed back to the user terminal, the target test case is stored in a preset database. The preset database may be a SQL SERVER database, an Oracle database or a MySQL database, selected according to the actual requirements or the application environment. The SQL SERVER database is a relational database management system widely used for storing and managing test cases; it provides powerful data storage and query functions and suits scenarios in which a large number of test cases must be managed and queried efficiently. The Oracle database is also a commonly used relational database for storing and managing test cases; it offers high reliability and security and suits test case storage with high data security requirements. The MySQL database is an open-source relational database management system, also widely used for storing test cases owing to its openness and cost-effectiveness; it supports a large number of concurrent connections and suits test environments that must handle a large number of user accesses.
When the target test case is to be stored in the preset database, the database is first queried for a test case similar to the target test case. If one exists, the target test case is an updated version of that similar test case, so the target test case replaces the similar test case in the database. If none exists, the target test case is stored in the database directly.
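The replace-or-insert logic can be sketched as follows. The similarity measure (a `difflib` ratio over the case text) and the 0.8 threshold are assumptions for illustration; the embodiment leaves the similarity criterion and the underlying database open, and a real system would query SQL SERVER, Oracle or MySQL instead of a dictionary.

```python
import difflib

def store_target_case(database: dict, case_id: str, target_case: str,
                      threshold: float = 0.8) -> str:
    """Replace a stored case whose text is sufficiently similar to the
    target case; otherwise insert the target case as a new record.
    Returns the id under which the target case is stored."""
    for stored_id, stored_text in database.items():
        ratio = difflib.SequenceMatcher(None, stored_text, target_case).ratio()
        if ratio >= threshold:
            database[stored_id] = target_case   # replace the similar case
            return stored_id
    database[case_id] = target_case             # no similar case: insert
    return case_id

db = {"tc1": "verify login with valid username and password"}
used = store_target_case(db, "tc2",
                         "verify login with valid username and passcode")
```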
The embodiment stores the generated target test cases in the preset database, and can conveniently add, modify and delete the cases, thereby improving the expandability and maintainability of the test cases, effectively tracking and managing the test process, and simultaneously facilitating the multiplexing and knowledge transfer of the test cases.
According to the above technical scheme, upon receiving a request to generate test cases, the test information corresponding to the software to be tested is obtained, guaranteeing the integrity and accuracy of the test information. The test type of the software to be tested is analyzed according to the test object in the test information, ensuring that the subsequently generated test cases match the test object. The test model corresponding to the software to be tested is determined according to the test type; selecting a suitable test model guarantees the generation of high-quality test cases. A first test case is generated according to the test model and the test parameters, a coverage test is performed on the first test case to obtain a second test case, and the second test case is fed back to the user; optimizing the test cases through the coverage test ensures that the finally generated test cases have a higher coverage rate. According to the invention, the corresponding test model is selected through the test type of the test object, and coverage detection is performed on the test cases generated by the selected model, so that the coverage rate of the test cases is ensured and the efficiency of automatically generating test cases is improved.
FIG. 3 is a schematic structural diagram of a test case generating device according to an embodiment of the present invention.
The test case generating device 100 according to the present invention may be installed in an electronic apparatus. Depending on the implemented functions, the test case generating device may include a test information obtaining module 101, a test type analyzing module 102, a test model determining module 103, a test case generating module 104, and a test case returning module 105. A module (which may also be referred to as a unit) refers to a series of computer program segments that can be executed by a processor of an electronic device, perform a fixed function, and are stored in a memory of the electronic device.
In the present embodiment, the functions concerning the respective modules/units are as follows:
The test information acquisition module 101 is used for obtaining the test information corresponding to software to be tested after receiving a request sent by a user terminal to generate test cases for the software to be tested, wherein the test information comprises a test object and test parameters;
The test type analysis module 102 is used for analyzing the test type of the software to be tested according to the test object;
the test model determining module 103 is used for determining a corresponding test model according to a predetermined mapping relation between the test type and the test model and the test type of the software to be tested;
The test case generation module 104 is configured to generate a first test case according to the test model and the test parameters;
and the test case returning module 105 is used for carrying out coverage rate test on the first test case to obtain a second test case, and feeding back the second test case as a target test case to the user terminal.
In an embodiment, when acquiring the test information corresponding to the software to be tested, the test information obtaining module 101 is configured to:
acquiring a test requirement document uploaded by the user terminal;
carrying out keyword recognition on the test requirement document by using a preset keyword recognition algorithm to obtain document keywords;
and extracting the test object and the test parameters from the document keywords, and taking the test object and the test parameters as the test information.
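A naive version of this keyword-based extraction might look as follows; the keyword sets and the line-oriented document format are invented for illustration and do not reflect the actual preset recognition algorithm:

```python
# Illustrative keyword scan over a requirement document (assumed format:
# one statement per line, parameters given as "input:"/"expected:" lines).

OBJECT_KEYWORDS = {"login flow", "payment amount"}      # assumed vocabulary
PARAMETER_KEYWORDS = ("input:", "expected:")

def extract_test_info(document: str) -> dict:
    test_objects, test_parameters = [], {}
    for raw in document.splitlines():
        line = raw.strip().lower()
        for kw in OBJECT_KEYWORDS:                      # recognize test objects
            if kw in line:
                test_objects.append(kw)
        for kw in PARAMETER_KEYWORDS:                   # recognize parameters
            if line.startswith(kw):
                test_parameters.setdefault(kw.rstrip(":"), []).append(
                    line[len(kw):].strip())
    return {"test_object": test_objects, "test_parameters": test_parameters}
```

A real recognition algorithm would likely involve tokenization or NLP; the sketch only shows where the extracted object and parameters end up.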
In one embodiment, when analyzing the test type of the software to be tested according to the test object, the test type analysis module 102 is configured to:
acquiring the test object in the test information;
judging the type of the test object: if the test object is a flow class, determining that the test type of the software to be tested is a first type;
if the test object is a numerical class, further analyzing whether it is a discrete numerical class or a continuous numerical class; if it is a discrete numerical class, determining that the test type of the software to be tested is a second type, and if it is a continuous numerical class, determining that the test type is a third type;
and if the test object comprises both a numerical class and a flow class, determining that the test type of the software to be tested is a fourth type.
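The four-way classification above is essentially a small decision tree. A minimal sketch, assuming each test object carries a `kind` field and an optional `discrete` flag (both field names are illustrative, not disclosed by the embodiment):

```python
# Decision tree for module 102's four test types (field names assumed).

def classify(test_objects) -> int:
    has_flow = any(o["kind"] == "flow" for o in test_objects)
    has_numeric = any(o["kind"] == "numeric" for o in test_objects)
    if has_flow and has_numeric:
        return 4                                  # mixed: fourth type
    if has_flow:
        return 1                                  # flow class: first type
    all_discrete = all(o.get("discrete", False) for o in test_objects)
    return 2 if all_discrete else 3               # discrete vs continuous
```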
In an embodiment, when determining the corresponding test model according to the predetermined mapping relationship between the test type and the test model and the test type of the software to be tested, the test model determining module 103 is configured to:
if the test type of the software to be tested is a first type, acquiring a first model corresponding to the first type as the test model corresponding to the software to be tested;
if the test type of the software to be tested is a second type, acquiring a second model corresponding to the second type as the test model corresponding to the software to be tested;
if the test type of the software to be tested is a third type, acquiring a third model corresponding to the third type as the test model corresponding to the software to be tested;
and if the test type of the software to be tested is a fourth type, acquiring a fourth model corresponding to the fourth type as the test model corresponding to the software to be tested.
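Since each type maps to exactly one model, the four branches above amount to a single lookup in the predetermined mapping relation. A minimal sketch (the model names are placeholders; the actual first to fourth models are not specified here):

```python
# The predetermined type->model mapping expressed as a dictionary lookup.
TYPE_TO_MODEL = {
    1: "first model",    # flow class
    2: "second model",   # discrete numerical class
    3: "third model",    # continuous numerical class
    4: "fourth model",   # mixed numerical and flow class
}

def select_model(test_type: int) -> str:
    return TYPE_TO_MODEL[test_type]   # replaces the four if-branches
```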
In one embodiment, when generating the first test case according to the test model and the test parameters, the test case generation module 104 is configured to:
acquiring an input parameter value and an expected parameter value from the test parameters;
constructing an initial test case according to the input parameter value, and executing the initial test case to obtain an output parameter value;
judging whether the output parameter value is contained in the expected parameter value;
if the output parameter value is not contained in the expected parameter value, judging that the initial test case is unqualified;
and if the output parameter value is contained in the expected parameter value, taking the initial test case as the first test case.
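The qualification check above can be sketched as follows; `run_case` stands in for actually executing the initial test case against the software under test, and the parameter layout is an assumption of the sketch:

```python
# Sketch of module 104: build an initial case, execute it, and keep it as
# the first test case only if its output falls within the expected values.

def build_first_case(test_parameters, run_case):
    inputs = test_parameters["input"]
    expected = set(test_parameters["expected"])
    case = {"input": inputs}
    output = run_case(case)            # execute the initial test case
    if output not in expected:
        return None                    # initial test case is unqualified
    case["output"] = output
    return case                        # qualifies as the first test case
```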
In an embodiment, when performing the coverage rate test on the first test case to obtain the second test case, the test case return module 105 is configured to:
executing the first test case by using a preset test frame to obtain a test result;
calculating a coverage value of the test result by using a preset coverage tool, and comparing the coverage value with a preset coverage threshold;
if the coverage value is greater than or equal to the coverage threshold, taking the first test case as the second test case;
and if the coverage value is smaller than the coverage threshold, modifying the first test case and recalculating the coverage value of the modified test case until the calculated coverage value is greater than or equal to the coverage threshold, and taking the test case whose coverage value reaches the threshold as the second test case.
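The modify-and-remeasure loop above can be sketched generically. `measure_coverage` stands in for a real coverage tool (coverage.py, for instance) and `refine` for whatever case-modification strategy is used; both, along with the bounded retry count, are assumptions of the sketch:

```python
# Sketch of module 105's coverage loop: refine the first test case until
# its coverage value reaches the threshold, then return it as the second
# test case. A retry bound is added so the sketch always terminates.

def refine_until_covered(first_case, measure_coverage, refine,
                         threshold=0.8, max_rounds=10):
    case = first_case
    for _ in range(max_rounds):
        if measure_coverage(case) >= threshold:
            return case                # becomes the second test case
        case = refine(case)            # modify, then re-measure
    raise RuntimeError("coverage threshold not reached")
```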
In an embodiment, when feeding back the second test case as the target test case to the user terminal, the test case return module 105 is configured to:
judging whether a test case similar to the target test case is stored in a preset database;
if a test case similar to the target test case is stored in the database, replacing the similar test case with the target test case in the database;
and if no test case similar to the target test case is stored in the database, storing the target test case in the database.
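The storage step can be sketched as a replace-or-insert over the database. Here "similar" is reduced to an exact match on inputs purely for illustration; the actual similarity judgment is not specified at this level:

```python
# Replace-or-insert sketch for the target test case (similarity rule assumed).

def is_similar(a: dict, b: dict) -> bool:
    return a["input"] == b["input"]        # illustrative similarity rule

def store_case(db: list, target_case: dict) -> list:
    for i, stored in enumerate(db):
        if is_similar(stored, target_case):
            db[i] = target_case            # replace the similar stored case
            return db
    db.append(target_case)                 # no similar case: insert new one
    return db
```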
The invention provides a test case generating device. After a request for generating a test case is received, the test information corresponding to the software to be tested is acquired, which ensures the integrity and accuracy of the test information. The test type of the software to be tested is analyzed according to the test object in the test information, ensuring that the subsequently generated test case matches the test object. A test model corresponding to the software to be tested is determined according to the test type; selecting a proper test model ensures the generation of a high-quality test case. A first test case is generated according to the test model and the test parameters, a coverage rate test is performed on the first test case to obtain a second test case, and the second test case is fed back to the user; optimizing the test case through the coverage rate test ensures that the finally generated test case has a higher coverage rate. According to the invention, the corresponding test model is selected according to the test type of the test object, and coverage rate detection is performed on the test cases generated by the selected test model, so that the coverage rate of the test cases is guaranteed and the efficiency of automatically generating test cases is improved.
For specific limitations of the test case generating device, reference may be made to the limitations of the test case generating method above, which are not repeated here. The respective modules in the test case generating device described above may be implemented in whole or in part by software, hardware, or a combination thereof. The above modules may be embedded in, or independent of, a processor in the computer device in hardware form, or may be stored in a memory in the computer device in software form, so that the processor can call and execute the operations corresponding to the above modules.
In one embodiment, a computer device is provided, which may be a server, and the internal structure of which may be as shown in fig. 4. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes non-volatile and/or volatile storage media and internal memory. The non-volatile storage medium stores an operating system, computer programs, and a database. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The network interface of the computer device is for communicating with an external client via a network connection. The computer program, when executed by a processor, performs functions or steps of a server side of a test case generating method.
In one embodiment, a computer device is provided, which may be a client, the internal structure of which may be as shown in FIG. 5. The computer device includes a processor, a memory, a network interface, a display screen, and an input device connected by a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external server via a network connection. The computer program, when executed by the processor, performs the functions or steps of the client side of the test case generating method.
In one embodiment, a computer device is provided, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the following steps when executing the computer program:
after receiving a request, sent by a user terminal, for generating a test case for the software to be tested, acquiring test information corresponding to the software to be tested, wherein the test information comprises a test object and test parameters;
analyzing the test type of the software to be tested according to the test object;
determining a corresponding test model according to a predetermined mapping relation between the test type and the test model and the test type of the software to be tested;
generating a first test case according to the test model and the test parameters;
and performing a coverage rate test on the first test case to obtain a second test case, and feeding back the second test case as a target test case to the user terminal.
In one embodiment, a computer readable storage medium is provided having a computer program stored thereon, which when executed by a processor, performs the steps of:
After receiving a test case request for generating software to be tested, which is sent by a user terminal, acquiring test information corresponding to the software to be tested, wherein the test information comprises a test object and test parameters;
analyzing the test type of the software to be tested according to the test object;
determining a corresponding test model according to a predetermined mapping relation between the test type and the test model and the test type of the software to be tested;
generating a first test case according to the test model and the test parameters;
and performing coverage rate test on the first test case to obtain a second test case, and feeding back the second test case serving as a target test case to the user terminal.
It should be noted that, the functions or steps implemented by the computer readable storage medium or the computer device may correspond to the relevant descriptions of the server side and the client side in the foregoing method embodiments, and are not described herein for avoiding repetition.
Those skilled in the art will appreciate that implementing all or part of the methods described above may be accomplished by a computer program stored on a non-transitory computer-readable storage medium, which, when executed, may include the steps of the embodiments of the methods described above. Any reference to memory, storage, a database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory. The non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. The volatile memory may include random access memory (RAM) or an external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division of functional units and modules is illustrated as an example. In practical applications, the above functions may be distributed among different functional units and modules as needed; that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the functions described above.
The foregoing embodiments are merely illustrative of the technical solutions of the present application and are not restrictive. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that modifications may still be made to the technical solutions described in the foregoing embodiments, or equivalent substitutions may be made for some of the technical features thereof, and such modifications or substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application. It should also be noted that any third-party software tool or component appearing in the embodiments of the present application is presented by way of example only and does not represent actual use.