
CN119597653A - Test case generation method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN119597653A
CN119597653A (application CN202411656186.6A)
Authority
CN
China
Prior art keywords: test, test case, type, software, tested
Prior art date
Legal status
Pending
Application number
CN202411656186.6A
Other languages
Chinese (zh)
Inventor
孙立超 (Sun Lichao)
Current Assignee
Ping An International Financial Leasing Co Ltd
Original Assignee
Ping An International Financial Leasing Co Ltd
Priority date
Application filed by Ping An International Financial Leasing Co Ltd
Priority to CN202411656186.6A
Publication of CN119597653A

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/36: Prevention of errors by analysis, debugging or testing of software
    • G06F 11/3668: Testing of software
    • G06F 11/3672: Test management
    • G06F 11/3676: Test management for coverage analysis
    • G06F 11/3684: Test management for test design, e.g. generating new test cases

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract


The present invention relates to the fields of test case generation and financial technology. After a request to generate test cases for software to be tested is received, the test information corresponding to that software is obtained. The test type of the software is analyzed from the test object in the test information, and the test model corresponding to the software is then determined from a mapping between test types and test models, so that a model suited to the test type is selected. A first test case is generated from the test model and the obtained test parameters; a coverage test is performed on this preliminary test case to obtain a second test case, and the second test case is fed back to the user terminal. By selecting the test model according to the test type of the test object and checking the coverage of the test cases that model generates, the invention guarantees test-case coverage and thereby improves the efficiency of test case generation.

Description

Test case generation method and device, electronic equipment and storage medium
Technical Field
The invention relates to the field of test case generation and the field of financial technology, and in particular to a test case generation method, a device, electronic equipment, and a storage medium.
Background
It is well known that financial software products are tested across their various functions and interfaces, such as account management, transaction processing, transfers, loan applications, and investment operations, before release. Test cases are a key component of software testing: whenever testing is performed, they are its basis and core.
In the prior art, one common approach is for a tester to collect data and then manually write a large number of test cases from the collected data. Another approach is to generate test cases directly with a deep learning model; however, different test objects and test requirements often call for different test methods or test models, and a single deep learning model cannot satisfy them all, so the quality of test cases generated this way cannot be guaranteed.
Therefore, how to quickly generate test cases of excellent quality has become a technical problem to be solved.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a test case generation method that can quickly generate test cases of excellent quality.
In order to achieve the above object, the present invention provides a test case generating method, which is characterized in that the method includes:
After receiving a test case request for generating software to be tested, which is sent by a user terminal, acquiring test information corresponding to the software to be tested, wherein the test information comprises a test object and test parameters;
analyzing the test type of the software to be tested according to the test object;
determining a corresponding test model according to a predetermined mapping relation between the test type and the test model and the test type of the software to be tested;
generating a first test case according to the test model and the test parameters;
and performing coverage rate test on the first test case to obtain a second test case, and feeding back the second test case serving as a target test case to the user terminal.
In addition, in order to achieve the above object, the present invention also provides a test case generating device, which is characterized in that the device includes:
the test information acquisition module is used for acquiring test information corresponding to the software to be tested after receiving a test case request for generating the software to be tested, which is sent by the user terminal, wherein the test information comprises a test object and test parameters;
the test type analysis module is used for analyzing the test type of the software to be tested according to the test object;
the test model determining module is used for determining a corresponding test model according to a predetermined mapping relation between the test type and the test model and the test type of the software to be tested;
the test case generation module is used for generating a first test case according to the test model and the test parameters;
and the test case returning module is used for carrying out coverage rate test on the first test case to obtain a second test case, and feeding back the second test case serving as a target test case to the user terminal.
In addition, to achieve the above object, the present invention also provides an electronic device including:
a memory storing at least one computer program; and
a processor that executes the program stored in the memory to implement the test case generation method described above.
In addition, to achieve the above object, the present invention further provides a computer readable storage medium having at least one computer program stored therein, the at least one computer program being executed by a processor in an electronic device to implement the test case generating method described above.
According to the technical scheme, the test information corresponding to the software to be tested is obtained after the request to generate test cases is received, guaranteeing the completeness and accuracy of the test information. The test type of the software is analyzed from the test object in the test information, ensuring that the subsequently generated test cases match the test object. The test model corresponding to the software is determined from the test type; selecting a suitable model guarantees the generation of high-quality test cases. A first test case is generated from the test model and the test parameters, a coverage test is performed on it to obtain a second test case, and the second test case is fed back to the user; the coverage test optimizes the test cases and ensures that the final test cases have high coverage. By selecting the test model according to the test type of the test object and checking the coverage of the test cases generated by that model, the invention guarantees test-case coverage and improves the efficiency of automatic test case generation.
Drawings
FIG. 1 is a schematic diagram of an application environment of a test case generating method according to an embodiment of the present invention;
FIG. 2 is a flowchart of a test case generating method according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a test case generating device according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a computer device according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of another configuration of a computer device according to an embodiment of the present invention.
The objects, functional features, and advantages of the present application are further described below with reference to the accompanying drawings and in conjunction with the embodiments.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be noted that the descriptions of "first", "second", etc. in this disclosure are for descriptive purposes only and are not to be construed as indicating or implying relative importance or the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In addition, the technical solutions of the embodiments may be combined with each other, provided that the combination can be realized by those skilled in the art; when the solutions are contradictory or cannot be realized, the combination should be considered absent and outside the scope of protection claimed by the present invention.
The test case generation method provided by the embodiment of the invention can be applied to an application environment as shown in FIG. 1, in which a client communicates with a server through a network. After the server receives a test case request sent by a user terminal for software to be tested, it obtains the test information corresponding to that software, where the test information comprises a test object and test parameters. The server analyzes the test type of the software according to the test object, determines the corresponding test model according to a predetermined mapping between test types and test models, generates a first test case according to the test model and the test parameters, performs a coverage test on the first test case to obtain a second test case, and feeds the second test case back to the user terminal as the target test case. Obtaining the test information after the request is received guarantees its completeness and accuracy and provides a basis for the subsequent steps; analyzing the test type from the test object ensures that the generated test cases match the test object; determining the test model from the mapping between test types and test models selects a suitable model and improves the efficiency and accuracy of test case generation; generating the first test case from the model and the obtained test parameters provides a preliminary case for the subsequent coverage test; and the coverage test optimizes the test case and guarantees that the final test case has high coverage. By selecting the test model according to the test type of the test object and checking the coverage of the test cases it generates, the invention guarantees test-case coverage and improves the efficiency and accuracy of automatic test case generation. The clients may be, but are not limited to, personal computers, notebook computers, smart phones, tablet computers, and portable wearable devices. The server may be implemented as a stand-alone server or as a server cluster formed by a plurality of servers. The invention is described in detail below with reference to specific examples.
Referring to a flow chart of a test case generation method according to an embodiment of the present invention shown in fig. 2, in the embodiment of the present invention, the test case generation method includes the following steps S1 to S5:
S1, after receiving a test case request for generating software to be tested, which is sent by a user terminal, test information corresponding to the software to be tested is obtained, wherein the test information comprises a test object and test parameters.
In an embodiment, after receiving a test case request for generating software to be tested sent by a user terminal, analyzing the request to obtain test information corresponding to the software to be tested.
In an embodiment, the obtaining the test information corresponding to the software to be tested includes:
acquiring a test requirement document uploaded by the user terminal;
carrying out keyword recognition on the test requirement document by using a preset keyword recognition algorithm to obtain document keywords;
and extracting test objects and test parameters from the document keywords, and taking the test objects and the test parameters as the test information.
In this embodiment, to simplify user operation, the client that sends the test case request may also allow the user to upload the requirement document of the software to be tested. For example, the client may include a requirement-document upload control through which the user triggers the upload operation and uploads the requirement document. In this case, in a specific implementation, the requirement document of the software to be tested may be obtained, and keyword recognition may then be performed on it to obtain the test object and test parameters of the software.
When the requirement document is analyzed, keyword recognition can be performed on it, and the test requirements of the software to be tested are obtained from the recognized keywords. This embodiment does not limit the choice of keywords, which can be determined according to the actual situation. The analysis of the requirement document can also be completed by a large language model; in other words, the requirement document can be input into a large language model to obtain the test information of the software to be tested.
This embodiment does not limit the software to be tested: it may be system software, application software, embedded software, information security software, industrial software, or other software.
S2, analyzing the test type of the software to be tested according to the test object.
In an embodiment, the analyzing the test type of the software to be tested according to the test object includes:
Acquiring a test object in the test information;
judging the type of the test object: if the test object is a flow class, the test type of the software to be tested is judged to be a first type;
If the test object is a numerical class, analyzing that the test object is a discrete numerical class or a continuous numerical class, if the test object is a discrete numerical class, judging that the test type of the software to be tested is a second type, and if the test object is a continuous numerical class, judging that the test type of the software to be tested is a third type;
And if the test object simultaneously comprises a numerical class and a flow class, judging that the test type of the software to be tested is a fourth type.
In this embodiment, test objects are classified into a flow class, a numerical class, and a multi-factor combination class (containing both numerical and flow elements); the numerical class is further divided into discrete and continuous numerical classes. A flow-class test object generally refers to a system or function executed according to a specific process or sequence of steps, characterized by multiple steps, state transitions, and user interaction. An example is a user registration flow: the user opens the registration page, fills in a user name, password, email, and other information, and submits the registration form; the system verifies the validity of the input and sends a confirmation email, and the user clicks the confirmation link to complete registration. A discrete-numerical-class test object takes a specific value or a finite set of values that can be enumerated and counted, such as the number of product purchases or the number of users. A continuous-numerical-class test object can take any value within a given interval, such as an annual rate of return, a stock price, or a bond face value. A multi-factor-combination-class test object involves testing multiple variables (factors), each with multiple levels (values); for example, in a financial loan system the variables may be loan type, loan amount, and loan term, each with several possible values.
S3, determining a corresponding test model according to a predetermined mapping relation between the test type and the test model and the test type of the software to be tested.
In an embodiment, the determining the corresponding test model according to the predetermined mapping relationship between the test type and the test model and the test type of the software to be tested includes:
If the test type of the software to be tested is a first type, acquiring a first model corresponding to the first type as a test model corresponding to the software to be tested;
if the test type of the software to be tested is a second type, acquiring a second model corresponding to the second type as a test model corresponding to the software to be tested;
If the test type of the software to be tested is a third type, acquiring a third model corresponding to the third type as a test model corresponding to the software to be tested;
and if the test type of the software to be tested is a fourth type, acquiring a fourth model corresponding to the fourth type as a test model corresponding to the software to be tested.
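Steps S2 and S3 can be sketched together as a small classifier plus a dispatch table; the flag names and the string identifiers for the models are illustrative assumptions, not part of the patent.

```python
def classify_test_type(is_flow, is_numeric, numeric_kind=None):
    """Return the test type (1-4) following the rules of step S2."""
    if is_flow and is_numeric:
        return 4                      # multi-factor combination class
    if is_flow:
        return 1                      # flow class
    if is_numeric:
        # discrete numerical class -> 2, continuous numerical class -> 3
        return 2 if numeric_kind == "discrete" else 3
    raise ValueError("unrecognized test object")

# Predetermined mapping between test types and test models (step S3).
TEST_MODELS = {
    1: "finite state machine model",
    2: "decision tree model",
    3: "equivalence class analysis table model",
    4: "orthogonal experiment table model",
}

def select_model(is_flow, is_numeric, numeric_kind=None):
    """Look up the test model for a classified test object."""
    return TEST_MODELS[classify_test_type(is_flow, is_numeric, numeric_kind)]
```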
In this embodiment, when the test type of the software to be tested is the first type, the first model is selected as the test model corresponding to the software. The first model is a finite state machine model, a mathematical model that can cover all nodes and all states; it is well suited to generating test cases for software whose test object is a flow class. A finite state machine operates by processing external events through state transitions and the execution of actions, thereby controlling the behavior of the software to be tested.
Specifically, the step of generating the test case by the finite state machine model includes:
1. States and events are defined.
First, the state and events of the system are clarified. The state represents the condition of the system at a certain moment, and the event is an action triggering state transition.
2. Drawing a state diagram.
A state diagram visualizes the states of the system and the transitions between them, making both clearly visible.
3. Generating test cases.
And generating a test case according to the state diagram, and ensuring that each state and transition are tested. Test cases may be generated using the following method:
3.1 Traversing all states and transitions.
Ensuring that each state and each transition is tested at least once. The state diagram may be traversed by a depth-first search (DFS) or a breadth-first search (BFS).
3.2 Path coverage.
Test cases are generated that cover all possible paths. This can be achieved by generating all paths from the initial state to the final state.
3.3 Boundary value test.
For each state transition, a boundary value is tested. For example, if a condition of a certain state transition is that the value of a certain variable is within a certain range, the boundary value of the range is tested.
3.4 Abnormal path test.
In addition to the normal path, an abnormal path should be tested, i.e. a state transition that may lead to a system error.
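The FSM-based steps above can be sketched as a depth-first traversal of a transition table; the registration-flow states and event names below are illustrative assumptions.

```python
# Transition table for a hypothetical user-registration state machine:
# state -> list of (event, next_state).
TRANSITIONS = {
    "start":      [("open_page", "form")],
    "form":       [("submit_valid", "pending"), ("submit_invalid", "form_error")],
    "form_error": [("resubmit", "pending")],
    "pending":    [("confirm_email", "registered")],
}
FINAL_STATES = {"registered"}

def generate_paths(state, path=()):
    """Depth-first enumeration of event paths from `state` to a final state."""
    if state in FINAL_STATES:
        yield list(path)
        return
    for event, nxt in TRANSITIONS.get(state, []):
        if event in path:          # loop guard: do not reuse a transition
            continue
        yield from generate_paths(nxt, path + (event,))

# Each path is one test case; together they cover every state and transition.
cases = list(generate_paths("start"))
```

Boundary-value and abnormal-path cases (steps 3.3 and 3.4) would be added on top of this path set, e.g. by injecting invalid event data at each transition.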
In this embodiment, when the test type of the software to be tested is the second type, the second model is selected as the test model corresponding to the software. The second model is a decision tree model, which can effectively handle multi-condition logic branches and helps generate test cases that cover every combination of conditions; therefore, for software whose test object is a discrete numerical class, the decision tree model is selected as its test model.
Specifically, the step of generating the test case by the decision tree model includes:
1. Defining input factors and output results
First, the input factors and output results of the system are clarified. The input factors are factors that affect the behavior of the system, and the output results are the results of the system at a particular combination of inputs.
Examples:
Input factors:
Age: user age (discrete values: under 18, 18-65, over 65)
IsStudent: whether the user is a student (Boolean: True, False)
HasDiscount: whether a discount applies (Boolean: True, False)
Output result:
DiscountRate: discount rate (0.0, 0.1, 0.2, 0.3)
2. Constructing decision trees
Decision tree algorithms (e.g., ID3, C4.5, CART, etc.) are used to construct the decision tree model. The decision tree predicts the value of the target variable through a series of conditional decisions (decision nodes).
2.1 Selection of root node
The feature with the maximum information gain or the minimum Gini index is selected as the root node.
2.2 Recursive splitting
The data set is divided into subsets according to the selected feature, and each subset is recursively split until a stop condition is met (e.g., all samples in the subset belong to the same class, or the number of samples is less than a preset threshold).
2.3 Generation of leaf nodes
When the stop condition is satisfied, a leaf node is generated. The leaf nodes represent the final prediction result.
3. Extracting decision paths
All decision paths are extracted from the constructed decision tree. Each path from the root node to the leaf node represents a set of input conditions and corresponding output results.
4. Generating test cases
And generating test cases according to the extracted decision paths. Each path corresponds to a test case, ensuring that each condition combination is tested.
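The path-extraction step can be sketched on a hand-built tree mirroring the discount example above; the nested-tuple encoding of the tree is an illustrative assumption, not the output of ID3/C4.5/CART.

```python
# A hand-built decision tree for the discount example.
# Internal nodes: (feature, {value: subtree}); leaves: expected DiscountRate.
TREE = ("IsStudent", {
    True:  ("Age", {"under_18": 0.3, "18_to_65": 0.2, "over_65": 0.3}),
    False: ("HasDiscount", {True: 0.1, False: 0.0}),
})

def extract_paths(node, conditions=()):
    """Each root-to-leaf path becomes one test case: (inputs, expected output)."""
    if not isinstance(node, tuple):          # leaf reached
        yield dict(conditions), node
        return
    feature, branches = node
    for value, subtree in branches.items():
        yield from extract_paths(subtree, conditions + ((feature, value),))

# One test case per decision path, so every condition combination is tested.
test_cases = list(extract_paths(TREE))
```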
In this embodiment, when the test type of the software to be tested is the third type, the third model is selected as the test model corresponding to the software. The third model is an equivalence class analysis table model, whose operating principle is to divide all possible input data into several equivalence classes and then select a few representative values from each class as test cases. For software whose test object is a continuous numerical class, the values of the test object cannot be enumerated and counted, so this embodiment selects the equivalence class analysis table as the test model; this ensures that every input interval is tested, avoids omitting key input values, and improves the efficiency and accuracy of test case generation.
Specifically, the step of generating the test case by the equivalence class analysis table model comprises the following steps:
1. Defining input factors and output results
First, the input factors and output results of the system are clarified. The input factors are factors that affect the behavior of the system, and the output results are the results of the system at a particular combination of inputs.
Examples:
Input factors:
Age: user age (continuous value, range: 0 to 100)
IsStudent: whether the user is a student (Boolean: True, False)
HasDiscount: whether a discount applies (Boolean: True, False)
Output result:
DiscountRate: discount rate (0.0, 0.1, 0.2, 0.3)
2. Dividing equivalence classes
According to the characteristics of the input factors, the input values are divided into several equivalence classes; all input values within one equivalence class produce equivalent system behavior.
2.1 Determining valid equivalence classes and invalid equivalence classes
Valid equivalence class: input values that meet the system requirements.
Invalid equivalence class: input values that do not meet the system requirements.
Example equivalence class partitioning:
Age:
Valid equivalence classes: [0, 18), [18, 65), [65, 100]
Invalid equivalence classes: [-1, 0), (100, 101]
IsStudent:
Valid equivalence class: True, False
HasDiscount:
Valid equivalence class: True, False
3. Selecting representative test cases
One or more representative values are selected from each equivalence class as test cases; boundary values and outliers are also considered.
Example test cases:
Normal values:
Age: 10, 25, 70
IsStudent: True, False
HasDiscount: True, False
Boundary values:
Age: 0, 18, 65, 100
Outliers:
Age: -1, 101
4. Generating test cases
And generating a test case according to the selected representative value, and recording an expected output result.
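The equivalence-class procedure above can be sketched as follows, reusing the example classes, boundary values, and outliers; the representative-value rule (midpoint of each class) is an illustrative choice.

```python
# Equivalence classes for Age from the example above.
VALID_AGE_CLASSES = [(0, 18), (18, 65), (65, 100)]
BOUNDARY_AGES = [0, 18, 65, 100]
OUTLIER_AGES = [-1, 101]

def representatives(classes):
    """Pick one mid-range representative value per equivalence class."""
    return [(low + high) // 2 for low, high in classes]

def generate_cases():
    """Combine representative, boundary, and outlier ages with the boolean factors."""
    ages = representatives(VALID_AGE_CLASSES) + BOUNDARY_AGES + OUTLIER_AGES
    return [
        {"Age": age, "IsStudent": student, "HasDiscount": discount}
        for age in ages
        for student in (True, False)
        for discount in (True, False)
    ]
```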
In this embodiment, when the test type of the software to be tested is the fourth type, the fourth model is selected as the test model corresponding to the software. The fourth model is an orthogonal experiment table model, which designs multi-factor experiments through an orthogonal array and achieves comprehensive test coverage with relatively few experimental runs. For the fourth type, the test object is a multi-factor combination and the application scenario is complex, so conventional models handle it with low efficiency and accuracy; this embodiment therefore adopts the orthogonal experiment table model as the test model of the software to be tested.
Specifically, the step of generating the test case by the orthogonal experiment table model includes:
1. Defining input factors and levels
First, the input factors of the system and the possible values (levels) of each factor are specified.
Examples:
Input factors:
FactorA: temperature (levels: 20°C, 30°C, 40°C)
FactorB: pressure (levels: 100 kPa, 200 kPa, 300 kPa)
FactorC: humidity (levels: 40%, 60%, 80%)
2. Selecting an orthogonal array
An appropriate orthogonal array is selected based on the input factors and the number of levels. An orthogonal array is a special matrix in which each level of each factor is uniformly distributed in each column.
Common orthogonal arrays are:
L4: up to 3 factors, each with 2 levels
L8: up to 7 factors, each with 2 levels
L9: up to 4 factors, each with 3 levels
L16: up to 15 factors, each with 2 levels
For this example, the L9 orthogonal array is selected because it can handle 3 factors with 3 levels each. (In names such as L4, L8, L9, and L16, the number gives the number of experimental runs in the array, and the letter "L" indicates an orthogonal array.)
3. Generating test cases
And generating a test case according to the orthogonal array, and recording an expected output value.
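The L9-based generation can be sketched with the first three columns of the standard L9(3^4) array, mapping each level index to the factor values of the example; the factor key names are illustrative.

```python
# First three columns of the standard L9(3^4) orthogonal array (level indices 0..2).
L9 = [
    (0, 0, 0), (0, 1, 1), (0, 2, 2),
    (1, 0, 1), (1, 1, 2), (1, 2, 0),
    (2, 0, 2), (2, 1, 0), (2, 2, 1),
]

FACTORS = {
    "FactorA_temperature_C": [20, 30, 40],
    "FactorB_pressure_kPa": [100, 200, 300],
    "FactorC_humidity_pct": [40, 60, 80],
}

def generate_l9_cases():
    """One test case per array row: each level index selects a factor value."""
    names = list(FACTORS)
    return [
        {name: FACTORS[name][level] for name, level in zip(names, row)}
        for row in L9
    ]
```

Nine runs replace the 27 of a full factorial design, while each level of each factor still appears exactly three times per column.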
In this embodiment, the corresponding test model is selected for the test type of the software to be tested through the predetermined mapping between test types and test models, so that the generated test cases comprehensively cover the various conditions of the test object, improving the efficiency of test case generation.
S4, generating a first test case according to the test model and the test parameters.
In an embodiment, the generating a first test case according to the test model and the test parameters includes:
Acquiring an input parameter value and an expected parameter value in the test parameters;
constructing an initial test case according to the input parameter value, and executing the initial test case to obtain an output parameter value;
Judging whether the output parameter value is included in the expected parameter value;
If the output parameter value is not contained in the expected parameter value, judging that the initial test case is unqualified;
and if the output parameter value is contained in the expected parameter value, taking the initial test case as the first test case.
In this embodiment, the input parameter value refers to the input data set for executing a test case. The expected parameter value refers to the output parameter value that should be obtained after the input parameter value is fed to the test case; the expected parameter value is usually a range of values, and the test case is qualified when the output parameter value falls within that range. If the output parameter value is not within the range of the expected parameter value, the test case is judged unqualified and the initial test case is reconstructed until the obtained output parameter value falls within the range of the expected parameter value.
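A minimal sketch of this accept/reject check follows. The runner function and the concrete values are assumptions of the sketch, not part of the patent:

```python
def build_first_test_case(input_value, expected_range, run_case):
    """Execute the initial case and keep it only if the output falls inside
    the expected range, as described above. `run_case` stands in for the
    system under test and is an assumption of this sketch."""
    low, high = expected_range
    output = run_case(input_value)
    if low <= output <= high:
        return {"input": input_value, "expected": expected_range, "output": output}
    return None  # unqualified: the caller rebuilds the initial test case

# Hypothetical system under test: doubles its input.
case = build_first_test_case(21, (40, 44), lambda x: 2 * x)
print(case["output"])  # 42
```

When `None` is returned, the caller would reconstruct the initial test case with new input values and repeat, matching the loop described in the embodiment.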
According to the embodiment, the initial first test case is generated through the determined test model and the acquired test parameters, so that the follow-up coverage rate test is facilitated.
S5, performing coverage rate test on the first test case to obtain a second test case, and feeding the second test case back to the user terminal as a target test case.
In an embodiment, the performing the coverage rate test on the first test case to obtain a second test case includes:
executing the first test case by using a preset test frame to obtain a test result;
calculating a coverage value of the test result by using a preset coverage tool, and comparing the coverage value with a preset coverage threshold;
if the coverage value is greater than or equal to the coverage threshold, the first test case is used as the second test case;
And if the coverage value is smaller than the coverage threshold, modifying the first test case, and recalculating the coverage value of the modified first test case until the calculated coverage value is larger than or equal to the coverage threshold, and taking the test case corresponding to the coverage value larger than or equal to the coverage threshold as the second test case.
In this embodiment, modifying the first test case includes modifying the title and description, updating preconditions, adjusting test steps, or updating the expected result, among other changes. The preset test framework executes the first test case and produces a test result; the coverage tool then analyzes this result to produce a coverage report, from which the coverage value of the first test case is calculated. A coverage report typically includes the number of executed code lines and the total number of code lines, the coverage of branches and conditions, and the regions of code (and the number of lines) that remain uncovered.
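The execute-measure-modify loop described above can be sketched as follows; all callables here are assumed stand-ins for the test framework, coverage tool, and modification step:

```python
def refine_until_covered(case, run_and_measure, modify, threshold, max_rounds=10):
    """Re-execute and modify the case until its coverage value meets the
    threshold, mirroring the loop above. `run_and_measure` and `modify`
    are hypothetical hooks, not APIs named by the patent."""
    for _ in range(max_rounds):
        value = run_and_measure(case)
        if value >= threshold:
            return case, value
        case = modify(case)
    raise RuntimeError("coverage threshold not reached within max_rounds")

# Toy example: each modification adds one test step, raising coverage by 10 points.
final, value = refine_until_covered(
    {"steps": 5},
    run_and_measure=lambda c: 10 * c["steps"],   # hypothetical coverage measurement
    modify=lambda c: {"steps": c["steps"] + 1},  # hypothetical case modification
    threshold=80,
)
print(final["steps"], value)  # 8 80
```

The `max_rounds` guard is an addition of the sketch: without it, a case whose coverage can never reach the threshold would loop forever.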
The coverage value can be calculated as statement coverage, branch coverage, path coverage, or condition coverage.
Statement coverage is the proportion of code lines covered by the test case to the total number of code lines:
Statement coverage = (number of executed code lines / total number of code lines) × 100%.
Branch coverage is the proportion of conditional branches covered by the test case to the total number of conditional branches:
Branch coverage = (number of covered conditional branches / total number of conditional branches) × 100%.
Path coverage is the proportion of code paths covered by the test case to the total number of code paths:
Path coverage = (number of covered code paths / total number of code paths) × 100%.
Condition coverage is the proportion of conditional expression outcomes (true/false) covered by the test case to the total number of conditional expression outcomes:
Condition coverage = (number of covered conditional expression outcomes / total number of conditional expression outcomes) × 100%.
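Each of these metrics is the ratio of covered items to total items (the covered count goes in the numerator); a sketch with assumed figures:

```python
def coverage_pct(covered, total):
    """Generic coverage ratio: covered items over total items, as a percentage.
    Returns 0.0 when there is nothing to cover, to avoid division by zero."""
    return 100.0 * covered / total if total else 0.0

# Figures assumed for illustration, e.g. read from a coverage report.
statement_cov = coverage_pct(80, 100)  # 80 of 100 code lines executed -> 80.0
branch_cov = coverage_pct(18, 24)      # 18 of 24 conditional branches -> 75.0
print(statement_cov, branch_cov)
```

The same helper applies to path and condition coverage by substituting the covered and total counts of paths or conditional expression outcomes.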
In this embodiment, the first test case is executed with the preset test framework to obtain a test result, the coverage value of the test result is calculated, and the coverage value is compared with the preset coverage threshold; the first test case is modified according to the comparison result until its coverage value is greater than or equal to the preset coverage threshold, yielding a more effective second test case. Performing the coverage test on the first test case and optimizing it into the second test case ensures that the finally generated test case has a higher coverage rate.
In an embodiment, after feeding back the second test case as the target test case to the user terminal, the method includes:
judging whether a test case similar to the target test case is stored in a preset database or not;
If the database stores the similar test cases with the target test cases, replacing the similar test cases with the target test cases and storing the similar test cases into the database;
And if the database does not store the test cases similar to the target test case, storing the target test case into the database.
In this embodiment, after the target test case is fed back to the user terminal, it is stored in a preset database, which may be a SQL Server, Oracle, or MySQL database; the specific database can be selected according to actual requirements or the application environment. SQL Server is a relational database management system widely used for storing and managing test cases; it provides powerful data storage and query functions and suits scenarios where a large number of test cases must be managed and queried efficiently. Oracle is also a commonly used relational database for storing and managing test cases; its high reliability and security suit test case storage with strict data security requirements. MySQL is an open-source relational database management system that, thanks to its openness and cost effectiveness, is also widely used for storing test cases; it supports a large number of concurrent connections and suits test environments that must handle many user accesses.
When the target test case is to be stored in the preset database, the database is first queried for a test case similar to the target test case. If one exists, the target test case is an updated version of that similar test case, so the target test case replaces the similar test case in the database. If none exists, the target test case is stored in the database directly.
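The replace-or-insert rule above can be sketched as follows. Here `db` is a plain list standing in for the preset database, and `is_similar` is a hypothetical similarity predicate; both are assumptions of the sketch, since the patent does not define a concrete similarity measure:

```python
def store_target_case(db, target, is_similar):
    """Replace a similar stored case with the target; otherwise insert the
    target. `db` and `is_similar` are stand-ins for the preset database
    and the similarity check."""
    for i, existing in enumerate(db):
        if is_similar(existing, target):
            db[i] = target          # similar case found: target replaces it
            return "replaced"
    db.append(target)               # no similar case: store directly
    return "inserted"

db = [{"name": "login_ok", "version": 1}]
same_name = lambda a, b: a["name"] == b["name"]
print(store_target_case(db, {"name": "login_ok", "version": 2}, same_name))   # replaced
print(store_target_case(db, {"name": "logout_ok", "version": 1}, same_name))  # inserted
```

In a real deployment the list would be a database table and the predicate a query, but the control flow is the same.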
The embodiment stores the generated target test cases in the preset database, and can conveniently add, modify and delete the cases, thereby improving the expandability and maintainability of the test cases, effectively tracking and managing the test process, and simultaneously facilitating the multiplexing and knowledge transfer of the test cases.
According to the technical scheme, the method comprises the steps of obtaining test information corresponding to software to be tested by receiving a request for generating the test cases, guaranteeing the integrity and accuracy of the test information, analyzing the test type of the software to be tested according to test objects in the test information, guaranteeing matching of the subsequently generated test cases and the test objects, determining a test model corresponding to the software to be tested according to the test type, guaranteeing generation of high-quality test cases by selecting a proper test model, generating a first test case according to the test model and test parameters, testing coverage rate of the first test case to obtain a second test case, feeding the second test case back to a user, and optimizing the test cases through coverage rate testing to ensure that the finally generated test cases have higher coverage rate. According to the invention, the corresponding test model is selected through the test type of the test object, and the coverage rate detection is carried out on the test cases generated by the selected test model, so that the coverage rate of the test cases is ensured, and the efficiency of automatically generating the test cases is improved.
FIG. 3 is a schematic structural diagram of a test case generating device according to an embodiment of the present invention.
The test case generating device 100 according to the present invention may be mounted in an electronic apparatus. Depending on the implemented functions, the test case generating device may include a test information obtaining module 101, a test type analyzing module 102, a test model determining module 103, a test case generating module 104, and a test case returning module 105, where the modules may also be referred to as a unit, and refer to a series of computer program segments capable of being executed by a processor of an electronic device and performing a fixed function, and stored in a memory of the electronic device.
In the present embodiment, the functions concerning the respective modules/units are as follows:
The test information acquisition module 101 is used for acquiring test information corresponding to the software to be tested after receiving a test case request for generating the software to be tested sent by a user terminal, wherein the test information comprises a test object and test parameters;
The test type analysis module 102 is used for analyzing the test type of the software to be tested according to the test object;
the test model determining module 103 is used for determining a corresponding test model according to a predetermined mapping relation between the test type and the test model and the test type of the software to be tested;
The test case generation module 104 is configured to generate a first test case according to the test model and the test parameters;
and the test case returning module 105 is used for carrying out coverage rate test on the first test case to obtain a second test case, and feeding back the second test case as a target test case to the user terminal.
In an embodiment, the test information obtaining module 101 is configured to, when executing the obtaining the test information corresponding to the software to be tested:
acquiring a test requirement document uploaded by the user terminal;
Carrying out keyword recognition on the test requirement document by using a preset keyword recognition algorithm to obtain document keywords;
And extracting test objects and test parameters from the document keywords, and taking the test objects and the test parameters as the test information.
In one embodiment, the test type analysis module 102 is configured to, when executing the analysis of the test type of the software under test according to the test object:
Acquiring a test object in the test information;
judging the type of the test object, and if the test object is a flow class, judging the test type of the software to be tested as a first type;
If the test object is a numerical class, analyzing that the test object is a discrete numerical class or a continuous numerical class, if the test object is a discrete numerical class, judging that the test type of the software to be tested is a second type, and if the test object is a continuous numerical class, judging that the test type of the software to be tested is a third type;
And if the test object simultaneously comprises a numerical class and a flow class, judging that the test type of the software to be tested is a fourth type.
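The four classification rules above can be sketched as a single function; the argument names are assumptions made for this sketch:

```python
def classify_test_type(has_flow, has_numeric, numeric_kind=None):
    """Map a test object to one of the four test types following the rules
    above. `numeric_kind` is "discrete" or "continuous" when the object
    has a numerical class."""
    if has_flow and has_numeric:
        return "fourth type"          # both numerical and flow classes
    if has_flow:
        return "first type"           # flow class only
    if has_numeric:
        # discrete numerical -> second type, continuous numerical -> third type
        return "second type" if numeric_kind == "discrete" else "third type"
    raise ValueError("test object is neither a flow class nor a numerical class")

print(classify_test_type(has_flow=True, has_numeric=False))  # first type
print(classify_test_type(has_flow=False, has_numeric=True, numeric_kind="continuous"))  # third type
```

Note the mixed case is checked first, so an object carrying both classes is never misclassified as flow-only.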
In an embodiment, the test model determining module 103 is configured to, when executing the determining the corresponding test model according to the predetermined mapping relationship between the test type and the test model and the test type of the software to be tested:
If the test type of the software to be tested is a first type, acquiring a first model corresponding to the first type as a test model corresponding to the software to be tested;
if the test type of the software to be tested is a second type, acquiring a second model corresponding to the second type as a test model corresponding to the software to be tested;
If the test type of the software to be tested is a third type, acquiring a third model corresponding to the third type as a test model corresponding to the software to be tested;
and if the test type of the software to be tested is a fourth type, acquiring a fourth model corresponding to the fourth type as a test model corresponding to the software to be tested.
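The four branches above amount to a lookup in the predetermined mapping; a minimal sketch, with placeholder model identifiers (the patent does not name the concrete models):

```python
# Placeholder mapping from test type to test model; the model identifiers
# are assumptions for illustration, not names from the patent.
TYPE_TO_MODEL = {
    "first type": "model_for_flow_objects",
    "second type": "model_for_discrete_values",
    "third type": "model_for_continuous_values",
    "fourth type": "model_for_mixed_objects",
}

def select_model(test_type):
    """Return the test model mapped to the given test type."""
    return TYPE_TO_MODEL[test_type]

print(select_model("third type"))  # model_for_continuous_values
```

Storing the mapping as data rather than as if-branches makes adding a fifth type a one-line change.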
In one embodiment, the test case generation module 104 is configured to, when executing the generating the first test case according to the test model and the test parameters:
Acquiring an input parameter value and an expected parameter value in the test parameters;
constructing an initial test case according to the input parameter value, and executing the initial test case to obtain an output parameter value;
Judging whether the output parameter value is included in the expected parameter value;
If the output parameter value is not contained in the expected parameter value, judging that the initial test case is unqualified;
and if the output parameter value is contained in the expected parameter value, taking the initial test case as the first test case.
In an embodiment, when executing the coverage rate test on the first test case to obtain a second test case, the test case return module 105 is configured to:
executing the first test case by using a preset test frame to obtain a test result;
calculating a coverage value of the test result by using a preset coverage tool, and comparing the coverage value with a preset coverage threshold;
if the coverage value is greater than or equal to the coverage threshold, the first test case is used as the second test case;
And if the coverage value is smaller than the coverage threshold, modifying the first test case, and recalculating the coverage value of the modified first test case until the calculated coverage value is larger than or equal to the coverage threshold, and taking the test case corresponding to the coverage value larger than or equal to the coverage threshold as the second test case.
In an embodiment, when executing the feedback of the second test case as the target test case to the user terminal, the test case return module 105 is configured to:
judging whether a test case similar to the target test case is stored in a preset database or not;
If the database stores the similar test cases with the target test cases, replacing the similar test cases with the target test cases and storing the similar test cases into the database;
And if the database does not store the test cases similar to the target test case, storing the target test case into the database.
The invention provides a test case generating device, which is used for acquiring test information corresponding to software to be tested by receiving a request for generating the test case, ensuring the integrity and accuracy of the test information, analyzing the test type of the software to be tested according to a test object in the test information, ensuring the matching of a subsequently generated test case and the test object, determining a test model corresponding to the software to be tested according to the test type, ensuring the generation of a high-quality test case by selecting a proper test model, generating a first test case according to the test model and test parameters, performing coverage rate test on the first test case to obtain a second test case, feeding back the second test case to a user, and optimizing the test case through coverage rate test to ensure that the finally generated test case has higher coverage rate. According to the invention, the corresponding test model is selected through the test type of the test object, and the coverage rate detection is carried out on the test cases generated by the selected test model, so that the coverage rate of the test cases is ensured, and the efficiency of automatically generating the test cases is improved.
The specific limitation of the test case generating device can be referred to the limitation of the test case generating method, and will not be described herein. The respective modules in the test case generating device described above may be implemented in whole or in part by software, hardware, or a combination thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, a computer device is provided, which may be a server, and the internal structure of which may be as shown in fig. 4. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes non-volatile and/or volatile storage media and internal memory. The non-volatile storage medium stores an operating system, computer programs, and a database. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The network interface of the computer device is for communicating with an external client via a network connection. The computer program, when executed by a processor, performs functions or steps of a server side of a test case generating method.
In one embodiment, a computer device is provided, which may be a client, the internal structure of which may be as shown in FIG. 5. The computer device includes a processor, a memory, a network interface, a display screen, and an input device connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The network interface of the computer device is for communicating with an external server via a network connection. The computer program, when executed by a processor, performs the functions or steps of the client side of the test case generation method.
In one embodiment, a computer device is provided comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the steps of when executing the computer program:
after receiving a test case request for generating software to be tested, which is sent by a user, acquiring test information corresponding to the software to be tested, wherein the test information comprises a test object and test parameters;
analyzing the test type of the software to be tested according to the test object;
determining a corresponding test model according to a predetermined mapping relation between the test type and the test model and the test type of the software to be tested;
generating a first test case according to the test model and the test parameters;
And performing coverage rate test on the first test case to obtain a second test case, and feeding back the second test case serving as a target test case to the user.
In one embodiment, a computer readable storage medium is provided having a computer program stored thereon, which when executed by a processor, performs the steps of:
After receiving a test case request for generating software to be tested, which is sent by a user terminal, acquiring test information corresponding to the software to be tested, wherein the test information comprises a test object and test parameters;
analyzing the test type of the software to be tested according to the test object;
determining a corresponding test model according to a predetermined mapping relation between the test type and the test model and the test type of the software to be tested;
generating a first test case according to the test model and the test parameters;
and performing coverage rate test on the first test case to obtain a second test case, and feeding back the second test case serving as a target test case to the user terminal.
It should be noted that, the functions or steps implemented by the computer readable storage medium or the computer device may correspond to the relevant descriptions of the server side and the client side in the foregoing method embodiments, and are not described herein for avoiding repetition.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in embodiments provided herein may include non-volatile and/or volatile memory. The nonvolatile memory can include Read Only Memory (ROM), programmable ROM (PROM), electrically Programmable ROM (EPROM), electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double Data Rate SDRAM (DDRSDRAM), enhanced SDRAM (ESDRAM), synchronous link (SYNCHLINK) DRAM (SLDRAM), memory bus (Rambus) direct RAM (RDRAM), direct memory bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM), among others.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions.
The foregoing embodiments are merely illustrative of the technical solutions of the present application, and not restrictive; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those skilled in the art that modifications may still be made to the technical solutions described in the foregoing embodiments, or equivalent substitutions may be made for some technical features thereof, and such modifications or substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application. It should be noted that any third-party software tool or component appearing in the embodiments of the present application is presented merely by way of example and does not represent actual use.

Claims (10)

1.一种测试用例生成方法,其特征在于,所述方法包括:1. A test case generation method, characterized in that the method comprises: 在接收到用户终端发送的生成待测试软件的测试用例请求后,获取所述待测试软件对应的测试信息,所述测试信息包括测试对象和测试参数;After receiving a test case request for generating the software to be tested sent by the user terminal, obtaining test information corresponding to the software to be tested, the test information including a test object and test parameters; 根据所述测试对象分析所述待测试软件的测试类型;Analyzing the test type of the software to be tested according to the test object; 根据预先确定的测试类型与测试模型的映射关系,及所述待测试软件的测试类型确定对应的测试模型;Determine the corresponding test model according to the predetermined mapping relationship between the test type and the test model and the test type of the software to be tested; 根据所述测试模型和所述测试参数生成第一测试用例;Generate a first test case according to the test model and the test parameters; 对所述第一测试用例进行覆盖率测试得到第二测试用例,将所述第二测试用例作为目标测试用例反馈至所述用户终端。A coverage test is performed on the first test case to obtain a second test case, and the second test case is fed back to the user terminal as a target test case. 2.如权利要求1所述的测试用例生成方法,其特征在于,所述获取所述待测试软件对应的测试信息,包括:2. The test case generation method according to claim 1, wherein obtaining the test information corresponding to the software to be tested comprises: 获取所述用户终端上传的测试需求文档;Obtaining the test requirement document uploaded by the user terminal; 利用预设的关键词识别算法对所述测试需求文档进行关键词识别,得到文档关键词;Using a preset keyword recognition algorithm to perform keyword recognition on the test requirement document to obtain document keywords; 从所述文档关键词中提取测试对象和测试参数,将所述测试对象和测试参数作为所述测试信息。A test object and a test parameter are extracted from the document keywords, and the test object and the test parameter are used as the test information. 3.如权利要求1所述的测试用例生成方法,其特征在于,所述根据所述测试对象分析所述待测试软件的测试类型,包括:3. 
The test case generation method according to claim 1, wherein the step of analyzing the test type of the software to be tested according to the test object comprises: 获取所述测试信息中的测试对象;Obtaining a test object from the test information; 判断所述测试对象的类型,若所述测试对象为流程类,则判断所述待测试软件的测试类型为第一类型;Determine the type of the test object, and if the test object is a process type, determine that the test type of the software to be tested is a first type; 若所述测试对象为数值类,则分析所述测试对象为离散数值类或连续数值类,若所述测试对象为离散数值类,则判断所述待测试软件的测试类型为第二类型,若所述测试对象为连续数值类,则判断所述待测试软件的测试类型为第三类型;If the test object is a numerical type, the test object is analyzed to determine whether it is a discrete numerical type or a continuous numerical type. If the test object is a discrete numerical type, the test type of the software to be tested is determined to be the second type. If the test object is a continuous numerical type, the test type of the software to be tested is determined to be the third type. 若所述测试对象同时包含数值类和流程类,则判断所述待测试软件的测试类型为第四类型。If the test object includes both the numerical class and the process class, the test type of the software to be tested is determined to be the fourth type. 4.如权利要求1所述的测试用例生成方法,其特征在于,所述根据预先确定的测试类型与测试模型的映射关系,及所述待测试软件的测试类型确定对应的测试模型,包括:4. 
The test case generation method according to claim 1, wherein the step of determining the corresponding test model according to the predetermined mapping relationship between the test type and the test model and the test type of the software to be tested comprises: 若所述待测试软件的测试类型为第一类型,则获取与所述第一类型对应的第一模型作为根据所述待测试软件对应的测试模型;If the test type of the software to be tested is the first type, obtaining a first model corresponding to the first type as a test model corresponding to the software to be tested; 若所述待测试软件的测试类型为第二类型,则获取与所述第二类型对应的第二模型作为根据所述待测试软件对应的测试模型;If the test type of the software to be tested is the second type, obtaining a second model corresponding to the second type as a test model corresponding to the software to be tested; 若所述待测试软件的测试类型为第三类型,则获取与所述第三类型对应的第三模型作为根据所述待测试软件对应的测试模型;If the test type of the software to be tested is the third type, obtaining a third model corresponding to the third type as a test model corresponding to the software to be tested; 若所述待测试软件的测试类型为第四类型,则获取与所述第四类型对应的第四模型作为根据所述待测试软件对应的测试模型。If the test type of the software to be tested is the fourth type, a fourth model corresponding to the fourth type is obtained as a test model corresponding to the software to be tested. 5.如权利要求1所述的测试用例生成方法,其特征在于,所述根据所述测试模型和所述测试参数生成第一测试用例,包括:5. 
5. The test case generation method according to claim 1, wherein generating the first test case according to the test model and the test parameters comprises:
obtaining the input parameter values and the expected parameter values in the test parameters;
constructing an initial test case according to the input parameter values, and executing the initial test case to obtain output parameter values;
determining whether the output parameter values are included in the expected parameter values;
if the output parameter values are not included in the expected parameter values, judging the initial test case to be unqualified;
if the output parameter values are included in the expected parameter values, using the initial test case as the first test case.

6. The test case generation method according to claim 1, wherein performing the coverage test on the first test case to obtain the second test case comprises:
executing the first test case using a preset test framework to obtain a test result;
calculating a coverage value of the test result using a preset coverage tool, and comparing the coverage value with a preset coverage threshold;
if the coverage value is greater than or equal to the coverage threshold, using the initial test case as the second test case;
if the coverage value is less than the coverage threshold, modifying the first test case and recalculating the coverage value of the modified first test case until the calculated coverage value is greater than or equal to the coverage threshold, and using the test case whose coverage value is greater than or equal to the coverage threshold as the second test case.

7. The test case generation method according to claim 1, wherein after feeding back the second test case as the target test case to the user terminal, the method further comprises:
determining whether a test case similar to the target test case is stored in a preset database;
if a test case similar to the target test case is stored in the database, replacing the similar test case with the target test case and storing the target test case in the database;
if no test case similar to the target test case is stored in the database, storing the target test case in the database.

8. A test case generation device, wherein the device comprises:
a test information acquisition module, configured to acquire, after receiving a request sent by a user terminal to generate a test case for software to be tested, test information corresponding to the software to be tested, the test information including a test object and test parameters;
a test type analysis module, configured to analyze the test type of the software to be tested according to the test object;
a test model determination module, configured to determine the corresponding test model according to a predetermined mapping relationship between test types and test models and the test type of the software to be tested;
a test case generation module, configured to generate a first test case according to the test model and the test parameters;
a test case return module, configured to perform a coverage test on the first test case to obtain a second test case, and feed back the second test case as a target test case to the user terminal.

9. An electronic device, wherein the electronic device comprises:
at least one processor; and
a memory communicatively connected to the at least one processor; wherein
the memory stores a computer program executable by the at least one processor, and the computer program is executed by the at least one processor so that the at least one processor can perform the test case generation method according to any one of claims 1 to 7.

10. A computer-readable storage medium storing a computer program, wherein when the computer program is executed by a processor, the test case generation method according to any one of claims 1 to 7 is implemented.
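The first-test-case step in claim 5 amounts to executing an initial case and accepting it only if its output falls within the expected parameter values. A minimal sketch of that flow, assuming a callable test target (the `TestCase`, `generate_first_test_case`, and `run_case` names are illustrative, not taken from the patent):

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class TestCase:
    inputs: dict    # input parameter values from the test parameters
    expected: set   # expected parameter values from the test parameters

def generate_first_test_case(inputs: dict, expected: set,
                             run_case: Callable[[dict], object]) -> Optional[TestCase]:
    """Build an initial test case, execute it, and promote it to the
    first test case only if the output is among the expected values."""
    case = TestCase(inputs=inputs, expected=expected)
    output = run_case(case.inputs)      # execute the initial test case
    if output not in case.expected:     # output not in expected values
        return None                     # initial test case judged unqualified
    return case                         # accepted as the first test case
```

Whether `None` (unqualified) or a `TestCase` comes back thus encodes the pass/fail branch of the claim.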
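Claim 6's coverage step is a measure-compare-modify loop: compute a coverage value, and if it falls below the preset threshold, modify the case and re-measure until the threshold is met. A sketch under the assumption that the unspecified coverage tool and modification strategy are supplied as callables (`run_and_measure` and `modify` are hypothetical names):

```python
from typing import Callable

def refine_to_coverage(first_case, run_and_measure: Callable, modify: Callable,
                       threshold: float):
    """Iteratively modify a test case until its coverage value reaches
    the preset coverage threshold; the result is the second test case.

    run_and_measure: executes the case and returns its coverage value.
    modify: returns an adjusted version of the case (e.g. extra inputs).
    """
    case = first_case
    coverage = run_and_measure(case)
    while coverage < threshold:           # below the preset threshold
        case = modify(case)               # modify the test case
        coverage = run_and_measure(case)  # recalculate coverage
    return case                           # coverage >= threshold: second test case
```

In practice `run_and_measure` would wrap a test framework plus a coverage tool (e.g. pytest with coverage.py), but the loop structure is the same.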
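The storage step in claim 7 either replaces a similar stored case with the target or appends the target when nothing similar exists. The patent does not define the similarity measure, so in this illustrative sketch it is a caller-supplied predicate:

```python
from typing import Callable, List

def store_target_case(database: List, target, is_similar: Callable) -> List:
    """Replace the first stored case similar to the target, or append the
    target if no similar case is stored. is_similar is a user-supplied
    predicate; the patent leaves the similarity measure unspecified."""
    for i, existing in enumerate(database):
        if is_similar(existing, target):
            database[i] = target          # replace the similar test case
            return database
    database.append(target)               # no similar case: store as new
    return database
```

A real implementation might back this with a similarity score over case fields (e.g. `difflib.SequenceMatcher` over serialized cases) rather than a boolean predicate.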
CN202411656186.6A 2024-11-19 2024-11-19 Test case generation method and device, electronic equipment and storage medium Pending CN119597653A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202411656186.6A CN119597653A (en) 2024-11-19 2024-11-19 Test case generation method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202411656186.6A CN119597653A (en) 2024-11-19 2024-11-19 Test case generation method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN119597653A true CN119597653A (en) 2025-03-11

Family

ID=94843419

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202411656186.6A Pending CN119597653A (en) 2024-11-19 2024-11-19 Test case generation method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN119597653A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN120631787A (en) * 2025-08-11 2025-09-12 苏州元脑智能科技有限公司 Test case generation method, device, server and storage medium

Similar Documents

Publication Publication Date Title
US11526799B2 (en) Identification and application of hyperparameters for machine learning
CN109032829B (en) Data anomaly detection method and device, computer equipment and storage medium
CN108153670B (en) Interface testing method and device and electronic equipment
US11403305B2 (en) Performing data mining operations within a columnar database management system
CN108804548B (en) Test data query method, device, computer equipment and storage medium
CN112948504B (en) Data acquisition method and device, computer equipment and storage medium
US20240193485A1 (en) System and method of operationalizing automated feature engineering
US20240249008A1 (en) Policy consistency verification apparatus, policy consistency verification method, and policy consistency verification program
CN119597653A (en) Test case generation method and device, electronic equipment and storage medium
CN118211042A (en) Product data processing method, device, computer equipment and storage medium
CN116401140A (en) Data processing method, device, equipment, readable medium and software product
CN116756022A (en) Data preparation methods, devices, computer equipment and storage media
CN114546802B (en) A scoring method, device, equipment and storage medium for target applications
CN119597610A (en) Interface testing method, device, computer equipment, medium and program product
CN113867975B (en) A command line quick response method, device and computer equipment
CN117971649A (en) Data processing method, device, computer equipment and storage medium
CN120336180A (en) Test function library generation method, device, equipment and storage medium
CN115827478A (en) Code viewing method and device, computer equipment and storage medium
CN117112401A (en) Automated test case generation method, device, computer equipment and storage medium
CN120162345A (en) Risk prediction method, device and computer equipment for DDL change operation
CN120631727A (en) Application tracking data processing method, device, equipment and storage medium
CN119557332A (en) Test data processing method, device, computer equipment and readable storage medium
CN113342674A (en) Performance baseline regression testing method, device, equipment and medium based on learning
CN116881163A (en) Method, device and equipment for generating and processing test data in financial information system
CN119441228A (en) Supply chain information retrieval method, device, computer equipment, storage medium and computer program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination