Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides a micro-service architecture automated testing method, apparatus, electronic device, and storage medium, which can efficiently perform interface testing, support full-flow automation from development design through interface joint debugging to business scenario testing, and ensure the overall quality and stability of a micro-service system.
A first aspect of the present application provides a method for automatically testing a micro-service architecture, the method comprising:
Configuring an access address of a target interface document, parsing the target interface document to persistently store the parsed interface definitions into an interface main table, and generating interface test cases according to the target interface document, wherein the target interface document is a Swagger interface document or an OpenAPI interface document;
when it is determined that data inconsistency exists between the target interface document and the interface main table, acquiring interface definition difference data, and storing the interface definition difference data into a temporary table;
When it is determined to roll back to a specific interface version, acquiring specific version information of the specific interface version, and updating the specific version information to the interface main table;
configuring global variables in the {{ }} format on interfaces, and extracting, from returned response bodies, data to be used in request headers or request bodies;
arranging the interface test cases in a test suite based on business scenarios, and realizing dependency transfer of the interface context by using a variable transfer mechanism;
setting a timing execution task of the test suite or providing an immediate execution option through a timing task function of the Spring Boot framework;
and rapidly executing the interface test task according to the timing execution task or the immediate execution option.
In an alternative embodiment, after said storing said difference data in the temporary table, the method further comprises:
When receiving an interface main table updating instruction of a user, updating the interface definition difference data in the temporary table to the interface main table;
Deleting the interface definition difference data in the temporary table after the interface main table is determined to be updated;
generating a version identifier based on the current system time, and persistently storing the updated interface definition data to an interface version table, wherein the interface definition data comprises the interface definition difference data and the unchanged data in the interface main table.
In an alternative embodiment, the method further comprises:
when detecting that the source interface definition is changed, acquiring an interface state of the source interface;
When the interface state is determined to be the developing state, automatically updating interface information to an automatic test platform;
And when the interface state is determined to be another state, attaching a change label and prompting the user to confirm the change information before updating the interface information, wherein the other states include the under-test state and the released state.
In an alternative embodiment, after the data extraction of the request header or the request body is performed on the returned response body, the method further includes:
judging whether the extracted data needs to be processed or not;
when it is determined that the extracted data needs to be processed, an internal method is called in a parameter or a message body, and the extracted data is processed according to the internal method, wherein the internal method is a code logic module preset for realizing a specific data processing function, the parameter is a parameter transferred when an interface is called, and the message body is a data carrier in an interface request or response.
In an optional implementation manner, before the configuration of the access address of the Swagger interface document, the method further comprises: collecting the target interface document in real time through a Spring Boot timed task that calls the OpenAPIParser component, and persistently storing the interface definition data obtained through parsing to a MySQL database.
In an alternative embodiment, the method further comprises:
arranging business scenario cases through the test suite, and supporting inter-suite calling and batch assertion configuration;
And combining a plurality of test suites based on the test plan, triggering execution by a Spring Boot timed task, and generating a visual test report.
A second aspect of the present application provides an automated testing apparatus for a micro-service architecture, the apparatus comprising:
The interface document parsing module is used for configuring an access address of a target interface document, parsing the target interface document to persistently store the parsed interface definitions into an interface main table, and generating interface test cases according to the target interface document, wherein the target interface document is a Swagger interface document or an OpenAPI interface document;
the consistency check module is used for acquiring interface definition difference data and storing the interface definition difference data into a temporary table when the target interface document is determined to be inconsistent with the data in the interface main table;
The version regression updating module is used for acquiring specific version information of a specific interface version when it is determined to roll back to the specific interface version, and updating the specific version information to the interface main table;
The interface configuration module is used for configuring global variables in the {{ }} format for the interface, and extracting, from returned response bodies, data to be used in request headers or request bodies;
The test suite arrangement module is used for arranging the interface test cases in the test suite based on business scenarios, and realizing dependency transfer of the interface context by utilizing a variable transfer mechanism;
The timing task configuration module is used for setting the timing execution task of the test suite or providing an immediate execution option through the timing task function of the Spring Boot framework;
and the interface test task execution module is used for quickly executing the interface test task according to the timing execution task or the immediate execution option.
A third aspect of the application provides an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the microservice architecture automation test method when executing the computer program.
A fourth aspect of the present application provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the micro-service architecture automation test method described above.
In summary, the method, the device, the electronic device and the storage medium for automatically testing the micro-service architecture provided by the application have at least one of the following beneficial effects:
1. Through automatic parsing of the interface document, interface definitions can be quickly acquired and corresponding test cases generated, avoiding the complexity and error-proneness of manually writing test cases and improving the efficiency of test case generation;
2. The data consistency between the interface document and the interface main table is monitored in real time, so that changes to the interface definition can be found and processed in time, ensuring consistency between test cases and interface definitions and avoiding test failures caused by interface changes;
3. Through the version regression function, the interface definition of a historical version can be conveniently rolled back for compatibility testing or problem reproduction, improving the flexibility and traceability of testing;
4. By configuring global variables, data can be conveniently shared among multiple test cases, improving the reusability of test cases;
5. By utilizing the variable transfer mechanism, data transfer and dependency relationships between interfaces can be realized, improving the authenticity and effectiveness of test scenarios;
6. The immediate execution option makes it convenient to trigger test execution manually, meeting different test requirements;
7. By rapidly executing interface test tasks, interface problems can be found in time, improving test efficiency and ensuring the accuracy and reliability of testing.
Detailed Description
The invention will be further described with reference to the drawings and examples.
The conception, specific structure, and technical effects of the present invention will be clearly and completely described below with reference to the embodiments and the drawings, so that the objects, features, and effects of the present invention can be fully understood. It is apparent that the described embodiments are only some embodiments of the present invention, not all of them; other embodiments obtained by those skilled in the art without inventive effort, based on the embodiments of the present invention, fall within the scope of the present invention. In addition, the coupling/connection relationships referred to in this patent do not refer solely to direct connection of components, but mean that a better coupling structure can be formed by adding or removing coupling aids depending on the specific implementation. The technical features in the invention can be combined with one another provided there is no contradiction or conflict.
Referring to fig. 1, a flow chart of a micro service architecture automation test method according to an embodiment of the application is shown, and the micro service architecture automation test method includes the following steps.
S11, configuring an access address of a target interface document, parsing the target interface document to persistently store the parsed interface definitions into an interface main table, and generating interface test cases according to the target interface document.
The target interface document is a Swagger interface document or an OpenAPI interface document; both conform to the OpenAPI specification.
In some embodiments, the electronic device may define the URL address of the Swagger interface document or the OpenAPI interface document in a configuration file (e.g., application.yml), periodically call a Swagger/OpenAPI document parsing tool (e.g., swagger-parser or openapi-generator), and then parse the Swagger 2/3 and OpenAPI interface definitions using OpenAPIParser. The Swagger 2/3 and OpenAPI interface definitions are created in advance by developers in a development tool, and can be quickly and accurately written into the automated test platform, with automated test cases generated and updated accordingly. Specifically, when the Swagger interface document information is obtained, the electronic device may parse it according to the OpenAPI specification to determine the request body fields and the response body fields in the document, generate a test case based on the request body fields, and store the request body fields, the response body fields, and the test case records in the database.
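A hypothetical application.yml fragment for the document address and polling job described above; only the structure mirrors the text, and the api-test.* property keys are invented for illustration:

```yaml
# application.yml — illustrative sketch; the api-test.* keys are hypothetical
# property names, not part of the platform described here.
api-test:
  doc:
    # Access address of the target Swagger/OpenAPI interface document
    url: http://localhost:8080/v3/api-docs
    # Cron period for the Spring Boot timed task that re-parses the document
    sync-cron: "0 0/30 * * * ?"
```

The task method would read these properties and hand the fetched document to the parser.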
For a clearer understanding of the inventive concept, the target interface document in the embodiment of the present application will be described by taking Swagger interface document as an example. Wherein the interface information is shown with reference to fig. 2.
S12, when it is determined that the target interface document is inconsistent with the data in the interface main table, acquiring interface definition difference data, and storing the interface definition difference data into a temporary table.
The electronic device may pre-define a temporary table (e.g., api_change_temp) to store the changed interface definitions; the fields may include, but are not limited to, id (primary key), doc_id (document identification), field_name (field name), original_value (main table value), new_value (parsed value), created_at (recording time), version_timestamp (timestamp for version recording), and the like. Each time the interface document is parsed, the electronic device may compare the current parsing result with the interface definition stored in the database to find inconsistent data. When it is determined that inconsistent data exists, the inconsistent data is determined to be interface definition difference data. In the embodiment of the application, the electronic device may compare the request body fields of the Swagger interface document with the request body fields in the database; if the two are inconsistent, interface definition difference data exists, the difference data is stored in the temporary table, and a unique version_timestamp is generated using the current timestamp in YYYYMMDDHHMMSS format.
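The comparison and version_timestamp steps above can be sketched in plain Java; the class and method names below are illustrative, not taken from the platform itself:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class InterfaceDiff {
    // One row destined for the temporary table api_change_temp.
    public record DiffRow(String fieldName, String originalValue, String newValue) {}

    // Compare the freshly parsed definition against the persisted main-table values;
    // any field that is new or changed becomes interface definition difference data.
    public static List<DiffRow> diff(Map<String, String> mainTable, Map<String, String> parsed) {
        List<DiffRow> rows = new ArrayList<>();
        for (Map.Entry<String, String> e : parsed.entrySet()) {
            String old = mainTable.get(e.getKey());
            if (old == null || !old.equals(e.getValue())) {
                rows.add(new DiffRow(e.getKey(), old, e.getValue()));
            }
        }
        return rows;
    }

    // version_timestamp in the YYYYMMDDHHMMSS format described above.
    public static String versionTimestamp(LocalDateTime now) {
        return now.format(DateTimeFormatter.ofPattern("yyyyMMddHHmmss"));
    }
}
```

Each returned DiffRow would then be inserted into the temporary table together with the generated version_timestamp.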
In an alternative embodiment, after said storing said difference data in the temporary table, the method further comprises:
When receiving an interface main table updating instruction of a user, updating the interface definition difference data in the temporary table to the interface main table;
Deleting the interface definition difference data in the temporary table after the interface main table is determined to be updated;
generating a version identifier based on the current system time, and storing updated interface definition data to an interface version table in a lasting mode, wherein the interface definition data comprises the interface definition difference data and unchanged data in the interface main table.
In some embodiments, when the front-end page receives a click on the "confirm update" button by a user, that is, when an instruction for updating the interface main table is received, the temporary table is queried according to the record id or batch_id, the fields and new values to be updated are obtained, an UPDATE operation is performed on the interface main table, and the value of each corresponding field is updated to the new value, thereby updating the interface definition difference data from the temporary table into the interface main table. When the update of the interface main table is finished, the corresponding data in the temporary table is deleted, and a new version number is generated according to the current system time, such as YYYYMMDDHHMMSS_v1, or the version number is incremented based on business rules, and the record is updated to the interface version table. The fields of the interface version table may include, but are not limited to, id (primary key), doc_id (document identification), field_name (field name), original_value, new_value, updated_at (update time), version_number, and updated_by (operator). After updating the interface main table, the update contents are inserted into the version table, including document identification, field name, original value, new value, update time, version number, and operator (available from the user session). If the update of the main table fails (e.g., database lock conflict or field constraint conflict), a rollback operation keeps the temporary table records and returns an error message to the front end to prompt the user.
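The confirm-update flow might be sketched in SQL as a single transaction. The api_change_temp columns and the version-table fields come from the text above; the main-table name api_main and its field_value column are assumptions made for illustration:

```sql
-- Illustrative sketch only; api_main and field_value are assumed names.
START TRANSACTION;

-- 1. Apply the confirmed difference rows from the temporary table to the main table.
UPDATE api_main m
JOIN api_change_temp t ON t.doc_id = m.doc_id AND t.field_name = m.field_name
SET m.field_value = t.new_value;

-- 2. Record the change in the interface version table for traceability.
INSERT INTO api_version (doc_id, field_name, original_value, new_value,
                         updated_at, version_number, updated_by)
SELECT doc_id, field_name, original_value, new_value,
       NOW(), CONCAT(DATE_FORMAT(NOW(), '%Y%m%d%H%i%s'), '_v1'), 'operator'
FROM api_change_temp;

-- 3. Clear the temporary rows once the main table is updated.
DELETE FROM api_change_temp;

COMMIT;  -- on lock or constraint failure, ROLLBACK keeps the temporary rows intact
```

Running all three statements in one transaction is what allows the failure case described above to preserve the temporary table records.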
Through the optional implementation mode, automatic detection of data inconsistency, safe updating after user confirmation and version tracing are realized, and accuracy and maintainability of system data are ensured.
S13, when it is determined to roll back to a specific interface version, acquiring specific version information of the specific interface version, and updating the specific version information to the interface main table.
In some embodiments, the user may specify a version number (e.g., version_id=123) or a version timestamp (e.g., timestamp=2023-10-01 12:00:00) to be rolled back through the front-end interface or an interface call. The electronic device queries the interface version table to check whether the specific interface version exists, verifies the interface state of that version, inserts a data snapshot (e.g., all field values) of the current interface main table into the temporary table, and records the backup timestamp, so that the main table data can be recovered if the rollback operation fails. Then, the data record of the target version is queried from the interface version table, ensuring that the fields in the version table correspond one-to-one with the fields of the main table; the data in the version table is written over the main table through an UPDATE or REPLACE INTO statement, and a new record is inserted into the interface version table to mark the rollback operation. After the rollback of the specific interface version is completed, the main table data and the target version data are compared to ensure a correct rollback result; a visual interface may be provided, allowing the user to compare the differences between the main table data and the version table data before and after the rollback.
In some embodiments, after determining that the rollback of the specific interface version succeeded, the electronic device may notify related personnel (such as a data administrator) through mail, SMS, or system message that the rollback operation is completed, and record the detailed information of the rollback operation (such as operation time, user, target version, and rollback result) in the operation log table to facilitate auditing and problem investigation, as shown in fig. 3. When an exception occurs while backfilling the main table or inserting into the version table (e.g., database connection interruption or field constraint conflict), the transaction is immediately rolled back to ensure that the main table data is not corrupted, an error message is returned to the front end (e.g., "rollback failed, version number 123 does not exist"), and an exception log is recorded. Referring to fig. 4 together, the user can also view previous modification information through the automated test platform, and manual restoration by the user is supported.
By the aid of the optional implementation mode, the system can safely and efficiently realize rollback operation of the interface version, and meanwhile, the integrity and traceability of data are guaranteed.
Referring also to fig. 5, in some embodiments, the automated test platform may include a collaboration project module, an API asset module, a use case management module, an automated regression module, and an API monitoring module. The collaboration project module includes a Warehouse Management System (WMS), a User Service System (USS), and a listening system. The API asset module displays API interfaces of different versions, including multiple versioned interfaces of the WMS and USS, and these API interfaces are managed and tested. The use case management module is configured to manage API test cases, including creating, editing, and deleting test cases, such as interface use case 01, interface use case 02, interface use case 03, and interface use case 04, all used for testing APIs. Through the automatic synchronization function, the API assets and test cases are synchronized and assembled into test suites for regression testing. The automated regression module is used for displaying different test scenarios, including purchase order warehousing, APP order receiving and return closing, relation early warning, and relation query; these scenarios can be used for automated regression testing to ensure the stability and correctness of the APIs under different conditions. The API monitoring module is used for showing different aspects of API monitoring, including pipeline triggering, service interface monitoring, infrastructure interface monitoring, and the like, and is used for monitoring the performance and health of the APIs in real time to ensure the stable operation of the system. By visualizing the complete API lifecycle management process, the stability and reliability of the APIs are ensured, and development efficiency and system performance are improved through automated tools and processes.
S14, configuring global variables in the {{ }} format on interfaces, and extracting, from returned response bodies, data to be used in request headers or request bodies.
The global variable types include date, random character, custom variable, non-plaintext variable, and enumeration. In the embodiment of the application, in order to define the specification and distinguish which items are global variables, the electronic device may configure global variables in the {{ }} format on interfaces, constructing test data of different types and formats for automated use cases, which makes it convenient for testers to manage test data uniformly; global variables are referenced through {{ }}. After the {{ }} global variable configuration is completed for the interface under test, the electronic device may use JSONPath or regular expressions to extract request header or request body data from the returned response body. JSONPath is suitable for response bodies in JSON format, locating data quickly through path expressions, while regular expressions are suitable for unstructured text, extracting target data through pattern matching. Specifically, the electronic device may locate the target field in the interface response, select an extraction mode (JSONPath or regular expression), and assign the extraction result to a global variable (e.g., {{user_id}}).
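The {{ }} reference mechanism can be sketched as a simple substitution over a request template; this is an illustrative stand-in for the platform's resolver, assuming the global variables are held in a map:

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class GlobalVars {
    // Matches a {{name}} global-variable reference.
    private static final Pattern REF = Pattern.compile("\\{\\{(\\w+)\\}\\}");

    // Replace every {{name}} reference in a request template with the
    // corresponding global-variable value; unknown names are left untouched.
    public static String render(String template, Map<String, String> globals) {
        Matcher m = REF.matcher(template);
        StringBuilder out = new StringBuilder();
        while (m.find()) {
            String value = globals.getOrDefault(m.group(1), m.group(0));
            m.appendReplacement(out, Matcher.quoteReplacement(value));
        }
        m.appendTail(out);
        return out.toString();
    }
}
```

A request URL or body template such as "/users/{{user_id}}" would be rendered with the extracted values just before the request is sent.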
In order to extract information from an interface response body so that interfaces with dependency relationships can conveniently reference it, in the embodiment of the application the electronic device adopts variable transfer: a response header can be extracted by key name, a response body can be extracted by JSONPath or a regular expression, and references are made in the request header, parameters, message body, and assertions of the next interface through ${ }. That is, values can be directly obtained and stored as variables through the header key name when extracting from the response header, and data nodes can be located with JSONPath expressions or content matched with regular expressions when extracting from the response body.
In some embodiments, when the interface request is sent, the electronic device may use OkHttp to encapsulate the interface information and return the encapsulated information to the caller. Assertion checks can be performed on the response body and the response header: response body assertions extract the actual result through regular expressions or JsonPath and judge whether it is consistent with the expected result; the response body also supports data structure assertions, asserting the data type of the entire response body. Response header assertions mainly extract the actual result through the response header name and match it against the expected result. If the result matches the expected result, the case execution succeeds; otherwise, it fails.
In an alternative embodiment, after the data extraction of the request header or the request body is performed on the returned response body, the method further includes:
judging whether the extracted data needs to be processed or not;
when it is determined that the extracted data needs to be processed, an internal method is called in a parameter or a message body, and the extracted data is processed according to the internal method, wherein the internal method is a code logic module preset for realizing a specific data processing function, the parameter is a parameter transferred when an interface is called, and the message body is a data carrier in an interface request or response.
The built-in method is a preset code logic module embedded in the platform that implements a specific data processing function. In order to reduce transcoding, the platform also provides rich functions for testers to satisfy business scenario testing. The built-in methods include a uuid method, an md5 method, a dateToTimestamp method, and the like, referenced in the form ${uuid()}. In some embodiments, after the data extraction is complete, the electronic device may use a built-in method ${methodName()} and determine whether the extracted data requires further processing based on business requirements, for example when the login password is ciphertext encrypted with RSA. Specifically, the electronic device may determine whether the extracted data conforms to a specific format (such as JSON or XML), whether it contains certain key fields (such as status or error), and whether it meets business rules (such as whether it is empty or exceeds a threshold). When it is determined that the data needs further processing, the built-in method is called directly in the code with the extracted data passed as a parameter, or the corresponding built-in method is called dynamically, according to the parameter or an identifier in the message body, through a reflection mechanism or configuration mapping.
The parameters are additional information transferred during the interface call, used to specify how a built-in method is invoked or to provide necessary context, for example specifying a processing method (method_name="to_uppercase") or providing an encryption key (key="my_secret_key"). The message body is the data carrier in the interface request or response and may contain the data to be processed, for example a request message body {"data": "hello world", "method": "to_uppercase"} and a response message body {"status": "success", "result": "HELLO WORLD"}. The data and the method identifier are then extracted from the message body, and the built-in method is called dynamically in combination with the parameters to process the extracted data.
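A sketch of the built-in-method dispatch described above, using a name-to-function registry in place of the reflection mechanism; the uuid, md5, and to_uppercase entries follow the examples in the text, while the class and registry layout are assumptions:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Map;
import java.util.UUID;
import java.util.function.UnaryOperator;

public class BuiltIns {
    // Registry of built-in methods; the key matches the identifier carried in
    // the parameter or message body (e.g. method_name = "to_uppercase").
    private static final Map<String, UnaryOperator<String>> REGISTRY = Map.of(
            "uuid", s -> UUID.randomUUID().toString(),
            "md5", BuiltIns::md5,
            "to_uppercase", String::toUpperCase
    );

    // Dispatch the extracted data to the named built-in method.
    public static String apply(String methodName, String data) {
        UnaryOperator<String> fn = REGISTRY.get(methodName);
        if (fn == null) throw new IllegalArgumentException("unknown built-in: " + methodName);
        return fn.apply(data);
    }

    // Hex-encoded MD5 digest, used by the "md5" built-in.
    private static String md5(String s) {
        try {
            byte[] digest = MessageDigest.getInstance("MD5").digest(s.getBytes(StandardCharsets.UTF_8));
            StringBuilder hex = new StringBuilder();
            for (byte b : digest) hex.append(String.format("%02x", b));
            return hex.toString();
        } catch (Exception e) {
            throw new IllegalStateException(e);
        }
    }
}
```

A message body {"data": "hello world", "method": "to_uppercase"} would thus be processed by calling apply("to_uppercase", "hello world").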
Through the optional implementation manner, flexible data processing logic is realized by extracting data, judging processing requirements and dynamically calling a built-in method.
S15, arranging the interface test cases in the test suite based on the service scene, and realizing the dependence transfer of the interface context by using a variable transfer mechanism.
In some embodiments, the electronic device may construct a pre-defined test plan composed of one or more test suites. The user may customize the execution time and the test report recipients; the automated test platform back-end service triggers execution with a Spring Boot timed task according to the user-defined time, generates a test report when the timed task completes, and notifies related personnel. The interface use cases are arranged into corresponding scenario use cases according to business scenarios through the use case management module, and calls between test suites as well as batch addition of request headers and assertions are supported.
In the automated testing of interfaces, business scenario arrangement is performed on the interface use cases through the test suite, and context-dependent transfer is realized using variable transfer. Specifically, the electronic device may comb through the calling order and dependency relationships between interfaces — for example, after the user logs in, a Token is obtained and then used to call other interfaces — so as to determine which interfaces' output data need to be used as input parameters of other interfaces, for example the Token returned by the login interface needs to be transferred to subsequent interfaces. Next, a test suite is created in the test framework (e.g., Postman, JMeter, Pytest, etc.) for organizing the associated interface use cases. Referring also to fig. 6, the availability and reliability of the overall system is improved by accumulating reusable cases over iterations, allowing the system to respond to business needs faster through version iterations and updates. Referring to fig. 7, scenario regression is performed, according to project requirements, through the interfaces connected by context; although the initial time investment is large and the benefit appears slowly, a high return can be realized for continuously iterated versions.
Further, in each interface use case, the data to be transferred to subsequent use cases, such as the Token returned by the login interface, is extracted, and the extracted data is stored in a variable provided by the test framework; for example, in Postman a variable may be stored using pm.environment.set() or pm.collectionVariables.set(), and in Pytest the data may be stored using a fixture or a global variable. In the dependent interface use case, the previously stored variable is read as an input parameter. Then, the electronic device may set the execution order of the interface cases in the test suite to ensure that the dependencies are correct; for example, in Postman the order may be adjusted by dragging the cases, and in Pytest dependencies may be managed with an ordering or dependency plugin. Next, different input data are provided for the interface use cases using a parameterization technique so as to cover more business scenarios. Finally, the test suite is run, the execution order and dependency transfer of the interface use cases are observed, and by checking the execution result of each interface use case it is ensured that the dependency relationships are transferred correctly and the interface calls succeed.
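The ${ } context transfer between ordered use cases can be sketched as a shared context object; SuiteContext is an illustrative name, not the platform's actual class:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SuiteContext {
    // Matches a ${name} context-variable reference.
    private static final Pattern REF = Pattern.compile("\\$\\{(\\w+)\\}");
    private final Map<String, String> vars = new HashMap<>();

    // Store a value extracted from an earlier response (e.g. the login Token).
    public void put(String name, String value) {
        vars.put(name, value);
    }

    // Resolve ${name} references in the next case's header, parameter, message
    // body, or assertion; unresolved references are left as-is.
    public String resolve(String template) {
        Matcher m = REF.matcher(template);
        StringBuilder out = new StringBuilder();
        while (m.find()) {
            m.appendReplacement(out,
                    Matcher.quoteReplacement(vars.getOrDefault(m.group(1), m.group(0))));
        }
        m.appendTail(out);
        return out.toString();
    }
}
```

After the login case runs, put("token", …) records the extracted value, and the subsequent case's header template "Authorization: Bearer ${token}" is resolved before sending.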
In some embodiments, the electronic device may optimize the design of the test case according to the test result, reduce redundant codes, improve the test efficiency, and update the test data periodically, so as to ensure the accuracy and reliability of the test case.
Through the optional implementation manner, the business scene arrangement of the interface use case can be realized in the test suite, and the context-dependent transmission is realized by using variable transmission, so that the efficiency and the reliability of the automatic test of the interface are improved.
S16, setting a timing execution task of the test suite or providing an immediate execution option through a timing task function of the Spring Boot framework.
A Spring Boot project includes the timed task function by default (spring-context module) with no extra dependency. It should be ensured that the @EnableScheduling annotation is in effect (usually configured on the main startup class), and the execution period of the task is defined using the @Scheduled annotation, that is, the timed execution task is set, which may include fixed-rate execution @Scheduled(fixedRate = 5000) (executed every 5 seconds), fixed-delay execution @Scheduled(fixedDelay = 10000) (delayed 10 seconds after the last task completes), and Cron-expression scheduling @Scheduled(cron = "0 0/1 …"). Alternatively, a REST interface (such as /executeTestSuite) is provided to trigger the execution of the test suite through an HTTP request, that is, the immediate execution option; the execution method of the test suite is called directly in the interface implementation to realize manual triggering. In some embodiments, the electronic device may control the enabling/disabling of the timed task via a configuration file (e.g., application.yml), reading the configuration in the task method to dynamically control whether the timed task executes. The timed execution task and the immediate execution option both execute through the test suite and share the same set of test logic, avoiding code duplication: the timed task is only responsible for scheduling, while the immediate execution option, i.e., manual triggering, calls the same logic through the interface.
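For illustration only, the JDK's ScheduledExecutorService can reproduce the fixed-rate and immediate-execution semantics that @Scheduled and the REST trigger provide declaratively in Spring Boot; this sketch is an analogue, not the platform's implementation:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class SuiteScheduler {
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    // Fixed-rate scheduling: the plain-JDK analogue of @Scheduled(fixedRate = 5000).
    public void scheduleAtFixedRate(Runnable testSuite, long periodMillis) {
        scheduler.scheduleAtFixedRate(testSuite, 0, periodMillis, TimeUnit.MILLISECONDS);
    }

    // Immediate execution option: run the same suite logic once, right now,
    // as the /executeTestSuite REST trigger would.
    public void executeNow(Runnable testSuite) {
        scheduler.execute(testSuite);
    }

    public void shutdown() {
        scheduler.shutdown();
    }

    // Helper for observing the scheduler: wait until the suite has run
    // `latch.getCount()` more times, or time out.
    public static boolean ranEnough(CountDownLatch latch, long timeoutMillis) {
        try {
            return latch.await(timeoutMillis, TimeUnit.MILLISECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return false;
        }
    }
}
```

Both entry points invoke the same Runnable, mirroring the shared-logic design described above: scheduling and manual triggering differ only in when the suite runs.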
S17, rapidly executing the interface test task according to the timing execution task or the immediate execution option.
In some embodiments, the test tasks are executed concurrently by multiple processes (or threads), so that CPU and I/O resources are fully utilized and test efficiency is improved. Spring Boot itself does not directly provide multi-process support, but it can be implemented through Java concurrency tools (e.g., thread pools) or operating-system-level process management. Specifically, the test suite is split into multiple independent test tasks, each task corresponding to the test of one interface or one group of interfaces; for example, 100 interface test tasks are split into 10 groups of 10 tasks, and the interface test tasks are executed using a multi-process scheme or a multi-thread scheme. For the multi-process scheme, the electronic device may use Java's ProcessBuilder or operating system commands (e.g., Runtime.getRuntime().exec()) to start sub-processes that execute the test tasks, each sub-process independently running a test script while the main process is responsible for collecting results. For the multi-thread scheme, a thread pool (e.g., ExecutorService) is used to manage multiple threads, each thread executes a test task, and the number of threads is set reasonably (e.g., to the number of CPU cores) to avoid resource contention.
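A minimal sketch of the multi-thread scheme above: interface test tasks are submitted to a fixed thread pool sized to the CPU core count, and the main thread collects the results. The boolean "pass" result is an illustrative stand-in for a real interface test execution.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ConcurrentRunner {
    // Runs `taskCount` independent test tasks on a thread pool and returns how many passed.
    public static int runAll(int taskCount) {
        int threads = Runtime.getRuntime().availableProcessors(); // size pool to CPU cores
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        List<Future<Boolean>> futures = new ArrayList<>();
        for (int i = 0; i < taskCount; i++) {
            futures.add(pool.submit(() -> {
                // ... call the interface under test and run its assertions here ...
                return true; // illustrative: pretend the task passed
            }));
        }
        int passed = 0;
        for (Future<Boolean> f : futures) {
            try {
                if (f.get()) passed++;       // main thread collects the results
            } catch (Exception e) {
                // a task that threw counts as a failure
            }
        }
        pool.shutdown();
        return passed;
    }

    public static void main(String[] args) {
        System.out.println(runAll(100) + " of 100 tasks passed");
    }
}
```

Sizing the pool to availableProcessors() is the "reasonable number of threads" heuristic mentioned above; I/O-heavy interface tests may tolerate a larger pool.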
Through the optional implementation manner, automatic scheduling and efficient execution of the test suite are realized; combining the timed task with manual triggering meets the daily automation requirement while supporting flexible manual test scenarios.
In an alternative embodiment, the method further comprises:
when detecting that the source interface definition is changed, acquiring an interface state of the source interface;
When the interface state is determined to be the developing state, automatically updating interface information to an automatic test platform;
and when the interface state is determined to be another state, marking a change label and prompting the user to confirm the change information to update the interface information, wherein the other states include an in-test state or a published state.
The automated test platform is arranged in the electronic device. In some embodiments, the electronic device may define a plurality of interface states for an interface in advance, including an in-development state, an in-test state, a published state, etc., each interface state corresponding to different operating logic. The electronic device compares the request-body fields of the Swagger interface document with the request-body fields in the database in real time; if the interface information recorded in the database is inconsistent with the fields of the Swagger request body, the Swagger interface document has been changed, that is, the source interface definition has been changed. When it is determined that the source interface has been changed, the interface state of the source interface is acquired. When the interface state is determined to be the in-development state, the updated interface information is synchronized directly to the automated test platform, and a change log is recorded after updating. When the interface state is the in-test state or another non-development state, a change label (such as "newly added parameter" or "modified return value") is generated, and the user is notified (e.g., by mail or pop-up window) and asked to confirm whether to update. Specifically, the interface list may display a change icon so that the user knows which interfaces have changed, and clicking the change icon may pop up a window displaying the change information. After the user confirms, the interface information is updated and a log is recorded; if the user refuses, the current version is kept unchanged.
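The field-level comparison described above can be sketched as a diff between the fields recorded in the database and the fields parsed from the Swagger document, producing change labels such as "newly added parameter". The flat map-of-fields representation (field name to type) is a simplification for illustration; real request bodies are nested.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class InterfaceDiff {
    // Compares database-recorded fields with Swagger-document fields and returns change labels.
    public static List<String> diff(Map<String, String> dbFields,
                                    Map<String, String> swaggerFields) {
        List<String> labels = new ArrayList<>();
        for (String name : swaggerFields.keySet()) {
            if (!dbFields.containsKey(name)) {
                labels.add("newly added parameter: " + name);
            }
        }
        for (Map.Entry<String, String> e : dbFields.entrySet()) {
            String current = swaggerFields.get(e.getKey());
            if (current == null) {
                labels.add("removed parameter: " + e.getKey());
            } else if (!current.equals(e.getValue())) {
                labels.add("modified parameter: " + e.getKey());
            }
        }
        return labels; // an empty list means document and master table are consistent
    }

    public static void main(String[] args) {
        Map<String, String> db = new HashMap<>();
        db.put("name", "string");
        Map<String, String> sw = new HashMap<>(db);
        sw.put("age", "integer");
        System.out.println(diff(db, sw));
    }
}
```

A non-empty label list is what would drive either the direct synchronization (in-development state) or the change-icon/confirmation flow (other states).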
Whenever the interface information is changed, a new version is generated; information such as the version number, change content, update time, and updater is recorded, and a complete snapshot of the interface (e.g., in JSON/YAML format) is stored for recovery during rollback. Through a rollback function, the user is allowed to select any historical version for recovery; after rollback, the current version number is updated to the selected version number, and a rollback operation log is recorded.
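A minimal sketch of the versioning behaviour above: every change appends a complete snapshot (JSON text here) with a monotonically increasing version number, and rollback returns any historical snapshot as the new current definition. Class and field names are illustrative; a real implementation would persist to the interface version table and record the rollback log.

```java
import java.util.ArrayList;
import java.util.List;

public class InterfaceVersionStore {
    static final class Version {
        final int number;
        final String snapshotJson; // complete snapshot of the interface definition

        Version(int number, String snapshotJson) {
            this.number = number;
            this.snapshotJson = snapshotJson;
        }
    }

    private final List<Version> history = new ArrayList<>();

    // Called whenever the interface information changes: stores a full snapshot.
    public int saveNewVersion(String snapshotJson) {
        int number = history.size() + 1; // monotonically increasing version number
        history.add(new Version(number, snapshotJson));
        return number;
    }

    // Rollback: restore the snapshot of any historical version as the current one.
    public String rollbackTo(int number) {
        Version v = history.get(number - 1);
        // ... here the current version number would be updated to `number`
        //     and a rollback operation log recorded ...
        return v.snapshotJson;
    }
}
```

Storing full snapshots rather than deltas is what makes rollback a simple lookup, at the cost of storage.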
In an alternative embodiment, the method further comprises:
arranging service scene cases through the test suite, and supporting inter-suite calling and batch assertion configuration;
And combining a plurality of test kits based on the test plan, triggering execution by a Spring Boot timing task, and generating a visual test report.
Each test suite corresponds to a service module or a functional scenario, such as a user registration suite or an order processing suite. The electronic device may create a test suite class using a test framework (such as JUnit or TestNG), define the test cases contained in the suite using annotations or configuration files, and realize inter-suite collaboration through dependency injection or method calls; for example, the order processing suite relies on user data generated by the user registration suite, and a plurality of conditions are verified in batches using an assertion library (such as AssertJ or Hamcrest).
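The inter-suite dependency above (the order processing suite consuming data produced by the user registration suite) can be sketched with a shared variable context, mirroring the {{variable}} transfer mechanism of the method. Suite names, the "userId" variable, and the returned strings are all illustrative; a real suite would perform HTTP calls and batch assertions.

```java
import java.util.HashMap;
import java.util.Map;

public class SuiteContextDemo {
    // Shared context carrying variables between suites ({{userId}}-style lookups).
    static final Map<String, String> context = new HashMap<>();

    static void userRegistrationSuite() {
        // ... call the registration interface, extract userId from the response body ...
        context.put("userId", "u-1001"); // hypothetical extracted value
    }

    static String orderProcessingSuite() {
        String userId = context.get("userId"); // dependency on the previous suite's output
        // ... call the order interface with this userId and assert on the result ...
        return "order-for-" + userId;
    }

    public static void main(String[] args) {
        userRegistrationSuite();      // suites run in business-scenario order
        System.out.println(orderProcessingSuite());
    }
}
```

The context map is the simplest form of the variable transfer mechanism: upstream suites write extracted values, downstream suites read them, so case ordering encodes the business scenario.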
In addition, the electronic device may also define a test plan as a collection of test suites, combined by dimensions such as priority and environment, and support dynamically loading the suites in the test plan using reflection or a dependency injection framework (e.g., Spring). When the automated interface test needs to be executed, the timed task is configured using the Spring @Scheduled annotation or the TaskScheduler interface, the test plan execution logic is encapsulated, and concurrent execution of suites is supported. When the automated interface test is completed, tools such as Allure or ExtentReports can be used to automatically generate a corresponding test report (which may be an HTML report), and the test report is visualized; for example, the test results may be displayed in chart form, including the total number of cases, the number of skipped cases, the success rate, the start time, the time consumed, and the running environment, and downloading of the test report and displaying only the failed cases are supported, as shown in fig. 8.
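The report statistics listed above can be computed from raw counts, as in this minimal sketch. The output format and the convention of computing the success rate over executed (non-skipped) cases are assumptions for illustration, not taken from the application.

```java
import java.util.Locale;

public class ReportSummary {
    // Builds the summary line a visual report would display from raw case counts.
    public static String summarize(int passed, int failed, int skipped, long elapsedMs) {
        int total = passed + failed + skipped;
        int executed = passed + failed;
        // assumption: success rate is over executed cases, skipped cases excluded
        double successRate = executed == 0 ? 0.0 : 100.0 * passed / executed;
        return String.format(Locale.ROOT,
                "total=%d, skipped=%d, successRate=%.1f%%, elapsedMs=%d",
                total, skipped, successRate, elapsedMs);
    }

    public static void main(String[] args) {
        System.out.println(summarize(9, 1, 2, 1500));
    }
}
```

A report generator such as Allure derives the same figures from its result files; this sketch only shows where the numbers in the chart come from.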
Referring to FIG. 9, in some embodiments, the automated test platform may provide test report, test execution, test suite, and use case management functions. The test reports may include different types, such as an interface test report, a performance test report, and a data quality report, in which test results and analysis are recorded in detail. Test execution provides various modes and tools, including timed execution, direct execution, assertion logic, and variable management, supporting both automated and manual testing. Test suites are used to organize and execute a group of related test cases, covering scenario coverage, data verification, multi-source verification, row-by-row verification, JSON comparison, regular expressions, time comparison, etc. Use case management provides functions such as case creation, batch operations, manual creation, automatic entry, Swagger import, batch deletion, batch execution, and batch copying, supporting efficient management and maintenance of test cases.
Compared with manually executing interface tests, the application achieves a tenfold improvement in test speed, greatly shortening the interface test cycle. In addition, the application can intuitively manage the continuous change of interfaces under micro-services, discover and solve business-scenario regression problems caused by interface changes in time, and ensure software quality and business continuity. Meanwhile, problems are resolved quickly before business personnel become aware of interface errors, and testers can carry out automated interface testing without writing code, so that more testers can participate in the practice of automated testing, the overall test capability of the team is improved, the technical threshold is lowered, online bugs are reduced, and the defect escape rate is reduced by more than 20% on average, significantly improving software quality. For interface testing, the application uses frameworks such as Spring Boot, OkHttp, Fastjson, Nacos, and MyBatis, uses the MySQL database together with Redis and RocketMQ middleware, and uses multi-threaded concurrency to process automated task execution. It can realize automated testing of interfaces that are numerous and frequently updated under micro-services, perform version backtracking and change recording, and trigger execution with one click, thereby realizing the management and large-scale automated testing of micro-service interfaces.
Referring to fig. 10, a functional block diagram of an automated testing apparatus for micro-service architecture according to an embodiment of the present application is shown.
In some embodiments, the micro-service architecture automation test device 10 may include a plurality of functional modules composed of computer program segments. The computer program of each program segment of the micro-service architecture automation test device 10 may be stored in a memory of an electronic device and executed by at least one processor to perform the functions of the micro-service architecture automation test (see fig. 1 for details). The device may be divided into a plurality of functional modules according to the functions they perform. The functional modules may include an interface document parsing module 101, a consistency check module 102, a version regression update module 103, an interface configuration module 104, a test suite orchestration module 105, a timing task configuration module 106, and an interface test task execution module 107. A module referred to in the present application is a series of computer program segments stored in a memory, capable of being executed by at least one processor, and performing a fixed function. In the present embodiment, the functions of the respective modules will be described in detail in the following embodiments.
The interface document parsing module 101 is configured to configure an access address of a target interface document, parse the target interface document to persist parsed interface definitions to an interface master table, and generate an interface test case according to the target interface document, where the target interface document is a Swagger interface document or an OpenApi interface document.
The consistency check module 102 is configured to obtain interface definition difference data when it is determined that the target interface document is inconsistent with the data in the interface master table, and store the interface definition difference data in a temporary table.
The version regression updating module 103 is configured to obtain specific version information of a specific interface version when determining to regress the specific interface version, and update the specific version information to an interface master table.
The interface configuration module 104 is configured to configure global variables {{ }} for the interfaces, and extract data of the request header or the request body from the returned response body.
The test suite arrangement module 105 is configured to arrange the interface test cases in the test suite based on a service scenario, and implement dependency transmission of the interface context by using a variable transmission mechanism.
The timing task configuration module 106 is configured to set a timing execution task of the test suite or provide an immediate execution option through a timing task function of the Spring Boot framework.
The interface test task execution module 107 is configured to quickly execute an interface test task according to the timed execution task or the immediate execution option.
The consistency check module 102 is further configured to update the interface definition difference data in the temporary table to the interface master table when an update interface master table instruction of a user is received, delete the interface definition difference data in the temporary table after the update of the interface master table is determined to be completed, generate a version identifier based on a current system time, and persist the updated interface definition data to an interface version table, where the interface definition data includes the interface definition difference data and unchanged data in the interface master table.
The version regression updating module 103 is further configured to obtain an interface state of the source interface when detecting that the source interface definition is changed, automatically update interface information to an automated test platform when determining that the interface state is an in-development state, and mark a change label and prompt the user to confirm the change information to update the interface information when determining that the interface state is other states including an in-test state or a published state.
The interface configuration module 104 is further configured to determine whether the extracted data needs to be processed, and when it is determined that the extracted data needs to be processed, call an internal method in a parameter or a message body, and process the extracted data according to the internal method, where the internal method is a code logic module preset to implement a specific data processing function, the parameter is a parameter transferred during interface call, and the message body is a data carrier in an interface request or response.
The interface document parsing module 101 is further configured to, before the configuration of the access address of the Swagger interface document, call the OpenAPIParser component through a Spring Boot timed task, collect the target interface document in real time, and persist the interface definition data obtained by parsing to a MySQL database.
The test suite arrangement module 105 is further configured to arrange service scenario cases through the test suites, support inter-suite call and batch assertion configuration, combine a plurality of test suites based on a test plan, trigger execution through Spring Boot timing tasks, and generate a visual test report.
It should be understood that the various modifications and embodiments of the micro service architecture automation test method provided in the foregoing embodiment are equally applicable to the micro service architecture automation test device of the present embodiment, and those skilled in the art will clearly know the implementation method of the micro service architecture automation test device of the present embodiment through the foregoing detailed description of the micro service architecture automation test method, which is not described in detail herein for brevity of description.
Referring to fig. 11, a schematic structural diagram of an electronic device according to an embodiment of the present application is shown. In a preferred embodiment of the present application, the electronic device 11 includes a memory 111, at least one processor 112, and at least one communication bus 113.
It will be appreciated by those skilled in the art that the configuration of the electronic device shown in fig. 11 does not limit the embodiments of the present application; either a bus-type or a star-type configuration is possible, and the electronic device 11 may include more or fewer hardware or software components than those shown, or a different arrangement of components.
In some embodiments, the electronic device 11 is a device capable of automatically performing numerical calculation and/or information processing according to a preset or stored instruction, and its hardware includes, but is not limited to, a microprocessor, an application specific integrated circuit, a programmable gate array, a digital processor, an embedded device, and the like. The electronic device 11 may further include a user device, where the user device includes, but is not limited to, any electronic product that can interact with a user by using a keyboard, a mouse, a remote control, a touch pad, or a voice control device, for example, a personal computer, a tablet computer, a smart phone, a digital camera, etc.
In the foregoing embodiments of the present application, it should be understood that the disclosed method, apparatus, computer readable storage medium and electronic device may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, and for example, the division of the modules is merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple components or modules may be combined or integrated into another apparatus, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with respect to each other may be an indirect coupling or communication connection via some interfaces, devices or components or modules, which may be in electrical, mechanical, or other forms.
The components illustrated as separate components may or may not be physically separate, and components shown as components may or may not be physical modules, i.e., may be located in one place, or may be distributed over multiple network modules. Some or all of the components may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional module in each embodiment of the present invention may be integrated into one processing module, or each component may exist alone physically, or two or more modules may be integrated into one module. The integrated modules may be implemented in hardware or in software functional modules.
The integrated modules, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. The storage medium includes a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.
It should be noted that, for the sake of simplicity of description, the foregoing method embodiments are all expressed as a series of combinations of actions, but it should be understood by those skilled in the art that the present invention is not limited by the order of actions described, as some steps may be performed in other order or simultaneously in accordance with the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily all required for the present invention.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to the related descriptions of other embodiments.
While the preferred embodiment of the present application has been described in detail, the present application is not limited to the embodiments, and those skilled in the art can make various equivalent modifications or substitutions without departing from the spirit of the present application, and the equivalent modifications or substitutions are included in the scope of the present application as defined in the appended claims.