
CN110309038B - Performance test method and device, electronic equipment and computer readable storage medium - Google Patents

Performance test method and device, electronic equipment and computer readable storage medium

Info

Publication number
CN110309038B
CN110309038B (application CN201910310587.9A)
Authority
CN
China
Prior art keywords
execution
test script
machines
test
execution machines
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910310587.9A
Other languages
Chinese (zh)
Other versions
CN110309038A (en)
Inventor
Zhang Zhen (张震)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Life Insurance Company of China Ltd
Original Assignee
Ping An Life Insurance Company of China Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Life Insurance Company of China Ltd filed Critical Ping An Life Insurance Company of China Ltd
Priority to CN201910310587.9A
Publication of CN110309038A
Application granted
Publication of CN110309038B
Legal status: Active (current)
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/30Monitoring
    • G06F11/34Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F11/3466Performance evaluation by tracing or monitoring

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

The invention relates to a performance testing method and device, an electronic device, and a computer readable storage medium. The method comprises the following steps: acquiring a test script from a development and debugging environment, wherein the test script comprises a set total concurrency number of the execution machines and a set maximum concurrency number of each execution machine; calculating the number of execution machines and the number of concurrencies to be executed by each execution machine according to the set total concurrency number and the set maximum concurrency number; adding the number of concurrencies to be executed by each execution machine to the test script so as to update the test script; generating execution images equal in number to the execution machines according to the calculated number of execution machines and the updated test script, wherein each execution image corresponds to one execution machine; and controlling each execution machine to perform the test according to its corresponding execution image. The invention is simple to implement and can improve the testing efficiency of the execution machines.

Description

Performance test method and device, electronic equipment and computer readable storage medium
Technical Field
The invention relates to the field of cloud storage, in particular to a performance testing method and device of an execution machine, electronic equipment and a computer readable storage medium.
Background
At present, many performance testing tools on the market, for example JMeter, can initiate a test in a distributed manner when the number of concurrencies to be executed is greater than 1000, but the number of concurrencies for each execution machine and the total number of execution machines needed have to be calculated in advance. Existing performance testing tools therefore require that execution machine resources be prepared in advance, and a large amount of labor is needed to set up the execution machine environment, so the efficiency of performance testing is low.
Disclosure of Invention
In view of the above, it is necessary to provide a performance testing method, apparatus, electronic device and computer readable storage medium to improve the efficiency of performance testing.
A first aspect of the present application provides a performance testing method, the method comprising:
accessing a development and debugging environment of a performance testing management platform;
acquiring a test script from the development and debugging environment, wherein the test script comprises the set total concurrency number of the execution machines and the maximum concurrency number of the execution machines;
calculating the number of the execution machines and the number of the concurrency to be executed by each execution machine according to the set total concurrency number of the execution machines and the maximum concurrency number of the execution machines;
adding the number of concurrencies to be executed by each execution machine to the test script so as to update the test script;
generating execution images with the same number as the execution machines according to the calculated number of the execution machines and the updated test script, wherein the execution images are instruction sets for testing the execution machines, and each execution image corresponds to one execution machine; and
controlling the execution machine to perform the test according to the execution image corresponding to the execution machine.
Preferably, the development and debugging environment for accessing the performance test management platform includes:
and starting an Agent control program in the performance test management platform and accessing the development and debugging environment through the Agent control program.
Preferably, the calculating the number of the execution machines and the number of the concurrency to be executed by each execution machine according to the set total concurrency number of the execution machines and the maximum concurrency number of the execution machines includes:
calculating the number of the execution machines according to a first preset formula, wherein the first preset formula is b = A/a + (A%a == 0 ? 0 : 1), where A is the set total concurrency number of the execution machines, a is the set maximum concurrency number of each execution machine, b is the number of execution machines, and the division is integer division; and
calculating the number of concurrencies to be executed by each of the execution machines according to a second preset formula, wherein the second preset formula is c = A/b + (A%b == 0 ? 0 : 1), where c is the number of concurrencies to be executed by each execution machine.
Preferably, the test script includes a parameter file, and the generating of the execution images having the same number as the execution machines according to the calculated number of the execution machines and the updated test script includes:
judging whether the parameter file of the test script can be repeatedly executed or not;
when the parameter file in the test script can be repeatedly executed, copying the parameter file of the test script into each execution image, equally dividing the total concurrency number in the test script according to the number of the execution machines, and copying each equally divided share into the corresponding execution image; and
when the parameter file in the test script cannot be repeatedly executed, equally dividing the data in the parameter file of the test script according to the number of the execution machines, copying each equally divided share of the parameter file into the corresponding execution image, equally dividing the total concurrency number in the test script according to the number of the execution machines, and copying each equally divided share into the corresponding execution image.
Preferably, the method further comprises: and checking the test script.
Preferably, the verifying the test script comprises:
filling in a storage path of the test script in the development and debugging environment, searching for the test script according to the storage path, checking the file type and existence of the test script, and storing the test script in the performance test management platform after the check is passed.
Preferably, the method further comprises:
and performing statistical analysis on the test result and displaying the analysis result to a user, wherein a statistical table with two dimensions can be generated when the test result is statistically analyzed, the horizontal dimension of the table being an analysis of whether the test result meets its targets and the vertical dimension being a trend analysis of historical test results.
A second aspect of the present application provides a performance testing apparatus, the apparatus comprising:
the access module is used for accessing a development and debugging environment of the performance testing management platform;
the script acquisition module is used for acquiring a test script from the development and debugging environment, wherein the test script comprises the set total concurrency number of the execution machines and the maximum concurrency number of the execution machines;
the calculation module is used for calculating the number of the execution machines and the number of the concurrencies to be executed by each execution machine according to the set total concurrency number of the execution machines and the maximum concurrency number of the execution machines;
the updating module is used for adding the number of concurrencies to be executed by each execution machine to the test script so as to update the test script;
the execution image generation module is used for generating execution images equal in number to the execution machines according to the calculated number of execution machines and the updated test script, wherein the execution images are instruction sets for testing the execution machines and each execution image corresponds to one execution machine; and
the test module is used for controlling the execution machine to perform the test according to the execution image corresponding to the execution machine.
A third aspect of the application provides an electronic device comprising a processor for implementing the performance testing method when executing a computer program stored in a memory.
A fourth aspect of the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the performance testing method.
The method and the device calculate the number of execution machines and the number of concurrencies to be executed by each execution machine according to the set total concurrency number of the execution machines and the set maximum concurrency number of each execution machine, update the test script accordingly, generate execution images equal in number to the execution machines according to the calculated number of execution machines and the updated test script, and control each execution machine to perform the test according to its corresponding execution image, thereby improving the test efficiency of the execution machines.
Drawings
FIG. 1 is a schematic diagram of an application environment of the performance testing method of the present invention.
FIG. 2 is a flow chart of a preferred embodiment of the performance testing method of the present invention.
FIG. 3 is a block diagram of an embodiment of the performance testing apparatus of the present invention.
FIG. 4 is a diagram of an electronic device according to a preferred embodiment of the invention.
Detailed Description
In order that the above objects, features and advantages of the present invention can be more clearly understood, a detailed description of the present invention will be given below with reference to the accompanying drawings and specific embodiments. It should be noted that the embodiments and features of the embodiments of the present application may be combined with each other without conflict.
In the following description, numerous specific details are set forth to provide a thorough understanding of the present invention, and the described embodiments are merely a subset of the embodiments of the present invention, rather than a complete embodiment. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention.
Preferably, the performance testing method of the present invention is applied to one or more electronic devices. The electronic device is a device capable of automatically performing numerical calculation and/or information processing according to a preset or stored instruction, and the hardware includes, but is not limited to, a microprocessor, an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA), a Digital Signal Processor (DSP), an embedded device, and the like.
The electronic device may be a desktop computer, a notebook computer, a tablet computer, a cloud server, or other computing device. The device can be in man-machine interaction with a user through a keyboard, a mouse, a remote controller, a touch pad or voice control equipment and the like.
Example 1
Fig. 1 is a diagram of an application environment of a performance testing method according to an embodiment of the present invention.
Referring to fig. 1, the performance testing method is applied to the performance test management platform 10, which includes a management server 20 and execution machines 30. The management server 20 is connected to the execution machines 30 and includes a development and debugging environment 21. The management server 20 is used to obtain the test script in the development and debugging environment 21 and to control the execution machines 30 to perform the test according to the test script.
FIG. 2 is a flow chart of a performance testing method according to an embodiment of the invention. The order of the steps in the flow chart may be changed and some steps may be omitted according to different needs.
Referring to fig. 2, the performance testing method specifically includes the following steps:
step S201, accessing the development and debugging environment of the performance test management platform 10.
In this embodiment, an Agent control program is started in the performance test management platform 10, and the development and debugging environment is accessed through the Agent control program. In this embodiment, the development and debugging environment is a Linux operating system instance operated through a UI. The Agent control program is a software entity with adaptivity and a degree of intelligence, which can complete jobs on behalf of a user or of other programs in an active-service manner. The Agent control program also performs the following work: feeding back its own startup state to the management server 20; checking whether the required files exist locally in the development and debugging environment 21; and listening for message notifications and, when a message requiring execution is received, starting the JMeter tool to execute the test script stored locally in the development and debugging environment 21.
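By way of illustration only, the following Java sketch shows what such an Agent control program might look like; the class name, the ManagementServerClient callback interface, and the polling loop are assumptions made for this example and are not taken from the patent, while the JMeter invocation uses JMeter's standard non-GUI command line (jmeter -n -t <script> -l <results>).
    // Illustrative sketch only: an Agent-like loop that reports its startup state,
    // checks that the local test script exists, and launches JMeter in non-GUI mode
    // when an execution message arrives. All names here are assumed, not from the patent.
    import java.io.File;
    import java.io.IOException;

    public class AgentControlSketch {

        // Hypothetical stand-in for the platform's messaging layer.
        interface ManagementServerClient {
            void reportState(String state);
            String pollForExecutionMessage();   // returns a script path, or null if none
        }

        private final ManagementServerClient server;

        AgentControlSketch(ManagementServerClient server) {
            this.server = server;
        }

        void run() throws IOException, InterruptedException {
            server.reportState("STARTED");                    // feed back the startup state
            while (true) {
                String scriptPath = server.pollForExecutionMessage();
                if (scriptPath == null) {
                    Thread.sleep(1000);                       // keep listening for notifications
                    continue;
                }
                File script = new File(scriptPath);
                if (!script.exists()) {                       // check local file existence
                    server.reportState("SCRIPT_MISSING");
                    continue;
                }
                // Launch JMeter in non-GUI mode against the locally stored script.
                new ProcessBuilder("jmeter", "-n", "-t", script.getAbsolutePath(),
                                   "-l", "result.jtl")
                        .inheritIO()
                        .start()
                        .waitFor();
            }
        }
    }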
Step S202, obtaining a test script from the development and debugging environment 21.
In this embodiment, the test script is stored in the local storage space of the development and debugging environment 21, and the test script may be obtained from that local storage space. In another embodiment, the test script uploaded by the user is obtained through the development and debugging environment 21. In this embodiment, the test script refers to a series of instructions for a particular test that can be executed by an automated execution tool. For example, the test script may be executed automatically by the JMeter tool. JMeter is a Java-based stress testing tool that can be used to test the performance of static and dynamic resources, such as files, Servlets, Perl scripts, Java objects, databases, FTP servers, and the like. In this embodiment, the test script includes a parameter file, a set total concurrency number of the execution machines 30, and a set maximum concurrency number of each execution machine 30.
Step S203, calculating the number of the execution machines 30 and the number of the concurrencies to be executed by each execution machine 30 according to the set total concurrency number of the execution machines 30 and the maximum concurrency number of the execution machines 30.
In this embodiment, the management server 20 obtains the total concurrency number of the execution machines 30 and the maximum concurrency number of each execution machine 30 set in the test script, and calculates the number of execution machines 30 according to a first preset formula. In this embodiment, the first preset formula is: b = A/a + (A%a == 0 ? 0 : 1), where A is the total concurrency number of the execution machines 30 set by the user, a is the set maximum concurrency number of each execution machine 30, b is the calculated number of execution machines 30, and the division is integer division. The management server 20 further calculates the number of concurrencies to be executed by each execution machine 30 according to a second preset formula, based on the set total concurrency number and the calculated number of execution machines 30. In this embodiment, the second preset formula is: c = A/b + (A%b == 0 ? 0 : 1), where c is the calculated number of concurrencies to be executed by each execution machine 30.
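By way of illustration, the two formulas can be written in a few lines of Java; this is a sketch that assumes the ceiling-division reading given above, and the class and method names are invented for the example rather than taken from the patent.
    // Illustrative only: ceiling-division reading of the two preset formulas.
    public final class ConcurrencyPlan {

        // b = A/a + (A % a == 0 ? 0 : 1): A is the set total concurrency number,
        // a is the set maximum concurrency number of each execution machine.
        static int numberOfMachines(int totalConcurrency, int maxPerMachine) {
            return totalConcurrency / maxPerMachine
                    + (totalConcurrency % maxPerMachine == 0 ? 0 : 1);
        }

        // c = A/b + (A % b == 0 ? 0 : 1): b is the number of machines computed above.
        static int concurrencyPerMachine(int totalConcurrency, int machines) {
            return totalConcurrency / machines
                    + (totalConcurrency % machines == 0 ? 0 : 1);
        }

        public static void main(String[] args) {
            int total = 3000, maxPerMachine = 300;                    // example values
            int machines = numberOfMachines(total, maxPerMachine);    // 10 execution machines
            int perMachine = concurrencyPerMachine(total, machines);  // 300 concurrencies each
            System.out.println(machines + " machines x " + perMachine + " concurrencies");
        }
    }
With a total of 3000 concurrencies and at most 300 per machine, as in the example further below, this reading gives 10 execution machines running 300 concurrencies each.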
Step S204, adding the number of concurrencies to be executed by each of the execution machines 30 to the test script so as to update the test script.
For example, when the calculated number of concurrencies to be executed by each execution machine 30 is 3000, the management server 20 adds this value of 3000 to the test script as the number of concurrencies to be executed by each execution machine 30.
In step S205, the execution images with the same number as the execution machines 30 are generated according to the calculated number of the execution machines 30 and the updated test script, where the execution images are instruction sets for testing the execution machines 30, and each execution image corresponds to an execution machine.
In one embodiment, the management server 20 determines whether the parameter file of the test script can be repeatedly executed. When the parameter file in the test script can be repeatedly executed, the management server 20 copies the parameter file of the test script into each execution image, equally divides the total concurrency number in the test script according to the number of the execution machines 30, and copies each equally divided share into the corresponding execution image. For example, when the number of the execution machines 30 is 10 and the total concurrency number in the test script is 3000, the management server 20 generates 10 execution images according to the number of execution machines 30 and the test script, then copies the parameter file of the test script into each execution image, and finally divides the 3000 concurrencies in the test script into 10 equal shares and copies each share of 300 concurrencies into the corresponding execution image.
When the parameter file in the test script cannot be repeatedly executed, the management server 20 equally divides the data in the parameter file of the test script according to the number of the execution machines 30 and copies each equally divided share of the parameter file into the corresponding execution image, and equally divides the total concurrency number in the test script according to the number of the execution machines 30 and copies each equally divided share into the corresponding execution image.
For example, when the number of the execution machines is 10, the parameter file contains 3,000,000 pieces of data, and the total concurrency number in the test script is 3000, the management server 20 generates 10 execution images according to the number of execution machines 30 and the test script, then divides the data in the parameter file of the test script into 10 equal shares of 300,000 pieces and copies each share into the corresponding execution image, and finally divides the 3000 concurrencies in the test script into 10 equal shares and copies each share of 300 concurrencies into the corresponding execution image.
In this embodiment, for a parameter file that cannot be repeatedly executed in a test script, the data in the parameter file is divided equally according to the calculated number of execution machines, and each share of the parameter file data is written into the corresponding execution image, so that every execution image is guaranteed to execute a different portion of the test data.
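A hedged sketch of this splitting step is given below, assuming the parameter file can be read as a list of data lines and that equal division means contiguous, near-equal chunks; the class and method names are invented for the illustration and do not come from the patent.
    // Illustrative only: split a non-repeatable parameter file into one chunk
    // per execution image so that no two images replay the same data.
    import java.util.ArrayList;
    import java.util.List;

    public final class ParameterSplitter {

        static List<List<String>> split(List<String> lines, int machines) {
            List<List<String>> chunks = new ArrayList<>();
            int base = lines.size() / machines;        // lines every image receives
            int remainder = lines.size() % machines;   // leftover lines, spread one per chunk
            int start = 0;
            for (int i = 0; i < machines; i++) {
                int size = base + (i < remainder ? 1 : 0);
                chunks.add(new ArrayList<>(lines.subList(start, start + size)));
                start += size;
            }
            return chunks;                             // chunks.get(i) is copied into execution image i
        }
    }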
Step S206, controlling the execution machine to perform the test according to the execution image corresponding to the execution machine.
In the present embodiment, the management server 20 tracks the progress of the execution image generation process, and when it confirms that the execution image generation is completed, it notifies the Agent control program by message to control the corresponding execution machine 30 to perform the test according to the execution image corresponding to that execution machine.
In this embodiment, the method further comprises: and checking the test script.
In this embodiment, a storage path of the test script is filled in in the development and debugging environment 21; the development and debugging environment 21 searches for the test script according to the storage path and checks the file type and existence of the test script, and after the check is passed the test script is stored in the performance test management platform 10. For example, when the test script is a JMeter test script, the development and debugging environment 21 can first determine that the type of the test script is JMeter, and then check according to the absolute path whether the test script file actually exists in the development and debugging environment 21.
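For illustration, a minimal version of such a check might look like the sketch below; the use of the .jmx extension to identify a JMeter test plan is standard JMeter practice, but the class and method names are assumptions made for this example.
    // Illustrative only: check the file type (by extension) and the existence of a
    // test script at the given storage path before accepting it.
    import java.io.File;

    public final class ScriptCheck {

        static boolean verifyTestScript(String storagePath) {
            if (storagePath == null || !storagePath.endsWith(".jmx")) {
                return false;                            // JMeter test plans use the .jmx extension
            }
            File script = new File(storagePath);
            return script.exists() && script.isFile();   // existence check at the given path
        }
    }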
In this embodiment, the method further comprises:
the test results of each execution image are stored in the database of the management server 20.
In this embodiment, the method further includes:
and carrying out statistical analysis on the test result, and displaying the analysis result to a user.
In this embodiment, a statistical table with two dimensions may be generated when performing statistical analysis on the test result, where the horizontal dimension of the table is an analysis of whether the test result meets its targets and the vertical dimension is a trend analysis of historical test results.
Example 2
FIG. 3 is a block diagram of a preferred embodiment of the performance testing apparatus 40 of the present invention.
In some embodiments, the performance testing device 40 operates in an electronic device. The performance testing apparatus 40 may include a plurality of functional modules composed of program code segments. The program code of the various program segments in the performance testing apparatus 40 may be stored in a memory and executed by at least one processor to perform the functions of the performance testing.
In this embodiment, the performance testing apparatus 40 may be divided into a plurality of functional modules according to the functions executed by the apparatus. Referring to fig. 3, the performance testing apparatus 40 may include an access module 401, a script obtaining module 402, a calculating module 403, an updating module 404, an execution image generating module 405, and a testing module 406. The modules referred to herein are a series of computer program segments stored in a memory that can be executed by at least one processor and that perform a fixed function. In some embodiments, the functionality of the modules will be described in greater detail in subsequent embodiments.
The access module 401 is used to access the development and debugging environment of the performance testing management platform 10.
In this embodiment, the access module 401 starts an Agent control program in the performance test management platform 10 and accesses the development and debugging environment through the Agent control program. In this embodiment, the development and debugging environment is a Linux operating system instance operated through a UI. The Agent control program is a software entity with adaptivity and a degree of intelligence, which can complete jobs on behalf of a user or of other programs in an active-service manner. The Agent control program also performs the following work: feeding back its own startup state to the management server 20; checking whether the required files exist locally in the development and debugging environment 21; and listening for message notifications and, when a message requiring execution is received, starting the JMeter tool to execute the test script stored locally in the development and debugging environment 21.
The script obtaining module 402 is configured to obtain a test script from the development and debugging environment 21.
In this embodiment, the test script is stored in the local storage space of the development and debugging environment 21, and the script obtaining module 402 may obtain the test script from that local storage space. In another embodiment, the script obtaining module 402 obtains the test script uploaded by the user through the development and debugging environment 21. In this embodiment, the test script refers to a series of instructions for a particular test that can be executed by an automated execution tool. For example, the test script may be executed automatically by the JMeter tool. JMeter is a Java-based stress testing tool that can be used to test the performance of static and dynamic resources, such as files, Servlets, Perl scripts, Java objects, databases, FTP servers, and the like. In this embodiment, the test script includes a parameter file, a set total concurrency number of the execution machines 30, and a set maximum concurrency number of each execution machine 30.
The calculating module 403 is configured to calculate the number of the execution machines 30 and the number of concurrencies to be executed by each execution machine 30 according to the set total concurrency number of the execution machines 30 and the set maximum concurrency number of each execution machine 30.
In this embodiment, the calculating module 403 obtains the total concurrency number of the execution machines 30 and the maximum concurrency number of each execution machine 30 set in the test script, and calculates the number of execution machines 30 according to a first preset formula. In this embodiment, the first preset formula is: b = A/a + (A%a == 0 ? 0 : 1), where A is the total concurrency number of the execution machines 30 set by the user, a is the set maximum concurrency number of each execution machine 30, b is the calculated number of execution machines 30, and the division is integer division. The calculating module 403 further calculates the number of concurrencies to be executed by each execution machine 30 according to a second preset formula, based on the set total concurrency number and the calculated number of execution machines 30. In this embodiment, the second preset formula is: c = A/b + (A%b == 0 ? 0 : 1), where c is the calculated number of concurrencies to be executed by each execution machine 30.
The updating module 404 is configured to add the number of concurrencies to be executed by each of the execution machines 30 to the test script so as to update the test script.
For example, when the calculated number of concurrencies to be executed by each execution machine 30 is 3000, the update module 404 adds this value of 3000 to the test script as the number of concurrencies to be executed by each execution machine 30.
The execution image generation module 405 is configured to generate execution images with the same number as the execution machines 30 according to the calculated number of the execution machines 30 and the updated test script, where the execution images are instruction sets for testing the execution machines 30, and each execution image corresponds to an execution machine.
In a specific embodiment, the execution image generation module 405 determines whether the parameter file of the test script can be repeatedly executed. When the parameter file in the test script can be repeatedly executed, the execution image generation module 405 copies the parameter file of the test script into each execution image, equally divides the total concurrency number in the test script according to the number of the execution machines 30, and copies each equally divided share into the corresponding execution image. For example, when the number of the execution machines 30 is 10 and the total concurrency number in the test script is 3000, the execution image generation module 405 generates 10 execution images according to the number of execution machines 30 and the test script, then copies the parameter file of the test script into each execution image, and finally divides the 3000 concurrencies in the test script into 10 equal shares and copies each share of 300 concurrencies into the corresponding execution image.
When the parameter file in the test script cannot be repeatedly executed, the execution image generation module 405 equally divides the data in the parameter file of the test script according to the number of the execution machines 30, copies each equally divided share of the parameter file into the corresponding execution image, equally divides the total concurrency number in the test script according to the number of the execution machines 30, and copies each equally divided share into the corresponding execution image.
For example, when the number of the execution machines is 10, the parameter file contains 3,000,000 pieces of data, and the total concurrency number in the test script is 3000, the execution image generation module 405 generates 10 execution images according to the number of execution machines 30 and the test script, then divides the data in the parameter file of the test script into 10 equal shares and copies each share into the corresponding execution image, and finally divides the 3000 concurrencies in the test script into 10 equal shares and copies each share of 300 concurrencies into the corresponding execution image.
In this embodiment, for a parameter file that cannot be repeatedly executed in the test script, the execution image generation module 405 equally divides the data in the parameter file according to the calculated number of execution machines and writes each equally divided share of the parameter file data into the corresponding execution image, so as to ensure that each execution image executes a different portion of the test data.
The test module 406 is configured to control the execution machine to perform a test according to an execution image corresponding to the execution machine.
In this embodiment, the test module 406 tracks the progress of the execution image generation process and, when it confirms that the execution image generation is completed, notifies the Agent control program of the corresponding execution machine 30 via a message so that the test is performed according to the execution image corresponding to that execution machine.
In this embodiment, the test module 406 is further configured to verify the test script. In this embodiment, a storage path of the test script is filled in in the development and debugging environment 21; the test module 406 searches for the test script according to the storage path, checks the file type and existence of the test script, and stores the test script in the performance test management platform 10 after the check is passed. For example, when the test script is a JMeter test script, the test module 406 can first determine that the type of the test script is JMeter, and then check according to the storage path whether the test script file actually exists in the development and debugging environment 21.
In this embodiment, the test module 406 is further configured to store the test result of each execution image in the database of the management server 20.
In this embodiment, the test module 406 is further configured to perform statistical analysis on the test result and display the analysis result to the user. In an embodiment, the test module 406 may generate a two-dimensional statistical table when performing statistical analysis on the test result, where the horizontal dimension of the table is an analysis of whether the test result meets its targets and the vertical dimension is a trend analysis of historical test results.
Example 3
FIG. 4 is a diagram of an electronic device 6 according to a preferred embodiment of the present invention.
The electronic device 6 comprises a memory 61, a processor 62 and a computer program 63 stored in the memory 61 and executable on the processor 62. The processor 62 implements the steps in the performance testing method embodiments described above, such as the steps S201 to S206 shown in fig. 2, when executing the computer program 63. Alternatively, the processor 62 implements the functions of the modules/units in the above embodiments of the performance testing apparatus, such as the modules 401 to 406 in fig. 3, when executing the computer program 63.
Illustratively, the computer program 63 may be partitioned into one or more modules/units that are stored in the memory 61 and executed by the processor 62 to carry out the invention. The one or more modules/units may be a series of computer program instruction segments capable of performing certain functions, the instruction segments being used for describing the execution process of the computer program 63 in the electronic device 6. For example, the computer program 63 may be divided into an accessing module 401, a script obtaining module 402, a calculating module 403, an updating module 404, an execution image generating module 405, and a testing module 406 in fig. 3, where specific functions of each module are described in embodiment two.
In this embodiment, the electronic device 6 and the performance test management platform 10 may be the same device. For example, the electronic device 6 may be a computing device such as a desktop computer, a notebook, a palm computer, and a cloud server. It will be appreciated by those skilled in the art that the schematic diagram is merely an example of the electronic device 6, and does not constitute a limitation of the electronic device 6, and may include more or less components than those shown, or some components may be combined, or different components, for example, the electronic device 6 may further include an input-output device, a network access device, a bus, etc.
The processor 62 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor 62 may be any conventional processor; the processor 62 is the control center of the electronic device 6, and connects the parts of the entire electronic device 6 through various interfaces and lines.
The memory 61 may be used for storing the computer programs 63 and/or modules/units, and the processor 62 may implement various functions of the electronic device 6 by running or executing the computer programs and/or modules/units stored in the memory 61 and calling data stored in the memory 61. The memory 61 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the stored data area may store data (such as audio data, a phonebook, etc.) created according to the use of the electronic device 6, and the like. In addition, the memory 61 may include high speed random access memory, and may also include non-volatile memory, such as a hard disk, a memory, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), at least one magnetic disk storage device, a Flash memory device, or other volatile solid state storage device.
The integrated modules/units of the electronic device 6, if implemented in the form of software functional modules and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium, and which, when executed by a processor, may implement the steps of the above-described embodiments of the method. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, read-Only Memory (ROM), random Access Memory (RAM), electrical carrier wave signals, telecommunications signals, software distribution medium, and the like. It should be noted that the computer readable medium may contain content that is subject to appropriate increase or decrease as required by legislation and patent practice in jurisdictions, for example, in some jurisdictions, computer readable media does not include electrical carrier signals and telecommunications signals as is required by legislation and patent practice.
In the several embodiments provided in the present invention, it should be understood that the disclosed electronic device and method may be implemented in other manners. For example, the above-described embodiments of the electronic device are merely illustrative, and for example, the division of the modules is only one logical functional division, and other divisions may be implemented in practice.
In addition, each functional module in each embodiment of the present invention may be integrated into the same processing module, or each module may exist alone physically, or two or more modules may be integrated into the same module. The integrated module can be realized in a hardware form, and can also be realized in a form of hardware and a software functional module.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, it is to be understood that the word "comprising" does not exclude other modules or steps, and the singular does not exclude the plural. Several modules or electronic devices recited in the electronic device claims may also be implemented by one and the same module or electronic device by means of software or hardware. The terms first, second, etc. are used to denote names, but not to denote any particular order.
Finally, it should be noted that the above embodiments are only for illustrating the technical solutions of the present invention and not for limiting, and although the present invention is described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications or equivalent substitutions may be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions of the present invention.

Claims (10)

1. A method of performance testing, the method comprising:
accessing a development and debugging environment of a performance test management platform;
acquiring a test script from the development and debugging environment, wherein the test script comprises the set total concurrency number of the execution machines and the maximum concurrency number of the execution machines;
calculating the number of the execution machines and the concurrent number to be executed by each execution machine according to the set total concurrent number of the execution machines and the maximum concurrent number of the execution machines;
adding the number of concurrencies to be executed by each execution machine to the test script so as to update the test script;
generating execution images with the same number as the execution machines according to the calculated number of the execution machines and the updated test script, wherein the execution images are instruction sets for testing the execution machines, and each execution image corresponds to one execution machine; and
controlling the execution machine to perform the test according to the execution image corresponding to the execution machine.
2. The performance testing method of claim 1, wherein the accessing a development debugging environment of a performance testing management platform comprises:
and starting an Agent control program in the performance test management platform and accessing the development and debugging environment through the Agent control program.
3. The performance testing method of claim 1, wherein the calculating the number of the execution machines and the number of the concurrency to be executed by each execution machine according to the set total concurrency number of the execution machines and the set maximum concurrency number of the execution machines comprises:
calculating the number of the execution machines according to a first preset formula, wherein the first preset formula is b = A/a + (A%a == 0 ? 0 : 1), A being the set total concurrency number of the execution machines, a being the set maximum concurrency number of each execution machine, and b being the number of the execution machines; and
calculating the number of concurrencies to be executed by each of the execution machines according to a second preset formula, wherein the second preset formula is c = A/b + (A%b == 0 ? 0 : 1), c being the number of concurrencies to be executed by each execution machine.
4. The performance testing method of claim 1, wherein the test script comprises a parameter file, and the generating of the same number of execution images as the number of execution machines according to the calculated number of execution machines and the updated test script comprises:
judging whether the parameter file of the test script can be repeatedly executed or not;
when the parameter file in the test script can be repeatedly executed, copying the parameter file of the test script into each execution image, equally dividing the total concurrency number in the test script according to the number of the execution machines, and copying each equally divided share into the corresponding execution image; and
when the parameter file in the test script cannot be repeatedly executed, equally dividing the data in the parameter file of the test script according to the number of the execution machines, copying each equally divided share of the parameter file into the corresponding execution image, equally dividing the total concurrency number in the test script according to the number of the execution machines, and copying each equally divided share into the corresponding execution image.
5. The performance testing method of claim 1, wherein the method further comprises: and verifying the test script.
6. The performance testing method of claim 5, wherein said verifying said test script comprises:
filling in a storage path of the test script in the development and debugging environment, searching for the test script according to the storage path, checking the file type and existence of the test script, and storing the test script in the performance test management platform after the check is passed.
7. The performance testing method of claim 1, wherein the method further comprises:
and performing statistical analysis on the test result and displaying the analysis result to a user, wherein a statistical table with two dimensions can be generated when the test result is statistically analyzed, the horizontal dimension of the table being an analysis of whether the test result meets its targets and the vertical dimension being a trend analysis of historical test results.
8. A performance testing apparatus, the apparatus comprising:
the access module is used for accessing the development and debugging environment of the performance test management platform;
the script acquisition module is used for acquiring a test script from the development and debugging environment, wherein the test script comprises the set total concurrency number of the execution machines and the maximum concurrency number of the execution machines;
the calculation module is used for calculating the number of the execution machines and the number of the concurrencies to be executed by each execution machine according to the set total concurrency number of the execution machines and the maximum concurrency number of the execution machines;
the updating module is used for adding the number of concurrencies to be executed by each execution machine to the test script so as to update the test script;
the execution image generation module is used for generating execution images equal in number to the execution machines according to the calculated number of the execution machines and the updated test script, wherein the execution images are instruction sets for testing the execution machines and each execution image corresponds to one execution machine; and
the test module is used for controlling the execution machine to perform the test according to the execution image corresponding to the execution machine.
9. An electronic device, characterized in that: the electronic device comprises a processor for implementing the performance testing method of any one of claims 1-7 when executing a computer program stored in a memory.
10. A computer-readable storage medium having stored thereon a computer program, characterized in that: the computer program, when executed by a processor, implements the performance testing method of any of claims 1-7.
CN201910310587.9A 2019-04-17 2019-04-17 Performance test method and device, electronic equipment and computer readable storage medium Active CN110309038B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910310587.9A CN110309038B (en) 2019-04-17 2019-04-17 Performance test method and device, electronic equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910310587.9A CN110309038B (en) 2019-04-17 2019-04-17 Performance test method and device, electronic equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN110309038A CN110309038A (en) 2019-10-08
CN110309038B 2023-02-07

Family

ID=68074413

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910310587.9A Active CN110309038B (en) 2019-04-17 2019-04-17 Performance test method and device, electronic equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN110309038B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114490364A (en) * 2022-01-14 2022-05-13 合肥力动软件开发有限公司 Method of concurrently executing automated test scripts based on automated test virtual machine technology

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105760286A (en) * 2016-02-17 2016-07-13 中国工商银行股份有限公司 Application database dynamic property detection method and detection device
CN106067043A (en) * 2016-06-01 2016-11-02 重庆中科云丛科技有限公司 A kind of performance test methods and system
CN107341098A (en) * 2017-07-13 2017-11-10 携程旅游信息技术(上海)有限公司 Software performance testing method, platform, equipment and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10042709B2 (en) * 2011-06-06 2018-08-07 International Business Machines Corporation Rebuild prioritization during a plurality of concurrent data object write operations
CN109359031B (en) * 2018-09-04 2023-08-22 中国平安人寿保险股份有限公司 Multi-device application program testing method and device, server and storage medium
CN109460333A (en) * 2018-11-01 2019-03-12 郑州云海信息技术有限公司 A kind of function test method of storage system, device and relevant device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105760286A (en) * 2016-02-17 2016-07-13 中国工商银行股份有限公司 Application database dynamic property detection method and detection device
CN106067043A (en) * 2016-06-01 2016-11-02 重庆中科云丛科技有限公司 A kind of performance test methods and system
CN107341098A (en) * 2017-07-13 2017-11-10 携程旅游信息技术(上海)有限公司 Software performance testing method, platform, equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
A LoadRunner-based performance … (基于LoadRunner的一种性能); Li Yi et al. (李怡等); Application Research of Computers (《计算机应用研究》); 2009-11-30; full text *

Also Published As

Publication number Publication date
CN110309038A (en) 2019-10-08

Similar Documents

Publication Publication Date Title
CN113127347B (en) Interface testing method, device, equipment and readable storage medium
CN108460068B (en) Method, device, storage medium and terminal for importing and exporting report
US7415444B2 (en) Determining compliance rates for probabilistic requests
CN110704297A (en) Code evaluation method and device, computer equipment and storage medium
US20110161063A1 (en) Method, computer program product and apparatus for providing an interactive network simulator
CN111026670B (en) Test case generation method, test case generation device and storage medium
CN110806970A (en) Client test method and device based on simulation server response and electronic equipment
CN112256670A (en) Data migration method, terminal device and readable storage medium
CN113886162A (en) Computing equipment performance test method, computing equipment and storage medium
CN114035864A (en) Interface processing method, interface processing device, electronic device and storage medium
CN113886260A (en) Automated testing method, system, computer equipment and storage medium
CN114816993A (en) Full link interface test method, system, medium and electronic equipment
JP7633398B2 (en) Providing application error data for use by third party library development systems
US20220164182A1 (en) Code review using quantitative linguistics
CN111290942A (en) Pressure testing method, device and computer readable medium
CN113176993A (en) Case testing method and device, electronic equipment and storage medium
CN110309038B (en) Performance test method and device, electronic equipment and computer readable storage medium
CN111459814A (en) Automatic test case generation method and device and electronic equipment
CN111897728B (en) Interface debugging method and related equipment
US20180203790A1 (en) Detection of software errors
US20210216434A1 (en) Creation of minimal working examples and environments for troubleshooting code issues
CN112416738A (en) Image testing method, device, computer device and readable storage medium
CN112579428A (en) Interface testing method and device, electronic equipment and storage medium
CN112631949B (en) Debugging method and device, computer equipment and storage medium
TW202307670A (en) Device and method for automated generation of parameter testing requests

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant