US20130326486A1 - Keyword based software testing system and method - Google Patents

Keyword based software testing system and method

Info

Publication number
US20130326486A1
Authority
US
United States
Prior art keywords
test
software application
software
test system
glossary
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/959,488
Inventor
Rick R. Roth
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Open Text SA
Original Assignee
Open Text SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Open Text SA filed Critical Open Text SA
Priority to US13/959,488
Assigned to CAPTARIS, INC.: Assignment of assignors interest (see document for details). Assignors: ROTH, RICK R.
Assigned to OPEN TEXT INC.: Merger (see document for details). Assignors: CAPTARIS, INC.
Assigned to OPEN TEXT S.A.: Assignment of assignors interest (see document for details). Assignors: OPEN TEXT INC.
Publication of US20130326486A1
Current legal status: Abandoned

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/36: Prevention of errors by analysis, debugging or testing of software
    • G06F 11/362: Debugging of software
    • G06F 11/3636: Debugging of software by tracing the execution of the program
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/36: Prevention of errors by analysis, debugging or testing of software
    • G06F 11/3668: Testing of software
    • G06F 11/3672: Test management
    • G06F 11/3688: Test management for test execution, e.g. scheduling of test suites
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/36: Prevention of errors by analysis, debugging or testing of software
    • G06F 11/3668: Testing of software
    • G06F 11/3672: Test management
    • G06F 11/3684: Test management for test design, e.g. generating new test cases
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 11/00: Error detection; Error correction; Monitoring
    • G06F 11/30: Monitoring
    • G06F 11/34: Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F 11/3409: Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment
    • G06F 11/3414: Workload generation, e.g. scripts, playback

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Debugging And Monitoring (AREA)

Abstract

An improved software testing system is described. In various embodiments, the system may produce a glossary containing keywords from the program state information associated with a software application to be tested. The system may produce a state engine model for the software application utilizing the program state information associated with the software application. The system may generate a test script by causing the software application to move through states identified by the state engine model such that the test script can identify a sequence of steps or keywords from the glossary. The system may select a keyword based on the current state of the software application and persona information. A persona may indicate a type of testing.

Description

    CROSS-REFERENCE TO RELATED APPLICATION(S)
  • This application is a continuation of U.S. patent application Ser. No. 11/749,609, filed May 16, 2007, which is a conversion of and claims the benefit of U.S. Provisional Application No. 60/800,866, filed on May 16, 2006, both of which are incorporated herein by reference.
  • BACKGROUND
  • Large software development projects can have multiple phases, including specification, development, and testing. Various software development methodologies include repeating some or all of these phases multiple times, such as in large or complex software development projects. Professional software development teams generally employ testers to test software before it is released to customers. The testers may test software to ensure correctness, completeness, security, and quality. When the tested software does not conform to a tester's expectations, the tester may identify a software defect (“bug”). The tester may provide a sequence of steps so that a software developer can reproduce the defect. The software developer may then resolve the defect, such as by fixing source code and producing a new “build” of the software. It is well known in the art that fixing bugs sometimes introduces other bugs. Thus, testers often perform regression testing, which could involve following the steps previously identified as producing the defect.
  • Various software testing techniques exist. These techniques can generally be classified as manual testing and automated testing. Manual testing requires a human to perform most of the testing steps. As an example, the human may test the software by either following various scenarios that detail a potential user's interactions with the software or by taking various steps, such as randomly, to identify defects. When performing automated testing, software testers use test automation tools to automatically cause the tested software to take various steps. For example, test automation tools can record a tester's interactions with software as steps and then play back the steps. Some test automation tools employ frameworks that interact with tested software programmatically, such as by using an application program interface (API) provided by the tested software.
  • Automated testing techniques other than recording and playing back steps can further be divided into keyword-based testing and model-based testing. In keyword-based testing, each discrete interaction with the tested software is assigned a keyword and can have associated parameters. As an example, the process of logging in can be associated with a keyword “login” and have as associated parameters a user identifier and a password. To automatically test software, the software tester may specify one or more keywords, such as in a sequence, so that the test automation tool performs steps relating to each specified keyword. In model-based testing, the software tester specifies (or causes to be specified) a state engine model of the tested software. The state engine model can identify a set of states relating to the tested software and interactions that cause the tested software to move from one state to another. A software tester can then specify a test suite that moves the tested software from, through, or to various states. As an example, the test suite may specify that the tested software is to progress from a “not logged in” state through a “logging in” state to a “logged in” state.
  • Conventionally, keyword-based testing was thought to be useful for regression testing. On the other hand, model-based testing was thought to be good for discovering software defects.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram illustrating components of an improved software testing system in various implementations.
  • FIG. 2 is a block diagram illustrating additional components of an improved software testing system in various implementations.
  • FIG. 3 is a flow diagram illustrating a configure routine invoked by the improved software testing system in some implementations.
  • FIG. 4 is a flow diagram illustrating a develop_test_script routine invoked by the improved software testing system in some implementations.
  • FIG. 5 is a flow diagram illustrating a develop_random_test_script routine invoked by the improved software testing system in some implementations.
  • FIG. 6 is a flow diagram illustrating an execute_test_script routine invoked by the improved software testing system in some implementations.
  • In the drawings, identical reference numbers identify similar elements or acts. The sizes and relative positions of elements in the drawings are not necessarily drawn to scale. For example, the shapes of various elements and angles are not drawn to scale, and some of these elements are arbitrarily enlarged and positioned to improve drawing legibility. Further, the particular shapes of the elements as drawn are not intended to convey any information regarding the actual shape of the particular elements, and have been solely selected for ease of recognition in the drawings.
  • DESCRIPTION
  • An improved software testing system is described. In various implementations, the improved software testing system generates a state engine model when a software tester specifies keywords and employs the state engine model to automate testing of software. The specified keywords are stored in a glossary of keywords. As an example, the specified keywords can be stored in an extensible markup language (XML) document. The glossary can contain keywords and parameters or other specification of valid input, such as prior or next states associated with each keyword. The glossary can also contain default values for various parameters. A software tester can then associate one or more keywords with test scripts that can later be selected for automating tests.
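  • As a concrete illustration (not part of the original disclosure), the sketch below shows one minimal way such an XML glossary might look and how it could be loaded into a lookup table; the element names, attribute names, and Python helper are hypothetical.

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical glossary document. The description only says that
# keywords, parameters, default values, and prior/next states can be stored
# in XML; the exact schema shown here is an assumption made for illustration.
GLOSSARY_XML = """
<glossary>
  <keyword name="login" priorState="not logged in" nextState="logged in">
    <parameter name="user_id" default="test_user"/>
    <parameter name="password" default="secret"/>
  </keyword>
  <keyword name="logout" priorState="logged in" nextState="not logged in"/>
</glossary>
"""

def load_glossary(xml_text):
    """Parse the glossary XML into {keyword: {prior, next, params}}."""
    glossary = {}
    for kw in ET.fromstring(xml_text).findall("keyword"):
        glossary[kw.get("name")] = {
            "prior": kw.get("priorState"),
            "next": kw.get("nextState"),
            "params": {p.get("name"): p.get("default")
                       for p in kw.findall("parameter")},
        }
    return glossary

if __name__ == "__main__":
    print(load_glossary(GLOSSARY_XML))
```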
  • The improved software testing system can receive test components from the software tester. As an example, the improved software testing system may provide a wizard interface that enables the software tester to associate keywords with test components. When a test script references a keyword, the improved software testing system can execute associated components and provide parameters identified in the glossary corresponding to the referenced keyword. A test component can be executable logic that is specified in a script language or object code. The test component can interact with software that is to be tested, such as via a testing application or framework. The testing application or framework may interact with the tested software via an API provided by the tested software, an API provided by an underlying operating system, and so forth.
  • The improved software testing system can employ one or more personas. A persona is an indication of one or more test scripts that are to execute and associated weightings. As an example, a persona can indicate that one test script is to execute 75% of the time and another test script is to execute 25% of the time. When the software tester begins automated testing and selects this persona, the improved software testing system can employ the test scripts at the indicated proportions.
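  • A simple way to realize the 75%/25% behavior described above is weighted random selection; the following Python sketch illustrates that idea under invented persona contents and script names, and is not the patented implementation.

```python
import random

# Hypothetical persona: test script names mapped to relative weights.
# "smoke_test" should run roughly 75% of the time, "stress_test" 25%.
PERSONA = {"smoke_test": 0.75, "stress_test": 0.25}

def pick_script(persona, rng=random):
    """Choose one test script according to the persona's weightings."""
    names = list(persona)
    weights = [persona[n] for n in names]
    return rng.choices(names, weights=weights, k=1)[0]

if __name__ == "__main__":
    runs = [pick_script(PERSONA) for _ in range(1000)]
    print({name: runs.count(name) for name in PERSONA})  # roughly 750 / 250
```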
  • Once the improved software testing system is configured with keywords and test components, a software tester can generate test scripts by specifying keywords or can request the improved software testing system to automatically generate test scripts. The improved software testing system can automatically generate test scripts by causing the tested software to move through the various states identified by the generated state engine model.
  • The improved software testing system thus can automatically generate “random walk” tests and persona-based or weighted-path tests.
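  • The sketch below illustrates the "random walk" idea under assumed inputs: a hand-written transition table stands in for the generated state engine model, and the keyword and state names are invented for illustration.

```python
import random

# Hypothetical state engine model: current state -> {keyword: next state}.
MODEL = {
    "not running":   {"start": "not logged in"},
    "not logged in": {"login": "logged in", "exit": "finished"},
    "logged in":     {"logout": "not logged in", "exit": "finished"},
}

def random_walk(model, start="not running", end="finished", max_steps=20,
                rng=random):
    """Walk the model from start to end, recording keywords as a test script."""
    script, state = [], start
    for _ in range(max_steps):
        if state == end or state not in model:
            break
        keyword = rng.choice(list(model[state]))  # uniform "random walk" step
        script.append(keyword)
        state = model[state][keyword]
    return script

if __name__ == "__main__":
    print(random_walk(MODEL))  # e.g. ['start', 'login', 'logout', 'exit']
```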
  • The improved software testing system can also enable syntax checking, transition checking, and automatic generation of parameters during manual specification of test scripts. When a software developer specifies a sequence of keywords, the improved software testing system can verify whether the software developer has provided the appropriate, syntactically correct parameters. If the software developer has not specified syntactically correct parameters, the improved software testing system can warn the software tester or make appropriate corrections. The improved software testing system can verify whether test scripts are appropriately designed to cause the tested software to transition from one state to another. The improved software testing system can consult the state engine model to make this verification. The improved software testing system can automatically generate parameters during manual specification of test scripts by employing the parameters specified in the glossary. The parameters can be values, ranges of values, option selections, and so forth.
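  • The following sketch illustrates how such syntax and transition checks might be performed against a glossary; the glossary fragment, state names, and warning format are assumptions made for illustration only.

```python
# Hypothetical glossary fragment: each keyword lists its required parameters
# and the state transition it performs.
GLOSSARY = {
    "start":  {"params": [],                      "prior": "not running",   "next": "not logged in"},
    "login":  {"params": ["user_id", "password"], "prior": "not logged in", "next": "logged in"},
    "logout": {"params": [],                      "prior": "logged in",     "next": "not logged in"},
}

def check_script(steps, glossary, initial_state="not running"):
    """Return warnings for missing parameters or invalid state transitions."""
    warnings, state = [], initial_state
    for keyword, params in steps:
        entry = glossary.get(keyword)
        if entry is None:
            warnings.append(f"unknown keyword: {keyword}")
            continue
        for name in entry["params"]:             # syntax check: required parameters
            if name not in params:
                warnings.append(f"{keyword}: missing parameter '{name}'")
        if entry["prior"] != state:              # transition check against the model
            warnings.append(f"{keyword}: expects state '{entry['prior']}', found '{state}'")
        state = entry["next"]
    return warnings

if __name__ == "__main__":
    script = [("start", {}), ("login", {"user_id": "alice"}), ("logout", {})]
    print(check_script(script, GLOSSARY))  # flags the missing 'password'
```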
  • Thus, the improved software testing system enables efficient test automation while simultaneously mitigating errors from incorrect input by software testers.
  • When a test script executes, the improved software testing system can store the steps and the results of the steps, such as in a log file. When a software developer needs to reproduce a defect the test script identifies, the software developer can review the stored steps and results to more easily locate the software code causing the defect.
  • The improved software testing system will now be described with reference to the Figures. FIG. 1 is a block diagram illustrating components of an improved software testing system in various implementations. The improved software testing system 100 includes a test platform component 102, glossary component 104, keyword engine component 106, test script component 108, application component 110, and persona component 112.
  • The test platform component 102 can include a test application and other components that facilitate testing software. The test application can be part of a testing platform, such as an application that coordinates other components of the improved software testing system. The test application can employ the glossary component 104 and test script component 108 to execute test scripts. The test platform component can also automatically generate test scripts, such as based on the state engine model.
  • The glossary component 104 can be a document, file, or other repository of information associated with software that is to be tested. As is illustrated in FIG. 2, a glossary 200 can include keywords 202 and runtime states 204. Keywords are commands or sets of commands that the improved software testing system can execute when executing a test script. Runtime states are states of the software that is to be tested. The runtime states can include start states and end states relating to a keyword. The glossary can also include parameters (e.g., values associated with a command of the tested software), parameter default values, weights, and so forth. The weights can specify how often a corresponding keyword is to be employed, such as by a test script.
  • When the glossary indicates that a particular keyword requires parameters, the test application can prompt the software tester for values or can generate values for the parameters.
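  • A minimal sketch of that behavior, assuming a hypothetical glossary layout in which a parameter default of None means "no default, prompt the tester":

```python
# Hypothetical glossary fragment with parameter defaults.
GLOSSARY = {
    "login": {"params": {"user_id": "test_user", "password": None}},
}

def resolve_parameters(keyword, glossary, ask=input):
    """Fill parameter values from glossary defaults, prompting otherwise."""
    values = {}
    for name, default in glossary[keyword]["params"].items():
        if default is not None:
            values[name] = default                    # use the glossary default
        else:
            values[name] = ask(f"{keyword}.{name}? ")  # prompt the software tester
    return values

if __name__ == "__main__":
    # Supply a canned "prompt" so the example runs non-interactively.
    print(resolve_parameters("login", GLOSSARY, ask=lambda msg: "secret"))
```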
  • Returning to FIG. 1, the keyword engine component 106 can employ keywords from the glossary and identify test components that a software developer has provided corresponding to the keywords. The test components can cause the tested software to invoke a command, such as by employing an API provided by the tested software or operating system.
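  • The sketch below illustrates the keyword-to-test-component lookup with invented component functions; a real test component would drive the tested software through its API rather than print messages.

```python
# Hypothetical test components registered against keywords. The mapping and
# component bodies are illustrative assumptions, not the patented design.
def login_component(user_id, password):
    print(f"driving tested software: login as {user_id}")

def logout_component():
    print("driving tested software: logout")

TEST_COMPONENTS = {"login": login_component, "logout": logout_component}

def run_keyword(keyword, parameters):
    """Look up the test component registered for a keyword and invoke it."""
    component = TEST_COMPONENTS[keyword]
    return component(**parameters)

if __name__ == "__main__":
    run_keyword("login", {"user_id": "alice", "password": "secret"})
    run_keyword("logout", {})
```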
  • One or more test scripts can identify a sequence of steps, such as by identifying keywords from the glossary. The test scripts can be created manually or automatically. As an example, a software tester can create a test script by specifying one or more keywords. Alternatively, the test application can automatically create test scripts.
  • The application component 110 is a software application that is to be tested. The improved software testing system can test various applications.
  • The improved software testing system can function with one or more persona components 112. The persona can identify the type of testing, duration, the test scripts to execute, weights for the test scripts, and so forth.
  • The computing devices on which the improved software testing system operates may include one or more central processing units, memory, input devices (e.g., keyboard and pointing devices), output devices (e.g., display devices), storage devices (e.g., disk drives), and network devices (e.g., network interfaces). The memory and storage devices are computer-readable media that may store instructions that implement the improved software testing system. In addition, the data structures and message structures may be stored or transmitted via a data transmission medium, such as a signal on a communications link. Various communications links may be employed, such as the Internet, a local area network, a wide area network, or a point-to-point dial-up connection.
  • The improved software testing system may use various computing systems or devices, including personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, programmable consumer electronics, electronic game consoles, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like. The improved software testing system may also provide its services to various computing systems, such as personal computers, cell phones, personal digital assistants, consumer electronics, home automation devices, and so on.
  • The improved software testing system may be described in the general context of computer-executable instructions, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, and so on that perform particular tasks or implement particular abstract data types. Typically, the functionality of the program modules may be combined or distributed as desired in various embodiments.
  • Turning now to FIG. 3, a flow diagram is illustrated describing a configure routine invoked by the improved software testing system in some implementations. The routine 300 is invoked to configure the improved software testing system. The routine begins at block 302. At block 304, the routine receives program state information, such as state information associated with the software that is to be tested. The improved software testing system can employ the program state information to derive a state engine model for the software to be tested. At block 306, the routine produces a glossary of keywords. The routine can produce the glossary of keywords by generating keywords from the received program state information. Alternatively, the routine may receive the keywords from a user. The glossary can include other information associated with the keywords, such as parameters. At block 308, the routine produces a model of the application's (e.g., software's) behavior. At block 310, the routine returns.
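  • A minimal sketch of this configure step, assuming the program state information arrives as a simple list of (keyword, prior state, next state) transitions; that input format is an assumption made purely for illustration.

```python
# Hypothetical program state information: one entry per observed transition.
TRANSITIONS = [
    ("start",  "not running",   "not logged in"),
    ("login",  "not logged in", "logged in"),
    ("logout", "logged in",     "not logged in"),
]

def configure(transitions):
    """Build a keyword glossary and a state engine model from the transitions."""
    glossary, model = {}, {}
    for keyword, prior, nxt in transitions:
        glossary[keyword] = {"prior": prior, "next": nxt}   # glossary entry
        model.setdefault(prior, {})[keyword] = nxt          # model edge
    return glossary, model

if __name__ == "__main__":
    glossary, model = configure(TRANSITIONS)
    print(glossary)
    print(model)
```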
  • FIG. 4 is a flow diagram illustrating a develop_test_script routine invoked by the improved software testing system in some implementations. The routine 400 is invoked to develop a test script, such as under the direction of a software tester. The routine begins at block 402. At block 404, the routine provides a set of keywords based on the current state of the software to be tested. When the routine is first invoked, the routine may assume that the software is in a “not running” state. The routine may retrieve this information from the glossary and provide it to the user in a user interface. At block 406, the routine receives a keyword selection from the user. At decision block 408, the routine determines whether the selected keyword needs parameters. If that is the case, the routine receives parameters at block 410. In various implementations, the improved software testing system can generate parameters automatically or may prompt a user for the parameters. After receiving the parameters, the routine continues at block 412. If the selected keyword needs no parameters, the routine also continues at block 412. At decision block 414, the routine determines whether there are any more steps. The routine may make this determination by asking the user whether any more steps need to be added to the test script. Alternatively, when the routine determines that the software's state is finished, the routine may determine that there are no more steps. If there are more steps, the routine continues at block 404. Otherwise, the routine continues at block 416, where it returns.
  • FIG. 5 is a flow diagram illustrating a develop_random_test_script routine invoked by the improved software testing system in some implementations. The routine 500 is invoked to develop a test script automatically. The routine begins at block 502. At block 504, the routine can receive persona information. Alternatively, the routine may derive this information from the glossary. The routine employs this information to determine which keywords should be added to the test script. As an example, keywords with a greater weight may be added to the test script with higher frequency than other keywords with a lower weight. At block 506, the routine selects a keyword based on the software's current state and the persona information. At block 508, the routine stores steps relating to the selected keyword in the test script. As an example, the routine may identify test components associated with the selected keyword. At decision block 510, the routine determines whether more steps are needed. The routine may make this determination randomly or may determine that additional steps are required based on the state of the software after performing the previous step. If more steps are required, the routine continues at block 506. Otherwise, the routine continues at block 512, where it returns.
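  • The following sketch illustrates one such selection step, assuming keyword-level persona weights; the model, weights, and state names are invented for illustration.

```python
import random

# Hypothetical model and persona weights: pick the next keyword among those
# valid in the current state, biased toward heavier-weighted keywords.
MODEL = {"not logged in": {"login": "logged in", "exit": "finished"}}
PERSONA_WEIGHTS = {"login": 0.9, "exit": 0.1}   # favor exercising login

def next_keyword(state, model, weights, rng=random):
    """Select a keyword valid in `state`, weighted by the persona."""
    candidates = list(model.get(state, {}))
    if not candidates:
        return None
    w = [weights.get(k, 1.0) for k in candidates]
    return rng.choices(candidates, weights=w, k=1)[0]

if __name__ == "__main__":
    picks = [next_keyword("not logged in", MODEL, PERSONA_WEIGHTS)
             for _ in range(100)]
    print({k: picks.count(k) for k in set(picks)})  # 'login' dominates
```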
  • FIG. 6 is a flow diagram illustrating an execute_test_script routine invoked by the improved software testing system in some implementations. The improved software testing system can invoke the routine 600 to execute a test script. The routine begins at block 602. At block 604, the routine loads the test script. The test script can be identified by a software tester or can be identified randomly, such as based on a selected persona. At block 606, the routine performs steps indicated in the test script. As an example, the routine may load test components that are indicated by keywords identified in the test script. At block 608, the routine may report errors. As an example, if the test script could not execute or caused errors in the software, the routine may log such errors. Later, when a software developer wants to determine why the software could not execute, the software developer can review the log. The log may contain a list of keywords or steps that were executed. The routine returns at block 610.
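  • A minimal sketch of the execute step, with invented test components and Python's standard logging module standing in for the log file described above:

```python
import logging

logging.basicConfig(filename="test_run.log", level=logging.INFO,
                    format="%(asctime)s %(message)s")

# Hypothetical components; a real one would drive the tested application.
def start_component():
    pass  # would launch the tested software

def login_component():
    raise RuntimeError("simulated defect: bad credentials")

TEST_COMPONENTS = {"start": start_component, "login": login_component}

def execute_test_script(script):
    """Run each keyword's component, logging steps, results, and errors."""
    for keyword in script:
        try:
            TEST_COMPONENTS[keyword]()               # perform the step
            logging.info("step %-8s PASSED", keyword)
        except Exception as exc:                      # report and log the error
            logging.error("step %-8s FAILED: %s", keyword, exc)

if __name__ == "__main__":
    execute_test_script(["start", "login"])  # test_run.log records both steps
```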
  • CONCLUSION
  • Unless the context clearly requires otherwise, throughout the description and the claims, the words “comprise,” “comprising,” and the like are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense, that is to say, in the sense of “including, but not limited to.” Additionally, the words “herein,” “above,” “below,” and words of similar import, when used in this application, shall refer to this application as a whole and not to any particular portions of this application. When the claims use the word “or” in reference to a list of two or more items, that word covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list.
  • The above detailed description of embodiments of the improved software testing system is not intended to be exhaustive or to limit the improved software testing system to the precise form disclosed above. While specific embodiments of, and examples for, the improved software testing system are described above for illustrative purposes, various equivalent modifications are possible within the scope of the improved software testing system, as those skilled in the relevant art will recognize. For example, while processes or blocks are presented in a given order, alternative embodiments may perform routines having steps, or employ systems having blocks, in a different order, and some processes or blocks may be deleted, moved, added, subdivided, combined, and/or modified. Each of these processes or blocks may be implemented in a variety of different ways. Also, while processes or blocks are at times shown as being performed in series, these processes or blocks may instead be performed in parallel, or may be performed at different times. Where the context permits, words in the above Detailed Description using the singular or plural number may also include the plural or singular number, respectively.
  • The teachings of the improved software testing system provided herein can be applied to other systems, not necessarily the system described herein. The elements and acts of the various embodiments described above can be combined to provide further embodiments.
  • These and other changes can be made to the improved software testing system in light of the above Detailed Description. While the above description details certain embodiments of the improved software testing system and describes the best mode contemplated, no matter how detailed the above appears in text, the improved software testing system can be practiced in many ways. As noted above, particular terminology used when describing certain features or aspects of the improved software testing system should not be taken to imply that the terminology is being redefined herein to be restricted to any specific characteristics, features, or aspects of the improved software testing system with which that terminology is associated. In general, the terms used in the following claims should not be construed to limit the improved software testing system to the specific embodiments disclosed in the specification, unless the above Detailed Description section explicitly defines such terms. Accordingly, the actual scope of the improved software testing system encompasses not only the disclosed embodiments, but also all equivalent ways of practicing or implementing the improved software testing system under the claims.
  • While certain aspects of the improved software testing system are presented below in certain claim forms, the inventors contemplate the various aspects of the improved software testing system in any number of claim forms. For example, while only one aspect of the improved software testing system is recited as embodied in a computer-readable medium, other aspects may likewise be embodied in a computer-readable medium. Accordingly, the inventors reserve the right to add additional claims after filing the application to pursue such additional claim forms for other aspects of the improved software testing system.

Claims (20)

1. A system, comprising:
a test system having hardware and software that test a software application;
wherein the software of the test system comprises instructions translatable by the hardware of the test system to cause the test system to perform:
receiving program state information associated with the software application;
producing a glossary containing keywords from the program state information associated with the software application;
producing a state engine model for the software application utilizing the program state information associated with the software application;
generating one or more test scripts by causing the software application to move through states identified by the state engine model, the one or more test scripts identifying a sequence of steps or keywords from the glossary; and
storing the one or more test scripts in a data storage device.
2. The system of claim 1, wherein the instructions are further translatable by the hardware of the test system to cause the test system to perform:
receiving persona information over a network connection, the persona information identifying at least one of the one or more test scripts for execution and associated weightings.
3. The system of claim 1, wherein the instructions are further translatable by the hardware of the test system to cause the test system to perform:
determining persona information from the glossary containing the keywords associated with the software application.
4. The system of claim 3, wherein the persona information further identifies a type of testing.
5. The system of claim 3, wherein the instructions are further translatable by the hardware of the test system to cause the test system to perform:
selecting a keyword from the glossary based on a current state of the software application and the persona information.
6. The system of claim 5, wherein the instructions are further translatable by the hardware of the test system to cause the test system to perform:
storing steps relating to the selected keyword in a test script.
7. A method, comprising:
receiving, by a test system running on one or more computing devices and having hardware and software that test a software application, program state information associated with the software application;
producing, by the test system, a glossary containing keywords from the program state information associated with the software application;
producing, by the test system, a state engine model for the software application utilizing the program state information associated with the software application;
generating one or more test scripts by causing the software application to move through states identified by the state engine model, the one or more test scripts identifying a sequence of steps or keywords from the glossary; and
storing, by the test system, the one or more test scripts in a data storage device.
8. The method of claim 7, further comprising:
receiving, by the test system, persona information over a network connection, the persona information identifying at least one of the one or more test scripts for execution and associated weightings.
9. The method of claim 7, further comprising:
determining, by the test system, persona information from the glossary containing the keywords associated with the software application.
10. The method of claim 9, wherein the persona information further identifies a type of testing.
11. The method of claim 9, further comprising:
selecting, by the test system, a keyword from the glossary based on a current state of the software application and the persona information.
12. The method of claim 11, further comprising:
storing, by the test system, steps relating to the selected keyword in a test script.
13. The method of claim 7, further comprising:
identifying a test script from the one or more test scripts;
executing the test script to test the software application; and
reporting any errors in the software application or the executing step.
14. A computer program product comprising at least one non-transitory computer readable medium storing instructions translatable by at least one processor to cause a test system to perform:
receiving program state information associated with a software application;
producing a glossary containing keywords from the program state information associated with the software application;
producing a state engine model for the software application utilizing the program state information associated with the software application;
generating one or more test scripts by causing the software application to move through states identified by the state engine model, the one or more test scripts identifying a sequence of steps or keywords from the glossary; and
storing the one or more test scripts in a data storage device.
15. The computer program product of claim 14, where the instructions are further translatable by the at least one processor to cause the test system to perform:
receiving persona information over a network connection, the persona information identifying at least one of the one or more test scripts for execution and associated weightings.
16. The computer program product of claim 14, where the instructions are further translatable by the at least one processor to cause the test system to perform:
determining persona information from the glossary containing the keywords associated with the software application.
17. The computer program product of claim 16, wherein the persona information further identifies a type of testing.
18. The computer program product of claim 16, where the instructions are further translatable by the at least one processor to cause the test system to perform:
selecting a keyword from the glossary based on a current state of the software application and the persona information.
19. The computer program product of claim 18, where the instructions are further translatable by the at least one processor to cause the test system to perform:
storing steps relating to the selected keyword in a test script.
20. The computer program product of claim 14, where the instructions are further translatable by the at least one processor to cause the test system to perform:
identifying a test script from the one or more test scripts;
executing the test script to test the software application; and
reporting any errors in the software application or the executing step.
US13/959,488 2006-05-16 2013-08-05 Keyword based software testing system and method Abandoned US20130326486A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/959,488 US20130326486A1 (en) 2006-05-16 2013-08-05 Keyword based software testing system and method

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US80086606P 2006-05-16 2006-05-16
US11/749,609 US8522214B2 (en) 2006-05-16 2007-05-16 Keyword based software testing system and method
US13/959,488 US20130326486A1 (en) 2006-05-16 2013-08-05 Keyword based software testing system and method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/749,609 Continuation US8522214B2 (en) 2006-05-16 2007-05-16 Keyword based software testing system and method

Publications (1)

Publication Number Publication Date
US20130326486A1 true US20130326486A1 (en) 2013-12-05

Family

ID=38723993

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/749,609 Expired - Fee Related US8522214B2 (en) 2006-05-16 2007-05-16 Keyword based software testing system and method
US13/959,488 Abandoned US20130326486A1 (en) 2006-05-16 2013-08-05 Keyword based software testing system and method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/749,609 Expired - Fee Related US8522214B2 (en) 2006-05-16 2007-05-16 Keyword based software testing system and method

Country Status (2)

Country Link
US (2) US8522214B2 (en)
WO (1) WO2007137082A2 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017079649A3 (en) * 2015-11-06 2017-09-08 Alibaba Group Holding Limited Method, system, and device for item search
CN109669868A (en) * 2018-12-17 2019-04-23 南昌弘为企业管理有限公司 The method and system of software test
TWI904063B (en) 2023-09-21 2025-11-01 韓商韓領有限公司 Method for keyword-based testing and system therefor

Families Citing this family (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8522214B2 (en) 2006-05-16 2013-08-27 Open Text S.A. Keyword based software testing system and method
WO2008045117A1 (en) * 2006-10-06 2008-04-17 Nielsen Media Research, Inc. Methods and apparatus to analyze computer software
US8187100B1 (en) * 2007-03-02 2012-05-29 Dp Technologies, Inc. Shared execution of hybrid states
US7930251B2 (en) * 2007-08-09 2011-04-19 Sap Ag Model driven state management of applications
US8171459B2 (en) * 2007-11-16 2012-05-01 Oracle International Corporation System and method for software performance testing and determining a frustration index
US8988439B1 (en) 2008-06-06 2015-03-24 Dp Technologies, Inc. Motion-based display effects in a handheld device
US8678925B1 (en) 2008-06-11 2014-03-25 Dp Technologies, Inc. Method and apparatus to provide a dice application
US8997221B2 (en) * 2008-10-10 2015-03-31 Safend Ltd. System and method for validating and controlling applications
US8587601B1 (en) 2009-01-05 2013-11-19 Dp Technologies, Inc. Sharing of three dimensional objects
US8843893B2 (en) * 2010-04-29 2014-09-23 Sap Ag Unified framework for configuration validation
US20130263090A1 (en) * 2012-03-30 2013-10-03 Sony Online Entertainment Llc System and method for automated testing
US9026853B2 (en) * 2012-07-31 2015-05-05 Hewlett-Packard Development Company, L.P. Enhancing test scripts
KR20140053542A (en) * 2012-10-26 2014-05-08 삼성전자주식회사 Automatic testing apparatus for embedded software, automatic testing method thereof and test scenario composing method
KR20140056478A (en) * 2012-10-26 2014-05-12 삼성전자주식회사 Automatic testing apparatus for embedded software, automatic testing method thereof and test scenario composing method
US8930897B2 (en) * 2013-03-15 2015-01-06 Palantir Technologies Inc. Data integration tool
US8997052B2 (en) 2013-06-19 2015-03-31 Successfactors, Inc. Risk-based test plan construction
US9311215B2 (en) * 2014-02-12 2016-04-12 International Business Machines Corporation Defining multi-channel tests system and method
US9880915B2 (en) 2014-03-05 2018-01-30 Microsoft Technology Licensing, Llc N-gram analysis of inputs to a software application
US9594665B2 (en) 2014-03-05 2017-03-14 Microsoft Technology Licensing, Llc Regression evaluation using behavior models of software applications
US9355016B2 (en) 2014-03-05 2016-05-31 Microsoft Technology Licensing, Llc Automated regression testing for software applications
WO2015132637A1 (en) * 2014-03-05 2015-09-11 Concurix Corporation N-gram analysis of software behavior in production and testing environments
US9329980B2 (en) 2014-03-05 2016-05-03 Microsoft Technology Licensing, Llc Security alerting using n-gram analysis of program execution data
CN104965790B (en) * 2015-07-17 2018-04-27 小米科技有限责任公司 Method for testing software and system based on crucial word drive
CN105068927A (en) * 2015-08-04 2015-11-18 株洲南车时代电气股份有限公司 Keyword drive-based automatic test method of urban rail drive control units
US10223245B1 (en) 2016-05-27 2019-03-05 Amdocs Development Limited System, method, and computer program for identifying tests to automate in a software testing project
US10122866B2 (en) 2016-09-02 2018-11-06 Ricoh Company, Ltd. Automated test suite mechanism
CN108334441A (en) * 2017-01-19 2018-07-27 深圳市优朋普乐传媒发展有限公司 A kind of automated testing method and system of Software Development Kit
CN107665115B (en) * 2017-10-31 2021-02-19 胡学锋 Software development platform and method
US10367650B2 (en) * 2017-11-06 2019-07-30 Cognizant Technology Solutions India Pvt. Ltd. System and method for efficiently developing and testing home automation systems
EP3608785A1 (en) * 2018-08-09 2020-02-12 PiRo Systems Engineering GmbH Keyword-driven requirements
US10878804B2 (en) * 2018-10-10 2020-12-29 International Business Machines Corporation Voice controlled keyword generation for automated test framework
US11138097B2 (en) * 2019-09-24 2021-10-05 Aetna Inc. Automated web testing framework for generating and maintaining test scripts
CN116303492A (en) * 2022-12-08 2023-06-23 中汽创智科技有限公司 A data management method, device, equipment and storage medium

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5754760A (en) * 1996-05-30 1998-05-19 Integrity Qa Software, Inc. Automatic software testing tool
US5918037A (en) * 1996-06-05 1999-06-29 Teradyne, Inc. Generating tests for an extended finite state machine using different coverage levels for different submodels
US6694290B1 (en) * 1999-05-25 2004-02-17 Empirix Inc. Analyzing an extended finite state machine system model
US20020032538A1 (en) * 2000-05-09 2002-03-14 Lee Young-Seok Software test system and method
US20020091968A1 (en) * 2001-01-08 2002-07-11 Donald Moreaux Object-oriented data driven software GUI automated test harness
US6944848B2 (en) * 2001-05-03 2005-09-13 International Business Machines Corporation Technique using persistent foci for finite state machine based software test generation
US6978275B2 (en) * 2001-08-31 2005-12-20 Hewlett-Packard Development Company, L.P. Method and system for mining a document containing dirty text
US7117484B2 (en) * 2002-04-16 2006-10-03 International Business Machines Corporation Recursive use of model based test generation for middleware validation
US20040025083A1 (en) * 2002-07-31 2004-02-05 Murthi Nanja Generating test code for software
US20040103396A1 (en) * 2002-11-20 2004-05-27 Certagon Ltd. System for verification of enterprise software systems
US7124402B2 (en) 2002-12-30 2006-10-17 International Business Machines Corporation Testing software module responsiveness to string input tokens having lengths which span a range of integral values
US8281286B2 (en) * 2006-03-31 2012-10-02 Cisco Technology, Inc. Methods and systems for automated testing of applications using an application independent GUI map
US8522214B2 (en) 2006-05-16 2013-08-27 Open Text S.A. Keyword based software testing system and method

Also Published As

Publication number Publication date
US20080010539A1 (en) 2008-01-10
WO2007137082A3 (en) 2008-10-02
US8522214B2 (en) 2013-08-27
WO2007137082A2 (en) 2007-11-29

Similar Documents

Publication Publication Date Title
US8522214B2 (en) Keyword based software testing system and method
US9465718B2 (en) Filter generation for load testing managed environments
RU2390826C2 (en) Automatic verification of test cases implicitily related to automatic execution of test cases
US9940225B2 (en) Automated error checking system for a software application and method therefor
US6067639A (en) Method for integrating automated software testing with software development
Zeiss et al. Applying the ISO 9126 quality model to test specifications
US20060265475A9 (en) Testing web services as components
CN111124919A (en) User interface testing method, device, equipment and storage medium
US9342439B2 (en) Command coverage analyzer
US20020116153A1 (en) Test automation framework
US20080052690A1 (en) Testing software with a build engine
WO2017172667A1 (en) Privilege test and monitoring
US7895575B2 (en) Apparatus and method for generating test driver
US20040148590A1 (en) Hierarchical test suite
CN113094281B (en) Test method and device for hybrid App
Santos-Neto et al. Requirements for information systems model-based testing
Micskei et al. Robustness testing techniques for high availability middleware solutions
CN109669868A (en) The method and system of software test
Liu et al. Test cases selection method of rapid regression testing
CN116932414B (en) Method and equipment for generating interface test case and computer readable storage medium
US11914503B2 (en) Automated performance measurement over software lifecycle
Silva et al. Analyzing structure-based techniques for test coverage on a J2ME software product line
Isosaari Smoke Testing Display Viewer 5
Farchi et al. Random Test Generation of Application Programming Interfaces
KR20070021879A (en) Test Driver Generation Device and Method

Legal Events

Date Code Title Description
AS Assignment

Owner name: CAPTARIS, INC., WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ROTH, RICK R.;REEL/FRAME:031087/0250

Effective date: 20070725

AS Assignment

Owner name: OPEN TEXT INC., WASHINGTON

Free format text: MERGER;ASSIGNOR:CAPTARIS, INC.;REEL/FRAME:031089/0971

Effective date: 20090625

AS Assignment

Owner name: OPEN TEXT S. A., LUXEMBOURG

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:OPEN TEXT INC.;REEL/FRAME:031265/0293

Effective date: 20110725

STCB Information on status: application discontinuation

Free format text: EXPRESSLY ABANDONED -- DURING EXAMINATION