A blackboard-based decision support framework for testing client/server applications
Chu, 2012 - Google Patents
- Document ID: 960315181578005413
- Author: Chu H
- Publication year: 2012
- Publication venue: 2012 Third World Congress on Software Engineering
Snippet
To address the problem of a test environment that spans multiple platforms, this paper proposes a decision support framework built on the blackboard model, integrating complementary features into a single automated test environment for multi-platform …
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F11/00—Error detection; Error correction; Monitoring
        - G06F11/30—Monitoring
          - G06F11/34—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
            - G06F11/3409—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment
              - G06F11/3414—Workload generation, e.g. scripts, playback
            - G06F11/3466—Performance evaluation by tracing or monitoring
              - G06F11/3476—Data logging
              - G06F11/3495—Performance evaluation by tracing or monitoring for systems
        - G06F11/36—Preventing errors by testing or debugging software
          - G06F11/362—Software debugging
          - G06F11/3664—Environments for testing or debugging software
          - G06F11/3668—Software testing
            - G06F11/3672—Test management
              - G06F11/3688—Test management for test execution, e.g. scheduling of test suites
            - G06F11/3696—Methods or tools to render software testable
      - G06F2201/00—Indexing scheme relating to error detection, to error correction, and to monitoring
        - G06F2201/875—Monitoring of systems including the internet
      - G06F9/00—Arrangements for programme control, e.g. control unit
        - G06F9/06—Arrangements for programme control, e.g. control unit using stored programme, i.e. using internal store of processing equipment to receive and retain programme
          - G06F9/46—Multiprogramming arrangements
Similar Documents
Publication | Title
---|---
US20210209007A1 (en) | Methods for improved web application testing using remote headless browsers and devices thereof
US6477483B1 (en) | Service for load testing a transactional server over the internet
Leung | Quality metrics for intranet applications
Nguyen | Testing applications on the Web: Test planning for Internet-based systems
Foster et al. | Compatibility verification for web service choreography
Chu | A blackboard-based decision support framework for testing client/server applications
US8291068B2 (en) | Automatic protocol detection
WO2001009752A2 (en) | A system, method and article of manufacture for a host framework design in an e-commerce architecture
US20070255579A1 (en) | Method and system for recording interactions of distributed users
Camacho et al. | Agile team members perceptions on non-functional testing: influencing factors from an empirical study
US9823999B2 (en) | Program lifecycle testing
Fagerström et al. | Verdict machinery: On the need to automatically make sense of test results
Di Meglio et al. | E2E-Loader: A tool to generate performance tests from end-to-end GUI-level tests
Chu | An intelligent framework for dynamic test plan of client/server applications
Rings et al. | Testing Grid application workflows using TTCN-3
Rising | System test pattern language
Goris | Robotic Process Automation: An assessment of process discovery techniques with the purpose of finding RPA eligible processes
Muhamadi et al. | Student record retrieval system using knowledge sharing
Coffey et al. | Concept mapping for the efficient generation and communication of security assurance cases
Haber | Sensemaking sysadmins: Lessons from the field
Idris | Comparison of GUI test automation strategies in a Clearing System: A case study at Nasdaq Stockholm AB
Alber et al. | The HUBzero Platform: Extensions and Impressions
Barber | Automated Testing for Embedded Devices
Vail | Stress, load, volume, performance, benchmark and base line testing tool evaluation and comparison
Höjbert et al. | How to Improve Feedback and Traceability for Performance in Software Development