Srivastava et al., 2005 - Google Patents

Efficient integration testing using dependency analysis

- Document ID: 14061340831323621890
- Authors: Srivastava A; Thiagarajan J; Schertz C
- Publication year: 2005
- Publication venue: Microsoft Research, TechReport MSR-TR-2005-94

Snippet
Although testing starts with individual programs, programs are rarely self-contained in real software environments. They depend on external subsystems, such as the language runtime and operating system libraries, for various functionalities. These subsystems are developed …
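The snippet motivates selecting integration tests based on which subsystems a program depends on. As a rough illustration only (this is a hypothetical sketch, not the report's actual algorithm; the module names, `affected_modules`, and `select_tests` are invented for the example), dependency-based test selection can be modeled as a reachability query over a module dependency graph:

```python
# Hypothetical sketch: use a module-level dependency graph to decide
# which integration tests must rerun when an external subsystem changes.
from collections import defaultdict, deque

def affected_modules(dep_graph, changed):
    """Return all modules that transitively depend on any changed module.

    dep_graph maps a module to the modules it depends on; we invert it
    into a dependents map and walk it breadth-first from the changes.
    """
    dependents = defaultdict(set)
    for mod, deps in dep_graph.items():
        for d in deps:
            dependents[d].add(mod)
    seen = set(changed)
    queue = deque(changed)
    while queue:
        m = queue.popleft()
        for dep in dependents[m]:
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return seen

def select_tests(test_coverage, impacted):
    """Keep only tests that exercise at least one impacted module."""
    return {t for t, mods in test_coverage.items() if mods & impacted}

# Example: app -> libfoo -> os_api; a change to os_api impacts everything above it.
graph = {"app": {"libfoo"}, "libfoo": {"os_api"}, "os_api": set()}
tests = {"test_app": {"app"}, "test_foo": {"libfoo"}, "test_misc": {"docs"}}
impacted = affected_modules(graph, {"os_api"})
print(sorted(select_tests(tests, impacted)))  # ['test_app', 'test_foo']
```

Under this model, a test is skipped only when none of the modules it covers can reach a changed module, which is the safety property such selection schemes aim for.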
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3668—Software testing
- G06F11/3672—Test management
- G06F11/3676—Test management for coverage analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/362—Software debugging
- G06F11/3636—Software debugging by tracing the execution of the program
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3604—Software analysis for verifying properties of programs
- G06F11/3612—Software analysis for verifying properties of programs by runtime analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/34—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
- G06F11/3466—Performance evaluation by tracing or monitoring
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/34—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
- G06F11/3409—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for programme control, e.g. control unit
- G06F9/06—Arrangements for programme control, e.g. control unit using stored programme, i.e. using internal store of processing equipment to receive and retain programme
- G06F9/46—Multiprogramming arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/70—Software maintenance or management
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/30—Information retrieval; Database structures therefor; File system structures therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F2201/00—Indexing scheme relating to error detection, to error correction, and to monitoring
- G06F2201/86—Event-based monitoring
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06Q—DATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management, e.g. organising, planning, scheduling or allocating time, human or machine resources; Enterprise planning; Organisational models
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N33/00—Investigating or analysing materials by specific methods not covered by the preceding groups
- G01N33/48—Investigating or analysing materials by specific methods not covered by the preceding groups biological material, e.g. blood, urine; Haemocytometers
- G01N33/50—Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing
Similar Documents
| Publication | Title |
|---|---|
| Kim et al. | SPLat: Lightweight dynamic analysis for reducing combinatorics in testing configurable systems |
| Srivastava et al. | Effectively prioritizing tests in development environment |
| Rothermel et al. | On test suite composition and cost-effective regression testing |
| Ernst et al. | The Daikon system for dynamic detection of likely invariants |
| Schuler et al. | Covering and uncovering equivalent mutants |
| Do et al. | An empirical study of the effect of time constraints on the cost-benefits of regression testing |
| US9052980B2 (en) | Exception based quality assessment |
| Foidl et al. | Integrating software quality models into risk-based testing |
| US20080154710A1 (en) | Minimal Effort Prediction and Minimal Tooling Benefit Assessment for Semi-Automatic Code Porting |
| Lagerström et al. | Exploring the relationship between architecture coupling and software vulnerabilities |
| Musco et al. | A large-scale study of call graph-based impact prediction using mutation testing |
| Srivastava et al. | Efficient integration testing using dependency analysis |
| Shihab et al. | Prioritizing the creation of unit tests in legacy software systems |
| Nanda et al. | Making defect-finding tools work for you |
| Huang et al. | Scaling predictive analysis of concurrent programs by removing trace redundancy |
| Alqadi et al. | Slice-based cognitive complexity metrics for defect prediction |
| Eghbali et al. | DyLin: A Dynamic Linter for Python |
| Nashaat et al. | Detecting security vulnerabilities in object-oriented PHP programs |
| Kinneer et al. | Sofya: A flexible framework for development of dynamic program analyses for Java software |
| Christakis et al. | Bounded abstract interpretation |
| Dósea et al. | How do design decisions affect the distribution of software metrics? |
| Harmon et al. | A modular worst-case execution time analysis tool for Java processors |
| Le et al. | Marple: Detecting faults in path segments using automatically generated analyses |
| Zhang et al. | Hybrid Regression Test Selection by Integrating File and Method Dependences |
| Azadmanesh et al. | Blast: Bytecode-level analysis on sliced traces |