Malik et al., 2010 - Google Patents
Using load tests to automatically compare the subsystems of a large enterprise system
- Document ID: 14019001700461144048
- Authors: Malik H, Adams B, Hassan A, Flora P, Hamann G
- Publication year: 2010
- Publication venue: 2010 IEEE 34th Annual Computer Software and Applications Conference
Snippet
Enterprise systems are load tested for every added feature, software update, and periodic maintenance to ensure that the performance demands on system quality, availability and responsiveness are met. In current practice, performance analysts manually analyze load …
Classifications
- G06F11/3495—Performance evaluation by tracing or monitoring for systems
- G06F11/3414—Workload generation, e.g. scripts, playback
- G06F11/3476—Data logging
- G06F11/3447—Performance evaluation by modeling
- G06F11/3672—Test management
- G06F11/3612—Software analysis for verifying properties of programs by runtime analysis
- G06Q10/0639—Performance analysis
- G06F2201/875—Monitoring of systems including the internet
- G06F2201/86—Event-based monitoring
- G06F11/22—Detection or location of defective computer hardware by testing during standby operation or during idle time, e.g. start-up testing
- G06F11/0703—Error or fault processing not based on redundancy, i.e. by taking additional measures to deal with the error or fault not making use of redundancy in operation, in hardware, or in data representation
- G06F17/5009—Computer-aided design using simulation
- G06F17/30—Information retrieval; Database structures therefor; File system structures therefor
Similar Documents
Publication | Title
---|---
Nguyen et al. | Automated detection of performance regressions using statistical process control techniques
Malik et al. | Automatic detection of performance deviations in the load testing of large scale systems
Jiang et al. | A survey on load testing of large-scale software systems
Bird et al. | Don't touch my code! Examining the effects of ownership on software quality
Jia et al. | An approach for anomaly diagnosis based on hybrid graph model with logs for distributed services
Meneely et al. | Predicting failures with developer networks and social network analysis
Jiang et al. | Automated performance analysis of load tests
Chen et al. | CauseInfer: Automated end-to-end performance diagnosis with hierarchical causality graph in cloud environment
Bird et al. | Putting it all together: Using socio-technical networks to predict failures
US9009680B2 (en) | Selecting instrumentation points for an application
US8661125B2 (en) | System comprising probe runner, monitor, and responder with associated databases for multi-level monitoring of a cloud service
Peiris et al. | Pad: Performance anomaly detection in multi-server distributed systems
US20080148242A1 (en) | Optimizing an interaction model for an application
Avritzer et al. | The role of modeling in the performance testing of e-commerce applications
Zeng et al. | Traceark: Towards actionable performance anomaly alerting for online service systems
Malik et al. | Pinpointing the subsystems responsible for the performance deviations in a load test
Ghaith et al. | Anomaly detection in performance regression testing by transaction profile estimation
Malik et al. | Automatic comparison of load tests to support the performance analysis of large enterprise systems
Cito et al. | Interactive production performance feedback in the IDE
Cito et al. | Identifying root causes of web performance degradation using changepoint analysis
Camacho et al. | Chaos as a Software Product Line—a platform for improving open hybrid-cloud systems resiliency
Ayala-Rivera et al. | One size does not fit all: In-test workload adaptation for performance testing of enterprise applications
Gao et al. | An exploratory study on assessing the impact of environment variations on the results of load tests
Malik et al. | Using load tests to automatically compare the subsystems of a large enterprise system
Kubacki et al. | Exploring operational profiles and anomalies in computer performance logs