Using load tests to automatically compare the subsystems of a large enterprise system

Malik et al., 2010

Document ID: 14019001700461144048
Authors: Malik H, Adams B, Hassan A, Flora P, Hamann G
Publication year: 2010
Publication venue: 2010 IEEE 34th Annual Computer Software and Applications Conference

Snippet

Enterprise systems are load tested for every added feature, software update, and periodic maintenance cycle to ensure that demands on system quality, availability, and responsiveness are met. In current practice, performance analysts manually analyze load …
Full text available at www.academia.edu (PDF).

Classifications

All classifications are CPC codes under section G (Physics), class G06 (Computing; Calculating; Counting):

    • G06F 11/3495: Performance evaluation by tracing or monitoring, for systems
    • G06F 11/3414: Workload generation, e.g. scripts, playback
    • G06F 11/3476: Data logging
    • G06F 11/3447: Performance evaluation by modeling
    • G06F 11/3672: Test management
    • G06F 11/3612: Software analysis for verifying properties of programs by runtime analysis
    • G06Q 10/0639: Performance analysis
    • G06F 2201/875: Monitoring of systems including the internet
    • G06F 2201/86: Event-based monitoring
    • G06F 11/22: Detection or location of defective computer hardware by testing during standby operation or during idle time, e.g. start-up testing
    • G06F 11/0703: Error or fault processing not based on redundancy, i.e. by taking additional measures to deal with the error or fault not making use of redundancy in operation, in hardware, or in data representation
    • G06F 17/5009: Computer-aided design using simulation
    • G06F 17/30: Information retrieval; database structures therefor; file system structures therefor

Similar Documents

Nguyen et al. Automated detection of performance regressions using statistical process control techniques
Malik et al. Automatic detection of performance deviations in the load testing of large scale systems
Jiang et al. A survey on load testing of large-scale software systems
Bird et al. Don't touch my code! Examining the effects of ownership on software quality
Jia et al. An approach for anomaly diagnosis based on hybrid graph model with logs for distributed services
Meneely et al. Predicting failures with developer networks and social network analysis
Jiang et al. Automated performance analysis of load tests
Chen et al. CauseInfer: Automated end-to-end performance diagnosis with hierarchical causality graph in cloud environment
Bird et al. Putting it all together: Using socio-technical networks to predict failures
US9009680B2: Selecting instrumentation points for an application
US8661125B2: System comprising probe runner, monitor, and responder with associated databases for multi-level monitoring of a cloud service
Peiris et al. Pad: Performance anomaly detection in multi-server distributed systems
US20080148242A1: Optimizing an interaction model for an application
Avritzer et al. The role of modeling in the performance testing of e-commerce applications
Zeng et al. Traceark: Towards actionable performance anomaly alerting for online service systems
Malik et al. Pinpointing the subsystems responsible for the performance deviations in a load test
Ghaith et al. Anomaly detection in performance regression testing by transaction profile estimation
Malik et al. Automatic comparison of load tests to support the performance analysis of large enterprise systems
Cito et al. Interactive production performance feedback in the IDE
Cito et al. Identifying root causes of web performance degradation using changepoint analysis
Camacho et al. Chaos as a Software Product Line—a platform for improving open hybrid-cloud systems resiliency
Ayala-Rivera et al. One size does not fit all: In-test workload adaptation for performance testing of enterprise applications
Gao et al. An exploratory study on assessing the impact of environment variations on the results of load tests
Malik et al. Using load tests to automatically compare the subsystems of a large enterprise system
Kubacki et al. Exploring operational profiles and anomalies in computer performance logs