Hennig et al., 2003 - Google Patents
From UML to performance measures–simulative performance predictions of IT-systems using the JBoss application server with OMNeT++ (Hennig et al., 2003)
- Document ID: 6733466202484121221
- Authors: Hennig A, Revill D, Ponitsch M
- Publication year: 2003
- Publication venue: Al-Dabass [1]
Snippet
In this paper, we argue the case for thorough performance engineering already in the early development phases of complex IT-systems, particularly web-based ones, using the example of the open-source application server JBoss. We show the need for a fast and efficient …
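The snippet points at simulation-based performance prediction of a server system derived from UML models. As a rough illustration only, and not the authors' actual OMNeT++/JBoss model, the following C++ sketch shows the kind of discrete-event queueing simulation such predictions rest on: a single-server FIFO queue with assumed (made-up) arrival and service rates, reporting the mean request response time.

```cpp
// Minimal discrete-event simulation of a single-server FIFO queue (M/M/1).
// Illustrative only: the rates below are assumed values, not taken from the paper.
#include <algorithm>
#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng(42);
    const double lambda = 80.0;   // assumed request arrival rate [1/s]
    const double mu     = 100.0;  // assumed service rate [1/s]
    std::exponential_distribution<double> interarrival(lambda);
    std::exponential_distribution<double> service(mu);

    const int numRequests = 200000;
    double clock = 0.0;           // current simulated time
    double serverFreeAt = 0.0;    // time at which the server becomes idle
    double totalResponse = 0.0;   // accumulated response times

    for (int i = 0; i < numRequests; ++i) {
        clock += interarrival(rng);                    // next request arrives
        double start  = std::max(clock, serverFreeAt); // waits if the server is busy
        double finish = start + service(rng);          // service completes
        serverFreeAt  = finish;
        totalResponse += finish - clock;               // response = waiting + service
    }

    std::printf("mean response time: %.4f s\n", totalResponse / numRequests);
    return 0;
}
```

With these assumed rates the analytic M/M/1 mean response time is 1/(mu - lambda) = 0.05 s, so the simulated estimate should land close to that; the paper's argument is that comparable predictions can be obtained for a full JBoss-based architecture generated from UML models and executed in OMNeT++.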
Concepts
Name | Sections | Count
---|---|---
simulation | abstract, description | 26
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
        - G06F17/50—Computer-aided design
          - G06F17/5009—Computer-aided design using simulation
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F11/00—Error detection; Error correction; Monitoring
        - G06F11/36—Preventing errors by testing or debugging software
          - G06F11/3668—Software testing
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F9/00—Arrangements for programme control, e.g. control unit
        - G06F9/06—Arrangements for programme control, e.g. control unit using stored programme, i.e. using internal store of processing equipment to receive and retain programme
          - G06F9/46—Multiprogramming arrangements
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F11/00—Error detection; Error correction; Monitoring
        - G06F11/30—Monitoring
          - G06F11/34—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
            - G06F11/3409—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F11/00—Error detection; Error correction; Monitoring
        - G06F11/30—Monitoring
          - G06F11/34—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
            - G06F11/3466—Performance evaluation by tracing or monitoring
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
        - G06F17/30—Information retrieval; Database structures therefor; File system structures therefor
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F8/00—Arrangements for software engineering
        - G06F8/70—Software maintenance or management
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F8/00—Arrangements for software engineering
        - G06F8/60—Software deployment
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F8/00—Arrangements for software engineering
        - G06F8/40—Transformations of program code
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F8/00—Arrangements for software engineering
        - G06F8/20—Software design
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06Q—DATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR
      - G06Q10/00—Administration; Management
        - G06Q10/06—Resources, workflows, human or project management, e.g. organising, planning, scheduling or allocating time, human or machine resources; Enterprise planning; Organisational models
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F2201/00—Indexing scheme relating to error detection, to error correction, and to monitoring
- H—ELECTRICITY
  - H04—ELECTRIC COMMUNICATION TECHNIQUE
    - H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
      - H04L41/00—Arrangements for maintenance or administration or management of packet switching networks
        - H04L41/14—Arrangements for maintenance or administration or management of packet switching networks involving network analysis or design, e.g. simulation, network model or planning
          - H04L41/145—Arrangements for maintenance or administration or management of packet switching networks involving network analysis or design, e.g. simulation, network model or planning involving simulating, designing, planning or modelling of a network
Similar Documents
Publication | Title
---|---
Bertolino et al. | CB-SPE Tool: Putting component-based performance engineering into practice
Cortellessa et al. | Early reliability assessment of UML based software models
Bell et al. | Optorsim: A grid simulator for studying dynamic data replication strategies
Balsamo et al. | Model-based performance prediction in software development: A survey
Koziolek et al. | A model transformation from the Palladio component model to layered queueing networks
White et al. | Improving domain-specific language reuse with software product line techniques
De Farias et al. | COMFIT: A development environment for the Internet of Things
Woodside et al. | A wideband approach to integrating performance prediction into a software design environment
Castellanos et al. | A model-driven architectural design method for big data analytics applications
Koziolek et al. | Predicting the performance of component-based software architectures with different usage profiles
Mos et al. | Performance management in component-oriented systems using a Model Driven Architecture™ approach
Becker et al. | Model-driven generation of performance prototypes
Brown et al. | An approach to benchmarking configuration complexity
Wirsing et al. | Sensoria patterns: Augmenting service engineering with formal analysis, transformation and dynamicity
JP2004118842A (en) | How to provide enhanced dynamic system simulation capabilities outside the original modeling environment
Hennig et al. | From UML to performance measures–simulative performance predictions of IT-systems using the JBoss application server with OMNeT++
Xu et al. | Modeling the Execution Architecture of a Mobile Phone Software System by Colored Petri Nets
Bondarev et al. | CARAT: a toolkit for design and performance analysis of component-based embedded systems
Balsamo et al. | Software performance: state of the art and perspectives
Kirschner | Model-driven reverse engineering of technology-induced architecture for quality prediction
Mancini et al. | A simulation-based framework for autonomic web services
Koziolek et al. | Evaluating performance of software architecture models with the Palladio component model
Mancini et al. | Performance-driven development of a web services application using MetaPL/HeSSE
Hennig et al. | Performance prototyping–generating and simulating a distributed IT-system from UML models
Xu et al. | Modeling execution architecture of software system using colored Petri nets