Automated WAIT for Cloud-Based Application Testing
Files in This Item:
Portillo_2014_automated(PHOEBE1).pdf (750.16 kB, Adobe PDF)
Title: Automated WAIT for Cloud-Based Application Testing
Authors: Portillo Dominguez, Andres Omar; Wang, Miao; Murphy, John; Magoni, Damien
Permanent link: http://hdl.handle.net/10197/9234
Date: 4-Apr-2014
Online since: 2018-02-16T19:11:00Z
Abstract: Cloud computing is causing a paradigm shift in the provision and use of software. It has changed how computing services and solutions are obtained, managed and delivered, and it has brought new challenges to software testing. A particular area of concern is the performance of cloud-based applications, because their increased complexity has exposed new potential failure points, complicating all performance-related activities. This makes the performance testing of cloud environments very challenging. Likewise, identifying performance issues and diagnosing their root causes are time-consuming and complex tasks that usually require multiple tools and rely heavily on expertise. To simplify these tasks, thereby increasing productivity and reducing the dependency on human experts, this paper presents a lightweight approach to automate the usage of expert tools in the performance testing of cloud-based applications. We use a tool named Whole-system Analysis of Idle Time (WAIT) to demonstrate how our research work solves this problem. The validation involved two experiments, which assessed the overhead of the approach and the time savings it can bring to the analysis of performance issues. The results demonstrated the benefits of the approach, achieving a significant decrease in the time invested in performance analysis while introducing a low overhead in the tested system.
Funding Details: Science Foundation Ireland
Type of material: Conference Publication
Publisher: IEEE
Copyright (published version): 2014 IEEE
Keywords: Cloud computing; Performance testing; Automation; Performance analysis; Expert tools; Testing; Expert systems; Silicon; High definition video; Productivity; Monitoring
DOI: 10.1109/ICSTW.2014.46
Language: en
Status of Item: Peer reviewed
Conference Details: 2014 IEEE Seventh International Conference on Software Testing, Verification and Validation Workshops (ICSTW), Ohio, United States of America, 31 March - 4 April 2014
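The abstract describes automating an idle-time analysis tool over periodic JVM snapshots. As a purely illustrative sketch (not the authors' implementation, and not IBM WAIT's actual input format), the kind of aggregation such a tool performs can be shown by tallying thread states from a simplified Java thread dump:

```python
import re
from collections import Counter

def summarize_thread_states(dump: str) -> Counter:
    """Tally thread states from a simplified Java thread dump.

    WAIT-style analysis aggregates such snapshots over time to find
    where an application spends its idle time. The dump format below
    is a hypothetical simplification for illustration only.
    """
    states = re.findall(r'java\.lang\.Thread\.State: (\w+)', dump)
    return Counter(states)

# Hypothetical sample snapshot with three threads in different states.
sample_dump = """\
"http-worker-1" #12 prio=5
   java.lang.Thread.State: RUNNABLE
"http-worker-2" #13 prio=5
   java.lang.Thread.State: WAITING
"db-pool-1" #14 prio=5
   java.lang.Thread.State: BLOCKED
"""

print(summarize_thread_states(sample_dump))
```

An automated harness in the spirit of the paper would collect many such snapshots at intervals during a load test and aggregate the counts, flagging threads that spend most of their time in waiting or blocked states.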
Appears in Collections: Computer Science Research Collection; PEL Research Collection
This item is available under the Attribution-NonCommercial-NoDerivs 3.0 Ireland licence. No item may be reproduced for commercial purposes. For other possible restrictions on use, please refer to the publisher's URL where this is made available, or to notes contained in the item itself. Other terms may apply.