Automated User Experience Testing through Multi-Dimensional Performance Impact Analysis
Although there are many automated software testing suites, they typically focus on unit, system, and interface testing. However, software updates, especially those introducing new security features, have the potential to significantly diminish user experience. In this paper, we propose a novel automated user experience testing methodology that learns how code changes impact the time unit and system tests take, and extrapolates user experience changes from this information. Such a tool can be integrated into existing continuous integration pipelines, and it provides software teams with immediate user experience feedback. We construct a feature set from lexical, layout, and syntactic characteristics of the code, and use abstract syntax tree-based embeddings to calculate an approximate semantic distance that is fed into a machine learning algorithm. In our experiments, we use several regression methods to estimate the time impact of software updates. Our open-source tool achieved a 3.7% mean absolute error rate with a random forest regressor.
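As a rough illustration of the pipeline described in the abstract, the sketch below derives simple lexical, layout, and syntactic features from a code change and fits a random forest regressor to predict the relative change in test runtime. This is a minimal sketch under stated assumptions: the node-count fingerprint and L1 distance stand in for the paper's actual AST-based embeddings and semantic distance, the feature set and training data are hypothetical, and only scikit-learn and the Python standard library are assumed.

```python
# Illustrative sketch only: toy features and toy data, not the paper's tool.
import ast
from collections import Counter

import numpy as np
from sklearn.ensemble import RandomForestRegressor


def ast_node_counts(source: str) -> Counter:
    """Count AST node types as a crude structural fingerprint of the code."""
    tree = ast.parse(source)
    return Counter(type(node).__name__ for node in ast.walk(tree))


def change_features(before: str, after: str) -> list:
    """Build a small feature vector (syntactic, layout, lexical) for a code change."""
    counts_before = ast_node_counts(before)
    counts_after = ast_node_counts(after)
    node_types = set(counts_before) | set(counts_after)
    # Approximate "semantic distance": L1 distance between node-count vectors.
    distance = sum(abs(counts_after[t] - counts_before[t]) for t in node_types)
    return [
        distance,
        len(after.splitlines()) - len(before.splitlines()),  # layout: line delta
        len(after) - len(before),                             # lexical: character delta
    ]


# Hypothetical training data: (before, after, measured relative change in test runtime).
changes = [
    ("def f(x):\n    return x\n",
     "def f(x):\n    if x is None:\n        raise ValueError\n    return x\n", 0.08),
    ("def g(xs):\n    return sum(xs)\n",
     "def g(xs):\n    return sum(x * x for x in xs)\n", 0.03),
    ("def h(x):\n    return x + 1\n",
     "def h(x):\n    return x + 1\n", 0.0),
]

X = np.array([change_features(before, after) for before, after, _ in changes])
y = np.array([delta for _, _, delta in changes])

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, y)

# Predict the runtime impact of a new, unseen change.
new_features = change_features("def k(x):\n    return x\n",
                               "def k(x):\n    return [x] * 1000\n")
print(model.predict([new_features]))
```

In a continuous integration setting, the predicted runtime delta for each commit could then be surfaced as user experience feedback before the change is merged, as the abstract suggests.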
Fri 21 May (displayed time zone: Amsterdam, Berlin, Bern, Rome, Stockholm, Vienna)

15:00 - 16:15 | Testing for Specific Domains - 2 (AST 2021) at AST Room | Chair(s): Alejandra Garrido, LIFIA, University of La Plata & CONICET, Argentina

15:00 (15m) Short-paper | Automated User Experience Testing through Multi-Dimensional Performance Impact Analysis | AST 2021 | Pre-print, Media Attached

15:15 (30m) Long-paper | A Survey of Video Game Testing | AST 2021 | Cristiano Politowski (Concordia University), Fabio Petrillo (Université du Québec à Chicoutimi, Canada), Yann-Gaël Guéhéneuc (Concordia University and Polytechnique Montréal) | Pre-print, Media Attached

15:45 (30m) Long-paper | Test suites as a source of training data for static analysis alert classifiers | AST 2021 | Pre-print, Media Attached