Automated accessibility evaluation software for authenticated environments: A heuristic usability evaluation


Abstract

Web accessibility has been the subject of much discussion regarding the need to make Web content accessible to all people, regardless of their abilities or disabilities. While some testing techniques require human intervention, accessibility can also be evaluated by automated tools. Automated evaluation tools are software programs that examine the code of Web pages to determine whether it conforms to a set of accessibility guidelines, often based on the Web Content Accessibility Guidelines Version 2.0 (WCAG 2.0) developed by the World Wide Web Consortium (W3C). In this context, the purpose of this study is to analyze an automated software program for evaluating authenticated environments and to verify the usability of this tool, since automated systems require precision and reliability both in their results and in their use in any type of environment. To this end, the ASES software was evaluated by means of a heuristic evaluation carried out by three experts. The analysis revealed major accessibility problems, improper functioning of the available tools, and inconsistency in the results. Furthermore, ASES was found to have problems of efficiency, interaction, validity, and reliability in the results it presents. Considering that this is an open-source accessibility testing tool available on a government website, and that software for evaluating authenticated environments is scarce, correcting or improving the deficiencies identified in this study is highly recommended. © 2014 Springer International Publishing.
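To make the idea of automated conformance checking concrete, the following is a minimal, hypothetical sketch of the kind of rule such tools apply: parsing a page's markup and flagging elements that violate a WCAG 2.0 success criterion (here, images lacking a text alternative, SC 1.1.1). This is an illustration only and does not reflect how ASES itself is implemented; real evaluators apply many such checks across the full guideline set.

```python
# Illustrative sketch of one automated WCAG 2.0 check (SC 1.1.1):
# flag <img> elements that lack an alt attribute.
# Hypothetical example; not taken from ASES or any real tool.
from html.parser import HTMLParser


class AltTextChecker(HTMLParser):
    """Records the position of each <img> tag missing an alt attribute."""

    def __init__(self):
        super().__init__()
        self.violations = []  # list of (line, column) positions

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag
        if tag == "img" and "alt" not in dict(attrs):
            self.violations.append(self.getpos())


def check_alt_text(html: str) -> list:
    """Return positions of <img> tags without alt text in the given markup."""
    checker = AltTextChecker()
    checker.feed(html)
    return checker.violations


page = '<p><img src="logo.png"><img src="photo.jpg" alt="A photo"></p>'
print(check_alt_text(page))  # one violation: the first <img>
```

An automated evaluator is essentially a battery of such rule checkers run over every page; the study's concern is whether a given implementation (ASES) applies them reliably, especially on pages behind authentication.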

Citation (APA)
Pivetta, E. M., Saito, D. S., Da Silva Flor, C., Ulbricht, V. R., & Vanzin, T. (2014). Automated accessibility evaluation software for authenticated environments: A heuristic usability evaluation. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8516 LNCS, pp. 77–88). Springer Verlag. https://doi.org/10.1007/978-3-319-07509-9_8
