The reliability and validity of automated tools for examining variation in syntactic complexity across genres

Abstract

This study investigates two automated systems, the L2 Syntactic Complexity Analyzer (SCA) and Coh-Metrix, for their analysis of English syntactic complexity as a way to capture variation across two genres. The purpose of the paper is to partially replicate previous studies that have found complexity differences between narrative and argumentative essays, while also evaluating two automated tools that can be used for analysing variation in other contexts. We first test the reliability of SCA and Coh-Metrix by comparing their results with hand-coded results for 30 essays. We then examine the two systems' ability to detect genre effects on syntactic complexity using 162 essays (narrative and argumentative) written by 81 ESL students. Results indicate that the majority of measures from the two systems are reliable, although some lack transparency and consistency. Additionally, both SCA and Coh-Metrix measures show higher syntactic complexity in the argumentative essays than in the narratives, confirming the findings of previous research and suggesting that both tools may be valid for studying variation in complexity. Findings are discussed in terms of the motivation for the genre differences.

Citation (APA)

Polio, C., & Yoon, H. J. (2018). The reliability and validity of automated tools for examining variation in syntactic complexity across genres. International Journal of Applied Linguistics, 28(1), 165–188. https://doi.org/10.1111/ijal.12200
