The reliability and validity of automated tools for examining variation in syntactic complexity across genres

61 Citations (citations of this article)
69 Readers (Mendeley users who have this article in their library)

Abstract

This study investigates two automated systems, the L2 Syntactic Complexity Analyzer (SCA) and Coh-Metrix, for their analysis of English syntactic complexity as a way to capture variation across two genres. The purpose of the paper is to partially replicate previous studies that have found complexity differences between narrative and argumentative essays, while also evaluating two automated tools that can be used for analysing variation in other contexts. We first test the reliability of SCA and Coh-Metrix by comparing their output with hand-coded results for 30 essays. We then examine the two systems' ability to detect genre effects on syntactic complexity using 162 essays (narrative and argumentative) written by 81 ESL students. Results indicate that the majority of measures from the two systems are reliable, although some lack transparency and consistency. Additionally, both SCA and Coh-Metrix measures show higher syntactic complexity in the argumentative essays than in the narratives, confirming the findings of previous research and suggesting that both tools may be valid for studying variation in complexity. Findings are discussed in terms of the motivation for genre differences.
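The abstract outlines two analyses: a reliability check comparing each automated measure against hand-coded values for 30 essays, and a genre comparison over paired narrative and argumentative essays from the same 81 writers. The paper does not provide code; the sketch below is a hypothetical illustration of how such checks are often run in Python. The example measure (clauses per T-unit), the toy data, and the choice of Pearson correlation and a paired t-test are assumptions for illustration, not the authors' actual procedure.

```python
# Hypothetical sketch (not from the paper): correlating automated and
# hand-coded syntactic complexity scores, then testing a genre effect.
from scipy.stats import pearsonr, ttest_rel

# Assumed data: one value per essay for a single measure, e.g. clauses per T-unit.
hand_coded = [1.42, 1.88, 1.65, 2.01, 1.53]   # human coding of the essays
automated  = [1.40, 1.92, 1.60, 2.05, 1.50]   # output of an automated analyzer

# Reliability: how closely does the tool track the hand-coded values?
r, p = pearsonr(hand_coded, automated)
print(f"Reliability: r = {r:.2f} (p = {p:.3f})")

# Genre effect: paired scores, one narrative and one argumentative essay per writer.
narrative     = [1.35, 1.50, 1.62, 1.48, 1.55]
argumentative = [1.70, 1.85, 1.95, 1.80, 1.90]
t, p = ttest_rel(argumentative, narrative)
print(f"Genre effect: t = {t:.2f} (p = {p:.3f})")
```

A paired design fits the study's setup because each of the 81 students contributed one essay per genre, so the comparison controls for writer-level differences in proficiency.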


Citation (APA)

Polio, C., & Yoon, H. J. (2018). The reliability and validity of automated tools for examining variation in syntactic complexity across genres. International Journal of Applied Linguistics (United Kingdom), 28(1), 165–188. https://doi.org/10.1111/ijal.12200

Readers over time: ’18–’25 (chart not reproduced)

Readers' Seniority

PhD / Post grad / Masters / Doc    24 (60%)
Professor / Associate Prof.         8 (20%)
Lecturer / Post doc                 5 (13%)
Researcher                          3 (8%)

Readers' Discipline

Linguistics                            24 (63%)
Arts and Humanities                     8 (21%)
Social Sciences                         4 (11%)
Economics, Econometrics and Finance     2 (5%)
