Lexicometry: A Quantifying Heuristic for Social Scientists in Discourse Studies

  • Scholz, R.

Abstract

This chapter introduces lexicometry as a quantitative heuristic methodology for discourse analysis that complements qualitative hermeneutic methods. In doing so, it draws a connection between Bachelard’s concept of ‘epistemic rupture’ and quantitative methods, which allows discursive phenomena to be discovered prior to the interpretation of meaning in texts. Lexicometry is a corpus-driven approach that, alongside common corpus-linguistic methods, deploys complex algorithms to analyse the lexis of a given corpus exhaustively. It does so by contrasting different corpus parts organised in partitions. Drawing examples from a corpus of 4,000 press texts on the global financial crisis of 2008, the contribution illustrates how a large text corpus can be reduced systematically to a readable size. It also demonstrates different ways of exploring lexico-semantic macro-structures using correspondence analysis, descending hierarchical classification, and other methods.
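The partition-contrasting analysis mentioned in the abstract can be sketched with a toy term-by-partition contingency table and a plain correspondence analysis. The corpus slices, word list, and frequencies below are illustrative assumptions rather than the chapter's actual data; the computation itself (standardised residuals decomposed by SVD) is the standard CA procedure, written out with numpy/pandas rather than the software the author may have used.

```python
# Minimal sketch: correspondence analysis of a word-by-partition table.
# The table and its labels are hypothetical, not the chapter's 2008-crisis corpus.
import numpy as np
import pandas as pd

# Hypothetical word frequencies per corpus partition (e.g. publication years).
freq = pd.DataFrame(
    {"2007": [120, 30, 15], "2008": [300, 210, 90], "2009": [180, 260, 140]},
    index=["crisis", "bailout", "regulation"],
)

N = freq.to_numpy(dtype=float)
P = N / N.sum()                      # correspondence matrix
r = P.sum(axis=1)                    # row masses (words)
c = P.sum(axis=0)                    # column masses (partitions)

# Standardised residuals; their SVD yields the principal axes of the CA.
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
U, sv, Vt = np.linalg.svd(S, full_matrices=False)

# Principal coordinates of rows (words) and columns (partitions).
row_coords = (U * sv) / np.sqrt(r)[:, None]
col_coords = (Vt.T * sv) / np.sqrt(c)[:, None]

print(pd.DataFrame(row_coords[:, :2], index=freq.index, columns=["dim1", "dim2"]))
print(pd.DataFrame(col_coords[:, :2], index=freq.columns, columns=["dim1", "dim2"]))
```

In an actual lexicometric workflow the table would cover the full lexis of the corpus, and the first two dimensions would typically be plotted so that words and partitions appearing close together indicate which vocabulary characterises which corpus part.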

Cite

CITATION STYLE

APA

Scholz, R. (2019). Lexicometry: A Quantifying Heuristic for Social Scientists in Discourse Studies (pp. 123–153). https://doi.org/10.1007/978-3-319-97370-8_5
