In the past 20 years, Silicon Valley's platforms and opaque algorithms have increasingly influenced civic discourse, helping Facebook, Twitter, and others extract and consolidate the revenues it generates. That trend has reduced the profitability of local news organizations, but not the importance of locally created news reporting in residents' day-to-day lives. The disruption of the economics and distribution of news has reduced, scattered, and diversified local news sources (digital-first newspapers, digital-only newsrooms, and television and radio broadcasters publishing online), making it difficult to inventory and understand the information health of communities, individually and in aggregate. Analysis of this national trend is often based on the geolocation of known news outlets as a proxy for community coverage, a measure that does not accurately estimate the quality, scale, or diversity of topics provided to a community. This project is developing a scalable, semi-automated approach to describing digital news content against standards grounded in journalistic quality. We propose identifying representative corpora and applying machine learning and natural language processing to estimate the extent to which news articles engage with multiple journalistic dimensions, including geographic relevance, critical information needs, and equity of coverage.
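
The closing sentences describe the core technical idea: supervised NLP models that score each article against several journalistic dimensions. As a minimal sketch of that idea (not the authors' News Crawler implementation), the Python example below trains one TF-IDF plus logistic-regression classifier per dimension in a multi-label setup; the dimension names, toy articles, and labels are hypothetical placeholders for illustration only.

```python
# Minimal sketch: scoring news articles along journalistic dimensions
# with off-the-shelf NLP tools. This is an illustrative assumption, not
# the paper's pipeline; the labels and training snippets are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MultiLabelBinarizer

# Toy labeled corpus: each article carries zero or more dimensions.
articles = [
    "City council votes to expand bus routes into east-side neighborhoods.",
    "School board budget hearing draws parents concerned about class sizes.",
    "County health department reports uneven clinic access across zip codes.",
    "National stock markets rallied on Friday after the jobs report.",
]
labels = [
    {"geographic_relevance", "critical_information_needs"},
    {"critical_information_needs"},
    {"geographic_relevance", "equity_of_coverage"},
    set(),  # no local-journalism dimension applies
]

binarizer = MultiLabelBinarizer()
y = binarizer.fit_transform(labels)  # binary indicator matrix

# One binary classifier per dimension over shared TF-IDF features.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    OneVsRestClassifier(LogisticRegression(max_iter=1000)),
)
model.fit(articles, y)

# Score a new article: per-dimension probabilities rather than hard
# labels, so coverage can later be aggregated at the community level.
new_article = ["Residents along the riverfront ask the city for flood-plan updates."]
scores = model.predict_proba(new_article)[0]
for dim, score in zip(binarizer.classes_, scores):
    print(f"{dim}: {score:.2f}")
```

Emitting per-dimension probabilities rather than hard labels would let downstream analysis aggregate article scores into community-level indicators, matching the abstract's framing of information health measured individually and in aggregate.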
Khanom, A., Kiesow, D., Zdun, M., & Shyu, C. R. (2023). The News Crawler: A Big Data Approach to Local Information Ecosystems. Media and Communication, 11(3), 318–329. https://doi.org/10.17645/mac.v11i3.6789