YouTube’s ‘up next’ feature algorithmically suggests videos to watch after the video currently playing. The feature has been criticised for limiting users’ exposure to diverse media content and information sources; YouTube, meanwhile, reports having implemented technical and policy changes to address these concerns. Yet there is little data to support either the existing concerns or YouTube’s claims. Drawing on the concept of platform observability, this paper combines computational and qualitative methods to investigate the types of content YouTube’s ‘up next’ feature amplifies over time, using three search terms associated with sociocultural issues where concerns have been raised about YouTube’s role: ‘coronavirus’, ‘feminism’ and ‘beauty’. Over six weeks, we collected the videos (and their metadata) that ranked highly in the search results for each keyword, along with the top-ranked recommendations associated with each video, repeating the exercise for three steps in the recommendation chain. We then examined patterns in the recommended videos (and channels) for each query and how those patterns varied over time. We found evidence of YouTube’s stated efforts to boost ‘authoritative’ media outlets, but at the same time misleading and controversial content continues to be recommended. We also found that while algorithmic recommendations offer diversity in videos over time, there are clear ‘winners’ at the channel level that receive a visibility boost from YouTube’s ‘up next’ feature. The strength of these effects varied with the nature of the search topic.
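The collection procedure described above, where top-ranked search results are expanded through three steps of ‘up next’ recommendations, can be sketched as a breadth-first traversal. This is a minimal illustration, not the authors’ actual pipeline: `fetch_recommendations` is a hypothetical stand-in for whatever scraper or API client retrieves a video’s ranked recommendation list.

```python
from collections import deque

def collect_chain(seed_video_ids, fetch_recommendations, depth=3, top_n=5):
    """Breadth-first walk over 'up next' recommendations.

    seed_video_ids: top-ranked search results for one query.
    fetch_recommendations: hypothetical callable mapping a video id to
        its ordered list of recommended video ids.
    depth: how many recommendation steps to follow (the paper uses 3).
    top_n: how many top-ranked recommendations to keep per video.
    Returns a dict mapping each video id to the step at which it was
    first encountered (0 = appeared in the search results themselves).
    """
    seen = {vid: 0 for vid in seed_video_ids}
    frontier = deque((vid, 0) for vid in seed_video_ids)
    while frontier:
        vid, step = frontier.popleft()
        if step >= depth:
            continue  # do not expand beyond the final step
        for rec in fetch_recommendations(vid)[:top_n]:
            if rec not in seen:
                seen[rec] = step + 1
                frontier.append((rec, step + 1))
    return seen
```

Recording the step at which each video first appears makes it possible to compare what surfaces in search results against what only surfaces deeper in the recommendation chain, and repeating the crawl over several weeks yields the over-time comparison the study relies on.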
Citation
Matamoros-Fernandez, A., Gray, J. E., Bartolo, L., Burgess, J., & Suzor, N. (2021). What’s ‘up next’? Investigating algorithmic recommendations on YouTube across issues and over time. AoIR Selected Papers of Internet Research. https://doi.org/10.5210/spir.v2021i0.12208