Impact of social media disinformation explored in 'The Social Dilemma'
Abstract
This study explored the impact of disinformation spread through social media, focusing on Jeff Orlowski's documentary "The Social Dilemma." The film provided a critical lens for examining how social media algorithms amplify false narratives. Using a qualitative content analysis approach, the research identified key themes related to disinformation: political polarization in the U.S., flat-earth theory, the #Pizzagate conspiracy, COVID-19 misinformation, and the incitement of hate speech in Myanmar. The findings revealed that algorithms designed to maximize user engagement often prioritized sensational and misleading content, exacerbating the spread of false information, fueling social tensions, and undermining public health and democratic processes. The study emphasized the urgent need for greater public awareness of disinformation's effects and called on social media platforms to take responsibility for curbing its spread.
DOI: http://dx.doi.org/10.24329/aspikom.v9i1.1534

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.