Disinformation in the media: An interdisciplinary approach
DOI: https://doi.org/10.47475/2070-0695-2025-57-3-30-40

Keywords: disinformation, mass media, interdisciplinary approach, public consciousness, media literacy, algorithmic moderation, fact-checking, legal regulation, cognitive distortions, social polarization

Abstract
This interdisciplinary study examines disinformation through the lenses of psychology, sociology, law, and technology. It identifies key drivers of the spread of false information: cognitive biases (e.g., confirmation bias), emotional triggers (fear, anger), and social dynamics such as algorithmic echo chambers. The dual role of AI is analyzed: its capacity to generate synthetic content and its use in tools that detect such content.
Legal frameworks are assessed, emphasizing balanced strategies that combine platform self-regulation and state oversight to protect free speech while mitigating harm. Technological solutions, including machine learning and blockchain, are evaluated for their potential and limitations in content verification.
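The blockchain-based verification mentioned above can be illustrated, at a deliberately simplified level, by a hash chain: each published item stores a hash of its content combined with the previous entry's hash, so any later alteration of archived content becomes detectable. This is a minimal illustrative sketch of the general technique, not a description of any specific platform's implementation.

```python
import hashlib

GENESIS = "0" * 64  # placeholder hash for the first link

def entry_hash(content: str, prev_hash: str) -> str:
    """Hash of an item's content chained to the previous entry's hash."""
    return hashlib.sha256((prev_hash + content).encode("utf-8")).hexdigest()

def build_chain(items):
    """Return a list of (content, hash) pairs forming a hash chain."""
    chain, prev = [], GENESIS
    for content in items:
        h = entry_hash(content, prev)
        chain.append((content, h))
        prev = h
    return chain

def verify_chain(chain) -> bool:
    """Recompute every link; editing any entry breaks its stored hash."""
    prev = GENESIS
    for content, h in chain:
        if entry_hash(content, prev) != h:
            return False
        prev = h
    return True

chain = build_chain(["article v1", "correction v2"])
assert verify_chain(chain)
chain[0] = ("tampered text", chain[0][1])  # alter archived content, keep old hash
assert not verify_chain(chain)
```

The point of the sketch is the limitation the abstract alludes to as well: the chain proves that stored content has not changed since registration, but it cannot tell whether the content was accurate when registered.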
A three-pillar model is proposed: education to strengthen critical thinking and media literacy; legal reforms ensuring algorithmic transparency; hybrid systems (AI + crowdsourcing) for content moderation.
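The hybrid (AI + crowdsourcing) moderation pillar can be sketched as a decision rule that weighs a model's falsity score against community flag signals. All thresholds and signal names below are hypothetical, chosen only to show how the two inputs might be combined; the study itself does not specify numeric parameters.

```python
def moderation_decision(model_score: float, crowd_flags: int, crowd_views: int) -> str:
    """Combine an ML falsity score (0..1) with a crowdsourced flag rate.

    Illustrative policy (thresholds are assumptions, not from the study):
    - very high model confidence -> escalate to human fact-checkers
    - moderate score plus an elevated flag rate -> downrank pending review
    - otherwise -> no action
    """
    flag_rate = crowd_flags / crowd_views if crowd_views else 0.0
    if model_score >= 0.9:
        return "escalate_to_fact_checkers"
    if model_score >= 0.6 and flag_rate >= 0.05:
        return "downrank_pending_review"
    return "no_action"

assert moderation_decision(0.95, 0, 100) == "escalate_to_fact_checkers"
assert moderation_decision(0.70, 10, 100) == "downrank_pending_review"
assert moderation_decision(0.30, 1, 100) == "no_action"
```

The design intent is that neither signal acts alone in the middle range: the model supplies scale, the crowd supplies context, and humans remain the final arbiters for high-stakes cases.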
The research highlights the necessity of global cooperation to establish ethical standards in a rapidly evolving digital landscape. Resilience against disinformation, it argues, depends on integrating adaptive technology, coherent regulation, and public trust. By addressing human vulnerabilities and systemic risks, the study outlines pathways to safeguard democratic values in the information age.
Copyright (c) 2025 Иван Диреев

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.