Disinformation
Social and Media Contexts
Table of Contents
- Cover
- Title
- Copyright
- About the author
- About the book
- This eBook can be cited
- Contents
- CEDMO: Three Years of Transformation of the Digital Media Ecosystem in Central Europe
- Introduction (Karina Stasiuk-Krajewska, Michał Wenzel)
- The Phenomenon of Disinformation from a Psychological Perspective: Determinants and Counteraction (Jakub Kuś)
- Countering Disinformation in the Context of a Rule of Law Crisis (Filip Cyuńczyk)
- Fake News as Text and Genre: Selected Linguistic (Communicative) Structures and Their Functions (Karina Stasiuk-Krajewska)
- Awareness and Knowledge of Disinformation among Polish News Journalists (Katarzyna Bąkowicz)
- Media Consumption and Attitudes Towards Vaccinations Against COVID-19 and the War in Ukraine (Michał Wenzel)
- Disinformation in Slovakia: The Spread and Media Behaviour (Peter Krajčovič)
- An Analysis of the Narratives of Right-Wing Populist Movements on Social Media in Relation to Vaccination and the War in Ukraine (Grzegorz Rzeczkowski, Przemyslaw Witkowski, Roland Zarzycki)
- Misinformation as a Source of Hate Among Polish Influencers (Emilia Zakrzewska)
- Epilogue: How Disinformation Changes Our Lives? (Katarzyna Bąkowicz)
CEDMO: Three Years of Transformation of the Digital Media Ecosystem in Central Europe
“Until ‘appearances’ are confronted with reality, they do not appear as appearances.
Until ‘living in a lie’ is confronted with ‘living in the truth’, there is no perspective
that reveals its falsity. Once there is an alternative to these, however,
it threatens them for what they are, in their essence and integrity.”
(Václav Havel, The Power of the Powerless, 1978)
Urban legends as precursors to the rumours spread in mass-forwarded emails known as chain emails. Generative artificial intelligence tools, led by ChatGPT, as a digital imprint of one of the most famous Prague legends, that of the Golem. A scroll bringing the Golem to life as a prefigurement of the world view and distortions embedded in AI training data. Yes, the artificial intelligence that churns out deceptive videos known as deepfakes with an ease all its own. The old mixes with the new, the analogue with the digital, life in a lie with life in the truth. Watching the transformation of the digital media ecosystem is fascinating. It teaches us that what appears new (and is often fashionable) only appears so.
Not succumbing to delusion requires detachment, openness, honesty and courage. These attributes are integral to the principles of the Central European Digital Media Observatory (CEDMO), a multidisciplinary consortium established on 1 October 2021. During the three years of its existence, the research part of CEDMO, consisting of Charles University in Prague, SWPS University in Warsaw and Cyril and Methodius University in Trnava, has been trying to understand, and subsequently to communicate to the wider public, the knowledge related to life in a post-factual society, a society that has gone through several crises in recent years: a health crisis (the COVID-19 pandemic), an energy crisis set against the backdrop of the Russian aggression in Ukraine, and a media crisis, by which we mean the ongoing collapse of the economic models of serious news media. The book you are about to open conveys some of the insights I have in mind.

When the playwright Václav Havel reflected on the stability of the post-totalitarian system in his essay The Power of the Powerless in the late 1970s, he noted that one of its pillars rests on a shaky foundation: namely, lies. “It therefore proves itself only insofar as one is willing to live in a lie,” Havel wrote.
Without wishing to question the shakiness of the lie’s foundations, one has to admit that its stability is undoubtedly helped by the tools of generative artificial intelligence, which churn out vast amounts of texts, sounds and images that are not a true representation of reality. What is more, these tools also contribute to an unprecedented degree to the erosion of the foundations of life in truth. To paraphrase the words of the journalist Farhad Manjoo, the problem of living in the era of generative artificial intelligence is not that many texts, sounds, photos and videos are fraudulent, but that we stop trusting real texts, sounds, photos and videos.
The three-year research effort of the CEDMO consortium to date, which has resulted, inter alia, in this book, offers a robust foundation for the next three years of a unique multidisciplinary hub in Central Europe. “If there are no people, let there be at least Robots, at least the shadow of man, at least his work, at least his parable,” states one of the characters at the end of the drama R.U.R. by the journalist and writer Karel Čapek, who gave the world the word robot in this very work. Research on human imprints in computer neural networks, on their impact on the media ecosystem, and on human works remixed by generative artificial intelligence is just a slice of CEDMO’s future research activities, one that can contribute to the development of algorithmic and AI literacy alongside media literacy. It is these forms of literacy that can make the foundations of living in truth more robust.
With best wishes for an inspiring read
Karina Stasiuk-Krajewska, Michał Wenzel
Introduction
Disinformation, the category that gives this book its title, is not a precise concept, due both to the complexity and dynamics of the phenomenon itself and to the fact that the term is used for different purposes and with different intentions (Bernecker, Flowerree, Grundmann, 2021).
These definitional difficulties and the term’s (too) broad semantic scope have led to a tendency to replace disinformation with the term FIMI, an acronym for Foreign Information Manipulation & Interference. The term was introduced by the European External Action Service in response to the growing disinformation threat posed by outside actors such as Russia, but also China and India. FIMI is “a pattern of behaviour that threatens or has the potential to negatively impact values, procedures and political processes. Such activity is manipulative in character, conducted in an intentional and coordinated manner. Such activity can be performed by state or non-state actors, including their proxies inside and outside of their own territory” (Tackling Disinformation, Foreign Information Manipulation & Interference).
In the view presented, FIMI is an activity that is:
- Harmful (to democracies, societies, values, security, etc.);
- Not illegal (which means: not covered by other legal instruments, located in a “grey zone”);
- Manipulative (which includes manipulation of content);
- Intentional (which means that it is not misinformation, but pursues a specific goal and is organised/financed);
- Coordinated (it uses different parts of the “ecosystem” and is not “organic”).
Of course, the FIMI category is more precise and easier to operationalize (Deppe, 2023); such, after all, were the main motives for its introduction. In the context of the texts presented in this publication, however, it seems inadequate, since the authors were generally not interested in whether the messages they analysed, in terms of structure or influence, were of external origin or not. We therefore decided to stay with the category of disinformation, with all its limitations.
Details
- Pages: 178
- Publication Year: 2024
- ISBN (PDF): 9783631923184
- ISBN (ePUB): 9783631923191
- ISBN (Hardcover): 9783631918159
- DOI: 10.3726/b22085
- Language: English
- Publication date: 2024 (August)
- Keywords: disinformation media research
- Published: Berlin, Bruxelles, Chennai, Lausanne, New York, Oxford, 2024. 178 pp., 27 fig. b/w, 7 tables.
- Product Safety: Peter Lang Group AG