The Failed Epistemologies of AI? Making Sense of AI Errors, Failures and their Impacts on Society

13.03.2024

Annals of the Fondazione Luigi Einaudi

Deadline: April 24, 2024

Editors:

Prof. Veronica Barassi (veronica.barassi@unisg.ch)

Dr. Philip Di Salvo (philip.disalvo@unisg.ch)

School of Humanities and Social Sciences

University of St. Gallen

Generative artificial intelligence tools and large language models are gaining a prominent place in our society. Probably for the first time in history, humans now have to relate to and interact with technological systems capable of producing and generating new content and knowledge, mimicking human imagination, speech, and behavior in ways that were not possible before. This new state of affairs inevitably brings profound consequences and potential sea changes for numerous social, scientific, and cultural fields, raising epistemological, ethical, political-economic, and philosophical questions about the epistemologies of AI and the processes of knowledge production behind these systems. The race for AI innovation is being framed with reference to the ‘superintelligence’ of our machines, their processing power, and their ability to learn and generate knowledge. In public debate, AI technologies are admired for their powers and feared for their threats.

Yet we are increasingly confronted with the fact that these machines make errors and mistakes: they are fallible, inaccurate, and often culturally biased. From generative AI technologies that ‘hallucinate’ and invent facts to predictive policing technologies that lead to wrongful arrests, our world is quickly coming to terms with the fact that the AI we are building is not only astonishing and incredibly powerful, but often unable to understand the complexity of our human experience and our cultural worlds. Research has shown that AI errors and their problematic outcomes cannot be dismissed as mere coding glitches; they are direct expressions of the structural inequalities of our societies, and they confront us with critical questions about our supposed anthropocentric position as knowledge creators.

The aim of this special issue is to bring together scholars from different fields of the social sciences and humanities to investigate how artificial intelligence systems are challenging epistemological assumptions in various societal areas, and how the failures of such systems affect knowledge creation and diffusion in those areas. Overall, the special issue aims to move beyond the dominant, hyped takes and narratives around AI and its supposed (super)powers, and to reflect critically on how we can identify and learn to coexist with the limitations of AI-driven knowledge production.

Possible topics include, but are not restricted to:

  • Impacts of AI Errors and Failures: Exploring the ways in which AI failures, inaccuracies, and errors impact human understanding, decision-making, and interpersonal relationships.
  • Cultural Limitations of AI Knowledge: Investigating how AI systems intersect with cultural norms, values, and belief systems, and assessing the limits to cultural diversity and inclusivity of these technologies.
  • Fake News and Deepfakes: Generative AI, democracy, disinformation, and the public sphere.
  • Social Construction of AI Truth: Investigating how AI systems construct and perpetuate particular truths, shaping public perceptions and influencing social narratives.
  • Bias and Discrimination in AI: Analyzing how inherent biases in training data, algorithms, and decision-making processes contribute to perpetuating social inequalities and reinforcing existing power structures.

Submission procedure

We invite interested scholars to submit an abstract (300 words, with 3 to 5 keywords) by April 24, 2024 to editors@annalsfondazioneluigieinaudi.it, veronica.barassi@unisg.ch, and philip.disalvo@unisg.ch.

The issue’s editors will review the abstracts and send notifications of acceptance or rejection by June 8, 2024.

The special issue will include up to eight contributions selected from those received through this call for papers. Final papers (approximately 8,000 words) will be due by December 8, 2024. Please note that acceptance of an abstract does not necessarily imply acceptance of the final paper for the special issue. For further information (including the aims and scope of the Journal), please refer to the Journal’s website.
