Call: Trust and the Digital Society – Special issue of the Journal of Trust Research

Call for Papers:

Trust and the Digital Society
For a special issue of the Journal of Trust Research
https://think.taylorandfrancis.com/special_issues/trust-and-the-digital-society/

Special Issue Editors:
Prof. Dr. Balázs Bodó, University of Amsterdam, the Netherlands (b.bodo@uva.nl)
Dr. Linda Weigl, University of Amsterdam, the Netherlands
Dr. Mónika Simon, University of Amsterdam, the Netherlands
Dr. Tomasz Zurek, University of Amsterdam, the Netherlands
Dr. Theo Araujo, University of Amsterdam, the Netherlands

Deadline for submissions: December 1, 2025

BACKGROUND

Trust is the latest shared societal resource to be disrupted by digital innovation on a global scale. Concerns abound regarding growing distrust in institutions, practices, and professions that were once highly trusted. More and more people have less confidence than before in journalism, science, vaccines, schools and universities, and otherwise fair and reliable public institutions. Political polarization creates tensions in interpersonal trust relations, and sometimes tears friendships and even families apart. While skepticism and distrust can also be understood as liberal democratic virtues, online they are all too often ‘weaponized’ at the hands of trolls, online influencers, dishonest politicians, and sock puppet accounts connected to authoritarian, state-sponsored disinformation campaigns. In online environments, where outrage often leads to higher levels of ‘engagement’, these dynamics feed new ‘coalitions of distrust’ forming across and between different groups united by their shared antagonism toward ‘the mainstream’. At the other end of the spectrum, we also see an increase in ‘overconfidence’ in untrustworthy actors. Throughout history, people have often placed trust in questionable hands, but what distinguishes the present is the scale at which this occurs online, where accountability is frequently lacking. The rise of the sharing economy has made it common to trust strangers with our homes, cars, and personal belongings, often without fully considering the risks involved. Similarly, the growing presence of generative AI has led many to trust the output of these systems without hesitation in their daily lives. Trust is fluid, and there are just too many opportunities for it to flow into unwarranted places: the untrustworthy seem to be increasingly trusted, while the trustworthy are not.

Thus, trust has become one of the central concepts in the digital society. On the one hand, the trustworthiness of our information infrastructures, such as platforms, AI, and encrypted communications, has emerged as a central concern (trust in technology). On the other hand, trust relations in the digital society, such as trust in expertise, science, news, or public institutions, have been fundamentally disrupted (trust by technology). In each case, we may be facing a slightly different formulation of the same fundamental questions:

Trust in technology: First, what makes these new digital innovations (un)trustworthy? What mix of regulation, transparency, accountability, oversight, technical design, and business models will provide the greatest confidence that our new digital infrastructures can deliver on their promises, while keeping the best interests of their users and of society in mind?

Trust by technology: Moreover, how does digital innovation shape trust in the digital society? What are the dynamics that shape trust relations vis-à-vis other people, institutions, technologies, and so on? How do the different components of trust change and transform due to digitization: the circumstances of the one who trusts, the characteristics of the one to be trusted, and the environment in which trust emerges (or not)?

SCOPE OF THE SPECIAL ISSUE

This Special Issue, an outcome of the Amsterdam Trust Summit 2025, invites submissions that address these questions. In particular, we encourage submissions addressing the following themes:

Theories of trust and distrust in the digital society: Theoretical and empirical work on technology-related risks, uncertainties, and harms, as well as benefits and new dynamics, and both new and revisited models of trust and distrust in, and as mediated by, digital technology.

Trust dynamics around emerging technologies: Trust processes and changes in trust over time related to specific technologies, such as AI, platforms, self-driving cars, blockchains, as they are developed, implemented, and negotiated across various societal domains. These domains include journalism, science, the justice system, education systems, economic transactions, labor, finance, supply chains, public institutions, interpersonal relations, and epistemic frameworks.

Individual trusting behaviors and impacts around technology: Work focusing on extending trust theory regarding the antecedents, processes, and consequences of trusting behavior vis-à-vis technology, as well as shifts in trust behaviors through technology-mediated relations.

Trustworthiness safeguards of socio-technical infrastructures: Empirical and doctrinal research around trustworthiness and regulation, self-regulation, trust and safety teams, technical designs and architectures of trustworthiness, and trustworthiness by markets and competition.

Narratives of trust and distrust in popular culture: Research that addresses the issue of trust and distrust in media and social media conversations, conspiracy thinking and the prevalence of conspiracy theories, the rise and impact of fake news, as well as misinformation and disinformation in (social) media content. We expect work in this area to not only document these narratives but also critically analyze their construction, circulation, and effects.

Methods of studying trust in the information age: This theme invites contributions that critically reflect on how emerging or technologically facilitated methods can provide new insights into trust itself. We welcome methodological work across a range of research methods, such as quantitative trust research, including surveys and questionnaires, experiments, statistical modelling, network analysis and network modeling, content analysis, time series analysis, and computational methods (textual, audiovisual); qualitative trust research, including interviews, case studies, ethnography, qualitative content analysis, and think-aloud studies; and interdisciplinary approaches to studying trust in fields such as psychology, sociology, political science, communication science, neuroscience, and economics. We especially welcome work that demonstrates how these methods uncover dynamic, contextual, or previously inaccessible dimensions of trust.

As we face growing challenges in understanding trust amidst technological mediation and disruption, this Special Issue aims to shed light on these issues. The contributions that will be featured are intended to explore diverse perspectives on how trust is mediated and reshaped by technological infrastructures, and whether and how we can deal with these developments. By engaging with the complex socio-technical and political interplay between individuals, institutions, and technologies, we hope this issue will inspire further research and offer meaningful insights into the limitations and safeguards of a trustworthy digital society.

TIMELINE

Full papers due: December 1st, 2025
Review period: December 2025 – August 2026
Final acceptance deadline: August 2026
Planned publication date: Fall 2026

SUBMISSION INSTRUCTIONS

For this Special Issue, we invite conceptual/theoretical, qualitative, and quantitative empirical research, both normative and descriptive, from the humanities, social sciences, law, and the natural sciences. We are looking for contributions that address the interplay between digital technologies, social, political, and institutional dynamics, and trust. We welcome a broad range of submissions spanning disciplines such as law, sociology, political science, media studies, psychology, economics, management & organization, computer science, and more. Contributions can employ diverse methodologies, including qualitative, quantitative, and big data approaches, historical analysis, ethnomethodology, and experimentation, as well as theoretical reflections on trust dynamics in the digital society. Additionally, we strongly encourage interdisciplinary approaches that illuminate the multifaceted aspects of this topic.

Important note on conceptual clarity: We encourage submissions that engage with trust as a distinct and substantive concept, rather than as a proxy for related ideas such as technology adoption, acceptance, or usability. While trust is often entangled with these dynamics, this Special Issue seeks contributions that examine how trust is formed, challenged, negotiated, and transformed over time.

EDITOR BIOS:

PROF. DR. BALÁZS BODÓ. Balázs is Professor of Information Law and Policy, with a special emphasis on technology governance, at the Institute for Information Law (IViR) at the University of Amsterdam. He was a Fulbright Visiting Researcher at Stanford University’s Center for Internet and Society in 2006/7. In 2012/13 he was a Fulbright Fellow at the Berkman Center for Internet and Society at Harvard University. In 2013 he moved to Amsterdam as a Marie Curie Fellow at the Institute for Information Law (IViR) at the University of Amsterdam. In 2018 he received an ERC Starting Grant to study the legal and political implications of blockchain-based technologies, and started the Blockchain & Society Policy Research Lab. He has been invited by the European Commission to serve as an expert for various blockchain-related projects. In 2019 he was a senior visiting fellow at the Weizenbaum-Institut für die vernetzte Gesellschaft, Berlin. He is the founding (co)director of the University of Amsterdam’s interdisciplinary research area on Trust in the Digital Society. His academic interests include digital piracy, decentralized techno-social systems, shadow libraries, informal media economies, regulatory conflicts around new technological architectures, and trust.

DR. LINDA WEIGL. Linda Weigl is a postdoctoral researcher at the University of Amsterdam’s (UvA) Institute for Information Law (IViR) with a background in Political Science and European public policy. Her research focuses on the governance of digital technologies, particularly platforms and identity systems. Linda is part of the Trust Research Priority Area at UvA, which studies the evolution of trust in response to emerging algorithmic trust production technologies and explores potential disruptions to existing trust relationships. The goal of her research is to uncover the power dynamics and risks behind digital trust infrastructures by scrutinizing the accountability of companies, governments’ regulatory power, and users’ risk awareness. From studying the EU’s evolving tech regulations, such as eIDAS (2.0), the DSA, and the DMA, to speaking directly with Trust and Safety teams at major platforms, Linda’s research balances different methodological approaches. Specifically, she is interested in how digital systems shape, challenge, and sometimes undermine trust, both at the institutional level and in everyday digital interactions.

DR. TOMASZ ZUREK. Tomasz Zurek holds a master’s degree in management (1999) and a doctorate in computer science (2004), with a dissertation focused on the application of artificial intelligence in banking. His current research interests include normative reasoning and argumentation, value-based reasoning, and the computational modeling of trust in multi-agent systems. Recently, he developed a formal model of trust for multi-agent systems, which has been implemented and experimentally validated. The results of this work have been presented and published at leading conferences in the field, including ICAART and AAMAS. For several years, Tomasz has served as an assistant professor at the Institute of Computer Science at Maria Curie-Skłodowska University in Lublin, Poland (currently on sabbatical), where he has successfully combined research with teaching. He is also an associate fellow at the T.M.C. Asser Institute and a postdoctoral researcher at the Informatics Institute, University of Amsterdam (TRUST RPA project). Tomasz has authored and co-authored more than 60 peer-reviewed papers and has served on the program committees of numerous prestigious conferences and workshops dedicated to Artificial Intelligence and Law, as well as multi-agent systems.

DR. MÓNIKA SIMON. Mónika Simon is a computational social scientist specializing in political communication and journalism. She obtained her training at the Amsterdam School of Communication Research (ASCoR) and currently holds a postdoctoral position at ASCoR as part of the University Research Priority Area on Trust in the Digital Society (Trust RPA). In her current role as a postdoc, she works with the broader interdisciplinary RPA team to understand the many facets of trust and distrust in our society. Her main research focus is unravelling automatic versus deliberative trust/distrust using social scientific methods and economic games. Together with Theo Araujo and Jan Engelmann (neuroeconomics), she is currently developing a series of studies to identify, understand, and address biases that may impact various trust-related attitudes and behaviors in the digital society.

PROF. DR. T.B. (THEO) ARAUJO. Theo Araujo is Professor of Media, Organizations and Society in the Department of Communication Science and Scientific Director of the Amsterdam School of Communication Research (ASCoR) at the University of Amsterdam. He is co-director of the Communication in the Digital Society Initiative (uva.nl/communication-digital-society) and co-founded, and is a former co-director of, the Digital Communication Methods Lab (digicomlab.eu). His research focuses on the increasing adoption of artificial intelligence and related technologies within our communication environment, including conversational agents and automated decision-making. He is also interested in the latest developments in computational social science and in the implementation of large-scale data collection and analysis for communication research. In 2021, he was awarded a Platform Digital Infrastructure Social Sciences and Humanities Grant to lead a consortium of six Dutch universities in developing D3I, a digital data donation infrastructure (d3i-infra.github.io), enabling researchers to partner with users to study digital infrastructures via data donation.

