Prof. Calzada’s article ‘Trustworthy AI for Whom? GenAI Detection Techniques of Trust Through Decentralized Web3 Ecosystems’, stemming from his Horizon Europe ENFIELD project, has been accepted for publication

Professor Igor Calzada is pleased to announce that his latest research article, “Trustworthy AI for Whom? GenAI Detection Techniques of Trust Through Decentralized Web3 Ecosystems,” co-authored with BME scholars, has been accepted for publication in Big Data and Cognitive Computing (BDCC, ISSN 2504-2289; Impact Factor 3.1; CiteScore 7.1), an open-access journal published by MDPI. The article, officially accepted on March 1, 2025, contributes to the ongoing debate on trust in artificial intelligence (AI), particularly in the context of Generative AI (GenAI) and Web3 ecosystems.

Here is the final version of the preprint (the accepted version in its current form) while the article is being processed by the journal:

Calzada, I., Németh, G., & Al-Radhi, M. S. (2025). Trustworthy AI for Whom? GenAI Detection Techniques of Trust Through Decentralized Web3 Ecosystems. Big Data and Cognitive Computing. https://doi.org/10.20944/preprints202501.2018.v1

The Significance of This Research

As AI systems become more embedded in society, challenges related to trust, governance, and fairness have taken center stage. In this paper, Prof. Calzada and his co-authors examine how GenAI detection techniques can reinforce trust through decentralized Web3 infrastructures. By exploring the intersection of AI and blockchain technologies, the study proposes innovative approaches to addressing AI governance challenges.

Related Initiatives and Ongoing Discussions

This research aligns with several ongoing initiatives and policy discussions on AI governance.

Future Directions

Prof. Calzada’s research aims to redefine trust in Generative and Urban AI by integrating decentralized and transparent mechanisms that promote accountability. He invites scholars, policymakers, and industry leaders to engage with these findings and contribute to the broader discourse on Trustworthy AI.
