Blockchain-assisted explainable decision traces (BAXDT): An approach for transparency and accountability in artificial intelligence systems

dc.authorid: 0000-0002-0694-9220
dc.contributor.author: Parlak, Ismail Enes
dc.date.accessioned: 2026-02-08T15:15:21Z
dc.date.available: 2026-02-08T15:15:21Z
dc.date.issued: 2025
dc.department: Bursa Teknik Üniversitesi
dc.description.abstract: The increasing opacity and lack of verifiable audit trails in AI decision-making systems pose significant challenges to establishing trust and accountability, particularly in high-impact domains. This paper introduces Blockchain-Assisted Explainable Decision Traces (BAXDT), a novel architecture designed to enhance the transparency and auditability of AI systems. BAXDT creates comprehensive, immutable records for each AI decision by integrating model outputs, SHAP-based XAI summaries, a novel Explanation Density Metric, and detailed model/data context into a unified JSON trace. The 0.80 threshold for the Explanation Density Metric was empirically supported by Kneedle-based automatic threshold detection. The BAXDT architecture leverages blockchain by recording a cryptographic hash of each decision trace on-chain, while the full trace is stored off-chain. The system's effectiveness was demonstrated through a multifaceted evaluation: simulations across three diverse public datasets (medical, financial, educational) confirmed its domain-agnostic applicability; a scalability analysis of up to 20,000 traces demonstrated its efficient, linear performance; and a successful deployment on the Ethereum Sepolia public testnet verified its real-world viability. A case study on text data further underscored the framework's flexibility. BAXDT provides a robust framework for documenting AI decisions (what, why, based on what, and when), thereby fostering trustworthy AI and supporting regulatory compliance.
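The abstract's core mechanism, hashing a unified JSON decision trace so that only the digest is anchored on-chain while the full trace stays off-chain, can be illustrated with a minimal sketch. All field names in the example trace (`decision_id`, `shap_summary`, `explanation_density`, etc.) are hypothetical placeholders loosely following the components the abstract names; the paper's actual trace schema and smart-contract interface are not reproduced here.

```python
import hashlib
import json

def trace_digest(trace: dict) -> str:
    """Return the SHA-256 hex digest of a decision trace.

    The trace is serialized with sorted keys and compact separators so
    that the same logical content always yields the same digest,
    regardless of key order in the input dict.
    """
    canonical = json.dumps(trace, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Hypothetical trace, loosely mirroring the components the abstract
# lists: model output, SHAP-based XAI summary, Explanation Density
# Metric, and model/data context.
trace = {
    "decision_id": "demo-001",
    "model_output": {"label": 1, "probability": 0.91},
    "shap_summary": {"feature_a": 0.42, "feature_b": -0.13},
    "explanation_density": 0.85,
    "context": {"model_version": "v1", "timestamp": "2025-01-01T00:00:00Z"},
}

# The full `trace` would be stored off-chain; only this 64-character
# hex digest would be recorded on-chain for later integrity checks.
digest = trace_digest(trace)
```

An auditor can later recompute the digest from the off-chain trace and compare it with the on-chain record; any tampering with the stored trace changes the digest and is therefore detectable.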
dc.identifier.doi: 10.1016/j.knosys.2025.114402
dc.identifier.issn: 0950-7051
dc.identifier.issn: 1872-7409
dc.identifier.scopus: 2-s2.0-105014945606
dc.identifier.scopusquality: Q1
dc.identifier.uri: https://doi.org/10.1016/j.knosys.2025.114402
dc.identifier.uri: https://hdl.handle.net/20.500.12885/5742
dc.identifier.volume: 329
dc.identifier.wos: WOS:001567008200001
dc.identifier.wosquality: Q1
dc.indekslendigikaynak: Web of Science
dc.indekslendigikaynak: Scopus
dc.language.iso: en
dc.publisher: Elsevier
dc.relation.ispartof: Knowledge-Based Systems
dc.relation.publicationcategory: Article - International Peer-Reviewed Journal - Institutional Faculty Member
dc.rights: info:eu-repo/semantics/closedAccess
dc.snmz: WOS_KA_20260207
dc.subject: Explainable artificial intelligence (XAI)
dc.subject: Blockchain
dc.subject: Decision traceability
dc.subject: Artificial intelligence accountability
dc.subject: Auditability
dc.title: Blockchain-assisted explainable decision traces (BAXDT): An approach for transparency and accountability in artificial intelligence systems
dc.type: Article
