Blockchain-assisted explainable decision traces (BAXDT): An approach for transparency and accountability in artificial intelligence systems
| Field | Value |
| --- | --- |
| dc.authorid | 0000-0002-0694-9220 |
| dc.contributor.author | Parlak, Ismail Enes |
| dc.date.accessioned | 2026-02-08T15:15:21Z |
| dc.date.available | 2026-02-08T15:15:21Z |
| dc.date.issued | 2025 |
| dc.department | Bursa Teknik Üniversitesi |
| dc.description.abstract | The increasing opacity and lack of verifiable audit trails in AI decision-making systems pose significant challenges to establishing trust and accountability, particularly in high-impact domains. This paper introduces Blockchain-Assisted Explainable Decision Traces (BAXDT), a novel architecture designed to enhance the transparency and auditability of AI systems. BAXDT creates comprehensive, immutable records for each AI decision by integrating model outputs, SHAP-based XAI summaries, a novel Explanation Density Metric, and detailed model/data context into a unified JSON trace. The 0.80 threshold for the Explanation Density Metric was empirically supported by Kneedle-based automatic threshold detection. The BAXDT architecture leverages blockchain by recording a cryptographic hash of each decision trace on-chain, while the full trace is stored off-chain. The system's effectiveness was demonstrated through a multifaceted evaluation: simulations across three diverse public datasets (medical, financial, educational) confirmed its domain-agnostic applicability; a scalability analysis of up to 20,000 traces demonstrated its efficient and linear performance; and a successful deployment on the Ethereum Sepolia public testnet verified its real-world viability. A case study on text data further underscored the framework's flexibility. BAXDT provides a robust framework for documenting AI decisions (what, why, based on what, and when), thereby fostering trustworthy AI and supporting regulatory compliance. |
| dc.identifier.doi | 10.1016/j.knosys.2025.114402 |
| dc.identifier.issn | 0950-7051 |
| dc.identifier.issn | 1872-7409 |
| dc.identifier.scopus | 2-s2.0-105014945606 |
| dc.identifier.scopusquality | Q1 |
| dc.identifier.uri | https://doi.org/10.1016/j.knosys.2025.114402 |
| dc.identifier.uri | https://hdl.handle.net/20.500.12885/5742 |
| dc.identifier.volume | 329 |
| dc.identifier.wos | WOS:001567008200001 |
| dc.identifier.wosquality | Q1 |
| dc.indekslendigikaynak | Web of Science |
| dc.indekslendigikaynak | Scopus |
| dc.language.iso | en |
| dc.publisher | Elsevier |
| dc.relation.ispartof | Knowledge-Based Systems |
| dc.relation.publicationcategory | Article - International Peer-Reviewed Journal - Institutional Faculty Member |
| dc.rights | info:eu-repo/semantics/closedAccess |
| dc.snmz | WOS_KA_20260207 |
| dc.subject | Explainable artificial intelligence (XAI) |
| dc.subject | Blockchain |
| dc.subject | Decision traceability |
| dc.subject | Artificial intelligence accountability |
| dc.subject | Auditability |
| dc.title | Blockchain-assisted explainable decision traces (BAXDT): An approach for transparency and accountability in artificial intelligence systems |
| dc.type | Article |
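The abstract describes each AI decision being serialized into a unified JSON trace (model output, SHAP-based explanation summary, Explanation Density Metric, and model/data context), with a cryptographic hash of the trace recorded on-chain while the full trace is stored off-chain. The following is a minimal sketch of that hash-anchoring pattern only; all field names (`model_output`, `shap_summary`, `explanation_density`, `context`) and values are illustrative assumptions, not the paper's actual trace schema.

```python
import hashlib
import json
from datetime import datetime, timezone

def build_decision_trace(model_output, shap_summary, explanation_density,
                         model_version, dataset_id):
    """Assemble one decision record as a JSON-serializable trace.

    Field names are hypothetical; BAXDT's real schema may differ.
    """
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "model_output": model_output,
        "shap_summary": shap_summary,              # e.g. top feature attributions
        "explanation_density": explanation_density,
        "context": {"model_version": model_version, "dataset_id": dataset_id},
    }

def trace_digest(trace: dict) -> str:
    """SHA-256 over a canonical JSON serialization of the trace.

    sort_keys + compact separators make the serialization deterministic,
    so an auditor can recompute the digest from the off-chain trace and
    compare it with the hash recorded on-chain.
    """
    canonical = json.dumps(trace, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

trace = build_decision_trace(
    model_output={"label": "approve", "probability": 0.91},
    shap_summary={"income": 0.42, "debt_ratio": -0.18},
    explanation_density=0.84,          # synthetic value, above the 0.80 threshold
    model_version="model-v1",
    dataset_id="demo-dataset",
)
print(trace_digest(trace))             # 64-hex-character digest to record on-chain
```

The design choice worth noting is canonicalization: hashing an arbitrary `json.dumps` output would make the digest depend on key order and whitespace, breaking later verification of the off-chain trace against the on-chain hash.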
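The abstract also states that the 0.80 Explanation Density threshold was supported by Kneedle-based automatic threshold detection. A simplified knee-point sketch in the spirit of the Kneedle algorithm (normalize both axes to [0, 1], then pick the point of maximum vertical distance above the diagonal for a concave, increasing curve) is shown below; the curve data is synthetic and the paper's actual detection procedure is not reproduced here.

```python
def knee_point(xs, ys):
    """Simplified Kneedle-style knee detection for a concave, increasing
    curve: normalize x and y to [0, 1] and return the x whose normalized
    difference y_norm - x_norm is largest (the point of diminishing returns).
    """
    x0, x1 = min(xs), max(xs)
    y0, y1 = min(ys), max(ys)
    best_i, best_diff = 0, float("-inf")
    for i, (x, y) in enumerate(zip(xs, ys)):
        x_norm = (x - x0) / (x1 - x0)
        y_norm = (y - y0) / (y1 - y0)
        diff = y_norm - x_norm
        if diff > best_diff:
            best_i, best_diff = i, diff
    return xs[best_i]

# Synthetic saturating curve: gains flatten out after x = 2.
xs = [0, 1, 2, 3, 4, 5]
ys = [0.0, 0.5, 0.75, 0.88, 0.94, 1.0]
print(knee_point(xs, ys))  # → 2
```

A full Kneedle implementation additionally smooths the curve and handles the other three curve shapes (convex/concave, increasing/decreasing); this sketch covers only the concave-increasing case.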