Explainable AI (XAI) is insufficient. It provides a narrative for a decision after the fact, but offers no cryptographic proof that the model executed as claimed. This leaves a trust gap in high-stakes applications such as on-chain trading agents and autonomous financial protocols.
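To make the gap concrete, here is a minimal sketch (in Python, using the `cryptography` package; all names and byte strings are illustrative, not from any real system). It shows the strongest guarantee a plain signature can give: it binds a claimed (weights, input, output) triple to a key, proving the signer *vouched for* the triple, but not that the output actually came from running the model. Closing that last gap is what verifiable-inference techniques, such as zero-knowledge proofs of inference or trusted-hardware attestation, aim to do.

```python
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def commitment(weights: bytes, model_input: bytes, model_output: bytes) -> bytes:
    """Hash-commit to the exact weights, input, and output of one inference."""
    h = hashlib.sha256()
    for part in (weights, model_input, model_output):
        h.update(hashlib.sha256(part).digest())  # commit to each component
    return h.digest()

# Prover side: run inference (elided), then sign a commitment over it.
signing_key = Ed25519PrivateKey.generate()
receipt = commitment(b"weights-v1", b"price feed @ t", b"action: BUY")
signature = signing_key.sign(receipt)

# Verifier side: anyone with the public key and the same three byte strings
# can recompute the commitment and check the signature. Limitation: this only
# proves the signer vouched for the triple, not that the output was produced
# by executing the model -- the gap verifiable inference is meant to close.
signing_key.public_key().verify(signature, receipt)  # raises InvalidSignature on mismatch
```

An XAI narrative sits a level below even this: it is unverifiable text, with nothing for a counterparty to check.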