Open Canadian Standard · March 2026 · Built for the Age When Machines Decide

AI Decision Records: Introducing the AgDR Accountability Standard

When the AI decided, who was responsible?

AgDR v0.2

Atomic Kernel Inference Standard

CC0 / Apache 2.0

Policy = PPP  ·  Provenance · Place · Purpose

Every AI decision captured atomically at the exact inference point, signed with BLAKE3 + Ed25519, chained in a forward-secret Merkle tree. The AI governance standard that is tamper-evident from birth. Court-ready under the Canada Evidence Act. No new legislation required.
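The chaining described above can be sketched in a few lines. This is an illustrative stand-in, not the normative AgDR implementation: the standard specifies BLAKE3 and Ed25519, but to stay dependency-free this sketch uses the standard library's `blake2b`, and it shows only the hash-chain linkage (each record's digest folded into the previous chain head), not signatures or the full Merkle tree. All field names are hypothetical.

```python
import hashlib
import json

def leaf_hash(record: dict) -> str:
    """Digest of a canonically serialized record.
    (stdlib blake2b stands in for the BLAKE3 the standard specifies)"""
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":")).encode()
    return hashlib.blake2b(canonical, digest_size=32).hexdigest()

def chain(records: list[dict]) -> list[str]:
    """Fold each record's hash into the previous chain head, so editing
    any past record changes every subsequent head."""
    heads, prev = [], "0" * 64
    for rec in records:
        prev = hashlib.blake2b((prev + leaf_hash(rec)).encode(),
                               digest_size=32).hexdigest()
        heads.append(prev)
    return heads

records = [
    {"provenance": "agent-7", "place": "TSX", "purpose": "rebalance"},
    {"provenance": "agent-7", "place": "TSX", "purpose": "hedge"},
]
heads = chain(records)

# Tampering with the first record changes the final chain head.
records[0]["purpose"] = "liquidate"
assert chain(records)[-1] != heads[-1]
```

The design point the sketch makes is the same one the paragraph makes: the record does not resist tampering, it reveals it, because any retroactive edit breaks every later link.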

“Don’t trust the machine. Don’t even trust me. Trust the record.”

Aligned with the global AI governance conversation

EU AI Act · NIST AI RMF · ISO/IEC 42001 · Canada Evidence Act · CBCA s.122 · PHIPA / PIPEDA

What Every AgDR Record Captures

Every profession that earned public trust did it the same way: by keeping the record. The moment a decision fires, AgDR is the witness, written at the exact instant, before anyone can change the story.

Provenance

Who is acting, on whose behalf, from what verified state. Every accountability chain begins with a name, and the record that name is attached to holds forever.

Place

Where the decision is headed. The intended destination, regulatory boundary, and definition of success. Perfectly frozen at the inference instant.

Purpose

Why the decision is being made. The explicit intent, fiduciary duty, and ethical anchor. Every AgDR record closes on three questions: Was it beautiful in its intent? Was it true in its reasoning? Was it wise in its consequence?

Full Reasoning Trace

Every step of the model’s chain-of-thought, signed at the inference instant.

Human Delta Chain

Every human review, override, and escalation is named, timed, and signed.

FOI Terminal Node

A named human at the end of every chain. The point where the machine’s decision meets a conscience.
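The capture fields above can be gathered into one immutable structure. A minimal sketch, assuming nothing beyond the page itself: the field names here are illustrative, not the normative AgDR schema, and `frozen=True` merely gestures at the write-once property the standard enforces cryptographically.

```python
from dataclasses import dataclass, asdict

@dataclass(frozen=True)  # write-once in spirit; AgDR enforces this cryptographically
class AgDRRecord:
    provenance: str         # who is acting, on whose behalf, from what verified state
    place: str              # intended destination and regulatory boundary
    purpose: str            # explicit intent, fiduciary duty, ethical anchor
    reasoning_trace: tuple  # every step of the model's chain-of-thought
    human_deltas: tuple     # (reviewer, action, timestamp) for each review/override
    foi_terminal: str       # the named human at the end of the chain

rec = AgDRRecord(
    provenance="desk-agent-7 on behalf of Fund A",
    place="TSX / OSC jurisdiction",
    purpose="rebalance within mandate",
    reasoning_trace=("assess exposure", "select hedge"),
    human_deltas=(("j.doe", "approved", "2026-03-14T09:31:07Z"),),
    foi_terminal="j.doe",
)
assert asdict(rec)["foi_terminal"] == "j.doe"
```

Tuples rather than lists keep the nested fields immutable too, so the whole record is hashable as a unit before it is signed.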

Stress-Tested at Real-World Scale

TSX Toronto equity desk · Single AGI agent · March 2026

100M
decisions processed

Every decision signed, sealed, and traceable.

253,807
decisions / second

Real-time AI governance at production scale.

3.94 µs
avg. AKI capture latency

Captured before the decision can be questioned.

100%
success rate · zero dropped records

Not one record lost. The standard holds.

Every record independently verifiable today and in 2076. No proprietary tools. No vendor dependency. The standard holds.  See the full stress test →

Every profession that earned public trust kept the record. Medicine. Aviation. Law. The standard was always the same; only the technology changed.

Built on Existing Law

No new legislation required. The frameworks already exist; AgDR simply implements what they always meant.

Referenced Global Standards & Frameworks

EU AI Act 2024/1689 · NIST AI RMF 1.0 · ISO/IEC 42001:2023 · Canada Evidence Act

Read the Standard

Every claim is documented. Every component is specified. Every page is open. The complete AI governance specification is free, forever.

Open Source Ecosystem

The AgDR standard is backed by working open-source implementations. From the canonical specification to production-grade systems to experimental research: every layer is open.

Questions the Standard Answers

Every institution arriving at AI governance asks the same questions. The answers are not opinions; they are immutable, auditable records.

What is an AI decision record and why does it matter now?

How do I verify an AgDR record in Python?

What proves record integrity in AI audits?

How does AgDR implement EU AI Act Article 12 and Article 14?

How does the NIST AI Risk Management Framework relate to AgDR?

What is the difference between AgDR and a standard audit log?
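The second question above has a short answer in code. A minimal sketch, not the AgDR reference verifier: it recomputes a record's digest from canonical JSON and compares it to the stored value, with stdlib `blake2b` standing in for the BLAKE3 the standard specifies. A production verifier would additionally check the Ed25519 signature over this digest and the record's membership in the Merkle chain.

```python
import hashlib
import hmac
import json

def record_digest(record: dict) -> str:
    """Digest of the record's canonical JSON form.
    (stdlib blake2b stands in for BLAKE3 here)"""
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":")).encode()
    return hashlib.blake2b(canonical, digest_size=32).hexdigest()

def verify(record: dict, claimed_digest: str) -> bool:
    """Integrity check: a fresh recomputation must match the stored digest.
    Constant-time comparison avoids leaking how far the digests match."""
    return hmac.compare_digest(record_digest(record), claimed_digest)

record = {"provenance": "agent-7", "place": "TSX", "purpose": "hedge"}
digest = record_digest(record)

assert verify(record, digest)          # untouched record verifies
record["purpose"] = "liquidate"
assert not verify(record, digest)      # any edit breaks verification
```

Canonical serialization (sorted keys, fixed separators) is what makes the digest reproducible by any independent verifier, with no proprietary tools.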

The Organizations That Keep the Record Lead the Ones That Don’t.

Organizations adopting AgDR gain something no AI governance framework has delivered before: a structured, atomic, tamper-evident raw data source from every high-stakes AI decision. Full reasoning trace. PPP triplet. Human accountability chain. All captured at the kernel level.

Every AgDR record is simultaneously an AI audit trail, a training dataset, and a legal instrument admissible under existing law.

Three Seats at the Founding Table

Every infrastructure that civilization now relies on had a founding moment: a small group who saw the necessity before the world did. We are at that moment for AI governance. Three roles remain open for those who understand what this is.

Technical Co-Founder

The specification exists. The standard is open. What is needed now is the builder who turns proof into permanence: the production runtime, SDK, and developer ecosystem that every AI deployment runs on. This is not a job. It is authorship of infrastructure.

Regulatory Credibility Anchor

Between a standard and a recognized standard stands one thing: the person whose name and judgment make courts and regulators lean in. The difference is not technical. It is the weight of a career applied at the right moment.

Founding Pilot Partner

Every standard that changed an industry was proven first in one real environment, by one organization willing to go first. Your deployment does not test the standard. It becomes the proof that changes the question from “why AgDR?” to “why not yet?”

founding@accountability.ai

One email. Tell us what you see that others have not yet seen.
