AI Governance for Software Development

The standards didn't disappear because AI showed up.

Prompt Codex brings the discipline of BABOK, DMBOK, IEEE, and ISO 42001 to AI-assisted development — structured requirements, governed prompts, full traceability, and audit-ready output.

Request Early Access · See the Architecture
30 yrs
Built by a practitioner
6+
Standards integrated
Aug 2026
EU AI Act enforcement
Now
Design partners forming
BABOK DMBOK IEEE 29148 DORA ISTQB ISO 42001 NIST AI RMF
The Problem

Search any job board for "AI development governance." Nothing comes back.

The category doesn't exist yet. No one is hiring for it because no one has built the tooling. But the role, the framework, and the regulatory demand are all arriving at the same time.

Cursor, Copilot, and Claude Code are generating production software from casual prompts — with no requirements verification, no compliance check, and no record of what was asked or why.

For organizations in healthcare, financial services, defense, and government, that isn't a process gap. It's a liability.

01

The standards didn't go away. BABOK, DMBOK, IEEE 29148, ISTQB — decades of methodology for how software gets built responsibly. None of it maps to AI-assisted workflows. Yet.

02

Zero audit trail. When AI-generated code causes an incident, there is no way to trace the decision back to a requirement, a prompt, or a human approval. That's indefensible in regulated environments.

03

Regulatory timelines are fixed. The EU AI Act begins enforcement for high-risk AI in August 2026. ISO/IEC 42001 is already becoming a procurement requirement. Organizations are not ready.

04

Adoption is outpacing governance. The gap between what AI can generate and what organizations can verify widens every quarter. The practitioner community recognizes it. No one has built the solution.

The Platform

Governance built for the AI development era.

Prompt Codex is not a code generation tool. It is the governance layer above those tools — the enforcement engine that ensures every AI-assisted deliverable traces from a documented requirement through a validation gate to a verified, auditable output.

Built on the Discovery Lattice framework, Prompt Codex maps the question corpus derived from industry standards — BABOK, DMBOK, IEEE, DORA, ISTQB — directly to your development workflow.

Nothing moves forward until the gate passes.
Why It Matters

The exposure is real. The frameworks already exist.

— 01 / The Gap

AI Is Already in Your Development Pipeline

Your teams are using AI coding tools in production. In most organizations, that's happening faster than governance can follow. The gap between what's being generated and what's being verified is wider than most risk officers realize.

— 02 / The Exposure

When Something Goes Wrong, the Trail Stops at the AI

There is no audit trail. No record of what was asked, what was approved, or who was accountable. In a regulated environment — healthcare, financial services, defense, government — that is not a process gap. It is a liability.

— 03 / The Frameworks

The Standards You're Already Accountable To Still Apply

ISO 42001, NIST AI RMF, and existing data governance frameworks don't disappear because AI is involved. Prompt Codex maps directly to the standards your organization already uses — no new compliance language to learn.

— 04 / The Timeline

For Organizations Touching EU Markets, the Clock Is Running

EU AI Act enforcement for high-risk AI begins August 2026. If your organization operates in or sells into EU markets, that timeline is yours. The organizations with governance structures in place before enforcement begins are in a materially different position.

Why Now

Three forces are converging.

The role, the framework, and the market are all emerging simultaneously. The question isn't whether governance will be required — the regulatory trajectory makes that inevitable. The question is what form it takes and who defines it.

AI coding tool adoption is outpacing compliance teams' ability to establish oversight. The EU AI Act begins enforcement for high-risk AI in August 2026. ISO/IEC 42001 is already a de facto procurement requirement for vendors serving regulated industries. And the practitioner standards bodies — IIBA, DAMA International — acknowledge the problem exists but haven't issued structured solutions.

I've spent 30 years building software companies and the last several years inside one of the largest healthcare IT operations in the world, watching this gap widen in real time. The standards we built as an industry didn't become irrelevant when AI arrived. They became urgent. Prompt Codex is my answer to that problem — built independently, outside any corporate role, because practitioners don't wait for policy papers.

— The Founder

Join the Waitlist.

We're forming a cohort of design partners, enterprise early adopters, and investors. If you work in governance, compliance, or AI-assisted development in regulated environments, this conversation is for you.

No spam, no sharing. Access is invitation-only.