AI & State Capacity · Reference Framework

AI Components for Government

Before authorizing any AI deployment, your ministry needs to know three things: what each component does, who controls it, and whether your institution can govern it without creating liability you cannot recover from.

This tool answers all three.
How to use it

Run the capacity self-check first. Your results draw a boundary on the table between components your institution can govern today and those that require building capability first. Click any cell for what the component does, who controls it, and what to specify in a contract.

16 questions · under 5 minutes. Your results will draw a governance boundary on the table below.
Gov. Capacity ↓ Component →
C1 Input
C2 Intelligence
C3 Memory
C4 Action
C5 Control
L1 Low · Procurement and data protection basics in place
People · Deployment owner (product manager) · Procurement officer with tech literacy · Data protection lead
Infrastructure · Basic logging infrastructure · Authenticated user base or digital ID
Governance · Data protection law in force · Standard procurement rules applicable
Td Training Data · Defines what the model learns · A
Lm Language Model · Generates text from inputs · A
Em Embeddings · Encodes meaning for similarity search · A
Pr Prompt · Directs model task behavior · A
Al Audit Log · Records every system action · A
L2 Medium · Vendor oversight, acceptance testing, documented decisions
People · Product manager with sign-off authority · Vendor manager or technical lead · Acceptance tester (in-house or contracted)
Infrastructure · API access to at least one core system · Data classification policy in place
Governance · AI-specific contract clauses · Documented decisions with named approver
Ft Fine-tuning · Adapts model to domain data · A
Mm Multimodal · Processes image and text inputs · A
Vs Vector Store · Retrieves context by similarity · A
Fc Function Call · Executes actions in external systems · S
Gr Guardrails · Blocks unsafe or invalid outputs · A
Hl Human Review · Approves outputs before execution · S
L3 High · Ongoing evaluation cycles, data engineering, formal oversight
People · Data engineer (inputs, retrieval, drift) · PM running evaluation cycles · Dedicated oversight or assurance role
Infrastructure (DPI) · Data exchange layer or interoperability stack · Audit logging infrastructure · Shared digital infrastructure accessible to AI
Governance · AI governance policy adopted · Vendor performance framework with metrics
Sd Synthetic Data · Generates artificial training data · A
Sm Small Model · Runs lightweight local inference · A
Rg RAG · Grounds responses in source documents · A
Kg Knowledge Graph · Connects entities across government records · A
Ag Agent · Orchestrates sequential task workflows · S
Rt Red-teaming · Stress-tests system vulnerabilities · A
L4 Full · Full operational capacity
People · Senior PM with incident response authority · In-house administrative law function · Security architect or AI safety lead
Infrastructure (DPI) · Citizen-facing digital services with appeals · Audit logging validated by data protection authority · Interoperable national ID for automated decisions
Governance · Legal basis for automated decisions established · Named ministerial accountability
Lf Live Data Feed · Pulls real-time external data · A
Rm Reasoning Model · Chains structured reasoning steps · S
The Research Frontier
Ma Multi-agent · Coordinates distributed task systems · D
Ip Interpretability · Explains model decision pathways · A
Capacity Level ↓
L1 Low
L2 Medium
L3 High
L4 Full
Ownership
🏛 Government-controlled
⚖ Shared (you configure; vendor runs the engine)
🏢 Vendor-controlled
Decision type
A Advisory · informs human decision
S Assisted · human confirms before action
D Deterministic · acts on records or processes
Click any cell to read its definition and key questions
Scope: LLM and generative AI components only.
Rows = government capacity required, not AI autonomy.
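
To make the boundary concrete, here is a minimal sketch (illustrative Python, not part of the tool) that encodes each table row as the capacity level its components require and splits the full component set at an assessed level. The level-to-component mapping mirrors the rows above; the rule that a component is governable once your assessed level meets or exceeds its row is an assumption drawn from the row definitions, not an official scoring formula.

```python
# Illustrative sketch: components grouped by the capacity level their row requires.
# Mirrors the table above (symbols as shown in each cell).
COMPONENTS_BY_REQUIRED_LEVEL = {
    1: ["Td", "Lm", "Em", "Pr", "Al"],          # L1 Low
    2: ["Ft", "Mm", "Vs", "Fc", "Gr", "Hl"],    # L2 Medium
    3: ["Sd", "Sm", "Rg", "Kg", "Ag", "Rt"],    # L3 High
    4: ["Lf", "Rm", "Ma", "Ip"],                # L4 Full (incl. the research frontier)
}

def governance_boundary(assessed_level: int) -> tuple[list[str], list[str]]:
    """Split components into those governable today and those needing capability-building.

    Assumption: a component is governable when its required level is at or
    below the institution's assessed capacity level.
    """
    governable, build_first = [], []
    for required_level, components in COMPONENTS_BY_REQUIRED_LEVEL.items():
        target = governable if required_level <= assessed_level else build_first
        target.extend(components)
    return governable, build_first

# Example: an institution assessed at L2 Medium.
within, beyond = governance_boundary(assessed_level=2)
print("Governable today:      ", within)
print("Build capability first:", beyond)
```

Under this reading, an institution assessed at L2 would treat Function Call and Human Review (both assisted) as the outer edge of what it can currently govern; everything in rows L3 and L4 goes on the readiness roadmap first.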
Cost · AI is metered. A scaled deployment can exhaust a multi-year budget in weeks. Require consumption caps in every contract (see the cost sketch after these notes).
Data Residency · Processing data in a foreign jurisdiction may conflict with sovereignty law regardless of which components you use.
Security & Monitoring · Access controls, runtime drift detection, and incident response are not optional. Specify them or they will not exist.
Exit & Portability · Specify export formats and migration paths before signature. After lock-in, this negotiation is over.
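
To illustrate how quickly metered usage compounds, the sketch below runs the arithmetic with placeholder figures. Every number in it (per-token rate, request volume, cap) is hypothetical, not a quoted price; substitute the vendor's actual pricing schedule and your own traffic estimates.

```python
# Back-of-envelope cost model. All figures are placeholders, not quoted prices.
TOKENS_PER_REQUEST = 2_500      # assumed average prompt + response size
PRICE_PER_1K_TOKENS = 0.01      # hypothetical metered rate, USD
REQUESTS_PER_DAY = 200_000      # e.g. a high-volume citizen-facing service

daily_cost = REQUESTS_PER_DAY * TOKENS_PER_REQUEST / 1_000 * PRICE_PER_1K_TOKENS
annual_cost = daily_cost * 365
print(f"Daily spend:  ${daily_cost:,.0f}")    # $5,000/day under these assumptions
print(f"Annual spend: ${annual_cost:,.0f}")   # ~$1.8M/year, open-ended without a cap

# A contractual consumption cap converts open-ended exposure into a fixed ceiling.
MONTHLY_CAP_USD = 50_000
days_until_cap = MONTHLY_CAP_USD / daily_cost
print(f"Days of traffic before the monthly cap is reached: {days_until_cap:.0f}")
```

The point is not the specific numbers but the shape: volume, not unit price, is what exhausts budgets, so the cap and the action taken when it is hit both belong in the contract.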
What it is
Who controls it · In practice
Procurement questions
Add this component to your readiness roadmap. When you're done, generate a Strategic Report across all selected components.
AI Components for Government · Companion Instrument
AI Governance Readiness
Institutional Diagnostic
24 scored questions across four dimensions, plus four context questions.
Pick the option that describes your organization today, not where you plan to be.

This instrument is a self-assessment for government teams, not an external audit or a certification checklist. It covers four institutional dimensions: the digital infrastructure your country operates, the data your AI would depend on, the people and authority structures needed to govern it, and the capacity to challenge vendors and detect failure. There is no composite score: the output is a profile across the four dimensions, each placed on a four-level scale. Where you are weakest is where to start.
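As a concrete illustration of what a profile (rather than a score) looks like, the sketch below builds one from hypothetical answers. The per-dimension question counts mirror the questionnaire; the aggregation rule (rounded mean of each dimension's answers) is an assumption for illustration only, not the instrument's actual scoring method.

```python
# Illustrative profile construction. The answer values are invented; the
# rounded-mean aggregation is an assumption, not the instrument's rule.
from statistics import mean

LEVELS = {1: "Fragile", 2: "Partial", 3: "Functional", 4: "Robust"}

# Hypothetical answers, each scored 1-4, grouped by dimension.
answers = {
    "Digital Public Infrastructure": [3, 3, 4, 2, 3],        # Q1-Q5
    "Data Readiness":                [2, 2, 1, 2, 2],        # Q6-Q10
    "People and Authority":          [3, 2, 3, 3, 2, 3, 3],  # Q11-Q17
    "Capacity to Challenge":         [1, 2, 1, 2, 1, 1, 2],  # Q18-Q24
}

# No composite score: the output is one level per dimension.
profile = {dim: round(mean(scores)) for dim, scores in answers.items()}
for dim, level in profile.items():
    print(f"{dim}: L{level} {LEVELS[level]}")

# The weakest dimension is where to start.
weakest = min(profile, key=profile.get)
print(f"Start here: {weakest} (L{profile[weakest]} {LEVELS[profile[weakest]]})")
```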

About Your Organization
These four questions frame your profile. They are not scored.
Context 01 · What level of government are you assessing?
Context 02 · What is the country's income classification?
Context 03 · Which sector best describes the services you deliver?
Context 04 · Where is your organization in AI deployment?
Digital Public Infrastructure
Country-level conditions that determine whether AI can operate coherently across agencies at all.
Q 01 · Digital Identity: If you needed to verify a citizen's identity digitally across two agencies today, what would happen?
Q 02 · Data Exchange: Can your ministry receive structured data from another ministry automatically, without a manual export?
Q 03 · Digital Payments: Can government services disburse or receive payments digitally without requiring physical presence?
Q 04 · Connectivity and Access: What proportion of the citizens your services reach have reliable enough internet access to use a digital service without assistance?
Q 05 · Platform Trust: When your government launches a new digital service, what typically happens?
Data Readiness
Whether the data an AI system would depend on is usable, legal, and representative.
Q 06 · Digitization: Is the data your ministry would use to operate an AI system currently in machine-readable form?
Q 07 · Legal Basis: If you wanted to use administrative data from your ministry in an AI system, is the legal basis clear?
Q 08 · Cross-Agency Sharing: If your AI system needed data from another ministry, could you get it legally and practically?
Q 09 · Coverage and Representation: Does the data your services generate actually cover the full population you are meant to reach?
Q 10 · Data Quality: If you pulled the same dataset from two different points in time, would the structure and definitions be consistent?
People and Authority
Whether the right people exist, with the right mandates, to govern AI deployment.
Q 11 · Acceptance Testing: An AI vendor delivers a system. Who checks whether it actually works before it goes live?
Q 12 · Data Stewardship: Who knows whether your administrative records are complete, legally usable, and representative of the population?
Q 13 · Product Authority: A vendor delivers a component that does not meet requirements. Who has the authority to formally reject it and halt deployment?
Q 14 · Senior Accountability: If an AI system caused harm to a citizen, who is the named official who would be called to account?
Q 15 · Override Authority: A frontline staff member disagrees with an AI recommendation that affects a citizen's case. What happens?
Q 16 · Escalation and Suspension: An AI system starts producing outputs that seem wrong. Who decides to suspend it, and how quickly can that happen?
Q 17 · Procurement Literacy: A vendor submits a technical specification for an AI system. Who in your organization reads it critically?
Capacity to Challenge and Detect
Whether the organization can detect when AI systems fail, degrade, or cause harm.
Q 18 · Vendor Challenge: A vendor claims their AI system is 94% accurate. Who decides whether that claim is credible and sufficient for your use case?
Q 19 · Post-Deployment Monitoring: Your AI system has been live for six months. Who checks whether it is still performing as expected?
Q 20 · Bias Detection: You suspect your AI system is performing worse for a specific demographic or region. Who investigates?
Q 21 · Independent Audit: Could your organization assess an AI system's performance without relying on the vendor to run it?
Q 22 · Feedback Loop Awareness: AI systems can alter the data they depend on over time. A triage model changes referral patterns, then trains on data it already shaped. Who in your organization would notice this?
Q 23 · Automation Bias: Frontline staff are known to over-rely on AI outputs even when their own judgment would have been better. What has your organization done about this?
Q 24 · Institutional Memory: If the three people most responsible for your AI systems left tomorrow, what would happen to governance and monitoring?
AI Governance Readiness · Institutional Profile
Your Readiness Profile
This profile shows where your organization sits across four institutional dimensions. Each dimension is placed on a four-level scale: Fragile, Partial, Functional, and Robust. The weakest dimension is where to start. A profile that is strong on DPI and data but weak on people is a different problem from the reverse, and requires a different intervention.
Government Level·
Income Group·
Sector·
Deployment Stage·
Section I · Digital Public Infrastructure: L1 Fragile · L2 Partial · L3 Functional · L4 Robust
Section II · Data Readiness: L1 Fragile · L2 Partial · L3 Functional · L4 Robust
Section III · People and Authority: L1 Fragile · L2 Partial · L3 Functional · L4 Robust
Section IV · Capacity to Challenge: L1 Fragile · L2 Partial · L3 Functional · L4 Robust
Deployment authorization

Overall signal
