/praetom

Production intelligence for AI-generated code.

A model of your codebase joined with your operational reality, queried by the agents that write your software.

praetom · live
/praetom how’s checkout going today?
checkout · feature_health · live

// is it being used?

2,341 ↗ +12% w/w

// is it working?

2,289 of 2,341 made it through · 52 got stuck

last issue · 6 days ago · 14 customers affected

[ show me what changed → ]

praetom.feature_health("checkout")

slash command · claude · cursor · slack · linear · github

01 · The problem

Teams ship by feature. The operational stack still speaks service. When checkout breaks, the user doesn’t care which microservice failed — but every dashboard makes you do that translation by hand.

More of the code is being written by agents that don’t read it afterward. Service boundaries mean nothing to them. The feature is the only abstraction that actually exists for the team operating it.

02 · How it works
// your codebase

your-app/
├─ app/api/checkout/route.ts
├─ app/api/users/signup/route.ts
├─ lib/cart/session.ts
├─ app/api/orders/create.ts
├─ app/api/search/route.ts
├─ workers/email/welcome.ts
├─ app/api/checkout/payment.ts
├─ app/api/orders/[id]/route.ts
└─ app/api/search/filter.ts

9 files · 4 features detected · across 3 directories

praetom · reads · derives · instruments

// features, in plain english

checkout · Customer completes a purchase.
  3 endpoints · auto-instrumented · 4 spans

orders · Order creation and lookup.
  2 endpoints · auto-instrumented · 3 spans

onboarding · New user signs up and verifies.
  2 endpoints · auto-instrumented · 2 spans

search · Customer finds a product.
  2 endpoints · auto-instrumented · 2 spans

praetom reads your repo · names your features · instruments them

  1. A feature contract, in plain English.

    You write what the feature does, what counts as healthy, what an incident looks like. “checkout succeeds within 400ms for 99.9% of users.”

  2. A feature map, built by AI.

    praetom reads the codebase and traces every path that implements the contract — across services, queues, databases, frontend, external APIs. The graph updates as the code changes.

  3. A consultable production reality.

    On top of the graph: code, telemetry, incidents, narrative. Coding agents query it before generating. The PR bot queries it on diff. You query it in Slack. Same call, structured data for the agent + a rendered widget for you.
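A minimal sketch of what steps 1 and 3 could look like together — a contract plus a health check against a telemetry window. The shape is hypothetical: the field names (`slo`, `successRate`, `latencyMsP999`) and the `isHealthy` helper are illustrative, not praetom’s API; the numbers are taken from the checkout widget above.

```typescript
// Hypothetical contract shape — illustrative, not praetom's schema.
interface FeatureContract {
  feature: string;
  description: string;      // plain English: what the feature does
  slo: {
    successRate: number;    // fraction of attempts that must complete
    latencyMsP999: number;  // p99.9 latency budget, in ms
  };
}

interface WindowStats {
  attempts: number;
  completed: number;
  latencyMsP999: number;
}

// "checkout succeeds within 400ms for 99.9% of users"
const checkout: FeatureContract = {
  feature: "checkout",
  description: "Customer completes a purchase.",
  slo: { successRate: 0.999, latencyMsP999: 400 },
};

function isHealthy(c: FeatureContract, w: WindowStats): boolean {
  const rate = w.attempts === 0 ? 1 : w.completed / w.attempts;
  return rate >= c.slo.successRate && w.latencyMsP999 <= c.slo.latencyMsP999;
}

// Today's window from the widget: 2,289 of 2,341 made it through.
const today: WindowStats = { attempts: 2341, completed: 2289, latencyMsP999: 380 };
console.log(isHealthy(checkout, today)); // 2289/2341 ≈ 0.978 < 0.999 → false
```

The point of the plain-English contract is that this evaluation is derived from it, not written by hand per feature.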

The LLM is the new primary writer of code. Shift-left reaches its asymptote at the moment of generation. Nothing is earlier.

03 · Where you use it

praetom doesn’t add a tab. It lands in the tools your team is already in. One MCP server backs Claude, Cursor, and ChatGPT; one Slack app handles the team-broadcast slot. Same answer everywhere.
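One way “same answer everywhere” could be wired, sketched under assumptions — `ToolResult`, `featureHealth`, and the hard-coded numbers (lifted from the checkout widget above) are illustrative, not praetom’s implementation:

```typescript
// Hypothetical: one tool call returns two payloads — structured data for
// the agent, a rendered line for the human surface (Slack, widget).
interface ToolResult<T> {
  data: T;          // what the coding agent consumes
  rendered: string; // what the human sees
}

interface FeatureHealth {
  feature: string;
  usersThisWeek: number;
  completed: number;
  stuck: number;
}

function featureHealth(feature: string): ToolResult<FeatureHealth> {
  // In practice this would query the feature graph + telemetry;
  // here the numbers are hard-coded from the example above.
  const data: FeatureHealth = { feature, usersThisWeek: 2341, completed: 2289, stuck: 52 };
  return {
    data,
    rendered: `${data.completed} of ${data.usersThisWeek} made it through · ${data.stuck} got stuck`,
  };
}

const result = featureHealth("checkout");
console.log(result.rendered); // "2289 of 2341 made it through · 52 got stuck"
```

The MCP server and the Slack app then become thin frontends over the same call.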

GitHub · PR check

Low risk · 3 priors clean


this diff touches
checkout · refund
confidence
0.78 retrieval · 0.81 causal

similar pattern in #1142 · clean

[ open praetom analysis ↗ ]

MCP · tool call

praetom.what_changed("checkout")

{
  "since": "30d",
  "prs": [
    { "id":1247, "by":"@dani" },
    { "id":1233, "by":"@miles" },
    { "id":1221, "by":"@dani" }
  ]
}
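The payload above is small enough to type out. A hedged sketch of how an agent-side consumer might use it — the `WhatChanged` interface mirrors the example response, and `prsByAuthor` is a hypothetical helper, not part of a published schema:

```typescript
// Shape mirroring the example what_changed response above.
interface WhatChanged {
  since: string;                     // lookback window, e.g. "30d"
  prs: { id: number; by: string }[]; // PRs that touched the feature
}

// Count PRs per author — e.g. to answer "who's been on checkout?"
function prsByAuthor(changed: WhatChanged): Record<string, number> {
  const counts: Record<string, number> = {};
  for (const pr of changed.prs) counts[pr.by] = (counts[pr.by] ?? 0) + 1;
  return counts;
}

const response: WhatChanged = {
  since: "30d",
  prs: [
    { id: 1247, by: "@dani" },
    { id: 1233, by: "@miles" },
    { id: 1221, by: "@dani" },
  ],
};
console.log(prsByAuthor(response)); // @dani: 2, @miles: 1
```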

five tools in the praetom mcp

Slack · DM

austen10:14 AM

is anyone on checkout right now?

praetom10:14 AM

@dani has 3 open PRs touching it.

last shipped 2h ago — clean so far.

no incidents this week.


conversation answers · no widget required

Cursor · inline

refund.ts · claude is writing

47  async function processRefund(req) {
48  ↳ consulting praetom...
    similar to: checkout
    existing pattern: line 47
    avoid: retry loop (#1233)
49  await validateRefund(req);
50}

agent reads praetom before it writes

claude.md · skill

.claude/skills/praetom.md

---
name: praetom
description: feature health,
  usage, incidents, ownership
when_to_invoke: feature
  mentioned by name
---

versioned in repo · auto-consulted by claude

Linear · ticket

CXN-1247 · in review · 2h

Fix payment retry loop in checkout

praetom · blast radius
if this regresses
↳ ~52 customers blocked / day
↳ ~$8.4k revenue at risk / day

priors · 2 incidents tagged retry

[ show fix in this PR → ]
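The revenue line in that blast radius is simple arithmetic once the feature graph knows the stuck count. A hedged sketch — the $162 average order value is an assumption picked to match the example, not a praetom default:

```typescript
// Hypothetical blast-radius math for the ticket above.
const stuckPerDay = 52;       // customers blocked daily, from feature telemetry
const avgOrderValueUsd = 162; // assumed per-order revenue (illustrative)

const revenueAtRiskPerDay = stuckPerDay * avgOrderValueUsd;
console.log(`~$${(revenueAtRiskPerDay / 1000).toFixed(1)}k revenue at risk / day`);
// → "~$8.4k revenue at risk / day"
```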

04 · Why now

Four independent technical shifts had to land. All four did, in early 2026.

LLMs that can read a codebase and trace call graphs.

OpenTelemetry at critical mass — feature tags propagate cleanly across services.

MCP as the standardized agent integration surface — install once, every agent gets access.

MCP rich-UI just shipped — the same tool call serves a JSON return for the agent and a rendered widget for the human.

05 · What's different

vs. Sentry / Datadog

Their primitive is the service; their SLO is a latency budget. Their tools assume a code-literate user. praetom’s primitive is the feature; the contract is plain English.

vs. PostHog

Frontend-event-centric. praetom instruments backend along the same feature boundaries the frontend events already mark.

vs. Bits AI / Seer / Resolve

Reactive. They answer “what just broke.” praetom answers “what’s about to break” — before generation.

vs. Vercel Agent

Platform-locked to Vercel-hosted apps. praetom runs on Railway, Fly, AWS, GCP, self-hosted.
06 · Install

07 · Get started

Onboarding teams by hand so the first month is right. Sign up and we’ll be in touch with next steps.

Sign up →