
I got tired of EXPLAIN tools that sent my query plans to AI services or hallucinated index suggestions. So I built something different.
Last month I was debugging a slow query at work. The EXPLAIN output was 400 lines of JSON. I tried three different "AI-powered" analyzers, and none of them helped.
I just wanted to know: where did the time actually go?
Current tools fall into two camps: those that send your query plan off to an AI service, and those that guess at index suggestions.
Neither tells you "4.5 seconds were spent in triggers, 0.03ms in the scan."
PlanCheck is a client-side PostgreSQL EXPLAIN analyzer with one rule: truth only.
| Issue | What it looks like |
|---|---|
| Trigger amplification | 4.5s in triggers, 0.03ms in scan |
| JIT overhead | 700ms compilation to save 100ms |
| Bitmap recheck failure | Index returns 15K rows, all discarded |
| Recursive CTE runaway | 100K iterations with 99.9% filter rejection |
| Partition pruning failure | 53M rows scanned, 1 returned |
| ... plus 13 more | (too many to list, but you get the idea) |
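The trigger case is a good example of why node-level views miss things. In PostgreSQL's `EXPLAIN (ANALYZE, FORMAT JSON)` output, trigger time lives in a top-level `Triggers` array, outside the plan tree entirely. A minimal sketch of surfacing it (the field names follow PostgreSQL's JSON format; the types and function are illustrative, not PlanCheck's actual code):

```typescript
// EXPLAIN (ANALYZE, FORMAT JSON) emits an array; each element looks like
// { "Plan": {...}, "Triggers": [...], "Execution Time": ms }.
interface Trigger {
  "Trigger Name": string;
  Time: number;  // total ms spent in this trigger across all calls
  Calls: number;
}

interface ExplainRoot {
  Plan: unknown;
  Triggers?: Trigger[];
  "Execution Time"?: number; // ms
}

// Fraction of total execution time spent in triggers.
function triggerShare(root: ExplainRoot): number {
  const triggerMs = (root.Triggers ?? []).reduce((sum, t) => sum + t.Time, 0);
  const totalMs = root["Execution Time"] ?? 0;
  return totalMs > 0 ? triggerMs / totalMs : 0;
}

// A plan where 4.5s of a 4.53s query went to triggers:
const root: ExplainRoot = {
  Plan: {},
  Triggers: [{ "Trigger Name": "audit_row", Time: 4500, Calls: 15000 }],
  "Execution Time": 4530,
};
```

Because `Triggers` sits beside the plan rather than inside it, any tool that only walks the node tree will attribute that 4.5 seconds to nothing at all.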
Parsing EXPLAIN output in the browser sounds easy. It's not.
PostgreSQL has 40+ node types. Each has different fields, metrics, and edge cases. I had to handle:
- TEXT format (messy indentation, inconsistent fields)
- null values (missing timing data)

All while keeping the bundle under 200KB.
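The TEXT format is the messy one: tree structure is encoded only in indentation, and the indent width isn't fixed. A sketch of one way to recover node depth by tracking indents on a stack (this handles only the common `->` lines and is not PlanCheck's actual parser):

```typescript
// Recover tree depth from TEXT-format EXPLAIN output.
// Real plans are messier (Workers, InitPlans, wrapped lines); this sketch
// only handles plain "->" node lines and skips detail lines like "Filter:".
function nodeDepths(text: string): { depth: number; node: string }[] {
  const out: { depth: number; node: string }[] = [];
  const indents: number[] = []; // stack of indentation widths on the current path
  for (const line of text.split("\n")) {
    const m = line.match(/^(\s*)->\s+(.+)$/);
    if (!m) {
      // First non-arrow line is the root node; others are node details.
      if (out.length === 0 && line.trim()) out.push({ depth: 0, node: line.trim() });
      continue;
    }
    const indent = m[1].length;
    // Pop siblings/ancestors at the same or deeper indentation.
    while (indents.length && indents[indents.length - 1] >= indent) indents.pop();
    indents.push(indent);
    out.push({ depth: indents.length, node: m[2] });
  }
  return out;
}

const plan = [
  "Sort  (cost=25.05..25.07 rows=8 width=12)",
  "  Sort Key: a.x",
  "  ->  Nested Loop  (cost=0.29..24.93 rows=8 width=12)",
  "        ->  Seq Scan on a  (cost=0.00..8.00 rows=8 width=8)",
  "        ->  Index Scan using b_pkey on b  (cost=0.29..2.11 rows=1 width=8)",
].join("\n");
```

Tracking an indent stack instead of dividing by a fixed width is what keeps this robust to the inconsistent indentation mentioned above.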
When a tool suggests "add an index," it's guessing your intent. Maybe you want that full table scan for analytics. Maybe the index exists but PostgreSQL chose not to use it.
So PlanCheck only reports measurable facts: where the time went, how many rows each node produced and discarded, and what actually executed.
You decide what to fix.
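Concretely, a "facts only" finding can be modeled as measurements plus a plan location, with no fix attached. A hypothetical shape (names are illustrative, not PlanCheck's real API):

```typescript
// A finding carries only what was measured and where it was measured.
interface Finding {
  issue: string;                    // e.g. "Bitmap recheck failure"
  nodePath: number[];               // index path into the plan tree
  measured: Record<string, number>; // raw numbers from the plan (ms, rows)
  evidence: string;                 // states only what was observed
}

const example: Finding = {
  issue: "Partition pruning failure",
  nodePath: [0, 1],
  measured: { rowsScanned: 53_000_000, rowsReturned: 1 },
  evidence: "53M rows scanned across partitions, 1 row returned.",
};
// Note what is absent: no "add an index", no rewritten query.
// Whether that scan was intentional is the reader's call.
```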
Paste your `EXPLAIN (ANALYZE, BUFFERS, FORMAT JSON)` output. No signup. No data collection.
I'm looking for edge cases I haven't tested yet. If you have a query plan that breaks it, I want to see it. Drop a comment or reach out!