Arkforge

An honest look at building a REST API that monitors websites and uses AI to explain what changed. The wins, the mistakes, and why I have zero users.
Two weeks ago I had a dumb problem: I was manually checking 8 different web pages every day. Competitor pricing. Job boards. Regulatory updates.
25 minutes a day, every day. On something a script should handle.
So I built ArkWatch — a REST API that monitors URLs and uses AI to tell you what actually changed, not just that something changed.
Here's my honest take.
You give it a URL. It watches it. When something meaningful changes, AI summarizes what happened in plain English.
```bash
curl -X POST https://watch.arkforge.fr/api/v1/watches \
  -H "X-API-Key: your_key" \
  -H "Content-Type: application/json" \
  -d '{"url": "https://competitor.com/pricing", "name": "Pricing Page"}'
```
When something changes:
```json
{
  "changes_detected": true,
  "ai_summary": "Enterprise plan price increased from $79 to $99/mo. New 'Teams' tier added at $49/mo.",
  "ai_importance": "high",
  "sentiment": "neutral"
}
```
That ai_summary is the whole point. Most monitoring tools give you a raw HTML diff. That's useless for 90% of use cases.
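If curl isn't your thing, the same flow from Python looks roughly like this. The POST mirrors the curl call above; the GET path and the `id` field are my guesses at the read side, so check the Swagger docs for the real shapes:

```python
import requests

API = "https://watch.arkforge.fr/api/v1"
HEADERS = {"X-API-Key": "your_key"}

# Create a watch (mirrors the curl call above).
resp = requests.post(
    f"{API}/watches",
    headers=HEADERS,
    json={"url": "https://competitor.com/pricing", "name": "Pricing Page"},
)
resp.raise_for_status()
watch = resp.json()

# Hypothetical read endpoint and response fields; see /docs for the real ones.
latest = requests.get(f"{API}/watches/{watch['id']}", headers=HEADERS).json()
if latest.get("changes_detected"):
    print(latest["ai_summary"], "/", latest["ai_importance"])
```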
FastAPI + auto-generated docs saved me days.
I can point anyone to watch.arkforge.fr/docs and they get interactive Swagger docs. Test endpoints in the browser. No README needed.
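For anyone who hasn't used FastAPI: the docs come for free from type hints and Pydantic models. A minimal sketch of what generates that Swagger page (not ArkWatch's actual code) looks like this:

```python
from fastapi import FastAPI, Header
from pydantic import BaseModel, HttpUrl

app = FastAPI(title="ArkWatch", version="0.1.0")

class WatchCreate(BaseModel):
    url: HttpUrl
    name: str

@app.post("/api/v1/watches")
def create_watch(watch: WatchCreate, x_api_key: str = Header(...)):
    # FastAPI validates the body against WatchCreate and documents this
    # endpoint (schema, headers, example payloads) at /docs automatically.
    return {"id": 1, "url": str(watch.url), "name": watch.name}
```

Define the model once and you get validation, serialization, and interactive docs without writing any of it by hand.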
AI summarization is genuinely useful.
I wasn't sure feeding diffs to an LLM would produce anything valuable. It does. Mistral AI is surprisingly good at extracting "what matters" from a wall of HTML changes.
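The core idea is simple: hand the cleaned diff to a chat model with a tight prompt. A rough sketch against Mistral's chat completions endpoint follows; the actual prompt, model choice, and post-processing are whatever ArkWatch really uses:

```python
import requests

def summarize_diff(diff_text: str, api_key: str) -> str:
    """Ask Mistral to explain a content diff in plain English.

    Sketch only: model name and prompt are illustrative assumptions.
    """
    resp = requests.post(
        "https://api.mistral.ai/v1/chat/completions",
        headers={"Authorization": f"Bearer {api_key}"},
        json={
            "model": "mistral-small-latest",
            "messages": [
                {
                    "role": "system",
                    "content": "Summarize what meaningfully changed on a web page. "
                               "One or two sentences, plain English.",
                },
                {"role": "user", "content": diff_text[:8000]},  # keep the prompt bounded
            ],
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]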
Boring tech ships fast.
No Kubernetes. No microservices. No message queue. Just FastAPI on a single VPS with systemd workers.
| Component | Choice |
|---|---|
| Framework | FastAPI (Python) |
| AI | Mistral API |
| Workers | Systemd |
| Proxy | Nginx + Let's Encrypt |
| Infra | Single VPS, < $5/month |
API response time: ~50ms.
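In case "systemd workers" sounds fancier than it is: each worker is a unit file, something like this (paths and names are illustrative, not my actual config):

```ini
# /etc/systemd/system/arkwatch-worker.service (illustrative)
[Unit]
Description=ArkWatch check worker
After=network.target

[Service]
ExecStart=/opt/arkwatch/venv/bin/python -m worker
Restart=always
RestartSec=5
User=arkwatch

[Install]
WantedBy=multi-user.target
```

`Restart=always` is the whole "orchestration layer". It restarts crashed workers, which is all a project this size needs.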
I underestimated noise filtering.
This is the hard part nobody warns you about. Web pages "change" constantly:
- Timestamps and "last updated" strings
- Rotating ads and tracking pixels
- Session tokens and nonces baked into the markup
- A/B test variants and randomized class names
My first version flagged every page as "changed" on every single check. Useless.
I had to build a pipeline (see the sketch below):
- Strip scripts, styles, nav, and footers.
- Hash content sections separately.
- Maintain ignore patterns for known dynamic elements.
- Only flag changes that pass a significance threshold.
- Then run the AI analysis.
This took longer than the entire rest of the project combined.
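Here is a compressed sketch of the shape of that pipeline, with made-up chunk sizes and thresholds; the real ignore patterns and scoring are the part that took two weeks:

```python
import difflib
import hashlib
import re
from bs4 import BeautifulSoup

# Illustrative ignore pattern: blank out clock times before diffing.
IGNORE_PATTERNS = [re.compile(r"\b\d{1,2}:\d{2}(:\d{2})?\b")]

def clean(html: str) -> str:
    soup = BeautifulSoup(html, "html.parser")
    # 1. Strip the chrome that changes without meaning anything.
    for tag in soup(["script", "style", "nav", "footer", "header"]):
        tag.decompose()
    text = soup.get_text(" ", strip=True)
    # 2. Remove known dynamic fragments.
    for pat in IGNORE_PATTERNS:
        text = pat.sub("", text)
    return text

def section_hashes(text: str, size: int = 500) -> list[str]:
    # 3. Hash chunks separately so one noisy section doesn't
    #    mark the whole page as changed.
    chunks = [text[i:i + size] for i in range(0, len(text), size)]
    return [hashlib.sha256(c.encode()).hexdigest() for c in chunks]

def is_significant(old: str, new: str, threshold: float = 0.02) -> bool:
    # 4. Only escalate to the LLM when enough text actually changed.
    ratio = difflib.SequenceMatcher(None, old, new).ratio()
    return (1 - ratio) >= threshold
```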
I built the easy parts first.
API endpoints? Two hours. Change detection that actually works? Two weeks and counting. Should have validated the hard part before writing a single endpoint.
Zero users.
The API works. The tech is solid. I have zero paying users. Building is the fun part. Distribution is the hard part. Every developer knows this, and yet here I am writing about it on Dev.to hoping someone reads it.
| Metric | Value |
|---|---|
| Development time | ~2 weeks |
| Monthly server cost | < $5 |
| API response time | ~50ms |
| Paying customers | 0 |
| Beta testers | looking for them |
| Lines of code | ~3000 |
Not a success story. A work-in-progress story.
| Plan | Watches | Check Interval | Price |
|---|---|---|---|
| Free | 3 | Daily | $0 forever |
| Starter | 10 | Hourly | $9/mo |
| Pro | 50 | Every 5 min | $29/mo |
Free tier is permanent. No credit card required.
I need 3-5 beta testers who have a real use case. Not "I'll sign up and forget" — someone who actually checks web pages regularly and would benefit from automation.
Use cases I'm thinking about:
- Competitor pricing pages
- Job boards and careers pages
- Regulatory and policy updates
Free tier: 3 URLs, daily checks. No strings.
API docs: watch.arkforge.fr/docs
Website: arkforge.fr
If you could set up automated monitoring on any web page right now, what would it be? And what would you want the alert to tell you?
I'm trying to understand real use cases before building features nobody wants.
Building in public. Infra cost: $5/month. Revenue: $0. Users: 0. Optimism: unreasonable.
I read every comment. AMA.