TL;DR: I shipped LinkPreview.io - a tool that shows how your URL looks when shared on 6 social platforms. No signup, no BS, just paste a URL and see all your preview cards + get a scored list of fixes.
You know that feeling when you share your blog post on Twitter, and instead of your beautiful featured image, it shows... nothing? Or worse, a random screenshot from halfway down the article?
I had this happen with my podcast episode pages. Everything looked fine when I tested locally. The og:image tag was there in the HTML. But Twitter refused to show the image. I spent 30 minutes jumping between each platform's official debugger tools.
Each tool showed me slightly different data. Each one required authentication. And none of them showed me what my link would actually look like in the feed.
I thought: "There has to be a better way."
Narrator: There wasn't.
So I built one.
LinkPreview.io - Paste any URL, see instant previews for all six platforms.
Plus, it scores your meta tags 0-100 and gives you actionable fixes with copy-paste code snippets.
🔗 Try it: linkpreview.io
SvelteKit - I chose Svelte 5 (with runes) for its explicit reactivity and small compiled output.
Cloudflare Pages - Edge deployment, zero config, serverless API routes. I can deploy with git push. It's beautiful.
Cheerio - Server-side HTML parsing. Lighter than Puppeteer, perfect for extracting <meta> tags.
Tailwind v4 - Dark mode first, utility-first, fast.
The core feature is: user pastes a URL, my server fetches it, extracts meta tags, returns JSON.
Simple, right?
Wrong.
If I just do `fetch(userUrl)`, an attacker can point my server at internal targets:

- `http://192.168.1.1`
- `http://169.254.169.254/latest/meta-data/`
- `http://internal-admin-panel:8080`
My solution:
```js
import dns from 'node:dns/promises';
import { isPrivateIp } from './ssrf-protection.js';

const MAX_REDIRECTS = 5;

async function safeFetch(url) {
  let hops = 0;

  while (true) {
    // 1. Parse and validate the URL
    const parsed = new URL(url);

    // 2. Pre-fetch IP check: resolve the hostname and vet every returned IP
    const ips = await dns.resolve(parsed.hostname);
    if (ips.some(isPrivateIp)) {
      throw new Error('Private IP addresses are not allowed');
    }

    // 3. Fetch with redirect: 'manual' so we can inspect each hop
    const response = await fetch(url, { redirect: 'manual' });
    if (response.status < 300 || response.status >= 400) {
      return response;
    }

    // 4. Follow the redirect ourselves, re-running the IP check on the
    //    new destination (with a cap so redirect chains can't loop forever)
    if (++hops > MAX_REDIRECTS) {
      throw new Error('Too many redirects');
    }
    // The Location header may be relative; resolve it against the current URL
    url = new URL(response.headers.get('location'), url).href;
  }
}
```
The trick: even if the initial URL is public, a redirect can send you to a private IP. You have to check every hop.
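The `isPrivateIp` helper is imported above but not shown. As a rough illustration (my own sketch, not the actual `ssrf-protection.js`, and IPv4-only for brevity), it might look like:

```javascript
// Hypothetical sketch of an isPrivateIp check -- the real module may differ.
// Covers the common private/reserved IPv4 ranges; fails closed on anything else.
function isPrivateIp(ip) {
  const parts = ip.split('.').map(Number);
  if (parts.length !== 4 || parts.some((n) => Number.isNaN(n))) {
    // Not a plain IPv4 address (e.g. IPv6): treat as private to be safe
    return true;
  }
  const [a, b] = parts;
  return (
    a === 10 ||                          // 10.0.0.0/8
    a === 127 ||                         // 127.0.0.0/8 (loopback)
    (a === 172 && b >= 16 && b <= 31) || // 172.16.0.0/12
    (a === 192 && b === 168) ||          // 192.168.0.0/16
    (a === 169 && b === 254) ||          // 169.254.0.0/16 (link-local/metadata)
    a === 0                              // 0.0.0.0/8
  );
}
```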
I also added a blocklist of known cloud metadata hostnames (169.254.169.254, metadata.google.internal, etc.).

Every platform truncates titles and descriptions at different lengths:
| Platform | Title Limit | Description Limit |
|---|---|---|
|  | 70 chars | 200 chars |
|  | ~100 chars | ~300 chars |
|  | 60 chars | 160 chars |
|  | 120 chars | ~150 chars |
| Slack | No limit | ~150 chars |
| Discord | No limit | ~200 chars |
These are all undocumented and vary by context (e.g., Twitter timeline vs. DMs).
I built visual truncation bars:
```svelte
<script>
  let titleLength = $derived(metaTags.title?.length || 0);
  let twitterLimit = 70;
  let truncationPercent = $derived((titleLength / twitterLimit) * 100);
  let isOverLimit = $derived(titleLength > twitterLimit);
</script>

<div class="truncation-bar">
  <div
    class="fill"
    class:red={isOverLimit}
    style="width: {Math.min(truncationPercent, 100)}%"
  ></div>
  <span>{titleLength}/{twitterLimit}</span>
</div>
```
The bar turns red when you're over the limit. Simple, but effective.
How do you score meta tags? I built a weighted system:
Critical issues (20 points each):
og:title
og:image
Major issues (10 points each):
og:description
Minor issues (5 points each):
twitter:card
Start at 100, subtract points for each issue. Simple, transparent, and users immediately understand "52/100" means "you need to fix stuff."
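The subtraction scheme above can be sketched as a small scoring function. This is illustrative: the weights mirror the tiers listed, but the exact tag checks and fix messages in LinkPreview.io may differ.

```javascript
// Illustrative scoring sketch: start at 100, subtract a penalty per missing tag.
// Penalties follow the critical/major/minor tiers described above.
const ISSUES = [
  { tag: 'og:title', penalty: 20 },       // critical
  { tag: 'og:image', penalty: 20 },       // critical
  { tag: 'og:description', penalty: 10 }, // major
  { tag: 'twitter:card', penalty: 5 },    // minor
];

function scoreMetaTags(metaTags) {
  let score = 100;
  const fixes = [];
  for (const { tag, penalty } of ISSUES) {
    if (!metaTags[tag]) {
      score -= penalty;
      fixes.push(`Add a ${tag} meta tag`);
    }
  }
  return { score: Math.max(score, 0), fixes };
}
```

A page missing og:image and twitter:card, for instance, lands at 75/100 with two suggested fixes.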
Old Svelte:

```svelte
<script>
  let count = 0;
  $: doubled = count * 2;
</script>
```

New Svelte (runes):

```svelte
<script>
  let count = $state(0);
  let doubled = $derived(count * 2);
</script>
```
The $derived rune is so much clearer than $:. It reads like: "this value is derived from that state." No magic, no confusion.
For this project, every preview card is a derived value:
```js
let twitterCard = $derived({
  title: metaTags['twitter:title'] || metaTags['og:title'] || metaTags.title,
  description: metaTags['twitter:description'] || metaTags['og:description'],
  image: metaTags['twitter:image'] || metaTags['og:image'],
  card: metaTags['twitter:card'] || 'summary_large_image'
});
```
Reactivity just works. Change one meta tag, all 6 platform cards update instantly.
1. Most CMSs get it wrong by default
WordPress, Ghost, Wix - they all generate og:image thumbnails that are too small. The default is usually 800x600, while large-card previews generally want at least 1200x630.
If your image is too small, platforms either don't show it or stretch it to look terrible.
2. Twitter cards fallback to Open Graph, but inconsistently
Twitter usually falls back to og:title if twitter:title is missing. But not always. Sometimes it uses the <title> tag. Sometimes it just shows the URL.
The safe bet: always set both og:* and twitter:* tags explicitly.
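As a concrete example, a belt-and-suspenders `<head>` might carry both sets explicitly (values here are placeholders; note that og:* tags use `property` while twitter:* tags use `name`):

```html
<meta property="og:title" content="My Post Title" />
<meta property="og:description" content="A short description." />
<meta property="og:image" content="https://example.com/og.png" />
<meta name="twitter:card" content="summary_large_image" />
<meta name="twitter:title" content="My Post Title" />
<meta name="twitter:description" content="A short description." />
<meta name="twitter:image" content="https://example.com/og.png" />
```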
3. Google doesn't care about Open Graph
Google Search ignores og:title and og:description completely. It uses:
- `<title>` tag (truncates at 60 chars)
- `<meta name="description">` (truncates at 160 chars)

But if you have JSON-LD structured data, Google prefers that over meta tags. So now you need three sets of metadata.
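A minimal JSON-LD block for an article looks something like this (schema.org Article type; the values are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "My Post Title",
  "description": "A short description.",
  "image": "https://example.com/og.png",
  "datePublished": "2024-01-15"
}
</script>
```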
4. Facebook's debugger lies
Facebook Sharing Debugger shows you a cached version of your page. Even if you update your meta tags, the debugger still shows the old version until you click "Scrape Again."
I've seen developers waste hours debugging a non-existent problem because they didn't know about the cache.
1. I almost over-engineered the database
My original plan: store every URL validation in Cloudflare D1, let users create accounts, save history, etc.
Then I thought: "Why? What value does that add in Phase 1?"
Answer: None.
So I made it stateless. The server extracts tags, returns JSON, forgets everything. No database, no user accounts, no complexity. You can still share results via URL params (linkpreview.io?url=example.com).
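Because all the state lives in the query string, hydrating a shared link is just a matter of reading searchParams. A sketch using the standard URL API (the actual route handling in LinkPreview.io may differ):

```javascript
// Sketch: recover the target URL from a shared link like
// https://linkpreview.io/?url=example.com ("url" param name as shown above)
function targetFromShareLink(shareLink) {
  const raw = new URL(shareLink).searchParams.get('url');
  if (!raw) return null;
  // Accept bare domains by defaulting to https://
  return /^https?:\/\//.test(raw) ? raw : `https://${raw}`;
}
```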
Ship the simplest version that solves the problem.
2. I spent too long on visual polish
I wasted 2 hours tweaking the box shadow on the preview cards. Does it matter? No. Does it feel better? Yes.
But here's the thing: users don't care about your box shadow. They care about whether your tool solves their problem. Polish matters, but not as much as functionality.
Ship fast, polish later.
3. I didn't validate my assumptions
I assumed people wanted side-by-side comparison (test staging vs. production URLs). So I built a whole UI for it.
Then I realized: nobody asked for this. It's a Phase 2 feature at best.
I cut it. Saved myself 4 hours.
Validate demand before building features.
This is Phase 1. I kept it intentionally minimal to ship fast. Phase 2 ideas:
Pricing model: freemium. Free for single URLs, paid for bulk/monitoring/API.
🔗 linkpreview.io
Source code (coming soon): https://github.com/ravichosun/LinkPreview
Built with: SvelteKit, Cloudflare Pages, Cheerio, and Tailwind v4.
Total build time: ~1 weekend + polish
Total lines of code: ~8,000
Client bundle size: <30KB gzipped
If you've ever shared a link on social media and been frustrated by broken previews, give LinkPreview.io a try. And if you find bugs (you will), let me know in the comments!
Happy shipping 🚀