I Had Zero Pages Indexed for Three Months. Here's the One-Line Fix.

A canonical URL mismatch between www and non-www kept my entire blog invisible to Google for three months. Six files, twelve line changes, and a sitemap resubmission fixed it. Here's how to check yours.

SEO · Web Development · Engineering · Next.js
March 4, 2026
6 min read

A developer posted on Hacker News yesterday about losing the SEO battle for his own open source project. A fake site was outranking the real one. The thread hit 400+ points and 200 comments, mostly people raging at Google.

I read the thread and felt a different kind of recognition. Not the "squatter stole my name" kind. The "I did this to myself" kind.

For three months, my blog had zero pages indexed in Google. Not low rankings. Not buried on page five. Zero. Eight URLs submitted in my sitemap, zero indexed. I published twelve posts during that period. None of them showed up in any search result, ever.

The cause was a one-line mismatch that took five minutes to fix once I found it.

The mismatch

Google decided the canonical version of my site was https://www.mkweb.dev. That's the URL Google chose as authoritative. You can check yours in Google Search Console under URL Inspection.

My code disagreed. Every canonical tag, every Open Graph URL, every structured data reference, every sitemap entry pointed to https://mkweb.dev (no www).

```typescript
// What my code said:
const url = "https://mkweb.dev/blog/" + slug;

// What Google expected:
const url = "https://www.mkweb.dev/blog/" + slug;
```

That's it. Three characters: www.

Google saw my sitemap full of https://mkweb.dev/... URLs and checked them against its canonical preference for https://www.mkweb.dev/.... They didn't match. So Google ignored the sitemap. For three months.

Why this is worse than it sounds

The frustrating part isn't that Google is strict about canonicals. The frustrating part is that nothing tells you this is happening.

Google Search Console showed my sitemap as "submitted" and "successful." The coverage report showed pages as "discovered" but not "indexed." The status reason was the famously unhelpful "Discovered - currently not indexed," which could mean fifty different things.

I checked for noindex tags. None. I checked robots.txt. Clean. I checked page speed, mobile usability, Core Web Vitals. All fine. I added structured data, Article schema, proper meta descriptions. Nothing changed.

The canonical mismatch doesn't trigger an error. It doesn't show a warning. It just quietly prevents indexing, and every debugging path leads you somewhere else.

Where the mismatch hides

In a Next.js app, canonical URLs can come from at least six different places. Mine was wrong in all of them:

layout.tsx sets the global metadataBase and default canonical:

```typescript
// before
metadataBase: new URL("https://mkweb.dev"),
alternates: { canonical: "https://mkweb.dev" },

// after
metadataBase: new URL("https://www.mkweb.dev"),
alternates: { canonical: "https://www.mkweb.dev" },
```

blog/[slug]/page.tsx generates per-post metadata and JSON-LD:

```typescript
// before
url: "https://mkweb.dev/blog/" + slug,
image: "https://mkweb.dev" + post.coverImage,
author: { url: "https://mkweb.dev" },

// after
url: "https://www.mkweb.dev/blog/" + slug,
image: "https://www.mkweb.dev" + post.coverImage,
author: { url: "https://www.mkweb.dev" },
```

sitemap.ts and robots.ts reference the base URL for sitemap generation and the sitemap location in robots.txt.
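For reference, here is a minimal sketch of what those two files look like after the fix. This is an illustration under the post's setup, not the author's exact code; the `BASE_URL` constant is my naming, and the route entries are placeholders:

```typescript
// app/sitemap.ts — the base URL here must match the Google-selected canonical.
import type { MetadataRoute } from "next";

const BASE_URL = "https://www.mkweb.dev"; // was "https://mkweb.dev"

export default function sitemap(): MetadataRoute.Sitemap {
  // Placeholder entries; a real sitemap would enumerate every post.
  return [
    { url: BASE_URL, lastModified: new Date() },
    { url: `${BASE_URL}/blog`, lastModified: new Date() },
  ];
}

// app/robots.ts — the advertised sitemap location must use the same host.
export function robots(): MetadataRoute.Robots {
  return {
    rules: { userAgent: "*", allow: "/" },
    sitemap: `${BASE_URL}/sitemap.xml`,
  };
}
```

(In a real app these are two separate files, and `robots` is the default export of `app/robots.ts`.)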

StructuredData.tsx has a separate hardcoded URL for the site-wide Organization schema.
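The Organization schema is the same story: a hardcoded absolute URL that has to agree with everything else. A hedged sketch of the shape of that JSON-LD (the `name` and `logo` values are invented for illustration):

```typescript
// Site-wide Organization JSON-LD, as in a StructuredData.tsx-style component.
// The url (and any other absolute URL) must use the Google-selected host.
const SITE_URL = "https://www.mkweb.dev"; // was "https://mkweb.dev"

const organizationSchema = {
  "@context": "https://schema.org",
  "@type": "Organization",
  name: "mkweb",               // illustrative value
  url: SITE_URL,
  logo: `${SITE_URL}/logo.png`, // illustrative value
};
```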

Six files. Twelve line changes. Every single one was mkweb.dev → www.mkweb.dev.
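Twelve scattered string literals are twelve chances to drift. One way to prevent a repeat is to derive every absolute URL from a single constant; a sketch, with `SITE_URL` and `absoluteUrl` as hypothetical names I'm introducing, not names from the original code:

```typescript
// lib/site.ts — a single source of truth for the canonical origin,
// so no other file ever hardcodes the host.
export const SITE_URL = "https://www.mkweb.dev";

// Build an absolute URL from a path. Every canonical tag, OG URL,
// sitemap entry, and JSON-LD url can go through this helper.
export function absoluteUrl(path: string): string {
  return new URL(path, SITE_URL).toString();
}
```

If Google's canonical preference ever changes, there is exactly one line to edit.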

How to check yours right now

This takes two minutes.

  1. Open Google Search Console. Go to URL Inspection. Paste your homepage URL.
  2. Look at the "Google-selected canonical" field. This is what Google considers the real URL.
  3. Compare it to what your code outputs. View source on your site, search for canonical. Check your sitemap XML. Check your structured data.

If the Google-selected canonical has www and your code doesn't (or vice versa), you have the same problem I did.

You can also check from the command line:

```bash
# What your sitemap says
curl -s https://yoursite.com/sitemap.xml | grep '<loc>'

# What Google thinks (check manually in Search Console)
# Look for "Google-selected canonical" in URL Inspection
```

If you're on Next.js, grep for your domain across the codebase:

```bash
grep -r "https://yoursite.com" app/ components/ --include="*.tsx" --include="*.ts"
```

Every result should use the same www (or non-www) prefix that Google selected as canonical.
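If you want to automate that check, a small function can flag any sitemap `<loc>` entry that doesn't start with the canonical origin. A minimal sketch (regex-based, so it assumes a simple sitemap rather than doing full XML parsing):

```typescript
// Return every <loc> in a sitemap that does NOT use the canonical origin.
// canonicalOrigin should be whatever Search Console shows as the
// Google-selected canonical, e.g. "https://www.mkweb.dev".
function findMismatchedLocs(sitemapXml: string, canonicalOrigin: string): string[] {
  const locs = Array.from(
    sitemapXml.matchAll(/<loc>(.*?)<\/loc>/g),
    (m) => m[1],
  );
  return locs.filter((loc) => !loc.startsWith(canonicalOrigin));
}
```

Feed it the output of the `curl` above; an empty result means your sitemap agrees with Google.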

Why Google picks www over non-www (or vice versa)

Google's documentation says it uses "signals" to choose a canonical, including: which version has more inbound links, which version the server redirects to, and which version the sitemap specifies.

In practice, if you set up your DNS with both A and CNAME records (common with Vercel, Netlify, and Cloudflare), Google often picks the www version because the CNAME record explicitly points www.yoursite.com to your host. The bare domain uses an A record, which is less specific.

The fix isn't to fight Google's canonical choice. The fix is to match it. Check what Google picked, then make your code agree.
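Beyond fixing the URLs in your markup, you can make the server itself agree by redirecting the losing host to the winning one. Many hosts (Vercel included) can do this at the platform level; if yours doesn't, here is a sketch of a Next.js middleware that does it, assuming www is the Google-selected canonical:

```typescript
// middleware.ts — permanently redirect bare-domain requests to the www host.
// Assumes www.mkweb.dev is the canonical; flip the comparison if Google
// picked the bare domain instead.
import { NextResponse, type NextRequest } from "next/server";

export function middleware(request: NextRequest) {
  const url = request.nextUrl.clone();
  if (url.hostname === "mkweb.dev") {
    url.hostname = "www.mkweb.dev";
    // 308 preserves the request method and signals a permanent move.
    return NextResponse.redirect(url, 308);
  }
  return NextResponse.next();
}
```

A consistent redirect is itself one of the canonicalization signals Google says it reads, so it reinforces the choice rather than fighting it.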

The broader lesson from that HN thread

The developer on Hacker News had a different problem. A fake site was squatting on his project name and outranking the legitimate site. But the thread surfaced a pattern that applies to both situations: Google gives you almost no actionable feedback when something is wrong with your SEO.

The HN comments were full of developers who had done everything "right" (proper meta tags, good content, sitemap submitted) and still had indexing problems. One commenter offered to help fix the issue and listed concrete steps: verify in Search Console, submit sitemap, add Organization schema, build backlinks from your GitHub repo.

That's good advice. But it also means every developer with a side project needs to become a part-time SEO analyst, and the tools Google provides for debugging are genuinely bad. "Discovered - currently not indexed" as a status message is the equivalent of a compiler that says "something went wrong" with no line number.

My specific failure was entirely self-inflicted. Nobody squatted on my domain. No black hat SEO undermined my rankings. I just had mkweb.dev in my code where Google wanted www.mkweb.dev, and the system silently dropped every page for three months without a single warning.

After the fix

I fixed the canonical mismatch on March 1st. Resubmitted the sitemap to both the www and non-www properties in Search Console. Google's documentation says re-indexing can take "a few days to a few weeks."

I'm monitoring it now. Three days post-fix, Search Console still shows the pages as "Discovered - currently not indexed," but that's expected given the delay. If you're reading this post via Google, the fix worked.

The takeaway is simple: before you debug anything else about your site's SEO, verify that your code's canonical URLs match what Google selected as your canonical domain. Two-minute check. Could save you three months.
