SEO and metadata
This guide covers the SEO setup included in the boilerplate: automatic sitemap generation, dynamic robots.txt, and how to add metadata to your pages.
Overview
The boilerplate includes basic SEO support out of the box:
- Automatic sitemap at /sitemap.xml via @nuxtjs/sitemap
- Dynamic robots.txt at /robots.txt
- Meta tags using Nuxt's useSeoMeta composable
- Nuxt Content integration for blog posts and docs
Sitemap
The sitemap is automatically generated and includes all public pages, blog posts, and docs.
Configuration
Minimal configuration in nuxt.config.ts:
export default defineNuxtConfig({
  site: {
    url: process.env.NUXT_PUBLIC_SITE_URL || 'http://localhost:3000',
  },
  sitemap: {
    exclude: ['/auth/**', '/app/**', '/checkout/**'],
  },
})
The sitemap automatically discovers:
- All pages in app/pages/
- Blog posts from content/blog/
- Docs from content/docs/
Private routes (/auth, /app, /checkout) are excluded from the sitemap (and blocked in robots.txt below) so search engines don't index them.
Nuxt Content integration
Content collections are wrapped with asSitemapCollection in content.config.ts:
import { defineCollection, defineContentConfig } from '@nuxt/content'
import { asSitemapCollection } from '@nuxtjs/sitemap/content'

export default defineContentConfig({
  collections: {
    content: defineCollection(
      asSitemapCollection({
        type: 'page',
        source: '**/*.md',
      })
    ),
    docs: defineCollection(
      asSitemapCollection({
        type: 'page',
        source: { include: 'docs/**/*.md' },
      })
    ),
  },
})
Vercel deployment
For Vercel and other serverless platforms, configure Nuxt Content to use the native SQLite connector:
export default defineNuxtConfig({
  content: {
    experimental: {
      sqliteConnector: 'native', // Required for Vercel
    },
  },
})
This avoids issues with the better-sqlite3 native module in serverless environments.
Robots.txt
A dynamic robots.txt is served at /robots.txt and uses the site URL from your environment variables.
Implementation in server/routes/robots.txt.ts:
export default defineEventHandler((event) => {
  const config = useRuntimeConfig()
  const siteUrl = config.public.siteUrl || 'https://example.com'

  const robotsTxt = `# Allow all search engines
User-agent: *
Allow: /
# Block private/internal routes
Disallow: /api/
Disallow: /auth/
Disallow: /app/
Disallow: /checkout/
# Sitemap
Sitemap: ${siteUrl}/sitemap.xml
`

  setHeader(event, 'Content-Type', 'text/plain; charset=utf-8')
  setHeader(event, 'Cache-Control', 'public, max-age=3600')

  return robotsTxt
})
Edit this file to customize which routes are blocked or to add bot-specific rules.
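For example, a bot-specific rule can be appended to the template before it is returned. The GPTBot block below is only an illustration of the robots.txt syntax, not something the boilerplate ships with:

// server/routes/robots.txt.ts (excerpt) – illustrative customization
const robotsTxt = `User-agent: *
Allow: /
Disallow: /api/
Disallow: /auth/
Disallow: /app/
Disallow: /checkout/
# Example bot-specific rule: block a single crawler
User-agent: GPTBot
Disallow: /
Sitemap: ${siteUrl}/sitemap.xml
`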
Page metadata
Add SEO metadata to pages using the useSeoMeta composable:
<script setup lang="ts">
useSeoMeta({
  title: 'About us',
  description: 'Learn more about our company and mission',
})
</script>

<template>
  <div>
    <h1>About us</h1>
    <p>Content here...</p>
  </div>
</template>
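useSeoMeta sets the raw title only. If you want every page title to share a site-wide suffix, Nuxt's useHead composable accepts a titleTemplate. A minimal sketch (the site name is a placeholder and this is not wired into the boilerplate by default):

<script setup lang="ts">
// app.vue – apply a site-wide title template (placeholder site name)
useHead({
  titleTemplate: (title) => (title ? `${title} · Your Site Name` : 'Your Site Name'),
})
</script>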
With Open Graph and Twitter Cards
<script setup lang="ts">
const route = useRoute()

const { data: post } = await useAsyncData('post', () =>
  queryContent(`/blog/${route.params.slug}`).findOne()
)

useSeoMeta({
  title: post.value?.title,
  description: post.value?.description,
  ogTitle: post.value?.title,
  ogDescription: post.value?.description,
  ogImage: post.value?.image?.src,
  twitterCard: 'summary_large_image',
})
</script>
Environment configuration
Set your production site URL in environment variables:
NUXT_PUBLIC_SITE_URL=https://yourdomain.com
NUXT_PUBLIC_SITE_NAME=Your Site Name
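These variables map onto Nuxt's public runtime config: NUXT_PUBLIC_SITE_URL overrides runtimeConfig.public.siteUrl at runtime, which is what the robots.txt handler reads. A sketch of the matching declaration, assuming the boilerplate exposes these keys (the exact config may differ):

export default defineNuxtConfig({
  runtimeConfig: {
    public: {
      // Overridden by NUXT_PUBLIC_SITE_URL / NUXT_PUBLIC_SITE_NAME at runtime
      siteUrl: 'http://localhost:3000',
      siteName: 'My Site',
    },
  },
})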
Set NUXT_PUBLIC_SITE_URL in production; the sitemap and robots.txt require it.
Testing
Test your SEO setup locally:
pnpm dev
# Visit:
# http://localhost:3000/sitemap.xml
# http://localhost:3000/robots.txt
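For an automated check (for example in CI), a small script along these lines can verify that both endpoints respond. This is a hypothetical helper, not something included in the boilerplate:

// scripts/check-seo.ts (hypothetical) – run with: npx tsx scripts/check-seo.ts
const base = process.env.NUXT_PUBLIC_SITE_URL ?? 'http://localhost:3000'

for (const path of ['/sitemap.xml', '/robots.txt']) {
  const res = await fetch(new URL(path, base))
  console.log(`${path}: ${res.status} ${res.headers.get('content-type')}`)
  if (!res.ok) process.exit(1)
}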
After deployment, validate with:
- Google Search Console - Submit sitemap
- Facebook Sharing Debugger - Test OG tags
- Twitter Card Validator - Test Twitter cards