Technical SEO
Site speed, crawlability, structured data, sitemaps — the technical foundation of discoverability.
Page Speed and Core Web Vitals
Google officially uses Core Web Vitals as a ranking signal. These three metrics measure user experience:
- LCP (Largest Contentful Paint): How quickly the main content loads. Target: under 2.5 seconds.
- INP (Interaction to Next Paint): How quickly the page responds to user input (clicks, taps). Target: under 200ms.
- CLS (Cumulative Layout Shift): How much content shifts around as the page loads. Target: under 0.1 (minimal jank).
A slow website ranks worse. Faster websites also convert better — industry studies have found that a 1-second delay can reduce conversions by roughly 7%. Tools like Google PageSpeed Insights show you exactly what to fix. Many issues are fixable through image optimization, lazy loading, and removing render-blocking resources.
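The image and script fixes above can be sketched in a few lines of HTML (file names are placeholders):

```html
<!-- Preload the hero image so the browser fetches it early (helps LCP) -->
<link rel="preload" as="image" href="hero.webp">

<!-- Lazy-load below-the-fold images so they don't compete with main content -->
<img src="product-photo.webp" alt="Product photo" loading="lazy" width="800" height="600">

<!-- Defer non-critical JavaScript so it doesn't block rendering -->
<script src="analytics.js" defer></script>
```

Setting explicit `width` and `height` on images also reserves layout space, which reduces CLS.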
Crawlability: Robots.txt, Sitemaps, and Canonical Tags
Robots.txt is a file that tells search engine crawlers what they can and cannot access. If your robots.txt blocks search engines, they won't index your pages. Most sites should allow crawling of public pages.
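A minimal robots.txt, served from the site root (e.g. example.com/robots.txt), might look like this — the disallowed paths are just examples:

```txt
# Allow all crawlers; block non-public sections
User-agent: *
Disallow: /admin/
Disallow: /cart/

# Point crawlers at the sitemap
Sitemap: https://example.com/sitemap.xml
```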
Sitemaps are XML files that list all your pages and their metadata (last updated, change frequency, priority). Sitemaps help crawlers find pages that might not be linked internally. You submit your sitemap to Google Search Console.
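A sitemap entry, following the sitemaps.org protocol, looks like this (URL and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/services</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```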
Canonical tags tell search engines which version of a page is the "official" one. If you have the same content at example.com/product and www.example.com/product, canonical tags prevent duplicate indexing. Without canonicals, Google picks one version and may not give credit to the other.
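The canonical tag itself is a single line in the page's head. Every duplicate variant points at the official URL:

```html
<!-- Placed in the <head> of both example.com/product and www.example.com/product -->
<link rel="canonical" href="https://www.example.com/product">
```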
Structured Data and Schema Markup
Structured data (also called schema markup) is code you add to pages that explains what content is on the page. Instead of guessing, you tell Google: "This is a product with a price of $99, 4.5 stars, and 200 reviews."
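The product example above, expressed as JSON-LD (the format Google recommends) and placed in the page's head:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "offers": {
    "@type": "Offer",
    "price": "99.00",
    "priceCurrency": "USD"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "200"
  }
}
</script>
```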
Schema markup enables rich results in search: product carousels, recipe cards, reviews, FAQs, and more. These eye-catching results get higher click-through rates. For local businesses, LocalBusiness schema is critical for appearing in the local pack.
You don't need schema for basic ranking, but it significantly improves click-through rates and helps Google understand your content. Most platforms handle basic schema automatically, but custom implementations give better results.
URL Structure and SEO
Good URL structure serves both users and search engines:
- Use hyphens to separate words (example.com/web-design not web_design or webdesign)
- Keep URLs short and descriptive (example.com/services not example.com/page123?id=services)
- Use lowercase letters only
- Avoid dynamic parameters when possible (Google treats www.example.com?product=coffee and www.example.com?product=tea as different URLs)
- Never change URLs without 301 redirects (see "The Real Cost of Cheap" for migration costs)
URLs are a ranking signal and appear in search results, where users form judgments from them. Changing URLs after launch is expensive.
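As a sketch, a 301 redirect from an old URL to a new one in nginx (paths are placeholders; Apache's .htaccess uses `Redirect 301` similarly):

```nginx
# Permanent redirect preserves link equity from the old URL
location = /old-services-page {
    return 301 https://example.com/services;
}
```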
HTTPS and Mobile-First Indexing
HTTPS (secure connections) has been a ranking signal since 2014. Every modern website should use it: it's cheap (often free through Let's Encrypt) and essential for user trust.
Mobile-first indexing means Google primarily uses the mobile version of your site for ranking. If your mobile site is slower, has less content, or doesn't work properly, your rankings suffer. Responsive design is mandatory.
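The baseline requirement for responsive rendering is the viewport meta tag; without it, mobile browsers render the page at desktop width and scale it down:

```html
<!-- In the <head>: tell mobile browsers to use the device's actual width -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```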
Duplicate Content and JavaScript Rendering
Duplicate content occurs when the same content appears on multiple URLs. This can be unintentional (session IDs, tracking parameters) or intentional (copying content across pages). Google may not index all duplicates, and ranking is split across them. Use canonical tags or 301 redirects to consolidate signals.
JavaScript rendering is a major crawlability issue. If your site loads content dynamically with JavaScript, Google must render the page (run the JavaScript) to see the content. This is slower and more error-prone than reading content delivered in the initial HTML. Google might miss content that fails to load via JavaScript. Heavy client-side rendering can hurt SEO.
Modern frameworks (React, Vue, Angular) are increasingly crawlable, but best practice is server-side rendering or static generation. Next.js handles this well. Older SPA (Single Page Application) builds often struggle with SEO.
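A sketch of static generation in Next.js (pages router), which renders the page to HTML at build time so crawlers see the full content without executing JavaScript. `fetchProducts` is a hypothetical stand-in for a real data source:

```typescript
// pages/products.tsx
import type { GetStaticProps } from "next";

type Product = { id: string; name: string };

// Hypothetical data loader — in practice this would hit a database or API
const fetchProducts = async (): Promise<Product[]> => [
  { id: "1", name: "Coffee" },
  { id: "2", name: "Tea" },
];

export const getStaticProps: GetStaticProps = async () => {
  const products = await fetchProducts();
  // revalidate re-generates the static page at most once per hour
  return { props: { products }, revalidate: 3600 };
};

export default function Products({ products }: { products: Product[] }) {
  return (
    <ul>
      {products.map((p) => (
        <li key={p.id}>{p.name}</li>
      ))}
    </ul>
  );
}
```

The same content served client-side only (fetched in a `useEffect`) would be invisible in the initial HTML and depend entirely on Google's rendering step.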
Technical SEO by Platform
| Platform | URL Control | Page Speed | Schema Support | Canonical Control | JS Rendering | Overall |
|---|---|---|---|---|---|---|
| Wix | Limited | Moderate | Basic | Limited | Moderate | Fair |
| Squarespace | Good | Good | Good | Good | Good | Very Good |
| WordPress | Excellent | Depends (hosting, theme, plugins) | Excellent | Excellent | Excellent | Excellent |
| Shopify | Moderate | Very Good | Good | Moderate | Good | Very Good |
| Webflow | Excellent | Very Good | Very Good | Excellent | Very Good | Excellent |
| Next.js | Excellent | Excellent | Excellent | Excellent | Excellent | Excellent |