How to Run a Technical SEO Audit
A technical audit is a systematic review of your site's technical health. It identifies issues blocking crawling, indexation, and ranking. A thorough audit takes 2-4 hours. Start with critical issues, then fix high-priority items over time.
What Is a Technical Audit?
A technical audit evaluates how well your site is optimised for search engines from an infrastructure perspective. It's not about content or links — it's about the plumbing: can search engines crawl and index your pages? Are pages fast? Is the structure sound?
A good audit results in a prioritised list of fixes. Critical issues (blocking indexation) go first. High-priority issues (affecting rankings) come next. Medium and low-priority improvements are done as time allows.
The Priority Framework
Critical Issues: Block Indexation or Visibility
- Entire site blocked in robots.txt
- HTTPS certificate expired or invalid
- Major sections inaccessible or 404ing
- Important pages accidentally noindexed
High-Priority Issues: Hurt Rankings
- Poor Core Web Vitals (LCP > 4s, INP > 500ms, CLS > 0.25)
- Large crawl-to-index gap (many more crawled than indexed)
- Massive duplicate content (hundreds of parameter variations not consolidated)
- Slow TTFB or slow mobile experience
- Significant crawl budget waste (crawling low-value pages)
Medium-Priority Issues: Incremental Improvements
- No XML sitemap
- Poor internal linking structure
- Missing or invalid structured data
- Redirect chains (not loops, but inefficient)
- Some broken internal links
Low-Priority Issues: Optimisations
- Minor CWV issues (LCP 2.5-3s, INP 200-300ms)
- Missing meta descriptions (they don't affect rankings directly, only click-through rate)
- Suboptimal hreflang configuration (only relevant if you serve international versions)
Step-by-Step Audit Process
Step 1: Verify Access and Basics (15 min)
- Confirm your site is live and accessible. Open it in a browser.
- Check HTTPS. Is there a green lock? No "not secure" warning?
- Verify robots.txt isn't blocking the site: example.com/robots.txt. Check for Disallow: /.
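The robots.txt check in this step can be scripted. A minimal sketch using Python's standard-library `urllib.robotparser` (the function name and sample rules are illustrative, not from any particular site):

```python
from urllib.robotparser import RobotFileParser

def is_site_blocked(robots_txt: str, user_agent: str = "Googlebot") -> bool:
    """Return True if this robots.txt blocks the user agent from the homepage."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return not parser.can_fetch(user_agent, "/")

# A blanket "Disallow: /" blocks the whole site:
print(is_site_blocked("User-agent: *\nDisallow: /\n"))        # True
# A scoped disallow leaves the rest of the site crawlable:
print(is_site_blocked("User-agent: *\nDisallow: /admin/\n"))  # False
```

In practice you would fetch `example.com/robots.txt` and pass its body to this function; eyeballing the file in a browser is just as valid for a one-off audit.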
Step 2: Google Search Console Review (20 min)
- Pages report (formerly Coverage): how many pages are indexed vs submitted? A large gap signals an indexation problem.
- Indexing errors and exclusions: any pages blocked, erroring, or excluded? Click through to investigate.
- Core Web Vitals: Are metrics in good, needs improvement, or poor range?
- Mobile usability: any issues flagged? (Google has retired the standalone Mobile Usability report, so lean on the hands-on testing in Step 5.)
- Security & Manual Actions: Any penalties or security issues?
Step 3: Crawl Audit with Screaming Frog (30 min)
- Download Screaming Frog (the free version crawls up to 500 URLs; the paid licence removes the limit). Start a crawl of your site.
- Review the Crawl Overview: page count, response codes, broken links.
- Check for duplicate content: multiple URLs with identical titles/descriptions.
- Look for redirect chains: any URLs redirecting multiple times?
- Check for noindex pages: are important pages noindexed?
- Review internal link distribution: are important pages well-linked?
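The duplicate-content check above is easy to automate on a crawl export. A sketch, assuming you have a URL-to-title mapping (for instance from a Screaming Frog CSV export; the data below is invented for illustration):

```python
from collections import defaultdict

def find_duplicate_titles(pages: dict[str, str]) -> dict[str, list[str]]:
    """Group URLs by page title; return only titles shared by 2+ URLs."""
    by_title = defaultdict(list)
    for url, title in pages.items():
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

crawl = {
    "/shoes": "Shoes | Example Shop",
    "/shoes?sort=price": "Shoes | Example Shop",   # parameter variant, same title
    "/about": "About Us | Example Shop",
}
print(find_duplicate_titles(crawl))
# {'shoes | example shop': ['/shoes', '/shoes?sort=price']}
```

Shared titles are only a signal, not proof of duplication, but clusters like the parameter variant above are the usual candidates for canonical consolidation.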
Step 4: Speed and Core Web Vitals (15 min)
- Run PageSpeed Insights on 5-10 pages (homepage, top content pages, category pages).
- Note field data from CrUX (real user data) vs lab data.
- Identify which metrics are failing (LCP, INP, CLS).
- Document quick wins (image optimisation, lazy-loading, code-splitting).
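Rating metrics against Google's published thresholds (the same ones used in the priority framework above) can be done in a few lines. The function name is illustrative:

```python
# Google's published Core Web Vitals thresholds: (good ceiling, poor floor).
THRESHOLDS = {
    "LCP": (2.5, 4.0),   # seconds
    "INP": (200, 500),   # milliseconds
    "CLS": (0.1, 0.25),  # unitless layout-shift score
}

def rate_metric(name: str, value: float) -> str:
    """Classify a CWV reading as good / needs improvement / poor."""
    good, poor = THRESHOLDS[name]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"

print(rate_metric("LCP", 3.2))   # needs improvement
print(rate_metric("INP", 600))   # poor
print(rate_metric("CLS", 0.05))  # good
```

Feeding your PageSpeed Insights field-data readings through this makes the "which metrics are failing" note in your findings document unambiguous.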
Step 5: Mobile Testing (10 min)
- Open your site on an actual mobile phone or use Chrome DevTools device emulation.
- Is it responsive? Are all features accessible on mobile?
- Is content missing or hidden compared to desktop?
Step 6: Structured Data Check (10 min)
- Run Google's Rich Results Test on key pages (homepage, article pages, product pages).
- Are there structured data errors? If so, document what needs fixing.
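Before reaching for the Rich Results Test, you can confirm locally that a page's JSON-LD at least parses. A sketch using only the standard library (the class name and sample HTML are illustrative):

```python
import json
from html.parser import HTMLParser

class JsonLdExtractor(HTMLParser):
    """Collect parsed JSON-LD blocks from a page's HTML."""
    def __init__(self):
        super().__init__()
        self._in_ld = False
        self.blocks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and dict(attrs).get("type") == "application/ld+json":
            self._in_ld = True

    def handle_endtag(self, tag):
        if tag == "script":
            self._in_ld = False

    def handle_data(self, data):
        if self._in_ld:
            # json.loads raises ValueError on malformed markup - a quick smoke test.
            self.blocks.append(json.loads(data))

html = '<script type="application/ld+json">{"@type": "Article", "headline": "Hi"}</script>'
extractor = JsonLdExtractor()
extractor.feed(html)
print(extractor.blocks[0]["@type"])  # Article
```

This only proves the JSON is well-formed; whether the schema types and properties qualify for rich results is still a job for Google's validator.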
Step 7: Sitemap and robots.txt Review (5 min)
- Check example.com/sitemap.xml. Is it present? Valid?
- Check robots.txt. Are critical pages disallowed? Are CSS/JS blocked?
- Is the sitemap submitted in Google Search Console?
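A quick validity check for the sitemap itself: if it parses as XML and yields `<loc>` entries, it is at least structurally sound. A sketch against the standard sitemap namespace (the sample URLs are placeholders):

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text: str) -> list[str]:
    """Extract <loc> entries from a standard XML sitemap."""
    root = ET.fromstring(xml_text)  # raises ParseError on invalid XML
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""
print(sitemap_urls(sample))
# ['https://example.com/', 'https://example.com/about']
```

Cross-referencing this URL list against your crawl output also surfaces pages that are in the sitemap but returning errors, or indexable pages missing from the sitemap.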
Step 8: Document Findings and Prioritise (30 min)
Create a spreadsheet or document listing all issues found, categorised by priority. Include:
- Issue category (crawlability, speed, indexation, etc.)
- Specific problem (e.g., "LCP 3.2s on homepage")
- Impact (critical, high, medium, low)
- Recommended fix
- Effort (1-10 hours)
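If you prefer a script to a spreadsheet, the prioritisation above reduces to a sort key: impact first, then effort, so quick wins surface within each tier. A sketch with invented findings:

```python
import csv
import io

PRIORITY = {"critical": 0, "high": 1, "medium": 2, "low": 3}

findings = [
    {"category": "speed", "problem": "LCP 3.2s on homepage",
     "impact": "high", "fix": "optimise hero image", "effort_hours": 3},
    {"category": "indexation", "problem": "blog section noindexed",
     "impact": "critical", "fix": "remove noindex tag", "effort_hours": 1},
    {"category": "sitemaps", "problem": "no XML sitemap",
     "impact": "medium", "fix": "generate and submit sitemap.xml", "effort_hours": 2},
]

# Critical issues first; within a tier, cheapest fixes first.
findings.sort(key=lambda f: (PRIORITY[f["impact"]], f["effort_hours"]))

out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=findings[0].keys())
writer.writeheader()
writer.writerows(findings)
print(out.getvalue())  # CSV ready to paste into a spreadsheet
```

The `effort_hours` tiebreaker is a judgment call; some teams prefer impact-to-effort ratio instead.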
Audit Checklist: What to Check
| Issue Category | What to Check | Tool | Priority |
|---|---|---|---|
| Crawlability | Can Googlebot access your pages? Check for blocked resources, robots.txt issues, authentication walls. | Google Search Console, robots.txt tester | Critical |
| Indexation | Are pages indexed? Check GSC Coverage report for excluded URLs, noindex pages, duplicates. | Google Search Console | Critical |
| Site Speed & CWV | Is the site fast? Check LCP, INP, CLS metrics from real user data. | Google Search Console, PageSpeed Insights | High |
| Mobile Experience | Is mobile working? Test on actual devices or DevTools. Check responsive design. | Chrome DevTools, actual phones | High |
| HTTPS | Is the site using HTTPS? Check certificate validity and mixed content. | Browser, SSL Labs | High |
| Duplicate Content | Are duplicate pages properly consolidated with canonicals? | Screaming Frog, GSC | High |
| Structured Data | Is schema markup present and valid? | Google Rich Results Test, Schema Validator | Medium |
| Internal Linking | Are important pages linked from navigation? Is linking logical? | Screaming Frog, manual review | Medium |
| Broken Links & 404s | Are there many 404 errors? Dead internal links? | Screaming Frog, Google Search Console | Medium |
| Redirects | Any redirect chains or loops? Are 301s used correctly? | Screaming Frog, log analysis | Medium |
| Sitemaps | Is a sitemap present and submitted in GSC? Is it up-to-date? | Browser, Google Search Console | Medium |
| Log File Analysis | What is Googlebot crawling? Any patterns or errors? (Mainly for large sites) | Screaming Frog Log Analyser, ELK | Low (for small sites) |
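The redirect check from the table can be automated once you have a source-to-target redirect map (crawlers like Screaming Frog export one). A sketch that flags chains and loops; the function name and sample URLs are illustrative:

```python
def redirect_chains(redirects: dict[str, str], max_hops: int = 10) -> dict[str, list[str]]:
    """Return every redirect path longer than one hop, including loops."""
    chains = {}
    for start in redirects:
        path, url = [start], start
        while url in redirects and len(path) <= max_hops:
            url = redirects[url]
            if url in path:      # revisiting a URL means a redirect loop
                path.append(url)
                break
            path.append(url)
        if len(path) > 2:        # more than one hop: worth flattening to a single 301
            chains[start] = path
    return chains

hops = {"/old": "/older", "/older": "/new", "/a": "/b"}
print(redirect_chains(hops))
# {'/old': ['/old', '/older', '/new']}
```

`/old` → `/older` → `/new` is the inefficient chain flagged in the priority framework; the fix is pointing `/old` straight at `/new`. A path whose last entry repeats an earlier one is a loop and belongs in the critical bucket.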
Tools You'll Need
- Google Search Console (free): Essential. Crawl stats, indexation, Core Web Vitals, errors.
- Screaming Frog (free up to 500 URLs; paid licence for larger crawls): Crawls your site and reports on links, duplicates, redirects, errors.
- PageSpeed Insights (free): Measures Core Web Vitals and provides optimisation suggestions.
- Google Rich Results Test (free): Validates structured data.
- SSL Labs (free): Checks HTTPS/SSL certificate validity.
- Lighthouse (free, built into Chrome DevTools): Detailed performance analysis (remember, this is lab data).
Audit Frequency
Small sites (under 10,000 pages): Quarterly audits are sufficient.
Medium sites (10,000-100,000 pages): Monthly audits recommended.
Large sites (over 100,000 pages): Monthly audits, with continuous log monitoring.
After major changes: Always audit after launching new site versions, migrating platforms, or significant restructuring.
Expected Outcomes from an Audit
A well-run audit should produce:
- 1-3 critical issues requiring immediate attention
- 3-8 high-priority issues to address in the next 1-2 quarters
- 5-10 medium-priority improvements to implement over time
- A roadmap for the next 3-6 months of technical work
If you find zero issues, you either have a very mature site or your audit wasn't thorough. Most sites have at least a few fixable issues.
Common Audit Findings
- Poor Core Web Vitals (80% of sites)
- Unoptimised images (90% of sites)
- Missing or poor internal linking (60% of sites)
- Duplicate content not consolidated (40% of sites)
- Missing XML sitemap (30% of sites)
- No structured data (50% of sites)
- Redirect chains (50% of sites)
If you identify and fix just the top 1-2 issues, you're already ahead of most sites.