I run SEO audits every week. For client sites, for competitors, and for my own properties. After auditing hundreds of websites over the past decade, I’ve learned this: most sites have the same 12-15 critical issues, and fixing them in the right order is what separates amateurs from pros.
This guide is my exact audit process—the same framework I use when a client pays $5,000 for a comprehensive SEO audit. You’ll get the checklists, the tools, the priority system, and real examples of findings I’ve uncovered. No fluff. Just the process that works.
Why I Audit (And Why You Should Too)
I audit for three reasons:
- Traffic drops. When organic traffic falls 20%+ month-over-month, an audit finds the cause. Was it a Google update? A technical error? Content decay? The audit tells you.
- Competitive pressure. Competitors outrank you on keywords you used to own. An audit shows their advantages—faster site, better content, stronger backlinks—so you know where to invest.
- Preventive maintenance. I audit my sites quarterly even when traffic is stable. Small issues compound. A broken canonical here, a slow page there, 50 orphaned pages—individually minor, collectively devastating.
The ROI is absurd. I’ve seen a single audit uncover indexation blocks costing a site 60% of its potential traffic. Three hours of work, six-figure recovery. That’s why I audit obsessively.
The 4-Phase Audit Framework
I break every audit into four phases, always in this order:
- Technical Audit (foundation): Crawling, indexing, Core Web Vitals, mobile-friendliness
- Content Audit (relevance): Keyword targeting, content quality, internal linking, thin/duplicate content
- Backlink Audit (authority): Link profile health, toxic links, competitor gaps
- UX Audit (engagement): Navigation, CTA placement, conversion friction
You must audit in this sequence. Technical issues block crawlers from seeing your content. If Google can’t crawl your site properly, your brilliant content and powerful backlinks don’t matter. Fix the foundation first.
Phase 1: Technical SEO Audit
Technical SEO is the plumbing. Invisible to users, critical to rankings. I start here because a single technical flaw—like a robots.txt block or a redirect chain—can crater your entire site’s visibility.
Tools I Use for Technical Audits
- Screaming Frog SEO Spider (primary): Desktop crawler, handles 500K+ URLs on the paid version. My #1 tool.
- Google Search Console (GSC): Index coverage, Core Web Vitals, mobile usability. Free, authoritative.
- PageSpeed Insights / Lighthouse: Core Web Vitals scoring, performance recommendations.
- GTmetrix: Waterfall charts, load time breakdowns, LCP/CLS debugging.
- Ahrefs Site Audit: Automated monthly crawls, tracks issues over time (I use this for large enterprise clients).
Technical Audit Checklist (19 Items)
Crawlability & Indexation
- robots.txt check: Visit yoursite.com/robots.txt. Ensure no critical directories are blocked (Disallow: /blog/ would be catastrophic). I’ve seen three sites with an accidental Disallow: / blocking the entire site.
- XML sitemap validation: Check that yoursite.com/sitemap.xml exists and submit it in GSC. Verify all important URLs are included and no 404s are listed. Screaming Frog finds orphaned pages (not in sitemap, not linked) fast. Both checks are scriptable (see the sketch after this list).
- Index coverage (GSC): GSC → Index → Coverage. Check for errors (Submitted URL not found, Redirect error, Server error). Also check excluded URLs—sometimes Google excludes pages you want indexed.
- Crawl errors: Screaming Frog → Response Codes. Export all 404s, 500s, 301/302 redirect chains. I prioritize 404s with backlinks (Ahrefs finds these—Site Explorer → Best by Links → filter by 404).
- Canonical tags: Every page should have a canonical URL. Screaming Frog shows missing/conflicting canonicals. Common error: paginated pages canonicalizing to page 1 (don’t do this—self-canonicalize each page).
- Noindex/nofollow tags: Screaming Frog → Directives → Noindex/Nofollow. Verify no important pages are noindexed. I once found a client’s entire blog noindexed via Yoast default setting.
- Duplicate content: Screaming Frog → Content → Duplicate. Check duplicate titles, descriptions, and H1s. Also check URL parameters (yoursite.com/page?ref=123 vs yoursite.com/page)—consolidate these with canonical tags (GSC’s old URL parameter tool has been retired).
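The robots.txt and sitemap checks above are easy to automate. Here’s a minimal Python sketch, assuming a hypothetical domain and URL list (swap in your own money pages); it verifies Googlebot can fetch key URLs and that they return 200 directly:

```python
# Minimal sketch: verify robots.txt isn't blocking key URLs and that they return 200.
# Domain and URL list are hypothetical; swap in your own money pages.
from urllib import robotparser

import requests

SITE = "https://example.com"
KEY_URLS = [f"{SITE}/", f"{SITE}/blog/", f"{SITE}/products/"]

rp = robotparser.RobotFileParser(f"{SITE}/robots.txt")
rp.read()  # fetches and parses the live robots.txt

for url in KEY_URLS:
    # Catches an accidental Disallow: / or a blocked section
    if not rp.can_fetch("Googlebot", url):
        print(f"BLOCKED by robots.txt: {url}")

    # Important URLs should return 200 directly, no 404s or redirect hops
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if resp.status_code != 200:
        print(f"{resp.status_code}: {url}")
```

Run it before and after every deploy; an accidental Disallow: / shows up immediately instead of three weeks later in GSC.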
Core Web Vitals & Performance
- Largest Contentful Paint (LCP): Target <2.5s. GSC → Experience → Core Web Vitals shows pass/fail per URL. PageSpeed Insights shows what's slowing LCP (usually images or server response time). The PSI API can pull this in bulk (see the sketch after this list).
- Interaction to Next Paint (INP): Target <200ms. Check for JavaScript blocking the main thread. I use Lighthouse → Performance → Diagnostics → Minimize main-thread work.
- Cumulative Layout Shift (CLS): Target <0.1. Caused by images without dimensions, late-loading ads, web fonts. Add explicit width/height attributes to img tags.
- Page speed (overall): GTmetrix → Waterfall. Identify large files (>200KB images, unoptimized JS/CSS). I aim for <1s load time on cable, <3s on 3G.
- Mobile-friendliness: Check for text too small, clickable elements too close, viewport not set (Lighthouse flags these; GSC’s standalone Mobile Usability report has been retired). Test on real devices (iPhone, Android)—simulators miss touch target issues.
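To pull field Core Web Vitals for many URLs at once, you can hit Google’s public PageSpeed Insights API. A minimal sketch; the metric key names are my best recollection of the CrUX field data it returns, so verify them against a live response before relying on this:

```python
# Minimal sketch: pull field Core Web Vitals from Google's PageSpeed Insights API.
# The metric key names are assumptions from the CrUX field data; verify them
# against a real response (and add an API key if you run this at volume).
import requests

PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
URLS = ["https://example.com/", "https://example.com/pricing"]  # hypothetical pages

for url in URLS:
    data = requests.get(PSI, params={"url": url, "strategy": "mobile"}, timeout=60).json()
    metrics = data.get("loadingExperience", {}).get("metrics", {})
    lcp = metrics.get("LARGEST_CONTENTFUL_PAINT_MS", {}).get("percentile")
    inp = metrics.get("INTERACTION_TO_NEXT_PAINT", {}).get("percentile")
    cls = metrics.get("CUMULATIVE_LAYOUT_SHIFT_SCORE", {}).get("percentile")
    # CLS percentile is reported x100 (e.g. 10 means 0.10)
    print(f"{url}: LCP={lcp}ms (<2500), INP={inp}ms (<200), CLS={(cls or 0) / 100} (<0.1)")
```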
Site Architecture
- URL structure: Clean, keyword-descriptive URLs. Avoid: yoursite.com/p=123 or /category/subcategory/sub-subcategory/post (too deep). Prefer: yoursite.com/keyword-topic.
- HTTPS: Entire site on HTTPS. Check for mixed content (HTTP resources on HTTPS pages). Screaming Frog flags these.
- Redirect chains: Screaming Frog → Response Codes → Redirection (3XX). A → B → C chains slow crawlers and waste link equity. Flatten to A → C (see the chain-detection sketch after this list).
- Orphan pages: Pages with zero internal links. Screaming Frog → Internal → Orphan Pages. Google may not discover these. Add internal links from related content.
- Crawl depth: Important pages should be ≤3 clicks from homepage. Screaming Frog → Internal → Crawl Depth. Pages at depth 5+ get crawled less frequently.
- Hreflang (multi-language sites): Check hreflang tags for international sites. Common error: en-us pointing to en-gb instead of self-referencing. Use Screaming Frog → Internationalization → Hreflang.
- Structured data (schema): Check with Google Rich Results Test. Validate JSON-LD markup. I prioritize Article, Product, FAQ, HowTo, LocalBusiness schemas—they drive rich snippets.
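For the redirect-chain check above, here’s a minimal sketch that follows each URL and flags multi-hop chains. The URL list is a placeholder; feed it your Screaming Frog 3XX export instead:

```python
# Minimal sketch: flag multi-hop redirect chains (A -> B -> C) worth flattening to A -> C.
# The URL list is a placeholder; feed it your Screaming Frog 3XX export instead.
import requests

urls = ["https://example.com/old-page", "https://example.com/blog/old-post"]

for url in urls:
    r = requests.get(url, allow_redirects=True, timeout=10)
    if len(r.history) > 1:  # one hop is fine; two or more is a chain
        hops = " -> ".join(h.url for h in r.history) + f" -> {r.url}"
        print(f"CHAIN ({len(r.history)} hops): {hops}")
```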
Real Example: Technical Audit Finding
Client: E-commerce site, 12,000 products, traffic down 40% over 3 months.
Audit finding: Screaming Frog crawl showed 8,200 products returning 302 redirects to category pages. Developer had implemented “temporary” redirects for out-of-stock items but never reverted them. Google de-indexed 68% of the product catalog.
Fix: Removed the redirects so product pages returned 200 again, added “out of stock” schema to the product markup, kept pages live. Re-submitted sitemap to GSC.
Result: 95% of products re-indexed within 3 weeks. Traffic recovered to 110% of baseline within 60 days.
Lesson: 302 redirects signal “temporary” to Google. After ~4 weeks, Google may de-index the original URL. If you must redirect, use 301 (permanent) unless you have a specific reason for 302.
Phase 2: Content Audit
Content is the product Google sells to searchers. If your content doesn’t satisfy user intent better than the other 10 results on page one, you won’t rank—regardless of your technical perfection.
I audit content for relevance, quality, structure, and internal linking. Most sites have 30-40% of their content underperforming or actively harming rankings (thin pages, keyword cannibalization, outdated info).
Tools I Use for Content Audits
- Google Search Console: Performance data—queries, clicks, impressions, CTR, position per URL.
- Google Analytics (GA4): Engagement rate, avg time on page, bounce rate per URL.
- Screaming Frog: Extract all title tags, meta descriptions, H1s, word counts.
- Ahrefs Content Explorer / Site Explorer: Keyword rankings per URL, traffic estimates, backlinks per page.
- Surfer SEO: Content score vs top 10 competitors (I use this for high-value pages).
- Clearscope or MarketMuse: Topic coverage, semantic keyword suggestions (enterprise clients).
Content Audit Checklist (16 Items)
Inventory & Performance
- Content inventory: Export all URLs from Screaming Frog. Categorize by type (blog, product, service, landing page). Add GSC traffic data, GA4 engagement data, Ahrefs rankings.
- Thin content: Flag pages <300 words. Exception: product pages (images + specs can rank). Thin blog posts ("10 tips" with 50 words per tip) rarely rank. Expand or consolidate.
- Duplicate content: Screaming Frog → Duplicate Titles/Descriptions/H1s. Also check for scraped/syndicated content (Copyscape for exact duplicates).
- Content decay: GSC → Performance → filter last 6 months vs previous 6 months. Flag pages with >30% traffic drop. Likely causes: outdated info, new competitors, lost backlinks.
- Keyword cannibalization: Multiple URLs targeting the same keyword. GSC → Performance → filter by query, see which URLs rank. Pick the best one, de-optimize or redirect the others (a grouping script follows this list).
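Here’s that cannibalization grouping as a minimal script. It assumes a CSV with query and page columns (the Search Analytics API or a connector export gives both dimensions together); the filename is hypothetical:

```python
# Minimal sketch: flag keyword cannibalization from a GSC performance export.
# Assumes a CSV with "query" and "page" columns; the filename is hypothetical.
import csv
from collections import defaultdict

pages_by_query = defaultdict(set)
with open("gsc_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        pages_by_query[row["query"]].add(row["page"])

for query, pages in sorted(pages_by_query.items()):
    if len(pages) > 1:  # two+ URLs ranking for one query = cannibalization candidate
        print(f"{query}: {len(pages)} URLs")
        for page in sorted(pages):
            print(f"  {page}")
```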
On-Page SEO Elements
- Title tags: 50-60 characters, primary keyword within the first 5 words. Screaming Frog shows too-long (>60 chars) and missing titles. Check GSC → Performance → Pages → CTR. Low CTR = bad title. (A length-check sketch follows this list.)
- Meta descriptions: 150-160 characters, compelling call-to-action, secondary keyword. Not a ranking factor but affects CTR. I rewrite descriptions for pages with <2% CTR.
- H1 tags: One per page, contains primary keyword. Screaming Frog flags missing/multiple H1s. Common error: logo wrapped in an h1 tag (don’t do this—the logo is not a heading).
- Header hierarchy: Proper H1 → H2 → H3 structure. No skipping (H1 → H3 without H2). Screaming Frog → H1/H2 tabs. Clean hierarchy helps Google parse content.
- Keyword placement: Primary keyword in H1, first 100 words, at least one H2, naturally throughout. Over-optimization (keyword stuffing) triggers Panda-like quality filters.
- Image optimization: Alt text (descriptive, includes keyword when relevant), file names (keyword-target.jpg not IMG_1234.jpg), compressed (<200KB per image), WebP format.
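The title and meta description length checks batch nicely. A minimal sketch against a crawl export; the column names are my recollection of Screaming Frog’s Internal: HTML CSV, so adjust them to your file:

```python
# Minimal sketch: flag missing/too-long titles and meta descriptions in a crawl export.
# Column names assume Screaming Frog's "Internal: HTML" CSV; adjust to your export.
import csv

with open("internal_html.csv", newline="", encoding="utf-8") as f:  # hypothetical file
    for row in csv.DictReader(f):
        url = row.get("Address", "")
        title = row.get("Title 1", "")
        desc = row.get("Meta Description 1", "")
        if not title or len(title) > 60:
            print(f"TITLE ({len(title)} chars): {url}")
        if not desc or len(desc) > 160:
            print(f"DESC  ({len(desc)} chars): {url}")
```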
Internal Linking & Content Structure
- Internal links: Every page should link to 3-5 related pages. Screaming Frog → Internal → Inlinks. Pages with <3 inlinks are orphan risks. I use Ahrefs Site Audit → Internal Link Opportunities for suggestions.
- Anchor text diversity: Avoid repetitive exact-match anchors. Mix branded, partial-match, and natural language anchors. GSC won’t show internal anchor text, but Screaming Frog → Anchor Text tab does.
- Content depth: Comprehensive content (2,000+ words) outranks shallow content for competitive queries. I use Surfer SEO to benchmark top 10 word counts, then aim for top-3 average.
- Readability: Flesch-Kincaid grade level 8-10 for general audiences. Short paragraphs (2-4 sentences), bullet lists, subheadings every 300 words. Yoast/Rank Math show readability scores (or score it yourself; see the sketch after this list).
- Content freshness: Pages with recent updates rank higher for time-sensitive queries, and Google surfaces dates in search snippets. I update top 50 pages quarterly—refresh stats, add new sections, update dates.
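If you want readability scores outside Yoast/Rank Math, the third-party textstat package (pip install textstat) computes Flesch-Kincaid directly. A minimal sketch with placeholder copy; feed in your extracted body text:

```python
# Minimal sketch: score readability with the third-party textstat package
# (pip install textstat). The sample copy is a placeholder.
import textstat

pages = {
    "/blog/seo-audit-guide": (
        "Run the crawl first. Then fix whatever blocks indexing. "
        "Only after that is it worth polishing titles and internal links."
    ),
}

for path, text in pages.items():
    grade = textstat.flesch_kincaid_grade(text)
    flag = "" if 8 <= grade <= 10 else "  <- outside the 8-10 target"
    print(f"{path}: FK grade {grade:.1f}{flag}")
```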
Real Example: Content Audit Finding
Client: B2B SaaS blog, 240 posts, stagnant traffic despite publishing 4 posts/month.
Audit finding: GSC data showed 62% of posts had zero clicks in the past 12 months. Keyword research revealed most posts targeted zero-volume or branded competitor terms (“[Competitor] alternative”). Thin content (avg 600 words), no internal links, generic titles.
Fix: Consolidated 148 zero-traffic posts into 22 comprehensive guides targeting their actual keywords (“project management software,” “task automation tools”). Redirected old URLs to new guides. Expanded content to 2,500+ words per guide, added 8-12 internal links per post, rewrote titles for CTR.
Result: Organic traffic +180% in 4 months. 19 of 22 new guides ranked page 1 within 60 days. Engagement rate (GA4) increased from 28% to 61%.
Lesson: More content ≠ better rankings. Consolidate thin, low-traffic posts into fewer, deeper, more authoritative pieces. Google rewards comprehensive answers, not shallow coverage.
Phase 3: Backlink Audit
Backlinks remain a top-3 ranking factor. Quality matters infinitely more than quantity. I’ve seen sites with 1,000 backlinks outranked by sites with 50 backlinks—because the 50 were from .edu domains, industry publications, and high-DR sites.
A backlink audit identifies toxic links (spam, PBNs, paid links), finds gaps vs competitors, and uncovers link-building opportunities.
Tools I Use for Backlink Audits
- Ahrefs Site Explorer: Most comprehensive backlink database (I prefer over Majestic/Moz). Shows referring domains, anchor text, link growth over time.
- Google Search Console: Links → Top linking sites. Free, shows what Google sees (though less detailed than Ahrefs).
- SEMrush Backlink Audit: Automated toxic link detection. I use this as a second opinion.
- Manual review: I spot-check top 50 backlinks in Ahrefs. Tools miss context (a “toxic” link from a forum might be from an industry expert).
Backlink Audit Checklist (12 Items)
Link Profile Health
- Total referring domains: Ahrefs → Site Explorer → Backlinks → Referring Domains. Track growth over time. Sudden spikes = possible negative SEO (spam attack). Sudden drops = lost links or Google de-indexed linking sites.
- Domain Rating (DR) / Domain Authority (DA): Ahrefs DR or Moz DA. Higher is better but not everything. I prioritize relevance over raw DR. A DR 30 industry blog > DR 60 generic news site.
- Link velocity: Ahrefs → New/Lost Referring Domains. Healthy: steady growth. Red flag: 500 new links in one week (likely spam). I investigate all spikes >50 links/week.
- Anchor text distribution: Ahrefs → Anchors. Healthy profile: 40-60% branded, 20-30% URL/generic (“click here”), 10-20% partial-match, <5% exact-match. Over-optimized anchor text (50%+ exact-match) risks Penguin-like filters. (See the bucketing sketch after this list.)
- Dofollow vs nofollow ratio: Ahrefs → Backlinks → Dofollow/Nofollow filter. Healthy: 70-85% dofollow. 100% dofollow = unnatural. Some nofollow is normal (social media, forums, press releases).
- Link placement: Ahrefs → Backlinks → inspect individual links. In-content links (within article body) pass more equity than sidebar/footer links. Guest post bio links are low-value.
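To measure your anchor distribution against those target ranges, here’s a minimal bucketing sketch. The “Anchor” column name, brand terms, and money keyword are assumptions; match them to your export:

```python
# Minimal sketch: bucket anchors from a backlink export against the target mix above.
# The "Anchor" column, brand terms, and money keyword are assumptions.
import csv
from collections import Counter

BRAND = {"acme", "acme.com"}             # hypothetical brand terms
EXACT = {"project management software"}  # hypothetical money keyword

counts = Counter()
with open("anchors.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchor = row["Anchor"].strip().lower()
        if anchor in BRAND:
            counts["branded"] += 1
        elif anchor in EXACT:
            counts["exact-match"] += 1
        elif any(term in anchor for term in EXACT):
            counts["partial-match"] += 1
        else:
            counts["generic/URL"] += 1

total = sum(counts.values()) or 1
for bucket, n in counts.most_common():
    print(f"{bucket}: {n / total:.0%}")
```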
Toxic Link Detection
- Spam score: Spam Score is Moz’s metric (Link Explorer), not an Ahrefs one. Flag domains with >50% Spam Score. SEMrush Backlink Audit auto-flags toxic links (I review before disavowing).
- Irrelevant links: Manual review. Example: pet blog linking to fintech site = irrelevant. Likely paid or PBN. I disavow unless there’s a legitimate reason (co-marketing, partnership).
- Sitewide links: Ahrefs → Backlinks → filter by “Sitewide.” Blogrolls, footer links from 500 pages on one domain. Google discounts most sitewide links. Excessive sitewide links from low-quality sites = red flag.
- Link farms / PBNs: Look for: same IP range (Ahrefs → Referring IPs), same registrar, similar templates, interlinked domains. I disavow entire networks.
- Foreign-language links (if targeting English): 1,000 Russian or Chinese links to an English site = spam. Check Ahrefs → Referring Domains → Language. Disavow non-English domains unless you have an international audience.
- Disavow file: Create disavow.txt, one entry per line: domain:example.com to disavow a whole domain, or a full URL to disavow a single link. Upload to GSC → Disavow Links Tool. I disavow conservatively—only clear spam. Disavowing good links by mistake hurts more than leaving mediocre links. (A generator sketch follows this list.)
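A minimal generator for that file format. The domain list is hypothetical; only include domains you’ve manually confirmed as spam:

```python
# Minimal sketch: build a disavow.txt from a manually reviewed spam list.
# The domains are hypothetical; only include domains you've confirmed as spam.
spam_domains = ["spam-forum.example", "pharma-links.example"]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# Generated after manual review\n")  # the format allows # comments
    for domain in sorted(set(spam_domains)):
        f.write(f"domain:{domain}\n")
```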
Competitor Backlink Gap Analysis
I run this for every client. It’s the fastest way to find link-building opportunities.
Process:
- Ahrefs → Site Explorer → enter competitor domain
- Backlinks → Best by Links → filter by DR >30, dofollow only
- Export top 100 linking domains
- Ahrefs → Link Intersect → enter your domain + 3 competitors
- Filter: “Linking to competitors but not to you”
- Export opportunities → prioritize by DR, relevance, link type (guest post, resource page, directory)
Result: I typically find 20-40 realistic link opportunities per competitor analysis. These are domains already linking to similar content—far easier to acquire than cold outreach.
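If you prefer to replicate the intersect offline from raw referring-domain exports, it’s a set difference. Filenames and the column name are assumptions; match them to your exports:

```python
# Minimal sketch: compute a link gap from referring-domain exports (one CSV per site).
# Filenames and the "Referring Domain" column name are assumptions.
import csv

def referring_domains(path):
    with open(path, newline="", encoding="utf-8") as f:
        return {row["Referring Domain"] for row in csv.DictReader(f)}

ours = referring_domains("ours.csv")
theirs = (referring_domains("comp_a.csv")
          | referring_domains("comp_b.csv")
          | referring_domains("comp_c.csv"))

for domain in sorted(theirs - ours):  # linking to competitors but not to you
    print(domain)
```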
Real Example: Backlink Audit Finding
Client: Local service business, rankings dropped 60% overnight.
Audit finding: Ahrefs showed 2,400 new backlinks in 10 days—all from Russian forums, adult sites, pharma blogs. Clear negative SEO attack (competitor or automated spam bot).
Fix: Created disavow file with 140 spam domains. Submitted to GSC. Also submitted reconsideration request (explaining the attack). Monitored Ahrefs for new spam links weekly.
Result: Rankings recovered 80% within 3 weeks of disavow. Full recovery after 2nd disavow update (caught 60 more spam domains). Client now monitors backlinks weekly via Ahrefs alerts.
Lesson: Negative SEO is real. Set up Ahrefs alerts (Settings → Alerts → New Backlinks) to catch spam attacks early. Disavow promptly but carefully—document every domain you disavow.
Phase 4: UX Audit
UX audits find conversion killers: confusing navigation, broken CTAs, slow load times, poor mobile experience. Google’s ranking algorithm incorporates user engagement signals (click-through rate, dwell time, pogo-sticking). Bad UX = bad engagement = worse rankings.
I focus on measurable UX issues—things I can track in GA4 or observe in session recordings.
Tools I Use for UX Audits
- Google Analytics 4: Engagement rate, avg session duration, pages per session, conversion rate per page.
- Hotjar or Microsoft Clarity: Heatmaps, session recordings, rage clicks. Clarity is free and surprisingly good.
- Google Search Console: Core Web Vitals and indexing signals per URL (its standalone Mobile Usability report has been retired).
- Lighthouse (Chrome DevTools): Accessibility score, best practices, SEO basics.
- Manual testing: I test on real devices (iPhone 14, Samsung Galaxy, iPad) + desktop. Simulators miss touch target issues and font rendering bugs.
UX Audit Checklist (11 Items)
Navigation & Structure
- Navigation clarity: Can users find key pages in ≤2 clicks? Test: give someone unfamiliar with your site a task (“find pricing”). If they struggle, your nav is broken.
- Breadcrumbs: Every page beyond homepage should have breadcrumbs. Helps users backtrack, helps Google understand site hierarchy. Add BreadcrumbList schema.
- Search functionality: If site has >50 pages, add search. Track search queries in GA4 (Events → view_search_results). Failed searches reveal content gaps.
- 404 page: Custom 404 with search box + links to popular pages. GA4 → Events → page_view → filter page_location contains “404.” High 404 rate = broken internal links or outdated backlinks.
Mobile UX
- Touch target size: Buttons/links ≥48×48 CSS pixels. GSC flags this under Mobile Usability. I test by tapping with thumb on real device—if I miss the target, it’s too small.
- Text readability: Font size ≥16px on mobile. Contrast ratio ≥4.5:1 (Lighthouse → Accessibility checks this; see the contrast sketch after this list). I test in bright sunlight (outdoor readability).
- Viewport configuration: include <meta name="viewport" content="width=device-width, initial-scale=1"> in the <head>. Without this, mobile browsers render the desktop version.
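The 4.5:1 contrast check is just arithmetic on the WCAG relative-luminance formula, so you can verify a color pair without opening Lighthouse. A minimal sketch:

```python
# Minimal sketch: WCAG 2.x contrast ratio for a text/background pair (target >= 4.5:1).
def luminance(hex_color):
    r, g, b = (int(hex_color[i:i + 2], 16) / 255 for i in (0, 2, 4))
    def lin(c):  # sRGB channel linearization per WCAG
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b)

def contrast(fg, bg):
    lighter, darker = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

print(f"{contrast('777777', 'ffffff'):.2f}:1")  # mid-grey on white: ~4.48, just fails
```

Mid-grey (#777777) on white lands at about 4.48:1, which is why light-grey body text so often fails accessibility audits by a hair.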
Conversion & Engagement
- CTA placement: Clear call-to-action above fold on key pages (product, service, landing pages). Heatmaps show if users see CTAs. I use contrasting color for CTA buttons (green on white, orange on blue).
- Form friction: Minimize required fields. Every extra field reduces conversions ~5-10%. I remove “Company” and “Phone” from lead forms unless absolutely necessary.
- Page load time (UX perspective): Users abandon after 3 seconds. GA4 → Events → page_view → avg engagement time. Pages with <10s avg = users leaving fast, likely due to slow load or bad content.
- Exit rate analysis: GA4 → Explore → Free Form → Dimension: Page path → Metric: Exit rate. High exit rate on key pages (pricing, product) = UX issue or unclear value prop. Session recordings (Hotjar/Clarity) show why users leave.
Real Example: UX Audit Finding
Client: SaaS company, high traffic (50K/month), low conversions (0.8% free trial signups).
Audit finding: Hotjar session recordings showed 60% of users on pricing page scrolled to bottom, then left. CTA button (“Start Free Trial”) was below fold, light blue on white (low contrast). Form required 9 fields including “Company Size” dropdown with 12 options. GA4 showed 78% form abandonment rate.
Fix: Moved CTA above fold, changed button to high-contrast orange, reduced form to 3 fields (name, email, password), removed “Company Size” (asked post-signup). A/B tested for 14 days.
Result: Conversion rate increased from 0.8% to 2.3% (+187%). Form completion rate went from 22% to 71%. Same traffic, nearly 3x conversions.
Lesson: UX issues cost conversions. Session recordings reveal what analytics can’t—exact moments users get frustrated and leave. Every extra form field is a conversion killer.
Audit Prioritization: What to Fix First
You’ll find 100+ issues in any comprehensive audit. You can’t fix everything at once. I use a priority matrix based on impact and effort.
Priority Matrix
| Priority | Impact | Effort | Examples | Deadline |
|---|---|---|---|---|
| P0 (Critical) | High | Low | robots.txt blocking site, entire site noindexed, massive indexation error, manual penalty | Fix today |
| P1 (High) | High | Medium | Core Web Vitals failing, 500+ 404 errors with backlinks, toxic backlink attack, duplicate content | Fix this week |
| P2 (Medium) | Medium | Low-Medium | Missing H1s on 50 pages, thin content on low-traffic pages, internal linking gaps, slow images | Fix this month |
| P3 (Low) | Low-Medium | High | Site redesign, URL restructure, migrating CMS, rewriting entire content library | Quarterly project |
| P4 (Backlog) | Low | Any | Minor schema errors, alt text on non-ranking images, micro-optimizations | When time permits |
My Prioritization Rules
- Fix blocking issues first. If Google can’t crawl/index your site, nothing else matters. P0 always comes before P1.
- Prioritize pages with existing traffic. Fixing a page that gets 1,000 visits/month beats optimizing a page with 10 visits/month.
- Fix pages with rankings on page 2. Position 11-20 pages are low-hanging fruit—small improvements push them to page 1. Use GSC to find these (Performance → sort by Position, or export and filter positions 11-20 in a sheet).
- Batch similar fixes. Fixing 200 missing alt tags in one session is faster than fixing them one-by-one over weeks. I export Screaming Frog data to spreadsheet, bulk-edit, then upload via CSV or API.
- Track fixes in a log. I use Notion or Google Sheets: Issue | Priority | Date Found | Date Fixed | Result (traffic change). This proves ROI and helps refine future audits.
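If you keep the log structured, the matrix above reduces to a simple sort. A minimal sketch with hypothetical findings, scoring impact and effort 1-3:

```python
# Minimal sketch: sort an audit log by the matrix above (impact desc, effort asc).
# Findings are hypothetical, scored 1-3.
issues = [
    {"issue": "robots.txt blocks /blog/", "impact": 3, "effort": 1},
    {"issue": "CWV failing on mobile",    "impact": 3, "effort": 2},
    {"issue": "200 missing alt tags",     "impact": 1, "effort": 1},
]

for i in sorted(issues, key=lambda x: (-x["impact"], x["effort"])):
    print(f"impact={i['impact']} effort={i['effort']}  {i['issue']}")
```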
Post-Audit: Monitoring & Iteration
An audit is not a one-time event. SEO decays. Competitors improve. Google updates. I re-audit every site quarterly (monthly for enterprise clients or during traffic drops).
What I Monitor Post-Audit
- GSC Index Coverage: Weekly check for new errors. GSC emails critical issues to verified owners and users automatically; add teammates under Settings → Users and permissions so they receive alerts too.
- Core Web Vitals: Monthly check via GSC. I track LCP/INP/CLS for top 20 landing pages.
- Rankings: I use Ahrefs Rank Tracker or SEMrush Position Tracking. Track top 50 keywords weekly. Alert on >5 position drops.
- Backlinks: Ahrefs alerts for new/lost backlinks. I review new backlinks weekly—disavow spam immediately.
- Traffic: GA4 + GSC. I compare 30-day windows. >15% drop = investigate immediately (algorithm update, technical issue, content decay). A comparison sketch follows this list.
- Conversions: GA4 conversion tracking. If traffic increases but conversions don’t, it’s a UX or targeting issue (ranking for wrong keywords).
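Here’s that 30-day comparison as a minimal script. It assumes two GSC page-level exports with Page and Clicks columns; adjust names to your files:

```python
# Minimal sketch: flag pages whose clicks dropped >15% between two 30-day GSC exports.
# Filenames and the "Page"/"Clicks" column names are assumptions.
import csv

def clicks_by_page(path):
    with open(path, newline="", encoding="utf-8") as f:
        return {row["Page"]: int(row["Clicks"]) for row in csv.DictReader(f)}

prev = clicks_by_page("previous_30d.csv")
curr = clicks_by_page("current_30d.csv")

for page, before in sorted(prev.items(), key=lambda kv: -kv[1]):
    after = curr.get(page, 0)
    if before >= 10 and (before - after) / before > 0.15:  # skip low-click noise
        print(f"-{(before - after) / before:.0%}  {page}  ({before} -> {after})")
```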
Quarterly Audit Checklist (Maintenance)
Every 90 days:
- Full Screaming Frog crawl → compare to previous crawl (new errors?)
- GSC Index Coverage review → new excluded pages?
- Content decay check → GSC Performance, last 90 days vs previous 90 days, flag >30% traffic drops
- Backlink audit → Ahrefs, check for spam/lost links
- Core Web Vitals → GSC, verify still passing
- Competitor analysis → Ahrefs, who’s gaining traffic on your keywords?
- Update audit log → document changes, track ROI
Audit Reporting: What to Include
If you’re auditing for a client or stakeholder, the report format matters. I structure reports as:
- Executive Summary (1 page): Top 5 critical issues, estimated traffic impact, timeline for fixes.
- Prioritized Issue List (table): Issue | Category (Technical/Content/Backlinks/UX) | Impact | Effort | Priority | Recommendation.
- Detailed Findings (per category): Screenshots, Screaming Frog exports, GSC data, competitor comparisons.
- Action Plan (Gantt chart or timeline): Who does what, by when. I use ClickUp or Asana for this.
- Appendix: Raw data (full Screaming Frog crawl, Ahrefs exports, GSC screenshots).
I deliver reports as PDF + live Google Sheet (for issue tracking). Clients update the sheet as they fix issues—keeps everyone aligned.
Tools Summary: My Audit Stack
| Tool | Purpose | Cost | Alternative |
|---|---|---|---|
| Screaming Frog | Technical crawling, on-page analysis | $259/year | Sitebulb ($35/mo) |
| Google Search Console | Index coverage, Core Web Vitals, search performance | Free | Bing Webmaster Tools |
| Ahrefs | Backlinks, competitor analysis, keyword research | $129/mo | SEMrush ($139/mo), Moz ($99/mo) |
| Google Analytics 4 | Traffic, engagement, conversions | Free | Matomo (self-hosted) |
| PageSpeed Insights | Core Web Vitals, performance recommendations | Free | GTmetrix (free), WebPageTest |
| Microsoft Clarity | Heatmaps, session recordings | Free | Hotjar ($39/mo) |
| Surfer SEO | Content optimization, SERP analysis | $89/mo | Clearscope ($170/mo), Frase ($45/mo) |
Minimum viable stack (budget): Screaming Frog + GSC + GA4 + Clarity = $259/year. You can run professional audits with just these four tools.
Professional stack (agency/consultant): Add Ahrefs ($129/mo) + Surfer SEO ($89/mo) = ~$2,800/year. This is my daily stack.
Common Audit Mistakes (And How I Avoid Them)
- Auditing without objectives. “Do an SEO audit” is too vague. I start every audit with a goal: recover lost traffic, improve conversions, prepare for migration, outrank competitor X. The goal determines what I prioritize.
- Ignoring intent. You can have perfect technical SEO and still not rank if your content doesn’t match search intent. I check top 10 results for every target keyword—if they’re all listicles, I write a listicle. If they’re all product comparisons, I write a comparison.
- Over-relying on tools. Tools miss context. Ahrefs might flag a link as “toxic” when it’s actually a legitimate editorial link from a niche forum. I manually review flagged issues before acting.
- Fixing low-impact issues first. I’ve seen teams spend weeks fixing minor alt text issues while ignoring a sitewide noindex tag. Always prioritize by impact, not ease.
- Not tracking results. If you don’t measure the outcome of your fixes, you can’t learn what works. I log every fix and track traffic/rankings 30 days later.
- Analysis paralysis. Audits can uncover 200+ issues. Don’t wait to fix everything—fix P0/P1 issues immediately, then iterate. Perfection is the enemy of progress.
Audit Frequency: How Often Should You Audit?
- New sites: Full audit before launch + 30 days post-launch.
- Established sites (stable traffic): Quarterly full audits + monthly spot-checks (GSC errors, Core Web Vitals).
- High-competition niches: Monthly full audits. Competitors move fast—you need to move faster.
- After traffic drops: Immediate audit (same day if >20% drop).
- After Google updates: Wait 7 days post-rollout (volatility settles), then audit if impacted.
- Before/after migrations: Full audit before (baseline), full audit 7 days after (catch issues), follow-up audit 30 days after (measure impact).
Final Thoughts: Audit Like a Pro
I’ve run audits on 5-page local business sites and 500,000-page e-commerce sites. The process is the same: crawl, analyze, prioritize, fix, measure. The difference between amateurs and pros is not the tools—it’s the prioritization and the follow-through.
Amateurs find 100 issues and feel overwhelmed. Pros find 100 issues, fix the 5 that matter, and ignore the rest. Amateurs audit once and forget. Pros audit quarterly and track improvements obsessively.
Your audit is only valuable if you act on it. I’ve seen $10,000 audits sit in a Google Drive folder for 6 months while traffic continues to drop. Don’t be that person. Pick your top 3 P0/P1 issues, fix them this week, measure the results in 30 days, then move to the next 3.
SEO is a system, not a project. The audit is your diagnostic. The fixes are your treatment. The monitoring is your ongoing health maintenance. Treat it like that, and you’ll outrank competitors who audit once and disappear.
Related Resources
- Technical SEO Guide — Deep dive into crawling, indexing, site architecture
- Core Web Vitals Guide — Fix LCP, INP, CLS for better rankings
- On-Page SEO Checklist — Optimize titles, meta, headers, content
- Keyword Research Guide — Find keywords worth targeting
- Link Building Strategies — Earn high-quality backlinks at scale
- Schema Markup Guide — Implement JSON-LD for rich results
- SEO Content Writing Guide — Write content that ranks
- How to Measure SEO ROI — Track and prove SEO value
- Technical SEO Glossary — Key terms explained
- Crawling Glossary — Understand how search engines crawl
- Indexing Glossary — How Google indexes pages
- robots.txt Glossary — Control crawler access
- XML Sitemap Glossary — Help Google discover your content
- Core Web Vitals Glossary — LCP, INP, CLS explained