
Technical SEO Checklist: 20 Must-Fix Issues

Technical SEO is the foundation that determines whether search engines can discover, crawl, render, and index your content effectively. A site with brilliant content and strong backlinks can still underperform if crawl errors, slow page speeds, or indexation issues prevent Google from processing its pages. This 20-point checklist, organized by priority, gives you a systematic framework for auditing and fixing the technical health of any website. Use tools like Screaming Frog, Google Search Console, and Ahrefs Site Audit to work through each item.

Crawlability and Indexation (Items 1-7)

1. XML Sitemap Accuracy: Your sitemap should include only indexable, canonical, 200-status URLs. Remove redirected, noindexed, or orphaned pages. Submit it in Google Search Console and check the page indexing report for discrepancies.

2. Robots.txt Configuration: Verify your robots.txt is not accidentally blocking critical resources such as CSS, JS, or entire directories. Check it with the robots.txt report in Search Console.

3. Crawl Budget Optimization: For sites with over 10,000 pages, crawl budget matters. Block faceted navigation, internal search results, and parameter-heavy URLs from crawling to focus Googlebot on high-value pages.

4. Canonical Tags: Every indexable page needs a self-referencing canonical tag, and all duplicate or near-duplicate pages must point to the correct canonical version.
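The sitemap check in item 1 is easy to script. Here is a minimal Python sketch; the function names are our own, and in practice you would pass a real HTTP client (for example `requests.head` with `allow_redirects=False`) as the status callback:

```python
# Minimal sitemap sanity check (illustrative helper names, not a real tool's API).
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def extract_sitemap_urls(sitemap_xml: str) -> list[str]:
    """Return every <loc> URL listed in a sitemap XML document."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]

def audit_urls(urls, fetch_status):
    """Flag sitemap URLs that do not return a clean 200.

    fetch_status is any callable mapping URL -> HTTP status code,
    e.g. lambda u: requests.head(u, allow_redirects=False).status_code
    """
    return {url: status for url in urls
            if (status := fetch_status(url)) != 200}
```

Anything this flags (301s, 404s, 5xx) should be removed from the sitemap or fixed at the source.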

5. Hreflang Tags: If you serve content in multiple languages or regions, implement hreflang annotations so Google serves the right localized version to each audience and does not treat the variants as duplicates. Validate with Ahrefs or a dedicated hreflang checker.

6. 404 Error Cleanup: Crawl your site and fix or redirect any URLs returning 404 errors, especially those with inbound links or historical traffic. Use Search Console's page indexing report to identify these.

7. Redirect Chain Resolution: Redirect chains (A redirects to B, which redirects to C) waste crawl budget and dilute link equity. Flatten every chain so the original URL redirects directly to the final destination. Screaming Frog's redirect chain report makes this straightforward.
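Detecting the chains described in item 7 is a simple graph walk. A hedged Python sketch, assuming you have already crawled your redirects into a source-to-target map (Screaming Frog can export one):

```python
def resolve_chain(url, redirects, max_hops=10):
    """Follow a URL through a redirect map and return (final_url, hops).

    redirects maps each redirecting URL to its target; a URL absent
    from the map is treated as a final destination.
    """
    chain = [url]
    while url in redirects:
        url = redirects[url]
        if url in chain or len(chain) > max_hops:
            raise ValueError(f"redirect loop or excessive chain at {url}")
        chain.append(url)
    return url, len(chain) - 1

def chains_to_flatten(redirects):
    """List source URLs whose redirect takes more than one hop."""
    return [src for src in redirects
            if resolve_chain(src, redirects)[1] > 1]
```

Every URL this surfaces should be repointed directly at its resolved final destination.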

Security and Mobile (Items 8-11)

8. HTTPS Everywhere: Every page, resource, and internal link should use HTTPS. Mixed content (HTTP resources loaded on HTTPS pages) undermines trust signals and can trigger browser security warnings. Run a Screaming Frog crawl filtered by "insecure content" to find and fix it.

9. Mobile Usability: Google uses mobile-first indexing, meaning the mobile version of your site is what gets crawled and ranked. Since Google retired the standalone Mobile-Friendly Test, audit with Lighthouse in Chrome DevTools and fix issues like text too small to read, tap targets too close together, and content wider than the screen.
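A crude first pass at the mixed-content hunt in item 8 can be done with a regular expression over each page's HTML. This is a sketch, not a substitute for a crawler: it only catches quoted src/href attributes, and it will also flag plain HTTP anchor links, which are not strictly mixed content but are still worth fixing:

```python
import re

# Matches src/href attributes referencing a plain-HTTP URL.
# Coarse by design: also flags <a href="http://..."> links.
INSECURE_ATTR = re.compile(
    r'(?:src|href)\s*=\s*["\'](http://[^"\']+)["\']', re.IGNORECASE
)

def find_mixed_content(html: str) -> list[str]:
    """Return HTTP resource URLs referenced from a page's HTML."""
    return INSECURE_ATTR.findall(html)
```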

10. Core Web Vitals: Google's page experience signals measure real-world user performance. Target an LCP (Largest Contentful Paint) under 2.5 seconds, an INP (Interaction to Next Paint) under 200 milliseconds, and a CLS (Cumulative Layout Shift) under 0.1. Use PageSpeed Insights and Chrome User Experience Report (CrUX) data to prioritize fixes.

11. Structured Data: Implement schema markup for your content type: Article, Product, LocalBusiness, FAQ, or HowTo. Validate with Google's Rich Results Test. Structured data does not directly boost rankings, but it enables rich snippets that significantly improve click-through rates from the SERP.
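The three "good" thresholds in item 10 are simple to encode, which is handy when you pull field data for many pages (for example from the PageSpeed Insights API) and want to triage automatically. The metric keys below are our own naming, not an API schema:

```python
# Google's published "good" thresholds for the three Core Web Vitals.
THRESHOLDS = {"lcp_ms": 2500, "inp_ms": 200, "cls": 0.1}

def failing_vitals(metrics: dict) -> dict:
    """Return the metrics that exceed their 'good' threshold."""
    return {name: value for name, value in metrics.items()
            if value > THRESHOLDS[name]}
```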

"Technical SEO is not a one-time project. It is a continuous process because every site redesign, CMS update, and content migration introduces new potential issues."

Site Architecture and Content (Items 12-16)

12. Internal Linking Structure: Build a logical internal link structure where every important page is reachable within three clicks from the homepage. Use descriptive anchor text and link contextually between topically related pages.

13. URL Structure: Keep URLs short, descriptive, and lowercase with hyphens. Avoid parameters, session IDs, and unnecessary folder depth. A clean URL like /services/seo-audit/ outperforms /services.php?id=47&cat=3 in both user experience and crawlability.

14. Duplicate Content: Use Screaming Frog's near-duplicate detection or Siteliner to identify pages with substantially similar content. Consolidate with canonicals, noindex tags, or content differentiation.
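The three-click rule in item 12 can be verified with a breadth-first search over your internal-link graph, which any crawler can export. A Python sketch with illustrative data structures:

```python
from collections import deque

def click_depths(links, home="/"):
    """BFS over an internal-link graph; returns clicks-from-home per URL.

    links maps each URL to the set of URLs it links to.
    """
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, ()):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

def too_deep(links, home="/", max_depth=3):
    """URLs more than max_depth clicks from the homepage
    (pages unreachable via internal links are excluded here)."""
    return [u for u, d in click_depths(links, home).items() if d > max_depth]
```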

15. Pagination Handling: For paginated content like blog archives or product listings, ensure each paginated page is indexable with a unique title and that the full content set is crawlable via next/previous links. Google no longer uses rel=next/prev for indexing, so self-referencing canonicals on each paginated page are the correct approach.

16. JavaScript Rendering: If your site relies heavily on JavaScript (React, Angular, Vue), verify that Googlebot can render critical content. Use Google's URL Inspection tool to compare the rendered HTML with your source. Server-side rendering (SSR) or pre-rendering with tools like Prerender.io ensures content is visible to crawlers without executing client-side JS.
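A quick smoke test for item 16: fetch the raw, server-delivered HTML (for example with `requests.get`) and check whether key content phrases are present before any JavaScript runs. The helper below is our own simplification of that idea:

```python
def missing_without_js(raw_html: str, markers: list[str]) -> list[str]:
    """Content phrases absent from the server-delivered HTML.

    Phrases that only appear after client-side rendering are invisible
    to any crawler that does not execute JavaScript.
    """
    return [m for m in markers if m not in raw_html]
```

If your page's headline or body copy shows up in this list, you need SSR, pre-rendering, or at least a rendering check via URL Inspection.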

Performance and Assets (Items 17-20)

17. Image Optimization: Serve images in next-gen formats like WebP or AVIF, implement responsive srcset attributes, and add descriptive alt text to every image. Lazy-load below-the-fold images to reduce initial page weight.

18. Log File Analysis: Server log files show exactly how Googlebot interacts with your site. Tools like Screaming Frog Log File Analyser or Botify reveal which pages get crawled most often, which are ignored, and whether bot traffic is wasted on low-value URLs. This data is invaluable for crawl budget optimization on large sites.
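For item 18, even a small script over your access logs answers the basic questions. This sketch assumes the common combined log format and counts only GET requests; real log layouts vary by server, so treat the regex as a starting point, and note that a "Googlebot" user agent can be spoofed (verify by reverse DNS in production):

```python
import re
from collections import Counter

# Combined-log-format line: IP, identd, user, [timestamp],
# "GET <path> <protocol>", status, bytes, "referrer", "user-agent".
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "GET (\S+) [^"]*" \d+ \d+ "[^"]*" "([^"]*)"'
)

def googlebot_hits(log_lines):
    """Count Googlebot GET requests per URL path from access-log lines."""
    counts = Counter()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if m and "Googlebot" in m.group(2):
            counts[m.group(1)] += 1
    return counts
```

Sorting the resulting counter shows at a glance whether Googlebot's attention matches your priority pages.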

19. Orphan Pages: Orphan pages have no internal links pointing to them, making them effectively invisible to crawlers navigating your site structure. Cross-reference your sitemap with your internal link graph to find pages that exist in the sitemap but receive zero internal links. Either link to them from relevant content or remove them.

20. Site Architecture Depth: Aim for a flat site architecture where important pages sit no more than three levels deep. Deep pages (four or more clicks from the homepage) receive less crawl frequency and less link equity. Use breadcrumbs and hub pages to flatten deep structures.

For related optimization strategies, check out our content clusters for SEO guide and our annual SEO audit walkthrough.
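The cross-reference in item 19 reduces to a set difference between your sitemap URLs and everything your internal-link graph points at. A minimal sketch with illustrative data shapes (the homepage itself arrives through navigation links, so it should appear in the graph rather than the orphan list):

```python
def orphan_pages(sitemap_urls, link_graph):
    """URLs in the sitemap that receive no internal links.

    link_graph maps each crawled page to the set of URLs it links to.
    """
    linked = set().union(*link_graph.values()) if link_graph else set()
    return sorted(set(sitemap_urls) - linked)
```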

  • Run a full crawl with Screaming Frog and cross-reference with Google Search Console data.
  • Fix all redirect chains so original URLs point directly to the final destination.
  • Target Core Web Vitals benchmarks: LCP under 2.5s, INP under 200ms, CLS under 0.1.
  • Implement structured data for your primary content types and validate with Rich Results Test.
  • Audit for orphan pages monthly and ensure every important URL has internal links.
  • Analyze server log files quarterly to understand Googlebot's actual crawl behavior.
