• April 21, 2023
  • 9 min. read

How to do a technical SEO Audit: 2023 Checklist


Anush Bichakhchyan

Content Writer


According to statistics, only 9.37% of websites rank on Google; the rest simply get lost in an ocean of millions of sites that receive no traffic. The three SEO components (on-page, off-page, and technical SEO) play a crucial role in website ranking and should be part of ongoing website maintenance.

We love SEO and hope the feeling is mutual: our efforts at optimizing websites across all three components pay off. We won't prioritize one SEO component over another, but offer a comprehensive checklist for each. We have already published a complete on-page and off-page SEO checklist; this article is all about technical SEO.

What is technical SEO?


Technical SEO, like the other components of SEO, aims to make the website better for users by optimizing its performance and focusing on its technical underpinnings. Unlike design or content, you can't see it, but you will definitely feel its absence. And unlike other components such as off-page SEO, which takes time and sustained commitment to show results, technical SEO is more straightforward. A technical SEO audit follows specific search engine guidelines and checks everything related to the website's visibility and user experience.

Why is technical SEO important?

Technical SEO is critical, be it a landing page or a massive multi-page website with services, because both the owners of a single page and a multi-page website want to get rankings. Here are the reasons why you shouldn't skip technical SEO. 

  • Technical SEO makes your website accessible to search engines so they can crawl and index its pages.
  • Optimizes click-through rate by checking titles and meta descriptions and flagging missing ones.
  • Checks and helps improve page speed, a key ranking and user experience factor.
  • Helps improve website security for search engines and users.
  • Helps you understand website analytics and focus marketing efforts where they will contribute the most.

Technical SEO checklist

The following checklist includes all the major components required for a complete technical SEO audit. Follow these steps to keep the website user-friendly and visible in search engines.

Website loading speed


Page load is one of the most important factors affecting user interaction in the first few seconds, if not the most important: half of all bounces happen within the first three seconds of page load. Yes, life is too fast to wait for a page to load.

To start with page speed optimization, use Google's PageSpeed Insights to get a page load time analysis. Pay special attention to the page load distribution chart with performance metrics: First Contentful Paint (FCP) and Largest Contentful Paint (LCP). The tool shows how each metric is distributed and which one dominates the loading time.

Improve server response time

Server response time is the time the server takes to return the HTML needed to render a page. Slow response times can be caused by slow application logic, a slow database, slow routing, resource starvation, plugins, or an underpowered server, and they also slow down website crawling. Google recommends keeping the response time under 200 ms. But first of all, you should identify the slow-performing processes:

  1. Measure response time
  2. Collect data about issues (if there are any)
  3. Monitor regressions and fix issues (if there are any)
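The measuring step can be sketched with Python's standard library. This is a minimal time-to-first-byte (TTFB) check; the demo runs against a throwaway local server so it is self-contained, but in practice you would point it at your own URLs:

```python
import http.server
import threading
import time
import urllib.request

def measure_response_time(url: str) -> float:
    """Return time to first byte (TTFB) for `url`, in milliseconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as resp:
        resp.read(1)  # wait for the first byte only
    return (time.perf_counter() - start) * 1000.0

# Demo against a local server so the sketch needs no network access.
server = http.server.HTTPServer(("127.0.0.1", 0),
                                http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
ttfb = measure_response_time(f"http://127.0.0.1:{server.server_port}/")
print(f"TTFB: {ttfb:.1f} ms (Google recommends < 200 ms)")
server.shutdown()
```

Running this against every key page on a schedule gives you the regression-monitoring data from step 3.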

Often, the main reason for slow server response time is the server; that’s why it is critical to choose a quality server from the beginning. Depending on the scalability of the website, it can be hosted on a shared hosting service or on dedicated servers. 

What does "shared hosting" mean? Imagine sharing the same host (and the same IP address) with a low-quality website. If Google penalizes that website's domain, the reputation of the shared IP can suffer, and your domain may be affected as well.

Optimize image sizes 

Large image sizes are one of the primary causes of slow page load times. If your website has a lot of images, the server takes more time to load the page, especially when the images are not optimized. Reducing image size is quite easy, and it significantly boosts page load speed, as long as the optimization does not visibly degrade image quality.

Recommendations to optimize images:

  • Automatically compress images with tools or plugins
  • Use PNG and WebP formats
  • For JPEGs, reduce quality to around 85%
  • Choose vector formats that are resolution and scale independent
  • Remove unnecessary metadata
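Two of the recommendations above (quality 85 for JPEGs, metadata removal) can be sketched with the third-party Pillow library; the article doesn't name a tool, so Pillow is an assumption, and the generated gradient image stands in for a real photo:

```python
from io import BytesIO
from PIL import Image  # third-party: pip install Pillow

def compress_jpeg(img: Image.Image, quality: int = 85) -> bytes:
    """Re-encode an image as JPEG at the given quality.

    Saving into a fresh buffer drops EXIF and other metadata by default.
    """
    buf = BytesIO()
    img.convert("RGB").save(buf, format="JPEG", quality=quality, optimize=True)
    return buf.getvalue()

# Demo with a generated gradient image (stand-in for a real photo).
demo = Image.new("RGB", (400, 300))
demo.putdata([(x % 256, y % 256, (x + y) % 256)
              for y in range(300) for x in range(400)])
full = compress_jpeg(demo, quality=100)
reduced = compress_jpeg(demo, quality=85)
print(f"quality 100: {len(full)} bytes, quality 85: {len(reduced)} bytes")
```

In a real pipeline you would run this over the site's image directory, or let a CMS plugin do the same work automatically.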

Minimize render-blocking JavaScript and CSS

Once you've run a speed test in Google's PageSpeed Insights, you may see the message "Eliminate render-blocking JavaScript and CSS in above-the-fold content" if some blocked resources are delaying the rendering of your page.

Use Google guidelines and the following methods to avoid blocking JavaScript:

  • Inline critical JavaScript
  • Make JavaScript asynchronous (async)
  • Defer the loading of JavaScript (defer)

If a page contains blocking external stylesheets, delaying the time to first render, then you should optimize CSS delivery.

PageSpeed Insights flags the resources (HTML, CSS, and JavaScript) that should be minified, i.e., stripped of unnecessary data. Two related processes build on minification.

Compress the data. To make your website load faster, you need to compress the data by reducing the size of the files that the browser needs to download. Once you've eliminated any unnecessary resources, you can compress the remaining ones to help improve your website's loading speed. 

  • Gzip compression. The process is used for compressing text-based data, like web pages and style sheets, before sending them to a browser. This is especially effective for CSS files and HTML containing a lot of repeated text and white spaces. Gzip achieves this by temporarily replacing similar strings within a text file to make the file size smaller.
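The effect gzip has on repetitive text-based resources is easy to demonstrate with Python's standard library; the HTML here is a synthetic example:

```python
import gzip

# Synthetic page: repeated markup compresses extremely well.
html = ("<!doctype html><html><head><title>Demo</title></head><body>"
        + "<p>Repeated text compresses very well. </p>" * 200
        + "</body></html>").encode("utf-8")

compressed = gzip.compress(html, compresslevel=6)  # common server default level
ratio = len(compressed) / len(html)
print(f"{len(html)} bytes -> {len(compressed)} bytes ({ratio:.0%} of original)")
```

In production you wouldn't do this in application code: web servers such as nginx and Apache can gzip responses on the fly once the feature is enabled.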

Optimize the resources. Another way to optimize your website is by only keeping relevant information and files on it. This way, you can avoid having any irrelevant data that could slow down your website. After deciding on relevant files, you can start doing content-specific optimizations.

Limit the number of Resources and HTTP requests


Multiple requests slow down the server. Ideally, the number of requests should stay under 100, although it depends on the page: a large page may legitimately need more. Delete unnecessary resources and compress the remaining ones to minimize the download size.

Set a browser cache policy

Page speed can also be improved with browser caching and a caching plugin that suits the website's needs. Caching has its drawbacks too: changes won't be visible immediately, so to see the new version you will need to handle cache clearing yourself.

Reduce the number of redirects 

Redirects are meant to save the website from broken pages, but they can also cause problems if you do not keep their number under control. Redirects can be a necessary evil: too many of them result in a poor user experience, as the user spends more time waiting for pages to load.

To optimize page speed, check redirect chains: each source should point to a single destination. If you have 404 pages, add some creative design solution that leads users to another page. You can use Google Search Console or other SEO checkup tools to find broken pages.
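Flattening chains so that "each source points to a single destination" can be sketched as a pure function over a redirect map (the URLs are hypothetical):

```python
def flatten_redirects(redirects: dict[str, str]) -> dict[str, str]:
    """Rewrite a redirect map so every source points at its final destination.

    Raises ValueError on a redirect loop.
    """
    flat = {}
    for src in redirects:
        seen = {src}
        dest = redirects[src]
        while dest in redirects:  # follow the chain to its end
            if dest in seen:
                raise ValueError(f"redirect loop at {dest}")
            seen.add(dest)
            dest = redirects[dest]
        flat[src] = dest
    return flat

chain = {"/old": "/interim", "/interim": "/new", "/legacy": "/old"}
print(flatten_redirects(chain))  # every source now 301s straight to /new
```

Feeding the flattened map back into your server configuration removes the intermediate hops users would otherwise wait through.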

Avoid overloading the site; keep it simple

A technical SEO audit is a recurring process. Keep your finger on the pulse, and keep the website as simple as possible so it doesn't become overloaded and slow.



After you have corrected the errors with the page load speed, you can now work on its functionality to improve its visibility in SERPs.


Mobile-friendliness

Since Google prioritizes mobile-first indexing, it is crucial to make your website mobile-friendly, which is mostly a matter of responsiveness. You can test your website on Google's Mobile-Friendly Test page.

Search engine friendliness

URLs are not part of the design or performance, but they definitely need to be optimized with keywords and kept easy to follow.

For easy URLs, follow the rules:

  • Use dashes (-) to separate words
  • Keep URLs short
  • Use keywords
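The rules above amount to a slug function. Here is a minimal sketch; the stopword list is an illustrative assumption, not an SEO standard:

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into a short, dash-separated, keyword-friendly slug."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # non-alphanumerics become dashes
    slug = slug.strip("-")
    # Drop filler words to keep the slug short (illustrative list).
    stopwords = {"a", "an", "the", "to", "of", "and", "how", "do"}
    words = [w for w in slug.split("-") if w not in stopwords]
    return "-".join(words)

print(slugify("How to Do a Technical SEO Audit: 2023 Checklist"))
# -> technical-seo-audit-2023-checklist
```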

Use the secure protocol - HTTPS


Since 2014, Google has advised users to switch from the HTTP protocol to HTTPS in order to improve data security and assure users that the website is safe to use. 

Set preferred version

The next step is to ensure all the site versions point to your preferred one. These are the versions:

  • http://site.com
  • https://site.com
  • http://www.site.com
  • https://www.site.com

Important: www is technically a subdomain, and Google doesn't automatically treat it as the same site, so it is crucial to choose one version and stick to it. Note that the four options above can be considered four different websites and are therefore detected as duplicates. As said in the previous point, first choose the HTTPS protocol, then stick to either the www or non-www option; the rest should 301 redirect to it.
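Assuming HTTPS with www is chosen as the preferred version (an arbitrary choice for illustration, with a placeholder domain), the normalization rule behind those 301s can be sketched as:

```python
from urllib.parse import urlsplit, urlunsplit

PREFERRED_SCHEME = "https"
PREFERRED_HOST = "www.site.com"  # assumption: www + HTTPS chosen as preferred

def canonical_url(url: str) -> str:
    """Map any of the four host/scheme variants to the single preferred version."""
    parts = urlsplit(url)
    if parts.netloc.lower() in {"site.com", "www.site.com"}:
        return urlunsplit((PREFERRED_SCHEME, PREFERRED_HOST,
                           parts.path, parts.query, parts.fragment))
    return url  # some other host; leave untouched

for variant in ("http://site.com/page", "https://site.com/page",
                "http://www.site.com/page", "https://www.site.com/page"):
    print(variant, "->", canonical_url(variant))
```

On a live site this logic lives in the web server's redirect rules, which should answer with a 301 status rather than rewriting silently.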

Set up 301 redirects after site migration

Website migration should be followed by setting up 301 redirects. Take the following recommendations into consideration:

  • Set up the 301 redirect code from the old URL to the new URL
  • Avoid redirection loops and chains
  • Verify one version
  • Submit address change in Search Console
  • Update the sitemap.xml so it lists the final URLs without any redirects, and make sure it is submitted in Search Console
  • Check for broken links

Ensure crawlability and fix errors


If pages are not crawlable, they won't rank and won't be visible in the SERP. Ensuring crawlability effectively spans all the technical SEO steps we are discussing here.

Test the robots.txt file


If you have crawlability issues, they may be related to the robots.txt file. You can test robots.txt with the testing tool in Search Console and edit the file until it is ready for crawling. If no rule blocks Google's web crawlers, you will see 0 errors.
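For a quick local check before reaching for Search Console, Python's standard library can evaluate robots.txt rules; the file contents and URLs here are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Parse an in-memory robots.txt; for a live site you would instead call
# rp.set_url("https://site.com/robots.txt") followed by rp.read().
rp = RobotFileParser()
rp.parse("""\
User-agent: *
Disallow: /admin/
Allow: /
""".splitlines())

print(rp.can_fetch("Googlebot", "https://site.com/blog/post"))    # allowed
print(rp.can_fetch("Googlebot", "https://site.com/admin/login"))  # blocked
```

This catches the classic mistake of a stray `Disallow` rule silently blocking a whole section of the site.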

Verify the indexed content


As part of the technical SEO audit, check the content of indexed pages and set up proper noindex tags. Use Google Search Console to get insights into the status of indexed pages.

Review sitemap

Keeping an eye on the XML sitemap is important to make sure your website is organized. And again, Search Console is the magic tool: check the Sitemaps report to manage and test the sitemap file. Here are the steps to follow:

  • Create a sitemap.xml file on your website
  • Enter the sitemap URL
  • Click Test sitemap
  • When the test completes, click Open Test Results
  • Fix errors if there are any
  • Click Submit Sitemap
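The first step, creating the sitemap.xml file, can be sketched with Python's standard library (the URLs are placeholders):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls: list[str]) -> bytes:
    """Build a minimal sitemap.xml ready for submission in Search Console."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url  # final URL, no redirects
    return ET.tostring(urlset, encoding="utf-8", xml_declaration=True)

sitemap = build_sitemap(["https://site.com/", "https://site.com/blog/"])
print(sitemap.decode())
```

Real sitemaps often add optional `lastmod` elements per URL, which helps search engines prioritize recently changed pages.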

Review blocked resources with Fetch as Google

To review blocked resources (such as hashbang URLs), you can use the Fetch as Google tool in Google Search Console (its functionality now lives in the URL Inspection tool). It lets you see how Google crawls and renders your website and helps you identify any issues with your URLs.

Optimize crawl budget

The crawl budget refers to the number of pages crawled, or the resources allocated for crawling. If site content is updated frequently and the updates are reflected on pages and in the sitemap, your site will be crawled and indexed frequently. The problem is that Google crawls a limited number of pages and may pick the wrong URLs to crawl. It is highly advised to take these steps:

  • Check for 404 errors and fix them
  • Get rid of duplicate content
  • Prevent low-quality and spam content indexation
  • Keep the sitemap up to date and properly ordered
  • Fix infinite space issues

Avoid meta refresh for moving a website

Google doesn't recommend using "meta refresh" for site migration. HTTP responses with a 3xx status code are the preferred redirect option.

Use Hreflang for Multi-Language Websites

Multi-language websites require hreflang tags for language-specific pages. The rel="alternate" hreflang="x" attributes serve the correct language and regional URL in the following situations:

  • If there are small regional variations in a single language
  • If the site content is translated into multiple languages
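A small generator makes it easy to keep the hreflang tags for every variant consistent; the language codes and URLs below are illustrative:

```python
def hreflang_tags(variants: dict[str, str]) -> str:
    """Render <link rel="alternate" hreflang="..."> tags for each language URL."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in variants.items()
    )

tags = hreflang_tags({
    "en": "https://site.com/en/",
    "de": "https://site.com/de/",
    "en-gb": "https://site.com/en-gb/",  # small regional variation of English
    "x-default": "https://site.com/",    # fallback when no variant matches
})
print(tags)
```

Every language version of a page should carry the full set of tags, including a self-reference; lopsided annotations are a common hreflang mistake.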

Ensure performance monitoring

The final step in this block is to keep tracking all processes and results so you can fix any major or minor errors on the spot.

Content Optimization


Once you fix general issues, you can focus on more specific components: content optimization, broken links, internal linking, etc.

Audit internal links

Internal links connect pages and build the website architecture, i.e., the crawl path. When auditing internal links, check the following:

  • Broken links
  • Redirected links
  • Click depth
  • Orphan pages
  • Group internal linking into silos
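One of those checks, finding orphan pages, can be sketched with the standard library's HTML parser; the three-page site here is a toy example standing in for a real crawl:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href" and v)

def find_orphans(pages: dict[str, str]) -> set[str]:
    """Return pages that no other page links to."""
    linked = set()
    for html in pages.values():
        parser = LinkExtractor()
        parser.feed(html)
        linked.update(parser.links)
    return {path for path in pages if path not in linked}

site = {
    "/": '<a href="/blog">Blog</a>',
    "/blog": '<a href="/">Home</a>',
    "/old-campaign": "<p>No page links here.</p>",
}
print(find_orphans(site))  # -> {'/old-campaign'}
```

On a real site the `pages` dict would come from a crawler or your CMS export, and the orphans would either get internal links or be retired.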

Implement programmatic SEO

Implementing programmatic SEO for large websites is crucial. As these sites can have thousands to millions of pages, manual optimization becomes impractical. Programmatic SEO automates and resolves SEO challenges at scale, handling tasks like meta tag generation, site speed optimization, and duplicate content issues. This ensures that extensive content is search-engine optimized, enhancing visibility, user experience, and conversions. In today's digital age, neglecting programmatic SEO means missing out on significant growth opportunities.

Manage duplicate content

There are several ways to check for duplicate content on your website:

  1. Use a plagiarism checker: There are many online tools such as Copyscape, Grammarly, SmallSEOTools, etc. that can be used to check for duplicate content on your website. These tools will compare your content to other websites to see if it matches.
  2. Use Google Search Console: Use the Coverage report to check for any pages that Google has identified as duplicate content.
  3. Use a website crawler: There are website crawlers, such as ScreamingFrog, DeepCrawl, etc., that can be used to crawl your website and identify any duplicate content.
  4. Manual checking: you can manually check for duplicate content by using the "site:domain.com" operator in Google and then browsing through the pages of your website, looking for any similarities in the content.
  5. Use a browser extension: There are browser extensions, such as Check My Links, LinkMiner, etc., that can be used to check for duplicate content on your website by identifying broken links and duplicate pages.
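A rough in-house version of the plagiarism-checker approach is to score textual similarity between pages; this sketch uses Python's difflib, and the 0.8 threshold is an illustrative cut-off, not a standard:

```python
import difflib

def similarity(a: str, b: str) -> float:
    """Rough word-level similarity between two pages, from 0.0 to 1.0."""
    return difflib.SequenceMatcher(None, a.split(), b.split()).ratio()

page_a = "our technical seo audit checklist covers speed crawlability and indexing"
page_b = "our technical seo audit checklist covers speed crawlability and sitemaps"
page_c = "contact us for a quote today"

# 0.8 is an illustrative near-duplicate threshold.
print(similarity(page_a, page_b) > 0.8)  # near-duplicate pair
print(similarity(page_a, page_c) > 0.8)  # unrelated pages
```

Dedicated tools compare against the whole web rather than page pairs, but a pairwise pass over your own site surfaces internal duplication cheaply.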

Use structured data to highlight content

Use structured data markup to appear in more specific search queries, earn rich snippets, get featured in the Knowledge Graph, help Google return results based on contextual understanding, gain access to beta features, and more. Use schema.org vocabulary to mark up pages and make them easy for search engines to understand.
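Structured data is most often embedded as a JSON-LD script tag. A minimal Article example, using this post's own byline and otherwise illustrative fields, can be generated like this:

```python
import json

# Minimal schema.org Article markup; field values are illustrative.
article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "How to do a technical SEO Audit: 2023 Checklist",
    "author": {"@type": "Person", "name": "Anush Bichakhchyan"},
    "datePublished": "2023-04-21",
}

snippet = ('<script type="application/ld+json">'
           + json.dumps(article_schema)
           + "</script>")
print(snippet)
```

The resulting tag goes in the page's `<head>`; Google's Rich Results Test can then confirm the markup is eligible for rich snippets.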

Control the number of on-page links

For both SEO and user experience, a single page with too many links can be overwhelming and potentially harmful. Add links only as necessary and where relevant.

Skip canonicalizing blog pages to the blog root

Google doesn't favor canonicalizing blog subpages to the blog root as the preferred version. Instead, give each page a self-referencing canonical tag, which causes fewer duplicate content issues.

Creating a user-friendly website


You are almost done. Time to take care of the website's user-friendliness by maintaining it and fixing issues. You can't ignore the user experience, because Google cares about it a great deal.

Set up AMP

To improve mobile UX, Google offers AMP, which is designed to load pages extremely fast on mobile devices. AMP itself is not a ranking factor, but it is one more sign of a well-optimized site.

Add breadcrumbs


Breadcrumbs, named after the trail in the fairy tale, serve the same purpose on websites: leading the user through the site. By including breadcrumbs, you improve the usability and navigation of your website, and you also help search engines see the site structure.
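Breadcrumbs are commonly exposed to search engines as schema.org BreadcrumbList markup. A minimal generator sketch, with hypothetical site URLs:

```python
import json

def breadcrumb_jsonld(crumbs: list[tuple[str, str]]) -> str:
    """Build schema.org BreadcrumbList markup from (name, url) pairs."""
    items = [{"@type": "ListItem", "position": i, "name": name, "item": url}
             for i, (name, url) in enumerate(crumbs, start=1)]
    data = {"@context": "https://schema.org",
            "@type": "BreadcrumbList",
            "itemListElement": items}
    return json.dumps(data, indent=2)

print(breadcrumb_jsonld([
    ("Home", "https://site.com/"),
    ("Blog", "https://site.com/blog/"),
    ("Technical SEO Audit", "https://site.com/blog/technical-seo-audit/"),
]))
```

Pairing this markup with the visible breadcrumb trail lets Google show the path in search results instead of a raw URL.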

Test on different platforms and devices


Finally, you have gone through all the steps, optimized your website, and completed the technical SEO checklist. Now it is time to test it on as many platforms and devices as possible. 


Time to perform a technical audit


What is a technical SEO audit?

A technical SEO audit is a comprehensive review of a website's technical aspects to identify issues and opportunities that can affect its search engine visibility. This includes analyzing website architecture, page speed, crawling and indexing, site structure, meta tags, content optimization, and other technical factors that affect the website's ability to rank on search engines.

Why is a technical audit important for SEO?

A technical audit helps identify and fix technical issues that can negatively impact a website's search engine visibility. These issues can prevent search engines from crawling, indexing, and ranking a website, leading to a decrease in organic traffic and revenue. 

What does technical SEO include?

Technical SEO includes analyzing website speed, mobile-friendliness, site architecture, URL structure, crawlability, indexing, schema markup, and other technical factors that affect how search engines crawl and index a website.

What tools should I use for a technical SEO audit?

There are several tools available for conducting a technical SEO audit, including Google Search Console, SEMrush, Ahrefs, Screaming Frog, and Moz Pro. 

How long does a technical SEO audit take?

The length of a technical SEO audit depends on the size and complexity of the website. A small website with a few pages may only take a few hours to audit, while a large website with hundreds or thousands of pages may take several days or even weeks to complete.

How often should you perform a technical audit?

It is recommended to perform a technical SEO audit at least once every six months, or whenever major changes are made to the website, such as a redesign or migration. Regular technical audits help website owners identify and fix issues before they hurt the website's search engine visibility and organic traffic.