
Technical SEO Checklist
The Important & Actionable Factors

Our SEO team has overseen hundreds of technical SEO audits over the years to identify critical areas for improvement. This is often the first step when we’re implementing an SEO strategy, as website health is a strong determining factor in your ability to rank, convert and increase organic traffic.

We’ve created this checklist with the intent of helping independent webmasters or in-house teams understand and conduct their own technical audit (and spoiler alert, read on to claim a FREE technical report on your site’s overall health and issues that could be hindering performance).

Let’s get started!



Getting Started – Tools You’ll Need

Identifying key issues in a technical audit can be difficult without the right toolset. Thankfully, there are many tools available. To help you conduct a complete technical SEO audit, we’re going to focus on the most user-friendly and effective options, including:

  • Screaming Frog – A powerful crawler that identifies common onsite SEO issues and offers many advanced features.
  • SEMrush – Another crawler that analyses the health of a website and lists the key areas where it is struggling.
  • JetOctopus – Great for enterprise-level websites (more than 100,000 URLs), providing a comprehensive technical audit.
  • Ahrefs – Our favoured tool for analysing backlinks & anchor text, with a built-in site audit function to quickly identify any technical issues.
  • PageSpeed Insights – A free Google developer tool that provides insights into page-related issues slowing load times and more. Screaming Frog has an API connector that can include this information for every URL crawled, which is extremely useful when conducting large-scale audits.

 


Crawlers, Indexing & Sitemaps

Before discussing the significance of crawlers, it’s important to recognise the role that indexation plays in Google search. A website with non-indexed pages is like a shop in the desert: it doesn’t matter what the shop sells; if no one knows about it, and no one can find it or get to it, it’s not going to earn you any business.


1. Crawlers

Crawlers (commonly known as spiderbots or web spiders) such as Googlebot crawl websites to add and sort pages categorically within a search engine. These bots are responsible for indexing publicly available pages, organising content so it can be served to users based on their specific search queries. Search engines also allocate each website a certain crawl budget.

The budget refers to the number of pages a bot can crawl and index on a website within a specific timeframe. This underlines the importance of factors such as page speed and internal linking, which we will discuss below.


2. Indexing – Are Your Most Important Pages Indexed?

An index in SEO refers to the database a search engine uses to store information; it contains the data of all the websites that crawlers can find. If a website or a specific page is not listed in a search engine’s index, users won’t be able to find it, which is why it’s important to review your pages and ensure that they are indexable and easy for bots to find.

There are a few things you can do:

2.1 Check No-index Tags

If you’re using Screaming Frog, there is an easy way to see if any meta tags are blocking your pages from being indexed. Once your crawl is finished, export the data and you will find a column called ‘Meta Robots 2’ showing directives such as ‘index, follow’ or ‘noindex, follow’. Review these tags and ensure that no valuable pages carry a ‘noindex’ directive.
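
If you’d rather spot-check a handful of URLs outside of a full crawl, a short script can flag noindex directives for you. Below is a minimal sketch in Python, assuming the hypothetical example.com URLs shown and the third-party requests library; it checks both the meta robots tag and the X-Robots-Tag response header.

```python
import re

import requests  # third-party: pip install requests

# Hypothetical URLs to spot-check; swap in pages from your own site.
URLS = [
    "https://www.example.com/",
    "https://www.example.com/services/",
]

# Simplified pattern: assumes the tag is written as name="robots" ... content="...".
META_ROBOTS = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
    re.IGNORECASE,
)

for url in URLS:
    response = requests.get(url, timeout=10)
    match = META_ROBOTS.search(response.text)
    directive = match.group(1).lower() if match else "(no meta robots tag found)"
    # The X-Robots-Tag HTTP header can also block indexing, so check it too.
    header = response.headers.get("X-Robots-Tag", "")
    flagged = "noindex" in directive or "noindex" in header.lower()
    marker = "  ** NOINDEX **" if flagged else ""
    print(f"{url} -> meta: {directive} | header: {header or 'none'}{marker}")
```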

2.2 Internal & External Linking

As mentioned earlier, getting your pages indexed is crucial, and helping crawlers through internal linking will do exactly that. Googlebot uses internal linking to identify important pages, prioritising those that have many internal and external links pointing to them. Internal links also lead Googlebot on to the other pages they point to. Auditing your internal and external links through your preferred crawler is important, as broken links can reduce your website’s crawlability and hurt user experience (a simple link-checking sketch follows the questions below).

Ask yourself these questions when analysing internal and external linking:

  • Are any links pointing to 404 errors?
  • Am I linking to the most important pages?
  • How many clicks does it take to reach important pages?
  • Is the anchor text for my links consistent with the page’s content?
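
To complement a crawler’s report, a small script can pull the internal links from a single page, capture their anchor text and flag any that return an error status. This is only a rough sketch, assuming a hypothetical START_URL and the third-party requests library; a real audit should still rely on a full crawl.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

import requests  # third-party: pip install requests

START_URL = "https://www.example.com/"  # hypothetical page to audit


class LinkCollector(HTMLParser):
    """Collects href values and anchor text from <a> tags on one page."""

    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href:
            self._text.append(data.strip())

    def handle_endtag(self, tag):
        if tag == "a" and self._href:
            self.links.append((self._href, " ".join(t for t in self._text if t)))
            self._href = None


collector = LinkCollector()
collector.feed(requests.get(START_URL, timeout=10).text)

site = urlparse(START_URL).netloc
for href, anchor in collector.links:
    url = urljoin(START_URL, href)
    if urlparse(url).netloc != site:
        continue  # this sketch only checks internal links
    status = requests.head(url, allow_redirects=True, timeout=10).status_code
    if status >= 400:
        print(f"{status}  {url}  (anchor text: '{anchor or 'none'}')")
```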

2.3 Robots.txt

Before a crawler begins crawling your website, it will request a file known as ‘robots.txt’. This file tells web bots which pages and directories they have permission to crawl. It is generally used to avoid overloading your site with crawl-request bot traffic and to keep crawlers out of areas they don’t need, such as admin or duplicate-content sections (note that robots.txt alone won’t reliably keep a page out of Google’s index; a noindex directive is the tool for that). Read more about robots.txt and how to create one.
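
If you want to test how your rules will be interpreted before a bot does, Python’s built-in robots.txt parser can check sample paths against your live file. The sketch below assumes a hypothetical example.com domain and a few illustrative paths.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical domain; point this at your own robots.txt file.
parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()

# Check whether Googlebot is allowed to fetch a few sample paths.
for path in ("/", "/blog/technical-seo-checklist/", "/wp-admin/"):
    allowed = parser.can_fetch("Googlebot", "https://www.example.com" + path)
    print(("ALLOWED" if allowed else "BLOCKED"), path)
```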

2.4 Sitemaps

Another method to ensure your pages get indexed is to check or create your sitemap. A sitemap has many functions but primarily exists so crawlers can quickly identify and index the important pages on your website. There are two types of sitemap: HTML and XML. HTML sitemaps are written for people and viewable on your site to help users navigate through its pages. XML sitemaps are the ones that matter for indexation, as they are designed specifically for crawlers, which can quickly pull the key information about your site from them. After you’ve evaluated or created your sitemap, be sure to submit it to the major search engines to get them crawling. Read the Google Developers support article on how to build and submit your sitemap.
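
Once your XML sitemap is live, it’s worth confirming that the URLs it lists actually resolve. The following sketch, assuming a hypothetical sitemap location and the third-party requests library, parses the file and spot-checks the first few entries.

```python
import xml.etree.ElementTree as ET

import requests  # third-party: pip install requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical location
NAMESPACE = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
urls = [loc.text for loc in root.findall(".//sm:loc", NAMESPACE)]
print(f"{len(urls)} URLs listed in the sitemap")

# Spot-check the first few entries to confirm they resolve with a 200.
for url in urls[:10]:
    status = requests.head(url, allow_redirects=True, timeout=10).status_code
    print(status, url)
```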

 

Identifying Any Major Errors – 4xx & 5xx Errors, Redirect Loops, Meta Data

4xx & 5xx errors

After you’ve finished reviewing indexation, the next thing to do is check for any major crawl errors on the site. Crawl errors occur when a search engine fails to reach a page on your website, and they generally surface as 4xx or 5xx status codes. Screaming Frog, JetOctopus, DeepCrawl and SEMrush can help you identify these issues. Once you’ve finished crawling your site, export the data and review the status codes, looking for any 4xx and 5xx errors. Then decide where each broken URL should point and implement 301 redirects.
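
If you’d like to re-verify a crawler’s findings, a short script can re-check status codes from an exported URL list. The sketch below assumes a hypothetical crawl_export.csv file with a ‘URL’ column and uses the third-party requests library; anything returning a 4xx or 5xx is a candidate for a 301 redirect.

```python
import csv

import requests  # third-party: pip install requests

# Hypothetical export from your crawler containing a 'URL' column.
EXPORT_FILE = "crawl_export.csv"

with open(EXPORT_FILE, newline="") as handle:
    urls = [row["URL"] for row in csv.DictReader(handle)]

broken = []
for url in urls:
    try:
        status = requests.head(url, allow_redirects=False, timeout=10).status_code
    except requests.RequestException:
        continue  # skip URLs that fail to connect in this sketch
    if status >= 400:
        broken.append((status, url))

# Each broken URL is a candidate for a 301 redirect to its closest live equivalent.
for status, url in broken:
    print(status, url)
```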

Identify and Fix Redirect Loops

Simply put, a redirect loop occurs when a URL has multiple redirects that eventually send the visitor back to the URL originally requested. Redirect loops are often the result of poor redirect configuration and leave users and crawlers stuck going in circles on your website. From a user experience perspective, the browser displays an error because the user never reaches the requested page. As for SEO, redirects pass on ranking signals like authority and relevance, also known as ‘link juice’; in a redirect loop, the final URL never resolves, meaning these ranking signals are lost. Screaming Frog’s Reports tab lets you export all the redirects on your site and identifies any URLs with chains that need to be fixed.
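
To see exactly where a suspect URL ends up, you can follow its redirects one hop at a time. Here’s a minimal sketch, assuming a hypothetical old-page URL and the third-party requests library, that prints the full chain and calls out a loop if the same URL appears twice.

```python
from urllib.parse import urljoin

import requests  # third-party: pip install requests


def trace_redirects(url, max_hops=10):
    """Follow Location headers one hop at a time and flag chains or loops."""
    chain = [url]
    while len(chain) <= max_hops:
        response = requests.get(chain[-1], allow_redirects=False, timeout=10)
        if response.status_code not in (301, 302, 303, 307, 308):
            return chain, f"resolved with status {response.status_code}"
        next_url = urljoin(chain[-1], response.headers.get("Location", ""))
        if next_url in chain:
            return chain + [next_url], "REDIRECT LOOP detected"
        chain.append(next_url)
    return chain, "chain longer than max_hops"


# Hypothetical URL suspected of chaining; replace with one from your redirect export.
hops, verdict = trace_redirects("http://www.example.com/old-page")
print(" ->\n".join(hops))
print(verdict)
```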

 

FREE TECHNICAL SEO AUDIT

Complete the form below to receive a comprehensive report on your site’s overall health and any issues that could be hindering performance.

 

Meta Data – Page Titles, Meta Descriptions & Heading tags

Next, you will want to review the meta data across all your pages, including page titles, meta descriptions and H1 tags, to ensure they’re optimised and relevant to each page’s content.

Page Titles

The page title should be around 55 characters and include the keyword you’re targeting. This allows users and crawlers to quickly identify what your content is about, which can affect how well your page ranks in organic searches for specific queries.

Meta Descriptions

Meta descriptions are the snippet shown beneath your organic listing and should spark interest from searchers looking for specific content. Well-composed meta descriptions can increase your click-through rate. Ideally, your meta descriptions should be no longer than 155 characters and no more than 920 pixels wide. Use this tool to help optimise your meta descriptions.

Heading tags

Heading tags, commonly seen as H1s and H2s, represent the page’s hierarchy and are used to separate headings and subheadings on your page. The H1 is the most important; crawlers use these tags to identify your content and understand the page. Ensure your heading structure is consistent and relevant to the content, and review it as part of your audit.

A crawl of your site will quickly show which meta data needs adjusting by sorting it into categories such as duplicates and titles or descriptions over or under the recommended character counts.
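
For a quick spot-check outside of a full crawl, a small script can measure title and description lengths and count H1 tags against the guidelines above. The sketch below assumes hypothetical example.com URLs and the third-party requests library, and its simple regex parsing is only a rough approximation of what a crawler reports.

```python
import re

import requests  # third-party: pip install requests

# Character guidelines discussed above; adjust to suit your own standards.
TITLE_LIMIT = 55
DESCRIPTION_LIMIT = 155

# Hypothetical pages to audit; in practice, feed in URLs from your crawl export.
for url in ["https://www.example.com/", "https://www.example.com/services/"]:
    html = requests.get(url, timeout=10).text

    title = re.search(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    description = re.search(
        r'<meta[^>]+name=["\']description["\'][^>]+content=["\']([^"\']*)["\']',
        html,
        re.IGNORECASE,
    )
    h1_count = len(re.findall(r"<h1[\s>]", html, re.IGNORECASE))

    title_text = title.group(1).strip() if title else ""
    description_text = description.group(1).strip() if description else ""

    print(url)
    print(f"  title: {len(title_text)} chars (guideline {TITLE_LIMIT})")
    print(f"  meta description: {len(description_text)} chars (guideline {DESCRIPTION_LIMIT})")
    print(f"  h1 tags found: {h1_count}")
```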




Core Web Vitals

Google considers user experience a ranking factor, which means reviewing your website’s core web vitals is important. Core web vitals describe the performance of your pages and are broken down into three categories:

  • Largest Contentful Paint (LCP): The LCP metric measures your website’s loading performance. To provide a good user experience, LCP should resolve within 2.5 seconds of when the page first starts loading.
  • First Input Delay (FID): FID refers to your website’s interactivity, i.e., how long it takes before a user can interact with your page. Pages should have an FID score of 100 milliseconds or less.
  • Cumulative Layout Shift (CLS): The CLS metric measures the visual stability of your pages, i.e., how much on-page assets shift around while the page loads. For a good user experience, aim for a CLS of 0.1 or less.

 


A few tools can help identify issues that affect core web vitals: Lighthouse (a developer tool built into Chrome) and Google’s PageSpeed Insights. Screaming Frog has an API connector that integrates PageSpeed Insights into its crawl, making it simple to identify common core web vital issues across your pages. These reports make recommendations wherever there is room for improvement against these metrics.
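
If you want these numbers programmatically, the public PageSpeed Insights API returns the full Lighthouse report for any URL. Below is a minimal sketch assuming a hypothetical test page and the third-party requests library; the exact audit names and response fields can change over time, so treat it as a starting point rather than a definitive integration.

```python
import requests  # third-party: pip install requests

# Public PageSpeed Insights v5 endpoint; an API key (placeholder below) lifts quota limits.
ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {
    "url": "https://www.example.com/",  # hypothetical page to test
    "strategy": "mobile",
    # "key": "YOUR_API_KEY",
}

report = requests.get(ENDPOINT, params=params, timeout=60).json()
audits = report["lighthouseResult"]["audits"]

# Lab metrics Lighthouse reports for the core web vitals discussed above
# (FID itself is a field metric, so the lab report uses Max Potential FID).
for audit_id in ("largest-contentful-paint", "max-potential-fid", "cumulative-layout-shift"):
    audit = audits.get(audit_id, {})
    print(f"{audit.get('title', audit_id)}: {audit.get('displayValue', 'n/a')}")
```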

Whilst this is only a portion of what is required to conduct a full technical audit, it’s a great place to start and tick off important factors that are easy to action.

For more of a deep dive, check out our FREE three-part SEO training course hosted by our Chief Technical Officer, Peter Dimakidis, or, of course, get in touch with our Sydney SEO team and we’ll do the hard work for you!

Are technical SEO issues hindering your site’s performance?
Would you like to improve your online conversions? Apply for a FREE quote