Run a Technical SEO Audit to Maximize Your Site’s Effectiveness
A technical SEO audit is a full analysis of the technical factors that affect your website’s SEO.
If you want to find out why your site is not generating the traffic you think it should, or if you just want to be sure it ranks higher in search results, a complete and detailed web audit will give you a deeper insight into what to change to achieve the best results.
There are many “automated” website SEO audit tools: you plug in your website URL, and they give you a basic report with some information and detected errors. While this can provide a useful overview of your website, these reports are not true audits and don’t really tell you what needs to be fixed on your website to improve traffic and conversions.
Doing a technical SEO audit for your own website or blog is not as complicated as it sounds.
Here is an SEO Audit Checklist that I follow when I audit a site.
Table of contents:
- Step 1. Connect your site with Google Search Console
- Step 2. Crawl your site
- Step 3. Sitemap
- Step 4. Robots.txt file
- Step 5. Nofollow links
- Step 6. Site performance
- Step 7. Indexability
- Step 8. Content
- Step 9. Keyword research
- Step 10. Other Search Engines
- Step 11. Off-page SEO
Connect your site with Google Search Console
You must have a Google account to log in to Search Console, and then verify ownership of the website you connect.
You can do this in several ways:
- With your domain name provider
- HTML file upload
- Adding a meta tag to your website
- With Google Analytics
- With Google tag manager
Once you have verified your website, you will have access to a lot of useful data and can fix a number of problems that negatively affect your site’s ranking. We will talk more about this later.
Crawl your site
SEO begins with the crawl experience. Before you can diagnose problems with the site, you need to have a reliable report of what’s happening on your website.
You can run a crawl and discover crawl errors in Search Console by clicking on your verified property, then selecting Crawl and clicking on “Crawl Errors”. You can also view brief crawl statistics there.
Search engines and users are unable to access your site’s content if you have URLs that return errors (like a 404 HTTP status code).
During your site crawl, you should identify any URLs that return errors and fix them. Redirecting them to a relevant live page is usually the best solution.
Simple 301 Redirects is a plugin for this that works well for WordPress websites.
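If your site runs on Apache rather than WordPress, the same fix can be applied in the .htaccess file at your site root. A minimal sketch (the paths here are placeholders):

```apache
# Permanently redirect a removed page to its replacement
Redirect 301 /old-page/ https://www.example.com/new-page/
```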
If your content links to a page that no longer exists, you have a broken link, and you should remove or update it promptly.
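If you’d rather check for dead links yourself instead of waiting for a crawl report, a quick sketch using only Python’s standard library can do it. The function names and the 4xx/5xx cutoff are my own choices, not from any particular tool:

```python
from urllib.error import HTTPError, URLError
from urllib.request import urlopen

def check_url(url):
    """Return the HTTP status code for a URL, or None if no response came back."""
    try:
        with urlopen(url, timeout=10) as resp:
            return resp.status
    except HTTPError as exc:
        return exc.code   # the server answered, but with an error status (e.g. 404)
    except URLError:
        return None       # DNS failure, timeout, refused connection, ...

def is_broken(status):
    # Any 4xx/5xx status, or no response at all, counts as a broken link
    return status is None or status >= 400
```

Run `check_url` over the URLs your crawl discovered and flag anything where `is_broken(...)` is true; those are the links to redirect or remove.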
Sitemap
Sitemaps are files that tell search engines how your site is organized and which pages are available. They are not mandatory, but they do help.
By submitting your sitemap to Google Search Console, you’re making Google’s job easier by giving them the information they need. It’s especially important if your site is new, and not a lot of other websites link to it, or if your pages don’t link together.
After selecting your property, on the left, you’ll see an option called “Crawl.” Under “Crawl,” there will be an option marked “Sitemaps.”
Then click “Add/Test sitemap” and you will get a field to add text to it.
Type your sitemap’s path in that box (“system/feeds/sitemap” in my case) and hit “Submit”.
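For reference, an XML sitemap itself is just a list of URLs with optional metadata, following the sitemaps.org protocol. The URLs and date below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2017-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/first-post/</loc>
  </url>
</urlset>
```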
Robots.txt file
You don’t always want search engines to access your entire site. There may be pages you don’t want crawlers to see.
A robots.txt file – placed in the root of your site – tells search engine robots what you do and don’t want to be indexed.
Here is an example of a robots.txt file if I allow every search engine to index all of my pages:
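An allow-all file like that is only two lines; the empty Disallow rule means nothing is blocked:

```
User-agent: *
Disallow:
```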
And here is Moz’s robots.txt, where they allow some pages and disallow others:
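I won’t reproduce their file verbatim here, but a simplified illustration of mixed rules (the paths are made up) looks like this:

```
User-agent: *
Allow: /blog/
Disallow: /admin/
Disallow: /internal-search/

Sitemap: https://www.example.com/sitemap.xml
```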
Never put personal information such as user data in robots.txt files, because people can still see what you disallowed and easily follow the link: anyone can view the file at yourdomain.com/robots.txt.
You can edit your robots.txt file from Search Console, but you still have to download the file and manually upload it to the root of your website.
Nofollow links
“Nofollow” provides a way for webmasters to tell search engines “Don’t follow links on this page” or “Don’t follow this specific link.”
It looks like this:
<meta name="robots" content="nofollow" />
Common uses for nofollows include links in comments, links to other websites within your content, embeds such as widgets or infographics, or anything off-topic that you still want to link people to, but don’t want search engines to follow these links.
A nofollow won’t prevent the linked page from being crawled entirely; it just prevents it from being discovered through that specific link.
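To nofollow one specific link rather than the whole page, the rel attribute goes on the anchor itself (the URL is a placeholder):

```html
<!-- Only this link is excluded; other links on the page are still followed -->
<a href="https://www.example.com/some-page/" rel="nofollow">some page</a>
```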
Site performance
Users are impatient, and if your site takes too long to load, they will leave. Similarly, search engine crawlers have a limited amount of time that they can spend discovering each site on the Internet.
While page speed is more of an indicator than a ranking factor, websites that load quickly are crawled more thoroughly than those with slower page speed.
For search engines, better performance and fast page speed is a sign of a healthy site that provides good customer experience and therefore should be rewarded with higher ranking.
You can evaluate your site’s performance with various free tools; Google PageSpeed Insights is one that I use.
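If you want a quick number from the command line before reaching for a full tool, a rough time-to-first-byte check in Python looks like this. The `rate()` thresholds loosely follow Google’s TTFB guidance (under 0.8 s is good, over 1.8 s is poor), but treat them as my assumption, not an official score:

```python
import time
from urllib.request import urlopen

def time_to_first_byte(url):
    """Seconds from starting the request until the first response byte arrives."""
    start = time.perf_counter()
    with urlopen(url, timeout=10) as resp:
        resp.read(1)  # wait for the first byte only
    return time.perf_counter() - start

def rate(ttfb_seconds):
    # Rough buckets; real tools like PageSpeed Insights weigh many more factors
    if ttfb_seconds < 0.8:
        return "good"
    if ttfb_seconds < 1.8:
        return "needs improvement"
    return "poor"
```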
Indexability
You want to determine how many of your pages are actually being indexed by the search engines.
Most search engines have a “site:” command that allows you to search for content on a specific website. It can roughly tell you the number of pages that are being indexed by that specific search engine.
The command takes the form site:yourdomain.com, typed straight into the search box.
If the number of indexed results and your total page count are roughly the same, the search engines are successfully crawling and indexing your site’s pages.
Content
To investigate a page’s content, the best approach is to work with the cached (text-only) version of your pages.
If you do a Google search and find the page you are looking for, click on the green arrow next to the result and select “Cached” to see Google’s stored version of the page.
There’s no established rule for how much content a page should contain, but using at least 300 words is a good average.
The content should contain keywords, but not too many (we will talk about it later).
It’s pretty subjective, but you have to judge whether the content is useful and valuable to searchers. Bounce rate and time spent on page give you a rough estimate.
Look for spelling and grammatical errors.
Keyword research, targeting, and keyword usage
“Keyword research is a core SEO task that involves identifying popular words and phrases people enter into search engines, in an attempt to figure out what to rank for.”
When I am auditing a website, I collect the keywords I want to rank for, and find similar keywords too. Once I have a list of these keyword ideas, I look for data that determines how popular each keyword is and how difficult it would be to rank for them on search engine results pages. (For example with Google Adwords Keyword Planner)
It depends on your strategy which keywords you want to rank for, so pick the best ones, and create content for those keywords, or update your existing content to (ideally) contain them.
Be sure the keyword appears in the content (in the first paragraph, if you can), as well as in the URL, titles and meta descriptions, image alt tags, H1 tags, and anchor text. Don’t overuse keywords: keep density around 1%, i.e. about one occurrence for every 100 words.
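That 1% rule is easy to check for a draft. A minimal sketch (single-word keywords only; the tokenizer is deliberately crude):

```python
import re

def keyword_density(text, keyword):
    """Fraction of words in `text` matching a single-word `keyword`, case-insensitively."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words)
```

A result of 0.01 means the keyword shows up about once per 100 words, which is the ballpark suggested above.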
A good practice is to also include related keywords with similar meanings (targeting more than one keyword), so your chances of appearing in searches for the topic are better.
Be sure to use keywords in a natural-sounding way.
Other Search Engines
You want to be sure that your site is accessible from all major search engines.
These include Google, Bing, and Yahoo, the ones we are all familiar with.
But there are other major search engines handling millions of search queries every day, often tied to a specific media or content type.
If you have videos, or if you can create some, be on YouTube. You can create a how-to video about how to use your product, or solve a specific problem. You can share company news, or create presentation slides for teaching a specific topic.
If you have an ecommerce store, it is worth considering being on Amazon.
If you have a local business, get listed in Google Maps.
Target image, news, and app searches, if it makes sense for your industry or product.
Off-page SEO
Check your traffic in Google Analytics. If it grows over time, that’s a good sign for search engines.
Also, backlinks from other trustworthy sites will help you rank higher in search results. Best of all is a backlink from a well-ranked page, but that doesn’t mean you should go to blogs and spam them with your links and comments. Commenting on other blogs is a good practice, but only when it makes sense. Always read the blog post first, and refer to its details in your comment.
While social engagement is not a ranking factor, earning shares and tweets can help with branding, bring more traffic, and maybe attract some backlinks.
SEO auditing is a crucial process for your website or blog.
Sometimes business owners think that writing a blog or improving page speed is enough to rank higher. This is not true as there are many other factors involved.
With a technical SEO audit, you can discover what is wrong and what you need to improve to rank higher in search results. With this information, you can make your website better and more visible to search engines, leading to more traffic and conversions.