On the always-buzzing internet, a website, particularly an e-commerce one, has to keep up with the latest SEO requirements even if it was designed to be SEO friendly at the outset. Over time, changes in search algorithms and in your website's structure can affect its ranking, and an SEO audit of your website can come to your rescue.
A DIY SEO audit will help you locate the flaws and errors in your website that may have gone unnoticed while quietly hampering its visibility in search.
The list of checks is quite extensive, but they can be grouped together for convenience –
Search Engines – Accessibility and Presence
You can log in to Google Webmaster Tools with your website, open the "Crawl" section followed by "Crawl Errors", and you will be shown a list of your site's existing errors.
Steps to crawl and index your website
- On the Search Console Home page, click the www version of the site you want.
- In the Crawl section, select the option Fetch as Google.
- Enter the page you want to crawl. Select the device, then click Fetch for a quick check or Fetch and Render for a deeper check.
Search engines need to crawl your website before it can appear in their results; if your website is not crawled, it won't be visible at all. So, to begin with:
URL Canonicalisation
Make sure that your website cannot be accessed at two different URLs, for example, https://www.example.com and the non-www version https://example.com. If more than one version of a domain exists, search engines get confused about which version to index.
You can specify your preferred domain to Google by following these steps:-
- Verify both the non-www and the www version of your domain.
- On the Webmaster Tools Home page, click your site.
- Click the gear icon, and then click Site Settings.
- In the Preferred domain section, select the version you prefer (for example, www.example.com) and click Save.
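You can also enforce the preferred version at the server level. The snippet below is a minimal sketch for an Apache server (assuming mod_rewrite is enabled and www.example.com is your preferred version) that permanently redirects non-www requests to the www domain:

    # .htaccess - redirect all non-www requests to the www version (assumes mod_rewrite)
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]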
Robots.txt file
The robots.txt file contains instructions for search robots, telling them which sections of your website they may or may not crawl. The file is placed in the root of your website, at www.example.com/robots.txt, and if it is not configured correctly, search robots may skip crawling your website altogether.
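For reference, a minimal robots.txt might look like the sketch below. The /cart/ and /checkout/ paths are hypothetical examples of e-commerce sections you may not want crawled:

    # Applies to all crawlers
    User-agent: *
    # Keep crawlers out of the (hypothetical) cart and checkout sections
    Disallow: /cart/
    Disallow: /checkout/
    # Tell crawlers where the XML sitemap lives
    Sitemap: https://www.example.com/sitemap.xml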
Title and Meta Tags
The Title and Meta tags on the website are very important: they give search engines and users an idea of what a particular page is about.
Double-check the meta tags, especially for the "noindex" directive, as it tells search engines not to index the page at all.
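As a quick reference, here is a sketch of how these tags appear in a page's head section; the title and description text are placeholder examples:

    <head>
      <!-- The title tag becomes the clickable headline in search results -->
      <title>Example Product Page | Example Store</title>
      <!-- The meta description is often shown as the snippet below the headline -->
      <meta name="description" content="A short, accurate summary of this page.">
      <!-- Warning: noindex tells search engines NOT to index this page;
           make sure it is absent from pages you want to rank -->
      <meta name="robots" content="noindex">
    </head>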
404s and Broken Links
Your website should serve a custom 404 error page when a visitor tries to access a page that doesn't exist. The 404 page should match your website's theme and link back to the important pages of the site.
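On an Apache server, for instance, wiring up a custom 404 page can be a single .htaccess directive; the /404.html path below is a hypothetical location for your error page:

    # Serve the custom error page (assumed to live at /404.html) for missing URLs
    ErrorDocument 404 /404.html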
Make sure all the pages of your website carry a proper URL. When a visitor requests an old page that no longer exists, they should be redirected to its replacement through a 301 redirect.
Basically, a 301 redirect tells search engines that the old URL no longer works and that its content has moved permanently to the new URL.
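As a sketch, a single-page 301 redirect in an Apache .htaccess file looks like this; the /old-page.html and /new-page.html paths are hypothetical:

    # Permanently redirect a retired page (hypothetical paths) to its replacement
    Redirect 301 /old-page.html /new-page.html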
Consistency in XML Sitemap
Sitemaps give search engines a list of all the pages on your website, which makes it easier for them to crawl every page of your site.
The XML sitemap should be present on the site and uploaded to your Webmaster Tools account to enable the bots to crawl the site. Take care that it follows the requisite sitemap protocol. You can even compare the XML sitemap against a site crawl and check for discrepancies, if any.
You can submit your XML sitemap to Google by following these steps:-
- On the Search Console Home page, click the site you want.
- In the Crawl section, select the option Sitemaps.
- Click the Add/Test Sitemap button on the page.
- Enter the location of your XML sitemap. You can test your sitemap before submitting it to Google.
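For reference, a minimal XML sitemap following the sitemaps.org protocol looks like the sketch below; the URL, date, and priority values are placeholder examples:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <!-- Full URL of the page, using the preferred (www) host -->
        <loc>https://www.example.com/</loc>
        <!-- Optional hints for crawlers; these values are placeholders -->
        <lastmod>2017-01-01</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
    </urlset>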