If you are looking for an SEO checklist that will help you to increase your site’s organic traffic and rank on Google, you have just found it.
We have put together the ultimate checklist that you need to drive SEO success in 2020, covering 41 best practice points and tasks that you need to know about.
From the SEO basics to must-knows when analyzing your off-page signals, use this as a reference point for ensuring that your site adheres to best practices and that you’re not being held back by issues you have missed.
Here are the main categories I will cover in this guide:
- How to Use This SEO Checklist
- SEO Basics Checklist
- A Keyword Research Checklist
- Technical SEO Checklist
- On-Page SEO and Content Checklist
- Off-Page SEO Checklist
How to Use This SEO Checklist
We’ve broken this checklist down into sections that cover the main focus areas of SEO: the basics, keyword research, technical SEO, on-page SEO and content, and off-page factors.
There’s a good chance that your site already covers many of these points, and if it does, great!
However, we also know that all websites have opportunities to improve and are confident that you will find at least some best-practice areas that you have overlooked.
Some of these points might not be relevant to you, and that is OK!
Work through the list, reference these points against your site, resolve issues, and maximize opportunities where you can. SEO success doesn’t come from simply following a checklist, but to outrank your competitors, you need to make sure you are at least covering most of these points.
SEO Basics Checklist
If you haven’t got the basics covered, your site will struggle to rank for competitive terms.
The following points are very much housekeeping tasks but form the basics of implementing a successful SEO strategy.
1. Set Up Google Search Console and Bing Webmaster Tools
Google Search Console is an essential tool that provides you with invaluable insights into your site’s performance as well as a wealth of data that you can use to grow your site’s organic visibility and traffic.
You can learn more about why it is so important to use, how to set it up, and more in our definitive guide.
Bing Webmaster Tools is the equivalent platform, providing data and insights for Bing’s search engine.
These all-important tools let you see the search terms and keywords for which your site appears in the SERPs, submit sitemaps, identify crawl errors, and much more.
If you do not have these set up, do so now and thank us later.
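As a point of reference, both tools require you to verify ownership of your site before they show any data. One common verification method is adding an HTML meta tag to your home page’s head section; a sketch is below, where the content value is a placeholder for the unique token Search Console generates for you:

```html
<!-- Placed inside <head>; the content token is site-specific
     and is issued to you by Google Search Console -->
<meta name="google-site-verification" content="your-unique-verification-token" />
```

Other verification routes (uploading an HTML file, DNS records, or Google Analytics) achieve the same result, so use whichever fits your setup.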
2. Set Up Google Analytics
Without the right data, you can’t make the right decisions.
Google Analytics is a free analytics tool that allows you to view data and insights about how many people are visiting your site, who they are, and how they are engaging with it.
Our definitive guide will walk you through everything you need to know about the tool as a beginner, including how to set it up and the reports you will find most useful. One thing is for sure: you can’t run a successful SEO strategy without it.
You will also need to connect Google Analytics with Google Search Console so that data from the latter can be imported.
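Setting up Google Analytics typically means placing its tracking snippet on every page of your site, just before the closing head tag. A sketch using the standard gtag.js snippet is below; the UA-XXXXXXX-1 measurement ID is a placeholder for your own property’s ID:

```html
<!-- Global site tag (gtag.js) - Google Analytics -->
<!-- Replace UA-XXXXXXX-1 with your own property's tracking ID -->
<script async src="https://www.googletagmanager.com/gtag/js?id=UA-XXXXXXX-1"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'UA-XXXXXXX-1');
</script>
```

If you are on WordPress, many plugins can insert this snippet for you, so you may never need to touch the code directly.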
3. Install and Configure An SEO Plugin (If You Are Using WordPress)
If you are using WordPress as your CMS (which there is a pretty good chance that you are, given that it now powers 35% of the web), you should install and configure an SEO plugin to provide the functionality and features that you need to properly optimize your site.
SEMrush’s recently published WordPress SEO checklist includes SEO plugin suggestions for you: it highlights three great options, and which one you choose largely comes down to personal preference.
If you are using a CMS other than WordPress, speak with your developer to see whether you need to install a dedicated SEO plugin or module, or whether the features you need are included out of the box.
Plug in SEO, as an example, is one of the most popular Shopify SEO apps.
4. Generate and Submit A Sitemap
The purpose of a sitemap is to help search engines find and crawl your pages and to identify the canonical version of each one.
It is simply a list of URLs that specify your site’s main content to make sure that it gets crawled and indexed.
In Google’s own words:
A sitemap tells the crawler which files you think are important in your site, and also provides valuable information about these files: for example, for pages, when the page was last updated, how often the page is changed, and any alternate language versions of a page.
Google supports a number of different sitemap formats, but XML is the most commonly used. You will usually find your site’s sitemap at https://www.domain.com/sitemap.xml
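To make the format concrete, here is a minimal sketch of an XML sitemap following the sitemaps.org protocol; the URLs and dates are placeholders, and only the loc element is required for each entry:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per canonical page you want crawled -->
  <url>
    <loc>https://www.domain.com/</loc>
    <lastmod>2020-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.domain.com/blog/</loc>
    <lastmod>2020-01-10</lastmod>
  </url>
</urlset>
```

In practice, your CMS or plugin will generate and update this file automatically; you rarely need to write it by hand.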
If you are using WordPress and one of the plugins mentioned above, you will find that generating a sitemap is standard functionality.
Otherwise, you can generate an XML sitemap with one of the many sitemap generator tools that are available. In fact, we recently updated our ultimate guide to sitemaps, which includes our top recommendations.
Once you have generated your sitemap, make sure that this is submitted to Google Search Console and Bing Webmaster Tools.
Make sure to also reference your sitemap in your robots.txt file.
5. Create a Robots.txt File
Quite simply, your site’s robots.txt file tells search engine crawlers which pages and files they can or can’t request from your site.
Most commonly, it is used to prevent certain sections of your site from being crawled; it is not intended as a way to de-index a webpage and stop it from showing on Google.
You can find your site’s robots.txt file at https://www.domain.com/robots.txt
Check whether you already have one in place.
If you don’t, you need to create one – even if you do not currently need to prevent any web pages from being crawled.
Several WordPress SEO plugins allow users to create and edit their robots.txt file, but if you are using a different CMS, you might need to manually create the file using a text editor and upload it to the root of your domain.
You can learn more about how to use robots.txt files in this beginner’s guide.
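As a sketch, a basic robots.txt file might look like the following; the disallowed paths are hypothetical examples, not recommendations for your site:

```
# Rules apply to all crawlers
User-agent: *
# Block crawling of these (example) sections
Disallow: /cart/
Disallow: /admin/

# Point crawlers at your sitemap
Sitemap: https://www.domain.com/sitemap.xml
```

Even an empty rule set with just the Sitemap line is a valid starting point if you have nothing to block.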
6. Check Search Console For Manual Actions
In rare instances, you might find that your site has been negatively affected by having a manual action imposed upon it.
Manual actions are typically caused by a clear attempt to violate or manipulate Google’s Webmaster Guidelines – this includes things like user-generated spam, structured data issues, unnatural links (both to and from your site), thin content, hidden text and even what is referred to as pure spam.
Most sites are not affected by a manual action and never will be.
That said, you can check for one in the Manual Actions tab in Google Search Console.