The 2020 Keyword Research Guide for SEO

This guide is intended to teach you how to do in-depth and meaningful keyword research.

Good keyword research allows you to uncover the terms, phrases, questions, and answers that are important to your users or customers AND important to achieving your goals, whether they are getting more pageviews, capturing leads, or selling products and services. Keyword research sets you up for building effective strategies for improving or expanding your content to acquire higher rankings, and to rank on a wider variety of terms, to ultimately drive more relevant organic traffic to your site.

1. What Is Keyword Research?

Keyword research is the process of finding all of the possible search engine queries which may be relevant to your business and your customers. Keyword research includes not only finding these keywords but also sorting and prioritizing them into logical, related groups, which can then inform how you might change existing pages on your site or create new content.

Why Keyword Research Is (Still) Important for SEO

While some SEOs may argue that keywords are no longer important or won’t be essential in the future, they are still crucial not only for search engine rankings but for understanding the search intent behind a given query. As long as people use search engines by typing a query into a search box or making a voice query on an “assistant”, it will be crucial to understand the following:

  • What those queries are.
  • How important they are to your business.
  • How you might create the best content to answer the intent of the query.

Even as search trends change, if people are looking for an answer to “something”, keywords will continue to matter.

Old school “individual” keywords and optimizing a single page for a single keyword has certainly gone by the wayside. However, using groups of related keywords, and examining their relative popularity, can not only give you insights into opportunities to drive more organic traffic to your site but can also help you understand the overall intent of your potential users. This information can help you better satisfy those intents not only through optimizing your website but potentially optimizing your product selection, navigation, UI, etc.

Understanding Keyword Themes (Groups of Related Keywords)

Some may refer to groups of related keywords as topics or themes, but at heart, they are groups of individual keywords that signal a similar need or intent by a searcher. As such, keyword research should never be left as simply a list of keywords, but rather used to form various segments of interrelated keywords.

A single topic or theme might lend itself to a single piece of content that can answer all of the needs within that topic, and thus a single page is “optimized” for the entire group of keywords. Or, the topic may be broad enough to signal that you should have an entire section of your website with many pieces of content targeted at answering the user intents.

For example, if you were writing a post about “how to fry an egg”, one single article might satisfy the intent for all the keywords around that “theme”. Example:

  • How to fry an egg
  • How to cook a sunny side up egg
  • How to cook an egg over medium
  • How to fry an egg for a sandwich
  • How to fry an egg in the microwave
  • How to fry an egg over easy
  • How to fry an egg over hard
  • How to fry an egg over medium
  • How to fry an egg sunny side up
  • How to fry an egg with oil
  • How to fry an egg without oil
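
As a toy illustration of how such a theme hangs together, the sketch below reduces each keyword to its “core” terms and treats two keywords as part of one theme when enough of those terms overlap. The stopword list and the overlap threshold are assumptions made up for this example, not anything search engines publish.

```python
# Toy sketch: reduce keywords to core terms and compare overlap.
# STOPWORDS and min_overlap are illustrative assumptions.
STOPWORDS = {"how", "to", "a", "an", "the", "for", "in", "with", "without"}

def core_terms(keyword):
    """Strip filler words, leaving the terms that signal the searcher's intent."""
    return {w for w in keyword.lower().split() if w not in STOPWORDS}

def same_theme(kw_a, kw_b, min_overlap=2):
    """Crude test: two keywords share a theme if enough core terms overlap."""
    return len(core_terms(kw_a) & core_terms(kw_b)) >= min_overlap
```

Run against the list above, “How to fry an egg in the microwave” shares the {fry, egg} core with the seed keyword, while an unrelated query shares nothing.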

If you had a group of keywords or a theme around “what caused the decline and fall of the Roman Empire,” all of the intents around that theme of keywords are unlikely to be satisfied by a single piece of content and would likely require a much larger body of content.

Keyword/Query Trends

Some SEOs argue that individual “head” keywords aren’t going to matter anymore because of voice search, which leads to long, natural-language search queries. Search queries, in general, are becoming much longer, in part due to voice search.

But, that doesn’t mean that shorter “head” keywords can’t form the basis for starting your keyword research and helping to uncover many longer-tail keyword variants.

This is partly because, at least for now, there really is no separate set of voice search results or a separate voice database.

Google, for instance, simply returns essentially the same results for a voice query as if you had typed that exact query into the search box on the Google web interface or search app. For many of these long-tail queries, Google simply parses out the most important terms in the query and returns the results for those.

For instance, someone may search for “Hey Google, what are the best running shoes for a person who has flat feet?”. Looking at Google search results, it is easy to see that Google returns the exact same result set for that query as it does for “best running shoes flat feet.”  – Read more
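
The idea that Google parses out the most important terms can be mimicked with a toy filter. The filler-word list below is purely an assumption chosen to make this one example work; Google’s actual query understanding is far more sophisticated.

```python
import re

# Filler words chosen only for this example; Google's real query parsing
# is far more sophisticated than a fixed stopword list.
FILLER = {"hey", "google", "what", "are", "the", "for", "a", "person", "who", "has"}

def distill(query):
    """Reduce a long, natural-language query to its head terms."""
    words = re.findall(r"[a-z]+", query.lower())
    return " ".join(w for w in words if w not in FILLER)
```

Here, distilling the voice query above yields “best running shoes flat feet” – the same head terms as the typed query.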

SEO in 2020: What Role Do Keywords Play?

It’s 2020, and SEO professionals who’ve been at it for a while will know just how much has changed in the past decade.

In 2010, we still hadn’t been hit with Panda or Hummingbird or RankBrain or BERT, and many of us still thought “SEO content” was a matter of:

  • Adding our target keyword and its close variants in the content X times.
  • Making sure to add that keyword to all the magic places like your title tag, meta description, H1, etc.
  • Writing at least X words because that’s the magic length for rankings.

But Google’s algorithm has matured.

We know now (or we should) that getting our content ranked isn’t a matter of tricking Google by stuffing keywords in all the right places. It’s about providing an exceptional experience to searchers.

So how exactly should we be using keywords?

To answer that, we’ll need to take a step back and address what it really means to write content for search.

What Is SEO Content?

SEO content is content written for the purpose of ranking in search engines. That term, however, has fallen out of favor with many SEO professionals.

That’s because “SEO content” implies content written for search engines rather than humans, and that’s not good.

Why?

Because Google’s algorithm is a programmatic representation of the searcher.

If the algorithm is trying to model what a human visitor would pick as the best result, the answer to “how to rank” is to do what’s best for searchers.

So if that’s the kind of content Google wants to rank, then the way to write “SEO content” is just to write in a way that people will enjoy – right?

Not quite. There’s a bit more to it than that.

How Do I Make Content SEO Friendly?

SEO-friendly content is content that answers the intent of the searcher’s question clearly and comprehensively, and has a high degree of expertise, authoritativeness, and trustworthiness.

Let’s break that down.

Content That Answers the Intent of the Searcher’s Question

“SEO friendly” content is content that, first and foremost, answers a searcher’s question.

This means that the topic of the page itself will be dictated by the questions your audience is asking.

This also means that not all content is relevant for a search audience. Some content is written for thought leadership or to break news (new ideas = no existing search demand). Other content is written to attract social engagement.

We write content for many different purposes, so we shouldn’t expect every single one of our pages to rank well in search engines.

That means adding search audience-focused topics to your editorial calendar, rather than attempting to sprinkle keywords onto all your pages, many of which weren’t written for a search audience in the first place.

Content That’s Clear & Comprehensive

When you ask a question, do you prefer getting an answer that’s convoluted, vague, and clunky? Or direct, specific, and straightforward?

It’s a no-brainer, right? Google thinks so too.

But it isn’t as shiny and exciting to talk about grammar and diction. I think most SEO professionals would rather talk about topics like natural language processing.

But even the most meticulously researched brief can be ruined by content that doesn’t read well, so this stuff matters.

Don’t underestimate the power of tools like Microsoft Word’s “Grammar & Refinements” settings that can help you:

  • Replace complex words with simpler ones.
  • Swap wordiness for conciseness.
  • Go from passive to active voice.

…and much more.

Google also values content that’s comprehensive. Just take a look at what they say in their quality rater guidelines:

The Highest rating may be justified for pages with a satisfying or comprehensive amount of very high-quality main content.

Or on their Webmasters Blog:

Q: What counts as a high-quality site?

A: You can answer “yes” to “Does this article provide a complete or comprehensive description of the topic?”

Be thorough and be clear when you’re answering your search audience’s questions. – Read more

10 Essential On-Page SEO Factors You Need to Know

Succeeding in organic search today requires optimizing for a combination of factors that search engines consider important – technical, on-page and off-page.

Over the years, we’ve seen increased focus toward off-page techniques – such as link building – and other technical elements.

But the reality is, off-page SEO won’t do much good if you don’t pay attention to the fundamentals – on-page SEO.

Smart SEO practitioners know that on-page optimization should be constantly prioritized.

And because the search landscape is ever-evolving, it’s important to make sure your on-page SEO knowledge is up to date.

In this post, we will cover what on-page SEO is, why it matters, and 10 of the most important on-page SEO considerations today.

What Is On-Page SEO?

On-page SEO (also known as on-site SEO) refers to the practice of optimizing web pages to improve a website’s search engine rankings and earn organic traffic.

In addition to publishing relevant, high-quality content, on-page SEO includes optimizing your headlines, HTML tags (title, meta, and header), and images. It also means making sure your website has a high level of expertise, authoritativeness, and trustworthiness.

It takes into account various aspects of the webpage that, when added together, will improve your website’s visibility in the search results.

Why On-Page SEO Is Important

On-page SEO is important because it helps search engines understand your website and its content, as well as identify whether it is relevant to a searcher’s query.

As search engines become more sophisticated, there is a greater focus toward relevance and semantics in search engine results pages (SERPs).

Google, with its plethora of complex algorithms, is now much better at:

  • Understanding what users are actually searching for when they type a query.
  • Delivering search results that meet user intent (informational, shopping, navigational).

Adapting to this development is essential, and you can do it by ensuring that your website and its content – both what is visible to users on your webpages (i.e., text, images, video, or audio) and elements that are only visible to search engines (i.e., HTML tags, structured data) – are well-optimized according to the latest best practices.

Additionally, you can’t simply ignore on-page SEO because you have more control when optimizing for on-site elements – as opposed to off-page SEO that consists of external signals (i.e., backlinks).

If you put effort into on-page strategies, you’ll see a boost in traffic and a rise in your search presence.

This guide will walk you through the most important elements of on-page SEO.

Paying close attention to these 10 areas will help improve your content and authority – and increase your rankings, traffic, and conversions.

1. E-A-T

E-A-T, which stands for Expertise, Authoritativeness, and Trustworthiness, is the framework that Google raters use to assess content creators, webpages, and websites as a whole.

Google has always put a premium on high-quality content. It wants to make sure that sites producing high-quality content are rewarded with better rankings and sites that create low-quality content get less visibility.

There is a clear relationship between what Google considers high-quality content and what appears in the search results.

Call it correlation or causation – whatever it is, E-A-T is somehow playing a role in Google’s organic search results, which means E-A-T must be a consideration in your SEO strategy.

2. Title Tag

The title tag, an HTML tag that exists in the head section of each webpage, provides an initial cue or context as to the topical subject matter of the page it is on.

It is featured prominently in the search engine results pages (typically used as the clickable link) as well as in the browser window.

The title tag by itself has little impact on organic rankings, which is why it’s sometimes overlooked.

That said, missing, duplicate, and poorly written title tags can all negatively impact your SEO results, so make sure you’re optimizing for this element.
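
A simple way to catch those missing, duplicate, and overlong titles is to audit them in bulk. The sketch below uses only Python’s standard library; the 60-character limit is a common rule of thumb, not an official Google threshold.

```python
# Minimal title-tag audit: flag missing, duplicate, and overlong titles.
# The 60-character limit is a rule of thumb, not an official threshold.
from collections import Counter
from html.parser import HTMLParser

class TitleExtractor(HTMLParser):
    """Pull the <title> text out of an HTML document."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        if self.in_title:
            self.title += data

def extract_title(html):
    parser = TitleExtractor()
    parser.feed(html)
    return parser.title.strip()

def audit_titles(pages, max_len=60):
    """pages maps URL -> HTML source; returns URL -> list of title issues."""
    titles = {url: extract_title(html) for url, html in pages.items()}
    counts = Counter(t for t in titles.values() if t)
    issues = {}
    for url, title in titles.items():
        problems = []
        if not title:
            problems.append("missing")
        else:
            if counts[title] > 1:
                problems.append("duplicate")
            if len(title) > max_len:
                problems.append("too long")
        if problems:
            issues[url] = problems
    return issues
```

Feeding it a crawl export of URL-to-HTML pairs gives a quick worklist of pages whose titles need attention.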

3. Meta Description

Since the early days of SEO, meta descriptions have been an important optimization point.

Meta descriptions, meta tags that provide a description of what the page is about, are often displayed in the SERPs underneath the title of the page.

While Google maintains that meta descriptions don’t help with rankings, there is anecdotal evidence that indirect attributes of better descriptions do help. – Read more
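
Even if descriptions don’t drive rankings directly, auditing them is cheap. A minimal sketch, assuming the widely used 50–160 character rule of thumb (which is a convention, not a Google specification):

```python
# Extract the meta description and flag common problems.
# The 50-160 character range is a convention, not a Google spec.
from html.parser import HTMLParser

class MetaDescription(HTMLParser):
    """Capture the content of <meta name="description" ...>."""
    def __init__(self):
        super().__init__()
        self.description = None
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

def check_description(html, min_len=50, max_len=160):
    parser = MetaDescription()
    parser.feed(html)
    if parser.description is None:
        return "missing"
    if len(parser.description) < min_len:
        return "too short"
    if len(parser.description) > max_len:
        return "too long"
    return "ok"
```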

7 Tips to Optimize Crawl Budget for SEO

Crawl budget is a vital SEO concept that often gets overlooked.

There are so many tasks and issues an SEO expert has to keep in mind that it’s often put on the back burner.

In short, crawl budget can, and should, be optimized.

In this article, you will learn:

  • How to improve your crawl budget along the way.
  • How crawl budget as a concept has changed in the last couple of years.

What Is Crawl Budget?

So for those of us who’ve had so much to think/worry/sweat about that we forgot what crawl budget even means, here’s a quick recap.

Crawl budget is simply the frequency with which search engines’ crawlers (i.e., spiders and bots) go over the pages of your domain.

That frequency is conceptualized as a tentative balance between Googlebot’s attempts to not overcrowd your server and Google’s overall desire to crawl your domain.

Crawl budget optimization is just a series of steps that you can take specifically to up the rate at which search engines’ bots visit your pages.

The more often they visit, the quicker updated pages make it into the index.

Consequently, your optimization efforts will take less time to take hold and start affecting your rankings.

With that wording, it certainly sounds like the most important thing we all should be doing every second, right?

Well, not entirely.

Why Is Crawl Budget Optimization Neglected?

To answer that question, you only need to take a look at this official blog post by Google.

As Google explains plainly, crawling by itself is not a ranking factor.

So that alone is enough to stop certain SEO professionals from even thinking about crawl budget.

To many of us, “not a ranking factor” is equated to “not my problem.”

I disagree with that wholeheartedly.

But even forgetting that, there are Google’s Gary Illyes’ comments. He has stated outright that, sure, for a huge website of millions and millions of pages, crawl budget management makes sense.

But if you’re a modestly-sized domain, then you don’t have to actually concern yourself too much with crawl budget. (And in fact, he added that if you really have millions and millions of pages, you should consider cutting some content, which would be beneficial for your domain in general.)

But, as we all know, SEO is not at all a game of changing one big factor and getting the results.

SEO is very much a process of making small, incremental changes, taking care of dozens of metrics.

Our job, in a big way, is about making sure that thousands of tiny little things are as optimized as possible.

In addition, although it’s not a big ranking factor by itself, as Google’s John Mueller points out, crawl budget optimization is good for conversions and for overall website health.

With all that said, I feel it’s important to make sure that nothing on your website is actively hurting your crawl budget.

How to Optimize Your Crawl Budget Today

Some factors are still super heavy-duty, while others’ importance has changed dramatically to the point of not being relevant at all.

You still need to pay attention to what I call the “usual suspects” of website health.

1. Allow Crawling of Your Important Pages in Robots.txt

This is a no-brainer, and a natural first and most important step.

Managing robots.txt can be done by hand, or using a website auditor tool.

I prefer to use a tool whenever possible. This is one of the instances where a tool is simply more convenient and effective.

Simply adding your robots.txt to the tool of your choice will allow you to allow or block crawling of any page of your domain in seconds. Then you simply upload the edited document and voila!

Obviously, anybody can pretty much do it by hand. But from my personal experience I know that with a really large website, where frequent calibrations might be needed, it’s just so much easier to let a tool help you out.
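
Whether you edit by hand or use a tool, you can sanity-check the result before uploading. A small sketch using Python’s standard library robots.txt parser; the rules and paths below are made up for illustration:

```python
# Check which paths a robots.txt allows a crawler to fetch.
# The rules and paths are illustrative; swap in your site's real file.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /cart/
Disallow: /search
Allow: /
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

def crawlable(path, agent="Googlebot"):
    """Return True if the given user agent may fetch this path."""
    return rp.can_fetch(agent, path)
```

Checking your important pages this way before deploying an edited robots.txt catches accidental blocks early.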

2. Watch Out for Redirect Chains

This is a common-sense approach to website health.

Ideally, you would be able to avoid having even a single redirect chain on your entire domain.

Honestly, it’s an impossible task for a really large website – 301 and 302 redirects are bound to appear.

But a bunch of those, chained together, definitely hurts your crawl limit, to the point where a search engine’s crawler might simply stop crawling without getting to the page you need indexed.

One or two redirects here and there might not damage you much, but it’s something that everybody needs to take good care of nevertheless.
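
If you export your known redirects (old URL → new URL) from your server config or a site crawl, chains are easy to spot programmatically. A hedged sketch with illustrative URLs:

```python
# Follow known redirects to surface chains; any chain longer than two
# entries is worth collapsing into a single hop.
def redirect_chain(url, redirects, limit=10):
    """Follow redirects from `url` and return the full hop sequence."""
    chain = [url]
    seen = {url}
    while url in redirects and len(chain) <= limit:
        url = redirects[url]
        chain.append(url)
        if url in seen:  # redirect loop detected
            break
        seen.add(url)
    return chain

# Illustrative mapping; in practice, export this from your server config
# or a crawler report.
redirects = {
    "/old-page": "/newer-page",
    "/newer-page": "/final-page",
}
```

Any URL whose chain has more than two entries should ideally be repointed directly at the final destination.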

3. Use HTML Whenever Possible

Now, if we’re talking Google, then it has to be said that its crawler got quite a bit better at crawling JavaScript in particular, but also improved in crawling and indexing Flash and XML.

On the other hand, other search engines aren’t quite there yet.

Because of that, my personal standpoint is, whenever possible, you should stick to HTML.

That way, you’re not hurting your chances with any crawler for sure. – Read more

3 Qualities of the Best SEO Information

In search marketing there are many different ideas of how Google ranks sites. Who do you believe? Nobody can say they 100% know Google’s algorithm. But we can know 100% that some explanations of how Google ranks sites are more trustworthy than others.

Different Levels of Knowledge

Some ideas are more true than others. What makes one idea more trustworthy is the evidence.

However, not all evidence is trustworthy. There are two poor and untrustworthy sources of evidence:

1. Anecdotal evidence

2. Correlation based evidence

Anecdotal SEO Evidence

Anecdotal refers to ideas that are based on the personal experience of one or more people, but without actual research and testing to confirm the idea.

The early years of SEO were dominated by hypotheses created by anecdotal evidence. One of the earliest examples is when affiliate marketers noticed that Google was consistently banning affiliate sites. That led to the hypothesis that Google hated affiliate sites and was actively going after them.

This led to hiding affiliate links by using JavaScript and/or URLs that are redirected through pages that were blocked from crawling with Robots.txt. The idea was to hide affiliate links from Google so that Google wouldn’t ban the site for being an affiliate site.

That’s an example of anecdotal evidence (a group of affiliate marketers noticed they all lost rankings) which then led to the idea that Google was “targeting” affiliate sites.

Of course, they were wrong. Not only have Googlers stated that Google does not treat affiliate sites as lower quality, but there are no research papers or patents to show that Google had researched algorithms that “targeted” affiliate sites or any specific kinds of marketers other than spammers.

There was no factual evidence to support the anecdotal evidence. Factual evidence, in my opinion, is what separates a flimsy opinion from an evidence-based insight.

It doesn’t matter what your favorite SEO guru tells you about Google. It doesn’t matter if that guru is ranked at the top of Google; that doesn’t prove anything. What matters is the factual evidence to support the idea.

Correlation Based Hypotheses

The SEO industry is being exposed to less and less of this kind of SEO information. At one time, a correlation study based on millions of search results resulted in lots of links and attention. But the information was bad.

Just because all the top ranked sites have active social media presence does not mean that social media presence is a ranking factor.

That kind of correlation is especially wrong if there is absolutely no research on that kind of ranking factor by any search engine or university anywhere on earth.

But that’s the kind of correlation nonsense the SEO industry fell for during the mid-2000’s. And for a time many businesses wasted money doing things like trying to attract likes to their Facebook page.

The days when the SEO industry believed the outcomes of correlation studies were a dark period for the industry.

While there are STILL some SEOs who are publishing correlation studies, many SEOs are increasingly skeptical and ignoring them, as well they should.

3 Qualities of the Best SEO Information

In my estimation, there are three levels of SEO knowledge. At the top is canonical-level information, followed by citation-based knowledge and experience-based knowledge.

1. Canonical SEO Information
Confirmed by Google to be true.

2. Citation-Based Knowledge
This is information that is supported by reliable evidence such as patents and research papers.

3. Experience-Based SEO
Professionals who are actively creating websites and ranking them can be considered authoritative. You can’t argue with success, particularly with a person who is actually doing the work and succeeding with it. – Read more

How to Perform an In-Depth Technical SEO Audit

I’m not going to lie: Conducting an in-depth SEO audit is a major deal.

And, as an SEO consultant, there are few sweeter words than, “Your audit looks great! When can we bring you onboard?”

Even if you haven’t been actively looking for a new gig, knowing your SEO audit nailed it is a huge ego boost.

But are you terrified to start? Is this your first SEO audit? Or do you just not know where to begin? Sending a fantastic SEO audit to a potential client puts you in the best possible place.

It’s a rare opportunity for you to organize your processes and rid your potential client of bad habits (cough*unpublishing pages without a 301 redirect*cough) and crust that accumulates like the lint in your dryer.

So take your time. Remember: Your primary goal is to add value to your customer with your site recommendations for both the short-term and the long-term.

Ahead, I’ve put together the need-to-know steps for conducting an SEO audit and a little insight into the first phase of my processes when I first get a new client. It’s broken down into sections below. If you feel like you have a good grasp on a particular section, feel free to jump to the next. – Read more

Pro Tip: A look back at some helpful tips from our SEO community in 2019

Here are 5 tips from our SEO community we think are worth a second look.

Our community is dedicated to helping fellow SEOs so we wanted to look back at a few insights shared this past year that were particularly popular with readers.

1. Google doesn’t hate your website

“The personal animosity complaint is as frequent as it is irrational,” explains ex-Googler Kaspar Szymanski. “Google has never demonstrated a dislike of a website and it would make little sense to operate a global business based on personal enmity. The claim that a site does not rank because of a Google feud is easily refuted with an SEO audit that will likely uncover all the technical, content, on- and off-page shortcomings. There are Google penalties, euphemistically referred to as Manual Spam Actions; however, these are not triggered by personal vendettas and can be lifted by submitting a compelling Reconsideration Request. If anything, Google continues to demonstrate indifference towards websites. This includes its own properties, which time and again had been penalized for different transgressions.” MORE >>

2. JavaScript publishers: Here’s how to view rendered HTML pages via desktop

“Many don’t know this but you can use the Rich Results Test to view the rendered HTML based on Googlebot desktop. Once you test a URL, you can view the rendered HTML, page loading issues and a JavaScript console containing warnings and errors,” explains Glenn Gabe of G-Squared Interactive. “And remember, this is the desktop render, not mobile. The tool will show you the user-agent used for rendering was Googlebot desktop. When you want to view the rendered HTML for a page, I would start with Google’s tools. But that doesn’t mean they are the only ways to check rendered pages. Between Chrome, third-party crawling tools and some plugins, you have several more rendering weapons in your SEO arsenal.” MORE >>

3. Missing results in SERP even after using FAQ Schema?

“Google will only show a maximum of three FAQ results on the first page. If you’re using FAQ Schema and ranking in the top 10 but your result isn’t appearing on the first page, then it could be something unrelated,” explains SEO consultant Brodie Clark. “A few possible scenarios include: 1) Google has decided to filter out your result because the query match isn’t relevant enough with the content on your page; 2) The guidelines for implementation are being breached in some form (maybe your content is too promotional in nature); 3) There is a technical issue with your implementation. Use Google’s Rich Results Test and Structured Data Testing Tool to troubleshoot.” MORE >>
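
If the implementation itself is the suspect, it can help to generate the JSON-LD programmatically rather than templating it by hand. A sketch using schema.org’s FAQPage vocabulary (the question below is a placeholder; always validate the output with Google’s Rich Results Test):

```python
# Build FAQPage structured data as JSON-LD from (question, answer) pairs.
# Validate the output with Google's Rich Results Test before deploying.
import json

def faq_jsonld(qa_pairs):
    """Return a JSON-LD string for a schema.org FAQPage."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in qa_pairs
        ],
    }, indent=2)
```

The resulting string can be embedded in a `<script type="application/ld+json">` block on the page.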

4. How to avoid partial rendering issues with service workers

“When I think about service workers, I think about them as a content delivery network running in your web browser,” explains Hamlet Batista of Ranksense and SMX Advanced speaker. “A CDN helps speed up your site by offloading some of the website functionality to the network. One key functionality is caching, but most modern CDNs can do a lot more than that, like resizing/compressing images, blocking attacks, etc. A mini-CDN in your browser is similarly powerful. It can intercept and programmatically cache the content from a progressive web app. One practical use case is that this allows the app to work offline. But what caught my attention was that as service worker operates separate from the main browser thread, it could also be used to offload the processes that slow the page loading (and rendering process) down.” MORE >>

– Read more

Page load time and crawl budget rank will be the most important SEO indicators in 2020

Based on my own testing, PLT and CBR are the technical aspects I believe will determine website success, or failure, in the new year.

Google has the ability to impose its own rules on website owners, both in terms of content and transparency of information, as well as technical quality. Because of this, the technical aspects I pay the most attention to now – and will next year – are the speed of websites in the context of different loading times, which I am calling PLT (Page Load Time).

Time to first byte (TTFB) is the server response time from sending the request until the first byte of information is sent. It demonstrates how a website works from the perspective of a server (database connection, information processing and data caching system, as well as DNS server performance). How do you check TTFB? The easiest way is to use one of the following tools:

  • Developer tools in the Chrome browser
  • WebPageTest
  • Byte Check

Interpreting results

A TTFB below 100ms is an impressive result. In Google’s recommendations, TTFB should not exceed 200ms. It is commonly accepted that the server response time, measured up to receiving the first byte, should not exceed 0.5s. Above this value, there may be problems on the server, and correcting them will improve the indexation of a website.
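
Those thresholds can be expressed as a tiny helper for classifying measurements (the labels are my own paraphrase of the guidance above):

```python
# Classify a measured time to first byte against the thresholds above.
# Labels are paraphrases of the guidance, not official terminology.
def rate_ttfb(ms):
    """Classify a TTFB measurement, given in milliseconds."""
    if ms < 100:
        return "impressive"
    if ms <= 200:
        return "within Google's recommendation"
    if ms <= 500:
        return "acceptable"
    return "likely server-side problems"
```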

Improving TTFB

1. Analyze the website by improving either the fragments of code responsible for resource-consuming database queries (e.g. multi-level joins) or heavy code loading the processor (e.g. generating on-the-fly complex tree data structures, such as category structure or preparing thumbnail images before displaying the view without the use of caching mechanisms).

2. Use a Content Delivery Network (CDN). This means using networks of servers scattered around the world that serve content such as CSS, JS files and photos from the servers located closest to the person who wants to view a given website. Thanks to a CDN, resources are not queued, as in the case of classic servers, but are downloaded almost in parallel. Implementing a CDN can reduce TTFB by up to 50%.

3. If you use shared hosting, consider migrating to a VPS server with guaranteed resources such as memory or processor power, or a dedicated server. This ensures only you can influence the operation of a machine (or a virtual machine in the case of VPS). If something works slowly, the problems may be on your side, not necessarily the server.

4. Think about implementing caching systems. In the case of WordPress, you have many plugins to choose from, the implementation of which is not problematic, and the effects will be immediate. WP Super Cache and W3 Total Cache are the plugins I use most often. If you use dedicated solutions, consider Redis, Memcache or APC implementations that allow you to dump data to files or store them in RAM, which can increase the efficiency.

5. Enable HTTP/2 protocol or, if your server already has the feature, HTTP/3. Advantages in the form of speed are impressive.

DOM processing time

DOM processing time is the time it takes to download and process all of the HTML code. The more efficient the code, the fewer resources are needed to load it. Reducing the resources needed to store a website in the search engine index improves speed and user satisfaction.

I am a fan of reducing the volume of HTML code by eliminating redundant markup and switching the generation of purely visual elements from HTML to CSS. For example, I use the ::before and ::after pseudo-elements, and I remove inline SVG images from the HTML (those stored inside <svg> </svg> tags).
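
As a rough gauge of how much volume that kind of cleanup recovers, you can compare byte counts before and after stripping comments and inter-tag whitespace. Real minifiers are far more careful (they must not touch `<pre>` content, for example); this is only a sketch:

```python
# Rough gauge of HTML volume savings from stripping comments and
# inter-tag whitespace. Not a production minifier.
import re

def rough_minify(html):
    html = re.sub(r"<!--.*?-->", "", html, flags=re.S)  # drop HTML comments
    html = re.sub(r">\s+<", "><", html)                 # whitespace between tags
    return html.strip()

def savings(html):
    """Fraction of bytes removed by the rough cleanup."""
    return 1 - len(rough_minify(html)) / len(html)
```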

Page rendering time

Page rendering time of a website is affected by downloading graphic resources, as well as downloading and executing JS code.

Minification and compression of resources is a basic action that speeds up the rendering time of a website. Other helpful measures include asynchronous image loading, HTML minification, and migrating JavaScript out of the HTML (where function bodies are embedded directly in the page) into external JavaScript files loaded asynchronously as needed. It is good practice to load only the JavaScript or CSS code that is needed on the current sub-page. For instance, if a user is on a product page, the browser does not have to load JavaScript code that will be used in the basket or in the panel of a logged-in user.

The more resources that need to be loaded, the more time Google Bot must spend downloading information about the content of the website. If we assume that each website has a maximum number or duration of Google Bot visits – each of which ends with indexing the content – then the more time spent per page, the fewer pages we will be able to send to the search engine index during that time.

Crawl Budget Rank

The final issue requires more attention. Crawl budget significantly influences the way Googlebot indexes content on a website. To describe how it works and what the crawl budget is, I use a concept I call CBR (Crawl Budget Rank), which assesses the transparency of a website's structure.

If Googlebot finds duplicate versions of the same content on a website, our CBR decreases. We know this in two ways: – Read more

9 Bad SEO Habits to Leave in 2019

As we enter a new decade, it’s time to say goodbye to some bad SEO habits.

These are SEO tactics that just plain don’t work, or even worse, can get a website penalized.

Below is a list of the top nine habits that need to be kicked to the curb.

1. Creating Pages with Similar Content

Fortunately, this tactic is not as prevalent as it once was, but this issue periodically comes up, even today.

Creating pages with similar content, usually for the sole purpose of targeting keywords, is not a good strategy.

For example, duplicating city pages within a website with the city name as the only difference can be harmful.

Essentially, you end up with low-quality pages that can pull down the rest of the site.

2. Link Building Using Generic, Templated Emails

We don’t like receiving spam, so why send it?

Link building has become more of a marketing tactic than just an SEO tactic. That means we have to identify and research our audience before creating our “marketing” message.

Sending a generic, templated message to someone asking for a link is not going to get you great results.

Instead, send fewer emails, but take the time to research that person’s website and understand what would interest their users or customers.

Also, don’t use a general salutation, such as “Dear Webmaster” or “Dear Website Owner.” Use the person’s name.

3. Trying to Solve Every Ranking Problem By Getting More Links

Yes, links still matter today, but they are only one of many factors of the ranking algorithm.

Links are a public endorsement and reflect that a website has valuable information.

Where the problems occur, though, is when links are gathered in an unnatural way, such as through link schemes, poor link directories, purchasing links, and other spammy tactics.

As we start the new year, these aggressive link building techniques should be abandoned and the focus should be on a link strategy that is more marketing and user-focused. Check out SEJ’s Link Building Guide for tips that can carry you into 2020.

4. Adding Marginal Content for SEO Purposes

You can’t have SEO without content.

SEO and content are intertwined.

You need content to optimize for search.

If you don’t optimize your content, searchers won’t find you.

So, there is no question that we need content, but there is still a problem.

Marginal content is often added to websites simply for the purpose of “improving SEO.”

However, having just any content isn’t good enough.

Avoid churning out a ton of content just for the sake of increasing the number of pages on a website. Google is constantly preaching quality content and even if the search engine wasn’t preaching it, we still need to focus on our users.

Your content has to be considered high quality, especially when compared to the competition.

5. Skipping Over Fundamental On-Page Optimization Elements

There has been speculation over the years regarding the correlation between title tags and rankings.

Regardless of where you stand on this topic, a good title can convert a searcher into a visitor (and even a customer, if you’re lucky).

Take the time to optimize your titles with keywords, but also be sure to make them compelling. – Read more

3 Free Tools to Help Investigate & Fix PPC Account Performance Changes

Figuring out why the performance of a PPC account has changed can be one of the most time-consuming tasks in PPC.

Not only is it a big time drain, but it’s also often associated with a fire drill, done at the urgent request of a boss or client who is demanding answers after something didn’t go as planned.

I’ll cover how a typical investigation is done and share free tools and scripts that can help speed up this process.

Of particular interest is Google’s just-announced ‘Explanations’ feature, which can be a great help when trying to find the culprit when things don’t go as planned.

How to Investigate Account Performance Changes

A typical investigation can consume hours if done manually and usually follows these steps:

  • Finding out you have an issue.
  • Determining if the change was across the whole account or mainly due to a few rogue items like some overly broad keywords.
  • Drilling down deeper into the responsible entities.
  • Collating metrics from various sources to understand if the change was due to a change you made, a change in user behavior, or a change by competitors.
  • Fixing the issue.

Step 1: Know You Have an Issue

We’ve all got a lot on our plates. So, chances are, you aren’t logging into all your accounts every hour.

That’s why it’s so important to have good monitoring in place so that you’ll get an alert if something is going on with an account.

If you don’t have good monitoring, rest assured your client will monitor things for you.

But that comes with a downside: they will yell at you.

Also, by that point, things may have gone far off track.

So set up some good alerts and spare yourself that trouble.

You’ll look like the PPC rockstar you are if you squash a problem before it gets out of hand.

Step 2: Find the Best Place to Start the Investigation

Once you know that an investigation is needed, it’s time to find out where to start.

A big change in performance can come from the combination of many small changes or from some isolated bigger changes.

After all, 1 + 2 + 1 and 0 + 0 + 4 both add up to 4.

It helps focus your effort when you know where the biggest changes appear to have happened.

Notice I use the word “appear” because it is possible that a campaign with no top-level change actually had lots of positive and negative changes that canceled each other out.

The simplest way to go about this step is to rank campaigns by the biggest net change.

This is simple to do.

Turn on the date range comparison feature in Google Ads, filter for campaigns with a minimum level of data, and sort them from biggest to smallest change.
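The ranking step can be sketched as a small function over exported campaign rows. The field names (`name`, `previous`, `current`) are hypothetical; in practice the data would come from a Google Ads export with the date-range-comparison columns.

```javascript
// Sketch of step 2: compare two date ranges and rank campaigns by the
// absolute net change in a chosen metric, biggest change first.
// Row shape is hypothetical: { name, previous: {metric}, current: {metric} }.
function rankByNetChange(rows, metric, minValue = 0) {
  return rows
    .map((r) => ({
      name: r.name,
      change: r.current[metric] - r.previous[metric],
    }))
    .filter((r) => Math.abs(r.change) >= minValue) // drop low-data noise
    .sort((a, b) => Math.abs(b.change) - Math.abs(a.change));
}
```

The top of the resulting list tells you which campaigns to open first, which is exactly what the manual sort in the interface gives you.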

Step 3: Drill Down Deeper into Impacted Ads Entities

Once you’ve identified the campaigns most responsible for the change, repeat step 2 but now by looking at the ad groups with the biggest change in each affected campaign, one at a time.

Then repeat this again for keywords, queries, ads, etc. After doing this you have a list of individual things that you might be able to fix.

For example, you’ll know which keyword had the biggest drop in conversions and be able to fix its issue.


Or you might find that an affected campaign has no keywords of special note and everything declined equally, indicating that the issue may be due to a campaign-level setting such as a budget change.

As you can see, this recursive step can be time consuming for larger accounts.

Step 4: Drill Down into the Metrics

When you’ve found the entities most responsible for a change, be they campaigns, queries, or something else, it’s time to investigate the underlying cause.

Looking at the numbers will help you home in on the root cause.

3 Free Tools to Help Investigate & Fix PPC Account Performance Changes

This isn’t easy and requires downloading a lot of data (even data from outside Ads, like Google Trends) and combining it in spreadsheets.

While the Google Ads interface shows metrics in a table, there are relationships that are easier to see in a cause chart, which is illustrated in both the above image from Google and the one below from Optmyzr (my company).

3 Free Tools to Help Investigate & Fix PPC Account Performance Changes

For example, a conversion can only happen if you get a click. And a click can only happen if you get an impression, and an impression can only happen if a user searches for your keyword.

Understanding at which stage of these connected metrics things have unraveled will help pinpoint the likely fix.

An advertiser whose conversions have decreased should look at clicks, impressions, average CPC, impression share, etc. to determine what caused the change.
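The funnel described above can be sketched as a small diagnostic: conversions can only fall if impressions, click-through rate, or conversion rate fell, so comparing each stage across two periods points at the likely culprit. The input shape (`impressions`, `clicks`, `conversions`) is an assumption for illustration.

```javascript
// Sketch of the connected-metrics funnel:
//   conversions = impressions x CTR x conversion rate.
// The stage whose period-over-period ratio shrank the most is the best
// place to start digging. Input shape is hypothetical.
function funnelDiagnosis(prev, curr) {
  const stage = (name, prevVal, currVal) => ({
    name,
    ratio: prevVal > 0 ? currVal / prevVal : 0,
  });
  const stages = [
    stage('impressions', prev.impressions, curr.impressions),
    stage('ctr', prev.clicks / prev.impressions, curr.clicks / curr.impressions),
    stage('conversionRate', prev.conversions / prev.clicks, curr.conversions / curr.clicks),
  ];
  return stages.reduce((worst, s) => (s.ratio < worst.ratio ? s : worst));
}
```

If impressions held steady but CTR halved, the diagnosis points at the click stage (ad copy, rank, or competitor changes) rather than at the landing page.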

Once you know the lowest level metric that was impacted, you can correlate that with a likely cause and know if the reason is due to something you changed, something a competitor changed, or a change in user behavior. – Read more