Measure twice, cut once

Web Performance Testing Tools and Tips

Many organizations struggle with site load times and have yet to adopt the right measurement tools or processes to improve them. As the product manager for Optimizely Web, I’m naturally passionate about web page performance because it’s a major influence on the user experience and your business goals. To help our customers deliver snappy experiences for their end users, we recently launched Performance Edge, which enables performant experimentation at scale by reducing the impact on site speed. Whether you are ready to supercharge experiments on your most performance-sensitive pages with Optimizely, or you are just starting to set site speed goals for the year, the best practices below will help your team improve performance scientifically.

Prioritize Performance. Latency negatively impacts the user experience. Your site’s KPIs are a function of the user experience. It’s as simple as that.

Measure the right numbers. Many teams struggle to measure performance accurately, and lack focus when it comes to the metrics that matter: the ones that show how performance is shaping the user experience. For example, First Contentful Paint (FCP) measures the time it takes for a user to see something material on the page; it fires when the first visible content, such as text or an image, renders. FCP is a standard metric, and many tools like WebPageTest support it out of the box.

Better yet, measure Time to First Meaningful Paint (FMP). FMP captures how long it takes a meaningful element to load, and it’s up to you to decide which element makes the paint meaningful for your users. At Optimizely, I work on A/B testing products that modify web elements as the browser loads them; the element tested in variations of a given experiment is the one we consider meaningful. Time to Interactive (TTI), the point at which a visitor can click or tap, is important to overall page performance health. Still, because it is the outcome of even more resources loading in the browser than the other metrics mentioned, it is less useful for identifying specific actions you can take to improve.

Use the right tools.  Synthetic testing tools (with network throttling to mimic mobile) give you an initial read, but there is no substitute for real-world traffic. Measuring real traffic is called Real User Monitoring (RUM). Make sure your RUM collects information like the visitor’s browser, device, and location so you can slice your data later (more on that below). Synthetic tools still have their place, but they usually suffer from a limited sample size.

Use the right analysis technique.  Performance data is messy. There can be lots of variance and outliers. Visitors’ devices and locations are literally all over the map. Performance timings tend to be unstable over time. Most sites are built on or with dozens of third-party technologies like CDNs, frontend frameworks, A/B testing tools, databases, and APIs, to name a few. Your data is unlikely to reflect a perfect bell curve. That is, it’s probably not normally distributed and will have a long tail due to outliers.

The best way to analyze website performance in the face of this noise is to segment your visitors’ requests, measure them over time to account for seasonality, examine a large sample size, and use percentiles. Using averages instead will cloud your understanding, because a small number of hanging requests (due to things like a CDN cache miss or a spotty connection) will drag the average toward the outliers.
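To see the effect concretely, here is a minimal sketch (with hypothetical timings) of how a couple of hanging requests distort the average while the percentiles stay honest:

```python
import statistics

# Hypothetical page-load timings in milliseconds: most requests are fast,
# but two hang (say, a CDN cache miss or a spotty connection).
timings = [120, 130, 125, 140, 135, 128, 132, 138, 5000, 6000]

mean = statistics.mean(timings)                  # dragged toward the outliers
p50 = statistics.median(timings)                 # the typical visitor's experience
p95 = statistics.quantiles(timings, n=100)[94]   # the slow tail, tracked separately

print(f"mean: {mean:.1f} ms")  # 1204.8 ms
print(f"p50:  {p50:.1f} ms")   # 133.5 ms
```

The average suggests a 1.2-second experience that no visitor actually had; the median and the 95th percentile tell the real story.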

Segment your visitors.  Imagine a file loading in a browser: the main HTML document, a JavaScript bundle, or even an image. In this scenario, you measure the time to download (often a contributing factor to the metrics above). What influences how long that file takes to load? A lot of things, but the most important are connection speed and file size. We’ll talk about file size later on and focus on connection speed here. Connection speed depends on the network (Wi-Fi, 4G, 3G, etc.), as well as the bandwidth available to the device and the location of the visitor. Combined, these partially explain why mobile browsing is slower than your MacBook Pro at home. If you work on a site with global visitor traffic, you’re likely to have visitors in parts of the world with slower connectivity, like India, Africa, and Southeast Asia. What’s more, given how cheap data plans are nowadays, an internet user is more likely than ever to be on a mobile device. Finally, mobile devices usually have less CPU power, so executing JavaScript takes longer as well.

Use less code and split it up.  Aside from connection speed, the strongest factor in a file’s load time is its size. Big files take longer to download than small files, and slow networks and limited bandwidth exacerbate this. For JavaScript there is a second cost: the browser must also execute the code, and more code takes longer to run. When it comes to A/B testing, we recommend reducing the amount of code with Performance Edge and Custom Snippets. It also helps to split your code into smaller chunks so that only what’s necessary loads initially; everything else can load when you need it.
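A back-of-the-envelope sketch (the bundle sizes and bandwidth here are hypothetical) shows what splitting buys you on a slow connection:

```python
def transfer_ms(size_kb: float, bandwidth_mbps: float) -> float:
    """Rough transfer time in ms, ignoring latency, TCP slow start, etc."""
    bits = size_kb * 8 * 1024
    return bits / (bandwidth_mbps * 1_000_000) * 1000

# A hypothetical 600 KB JavaScript bundle on a ~1.6 Mbps 3G connection
whole_bundle = transfer_ms(600, 1.6)    # ~3072 ms before parsing even begins
critical_chunk = transfer_ms(120, 1.6)  # ~614 ms if only 120 KB is needed up front
```

Shipping only the critical 120 KB up front cuts roughly 2.5 seconds off the time before the browser can start work, before you even count the reduced parse and execution cost.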

Run proper tests.  When you want to improve your site’s performance, (a) not everything is a silver bullet, and (b) you should measure and communicate the impact of the changes you’re making. Testing helps quantify any tradeoffs you’re making and enables you to communicate the impact of your work. – Read more

7 Tips to Optimize Crawl Budget for SEO

Crawl budget is a vital SEO concept that often gets overlooked.

There are so many tasks and issues an SEO expert has to keep in mind that it’s often put on the back burner.

In short, crawl budget can, and should, be optimized.

In this article, you will learn:

  • How to improve your crawl budget along the way.
  • How crawl budget as a concept has changed in the last couple of years.

What Is Crawl Budget?

So for those of us who’ve had so much to think/worry/sweat about that we forgot what crawl budget even means, here’s a quick recap.

Crawl budget is simply the frequency with which search engines’ crawlers (i.e., spiders and bots) go over the pages of your domain.

That frequency is a tentative balance between Googlebot’s attempts not to overload your server and Google’s overall desire to crawl your domain.

Crawl budget optimization is just a series of steps that you can take specifically to up the rate at which search engines’ bots visit your pages.

The more often they visit, the quicker the index reflects that your pages have been updated.

Consequently, your optimization efforts will take less time to take hold and start affecting your rankings.

With that wording, it certainly sounds like the most important thing we all should be doing every second, right?

Well, not entirely.

Why Is Crawl Budget Optimization Neglected?

To answer that question, you only need to take a look at this official blog post by Google.

As Google explains plainly, crawling by itself is not a ranking factor.

So that alone is enough to stop certain SEO professionals from even thinking about crawl budget.

To many of us, “not a ranking factor” is equated to “not my problem.”

I disagree with that wholeheartedly.

But even forgetting that, there are Google’s Gary Illyes’ comments. He has stated outright that, sure, for a huge website of millions and millions of pages, crawl budget management makes sense.

But if you’re running a modestly sized domain, then you don’t actually have to concern yourself too much with crawl budget. (He in fact added that if you really do have millions and millions of pages, you should consider cutting some content, which would benefit your domain in general.)

But, as we all know, SEO is not at all a game of changing one big factor and getting the results.

SEO is very much a process of making small, incremental changes, taking care of dozens of metrics.

Our job, in a big way, is about making sure that thousands of tiny little things are as optimized as possible.

In addition, although it’s not a big ranking factor by itself, crawl budget optimization, as Google’s John Mueller points out, is good for conversions and for overall website health.

With all that said, I feel it’s important to make sure that nothing on your website is actively hurting your crawl budget.

How to Optimize Your Crawl Budget Today

Some factors still demand serious attention, while the importance of others has changed so dramatically that they are no longer relevant at all.

You still need to pay attention to what I call the “usual suspects” of website health.

1. Allow Crawling of Your Important Pages in Robots.Txt

This is a no-brainer, and a natural first and most important step.

Managing robots.txt can be done by hand, or using a website auditor tool.

I prefer to use a tool whenever possible. This is one of the instances where a tool is simply more convenient and effective.

Simply adding your robots.txt to the tool of your choice lets you allow or block crawling of any page of your domain in seconds. Then you upload the edited document and voila!
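If you’d rather script the sanity check, Python’s standard library can evaluate robots.txt rules before you upload the edited file (the rules and URLs below are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt you are about to upload
rules = """\
User-agent: *
Disallow: /checkout/
Allow: /products/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Verify the important page stays crawlable and the blocked one does not
print(parser.can_fetch("*", "https://example.com/products/blue-widget"))  # True
print(parser.can_fetch("*", "https://example.com/checkout/step-1"))       # False
```

Running a check like this against your list of important URLs catches an accidental Disallow before Googlebot ever sees it.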

Obviously, anybody can pretty much do it by hand. But from my personal experience I know that with a really large website, where frequent calibrations might be needed, it’s just so much easier to let a tool help you out.

2. Watch Out for Redirect Chains

This is a common-sense approach to website health.

Ideally, you would be able to avoid having even a single redirect chain on your entire domain.

Honestly, it’s an impossible task for a really large website – 301 and 302 redirects are bound to appear.

But a bunch of those, chained together, definitely hurts your crawl limit, to the point where a search engine’s crawler might simply stop crawling without getting to the page you need indexed.
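A quick way to spot chains in a crawl export is to walk each redirect to its end. The URL map here is hypothetical; in practice you would build it from your crawler’s redirect report:

```python
# Hypothetical redirect map extracted from a crawl: source -> destination
redirects = {
    "/old-page": "/newer-page",
    "/newer-page": "/newest-page",
    "/newest-page": "/final-page",
}

def resolve(url: str, max_hops: int = 10):
    """Follow redirects and count hops; 2+ hops is a chain worth flattening."""
    hops = 0
    while url in redirects and hops < max_hops:
        url = redirects[url]
        hops += 1
    return url, hops

final, hops = resolve("/old-page")
print(final, hops)  # /final-page 3 -> replace the chain with one 301 to /final-page
```

The fix is always the same: point every source URL directly at the final destination with a single 301.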

One or two redirects here and there might not damage you much, but it’s something that everybody needs to take good care of nevertheless.

3. Use HTML Whenever Possible

Now, if we’re talking Google, then it has to be said that its crawler got quite a bit better at crawling JavaScript in particular, but also improved in crawling and indexing Flash and XML.

On the other hand, other search engines aren’t quite there yet.

Because of that, my personal standpoint is, whenever possible, you should stick to HTML.

That way, you’re not hurting your chances with any crawler for sure. – Read more

3 Qualities of the Best SEO Information

In search marketing there are many different ideas of how Google ranks sites. Who do you believe? Nobody can say they 100% know Google’s algorithm. But we can know 100% that some explanations of how Google ranks sites are more trustworthy than others.

Different Levels of Knowledge

Some ideas are more true than others. What makes one idea more trustworthy is the evidence.

However, not all evidence is trustworthy. There are two poor and untrustworthy sources of evidence:

1. Anecdotal evidence

2. Correlation based evidence

Anecdotal SEO Evidence

Anecdotal refers to ideas that are based on the personal experience of one or more people, but without actual research and testing to confirm the idea.

The early years of SEO were dominated by hypotheses created by anecdotal evidence. One of the earliest examples is when affiliate marketers noticed that Google was consistently banning affiliate sites. That led to the hypothesis that Google hated affiliate sites and was actively going after them.

This led to hiding affiliate links by using JavaScript and/or URLs that are redirected through pages that were blocked from crawling with Robots.txt. The idea was to hide affiliate links from Google so that Google wouldn’t ban the site for being an affiliate site.

That’s an example of anecdotal evidence (a group of affiliate marketers noticed they all lost rankings) which then led to the idea that Google was “targeting” affiliate sites.

Of course, they were wrong. Not only have Googlers stated that Google does not treat affiliate sites as lower quality, but there are no research papers or patents showing that Google ever researched algorithms that “targeted” affiliate sites or any specific kind of marketer other than spammers.

There was no factual evidence to support the anecdotal evidence. Factual evidence, in my opinion, is how to distinguish a flimsy opinion from an evidence-based insight.

It doesn’t matter what your favorite SEO guru tells you about Google. It doesn’t matter if that guru is ranked at the top of Google; that doesn’t prove anything. What matters is the factual evidence to support the idea.

Correlation Based Hypotheses

The SEO industry is being exposed to less and less of this kind of SEO information. At one time, a correlation study based on millions of search results resulted in lots of links and attention. But the information was bad.

Just because all the top ranked sites have active social media presence does not mean that social media presence is a ranking factor.

That kind of correlation is especially wrong if there is absolutely no research on that kind of ranking factor by any search engine or university anywhere on earth.

But that’s the kind of correlation nonsense the SEO industry fell for during the mid-2000’s. And for a time many businesses wasted money doing things like trying to attract likes to their Facebook page.

The days when the SEO industry believed the outcomes of correlation studies were a dark period for the industry.

While there are STILL some SEOs who are publishing correlation studies, many SEOs are increasingly skeptical and ignoring them, as well they should.

3 Qualities of the Best SEO Information

In my estimation, there are three levels of SEO knowledge. At the top is canonical information, followed by citation-based knowledge and experience-based knowledge.

1. Canonical SEO Information
Confirmed by Google to be true.

2. Citation Based Knowledge
This is information that is supported by reliable evidence such as patents and research papers.

3. Experience-Based SEO
Professionals who are actively creating websites and ranking them can be considered authoritative. You can’t argue with success, particularly with a person who is actually doing the work and succeeding with it. – Read more

10 reasons why your business website should be mobile optimised

Here is a universal truth which you already know: the sun rises in the east and sets in the west.

Now here are two more (almost) universal truths of the 21st century: mobiles are here to stay, and any business that wants to stay relevant has to think about going mobile. Not surprised, huh? Nothing new, I guess, except that you are yet to optimize your site for mobile. But why, you ask. Give me some good reasons, you say, beyond the fact that a responsive site makes browsing simpler.

Well, here we are to give you not one but ten reasons why mobile optimisation is a must for your business if you want to see it grow in the 21st century.

1. Who is NOT on mobile? (Seriously!)

According to a study by Statista, the number of mobile phone users across the globe was expected to reach 5.07 billion by the end of 2019. More than 80% of all internet users use a smartphone.

The bottom line is that these numbers are only going to climb. It’s high time you started thinking about going mobile.

2. People love to access with mobile

It might be news to you, but since the advent of Responsive Web Design, which means a website adapts its layout to the size of the device it is viewed on, people have gotten used to getting the same features and design on mobile that they get on a desktop or laptop.

Apart from the benefit to your prospective customers, there is a benefit for you, the owner, too. You don’t need to design different websites for different platforms, which used to burn a hole in your pocket. Responsive design means one website for all kinds of platforms and devices.

3. Mobile users buy more

According to various studies, mobile users spend more than their desktop brethren, especially if your business is an ecommerce destination. An Ofcom study from 2017 found that Amazon shoppers access the Amazon app around 74 times a day. That should be a good indicator of the purchase potential that awaits business owners such as you.

4. Google loves mobile responsiveness

Google rolled out mobile-first indexing in 2017, which means Google gives preference to websites that offer a mobile-optimized version of their desktop site. The update also meant that websites failing to meet its standards for mobile responsiveness suffered lower SERP rankings. Any sensible owner of an online business knows there is no point in ignoring Google.

5. Social media recommendations happen on mobile

It is no secret that social media is an integral part of everyone’s lives today. So much so that 91% of mobile internet access is for social activities, according to a report by Microsoft.

Any business that is using social media or spending pounds on online advertising stands a much better chance of getting traction from potential customers via mobile.

Not having a responsive website can send all your content marketing and paid advertising efforts down the drain. If you want to connect with the social audience, you need a website designed to convert those social click-throughs.

6. Carve out your own identity with mobile

According to a recent survey, only 56% of small business websites are responsive, and a large share of those that have implemented responsive design have done a poor job of it. This means there is a great opportunity for you to distinguish yourself from the competition. There is a lot of money lying on the table, and it could be yours. – Read more

The big comings and goings in paid search 2019 that will shape how we market in 2020

Automation, full-funnel campaigns, shoppable ads and privacy fueled PPC changes in 2019.

In 2019, Google shook up mobile search results pages with a redesign that introduced black “Ad” labels to text ads and favicons for organic listings. It also caused a stir in notifying some advertisers it would start handling campaign management for them. Automation continued to be a major theme. This year, it was reflected most prominently in Google’s product announcements aimed at owning the funnel with campaigns that extend across properties. Adjusting to new privacy restrictions and expectations also took on new urgency and will have a significant impact on search marketing in the year to come.

Bing celebrated its first decade. Ten years on, still, Bing’s market share doesn’t rival that of Google and likely never will – but perhaps that’s beside the point now. The newly-branded Microsoft Advertising doesn’t have to carry 90+% of its parent company’s revenues like rivals Facebook and Google’s ad businesses, and it began exhibiting more of an independent streak in 2019 rather than simply aiming to keep up with new Google Ads features (though, it’s still doing that, too).

These were the big things we said goodbye and hello to in paid search this year that will inform our campaigns in 2020.

We said goodbye to:

Average position metric. This old-timer rode off into the late fall sunset. The retirement of average position was more of a process headache than a loss in actionable data. Advertisers had months to shift their bidding strategies, reports and scripts to rely on the new position metrics that Google introduced in late 2018. Frederick Vallaeys of Optmyzr offered a history of the blurring of average position as an informative metric. The new impression share-based position metrics instead better indicate how often your ads appear above the organic listings.

Microsoft also introduced the new position metrics but said it is holding on to average position reporting. For now, anyway.

Accelerated delivery. Search and Shopping campaigns no longer have the option to have their ads serve as early and often as possible until the day ends or their daily budgets deplete. Accelerated delivery was a fan favorite for Shopping and brand campaigns with uncapped budgets, but Google said many still used it with capped daily budgets and that “this method can increase CPCs due to increased competition early in the day, or unintentionally spend most of your budget in earlier time zones.” Now, campaigns are optimized through standard delivery with Google’s algorithms based on the campaign’s goal, bidding strategy as well as contextual signals. – Read more

11 Headline Writing Tips to Drive Traffic & Clicks

Wondering how to write a headline that drives traffic and clicks?

The best headlines:

  • Are extremely relevant to the content
  • Contain a keyword
  • Generate interest

There’s plenty of room to be creative and demonstrate value, right off the bat.

While there’s no exact science to writing a headline, there are useful headline writing tips that will help you whip up brilliant headlines.

Discover 11 ways to write good headlines.

1. Let Keywords Drive You

If you’re writing a piece of evergreen content, always do keyword research to find out what people are actually searching for.

A slight difference in wording can make a huge impact on traffic.

Let’s take this content, for example.

As with all content, I did keyword research beforehand to pinpoint what people are actually searching for.

I narrowed it down to these keyword phrases, based on their monthly search volume:

  • Headline writing tips: 360
  • How to write a headline: 360
  • Good headlines: 390
  • How to write a good headline: 170

By choosing a relevant keyword phrase with strong search volume, I can boost the ROI of the content.

Accordingly, I chose “headline writing tips” as my main keyword (and, of course, I can use the others as supporting keywords).
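The selection step above is easy to sketch: rank the candidates by monthly volume, then let relevance make the final call (which is why a slightly lower-volume phrase can still win). The volumes are the ones from the list above:

```python
volumes = {
    "headline writing tips": 360,
    "how to write a headline": 360,
    "good headlines": 390,
    "how to write a good headline": 170,
}

# Rank by monthly search volume; relevance still gets the final say.
ranked = sorted(volumes.items(), key=lambda kv: kv[1], reverse=True)
for phrase, vol in ranked:
    print(f"{vol:>4}  {phrase}")
```

The runners-up don’t go to waste: they become supporting keywords in the body copy.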

As you can see, the headline on this content is “11 Headline Writing Tips to Drive Traffic & Clicks.”

It’s keyword-rich, relevant and (hopefully) demonstrates value.

2. Come up with Multiple Headlines

If you find yourself with a bout of writer’s block and can’t come up with a headline that really strikes your fancy, try writing a bunch!

The act of brainstorming multiple headlines will really get your creative juices flowing, and you’ll land on something great eventually.

3. Know that Sometimes Short and Sweet is A-OK

Sometimes we need to get straight to the point.

Not every headline needs to be lengthy – sometimes being punchy and straightforward is a better approach, so don’t automatically discount a potential headline just because it’s short.

4. Pull a Quote from the Article

Another option for coming up with a good headline is pulling a quote from the content.

A quote, especially from a celebrity or influencer, can be excellent fodder for a headline.

The quote, of course, should be on-topic with the article as a whole.

Here’s one technical issue to keep in mind: unlike body copy, quotes in headlines should always appear in single quotation marks, according to Associated Press style. – Read more

How to Perform an In-Depth Technical SEO Audit

I’m not going to lie: Conducting an in-depth SEO audit is a major deal.

And, as an SEO consultant, there are few sweeter words than, “Your audit looks great! When can we bring you onboard?”

Even if you haven’t been actively looking for a new gig, knowing your SEO audit nailed it is a huge ego boost.

But are you terrified to start? Is this your first SEO audit? Do you just not know where to begin? Sending a fantastic SEO audit to a potential client puts you in the best possible place.

It’s a rare opportunity for you to organize your processes and rid your potential client of bad habits (cough*unpublishing pages without a 301 redirect*cough) and the crust that accumulates like lint in your dryer.

So take your time. Remember: Your primary goal is to add value to your customer with your site recommendations for both the short-term and the long-term.

Ahead, I’ve put together the need-to-know steps for conducting an SEO audit, and a little insight into the first phase of my process with a new client. It’s broken down into sections below. If you feel you have a good grasp of a particular section, feel free to jump to the next. – Read more

Pro Tip: A look back at some helpful tips from our SEO community in 2019

Here are 5 tips from our SEO community we think are worth a second look.

Our community is dedicated to helping fellow SEOs so we wanted to look back at a few insights shared this past year that were particularly popular with readers.

1. Google doesn’t hate your website

“The personal animosity complaint is as frequent as it is irrational,” explains ex-Googler Kaspar Szymanski. “Google has never demonstrated a dislike of a website and it would make little sense to operate a global business based on personal enmity. The claim that a site does not rank because of a Google feud is easily refuted with an SEO audit that will likely uncover all the technical, content, on- and off-page shortcomings. There are Google penalties, euphemistically referred to as Manual Spam Actions; however, these are not triggered by personal vendettas and can be lifted by submitting a compelling Reconsideration Request. If anything, Google continues to demonstrate indifference towards websites. This includes its own properties, which time and again had been penalized for different transgressions.” MORE >>

2. JavaScript publishers: Here’s how to view rendered HTML pages via desktop

“Many don’t know this but you can use the Rich Results Test to view the rendered HTML based on Googlebot desktop. Once you test a URL, you can view the rendered HTML, page loading issues and a JavaScript console containing warnings and errors,” explains Glenn Gabe of G-Squared Interactive. “And remember, this is the desktop render, not mobile. The tool will show you the user-agent used for rendering was Googlebot desktop. When you want to view the rendered HTML for a page, I would start with Google’s tools. But that doesn’t mean they are the only ways to check rendered pages. Between Chrome, third-party crawling tools and some plugins, you have several more rendering weapons in your SEO arsenal.” MORE >>

3. Missing results in SERP even after using FAQ Schema?

“Google will only show a maximum of three FAQ results on the first page. If you’re using FAQ Schema and ranking in the top 10 but your result isn’t appearing on the first page, then it could be something unrelated,” explains SEO consultant Brodie Clark. “A few possible scenarios include: 1) Google has decided to filter out your result because the query match isn’t relevant enough with the content on your page; 2) The guidelines for implementation are being breached in some form (maybe your content is too promotional in nature); 3) There is a technical issue with your implementation. Use Google’s Rich Results Test and Structured Data Testing Tool to troubleshoot.” MORE >>

4. How to avoid partial rendering issues with service workers

“When I think about service workers, I think about them as a content delivery network running in your web browser,” explains Hamlet Batista of Ranksense and SMX Advanced speaker. “A CDN helps speed up your site by offloading some of the website functionality to the network. One key functionality is caching, but most modern CDNs can do a lot more than that, like resizing/compressing images, blocking attacks, etc. A mini-CDN in your browser is similarly powerful. It can intercept and programmatically cache the content from a progressive web app. One practical use case is that this allows the app to work offline. But what caught my attention was that as service worker operates separate from the main browser thread, it could also be used to offload the processes that slow the page loading (and rendering process) down.” MORE >>

– Read more

How to Improve Page Speed for More Traffic & Conversions

Page speed is a critical factor in digital marketing today. It has a significant impact on:

  • How long visitors stay on your site.
  • How many of them convert into paying customers.
  • How much you pay on a CPC basis in paid search.
  • Where you rank in organic search.

Unfortunately, most websites perform poorly when it comes to page speed, and that has a direct negative impact on their revenue.

There is an almost infinite number of things we can spend our days doing as digital marketers, and there’s never enough time to do them all. As a result, some things get pushed to the back burner.

One of the things that seems to get pushed back most often is optimizing page speed. This is easy to understand: most people don’t truly comprehend the importance of this often-overlooked detail, so they don’t see the value in investing time and money to improve it by a few seconds or less.

What may seem like an inconsequential amount of time to some marketers, including those who focus solely on search engine optimization, has been proven to be monumental by data from industry giants all the way down to our own analytics data.

I’ll assume that you’re like me and you want to maximize your results, and of course, your revenue, right? Then let’s get started in making your website faster than greased snot! (That’s quite a visual, isn’t it?)

1. Ditch the Budget Web Hosting

We’re all trying to save money these days, after all, those subscriptions to Raven, SEMrush, Moz, and all the other tools we use on a daily basis add up quickly. It’s almost like having an extra kid.

One way a lot of people try to save money is by choosing the kind of cheap shared hosting that crams as many websites as they can fit onto a server, much like a bunch of clowns piling into a single car. Performance be damned!

Sure, your website will be available most of the time as it would with most any web host, but it will load so bloody slowly that your visitors will leave frustrated without ever converting into buyers.

“But it’s barely noticeable!” these bargain shoppers insist.

Here’s the thing — it might be barely noticeable to you because it’s your baby and you love it.

But everyone else only wants to get in and get out of your website as quickly as possible.

People want to be on your site for just long enough to do what they came to do, whether that means to get an answer, buy a product, or some other specific objective. If you slow them down even a little bit, they will be likely to hate their experience and leave without converting.

Think about it like this:

Most people love their own kids unconditionally. But someone else’s kid screaming, throwing things, disrupting their night out at a restaurant? They hate that kid. It’s the same with your website.

How Much of a Difference Does It Really Make?

According to a study conducted by Amazon, a difference of just 100ms, a span of time a human can barely perceive, was enough to reduce their sales by 1%. Walmart found similar results.

If that tiny unit of time has that much direct impact on sales, what kind of impact do you think an extra second or more will have?

But it doesn’t stop there because how quickly (or slowly) your website loads also has an impact on organic search ranking and pay-per-click costs.

In other words, if your website loads slowly, you should expect your competitors who have invested in this critical area to eat your lunch.

Bottom line: skip the budget web hosting. If they are selling it like a commodity (based mainly on price) then they’ll treat their customers like a commodity too.

There are a lot of web hosts that are optimized for speed, particularly for WordPress websites, and some of them are priced similarly to the budget options.

So ask around, do some testing, and invest in a web host that will give you the performance to satisfy both your visitors and Google.

2. Reduce HTTP Calls

Every file needed for a webpage to render and function, such as HTML, CSS, JavaScript, images, and fonts, requires a separate HTTP request. The more requests a page makes, the slower it loads.

Now if you’re anything like most of the people I talk to, you’re probably thinking “Oh, I don’t need to worry about that, Jeremy. I know what I’m doing and I don’t add a bunch of bloated garbage into my website!”

That may be partially true. You may not add a bunch of bloated garbage to your website, but on more than 90% of the websites I encounter, the bloat is there anyway.

That bloat isn’t there because the Bloat Fairy snuck it in while you were sleeping. It’s there because a majority of web designers, regardless of skill or experience, don’t make page speed a priority. The sad truth is that most don’t even know how.

Here’s where the problem starts:

Most themes load one or more CSS files and several JavaScript files. Some, such as jQuery or FontAwesome, are usually loaded remotely from another server, which dramatically increases the time it takes a page to load.

This becomes even more problematic when you consider the additional CSS and JavaScript files added by plugins. It’s easy to end up with half a dozen or more HTTP requests just from CSS and JavaScript files alone.

When you factor in all of the images on a page, each of which requires its own HTTP request, things quickly get out of hand. Here are a few ways to cut the number of requests:

  • Merge JavaScript files into one file.
  • Merge CSS files into one file.
  • Reduce or eliminate plugins that load their own JavaScript and/or CSS files. In some cases, as with Gravity Forms, you have the option to disable them from being loaded.
  • Use sprites for frequently used images.
  • Use an icon font like FontAwesome or Ionicons instead of individual image files wherever possible, since an entire icon set loads with a single request.
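As a concrete illustration of the first two bullets, here's a minimal Python sketch that concatenates several stylesheets into a single file so the browser makes one request instead of many. The file names are hypothetical; in practice a build tool or your CMS's optimization plugin would do this (plus minification) for you.

```python
from pathlib import Path

def bundle(files, out_path):
    """Concatenate text assets in order, prefixing each with a source comment."""
    parts = [f"/* --- {name} --- */\n{Path(name).read_text()}" for name in files]
    Path(out_path).write_text("\n".join(parts))

# Hypothetical theme assets; three HTTP requests become one:
# bundle(["reset.css", "theme.css", "plugin.css"], "bundle.css")
```

The same approach works for JavaScript files, with the caveat that scripts must be concatenated in dependency order.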

3. Include the Trailing Slash

Omitting the trailing slash on links pointing to your website, whether from external sources (link building efforts) or from within your own website, has an adverse impact on speed.

Here’s how:

When you visit a URL without the trailing slash, the web server first looks for a file with that name. If no such file exists, it then treats the path as a directory and looks for the default file inside it.

In other words, by omitting the trailing slash, you're forcing the server to execute an unnecessary 301 redirect. While it may seem instantaneous to you, it does take slightly longer, and as we've already established, every little bit adds up.

http://yoursite.com/some-page (this is bad)

or http://www.yoursite.com/some-page (this is also bad)

vs. http://yoursite.com/some-page/ (this is good)

or http://www.yoursite.com/some-page/ (this is also good)
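To catch these links before they cost your visitors a redirect, you can scan the URLs in your content. Here's a minimal Python sketch; the directory-vs-file heuristic is my assumption, and it won't fit sites that deliberately use extensionless page URLs.

```python
from urllib.parse import urlparse

def needs_trailing_slash(url):
    """True if the URL path looks like a directory but lacks a trailing slash."""
    path = urlparse(url).path
    if not path or path.endswith("/"):
        return False
    # Heuristic: a last segment with an extension (.html, .png, ...) is a file.
    return "." not in path.rsplit("/", 1)[-1]

print(needs_trailing_slash("https://example.com/blog"))      # True: triggers a 301
print(needs_trailing_slash("https://example.com/blog/"))     # False
print(needs_trailing_slash("https://example.com/logo.png"))  # False: a real file
```

Run a check like this over your sitemap or internal links periodically, and fix the sources of the offending URLs rather than relying on the redirect.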

– Read more

Nine voice search stats to close out 2019

A look back at some of the year's key voice search and virtual assistant metrics.

From smartphones to smart home appliances, voice and virtual assistants, powered by artificial intelligence, are very much at the center of a shift in the way we interact with digital devices. While voice has not yet lived up to its promise, it's clear it will be an enduring feature of the digital user experience across an expanding array of connected devices.

Mobile = 59% of search

Way back in 2015, Google announced that mobile search had surpassed search query volumes on the desktop. But it never said anything more precise and hasn’t updated the figure. Hitwise, in 2016 and again in 2019, found that mobile search volumes in the aggregate were about 59% of the total, with some verticals considerably higher (e.g., food/restaurants 68%) and others lower (e.g., retail 47%).

This isn’t a voice stat, but it’s important because the bulk of voice-based queries and commands occur on mobile devices rather than the desktop.

Voice on cusp of being first choice for mobile search

According to early 2019 survey data (1,700 U.S. adults) from Perficient Digital, voice is now the number two choice for mobile search, after the mobile browser:

  1. Mobile browser
  2. Voice search
  3. Phone’s search box/window
  4. Search app
  5. Text a friend

However, between 2018 and 2019, voice grew as a favored entry point for mobile search, apparently at the expense of the browser. It could therefore overtake text input as the primary mobile search UI in 2020.

Nearly 50% using voice for web search

Adobe released survey data in July that found 48% of consumers are using voice for “general web searches.” This is not the debunked “50% of searches will be voice by 2020” data point incorrectly attributed to comScore.

The vast majority of respondents (85%) reported using voice to control their smartphones; 39% were using voice on smart speakers, which is a proxy figure for device ownership.

Here are the top use cases for voice usage, predominantly on smartphones:

  1. Directions while driving — 52%
  2. Making a phone call — 51%
  3. Sending a text — 50%
  4. Checking the weather — 49%
  5. Playing music — 49%

Directions a top voice use case

Consistent with the Adobe survey, an April Microsoft report found a more specific hierarchy of “search” use cases on smartphones and smart speakers. Again, however, this is a primarily smartphone-based list:

  1. Searching for a quick fact — 68%
  2. Asking for directions — 65%
  3. Searching for a business — 47%
  4. Researching a product or service — 44%
  5. Making a shopping list — 39%

Crossing the 100 million smart speaker threshold

During 2019 there were multiple reports and estimates that sought to quantify the overall number of smart speakers in the U.S. and global markets. In early 2019, Edison research projected that there were roughly 118 million smart speakers in U.S. homes. However, other analyst firms and surveys found different numbers, typically somewhat lower.

Because people often own more than one smart speaker, the number of individual owners is considerably lower than the number of devices: 65 million or 58 million people, depending on the survey.

Amazon dominating Google in smart speaker market

Amazon, with its low-priced and aggressively marketed Echo Dot, controls roughly 70% to 75% of the U.S. smart speaker market according to analyst reports. In Q3 2019, for example, Amazon shipped 3X as many smart speaker and smart display units as Google.

Analyst firm Canalys argues Amazon’s success is a byproduct of its market-leading direct channel and discounting. Google’s direct and channel sales have so far not been able to keep pace with Amazon’s efforts.

Virtual assistant usage: Siri and Google lead

In contrast to the smart speaker market share figures, virtual assistant usage is a different story. This is because most virtual assistant usage happens on smartphones, and Amazon doesn't have a smartphone platform.

A Microsoft report (in April) found a different market share distribution, with the Google Assistant and Siri tied at 36%, followed by Alexa.

Source: Microsoft (2019)

There are other surveys that suggest Google Assistant’s usage is greater than Siri’s.

58% use voice to find local business information

The connection between mobile and local search is direct. While Google has in the past said that 30% of mobile searches are related to location, there are plenty of indications that the figure is actually higher. Google itself said the number was “a third” of search queries in September 2010 (Eric Schmidt), 40% in May 2011 (Marissa Mayer) and, possibly, 46% in October 2018.

Asking for driving directions is not always an indication of a commercial intent to go somewhere and buy something. But as the Adobe and Microsoft surveys indicate, it’s a primary virtual assistant/voice search use case. A voice search survey conducted in 2018 by BrightLocal also found:

  • 58% of U.S. consumers had done a local business search by voice on a smartphone
  • 74% of those voice search users (the 58% above) use voice to search for local businesses at least weekly
  • 76% of voice search users search on smart speakers for local businesses at least once a week, with the majority doing so daily

– Read more