Why does a fast page load time matter?

Most people who work day-to-day in digital marketing know that the inner workings of Google's ranking algorithm are largely a mystery. Whether or not your webpage will appear on the SERPs is decided by a mind-boggling 200+ factors.

Thankfully for us here at Mackerel Media, there are a number of ranking factors under our control that the industry (almost) always agrees can make a difference to how your site will be ranked by Google. It’s likely that you’ve already optimised your site for keywords or authority-giving links, but have you thought about how important page speed is on today’s fast-paced, on-demand and mobile-focused internet?

Web Pages are Bigger than Ever

According to a recent study, the average page served today is 3.5 times larger than in 2010 – a whopping 2.5MB of code, advertising, images, videos and other rich media that needs to be loaded onto your device before you can watch the latest viral cat video or discover which type of meatball you should be on Buzzfeed.

And while slow load times can be infuriating on desktop computers, on mobile devices anything less than an instant page load can actually be damaging. Google itself shared data with the industry suggesting that 75% of mobile users will abandon a webpage if it takes longer than 5 seconds to load, and 79% of those dissatisfied visitors won’t visit the site again. Ad blockers have been on the rise in no small part due to this particular problem, but that’s a blog for another day!

A Lesson from Google

Google certainly learnt this the hard way. During a Web 2.0 conference in 2006, Marissa Mayer highlighted that a 0.5-second increase in SERP load time resulted in a 20% decrease in traffic. That’s right, half a second was enough for the world’s most visited webpage to lose a fifth of its traffic.

If that wasn’t enough of a reason to make sure your webpage loads as fast as possible, then let me introduce you to the Gap of Death theory. This rightfully scary-sounding concept is the name given to the gap between a user’s load-time expectations and the actual time a page takes to load. For each additional second a user waits, you can expect your conversion rate to drop by 7%.

A 7% loss in conversions per second is a nightmare scenario for any business, especially one that relies on an e-commerce website. Amazon.com techie Greg Linden blogged about the results of internal A/B testing that suggested even very small delays can result in substantial and costly drops in revenue.
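To put that 7% figure in context, here’s a rough back-of-the-envelope sketch in Python. The traffic, conversion rate and order value below are entirely made up for illustration – plug in your own numbers to see what an extra second or two might cost you.

```python
# Rough, illustrative calculation of what a 7% relative conversion drop per
# extra second of load time could cost. All of the numbers here are made up.
monthly_visitors = 50_000
baseline_conversion_rate = 0.02      # 2% of visitors convert
average_order_value = 80.0           # pounds per conversion
conversion_drop_per_second = 0.07    # 7% relative drop per extra second

for extra_seconds in range(0, 4):
    rate = baseline_conversion_rate * (1 - conversion_drop_per_second) ** extra_seconds
    revenue = monthly_visitors * rate * average_order_value
    print(f"{extra_seconds}s slower: {rate:.3%} conversion rate, £{revenue:,.0f}/month")
```

Even with these modest, invented figures, three extra seconds wipes out well over £15,000 of monthly revenue.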

The Paid Search Contagion

While the effects of page speed are intricately tied to your organic performance, a slow load time can also affect your paid digital marketing efforts too… sorry. Right here in the AdWords support documents, we can see that “landing page experience” and specifically “landing page load time” can have an effect on your overall Ad Rank – and in turn your average advertising costs and ad position.

Now that we know the damage a slow-loading page can cause to a business, what can we do to increase page speed?

With desktop, the answer is to keep an eye out for bottlenecks and reduce their impact – bloated web pages, poor hosting services, 301 redirect chains, analytics code, slow widgets and plug-ins, or incompatible media. If you suspect that your webpage load time is too high, these are the usual suspects.
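One of those usual suspects – redirect chains – is easy to check for yourself. Here’s a minimal sketch in Python that follows a URL and prints each hop along the way; it assumes the third-party requests library, and the URL is purely illustrative.

```python
# Minimal sketch: follow a URL's redirect chain and report each hop.
# Assumes the third-party "requests" library; the example URL is hypothetical.
import requests

def redirect_chain(url):
    response = requests.get(url, allow_redirects=True, timeout=10)
    # response.history holds each intermediate redirect response in order
    hops = [(r.status_code, r.url) for r in response.history]
    hops.append((response.status_code, response.url))
    return hops

if __name__ == "__main__":
    for status, hop in redirect_chain("http://www.example.com/old-page"):
        print(status, hop)
```

If you see more than one 301 before the final 200, every extra hop is adding avoidable latency to the page load.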

AMP to the Rescue?

Mobile sites have a smarter answer – Accelerated Mobile Pages, or AMP. The AMP Project is designed to give users a great mobile experience no matter which platform their mobile device is loading web pages from. In essence, AMP pages have found success by streamlining the HTML that powers a web page and pre-loading as much content as possible using standardised formatting before the user ever clicks on a link. The result is a page that appears to load instantly.

Unsurprisingly, Google is a huge fan of AMP pages – it even lets web developers flag the AMP version of a page so it can be discovered from the SERPs, via a rel="amphtml" link tag. However, whilst take-up has been quick and widespread, some in the industry who have worked with or followed the tool for a while have stopped short of unqualified praise. One commentator reported a bug in AMP that apparently risks inflating unique user counts four-fold, which would cause enormous issues for major publishers. Others were concerned that AMP pages ‘masked’ the publisher’s URL, making it harder for users to identify the publisher or share a link to the article, but to its credit Google responded by changing AMP’s behaviour to ameliorate the concern.
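If you’re curious whether a given page advertises an AMP version, that link tag is simple to look for. Here’s a minimal sketch in Python – the requests library is assumed, the URL is hypothetical, and the regular expression is deliberately naive (a production tool would use a proper HTML parser).

```python
# Minimal sketch: check whether a page advertises an AMP version via its
# <link rel="amphtml"> tag. Assumes the third-party "requests" library;
# the example URL is hypothetical. The regex is naive and expects the
# rel attribute to appear before href.
import re
import requests

def find_amp_url(page_url):
    html = requests.get(page_url, timeout=10).text
    match = re.search(r'<link[^>]+rel=["\']amphtml["\'][^>]+href=["\']([^"\']+)', html)
    return match.group(1) if match else None

if __name__ == "__main__":
    print(find_amp_url("https://www.example.com/article"))
```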

So what does all of this information teach us? Simply put, slow-loading web pages can reduce the likelihood of a user ever reaching your site, and massively decrease the chances of them completing one of your conversion goals if they do.

In the end, how well your online business channels perform depends in no small part on how quickly your pages load. As Marissa Mayer said in her 2006 Web 2.0 talk, “Users really respond to speed.”

SEO Is Not Dead

You’ve heard the rumours. You’ve seen the think-pieces discussing a year-on-year drop in organic traffic and how native mobile apps hoover up all ‘traditional’ searches. It all points to one thing, they claim – the end of Search Engine Optimisation.

If you work in digital marketing however, you know the truth: SEO is far from dead. Now, perhaps more than ever, the fundamental principles of search engine optimisation are crucial to any successful digital marketing strategy.

But how can we prove it?

At the end of February, one of Mackerel Media’s clients came to us with a problem. Despite being a well-known, large, Scotland-based firm, their organic search listings weren’t appearing on the first page – what could we do to help them?

Now anybody will tell you that, unlike the relatively fast gains that can be made through PPC marketing, search engine optimisation is about playing the long game – but there are some quick SEO fixes that can make a real difference in a reasonably short period of time.

Our initial investigation into the client’s website revealed, among other SEO quick-fixes, that most of the firm’s pages were missing H1 headings. The H1 heading element of a page has long been a stalwart of on-page search engine optimisation – but opinion has been divided over just how much influence it has on your final position on the SERP. However, with everything else seemingly set up correctly, the missing H1s seemed a logical explanation for the underperforming pages. The next step was to get creative…

Working closely with the client, Mackerel Media was able to introduce around sixty new H1s to pages across the client’s site – ranging from sector-specific keywords to high-volume sector terms – retroactively adding them to complement the existing site content. In total, all of the H1s were added in less than a day.
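For anyone wondering how to spot the same issue on their own site, here’s a minimal sketch in Python that flags pages without an H1 heading. The requests library is assumed, the URLs are hypothetical, and the check is deliberately simple – a full audit would crawl the site and use a proper HTML parser.

```python
# Minimal sketch: flag pages that appear to be missing an H1 heading.
# Assumes the third-party "requests" library; the URLs are hypothetical.
import re
import requests

PAGES = [
    "https://www.example.com/",
    "https://www.example.com/services/",
    "https://www.example.com/contact/",
]

def has_h1(url):
    html = requests.get(url, timeout=10).text
    # A naive check; a production audit would use a proper HTML parser.
    return re.search(r"<h1[\s>]", html, re.IGNORECASE) is not None

if __name__ == "__main__":
    for page in PAGES:
        if not has_h1(page):
            print("Missing H1:", page)
```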

As a result, search positions have improved dramatically, with the number of our client’s pages ranking on the first page of Google up a massive 40.74% from last month. What’s more, the number of pages which now rank in first place for our client’s keywords has increased by 52.63%.

In just one month, Mackerel Media’s client saw a 22.72% increase in total pages ranked simply thanks to some in-depth investigation and the introduction of H1 headings. If SEO truly is dead, then we’re yet to see any evidence.

And while this is a single example, it is far from an isolated case. Every day here at Mackerel Media we see the effects of a well-structured, planned and executed SEO strategy. Whether a site requires a technical overhaul, on-site content improvements or an off-site outreach programme, each change you make will have an effect on how Google, and ultimately the internet as a whole, will rank your page.

So next time somebody tells you SEO is dead, remember: the right changes can make a huge difference to where your site ranks amongst its competitors… who are almost certainly all making SEO improvements as well.

Google Strips it Back to Only 3 Local Results, Losing 4

First it was ten, then it was seven, and as of today, it’s three. Google has taken a scythe to the Local results pack in its SERPs and appears now to show only three results.

Anyone who operates in the local search space will be concerned by this change, as it marks a major reduction in the visibility available on SERPs. The screenshot below shows a local result from Scotland for [glasgow distillery] in which the three-pack is clearly visible:

Google Local Results showing 3 listings for [glasgow distillery]

Why is Google doing this? Simply put, we don’t know for sure. We do know that extensive testing has been taking place on the layout of these results, and the new layout does (to a certain degree) resemble the one shown on tablet and mobile devices, so perhaps it is all being done in the name of a more unified user experience – one that encourages more users to click (or tap) through to the extended listings.

There’s also an argument that limiting the available places increases competitiveness and will encourage local businesses who are pushed out of the three-pack to place ads to recover lost visibility and traffic.

Alongside this, there is of course the ongoing confusion around Google Local listings: how they are organised and ranked, and how exactly organisations can improve their positioning and gain more traffic. Looking at the three results in the map above tells us little about how results are selected other than by geography – two lack reviews entirely and aren’t even distilleries, unless of course Blythswood Square has changed considerably since I was there last week.

Time will tell.

(not provided) Providing More Stress – Chrome 25

There are some in the SEO business who feel as if their trade is under constant threat from Google, and nothing has contributed more to that feeling than the increasing proportion of (not provided) traffic in organic search keyword reports. With the upcoming version 25 of Google’s Chrome browser, it’s about to get a lot worse.

For readers unfamiliar with (not provided), in October 2011 Google started to encrypt the referral data of anyone who clicked on a link on a Google search results page while logged in to a Google service. The keyword the user entered into Google was no longer passed over to Analytics – it was ‘not provided’ – so there was no way of knowing what search had brought that user to a site. For example, we might be interested to learn how many people come to this site having searched for ‘mackerel media’, but we would see (not provided) in our Analytics reports – at least for a certain proportion of users (around 45% in our case).

When the change was announced it was suggested that a maximum of 10% of all keyword data would be lost to (not provided) but the reality has been very different, with some web site owners suggesting figures as high as 70% or 80%. Clearly, when you have no idea what keyword terms brought the vast majority of visitors to your site, detailed keyword analysis becomes somewhat hard. Google has extended the functionality of its Webmaster Tools to provide some keyword data, but it’s not a patch on Analytics functionality.

This brings us to Google Chrome version 25, currently still in development but raring to go. With this new version, all searches performed via the omnibox – the wide box at the top of the browser where you type the URL of the site you want to visit or the keyword you want to search for – will be encrypted, meaning the organic keyword data for those searches will be lost. We don’t know what proportion of Chrome users search via the omnibox, but it’s probably safe to assume that the overwhelming majority do. With Chrome’s large market share it looks, therefore, as if organic keyword data is about to suffer something of a mortal blow.

Am I exaggerating by saying “mortal blow”? I’d like to think not but, in truth, probably a little. The days of organic keyword data are numbered – at least in Analytics packages – so webmasters will need to look to the more limited functionality available in Google Webmaster Tools. Firefox already encrypts all Google searches. Safari on iOS 6 obscures them in a different way. The Google Chrome blog sums it up neatly by saying:

Search has also been moving toward encryption.

Keyword data is still available if you are paying Google for AdWords clicks, so if you’re willing to pay Google for your traffic – as many, many businesses and people are – then you will still know what your users searched for before arriving at your site.

Google Hotels Search Appears in Global SERPs

If you’ve been searching for hotels on Google recently you might have noticed that a Google Hotels Search widget is taking pride of place above the organic search results, immediately below the AdWords top slot. Needless to say this seems to strike right at the heart of hotel booking web sites such as Lastminute.com, LateRooms and the recently acquired Kayak.com.

Google has been testing and playing around with its Hotel Search product for quite some time, and we’ve seen it appearing sporadically, but this marks the first real global roll-out – which does tend to suggest it’s not just a test but a genuine change, and rather a dramatic one.

If you search for a city + hotels term such as [edinburgh hotels], [bangkok hotels] or [hotels in vancouver] you’ll see Google’s ‘Hotel Finder’ widget very prominently displayed, taking up what would have been the first organic search result. In all likelihood this will have a pretty severe impact on the sites that did hold that coveted first position, reducing their organic traffic. You’ll see the widget in the screenshot below.

Google Hotel Finder widget sitting pretty in the top organic SERPs slot.

Whether this change will stick and whether users will actually use Google’s tool remains to be seen, but it’s a clear declaration of intent.

Google’s Hotel advertising model has been in development for some time and intriguing partnerships have developed, such as the one with Pegasus and its Open Hospitality application, which allows participating hotels to automatically have their pricing information and ads displayed.

Another day, another major Google development!

Is Your Content Hidden in Search Results?

As web sites become increasingly dynamic, and the content they publish grows in tandem, the risk increases that site owners are inadvertently hiding their content behind un-indexable site search tools. Whilst this might not sound like much of a problem, if you are trying to gain traction with search engines and rank in competitive markets, effectively blocking off a large portion of your content could seriously inhibit your efforts.

In the vast majority of cases the problem arises when well-meaning developers implement functionality that should, in theory, help users reach their goals by making it more efficient to search or navigate a site, but in reality causes problems because of the way the tools are constructed, acting as a block to search engines. If the information architecture of a site isn’t perfect, or doesn’t provide a browsable route to all of its content, then content can be blocked or hidden away in the bowels of the site, never to be seen again.

An Example

Take, for example, the Association of British Insurers, the body charged with representing insurers in the UK. Given the high winds we’re currently experiencing in Scotland and the troublesome ash cloud drifting over from Iceland, they probably have their work cut out at the moment and are likely receiving more than their usual levels of attention, so they make a good subject for a brief case study.

Suppose we’re interested in finding Scottish insurance companies, so we type ‘Scottish Friendly’ into the search box. We would see this page:

http://www.abi.org.uk/MemberSearchResults.aspx?searchQuery=SCOTTISH%20FRIENDLY

That’s a normal search results page, and not one you’d usually expect to see published to the web unless search engines had gone to the trouble of using the site’s search tool themselves.

However, if we look a little more closely at the main members’ database, we see a few interesting ‘features’:

  • Filter results by: allows you to filter the results by the first letter of the company’s name, but it does so in an entirely search-engine-unfriendly manner (it uses a script), which means the content can’t be indexed.
  • Next >> allows you to move to the next page of results, but again the subsequent pages are accessed via a script and can’t be indexed.

Neither of these is a particular problem if we are looking solely at links off a rarely used search results page, but given that the main members’ database relies on exactly the same filtering technology, in effect it means almost none of the content is visible to search engines.
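A quick way to sanity-check this kind of issue is to list the ordinary anchor links a crawler could actually follow on a page – navigation that only exists behind scripts or form posts simply won’t show up. Here’s a minimal sketch in Python; the requests library is assumed and the URL is illustrative rather than the ABI’s real address.

```python
# Minimal sketch: list the plain <a href="..."> links on a page.
# Links that only exist behind scripts or form posts won't appear here,
# which is a quick hint that a crawler may not be able to reach them.
# Assumes the third-party "requests" library; the URL is illustrative.
from html.parser import HTMLParser
import requests

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and not href.lower().startswith("javascript:"):
                self.links.append(href)

if __name__ == "__main__":
    html = requests.get("http://www.example.com/members", timeout=10).text
    collector = LinkCollector()
    collector.feed(html)
    for link in collector.links:
        print(link)
```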

This problem could be solved with some technical changes but as always, it’s better for the problem never to arise in the first place, and that comes from thorough SEO planning and strategy from the outset.

Google Farmer/Panda & Unique, Quality Content – A Case Study

The search marketing industry has been furiously debating the pros, cons, impact and implications of Google’s recently stated plans to improve the quality of its search results by weeding out sites that rely on publishing poor-quality content. Some site owners have been up in arms, some have been ecstatic and some have been fairly ambivalent to the whole thing, showing that the results of the update are anything but clear-cut.

Whilst at the time of writing the update has not been rolled out in the UK, from our perspective the ‘new’ advice is very much in line with the advice we’ve always given to clients: offer good products or services, write good content, offer something unique and useful to users, market your site in the right places, and keep on developing it. We thought it would be useful to share some data from a recent project that might go some way to showing just how effective good quality, unique content can be at gaining ranking positions and beating the competition.

The Essential Background

  • The site is brand new
  • It operates in a highly competitive sector
  • A focus was placed on writing a large amount of totally unique content

The Results

The graph below shows the rise in organic search traffic during the weeks after the launch of the site, denoted by the arrow towards the left-hand side. Whilst we can’t share exact figures, what we can say is:

  • Over 15,000 different keywords brought traffic to the site
  • The maximum recorded increase in traffic is over 1,000%
  • Google brought the most traffic, by a factor of more than 200

What Can We Conclude?

This clearly isn’t a particularly scientific study, but what it does show is that good quality, unique content – well targeted and well optimised – can bring in significant amounts of traffic in a relatively short period of time. It shows that unique content is rewarded with high rankings, and it shows that once ‘accepted’ as being of high quality, the site’s traffic can grow substantially. The positive impact of good quality content is clear.

What remains to be seen is whether the relatively young age of the site and its limited backlink profile will count against it once the Farmer update hits the UK, or whether it will continue to attract traffic in similar volumes. The answer will of course be to keep developing the content and keep marketing it to develop good quality links, both of which are happening.

What Do You Think?

What have you experienced like this? Good? Bad? Indifferent? Let us know in the comments below!

Site Speed Now a Factor for Google Ranking

It’s official – Google is now using site speed as a factor in determining where to rank pages in its organic search engine results pages, following on from speculation that arose in November 2009 after a now-infamous Matt Cutts hint.

Google has been using site speed as a ranking factor in its AdWords model for some time, but speed is now one of the 200 or so factors taken into account when ranking a page in the organic listings. This marks a fairly fundamental change, and makes it ever more challenging for web developers, webmasters and search engine optimisers to achieve strong, high-ranking positions. The knock-on effects will also be felt at hosting companies (particularly those that rely on the high-volume, low-service model) as more pressure is put on them to deliver web pages quickly – something that depends largely on the quality of code produced by programmers, but also on the hardware provided, internal infrastructure, available bandwidth and connections. Suffice to say, those who invest in high-quality infrastructure and fast connections will feel the benefit as clients migrate towards them, as will developers who take the time to produce efficient, clean code that doesn’t get in the way of the user experience.

The team at Google are keen to stress that the impact is likely to be minimal, saying:

While site speed is a new signal, it doesn’t carry as much weight as the relevance of a page. Currently, fewer than 1% of search queries are affected by the site speed signal in our implementation and the signal for site speed only applies for visitors searching in English on Google.com at this point. We launched this change a few weeks back after rigorous testing. If you haven’t seen much change to your site rankings, then this site speed change possibly did not impact your site.

Speed has long been of interest to Google as they have carried out extensive testing on the impact of speed on how users interact with their search tools. In short, a speedier site means better click-through rates, higher levels of engagement and a generally higher degree of satisfaction.

SEO Course in Edinburgh – December 3rd 2009

Following on from a seminar conducted a few months ago, I’m delivering a day-long training course on SEO at Netresources in Edinburgh.

The course is designed for attendees who are new to SEO and will give them a grounding in a variety of principles and techniques that can be used to improve and optimise a web site. Some of the topics being covered are:

  • Semantic HTML
  • Web Development Techniques for SEO
  • Researching & Planning SEO
  • Writing effective, optimised content
  • Building Links
  • Techniques to Avoid

The course takes place on Thursday the 3rd of December, runs from 10am to 4pm, includes lunch and costs £300 + VAT per person. Note that the course is ILA approved.

For more details on the day and to book a place, head over to the Netresources site or call them on 0131 477 7127.

There are just 8 places available so if you’re interested you’d best book your place as soon as you can.

Hope to see you there!

– Nick

The Official Death of Keywords

At long last, Google this week confirmed what most of us in the industry knew already – meta-keywords are not used in organic search rankings. No real surprise there, to be honest! What those who still sell meta-keyword optimisation as a service will do now, we have no idea.

The announcement was made in a post on the Google Webmaster blog the other day and has triggered the speculation and discussion that surrounds Google’s regular dissemination of information, most of it involving the usual picking apart of the statement in search of hidden meaning. Perhaps Dan Brown’s latest book is stirring up the SEO industry in its week of release?

Our View on Meta-Keywords

Our view on meta-keywords has always been simple and straightforward: don’t bother. We have seen no difference whatsoever in ranking ability between two otherwise comparable pages, one with meta-keywords and one without. Likewise, we have never seen any impact on rankings come about as a result of meta-keyword tweaking.

Meta-Descriptions Make Sense

The post also mentions that Google has not used the meta-description field for ranking purposes for a number of years. However, our experience is that the description field can still aid matters, as it plays a crucial role in determining the click-through rate of a listing on a search engine results page (SERP). Our view is that the higher the CTR on a link, the likelier the associated page is to rank more highly. Furthermore, the higher the click-through rate of a page, the more traffic you’re likely to attract, so a well optimised and well written description can really make a difference.

If you don’t include a meta-description then Google will select a snippet of text from your site, which might not work as well as a bespoke one. We use the meta-description field to display the Mackerel Media phone number on the SERPs, as you’ll be able to see here. Neat, eh?
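If you want to check whether your own pages are missing descriptions, a few lines of Python will do the job. This is a minimal sketch – the requests library is assumed, the URLs are hypothetical, and the regular expression is deliberately simple, so a proper HTML parser would be a sensible upgrade for a real audit.

```python
# Minimal sketch: flag pages with a missing or empty meta description.
# Assumes the third-party "requests" library; the URLs are hypothetical.
# The regex is naive and expects the name attribute to appear before content.
import re
import requests

PAGES = [
    "https://www.example.com/",
    "https://www.example.com/services/",
]

META_DESC = re.compile(
    r'<meta[^>]+name=["\']description["\'][^>]*content=["\']([^"\']*)["\']',
    re.IGNORECASE,
)

if __name__ == "__main__":
    for page in PAGES:
        html = requests.get(page, timeout=10).text
        match = META_DESC.search(html)
        if not match or not match.group(1).strip():
            print("Missing or empty meta description:", page)
```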

Looking to the future, we can only hope that now that Google has cleared up keywords, it’ll move swiftly on to link-spam blogs, dubious paid-link vendors and all the other fun issues that keep us busy. Hmm… perhaps we’re a little too optimistic.