September 11, 2013

Finding another string for your bow.


When I originally wrote this post it turned into a bit of a rant about what Google was doing with our precious keyword information. But I wanted to remain objective about it, rather than negative, so I started again.

Combined Natural Search

The first announcement from Google that they’d be “hiding” keywords from marketers – all in the guise of protecting our privacy from those evil network admins and shared wifi networks – said it would impact only about 5% of our traffic. Unfortunately, it’s looking more and more likely that we’ll eventually end up with no keyword data at all from natural search (at least for visits that come from Google).

On the rise

While the proportion of unavailable keywords has been rising steadily month to month, over the last month or so it has suddenly taken a massive leap.

Prompted by a huge rise for one of our clients, we reviewed data from a number of different industry verticals to see how the percentage of “Keyword Unavailable” has grown over the last nine months. The pattern is the same across pretty much all verticals. Google has clearly changed something else, and it would appear that the majority of searches from Google are now being done over SSL.

The SSL itself never stopped you from seeing the search term – what Google did was remove the term from the referrer string altogether. But what we’re seeing lately is a large rise in searches done on their secure site, which is now pushing Keyword Unavailable, or Not Provided, past the 60% mark. In fact, some clients are now seeing in excess of 70% of search terms not provided.

I can certainly understand the desire to secure our internet activity from prying eyes – I think it’s a good thing – and I can see companies moving further in this direction until eventually everything is done this way; a number of social sites have recently adopted secure connections as the default.

Unfortunately, it would also appear that many marketers are not that fussed either; I would have thought the whole world of marketers would have cried foul and forced Google to revert…but that’s probably another story.

Maybe this points to the fact that some marketers are not using this information in their planning cycles…

So, what strategies can we use now?

Just because you don’t necessarily know what visitors searched for doesn’t mean you shouldn’t care about content performance. And it doesn’t mean you shouldn’t be trying to infer intent.

Let’s assume that we lose sight of search terms entirely. That puts natural search in the same boat as most other traffic sources, with the partial exception of paid search. With paid, you still have your keyword lists, even though you won’t get the actual term someone searched for. And of course, you can still use Google Webmaster Tools to approximate traffic by search query. Granted, a few of the other traffic sources let you infer intent too – if visitors come through from a banner, there’s the message; if they come through from an EDM (electronic direct mail), there’s also the message.

Entry page performance

We still have a lot of data being measured, and we can focus our efforts on using it in ways that we might have forgotten. One of those is entry pages; entry pages help to infer intent. But I’m not talking about just the standard entry page metrics – I’m talking about entry page performance. By performance I mean ‘how well does the entry page perform in terms of captivating the user and driving the behaviour that you’re looking for, whether that be purchase or engagement?’ And ‘how many of the people coming through it as an entry page go on to purchase, sign up, or read more content?’
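If you can pull a visit-level export out of your analytics tool, that second question is a simple roll-up. Here’s a minimal sketch in Python, assuming a table with one row per visit – the column names and the example pages are made up for illustration, not an actual SiteCatalyst export:

```python
import pandas as pd

# Hypothetical visit-level export - one row per visit, with the page the
# visit entered on and flags for the behaviours we care about. The column
# names are assumptions for illustration, not a real SiteCatalyst schema.
visits = pd.DataFrame({
    "visit_id":     [1, 2, 3, 4, 5, 6],
    "entry_page":   ["/home", "/guides/bows", "/guides/bows",
                     "/home", "/shop/strings", "/shop/strings"],
    "purchased":    [0, 0, 1, 0, 1, 1],
    "signed_up":    [0, 1, 0, 0, 0, 0],
    "pages_viewed": [1, 4, 6, 2, 3, 5],
})

# Entry page performance: of the visits that entered on each page, how many
# went on to purchase, sign up, or read more content?
performance = (
    visits.groupby("entry_page")
          .agg(visits=("visit_id", "count"),
               purchase_rate=("purchased", "mean"),
               signup_rate=("signed_up", "mean"),
               avg_depth=("pages_viewed", "mean"))
          .sort_values("purchase_rate", ascending=False)
)
print(performance)
```

Ranked that way, the entry pages that attract plenty of visits but fail to drive the behaviour you want stand out very quickly.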

Looking at performance:

  • Use participation metrics in SiteCatalyst to understand performance.
  • Use a combination of bounce rate, visit depth, time on site and new/repeat visitor metrics to help determine whether the page is any good at converting.
  • Entry page pathing – where does the visitor go once they’ve become enamoured with your content?
  • You can use a modified engagement index to help determine performance – focus the index on landing page or entry page performance (there’s a rough sketch below).
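On that last point, here’s one way a modified engagement index might be put together once you’ve exported per-entry-page metrics – a minimal sketch, where both the metric mix and the weights are arbitrary assumptions you’d tune to your own site:

```python
import pandas as pd

# Hypothetical per-entry-page metrics; the figures and column names are
# assumptions for illustration.
entry_pages = pd.DataFrame({
    "entry_page":       ["/home", "/guides/bows", "/shop/strings"],
    "bounce_rate":      [0.62, 0.35, 0.28],   # lower is better
    "avg_visit_depth":  [2.1, 4.8, 3.9],      # pages per visit
    "avg_time_on_site": [55, 210, 180],       # seconds
    "conversion_rate":  [0.01, 0.04, 0.09],
}).set_index("entry_page")

# Normalise each metric to a 0-1 scale so the weights are comparable.
norm = (entry_pages - entry_pages.min()) / (entry_pages.max() - entry_pages.min())
norm["bounce_rate"] = 1 - norm["bounce_rate"]   # invert: low bounce is good

# Arbitrary illustrative weights - conversion counts most, bounce least.
weights = {"bounce_rate": 0.15, "avg_visit_depth": 0.25,
           "avg_time_on_site": 0.20, "conversion_rate": 0.40}

entry_pages["engagement_index"] = sum(norm[metric] * weight
                                      for metric, weight in weights.items())
print(entry_pages.sort_values("engagement_index", ascending=False))
```

The point isn’t the particular weights – it’s that the index gives you a single, comparable number for how well each entry page is doing its job.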

Of course, you’re using segmentation by default now, so you’ll be comparing customers with prospects to see if there are differences there. You’ll be looking at different entry pages, visit depths and so forth to make sure you’re still getting value.

You should also be using your internal search terms, for a couple of reasons. Firstly, they tell you what people are looking for when they can’t find it on your site quickly (the terms used and the number of searches). Secondly, they’ll tell you what results you’re lacking (searches that return zero results).
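Both of those answers fall out of a simple roll-up of the internal search log – something along these lines, where again the column names are assumptions about whatever your tool exports:

```python
import pandas as pd

# Hypothetical internal search log - one row per search, with the term used
# and how many results came back. Column names are assumed for illustration.
searches = pd.DataFrame({
    "term":        ["recurve bow", "bow string", "bow string",
                    "arrow rest", "left handed bow"],
    "num_results": [14, 6, 6, 0, 0],
})

# 1) What are people looking for, and how often?
top_terms = searches["term"].value_counts()

# 2) What are you lacking? Terms that return zero results point at content
#    or products your site doesn't offer (or doesn't surface).
zero_result_terms = searches.loc[searches["num_results"] == 0, "term"].value_counts()

print(top_terms)
print(zero_result_terms)
```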

Your bow is not broken.

So, just because Google sneaked in and whipped the string off your bow doesn’t mean that you’re unarmed. There are other ways to string your bow.
