July 12, 2016

Measuring personalised experiences


Personalisation is all the rage lately. It seems everyone and their vendor is on the personalisation bandwagon, hell-bent on providing a more relevant experience for our audiences. And I fully support that. But there are two aspects that tend to be forgotten in the rush forward, and they will impact your ability to tell your boss that it’s working.

The first is the strategy. I’m going to save this for another post, but suffice it to say, personalisation is not a technology problem, nor is it a data problem, nor is it a content problem. It is an assignment problem. It’s the business rules, it’s the “why” we want to personalise, it’s the “to what end” that generally causes the most problems. It gets big, massive, huge, in a heartbeat, and this is where many companies struggle to keep it going. It’s easy for it to end up on the “too hard for today” pile or the “I’ll look at that tomorrow” pile. But that’s not what this post is about – that’ll be another one.

No, today’s post is about measuring the effect of personalisation.

You’ve just developed all of these variations in content; you’ve defined all the rules for targeting the content; you’ve ascertained the end goal of this targeted content, and then your boss asks: how are we going to measure this to make sure it’s working?

Personalisation measurement is a little different.

The normal way we measure stuff is page-based, or click-based. The page was served, we capture a bunch of stuff about the page, the visitor, and so on. But we generally don’t capture what the “experience” was. And note that I refer to it as the experience. All of our vendors are talking to us about Experience Management – so we should be measuring the Experience.

And the “experience” comes down to the targeted variations of content that are displayed on a given page, to a given visitor.

So, for example (and I’m going to use the most overused Retail example) imagine your homepage has three variations:

  • Variation 1 – Unknown visitor, first visit, general mix of products and promotions.
  • Variation 2 – Female, repeat visit, products shown skew towards female, hero product for the female category.
  • Variation 3 – Male, repeat visit, products shown skew towards male, hero product for the male category.

Ok, so that’s a pretty easy and pretty generic example.
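
To make it a little more concrete, here’s a minimal sketch of how those variations and their targeting segments might be represented as data. The type and field names (VisitorSegment, ContentVariation, the variation IDs) are hypothetical, not from any particular vendor’s API:

```typescript
// Hypothetical representation of the homepage variations above.
// All names and IDs are illustrative only.
type VisitorSegment = "unknown-first-visit" | "female-repeat" | "male-repeat";

interface ContentVariation {
  id: string;              // the ID you'll capture in analytics later
  segment: VisitorSegment; // which visitor type this variation targets
  description: string;     // what the visitor actually sees
}

const homepageVariations: ContentVariation[] = [
  {
    id: "home-v1",
    segment: "unknown-first-visit",
    description: "General mix of products and promotions",
  },
  {
    id: "home-v2",
    segment: "female-repeat",
    description: "Products skew female, hero product for the female category",
  },
  {
    id: "home-v3",
    segment: "male-repeat",
    description: "Products skew male, hero product for the male category",
  },
];
```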

To measure success, or improvement, page-based measurement no longer really applies. Sure, you’ll still need to know general things like visits, page views, bounces, time spent, channels, and all the rest of your normal metrics. But that won’t help you determine whether all your hard work to improve relevancy is actually paying off.

To measure experiences, you’ll need to go about it a different way:

  • measure the page they were on
  • keep track of the content they saw, and
  • capture the type of visitor they are

The page they were on gives you all of the normal metrics you’re used to.

The content variations should be captured in a meaningful way. Nowadays, you’ll probably use a data layer to hold all of the variations displayed to this specific visitor – this is the bit that enables you to see what content works, on what pages, for different visitors. Capture that little lot into a listVar, and set a specific success event against it, one that will track the number of times this content was served.
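
As a rough sketch of what that capture might look like, assume an Adobe Analytics AppMeasurement-style setup where list1 has been configured as the content-variation list variable and event10 as the “targeted content served” success event – both are placeholders for whatever your own report suite actually uses:

```typescript
// Minimal sketch only: a data layer holding the variations served on this
// page, copied into a list variable with a success event set against it.
declare const s: any; // the AppMeasurement tracking object, assumed present

interface PersonalisationDataLayer {
  contentVariations: string[]; // IDs of the targeted content blocks shown
  visitorType: string;         // the segment/profile used for targeting
}

// Populated by the personalisation engine when the page is assembled.
const dataLayer: PersonalisationDataLayer = {
  contentVariations: ["home-v2", "promo-strip-female", "recs-female"],
  visitorType: "female-repeat",
};

// List variables take a single delimited string of values.
s.list1 = dataLayer.contentVariations.join(",");
s.events = "event10"; // counts each time targeted content was served
```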

Lastly, capture the type of visitor they are, and by this I mean whatever business segment or profile you’ve used in your targeting. Once again, populate an eVar with this information.
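
Continuing the same assumed setup, the visitor type goes into a conversion variable (eVar20 here is just a placeholder) before the page view is sent:

```typescript
// Same assumptions as the previous sketch: AppMeasurement is on the page
// and eVar20 has been set aside for the visitor type / targeting segment.
declare const s: any; // the AppMeasurement tracking object

const visitorType = "female-repeat"; // the business segment used in targeting
s.eVar20 = visitorType;
s.t(); // send the page view with the listVar, success event and eVar attached
```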

Once you’ve got those three things captured, you’ll be better able to measure your personalisation efforts. You’ll be able to see which Visitor Types are on your site the most, which types of content they are seeing, and you’ll be able to attribute other conversion events back to the listVar values.

Personalisation is coming – so get ready for it; you’ll need to be prepared to measure it differently.
