Digital analytics reminders from MeasureCamp

This weekend I spent Saturday immersed in data and analytics at MeasureCamp. I love a good unconference and this one is definitely up there: brilliantly organised and lots of brain stretch. And of course I can’t not mention the free t-shirt and laser pointer.

I’m a complete advocate for, and enthusiast about, the power of data and testing. So much so that I’ve made the business case, won the budget, and recruited a digital analyst in each of my last three workplaces. Plus an analyst role is on the cards for my current team at Raising IT too.

A huge amount of material was generated at the event, and I won’t try to replicate it here. But a few of the top-line reminders I took away might be useful:

  • Testing has lots of trip-ups and myths.
    • For long purchase or supporter journeys, visitors may have made their minds up before you even started your test.
    • It’s possible to test a gazillion things, but really you should only test the things you can actively influence and change.
    • You shouldn’t necessarily use all the results to judge a test: any ‘whales’ (outliers) should be removed to avoid skewing the conclusion (see the first sketch after this list).
    • Traditional models assume the environment hasn’t changed; where there is change to factor in, you need a more agile analytics approach.
  • Tools for digital analysis have converged to a greater or lesser extent.
    • Google Analytics is very powerful these days and is what most people are using. I heard only a passing mention of a couple of other providers throughout the day.
    • Tracking inbound phone call sources can be made easier through Twilio or Calltracks (and probably others); there’s a rough sketch of how such tools work after this list.
  • Integrating tracking into your CMS is still a bit technical and time-consuming.
    • There are so many intricacies to integrating analytics correctly that the topic kept coming up again and again. A few of the people I spoke to were frustrated that they spent more of their time implementing tracking than analysing the data (a basic example follows this list).
  • Key performance indicators need buy-in; evidence alone is not enough.
    • You should only publish clear, actionable results that people are actively bought into viewing. This takeaway reminded me of a test I ran at one of my previous organisations: they insisted we print the whole of the monthly web stats and display them on the notice board … it took a whole three months before anyone returned the ‘claim a prize if you spot this’ slip!
  • Attribution models still need expert judgement.
    • Last click, first click, weighted, or something else: you still need to make a judgement, as there’s no clear-cut way to decide what’s best for your activity. Try one and shape it through use (the last sketch below makes the differences concrete).
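
Since ‘whales’ came up more than once, here’s a minimal sketch of the trimming idea from the first testing point above. I’ve used Tukey’s IQR rule, which is my assumption about how you’d do it rather than anything prescribed on the day, and the order values are invented:

```typescript
// Tukey's IQR rule: values beyond 1.5 * IQR from the quartiles count as outliers.
function trimWhales(values: number[]): number[] {
  const sorted = [...values].sort((a, b) => a - b);
  const quantile = (p: number): number =>
    sorted[Math.min(sorted.length - 1, Math.floor(p * sorted.length))];
  const q1 = quantile(0.25);
  const q3 = quantile(0.75);
  const iqr = q3 - q1;
  return values.filter((v) => v >= q1 - 1.5 * iqr && v <= q3 + 1.5 * iqr);
}

const mean = (xs: number[]) => xs.reduce((s, x) => s + x, 0) / xs.length;

// One £5,000 whale among ordinary orders drags the raw average up dramatically.
const orders = [12, 15, 9, 22, 18, 11, 5000, 14, 17, 13];
console.log(mean(orders).toFixed(1));             // "513.1": skewed by the whale
console.log(mean(trimWhales(orders)).toFixed(1)); // "14.6": closer to typical behaviour
```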
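
On the call-tracking point: tools like Twilio and Calltracks generally work by showing each traffic source a dedicated phone number, so inbound calls can be attributed back to the source. Here’s a rough client-side sketch of that idea; the numbers, the `.phone-number` class and the source mapping are placeholders of mine, not any provider’s actual API:

```typescript
// Dedicated tracking numbers per traffic source (placeholder values).
const trackingNumbers: Record<string, string> = {
  google: "020 7946 0001",
  email: "020 7946 0002",
  facebook: "020 7946 0003",
};
const defaultNumber = "020 7946 0000";

// Choose a number from the utm_source query parameter, falling back to the default.
function numberForVisit(url: string): string {
  const source = new URL(url).searchParams.get("utm_source") ?? "";
  return trackingNumbers[source] ?? defaultNumber;
}

// Swap the displayed number everywhere the template marks one up.
document.querySelectorAll<HTMLElement>(".phone-number").forEach((el) => {
  el.textContent = numberForVisit(window.location.href);
});
```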
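
On the implementation frustration: even a ‘simple’ Google Analytics install means getting the snippet into every CMS template and then wiring events to anything you actually care about. A sketch using the classic analytics.js calls; the property ID, class name and event names are placeholders:

```typescript
// analytics.js is assumed to already be loaded by the standard GA snippet in <head>.
declare function ga(...args: unknown[]): void;

// The part every CMS template needs: create a tracker and record the pageview.
ga("create", "UA-XXXXXX-Y", "auto"); // placeholder property ID
ga("send", "pageview");

// The fiddly part: wiring events to what you actually want to measure,
// which usually means touching theme markup as well as the snippet.
document.querySelectorAll<HTMLElement>(".donate-button").forEach((btn) => {
  btn.addEventListener("click", () => {
    ga("send", "event", "donation", "click", btn.dataset.campaign ?? "unknown");
  });
});
```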
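
And on attribution, a small sketch to make the judgement call concrete: given one visitor’s touchpoint path, first-click, last-click and a linear weighting credit the channels completely differently. The journey data is invented:

```typescript
type Attribution = Record<string, number>;

// Share one conversion's worth of credit across a touchpoint path.
function attribute(path: string[], model: "first" | "last" | "linear"): Attribution {
  const credit: Attribution = {};
  const add = (channel: string, share: number) => {
    credit[channel] = (credit[channel] ?? 0) + share;
  };

  if (model === "first") add(path[0], 1);
  else if (model === "last") add(path[path.length - 1], 1);
  else path.forEach((channel) => add(channel, 1 / path.length)); // equal weighting

  return credit;
}

// The same journey, three very different answers.
const journey = ["social", "email", "search", "email"];
console.log(attribute(journey, "first"));  // { social: 1 }
console.log(attribute(journey, "last"));   // { email: 1 }
console.log(attribute(journey, "linear")); // { social: 0.25, email: 0.5, search: 0.25 }
```

Whichever model you choose, the credit follows your judgement call, not the other way round.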
