The Digital Analytics Hierarchy of Needs

A few weeks ago I discovered Monica Rogati’s fantastic Data Science Hierarchy of Needs. It’s a data-science-centric riff on Maslow’s Hierarchy of Needs, a classic concept in psychology. I’ve found myself using Rogati’s diagram and the concept in conversations with colleagues, partners, customers and friends ever since, as a way to explain the challenges we face in the Digital Analytics space.

Business people get excited about the latest buzzwords: Big Data, Artificial Intelligence, Deep Learning and so on. Before you can break out TensorFlow and start doing bleeding-edge data science, you need to ensure you’ll be working on data that reflects reality.

In practice, most new clients we audit have substantial problems with behavioural event data tracking, and require at least some work to reach the minimum baseline: accurately tracking everything a user does on their website and apps.
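To make “accurately tracking everything a user does” concrete, here is a minimal sketch of what a well-formed behavioural event might look like. The field names and helper are illustrative assumptions, not any particular vendor’s API:

```typescript
// Illustrative shape for a behavioural event — field names are
// assumptions, not a specific analytics vendor's schema.
interface BehaviouralEvent {
  event: string;                                 // what happened, e.g. "add_to_cart"
  page: string;                                  // where it happened
  timestamp: string;                             // when it happened (ISO 8601)
  properties: Record<string, string | number>;   // event-specific detail
}

// Hypothetical helper that stamps each event consistently,
// so every event carries the same minimum context.
function buildEvent(
  event: string,
  page: string,
  properties: Record<string, string | number> = {}
): BehaviouralEvent {
  return { event, page, properties, timestamp: new Date().toISOString() };
}

const evt = buildEvent("add_to_cart", "/products/widget", {
  sku: "W-123",
  price: 19.99,
});
console.log(evt.event); // "add_to_cart"
```

The point of the baseline is consistency: every event, on every page and app, carries the same core fields, so downstream analysis never has to guess.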

And so, let’s have a look at the Digital Analytics Hierarchy of Needs:

Where am I?

We see a full spectrum, from near-perfect to absolute disaster zone, but the advice is generally much the same: you need to get your data collection in order if you want to do the cool stuff. Along the way you’ll unlock new insights and new capabilities. Done right, you’ll also give yourself awesome flexibility for future improvements and modifications to how you do Digital Analytics.

What does “done right” mean? That usually starts with tag management and a data layer. That’s a topic for another blog post, coming soon.
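At its simplest, a data layer is a structured queue of page context and events sitting between your site and your tag manager. A minimal sketch follows; the `dataLayer` array-push convention is common to several tag managers, but the field names here are illustrative assumptions, and a plain array stands in for the browser’s `window.dataLayer` so the example is self-contained:

```typescript
// Minimal data layer sketch. In the browser this would be
// window.dataLayer; a plain array here keeps the example
// self-contained. Field names are illustrative assumptions.
type DataLayerEntry = Record<string, unknown>;

const dataLayer: DataLayerEntry[] = [];

// Page-level context, pushed once when the page loads.
dataLayer.push({ pageType: "product", currency: "GBP" });

// A behavioural event, pushed when the user acts.
dataLayer.push({ event: "add_to_cart", sku: "W-123", quantity: 2 });

console.log(dataLayer.length); // 2
```

Because tags read from the data layer rather than scraping the page, you can change markup, swap analytics vendors, or add new tags without re-instrumenting the site.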

How long is this going to take?

To get to the minimum baseline, you don’t have to down tools and spend a year reworking everything. There are always interim steps that will bring useful results and help you get insights. It all starts with getting a good idea of the current state and building a plan to get you to the top of the Digital Analytics nerdvana pyramid.

So how do I get to the top?

We tend to start our customer engagements with a data audit to find out where things stand and what work needs to be done to unlock the cool stuff. It’s a pretty quick and painless process, and there’s invariably a list of quick wins that bring immediate value, along with a roadmap of bigger pieces to work on.

Get in touch with us if you’d like to kick off the discussion.
