Andy Young is an entrepreneur-in-residence at 500 Startups, where he helps teams with growth, analytics and product. Previously Andy led the launch of Stripe in the UK, founded and grew membership management platform GroupSpaces, and created Selective Tweets. Andy likes building useful things, and using technology to help people.
Okay. So yeah, welcome to this session on data-driven growth. So I’m Andy, thanks very much for having me. This is my Twitter handle, so please use that to direct any comments, feedback, heckles, you know, abuse in that direction.
It’s great to be here in Edinburgh. I’m actually a quarter Scottish, which I assumed was how the speaker running order today was calculated until Mike came up here in his kilt this morning, and now I’m not so sure. To introduce myself briefly: for the past year I’ve been with the VC firm 500 Startups, working with over 40 of our portfolio companies, helping them with their growth. Alongside that, I also do freelance consulting; right now I’m working on a project with the web-audience measurement firm Quantcast. Prior to that, I spent two years doing sales and operations as the country lead for Stripe here in the UK. And before that, I rode the start-up roller-coaster as the technical co-founder of my own company, GroupSpaces. Way, way back, I built a Facebook application, Selective Tweets, that some of you may have encountered. Somehow, and I’m still not quite sure how, that reached over a million users over a period of years, so that was quite something to watch. I really can’t take credit for that, but people found it useful.
So what I’d like to do today is take all these experiences from people I’ve worked with and projects I’ve worked on, and share with you a series of tools, tips and techniques for how to use data and analytics in order to get growth. So the agenda for today… first of all, why are we here? How is this useful? Why analytics? Next, a deep dive into deconstructing what the modern analytics stack looks like in terms of tools and platforms. And then finally, lies, damn lies and metrics: opening up a little and sharing some truths about what happens when you take all this perfect theory and apply it to reality, in practice.
So, without further ado, why analytics? And the important thing to ask here is, you know, why analytics? What are we actually trying to achieve? This is important because today we’re super fortunate to have a plethora of platforms available to us that make it super easy to jump right in and start collecting vast amounts of data. And the thing here is: don’t start with the data, right? This is my dad. He worked from a home office, and oftentimes my mum would come into the office and discover he’d been reading the dictionary for half an hour or more. He’d gone there to look up the definition of a word, and in reading that definition he’d discovered a new word, and he’d be like, “Okay, I’ll look up this one,” and so on and so forth. He got sucked in, spending so much time, having a lot of fun, but forgetting exactly why he was there in the first place. And this is what happens when we start with analytics by collecting the data and just saying, “Ooh, what data have we got here?”
So, the common analytics fails: these are things I screwed up in the past, particularly with GroupSpaces, and that I see a lot of the startups I’ve worked with doing too. Drowning in too much data, right? Not identifying the key questions up front that we need to answer, and not starting with clear hypotheses, things that we believe or suspect and want to prove or disprove. So we don’t wanna do this.
We use analytics because we need to know how we’re doing, and there are questions at different levels: how we’re doing for our overall business or product, are we growing, are we being successful? But then, zooming in to the day-to-day and the week-to-week: our individual experiments, our marketing campaigns, our different customer acquisition channels, our different customer segments. And for each of these, we wanna ask: what is working? What is not working? Where should we be focusing for improvement? So the analytics pros, the people I’ve worked with that I see being really successful, they start with a hypothesis or a question, because only then can they identify, collect and analyze the necessary, relevant data.
And that’s good, but it’s so easy to look at the data and then move on without writing proper conclusions. It doesn’t need to be lengthy, just a few notes on what we learned, and then, beyond that, what are we gonna do differently? Are we gonna roll out this campaign? Are we gonna build new campaigns in the same form? Are we gonna try different things in the future? The best people use analytics not just to conclude an action, but then to iterate and revise what they do in the future, based on the learnings from the analysis done before. So this is how the analytics pros work.
So, talking about analytics for growth: we’ve talked about analytics, but what do I mean by growth? Well, very simply, getting more, whether that’s more users or more revenue; they’re intertwined. But oftentimes a company will be prioritizing one over the other, so it’s useful just to call out exactly what we’re trying to achieve, depending on the stage of the product and the company. There are a lot of useful techniques we can use at the early stage, such as Eric Ries’ innovation accounting and cohort analysis, which hopefully everyone’s familiar with. At the growth stage, when we scale up, there are techniques such as growth accounting, which I’ll cover quickly, customer segmentation, analysis of our marketing channels, and looking into the economics of our funnels.
These techniques are all great, but we need to be clear on what questions we’re trying to answer. So at the early stage, if we’re just launching a new product, maybe it’s a new startup, the key question is typically: are we getting traction? Ash Maurya, the author of Running Lean and Scaling Lean, two books I highly recommend, defines traction as “the rate at which monetizable value is extracted from customers”. Okay, that sounds a little academic. Basically it’s the rate, so: how fast are we creating something valuable? Maybe in the early stages we won’t be making money right away, but we’ll be able to monetize over time. So this is not vanity metrics, like sign-ups or registrations, but: are we creating value? Are people engaging with our products, using them, potentially purchasing, whatever it may be?
Eric Ries defined the term “innovation accounting” in his book The Lean Startup, and here essentially we’re saying that when we’re in the early stage, traditional accounting methods just don’t apply. We may not even have revenue or financials to work with, so this is why we need analytics techniques such as cohort analysis. Cohort analysis is a technique for learning not just “are we growing?” but “are we getting better?”. Very briefly: take any metric we can measure, say our percentage of active users. With any metric like this, when we measure it over our entire user base, we run into a problem. We’re rolling out new campaigns, new landing pages, new conversion funnels, and we wanna see if we’re getting better, not just if we’re getting more users or more customers; but measured across the entire user base, the differences we may be making get drowned in the detail. So with cohort analysis, we very simply divide our users up into groups or cohorts, typically based on the month of sign-up, and then we can split out the users that signed up more recently versus further in the past. And the patterns about whether or not we’re making improvements, based on the changes we’re making, start to emerge. Super useful technique.
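To make cohort analysis concrete, here’s a minimal sketch in Python. The data and field names are hypothetical, purely for illustration; a real cohort table would track retention across many months, not just month two.

```python
from collections import defaultdict

# Hypothetical user records: which month each user signed up,
# and whether they were still active in their second month.
users = [
    {"signup_month": "2024-01", "active_month_2": True},
    {"signup_month": "2024-01", "active_month_2": False},
    {"signup_month": "2024-02", "active_month_2": True},
    {"signup_month": "2024-02", "active_month_2": True},
]

def cohort_retention(users):
    """Share of each monthly signup cohort still active in month two."""
    totals = defaultdict(int)
    active = defaultdict(int)
    for u in users:
        totals[u["signup_month"]] += 1
        if u["active_month_2"]:
            active[u["signup_month"]] += 1
    return {month: active[month] / totals[month] for month in totals}

print(cohort_retention(users))  # {'2024-01': 0.5, '2024-02': 1.0}
```

If the February cohort retains better than January’s after a product change, that’s exactly the improvement signal that gets drowned out when you average over the whole user base.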
Moving on to the growth stage, the question is: are we growing? And hopefully we have a chart that looks like the classic up-and-to-the-right. This is an example for a subscription business, and this is good. But any top-level metric that we use, such as our monthly recurring revenue or our number of transactions, only tells us the end result. It doesn’t tell us what’s happening beneath the surface, which can hide very different pictures. So growth accounting is a super useful technique, where we simply break out the underlying components of any top-level metric. In this case, take our monthly recurring revenue for a subscription biz and ask: how much revenue came in this month, or this week, from new customers that just signed up? How much did we get from reactivated customers, people we’d previously lost and won back? Did we get any expansion revenue from upgrades? And have we offset all this against what we’re losing from churned customers, or contraction from people downgrading their accounts? Then we really start to understand the health of the business. This is powerful, because take the same growth curve, right: the top-level curve in these two charts is the same.
But this chart produces the same top-level growth, and the picture for this business is very different. Here we have a huge amount of churn, and we’re only producing growth because our acquisition efforts are so successful that acquisition is just beating out and offsetting the churn. But this is an unhealthy business, because as soon as the current customer acquisition efforts inevitably slow or fall off, this business is gonna be in a very bad state, because everyone we acquire, we lose very quickly.
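Here’s a small sketch of the growth-accounting decomposition in Python. The revenue figures are invented to mirror the two charts: identical net growth, very different health underneath.

```python
def growth_accounting(new, expansion, resurrected, churned, contraction):
    """Decompose a period's net MRR change into its components."""
    gained = new + expansion + resurrected
    lost = churned + contraction
    return {"gained": gained, "lost": lost, "net": gained - lost}

# Healthy month: growth comes mostly from keeping and upgrading customers.
healthy = growth_accounting(new=100, expansion=30, resurrected=10,
                            churned=20, contraction=5)

# Leaky month: acquisition is huge, but churn is eating most of it.
leaky = growth_accounting(new=300, expansion=5, resurrected=10,
                          churned=180, contraction=20)

# The top-level number is identical...
assert healthy["net"] == leaky["net"] == 115
# ...but the loss column tells the real story.
print(healthy["lost"], leaky["lost"])  # 25 200
```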
So growth accounting is a super useful method any time you have a top-level metric like this. The folks at Social Capital have an extended series of blog posts if you’re interested in digging in more there. Other things for the growth stage: funnel performance. For marketers, it’s not just about conversion rates and what percentage reach each step of the sign-up funnel, right? As marketers, we should be thinking about the economics of the funnel: our cost per acquisition, our cost per click, all the way through to how much people are spending when they convert. So not just the conversion rate, but what’s the basket size? What plan are people upgrading to?
Ultimately, what’s the lifetime value of these customers? And this is key, because as marketers it’s critical that we understand the full funnel. It’s not good enough just to drive leads, new users, new sign-ups, new registrations, or even first-time purchases into the business and then think, “Job done, hand over to the product team”, or whoever else it may be, right? And the thing here is that, in the growth stage… actually, these are two questions that we ask every company we start working with at 500 Startups. Think about this for yourself: which channels bring you the most customers? But which channels bring you your best customers? Because oftentimes these are not the same. Here’s a question: how much do we want to pay per click at the top of the funnel? It seems reasonable that we might want to pay as little as possible per click, to optimize and drive down our cost.
But think about it this way: what if we tried to pay as much as possible per click? What would that give us? If we could pay as much as possible per click, we could outspend our competitors, or we could perhaps scale into new acquisition channels that otherwise wouldn’t have been accessible to us. So for sure we want to grow in a scalable and profitable way, but as marketers, how should we be thinking about increasing lifetime value, or at least looking for those customers with high lifetime values and increasing conversion rates, such that we can afford to spend more on acquiring a customer rather than less? Interesting food for thought.
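One way to put numbers on that thought is a back-of-envelope break-even calculation, not a full model; all figures here are hypothetical.

```python
def max_affordable_cpc(lifetime_value, click_to_customer_rate):
    """Break-even cost per click: what a click is worth, given how often
    clicks become customers and what a customer is worth over their lifetime."""
    return lifetime_value * click_to_customer_rate

# At a $200 LTV and a 2% click-to-customer rate, a click is worth $4.
print(max_affordable_cpc(200, 0.02))  # 4.0

# Double either the conversion rate or the LTV, and we can afford twice
# as much per click, outbidding competitors on the same channel.
print(max_affordable_cpc(200, 0.04))  # 8.0
```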
This all comes down to customer segmentation and channel performance, because when you’re just starting out it’s very easy to look at performance in aggregate across your entire customer base. But for every company I’ve worked with, that hides a lot of interesting things beneath the surface. So for different groups of customers: where do they come from? Which channels? And how do they behave in terms of conversion rate and spend? Thinking about segmentation here, users acquired via different channels inevitably have different behaviours. Conversion rates from Facebook are often very different to referral, very different to organic. Different cohorts, people who’ve signed up for our product over time, will have experienced different versions of the product, or of the website experience or the app. And different users will hopefully have been exposed to different buckets of A/B tests. The key thing is that these are all properties of our users. Our UTM tags, landing page, when they signed up, what A/B tests they were exposed to: these are all things we should be ensuring we store, so that we can use these properties to slice and dice and segment our customers later, to really understand where our best customers are coming from.
One key obstacle that most every company I’ve worked with runs into is demystifying direct traffic: we’ve done as much as we can to attribute and tag our traffic and figure out where our customers are coming from, but we still get left with a large proportion of traffic that we just have no information and no insight about. So I thought I’d run through a few ideas and tactics here that have worked. The first thing is very simple: we have to tag everything. Typically we’ll have successfully tagged our paid traffic, AdWords, Facebook, whatever it may be, but there’s more we can do. Our aim is to map all direct and referral traffic so that it shows up in our analytics tools with a specific source and medium. We do this for email: notifications from our own product, email marketing, organic social traffic, do we have those coming through and tagged appropriately? What about offline? How can we identify traffic that’s come from offline acquisition sources? There’s a lot of detail here I’m not going to jump into, but I’ve published a guide and reference to tagging, the links are in the slides, that you may find useful. Once we’ve tagged the basics, another thing is that the landing page can often be a clue. Take this example: we’ve tried to tag as much traffic as we can, but we still get loads of direct traffic, and if we drill down by landing page, we see a lot of traffic seems to be landing on the sign-in or sign-up address.
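As a sketch of what “tag everything” means mechanically, here’s a small Python helper that appends UTM parameters to a link, say one used in a notification email. The URL and parameter values are made up; the utm_* parameter names are the standard ones analytics tools look for.

```python
from urllib.parse import urlencode

def tag_url(base_url, source, medium, campaign):
    """Append UTM parameters so the visit shows up in analytics
    with an explicit source/medium instead of falling into 'direct'."""
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    separator = "&" if "?" in base_url else "?"
    return base_url + separator + params

link = tag_url("https://example.com/login", "notification", "email", "weekly-digest")
print(link)
# https://example.com/login?utm_source=notification&utm_medium=email&utm_campaign=weekly-digest
```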
This is something I’ve seen a bunch of times. Maybe we have social log-ins, so people sign up with Facebook or Twitter to use our site, and in doing so they get redirected away to Facebook or Twitter and then come back to our site. Now, if the technology is not quite set up properly, this traffic can show up as a new, fresh visitor the moment they come back to our site, so there’s a clear clue there. Something else we see here is flash messages, which could be another clue. If we’re sending email notifications to our user base to come back to our application to read some message they’ve received, maybe those notification emails don’t have the appropriate tagging, and so this could be a clue to what we might be missing. Thinking about other difficult questions: cross-device and offline channels. Cross-device is an increasingly common problem: customers will often research and go through the consideration phase on one device, then come back and purchase at a different time on a different device. This is an unsolved problem for sure, so it really is a best-effort thing. But thinking about it, if we were to solve it, what would we need? We just need to be able to join the dots: any uniquely identifying information we can use to connect customers on different devices. And this doesn’t need to be as heavyweight as trying to force people to create a login on mobile; ideas like just prompting them for an email address or a mobile number can work.
Here’s an example, actually going the other way, from Path, where you arrive on desktop but want to download a mobile app. Here they use Twilio in order to say, “Hey, let’s make it easy for you. Just enter your mobile number and we’ll send you a link to download the app directly.” Now, they can combine this with deep-linking platforms such as Branch to create a unique reference for that download link, so that when the user signs up on mobile with the same phone number, they can join those dots and complete that acquisition loop. Another trick, filling in acquisition blanks: have you seen these sorts of surveys? Often forgotten or underused, where we simply ask our customer how they heard about us. This can be super useful, done after the customer’s completed the conversion step. We don’t wanna do it up front, we don’t wanna add friction to the funnel or get in the way of conversions. But say we’ve completed a purchase: on the purchase-completed screen, just ask the customer, “How did you hear about us?”. Make it optional, and we might only get a percentage of people filling it out. But that’s okay, it’s reasonable to extrapolate that out, and for any missing attribution we can build up a picture of where people are hearing about us. Fantastic.
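The extrapolation step can be this simple. A sketch with invented numbers, with the obvious caveat that it assumes survey respondents are representative of everyone:

```python
def extrapolate_attribution(survey_counts, total_unattributed):
    """Scale optional 'How did you hear about us?' answers up to the
    full pool of otherwise-unattributed customers."""
    answered = sum(survey_counts.values())
    return {channel: round(count / answered * total_unattributed)
            for channel, count in survey_counts.items()}

# 50 of 1,000 otherwise-unattributed customers answered the survey.
estimate = extrapolate_attribution(
    {"word of mouth": 30, "podcast": 15, "flyer": 5}, 1000)
print(estimate)  # {'word of mouth': 600, 'podcast': 300, 'flyer': 100}
```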
Another thing, getting pretty advanced: geographic A/B tests. Take Google as an example. You may have seen these if you’ve been in London in the past couple of years: Google have been doing really broad brand advertising campaigns, in this case for voice search on mobile, promoting Android, and also for Google Chrome, where they took over all the tube stations and the wrappers round the free newspapers. Google of course being data-driven, I was really fascinated to hear how they tested this numerically. What they did is they looked at the baseline usage of Chrome, for example, in London, and compared it against another similar city, in this case I think one in Germany, where they didn’t run the advertising campaigns. So they could A/B test it, if you like, monitoring usage of the product over the subsequent weeks to see what impact the local advertising was having.
Moving on. This is something really useful that comes up: brand versus non-brand terms for organic and paid search channels. Think about whether people come to a search engine and type in the name of our company, or whether they type in a long-tail keyword or phrase. Google Analytics and other tools will by default typically bucket all of these together under organic search or paid search, separate from our direct traffic. But if we think about it, what is the motivation, what is the context, for someone who is typing our brand name into a search engine? They probably have more in common with someone who is typing in our domain name and showing up as direct than they do with someone who’s coming via a broader, more distant search intent, via a longer-tail search term. So one thing we can do here, in particular for Google Analytics, is make sure we’re taking advantage of the advanced channel groupings: fill in all our brand terms, get those filters set up, and set up custom buckets so we’re splitting our brand search from our organic search, because typically these have very different behaviors.
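In Google Analytics you’d do this with channel-grouping filters rather than code, but the underlying logic is just a pattern match. A sketch in Python, using my old company’s name as the hypothetical brand term:

```python
import re

# Hypothetical brand terms, including a common alternative spacing.
BRAND_PATTERN = re.compile(r"\b(groupspaces|group spaces)\b", re.IGNORECASE)

def search_bucket(query):
    """Split search queries into brand vs non-brand buckets, since brand
    searchers behave more like direct traffic than like long-tail organic."""
    if BRAND_PATTERN.search(query):
        return "brand search"
    return "non-brand search"

print(search_bucket("GroupSpaces login"))         # brand search
print(search_bucket("club membership software"))  # non-brand search
```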
Attribution models: first click, last click, what about combinations? This is a pretty complex subject that people often get bogged down in, so I thought it’d be useful just to share one insight, one way I think about attribution in a growth context. We can grow in two ways: we can reach more people, or we can increase our conversion rate. Our first-touch channels, and we need to know which channels those are, are the channels through which we grow our audience reach; this is where people first discover us. So if we’re trying to increase our reach, we need to be scaling first-touch channels. Compare those with subsequent touches, subsequent clicks, last clicks: these help us grow our conversion rates for people who have already encountered us. Oftentimes these channels will overlap, and for some businesses it may be an instant purchase decision, bounce or convert. But for me, just thinking about it in this way, splitting out and identifying these channels lets us think in terms of how we can grow: reach or conversion.
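To illustrate the reach-versus-conversion split, here’s a toy first-touch/last-touch tally in Python; the customer journeys are invented.

```python
from collections import Counter

def attribution_report(journeys):
    """Tally first-touch vs last-touch credit across customer journeys.
    First touches show which channels grow reach; last touches show
    which channels close conversions."""
    first, last = Counter(), Counter()
    for touches in journeys:
        first[touches[0]] += 1
        last[touches[-1]] += 1
    return {"first_touch": dict(first), "last_touch": dict(last)}

# Each journey is the ordered list of channel touches for one customer.
journeys = [
    ["podcast", "organic social", "brand search"],
    ["facebook ad", "brand search"],
    ["podcast", "email"],
]
report = attribution_report(journeys)
print(report["first_touch"])  # {'podcast': 2, 'facebook ad': 1}
print(report["last_touch"])   # {'brand search': 2, 'email': 1}
```

Here the podcast is where people discover us (scale it for reach), while brand search closes conversions for people who already know us.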
And then finally in this section, a bit of a sobering thought: most marketing campaign experiments fail. And if they don’t, if most experiments aren’t failing, then probably we’re not trying big, crazy enough stuff. With a lot of the companies I’ve worked with, the thing we’ve found useful is realizing that even within campaigns that fail, there’s normally some kind of success. So if we plan in advance, we can ask the question: at what point in the user journey did the campaign fail? Maybe people didn’t convert, but did they get exposed to the campaign at all? Did they come to the site? And this is that cliché idea of doing things that don’t scale. I was working with one company that was experimenting with door-drops: physical flyers delivered to particular neighborhoods. On the first attempt we had almost no conversions at all, and it would be very tempting to conclude that the campaign failed.
But what we asked was: at what point did it fail? Here we can do things that don’t scale, like following up and knocking on a few doors a week later to see if we can speak to people: “Did you notice our flyer, or did it get lost in the mass of stuff you find on your doormat?” If people actually found it, was the design eye-catching enough, or do we need to iterate right at that first stage? If people are picking up the flyer, are they coming to our website? If we plan in advance, what can we do to track that? Including a particular short code or short URL that we can track, so we say “go to this address” or “search for this”, whatever it may be. Or again, doing geographic A/B testing. So even within a campaign that fails, where can we learn what succeeds, so we can come back and iterate?
So, moving on, diving into the analytics stack. If we’ve thought about the questions we wanna answer and the sort of analysis we wanna do, how do we collect this data? There’s a lot of stuff we wanna track, right from the top of the funnel through to activation and engagement metrics and so on. In some ways we’re very fortunate that there are so many platforms out there now, but this gives us the problem: how do we pick? Oftentimes people do struggle with this. So thinking about how to pick, the first question is really understanding what functionality we need, because it’s very easy, and I’ve struggled with this, it’s hard to keep up, to get confused about what functionality and features different platforms offer. There’s a lot out there. We all start with Google Analytics, right, session and page-view analysis, but pretty much every company I work with, at 500 and beyond, goes beyond that to use some kind of user- or event-based analytics tool like Mixpanel, Amplitude, Heap, Kissmetrics and so on. But there’s more: for mobile I already mentioned Branch, deep-linking and attribution, plus other mobile-specific platforms; A/B testing; querying and charting; dashboards; audience demographics; marketing automation; CRM; email and push notifications; and now there are platforms like Segment that tie all of this together.
It’s very easy to think, “Goodness me, how do we even make sense of this?” I’m not gonna belabor the point by going through every different scenario here, although I would love to chat with you about your individual businesses afterwards. But just for now, some thoughts on questions to ask yourselves when thinking about what platforms are good for you. What functionality do we need, but also who will be using it? Is this for devs and data scientists, or do we need platforms that product or marketing people are gonna be familiar and fluent with? How do we want to use it: is this for in-depth analysis, or do we just want some kind of reporting or dashboards? Which platforms do we need to integrate it with? And then, what’s our data volume, and our budget?
Things to be wary of: data lock-in and future portability; it’s very useful to be able to switch tools over time. Also, oftentimes, particularly when we work with developers, and hands up, I am one, people think, “Oh, we could just build our own.” I have yet to work with a company where the right answer was to build their own analytics and reporting stack. But all these tools, and there are fantastic tools out there, are not a panacea. The perfect tooling will not bring us success, and in fact it’s often better to just pick one or two tools, honestly, and get set up and running, than to obsess over the best tooling setup. So really, you don’t need to worry about it too much. The single biggest factor I’ve found that affects the quality of analytics at the companies I work with is actually the quality of their tracking implementation. So if you’re gonna worry about anything at all, worry about this. Every company I’ve worked with has struggled at some point with coverage or depth, not tracking information they need, thinking back to attribution, or with accuracy. I’ve been through so many cycles of discovering bugs in things I’ve implemented myself, right?
So, very simply, this is an example growth spreadsheet dashboard for a subscription business. We’ve got columns across the top, and typically a weekly cycle works well, then rows down the left-hand side for the different metrics, starting with what’s most important, what we’re trying to produce, and then the inputs to that. I actually have an example: this Google Sheet is available at the bit.ly link if it’s useful to you. And what can we do with this? It’s incredibly useful for setting that weekly cycle, where we zoom out from the day-to-day of optimizing campaigns and designing experiments and ask: just how are we tracking? Are we making progress? How are conversion rates doing? What about our overall costs?
By sharing this with the whole team, everyone gets context on how their part fits into the big picture, and we can actually use it to predict and prioritize. Are our conversion rates doing just fine right now, so we should be focusing on channels? Or if our conversion rate has plummeted, we have the visibility to dive in and address that. And think about which numbers we should be reviewing on a daily cycle, for individual campaign admin, versus weekly, versus perhaps monthly or even quarterly. A key thing for all the numbers that go in here: metrics are people too. Every metric should be measured in terms of either unique people, or a percentage conversion rate of those unique people. So when we get inputs like raw clicks, what really matters is how many unique people got there eventually.
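“Metrics are people too” in code form: count unique people per step, not raw events, and compute conversion between those unique counts. A sketch with hypothetical event records:

```python
def funnel_metrics(events):
    """Count unique people per funnel step, then compute conversion
    between those unique counts rather than between raw event totals."""
    visitors = {e["user"] for e in events if e["step"] == "visit"}
    signups = {e["user"] for e in events if e["step"] == "signup"}
    return {
        "unique_visitors": len(visitors),
        "unique_signups": len(signups),
        "conversion": len(signups) / len(visitors) if visitors else 0.0,
    }

# User 'a' visits twice, but that's still one person.
events = [
    {"user": "a", "step": "visit"},
    {"user": "a", "step": "visit"},
    {"user": "b", "step": "visit"},
    {"user": "c", "step": "visit"},
    {"user": "a", "step": "signup"},
]
print(funnel_metrics(events))  # 3 unique visitors, 1 signup, ~33% conversion
```

Counting raw visits here (4) instead of visitors (3) would understate the conversion rate and quietly skew every downstream number in the dashboard.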
So, thinking about joining the dots and putting this together: how do we produce these sorts of dashboards? Initially, manually is okay. People rush to automate, but starting manually and filling out that spreadsheet is actually a useful forcing function: we learn what works for us, what numbers make sense. We can understand the data and make sure we’re not working off incorrect calculations, because we’re calculating them every week, and make sure we’ve figured out what data we need when we fill it in. Only over time, once we’ve found something that works for us, should we start automating to reduce effort. And when we come to automate, there are loads of useful plumbing tools: Zapier or Tray are fantastic, and Supermetrics or Blockspring help pull any sort of data imaginable from Google Analytics and many other platforms into a spreadsheet. Segment, I already mentioned. And then finally on to our own database for reporting, perhaps using BI tools such as Tableau, Periscope, etc. Cool. So, we’ve talked about why analytics and what questions we wanna answer, and we’ve talked about how to collect the data and what sorts of platforms and tools are out there.
So finally, what happens when all this hits the real world… Some of you may relate: at least in my personal experience, the numbers never add up. And what I’d like to conclude with is what I might call “the five stages of analytics grief”. So follow along with me and see how familiar this is. The first stage of analytics grief is denial. This is where we’re all happy, we’re set up, we’ve got our tooling and our platforms in place, and we’re like, “Yep, we’re using this data and it’s great. Fantastic. Everything’s working. Brilliant.”
And then we notice that the clicks in our Facebook ad reporting dashboard aren’t matching up with the number of visitors we’re getting inside Google Analytics. We start digging around, and the next stage of analytics grief we enter is anger, because we’re like, “Why doesn’t this make sense? I’ve stared at the data for, like, an hour.” The next thing we look at is comparing Mixpanel versus our own database. This is from a project I was working on last month, where we were measuring people that completed a step in the signup funnel, so it’s the same data being measured in two different places. It’s like, “Why is Mixpanel undercounting by, like, five or six percent?” It makes no sense at all, and it’s so frustrating. The next stage we enter is bargaining, and this is where we just start clicking around wildly within our analytics tools, trying to find some way to make the data add up, right? “Oh, if I just change the filters”, or maybe switching to a different tool, just trying to find some way for this to make sense. Actually, there’s a really useful point in here: comparing the same measurements across different platforms, typically Google Analytics and Mixpanel, so different types of measurement tools, can be a really useful way to sanity-check whether our instrumentation is correct and whether we have any issues.
But then finally it does get a bit depressing, and we start to question: where did it all go wrong? It’s at this point that we hit the bottom and start discovering some light at the end of the tunnel. Because if we think about where it all went wrong, how our numbers don’t add up, there are two possibilities. One is that our data is bad, and we’ve touched on looking into our analytics tracking for errors, bugs, whatever. The other is that our definitions are wrong, or we’re misunderstanding the definitions we’re using, so we think the data means one thing, but it actually means something else. These are essentially the only two possibilities. If we dive into these, we’ll find some absolute truths, typically things where the data is in our own database: signups, transactional data. We know how many purchases we had, we know how many pieces of content had been reported. Compared to that, we have data that’s inherently lossy or noisy: anything measured from the client side via tracking tools, right? GA, Mixpanel, etc. Thinking about the definitions, there’s a lot of nuance in there: uniques versus totals often trips us up. With funnels, we’re looking at conversion rates, but what happens if someone enters the funnel partway through? How do they show up in the data? Do we really understand the numbers in front of us, how they’ve been collected and what they mean?
Here’s the thing: even if the numbers do add up, they’re often still misleading. This is my personal experience. In the latter days of my startup GroupSpaces, as we couldn’t make it commercially successful and I was scaling it back down, we ended up turning off AdWords. Now, we had AdWords tracking running perfectly, we saw how many signups we were getting from AdWords every day, and we turned it off. And you know what happened to our overall signups when we turned off AdWords as the main channel? Almost nothing. So all these people that I had accurately tracked and attributed to AdWords, it turned out they were gonna find us anyway, right? And that, for me, was a very sobering experience. So what do we take away from this? Experimentation, and always looking to dig within the data.
So, sort of summing up here: data discrepancies, Google Analytics versus Mixpanel. We have to really understand the definitions; they have different cookie rules, different lifetimes per user. The numbers I showed before, Mixpanel versus our own database, off by 6%: we believe that’s attributable to ad blockers, because people using ad blockers can actually block analytics tools as well, so in that client-side tracking we saw a 6% dip in the metric. Other things: maybe the page didn’t finish loading. If our page load time is really slow, people may get bored and hit the back button, so it counts in our Google AdWords or Facebook marketing dashboard as a click and a visit, but maybe our tag manager hasn’t actually fired and we haven’t recorded that the person actually landed on our website. Or cross-browser bugs, for example.
And so finally, for me, the final stage of analytics grief is acceptance. What I find works is: if we’ve got a low level of discrepancy between tools, maybe 3-5%, that’s okay. We can normally live with it, so long as we make sure we’re aware of it and it’s not scuppering statistical significance in any analysis we may be doing. And if we have larger differences, that’s when we try to debug, analyze, and dive into some of these potential root causes. Have we got the correct data? Do we understand the definitions we’re dealing with?
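That acceptance rule of thumb is easy to turn into a sanity-check script. A sketch, with invented counts echoing the Mixpanel-versus-database example:

```python
def discrepancy(count_a, count_b):
    """Relative difference between two tools measuring the same metric."""
    return abs(count_a - count_b) / max(count_a, count_b)

def needs_investigation(count_a, count_b, tolerance=0.05):
    """Flag pairs whose discrepancy exceeds the ~3-5% we can live with."""
    return discrepancy(count_a, count_b) > tolerance

# Our database says 1,000 signups; the client-side tool saw 940
# (ad blockers, unfired tags, and so on).
print(discrepancy(1000, 940))          # 0.06
print(needs_investigation(1000, 940))  # True: 6% is worth debugging
print(needs_investigation(1000, 970))  # False: 3% we can live with
```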
And that’s pretty much it for this morning, so I hope that was helpful. In terms of takeaways: start with your questions and hypotheses, being precise about what you wanna measure and why. Plan in advance, so we make sure we’re tracking those marketing campaigns: at what point will they succeed? At what point will they fail? Only automate once we have something we know works and want to scale up. And finally, make quick bullet-point documentation of everything we learn. And remember: the numbers tell us what happened, but they don’t tell us why. So all of this analytics is fantastic, but it’s no substitute for customer research and creativity and all the other great things we need to do as well.
So, thanks very much. Good luck. And I’m an absolute geek about this stuff, so please don’t hesitate to come grab me throughout the day or this evening. I’d love to chat one on one. So, thank you.