1. No Trust in the Underlying Data
In many cases, the stakeholders do not trust the underlying data. There can be different reasons for this. In general, you need to identify: What does it take for the people receiving the reports to trust the data? How can you build confidence in analytics data?
Several factors can affect the level of trust in the underlying data.
Insufficiently Specific Requirements Management
In most cases, things start to go wrong early, during the design and implementation phase. Often, the requirements for measurables are not very specific. And even if they are, they are often diluted on the path to implementation: Designers misread what marketing wants. The development team then implements whatever is technically possible (“We don’t even have that data in the backend!”). And if there is no validation of the data during testing... Voilà: Useless data in the analytics tool.
Aligning requests with reality is one of the most time-consuming tasks during implementation. It is worth being nit-picky here!
The same goes for the testing phase: Make sure you have enough time to check the real data collected by the analytics tool and align it with the requirements. Run through different test cases and scenarios. This is, of course, also very time-consuming.
Comparing Data From Different Sources
A timeless classic in the list of misunderstandings is a comparison of seemingly identical data from different sources. We often witness this in the evaluation of marketing initiatives and, in particular, marketing campaigns. The advertising network – even with its conversion tags integrated into the site alongside the analytics tool – spits out completely different numbers from the analytics tool itself. “Well, this can’t be right”, people will say in a slightly accusatory tone.
Yet there may be a simple explanation for the discrepancies. In most cases, people are simply comparing apples with oranges. Here are a few examples: Firstly, the clicks in the ad network cannot be equated to the visits in the analytics tool. Secondly, on the conversion pages – for instance, in a form to sign up for an event – the conversion tag will count every button click, but the analytics tool only counts confirmation pages displayed.
And then there are situations where the exact same thing is measured and there are still discrepancies. We often see this with orders: The backend registers 100 orders, analytics only 90. It is clearly evident which 10 orders are missing in the analytics tool and yet there is no apparent reason for it.
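A practical first step in such cases is to reconcile the order IDs from both systems, so you can see exactly which orders are missing rather than just that the totals differ. A minimal sketch in Python, assuming both systems can export their order IDs (the IDs and export shape here are hypothetical placeholders):

```python
# Reconcile order IDs exported from the backend and the analytics tool.
# The IDs below are hypothetical placeholders; in practice you would
# load them from the two systems' exports.
backend_orders = {"A100", "A101", "A102", "A103", "A104"}
analytics_orders = {"A100", "A102", "A103"}

# Set difference pinpoints exactly which orders each side is missing.
missing_in_analytics = backend_orders - analytics_orders
extra_in_analytics = analytics_orders - backend_orders

print(f"Missing in analytics: {sorted(missing_in_analytics)}")
print(f"Extra in analytics:   {sorted(extra_in_analytics)}")
```

Once the missing orders are identified, you can look for a pattern among them (device type, payment method, ad blockers) instead of guessing at the aggregate gap.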
Faulty Campaign Tracking
One of the main motivations to use analytics data is the evaluation of paid online advertising. Online, it is easiest to track the effects of the budget you invested. However, the basis for this is still formed by tracking parameters handed over in the link leading from the ad to the website.
In most cases, this is a manual process involving several parties, from requesting the tracking parameter to generating and integrating it into the link to integrating the link into the advertising platform. Kudos if you can pull that off without a hitch.
It doesn’t take much for clicks to go untracked: A question mark or space in the wrong place can mess everything up. This usually leads to an outcry when the first campaign reports come in and data is missing. And when a footnote about missing data has to be added to the reports, it rarely helps people trust the data.
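Much of this fragility comes from assembling links by hand. Generating and checking campaign URLs programmatically catches the classic mistakes before the campaign goes live. A minimal sketch using Python’s standard library – the parameter names follow the common UTM convention, and the URL and values are illustrative placeholders:

```python
from urllib.parse import urlencode, urlsplit, parse_qs

def build_campaign_url(base_url: str, source: str, medium: str, campaign: str) -> str:
    """Append UTM tracking parameters to a landing-page URL.

    Rejects two classic ways hand-built links go wrong: an unencoded
    space, and a second question mark from an existing query string.
    """
    if " " in base_url:
        raise ValueError("URL contains a space - encode it first")
    if urlsplit(base_url).query:
        raise ValueError("URL already has a query string - merge parameters instead")
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return f"{base_url}?{params}"

url = build_campaign_url("https://www.example.com/landing",
                         "newsletter", "email", "spring_sale")
print(url)
# Self-check: the parameters must survive a round trip through a parser.
assert parse_qs(urlsplit(url).query)["utm_campaign"] == ["spring_sale"]
```

Even a small helper like this turns “kudos if you can pull that off without a hitch” into a repeatable process that several parties can rely on.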
Bot Traffic Not Filtered Consistently
Some websites, especially online shops with a lot of everyday goods, are at particular risk of a large proportion of the traffic being bot traffic. An increasing number of automated crawlers are scanning the web to collect product details and compare prices. Their intentions are usually not malicious, but they create a significant amount of traffic, which distorts the overall picture, especially when their activities are irregular and some days just see an exceptionally large volume of this artificial traffic.
It is getting harder and harder to distinguish this non-human traffic from genuine visits, as bot technology keeps getting better.
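A first line of defence is still a simple user-agent filter: well-behaved crawlers identify themselves, even if stealthy bots will slip through. A minimal sketch – the record structure and keyword list are illustrative assumptions, not a complete bot-detection solution:

```python
# Filter obvious bot hits out of pageview records by user-agent string.
# The keyword list and record shape are illustrative assumptions; this
# only catches crawlers that identify themselves honestly.
BOT_KEYWORDS = ("bot", "crawler", "spider", "headless")

def is_probable_bot(user_agent: str) -> bool:
    ua = user_agent.lower()
    return any(keyword in ua for keyword in BOT_KEYWORDS)

pageviews = [
    {"url": "/product/42", "ua": "Mozilla/5.0 (Windows NT 10.0)"},
    {"url": "/product/42", "ua": "PriceCrawler/2.1 (+https://example.org/bot)"},
    {"url": "/checkout",   "ua": "Mozilla/5.0 (iPhone; CPU iPhone OS 17_0)"},
]
human_views = [pv for pv in pageviews if not is_probable_bot(pv["ua"])]
print(f"{len(human_views)} of {len(pageviews)} pageviews look human")
```

For the sophisticated bots that spoof browser user agents, you would need behavioural signals on top of this, but a consistent baseline filter already smooths out the irregular spikes described above.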
2. The Important Things Are Not Measured
How do you evaluate the success of your online activities? The minimum requirement is reports measuring the traffic volume. These are standard in analytics tools. Do you also measure user interactions, known as events? Maybe several of them, maybe even so many that you don’t exactly know what they represent? Are those really the relevant figures?
Of course, it is important to have many users on your site or in your app. Without a sufficiently large user base, you will not generate any business. A large number of user interactions is also very helpful when you are analysing problems, no doubt about that.
But what really matters are the user interactions that contribute to your business objectives. Your business objectives translate into strategic online goals. These need to be measurable, and not necessarily just with analytics tools. Other data sources can also be added to the picture. Objectives become measurable when they have KPIs attached to them. Finding and defining these is a task for everyone who uses the analytics data. It takes a lot of discussions and consideration to identify the specific goals. KPIs are not provided by the analytics tool. You will also not find them in a best practice post titled something like “These are the 10 KPIs you need to track”. Defining them is a team effort.
3. Reporting Does Not Lead to Action
If there is a lack of trust and the important things are not measured, the data is of very little use to you. All the work you are putting into analytics is in vain. Nobody wants that.
However, the data is also of very little use if you spend all your time on reporting. If all you do all week is update recurring reports and measure new campaigns with the same reporting templates, you are certainly doing a lot, but are you doing the right thing? The question you should ask yourself is: Are you still reporting or have you started analysing yet?
Reporting is the foundation. It is important to identify trends with dashboards and reports and to be able to act on them. But if reporting eats up all your time and you have no capacities left to analyse findings and problems, you lack the basis for optimisation. After all, that is the raison d’être of analytics: Optimising! And doing so in a continual loop of report -> analyse -> identify measures -> implement measures, and then again from the top.
This, however, requires a culture of optimisation that everyone subscribes to. Establishing that is a job – THE job – of a digital analyst. That is why it is important to have the right people first, and the right tools second. Of course, you need the right tools, and usually, these cost money. But what good are the tools without people who know how to analyse the data and harness the full potential of the tools?
It takes a lot to turn analytics into an indispensable pillar for your company’s success. In most cases, however, you do not need new tools in order to be successful. Rather, you need a solid foundation providing reliable data relevant to your goals and people who can insightfully analyse the data, harnessing the full potential of the tools available to them.
Contact for your Digital Solution with Unic – Book a free Consultation Session
Are you keen to discuss your digital tasks with us? We would be happy to exchange ideas with you: Jörg Nölke and Gerrit Taaks (from left to right).