For part three, I want to focus a bit on how we got here.
I’ve often been heavily involved with complex implementations, going through constant tagging iterations. I’ve observed the following challenges (including, but not limited to):
- There are interactions with content/customers that are not tracked. In the best situations, these interactions have been deemed “not important to the business”.
- Defining what to capture takes time ($$$).
- Transforming those requirements into actual collection takes time ($$$).
The challenges are experienced to varying degrees. A lot of folks are doing a great job. There are digital analytics teams who have much of their implementation work automated, whether through a tag management solution (TMS), via proprietary scripting, or through their CMS platform. There are data layer standards (!!!). There are teams out there working so closely with developers that the entire implementation is integrated with the development process.
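To make the data layer mention concrete: the W3C's Customer Experience Digital Data Layer (CEDDL) standard defines a global `digitalData` object that tags read from instead of scraping the DOM. A minimal sketch, with made-up page values for illustration:

```javascript
// Illustrative sketch of a W3C CEDDL-style data layer.
// The global name "digitalData" comes from the standard;
// the page values below are hypothetical.
const digitalData = {
  pageInstanceID: "homepage-prod",
  page: {
    pageInfo: { pageID: "home", pageName: "Homepage" },
    category: { primaryCategory: "landing" },
  },
};

// A TMS or proprietary script reads the layer rather than
// scraping markup, so tagging survives front-end redesigns.
function pageNameForTag(layer) {
  return layer.page.pageInfo.pageName;
}
```

The point of the standard is exactly the decoupling shown here: developers populate one agreed-upon structure, and the tagging layer consumes it.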
There are a lot of teams, however, struggling with all three of these… and I expect that #1 affects any complex implementation, whether fully automated or manually integrated every step of the way. This is one of two places I am coming from when I say that the web/digital analytics implementation will die: that we’re doing things backwards, and should be collecting as much as possible.
Even if there isn’t an organizational feeling that more data would be helpful, I’m sure the following scenario is familiar to many:
- something that could have been captured was not being captured
- you started capturing it
- you wished you could have the data retroactively
From one organization to the next, there are variations on these themes. Some hold publishing for analytics. Some push to production, foregoing anything but base analytics until custom tags can be applied. Some don’t need to touch anything by virtue of their implementation, but may experience other limitations and missing data.
Where I think there is “danger” (you know, analytics danger!), is where time isn’t set aside to consider what could be missing.
“There wouldn’t be time to interpret other data; there isn’t time to interpret all of the data we have today.”
Watch out. The danger! Slippery slopes! Organizational bad habits!
“Everyone is so busy. We don’t use all the data we have, so why bother collecting more. It will just cost more money to collect more data that won’t be being used.”
There can be truth to that, but please read on… at a certain level, in certain markets especially, it becomes necessary to have more data to compete at a high level. (And don’t take it from me… please. Use your favorite search engine and search for some combination of data/analytics terms with the words proprietary, competitive, and advantage all thrown in the mix. This looks like a fantastic start.) While I find this quite compelling, and I hope you do, it’s not even the most compelling point to many.
The amount of time/effort(/$$$) being spent on collecting data is something I would quantify. Often, the results of this labor are thrown away, replaced with the next set. Would the time not be better spent on developing for analytics, and on analytics development? On things that are lasting, that sit on top of the organization and the data? Tool-specific efforts are exactly that.
When the Gods of Web Analytics spake, they set forth a rule: you spend on people and process ahead of technology. 10/90 even. When talking about spending 90% on people… this isn’t what the Gods intended. Implementation is supposed to be included in the 10%. Compounding the problem, by defining requirements that constrain collection, and by not capturing data objectively, a level of bias is introduced into the data being collected. This is where time should be set aside to consider what could be missing and how to bring it into the fold.
Your data is unique, and you are the only one who has it (insert NSA hilarity here). Your data provides a competitive edge, if you have the ability to find the appropriate signals. The ability comes from business intelligence, from data science. Eric Peterson, nearly four years ago, described the “coming bifurcation in web analytics tools,” where there is a need for low-level access to basic business reporting, and also a need for sophistication (BI/data science) in digital analytics tools for the enterprise. Four years later, there are still plenty of organizations that are just dipping their toes into the waters of full data integration.
I understand that many are not involved in an intense competition. They’re just analyzing their web data. Those businesses can carry on with GA. As Eric noted, it’s great that they get a pretty in-depth solution for free in GA and can spend their money on people to make sense of that data.
For others, organizations already piping data from web analytics into a data warehouse, a marketing database, a centralized system of any name… if you don’t have data scientists complaining about data collection today, you will have data scientists complaining about data collection tomorrow. Today, it may just be enough to have that holistic view in place. Tomorrow, there will be questions.
In my coming predictions post, I will predict that Google will include this methodology in 1-2 years. You will be able to turn it on, capture everything, and sort it out later. Out of the short list of predictions I am making for 2014, I feel that this one is nailed. Automatic event tagging already came to us in 2013. As data capture enhancements are made, more businesses will consider migrating to premium to get at all of the data. We’ll see more work in this direction in 2014, and by the end of 2015 Google Analytics will automatically integrate with every CMS and eCommerce platform via “capture everything”.
This has been the third part in a series related to the future of web analytics vendors. The clickbait title up to now stuck to a “web analytics implementation is dead” theme, which is funny because I insist that I am not a fan of the meme. We’re also talking about more than the web, and more than implementation. I chose “web” so as not to declare the digital analytics implementation dead. There is already a dichotomy in mobile data collection. Event streams are not really a new concept for gaming and app analytics — it just costs more to store more, and so tailored implementations do happen out of necessity. For web analytics, however, “implementation” means something pretty specific most of the time.
The web is no longer so unwieldy that we can’t capture full event streams.
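To sketch what “capture everything, sort it out later” might look like in practice: rather than tagging individual interactions up front, every interaction lands in a raw event stream, and analysts filter it retroactively. All names here are hypothetical, not any vendor’s API:

```javascript
// Hypothetical sketch of a "capture everything" event buffer.
// Every interaction is recorded raw; reporting happens later.
const eventStream = [];

// Record any interaction as an untyped event with a timestamp.
function capture(type, detail) {
  eventStream.push({
    type,           // e.g. "click", "scroll", "form_submit"
    detail,         // whatever context is cheap to grab now
    ts: Date.now(), // timestamp for later sessionization
  });
}

// Later (or server-side), the raw stream is sorted into whatever
// "report" the business needs -- including ones nobody defined
// when collection started.
function eventsOfType(type) {
  return eventStream.filter((e) => e.type === type);
}

// In a browser, a single generic listener could feed the buffer:
//   document.addEventListener("click", (e) =>
//     capture("click", { tag: e.target.tagName }), true);

capture("click", { tag: "A" });
capture("scroll", { y: 400 });
capture("click", { tag: "BUTTON" });
```

The design choice is the inversion this series argues for: requirements no longer gate collection; they only gate analysis, so the “wish we had captured that retroactively” scenario goes away.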
This is the death of the “death of the web analytics implementation” series, though the related topics will live on. I hope you will come with me… thanks for reading!
EDIT: Thanks for stopping by. This "web/digital analytics implementation death" thing became a series. There were four posts: http://www.tbdac.com/web-analytics-implementation-dead/ http://www.tbdac.com/web-analytics-implementation-die-part-2/ http://www.tbdac.com/death-web-analytics-implementation-part-3/ http://www.tbdac.com/year-expansion-in-digital-analytics/ This will continue to be a theme of my analytics hygiene posts, and presentations at eMetrics Chicago and... (Boston?)... and...?