Digital Analytics Data Governance: Don’t Let The Ghosts of Implementations Past Haunt You

At the start of an analytics platform implementation, such as Adobe Analytics or Google Analytics, you typically have clear and well-defined goals, configuration, and documentation. Over time, however, it’s easy for all of these to fall into disrepair: the original architect of the implementation leaves without handing over their process, other priorities get in the way of maintaining historical aspects of the implementation, or no long-term process is put in place after the initial configuration.

Based on what we’ve seen, this is a very common scenario during and after analytics implementations. If you see yourself or your company in this description, rest assured that you can remediate your analytics architecture and governance with time, effort, and the right people in place. You’ll be taking a bigger risk if you look the other way and apply band-aids to the implementation while the analytics foundation suffers.

Auditing Your Analytics Implementation

You typically hear about the concept of technical debt in conjunction with development work, but it applies to analytics as well. As you apply easy one-off solutions over time without a long-term process, the work required to maintain the analytics implementation grows. A tipping point can arrive where something critical breaks because of the accumulated complexity, spurring renewed focus on the health of the underlying process. This cycle can repeat until there is a reliable method for building long-term thinking into the implementation approach.

During the initial audit, you may risk throwing out code or variable configuration that was well thought through simply because it appears strange or complex at first glance. This is a tough tightrope to walk and requires trained eyes to review all of the interconnected parts so that nothing gets lost in the process. As you audit, keep Chesterton’s Fence in mind: the principle that “reforms should not be made until the reasoning behind the existing state of affairs is understood.” In other words, a critical step of any audit is reviewing all parts of the platform and speaking with those involved where possible. Without knowing why something is present, deleting it carries a risk with unknown consequences.

Download Blue Acorn iCi’s “Amplify the Customer Experience with Analytics” whitepaper to learn how you can use analytics to enhance the entire customer journey.

If you’re using Adobe Analytics, you might find old code lurking in places you don’t expect: legacy plugins in your AppMeasurement file, custom code blocks in long-forgotten tag manager rules, or processing rules configured by someone who has since left.
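
To make this kind of sweep more systematic, a short script can scan an exported rule library for references to variables you believe are deprecated. The sketch below is one illustrative approach, not a prescribed tool: the export file name, its JSON structure (including the customCode field), and the list of deprecated variables are all assumptions you would replace with your own.

```python
import json
import re
from pathlib import Path

# Hypothetical list of variables you suspect are deprecated (assumption for illustration).
DEPRECATED_VARS = ["eVar12", "prop7", "s.products"]

def scan_export(path: str) -> None:
    """Scan an exported rule library (assumed to be a JSON list of rules) for deprecated variables."""
    rules = json.loads(Path(path).read_text())
    for rule in rules:
        code = rule.get("customCode", "")  # field name is an assumption about the export format
        hits = [v for v in DEPRECATED_VARS if re.search(re.escape(v), code)]
        if hits:
            print(f"Rule '{rule.get('name', 'unknown')}' references: {', '.join(hits)}")

if __name__ == "__main__":
    scan_export("launch_rules_export.json")  # hypothetical file name
```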

As these factors build up, you run the risk of collecting and storing unreliable data, particularly if the analytics team cannot explain where it comes from or why it looks a certain way. Rebuilding trust in data requires a thorough audit, repeatable processes that survive turnover to keep the data clean, and a dedicated team with strong communication skills.

If you don’t regularly audit the implementation for accuracy and proper functioning, users can also get confused and distrust the data. It is important to regularly review props, eVars, and events (all custom aspects of your Adobe Analytics implementation) and correct or disable any configurations that are problematic.
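
One lightweight way to operationalize that review is to summarize, for each custom dimension, how often recent values come back empty or “Unspecified” and flag the worst offenders. The sketch below assumes a hypothetical CSV export with dimension, total_hits, and unspecified_hits columns; the threshold is also an assumption you would tune.

```python
import csv

EXPORT_FILE = "variable_health.csv"   # hypothetical export summarizing dimension usage
UNSPECIFIED_THRESHOLD = 0.5           # flag dimensions where >50% of values are unusable

def flag_unhealthy_dimensions(path: str) -> list[str]:
    """Return dimensions whose collected values are mostly empty or 'Unspecified'."""
    flagged = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            total = int(row["total_hits"])
            bad = int(row["unspecified_hits"])
            if total and bad / total > UNSPECIFIED_THRESHOLD:
                flagged.append(f'{row["dimension"]}: {bad / total:.0%} unspecified')
    return flagged

if __name__ == "__main__":
    for line in flag_unhealthy_dimensions(EXPORT_FILE):
        print(line)
```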

If your users regularly try to build reports but find that most of their selections result in incomplete or empty data sets, they’re less likely to keep working with the data in the future. User education and variable maintenance are key components of a solid governance plan. Along those lines, it’s critical for users to have a forum where they can regularly discuss their questions and learn about variable updates so they know how to use them.

Finally, bad data prevents accurate personalization, since the information you have about a user depends on the quality of data collection in place. Inaccurate personalization can negatively impact user experience on your site, leading to decreased sales.

Related: Learn how to use analytics to create personas and win the moments that matter.

Maintaining Your Analytics Governance

To keep your analytics governance running smoothly, it’s necessary to include components like a solid user management strategy, alerting strategy, analytics release process, and implementation review cadence. We recommend conceptualizing each of these components before you complete your implementation and putting them into action immediately post-launch.

User Management

A user management strategy is critical for security purposes: it ensures that former employees and contractors no longer have access to data and that new users can be onboarded consistently and quickly.
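
A simple way to enforce this is to regularly reconcile the analytics user list against an up-to-date roster of active employees and contractors. The sketch below assumes hypothetical CSV exports (one from your analytics admin console, one from HR) that each contain an email column; the file names and layouts are assumptions for illustration.

```python
import csv

def load_emails(path: str, column: str) -> set[str]:
    """Read a single column of email addresses from a CSV export."""
    with open(path, newline="") as f:
        return {row[column].strip().lower() for row in csv.DictReader(f) if row.get(column)}

def find_stale_accounts(analytics_users_csv: str, active_roster_csv: str) -> set[str]:
    """Return analytics accounts that no longer match an active employee or contractor."""
    analytics_users = load_emails(analytics_users_csv, "email")
    active_people = load_emails(active_roster_csv, "email")
    return analytics_users - active_people

if __name__ == "__main__":
    # Hypothetical exports: one from the analytics admin console, one from HR.
    stale = find_stale_accounts("analytics_users.csv", "active_employees.csv")
    for email in sorted(stale):
        print(f"Review and deactivate: {email}")
```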

Alerting Strategy

Alerting helps you understand key changes in the shape of your data over time: it calculates variance and statistical significance on a continuous basis and sends messages to the users you specify when data looks out of bounds. We recommend combining this automated approach with periodic manual validation to test items after releases and to keep a human eye on the expected shape of the data. Keep in mind, there is a balance in the number of alerts: too few and you might miss something important; too many and you’ll stop looking at them.
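
There are many ways to implement the automated side; one minimal sketch is a rolling z-score check over a daily metric, flagging days that deviate sharply from the trailing baseline. The window size, threshold, and sample data below are assumptions you would tune to your own traffic patterns.

```python
import statistics

def out_of_bounds(daily_values: list[float], window: int = 28, z_threshold: float = 3.0) -> list[int]:
    """Flag indexes where a day's value deviates sharply from the trailing window.

    A simple rolling z-score check; the window size and threshold are assumptions
    to tune against your own traffic patterns.
    """
    flagged = []
    for i in range(window, len(daily_values)):
        baseline = daily_values[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.stdev(baseline)
        if stdev and abs(daily_values[i] - mean) / stdev > z_threshold:
            flagged.append(i)
    return flagged

if __name__ == "__main__":
    # Hypothetical daily order counts with a sudden drop on the final day.
    orders = [120, 118, 131, 125, 122, 119, 127] * 4 + [15]
    print(out_of_bounds(orders))  # flags the final index
```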

Analytics Release Process

An analytics release process ensures that users know when to expect new items and changes to the implementation. Unpredictable changes, by contrast, erode users’ comfort with the platform.

Implementation Review Cadence

Implementation review cadence is also critical from a predictability perspective. Publicizing that there is a schedule and team reviewing the implementation will build confidence in your data quality and reporting. Additionally, it’s key to give users the ability to flag issues with the data, as they may be closer to the expected values and need a mechanism to relay any concerns.

We’ve seen and implemented positive changes in these areas for our clients and have seen the difference it makes in user trust, data quality, and implementation longevity. We’d love to discuss these challenges with you if your team can relate. Reach out to us today to see how we can help.