Oakland Case Study: A Global Telco

When a Global Telco wanted to enhance the data governance and data quality assurance of its financial reporting capability, it was reluctant to buy off-the-shelf software that wouldn’t fit the varied needs of its vision for Data Excellence.

Instead, Oakland devised an innovative assurance solution that helped to transform data operations performance, but more importantly, deliver daily insights that the business can trust and rely on.

The mission: data excellence

Like many organisations, the financial data of this Global Telco is extensive and complex, serving numerous business functions before flowing into an Enterprise Data Warehouse and Data Mart architecture. The data is then available for business intelligence and deeper analysis, creating a timely and accurate view of financial performance for tactical and strategic decision-making.

When it comes to translating data into intelligence, this organisation's finance department had been advancing its data excellence mission for several years along three pillars:

  1. Creating a trusted single source of business intelligence
  2. Ensuring data insights are reliable and timely
  3. Delivering a compliant and sustainable business intelligence capability

One of the critical foundations of this mission has been the need to build trust across the financial intelligence required by the business.

The Business Intelligence Manager for the organisation, a key enabler of the data excellence journey, outlined the implications of building a trusted information resource:

“Delivering a digestible and simple user experience that provides transparency on the creation of final numbers is key to ongoing engagement, adoption and growth. When users truly trust the process, share and discuss the meaning of the number, this creates the environment where valuable business decisions can be made quickly and conversations about the quality of the number end.”

Business Intelligence Manager, Global Telco

Given the volume and variety of data flowing in and out of such a large-scale analytics environment, the quality of data had to be assured as it moved through the complex web of data pipelines and processing stages.

If data from source systems were delayed (or failed to load), the jobs that extract, transform and load (ETL) billions of records into the enterprise data warehouse could produce incomplete datasets with knock-on quality issues. Controls and checks were needed to periodically reconcile the numbers held in the data warehouse; otherwise, any decision-making based on them would be flawed.
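To make that idea concrete, the sketch below is illustrative only (not the client's actual controls): it shows the kind of reconciliation check involved, comparing record counts and summed amounts between a source extract and the corresponding warehouse load, with a made-up tolerance threshold.

```python
import pandas as pd

def reconcile(source_df: pd.DataFrame, warehouse_df: pd.DataFrame,
              amount_col: str = "amount", tolerance: float = 0.005) -> dict:
    """Compare row counts and summed amounts between a source extract
    and the corresponding warehouse load, flagging any mismatch."""
    src_rows, wh_rows = len(source_df), len(warehouse_df)
    src_total = source_df[amount_col].sum()
    wh_total = warehouse_df[amount_col].sum()

    # Relative drift in the summed amounts (guard against divide-by-zero)
    drift = abs(src_total - wh_total) / max(abs(src_total), 1e-9)

    return {
        "rows_match": src_rows == wh_rows,
        "amount_drift": drift,
        "within_tolerance": drift <= tolerance,
    }

# Illustrative data: one record failed to load into the warehouse
source = pd.DataFrame({"amount": [100.0, 250.0, 75.5]})
warehouse = pd.DataFrame({"amount": [100.0, 250.0]})
print(reconcile(source, warehouse))
# {'rows_match': False, 'amount_drift': 0.177..., 'within_tolerance': False}
```

In practice a check like this would run after every load and feed its result into the assurance monitoring described below, rather than printing to a console.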

From an operations perspective, the Business Intelligence team was keen to provide a 'self-service' assurance monitoring platform that both operations and business teams could use to quickly identify the source of data assurance issues without raising multiple support tickets for the same problem.

In the past, there had been frustration at the length of time required to resolve processing issues, given the variety of data sources and pipelines involved. The Business Intelligence team wanted a faster way to identify and resolve problems, freeing them up to deliver the higher-value services required of a skilled analytics team.

It was also apparent that, given the complexity of the data pipelines within the analytics environment, building a robust system to track the lineage and performance of each data pathway was paramount to long-term assurance, both for internal governance and any external regulatory demands.

Applying process quality to data analytics pipelines

The organisation realised that selecting an off-the-shelf tool to tackle their data assurance challenge wasn't an option.

Following discussions with Oakland, there was a ‘meeting of minds’ around the importance of process quality for data management which helped guide the initiative forward.

Oakland has a long history of operating at the intersection of process and data, possessing deep expertise in what it takes to apply strict process quality controls to complex data pathways.

Andy Crossley, Director at Oakland, explains:

“When working with data-driven companies like this, the thousands of data pathways and processes these organisations possess closely resemble the infrastructure of an engineering plant, factory or rail network. We discovered that process quality techniques, such as Six Sigma and Statistical Process Control, are ideal for assuring the quality of data processes. Our experience of working with large complex businesses has highlighted that the benefits of continual improvement, cost reduction and operational efficiency are achieved when the techniques of process quality and information quality are intertwined.”

Andy Crossley, Oakland
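As a rough illustration of applying Statistical Process Control to a data pipeline (the metric, figures and limits below are hypothetical, not drawn from the client's environment), a Shewhart-style check computes control limits from a stable baseline of daily load volumes and flags any day that falls outside them:

```python
import statistics

def control_limits(history: list[float], sigma: float = 3.0) -> tuple[float, float]:
    """Classic Shewhart-style limits: baseline mean +/- 3 standard deviations."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return mean - sigma * stdev, mean + sigma * stdev

# Illustrative baseline: daily record counts loaded by one pipeline
baseline = [9_950, 10_020, 10_100, 9_980, 10_050, 9_990, 10_030]
lower, upper = control_limits(baseline)

todays_volume = 7_200  # e.g. an upstream feed only partially arrived
if not (lower <= todays_volume <= upper):
    print(f"Out of control: {todays_volume} outside [{lower:.0f}, {upper:.0f}]")
```

The same technique can be applied to any pipeline metric with a stable baseline, such as load duration, null rates or reconciliation drift, turning a stream of data processes into something that can be monitored like a production line.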

Starting with a proof of concept

The initial goal was to create a demonstrator that simulated how the data analytics environment would be improved through process quality management techniques.

This approach was encouraged by our client, who was reassured that Oakland was delivering a best-fit solution, as opposed to off-the-shelf software that may appear attractive but fail to address the complex data assurance challenges of a Global Telco.

Walking the process

Drawing on our process experience, the Oakland team started by ‘walking the (data) process’ to understand the dependency that each business unit had on the data analytics results and any impacts that resulted from upstream process failures.

For example, we encountered an occasional lack of trust around the data within some user groups. We found that some store managers were tempted to bypass the business intelligence platform if their sales figures didn’t appear to match reality. The importance of building trust in the data was once again plain for all to see.

Walking the process helped to clarify how different users preferred to consume operational information given the distinct levels of data literacy across the business. The last thing we wanted was to burden store managers and other users with a cumbersome operational analytics tool that no one would use or understand.

The pilot outcome: creating a roadmap for the way forward

The pilot helped determine a strategy for productionising the data assurance demonstrator, prioritising the features and capabilities required, mapped against each team's requirements.

By applying the same process quality and data engineering expertise we had honed over many years within the utilities and transport sectors, we helped our client build a foundation for visualising and tracking the movement of data from source systems, across the entire data processing infrastructure, and on to the target analytical environment.

From concept to completion (and beyond)

The data assurance platform is now fully operational and creating substantial benefits within the organisation.

The (behind-the-scenes) data lineage foundation supports a complete visual history of quality assurance, from top-level business KPIs (such as profitability, churn, and commissions), across the connecting data flows and down to the core data subject areas.

Business and operational leaders are immediately alerted to a drop in process or data quality performance, with the system identifying which data source or pipeline is responsible.
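To illustrate the idea in the simplest possible terms (the lineage graph, node names and statuses below are hypothetical, not the client's), a quality failure on a business KPI can be traced upstream through recorded lineage to the deepest failing node, which is the likely root cause:

```python
# Hypothetical lineage: each node lists the upstream nodes it depends on
lineage = {
    "kpi_profitability": ["mart_finance"],
    "mart_finance": ["edw_billing", "edw_commissions"],
    "edw_billing": ["src_billing_feed"],
    "edw_commissions": ["src_partner_feed"],
}

# Latest quality status per node (True = passed its checks)
status = {
    "src_billing_feed": True,
    "src_partner_feed": False,   # e.g. the feed arrived late and incomplete
    "edw_billing": True,
    "edw_commissions": False,
    "mart_finance": False,
    "kpi_profitability": False,
}

def failing_roots(node: str) -> set[str]:
    """Walk upstream from a failing node and return the deepest failing
    ancestors, i.e. the likely root causes of a KPI quality drop."""
    roots = set()
    for parent in lineage.get(node, []):
        if status.get(parent, True):
            continue                      # this branch is healthy, skip it
        deeper = failing_roots(parent)
        roots |= deeper if deeper else {parent}
    return roots

print(failing_roots("kpi_profitability"))
# {'src_partner_feed'}
```

A real lineage store would cover thousands of pipelines and hold historical statuses, but the traversal principle behind "which data source or pipeline is responsible" is the same.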

The data assurance platform acts like a ‘Time Machine’, tracking historical performance and fault management to pinpoint trends and emerging issues that would have been impossible to discover in the past.

But despite the inherent complexity of coordinating the assurance of thousands of data processes and data pipelines, Andy Crossley explains how the final interface was designed with the business in mind:

“During this project, we obsessed over the one question every senior decision-maker wants to know: ‘Can we trust our numbers?’ The final interface delivers on that goal, providing an accurate metric for trust that the business understands. To enhance data literacy, we’ve purposefully hidden the complexity and made the final interface so simple that the business requires limited training to get the answers they need, based on their specific role within the organisation.”

Andy Crossley, Oakland

Progressing the data governance journey

The data assurance project demonstrated that, despite the massive data complexity encountered, organisations can successfully govern even the most challenging data environments.

Of course, data governance hasn’t been established solely through technical means. The inclusion and support of key stakeholders with a vested interest in data assurance has helped accelerate the data governance mission.

Today, there are plans to extend the data assurance platform into other parts of the organisation, broadening the reach of data governance to ensure trusted data and confident decision-making.

Would you like to learn more about how Oakland can help with your Data Governance and Data Quality Assurance requirements?

Please book a call with one of our team to discuss your challenge:
hello@theoaklandgroup.co.uk
0113 234 1944