The Use Case For Project Analytics

In this final article of the Project Analytics series, having covered how to launch your Project Analytics capability, we explore what each category of user will do with these new capabilities now that the skills, processes and technology are finally in place.

There are typically four basic use cases that we observe when building out a Project Analytics capability: 

1. Report Consumers 

2. Analysts 

3. Partners/Advanced Analysts 

4. Machine Learning 

Use Cases for Report Consumers: 

By this stage, report consumers are likely to be loving the analytics and reporting capabilities you have created. They should now have fast, high-quality reports and analysis that deliver the specialist insights they need.

It should be relatively easy to find use cases for this group simply by exploring the information you report on today. Your existing reports are most likely helpful but perhaps in need of improvement.

However, enhancing your existing reports is not always as straightforward as it sounds.

The moral here is that ‘reporting democratisation’ can sound like a great idea, but it can come back to bite you if left unchecked. Making ‘self-service’ reporting work is more of an operational and cultural challenge than a technical one.

Use Cases for Analysts: 

By this stage, your analysts should be delighted with the project analytics capabilities you’ve created. However, as adoption increases, you’ll need to think carefully about what controls you need to put in place for this community. 

Analysts are often keen to build complex logic with data visualisation tools (e.g. Power BI). Packing excessive data processing into the visualisation layer can soon become a maintenance and configuration headache if left unchecked.

One solution is to give the analysts their own project analytics ‘sandpit’ to build new reports, test different datasets, and ensure any changes don’t negatively impact the wider production environment. You must also provide awareness training and possibly additional tooling to ensure that the relevant policies are understood and followed. 

Where data is ‘blended’ or processed from non-standard sources, you need to set clear guidelines. This would include tagging, or flagging, your reports and analytics so that any data from an approved source is identified as trusted, while ‘sandpit’ reports that lack assured provenance and data quality are clearly marked as such.
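To make this concrete, here is a minimal sketch of what such tagging could look like in Python; the metadata fields, labels and source names are purely illustrative, not an established standard:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical provenance tag attached to every published dataset.
# The field names and labels are illustrative, not a standard.
@dataclass
class ProvenanceTag:
    source_system: str     # where the data came from
    approved_source: bool  # True only for governed, assured pipelines
    tagged_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

    @property
    def trust_label(self) -> str:
        # Label surfaced next to the report or dataset in the catalogue.
        return "TRUSTED" if self.approved_source else "SANDPIT / UNASSURED"

# Example: a governed extract versus an analyst's sandpit blend.
governed = ProvenanceTag("p6_schedule_extract", approved_source=True)
sandpit = ProvenanceTag("analyst_blend.xlsx", approved_source=False)

for tag in (governed, sandpit):
    print(f"{tag.source_system}: {tag.trust_label}")
```

However it is implemented, the point is that trust status travels with the data, so a report consumer can see provenance at a glance.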

Finally, you will need to consider how to provide support to this community. Again, this is less a technical headache and more a cultural requirement. Solutions here include forming communities of practice or forums to help spread knowledge amongst your internal analysts.

Use Cases for Partners and Advanced Analysts: 

By now, you will have covered off 80% of your project analytics use cases, but some people will want more and need to go to the raw data source. There is enormous value to be had when sharing your data with partners along the data supply chain. 

It is certainly worth highlighting the Project Data Analytics Task Force, which has made considerable progress in formulating ideas and approaches around increased data sharing for project analytics. 

Ideally, it makes sense to start with a limited use case of certain suppliers providing data, then extend your partner model for data ingestion/sharing over time. The merits of bilateral information sharing are clear for all to see, but you must ensure the appropriate governance and controls are in place from the outset. 

Use Cases for Machine Learning and Artificial Intelligence: 

There is no doubt that the potential for leveraging machine learning and AI is enormous, but you need to start with a substantial pool of quality data to train your AI/ML models. 

Even if your organisation runs many projects, that is likely to be too sparse a data set for effective machine learning compared with other industries, where models are trained on thousands or millions of data points.

You can drop down into sub-components to gather more data, for example, by taking more granular time-slices such as monthly reports. 
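As a rough illustration of the time-slicing idea, assuming you hold dated project snapshots (the column names below are invented), a few lines of pandas turn each project-month into its own observation:

```python
import pandas as pd

# Hypothetical snapshot table: one row per project per reporting date.
snapshots = pd.DataFrame({
    "project_id": ["P1"] * 4 + ["P2"] * 4,
    "snapshot_date": pd.to_datetime(
        ["2023-01-15", "2023-02-15", "2023-03-15", "2023-04-15"] * 2
    ),
    "pct_complete": [10, 25, 45, 70, 5, 20, 30, 55],
})

# Slice by month: every project-month becomes a separate observation,
# multiplying the rows available for model training.
snapshots["month"] = snapshots["snapshot_date"].dt.to_period("M")
monthly = (
    snapshots.sort_values("snapshot_date")
    .groupby(["project_id", "month"], as_index=False)
    .last()
)
print(monthly)  # 8 project-month rows instead of 2 project rows
```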

Whatever approach you take, your models need to run on large data sets to produce statistically meaningful findings.

Another aspect to consider is the ‘black box’ nature of machine learning versus something that is more transparent and helps the users clearly understand how the model derived a result. Generally, we would recommend opting for a more transparent model to increase confidence. 

Machine learning does allow you to gauge the likely outcome of a project by learning from past performance across various feature categories. For example, you can find correlations between the quality of the information entered, how well a project is run, and the eventual project outcome.
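As a sketch of how this might look (the features, figures and outcomes below are entirely made up), a simple logistic regression also illustrates the transparency point above: its coefficients show how each feature, including a data-quality score, pushes the predicted outcome:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented training data: one row per past project.
# Columns: [data_quality_score, baseline_changes, pct_milestones_hit]
X = np.array([
    [0.9, 1, 0.95], [0.8, 2, 0.90], [0.4, 7, 0.55],
    [0.3, 9, 0.40], [0.7, 3, 0.80], [0.2, 8, 0.35],
])
y = np.array([1, 1, 0, 0, 1, 0])  # 1 = project delivered successfully

model = LogisticRegression().fit(X, y)

# A transparent model: the coefficients expose how each feature
# drives the prediction, rather than hiding it in a black box.
features = ["data_quality_score", "baseline_changes", "pct_milestones_hit"]
for name, coef in zip(features, model.coef_[0]):
    print(f"{name}: {coef:+.2f}")

# Gauge the likely outcome of a new project from the same features.
print("P(success):", round(model.predict_proba([[0.6, 4, 0.7]])[0, 1], 2))
```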

It makes sense to exploit external data to help train more accurate machine learning models. If you’re in the business of building bridges, for example, you may only deliver a modest number of projects each year, so you’ll need to supplement your own records with wider industry data to assemble a large enough sample of bridge-building projects for an accurate model.

Whatever data you source for machine learning, you will still need to invest in the appropriate skills and technology to execute correctly and deliver impactful use cases. Standard, off-the-shelf visualisation tools are unlikely to be sufficient for machine learning, so you may need to invest in platforms such as Databricks.

Additional Use Cases: 

One side benefit of all this project data is that you effectively construct a Digital Twin of your operation. You can observe precisely what is happening across the entire project lifecycle. When you have project snapshot data, you can analyse the project over time, creating a powerful resource to improve the business. 

You can start to see how often people are updating the project data, altering the project baseline, and maintaining an accurate commentary of project updates. 
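For example, assuming monthly snapshots that record a baseline finish date (the column names here are hypothetical), a short pandas query can surface how often each project was re-baselined:

```python
import pandas as pd

# Hypothetical snapshot history: baseline finish date per project per month.
history = pd.DataFrame({
    "project_id": ["P1", "P1", "P1", "P2", "P2", "P2"],
    "snapshot_date": pd.to_datetime(
        ["2023-01-31", "2023-02-28", "2023-03-31"] * 2
    ),
    "baseline_finish": pd.to_datetime([
        "2024-06-30", "2024-06-30", "2024-09-30",  # P1 re-baselined once
        "2024-03-31", "2024-05-31", "2024-07-31",  # P2 re-baselined twice
    ]),
})

# Count how many times the baseline moved between consecutive snapshots.
changes = (
    history.sort_values("snapshot_date")
    .groupby("project_id")["baseline_finish"]
    .apply(lambda s: int((s != s.shift()).iloc[1:].sum()))
)
print(changes)  # P1: 1, P2: 2
```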

Be careful not to leap to conclusions too soon, but gathering the right type of information will be extremely helpful for monitoring and improving data quality, and for building confidence in project assurance and integrity.

Creating high-quality historical project data has proven to add value far beyond machine learning and AI. As discussed, building a Digital Twin of a major project can deliver more value than the allure of AI and machine learning alone.

Managing the Culture Change of Project Analytics:  

Just because you’ve invested heavily in the technology and tools of project analytics doesn’t necessarily mean people will share the same passion or even use them.  

Project analytics is far more than a ‘tech project’; you need to get people using the tools by highlighting how they positively impact their working lives and deliver better outcomes for all concerned. 

You will need to build workforce and management maturity and resilience, particularly in relation to coping with the inevitable discovery of issues and problems that shine a light on poor performance or other ‘skeletons in the closet’. There will always be bumps along the way, so try to avoid adversarial scenarios where each party is trying to prove the other wrong.

Finally, you want to instil a culture that becomes progressively less tolerant towards those who refuse to use the approved tools and revert to doing things ‘their way’.   

You will need to go out of your way to validate your findings and demonstrate that the information your project analytics capability provides is trustworthy and defensible. Expect some hard manual work in the near term to provide transparency, and don’t underestimate people’s desire to go back to their trusty spreadsheets.

In short, culture change takes time, planning and persistence. 

The Journey Never Ends 

We started this series in the ‘foothills’ of project analytics and climbed steadily through the various peaks of expertise required to deliver a fully operational capability. 

The reality is that even at this late stage, you’re still at the beginning of a long journey. 

What comes next will first be shaped by changes in scope.

There are plenty of factors and dimensions that influence where you go next, and of course, as your approach changes and matures, you’ll find that you’ve constructed a highly competent team that is capable of adapting. 

By this point, your project community will have become far more aware of project analytics, and many of them will have undergone new skills training.

You will also have created more intelligent customers, so now may be the time to invest in additional functionality, because you will be better informed about the options and benefits available.

What Next? 

There are no silver bullets or ‘plug and play’ solutions for project analytics because every organisation is unique. 

However, the building blocks exist. They have been proven and are relatively straightforward to build out. 

As we’ve highlighted in this series, the solution requires a mix of technology, business expertise and user engagement, but it is actionable and within reach. 

The key is to start small, and the rest of your journey will write itself.