
The 4 Most Common Mistakes in Data Analytics Projects

Written by George Garforth | January 2022

Data analytics projects at energy companies seem to be never-ending. Many take much longer than anticipated, while others fail outright, either never reaching completion or never delivering the results that were expected. Data analytics projects start with great intentions, usually with the hope of reducing asset downtime, and there are indeed significant gains to be made by improving your analytics. So where do the projects go wrong?

The answer lies in a few common areas: data quality, project scope, human processes, and resource allocation. There is also a common belief (or possibly wishful thinking) that upgrading your analytics solution will be a silver bullet that eliminates downtime. The reality is that a data analytics project must be approached holistically and with a problem-solving mindset to uncover all of the gaps and opportunities in your operations. This article covers the most common mistakes energy companies make when taking on data analytics projects so that you can confidently avoid them.

 

Mistake #1: Assuming your data is high quality

Energy companies collect a lot of data, and not all of that data is useful, let alone correct. Yet energy companies often start data analytics projects assuming their data is in great shape, and that all they need is a better analytics solution to run calculations on that data and flag issues ahead of downtime.

The reality is that when companies enter a data analytics project without first validating and cleansing their data, they end up stalling the project and adding services expenses to get this done. Invariably, data quality problems are found: inaccurate or missing values, inconsistent records, and useless data that only creates noise. All of these problems need to be addressed before an analytics solution can meaningfully reduce downtime.
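As an illustration only (a minimal sketch, not any specific vendor's methodology), the snippet below shows the kind of validation pass that surfaces these problems before the analytics work begins. It assumes time-stamped sensor readings exported to a CSV; the file and column names (turbine_sensor_export.csv, timestamp, bearing_temp_c) are hypothetical placeholders.

```python
import pandas as pd

# Hypothetical export from a historian/SCADA system; adjust names to your data.
df = pd.read_csv("turbine_sensor_export.csv", parse_dates=["timestamp"])

report = {
    # Missing or incomplete data
    "missing_values": df.isna().sum().to_dict(),
    # Inconsistent data: the same timestamp recorded more than once
    "duplicate_timestamps": int(df["timestamp"].duplicated().sum()),
    # Inaccurate data: physically implausible readings (example range for a temperature tag)
    "out_of_range_temp": int(((df["bearing_temp_c"] < -40) | (df["bearing_temp_c"] > 200)).sum()),
    # Noise from dead tags: long runs of identical values often mean a stuck sensor
    "flatlined_samples": int((df["bearing_temp_c"].diff() == 0).astype(int).rolling(60).sum().ge(60).sum()),
}

for check, result in report.items():
    print(check, "->", result)
```

Each failing check points to cleansing work that should be scheduled before any alerting logic is built on top of the data.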

 

Mistake #2: Ignoring process improvement

When looking to reduce downtime, many analytics projects start from the idea that a better analytics solution will proactively alert on potential downtime and therefore reduce it. That is true, but it is also a narrow view. Monitoring and alerts are not the only factors that contribute to downtime, and if the project doesn't look holistically at operations, it may not deliver the desired reduction in downtime.

The reality is that the communication around an alert is just as important as the alert itself. Who is alerted, what actions they are asked to take, and the outcomes of those actions all influence downtime. Timely communication with vendors and partners about replacement parts is another factor in downtime reduction. By broadening the scope of the project to identify process gaps, analytics solutions can be used to automate and improve workflows, which can have a very large effect on downtime reduction. If process improvement is ignored, an analytics project may be deemed a failure for not providing the desired outcomes.
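To make the idea concrete, here is a minimal, hypothetical sketch of alert routing: mapping an alert's severity to the people who should be notified and the action they are asked to take. The roles, severities, and rules are illustrative assumptions, not a description of any particular analytics product.

```python
from dataclasses import dataclass

@dataclass
class Alert:
    asset_id: str
    signal: str
    severity: str  # e.g. "warning" or "critical"

# Hypothetical routing rules: who is notified and what they are asked to do.
ROUTING = {
    "warning":  {"notify": ["site_engineer"],
                 "action": "review the trend within 24 hours"},
    "critical": {"notify": ["site_engineer", "ops_manager", "parts_vendor"],
                 "action": "schedule an inspection and order replacement parts"},
}

def route(alert: Alert) -> dict:
    """Map an alert to recipients and a requested action."""
    rule = ROUTING.get(alert.severity, ROUTING["warning"])
    return {"asset": alert.asset_id, "signal": alert.signal, **rule}

print(route(Alert("GT-07", "bearing_temp_c", "critical")))
```

Encoding who is told, what they should do, and when a vendor is looped in is exactly the kind of process gap that a broader project scope can close.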

 

Mistake #3: Not adding your expertise

The engineers and field operators at energy companies have specialized knowledge of how their assets run, what the various problems are, and the most effective repairs. Ignoring this expertise and expecting an analytics solution to work for your unique environment is a recipe for project failure. 

It is critical to involve the people who hold this tribal knowledge in the data analytics project so that alerts can be programmed more intelligently and data calculations can be customized for unique assets. Most data analytics solution vendors will request interviews with these experts and do their best to gather this information. However, this process can add delays and increase services costs. It is better to identify these people upfront and ensure their involvement in the project.

Additionally, more value can be gained by identifying people who can learn to customize and tweak the analytics independently of the vendor. This will make the project go more smoothly and improve both short- and long-term outcomes. Note that not all analytics solutions allow users to make backend changes, so consider that when building your list of requirements for an analytics solution.

 

Mistake #4: Not investing in continuous improvement

Getting people involved with the software in the short term makes implementation go more smoothly, but the bigger mistake is not keeping them involved in continuous improvement over the long term. Managing energy assets and the data surrounding their performance is not a “set it and forget it” exercise. Continuing to find ways to streamline operations and reduce downtime is a high-value activity.

Roles should be created for people with specific responsibility for looking at the bigger operational performance picture and drilling down into the algorithms of the analytics solution to make changes that yield improvement. That could mean changing automated workflows or adjusting the equations that determine when an alert is raised. Not investing in continuous improvement of your data analytics will limit the value of that investment, and downtime improvements may stagnate.
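As a hypothetical sketch of what adjusting the equations behind an alert can look like, the rule below flags sustained excursions above a rolling baseline. The window, sigma multiplier, and persistence count are the kinds of tunable parameters a continuous-improvement owner would revisit as field outcomes come in; none of this reflects a specific vendor's algorithm.

```python
import pandas as pd

# Tunable parameters a continuous-improvement role might own and adjust over time.
WINDOW = "24h"    # rolling baseline window
SIGMA = 3.0       # how far above the baseline counts as anomalous
MIN_PERSIST = 5   # consecutive anomalous samples required before alerting

def alert_mask(series: pd.Series) -> pd.Series:
    """Flag sustained excursions above a rolling baseline.

    Assumes a time-indexed pandas Series of sensor readings.
    """
    baseline = series.rolling(WINDOW).mean()
    spread = series.rolling(WINDOW).std()
    anomalous = series > baseline + SIGMA * spread
    # Require persistence to suppress one-off spikes that would create noisy alerts.
    return anomalous.astype(int).rolling(MIN_PERSIST).sum() >= MIN_PERSIST
```

Lowering SIGMA or MIN_PERSIST catches problems earlier but raises more false alerts; tuning that trade-off against real outcomes is precisely the ongoing work that continuous improvement pays for.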

 

A comprehensive approach will yield project success

Taking a comprehensive and proactive approach to a data analytics project is the key to making the project run smoothly and produce the best possible outcomes. The project scope must include data validation, process improvement, and the right people to make it come together. If you’re looking to reduce downtime by improving data analytics, here is a great guide for doing the foundational work. It includes the questions you need to ask internally, the specific steps you need to take, and even worksheets to help you make it happen.

 

Don’t repeat the mistakes of others. There are great data analytics solutions that can deliver incredible results for reducing downtime. You just need to scope your data analytics project to cover the problem holistically. That’s how significant reductions in downtime come to fruition.