Why so many data projects fail before they start

At the recent re-launch of our conference, one of the highlights for many customers was a simple guide on how to get data projects right. Now, you cannot attend a conference or event without some reference to AI, IoT, Automation or Big Data. There are events where that is all they talk about.

Since we are a little different, we did not want another presentation on blue-sky thinking. Instead, we followed our ethos of practicality and focused on the hard practicality of monetising your data. Jonathan Dutton had posted about dirty data previously, but insights from one of Australia's top data professionals provided a leveller for most in the room.

The topic of data was presented by Mario Ribiero, GM for Data and Advanced Analytics at Blackwoods. From the get-go, Mario was clear about how to get the most out of your data.

64% of organisations still struggle to get the right data to make decisions

Before our event, we surveyed everyone who attended. 64% of respondents indicated that they struggled to get the right data to make decisions. Many of those same respondents are using procure-to-pay systems, which should fix collection problems, right?

The reality is that technology alone cannot fix the problem. We all know this, yet many continue to fall prey to the smooth sell of a technology platform. Without getting the foundations right, the data house you are building will fall down.

Getting the foundations right

So why is it so hard to get the basics right? Even when companies manage to collect data, many still find that things are missing.

One reason may be that they are so focussed on the shiny new object that they forget it involves people. People get bored, they make mistakes, and they would rather chase the next shiny new object. In our aspirational world, who wants to keep data clean?

But keeping data clean is the most critical piece of the equation. Without good data, there is no possibility of clear, reliable outcomes. Data quality is a critical step in every data project, and the step that leads to more failures than any other.

A lack of data governance can turn artificial intelligence (AI) into artificial stupidity. It is critical that you put key metrics in place to ensure data is entered correctly. This may mean working with suppliers as well as internal departments to unify your data.
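As a rough illustration of what such a metric might look like (the field names and sample records below are hypothetical, not anything from Blackwoods), a simple per-field completeness check can be reported back to whoever creates the records:

```python
# Minimal sketch of field-level data-quality metrics (hypothetical field names).
# Counts how often each required field is actually populated, so the result can
# be reported back to the team or supplier responsible for the record.

REQUIRED_FIELDS = ["supplier_name", "abn", "category", "unit_price"]

def completeness_report(records):
    """Return the share of records with a usable value for each required field."""
    totals = {field: 0 for field in REQUIRED_FIELDS}
    for record in records:
        for field in REQUIRED_FIELDS:
            if record.get(field) not in (None, "", "N/A"):
                totals[field] += 1
    return {field: totals[field] / len(records) for field in REQUIRED_FIELDS}

if __name__ == "__main__":
    sample = [
        {"supplier_name": "Acme Pty Ltd", "abn": "12 345 678 901", "category": "MRO", "unit_price": 4.20},
        {"supplier_name": "Beta Supplies", "abn": "", "category": "N/A", "unit_price": None},
    ]
    for field, share in completeness_report(sample).items():
        print(f"{field}: {share:.0%} populated")
```

A report like this makes "get the data entry right" measurable rather than aspirational.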

So who should be responsible for ‘dirty’ data?

Make the producer of the data responsible for getting it right.

Ever heard the phrase “You cannot expect salespeople to get data quality right”? Well, you can’t if you do not set clear expectations. The same goes for every user or department that generates data. Make the person who creates the data responsible for getting it right.

Take procurement data (which we spend a lot of time on) as an example: it takes a little effort upfront, but once the process is working it is simple to maintain.

Any data strategy, or any element of it, should align with the business strategy. And who delivers on the business strategy? Everyone in the business. Unifying your data requirements across the business matters, and so does how you are set up with suppliers. It can take a few extra minutes upfront but saves a lot of pain in the long run.

If you have suppliers who refuse or fail to provide data – then it might be time to reconsider who you buy from.

It is also important to check data as it comes in. Not once. Not twice. All the time. You can automate this. Suppliers are well known for changing the format of their data, and it is best to catch it early when that happens. A good process to confirm the data is still coming through correctly can save pain later.
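One way to automate that kind of check (a sketch only, with hypothetical column names rather than any particular system's layout) is to validate every incoming supplier file against the layout you expect before loading it:

```python
import csv

# Sketch of an automated check on incoming supplier files (hypothetical columns).
# The goal is to catch a changed format as soon as it arrives, not months later.

EXPECTED_COLUMNS = ["invoice_number", "supplier_name", "invoice_date", "amount"]

def check_supplier_file(path):
    """Return a list of problems found in the file; an empty list means it passed."""
    problems = []
    with open(path, newline="") as handle:
        reader = csv.DictReader(handle)
        if reader.fieldnames != EXPECTED_COLUMNS:
            # The supplier changed the layout: stop here and flag it.
            return [f"unexpected columns: {reader.fieldnames}"]
        for line_no, row in enumerate(reader, start=2):
            if not row["invoice_number"].strip():
                problems.append(f"line {line_no}: missing invoice number")
            try:
                float(row["amount"])
            except (TypeError, ValueError):
                problems.append(f"line {line_no}: amount {row['amount']!r} is not a number")
    return problems
```

Run on every file as it arrives, a check like this turns a silent format change into a same-day alert instead of a month-end surprise.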

What about people who circumvent the rules?

Mario shared a personal example involving his son. The basic rule is “if you make a mess, you clean it up”. If we can teach that to children, we can also train adults in an organisation to follow the rules.

I would call people who circumvent the rules to save themselves a few seconds of effort selfish, but in the modern corporate world that may not be fair. Clear communication and shared understanding generally remove the problem. Or, if it persists, remove the problem people 😉

We have also come across examples in organisations where people cry “I am too time-poor to do it right”. If they do not have the time to do it right the first time, when will they have the time to do it right the second time? Or who else will have time?

Roadblocks that lead to dirty data

“If I make the fields in said system mandatory, all our problems will go away!” is the catchphrase of many. Depending on your company’s culture, this can have two effects.

  • In a tight culture of compliance, it can deliver a positive outcome.
  • In a loose culture of compliance, it can lead to data that is even dirtier than before, because people will fill in anything to make the red asterisk go away (the sketch below shows one way to catch some of that junk).
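As a minimal sketch of the kind of junk detection that helps in a loose-compliance culture (the patterns and examples are assumptions, not an exhaustive list), you can flag values that were clearly typed just to clear the mandatory field:

```python
import re

# Sketch of a junk-value detector for mandatory fields (illustrative patterns only).
PLACEHOLDER_PATTERNS = [
    r"\s*",                                        # blank or whitespace only
    r"n/?a|none|nil|tbc|tbd|unknown|x+|\.+|-+",    # common filler values
    r"(.)\1{3,}",                                  # one character repeated, e.g. "aaaa"
]

def looks_like_placeholder(value: str) -> bool:
    """True if the value appears to have been typed only to clear a mandatory field."""
    value = value.strip().lower()
    return any(re.fullmatch(pattern, value) for pattern in PLACEHOLDER_PATTERNS)

print(looks_like_placeholder("N/A"))            # True
print(looks_like_placeholder("...."))           # True
print(looks_like_placeholder("Acme Pty Ltd"))   # False
```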

You should also weigh “nice to have” against “what is actually needed”. Many companies we work with try to collect too much information at the point of entry. Much of that added data is already available through the public interwebs: industry, legal entity names and addresses are some examples. Where possible, use a validated source rather than relying on human entry. This reduces the amount of effort upfront and avoids the pitfalls of bad data. It is, after all, easier to complete data from a reliable source than to verify whether a manual entry is bad.
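To illustrate the idea (the registry below is a hypothetical stand-in for whatever validated source you use, not a specific API), company details can be filled from a maintained reference set keyed on an identifier the user supplies, rather than typed by hand:

```python
# Sketch of enriching a record from a validated reference source instead of
# manual entry. COMPANY_REGISTRY is a hypothetical stand-in for a maintained
# registry or master-data extract.

COMPANY_REGISTRY = {
    "12345678901": {"legal_name": "Acme Industrial Pty Ltd", "state": "NSW", "industry": "Wholesale Trade"},
}

def enrich_supplier(entry: dict) -> dict:
    """Fill legal name, state and industry from the registry, keyed on the ABN supplied at entry."""
    abn = entry.get("abn", "").replace(" ", "")
    reference = COMPANY_REGISTRY.get(abn)
    if reference is None:
        raise ValueError(f"ABN {abn!r} not found in reference source; review before accepting the record")
    return {**entry, **reference}

print(enrich_supplier({"abn": "12 345 678 901", "contact": "jane@example.com"}))
```

The person entering the record only supplies the identifier and anything genuinely unique to the relationship; everything else comes from a source that is already correct.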

Developing a data culture

A strong culture is a known accelerator of compliance and growth. The same culture that drives data quality has a flow-on impact in other areas: sales, marketing, security and compliance, to name a few.

Even with the best systems, technologies or policies, people will find ways around them. Make your business strategy clear, explain people’s roles in meeting it, and be clear when you communicate the requirements. Keep it simple and most people will do the right thing. Make it easy for people to do their work and to understand why they are doing it.

Make sure what you are doing is scalable

We have seen many proofs of concept fall over when the time comes to take them into production. This is generally the result of:
a/ dirty data
b/ supplier-funded proofs of concept (avoid these)
c/ a lack of attention to process and culture.

Too often, people focus on the technology or platform rather than on the outcome required. Proofs of concept tend to get a high degree of attention to detail: the data gets cleaned, plenty of resources are applied and everyone is excited.

Can your project sustain that level of incremental effort and cost, and still be worthwhile in the long run? If the answer is yes and it will keep delivering a higher ROI, keep going. If not, work out how to make it work or make a fast exit.

Stay sharp

Mario’s parting words of advice were to ensure that your teams “stay sharp”. With so much information and so many courses available online today, this should not be a roadblock.

Focus on the basics, instil the right culture, develop with scale in mind and stay sharp. Do that and your projects could reap dividends.