Data: The Importance of Accuracy, Integrity and Real-Time Integration

James Cotton
December 15, 2014

While it’s easy to see why data has become so important to modern businesses, the hazards it presents must also be accounted for. With new risks and pitfalls appearing daily as technology evolves, it’s no surprise that security is usually at the top of the priority list.

In truth, though, security is only one part of the equation. For any organisation to properly harness the full power of data, it must view accuracy and integrity in a similar light. Once these have been guaranteed, it's the way in which the data is applied that determines results. This is where the real-time concept comes in: information has a shelf-life, and it's shortening rapidly.


From data to insight

Put simply, data is used to provide insight. Businesses, when armed with this, are able to improve the everyday decisions they make. This isn't just for management, either – it applies from the ground up. However, data is rarely useful in its raw state; it must be processed and presented in a way that works at the appropriate levels so that it can be applied properly. The latest analytics tools make this part much easier, but there is still a journey that information must follow before it's usable.

If data accuracy levels are low at the start of this process, the insight will be lacking and the decisions it influences are likely to be poor as a result. This is why organisations must realise that quality is more important than quantity; too many focus only on gathering as much information as possible without thinking about whether it's correct and how it can be used. Add to this the question of whether it can be trusted and you have the issue of integrity to consider as well.
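
To illustrate "quality over quantity", here is a minimal sketch of the kind of accuracy gate a data pipeline might apply before raw records are allowed to feed any analysis. The field names and rules are hypothetical, chosen purely for illustration:

```python
# Minimal data-quality gate: reject records that fail basic accuracy checks
# before they influence any downstream decision. Field names are hypothetical.

def is_valid(record):
    """Return True only when a record passes simple accuracy checks."""
    has_id = bool(record.get("customer_id"))
    amount = record.get("amount")
    has_sane_amount = isinstance(amount, (int, float)) and amount >= 0
    return has_id and has_sane_amount

raw = [
    {"customer_id": "C001", "amount": 19.99},
    {"customer_id": "", "amount": 5.00},       # missing ID -> untrustworthy
    {"customer_id": "C002", "amount": -3.50},  # negative amount -> inaccurate
]

clean = [r for r in raw if is_valid(r)]
print(len(clean))  # only the first record survives the gate
```

The point of such a gate is that a smaller, validated dataset produces more reliable insight than a larger one of unknown quality.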


Data governance

There’s no getting away from the fact that data comes from everywhere these days. Just as a few examples, we have mobile devices, loyalty cards, customer relationship management (CRM) systems, social media sites, GPS location data and complex market research tools. The source pool is still growing too; the ongoing development of concepts like the Internet of Things (IoT) means that machines are also becoming an integral part of the data deluge. Alongside the more traditional computerised devices, businesses will soon be mining information from seemingly inanimate objects (think fridges, tables and cars).

With all of this in mind, data governance must be a priority. Not only will this information arrive from all directions, it will exist in various formats: everything from numbers and formulas to individual words and passages of text. Traditionally, just as many tools would be used to deal with it. Some staff would rely on their own spreadsheets and Word documents while other, more data-competent team members put their faith in advanced data visualisation tools. This kind of disparity causes its own problems – and can even render data useless.

Unless every data-using employee is singing from the same hymn sheet, auditability will always be a chore at best. With staff depending on different platforms, and information coming from many different sources, its reliability cannot be established; and when that is the case, it should not be used to influence decisions. At the very least, decision makers need to know the data's heritage and how much (or how little) confidence they can place in it.
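
One lightweight way to preserve data heritage is to carry source metadata alongside every figure, so a decision maker can judge at a glance how much confidence to place in it. The structure below is an assumption for illustration, not a prescribed model:

```python
# Attach provenance to each figure so its heritage travels with it.
# The fields and example values here are hypothetical.
from dataclasses import dataclass
from datetime import date

@dataclass
class Figure:
    value: float
    source: str      # where the number came from
    as_of: date      # when it was captured
    verified: bool   # has the source been audited?

revenue = Figure(value=1_200_000.0, source="CRM export",
                 as_of=date(2014, 12, 1), verified=False)

# Anyone consuming this figure can see it is recent but unaudited
print(revenue.source, revenue.verified)
```

Even a simple scheme like this makes auditability routine rather than a chore: the question "where did this number come from?" is answered by the record itself.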

The solution to this is governance. Information has to be available to everyone, and while the latest wave of data discovery tools offers the right kind of accessibility, the need for central control is often overlooked. An organisation's use of data can't be limited to the IT department, but to ensure integrity overall, it's IT staff that should be in control.

Real-time integration

The time it takes experts to generate actionable insight from raw data has always been a stumbling block, especially under the older business intelligence (BI) model. Data is undoubtedly the business' single greatest asset, but with every other organisation looking to harness its power, its usefulness decreases as time goes on. It's no longer enough to embrace information. Now, it must be used quickly – or even instantly.

The periods of time in which businesses have to make their decisions – both major and minor – are shortening. When sales staff are out talking to clients, for example, they must be able to draw on facts and figures that are up to date and that support their case there and then, rather than waiting days, or even longer, for them to be ready. It's this instant access that enables a firm to get ahead of its competitors, or at the very least keep up with them.

Responsiveness is also an important factor for firms to consider. To be truly competitive in their respective industries, companies must be in a position to respond to what's going on around them – whether that's the actions of a competitor, the behaviour of clients or a major world event. It's this kind of versatility that separates a good business from a truly innovative one.

The danger of inaccurate data

Companies that utilise sophisticated data-led insights can’t simply rest on their laurels and marvel at a job well done, however. Inaccuracies in the data can quickly escalate from a minor niggle into something that compromises all the hard work and effort previously invested. Analysts (among them Joel Curry of Experian QAS UK and Ireland) have noted that accurate data can drive efficiency, profitability and growth. Inaccurate data, on the other hand, can cause real detriment to a business – and its bottom line.

This can stem from circumstances as simple and seemingly innocuous as one person using data discovery tools while a second relies on Excel or standard reports. It could even be under-briefed employees using slightly disparate terminology, throwing their data out of sync.
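
A toy sketch of how slightly disparate terminology skews figures: two teams record the same product under different labels, and the totals only agree once the labels are normalised. The labels and the normalisation rule are hypothetical:

```python
# The same product logged under four different spellings under-counts it
# three ways; normalising terminology first brings the data back in sync.
from collections import Counter

sales = ["Widget A", "widget-a", "Widget A", "WIDGET A"]  # one product, four spellings

raw_totals = Counter(sales)  # appears to be three distinct products

def canonical(label):
    """Collapse case and punctuation differences into one agreed term."""
    return label.lower().replace("-", " ")

clean_totals = Counter(canonical(s) for s in sales)  # one product, count of 4

print(len(raw_totals), len(clean_totals))
```

In a real organisation the fix is a shared glossary enforced at the point of entry, but the principle is the same: agree the terms before aggregating the numbers.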

As innocent and accidental as this all sounds, the impact can be huge. Curry went on to cite a survey showing that one pound in every six spent from departmental budgets is wasted. The biggest consideration here is that these losses are wholly avoidable if data accuracy always remains a priority.