A few weeks ago a resident of a small village in North Yorkshire discovered council workmen busy painting a large yellow “School - Keep Clear” sign and zigzag lines on the road outside his home, a converted school that he had lived in for eight years. The school had in fact been closed by the council almost 20 years earlier. When the homeowner challenged the workmen to explain their actions, they simply said that they were carrying out instructions from the Highways Department and suggested that he buy some black paint to cover it over after they had gone.
Although in the scheme of things the story could be considered mildly amusing, at least for the general public if not the homeowner, it made the front pages of the national tabloids, to the probable (and well-deserved) embarrassment of the council’s “powers that be”. However, the story also illustrates a much more serious point: organizations need to ensure that their data sets are accurate, complete and up to date in order to avoid potentially costly mistakes as well as public ridicule.
While in this particular case the labour and materials costs were probably relatively modest, I am sure that the local residents would have plenty of suggestions as to how the money could have been better spent. More importantly, beyond the wasted highways budget, such decisions can have other unintended consequences, particularly reputational damage for the council’s managers and stakeholders, sowing seeds of doubt in the minds of local council taxpayers. It is easy to imagine that, having read the story, they would be asking themselves: if their local authority can get such a simple thing wrong, which other spending decisions should they be worried about? The story being splashed across the headlines of the red-tops obviously did not help.
Being able to trust the data underpinning critical decision-making is essential for any organization, public sector or commercial. This means implementing rigorous data management policies and procedures across the organization to provide certainty that data is not only regularly updated and fit for purpose but also available and accessible when and where it is needed.
One of the barriers facing organizations looking to implement quality data management practices is the data silo mindset that has become established in many organizations over time. Data silos are one of the major causes of inter-departmental inconsistencies: they work against efficient data sharing, lead to duplicate and inaccurate records that can result in missed connections, and prevent senior management from taking a holistic, joined-up view of the enterprise. If every department operates with its own database, there are not only internal cost implications; organizations should also not be surprised when decisions made by one team result in problems for another.
I strongly suspect that in the case of North Yorkshire Council, the decision that led to a team of sign writers being tasked with painting parking restrictions outside a private residence could have been avoided if the highways and education departments had been operating from a common, well-managed database.
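The kind of check a shared database makes possible can be sketched in a few lines. This is a purely hypothetical illustration, not the council's actual systems: the record layouts, identifiers, and data below are all invented, and in practice the "database" would be a shared store rather than in-memory dictionaries. The point is simply that when highways work orders and education records share a common source, a work order referencing a closed school can be flagged before a crew is dispatched.

```python
# Hypothetical illustration of a cross-departmental validation check.
# All names, IDs, and records are invented for this sketch.

# Education department's record of school sites (shared data)
schools = {
    "SCH-001": {"name": "Village Primary", "status": "closed", "closed_year": 2004},
    "SCH-002": {"name": "Town High", "status": "open", "closed_year": None},
}

# Highways department's work orders referencing those sites (shared data)
work_orders = [
    {"order_id": "WO-17", "school_id": "SCH-001", "task": "paint keep-clear markings"},
    {"order_id": "WO-18", "school_id": "SCH-002", "task": "repaint zigzag lines"},
]

def flag_stale_orders(orders, schools):
    """Return work orders that reference schools which are no longer open."""
    return [o for o in orders if schools[o["school_id"]]["status"] != "open"]

for order in flag_stale_orders(work_orders, schools):
    school = schools[order["school_id"]]
    print(f"{order['order_id']}: {school['name']} has been {school['status']} "
          f"since {school['closed_year']} - review before dispatching a crew")
```

Run against this sample data, the check flags WO-17 (the markings outside the long-closed school) while letting WO-18 through. With siloed databases, no such query is even possible, because the highways system has no view of the education department's records.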
The lesson for any enterprise is that its data is a very valuable business asset, and any investment in improving data quality standards will be money well spent. In a data-driven economy, data is embedded in all areas of business operations and is central to successful decision making and maintaining a competitive advantage. By eliminating data silos and implementing data quality best practices, organizations can be confident that their operational decisions are based on trustworthy data.