4 Nov, 2021 | Stuart Reyner - Senior Product Architect
In today’s data-driven world, it is critical that company decision makers can trust that the data feeding their BI applications is accurate and up to date at all times. Without that trust, how can they have confidence that their decisions are the right ones and will keep the business moving forward? Yet it has been found that lack of trust is a significant issue for 60% of business executives, and that more than 30% respond by choosing not to rely on the data at all when making operational decisions.
Trust Once Lost is Difficult to Get Back
This situation is quite likely the result of bitter experience from past project failures that can be traced back to poor data quality management processes. Once trust is lost, however, it is very difficult to win back. Businesses are therefore looking to invest in new data quality management (DQM) tools that can assure the integrity of their data and give managers the certainty they need to rebuild trust at every stage of the data journey.
Manual Data Quality Management Lacks Efficiency
With the volume of data entering a business annually from a wide range of sources now measured in zettabytes, assuring data quality has become harder than ever. There is massive scope for errors to become embedded in the data set, whether as formatting problems, field omissions or simple spelling mistakes, all of which can have serious implications for the applications processing it. The challenge for the organization (and the tools it uses) is not just managing the availability of the data but ensuring that all these issues are addressed by automated processes, as early as possible in the project, without manual intervention and within an acceptable timeframe.
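To make the idea of automated checks concrete, here is a minimal sketch of the kinds of rules such a process might apply: required-field detection and format validation. The field names and regular expressions are illustrative assumptions for this example, not the rules of any particular DQM product.

```python
import re

# Illustrative validation rules (assumed for this sketch):
# an email pattern and a UK-style postcode pattern.
RULES = {
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "postcode": re.compile(r"^[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}$"),
}
REQUIRED = {"id", "email", "postcode"}

def validate_record(record: dict) -> list[str]:
    """Return a list of human-readable issues found in one record."""
    # Missing required fields.
    issues = [f"missing field: {f}" for f in REQUIRED - record.keys()]
    # Format checks on fields that are present.
    for field, pattern in RULES.items():
        value = record.get(field)
        if value is not None and not pattern.match(str(value)):
            issues.append(f"bad format in {field}: {value!r}")
    return issues

records = [
    {"id": 1, "email": "a@example.com", "postcode": "SW1A 1AA"},
    {"id": 2, "email": "not-an-email"},  # bad format and a missing field
]
for r in records:
    print(r["id"], validate_record(r))
```

Run over an entire data set as early as possible in the pipeline, checks like these surface formatting errors and omissions before they reach downstream applications, which is the point the paragraph above makes about automating DQ rather than relying on manual review.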
New AI Based DQM Tools Can Deliver Huge Improvements
One result of this massive upswing in the need for DQM has been commensurate growth in the number of new tools on the market: according to Gartner, the data quality software market grew 3.9% to reach 1.72bn USD in 2020 and is expected to reach 2.24bn USD by 2025. A common feature of many of these tools is the use of AI to augment and accelerate DQ processes while removing human error from the equation. However, with the exception of IDS, none of the current leading vendors offers a fully integrated solution encompassing data ingestion, profiling, mapping, transformation and migration as well as obfuscation and synthetic data generation.
With proven turnaround time improvements of up to 80% in some of the DQ stages, the IDS iData solution delivers significant time, and hence cost, savings compared with typical industry project timescales, all through a standardised platform and user interface.
Test Data Quality Management is Critical to the Success of Migration Projects
Of the many clear benefits this integrated approach offers, the inclusion of automated test data generation can make a massive difference to software testing and data migration projects. This is particularly relevant to the growing number of organizations going through digital transformation and ERP migration. Data quality is as critical to the success of a software migration as it is to delivering trust in the final production environment: test data needs to be both accurate and representative of the real world to deliver a successful migration.
Test Data Generation Can Account For 60% of Project Resources
According to a study conducted by IBM in 2016, manually searching, managing, maintaining and generating test data can account for between 30% and 60% of a tester's time. Testers typically need to generate huge volumes of test data to ensure the quality of an application for real-world use. In a recent NHS project that required strict controls on the non-production environment, the testing team was spending between 30 and 40 days generating inaccurate and unproductive test data. Using iData, they were able to generate 80 million rows of accurate, obfuscated data on demand.
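The obfuscation and synthetic generation described above can be sketched in a few lines. This is a minimal, assumption-laden example, not the iData implementation: real identifiers are replaced with a one-way, repeatable pseudonym, and the remaining fields are filled with synthetic values of a realistic shape. All field names and value pools here are invented for illustration.

```python
import hashlib
import random

# Small pool of synthetic names (illustrative only).
FIRST_NAMES = ["Alice", "Bob", "Chloe", "Dev", "Ewa"]

def pseudonymise(value: str, salt: str = "project-salt") -> str:
    """One-way, repeatable replacement for a real identifier.

    Hashing with a project-specific salt means the same input always
    maps to the same pseudonym, so referential integrity across tables
    is preserved without exposing the original value.
    """
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

def synthetic_row(real_id: str, rng: random.Random) -> dict:
    """Build one obfuscated test row from a real identifier."""
    return {
        "patient_ref": pseudonymise(real_id),
        "first_name": rng.choice(FIRST_NAMES),
        "age": rng.randint(18, 95),
    }

rng = random.Random(42)  # seeded so generated data is reproducible
rows = [synthetic_row(f"ID-{i:07d}", rng) for i in range(5)]
for row in rows:
    print(row)
```

Scaled up and driven by profiled rules about value distributions, this is the shape of process that lets a team produce tens of millions of rows of realistic, non-identifying test data on demand rather than by hand.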
With such huge volumes of data and fixed budgets to deal with, it is understandable that corners are likely to be cut, particularly when teams rely on traditional manual methods. It is equally unsurprising that, according to another Gartner report, as many as 83% of ERP migration projects fail, miss crucial deadlines or exceed their budgets.
Using the latest automated DQM tools, developers and testers can massively reduce the time spent generating test data and focus more resources on meeting the critical business objectives that a successful migration is intended to deliver.