Survey: Developers claim responsibility for managing data quality


Almost half the developers responding to a data quality survey by SD Times and Melissa indicated they are responsible for managing the quality of the data their applications rely on.

Further, more than half of the 202 respondents said they were involved in data quality input, data quality management, choosing validation APIs or API data quality solutions, and data integration. 

Some 57% of developers surveyed indicated the primary domain their companies generate master data for is customer, followed by product (14%), employee (12%) and finance (11%).

RELATED CONTENT: Want to ensure data quality? Look for the Golden Record

“Poor data quality hurts organizations in terms of being able to deliver great experiences to their customers, which impacts their businesses,” said David Lyman, publisher of SD Times, which completed the survey in conjunction with data quality solution provider Melissa. “Whether it’s duplicate data, inconsistent or incomplete data, developers are spending more and more time cleaning things up instead of working on solutions that deliver value to the company.”

In fact, just about a third of respondents ranked duplicate data as their greatest challenge, followed by inconsistent data (23%) and incomplete data (21%). Next were international character sets (17%), old or incorrect data (11%) and misfielded data (10%).

“With the amount of data streaming in from a multitude of sources, it’s imperative for developers to build in a data quality firewall,” said Greg Brown, VP of Global Marketing at Melissa. “Identifying, fixing and eliminating duplicate, erroneous and invalid data before it enters your system helps cut the time needed for manual intervention while improving virtually every downstream process.”
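The “data quality firewall” Brown describes can be pictured as a validation and deduplication gate that records must pass before entering a system. A minimal sketch in Python, assuming illustrative field names and rules (this is not Melissa’s API):

```python
# A minimal sketch of a data quality "firewall": records are validated and
# deduplicated before they are accepted. The field names and validation
# rules below are illustrative assumptions, not any vendor's actual API.
import re

def is_valid(record):
    """Reject records with missing or malformed required fields."""
    if not record.get("name"):
        return False
    email = record.get("email", "")
    # A deliberately simple email shape check for illustration only.
    return re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", email) is not None

def firewall(incoming):
    """Yield only valid, previously unseen records."""
    seen = set()
    for record in incoming:
        # Normalize the dedup key so trivial variations still match.
        key = (record.get("name", "").strip().lower(),
               record.get("email", "").strip().lower())
        if is_valid(record) and key not in seen:
            seen.add(key)
            yield record

records = [
    {"name": "Ada Lovelace", "email": "ada@example.com"},
    {"name": "Ada Lovelace", "email": "ada@example.com"},  # duplicate
    {"name": "", "email": "no-name@example.com"},          # incomplete
    {"name": "Bob", "email": "not-an-email"},              # invalid
]
clean = list(firewall(records))  # only the first record passes
```

Filtering at the point of entry, as here, is what keeps the duplicate, incomplete and invalid records from ever reaching downstream processes.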

Because of these data problems, respondents to the survey said they spend roughly one full day of their week dealing with data quality issues, taking time from building new applications.

“Developers should heed the 1-10-100 Rule: it costs on average $1 to verify the accuracy of a record at point of entry, $10 if it needs to be cleaned and $100 if nothing is done,” added Brown. “And remember, this doesn’t include the additional costs of misdelivered shipments, wasted postage and production costs, missed opportunities and undetected fraud, let alone the added costs of data preparation to make it fit for purpose.”
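The arithmetic behind the 1-10-100 Rule is simple but worth making concrete. A sketch using the per-record figures Brown cites (the record count is a made-up example):

```python
# Per-record costs from the 1-10-100 Rule quoted above.
COST_PER_RECORD = {
    "verify_at_entry": 1,    # $1 to verify accuracy at point of entry
    "clean_later": 10,       # $10 to clean a bad record after the fact
    "do_nothing": 100,       # $100 in downstream cost if nothing is done
}

def total_cost(bad_records, strategy):
    """Total cost of handling a batch of problem records one way."""
    return bad_records * COST_PER_RECORD[strategy]

# For a hypothetical batch of 1,000 problem records:
costs = {s: total_cost(1000, s) for s in COST_PER_RECORD}
# verify_at_entry: $1,000; clean_later: $10,000; do_nothing: $100,000
```

The two-orders-of-magnitude spread is the rule’s point: catching a bad record at entry is roughly 100x cheaper than letting it propagate.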
