The first step to ensuring data quality is validation


Blue Compass is an Iowa-based digital marketing agency specializing in website development and SEO. As such, according to development manager David Wanat, they take care of “everything beyond the design of the site” on the back end.

Not only that, but Wanat also said he is responsible for ensuring the data is good, whether it’s internal data or coming in from another source. So for him, the first step toward data quality is validation.

“We’ve got articles and blog posts on our site, we have RSS feeds, we just finished an airport website, so there’s parking information, like how many spots are in a lot, or is this flight on time, or is it delayed? Some of it is user-inputted through a WYSIWYG engine or through an API,” he explained. “We’re talking to another site that gives us information, like REST calls, or maybe a CSV file is uploaded via FTP, and we dig through that to find information. There’s all kinds of different sources for this data. And some is end-user driven, where they’ll put in information requests via a web page.”

RELATED CONTENT: The SD Times Data Quality Project

One way Blue Compass ensures good data is being input into their forms is by limiting the amount of free-form data users have to type in. Wanat explained the company first has to think ahead about what they intend to do with the data, and minimize user input to must-haves, like inputting your name. “But if I can use a calendar date picker to put a date in instead of you free-forming the date, that’d be way better in my world, because I can control the format from the date picker,” he said. “If you’re picking a preference — a size of a shirt, a color — I’m going to control that as much as possible so I get the color red instead of burnt umber, so I know exactly which one they’re picking.”
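The constrained-input idea described above can be sketched in a few lines. The following is a hypothetical server-side check, not Blue Compass's actual code: even when the front end uses a date picker and a fixed color list, the back end re-validates the submission against the same constraints. The field names and the allowed color set are assumptions for illustration.

```python
from datetime import datetime

# Assumed option list matching what the front-end widget offers
ALLOWED_COLORS = {"red", "blue", "green"}

def validate_choice_fields(form: dict) -> dict:
    """Return a dict of field errors; an empty dict means the input passed."""
    errors = {}
    try:
        # Date pickers typically submit ISO 8601 (YYYY-MM-DD)
        datetime.strptime(form.get("ship_date", ""), "%Y-%m-%d")
    except ValueError:
        errors["ship_date"] = "not a valid YYYY-MM-DD date"
    if form.get("color") not in ALLOWED_COLORS:
        errors["color"] = "not one of the offered options"
    return errors
```

Because the server rejects anything outside the offered set, "burnt umber" never reaches the database, which is exactly the control Wanat describes.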

But there are cases where the data input could be of good quality yet something is still wrong.

“If you’re asking people a question, and 50 percent of them respond with almost the same exact answer they typed in, that doesn’t seem like it’s very unique,” Wanat pointed out. “If you’re asking people what they had for lunch, and everybody says a ham sandwich or pizza, instead of like… you would expect it to be a very wide difference. So if I see the exact same answer, that tells me something’s off here. You have to figure out what you’re expecting to get, and when you get something that seems off, it probably is.”
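The "everyone typed the same thing" signal can be expressed as a simple heuristic. This is a hypothetical sketch of that idea, not Wanat's implementation; the 50 percent threshold is taken from his example, and the function name is an assumption.

```python
from collections import Counter

def looks_suspicious(answers: list[str], max_share: float = 0.5) -> bool:
    """Flag a free-text field when one answer dominates the responses.

    Mirrors the intuition above: if more than max_share of responses
    are the identical string, something is probably off.
    """
    if not answers:
        return False
    normalized = [a.strip().lower() for a in answers]
    _, top_count = Counter(normalized).most_common(1)[0]
    return top_count / len(normalized) > max_share
```

A check like this catches bots or copy-pasted submissions that would sail through per-field format validation.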

Yet in spite of these controls, bad data still is unavoidable. When that happens, Wanat turns to the use of data validators. He explained the company will do some quick tests internally on the data, and depending upon what they find, they might use machine learning to understand why the bad data is getting through.

Wanat said they also check the length of the input, to see if it aligns with what they are expecting. “If somebody’s typing in an address, it shouldn’t be very long,” he said. “If it’s over 200 characters long, that’s a problem.” Further, he said, they will scan data for some quick text validation, looking for script tags or special characters that should not be in there. If found, he said they will either “code that out, or invalidate it altogether and send [the user] back to the info form.”

Those kinds of checks happen before the data gets into the database. But if something slips through those checks, they will again validate that input before bringing the information back out of the database.
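Validating on the way out often amounts to escaping stored text before it is rendered, so that anything that slipped past input checks cannot execute in the browser. A minimal sketch of that idea, using Python's standard `html.escape` (the function name is an assumption, not the company's code):

```python
import html

def render_user_field(raw: str) -> str:
    """Escape stored text before it goes back into a page.

    Even data that passed input validation is HTML-escaped at
    render time, so a stored <script> tag displays as text
    instead of executing.
    """
    return html.escape(raw)
```

This defense-in-depth pattern is what makes a script-injection incident like the one Wanat alludes to below far less likely.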

As you would expect, this can take up quite a bit of a developer’s time. In a survey of developers on data quality issues SD Times completed in August, respondents indicated that they spend about one day per work week on data quality issues. Wanat agreed with that sentiment.

“You can write a web page or a web form that takes input in a few minutes,” he said. “But if I have to add validators for this, when I have to scan for that, if I had to code it, sort it in the database, now I’ve quadrupled the amount of time it’s taking me to do this one thing.

“It’s just part of what we’re doing, and ensures our clients are getting what they want,” he continued. “No one wants to say, ‘Oh we had a script injection and all the data was erased from the database.’”

If Blue Compass’ clients can pay once to have good data coming in, then they save that time continuously after that because they’re getting a higher quality product, Wanat explained.
