AIs make the same typos that humans do because they were trained on human writing.
Garbage in, garbage out.
My new data structure:
Given a heuristic for determining data quality, it homogenises the quality of its contents. Data you write to it has pieces exchanged with other entries at a rate that depends on that data’s quality: the lower the quality, the higher the rate of exchange.
Put in only perfect data and nothing is exchanged. Put in high-quality data and you’ll mostly get high quality back, though probably with a few errors mixed in. Put in garbage and it starts poisoning the rest of the data. Garbage in, garbage out.
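Roughly like this, as a minimal Python sketch. Everything here is illustrative: the HomogenisingStore name, the character-level swapping, and the toy quality heuristic are assumptions, not a real implementation.

```python
import random

class HomogenisingStore:
    """Stores strings; each write swaps characters with existing
    entries at a rate proportional to how low its quality score is."""

    def __init__(self, quality_heuristic, rng=None):
        # quality_heuristic maps a string to a score in [0.0, 1.0],
        # where 1.0 means perfect data.
        self.quality = quality_heuristic
        self.entries = []
        self.rng = rng or random.Random()

    def write(self, data: str) -> None:
        # Lower quality means a higher exchange rate.
        exchange_rate = 1.0 - self.quality(data)
        chars = list(data)
        for i in range(len(chars)):
            if not self.entries or self.rng.random() >= exchange_rate:
                continue
            # Swap this piece with a random piece of a random entry,
            # poisoning (or being poisoned by) the rest of the data.
            other = self.rng.randrange(len(self.entries))
            entry = list(self.entries[other])
            if not entry:
                continue
            j = self.rng.randrange(len(entry))
            chars[i], entry[j] = entry[j], chars[i]
            self.entries[other] = "".join(entry)
        self.entries.append("".join(chars))


# Hypothetical heuristic: fraction of letters and spaces in the string.
def quality(s: str) -> float:
    return sum(c.isalpha() or c.isspace() for c in s) / max(len(s), 1)

store = HomogenisingStore(quality)
store.write("perfectly clean data")   # quality 1.0: nothing is exchanged
store.write("s0m3 n0isy g@rb@ge!!!")  # low quality: corrupts the first entry
print(store.entries)
```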
“Why would you want that?” you ask. Wrong question, buddy. How about “Do you want to be left behind when this new data quality management technology takes off?” And if that doesn’t convince you, let me dig around my buzzword budget and see if I can throw some “Make Investors Drool And Swoon” skills your way to convince you I’ll turn your crap data into gold.