Sunday, January 12, 2020

Data quality framework: Necessity or discretionary practice?

Three decades ago, organisations could probably get away without an abundance of clean, reliable and accurate data, driving the business purely from an operational perspective.
The adoption of business intelligence and analytics, however, has revolutionised ways of working over the past few decades. Organisations now need to be more tactical and strategic, especially in saturated markets, where customer centricity, satisfaction and experience are the core drivers of customer acquisition and retention.
Let’s have a look at a practical example: telecommunications service providers can no longer assume that customers will want to recharge as soon as their airtime is depleted after making a call. The caller may be more of a data user, with a smartphone, and may currently be in a location with free WiFi. The receiver may also have a smartphone but have no data and be outside any free WiFi zone. So, how does the service provider collect and combine this information in order to offer the receiver a data package that enables the two to continue conversing?
If you are thinking along the lines of data engineering pipelines, streaming analytics, analytical models and actionable intelligence, you are on the right track! However, apart from the heavy lifting done by data engineering and data science practices, there is a fundamental component that is often overlooked: data quality.
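To make this concrete, below is a minimal sketch in Python of how such a decision might look once the caller's and receiver's context events have been combined, with a basic data quality gate in front of the decision. All of the field names (msisdn, device_type, data_balance_mb, on_free_wifi) and the offer logic are hypothetical, chosen purely for illustration, and are not any provider's actual schema.

# A minimal, hypothetical sketch: combine caller and receiver context
# and run basic data quality checks before triggering a data-package offer.
from dataclasses import dataclass
from typing import Optional


@dataclass
class SubscriberContext:
    msisdn: str                 # subscriber number
    device_type: str            # e.g. "smartphone" or "feature_phone"
    data_balance_mb: float      # remaining data bundle in MB
    on_free_wifi: bool          # is the device currently on free WiFi?


def is_valid(ctx: SubscriberContext) -> bool:
    """Basic data quality checks: completeness and plausible values."""
    if not ctx.msisdn or not ctx.msisdn.isdigit():
        return False            # missing or malformed identifier
    if ctx.data_balance_mb < 0:
        return False            # negative balance points to bad source data
    if ctx.device_type not in {"smartphone", "feature_phone"}:
        return False            # unexpected category value
    return True


def should_offer_data_package(caller: SubscriberContext,
                              receiver: SubscriberContext) -> Optional[str]:
    """Combine both contexts and decide whether to push an offer to the receiver."""
    if not (is_valid(caller) and is_valid(receiver)):
        return None             # never act on poor-quality data
    if (receiver.device_type == "smartphone"
            and receiver.data_balance_mb == 0
            and not receiver.on_free_wifi):
        return f"Offer 100MB top-up to {receiver.msisdn}"
    return None


if __name__ == "__main__":
    caller = SubscriberContext("27820000001", "smartphone", 512.0, True)
    receiver = SubscriberContext("27820000002", "smartphone", 0.0, False)
    print(should_offer_data_package(caller, receiver))

Note that the offer is only generated when both records pass the quality checks: acting on incomplete or implausible data is exactly the risk a data quality framework is meant to remove.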
