Business intelligence is becoming more popular every day, and data quality plays a central role in its success. We present five best practices that business decision-makers can use to optimize the quality of their data.
Business Intelligence and Data Quality Optimization
Business intelligence and analytics applications are clearly on the rise. They help companies of every industry and size understand the needs of their customers, develop the right products, and thus position themselves successfully in the market. German companies in particular increasingly rely on data analysis: three out of four make their decisions on the basis of analyses of their databases. This was the result of the representative survey “Creating value with data – Report 2015”, which the auditing and consulting company KPMG and the industry association Bitkom carried out among more than 700 companies.
As the study shows, the importance of data continues to grow. More than half of the companies surveyed described data analysis as a crucial component of value creation and business models in their own company. In addition, every second company is actively looking for opportunities that a comprehensive analysis of data could offer, the study continues.
High Data Quality Is a Must
Without optimal data quality, however, deviations or errors in central business processes can quickly occur, according to the internationally operating IT service provider CGI Germany. High data quality can therefore only be achieved through a systematic and, ideally, company-wide procedure. Over time, it can become a strong part of your growth hacking toolkit and lead to more accurate performance. According to the CGI experts, the aim of increasing quality is to use the relevant data from the various sources as efficiently as possible, to optimize decision-making and business processes, and ultimately to increase the competitiveness of a company.
One of the most important requirements for this is continuously high data quality and integrity.
“Companies can no longer postpone the subject of data quality. In many highly regulated industries, high data quality is required by more and more legal regulations. There is no getting around the establishment of company-wide data quality management,” says Knut Veltjens, Vice President / Practice Head Business Intelligence at CGI in Sulzbach near Frankfurt.
CGI shows which measures companies should implement to improve the quality of their data in the area of business intelligence, using the five best practices we present below.
1. Set Priorities for Optimization Measures
As the CGI experts explain, in many cases it makes sense to take stock of the current problems in the specialist departments. The employees know the processes best and know where the weak points of the cross-departmental business processes lie. For example, individual departments may structure customer data records differently. As a result, data are incomplete, articles carry different designations, or individual part numbers are missing from the material database.
All of these deficiencies have different financial implications, according to CGI. In a project to improve data quality, the IT specialists from Frankfurt recommend starting where the cost of elimination is lowest and where monetary process improvements can be achieved as quickly as possible.
2. Maximize Data Availability
Many business processes – or big data applications – require data from several sources. Usually, they receive data automatically via individual data imports or so-called ETL processes (“Extract, Transform, Load”). The more interfaces there are, the higher the maintenance effort and susceptibility to errors. Companies should therefore check in individual cases whether a data warehouse, standardized data pools, or logical or virtual data integration offer a better solution.
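To make the ETL idea above concrete, here is a minimal sketch of an “Extract, Transform, Load” step using only Python's standard library. The source data, field names, and cleansing rules are purely illustrative, not part of CGI's recommendations; real ETL pipelines would read from files, databases, or APIs and apply far richer transformation logic.

```python
import csv
import io
import sqlite3

# Illustrative source data; in practice this would come from a file, API, or database.
RAW_CSV = """customer_id,name,country
1,Acme GmbH,DE
2,Beta AG,de
2,Beta AG,de
3,,AT
"""

def extract(text):
    """Extract: read raw rows from a CSV source."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: normalize country codes, drop incomplete rows, deduplicate."""
    seen, cleaned = set(), []
    for row in rows:
        if not row["name"]:
            continue                      # incomplete record
        key = row["customer_id"]
        if key in seen:
            continue                      # duplicate record
        seen.add(key)
        row["country"] = row["country"].upper()
        cleaned.append(row)
    return cleaned

def load(rows, conn):
    """Load: write the cleaned rows into a target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS customers (id TEXT, name TEXT, country TEXT)")
    conn.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                     [(r["customer_id"], r["name"], r["country"]) for r in rows])

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
print(conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0])  # 2 clean rows remain
```

Each additional source would need its own extract and transform step, which illustrates why more interfaces mean more maintenance effort.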
3. Define Responsibilities
A sustainable improvement in data quality requires changes and measures in the areas of technology, organization, and employees. Often there are already employees in the departments who know “their” business process-relevant data in detail and who are consulted in the event of data quality problems.
In order to increase the quality of the data, it is often not enough to use technical means to eliminate duplicates or to track down and close gaps in the data records. In the role of a data steward, these technical experts draw up business process-related rules for how the data should be generated and maintained. They are also responsible for implementing and complying with the requirements and for continuously adapting the procedures to new requirements.
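One way such steward-defined rules could be codified is as a list of named validation checks that run against each record. The rule names, the part-number pattern, and the country whitelist below are hypothetical examples, not rules from the source; they simply show how business rules can be made explicit and testable.

```python
import re

# Illustrative business rules a data steward might define for customer records.
RULES = [
    ("name_present",   lambda r: bool(r.get("name", "").strip())),
    ("part_no_format", lambda r: re.fullmatch(r"[A-Z]{2}-\d{4}", r.get("part_no", "")) is not None),
    ("country_iso2",   lambda r: r.get("country", "") in {"DE", "AT", "CH"}),
]

def validate(record):
    """Return the names of all rules the record violates."""
    return [name for name, check in RULES if not check(record)]

record = {"name": "Beta AG", "part_no": "AB-12", "country": "DE"}
print(validate(record))  # ['part_no_format'] – part number does not match the agreed pattern
```

Keeping the rules in one named list makes it easy for the data steward to add, adjust, or retire a rule as requirements change, without touching the validation machinery itself.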
4. Build Metrics for Evaluating Data Quality
Another success factor in an effective quality check in the area of business intelligence is, according to CGI, specific metrics that can be used to evaluate the quality of certain data. As examples, the experts cite a maximum of three percent errors in the customer and article data records, a maximum of two percent in the address data, no more than one percent in the material data, or a maximum of three percent in returned invoices.
It is crucial that the key figures are transparent, measurable, and verifiable, so that those responsible for the process can account for their activities or, where necessary, remedial measures can be initiated and their effect checked.
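Such metrics can be made measurable and verifiable with very little machinery. The sketch below uses the thresholds cited above (three percent for customer data, two percent for address data, one percent for material data); the sample datasets and the validity check are invented for illustration.

```python
# Maximum tolerated error rates, taken from the examples cited in the text.
THRESHOLDS = {"customer": 0.03, "address": 0.02, "material": 0.01}

def error_rate(records, is_valid):
    """Share of records failing a validity check."""
    return sum(1 for r in records if not is_valid(r)) / len(records)

def quality_report(datasets, is_valid):
    """Compare each dataset's measured error rate against its agreed threshold."""
    report = {}
    for name, records in datasets.items():
        rate = error_rate(records, is_valid)
        report[name] = {"rate": rate, "ok": rate <= THRESHOLDS[name]}
    return report

# Illustrative data: a record counts as valid here if its 'name' field is non-empty.
datasets = {
    "customer": [{"name": "Acme"}, {"name": ""}],          # 50 % error rate
    "address":  [{"name": "Main St"} for _ in range(100)], # 0 % error rate
    "material": [{"name": "Steel"}, {"name": "Copper"}],   # 0 % error rate
}
report = quality_report(datasets, lambda r: bool(r["name"]))
print(report["customer"]["ok"], report["address"]["ok"])  # False True
```

Because the measured rates are plain numbers compared against agreed thresholds, the report is exactly the kind of transparent, verifiable key figure the process owners can be held to.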
5. Establish Processes for Continuous Improvement
“The introduction of data quality management is not a one-off project; rather, the procedures, processes, and results must be continuously analyzed and improved,” says Veltjens. Ideally, the task should follow the “Plan, Do, Check, Act” (PDCA) cycle. According to Veltjens, this ensures that the processes for safeguarding data quality management and maintaining data integrity are subject to a continuous improvement process. The primary concern is whether internal and external rules and regulations are actually implemented and new requirements are taken into account.
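The PDCA idea can be sketched as a loop that repeatedly measures quality and remediates until a target is met. This toy example is only an illustration of the cycle's shape, not CGI's method; the error definition, remediation step, and target rate are all invented.

```python
def pdca(records, max_error_rate=0.03, max_rounds=5):
    """Toy Plan-Do-Check-Act loop: repeatedly measure data quality and
    remediate until the error rate meets the target or rounds run out."""
    for round_no in range(1, max_rounds + 1):
        # Check: measure the current error rate (empty names count as errors).
        errors = [r for r in records if not r["name"]]
        rate = len(errors) / len(records)
        if rate <= max_error_rate:
            return round_no, rate                      # target met
        # Act: remediate a small batch of faulty records per round
        # (here: fill a placeholder so the record is flagged for follow-up).
        for r in errors[:2]:
            r["name"] = "UNKNOWN"
    return max_rounds, rate

# Illustrative starting state: 10 records, 4 of them faulty (40 % error rate).
records = [{"name": "Acme"} for _ in range(6)] + [{"name": ""} for _ in range(4)]
rounds, final_rate = pdca(records)
print(rounds, final_rate)  # 3 0.0 – target reached after repeated remediation rounds
```

The point of the loop mirrors Veltjens' remark: the measurement and remediation steps are not run once but repeated, so new errors surface and get addressed in the next cycle.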