How Do Good Processes Produce Good Data?

Good data is the basis of good decisions in companies. It is also a prerequisite for reaping the benefits of business intelligence and machine learning. It is therefore surprising how much corporates spend on shiny technology compared to what they invest in producing good data. High-quality data is an asset to companies that know how to use it.

1. Business objectives

The purpose of collecting good data is obvious, and nobody sets out to produce bad data. But the definition of good data must go back to the business purpose. As always, Business First, Technology Second is essential to reach the right destination; data quality projects are no exception to the rule. The relationship between data quality and business performance is not always direct and may require careful analysis. The data is not the business itself, but it is usually a digital representation of reality. When the two are very close, manipulating one efficiently is key to controlling the other.

Business Critical Number

Identifying the information critical to achieving a business goal helps to sort out the relative importance of data points. This hierarchy will guide a data quality project and ensure it has a positive business impact.

Value of Data

Assigning value to the data is an excellent way to prioritize. Computing how much value quality data brings in, or how much value poor-quality data destroys, will reveal the return on data quality improvement.
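
As a back-of-the-envelope illustration, a few lines of Python can rank data points by the value poor quality destroys; all field names and figures below are illustrative assumptions, not benchmarks:

```python
# A minimal sketch of prioritizing data quality work by business value.
# All figures are illustrative assumptions, not benchmarks.

# Hypothetical data points: volume, current error rate, and the estimated
# cost of one error (rework, lost sales, failed deliveries).
data_points = [
    {"field": "customer_email", "records": 100_000, "error_rate": 0.08, "cost_per_error": 12.0},
    {"field": "delivery_address", "records": 100_000, "error_rate": 0.03, "cost_per_error": 45.0},
    {"field": "internal_comment", "records": 100_000, "error_rate": 0.20, "cost_per_error": 0.0},
]

for dp in data_points:
    # Value destroyed per year by poor quality on this field.
    dp["value_destroyed"] = dp["records"] * dp["error_rate"] * dp["cost_per_error"]

# Rank fields by the value a quality improvement would recover.
for dp in sorted(data_points, key=lambda d: d["value_destroyed"], reverse=True):
    print(f'{dp["field"]:>20}: {dp["value_destroyed"]:>10,.0f} destroyed per year')
```

In this toy example, the delivery address, not the noisiest field, is where a quality project pays back most.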

2. Data Creation

Companies create data in several ways:

  • data entry,
  • automated measure,
  • data generation.

Data Generation

Companies generate data through automated processes, analysis, enrichment, conversion, exchange, and more. It is critical to monitor these processes carefully and update them regularly. Generated data relies on other information and is only relevant as long as the underlying data source does not change, in structure or in content. It is the nature of the business to change, evolve, and adapt; the underlying data will change with it, and companies must keep their data generation up to date.
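
One practical safeguard is a guard clause that checks the upstream source, in structure and in content, before regenerating data. A minimal Python sketch, with the column names, statuses, and sample row all assumed for illustration:

```python
# A minimal sketch of guarding a data-generation job against upstream change.
# Column names, statuses, and thresholds are assumptions for illustration.

EXPECTED_COLUMNS = {"order_id", "amount", "currency", "status"}
EXPECTED_STATUSES = {"open", "shipped", "closed"}

def validate_upstream(rows: list[dict]) -> list[str]:
    """Check the source both in structure (columns) and content (values)."""
    issues = []
    if rows:
        missing = EXPECTED_COLUMNS - rows[0].keys()
        if missing:
            issues.append(f"structure changed, missing columns: {missing}")
    new_statuses = {r["status"] for r in rows if "status" in r} - EXPECTED_STATUSES
    if new_statuses:
        issues.append(f"content changed, unknown statuses: {new_statuses}")
    return issues

rows = [{"order_id": 1, "amount": 99.0, "currency": "EUR", "status": "returned"}]
for issue in validate_upstream(rows):
    print("BLOCK GENERATION:", issue)  # fail fast instead of generating stale data
```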

Automated Measure

Automated data capture is usually the most stable form of data creation. Acquiring raw data from a sensor will provide comparable results over the long term.

Data Entry

Manually capturing pieces of information is obviously the least reliable way to create data. There is a whole spectrum of reasons why data capture produces unsatisfactory results, from human mistakes to misunderstandings. But companies have no choice but to rely, at least partially, on an initial input of raw data by their employees, suppliers, customers, and other stakeholders.

3. Process Mapping

In all scenarios, data issues and root cause analysis go back to a clear understanding of the processes used to generate, measure, or input data. This understanding can come from manually mapping the process or from Process Mining techniques.
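
As a flavor of what Process Mining does, the sketch below rebuilds an as-is process map from a raw event log by counting step-to-step transitions. The log format (case id, activity) and the sample events are assumptions; dedicated tools such as pm4py go much further:

```python
# A minimal sketch of mapping an as-is process from an event log,
# the core idea behind process mining.
from collections import Counter

event_log = [
    ("c1", "create_order"), ("c1", "approve"), ("c1", "ship"),
    ("c2", "create_order"), ("c2", "ship"),            # approval skipped
    ("c3", "create_order"), ("c3", "approve"), ("c3", "approve"), ("c3", "ship"),
]

# Rebuild the trace of each case, preserving event order.
traces: dict[str, list[str]] = {}
for case_id, activity in event_log:
    traces.setdefault(case_id, []).append(activity)

# Count direct step-to-step transitions: the edges of the as-is process map.
edges = Counter(
    (a, b) for trace in traces.values() for a, b in zip(trace, trace[1:])
)
for (a, b), n in edges.most_common():
    print(f"{a} -> {b}: {n}")
```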

Past and Future

Without a clear understanding of the processes at play, it would be risky to attempt any fix. The same behavior tends to produce the same results: fixing past data will not affect future data if a broken process produced it.

Process Accuracy

One of the critical measures of a process is its accuracy. It represents how close a process representation is to reality. Some questions to answer to evaluate accuracy are:

  • Have we incorporated all possible cases?
  • Are there exceptions to the most common process?
  • What happens when this step fails in the process?
  • Can this step be executed more than once?

For long and complex processes executed across many cases, Process Mining will provide a high level of accuracy and an actual number with which to evaluate it.
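
A minimal sketch of how such a number can be computed: the share of observed traces that the mapped process actually covers. The model and the traces are illustrative assumptions:

```python
# A minimal sketch of putting a number on accuracy: the share of real
# cases that the mapped process covers.

# The documented ("mapped") process: the traces we believe can happen.
mapped_model = {
    ("create_order", "approve", "ship"),
    ("create_order", "reject"),
}

# Traces observed in the system of record.
observed = [
    ("create_order", "approve", "ship"),
    ("create_order", "approve", "ship"),
    ("create_order", "ship"),               # a case the map never anticipated
    ("create_order", "reject"),
]

covered = sum(1 for trace in observed if trace in mapped_model)
accuracy = covered / len(observed)
print(f"accuracy of the process map: {accuracy:.0%}")  # 75%
```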

Process Precision

The precision of a process representation measures how narrow the representation is, preventing undesirable variations. It is a better predictor of the quality of the generated data than accuracy. Process design often has a bias against precision: users' pressure for “freedom” does not lead to precise processes that allow only the best path of execution.
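
Precision can also be put into a number: the share of the behavior the system allows that is actually used. A permissive design that allows many unused paths scores low. A minimal sketch, with the allowed and observed variants assumed for illustration:

```python
# A minimal sketch of precision: how much of what the system allows is
# actually used. The model and the log are illustrative assumptions.

# Everything the implemented system allows (optional steps, free ordering).
allowed_variants = {
    ("create_order", "approve", "ship"),
    ("create_order", "ship", "approve"),   # approval after shipping is allowed
    ("create_order", "ship"),              # no approval at all is allowed
    ("create_order", "reject"),
}

# Variants that teams actually execute.
observed_variants = {
    ("create_order", "approve", "ship"),
    ("create_order", "reject"),
}

precision = len(allowed_variants & observed_variants) / len(allowed_variants)
print(f"precision of the process representation: {precision:.0%}")  # 50%
```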

4. Process Optimization

Based on a precise mapping of the critical processes, the next step is to diagnose the issues causing the creation of poor-quality data.

Low Accuracy

Simple data quality controls will expose low accuracy in the process representation. Analytics will reveal that the created data does not follow the expected statistical distribution. The most common symptoms in enterprise software are (see the sketch after this list):

  • low concentration of data, in other words, lots of empty pieces of information,
  • unexpected events,
  • inconsistent results, unbalanced amounts,
  • duplicate information.
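
A minimal sketch of such controls over a list of records; the field names, the sample data, and the 20% emptiness threshold are illustrative assumptions:

```python
# A minimal sketch of simple data quality controls: emptiness and duplicates.
from collections import Counter

records = [
    {"id": 1, "amount": 100.0, "email": "a@example.com"},
    {"id": 2, "amount": None,  "email": ""},
    {"id": 2, "amount": 100.0, "email": "a@example.com"},  # duplicate id
]

# Low concentration: fields that are mostly empty.
for field in ("amount", "email"):
    empty = sum(1 for r in records if not r.get(field))
    if empty / len(records) > 0.2:
        print(f"low concentration: {field} empty in {empty}/{len(records)} records")

# Duplicate information: the same key appearing more than once.
dupes = [k for k, n in Counter(r["id"] for r in records).items() if n > 1]
if dupes:
    print(f"duplicate ids: {dupes}")
```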

Low Precision

An enterprise software system implemented with low precision gives too much freedom to its users, resulting in many different ways to process the same information. That does not necessarily lead to poor data quality, but it will inevitably create different habits and different levels of commitment to data quality among users. Process mining reveals the symptoms of low precision (see the sketch after this list):

  • too many variants of the same process,
  • events executed out of order,
  • missing events,
  • significant disparity in waiting or execution time.
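
A minimal sketch of detecting these symptoms in an event log; the expected order and the sample traces are illustrative assumptions:

```python
# A minimal sketch of spotting low-precision symptoms in an event log:
# variant explosion, out-of-order events, and missing events.
from collections import Counter

expected_order = ["create_order", "approve", "ship"]
rank = {step: i for i, step in enumerate(expected_order)}

traces = {
    "c1": ["create_order", "approve", "ship"],
    "c2": ["create_order", "ship", "approve"],   # executed out of order
    "c3": ["create_order", "ship"],              # missing event
    "c4": ["create_order", "approve", "ship"],
}

variants = Counter(tuple(t) for t in traces.values())
print(f"{len(variants)} variants across {len(traces)} cases")

for case, trace in traces.items():
    ranks = [rank[s] for s in trace if s in rank]
    if ranks != sorted(ranks):
        print(f"{case}: events executed out of order: {trace}")
    missing = set(expected_order) - set(trace)
    if missing:
        print(f"{case}: missing events: {missing}")
```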

5. User experience design

In his book The Design of Everyday Things, Don Norman explains that the most catastrophic industrial accidents attributed to human error can be traced back to a critical user experience design flaw. In one of the commissions he joined as an expert, he pointed out that “the plant’s control rooms were so poorly designed that error was inevitable.” The same applies to data entry: a bad user experience makes the production of bad data inevitable.

Less is More

As a general principle, data quality increases with less data to manage. In terms of user experience, humans make more mistakes when they have to input more data: repetitive tasks lead to inattention, and a user wants to complete a transaction as fast as possible to get to the next task. Reducing the size of a form increases the chances of better data entry quality.

Context is Important

Dealing with complex information in complex systems requires users to understand the context in which they operate. That is an unnecessary burden to put on the user’s shoulders. An intelligent system can place the user in a known context and take advantage of all the information available.
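
A minimal sketch of this idea: the system derives form defaults from the context it already knows, leaving the user only the real decision. The context fields are illustrative assumptions:

```python
# A minimal sketch of placing the user in a known context: the system
# derives defaults from what it already knows instead of asking again.

def prefill_order_form(user_context: dict) -> dict:
    """Build a form prefilled from context; the user only confirms or overrides."""
    return {
        "currency": user_context.get("company_currency", ""),    # known from the entity
        "warehouse": user_context.get("default_warehouse", ""),  # known from past orders
        "requested_by": user_context.get("user_id", ""),         # known from the session
        "quantity": None,  # the only real decision left to the user
    }

context = {"company_currency": "EUR", "default_warehouse": "WH1", "user_id": "alice"}
print(prefill_order_form(context))
```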

Non-Decisions

Enterprise systems are full of non-decisions, redundant steps, ineffective controls, and validations. The analysis of a trace of transactions will often reveal the optimization possible by combining steps into an optimized, simple, direct, and dynamic process. These accelerated processes will produce high-quality data because:

  • the user will input less data,
  • the system takes advantage of the context,
  • the system makes the non-decisions.
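
A minimal sketch of finding non-decisions in a trace of transactions: any input that receives the same value almost every time is a candidate for a system default or an automated step. Field names, sample data, and the 75% threshold are illustrative assumptions:

```python
# A minimal sketch of detecting non-decisions in a transaction trace.
from collections import Counter

transactions = [
    {"priority": "normal", "warehouse": "WH1", "approver": "alice"},
    {"priority": "normal", "warehouse": "WH1", "approver": "bob"},
    {"priority": "normal", "warehouse": "WH1", "approver": "alice"},
    {"priority": "urgent", "warehouse": "WH1", "approver": "carol"},
]

for field in ("priority", "warehouse", "approver"):
    value, count = Counter(t[field] for t in transactions).most_common(1)[0]
    share = count / len(transactions)
    if share >= 0.75:  # threshold is an assumption
        print(f"{field}: '{value}' chosen {share:.0%} of the time, a non-decision candidate")
```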

Conclusion

Mature corporates with effective systems of record (ERP, CRM, OMS, and more) can improve their efficiency and data quality by launching data-driven optimization projects to design and implement accelerated processes. These projects have a dual payback in cost savings and lead time reduction.

We are Here to Help

At System in Motion, we are committed to building long-term solutions and solid foundations for your Information System. We can help you optimize your Information System, generating value for your business. Contact us for any inquiry.
