Everything you always wanted to know about Data Quality Management
In this five-post Q&A series we’ll explore the concept of data quality management: what it is, why you need it and how to achieve it.
#1: What is ‘bad’ data?
Law firms have business applications that contain huge amounts of master and transactional data. Quite often, a surprising amount of this data is ‘bad’.
Bad data commonly includes:
- Data that is repeated within a single system (duplicated)
- Data that is in violation of business rules (invalid)
- Data that is superfluous (redundant or obsolete)
- Billing, correspondence and email addresses that are wrong (inaccurate)
- Transactions that don’t add up to balances (lacking integrity)
- Systems that are supposedly integrated but show different results (inconsistent)
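To make these categories concrete, here is a minimal sketch of how two of them (duplicated and invalid data) might be detected in practice. The client records and field names (`name`, `email`) are illustrative assumptions, not a real schema.

```python
# Illustrative client records; field names are assumptions for this sketch.
records = [
    {"id": 1, "name": "Smith & Co", "email": "billing@smithco.com"},
    {"id": 2, "name": "smith & co", "email": "billing@smithco.com"},  # duplicate of id 1
    {"id": 3, "name": "Jones LLP", "email": "not-an-email"},          # invalid email
]

def find_duplicates(rows):
    """Flag rows whose normalised name has already been seen (duplicated data)."""
    seen, dupes = {}, []
    for row in rows:
        key = row["name"].strip().lower()
        if key in seen:
            dupes.append(row["id"])
        else:
            seen[key] = row["id"]
    return dupes

def find_invalid_emails(rows):
    """Flag rows whose email breaks a simple business rule (invalid data)."""
    return [row["id"] for row in rows if "@" not in row["email"]]

print(find_duplicates(records))      # ids of duplicated records
print(find_invalid_emails(records))  # ids of records with invalid emails
```

Real data-quality tooling applies far richer rules than this, but the principle is the same: every category of bad data can be expressed as a check that either passes or flags a record for attention.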
#2: Why you must fix ‘bad’ data
Bad data (data that is duplicated, invalid, redundant, inaccurate, inconsistent or lacking integrity) can lead to any number of poor business outcomes. These can all harm your profitability.
- Bad data can lead to undue risk and reputational damage, creating unnecessary work for your lawyers.
- Bad data can make it hard for your compliance team to protect your firm against risk.
- Bad data can slow down cash collection and increase operational costs and so make it hard for your finance team to manage your firm’s money.
- Bad data can result in client loss and missed opportunities and so make it hard for your BD team to support your client relationships and develop new business.
- Bad analytic data can result in poor judgements and so make it hard for your management teams to make decisions with confidence.
- Bad data can result in extra remediation and maintenance costs and so make it hard for your IT teams to deliver systems with consistent data.
#3: How do you fix ‘bad’ data?
First, accept that you will have some bad data. Data is a vital asset of every law firm, and it deserves the same high level of management you afford your other assets.
To manage bad data you need to apply a data governance regime. There are software tools available that will help you monitor this regime and resolve fundamental issues:
- Address Quality Management: this tool validates both physical and email addresses
- Data Quality Monitor: this tool monitors financial and system integrity across multiple databases
- Master Data Deduplication: this tool ensures that 3E entities, sites, addresses, relates and payors are only represented once in the 3E database
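As a rough illustration of how a deduplication tool surfaces candidate matches, the sketch below compares entity names for similarity. This is not the Master Data Deduplication tool itself; the entity names and the similarity threshold are assumptions made for the example.

```python
# Hedged sketch: surface candidate duplicate entities by name similarity.
# Threshold and data are illustrative assumptions, not the real tool's logic.
from difflib import SequenceMatcher

entities = ["Acme Holdings Ltd", "ACME Holdings Limited", "Baker & Finch LLP"]

def candidate_pairs(names, threshold=0.8):
    """Return pairs of names whose case-insensitive similarity meets the threshold."""
    pairs = []
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            score = SequenceMatcher(None, names[i].lower(), names[j].lower()).ratio()
            if score >= threshold:
                pairs.append((names[i], names[j]))
    return pairs

print(candidate_pairs(entities))  # the two Acme variants are flagged as a pair
```

In production, matching would also consider addresses, registration numbers and other attributes before a merge; fuzzy name matching alone only proposes candidates for review.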
#4: When do you fix ‘bad’ data?
Most firms will have a big store of ‘bad’ data that needs to be found and fixed first. When this is complete you can move on to ‘maintenance’ mode where data hygiene becomes a business-as-usual function.
A good way to look at this process is:
- One-off data cleansing
This is an extensive remediation exercise, typically performed ahead of a system migration. It gets rid of or corrects bad data before it moves across to the new system.
- Continual data quality monitoring
This means running regular, automatic checks in and across systems. It flags up bad data so you can fix it before it becomes a problem.
- Continual data quality control
This is ongoing work to enforce correct data entry or remediate bad data that has crept back in.
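The continual monitoring step above can be pictured as a simple automated check, such as verifying that transactions add up to the stated balance (the integrity problem from earlier in the series). The ledger structure here is an assumption for illustration only.

```python
# Minimal sketch of a recurring integrity check: do the transactions
# reconcile with the balance? Data shape is an illustrative assumption.
ledger = {
    "balance": 300.0,
    "transactions": [100.0, 150.0, 50.0],
}

def integrity_check(account, tolerance=0.01):
    """Return True when the transactions sum to the stated balance."""
    return abs(sum(account["transactions"]) - account["balance"]) <= tolerance

print(integrity_check(ledger))  # True: 100 + 150 + 50 reconciles with 300
```

Scheduled to run across systems, checks like this catch bad data as it creeps in, rather than months later during an audit or migration.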
#5: Who can fix ‘bad’ data?
You can try to fix your bad data in-house. However, it is quicker and easier to work with expert teams who:
a) know how to design and implement a suitable data governance regime
b) have developed specialist software tools to fix bad data automatically
The best way to fix bad data is to bring a) and b) together. This is exactly what we have done with our managed service. Our expert consultants will continually monitor the quality of your data and take remedial action where necessary, so you can depend on consistently high standards of data hygiene.
In return, you will enjoy:
- Reduced costs, by avoiding operational inefficiencies
- Reduced risk, through better quality data capture
- More informed decision-making, by eliminating duplication and inaccuracy