Recently, my colleague Tony Pullman wrote an article on the value of system interoperability and what prevents it. At Pinnacle, we regularly explore the issue of interoperability with our clients and their development teams. We want to accelerate interoperability, the art of making systems, products and software work together, and help firms leverage the value of integrating data that often resides in separate silos. With the economic climate gloomy and its impact on law firms uncertain, those firms that achieve data interoperability will be empowered with business-critical information and better prepared to navigate the next few years. Here, I explore what law firm leaders are doing to enhance their data-driven decisions in 2023 and beyond.


Joining data up

Today, smarter firms are creating interconnected data. Some are doing this with integrations between software platforms – working with the Intapp platform, we see many clients bringing data together and exposing it in risk, CRM or other interfaces.

It’s relatively easy to bring data into a single platform if you have one, but many firms operate from disparate systems. Even those operating some kind of unified data system generally haven’t yet succeeded in migrating everything onto a single platform. As a result, many firms are using Microsoft Power BI to bring that disparate data together, with these aggregations being surfaced in real time and without user intervention.


Creating insights

The value of joining up data from separate systems in real time is often significantly underestimated. We are seeing some firms forecast revenues by combining opportunity data from their CRM systems with capacity data from finance and HR systems. Others track website traffic and send targeted offers instantly. And some provide real-time performance management of their business acceptance teams, allowing those teams to allocate work proactively and plan resourcing, driving business efficiencies.
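To make the first of these concrete, here is a minimal sketch of that kind of revenue forecast, combining a CRM opportunity extract with a finance/HR capacity extract. The file names, column names and the capping logic are illustrative assumptions, not any particular firm's systems.

```python
# A minimal sketch of combining opportunity and capacity data for a forecast.
# File and column names are hypothetical; real firms would pull from their
# CRM and finance/HR systems rather than CSV extracts.
import pandas as pd

# Opportunity pipeline from the CRM: expected fee value and win probability
opportunities = pd.read_csv("crm_opportunities.csv")  # practice_group, expected_fees, win_probability
# Fee-earner capacity from finance/HR: available chargeable hours and rates
capacity = pd.read_csv("hr_capacity.csv")             # practice_group, available_hours, avg_rate

# Probability-weighted pipeline revenue per practice group
pipeline = (
    opportunities
    .assign(weighted_fees=lambda d: d["expected_fees"] * d["win_probability"])
    .groupby("practice_group", as_index=False)["weighted_fees"].sum()
)

# Deliverable capacity expressed in revenue terms, then join the two views
capacity_value = capacity.assign(capacity_fees=lambda d: d["available_hours"] * d["avg_rate"])
forecast = pipeline.merge(capacity_value[["practice_group", "capacity_fees"]], on="practice_group")

# Forecast revenue is capped by what the group can actually deliver
forecast["forecast_revenue"] = forecast[["weighted_fees", "capacity_fees"]].min(axis=1)
print(forecast)
```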


The key is data quality

Law firms are realising that, in order to combine data and create insights at the speed at which the firm wants to work, data quality needs to be high, and this is an issue for many firms. According to Forrester, data quality issues are such a pervasive problem that nearly one-third of analysts spend more than 40% of their time vetting and validating their analytics data, and Ovum Research reported that poor data can cost businesses 30% or more of their revenues.

To address this, firms need to build systems and processes that can handle poor-quality data, put robust controls in place that stop more low-quality data from entering their systems, and cleanse only the legacy data that matters.

Stopping poor-quality data coming in has been a focus of some of Pinnacle’s more technical resources. In our increasingly digitised world, it’s more and more feasible to remove manual data entry – instead, users can simply select data, choosing addresses from a global lookup, company information from a register, or referrers from the CRM system.

For some core applications used in law firms, our implementation teams have developed ways to provide external data as a lookup for users to select from. The major advantages are fewer user keystrokes, data quality that is as good as the source, and a reliable key back to the external data.
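As an illustration of that lookup pattern, the sketch below matches a user-typed fragment against an external company register and stores the register's stable identifier as the reliable key. The endpoint, field names and response shape are placeholders, not any real register's API.

```python
# A minimal sketch of the lookup pattern: instead of typing a company name,
# the user picks from register results, and the record keeps the register's
# stable identifier as a reliable external key. The endpoint and fields are
# hypothetical placeholders.
from dataclasses import dataclass
import requests

REGISTER_SEARCH_URL = "https://registry.example.com/search"  # hypothetical endpoint

@dataclass
class CompanyCandidate:
    register_id: str   # the reliable key stored alongside the client record
    name: str          # canonical name as held by the register
    address: str

def lookup_companies(fragment: str, limit: int = 10) -> list[CompanyCandidate]:
    """Return register matches for a user-typed fragment, for selection in the UI."""
    response = requests.get(REGISTER_SEARCH_URL, params={"q": fragment, "limit": limit}, timeout=5)
    response.raise_for_status()
    return [
        CompanyCandidate(item["id"], item["name"], item["address"])
        for item in response.json()["results"]
    ]

# The form then saves candidate.register_id with the client record, so later
# systems can re-match against the register without fuzzy name comparison.
```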

To address the processes underlying these issues, leading law firms are now aiming to put tools in place that continually monitor the quality of their data: checking, for example, that every client number in the practice management system also exists in both the risk and CRM systems, that client names and statuses are the same across all three, or that the firm's list of employees and users is consistent across all systems. It's worth the effort, because inconsistencies create interoperability failures and stop individuals trusting real-time reporting.

Systematic monitoring removes these issues and alerts teams so they can rectify data problems before users notice and, importantly, before decisions are made on the back of bad data. The same tools can also protect data quality when an application cannot support an external lookup: the monitoring tool silently performs the match after the event and highlights data issues to be fixed in the core application. Not an ideal scenario, but it allows poor data entry to be rectified.
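A minimal sketch of the kind of consistency monitoring described above might look like the following: it takes extracts from the practice management, risk and CRM systems, flags client numbers missing from either downstream system, and flags names or statuses that disagree. The file and column names are hypothetical, and a production version would run on a schedule and alert the owning team rather than print to a console.

```python
# A minimal sketch of a cross-system consistency check. Table and column
# names are hypothetical extracts from the practice management (PMS), risk
# and CRM systems, not real product schemas.
import pandas as pd

pms = pd.read_csv("pms_clients.csv")    # client_number, client_name, status
risk = pd.read_csv("risk_clients.csv")  # client_number, client_name, status
crm = pd.read_csv("crm_clients.csv")    # client_number, client_name, status

issues = []

# 1. Every PMS client number should exist in both the risk and CRM systems
for name, other in [("risk", risk), ("CRM", crm)]:
    missing = set(pms["client_number"]) - set(other["client_number"])
    issues += [f"Client {c} missing from {name} system" for c in sorted(missing)]

# 2. Where a client exists everywhere, names and statuses should agree
merged = (
    pms.merge(risk, on="client_number", suffixes=("_pms", "_risk"))
       .merge(crm.rename(columns={"client_name": "client_name_crm", "status": "status_crm"}),
              on="client_number")
)
for field in ("client_name", "status"):
    mismatched = merged[
        (merged[f"{field}_pms"] != merged[f"{field}_risk"])
        | (merged[f"{field}_pms"] != merged[f"{field}_crm"])
    ]
    issues += [f"{field} differs across systems for client {c}" for c in mismatched["client_number"]]

# In practice this would alert the owning team (email, ticket) rather than print.
for issue in issues:
    print(issue)
```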


Making it happen

Unclean data is very costly, but cleansing it is also hugely expensive and often needs to be repeated every few years.

Leading firms avoid relying on repeated cleansing projects. Instead, they are putting data at the heart of their thinking and decision-making. They are giving the business the tools to consistently and proactively highlight poor-quality data, and the individual teams the tools and responsibility to maintain their own data quality. This is the only way firms can reliably make 2 + 2 = 4, or, even better, turn that equation into a value multiplier.