
The Realities of Portfolio Company Data Quality Every Private Equity Manager Should Know

April 26, 2022

Data, they say, is the new oil. And just like petroleum, raw data needs to be cleaned and refined before it’s a useful fuel. For few is that truer than for private equity managers, who must keep track of hundreds of data elements for each of their portfolio companies.

This was the theme of a recent webinar for fund general partners titled “Value In, Value Out: Business Intelligence Starts with Quality Portfolio Data.” During the event, Hank Boggio, chief commercial officer at Cobalt, observed that “Rather than a necessary evil, portfolio company data is the backbone for reporting and analysis, often driving strategic firm decisions.”

The webinar drew on the experience of Cobalt, a leading provider of portfolio monitoring software, and Alvarez & Marsal, a consulting firm with expertise in private equity performance improvement and analytics.

Here are four of many insights from the event that might surprise general partners (GPs):

Collecting data from portfolio companies saps productivity.

A typical small or midsize private equity manager gets financial updates by email from portfolio companies, with an associate copying the data into spreadsheets. The data is often copied again from those spreadsheets into reports for partners and updates sent to investors. Taken together, all this is a drain on the firm’s resources. “General partners tell us on a regular basis they want to ensure that the investment team spends less time collecting and re-keying data and more monitoring and managing the companies in their portfolio,” Boggio said.

More data has more problems than most understand.

The ad hoc process for gathering information too often fails to spot errors and inconsistencies in the data provided by companies. All that manual copying, in fact, introduces more of them. “If you haven’t taken a close look at the state of your data, you may not really understand the problems you have,” said Cole Corbin, the senior director of fund analytics and reporting services at Alvarez & Marsal. “We’ve seen all kinds of issues, from major integrity problems to cases where like-for-like comparisons were not being done right.”

Employee turnover can undermine data.

Another consequence of informal approaches to collecting information is that a lot of critical knowledge about the process lives only in the minds of a firm’s investing staff. Experienced associates not only know how to transfer information from company reports into the firm’s internal spreadsheets, they also can spot anomalies and errors. “In these times of the Great Resignation, we’ve seen GPs look for ways to lessen their reliance on individuals to collect and organize their data,” Corbin said.

Automating data collection improves data accuracy and saves time.

Instead of manually entering information, GPs are building automated crawlers that read the data sent by companies and enter it into portfolio monitoring software (such as Cobalt). These can be designed with logic that flags errors and other data-quality problems. They can spot internal inconsistencies, e.g., when the revenue line items don’t add up to the total. And they can alert the investment staff if any data element changes unexpectedly from prior periods.
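To make the two checks above concrete, here is a minimal sketch of what such validation logic might look like. The field names, thresholds, and sample figures are hypothetical, not taken from any particular product; real portfolio monitoring systems would apply far more rules.

```python
# Hypothetical data-quality checks of the kind described above:
# (1) verify that revenue line items reconcile to the reported total,
# (2) flag a metric that moves unexpectedly versus the prior period.

def check_line_items(line_items, reported_total, tolerance=0.01):
    """Flag an integrity error if line items don't sum to the reported total."""
    total = sum(line_items.values())
    if abs(total - reported_total) > tolerance:
        return [f"Line items sum to {total:,.2f}, "
                f"but reported total is {reported_total:,.2f}"]
    return []

def check_period_change(metric, current, prior, max_change=0.5):
    """Flag a metric that changed more than max_change (50%) vs. prior period."""
    if prior and abs(current - prior) / abs(prior) > max_change:
        return [f"{metric} changed {100 * (current - prior) / prior:+.0f}% "
                f"from prior period"]
    return []

# Example: a quarterly submission where the segments don't add up
submission = {
    "revenue_items": {"product": 4_200_000, "services": 1_100_000},
    "revenue_total": 5_900_000,   # items actually sum to 5,300,000
    "ebitda": 950_000,
}
prior_ebitda = 400_000

flags = check_line_items(submission["revenue_items"], submission["revenue_total"])
flags += check_period_change("EBITDA", submission["ebitda"], prior_ebitda)
for f in flags:
    print("DATA QUALITY FLAG:", f)
```

Checks like these run on every submission as it arrives, so an inconsistent total or a suspicious swing is surfaced to the investment staff before the data reaches partner reports.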

The webinar walked through a detailed example of how this sort of crawler can be built. It also explored how portfolio monitoring software can identify risks and opportunities in private equity portfolios. These analytical tools can be very powerful, but only if the data they are based on is reliable. “The basic mechanics of collecting the financial, operating, and performance metrics from each portfolio company is very repetitive and highly prone to error,” Boggio concluded. “As with any data-centric system, ensuring that the information captured for reporting and analytics is critical.”

Watch a replay of the webinar here: