Operational due diligence is a specific type of consulting project in the fast-paced world of mergers and acquisitions (‘M&A’), where advisors assess the operations of a business to determine the appropriate cost base going forward. As a Director in the Strategy and Transactions team at EY, Ali Adam leverages both operational and financial data to provide fast insights.
In today’s conversation, Ali met with Joel Lister-Barker, Host of Talking Data, to talk about the data sources that he finds most valuable on due diligence projects and the process that he follows to validate client data.
I started my career at EY in London, UK, working in the audit practice and qualifying as a Chartered Accountant. After three years I moved into the world of transactions, where I perform operational diligence on potential deals across all sectors for private equity clients.
Data quality is super important in M&A. While the sell-side usually cleans up any company data, we do receive data from target companies in all shapes and sizes, and of varying quality. In the world of operational due diligence, everything comes back to a cost line item in the Profit and Loss (‘P&L’). So, whenever we receive information, the first thing that we do is confirm whether it reconciles back to the single source of truth: the P&L. If there are discrepancies between these sources, then we will discuss them with the management team of the target company. This is the general approach that I take to validate any data received from the client or target company.
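The reconciliation check described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration: the line items, amounts, and tolerance are invented, not taken from any real engagement.

```python
# Hypothetical sketch of the reconciliation check: totals from a received
# data source are compared against the P&L (the "single source of truth"),
# and any cost lines that do not tie out are flagged for discussion.
# All line items and amounts are illustrative.

pnl = {                      # cost per line item from the P&L
    "Salaries": 1_200_000,
    "IT spend": 350_000,
    "Facilities": 180_000,
}

received = {                 # the same lines summed from the target's data
    "Salaries": 1_200_000,
    "IT spend": 310_000,     # does not tie out
    "Facilities": 180_000,
}

TOLERANCE = 0.01             # allow rounding differences of up to 1%


def reconcile(pnl, received, tolerance=TOLERANCE):
    """Return line items whose received total differs from the P&L."""
    discrepancies = {}
    for item, pnl_value in pnl.items():
        other = received.get(item)
        if other is None or abs(other - pnl_value) > tolerance * abs(pnl_value):
            discrepancies[item] = (pnl_value, other)
    return discrepancies


issues = reconcile(pnl, received)
for item, (expected, actual) in issues.items():
    print(f"{item}: P&L {expected:,} vs received {actual:,} -> discuss with management")
```

Any line that falls outside the tolerance goes on the list of questions for the target's management team.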
It’s hard to provide an example because every single project uses data to perform some kind of analysis that drives a result for the client. One recent example was for a large healthcare and cosmetics retailer that our client was looking to acquire. The financial data provided was so inconsistent that no one could understand what was really going on within the company. It was only when we pieced together the cost base using different subsets of data that we could truly understand it from a people and non-people perspective. This included using people data from the HR Information System (‘HRIS’), vendor spend from the procurement system, and other sources for all remaining costs. From this re-baselining exercise, we were able to confirm our hypothesis that an exorbitant amount of cost could be taken out through process improvement.
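The re-baselining exercise amounts to rebuilding one cost base from several systems. A toy sketch, with entirely invented figures and categories, might look like this:

```python
# Illustrative sketch of re-baselining: rebuilding the cost base from
# separate systems (HRIS, procurement, other) when the P&L alone is too
# inconsistent, then splitting it into people vs non-people cost.
# All data below is invented for illustration.

hris_people_costs = [            # from the HR Information System
    {"function": "Retail Ops", "annual_cost": 900_000},
    {"function": "Head Office", "annual_cost": 400_000},
]

procurement_spend = [            # vendor spend from the procurement system
    {"vendor": "LogisticsCo", "annual_cost": 250_000},
    {"vendor": "CleanCo", "annual_cost": 60_000},
]

other_costs = [                  # everything not covered by the two systems
    {"item": "Rent", "annual_cost": 180_000},
]

people_total = sum(r["annual_cost"] for r in hris_people_costs)
non_people_total = (sum(r["annual_cost"] for r in procurement_spend)
                    + sum(r["annual_cost"] for r in other_costs))
rebuilt_cost_base = people_total + non_people_total

print(f"People: {people_total:,}  Non-people: {non_people_total:,}  "
      f"Total: {rebuilt_cost_base:,}")
```

The rebuilt total then becomes the baseline against which cost-reduction hypotheses are tested.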
In any due diligence or value creation project you always want to perform a comparison to peer companies. This can be at an industry level or within a subset of comparable companies, with the primary goal of identifying opportunities for cost improvements. Typically, we use our own proprietary data and third-party sources, such as APQC. However, data quality and transparency from external sources can be an issue, so we are careful when we use them in our analysis.
We typically look at cost and FTE benchmarks as part of any operational due diligence project. It is important to understand that benchmarks need context because no two businesses are the same. For example, every business will have varying levels of insourcing and outsourcing. From my perspective, benchmarks are an additional data point to support our point of view; they are not a leading indicator.
I would love there to be an aggregator of all the data sources out there where I could access or buy reliable data. Trying to find trusted information usually involves hours of searching the internet; sometimes you find what you need and other times you do not.
Insights are just around the corner.