Understanding PE Buy-and-Build Success Factors

To compete in the crowded private equity space, buy-and-build strategies need to be focused, intentional plans that involve multiple acquisitions of smaller, often under-performing, companies.

Over the course of the buy-and-build cycle, many factors determine success, but the potential to realize multiple arbitrage begins with the underlying strength of the platform company, and with the leverage that strength gives GPs when seeking lower-multiple deals. If the platform company has the systems for efficient integration, GPs can more confidently pursue under-performing targets at reduced acquisition multiples, securing a key component of the arbitrage strategy.

With efficient integration, performance of the acquired bolt-on can more quickly rise to the level of the already successful platform company. Conversely, inefficient integration not only means continued under-performance and lower multiples for bolt-ons, but also a likely drag on the performance and value of the platform company.


The Value and Role of Data in Private Equity

One of the most critical (and perennially frustrating) elements of integration is data, and the ability to clearly assess performance in aggregate, and in comparison, across business units. Cobbling together reports from disparate systems, or holding out until software integrations can be completed, adds risk to acquiring under-performing assets. Flying blind can lead to reactive adjustments, unintended consequences, and a revolving door for the management team. Conversely, good private equity data intelligence allows add-ons to continue operating from day one without disruption.

Making the case for the importance of data, Boston Consulting Group, in an article titled “Creating Value in Private Equity with Advanced Data and Analytics,” warns “companies that choose not to use data to create value, risk hastening their own obsolescence or, at the very least, losing competitive advantage.”


Integrating disparate data from acquisitions can seem overwhelming, but it doesn’t have to be.  As outlined in previous private equity data intelligence articles, investors need the ability to access near real-time performance data across the portfolio to accurately address and overcome barriers to growth.

To gain the needed data access and insights, leading PE firms employ modern data warehouses and business intelligence tools that:

  1. Enable management to effectively monitor and assess performance across the organization by consolidating data from multiple sources into a unified, consistent format: the single source of truth (see the sketch following this list).
  2. Ensure all stakeholders have trust in the data by unifying definitions for key metrics and validating data against transactional systems across business units.
  3. Provide insight into growth opportunities by tying performance to internal and industry benchmarks.
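
To make the first point concrete, here is a minimal Python sketch of that consolidation step, using pandas and two hypothetical monthly extracts (the company names, column names, and figures are illustrative assumptions, not real data):

```python
import pandas as pd

# Hypothetical extracts: each source system labels the same metrics differently.
platform = pd.DataFrame(
    {"period": ["2024-01"], "net_revenue": [4_200_000], "gross_margin_pct": [0.41]}
)
bolt_on = pd.DataFrame(
    {"month": ["2024-01"], "sales": [950_000], "gm_percent": [0.28]}
)

# Map each source onto one agreed-upon schema: the "single source of truth".
unified = pd.concat(
    [
        platform.rename(columns={"net_revenue": "revenue", "gross_margin_pct": "gross_margin"})
                .assign(entity="Platform Co"),
        bolt_on.rename(columns={"month": "period", "sales": "revenue", "gm_percent": "gross_margin"})
                .assign(entity="Bolt-On Co"),
    ],
    ignore_index=True,
)[["entity", "period", "revenue", "gross_margin"]]

print(unified)  # one consistent table that management and BI tools report against
```

In practice this mapping lives in the warehouse's transformation layer rather than a one-off script, but the principle is the same: every source is translated into one agreed-upon set of columns and definitions before anyone reports on it.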

A Look at the Past: Corporate Information Factories

In the past, data warehouses were rightfully thought of as risky investments. Overly complex, expensive, and time-consuming, data warehousing has until recently been out of reach for most companies. A key driver of this perception (and this reality) is an approach to data warehouse architecture known as the Corporate Information Factory (CIF).

CIFs are representative of the traditional, monolithic enterprise data warehouse. The basis for CIFs is the Inmon data model, a highly normalized design that minimizes data redundancy by splitting information across many tables, which in turn increases the number of joins needed to interconnect data between tables.

The primary benefits of this model are a reduction in data-loading complexity, efficient data refreshes, and a smaller storage footprint.  However, highly complex data models lead to long development cycles and “technical debt,” as mapping a tesseract of table-joins to reports requires complex queries.  Navigating this environment requires advanced technical knowledge, greatly reducing the number of people able to work with the data.
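
For illustration only, here is a small Python/pandas sketch of why query complexity grows in a highly normalized model (the table and column names are hypothetical, not Inmon's actual schema): even a simple question such as "revenue by region" has to walk a chain of tables.

```python
import pandas as pd

# Hypothetical normalized tables, each holding one narrow slice of the data.
orders    = pd.DataFrame({"order_id": [1, 2], "customer_id": [10, 11], "amount": [500.0, 750.0]})
customers = pd.DataFrame({"customer_id": [10, 11], "address_id": [100, 101]})
addresses = pd.DataFrame({"address_id": [100, 101], "region_id": [7, 8]})
regions   = pd.DataFrame({"region_id": [7, 8], "region_name": ["West", "East"]})

# Four tables and three joins just to relate a sale to its region;
# a full CIF repeats this pattern across many more tables.
revenue_by_region = (
    orders.merge(customers, on="customer_id")
          .merge(addresses, on="address_id")
          .merge(regions, on="region_id")
          .groupby("region_name", as_index=False)["amount"].sum()
)
print(revenue_by_region)
```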

Additional points of constraint occur with the CIF's top-down approach to development, under which a full schema must be defined at the outset of the project.

From the beginning, the fully defined schema must allow for all currently known data sources, along with any sources that may be added in the future. Because of the strict schema requirements, and akin to “waterfall” software development, the top-down approach is associated with long-duration projects, high development hours, and increased likelihood the final deliverable will become disconnected from underlying business requirements.

Similarly, making significant changes to an existing CIF (e.g., structural changes necessitated by changing business requirements) can be slow, expensive, and inherently risky—a scenario likely to occur repeatedly with buy-and-build acquisitions.


The Kimball Approach

In sharp contrast to the entry barriers and overhead of CIFs, modern data warehouses employ data modeling that is reporting-friendly, cost-efficient, and easy to implement in an agile, modular way. The basis for this model is the Kimball approach, a method first codified in the late 1990s that later became the established best practice for data modeling.

Kimball models are designed for efficiently reading, summarizing, and analyzing numeric data. They offer a data structure that is easily understandable by average business analysts, and a schema that easily integrates with modern BI tools (Power BI, Tableau, Qlik).
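
Kimball's method is dimensional modeling: numeric measures live in a central fact table, surrounded by descriptive dimension tables in a star schema. The minimal Python/pandas sketch below (with hypothetical table names and figures) shows the shape analysts and BI tools work with:

```python
import pandas as pd

# Fact table: one row per measurable event, keyed to its dimensions.
fact_sales = pd.DataFrame({
    "date_key":    [20240115, 20240116, 20240116],
    "company_key": [1, 1, 2],
    "revenue":     [12_000.0, 8_500.0, 6_200.0],
})

# Dimension tables: descriptive attributes used to slice the facts.
dim_date    = pd.DataFrame({"date_key": [20240115, 20240116], "month": ["2024-01", "2024-01"]})
dim_company = pd.DataFrame({"company_key": [1, 2], "company": ["Platform Co", "Bolt-On Co"]})

# One join per dimension, then summarize: the query pattern BI tools generate.
monthly_revenue = (
    fact_sales.merge(dim_date, on="date_key")
              .merge(dim_company, on="company_key")
              .groupby(["company", "month"], as_index=False)["revenue"].sum()
)
print(monthly_revenue)
```

Because every question follows the same join-then-summarize pattern against one fact table, the structure stays readable to a business analyst and maps cleanly onto tools like Power BI, Tableau, and Qlik.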

Systems based on Kimball's methods also enjoy cost and efficiency advantages over the CIF approach. Implementation follows an agile model: in contrast to top-down, waterfall development, it brings fast delivery times, quicker ROI, and less risk of development deviating from business requirements. Key private equity portfolio analytics reports can be produced within weeks of kickoff rather than months, and data sources can be added as follow-on, stand-alone sprints.

The Modern Data Warehouse (a hybrid approach)

Borrowing from both methodologies (with a bias toward Kimball) produces the "Modern Data Warehouse," an approach that has emerged as the leading best practice for building data warehouses that are fast to develop, highly scalable, and easy to work with.

The resulting modern data warehouse supports efficient bolt-on integration by providing PE firms the following benefits:

  • Near-term ROI through agile implementation that delivers actionable reports early and often
  • Data structures that are reporting-friendly and focus on specific, immediate reporting needs
  • Data warehouses that are flexible, and that adapt to new reporting requirements
  • Data sources that are easily incorporated, with no requirement to extensively change existing processes and structures

Private Equity Data Warehouse Design Takeaway

The modern data warehouse gives GPs a key advantage in competitive markets. By enabling strong platform companies to efficiently integrate under-performing bolt-ons, it allows GPs to pursue lower-multiple acquisitions and lay the groundwork for successful arbitrage.

Curious to see how your portfolio can mobilize data to improve business outcomes? Blue Margin offers a no-cost consultation to help sponsored companies understand their data-readiness and opportunity to use data as a differentiator for growth and valuation. Click here to request more info.



Written by Jon Thompson

Jon Thompson is co-founder and Chief Strategy Officer at Blue Margin Inc. An author and speaker, Jon sheds light on how businesses can take advantage of a revolution in business intelligence to become data-driven and accelerate their success.