Glenn Hopper is a strategic finance leader with two decades of experience and a proven track record of leading transformational change and integrating best-in-class technologies into finance operations and policy. His expertise spans financial leadership and strategy, finance automation, data science, and operational efficiency. With extensive experience working with private equity-backed companies and expertise in leading M&A, Glenn is knowledgeable in leading processes and technological implementations for organizations of all sizes. He sat down with AlixPartners to discuss the coming age of AI in finance and how investors, operators, and portfolio company management can begin to implement new ways of working.

Glenn, thanks again for taking the time to connect virtually. I want to jump right into this pervasive idea of digital evolution and the role of automation & AI in the finance function. Given the pressure from their investors on accelerated value creation, how should portfolio company CFOs start to think about introducing automation/ML/AI assets? What are two or three critical checks they must consider before plunging into digital evolution?

That’s the million-dollar question, right? I think about where most companies are when they take on PE investment. They’ve obviously established strong leadership, positive cash flows, a sustainable competitive advantage with growth opportunities, and a clear path to exit. In a perfect scenario, that operational excellence extends into the back-office operations of the business, but this is frequently an area that needs a lot of work in those early stages.

Once you take on the investment, the clock is ticking, and the lift can be significant. One thing I learned the hard way along the path is not to immediately dive into the idea that step one is to implement something like a new enterprise resource planning (ERP) system. ERP implementations are difficult, time-consuming, and require cross-functional support to be successful. There’s a reason more than 60% of implementations fail. That said, the promise of life on the other side of the implementation can be worth the effort – as long as you plan and manage accordingly.

A new ERP tool isn’t a magic bullet, and there are plenty of steps businesses can and should take before undertaking an implementation project.

I’m a big Agile guy. Rather than trying to paint an entire roadmap of life on the other side of the rainbow, start with your current processes and procedures. Inventory the data you are collecting today, define sources of truth, and work to establish a data lexicon within your organization. Understand the important metrics, validate how you measure them, and be sure you know the levers that drive change.

By starting with a solid understanding and documentation of where you are today, you can start to plot the course forward.

Notice I haven’t even mentioned artificial intelligence or machine learning yet? That’s because if you don’t have a handle on your current processes and data, these technologies won’t be much use. So it starts with your company’s data maturity and processes.

Point taken; it's a good one. Once you have the basics in place, how should you begin thinking about AI and ML in finance?

At the risk of being overly reductive, the three steps to move toward maximizing automation and incorporating AI/ML into your back-office suite are:

First, evaluate your company’s data maturity and integrity. As we were just saying, you need to understand the quality, accessibility, and integration capabilities of your existing data. It's vital to define your key metrics, ensure data integrity, and establish a single source of truth for each of your metrics. Once you have that sorted, you can start incorporating AI tools into your workflows and reporting.

Second, identify and optimize existing processes. This is difficult work, but if you don’t understand your current processes and where roadblocks or bottlenecks are, you don’t even know the requirements of any new system you might want to put in place. Like any new technology, AI won’t magically fix bad processes. This step involves documenting all of your back-office operations from the sales process through onboarding, servicing, invoicing, and eventual churn of clients. Map out all your procedures, pinpoint inefficiencies, and highlight areas where automation and AI could bring significant improvements in cost, speed, effectiveness, or all three.

Third, strategically plan for incremental implementation. It's a bad idea to try to eat an elephant in one bite. Before we start ripping out existing systems, let’s look at some small wins we can achieve quickly without a massive effort. Solving individual issues is much easier than tackling the entire transformation. For example, look at ways you can move and consolidate data between systems. This may involve integrating systems via APIs or consolidating data into a data lake or data warehouse. It may involve implementing a software reporting layer or turning on unused functionality in current systems. Working incrementally has another advantage: it helps you build support for the transformation and create a culture of continuous improvement, which sets the company up for change management.
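To make the "small win" concrete, here is a minimal, hypothetical Python sketch of consolidating customer records from two unintegrated systems (a CRM and the accounting package) into a single warehouse table and flagging disagreements. All system names, fields, and sample data are invented for illustration; in practice the payloads would come from each system's API or export.

```python
import sqlite3

# Simulated payloads from two back-office systems (in practice these would
# come from each system's API or a scheduled export; field names invented).
crm_customers = [
    {"crm_id": "C-001", "company": "Acme Co", "billing_email": "ap@acme.com"},
    {"crm_id": "C-002", "company": "Globex", "billing_email": "billing@globex.com"},
]
gl_customers = [
    {"gl_code": "1001", "name": "Acme Co", "ar_contact": "accounts@acme.com"},
]

def normalize(crm_rows, gl_rows):
    """Map both systems onto one shared schema, keyed by company name."""
    merged = {}
    for row in crm_rows:
        merged[row["company"]] = {"company": row["company"],
                                  "crm_billing_email": row["billing_email"],
                                  "gl_ar_contact": None}
    for row in gl_rows:
        rec = merged.setdefault(row["name"], {"company": row["name"],
                                              "crm_billing_email": None,
                                              "gl_ar_contact": None})
        rec["gl_ar_contact"] = row["ar_contact"]
    return list(merged.values())

def load_warehouse(rows):
    """Load normalized records into one table and flag mismatched contacts."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE customers "
                "(company TEXT, crm_billing_email TEXT, gl_ar_contact TEXT)")
    con.executemany("INSERT INTO customers "
                    "VALUES (:company, :crm_billing_email, :gl_ar_contact)", rows)
    # Rows where the two systems disagree, or where one system is missing data,
    # become a worklist for establishing a single source of truth.
    mismatches = con.execute(
        "SELECT company FROM customers "
        "WHERE crm_billing_email IS NULL OR gl_ar_contact IS NULL "
        "   OR crm_billing_email != gl_ar_contact").fetchall()
    return con, mismatches
```

The point isn't the tooling (real implementations would likely use an ETL platform or warehouse-native pipelines) but the pattern: consolidate first, surface conflicts, then fix them at the source.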

Putting your PE sponsor hat on briefly, how should investors and operators think about introducing ML/AI assets across their portfolios? Every portco is different, but could we eventually see unified dashboards of operational & financial feeds? 

This is where PE firms have an advantage. Remember, data is the fuel that powers AI/ML, and PE firms have data from multiple companies, which they can aggregate to put to better use than a single company could. But you have to be able to get access to that data. 

You’re right – every portco is different, probably collecting vastly different amounts and types of data or measuring metrics differently. The last thing you want to do is embark on a project across all your portcos to try to change all the systems to mirror one another, which could bog you and them down in the equivalent of a months-long root canal. But having all that data tied together sure would be valuable.

This is one of the magic powers of AI – generative AI in particular. There’s not (yet) an off-the-shelf tool that does this, but I suspect it’s coming from one of the big players soon enough. The development path, though, would be something like:

  1. Data aggregation: Bring relevant portco data into a single data lake, whether via API, automated reports, or manual upload. (Think of a standardized report template containing fields for the data you’re collecting as the worst-case scenario.)
  2. Generative AI layer: Deploy generative AI models to process, analyze, and interpret the aggregated data. Computers are much better than humans at identifying patterns, trends, and insights within large, complex datasets. By analyzing the aggregated data in a single data lake, generative AI could facilitate the creation of comprehensive, automatically updated dashboards that provide a consolidated view of all portfolio companies.
  3. Dashboard implementation: You don’t need AI for this. Many firms are already using tools like Tableau and Power BI to create extremely useful dashboards. Remember, AI is not a cure-all, and it’s frequently not required to get real business insights. But if you can interact with your data through gen AI-powered tools like chatbots, you will be able to garner much greater insights without relying on report developers to create one-off reports or analyses.

Keeping that sponsor hat on, how can PE firms support their management teams in implementing automation/ML/AI assets? As we know, these portco management teams are under immense pressure to perform – how do they wedge a digital evolution into the execution of their value creation plans?

This pretty much aligns with the kind of support PE firms are already providing their portcos: strategic guidance, resources for upskilling (e.g., an in-house resource at the PE firm who understands the technology and can guide portcos would be invaluable), and of course investment in the necessary infrastructure. Think of the competitive advantage you could gain by integrating this digital evolution into the core of your value creation plans rather than viewing it as just another task. This means setting clear objectives, prioritizing initiatives that offer the most significant impact, and fostering a culture of continuous innovation. By aligning this transformation with the portcos’ strategic goals, PE firms could drive additional value.

Think of automation and data maturity as a kind of IP that adds value to the companies you’ve invested in.

Can you walk us through what you do, personally, when you start advising a company?

Great question. And the answer is kind of painful. I feel like when I show up to a company I’m probably tagged as a cross between an IRS auditor and one of the two Bobs from Office Space. 

You can’t lead with the technology. Like I said before, you have to understand where you are before you can map where you’re headed.

I joke that my onboarding is similar to an ISO 9000 audit. You don't do it to get the certificate; you do it because of what it measures and what you learn. That's as important today as it was 20 or 30 years ago. We look at leadership and documentation of processes and procedures to see where they are interlinked and to identify any roadblocks and data breakage points. We evaluate the company’s data maturity to see where they stand on being data-driven, and we assess their change-management approach and both internal and external relationships.

Sometimes a fresh set of eyes and a procedural approach to documenting what the company does every day is the best way to get quick insights into areas for improvement. There may be plenty of things they’re doing very well, and we want to be sure to identify these also. 

By considering all this, we understand the road ahead of us and then we can start working on the fun stuff!

In your book “Deep Finance: Corporate Finance in the Information Age” you say that a critical part of an automation journey is consolidating all sales, operational, and financial data into a single location. Data democratization sounds great in theory, but how does that work in practice? 

That’s the other side of the rainbow, right? That’s what we’re aspiring to do here. 

I’m always careful not to shill for any particular software, but there are a lot of tools out there today that even without AI can make this a reality. But again, you’ve got to do the foundational work to get everything ready before you can implement them.

So quickly, let’s recap the nontechnical steps to prepare for this. Let’s start with a typical small-to-medium-sized business back-office tech stack: you have your G/L, maybe a CRM, bolted-on payables and receivables apps, marketing tools, maybe a project management and/or timekeeping system, and increasingly some kind of business intelligence tool. Most of the time when I step into a business, these tools aren’t integrated. They are each their own silo of data. So, for example, maybe you have a billing contact that you’ve collected in your CRM tool, but a different billing contact may be entered in your accounting software. Or maybe a name is fat-fingered in your project management system so it doesn’t match what’s in the other systems. There’s typically not a single source of truth identified for the various bits of information you’re tracking.

Before you even think about aggregating your data, you have to catalog it. All of it. It can be painful work, but once it’s identified, you can aggregate it. Then you can report on it. And everyone is singing from the same sheet of music. 
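The catalog Glenn describes can be as simple as a structured list: every field the business tracks, the systems that hold it, and the designated source of truth. This hypothetical sketch (all system and field names invented) also flags the gaps that make "singing from the same sheet of music" impossible.

```python
# A minimal data-catalog sketch: each field the business tracks, the systems
# holding it, and the designated source of truth (names are illustrative).
catalog = [
    {"field": "billing_contact", "systems": ["CRM", "Accounting"],
     "source_of_truth": "CRM"},
    {"field": "project_hours", "systems": ["Timekeeping"],
     "source_of_truth": "Timekeeping"},
    {"field": "customer_name", "systems": ["CRM", "Accounting", "ProjectMgmt"],
     "source_of_truth": None},  # lives in three systems, no owner yet
]

def catalog_gaps(entries):
    """Flag fields held in multiple systems with no source of truth, or whose
    designated source isn't actually one of the systems holding the field."""
    gaps = []
    for e in entries:
        if e["source_of_truth"] is None and len(e["systems"]) > 1:
            gaps.append((e["field"], "no source of truth designated"))
        elif e["source_of_truth"] and e["source_of_truth"] not in e["systems"]:
            gaps.append((e["field"], "source of truth not among holding systems"))
    return gaps
```

Whether the catalog lives in a spreadsheet, a wiki, or a proper data-governance tool matters far less than the discipline of maintaining it before any aggregation work begins.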

I don’t want to veer into the deep waters of defining the different data storage schemes (data lakes, data warehouses, etc.). But your data strategy is important here. If you truly want to push data-driven decision-making throughout the company and you want to empower your employees, you need to figure out how to appropriately share data, while ensuring sensitive information isn’t exposed to everyone in the company. 

For example, a customer service tech shouldn’t have access to salary information, but they should have full access to the metrics that drive their department. By sharing this data widely, you shift from a top-down process approach to more of a Kaizen model. Giving front-line employees access to data that is meaningful to them empowers them to identify issues and provide insights that may not be available to management.
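The "share widely, but protect sensitive fields" idea usually ends up as some form of role-based filtering in the reporting layer. A toy sketch of that pattern, with invented roles and metric names:

```python
# Toy role-based view: each role sees only the metric fields it is entitled
# to. Roles, metrics, and values below are illustrative assumptions.
ROLE_VIEWS = {
    "customer_service": {"tickets_open", "avg_response_minutes", "csat_score"},
    "finance": {"tickets_open", "avg_response_minutes", "csat_score",
                "dept_salary_total", "cost_per_ticket"},
}

def metrics_for(role, metrics):
    """Return only the metric fields the given role is allowed to see."""
    allowed = ROLE_VIEWS.get(role, set())
    return {name: value for name, value in metrics.items() if name in allowed}

dept_metrics = {
    "tickets_open": 42,
    "avg_response_minutes": 11.5,
    "csat_score": 4.6,
    "dept_salary_total": 380_000,  # sensitive: finance-only
}
```

Most BI platforms offer row- and column-level security that implements this same idea declaratively; the sketch just shows the shape of the policy.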

This builds on that old business adage that you have to “inspect what you expect.” By giving employees the ability to inspect or investigate, you give them more ways to drive improvement. So it’s not just about the data – it’s about the culture.

One of my favorite parts of “Deep Finance” was the car wash business you co-founded, automated, and ultimately sold. You ended up with so much time on your hands as CFO that you were able to read the Wall Street Journal daily. How did that happen?

I’m laughing because I’ve told that story so many times. I guess I first subscribed to WSJ when I was in business school, and I had never felt more informed than when I was reading pretty much the whole paper every day. But then when I got into the business world, I was lucky to make it through the “What’s News” section before getting pulled into some kind of firefighting drill. 

That car wash company that we eventually sold in a Leonard Green-backed roll-up is maybe what I will remember as my greatest success in the business world. We took this very low-tech mom-and-pop business and put in some genuinely amazing real-time reporting by building an online tool that aggregated data from these extremely low-tech POS systems, which not only tracked sales but helped us better monitor chemical levels, staffing, and real-time margins.

We had that company completely dialed in and were able to repurpose our time to start building these incredible forecasting models that factored in our internal data plus all these external data points ranging from traffic counts to weather and even macroeconomic conditions like gas prices and average home prices. Ultimately, we had this wildly variable business that was driven by many factors out of our control, but we got so good at forecasting that I’m sure our debt providers thought we were some kind of wizards.

And yeah … when I wasn’t spending all day creating manual reports I managed to find some time to read the paper now and then!

How far away are CFOs and Finance team members from having AI serve as a “junior analyst”? How does technology like AI evolve the future of Finance and the back-office? 

The biggest hurdles right now are data privacy/security, compliance, and trust in these gen AI-powered systems. If I’m the CFO signing off on financials, I’m not in a place where I’m just going to hand that off to my new robot accountant and trust that it’s done the job correctly.

Let’s talk about each of those individually:

Regarding data privacy and security, I’m telling everyone now: do not upload proprietary information directly into the off-the-shelf LLM-powered chatbots! There are tools out there that give a higher level of data security and ensure your data isn’t used for training future models. But just like you wouldn’t upload your private company financial statements to another website where you’re not sure of their security measures or how your data will be used, you shouldn’t simply trust OpenAI or Google or any of the other players to keep that information private. (Just read the disclaimer from each when you sign up for an account to see what I mean.) So, in a lot of ways, we’re still in proof of concept for these tools, but the potential is clear.

A key issue with large-language-model-powered chatbots right now is that the outputs of these models vary widely. Finance processes have to be completely definable, understandable, and scalable, and that kind of variability breaks all three. Further, these models are known to “hallucinate” from time to time – outputting made-up data as if it were factual, without understanding the difference. LLMs have some pretty interesting emergent capabilities for things like solving math problems, but that is not an inherent function of these tools, and they frankly aren’t very good at it. (This excludes tools like ChatGPT’s Advanced Data Analysis, which actually writes Python code under the hood to provide answers. Python actually is good at math.)

Finally, most small-to-medium-sized businesses (where I’ve spent the bulk of my career) don’t have the internal resources or budget to do a lot of internal development to implement these technologies. As the power of this technology increases and its accessibility expands, I do see generative AI being built into existing software platforms, and that’s probably going to be the most accessible way for most companies to adopt it. I would expect to see generative AI increasingly available in a meaningful way in tools like our ERPs, CRMs, and various reporting tools. And the added benefit is that it will have been built out and vetted by these massive companies – they’ll be the ones taking on the development costs and bringing it to you as a nice, usable package wrapped up with a bow.

But if you want to take full advantage of this when it’s available, now is the time to get your data house in order!

Glenn, this has been terrific, thank you so much for the time and wisdom you’ve shared today!

 

Glenn Hopper currently serves as the CFO of Eventus Advisory Group, advising on the implementation of financial technology and process improvements, as well as fractional CFO services. He is also a lecturer and author focused on the intersection of artificial intelligence and corporate finance. 

Glenn is the author of “Deep Finance: Corporate Finance in the Information Age.”

He can be reached at https://glenn-hopper.com/