I was at an event earlier this week, and a colleague asked me what kinds of data challenges I encountered while working on M&As. Later I read about Silicon Valley Bank’s (SVB) failure and had flashbacks to my experience in an FDIC acquisition. That warranted a few thoughts and stories for the data hitchhiker blog, because some of the interesting things that happen are common across industries, while others are pretty unique to banking. I won’t cover everything that goes into data during M&A, but I’ll give you an idea of what happens.
M&A in Banking
Banks do M&A for several reasons: growth, market share expansion, improving profitability, increasing efficiency, and regulatory needs. I spent about 20 years in banking, working on many data-centric projects as a CTO. In that period, I mainly worked on horizontal mergers with banks that were 5-20% of the size of the bank I worked for. I never experienced a merger of equals (MOE), although plenty of the banks we acquired had. It’s common in the landscape, and many banks approaching a size such as $10-15 billion or $25 billion will think about selling rather than taking on the risk of trying to compete at a larger size. Regulatory requirements change as a bank grows, or at least the pressure and scrutiny around them increase, and some banks decide they can’t do it. Most of the time, it’s a traditional business sale, but when a bank fails, as in the case of SVB, the government takes over, and there’s a fire sale on the debt. As I write this, I imagine that the bank that bid on that debt is working this weekend to catalog and analyze what it’s got. Regardless, bank mergers are just part of the landscape, and beyond the business reasons, the technology strategy and underlying data strategy can be complicated.
The Traditional M&A
From a technologist’s perspective, a traditional bank M&A has three main milestones that shape the technology plan: announcement, legal close, and system integration. System integration is essentially a data migration event. Once a deal is announced, teams start working to inventory all systems and data repositories. From there, we would do system gap assessments and then build routes for where data would live. The next several months are an exercise in documenting and creating the plans for a massive migration event. While we would migrate solutions incrementally, nothing went live until all of the acquired bank’s data had been migrated into the surviving systems. This included banking data like deposits and loans, channel data like online banking logins and passwords, and system information like email accounts, network accounts, and more. Banks have a LOT of data. These days they’re data entities more than almost anything else. As the size of a bank increases, so does the number of data repositories you have to plan for. This type of model would be similar in any industry.
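To make the inventory-and-routing step concrete, here is a minimal sketch of the kind of system-to-target mapping that comes out of the gap assessment. All system names are invented for illustration; the point is simply that every inventoried source system needs a documented route before the migration event.

```python
# Hypothetical routing map built during planning: (source system, data
# category, surviving target system). Names are illustrative only.
SYSTEM_ROUTES = [
    ("AcquiredCore",      "deposits",         "SurvivingCore"),
    ("AcquiredCore",      "loans",            "SurvivingCore"),
    ("AcquiredOLB",       "online_banking",   "SurvivingOLB"),
    ("AcquiredDirectory", "email_accounts",   "SurvivingDirectory"),
    ("AcquiredDirectory", "network_accounts", "SurvivingDirectory"),
]

def unrouted_systems(inventory, routes):
    """Return inventoried source systems with no migration route yet."""
    routed = {src for src, _, _ in routes}
    return sorted(set(inventory) - routed)

# The inventory phase surfaces systems; planning must route every one of them.
inventory = ["AcquiredCore", "AcquiredOLB", "AcquiredDirectory", "AcquiredCRM"]
print(unrouted_systems(inventory, SYSTEM_ROUTES))  # ['AcquiredCRM']
```

A gap report like this is what turns "we found another repository" into a planning task instead of a day-one surprise.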
I have been a part of two main strategies for the data and systems. The first was to bring everything over into the bank as is: if the converting bank kept its loan data in a particular system, we would migrate that whole system over. That has proven difficult to manage and an IT nightmare for a bank that likes to grow through M&A. The second strategy was “no converting bank systems survive, only data.” This was a data strategy rather than an application strategy. It was much more effective and clean, but more complicated from a migration perspective: not only do you migrate the data, you must also carry metadata that tells you whether a record was acquired, its quality level, etc. The key in a traditional M&A is that you have time to plan and manage.
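Under the "only data survives" strategy, the provenance metadata can be as simple as a few fields carried alongside each migrated record. This is a hedged sketch, not how any particular bank implemented it; the field names and quality labels are assumptions.

```python
from dataclasses import dataclass

@dataclass
class MigratedRecord:
    """A business record plus the provenance metadata the post describes."""
    record_id: str
    payload: dict        # the business data itself, e.g., a loan record
    source_bank: str     # which acquisition this record came from
    source_system: str   # the converting-bank system it was extracted from
    quality: str         # e.g., "verified", "unverified", "known-issues"

loan = MigratedRecord(
    record_id="LN-001",
    payload={"balance": 250_000, "rate": 6.25},
    source_bank="AcquiredBank",        # illustrative name
    source_system="AcquiredLoanSystem",
    quality="unverified",
)

# Downstream teams can then filter on provenance, e.g., flag acquired,
# not-yet-verified data for review after conversion.
needs_review = loan.source_bank != "HomeBank" and loan.quality != "verified"
print(needs_review)  # True
```

The design choice here is that provenance lives with the data, not with the retired application, which is exactly what lets the converting bank's systems be decommissioned.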
The FDIC Fire Sale M&A
While not all FDIC acquisitions result in the sale of the failed bank to another bank, many do. The one I was a part of started on a Thursday afternoon, when our executive team asked me to clear my Friday and weekend. Banks can be pretty secretive about what’s happening in a merger, so I assumed we would announce something and they wanted senior leaders prepared. I found out Friday afternoon at 5PM that the FDIC had closed a bank about 90 minutes from our main office. The CEO of the bank we acquired was under investigation. The bank I worked for had bought the debt and taken on the deposits, and we would be the bank’s owner starting Monday.
I arrived at the bank with a crew of IT people on Saturday morning. We had not been allowed to call or talk to anyone at the bank beforehand, so we brought several different skill sets. The halls were lined with armed sheriffs and federal agents. No one at the bank had been allowed to leave until they were debriefed and the agents had what they needed. It is an impressive process of precision and speed. At this point, we realized this would be different from the other M&As we had experienced. The agents informed us that all the IT staff had quit months before.
As you can imagine, as a technologist, hearing that you have no one who speaks your language, not to mention the incredibly high pressure, was concerning. We did a lot of analysis and planning that day, but perhaps the most interesting part for this post is what we had to do with data. The finance and accounting teams called me and said they needed reports and data from the banking core system. They had days to make decisions that, in a normal M&A, we would have weeks for. Every team in the company wanted data, and we had to figure out how to get it to them.
After running around, we found one person at the bank who knew a password to access the core. From there, my friend Jamie and I combed through data structures, COLD reports, and repositories. We pulled from our banking knowledge to rapidly create a data mart. Making it more complicated, we needed time to connect our network, so we didn’t have our everyday toolkit. We had to build a data mart with whatever we could find at the bank.
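Building a data mart "with whatever we could find" often comes down to parsing fixed-width core extracts or COLD report text into something queryable. As an illustrative sketch only, using nothing beyond a standard library, the pattern looks like this; the record layout below is invented, not any real core system's format.

```python
import sqlite3

# Invented fixed-width layout: 4-char account number, 20-char name,
# 10-digit balance in cents. Real COLD/core extracts vary widely.
RAW_EXTRACT = """\
0001JOHN DOE            0001250000
0002ACME LLC            0009875000
"""

def parse_line(line):
    """Split one fixed-width line into (account, name, balance_dollars)."""
    return (line[0:4], line[4:24].strip(), int(line[24:34]) / 100)

# Load into SQLite so finance/accounting teams can run ad-hoc queries.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE deposits (acct TEXT, name TEXT, balance REAL)")
conn.executemany(
    "INSERT INTO deposits VALUES (?, ?, ?)",
    (parse_line(l) for l in RAW_EXTRACT.splitlines()),
)

total = conn.execute("SELECT SUM(balance) FROM deposits").fetchone()[0]
print(total)  # 111250.0
```

The value of even a crude mart like this is that every team asks SQL questions against one copy of the data instead of asking one exhausted person to re-run extracts.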
All in all, it took us about 3 days of working non-stop. Not bad for not being able to access a secure cloud, get to our tooling, or sleep. It helped, but I’ll never forget one of the leaders telling me, “man, we really need this day one when we asked.” I was exhausted, I felt like we had pulled together a miracle, and no, we didn’t hit it out of the park. This later led me to rethink our architecture so it could be done differently (I’ll do another post on modern data architectures for M&A). They needed it day 1 because there are several decisions a bank can only make in the first week and the first month following an FDIC acquisition, so speed is key.
That is just one story about data during an FDIC acquisition. I doubt SVB is dealing with exactly the same thing, but there are a number of challenges. If any part of that story interests you and you want to know more, you can reach out to me!