Advanced Computing in the Age of AI | Thursday, March 28, 2024

The Capital Markets Industry’s ‘Unsustainable Cost Model’: A Technology Strategy 

Greed, fear and regulations have historically driven the capital markets industry. But with today’s rising regulatory regime and its attendant costs, the driving factors have boiled down to fear and regulations. Greed has been swamped by compliance requirements that, at one tier one bank, cost $1.9 billion last year.

This is the grim picture drawn by Tabb Group’s Terry Roche, principal and head of fintech research, at this week’s HPC on Wall Street Conference in New York. A path out of the current morass exists, he told the audience, even if it’s a challenging one.

“The fundamental truth of capital markets is that the cost model is now unsustainable,” Roche said, “largely due to cost of compliance with regulations.”

The capital markets industry is beset by a three-pronged perfect storm: a Gordian knot of data management complexity required to comply with transactional tracking and reporting regulations; the elimination of proprietary trading as a source of profits; and an aging legacy IT infrastructure that impedes the adoption of new, advanced scale technologies that can lower operational costs and perform advanced analytics in the pursuit of alpha. Money that could be invested in those new technologies is consumed by compliance, he said, quoting a managing director at a top 10 investment bank:

"It's a digital business model we need to embrace for the firm to have an elastic, agile infrastructure. We do lots of real time solutions that have high mission criticality and uptime needs. We run the risk as an industry to be locked out of the innovation happening in the public cloud."

But IT infrastructure upgrades are a necessity, Roche said.

“You think about CCAR regulations, the Comprehensive Capital Analysis and Review by the Fed: for every risk position in the firm, you have to tell the Fed where your data’s coming from, every step of the way,” he said. This entails a range of structured (prices, trades, confirms, cancels) and unstructured data (all e-communications traffic, including chat logs, emails, voice transcripts) all along the transaction stream.

Tabb Group's Terry Roche

“It all needs to be captured, not necessarily knowing what anyone’s going to do with it yet,” Roche said, “but you have to capture that information, and I think some of the leading firms are combining that data with structured data…, understanding the full panoply of what happens in the transaction, which is causing more data, more analytics, more compute, more cost.”
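The lineage requirement Roche describes, tracing every risk position back through each system it touched, can be sketched as a linked chain of provenance records. This is a purely illustrative toy (the `LineageRecord` class, the step names and the system names are hypothetical, not anything CCAR actually prescribes):

```python
from dataclasses import dataclass

@dataclass
class LineageRecord:
    """One hop in a transaction's life: what happened, and where the data came from."""
    step: str                                   # e.g. "price", "trade", "confirm"
    source_system: str                          # system of record for this hop
    parent: "LineageRecord | None" = None       # upstream hop, if any

    def provenance(self) -> list:
        """Walk back to the origin, listing every system the data passed through."""
        chain, node = [], self
        while node is not None:
            chain.append(f"{node.step}@{node.source_system}")
            node = node.parent
        return list(reversed(chain))

# A trade whose price came from a market-data feed, confirmed downstream
price = LineageRecord("price", "marketdata-feed")
trade = LineageRecord("trade", "oms", parent=price)
confirm = LineageRecord("confirm", "backoffice", parent=trade)

print(confirm.provenance())
# ['price@marketdata-feed', 'trade@oms', 'confirm@backoffice']
```

Answering the regulator's "where did this number come from?" then reduces to walking the chain, which is the kind of end-to-end capture Roche says firms must now pay for.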

It’s also causing more complexity and IT dysfunctionality that undermines profitability. In the example of the tier one bank cited earlier, Roche said he shared a dispirited afternoon with the heads of anti-money laundering and surveillance, who told him of the annual $1.9 billion in compliance costs. “To spend that much money and not return a single penny of shareholder value – it’s not sustainable,” he said.

He also cited the COO of a major London bank who grew so frustrated with his IT division that he vowed to fire the staff and outsource the IT operation.

“‘I can’t deal with this anymore,’” the COO told Roche. “‘I’m getting excuse after excuse. We’re going to outsource the entire thing… Every time I have an issue about wanting to change the model to reduce the costs, all I get is blocking. I’m not going to run through a brick wall anymore; I’m going to get rid of all of it.’”

With the explosion in regulatory costs after the 2008 Wall Street crash have come infrastructure cost reduction measures. This has meant staff reductions – or, in Roche’s words, “fire anyone they can fire.”

“We’ve seen significant offshoring of staff to lower wages,” he said, to handle such tasks as middle- and back-office processing, data processing and data cleansing. “However, that strategy is under duress because those lower wage locations have undergone inflationary growth in wages. And there is a massive turnover problem.”

Roche said firms have automated as much of their operations as they can “and continue to do so, and while those moves have realized significant savings, structurally it’s not enough as regulation keeps running apace.”

Beckoning capital markets organizations are new cloud and big data analytics technologies that have delivered great benefit to other verticals. But the industry remains technology hidebound, though not of its own choosing. In part, it’s a penalty the industry is paying for being a first mover toward digitization in the mid-1990s.

“There’s so much infrastructure architecture and enterprise services that are integrated together that it’s very difficult for capital markets to be nimble at this point in time and embrace change and embrace new technologies,” he said. “Lots of acquisitions have occurred, there’s a lot of integration that still needs to be done.”

In addition, there are significant performance obligations that raise the risk of disruptive change. “They can’t suffer any type of downtime. There’s obligations around privacy of data, for personal information and account information, that is required by regulation; the commercial and reputational loss would be devastating when firms are breached or lose some information. It’s a tremendous challenge, insofar as the regulatory frameworks that have developed over the last 10 years have really sucked up all of the investment capital for capital markets.”

In the face of these problems the choice presents itself: Change or die, no matter how hard change may be.

“There is a conundrum here in recognizing that all of these challenges are real and legitimate,” Roche said. “So new models to achieve differentiation – in fact I’d go so far as to say, to stay in business – need to be embraced.”

Roche outlined a five-part path forward centered on a strategy of effective aggregation of data and agile application development that leverages the computational, storage and analytical capabilities offered by hybrid cloud technology strategies. The first step is adoption of hybrid cloud (a combination of private and public cloud) technologies that enable advanced scale computing.

“They’ll need to embrace and I believe deliver to their clients third generation platform services that’s represented as a combination of cloud, mobile, social, big data, data analytics and microservices,” he said.

A second step is industry-wide collaboration around what have become increasingly commoditized utility services, which no longer provide market differentiation to capital markets organizations.

“There are so many commoditized services that are in the industry that occurred because of the first mover disadvantage,” Roche said of items like reference data management, legal identifier management and fixed connectivity. “Every organization created these commoditized services on their own, and now they’re no longer competitive.”

Rather than further investment in these functions, “we’re seeing more creation and more exploration around consuming utility services.”

A third step is the adoption of Blockchain, which – though overhyped – is an important data lineage tool that helps overcome siloed data, said Roche.

“Blockchain is just one piece of the puzzle, not the entire puzzle,” he said. “By definition it is a next-generation database architecture, a beautiful data architecture that enables a consensus or validation mechanism that will let you know the source of the truth, you don’t have to go in and check your data.”

But a fourth step, to cut major post trading processing costs, is establishment of Blockchain industry standards to support interoperability.

“The post-trade world today does checks: ‘Are these the right securities?; the right legal entity information?; does my ledger agree with your ledger and agree with the six other ledgers I have in my organization?’” Roche said. “That’s because the data is all over the organization. Blockchain can solve this problem, but only if the industry comes together to standardize its understanding of what’s being traded…and establish community services, utility services that moves all of the post-trading data validation to happen pre-trade.”
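Roche’s reconciliation argument can be illustrated with a toy hash chain (a deliberately minimal sketch, not any production blockchain design): if every party appends the same trade records to the same chain, agreement on the latest block hash implies agreement on the entire history, so the field-by-field ledger checks he describes become unnecessary.

```python
import hashlib
import json

def append_block(chain: list, record: dict) -> None:
    """Append a trade record, chaining it to the previous block's hash."""
    prev_hash = chain[-1]["hash"] if chain else "genesis"
    payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    chain.append({"record": record, "prev": prev_hash,
                  "hash": hashlib.sha256(payload.encode()).hexdigest()})

def tip(chain: list) -> str:
    """The latest block hash; it commits to the entire history behind it."""
    return chain[-1]["hash"]

trades = [{"sec": "XYZ", "qty": 100}, {"sec": "ABC", "qty": -50}]

ledger_a, ledger_b = [], []
for t in trades:
    append_block(ledger_a, t)
    append_block(ledger_b, t)

# Instead of reconciling every field of every ledger, compare one hash:
print(tip(ledger_a) == tip(ledger_b))   # True: the histories are identical

# A single altered record breaks agreement immediately
ledger_c = []
append_block(ledger_c, {"sec": "XYZ", "qty": 101})
append_block(ledger_c, trades[1])
print(tip(ledger_a) == tip(ledger_c))   # False
```

The standardization step Roche calls for matters here: the hashes only agree if every firm serializes “what’s being traded” identically, which is exactly what industry-wide Blockchain standards would have to pin down.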

The final step, he said, is adoption of machine learning to leverage insight from the vast amounts of data captured around customer transactions.

“There’s a lot of data that’s out there and the…ability to manage all that information, to wrangle all that data, is a challenge,” he said. “We’re seeing machine learning techniques starting to be leveraged around data management, around compliance for things like natural language processing to understand the voice traffic, to do correlations between structured and unstructured data, and I think that in some instances, although they are closely guarded secrets, we’re seeing machine learning leveraged to create data analytics for alpha generation.”
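The correlation Roche mentions, joining unstructured communications against structured trade records, can be shown in miniature. Real surveillance systems use far richer NLP; this sketch (the watchlist terms, time window and data are all hypothetical) just shows the shape of the join: flag chat messages containing risk terms that land close in time to a trade.

```python
from datetime import datetime, timedelta

# Hypothetical watchlist of phrases a compliance team might flag
RISK_TERMS = {"guarantee", "off the record", "delete this"}

def flag_messages(chats, trades, window=timedelta(minutes=5)):
    """Return (message, trade) pairs where a risky message occurred
    within `window` of a trade -- a toy structured/unstructured join."""
    hits = []
    for msg_ts, text in chats:
        if any(term in text.lower() for term in RISK_TERMS):
            for trade_ts, trade in trades:
                if abs(msg_ts - trade_ts) <= window:
                    hits.append((text, trade))
    return hits

chats = [
    (datetime(2016, 4, 5, 9, 31), "Delete this after reading"),
    (datetime(2016, 4, 5, 14, 0), "Lunch at noon?"),
]
trades = [
    (datetime(2016, 4, 5, 9, 30), {"sec": "XYZ", "qty": 100}),
]

print(flag_messages(chats, trades))
# [('Delete this after reading', {'sec': 'XYZ', 'qty': 100})]
```

In production the keyword test would be replaced by trained NLP models over voice transcripts and chat, but the structured-to-unstructured correlation step is the same.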

Roche said these steps are tightly integrated.

“There are so many misunderstandings about this industry because there’s frankly a lack of understanding of the nature of enterprise services and all of the systems that are weaved together in these institutions, they need to work together as a symphony, and it’s very difficult to make changes without making significant dislocations to other services,” he said. “But it’s an absolute requirement that this industry needs to get its arms around, to make these changes in order to survive, and to offer far more dynamic and compelling services to the investment community.”

EnterpriseAI