
An Insider’s Guide to Building a Cloud-based FinServ Big Data Analytics Shared Service 

Even if you’ve never lived through a data analytics development project at a financial services organization, you know it’s a complicated undertaking. But you may not know just how multifaceted, how layered, how political – how hard – a project of this kind can be.

Technology is only half the challenge.

Julio Gomez, a veteran technology manager and consultant in financial services for more than 20 years, knows all this. Speaking at the Big Data Everywhere conference in Boston this week, Gomez, industry strategy partner at New York-based data analytics consultancy Knowledgent, delivered an A-to-Z portrait of a 12-month big data analytics shared service engagement at a major financial institution (which will remain anonymous).

It began in June 2014, when the firm faced a classic big data challenge: its data was siloed, its business units were siloed – everything was siloed – leaving the company unable to share data or gain organization-wide insight. Within the sequestered business units, what little “analytics” was done was actually data management: data preparation and wrestling with disparate data types. The company realized its data had tremendous potential value that lay dormant.

What was needed was a cross-functional, cross-organizational system reaching across all the business units and functional areas of the company, releasing the data through a centralized shared service that could perform advanced analytics and automate much of the byzantine data management complexity that was consuming so much time and so many resources.

Julio Gomez of Knowledgent

“It was a system that would be capable of accessing the data, ingesting the data, cleansing the data, understanding data lineage, and then making it available for consumption,” Gomez said. “If we could manage all that within the shared service, that would alleviate the business units of that burden and free them up to focus and brainstorm on the analytics side – that was our objective.”
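
Gomez did not describe the implementation, but the flow he outlines – access, ingest, cleanse, track lineage, publish for consumption – maps onto a conventional staged pipeline. The sketch below, in Python, is purely illustrative; the function names, fields and the toy dataset are assumptions, not Knowledgent’s actual design.

```python
# Illustrative only: a minimal staged pipeline mirroring the flow Gomez describes
# (access/ingest -> cleanse -> lineage -> consumption). All names are hypothetical.
import datetime
import hashlib

def ingest(source_name, rows):
    """Land raw records from a business-unit source into the shared service."""
    return {"source": source_name, "rows": rows,
            "landed_at": datetime.datetime.utcnow().isoformat()}

def cleanse(batch):
    """Apply basic standardization; real rules would be source-specific."""
    cleaned = [{k: (v.strip() if isinstance(v, str) else v) for k, v in r.items()}
               for r in batch["rows"]]
    return {**batch, "rows": cleaned}

def record_lineage(batch, step):
    """Append a lineage entry so consumers can trace where the data came from."""
    entry = {
        "step": step,
        "source": batch["source"],
        "checksum": hashlib.sha256(repr(batch["rows"]).encode()).hexdigest()[:12],
        "at": datetime.datetime.utcnow().isoformat(),
    }
    batch.setdefault("lineage", []).append(entry)
    return batch

def publish(batch, catalog):
    """Make cleansed, lineage-tagged data available for analytics consumers."""
    catalog[batch["source"]] = batch
    return catalog

# One hypothetical source flowing through the shared service.
catalog = {}
batch = ingest("fund_sales", [{"region": " NE ", "amount": 1200}])
batch = record_lineage(cleanse(batch), "cleanse")
publish(batch, catalog)
```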

Among the challenges Knowledgent faced was the client’s lack of data analytics wherewithal.

“This is still early days for big data,” Gomez said. “Imagine back in mid-2014 the task that was at hand. The common theme: folks wanted to leverage big data – whatever that was – and they wanted to do advanced analytics. So the group decided, based on this common theme, that it might be good, too, if they actually collaborated and did this together, as a shared service. That was the genesis of the project.”

It was decided that the initial Proof-of-Concept phase would focus on areas of the company where there was a relative lack of data management maturity, where reducing the pain and friction of data management in support of analytics would do the most good.

Here is where Knowledgent’s educational challenge began.

“It was really an attempt to introduce an Agile methodology and be iterative in the process of building out a new organization, as opposed to a waterfall,” Gomez said. “We wanted to work on being more iterative with the business units as well as paying very close attention to the internal and external requirements for the usage of this data.

“This is something we came across over and over: we’re going to run into problems with how we can use this data so we need to plan for that up front and ensure we’re facile in dealing with those concerns. That may sound good when you think about planning for it, but it’s not easy.”

Knowledgent was exacting in its planning discussions with the client.

“The first thing we had to do was get on the same page for our mission, to really articulate this,” Gomez said. “We worked out the mission and operating values, and that was actually a lot of work. When you have a blank canvas and you have an organization trying to do something transformative, it’s actually very difficult to do. So we worked hard to establish the mission statement to allow us to govern how we built out the service going forward. And then we set out to do the internal selling.”

This initial phase also included framing the interaction between the shared service and the business units. The business units, meanwhile, focused on the business side of the shared service’s mission, “the hypotheses that go into the analytic models, the modeling itself, the testing and the analytic insights that could then be harvested and productionalized, all of it under the umbrella of a proper governance organization and set of principles.”

Gomez said it is often a lack of thoroughness in scoping the mission and operational side of a system that leads to failure.

“I want to emphasize that in my experience this aspect of a project can get short shrift,” he said. “We’re often very quick to determine we already know what is needed. Too often, we take a few bits and pieces of information from the business and assume that is all we need to know to go forward. But only with thoroughly engaged and consistent dialogue can you really tease out the key information you need to make the service a success.”

That dialogue also helps secure buy-in from the business units as the project unfolds.

As the mission scope process neared completion, Knowledgent also tasked itself with building out a technology roadmap and design that incorporated the strategic decision – made by the client – that part of the system would reside in the cloud, impacting both the conceptual and logical architecture. “That was a challenge, but it was a requirement, so we had to deal with that,” said Gomez.

By this time, Knowledgent had developed an operating model and a technology architecture. Next step: put the concept into a POC system.

In their scoping work, Knowledgent realized that the use cases developed with the business units fit neatly into three broad categories: selling, modeling and risk management. Now Knowledgent was ready to identify the POC it would develop based on the value it would provide to a business unit and the feasibility and readiness of that organization.

In identifying use cases, Gomez said, “it’s incredibly important to go at least one level deeper, to figure out not just the description of the use case but also what is the business value that it creates, and then to articulate that very specifically. You also need to understand what it takes to execute against it, what are the categories of business data sources that you need to access.”
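
Going “one level deeper” amounts to capturing, for each candidate use case, its business value, the data sources it requires, and the owning unit’s readiness, then weighing them against one another. The grid below is a hedged sketch of that kind of selection exercise; the use cases, scores and weights are invented for illustration.

```python
# Hypothetical scoring of candidate use cases on value vs. feasibility/readiness,
# in the spirit of the selection step described in the talk. All values are made up.
use_cases = [
    {"name": "campaign-to-sales correlation", "category": "selling",
     "value": 8, "feasibility": 7, "readiness": 8,
     "data_sources": ["CRM", "campaign logs", "sales transactions"]},
    {"name": "portfolio risk aggregation", "category": "risk management",
     "value": 9, "feasibility": 4, "readiness": 3,
     "data_sources": ["positions", "market data", "reference data"]},
    {"name": "fund flow modeling", "category": "modeling",
     "value": 7, "feasibility": 6, "readiness": 5,
     "data_sources": ["fund accounting", "distributor feeds"]},
]

def score(uc, w_value=0.5, w_feas=0.3, w_ready=0.2):
    """Weighted score; the weights are arbitrary and would be set with the client."""
    return w_value * uc["value"] + w_feas * uc["feasibility"] + w_ready * uc["readiness"]

for uc in sorted(use_cases, key=score, reverse=True):
    print(f'{uc["name"]:35s} {uc["category"]:16s} score={score(uc):.1f}')
```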

For the first POC, the mutual fund distribution business unit was selected. “All their analytics people were working on managing the data. They wanted to see the impact of some of their campaigns on sales and draw that correlation, and we aspired to show them that.”

A lot was at stake. The POC would – or would not – deliver proofs of value, demonstrating not only the technology but also “our ability to work as an organization, to test our processes and to test our ability to work with a business unit.” On the technical side, a key goal was to expand the types of data sets that could be rationalized, managed and analyzed.

Knowledgent set out to build the big data environment – not just the architectural layers “but also what was needed to make it all work together,” connecting on-premises capabilities with cloud capabilities, a highly complex proposition. “We were dealing with connection, permission and firewall issues, PII. But we relentlessly pushed this thing forward.”
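
Gomez flags connection, permission, firewall and PII concerns as the friction points in bridging on-premises and cloud capabilities. One common pattern for the PII piece – not necessarily what this team did – is to pseudonymize identifying fields before records leave the on-premises environment, so downstream joins still work without exposing raw identifiers. A minimal sketch with hypothetical field names:

```python
# Illustrative pseudonymization before pushing records to a cloud environment.
# A common pattern for PII concerns; not a description of the actual build.
import hashlib
import hmac

PII_FIELDS = {"account_id", "ssn", "email"}      # hypothetical field list
SECRET_KEY = b"replace-with-a-managed-secret"    # would come from a secrets vault

def pseudonymize(record):
    """Replace PII fields with keyed hashes so records can still be joined downstream."""
    out = {}
    for field, value in record.items():
        if field in PII_FIELDS:
            digest = hmac.new(SECRET_KEY, str(value).encode(), hashlib.sha256)
            out[field] = digest.hexdigest()[:16]
        else:
            out[field] = value
    return out

record = {"account_id": "A-1029", "email": "x@example.com", "balance": 5400.0}
print(pseudonymize(record))
```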

Three months later, in August 2015 (nine months into the project), the POC was completed. “We basically have a well-defined, well-planned organization; we’ve got an established technical environment.”

And Knowledgent had a happy client. Gomez said the head of the mutual fund distribution business unit “was seeing things they’d never seen before in terms of insights, and they really wanted to go forward and take it to the next level, turning the POC into something they could productionalize.”

This added a higher level of complexity to Knowledgent’s work.

“We started running teams, bringing data in through the environment,” Gomez said. “It wasn’t smooth, we kept iterating, fixing things, strategizing, getting creative, really getting after it.”

Meanwhile, Knowledgent and the client began the laborious process of developing data governance policies and procedures for taking the shared system across multiple departments. “We had a complex institutional organization to deal with, everyone had a finger in that pie, as you can imagine, so there was a lot of coordinating that was required within different organizations within the enterprise. We forced a lot of issues to the table. We had to because we couldn’t go into production until those issues got solved. In many ways that was the biggest impediment to progress, mobilizing the people who had a voice but not a decision. Everyone had to be made happy in this particularly critical area.”

Knowledgent also moved on to Wave Two Execution of the project across four business units simultaneously. “We were dealing with prickly issues like multi-tenancy, it was really a challenging time for us.” In addition, some functions performed in “a fairly manual way” in the initial POC phase needed to be automated. “We were growing, we were grinding, we were smoothing.”
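
The “prickly” multi-tenancy issues Gomez mentions typically come down to isolating each business unit’s storage, compute and permissions while still sharing one platform. The sketch below illustrates one simple isolation approach – tenant-scoped storage prefixes plus an access check – with invented tenant and role names; it is not a description of the client’s environment.

```python
# Hypothetical tenant isolation by storage prefix and role-based access check,
# sketching one way multi-tenancy is handled in a shared analytics service.
TENANTS = {
    "mutual_fund_distribution": {"prefix": "mfd/", "allowed_roles": {"analyst", "data_steward"}},
    "institutional_sales":      {"prefix": "inst/", "allowed_roles": {"analyst"}},
}

def storage_path(tenant, dataset):
    """Every dataset lands under its owning tenant's prefix."""
    return TENANTS[tenant]["prefix"] + dataset

def can_read(tenant, role):
    """Only roles granted to a tenant may read that tenant's data."""
    return role in TENANTS[tenant]["allowed_roles"]

assert storage_path("mutual_fund_distribution", "campaign_sales") == "mfd/campaign_sales"
assert can_read("institutional_sales", "analyst")
assert not can_read("institutional_sales", "data_steward")
```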

This was the focus of the final four months of the project, and this phase included testing the technology environment, resolving governance issues, hardening the operating model, and building up the shared service staff.

By the beginning of this year, Knowledgent was done, having delivered a shared service that Gomez said handles some of the most complex data issues facing business units trying to achieve higher analytics capabilities.

“It’s a partnership with the business units,” Gomez said, “it’s a partnership with the architecture within the enterprise, it’s a partnership with all the governance stakeholders. It’s being able to develop that partnership with all these units that is the difference between having your big data initiative be a strategic asset for the enterprise versus being put on the shelf as an interesting experiment.”

EnterpriseAI