
Ops 2024: Point to Point – Designing the Future of Connected Data

Written by Don Henderson | September 27, 2024

 

Data modernization isn’t just the latest trend; it’s a critical and competitive element of success for wealth enterprises. Join BetaNXT’s CTO Don Henderson and a special guest for a lively look into the future of connected data, where modern technology systems power a faster, more transparent, and highly modular way for wealth enterprises to meet their clients’ needs.

Whether you’re an ops executive looking for new ideas on how to transform your organizational data architecture, a strategic business professional seeking ways to increase efficiency and reduce integration costs, or a risk management expert exploring opportunities to increase data security and compliance, this session offers inspiration for the next phase of data innovation.

This conversation was recorded live at SIFMA Ops 2024 in San Diego.

Read Don's Top Five Takeaways from SIFMA Ops 2024

Anand

I want to welcome everybody. Good afternoon. Thank you. As I said, my name is Anand Pandya. I'm the Global Head of Financial Services at Hakkoda. We're a modern data consultancy, and we partner with our customers to help them harness the cloud and modern technology and enable AI capabilities. We've been working very closely with BetaNXT and Snowflake on strategies and integrations to enable the next-gen partnership that was announced this morning. I'm looking forward to this conversation, and I'm looking forward to talking more with Don Henderson, the CTO at BetaNXT, and Chris Napoli, the Head of Wealth and Asset Management at Snowflake. Before we jump in, I want to let these guys introduce themselves, talk about their journeys, and talk about their respective organizations and how they're approaching data today. With that, Don?

Don

Thanks, Anand. I'm Don Henderson, Chief Technology Officer for BetaNXT. My team and I focus on the technology strategy and implementation across our BetaNXT products. BetaNXT recently celebrated its second anniversary. We're actually a combination of three business lines brought together to build technology and workflow solutions for the wealth management space. Our clients, many of whom are in this room, depend upon us to provide mission-critical services so they can focus on their growth strategies.

Chris

I'm Chris Napoli. I'm a long-suffering Jets fan, so this football is the closest I've ever been to a New York Super Bowl in my entire life. I just want to put that out there. By way of introduction, I lead wealth and asset management at Snowflake. There are quite a few things under the remit of that role, but one of the main focuses that excites me every day is thinking about transforming the financial services industry, which I've been part of for over 20 years. The opportunity to work with BetaNXT on the strategy for DataXChange has been fantastic, and I'm super excited about the announcement we had this morning.

Anand

Alright, so let's jump right in. Change, transformation, monetization, and innovation are all words that are used all the time, but they mean different things. You both sit at the forefront of the industry, whether it's direct client demand for you, Don, or the broader platform perspective for you, Chris. I'd love to get your thoughts on how innovation in the data space is enabling companies in the wealth management industry to stay competitive, grow their business, and thrive. The thematic shift has moved toward data and harnessing its capabilities. How are you hearing that from your customers, and how are you starting to engage with that change?

Don

Well, just to back up to yesterday, there were conversations in this room, and in other rooms afterwards, about the fact that data is problematic: it needs fixing, it needs more governance, it needs more trust. We essentially have an outdated data ecosystem. And today clients are building strategies around digital experiences. They're building strategies around analytics, all of which need trusted, curated data underneath them. And our solutions today are more about taking data, shifting it into one bucket, moving things around, and then piling more on top of it.

We're not fixing the root cause, which is the foundational layer of data that needs to get fixed in the industry. And that's the innovation we need to focus on. We need to focus on solving the problems that are actually creating the poor experiences. When you want to be a digital provider, you need to create the experience, and it needs that data underneath it to have that trust there. At BetaNXT we see this problem all the time. We see it in the data structure. We see it in the workflows. There's work that needs to get fixed, and it's evident in things like data reconciliation. We talk about that all the time in the industry. It's a nonstop conversation. And it's because the data is inaccurate. Every time we move that data from one bucket to another bucket, it's less accurate. We all know there's just a lot of work to sustain when you move data from one system to another system.

And second is the timeliness of data. There are many cases where we see the data, it's accurate, it's right, but it shows up too late in the client experience. So, we need to figure out how to stop depending on overnight processes and actually create an experience around real-time data. Because that real-time data is what's going to drive things like the digital experience that our clients are all striving for. At BetaNXT, I think one of the things that you're going to hear and see us focus on is operational innovation. That's where we're going to focus. We're going to create solutions that are more cost effective for the industry, more timely for the industry, and frankly, more friendly and flexible for the industry. That flexibility enables our clients to deliver the strategies that are unique to them and their clients.

Chris

And exactly everything that Don said is what I and my colleagues at Snowflake are able to see from where we sit, as basically the operating system for advanced analytics. Just by way of thinking about how we've gotten here, it's by no fault of anyone's own that the foundation we were just describing, which needs to be modernized, is in the shape it is today. I remember 20 years ago sitting in the cage at Morgan Stanley slamming “R” on a mainframe terminal… otherwise, one of my managers would yell at me that we now needed to go finance a position. That's just how the world worked at that time, and trying to reconcile a CUSIP into an ISIN into a SEDOL; those were things that just weren't intuitive prior to the financial crisis. And just as Don mentioned, when everything happened in 2008, that was the beginning of thinking about how important data governance is, how important data lineage is, understanding where these transformations are occurring, and ensuring that you actually have that logic so that you can validate and attest to that data.

So now, when I am working with organizations that are talking about their AI strategy, right? We had John from Fidelity speak earlier. The building blocks that go into training an LLM are the same foundational inputs that actually came out of that financial crisis. Like, what is the semantic layer? What is the data dictionary? How do we make sure that it's speaking our language, and everyone's language is going to be different. So, the realization now is that either each individual firm is going to need to be able to do all of that data governance work, or there are going to be ecosystem partners able to help their counterparties, in this case the wealth and asset management industry, do it on their behalf. And that will probably help us get to the actual AI strategy, which ultimately depends on a data strategy, so that you're not building everything over and over again. And those types of innovations that are possible with Snowflake will help accelerate us to the productivity gains that I think all of us are expecting, and that John spoke about a little bit earlier today with generative AI and that next wave.

Anand

So, you both touched on three critical components: there's completeness, there's accuracy, there's timeliness. I think people fall into the trap of just saying, we want more data. But that causes paralysis by analysis, that phrase you always hear. It's not about more data, it's about curating data. It's about being able to make data usable across those components. And with that in mind, we all likely saw your partnership announcement this morning. First of all, congratulations to both of you. But if you could take a little bit of time and talk about why the partnership is unique, what you're building towards together, and how it ultimately aligns to not more data, but better data being available for use.

Don

So, we spent a good bit of time trying to figure out the root cause and what we have to do to solve this problem. And it's not solved yet because it's a big issue and a big investment. Chris talked about a lot of different opportunities, but foundationally you have to get down into the nitty gritty and build small, intricate pieces to get this to work correctly. And as we looked at the larger delivery, and at cloud, there were so many different new technologies coming into play. I think one of the keys is that we have the opportunity now; 20 years ago we didn't have it, but now we do because the tools have become more modern. And we saw the partnership with Snowflake as an opportunity to help us, as we call it internally, turbocharge our solution and turbocharge our delivery, because what they've done complements what we're doing. And today we announced a platform called the DataXChange. The DataXChange is a new capability for BetaNXT that enables data to work seamlessly and frictionlessly across all of our product lines internally at BetaNXT. We will also take that to our customers, to help them transform their journeys, and we'll take it to partners as well.

We see four things that we're building as part of the DataXChange. First: stop sending data, right? We want people to start sharing data, sharing original sources of data. As I said before, every time you send a piece of data to somebody else, it's a new dataset. That's not the way to solve the data reconciliation problem we have out there. Right?

Second: stop doing the same stuff. As I visit clients, I literally see everybody doing the same thing. We're doing it at BetaNXT, and we're making a big investment at the foundational data layer to build proper data so that we can build curated and trusted datasets. You'll hear us talk about datasets a lot, right?

And third, the data experience has to change. We know that you just can't take raw data and expect a perfect experience. You need to create experiences that abstract the data you need as a user, as opposed to always having to pick and choose and work through the challenges of managing many different data sources. That means creating new curated, custom data views that enable not just the engineer writing software, but also the power user and the business user, to ingest them into the systems where they do their analytics.

And fourth, it has to be governed, right? Governance is a key piece for us, and that governance is a delivery model we have to have in place to work with companies like Snowflake. That's the piece I think brought us together: we saw that we're building this foundational layer of data structures and focusing on creating new governance strategies. We're doing some of the hard work, particularly in this vertical industry, so that when we partner with the Snowflake AI Data Cloud, which touches on three key technologies there, we get the advantages of what they're doing and what they're investing in, and their ease of use and route to market will complement all the things that we're doing so that the data transformation can move faster in the industry.

Learn more about our DataXChange platform

Chris

And it's exactly why I agree with you, Don. We learned that BetaNXT actually creates two terabytes of data per client per day, right? We're talking trillions of rows of data that exist within their ecosystem. The phrase “big data”? Well, you hit it, right? That's where it really is. So, what organizations started to discover is that maintaining that on-premises, with the software and servers and hardware and DBAs to do it, is a burden; it really lends itself to cloud infrastructure. And rather quickly: is everyone here in the audience familiar with Snowflake at all? Great. So, you know it goes cross-cloud; it's basically a normalization layer to communicate from one cloud provider to another. But what it really does is communicate from one Snowflake instance to another without ever really moving data; you're just viewing into someone else's universe. What that really allows is a seamless way to share data without actually moving it. But ultimately what Snowflake has also innovated upon is the fact that you can move code in the same fashion. So, there's no longer the need for, say, file transfer protocols that there used to be. It used to be, hey, if I wanted to take data and perform a process on it, like a data scientist who wants to upload a library and do some Python on it, you would move the data to another area or another piece of software or another provider.

Well, that's what Snowflake is looking to innovate, and what companies like BetaNXT have the ability to keep building upon: hey, we will send code to this data now. We don't actually even need to exchange it or move it into different software. So, what we're truly trying to do with Snowflake is connect the ecosystem, from the data providers that we're fortunate enough to have here and work with, to the service providers that we're here to assist, and really make sure nothing is moving and we're bringing code to the data. And I know Phil Sims may not have known a lot about cybersecurity, and understandably; unfortunately, I have the curse of knowing a little bit too much about it and the ways you limit a lot of that. As mentioned, it's not even just stopping the hops of moving data around; it's making sure you're bringing things into an encrypted, enriched environment. That's really what we saw BetaNXT thinking about, right? It allowed us to say, this is a way for the ecosystem to modernize, to really get through some of the challenges where, as a 21-year-old out of university working in the cage at Morgan, I thought, wow, some of those reconciliation challenges were just data problems, right? It was just that something didn't post in the right place. I had to learn omnibus accounting and broker-dealer accounting because, hey, someone didn't post a row in a column at the right period of time, right? And now that's really what the capabilities of this next wave of technology are: to be able to solve that on all sides of the transaction. So, that's part of why we were so excited about all of it.
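To make the "share, don't send" and "bring code to the data" ideas above concrete, here is a minimal sketch using the Snowpark Python API. It is illustrative only, not BetaNXT's or Snowflake's actual implementation; the account, database, schema, table, and share names are all hypothetical.

```python
# Illustrative sketch only: "share, don't send" with Snowflake secure data sharing,
# plus running code where the data lives. All identifiers below are hypothetical.
from snowflake.snowpark import Session

# Connect as the data provider (credentials and role are placeholders).
session = Session.builder.configs({
    "account": "<provider_account>",
    "user": "<user>",
    "password": "<password>",
    "role": "ACCOUNTADMIN",
    "warehouse": "ANALYTICS_WH",
}).create()

# Create a share that points at the original, governed table -- no copy is made.
session.sql("CREATE SHARE IF NOT EXISTS wealth_positions_share").collect()
session.sql("GRANT USAGE ON DATABASE curated_db TO SHARE wealth_positions_share").collect()
session.sql("GRANT USAGE ON SCHEMA curated_db.positions TO SHARE wealth_positions_share").collect()
session.sql(
    "GRANT SELECT ON TABLE curated_db.positions.client_positions "
    "TO SHARE wealth_positions_share"
).collect()

# Entitle a specific consumer account; it queries the provider's data in place
# instead of receiving a file extract.
session.sql("ALTER SHARE wealth_positions_share ADD ACCOUNTS = myorg.consumer_account").collect()

# "Bring code to the data": run the transformation where the data lives rather than
# exporting it to another environment first.
positions_by_cusip = session.table("curated_db.positions.client_positions").group_by("CUSIP").count()
positions_by_cusip.show()
```

Because the consumer queries the provider's tables in place, there is no second copy to reconcile, and revoking the share cuts off access immediately.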

Don

I think, just to add there, the pieces that you talked about around orchestration and automation, that's core. We have to get rid of the manual processes that you were doing back 20 years ago. Frankly, some of that stuff probably still exists; that still happens. We have to get to the point where everything is automated, everything has a governed process around it, and the data is delivered as a trusted dataset that our clients can depend upon.

Anand

So, you both touched on, again, a recurring theme there about trust and access. When you talk about it through a technical lens, if there are technologists in the room, you're thinking, “Oh yeah, great idea. But I have to go through our architecture patterns and review boards, and I can't do that, and I don't know how to do that.” If you're a Wells person: I ran Zephyr and IIS at Informa Financial Intelligence, and you were adamant the data couldn't leave a data center. You funded a whole different data center for us to use because you didn't want your data to go anywhere. And the business side suffers because it becomes: how am I supposed to use all this data if I can't get to it? To Chris's point, and to Don's point, making data accessible is ultimately the trick to drive that innovation. So, my question to you both is: how do you cross that schism or chasm that exists in the industry today, where the difficulty of accessing data and the resulting privacy, security, and compliance concerns manifest? How are you approaching and ultimately addressing that, both with the underlying Snowflake architecture that you already referenced, Chris, and by driving the DataXChange on top of it? How do you help bridge that divide and ultimately enable access to those curated sets with the trust and compliance built in?

Don

Well, I'll start with governance. We talked about that word. One thing that we're going to focus on is redefining what that word means. We're going to put fanatical in front of it, right? We're going to take governance and realize it's not just a process. It has to be part of an architecture. It has to be part of your workflows. It has to be a tool. It has to be a set of capabilities: able to define the lineage, able to trace, able to feed analytics, and able to easily show the bigger picture of the data itself. We have to figure out how to manage data in a fanatical way using governance. And that governance is an area with a huge opportunity for innovation.

And then, of course, you can't do anything just once. That's another issue: every time we do these data projects, we tend to build them once, get them done, and then we just hope it keeps working. We have to actually change the way we manage it afterwards. So, the change process becomes part of the governance process. As I move governance into my change process, I have to change how we build software, maintain software, and maintain services. The change process now has to include data. Data has to be a core part of everything; it can't just get built and left alone. Because otherwise what we're going to end up with is more datasets copied and pasted into many places and more data problems for ourselves.

Chris

And to tie it to GenAI as a theme, as was mentioned: we've all heard the idiom, garbage in, garbage out. Now you've seen it, right? You've seen what happens when a model is inferring an answer but doesn't have direct guidance as to where the data came from or how a data element may change if a different source is used. The building blocks I mentioned before came mostly out of the regulatory frameworks meant to make sure the financial crisis didn't happen that way again (I don't want to give my whole career story, so I'll skip part of it). The regulatory reporting that happened as a result put governance models in place to say: this is what a CUSIP needs to be, nine alphanumeric characters, and that's the only way we're accepting it. The technical schematics of it all were a defensive play, but now it's much more of a proactive play. If you want to get to this productivity, if you want to trust the data, you need technology to inform you that this dataset is green, or that this dataset did not have two levels of management approval, so it is yellow, and then determine whether that coloring allows it to be used for certain business processes. Anything going out to the street probably needs to be green.

But the thing is that everyone is doing that. I had the fortunate opportunity in my career to be head of data and analytics for a custodian bank, and governance tends to be viewed reactively, but it's now the building block. I truly feel that's a paradigm shift: data governance, the actual ability to trust what's coming from your service providers, to know it's gone through four eyes or some level of attestation and is statistically accurate, will quickly help us shorten the amount of time our processes and services take. It allows us to repurpose ourselves to do more challenging, or potentially even more value-added, tasks for our organizations. I think that's ultimately where, when you see an organization like yours able to do that for a dataset and be trusted, well, then it can go right into the model, and all of a sudden I can ask an LLM what my cash position is, right? And trust it's going to come back with the right answer.
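A rough sketch of the green/yellow idea Chris describes, using Snowflake object tags so a downstream process can check a dataset's approval status before using it. The tag, database, and table names are hypothetical, and this is one possible pattern rather than the specific governance model either firm uses; it assumes a Snowpark session like the one in the earlier sketch.

```python
# Hypothetical illustration: tag datasets with an approval status and gate usage on it.
session.sql("CREATE TAG IF NOT EXISTS governance_db.tags.approval_status").collect()

# A dataset that has cleared two levels of management approval is marked green...
session.sql(
    "ALTER TABLE curated_db.positions.client_positions "
    "SET TAG governance_db.tags.approval_status = 'green'"
).collect()

# ...while one still awaiting sign-off stays yellow.
session.sql(
    "ALTER TABLE staging_db.positions.client_positions_raw "
    "SET TAG governance_db.tags.approval_status = 'yellow'"
).collect()

# Before anything goes "out to the street," check the tag and require green.
status = session.sql(
    "SELECT SYSTEM$GET_TAG('governance_db.tags.approval_status', "
    "'curated_db.positions.client_positions', 'table')"
).collect()[0][0]
if status != "green":
    raise RuntimeError("Dataset has not cleared governance review; do not publish.")
```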

Anand

So, we work with customers in the wealth and asset management space across the globe, and it really comes down to: how do I use my data better? How do I get there? What is the strategy for moving from the way I've always done it, the way I've always consumed it, to actually leveraging new technologies? Don, you've couched the DataXChange as “Transformation as a Service,” which is kind of the umbrella for everything we've talked about and for enabling that. So, the final question to you both is really: what's next? And it's not the three-to-five-year answer, it's the three-to-five-month answer. What are the near-term innovations that you feel the DataXChange and the broader Snowflake platform can really trigger and enable for the people sitting in this room right now, wondering how they can do something differently going into 2025?

Don

Well, let me start with where we are right now. The DataXChange platform is currently in pilot, and we're working with key customers. We see that process of building, testing, and verifying as an important step in the build process for us, because we're going to work with our clients, and client feedback is what we're most interested in. We want that feedback to help us define the story for what we're going to continue to build going forward. So, Q4 for us is pilot: get it right, get it working. We'll officially launch the DataXChange platform in Q1 of 2025, and it'll have all of the production readiness we need in place so that anything we deliver has that trusted flavor associated with it. And in Q1 you'll start seeing us deliver new services and applications driven by the DataXChange.

That's our objective. DataXChange is a capability for BetaNXT, but we're going to continue to enhance the services that we deliver today and bring out new services that we can deliver in the future. And the areas that Chris was talking about are another interesting opportunity for us: our roadmap will be intermixed a bit with what Snowflake is doing. So, we're going to test everything and verify everything, and as we bring out new things like AI and LLMs, they'll be built on the data that exists inside the DataXChange. It's the ease of use that we're trying to promote here. It's the availability of data that now has more capabilities on it.

So, our objective ultimately is to create a different data experience so that the client experience everybody's looking for can be achieved. We need data that is trusted, that trusted dataset. Our ultimate objective is to take out the complexity that exists today in raw data, right? Raw data is hard to manage. The process of taking raw data to the point where it's curated and usable has many, many steps. So, our ability to transform raw data into client-driven views of the data, that's the journey we think we can start driving in 2025, so that our clients can focus on the data they need as opposed to maintaining software that constantly changes against raw data. Because as we build new systems and change our data structures, our clients have to keep up with that. Everybody needs to keep up with that. And that's what happens with APIs: as things change, they change, and change again. There isn't an abstraction layer in the industry here. And we think that getting rid of the concept of sending data, in favor of sharing data, will eliminate the need for those updates, so that you can have essentially an abstracted layer of the custom data that you need. Because we know you need your data in your format for your systems. That's ultimately the objective we're trying to achieve with the DataXChange: to make the experience around the data easier for our clients.

Chris

I would say the first and most important thing I pay attention to is probably the health of Aaron Rodgers' Achilles. As long as that stays strong and sturdy, I truly think this may not be the closest I get to a New York Super Bowl for some time. But joking aside, we are hand in hand on our product roadmap, and even more importantly, the audience that BetaNXT serves is not always the audience that Snowflake is fortunate enough to reach directly, right? Because it's a cloud data warehouse; we convert and compete with, say, Redshift, and Teradata, and Sybase, and, to be honest, mainframe modernizations like DB2. And it's really about how you get data into it to get to the insights that you all ultimately are looking to receive. So, the voice of the industry is going to come from individuals like myself and my peers through organizations like BetaNXT, so that we can hear the things you may want their platform to do, which in turn helps inform our product roadmap.

Likewise, our product roadmap, as mentioned, also helps inform where this can go. The timestamps that exist within Swift messaging and trading helped us realize the value of having time-series capabilities within Snowflake, so you can do joins of, hey, what happened this hour? Where were the most exceptions? What happened in this 15-minute window? That can provide insights into procedures that may need to be improved, in ways we just weren't able to with the infrastructure we used previously. And again, through no fault of anyone's own. So, as long as we as a community, as an industry, make sure that communication flows through all of us, we will be able to make sure that future-proofing the platform is ultimately what we can drive towards.

The beautiful part about Snowflake, or what has permitted this data liquidity that exists in the industry for those that are using or have been using Snowflake, is the fact that in wealth and asset management a good amount of the information is structured, and the third-party data providers are structured. The formatting of these messages can be transformed into analytical procedures. And that's the hardest part of where to actually bring AI. The first wave of LLMs are really good at taking PDFs and getting information from them; they're not very good at reading structured data tables to ultimately inform the answers that those in this audience are truly looking for. So, understanding where we should focus, and how we can help your day-to-day through our partners like BetaNXT, is ultimately what's going to permit us to continually develop and prioritize, so that we're really making sure we're solving the true pain points of the industry.
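As one example of the time-series capability Chris mentions, here is a hedged sketch of bucketing operational exceptions into 15-minute windows to see where they cluster. The database, table, and column names are hypothetical, and it assumes the same Snowpark session as the earlier sketches.

```python
# Illustrative only: count reconciliation/settlement exceptions per 15-minute window
# over the last day, using hypothetical table and column names.
exceptions_by_window = session.sql("""
    SELECT
        TIME_SLICE(event_ts, 15, 'MINUTE') AS window_start,
        COUNT(*)                           AS exception_count
    FROM ops_db.recon.settlement_exceptions
    WHERE event_ts >= DATEADD('day', -1, CURRENT_TIMESTAMP())
    GROUP BY window_start
    ORDER BY exception_count DESC
""")

# The busiest windows point at the procedures that may need improvement.
exceptions_by_window.show(10)
```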

Don

And just to reiterate, we're going to build solutions around the data; we're not going to build applications and then bring the data to them. And if we can do that, even though it's going to require us to rebuild some of the stuff that exists today, it's going to make it easier for us to operate, for sure.

Anand

So, I think that's it for our time. I want to thank Don and Chris for spending time sharing their thoughts. I want to thank SIFMA for allowing us this time. I would challenge all of you, think about what you want to transform, think about how you want to make your consumption of data easier. You can find Don, Chris, myself, JR, and the entire BetaNXT team at the BetaNXT booth to ask them questions, bring them ideas. Innovation will be sparked by you guys, their customers and their partners. So, thank you everybody, have a wonderful conference. We appreciate your time.

Source: SIFMA 2024 Operations Conference & Exhibition