Data is king — but how stable is the throne?

With data complexity a major challenge for today’s traders, there is broad consensus within the industry on the need for better data architecture, improved management of structured and unstructured data, real-time data capture and decoding, and smarter ways of handling data and processes. Key challenges ahead include how to manage data in the cloud, the need for a common application programming interface (API), and regulatory compliance.

Jarod Yuster, CEO, Pico

At TradeTech 2024, a panel of infrastructure and data analytics providers, joined by one asset manager, looked at the next phase of data aggregation and processing, discussing what firms are doing with the masses of data they hold and how they are making it useful.

Jarod Yuster, CEO of Pico, an infrastructure services provider, pointed to wider macro trends in global data and software analytics, including the electronification of markets, the desire for global borderless trading, and the shift to API-based solutions.

These trends have led to larger datasets that need to be consumed by multiple groups, highlighting the need for a common data structure and for fungibility among developers across asset classes and functions, while the push for “global borderless trading” continues despite geopolitical tensions and new trade barriers springing up.

Rob Lane, global head of business execution, low latency, LSEG

LSEG’s global head of business execution, low latency, Rob Lane, said the firm manages more than 200 billion messages per second across 500 exchanges. “Acquiring data is not the problem, it has grown and grown and grown. The issue is how you store it, manage it and retrieve it,” Lane said.

Robin Mess, co-founder and CEO of big xyt, sporting a Make European Equities Great Again baseball cap, said that given the complexity of data streams it is important to consolidate them. While there are multiple routes to accessing good-quality data, “normalising it and making it insightful and useful, that is a big challenge for the industry,” Mess said.

Issuers, for one, lack aggregated data when they want to demonstrate the size of the market, Mess said. “The quality of liquidity in Europe is not that bad; we have stocks that trade a billion a day, we have great volumes across different mechanisms and venues. We have multiple stocks that on average trade below 3 bps. So there is a lot of data that can be used to promote markets from an issuer perspective. And from an investor perspective it helps as well,” Mess added.

Robin Mess, co-founder and CEO, big xyt

Despite the need for easy access to quality data, Mess said the cloud is not always the answer. “Google, Microsoft, Amazon will gladly sell the capacity to anyone in the industry, but you still have to figure out how to use the data stored in the cloud.” While APIs provide access to this data, aggregation of different data types is the key challenge for the industry, Mess added.

Using artificial intelligence (AI) to grapple with ever-larger, disparate datasets will become increasingly commonplace, Pico’s Yuster believes. “When you’re capturing data in a very complex environment such as trading, with many connected systems, you need to capture every hop,” Yuster said.

With this focus on network monitoring and cybersecurity, AI can help ensure that large market makers and options market makers have the same access to electronic trading data warehouses.

“When something goes wrong in a real-time environment, you need to understand what’s the exposure in the marketplace, what’s the client exposure. With the US moving towards T+1 settlement, you can’t afford to have overnight breaks,” Yuster said.

©Markets Media Europe 2024
