Virtu: Supporting Trading Decisions

Kevin O'Connor, Virtu

Global Trading spoke to Kevin O’Connor at Virtu about the results of the buy-side pre-trade and post-trade survey, and how his firm is evolving to keep up with client demands.


What are some of the key pieces of functionality clients are asking you to incorporate into your pre-trade tools?
In our view, pre-trade has evolved into a decision support tool, so many of the items requested are aimed at helping clients make more informed execution strategy decisions. Customers that use pre-trade for automation have asked for market impact estimates that factor in live market conditions so orders can be routed more effectively. The challenge here is that market impact models are also widely used in liquidity risk applications and in portfolio construction. Having a model that meets both the real-time trading use case and the portfolio or risk use case presents some unique challenges. We have been deploying a new pre-trade framework to customers (our Dynamic Cost Estimator, or DyCE) to support both of these general usage patterns.
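To make the idea of a live-adjusted cost estimate concrete, here is a minimal sketch: a standard square-root impact term scaled by how current volatility and volume compare with their historical norms. The inputs, coefficients and blending logic are illustrative assumptions only and do not describe Virtu's DyCE model.

```python
import math

def pretrade_cost_bps(order_shares: float,
                      adv_shares: float,          # historical average daily volume
                      hist_vol_bps: float,        # historical daily volatility, in bps
                      live_vol_bps: float,        # intraday realised volatility, in bps
                      live_volume_ratio: float,   # volume so far vs. typical profile (1.0 = normal)
                      spread_bps: float,
                      impact_coeff: float = 0.9) -> float:
    """Hypothetical live-adjusted pre-trade cost estimate, in basis points.

    Combines half the quoted spread with a square-root market impact term,
    scaled by live volatility and volume relative to their historical norms.
    Illustrative only -- not any vendor's production model.
    """
    participation = order_shares / max(adv_shares, 1.0)
    # Blend historical and live volatility so a quiet or busy tape shifts the estimate.
    vol_bps = 0.5 * hist_vol_bps + 0.5 * live_vol_bps
    # Thin live volume raises expected impact; heavy volume lowers it.
    liquidity_adj = 1.0 / max(live_volume_ratio, 0.25)
    impact_bps = impact_coeff * vol_bps * math.sqrt(participation) * liquidity_adj
    return 0.5 * spread_bps + impact_bps

# Example: 50k shares vs. 1m ADV on a slightly slow, volatile tape.
print(round(pretrade_cost_bps(50_000, 1_000_000, 150, 180, 0.8, 4.0), 2))
```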

In addition, customers are looking for tighter integration between pre-trade tools and their EMS functionality. Pre-trade is more than just market impact models, so historical performance metrics, real-time market indicators and liquidity measures need to come together to round out the decision support framework. Increasingly this is being applied to other asset classes as well as to equities.

What about strategy selection or venue selection?
I think most customers would evaluate and eventually use a model that can be shown to be effective with strategy selection. The requests we see from customers vary from those that are looking for us to incorporate their own data and models into the strategy selection framework to those that would like a general-purpose model that suggests an execution strategy or venue. General-purpose models can be difficult because there are many variables to consider, and each customer’s experience may be different enough to affect the outcome. There is a lot more work to do in this space.

Where does pre-trade still fall short of customer needs?
Most of the discussion has been about the reliability of pre-trade estimates for areas such as illiquid securities or block trades. In general, single stock pre-trade will be limited by what historical data is available. Although the cost of market data keeps going up, customers (and vendors) are still interested in quality data that helps with constructing models and evaluating trading. This problem is most noticeable in Fixed Income but still exists for equities in some areas.

For post-trade analysis, which for equities is quite mature, are there new ideas being discussed with customers?
Yes. As capabilities for most providers have matured, customers have been asking for more actionable analytics. We have spent most of our time this year rolling out a new equity model to our customers (our Smart Cost Estimator, or SCE) that not only provides a model-driven benchmark for all orders but can also be used for scenario analysis. It is the scenario analysis capability that lets customers evaluate alternative execution strategies and then measure the effectiveness of those changes.
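As a rough illustration of scenario analysis against a model-driven benchmark, the sketch below re-prices the same order under alternative participation rates with a toy cost model, so a realised cost can be compared with what the model expects under each strategy. The cost function and the figures are placeholders, not the SCE model.

```python
import math

def modelled_cost_bps(participation: float, vol_bps: float = 160.0,
                      spread_bps: float = 4.0) -> float:
    # Toy model-driven benchmark: half spread plus square-root impact in participation.
    return 0.5 * spread_bps + 0.9 * vol_bps * math.sqrt(participation)

actual_cost_bps = 21.0           # realised shortfall for the order, from post-trade data
actual_participation = 0.12      # how aggressively the order actually traded

print(f"actual {actual_cost_bps:.1f} bps vs model "
      f"{modelled_cost_bps(actual_participation):.1f} bps")

# Scenario analysis: what would the model expect under alternative strategies?
for scenario in (0.05, 0.10, 0.20):
    print(f"participation {scenario:.0%}: "
          f"expected cost {modelled_cost_bps(scenario):.1f} bps")
```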

What about data quality?  Are there still areas that require attention?
I think that anyone who relies on daily TCA needs to remain focused on data quality. It is not just about properly filtered market data. Customers recognize the importance of tagging orders correctly so that the areas being evaluated (traders, brokers, algos) are assessed fairly. Tagging data with the correct order type or discretion category remains important so that the correct populations are used for any rankings or evaluations. We have been seeing more requests for scorecards (for brokers and algos) that apply filtering for both order characteristics and outliers. The quality of the data being used in the reports is just as important as the format and flexibility of the reports.
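A minimal sketch of that filtering step, assuming a hypothetical post-trade extract in pandas: restrict the population to comparable orders, trim outliers, then rank brokers on what remains. The column names, data and thresholds are illustrative assumptions, not Virtu's scorecard methodology.

```python
import pandas as pd

# Hypothetical post-trade extract; column names and values are illustrative only.
orders = pd.DataFrame({
    "broker":        ["A", "A", "B", "B", "B", "C", "C", "C"],
    "order_type":    ["algo", "algo", "algo", "ht", "algo", "algo", "algo", "algo"],
    "discretion":    ["low", "low", "low", "high", "low", "low", "low", "low"],
    "shortfall_bps": [8.0, 12.0, 15.0, 90.0, 11.0, 6.0, 7.0, 300.0],
})

# 1) Filter to a comparable population: algo orders with low trader discretion.
comparable = orders[(orders["order_type"] == "algo") &
                    (orders["discretion"] == "low")]

# 2) Trim outliers so one extreme print does not dominate a broker's score.
lo, hi = comparable["shortfall_bps"].quantile([0.05, 0.95])
trimmed = comparable[comparable["shortfall_bps"].between(lo, hi)]

# 3) Rank brokers on the cleaned population.
scorecard = (trimmed.groupby("broker")["shortfall_bps"]
                    .agg(["count", "mean"])
                    .sort_values("mean"))
print(scorecard)
```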

What are some of the general technology and industry trends that are affecting Trading Analytics?
Everyone is talking about AI and Machine Learning these days, and it is no different in the Trading Analytics space. Customers are looking for AI tools to help with report generation and data analysis to cut down the time it takes to draw conclusions from their own data. We have been focused on the use of Machine Learning in some of our models, but most of our efforts are in providing access to our data and models so that clients can experiment with these technologies on their own. This has led to many advancements in the API delivery of pre- and post-trade data. I think the proliferation of customer-focused APIs is one of the most exciting developments in Trading Analytics.

Haven’t APIs been available for a while?
They have, but the rate of change here is increasing. The capabilities of the buy side have changed in the last few years. Many of them now have data scientists or data analysts working directly with their trading teams. Although vendors will continue to provide tools to help analyse data, many customers are bringing the processed data back in-house and enhancing it with other internal and external data sources. They are using technologies like Snowflake to build their own view of the market and their trading activity. We have been supporting this effort by offering API access to almost all of the data we provide in our pre- and post-trade tools.
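As a sketch of that workflow, the snippet below pulls post-trade records from a hypothetical analytics API and lands them as a local table that could then be enriched with internal data and loaded into a warehouse such as Snowflake. The endpoint, token and field names are placeholders, not a real vendor API.

```python
import pandas as pd
import requests

# Hypothetical vendor analytics API -- the URL, token and payload fields are
# placeholders for illustration, not a real endpoint.
BASE_URL = "https://api.example-analytics.com/v1/posttrade/orders"
TOKEN = "YOUR_API_TOKEN"

resp = requests.get(
    BASE_URL,
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"start_date": "2025-01-01", "end_date": "2025-01-31"},
    timeout=30,
)
resp.raise_for_status()

# Normalise the JSON payload into a tabular shape and land it locally; the same
# frame could be enriched in-house and loaded into a warehouse such as Snowflake.
df = pd.json_normalize(resp.json()["orders"])
df.to_parquet("posttrade_2025_01.parquet", index=False)
print(f"loaded {len(df)} orders with columns: {list(df.columns)}")
```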

What other industry trends are affecting your business?
We have been working to expand our capabilities in other asset classes such as Fixed Income and Fixed Income derivatives. Our largest clients (and even some of our smaller ones) have been moving to the multi-asset-class dealing desk model. Even though these asset classes are very different and require specific approaches and calculations, some of the concepts from equities are still valid. The automated capture of transaction data (including RFQs, IOIs and Axes) is very important. Even though benchmarking in Fixed Income can be a challenge, cleansing transaction data and market data gathered from multiple sources allows customers to get a view of what execution options should be explored for different orders. The original intent of TCA was to quantify transaction costs so that they could be lowered, thus improving returns. Customers see value in doing this for all asset classes they trade.

©Markets Media Europe 2025
