
Citi Leads US Top Fixed-Income Leaders: Greenwich Associates


U.S. fixed-income dealers are investing heavily to meet investor demand for electronic trading. But the nature of that demand is changing. Institutional investors today aren’t just looking for electronic execution. Instead, they are looking to partner with dealers who have the best pre- and post-trade analytics, deepest data, and advanced electronic tools.

All of the 2019 Greenwich Leaders in U.S. Fixed Income have continued investing in their electronic offerings with the features and capabilities required to meet those changing needs and attract institutional trading volumes. In Overall U.S. Fixed-Income Market Share, Citi takes the top spot, followed by a three-way tie among Goldman Sachs, J.P. Morgan and Morgan Stanley, with Bank of America Securities rounding out the list. The 2019 Greenwich Quality Leaders in U.S. Fixed Income are Bank of America Securities, Citi and J.P. Morgan.

These dealers know that electronic infrastructure is becoming an ever more important factor in the competition for U.S. fixed-income trading business. In the months covered by the Greenwich Associates 2019 U.S. Fixed-Income Investors Study, the proportion of secondary investment-grade credit volume trading over e-channels was in the 25-28% range, while high yield hovered in the low teens.

“Numerous factors are contributing to this growth, including the ongoing increase in ETF trading volumes as well as more odd-lot trading resulting from the broader trend toward index investing,” says Greenwich Associates Managing Director Frank Feenstra. “E-trading in emerging-markets rates and credit, and in short-term fixed income, also increased year-over-year.”

2019 Most Helpful Analysts and Traders in U.S. Fixed Income
The growth of e-trading and the more recent emergence of data analytics, AI and other electronic tools as core business imperatives for investors have together elevated the importance of dealers’ electronic offerings. However, as sell-side investment in e-tools grows, top-caliber traders and analysts can help dealers differentiate themselves from the competition.

“In an electronic age, a personal connection created by a trader visit can be the thing that gets a dealer the first call,” says Greenwich Associates Managing Director James Borger.

Click here for the complete list of 2019 Greenwich Leaders in Overall U.S. Fixed Income, Rates, Credit and Securitized Products and the 2019 list of the Most Helpful Fixed-Income Traders and Analysts.

Closing Volume Discovery

Kathryn Zhao, Cantor Fitzgerald

Imbalance messages provide insight into the closing volume and have implications for our prediction method. We focus on how the closing volume discovery process unfolds over time and identify key differences between NYSE and NSDQ.

In recent years, daily US equity trading volume has shifted towards the closing auction. For S&P 500 stocks, the closing auction has grown from 4% of daily volume eight years ago to more than 10% today (Figure 1), and the last half hour now accounts for more than 25% of total daily volume.

Spreads in the last half hour are at their lowest levels of the day. For S&P 500 stocks, the average spread decreases from more than 20 basis points (bps) at the open to less than 5bps in the last half hour (Figure 2b). Volatility also stays at a relatively lower level in the afternoon than in the morning (Figure 2c). Market impact trends down and reaches its lowest level in the last half hour. According to our research, the market impact of trading the same quantity at the same rate near the open is 1.4 times that of mid-day, while the market impact near the close is only half that of mid-day (Figure 2d). Liquidity, spread, volatility and market impact are all intertwined, and they all indicate that trading near the close is cost-effective and should be an important part of the trading strategy.

Figure 1. Close Auction Volume Trend
Figure 2. Intra-day Market Microstructure Profiles
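To make these profiles concrete, here is a minimal sketch of how time-of-day spread and volatility curves like those in Figure 2 could be computed from one day of quote data. The column names (`ts`, `bid`, `ask`) and the 30-minute bucketing are illustrative assumptions, not the author's actual methodology.

```python
import pandas as pd

def intraday_profiles(quotes: pd.DataFrame) -> pd.DataFrame:
    """Average spread (bps) and a crude volatility proxy per 30-minute bin.

    Assumes one trading day of quotes with columns: ts (datetime64), bid, ask.
    """
    q = quotes.copy()
    q["mid"] = (q["bid"] + q["ask"]) / 2
    q["spread_bps"] = (q["ask"] - q["bid"]) / q["mid"] * 1e4
    q["bucket"] = q["ts"].dt.floor("30min")
    return q.groupby("bucket").agg(
        avg_spread_bps=("spread_bps", "mean"),
        # std of successive mid-quote returns as a simple volatility proxy
        volatility=("mid", lambda m: m.pct_change().std()),
    )
```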

The closing auction is even more significant on index rebalance days such as the annual Russell Rebalance in June. For stocks added to or deleted from the Russell indices, the average closing volume can exceed 40% of daily volume. For stocks that remain in the Russell indices, the average closing volume can still be as high as 25%.

Therefore, closing volume prediction is an important factor in the optimal scheduling of algorithmic trading. Managing order placement logic appropriately is difficult because actual daily closing volume varies widely. Excluding large trading volume events such as the annual Russell Rebalance, the standard deviation of the closing volume is 66% for Russell 1000 stocks. Using the previous day's closing volume as the prediction, the average absolute prediction error can still be as much as 50%; a simple sketch of these baseline statistics follows.
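A minimal sketch, assuming a pandas Series of one stock's daily closing-auction volume, of how the two baseline numbers quoted above (the dispersion of closing volume, and the error of a previous-day prediction) could be computed:

```python
import pandas as pd

def baseline_stats(close_vol: pd.Series) -> dict:
    """close_vol: one stock's daily closing-auction volume, indexed by date."""
    rel_std = close_vol.std() / close_vol.mean()   # dispersion relative to the mean
    prev_day = close_vol.shift(1)                  # yesterday's volume as today's prediction
    abs_err = ((close_vol - prev_day).abs() / close_vol).mean()
    return {"relative_std": rel_std, "prev_day_abs_error": abs_err}
```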

However, prediction error can be greatly reduced by taking into account the imbalance messages published on the primary listing exchange. The goal is to find the optimal balance between starting the order early enough and waiting or adapting order placement as imbalance messages are disseminated near the close, in order to achieve the lowest volume prediction error.

Contrasting prediction models for NYSE and NSDQ closing auctions

The two largest primary exchanges for closing auctions are NYSE and NSDQ. Each exchange has its own closing auction procedures and methods for disseminating imbalance messages, and its own methodology for determining the eligibility and priority of orders participating in the closing auction. The information fields in imbalance messages include the reference price, paired quantity, imbalance quantity, and near and far indicative clearing prices. We use a linear regression of closing auction volume on Paired Qty and Imbalance Qty. We run the regression every 30 seconds in the last 6 minutes to capture the time effect as more information becomes available towards the close.
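A minimal sketch of this per-snapshot regression, fitting one model per 30-second snapshot time and applying it to a live imbalance message. The DataFrame layout and field names are assumptions for illustration; the production model likely includes additional conditioning.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

def fit_snapshot_models(history: pd.DataFrame) -> dict:
    """history: one row per (symbol, date, snapshot_time), holding the
    paired_qty and imbalance_qty observed at that snapshot plus the
    realized close_volume for that symbol-date."""
    models = {}
    for t, snap in history.groupby("snapshot_time"):
        X = snap[["paired_qty", "imbalance_qty"]].to_numpy()
        y = snap["close_volume"].to_numpy()
        models[t] = LinearRegression().fit(X, y)  # one model per snapshot time
    return models

def predict_close_volume(models: dict, t, paired_qty: float, imbalance_qty: float) -> float:
    """Apply the model fitted for snapshot time t to a live imbalance message."""
    return float(models[t].predict([[paired_qty, imbalance_qty]])[0])
```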

1. NYSE
One key feature for the NYSE close is d-Quotes. The NYSE cutoff time for MOC/LOC order entry is 15:50. However, d-Quotes can be submitted, modified or cancelled up to 15:59:50. This unique feature contributes to greater uncertainty of the closing volume on the NYSE. It is important to note that NYSE d-Quotes are not added into the imbalance feed until 15:55. As imbalance messages are updated every 5 seconds, prediction accuracy gradually improves over time towards the close.
Table 1. Absolute Prediction Error and Paired Qty Coefficient over Time (NYSE)
As shown in Table 1, at 15:55, when NYSE d-Quotes are first included in imbalance messages, the average Absolute Prediction Error drops significantly to 12%. It continues to decrease slowly, reaching 4% at 15:59:30 as the d-Quote submission deadline approaches. The regression coefficient of Paired Qty also decreases over time.

2. NSDQ
The NSDQ cutoff time for MOC/LOC order entry is 15:55. After 15:55, Imbalance-Only (IO) orders can be submitted until the close and late LOC orders can be submitted until 15:58, but IO and late LOC orders cannot be updated or cancelled. Since any IO order to buy (or sell) that is priced more aggressively than the 16:00 NSDQ bid (or ask) will be adjusted to the NSDQ bid (or ask) prior to the execution of the cross, those IO orders effectively compete on time priority alone. That competition quickly drives the imbalance quantity down to zero shortly after 15:55. Consequently, the prediction error converges quickly to a much smaller value on NSDQ.
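The repricing rule behind that time-priority effect can be stated in a few lines. This is a hedged paraphrase of the mechanism described above, not NSDQ's rule text:

```python
def effective_io_price(side: str, limit_price: float, nsdq_bid: float, nsdq_ask: float) -> float:
    """IO buys priced above the 16:00 NSDQ bid are repriced down to the bid;
    IO sells priced below the NSDQ ask are repriced up to the ask. Price
    therefore cannot differentiate aggressive IO orders -- only arrival time can."""
    if side == "buy":
        return min(limit_price, nsdq_bid)
    return max(limit_price, nsdq_ask)
```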

Figure 3. Histogram of Prediction Error on NYSE

In Figure 4, the orange line shows an example of the evolution of the predicted closing volume, which converges to the actual closing volume (the blue line), while the baseline prediction (the green line) in this case underestimates the closing volume.

Figure 4. Evolution of Closing Volume Projection on NYSE – An Example
Figure 5. Histogram of Prediction Error on NSDQ

The histogram of Prediction Error on NSDQ has a much narrower shape (Figure 5). Immediately after the first imbalance message, the Absolute Prediction Error is only 1.5%, and it declines to 1.1% by 15:58. The error standard deviation in Table 2 appears larger than Figure 5 suggests because the distribution is not normal: the standard deviation is driven by long tails where the Absolute Prediction Error exceeds 5%.
Table 2. Absolute Prediction Error and Error Standard Deviation over Time (NSDQ)
Special days

  • Quadruple Witching/S&P Quarterly Rebalance: closing volume averages 26% of daily volume (compared to 10% on a “normal day”).
  • Russell Rebalance: closing volume averages 30% of daily volume. Intra-day trading volume is also severely back-loaded.
  • The closing volume on “special days” is often several times higher than on “normal days”, which requires model coefficients to be calibrated separately.

    Conclusion

Closing volumes are highly variable and hard to predict. Imbalance messages enable a much more accurate prediction of the closing volume, and the timeline of this “discovery process” varies across exchanges. We find that the prediction converges extremely quickly on NSDQ, while on NYSE the prediction error declines gradually as more information flows in from d-Quotes.

    Managing Vendor Consolidation – Decoupling your OMS and Client Connectivity

    George Rosenberger, Itiviti

As capital markets continue to evolve, consolidation has become the order of the day. Smaller technology vendors are being acquired by larger vendors at an alarming rate, and the larger vendors are also expanding their offerings through mergers and acquisitions to reduce time-to-market for new solutions and features.


Over the course of the past 24 months, we've seen billions and billions of dollars change hands, not just in mergers of technology companies but also among financial firms recognizing the need to become more technology-focused; I can think of eight off the top of my head. To fully comprehend the impact of consolidation in this space, let's consider the main rationale behind it. I see three basic motives for growth through consolidation:

  • Increasing customer bases: Monolithic companies may acquire a firm for its customer relationships rather than for technology or resources.
  • Improving buying and negotiation power: Smaller firms struggle against larger firms with stronger negotiation power in RFQs; an acquisition allows the infrastructure to scale in order to land larger global clients.
  • Adding efficiencies and removing redundancy: Extreme cost cutting of systems and resources, staff reductions (involuntary or voluntary) and product end-of-life decisions can produce immediate budget gains.


How does vendor consolidation impact you and your clients?
Vendor consolidation can be great for your organization, e.g. by adding solution options, consolidating contracts with price reductions, expanding service and support, and speeding the implementation of new products and services.

    But, more often than not, there are negative impacts as well. For example, you might be strong-armed into signing for a new, yet unproven, system that does not fully address your technology or business needs.

    What if your new vendor fails to meet the deadline? Or if their service falls flat or your service level drops? What if your superiors demand answers to questions you can no longer answer? What if your clients’ experience changes negatively because you decided on that particular OMS provider? What if you must ask clients to make a change yet again?

This line of questioning is instrumental to the decision-making process. If you fail to consider these key questions, you are putting the health of your customer relationships at risk.

    Is your client connectivity managed by your OMS provider?
Now we’re delving into an area we’re all very familiar with: customer satisfaction and success. Both are crucial to our own organization’s management and growth potential. One area where this is prevalent is the OMS/connectivity space. It’s commonplace for your OMS vendor to also provide client connectivity in a single (bundled) package. FIX connectivity is not a core capability of most OMS providers; it is usually an add-on revenue stream. This has been a point of contention recently, as we’ve seen an increase in migrations of brokers’ client connectivity to independent service providers.

How confident are you that the risks to you and your clients during a migration have been addressed?
    Before you ask your clients to migrate, consider these fundamental migration risks:

  • Connectivity risk: Does the OMS vendor have the FIX expertise required to quickly and easily add trading counterparties? Do they provide tooling that enables you to control the onboarding experience if you so choose?
  • Set-up risk: Will all enrichments and rules you have in place today migrate to the new vendor, and does the vendor have proper testing tools to verify those enrichments and rules prior to migration?
  • Platform risk: How resilient and global is the platform? Does the vendor have any certifications on its resiliency capabilities, and is it global in scale to meet your expanding business needs?

Ideally, brokers should decouple inbound client connectivity from their OMS to mitigate these risks and to find the most stable, innovative connectivity provider. The ability to choose an independent connectivity provider is the ultimate insurance policy. Decoupling ensures your client flow is “portable” when adding a new OMS vendor or changing OMS providers again: migrations will be seamless for your clients.

This practice also insulates you from OMS vendor risk. If your OMS provider goes out of business, scales back its support or discontinues its product, your clients will not be affected. Additionally, decoupling OMS from connectivity allows you to select the provider for each area, ensuring the best match for your own needs and those of your clients. The benefits of unbundling can be summed up in these categories:

  • Alignment of the function with subject matter expertise.
  • Access to better connectivity tooling.
  • More advanced service levels.
  • Increased flexibility and reliability.
  • Greater transparency into FIX messaging.


I implore you to consider unbundling client connectivity from your OMS provider to protect your firm and your clients. Decisions about connectivity and OMS solutions are complex, long-term decisions that should not be taken lightly. Don’t let further consolidation among technology and data vendors catch you by surprise.

    Routing Transparency Roundtable

    Regulators are pushing brokers to disclose more information about how and why they route equity trades. Some industry practitioners want to go a step further.

The Routing Transparency Initiative (RTI) aims to boost transparency in the routing of ‘child’ orders — the orders carved from the initial ‘parent’ order between order entry and execution — by better understanding the intention, or reason, behind the routing. RTI can be applied globally and across all asset classes, leveraging the Financial Information eXchange (FIX) protocol.

On Sept. 10, about 20 market professionals representing the buy side, sell side, exchanges and technology providers convened in New York for a GlobalTrading Journal Thought Leadership Roundtable entitled RTI: Unlocking the Routing Decision. Questions discussed included: what problem is RTI the solution for; what is the current state of implementation; what role do different market constituencies play in moving RTI forward; and, ultimately, how would RTI support best execution for market participants?

    “Regulatory policy is being driven by a qualitative perception of what’s going on between venues and the sell side,” said Enrico Cacciatore, Senior Quantitative Trader and Head of Market Structure and Trading Analytics at Voya Investment Management. “RTI offers a quantitative, intention-based view through the forensics of routing. Instead of regulators driving policy, industry can be the key driver.”

Enrico Cacciatore, Voya IM

    Cacciatore is a co-founder of the Routing Transparency Initiative, along with Nancy Kwok, Principal, Equities Trading Research and Strategy at Connor Clark & Lunn Investment Management. He described the group behind RTI as a grassroots consortium of financial firms who want to add transparency to routing practices.

To get RTI started, the buy side needs to outline just what data it needs via RTI, and then a critical mass of sell-side brokers (perhaps a half-dozen) would need to sign on to develop and implement the system and distribute the data. Investment managers can potentially encourage brokers to get involved by directing more of their commission wallets to participating firms, Cacciatore said.

A survey conducted before the roundtable provided some background information. Of 14 respondents, 13 agreed there should be more transparency in routing behavior, particularly pre-execution. Respondents were generally in favor of standardization via a new FIX tag, though less decisively than on the first question.

Phil Mackintosh, Nasdaq

    On the question of how likely it is that at least six to eight brokers will adopt RTI, 10 of 14 respondents expressed medium to slightly positive confidence, while two respondents had high confidence and two had low confidence.

    Action Step

    Securities professionals “have been talking about disclosures and transparency for years now,” said Tal Cohen, Executive Vice President, Head of North American Market Services, at Nasdaq. “We need to take the next step and achieve what regulation is trying to achieve.”

    “We are very supportive of complete transparency in algos,” said Brian Bulthuis, Head of Americas Quantitative Trading Strategy at Instinet. “It engenders trust and triggers good conversations between providers and users.”

    If done right, RTI can more closely align buy-side and sell-side interests. “Best execution for the sell side and buy side may differ and can be subjective,” said Kwok of CCL. “RTI has the possibility to be a common use case which exists to quantitatively link intent and execution together.”

Kathryn Zhao, Cantor Fitzgerald
    Going beyond the why of RTI, Kathryn Zhao, Global Head of Electronic Trading at Cantor Fitzgerald, offered a sell-side perspective on how RTI could be implemented.

“The idea is to enrich each FIX order message with a tag that encodes the order destination, type and intent,” Zhao said. “We propose the tag value be encoded with SBE (Simple Binary Encoding), which has the advantage of being more compact and more flexible in terms of allowing unlimited dimensions and extensibility.”

“The SBE schema dictates which section of the binary data belongs to which trading component (e.g., Algo Engine, SOR, etc.), making the framework much cleaner,” Zhao continued. “In addition, using the shared SBE schema, the SBE compiler will code-generate the encoder/decoder logic in a target programming language of choice, which can be easily integrated into the FIX flow processing.”
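As a rough illustration of the idea (not the actual RTI schema, which has yet to be defined), the sketch below packs hypothetical destination, order-type and intent codes into a fixed binary layout and base64-encodes the result so it can travel in a string FIX tag. A real implementation would define an SBE XML schema and use the SBE compiler’s generated codecs; every field and code value here is an assumption.

```python
import base64
import struct

# Hypothetical layout: component id, destination code, order type, intent code.
LAYOUT = struct.Struct("<BHBB")  # little-endian, 5 bytes total

def encode_intent(component: int, destination: int, order_type: int, intent: int) -> str:
    """Pack the codes and base64-encode them for transport in a FIX string tag."""
    return base64.b64encode(LAYOUT.pack(component, destination, order_type, intent)).decode()

def decode_intent(tag_value: str) -> tuple:
    return LAYOUT.unpack(base64.b64decode(tag_value))

# e.g. an SOR (component 2) routing to venue 101 as a midpoint peg (type 3)
# with a "liquidity seeking" intent (code 7) -- all illustrative values:
tag = encode_intent(2, 101, 3, 7)
assert decode_intent(tag) == (2, 101, 3, 7)
```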

    More or Less?

One detail to be sorted out involves how expansive RTI would be. “We want to define and standardize all intention codes,” Cacciatore said. “The question is, do we want a limited number of intention codes to provide simplicity, or do we allow brokers to express their intentions uniquely, allowing a larger number of standardized codes?”

    For Instinet, transparency “ends up focusing a lot around customized reports,” said Bulthuis. “If you establish a basic overall definition, brokers can map their myriad reason codes into a simpler format.”

    “The foundation rests on how to map reason codes into a Street-wide format and then figure out how to communicate data in real time. That’s the sell side’s responsibility,” Bulthuis continued. “The buy side’s part might arguably be bigger — consuming the data, storing the data and figuring out how to make sense of it all.”

Nancy Kwok, CCL IM

Respondents to the background survey offered a number of potential challenges in implementing RTI, ranging from the broad (cost and prioritization) to the specific (compliance with FIX, and the fact that “intention” is not as objective as other FIX tags). “Beyond defining exactly what proponents of RTI want, will there be enough users representing enough commission dollars to get a big enough portion of the sell side to comply and standardize?” one person asked.

    But the end goal — bettering best execution — could make the industry’s lift worthwhile. “Greater transparency is always a good thing, as long as it is delivered in an easy-to-consume and useful way,” one survey respondent said. “Very simply, this initiative would help us make more informed trading decisions,” said another.

Regarding next steps in moving RTI forward, CCL’s Kwok suggested (1) surveying a larger group of industry practitioners; (2) finding the right stakeholders to work with; and (3) agreeing on a small set of standardized intentions that can serve as building blocks for the initiative.

CEO Chat with BMLL Technologies: Making Deep-Dive Data Accessible

    It is a well-known fact that capital markets firms need to focus on data, analytics and insight in order to create alpha. This focus on data has gone so far that some have even taken to calling it a ‘data revolution’ or ‘data wars’, while Accenture has labelled the task of data-driven management as “turning dark data into business impact”. We spoke to Johannes Sulzberger, CEO of BMLL Technologies, about how firms can efficiently increase their data analytics capabilities without incurring excessive overheads.

Tell us about BMLL Technologies. What does it provide?

    Johannes Sulzberger, BMLL Technologies

    BMLL Technologies is a Data-Engineering-as-a-Service company for Capital Markets. BMLL provides firms with access to massive limit order book data sets through cloud computing services.

Leveraging the underlying data science platform, we have developed a derived data product through which we can calculate an unlimited number of bespoke derived data metrics and deliver those to customers in a multitude of ways. Customers can focus on the output and let BMLL manage the tech environment. We offer summary statistics and derived data from the most granular data sets available, providing access to insights that were not easily accessible before, delivered in a simple, consumable product. Basically, our platform combines data, hardware and analytics, and is designed to allow quants, traders, business analysts and compliance staff to go ahead and do their work.

    What makes BMLL Technologies different?

    Imagine trying to analyse all the trades, orders, messages and data broadcasts across every exchange. There are insights to be gained from looking at that granular level of data, but up until now only a very small number of firms have had the means (and budget) to access it. The BMLL platform aims to level the playing field and provide access to Level 3 (L3) data for market participants, their research and quant teams.

L3 data displays all individual messages in the limit order book (LOB), providing traders and researchers with unparalleled visibility into the workings of the market. These datasets are extraordinarily large and complex, so most firms don’t have the computing infrastructure to derive meaningful analytics from them. Our Derived Data capability can typically process hundreds of billions of messages in a matter of hours rather than days. Data sets are primarily equities from US, European and some Asian trading venues, as well as exchange-traded derivatives. Based on client demand, however, we can add new data sets quickly and easily, and we are starting to push into fixed income and FX spot markets too.

While some big firms do have access to data, we have found that the combination of data, analytics and the scalable compute power necessary to efficiently extract value is very rare. Quant researchers want the complexity of the order book condensed into simple metrics. BMLL enables access to both raw and derived data through a range of mechanisms, with advanced quant users accessing the platform via JupyterLab, a popular data science interface.
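To give a sense of what working with L3 data involves, here is a minimal sketch of replaying order-level messages to rebuild the top of book and emit a queue-imbalance metric. The message format (tuples of action, order id, side, price, size) is an assumption for illustration; real venue feeds each have their own schemas and also carry modify and execute events, and this is not BMLL’s API.

```python
from collections import defaultdict

def queue_imbalance(messages):
    """Replay simplified L3 add/cancel messages; after each message, yield
    (best_bid, best_ask, imbalance), where imbalance compares the size
    queued at the best bid and the best ask."""
    book = {"B": defaultdict(float), "S": defaultdict(float)}  # side -> price -> total size
    orders = {}  # order_id -> (side, price, size)
    for action, oid, side, price, size in messages:
        if action == "add":
            orders[oid] = (side, price, size)
            book[side][price] += size
        elif action == "cancel":
            s, p, sz = orders.pop(oid)
            book[s][p] -= sz
            if book[s][p] <= 0:
                del book[s][p]
        if book["B"] and book["S"]:
            bb, ba = max(book["B"]), min(book["S"])  # best bid / best ask prices
            qb, qa = book["B"][bb], book["S"][ba]
            yield bb, ba, (qb - qa) / (qb + qa)
```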

    What can be gained by looking at L3 data? Why is this level of detailed data important for companies?

Quants are on the lookout for unique trading signals, fraud detection, or the market impact and associated costs of a trade. In order to analyse markets and interact with exchanges in the best way, it is important to understand the data produced by exchanges. For example, in the equities world, L3 data comes from 14 US exchanges and 10 major European exchange groups; each dataset is large, cumbersome and in a different format.

In order to realise the value contained in exchange data, it needs to be extracted and put into context. BMLL makes this detailed data accessible to the whole market, taking all messages from the full order book and normalising and compressing them. This data can then be analysed using state-of-the-art computing power via the cloud. Once the derived data is summarised at scale, users can gain new insights from data sets they have not had access to in the past.

    Who are your users?

    We currently have 12 Tier-1 institutions on our books including investment banks, global asset managers, sophisticated hedge funds and academic research institutions.

    Can you tell us something about the company’s background?

BMLL is a spin-out of the Engineering Department of Cambridge University. The last five years have seen significant developments both in customer use cases and in the technology stack available to tackle such big-data problems. The original concept of our platform was born of the pain industry users feel when trying to access, manage and extract value from limit order book data.

    Do academics still help?

Absolutely. We are very proud to have a scientific advisory board with some of the world’s best-known academic leaders in the field of financial data and machine learning. Together with our research team, they look to provide answers in the fields of liquidity resiliency, price impact, predatory trading and the combination of high- and low-frequency data, grounding the work in rigorous academic analysis.

    SEC Mulls Rule 605 Reform


    U.S. Securities and Exchange Commissioner Elad Roisman stirred the market-structure pot by floating potential reforms to the regulator’s best execution requirement at SIFMA’s annual market structure conference in Midtown Manhattan.

    He cited his concerns that brokers are substituting a focus on price, time, and the avoidance of trade-throughs for robust best execution analysis.

    “Price is cut and dry, but it is just one factor in best execution analysis when firms make routing decisions and when regulators review those decisions,” said Commissioner Roisman.

    One possible approach would be to expand the definition to include subjective factors.

“The SEC has previously cited many other aspects of a broker’s routing decisions in obtaining best execution of customer orders,” he said. “These include size, availability of information, trading characteristics for a security, transaction costs, and ease of getting a fill. Going beyond the SEC, FINRA cites several other factors surrounding best execution, including price improvement opportunities, differences in price disimprovement, the likelihood of execution of limit orders, customer needs and expectations, and the existence of internalization or payment-for-order-flow arrangements.”

    To enable such a change, the Commission could consider adopting a non-prescriptive interpretation of Regulation NMS’ Rule 605 in which brokers would have a baseline requirement that would apply across various market niches and market conditions. Brokers would then be able to differentiate themselves by performing beyond the baseline.

    New best execution requirements likely would require the Commission to review or even eliminate its Order Protection Rule as well.

“When adopting NMS 14 years ago, the Commission could not have predicted 13 active equities exchanges, with a 14th recently approved and two more in the works, as well as 30 alternative trading systems,” he noted.

    The Commissioner raised multiple ways in which the SEC could update the Order Protection Rule, including having exchanges meet a set average daily volume level during a set period, adjusting round lot sizes based on price or order size, or protecting the smaller exchanges.

“I’m not committed to any of these, but am using them to spur the conversation,” said Commissioner Roisman before turning his focus towards order management systems.

He questioned whether the recent consolidation within the OMS market and the opacity of the platform providers’ pricing had raised risks for investors and brokers, and whether those firms were diligently overseeing their relationships with the vendors and their services.

    “They are performing services that market participants value and have become interwoven in the equities environment,” he said. “I would like to understand this thread that runs through the equities market better.”

    Treasury Trading Q&A: Elisabeth Kirby, Tradeweb

    Last week, Tradeweb Markets and ICE Benchmark Administration launched Tradeweb ICE U.S. Treasury Closing Prices. In a joint release, Tradeweb and ICE said trusted reference price data is critical for financial firms to manage investment portfolios, evaluate the fair value of securities, perform compliance monitoring, and satisfy general accounting standards. The Tradeweb ICE U.S. Treasury Closing Prices are designed to represent the daily market mid-price for U.S. Treasury securities.

    Markets Media caught up with Elisabeth Kirby, Managing Director and Head of Rates Strategy at Tradeweb, to learn more.

    What market need was identified that led to the development of U.S. Treasury Closing Prices?

    Liz Kirby, Tradeweb

    The U.S. Treasury market is the largest in the world, with $16trn in tradeable securities, but unlike other sovereign debt markets (like the U.K., where we provide official Gilt prices), it has no official close. As a result, lots of participants have relied on different snapshots of the market at various points of the day, or varying estimates of where the market should be to close their books each day.

Participants are shifting away from submission-based pricing and towards independently validated reference prices, and so we launched the Tradeweb ICE Treasury Closing Prices based almost entirely on customer demand. Anyone who has to mark to market each day needs a transparent, robust price that they can trust. Given the long history and strength of our U.S. Treasury trading platform, market participants look to us for the provision of that high-quality data, and in working with IBA, we’re ensuring that the data is fully validated.

    How does this launch move the needle versus what data had been available for U.S. Treasury traders/investors?

Our aim with the Tradeweb ICE U.S. Treasury Closing Prices is to establish the market-wide close, for everyone – including, but not limited to, our customers – to use. That’s why the IOSCO Principles have played such a large part in our approach and design: they help to guarantee that these prices will be independent and truly based on market activity.

    What transactions will the U.S. Treasury Closing prices be based on and how robust will the pricing be?

The Tradeweb ICE U.S. Treasury Closing Prices are calculated and published daily for more than 900 U.S. Treasury securities using prices available on our global institutional platform. From those prices, supplied by as many as thirty of the largest U.S. Treasury dealers, we calculate a single bid and offer price, and then derive the official closing mid-market price. We do all of this at around 3 p.m. Detailed pre-calculation checks and post-calculation monitoring and surveillance then round out the process, and the prices are published at around 3:45 p.m.
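A minimal sketch of the final aggregation step described here: collapsing many dealer quotes into a single bid, a single offer and a closing mid. The outlier trimming and simple averaging are assumptions for illustration; the published methodology applies its own validation, monitoring and surveillance.

```python
import statistics

def closing_mid(bids: list[float], offers: list[float], trim: int = 1) -> float:
    """Aggregate dealer bids/offers into one bid, one offer, then a mid.
    Drops the `trim` most extreme quotes on each side before averaging;
    assumes more than 2 * trim quotes per side."""
    kept_bids = sorted(bids)[trim:len(bids) - trim]
    kept_offers = sorted(offers)[trim:len(offers) - trim]
    bid = statistics.mean(kept_bids)
    offer = statistics.mean(kept_offers)
    return (bid + offer) / 2
```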

The breadth of the prices on the Tradeweb platform, from such a large body of critical market participants, will bring much-needed accuracy to the Treasury market close. And the methodology is tried and tested: it’s the same one we use in the U.K. to calculate Closing Prices for Gilts. We took over the exclusive provision of those closing prices from the U.K.’s Debt Management Office in 2017.

    In the Sept. 10 release, Tradeweb CEO Lee Olesky said the prices will “address the industry’s ever-increasing demand for accurate, independently validated market data, based on a strict and transparent methodology.” How and why is industry demand ever-increasing?

    There are a number of factors that drive our industry’s desire for more data.

    The first is the advance of technology, and the speed at which trading, and more broadly the entire workflow, is electronifying. In order for those technology processes to work, you need concrete data inputs. Obviously, the better the data, the better the input.

    The second is a far more granular focus by firms on compliance and best execution. For these analyses to be most successful, it helps to have objective, robust prices that are utilized across market participants.

    Lastly, but no less importantly, is the increasing impact of ETFs and passive investments. These types of flows represent a huge and growing segment of the marketplace that requires accurate closing prices to reduce tracking errors versus benchmarks.

    OPINION: Institutional Trading Could Kill Crypto Exchanges


    Can crypto exchanges survive the expected deluge of institutional investment? Today, that bet is even money.

    Satoshi Nakamoto never intended to graft bitcoin onto the existing international financial markets. It was meant to operate in parallel and be an uncorrelated alternative.

    The crypto markets began, like most typical markets, as over-the-counter transactions between individual investors before crypto exchanges began popping up like mushrooms to facilitate easier trading. Once that happened, it began piquing the interest of institutional investors.

However, the services and support required by retail investors and institutional investors are quite different.

Individual investors have shown over the past few years that they are happy to invest without a standard instrument taxonomy, third-party custody, reference data, transaction cost analysis, and smart order routers, all of which are non-negotiable for institutional investors.

    Does it make sense for digital-exchange operators to chase the institutional cryptocurrency market and make the necessary investment to replicate the business model of self-regulatory organizations?

    It doesn’t. Digital exchanges currently make their money from their sizable trading commissions while giving away their market data, which is a mirror image of the SROs.

    Even if exchanges did make the necessary investments to meet the requirements of institutional investors, they might be waiting a long time to cash those institutional trading commissions.

    Institutional trading, by its nature, is OTC trading. Buy-side traders want to keep their order flow off the displayed markets as much as possible to avoid moving the market against them.

    All buy-side traders need is a reference price and size to use as a benchmark for their trade negotiations, which the digital exchanges already give away for free, before registering their trades in the appropriate distributed ledger.

    Trading digital securities, on the other hand, presents its own issues. Regulation D or Regulation A securities are illiquid instruments at best and do not fare well on limit order books.

    If digital exchange operators wanted to pursue the digital securities market, they would need to develop matching methodologies that are similar to those in the interdealer markets.

In this case, exchanges could generate revenue from their market data and trading fees as well as from a host of tangential services, such as developing and issuing the digital securities.

All the digital exchange operators would be doing is adopting the same business plan as private securities trading, but swapping out the middle and back office for a token-based infrastructure.

    Trading Technology Q&A: Frank Troise, Pico

Frank Troise, Pico

    On Tuesday, Pico said it appointed Frank Troise, former ITG CEO, to its board. Markets Media spoke with Troise to learn more about his new role and what opportunities he sees for the trading technology firm.

    Why Pico?

    I’ve known Pico for quite some time. I’ve been a client of Pico. I know the management team. They are top-notch. Pico provides tremendous value in the industry. It’s an entity that I want to put my time, effort and energy into. Pico’s value proposition is becoming even more important.

    Pico aligns very well with my experience in delivering client value in financial technology, in complex global product offerings that have a lot of moving parts and require a lot of focus.  It’s about paying attention to the details to ensure that clients receive high quality outcomes along the way.

Pico CEO Jarrod Yuster has said the goal is to lead in market share of the $50 billion financial technology services market. How far away is Pico from leading in market share, and how will it get there?

    Pico is a leader in outsourced global infrastructure. The acquisition of Corvil, which delivers infrastructure performance analytics, is another leading capability and it is a signal and a message of what Pico is all about: being best-in-class in delivering high-end infrastructure technology capabilities across the capital structure and globally.

    What will be your duties as director?

    As a board member I primarily provide governance and guidance to the executive team. I see my role as working with a top-notch executive team, helping guide them through continuous innovation, and partnering with them as they look to grow and expand their capabilities.

    Where is the growth in Pico’s business?

    There are tremendous opportunities in continuing to advance capabilities as the industry evolves and to expand the geographic scope of the business. There are also opportunities from the recent acquisition of Corvil.

    So there are opportunities for geographic scope and product expansion as well as more client segments to pursue.

    Will growth be primarily organic or via M&A?

    It’s a combination of both. There are tremendous opportunities organically, sticking to the knitting, being best-in-class at what they do and expanding geographic scope. There is also product scope around infrastructure management and delivery as well as extensions from the Corvil acquisition. Additionally, Pico has opportunities to expand through acquisition.

    The Buy-Side’s Drive For Technology Growth

    Instead of investing in costly technology upgrades, it is more prudent for firms to collaborate more frequently and directly, writes Monique Popli, Director of Strategy, Tier1 Financial Solutions.

Investment professionals are always thinking of ways to improve performance and efficiency. Although these are pivotal to measuring business growth and success, the digital transformation occurring across the capital markets landscape is evolving rapidly and must not be ignored. The impacts it is fostering can be immensely beneficial if properly addressed but could prove problematic if overlooked.

    Let’s take a look at three key factors that will drive the evolution of financial technology and help ensure that investment firms are adequately positioning themselves amidst a shifting status quo.

First, investment firms are keen to leverage emerging technologies. However, most firms either do not have the workforce or bandwidth to take on new projects, or they don’t have the existing rails in place to support these types of advanced fintech tools. These constraints make the adoption of new technology both very costly and quite difficult, considering the sweeping changes often required of firms to take that next step in innovation. Without the appropriate personnel on board to design, implement and roll out new solutions and upgrades, the burden becomes exponentially higher.

Finding a system that is advanced and flexible enough to integrate seamlessly into a firm’s current operations and data is challenging. When the human element is factored in, a significant amount of downtime must be allocated not only to building the tools but also to learning how to use them to maintain growth and capitalize on opportunities.

The second factor is that MiFID II has led to diminished research coverage for small- and mid-cap companies, putting more of the onus for direct communication on the buy side and corporates, and creating more responsibility for investment professionals to perform due diligence and publish their own research.

    Corporate access meetings previously coordinated by the sell side will fall into the hands of the buy side. As a result, investment firms have both a greater need and appetite for streamlined technology tools to readily fill these gaps. Without a collaborative solution in place to catalog these initiatives and organize the increased responsibilities, there will undoubtedly be missed opportunities.

Finally, investment firms are actively seeking partners with which to collaborate. By leveraging their relationships across broker, corporate and fintech vendor partners, investment professionals gain greater technical expertise and economies of scale, ensuring solutions are more future-proof in their design.

By partnering with experienced professionals who understand the intricacies of the capital markets industry, investment firms can respond much more quickly to their evolving needs. With the structural landscape of the industry and regulatory factors changing constantly, an agile and nimble approach with expanded market awareness ultimately creates greater scalability as the business grows.

    Instead of investing in costly upgrades, it has and will continue to be more prudent for firms to collaborate more frequently and directly. Finding the right partners is going to foster the flexibility of fintech tools, drive the level of innovation across the industry landscape and help ensure that investment firms are addressing technology and structural movements as they unfold.
