Drag on mutual-fund returns may be less than a rounding error, Nasdaq economist says.
Is the debate about market data costs overblown?
Yes, says Nasdaq Chief Economist Phil Mackintosh. In a blog post published Thursday, Mackintosh noted that, as a share of end-user investors’ overall costs, exchange fees are a small fraction of what investment management, fund expenses and trading impact cost, and even less than half of regulatory costs.
All-in exchange costs, including data fees, snip 0.001% off the annual return of mutual-fund investors, according to Nasdaq estimates. That’s $0.10 from every $10,000 in return.
Phil Mackintosh, Nasdaq
“Exchange costs are not a significant drag on investor returns,” Mackintosh said. “Some would say they aren’t even a rounding error.”
Market data has been a contentious issue for years. Market participants say exchanges behave monopolistically and charge too much; exchanges say their fees reasonably reflect their own technology costs, and the data market is a competitive one.
Mackintosh said criticism of exchange data costs is typically light on supporting data, so he dug into the numbers for the $17 trillion in investor-owned mutual fund assets. Five major types of costs were assessed: investment advisor fees, mutual fund expenses, trading impact, commissions and regulatory fees.
Investment advisor fees trim an average 1.02% from investors’ annual returns, or about $102 per $10,000 in return; fund expenses weigh in at 0.55%, or $55; trading impact is 0.19%, or $19; and commissions are 0.08%, or $8.
Mackintosh noted that costs are not necessarily “bad” for investors. “Managing investments is not cost-free; even those low-priced index funds need a portfolio manager, custodian and accountants to calculate year-end distributions.”
After the main investor expenses, fees associated with the SEC and FINRA regulatory bodies are 0.002%, as are fees from the industry data utility SIP. Each of those skims $0.20 off $10,000 in investor return, or double the exchange costs.
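The arithmetic behind these figures is simple percentage math that readers can reproduce; a minimal Python sketch, using only the drag figures cited above:

# Convert each annual cost drag (a percentage of return) into dollars
# lost per $10,000 of investor return, per the Nasdaq estimates above.
BASE = 10_000  # dollars of investor return

drags_pct = {
    "investment advisor fees": 1.02,
    "fund expenses": 0.55,
    "trading impact": 0.19,
    "commissions": 0.08,
    "SEC/FINRA fees": 0.002,
    "SIP fees": 0.002,
    "all-in exchange costs": 0.001,
}

for name, pct in drags_pct.items():
    print(f"{name}: ${BASE * pct / 100:.2f} per $10,000")
# all-in exchange costs: $0.10 per $10,000, half the SIP or regulatory take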
“Looking at all of these, it certainly puts exchange costs in context, and hopefully encourages a more data-driven debate,” Mackintosh said. “Exchange and SEC fees are a small part of the total equation. This is especially noteworthy considering the important role of exchanges in bringing companies to market, regulating the market and supervising trading, which provides the returns that help secure the retirement of U.S. workers.”
The two major trends are the increase in value of data and analytics and multi-asset class trading.
David Schwimmer, chief executive of London Stock Exchange Group, said the acquisition of Refinitiv allows the UK firm to meet three strategic goals in one step.
In a conference call this morning Schwimmer explained that LSEG set three goals in autumn last year – to become more global, to diversify in capital markets and to significantly expand data and analytics. Schwimmer took over as chief executive of LSEG in August last year after spending two decades at Goldman Sachs.
We are pleased to announce the proposed all share acquisition of @Refinitiv, by @LSEGplc today. "Our aim is to capture the opportunity of data which we believe is driving unprecedented change in the global financial community." – Read our announcement: https://t.co/AMQb90HxRs pic.twitter.com/ThGw2ZUClM
“We looked at this transaction and it allows us to execute those three goals in one step,” he added. “The deal took several months to come together but makes the most sense for customers and creates value for shareholders.”
London Stock Exchange Group said in a statement today that it has agreed definitive terms with a consortium, including Blackstone and Thomson Reuters, to acquire Refinitiv for $27bn (€24.5bn).
Schwimmer continued that financial market infrastructure has changed in the last 20 years and is evolving at an even greater pace today. “Data capabilities will define the success of financial market infrastructure,” he added.
He said LSEG had always anticipated market trends – such as acquiring FTSE and Russell due to the expected growth of passive investment and buying a majority stake in LCH. As a result, information services and the post-trade business comprise 80% of group revenues.
“The two big trends now are the increase in value of data and analytics and multi-asset class trading,” Schwimmer added. “Refinitiv allows us to capture these trends.”
The deal will allow LSEG to expand geographically into the US, Asia and emerging markets.
David Schwimmer, LSE Group
“We can also broaden the number and type of customers; for example, Refinitiv has a wealth management business in the US,” continued Schwimmer. “We will have more contact with corporates through Refinitiv’s worldwide know your customer platform.”
LSEG will also acquire foreign exchange trading venues FXall and Matching and Refinitiv’s controlling stake in Tradeweb, which trades fixed income, derivatives and equities. Tradeweb went public this year and will remain listed in New York.
Electronic trading volumes in foreign exchange have doubled since 2007 but their share is still below that of equities, according to Schwimmer. He continued that Tradeweb has had strong growth over the last three years and is well positioned for the electronification of fixed income trading.
LSEG management on the call said they would be combining the best technology from both companies.
David Warren, chief financial officer at LSEG, said on the call: “There is some overlap in technology, such as data centres, and a global property portfolio which can be rationalised.”
Warren added that the combined company will be investing and making hires in growth areas.
Completion of the deal needs the approval of regulators globally, including in the UK and the European Union.
Schwimmer stressed in the call that the two businesses were highly complementary.
He said: “I do not want to pre-judge the regulators’ decisions but the businesses are more complementary than overlapping and we are not thinking of divestments. We look forward to engaging with regulators around the world as we are in 20 jurisdictions.”
He also stressed that the exchange would maintain its open access policy, which is consistent with Refinitiv’s philosophy. Clients can trade on other venues and use LCH or other services, and vice versa.
Linda Middleditch, who joined trading technology provider Itiviti as chief product officer from Bloomberg last month, said in an email that some smaller firms will be concerned about price rises but there is still plenty of competition from other market data providers.
“As the industry sees further consolidation and many firms consider the potential for a sale, there will be further mergers within this space,” she added. “So, it’s not surprising to see exchanges reinforce the trend of looking to diversify their revenue streams.”
LSEG and Refinitiv had combined annual revenue of more than £6bn ($7.3bn) last year, which would have made the combined business the largest listed global financial market infrastructure provider by revenue according to the exchange.
In a few years customers will be able to access cash and OTC data alongside futures data.
Trey Berre, global head of data services at CME Group, said the exchange group aims to become a one-stop shop for customers’ data needs as it integrates NEX and BrokerTec, partners with third party data providers, and invests in new distribution methods.
Berre was appointed to his current role overseeing the group’s portfolio of historical, real-time and derived data products and services in May this year, reporting to Bryan Durkin, president of CME Group. He had previously led CME’s derived data licensing & partner services business since January 2018.
Trey Berre, CME Group
He told Markets Media: “We aim to be a one-stop shop for our customers’ data needs and facilitate their reading of markets.”
In the past four years, the group’s market data business has tripled the volume of market data it provides to clients, with data from international participation growing more than 16%, according to the exchange.
“CME made large investments in real-time data and in 2017 we rolled out Market By Order (MBO),” added Berre. “Clients can view their individual positions, the full depth of book and more granular information. Previously clients could only view the market by price.”
He explained that these investments meant that when CME traded more than 50 million contracts in May last year, its largest ever day, there were no data disruptions.
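The shift Berre describes is essentially a change in data granularity. A toy Python sketch (illustrative only, not CME’s actual feed format) shows what collapsing an order-level book into price levels throws away:

# Illustrative contrast between Market By Order (MBO) and Market By
# Price (MBP): MBO keeps every resting order; MBP aggregates per level.
from collections import defaultdict

# Order-level book entries: (order_id, price, size), what MBO exposes
mbo_bids = [("A1", 99.50, 10), ("B7", 99.50, 40), ("C3", 99.25, 25)]

# Collapsing to MBP loses individual orders and queue positions
mbp_bids = defaultdict(int)
for _, price, size in mbo_bids:
    mbp_bids[price] += size

print(dict(mbp_bids))  # {99.5: 50, 99.25: 25}, totals only
# With MBO a client can locate its own order ("B7") in the queue;
# with MBP it sees only the aggregate 50 lots at 99.50.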
Berre’s responsibilities include development of new data products, data distribution methods and analytics, and the integration of NEX data offerings. CME completed its $5.5bn (€4.94bn) acquisition of NEX Group in November last year.
Terrence Duffy, CME Group
Terry Duffy, CME Group chairman and chief executive, said in a statement at the time: “Together, we will provide efficient access to futures, cash and OTC markets, as well as post-trade services and data offerings that will further support cost-effective trading and risk management.”
NEX operates electronic foreign exchange and fixed income platforms, and offers post-trade services for over-the-counter derivatives through its compression, reconciliation and processing businesses.
Berre said: “There are synergies with the acquisition of NEX and BrokerTec so clients will be able to receive data on both US treasury futures and cash treasuries. The integration will allow us to launch additional products on the DataMine platform.”
DataMine is CME’s historical market data platform. He continued that CME has a project to integrate all real-time streaming data from NEX with the exchange’s market data.
“In a few years customers will be able to access cash and over the counter data alongside futures data,” Berre added.
One product from the acquisition has already launched. Berre said the EBS Quant Analytics platform for foreign exchange was launched in the second quarter through an API.
Seth Johnson, chief executive of cash markets at CME Group, said in a statement: “There’s currently nothing like this in the marketplace, and that’s exciting not only for us, but for the entire EBS community. The new API streaming service means that EBS clients can analyze their individual and relative performance more efficiently than ever before by consuming the data in real-time and in their own environments, ultimately helping them grow volumes and revenues.”
The new functionality streams trade information, market impact and alpha calculations on a trade-by-trade basis to clients against benchmark data taken from the EBS ecosystem.
Berre added: “There is an increased focus on developing tools that drive content to the customer that meets their standards and investment needs.”
One way of meeting client needs is to invest in cloud-based distribution, rather than requiring customers to subscribe to the exchange’s entire feed to receive market data.
“We hope to leverage this technology to meet changing customer expectations,” said Berre. “It is like the music industry, where customers used to have to buy whole CDs but now they can just buy the music they want, when they want.”
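The analogy translates into an a-la-carte subscription model; a hypothetical Python sketch, with invented names rather than an actual CME API:

# A hypothetical a-la-carte subscription: the client requests only the
# product feeds it wants instead of taking the exchange's full catalog.
def subscribe(catalog: dict, wanted: set) -> dict:
    """Return only the requested product feeds from the full catalog."""
    return {sym: feed for sym, feed in catalog.items() if sym in wanted}

full_catalog = {"ES": "equity index futures", "CL": "crude oil futures",
                "ZN": "10-year note futures"}
print(subscribe(full_catalog, {"CL"}))  # {'CL': 'crude oil futures'}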
Fixed income
Berre continued that outside the core of the exchange’s proprietary data, CME has been partnering with third parties to take advantage of its global distribution and give clients a more cost effective way to access data.
For example, CME Group is making corporate bond data from Neptune Networks available to its customers in order to increase transparency. Neptune Networks is a consortium of banks that was formed in 2016 to deliver high-quality bond market data from sell-side banks to buy-side clients more efficiently and cheaply.
“Neptune has a global corporate bond composite consisting of more than 30,000 bonds which reduces the requirement for investors to calculate their own composite price,” said Berre. “The data is distributed through an API so clients do not need a terminal but can simply add a subscription.”
There is a large opportunity to improve the provision of fixed income data. Consultancy Greenwich Associates said in a report this week, Defining Fixed-Income Data, that for a single fixed income security, market participants need to look to several data providers, multiple trading platforms (which each have unique datasets) and regulatory bodies that often collect and disseminate trading and holdings information in different formats.
Kevin McPartland, head of research for market structure and technology at Greenwich Associates, said in the report: “Even in cleared markets like swaps and futures, gaining access to historical and reference data still leaves users looking to multiple providers. While we expect this to change over time, the major trading venues largely reserve the data they collect and the analysis of that data for their client base, which pays via commission rather than directly for the data itself.”
Kevin McPartland, Greenwich Associates
McPartland continued that buy-side trading desks have access to more data than ever before, with each desk spending on average only $225,000 annually on direct data costs (or roughly 15–20% of their total technology budget) as large broker-dealers in fixed income are subsidizing the cost of data for their clients.
Greenwich expects this model to change as data increases in quality and becomes more valuable, requiring the buy side to pay for access.
“Conversations with top-tier hedge funds and asset managers have shown us that those on the cutting edge are in endless pursuit of new data sources, using them to further refine both their investment and execution strategies,” added McPartland. “The buy side is certainly more informed than ever, but by all accounts, we’ve only scratched the surface with regard to the fixed income market’s transformation.”
Third party partners
Another partner for CME is QuikStrike, which supplies analytical tools designed for options trading. Berre said QuikStrike provides heat maps for CME’s benchmark products which show the concentration of open interest and have been very successful.
He added: “We have a targeted approach to alternative data. In DataMine we will provide content that is complementary to our core markets and relevant to giving clients greater insights.”
For example, CME has also partnered with RS Metrics, which provides satellite imagery on global copper storage, while Orbital Insight supplies global and regional oil estimates.
In addition, CME has agreed a deal with the Johannesburg Stock Exchange to distribute market data from the DataMine platform to its customers, who will be able to access data from CME and JSE in one place.
“We provide DataMine as a global platform-as-a-service to content providers which creates a one stop shop for our customers,” said Berre.
Importance of data
The critical importance of data to exchanges is shown by London Stock Exchange Group’s proposed $27bn (€24.5bn) acquisition of Refinitiv.
LSEG said in a statement that it had a strong strategic rationale for the deal, as digital transformation means that financial markets infrastructure providers must operate globally, with data management, analytics and distribution capabilities that can serve customers across asset classes and geographies.
“The Refinitiv Data Platform has over 150,000 data sources, and is a leading provider of real-time pricing, reference data, private and public company information and events, commodity, economic, quantitative and research data, Reuters News and over 10,000 other news sources,” added the UK exchange.
Steve Grob
Steve Grob, former director of group strategy for Fidessa, said in a post on Tabb Forum that harvesting and monetizing data is becoming the primary business goal of large technology firms. “That is exactly what the LSEG, and its competitors, have become,” he added.
Grob continued that traditional exchange activities have come under severe pressure but the demand for data is an exception.
“Refinitiv has the potential to add significant scale and diversity to the LSEG’s dataset – both of which are prerequisites for effective artificial intelligence and machine learning,” he added. “Trading margins continue to compress while the regulatory overhead is going up too. So, making the path between investors and liquidity shorter is another obvious way of creating competitive advantage.”
Five years ago, Michael Lewis set off a firestorm in the trading community with his bestseller Flash Boys, accusing high-frequency traders of robbing innocent investors to the tune of untold billions each year. Un-nuanced, for sure, the book nonetheless captured for the general public a problem institutions had been wrestling with for eons — how to buy and sell stock without leaking information to those, such as latency arbitrageurs, with the ability to profit from it at their expense.
Five years on, it’s difficult to say we’ve made systematic progress toward creating a marketplace in which institutions can transact safely and efficiently — that is, without having to arm themselves with costly technology to negate the speed advantages of costlier technology designed to extract and exploit information from their trading activity.
Don Ross, CODA Markets
As always, though, we’re told that tomorrow’s technology is about to solve the problems created by yesterday’s. This time, it’s in the form of so-called artificial intelligence—that is, moving from “dumb” algorithmic trading, which does what it’s programmed to do, to “smart” algorithmic trading, which continually adapts algorithms based on what the system learns about the changing outcomes. By some estimates, AI already controls about 80 percent of stock transactions.
Unfortunately, AI will never solve problems of adverse selection in trading. It will only, at best, mitigate them temporarily. The reason is what’s known, in the realm of monetary policy, as Goodhart’s Law—or the phenomenon whereby any measure or target or observed regularity ceases to be useful once pressure is placed upon it by new behaviour. AI can tell me, usefully, to stop doing x and start doing y, but in an environment where AI is ubiquitous y will stop being profitable soon after I embrace it.
If I sound like a technological pessimist, however, I am not. I believe that the technology exists, right now, to improve investor returns, systematically and durably.
How is this possible? Haven’t I been arguing that someone’s speed will always surpass mine? That someone’s intelligence engine will outsmart mine? Yes. But these are only problems because of endemic flaws in market structure—the way trading is organized.
Specifically, the problem is that our markets are organized to execute transactions bilaterally and sequentially. That is, we trade with each other one counterpart at a time. If I have 50,000 shares to sell, I might sell 150 to you at $50, then another 150 to someone else at $50, then another 150 at $49.99 and so on. This process inevitably pumps signals into the market that a seller is active, triggering orders and cancellations designed to profit from the knowledge, at my expense.
But why trade like this? What if I could simply announce to the market, anonymously, nothing more than the fact that I’d like to trade stock XYZ, and then let others—seconds or milliseconds later—say privately how much XYZ they’d like to buy or sell, and at what price? And then match those orders all together, at the same time? That is, multilaterally and simultaneously? In a market structure like this, I could easily sell 50,000 shares without moving the market.
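For intuition, here is a minimal uniform-price call auction in Python, one simple way to match orders multilaterally and simultaneously. This is an illustrative sketch, not CODA’s actual matching logic:

# A minimal uniform-price call auction: all interest arrives first, then
# everything crosses at once at a single price, so no sequence of small
# fills leaks a signal. (Illustrative sketch, not CODA's matching logic.)
def clear_auction(buys, sells):
    """buys/sells: lists of (price, size). Returns (clearing_price, matched)."""
    buys = sorted(buys, key=lambda o: -o[0])   # best bid first
    sells = sorted(sells, key=lambda o: o[0])  # best offer first
    matched, price = 0, None
    while buys and sells and buys[0][0] >= sells[0][0]:
        bid, bsz = buys[0]
        ask, ssz = sells[0]
        qty = min(bsz, ssz)
        matched += qty
        price = (bid + ask) / 2                # midpoint of crossing pair
        buys[0], sells[0] = (bid, bsz - qty), (ask, ssz - qty)
        if buys[0][1] == 0:
            buys.pop(0)
        if sells[0][1] == 0:
            sells.pop(0)
    return price, matched

# A 50,000-share seller meets several buyers in one match event:
print(clear_auction(
    buys=[(50.00, 20_000), (49.99, 15_000), (49.98, 15_000)],
    sells=[(49.95, 50_000)],
))  # about (49.97, 50000): the whole block prints at once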
Don’t believe it? Consider this actual trade, in precisely such a market structure.
By initiating an on-demand auction in Phillips 66 (PSX), a symbol-only alert brought multiple parties together, at the same point in time, to express their trading interest in the stock. The buyer was able to transact the largest trade of the day, 139 times the bid/ask size, with minimal market impact.
This is the market structure my firm, CODA Markets, operates. It is high-tech, for sure—all-day multilateral auctions need lots of computer power and good programming. That’s where we’re putting our money.
So when, you might ask, will everyone start trading like this? That is, institutions and liquidity providers alike, with no advantage for either? Well, the biggest factor holding back mass migration to such a market structure is bad trading cost analysis. If you measure something wrongly, you don’t know that you’re doing much less well than you could be. Most TCA is just too unsophisticated to do proper counterfactuals—to compare the estimated cost of an immediate large-block multilateral trade with the cost of a long sequence of smaller bilateral ones. We at CODA are working to help our clients build and adopt metrics to help them better capture the costs and benefits of alternative market structures.
In short, then, I am indeed bullish on technology in trading. Not to shave microseconds, or to construct (ever-gameable) speed bumps to slow things down. Not to add ever-more costly anti-gaming AI to counter the rapacious sort. No. I’m bullish on technology to build better market structures—structures that enable ever-growing volumes of multilateral trading.
Don Ross is the CEO of PDQ Enterprises, LLC, the parent company of CODA Markets, Inc., a FINRA regulated broker-dealer which operates an electronic alternative trading system under SEC Regulation ATS.
ID2S is connected to the European Central Bank’s real-time gross settlement system.
The first trades settled live last week on ID2S, the regulated blockchain central securities depository developed by SETL, the institutional blockchain payment and settlement infrastructure provider.
Peter Randall, president of SETL, said in an email that the delivery of the complete package was an impressive achievement.
Randall added: “This is a big milestone for SETL. Building a CSD from scratch, getting it specified, regulated, connected to T2S, the European Central Bank’s real-time gross settlement system, and able to issue ISINs, update account balances and undertake non-revocable delivery versus payment, all within 18 months, is a tremendous achievement. The challenge will be for the commercial paper industry to now add the liquidity and flow which are the bedrock of all functioning market infrastructures.”
The ECB launched T2S, TARGET2-Securities, to end fragmentation and lower the cost of cross-border settlements in the European Union. By connecting the SETL blockchain to a central bank, ID2S has settlement finality.
Anthony Culligan, chief engineer at SETL, said in a blog that ID2S was built entirely by the SETL team from their global engineering centre in the UK using the financial grade SETL blockchain. He continued that the use of SETL’s core blockchain as the ledger at the heart of the system means that a standardised approach is taken to the immutable ownership record.
“The fact is that none of the public blockchains have, therefore, been through the regulatory process which is required to meet the standard of operating market infrastructure,” Culligan added. “SETL is unique in having done that.”
Culligan continued that a significant part of the SETL build was around authentication and identity and that SETL’s software code undergoes independent review to ensure it does not damage the capital markets ecosystem. In addition, SETL had to meet the security requirements of the Banque de France and the ECB to connect to T2S.
“The ID2S project sets a new benchmark in the cost and speed of development,” added Culligan. “SETL’s team put in place a piece of national financial infrastructure in a fraction of the time and at a fraction of the cost which could have been achieved either by the incumbents or by the use of more traditional methods.”
Randall continued that competition from ID2S has lowered prices very significantly for users in two ways. First, because settlement is same day there is no longer the risk exposure and hence capital cost, and second, because fees are much lower than those charged by the current incumbents.
Peter Randall, SETL
“The second benefit is that it allows issuers of commercial paper to have a much more direct connection with the owners of that paper which results in tighter spreads and lower overall costs,” Randall said. “The third benefit is that for the first time the issuers, owners, compliance professionals, accounting systems and risk systems all have an absolutely identical record of the transaction which leads to much lower claims and disputes.”
He added that the only limiting factor on the volume of transactions processed by ID2S was the number of transactions processed by the front-end matching engine, which was not supplied by SETL.
It was once about nudging traders to the mouse; now it’s moving them away from it.
Tradeweb has spent the last couple of decades nudging fixed income traders toward the mouse to buy and sell. Now part of its plan is to entice market participants to step away from the desktop device.
No, the trading-platform operator isn’t seeking to ‘de-electronify’ the market and go back to the telephone. Rather, Tradeweb is deploying artificial intelligence to automate trades, obviating the need for an individual to point-and-click for many run-of-the-mill transactions.
Lee Olesky, Tradeweb
Buy- and sell-side trading desks can deploy algorithms to program their process, and they get liquidity aggregation and trade execution “without a click,” said Tradeweb Co-Founder and CEO Lee Olesky. “Things go through with all the data and factors that go into the decision tree.”
Still just a curiosity less than a decade ago, AI is growing rapidly as a trading technology. In the first half of 2019, 21.6% of activity on Tradeweb happened via automated tools, up from 13.6% last year, 2.7% in 2014 and less than 1% in 2013, according to company data.
Olesky spoke on the “Behind the Market Structure” webinar hosted by Greenwich Associates.
Kevin McPartland, Managing Director at Greenwich Associates, noted that pressure on investment-management fees is a high-level catalyst for AI-enabled processes, as firms look to cut costs and gain efficiencies across the board.
In addition to giving traders more time to focus on more challenging, higher-value trades, Olesky said AI is also a risk-management tool, in that all the ground is covered. Pre-AI, “you never really knew, from a data standpoint, all that was happening,” Olesky said. “Now everything is captured, and in a more streamlined way.”
To be sure, there are still skeptics regarding AI’s ultimate utility in financial services, and large buy-side institutions are generally not first movers when it comes to new technology. But AI tools such as Tradeweb’s AIEx Automated Trading have the wind at their back in terms of an open-mindedness to technology that wasn’t there when the firm first came onto the scene in the late 1990s and made the bold decision to use the internet as its backbone.
“Over the last 10 years, the application of technology to make business more efficient has been a focus,” Olesky said. “People are looking to use software and technology to streamline a process. I don’t think there’s resistance anymore to electronic trading.”
Industry group looks further down the post-trade chain.
Some firms have already started working towards the new FIX Trading Community standard for periodic payments which aims to increase efficiency and reduce errors.
Laurence Jones, co-chair of FIX Trading Community’s global post-trade working group and director, post-trade strategy at Traiana, part of CME Group, told Markets Media that the non-profit, industry-driven body was originally launched to set messaging standards around executions. In the last five years there has been an increasing focus on post-trade, including allocations and confirmations across asset classes.
#FIX expands its presence into Post-Trade Processing. Joining the Chicago Regional Meeting tomorrow? Listen to representatives from Itiviti, Citi, ArcadiaSoft, Blackrock and Traiana on the new FIX standard message for Post Trade Payment Management https://t.co/dsO4CW2ezY
Jones said: “The processes surrounding the confirmation of periodic payments have historically involved a large number of manual steps, so FIX aimed to set standards, which had never been done before.”
He explained that the process for confirming the payment sits upstream from any physical instruction or cash movement, so parties must ensure they agree on the amounts prior to any payment instruction for settlement to occur. For example, an expected coupon payment on a swap contract, once calculated, needs confirming between the trading parties.
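To make that concrete, a minimal sketch of the check each party performs independently before any instruction is released; the fixed-leg math and ACT/360 day count are assumed conventions for illustration, not the FIX message format itself:

# Why confirmation matters: each party independently calculates the
# expected coupon, and the amounts must match before any payment
# instruction is released. (Generic fixed-leg math, assumed ACT/360.)
def fixed_leg_coupon(notional: float, rate: float, days: int) -> float:
    """Expected coupon for one period under an ACT/360 day count."""
    return notional * rate * days / 360

party_a = fixed_leg_coupon(notional=100_000_000, rate=0.025, days=91)
party_b = fixed_leg_coupon(notional=100_000_000, rate=0.025, days=91)

# Only an exact match should release the instruction downstream.
assert round(party_a, 2) == round(party_b, 2), "break: investigate before paying"
print(f"confirmed coupon: ${party_a:,.2f}")  # confirmed coupon: $631,944.44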
Laurence Jones, Traiana
“Adopting a standard brings huge operational and technological efficiencies, and can help reduce errors,” added Jones. “With the regulatory focus that continues to loom across post-trade processing, this is incredibly important.”
For example, Central Securities Depositories Regulation (CSDR) in the European Union imposes penalties for settlement fails for certain transactions.
Jones continued that conversations started in 2017, as the process for confirming periodic payments is heavily manual and can incur huge operational and technical costs, but change needed the involvement of the buy side, sell side, vendors and service providers.
“Some firms have already started working towards the new standards and some have started testing,” he added. “There will be a ramp up in adoption, and I’d expect to see it pick up over the next three to six months.”
In the first phase the messaging standard will be rolled out to a subset of payment types and will be done on a gross basis. Jones added it is possible that netting will be introduced in the future.
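The gross-versus-net distinction is easy to see in miniature (figures invented for illustration):

# Gross versus netted settlement, sketched. Positive = party A receives;
# negative = party A pays. Under the first phase each confirmed payment
# moves individually; netting would collapse them into one net amount.
confirmed_payments = [631_944.44, -580_000.00, 12_500.00]

gross_instructions = len(confirmed_payments)         # 3 separate movements
net_instruction = round(sum(confirmed_payments), 2)  # one movement: 64444.44
print(gross_instructions, net_instruction)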
Jones said: “It’s incredibly exciting to see the industry continue to collaborate and be part of improving a process, especially when it offers a whole range of benefits across liquidity management, STP and control. In a time where regulation dominates a lot of decision making in this industry, that’s incredibly important.”
Lou Rosato, co-chair of the FIX Trading Community America’s regional committee and director in the global investment operations group at BlackRock, said in a statement: “The natural next step is to continue looking further down the post-trade chain into asset servicing and reporting standards which help the industry reduce risk and operate more efficiently.”
New products need to be developed and the speed of change needs to accelerate sharply to meet the deadline for replacing Libor and avoid a major market disruption.
Consultancy Oliver Wyman said in a report that a huge amount of work remains to transition the $240 trillion (€214 trillion) US dollar Libor market to alternative risk-free rates before the benchmark stops being published. The Financial Conduct Authority said two years ago that the UK regulator will not compel panel banks to submit to Libor beyond 2021.
After the financial crisis there were a series of scandals regarding banks manipulating their submissions for setting benchmarks across asset classes, which led to a lack of confidence and threatened participation in the related markets. As a result, regulators have increased their supervision of benchmarks and want to move to risk free reference rates based on transactions, so they are harder to manipulate and more representative of the market.
Oliver Wyman said a firm date for the discontinuation of Libor would probably be the biggest accelerator of change but this is unlikely in the near term. In addition, the report set out actions for regulators, banks, clearing houses and benchmark administrators.
“The actions will have to happen in parallel if the mammoth task of transitioning away from Libor is to be completed by the end of 2021,” added the report. “Further delays raise the risk of a market-wide dislocation, as well as economic, conduct and operational impacts for individual market participants.”
The consultancy said banks need to stop procrastinating and develop loan products based on risk free rates, despite only backward-looking RFRs currently being available.
“Banks cannot afford to wait due to the lead time to transition legacy contracts,” said the report. “Without proper preparation, banks face a number of material risks that include unanticipated operational risks, potential value transfers which can lead to huge gains or losses when fallback clauses come into force, and considerable conduct risk that could result in reputational damage, fines, and lawsuits.”
FCA
Andrew Bailey, chief executive of the FCA, said in a speech in New York last week that markets in the risk-free alternatives to Libor are continuing to develop.
The US has adopted the secured overnight financing rate, SOFR, as its risk free rate and Bailey said open interest in SOFR futures has grown to close to half a trillion dollars. The UK has chosen the sterling overnight index average, Sonia, and Bailey added that open interest in Sonia futures was £129bn ($161bn) by the end of last month from close to zero in April 2018. However Oliver Wyman noted these are only about 1% of Libor futures volumes.
In the swaps market, SOFR-referencing swap transactions are small but growing while the notional of outstanding cleared Sonia swaps is now more than £10 trillion.
“In the first six months of this year, Sonia accounted for a little over 45% of notional swaps trading in sterling,” added Bailey.
He said the growth in Sonia swaps volumes is particularly notable at longer maturities of more than two years. Bailey continued that Sonia is now the market norm for new issuance of sterling floating rate notes.
“Since the first new Sonia-referencing issue in a decade – just over a year ago in June 2018 – we have now had £28bn of new issuance referencing Sonia,” said Bailey. “Issuance of US dollar bonds referencing SOFR, is even more substantial, having reached $135bn by end-June.”
The first UK securitisation referencing Sonia was completed last December, but two thirds of new public deals referenced Sonia in the second quarter of this year.
However Bailey continued that transition also involves converting old Libor contracts, which is hardest in bond markets as consent solicitations are required.
Andrew Bailey, FCA
“But last month one UK issuer, Associated British Ports (ABP), proved it can be done,” said Bailey. “Both issuers and investors can benefit from conversion. The successful ABP consent solicitation sets a useful precedent and model that others can follow.”
However, he warned that the loans market is least advanced in transition and progress is needed in the year ahead.
The Loan Market Associations in Europe and the US are developing new standardised documentation for syndicated loans referencing overnight RFRs rather than forward-looking Libor term rates.
“Delivery of that documentation – and its use – will be key next steps in the transition,” said Bailey.
Floating rate notes
The International Capital Market Association, a trade body, said in its third quarter report that new public issues of floating rate notes maturing beyond 2021 have nearly all stopped referencing sterling Libor.
The transition to risk-free rates is a global challenge. It will only succeed if the authorities and market participants work together. Read more about progress in different jurisdictions here https://t.co/hNA58sPDXo
ICMA said £20.5bn of Sonia floating rate notes were issued in the first half of this year, compared with £6.9bn in the second half of last year. There were 35 new FRN transactions referencing Sonia in the first half of this year, up from 12 in the second half of 2018; an increasing amount of secondary market activity; and more than 180 investors.
Paul Richards, head, market practice and regulatory policy at ICMA, said in the report: “As a result, new public issues of FRNs referencing sterling LIBOR maturing beyond the end of 2021 have all but ceased.”
In addition, securitisations referencing Sonia have also begun to take place, with Nationwide, a UK financial institution, launching the first securitisation referencing Sonia distributed to investors in April this year.
ICMA also noted there is a significant volume of bonds with maturities beyond 2030, and some perpetual bonds that need to be converted.
“The use of consent solicitations to transition the whole of the legacy bond market – involving vanilla FRNs, covered bonds, capital securities and securitizations – would be a long, complex and expensive process and would not necessarily be successful,” added ICMA. “This is because individual bonds – which are freely transferable – are often held by many (eg several thousand) investors, and consent thresholds are generally high.”
Alternative RFRs
Raja Khuri, BBH corporate banking credit analyst, wrote on the bank’s On The Regs blog that many recently completed or amended transactions, and certain prior transactions, include flexible language that specifically allows for the determination of an alternative reference rate. However, there are also many syndicated facilities with Libor-based pricing that do not include these provisions.
“This lack of fallback language exposes lenders to undetermined or undeterminable interest rate risk if their facilities are not flexible enough to adopt an alternative rate,” Khuri added. “This would likely require multiple simultaneous amendments to any impacted facilities, posing a significant operational challenge to lenders.”
Khuri highlighted that SOFR is an overnight money market repo rate, while the most widely used Libor rates in US dollars are 30-, 60- and 90-day rates. The analyst noted that Richard Sandor, founder of the Chicago Climate Exchange, launched the American Financial Exchange in 2015 partly to address the need for a suitable replacement benchmark rate. The AFX allows small banks to lend to and borrow from each other under mutual lines of credit overnight or for 30-day terms and has created its own benchmark, dubbed Ameribor.
“Ameribor has tracked at a seemingly consistent buffer above overnight Libor,” said Khuri. “It also reflects the unsecured interbank character that Libor is intended to represent.”
However, he also explained that the breadth of Ameribor’s sample will need to expand in order to serve as a true Libor replacement.
ICE, Libor’s current administrator, also intends to publish the ICE Bank Yield Index early next year. This benchmark will be a term rate and use transaction data reported to the Financial Industry Regulatory Authority submitted by 13 large, internationally active banks.
Khuri warned: “Despite a lack of definitive solutions as this deadline approaches, borrowers and lenders should not ignore the situation. The disappearance of one of the world’s most important interest rates will no doubt affect banks and companies alike.”
Buy versus build? It’s the age-old technology question on Wall Street.
In a modest commission environment where both the buy- and sell-side are making do with less and technology needs in bonds are only growing, should firms build their own technology or employ specialty vendors to help them? While it seems a question more suitable for the accountants and financial officers, Xinthesys Founder & Chief Executive Officer Jeff Stein discussed the benefits of buying versus building software, data workflow platforms and how to manage consolidated audit trail data.
TRADERS MAGAZINE: What are some of the benefits for organizations of considering a data workflow partner vs. building a solution in-house?
Jeff Stein, Xinthesys

Jeff Stein: Over the years the financial services industry has had a love-hate relationship with the concept of build vs. buy. Years ago, many believed that having the best software development shop in-house gave you a competitive advantage. In recent years, many have successfully deployed very sophisticated software platforms from external vendors, particularly in commoditized functions. This buy vs. build pendulum swings every two or three years, and given the steady decline in trading revenues, many banks and brokers are reevaluating where and how they spend their tech budget.
Should you be spending dollars in areas that provide no inherent competitive advantage or that are not revenue generating? There is a trend among financial institutions to look for ways to reduce spend in non-revenue generating areas to better allocate resources for revenue-generating areas like trading infrastructure, algorithms, pre-trade analytics and other new product offerings.
When it comes to regulatory reporting such as CAT, why build a platform that will generate zero revenue for your firm? On the contrary, many firms are looking to partner with the right firm to reduce risk for post-trade workflows like CAT.
Partnering with the right firm for regulatory reporting, at half the cost and half the risk, just makes fiscal sense, regardless of how large you are and how much tech budget you have.
TM: What key characteristics should CTOs consider when looking for regulatory data platforms?
Stein: That’s a tough question because in our experience it’s not only the CTO who is involved in a decision for a platform. These days, COOs, CROs, CCOs and heads of trading are equal stakeholders in these decisions and it’s important to recognize different perspectives and joint responsibility for success in these types of endeavors.
The upcoming CAT implementation has caused many C-suite executives to rethink how platforms are implemented. The following elements are top of mind:
• Delivery risk
• Ease of the implementation
• Impact to the organization
• Access to data
Delivery risk is an important factor, as many of us have seen large projects fail, consume a significant amount of resources and, even if implemented, miss the mark of the initial desired value. That is why, in our view, smaller is better. And as we know from software implementations, agile surpasses long delivery timelines; this has been proven successful in many disciplines.
So, when looking at implementation some of the key things to identify are:
• How easily does the platform integrate into the existing environment?
• How quickly can it demonstrate value and proceed on the right track?
• How easily and quickly can changes be made? (i.e., can your requests be measured in days, weeks or months?)
• Are you dependent on a vendor to make changes or can you exercise self-help and make changes yourself?
In our view, easier implementations come from systems that leverage a firm’s existing assets and systems, with short delivery timelines measured in days, not months.
Assessing the impact of a program like CAT on the organization is complex, as it cuts across many disciplines: the front office, operations, compliance and technology. Each of those areas will be affected. Therefore, for firms that are already wrestling with existing programs in regulatory or trade processing, these parts of the organization will be stretched even further. From our perspective, selecting a vendor that has best practices in this space can significantly reduce or even eliminate the impact on the organization.
All that said, it’s always about the data. How easily you can access the data to answer questions or respond to queries by internal and external stakeholders should be a primary factor in your decision.
Without getting too technical, approaches generally fall into two categories, a “layered approach” and an “integrated approach.”
In an “integrated approach,” a system integrates with and is fed by an existing data warehouse for the trades it must report. For many firms this can be quite challenging. The reason is that most data warehouses in banks are specialized and undergoing continual change. Moreover, most have been engineered and re-engineered over decades. Keeping this integration working requires constant care and feeding. Most solutions fall into this category.
In a “layered approach,” a system integrates directly into the trading streams and feeds a data warehouse. This approach is not dependent on the firm’s data warehouse for trading data but is rather the source for it. It may be necessary to pull static data such as codes, accounts and classifications, but this minor dependency too can be eliminated in some systems, so as not to create a day-to-day dependency. This is the approach we have taken with ADEPT, and why the integration and implementation are far easier.
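A schematic of the layered pattern in Python, with invented names rather than the actual ADEPT implementation, might look like this:

# Layered approach, sketched: the reporting layer consumes executions at
# the source (the trading stream) and itself feeds the warehouse, so the
# warehouse is a downstream consumer rather than an upstream dependency.
# (Hypothetical structures and names; not the ADEPT implementation.)
def normalize(raw: dict) -> dict:
    """Map a raw execution record to the regulator's reporting schema."""
    return {"ts": raw["time"], "sym": raw["symbol"], "qty": raw["qty"]}

def layered_report(trade_stream, warehouse: list) -> list:
    events = [normalize(evt) for evt in trade_stream]
    warehouse.extend(events)  # the reporting layer feeds the warehouse
    return events             # ...and answers regulator queries directly

stream = [{"time": "09:30:01", "symbol": "XYZ", "qty": 150}]
warehouse = []
print(layered_report(stream, warehouse))
# An "integrated" system would instead query the warehouse for its input,
# inheriting every schema change the warehouse undergoes.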
It is from this perspective that your CTOs and technology architects can guide you on the merits and difficulties of either of these two approaches at your firm.
But the bottom line is this: you must be able to respond to regulator queries within a 24-hour period. Therefore, any solution you choose should put data access in your hands, and not strictly your vendor’s.
TM: How does Xinthesys’ solution differ from other big data systems?
Stein: Our solution, which is named ADEPT (Agile Data Engineering and Processing Technology), is both a software platform and a set of tools to empower data scientists, analysts and technologists to quickly create and deploy data engineering pipelines.
We have used, and are using, this platform to create data workflows for CAT, the OATS end-of-life, Rule 605/606 and Bluesheets reporting, with many others coming over the coming months. Our vision is to allow anyone who understands data, whether or not they are a developer, to create any data workflow imaginable and to run those workflows at scale on a public or private cloud platform or on self-hosted infrastructure.
We believe that we have a unique approach and platform that will bring “Agile” data engineering to the post-trade environment.
With ADEPT, it’s all about empowering users, not overpowering them or holding them hostage. Our vision is to provide the tools and technology that will enable any firm to reduce the risk of data-intensive operational needs.
TM: What is Xinthesys doing to attract customers as the deadline for CAT nears?
Stein: For starters, we are offering a no-obligation informational and assessment session to any firm that is interested in discussing best practices in operational efficiency around trade reporting, specifically around some of the challenges that the CAT brings and managing the OATS end-of-life. These sessions are designed to better understand the operational complexity of an organization and the challenges or potential ‘gotchas’ that will be factors in becoming compliant with the CAT.
Secondly, we are excited to introduce our new program called CAT Lite. It is a trimmed-down version of our larger CAT offering. By design, it is a low-cost, super easy implementation that enables a firm to set up, test and report on a daily basis to the CAT within as little as two weeks, and it is available today.
CAT Lite is targeted at those broker-dealers, small or large, that find themselves in a risk position, are uncertain about the direction they should take, or would like to take an evolutionary path towards managing through the CAT deadlines. It is a solution which bypasses the need for a large upfront expense, a complex implementation and the tying up of precious firm resources. It leverages your existing trading infrastructure, connectivity and data.
There are also incentive programs available if you are on some of the larger FIX networks, such as AUTEX Trade Route (ATR), by Refinitiv.
TM: Aside from CAT, who are you targeting? Is Xinthesys focused only on financial services?
Stein: Yes. For the foreseeable future our focus is on our core competency, the financial services industry, more specifically around pre- and post-trade. That said, we have been approached by organizations that service other industries such as research, government, healthcare and insurance that are interested in taking our technology and applying it to those industries. We will gladly license the technology but leave it for others to develop those markets.
Later this year we will be releasing a new product that we’re super excited about. We believe it will be a game changer in the quantitative/portfolio analytics and financial engineering disciplines. It is called ADEPT Models™ and is targeted at buy- and sell-side firms engaged in portfolio analytics and trading signal generation. We will be sure to keep Traders updated when it becomes generally available.