It could be a very green Christmas season for the Kingdom of Saudi Arabia.
On Saturday, Aramco released its prospectus detailing the much-anticipated IPO of the global oil giant. While the document did not state exactly how much would be for sale or to whom, its release confirms the company will go public shortly, with a possible trading date of December 11.
According to the prospectus, first seen by the AP, Aramco revealed that it will sell up to 0.5% of its shares to individual retail investors. It did not indicate how much will be made available to institutional investors. The IPO will be the largest in history.
Saudi Aramco is the kingdom’s oil and gas producer, pumping more than 10 million barrels of crude oil a day, or some 10% of global demand.
Aramco has previously said it plans to hold the initial sale on Saudi Arabia’s own Tadawul stock exchange, but some observers have suggested shares could also be made available via other global exchanges such as the FSE or even a U.S.-based bourse.
In the roughly 650-page prospectus, Aramco said the offering period for investors will begin Nov. 17. It will close for individual investors on Nov. 28 and for institutional investors on Dec. 4. Aramco will price its shares on Dec. 5, according to the document.
The company said it plans to pay out an annual dividend of at least $75 billion starting in 2020, but questions linger over how much Aramco is worth.
Build it better and they will come – trades, that is.
Trading is now a highly complex and data-driven activity. U.S. equity markets comprise dozens of execution venues, each with different liquidity characteristics, order types, fee structures and latency differentials. Trading algorithms are an essential tool for navigating these seas of liquidity, but execution performance can vary significantly from one algo to the next.
The responsibility for Best Execution falls squarely on the shoulders of traders, who are being given substantial flexibility by portfolio managers. For 82% of orders, traders have full discretion over execution strategy. To determine appropriate execution strategies and algos, traders are relying on increasingly sophisticated execution data, which is now a vital tool on trading desks.
“Trading in this new environment both requires and produces vast amounts of data,” says Richard Johnson, Principal for Greenwich Associates Market Structure and Technology and author of Peak Performance: What the Buy Side Expects from their Algos. “The key to achieving peak performance with algos is knowing which data points matter most.”
The new Greenwich Report represents the most comprehensive study of algo key performance indicators (KPIs) to date and helps traders optimize algo performance by identifying and analyzing the most important components of execution quality. Nearly 100 buy-side traders and market structure experts reflecting over $6 trillion in AUM and over $800 million in commissions participated. It reveals how traders rate the importance of factors like impact, reversion, information leakage and adverse selection for liquidity-seeking, implementation shortfall, VWAP, and other types of algos.
Customized Algos
The increase in execution venues, order types and tactics, coupled with advancements in data analysis and TCA, has driven many brokers to customize the algos they offer. The buy side is keen to engage the sell side in more algo experimentation and customization: approximately 60% of buy-side traders in the U.S. are involved in algo customization in some capacity. They are increasingly focused on how to generate better performance, with an appetite to dive into the details of venue selection, algo KPIs and algo customization. The goal: to gain more control over their order flow and optimize trading performance.
Measuring Success
The buy side is clearly focused on slippage vs. arrival price across different types of algo strategies, with Price Impact being the most meaningful KPI for better algo performance. Nearly one quarter of traders perform venue preferencing as part of their algo customization, based on the results of cost analysis or other qualitative reasons.
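To make the arrival-price comparison concrete, below is a minimal sketch of how slippage versus arrival price is commonly expressed in basis points. The function name, sign convention and example values are illustrative assumptions and are not drawn from the Greenwich report.

```python
# Minimal sketch: slippage of an execution vs. the arrival (decision) price, in basis points.
# Sign convention (an assumption): positive means the fill was worse than arrival for that side.

def slippage_vs_arrival_bps(avg_fill_price: float, arrival_price: float, side: str) -> float:
    raw = (avg_fill_price - arrival_price) / arrival_price
    signed = raw if side == "buy" else -raw  # for a sell, a lower fill price is worse
    return signed * 10_000

# Example: buying at an average fill of 100.05 against an arrival price of 100.00
print(slippage_vs_arrival_bps(100.05, 100.00, "buy"))  # ~5.0 bps of slippage
```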
“Brokers should feel more comfortable discussing algo enhancements and A/B testing,” says Richard Johnson. “After all, the objective is to improve performance for client orders, and this type of discussion and partnership could enhance the overall relationship whether or not clients are open to experimenting.”
A new survey of 2,500 UK citizens between the ages of 18 and 65+ reveals opinions on Bitcoin ownership, with some interesting results. The study was launched by Crypto Radar, one of the premier blogs for crypto-related news, press releases, and content related to Bitcoin and other altcoins.
The survey asked 2,500 residents of the United Kingdom the following question:
What’s your stance regarding Bitcoin investing?
I do NOT own any. NOT Planning to buy any.
Never heard of Bitcoin before.
I do NOT own any. Planning to buy some.
I own some. NOT planning to buy more.
I own some. Planning to buy more.
According to the findings, when asked their stance on investing in Bitcoin, 67.5% of UK respondents indicated that they neither owned nor planned to buy any. Notably, when demographic filters were applied focusing on individuals aged 65 and over, the percentage jumped to 76.2%, and it increased further to 77.9% among males of retirement age.
Based upon the results of this survey, the present uncertainties surrounding Brexit appear to have made British investors far less likely to choose riskier investments such as Bitcoin, an asset class that is still in its nascency and highly volatile. This is especially true for British investors in their retirement years, for whom a safe nest egg is imperative.
The next most popular response among UK respondents, at 20.3%, was never having heard of Bitcoin. Interestingly, when the results were filtered to males between 25 and 34 years old, the percentage soared to an astounding 25.5%.
That in 2019 over a quarter of the young professional British males surveyed, not to mention the full 20.3% of all UK respondents, were unaware of Bitcoin ought to give the crypto community pause. These survey results demonstrate the need for more education by the Bitcoin community to inform investors, especially male investors under 35.
Of the UK respondents, 6.8% indicated that they did not own any Bitcoin but were planning to buy some. When the results were filtered to British investors between 35 and 44 years old, the percentage rose to 8.9%, and it increased even further, to 11.6%, among males of this cohort. Given that this demographic is generally established in a career path with disposable income, it is understandable why they are the least risk-averse toward a highly volatile investment like Bitcoin, with its potential for massive appreciation.
“Regardless of its enormous volatility, Bitcoin is a very attractive asset class for those investors, especially younger investors, who are willing to ride the volatility to tremendous gains,” stated Amine Rahal, CEO of Crypto Radar.
According to the results, a combined 5.3% of UK respondents to the survey already own Bitcoin, with those percentages rising even further among British investors between 35 and 44 years old.
As the Australian Securities Exchange (ASX) moves to replace its Clearing House Electronic Sub-register System (CHESS), market participants look to what else distributed ledger technology can deliver.
Distributed ledger technology (DLT) in capital markets is at an in-between stage of its development: it has become clear that it won’t live up to its most exuberant early expectations, but the future that is coming into focus is promising for its feasible and viable applications.
“There has been an evolution in thinking across the industry with the focus shifting to the practical in terms of how this technology can be applied and what it can achieve,” said David Braga, CEO of BNP Paribas Securities Services Australia & New Zealand.
Braga spoke at a GlobalTrading Thought Leadership Roundtable entitled DLT: Beyond Settlement. The event took place October 3 at the ASX office in Sydney.
Key roundtable takeaways, part 1. Interviews with David Klinger, SpringCapital Investment and David Braga of BNP Paribas Securities Services:
Synchronise, Digitise
The ASX is moving forward with a high-profile, industrial-scale blockchain use case to replace CHESS, its post-trade clearing and settlement system. Peter Hiom, Deputy CEO of ASX, noted that CHESS was developed more than 25 years ago and while it still performs well, it is not the right platform to move into the future with.
“Having spent a significant amount of time exploring the capabilities of DLT, we came to understand its value in synchronising data and digitising processes,” Hiom said. The new platform, which is being developed by ASX in partnership with VMware and Digital Asset Holdings, “will provide the market with a contemporary clearing and settlement system with improved functionality, enhanced efficiency and less risk.”
Added Hiom: “It will also provide the option for our customers to use the DLT capabilities of the system to build innovative services that create efficiencies and new revenue opportunities.”
Fil Mackay, Digital Asset
That part is important, as Hiom noted that the CHESS replacement isn’t just an internal ASX project; it will also serve as a jumping-off point for subsequent industry DLT initiatives. “There is an exciting opportunity to standardise processes, and market infrastructure providers such as ASX play a key role in assisting our customers realise the benefits of using DLT, which can collapse complexity, drive out costs and unleash innovation,” he said.
Smart contracts – computer protocols intended to digitally facilitate, verify, or enforce the negotiation or performance of a contract – are a critical, and often underappreciated, aspect of DLT. That’s according to roundtable participants including Braga and Fil Mackay, Head of Engineering – ANZ at Digital Asset.
Peter Hiom, ASX
Braga continued, “We have a great opportunity to better streamline data and enable clients a real-time view of the data. To date, the processes have had a strong reliance on a variety of individual validation processes. In the new dynamic DLT, process and data are tightly coupled. The data can trigger the process itself. So we therefore have a lot more certainty over that code and how it’s described, because effectively that’s now the agreement.
“The challenge going forward will be for collective agreement on the processes that are coded in the software. Right now we’re still exploring the ideas of what it can do for us as a business and our clients and then we will need to see how we integrate this into our contracts,” said Braga.
Key Roundtable takeaways, part 2. Interviews with Peter Hiom, ASX; Mark Wootton, BNP Paribas Securities Services, and Fil Mackay, Digital Asset:
Privacy and Integrity
By sharing data only on a need-to-know basis, DLT provides the privacy that a pure blockchain does not. And DLT’s advantage over a traditional database is that it provides integrity via smart contracts, obviating the need for reconciliation processes. “We don’t think you need to choose between integrity and privacy,” said Digital Asset’s Mackay. “You can have both.”
“In traditional systems today, two institutions agree to processes based on documents, and each builds it in their own technology stack,” Mackay explained. “While we logically expect such systems to agree with one another after, there is no guarantee since specifications are often ambiguous and imprecise.”
“Today institutions agree to messages (data) via XML and similar technologies, but they generally do not have an equivalent way to describe or agree to a process,” Mackay said. “DAML fills that gap by providing a mechanism for institutions to agree to definitions of a process, that each party can run independently.”
The roundtable discussion emphasised that there is much more to DLT than the CHESS replacement, and capital markets firms should be at least starting to assess where, how and when digital-transaction recording systems can add efficiency, reduce risk, and take out costs.
“There are many use cases across the industry that DLT technology can assist to solve,” said Mark Wootton, Head of Custody Product for BNP Paribas Securities Services Australia & New Zealand. “Even in today’s ecosystem there are time-consuming and paper-based processes that the use of combined DLT and smart contracts can solve to deliver benefits of all the players in the value chain.”
The road to test, implement and optimise DLT is a long one, and while year-on-year changes can be slow to develop, some market participants expect a dramatic end result.
Stephen Donaghy, UniSuper
“Future generations will look back on the current database structures that underpin our financial markets with the amusement with which my generation looks at the use of the fax machine as the means of sending trades between firms,” said Stephen Donaghy, Senior Investment Analyst at UniSuper Management Pty Ltd.
Added Donaghy, “As the ASX introduces distributed ledger technology to replace the aging CHESS, UniSuper is excited to explore the opportunities that this new technology offers and looks forward to working with the ASX and other key market participants in making these opportunities become a reality.”
Banks are using compression to help them transition derivatives contracts away from Libor, as the benchmark interest rate is expected to be discontinued after the end of 2021.
After the financial crisis there were a series of scandals regarding banks manipulating their submissions for setting benchmarks across asset classes, which led to a lack of confidence and threatened participation in the related markets. As a result, regulators have increased their supervision of benchmarks and want to move to risk-free reference rates based on transactions, so they are harder to manipulate and more representative of the market.
Edward Ocampo, Quantile Technologies
Edward Ocampo, advisory director at Quantile Technologies, told Markets Media: “Compression plays an important role in Libor transition by allowing firms to terminate Libor-linked cashflows, collapse Libor-Overnight Indexed Swap basis positions, and rebuild portfolios onto OIS referencing the new risk-free rates.”
Quantile was launched in 2015, and its compression and margin optimisation services are used by London Stock Exchange Group’s LCH and by CME. Ocampo joined Quantile last year from the Bank of England, where he was a senior advisor and led work to develop and promote alternatives to Libor, as well as contributing to the Fair and Effective Markets Review. He also previously worked at Morgan Stanley for 24 years.
Compression allows clients to “tear up” offsetting trades to reduce the notional outstanding and the number of line items in their portfolio while maintaining the same risk exposure. Use of compression services has increased following the introduction of stricter capital requirements, such as the Basel III leverage ratio, which has led to banks reducing their balance sheets and to capital efficiency becoming increasingly important.
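As a rough illustration of the idea only (not Quantile’s methodology, which runs a multilateral optimisation across many counterparties subject to each firm’s risk tolerances), the toy sketch below nets trades with identical economics and opposite directions into a single line item, cutting gross notional while leaving net exposure unchanged. The bucketing fields and example book are hypothetical.

```python
# Toy sketch of compression by netting offsetting trades per economic bucket.
# An illustrative simplification, not a production compression algorithm.
from collections import defaultdict

def compress(trades):
    """trades: list of (counterparty, index, maturity, signed_notional); + pay, - receive."""
    net = defaultdict(float)
    for cpty, index, maturity, notional in trades:
        net[(cpty, index, maturity)] += notional
    # Keep one line item per bucket with a non-zero net notional.
    return [(cpty, index, maturity, n) for (cpty, index, maturity), n in net.items() if abs(n) > 1e-9]

book = [
    ("BankA", "GBP-LIBOR-3M", "2027-06", +100e6),
    ("BankA", "GBP-LIBOR-3M", "2027-06", -60e6),
    ("BankA", "SONIA",        "2027-06", +25e6),
]
print(compress(book))
# Three line items and 185m gross notional become two line items and 65m gross;
# the net exposure in each bucket is unchanged.
```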
Ocampo continued that Quantile can also help firms move uncleared Libor positions into central counterparty clearing houses.
“This can materially facilitate transition because cleared positions are easier to trade and easier to compress,” he added. “We have seen increased interest in this service since the PRA notified regulated firms that they would need to report their Libor exposures on an ongoing basis.”
The UK Prudential Regulation Authority is part of the Bank of England and supervises more than 1,500 financial institutions including banks and insurance companies.
Quantile’s sterling compression service already enables transition from Libor to Sonia, the UK’s chosen risk-free rate. “We are encouraging clients to maximise their participation in these runs,” Ocampo said.
Progress on transition
He continued that transition is most advanced in the sterling bond markets as issuance of Libor instruments has ceased in the primary market for floating rate notes and residential mortgage-backed securities.
“Progress on transition in derivative markets is mixed – half of new contracts reference Sonia, but sterling Libor still dominates in long-dated swaps,” added Ocampo.
Derivatives analytics provider Clarus Financial Technology reviewed Sonia swap traded volumes relative to sterling Libor at different tenors.
Clarus said on its blog: “If markets are to transition from Libor to Sonia before 2022 (when Libor is expected to cease publication) the longer tenors (i.e. longer than two years) will need to show substantial growth in Sonia volumes relative to Libor.”
The review found no evidence of a transition from Libor in the critical five and 10-year tenors which are favoured by debt issuance and loans and said this lack of urgency remains a mystery.
“The missing component could be cash trades; for example bonds, loans and securitised products,” added Clarus. “The risk generated from these trades has historically created a need to hedge with derivatives which, in turn, drives a secondary derivative market to spread the risk.”
In addition, speculation is a major component of longer-dated derivative trading volumes. Clarus said: “We will need to see a significant migration of this activity to Sonia if volumes are going to exceed Libor turnover.”
Bank of England
The Risk Free Rate Working Group, convened by the Bank of England and the UK Financial Conduct Authority, has set the third quarter of next year as the date for stopping the issuance of new GBP Libor-based products.
In September the Bank of England warned that firms are continuing to write new Libor contracts that mature beyond 2021.
With Libor expected to be discontinued after end-2021, the use of alternative interest rate benchmarks is steadily increasing. But use of Libor remains widespread, and this poses risks to market stability. https://t.co/8KgzL9XzKL #BankOverground pic.twitter.com/PDH5uRy8X3
The study highlighted that loan markets are still dominated by Libor-linked lending and that many new long-dated derivative contracts continue to reference Libor.
“Firms now need to focus on shifting new business from Libor to alternative rates, and should put in place a clear transition plan to mitigate their legacy risk from older contracts,” said the Bank of England. “In June 2019, the PRA and the FCA published guidance for firms on the issues they need to consider when making a transition plan.”
European Central Bank
Ocampo continued that the transition is simpler in euro markets.
“EONIA has been redefined as €STR plus a fixed spread of 8.5 basis points, so there is no basis risk going forward,” he added. “Clients will still want to terminate EONIA trades and replace these with €STR to reduce operational and legal risk.”
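To illustrate the arithmetic of the redefinition, a floating leg paying EONIA plus a spread can be restated on €STR simply by folding the fixed 8.5 basis points into the leg’s spread, leaving cashflows unchanged. The sketch below is a minimal illustration; the function name and example spread are hypothetical.

```python
# Minimal sketch of the EONIA = €STR + 8.5bp redefinition applied to a floating leg.
EONIA_ESTR_SPREAD_BPS = 8.5

def restate_eonia_leg_on_estr(spread_bps: float) -> float:
    """Equivalent spread over €STR for a leg that pays EONIA + spread_bps."""
    return spread_bps + EONIA_ESTR_SPREAD_BPS

# Example: a leg paying EONIA + 12bp is economically €STR + 20.5bp.
print(restate_eonia_leg_on_estr(12.0))  # 20.5
```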
The ECB began publishing its new benchmark interest rate €STR, the euro short-term rate, for the first time last month.
The euro short-term rate (€STR), a new benchmark interest rate, has been published for the first time. The rate reflects the wholesale euro unsecured overnight borrowing costs of banks located in the euro area. The €STR is available here https://t.co/BUp2OidOjS #EuroSTR pic.twitter.com/b14OFa6YHP
Exchanges can justify high fees – but only if they are adding value!
By Neill Vanlint, Head of Global Sales & Client Operations, GoldenSource
Neill Vanlint, GoldenSource
Across global financial markets, it is hard to think of a more contentious issue right now than the prices exchanges are charging for their trading data. Some high-profile potential acquisitions, including the London Stock Exchange’s $27bn bid to buy Refinitiv, mean that there could soon be more market data monopolies than ever before. Let’s face it: regardless of the provider, market data has never exactly been cheap, but prices are getting higher for traders. And with just a few firms currently dominating, there is no question that banks need to ensure they are getting more for their money.
Prior to the financial crisis, lower capital requirements meant trade volumes were booming and the markets were awash with cheap data. This is a far cry from today, with more stringent capital rules putting a real squeeze on risk taking. As a consequence, trade volumes are down, and a reduction in income from trading activity has forced exchanges to develop additional revenue streams.
At the same time, regulation among other things is requiring firms to consume broader data sets more frequently.
From fancy co-location servers for high-speed traders to trendy trading software to stamp out market abuse, exchanges go way beyond being a place for companies to raise money by listing securities. However, these new services do not change the fact that a continued tightening of belts means that financial institutions are not going to jump on the idea of paying excessive data fees unless the exchanges can demonstrate they are adding value.
Despite all the developments and sophistication of the data sets and the technology used to publish them, it remains an indisputable fact that the information distributed by the exchanges is never immediately fit for use. It still needs to be checked, repaired, validated and organised before it can contribute to analysis or reporting by the various teams that use it, such as traders and risk managers. And herein lies the problem – all these operational data niggles feed into the overall cost. But even if there is an enforced cap on market data prices from exchanges, or even if banks ‘pool’ their data collectively and make it available for less, this still won’t make a drastic difference.
The only way that investment firms can plan for and manage data costs from exchanges is to underpin their market data strategy with an infrastructure that makes data operations as efficient as possible, and oversee it with data governance policies that ensure the data is being used judiciously. From the perspective of an exchange, members continue to demand access not just to pricing, but reference and corporate actions data from listed firms, regardless of the cost. The trouble is that in the modern world of equity trading, members are increasingly dependent on small basis point movements and instantaneous reaction. As such, there has never been a greater need for exchanges to provide accurate, all-encompassing market data. After all, if the price of the data goes up, so too should the quality.
Moving forward, in an attempt to make better-informed decisions, analysts and traders are constantly seeking out alternative data sets. And there is no question that better quality data is at the heart of the LSE’s interest in Refinitiv. Regardless of the other drivers behind this potential acquisition and others before it, the reality is that a smaller group of very dominant exchanges has greater data firepower at its disposal. With this in mind, financial institutions need to remember one thing – data is a premium staple rather than a commodity. It comes at a price, and the more value that technology is seen to be able to derive from it, the greater the premium that exchanges will feel ‘justified’ in charging for it.
A little more than a month after the US Securities and Exchange Commission adopted its ETF Rule, the new rule has already helped streamline exchange-traded fund issuance and likely will spur further growth of the fixed-income ETF market.
The 269-page rule eliminated the exemption process for the Investment Company Act of 1940, which had been capital- and time-intensive, noted Eric Pollackov, global head of ETF capital markets at Invesco, during a recent open call hosted by the Security Traders Association.
Eric Pollackov, Invesco
“The costs were in the hundreds of thousands of dollars, and the time it took to gain your exemptive relief could have taken more than a year,” he said.
The Rule’s new process can be seen as a win-win for the issuer community as well as for the SEC itself.
“Whether it is the Division of Trading and Markets or the Division of Investment Management, the rule has just made their time much more valuable since they will not be poring through exemptive relief applications and processing them,” said Pollackov.
However, one of the most significant benefits of the new rule is the return of custom baskets to ETFs, which the SEC typically granted in its exemptive relief before the 2008 financial crisis but withheld thereafter until it adopted the ETF Rule.
In the interim, for issuers to accept creation orders, they had to receive a whole pro-rata slice of the fund or a full cash equivalent, which hampered them.
“If you don’t have the relief to accept custom baskets, you are asking the creators of your products to deliver a pro-rata slice of a municipal bond index or an aggregate bond index, or something else, which can become quite costly and difficult to obtain,” he said.
Now that issuers are permitted to use custom baskets, they can accept just a portion of a bond index, or even a few bonds, negotiated with an authorized participant.
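As a simplified illustration of the difference (the helper functions and numbers below are hypothetical and not drawn from the Rule or from Invesco), a pro-rata basket must mirror every holding in proportion, whereas a custom basket can be a negotiated subset of bonds plus cash, provided its total value matches the creation unit’s NAV.

```python
# Hypothetical sketch contrasting a pro-rata creation basket with a custom basket.

def pro_rata_basket(fund_weights: dict, nav_per_unit: float) -> dict:
    """Every holding, delivered in proportion to its weight in the fund."""
    return {bond: weight * nav_per_unit for bond, weight in fund_weights.items()}

def custom_basket_ok(bond_value: float, cash: float, nav_per_unit: float, tol: float = 1e-4) -> bool:
    """A negotiated subset of bonds plus cash is acceptable if its total value matches NAV."""
    return abs((bond_value + cash) - nav_per_unit) <= tol * nav_per_unit

weights = {"MUNI-A": 0.40, "MUNI-B": 0.35, "MUNI-C": 0.25}
print(pro_rata_basket(weights, 1_000_000))            # must source all three bonds
print(custom_basket_ok(700_000, 300_000, 1_000_000))  # True: two bonds plus cash
```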
“A portfolio manager who did not have the relief prior to this rule could not really negotiate which bonds would be delivered,” said Pollackov.
The new dialogue regarding what the portfolio manager will accept for a creation order is only going to help the fixed-income ETF space grow faster, he added.
Pollackov was careful to stress that the Rule’s relief covers securities that fall under the 1940 Investment Company Act and not exchange-traded notes or securities registered under the Securities Act of 1933, which typically are commodities-based futures products.
“The Rule will affect different asset classes differently, but overall, fixed income is probably the winner,” said Pollackov.
2/ Dutch firm Interxion provides #datacenters and #connectivity. The 200-strong London base is ideally situated in The Old Truman Brewery, surrounded by creative arts and media businesses who thrive on digital data, as well as near The City and its financial services clients. pic.twitter.com/lAN8dApSDZ
4/ Receptionist Cristina Jorge used to be a qualified security guard at Interxion's #data centres. This experience proves to be really useful in her current role as she is trained to deal with a number of security requests across the business portfolio. pic.twitter.com/tRGy9QyJ7R
6/ Caroline Puygrenier, Director, Strategy & Business Development – #Connectivity, was in #CapacityMagazine's '20 Women to Watch' this year. She regularly appears on panels encouraging women to join the tech industry and visits local schools to increase all types of #diversity. pic.twitter.com/dCvJM05fK5
8/ My host Paul Leonard, UK Marketing Manager, has been at Interxion for four months and his responsibilities include #DigitalMarketing and Event Management. He is sitting in one of the 'creative pods' in the office which can be used for quick meetings. pic.twitter.com/bOQ1Uu9Vws
10/ Ian Teague is part of the HQ Commercial Strategy team, whose mission is to enable #Sales to execute the Go to Market strategy and achieve Interxion #businessGoals pic.twitter.com/hRrx2zNkFj
12/ The exterior of #LON3, Interxion's newest #data centre in #London which opened this year. LON1 opened in 2002 and LON2 in 2012. The NYSE-listed firm operates 52 data centres across 11 countries including the Netherlands, Germany and France. pic.twitter.com/jAjkNeFRva
14/ I was given a tour of @Interxion's three #London data centres by Stephen Cramp, Facilities Site Supervisor. His responsibilities include ensuring that the #datacenters maintain their record of 100% uptime. pic.twitter.com/TXEDqGZtM3
16/ Stephen is part of the #running club at Interxion's London office. In September club members took part in the annual #damtotdamloop, where more than 35,000 runners take on a 10-mile course between #Amsterdam and Zaandam in the Netherlands. pic.twitter.com/qgnd4rzx0L
18/ The London-based Internet Exchange Point (LONAP) and #LINX, the UK’s largest Internet Exchange, along with eight major content delivery networks (CDNs) are also co-located at the campus, which is strategically positioned between the Square Mile and #techcity. (No photo)
20/ The Interxion #data centers have their own #generators to ensure they keep operating even if there is a problem with the power supply. There are 14 MTU generators, each of which is regularly tested. pic.twitter.com/x2ZbYwKDNS
22/ Lina Desire started at Interxion three years ago and is Customer Support Manager at the European #customerservice center. The team supports 50 data centres in 11 countries, helping clients 24/7 with inquiries and handling incident and event (alarm) management. pic.twitter.com/dYVNZECAx4
26/ The view from the #LON3 roof is typical of #London, with a mix of old and new. In the background on the left is Christ Church #spitalfields which was built between 1714 and 1729. The church is next to the modern skyscrapers of the constantly evolving #squaremile. pic.twitter.com/iGpJRgixSt
27/ Thanks to its prime location, Interxion has the largest #finserv community in the City of London with 50% of European Equities and Fixed Income trading here. This is also the home of global metals trading. #LME (THE END) pic.twitter.com/J6cvldlg2A
Combined flows into index trackers and exchange-traded funds in Europe were greater than flows into active funds for the first time.
Detlef Glow, head of Lipper EMEA research at Refinitiv, said at the Refinitiv Lipper Alpha Expert Forum 2019 in London today that assets under management in Europe increased to the end of September this year, compared to the same period last year.
Glow said: “For the first time index trackers had more market share than ETFs.”
Index trackers had 9% of market share in the first nine months of this year, compared to ETFs with 7%.
“For the first time passive flows were more than active,” he added. “The competition is on.”
Index trackers had 32% of inflows in the first nine months of this year and ETFs had 31%, meaning that passive flows were more than 50% for the first time.
Detlef Glow, Refinitiv
Glow said part of the reason is that as management fees for ETFs fell to less than 10 basis points, the fees for index trackers also fell dramatically.
“In addition some investors cannot use funds with derivatives or securities lending and index trackers can overcome these constraints,” he added.
ETFs
Hector McNeil, co-chief executive and founder of HANetf, which allows firms to white label ETFs, said at the conference that the next big trend will be active ETFs.
McNeil said: “The US Securities and Exchange Commission will soon approve non-transparent ETF models where issuers do not have to publish a daily file of their underlying holdings.”
David Lake, chief executive of Lyxor UK, the asset management subsidiary of French bank Societe Generale, said half of inflows into ETFs in Europe this year were in fixed income and 12% into ETFs for environmental, social and governance strategies.
Lake said there was plenty of innovation in European ETFs. For example, Lyxor launched the first ETFs to track inflation expectations in 2016 and he continued that assets rose from $100m to $3.5bn in 18 months.
“In 2017 we worked with Equileap to launch the first gender equality ETF,” added Lake.
MJ Lytle, Tabula Investment Management
MJ Lytle, chief executive of fixed income ETF issuer Tabula Investment Management, said at the conference that innovation in fixed income ETFs is only beginning. He added: “We will soon be issuing a new type of inflation ETF.”
Assets under management
ETFs and ETPs listed in Europe gathered net inflows of $75.3bn in the first nine months of this year according to ETFGI, an ETF research and consultancy firm.
Equity ETFs/ETPs listed in Europe had net inflows of $16.7bn to the end of September this year, substantially less than the $34.2bn in the same period in 2018.
In contrast, fixed income ETFs/ETPs listed in Europe gathered net inflows of $47.9bn in the same period this year, considerably greater than the $11.1bn in the first three quarters of 2018 according to ETFGI.
After initial skepticism, equity market participants have for years harnessed volume-weighted average price (VWAP) to evaluate execution quality. This key metric is enabled by the transparency of the equity markets where both pre- and post-trade data are readily available for every stock and where algorithms are impacted by changes to that data in a matter of milliseconds. As the equity market has grown comfortable with the consistent calculation method of VWAP, the focus of execution analysis has shifted from which is the appropriate benchmark toward how best to use and interpret the benchmark comparisons.
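For reference, the sketch below shows the standard VWAP calculation, the volume-weighted average of trade prices over a chosen window; restricting the trade list to a sub-period yields the “interval VWAP” discussed later in this piece. The example prices and volumes are hypothetical.

```python
# Minimal sketch of the VWAP benchmark: volume-weighted average trade price over a window.

def vwap(trades):
    """trades: list of (price, volume) pairs observed in the chosen window."""
    trades = list(trades)
    total_value = sum(price * volume for price, volume in trades)
    total_volume = sum(volume for _, volume in trades)
    return total_value / total_volume if total_volume else None

day_trades = [(100.00, 5_000), (100.10, 2_000), (99.95, 3_000)]
print(vwap(day_trades))  # 100.005
```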
Because the fixed income markets are less transparent and less liquid, with data that is often incomplete, the prevailing market price (PMP) waterfall methodology has the potential to deal with the “data desert” participants face when seeking relevant market observations to power advanced execution analytics. Although it is not the single metric that VWAP is for the equity markets, the waterfall approach provides the ability to develop a precise, quantitative and consistent benchmark data set to which every fixed income transaction can be compared, further driving enhanced insights and supporting critical regulatory requirements.
Reintroduction to PMP
While it is widely believed that the introduction of the PMP waterfall methodology was a recent retail-focused initiative to support mark-up disclosure, the reality is that, as a concept, PMP has existed for many years. PMP was initially introduced by FINRA where it was applied to the corporate market and later extended to the municipal market. As part of the mark-up disclosure regulation that went into effect in 2018, the MSRB made a concerted effort to align their definition of the PMP waterfall methodology with FINRA’s, thus creating a standard methodology across both taxable and tax-free securities. In addition, both FINRA Rule 2121 and MSRB Rule G-30 were rewritten to describe the determination of PMP as a critical component in assessing fair pricing for all customer trades — both retail and institutional.
In the context of mark-up monitoring, mark-up disclosure and fair pricing, the precise PMP value is the most critical metric, whereas best execution processes can be enhanced by leveraging the totality of the data and identifying variances across all market observations. In all scenarios, the ability to (1) marry every execution with a consistent benchmark data set, (2) identify and address exceptions, (3) document other market conditions and (4) support regulatory inquiries provides fixed income market participants with an opportunity to streamline and enhance critical compliance processes.
Rationale for the PMP waterfall methodology
The PMP waterfall methodology walks a professional market participant through a logical process that attempts to replicate what a bond market professional would do to ascertain the prevailing market price of a bond. By codifying the steps to create a consistent approach across all trades, we can automate the process, store the data and analyze it at different levels of aggregation to identify trends, outliers and opportunities for improved outcomes. The PMP value is determined by the following process, sketched in code after the list:
Level One: Have I traded this bond today? If so, at what price did I trade the bond?
Level Two: If I have not traded this bond today, are there any market observations of trading in this bond today, considered in the following order: first, the dealer-to-dealer market (considered the inside market); if not, dealer-to-institutional customer trades (institutional customers being considered knowledgeable investors); and if not, any quotations in the market on my bond?
Level Three: Should no observations exist for my bond, can I identify comparable bonds and determine a value for my bond by using the yield of a comparable bond to price my own? This calculation can be done by looking at dealer-to-dealer trades, dealer-to-institutional customer trades or quotations (in no defined order).
Level Four: Barring any observations in my bond and a lack of comparable bonds, I can determine a PMP using an economic model.
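To make the waterfall concrete, below is a highly simplified sketch of the logic in code. The observation fields, the averaging used at Level Three and the toy bond pricer are illustrative assumptions; the authoritative definitions live in FINRA Rule 2121 and MSRB Rule G-30, not in this sketch.

```python
# Highly simplified sketch of the PMP waterfall described above (not the regulatory text).

def toy_yield_to_price(y: float, face: float = 100.0, coupon_rate: float = 0.05, years: int = 5) -> float:
    """Toy annual-coupon bond pricer, included only to keep the sketch self-contained."""
    cashflows = [coupon_rate * face] * years
    cashflows[-1] += face
    return sum(cf / (1 + y) ** t for t, cf in enumerate(cashflows, start=1))

def prevailing_market_price(own_trades_today, market_obs_today, comparable_obs, model_price):
    # Level One: my own trades in this bond today
    if own_trades_today:
        return own_trades_today[-1]["price"]

    # Level Two: contemporaneous observations, in order of reliability
    for source in ("dealer_to_dealer", "dealer_to_institutional", "quotation"):
        obs = [o for o in market_obs_today if o["source"] == source]
        if obs:
            return obs[-1]["price"]

    # Level Three: value implied from comparable bonds (sources in no defined order)
    if comparable_obs:
        avg_yield = sum(o["yield"] for o in comparable_obs) / len(comparable_obs)
        return toy_yield_to_price(avg_yield)

    # Level Four: fall back to an economic model
    return model_price

# Example: no own trades, one quotation observed today -> Level Two returns 101.2
print(prevailing_market_price([], [{"source": "quotation", "price": 101.2}], [], model_price=99.5))
```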
Regulation driving innovation
The introduction of the PMP waterfall has provided the fixed income markets with a unique opportunity to augment current mark-up monitoring, mark-up disclosure, fair pricing and best execution processes with a single, quantitative and consistent benchmark data set. If adoption of a single benchmark for analyzing fixed income executions follows a similar path to the equity market’s adoption of VWAP as a key tool in execution analysis then, after an initial period of resistance and finding fault, effort will turn to how to improve the metric to drive real insights.
In the early days of using an average price for equity execution analysis, the common refrain among traders was “Why would I settle for average?” With time and improved understanding of the market’s microstructure, traders are now thinking of ways to improve the metric by normalizing for bid/ask spreads, by using interval VWAPs, or comparing to VWAPs before and after trading windows. We should view PMP in a similar fashion. First, we can agree on a common metric. Then, we can work on ways to improve the analytical capabilities of the metric.
The convergence of technology, investors’ needs and regulatory efforts will further highlight the importance and fuel the value of fixed income data. Through relentless innovation in data analysis and automation, market participants can advance their capabilities and drive deeper, more meaningful insights across the fixed income landscape.