
Improving Single Stock Trading

By Lee Bray, Head of Asia Pacific Equities Trading, J.P. Morgan Asset Management

Lee Bray, J.P. Morgan Asset Management
Investment in high touch trading has lagged behind spending on low touch technology, but that is changing with the recognition of the importance of developing quantitative methodologies.

I have been in the equity trading business for 20 years, and during that time buy-side single stock trading has, in essence, remained the same as when I started – with a few minor tweaks.

Meanwhile, the low touch world has been going through significant change for many years now: smart algorithms, wheels to determine which provider is best, innovations around dealing with the rapidly evolving market structure; the list goes on. This dichotomy between the two is perplexing.

Single stock traders throughout the world, in whichever market they are operating, are often responsible for overseeing trades that are significant and material to the underlying portfolios. How have we arrived at this status quo?

In my view, there is one core reason: the lack of investment by the industry in single stock trading. This may once have been justifiable given the absence of real-world solutions that could help. However, this has changed in recent times with the advent of machine learning techniques and the accessibility of TCA (transaction cost analysis).

Before any firm embarks on a path to reassess its approach to single stock trading, it should first acknowledge the failings that are almost invariably present in current workflows, most notably the lack of a consensus view on what makes a “good” high touch execution strategy. Give two single stock traders at the same firm the same trade, and each will take a different view on how to execute and will likely employ a divergent approach. This is not their fault: it is simply the consequence of working without a constructive set of defined criteria, which were not accessible in the past.
The world has moved on, and most asset managers now have access to tools such as portfolio manager profiles and more advanced systems that can indicate the drivers of a particular stock move. If trading desks can put a framework around this and get comfortable with it, they can start to provide meaningful, quantifiable feedback to traders, driving improvements and better outcomes.

I’ll simplify the workflow to make a point. Currently, I see three key decision points for high touch traders.

Workflow for high touch trading
First, the decision to trade a block should there be one available. Second, if there is not a block available, then what is the best way to work in the market while searching for blocks? Third, determining what information to give to the street to enable them to do their job concerning matching buyers and sellers.

J.P. Morgan is developing quantitative methodologies to address and inform our high touch traders at these three key decision points, as well as working with our quants to construct a framework around when is the correct time to adjust a given strategy. Having this structure in place simplifies workflows and makes high touch trading significantly more intuitive.

To date, we have made significant progress. Traders already receive a recommendation on which broker to leave information with to optimize potential block outcomes for the benefit of our clients. Block releases are captured and time stamped so we can much better understand the consequences of our traders’ actions. Also, a framework is currently under development to create a recommendation around “trading strategy” that will supplement the high touch traders’ own input.

Needless to say, the level of flexibility is greater in the high touch world when it comes to intervention, but this framework provides a base for our quants to help improve execution quality, where there was none before. By the end of 2019, a number of other factors will have been baked into the model to provide a significant step forward in achieving optimal outcomes.

This investment, and the framework it provides, gives traders more confidence in their decision making and allows them to understand, with hard numbers, which counterparties in the brokerage community are working for them and which are not. Put another way, it enables them to move away from the current best practice of applying a qualitative overlay when making decisions. With more confidence comes more information for those on the street who are meeting requirements, allowing sell-side traders to do their job of adding value in the blocks world, and helping dispel the prevailing assumption that algo execution is the first port of call when working large orders.

In order for this evolution to have maximum effect, another change needs to happen within the traders themselves. Embracing new technologies and techniques is vital, and the firms that push forward in this space will be those with the most open-minded single stock traders: those willing to adopt new and exciting changes that increase the efficiency of their role and empower them to make more informed decisions.

We are at the start of a period of dramatic change in this critical space, and I look forward to seeing a group of traders in whom the industry has long underinvested move forward as it takes the next step.

BNY Mellon Completes Initial Margin Puzzle

Jonathan Spirgel, global head of liquidity and segregation services at BNY Mellon, said the custodian is providing the last piece of the puzzle to allow clients to meet the incoming initial margin regulations which will affect a much greater proportion of the buy side.

In September phase four of the uncleared margin regulation will come into force, with phase five in September next year. Regulators began implementing the mandate to exchange initial margin on uncleared derivatives in 2016 in annual waves. The implementation thresholds are based on firms’ aggregate notional amounts of outstanding derivative contracts that fall under the regulation.

This month BNY Mellon said it had partnered with AcadiaSoft, which provides margin automation solutions, to allow buy-side derivatives market participants to fully outsource their non-cleared margin workflow.

Spirgel told Markets Media that BNY Mellon had chosen AcadiaSoft because the firm is a market leader and their technology is already being used by the sell side for margin movements and collateral management.

“We are using their technology to provide an end-to-end solution for our clients including asset managers, insurers and smaller hedge funds who have limited capabilities to meet the non-cleared margin regulations,” he added. “By joining forces we provide the last piece of the puzzle.”

AcadiaSoft acts as a central repository for calculating initial margin, enabling market participants to use the utility as a single point of contact through which to conduct their messaging and calculations. BNY Mellon said in a statement that it now becomes the first collateral outsource provider to support the initial calculation and reconciliation needs of clients that fall within the scope of the non-cleared margin rules.

Chris Walsh, AcadiaSoft

Chris Walsh, chief executive of AcadiaSoft, said in a statement: “We’re thrilled to be able to offer buy-side derivatives market participants access to AcadiaSoft’s IM calculation and reconciliation services for the very first time through BNY Mellon.”

Spirgel said clients already send BNY Mellon daily trade data which includes 25 fields.

“They can add additional fields to the same file and AcadiaSoft will perform the margin calculation, after which BNY Mellon will message the counterparty to move and segregate the collateral,” he added.

Need to prepare

Spirgel said BNY Mellon is reaching out to clients to ensure they are prepared for the next phase of the margin requirements.

The first four phases of the initial margin regulations apply to approximately 100 firms globally, according to Celent. The consultancy said in a report that, in comparison, the fifth phase will apply to more than 1,100 additional firms, involving 9,500 bilateral counterparty relationships and 9,000 negotiated agreements.

Euroclear has also said that during the first three phases only 34 global asset managers came under the scope of the regulation. However, the next two waves will cover 1,200 firms, which between them have 9,500 counterparty relationships, each requiring new sets of documentation.

“Waves 1 to 3 showed that firms needed new documentation, new customer selection, fresh understanding of collateral eligibility risk and collateral optimisation,” added Euroclear. “The implementation of these waves also included a huge amount of tech and infrastructure changes for the IT and operational teams within firms.”

Spirgel continued that in order to prepare firms need to know who their counterparties are, have documentation in place and instructions concerning where to move collateral. “There is a lot of heavy lifting upfront but once clients are set up, the process will be streamlined,” he added.

RETAIL REPORT: 2/3 of Wealthy Want Crypto in Portfolios

So, who wants to buy crypto?

Everybody? Maybe. But according to new research from deVere Group the wealthy are looking to increase their cryptocurrency exposure. The firm said that more than two-thirds of high-net-worth individuals will be invested in cryptocurrencies in the next three years.

Nigel Green, deVere Group

The proprietary survey, carried out by deVere Group, revealed that 68% of poll participants are either already invested in cryptocurrencies, such as Bitcoin, Ethereum and XRP, or will be before the end of 2022.

The research was based on responses from over 700 clients who currently reside in the U.S., the UK, Australia, the UAE, Japan, Qatar, Switzerland, Mexico, Hong Kong, Spain, France, Germany and South Africa.

High net worth is classified in this context as having more than £1m (or equivalent) in investable assets.

“There is growing, universal acceptance that cryptocurrencies are the future of money – and the future is now,” Nigel Green, founder and CEO of deVere Group told Traders Magazine in a phone call. “High net worth individuals are not prepared to miss out on this and are rebalancing their investment portfolios towards these digital assets.”

Green added that crypto is to money what Amazon was to retail.  “Those surveyed clearly will not want to be the last one on the boat.”

Besides FOMO, the ‘Fear Of Missing Out’, Green believes there are five main drivers of high-net-worth individuals’ surging interest in cryptocurrencies.

“First, cryptocurrencies are borderless, making them perfectly suited to an ever globalized world of commerce, trade, and people,” he began. “Second, they are digital, making them perfectly suited for the increasing digitalization of our world, which is often called the fourth industrial revolution. Third, they provide solutions for real-life issues, including making international remittances more efficient, and help bank the world’s estimated two billion ‘unbanked’ population. Fourth, demographics are on the side of cryptocurrencies as younger people are more likely to embrace them than older generations. And fifth, institutional investors are coming off the sidelines and moving into cryptocurrencies, bringing their institutional capital and institutional expertise to the crypto market.”

The deVere CEO’s optimism comes as Bitcoin, the world’s dominant cryptocurrency, continues to register new highs, reinforcing the bullish view suggested by its recent upswing.

“Once this confidence is in place, the sky is the limit for cryptocurrencies, which are increasingly accepted by both retail and institutional investors as the future of money,” he said. “The global poll underscores a justified international surge in crypto-optimism.”

 

Buy-Side Trading Q&A: BlackRock

BlackRock won Best Global Trading Team at the 2019 Markets Choice Awards.

Markets Media spoke with Supurna VedBrat, Head of Global Trading at BlackRock, to learn more.

How and why is teamwork important for BlackRock’s trading?

Supurna VedBrat, BlackRock

Markets today are highly interconnected. BlackRock has a global presence, which makes teamwork even more important. Our size and scale allow us to have specialist trading desks located in multiple regions that operate as one global trading team. Furthermore, collaborative knowledge sharing across the team, along with a leadership team that brings different ideas and perspectives, has allowed BlackRock to proactively alter its trading strategies to adapt to changing external market conditions.

BlackRock’s trading team uses the same risk management system globally, which has been instrumental in our ability to achieve best execution for our clients. Having the same view and trading tools across regions allows our traders to benefit from each other’s contributions, such as using similar algos across regions for the same product, or transferring the use of instruments and trading strategies such as ETFs and portfolio trading from equities to fixed income. Teamwork has been essential to our ability to tap into the collective intelligence of our entire trading team.

How is BlackRock improving its trading in specific asset classes?

Daniel Veiner, Global Head of Fixed Income Trading at BlackRock, accepted the award.

There are many examples but let me just touch upon a few. In credit markets BlackRock has been working with other market constituents on developing efficient models to increase the use of portfolio trading in fixed income, which offers an alternative method to access liquidity. The ability to trade large baskets of credit as a single portfolio trade allows either side to lay off multiple line items of risk seamlessly at a more competitive price than the individual trades while leaving a very light footprint in the market. Portfolio trading is one of those rare situations that is a win-win for everyone.

In equities, we have been very active in improving market microstructure in the wake of the ‘flash crash’ from a few years ago. We are also constantly enhancing the quality of algorithms, the controls around the algorithms and the decision making process behind which algorithms to use and when. We are also continuing to build out the trading ecosystem for ETFs, especially with the growing use of Fixed Income ETFs.

How else is BlackRock raising the bar?

Proliferation of data due to electronification and regulatory reporting has created an environment where data science and AI can increasingly be used in trading strategies. We are only at the beginning stages of using data science to help analyze market liquidity and further enhance trading cost analysis. We recognize that dealer balance sheet has become a scarce commodity, so we optimize how we use the balance sheet we are given.

17th Asia Pacific Trading Summit

Edward Duggan, HSBC, on the major developments in Emerging Markets, including China



Adaptation to the Central Securities Depositories Regulation


This report examines the March 2019 go-live of the EU’s Central Securities Depositories Regulation (CSDR). Historically, a multitude of country-specific rules created an environment in which financial market participants had to comply with divergent end-investor and regulatory reporting mandates. The eventual full implementation of CSDR across the entirety of Europe means that, in time, the continent’s CSD industry will likely adopt a more uniform, single market-friendly structure.


Let Go Of The Wheel! Buy-Side Traders Need To Embrace Their Tesla Moment

Raj Mathur, Credit Suisse

By Raj Mathur, Co-head of AES, APAC, Credit Suisse

Adoption of the latest technologies for mechanical aspects of the trading process allows human traders to focus on where they can add value.

The buy-side has developed trader workflows around the tools it has had available for over a decade. However, investment in, and the application of, new technologies is challenging this heuristic approach to trading.

A buy-side trader has to consider several variables, including size, potential price impact of an order, as well as the urgency or alpha that the order carries. Doing this for ten orders is manageable; but doing it for hundreds of orders is not.

The nature of a trader serially inputting orders or instructions enforces limitations. As the themes of best execution have penetrated this workflow, packaging orders into bands or buckets based on average daily volume (ADV) or volatility has been a natural progression to bring homogeneity into the process.

In particular, buy-side algo wheels have helped provide a structure around how orders can be bucketed around key factors, to bring more statistical rigour into the evaluation of execution quality and to facilitate a comparison of execution quality amongst brokers.

However, introducing statistical observation to enable an “apples-to-apples” comparison may also introduce biases of approach that deviate from the overarching objective: outperformance.

THE IMPLEMENTATION SHORTFALL PARADOX
Implementation shortfall (IS) has become central to benchmarking broker execution. Historically, the tools to target IS have tried to capitalize on implicit short term bets by using price reversion or momentum models. That is to say, when traders are placing orders with IS in mind, more often than not, the tools at their disposal require the trader to decide participation bands, and then vary participation driven by price performance. If prices improve from arrival, participation increases (reversion) or alternatively decreases on the same basis (momentum). On the assumption that there is no clear alpha precipitating the trade, the net result is that while some orders may end up winning, the losing trades continue to run and as a whole produce very mixed results. That is possibly the reason why we have seen the buy-side gravitate away from traditional IS benchmark strategies into more spread capture focused strategies (see table above) for the purpose of IS benchmarking.
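The reversion/momentum behaviour described above can be sketched as a simple participation rule keyed to price performance versus arrival. This is an illustrative toy model, not any broker's actual logic; the function name, rates and thresholds are all assumptions:

```python
def adjust_participation(side, arrival_px, current_px,
                         base_rate=0.10, step=0.05,
                         lo=0.02, hi=0.30, mode="reversion"):
    """Vary participation with price performance versus arrival.

    mode='reversion': participate more when the price has moved in the
    order's favour, betting it reverts; mode='momentum' does the opposite.
    All parameter values are illustrative, not a real broker model.
    """
    # Signed performance from the order's perspective:
    # positive = the price has moved in our favour since arrival.
    move = (arrival_px - current_px) / arrival_px
    if side == "sell":
        move = -move
    if mode == "momentum":
        move = -move
    direction = 1 if move > 0 else -1 if move < 0 else 0
    rate = base_rate + step * direction
    return max(lo, min(hi, rate))

# A buy order where the price has fallen 1% from arrival (favourable):
print(adjust_participation("buy", 100.0, 99.0))                   # speeds up
print(adjust_participation("buy", 100.0, 99.0, mode="momentum"))  # slows down
```

As the text notes, with no clear alpha behind the trade, the losing tail of such implicit bets tends to dominate, which is one reason spread-capture strategies have gained ground.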

From a theoretical perspective, that makes a lot of sense for the buy-side: more prescriptive participation rates create less variability in performance outcomes and also allow a more apples-to-apples comparison of brokers’ algo performance.

But although this achieves a certain homogeneity of results, it creates a paradox: standardising results for comparison’s sake may run contrary to the goal of finding the best possible results.

In fact, the tools have not really been designed to address the heart of the problem. So far, the typical algo toolkit available in the IS space has necessitated that the buy-side trader implements the trade plan, and makes micro decisions: for example, take a bet on where the price will go, and choose appropriate participation rates, etc.

There has generally been a lack of trade-planning assistance from the algos. With the advancement of machine learning and artificial intelligence to make more data-driven decisions, the problem has shifted from a technological challenge to more of a philosophical one.

MAN VERSUS MACHINE
When it comes to trade planning, can a machine make more informed decisions than a human? Even without grandstanding breakthroughs in technology, we have seen the benefits of applying this idea to anticipating close volumes for market-on-close (MOC) style algos that address the “trader’s dilemma” of balancing benchmark slippage against potential price impact. Meanwhile, volume weighted average price (VWAP) has been staring us in the face the whole time as the epitome of the driverless algo that can trade plan for us.

Indeed, VWAP embodies all the features of a benchmark algo that we are looking for. As we have found with spread capture implementations, the lack of assisted trade planning forces the need for components such as “must complete”, which essentially are the equivalent of an emergency parachute. The trajectory is deterministic, and if it runs out of road it needs to take emergency manoeuvres.  

The art of trade planning should be replaced by the science of trade planning. Let the machine determine the initial participation, and re-factor it around the data it can assimilate from what is happening, and utilise what it expects could happen. No more “aim to complete” please!
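A VWAP-style plan of the kind described above simply slices the order along an expected intraday volume curve, so participation follows where volume is forecast rather than a fixed, trader-chosen rate. The sketch below is illustrative only; the volume curve and function name are assumptions:

```python
def vwap_schedule(order_qty, volume_curve):
    """Slice an order along a forecast intraday volume curve - the essence
    of a VWAP-style 'driverless' plan: participation follows where volume
    is expected rather than a fixed, trader-chosen rate.
    volume_curve: forecast share volume per time bucket (illustrative)."""
    total = sum(volume_curve)
    return [order_qty * v / total for v in volume_curve]

# A 50,000-share order against a stylized U-shaped intraday curve:
curve = [200_000, 100_000, 80_000, 120_000, 300_000]
print(vwap_schedule(50_000, curve))
# -> [12500.0, 6250.0, 5000.0, 7500.0, 18750.0]
```

A real planner would then re-factor this schedule intraday as realized volume and price data arrive, which is the "science of trade planning" the text argues for.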


To illustrate the difference between the human heuristic approach and the data-driven machine approach (see charts below), we have two small samples of orders encompassing different ADV factors.


In group 1, 11 orders with an IS benchmark were given to a trader, and the decision was made to trade all of them between 5-10% participation.

In group 2, 25 orders were given to the same trader, and the decision was made to trade these between 10-15% participation.

Looking to the right, the same orders were given to our new IS algo, and it’s clear that the machine was able to determine a unique signature for each order. Individual participation rates from the machine ranged between 2% and 60% for group 1, and 2% and 30% for group 2.  

Relying less on heuristics, and instead using the machine’s ability to assimilate historical and real-time data, provides a wider window of possibilities within which we may find some of the outperformance we are looking for.

Continuing down the conformist path may help make things easier to measure, but at the cost of finding ways to improve.

THE FUTURE OF TRADING
At present, the scale of investment and the potential for technology to improve execution performance rely on our ability to rethink the way we approach things philosophically.

A good analogy is the moment Tesla drivers received the software update to allow their car to drive itself. Imagine the dilemma: you have driven the entire time with full control, and now you have to cede control to the car. Can you let go of the wheel or not?  

Abandoning the mechanical aspects of driving frees up the trader to do what they do best, which is to discover alpha.

Machines should empower traders to find alpha: the actual trading piece is only a subsection of the overall trade lifecycle. Other elements can yield more insight and actually inform our approach, namely: What precipitates the decision to trade? What can we learn from previous results? How does this look at the portfolio level? Where can I find blocks?

The future does not mean less control; the role of the trader is more important than ever. The trader still has ultimate discretion over the outcome. Where the trader has information that a machine may not possess, such as momentum signals that precipitate the trading decision, or knowledge of the larger uncommitted order, the trader can inform the machine to find the extra edge.

So who wins between man and machine? It’s no contest. The goal is to move beyond the heuristic and mechanical aspects of trading. The machines can find new ways to improve the overall average, and empower the human trader to realise opportunity beyond this: best execution.

Venue Analysis Holy Grail

By Enrico Cacciatore, Senior Quantitative Trader and Head of Market Structure & Trading Analytics, Voya Investment Management

Routing Transparency Initiative – Unlocking the routing decision

1. Introduction
Twenty nineteen is poised to be a very eventful year in market structure. Factors such as technological advancements, optionality in venues, Reg NMS, and increased usage of quantitative methods are significantly shaping the evolution of equity market structure. The next few years will be pivotal in tackling several obstacles that inhibit equity markets from transitioning to the next phase. This year in particular could have substantial influence on the market’s future direction. We can anticipate the continued growth of systematic trading, the SEC’s evaluation of the maker/taker model, and the enhancements to Rule 606, to highlight a few key events in the year ahead.

1.1 Reg NMS, Technology and Quant methods
The interest in market structure and its impact on how market participants transact to achieve an investment goal continues to have ever-increasing significance. Reg NMS was a major catalyst in seeding the landscape for technology and quantitative methods to dominate the trading process. Without an equity market structure that is fluid and efficient, the growth and dominance of systematic investing, whether in a large systematic quant fund or an automated market maker, would not have been possible over the last thirteen years.

From this highly efficient and relatively low-cost market structure, investors are able to transact on venues with a cost structure best suited to their investment objective. An investor who is patient and passive in seeking liquidity can capture both the bid/ask spread and a rebate fee for providing liquidity on a specific venue. An investor aggressively seeking liquidity, by contrast, can simply take the passive liquidity resting on an exchange.

Alternatively, one can queue-jump by resting on an inverted exchange, which flips the maker/taker model by requiring liquidity providers to pay a fee to make. This structure provides what I call “optionality in venue liquidity”. The fragmented structure does have its drawbacks, but these are outweighed by equity markets that allow liquidity shoppers and vendors to have an economic and transactional experience best aligned with the purpose of the trade. Depending on the intention, the investor can find a venue that best fits their objective. In some cases cost is the critical factor, while in others it is the highest probability of a fill. The result allows the investor to align a trading strategy that best captures the alpha profile of the investment decision.
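To make the venue economics concrete, a rough all-in cost comparison (versus the midpoint) for a passive order on a maker/taker venue versus an inverted venue might look like the following. The fee and rebate levels are purely illustrative, not any venue's actual schedule:

```python
def all_in_cost_bps(px, spread, fee_per_share, passive):
    """All-in cost per share in basis points, measured against the midpoint.
    passive=True: rest at the near touch and earn half the spread;
    passive=False: cross the spread and pay half of it.
    fee_per_share: negative = rebate earned, positive = fee paid.
    A negative result is a net credit. Fee levels below are illustrative."""
    half_spread = spread / 2
    cash = (-half_spread if passive else half_spread) + fee_per_share
    return cash / px * 10_000

px, spread = 20.00, 0.01
# Maker/taker venue: a passive order earns a rebate on top of the half-spread.
print(all_in_cost_bps(px, spread, fee_per_share=-0.0020, passive=True))
# Inverted venue: a passive order pays a fee to make, buying queue priority.
print(all_in_cost_bps(px, spread, fee_per_share=0.0015, passive=True))
```

Under these assumed fees, the inverted resting order gives up part of the economics (roughly 1.75 bps of credit versus 3.5 bps) in exchange for better queue position and a higher probability of a fill, which is exactly the trade-off "optionality in venue liquidity" describes.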

2. Venue Analysis – Current State
Venue analysis should never be the starting place for evaluating performance or toxicity of execution at the venue level; rather, it should provide the forensic detail of why certain orders, or even a broker algorithm in aggregate, are under-performing. Using a top-down approach to broker-dealer algorithm performance, we can replay the intent of the algorithm’s order placements and determine what impacted performance based on the investor’s communicated urgency.

Just because a broker-dealer has an efficient smart router that captures the routed intent well does not equate to strong placement-level performance. Placement-level performance may be poor because the broker-dealer’s strategy urgency level is not aligned with the investor’s alpha profile. This would not be a bad problem to have: the investor can simply synchronize the optimal urgency level for given order profiles and expect to see a performance increase.

However, poor placement-level performance may instead be due to the broker-dealer using an algorithm built around clock-time rather than volume-time. In a clock-time approach, some fixed increment of time triggers additional algorithmic instructions, such as “after five seconds at the bid, move to mid”. In a volume-time approach, actions are triggered by events, and tend to be more dynamic and random. In stocks with large gaps between events, such as illiquid small caps, a clock-time approach makes time-triggered instructions highly detectable. With volume-time, one would expect algorithmic interaction with the marketplace to correspond directly with stock-specific events.
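The detectability point can be illustrated with a toy comparison: a clock-time rule fires at fixed wall-clock intervals regardless of activity, while a volume-time rule fires only on market events. Function names and parameters here are illustrative, not any broker's implementation:

```python
def clock_time_reprices(events, reprice_after_s=20.0):
    """Clock-time logic: reprice (e.g. bid -> mid) after a fixed wall-clock
    wait, firing regardless of market activity. In an illiquid name with
    long gaps between prints, these actions land at predictable times.
    events: sorted timestamps (seconds) of market events (illustrative)."""
    reprices, t = [], reprice_after_s
    while t <= events[-1]:
        reprices.append(t)
        t += reprice_after_s
    return reprices

def volume_time_reprices(events, every_n_events=3):
    """Volume-time logic: act only after every N market events, so the
    algo's footprint follows the stock's own rhythm."""
    return [t for i, t in enumerate(events, 1) if i % every_n_events == 0]

events = [1.0, 2.0, 30.0, 31.0, 32.0, 90.0]   # a sparse, gappy tape
print(clock_time_reprices(events))    # periodic, detectable pattern
print(volume_time_reprices(events))   # tied to actual activity
```

On this gappy tape the clock-time rule acts at regular, predictable intervals even when nothing is trading, while the volume-time rule acts only where the stock itself is active.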

Connecting algorithmic intentions with actual stock-specific events, such as a quote move or a large block print, becomes a powerful tool when evaluating placement-level performance.

2.1 The push for greater transparency
Various incentives are providing the push for broker-dealers to provide greater transparency around child order routing to venues. In a press release on December 19, 2018, the SEC released information regarding the Transaction Fee Pilot: “The Pilot will study the effects that exchange transaction fee and rebate pricing models may have on order routing behaviour, execution quality, and market quality.”4

Additionally, the SEC “adopted amendments to Regulation NMS, [Rule 606(b)(3)], [that would] require additional disclosures by broker-dealers to customers regarding the handling of their orders.”5 The most vocally expressed justification around both SEC actions is the view that there is an embedded conflict between broker-dealers, access fees and best execution. The view is that broker-dealers may prefer specific venues based on economics and not in the best interest of performance.

This does not solely influence routing to regulated exchanges (the lit market), but also Alternative Trading Systems (ATSs, or dark pools). FINRA recently released a paper titled “Institutional Order Handling and Broker-Affiliated Trading Venues” that “…examine[s] whether the use of affiliated ATSs that may have benefits for the brokerage firm affects execution quality of institutional clients.”6 All of these concerns and potential conflicts provide valid justification to question routing decisions. Regardless, transparency around routing must be improved, both to help broker-dealers dispel this perception of conflict and to provide critically missing data for evaluating routing and venue quality.

Broker-dealers must adopt a standardized intention/reason code for why they routed an order to a venue. This would remove or validate any conflicts as well as provide the missing component to true venue analysis. As stated in the FINRA paper, “our data are not sufficiently detailed about client intent or the extent of broker-dealer discretion associated with an order.”6 If we have both filled intention codes real-time as well as all routed orders stored and available in a standardized format, both investors and broker-dealers have the ability to step-through the routing decisions and analyze the impact on those decisions.

3. Venue Analysis – Future State
A committee of sell-side and buy-side peers is working to create, normalize and adopt a “Routing Transparency Initiative”. The goal is to provide more transparency and standards for routing FIX orders to venues, as expressed in the 606 amendments, and to add a quantitative layer to the behavioural changes in routing for the SEC’s empirical evaluation of Reg NMS Rule 610T.

A descriptive numerical framework using five two-digit sequence codes to categorize actions would allow many combinations and the flexibility to support a new normalized standard within the FIX protocol, applied globally. Each two-digit descriptor provides up to 99 possible sequence-code values, leaving ample room for future expansion. The translated numerical string essentially becomes a readable phrase describing what the broker is attempting to accomplish with the routed order. Figure 2, for example, uses the ten-digit string “0501081903” to represent a Midpoint Ping, ATS, Market w/ Minimum Execution Size (MEQ), Seek oversized Block, with Medium urgency.
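Decoding such a string is mechanical, which is part of the framework's appeal. A minimal sketch follows; only the code values from the example above are filled in, and the table layout is an assumption rather than the committee's actual specification:

```python
# Position-by-position code tables for the five two-digit descriptors.
# Only the values from the article's example are filled in; the full
# tables would be defined by the Routing Transparency Initiative.
CODE_TABLES = [
    {"05": "Midpoint Ping"},                       # tactic
    {"01": "ATS"},                                 # venue category
    {"08": "Market w/ Minimum Execution Size"},    # order condition
    {"19": "Seek oversized Block"},                # routing intention
    {"03": "Medium urgency"},                      # urgency
]

def decode_routing_code(code):
    """Split a ten-digit routing string into five two-digit sequence
    codes and translate each against its positional table."""
    if len(code) != 10 or not code.isdigit():
        raise ValueError("expected a ten-digit numeric string")
    pairs = [code[i:i + 2] for i in range(0, 10, 2)]
    return [CODE_TABLES[i].get(p, f"unknown({p})") for i, p in enumerate(pairs)]

print(decode_routing_code("0501081903"))
# -> ['Midpoint Ping', 'ATS', 'Market w/ Minimum Execution Size',
#     'Seek oversized Block', 'Medium urgency']
```

Because each position has its own table, the same two-digit value can mean different things in different slots, and each slot can grow to 99 values without disturbing the others.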

Our committee has begun to spec out a conceptual descriptive routing framework to communicate the broker-dealer’s intentions. Each broker-dealer’s proprietary routing codes would need to be translated and standardized into a predefined sequence code and corresponding value. With this standardization adopted across all broker-dealers, we would look to integrate the information into existing protocols, leveraging ITCH, OUCH, UTP Direct, and so on. Additionally, FIXtrading.org and the standard FIX protocol community would help determine how best to adopt these standards into the Financial Information eXchange (FIX®) Protocol.

The Routing Transparency Initiative committee will leverage the knowledge of developers, FIX engineers, systematic traders, market structure heads, and other participants to define a descriptive numerical framework that is both extensive and flexible enough to handle all possible routing intentions, converting broker-dealers’ proprietary routing behaviours into a normalized standard.

4. Conclusion
In the current investment life cycle, all decision points are transparent, captured, and stored for further analysis. We can systematically improve performance by evaluating these decision points in an optimization process. Aligning the parent order’s alpha profile with the optimal algorithmic strategy yields iterative, systematic improvement in parent order performance, and consistent algorithmic strategy performance is a critical factor in this process. The Routing Transparency Initiative adds a transparency layer to the investment life cycle that would help broker-dealers improve their routing, help investors understand and evaluate routing practices, and provide a critical component of the SEC Rule 610T empirical evaluation.

Hear more opinions on this topic:
 

Phil Mackintosh, SVP, Chief Economist, Nasdaq
THE POST TRADE DEBATE DESERVES TO BE DATA DRIVEN
In the past 20 years, markets globally have automated and modernized. Computers are now responsible for the majority of executions in stocks around the world, and the quantity of data available to analyze trading has exploded. There are now more than 50 venues for trading and over 35 million trades every day in the US equity market; FINRA recently highlighted that it had 135 billion records in a single day. At the same time, transaction costs have fallen significantly, which makes squeezing the last basis points out of your trading desk increasingly complicated.
Ryan Larson, Head of Equity Trading (US), RBC Global Asset Management (US)
MORE TRANSPARENCY SHOULD IMPROVE EXECUTION QUALITY
Nasdaq recently offered its recommendations on creating a more modern market structure; separately, a collective group of market participants launched the Routing Transparency Initiative. These efforts should be applauded as contributing to the debate over how to improve our markets. A diverse group of market participants should find common ground and deliver a global standard for routing FIX orders that provides additional transparency, builds on the SEC’s Rule 606 enhancements, and ultimately makes child-order routing decisions visible.
Stephen Cavoli, Execution Services, Virtu Financial
COMPLETE ANALYSIS REQUIRES AN UNDERSTANDING OF CHILD ORDER INTENTIONS
By supporting the RTI, we hope to raise the standard of transparency in the industry by providing normalized data for clients to analyze, as we believe that transparency helps market participants make better, more informed trading decisions. Initiatives like the RTI are very much in line with our Enhanced TCA and Transparency Report, where we were the first to provide granular insight into the decision-making process of Virtu’s algorithms and how clients’ execution objectives and market conditions map to an algo’s order routing decisions.
References
[1] Regulation NMS – Regulation National Market System (NMS) is a set of rules passed by the Securities and Exchange Commission (SEC) that seeks to improve U.S. equity markets through fairness in price execution, better display of quotes, and access to market data. – https://www.sec.gov/rules/final/34-51808.pdf
[2] Investor – For both institutional and retail investors, market access rules require registered members to provide access to venues; unfiltered or “naked” access is prohibited. – http://www.finra.org/industry/market-access
[3] Maker/Taker Model – The broker-dealer or member firm is the direct recipient of the net cost of trading with venues. Only institutions with a cost-plus arrangement benefit from the net cost of the exchange maker/taker fee structure.
[4] “SEC Adopts Transaction Fee Pilot for NMS Stocks.” U.S. Securities and Exchange Commission, 19 Dec. 2018, www.sec.gov/news/press-release/2018-298.
[5] “SEC Adopts Rules That Increase Information Brokers Must Provide to Investors on Order Handling.” U.S. Securities and Exchange Commission, 2 Nov. 2018, www.sec.gov/news/press-release/2018-253.
[6] Anand, Amber, et al. “Institutional Order Handling and Broker-Affiliated Trading Venues.” FINRA Office of the Chief Economist, Financial Industry Regulatory Authority, 22 Feb. 2019, www.finra.org/sites/default/files/OCE_WP_jan2019.pdf.

Exo-Brain Power: Humans and Machines Learning Together

By Elliot Noma, Managing Director, Garrett Asset Management

Machine learning is changing the way we trade, and the skills needed to be successful; but the objectives are unchanged.

Humans will not be replaced by computers. As more evidence accumulates, we see the limitations of computers solving problems on their own. Self-driving cars and the replacement of radiologists, for instance, have arrived more slowly than advertised. The legal and ethical issues have also become clearer over time as we struggle with questions of who (or what) is responsible for auto accidents and misdiagnoses of cancer.

The implications of technological advances extend far beyond the recall, precision, and F1 statistics of classification.

This is not a new situation. As early as 1954, Paul Meehl demonstrated that methods such as an equally weighted sum of high school grade points and standardized test scores provided a robust predictor of college performance, superior to a human-powered selection process for college admissions. Despite these findings, machine-dominated selection procedures have not been adopted. There is clearly a gap between the stated goal for assessing outcomes and the true criteria for determining success or failure.
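The mechanical prediction Meehl described is easy to sketch: standardize each predictor and take an equally weighted sum. The applicant numbers below are invented for illustration; only the method comes from the text.

```python
# Meehl-style mechanical prediction: equally weighted sum of standardized
# predictors. The applicant data are made up for illustration.
def zscore(xs):
    """Standardize a list of values to mean 0, standard deviation 1."""
    m = sum(xs) / len(xs)
    sd = (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5
    return [(x - m) / sd for x in xs]

gpas  = [3.9, 3.1, 3.6, 2.8]        # high school grade points
tests = [1450, 1200, 1350, 1100]    # standardized test scores

# Equal weights: just add the two standardized predictors per applicant.
scores = [g + t for g, t in zip(zscore(gpas), zscore(tests))]
ranking = sorted(range(len(scores)), key=lambda i: -scores[i])
print(ranking)  # applicants ordered by combined score, best first
```

The point of the example is the mechanism’s simplicity: no judgment calls, just a fixed formula applied identically to every applicant.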

If the criteria for success are often fuzzy, then how do we make successful applications?

Our use of machine decision-making only makes sense when we consider artificial intelligence (AI) as the latest generation of tools that, as always, revolve around human users. The idea that tools such as language, pencils, and telephones are merely extensions of human intellect leads one to think of humans acquiring “exo-brain” power to accomplish tasks that would otherwise take an excessively long time to complete or tremendous amounts of organization. These tools don’t make the seemingly impossible possible by replacing humans but make our dreams accessible in a human time scale.

AI offers the power to change the way things are done. Humans still need to be in all parts of the process, be it determining whether we want to risk the firm on a large, potentially lucrative trading position, or whether the value of driving to the store on a snowy night exceeds the value of obtaining the item tomorrow.

Current uses of machine learning emphasize applying standard models to new data that can provide alpha; there is less emphasis on applying the latest variant of recurrent neural models for marginal classification gains. Work continues on more efficient ways to tune model parameters, alongside hardware improvements that accelerate the learning process.

However, the marginal improvements gained by applying more exotic models over standard ones come at the cost of increased fragility, and often decrease the number of useful insights the model and its outputs provide.

BEST AREAS OF APPLICATION
The most promising areas of model development are tools and methods to give humans insights when interpreting model outputs. These methods include data visualization and algorithms to restate model results in terms that provide insights to humans.

Data visualization methods range from scatter plots and bar graphs to heat maps and interactive charts that allow one to drill down into the data. Augmented reality might be the next step in these evolving methods.

The path for improving the interpretability of model outputs is less clear. One approach looks at ways to change the classification of images by obscuring parts of the image to show the sensitivity of the classifier in a region of the inputs.

In image recognition, the image is modified and then reclassified: parts of the input are obscured, or characteristics such as colours and shapes are changed. By testing these modified images, one can infer which factors are key to the model’s categorization. This can also be thought of as creating a simpler local model, in the same way that the steepness of a particular climb can characterize the entire route travelled.
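The obscure-and-reclassify idea can be sketched as an occlusion-sensitivity map. This is a minimal illustration, assuming a toy stand-in classifier rather than a real model: a grey patch slides over the input, and the drop in the classifier’s score at each position marks how influential that region is.

```python
import numpy as np

def classify(img: np.ndarray) -> float:
    """Toy stand-in 'model': confidence is the mean intensity of the
    centre 4x4 region (a real model would be a trained classifier)."""
    h, w = img.shape
    return float(img[h//2 - 2:h//2 + 2, w//2 - 2:w//2 + 2].mean())

def occlusion_map(img: np.ndarray, patch: int = 4, fill: float = 0.0) -> np.ndarray:
    """Score drop at each patch position; large drops mark regions the
    classifier depends on."""
    base = classify(img)
    h, w = img.shape
    heat = np.zeros((h - patch + 1, w - patch + 1))
    for y in range(heat.shape[0]):
        for x in range(heat.shape[1]):
            occluded = img.copy()
            occluded[y:y + patch, x:x + patch] = fill  # obscure one region
            heat[y, x] = base - classify(occluded)
    return heat

img = np.ones((12, 12))   # dummy all-white "image"
heat = occlusion_map(img)
```

As expected for this toy classifier, the largest score drops occur where the patch covers the centre region, which is exactly the kind of local sensitivity reading the text describes.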

LIMITATIONS OF CURRENT TECHNIQUES
I find these methods useful, but they fall short on two aspects. First, these techniques make key assumptions about the features that are relevant in model prediction. For instance, in an image, these methods assume that spatially close pixels make up a neighbourhood. However, an alternative is that separate wave forms across the image are neighbours, or that some key regions are neighbours to all points. Only with the assumption that we know which characteristics define the neighbourhood can we then explore a neighbourhood using this method.

This view prevents us from exploiting the true power of these models to discover alternative features. However, alternative ways of viewing the world are what models exploit to improve classification.

Second, we know little about how humans actually make decisions, which raises the question of what counts as an acceptable explanation. Brain mapping, for instance, indicates that we make decisions and only later consciously create a script of why we made them. Without further insights, we are in the dark about how to construct an explanation that works for both the computer and humans.

Regardless of the current best uses of machine learning, the application of models starts and ends with their ability to solve a trading problem. That problem can be: how do I develop and execute ideas before my competitors; how do I efficiently learn about events and news impacting my holdings; or how do I examine terabytes of data to extract key historical patterns?

Just as the use of charting software changed the nature of futures trading from open outcry to technical analysis, machine learning is also changing the way we trade, and the skills needed to be successful. However, the goals remain the same. Our need to control and understand the limitations of the tools is unchanged.

Warning On Brexit Fragmentation

Market participants warned about the UK’s departure from the European Union leading to the fragmentation of both trading liquidity and regulatory data.

Haroun Boucheta, BNP Paribas Securities Services

Haroun Boucheta, head of public affairs, BNP Paribas Securities Services, said financial firms have already spent a lot of time securing access to financial market infrastructures after Brexit. He spoke on a panel at the Association for Financial Markets in Europe’s 12th annual European post-trade conference in London yesterday.

“However it is still not certain that we will have access to all FMIs if there is a no-deal Brexit,” he said.

Boucheta told AFME that the European Securities and Markets Authority’s decision to grant equivalence to UK central counterparties and the central securities depository was welcomed but it is only temporary.

“The equivalence decision for CCPs lasts for 12 months after Brexit and 24 months for the CSD,” he added. “This is temporary so there is still a significant challenge.”

He highlighted Esma’s decision in March that EU firms will have to trade certain shares and derivatives on EU or equivalent venues, even if most liquidity is currently in London.

Nausicaa Delfas, executive director of international at the UK Financial Conduct Authority, warned at the time that this will conflict with the UK’s own share trading obligation.

Nausicaa Delfas, FCA

She said: “This has the potential to cause disruption to market participants and issuers of shares based in both the UK and the EU, in terms of access to liquidity and could result in detriment for client best execution. We have therefore urged further dialogue on this issue in order to minimise risks of disruption in the interests of orderly markets.”

Boucheta added: “The regulatory war over Brexit has already begun which could lead to a fragmentation of liquidity and increased risk.”

He also highlighted that in December last year the EU only extended equivalence for Swiss trading venues for one year, despite Switzerland adopting MiFID II when the EU regulation went live at the start of last year.

“The European Commission adopted a rigid position and the same could happen after Brexit,” said Boucheta.

Data

Andrew Douglas, chief executive of DTCC’s European trade repository and head of government relations, said on a panel that he had spent the last two years opening an office in Dublin in order to operate after Brexit. He explained that half of the repository’s trade reports currently come from the UK, while the other half come from the other 27 EU member states.

“Trade repositories were set up to provide regulators with a single source of data,” said Douglas. “After Brexit we will only have access to half of the data, which is a problem for risk management and market infrastructures.”
