
Equities trading focus : Dark trading : Robert Cranston

Rob Cranston, global head of equities product, Liquidnet

NON-DISPLAYED TRADING AND THE PUZZLE OF THE SWISS.

By Robert Cranston, Head of Equity Product Management, SIX Swiss Exchange.

We all love a puzzle, at least that is what the hundreds of random math questions popping up on LinkedIn have led me to believe. Leading up to the launch of SwissAtMid in October last year, we spent a lot of time looking at the volumes of Swiss stocks traded in non-displayed pools across Europe, and the patterns that emerged left us with another puzzle.

Formal MTF dark pools were launched in Europe, post MiFID I, towards the end of 2009. Since that time there has been a steady increase in non-displayed trading across Europe, up to around 8-9% of overall market flow trading in non-display pools (according to statistics gathered by Rosenblatt Securities). While there is a natural expectation that this growth will slow, especially with the advent of MiFID II, a significant amount of flow is still expected to trade in non-display venues moving forward.

What has come to light is a difference in trading patterns between our market and some others across Europe. The market with by far the largest shift to non-display trading is London, which regularly trades over 12% in non-displayed MTFs. There has also been a larger increase in non-displayed trading in French instruments, with regular peaks of over 10%. This compares to the Swiss market, where we see around 7-8% of flow trading in dark pools (see Fig. 1). Both the growth in dark trading and the peak so far are considerably smaller for Swiss instruments. The question is why?

To find answers, we need to look at why people trade in this way. The easiest answer is that it must be more profitable to trade in the dark for London and Paris stocks versus those from the other markets. What would drive this profitability? To understand this we need to look at the underlying structure of the different markets to try and assess the value of trading in the dark. In this article we look at two key indicators of book quality (pieces of the puzzle) – the average spread and the average volume available at the top of the order-book.

The first piece of the puzzle, the average spread, is a key factor when talking about mid-point trading. The wider the spread, the more value there is in a mid-point execution. However, if this were the only factor driving the decisions, then the Swiss market should expect a much larger percentage of dark trading to take place, as the value of trading at the mid for the main Swiss blue chip index is significantly greater than that of the associated indices from the other European venues. Taking half the spread of a Swiss instrument gives you over 3.6 bps, when a similar trade in a CAC40 instrument would only yield 2.1 bps on average (see Fig. 2). So the puzzle is not solved by spread alone.
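To make the comparison concrete, the half-spread calculation can be sketched in a few lines of Python; the quotes below are hypothetical, chosen only to reproduce spreads of roughly the magnitudes cited:

```python
# Price improvement from a midpoint execution is half the quoted spread.
# Expressing it in basis points of the midpoint makes instruments of
# different price levels comparable.

def half_spread_bps(bid: float, ask: float) -> float:
    """Half the quoted spread, in basis points of the midpoint."""
    mid = (bid + ask) / 2
    return (ask - bid) / 2 / mid * 10_000

# Hypothetical quotes: a Swiss blue chip quoted 250.00/250.18 gives a
# half-spread of about 3.6 bps; a CAC40 name quoted 50.00/50.021
# yields about 2.1 bps.
print(round(half_spread_bps(250.00, 250.18), 1))  # 3.6
print(round(half_spread_bps(50.00, 50.021), 1))   # 2.1
```

The wider the tick-constrained spread, the larger this number, and the more a midpoint fill is worth.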

The next piece of the puzzle is the queue sizes on the lit book, which we assess by looking at the average bid and ask depth for these same indices. This number is very useful as it gives us a view not only of the volumes available at the touch, but also a very strong indication (when looking at highly liquid instruments) of the time it would take for price levels to change and move. The assumption here is that for a highly liquid instrument, a market with considerably more volume available at the touch will see fewer price moves and a more stable market. Here we again start to see some very clear differences in book structure, with the Swiss market having much larger volumes at the touch than the other markets being compared (see Fig. 3).

These two factors work together. A wide average spread, when driven by wider average tick sizes on the Swiss Exchange (due to using FESE table 2), builds additional liquidity at the touch. The puzzle appears to be taking shape.

This structure also has a secondary impact when it comes to non-display trading. The value of non-display trading to participants is not just price improvement but also the ability to hide their intentions from the market. With an order book that is very dense at the touch (in terms of volume), it is easier to trade out of even relatively large positions without impacting the price on the order book. When there is less volume available, trading without moving the market is much more difficult, and drives you to work harder to trade with reduced impact.

With the launch of SwissAtMid last year we are starting to see the difference in non-display volumes between markets narrow. By making it much easier to achieve the considerable price improvement available in Swiss instruments, and housing this directly on the same platform as the largest source of Swiss equity liquidity, we are removing a lot of the risk of trying to trade in the dark today. This makes it easier for many firms to achieve the price improvement available at the mid-point with very little technical change. As you can see from the data (see Fig. 4), the growth is significant, and yet we have not seen a significant impact on the volumes of Swiss non-display trading carried out by the other public dark pools in Europe. We fully expect the non-display volumes in SwissAtMid to continue to grow, but with the advent of MiFID II we won't run out of new pieces to fit into the puzzle anytime soon.

©BestExecution 2017

Market Opinion : Daniel Simpson : JWG

WHAT’S IN A NAME?

Daniel Simpson of JWG explains the important differences between regtech and fintech and the challenges they face.

The terms fintech and regtech are used more and more frequently in the financial services industry. However, they are not by any means used consistently, or indeed necessarily separately. What they most definitely describe is the increasing use of technology in the provision of financial services and the solving of regulatory problems, but beyond that it gets a little fuzzy.

Regtech has been defined as equivalent to, a subset of, and separate from fintech. In fact, the two terms describe things which are similar, but different.

Fintech is the use of technology not just in providing financial services, but in changing the nature of the provision of financial services. It is employing innovative new technologies on a system-wide basis which can blur the lines of financial services and ultimately provide systemic benefits. In this way fintech is a form of creative destruction – it seeks to supplant legacy financial services with greater economies of scale at better points of distribution.

In 2016, Thomas Philippon evaluated The Fintech Opportunity for the National Bureau of Economic Research and concluded that a new regulatory framework was required. He defined fintech as: “Fintech covers digital innovations and technology-enabled business model innovations in the financial sector. Such innovations can disrupt existing industry structures and blur industry boundaries, facilitate strategic disintermediation, revolutionise how existing firms create and deliver products and services, provide new gateways for entrepreneurship, democratise access to financial services, but also create significant privacy, regulatory and law enforcement challenges.”

Regtech is similar, in that it involves the innovative use of technology to change the nature of the way in which something is delivered; however, as already stated, it is also different, in that it has a very different focus. It is the use of technology to change the nature of regulation, in terms of both the compliance and the enforcement systems which combine to define the extent to which policy goals are achieved.

The regulatory jungle

Ultimately regtech should be leveraged to reduce compliance costs for firms and at the same time allow policy makers to better achieve their policy goals. This is in the context of the financial services industry nearing a regulatory crisis point; detailed rules are being written in huge volumes, by regulatory bodies struggling to keep up with their mandates.

The result is requirements that are often poorly contextualised and articulated, are often unlikely to achieve the desired outcomes, and at the same time come with enormous effort and cost.

Regtech has the potential to be the innovative platform that solves this developing crisis, but for it to be successful it could well require revolutionising the way regulation is developed and implemented. The approach to regulatory policy development since the crisis has been tantamount to writing as many rules as possible, throwing them at the financial services industry to see what sticks, then going back to change the ones that didn't and trying again.

A lot doesn’t work, and international standardisation has been paid lip service but has not made it very far. Nearly 60,000 regulatory documents have been published by regulatory bodies in the G20 since 2009. They contain rules that are complex, overlapping, inconsistent and, in some cases, contradictory.

And then you come to how they are implemented: piecemeal, with no realistic attempt to look for interdependencies and subsequent efficiencies. This story rarely ends well, and it is not getting any easier.

The complexity of the overlay and interaction of the many regulatory initiatives makes it increasingly difficult to see the bigger picture and adopt common industry solutions. As it is nearly impossible for one individual to be an expert in all operational disciplines, few have a coherent view of the global banking infrastructure and, as a result, when faced with a series of changes, the markets are starved of the deep expertise needed.

No easy solution

What this means is that implementing a piece of regulation in a considered manner is a virtual impossibility. This huge strain demands new and dynamic approaches. Old strategies of both regulator and regulated taking a project-by-project, sticking-plaster approach are increasingly prone to failure. In this light, regtech stimulated by the regulator has a real opportunity to provide a path to safety.

The regtech definition in the UK Chief Scientific Advisor’s 2015 report is “Technology that encompasses any technological innovation that can be applied to, or used in, regulation, typically to improve efficiency and transparency”. In this context regtech provides the means for overburdened firms to provide regulators with the data they require in a more efficient manner.

What makes this harder, however, is the challenge of encouraging innovation for a solution that, if executed poorly, could land senior managers in jail under the new FCA Senior Managers Regime in the UK. Given the challenges with specifying the requirements, good regtech is only possible if it is addressed thematically, with the necessary time to do so correctly.

Of course, implementing regulation with time and on a thematic basis will not be the experience of many compliance teams. The reasons for this are twofold: the number and nature of the regulations mean that firms don’t have this time, and the way in which firms approach compliance is not conducive to innovation.

Getting this right means making strategic investments not only in the technology but also in understanding the detailed regulatory requirements. Will the regulators provide ‘sign off’ for innovative regtech solutions? Almost certainly not. Will one firm invest in a solution way ahead of its competitors? Rarely. Therefore, to have good regtech one needs good collaboration and robust, scalable solutions that can overcome the challenge of constantly changing requirements, high volumes and the need for high quality.

For regtech to become the solution to the problems of volumes, inconsistency and insufficient infrastructure that engulf the regulatory community would necessitate a revolution of sorts – in the way that rules are developed, in the way that regulatory change is managed and in the way that regulatory change is supplied.

However far off this may seem, it is clearly a preferable path forward for all sides of the regulatory problem compared with the situation as things stand. It is also obvious that for regtech to be the solution to all of the problems discussed here, it needs to be defined and cultivated as something similar to, but altogether distinct from, fintech.

©BestExecution 2017

FIX Trading Community : FIX Orchestra : Tim Healy


BEST EXECUTION – FIX ORCHESTRA.

Tim Healy, Global Marketing and Communications Director, FIX Trading Community.

Anyone who has worked with the FIX Protocol will appreciate that, whilst it is a technical protocol, the human interaction in the configuration, testing and certification of a FIX connection is absolutely vital. The protocol itself was born out of a business need over 20 years ago by traders and technologists with an explicit requirement to make communication between investment firms more efficient, thus reducing errors. Over 20 years later, the FIX Protocol is the de facto messaging protocol for the way the world trades today across all asset classes.

There are many thousands of FIX connections between buyside, sellside, vendors and exchanges. For each FIX connection between these counterparties, there is a time and a cost to market, involving people, potentially across different time zones, exchanging FIX specifications. Once exchanged, calls will be scheduled between the parties to run through different scenarios, and based on these calls and the interpretation of the specifications, configuration work will be carried out. Twenty years on from the original FIX connections, the same onboarding processes are still being used.

FIX Trading Community has always looked at ways to create more efficient processes across a number of workflows, and FIX Orchestra is the latest of these important initiatives. As John Greenan, CEO of Alignment Systems and one of the creators of Orchestra, describes it: “Orchestra is the full stop at the end of FIX. Risks and costs can be cut and time to market and efficiency improved.”

What is FIX Orchestra?

Put simply, FIX Orchestra is a standard for implementing machine-readable rules of engagement. Counterparties can exchange their Orchestra files, either statically or dynamically through network interfaces, then automatically compare their files and discover where there are differences and restrictions. Following this process, configuration of FIX engines, generation of application configuration and code, test case creation and sample messages can be created without human intervention. In effect, a whole testing lifecycle can be undertaken quickly and in an efficient, cost-effective manner. This will resonate with the original demands behind the FIX Protocol itself and the continued need to be cost conscious in a tight margin environment.
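The comparison step can be illustrated with a simplified sketch. Real Orchestra files are XML and far richer than this; the plain dictionaries and field selections below are stand-ins, not the actual Orchestra schema:

```python
# Sketch of the automated comparison step: each side publishes a
# machine-readable description of the messages and fields it supports,
# and a program reports the differences. Real Orchestra files are XML;
# plain dicts stand in for them here.

def compare_rules(ours: dict, theirs: dict) -> list:
    """Return human-readable differences between two rule sets."""
    diffs = []
    for msg, fields in ours.items():
        if msg not in theirs:
            diffs.append(f"counterparty does not support message {msg}")
            continue
        missing = set(fields) - set(theirs[msg])
        for f in sorted(missing):
            diffs.append(f"{msg}: counterparty does not support field {f}")
    return diffs

ours = {"NewOrderSingle": ["ClOrdID", "Side", "OrderQty", "StopPx"]}
theirs = {"NewOrderSingle": ["ClOrdID", "Side", "OrderQty"]}
print(compare_rules(ours, theirs))
```

Once the differences are known mechanically, downstream steps such as engine configuration and test-case generation can consume the same files without a human re-reading a PDF specification.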

What is the business case?

How can a firm take advantage of this? A CTO is charged with holding, developing, articulating and continually evolving the company’s strategic technical direction, and with making sure the company continues to have the best technology offering in a dynamically-evolving, highly-competitive space – a space where FIX is an integral and vital component. It’s a challenging role. Importantly, no changes will need to be made to a company’s FIX engine, which lessens the financial impact. FIX Orchestra is metadata about a specific implementation of FIX. The benefits for a business derive from the cost savings and efficiencies that allow a company to set up FIX connections promptly with less manual human intervention – a quicker set-up with significantly reduced opportunities for errors.

Eugene Budovsky, Director, Autobahn Equity, Deutsche Bank AG, Australia & New Zealand, commented, “A key challenge for my business is rolling out new capabilities to client front-ends, in particular the upgrading of algo tickets. My clients use a large variety of vendor systems, and while the FIX Algorithmic Trading Definition Language (FIXatdl), for specifying algorithmic trading user interfaces, goes some way towards simplifying the process, there is invariably a large number of people (and associated cost) involved in the process. The task can be so daunting that sales people are often discouraged from offering ticket upgrades to clients, and multi-vendor global ticket upgrades are a massive undertaking. FIX Orchestra has the potential to automate these types of processes, and I am eager to see it deliver as it would significantly improve my ability to roll new offerings out to clients.”

Jim Northey of Itiviti, and one of FIX’s Global Technical Committee Co-chairs observes that “The strength of FIX is its flexibility and versatility, which was vital to its success and widespread adoption. The weakness of FIX is the variance across usage. Let’s face it, as one of the most successful industry standards in finance, FIX has some of the worst written specifications. What FIX Orchestra does is to create a formal specification mechanism for FIX. As the FIX community begins to enhance their systems to automatically consume FIX Orchestra files, and as we in the FIX Global Technical Committee start to publish FIX Orchestra files for each best practice, we will start to see the main drawback, variability across service offering, become one of our strengths and values.”

Brandon DeCoster of FIX Flyer commented: “Today, we write PDF documents. Then humans read them, (mis)interpret them, implement them, and test them: all by hand. The future is defining Rules of Engagement. Period. Technology will generate the PDFs, the translations, the integrations, the tests, the certifications. The Rules of Engagement will be the authoritative source, the PDF an artifact.”

 

Diving into the details

FIX Orchestra Release Candidate 1 is available today. FIX Orchestra is not so much something new as a significant evolution of the FIX Unified Repository, which has been adopted by a few major market participants to manage their specifications, in addition to being the basis for automated generation of FIXimate and the FIX specification. Release Candidate 1 permits much more than the definition of fields, messages, components and enumerations (code values). If this were all that Orchestra provided, there would be little value over what is available today. FIX Orchestra moves beyond message definition to permit the specification of scenarios, a scenario being a specific usage of a message. A single FIX message is used for many purposes: the Execution Report serves as an order acknowledgement, fill notification, partial fill, done-for-day and order status message. With Orchestra, you can define each of these scenarios and define the context in which each scenario will occur, via FIX Orchestra’s ability to model system state in a formal way. For instance, you can model the state of an OMS or an execution venue and specify the response to various states.
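As a rough illustration of the scenario idea, the sketch below distinguishes Execution Report usages by their ExecType (tag 150) and OrdStatus (tag 39) values; the scenario names and the classification helper are illustrative, not Orchestra's normative definitions:

```python
# One FIX message, many scenarios: the Execution Report (MsgType 8) is
# distinguished by its ExecType (tag 150) and OrdStatus (tag 39) values.
# A "scenario" names one such usage. These selections are illustrative,
# not the normative Orchestra definitions.

EXEC_REPORT_SCENARIOS = {
    # scenario name: (ExecType 150, OrdStatus 39)
    "order_ack":    ("0", "0"),   # New
    "partial_fill": ("F", "1"),   # Trade / Partially filled
    "fill":         ("F", "2"),   # Trade / Filled
    "done_for_day": ("3", "3"),
    "order_status": ("I", None),  # OrdStatus depends on the order's state
}

def classify(exec_type: str, ord_status: str) -> str:
    """Name the scenario an incoming Execution Report belongs to."""
    for name, (et, os_) in EXEC_REPORT_SCENARIOS.items():
        if et == exec_type and (os_ is None or os_ == ord_status):
            return name
    return "unknown"

print(classify("F", "1"))  # partial_fill
```

An Orchestra file goes further, attaching each scenario to the system state in which it may occur, so that test cases and engine configuration can be generated per scenario rather than per message.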

Not just for tag-value FIX

Adopters of FIX Orchestra can do as little as create a machine-readable inventory of their message formats, or ambitious service providers who see the full value of automation can integrate FIX Orchestra within the core of their gateways, order routers, order management systems and execution management systems. As we live in a world where most venues now provide proprietary interfaces in addition to FIX, and as FIX now has seven different message encoding formats and multiple FIX session layers, it was important that FIX Orchestra supported more than tag-value FIX only. In fact, an early version of FIX Orchestra was used within the Association of National Numbering Agencies (ANNA) Derivative Service Bureau project to generate JavaScript Object Notation (JSON) Schemas.

Next steps

The development of Release Candidate 2 is underway. Included in Release Candidate 2 is an expression language for specifying context and conditional rules. A domain-specific language was selected to improve readability and usability versus arcane and difficult-to-read XML-based rule languages. The grammar for the FIX Orchestra expression language is maintained by the FIX Trading Community and is available under the Apache 2.0 license via GitHub. Release Candidate 2 will also include support for FIXatdl, to address a common issue faced by broker-dealers and their customers: the inability to manage FIXatdl definitions alongside their service offerings.

Integration with global standards

Members of the FIX community are participating in creating the next version of the ISO 20022 standard. The capabilities being defined within FIX Orchestra are being included in the work for ISO 20022. This integration is essential for the future of the FIX Protocol.

Getting involved

Sufficient functionality exists within Release Candidate 1 that vendors and bespoke implementers should be adopting and using the standard now. What is needed are early adopters, who will be rewarded by being able to drive and mature the standard. Please join us.

©BestExecution 2017

Equities trading focus : Electronic block trading : Dr. Robert Barnes


DELIVERING BEST EXECUTION.

Partnership and innovation deliver Turquoise as the electronic block trading destination that provides the best results on a continuous basis – the very definition of best execution. By Dr Robert Barnes, CEO at Turquoise.

Designed in Europe and refined in partnership with the buyside and sellside, Turquoise Plato Block Discovery™ is a Large In Scale (LIS) electronic execution channel that works. From July 2016, and every month since, Turquoise Plato Block Discovery™ has registered new activity level records above the extraordinary June 2016 volumes coincident with the UK Referendum.

Turquoise Plato Block Discovery™ set a new monthly record by value traded in March 2017 of €2.64bn, an 11% increase over the previous record of €2.38bn set in February 2017. March 2017’s value traded was also more than 13x the €0.20bn traded in March 2016. Turquoise Plato Block Discovery™ has also matched €15.9bn between its 20 October 2014 start and 31 March 2017, of which more than €11.8bn, or 74%, has matched since September 2016 (see Fig. 1).1

High firm up rates

Turquoise Plato Block Discovery™ has more than two years’ worth of empirical measurements evidencing consistently high firm-up rates. These rates result from robust automated reputational scoring, which measures the difference between the original block indication and the subsequent firm order. Since its launch, more than 95% of order submission requests (OSRs) have resulted in firm orders into Turquoise Plato Uncross™ within the specified time window of under half a second. In less than 5% of OSRs did the respondent fail to come back within the prerequisite core time and size parameters. The key insight is that the vast majority of Turquoise Plato Block Discovery™ OSRs successfully firm up.
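The scoring mechanics are the venue's own, but the principle can be sketched as follows; the half-second window matches the figure above, while the size tolerance and helper names are hypothetical:

```python
# Sketch of reputational scoring: compare each firm order against the
# block indication (BI) it followed. The size tolerance and function
# names are illustrative parameters, not the venue's actual rules.

def firmed_up(bi_qty: int, firm_qty: int, response_ms: float,
              max_ms: float = 500, min_size_ratio: float = 0.9) -> bool:
    """Did the firm order arrive in time and at sufficient size?"""
    return response_ms <= max_ms and firm_qty >= bi_qty * min_size_ratio

def firm_up_rate(events: list) -> float:
    """Fraction of order submission requests that firmed up."""
    ok = sum(firmed_up(bi, firm, ms) for bi, firm, ms in events)
    return ok / len(events)

events = [(100_000, 100_000, 120.0),   # firmed in time, full size
          (50_000, 50_000, 310.0),     # firmed in time
          (80_000, 10_000, 200.0)]     # came back far too small
print(round(firm_up_rate(events), 2))  # 0.67
```

Participants whose firm orders routinely diverge from their indications would see their reputational score fall, which is what keeps the published firm-up rate above 95%.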

Low reversion = positive differentiation

Turquoise Plato Block Discovery™ Block Indications firm into Turquoise Plato Uncross™ for execution. Turquoise Plato Uncross™ is an innovation that provides randomised uncrossings during the trading day, ideal for larger and less time-sensitive passive orders.

Turquoise Plato Uncross™ has been the subject of multiple studies since 2013 by LiquidMetrix, the independent analytics firm that specialises in venue performance metrics and execution quality analysis,2 and their analysis shows that for Turquoise Plato Uncross™, after 1 second following execution, the PBBO mid reference price has changed on less than 10% of occasions. This compares well with 50% for continuous dark pools.

According to the report’s authors, Dr Darren Toulson and Sabine Toulson: “Turquoise Plato Uncross™ matching happens at times less correlated with market movements and within the EBBO more readily than continuous dark book executions across all relevant venues. From an execution point of view, large orders left resting on Turquoise Plato Uncross™ are less susceptible to gaming or adverse selection than orders left on other continuously matching MTF Dark pools. The proportion of Turquoise Plato Uncross™ trades executing outside the prevailing EBBO is lower (= better) than continuous dark book trades.”

Higher trade sizes

Turquoise Plato Block Discovery™ is a broker-neutral mechanism for executing anonymous block orders above 100% of LIS thresholds. Today, orders entered are eligible to participate if above 25% of LIS. ESMA sets current LIS thresholds relative to an Average Daily Turnover (ADT) metric that segments securities into one of five bands, each of which sets a security’s respective LIS threshold.3 The differentiated profile of Turquoise Plato Block Discovery™ remained resilient during March 2017 – 59% of all value traded that month was greater than 100% of the ESMA LIS threshold.

Key observations from Table 1:

  • Where traded activity is above LIS, the average trade size is 1.9x to 2.9x the LIS threshold across all respective ESMA Bands of liquidity.
  • All of the Bands have recorded a maximum successful single trade size well over €1m, the largest to date being €13.1m (this was a single trade in Eni, the global energy company with a primary listing on Borsa Italiana).

For the ESMA Band 5 blue chip securities that can trade within the single Open Access Turquoise order book, Turquoise Plato Block Discovery™ trades above ESMA LIS consistently average more than €1m per trade, and are more than 100x larger than the €10k average trade sizes arising from continuous dark pool activity.
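The eligibility rule described above lends itself to a short sketch; the per-band threshold values here are illustrative placeholders rather than ESMA's published figures:

```python
# Sketch of the LIS eligibility check: a security's ADT band fixes its
# LIS threshold, and orders may participate from 25% of that threshold
# upwards. Band thresholds below are illustrative placeholders, not
# ESMA's published figures.

LIS_BY_BAND = {1: 50_000, 2: 100_000, 3: 250_000, 4: 400_000, 5: 500_000}  # EUR

def eligible(order_value_eur: float, band: int,
             participation_ratio: float = 0.25) -> bool:
    """May this order participate in block discovery?"""
    return order_value_eur >= LIS_BY_BAND[band] * participation_ratio

print(eligible(150_000, 5))  # True: above 25% of the Band 5 threshold
print(eligible(100_000, 5))  # False: below that minimum
```

Trades that print above 100% of the band threshold are the ones counted in the above-LIS statistics quoted in the article.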

Why this matters

  • Because orders with size above ESMA LIS received by a venue can under MiFID II continue to trade with LIS pre-trade transparency waiver at Midpoint and save half the bid offer spread (= lower implicit costs).
  • Because larger trades that match with minimal market impact and lower implicit costs contribute to long term investment returns.
  • Because it shows Turquoise Plato Block Discovery™ is a working LIS electronic execution channel.

Helping investors

Turquoise Plato Block Discovery™ helps investors get their business done by offering additional electronic execution channels. For example, looking at trading of UK company NEXT plc on 9 March 2017 (see Fig. 2), the top 5 trades were via Turquoise Plato Block Discovery™, representing a 19% share of trading on that day (and 32% for Turquoise overall).

The intraday chart, enriched by trade size, shows NEXT plc trading on multiple trading platforms, including LSE, the electronic order book of its primary listing exchange, London Stock Exchange, and Turquoise. Turquoise trade prices are similar to those of the primary Stock Exchange, and Turquoise trade sizes serve both the smallest and largest orders through its single connection for straight-through processing, from trading in London to settlement into Euroclear UK & Ireland. Turquoise not only enabled investors to access a third of the order book value traded that day, but also matched the very largest trades of all venues during the day via its award-winning electronic block trading innovation, Turquoise Plato Block Discovery™.4

A platform you can trust

Surveillance of Turquoise activity, including monitoring of price movements ahead of Turquoise Plato Uncross™, is undertaken by the independent London Stock Exchange Group Surveillance team. Any suspected manipulation of the Reference Price will be referred to the UK securities regulator, the FCA.

A combination of robust automated reputational scoring, independent quantitative analysis evidencing the quality of the LIS trading mechanism, and an independent surveillance oversight adds to the integrity of Turquoise Plato Block Discovery™ for the matching of undisclosed Block Indications that execute in Turquoise Plato Uncross™.

Footnotes:

  1. Sep 2016: www.bestexecution.net/announcement-plato-partnership-enters-co-operation-agreement-turquoise/
  2. LiquidMetrix Turquoise Plato Uncross™ – Execution Quality Analysis by Dr Sabine Toulson & Dr Darren Toulson, 13 Nov 2015, plus further analysis Nov 2016.
  3. Source: COMMISSION REGULATION (EC) No 1287/2006 of 10 Aug 2006 implementing Directive 2004/39/EC: www.esma.europa.eu/system/files/Reg_1287_2006.pdf
  4. Turquoise won 2015 Financial News Award for Excellence in Trading and Technology: Most Innovative Trading Product / Service.


©BestExecution 2017

Equities trading focus : Technology : Joseph Hipps


TOWARDS ALPHA PRESERVATION.

Investors could reap significant benefits from technology that brings realised returns into closer alignment with expected returns. Joseph Hipps, head of client service and support at Trade Informatics, explains.

In the early days of telecommunications, the clarity of voice transmission was often eroded by the physical distance between callers, to the extent that long-distance calls were sometimes barely even audible. Advances in the technology that supports the phone system have ironed out those kinks over time, and users can now communicate much more easily across continents with little deterioration in quality. A similar kind of transformation is underway in asset management and order execution.

Many investors currently face a frustrating situation in which their expected paper returns greatly exceed their realised returns, largely because of the costs incurred from slippage of information during the order handling and execution process. Investors should be empowered to recoup those costs, and technology can be used to minimise the loss of returns.

Consider the technology stack associated with a typical client trade today. At the top of the stack is the portfolio management system, into which a portfolio manager or investment analyst enters the details of a transaction. At this point, the portfolio manager has a trade idea, usually driven by a clearly defined alpha generation strategy, and is still calling the shots.

But when the trade idea passes to the buyside trading desk, it flows into an order management system, and the alpha profile of the manager is largely lost. Then, when the trade moves into an execution management system, its key characteristics are rewritten into FIX messages so that they can be clearly understood by the sellside. While the messaging is highly effective, the original decisions that accompanied the trade idea have by this point been completely lost.

Quantifying lost returns that may be caused by the passing of an order from one investment agent or system to another is not an exact science and will naturally vary depending on the size of the order and the nature of the parties involved. But there is no doubt that ‘transmission decay’ imposes a significant cost on the end investor.

Much like the early users of telephones who conceded to an inevitable loss of quality when trying to make contact with faraway relatives, investors may well feel there is no way to avoid this transmission decay. If they want to execute an order using available technology, the order has to pass between multiple agents and systems, and that in turn will have some unavoidable impact on the bottom line.

But there is no need to admit defeat so quickly. While it may not yet be widely practised, it is possible to apply a technology wrapper around all three systems that preserves the alpha profile of the order as it passes between agents, and continually assesses the trade-off between short-term return and incremental impact.

This kind of alpha preservation solution can be tailored to meet the needs of end investors. An active fund manager that has powerful market intelligence, for example, might need a very aggressive logic that could quickly adapt to changes in market conditions, while a more passive strategy might extend the duration of orders with much less intervention.

In practical terms, an order is paired to a specific alpha profile at the outset. This alpha profile is then used to parameterise the order when it moves into the order management system, and those parameters are in turn used to create FIX tags that can be recognised by the execution management system and the sellside. This means that the link between the original profile and the execution strategy is never severed.
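As a purely illustrative sketch of the pairing described above – the profile fields, tag numbers and encoding below are all hypothetical, since user-defined FIX tags must in practice be agreed bilaterally with each counterparty – the mapping from an alpha profile to FIX tags might look something like this:

```python
# Hypothetical sketch: carrying an alpha profile through to FIX tags.
# Field names, tag numbers and values are invented for illustration.

def profile_to_fix_tags(profile):
    """Map an alpha-profile dict onto user-defined FIX tags."""
    tag_map = {
        "alpha_horizon_mins": 20001,   # expected alpha decay horizon
        "urgency": 20002,              # e.g. 1 (passive) .. 5 (aggressive)
        "max_participation": 20003,    # cap on percentage of volume
    }
    # FIX tag values are transmitted as strings; unknown fields are dropped
    return {tag_map[k]: str(v) for k, v in profile.items() if k in tag_map}

profile = {"alpha_horizon_mins": 90, "urgency": 4, "max_participation": 0.15}
tags = profile_to_fix_tags(profile)
```

Because the same dictionary of tags travels with the order into the execution management system, the link between profile and strategy survives each hand-off.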

Without this alpha preservation solution in place, portfolio managers can be as prescriptive as they like in the early stages, but ultimately they have to accept that their orders will be shoe-horned into off-the-shelf execution strategies that may result in drastically different returns from those that had been calculated and expected.

There are inevitable challenges in realising effective alpha preservation, not least the fact that buyside firms will not want to share all of their historical data with counterparties, particularly if there is any competitive risk for either side. Alpha profiles have to be designed and delivered by independent third parties that have access to the necessary data and expertise, in order to map diverse execution strategies into each of the three systems.

The barriers to entry for providers of this technology are high. Effective delivery depends on a sophisticated understanding of the investment technology stack, the execution process itself, and the alpha profile of individual investors. Dealing with the diverse characteristics of multiple order management systems and being able to create a synthetic language that interacts with each one is also a prerequisite.

Ultimately, mapping the proprietary language of each of these systems into FIX messaging so that the alpha profile is not lost when it crosses to the sellside is the key to alpha preservation, with the overriding objective being to identify, retain and make use of portfolio decisions throughout implementation.

This nascent technology has the potential to transform the investment management process in the same way that fibre optics revolutionised telecommunications and vastly improved the user experience. If investors are really to close the gap between realised and expected returns, they need to embed alpha preservation at the heart of their operations.


©BestExecution 2017

Equities trading focus : Reference data : Peter Moss

Peter Moss, SmartStream RDU

OUT OF THE WINGS AND INTO THE LIMELIGHT.

Peter Moss, CEO of The SmartStream Reference Data Utility (RDU), looks at the reference data challenge posed by MiFID II/MiFIR.

The financial industry is bracing itself for the introduction, early next year, of MiFID II/MiFIR. Under the incoming rules, reference data will play an increasingly important part. There are a number of reasons underlying this development.

MiFID II/MiFIR extends regulatory focus from shares to a much wider range of instruments. It also brings in new product registration obligations, plus an extensive suite of reporting requirements covering pre-trade price transparency, post-trade and transaction reporting. Critically, much of the incoming regulation pertaining to transparency and reporting will require decisions to be made at individual instrument level, using decision trees and information relating specifically to each financial product. It will make access to accurate, targeted reference data more vital than ever.

Firstly, the upcoming rules will oblige financial institutions to inform their National Competent Authority (NCA) of any new product they wish to trade. The NCA, in turn, will forward information to the European Securities and Markets Authority (ESMA) for publication. Products will need to be registered with financial authorities, along with all relevant reference data. Additionally, before any new instrument can be traded, it will need to be set up – with the associated reference data – in a company’s security master.

Pre-trade price transparency is likely to prove another pain point for the industry and particularly for systematic internalisers. Regulated markets (RM), multilateral trading facilities (MTF), organised trading facilities (OTF) and systematic internalisers (SI) will be covered by the new pre-trade price transparency provisions. They will have to advertise prices in advance and then guarantee to trade at that price, unless a waiver can be shown to apply. Reference data will be needed to prove that a waiver should operate. The trading venues (RM, MTF and OTF) have always advertised prices quite transparently – systematic internalisers have not. Systematic internalisers could be in for a tough time as they are going to need large quantities of reference data in order to provide a mechanism for pre-trade price transparency. Accessing all the relevant information could prove hugely testing.

Implementing the post-trade reporting provisions set out by MiFID II/MiFIR may cause firms other reference data-related headaches. For a start, trades will have to be published within fifteen minutes of a transaction occurring. A set of instrument reference data will be needed to define the trade, while reference data will be necessary to determine which party should report the trade, as well as to establish whether reporting should be deferred.

Finally, transactions will have to be reported to authorities within twenty-four hours of them taking place. Firms will need to provide a host of detailed information – primarily counterparty data in addition to the details of what was traded.

So how will firms meet their reporting obligations? The author believes that almost all organisations will use an approved publication arrangement (APA) and an approved reporting mechanism (ARM). Systematic internalisers will need access to reference data directly as well, in order to handle their broader obligations. This can be sourced directly, through vendors or from a utility like The SmartStream Reference Data Utility (RDU). Firms accessing reference data directly need to be confident that their internal teams can deliver. Those using a data vendor or a utility need to carefully track time to market, too.

Fulfilling MiFID II/MiFIR requirements will oblige firms to gather information from multiple sources, including trading venues and a number of regulatory bodies such as ESMA, the Association of National Numbering Agencies (ANNA) and the Global Legal Identifier Foundation (GLEIF). There is, however, some difficulty concerning the data to be supplied by certain regulatory authorities.

It is, unfortunately, not yet clear what form the data to be provided by ESMA and ANNA is going to take nor when, exactly, this information is likely to become available. Neither is it entirely apparent as to where certain details should be obtained from. For example, if companies want to find out which systematic internalisers are dealing in a specific ISIN, to whom should they turn? Will ESMA publish this information or will another body do so? We still do not know. These and other unanswered questions mean that a very fluid situation is likely to persist until the end of the year.

Not surprisingly, the lack of clarity as to how ESMA and ANNA data will eventually be finalised and published is a cause for anxiety among financial institutions. Firms are especially worried that delays in data availability might lead to a hurried final phase of implementation late in 2017.

And financial institutions are right to be concerned. While regulators may be lenient for the first six months of 2018 – provided they see that best endeavours have been made – they do have the power to stop firms from trading and it is possible, therefore, that after an initial period of grace, they will put these powers into action, should they see institutions failing to comply with MiFID II/MiFIR obligations.

A financial utility which specialises in processing reference data, such as the RDU, can assist companies to meet the demands of MiFID II/MiFIR. The RDU normalises and aggregates information from multiple sources, e.g. trading venues and data vendors. It removes much of the “heavy lifting work” associated with processing the subtleties and complexities of huge volumes of raw data, simplifying and enriching where necessary, in order to create easily digestible sets of data.
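By way of illustration only – the field names, source precedence and records below are invented – the core of this kind of normalisation can be thought of as building a “golden record” per instrument, taking each field from the most trusted source that supplies it:

```python
# Hypothetical sketch of reference-data normalisation. Source names,
# precedence order, fields and values are all invented for illustration.

PRECEDENCE = ["trading_venue", "vendor_a", "vendor_b"]  # most trusted first

def golden_record(records):
    """Merge per-source instrument records field by field, taking each
    field from the most trusted source that supplies it."""
    merged = {}
    for source in reversed(PRECEDENCE):         # least trusted first,
        merged.update(records.get(source, {}))  # so trusted values win
    return merged

records = {
    "vendor_a":      {"isin": "CH0000000000", "currency": "CHF"},
    "trading_venue": {"isin": "CH0000000000", "tick_size": 0.01},
}
instrument = golden_record(records)
# fields from both sources are combined, with venue values taking priority
```

Real utilities layer validation, enrichment and exception handling on top of this, but the precedence-merge idea is the heart of the aggregation step.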

The RDU is also ideally suited to tackling the type of large files likely to be published by regulatory authorities. ESMA data may well be supplied in a raw form and require significant enrichment before it can become usable, and the files published by ESMA could also turn out to be extremely large – with millions of instruments – potentially making processing work even more arduous. Furthermore, the data may need to be supplemented from additional sources such as trading calendars and ISIN/SI mappings.

For systematic internalisers in particular, which will need quantities of focussed data relating specifically to the set of instruments they have chosen to make a market in, access to the sort of highly targeted reference data the RDU can provide should prove critical. The service provided by the utility can also relieve them of a weighty processing burden.

MiFID II/MiFIR looks set, in conclusion, to bring plenty of change. The RDU can, the author believes, play a pivotal role in helping companies to meet their obligations. Solely dedicated to processing reference data, and employing industry best practice, the RDU provides an alternative to in-house development or to third party vendor solutions. Importantly, as a utility, it has the potential to save scores of individual institutions from replicating the same effort, providing firms across the industry with a cost-effective means of responding to the reference data demands generated by MiFID II and MiFIR.

©BestExecution 2017

Equities trading focus : Block trading : Gareth Exton

Gareth Exton, Head of Liquidnet’s Execution + Quantitative Services EMEA

WHAT IS THE VALUE OF A BLOCK?

Gareth Exton, EMEA Head of Execution Consulting at Liquidnet, describes how it’s possible to measure and prepare for the new trading environment.

Why has there been a growth in block trading in the dark markets recently?

Two very distinct events have led to a growth in block trading over recent years. One is the MiFID II regulations coming into force in January 2018, bringing with them the move towards large-in-scale (LIS) trading, which has certainly focused minds on the way people handle their orders. In addition, MiFID II is forcing trading firms to think carefully about their workflows – for example: do I always look for block trades at the start of my order, do I go outbound to all conditional venues, or do I want to continue to trade on a schedule?

The second reason was Brexit in June 2016, when we saw a much higher number of block trades being done in the days immediately after the vote. I think a lot of people fell in love again with block trading as they could trade in and out of large positions in a short space of time, with very little price and market impact.

Do you think this trend towards trading in blocks will continue?

Most definitely, because a significant number of venues are entering the market offering a new, automated, conditional order type and a focus on ensuring trades are over the LIS waiver threshold. Liquidnet is already future-proofed for this new liquidity landscape; currently 94% of our trades are above the LIS threshold, but there are a lot of new initiatives in the market such as the UBS MTF offering a large-in-scale preferencing service or Euronext announcing their block trading MTF. And of course, there is also Turquoise Plato BDS and BATS LIS, which are growing in size and highlighting a clear direction of travel towards more conditional LIS orders.

How does this move towards more block trading benefit the end customer?

Generally, end customers should get their orders done quicker and see improved performance. If you’re looking for a block, you tend to do that at the start of an order, either via a manual workflow or via a conditional order that is part of an algo. This means you should get closer to your arrival price, which has become the primary benchmark for a lot of trading desks. Furthermore, as conditional orders are executed in the dark, they don’t cause the impact that these large orders would do if traded in the lit.
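Since arrival price is cited as the primary benchmark, it may help to sketch how slippage against arrival is commonly expressed in basis points – a minimal illustration, not a full implementation-shortfall model:

```python
def arrival_slippage_bps(side, arrival_price, avg_exec_price):
    """Slippage vs the arrival-price benchmark, in basis points.
    Positive = cost: paying up on a buy, or selling down on a sell."""
    sign = 1 if side == "buy" else -1
    return sign * (avg_exec_price - arrival_price) / arrival_price * 1e4

# A buy arriving at 100.00 that fills at an average of 100.05
# has paid roughly 5 bps of slippage against arrival:
cost = arrival_slippage_bps("buy", 100.00, 100.05)
```

The closer a block fill occurs to the start of the order, the less time the market has to drift away from this benchmark.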

What changes will this bring?

The big change I think people will see is that the fill profile of their orders will change significantly. In Europe we are very used to people trading along a fairly predictable volume curve. This has meant it has been easier to predict how long it would take for an order to trade, based on a stock’s typical volume curve and how much liquidity it tends to have in certain venues.
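Trading along a volume curve can be sketched very simply – the U-shaped intraday profile below is illustrative, not real data:

```python
# Slicing an order along an assumed intraday volume curve.
# The U-shaped profile (heavy at the open and close) is invented
# for illustration; each entry is a fraction of daily volume, summing to 1.

volume_profile = [0.12, 0.09, 0.07, 0.06, 0.06, 0.06,
                  0.07, 0.08, 0.09, 0.10, 0.20]

def schedule(order_qty, profile):
    """Target quantity per time bin, proportional to expected volume."""
    return [round(order_qty * frac) for frac in profile]

slices = schedule(100_000, volume_profile)  # heaviest at open and close
```

It is precisely this kind of predictable, profile-driven schedule that conditional block fills disrupt, as described below.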

This predictability will be blown out of the water with the rise of the conditional order type, the removal of BCN liquidity and the move towards systematic internalisers. The liquidity profile of a stock will completely change, which means smart order routers will need to be re-configured and the way that algos work will need to be adapted. This will result in more variability in an order’s duration and performance versus different benchmarks. It will be important, therefore, to measure the value of block liquidity when assessing execution performance, often placing a higher value on liquidity than on an average market price. Managing that variability in both execution experience and performance is going to be key.

So how do you track and measure the value of a block?

We will need new metrics in place for valuing the percentage of average daily volume that is getting done, the price reversion around the time of that trade, and how liquidity is valued. All this comes down to how you’re measuring your fund performance and the style of trading that you are doing.

There will be much more variability in how you track different styles of trading. At the moment, a lot of people have some standard benchmarks that they use across all of their trading and brokers. In future, I think we will see a move to more granularity, with the performance of these types of orders monitored in different ways from standard flow trading (POV / VWAP / Close). This kind of block / LIS-style trading will have to be looked at separately, with the impact you are having around the time of the trade, and the liquidity you are capturing, being key to the way you value that block liquidity.

What should asset managers be doing to prepare for these changes?

One important area is to look at the best execution policy that you have in place, and how you factor in block trading into your workflow. Best execution policies post MiFID II will need to be quite specific about how you take into account the seven ESMA factors and how you build your workflow to reflect those. For example, for large percentage ADV orders you might state that your first port of call is a block trading venue in order to minimise information leakage and capture available liquidity, then move on to conditional venues and finally start trading via an algo. Every firm’s trading workflow will be different based on the type of orders they receive but all firms need to consider these issues in designing their new best execution policies.

What is Liquidnet doing to help?

Block liquidity is at the heart of what we do. Our dark pool will be fully compliant and ready to go on 3rd January 2018 when the new MiFID II rules come into play. Our members are looking for that large-in-scale block at the start, and their algo orders rest fully conditionally in the Liquidnet pool.

To help with the quest for liquidity, we also offer conditional access to BATS/BIDS and Turquoise BDS in the full size of your order, from the moment the order starts to the time the order ends. We are also enhancing our TCA consultancy service to factor in block trading and show the value placed on block liquidity, and adding a specific section to our TCA product that looks at conditional orders and the performance around your block orders.

In terms of venue analysis, we have built out a Venue Ranking Model, taking into account average order size, large-in-scale, and the way conditional venues work in terms of fading and behavioral score. All these elements are taken into account when our smart order router is configured. This information will be available to our members and their clients, to ensure we offer full transparency about what’s happening to their orders, what liquidity they’re capturing and at what price.
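A venue ranking of this kind can be sketched as a weighted score – the factors echo those named above, but the weights, normalised inputs and venues here are entirely invented:

```python
# Illustrative venue-ranking sketch. Factor weights, the [0, 1]
# normalisation and the venue metrics are assumptions, not real data.

WEIGHTS = {"avg_order_size": 0.3, "lis_rate": 0.3,
           "fill_rate_after_firm_up": 0.2, "behavioural_score": 0.2}

def venue_score(metrics):
    """Weighted score in [0, 1]; higher ranks the venue better.
    Each metric is assumed pre-normalised to [0, 1]."""
    return sum(WEIGHTS[k] * metrics[k] for k in WEIGHTS)

venues = {
    "venue_a": {"avg_order_size": 0.9, "lis_rate": 0.8,
                "fill_rate_after_firm_up": 0.7, "behavioural_score": 0.9},
    "venue_b": {"avg_order_size": 0.4, "lis_rate": 0.3,
                "fill_rate_after_firm_up": 0.9, "behavioural_score": 0.6},
}
ranked = sorted(venues, key=lambda v: venue_score(venues[v]), reverse=True)
```

A smart order router configured from such scores would preference venue_a first; penalising fade behaviour simply means giving that factor a larger weight.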

With MiFID II on the horizon, we feel that we have a responsibility to provide as much information as we can and be as transparent as possible to our members on how we are handling orders. We owe it to our members to prove that we are trading in their best interests at all times, and that our solutions are configured in such a way that they are compliant with the best execution rules and our own policies. We have invested a lot of time to ensure that this information is available to members and being explained, so that we can help them as much as possible to capture the liquidity that they are seeking.

©BestExecution 2017

Equities trading focus : ETFs in South Africa : Donna M. Nemer

Donna Nemer

THE EMERGING ETF TREND.

Donna M. Nemer, Director, Capital Markets and Group Strategy at the Johannesburg Stock Exchange (JSE) explains how the product is being adopted in South Africa and its neighbours.

At the end of 2016, assets invested in the global ETF/ETP industry were US$530 billion larger than the assets invested in the global hedge fund industry (source: HFR). What are the drivers behind this growth?

Investors typically seek investment products which are liquid and cost effective. ETFs are essentially much lighter on fees and afford investors a much more flexible and creative product that enables higher net returns. They track the market or index, and evidence has shown few actively managed funds can outperform the market consistently. In addition, they are cost effective in that investors can purchase a diversified basket at a fraction of the cost of buying each underlying stock individually. ETFs are also known for their transparency, assuring investors of the exact underlying exposure they are receiving, and they are tax beneficial: any purchase of ETFs listed on the JSE exempts the buyer from Securities Transfer Tax (STT).

Is this trend replicated in emerging markets? What are the trends in your markets? Is the growth also escalating and if so, why?

The trend hasn’t been truly replicated on the same level in the SA market. The local hedge fund industry’s growth has largely tapered off, and ETF market capitalisation has remained broadly stable, with a slight decrease in 2016 compared to the previous year. This is mostly due to a fall in commodity-based ETF prices off the back of lower commodity prices. However, there was positive growth in the number of ETF listings, with two new ETFs during 2016 and three in the first quarter of 2017. When considering emerging market passive equity strategies, the industry has seen cumulative inflows of roughly R110bn over the last 13 years, according to WisdomTree, EPFR and ETFGI. This rate of growth in inflows is on a par with global trends, albeit off a smaller base.

Last November, the JSE listed two new global ETFs, the CoreShares S&P500 ETF and the CoreShares S&P Global Property ETF – what were the objectives behind this? 

With the launch of the CoreShares S&P500 ETF and the CoreShares S&P Global Property ETF, investors are now given additional investment options to gain exposure to offshore markets. The S&P500 is one of the most recognised indices in the world and SA investors now have access to this iconic benchmark. Similarly the S&P Global Property ETF is the first SA ETF that tracks an offshore real estate index which will provide further portfolio diversification.

How are institutional investors using ETFs in South Africa and emerging markets?

The institutional market is still mostly unit trust focused; however, exposure to gold and other commodity ETFs has helped elevate ETFs as attractive investment options, mainly because they are liquid and fall within investment mandates. Some smaller firms use ETFs for cash equitisation purposes, but core-satellite strategies are currently limited to mostly structured products. The influence of smart beta is still fairly new in South Africa, but we are increasingly seeing strategies that select market factors that generate long-term outperformance.

Do you think it will develop along the same lines as in Europe and the US for example – where you are seeing a more active use of ETFs – to overweight or underweight sectors?

The local ETF market has definitely seen growth over the last 16 or so years. AUM is roughly R70bn across the 52 ETFs listed on the JSE. Although the market is still very small compared to the global ETF market, there is evidence to show that ETFs are gradually becoming better known and more popular. There have been five ETF listings in the last four months alone. In markets that have typically large single-stock concentration, like South Africa, ETFs have not yet reached their full potential.

What is the JSE future plan in the ETF space?

Creating better awareness of ETFs for institutional and retail investors remains a key objective. The JSE has also embarked on supporting the National Treasury and the broker community with the Tax Free Savings Accounts which are ideally suited for ETFs. The JSE has also started to collaborate more with ASISA (The Association for Savings and Investments South Africa) and the local ETF providers through a tailored ETF standing committee, aimed at developing the market further.

Along these lines, are you seeing a move towards smart beta ETFs?

There is interest in smart beta ETFs in the local SA market, with future potential for adoption at the same rate as in the US and Europe. Essentially smart beta combines active management and passive investing attributes. This includes strategies that are not based on traditional market cap weights but typically embody a theme or style such as low volatility, momentum, environmental, good empowerment status, Shariah compliant, dividends, cash flow, sales and book value. Smart beta global inflows were in excess of $45bn in 2016, according to ETFtrends.com.

©BestExecution 2017

Equities trading focus : MiFID II : Rob Boardman

Rob Boardman

THE MIFID II ROADMAP.

Rob Boardman, CEO of ITG Europe, spoke to Best Execution about the challenges of MiFID II.

MiFID II covers a massive range of compliance issues – what is top of mind right now?

The industry has got a tough timeline to comply by the 3rd of January, and one of the biggest issues for asset managers is clearly the separation of research and trading. At the moment, it’s an administrative burden, and in some cases it threatens their competitiveness, particularly when competing against asset management groups outside Europe. We believe ultimately it will allow traders to be unconstrained in their ability to buy the right execution product from the right source, trading in the right venue, and it will leave portfolio managers unconstrained to buy research from whoever they want. Eventually it will be seen as being beneficial to all sides, but in the short term the transition in modifying business models is clearly a challenge.

How do you see market structure changing?

We think there will be more large in scale (LIS) crossing, which is not capped by MiFID II. We have applied for and received permission to use the LIS waiver, which will allow us to continue to trade blocks in the dark.

There will be a series of other innovations around systematic internalisers (SIs), which the banks will be driving. We, instead, are focussing very much on the technology to provide best access, including to systematic internalisers and their capital.

The third great plank of interest for asset managers is how best execution will evolve. And I think it will evolve in many ways; it will certainly be adapting to the new market structure landscape, but also I think the bar will be raised, from regulators’ expectations of compliance to how asset managers pursue best execution. They no longer have the excuse of having to trade with research providers, so they should have a complete, razor-sharp focus on best execution.

How are you helping clients to prepare in the equity space?

We are developing tools and technologies: workflow technologies that allow asset managers to demonstrate best execution through analytics, but also through enhancements to their process to use a series of providers and measure that performance at the same time – and to engage brokers in a systematic, unbiased way to demonstrate best execution.

Looking across assets, what is changing?

There are the new transparency requirements for derivatives and fixed income. It’s less clear to the market how that is going to shake out. We have a request-for-quote (RFQ) platform, principally for equity derivatives, called RFQ-hub, which we plan to enhance to give traders additional facilities to meet their MiFID II obligations. As to enabling our clients to be compliant, we think that could happen in different ways, in different product lines and that the market needs different solutions.

It’s less clear than in equities, the extreme example being fixed income, where the transparency requirements are really around quoting, because it’s a quote-driven market. Many attempts have been made to move liquidity onto more displayed markets but none have really succeeded.

There is still unfinished business with MiFID II, and I think multi-asset is clearly one of those areas. We will have to wait and see how regulators and policy makers react to the changing landscape. I think they will be surprised by how many new venues are born.

How will the shifting landscape affect where people trade?

Some banks are trying to promote the view that market makers will take over the market again; we don’t believe that’s true. Exchanges’ market share will not be affected. There is already a significant over-the-counter (OTC) market in pan-European equities – estimates put it in the low 20s as a percentage of the market. Now, some of that OTC trading will likely move onto more organised facilities – SIs for example – but we see OTC remaining important.

SIs do not have to have a rulebook in the way that a multilateral trading facility (MTF) or a regulated market would need. They can treat customers differently, so there is quite a lot of scope for creativity amongst the SIs in permitting different types of interactions; some of them will specialise in large or small transactions or different market segments. So, I think there is likely to be great differentiation amongst the SIs that will become clear through usage. We intend to provide investors with the best access that we can to all the liquidity sources, and we intend to measure them and have systematic processes to help them demonstrate best execution.

Which tools do you see trading desks needing to navigate this new environment?

It’s demonstration of execution quality that is important. They now have compliance and internal audit asking them to demonstrate why they picked a strategy with a broker, and traded on a given venue on a particular date. Having a systematic process is, we believe, going to be important for asset managers.

We could be in for a new era where asset managers are not just trading with relationship counterparties. They will have a broker list like they always did but they will also have a methodology for why they picked those brokers, and why they should be on the list, and who should not be on the list as well as making changes periodically. Having a static best execution process and policy is out the window. I think the regulators have made it very clear that they expect to see those policies and procedures reviewed and updated regularly.

So, will data become much more important to the buyside trader?

We think of this as an entire cycle of analytical data that we and others in the market try to provide asset managers. Decision support tools can provide details around likely impact, around which strategies or brokers have been more successful in the past, and classifying those transactions on the basis of liquidity or volatility. That then gives you the confidence, whether you make a decision yourself or through a machine, to place that order in a certain way, with a certain strategy and with a certain broker. Comparing the result with pre-trade estimates when you have finished the execution cycle then gives you a post-trade analysis. This can be done daily, monthly and at quarter-end for bigger, more in-depth analysis looking for trends.

©BestExecution 2017

Equities trading focus : Best execution through analytics : Adam De Rose

Adam De Rose

MIFID II: THE RISE OF ANALYTICS.

Adam De Rose, Associate Director, Product Management, Eze Software Group.

I was an equities dealer on the buyside for a short while, some seven years ago, and I thought I knew what I was doing. I watched the bids and the offers in the order book closely, hoping I could learn to read them like Keanu Reeves could read the Matrix. I watched out for news on the names I was trading and any others of interest to my portfolio managers, and I had screens full of charts and morning reports from brokers and watch lists and chat rooms and alerts and even a system built by my talented colleague that would calculate intraday P&L and bark whenever a stop-loss was hit and we needed to exit. We called it Watchdog.

Every day I digested as much information as I could, and I ingested at least seven cups of coffee as I did it. I tried to be the best trader I could be, and best execution was always front and centre in my mind. And all that was great, but I returned home every day mentally exhausted from the sheer, monumental effort of trying to be everywhere at once.

Seven years on, on the cusp of MiFID II implementation and with the higher and wider standards of best execution it demands, I find myself casting my mind back and wondering whether, even with all that effort, I would have fallen short of the new obligations.

As we all know, the regulators’ focus is on ensuring firms have a process in place to take “all sufficient steps” to achieve the best possible outcome for investors. The latest statement* from the Financial Conduct Authority on 3 March 2017 seems to indicate that while some firms have put in place effective governance processes to tackle costs throughout the trade life-cycle, many others are still ‘ticking the box’ on using transaction cost data. The industry has some way to go on embracing technology, and the FCA expects firms to “be aware of enhancements to best execution monitoring as they become available.”

The scope of MiFID II is a formidable challenge for the industry, but rather than stifle progress, the regulators’ challenge has spurred innovation. We are seeing exciting new technologies coming online across the industry, and much collaboration between multiple parts of the asset management technology ecosystem to enable the buyside to demonstrably improve their best execution processes. These are the very enhancements to best execution monitoring that the FCA expects the buyside to stay on top of.

For our part, we’re seeing many on the buyside making the effort, embracing education and incorporating oversight and insightful monitoring throughout their investment process. At the same time, we’re doing our very best to survey the landscape and bring the best of the tools we find to support our clients.

For instance, Eze Software has recently partnered with OTAS Technologies. OTAS apps like Alerts and Intraday Screener can monitor your order blotter and draw your attention to abnormalities that could indicate a change to your execution strategy may be needed. The integrated Schedule app runs pre-trade transaction cost analysis that explains the risk profile of your order and calculates the optimum trading schedule based on your risk appetite. We’ve seamlessly integrated these advanced in-trade analytics and more with our trading tools, enabling traders to quickly react to market signals.
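To give a flavour of what "calculating the optimum trading schedule based on your risk appetite" involves, the sketch below is a generic Almgren-Chriss-style schedule — not the OTAS Schedule app's actual model, just the textbook trade-off it alludes to. A higher risk-aversion parameter front-loads the order to reduce timing risk; a value near zero approaches an even (TWAP-like) split.

```python
import math

def trading_schedule(total_shares, n_intervals, risk_aversion):
    """Sketch of an Almgren-Chriss-style optimal execution schedule.

    Returns the shares to trade in each interval. Higher risk_aversion
    front-loads the trade; risk_aversion near zero gives an even split.
    (Illustrative only -- real models calibrate this to impact and
    volatility estimates.)
    """
    kappa = max(risk_aversion, 1e-9)  # avoid division by zero at kappa = 0
    T = n_intervals
    # Remaining position at the start of each interval (the classic
    # sinh-shaped trajectory, fully unwound by interval T).
    holdings = [
        total_shares * math.sinh(kappa * (T - j)) / math.sinh(kappa * T)
        for j in range(T + 1)
    ]
    # Shares executed per interval = successive differences in holdings.
    return [holdings[j] - holdings[j + 1] for j in range(T)]

# A risk-averse schedule for 100,000 shares over 10 intervals:
slices = trading_schedule(100_000, 10, risk_aversion=0.5)
```

The slices always sum to the parent order size; dialling `risk_aversion` up or down is the knob that a pre-trade risk-appetite setting would turn.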

We’re also turning our attention to the market’s interest in intelligent segmentation of orders, lining up trade objectives with appropriate benchmarks and looking at how we can use the client’s universe of post-trade evidence to drive decisions throughout this process, either automatically or as an advisory function. Despite what you may have read, this isn’t artificial intelligence; it’s just trading intelligence.

Say what you like, as a mere human being, you can’t watch 50 order books at the same time, and you can’t pay close attention to the performance of each order and the conditions in which they’re trading. You certainly can’t give each and every order the due care and attention that MiFID II demands. Not even Keanu could do that.

So what can you do?

First, review your process. Take a good hard look at your best execution policy and make sure it accurately depicts the factors that your firm takes into account when deciding execution tactics. Review the best execution policies of your brokers as well – you may wish to use one of the electronic trading questionnaires out there to help you collect and digest the data. Then think about how to show that you are educating everyone involved in orders about the policy, how you can prove you’re following it, and how to put in place a framework for regularly reviewing your performance.

Underpinning much of the above is data. Look around the market to evaluate what technology and reporting tools are out there, and try to identify which of them would best help you justify what you say you’re doing in your best ex policy and provide the most insightful look into your trading.

Best execution is not an end-state. It is an endeavour. It is a mindset to be adopted throughout the entire life-cycle of a trade, including before and during, when you can exploit real-time market intelligence to impact your results. The traditional end-state, in the form of post-trade transaction cost analysis, has repositioned itself instead as a critical part of the feedback loop, reflecting a mindset that we can learn, refine, and ultimately do better. This analytical mindset, I would argue strongly, is where the regulators are trying to direct us.
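As a concrete example of that feedback loop, the standard post-trade metric is implementation shortfall: the achieved average fill price measured against the price prevailing when the order was raised, usually quoted in basis points. The helper below is a minimal, generic sketch of that calculation, not any vendor's TCA methodology.

```python
def implementation_shortfall_bps(side, arrival_price, fills):
    """Implementation shortfall in basis points versus the arrival price.

    side:          "buy" or "sell"
    arrival_price: the mid (or decision) price when the order arrived
    fills:         list of (price, quantity) executions

    Positive values mean the order cost money relative to the price
    prevailing when the trading decision was made.
    """
    total_qty = sum(qty for _, qty in fills)
    avg_price = sum(price * qty for price, qty in fills) / total_qty
    sign = 1 if side == "buy" else -1  # slippage flips sign for sells
    return sign * (avg_price - arrival_price) / arrival_price * 10_000

# A buy filled at an average of 100.07 against a 100.00 arrival price
# shows 7 bps of slippage -- one data point for the feedback loop.
cost = implementation_shortfall_bps("buy", 100.0, [(100.05, 600), (100.10, 400)])
```

Collected across every order, numbers like this are the post-trade evidence that feeds back into venue choice, broker selection and execution tactics.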

*www.fca.org.uk/news/news-stories/investment-managers-still-failing-ensure-effective-oversight-best-execution

©BestExecution 2017
