Will My Next Trader Sit In A Data Centre?
Assessing the potential for trade automation on the buy-side desk. By Mark Northwood, Global Head of Equity Trading at Fidelity Worldwide Investment.
When FIX launched in 1992, buy-side desks at traditional asset managers were still adding direct phone lines to their dealer boards, and order and execution management systems (OEMSs) were often spreadsheets, copied from paper order tickets with actual time stamps. The order routing process of the day was this: [buy-side reads order -> calls/copies details -> sell-side re-enters order].
FIX enabled a rapid evolution, starting with the removal of the fallible transmission of order details between humans. The next evolutionary step saw the buy-side trader given the choice to remove the sell-side trader from the process by routing an order with FIX to a box in a data centre where an algorithm (algo) took charge of order placement decisions. The next step could be to remove the buy-side trader from the process as well, resulting in a “no touch” flow.
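To make the mechanics concrete, here is a minimal sketch of the kind of message that replaced the phone call: a FIX 4.2 NewOrderSingle assembled as tag=value pairs. The comp IDs, symbol and quantities are invented for illustration, and a production desk would use a certified FIX engine with full session management rather than hand-rolled strings.

```python
# Minimal sketch of a FIX 4.2 NewOrderSingle (35=D), the message type that
# replaced the manual "read, call, re-enter" order routing loop.
# Illustrative only: real systems use a certified FIX engine with session
# management, sequence-number recovery and validation.
from datetime import datetime, timezone

SOH = "\x01"  # FIX field delimiter

def fix_checksum(msg: str) -> str:
    """Sum of all bytes modulo 256, rendered as three digits (tag 10)."""
    return f"{sum(msg.encode('ascii')) % 256:03d}"

def new_order_single(cl_ord_id: str, symbol: str, side: str,
                     qty: int, limit_px: float) -> str:
    now = datetime.now(timezone.utc).strftime("%Y%m%d-%H:%M:%S")
    body = SOH.join([
        "35=D",              # MsgType: NewOrderSingle
        "49=BUYSIDE",        # SenderCompID (hypothetical)
        "56=BROKER",         # TargetCompID (hypothetical)
        "34=1",              # MsgSeqNum (fixed here for illustration)
        f"52={now}",         # SendingTime
        f"11={cl_ord_id}",   # ClOrdID
        f"55={symbol}",      # Symbol
        f"54={side}",        # Side: 1=Buy, 2=Sell
        f"38={qty}",         # OrderQty
        "40=2",              # OrdType: 2=Limit
        f"44={limit_px}",    # Price
        f"60={now}",         # TransactTime
    ]) + SOH
    head = f"8=FIX.4.2{SOH}9={len(body)}{SOH}"
    msg = head + body
    return msg + f"10={fix_checksum(msg)}{SOH}"

print(new_order_single("ORD-0001", "VOD.L", "1", 25_000, 220.50))
```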
We know that automation has revolutionised many industries, including our own, with exchange order matching, market making and arbitrage activity all now handled by machines. So could it work on a centralised trading desk servicing many investment disciplines, and is this where the pursuit of “best execution” is leading us?
The buy-side trading process
Regulation sets us all the goal of achieving “best execution”. This term seems valid for small marketable orders, but acquires extra dimensions when applied to large institutional orders. At Fidelity Worldwide Investment we target an optimal balance between net price, volume, speed and certainty for each order. Simple enough, right? Of course it isn’t, as the optimal result is impossible to determine in the complex, competitive system known as the market. An order book in the market is generally in a state of “fragile” equilibrium: a stable, mean-reverting quote distribution persists until one (or more) trader’s algo is too aggressive, triggering a volatility spike as other algos react instantly. Such action by one individual’s algo may be in their interest but can disrupt the market for others and tempt them to dial up their aggression as well, adding fuel to the HFT machine. “Best execution” therefore means continuously assessing how the strategy is behaving, and adapting it.
When an asset manager trades for its clients, the best result combines: 1) good decisions to buy and sell positions, and 2) intelligent and cost-efficient implementation of those decisions.
The two steps have been segregated into specialist skills as firms have grown. Measuring the impact of different types of trade activity on fund performance is essential. This work is most effective if it spans the entire process, assessing the trading decision itself and not just the execution step.
Traders can then use that analysis to identify ways to improve the decision-making step. For example, that analysis might show that the so-called fast money tends to over-price news announcements as it anticipates the response by investors, making price reversion likely hours or days later. Traders could then use such evidence to recommend a disciplined approach when reacting to the “market moving stories” pushed out by the media and brokers to encourage impulsive activity. So the starting point for further automation is to accurately profile each decision from the Portfolio Managers (PMs) in terms of the many internal attributes which may be relevant to it, plus the external conditions prevailing at the time it is received. This profile would drive the selection of a target execution strategy, incorporating “fair price” bands linked to expected alpha. This would provide more value than an estimated cost, and would get us closer to what could be termed “best execution”.
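As an illustration of what such profiling could look like in code, the sketch below maps a handful of order attributes to a strategy and a fair-price band. Every attribute name, threshold and strategy label is hypothetical; a real implementation would be calibrated against the firm’s own execution history.

```python
# Hypothetical sketch of profiling-driven strategy selection: map an
# order's internal attributes and prevailing market conditions to a
# target execution strategy and a "fair price" band linked to expected
# alpha. All attribute names and thresholds are invented.
from dataclasses import dataclass

@dataclass
class OrderProfile:
    pct_adv: float             # order size as a % of average daily volume
    expected_alpha_bps: float  # PM's expected short-term alpha, in bps
    spread_bps: float          # current quoted spread, in bps

def select_strategy(p: OrderProfile) -> dict:
    # Urgent, alpha-rich orders trade faster; very large, low-alpha
    # orders trade passively to limit market impact.
    if p.expected_alpha_bps > 30 and p.pct_adv < 5:
        strategy, participation = "aggressive_IS", 0.25
    elif p.pct_adv > 20:
        strategy, participation = "passive_liquidity_seeking", 0.05
    else:
        strategy, participation = "standard_IS", 0.10
    # Fair-price band: pay up to half the expected alpha, plus the spread.
    band_bps = 0.5 * p.expected_alpha_bps + p.spread_bps
    return {"strategy": strategy,
            "target_participation": participation,
            "fair_price_band_bps": round(band_bps, 1)}

print(select_strategy(OrderProfile(pct_adv=12, expected_alpha_bps=40,
                                   spread_bps=8)))
```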
Human versus machine
Today the trader’s desktop is dominated by large screens. New products and services push more and more information through a graphical interface to a human user with only two eyes, expected to pick what is important from millions of pixels, with a reaction time of 200 milliseconds. The human trader has effectively been at full capacity for years: a bottleneck at the centre of an enormous flow of information which needs to be tied together, understood and acted upon. That is not to say that human involvement is no longer vital. It is, and I will show this by comparing the relevant capabilities of a good trader to a conceptual Automated Trader, or “AT”.
The AT envisaged here is more than an auto-router sending small %ADV orders to an IS algo*. The AT would comply with handling practices for common order combinations, determine an execution strategy from the order profile, continuously check the current state of each order against its target state and assess possible alternative states. Such a “cognitive” OEMS is not yet a reality for asset managers, but many of the required components are in active use at HFT firms and on other systematic trading desks. The proposition is that the time has come to explore whether an AT could play a significant role on the buy-side. A subjective comparison of the two contenders follows.
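A minimal sketch of that continuous check, assuming a simplified order state: the AT compares where each order is against where its target strategy says it should be, and raises deviations for action. All field names and thresholds here are hypothetical.

```python
# Sketch of the AT's core control loop: compare each working order's
# current state to its target state and escalate deviations to the
# human trader. Thresholds and fields are invented for illustration.
from dataclasses import dataclass

@dataclass
class OrderState:
    filled_pct: float      # % of order completed
    slippage_bps: float    # performance versus arrival price
    participation: float   # realised share of market volume

@dataclass
class TargetState:
    schedule_pct: float    # where the strategy expects to be by now
    max_slippage_bps: float
    max_participation: float

def assess(order_id: str, cur: OrderState, tgt: TargetState) -> list[str]:
    alerts = []
    if cur.filled_pct < tgt.schedule_pct - 10:
        alerts.append(f"{order_id}: behind schedule, consider raising aggression")
    if cur.slippage_bps > tgt.max_slippage_bps:
        alerts.append(f"{order_id}: slippage breach, escalate to trader")
    if cur.participation > tgt.max_participation:
        alerts.append(f"{order_id}: participation too visible, slow down")
    return alerts

print(assess("ORD-0001",
             OrderState(filled_pct=22, slippage_bps=18, participation=0.14),
             TargetState(schedule_pct=40, max_slippage_bps=15,
                         max_participation=0.10)))
```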
Broking is a human specialty, so could an AT interact as effectively with a broker, or would a good buy-side trader have better access to information about client flows because of their relationship? A solitary AT in the high and low touch world of today would struggle. But if other firms embraced the concept and more market practices were dragged into the electronic age, AT could operate in a world of message traffic between venues offering a variety of continuous and auction mechanisms for different types of transaction. Block deals could perhaps be offered directly to investors by institutions in specially structured dark auctions, following targeted marketing based on investors’ recent search history… Back in the real world, much of the time the other side of the trade doesn’t yet exist. Sometimes a deal will only come together with a little selling, and that remains something that certain humans have always been good at!
The optimal process
An engineer would solve this by designing a process with the human trader doing what he or she does best, and the AT playing to its strengths: the traders monitoring a filtered stream of highly processed information on a single dashboard highlighting exceptions, opportunities arising from others’ clumsy trading, and illiquid names; the AT seamlessly blending new orders into the Auto-OEMS, maintaining the integrity of the allocation process without incurring delay and cost from intraday bookings, and making continual adjustments to the price and aggression settings on its trading algorithms as conditions change.
The target operating state for the AT would be derived from the analytical profiling described earlier, as would the breaches which trigger intervention. The notable risks would be mitigated through multiple safeguards: constraining the interactions between human trader and machine, independently supervising every part of the process in real time, and instantly suspending activity which violates the normal state.
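One way to picture the last of those safeguards is a small, independent supervisor that knows the “normal state” and halts anything outside it. The limits below are invented; the point is only that the kill-switch logic is simple and kept separate from the trading logic itself.

```python
# Sketch of an independent real-time supervisor: a second process that
# watches the AT's activity and instantly suspends anything violating
# its defined "normal state". All limits are invented.
NORMAL_STATE = {
    "orders_per_sec": 50,
    "participation": 0.25,
    "price_dev_bps": 75,   # distance from the reference (arrival) price
}

def supervise(activity: dict, suspend) -> bool:
    """Return True if activity is allowed; otherwise call suspend()."""
    breaches = [k for k, limit in NORMAL_STATE.items()
                if activity.get(k, 0) > limit]
    if breaches:
        suspend(breaches)   # halt the strategy pending human review
        return False
    return True

ok = supervise({"orders_per_sec": 240, "participation": 0.12,
                "price_dev_bps": 30},
               suspend=lambda b: print("SUSPENDED:", b))
```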
A significant re-engineering of the process would be required, but it is already best practice to record and track the key human inputs, so decisions become electronic instructions flowing through a system. As each component of an AT was developed, it would be inserted into the process under the watchful eye of the specialist trader. The obvious impediment is the huge investment in time and money which would be required, as this would be supplementary to the time and cost of running the business. Experience of the complexity of handling institutional orders, and of getting today’s algos to perform consistently, suggests this effort should not be underestimated. Regulators, for obvious reasons, would also have concerns and are likely to add restrictions and a burden of proof before such trading technology could be widely adopted by the buy-side. Car drivers make fatal mistakes every day, but there will be a public outcry the first time an autonomous vehicle causes a crash.
Conclusion
The buy-side trader will still have a key role to play in years to come, as will the OEMS, but the partnership will evolve as each learns new skills from the other. Over time, the trading system will learn what it needs to take over the order handling and micro-decisions during order execution. Traders will increasingly focus on supervising system performance and on using much-improved data to consult with PMs: advising on trade timing and strategy, steering complex orders, assessing market activity for opportunities and only rarely intervening directly in an errant trade execution.
*jargon: an algorithmic trading strategy targeting an execution price close to the market price at the time the order is placed, minimising the Implementation Shortfall.
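For readers who prefer numbers, a minimal illustration of that benchmark: implementation shortfall is the signed difference between the achieved execution price and the decision (arrival) price. The prices below are illustrative.

```python
# Worked example of the benchmark the footnote describes: implementation
# shortfall compares the achieved price with the decision (arrival)
# price, signed so that a positive number is always a cost.
def implementation_shortfall_bps(decision_px: float, avg_exec_px: float,
                                 side: str) -> float:
    sign = 1 if side == "buy" else -1
    return sign * (avg_exec_px - decision_px) / decision_px * 10_000

# Buying at an average 100.12 against a 100.00 decision price costs 12 bps;
# selling at 99.95 against 100.00 costs 5 bps.
print(implementation_shortfall_bps(100.00, 100.12, "buy"))   # 12.0
print(implementation_shortfall_bps(100.00, 99.95, "sell"))   # 5.0
```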
Changing Use Of Algos
With Jacqueline Loh, Head of Asian Trading, Schroders
With the current amount of algo usage in the major markets in Asia, change in market microstructure is inevitable. That’s not necessarily a bad thing; it just means that users have to keep a very close eye on the workings of their algos and on how their algos affect markets. The float-type algos have had the most traction, although newer-generation algos tend to be combinations, e.g. implementation shortfall and VWAP, or float and VWAP.
Our policy is (and has always been) best execution and acting in the best interests of our clients. Our focus is on minimisation of impact costs rather than ratios. We look at execution services in terms of high touch and low touch, and the final choice of venue will be the one in which total impact costs, inclusive of commissions, are the lowest.
I actually think TCA provides the impetus for the next stage of algo development. As TCA reports provide more detail on the relative performance of algos, buy-side traders are in a position to objectively evaluate which ones work best for them. When a critical mass of buy-side traders do this and seek to enhance the algos they use, this will drive development on the sell-side.
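As a rough illustration of that evaluation step, the sketch below ranks algos by average slippage from a set of TCA records. The algo names and numbers are invented; in practice the comparison would also control for order difficulty, market conditions and benchmark.

```python
# Hypothetical sketch of TCA-driven algo evaluation: rank algos on
# average slippage against a common arrival-price benchmark.
from statistics import mean
from collections import defaultdict

tca_records = [  # (algo, slippage vs arrival in bps; lower is better)
    ("broker_A_IS", -2.1), ("broker_A_IS", 4.0), ("broker_A_IS", 1.5),
    ("broker_B_VWAP", 6.3), ("broker_B_VWAP", 2.8),
    ("broker_C_float", 0.9), ("broker_C_float", -1.2),
]

by_algo = defaultdict(list)
for algo, slippage in tca_records:
    by_algo[algo].append(slippage)

for algo, costs in sorted(by_algo.items(), key=lambda kv: mean(kv[1])):
    print(f"{algo:15s} avg slippage {mean(costs):+5.2f} bps "
          f"over {len(costs)} orders")
```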
Transparency
With Alasdair Haynes, CEO, Aquis Exchange
Crossing networks play an enormously important part in market microstructure. They are where orders can be matched with each other to reduce market impact. We believe the proposals to restrict dark flow would be a huge mistake and we are sure the regulator doesn’t deliberately intend to remove this very valuable part of the business. Natural crossing is something that brokers have been doing for hundreds of years and they should be able to continue.
There does appear to be a problem in the industry with regards to transparency. The market should be fair, inclusive, simple in nature and, of course, transparent. Transparency should be the main aim of anybody creating a market.
Transparency is particularly important with regard to price. There has been much discussion about the structure of exchanges and what is charged for trading. If you look at the pricing policies of many of the national exchanges, you see how complicated and difficult the pricing descriptions can be, which is hardly transparent.
We think subscription pricing is the solution. When the buy-side considers what they are paying for trading, the answer should be very simple – $X per month. This is good for both the buy-side and the sell-side. We think we can help expand the margins of the sell-side and that the buy-side will benefit too.
If dark pools have a simple process, are easily understood and try to match orders of size so as to reduce market impact, they can be hugely beneficial. We all know about the problems of maker/taker pricing and the issues raised by buy-side firms who weren’t comfortable with it. There was no transparency and a duopoly in the market.
In general we believe the regulator likes the concept of subscription pricing because it is fair; a big user will pay more, a small user less. And subscriptions allow markets to grow without the complexity experienced under maker/taker pricing models.
We are not expecting the other exchanges to love us and what we do, but we do think we have good support from the buy-side and the sell-side, and we have the support of the regulators.
Unbundling Commissions
With Carl James, Global Head of Fixed Income and FX, BNP Dealing Services, and Managing Director of Dealing Services UK.
On the equity side, brokers have experienced a collapse in margins. Asset managers are now increasingly regulated, with the focus on explaining how they spend clients’ commissions and justifying their research and execution spend.
The Financial Conduct Authority (FCA), in its previous life as the Financial Services Authority (FSA), looked at ‘unbundling’ in its consultation paper CP176. This was driven by the Myners report, which put trading costs under scrutiny. The FSA undertook a survey in 2000 and found that institutional investors paid brokers a total of £2.3B in execution commission. It was clear from the Myners report that most pension schemes had no idea of this cost.
Following this report, commission sharing agreements (CSAs) were put in place. The idea behind CSAs was to separate out what was paid for through execution commissions. Previously execution commissions, which, let us not forget, are the clients’ monies, were used to pay for a wide variety of third-party services, including research. With the implementation of CSAs an explicit split was made between execution costs and research costs. Transaction costs are now clearer and better understood, and as a result they have fallen.
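The mechanics of the split are simple arithmetic, illustrated below with invented rates: the bundled commission is divided into an execution component retained by the executing broker and a research pot the manager directs under the CSA.

```python
# Illustrative arithmetic for a CSA split. All rates are invented.
notional = 10_000_000      # traded value in GBP
commission_rate = 0.0010   # 10 bps all-in commission
execution_rate = 0.0004    # 4 bps kept by the executing broker

commission = notional * commission_rate
execution_cost = notional * execution_rate
research_pot = commission - execution_cost   # accrued under the CSA

print(f"Total commission: £{commission:,.0f}")
print(f"Execution:        £{execution_cost:,.0f}")
print(f"CSA research pot: £{research_pot:,.0f}")
```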
Buy-Sides Unlocking Business Value Through Technology
With Michel Balter, Chief Strategy Officer, CameronTec Group
A FIX backbone has become the standardisation layer for connecting to the sell-side trading community, empowering the buy-side and equipping them with the agility and flexibility to swap and change trading partners to realise greater value in their business.
Michel Balter, Chief Strategy Officer for CameronTec Group, says the market trend over the past five years is a transformation in the way firms perceive their investment in technology. “No longer are buy- and sell-sides rationalising technology as a cost centre; instead the thinking has shifted to ‘profit centre’ and ‘profit enabler’. This is opening up new opportunities to deliver connectivity infrastructure for buy- and sell-sides to unlock the value of their business.”
In the past, buy-sides relied on the sell-side to provide their trading connectivity technology, a sub-optimal strategy given they could be trading with multiple sell-side partners and so were required to manage many trading solutions. Take this scenario and multiply it by the number of asset classes traded, and it quickly adds up to a large operational constraint for any buy-side firm.
It has not only been the advancement of technology but also the tightening of regulations that is triggering greater buy-side investment in trading infrastructure. Balter says: “Today we are seeing a number of Tier 1 asset managers taking advantage of new regulatory constraints to leverage best-of-breed technology to offer execution services to buy-side peers, for both optimal FIX connectivity to sell-side destinations and FIX hub access for their external clients. In doing so, buy-sides acting as externalised dealing desks are changing the trading landscape by competing directly with sell-side trading partners to attract order flow.”
“A FIX technology backbone is the ultimate business enabler, giving firms the ability to flexibly switch between platforms by plugging and unplugging a variety of solutions,” adds Balter. “The advantages of a FIX entry hub and FIX exit hub are clear: first, a reduction in connectivity cost, followed closely by a reduction in time to market, combined with a greater capacity to trade many assets on one technology.
At the same time, sell-sides are not taking the shift in buy-side reliance lying down. They are responding by leveraging FIX to offer more advanced services, such as the leasing of algos to sell-side peers and of course to the buy-side community itself.”
The Evolution Of Asian Algorithmic Trading
With Murat Atamer
How has increased algo trading affected market microstructure across Asia?
Market structure, regulatory environment and advanced technology are the key factors promoting the use of algorithms. This is evident from the fact that a good trader uses drastically different algorithms in South East Asia from those she would prefer in the liquid developed markets of our region. Spreads are the driving force: once they narrow, displayed liquidity decreases and manual trading becomes difficult, resulting in increased use of algorithms.
Over the past few years, we have seen a considerable increase in the portion of overall trading going through algorithms. This is evident in the data as volume curves across securities within a market look increasingly similar to each other. The fact that VWAP and volume participation algorithms have been the “go to” algorithms for most buy-side traders explains these patterns.
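The convergence is easy to see once you consider how such algorithms schedule their child orders: a VWAP-style algo slices the parent along the market’s historical intraday volume profile, as in the sketch below (the profile itself is invented).

```python
# Sketch of why volume curves converge: a VWAP-style algorithm slices a
# parent order along the market's historical intraday volume profile,
# so heavy algo usage pushes every security toward the same shape.
hourly_volume_profile = [0.18, 0.10, 0.08, 0.07, 0.08, 0.11, 0.16, 0.22]

def vwap_schedule(order_qty: int, profile: list[float]) -> list[int]:
    assert abs(sum(profile) - 1.0) < 1e-9   # weights must sum to one
    return [round(order_qty * w) for w in profile]

print(vwap_schedule(80_000, hourly_volume_profile))
# -> [14400, 8000, 6400, 5600, 6400, 8800, 12800, 17600]
```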
Recently we have seen heightened interest from the buy-side in opportunistic algorithms that go beyond their predictable, yet very mechanical, predecessors. Traders want algorithms that “act” like a human trader by focusing on real-time market dynamics. There is also an increased focus on performance and a move away from benchmarks that encourage high participation rates.
What comes next for algo strategy and development, and who will drive the change: regulators restricting innovation, or the buy-side pushing for more control?
These are exciting times. The SFC’s regulatory drive forced traders to get a deeper understanding of the algorithms they are using. This has opened up a lot of debate between the algo users and the designers. We have seen a decline in the number of algorithms used by a typical buy-side trader. All of a sudden the safety measures available to brokers to protect buy-side orders have taken the front seat. Traders were more interested in fat-finger protections and in how broker circuit breakers work than in getting another custom algorithm; and therefore, for the first time, we have seen cancellations of custom algo requests from the buy-side.
The regulatory push for safety has set the stage for fewer algorithm providers as well as fewer algorithms. This is likely to mark the beginning of a period of further specialisation by traders as well as by brokers, who will focus extensively on bettering their existing algorithms. A much greater level of transparency between the buy-side and the sell-side is crucial for achieving this systematically. In particular, traders have no visibility of the child slices sent to the market by the algorithms they are using, and have therefore been pushing the sell-side for FIX tags that provide liquidity indicators, e.g. Tag 851 (LastLiquidityInd), which would provide this information in real time. Electronic trading platforms that provide feedback to designers as well as users will naturally evolve to be superior to mechanical platforms that lack this capability. The feedback does not necessarily need to be real-time, as long as it is available for in-depth analytics. In this regard, TCA can be a crucial tool to bridge this gap and provide the much-needed transparency to the buy-side. Historically, TCA has offered little insight in terms of “selecting the right algorithm.” With this in mind, we have been actively looking at our platinum client orders and working closely with those clients on their algorithm choices based on stock, market and order characteristics to help them achieve better execution.
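As a sketch of the transparency being asked for, the snippet below aggregates Tag 851 (LastLiquidityInd) values across child-fill execution reports to show how an algo actually interacted with the market; the fills themselves are invented.

```python
# Sketch: aggregate FIX Tag 851 (LastLiquidityInd) across child-fill
# execution reports to see how an algo interacted with the market.
from collections import Counter

LIQUIDITY_IND = {1: "added", 2: "removed", 3: "routed out"}

child_fills = [  # (qty, tag_851) taken from ExecutionReport messages
    (500, 1), (300, 2), (700, 1), (200, 3), (900, 1), (400, 2),
]

shares = Counter()
for qty, ind in child_fills:
    shares[LIQUIDITY_IND.get(ind, "unknown")] += qty

total = sum(shares.values())
for kind, qty in shares.most_common():
    print(f"{kind:10s} {qty:5d} shares ({qty / total:.0%})")
```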
A view from Japan: Lynn Strongin Dodds
Walking around Tokyo, it would seem that the so-called Abenomics strategy is working its magic. The city is bustling and shops seem to be doing a brisk trade. However, looking underneath the surface, the smaller cities and rural villages tell another story. The edges remain frayed and the dramatic loosening of monetary policy initiated by the Bank of Japan last year has not restored them to their former glory. More sustainable, long-term structural alterations are needed for a genuine turnaround in the economy’s fortunes.
These are not only the reflections of my recent visit but also of a new report by the International Monetary Fund (IMF). While praising prime minister Shinzo Abe’s efforts, it warned that without additional reforms Japan risks falling back into lower growth and deflation, a further deterioration in the fiscal situation, and an overreliance on monetary stimulus with negative consequences for the region.
There are concerns that the Premier will fail to deliver, which explains why earlier this month the fund cut its growth forecast for Japan for 2014 from 1.7% to 1.4%, while investors, sensing danger, have retreated, sending the previously buoyant stock market down more than 8% this year. The Bank of Japan, though, is taking another line and has defied calls to increase its cash injections. Instead, it is keeping a steady hand on its monetary policy in the belief that inflation will head progressively towards its 2% target, suggesting no additional stimulus is on the near-term horizon. It has also stood firm in the view that the modest recovery will continue despite a temporary slump in demand caused by a sales tax hike in April.
QE, though, is only one part of the solution. As the IMF reiterates, the government needs to lower trade barriers and ease constraints on laying off workers in order to inject more dynamism into the economy. Both are proving difficult, as evidenced by the failure of President Obama on his recent visit to secure the bilateral trade agreement needed to rekindle talks over a broader Pacific Rim trade deal.
Another, equally challenging, impetus would be to create a culture of innovation. It is perhaps difficult to remember that Japan once dominated the global market with its products. The scenario is very different today, with the rivalry no longer between Nagoya and Detroit but between Shinagawa, Seoul, Taipei, Bangkok and Silicon Valley. Companies can get back in the game, but they will need to take a leaf from The Team, a book by entrepreneur William Saito. He is tearing down the traditional top-down decision-making culture and seniority-based system in order to give women and younger people a chance to develop new ideas.
CCPs And Collateral
With Shane Worner, Senior Economist, IOSCO
With regard to ongoing issues in the next year or so, I have identified two main areas of concern – CCPs and collateral transformation.
We are essentially moving risk from banks into CCPs. The fundamental question is whether CCPs are able to price the risk as well as banks can. We are therefore creating entities that will ultimately become too big to fail.
If we look at the issue of collateral management, the one area that will need more attention is collateral transformation. By putting in place stringent collateral requirements for OTC derivatives and other regulatory reforms, we are moving the risk back into those entities that are undertaking the transformation process.
For example, take someone with high-value corporate bonds who wants to swap them for triple-A sovereign debt. The risk differential between those two products has to be transferred somewhere: the original holder is no longer carrying it; the party performing the transformation is. We need to shed more light on that practice in order to make it more transparent.
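A stylised example of the economics, with invented figures: the intermediary applies a haircut to the corporates and charges a fee for delivering the sovereigns, and those two numbers are the visible price of the risk it now carries.

```python
# Stylised collateral transformation arithmetic. All figures invented.
corp_bonds_value = 100_000_000   # market value of corporate bonds posted
haircut = 0.08                   # intermediary's haircut on the corporates
transformation_fee = 0.0025      # annual fee on sovereigns delivered

sovereigns_delivered = corp_bonds_value * (1 - haircut)
annual_fee = sovereigns_delivered * transformation_fee

print(f"Corporates posted:    {corp_bonds_value:>13,.0f}")
print(f"Sovereigns delivered: {sovereigns_delivered:>13,.0f}")
print(f"Annual fee:           {annual_fee:>13,.0f}")
# The haircut and the fee are the visible price of moving the credit and
# liquidity risk differential onto the transforming intermediary's book.
```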
In the wake of the financial crisis, CCPs were mandated to help with the promotion of financial stability. If you believe that financial stability is a public good then these entities ideally would be government owned, but at a minimum should be a not-for-profit entity with a natural market monopoly.
Currently, they are private entities with a profit motive and face competition in their respective markets. Basic economics would predict that this could cause significant issues, especially if CCPs compete on eligible collateral and move collateral investment down the maturity ladder to take business away from competitors.
Even a profit-based entity with no competitors, extracting a monopoly rent, might be the tax the market has to pay for increased financial stability.
Market Surveillance
By Alexandru Gomoiu, Regional Director, Solution Consulting, Treasury Capital, Misys.
Background
The continuously changing regulatory environment, the challenges of dealing with increasing volumes of big data, and the growing importance of compliance within financial institutions have had a significant impact on current market surveillance systems and on the technology requirements behind them.
What do we mean by “real-time”?
There is some debate about the exact definition of the term “real-time” surveillance because of the variation in the accepted or expected latency period between detection, investigation and action. Is real-time when an event happens or should we use the term “pre-trigger”, i.e. before an event actually happens?
This leads us to two main approaches. Firstly, looking at the challenges from a technological perspective, how do we balance coverage capability and the scope of surveillance so we can identify the “triggers” before they actually happen?
Secondly, letting the technology follow front-office behaviours and techniques. The challenge is that front-office behaviours are constantly evolving and you need to have sufficient funds to ensure that the technology is moving in the right direction.
Challenges for market participants
Market participants fall into two categories. The first comprises regulators and exchanges, which seek to ensure a fair and orderly market for sales and traders, backed by regulatory bodies and acts including the Dodd-Frank Act in the US and MiFID II in Europe. The second comprises participants who are more focused on issues such as price manipulation and accuracy.
The primary challenge for regulators and banks is to believe that real-time surveillance is possible, and therefore feasible to manage. The other major challenge, faced by compliance officers, is increasingly high trading volumes and the limited time available to monitor and manage them.
Using technology to trigger, support and enforce real-time market surveillance
The aim for regulators is to be able to perform market surveillance and monitoring in as close to real-time as possible.
Increasingly, surveillance technology is being rolled out by exchanges to monitor the market, guarantee transparency and fairness in the marketplace, and increase investors’ trust and confidence when trading.
In an ideal world, the technology would be deployed across all trading and electronic systems so that one can analyse and act upon the market in real-time.
Real-time surveillance can be applied in a number of different ways. The most popular is complex event processing (CEP), which involves understanding multiple complex events: the CEP tool is programmed to recognise meaningful events, which it flags to be acted upon in real time. This gives you the ability to support multi-threading; to use more advanced techniques combining CEP tools with in-memory analytical processing; and to handle “big data” and related analysis in a “what if” or real-time mode.
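As a minimal illustration of the CEP idea, the sketch below watches a sliding one-second window over an event stream and flags a burst of cancellations, one simple pattern a surveillance engine might be programmed to recognise. The window length, threshold and events are all invented.

```python
# Minimal CEP-style sketch: a sliding time window over an event stream
# that flags a burst of order cancellations in real time.
from collections import deque

WINDOW_SECS = 1.0
MAX_CANCELS = 5

def make_detector():
    recent = deque()  # timestamps of recent cancel events
    def on_event(ts: float, event_type: str):
        if event_type != "CANCEL":
            return None
        recent.append(ts)
        # Drop events that have slid out of the window.
        while recent and ts - recent[0] > WINDOW_SECS:
            recent.popleft()
        if len(recent) > MAX_CANCELS:
            return f"ALERT: {len(recent)} cancels within {WINDOW_SECS}s at t={ts}"
        return None
    return on_event

detect = make_detector()
for t in [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7]:
    alert = detect(t, "CANCEL")
    if alert:
        print(alert)
```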
We ultimately need to ask whether this level of security and up-to-the-minute analysis is achievable. Should we consider real-time triggers and operations as a role model for banks to follow?
One would think that predictability and “what if” capabilities are more efficient than real-time “post” operation. The drawback is that predictability would be able to offer only a partial view of what a real-time system provides.
Source: Waters Technology
An end to legacy silos?: Robin Kneale
Robin Kneale argues that as trading across multiple asset classes increases, operating in silos is no longer an effective strategy for optimising operations, mitigating risk or capitalising on market opportunities.
In the last ten years I have witnessed a huge increase in multi-asset class trading, as buy- and sell-side firms utilise an increasingly broad array of investment strategies to improve their performance. This diversification has occurred across asset segments but also across geographies, as financial firms look for new growth opportunities.
I’ve worked in the industry for many years, and it’s not a revelation that to remain competitive and increase efficiency you need to constantly re-evaluate your business processes. Silo-based operational models, which have effectively replicated front-to-back office services across each group of asset classes traded, should not be an exception.
I would argue that to lower costs, enhance service and improve operational efficiencies there needs to be a complete redesign of front- and back-office technology infrastructures and a consolidation of operations to create processing systems able to support client needs around multiple asset segments such as equities, fixed income, FX/money markets and OTC and exchange-traded derivatives. I’m not saying this won’t be a challenging proposition for many firms, but we are seeing a growing number of our customers looking to implement comprehensive asset-agnostic processing frameworks. So, we may finally be seeing an end to the legacy silo.
Robin Kneale is Global Head of Product Strategy, Securities Processing Solutions, at Broadridge.