Data without context is a distraction. Data without validation is a liability. To meet the growing burdens of regulatory requirements, due diligence and business analysis, or simply to understand missed opportunities, every firm at every level of trading technology needs meaningful insight into the data it already generates.
Daytona – Trade Surveillance Is More Than Simply Having Access To Data
The Challenge Of Global Market Surveillance
With Greg Yanco, Senior Executive Leader, Markets & Participant Supervision, Australian Securities & Investments Commission.
We are really excited to have market surveillance technology that allows us to take control of our destiny. It is much more flexible than our old system in terms of parameters and it is also more dynamic. We have people trained in the language that the system uses, which enables us to generate our own alerts and reports.
The system is being developed by First Derivatives, whose ‘bread and butter’ clients are high frequency traders, so we are now applying the same technology and capabilities as the people placing many of the orders on our markets. This is an exciting development and enables us to carry out more flexible, intensive analysis that goes beyond real-time supervision, beyond waiting for alerts to pop out.
We could do some of these things with the old system but the new system does seem much quicker and more flexible. We are bringing ourselves up to the same standard of capabilities already being exhibited by other markets and we’re still only part of the way into looking at the new system.
The next stage will be the development of more sophisticated data analytics and data mining capabilities. This means relying less on real-time supervision and looking back at patterns over time. We have a real-time system in place that pops up alerts. For example, say someone traded before an announcement was made. We can find out who could have had access to that information and whether we might have overlooked that person previously. We want to be able to look back and discover historical patterns of behaviour, as we are then in possession of much more powerful information. It’s the same with manipulative activity.
What we are working on now is post-trade analysis, where we go back over time and look for serial misbehaviour. We have been able to find patterns in trading before, but it took an incredible amount of work to figure out that the same patterns occurred elsewhere, and it would have been good to be able to tell the system to go back and look for a particular pattern. These are the sorts of things that have brought us to the front of the curve in terms of capability.
The new system gives us some of that capability, but we are also getting access to other, less structured data and relationships. We have one of the biggest databases in the region and so have direct access to much of the information we need, rather than having to constantly pester brokers for it. We are using whatever we can find that we already have access to in order to identify relationships, and matching that to the trading. This is somewhat more sophisticated than just the new surveillance system.
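By way of illustration, the kind of look-back query described above can be sketched in a few lines of Python. This is not ASIC’s actual system, which was built by First Derivatives and has its own query language; every record, field name and look-back window below is hypothetical.

```python
from datetime import datetime, timedelta

# Hypothetical records; in practice these would come from the surveillance
# database, not from in-memory lists.
trades = [
    {"account": "A123", "symbol": "XYZ", "time": datetime(2014, 3, 10, 9, 41)},
    {"account": "B456", "symbol": "XYZ", "time": datetime(2014, 3, 12, 15, 2)},
]
announcements = [{"symbol": "XYZ", "time": datetime(2014, 3, 12, 16, 0)}]

def pre_announcement_trades(trades, announcements, window=timedelta(hours=48)):
    """Flag trades executed within `window` before an announcement in the
    same symbol -- candidates for a closer look, not proof of wrongdoing."""
    flagged = []
    for ann in announcements:
        for t in trades:
            same_symbol = t["symbol"] == ann["symbol"]
            in_window = ann["time"] - window <= t["time"] < ann["time"]
            if same_symbol and in_window:
                flagged.append({**t, "announcement": ann["time"]})
    return flagged

for hit in pre_announcement_trades(trades, announcements):
    print(hit["account"], "traded", hit["symbol"], "at", hit["time"],
          "ahead of the announcement at", hit["announcement"])
```

The point of the pattern-based approach is that the same query can then be run over the whole history: change the window, the symbol or the pattern and ask “where else is this happening?”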
Regulators tooling up
A new system is all very well, but it will only operate efficiently if it is staffed by people who understand what is going on. We have been building our skill set and now have ex-traders, ex-electronic traders, market makers and real rocket scientists. We are developing staff with the skills to understand the industry and the trading, to know who they are dealing with, and to use those tools appropriately. We have been concentrating on scaling up the staff skill set in the same way as we have the technology.
We also spend time working with other regulators, discussing the capabilities we can share.
Understanding the market
We spend time talking to people in the market so they know that we are observing it. We talk to the electronic traders so that we get a really good understanding of who their clients are and what they are doing. When something goes wrong, this gives us a view of what the problem might have been, because we may already understand the strategy behind that sort of execution. We quite deliberately communicate regularly with the industry; we publish statistics on the referrals we have made and on enforcement actions. We put out quarterly statistics on the shape of the markets, including order-to-trade ratios, and a six-monthly market supervision report. We are talking on a daily basis about what is going on in the market, particularly with regard to electronic clients.
The HFT and dark pools review
We were pretty happy with the outcomes of both the HFT review and the dark pools discussion. We use analytics as the basis for forming opinions and making decisions, and we think our expertise, particularly with regard to HFT, has given the market confidence. In our opinion HFT is not the problem everyone thinks it is. In fact, there was an issue with the quality of some of the buy-side algorithms and the ‘noise’ they were creating. Essentially the algos were more of a problem than the HFT firms, who were generally pretty efficient, apart from a few who might have been guilty of not having the appropriate filters in place. But in general, I think we were fairly happy with the outcome.
The new system and its capabilities enable us to look at market replays. It might take us a little time, but we can replay markets right down to the microsecond level, as though it were just normal trading. There are some systems attempting to influence other systems, and we have spoken to high frequency traders who are having to defend themselves against these sorts of things. Currently, I think they are pretty much there; however, tomorrow may bring something that blows everything out of the water, so we need to keep up to speed with the market and ensure everyone is doing their bit to maintain market integrity.
Global regulatory cooperation
We are involved with inter-market surveillance group meetings where we talk about the different types of behaviour we have witnessed. We might spend a few hours discussing one particular alert type in detail. We are in regular discussions with FINRA, IIROC and FCA with whom we share much of our thinking. We benefit from their experience as they have been in this business much longer than us.
This dialogue is important to us, particularly with regard to how processes work, trading patterns and so on; these are ideas that FINRA, IIROC and the FCA have shared with us. They have also alerted us to people who have been misbehaving in their markets.
For example, we recently had an online broker hacking incident that went through three brokers in one day. We had been discussing this at the inter-market surveillance group, so as soon as we saw it, we knew what was happening. One of the regulators was able to help us track the money flow all the way back to somewhere in Eastern Europe, and the behaviour was stopped immediately. So it was through those relationships that, firstly, we knew what was going on and, secondly, we were able to follow the money trail.
This can happen because there is an established network of relationships at the regulator level. The key message is that we are moving from real-time identification of one person who might have done something once or twice to a more structured integration of real-time surveillance with daily, weekly, quarterly, annual and thematic work. This is where we go back over the database, look at what is going on, look at the pattern and ask “where else is this happening?” We look at absolutely everything now; every transaction and every order is examined by the system, and this will happen more regularly and more rigorously in future.
Transparency Through Latency Analysis
By Markus Löw, Head of Monitoring, Vassilis Vergotis, EVP & Head of Americas and Fabian Rijlaarsdam, Monitoring Analyst at Eurex Exchange.
Technology is one of the most important cornerstones of electronic markets and, more broadly, of today’s global markets. As technology allows for increasingly complex systems from a design standpoint, it is essential to get a clear view of each step inside a system in order to understand outcomes. Gathering data at discrete points and understanding the outcomes within such systems is a practical way to break down the complexity and deliver digestible information. Within financial market technology, especially in the case of exchanges and alternative trading platforms, transparency is essential for monitoring the fairness and market structure of financial markets:
- Data analysis on latency is important in this matter, as it provides insight into the technical infrastructure.
- Latency statistics are essential to provide users, operators and supervisory authorities with detailed information on the performance and stability of the technical infrastructure.
Despite the common perception that these statistics are valuable and important in understanding developments in market structure, two problems exist:
- Firstly, few trading venues and systems providers actually publish such statistics, so the data needed is often simply not available.
- Secondly, and perhaps more importantly, many people do not fully understand the data that is available, because interpreting it depends on in-depth knowledge of the technical infrastructure being used.
This article aims to increase understanding of latency statistics by expressing our view on latency, using Eurex Exchange’s latency statistics as an example. Our latency monitoring tools help our Participants to optimize and adapt their trading strategies to the market and the infrastructure available. Eurex Exchange’s latency transparency, as well as its detailed monitoring capabilities, has been well received by Trading Participants across the board and has helped the Exchange and its Participants to solve problems collaboratively. Most recently, these figures have shown that our new T7 trading architecture has drastically reduced latency and increased consistency in our technology.
Since the introduction of Eurex Exchange’s T7, latency statistics have demonstrated that T7 is significantly faster than the Exchange’s previous trading system. Average (request-to-response) round-trip latency has been reduced from approximately 3 milliseconds on the previous trading system to approximately 250 microseconds on Eurex Exchange’s T7. According to a number of sources, including Automated Trader’s extensive 2013 Global Trading Trends Survey Report, many Participants in the market have decided to withdraw from the “latency arms race”, judging incremental improvements in round-trip times to be too costly. Against this backdrop, the consistency improvements delivered by Eurex Exchange’s T7 are all the more important to the wide range of market participants that deploy less latency-sensitive investment strategies.
Achieving a common definition of latency
As a starting point, we refer to latency as the time between the arrival of a message at our application-space ‘gateway’ and the send-time of the corresponding message response, commonly known as the (exchange) round-trip time. Graph 1 represents the distribution of these exchange round-trip times on Eurex Exchange’s T7. The graph depicts transaction latencies in microseconds. The probability density function of round-trip latencies is shown on the left Y-axis and the cumulative probability is shown on the right Y-axis. When analyzing the graph, one can look at a number of variables: the mean, median, standard deviation or tail (i.e. the higher percentiles of the distribution).
When looking at data, the average is an easy figure to grasp and is therefore often used. However, it can be rendered imprecise if the measurement cited is an average collected across a number of different products and/or time frames. Therefore, in the remainder of this article, we will mainly look at the tail of the latency distribution shown in graph 1, as these are the cases in which Participants run the highest risks from being uninformed of the status of their transactions.
There are several plausible explanations for the occurrence of such a tail:
- On the one hand, a technical issue delays transactions coming in, causing Participants to unnecessarily run higher risks.
- On the other hand, if a certain market situation, e.g. a scheduled news event, causes Participants to behave in a correlated manner, the delays still imply extra risk for Participants, but this risk is introduced by the physical limits of the exchange system: the better the system, the less risk Participants run.
Either way, latency figures and other analysis of transaction traffic help both Participants and us to understand the robustness of latency in the system. From the graph we can see that, on an average day, over 90% of transactions on Eurex Exchange’s T7 have a round-trip latency of less than 400 microseconds.
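The difference between quoting an average and examining the tail can be made concrete with a short calculation. The sketch below, in Python, uses a simulated log-normal sample as a stand-in for the real per-transaction measurements behind graph 1 (which are not reproduced here), and computes the mean, the median, two tail percentiles and the share of transactions under 400 microseconds.

```python
import random
import statistics

# Simulated round-trip times in microseconds -- a stand-in for real data.
random.seed(42)
rtts = [random.lognormvariate(5.4, 0.35) for _ in range(100_000)]

def percentile(data, q):
    """q-th percentile (0-100) by nearest rank on the sorted sample."""
    s = sorted(data)
    idx = min(len(s) - 1, max(0, int(round(q / 100 * (len(s) - 1)))))
    return s[idx]

mean = statistics.mean(rtts)
median = statistics.median(rtts)
p90, p99 = percentile(rtts, 90), percentile(rtts, 99)
under_400 = sum(1 for x in rtts if x < 400) / len(rtts)

print(f"mean   {mean:8.1f} us")
print(f"median {median:8.1f} us")
print(f"p90    {p90:8.1f} us   p99 {p99:8.1f} us")
print(f"share under 400 us: {under_400:.1%}")
```

A skewed distribution like this one is exactly the case where the mean alone misleads: the median and the mean sit comfortably low while the 99th percentile, the region where a Participant is left uninformed the longest, tells a different story.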
Monitoring tools at Eurex Exchange
Eurex Exchange supports the goal of promoting system transparency in the industry. Our wide range of monitoring tools is designed to give users extremely granular and detailed timestamp information. Furthermore, Exchange Participants not only receive their own data; they also have access to the “best in class” values for each measuring point, in order to benchmark their performance. Within the round-trip measurement discussed earlier in this article, we implemented a series of smaller measurements at numerous points within our hardware and software, to provide as many independent measuring points as is possible and practical. By doing so, we give our users access to more data to enhance their own performance analysis.
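To illustrate what a chain of independent measuring points adds over a single round-trip figure, the following sketch derives per-hop latencies from timestamps taken at successive points. The point names and timings are invented for the example; they are not Eurex Exchange’s actual instrumentation.

```python
# Measuring points along one (hypothetical) path through the system.
points = ["gateway_in", "matcher_in", "matcher_out", "gateway_out"]

# One transaction's timestamps in microseconds since an arbitrary epoch.
timestamps = {"gateway_in": 0.0, "matcher_in": 105.3,
              "matcher_out": 161.8, "gateway_out": 248.9}

def segment_latencies(points, ts):
    """Latency of each hop between consecutive measuring points."""
    return {f"{a}->{b}": ts[b] - ts[a] for a, b in zip(points, points[1:])}

for hop, us in segment_latencies(points, timestamps).items():
    print(f"{hop:26s} {us:7.1f} us")
print(f"{'round trip':26s} {timestamps['gateway_out'] - timestamps['gateway_in']:7.1f} us")
```

A Participant benchmarking against per-point “best in class” values can then see which hop, rather than which round trip, is the outlier.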
Transparency promotes trust as well as cooperation
Wolfgang Eholzer, Head of Trading IT, explains that offering latency monitoring tools has benefitted market participants and that their introduction has been immensely positive for Eurex Exchange. He says: “When we first introduced our monitoring tools, we quickly learned that empowering our Participants to monitor their own latency was an immensely productive step. Our Trading Participants’ IT staffs proactively contacted us with questions, observations and some helpful constructive criticism. As a result of our openness, we were able to work together to solve both perceived and actual problems and optimize our respective networks in an extremely efficient manner. This teamwork continues today.”
A call to action for the industry
Eurex Exchange believes that developments in both the industry and technology will encourage a greater number of systems providers to introduce more transparency into their systems. Eholzer sees more openness from the industry as a forward-thinking move. He states, “Sharing system data on latency is a practical step that many systems providers can take to enhance their customer relationships and do their part to create a more stable market infrastructure.” That trend will benefit the marketplace, as “two eyes are better than one” when it comes to identifying and correcting potential system issues.
For further information please visit
www.eurexchange.com
The Three Tenets Of Capital Markets Reform
By Fred Ponzo, CEO, GreySpark Partners
Capital markets reforms, crystallised in the US Dodd-Frank Act and its EU counterparts, and now being implemented in similar form by a growing number of other countries such as Australia and Canada, have a single aim: to reduce systemic risk in the global financial system.
As such, three tenets underpin the new architecture devised by legislators. First is the management of market and counterparty risk, both of which historically were commingled and buried within bank balance sheets. These risks must now be segregated and dealt with independently from each other.
The second principle of the financial markets reform is the steep disincentive now applied to banks regarding risk-taking activities. It takes the form of punitive capital charges on uncleared or uncollateralised trades (Basel III), as well as through outright bans on proprietary trading (e.g. the Dodd-Frank Act’s Volcker Rule). The intent of these rules is clearly to nudge investment banks to fall back into their place within the global financial system as intermediaries, instead of behaving like supercharged hedge funds. Lawmakers in the EU and US favour a world wherein the provision of market liquidity is assumed by shadow banking outfits instead of deposit-taking banks. In short, this change will push market risk onto firms that do not need bailing out by taxpayers if their bets backfire.
The third tenet is to move most counterparty risk onto clearinghouses. The reality is that mandating the central clearing of OTC contracts does not in itself reduce systemic risk. Instead, by concentrating counterparty risk within a handful of fit-for-purpose institutions, such risk can be accurately accounted for and, therefore, adequately managed. This is a significant improvement on the previous belief that risk would naturally spread out across the universe of investors, making the market as a whole more resilient. That assumption was proved wrong when, in 2008, the London-based trading arm of US insurance giant AIG was found to be at one end of the majority of CDS contracts in the market, leading to an eventual $85bn US government bailout of the company using taxpayer cash.
In this spirit, central counterparties (CCPs) should be, in essence, public utilities, operating as insurance schemes in a manner akin to mandatory, universal healthcare coverage. All clearing members to a CCP should be treated equally and should contribute premiums that are pooled to cover the risk of a default by market participants. In practice, clearinghouses are an oligopoly of competing commercial organisations in which clearing members take an equity stake. This equity cushion is meant to absorb potential losses incurred by a default.
In my view, the capital structure of CCPs represents the Achilles heel of the new market architecture. A single large default event could wipe out the capital of the CCP, triggering its nationalisation to safeguard the whole system. Maybe this is what is required to return clearinghouses to their original form of mutualised, non-profit infrastructure. I cannot help thinking we could have spared ourselves the thrill of having to look into the abyss once again in the not-so-distant future.
The Changing Nature of Algos In Asia
Matt Saul, Head of Trading Asia ex. Japan at Fidelity Worldwide Investment looks at the changing technology of algos in Asia.
How has increased algo trading affected market microstructure across Asia?
We’ve seen market microstructure changing partly as a result of algo trading but also as exchanges reduce spreads and change rules to facilitate growth in trading volumes (read HFT and general quantitative trading). This in turn has triggered algo providers to further evolve their strategies, all of which seems to be resulting in less (and more ephemeral) displayed market liquidity.
Our usage of algos made a step change higher several years ago; it has been relatively stable since then, but still accounts for less than half of our total activity. The nature of the algos being used has certainly evolved over the last few years, however, with more vanilla strategies being refined to suit our traders’ styles and greater use of what I guess you’d call opportunistic strategies. Tools for monitoring trading (algo and more traditional) have also greatly improved, as has “live” TCA, although using this data to improve future trading outcomes has to date proved challenging.
How is this change affecting the trading desk’s decision process between self-directed flow, high touch and block trading?
As discussed, the ratios of our activity are relatively stable. Barring some radical new development, the most likely trigger for a further sharp rise in that share might be another marked downturn in markets. This would reduce market turnover and probably trigger further cuts at brokers’ “high touch” desks, further degrading their offering and making it more likely that we would not choose to pay the commission premium, instead relying on our own capabilities.
What comes next?
I think that given the current high level of regulatory change, most brokers and clients are merely struggling to keep their existing strategies “compliant” and to adapt to changes in market microstructure. Developing markedly new strategies seems to be something for a more profitable time in the cycle. The biggest open strategic question for the sell-side, which has made tentative steps toward merging different channels, is whether it really embraces this change in a disruptive way and tries to recapture some of the value of anonymity and liquidity access that it previously gave away (almost unwittingly) with the development of “low touch” desks in the previous decade. My other forecast is for somebody to attempt more intermediated “dark” (grey) block crossing facilities.
Blurred Definitions; Broken Models
With Clive Williams, Global Head of Equity Trading, T Rowe Price
People are very much aware of dark liquidity and what it means. But what I think people are probably unaware of is that if you add up all the volume in the fragmented dark, the true number is closer to 55%, not the 40% that is commonly quoted. So it is a minority of trading that actually contributes to price formation; price discovery is being impinged by dark flow, which is a challenging situation.
Brokers are increasingly indistinguishable from exchanges, and vice versa; both want to be each other. For example, NASDAQ tried to launch algorithms to compete with the sell-side. It becomes accepted behaviour.
The argument we hear from the sell-side is that exchanges are too expensive. The simple way to solve this is to reduce exchange fees, specifically the access fees that are part of the maker/taker model. Ban maker/taker and a lot of issues would be solved.
The exchanges are also increasingly becoming more like technology companies than trading companies; data is where they make their money, which is counterintuitive.
It’s been tried before, but maybe now is the time for the buy-side to get together and have their own exchange. We haven’t seen that yet, but anything is possible.
Everyone comes up with a more complex answer to solve the same issue, but not so long ago market structure was relatively simple. Why do we need 50 dark pools and 13 different exchanges? They say it’s competition. I struggle to see that. Is it just about a different pricing model? Again, that all pertains to the maker/taker model, where the biggest clients are the high frequency traders. They’re not looking after the investors and are not interested in fundamentals or even risk taking. They are just sand in the machine.
An exchange is supposed to be a facilitator for companies that want to raise money, help the economy and grow jobs. We seem to have lost sight of that; companies no longer want to come and list in the market. They’re happy to stay private longer as financing of private companies is much easier today. Increased regulation may have possibly played a part in the reduction of listed companies but it’s nowhere near the full answer.
We suffer as an institution because these small growth companies are no longer serviced by research analysts; approximately 40% of all listed stocks globally don’t have any research coverage.
This lack of research and promotion is because there are no incentives for brokers to pick these things up. They’re busy spending their money on technology and trying to interconnect with the dark pools and venues; they’re forgetting what they were about, namely helping companies raise capital and grow.
HKEx Ecosystem Forum: Technology And Market Integrity
The 20th of March saw HKEx host their annual Ecosystem Forum conference. This event brings together the exchange and the industry to talk about the latest developments in technology and services in Hong Kong.
The conference altered its normal format slightly this year: while including a range of presentations and speeches from HKEx senior staff, there were also two panel discussions which focused on regulation and changing electronic trading patterns in Hong Kong and the wider region.
The event opened with remarks by Jonathan Leung, Senior Vice President and Head of Hosting Services at HKEx, giving an update on the latest technology being employed at the exchange. The day mostly focused on the continuing rollout of the exchange’s Orion suite of technologies. The Orion Central Gateway is one of these flagship technologies: details were shared about its support for FIX, precisely how it will function and how it will be rolled out. The other central technology was the Orion Market Data initiative and its move into derivatives data. These updates are coupled with the new push for the Orion Trading Platform, with which HKEx hopes to give the trading platforms used on the exchange greater functionality and the latest technology.
The first panel of the day focused on changing regulation throughout Hong Kong and Asia, taking a broad view of the drivers behind regulation: be it retail concerns, specific market incidents, or simply the stability and integrity of the market as a whole. Underpinning this discussion is the changing answer to the question: “who is responsible when something goes wrong?” As we have seen from the SFC’s algorithmic trading regulation that came into force on January 1st, responsibility is shifting much more definitely towards the buy-side and electronic trading vendors, but there was still a call for the exchange to shoulder burdens as well. The precise balance of which checks sit where is a matter for continued debate. Generally, the exchange’s comments regarding circuit breakers and reinstating a closing auction in Hong Kong were well received as steps in the right direction.
On the Orion Market Data platforms specifically, the drive is towards the customisation of the exchange’s products to specific segments of the market, breaking out products to meet specific needs of users. Latency is also being improved through faster feeds and a new messaging protocol.
One of the most widely attended sessions of the day looked at market integrity. The HKEx announced in the session that it was closely examining the reintroduction of the closing auction in Hong Kong, and looked at the various reasons why a closing auction should be brought back and the potential form it might take. By way of international comparison, Hong Kong is the only one of the 23 markets on the MSCI developed markets list without a closing auction. The exchange is expected to begin consultation in 2014.
The other area of interest in the same presentation was the introduction of circuit breakers: a contentious point, with various markets in the region introducing different checks. Picking up from the earlier panel on where responsibility for trading errors sits, the exchange confirmed it is looking at circuit breakers, but it would not be drawn on their precise nature. The debate is broadly around two areas: what should sit at the market level, and what should sit at the instrument level. In terms of specifics, these boil down to whether the bands should be dynamic or static, which instruments should be included in their calculation, and what should happen when the bands are triggered. There are many open questions around circuit breakers, but again the exchange will consult in 2014.
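The static-versus-dynamic question can be made concrete with a small sketch. The thresholds and reference prices below are purely illustrative assumptions; at the time of writing the exchange had not defined any of these parameters.

```python
# Illustrative only: a static band anchored to a session-wide reference
# (e.g. the previous close) versus a dynamic band anchored to a moving
# reference (e.g. the last trade). Percentages are invented for the example.

def check_bands(price, static_ref, dynamic_ref,
                static_pct=0.10, dynamic_pct=0.03):
    """Return which bands, if any, an incoming price breaches."""
    breaches = []
    if abs(price - static_ref) / static_ref > static_pct:
        breaches.append("static")
    if abs(price - dynamic_ref) / dynamic_ref > dynamic_pct:
        breaches.append("dynamic")
    return breaches  # an empty list means the price passes both checks

print(check_bands(104.0, static_ref=100.0, dynamic_ref=103.5))  # []
print(check_bands(108.0, static_ref=100.0, dynamic_ref=103.5))  # ['dynamic']
print(check_bands(112.0, static_ref=100.0, dynamic_ref=103.5))  # ['static', 'dynamic']
```

The open design questions in the consultation map directly onto the parameters here: what the reference prices should be, how wide the percentages are, and what action follows a breach (reject the order, pause the instrument, or halt the market).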
The second panel of the day looked at the changing nature of electronic and algorithmic trading in the region. One key question was how the SFC’s algo regulations have changed electronic markets and technology since their introduction on January 1st. When the panel was asked whether the regulations have stifled innovation, the general consensus appeared to be that the rules were too loose before, and that fewer and simpler algos might actually be a good thing for the market. The main takeaway was that technology is there to make trading better, safer and more reliable for everyone, and regulation can back up that role.
The day was attended by over 400 market participants and combined the best new content from the exchange with a number of great keynotes and panel discussions.
Changing priorities: Lynn Strongin Dodds
GOING IT ALONE? WELL ALMOST.
In the post-financial crisis era, the mantra reverberating in sellside hallways is the need to be more client-focused. This of course begs the question of why customers weren’t always centre stage. The truth is that in the heyday, investment banks were able to pay attention to only a narrow band of institutions and still generate healthy profits. The picture is different today, despite the improvement in the UK’s and Europe’s economic fortunes. Banks need a broader base to boost the bottom line, but winning the hearts and minds of the larger and even medium-sized buyside firms is not that easy, as many are taking greater control of their destinies.
This disintermediation is not a completely new theme but one that has gained momentum over the past five years thanks to technological innovation and regulatory reform. The recent wholesale review by the Financial Conduct Authority (FCA) of the long-established practice of using client-dealing commissions to pay for research content is just the latest example of the shake-up in the buy and sellside relationship.
A recent survey conducted by Frost Consulting and software firm Quark estimates that around $22bn is paid globally each year for research services, but investment banks’ slice of the pie has shrunk as asset managers turn to independent research providers and consultants, and enter into commission-sharing agreements with banks. At the same time, research budgets at large banks have fallen from $8.2bn in 2007 to $4.8bn in 2013. Banks may have less money available, but the demands on them are greater: they are expected to produce more than the all-encompassing “waterfront” models and maintenance research. Fund managers also want in-depth analysis and insight on specific sectors, geographies and themes.
Expectations have also been raised on the trading front, where a suite of state-of-the-art algo tools is now a given. This is especially relevant today as European buyside firms increasingly pay for the automated execution of transactions as opposed to human-guided trading. Last year, orders completed by algorithms accounted for 51% of payments, up from 46% in 2012, and the share may rise to 52% this year, according to a recent report from TABB Group. By contrast, traditional human sales traders saw their share of commissions drop to 27% last year from 32% in 2012, with a projected slide to 25% in 2014.
The TABB report points out that it is not just the larger players but also the medium-sized firms who are becoming more discerning. If they do not get the service they think they deserve, then they will trade directly themselves via an algorithm. Striking this fine balance between technology and the human touch may be difficult, but it is the road that the sellside will have to travel if it wants to stay client-focused.
Lynn Strongin Dodds, Editor
Buyside focus: Huw Gronow
Taking the lead.
Huw Gronow, director and head of European equity trading at Principal Global Investors, discusses the firm’s strategy in the post-financial crisis world.

What do you think of the trend towards execution consulting on the sellside?
There is a lack of uniformity across buyside structures, and it is difficult for one, two or three models to meet all of their requirements. The sellside has been under pressure for some time due to the global financial crisis and the fall in volumes, which are critical to their revenues. Successful sellside firms have leveraged technology to enhance their communication conduits and provide choice without losing quality or efficiency. However, there has been a change in the topography, and in many cases the race for cost efficiencies has changed the relationship. It has led to what some call “juniorisation”, where senior staff are replaced by more junior people to deal with more buyside firms. For us it is important to have the choice to select the relationships that best fit our needs at any given time, which is why the structural design of a sellside operation is challenging.
How do you select a broker?
The criteria we use focus on brokers’ ability to access liquidity, and that can take several guises. For example, it could range from a superior suite of algos, which can find liquidity in fragmented markets in a scalable and efficient manner, to expertise in small to mid-cap or emerging market stocks, where we might look at a local or regional specialist. As the trend towards more automated and self-directed trading has grown, we are placing more value on technological competency and on whether brokers are sound guardians of our order flow under the instructions we provide. We have about 25 to 30 brokers on our list and we put great store in transaction cost analysis to measure their performance. Given our deeper understanding of market structure, we are looking to integrate that analysis into our investment process.
Does that mean you are taking more responsibility for trading?
Overall, the buyside has taken much more responsibility for its own trading and will use the technology that the sellside has given it. For us, though, one of the most critical factors has been the development of TCA for measuring not only broker performance but also our own. We are now able to do that in a much more granular way, thanks to technology as well as the increased bank of data that is now available. We can harvest so much more information, which enables us to better measure the quality of execution. As a result, over the past 15 years TCA has moved from being a pure compliance tool to a real-time strategic tool that enables us to quantitatively assess the value of the trading desk, whether on a simple trade or a more complex array of trades.
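As a generic illustration of that granularity (not Principal’s actual methodology), the sketch below computes one common TCA measure, implementation shortfall against the decision price, aggregated per broker. All names and numbers are hypothetical.

```python
# Hypothetical fills for one buy order, measured against the price at the
# moment the investment decision was made.
fills = [
    {"broker": "Broker A", "side": "buy", "qty": 10_000, "exec_px": 50.12},
    {"broker": "Broker A", "side": "buy", "qty": 5_000,  "exec_px": 50.20},
    {"broker": "Broker B", "side": "buy", "qty": 8_000,  "exec_px": 50.05},
]
decision_px = 50.00

def shortfall_bps(fill, decision_px):
    """Cost of one fill versus the decision price, in basis points.
    Positive means the fill was worse than the decision price."""
    sign = 1 if fill["side"] == "buy" else -1
    return sign * (fill["exec_px"] - decision_px) / decision_px * 10_000

by_broker = {}
for f in fills:
    cost = shortfall_bps(f, decision_px)
    qty, weighted = by_broker.get(f["broker"], (0, 0.0))
    by_broker[f["broker"]] = (qty + f["qty"], weighted + cost * f["qty"])

for broker, (qty, weighted) in by_broker.items():
    print(f"{broker}: {weighted / qty:5.1f} bps average shortfall on {qty:,} shares")
```

The same per-fill numbers can be cut by venue, algo or time of day, which is what turns TCA from a quarterly compliance report into the real-time strategic tool described above.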
What is the portfolio manager’s role?
We are seeing a much closer relationship between the portfolio manager and the trader. In the past these two roles were segregated, but we believe that integrating the two brings real benefits. Typically, portfolio managers were more interested in the post-trade review, but this has changed and they are becoming more involved in both pre- and post-trade analysis. This means the trading function is more than simply a transaction-execution function; it is an integral part of the strategic investment decision-making process. For example, integrating pre-trade transaction cost estimates allows us to adjust the expected alpha signal with a reasonable estimate of implementation costs. As a result, decisions can be based on realisable, expected alphas rather than purely theoretical ones.
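The adjustment described reduces to a simple rule: realisable alpha is expected alpha minus the pre-trade cost estimate. A minimal sketch with hypothetical figures:

```python
def realisable_alpha(expected_alpha_bps, est_cost_bps):
    """Expected alpha net of the pre-trade implementation cost estimate."""
    return expected_alpha_bps - est_cost_bps

# Hypothetical candidates: (symbol, expected alpha, estimated cost), in bps.
candidates = [("XYZ", 45.0, 18.0), ("ABC", 12.0, 15.0)]

for symbol, alpha, cost in candidates:
    net = realisable_alpha(alpha, cost)
    verdict = "trade" if net > 0 else "skip: estimated costs exceed the alpha"
    print(f"{symbol}: net {net:+.1f} bps -> {verdict}")
```

In the second case the idea looks attractive on paper but the estimated cost of implementing it consumes the signal, which is exactly the distinction between theoretical and realisable alpha.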
What impact do you think MiFID II’s caps on dark pool trading will have?
I understand the rationale for the caps in terms of greater market transparency but I will be interested to see how they will be implemented and enforced. I would continue to question the economic benefit of dark trading when the average trade size in non-institutional dark pools has shrunk and resembles that of the lit exchanges, with little or no price improvement.
That said, dark trading in large institutional size has significant benefits in terms of reduced market impact and transaction costs which positively impacts investment returns for savers. There is also no guarantee that trading on lit venues will rise if dark pool trading is capped.
What impact do you think high frequency trading (HFT) has on execution?
The thing about HFT is that there does not appear to be an agreed definition of what it entails. People often use broad definitions and generalisations but HFT is not a single type of strategy. HFT is simply a set of techniques, or a description of a set of tools. We take the view that there has been an increase in ultra-low latency, high-frequency short-term alpha market participants and although there is a lot of debate about their role, there is no turning the clock back. These strategies are also not new but have just increased over the past few years. From our perspective, we need to understand the microstructure of the market and move to conduct our own forensic analysis on the impact these techniques have on stock price action, so we can adjust our own trading accordingly. More independent research in this area would clearly be welcome. The regulators could and should deal with any impropriety that takes place and examine any structural or information asymmetry that could potentially compromise confidence in well-functioning and stable equity markets.
What are some of the challenges and opportunities ahead?
The greatest current challenge is the pace of regulatory change, and we have a committee in London, on which I sit, that closely monitors and responds to all the developments. In terms of opportunities, we plan to continue building on our globally-integrated investment process by conducting our own research into our trading, using quantitative techniques and resources to align our execution with our alpha process, in addition to further deepening our understanding of market microstructure globally.
Huw Gronow is head trader in London for Principal Global Investors. He joined Principal in 2004 and is also an active contributor to many of the firm’s ongoing research and development initiatives relating to trading. Before joining Principal, Huw was a trader at BlackRock International where he specialised in UK equities. Previously, he was a trader at TT International Investment Management specialising in global equities. Gronow received a master’s and a bachelor’s degree in natural sciences from the University of Cambridge. He is a member of the Chartered Institute for Securities & Investment (MCSI).
© BestExecution 2014
Buyside focus: Brian Schwieger
Strengthening the bonds.
Brian Schwieger, head of equities at the London Stock Exchange Group speaks to Best Execution and explains why the buyside is now in the spotlight.
How have you settled in?
The head of equities is a new role, created last year mainly because of changes in the marketplace. The LSE Group identified derivatives as a business for strategic growth but, at the same time, wanted to continue to focus energy and development on its core equities business. I report to Nicolas Bertrand, our head of equities and derivatives markets. My role is primarily to look at ways in which we can enhance LSE’s existing offerings, build greater awareness around them and find innovative ways to solve issues. As a result, I spent the first three months getting to know everyone and learning how the organisation works, and the next three months getting in touch with clients to see how we can add value to their trading experience. As the buyside takes more responsibility for its own trading through electronic platforms, it is important that we have a good dialogue with them as the end consumers of our services. The biggest challenge for them is finding liquidity with minimum market impact, because of market fragmentation. It is time for us to refresh ideas and ensure they know about our existing and new products.
What changes have been made?
We made a number of changes to our International Order Book (IOB), such as reducing the tick sizes for the most liquid IOB stocks, including Gazprom, Rosneft, Sberbank and Lukoil, to align with the FTSE 100. We also brought the start of regular trading forward to 08:00 from 08:15 so that the opening time matches that of the SETS market.
We have also just ended a consultation on further enhancing the trading of small to medium enterprises (SMEs), which includes looking at improving the functionality of SETSqx (a hybrid trading service that combines four daily electronic order book auctions with standalone non-electronic market maker quotes). We hope to make announcements on other new products in the next month or two.
Do you think best execution has improved since MiFID I?
It comes down to how you define best execution. If you are talking about child orders then things have definitely improved, but if you are looking at parent orders then it has become more difficult to execute these kinds of trades, because fragmentation makes it much harder to find liquidity. The market, though, has recognised these challenges and has responded with technology in the form of smart order routing, new platforms such as dark pools and broker crossing networks, and strategies to address them.
What impact do you think MiFID II will have?
We have fairly well defined principles, but in order to implement them we need a clearer set of details. This is particularly true of how trades are flagged. Today, dark trading operates under the reference price waiver (which allows transactions to take place off-exchange if they are matched at the mid-point between the best bid and offer generated in the lit market). I also think caps on dark trading will become a reality. Under MiFID II, the proposals would cap dark trading in a stock at 8% of the volume traded in the European Union, and at 4% of the total EU volume in that stock on any one platform. I think it will hit the more liquid names hardest, and the challenge will be in finding ways to connect the liquidity. However, we have time to develop solutions, and as MiFID I has shown, the industry can be innovative in terms of new ideas, products and platforms.
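The arithmetic of the proposed “double caps” is easy to sketch. The volumes below are hypothetical, and the measurement period and trade-flagging details of the final rules are omitted for simplicity.

```python
# Hypothetical volumes (in shares) for one stock over the relevant
# measurement period: total EU trading versus dark trading per venue.
total_eu_volume = 1_000_000_000
dark_volume_by_venue = {"Venue A": 30_000_000,
                        "Venue B": 45_000_000,
                        "Venue C": 12_000_000}

VENUE_CAP, MARKET_CAP = 0.04, 0.08  # 4% per venue, 8% EU-wide

for venue, vol in dark_volume_by_venue.items():
    share = vol / total_eu_volume
    flag = "  <-- over 4% venue cap" if share > VENUE_CAP else ""
    print(f"{venue}: {share:.2%}{flag}")

total_dark_share = sum(dark_volume_by_venue.values()) / total_eu_volume
print(f"all dark venues combined: {total_dark_share:.2%}"
      + ("  <-- over 8% EU-wide cap" if total_dark_share > MARKET_CAP else ""))
```

In this example one venue breaches its individual cap and the combined dark share breaches the EU-wide cap, which is the scenario in which a liquid name would be forced back onto lit venues or into exempt trade types.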
How do you think the buyside has changed the way it trades and what impact will the new rules have?
The buyside has definitely changed the way it trades and could become even more holistic in the future. We are already seeing some portfolio managers taking more of a hedge fund approach and working closely with their heads of trading. In the past the two were separate, but increasingly they are interacting with each other. There is also much more focus on the cost of trading, not just on the execution side but end to end, particularly post trade. If dark trading becomes constrained under the ‘double caps’ regime, we could also see a return to portfolio trading, particularly with volume weighted average price (VWAP) strategies, as well as requests for capital commitment, although the latter would be difficult given the constraints that investment banks are under because of Basel III.
What other changes have you seen on the buyside?
The composition of the market’s eco-system has changed over the past year, with a clear increase in long-only flow. We have also seen the return of US investors to Europe, and I think this will continue. There had been interesting discussions over reverse rotation, but 2013 saw a 20% rise in markets and there is still a healthy interest in equities. However, we understand from market analysts that this may be a more challenging year for fund managers. Last year, if you had simply tracked the index, you would have been a winner. The view seems to be that this year success will be much more about alpha and stock picking, and that certainly seems to be the way the year has started.
Against the backdrop of ICE buying NYSE Euronext, do you see more consolidation in the exchange industry?
People have talked about consolidation for a long time, but we have not seen the same proliferation of venues in Europe as in the US; the market has been more stable. There is always room for more competitors, but the real question today is about the value-add of the business model. The challenge for any venue is not only its ability to deliver liquidity but also the quality of the flow.
What other challenges do you see facing the industry as a whole?
The need to innovate is crucial because exchanges and MTFs both have certain fixed costs that they need to cover. Those costs are not coming down, and you need the right value proposition to stay ahead of your competitors. Investing in technology is also critical, but timing can be difficult: invest too early and the business may not be there; too late and you may have missed the opportunity. We believe that one of the keys is to be much more client-focused and to pay better attention to our participants, the nature of their liquidity, where they choose to trade and how we can better partner with them.
Brian Schwieger is head of equities at the London Stock Exchange Group. After graduating from the London School of Economics in 1990 he joined BP as an LPG trader before moving to Continental Grain in 1994. After taking a Masters in Finance at the London Business School he joined Morgan Stanley International in the UK Equity Trading and Market Making department, progressing there until 2005, when he joined Bank of America Merrill Lynch as Head of Execution Desk (Algorithmic Trading). He joined the London Stock Exchange Group in 2013 in his current role.
© BestExecution 2014