Testing The Engine
Nirav Shah, Global Head, Capital Markets Group, aurionPro, looks at the evolution of FIX testing procedures, and where they go from here.
Evolution of Testing
FIX testing and FIX engines have evolved together at a rapid pace over the last 10 years or so. Early FIX testing was done primarily using in-house tools built on top of the FIX engine, with the ability to send and receive FIX messages. The main focus of such tools was FIX semantics rather than business use cases. Even validation was manual, and tools were customised to requirements and bore the flavour of the individual tester. In my view, this was Generation I of FIX testing.
Then it evolved: Generation II abstracted the FIX semantics, and the focus moved to testing the business use cases. As usage of FIX increased, the testing focus rightly shifted towards ensuring business use cases were tested properly. The market started demanding scalability and flexibility.
Generation III mirrored the explosion of FIX usage across multiple asset classes, and hence the focus moved to performance across multiple markets and multi-asset baskets. Suddenly trading latencies had moved from milliseconds to microseconds, and the focus was also on rigorously testing the performance of the FIX infrastructure.
The current generation focuses more on automation, regression testing, and minimising the effort of adding new products across multiple asset classes, while moving towards the middle and back office. This is a challenging and interesting period in which FIX testing and FIX products need to demonstrate the scalability to move to allocations, confirmations, etc. The process has so far trailed FIX evolution, but it also needs to catch up with the evolution of software testing processes. The current demand of the market is to be better, faster and more flexible; able to meet the ever-growing landscape of new execution venues, new products, and new asset classes.
Current Testing Focus
The greatest concern for both sell-side and buy-side has always been to ensure fast time to market with reliable quality. For a sell-side with multiple clients sending orders, the focus would be to manage customer-specific rules of engagement (ROE). The automation of testing has to play a huge role going forward as FIX evolves further into allocations and confirmations, and as the markets finally start moving from FIX 4.2 to FIX 4.4 or FIX 5.0. A major reason why firms have opted to delay upgrades is that regression testing is not automated.
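To make the automation point concrete, below is a minimal sketch of the kind of regression check a firm might script over captured FIX traffic. Only standard FIX tags are assumed (35=MsgType, 11=ClOrdID, 39=OrdStatus); the expected-outcome table, ClOrdIDs and helper names are invented for illustration.

```python
SOH = "\x01"  # the standard FIX field delimiter

def parse_fix(raw):
    """Split a raw FIX message into a tag -> value dictionary."""
    return dict(f.split("=", 1) for f in raw.strip(SOH).split(SOH) if "=" in f)

# Business-level expectations for a regression run: each test-case ClOrdID
# (tag 11) maps to the terminal order status (tag 39) we expect to see.
EXPECTED = {"REG-001": "2",   # expect Filled
            "REG-002": "4"}   # expect Canceled

def run_regression(captured_messages):
    """Return a list of human-readable failures for the captured session."""
    failures = []
    for raw in captured_messages:
        msg = parse_fix(raw)
        if msg.get("35") != "8":        # only check ExecutionReports
            continue
        want = EXPECTED.get(msg.get("11"))
        if want is not None and msg.get("39") != want:
            failures.append(f"{msg.get('11')}: expected OrdStatus {want}, "
                            f"got {msg.get('39')}")
    return failures

sample = "8=FIX.4.2\x0135=8\x0111=REG-001\x0139=2\x01"
print(run_regression([sample]))   # [] -> no regressions detected
```

A harness of this shape, replayed against every new engine release, is what makes an upgrade from FIX 4.2 to 4.4 or 5.0 a routine exercise rather than a re-certification project.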
As markets have grown faster and faster, the main challenge for a sell-side is ensuring good quality of FIX systems, so that there is no downtime. We need to understand that trading is now truly global. A downtime of 10 minutes in 24 hours is just about acceptable, and therein lies the biggest challenge the industry is facing.
For a buy-side that connects to multiple brokers in multiple markets (and, most of the time, multiple brokers for the same execution venue), the system needs to be customisable fairly quickly, while delivering the same performance in terms of latency. Reference data is a major headache for a buy-side connecting to multiple brokers in multiple markets.
Why Third Party Test Now?
This is purely a strategic decision. Adopting third-party testing tools, or even outsourcing testing altogether, gives the firm the bandwidth to focus on the core business of trading; more importantly, it saves costs and ensures high quality.
There are numerous examples of buy-side and sell-side firms struggling to keep control of costs on their FIX infrastructure due to a lack of FIX expertise and a failure to use the right tools. It’s no longer the classical dilemma of “build or buy”; FIX testing is now moving into the domain of outsourced testing and hosting.
Testing FIX infrastructure has rapidly evolved into a product space with new ideas, and a primary aim of increasing operational efficiency. We have seen a number of our clients, many of whom have been FIX compliant for years, still lacking a testing platform or framework. Many times, this has resulted in them being slow off the mark in acquiring new customers or launching new products, while smaller firms have been more dynamic and have invested in the right technology and tools.
The Legacy of FIX Variants
Having different versions of FIX, or even a non-standard implementation, is a fact of life which has to be dealt with. It stems from the lack of proper testing tools, which has led firms to delay upgrades and adopt non-standard implementations.
A sell-side has to adapt its FIX ROE and systems to the needs of customers; and in times when margins are shrinking and money is made only when the customer starts trading, the pressure to on-board customers as quickly as possible has led to different versions and non-standard implementations. I think these are more business-driven decisions, aimed at minimising the impact on FIX infrastructure. A robust and truly integrated FIX testing tool would help to resolve many such cases, as the technology has to move at the same pace as business needs, if not ahead of them.
The net effect is that FIX systems are no longer scalable enough to match required performance levels, and a lot of cost and effort now goes into maintaining the status quo, thereby slowing down upgrades or the ability to add required features. It’s no secret that a lot of firms are still on FIX 4.1 or FIX 4.2, and not many have gone to 4.4 or 5.0, precisely for these reasons. Much of this comes down to the lack of a proper testing platform; the fear of not being able to test the system completely is pushing a lot of firms towards non-standard implementations.
If the end-client for a sell-side firm is the end customer who needs to be on-boarded quickly, then not much of this process can be done by the end-client. When a new customer is brought on-board, it starts a new relationship. The human interaction required in guiding the end customer through certification testing should never be removed. This inspires confidence in the end customer that the sell-side firm really cares about his or her business. It also gives the sell-side firm a deep insight into its customer’s profile, which can in turn help it to offer more relevant products and expand the relationship. However, if a new feature is being added, or the buy-side OMS has changed or been upgraded, then testing by the end-client can be automated, and it really needs to be in order to achieve efficiency. A balance needs to be struck between automation and human interaction to achieve optimal efficiency and maximum business expansion.
Now suppose a firm is launching a new algorithm or trading strategy; the end-client is then the execution trader on the desk, and they need to be involved in the testing process. Current tools are very tightly coupled with FIX tags, and the person testing needs to know the FIX tags very well. Normally, a FIX expert would sit with the trader and test the system, leading to a duplication of effort. There is a definite need to address this, and an evolution is happening in the FIX testing arena to do so. The bottom line is that FIX testing tools are evolving into FIX testing platforms with the ability to serve multiple users, including traders, FIX testers, and developers, so that each can derive the most value from them.
Building Barricades
In a rapidly changing world, Fergal Toomey, Chief Scientist and Co-Founder of Corvil, examines the risks, and remedies, of technological failure.
The Value of a Holistic Approach
Makers and users of technology have learned from lengthy experience that there is no single ‘silver bullet’ that can be applied to eliminate the risk of technical failure. Instead, system managers rely on a range of best practices and controls, each individually useful but imperfect, to reduce risks. A holistic analysis such as the ‘barrier’ or ‘Swiss Cheese’ model of accident prevention helps to clarify how failures happen in such multi-layered systems. In this model, defences are pictured as a succession of barriers against hazard, all of which must be penetrated or circumvented in order for losses to occur. The model accepts that each barrier has weaknesses and will sometimes fail to contain a hazard; nevertheless adding barriers helps to reduce risk so long as their weaknesses do not align in a common failure pattern.
The range of ‘hazard-containment barriers’ available to a typical trading firm is broad, and includes measures such as resilient design practices, multiple levels of software and system testing, automated risk checks, network control policies, system monitoring and supervision. Barriers can also include organisational policies, such as staff training and operational procedures, that aim to reduce human errors made while carrying out risky tasks. The model emphasises the importance of having diversity and defence in depth, with each successive barrier guarding against failures in earlier layers. Risks can be reduced by strengthening individual barriers, by increasing the number of barriers, and by making barriers more independent so that weaknesses are less likely to align.
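The arithmetic behind the model is worth making explicit. A toy sketch, with invented failure probabilities, shows why adding even an imperfect, independent barrier reduces residual risk, and why correlated weaknesses erode the benefit:

```python
from math import prod

# A toy illustration of the barrier model with invented probabilities: if
# each barrier independently fails to contain a hazard with probability p_i,
# the chance of a loss getting through is the product of those probabilities.

barrier_failure_probs = [0.10, 0.05, 0.20]  # e.g. testing, risk checks, monitoring
residual = prod(barrier_failure_probs)
print(f"P(loss) with 3 independent barriers: {residual:.4%}")       # 0.1000%

# Even a weak fourth barrier (fails half the time) halves the residual risk...
print(f"P(loss) with a 4th barrier (p=0.5): {residual * 0.5:.4%}")  # 0.0500%

# ...but the independence assumption is doing the work: two barriers that
# share a common failure mode behave more like one barrier than two.
```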
When things do go wrong we often tend to focus on immediate or proximate causes, such as a critical mistake made by an operator or a major defect in a particular piece of equipment. In reality there are multiple ways in which any particular loss-inducing hazard can be stopped and contained. We can better defend against fallibility if we look beyond the immediate cause and question how each layer of protection could be improved.
The Main Challenges
One of the challenges for trading technology in particular is that systems are not ‘closed’, in the sense that every trading system interacts with multiple others, including systems in other firms that are not under a single operator’s direct control. As a result, systems are exposed to patterns of interaction that are hard to anticipate, hard to design for, and may not be covered during system testing. And to make things even more challenging, the environment is not only diverse but also constantly changing. Market data rates are generally growing all the time, and patterns of activity across markets do not remain static. The fact that a system has worked well in the past does not guarantee that it will continue to work in the future.
This combination of factors means that a trading system may find itself operating in conditions that its designers did not foresee. To guard against error in such circumstances, there are three main defences that firms can deploy: resilient design (do not assume that outside systems will behave according to spec); automated limit checks and controls during operation (including traditional trading risk checks and also network-level traffic controls); and operational monitoring and supervision. The latter is the last line of defence against hazards that were not foreseen either during system design or during the design of automated checks.
On the ‘plus’ side, automated trading systems are generally considered to have a ‘fail-safe’ mode, namely to stop trading altogether and cancel outstanding orders. This is an advantage that other automated systems (automatic aircraft pilots, for example) don’t always enjoy. There have been calls for wider use of so-called ‘kill switches’ that would manually or automatically put a trading system into fail-safe mode, and we would agree that traders should try to make use of this advantage if they can.
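As a rough illustration of how an automated limit check and a kill switch fit together, here is a hypothetical sketch; the thresholds, class and hook names are all invented, not a description of any production system:

```python
# A toy sketch of the fail-safe idea: an automated limit check trips a kill
# switch that halts trading and cancels outstanding orders.

class KillSwitch:
    def __init__(self, max_orders_per_sec, max_open_notional):
        self.max_orders_per_sec = max_orders_per_sec
        self.max_open_notional = max_open_notional
        self.tripped = False

    def check(self, orders_last_sec, open_notional):
        """Return True if trading may continue; trip the switch otherwise."""
        if (orders_last_sec > self.max_orders_per_sec
                or open_notional > self.max_open_notional):
            self.tripped = True
        return not self.tripped

switch = KillSwitch(max_orders_per_sec=100, max_open_notional=5_000_000)

def on_timer_tick(orders_last_sec, open_notional, halt_strategy, cancel_all):
    # Called once per interval by a hypothetical trading loop.
    if not switch.check(orders_last_sec, open_notional):
        halt_strategy()   # stop generating new orders
        cancel_all()      # cancel outstanding orders: the fail-safe state

# A burst of 250 orders/second breaches the limit and trips the switch.
on_timer_tick(250, 1_000_000,
              halt_strategy=lambda: print("strategy halted"),
              cancel_all=lambda: print("all open orders cancelled"))
```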
The Trade-off Between Protection and Speed
‘Speed’ means different things to different people. In the sense that the speed of an automated trading system allows it to generate trades at a faster rate (i.e. more trades per second), it clearly creates the potential for larger losses to occur more quickly if the system gets out of control. Risk is conventionally defined as the product of potential loss and likelihood of occurrence. Higher-speed systems therefore require more protection in order to reduce the likelihood of failure and keep the same level of risk. This trade-off is not unique to electronic trading; it is common to any industry where automation is used to increase production.
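A back-of-the-envelope calculation, with invented figures, makes the trade-off concrete: if throughput rises a hundredfold, the tolerable likelihood of failure must fall roughly a hundredfold to hold expected loss constant.

```python
loss_per_bad_trade = 1_000        # assumed average loss per erroneous trade
risk_budget_per_minute = 60_000   # assumed acceptable expected loss per minute

for trades_per_second in (10, 1_000):
    exposure_per_minute = loss_per_bad_trade * trades_per_second * 60
    tolerable_failure_prob = risk_budget_per_minute / exposure_per_minute
    print(f"{trades_per_second:>5} trades/s: exposure {exposure_per_minute:>12,}"
          f"/min -> tolerable failure likelihood {tolerable_failure_prob:.4f}")

# 100x the throughput requires roughly 100x lower failure likelihood.
```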
In trading, the term ‘speed’ is sometimes a synonym for low-latency, i.e. the ability to process individual trades quickly, as distinct from throughput (trades per second). In a low-latency system there is less time available for pre-trade risk checks, and it can be harder for human operators to keep track of what’s going on since events happen faster. However, so long as basic pre-trade checks are in place, individual trades are unlikely to cause large losses: a greater threat comes from the risk of continuous patterns of erroneous trading. Such patterns can be monitored and terminated when necessary without compromising the latency of individual trades. Likewise the ability of human operators to understand what is happening is limited mainly by the tools and instruments that they use. People generally need assistance from technology in order to control high-speed systems. Firms using low-latency strategies need to employ appropriate monitoring and control techniques, but provided they do so then low-latency in itself does not necessarily increase risk.
The Main Driver of Development
There’s no doubt that recent events such as those at a well-known US market-maker have focused attention on technology risks surrounding automated trading. From our clients’ perspective, managing these risks successfully is primarily a question of self-preservation rather than being regulator-driven at this point. While the financial losses in that recent event were obviously shocking, firms are also concerned about the reputational harm and loss of client confidence that can be caused by a glitch even when there is no direct financial loss. The flip side of that concern is that, by demonstrating appropriate care and attention to the potential risks of technology, firms can hope to enhance their standing and attract new clients.
What Comes Next?
Our expectation is that aspects of infrastructure monitoring and risk monitoring that have traditionally been separate will now converge due to greater recognition of the risks that technology issues can pose. Our own experience has been that clients want to see business-level metrics that expose patterns of trading activity presented side-by-side with metrics for technology health and performance. The ability to monitor what’s happening at both levels together provides confidence that any potentially business-impacting technology glitch can be detected quickly.
A second factor driving convergence between these areas is the realisation that infrastructure monitoring systems have the potential to provide an independent and comprehensive overview of trading activity that is difficult to obtain elsewhere. These systems often have access to all of a firm’s trading communication traffic, and they monitor activity at a fairly low level that is close to the wire. Equipped with the right analysis software, they can create a consolidated picture of trading behaviour, determined independently of the firm’s trading systems themselves. That helps to avoid aligned weaknesses or common failure patterns between monitoring and trading. As these monitoring systems steadily enhance their trading analysis capabilities, users are applying them to more use cases in the areas of risk and compliance.
The State Of Play
Franklin Templeton Investments’ Director of Global Trading Strategy Bill Stephenson looks at the four main challenges facing the buy-side today.
What are some of the challenges your global traders are facing?
Well, there are four major themes and problems we have been trying to solve; most of them involve a technology solution.
The first issue is liquidity, which is hard to measure. Contrary to some beliefs, it isn’t always correlated with trade volume, bid/ask spreads, depth of book, or the speed of execution. While all of these things can make the market appear more liquid or easier to trade, we just haven’t found that to necessarily be the case.
The second challenge would be defining and reducing our market impact, and potential implicit costs. In a more complex trading environment, we need to be smarter about reducing information leakage and leveraging real contra-side liquidity to minimise our footprint. The third theme concerns transparency; this boils down to understanding how and where our orders are routed. It includes knowing where they may rest, and the methodology involved in how they interact with various liquidity destinations.
Finally, the fourth area of focus is data management. This involves how we take the intelligence we gather about our stocks and execution performance, and combine it with our fundamental internal information and analysis. This is probably the most proprietary effort we are undertaking right now, which could potentially yield the greatest benefit to our process.
Can you expand on these a little and describe how you are helping your firm solve these challenges?
Well, as I said, technology plays a part in this; quite honestly, in most cases it is the key to our success. I would say that a considerable amount of my time, and that of others within our group, is spent working with our traders and technology teams to figure out the best way of addressing these issues. Take our first challenge — liquidity. One of our senior traders, Ben Batory, led an effort to author an internal white paper that specifically addressed some of the challenges we are facing. It was essentially a thought piece geared toward helping us synthesise our views on the difficulty of current stock trading when compared to previous periods. It was something we could share with our portfolio managers, to help them understand why they may not be executing in the same manner that they have expected in the past. While the lack of volume was part of our theory, the quality of that volume was potentially even more important. Of course, there are many reasons why these two things are occurring right now. One reason is the macro environment and lack of conviction in the marketplace. Another part of it may be the market structure rules that have evolved over time, leading to unparalleled execution speed.
This increased speed has helped the markets in some ways and made it more challenging in others — or as some would describe, more nefarious. For instance, speed allows algorithmic traders to cancel orders very quickly, permitting apparent liquidity to be fleeting. We know we can’t just create liquidity or even fully optimise the type of liquidity we want to interact with; but we do believe it is a challenge we should continually strive to overcome. While we all want to transact with a natural counterparty at a price with minimal market impact, we completely understand the necessity of liquidity provisioning, and the resultant cost to access it. We do, however, believe there are times of unnecessary provisioning that disintermediates a buyer and seller, which comes at an additional cost. We need to better understand the types of liquidity pools — such as exchanges, dark pools, and crossing networks — and target those destinations we deem to contain less toxicity, i.e. less intermediation and information leakage. We can customise our experience in these liquidity venues by only interacting with certain tiers or liquidity providers, using minimum fill sizes, disabling outbound messaging, etc. We have always said that complexity isn’t always a bad thing in the marketplace, because it does create opportunities. However, overly arcane rules, the continuous introduction of new order types, and the ever-changing pricing schemes can actually cause participants to cease trading — which then impacts liquidity and trading costs. For example, in the last two years there have been well over a hundred exchange pricing changes, and hundreds of different exchange order types offered with perhaps thousands of permutations; and this is just in the US.
Understanding the types of liquidity pools leads to my second point, relating to transparency. We have seen enough to know that you can trust, but you must verify. Just asking how something works isn’t always going to be the best practice. We have required that our dark pools and ATSs open the kimono regarding how their systems work. For instance, at what prices do they cross, do they allow pinging, and what data feeds are they using? Led by the co-head of Americas Trading, Dave Lewis, we have moved on to the algorithmic providers as well, by insisting on two pieces of information. First, we require real-time information on specific FIX tags surrounding the executing destination. This includes whether they added or removed liquidity, the associated rebates/fees, and the quote at the time of trade. Secondly, we ask for an end-of-day file with similar information, but with more details relating to all the order routes, fill and non-fill destinations, millisecond-precision timestamps, and other related data. We completely understand that this can be highly proprietary information, but we also know that this information is necessary for us to understand how our orders are being represented — and it is our obligation to understand.
Additionally, this allows us to conduct further analysis to understand other measures, such as the velocity of trading and quote changes before and after our trade, price reversion, and the potential impact of odd-lot fills. We have millions of fills per month, so the challenge of building a platform to concatenate, analyse, and interpret that data is complex, but essential. We are building up our technology and people resources in this area. It is important for us to understand this data, since it will only continue to get more complicated.
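As an illustration of the kind of breakdown those FIX tags make possible, the sketch below aggregates hypothetical fills by venue and liquidity flag. Tags 30 (LastMkt), 851 (LastLiquidityInd: 1=Added, 2=Removed) and 32 (LastQty) are standard FIX fields; the fills, venue codes and output format are invented.

```python
from collections import Counter

LIQUIDITY = {"1": "added", "2": "removed", "3": "routed out"}

fills = [                                    # execution reports already parsed
    {"30": "XNAS", "851": "1", "32": "300"}, # into tag -> value dictionaries
    {"30": "BATS", "851": "2", "32": "500"},
    {"30": "XNAS", "851": "2", "32": "200"},
]

shares_by_bucket = Counter()
for fill in fills:
    bucket = (fill["30"], LIQUIDITY.get(fill.get("851"), "unknown"))
    shares_by_bucket[bucket] += int(fill["32"])   # aggregate shares (tag 32)

for (venue, liquidity), shares in sorted(shares_by_bucket.items()):
    print(f"{venue:<6} {liquidity:<10} {shares:>6} shares")
```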
As I just mentioned, the number of records being generated becomes huge. This leads me to the challenge of data management. We now have a variety of data that we believe is important to manage our business better, beyond just execution data. Managing this data has been a herculean challenge, but it is also one that presents immense opportunities. If we can better collect, manage, visualise, and interpret our data, both proprietarily and externally generated, we can create a value-added overlay in our investment process. For example, we are building a proprietary system conceptualised by Mat Gulley, the global head of trading, called Investment Dashboard. Without getting into specifics, the Investment Dashboard takes data from different departments in our firm, and organises it in a way that becomes insightful to users. This capability creates investment opportunities that were previously harder, if not impossible, to identify, and does so in a very timely manner. We think this will be a game changer for our organisation, because it really gets to the core of our white paper’s message: liquidity is challenging and the market is more complex than ever. But with greater communication between traders and portfolio managers with better tool sets, long-term investing can still be short-term opportunistic. In fact it needs to be in order to outperform in today’s environment. We view this symbiosis between trading and portfolio management through technology as core to our investment implementation process.
Finally, there is the issue of market impact. At the end of the day, we hope that all of this ties together. Through transparency, analysis, collaboration, and smarter liquidity seeking strategies we aim to reduce our impact on the market. We are not naïve; we realise that we will have market impact, given the potential size of our orders. We know that we can’t always beat the higher frequency traders to the punch, but our low frequency process, trader discretion, and investment horizon allows us to think long-term and also act opportunistically while in the market. Since our traders have discretion on their orders, they are not as predictable to market participants. We believe this creates an expected value for our clients. Quite frankly, we don’t need to be able to get from one exchange server in New Jersey to another in 161 microseconds to effectively compete. But we need to know how and why it is happening, so we can think more strategically about our choices.
A few years ago, you spoke with us about TCA. Has anything changed in how you measure your market impact or the value the traders are adding using your investment or trading horizon?
Sure, we have done a few things that tie the data together better. For instance, we believed that real-time analysis should seamlessly flow into our next day post-trade analysis. With that idea in mind, we re-engineered and outsourced our existing point-to-point FIX infrastructure to our TCA provider. We felt this would create more continuity for the traders, simplifying their monitoring processes, and ultimately allow for finer strategy adjustments. We are in the midst of this migration, which involves helping the vendor conceptualise, design, and prioritise specific enhancements to the real-time platform. The innovation will allow us more insight into our activity — not only for the trader, but also from a management and risk oversight standpoint. In addition, we have re-vamped our post-trade analysis to a more granular level, in addition to enhancing the data-set we deliver to the vendor. Again, we think TCA is not just about measuring the past, but using it to manage risks, plan strategy, and ultimately to help reduce costs or even add alpha going forward. It is definitely part of our culture when it comes to this analysis. We believe it helps us increase the predictability of the performance, and the experience for the portfolio manager over time. As we like to say, best execution is a predictable execution — and that comes from a foundationally sound best execution process.
It sounds like you have been working on a lot of different initiatives. Do you find that all this work is beginning to pay off for your investors?
We think that it is. Of course, we’d like to be able to precisely measure these improvements, but not everything is easily quantifiable. We feel strongly about how we measure our execution performance, which includes both our market impact and opportunity costs. This is because we designed our system and measurement methodology to match the investment process and align the goals of the trader and the portfolio over the expected duration of trade implementation. However, while we spend a lot of time trying to optimise performance, there will always be circumstantial impediments to our performance gains from things we can’t control but must successfully learn to adapt to or navigate through. These factors could include: market volatility, the types of stocks we are trading, price constraints or just the intra-day timing of orders from the portfolio managers. When it comes to the quantitative intelligence we are gaining from the FIX data we are requiring or the qualitative measures we use to measure potential toxicity, it is by no means easy to quantify the impact. With that said, we believe we have created a roadmap to navigate the global markets that takes in everything we have learned over many years of engagement. We do know that this intelligence does creep into performance for our clients, potentially fractions of a penny at a time — but we’ll take that in volumes. Getting back to the internal white paper, we can’t necessarily determine the impact of more collaboration, the increased speed of opportunity realisation, or even better expectation management; but we know that a tighter connection between portfolio managers, traders, and analysts, is the main driver toward allowing a process to be opportunistic, versus one that is always a step behind in the alpha generation process.
Events: Capital Markets Forum
TARGET2 – SECURITIES – WILL YOU BE READY?
Tata Consultancy Services’ Capital Markets Forum, held on 26 June at the heart of Europe’s political centre, the Maison Grand-Place in Brussels, provided a wake-up call for financial institutions not yet leveraging the TARGET2-Securities project. Delivering low-cost cross-border settlement, T2S will break down a significant barrier to the creation of a single European equities marketplace. At the forum, entitled ‘TARGET2 – Securities – Will you be Ready?’, leaders in the delivery of the project gave the audience an insight into the next stages of development.
Marc Bayle, TARGET2 – Securities (T2S) programme manager at the European Central Bank: Keeping on track
With over 50 central securities depositories (CSDs) and central banks signed up to T2S, banks need a strategy in place for participation if they are to capture the benefits of this new setting.
As of the end of June 2012, 24 CSDs, including six from outside the Eurozone, had agreed to join T2S. The project itself is halfway to completion: the legal framework and the technical documentation are defined, 65% of the coding was completed by the end of March 2012, and by the end of the year nearly 100% of the coding will be done. We are entering a phase of testing, heading towards the first markets going live in T2S by June 2015.
Recently the project has delivered several key developments: the new user detailed functional specification (UDFS), complemented by the new versions of SWIFT messages requested by the project’s advisory group; and the connectivity package, covering both application-to-application (A2A) and user-to-application (U2A) modes, including licences for SWIFT and Colt to develop connectivity solutions which, along with the Eurosystem’s CoreNet, will allow participants to connect to T2S.
The agreement for the euro has been fully signed; the agreement for non-euro currencies has been signed by Denmark, for whom it will come into effect in 2018.
We do not expect other non-euro currencies to sign up in the coming month, but there is a chance of a combined initiative in the Scandinavian region in the coming years. Timing is not such a concern for central banks signing up; there is no deadline, and the technical aspect is much less complex for them than for the CSDs.
The main countries not signed up are the UK, Norway, Sweden and Poland. While Switzerland’s CSD is participating, its central bank is not; and although the CSDs of Hungary and Romania have adapted to T2S, neither country’s central bank is considering putting its own currency onto the platform at this stage.
In May the Council of the European Union’s Economic and Financial Affairs Committee, made up of the finance ministers of national governments, reiterated its support for the project to go ahead. The ECB is working closely with the European Commission on its CSD regulation. This is an important plan which will help deliver a harmonised framework across the different markets. It increases legal certainty for the CSDs that come to T2S, particularly around the specific case of outsourcing by a private entity to a public entity. The regulation will also establish the law applicable to securities accounts, and it will help to ensure a level playing field among market participants.
The harmonisation work is now centred on two objectives: finalising the definition of the relevant standards, and improving the tools and procedures for monitoring the implementation of harmonisation standards by all project participants. This will help increase the benefits of joining T2S, in particular for the custodian banks.
Axel Pierron, senior vice president for Celent’s securities & investments group: Leveraging T2S Infrastructure: What are the options?
A study conducted by Celent late last year formed a picture of the costs and savings that T2S offers, based on the views of six major market participants. Overall, non-international CSDs and local custodians will face the biggest impact. Sub-custodians operating in a single country will have difficulty weathering the changes driven by T2S, so there will inevitably be consolidation amongst them. Those CSDs without an international business really need to move up the value chain to offer an asset servicing business, or to become a utility.
Three scenarios are projected for custodians to change their post trade arrangements:
1 – Doing nothing and relying on agent banks or an investor CSD’s services
2 – Direct connectivity to T2S for settlement and using an external provider for asset servicing, either a custodian or a CSD
3 – The full settlement approach and on-boarding of custody activity
Market participants are likely to use a mix of these approaches. There are four main elements of expenditure. The first is system costs. Scenarios two and three will require upfront investment and variable costs. The predicted settlement IT costs for both scenarios average €12.5 million; the additional custody IT costs for the full-service model in scenario three will be another €15 million on top, based on accessing two markets.
The second element will be running costs. For scenario two these are modelled at an annual average of €1 million, rising to over €5 million for scenario three.
Although T2S should be able to drive down trading costs in Europe, firms will, depending on how they respond to it, still have to purchase services (the third element) from an external provider; even in scenario three, for example, a tax agent will be needed. Spending on services under scenario three will of course be much lower than in scenario one, where you are relying on an agent bank.
The cost of liquidity will depend on the market a firm is in and its overall trading volumes, making its measurement impractical for the study. Clearly, if a custodian is providing intraday liquidity as part of a bundled service, as in scenario one, liquidity provision will be included in that; firms connected directly to T2S will have to manage their liquidity more closely, set against which is the gain of having just a single pool of liquidity.
Based on these costs, we found no business case for financial intermediaries with fewer than two million settlement orders per annum to connect to T2S directly. For firms engaging with scenario two, four million settlement transactions per year would be needed to provide a return on investment. For market participants with over €200 billion of assets under custody and managing over four million settlement orders per year, there is a business case to connect directly to T2S and take over the asset servicing function.
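A simple payback model, using the cost figures above but an invented per-order saving, illustrates how such thresholds arise; it is a sketch, not Celent’s methodology.

```python
# Upfront and running costs come from the article; the assumed saving per
# settlement order (versus buying the service from an agent bank) is invented.

SAVING_PER_ORDER = 1.0  # assumed EUR saved per settlement order (hypothetical)

UPFRONT = {"scenario 2": 12.5e6, "scenario 3": 12.5e6 + 15e6}  # IT build costs
RUNNING = {"scenario 2": 1.0e6,  "scenario 3": 5.0e6}          # annual costs

def payback_year(scenario, orders_per_year, horizon=10):
    """First year in which cumulative savings cover costs, else None."""
    net = -UPFRONT[scenario]
    for year in range(1, horizon + 1):
        net += orders_per_year * SAVING_PER_ORDER - RUNNING[scenario]
        if net >= 0:
            return year
    return None

for orders in (2e6, 4e6, 8e6):
    results = {s: payback_year(s, orders) for s in UPFRONT}
    print(f"{orders / 1e6:.0f}M orders/year -> {results}")
```

With these invented savings, two million orders a year never pays back within a decade, while four million does; the direction of the result, if not the numbers, matches the study’s thresholds.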
Given the costs involved and the period to generate a return on the investment, some banks will find it hard to justify the expenditure to go beyond scenario one.
Prabha Bob, Global Solutions Head, GCP – Capital Markets Group, at Tata Consultancy Services: Driving T2S Transformation
Transformation programmes, like the dematerialisation project in Japan or NASA’s space programme in the USA, have certain common themes: far-reaching impact on the business model; a long gestation period; complexity in quantifying benefits; and the use of technology as an enabler. The driving force behind such programmes is a belief that they are the right thing to do, an acknowledgement that the teamwork of stakeholders will put the jigsaw puzzle together, and a vision of building a better future.
At TCS we have been following developments on the T2S front since 2007 and have had opportunities to work alongside our clients.
In our view the T2S adaptation programme of stakeholders – CSDs, custodians, CCPs and national central banks – needs to have a three-dimensional approach encompassing organisation, technology and programme management.
The key challenges facing the stakeholders include building a flexible business model, minimising the impact on clients, enhancing the business value of the IT infrastructure, meeting the quality assurance objectives within the time frame for testing earmarked by ECB and ensuring a multi-phase rollout of an integrated settlement programme.
CSDs, custodians and other stakeholders can deploy a business process and change management framework to ensure a smooth transition to the target operating model. The various dimensions of the framework include:
• Leadership alignment that will set the overall strategy for T2S adaptation
• Organisational design to support new roles and responsibilities
• Business process adaptation for alignment of existing processes and developing new processes
• Improvement opportunity assessment to enhance operational efficiency
• Stakeholder buy-in and communications management
• Training
Technology is a key enabler, and a portfolio approach should be taken to enhance the business value of IT infrastructure. T2S adaptation can be leveraged to align the IT systems to the target-state architecture of the organisation. This could involve a technology refresh, consolidation, retirement or integration of applications. It is important to be conscious of the impact that other regulations, such as MiFIR and EMIR, will have on the organisation’s technology at the same time.
One option being examined by CSDs to minimise the impact on clients’ legacy platforms is to support multiple messaging standards in the initial phase. Another approach for stakeholders could be to develop an integration layer.
Leveraging existing settlement infrastructure for T2S adaptation is another area that calls for deliberation. One option is to develop an add-on component for T2S-specific processes on top of the existing settlement system; however, the merits of this strategy need to be established through a cost-benefit analysis. The maintenance of heterogeneous applications (a combination of legacy systems and products/solutions) across multiple geographies is another aspect of technology transformation that will confront stakeholders. Adoption of a multi-tenant, multi-entity solution with a T2S interface layer and a standard settlement engine could be one approach.
From a programme management perspective it is interesting to note that around 50% of the time in an IT project is spent on testing and migration. A key challenge for organisations in the T2S adaptation programme is the presence of multiple stakeholders with varying levels of readiness. Testing can be facilitated by developing a test harness for downstream users in the value chain. A test harness is a repository of test cases and scenarios accessible by stakeholders, providing a level playing field and enabling the monitoring of progress in the testing phase.
From an end-to-end market testing perspective 50-60 scenarios can be developed and deployed for mock sessions. This would enhance the confidence of the stakeholders.
Another concept is conformance testing, whereby the users of a stakeholder are assessed for readiness and provided with dedicated time and a dedicated environment for testing. For example, a CSD may have ten custodians at different states of maturity of T2S adaptation. The custodians are assessed individually before the testing is thrown open to all ten. Simulation tools are often used for conformance testing.
The next challenge is migration: how to ensure consistency and reliability of the T2S-adapted solution. We would propose a parallel testing approach, with the pre-T2S environment running alongside the T2S testing environment. Migrated data can be utilised for testing towards the end of the testing phase. Another approach is to conduct business-operations testing. This gives organisations the chance to test an entire day of operations, with a focus on meeting SLAs, the operations schedule and information exchange with external stakeholders. Performance issues, if any, can also be detected.
The T2S initiative is obviously more complex than the Japanese demat programme, and obviously more down to earth than NASA’s Moon, Mars and Beyond programme. But yes – it can take heart from the successful implementation of the Japanese dematerialisation initiative, and draw inspiration from the vision that propels the massive Moon, Mars and Beyond mission.
Panel discussion: The rocky road ahead
The panel debate showed that despite the shared vision of T2S among participants there were still some major concerns, around cost and co-ordination particularly.
Yvan Timmermans, chairman of the T2S Belgian National User Group (NUG), National Bank of Belgium (NBB) gave the perspective from a central bank on the project. He explained that the ECB Governing Council decided to discontinue the preparations for CCBM2 (Correspondent Central Banking Model). Therefore each NCB will provide the eligible assets and valuation list (for auto-collateralisation) to T2S individually. T2S will not be negatively impacted by this discontinuation. He said NBB will connect to T2S indirectly via its CSD as, like most central banks, NBB does not have the volume to justify a direct connection.
He highlighted two Belgian specificities. First, the NBB-SSS will probably not have a business case to set up links with the other T2S CSDs. Second, the other Belgian CSD, Euroclear Belgium, is not a Eurosystem-eligible CSD for receiving collateral as it mostly deals with equities that are not Eurosystem-eligible securities. Consequently, only eligible assets issued in NBB-SSS will be usable for auto-collateralisation by Belgian settlement banks.
James Cunningham, European Market and Regulatory Initiatives at BNY Mellon Asset Servicing, noted that national specificities risked blocking the achievement of a homogeneous system.
He said, “What we want is a homogenous settlement process as a precondition to T2S; if we don’t have it, then afterwards we are in the same position as today.”
Marc Tibi, head of market infrastructure at BNP Paribas Securities Services said, “We are worried that although we have one single European project called T2S, we do not have one single provider; we have 22 projects, each being handled domestically so the banks have to invest in and deal with 22 projects. That is not intelligent.”
Alex Dockx, T2S programme manager at JP Morgan Worldwide Securities Services, said that his firm’s analysis, as a user of CSDs, was that cost savings existed in the long term, with costs incurred in the short term.
The CSDs on the panel said that they were moving ahead. Jean-Luc Fripiat, director of product management at Euroclear CSDs and Group Services, which has seen four of its group’s seven CSDs sign up, asserted, “There is a strong willingness to join T2S. It will harmonise market practices and lift a lot of technical and legal barriers to having a harmonised solution across Europe,” adding, “There will be a move to an unbundled service model and we at Euroclear are delivering services already for collateral management and asset servicing as well.”
Italian CSD Monte Titoli has begun to develop new services, seeing T2S as an opportunity to become a global post-trade services provider. It has signed up in the first wave of CSDs, committed to testing in mid-2014 and going live in June 2015.
Alessandro Zignani, general manager at Monte Titoli said, “We won’t increase our pricing for T2S to minimise the impact on our community. From a technical and organisational point of view we are organising training with our user group, explaining what T2S is; small and medium size banks are not aware of the full impact.”
BNP Paribas’ Tibi observed that by developing solutions to T2S separately CSDs were creating significant additional costs.
“Monte Titoli said that it is looking at costs of €15 million to adapt to T2S, but Euroclear and Clearstream say costs are in the region of €80-€90 million. Of course there could be some volume effects, but not to the tune of €75 million. Why have several CSDs separately developed a GUI? Our suggestion is that CSDs should be made to share their viewpoint and best practices.”
©BestExecution 2012/13
Australia’s Multi-Market Landscape
Ben Jefferys, Head of Trading Solutions, IRESS, looks back at a year of the multi-market environment, and the future growth potential.
With almost a year of a multi-market environment in Australia behind us, it’s time to stop and reflect on the year that has just passed. The question talked about around town is whether expectations have been met; but depending on who you are or who you talk to, those expectations were very different things, if anything at all.
As Australia anticipated the launch of Chi-X and its rival ASX PureMatch, we looked abroad to get a sense of what to expect. It didn’t take long to realise that the legislative framework and market structure within Australia were different from those in other countries and regions of the world. Many felt that the Canadian experience would be the closest, but there were still enough significant differences to ensure that Australia’s journey was a unique one.
The most relevant point of difference between Australia and other parts of the world is what happens when there is a better price on one market than on another. In Australia there are no hard and fast rules requiring execution solely at the better price. It comes down to the market participant (i.e. the broker) and its best execution policy. The latest guidance from the regulator, ASIC, says that a broker needs to decide whether to trade on markets other than the ASX if outcomes on those markets, including all costs involved, are consistently better for clients.
In Europe where the European Union’s Markets in Financial Instruments Directive (MiFID) applies, best execution is “principles based” where the premise lies in achieving the best outcome for the client. It is important to note that it states best outcome and not just best price.
The US and Canada have what are known as trade-through protection rules, but the definition differs in each country. In the US, if there is a better quoted price on another exchange, the receiving exchange must forward the order to the exchange with the better price. In Canada it is similar but not quite the same: if there is a better price elsewhere, the exchange must reject the order, and the broker must then try again and send it to the exchange where the better price exists.
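A simplified sketch of the order-handling logic these three regimes imply might look as follows; the venues, quotes and interface are invented for illustration:

```python
from enum import Enum

class Regime(Enum):
    US = "forward"        # receiving exchange routes to the better price
    CANADA = "reject"     # exchange rejects; the broker must re-route
    AUSTRALIA = "policy"  # the broker's best-execution policy decides

def handle_buy_order(regime, venue, offers):
    best_venue = min(offers, key=offers.get)   # lowest offer = best price
    if best_venue == venue:
        return f"execute on {venue}"
    if regime is Regime.US:
        return f"{venue} forwards the order to {best_venue}"
    if regime is Regime.CANADA:
        return f"{venue} rejects; broker re-sends to {best_venue}"
    return "broker's best-execution policy decides (price is one factor)"

offers = {"ASX": 10.02, "Chi-X": 10.01}        # hypothetical quotes
for regime in Regime:
    print(regime.name, "->", handle_buy_order(regime, "ASX", offers))
```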
Delving into the detail of how each market is structured and how their respective legislative frameworks operate one would have found it increasingly difficult to predict just what might happen in Australia in a multi-market environment.
Depending on which side of the fence you sat on, those expectations ranged from cheaper transaction costs through to tighter spreads and everything in between. If you asked the industry whether these last 12 months constituted success for multi-markets in Australia, sadly the response would most likely be one of indifference.
The Arrival of Fragmentation
But a lot has been achieved in the last two years by exchanges, market participants, regulators and vendors. The change has been immense. Focusing just on innovation around the fragmentation of the Australian marketplace, most people would be surprised at just how long this change has been in motion. It was back in June 2010 that the ASX fragmented its own market with the CentrePoint and VolumeMatch products. The latter traded once and has not traded since, whilst CentrePoint has developed into a successful alternative market where price improvements can be realised by trading at the mid-point of the spread.
Then on 31 October 2011 Chi-X Australia commenced trading on a small number of stocks and was in full swing by 9 November 2011 trading all ASX 200 stocks plus ETFs. Similarly, the ASX soft-launched its competing PureMatch product a month later on 28 November 2011.
Fragmentation had officially arrived and a survival of the fittest began. Market participants were now spoilt for choice. There were the lit markets ASX TradeMatch, ASX PureMatch and Chi-X along with the dark ASX CentrePoint market and ASX VolumeMatch sitting somewhere in between. Hidden orders could also be pegged within the spread on Chi-X.
When it came to the lit markets, it didn’t take long for there to be only one alternative exchange left competing with ASX TradeMatch. ASX PureMatch traded upon its launch but has been dormant ever since. The regulator, ASIC, effectively confirmed the fate of PureMatch when, on 23 February 2012, it indefinitely extended its waiver exempting PureMatch bid/offer quotes from inclusion in National Best Bid and Offer (NBBO) calculations until PureMatch reaches an average liquidity level of 0.2% over 10 consecutive trading days.
Gaining Ground
Chi-X, however, has been slowly gaining ground since its launch – but is it the success that everyone expected? How does it compare to what was seen in other markets around the world? Looking at what happened in Canada in terms of market share, we can directly compare how the Australian foray into multi-markets stacks up. Markets first fragmented in Canada back in 2007 when a market called Pure launched, which never gained significant market share. A more realistic comparison can be made by benchmarking against Chi-X Canada and another venue called Alpha, which is owned by Canada’s biggest banks. After six months of trading, Chi-X Australia’s overall market share sat at approximately 1.9%; at the time of writing, after 11 months, market share had risen to 4.6%. Comparing this to the two successful alternative exchanges in Canada, we can see that Chi-X Australia is tracking well. Out of interest, the most recent market share statistics see Chi-X Canada and Alpha at approximately 9% and 19% respectively.
The following chart helps visualise this comparison over a longer period. It shows the first 24 months of multi markets in Canada and overlays what we have seen to date in Australia. Within that 24 month period in Canada four separate exchanges launched; Pure in September 2007, Omega in December 2007, Chi-X in February 2008 and Alpha in November 2008. Momentum gathered pace once volumes got past 2-3% and within 18 months close to 10% of volume in Canada was traded on alternative markets. Within two years it was pushing 20%. Whether Australia follows is anyone’s guess but looking at the numbers so far it appears that things are heading in the same direction.
Trading volumes and market share are simple measures, but buy- and sell-sides must find more comprehensive ways of determining how multi-markets can benefit their business. As discussed earlier, best execution in Australia doesn’t just mean the best price; as ASIC states, we should be looking for a consistently better outcome for the client. For a private client this is likely to be driven by best price, whilst an institutional buy-side client needs the best price considering the depth and availability of liquidity in the stocks they are trading, whilst accounting for latency and minimising market impact.
Analysing the Change
New suites of tools are evolving to help with the analysis needed to seek liquidity at the best possible price. The emphasis for the buy-side on transaction cost analysis (TCA) is heightened, since liquidity split across more exchanges and dark pools makes it more complex to minimise market impact. As the marketplace fragmented, the buy-side would often say that it had become harder to trade stock in size, whilst at the same time proponents of HFT and market makers on alternative exchanges claimed that they were adding liquidity to these markets and offering tighter spreads. Whatever the outcome, this reiterates the need for tools to monitor the quality and cost of execution, whether on one exchange or many.
Similarly, whether you are trading on one or multiple exchanges, there is now a need to monitor what are known as “trade-throughs”. A trade-through occurs when a better price was available on a different exchange than the one you traded on. To date in Australia the monitoring of trade-throughs has been a function of a select number of brokers using smart order routers. Trade-through analysis (TTA) is a good way to measure the effectiveness of a smart order router, but as awareness grows along with the market share of alternative exchanges, we are now seeing greater interest in TTA. In Australia the brokers to a trade are publicly disclosed on T+3, and this is incorporated within TTA tools. One broker can compare their trade-throughs with the next, and the same applies for the buy-side, who can look across all brokers and rank them according to the number and value of trade-throughs. As volume on alternative exchanges increases, so does the likelihood of a trade-through.
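In outline, a minimal TTA check flags each trade where a better quote existed elsewhere at the time of execution and prices the shortfall; the trades and quotes below are invented:

```python
trades = [  # (side, trade_venue, price, qty, quotes_at_trade_time)
    ("buy",  "ASX",   10.03, 1000, {"ASX": 10.03, "Chi-X": 10.02}),
    ("sell", "Chi-X", 10.00,  500, {"ASX": 10.00, "Chi-X": 10.00}),
]

def trade_throughs(trades):
    flagged = []
    for side, venue, price, qty, quotes in trades:
        others = {v: p for v, p in quotes.items() if v != venue}
        best = min(others.values()) if side == "buy" else max(others.values())
        # A buy traded through if it could have bought cheaper elsewhere;
        # a sell traded through if it could have sold higher elsewhere.
        if (side == "buy" and best < price) or (side == "sell" and best > price):
            flagged.append((venue, round(qty * abs(price - best), 2)))
    return flagged

print(trade_throughs(trades))  # [('ASX', 10.0)] -> $10 of value traded through
```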
So it is no wonder that expectations of multi-markets in Australia are varied, if not distorted. We have a marketplace that has changed dramatically with the introduction of alternative exchanges, suites of new tools and data to monitor execution quality, and two exchanges competing in an arms race for liquidity with an almost constant stream of innovation. All in all it is a difficult path to navigate, but spending the time to understand the change is advantageous. Whether the outcome is a better price or not, it is certainly worth understanding the new structure, to help uncover the best and most efficient means of getting the desired result. Only then can a true expectation be set and later evaluated.
Australian ETF Market Poised For Growth
Ilan Israelstam, Head of Strategy at BetaShares, looks at the past, present, and future, of ETFs on the Australian market.
The first exchange traded fund (ETF) was quoted on the Australian Securities Exchange (ASX) on 27 August 2001, and while growth was slow in the early days, since 2009 the assets under management of Australian Exchange Traded Products (ETPs) have doubled, with approximately $5.5 billion invested as of August 2012. The last 12 months in particular have been significant for the development of the Australian ETF industry, with all major asset classes now available to investors in ETF form. In addition, new money into the industry was significantly positive, with over $200m of net inflows in the last six months alone.
Additionally, the number of products and exposures available on the ASX has increased substantially, with 26 new ETPs listed since June 2011. To put this number in context, in July 2010, almost nine years after the first ETF was launched in Australia, there were only 37 ETPs quoted on the ASX; in the last two years we have seen this number more than double. Through the 26 new ETPs listed on the ASX since July 2011, investors now have exposure to a range of currencies, fixed income, cash, and commodities such as agriculture and oil. With all these asset classes now available, we are also beginning to see greater interest and some investment in ETFs by institutional investors.
Institutional Investors’ Usage Still Limited – but Significant Upside Available
While there has been plenty of interest in ETFs among Australian institutions, uptake has been limited to date. That said, activity has definitely picked up in the last 12 months, including initial ETF investments by SunSuper, a major Australian pension fund, and van Eyk, a multi-manager, both allocating to emerging markets through ETFs. Recently, there has also been institutional use of the US dollar ETF (which aims to profit from a fall in the Australian dollar against the US dollar), as well as the high interest cash ETF (which aims to provide returns greater than the Reserve Bank of Australia cash rate). Just recently, a resources sector ETF experienced some block trading as institutional investors looked to obtain tactical tilts to Australia’s largest economic sector. The above progress notwithstanding, there is still a long way to go. In other markets, such as Europe or the US, up to 50% of ETF assets under management are institutional, and ETFs are almost always among the top 10 securities traded in terms of exchange turnover. Compare this with Australia, where ETFs make up only a small portion of total trading value, with the majority of the money retail-based. It appears that significant upside still exists for institutional usage of ETFs in Australia.
The Attractiveness of the Australian ETF Market
While the local market is not as advanced as other ETF markets, it has some characteristics that make it attractive for Australian ETF issuers. First, by sheer virtue of being a laggard market, local ETF issuers and regulators can learn from the scrutiny of ETFs that occurs in more developed markets, especially when it comes to regulatory issues. As an example, Australian ETF structures benefit from global ‘best practice’. To illustrate: currently all ‘synthetic’ ETFs listed in Australia are backed by cash held in a depositary account. Such a structure avoids some of the controversy that occurred when similar ETFs quoted in overseas markets utilised alternative forms of collateral in their structures. The second point is that the Australian ETF market has a lot of room to grow. If we compare Australia to Canada, there are a lot of similarities in terms of GDP, population size and a resources-driven economy. However, there is a huge difference in the size of the ETF market, with the Canadian industry at approximately C$40 billion compared with just over A$5 billion locally. The size of the potential is particularly striking when one notes that Australia has the fourth largest pension pool in the world, having just tipped A$1.4 trillion. This clearly presents strong growth opportunities for ETFs as retail and institutional investors continue adopting ETFs in portfolios. Depending on market conditions, we believe the Australian ETF market has the potential to grow exponentially over the next three years, reaching up to A$15 billion by 2015.
Alternative Asset Classes
ETFs are access vehicles that provide investors with the opportunity to gain exposure to a variety of asset classes in a simple, transparent and liquid manner. Outside of core Australian equities exposures, investors can now access fixed income, cash, commodities such as oil and agriculture, as well as precious metals such as gold, on the ASX. Traditionally, Australian investors have been more heavily weighted towards domestic equities than their global counterparts: most Australian superannuation funds allocate between 30% and 40% of portfolios to Australian equities, a very high level when compared with overseas pension markets. With direct access to fixed income and commodities alongside the previously available asset classes of domestic and global equities, investors can now reach all the asset classes of a balanced portfolio and tailor their exposure to their risk profile. As an example of further access made available via ETPs, the first “bear fund” was launched in Australia in July. As the product is designed to go up when the sharemarket falls (and vice versa), it allows investors to profit from or hedge against a decline in Australian shares.
Future of ETFs in Australia
ETFs on global exchanges provide a reflection of investor sentiment, and the fund flows of larger ETFs serve as a gauge of the market’s risk appetite. Although Australian ETFs have not reached this size, local products do provide a barometer for risk appetite. Currently the flows indicate that risk appetite is subdued, resulting in very few inflows to Australian equities based products. Despite this, Australian investors may have finally turned the corner, with renewed interest in Australian equities during July: 6 of the top 10 funds by inflows were Australian equities based.
The other continuing trend is dividend and yield based strategies, which continue to prove popular among investors. With capital return expectations low and investors looking to yields for returns, they now have the option of fixed income ETFs, and recently the first cash ETF, available on the ASX. While fixed income ETFs have been popular overseas, the market in Australia is still in its infancy and has not yet enjoyed the same success as overseas markets. During the first half of 2012, fixed income ETFs attracted just over $40 billion in new assets globally, accounting for approximately 40% of all global inflows. Lack of investor conviction has also resulted in lower trading volumes on the Australian Securities Exchange, but the various ETFs which track Australian equities indices have continued to prove popular, both as trading instruments and as longer term buys.
Innovation can sometimes be dependent on the market, and in the case of Australia the first cash ETF to be quoted has grown strongly due to market dynamics. With one of the highest interest rates in the developed world, investors can now access high Australian interest rates through the cash ETF in a simple, transparent and liquid manner on the ASX. Notwithstanding a great deal of product innovation, by global standards the range of listed exposures available to Australian investors is still very narrow. As such, we expect continued product innovation over the coming year. Such innovation could include more thematic and strategy based ETFs. As the menu of products continues to fill out, we expect further adoption, both retail and institutional, as the ETF market continues its growth trajectory.
Taking Aim At Misconduct
Greg Yanco, Senior Executive Leader of Market and Participant Supervision, ASIC looks at the electronic trading environment in Australia and where regulation can play a role.
ASIC is Australia’s corporate, markets and financial services regulator. In relation to its responsibilities supervising markets, ASIC believes that well-regulated, transparent and well-functioning capital markets are the engine room for economic growth: matching companies that wish to raise capital to grow their businesses, with investors that wish to place their funds for a return in a liquid market.
ASIC constantly monitors changes in global equity markets and draws on the surveillance experience of other regulators. It is clear that technology has increased the speed, capacity and sophistication of trading. Along with the new opportunities that this presents for participants, it also poses new regulatory challenges for ASIC.
Flexible Advanced Surveillance Technologies (FAST)
In the May federal budget, the Australian Government announced a commitment to invest in new technologies to enhance ASIC’s capabilities in market surveillance and enforcement. The $43.7 million over four years will allow ASIC to plan for a future that includes greatly increased message traffic, new trading technologies and techniques, increased competition between trading venues, and the increasing globalisation of capital markets.
The funding allows ASIC to provide four key deliverables. Firstly, it allows for the replacement of ASIC’s market surveillance system – a system which was originally designed for a single market and for which the contract expires in 2013. ASIC’s new system will continue to use the existing FIX specification. The upgrade will add capacity and capability, and enable the system to cope with both a multi-market environment and the increase in high frequency trading (HFT) and algorithmic trading. It will also allow for real-time surveillance of futures markets, which is currently post-trade. The new system will provide the capacity to handle the dramatic increase in messaging, with the ability to handle up to one billion messages per day.
Secondly, the system will allow for advanced analytics; for analysts to search data records and identify suspicious trading by connecting patterns and relationships. This will be crucial for greater levels of detection of insider relationships, and will also allow for the development of post-trade surveillance capabilities to identify market trends, patterns of trading behaviour and repeated or systemic behaviour. These capabilities are common in other comparable markets.
Thirdly, the new system will also include a portal for market participants to connect with ASIC. This will allow for the enhancement of efficiencies in dealing with ASIC, and enable participants to electronically lodge certain material in accordance with their obligations. Fourthly, the development of a workflow system will help ASIC improve the management of cases from the moment ASIC receives an alert, complaint or enquiry, to the end of an enforcement action.
With Australia having some of the highest levels of share ownership globally, this funding is crucial in allowing ASIC to strive towards one of its strategic priorities of confident and informed investors and financial consumers. This is achieved through markets operating with integrity and efficiency which, in turn, helps to achieve another strategic priority: fair and efficient financial markets.
The funding costs will be smoothed and recovered over 10 years to minimise the impact. In our view, the cost will be outweighed by the medium to long-term benefits of competition and of well regulated, fair and efficient markets. These costs also reflect the additional costs of supervision arising from the increased speed and complexity of trading and from dispersed trading venues, including dark pools. As a percentage of market turnover, the cost of market supervision compares favourably with comparable jurisdictions.
ASIC is committed to the project representing value for money, which means the process needs to be thorough. We will put in place strict evaluation procedures and will ask vendors to demonstrate competency in a range of functionalities important to ASIC and the market.
ASIC is acutely aware of the size of the IT project and build, and has hired experienced technical staff who have the requisite skills.
Electronic and Automated Trading
Technology has increased the speed, capacity and sophistication of trading. This has opened the door for new types of market participants that bring innovative trading strategies and an increase in platforms where orders can be matched.
Two regulatory challenges that ASIC is particularly interested in relate to the increased use of dark liquidity and HFT. ASIC has recently established internal taskforces to specifically consider these areas.
Dark Horses Down Under
As the issues surrounding dark liquidity grow more and more contentious, Steve Grob, Fidessa’s Director of Group Strategy, looks at the Australian trading landscape, how this type of trading has evolved and what the dark future holds.
Over the past few years, much attention in Australia has focused on the competition between ASX and Chi-X. But, in fact, multi-venue trading (via dark pools) was available to Australian investors long before Chi-X ever opened its doors back in November 2011. The term ‘dark pool’, however, has now become a catch-all phrase that describes a variety of activities that match trades but do not distribute pre-trade prices. Partly because of their name and partly because they are misunderstood, dark pools have attracted great controversy worldwide, and Australia has been no exception to this. So, what does the Australian dark landscape really look like, who is in it and what value does dark trading really bring to the market?
In 1997, ITG launched POSIT, a pre-market VWAP cross, Australia’s first alternative venue. However, it wasn’t until 2008 – anticipating the end of the ten-second rule1 – that Liquidnet followed with its buy-side crossing network, and finally Instinet with its BLX crossing network in 2011.
Likewise, broker dark pools were also around before Chi-X was born. UBS PIN (Price Improvement Network) launched in 2009 with other major brokers quickly following. These days most of the big brokers in Australia operate dark pools, although UBS PIN remains the biggest.
Finally, ASX’s own dark pool, Centre Point, was launched with much fanfare in 2010, and it is thriving (see diagram 1); it in fact has greater market share than Chi-X. To illustrate the complexity of these issues though, at a conference in Melbourne in May 2012 ASX chief Elmer Funke Kupper warned of the dangers dark pools presented to the Australian trading landscape, even though ASX earns 0.5bps on the AU$100 million-odd worth of trades executed on its venue every day.2
Genesis of Dark Pool Trading
Institutional buy-sides have always preferred to cross their order flow with other buy-sides that have the ‘natural’ other side in the stock they are looking to buy or sell. The reason for this is anonymity, or not divulging their trading intention to the market at large. The larger the order size the greater the value placed on this anonymity. Dark pools provide exactly this – but have they always been available or are they a new luxury brought about by the wonders of electronic trading?
In the days before widespread computer usage, if an institution wanted to move a big line of stock and didn’t want to send it down to the exchange for fear of moving the market, it would call its broker. Often the broker would call its internal counterparts to see whether they, or a client, were buying or selling a similar block, and would execute an ‘upstairs’ trade at an internally negotiated price. Depending upon the skill of the broker, this provided a high degree of anonymity, and certainly more than would be achieved by simply dumping the order on a lit market for all to see.
It was a natural step then for brokers to employ computers to automate this upstairs activity so as to increase its effectiveness and reduce their reliance on human memory. The result was a dark pool that matched different client orders away from the public gaze of the lit markets.
When is a Dark Pool not a Dark Pool?
The terms ‘internaliser’ and ‘dark pool’ are often thrown around synonymously when talking about non-lit trading, particularly in the Australian media. Correctly speaking, however, an internaliser describes a firm conducting a specific activity where the broker is looking at incoming client orders and choosing to cross them against its own book. This can be contrasted with a broker crossing network, which is anonymous to both the clients and the broker – neither can see the orders inside the pool until they have been executed. A third category of non-lit activity includes dark books operated by venues that provide a similar service. And, finally, some venues permit dark order types (where part of the order is hidden) to intermingle with their lit liquidity.
Not only are there many types of non-lit or dark venues but, to make matters worse, each is defined and treated differently by various regulators around the globe. No wonder then that dark pools tend to be perceived suspiciously (see diagram 2).
Algorithmic Attitudes
Ben Read, Electronic Sales, Merrill Lynch, examines the ongoing trends and potential pitfalls in electronic trading in Australia.
In the year ended July 31, turnover on the ASX fell 46% year on year, with eight down months out of the 12 (see diagram 1). August provided some relief, with turnover up 23%, but globally equity volumes continued to decline through the seasonally quiet summer months, driven by a continued rotation out of equities amid global economic uncertainty: the European sovereign debt crisis, the US election and “Fiscal Cliff”, and Chinese growth moderation coinciding with a once-in-a-decade leadership change all weigh heavily.
Beyond the macroeconomic (cyclical) challenges of the past 18 months, there has been a significant structural shift in the Australian market. Most significantly, in executing orders on behalf of investors, participants now have two public, or lit, venues as well as exchange-run and broker dark pools to consider. The best execution obligations set out in the corresponding ASIC Market Integrity Rules bear a resemblance to the European Union’s Markets in Financial Instruments Directive (MiFID): each requires every market participant to have a policy in place setting out the framework within which it will meet its best execution obligations to its clients. The similarity to MiFID lies in the principles-based nature of the best execution requirement, which allows each broker to decide how it determines the best outcome for clients.
This is driving electronic trading and execution to become a more significant part of how Australian markets trade. Globally, we have witnessed this shift over the past decade, and recent estimates suggest that 75% of US and 55% of European volumes are now executed electronically. In Australia, we estimate this figure to be closer to 35%; up significantly over the past two to three years, but still very low by global standards, which, to us, implies that substantial growth still lies ahead. ASIC has been following this shift towards electronic, or direct, execution and in its recent consultation paper (CP184) provided guidance and rules on automated trading. One key point is to ensure that all participants maintain robust filters and controls across all platforms that access the market, with a key requirement being a kill switch that can shut down parts of the execution infrastructure when behaviour outside accepted patterns is detected.
A kill switch can take two forms: software-coded or hardware-based. The former is functionality built into the execution platform; it allows the trading desk, in combination with oversight functions, to quickly shut down either flow from a particular client, or a particular algorithm, that they believe is not behaving correctly. A hardware-based solution is more complex and involves the physical disconnection of parts of a market participant’s order routing platform from the exchange infrastructure, typically accomplished by closing or blocking the network switches that sit between the two.
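To make the software-coded variant concrete, the sketch below shows a minimal halt registry that an order gateway might consult before routing any new order. It is purely illustrative: the KillSwitch class, the identifiers and the gateway wiring are our own assumptions, not any participant’s or vendor’s actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class KillSwitch:
    """Registry of halted clients and algorithms; the order gateway
    consults it before forwarding any new order to the exchange."""
    halted_clients: set = field(default_factory=set)
    halted_algos: set = field(default_factory=set)

    def halt_client(self, client_id: str) -> None:
        self.halted_clients.add(client_id)

    def halt_algo(self, algo_id: str) -> None:
        self.halted_algos.add(algo_id)

    def allows(self, client_id: str, algo_id: str) -> bool:
        # An order passes only if neither its client nor its algo is halted.
        return (client_id not in self.halted_clients
                and algo_id not in self.halted_algos)

# Usage: the desk halts a misbehaving algorithm; subsequent orders tagged
# with it are rejected at the gateway instead of being routed to market.
switch = KillSwitch()
switch.halt_algo("vwap-v2")
assert not switch.allows("client-42", "vwap-v2")
assert switch.allows("client-42", "pov-v1")
```

The hardware-based variant has no equivalent this simple, which is precisely the article’s point: it operates at the network layer rather than in application code.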
Also vitally important is the real-time monitoring that alerts trading desks to any execution occurring outside acceptable parameters. Most market participants with established electronic trading platforms have pre-trade filters based on a percentage of average daily volume (ADV), the value of single orders, total notional and net delta. Taking this further is the implementation of systems that can alert on abnormal patterns of messages, trades and realised profit and loss. Significant deviations from historical patterns can markedly speed up the decision on whether to invoke a kill switch.
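The pre-trade filters described above reduce to a handful of arithmetic checks per order. Here is a minimal sketch, assuming illustrative limit values, field names and thresholds that are our own inventions rather than any participant’s actual limits:

```python
from dataclasses import dataclass

@dataclass
class Order:
    symbol: str
    side: str     # "buy" or "sell"
    qty: int
    price: float

def pre_trade_checks(order: Order,
                     adv: float,             # 20-day average daily volume, in shares
                     gross_notional: float,  # running total already traded this session
                     net_delta: float,       # running signed exposure this session
                     max_adv_pct: float = 0.05,
                     max_order_value: float = 5_000_000.0,
                     max_gross: float = 50_000_000.0,
                     max_net_delta: float = 10_000_000.0) -> list:
    """Return the list of breached limits; an empty list means the order passes."""
    breaches = []
    value = order.qty * order.price
    if order.qty > max_adv_pct * adv:
        breaches.append("order size exceeds ADV percentage limit")
    if value > max_order_value:
        breaches.append("single order value limit")
    if gross_notional + value > max_gross:
        breaches.append("total notional limit")
    signed = value if order.side == "buy" else -value
    if abs(net_delta + signed) > max_net_delta:
        breaches.append("net delta limit")
    return breaches

# Example: a A$6m order trips the single-order-value limit.
print(pre_trade_checks(Order("BHP", "buy", 200_000, 30.0),
                       adv=10_000_000, gross_notional=0.0, net_delta=0.0))
```

The pattern-based alerting the article goes on to describe would sit on top of checks like these, comparing live message and P&L statistics against historical baselines.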
After the recent events in the US involving Knight Capital, SEC Chairman Mary Schapiro commented that “when broker-dealers with access to markets use computers to trade, trade fast, or trade frequently, they must check those systems to ensure they are operating properly.”1
In Australia, the investment community is supportive of such initiatives to ensure that overall market integrity and confidence is maintained and that investors and participants are protected from similar events occurring. In the immediate aftermath of Knight Capital, sell-side lines of business right across the industry conducted thorough reviews of their internal risk frameworks and there was plenty of dialogue with clients on what their trading limits were with different trading desks they faced. The majority of this work started before ASIC formally published CP184.
Algos: Definitions Matter
We think it is also important to distinguish between the different types of “algorithmic trading”, given the air-time this contentious issue has received over the last 12 months. Benchmark execution algorithms used by sell-side trading desks and accessed directly by the buy-side are for the most part relatively safe. They execute a specified number of shares in a single stock or a basket of stocks, governed by a framework of parameters that controls how they post and take liquidity across lit and dark venues. Typical algorithms of this type are Implementation Shortfall, VWAP, Percentage of Volume and Market on Close; they are among the tools many participants use to help achieve the best execution outcomes for clients.
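As a rough illustration of how such a benchmark algorithm parcels out a parent order, the sketch below schedules child-order sizes against a historical intraday volume profile, in the spirit of a VWAP algorithm. The profile and bucket count are invented for the example; production engines use far richer volume models and adjust to live conditions.

```python
def vwap_schedule(total_qty: int, volume_profile: list) -> list:
    """Split a parent order across time buckets in proportion to the
    fraction of the day's volume that historically trades in each bucket."""
    assert abs(sum(volume_profile) - 1.0) < 1e-9, "profile must sum to 1"
    slices = [int(total_qty * frac) for frac in volume_profile]
    slices[-1] += total_qty - sum(slices)  # give rounding residue to the last bucket
    return slices

# Example: 100,000 shares over four buckets with a U-shaped volume profile.
print(vwap_schedule(100_000, [0.35, 0.15, 0.15, 0.35]))
# -> [35000, 15000, 15000, 35000]
```

Real engines also react to realised volume, limit prices and venue signals; the point here is only the proportional-scheduling core that makes these algorithms comparatively predictable.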
The algorithms or automated strategies run by proprietary trading organisations and high frequency trading (HFT) firms have more discretion in what they buy or sell based on the market conditions they encounter. They can generate orders without the direct oversight of a portfolio manager or any human intervention at all. For example, a market making strategy can trade across a whole index of stocks, generating orders to both buy and sell the same stock concurrently and auto executing futures contracts to manage risk.
The volume of orders and turnover generated by a high frequency trading strategy is likely to be significantly higher than that of a regular execution algorithm. However, the right risk and control framework at the exchange and participant levels can offer significant protection to the Australian market from the systemic risk that may be posed by the combination of algorithms operating on a daily basis.
Finally, much has been said about HFT algorithms that are deliberately set up to manipulate or game other investors. We do not believe that the majority of HFT strategies behave in this way; they operate legitimate business models which are fully compliant with ASIC market integrity rules.
Pooling Liquidity
Dark pools are currently a focus, and there has been much discussion of how much market turnover is transacted within them. Our research shows that the percentage of total turnover transacted in the dark remained relatively stable at 11-15% in 2012. For the purpose of our research, we have designated as dark the volume executed in Centre Point, priority crossed trades and National Best Bid and Offer (NBBO) trades. We think the right metric for the Australian market is the amount of volume executed within ASX’s Centre Point and the crossing mechanisms that report to market. Block trades negotiated in the upstairs market have been classified as off-market volume for many years and should continue to be so.
The key is finding the right balance between protecting price discovery and liquidity in the lit market on the one hand, and affording investors the ability to minimise their footprint via dark pools on the other; the majority of investors are, after all, focused on minimising their overall market impact. Quality independent research on the links between the quality (price discovery and liquidity) of the lit markets and the growth in dark pool activity would be a welcome addition to the current debate.
If ASIC were to impose minimum sizes for dark pool crossings, as the ASX is campaigning for, then we believe a measure is needed that is more flexible than an arbitrary dollar value. As turnover rises or falls and the prices of individual stocks move, a single value can go from appropriate to inappropriate very quickly. One suggested solution is a value that is a function of the stock’s liquidity: for example, five times the average trade size based on the previous 20 trading days.
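The suggested liquidity-scaled minimum is simple to compute. The sketch below implements the “five times the 20-day average trade size” idea; the function name, inputs and sample figures are illustrative assumptions, and a production version would source the data from market statistics.

```python
def min_dark_crossing_value(daily_turnover: list,
                            daily_trade_counts: list,
                            multiplier: float = 5.0) -> float:
    """Average trade size = total turnover / total trades over the window;
    the minimum crossing value is that average scaled by the multiplier."""
    assert len(daily_turnover) == len(daily_trade_counts) == 20
    avg_trade_size = sum(daily_turnover) / sum(daily_trade_counts)
    return multiplier * avg_trade_size

# Example: a stock turning over A$20m a day in roughly 4,000 trades a day
# has an average trade size of A$5,000, giving a A$25,000 minimum.
turnover = [20_000_000.0] * 20
trades = [4_000] * 20
print(min_dark_crossing_value(turnover, trades))  # -> 25000.0
```

Because the threshold moves with each stock’s own trading pattern, it stays proportionate as turnover and prices change, which is the flexibility the paragraph above argues an arbitrary dollar value lacks.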
Global regulators are certainly aware that every round of wholesale regulatory change adds a significant cost burden to market stakeholders on both the buy- and sell-sides. The majority of resources are going into technology, compliance and legal functions. What the industry globally is still trying to evaluate is whether the more stringent regulatory requirements, and all of the investment made, have yet enabled the buy- and sell-side to offer a better product or service to their clients.
As a final point, the majority of Australian participants are global firms where Australia is just one of many territories they operate within. If the cost of running an equities business within Australia becomes prohibitively high then there is a danger that, in an environment of reduced activity, focus and resources are pulled back in favour of other regional markets. This in turn could reduce the breadth of services that are available to our domestic managers.
1 http://www.sec.gov/news/press/2012/2012-151.htm
Market view: Investment banking
GAME CHANGER.
Lynn Strongin Dodds explains why banks can’t turn back the clock.
Hope may spring eternal, but any chance of investment banks returning to their old modus operandi has been dashed by the ongoing eurozone crisis, the prolonged economic downturn in the West and a cascade of regulation. As a result, many believe that the industry is undergoing a once-in-a-generation change and that the model will have to be dramatically reconfigured.
“With the demise of volumes across plain vanilla and cash products – in some cases more than 40% – and the retreat from structures and complex risky instruments due to intense regulatory scrutiny and new legislation, investment banks are under immense pressure to reinvent themselves,” says Maurizio Bradlaw, a Frankfurt-based partner at Capco, a global business and technology consultancy. “The key challenge is that, for many, investment banking was the main breadwinner for the organisation.
“This position is now under severe threat, and they have to come to terms with the fact that both the historical returns and the growth in this sector are gone. It will be some time before they return, if indeed they ever do, at least in the traditional sense of an investment bank’s trading function.”
Job losses are typically the first line of defence, and the axe has fallen widely on junior as well as senior positions. The figures make for sobering reading on both sides of the Atlantic. Wall Street banks laid off over 75,000 people in 2011, and analysts estimate that overall they will have roughly 10% to 15% fewer employees in early 2013 than they did at the start of 2012. Europe has been equally affected, with the City of London taking the brunt. When the dust settles this year, the number of jobs is likely to stand at around 255,000, a level last seen in 1996.
Equities has been one of the worst affected areas, with a recent study by Greenwich Associates – European Equities Investors – showing that banks are in a phased retreat or retrenchment. For example, Nomura recently announced that it was realigning its equities trading business, opting to consolidate execution services under its Instinet agency brokerage brand, while Italy’s UniCredit has withdrawn from Western European sales and trading, handing French broker Kepler the right to service its clients. Moreover, Credit Agricole is in talks to sell Cheuvreux to Kepler while retaining a strategic stake. In the UK, state-owned Royal Bank of Scotland decided to refocus its attention on fixed income and withdraw altogether from most of its equities business.
Even the behemoths are making adjustments, with firms such as Deutsche Bank, Citi and Credit Suisse integrating the equity trading services they offer to institutional clients. However, they are expected to capture the lion’s share of business as the commission pool continues to shrink. The Greenwich report showed that global banks with investment and research services grew their share of the commission pot from 66% to 70% in 2011. In the UK specifically, it increased from 63% to 68% in the same period.
This comes at a time when the overall institutional commission pool shrank by over 25% on a pan-European basis between the first quarter of 2009 and the same period in 2012. Predictions are that it will be flat for the year, although it could come in lower if the soft equity trading that marked the first six months continues into the second half.
According to Jay Bennett, consultant at Greenwich, the contraction in equity commission and broker trading revenue is the result of three market developments: de-risking of institutional investment portfolios, the economic recession in Europe and the European sovereign debt crisis, the last of which has prompted institutions to keep money on the sidelines, limiting trading activity.
A sharper knife
The concern is that the downsizing, headcount reductions and consolidation of trading desks that have taken place over the past year have been too modest compared with the drop in trading revenues, and that a more drastic approach may be required. “To this point, however, they’ve been pruning around the edges,” says Greenwich consultant John Colon. “But if things continue on this track, at some point they’ll have to move from giving up fingers and toes to arms and legs.”
There are two main reasons why European and global banks have resisted deep cuts. The first is that trading and research capabilities in European equities are viewed as important sources of market knowledge and intellectual capital that can be leveraged in capital markets origination, mergers and acquisitions and other high-margin businesses. The other is that few want to raise the white flag of defeat. “People will stay the course in equities much longer than you might expect based strictly on profitability,” says Bennett. “To some extent, pulling back from European equities is admitting defeat in investment banking generally.”
Miranda Mizen, principal and director of equity research at Tabb Group notes, “The commission wallet is shrinking and one of the biggest challenges facing banks is that they are looking at the same areas at the same time. They need to be able to stand out and differentiate themselves and the question is how they are going to do that economically. There is a lot of navel gazing going on but they have to start making the difficult decisions and focus on their core strengths while exiting the weaker areas.”
The main challenges, according to a J.P. Morgan report entitled ‘Global Investment Banks: IB landscape changes across the globe – the path to an acceptable ROE for Tier I and II players’, are that there is little chance of the industry returning to the growth levels of the halcyon 2005-2006 period, and that the scope for product innovation is limited. Moreover, banks should not rely on emerging markets as the engine of growth, because they are not expected to be a material driver of revenues over the next five years.
As a result, banks need to rethink their value propositions. “One of the key drivers behind the restructuring is the low volumes in the market, and banks are going back to what they specialise in, whether it be equities, fixed income or a specific area of expertise,” says Tony Nash, head of execution services at Espirito Santo. “Some are paring down their global ambitions and are focusing on a particular region, while others are combining different skill sets, such as electronic trading and programme trading, on one trading desk. In general, banks are reining in their ambitions and are in survival mode.”
Banks that do rethink will more than survive and in time could thrive. As Bradlaw points out, “What’s new in the battle for market supremacy is that, by and large, the fight has shifted from breadth of product coverage and inter-bank trading to a greater focus on corporate needs, home market supremacy, and the consolidation and optimisation of post-trade functions. This paradigm shift has had, and will have, a substantial impact on market liquidity, cost of trade and corporate governance, and heads of investment banks need to take a more holistic approach in securing transactional volumes and returns.”
Bradlaw adds, “In this shift there will be winners and losers. Mid-tier banks who have aspired to be recognised as global players will focus on national markets, while the top six to seven global players will dominate even more as smaller firms withdraw from certain asset classes or from investment banking activity completely. In addition, investment and corporate banking groups will need to work more closely together so that new cross products can emerge which are outside of regulatory oversight and capital demands – but in effect with the same underlying hedge/investment effect.”
©BestExecution 2012/13