
Viewpoint : Collateral Management : Stefan Lepp


THE GLOBAL CHALLENGE.


The new regulatory focus on risk management has put the spotlight firmly on collateral supply and mobilisation. Stefan Lepp* discusses with Best Execution a new approach in the post-crisis world.

There has been much debate on the size of the potential shortfall in collateral in the market. What is your view on this?

As we know, more trades must be collateralised under new regulations. Estimates of the future collateral supply needed range from many trillions to none, which already tells you a great deal about the uncertainty and confusion in the market. But the question is not whether there is enough collateral – and I believe, to a certain degree, that there is – but whether institutions can identify, access and mobilise it quickly and smoothly across multiple time-zones.

Many institutions do have a single view of their fragmented collateral across multiple locations and time-zones, and such institutions might also have a picture of worldwide exposures to be covered on a daily basis (more or less in real time). But what they are often missing is a facility enabling global collateral consolidation as a basis for optimised collateral allocation across all available collateral positions towards multiple exposure locations. Suppliers can achieve this by creating strategic partnerships with institutions holding collateral (central securities depositories, agent banks, global custodians, etc.) while building links to globally active exposure locations (such as CCPs). This model enables previously inaccessible collateral to be mobilised and so significantly softens the impact of the collateral crunch.

How sophisticated does collateral management need to be?

That depends on the institution and its activities, but most need collateral management on a real-time and cross-market basis. Without this, institutions cannot deliver the best – i.e., the cheapest eligible – collateral to each exposure point. For example, they might have to use triple-A rated paper to cover an exposure that only requires single A+ because the single A+ paper is in another location and cannot be mobilised in time. Systematically over-collateralising in this way carries very high opportunity costs; a Clearstream-Accenture study conservatively put this cost at north of EUR 4 billion globally.
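To make the cheapest-eligible-collateral logic concrete, here is a minimal sketch of a greedy allocator. The ratings, amounts and the greedy strategy are illustrative assumptions only, not Clearstream’s actual optimisation logic.

```python
# Illustrative only: cover each exposure with the lowest-rated eligible
# collateral, keeping scarce triple-A paper free for exposures that
# genuinely require it. Ratings and amounts are invented.

RATING_RANK = {"A+": 1, "AA": 2, "AAA": 3}   # higher = scarcer, costlier to post

def allocate(assets, exposures):
    """assets: dicts {"id", "rating", "value"}; exposures: dicts
    {"id", "amount", "min_rating"}. Returns (exposure, asset, amount) triples."""
    pool = sorted(assets, key=lambda a: RATING_RANK[a["rating"]])  # cheapest first
    allocations = []
    # Strictest exposures first, so they are not crowded out.
    for exp in sorted(exposures, key=lambda e: -RATING_RANK[e["min_rating"]]):
        needed = exp["amount"]
        for asset in pool:
            if needed <= 0:
                break
            if RATING_RANK[asset["rating"]] < RATING_RANK[exp["min_rating"]]:
                continue                        # not eligible for this exposure
            used = min(asset["value"], needed)
            if used > 0:
                asset["value"] -= used
                needed -= used
                allocations.append((exp["id"], asset["id"], used))
        if needed > 0:
            raise ValueError(f"collateral shortfall on {exp['id']}")
    return allocations

# The A+ exposure is covered with A+ paper, leaving the AAA bund untouched.
assets = [{"id": "bund", "rating": "AAA", "value": 100},
          {"id": "corp", "rating": "A+", "value": 100}]
exposures = [{"id": "ccp1", "amount": 80, "min_rating": "A+"}]
print(allocate(assets, exposures))   # [('ccp1', 'corp', 80)]
```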

How is the industry tackling this challenge?

A dedicated collateral management system used to be seen as optional and not a key product. Since the financial crisis, however, collateral has become a worldwide issue and any market solution must match this global scale. My company, for example, is currently in strategic talks with many institutions, such as CSDs, wanting to become local collateral service providers. CSDs have a central role in national economies and are often well-positioned to deliver secure services to their local markets. These days, however, they run a commercial risk if they cannot meet their clients’ collateral management needs.

A Finadium study in May 2012 found that only 13% of CSDs surveyed could deliver collateral optimisation, but trying to develop the necessary technology in-house can be a lengthy and costly business. A sustainable solution is for infrastructures to insource a sophisticated system quickly and cost-effectively. This approach has been adopted by the Liquidity Alliance, an association of (currently) five CSDs from around the world which share the same approach to collateral management and believe that strategic partnerships are the smart way to beat the global collateral challenge.

*Stefan Lepp is a member of the Executive Board of Clearstream and head of its Global Securities Financing division.
©BestExecution 2013

Equities trading focus : Market structure : James Hilton


YESTERDAY’S GONE.


James Hilton* takes a look back at the changes in market structure over the last six years and argues that, despite pressure from some quarters, turning back the clock would be a retrograde step for the end investor.

Observing the development of equity market structure over the last six years has been fascinating. Introducing competition into a monopolistic industry has, at times, been disorientating, presenting both challenges to the established order and opportunities for new entrants offering services to investors. The resulting market structure, which includes fragmentation of lit liquidity and an increase in non-displayed liquidity, has proven an irresistible target for established players keen to demonise the shift away from the old, exchange-concentrated status quo. However, while such rhetoric is to be expected from threatened monopolies, so is the result of the change: when the rhetoric and scaremongering are stripped away, it is clear that competition has measurably and significantly reduced transaction costs, to the great benefit of the end investor.

Costs down

The Credit Suisse Composite Cost Index (Fig.1, red line) compiles the trading costs of a broad range of institutional investors, adjusted for trade size and execution style. Our Transaction Cost Index (Fig.1, blue line) goes one step further by normalising for spikes in volatility, creating a better apples-to-apples comparison of structural costs over time. Transaction costs began to fall in late 2009 and have, on average, been almost a third lower since the beginning of 2010. Even though costs have risen slightly recently, they remain 16% below their pre-2010 levels. While numerous market structure changes have taken place, making it difficult to attribute the reduction in transaction costs to a single factor, our analysis suggests that alternative venues and crossing networks are major drivers.
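The exact index methodology is proprietary, but one simple way to normalise a cost series for volatility – shown purely as an assumed, illustrative stand-in, not the Credit Suisse methodology – is to rescale each period’s cost by the ratio of long-run to current volatility:

```python
# A toy volatility normalisation of a transaction cost series.
# All numbers are hypothetical.
import statistics

def normalised_costs(costs_bps, vols):
    """costs_bps[i]: average cost in bps for period i; vols[i]: realised
    volatility over the same period."""
    base_vol = statistics.median(vols)            # long-run "normal" volatility
    return [c * base_vol / v for c, v in zip(costs_bps, vols)]

raw  = [12.0, 18.5, 11.0, 9.5]    # hypothetical quarterly costs (bps)
vols = [0.15, 0.30, 0.14, 0.12]   # hypothetical realised volatility
print(normalised_costs(raw, vols))  # the 18.5bps spike largely disappears
```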


Competition

Multilateral Trading Facilities (MTFs) introduced competition into what was previously a monopolistic environment. They have disrupted the existing order, offering attractive pricing in a bid to lure market share away. To keep pace, traditional exchanges have been forced to reduce fees (though many have increased the charge for auction trades, where they retain a monopoly). Given the reduction in revenues, it is unsurprising that exchanges have complained about MTFs – and their pricing in particular – time and again. However, what’s bad for the bourse seems to be good for trading counterparties.

In our analysis of Credit Suisse orders sent using time- and volume-based tactics in 2012, we found that trades exposed to alternative venues outperformed on a relative basis by 4.4bps versus Implementation Shortfall (IS) and 0.1bps versus VWAP. It is worth remembering that total transaction costs are down, so the relative outperformance from MTF exposure appears to be an overall improvement rather than a zero-sum transfer. This suggests that the introduction of competition in the exchange space has increased the efficiency of the market, lowering transaction costs for its users by reducing the rents accrued by its operators.
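For readers less familiar with the benchmarks, the sketch below shows how slippage in basis points is conventionally computed against IS (the arrival price) and against interval VWAP; the fills and benchmark prices are invented.

```python
# Conventional benchmark slippage in basis points. Positive = worse
# than the benchmark for the given side. All prices are invented.

def slippage_bps(exec_price, benchmark, side):
    sign = 1 if side == "buy" else -1
    return sign * (exec_price - benchmark) / benchmark * 1e4

fills = [(100.02, 400), (100.05, 600)]                 # (price, shares)
avg_px = sum(p * q for p, q in fills) / sum(q for _, q in fills)

arrival_price = 100.00   # mid when the order arrived -> Implementation Shortfall
interval_vwap = 100.04   # market VWAP over the order's life

print(f"vs IS:   {slippage_bps(avg_px, arrival_price, 'buy'):+.1f} bps")
print(f"vs VWAP: {slippage_bps(avg_px, interval_vwap, 'buy'):+.1f} bps")
```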

Crossing Networks

Crossing “networks” have existed – in some form – for as long as stock has changed hands. Traders would hold certain orders in their pocket (sometimes literally) and try to cross them off without disturbing the market. As this practice has become more automated, it has introduced further competition into the execution space. Automated crossing networks are now operated by brokers, MTFs, and the exchanges themselves, allowing traders to cross stock without displaying their size.

Our studies indicate that trading in this manner has a clear, positive impact for the end investor. For orders using time- and volume-based tactics, those that were exposed to the dark performed a full 3.3bps better when measured against IS and 0.8bps better versus VWAP (both statistically significant at the 5% level). Some detractors have argued this practice is damaging to market quality and should at least be restricted to the largest orders only, but in our studies, we have found that there is no evidence that higher levels of crossing activity adversely affect the market. In fact, we found evidence of a positive impact on several measures of market quality, including spreads and autocorrelation. We also showed that posting visible quotes for even small orders causes market impact, undermining any argument for a minimum size restriction.

Clear and measurable benefits

With the knock-on effects of changes to the market’s structure coming into focus, it stands to reason that the regulatory framework will need to adapt to the new environment. Many common-sense reforms – like order-to-trade ratio limits – have been put forward or are underway. However, other proposals seem designed to roll back or hamstring new forms of execution in order to reduce the competitive pressure on established business interests. The lobbying efforts of the exchanges clearly reflect this anti-competitive sentiment. They have pushed strongly for minimum trade sizes and market share caps for crossing networks. The research papers they fund to bolster their position employ questionable methodology – such as declaring that on-exchange crossing networks are “lit” while off-exchange crossing networks are “dark”, or conflating macro trends in volatility with market structure changes when it proves convenient. They argue that these restrictions would improve market quality, while the only obvious effect would be to force volume back to the old regimes, restore market concentration and transfer money directly from pension funds to the bottom line of exchange profits through increased investor costs. The market, by and large, knows this: Pensions Europe, an association of funds representing the workplace pensions of 83 million Europeans, put it succinctly when it noted that these and other proposals “offer trading venues the opportunity to take undue commercial advantage from pension funds’ investment activities… affecting the returns on investments and, ultimately, the contributions paid to workplace pension beneficiaries.”

The European markets have undergone an enormous transition since 2007. MTFs have driven down exchange fees, and crossing networks have enabled investors to trade orders without distorting the market. While entrenched interests have painted these developments negatively, warning of deteriorating market quality with subjective or questionably supported claims, the net result of increasing competition is clear and measurable: a significant reduction in costs, with the benefits going to the end investor. It would be a shame to roll that back.

*James Hilton is Co-Head of AES Sales, EMEA at Credit Suisse
©BestExecution 2013

Equities trading focus : Market quality : Michael Krogmann


OF FIGURES AND FAIRNESS.


Michael Krogmann* provides a holistic insight into defining market quality.

It feels like ages ago that comparing explicit trading costs and the liquidity in a specific instrument seemed enough for market participants to make an educated choice about where to trade. This was before the credit crunch in 2008. Investors lost a lot of faith then, and in the face of an increasingly fragmented European equities trading landscape it should surprise no-one that this faith has not yet been fully restored.

Consequently, market participants have been looking for better means to identify the trading venue that best supports their individual strategy. Over the last few years, their focus has shifted from cross-venue comparisons of market share and liquidity to a much bigger picture: the overall quality, accessibility and level of pre-trade transparency a market has to offer.

Quantifiable indicators

There is an industry-wide consensus that market quality can be determined – and compared – by measuring spreads at touch, best prices at touch, and the quoted order book depth of a market, per instrument or index. A very popular example of this is the “battlemap of the exchanges”, a comparison of these indicators, plus the respective market shares, across Europe’s major trading venues, covering eleven European indices. This decision-making tool, provided by LiquidMetrix, a neutral and independent research provider, has proven so useful for market participants that Deutsche Börse decided to provide the relevant data on xetra.com for comparing market quality in trading of the German benchmark index DAX. However, these indicators work best with big trading venues attracting all kinds of investors and providing a heterogeneous, pan-European order flow. The more specialised a trading venue is, the less weight these figures carry, and the less they qualify for comparison.
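As a concrete illustration of these quantifiable indicators, the snippet below computes the spread at touch and the quoted depth at touch from a single order book snapshot; the book itself is invented.

```python
# Quantifiable market quality indicators from one (invented) snapshot
# of a limit order book.

book = {
    "bids": [(99.98, 500), (99.97, 800)],    # (price, size), best first
    "asks": [(100.02, 400), (100.03, 900)],
}

best_bid, best_ask = book["bids"][0][0], book["asks"][0][0]
mid = (best_bid + best_ask) / 2
spread_bps = (best_ask - best_bid) / mid * 1e4            # spread at touch
depth_at_touch = book["bids"][0][1] + book["asks"][0][1]  # quoted depth at touch

print(f"spread at touch: {spread_bps:.1f} bps, depth at touch: {depth_at_touch}")
```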

The need for diversity

The kind of liquidity a market participant needs to pursue his investment objectives depends on his trading strategy. And just like trading strategies, liquidity pools can vary widely – making it considerably more difficult to compare trading venues. A marketplace attracting mainly high frequency traders might provide high liquidity which could be of lesser benefit for other kinds of investors. Thus, a liquidity pool reflecting a multitude of traders’ interests will cater to the needs of a broad variety of investors.

Another important factor regarding the diversity in counterparties is their direct influence on a market’s ability to recover from the impact of an order. The more diverse the liquidity in a specific instrument is, the more resiliency a market participant can expect.

Fairness: a fundamental factor

Part of the academic literature differentiates between market quality and market integrity, perhaps because integrity cannot be quantified as neatly as transaction cost or liquidity. In fact, integrity is a vital part of overall market quality – and, of course, part of transaction cost, if not an obvious one. Michael J. Aitken, CEO and chief scientist at the Capital Markets Cooperative Research Centre (CMCRC), has contributed greatly to the discussion on market quality and its usefulness as a decision-making criterion for investors. In his eyes, a high level of market quality can only be obtained in a fair and efficient market. Consequently, his definition of market quality includes market integrity as one of three main components, the other two being market efficiency and systemic risk.

Aitken and his colleagues have therefore searched for means to quantify certain types of integrity-threatening behaviour in order to make the level of integrity as comparable as, for example, best prices at touch. They developed proxy alerts to measure some of the most common forms of market abuse, such as front running, trading ahead of price-sensitive announcements, and ramping at the close. Aside from these measures of market abuse, the level of supervisory presence at a specific trading venue is always a reliable indicator of fairness in a market. In this respect, regulated exchanges offer the fairest trading.
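To give a flavour of what such a proxy alert might look like, here is a deliberately simplified “ramping at the close” detector. The thresholds and the definition itself are assumptions for illustration; the CMCRC’s published metrics are considerably more sophisticated.

```python
# Toy proxy alert: flag a day whose final price jump largely reverses
# at the next open. Thresholds are illustrative assumptions.

def ramping_at_close_alert(prices, next_open, move_bps=50, reversal=0.5):
    """prices: intraday prices, last element = close. Flags the day if the
    last move is >= move_bps and at least `reversal` of it retraces."""
    pre_close, close = prices[-2], prices[-1]
    jump_bps = (close - pre_close) / pre_close * 1e4
    if abs(jump_bps) < move_bps:
        return False                       # no unusual move into the close
    retraced = (close - next_open) / (close - pre_close)
    return retraced >= reversal

# An ~80bps run-up into the close that mostly reverses overnight:
print(ramping_at_close_alert([100.0, 100.1, 100.9], next_open=100.2))  # True
```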

However, fairness as a component of market quality should not be limited to the prevention of market abuse: it should include the provision of equal access to a trading venue for all market participants, and the venue’s absolute neutrality, as regulated exchanges offer – with the most welcome side effect of heterogeneous liquidity.

Non-quantifiable but cost-effective

Fairness is not the only component of market quality that cannot be expressed as a figure – many factors in the optimum outcome of a single trade defy quantification. In fact, every step in the trading process – from order routing, trading and clearing down to settlement of each and every instrument – influences the overall transaction cost. This includes the level of a marketplace’s trading technology, the provision of redundant lines and co-location services, and the stability of the trading infrastructure. In addition, the efficiency of the instruments and technical solutions implemented to guarantee sensible prices during highly volatile markets, or to prevent unwelcome side effects of HFT, can influence the outcome of a trade. The same goes for the services a market operator offers to support investors’ risk management, such as the provision of a central counterparty.

Market quality and transparency

While a high level of pre- and post-trade transparency may not suit the business model of every market participant, it certainly contributes significantly to the stability of the European trading landscape – and adds much-needed efficiency to the price discovery process in increasingly fragmented markets. Transparency on a pan-European level, however, cannot be obtained without its most important precursor: fairness in terms of market accessibility and the prevention of market abuse. A market model offering equal rights and equal trading opportunities to every market participant not only caters to a rich and heterogeneous order flow – it is also future-proof with regard to the new requirements expected with MiFID II.

*Michael Krogmann is Executive Vice President at Deutsche Börse AG.
©BestExecution 2013

 

Equities trading focus : Outsourcing : Asif Abdullah


FEELING THE SQUEEZE.


Asif Abdullah* explains why managing down the cost-per-trade is key to survival for mid-sized players. 

Although global equity market indices, including the S&P 500 and the FTSE 100, have recovered to pre-crisis levels, this has not resulted in an increase in equity brokerage fees. Client trading volumes, which drive brokerage commissions, remain depressed. Industry research shows that the global equity revenue pool in 2012 was 55% of pre-crisis 2007 levels (see Fig. 1), while reduced primary activity, such as IPOs, has curtailed secondary market activity.

As a result, many firms have been forced to strategically review their equities franchises, leading to downsized or shut-down business lines.

The reduced revenue pool has increased competition amongst brokerage firms. Market-leading equities houses, which have maintained a full service offering, have successfully increased market share at the expense of mid-tier and regional firms.  These global firms have maintained profitable business lines by continuing to exploit volume-driven economies of scale and, increasingly, they have benefited from cross-asset flow by implementing multi-asset platforms.

Mid-tier or regional firms, on the other hand, are encumbered by expensive technology and operational platforms designed to manage pre-crisis volumes. What this means is that the fully loaded cost per trade for a market-leading firm is significantly less than for a mid-tier firm, as illustrated by Fig. 2. The ‘Squeeze zone’ is where firms have high cost bases and cannot benefit from economies of scale.


For many equities businesses, reduced brokerage revenue coupled with a high cost base has resulted in negative or near negative return on equity, forcing banks to make difficult strategic restructuring decisions.

In the context of reduced profitability, technology platforms need to align to the new business reality. Three key trends in the market are emerging as a response to this: the merging of high-touch and low-touch flows, the commoditisation of trade management functionality, and the deepening of vendor relationships.

Merging of high touch and low touch

When full STP and electronic trading truly became possible in the late 1990s, brokers found their traditional high touch platforms could not service the demands of clients who wanted fast, reliable and scalable execution capabilities. Early market entrants developed in-house low-touch platforms to sit alongside existing high-touch platforms. Later market entrants have since taken advantage of the lessons learnt from the earlier entrants with many using vendor solutions rather than building in-house platforms.

Over time, high- and low-touch platform functionality has converged with high-touch platforms now incorporating many of the performance enhancements typically offered by their low-touch counterparts. Low-touch platforms offer functions over and above a simple DMA offering, for example enabling brokers to intervene during execution, providing FIX access to broker algorithms, and more.

This convergence of functionality is expected to continue, eventually eliminating the need to maintain two distinct platforms. Low-touch platform vendors have been continuously enriching functionality to meet high-touch requirements and are well positioned to dominate the converged market.

Commoditisation of key functions

As electronic trading in equities has matured, much of the trade management functionality has become commoditised, as presented in Fig. 3; many of these functional components now offer very little in the way of competitive advantage to the bank.


Many software vendors now provide off-the-shelf solutions which are integrated with the broker’s own systems. In particular client connectivity, order management, execution management and the provision of market data have been fully commoditised. Brokers can continue to secure competitive advantage from functions such as programme and algorithmic trading.

In the context of shrinking investment budgets and increasing cost pressures, many firms are migrating these commoditised components from in-house to vendor solutions. As a result, firms are able to direct valuable internal resources to areas that still provide competitive advantage.

Deepening of vendor relationships

In addition to increasing the use of vendor technology within the trade management service stack, brokers are looking to vendors to provide increased levels of service – from the provision of enterprise software licencing, integration resourcing, on-going software maintenance, hardware hosting and the running of vendor software, through to a fully outsourced model where the vendor is responsible for the complete operation of the software and supporting infrastructure. Fig. 4 summarises the five main types of vendor engagement.

Fig. 4 – The deepening vendor relationships

Depth of vendor service offering – Description
Enterprise Software Licencing – The vendor provides the broker with software under a licence agreement, typically with an annual subscription. The software functionality may be customised and extended by the broker and is integrated with other systems by the broker.
Integration – In addition to the provision of software, the vendor provides the client with relevant technical and project delivery resources to support the integration of the vendor software with the broker’s other systems.
Maintenance – The vendor is engaged to perform on-going maintenance of the software, which may include the provision of 1st or 2nd line support. The software is installed at the broker’s premises, with the vendor’s resources providing SLA-governed services on-site or remotely.
Hosted Enterprise – In addition to software maintenance, the vendor provides and hosts, on its premises, the underlying infrastructure that supports the platform. The vendor manages the infrastructure hardware in addition to the software.
Fully Outsourced – In a fully outsourced model, the vendor is responsible for all aspects of the platform and related services. The vendor may take advantage of economies of scale by sharing functional components or data (such as market data) across several clients. The vendor strictly controls software changes, and all clients typically have access to common functionality and use the same version of the code base.

 

The deepening of vendor services presents challenges and opportunities for the broker as summarised by Fig. 5:

  • The broker may feel a loss of control as the vendor takes on more responsibility. As a consequence, operational risk may increase.
  • An increased use of vendor services places greater emphasis on relationship management skills. Effective relationship management is critical in an outsourced environment, and the broker may have to review the skill profile of its internal team. The vendor and broker teams should have common objectives, incentives should be aligned, and a partnership-type relationship should be nurtured.
  • Using an outsourced service model provides the broker with scalability. The broker has the opportunity to flex resourcing or systems capacity in line with requirements, and vendors with several clients can smooth out the peaks and troughs of demand.
  • The broker also has the opportunity to move from a fixed cost model to one with a larger element of variable costs, and thereby manage down the cost per trade as volumes fluctuate – as the sketch below illustrates.
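Here is a toy comparison of the two cost models from the last bullet, with invented numbers: a high fixed-cost in-house platform versus an outsourced model with lower fixed costs but a per-trade fee.

```python
# Illustrative cost-per-trade comparison; all figures are invented.

def cost_per_trade(volume, fixed_cost, variable_per_trade=0.0):
    return fixed_cost / volume + variable_per_trade

for vol in (1e6, 5e6, 2e7):                            # annual trade counts
    in_house   = cost_per_trade(vol, fixed_cost=40e6)
    outsourced = cost_per_trade(vol, fixed_cost=10e6, variable_per_trade=1.5)
    print(f"{vol:>12,.0f} trades: in-house {in_house:6.2f}  outsourced {outsourced:5.2f}")
```

At low volumes the outsourced model is far cheaper per trade; only at very high volumes does the in-house platform catch up – which is exactly the ‘squeeze zone’ dynamic of Fig. 2.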


We’ve seen the closure of many equities brokerage businesses in the last few years, most notably RBS’s. In an environment where regulatory pressures are increasing and clients are becoming far more circumspect about where they transact their business, this will also be the fate of brokers who fail to manage down their cost-per-trade.

Dismantling large, monolithic software architectures is necessary for mid-tier houses fighting to survive; those that can do this, and recalibrate their focus towards the areas that truly provide a competitive advantage, will succeed.

*Asif Abdullah is lead consultant at GreySpark Partners.
 
This article is part of a research series published by GreySpark Partners (cmi@greyspark.com).
 
©BestExecution 2013

Equities trading focus : FTT : Denis Orrock


TOO LATE TO HOLD BACK THE TIDE.


Amid much speculation about whether, when and in what form a Financial Transaction Tax (FTT) might be imposed on securities trading in Europe, Denis Orrock* suggests that burying your head in the sand is not an option.

There are approximately 40 transaction taxes or stamp duties either active or being implemented in countries around the globe. These taxes, whilst conceptually similar, are often referred to as something other than a Financial Transaction Tax (FTT). In the Americas, eight out of 35 countries have some form of transaction tax (including Bolivia, Colombia and Peru), while in the Asia Pacific region almost half the countries have such a tax. It is becoming clear that the tide now lapping at the shores of Europe (and in particular France and Italy) is part of a much stronger global swell that, despite the debate and political rhetoric, will not be turned back by any one jurisdiction or even region.

Time to prepare for the operational certainties

Most commentators would agree that some form of EU Financial Transaction Tax is inevitable. Yes, it will probably be delayed and certainly it will be diluted, but the tax and its inherent operational challenge will not disappear. Whether the transaction fee is 1.0 per cent or 0.01 per cent, the operational task remains the same. And irrespective of the decisions taken at an EU level, there is still much activity that is taking place on a national level – the debate surrounding Italian derivatives being a prime example. Capital markets, asset management and custodian firms need to put their uncertainties around fee levels within the EU FTT to one side and start to prepare for the operational certainties that are already here.

The solution conundrum

Most institutions have tactical solutions in place, perhaps based upon their existing tax software, but forward-thinking firms are now looking for a more strategic tax-processing solution that acts as a central repository for all tax rules – not just the EU FTT. The idea is that this repository would then serve all regional tax requirements, interfacing with local systems. The alternative is a tactical solution, which is typically resource-intensive and has limited capability in terms of workflow, exceptions management or rule flexibility. Industry evidence from both the French and Italian experiences has demonstrated the costliness and inefficiency of these tactical solutions.

As each country implements its own unique version of the FTT, capital markets, asset management and custodian firms will potentially need to develop a unique set of rules for each jurisdiction. For the Head of Operations or IT, the solution they deploy will also need to interface with in-house ledgers, MI systems and other applications both up- and downstream. Ideally the solution should also embrace the whole lifecycle of a trade, including pre-trade ‘what if’ analysis. There are obviously huge advantages to be derived from having one central tax repository that can provide a dashboard representing the effects of the tax on all potential trades across multiple jurisdictions.

The current situation is that institutions are capturing the trades from multiple upstream systems (systems that may cover different regions or even different instruments). Once the Operations teams have captured all of that data from the trading platforms, they then need to apply the rules determining which trades are subject to FTT and which are not. In operational terms, the requirement is for a system capable of identifying the, say, 25 trades out of 100 that attract FTT and disregarding the rest. At the moment this task is being undertaken in a very manual fashion, or at best via a batch process: it is not a scalable approach that can accommodate the addition of new countries, each with its own unique set of rules.
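The central rules repository can be pictured as a set of per-jurisdiction predicates applied to every captured trade. The rules below are simplified stand-ins for illustration, not the actual French or Italian regulations.

```python
# Sketch of an FTT rules repository: one predicate per jurisdiction,
# applied to every trade captured from upstream systems. The rules
# shown are simplified illustrations only.

FTT_RULES = {
    "FR": lambda t: (t["instrument"] == "equity"
                     and t["issuer_country"] == "FR"
                     and t["market_cap_eur"] >= 1e9),
    "IT": lambda t: (t["instrument"] in ("equity", "equity_derivative")
                     and t["issuer_country"] == "IT"),
}

def taxable_trades(trades):
    """Yield (trade, jurisdiction) pairs for trades that attract FTT."""
    for trade in trades:
        for jurisdiction, rule in FTT_RULES.items():
            if rule(trade):
                yield trade, jurisdiction

trades = [
    {"id": 1, "instrument": "equity", "issuer_country": "FR", "market_cap_eur": 5e9},
    {"id": 2, "instrument": "bond",   "issuer_country": "FR", "market_cap_eur": 5e9},
]
for trade, jurisdiction in taxable_trades(trades):
    print(f"trade {trade['id']} attracts FTT in {jurisdiction}")
```

Adding a jurisdiction then means adding one entry to the repository rather than building another one-off process.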

The need for scalability

Any form of transactional tax generates a huge workload for the Operations team in terms of capturing the trades, sifting through them to determine those that are exempt, performing the tax calculations and generating the messaging to the relevant authorities. To do this manually for one country is not really feasible, let alone 40 countries.

Indeed at an FTT seminar of 30 prime brokers, custodians and consultants (hosted in London during June 2013 by GBST and Six Financial Information) the view was expressed that the tactical French and Italian solutions that have been deployed to date are unlikely to scale sufficiently to handle further jurisdictions. The compelling drivers for a strategic solution to the FTT question that emerged from the seminar were competitive advantage, future proofing, risk mitigation and cost reduction.

Conclusion

While the FTT may reduce in scope in terms of European jurisdictions, at a global level the scope is actually growing for the Tax and Operations teams within institutions.

While up on the bridge the politicians and industry bodies debate the whys and wherefores of an EU transaction tax, down in the engine room of financial institutions the operational staff are already getting their feet wet and desperately need a strategic solution. The Compliance and Risk teams are only too aware of the perils of failing to introduce a truly scalable, global tax-processing methodology.

*Denis Orrock is CEO, GBST Capital Markets

©BestExecution

Viewpoint : Data Management : Stephen Engdahl


DATA MANAGEMENT TO IMPROVE RISK MANAGEMENT.


Stephen Engdahl, Senior Vice President, Product Strategy, GoldenSource

Risk analysis is not new. Portfolio management and investment decision-making have been driven by risk analysis since at least the middle part of the last century. Our industry is built around techniques to achieve maximum return with minimal (or optimal) risk. And the risk discipline requires copious amounts of timely, high quality, relevant data. Risk models are only as good as the information which is fed into them.

Institutions now recognize the importance of good data management practices in order to improve their risk practices, and more than ever before they are prepared to do something about it. Risk analysis is a common driver of many active GoldenSource client projects focusing on data management, data quality, and data governance. But if risk analysis is such an established practice, then why are so many institutions just now focusing on data as it relates to risk? What changed?

Navigating an uncertain world

The markets have taught us many lessons over the last five years, including the importance of expecting the unexpected. Market and economic events relevant to risk management – counterparty defaults, market volatility, mortgage crises, and sovereign debt restructurings – have stressed the data management practices of most organizations.

Consider even the relatively simple task of measuring exposure along various dimensions – what was your exposure to a failed counterparty, including all its subsidiaries and related entities? To securities linked to subprime mortgages? To Greece? Now imagine answering these questions in an environment where positions are recorded in multiple systems, the counterparty hierarchy is not well defined, and critical reference data attributes exist in spreadsheets or separate, incompatible security masters, not linked to counterparty and position data. What should be a simple task becomes a significant problem unless the right data infrastructure is in place.
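The sketch below shows why that linkage matters: rolling positions up to an ultimate parent is trivial once counterparty hierarchy and position data are joined, and essentially impossible while they live in disconnected spreadsheets. The entities and figures are invented.

```python
# Exposure roll-up across a counterparty hierarchy. Entities and
# amounts are invented for illustration.

PARENT = {"BankCo EU": "BankCo Holdings", "BankCo US": "BankCo Holdings"}

def ultimate_parent(entity):
    while entity in PARENT:
        entity = PARENT[entity]
    return entity

# (counterparty, exposure in EUR m), aggregated from several systems
positions = [("BankCo EU", 120.0), ("BankCo US", 80.0), ("OtherBank", 40.0)]

exposure = {}
for cpty, amount in positions:
    root = ultimate_parent(cpty)
    exposure[root] = exposure.get(root, 0.0) + amount

print(exposure)   # {'BankCo Holdings': 200.0, 'OtherBank': 40.0}
```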

Easy access to accurate risk data cannot be ignored, because we live in a volatile world. “100-year floods” are supposed to happen no more often than once every 100 years, but now they’re occurring with alarming frequency. In the financial markets, the same goes for formerly “unthinkable” events such as major institution and sovereign crises.

This situation is particularly stressful without a solid data foundation which aggregates and standardizes internal position and transaction data, and links it with clean counterparty, hierarchy, instrument and underlying data so that measurement and analysis can proceed. After facing a series of external events and realizing that answering critical questions took far too long, institutions are recognizing that the time to aggregate data, standardize it, and monitor it for quality is before the next question arises, not when it arises.

More regulation, please?

Multiple jurisdictions worldwide are asking for more data to support the decisions made and positions taken by financial institutions. Central clearing of derivatives changes the risk profile of these instruments. Certain trade reporting must be enriched with Legal Entity Identifiers (LEIs) and Unique Product Identifiers (UPIs). Solvency II requires much more granular data, across a broader set of attributes, than was needed in the past – touching on all aspects of data from position to instrument to counterparty.

The strategic reason for a firm to invest in data management capabilities should always be to improve quality and improve its own decision making and in turn its own investment performance. However, regulation is providing an additional kick to get things right, today.

Making order out of chaos

Prompted both by regulatory requirements and market events, risk managers, portfolio managers, and other consumers of information within the firm are now asking more questions about the lineage of the data they use.  In addition to demanding better data quality, the business is demanding a view into the data supply chain. This is particularly important in areas such as valuations, where the business needs to show why a particular valuation was applied to a thinly traded instrument, and what other sources and models for valuation were provided.

Transparency requires data management operations to run on systems with full audit trails, entitlements control, and easy access points for end users, rather than spreadsheets. It also requires the data lifecycle within an organization (and the governance processes which surround and support it) to be clearly defined and documented. Every set of data attributes has a data steward somewhere within the organization who knows the most about how that data behaves, and the business needs a simple way of finding that person when it has questions about the attributes.

The path forward

The risk discipline is facing new scrutiny as firms steel themselves for the present and ready themselves to compete in the future. And integral to good risk management is sound data management policy, practices, and systems. Given the current state of data management in many firms, it may seem like a daunting task to get to the end state of clean, consolidated, complete data to drive effective risk analysis.

Luckily, that does not have to be the case. Leading institutions that have made strides in this area share the following three characteristics:

1. Define success criteria around sets of data consumers

Data management initiatives achieve their best and quickest results when they focus on delivering value to a particular region, a department, an application, or a business function. This ensures that each project phase provides ROI, and sets the stage to iteratively increase ROI in subsequent phases. It’s important to have a solid end-state vision, but equally important to achieve it in incremental steps.

2. Don’t forget the data generated within the firm

Data governance and quality processes apply to all data assets which need to be aggregated and leveraged across the enterprise. Many data projects only consider the security master data, which is procured from external vendors. However, that is only part of the picture. A solid end-state vision must address counterparties, accounts, and associated hierarchies. It must also include positions and transactions. The quality and standardization of such information is critically important to risk, regulation, and gaining a 360-degree view of the business.

3. Be selfish – identify strategic benefits from regulatory projects

Regulation is often viewed as something to do “because we have to.” But data aggregation, standardization, quality management, and transparency required by many regulatory initiatives can also be used for competitive gain. Leading institutions derive strategic gains from their regulatory initiatives, such as finding ways to get to know their customers better, operate more efficiently, create new financial products more quickly, and improve their investment performance.

Data management has reached mass adoption. Earlier adopters have paved the way and provided all of us with lessons learned about how to get data management right. Now, given market uncertainty, regulatory requirements, and the drive for business transparency, it’s a perfect time to create your own data management roadmap.

©BestExecution 2013

Equities trading focus : Execution services : Ian Peacock


A NEW LOOK.


Ian Peacock, UK CEO and global head of execution services at Kepler Cheuvreux talks to Best Execution about its new offering.

Has best execution improved since MiFID?

It is hard to measure overall best execution in the marketplace since no general measure exists. However, I think MiFID has greatly improved best execution because there is now an industry standard for execution monitoring and transaction cost analysis. This was not the case before MiFID. Trading desks now have a greater focus on placing and monitoring orders, and this control is enforced by market authorities. Greater attention is also paid to reporting.

What do you think the impact of MiFID II, and other regulation will have?

If you look at what has happened, the market has developed at such a pace that regulators are addressing issues that did not exist before the original directive. MiFID II is far from being implemented and comprises two elements: a regulation, which should be applicable around mid-2014, and a directive, not to be enforced before 2016.

There are six areas of focus for equities. The first is the definition and limitation of high frequency trading (HFT) through a dedicated fee structure that markets should implement. This type of trading has increased as new trading venues emerged and fragmentation occurred. The next focuses on OTC trading, which is being moved onto listed exchanges; OTC trades will correspond to trading exceptions, but these need to be clearly defined. Third, and still under discussion, is pre-trade price transparency and the exceptions granted to dark pools. There is also talk of the possible creation of both a new trading platform, the OTF (organised trading facility), to better regulate broker crossing networks, and a European consolidated tape. Last but not least is stronger best execution control, with stricter obligations such as monthly reporting by intermediaries and platforms.

Altogether, MiFID II should reduce HFT activity and increase market transparency.

And FTT?

There has been a push for a European-wide FTT, but so far some countries, such as France and Italy, have gone it alone. However, there are differences between the two in that France allows the use of swaps or contracts for difference; this is not the case in Italy. Overall, I think the momentum is there for a European-wide FTT, which is very much politically rather than financially driven, but there need to be consistent rules. I do think, though, that it could have a detrimental impact on volumes.

You have recently launched a best execution offering. Can you provide more details?

If you look at the surveys such as Greenwich Associates, flow is concentrated among the top ten brokers. We are on the list as the only non-conflicted agency broker with a multi-country offering. We are not an investment bank whereas the other nine on the list have a global business model. We focus on best execution and we need to be pre-eminent in that space. This means being a strong execution partner and having a comprehensive offering whether it is high or low touch. It is also very important to listen and talk to clients to develop an execution offering along with high quality service that meets their needs.

We have integrated the two trading desks of CA Cheuvreux and Kepler and now have one team and service covering electronic trading, cash trading and portfolio trading. We offer a wide range of products including direct market access, algos, high touch trading and research. The business is run out of London, which has a greater focus on large-cap stocks, and Paris, which has a greater emphasis on small to medium-sized caps. We also have offices in all major European centres as well as New York, Boston and San Francisco.

What do you foresee as the challenges and opportunities?

When you talk about future challenges, I think one thing is certain: we are operating in an environment that will continue to change and evolve at a fast pace. With increased regulatory and political pressure now influencing market structure, the challenges of tomorrow will be very different from the challenges we face today. You only need to look at the unintended consequences of MiFID I implementation leading to widespread fragmentation and the explosion of HFT. You would not have predicted this in 2007.

As for opportunities, we will continue to focus on integrating the two brokers and to help our clients who are facing the same challenges as ourselves. We will also continue to reinforce our offering with innovative and new products.

©BestExecution

Equities trading focus : Retail trading performance : Darren Toulson


IS SIMPLY ACHIEVING EBBO GOOD ENOUGH?


Darren Toulson reviews the trading performance achieved by retail investors across different retail trading mechanisms used in Europe.

In the UK most trading conducted by retail investors happens on-exchange but ‘off-book’. Retail investors, most of whom trade online using broker websites, will send a Request for Quote (RFQ) to their broker who will use an RSP Network to elicit two-way quotes from up to 40 market makers. These quotes are good for at least 15 seconds so the broker can present the retail investor with the best price available and give them the option to accept that price and conclude the deal.
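In outline, the broker’s selection step in that flow looks like the sketch below; the market maker names and prices are invented, and a real RSP network naturally handles much more (firm sizes, timeouts, reporting).

```python
# Toy version of the RSP selection step: gather firm two-way quotes
# and present the client with the best price for their side.

def best_quote(side, quotes):
    """quotes: dicts {"mm", "bid", "offer"}, each firm for at least 15s."""
    if side == "buy":                                  # client buys at the offer
        return min(quotes, key=lambda q: q["offer"])
    return max(quotes, key=lambda q: q["bid"])         # client sells at the bid

quotes = [
    {"mm": "MM1", "bid": 99.97, "offer": 100.01},
    {"mm": "MM2", "bid": 99.98, "offer": 100.00},      # best on both sides
]
print(best_quote("buy", quotes))   # MM2's quote, possibly inside the exchange BBO
```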

Best execution for the UK retail customer therefore relies on two factors. Firstly, by convention, market makers provide two-way quotes at or inside the best bid/offer and usually with volumes greater than available on the London Stock Exchange. Secondly, the competitive nature of the RSP network means that the price offered may not simply match the LSE BBO price, but may actually improve on it.

This is the theory; how about the evidence?


Figs. 1 & 2: Spread capture for FTSE 350 stocks (Fig. 1) and for ISDX-reported small/mid cap stocks (Fig. 2)

Figures 1 & 2 show spread capture histograms for different sets of retail trades executed in UK stocks during May 2013. They are created by taking each retail trade and comparing its execution price with the volume-weighted, market-wide EVBBO price, i.e. the best price a Smart Order Router (SOR) could achieve by aggressively executing that order size against lit liquidity on the LSE or any lit MTF. A spread capture of 0 means the price achieved was the same as the aggressive EVBBO; a value of 0.5 means the price achieved was the EVBBO mid price, i.e. capturing 50% of the EVBBO spread.
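The measure itself is simple to compute. A sketch follows with invented quotes; in practice the EVBBO is built from consolidated, lit depth-of-book data sized to the order.

```python
# Spread capture versus the (invented) EVBBO: 0 = traded at the
# aggressive touch, 0.5 = at mid, negative = worse than the touch.

def spread_capture(side, exec_px, evbb, evbo):
    spread = evbo - evbb
    if side == "buy":
        return (evbo - exec_px) / spread
    return (exec_px - evbb) / spread

# A retail buy filled one tick inside a 4-tick EVBBO spread:
print(spread_capture("buy", exec_px=100.03, evbb=100.00, evbo=100.04))  # 0.25
```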

Fig. 1 shows spread capture frequencies for around 100,000 retail trades executed in UK FTSE-350 stocks. Fig. 2 shows around 50,000 retail trades executed in small/mid-cap stocks that were reported to the ISDX exchange.

What’s striking is that in both cases execution prices are mainly to the right of the 0 spread capture mark, i.e. inside the EVBBO spread. Averaging over all flow, FTSE-350 stocks are executed about 15% inside the spread, and the small/mid cap ISDX stocks about 25% inside the spread. This means that a UK retail investor will get price improvement of at least 15% of the EVBBO spread on their aggressive ‘deal now’ orders (along with a free 15-second option to trade).

What does this mean from a best execution standpoint? For many people, best execution for retail size trades often simply means that the trade should execute within lit primary market BBO prices. What’s clear from our results, however, is that matching BBO or even EBBO isn’t ‘good enough’ – there are liquidity providers willing to quote well inside lit UK BBO prices for retail investors. So, any UK broker offering to simply match EBBO, with no potential for improvement, would not be achieving the best price available for their customers.

How about other European markets? Each country in Europe has different ways of routing / internalising / executing retail flow. We’ll give two examples, one for France and one for Germany.


Figs. 3 & 4: Spread capture for French Equiduct trades (Fig. 3) and for Stuttgart trades (Fig. 4)

Fig. 3 shows spread captures (versus primary market BBO this time) of French retail flow executed on Equiduct’s PartnerEx exchange. Although the flow is executed at exactly EVBBO on PartnerEx, there are still clear improvements versus sending the same orders aggressively to the primary market. Fig. 4 shows spread capture of German trades executed on a regional exchange – Stuttgart – versus BBO prices on Xetra. Again there is clear price improvement, with, in this case, a relatively large number of trades matching at the Xetra mid-price.

So the pattern is that, using quite different microstructures, aggressive retail flow is typically achieving significant price improvement over prices that would be obtained by routing aggressively to a primary market. For retail best execution, BBO (or in the UK, EBBO), is not ‘enough’.

Finally, we might ask why these retail-focussed trading pools should offer apparently better prices than those offered on lit markets to institutions and professional traders. The most likely answer is that in each of these venues the counterparties to the retail flow can be fairly confident they are dealing with low-frequency, ‘less toxic’ flow. As such, the risks of being picked off by HFT firms, or of suffering the adverse selection of trading the other side of a large, price-moving institutional block, are reduced. Hence the liquidity providers can afford to be more generous.

There are interesting parallels here with the role of buyside Dark Pools / BCNs in the institutional space where, by being selective about who can participate, these venues can sometimes provide better outcomes to those who do participate.

*Darren Toulson is head of research, LiquidMetrix
©Best Execution 2013

Trading : Market data


MARKET DATA: TOO GOOD TO GIVE UP?    


Roger Aitken canvasses industry opinion to see if a more holistic approach can be found.

Exchanges ‘milking’ member firms has been a cry made by industry players on both the buyside and sellside in recent years. Back in June 2008, Clive Furness, Managing Director of Contango Markets, a commodities consultancy, remarked in a Highdeal roundtable in London that a “huge number” of end users of stock and derivatives exchanges were being confronted by a “bewildering array” of pricing tariffs – linked to outright price but also the speed of execution and latency.

Furness even referred to one unnamed exchange whose pricing tariffs stretched across “10 columns on a single page and ran to 11 pages” just for financial products.

Since then, and with the arrival of MiFID in Europe, a plethora of exchanges and MTFs sprang up. For UK equities alone there were at one stage 19 separate trading venues, and Europe became home to 27 exchanges and 19 MTFs. Trade execution fees for equities shrank to an average today of around 0.2 basis points (bps), versus 2.0bps pre-MiFID.

Against this backdrop the exchanges sought inventive ways to boost revenues. Here the London Stock Exchange Group’s (LSEG) 2013 annual report offers an insight. It reveals that its information services business segment generated £306.3m, representing 36% of the exchange’s total income of £852.9m (2012: £814.8m), in the year to 31 March 2013. That was almost a 40% rise on a year earlier (£218.9m) and contrasts with the capital markets segment for multi-asset trading, which declined £34.4m, or 11.4%, over the same period.

Within LSEG’s information services segment sits the sale of real-time price information and a range of other information services, from indices to post-trade analytics, as well as fees based on the number of terminals taking the exchange’s real-time price and trading data and subscription fees for data. Meanwhile, LSEG’s technology services, spanning fees for network connections and server hosting, pulled in 6.6% more income than the previous year.


Steve Woodyatt, CEO and chairman of Object Trading, a London-headquartered vendor providing a single interface to the world’s markets for the buyside, sellside and technology partners, reflects on the evolution of charges levied by exchanges by referring to a heated industry panel discussion he attended at last year’s International Derivatives Expo (IDX) in London.

Woodyatt noted, “The buyside attendees during that session were raising the vexed issue that the exchanges were not moving their costs, while the sellside complained that they were being pushed thin at the margins. On top of this they had the added cost of regulation and compliance to contend with. And, there is only one place that the increased cost of doing business can go – to the buyside. However, they are saying they cannot or do not want to pass it on to the investor.”

Woodyatt added, “When one looks at the buyside and where it’s going to be squeezed on margins, the issue in Europe and the US is that the exchanges are very busily trying to hold their ground on costs. They are shifting things around and certainly one sees cases in Europe where they are now increasing data and technology charges in order to maintain margins.”

A year on and Woodyatt thinks the situation is pretty much “business as usual” as far as market data and technology charges levied by exchanges go. “It’s rather analogous to the old story of the Emperor’s new clothes. Everyone knows this movement really has to happen but no one probably knows where to start the discussion and does not want to be the first to initiate it.”

The issue remained pertinent for delegates at this year’s IDX event in London in late June. During ‘The View from the Top’ debate with various exchange CEOs an audience poll revealed that 45% believed exchange charges (unspecified) would be higher in 2014.

And, in a sign that there could be some movement towards ‘industry stack’ co-operation, with the different stakeholders in the value chain addressing issues more holistically, Andreas Preuss, Eurex CEO, asked during the same session why the industry does not use a “global liquidity access infrastructure”. That is the premise of Object Trading’s FrontRunner suite, which aims to improve efficiency for the industry.

Woodyatt indicated that he is starting to see market participants between the various trading interfaces “beginning to collaborate”, although this is more between buy- and sellside institutions than between sellsides and the exchanges.

Alex Clode, business manager at Bloomberg, who has spent some twenty years providing data, research and execution solutions, commented on whether fees levied by exchanges in Europe and North America are too high for end users: “Clearly we would all prefer to see lower fees in terms of market data and that, I’m sure, is the case for the buyside, sellside and all the vendors. However, it’s not immediately obvious to us that the exchanges are charging too much for market data.”

He added, “One could argue that the overall costs have balanced out to a level where they are at least sustainable.” In relation to the overall exchange charges for the different components, Clode contended, “The most appropriate mechanism for ensuring the market gets what it requires, and at a price that it can afford, is to ensure that there is enough competition between venues to lead to downward price pressure – certainly on the messaging and transaction fees.”

He added, “All the execution venues need to be able to earn sufficient revenues to enable them to continue to invest, innovate further and carry on providing the types of trading environments which the players on both sides need.”

Turning to the data side, however, Clode acknowledged, “It’s rather like a balloon. If you press it in one area it tends to become a bit larger in another. And that, I suspect, is what has occurred on the data side. That said, the data costs have not increased at the individual venues.”

Richard Chmiel, senior vice president of global sales and marketing at OneMarketData, a vendor providing market data management and analytical solutions for financial institutions, concurs with Bloomberg’s Clode that the exchanges “have to be able to make money” to fund future investment. However, the former market maker adds a caveat: a level could be reached at which exchange charges become “too onerous” – especially for exchanges that have a near monopoly. That begs the question: at what level?

Andy Allwright, head of regulatory strategy, trading at Thomson Reuters, commenting on the exchanges’ charging tariffs for data, messaging, and transaction, says, “In our view the issue is not about the individual charges of exchanges, MTFs and trade publication services, it is about the lack of an established commercial and contractual model outside of North America for the distribution and consumption of consolidated data across all of these parties in real time.”

He adds, “In the absence of this the only option is to charge the combined underlying real-time fees for all parties contributing the data. The result is that administering access to the consolidated data to only those paying for all the data is problematic and the combined cost prohibitive.”

Bob Fuller, chief administration officer at Fixnetix, a vendor of ultra-low latency trading, market data and connectivity solutions, and former CEO of Equiduct, says the COBA Project’s collapsed effort to develop a consolidated tape of record for European equities faced inherent difficulties. While most of the big broking houses have their own consolidated tape, for others the cost of consuming one would be prohibitive.

While the technology is available to achieve the COBA Project’s envisaged goal, Fuller notes other challenges and a “definitional issue” as to who (retail or institutional) will be reading a consolidated tape and for what purposes.

“Fundamentally the problem lies in the fact that once you start to redistribute the huge amounts of raw or normalised data, you have to pay all the individual exchanges. This can be an astronomical cost for individual users, and consequently this sort of information tends to be confined to algorithmic/HFT desks with an absolute requirement for it.”

The debate about exchange tariffs and market data charges is unlikely to abate anytime soon. Both Deutsche Börse and LSEG were canvassed by this magazine for clarification on their market data charges but declined to comment.

©BestExecution

Trading : HFT


HFT ON TRIAL.

Dan Barnes assesses the clampdown on HFT.

Germany’s decision to regulate high-frequency trading (HFT) is almost unique in developed capital markets. While some markets are hostile to HFT simply because they lack the infrastructure to support it, Germany is ahead of the pack in actively reining in technology-led trading.

In the US, HFT accounts for around 63% of cash equity trading volume, compared with closer to 40% in Europe and 25% in Asia, according to research by market consultancy GreySpark. Interfering with such a significant proportion of trading could reduce liquidity and increase volatility. However, regulators must also ask whether the scale of HFT volume may be threatening market stability.

Paul Squires (see Profile, p.14), head of trading at buy-side firm AXA Investment Management, says, “Some HFT firms cancel close to 90-95% of their orders, pinging in and out of different markets then withdrawing liquidity. Some of the descriptions of this such as layering or ‘quote stuffing’ have negative connotations because that activity quite closely resembles price manipulation. You cannot trade against those orders but they affect the price you think you can trade at.”

Trading at high speed can also create spectacular mistakes. On 23 May 2013 NYSE saw a sell order in two energy companies trigger a ‘flash crash’ that cost investors $1.7m. Sellers traded 200,000 shares in less than a second at prices as much as 63% below their value, before trading returned to normal levels.

During the flash crash of 6 May 2010, which saw the Dow Jones Industrial Average drop 1,000 points in 20 minutes before bouncing back, the Securities and Exchange Commission (SEC) and the Commodity Futures Trading Commission (CFTC) identified the huge bursts of activity and the withdrawal of liquidity by firms operating HFT market-making strategies as a catalyst for the crash. At one point these firms traded 27,000 equity futures contracts at such a speed that they generated 49% of the day’s trading volume in 14 seconds, causing many automated equity trading systems to freeze amid the extreme activity.

Since the 23 May flash crash the New York Stock Exchange has implemented a circuit breaker to cover the 15 minutes after the market open and the 30 minutes before the close, periods not covered by the cross-market circuit breaker that regulators are enforcing. However, other exchanges remain exposed during these periods, and the event is symptomatic of the threat that markets face, says Joe Saluzzi, co-founder of equity broker Themis Trading.


“[The 23 May crash] looked like an error rather than malicious, but it could easily have involved more stocks, it could have been a bigger error or a programme trade that came through and where was the protection?” he asked. “Years after the 2010 flash crash?”

However, high-speed trading is not always HFT. When the Australian Securities and Investments Commission (ASIC) investigated HFT and dark pools, it found that HFT was not a major contributor to market disruption, while market fragmentation and dark pools were. In a report issued on 18 March 2013, it noted that the behaviours complained about, such as pinging, could be found among a variety of automated traders, and that HFT firms did not, contrary to popular perception, appear to rest orders in the book for only very short periods.

“When we trade via algorithms the smart order routers break up our block orders into small quantities: it is intended to look like any other (non-institutional) order flow to avoid attracting attention,” says Squires. “So I can understand why regulators might wonder – when asked how to define HFT – whether algorithmic trading is really any different.”

Indeed the rules do not always distinguish clearly between the two. For example, the European Commission’s draft proposal for MiFID II demands that firms trading with algorithms provide continuous liquidity (an apparent reference to market makers) and defines HFT as the placing of large volumes of orders based on high-speed analysis of market data.

The blurred line between HFT and other electronic trading, combined with a lack of empirical evidence connecting HFT to either market abuse or disruption, has stirred concern among market observers about the effects of regulatory interference.


Brad Wood, partner at GreySpark, says, “I think we have seen some knee-jerk regulatory responses to the flash crash made by politicians who frankly have no idea how these markets operate.”

The first step

On 15 May 2013 the German market regulator BaFin introduced a regulatory regime specifically for HFT firms. It defines them as firms which use co-location and high-bandwidth lines (at least 10 gigabits per second) to trade; which initiate orders without human intervention; which place more than 75,000 orders per day; and which have a high order cancellation ratio. These traders will have to be licensed by BaFin if they participate in the German market either directly or indirectly, including firms that access the market electronically via other market participants.
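
Taken together, those criteria amount to a simple classification test. The sketch below (in Python, purely for illustration) shows how a firm’s activity profile might be screened against them; note that the regime speaks only of a “high” order cancellation ratio, so the 90% threshold used here is an assumption, not a figure from the rules.

```python
from dataclasses import dataclass

@dataclass
class TradingProfile:
    uses_colocation: bool
    bandwidth_gbps: float     # line capacity in gigabits per second
    human_intervention: bool  # True if orders require a human decision
    orders_per_day: int
    orders_cancelled: int

def looks_like_hft(p: TradingProfile, cancel_ratio_threshold: float = 0.9) -> bool:
    """Screen a profile against the criteria described in the article.

    The 0.9 cancellation threshold is illustrative only; the regime
    itself puts no number on a 'high order cancellation ratio'.
    """
    cancel_ratio = p.orders_cancelled / max(p.orders_per_day, 1)
    return (p.uses_colocation
            and p.bandwidth_gbps >= 10
            and not p.human_intervention
            and p.orders_per_day > 75_000
            and cancel_ratio >= cancel_ratio_threshold)
```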

Licensed firms must hold at least €730,000 in capital and must ensure that their trading systems are resilient; operate within set trading and risk parameters; have the capacity to cope with their activity; do not send erroneous orders, disrupt orderly market functioning or breach market abuse regulations or venue rules; and are backed by business continuity plans to prevent operational failure.

Trading venues have to tag orders generated by algorithms and set limits on order-to-trade ratios and message volumes, with levels appropriate to the assets being traded and their characteristics.

Placing orders to fix artificial price levels; to disrupt or delay the functioning of the trading system; to make it difficult for third parties to identify genuine buy or sell orders in the trading system; or to create a false or misleading impression about the supply of, or demand for, a financial instrument will be considered market manipulation, whether or not the algorithm placing the orders is engaged in HFT.

At least one firm is reported to be shutting down as a result of these rules. Cologne Independent Traders, which operates in Germany and Switzerland and describes itself as a specialist in delta-one trading with a focus on stocks, commodities, futures and exchange-traded funds (ETFs), is expected to close before the end of the year.

Crest of an international wave

BaFin is one of the first regulators globally to introduce these rules but many others are close on its heels. In Asia, HFT is only viable in a few markets but policymakers are aggressively imposing frameworks to help them avoid the sort of events seen in the US.

Meanwhile, ASIC is in the process of introducing rules to give market operators better control over extreme price movements, using automated trading pauses (which will extend to the ASX SPI 200 futures market) and enhanced market participant filters and controls for automated trading, including a ‘kill switch’ to immediately shut down problematic trading algorithms.

It also charges trading venues a levy based on the volume of message traffic it must manage in order to monitor them.

Taxes on equity trading in Korea make HFT non-viable there, but the country’s derivatives markets rely heavily on it. Joon-Seok Kim, research fellow in the Capital Markets Department at the Korea Capital Market Institute, has found that firms posting more than 1,000 orders a day on a single instrument in the KOSPI 200 futures and options markets account for 50% of daily volume.

The Korea Exchange (KRX) released a consultation on regulating electronic trading of derivatives on 29 April 2013. Under the proposal, firms would be required to register their trading accounts, each of which would have a kill switch allocated to it to stop algos going awry.

The exchange would be able to refuse ‘excessive’ numbers of orders and could charge firms that post them a fixed fee of KRW 1,000,000 per month. The permitted ratio of orders to trades was set at 20:1 where the number of orders is between 20,000 and 100,000, and 10:1 where it exceeds 100,000.
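
In effect, the proposal defines a tiered cap on the order-to-trade ratio with a flat monthly charge for breaching it. A minimal sketch of how such a check might work, assuming (the consultation text itself is not quoted here) that the fee is triggered when a firm’s ratio exceeds the cap for its order-count band:

```python
FEE_KRW = 1_000_000  # fixed monthly charge for excessive orders

def krx_monthly_fee(orders: int, trades: int) -> int:
    """Apply the tiered order-to-trade caps described in the article.

    The trigger logic is an assumption for illustration: the fee is
    charged when the ratio exceeds the cap for the order-count band.
    """
    if orders < 20_000:
        return 0  # below the band where the ratio caps apply
    cap = 20 if orders <= 100_000 else 10
    ratio = orders / max(trades, 1)
    return FEE_KRW if ratio > cap else 0
```

On this reading, a firm posting 50,000 orders against 2,000 trades has a ratio of 25:1, above the 20:1 cap for its band, and would incur the KRW 1,000,000 charge.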

MiFID II, the pan-European directive currently being debated and expected to take effect sometime in 2016, will follow the line of the German rules but could go even further, imposing a minimum resting period for orders as well as order-to-trade ratios.


Sand in the wheels

The European approach would make HFT less viable. However, targeting a trading style without identifying the specific problems it has caused would appear to be unfair on HFT, notes Magnus Almqvist, senior business development manager at technology provider SunGard.

“You do find harmful trading patterns that fall under the HFT umbrella just as you do under any trading method,” he says. “When you had floor trading there were unacceptable methods on the trading pit.”

Squires observes that the key to distinguishing between algorithmic trading and HFT lies in addressing the trade-to-execution ratio.

“In my opinion, imposing a trade-to-execution ratio and minimum resting times would be effective ways to keep an orderly market,” he says.
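
A minimum resting time is, mechanically, straightforward for a venue to enforce: reject any cancellation that arrives before the order has rested in the book for the required interval. A minimal sketch, with the 500-millisecond figure chosen purely for illustration (neither Squires nor the draft rules specify an interval):

```python
import time

MIN_REST_SECONDS = 0.5  # illustrative; no specific interval is proposed

class RestingTimeGate:
    """Tracks when orders entered the book and rejects premature cancels."""

    def __init__(self) -> None:
        self._entered_at: dict[str, float] = {}

    def on_new_order(self, order_id: str) -> None:
        self._entered_at[order_id] = time.monotonic()

    def may_cancel(self, order_id: str) -> bool:
        entered = self._entered_at.get(order_id)
        if entered is None:
            return False  # unknown order; reject the cancel
        return time.monotonic() - entered >= MIN_REST_SECONDS
```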


Alternatives to banning HFT should also be considered, says Sébastien Jaouen, head of global trading sales at Etrali. “In a nutshell, with the huge progress we have witnessed in the capture of all communication pieces and the ability to reconcile them with the full audit trails of orders and trades, the regulators have the tools and the power to enforce the regulations and track fraud,” he says. “The technology is an enabler not only for market participants such as HFT firms but also for the regulators.”

Wood notes that some rules are likely to have unintended consequences that could do more harm than good to the orderly functioning of the market, while offering no guarantee of curbing the firms’ operations entirely.

“HFT firms are nimble,” he says. “There is a bit of cat and mouse going on as the regulators will follow behind, but I think there will always be a response in the HFT community; they will still be able to identify profit-making opportunities.”
© BestExecution 2013
