
FX trading focus : Buyside profile : Brigitte Le Bris

Natixis, Brigitte Le Bris

THE FX FACTOR.


Brigitte Le Bris, Head of Currency and Global Fixed Income at Natixis Asset Management, discusses the evolution of FX as an asset class.

How has FX developed as an asset class?

FX volume has increased dramatically in the last few years. According to the latest Bank for International Settlements triennial report, daily trading volume reached $5.3trn in April 2013, compared with $4trn in 2010 and $3.3trn in 2007. Institutional investors and hedge funds now account for 11% of volume each, while corporates account for only 9%.
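Read as a back-of-the-envelope check, the survey figures quoted above imply double-digit growth between each three-year survey. A short sketch using the figures as quoted:

```python
# Rough growth check on the BIS triennial survey figures quoted above
# (daily FX turnover in trillions of USD; surveys are three years apart).
surveys = {2007: 3.3, 2010: 4.0, 2013: 5.3}

years = sorted(surveys)
for a, b in zip(years, years[1:]):
    total = (surveys[b] / surveys[a] - 1) * 100
    annual = ((surveys[b] / surveys[a]) ** (1 / 3) - 1) * 100
    print(f"{a}->{b}: {total:.1f}% total, {annual:.1f}% annualised")
```

The acceleration is visible in the annualised rates: growth between the 2010 and 2013 surveys ran noticeably faster than between 2007 and 2010.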

FX used to be traded mainly by banks, including central banks, and by corporates, and unfortunately few end investors differentiated their domestic investment risk from the associated currency risk. Things are changing, and more and more investors now consider FX a separate asset class.

What have been the drivers?

I think one of the most important things has been the growing trading volume of emerging currencies. In the past, there used to be a separation between G10 and emerging market currencies but that is no longer the case. The market is becoming more global. For example, the Mexican peso is very actively traded and the Chinese renminbi is now one of the top ten most active currencies. This is due to the globalisation of trade, the growth in emerging markets such as China and the move towards diversified portfolios.

Where does it fit into a multi asset portfolio?

The benefit of FX is that it has a low correlation with other asset classes and may help provide diversification. Also, an allocation to FX can be designed according to the level of risk that an institution can tolerate. FX fits perfectly with any multi asset allocation.

What have been the most critical market drivers, macro and micro, this year?

On the macro side, we are going through an economic decoupling between the US on one side and Europe and Japan on the other, illustrated by the diverging monetary policies of their respective central banks. At the same time, inflation remains very, very low. But my main concern is the recent and drastic fall in oil prices. It is becoming a real game changer, capable of totally redesigning the global financial landscape in the coming months.

On the micro side, there has been the conflict between Russia and Ukraine which is still going on as well as the elections that have taken place this year in emerging markets such as Brazil, Turkey, South Africa and India.

How are these trends reshaping the currency marketplace and the way the buyside constructs its portfolios?

I would say that investors are more and more aware of the importance of currency. Last year, the FX markets were very boring because there was not that much volatility. However, this has not been the case this year. The trend that we are seeing is that investing in FX is becoming more global but also much more sophisticated. As a result, an increasing number of institutions are delegating their FX investing to specialist fund managers who trade on a global basis. They are no longer just interested in the US dollar value versus majors but also the US dollar versus emerging currencies.

What impact has regulation had on FX investing?

Regulation is positive in the long term but may prove difficult in the short term. It can be frustrating, but the good thing is that it provides greater transparency. However, the process is slowed down because you now have to sign International Swaps and Derivatives Association (ISDA) agreements with new and different counterparties. Even if the general content is standardised, the new rules mean that more negotiations have to take place. I do think, though, that the process will become more efficient going forward.

How has technology changed the business?

An increasing amount of trading is now done through electronic platforms, which account for around 80% of volumes. In one way they are safer in that they speed up trading and make execution more efficient; in terms of best execution, it is easier to get the best price. Voice trading, though, will not disappear. This is particularly true for big orders, where you still need the relationship with the counterparty.

We, as fund managers, use sales representatives as advisors but keep the execution separate. The brokers’ sales team will sort out the relevant information produced by their research department and inform us about flows that they are seeing while our trading desks will implement the orders as quickly as possible with the best prices. The platforms are very efficient in terms of helping us achieve best execution.

What opportunities and challenges do you see in 2015?

I believe that the strength of the US dollar will stay in place. The main risk is lower economic growth than forecast, but at the moment I do not see anything that should derail the trend. And if oil prices remain low, they should increase US consumers’ purchasing power and hence boost the economy.

The other trend that we are looking at is the weakness of the yen, which could put other Asian currencies, such as the Korean won and Singapore dollar, as well as some emerging market currencies, at risk. However, there will be opportunities, because volatility has come back into the market, which is a very good thing. In these types of markets, though, investors will have to be selective.

BIOGRAPHY

Brigitte Le Bris joined Natixis Asset Management in September 2010 as head of currency and global fixed income. Prior to this she held the same position at Societe Generale Asset Management, now Amundi. Le Bris began her career in 1987 at Credit Lyonnais London before becoming a trader at BBL in Paris in 1989. In 1993, she joined CDC Asset Management, where she successively held money market and fixed income portfolio management positions. Le Bris graduated as a civil engineer from the ESTP school and holds a Master’s degree in finance from the University of Paris Panthéon-Sorbonne.

© BestExecution 2015


Profile : Ronan Ryan : IEX

IEX, Ronan Ryan

LEVELLING THE PLAYING FIELD.


Ronan Ryan, chief strategy officer, discusses how IEX is creating a fairer trading venue.

There has been a lot in the press about the company’s view of high frequency traders. The reports differ, so what is the real perspective?

One of the problems in the industry is that people like black and white answers, but it is not always possible to give them. From day one our proposition has been to give fair access to all participants. As with any strategy, there are good high frequency trading strategies and some predatory ones. Some firms that could be labelled as HFT look like traditional market makers, trading in a consistent way on both sides of the market and providing healthy liquidity. We are pro-technology and our aim is to create a fairer market. Although the press might give the perception that we are anti-HFT, we are not, and we do not want to ban HFT from the market. In fact we have met with multiple high-speed trading firms; some have told us that we do not fit their strategy, while others are interested in IEX.

What were the drivers behind creating IEX?

The founders of the company are from RBC, and one afternoon we were in a conference room assessing our technology product THOR, which we launched in 2009. It normalises the amount of time it takes for data to travel to all the exchanges, neutralising the speed advantage of predatory strategies. It was a real differentiator for RBC: in 2011 we rose from 19th to number one in the Greenwich Associates rankings for electronic trading customer service in less than a year.

At the time, THOR was only used by RBC clients and we thought it would be beneficial to offer the technology to others. The idea was to build a utility that created a level playing field for trading firms of all types and not just for the fast or the slow. If you look across the whole exchange universe we are also the only venue that is owned and funded by the buyside. There are around 20 investors including MassMutual Ventures, Massachusetts Mutual Life Insurance Company and Franklin Resources as well as private investment firms such as Cleveland Capital Management and TDF Ventures. We have also raised funds from venture capital firms such as Spark Capital and Bain Capital Ventures.


How did you decide what type of venue you were going to create?

We sat in a room with some people from HFT firms, exchanges and people who had left RBC with us. We looked at what the key components of success would be, and decided not to co-locate as other exchanges and venues have. Virtually all of the 11 exchanges and 43 dark pools in the US are in four data centres in New Jersey. We decided to put our matching engine in one of those data centres and create what the press has called a speed bump. This delays incoming orders by 350 millionths of a second, or a thousandth of the time it takes to blink, but more importantly it gives us the time we need to refresh our view of the market. This means that high-speed traders will not be able to trade at IEX with data we don’t already know.
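The arithmetic behind the "thousandth of a blink" comparison is easy to verify. The blink duration and fibre propagation speed below are rough textbook figures, not IEX's numbers:

```python
# Putting the 350-microsecond speed bump in context.
speed_bump_s = 350e-6       # 350 millionths of a second
blink_s = 0.35              # a typical eye blink, roughly 300-400 ms (assumption)
fibre_km_per_s = 200_000    # light in optical fibre, roughly 2/3 of c (assumption)

print(f"a blink is {blink_s / speed_bump_s:.0f}x longer than the speed bump")
print(f"light in fibre covers ~{fibre_km_per_s * speed_bump_s:.0f} km "
      f"during one speed bump")
```

So the delay is imperceptible to a human but long enough, at network speeds, for IEX to refresh its view of the consolidated market.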


What is your relationship with the sellside? I’ve read leading brokers have signed up with you.

Although we are owned by the buyside, they do not trade directly on the platform. We only allow registered broker dealers to trade on the platform and 13 months into operation the relationship has become stronger. In November, Goldman Sachs, JP Morgan, Morgan Stanley, Bank of America, Citigroup, Deutsche Bank, UBS and Barclays were among at least 17 brokers that have adapted their algorithms to allow automatic routing to us. Altogether we have 135 connected as participants and their adoption has been critical to our success. We had no intention of pitting the sell and buyside against each other and we realised that people may have felt threatened if we circumvented the brokers. Our first year has gone by very quickly and we would not have gotten here without the support from both sides.


I hear you just captured 1% of equity market share – what is the next milestone?

In November we captured 1% of US ADV (Average Daily Volume) for the first time, although we have gone above that number recently. This compared to an average of 0.764% in October and 0.543% in May. That might seem small to some people, but according to some industry sources, we have quickly grown to be one of the largest of the 40+ dark pools.

Our biggest goal over the next 12 months is to achieve exchange status. We recently raised $75m from investors and have shared a draft Form 1 with the Securities and Exchange Commission. We will roll out functionality throughout 2015 and have already announced that we plan to publish a displayed quote through our TOPS data feed in the first quarter of 2015. This should help market participants identify available interest in our market. The quotes shown will be non-NMS protected, meaning they will not be included in determining the National Best Bid and Offer but they will be accessible to all our participants.

As a new exchange, we have the chance to create a new model that is more transparent and accountable to both sellside and buyside stakeholders, for example in determining how exchanges should charge for market data.

What are the challenges and opportunities for the company?

In terms of challenges, we need to stay focused on what we are doing. The model has proven that it works and we are working towards getting exchange status. Trading venues typically do not talk to the buyside and we want to continue to learn how to interact with them and how we can improve the venue.

BIOGRAPHY
Ronan Ryan is chief strategy officer at IEX and has over nine years of experience in the financial services industry and over 17 years of experience in networking infrastructure. Ryan was previously head of electronic trading strategy at RBC Capital Markets, where he used his prior experience in network, hardware and co-location technology to generate client-facing solutions. Before RBC, he was head of financial services development at Switch and Data and head of DMA and co-location solutions at BT Radianz.

© BestExecution 2015 


Technology : Artificial intelligence : Heather McKenzie


THE NEW REALM OF ARTIFICIAL INTELLIGENCE


Heather McKenzie asks whether this is the end of the world as we know it.

While not yet widespread in the financial markets, the use of artificial intelligence (AI), or machine learning, technology has recently attracted attention. One of the most prominent voices of caution has been the renowned British scientist Stephen Hawking, who has raised the possibility that financial markets could run out of control if taken over by AI-type programmes.

In a newspaper article in May, Hawking, director of research at the Department of Applied Mathematics and Theoretical Physics at Cambridge, said that while the potential benefits of AI are huge, the technology could eventually reach a stage where it outsmarts the financial markets, ‘out-invents’ human researchers and ‘out-manipulates’ human leaders. “Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all,” he wrote.

He is not alone. In October, Elon Musk, an investor and technology company chief executive, also voiced his concerns, calling AI a serious threat to the survival of the human race. At a symposium held by the Massachusetts Institute of Technology, Musk said AI was humanity’s “biggest existential threat” and called for regulatory oversight at the national and international level.

There are differences between AI and machine learning. The former is typically defined as an academic field of study that aims to create intelligence via intelligent agents: systems that perceive their environment and take actions that maximise their chances of success. The latter deals with the development of algorithms that can learn from data: they build a model from example inputs and use that model to make predictions or decisions, rather than following explicitly programmed instructions.
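The distinction can be made concrete with a toy example: rather than hard-coding the rule relating x to y, the snippet below estimates one from data (an ordinary least-squares line fit) and then uses the fitted model to predict. This is an illustrative sketch, not any particular firm's method:

```python
# "Learning from data": estimate a linear model y = slope*x + intercept
# from examples, instead of programming the relationship explicitly.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]   # roughly y = 2x, with noise

n = len(xs)
mean_x, mean_y = sum(xs) / n, sum(ys) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

print(f"learned model: y = {slope:.2f}x + {intercept:.2f}")
print(f"prediction for x = 6: {intercept + slope * 6:.1f}")
```

The "learning" here is just the fit; more elaborate systems differ in scale and model class, not in this basic data-in, model-out shape.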

In the mid-2000s, two academics, Robert Schumaker of Central Connecticut State University and Hsinchun Chen of the University of Arizona, developed the Arizona Financial Text (AZFinText) system, a quantitative textual financial prediction system. It is an AI-based programme that ingests large quantities of financial news stories along with minute-by-minute stock price data, and uses the former to learn how to predict the latter. The programme then buys or shorts every stock it believes will move more than 1% from its current price in the following 20 minutes.
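The decision rule described above can be sketched as follows. The tickers, prices and forecasts are invented stand-ins; the real AZFinText predictor is a statistical model trained on news text:

```python
# Threshold trading rule in the style of AZFinText: act only on stocks
# whose predicted 20-minute move exceeds 1% of the current price.
THRESHOLD = 0.01  # 1% of current price

def decide(current_price, predicted_price):
    """Return BUY, SHORT or HOLD based on the predicted relative move."""
    move = (predicted_price - current_price) / current_price
    if move > THRESHOLD:
        return "BUY"
    if move < -THRESHOLD:
        return "SHORT"
    return "HOLD"

# Hypothetical (current price, model's 20-minute forecast) pairs.
forecasts = {"AAA": (100.0, 101.5), "BBB": (50.0, 49.2), "CCC": (20.0, 20.1)}
for ticker, (now, in_20_min) in forecasts.items():
    print(ticker, decide(now, in_20_min))
```

Everything interesting lives in producing the forecast; the trading rule itself is deliberately simple.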

At present, AI-type technologies seem to be of most interest to hedge funds. DE Shaw and Renaissance Technologies both use AI techniques to improve their investment and risk models.

Two Sigma Investments, a New York City-based hedge fund, uses a variety of technologies, including high frequency trading, AI, machine learning and distributed computing, for its $23bn of funds. Founded in 2001, the firm aims to develop technological innovations that intelligently analyse data. Its modelling teams work with vast data sets – from corporate earnings to news and weather reports. The fund seeks to synthesise quantitative and qualitative information sources to generate new scientific insights with systematic precision. Investment strategies are based on this research, which Two Sigma says generates alpha and manages risk.

The power of big data

Many of today’s AI-type initiatives are based on data, or ‘big data’. There is renewed interest at sellside firms in quantitative skills, with firms using as much information about their clients as they can in order to inform how they position products and how they price them. Data-centric initiatives are being implemented not just for pricing and risk assessments, but also to analyse client behaviour. This is similar to how online firms Google and Amazon use data to target advertisements.

In London recently, academics from the Warwick Business School demonstrated systems that use big data to detect early warning signs of stock market moves. Google, Wikipedia and data from the digital world can be used to measure and anticipate human behaviour, according to the researchers.

US-based Rebellion Research is another firm tapping the power of AI technology. It uses the technology to process an extremely diverse set of information, basing its analysis on many hand-selected macro, fundamental, technical and more traditional factors such as growth, value and momentum. The AI uses its performance predictions, along with knowledge of the volatility and interrelationships among stocks, to create a portfolio that balances risk and expected return. This analysis, which would defeat even the largest team of human analysts, is performed “with scientific rigour transcending the pitfalls of human emotion”, says the company.

Bayesian statistics (a subset of the field of statistics in which evidence about the true state of the world is expressed in terms of degrees of belief, or Bayesian probabilities) serve as the backbone of Rebellion’s AI-based investment software. The approach provides a flexible framework that enables the firm to automatically integrate the new data available each day with prior market knowledge in order to predict stock performance. Alexander Fleiss, chief executive of Rebellion Research, describes AI as “the most exciting technology on this planet”, with the potential to save lives, make money and create a much more utopian society.
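A Beta-Binomial update is the simplest concrete instance of the Bayesian machinery described here: a prior belief is mechanically combined with each day's new evidence. Rebellion's actual models are proprietary; this sketch only shows the updating principle:

```python
# Bayesian updating with a Beta prior on the probability a stock closes up.
# Beta(alpha, beta) posterior mean is alpha / (alpha + beta).
alpha, beta = 2.0, 2.0             # prior: up and down days roughly equally likely

for up_day in [1, 1, 0, 1, 1, 1]:  # six new daily observations (1 = closed up)
    alpha += up_day                # each up-day adds evidence for "up"
    beta += 1 - up_day             # each down-day adds evidence for "down"

print(f"posterior mean P(up) = {alpha / (alpha + beta):.2f}")
```

The point is that the prior is never discarded: each day's data shifts the belief gradually, which is what lets such systems "integrate the new data available each day with prior market knowledge".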

The firm employs its proprietary AI-based machine learning powered system to operate two strategies for clients: AI Global Equity Strategy and AI Absolute Return Strategy. The former has been managing money for partners and clients since 2007. Its strategy seeks greater return than the US stock market over time and the algorithm learns as the markets change, automatically adapting to new information. The strategy purchases stocks of various sizes for clients and uses a combination of growth, value and momentum investment styles.

The AI Absolute Return Strategy has managed money for Rebellion’s founders and clients since 2012. It targets low correlation to the US stock market and like the Global Equity Strategy, the algorithm learns as the markets change, automatically adapting to new information. The AI reviews changes related to the economies of 44 countries, as well as stocks, bonds, commodities and currencies simultaneously.

“When we were first developing the system we found it would look at anything to make estimates – there was a lot of white noise. We spent two years refining it,” says Fleiss.

Developing machine learning systems is a very complex process, which requires the identification of the ‘right’ data and removal of ‘bad’ data. However the effort spent on developing the system has been worthwhile as Fleiss says the technology enables the firm to look at data from a fundamental, earnings-based view combined with a macroeconomic view. “Our technology enabled us to call the Greek debt and euro crisis very early as we had very good economic foresight,” says Fleiss. “We can move quickly and stay ahead of the pack.”

AI-type technologies are neither a ‘silver bullet’ nor a cause for concern in the financial markets, says Bradley Wood, a partner at London-based GreySpark Partners. “AI is quite a fuzzy term; it needs to be more precise,” he says. “There are heuristic algorithms, machine learning and advanced mathematics, or AI. Deciding at what point something is AI rather than a heuristic algorithm is not trivial. A vanilla algorithm is relatively deterministic, as opposed to something that has ‘human intelligence’.”

Wood is not convinced that there are revolutionary changes based on the use of AI in the financial markets. Rather, he thinks any change will be evolutionary and will be based on cutting edge mathematics rather than AI. The area most likely to attract development time is that of conduct risk. “AI and machine learning technology is increasingly being applied to monitoring what is going on inside a bank, particularly in the light of the Libor fixing scandal and discovery of rogue traders. There is a lot of interest in bringing clever technology to bear in order to identify suspicious behaviour.”

© BestExecution 2015


 

Post-trade : Clearing : Mary Bogan


THE INTEROPERABILITY DEBATE RAGES ON.

Mary Bogan assesses whether central clearing will be dominated by a few large players or bow to competition.

With the European Securities and Markets Authority (ESMA) rumoured to be returning to market before Christmas to ask for final feedback on MiFID II, spare a thought for the senior managers of Europe’s major clearing houses this holiday season. The legislation, which promises to crack open the post-trade market in the same way MiFID I freed up trading, is hotly contested, and the battle between those supporting and opposing an unprecedented liberalisation of the clearing market is being hard fought.

What’s at stake is both survival in an increasingly cut-throat market and the opportunity to grab a share of Europe’s new OTC derivatives clearing business which, according to Morgan Stanley, will be worth an extra €160m in revenues.

“Clearing has become an ever changing landscape and we have reached a key inflection point”, says Alex Foster, global head of strategy and business development for financial services at BT. “Regulators want to increase competition and transparency in clearing but there is some friction between those aims and the desire to reduce risk and create greater stability post the financial crisis. What we’ve got now is a balancing act between what regulators and politicians, speaking for the man on the street, want and what different market players want.”

At the heart of the current fierce debate about the future shape of the clearing market is the interpretation of the MiFID II concept of open access. “What open access means is that clearing houses will now have to offer fair and open access to competitive venues,” says Steve Grob, director of strategy, Fidessa. “They will have to move from a vertical to horizontal model of clearing and decouple from their parent exchanges to become businesses in their own right.”

However, how far regulators should go in opening up the clearing business of vertical silos such as Deutsche Börse, CME and Intercontinental Exchange to competition is proving highly controversial.

Breaking down barriers

Clamouring to break down the doors of the silos and push the competition envelope further are the likes of EuroCCP, LCH Clearnet and SIX x-clear who have fought to make interoperability an established feature of the clearing landscape. Without it, says Tomas Kindler, head of clearing at SIX Securities Services, open access has no teeth.

“Open access without interoperability changes nothing in the clearing layer. That really comes with interoperability, where two CCPs are linked to each other and participants have a true choice where they can opt for a preferred CCP, and it doesn’t matter where the counterparty clears. That’s what we’ve seen in cash equities over the last couple of years. Clearing has become more efficient because participants are able to consolidate flows in a single CCP with synergies on margin, collateral and so on.”

Diana Chan, chief executive of EuroCCP, agrees that open access without interoperability is a toothless tiger. “The fundamental economics of why people want to choose their CCP is that by concentrating business in one CCP they can benefit from portfolio margining, making contributions to a single default fund instead of many, and a volume discount on fees. It is to realise the benefits of interoperability that we need open access. One cannot function without the other.”
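The portfolio-margining benefit Chan describes can be illustrated with toy numbers. The 10% margin rate and position sizes are assumptions, and real CCP margin models are far more sophisticated than simple netting of notionals:

```python
# Two offsetting positions: margined on gross exposure if split across two
# CCPs, but on the much smaller net exposure if consolidated at one CCP.
margin_rate = 0.10                       # 10% initial margin (assumption)
long_leg, short_leg = 10_000_000, 8_000_000

split = margin_rate * (long_leg + short_leg)            # two CCPs, no netting
consolidated = margin_rate * abs(long_leg - short_leg)  # one CCP, net exposure

print(f"margin across two CCPs: {split:,.0f}")
print(f"margin at a single CCP: {consolidated:,.0f}")
```

Even in this crude form, the collateral saving from consolidating flows in one CCP is an order of magnitude, which is why participants care about CCP choice.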

Since mid-2011 when regulators first approved interoperable arrangements between CCPs, about half of trading venues, including all the major pan-European MTFs such as BATS Chi-X and Turquoise have introduced a choice of CCP. In addition a handful of primary exchanges including the Swiss exchange SIX and, more recently, LSE and Oslo Bors have embraced interoperability. When LSE goes live with interoperable arrangements at the beginning of next year, some 57% of European trades will be cleared concurrently by the three interoperating CCPs, according to EuroCCP.

While a greater choice in clearing appears popular with market participants and the process for facilitating interoperability in cash equities has so far proved problem-free, resistance to an open market is still holding strong in some exchanges.

“With the exception of Euronext, all the main markets that remain closed, such as Germany, Italy or Spain, are vertically integrated. From a pure economic perspective they have no incentive to open up. So the equities market is seen as the first line of defence for the lucrative derivatives market. That’s the main area they want to protect. Keeping the equities business closed has to be seen in that context,” says Kindler.

At present interoperability is limited to cash equities and, to a degree, cash fixed income securities, but whether regulators will extend the concept to products such as OTC derivatives remains to be seen. According to Philip Simons, head of OTC derivatives business at Deutsche Börse’s Eurex Clearing, the move would be a wrong turn.

“Regulators should absolutely not allow interoperability for derivatives,” he told the Mondo Visione Exchange Forum recently. “It creates systemic risk. If one clearing house goes down it would result in contagion as it would bring down all the others.”

However, Chan thinks interoperability in some parts of the derivatives clearing space is feasible. “If a CCP can manage market risk from its own members by collecting margin, then it can manage market risk from an interoperating CCP by collecting margin from that CCP,” she says. “To prevent contagion, CCPs should not be subject to the loss sharing rules of an interoperating CCP, even if this means collecting additional contributions from participants. For some institutions, the gains to be made from being able to use a CCP of choice may outweigh these additional costs.”

In addition, an interoperable clearing arrangement for derivatives already exists in Europe, albeit in a limited way. The Oslo Bors and the LSE’s Turquoise platform operate a joint order book for Norwegian equity derivatives and flow from Oslo Bors is cleared by Oslo Clearing while flow from Turquoise is cleared by LCH.Clearnet under an interoperable arrangement. The key requirement is the fungibility between contracts, argue proponents, but conceptually all standardised products that could be made fungible could qualify for interoperable clearing.

It is not just CCPs with a vested interest in warding off increased competition, though, that are sounding a cautionary note about its possible negative effects on the market. Observers like Steve Grob also fear that the move to open access and the centralised clearing of OTC products is making the financial world a less safe place.

“As CCPs start to uncouple from their parent exchanges and compete more directly with each other, an obvious step to differentiate themselves is to offset margin requirements between equivalent but not fungible products, especially given the opportunity cost of capital now. This could include OTC products and exchange-traded ones, say a Euroswap and a Bund. While this is OK in principle, ‘equivalent’ is very different from ‘same’, and a race to the bottom in this type of competitive activity will increase systemic risk rather than reduce it.”

As clearers try to build critical mass in the derivatives market, the introduction of long fee holidays and fee caps could suggest a race to the bottom has already begun and that derivatives clearing could well follow equities into a cut-throat space.

Reconciling greater competition in clearing with preserving the financial integrity of CCPs has therefore become a major focus of regulatory debate this year. Tougher international risk management standards have already been imposed on CCPs but now attention has turned to preventing and resolving a CCP failure. In October, the Bank for International Settlements and the International Organisation of Securities Commissions suggested that CCPs be allowed to call for additional resources from members in a crisis as well as introduce margin haircuts and terminate or ‘tear up’ contracts.

Members though fear such moves would subject them to unquantifiable exposures and JP Morgan, for example, recently proposed that CCPs be required to contribute more than 10% of member contributions into a guarantee fund to give them more “skin in the game” and curb reckless risk-taking. Another option being considered by European regulators is to introduce another layer of total loss absorbing capacity (TLAC) for CCPs, similar to banks, stipulating the minimum amount of capital and liabilities that can be written off in the event of a crisis.

Recovery and resolution is now the main regulatory challenge, Steven Maijoor, chair of ESMA, said recently. “It is of utmost importance that we speed up the process of having a recovery and resolution framework in place. With the move towards central clearing, CCPs are becoming more and more systemically relevant. Not having the recovery and resolution framework in place now is like letting ships leave for their maiden trips without any life boats on board.”

What shape that framework takes will play a key part in determining whether clearing remains a game dominated by a few titans or whether a new era of competition and innovation finally dawns.

© BestExecution 2015


Market opinion : Best execution MiFID II : Anthony Kirby

Ernst&Young, A. Kirby

STEPPING UP TO THE PLATE.


Dr Anthony Kirby looks at how the game is changing for order-driven and RFQ markets.

The topic of best execution is interesting because it represents the bridge between innovation in the markets and the regulatory conduct of business (CoB) rules which are designed to protect the investor. MiFID I, which took effect in November 2007, introduced a multi-factor approach to best execution, in contrast with earlier definitions in common use in other jurisdictions such as the US, which were based merely on best price.

Since then, the financial markets have changed substantially as a result of the financial crisis, the prudential and conduct regulatory responses, and technological innovation. The number of trading venues has increased by 44% and the number of registered MTFs has jumped by 78%, while the number of systematic internalisers (SI) has held steady at around a dozen firms. The flow of equities classified as high frequency traded (HFT) increased more than six-fold, as did the amount of equities flow placed in dark pools. Both the market impact borne by investors and the number of algorithms have nearly tripled in the meantime.

MiFID I certainly delivered more choice to the securities industry in Europe, but clearly at the expense of fragmenting order flow. Most buyside heads of trading understand that the key challenge when dealing in size has been the market impact associated with signalling risk while trying to locate quality liquidity. The collapse of block trading (the result of narrowing capital commitment given demands for quality capital elsewhere), the abuse of dark pools, the proliferation of HFT market-makers (powered by lower-latency architectures), and the re-consolidation of market venues have all frustrated firms’ attempts to evidence best execution procedures to their clients, particularly in non-equities.

MiFID I article 21 obliged investment firms to take all reasonable steps to execute orders on terms most favourable to the client, and to obtain the most favourable possible result for their clients taking into account price, costs, speed, likelihood of execution and settlement, size, nature or any other consideration relevant to the execution of the order, whether trading occurred on- or off-exchange. Many firms since 2010 have adopted a ‘MiFID I.5’ approach revising their best execution policies with regard to the execution arrangements surrounding types of ETFs such as synthetic ETFs, and particularly when evidencing best execution with regard to an evolving broader market for non-equities.

It is not surprising that regulators were keen to press for closer scrutiny of best execution, with qualification for certain asset classes, given the large behavioural shifts needed for voice trading and the trade-offs between transparency and liquidity under the new MiFID II Level I text issued in April this year. The form of the legal text is tougher. Article 27(1) states, for example, that ‘Member States shall require that investment firms take all sufficient steps to obtain, when executing orders, the best possible result for their clients taking into account price, costs, speed, likelihood of execution and settlement, size, nature or any other consideration relevant to the execution of the order.’

A big ask

This is a challenging ask in a market as fragmented as the one described above, and it’s a considerable game-changer for quote-driven (RFQ) markets such as fixed income, currencies and commodities, even if some heads of trading have still to realise that. These markets feature extensive voice-trading and indications rather than trade prints, and they do not benefit from the use of transaction cost analysis (TCA) tools. The best execution requirements, innocuous in themselves, will be introduced to the bond and derivative markets (comprising asset classes as diverse as government, corporate, high-yield and emerging market debt, asset-backed securities, and OTC-traded derivatives – in both liquid and illiquid form, depending on the instrument, maturity and market conditions).

The reference to ‘all sufficient steps’ represents a step-up from the original MiFID I text and will formalise the factor-based approach (‘price’, ‘costs’, ‘speed’, ‘likelihood of execution’) – see Figure 1. When coupled with the European Commission’s four-part ‘legitimate reliance’ tests* applied when firms are ‘executing orders on behalf of clients’ (as opposed to ‘arranging deals in investments’), or with the requirement to evidence costs (‘total consideration’), it is clear that the MiFID II measures will be significantly more burdensome for market intermediaries.

MiFID-II-Best-Ex-Schematic

Requirements for greater disclosure by dealers might also lead to a tightening in the number of firms prepared to commit their capital to hold inventory or make prices. Firms have also expressed fears of complications arising from the interplay of regional and local (including thematic) measures (e.g. the FCA’s Thematic Review in the UK, new HFT laws in Germany and France, and caps on order-to-trade ratios introduced in Germany). The liquidity characteristics of the instruments (under ‘normal’/‘stressed’ market conditions) are an additional consideration. However, many small- or mid-sized buy- and sell-side firms are not where they need to be. On the buyside, many haven’t linked their PMS/OMS/TCA systems together, and therefore place undue reliance on their brokers to effect best execution.

MiFID II’s Level I text (accompanied by the ESMA/2014/548&9 papers) describes “WHAT” needs to be done to tighten up best execution, while the FCA’s TR14-13 describes ‘good’ and ‘poor’ practices to illustrate “HOW” this might be achieved. For example, the focus on increased disclosure of costs under MiFID II for investment firms acting on behalf of retail-classified clients seems relatively open-ended. If execution is performed for retail clients, due consideration should take account of explicit and implicit costs, e.g. expenses (execution venue fees, regulatory levies, taxes, clearing/settlement fees, third-party fees) and commissions, for each eligible execution venue featured in the firm’s best execution policy.
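
For retail flow, the ‘total consideration’ test reduces to simple arithmetic: the execution price plus every explicit cost attached to a venue. A minimal sketch in Python (the venue names, prices and fee figures are invented for illustration, not drawn from any firm’s execution policy):

```python
# Rank eligible venues for a retail buy order by "total consideration":
# execution price plus all explicit costs (venue fee, clearing/settlement,
# commission). All figures below are hypothetical.

def total_consideration(price, shares, fees):
    """Total cash outlay for a buy: price * shares plus summed explicit costs."""
    return price * shares + sum(fees.values())

venues = {
    "Venue A": {"price": 10.02, "fees": {"venue": 1.20, "clearing": 0.80, "commission": 5.00}},
    "Venue B": {"price": 10.01, "fees": {"venue": 2.50, "clearing": 1.10, "commission": 8.00}},
}

shares = 1_000
ranked = sorted(
    venues.items(),
    key=lambda kv: total_consideration(kv[1]["price"], shares, kv[1]["fees"]),
)
best_venue, best = ranked[0]
print(best_venue, round(total_consideration(best["price"], shares, best["fees"]), 2))
```

Note that the venue with the cheapest explicit fees does not win: over 1,000 shares, Venue B’s one-cent price improvement outweighs its higher charges, which is precisely why the rules insist on total consideration rather than any single factor.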

Some politicians familiar with the FMCG (fast-moving consumer goods) sector expect the financial industry to inform retail clients in the manner of price comparison websites, on the presumption that more information is better. Given that best execution falls under the scope of a directive (and not a regulation), this author would argue that the industry would benefit from a consistent codification of expectations in each Member State. Moving forward, it is clear that firms will need to migrate beyond today’s simple workflows and devise a ‘Big Data’ approach to data mining.

If this is the direction of travel, then the areas for improvement will include:

  • Increasing both the precision and the availability of data relating to IOIs and RFQ/S/M
  • Increasing the granularity of data drawn from the ‘broader markets’ – to replace the screenscrapes in frequent use today
  • Increasing the level of forensics – unstructured data such as voice, e-mail and text runs create problems of evidencing.

This is not to say that there won’t be significant benefits from the improved clarity of the MiFID II best execution regime. Far from it. The immediate benefit will be greater transparency for the buyside and better accountability to end investors, which would ordinarily lead to greater confidence when exercising trading decisions and thence a greater trading velocity – and, in turn, better liquidity.

Future innovations may well arise out of MiFID II, spanning fixed income/FX TCA, RFQ streaming (the equivalent of smart order routing), fixed income algorithms and the ‘all-to-all’ liquidity approaches that we have seen in the fixed income markets of late. Firms may decide to innovate by producing tools themselves, monetise through collaborative ventures, or rely on market-led/FMI-led approaches. MiFID I showed form in all of these areas – who is to say that MiFID II cannot create opportunities for venue and vendor providers alike by creating more flow-business opportunities?

Any views expressed are the author’s own and should be used for information and reference purposes only and not intended to provide legal, tax, accounting, investment, financial or other professional advice on any matter.

* The Commission Opinion set out a list of the four-fold test to help determine whether a client is legitimately relying on the firm:

• Which party initiates the transaction

• Questions of market practice and the existence of a convention to ‘shop around’

• The relative levels of price transparency within a market

• The information provided by the firm and any agreement reached

© BestExecution 2015

Regulation : Libor reform : Louise Rowland

Be27_Libor_500x652

LIBOR: STILL FIT FOR PURPOSE?

Louise Rowland looks at life after the scandals and the debate over the new benchmarks.

When Chancellor George Osborne announced that the ceramic poppies on display at the Tower of London would be embarking on a nationwide tour, courtesy of £500,000 of the Libor fines collected from banks, “from those who’ve demonstrated the very worst of values to support those who have shown the very best”, he captured the public mood perfectly.

Six billion dollars in global fines, seventeen criminal prosecutions… the Libor rate-rigging scandal plumbed the depths of banking misconduct. Since 2012, extensive reforms have taken place to rehabilitate the benchmark. Libor looks very different today from the rate that proved so easy to manipulate at the heart of the banking crisis. So it should: it remains one of the most important global reference rates, used to set some $300 trillion of financial transactions, including corporate loans, mortgages, credit cards and savings accounts.

At the heart of the problem was the benchmark’s reliance on offered rates rather than actual trading data. The incentives and rewards for manipulating the market were enormous. Thanks to the collusion in the infamous chatrooms, investors were being massively short-changed: pension funds could easily lose millions of pounds every day.

Clearing up the mess

Regulators worldwide demanded a complete overhaul of Libor, with the introduction of far greater governance and transparency. The UK has been at the forefront of these efforts. The Wheatley Review’s ten-point plan in 2012 ensured that the publication of rates submitted by the 18 participating banks was embargoed for three months, and that the number of Libor fixings published was cut from 150 to 35 a day. A new administrator, ICE Benchmark Administration (IBA), also took over from the British Bankers’ Association early in 2014.

IBA is keen to anchor the Libor benchmark as far as possible in actual market transactions and plans a more standard, prescriptive formula for rate submissions. Banks are likely to be asked for more of their internal transaction data, and to highlight clearly which of their submissions include a measure of human judgement. A newly developed system is also being tested to collect and validate the rates submitted.

Libor has now become the benchmark for benchmarks. The legislation covering it is being extended to other major reference rates: FX, the gold and silver markets, commodities and oil. The Treasury and the Bank of England recently launched a Critical Benchmark Consultation to canvass market views, and a Fair and Effective Markets Review is also underway, likely to result in new legislation by the summer of 2015.

There is also intense activity at the global level, with the International Organization of Securities Commissions (IOSCO) issuing guidelines for financial benchmarks in June 2013 containing 19 key recommendations to introduce greater reliability and oversight.

Time for a change?

Libor isn’t out of the woods yet, however. A vocal lobby, led by Governor Jerome Powell of the Federal Reserve, is pushing for a transition towards alternative benchmarks to reduce what they see as the market’s over-reliance on Libor. The Financial Stability Board, working on behalf of the G20, is calling for these new benchmarks to be introduced as early as 2016 – a timeline that many in the market see as somewhat ambitious.

The Fed hosted a meeting in mid-November with major market participants to discuss developing risk-free or nearly-risk free reference rate alternatives, based entirely on verifiable market transactions. The Fed is keen for the process to be market-driven, rather than imposing benchmarks on the industry. The FSB believes the introduction of this kind of rate would encourage market choice and reduce systemic risk and is calling for a multiple-rate approach over the next two years, combining survey-based rates and objective data.

The front-runners for these risk-free or nearly risk-free rates are US Treasury rates, the general collateral repo rate, and overnight index swap or compounded overnight interest rates. The FSB cites EONIA, the euro overnight index average, as an example of a rate that is viable, actively used and nearly risk free.
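
A compounded overnight rate of the kind mentioned is produced by chaining each day’s fixing geometrically over the period and then annualising. A minimal sketch of the arithmetic, assuming an ACT/360 money-market day count and invented fixings (this illustrates the convention, not any administrator’s official methodology):

```python
# Chain overnight fixings into a compounded period rate, then annualise.
# ACT/360 day count assumed; the fixings below are invented.

def compounded_rate(fixings, basis=360):
    """fixings: list of (rate, days) pairs, e.g. (0.0015, 3) over a weekend.
    Returns the annualised compounded rate for the whole period."""
    growth, total_days = 1.0, 0
    for rate, days in fixings:
        growth *= 1 + rate * days / basis   # geometric chaining of daily growth
        total_days += days
    return (growth - 1) * basis / total_days

# One hypothetical week: Mon-Thu overnight, Friday's fixing spans the weekend
week = [(0.0015, 1), (0.0016, 1), (0.0015, 1), (0.0014, 1), (0.0016, 3)]
print(f"compounded weekly rate, annualised: {compounded_rate(week):.4%}")
```

With a constant fixing the result collapses back to (almost exactly) that rate, which is a quick sanity check on the formula.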

Everything is still very much up for discussion, believes one market participant. “People are still bouncing around a lot of different ideas. The intellectual discussion continues but, in order to explore or ratify a particular alternative, there has to be substantive agreement and an ongoing and effective global regulatory effort. The whole thing is only going to move forward incrementally.”

While the industry is generally happy to explore alternatives, the first priority is a newly strengthened Libor. This would take the form of a hybrid model using offered submissions while also referencing transaction data: an interim arrangement, in other words, before any major transition takes place.

“It would be impossible to change Libor overnight,” says David Clark, chairman of the WMBA. “I foresee that it will take a good one to two years if there is to be a change in the way Libor is calculated. A consensus would need to be reached by borrowers, who want to know exactly what their borrowing costs are. People trading in derivatives, regulators and supervisors are all stakeholders in this, so we have to arrive at a solution on which everyone agrees and which is workable for them all. If this can’t be achieved then there may be two different rates.”

Herculean challenge

Moving en masse to a new rate would present a phenomenal logistical challenge, requiring a highly co-ordinated approach. The Market Participants Group in the US warned in a report in March of possible market disruption, loss of liquidity and valuation problems if the transition to a new system were not extremely well co-ordinated. “The complexities and risks associated with key benchmark transitions are difficult to overstate.” Who would be the first to jump across? What would happen to existing contracts? What costs would data submitters incur? How would enough banks be encouraged to move across at the right time?

“Legally, it would be reinventing the wheel,” says Jake Green, Senior Associate at law firm Ashurst. “People are comfortable with Libor because it’s always been there. The work on benchmarking reform and governance has restored confidence and a large portion of the industry is on board with it. So much work has gone into getting Libor right and the cost of moving might outweigh the benefits.”

Whether or not Libor will be replaced is the $64,000 question, says David Clark. “Six months ago, I’d have said there would not be a replacement. I’d qualify it now and say that there is probably room for two benchmarks in the market, one for funding costs and one for the evaluation of derivative and off-balance sheet positions. The market would prefer one rate but it’s quite possible to have two. It probably doesn’t matter if one’s a measured transaction rate, rather than an offered rate, because so many contracts are written against Libor. What corporations want to know, above all, is how transactions are being priced.”

Clearly a balance needs to be struck if the life is not to be crushed out of the market. Compliance is already very costly for data submitters and many banks are increasingly concerned about litigation or regulatory risk. Some have set up internal task forces to scrutinise their benchmark submissions and have even exited various benchmarks as a result. Yet for investors, reference rates are a vital yardstick against which performance can be measured. For firms on the buyside looking for the best return for their clients, they are an essential part of the whole process. Mass exits from benchmarks would make them far less liquid and reliable: unintended consequences the regulators certainly wouldn’t have had in mind when they embarked on their radical programme of reform.


Market opinion : Crowdfunding : Jannah Patchay

Jannah-Patchay

NEW KID ON THE BLOCK.

Jannah Patchay of Agora Global Consultants looks at why crowdfunding is capturing the UK government and investors’ imagination.


Crowdfunding has gone mainstream. Even the UK government is on board, investing directly in small-to-medium enterprises (SMEs) via platforms such as Crowdcube and Funding Circle. Market commentators laud it as the antidote to SME funding issues, and the retail investor’s new best friend in a market sorely lacking in decent returns on investment.

Crowdfunding is a disintermediary, using the power of the internet and developments in financial technology to bring together investors and SMEs normally beyond the reach of traditional capital markets. For the UK government, it is a means not only of demonstrating internet-savvy but also of investing easily and directly in this end of the market, touted as crucial for economic recovery and growth, without itself having to administer costly loan schemes.

In March 2014, the Financial Conduct Authority (FCA), somewhat late to the party, published “The FCA’s regulatory approach to crowdfunding over the internet, and the promotion of non-readily realisable securities by other media”. This long-windedly titled report followed on from a consultation paper and contained the final rules applicable to the burgeoning industry.

It distinguished between five types of crowdfunding activity:

  • Donation-based – including charity fundraising sites such as Crowdfunder.
  • Pre-payment or rewards-based – investing in exchange for a reward, service or product, e.g. via Kickstarter.
  • Exempt – schemes such as Enterprise Schemes or withdrawable shares issued by Industrial and Provident Societies, which are exempt from FCA authorisation or regulation.
  • Loan-based – schemes facilitating the lending of funds by investors to businesses, i.e. peer-to-peer lending.
  • Investment-based – schemes facilitating investment in businesses via shares, debt securities or unregulated collective investment schemes.

The FCA further noted that only the latter two activities fell under its remit. These are referred to broadly as crowdfunding, with the distinction made explicit only where necessary. Furthermore, it stipulated that these activities were, broadly speaking, to be governed by the existing rules for consumer credit providers – a diverse category encompassing credit card providers, payday lenders and pawnbrokers, amongst others.

Customisations for crowdfunding platforms included the stipulation of regulatory capital requirements, and of arrangements whereby the outstanding loan or investment book can continue to be administered should the platform operator itself fold. Interestingly, from an investor point of view, the FCA decided that, whilst crowdfunding investments would not be covered by the Financial Services Compensation Scheme (FSCS), participation in investment-based schemes would be limited to a maximum of 10% of a retail investor’s investible assets.

The advent of crowdfunding, arguably the vanguard of the wider fintech paradigm, has led to an increased appreciation from both the UK government and the FCA of the need to be at the forefront of such an innovative area. For the government, fintech is high on the agenda because it offers a new industry with massive growth potential. It provides not only a channel for investment, but also a target for it.

In August, George Osborne announced a doubling of the initial £100m investment in fintech companies and development of innovative finance products. There are also now provisions for tax breaks for both fintech firms and investors (including, for the first time, allowing peer-to-peer lending in investment savings accounts). In parallel, the FCA is moving forward with proactive regulation, culminating in the launch in May 2014 of Project Innovate. The service is aimed at facilitating communication between fintech start-ups and the FCA, and assisting them in navigating the regulatory environment.

Traditionally, early-stage SME financing has been dominated by bank lending and angel or venture capital investment. Peer-to-peer lending platforms have done well to address the deficits now inherent in the former, while investment-based schemes such as Angel’s Den have made the latter easier and more successful than ever.

Peer-to-peer lenders are typically and rightly perceived as being less risky to retail investors due to the smaller amounts invested, the ability to withdraw funds and the potential for spreading loans across multiple recipients. This has resulted in a lower and more diversified risk profile. As for investment-based schemes, it is notable that an angel investor is a very different beast to an average retail investor. They are more sophisticated, with a greater understanding of market fundamentals. They understand the need for a solid business model and prospects for return on investment, which together promote investor confidence.
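
The diversification point can be made concrete with a toy simulation: splitting the same stake across many independent borrowers leaves the expected return unchanged but dramatically narrows the range of outcomes. The default probability, loan split and total-loss-on-default assumption below are all invented for illustration:

```python
# Toy Monte Carlo: lend 1,000 either to one borrower or split across 50,
# each defaulting independently with 5% probability (total loss on default).
import random

random.seed(42)  # deterministic run for the illustration

def recovered(stake, n_loans, p_default=0.05):
    """Amount recovered when the stake is split equally across n_loans."""
    per_loan = stake / n_loans
    return sum(0 if random.random() < p_default else per_loan
               for _ in range(n_loans))

trials = 10_000
concentrated = [recovered(1000, 1) for _ in range(trials)]
diversified = [recovered(1000, 50) for _ in range(trials)]

# Expected recovery is ~950 in both cases, but the single loan is
# all-or-nothing while the diversified book's worst outcome stays
# far above zero even over 10,000 trials.
print("worst single-loan outcome:", min(concentrated))
print("worst diversified outcome:", min(diversified))
```

The averages match; only the dispersion differs, which is exactly the sense in which spreading loans across multiple recipients produces “a lower and more diversified risk profile”.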

Investment-based crowdfunding platforms may provide them with liquidity and market infrastructure that were previously unavailable. An interesting exercise would be to explore the breakdown between the retail investor, more sophisticated angel investor and government investment across the various crowdfunding platforms. The aim would be to understand the extent to which investment-based crowdfunding has been adopted by retail investors. Furthermore, to what extent do retail investors understand the risks involved in investing in these schemes?

The government’s own statistics note that between 50% and 70% of start-ups fail in their early years. The wild success of investment-based crowdfunding platforms has not yet borne out these statistics. To some extent, this may be a self-fulfilling prophecy: many businesses on these platforms are second- or third-time fundraisers, and easy access to funding can help overcome some of the initial cashflow challenges in launching a new service or product.

However, there is no substitute for a strong business model and good corporate leadership. Once the initial funding issues have been resolved, it remains to be seen how many of these crowdfunded start-ups will survive. It may be too early to ascertain the true success rate, but one thing is for certain – the failure of a large crowdfunded scheme (whether investment- or loan-based), or an economic event resulting in the collapse of multiple enterprises and subsequent losses for retail investors, would pose a fundamental challenge to the industry – one that has not yet been adequately explored.


Buyside focus : OMS/EMS : Heather McKenzie

Be27_OMS-EMS_DIVIDER

THE HEAT IS ON

Heather McKenzie looks at how OMS/EMS can keep the flame burning.

There is no doubt that the sellside is under pressure. Buyside clients have heightened expectations regarding execution and pricing, while regulators are calling for greater transparency and better execution. At the same time, sellside firms are trying to prove their worth while seeking to reduce their costs, including commissions, exchange and clearing fees.

One response is for sellside firms to recognise the “increasingly urgent” need to deploy order management system (OMS) technology capable of more than the simple execution of individual assets, according to a TABB Group report, The Transformative OMS: From Tickets & Timestamps to Today’s Technologies. In order to differentiate themselves, and to retain as well as win clients, they need to upgrade their product offerings and revamp their OMS-related technologies.

The report highlights a number of key trends that are driving significant change in the relationship between buyside and sellside firms. Some of these are regulatory, of course, including Dodd-Frank and the Securities and Exchange Commission’s recent move toward expanding Regulation NMS (Reg NMS) Rule 606, which governs broker order-routing practices. In Europe, the latest incarnation of MiFID is also a game-changer.

The other driver, according to TABB, is increased scrutiny by investment managers. They have reduced the number of their execution partners by as much as 50%, depending on the size of the firm. In addition, they are re-evaluating services such as research, trade execution and capital commitment that were once considered an unquestioned value.

Another trend noted is the migration of sellside personnel to buyside firms. This has not only increased the sophistication of their trading standards but also raised the level of understanding of algorithms as well as electronic execution strategies and trading venues. “Today’s investment managers have fully embraced electronic and algorithmic technology and have complete access to a full complement of trading tools once reserved for the brokerage firms that covered them,” says the report. Moreover, asset managers have whittled down their broker lists mainly due to shrinking commission wallets on the back of internal cost cutting.

A true reflection

The picture painted in the report is one recognisable to many observers of the industry. Chris John, head of the revenue and expense management business at investor communications and technology company Broadridge, says there is huge pressure on sellside firms at present around cost management and expenses. “Sellside firms are focused on being more transparent and efficient,” he says. “But rather than looking at it from a trading desk or trading group point of view, they are looking at it from a higher level: from the firm’s total expenses and what they are paying out in commission, exchange and clearing fees.”

These expenses can reach totals of hundreds of millions of dollars annually, he says. “Everything is being looked at as a candidate for saving money and creating efficiency through re-engineering or outsourcing. The idea is that the savings can be passed on to buyside clients in the form of lower cost of execution,” he says. Post-trade costs represent about 65% of the total cost of execution so this is the area where sellside firms are looking to create the most efficiencies across their organisations.

Sellside firms have a greater challenge now in determining their relevance to buyside firms in the equities execution space, says James Blackburn, global head of product marketing, equities at Fidessa, a trading infrastructure and software company. “The pie is getting smaller; buyside firms are driven by cost and regulatory pressures and are becoming much more discerning about the sellside partners they choose.”

The changing dynamics have led to a polarisation of the market, with large wholesale brokers (tier one banks) and niche, specialist firms doing best. “The firms in the middle, that either don’t have the size or don’t specialise in particular geographies, sectors, M&A or research, have an identity crisis and are not doing very well at the moment,” says Blackburn.

Anna Pajor, a lead consultant at London-based consultancy GreySpark, agrees the environment has become much tougher. “Before the financial crisis, sellside firms looked at how to target clients. Today they are much more likely to examine whether there is a reason to attract particular flows.” Sellside firms are focusing on specialising and providing quality services in a particular industry or market, or they are reviewing all of their services and the profitability they deliver.

Pajor adds, “This means that some brokers are deciding particular clients are not profitable. This actually makes it easier for buyside firms to determine which brokers to retain – if a broker says it doesn’t want your business, then it is an easy decision.”

Technological solution

In turning to technology to cope with this changing environment, TABB Group found that sellside firms are prioritising OMS/EMS technology initiatives, followed by tools and analytics (including transaction cost analysis), technology infrastructure and algo development.

The report states: “Global firms are focused on initiatives that will revamp their legacy OMS platforms and other systems built in-house. Top of mind for all sellside firms is the desire to consolidate OMS functionality across all asset classes. At many boutique and agency firms, however, technology initiatives centre on implementation of off-the-shelf OMS and EMS products, followed by the evaluation of algorithmic-suites.”

Across all firms, nearly half plan some type of OMS technology project or related effort to upgrade analytical tools and infrastructure. TABB notes that this highlights the overall level of dissatisfaction with current OMS and technology providers. It also reinforces the consultancy’s view that, going forward, firms must aggressively seek out technology partners able to provide more robust solutions in these areas.

Bud Daleiden, managing director, product management at Eze Software Group, says: “Much like the buyside has done in the past, we’re seeing our sellside customers demand a convergence of EMS and OMS capabilities as they attempt to reduce overall trading costs while delivering quality executions to their buyside customers.” Examples of features that firms frequently look for include the ability to auto-route certain orders to the street while directing others to the trading desk for high-touch execution, more advanced intra-trade TCA, and multi-asset trading support.

Daleiden adds, “Sellside desks are finding that auto-routing certain orders, typically those that are lower volume and marketable, allows their traders to focus on actively working more complicated orders trying to get best possible execution. Then, while working the high-touch orders, sellside traders are increasingly relying on intra-trade TCA to provide real-time comparison to various benchmarks.”

This helps traders determine where they need to adjust behaviour to work orders more aggressively or passively. They are ultimately looking for the most cost-efficient way to seek liquidity, as well as the tools to help them identify the best venues. Desks are also discovering that they can minimise their internal costs by using a multi-asset OEMS, according to Daleiden. “This allows the same traders to work equity or derivative orders and ultimately permits the desk to streamline trade desk support.”
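
Stripped to its core, the intra-trade TCA described here is a running comparison of the order’s average fill price against benchmarks such as arrival price and interval VWAP. A minimal sketch with invented fills and tape prints (for a buy order, positive slippage in basis points means the desk paid more than the benchmark):

```python
# Compare a buy order's average fill price against two common benchmarks:
# arrival price and interval VWAP. All prices and quantities are invented.

def vwap(prints):
    """Volume-weighted average price of (price, quantity) pairs."""
    qty = sum(q for _, q in prints)
    return sum(p * q for p, q in prints) / qty

def slippage_bp(avg_fill, benchmark):
    """Cost versus benchmark in basis points (positive = worse for a buy)."""
    return (avg_fill - benchmark) / benchmark * 1e4

arrival_price = 50.00                                  # mid at order arrival
fills = [(50.02, 300), (50.05, 500), (50.04, 200)]     # our executions
tape = [(50.01, 2000), (50.06, 3000), (50.03, 1500)]   # market prints

our_avg = vwap(fills)
print(f"vs arrival: {slippage_bp(our_avg, arrival_price):.1f} bp, "
      f"vs interval VWAP: {slippage_bp(our_avg, vwap(tape)):.1f} bp")
```

On these numbers the order looks expensive against arrival price (the stock drifted up while the order was worked) but roughly in line with interval VWAP – exactly the kind of distinction a trader uses to decide whether to work the remainder more aggressively or more passively.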


Profile : Will Geyer : ITG

ITG, Will Geyer

THE EMPOWERMENT OF THE BUYSIDE.

Will Geyer speaks to Best Execution about the future state of trading.

Having had extensive experience of both the buyside and the sellside, what changes have you seen?

I would say that five years ago the buyside wanted the hot keys and algo strategies that could be customised around the way they traded. Today, the emphasis is much more on using market data and proprietary research to make better-informed dynamic trading decisions. The information gives them a greater understanding about the way brokers and venues operate. This has been an evolutionary process although it has recently accelerated. Several years ago, firms would use tools to improve trading decisions but now they want to have better insight into the building blocks underlying those tools. We have developed workflows that have assimilated the data and created a customisable environment which enables clients to efficiently adapt to market changes.

What are the drivers behind this?

The lines between the sell and buyside have continued to blur. The buyside want to have more direct engagement with liquidity without an intermediary and also, similar to the sellside, they want to capture all the necessary data to meet future and existing regulatory requirements. The other major trend is that clients no longer just have an equity trading desk but are looking for opportunities across the asset classes whether it be equities, fixed income, FX or over-the-counter derivatives. They want to have a single blotter where they have their trading applications, P&L and risk management. They also want to be able to access liquidity in the most effective way and as quickly as possible. This consolidated, multi-asset approach I believe is the future of trading.

You joined ITG in 2011. What is the strategic priority for the company at this time, and what do you bring to the mix?

We believe many clients are looking for a single blotter user experience but they are not necessarily looking for one omnipresent solution. They want simplified and flexible workflows. We offer a wide range of different modules across the asset classes that we have built to interface with each other or different sellside firms because we are broker neutral. We are not trying to redefine the process but offer greater integration, transparency and control. We take clients through the entire life cycle of a trade starting with the ability to capture all the data for execution to transaction cost analysis (TCA) and straight through processing. Our order management and execution management systems are separate, although they can be tightly integrated through the trading blotter, and they are compatible with nearly all other major front-ends. This gives clients the option to choose the systems which are best for them rather than locking them into one system.

In terms of tools, one of the most promising developments today is in real time decision support. This provides clients insights into what is actually happening to their order versus their expectations and allows the buyside to react appropriately and trade on the venues that are best suited for their strategies.

What were the drivers behind your acquisition of RFQ-hub in August?

In many ways it is a chicken-and-egg story. When you want to move into a new area you have to find clients who want to consume the product, but they will ask how many brokers are involved. However, if you go to the brokers they want to know how many clients you have. RFQ-hub was an established multi-dealer platform which connects buy and sellside firms across exchange traded funds, OTC-negotiated equities, futures, options, swaps, convertible bonds, structured products and commodities. It fitted our strategic vision, and it will remain available as a standalone platform but will also be offered as a module on ITG’s Triton execution management system.

What are your plans for RFQ-hub?

We did not see the acquisition as a cost rationalisation play but one where we can invest in the business and expand it globally across asset classes. We plan to roll out equity derivatives in North America in Q1 2015 and enrich the analytics by embedding our TCA within RFQ-hub.

What impact do you see regulation having on the industry?

Regulatory changes are the only constant in this business. They will continue to change the way people trade across all asset classes. Take the FX market: the dealing structure is evolving in real time to provide more transparency and disclosure. To that end we recently signed a joint venture with FX Connect (State Street Global Exchange’s end-to-end, multi-bank foreign exchange platform). It provides foreign exchange liquidity from over 50 providers for spot, forward and swap transactions. We are marrying the back end of FX Connect with the front end of our Triton product, which we believe will allow clients to execute FX seamlessly across multiple accounts, removing workflow constraints and providing a multi-asset, single-blotter experience.

Who do you see as your main competition?

ITG covers every region and asset class, and although we have competition in our different individual modules, we see our edge as the ability to offer the full suite of products at a compelling price point tailored to each client’s requirements. For example, we are an agency broker with research that has an alternative trading system, a network, an OMS and EMS, and a full continuum of TCA. We have many individual competitors, but very few true peers.

[Biography]

Will Geyer is Managing Director, Head of Platforms at independent execution and research broker, ITG. Globally responsible for ITG’s suite of Order Management Systems, Execution Management Systems, and connectivity products, Will previously worked on both the buyside and sellside of investment implementation. Most recently, he was CEO of JonesTrading and prior to that ran Citigroup’s global alternative execution businesses and managed equity trading at Barclays Global Investors.

[divider_line]© BestExecution 2015

[divider_to_top]

Viewpoint : Performance metrics : Darren Toulson

LiquidNet, Darren Toulson

HOW DO YOU DEFINE TOXICITY?

Darren Toulson, Head of Research at LiquidMetrix, explains the methodology behind the Counterparty Profitability Measure and how much you could be losing to ‘toxic’ counterparties.

A central fear any buyside trader will have when trying to execute a large order is being exposed early on in the trade to ‘toxic’ venues or counterparties. Imparting too much information too quickly to the wrong people will make your implicit trading costs higher or, worse, may make finding liquidity at all a challenge.

The idea that certain market players seek out large buyside orders to game has been given credence by recent books, articles and media attention focussing on predatory HFT algorithms and opaque anti-gaming logic in some dark pools. Because of this, there’s pressure on buysides, sellsides and execution venues to demonstrate they are not falling prey to, or inadvertently encouraging, such gaming.

But what is meant by ‘Toxicity’? A common perception is that certain counterparties exploit fast connections, privileged access or smarter algorithms to ‘game’ less sophisticated trading styles. How can a participant see if this is the case with their own trades?

Let’s consider two approaches that buysides might use when trying to identify fill level toxicity:

A. Indirect Approach – Broker / Venue fill profiling to identify toxicity

One way to characterise ‘toxic’ trades is to think what we might expect from a ‘good’ set of trades, for instance:

  • When our orders access multiple lit markets they should obtain at least as much liquidity as was available on all lit venues at the time of first order submission. Ideally they should capture more if dark pools or other hidden order types are being accessed. Otherwise this is a sign that HFTs might be gaming orders and either stealing or cancelling liquidity in the milliseconds our orders take to execute.
  • When our orders execute aggressively on midpoint matching dark pools, we should expect little or no short-term (less than 1 second) impact on liquidity in lit markets. Otherwise it might be that our counterparty on the dark venue was a market maker (or predator) who used information from our trade to adjust their own resting orders or trade ahead of us on lit markets.
  • When our orders execute passively on midpoint matching dark pools, we should expect them to capture close to 50 percent of the market wide BBO spread (not just the primary market spread). Otherwise the aggressor may be timing orders to take advantage of dark pools with market data latency problems or where matched reference prices don’t reflect market wide fair prices.
  • When orders execute passively, we would prefer as little correlation as possible between our trades and large changes to the market wide mid-price. Otherwise it’s possible that our trade was ‘picked off’ at an adverse price just as the market moves sharply (adverse selection).
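The passive spread-capture check above lends itself to a simple per-fill calculation. The Python sketch below is purely illustrative: the fill records and field names are invented, and a real implementation would need the market-wide best bid and offer sampled at each fill’s execution time.

```python
# Sketch: fraction of the market-wide BBO spread captured by each
# passive midpoint fill. Field names are hypothetical.

def spread_capture(fill_price, side, best_bid, best_ask):
    """Captured spread as a fraction of the BBO spread.

    A passive buy filled exactly at the midpoint captures 0.5 of the
    spread; a fill at the far touch captures 0.0.
    """
    spread = best_ask - best_bid
    if spread <= 0:
        return None  # locked or crossed book at fill time: skip
    if side == "buy":
        return (best_ask - fill_price) / spread
    return (fill_price - best_bid) / spread

fills = [
    {"price": 100.05, "side": "buy", "bid": 100.00, "ask": 100.10},  # at mid
    {"price": 100.08, "side": "buy", "bid": 100.00, "ask": 100.10},  # stale ref?
]
captures = [spread_capture(f["price"], f["side"], f["bid"], f["ask"])
            for f in fills]
avg_capture = sum(c for c in captures if c is not None) / len(captures)
```

An average capture persistently well below 0.5 across many fills on a venue is the warning sign described above: the venue’s matched reference prices may lag the market-wide mid.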

This type of ‘qualitative’ approach doesn’t always produce clear ‘smoking guns’ (though sometimes it does!) but it does provide a good way of assessing the relative performance of different brokers / algos / venues and giving some assurance that trades executed through these different channels are doing what they should do and are not harming your overall trading performance.

B. Direct Approach – Counterparty Profitability Measure to Identify Toxicity

Returning to the central concern around predatory short-term trading styles, much of the fear is that the counterparty to our trades might be making quick, riskless profits at our expense. Whether this profit is coming from ‘latency front running’ lit market orders, gaming reference prices in dark pools or some other means is immaterial.

Consider two common examples of ‘toxic’ behaviour:

  • A primary reference price moves sharply and a resting buyside order on a dark pool that references this price is hit at that exact instant; we can assume that usually the buyside passive order will trade at the less favourable (stale) price.
  • A lit aggressive buyside SOR order is ‘gamed’ causing it to go deeper into the order book than intended; there follows a price reversion after the lit trade has executed.
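The second scenario can be quantified with a simple post-trade reversion measure: compare the aggressive fill price with the market-wide mid a short interval later. A minimal Python sketch, with invented field values and a one-second horizon chosen purely for illustration:

```python
# Sketch: signed post-trade reversion in basis points. A positive
# value means the market moved back against our fill, i.e. we paid
# up and the price reverted.

def reversion_bps(fill_price, side, mid_later):
    """Reversion of an aggressive fill versus the later mid, in bps."""
    sign = 1 if side == "buy" else -1
    return sign * (fill_price - mid_later) / fill_price * 10_000

# We bought deep in the book at 100.10; one second later the
# market-wide mid is back at 100.02 -> roughly 8 bps of reversion.
r = reversion_bps(100.10, "buy", 100.02)
```

Consistently positive reversion on a given broker, algorithm or venue is what the gamed-SOR scenario above would look like in the data.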

In both cases (and the other scenarios described in the previous section) the counterparty to the buyside order has probably obtained a short-term profitable position. But how profitable?

The obvious way of measuring this is to simulate a simple short-term market making model that will balance (trade out) positions obtained as a result of being counterparty to all trades done by a certain buyside. Obviously this needs to be done somewhat realistically: simply looking at mid prices a fixed time after each fill doesn’t really tell us whether the counterparty to your trades could have realistically traded out and made a ‘fast buck’. However, by simulating trading outcomes of limit orders placed by market making / trading strategies with some kind of short-term inventory / risk model, we can get an idea of how much profit a basic algorithm might make by being counterparty to your trades.
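The idea behind such a simulation can be illustrated with a deliberately naive toy version: assume the counterparty took the opposite side of each of our fills and trades its inventory out at the market mid a short horizon later. This is much simpler than the limit-order and inventory/risk model the article describes, and all field names and values below are invented for illustration.

```python
# Toy counterparty-profitability sketch: mark the opposite side of
# each of our fills out at a later market mid. A realistic model
# would simulate limit-order trade-outs with inventory/risk limits.

def counterparty_pnl(fills):
    """fills: dicts with our side, fill price, size, and the
    market-wide mid observed a short horizon after the fill."""
    pnl = 0.0
    for f in fills:
        # The counterparty holds the opposite position to ours:
        # if we bought, they are short from the fill price.
        sign = 1 if f["our_side"] == "sell" else -1
        pnl += sign * (f["mid_later"] - f["price"]) * f["size"]
    return pnl

fills = [
    # We bought 1,000 shares at a stale (high) reference price;
    # the mid then fell back. The counterparty sold to us at
    # 100.08 and could cover near 100.05: roughly 30 of profit.
    {"our_side": "buy", "price": 100.08, "size": 1000,
     "mid_later": 100.05},
]
profit = counterparty_pnl(fills)
```

Aggregated over many fills, a persistently positive figure for a particular venue or channel is the kind of signal the CPM formalises.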

What we have found when we look at real client data is that this LiquidMetrix Counterparty Profitability Measure (CPM), which directly tries to answer the question ‘how much is the other party profiting from trading with me?’, is actually a very good general toxicity indicator. That is, if you look more closely at groups of buyside trades on a venue, algorithm or broker with a high associated CPM, you usually find within those trades some of the specific ‘gaming’ problems identified in the previous section.

Conclusion

One way of defining toxicity is seeing whether your (toxic) counterparty can consistently make short-term riskless profits at your expense when you trade. This is not always a bad thing: you expect to pay something for liquidity. However, some of the types of gaming relating to high frequency price movements or reference price arbitrages highlighted in the media go beyond a liquidity provider simply being compensated for providing liquidity. So having a way of identifying, quantifying and then, ideally, avoiding or mitigating the risks of trading with such ‘toxic’ counterparties is going to be key to answering the question “How much money (if any) does your fund lose to HFT / toxic counterparties?”

[divider_line]© BestExecution 2015

[divider_to_top]
