
Viewpoint : Impact of regulation : MiFID II – tick sizes : Sandra Bramhoff

A NEW REGIME AWAITS.

Today there are many different tick size regimes in Europe, none of which are harmonised or under the control of regulators. However, this will change: the introduction of MiFID II requires that tick size regimes be adopted by regulated markets and MTFs in shares and equity-like instruments such as ETFs. Dr. Sandra Bramhoff, Senior Vice President, Market Development, Deutsche Börse AG, gives her initial assessment.

ESMA now has the task of drafting regulatory technical standards, taking into account that tick size regimes must not only be determined by the price of a financial instrument but also be calibrated to reflect the instrument’s liquidity profile and average bid-ask spread.

In the discussion paper published on 22 May 2014, ESMA suggests two options for a European tick size regime for shares. Option 2 is based on the existing FESE Table 2, which is now extended from 17 to 20 price bands. In addition, the new table allows for higher granularity (see Figure 1).

[Figure 1]

In order to reflect the liquidity profile of a share, Option 2 suggests two things. First, there shall be one table for liquid shares and one for illiquid shares.[1] The two tables differ only slightly: tick sizes for less liquid shares will be one tick less granular per price band than for liquid shares.

Secondly, Option 2 suggests that the tick size of a share shall be adjusted after an initial six-week calibration period if the spread-to-tick ratio falls below 2. Assume, for example, that a liquid share with a price between 20 and 49.99 EUR has a spread of 0.024 EUR. According to band 7 of the table it will initially be assigned a tick size of 0.02 EUR (see Figure 1). The spread-to-tick ratio of 1.2 is therefore less than 2, so the proposed tick size is too large and would need to be adjusted by what Option 2 refers to as a Spread Adjustment Factor (SAF). For the spread-to-tick ratio to be above 2, the tick size needs to move to 0.01 EUR. After the initial calibration period the tick size for all shares will be reassessed on an annual basis.
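The band lookup and SAF adjustment described above can be sketched in code. This is an illustrative reading only: of the 20 price bands, only band 7 (20.00 to 49.99 EUR, 0.02 EUR tick for liquid shares) comes from the article's worked example; the other bands and the ladder of permissible tick sizes are hypothetical placeholders, and ESMA's final SAF mechanics may differ.

```python
# Illustrative sketch of Option 2's tick assignment and Spread Adjustment
# Factor (SAF) logic. Only band 7 (20.00-49.99 EUR -> 0.02 EUR tick) is
# taken from the article; all other values are hypothetical placeholders.

LIQUID_BANDS = [            # (lower bound, upper bound, initial tick) -- illustrative
    (10.00, 19.99, 0.01),
    (20.00, 49.99, 0.02),   # band 7 in the article's example
    (50.00, 99.99, 0.05),
]

TICK_LADDER = [0.001, 0.002, 0.005, 0.01, 0.02, 0.05]  # permissible ticks, assumed

def initial_tick(price: float) -> float:
    """Look up the table tick size for a share's price band."""
    for lo, hi, tick in LIQUID_BANDS:
        if lo <= price <= hi:
            return tick
    raise ValueError("price outside the illustrative bands")

def apply_saf(tick: float, avg_spread: float, min_ratio: float = 2.0) -> float:
    """If the spread-to-tick ratio is below min_ratio, step down to the
    largest smaller tick on the ladder that restores the ratio."""
    if avg_spread / tick >= min_ratio:
        return tick  # no adjustment needed
    ok = [t for t in TICK_LADDER if t < tick and avg_spread / t >= min_ratio]
    return max(ok) if ok else min(TICK_LADDER)

# The article's example: a 20-49.99 EUR liquid share with a 0.024 EUR spread
tick = initial_tick(35.00)     # 0.02 EUR; ratio 1.2 is below 2
tick = apply_saf(tick, 0.024)  # moves to 0.01 EUR; ratio becomes 2.4
```

Run against the article's example, the helper reproduces the stated outcome: the band-7 tick of 0.02 EUR is reduced to 0.01 EUR so that the ratio (2.4) exceeds 2.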

Overall this option seems quite workable; however, a few adjustments may be worth considering. Firstly, the proposed spread-to-tick ratio of 2 seems very low. It implies that the tick size can become too big in relation to the instrument’s average spread, making it quite likely that the spread is compressed down to the minimum of a single tick, i.e. artificially restricted by the tick size. The spread-to-tick ratio should therefore be no lower than 4.

Secondly, the proposed initial six-week calibration period might not be needed. Instead, spread-to-tick ratios should be checked when the regime is implemented and, if needed, the tick size of a share should be adjusted through a SAF straight away. To determine whether a SAF is initially needed, the annual time-weighted average spread before implementation of the new tables should be considered. This minimises market disruption because tick sizes would not need to be readjusted after six weeks. It might also be useful to check in advance how many stocks would need an initial adjustment via a SAF; if that number is very high, it might be worth checking whether the table itself is calibrated appropriately or needs adjustment.

Thirdly, a share first admitted to trading should not automatically be assigned to the illiquid table, but rather to the liquid or illiquid table based on which table its peers have been assigned to. Instead of a reassessment after six weeks, it should be checked at the end of the following quarter whether a table change (and a SAF adjustment) is deemed necessary, provided the share has been assigned to the respective table for at least six weeks. The fact that tick size adjustments for new shares will only take place at specific dates, i.e. at the beginning of a new quarter, will result in less market disruption. A share that, for example, floats on 1 March would first be moved into another table and assigned a new tick size via a SAF by 1 July, as its trading period before the next quarter start (1 April) is under six weeks.
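Read literally, the reassessment timetable for newly listed shares (adjustments only at quarter starts, subject to a six-week minimum trading period) can be expressed as a small date helper. The quarter boundaries and the six-week threshold are taken from the text above; everything else is a hypothetical sketch, not ESMA's specification.

```python
from datetime import date, timedelta

MIN_PERIOD = timedelta(weeks=6)   # minimum time in a table before reassessment
QUARTER_MONTHS = (1, 4, 7, 10)    # quarters assumed to start Jan/Apr/Jul/Oct

def next_quarter_start(d: date) -> date:
    """First quarter-start date strictly after d."""
    for m in QUARTER_MONTHS:
        if date(d.year, m, 1) > d:
            return date(d.year, m, 1)
    return date(d.year + 1, 1, 1)

def first_reassessment(float_date: date) -> date:
    """Earliest quarter start by which the share has traded >= 6 weeks."""
    q = next_quarter_start(float_date)
    while q - float_date < MIN_PERIOD:
        q = next_quarter_start(q)
    return q

# The article's example: a share floating on 1 March skips the 1 April check
# (only about four weeks of trading) and is first reassessed on 1 July.
print(first_reassessment(date(2014, 3, 1)))  # 2014-07-01
```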

Fourthly, Option 2 also seems workable for ETFs. However, a major difference between shares and ETFs is that the latter benefit from multiple layers of liquidity, including the liquidity of their underlying markets. A differentiation between liquid and less liquid ETFs based exclusively on their individual trading data, as proposed by ESMA in the consultation paper also published on 22 May 2014, therefore does not accurately reflect their true level of liquidity. Hence, for the sake of reduced complexity, all ETFs should be assigned to the liquid table, with tick sizes being amended manually via a SAF when deemed appropriate.

Option 1 has been designed only for shares, as it suggests taking the number of trades as a liquidity proxy; for ETFs this is not an ideal proxy, for the same reasons stated above. Furthermore, its tables have been designed around an optimal spread-to-tick ratio of 1.4 to 2.5 for liquid shares and 1.4 to 5 for less liquid shares. As outlined above, this will artificially restrict the spread: the tick size becomes too large in relation to the instrument’s average spread, thereby having an impact on viscosity. While this may lead to less perceived flickering at the top of the book, it might also reduce flow from liquidity providers. Ultimately the number of transactions may fall and the share would shift to a lower liquidity band. As a result tick sizes will increase further, and it turns into a vicious circle.

Market participants should therefore carefully consider these two options and base their judgement on what ESMA itself points out in the discussion paper: the intention is to implement a European tick size regime that is robust and scalable, as simple as possible, easy to implement, enforceable, and does not cause disruption to trading when implemented.

[1] According to Commission Regulation 1287/2006 a share is considered to be liquid if it is traded daily, with a free float of not less than 500 million EUR, and one of the following conditions is fulfilled: (1) the average daily number of transactions in the share is not less than 500; or (2) the average daily turnover for the share is not less than 2 million EUR. Please note that ESMA suggests changing these parameters under MiFID II, cf. p. 174f of the consultation paper published by ESMA on 22 May 2014.

©BestExecution 2014


Viewpoint : Buyside challenges : Technology : Nick Barker


TECHNOLOGY – AT THE HEART OF MEETING CLIENT NEEDS.

One of the challenges that the buyside faces today is the lack of agility in their trading systems, which may struggle to keep up with changing client demands. Nick Barker, Head of Electronic Distribution Products at RBS, argues that banks need to embrace a new approach to technology and development to ensure their products remain relevant.

The future of banking technology is to offer clients a platform that is flexible and adaptable to meet their business needs and lets them shape how they use services themselves, whether for sales, trading or market intelligence. Those services might be provided by either the bank offering the platform or by a third party, such as research from other institutions or third party analytics.

An open source offering, combined with HTML5 code, allows banks to provide solutions from execution and post trade applications to transaction cost analysis tools in a more dynamic, user-friendly way.

It can be a collaborative model that allows people other than the original developer to make changes or improvements to the code. The benefits include better software because more programmers can have an input, the ability for banks to keep up with changing client demand and lower cost and risk.

Among the many advantages to clients is that a number of small changes can be made over time, allowing the software to evolve gradually. These changes can be tested and the feedback used to channel future investment and adjust the direction of a product. It could even make the software as easy and enjoyable to use as their personal technology.

In my opinion such smaller, more regular releases are a lower-risk way of developing software than offering one or two more major updates a year, a tendency we have seen in the past in some areas of the industry. Sometimes clients were being expected to take on board a complete redesign every six months, potentially a massive risk to them and the banks. Having an agile approach to development is more dynamic and much less jarring.

You also avoid a scenario where you effectively go off into a darkened room for three years and emerge with a product only to find it has missed its target because demand and technology have moved on. That is a significant investment risk to a bank; incremental releases mean you can spend small and test as you go, and that way make sure you are always on track to meet the end goal and adapt if the goalposts move.

Traditionally, open source means giving full access to everyone. In the case of banking, this is not practical, so the code is open to an approved community of users to avoid regulatory, compliance or data protection issues.

The easiest way to approach this is to have a library of content and standard components that can be built on. So when you want to offer a new service or capability you don’t have to start from scratch and can amend or add to what’s already there.

It’s also possible to use a third party provider to build part of the software offering; for instance if you want to create a service that uses an SMS engine, there is no point having your own people work on building it when you can get one from a third party provider. Again, this means greater speed of development and a better ability to align your offering with client needs.

Improvements do not go into production until they have been technically approved. We do not want to make the approval process so onerous that there is only one software release a year, but quality control checks need to be built in the process, so it’s about finding an appropriate balance.

While one benefit of open source is the input from other people and new solutions and developments, you must still have an overall strategic plan. You must know the direction in which you want to take your product and an idea of where your next investment should be, but be prepared to shift that depending on client demand and the changing market landscape.

The reason that perhaps some banks do not take this approach is that they have very successful trading and market platforms already. However, they may find that they are stuck with a legacy infrastructure that could be difficult to alter or build on as client demands change.

Using HTML5 and open source means you are not tied to a particular software vendor and can evolve the platform or system as necessary. I think that HTML5 will be the technology of choice for the industry going forwards and banks that embrace this approach now will find that they have a head start on their competitors in future.

Allowing clients to self-serve and decide for themselves how they want to consume services ties in with an industry-wide drive for simplification. Regulatory changes are forcing banks to become more transparent in their offerings, and ultimately letting the client choose how they use our products allows us to serve them better.

In our personal lives, most people are used to a high-end, enjoyable user experience from their mobile phone, tablet or whatever technology they are using. In the banking industry, outside of the retail space, clients don’t have that same enjoyment. We need to move the dial on the client experience and make it as easy as possible, and pleasant, for clients to do business with us.

This is one of the biggest changes in banking technology I have seen in the past few years: we are at the start of another wave of digitalisation of the banking industry.

As consumer technology is evolving – for instance, a car manufacturer using technology to measure how efficiently you are driving, and how you can save money on fuel – the banking industry needs to evolve. We need to remain relevant to our clients and this will be achieved by the quality of our digital offering.

©BestExecution 2014


News review : FICC

Tightening the screws.

Chancellor of the Exchequer George Osborne is cracking down on market abuse with his proposals to make the manipulation of foreign exchange, fixed income and commodities benchmarks a criminal offence in order to strengthen London’s status as an international financial hub.

Minouche Shafik (left), the new Bank of England deputy governor for markets and banking, will lead the assessment along with Martin Wheatley, head of the Financial Conduct Authority, and Charles Roxburgh, director-general, financial services at the Treasury. A recommendation will be made in the autumn on broadening the legislation the government implemented to clamp down on Libor to cover other benchmarks.

As part of the changes, the government plans to widen the market abuse regime to include foreign exchange and commodities. It currently only covers insider dealing and market manipulation related to securities. It will consider the perimeter of regulation, the role of industry standards, and any need for further supervisory resources. The certification regime is also expected to be extended to include foreign lenders with branches in the UK.

Separately, the Financial Stability Board, which is chaired by the Governor of the Bank of England Mark Carney, is also assessing foreign exchange benchmarks as part of its on-going probe of short-term interest rate benchmarks for the G20. The FSB set up a sub-group to lead the investigation which is co-chaired by Paul Fisher, director for markets at the Bank of England.


The $3tn-a-day FX market has been rocked by several manipulation scandals. It has been alleged that traders at major banks shared information in order to manipulate benchmark rates. More than 40 dealers at international banks across the globe have been sacked or suspended following rigging claims.

The UK’s Financial Conduct Authority (FCA) has already launched an investigation into foreign exchange manipulation allegations alongside the US Department of Justice. London boasts the largest market in global foreign exchange with the largest proportion of daily trades.

Reforms are welcomed by industry trade groups. As Marshall Bailey, President of ACI – The Financial Markets Association, noted, “George Osborne’s leadership is very welcome and this is the right direction to restore the reputation of the FX market. This will do much to reassure markets, investors and the public. The FX market must evolve and learn the lessons from recent events.”

He adds that while “foreign exchange is a global market that is integral to the flow of global business and commerce, its reputation and integrity has been tainted in recent months. London is at the heart of the global FX system and it is good to see the UK lead on reform where it is needed. Action taken here will have a global impact. While the market’s structure is broadly very effective and well designed, we have a duty to the global financial system to ensure we repair where needed.”

©BestExecution 2014


 

Analysis : European equities : Market fragmentation

FRAGMENTATION IN EUROPE.

LiquidMetrix analyses consolidated performance figures for stocks on major European indices and the changes from the previous quarter (Q1.2014).

The charts and figures below are based upon LiquidMetrix’s unique benchmarking methodology, which provides accurate measurements of trends in market movements. Trading volumes on lit markets, including auctions, are taken into account, as well as dark trading on the major MTFs.

[Chart: Market share]

[Table: MTF market share]

Q2 2014 saw a slight shift in trading volumes from primary markets to MTFs and an increase in overall trading volumes.

Average MTF market share for the LiquidMetrix universe of European stocks now accounts for just under 30% of overall trading. This increase bucks the recent decline in MTF market share seen since its peak around the summer of 2013.

As we shall see later, this turnaround is mainly due to increased market share on one particular MTF, Turquoise.

[Chart: Spreads]

[Table: Spreads]

Average bid-offer spreads remained fairly constant between Q1 and Q2 2014. Within Q2 2014 there were some peaks of wider spreads associated with various bank holidays falling on different days in Europe.

Overall on-book liquidity increased fairly significantly and is now at levels similar to the previous peak values at the beginning of 2013.

European market breakdown – Major European indices

[Chart: Market share, Q2 2014]

There were significant shifts in market share in Q2 2014. Market share on most primary markets, with the exception of Stockholm and Madrid, fell on average by 1-2%.

This wasn’t, however, a simple shift of market share from primary markets to MTFs: Turquoise significantly increased its market share in all markets at the expense of Chi-X and BATS. By the end of Q2 2014, Turquoise’s market share in FTSE-100 stocks was getting close to Chi-X’s.

Burgundy’s market share continues to decline.

[Chart: Spreads at touch, Q2 2014]

Market wide spreads were generally little changed between Q1 and Q2 2014.

Spreads on UK stocks narrowed somewhat whilst on French stocks spreads widened a little.

Spreads on Turquoise tightened noticeably relative to other venues. Although Chi-X is still comfortably ahead of Turquoise in terms of BBO spread, the gap has narrowed. BATS spreads widened somewhat overall and it is now clearly behind Turquoise, in fourth position.

[Chart: Spreads at EUR 25k]

As with touch spreads, across the market as a whole there was no great change between Q1 and Q2 2014 in depth-weighted spreads on primary markets.

For the MTFs the picture was a little more complex: BATS saw a significant widening of depth-weighted spreads on most markets.

Turquoise significantly tightened its depth-weighted spreads on most markets, to the extent that it is now in second place, ahead of Chi-X on the SMI market, and close to a tie for second place in Italian and Swedish stocks.

The touch and depth-weighted spread improvements on Turquoise in Q2 2014 probably explain its relative success in gaining market share.

[Chart: Liquidity at 10bps]

Resting order book liquidity on most lit markets was generally stable or increased in Q2 2014, with the largest increases seen in the French market.

Amongst specific execution venues, Turquoise showed particularly strong performance with its resting liquidity increasing by about 10-30% in all countries.

[Chart: Volatility indicator]

[Table: Volatility]

The LiquidMetrix Volatility Indicator provides a measure of high-frequency market volatility based upon 30-second price movements.

Average volatility in Q2 2014 was similar to Q1, though there seems to be a general trend towards lower volatility in the most recent data (May 2014).

Market volatility levels are now near the lows of the past two years, significantly lower than (at almost half) the levels of mid-2012, at the height of the euro crisis.
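LiquidMetrix's exact indicator definition is proprietary, but a generic high-frequency volatility measure along the lines described (30-second price movements) can be built from the sample standard deviation of log returns. The annualisation constants below (252 trading days, an 8.5-hour session) are assumptions for illustration only.

```python
import math

def hf_volatility(prices, interval_s=30, trading_hours=8.5, days=252):
    """Annualised realised volatility from prices sampled every interval_s
    seconds, using the sample standard deviation of log returns."""
    rets = [math.log(b / a) for a, b in zip(prices, prices[1:])]
    mean = sum(rets) / len(rets)
    var = sum((r - mean) ** 2 for r in rets) / (len(rets) - 1)
    periods_per_year = days * trading_hours * 3600 / interval_s
    return math.sqrt(var * periods_per_year)

# A flat price series has zero volatility by construction
print(hf_volatility([100.0] * 20))  # 0.0
```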

©BestExecution 2014


Fixed income trading focus : Electronic trading : Dan Barnes


AN ELECTRICAL CHARGE.


Bond markets currently lie dormant, but an eruption of volatility could trigger a revolution for electronic market places. Dan Barnes reports.

The bond market is a smoky mirror image of the equity market.  While stock trading has moved from central exchanges to thousands of venues, liquidity in bonds is coagulating very slowly into trading hubs, quite against the wishes of most brokers.  The problem is that sellside firms see the opacity of prices in the market as the key to its profitability.

Lee Williams, partner and head of trading at proprietary trading firm Bluefin Europe, says, “I think it will take a very long time [for the market to change]. There are some pretty stubborn people in there. For every new dealer that a platform adds, we will hear from two dealers who say they are pulling back. A lot of the dealers were used to being the best of three, then it was five; now that it is 48, they have no interest.”

The sellside may become more pliant as its profitability and participation are squeezed. Under Basel III capital adequacy rules, banks must hold capital to balance out the risks of their bond inventories, and that simply makes many bonds too expensive to keep. In October 2007, primary dealers who traded with the US Federal Reserve held positions of US$235bn in corporate bonds, their highest ever. That had fallen to $32bn by 21 May 2014, according to the Fed’s own figures.

Additionally the Volcker Rule, part of wider US capital markets reform, bans banks from engaging in ‘risky’ proprietary trading or supporting other ‘risky’ traders at any significant scale. With banks unable to hold or trade bonds, an important source of liquidity has been removed for the buyside. Steady interest rates have contributed to quieter markets and brokers have seen revenues fall.

“This time around dealers need electronification as much as anyone else,” says Michael Chuang, CEO of iTB, a vendor offering trading access to multiple bond markets. “Their business models have had change forced upon them by regulators. They are more interested in a solution than ever before.”

Liquidity varies by bond type. US treasuries are so liquid they are traded by high-frequency trading (HFT) firms on the eSpeed market, owned by exchange operator NASDAQ OMX, and on BrokerTec which is part of interdealer broker ICAP. Treasuries are also used as collateral for derivatives trades, being both highly liquid and secure assets. Corporate and other government bonds are at the more illiquid end of the spectrum.

Unlike equities, which fluctuate in value but are perpetual instruments, fixed income instruments generate a predictable, interest-rate-based return over a fixed period. Each new issuance creates an entirely new set of bonds, non-fungible with previous issuances. Combined with the enormous range of issuers, this makes the bond universe far more populous than that of stocks. The International Capital Market Association (ICMA) estimated that as of 2013 there were approximately 6,500 stocks and 40,000 corporate bonds issued in the US. By contrast, the average daily number of trades in the country is 25.2m for stocks and just 40,280 for corporate bonds. That combination of product fragmentation and illiquidity makes matching buyers and sellers a harder task than it is for other instruments.
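A quick back-of-envelope division of the ICMA figures quoted above makes the contrast vivid: trading activity per listed instrument is several thousand times higher for stocks than for corporate bonds.

```python
# Average daily trades per instrument, using the ICMA figures cited above
us_stocks, us_corp_bonds = 6_500, 40_000         # instruments outstanding (2013)
stock_trades, bond_trades = 25_200_000, 40_280   # average daily trades

per_stock = stock_trades / us_stocks     # roughly 3,900 trades per stock per day
per_bond = bond_trades / us_corp_bonds   # roughly one trade per bond per day

print(f"{per_stock:,.0f} trades per stock vs {per_bond:.2f} per bond each day")
# 3,877 trades per stock vs 1.01 per bond each day
```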

“The pattern of relative illiquidity has always been part of the corporate bond market; it has just gotten a lot worse because the amount of capital commitment, liquidity and transformation that is offered by the dealers is now a fraction of what it used to be,” says Constantinos Antoniades (left), chief executive of bond-trading platform Vega-Chi.

Pushing and pulling

The desire of the buyside to achieve rapid price discovery and execution is being frustrated. Trading venues that help match up buyers and sellers provide one solution to illiquidity. Electronic trading of corporate bonds is dominated by three platforms: Bloomberg, MarketAxess and Tradeweb.

They each offer a request-for-quote (RFQ) model under which asset managers have to ask for prices, see what the brokers punt back and then try to execute on those prices. However, brokers are often reluctant to support the model.

“I get the impression that dealers are not looking to make two-way prices,” says Williams. “A good gauge for me is if I send out a request list of five names with orders of US$0.5 m to US$1m each, I can reach many dealers on certain platforms. If I see just one or two responses, they are not sellers. That suggests people are not really making two-way markets; if it suits they show a price, if it doesn’t suit, they won’t.”

The three operators have also been criticised for sustaining the sellside-to-buyside model, albeit on their derivatives platforms, which the US derivatives regulator, the Commodity Futures Trading Commission (CFTC), said in November 2013 was non-compliant with the Dodd-Frank Act.

Stephane Malrait (right), head of eCommerce for FX and Fixed Income at broker Société Générale says that a move away from relationship-based trading and towards price-based trading is perceived by some as a threat.

“When we talk to the sellside, firms say they don’t want to lose their relationship with the buyside, and that relationship is also important for the buyside,” he says. “This is partly reflected on some electronic platforms, where you see prices from smaller [sellside] players that are not really tradeable. There is value in that information about the quote request.”

In the cash market, liquidity sits almost entirely with fund managers, and a model that would connect up the buyside of the market would seem optimal.

Antoniades says, “In today’s world dealers are mostly acting as riskless brokers, taking an order from the buyside and trying to find the other side, so the current dealer to client model is no longer the optimal model of operation.”

Malrait notes that there are a number of initiatives in the pipeline that aim to deliver on this, but both buy- and sellside are required to support them.

“The buyside is not structured in such a way that it can invest into an initiative that will match orders from one buyside to another,” he says. “To avoid facing other buyside firms from a settlement point of view, they would need to go through an agent as in the equity model. The agent in the middle could be a bank but the banks are not set up yet to deliver this type of business in a mature fashion.”

Fund-management giant BlackRock launched its Aladdin Trading Network (ATN) in 2012 to pool liquidity from its buyside clients; however, despite its enormous assets under management, it has hooked up with MarketAxess and Tradeweb to provide prices.

Richard Schiffman (lower right) was a managing director on the ATN project at BlackRock from 2010 to 2013, before joining MarketAxess as Open Trading product manager. “[At BlackRock] we had the right idea and the roots that were put down there are now embedded in the MarketAxess offering,” he says. “We reached an inflection point: we thought the model was sound but we lacked the full breadth of market participants. The decision was whether to open up ATN widely to get full market diversity, or to partner with another firm that already had that aspect, and that was the genesis of the MarketAxess partnership.”

The Tradeweb/ATN connection was announced on 14 May 2014, but for MarketAxess, which has been linked in since April 2013, the connection provides around 1-2% of traditional client-to-dealer business. “Three quarters of the inquiry flow that we see in investment grade and high yield products is made public on that market list page, which represents around 5-6,000 line items per day,” says Schiffman.

Match making

Some are sceptical of the RFQ model. Antoniades says, “Really it has just automated an existing process. It hasn’t created avenues by which liquidity will be enhanced or discovered by new parties on the buyside.”

Vega-Chi uses order matching rather than an RFQ model, and does not offer post-trade services. It was bought by buyside block-trading platform Liquidnet in March 2014, and plans to open up its order-matching model to sellside participants.

Williams says, “The more small independent dealers that sign up to these platforms, the greater the pressure for banks to join them, but I don’t know if the platforms will get to the critical mass without the banks themselves.”

An alternative to creating a new market place is to deliver a more efficient model for bilateral trading. Algomi has launched a product for the sellside, which takes voice trading and tries to better automate the liquidity discovery process within a firm. This allows dealers to be more effective at delivering liquidity but without removing the relationship. It is in the process of beta-testing a product for the buyside which will perform a similar function.

“We see a ripple effect of information, so that a sales trader taking an RFQ is then connected to several other people in their organisation who can materially assist, perhaps by connecting liquidity from their clients or other sources,” explains Stu Taylor, chief executive at Algomi.

Chuang believes that the lethargy towards change stems from a lack of trading volume, but that brokers and fund managers will find the motivation when volumes erupt.

“The catalyst for electronic trading will be when volatility returns,” he says. “When Ben Bernanke mentioned tapering last year, trading volatility exploded. If the Fed raises interest rates, volatility will return. The market is broken and when interest rates rise and there is nowhere to go, traders have to go electronic.”

©BestExecution 2014


 

Post-trade : Preparing for T+2 : Brian Godins

[Photo: Brian Godins, HSBC]

SETTING THE PACE.

Brian Godins, managing director, global head equities operations, HSBC explains how the industry and the bank are preparing for T+2.


How are things progressing on the move to T+2 in October?

It is a busy time for regulatory change, with implementation of wide-ranging initiatives such as Dodd-Frank and EMIR filling much of the calendar. T+2 is being driven by another piece – the Central Securities Depository Regulation (CSDR) – so there is a challenging workload to be completed in a short space of time.

CSDR is moving forward with a good degree of traction, and the European countries which are impacted are focusing on switching from a T+3 to a T+2 settlement cycle on 6 October. A larger number of countries, approximately 24 at last count, are looking at shortening settlement cycles either at the same time or at some point in the not-too-distant future. While this is not a completely new concept for markets in Germany, Slovenia and Bulgaria (which already settle on a T+2 basis), the transition will undoubtedly be challenging for countries making the move.

What are and will be the greatest challenges?

It depends on what type of market participant you are. Many say it will be easier for broker-dealers, central counterparties (CCPs), exchanges, Central Securities Depositories (CSDs) and custodians. It will likely be the asset managers, hedge funds and other buyside participants who conduct portfolio valuation and performance around a T+3 or T+4 settlement cycle who will be most affected. They will have a lot of work to do in changing their operating models and valuation systems to provide same-day affirmations (SDAs). Firms that cannot do this may find themselves at risk of being fined. This has been a great opportunity, though, for vendors who have a wide range of post-trade solutions and can help fill a gap.

Another challenge is the effect it will have on the global client base of European markets. Most broker-dealers offer distribution channels where their executing entity is often very different from the contracting entity. It will put extra pressure on firms who manage these cross border transactions in different time zones and highlight the quality of the trade date processes in a shortened settlement cycle.

As the move to T+2 draws closer it actually appears as though one of the biggest challenges and hurdles will be how T+2 impacts the derivatives world.

How will T+2 impact the derivatives world?

It’s interesting that over the last couple of months people have moved from thinking about the scope of products and markets for securities, to ‘what does T+2 mean for the business’ and ‘what does T+2 mean for the derivatives landscape’.

As it stands there is every expectation that the physical exercise of a stock settlement on the back of a legacy OTC contract will be T+2, and naturally the same for new contracts entered into after the move to T+2.

There is also a growing momentum and sentiment around legacy equity swaps – where the underlying markets will move to T+2 – that the events should also move to T+2 (for example resets and terminations). The additional question as it stands is what action should be taken on these legacy trades? Should each firm close the existing contract, re-open it and reconfirm or should all legacy swaps have their existing contracts updated and re-confirmed?

This is likely to become clearer as the October deadline approaches.

Do you think we will see existing utilities offering more services for data, or new ones being created to help meet the challenges?

The quality and integration of data is central to these and many other regulations and I think they will evolve over time as utilities are inherently data driven. There is significant momentum in the utilities arena, evolution of existing utilities and creation of new ones, all looking to capture a sweet spot in the marketplace to solve industry challenges.

What is still fascinating is that if you took a sample of people from the industry across vendors, exchanges, institutions and others, you would probably get many different definitions and interpretations as to what ‘utilities’ means to them, but the common word would more than likely be ‘data’.

What is your role at the Association for Financial Markets in Europe (AFME)?

Over the past 20 months I have been working with AFME as part of the Post Trade division, chairing the Transaction Management Committee (TMC). One of the things the committee has been focusing on is looking at the prioritisation of initiatives, given the limited time everyone has at their disposal, and in particular the benefit and value-add versus the effort that is being put in.

At the time I joined, the broker-dealer committee was already actively working on a set of deliverables in the post-trade services space. This included a set of standards and documentation around SDA (same day affirmation), matching, allocation and confirmation for the securities product with a focus on cash equities and CFDs (contracts for difference). The goal for the committee is to influence, guide and move the industry forward in addressing the inefficiencies and lack of completeness in the T0 processing space. It’s the old adage that if you get the booking right at point of execution, either through quality of booking and/or trade date affirmation, matching and so on, then the problems downstream dissipate into thin air. Easier said than done!

The committee on which I sit has been working tirelessly on developing standards for the executing broker (EB) to EB flow, EB to prime broker flow and EB to buyside communities. These are an eclectic mix of flows with some commonality and some real ‘devil in the detail’ differences. The added catalyst of T+2 on the immediate horizon has given us the added ‘oomph’ to continue to push SDA in the marketplace. We will continue to provide clarity and guidance on these standards and expectations, to make this as mature and transparent a model as possible.

Turning away from the industry plans, what have you been doing at HSBC to prepare for the changes?

As this regulation touches many parts of HSBC, we have set up one programme covering all areas of the group. This allows us to run the programme efficiently if there are common requirements and also share best practice.

We look at all the European regulations to assess the impact they have on an individual and collective basis at the bank. For example, we have a working group on the post-trade securities environment which has representation from across the businesses. This allows us to focus on the bigger picture, and broader implications for the organisation.

[Biography]
Brian Godins assumed the role of global head of equities operations in 2013 having performed a number of fixed income and equity related line and change roles in HSBC over the past five years. In addition to his HSBC responsibilities, Godins is currently the chair of AFME’s Transaction Management Committee. He previously worked at Morgan Stanley for 13 years in line and change roles in securities, derivatives and FX, across market counterparties, institutional and prime brokerage clients. 

© BestExecution 2014

[divider_to_top]

Asia Trader Forum

Asia TraderForum (ATF) is a private membership group formed in 2012 by the buy-side for the buy-side and run by Institutional Investor for heads of buy-side trading at international asset management firms. The membership’s mission is to help support the development of fair and efficient markets throughout the Asia-Pacific. With twenty-four members across asset management firms representing well over US$500 billion in assets under management in Asia, ATF represents a broad view of international buy-side participants.
The group serves as a medium for buy-side traders in the Asia-Pacific region to discuss issues of common concern and express their views with a common voice. Since its establishment, ATF has been reaching out to regulators and exchanges throughout the Asia-Pacific region to start ongoing dialogues on market structure, policies and procedures while also conducting surveys and research for its members on trading and brokerage practices.
The Asia TraderForum was one of five professional associations that worked to create a due diligence template to assist the buy- and sell-side to comply with new Hong Kong Securities & Futures Commission (SFC) regulations. Effective January 2014, the SFC requires brokers to show that they have proper management and controls for their electronic trading systems, and investment managers to ensure that they understand those systems. ATF joined with the Alternative Investment Management Association (AIMA), the Asia Securities Industry and Financial Markets Association (ASIFMA), FIX Trading Community and the Hong Kong Investment Funds Association (HKIFA) to create a template to assist brokers, and for fund managers to comply with the rule. Brokers can complete this standard questionnaire and transmit it to their buy-side clients using Markit Counterparty Manager; the clients can then electronically indicate that they have received and reviewed the information.
HKEx is the only exchange within developed markets without a closing auction mechanism. Due to a previous failed attempt at implementing a closing auction mechanism in Hong Kong, HKEx has been hesitant about putting one in place without the support of local broker associations. ATF members are working to come up with a viable closing auction mechanism for Hong Kong that could be easily supported by HKEx and the SFC.
Contact details:
Tel: +852 2842 6950
Fax: +852 2842 7056
Email: info@iihongkong.com

Who’s Responsible For Transparency?

By John Kelly, Chief Operating Officer at Liquidnet
John KellyThe growth of electronic trading has provided many benefits to equity investors by creating new market efficiencies and reducing trading costs. But it has also led to a complex web of venues, liquidity types, and trading strategies that have contributed to a loss of confidence among investors globally. And while it may sometimes feel like “us vs. them”, it is in the interest of institutional trading firms to work with regulators to restore confidence and clarity to our market structure. Indeed, market participants should take the initiative without waiting for regulatory action. One place to start is transparency and control over liquidity interactions and use of customer information.
Since the beginning of regulated markets, institutional investors have needed mechanisms to trade their large blocks away from the “open outcry” of the exchange floor. These institutions – that manage billions of dollars of pension and mutual fund assets on behalf of millions of investors, often trading in block sizes exceeding 200,000 shares – still need to be matched away from the retail market so as not to cause price volatility for the rest of the market and trigger adverse price movements.
With the digitisation of markets, institutions now have a multitude of trading venues in which to trade. However, each of these venues caters to different, and sometimes contrasting, trading needs. And, while some platforms serve their original intent of providing institutions with a venue to trade large blocks, most have become destinations for algorithms that slice blocks into pieces, indistinguishable from the trades found in the displayed world of the exchanges. The highly competitive, complex, and interconnected market structure results in block orders being transformed into small orders that move quickly from one venue to another before eventually executing. This results in information leakage, adverse price movement, and an erosion of fund performance.
Market fragmentation has also presented challenges to regulators in defining the types of transparency that are, or are not, beneficial to the overall market structure. It is difficult for regulators to implement “level playing field” rules common to all participants and venues while giving consideration to the large number, variety, and co-dependency of trading venues. In such an environment, regulation could easily become a web of micro-managing rules that impede efficiency rather than stand as broad and powerful principles that enhance transparency and improve market structure.
For institutional investors, who have the greatest need for market transparency, the complexity of today’s market structure has increased the need for transparency while at the same time making it more difficult to define and standardise transparency across different business models.
Recently, we have seen some examples of regulators establishing sound principles that strike the right balance between individual participant protection and fostering broader market transparency and efficiency. One example is FINRA’s new reporting standards for venue trading volumes, which include an appropriate delay of two to four weeks, depending on the underlying liquidity available for a particular stock. Regulators in Canada and Australia have been successful in their efforts to improve the use of dark pools by implementing rules that govern trade size and establish price-improvement requirements for dark trading.
Each venue is different. The operators of each trading venue are not only best placed to understand individual customer needs and how they interact with liquidity and the various trading technology solutions, but also these venues have a responsibility to their clients to communicate this. One way forward is for the venues to develop tools that give customers ultimate control over how markets are accessed and how their data is managed and utilised.
Liquidnet recently launched “Transparency Controls”, a web-based interface that provides clients with the ability to set their preferences for interacting with different types of liquidity and the different purposes for which we may use their data to create opportunities to trade. The options are detailed and the system sets preferences in real time. We believe this new level of transparency and control pushes existing industry standards and addresses the concerns of the institutional community as well as regulators.
Still, the industry needs to do more. There needs to be an industry-wide focus on protecting and respecting client information. Trading venues need to be responsible guardians of what is, essentially, proprietary information. They should have clear and transparent principles on how they interact with venues, liquidity, and how their information is used. Importantly, they should give their clients complete control over these parameters through simple, user-friendly tools.
Institutions have every reason to expect this, and trading venues have every reason to provide it as a matter of good business practice rather than simply the result of regulator pressure.
Both regulators and market participants crave a new level of trust and transparency in the markets. Trading venues themselves must take a more active role in creating it.

Hong Kong Post-Trade Roundtable: Putting Together The Pieces

On the 4th June the GlobalTrading journal hosted its second annual post-trade roundtable. Last year’s was welcomed by a black rainstorm, and this year’s was met with fierce sunshine. Whether the weather reflects market mood or not remains to be seen.
The roundtable focused on a number of major issues that reverberate across the trade lifecycle and into the back office. The main topic under discussion revolved around the split between automation within a firm, offshoring processing to a cheaper environment, and outsourcing processing altogether. As is often the case, a careful balance must be found between saving costs and cutting back on functionality. The fundamental point remains that no matter what processes are moved where, and what is automated entirely or partly, the responsibility for post-trade compliance and risk management has to stay within a firm.
There was also a conversation around the changing value of labor within the firm: the value calculation between offshoring labor intensive processing to a cheap market where there is a high staff turnover, and keeping the work in a more expensive environment with better staff retention that encourages staff improvement is changing.
A major consideration leading on from this point is that in cases of controversy (e.g. Libor fixing, FX rate scandals), these areas often won’t be fixed until a person in a firm is made personally responsible or accountable for such transgressions. This is often forced by regulators. Culture within a firm does not drive compliance and surveillance technology forwards; the technology exists, but its uptake is limited until regulation forces firms to engage. As was said in the room, everybody believes that lightning will strike somewhere else first. It remains to be seen how many billion-dollar fines it takes before surveillance and compliance becomes an area where global firms implement solutions ahead of the regulators. As is often the case, however, the technology and people the regulators have access to need to increase rapidly in level and complexity to enable regulators to properly monitor and control their jurisdictions.
Dean Chisholm
Michael Karbouris
It was widely agreed that an area that needs attention first is the standardization of technology and terminology. There are ongoing efforts to push the uptake of FIX in allocations and confirmations, and standardize models and the way different elements of software talk to each other to alleviate many issues that arise during automation and outsourcing.
With increasing trades across asset classes and across different arms of the firm, the legacy silos that divided up responsibility and trading from the middle and back offices have to become a thing of the past. If systems cannot communicate with each other, it becomes very difficult to holistically manage risk and exposure across a firm, and it becomes almost impossible to properly monitor costs and commissions, especially across markets as disparate as Asia’s.
In summary, the event was a chance to discuss such theoretical matters against such market moving backdrops as the Hong Kong – Shanghai Connect operation that is unfolding in Hong Kong and regulation that is coming out of Europe and the US. It is a time for change and evolution in post-trade processing, and once the pieces are all put together, firms should be able to really see the difference and leverage real benefit.
Stuart Knowling
Jean Remi Lopez
Endre Markos
Peter Morris

MiFID, MMT And European Market Reform

By Arjun Singh-Muchelle, Senior Adviser, Regulatory Affairs, Institutional & Capital Markets, Investment Management Association
Arjun Singh MuchelleIt was a bright cold day in October 2011 as the clocks were striking 10:00 when MiFIR/D II was unveiled by Commissioner Michel Barnier.
Over two-and-a-half years later, on an equally bright and cold afternoon in April 2014, the European Parliament finally voted through the Level-I text of MiFIR/D II. There is a lot in the legislative proposal that will go some way to ensuring stability in the market: restrictions on high-frequency trading; stringent rules on market abuse; the introduction of equivalent treatment of investment and insurance products; and a high level of investor protection.
High-level investor protection
The EU Parliament’s confirmation that key investor protections are to be extended to insurance-based investments and structured deposits is very important and will benefit investors widely. We have always maintained that having different disclosure and conflict regimes, simply because some products were historically classified as insurance-based or deposits and not seen as investments, failed to look at the issues through the experiences of investors and savers.
By 2017, most of these differences will have gone. The PRIIP KID Regulation, in particular, will ensure that investment funds, insurance-based investment products and structured products will all have similar pre-sale disclosure requirements and this could come into force as early as late 2015.
On the issues regarding capital markets, however, asset managers, as fiduciaries for their clients, have doubts about whether MiFIR/D II will ensure efficiency and free and fair competition in financial markets. Before going into the details, one point needs to be made up front: asset managers only ‘trade’ on the market as fiduciaries and as agents; that means, they do not trade on their own behalf, but rather, have a legal and contractual obligation to manage assets in the best interest of their clients (the vast majority of whom are long-only, unidirectional funds such as pension funds). It is their legal responsibility as fiduciaries, therefore, that has framed our thinking and positions on MiFIR/D II.
The contentious ‘dark pools’
Were one to call a spade a spade, these pools would be called by their more representative name – ‘institutional liquidity pools’ – for that is their purpose; providing institutional investors with an alternative liquidity pool.
The underlying assumption of the volume cap mechanism is that there is no difference between the needs of global institutional investors and those of Mrs Jones, who may invest €200 a month. For the seasoned market watcher, this is utterly ridiculous.
The proposed volume cap mechanism limiting the ability of asset managers to trade in institutional liquidity pools will have a negative impact on the equity market, which was not a primary cause of the financial crisis. The restriction on the use of these pools limits investors’ abilities to trade what are often illiquid or bespoke trades. The cap will force these trades on to primary exchanges where they will face the adverse effects of having to trade with competing traders who will be able to see them coming. This will increase transaction costs for investors and make it more expensive to trade.
It is always important to recall that these pools allow investors to reduce the market impact of their trade and therefore the associated transaction and execution costs which benefit the end client. Research by LiquidMetrix has shown that these pools (in February 2014) offered an average price improvement of 14.21 basis points when compared with the best lit venues across Europe.
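The quoted improvement figure is measured in basis points against the best lit price. A small sketch, using hypothetical prices rather than LiquidMetrix’s methodology, shows how such a figure is computed:

```python
def price_improvement_bps(exec_price: float, lit_best: float, side: str) -> float:
    """Improvement of an execution versus the best lit price, in basis points.

    A buy filled below the lit offer (or a sell filled above the lit bid)
    represents price improvement for the investor.
    """
    if side == "buy":
        improvement = lit_best - exec_price
    else:  # sell
        improvement = exec_price - lit_best
    return improvement / lit_best * 10_000

# Buying at the midpoint (20.01) when the lit offer is 20.02:
print(price_improvement_bps(20.01, 20.02, "buy"))   # ~5 bps of improvement
# Selling at the midpoint when the lit bid is 20.00:
print(price_improvement_bps(20.01, 20.00, "sell"))  # ~5 bps of improvement
```

A midpoint fill on a one-tick spread captures half the spread; the 14.21 bps average quoted above reflects that improvement aggregated across many stocks and spread widths.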
As such, the IMA is calling for a phased-in implementation of the volume cap mechanism. On 1 January 2017, make the most liquid asset classes subject to the volume cap. On 1 June 2017, make the less liquid asset classes subject to the cap and finally, on 1 January 2018, make the remaining, least liquid asset classes subject to the cap. It is our view that a tiered, phased-in approach is necessary to mitigate any and all adverse effects of the volume cap mechanism. As order sizes and depth across primary exchanges have dramatically declined in recent years, the choice to trade blocks in such pools has become (and will remain) increasingly important.
In trying to increase access to market funding for SMEs in Europe, the volume cap mechanism achieves the opposite. In fact, all of the top 15 names hit hardest by the volume cap are listed on the FTSE 250 and FTSE Small indices. Since over 90% of European equities trade on primary exchanges with only a small proportion of the European equity market (9.46% in February 2014) traded in institutional liquidity pools, we have always maintained that post-trade transparency is more of a major concern than pre-trade transparency for asset managers. This is why the IMA has officially endorsed Market Model Typology (MMT) as the minimum standard for post-trade transparency.
Market Model Typology
MMT will take the various post-trade flags from the different European exchanges and convert them to a single, unified post-trade language. This way, a riskless principal trade on Euronext will be converted to the same post-trade flag as a riskless principal trade on the LSE. By creating such a harmonised standard for post-trade transparency for the European equities market, MMT will be the first step towards achieving a consolidated tape across Europe.
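As an illustration of the idea, not of the actual MMT field codes, a normalisation layer can be sketched as a lookup from (venue, native flag) pairs to one harmonised label; all flag values below are hypothetical:

```python
# Hypothetical venue-specific trade flags mapped to one harmonised label.
# The real MMT field codes are defined by the MMT initiative and differ
# from the illustrative values used here.
VENUE_FLAG_TO_MMT = {
    ("EURONEXT", "RP"): "RISKLESS_PRINCIPAL",
    ("LSE", "rp"):      "RISKLESS_PRINCIPAL",
    ("EURONEXT", "DK"): "DARK_TRADE",
    ("LSE", "dk"):      "DARK_TRADE",
}

def normalise(venue: str, flag: str) -> str:
    """Map a venue's native post-trade flag to a single harmonised label."""
    return VENUE_FLAG_TO_MMT.get((venue, flag), "UNCLASSIFIED")

# The same economic event reported on two venues maps to one label,
# which is what makes cross-venue aggregation (a consolidated tape) feasible:
assert normalise("EURONEXT", "RP") == normalise("LSE", "rp")
```

The design point is that harmonisation happens at the data layer: once every venue’s flags resolve to one vocabulary, a consolidated tape becomes a matter of aggregation rather than interpretation.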
The largest failing of MiFID I in relation to markets was the lack of a consolidated tape in Europe. Asset managers have been promised such a tape for decades but neither regulators nor data providers have delivered. MiFIR/D II does very little to solve this failing. The new legislation has split the tape requirements into two; one tape for equities and another for fixed income. The earliest asset managers will have these tapes however, is not until 2020.
It is not just the equity market that will be affected by these rule changes. Fixed income markets will also face significant structural changes.
The European bond market is already relatively illiquid, with liquidity concentrated in primary issuance with a rather shallow depth on the secondary market. The pre-trade transparency requirements for these markets will have a detrimental impact on the already scant liquidity. The implementation of the pre-trade transparency requirements for fixed income therefore, needs to be treated with caution. Mis-calibrated pre-trade transparency requirements may have (and probably will have) a disruptive impact on the fixed income markets.
Efficient calibration of post-trade transparency, of what has actually occurred in the market, is far more important (and useful) to investors, rather than pre-trade information that may have no bearing on market realities.
Finally, to the ‘flash boys’.
A lot has been written about high-frequency traders. But speed is not in itself a bad thing. In fact, during times of severe market stress, speed is vital. There is, however, a big difference between those who use algorithms to trade (such as asset managers who may use Order Routing Systems) and those proprietary traders who depend on high-frequency algorithms to exploit minute price differences in very small trade sizes. The latter is unhelpful. As liquidity takers, proprietary high-frequency traders may make effective and meaningful bid-offer crosses difficult for asset managers.
Primary exchanges aren’t off the hook here, either.
By generating revenue from granting high-frequency traders access to primary exchanges, those exchanges are contributing to the problem rather than participating equally in mitigating the adverse effects that arise from the prevalence of ‘fair-weather liquidity’. As with most financial legislation, however, much is left to the European Securities and Markets Authority (“ESMA”). With over 100 regulatory technical standards and delegated acts that ESMA will need to draft and the European Commission to adopt, ESMA is facing a rather large task. ESMA, however, has limited resources and a small number of staff. This will make it all the more important to make full use of the information that market participants can provide.
The IMA is working with other trade associations from both sides as well as a number of trading venues to ensure that we are, collectively, able to assist ESMA by providing it with the data and evidence it will need to ensure the effective implementation of MiFIR/D II.
A lot has been said about the delicate balance to be struck between stability, efficiency and free and fair competition in the markets. However, the two go hand-in-hand, so it is a false comparison.
Creating efficient markets that engender free and fair competition between all actors, while putting in place stringent rules on market abuse, axiomatically leads to stable markets. Together they generate economic growth and development.
