
Trading: Derivatives

SCALING UP FOR REBOUND?


Despite the dip in derivatives trading in 2012, trading technology suppliers are reinforcing their platforms to cope with an expected surge in volume. Dan Barnes reports.

Derivatives trading volumes fell year-on-year in 2012 following a lacklustre performance from underlying cash instruments. According to industry body the World Federation of Exchanges (WFE), primary equity markets recovered from their slump as global market capitalisation increased by 15% to reach US$54 trillion, a return to levels last seen in 2010.

This positive turn was not shared by secondary equity markets, which saw trading turnover on electronic order books (EOBs) fall by 22.5% globally from 2011 levels. By volume there were 9.7 billion trades, down 14.3% from 2011, while the average transaction size grew slightly from $8,100 to $8,300, possibly showing a move towards less automated trading. The number of derivative contracts traded on-exchange fell by 21%, the first fall since 2004.

Nevertheless, regulatory changes to the derivatives market, driven by the 2009 mandate of the G20, are intended to move as much of the over-the-counter (OTC) market onto electronic platforms as possible, while making OTC trading more expensive. Technology vendors expect that to drive electronic trading volumes back up, although the scale and speed of change is unpredictable.

“We are currently conducting a study that has confirmed that on the voice side of the broker-dealer business, the average size of a deal is around US$60m,” says Mitch Stonehocker, director of the Front Arena platform at system supplier SunGard. “On the electronic side the average deal size is around $2m, but the total number of deals is vastly larger than on voice. It’s reasonable to conclude that the more electronic the market becomes, the higher the overall volume of trades. It is the same thing that happened in stocks.”

“There is a lot happening in the brokers’ space and it is not easy for them to understand what is going to be traded on exchange and what isn’t,” warns Emmanuel Carjat, managing director at connectivity provider TMX Atrium. “It’s very difficult for them to understand how much volume might go there; part of it is that the size of deals is extremely high. Looking back at equities five years ago, ticket sizes were an order of magnitude larger than they are today. That fall is what has driven a lot of the increase in volume.”

Preparation

Trading technology providers are setting themselves up to support clients in this new high-volume world. The firms themselves may not have strategies that depend on trading at high speeds, but the increase in the speed of market data and price movements generated by high-frequency trading (HFT) firms is a phenomenon that more sedate firms still have to handle.

The drive to electronify the market is pushing traditional firms to adapt their operating models, notes Jean-Philippe Malé, head of OTC at front-end technology provider TradingScreen. “Even people who didn’t use to trade electronically will have to get onto a platform like ours; people will be badly hurt if they don’t adapt to this new world,” he says.

David Little, director of strategy and business development at software house Calypso, notes that although his clients, including many tier-two and tier-three banks, are far from the HFT end of the spectrum, they still need to keep up with the new market conditions.

“We are improving our low-latency capabilities to allow the processing of a large number of trades hitting the system at once and having the positions, risk matrix and P&L displays in near real time,” he says. “It’s far from algorithmic trading but it allows human traders to trade with confidence, knowing their prices are up to date and the risk is being calculated in near real time.”

Traders must be able to assess risk and determine profitability, and react quickly to changes in market prices in order to manage portfolios, mitigate their market risk and perhaps lock in profitability. That requires fast updates of positions and measures of risk and P&L, even for firms that are not using algorithmic trading tools.
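To make that concrete, the sketch below shows the kind of incremental bookkeeping that keeps a position and its P&L current as fills and price ticks arrive, rather than recomputing everything in a batch. It is a minimal illustration rather than any vendor's implementation; the field names and single-instrument scope are assumptions.

```python
# Minimal sketch of near-real-time position/P&L keeping (illustrative only).
# Each fill and tick is an O(1) update, so a trader's display stays current
# without batch recomputation.
from dataclasses import dataclass

@dataclass
class Position:
    qty: float = 0.0        # signed quantity: positive = long
    avg_cost: float = 0.0   # volume-weighted average entry price
    realised: float = 0.0   # P&L locked in by closing trades
    last_px: float = 0.0    # most recent market price seen

    def on_fill(self, qty: float, px: float) -> None:
        """Apply one execution; qty is signed (buy > 0, sell < 0)."""
        if self.qty == 0 or (self.qty > 0) == (qty > 0):
            new_qty = self.qty + qty                 # extending the position
            self.avg_cost = (self.avg_cost * self.qty + px * qty) / new_qty
            self.qty = new_qty
        else:
            closed = min(abs(qty), abs(self.qty))    # closing (some of) it
            sign = 1.0 if self.qty > 0 else -1.0
            self.realised += sign * closed * (px - self.avg_cost)
            self.qty += qty
            if self.qty * sign < 0:                  # flipped through zero
                self.avg_cost = px
        self.last_px = px

    def on_tick(self, px: float) -> None:
        self.last_px = px                            # mark-to-market update

    @property
    def unrealised(self) -> float:
        return self.qty * (self.last_px - self.avg_cost)

pos = Position()
pos.on_fill(100, 10.0)   # buy 100 @ 10
pos.on_fill(-40, 12.0)   # sell 40 @ 12 -> realised = 80
pos.on_tick(11.0)        # unrealised on remaining 60 = 60
print(pos.realised, pos.unrealised)
```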

“Our graphical user interface has to have some of the flexibility reduced in order to allow a faster appraisal of the data; you don’t want thousands of columns in the grid,” says Little. “Those screens can still be accessed for tasks where they are useful, doing reporting for example, but a trading screen is designed to be fast and complements the slower reporting screens that you can use for offline slicing and dicing.”

The trading of instruments such as swap-future products has required some additional work for system providers. These contracts have long-standing precedents at NYSE and ERIS Exchange, but CME and IntercontinentalExchange (ICE) launched new versions at the end of 2012, which provide firms with a way of getting swap-like exposure without subjecting them to the Dodd-Frank rules on swap trading, clearing and processing.

Malé asserts that TradingScreen has a very flexible system that has allowed it to adapt to these new products quickly. “Our platform was built in the early 2000s by removing all concept of an asset class from the core system; it can trade tables and chairs if we want,” he says. “We can create custom instruments with custom trading rules; for example, with CME and ERIS Exchange we can trade interest rate future-like contracts with very custom cash flows. We had a few weeks of work on our side to accommodate swaps of various types. We have done all of that and those changes are already being integrated with some of the customers.

“Unlike some standard trading tools, we do a lot of integration with our customers in the back office, middle office and with risk. That’s not purely for trading purposes; it’s for position tracking, for risk, and to handle workflow, so it is a lot more complicated than the trading part.”

The end result

By trading futures contracts with fixed tenors, firms lose out on the flexibility of trading a swap; they are also committed to trading and clearing with the exchange operator that has issued the future. Dealing with multiple venues is pushing trading technologists to adapt their platforms further.

“Software can’t be optimised for one exchange anymore,” says Mike Glista, director of order routing at trading technology provider CQG. “We’ve been continuing to optimise our systems to operate with certain hardware, and trying to figure out what actually occurs at different exchanges. For example, at one exchange we are dealing with, it is better to do a cancel/new order than to run a cancel/replace. There is a lot of tuning to the particular matching engines for the higher-speed trader.”
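A hedged sketch of that kind of per-venue tuning follows: the routing layer consults a table recording which amend pattern each matching engine handles best. The venue names, message dictionaries and `send` transport are invented for illustration; a real system would use each venue's actual FIX or native messages.

```python
# Illustrative per-venue amend tuning: some matching engines are better served
# by cancel-then-new than by a single cancel/replace. All values are made up.
from enum import Enum, auto

class AmendStyle(Enum):
    CANCEL_REPLACE = auto()   # one cancel/replace message
    CANCEL_NEW = auto()       # a cancel followed by a fresh order

# Empirically tuned, one entry per matching engine (hypothetical results)
VENUE_AMEND_STYLE = {
    "VENUE_A": AmendStyle.CANCEL_REPLACE,
    "VENUE_B": AmendStyle.CANCEL_NEW,     # measured faster on this engine
}

def amend_order(venue: str, order_id: str, px: float, qty: int, send) -> None:
    """Amend an order using whichever pattern the venue handles best."""
    style = VENUE_AMEND_STYLE.get(venue, AmendStyle.CANCEL_REPLACE)
    if style is AmendStyle.CANCEL_REPLACE:
        send(venue, {"msg": "cancel_replace", "id": order_id, "px": px, "qty": qty})
    else:
        send(venue, {"msg": "cancel", "id": order_id})
        send(venue, {"msg": "new", "px": px, "qty": qty})

# e.g. amend_order("VENUE_B", "abc-1", 99.5, 200, send=print)
```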

Having a greater amount of choice also complicates the workflows that the trading systems have to process, says Steve Grob, director of group strategy at platform supplier Fidessa. “At the front end you need to know how a euro swap will be traded alongside a bund on Eurex. Then that throws up issues around where you clear,” he notes. “Depending on your relationships with different clearing houses, you may want to go down one path and not another. The concept is emerging that you might have the best price as one part of your calculation and then the optimal destination as another, once you take into account any potential risk or margin offset. That is a big part of what the new workflow will look like.”

To reflect this, the software vendors are giving people the tools to make those decisions, with a timely level of information and the flexibility to choose a path for execution.

“It comes down to being able to put all of these instruments in one place, and make your own decisions about their equivalence or otherwise,” says Grob. “Different firms have different views on how well one instrument could offset another. People will need the kind of intelligent tools to trade in and amongst different venues that we developed when multi-market structures emerged in the US and Europe, basically a combination of smarter order routing, algos and analytics. That becomes much more important as the number of destinations and venues grows, becoming more complicated and intertwined.”
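One way to picture the two-part calculation Grob describes is a venue score that adjusts each quote by the margin or risk offset available at that destination's clearing house. The sketch below is deliberately simplified; the quotes, offset figures and basis-point adjustment are assumptions, not any vendor's routing logic.

```python
# Toy 'optimal destination' choice: the best headline price is only one input;
# the margin offset available at each clearing destination is the other.
def best_destination(quotes: dict, offset_bps: dict, side: str = "buy") -> str:
    """quotes: venue -> price; offset_bps: venue -> margin saving in bps."""
    def effective(venue: str) -> float:
        px = quotes[venue]
        adj = px * offset_bps.get(venue, 0.0) / 10_000
        return px - adj if side == "buy" else px + adj
    # a buyer wants the lowest all-in cost, a seller the highest all-in proceeds
    return min(quotes, key=effective) if side == "buy" else max(quotes, key=effective)

quotes = {"VENUE_A": 101.02, "VENUE_B": 101.01}
offsets = {"VENUE_A": 3.0}   # e.g. portfolio margining against existing positions
print(best_destination(quotes, offsets))  # VENUE_A, despite the worse screen price
```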

©BestExecution

Dodd-Frank, EMIR and other regs drive buy-side to explore new tech options at TSAM Europe

According to the 2013 OTC Derivatives Trading Trend Survey from IPC Systems, a third of hedge funds, investment banks, broker-dealers and exchanges say they are not yet ready to deal with a raft of new incoming regulations, such as the requirement to clear over-the-counter (OTC) trades via clearing houses…

[Source: www.bobsguide.com]

Are EMIR Implementation Dates Fixed?

Walt Lukken, President and CEO at Futures Industry Association

With ESMA’s (European Securities and Markets Authority) regulatory technical standards (RTS), which codify the European Market Infrastructure Regulation (EMIR) into an applicable set of rules, entering into force on 15 March 2013, the implementation timeline for EMIR has become much clearer.

The CCP registration process
Under EMIR, CCPs apply for authorisation to clear. It is expected that this will happen sooner rather than later: CCPs will have the required paperwork ready to submit as soon as the RTS enter into force, and rumour has it that up to 25 CCPs (EU as well as non-EU) may be standing in line to apply. The national competent authority then has up to six months to review the application and authorise the CCP.
When does clearing become mandatory?
The clearing obligation needs to be defined and put into its own regulatory standard; this is nothing more than defining which products ought to be cleared via a clearing house, and in what time frame. The national competent authorities (NCAs) will have one month to notify ESMA of the classes of OTC derivatives already cleared by CCPs in their jurisdiction. With the authorisation of a CCP by an NCA, a notification of the clearing obligation should be issued to all market participants. ESMA then has up to six months to prepare a draft RTS specifying the classes of derivatives to be cleared, and from when.

[Source: tabbforum.com]

Asset Optimization: Moving Beyond Collateral

Chris Zingo, Calypso Technology

Many of the market structure changes initiated by global regulatory reform are creating opportunities, but they also are creating constraints, particularly around balance sheets and collateral. Calypso Technology’s Chris Zingo says firms need to move beyond the concept of collateral optimization to asset optimization in order to maximize business opportunities at the enterprise level. To enable enterprise-level asset optimization, however, firms need an infrastructure that provides insight into risk and collateral obligations in real time, he tells TABB’s Alex Tabb…

[Source: tabbforum.com]

Aiming For The Clouds

Senior Executive Vice President Michael Lin of the Taiwan Stock Exchange goes through new developments and ongoing challenges for the TWSE.


What was your implementation procedure for your new connectivity platform, and what were you looking to achieve?
Basically, the implementation of FIX/FAST was for two reasons. One was that we see the following of international standards as a major part of our agenda for the development of our IT systems. Secondly, in the Taiwan capital market the number of foreign institutional investors has been steadily increasing.
Foreign institutional investors already account for one-third of the market capitalisation, and they are very active in investing for their clients. Furthermore, many Taiwanese investors are also interested in investing in offshore products. So I think these two reasons are basically why we really wanted to implement FIX/FAST.
What is the current level of FIX Protocol implementation?
We implemented the FIX Protocol three years ago. But, honestly speaking, at the current level of implementation, I don’t think we have many trading activities placed through the FIX Protocol. The reason is that, on our internal systems, we still use our proprietary protocol, TMP. It means that even though the brokers can place an order using the FIX Protocol, when the message enters the trading system a conversion to TMP is needed. That means that for messages placed through the FIX Protocol, the performance is not good enough due to the need for conversion.
However, we are now gradually getting more brokers interested in placing their orders through the FIX Protocol. And we are constructing a continuous trading mechanism; under the new system, the conversion from the FIX Protocol to TMP is no longer necessary.
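The performance cost Lin describes is easy to picture: every inbound FIX order must be parsed and re-encoded into the exchange's proprietary format before it reaches the matching engine. The sketch below illustrates that hop; only the FIX tag=value wire syntax is standard, while the fixed-width "TMP" record is invented for the example, since the real TMP layout is proprietary.

```python
# Illustration of a FIX -> proprietary-protocol conversion hop. The 'TMP'
# record format here is hypothetical; only the FIX tag=value wire syntax
# (fields separated by SOH, 0x01) is real.
SOH = "\x01"

def fix_to_dict(raw: str) -> dict:
    """Split a FIX message into a tag -> value map."""
    return dict(field.split("=", 1) for field in raw.strip(SOH).split(SOH))

def to_tmp(order: dict) -> bytes:
    """Re-encode the order as a made-up fixed-width record."""
    return "{:<12}{:>9}{:>9}".format(
        order["55"],    # tag 55: symbol
        order["38"],    # tag 38: order quantity
        order["44"],    # tag 44: limit price
    ).encode("ascii")

raw = SOH.join(["35=D", "55=2330", "38=1000", "44=98.5"]) + SOH  # NewOrderSingle
print(to_tmp(fix_to_dict(raw)))  # this extra translation step is the latency cost
```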
Where is the Taiwan Stock Exchange amongst its global peers, as well as amongst its Asian counterparts?
We have seen some clear trends in the development of connectivity, especially around the FIX Protocol and FIX/FAST.
Along with the rapid growth of globalised capital markets, we can find some exchanges that are eager to establish closer market connectivity. They are looking to build up close market connectivity for brokers, for example, the New York Stock Exchange proposed to establish connectivity between themselves and the Taiwan Stock Exchange.
The purpose of this connectivity is that it allows brokers in Taiwan or in New York to access the other market easily, just through the connectivity built by the two exchanges. Previously for any brokers in Taiwan, if they wanted to trade on the New York market, they had to find a broker in New York, and they needed to allocate resources to the connectivity solutions through service providers. That was a huge expense, and it was also very time-consuming.
However, once the connectivity between NYSE and TWSE is established, then we can make the connection for the brokers much easier and less costly.
Other exchanges are also looking at the same idea, for example the London Stock Exchange, and even in Asia we can see the Tokyo, Korea and Singapore exchanges exploring connectivity solutions.
So, I think we should focus on this area. Internally, we have also held discussions on helping the brokers in Taiwan to extend their global market reach. I think IT investment and IT management are not easy for many of the local brokers as they are quite small in scale. So we are now really focused on looking into this area.
What are you trying to achieve this year?
This year there are several big projects for us. The first one is we are building our so-called “next generation trading system.” We finished the coding at the end of last year, and this year we are working on the testing. We hope by the end of this year, our new trading system can go live. This new trading system really matters because, first of all, we have changed the traditional proprietary platform to open architecture. We changed the language from COBOL to C++ and we also introduced a new middleware.
In terms of application structure, we have taken this opportunity to make the system more structural, separated into modules, and more flexible. We hope this change will make our business development easier.
Secondly, we are also building a new data centre. We thought about having a new data centre for many years and, fortunately, we are now building it. Construction is underway; we hope to finish the building next year and to start operations in early 2015. The significance of this data centre is that it is not only a data centre in the traditional sense: we have constructed it around the “cloud concept”. This means we have prepared the building to very, very high specifications in terms of cooling, power consumption, etc.
Also, very importantly, we are also planning cloud services for capital market members in Taiwan, which is the third project I would like to share with you.
We have organised a task force and are preparing what we are going to provide in terms of cloud services. So far, when other exchanges around the world provide cloud-related services, they have focused on areas such as co-location, data services, etc. Seldom have exchanges really focused on the true concept of the cloud. But the Taiwan Stock Exchange is looking into true cloud services; for example, we are considering providing trading systems on the cloud. Until now, only the New York Stock Exchange really offers a comparatively complete or integrated set of cloud services for community members. However, I think what we are considering is an even broader scope of cloud services.
The cloud services matter because, as I mentioned, most of the Taiwanese brokers are small in scale, so it is very difficult for them to maintain IT development on a global scale. So once we can provide cloud services for them, it is not only cost saving, but I think they can gain the benefit of faster time to market whenever they need to match the requirements of regulatory change, or changes in the business environment.
What challenges have you faced?
From my viewpoint we have been very lucky, because since the 2008 financial crisis, the financial sector around the world has been struggling; however, both the regulator and our board members are really supportive of IT development within our exchange.
In terms of IT challenges, firstly we need to maintain the original system. We need to make the whole system, not only the trading system, but the whole IT system within our exchange, reliable. At the same time, we have our three big projects, so I think it is very challenging for us to maintain the current system and concurrently we want to spend a lot of time on new development.
The second challenge for us is to find good talent, to retain the good talent and to train them.
A further challenge is in trading volume. Last year we lost almost 20 percent of trading volume. So far in 2013, it seems we have had some bounce back, but not entirely.
However, we’re always proud of the Taiwan capital market; we have a very good turnover, and a very good regional P/E ratio, so I think trading volume will return. Quality really matters for the Taiwan capital market because compared with Hong Kong, Japan, Korea, even Shanghai, we are smaller. We need to rely on good corporate governance, P/E ratio or turnover, etc.
So it should be a challenging year, but we hope we can turn it into good news.

Towards A Consolidated Tape

As the regulatory juggernaut gathers pace, Alexander Neil, Head of Equity and Derivatives Trading at EFG Bank examines the issues behind the tape, and what the buy-side wants. 
Many of my buy-side peers have given up hope on a consolidated tape (CT), but the success of a CT is absolutely paramount now, even more so than it was a few years ago. Not just for industry insiders, but for politicians and the outside world to be shown that these can be transparent markets and that we are not penalised by misguided efforts to force volume onto lit markets or to abolish dark pools and volume-enhancing activity such as certain HFT strategies. In such a low-volume, low-commission environment, I feel the stakes are especially high to get this right from the first day, and not let it drag on and into MiFID III. It shouldn’t be this hard to track trades in Europe, and it’s funny to think that whilst we’ve seen a real race to zero in pre-trade latency, the post-trade space is being drawn out over years!
The Holy Grail is a ‘quality’ tape of record at ‘reasonable cost’. But what is a fair price for market data? Is it really something that can be left to market forces, or is it one of those things that should be regulated, like electricity prices? After all, there are social responsibility aspects to market data, as ultimately higher costs for the buy-side are implicitly passed on to the broader (investing) public.
There were initially three routes that the European Commission (EC) wanted to take us down. The first route was to employ the same model as it did for execution and let the invisible hand of the market find the best solution and pricing through healthy competition. The second option was for a prescribed non-profit entity to manage the CT, and the third option was a public tender with just one winner. The EC seems to be leaning towards the fully commercial approach, and it has set the stage for a basic workflow where APAs (approved publication arrangements) collect and pass on the data to consolidated tape providers (CTPs). But if market forces alone could find a compromise between cost and implementation, we would have an affordable and reliable European Consolidated Tape (ECT) in place already, and MiFID II could instead concentrate on new problems.
So my first concern with the purely commercial approach is that so far, it hasn’t worked; incumbent exchanges are still charging pre-MiFID levels for their data (despite, or indeed because of, their diminished market share in execution), and the only real effort to break the stalemate (namely, the MTFs throwing down the gauntlet) will end up just penalising the buy-side more in the short-term. If the regulator doesn’t address data pricing head on, the buy-side may well end up suffering the effects of a scorched-earth move (wasn’t ESMA granted more powers than its predecessor CESR after all?).
Industry Initiatives
However, it’s not all bad. The COBA Project has recently announced a proposal which promises to address these commercial obstacles and has initial support from exchanges and other venues contributing more than 50% of total turnover. Its solution establishes a new commercial model for consolidated tape data which lowers the cost and incorporates the best-practice recommendations developed by FPL and other industry working groups. The best practices detail how trade condition flags should be normalised, thereby enabling consolidation of trade data across exchanges, MTFs, OTC and even BOAT. FPL’s best-practice recommendations also bring together wide representation from across the industry, and have concentrated on data standardisation (including timestamp synchronisation and a clear distinction between execution timestamps and reporting timestamps).
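As a rough illustration of what flag harmonisation means in practice, the sketch below maps venue-specific trade-condition codes onto one canonical vocabulary and merges prints on execution time rather than report time. Every code and field name is invented for the example; the FPL recommendations define the real mappings.

```python
# Toy consolidation step: normalise each venue's trade-condition flag to a
# shared vocabulary, then merge prints ordered by execution timestamp.
# All codes and field names below are illustrative, not a real standard.
VENUE_FLAG_MAP = {
    ("VENUE_A", "AT"): "AUCTION",
    ("VENUE_A", ""):   "LIT",
    ("VENUE_B", "DK"): "DARK",
    ("VENUE_B", "OT"): "OTC",
}

def normalise(print_rec: dict) -> dict:
    """Rewrite a venue-specific print with a canonical condition flag."""
    key = (print_rec["venue"], print_rec["flag"])
    return dict(print_rec, flag=VENUE_FLAG_MAP.get(key, "UNKNOWN"))

feed_a = [{"venue": "VENUE_A", "flag": "AT", "px": 10.00, "exec_ts": 1.0}]
feed_b = [{"venue": "VENUE_B", "flag": "DK", "px": 10.05, "exec_ts": 0.5}]

tape = sorted((normalise(p) for p in feed_a + feed_b),
              key=lambda p: p["exec_ts"])   # execution time, not reporting time
print([p["flag"] for p in tape])            # ['DARK', 'AUCTION']
```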
The COBA Project is spearheaded by two former exchange and MTF people, and seems to be the most ambitious in terms of setting a deadline (Q2 ’13). For their sake I would like to see that good work recognised, but the EC has not officially endorsed them, and I see this as one of the main failings so far. Without this endorsement or intervention, I worry that the whole effort will run out of steam. And if that happens (if the regulator doesn’t give the industry a nudge), I worry it will ironically signal the failure of the free-market approach and the regulator will have to make an embarrassing U-turn and go for the prescribed, utility model. Remember the case of BOAT, which had the potential to become an ECT, but perhaps wasn’t endorsed enough.
So we’re in a position where the exchanges and data vendors are rushing to try and come to a mutually beneficial solution BEFORE the regulator steps in and forces a US-style consolidated tape, and by doing so, potentially remove the commercial benefits for exchanges and vendors.
Being a CTP in itself will be a tough business though, and I wonder if there’s such a thing as a commercially viable CTP proposition. Not only will they operate in a highly regulated business, but a few years down the line there’s the possibility that Europe goes the same way as the US and starts looking at moving away from a CT, with users instead paying direct fees to the exchanges (a sort of parallel industry, not quite direct competition). Not only that, but because under current proposals their product will be free after 15 minutes, I expect more investors might just accept a 15-minute lag and get the data for free.

Pre-Trade BBO
The lack of a pre-trade BBO only lessens the commercial prospects, at least if we look at the US as an example, where SEC-mandated best execution requires referencing the NBBO and the business model links directly to the pre-trade snapshot. I would welcome a US-style model where exchanges provide their data free of charge to a (single) consolidated tape provider in exchange for a cut of the revenue based on use by final clients. In the US, basic time-and-sales data is treated almost as a commodity and is non-proprietary (meaning it belongs to all exchanges rather than a single exchange); exchanges are then paid out of subscription revenue from the consolidated tape according to an SEC-approved formula, and some of that revenue is paid back to customers. This practice of redistributing CT revenues back to users/clients has propagated the rise of HFT, though, and European politicians would surely not want to be seen to be encouraging this.
Pre-trade figures heavily in the US model, which so far hasn’t been needed in Europe because we don’t have the same ‘trade-through’ rule here, but rather a more principles-driven model of ‘best execution’ that is more open to ad-hoc interpretation. So yes, the lack of pre-trade data might make it less profitable, and whilst the US commercial model might be viable in Europe, I don’t think market forces alone will get us there. I would welcome the regulator coming in a bit stronger, just as they did in the US.
As a buy-side trader on a global execution desk, I would welcome proposals that used the US consolidated tape pricing model as a rough benchmark (whilst recognising the much more complex picture over here). I understand that a full set of data costs something like eight times less stateside.

Multi-vendor solution
My gut feeling is that the multiple-operator commercial approach might see the light of day, given recent industry work on data harmonisation and my assumption that the regulator will eventually set clear rules over data ownership and pricing. But with multiple providers comes the possibility that we will be penalised by high implementation costs: just as execution fragmentation required a lot of IT investment by mid-tier brokerages (SOR, etc.), I worry that a multiple-vendor consolidated tape will lead to fragmentation of data that requires large IT investment by mid-tier buy-sides (aggregating what, in my mind, should be a commoditised product, in the post-trade space at least). Secondly, I worry that with a purely commercial model we might see the same pitfalls as with execution fragmentation, i.e. an initial scramble to gain market share (free data from the MTFs, some sort of rebate or other incentivisation), followed by a maturing market with lower rebates, data charged for by all parties and 90% of the market controlled by two players. This boom-and-bust cycle might be good for the industry long-term, but in the short term it can end up penalising the buy-side.
If we step back a bit, I wonder if a commercial model is fundamentally going to work for what will surely be a very difficult business: a potentially persistently low-volume environment, squeezed at both ends (exchanges keeping prices high, and the buy-side not confident enough in the data to pay a premium), with the data free after 15 minutes. I wonder if people will be falling over themselves to become CTPs.
Whatever model we end up with, I would want to avoid the old model where you pay the exchange to report your trade, and then you pay them again to get the information back! This raises the question of who owns the rights to that data.
Who pays?
Who will pay? Who should pay? Who should be making money off the back of this data and what margins are acceptable? This point might need a more heavy-handed approach from regulators to gently encourage incumbent exchanges to revise their pricing structure. The incumbent exchanges insist that their data is worth the premium, sometimes pointing to the crucial closing volume prints (essentially still a monopoly business). But there have been calls for the regulators to allow MTFs to step in at the closing auction if the primary market cannot fulfil that duty due to IT failures for example, and I would support this initiative. If this happened, the incumbent exchange would probably find it harder to justify their current pricing model.
On pricing and data ‘ownership’ I feel we need a full-on regulator approach to ensure that prices don’t just come down marginally, but come down ‘enough’. No business is going to accept a hit to its top line or margins unless forced to. I believe the MEP Kay Swinburne has been vocal in doubting whether a purely principles-driven approach is going to be enough, and I support her in this.
The bottom line
I would like to see something that trickles down to benefit the real economy, or end users, and not just boost market share for exchanges, HFT and a (potentially) commercial consolidated tape enterprise, or any other mutually beneficial arrangements that do not allow for some sort of wealth creation across the industry (whether it be profit sharing, or implicit cost savings).
Just as trading fragmentation ultimately helped lower explicit execution costs, I would hope that data fragmentation would do the same, and not just create new industries with little immediate benefit to long-term investors. Especially because market data costs are trickier to pass on to our final private banking clients, and ultimately we want our final clients to benefit from lower overall equity investing costs (execution, clearing and market data).
Something I would also like to see is more transparency on ETFs, which currently feel even further away from getting a consolidated tape, and for which reporting obligations are even laxer than for equities (for example, trades are sometimes double-reported, once at creation or redemption of new units and then again if a secondary-market trade is concluded). So this is another area where I would want the regulator to help forge a complete view of trading activity.
Besides data pricing and ownership, a fundamental problem is data quality, including flag harmonisation. Here, I don’t believe we’ll hit a brick wall like we have on pricing, thanks to the industry initiatives mentioned earlier. Ultimately, high-quality data (that doesn’t unfairly penalise any part of the investing community, whilst allowing for EC-wide rules on delayed reporting for large blocks) is something I think everyone is looking for. There’s nothing to be lost by having it.
In terms of data quality, I would be happy to see something that helped us form an accurate view of OTC trades. We still can’t accurately spot OTC trades in the market, so how are we supposed to judge whether they are meaningful and make informed trading decisions on the back of them? Will the new OTF classification kill or help OTC/dark pool trading? I welcome the proposal that Approved Publication Arrangements (APAs) will harmonise OTC reporting, but I worry that the current high-level principles are too vague and could see us right back in the murky position we find ourselves in today (have you tried to track a mid-cap Swiss stock across lit, dark and OTC venues recently?), with multiple mechanisms, websites and flags and myriad ways to obfuscate prints that are important to the price-discovery process (sometimes it is very difficult to answer the question, ‘how much traded, and at what price?’).
So I would like to see very clear identification of whether a trade was lit, dark or OTC, and I would want pan-European standards for delayed reporting. It would be unreasonable to expect all OTC trades to suddenly become on-exchange and reported instantly, but we need to get a handle on what has happened, with an acceptable delay.
Two things might break the price stalemate: MTFs starting to charge for data, and incumbent exchanges unbundling pre- and post-trade data. Chi-X has already started, ostensibly promoting an industry-led commercial route, and incumbent exchanges have begun unbundling (although I would like to see further unbundling, of, say, large cap versus mid-cap, and continuous prints versus auction prints), moving towards almost a pay-as-you-go model. The exchanges may well welcome a more prescriptive approach to unbundling recommendations; they are unlikely to do it on their own, and in any case it would be difficult to enforce price caps, but perhaps the regulator could set a ratio of pre- to post-trade data prices, and a ratio of bundled versus unbundled prices.
I appreciate that consistently loss-making businesses, like many of the MTFs, have to make money somewhere in order to stay in business, and that is in all of our interests. At some point, if they don’t turn into standalone going concerns, we’ll be right back stuck with a monopoly exchange offering. But instead of propping the business up with equity data fees, perhaps their future will be ensured by the move into the different asset classes that MiFID II and MiFIR will tackle.
Future improvements?
Radical thinking, perhaps, but in order for the CT provider (or providers) to be paid a reasonable amount (and in turn to ensure a reasonable income/fee structure across the industry and the survival of exchanges and MTFs), the regulator may need to rethink the 15-minute free-data model (how much demand is there for actual real-time post-trade data, after all?). Perhaps instead of just two pricing models (a premium one for no delay and a free one with a 15-minute delay), there could be three: a premium one for live data; an ‘economy premium’ one, where a very low nominal amount would be charged for data with a 5-minute delay, a fee low enough to attract a wider client base than currently, so lower margins compensated by more users; and a free model for data that is 30+ minutes old (perhaps an acceptable delay for most private bank users).
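Purely to make the three-tier idea concrete, here is a toy encoding of it; every delay threshold and fee is a placeholder, since the article proposes only the structure, not the numbers.

```python
# Hypothetical three-tier delayed-data pricing, per the proposal above.
TIERS = [            # (minimum delay accepted in minutes, monthly fee)
    (0.0, 100.0),    # premium: live data
    (5.0, 5.0),      # 'economy premium': nominal fee for a 5-minute lag
    (30.0, 0.0),     # free once the data is 30+ minutes old
]

def fee_for_delay(delay_min: float) -> float:
    """Cheapest tier available to a user who accepts at least this delay."""
    return min(fee for min_delay, fee in TIERS if delay_min >= min_delay)

print(fee_for_delay(7.0))   # 5.0: a 7-minute lag qualifies for the economy tier
```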

Trading In An Imperfect Market

By James Hilton, Co-Head of AES Sales, EMEA, Credit Suisse
In economics, a perfectly competitive market must satisfy a number of conditions. There can be no information asymmetries, participants must not be able to unduly influence prices, the cost of entry and exit must be zero, and all transactions must be voluntary and mutually beneficial to both parties. As this economic nirvana remains out of our grasp, buy-side and sell-side trading desks have a job to fulfil. We operate in an imperfect market, where short-term price volatility is commonplace and the cost of trading can be substantial. The use of smart trading methods is ever more important, especially where the trading goals of different participants vary so considerably.
Transaction cost analysis has helped many diligent investment firms adapt and improve their trading style to suit different market environments. Both buy- and sell-sides have built models to predict market impact, often adjusting their trading participation based on their findings. For example, we have seen a marked shift into more passive trading on our orders, presumably to reduce market impact. However, it is often difficult to pinpoint exactly what causes impact on an order book, and therefore how to adjust one’s trading style accordingly. Impact should be about not only what happens after a trade, but also not trading at the wrong price in the first place.
In imperfect markets, traders take advantage when they perceive a price to be cheap, reacting to – and correcting – short-term mispricing. Indeed, many brokers now offer tools to adjust trading participation based on the “fair value” of a stock when compared to either its own recent trading patterns or an index benchmark. These work well on an intraday basis and help buy-sides avoid (or take advantage of) obvious mispricing. However, we also suspect that some participants/strategies in the market actively seek to create short-term mispricing and use it to their advantage. This has been the subject of our most recent research.
The debate over high frequency trading (HFT) continues unabated. In our view, HFT isn’t in itself a bad thing; however, certain strategies may be detrimental to the market. We have taken a practical approach to the different types of HFT. We measure certain behaviours, determine the best way to detect them in a very short time-frame and then set out to adjust our strategies so that they trade in the most optimal manner.
One of the most visually striking examples of detrimental short-term trading is quote stuffing, whereby an order book is flooded with huge numbers of orders and cancellations in rapid succession (see Exhibit 1). Our AES Quant team uses techniques adapted from signal processing – including real-time burst detection and pattern recognition – to detect this sort of behaviour. For the STOXX 600 universe in Q3 2012, we found that stocks experienced 18.6 incidents a day on average, with more than 42% of stocks averaging 10+ events per day. Though we can’t always identify the exact strategy behind these patterns, we would want to adjust our algorithms in response.
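Credit Suisse does not publish its detectors, but the flavour of the approach can be shown with a much simpler stand-in: a sliding-window message-rate test that flags a burst when order and cancel traffic exceeds a threshold. The window length and threshold below are made-up parameters, not the AES Quant settings.

```python
# Crude stand-in for burst detection: flag a quote-stuffing episode when the
# message rate in a short sliding window exceeds a threshold. Parameters are
# illustrative; production detectors use far more sophisticated signal tests.
from collections import deque

class BurstDetector:
    def __init__(self, window_s: float = 0.1, threshold: int = 200):
        self.window_s = window_s      # look-back window in seconds
        self.threshold = threshold    # messages per window that count as a burst
        self.events: deque = deque()

    def on_message(self, ts: float) -> bool:
        """Feed every order/cancel timestamp; returns True during a burst."""
        self.events.append(ts)
        while self.events and ts - self.events[0] > self.window_s:
            self.events.popleft()     # drop timestamps outside the window
        return len(self.events) >= self.threshold

# detector = BurstDetector(); call on_message() for every timestamp on a feed
```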

“Layering” provides another practical example. This occurs when a number of orders are placed on an order book – or across alternative trading venues – with no intention that those orders ever execute. Many traders will have tried to lift an offer from the consolidated order book, only to find that the liquidity has suddenly “faded”. Exhibit 2 shows the likelihood of orders fading immediately after a partial or full “take”. There are definite steps that can be taken to counteract this activity, and Credit Suisse, as well as other firms, has developed a range of options to achieve better fill rates.
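The fading effect lends itself to a similarly rough measurement: across a stream of events, count how often a take is followed, within a short horizon, by cancellations of liquidity displayed elsewhere. The event schema and horizon below are assumptions for illustration, not the methodology behind Exhibit 2.

```python
# Rough empirical fade rate: share of 'take' events followed within a short
# horizon by cancels on other venues. Event fields are hypothetical.
def fade_rate(events: list, horizon_s: float = 0.05) -> float:
    """events: time-ordered dicts with 'ts', 'type' ('take'/'cancel'), 'venue'."""
    takes = [e for e in events if e["type"] == "take"]
    faded = sum(
        any(c["type"] == "cancel" and c["venue"] != t["venue"]
            and 0.0 < c["ts"] - t["ts"] <= horizon_s
            for c in events)
        for t in takes
    )
    return faded / len(takes) if takes else 0.0
```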

The ability to detect these sorts of behaviours is just the first step. Being able to engineer trading strategies to be cognisant of them and adapt accordingly will begin to set broker algorithms apart. Ultimately, only those algorithms that take into account all factors of market impact will succeed. Imperfect markets create a challenge for traders; having the right tools is essential.

The Changing Sell-Side

In a time of fundamental shifts across the sell-side, Adam Toms, CEO of Instinet Europe, examines the causes and consequences of change.

To what do you attribute the fundamental changes underway in Europe?
Equity businesses have been in quite a severe amount of distress of late. Profit margins are being compressed and trading revenues through commissions are challenged, given low market volumes. And with the global slowdown in trading by institutions, the commission “wallet” through which the buy-side pays for the products and services they consume has shrunk considerably. If, as a bank or broker, you are not adjusting your cost base in response to that, you obviously end up with quite a severe economic problem, so we are consequently seeing a number of substantial moves to address the over-capacity currently found in our industry.
This capacity reduction can come in numerous forms. Some firms have shuttered certain businesses and product lines entirely. Others have more specifically defined what they are going to offer and to whom, perhaps servicing only a subset of their former clients in a more substantial manner. And still others have pursued a strategy of merger, acquisition or strategic alliance with rivals.
Is this a new trend or has it been underway for some time?
All of these things have been happening over the last few years, but the pace has quickened over the last six to nine months. However, regardless of the strategy a firm is going to pursue to deal with the slowdown, they have to first and foremost analyse whether they believe we’re simply in the middle of a cyclical challenge or whether a longer-term, secular change is underway.
If it’s determined that it’s cyclical, then managing it probably means reducing some PE expenditure, trimming variable expenses and hoping things start to turn around soon. But if one thinks it is secular, then that really calls for substantial change to the business model and all that goes along with that.
How do you come down on the cyclical versus secular question?
These are conversations we’ve been having at the Nomura Group for some time, and after significant internal debate, we’re firmly in the secular camp. Unfortunately you can only sit there for so long and say: “No, it’s just cyclical. If we just hang in here for a little bit more, the market is going to get better.” That approach will gradually end up compounding the problem, limiting options to correct the position.
What makes you think it’s a secular shift?
We do not think it is one factor but more a number of factors which in unison will result in permanent change, be it the associated equity asset class impacts from the global recession and debt crises through to new areas of focus for global regulators and even taxation initiatives.
Take the French financial transaction tax as an example. From the government’s and regulator’s perspective, it’s natural and appropriate to seek to minimise industry and economic risk through all of the tools available to them. But given the potential for unintended consequences, you hope that laws and regulations take into account their impact on the markets and institutions’ ability to effectively trade for end asset owners.
So how are you adjusting your business in response?
The Nomura Group, with its independent Instinet subsidiaries and its Nomura branded broker-dealers, had been running two distinct execution services groups for some time. When the businesses were competing on a different basis — Nomura a full service broker dealer and Instinet a more execution-only provider — we were quite happy with the structure. But once the group came to the conclusion that we’re in the midst of a broader secular change, it was important to identify a solution that would allow us to extract synergies, innovate and bring the two groups together.
We had a very clear decision to make: take Instinet into the bank, or move parts of the bank into Instinet. We believe our move to extract the cash trading channels from Nomura — essentially consolidating into Instinet — is the right decision. Instinet is preserved as an independent agency brokerage that places clients front and centre of our strategy, which is very different to other models and well-aligned to how we see our clients and the market evolving.
Coming to that conclusion presents quite an opportunity. First, there are significant savings to be had by consolidating onto a single platform, the vast majority of which are through the elimination of duplicate infrastructure costs such as algorithmic engines, connectivity, technology and vendor licenses, exchange memberships, etc. Second, it allows us to pick and choose the best components of each platform. And third, it allows us to take a step back and look at exactly how equity execution services should be provided in today’s markets. All in all we are targeting a bigger and more complete business, and when you consider the impact of aggregating the liquidity of both the Instinet and Nomura platforms, we feel this will differentiate our European business going forward.
We hope to complete the transition by spring 2013.
What about the pending regulations you mentioned? Does that play a role in this new strategy?
Yes. When you consider where regulation is headed, this model makes a lot of sense as well. First, there is a pressure globally to provide more transparency by segregating risk capital from agency execution, which this clearly does. And second, by moving execution to Instinet and keeping research as a stand-alone business within Nomura, we are effectively internally unbundling our offering, which is very much aligned to how regulators are encouraging our clients to think also.
Is there a downside to the new strategy?
While Nomura’s broker-dealers can commit risk capital, Instinet cannot. But when you look at the cost of providing capital, it’s a tough game in low-liquidity markets like today’s. So, if we can differentiate ourselves through an agency-only offering that marries the best of high-touch trading with a suite of electronic tools, we believe we can compensate for any loss due to the lack of risk availability.
How, if at all, will Nomura research clients feel the change?
The biggest change will be in terms of how clients pay for content services, as the execution services groups that existed within the Nomura broker-dealer units to monetise research and other services will be moving to Instinet, with Japan being the notable exception. We would obviously prefer that clients of Nomura research compensate for it by trading through Instinet, who will then direct a CSA payment or similar to Nomura. But Instinet needs to provide a compelling trading service alongside best execution in order for this to happen and we believe the strength of our offering delivers this.
How common are CSA payments in Europe? In some jurisdictions, like Japan, they are virtually non-existent.
It’s true that CSAs are not as prevalent in Japan and various other locations, but the landscape is continuing to evolve. In general, CSAs are well-understood and embraced in most jurisdictions globally. Specifically in Europe, securities regulators have attempted to foster transparency and choice by helping the buy-side embrace more of an unbundled model. But there is a vast difference across Europe in where that process actually stands. For example, in Sweden clients are currently unable to use commission sharing arrangements, but this is set to change shortly allowing CSAs to become an operational reality.
Instinet has been one of the earliest and most vocal champions of the CSA model, and as a result now has nearly 900 different brokers and independent research providers within its global commission management network. In fact, with this migration, Nomura will in some ways become another research provider in that network — it just happens to be one of the larger ones.
Are there any new business segments you may explore once this new model is set?
There are definitely some areas we’re exploring, some of which are made easier given Instinet’s independent model. I can see Instinet, in addition to continuing to service its core institutional clients, progressing its service offering to the sell side, essentially building on many of the execution services we provide to banks and brokers today. It could be commission management administration, core technology development or other white labelling solutions.
What has the response been from clients to this new model?
The response has been positive. Clients recognise that a firm cannot be all things to all people in these markets, and appreciate that we are being transparent in how we’re structuring things to do so. Most feel that this model provides a distinct value proposition while at the same time being sustainable for the long term.
I also think that clients like that Instinet will be able to continue providing a specialised offering. Many other firms, given pressures in their equity models, have attempted to move to more of a generalist coverage model to reduce costs. Conversely, we have staffed our organisation to be able to primarily provide a specialised service offering across high touch sales trading, program trading and electronic services.
More than anything, however, I think this will put to an end the questions over the future of Nomura and Instinet. The Nomura Group is keeping Instinet as an independent company that is segregated from the Nomura-branded broker-dealers. Yes, the Nomura Group is Instinet’s ultimate owner, but Nomura management also recognises that Instinet’s value is tied to its agency-only model, which is predicated on independence. It will do nothing to jeopardise that status.

This Time It’s Structural

Paul Squires, Head of Trading at AXA Investment Managers, systematically analyses the consequences of structural market change and sell-side headcount reduction across the street.
Amid the current market and trading environment, the expression “A Perfect Storm” springs to mind because, clearly, the entire industry has seen decreasing margins and volumes since 2008. At the same time, there has been an arms race to invest in technology just to maintain position. Those two things aren’t exactly the best backdrop for cash equities. Furthermore, even if the cash equity business is seen as a loss-leader for other, more profitable asset classes, Basel III and global banking reforms seem to be impinging upon those commercial realities as well. 2012 was a very tough year for banks and brokers, and I think we’re finally seeing a little bit of fallout from that in terms of strategic reorganisation. It’s not just a seasonal thing now; many in the industry had sustained hope that it was just a tough period, that we would come out of it and that volumes would return to 2008 levels. However, I believe people are realising it’s much more structural.
Splitting Hairs
Commission Sharing Agreements (CSAs) are increasingly an essential facility for the buy-side. CSAs were really the avenue to enable CP176 (the FSA consultation paper on unbundling). They reduce the extent to which trade execution might be constrained to wherever fund managers are getting their advice and service. In other words, the buy-side executes with a broker where they have a CSA (provided that broker can give good execution), paying both an execution and an advisory component at the same time, thus building the advisory pot, which can then be used to pay for independent research (or gives the fund managers the freedom to pay for advisory services from a broker whose execution service is not as strong).
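As a worked illustration of those mechanics (with invented rates; a real CSA split is negotiated per broker), the sketch below divides each bundled commission into the execution fee the broker keeps and the advisory pot the manager can later direct to research providers.

```python
# Toy CSA accounting: an 8bps bundled rate split into 3bps execution (kept by
# the executing broker) and 5bps advisory (accrued to the CSA pot). All rates
# and notionals are illustrative.
EXEC_BPS, ADVISORY_BPS = 3.0, 5.0

def book_trade(notional: float, pot: float) -> tuple:
    """Return (execution_fee, updated_advisory_pot) for one trade."""
    execution_fee = notional * EXEC_BPS / 10_000
    pot += notional * ADVISORY_BPS / 10_000
    return execution_fee, pot

pot = 0.0
for notional in (2_000_000, 5_000_000):   # two trades through CSA brokers
    fee, pot = book_trade(notional, pot)

print(f"advisory pot: {pot:,.0f}")  # 3,500 - payable to independent research
```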
More recently, the FSA has said that fund managers weren’t embracing the opportunity to split the different commission components as much as they had hoped, and the FSA is pushing again for that to happen. This is entirely appropriate from a client’s perspective, in my view. It’s the client’s money that’s being used every time the buy-side trades; if you’re paying a bundled commission, that’s effectively the client paying for the execution service and the advisory service and they should expect the best decision for both elements of that.
There are a couple of areas of focus within the current consolidation of advisory and execution services. On the execution side, we’ve seen most impact from a more strategic ‘top-down’ view of sales trading. In the past, electronic sales trading was seen as supplementary to traditional cash equity sales trading. There has been a hard push to set up the provision of algorithms, and that has created a duplicated set of execution services. Now, my desk has taken a decision to focus our contact on our primary cash equity sales traders, while enabling them to see our algorithmic flow. This means that if we’re trading a significant volume of a stock and the cash sales trader can see what we’re doing, we’re optimising all our execution avenues. There is electronic access to multiple venues, but there is also the traditional broker distribution channel.
As part of our regular trading reviews, we explained to our brokers that our cash sales trader is the one who knows our account and our style of trading. We’ve had the historic relationship with them and they are best placed to disseminate the most relevant market information to us very quickly. For example, if we self-direct an algorithmic order, they could see that we were looking to buy a chunk of a French small cap; if they happen to see flow in that stock from another source internally, then they would be able to pick up the phone and say, “I know you’re working an algorithm, but if you’re interested, we’ve got the natural seller.” This is the level of service we want, but it’s taking the market quite a long time to get to that point. Based on our conversations, we’ve discovered that we are in the minority in wanting the sales trader to see the algorithmic flow. In contrast to our view, we believe many in the buy-side see anonymity as the key benefit of an algorithm.
A lot of the buy-side use algorithms almost primarily for the anonymity, which means they end up with a duplicated set of coverage with electronic coverage and cash sales trading. Clearly, that’s an expensive way to organise coverage for a typical asset manager whose volumes have declined substantially in the past couple of years. Therefore, I think we will continue to see brokers moving their electronic teams much closer to the program team or the cash sales traders.
Sell-Side Headcount Changes
The impact on the buy-side isn’t just caused by the fact that the headcount of sales trading has shrunk; it is that the number of clients has expanded. There are now so many small hedge funds and boutique asset managers. In many instances, the sales trader is doing his best to pick up the phone and put the orders into the system but, in our view, he often no longer has time to closely scrutinise the markets and stocks in the way that we used to benefit from.
At AXA Investment Managers, we trade with approximately one hundred brokers a year. But within that, it’s a very concentrated focus with our top 20 or so brokers being absolutely key. In addition, there’s a significant tail of brokers that we need access to less frequently for very specific orders.
There will likely always be two or three brokers, who, while market consensus suggests a certain direction, may feel that it’s worth their while taking a different view and who see an opportunity to gain market share by going against the trend.
Does a buy-side firm these days need more than two or three execution-only brokers, the traditional sort of agency guys who are very driven to get your flow? They do a lot of work to be close to the market; they talk to a lot of people; they give you a lot of market colour. I think what this means is that the emphasis has shifted from the buy-side trader picking up the phone to someone on the sell-side who then directs how to execute the order, to the buy-side trader now having all the relevant tools. This concept of “best selection” as a process is, for us, one of the main aspects of “best execution”; in other words, the due diligence before we decide exactly how we are going to trade the order. Do we pick up the phone because, in fact, we just want a risk price, instant liquidity and immediacy of execution? Do we want to park it passively in a couple of dark pools, knowing a particular algorithm that will do that for us? Do we want to just pick up the phone to the sales trader and say, “Just keep it to yourself for a while, but I’m looking to buy a chunk of this particular stock in case you see anything in it”? Maybe we want to have a look around the shareholder list, see who might have been active in it and see if we’ve got any opportunities to cross a block naturally? There are so many different ways to execute now, and hence this concept of ‘total liquidity management’.
Do we really have a sense of whether the advisory services of one firm are better than those of anyone else, and how do we ascertain the value of that? I think both the brokers and buy-side have been reluctant to dig into that granularity.
Maybe the buy-side hasn’t been very good at being able to articulate what it wants and really focus on why they’re using certain brokers or execution channels to accomplish their objectives. I think because of the ongoing consolidation in the industry, the buy-side has to be much smarter about that. The result, I believe, is a polarisation in that the bulge-bracket firms are going to give you some very good algorithms, including access to their own internal flow, on one hand, while on the other, there are driven agency brokers who do a little bit more of the old-fashioned broking and may be a better channel for small caps. Where I think the outlook is less clear is where you have a broker which is quite good at several things but not outstanding at anything. This is a trend the market has been discussing over the past few years, but we finally are seeing that come to fruition.
Vendor Impact
From the vendors’ point of view, they have been relatively untarnished by the pain felt in the broader industry, because another consequence of this shift from the sell-side to the buy-side is that the buy-side has had to invest a lot of money in technology. In the old days, this was bundled into the broker service and paid for with the commission given to brokers. We would rely on the brokers for all sorts of things. To some extent, for example on the governance of algorithms and different venues, the sell-side is still much better positioned to provide that insight than the buy-side itself. We on the buy-side haven’t got the time or capacity to fully analyse our venue data. I’ve got the data because we spent money on it, and I’ve got a great team here that has enabled that, but to build regression testing to see what might have happened if I had placed an order on different venues is very difficult to do properly. We are dependent on brokers who not only see our flow but will see the flow of hundreds of different clients.

In our case, we’ve now got an Execution Management System (EMS), which has enabled us to take things a step further in terms of functionality and real-time transaction cost analysis, with much better visibility of the algorithms and venues. This has all played into the hands of the vendors.

Where I think the brokers now feel the pinch is in the Order Management System (OMS) or EMS providers’ ability to manage the connectivity infrastructure of the buy-side. If our EMS provider is also a connectivity hub for us, we ask brokers to connect into the EMS provider’s hub so we can send orders via FIX, which is a prerequisite for us, with some very small exceptions. Every time they do that, the connectivity provider can go to 50 brokers and say, “It’s going to cost you thousands of pounds a year to do that”. Now, imagine what the broker is thinking: on one hand, he may think, “Well, if we don’t connect, they’re not going to give us any flow, so we almost have to”, but at the same time, the broker may feel that “if it’s going to cost £50,000 a year to connect and we are only getting £60,000 a year in commission, is it an investment worth making?”. However, this type of calculation is only starting to be reviewed now.
The situation previously was “Of course, we’re going to connect to them because they’re an important client. We need to see the flow.” So they did. What we are seeing much more now is brokers ringing us up and saying, “Can we just go through this because, according to our figures, you’ve got this many traders and, therefore, the network is charging us this amount of money and it looks like we’re paying this amount of commission; is there a cheaper way for us to do this?” So I think the vendors will start to feel the pinch on connectivity charges, because they’ll either have to review their own charging structures or they will see a drop-off in the number of brokers plugging into them.
Regulatory Push
Regulatory momentum has definitely been building in this area. In late 2012 the FSA published a review of a series of visits to asset managers conducted over a 10-month period. It followed up with a Dear CEO letter to a hundred different asset managers setting out the results of those visits. The FSA said that asset managers generally had a good focus on some issues, but that it had observed, probably because of falling volumes, a recent drift away from those principles; it therefore listed the good and poor behaviours it had seen.
What’s clearly in their minds, I believe, is the perspective of the client and if, for example, one of those asset managers was only paid execution when they trade (i.e. they pay for all of the research from their own bottom line) that is therefore what the regulators think should become the template for all asset managers.
That represents a shock for many asset managers, who would suddenly have to start paying large advisory bills from their own bottom lines, but it’s certainly the direction in which the regulators would like to push. The FSA is trying to shift the current mindset, and when you start looking at it, it’s hard to disagree with the general objectives.
The FSA has made it pretty clear that it does not feel client commission should be used to pay for corporate access, or at least not paid to a broker if all that broker has done is arrange for you to go and see the company. Conferences and one-on-ones can make up a very significant part of most asset managers’ advisory spend if the investment philosophy is fundamental stock picking: meeting the management of companies and, based on those close interactions, making the investment decisions. As an example, I was at an Investment Management Association (IMA) meeting a few weeks ago and I’ve never seen the IMA so full. The meeting was arranged to discuss some of the principles the FSA has reported. There’s a high level of uncertainty and focus on some of these principles, but that’s not to say the direction isn’t appropriate, in my view.
A Focus on Specialisation
Focusing on one’s areas of expertise is a trend that has not been obvious across the industry in the past, but it is one that is now really being driven home. The timing is due to the realisation that a structural change is taking place, not just a cyclical one. It’s a natural evolution to focus on the areas where you’re strong and make sure they are protected when it’s quite apparent that cash equities is over-brokered and M&A activity is lower.
The reality is that market volumes were down (again) in 2012, and payment for most services in this business is geared towards turnover, which is completely unknown in advance and to some extent should be unknown. I don’t think fund managers should start the year with a target of doing a certain amount of business. If their portfolio is in the right position then they shouldn’t be generating much turnover. As markets turn and they choose to move positions around, that’s completely normal. Of course, there is a range of turnover that we would expect, and we do make projections, then budget and plan as we would with any other monetary consideration. This gives us an indication of our likely advisory spend, and I think that’s what the FSA is pushing for – closer scrutiny of what client commissions are paying for and how best to optimise value from them.
Looking ahead, I believe a lot of the media comment in 2013 will be on regulatory reform, but actually most of our focus will be on the same core issues as always. The key questions for us are going to be: are we retaining assets, or are we increasing assets under management? Where are we growing assets – in Asia, in the US, or in European equities or high-yield fixed income, for example? Are we executing at good prices? Are we still able to give our fund managers good market commentary? Our focus is really much more on day-to-day business issues.

Is Fixed Income Broken?

Adrian Fitzpatrick, Head of Central Dealing at Kames Capital, expresses strong opinions on the current state of fixed income market structure.

What is driving managers into equities?
A lot of it depends on which market you’re looking at. If you look at the European markets, there has always been a high weighting in bonds; there has never really been an equity culture to the same degree as in the UK. Obviously in the UK a lot of it is driven by the insurance companies, whose actuaries dictate where the split between equities and bonds lies. Before these pension time bombs, the safest investments for most people were bonds. However, we are all working on the assumption that bonds are currently completely overvalued, but as long as there’s quantitative easing (QE) they are going to hold these abnormally low levels of yield.
The other thing is that institutions don’t want to stand out from the crowd. If everyone’s in bonds and bonds fall, then you all fall together. There’s so much short-termism in the market that you can’t afford to be an outlier for a long period of time and not be right. I think risk assets now are relatively cheap; if you look at the yields on risk assets compared to the yields on government assets, some credit assets like high yield are still quite reasonable. The bond markets are overvalued, but that’s probably not going to change until the QE bubble bursts.
So macroeconomics is feeding that bubble?
Yes, definitely. It’s like we said about the dotcoms: why are you buying things at that level? We all know, if you go back historically and look at bond yields in the UK, they should be around 4½–5%. But the current level is only sustainable as long as the Bank of England and the government are prepared to keep rates abnormally low because of the state of the economy. So can we see that carrying on into 2013? Definitely.
You’ve got the government printing money, so all you have to say is, “Where are you going to put your assets?” Unfortunately in the environment that we are in you just can’t stand that far away from the crowd for a period of time.
What is the main problem with the fixed income market?
Effectively the market is not transparent, unlike the equity markets, which are. In equities you can see commissions, you can see the prices, you can see what gets printed. In the bond market you can’t. Platforms have been around for a long time – the likes of Tradeweb, MarketAxess, Bloomberg Tradebook, etc. These will keep gaining traction because you can send them multiple requests for quotes and they are creating a virtual marketplace, which the regulators will like.
It’s not uncommon to send a trade down electronically to one of these platforms, and for all the banks to just pass; there’s no obligation to make a price or to make a marketplace. They make the price when it suits them. Personally, having traded multiple asset classes, that, to me, is not a marketplace.
You end up working with the fund manager to try and find the other side of the liquidity. It’s such a captive market that sooner or later there has to be some outlet valve that gives institutions an opportunity elsewhere. Obviously, in equities you had the establishment of crossing networks many years ago, of which Liquidnet and E-Crossnet were forerunners. Now what you are seeing in the bond market, through BlackRock’s Aladdin and UBS PIN, is various brokers trying to get the same result.
What they are trying to do is encourage the institutional investor to create more of an electronic market. The biggest problem is that in equities you have, for example, one Vodafone, but in corporate bonds you can have 12 different issues for the one name; it’s therefore very difficult to identify a specific one and find a natural cross against it. So bonds need a slightly different approach, because it is a different discipline. You have different variables in bonds: the spread, which obviously a lot of fund managers work off, and the underlying government bond.
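A toy example, with invented issue labels, shows why that fragmentation frustrates crossing: buy and sell interest can sit in different issues of the same name, so no natural cross emerges even though both sides want the same credit.

```python
# Illustrative only: one issuer, many distinct bond issues.
# The issue labels and order sizes below are made up.

buy_interest  = {"ISSUER 5.375% 2016": 5_000_000}   # we want to buy this issue
sell_interest = {"ISSUER 4.625% 2018": 5_000_000}   # another holder sells a different one

# An equity-style cross matches on a single instrument; in credit the exact
# issue must line up, so same-issuer interest may never meet.
crosses = {bond: min(buy_interest[bond], sell_interest[bond])
           for bond in buy_interest if bond in sell_interest}
print(crosses or "No natural cross: the interest sits in different issues of the same name.")
```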
The banks are going through what happened in equities. In equities, aggregation is now the key; tools have been created that aggregate across the different dark pools to allow you to send one order to many venues. I think what is needed is an institutional crossing network similar to Liquidnet. We need to create competition away from the banks. However, this isn’t necessarily all the banks’ fault, because they say, “We provide the risk and therefore we should be able to choose when we make prices.”
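The aggregation idea lends itself to a short sketch; the venue names and available sizes below are hypothetical, and a real smart order router would weigh price, fees and information leakage, not just displayed size.

```python
# A minimal sketch of splitting one parent order across several pools.
# Venue names and available sizes are hypothetical.

def aggregate(parent_qty, venue_liquidity):
    """Allocate a parent order across venues, largest pool first."""
    remaining, allocation = parent_qty, {}
    for venue, available in sorted(venue_liquidity.items(),
                                   key=lambda kv: kv[1], reverse=True):
        take = min(remaining, available)
        if take:
            allocation[venue] = take
            remaining -= take
    return allocation

print(aggregate(100_000, {"Pool A": 40_000, "Pool B": 25_000, "Pool C": 60_000}))
# -> {'Pool C': 60000, 'Pool A': 40000}
```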
However, the biggest institutions are very close to the banks; they can dictate when they want to trade.
For us as an institution, it becomes incredibly difficult, not just sourcing liquidity but being able to trade out of positions when we need to, because you are totally at the behest of the investment banks. In equities you can trade by program, by algorithm, or you can ask for a risk price. You may not like the price, but the market will always make you a price and you can decide whether you actually want to take it.
In bonds, a broker will send out what’s called an “axe”. The axe is effectively the bond equivalent of an IOI (indication of interest). You’ll phone to follow up on the axe, but they can back out; there’s no obligation.
In government bonds, where the electronic systems are more established, you can trade more effectively, and it is getting there, but the credit part of the market is just completely arbitrary.
Who is going to drive reform?
It will have to be driven by the regulators, in the UK by the Bank of England, and it has to be driven by institutions. The problem is that institutions aren’t members of the market, and they don’t have the IT infrastructure or the IT spend to create new platforms unless they are a BlackRock or a PIMCO.
In the US they have TRACE, and we need a similar system here, so that institutions can understand who is doing what and target business to the right broker or investment bank. Failing that, there should be some form of platform that allows institutions to cross up against each other.

Is this something the regulator will look at?
I would say that they’ve done equities to death, but obviously we’ve got MiFID II and Basel III and all of these things happening. I think they are realising that equities are pretty transparent. If you look at the FX market and at fixed interest, and then at the P&L of any major investment bank, they do not make money in equities; they make all their money on fixed income and FX. Therefore the reward structure of the banks is something that needs to be looked at sooner or later.
I think that there is blurring at the edges across the different asset classes that is forcing change, which is why we are going to see transparency in OTC clearing due to Dodd-Frank. The regulators want to know who’s got the positions and therefore where the stresses in the system are.
Most regulators are driven by their governments. Effectively they are trying to force an unbundling of the equity marketplace, through the back door, because you can apply far more pressure to an institution than you can to a broker.
The nature of the bond market has to change from where it is. Sooner or later the regulators will realise that and they will try and create some form of exchange, which is what the platforms provide. But the problem is that there is no obligation to make prices.
Because I trade multi-asset, I can see the strengths and weaknesses of the different asset classes, and I want a level playing field. In the bond markets it’s skewed towards the investment banks, which can pick and choose whether they make a market or not, and whether they want to honour their axes.
Some things, like algos, are far more difficult because there is no depth to the bond market; there is no centralised pot of liquidity that you can trade against.
A crossing network that gives institutions another option would bring in spreads and potentially create additional liquidity away from the investment banks, and if the banks start missing out on business I’m sure they will become more competitive and will want to get involved.
Do you think reform is going to be driven out of Europe and the UK or the US?
The US is slowly getting there, because they’ve got TRACE so you can see what’s going on. The European market is less liquid, and therefore it becomes more difficult. Some products that aggregate trades across the different platforms are starting to become more established, which will help in providing depth to the bond markets.
It’s all about providing flexibility, depth and transparency to the market. If you can actually do that, then the regulators, governments, institutions are all happier. You can see what’s going on.
I think the problem is that a lot of these houses only trade bonds, and don’t realise that sooner or later regulation is going to change the marketplace they are in.
What shouldn’t fixed income learn from equities?
Equities have gone full circle, whereby the end investor, who is effectively the man on the street, is totally disadvantaged. The exchanges now only care about the bottom line, which is their P&L, and they don’t care whether that comes from a high-frequency trader or anyone else; the reason is that institutions are not members of the exchanges. We are so far down the hierarchy that no one cares about us.
The example I always give people is Vodafone, which was the most liquid stock in the UK prior to the Lehman crisis. As an institution I could trade 25 million shares on the touch. Now Vodafone’s touch size is barely 36,000 shares on one side and 65,000 on the other – roughly a 700-fold drop in available size on the bid. I’m sorry, that is not a market that is helping the institutional long-term investor.
Compare how I could trade Vodafone before with how I trade it now: I don’t care that it has a narrower spread, because it’s not about spread, it’s about depth of market.
Equities are not profitable and the bond markets are hugely profitable, but I suspect that regulators and governments will not allow that to continue.
But the way it is just now, the bulge-bracket equity model is broken. We are working with a 1970s model in 2012 and it just doesn’t work.
 
