
Setting the Pace in Brazil

By Daniel Ciment, Keidy Sakamoto
Daniel Ciment of J.P. Morgan details the development of Brazilian algos and outlines the most effective strategies for trading in Brazil.


Using Algos in Brazil
International buy-side traders are already accustomed to using algorithms to trade strategies in markets around the world, so as they look to Brazil, they want to trade there in the same way they have traded elsewhere. Even though having just one exchange makes the data feed more streamlined, the low liquidity profile of certain stocks in Brazil means you cannot use algorithms to trade all stocks electronically. For the more liquid names, many traders are using benchmark algorithmic strategies, like VWAP, percentage of volume, or arrival price. Most algorithmic strategies are based on benchmarks for now, as buy-side traders seek to replicate the methods they use elsewhere, while obviously taking into account the intricacies of the market structure. In the end, if they trade with algorithms in the US, Europe and Asia, they want to trade with algorithms in Brazil as well.
Infrastructure and Volume Spikes
This is one of the challenges that we face as an industry. As you are building electronic infrastructures, you have to build for growth and not just for where we are today. When we look at a market, whether it is Brazil or more developed markets like the US, Europe or Asia, we know what we are trading today, but we have to build to accommodate what we will trade in a year, two years and what we think the peak might be. Just because a market trades a couple of hundred million in a day, or in the US, 8 billion shares a day, it does not mean you build your plan to support 8 billion shares a day because a year from now, that figure might be 20% higher.
More so, if a major event happens next week, then that figure might double, so you need to build sufficient headroom. Right now, we can handle a lot more than what we manage on a daily basis, but that is on purpose to make sure that at times of stress we are there for our clients and that they can trade through us with full confidence.
DMA or Boots-on-the-Ground?
To be successful in a market like Brazil, brokers need to have people on-site who know the local investor community and know the local financial community. J.P. Morgan has a major trading presence in Sao Paulo, and that is just one piece of the offering in Brazil. For small firms who want access, outsourcing is a realistic option, but if you are going to be big in a market, especially in a market like Brazil, an in-country trading team is required.
Technical Challenges
Reliable trading requires market data and telecommunications systems, which are present in Brazil, along with data center space and algorithms that are tuned to the local market and market structures. This tuning includes the liquidity profiles of the stocks as well as the rules and regulations of the exchange; you cannot apply the same algorithms from one region to another and expect them to work. We spend a lot of time and effort, fine tuning our algorithms, testing them on our desk and then rolling them out to clients. It is not just copy-and-paste.

 
Keidy Sakamoto, Itaú BBA
With regard to the electronic market in Brazil, we have seen much technological development over the last few years. At the beginning of 2007, we had just a few vendors offering systems for electronic trading using algorithms. The exchanges Bovespa and BM&F were still separate companies, and the FIX Protocol was still considered merely a promising technology.
Since then, Bovespa merged with BM&F, considerable investment was made in technology and electronic trading using algorithms became feasible for all institutional players. We witnessed consistent growth of local and foreign vendors in the country, bringing with it new trading solutions that let us move a bit closer to the level of American and European markets.
Brazilian brokers are doing important work to accelerate this process, introducing new technologies and players to meet the new requirements to push the market towards a more developed scenario. As a result of the investments thus far, we expect a significant increase in the BM&FBovespa’s technological quality for the next few years. Algorithmic traders who are already trading in Brazil are becoming more advanced, making this market more aggressive each day.

A Considered Approach: German Asset Managers Look to the Future

By Rudolf Siebel
Rudolf Siebel, Managing Director of BVI Bundesverband Investment und Asset Management, shares the perspectives of German asset managers and their needs and goals for the coming year.
Technology and Trading Costs
BVI represents the German investment fund and asset management industry, which manages €1.7 trillion in assets such as bonds, equities and derivatives. Trading is an issue dear to our hearts. In particular, we welcome the improvements in electronic trading over the past decade, especially those based on standards, such as the FIX Protocol, which enable automation based on standardization. That is one of the reasons why we became part of the FIX community in September 2011. Costs of trading have certainly fallen over the past few years, particularly with regard to the costs charged by brokers and venues. Trading costs have also been implicitly lowered through reduced market impact. Our members sense that with electronic trading they can be much closer to the market and limit the loss of market value caused by latency in trading. Our members, however, have seen that the cost of support and analytics has not fallen. Some also believe that buy-side trading volume has declined while sell-side volume is on the increase.
Value through Innovation
Having discussed issues of electronic trading within our industry, I think the increased ability to analyze market impact and trading costs has provided value. Over the past few years, our membership has seen value shift very quickly to better market access, especially through smarter routing technology. Based on recent studies, only about 65% of the turnover of the DAX is now on the Deutsche Boerse, and for the FTSE 100, only 50% is now on the LSE. It is absolutely vital for our members to be able to access different liquidity pools, whether lit or dark. Smart algorithms have become a main issue, but not necessarily with a view to improving low latency. Our members are asset managers who base their decisions on the selection of securities and asset classes, not on squeezing out every nanosecond of latency. As a result, low latency trading is a secondary priority for BVI’s members, but smart order routing is obviously important given the large number of venues in the European market. At my latest count, there are more than 70 different trading venues, be they exchanges or other trading platforms.
Volatility and Connectivity
We are now in a market where there are no longer any safe havens among asset classes, and in times of high market volatility it is absolutely necessary to link your internal systems to outside trading platforms in order to be flexible and quick to market. German asset managers have yet to establish connections across asset classes, and the FIX Protocol is very important as a basis for discussing the connectivity issue. Going forward, with Dodd-Frank and new regulation on the European side, connectivity with Central Counterparties (CCPs) will also be a big issue for 2013 and 2014. Connecting to all markets and asset classes electronically, as far as possible, and connecting to more CCPs will be the challenge for the next few years.
Outsourcing Trends
There is a trend to outsource in the German marketplace, but that is more applicable in the front or middle office and in operations. From a trader’s position within an asset manager, market impact is an important part of the final performance of our product. There is a reluctance to outsource trading, and that applies to trading decisions as well as execution. BVI’s members use technology platforms provided by vendors, but not in the sense that we outsource the full trading process to a broker or another asset manager.
As asset managers, we need to retain the legal responsibility for the trading decisions. A point that you would always have in the back of your mind is whether the other party is using your intellectual property to optimize their own trading. The Pipeline case, for example, has been a reminder that, in a dark pool, the operator has their own stake. Cases like that certainly do not help to improve discussions on outsourcing trading.
Priorities for 2012
The first item our members would like to address relates to the new MiFID II release: the ability for institutional investors to execute large trades without excess market impact. We would be very reluctant to have regulation which would put a stranglehold on dark pools. Whilst some of our members see huge benefits in the market for high frequency trading, lit markets that do not curb high frequency trading place large institutional traders at a disadvantage. We need the ability to execute trades in dark pools in a regulated environment.
The second item relates to the proposal that fixed income trading be modelled on the basis of equity trading, which we do not think is ideal. In spite of the evolution of electronic trading in the bond market, the overall market is still driven by banks providing capital. Applying the equity market structure to the fixed income market is a long-term objective that I think everybody would like to see, but it is premature in the short term. In the long term we would like to see an open market structure with many sellers and buyers in the bond area too. Particularly during these challenging times in the government bond market, however, it is imprudent to require that an equity market structure be set up for bonds. We would urge the Commission not to move too quickly, but to provide intermediate steps which recognise the importance of the existing bond trading system.
Moreover, we welcome the European Consolidated Tape (ECT) as a transparent pricing mechanism, but the bigger issue will be its implementation. We clearly support the ECT and in September the European Fund and Asset Management Association (EFAMA), of which we are a founding member, published a blueprint on the ECT, suggesting some of the structural features that the buy-side would like to see.
 

FPL News Dec 2011

By Daniella Baker

Working with the Trading Community to Create a More Financially Stable Environment

2011 has presented many challenges for the global economy, with multiple markets experiencing increased volatility and investor confidence in sharp need of restoration. Achieving increased transparency and sustained long-term financial stability has become the Holy Grail for politicians and regulators globally. It may seem an unlikely fit for those unfamiliar with the scope of activities pursued by FPL, but market participants working closely with the organisation will understand the growing relevance of FPL’s work in helping to achieve these goals.

FPL is an industry-driven organisation; it is independent and neutral, and all initiatives are ultimately focused on supporting the evolving business needs of the trading community, enabling firms to optimise efficiencies and reduce costs. As the markets seek to further mitigate risk and support regulatory reforms aimed at effectively addressing the needs of the real economy and protecting the end investor, FPL has worked hard to help its members understand the implications for their businesses and how they can seek to achieve these goals. In doing so, the organisation has also ensured the wider needs of the trading community continue to be effectively addressed, meeting the requirements of emerging trading trends and ensuring new geographies benefit from educational opportunities. Here are some highlights explaining how FPL has worked towards these goals in 2011, and an insight into some of the initiatives FPL will be seeking to deliver as we move into the New Year.

Supporting the Development of Regulation
 
FPL works closely with regulators around the world to encourage the use of non-proprietary, free and open industry standards that enable all sectors of the financial community to benefit from the resultant consistency and transparency that facilitates ease of surveillance. The standards promoted are those that achieve mass adoption by the trading community, enabling firms to more easily meet new requirements by leveraging their existing investments across additional business areas and generating significant cost and resource savings. As an organisation, over recent months, we have started to see results from these efforts, as regulators have begun submitting documents to the market requesting participants use open standards that offer these elements to meet emerging requirements.

This approach is consistent with the recommendation in the Investment Roadmap, released in 2010, which FPL produced in collaboration with other leading standards bodies to provide the industry with clear guidance as to which standards should be used to support business processes throughout the trade life cycle. The transparency and consistency offered by using standards in this manner is becoming essential in furthering regulation while restoring investor confidence.

FPL has provided representation at a number of regulatory related initiatives throughout 2011. A few examples include the annual IOSCO conference in South Africa, a meeting of the FSB (Financial Stability Board) in Basel and attending a Regulatory Data Workshop coordinated by the Office of Financial Research (OFR) in DC as a chance for data experts within the federal financial regulators to share common concerns, best practices and promising ideas. 

Throughout 2011, FPL’s regulatory efforts in Europe have focused on the Markets in Financial Instruments Directive (MiFID) review. This has included contributing to discussions and responding to consultations focused on the development of a European Consolidated Tape and encouraging the adoption of FIX for this initiative. As FPL’s membership represents the cross section of firms that will be using the consolidated trading data, FPL has played an active role in the group providing advice on how the new standards could be applied to deliver the greatest industry wide benefit. 

The MiFID review also raised important concerns around post-trade transparency, and in response the Market Model Typology (MMT) initiative was launched by the Federation of European Securities Exchanges (FESE), with strong support from a number of leading data vendors. The MMT initiative aims to reduce the complexity of equity market data, which will generate processing cost savings and significantly improve market transparency. The initiative will seek to achieve these goals by standardising trade flags across trading venues in Europe, including lit, off book, dark and OTC trading. The standardisation of post-trade data will prove to be a pre-requisite for effective data consolidation.

Additionally, FPL member firms in Europe continue to be invited to participate in the MiFID Forum, which is supported by a range of industry standard bodies and associations. The forum provides an opportunity for open discussion and debate amongst senior market practitioners on the regulatory issues affecting the financial markets.

As an independent and neutral stakeholder, FPL continues to respond to regulatory consultations. To ensure that these responses are submitted in a manner that reflects the broad interests of its membership, in December 2011 FPL invited members from the region to join the EMEA Regulatory Subcommittee. The organisation looks forward to working with group representatives in 2012 to further this effort. 
 


From a U.S. perspective, the Securities and Exchange Commission (SEC) has been very active in its efforts to regulate the U.S. financial markets as it seeks to reduce risk, increase market transparency and ultimately restore investor confidence. Increasingly, rule filings have focused on the role of technology as both the source of, and solution to, current market issues. To assist the market in comprehending and addressing the implementation issues raised by SEC and Financial Industry Regulatory Authority (FINRA) filings, FPL has worked alongside the Financial Information Forum (FIF) to address recent transparency initiatives.

This group is working with the U.S. regulators to encourage the adoption of FIX for new regulations as they are being developed. Involvement at this stage, together with the promotion of FIX business practices, has proven to enable market participants to implement FIX more efficiently and reduce errors, thus generating significant industry-wide savings. Trade reporting is key to measuring firm compliance, and expanded use of FIX for equities and fixed income assists in controlling development costs. During 2011 this has also included encouraging the use of FIX for reporting under the SEC’s proposed Large Trader Identification rule, which will enable the SEC to more easily review trading activity from buy- and sell-side firms.

FPL has been providing expertise and recommendations to both the Commodity Futures Trading Commission (CFTC) and the Department of Treasury Office of Financial Research in the areas of reporting, identifiers, and financial instrument classification. For example, FPL has been working with the members of the Futures Industry Association to enhance FIX trade reporting messages to meet CFTC Large Trader Reporting requirements. 

From an Asian perspective, FPL continues to reach out to regulators across this diverse region in order to highlight the benefits of standardisation and the many advantages that using FIX presents.

Meeting the Challenges of Risk Management 

As market volatility continues to threaten the achievement of financial stability, the use of effective risk controls in automated trading has become essential in protecting not only the parties involved in the trade, but also the wider integrity of the market, from flawed electronic orders. Keen to raise awareness of the risk issues presented by electronic trading, and how the adoption of common practices could help mitigate these, in early 2011 FPL developed an initial set of guidelines recommending risk management best practices in equities electronic trading for institutional market participants.
 
Determined to ensure that the proposed guidelines comprehensively meet business needs, FPL has sought wide industry feedback. A number of market participants from across the globe took part in this review, and the guidelines’ value was recently recognised by the European Securities and Markets Authority (ESMA), which in October published a consultation paper referencing them. FPL is currently incorporating the feedback received, and the guidelines are also being extended to support the trading of futures and options, ready for their release in the New Year.
 
Additional to this, the FPL Americas Buy-Side Working Group has also commenced an initiative focused on risk mitigation. The group is looking at the potential to increase the use of test symbology for all electronically traded asset types to encourage more secure, reliable and compliant business processes. We look forward to hearing more about this initiative as it progresses. 
 
 


 
Working with the Buy-side to Increase Transparency and Mitigate Risk 

To ensure that the needs of the end investor are protected, increasing market-wide transparency and mitigating risk throughout the entire trading life-cycle is very important. Due to the buy-side’s close proximity to the end investor, FPL is keen to fully understand and effectively support the business challenges facing this industry sector. 
 
Over recent years, a central component of this effort has been the formation of regional buy-side focused working groups. Recognising that, in an increasingly global trading environment, many buy-side challenges, whilst having mild regional variances, are experienced worldwide, FPL is keen to ensure that the knowledge achieved and initiatives developed regionally are able to benefit members globally.
 
A prime example of how this is being achieved is the best practices around Execution Venue Reporting that FPL published earlier this year. The proposed practices were developed to enable buy-side industry participants to achieve increased transparency and a more consistent response from their broker-dealers on the reporting of the execution venue on each fill. Over recent months, FPL has worked closely with buy-side representatives as they have approached their brokers to encourage adoption of these practices. This initiative, originally developed by the FPL Americas Buy-side Working Group, is now being reviewed by other FPL buy-side focused groups globally so they can understand how the best practices can be applied to achieve increased transparency in their local markets. 
 
An additional area currently under the spotlight by the FPL Americas Buy-Side Working Group is post-trade allocations and the current risks prevalent within this space. The group is exploring the use of FIX messages bilaterally with their counterparties for equity allocations. FIX has offered functionality within this area since 2005 and the group is looking at how it could increase adoption. The work has focused on developing best practices that are acceptable to both the buy- and sell-side community utilising FIX to support the equity post-trade allocation processes. We look forward to hearing more on this initiative in the coming months. 
 
In support of FPL’s efforts to more effectively meet buy-side needs, in 2011 the organisation welcomed the UK based Investment Management Association (IMA) and the Bundesverband Investment und Asset Management (BVI), which represents the investment industry in Germany, to the FPL membership. We look forward to their participation in the many active FPL committees and working groups in the coming year.

Encouraging the Consistent Use of Standards across the Asset Classes and by the Exchanges Community

As the US Dodd–Frank regulatory reforms seek to achieve greater market transparency by requiring most types of OTC derivatives to be cleared through clearing houses and traded on swap execution facilities (SEFs), a surge in new market venues is expected in the US. Similar reforms are also expected to emerge from the upcoming MiFID II regulations in Europe. As these new markets develop, FPL member firms have expressed a desire to encourage the increased use of FIX, and other standards, by both existing and emerging trading venues. Adoption of open standards would enable industry participants to connect to these SEF market venues in an efficient and cost effective manner. To achieve this goal, members of the Global Fixed Income Committee are currently reviewing the functionality FIX offers to ensure it can effectively support the required processes for trading credit default swaps (CDS) and interest rate swaps (IRS). The group is producing guidelines that build upon methods currently employed today, with the aim of releasing this best practice documentation, together with enhancements to the FIX Protocol, ready for industry adoption in Q1 2012.
 
To enable FIX to comprehensively meet the business needs of the foreign exchange trading community, in 2011, FPL turned its attention to foreign exchange options. Keen to improve FIX support for both the exchange traded and OTC foreign exchange options markets, the work is initially focusing on vanilla options and simple strategies (including non-deliverable currencies) and then moving onto exotics and more complex strategies. Over recent months, the Global Foreign Exchange Committee has been reviewing business requirements and examining the protocol to understand how it can be enhanced to more effectively support these trades, with the aim of releasing new functionality in Q1 2012. 
 
To provide enhanced FIX support for the listed derivatives trading community, in 2011, the Global Derivatives Committee incorporated enhancements to the protocol to better support post-trade clearing activities. Further enhancements were also made to the protocol to support evolving exchange requirements in the areas of order handling and quoting models. 
 
A major area of new functionality was added to the FIX Protocol to support the communication of parties reference data, including setting risk limits and entitlements between parties. New messages have been developed to not only communicate this information but to also define and update this information. 
 
Encouraging the standardised adoption of FIX by the exchanges community has been a key area of focus for the Global Exchanges and Markets Committee over recent years, due to the large number of market participants required to connect to exchanges and the significant cost savings that can be achieved by being able to do so in a standardised manner. In 2011, these efforts continued with FPL offering significant support to a number of exchanges, including many based in Asia, to encourage standardised implementations.
 

 

Developing a More Transparent Approach for Latency Measurement 

In today’s latency sensitive environment it is not surprising that achieving and promoting strong latency results is common practice for many firms. However, FPL member firms have questioned to what degree latency measurements are comparable when there is no standard approach to measuring latency. For example, there are many points in the transaction of a trade where latency could be measured, and no common measurement points are identified. Additionally, latency is often quoted as a single number; with exchanges processing many thousands of orders each day, this number can often reflect only the trades that achieved the strongest results, which may be just a small proportion of the overall flow.
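To illustrate the concern, a single headline figure such as the minimum observed latency can diverge sharply from what most orders experience. A brief sketch (the function name and sample values are invented for illustration) of summarising a latency distribution with percentiles rather than one number:

```python
import statistics

def latency_summary(samples_us):
    """Summarise latency observations (in microseconds) with a few
    percentiles instead of one headline number."""
    s = sorted(samples_us)
    # Nearest-rank percentile over the full sorted sample set.
    pct = lambda p: s[min(len(s) - 1, int(p / 100 * len(s)))]
    return {
        "min": s[0],                      # the flattering headline figure
        "median": statistics.median(s),   # what a typical order sees
        "p99": pct(99),                   # the tail that hurts
        "max": s[-1],
    }

# A few fast fills plus a couple of slow outliers: the minimum (90us)
# says little about the 99th percentile experience.
print(latency_summary([90, 95, 100, 100, 105, 110, 400, 2000]))
```

Guidelines that mandate reporting at agreed measurement points and across the whole distribution make such figures comparable between venues.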
 
A further challenge is the actual process of measurement across disparate locations. To measure latency in different locations, a common timing device is needed at either end of the communications link and currently only proprietary communication protocols are available to market participants. 
 
To provide a more transparent latency measurement approach, under the umbrella of the Global Technical Committee, the FIX Inter-Party Latency Working Group has been busy developing the industry’s first free, open and non-proprietary standard to consistently measure the latency of a trade as it travels through systems at exchanges, trading venues and investment banks. In addition, the group is preparing a set of industry guidelines to encourage common latency reporting, enabling like-for-like comparisons. Both of these deliverables will be available to market participants early in 2012.

Supporting Local Trading Needs

As an organisation, FPL has witnessed strong growth within the relatively challenging climate presented by recent years, with a 40% increase in members since the start of 2010. This growth is an indication of the return on investment and significant value membership offers to the trading community. 
 
Over this period of expansion, the geographical diversity of members has increased, along with the desire from member firms to lead localised initiatives that support local trading needs. This has led to the organic growth of a number of new subcommittees. Since the start of the year, the organisation has formed new groups in Germany and in the Nordics, supplementing the FPL India and Middle East North Africa groups formed in 2010 and the multiple others already collaborating to address both the business and technical challenges facing their markets.
 
These are just some of the many ways in which FPL is working to support the ever expanding needs of the trading community. As an industry-driven organisation, we encourage all members to play an active role in the future of FPL and get involved in initiatives that are shaping the future of the trading environment. In 2012, we plan to continue our efforts and ensure that FPL not only meets but exceeds our members’ expectations in addressing business challenges, so that the industry as a whole can trade even more effectively and efficiently.

For more information about how your firm could get involved with FPL please visit: www.fixprotocol.org or contact fpl@fixprotocol.org.

Commodities Trading with FIX

CME Group’s Fred Malabre and Don Mendelson chart the history of electronic commodities trading and discuss the recent improvements in FIX for commodities, including fractional pricing, trading listed strategies and faster market data.
Fred Malabre
Adoption of FIX for Commodity Trading
Products traded on CME Group exchanges have many underlying asset classes, including commodities, interest rate instruments, foreign exchange and equity indexes. Although the largest share of products traded today on our platforms are financial futures, the history of our markets began with agricultural and livestock commodities. The underlying physical commodities include the petroleum complex, agricultural products such as soybeans, wheat and corn, and metals such as gold and copper.
Listed contracts in all of our markets include futures and options on futures. A futures contract can be physically settled, meaning that the seller has an obligation to physically deliver the commodity to the buyer when the contract expires, at a delivery point specified by the contract, or cash settled, meaning the contract is settled against an index or reference price rather than by physical delivery.
Back in 2001, we were at a crossroads. Futures contracts were primarily traded via open outcry – brokers shouting and waving arms in trading pits. That year, CME launched its first FIX compliant interface to electronically match orders for a wide range of asset classes, ultimately including equity indexes, FX, interest rates, real estate, weather, economic events, energy, metals and agriculture. The question arose: how do we represent orders and execution reports and transmit them between firms and the exchange? CME had earlier put out its own order routing API, but it had some drawbacks, including the high level of software developer support required; firms were running several different computing platforms, for example.
At that time, the FIX Protocol had begun well in the equities world. It was attractive because it was not an API, but rather a standard for message exchange. From the exchange’s perspective, this was an attractive proposition: we would develop our side of the conversation, and firms would develop theirs. From the firms’ perspective, their software developers could achieve high performance, only limited by their own imagination and skills.
It seemed that with a few adjustments, FIX could be adapted for futures and options trading. In version 4.2, fields were added to support derivatives, including expiration and strike price. We participated in working groups to standardize those changes, and later helped spawn FIX-based solutions for market data and FIXML for clearing.
An example of a problem with adapting an equities standard to commodities is fractional pricing. Traditionally, agricultural prices were stated in fractions. To this day, soybean futures are quoted in increments of ¼ of one cent per bushel. US equities made a leap from fractional to decimal pricing in 2001. Although trade pricing stuck with tradition, we decided to follow FIX conventions with decimal pricing in messages.
We added some custom indicators within our FIX interface to facilitate conversion to a fractional display. One indicator represents the main fraction and another represents the sub-fraction. For example, for a product ticking in ½ of 1/64, we would send the main fraction as 1/64 and the sub-fraction as ½. These indicators can then be used to convert a decimal price sent over our FIX interface, such as 105.0390625, into a screen display of 105 2.5/64. We found that the FIX Protocol is easy to extend with custom features to meet legacy needs, with negligible impact on customers not using the new tags, such as the custom indicators described above.
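As a rough illustration of the conversion (the function and its signature are hypothetical, not CME’s actual implementation), the decimal-to-fractional display logic might look like this:

```python
from fractions import Fraction

def to_fractional_display(price: float, main_frac: Fraction) -> str:
    """Convert a decimal FIX price into a fractional screen display.

    main_frac is the product's main tick fraction (e.g. 1/64); a
    sub-fraction tick such as 1/2 simply appears as a half-step in
    the displayed numerator (e.g. 2.5/64).
    """
    whole = int(price)
    # Tick fractions have power-of-two denominators, so the float
    # remainder converts to an exact Fraction with no rounding error.
    remainder = Fraction(price - whole)
    numerator = remainder / main_frac  # e.g. 5/2 when remainder = 2.5/64
    return f"{whole} {float(numerator):g}/{main_frac.denominator}"

# A product ticking in 1/2 of 1/64:
print(to_fractional_display(105.0390625, Fraction(1, 64)))  # 105 2.5/64
```

The same decimal price flows through standard FIX tags unchanged; only the display layer consults the custom fraction indicators.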

Don Mendelson

Although the FIX New Order Single messages sent to CME Group would look familiar to a counterpart in equities, futures trading differs from equities in several respects.
Market makers use a FIX Mass Quote to set prices for option series. Mass Quote was introduced in FIX version 4.3, but back-ported into the iLink FIX 4.2 interface.
Most futures and options on futures are traded in strategies. The simplest is a calendar spread, playing on the difference in price between two delivery months. Similarly, a butterfly spread is based on three contract months. Another type is the inter-commodity spread. A classic example is the crush spread, which has three legs based on the relative value of soybean products: unprocessed soybeans, and two derived products, soybean oil and meal. At CME Group, futures strategies are listed and can be traded just like outright contracts.

Options strategies can be quite complex. Rather than list all the possible combinations, traders initiate creation of a strategy by sending a FIX Security Definition Request. Once created, the new strategy becomes a listed contract for anyone to trade. However, complex option strategies are still heavily traded in the pit; less than half of our commodity option volume is traded electronically.
Another difference from equities markets is the nearly 24-hour trading cycle. Our trading system starts up on Sunday afternoon and runs continuously through Friday. Trading sessions run past midnight, so they are not concurrent with calendar days. Customers accustomed to daily shutdown cycles typically have to make some adjustments for a week-long session.
We replaced the previous proprietary market data format with a FIX-based format whose efficiency enabled firms to dramatically cut the network bandwidth they needed, and consequently their costs. The market data feed includes implied prices, which the match engine determines by combining contract order books. The implied price of a listed calendar spread is derived from the actual prices of its outright contracts; conversely, the price of an outright can be inferred from the best possible combination of a calendar spread the contract is part of and another outright price.
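A simplified sketch of the two directions of implication, assuming a calendar spread quoted as near-month price minus far-month price (conventions vary by product; this is illustrative, not the match-engine logic):

```python
def implied_spread_offer(near_offer: float, far_bid: float) -> float:
    """Offer on a near-minus-far calendar spread implied by the outright books:
    buying the spread means buying the near leg at its offer and selling the
    far leg at its bid."""
    return near_offer - far_bid

def implied_outright_bid(spread_bid: float, far_bid: float) -> float:
    """Bid on the near outright implied by a resting spread bid combined with
    the far leg's bid."""
    return spread_bid + far_bid

print(implied_spread_offer(101.50, 100.75))  # 0.75
print(implied_outright_bid(0.50, 100.75))    # 101.25
```

The effect is that liquidity resting in the spread book and in the outright books reinforces each other: an order in either book can tighten the displayed market in the other.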
It is safe to say that the usage of FIX in commodities trading will evolve, and CME Group will apply its FIX technology to other markets around the world. This year we successfully deployed our FIX enabled electronic platform to cover derivatives trading in Brazil, and we expect plenty more to come.

Trading from the Front: A CEP Hedge Fund

Corwin Yu, Director of Trading at PhaseCapital, sits down with FIXGlobal to discuss his trading architecture, the proliferation of Complex Event Processing (CEP) and why he would rather his brokers just not call.
FIXGlobal: What instruments does your system cover?
Corwin Yu: At the moment, we trade the S&P 500, and we have expanded that to include the Russell 2000, though not as individual instruments but as an index. We also trade the E-Mini futures on both the Russell 2000 and the S&P 500. We have investigated doing the same type of trading with Treasuries using the TIPS indices, the TIPS ETFs and a few similar futures. We are not looking at expanding the equity side except to consider adding ETFs, indices, or futures of indices.
FG: Anything you would not add to your list?
CY: We gravitate to liquid items with substantial historical market data because we really do not enter into a particular trading strategy unless there is market data to do sufficient back testing. Equities was a great fit because it has history behind it and great technology for market data, likewise for futures, where market data coverage has recently expanded. Options is a possibility, but the other asset class that is liquid but not a good fit is commodities. We shy away from emerging markets that are not completely electronic and do not have good market data. While we have not made moves into the emerging markets, we know that some other systematic traders have found opportunities there.
FG: How much of your architecture is original and how often do you review it for upgrades?
CY: In terms of hardware, we maintain a two year end-of-life cycle, so whatever we have that is two years old, we retire to the back-test pool and purchase new hardware. We are just past the four year mark right now, so we have been through two hardware migrations. Usually this process is a wakeup call as to how technology has changed. When we bought our first servers, they were expensive four-core machines with a maximum memory of 64 GB. We just bought another system that can handle 256 GB through six-core processors. We are researching a one year end-of-life cycle because two years was a big leap in terms of technology and we could have leveraged some of that a year ago.
In terms of technology, the software and architecture around the platform is completely different. When we first started, we built prototypes that were back-testable and able to migrate into trading strategies directly. In a lot of our strategies, however, we were surprised at how different market data was when we moved from a historical environment to a live environment. Architecture has changed immensely in the last few years, especially when we switch asset classes and add exchanges. When we first started, most of the strategies and architectures between counterparties were a blue sky implementation, yet the system we put into production and what we have now are completely different.
FG: How have regulatory changes affected you?
CY: When you do a ten-year back-test, you do not factor in a lot of the compliance changes from year to year. An easy example is the Reg SHO short sell compliance rules, which have evolved in the last few years. When we first started, we knew about Reg SHO and factored it in, but obviously that changed again in the last year. The strategies themselves did change a little, but Reg SHO mostly affected our models in terms of factoring in the locates, the liquidity of locates from broker dealers, sourcing stock loans, the rates for the pricing model, transaction costs, etc. Obviously you cannot factor all of that into the back-test or it would take years to run.
In terms of other compliance changes, the only changes in futures strategies were in terms of the margin requirements, leverage and the loss or exchange of credit. All our models run on one-time or two-times leverage to be as conservative as possible, and we try to build our new models assuming relatively low leverage.
FG: Has your latency been affected by adding new compliance or risk layers?
CY: We use Lime Brokerage for their risk layers, which are surprisingly fast for us. We have not made many changes in the risk layer other than to change the models to incorporate fat finger checks, buying power checks and short sell checks. Although it slows us down slightly, we have also established a redundant check on our side.
We found that the broker-level checks worked for compliance, but they were not detailed enough for a strategy-by-strategy risk check. As an agency broker dealer, they are doing their best to cover themselves and their clients, but because their clients are broad, the actual compliance check is going to be equally broad. We insert our own risk checks, not to circumvent Lime, but as an additional layer. Investors want as many checks as possible. In this environment, it has always been ‘trust, but verify’, and if you do not or cannot really verify, you need to put your own check in.
FG: What is important to you in an executing broker?
CY: We are really focused on the quality of the technology and the infrastructure, but we are not that concerned about customer service. We tell all our brokers the same thing, ‘if I never call you, that means you are doing a great job.’ I am not actually going to call someone to make a trade, so there is no point to picking up the phone. Even if something goes wrong, it is already too late to intervene. Depending on the issue, we will stop trading and reduce risk while we assess or trade through the problem and then go through a large endeavor to clean it all up. That is why we emphasize the technology to prevent a situation like that. We look for strong, redundant infrastructure, actual co-location, with the real cross-connected DMA.
A solid testing platform is also helpful, and good access to market data is an important requirement for us. Their latency model does not even have to be top of the line, but it should be at least within our specs. It is a cold war now; not everyone can be the fastest. For us, second or third fastest is acceptable as long as they have a strong infrastructure backing trading strategies we believe in. For instance, Lime’s co-location is actual co-location in the NJ2 data center, using the Savvis and direct Mahwah feeds. Many broker-dealers claim to have DMA, but actually offer a smart order router that mimics DMA. We are looking for a robust model and a certain pedigree of users on that infrastructure.
FG: Will CEP be widely adopted in the next few years?
CY: I think so, but it is harder to differentiate CEP from the algorithms. CEP is more of a technological train of thought that evolved from taking fundamental aspects of data handling into a platform. The idea of CEP has been around for a long time: you have a trigger for an event and it happens in a stream and it happens quickly. Although people have been doing this for many years, the advent of new information models thrust CEP into the limelight. While I am not sure if every firm will buy a CEP platform, they are more likely to change their platform to mimic CEP. CEP will not become a commodity; it will be a model for creating applications and designing interfaces. Traders who want quick time-to-market and have the economics and appetite for migration will probably buy it, but a shop that can adjust its own development will probably opt to change its existing infrastructure to more closely resemble CEP.

Viewpoint : Alex Sedgwick : TCA in Credit Markets

Alex Sedgwick, MarketAxess

THE NEW CAPITAL MARKETS LANDSCAPE.

Alex Sedgwick, head of research at MarketAxess, explains that TCA in credit markets demands a different approach from equity markets, but that choosing the right methodology can not only demonstrate compliance with best execution obligations but also identify meaningful cost savings.

Regulation has ushered in a new era for trading, with sweeping changes to market infrastructure in motion on both sides of the Atlantic. In Europe, MiFID II and EMIR aim to create a more transparent and robust market by mandating central clearing and electronic trading for a broad range of derivative products. Exactly which products will fall under these regulations is still to be determined, but many over-the-counter products, which today are still predominantly traded over the phone, will soon have to be traded electronically.

The end-game for these regulations is to address the systemic risk in the derivatives markets that is widely believed to have brought about the credit crisis of 2008. But the forces of regulatory change that were unleashed by the credit crisis are not limited to derivatives risk management. The remit of MiFID, for example, has expanded to include additional asset classes; and regulators are also concerned with market participants’ ability to demonstrate how they have achieved the best price in executing a trade. Taken together, these changes ultimately aim to protect investors, represented by pension funds and mutual funds who are active in these markets.

Both of these ambitions have brought greater focus on the transparency and efficiencies offered by electronic trading. In highly liquid markets such as equities, a number of sophisticated approaches are widely accepted to measure the main components of trading costs – known as transaction cost analysis (TCA) – and thereby prove best execution. However in less actively traded markets, such as credit, analysing trading costs is somewhat more challenging. Although fundamental differences in the market structure and behaviour of credit markets prohibit an equities-like approach to TCA, there are unique characteristics that have allowed us to develop an alternative approach drawing on data that has become more widely available with the growth of electronic trading.

Characteristics of the corporate bond markets

A fundamental feature of the corporate bond markets compared to the equity markets is their highly fragmented nature. Compared to only approximately 11,000 equities listed on the two principal US equities exchanges, there are over 70,000 individual bond issues in the U.S. high grade and high yield corporate bond markets alone. Moreover, the individual characteristics of these bonds, such as maturity, optionality and credit quality, make a significant difference to pricing. A single issuer may have dozens or more different bonds, each with its own distinct profile. In addition, many of these bonds trade infrequently – the average bond trades less than twice a day – making them difficult to price accurately.

The advent of FINRA’s (Financial Industry Regulatory Authority) TRACE (Trade Reporting and Compliance Engine) reporting in the U.S. in 2002 brought significantly more transparency to the corporate bond markets. FINRA requires that all U.S. corporate bond trades be reported to the TRACE consolidated tape within 15 minutes of the trade taking place. This data is then fed back to the market, thereby enhancing participants’ ability to more accurately price even thinly traded bonds.

This consolidation of trade data and associated price transparency, alongside the greater adoption of electronic trading in these markets and the advent of new methodologies to analyse data, have allowed us to develop an objective approach to examine trading costs in the fixed income markets in new and compelling ways.

A new methodology

The most commonly used approaches for measuring trading costs in the equity markets take into account factors such as the commission, market impact, execution delay and opportunity costs associated with a given trade. All of these measures assume a critical mass of volume traded daily in order to provide an accurate benchmark for the cost of a given transaction, and also rely on the highly automated nature of equities trading for accurate pricing of a given stock at any point in time. As discussed above, due to the comparatively small volumes traded, these methodologies are not appropriate for the corporate bond markets.

For the fixed income markets, a more straightforward approach is required in order to achieve a realistic comparison within the limitations of the data available. Our approach is to analyse execution costs in the fixed income markets by looking at the embedded costs of trading, consisting of the dealer mark-up plus any fee associated with executing the trade. Using an electronic platform such as MarketAxess, we can easily calculate the trading fees associated with each transaction, and we can therefore determine the cost of execution by comparing the electronically executed trade to a ‘prevailing market price’ using the data from TRACE.
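A minimal sketch of this comparison, with hypothetical helper functions (not the MarketAxess methodology itself): the prevailing market price is estimated from nearby TRACE prints, and the embedded cost is the signed difference between the executed price and that benchmark.

```python
def prevailing_price(trace_prints):
    """Size-weighted average price of nearby TRACE prints.
    trace_prints: list of (price, size) tuples."""
    total_size = sum(size for _, size in trace_prints)
    return sum(price * size for price, size in trace_prints) / total_size

def embedded_cost_bps(exec_price, benchmark, side):
    """Embedded trading cost in basis points versus the prevailing price.
    Positive means the client paid a cost: buying above the benchmark,
    or selling below it."""
    sign = 1.0 if side == "buy" else -1.0
    return sign * (exec_price - benchmark) / benchmark * 1e4

bench = prevailing_price([(100.00, 500_000), (100.10, 1_000_000)])
print(round(embedded_cost_bps(100.30, bench, "buy"), 1))  # cost of this buy, in bps
```

In practice the hard part is the benchmark: deciding which TRACE prints are close enough in time and size to a given trade to constitute a fair prevailing market price.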

For our first detailed study of transaction costs in the fixed income markets, we took a sample of 900,000 TRACE-reported trades over a 24-month period. From these we identified 150,000 trades for which we could establish a prevailing market price and for which we were able to compare electronic trades with a sufficient number of TRACE prints.

Findings

The original intent of the analysis was primarily to quantify the cost of electronic trading compared to trading by voice in the corporate bond market. Our research found, at a statistically significant confidence level, that trading electronically consistently delivers cost savings to the end user across all trade size buckets and maturities.

In actively working with clients to generate reports quantifying the cost of execution (both voice and electronic), we have shown that this can be a key tool for compliance purposes and ultimately enables clients to demonstrate that they have fulfilled their fiduciary responsibility to achieve best execution. Over the longer term, these cost savings can also be shown to result in incremental gains and a measurable improvement in portfolio level performance.

This methodology has significant applications in responding to regulatory requirements and as such highlights the broader benefits of automation. The ability to calculate transaction costs in less liquid markets helps support one of the key goals of MiFID by providing a mechanism that can be used to prove best execution.

Benefits of electronic trading

One of the primary advantages of electronic trading is increased competition, resulting from the ability to request prices from a broad group of dealers. The RFQ (request-for- quote) model widely used in the over-the-counter markets has the advantage of enabling clients to contact multiple dealers simultaneously. Inquiries are disseminated in a fraction of a second to a large group of participants, thereby increasing opportunities to obtain better pricing and significantly increasing efficiency.
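In code, the dealer-selection step of an RFQ could be sketched like this (a toy illustration with invented names; real platforms handle timeouts, partial responses and two-way quotes):

```python
def best_quote(responses: dict, side: str):
    """Pick the winning dealer response to an RFQ.
    responses: dealer name -> quoted price.
    A client selling wants the highest bid; a client buying wants the
    lowest offer."""
    pick = max if side == "sell" else min
    dealer = pick(responses, key=responses.get)
    return dealer, responses[dealer]

# Client sells: the highest of the three bids wins
print(best_quote({"Dealer A": 100.10, "Dealer B": 100.30, "Dealer C": 100.20}, "sell"))
```

The competitive benefit comes simply from the size of `responses`: every extra dealer queried is another chance that the best price improves.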

By allowing easy access to a broader group of participants, the RFQ model has enabled the expansion of what was otherwise a highly fragmented market. The cumulative number of client and dealer trading connections on MarketAxess has more than doubled since 2008, and clients are receiving twice as many dealer responses per inquiry in U.S. high-grade corporate bonds as they were two years ago. Enhanced dealer liquidity and competition from a greater number of dealers on the platform are driving meaningful transaction cost savings for investors across all trade sizes. Lower transaction costs in turn are driving increased investor order flow – the overall estimated year-to-date share of U.S. high grade trading volumes on MarketAxess today is 11.1%, compared to an estimated 7.5% for the same period in 2008.

Conclusion

Through extensive research and analysis that has been made possible by, among other things, the presence of a consolidated trade tape in the US corporate bond market, we have been able to demonstrate the statistically and economically significant cost savings afforded by electronic trading in the fixed income credit markets. These savings result from the efficient aggregation of liquidity from a broad and growing group of market makers. The methodologies that we have developed can also be used by market participants to demonstrate compliance with their best execution obligations. We believe that similar benefits can accrue from the electronic trading of standardized derivatives as proposed by the European Commission, and from the increased price transparency afforded by a TRACE-like consolidated tape for the European markets, as envisaged in the MiFID II proposals. Ultimately, we believe that this will result in more stable, transparent and efficient over-the-counter markets.

©BestExecution 2011


 

The Truth about HFT

By Alex Frino

The Capital Markets Cooperative Research Centre (CMCRC)’s Alex Frino talks about his research over the past 18 months and his conclusions on the truth about high-frequency trading.

What inspired you to focus your research on High Frequency Trading (HFT)?

There is a very poor understanding of the impact of HFTs on the market place. There is a lot of ill-informed opinion in circulation about the impact of HFT on price volatility, and their contribution to liquidity. I wanted to provide some hard data to help markets move forward and inform sensible evidence-based policy decisions.

There was also considerable interest in the idea of conducting HFT research from our regulator partners, including the FSA and ASIC. 

What were your views on HFT at the outset of your research program? 

When we first set about doing the research 18 months ago, I began by speaking to the investment management community to gather their views and insights into HFT and its impact on their trading. The feedback I got was overwhelmingly negative. One comment sums it up best – an investment manager said to me that “liquidity provided by the HFT community is like fog – you can see it, but when you reach out to grab it, it is not there.” So I began the program expecting to confirm these dominant views. To my surprise, we discovered that the realities of HFT are almost exactly the opposite of what the investment managers were telling me.

HFT liquidity has been described as ephemeral by many on the buy-side. What does your research suggest about the ability of the buy-side to interact with HFT liquidity? 

We have done research with data from the LSE, ASX, SGX, NASDAQ and NYSE Euronext on exactly this subject. The exchanges furnished us with data that identifies when HFTs are present in the market place. We then looked at the make-take decision. HFTs make liquidity when they put up a quote that gets hit by someone on the other side of the trade. They take liquidity when they hit someone else’s quote. The data clearly showed that HFTs are net makers of liquidity.
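The make-take measurement he describes could be sketched as follows (hypothetical, working from pre-labelled trades; the actual studies rely on exchange-flagged data):

```python
def net_liquidity(trades):
    """Net liquidity provision: volume where the HFT's resting quote was hit
    ('make') minus volume where the HFT hit someone else's quote ('take').
    trades: list of (role, quantity) tuples. Positive = net maker."""
    made = sum(qty for role, qty in trades if role == "make")
    taken = sum(qty for role, qty in trades if role == "take")
    return made - taken

print(net_liquidity([("make", 600), ("take", 250), ("make", 150)]))  # 500
```

A positive result over a large sample is what supports the conclusion that HFTs are net makers of liquidity.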

Interestingly some of our data also included information about when firms are trading through co-located servers within the exchanges. This data too showed that co-lo HFT activity was also a net provider of liquidity in those markets.

Co-location is described by some as an ‘unfair advantage’. What is your take on that given your research into the area? 

My view is that if the advantage is being put to good use in providing liquidity, then it is not being misused. That pool of co-located flow is providing liquidity that would not be there otherwise, so I cannot see how that is a negative for markets. 

Many market participants – including recent widely-quoted comments by Andrew Haldane of the Bank of England – are critical of the speed and sophistication of markets generally, using HFT as their example. They argue the playing field is not level and that markets should be slowed to take away perceived unfair advantages. What is your view? 

I was frankly amazed by Haldane’s suggestion that markets should be slowed [by introducing speed limits and resting periods]. What he is in effect suggesting is that we should take markets backwards by a decade. That is astonishing to me because I just do not see the arguments. Market participants who do not have the technology to compete with other players can easily access brokers with algorithmic trading engines to help them execute their trades. If you cannot or do not want to build the technology yourself, you can outsource it fairly cheaply and very efficiently.

From an HFT perspective, our research demonstrates emphatically that the liquidity they provide is real and other participants interact with it constantly, so I cannot see a problem there either.

Earlier this year, we saw some weeks of incredible volatility in global markets. Do you have any insights into how HFTs behave during times like these? 

We have produced very good evidence that, contrary to popular views, there is a negative correlation between HFT activity and price volatility, meaning they actually act to smooth it out.

On face value that may seem counterintuitive but it is actually very straightforward. There are three broad classes of HFT trading:

  • there are market makers, who provide liquidity and are passive;
  • there are arbitragers who usually have one leg that is aggressive and one that is passive, but overall are neutral;
  • then you have position takers, and within the position taking pool of order flow you have trend followers, who could ostensibly exacerbate volatility. Yet, in that same pool you also have the mean-reverters who act against those who exacerbate volatility by trading against price movements.

So when you look at the data, the HFT pool overall is overwhelmingly made up of those who either do not impact volatility, and/or actively act against volatility. So it is not surprising that we find HFT to have a smoothing effect on overall volatility.

I would be very interested indeed to do further research into the different classes of HFT trading to see which styles impact markets in what ways. We do not have that detail in our data at the moment but it is something I would like to work on.

HFTs ignore company fundamentals in their trading decisions, and many have criticized them for this. What is your view on the HFT decision-making process and its impact on markets? 

HFTs do not impound fundamental information into prices; that is not their advantage, and therefore not their economic role. We have demonstrated convincingly that fundamental information is impounded into prices by the investment management community. What the HFT community is good at is extracting information from order flow and impounding that into prices. For instance, when an HFT identifies an imbalance in supply and demand, it will enter the market and provide liquidity because that is when the market will pay for it. That is particularly true of market maker HFTs.

Regulators globally are struggling to understand how the advent of HFT impacts their markets and what they should do about it. In your experience, how should regulators approach the issue? 

The first thing they must do is inform themselves. What we need is very deep and enquiring research that fully examines the role of HFTs in all market states. Before we make any rushed regulatory decisions we should await the outcome of this research. Too many times we have heard regulators around the world accept market folklore as fact. The limited research that has been produced to this point has already debunked a lot of the myths around HFT, and there is certainly more to be done.

HFTs are by nature a very secretive community – have they brought their reputational issues on themselves? 

Definitely. By its very nature an HFT organisation needs to protect its intellectual property and therefore they are not open to communicating widely about what it is that they do. However they have carried this attitude of secrecy across to their business communications as a whole, and for this reason they have failed to communicate the positive economic role they play in the financial community. For that reason the battles are one-sided and they are paying the price with the regulatory focus and general market negativity they are receiving. I think the HFT community really needs to come out from the bunkers and be more open about what they do.

Spotlight on: Aggregation

By Dmitry Rakhlin, Greg Lee, Steve Grob, Ned Phillips, Glenn Lesko

AllianceBernstein’s Global Head of Quantitative Trading, Dmitry Rakhlin, discusses the problem of fragmentation and what makes a good aggregator, along with Ned Phillips of Chi-East, Greg Lee of Deutsche Bank, Steve Grob of Fidessa and Instinet’s Glenn Lesko.


Dmitry Rakhlin, AllianceBernstein

How does aggregation improve trading and best execution?
Institutional traders usually demand (remove) liquidity from the markets, which in turn creates market impact. Being able to interact with aggregated liquidity (i.e. all available liquidity) lowers this market impact. Aggregated liquidity also gives a trader the ability to interact with many more liquidity sources, randomizing the way liquidity is taken from the market. This decreases the amount of information leakage and protects the trade from being exploited by predatory strategies.

Does aggregation spell the end of fragmented markets?
No. The US equity market is highly fragmented, yet all liquidity centers are interconnected, which allows traders to build various aggregator strategies. No doubt, there is cost and complexity associated with this. Fragmentation also introduces so-called latency arbitrage and a potential increase in information leakage (though that leakage can be drastically reduced by using appropriate trading strategies).

The positive aspect of fragmentation is that it creates rich market microstructure (traditional exchanges and exchanges with inverted fee structures, block crossing networks, auctions, conditional order types, aggregators of retail liquidity, etc.). These choices give the buy-side the ability to match their investment strategies to the appropriate liquidity sources and ultimately benefit by being able to trade more nimbly and at lower cost.

From your perspective, is aggregation about greater access to liquidity or reducing trading costs?
Both.

Ned Phillips, Chi-East

How does aggregation improve trading and best execution?

A good aggregator brings order to fragmented markets by concentrating order flows, and liquidity, from a large number of matching venues. It is a tool that allows all participants to access multiple venues from one easily accessible point, reducing the technology costs and other difficulties involved in monitoring different trading venues.

Does aggregation spell the end of fragmented markets?

No. Even if one good aggregator attracts a majority of trading flows, it would not represent a throw-back to a single exchange monopoly. Aggregators are there to make the process of using multiple markets easier and more efficient and can only exist as long as participants have a choice of matching venues.

What are the risks inherent in aggregation and how can an aggregator ensure improved execution?

Theoretically there is a risk that an aggregator will be so successful that it monopolises the market but competition and risk management would keep things in check.

An aggregator ensures improved execution by concentrating liquidity which reduces spreads and improves execution.

Greg Lee, Deutsche Bank

How does aggregation improve trading and best execution?

As markets evolve and become more fragmented, liquidity aggregation has become a core part of the modern broker electronic offering.

Does aggregation spell the end of fragmented markets?

A good aggregator today will leverage both smart order routers and dark aggregation algorithms to access a wide range of liquidity pools to ensure they have access to as many sources of liquidity as possible. Once connectivity to different venues is established products can apply logic to ensure execution performance is improved from using alternative sources of liquidity. The best of these products rely on sophisticated anti-gaming protection, easy to use interfaces and transparency of where liquidity was found.

How does aggregation augment the services offered by brokers to their buy-side clients?

Anti-gaming or other protection methods are key to making a good aggregator. Simple features such as minimum quantity settings can help to avoid people fishing for your order. More advanced products will use real time models to adjust limits and venue selection to ensure no execution is done at an unfavorable price.

While technology is essential to be a good aggregator, being a trusted partner who is transparent about the methodology and source of liquidity is still the key part of any trading relationship.

Steve Grob, Fidessa

How does aggregation improve trading and best execution?

The process of aggregation is all about a broker trying to get to the front of the queue for buy-side order flow. The basic idea is for a broker to offer to sweep or aggregate other sources of dark liquidity in the event that a match cannot be found in that first broker’s pool. In this way the broker takes responsibility for scanning and executing a client order across his and his competitors’ liquidity. In theory this should offer better execution by automating the process of interrogating sources of dark liquidity.

Does aggregation spell the end of fragmented markets?

No, because market forces and competition will always mean that multiple sources of liquidity reside across the lit-dark spectrum. The irony is that whilst regulators are introducing competition and fragmenting markets, firms like Fidessa are providing the tools to glue it all back together again.

What are the risks inherent in aggregation and how can an aggregator ensure improved execution? What technology is required?

Naturally there are some signalling risks if every broker is trying to aggregate each other’s liquidity, as a client order, or a portion of it, may find that it runs into itself through another broker’s aggregation service. On top of this there is an inherent risk around information leakage as orders are broken up and sent across multiple pools of liquidity. These risks need to be managed by effective use of technology. The starting point, though, is forming a realistic view of the potential pools available for any given client instruction. This needs to be based upon historic analysis of how orders of different sizes and shapes have performed against different dark pools and resting times.

On top of this, a buy-side firm may wish to avoid a particular source of liquidity, and so this will need to be reflected in the routing tables of the aggregating broker as well. A number of different algos are required to ping each dark pool and then adjust subsequent activity according to the success they have had. So in many cases, after an initial sweep, an algo will then revisit those sources that were successful whilst “listening” to other destinations and making adjustments accordingly.
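The sweep-then-revisit pattern described here can be sketched as follows. This is a simplified illustration under stated assumptions, not Fidessa's actual router: the `DarkPool` stub and `adaptive_sweep` function are invented for the example, and a real router would also weight venues by historic fill analysis and client routing restrictions.

```python
import random

class DarkPool:
    """Toy venue stub: fills child orders until its hidden liquidity runs out."""
    def __init__(self, name, liquidity):
        self.name, self.liquidity = name, liquidity
    def try_fill(self, qty):
        got = min(qty, self.liquidity)
        self.liquidity -= got
        return got

def adaptive_sweep(order_qty, venues, probe_qty=1000, rounds=3):
    """Initial sweep pings every venue; later rounds concentrate on venues
    that filled, while keeping an idle venue in rotation as a 'listener'."""
    filled = {v: 0 for v in venues}
    remaining = order_qty
    active = list(venues)
    for _ in range(rounds):
        next_active = []
        for venue in active:
            if remaining <= 0:
                break
            got = venue.try_fill(min(probe_qty, remaining))
            filled[venue] += got
            remaining -= got
            if got > 0:
                next_active.append(venue)   # revisit successful venues
        # Keep listening to one of the venues that did not fill this round.
        idle = [v for v in venues if v not in next_active]
        if idle:
            next_active.append(random.choice(idle))
        active = next_active or list(venues)
    return filled, remaining

pools = [DarkPool("A", 5_000), DarkPool("B", 0)]
filled, rest = adaptive_sweep(3_000, pools)
assert rest == 0 and filled[pools[0]] == 3_000
```

The design point is the feedback loop: fills shape the next round's routing, so the order naturally migrates toward wherever liquidity actually resides.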

Glenn Lesko, Instinet

How does aggregation improve trading and best execution?

Aggregation enables clients to access as many points of liquidity as possible. This is very important because, increasingly, accessing all pools is a staple of best execution mandates from regulators, clients and brokers. Additionally, many dark pools only trade at midpoints, so there are also significant spread savings to be attained.

Does aggregation spell the end of fragmented markets?

Aggregation is a solution for fragmented markets. It resolves some of the issues that people see as negative about fragmentation, and allows clients to react to fragmented markets. In and of itself, it does not impact the physical fragmentation of markets, but it does give institutions the tools to make the issues of fragmentation go away.

What is the profile of a firm that wants its flow to be aggregated, and what benefits do operators of pools and ATSs/MTFs receive from participating in aggregation?

It’s the really big, long-only fund groups who tend to hold positions that are very large relative to volume. The benefit is that users receive high-quality, nutritious long-term investment block flow.

Bloomberg Tradebook : Ray Tierney III

PUTTING THE CLIENT FIRST.

Raymond Tierney III, chief executive officer and president of Bloomberg’s agency broker, Bloomberg Tradebook, explains why staying client-focused is more important than ever.

Q. How are the current market conditions impacting your business?

A. It is certainly a volatile time, but it is a double-edged sword. Exchanges like volatility because it creates opportunities for trading, but it can also drive investors out of the market. For us, the most important thing is to remain nimble and client-focused. Bloomberg Tradebook had been much more of a product-focused organisation, but today we connect to over 80 markets across 50 countries, as well as 40 dark pools, to provide deep liquidity for our clients. We have a customer product team that is focused on helping clients improve and enhance their workflow and execution performance.

Q. I read that one of the conclusions from your recent forum – “The Future of the Buyside Trader” – was that market conditions could lead to consolidation in the agency broker business. What are the reasons for this?

A. There are a lot of headwinds working against brokerage firms, such as declining commission rates, slower economic growth, and fund flows going in the wrong direction. Buyside traders are also under intense pressure in the marketplace as their clients become more demanding in response to a volatile market. The buyside expects agency brokers to provide more acute market insight and greater awareness of products and services. Sales traders can add value to the trade by providing high-quality market intelligence and research, as well as innovative technology, to help clients maximise their performance. This requires heavy spending on technology and data, and many of the independent platforms, as well as small to medium-sized brokers, are struggling. An informal poll taken at our buyside event found that attendees, who were predominantly buyside traders, believe agency broker consolidation is a potential consequence of these challenges.

Q. A recent Greenwich report shows that US institutional investors have cut back on electronic trading and are pushing more deals through high-touch broker sales desks over the past year. Are you seeing this?

A. I would agree that there is less business being passed around. The buyside is trying to figure out how it is going to pay for research, corporate access and capital markets, and as a result it is shifting to higher-touch service and away from electronic trading. The other driver is that the current volatility we are experiencing pushes people to trade more in lit venues rather than dark pools. However, I think this is a short-term trend and not a secular change.

Q. Can you discuss in more detail the independent research service that you launched in May?

A. We launched our independent research services in May with seven providers and we now have 16. Our goal is to extend that to 25 by the end of the year. We apply a deep, rigorous and analytical process to identify leading firms, and have names such as Two Rivers Analytics, Thompson Research Group, Turning Point Analytics and Veritas Investment Research, which all have a record of performance. Clients receive the reports, but they can also contact the analyst directly. We also have a flexible engagement structure. That is, clients only pay for the research they want, and they can do this by using hard dollars, commission sharing arrangements or bundled commissions.

The other difference is that our research services are supported by a dedicated team of highly trained research consultants who act as the client’s single point of contact. They know their needs and can offer a much more tailored service.

Q. Can you discuss in more detail the execution consultancy that you launched in May?

A. I spent four years at Morgan Stanley Investment Management. From that side of the market, I observed a reliance on brokers for market insight and product expertise, but limited transparency as to how products actually performed outside the organisation. I wanted to develop an execution service at Bloomberg Tradebook that helps clients understand how to leverage a wider range of trading strategies. This has become much more important with the increase in high frequency trading.

Through Bloomberg Tradebook’s execution consulting services, we provide advice on liquidity and how to map order flow to its proper destination using the appropriate technology, data and analytics. We also offer real-time evaluation of trades and can suggest how the process can be improved. Currently, these skills and advice are typically split across different disciplines at most sellside firms, but we give clients a single execution consultant who is responsible from pre- to post-trade and all areas in between.

We also put our consultants through a 14-step high-level accreditation programme. They need to not only understand, for example, how the algos work but also the logic behind them as well as transaction cost analysis, execution capabilities and market structure (micro and macro).

Q. What has the demand been for the new algorithm you launched earlier this year for fund managers to automate the process of rebalancing their portfolios without the help of a brokerage trading desk?

A. We are seeing increased interest in our algorithmic products because the buyside wants to automate a greater percentage of order flow and speed time to execution, but not enough people have gone down the rabbit hole to appreciate how that is accomplished. Building algos into trading platforms requires highly skilled technical resources as well as a sophisticated understanding of market dynamics.

Bloomberg Tradebook’s portfolio trading algorithm gives the buyside an optimal way to manage global equity portfolios across multiple markets, currencies, and time zones. We also have other tools such as dynamic auction algorithms, which enable traders to fit auction liquidity into daily workflow as well as PAIR algorithms, which look to take advantage of ratio-adjusted, absolute or merger statistical arbitrage opportunities. Part of this service is the strategy analyser, which recommends the optimal strategy and algo to use based on the trader’s requirements, benchmarks and market conditions.
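The “ratio-adjusted” pairs idea mentioned above can be illustrated with a minimal signal calculation. This is a generic textbook sketch, not Tradebook's PAIR algorithm: the `pair_zscore` helper is invented for the example, and a production system would add hedging ratios, transaction costs and execution logic on top.

```python
import statistics

def pair_zscore(prices_a, prices_b):
    """Score how far the current price ratio of two related instruments has
    drifted from its recent mean, in standard deviations; large absolute
    values flag a potential statistical-arbitrage opportunity."""
    ratios = [a / b for a, b in zip(prices_a, prices_b)]
    mean, sd = statistics.mean(ratios), statistics.stdev(ratios)
    return (ratios[-1] - mean) / sd if sd else 0.0

# Stock A jumps while B is flat, so the ratio sits well above its mean.
z = pair_zscore([10, 10, 10, 12], [10, 10, 10, 10])
assert z > 1.0   # the pair looks stretched; a mean-reversion trade may apply
```

A trader would typically sell the relatively rich leg and buy the cheap one when the z-score exceeds a threshold, unwinding as the ratio reverts.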

Q. Looking ahead, what do you see as your greatest challenges?

A. One of the biggest challenges is the headwinds present in today’s economy, which are influencing market behaviour as well as our clients’ business. The uncertainty of the final Dodd-Frank regulations is also impacting investment decisions as well as the competitive landscape. This is all taking place in an extremely competitive and dynamic market where it seems like commissions are in a race to zero.

At the same time, I think changing markets always present new opportunities. That’s what markets do! This is why agency brokers like Bloomberg Tradebook are trying to expand their reach beyond execution services, whether with research or execution consulting services, so that the business is not so aligned with market volatility. Today, agency brokers have to evolve from ‘execution only’ to ‘execution everything’; that is, we have to be client-focused, technically proficient, market-savvy and the first to offer intelligent research and innovative products that clients view as indispensable, so they can justify covering agency fees.

BIOGRAPHY
Raymond Tierney III is the chief executive officer and president of Bloomberg Tradebook. Prior to joining Bloomberg, Tierney was at Morgan Stanley for 16 years, starting in 1994 as an executive director and senior sales trader. He became managing director in 1998 and named head of North America cash sales trading in 2004. Tierney was named global head of equity trading for Morgan Stanley Investment Management in 2006, responsible for leading the firm’s worldwide trading efforts. He started his Wall Street career as a trader at Paine Webber (1981–1986), served as a vice president at First Boston (1986–1991), and was a vice president and sales trader at UBS Warburg (1991–1994). He received a Bachelor of Arts degree from Villanova University.
©BEST EXECUTION

Marcus Hooper : Agora

COMING OF AGE.

The landscape may be crowded with trading systems, but Marcus Hooper, formerly executive director, Europe, of Pipeline Financial, believes there is more than enough room for his firm.

Q. The company launched its final phase of European operations this May, but the business was started in 2009. Why was there such a time lag?

A. Block trading is a competitive business and, while there was very strong interest in our block trading, we had to build commitment. We had been running a restricted service in Europe since early 2009, which included our Algorithm Switching Engine, an order execution and dark pool aggregation tool. However, we did not launch the Block Board because we needed to build significant critical mass, and it took time to get the order flow. Although it takes time for a product to gain traction, the volumes have so far exceeded our expectations, particularly given that this past August was one of the most volatile months. We had over 100,000 block-sized orders in the first six weeks, mostly a mixture of large- and mid-cap stocks.

Q. Can you explain in more detail about the Block Board?

A. The Block Board is a crossing system for large block trades. Unlike many other MTFs, the Pipeline Block Board is aimed at institutional investors who want to execute block orders in a single trade with little market impact. It is not a dark or lit pool but uses a flagging system, effectively a heat map of trading scenarios, that is distributed to all of our clients. We have three flags: orange, which indicates live available liquidity at the midpoint, without revealing buy/sell direction or quantity to the trading counterparty; yellow, which indicates that a match is available to parties that have live orders, but not at the midpoint; and black, which indicates to a trader that there is a potential match but with a smaller quantity than desired. The minimum size of an order in Pipeline is typically around 0.5% of the average daily value traded of a stock, but users have the ability to specify a larger minimum acceptable quantity.
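The three-flag scheme can be summarised as a simple decision rule. This is a hypothetical reading of the description above, not Pipeline's actual matching logic (in particular, the precedence between the quantity and midpoint conditions is an assumption, and the `block_board_flag` helper is invented for the example):

```python
def block_board_flag(has_contra, contra_at_midpoint, contra_qty, desired_qty):
    """Map a trading scenario to a Block Board flag:
    orange - live liquidity available at the midpoint
    yellow - a match exists, but not at the midpoint
    black  - a potential match exists, but smaller than the desired quantity
    None   - no contra interest at all
    """
    if not has_contra:
        return None
    if contra_qty < desired_qty:
        return "black"
    return "orange" if contra_at_midpoint else "yellow"

assert block_board_flag(True, True, 50_000, 40_000) == "orange"
assert block_board_flag(True, False, 50_000, 40_000) == "yellow"
assert block_board_flag(True, True, 10_000, 40_000) == "black"
```

Note how no branch ever exposes the contra's direction or exact quantity; only the coarse flag is published to clients.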

Q. You recently launched a new product called Alpha Pro. Can you give some more detail on the product?

A. Alpha Pro is a recent development based around our Algorithm Switching Engine, an order execution and dark pool aggregation tool that allows users to select, in real time, the best algorithm to reduce implicit costs, including market impact. Both the Algorithm Switching Engine and Alpha Pro are entirely proprietary. Alpha Pro looks at the characteristics of a client’s order flow and generates trading strategies customised to fit a particular trading scenario, based upon the exact profile of the client’s order. It takes into account a variety of factors, such as historical performance, the benchmark (volume weighted average price (VWAP) or implementation shortfall), and how aggressive or passive the client wants to be. It can identify the factors most likely to impact price movement and recommend a trading strategy to maximise alpha capture and minimise market impact. On average, it can improve execution quality and reduce implementation shortfall costs by around 40%.
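Benchmark- and profile-driven strategy selection of the kind described can be sketched as a small rule table. This is an illustrative assumption, not Alpha Pro's proprietary model: the `recommend_strategy` function, its parameters and the strategy names are all invented for the example.

```python
def recommend_strategy(benchmark, urgency, adv_pct):
    """Pick a trading strategy from the order's benchmark, the client's
    urgency ('aggressive' or 'passive'), and the order's size as a
    fraction of average daily volume (adv_pct)."""
    if benchmark == "VWAP":
        return "vwap_schedule"          # track the volume curve all day
    if benchmark == "implementation_shortfall":
        # Orders that are large relative to ADV trade more passively
        # to limit market impact; small urgent orders trade up front.
        if adv_pct > 0.05 or urgency == "passive":
            return "is_passive"
        return "is_aggressive"
    return "participation"              # default: percentage-of-volume

assert recommend_strategy("VWAP", "passive", 0.01) == "vwap_schedule"
assert recommend_strategy("implementation_shortfall", "aggressive", 0.10) == "is_passive"
```

A real engine would, as the interview notes, also score candidate strategies against historical performance rather than relying on fixed thresholds.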

Q. What do you think the impact is of high frequency traders?

A. High frequency traders do add liquidity and they are a fundamental part of the business, but the liquidity they provide does come at a cost. In many ways it is no different from 20 years ago, when there were market makers and a quote-driven system. The fundamental difference today, though, is that high frequency traders are anonymous and, in some cases, when they detect a direction to the order flow, they will try to move ahead of it to maximise their profitability. In the old quote-driven market, the market makers were visible – their names could be seen on the screen – and they were working on behalf of known clients. One of the aims of Alpha Pro is to minimise the costs of accessing liquidity provided by high frequency traders and proprietary trading firms.

Q. Competition is intense in the MTF space. How do you differentiate?

A. Everyone wants to do block trades but it’s very difficult to make that happen on a frequent basis. The standard dark pool for example has very low matching rates. In a lit pool you can get a block trade done but it is fragmented into smaller order sizes. Our hit rate in the US is regularly above 20% which is very high compared with other systems. We also think our Block Board offers a unique and cost effective way of matching the trade. We show where there is activity in a certain stock and that is a good starting point for trading blocks.

Q. What has been your biggest challenge?

A. The cynicism of the market is one of the biggest challenges. Proving that your product can deliver cost savings and outperformance to people who have been in the industry a long time, and who hold very established views, can be a hard sell. Persistence is key, as is the ability to convince people to take a leap of faith based on the results and validity of the product. We often find that clients run specific trials, and this has translated into concrete long-term business. The other challenge is determining the best point in the cycle to invest in the business in terms of resources. Clients differ: some run the system entirely independently, while others want our team members to engage more actively, which requires more resource allocation. We are a small team but cover most of Continental Europe and the UK.

Q. Where have most of your clients been based?

A. London has been a slower market to crack because it has been saturated with brokerage services for a long time. However, we have been positively surprised at the reception we had in countries such as Spain which is often considered a closed market for outside providers. We have also had strong success in France and Scandinavia. Building a business is a lengthy proposition and can take six to seven years to develop, as we have seen from other new business entrants in the past. One other challenge we face is that the market has never really fully unbundled and it still isn’t really a level playing field.

Q. On a business level, what is the next stage of development?

A. We are currently broadening our client base. We started targeting global and European heads of trading at large sophisticated buyside firms but now we are focusing on the asset management firms who are growing in their use of transaction cost analysis as well. We currently have over 100 algorithm strategies and we plan to refine and develop more. We also are constantly looking at enhancements to the Block Board’s functionality and expect to expand the Block Board’s geographic coverage which currently serves 14 European markets and over 3,500 stocks.

BIOGRAPHY
Marcus Hooper, executive director, Europe, of Pipeline Financial Group – a company that operates institutional electronic brokerages in the US and Europe – is responsible for the company’s European business. Hooper has spent over 25 years in financial services including 17 years in senior asset management trading roles. He has been the manager in charge of trading operations for the Investment Management divisions of HBOS, Dresdner and AXA. Throughout his career, Hooper has worked on financial markets projects undertaken by The European Commission, The UK Treasury, The FSA, The Investment Managers’ Association, The British Banking Association, The London Investment Banking Association and The London Stock Exchange.
©BEST EXECUTION