
Using FIX to Connect LatAm Markets

Andres Araya Falcone of the Santiago Stock Exchange explains how FIX is increasing the range of services available to traders in Chile and throughout Latin America.
How is FIX facilitating DMA into the Santiago Stock Exchange?
DMA in Chile began with what we call “direct traders”: specially authorized institutional clients (buy-side traders) permitted to send orders directly to the market via a “broker sponsor”. Thus, pension and mutual funds, insurance companies and other institutions, using trading terminals provided by the Stock Exchange, can trade directly in our market. The next natural step was the incorporation of electronic networks to attract order flow from the U.S., Europe and neighboring countries in Latin America, especially Brazil.
In 2006, we built our first FIX interface, using version 4.0, to connect to the Marcopolo Network and attract order flow to our local equities market. After that, the Santiago Stock Exchange launched its initiative to modernize the equities electronic trading system and, jointly with IBM, developed Telepregón HT, which went live in June 2010. This system is ready for algorithmic trading flow, since it supports a throughput of over 3,000 orders per second with sub-millisecond latency. In designing the system, we decided to use FIX 4.4 to enable easier DMA connection with other exchanges, sell- and buy-side firms and market information vendors. This has greatly facilitated connection to different networks, such as Bloomberg, Fidessa and SunGard, among others. For all these initiatives, FIX has been crucial in facilitating integration with these networks. During 2011 we will announce new network agreements.
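To make the connectivity concrete, the sketch below assembles a minimal FIX 4.4 NewOrderSingle of the kind a DMA client routes through a sponsoring broker. All session and instrument values are invented for illustration, and the BodyLength (9) and CheckSum (10) fields are omitted for brevity.

```python
# A minimal FIX 4.4 NewOrderSingle (35=D), the message type used to submit an
# order over a DMA connection. Values are illustrative, not Santiago-specific.
SOH = "\x01"  # the FIX field delimiter

fields = [
    ("8", "FIX.4.4"),             # BeginString
    ("35", "D"),                  # MsgType = NewOrderSingle
    ("49", "BUYSIDE01"),          # SenderCompID (hypothetical client)
    ("56", "BROKERXYZ"),          # TargetCompID (hypothetical sponsoring broker)
    ("11", "ORD-0001"),           # ClOrdID, the client's order identifier
    ("55", "EXAMPLE"),            # Symbol (placeholder)
    ("54", "1"),                  # Side = Buy
    ("38", "1000"),               # OrderQty
    ("40", "2"),                  # OrdType = Limit
    ("44", "125.50"),             # Price
    ("60", "20110301-14:30:00"),  # TransactTime
]
message = SOH.join(f"{tag}={value}" for tag, value in fields) + SOH
print(message.replace(SOH, "|"))  # print with visible delimiters
```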
Currently, in the equity market, 11% of order flow comes from DMA, representing an average increase of 27% over the last six months; 19% on average comes from Internet retail order flow; and the rest comes from traditional OMSs and trader workstations.
As foreign investment into Chile and the Chilean market continues, how will the Santiago Stock Exchange upgrade its platforms to meet increased investor and trader demands?
In 2010, the Selective Share Price Index (IPSA), the country’s main stock market indicator, gained 37.6% in Chilean pesos (equivalent to some 46% in dollars). Share trading on the Santiago Stock Exchange rose to US$60 billion in 2010, up 30.5% from 2009, setting a new annual record. Trading was particularly strong in the second half of the year, which accounted for almost 60% of the annual total, reflecting strong demand from both local and international investors.
At the same time, by the end of 2010, the Santiago Stock Exchange had signed a linkage agreement with Brazil’s stock exchange, BM&FBOVESPA, heralding the latest in a series of cooperative projects being run between Latin American bourses. The agreement, signed on December 13th, will enable connectivity between both exchanges for order routing and market data dissemination. It also includes separate initiatives for further development of the Santiago Stock Exchange’s derivatives market, the establishment of joint initiatives related to settlement, clearing and central counterparty services, as well as access to the BM&FBOVESPA /CME trading platform from Chile.
Market participants in both countries will be able to route orders for stocks, stock options and related derivatives listed on the other’s exchange. Both exchanges will also be able to receive and distribute each other’s market data. Clearing and settlement of orders will be done according to local market rules of listed instruments. These kinds of initiatives imply that the Santiago Stock Exchange’s IT platform has to be prepared to manage more than 6 million orders per day.
What plans does the Santiago Stock Exchange have to accommodate High Frequency Trading and algorithmic order flow?
We are working as an integrator of a state-of-the-art product for algorithmic trading. In conjunction with StreamBase, FIXFlyer and IBM WFO, we are creating a product we will call “Broker in a Box”. The idea is to provide a framework for capital markets, including a set of algorithmic order execution strategies designed to achieve best execution, access liquidity, minimize slippage and maximize profits for trading operations. These algorithmic trading strategies (such as VWAP, TWAP and Arrival Price / Implementation Shortfall) are provided as fully customizable EventFlow modules which can be used in conjunction with the framework. Trading firms will be able to modify each algorithm to reflect their own “secret sauce” and to differentiate their trading strategies in the market. The Santiago Stock Exchange will provide an “all in one” solution: integrated markets, market data (from the Integrated Latin America Market (MILA), NYSE and NASDAQ), co-location, monitoring, local support, etc.
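The strategy names above describe well-known execution techniques. As a rough sketch of what a time-sliced module does (a generic illustration, not the EventFlow code itself), a TWAP strategy simply spreads a parent order evenly across a trading interval:

```python
# Illustrative TWAP slicer: split a parent order into equal child orders sent
# at fixed intervals. A production module would add randomisation, limit-price
# logic and fill tracking; this shows only the basic time-slicing idea.
from datetime import datetime, timedelta

def twap_schedule(total_qty, start, end, n_slices):
    """Return (send_time, qty) pairs spreading total_qty evenly over [start, end)."""
    step = (end - start) / n_slices
    base, remainder = divmod(total_qty, n_slices)
    schedule = []
    for i in range(n_slices):
        qty = base + (1 if i < remainder else 0)  # spread the remainder evenly
        schedule.append((start + i * step, qty))
    return schedule

for when, qty in twap_schedule(10_000, datetime(2011, 3, 1, 9, 30),
                               datetime(2011, 3, 1, 10, 30), 12):
    print(when.time(), qty)
```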
How will MILA provide additional opportunities for investors and traders in Chile and Latin America?
We have been working for the last 13 months on the Integrated Latin America Market (MILA), to consolidate regional stock markets so they may become more attractive for local and foreign investors.
The expertise of each of the three stock markets is different. The Peruvian market, for example, is concentrated in minerals. The Colombian stock market, on the other hand, is focused on the industrial segment, which is growing rapidly and is therefore very attractive for both local and foreign investors. Finally, the Chilean stock market is strong in the financial, retail and service industries.
All participants will find that the Integrated Latin America Market provides a standardized trading, clearing and settlement experience with similar market rules and order types, etc.
MILA will attract more liquidity to the market because investors will have wider availability and a greater diversity of companies to invest in, in a bigger and more integrated market. Finally, listed companies will benefit even further from this integration through access to new and increased financial resources for their expansion.
Will High Frequency Trading become a major force in the Chilean market, and will it give rise to co-location facilities, data centers and low-latency data feeds?
Currently, at least three brokerage houses are developing and using their own algorithmic trading strategies for the equities market in Chile. Additionally, we do observe algorithmic trading traffic from foreign brokers, especially from Brazil.
Algorithmic trading is sensitive to round-trip latency. A broker who is nearer an execution venue than his peers will have an advantage because he will experience shorter network propagation delays. This has led to the practice of locating algorithmic trading servers in close proximity to execution venue servers. In practice, this means that the Santiago Stock Exchange will need to provide: sufficient bandwidth to handle peak order and trade flows; support for the most popular versions of FIX; facilities for proximity hosting of algorithmic trading servers; conformance with widely adopted execution mechanisms and order types; monitoring and publishing of quality-of-service parameters; and order validation routines to prevent “fat finger” problems, among others.
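The last item on that list is the simplest to picture. A pre-trade validation layer of the kind described might apply checks like the sketch below; the thresholds here are invented for illustration.

```python
# Illustrative pre-trade "fat finger" checks of the kind an exchange or broker
# gateway might run before accepting an order. All limits are hypothetical.
MAX_ORDER_QTY = 500_000        # hypothetical per-order quantity cap
MAX_NOTIONAL = 5_000_000_000   # hypothetical per-order notional cap
MAX_PRICE_BAND = 0.10          # reject prices more than 10% from the last trade

def validate_order(qty, price, last_price):
    if qty <= 0 or qty > MAX_ORDER_QTY:
        return False, "quantity outside allowed range"
    if qty * price > MAX_NOTIONAL:
        return False, "notional value exceeds cap"
    if abs(price - last_price) / last_price > MAX_PRICE_BAND:
        return False, "price outside band around last trade"
    return True, "accepted"

print(validate_order(qty=1_000, price=520.0, last_price=500.0))  # accepted
print(validate_order(qty=1_000, price=700.0, last_price=500.0))  # rejected: price band
```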
Therefore, Telepregón HT and “Broker in a Box” are the Santiago Stock Exchange’s proposals to help our clients start working with these new technologies. At the same time, we see brokerage houses creating their own knowledge bases to participate in this area.

A Buy-side Trader’s Take on the Exchange Consolidation

Paul Squires, Head of Trading, AXA Investment Managers opens up about the relationship between the buy-side and exchanges, and the perceived effects of recent consolidation among exchanges.
From Trading Desk to Trading Floor
We trade on an exchange in the name of a broker, which means there is a buffer between the exchange and the buy-side. The interaction we have with the exchanges and MTFs works much better now. The MTFs have done a good job engaging with the buy-side over the past few years, which makes a lot of sense when you think of the evolving landscape of market structure. Historically, buy-side firms and exchanges were never quite sure if they needed to pay much attention to each other; however, there is a much more collaborative dialogue now. Most buy-side desks have mixed feelings about some of the bigger exchanges, in much the same way that some of the brokers have mixed feelings about the positioning of exchanges. On the up-side, there is a sort of national, utility element to the exchanges. For things like index funds, primary exchanges own the official end-of-day pricing.
Before MiFID, the primary exchanges were responsible for more of the trade and transaction reporting, and it was easier to interpret that data compared with the fragmentation of trade reporting following the first MiFID installment. On the buy-side, we have this simplistic view that it is positive for reporting to be centralised through the primary exchanges, because having liquidity in a single venue is something we see as beneficial. Also, the level of monitoring around the primary exchanges is higher than around the MTFs, and therefore things like governance and robustness tend to be greater. Generally speaking, we see the primary exchange as a kind of trustworthy elder statesman in the world of market structure. The challenge for the exchanges is that innovation can be bogged down by their hierarchy and organizational structure, so they cannot compete quite as dynamically as some of the MTFs, which clearly have much lighter infrastructure considerations. It is no surprise that some of the primary exchanges have lost market share to the MTFs, which have been nimble, technology-focused and reactive in the face of a changing environment.
Exchange Consolidation
We find it quite fascinating to see what will happen. Given the rate of market expansion, globalization and regulatory changes – all of which lend themselves to consolidation – it was an inevitability. Exchanges have to be forward-thinking about what their long-term roles will be and, although there are different aspects to this, what we tend to focus on is cash equities only. When people think of the Toronto or London exchanges merging, they think it is kind of interesting. Deutsche Boerse and NYSE Euronext, on the other hand, is fairly mind-blowing. In a wider context, the really interesting developments for us, the market participants, are for exchanges to look into other asset classes and areas of activity to secure a revenue stream for the future. The real impetus is not from cash equities; it is very much about clearing, OTC and, potentially, fixed income markets, and looking at what they can do in more commercial areas.
Net Gain/Loss from Exchange Mergers
We would hope to see technical enhancements at the exchanges. By applying best practices, the things that work well for the Toronto Stock Exchange or the London Stock Exchange could be transported to the other exchange, as with Deutsche Boerse and NYSE Euronext. The technical platforms and order book layout are quite relevant, so we would hope to see some enhancement in that area. Nonetheless, I would not necessarily promote a uniform market layout or order book structure, as I do not think we need that in every single exchange we trade on. To some extent, the more that order book structures align, the more it helps traders who are trading multiple markets. We can pre-configure a lot of unique exchange rules in our systems, but there are segments where human intelligence and manual control of the various elements are vital. In this respect, we would see any alignment of market practices as a fairly positive development.
The obvious potential negative outcome of the mergers is a return to situations where the exchanges have too much of a monopolistic position and can potentially raise costs without the market having any ability to challenge it. If the MTFs or exchanges raise their costs, we do not necessarily see that on the buy-side because of the buffer that the broker provides. There is a fairly high margin in the commission rates we pay our brokers, which covers their costs of trading our orders on the venues, and those venue transaction prices would have to increase substantially to become a direct factor for us. Of course, it does eat away at margins for the brokers, and it may come to a point where they need to pass those costs on. The buy-side is somewhat safeguarded from rising venue costs, but not completely.

Exchange Mergers and the Buy-side IT Team
Adapting to the merged venues would not involve too much IT work because our connectivity is between our OMS and our brokers and the connectivity between brokers and venues is not something our IT team would need to be concerned about. Our concern would be for some of the static data elements that may need to be updated, e.g. lot sizes or account details in some of the emerging Asian markets.
At the recent EMEA Trading Conference, organized by FPL, there was a good session on transaction cost analysis and whether, with the increase in algorithmic trading and smart order routing, people should be doing venue analysis. I think reverse-engineering opportunity costs by comparing venues is something that will start to feed through from the more advanced quantitative trading desks. The conclusion for most buy-side desks was to make sure your brokers are accessing the full range of venues in an appropriate manner and that their smart order routers are working in a smart way. On the other hand, if your order is split across multiple venues, which is currently the norm in Europe, it is difficult to derive quality analysis. This is a facet that we will certainly be keen to monitor, even if on a less than 100% granular level.
Lastly, I should mention that we have had a few outages on exchanges. Exchanges have talked about the race to latency, sponsored access and where their servers are located. While these are all timely issues, and though there has been a concerted focus on technology and innovation, there is a slight concern in the back of our minds that too many shortcuts might be taken at the expense of stability. For long only, institutional investing desks like ours, having your primary exchange go down for half a day exceeds our concern for gaining or saving a millisecond of latency. What we do care about is technical stability at the primary exchanges because they still have a lot of market share and a very significant role to play.
BATS Europe Chief Operating Officer, Paul O’Donnell, discusses what the recent acquisition of Chi-X Europe means for their members.
What are the technical similarities and differences between the two venues’ trading systems? Where are there apparent synergies between the platforms, and which technical areas will require some retooling?
Given the multiple synergies between BATS Europe and Chi-X Europe (shareholders, culture and personnel, data centre needs, clearing and regulatory viewpoints), a transaction made great business sense. Currently the BATS Europe platform runs on proprietary technology, whereas the Chi-X Europe platform runs on a combination of Instinet and proprietary technology. Our vision for integrating the two organisations includes migrating Chi-X Europe’s lit and dark books to the BATS Europe platform and rationalising infrastructure and connectivity.
How does the acquisition of Chi-X Europe expand BATS’ offerings for its existing members both in Europe and in the US?
We intend to continue operating Chi-X Europe’s lit book alongside the BATS Europe lit book, which will allow us to offer multiple pricing points and functionality differentiation between the two books, and we’re considering how we could consolidate two market data feeds and offer a real-time consolidated data feed at a low cost. Chi-X Europe also operates a dark book, as does BATS Europe, and we’re currently discussing the possibilities for differentiating two dark books. Once the integration is complete, our offering of multiple books, consolidated market data, and the rationalisation of technology environments and infrastructures, will be extremely competitive and ultimately should lower our participants’ cost of doing business. Our immediate focus with this transaction is the European equities market, but we are always looking for opportunities to leverage technology and resources that could add value for our customers in the U.S. and Europe.

Order Routing in Japan

Osaka Securities Exchange’s Matthias Rietig reports on the most recent developments in order routing and algorithmic trading in Japanese equities and derivatives.
Upgrades to Japanese Exchanges
Although the Japanese markets have been fully electronic since 1999, last year’s upgrade to arrowhead by the Tokyo Stock Exchange (TSE), as well as the Osaka Securities Exchange (OSE)’s move to J-Gate, are quantum leaps in terms of performance. In OSE’s case, the move to faster technology – internal round-trip times have been reduced from 60 milliseconds (ms) to below 2ms across all derivatives products – also paved the way for a broad revision of trading rules. All Japan-specific rules have been abolished, making way for global-standard price/time, first-in-first-out (FIFO) based trading. This will make OSE more transparent and efficient, and hence a fairer market.
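Price/time FIFO priority is simple to state: better-priced orders execute first, and among orders at the same price, the earliest entered executes first. The sketch below illustrates the matching rule for one side of a book (an illustration only, not J-Gate code).

```python
# Minimal price/time (FIFO) matching: an incoming buy is matched against
# resting sells, best price first, and oldest first within a price level.
from collections import deque

# Resting sell orders per price level; deque preserves time priority (FIFO).
asks = {100.0: deque([("S1", 300), ("S2", 200)]), 100.5: deque([("S3", 500)])}

def match_buy(qty, limit):
    fills = []
    for price in sorted(asks):         # best (lowest) ask first
        if price > limit or qty == 0:
            break
        level = asks[price]
        while level and qty > 0:
            oid, avail = level[0]      # oldest resting order at this price
            traded = min(qty, avail)
            fills.append((oid, price, traded))
            qty -= traded
            if traded == avail:
                level.popleft()        # fully filled, leaves the book
            else:
                level[0] = (oid, avail - traded)
    return fills, qty                  # fills plus any unfilled remainder

print(match_buy(qty=400, limit=100.5))  # S1 fills 300 first, then S2 fills 100
```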
Since Japan is one of the three largest equity markets in the world, the move towards a more globalized market structure, coupled with technology upgrades, will transform Japan into a new hot spot for High Frequency Trading (HFT). That being said, it is important for the exchanges to navigate this paradigm shift wisely so as not to alienate the domestic user base and the very active domestic retail participants. Although TSE volumes jumped about 10% after the move to arrowhead and spreads narrowed, domestic proprietary trading houses that could not absorb the very expensive transition and a sharp decrease in profitability were driven out of the market, which resulted in a decrease of around 40% in the Japanese cash equity dealer population.
Although we see a proliferation of Multilateral Trading Facilities (MTFs, known as Proprietary Trading Systems, or PTSs, in Japan), dark pools, crossing networks and the like entering the market, the liquidity from the domestic layers is still pretty sticky and has an exchange bias. That being said, even in Japan the times are changing, and I expect the share of MTFs to increase steadily over time.
With the arrival of Chi-X Japan, all of the existing PTSs reviewed their business models, and it can be assumed that, due to the successful launch of arrowhead and a speed-competitive main market, inter-market arbitrage activity will rise. Since the Japanese exchanges are already very competitive on pricing, the battle will be fought over value-added services and features, like speed, tick sizes and access options. Overall, a decrease in trading fees can still be anticipated, which should benefit end users.
While we do see the first signs of new growth among the PTSs, overall volumes are still relatively small, with the TSE capturing roughly 94% of on-exchange equity cash trading volume, OSE about 5%, and the remaining PTSs commanding a combined share of roughly 1%. It will be interesting to watch how market structure evolves with the opening up of the JSCC, which in July 2010 began serving PTSs as a clearing house for the first time.
Competition will increase, and although Japan is often labeled as overprotective, many domestic market participants believe it is a good development, as competition clearly cultivates services. The government itself is committed to inducing more competition among markets, to gain competitiveness in an international context and re-establish Tokyo as the financial focal point in Asia.
While the equity cash market is increasingly fragmented, the equity derivatives space is less so. OSE handles the major share of Japanese equity index futures trading, most prominently in the benchmark Nikkei 225 through the Nikkei 225 mini futures, one of the ten most actively traded index futures contracts globally. Furthermore, OSE has a 100% share in Japanese equity index options.
Although overall volumes declined at most of the other Japanese derivatives exchanges over the last couple of years, OSE’s trading volume has increased for five consecutive years. The newly launched J-Gate derivatives platform in Tokyo aims to make the overall market proposition of Japan more cost efficient and attractive. Located in the same data center as the TSE, as well as some PTSs, J-Gate will hopefully benefit from cross connectivity, analogous to the US and European markets.
Smart order routing and algorithmic trading
Smart order routing (SOR) has been gaining traction in the last few years, and the launch of more sophisticated trading platforms, as well as the overall increase in sophistication of the buy-side, should further facilitate this trend. Although the international buy-side has already adopted SOR relatively widely, the domestic layer has been more cautious in adopting it. In any case, it can be expected that the domestic players will catch up quickly, as they are currently screening the market to be ready once the overall liquidity on the PTSs increases.
Algorithmic trading is very prevalent in Japan and has been increasing steadily over the last couple of years. The introduction of co-location, as well as the introduction of the Nikkei 225 mini futures, has increased the number of players that can be considered HFT or latency sensitive. Roughly 30-50% of OSE’s current futures and options volumes stem from latency-sensitive HFT/ultra-HFT players. If we look only at the messages generated, the overall share of HFT is much higher.
Low latency market data feeds
The recent technology upgrades also led to significant improvements in the market data feeds; as opposed to the 100ms snapshot updates of the old system, J-Gate sends out market data in real time. Enhanced market data and analytics offerings will cater to the increased demand from global quantitative traders. This will be key to fully unleashing the potential of algorithmic trading activity in Japan. It is of the essence that market data is accessible, timely and robust, and in order to avoid a flash crash/flash recovery scenario, it must provide a full picture of the situation at all times.
The rise of HFT and changing trading strategies
Domestic derivatives participants have been engaging with the HFT layer at OSE for quite a while already. Traditional participants will have to consider the costs and benefits of changing their systems in order to keep or, in some cases, regain their ability to hit the orders they wish to. It is refreshing to see that some Japanese online brokers are embracing these changes by providing algo tools to their customers, and some are starting to host their servers in co-location facilities in Tokyo.
Regarding the change in trading styles, the recent system upgrades should promote HFT, as traders gain confidence in the stable, high-performing platforms, which will support their model-based trading. Nonetheless, HFT is certainly not the only way to operate successfully on trading platforms, and market-directional activity is essential for a market to function properly.
The changes so far are very encouraging, as the domestic retail layer is moving towards easy automation and the traditional Japanese prop trading community is starting to equip itself with professional off-the-shelf tools that can partly or fully automate their strategies to cope with the evolving trading landscape in Japan.
The learning curve for domestic players will be steep, as a lot of information on HFT and third party solutions is widely available, hence adoption should be relatively swift. We already see encouraging first signs, especially among the younger generation within the domestic prop trading community.

Playing Host

Hong Kong Exchanges and Clearing Limited’s Jonathan Leung unveils their plans for new hosting services in Hong Kong.
Hong Kong Exchanges and Clearing Limited (HKEx) recently announced plans to open a Tier 4 data center to address both its internal IT requirements and demand for hosting services from the Exchange Participants. Because HKEx was formed by a merger of the Stock Exchange of Hong Kong and the Hong Kong Futures Exchange, it has had to manage a variety of market systems, with each organization maintaining primary and secondary data centers in different parts of Hong Kong. With its Next Generation Data Center, HKEx will be able to consolidate its IT systems, resulting in further improvements in the systems and better utilization of resources.
The new data center will be a purpose-built facility, giving HKEx full flexibility to custom-design the electrical and mechanical infrastructure for the highest redundancy and reliability. One of the main features of the design is the ability to support higher power density. Today’s trading firms are demanding higher and higher power density for their systems. The average power density, not only in Hong Kong but globally, is about 3kW, while HKEx’s new facility is designed to support an average of 6kW. Construction has already started, and HKEx plans to launch its hosting service towards the end of 2012.
HKEx is aiming for LEED Gold Certification so the multistory facility has been designed with green initiatives in mind, including special features like floor plans that maximize the use of daylight and storm water cycling for irrigation. HKEx’s goal is the highest energy efficiency and sustainability without compromising reliability.
If the new centre opens in 2012, what will be its maximum capacity and when do you expect it to reach that level?
We are still in the design phase for the interior, but our plan is to offer services at the hosting site in Q4 2012. The full capacity of around 1200 racks will be developed in phases. In the first phase, we will be able to support around 400 racks located in a common hall and private cages. To ensure fairness to all hosting customers, there will be a special design in the structured cabling system to provide the same latency access to our market systems no matter where a rack is located. We will monitor the site usage and develop further capacity accordingly.
Who will be the primary firms to utilize this facility?
Exchange Participants, information vendors, technology solution vendors, connectivity providers and telecom service providers are all our target users. They together will form an ecosystem which will make our hosting services more valuable to all the Exchange Participants. Our strategy is to provide a world class hosting facility and work with partners to provide enhanced services to our users. We welcome all categories of Exchange Participants. How they will use the services at our new data center will really be dependent on their business model and trading strategy. It is not about the size of the firm.
How will additional hosted services affect the role of High Frequency Trading (HFT) in Hong Kong markets?
With low latency access to our platforms, our Exchange Participants will have higher flexibility in their trading strategies. Low latency itself is not equivalent to HFT. There are many factors our Exchange Participants will consider when deciding whether to use the services offered at the new data center. In the end, however, it is really up to the Exchange Participants to determine their business strategy; our role is to provide a platform to address different requirements in the market.
Will there be additional hosted services offered by HKEx in the next year?
We are in the process of discussing with different potential partners ways to enrich our hosting services. We welcome input from all interested partners. Cloud computing applications are among our considerations. We are also in the process of upgrading our trading systems and other market systems. Our cash market will migrate to AMS/3.8, the new version of the current trading facility, by Q4 2012, and there will be new versions of the clearing and market data systems. On the derivatives side, the hardware and software upgrade that was started in Q4 2010 is scheduled to be completed in the middle of this year. There are also plans for next generation systems for both markets, but there is no timetable yet.

The German Connection

George Macdonald, CEO of Macdonald Associates (MACD) looks into the history of German exchange platforms and reveals the future of German exchange connectivity through FIX.
Germany was slow on the uptake of FIX. In order to understand that statement, it is necessary to take a look at the past. It was not because the Germans were slow at adopting electronic trading; in fact, precisely the opposite: the Swiss Exchange (then called SOFFEX) and the German Exchange (then called Deutsche Terminbörse) were right at the forefront of electronic trading, with their systems going live at the end of the eighties.
Electronic trading took time to gain significant momentum. In 1996 the BUND (a futures contract on German debt) was still traded almost exclusively in London at LIFFE in open outcry. The fully electronic German exchange, despite having been around for six years, had managed to gain only 30% of market share. Indeed, many people at LIFFE believed that derivative trading could never move off the floor and onto an electronic platform. The move to Frankfurt happened in stages, with intense competition at the end of 1997. However, once 50% was reached, the tipping point had occurred and the volumes migrated entirely in just a few months.
The Germans had demonstrated that electronic trading could work. The cash market for equities was also one of the first to move to a fully electronic platform, with the Deutsche Börse order books going live on Xetra in 1997. Since then, electronic trading has swept the globe, with almost no market resisting the inevitable change. There have also been a lot of other changes since then as the trading process continues to evolve. The early success was also the reason for the slow uptake of FIX; the first mover advantage on electronic trading meant that the exchanges and banks built a trading landscape tightly knitted around a small number of proprietary systems.
Ultimately, two main connections to liquidity evolved: Xetra, a proprietary trading platform developed by the Frankfurt exchange, and XONTRO, a system which connected the banks to the regional exchanges (Stuttgart, Düsseldorf, Hamburg, München, Hannover, Berlin and the Frankfurt floor). These two systems ensured that the trading landscape became intertwined and dependent on the existing infrastructure. There was no pressing need for change. This, in turn, made it very difficult for new entrants such as Equiduct, NASDAQ Europe and Chi-X to gain a foothold. The move to FIX was inevitable, however, and it was only a question of when, not if, the connectivity changed.
Gradually, some new exchanges did experience success. For example, Tradegate, a startup exchange based in Berlin, has recently been successful in capturing flow, using FIX as its main method of connectivity. Thorsten Commichau of Tradegate said: “FIX is basic technology that makes sure your order stream doesn’t dry up. We at Tradegate have been providing it since 2001 and recently an increasing number of other German exchanges have started to use it too. Tradegate’s FIX interface supports modern order types such as one-cancels-other and trailing-stop limit and has made proprietary access alternatives unnecessary. In 2010, Tradegate Exchange recorded 3.2 million trades, a growth of 33 percent compared with 2009. Nowadays, over 80 percent of all customers send their orders to Tradegate Exchange in the FIX format.”
Hanno Klein of Deutsche Börse has also been actively involved in FIX for many years, serving as co-chair of the Global Technical Committee (GTC). As part of the GTC, he has been very active in ensuring that FIX 5.0 includes the functionality required to cover trading on an exchange and makes some of the FIX verbosity related to message flows optional. For example, some execution reports can be implicitly dropped because they are redundant, and partial fills can be bundled into one message, thereby enabling an exchange transaction model to be modelled in FIX (for more details see volume 7 of the latest FIX specification).
Of course, there are some banks who have been using FIX for much longer. One notable example is Baader Bank. As Edward Strauss of Baader said, “FIX continues to be a key standard in our inbound and outbound connections at Baader Bank, with the number of connections growing exponentially. In 2004 we had four FIX connections and now have over 100 FIX connections. As the protocol expands, Baader is becoming quicker in being able to take advantage of new innovations in finding solutions for connecting to our customers. I foresee that this trend will continue”. But the number of banks using FIX is still far smaller than one would expect for a market the size of Germany. The uptake is most common amongst large banks and those that trade with foreign counterparties. Until recently, the vast majority did not have any FIX implementation at all and continued to use SWIFT, other protocols, or the telephone for trading. All this will finally change when Deutsche Börse makes a sweeping amendment to its current connectivity offering in Q4 2011, bringing in a new interface using FIX that will, over time, phase out the legacy infrastructure.
The Börse says: “The new FIX Gateway for Xetra and Eurex will allow customers to use a well-known, international standard protocol to access our trading services. Eurex already offers FIXML access to risk management services. We see an increased demand for FIX-based services from both Xetra and Eurex customers and made a strategic decision last year to move in this direction.” In a further significant change, the exchange will move all floor trading to the Xetra platform, revolutionising the Skontroführer system in Frankfurt and making wide-ranging changes for the broker-managed stocks. At the same time, XONTRO is in the process of adding a FIX interface to its network, thereby providing banks with an alternative to the previous proprietary connection. For the first time, it will be possible to access all German liquidity centres using FIX. This will, inevitably, be the biggest shake-up to the trading landscape in Germany in more than a decade.
Changing connectivity is more than just a change to the messaging protocol; changing the plumbing makes other developments possible. Connectivity becomes a commodity and accessing new liquidity pools is suddenly much simpler. This encourages new entrants to enter the market, which in turn offers banks a greater choice of trading venues and software solutions. In time, the process becomes self-perpetuating.
Over the coming year, banks will be reviewing their existing connectivity setups and will need to make alterations in preparation for the upcoming changes. There will be an increase in hosted connectivity and access to trading networks, offering banks the ability to tap into multiple sources of liquidity with a single connection. Also, alternative exchange venues will once again try to make inroads into the German market. Using FIX, access to dark pools will become easier, and banks will look to connect to different destinations to source the liquidity they require. Finally, there may well be a shake-up in the offerings from the regional exchanges, as they try to compete for customers and redefine their business models.
Of course, no change happens entirely overnight, but the evolution over the next few years should not be underestimated. The next 18 months are going to be very interesting in Germany.

European Consolidated Tape: the Tipping Point for Real Collaboration?

PJ Di Giammarino, CEO of the JWG Group examines the goals of the European Consolidated Tape and suggests what needs to be done by banks and regulators to reach a consensus.
The notion of having a “system of record for the market” is central to the regulatory reform efforts. Is MiFID back where it started? Where are we in this saga, and why should both the buy-side and sell-side care? The recent MiFID review was initially going to entail only a few minor technical revisions to the active directive; no one was particularly worried about it. In one of 2010’s biggest surprises, the review snowballed, expanding dramatically in scope and depth. In its current form, the MiFID review represents a massive shift in the way financial markets operate in Europe, far exceeding the original implementation, as evidenced by its expansion into commodity price controls. In many ways, this ‘review’ has taken the industry right back to where it was in 2005 – attempting to figure out what transparency means for ‘business as usual’. So what is different this time?
Perhaps the big differences are that 1) the stakes are far higher; 2) Europe is no longer alone; and 3) there is real resource to get it right and, therefore, add value to the industry. Across the world, supervisors are worried about the completeness of their view of the financial markets, as well as establishing a level playing field for market participants. New data requests are the order of the day, originating from every part of the value chain and generated by the hedge fund directive (AIFMD), OTC derivative reporting to trade repositories, short selling transparency and the SEC’s Large Trader Reporting System. Meanwhile, regulators are also calling for more transparency back to the market, shedding light into dark pools by forcing real-time (or close to real-time) post-trade data and introducing new post-trade transparency regimes for entire asset classes, such as bonds and derivatives.
Why does the creation of a European Consolidated Tape – a discussion that was successfully batted away by the industry in 2004 – now matter? What is the real driver for it? Clearly, buy-side firms feel it is difficult to acquire a complete view at a reasonable cost and would welcome the initiative. The public can see the point. If market participants cannot reasonably afford sufficient pre and post-trade data for their needs, they are unfairly disadvantaged.
Though many larger firms are happy with the ‘status quo’, this is a particular problem in Europe, where an ever-more fragmented market means that a full set of equity market data costs approximately €450, as opposed to €50 in the US, according to consultation responses. By lowering these costs, regulators say, investor protection will be enhanced and price discovery will be facilitated more easily. The real question, however, comes down to whether the tape will be of value to market practitioners, and the answer, as ever, depends on ‘the how.’ The discussion in the European Commission is not about whether a consolidated tape is necessary, but how it should work and who should run it. The bottom line is that, if the tape is done correctly, it will have huge knock-on benefits for the cost/income ratio and be used to build products and services that can bring value to market participants and regulators.
Smart vendors are seizing the consolidated tape as a commercial weapon. If one company can grab sufficient market share for the tape, the value-added data and services that could be offered would be both attractive and lucrative. Therefore, the industry ought to officially open the discussion to vendors early to start helping with the requirements, operating model and commercial terms. The industry has so far failed to define the detailed ‘rulebook’ and standards required to be able to create such a system of record; hence, the well-known arguments about the quality of the data amongst the many players involved.
So can the industry define ‘what good looks like’ and fix the system? We think the answer is “yes” – but we need the banks to focus on the issues in the centre and commit to solving them. A few meetings instigated by CESR last summer were able to produce the type of quality standards effort from industry practitioners that has been missing in the five years of MiFID’s history. If the industry wants to create, publish and maintain a way to consolidate the trading across Europe, it can. And, for a variety of reasons, it should.
Of course, we cannot fall into the trap of assuming what works for apples will work for pears. The unintended consequences of getting this wrong could be huge; market liquidity is not something to alter without understanding the potential impact thoroughly. Regardless of the scope, timing, operational date or any other considerations the EC has on the table, the consolidated tape will become a reality in one form or another. Rulebooks, standards and quality measures will be written by the regulators, whether we like it or not. The information will be usable by the market and the supervisors, not only to fulfil their remit of market protection, but to challenge trading practices, adjust capital and liquidity buffers and seek out and identify systemic risk.
Whilst we still do not know the shape and governance of the entity that will be responsible for consolidating the information, we can be sure that Europe will not be alone in the pursuit of this objective. The US Office of Financial Research (OFR) has the remit and the budget to amass a great deal of reference and market information, and has already started its detailed policy consultations. The EU and Asia do not have a similar mandate, as yet, but do have the same political drive.
In summary, we are at a tipping point for real collaboration. As we at JWG explore new regulatory frontiers (e.g., systemic risk), we continue to see opportunity to clarify what data, and in what format, are required to meet aggressive G20 targets. Discussions on standards, protocols and data quality measures now matter more than they ever have before, and on a global scale. The stakes are high, and industry members, regardless of their specific focus, must understand what the ECT entails and get involved in defining the new ‘business as usual.’
FPL Submits a Proposal to CESR
Following requests to comment on a range of regulatory consultations, in the summer of 2010 FPL formed the European Regulatory Subcommittee. FPL is a neutral industry body and the development of this group enabled the organisation to ensure that responses are reflective of membership interests. In November 2010, following consultation with this group, FPL submitted a proposal to CESR, which aimed to address a number of issues relating to the development of a European Consolidated Tape. The submission proposed the creation of an industry led solution for the establishment of a Consolidated Tape Delivery Authority (CTDA) to oversee the delivery and governance of the European Consolidated Tape.
To view the proposal submitted please visit: www.fixprotocol.org/regulations. FPL welcomes participation in the FPL European Regulatory Subcommittee and if your firm is a member of FPL and you would be interested in finding out more, please contact the FPL Program Office by emailing fpl@fixprotocol.org.

The View from Canada

BMO Capital Markets’ Andrew Karsgaard outlines the new regulations regarding dark liquidity in Canada and how firms can use them to their advantage.
Canadian market participants are bracing themselves for another year of significant change. While exchanges themselves consolidate, liquidity continues to fragment across venues. Regulatory proposals on dark liquidity are being considered. New entrants, both lit and dark, wait in the wings for the right moment to set up shop. In this constantly changing environment, the tools available to a trader to access and analyse liquidity across markets have become critical to their success.
Regulators Looking at Dark Liquidity
In November 2010, the Canadian Securities Administrators (CSA) and the Investment Industry Regulatory Organization of Canada (IIROC) issued a Position Paper containing proposals on the subject of dark pools and dark liquidity. The proposals are summarised here, but let us focus briefly on how the debate around dark liquidity is evolving in Canada.
The regulators’ proposals were prefaced by the statement that “in order to facilitate the price discovery process, orders entered on a marketplace should generally be transparent to the public…” This seemingly innocuous, perhaps unarguable, assertion has in fact prompted considerable debate in the market structure blogosphere. Critics of the proposals argue that there is no evidence of damage to the price discovery process in markets where dark liquidity exists. They argue that transparency should not be an end in itself, as the true objective is best execution. Non-transparent ways of trading have existed forever, because they provide an important way to minimise market impact when executing large orders.
Many go further, arguing that the insistence on transparency is actually damaging, as it has created a network of continuous, linked auction markets that are susceptible to gaming, and therefore, represent toxic pools of liquidity. By placing restrictions on dark liquidity, regulators are potentially forcing investors to participate in these pools.
Canadian regulators have a history of pro-actively analysing and responding to market structure changes. Their rules concerning multiple markets were ready before the first lit ATS began operations, and they are the only regulators in the world who are active, direct members of FIX Protocol Limited, contributing to the creation and maintenance of standards in electronic trading. Generally speaking, they are engaged and well-informed. In this case, however, in getting ahead of the game on dark liquidity, they risk throwing the baby out with the bath water.
Our uniquely Canadian broker preferencing and on-exchange crossing systems, along with the TSX’s market-on-close facility, create hybrid forms of grey liquidity, where size is not exposed to the glare of the continuous market, but where price formation and discovery still occurs. Broker preferencing takes internalisation, which is completely dark, and displays it – every trade – on a public venue. This contributes to price formation to a much greater degree than the broker-run dark pools in the US, which are not obliged to publish trades unless they reach a certain size.
Dark liquidity is not the problem in Canada. We currently have a single dark pool, but we have some interesting methods of merging lit and dark liquidity that could act as models in other countries. Is it possible that these proposals focus on the symptoms rather than the disease?
New Entrants Waiting in the Wings
Awaiting the outcome of the consultation period and the final CSA/IIROC view on dark orders, are a number of potential new entrants to the Canadian market, as well as a number of new facilities being offered by existing market operators.
Alpha – a well-established ATS owned by a consortium of large dealers – submitted proposals to the regulators for their Intraspread order type (essentially a broker internalisation facility) in the second half of last year. Around the same time, TMX submitted an application for their own non-displayed order types, including a non-displayed midpoint order and a non-displayed limit order. MatchNow, currently Canada’s only electronic dark pool, proposed an addition to their existing dark pool, offering an “internalise only” order type.
In response to this avalanche of new requests, regulators placed a halt on developments while they issued their proposals and considered responses. All of these operators are eagerly awaiting the outcome of that process, and they are joined in their anticipation by the large domestic and international brokers. Many of these operate internalisation pools in other jurisdictions, and could be quick to market with a Canadian version, should circumstances and regulators allow. The existence of broker preferencing (see previous page and box below) has prevented the development of broker-run dark pools in Canada until now. Should regulators allow additional dark pools in Canada, some flow would likely migrate there, the immediate impact of which would be an increase in the need for Canadian dark pool aggregators. While many participants have deployed these for their US trading already, work would be required to configure them to consider Canadian dark pools.
In the lit market space, there is also a great deal of potential change. BATS and DirectEdge are logical entrants to the Canadian market. Indeed, the merger of BATS and Chi-X, which already has a presence in Canada, may have brought that a step closer.
Tools to Navigate the New Landscape
No matter how many new dark pools or lit venues open up shop in Canada in 2011, traders will require more sophisticated tools to access and analyse liquidity.
Accessing liquidity is the job of the Smart Order Router (SOR). Canada is entering the third generation of SORs. The first were essentially serial trade-through protectors. The second generation were slightly more sophisticated spray routers, with better support for handling the passive or unfilled portion of an order. The third generation of SORs consume latency and execution-quality data as part of the order routing process, dynamically shifting child order placement as the liquidity picture changes.
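To make the generational difference concrete, a third-generation router continuously re-scores venues from observed statistics. The sketch below is a simplified, hypothetical illustration of that feedback loop, not any particular vendor’s router; the venue names, metrics and weighting are invented.

```python
# Illustrative third-generation SOR logic: score venues from rolling latency
# and fill-rate observations, then allocate child order quantity by score.
venues = {
    "VENUE_A": {"latency_ms": 2.0, "fill_rate": 0.90},
    "VENUE_B": {"latency_ms": 5.0, "fill_rate": 0.95},
    "VENUE_C": {"latency_ms": 1.0, "fill_rate": 0.60},
}

def score(stats):
    # Favour high fill rates and penalise latency; the weighting is arbitrary.
    return stats["fill_rate"] / (1.0 + stats["latency_ms"])

def allocate(total_qty):
    scores = {name: score(stats) for name, stats in venues.items()}
    total = sum(scores.values())
    return {name: round(total_qty * s / total) for name, s in scores.items()}

print(allocate(10_000))  # child order sizes shift as the statistics are updated
```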
Analysis of liquidity will be an area of real growth in Canada in 2011. Execution quality reporting and analysis is essential to what should be a disciplined best execution regime within every market participant, and the ability to feed execution quality information back to Smart Order Routers in real time is critical to the implementation of the third-generation SORs mentioned above. The FIX Protocol supports valuable data in that regard: tag 30 (LastMkt) identifies the market of the fill, and tag 851 (LastLiquidityInd) indicates whether the fill was the result of making or taking liquidity – both vital data points in determining execution quality. This determination is equally vital for the sell-side and the buy-side trader as they choose execution service providers, be they brokers’ algos or actual venues and their various order types.
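Extracting those two tags from an execution report is straightforward; the raw message below is fabricated for illustration.

```python
# Extract LastMkt (tag 30) and LastLiquidityInd (tag 851) from a FIX execution
# report for execution-quality analysis. The message content is fabricated.
SOH = "\x01"
exec_report = SOH.join([
    "8=FIX.4.4", "35=8", "150=F",   # ExecutionReport, ExecType = Trade
    "55=EXAMPLE", "32=100", "31=10.25",
    "30=XTSE",                       # LastMkt: the venue of this fill
    "851=1",                         # LastLiquidityInd: 1 = added liquidity
]) + SOH

fields = dict(f.split("=", 1) for f in exec_report.split(SOH) if f)
venue = fields.get("30")
liquidity = {"1": "added", "2": "removed"}.get(fields.get("851"), "other")
print(f"fill on {venue}, liquidity {liquidity}")
```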
Market structure is in a state of flux in Canada, and any attempt to describe the current state of affairs risks being out of date as soon as it is published. What is certain, however, is that as Canada develops as an electronic marketplace, participants will have to arm themselves with the appropriate tools to access and assess liquidity. The ability to access equity markets, and to make the necessary changes when any new marketplaces or order types arise, is a minimum standard for market participants. Those who can analyse order routing data and make the results of that analysis relevant and meaningful to their clients will exceed that minimum standard and prosper, no matter what the Canadian equity landscape looks like.

Summary of CSA/IIROC Dark Order Proposals

  • An exemption to the pre-trade transparency requirements should only be available when an order meets or exceeds a minimum size.
  • Dark Orders should be required to provide meaningful price improvement over the NBBO only when executing against an active order that does not meet the minimum size exemption.
  • Visible orders should execute before Dark Orders at the same price, on the same marketplace, except where two Dark Orders meeting the minimum size exemption can be executed at that price.
  • Meaningful price improvement should be one trading increment as defined in IIROC’s Universal Market Integrity Rules (UMIR). However, for securities where the difference between the best bid price and best ask price is one trading increment, one-half increment will be considered meaningful price improvement (a worked check is sketched below).
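As a worked illustration of the last rule, assuming a one-cent trading increment (illustrative only; UMIR defines the actual increments per security):

```python
# Worked check of "meaningful price improvement" under the proposed rule,
# assuming a hypothetical one-cent trading increment.
TICK = 0.01

def required_improvement(bid, ask):
    # If the spread is already one tick, half a tick counts as meaningful.
    spread = round(ask - bid, 10)
    return TICK / 2 if abs(spread - TICK) < 1e-9 else TICK

def is_meaningful(fill_price, bid, ask, side):
    req = required_improvement(bid, ask)
    # A buy must fill below the ask, a sell above the bid, by at least req.
    improvement = (ask - fill_price) if side == "buy" else (fill_price - bid)
    return improvement + 1e-9 >= req

print(is_meaningful(10.005, bid=10.00, ask=10.01, side="buy"))  # True: half tick on a one-tick spread
print(is_meaningful(10.04, bid=10.00, ask=10.05, side="buy"))   # True: a full tick on a wider spread
print(is_meaningful(10.045, bid=10.00, ask=10.05, side="buy"))  # False: only half a tick
```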


What’s an Exchange to Do? The Role of the Exchange in Evaluating Algorithms

NYSE Euronext’s Joe Mecane responds to calls for pre-trade certification of algos and describes NYSE’s efforts to manage risk.
What is the exchanges’ burden in terms of regulating High Frequency Trading (HFT)?
There are multiple components to that answer. There is not necessarily any regulation specific to High Frequency Traders as a separate type of participant, but clearly there are a lot of regulations that apply to HFT because of the nature of the business. Sponsored access, for example, is clearly an area that impacts high frequency traders who might not be members of exchanges. Recently, there has been a lot of press and discussion around regulating algorithms, and that applies not only to HFT but also to customer-type algorithms developed by firms who deploy them to their customers or use them to execute customer orders.
At the same time, high frequency traders develop their own proprietary algorithms. In those cases, a general supervisory responsibility falls on any such participant to ensure that their algorithms are tested and working properly before they are deployed to the public. There has been discussion around whether that should be a more formal, stringent rule, but it is complicated because everyone has some level of responsibility and oversight with regard to developing and deploying algorithms. One thing we have talked about is creating a ‘best practices’ standard for people to follow, as there have been cases of firms being fined for releasing algorithms that had damaging effects on the market.
When does the exchange’s supervisory onus take place? Should exchanges evaluate algorithms in real-time as they’re trading or is it something that should be evaluated beforehand?
The problem with evaluating algorithms pre-trade is that it is not really practical to create an infrastructure that would certify an algorithm before it is deployed. While it sounds good in theory, the reality is that regulators do not currently have the resources and skill sets to sit down and review lines of code; it is just not practical. What that means is that there is a structure where firms have policies and procedures around how they develop, test and deploy algorithms, which can then be reviewed by the regulators. It is not really practical to demand regulatory sign-off on an algorithm before it is deployed.
What would you recommend for firms as best practice for testing algorithms before deploying them?
It is up to each individual firm to outline and deploy the practices that they think are most prudent. One possible thing the industry could do is develop best practices or standards for algo development that the industry could adhere to. Some of the trader groups and some of the technology trade groups may have those types of standards already or could put them together quite easily. It is not our place as an exchange to try to define those standards, but certainly, it is something that most firms have and the industry, as a whole, could assemble from a technology development and deployment perspective.
What are some actions NYSE takes to supervise algorithms as they are deployed?
NYSE has a number of elements in place to help mitigate risk and oversee algorithms. On our markets we have a number of circuit breaker-type mechanisms to detect what might be unintended behaviors on the exchanges, ranging from Liquidity Replenishment Points to some of the circuit breakers the SEC has mandated. We also have market order collars and limit order collars on Arca. On one level, we have limits on our market that are designed to mitigate big price moves when an algorithm could be acting in an unintended manner.
At another level, we also have safeguards that monitor message traffic, giving us the ability to moderate or throttle flow if we start to see excessive message traffic coming down a particular connection. Third, and this is a market-wide thing, there are clearly defined rules in place to deal with erroneous types of trades that could happen as a result of an algorithm gone bad. The last thing is that we have outsourced the market regulation component of our responsibilities to the Financial Industry Regulatory Authority (FINRA). As part of FINRA’s reviews, they look at the supervisory procedures and oversight that firms apply when developing, testing and deploying their algorithms.
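Per-connection throttles of this kind are commonly implemented as token buckets; the sketch below is a generic illustration of the idea, not NYSE’s actual safeguard, and its rate and burst parameters are invented.

```python
# Generic token-bucket throttle for capping per-connection message rates: each
# message consumes a token, tokens refill at a fixed rate, and messages that
# arrive on an empty bucket are rejected (or queued).
import time

class TokenBucket:
    def __init__(self, rate_per_sec, burst):
        self.rate = rate_per_sec       # sustained messages per second
        self.capacity = burst          # allowed burst size
        self.tokens = float(burst)
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False                   # throttle: reject or queue this message

bucket = TokenBucket(rate_per_sec=1000, burst=200)
accepted = sum(bucket.allow() for _ in range(500))
print(f"{accepted} of 500 back-to-back messages accepted")
```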
What type of change would you like to see from regulators?
Probably the best thing that could be done, whether by the regulators or customer trade groups, is to establish some disclosure or best-practice standards in the marketplace. It would be significant to ensure that there is a consistent, minimum level of oversight for algorithmic development and deployment. A lot of firms have standards they are subject to and hold themselves to internally, but creating a standard that everyone could use as a base level would be a very good thing.
To what degree is that complicated by the fact that there are such a large number of firms developing algorithmic products, relative to other countries?
That is why it is not practical to have a pre-clearance of all algorithms that are going to be developed or deployed. The better way to structure things is to organize more procedural, policy-type requirements for all participants to follow. Considering the amount of time it would take to look at one firm’s algorithms and understand their code and what they were trying to do, it is clear that the resources are just not there.

People Power

As the FIX Protocol itself evolves to meet the demands of its global and increasingly diverse constituency of users, industry leader John Cameron of Cameron Edge looks at the democratic roots of FIX and how FPL is carrying that tradition forward through the use of online ‘wiki’ technology.
“Government of the people, by the people, for the people” – Abraham Lincoln. He could have been talking about FIX.
FIX was, and is, designed and managed by the people who use FIX in their everyday business. It is a solution to real life problems and its continuing close connection to the people and organizations that it serves is, and always has been, its greatest strength. The challenge is how best to manage that vital grassroots connection.
The traditional management structure that drives the development of the FIX Protocol has been very successful. As described on the FPL website, the protocol’s management is led by “technical and business professionals from FPL member firms”; these representatives “coordinate their activities and organize their work through a series of committees, subcommittees, and working groups, all overseen by a Global Steering Committee that aims to ensure consistency of protocol application as it is extended into new markets, asset classes, and phases of the trade lifecycle.”
In addition to this structure, it was recognized early on that technology had an important role to play in keeping FIX in close touch with the industry. The FPL website introduced public discussion forums back in the mid-90s, where members of the financial community can post questions and comments regarding FIX, which are answered by other members of the community. These forums, at www.fixprotocol.org/discuss, along with their search capabilities, hold a wealth of invaluable information about the interpretation of the FIX specification. They are a natural complement to the FIX specification documents.
For example, if someone is unsure of some aspect of the protocol, maybe the exact meaning and intended use of a particular FIX field value, they normally start by consulting the FIX specification. If the specification is not clear, they can search postings in the forums using appropriate keywords. If still unsure, they can post a question to a forum, which will normally be answered by other members of the community. Once answered, that query and its answer are a matter of public record on the forum, which may be useful to others in the future. This open, transparent, collaborative process has been essential to the success of FIX. It has meant that FIX has remained in close contact with the industry it serves and has enabled it to react quickly to changing industry demands.