
What is driving co-location and proximity hosting in Asia?

Emmanuel Doe, President of Trading Solutions at Interactive Data, examines the narrowing structural differences between Asia-Pacific and the rest of the world.
Asia-Pacific is fast emerging as a global hub for electronic trading. Despite the fragmented geographic structure of the region and a complex regulatory landscape, international traders increasingly see Asia-Pacific as a compelling business opportunity and a way to differentiate their trading strategies from those of other participants.
Many firms are increasingly attracted to the Asia-Pacific region because of its diversified range of markets. In terms of high frequency trading, the developed markets – Australia, Japan and Korea – have evolved their infrastructure based on European and North American best practices. The developing markets, such as China, Hong Kong, India, Singapore and Taiwan, are investing in technology upgrades and are catching up with the more established markets.
However, despite the attractions of these markets, a number of operational and infrastructure challenges remain for traders.
Regulatory Landscape
The regulatory landscape within Asia-Pacific is complex, owing to the lack of a single regulator driving a regional equivalent of MiFID- or Reg NMS-style regulation that would place multiple markets into a coherent framework, drive standardisation and potentially reduce clearing costs.
 
Some national regulators welcome overseas firms and competing execution venues, while others are more cautious. Transaction costs also vary and in some of the markets can be significant due to stamp duty and transaction taxes, which are charged on top of commissions and spreads.
This is likely to change as more global firms enter these markets. Asia-Pacific's advantage is that it is still relatively immature compared with the US and Europe, and so can learn from developments within those markets and consequently adopt a more balanced approach to reform.
Geographic Spread
Given the distances between major trading venues, network latency is an extremely significant factor for electronic trading in the region. For example, the direct route between Tokyo and Hong Kong is approximately 2.5 times the length of the route between New York and Chicago, and approximately 4.5 times the distance between London and Frankfurt. This inherent latency does have an effect on decisions about where to locate trading operations in the region and how to execute multi-venue trading strategies.
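To put those distances in latency terms, here is a rough back-of-the-envelope sketch. It assumes signals travel through fibre at roughly two-thirds of the speed of light (about 200 km per millisecond) and uses approximate great-circle distances; the figures are illustrative assumptions, not measured route lengths, and real fibre paths are longer.

```python
# Rough one-way propagation latency from route distance.
# Distances are approximate great-circle figures (illustrative
# assumptions); real fibre paths are longer and add equipment delay.

KM_PER_MS_IN_FIBRE = 200.0  # ~2/3 of the speed of light in vacuum

routes_km = {
    "Tokyo - Hong Kong": 2900,
    "New York - Chicago": 1150,
    "London - Frankfurt": 640,
}

for route, km in routes_km.items():
    one_way_ms = km / KM_PER_MS_IN_FIBRE
    print(f"{route}: ~{km} km, one-way >= {one_way_ms:.1f} ms, "
          f"round trip >= {2 * one_way_ms:.1f} ms")
```

Even at this physical limit, a Tokyo to Hong Kong round trip costs tens of milliseconds, which is why venue choice, rather than hardware alone, dominates multi-venue strategy design in the region.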
Location also tends to be determined by the type of trading strategy employed, such as single market cross-asset, multi-market intra-region, or multi-market inter-region. Hong Kong is seen as the key venue for traders looking to access China, while Japan has strong links with Chicago and Singapore for futures trading. Singapore is fostering links with the emerging Southeast Asian markets (Malaysia, Vietnam, Indonesia, the Philippines, and Thailand) via the Association of Southeast Asian Nations' (ASEAN) trading link-up to facilitate increased cross-border trading in the region.
Demand For Co-Location And Proximity Hosting
Co-location is nothing new in the West and the trend is now extending eastwards in order to minimise order roundtrip latency for firms executing on the various Asia Pacific exchanges.
With the increased focus on the region, there has been, in turn, greater demand from trading organisations for enhanced connectivity and co-location availability. Many of the region's major exchanges have responded to this. According to research by GreySpark¹, 61% of exchanges offer co-location services and 39% offer proximity hosting.
There has been a sustained period of technology investment in the leading markets of Australia, Hong Kong, Japan and Singapore as well as in the emerging markets that are trying to compete with the traditional liquidity centres. This investment has included trading system upgrades to boost capacity and speed, adoption of standard technology offerings from NYSE Technologies and NASDAQ, and the introduction of Direct Market Access (DMA).
These enhancements will require firms to locate hardware and applications closer to the matching engines in order to remain competitive. As competition increases, co-location and proximity hosting have grown in importance.
 
While an increasing number of exchanges are offering co-location facilities that boast of reduced latency, not all are comparable. Some, such as the Singapore Exchange, provide transparency around their co-location design and latency measurements. Other co-location centres, generally those of the smaller exchanges, are not as transparent with regard to performance metrics.
 
Managing Total Cost Of Ownership
With Asia hosting more than 20 electronic trading venues and nearly 50 exchanges, dispersed across a wide geographic area, accessing all of the markets required to execute a strategy can be difficult.
 
Infrastructure is not yet mature enough to support the speed at which the market is growing; the high investment required to upgrade lines for higher volumes has prevented electronic trading on the scale seen in the US and Europe. Network costs are also far higher, owing to the greater geographic distances between venues.
 
In the race to reduce latency in Asia, the cost of building software and infrastructure platforms to take advantage of liquidity on these venues can be significant. There is also the cost of ongoing maintenance and the overall total cost of ownership (TCO) to consider.
 
 
Cost is driving demand for managed services that can offer a mix of DMA, co-location and proximity. This enables firms to access specific markets in the most effective way to execute their trading strategies. This outsourced approach has the clear benefit of offering a lower TCO for multi-market and multi-asset trading.
In terms of co-location, a managed services provider can offer a faster connectivity option by allowing financial firms unobstructed access to exchange matching engines. With new routes scheduled to open and technology continuing to advance, firms that choose to outsource don't have to worry about investing in the fastest lines between markets, or ensuring they have the latest switching and routing hardware. All of those elements can be managed on their behalf at a reduced total cost, leaving them to concentrate on strategy and client service.
 
There is also a growing requirement for consolidated data feeds for global markets to be fed into these co-location sites. This gives trading firms the lowest possible latency for their primary trading markets alongside access to market data from a large number of regional markets, delivered to a single point.
 
Proximity hosting, generally delivered through data centres located near many leading exchanges, can provide microsecond-level access to an exchange presence, often at a fraction of the cost of hosting in the exchange’s proprietary co-location facility.
 
Time to market is another critical factor that has led more firms to examine the outsourcing of their trading infrastructure, particularly if they want to access multiple markets. With each venue having differing regulations, technology rules and upgrade plans, trying to manage all of the moving parts can be challenging.
Conclusion
The continuing evolution of developing markets, especially in relation to technology upgrades, and the complexity of their internal infrastructure create opportunities for trading firms. Technology upgrades that level the playing field for electronic trading firms are under way across the region, at various stages of progress.
 
As the markets continue to evolve so too will the demand to access new pools of liquidity and the alternative venues that will emerge as a result of new regulation. This will continue to drive demand for advanced co-location and proximity hosting solutions to ensure firms can access their desired markets quickly and cost-effectively.
 
¹ Frederic Ponzo, Anna Pajor and Saoirse Kennedy, “Low Latency in Asia-Pacific: An infrastructure overview”, GreySpark Partners, 2012.
This article is provided for information purposes only.  Nothing herein should be construed as legal or other professional advice or be relied upon as such.
© Interactive Data (Europe) Ltd 2012

Singapore Conference 2012


The success of the FPL Singapore Conference over the last few years is a testament to the importance of industry participation through the FPL Singapore Working Group. Asia is a rapidly evolving market, and FPL’s presence is burgeoning. With continued industry input there is every reason to believe that the Singapore conference can flourish and provide a valuable forum for Asia’s leading traders.



Richard Coulstock, Eastspring Investments (Singapore) Limited
The Singapore FPL event brought together a large number of high profile industry players in a friendly, relaxed environment that proved ideal for discussing the hot topics of the day. The presentations and key note speeches were relevant and interesting. All in, a first rate conference.


Jacqueline Loh, Schroders Investment Management
Electronic trading solutions such as algorithms and dark pools (e.g. Liquidnet) have greatly helped the equity buy-side trader find liquidity and gain better control over his or her executions. I hope that some of these benefits (and expectations) will be transferred to other asset classes as buy-side desks evolve into multi-asset desks. The growth of multi-asset desks will make switching between asset classes easier. Lower end-to-end implementation costs will be an achievable goal, with certainly better control and less risk. In addition, our clients can take comfort from the fact that the same principles of best execution apply across all asset classes.



Joshua Rose, Zeptonics Pty Ltd
I found the day to be highly productive and informative. The agenda itself touched on a broad spectrum of topical issues, with insightful commentary from a top notch list of panellists. FPL made remarkably good use of the single day to bring industry participants together for interacting, informing, networking and idea sharing. Very much look forward to the next event.

After Boursezilla

As Japan readies itself for a single primary exchange, Fidessa's Hiroshi Matsubara, Co-Chair of the FPL Japan Regional Committee, and Steve Grob look at the country's motivations for change and ambitions for the future.
The TSE/OSE merger has been big news in Asia since it was announced last year. Creating the world’s third largest exchange by market capitalisation, the merger brings derivatives and cash under one roof and puts equities on a single trading platform. Not only will this create a much bigger and more liquid venue, but it will also give the Japanese exchange a gravitas that neither the TSE nor the OSE has enjoyed on its own.
The merger attempts to position Japan as a super resource centre for pan-Asian trading. With a vibrant and growing PTS market, Japan’s market structure is certainly one of the more advanced in Asia, although challenges still remain. One challenge that was removed earlier in the year was the 5% TOB rule.
Under Japan's Financial Instruments and Exchange Law, investors who purchased a 5% or greater stake in any firm off the primary exchange had to launch a full tender offer bid (TOB). Because trading on PTSs was originally designated as “off exchange”, participants inadvertently got tripped up by this rule. This discouraged many firms from buying stock on PTSs, due to the complexities involved in knowing exactly when the 5% threshold was reached. Other firms would sell on PTSs but not buy, for the same reasons. With the relaxation of this rule implemented in October, the barrier to using PTSs for both sides of transactions has been lowered significantly. By next summer PTS trading could consistently account for around 10% of the market in some blue chip stocks. Indeed, PTSs are already trading steadily larger volumes in the market's biggest names. Considering PTS trading accounted for practically nothing just two years ago, the fact that SBI Japannext, for example, is regularly trading upwards of 10% in stocks like Mizuho and Mazda is a healthy sign of further growth to come.
The next challenge for PTSs is the 10% rule, stating that once a PTS reaches 10% of the national market, it must apply for a formal exchange license. It comes into effect when daily average trading values of all trading stocks on a particular venue measure 10% of all exchange and PTS trading in Japan, over a six-month period. While it’s unlikely that one PTS will breach this threshold within the next year or two, when they do, the FSA will have to re-address this rule. This is because under Japanese exchange regulation, an exchange can only trade instruments that are listed on it (unlike other jurisdictions, where multiple venues can support secondary trading on the same stocks). So, if a PTS were to become an exchange, it would suddenly have nothing to trade.
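As a minimal sketch, the 10% test described above could be computed as follows, assuming daily traded values for the PTS and for all exchange plus PTS trading over the same six-month window are available; the function name and figures are hypothetical.

```python
# Hypothetical check of the 10% rule described above: a PTS trips the
# threshold when its average daily traded value over a six-month window
# reaches 10% of all exchange + PTS trading in Japan.

def breaches_threshold(pts_daily_values, total_daily_values, threshold=0.10):
    """Daily traded values must cover the same six-month window."""
    avg_pts = sum(pts_daily_values) / len(pts_daily_values)
    avg_total = sum(total_daily_values) / len(total_daily_values)
    return avg_pts / avg_total >= threshold

# Illustrative figures only (a real window would hold ~125 trading days):
pts = [120, 135, 110, 140]
total = [1500, 1600, 1450, 1550]
print(breaches_threshold(pts, total))  # False: share is roughly 8%
```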
At present, a smaller tick size is an important differentiator for the PTSs, as it allows them to offer prices inside the spread of the main market. However, if the new Japan Exchange Group adopts decimals for tick sizes, the PTSs will have work to do to demonstrate the additional value they offer. Venues should be thinking seriously about this now. Speed has been a battleground in other regions, and SBI Japannext enhanced its platform in September, making its processing speed ten times faster than the TSE's arrowhead.

International experience shows that tick sizing is important for market participants. In the US, however, tick sizes are set centrally for all venues, and yet fragmentation of liquidity is at its greatest there. In Europe we witnessed a somewhat disorderly process as venues reduced tick sizes arbitrarily to try to gain a competitive edge. This was eventually replaced with a common process as market participants struggled to keep up with the pace of change.
In Japan, many brokers don’t have decimalised systems so can’t take advantage of smaller tick sizes in any case. Whether or not the exchange moves to decimalisation soon, there’s some evidence to suggest a tipping point is being reached where firms need to play catch-up with their systems if they are to be able to offer, or execute, the best quality trading.

 
Regardless of the future of tick sizes and the 10% rule, brokers and buy-sides in Japan are having to get serious about smart routing right now. Fidessa's experience of watching this experiment take place globally is that there are two thresholds where alternative venues receive an extra boost in liquidity. The first is at around 5%, at which point brokers start using smart order routing. The second, at around 10%, is when buy-sides start asking their brokers if they are connected to a particular venue, and wanting proof that their orders are getting filled in the most advantageous ways. It makes sense of course – if your firm is seriously trading East Japan Railway or Nisshin Steel, and 10% of the stock is trading away from the main exchange, you need to be looking at both (or all) venues.
For the Japanese sell-side these add to existing pressures of changing regulation and flattening volumes. Firms will need either to become specialists or to scale up to meet the new requirements of a more complex market and more demanding clients. This global theme has already played out in the US and in Europe, and has seen the sell-side rely increasingly on technology to sharpen their competitive edge and generate efficiencies so as to scale revenue faster than cost.
Another effect of increasing buy-side focus on execution quality may be the introduction to Japan, in a regulatory sense, of commission-sharing agreements (CSAs). As firms focus more and more on where and how their trades are executed, a natural consequence will be an increasing reluctance to spend money on sub-optimal trading in order to pay the research bill. While the CSA discussion has been going on for some time now, it's likely that real change will happen in the next couple of years or so.
For the sell-side this again means that it will be required to prove the effectiveness of its technology in order to be chosen as the executing broker in a CSA environment. Those who prepare now will be best positioned for change when it comes.
A very contentious change that's come to international markets in recent years is high-frequency trading (HFT). The new depth of the Japanese market, with the addition of OSE liquidity and the TSE's speed on arrowhead, will further improve the prospects for electronic trading of all varieties. Japan has been more sanguine about HFT than some other markets, where the reaction has at times been hysterical, and while there is watchful concern from buy-side and retail firms, it's unlikely that HFT will be vilified in the way it has been elsewhere.
In the changing Japanese market it's certain that, as everywhere else in the world, there will be winners and losers. Regardless of the regulatory framework that eventually settles into place, technology will be the key to success for those offering and taking part in trading.
 

Trans-Pacific Trading

In an increasingly globalised world, Shane Neal, Head Trader of Matthews International Capital Management, talks to FIXGlobalTrading about the benefits, and challenges, of trading in Asia from the US.

What are some of your key issues and opportunities being a US buy-side operating in Asia?
I think the majority of the issues Matthews faces trading in Asia are faced by any firm trading the region: liquidity, volatility and anonymity, to name a few. We have a propensity for identifying smaller- and medium-size companies to invest in, so a disciplined, and often patient, approach to trading is required. Market participants in Asia are sometimes less inclined than traders in the US or Europe to trade in block form, whether due to a lack of trader discretion or because they are benchmarked to a specific participation rate or volume weighted average price (VWAP). Block specialist desks, and dark pools designed for block trading, have started to help change this attitude and demonstrate the usefulness of increasing average execution size and lowering implicit trading costs.
Matthews' trading desk maintains live trading coverage during all Asia and US market hours from our headquarters in San Francisco. Matthews is the largest dedicated Asia investment specialist in the US, and having our traders and investment team based in a strategic location in San Francisco allows them enough distance from the noise in the market to focus on our long-term investment objectives. Our open work environment encourages collaboration among traders and portfolio managers and has been vital to fostering dialogue about execution strategies as part of the investment process. When covering the “night” trading desk (Pacific Time), we are often the “eyes and ears” for the investment team regarding day-to-day market colour, and act as a hub for disseminating pertinent information and filtering much of the market noise.
The advantages of market connectivity through an execution management system (EMS), a sophisticated order management system (OMS), compliance capabilities, and direct market access make remotely trading into Asia an efficient and cost-effective operation. I'm unclear as to the value proposition for some firms to relocate their Asia trading operation to the region. The argument for improved service quality or an information advantage doesn't hold, in my opinion; I don't see how trading into India is made easier simply because one's office is in Hong Kong, as opposed to the US or Europe. For smaller firms that don't have dedicated staffing during Asian market hours, time zone differences may pose a hurdle when it comes to potential settlement issues, compliance needs or short windows of opportunity for interesting liquidity, like stock placements. Fortunately, at Matthews, we have a deep and diverse team with a range of perspectives and expertise. Our singular focus on investing in Asia means that everyone at the firm, from back office staff and IT to Compliance, is committed to supporting our investment objectives in Asia.
How do the regulatory environment and the differences between meeting SEC regulations and Asian regulations present a challenge?
We apply the “best execution” guidelines, familiar in the US, to our Asia trading efforts. As liquidity continues to fragment in Asia, the buy-side will increasingly rely on smart order routers in seeking the best price and utilising transaction cost analysis (TCA) for measurement, as well as other qualitative factors that contribute to broker execution quality.
Foreign exchange transaction costs have been a topic of discussion in the US recently. The lack of transparency in the OTC FX market has motivated the buy-side to work with their foreign exchange dealers and banks to improve time stamping, which is necessary for measuring execution quality. The benefits of negotiating FX trades, and the efficiencies of multibank FX tools, are increasingly important. Because FX trading in the restricted markets of Asia must be executed in compliance with local market regulations, such as proof of an underlying security requirement, limited market hours and central bank reporting, market practices have necessitated custodian and sub-custodian involvement. Improved transparency around the costs associated with these trades is important for the buy-side if competitive FX pricing is an unrealistic option.
Post “flash crash”, and in the wake of a recent high profile technology malfunction, the Securities and Exchange Commission is taking steps to tighten regulations in ways that are similar to those used by regulators in Asia to curb erratic price fluctuations and to identify large market participants. In May, the SEC, along with the exchanges and FINRA, approved plans to implement limit up/limit down price bands that are more protective than the current single stock circuit breaker system, and seem to evolve from price-limit mechanisms used in many markets across Asia. Last year, the SEC also adopted a rule establishing a large trader reporting system to improve the Commission's ability to identify and monitor the trading activity of its largest participants; in many ways, analogous to ID systems in Taiwan and Korea.
Regulatory requirements, such as ownership filing and foreign ownership limits, vary across the Asia Pacific region and the US, so we rely on our knowledgeable compliance team and an OMS technology that can accommodate the gamut of compliance rules and testing.
How has the electronic trading environment changed, what technologies or developments have made your job easier or more difficult in the region?
I think milestones in the evolution of buy-side trading technology include market connectivity through an OMS and EMS, the proliferation of algorithmic trading access across Asia, and post trade analysis through TCA platforms. These technologies contributed most to risk reduction, and empowered the buy-side in managing their order flow. Adoption of these technologies in Asia lagged the US and Europe by a few years, but the variance among regulatory environments and market microstructures is wider in Asia than in any other region. The algorithmic trading providers who offer specialised strategy consultants, customisation and local market experts are an asset to the buy-side in understanding the appropriateness of certain algorithms for a particular trading situation.
As liquidity across Asia continues to fragment, the trading tools used by the buy-side must continue to evolve to handle greater market data demands and smarter access to liquidity in order to achieve the highest execution quality. We've also sought more granular data from TCA recently. Venue analysis using FIX tag 30 (LastMkt) has been interesting, as not all liquidity appears to be equal, and we better understand where spread saving may be misinterpreted. For example, a broker may highlight a potential saving of half the spread by executing via a dark pool rather than at the far touch in the displayed market. However, closer inspection could reveal that some of these dark prints were outside the stock's trading range in the lit market, when the stock was about to reverse direction, thus costing you half a spread. You would have been better served by posting at the near touch. Some dark pools have better logic to avoid this. Graphically plotting your fills and measuring reversion help in understanding the value and potential shortcomings of different dark pools.
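A simplified sketch of this kind of venue analysis follows: fills are grouped by FIX tag 30 (LastMkt) and scored both for spread saving against the lit mid at fill time and for adverse reversion shortly afterwards. The fill-record layout, venue names and the ten-second horizon are illustrative assumptions, not any particular TCA vendor's method.

```python
# Group fills by execution venue (FIX tag 30, LastMkt) and measure
# spread saving and post-trade reversion. Record layout is illustrative.
from collections import defaultdict

fills = [
    # venue (tag 30), side, fill price, lit mid at fill, lit mid 10s later
    {"venue": "DARK1", "side": "BUY",  "px": 100.00, "mid": 100.02, "mid_t10": 99.95},
    {"venue": "DARK2", "side": "BUY",  "px": 100.01, "mid": 100.02, "mid_t10": 100.03},
    {"venue": "LIT",   "side": "SELL", "px": 100.04, "mid": 100.02, "mid_t10": 100.02},
]

by_venue = defaultdict(list)
for f in fills:
    sign = 1 if f["side"] == "BUY" else -1
    saving = sign * (f["mid"] - f["px"])        # > 0: filled inside the mid
    adverse = sign * (f["mid"] - f["mid_t10"])  # > 0: price reverted against the fill
    by_venue[f["venue"]].append((saving, adverse))

for venue, rows in sorted(by_venue.items()):
    n = len(rows)
    print(f"{venue}: avg saving {sum(r[0] for r in rows) / n:+.4f}, "
          f"avg adverse reversion {sum(r[1] for r in rows) / n:+.4f}")
```

In this toy data, DARK1 appears to save spread at fill time yet shows heavy adverse reversion, exactly the pattern of a dark print captured just before the stock reversed.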
What comes next? Regulatory or technological — what changes would you like to see in Asia and what would their impact on trading be?
Regulatory change happens at a more measured pace than technological change, often for sound reasons such as the need for risk analysis and testing. In Asia, market structure reforms that foster competition, and will thereby increase fragmentation and reduce trading costs through lower spreads, have occurred first in markets like Japan, Australia and Hong Kong. The announcement by Japan's Financial Services Agency (FSA) that proprietary trading system (PTS) trading will be exempt from the 5% Tender Offer Bid (TOB) rule is a positive development for Japan. This rule stated that a participant attempting to acquire more than 5% of a company's outstanding shares off-exchange would have to submit a takeover bid to all shareholders. A number of market participants avoided interacting with PTS liquidity until there was more clarity around how this TOB rule would apply.
It is inevitable that PTSs will gain market share as new participants realise price improvement and liquidity opportunities. Regulatory changes in other Asian markets may occur more slowly, taking cues from the successes of markets that have already matured. Continuous trading in Taiwan, centralised clearing in India and the elimination of the tripartite boards in Thailand would improve efficiency in my book.

Eurex Cements Use Of FIX And FIXML

Hanno Klein, Senior Vice President Deutsche Börse IT, and Co-Chair of the “FIX Protocol Global Technical Committee”, discusses Eurex Group’s commitment to supporting the global industry standards in its newest interface portfolio.
Like a growing number of market participants, Eurex Group recognises that using the industry standard FIX and FIXML protocols delivers a number of innovative business benefits for exchange and clearing house users as well as for their operators. These include increased customisation, hardware independence and flexibility. As the FIX and FIXML protocols continue to increase in relevance throughout the financial industry, their utilisation by Eurex Group for several of its interfaces allows exchange and clearing house users, and software vendors who already utilise FIX and FIXML, to integrate changes and upgrades more rapidly. Additionally, the utilisation reduces complexity and thus leads to increased resource efficiencies.
As a universal language for the financial industry, FIX is not merely a technology and can be used for many different interface types in combination with the appropriate transport. The Transport Independence Framework was introduced by FIX Protocol Limited (FPL) as a key concept of FIX 5.0 to allow any transport technology in addition to the FIX Session Protocol. Eurex Group achieves the highest levels of performance and standardisation by combining the FIX Application Protocol with a proprietary binary transport. FIX Protocol's High Performance Working Group is currently developing multiple standard binary transports that will make proprietary implementations obsolete.
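To make the transport-independence idea concrete, here is a toy sketch that keeps the FIX application semantics constant (a few NewOrderSingle fields with their standard tag numbers: 11 ClOrdID, 55 Symbol, 54 Side, 38 OrderQty, 44 Price) while varying the transport encoding. The fixed-layout binary frame is entirely hypothetical and merely stands in for a proprietary or future standard binary transport.

```python
# Same FIX application semantics, two transports: classic tag=value,
# then a toy fixed-layout binary frame. The binary layout is a made-up
# illustration; real binary transports differ.
import struct

order = {"ClOrdID": "ABC123", "Symbol": "FDAX", "Side": 1,
         "OrderQty": 10, "Price": 6450.5}

# 1) tag=value encoding; real FIX uses SOH (\x01) as the delimiter,
#    shown here as '|' for readability
tagvalue = "|".join(f"{tag}={val}" for tag, val in
                    [(11, order["ClOrdID"]), (55, order["Symbol"]),
                     (54, order["Side"]), (38, order["OrderQty"]),
                     (44, order["Price"])])
print(tagvalue)  # 11=ABC123|55=FDAX|54=1|38=10|44=6450.5

# 2) toy binary frame: fixed-width strings, an int and a double
frame = struct.pack("<12s8sBid",
                    order["ClOrdID"].encode(), order["Symbol"].encode(),
                    order["Side"], order["OrderQty"], order["Price"])
print(len(frame), "bytes")  # compact, fixed offsets: cheap to parse
```

The business meaning of each field is identical in both encodings; only the wire representation changes, which is precisely what lets one application protocol ride on many transports.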
Eurex Group has been an avid supporter of interfaces that utilise FIX. As an early adopter of FIX and the first clearing house to utilise AMQP, the Group features FIX or FIX-based interfaces across a variety of Group companies' architectures: Eurex Exchange, Eurex Clearing and the International Securities Exchange (ISE) all feature interfaces using FIX or FIXML.
Committed to FIX
With the introduction of its new trading architecture, Eurex Group demonstrates its dedication to supporting global standards and has continued its commitment to offering best-in-class FIX interfaces, among others in its interface portfolio, to its customers. Many of the proprietary design principles of Eurex Group’s new trading architecture have been introduced to the FIX Protocol’s High Performance Working Group and we are confident that these can become part of a standard approach for FIX interfaces.

New Trading Architecture
Eurex Exchange’s new trading architecture, set to go live in December 2012, will enhance performance across the board. It promises to reduce latency and increase throughput as well as deliver enhanced system flexibility. Connecting to the new trading architecture is possible via a variety of new interfaces. All interfaces have the usage of FIX semantics in common while offering the optimal transport for each interface to fulfill its specific technical requirements related to performance, reliability and ease of access.
The new trading architecture features a high-performance messaging architecture and reliable database systems and is a Linux-based platform. Eurex Exchange will no longer specify hardware or software requirements, leaving trading participants free to choose the technology that best fits their business requirements. Additionally, the exchange will implement a new partition concept to improve scalability, throughput and separation of failure domains, while a direct messaging concept will drive enhanced throughput.
The new trading architecture will feature the new high-performance Eurex Enhanced Trading Interface (Eurex ETI), which uses FIX semantics over a proprietary binary transport and is designed to suit the needs of our latency-sensitive participants. Eurex ETI is complemented by the new high performance Eurex Enhanced Market Data Interface (Eurex EMDI) which uses FIX semantics over FAST. A standard FIX engine connection over the FIX Session Protocol is available to participants whose primary focus is ease of access.

 
Interface Landscape
Eurex Exchange's new trading architecture features three new order entry interfaces: the Eurex Trader GUI, the Eurex FIX Gateway and the Eurex Enhanced Trading Interface (Eurex ETI). The Eurex FIX Gateway supports only order maintenance, trade notifications and strategy creation, while Eurex ETI additionally supports quote maintenance as well as mass actions to delete orders and quotes or to (de)activate quotes. The Eurex Trader GUI does not support the entry or modification of quotes but offers mass actions, for example when the user has problems with Eurex ETI and wants to get out of the market. The entry of bilateral and multilateral OTC trades via EurexOTC Trade Entry services is supported by the Eurex Trader GUI, which forwards these directly to Eurex Clearing.
FIXML Utilisation
In addition to Eurex Exchange's trading interfaces, Eurex Clearing offers a FIXML interface that provides Clearing Members with a highly flexible, standards-compliant and cost-effective way to enter, access and modify their clearing data.
The Eurex Clearing FIXML interface reduces the costs associated with accessing Eurex Clearing, while simultaneously offering Clearing Members greater flexibility and standardisation in terms of technology. Since the interface uses the industry standard FIXML, it is easy to integrate into users’ existing IT infrastructures and requires no special programming efforts.

 
Also in line with Eurex Group's Technology Roadmap, Eurex Clearing will take a considerable step by introducing a completely new clearing system, the new clearing architecture, for derivatives clearing in 2013. Member firms will be able to access the clearing environment through the existing Eurex Clearing FIXML interface, which will be enhanced to support the improvements introduced with the new clearing architecture.
Going Forward
By comprehensively featuring FIX in its new trading architecture and FIXML in its new clearing architecture, Eurex Group has signaled its plans to support these well-established industry standards over the long term. The Group firmly believes that the adoption of FIX and FIXML as industry standards has been positive for the financial industry as a whole. As increasing numbers of market participants use these standards, Eurex Group will continue to provide innovative solutions that harness FIX and FIXML.

DMA In Developing Markets

In an ever-changing global environment, Brian Ross, CEO of FIXFlyer, examines FIX and connectivity across emerging exchanges.
Technologies used in Direct Market Access (DMA) are changing rapidly, especially in the world’s growth markets such as Brazil, Mexico, Istanbul and India. Brokers provide their clients with Direct Market Access in order to give them a path to the market venue, without going through the broker’s OMSs.
As technology evolves and exchanges increasingly offer more technology in order to capture the interest of automated traders globally, local market structures are changing, and regulators are responding by making appropriate changes. Many exchanges are copying their counterparts in North America and Europe by offering co-location services, which have been a great driver of revenue for market venues. And by extension, exchanges worldwide are opening up to the idea of sponsored access, meaning that the broker's client connects directly to the exchange, as long as appropriate risk controls are in place to stop the client in the event of trading that is not acceptable to the exchange.
What exchanges worldwide are realising is that they should cater to the buy-side's technology to facilitate connectivity. And the buy-side will speak FIX across asset classes and all over the world. Making it easier for buy-side clients to have direct access to the exchange is quickly becoming understood as a way to support new types of trading directly from market participants.
Drop Copies to the OMS
The biggest architectural problem that DMA attempts to solve is that the path of messages from the asset manager to the market venue otherwise has to go through the database.
A DMA gateway avoids slow database writes and gives participants as streamlined a path to the market as possible. It is still possible to keep a record of trades by sending a copy of the messages from the DMA gateway to the OMS. This greatly offloads the OMS while providing clients with a more direct path to the market venue.
It’s important to note that an OMS doesn’t need to be upgraded in order to support high speeds. The DMA gateway handles automated order flow while the OMS can continue to be used for care orders, allocations, position management, profit and loss calculation and back office processing.
See Figure 1 for a system diagram of a DMA gateway architecture.
Figure 1: System diagram of a DMA gateway architecture
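In the spirit of that diagram, here is a minimal sketch of the pattern: the hot path forwards the client's order straight to the venue, while the drop copy to the OMS is handed off to a background queue, so no database write sits between client and market. The venue and OMS interfaces here are hypothetical placeholders, not any vendor's API.

```python
# Sketch of a DMA gateway hot path with an asynchronous drop copy to
# the OMS. The venue and OMS interfaces are hypothetical placeholders.
import queue
import threading
import time

oms_queue = queue.Queue()  # drop copies buffered off the hot path

def oms_writer():
    while True:
        msg = oms_queue.get()          # slow path: DB writes happen here
        print("OMS drop copy:", msg)   # stand-in for OMS persistence

threading.Thread(target=oms_writer, daemon=True).start()

def send_to_venue(order):              # stand-in for the venue session
    print("sent to venue:", order)

def on_client_order(order):
    """Hot path: the order goes straight to the venue."""
    send_to_venue(order)               # no database write on this path
    oms_queue.put(order)               # non-blocking handoff to the OMS

on_client_order({"ClOrdID": "A1", "Symbol": "PETR4", "Qty": 500})
time.sleep(0.1)  # give the background writer a moment in this demo
```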
Risk Controls
Real-time, instant risk checks are important to keep the integrity of an exchange's market structure intact. Most exchanges go beyond the risk checks mandated by the 15c3-5 Market Access rule in the US. These checks are designed to minimise the chances that a rogue order, or even a pattern of orders, will disrupt trading and, potentially worse, jeopardise a brokerage business or cause unnecessary havoc in the market. The result is that exchanges and their brokerage members are looking for technology that will check and control risks, but not slow down their latency-sensitive strategies and clients.
Important checks include notional value, price and quantity checks, symbology, duplicate orders, stale orders, and cumulative quantity checks, as the sketch below illustrates.
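The following compact sketch implements several of these checks; the limits, symbols and order layout are illustrative assumptions, not any exchange's actual rules, and the stale-order and price-band checks (which need timestamps and reference prices) are omitted for brevity.

```python
# Illustrative pre-trade risk checks from the list above. All limits
# and the order layout are assumptions, not any exchange's real rules.
seen_clordids = set()   # for duplicate-order detection
cum_qty = {}            # symbol -> cumulative quantity sent this session

LIMITS = {"max_notional": 1_000_000, "max_qty": 10_000,
          "max_cum_qty": 50_000, "symbols": {"AKBNK", "PETR4"}}

def pre_trade_check(order):
    errors = []
    if order["symbol"] not in LIMITS["symbols"]:
        errors.append("unknown symbol")                 # symbology
    if order["clordid"] in seen_clordids:
        errors.append("duplicate ClOrdID")              # duplicate order
    if not 0 < order["qty"] <= LIMITS["max_qty"]:
        errors.append("quantity limit")                 # quantity check
    if order["qty"] * order["price"] > LIMITS["max_notional"]:
        errors.append("notional limit")                 # notional value
    if cum_qty.get(order["symbol"], 0) + order["qty"] > LIMITS["max_cum_qty"]:
        errors.append("cumulative quantity limit")      # cumulative qty
    if not errors:                                      # accept: update state
        seen_clordids.add(order["clordid"])
        cum_qty[order["symbol"]] = cum_qty.get(order["symbol"], 0) + order["qty"]
    return errors

print(pre_trade_check({"clordid": "X1", "symbol": "AKBNK", "qty": 5000, "price": 9.5}))  # []
print(pre_trade_check({"clordid": "X1", "symbol": "AKBNK", "qty": 5000, "price": 9.5}))  # duplicate
```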
See Figure 2 for a system diagram of co-location.

Market adapters
Some exchanges are relatively new to the FIX Protocol. We routinely find that an exchange serves as a utility to match trades efficiently, but sometimes the exchange leaves the broker to fully understand and manage the nuances of the marketplace and exchange systems. Exchanges can be very reticent to upgrade their technology, and many core matching technologies were designed before the FIX Protocol was as widely adopted by trading venues as it is today. Native APIs are very common at exchanges, and often FIX is layered on top of the native exchange protocol. Translations are often not limited to a stateless change of encoding, but involve a semantic translation which is more complex and, more importantly, has a negative impact on performance.
Frequently, there is an additional translation between the exchange's interpretation of FIX, then another by the broker, and yet another by their buy-side clients. As exchanges move to eliminate latencies through co-location and sponsored access, these market adapters must be finely tuned to optimise this translation. Even where the buy-side and the exchange have both adopted FIX, we have found it necessary to translate order identifiers to the proper formats and to solve race conditions inherent in high-speed exchange systems. Utilising instant translation to make this process seamless with established buy-side systems, such as the OMS and EMS, is increasingly important to exchanges and markets.
Market Structure
Direct Market Access has a lot to do with the regulation of emerging markets. Each emerging market that we have encountered so far has been very careful to include all potential market participants, such as retail, institutional, algorithmic and high frequency traders. They all realise that markets must be protected in order to help people arrive at a fair price and to facilitate efficient capital markets.
Some exchanges charge a lot for trading, or charge extra for cancelling orders in an effort to control, but not eliminate, high frequency trading. Other exchanges throttle throughput. Some exchanges are considering establishing minimum resting times before cancels or defining speed limits to reduce pileups. Some other markets make it expensive to sell short.
Given the recent issues in the US marketplace, it may be wise for markets to also establish testing guidelines prior to going live, or require kill switches on DMA order flow. It is clear that we must find a balance between maintaining market integrity and facilitating market development and automation.
Case studies: Mexico, Istanbul, India
Mexico
The Bolsa Mexicana de Valores (BMV) has supported FIX since 2008 and can handle speeds of a few thousand messages per second. Currently the BMV supports FIX 4.4. However, some details are not standard FIX and require the DMA gateway to perform translations to support the BMV properly. For example, the Client Order ID in tag 11 and the Order Status in tag 39 are handled in non-standard ways that a buy-side system cannot support directly. In addition, there is a race condition in the BMV's core engine that sometimes results in losing the state of an order and can cause overfills in standard OMSs downstream.
A robust DMA gateway can cure this problem at very high speeds so that algorithms can be handled easily. The new BMV matching engine, called MoNeT, targets high frequency trading, aims to process orders within 100 microseconds and has a completely new architecture that may solve many of the past issues. Regarding risk checks, the Mexican regulator, the Comision Nacional Bancaria y de Valores (CNBV), today requires standard pre-trade risk checks by the broker and, where required, by the BMV as well. For example, a dynamic price check against the last price is required to catch any DMA order that is not within 5% of the last price. If an order violates the dynamic price check, there is an auction for five minutes.
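One way a gateway can defend downstream OMSs against the lost-order-state race described above is to track cumulative executed quantity per order and quarantine any execution report that would exceed the order's quantity. The sketch below is a hypothetical illustration of that idea, not BMV-specific logic.

```python
# Hypothetical overfill guard: track cumulative fills per ClOrdID and
# flag any execution report that would exceed the order quantity.
order_qty = {}   # ClOrdID -> original order quantity
filled = {}      # ClOrdID -> cumulative filled quantity seen so far

def on_new_order(clordid, qty):
    order_qty[clordid] = qty
    filled[clordid] = 0

def on_execution_report(clordid, last_qty):
    total = filled.get(clordid, 0) + last_qty
    if total > order_qty.get(clordid, 0):
        # Possibly a duplicate or re-ordered report from the venue:
        # do not forward as-is; flag it for reconciliation instead.
        return ("suspect overfill", clordid, total)
    filled[clordid] = total
    return ("ok", clordid, total)

on_new_order("B1", 1000)
print(on_execution_report("B1", 600))   # ('ok', 'B1', 600)
print(on_execution_report("B1", 600))   # ('suspect overfill', 'B1', 1200)
```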
Istanbul
The exchange limits trading to six messages per second on a single native connection, but you can have two connections, plus a special gateway connection that can get you up to 14 trades a second. The Istanbul Stock Exchange is working on a FIX interface, promises to be FIX compliant within the next several months, and is committed to handling the speeds required for modern algorithmic trading.
Cuneyt Tasli, Managing Partner of Geneks International in Istanbul, says: “As a leading provider of OMS software in Turkey, we believe algorithmic trading will be very important in this market. DMA lets institutional investors, especially foreign ones, manage their costs using the same familiar FIX routing, portfolio management, and TCA technologies. But for the algorithms to work, the exchanges and the brokers need to offer uniform latencies and good performance. Another item that needs to be handled is lot limits: if the maximum lot amount of AKBNK stock, for example, is 5,000 and we receive an order of 100,000 lots, we need to split the trade into suborders, sending them to the ISE as if they were 20 independent orders.”
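The splitting Tasli describes is mechanical; a minimal sketch, assuming only a per-order maximum lot size, might look like this.

```python
# Split a parent order into exchange-acceptable child orders, as in the
# example above: 100,000 lots with a 5,000-lot maximum -> 20 children.

def split_order(total_qty, max_lot):
    """Return child-order quantities, none larger than max_lot."""
    children = [max_lot] * (total_qty // max_lot)
    remainder = total_qty % max_lot
    if remainder:
        children.append(remainder)
    return children

children = split_order(100_000, 5_000)
print(len(children), children[0])  # 20 children of 5,000 lots each
```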
India
There are three exchanges in India: the NSE, the BSE and the MCX-SX. All three exchanges offer FIX connectivity and co-location services to support DMA. If an instrument trades on multiple exchanges, through permitted trading or cross-listing, there are brand new opportunities for smart order routing between the exchanges. One issue with smart order routing in India is that each exchange clears its own trades, increasing the cost of trading on multiple venues, and this could affect the overall cost of running certain algorithms. Another factor affecting broker algorithms in India is the set of safeguards mandated by SEBI (the Securities and Exchange Board of India) for pre-trade risk checks, kill switches and testing, which affects the cost of implementation.
With regard to DMA, only the BSE supports FIX 5.0, while the other two support FIX 4.2. On fees, there are no specific DMA fees at any exchange other than co-location fees. SEBI mandates that exchanges restrict, or add fees on, firms whose order-to-trade ratios exceed a threshold. Within this framework, Indian exchanges can manage these restrictions and still achieve their goals.

Operating Out In The Open

David Dight, Senior Software Architect at Liquid Capital, looks at free and open source software development over his 25 years using and developing solutions.

A brief history of FOSS: major developments and definitions
The term FOSS (free and open source software) has only recently come into common use. Software made available free of charge has been variously referred to as just ‘free’ software, ‘public domain’ software and sometimes ‘freeware’. The 1980s and 1990s saw the proliferation of ‘try-before-you-buy’ software, also known as shareware (or even ‘crippleware’). The term ‘Open Source’ software came into use much later, to distinguish such software from ‘free’ software and to reflect its licensing restrictions. Today we use FOSS to refer to non-commercial and generally free software which may have certain restrictions on copyright, non-commercial use and distribution. One great advantage of this set-up is that it is now used as a development model.
FOSS has been around longer than most people probably imagine, but it wasn't really prominent until the 1980s. It was largely the domain of hobbyists and enthusiasts until the advent of Richard Stallman's GNU project. Today there are close to 400 GNU software packages available. One of them, Stallman's EMACS editor, is still popular (though perhaps controversial) and in wide use.
The early 1990s saw the first release of Linus Torvalds' Linux, which has become the most widely used UNIX flavour today, especially in the finance sector. My first exposure to FOSS was with Slackware Linux (at that time released on floppy). Having already used costly SCO Xenix and then SCO Unix for some years, Linux offered the first free, community-supported UNIX, and our team jumped on it.
Why do organisations choose FOSS?
Lots of reasons, not least of which is that it's free! But just because something is free doesn't mean it's necessarily any good, so first let's look at the other reasons for choosing FOSS.
If you trade on the buy-side, it's likely you'll need to talk FIX.
FIX has had a long (informal) link with FOSS. One of the earliest open source implementations, QuickFIX, still enjoys popularity. It remains popular because of the critical mass of QuickFIX users out there, the low barriers to entry and the abundance of support – both community and commercial. More fundamentally, QuickFIX performs well and is easy to use. Competing with QuickFIX are many commercial FIX offerings and, to be sure, many of them are excellent in terms of the critical discriminators such as low latency and interoperability. So why does QuickFIX continue to capture so much of the market?
Here emerges one of the main reasons FOSS can be the right choice. In my view, it’s about software sovereignty. A typical medium to large buy-side organisation will generally have a team of highly skilled engineers with the capability and experience to develop pretty much anything asked of them. Give them a FOSS package and the right resources and they can modify, extend and enhance it to suit business requirements. In some cases, just having the source available allows developers to quickly fix bugs and get back in production after an outage. In other cases, the source can be instructive, providing insights into techniques and methodologies that influence future designs and implementations.
These are key distinguishing features of FOSS versus commercial offerings (aka closed source). You can't just change a commercial piece of software to suit individual business needs and idiosyncrasies. It is usually extremely difficult to get a vendor to add the specific features you want. Even when vendors do agree to add your feature request, it's often expensive, may take some time, and is unlikely to precisely match your requirements.
GNU GCC is probably one of the most respected FOSS development tools. As a C/C++ compiler it still dominates the field. With GCC it isn't so much that users want to extend or enhance it, but rather that they want to use a product that has a critical mass of engineers developing and continually improving it. Take the recent changes to C++ with C++11. GCC already supports most of the new language features and supporting libraries described in the standard. Clang (LLVM) has similar support. Releases are frequent and timely.
So, in summary, there is the appealing feature of low cost, the fact that direct access to the source code reduces your risk if you have the capability to take advantage of it, and access to frequent updates and improvements from a vibrant critical mass of developers.
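As a small illustration of the low barrier to entry mentioned above, a minimal QuickFIX initiator in Python (the quickfix package) looks roughly like this. Session details such as hosts, comp IDs and the FIX version live in an external settings file, assumed here to be named client.cfg.

```python
# Minimal QuickFIX initiator skeleton (pip install quickfix). Session
# details live in a settings file, assumed here to be "client.cfg".
import quickfix as fix

class App(fix.Application):
    def onCreate(self, sessionID): pass
    def onLogon(self, sessionID):  print("logged on:", sessionID)
    def onLogout(self, sessionID): print("logged out:", sessionID)
    def toAdmin(self, message, sessionID): pass
    def fromAdmin(self, message, sessionID): pass
    def toApp(self, message, sessionID): pass
    def fromApp(self, message, sessionID):
        print("received:", message)   # application messages arrive here

settings = fix.SessionSettings("client.cfg")
app = App()
initiator = fix.SocketInitiator(app, fix.FileStoreFactory(settings),
                                settings, fix.FileLogFactory(settings))
initiator.start()
```

With the session mechanics handled by the library, the engineering effort shifts to the business logic in fromApp, which is much of QuickFIX's appeal.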
As a developer why would you release on open source?
In my own experience, I started development on fix8¹ purely from individual inspiration. I wanted complete control over every aspect of the FIX layer. I worked on it in my own time for more than a year, and at the end of this period I thought, “fix8 has evolved into something that could really be of benefit to the FIX community.” I considered the prospect of commercialising it and decided instead to release it as open source.
Something worth thinking about here is why so many companies and individual developers freely contribute to these projects. In my view, they are motivated by pure love and enthusiasm for the field. The benefits derived from this enthusiasm speak for themselves. I believe this is the main motivation for developers to release their work to the community – much like myself.
Why FOSS isn’t always the right choice
Of course, FOSS isn’t a panacea. Niche fields often suffer from the lack of available FOSS projects. Often those that are available are immature or just plain bad. Commercial products often fill this void. Sometimes these products will branch a FOSS version to keep the community active and involved.
A critical weakness can be the lack of technical support. Even with an active community there is often insufficient support, leaving organisations exposed and quickly looking for costly alternatives. In my experience, though, these situations are infrequent. Red Hat was one of the early leaders in developing a commercial support model, which went a long way towards addressing these problems.
Projects that use poorly researched and misunderstood FOSS components are also commonplace. Just as many organisations actively choose not to use FOSS, others seem to use it by default. Inevitably this can lead to the wrong choices being made. Once a system has been built on top of a number of ill-suited FOSS pieces it can be difficult to unravel and choose alternative components.
Also, sometimes FOSS projects go cold, occasionally leaving an unsupported, unimproved user base to fend for themselves.
One final reason FOSS might not be chosen is cultural. Banks, for example, have traditionally relied on a combination of in-house and vendor solutions. Up until fairly recently, FOSS operating systems were not deployed as part of their IT policy. The way Linux has started replacing Sun/Solaris is a good example of how this has changed.
Uses of FOSS
Many companies embrace the field and use as much FOSS as is appropriate (and legal, depending on licence restrictions). Amazon²,³ is a well-known example. Facebook⁴ is another. Clearly some organisations have concerns about proprietary IP and so will not contribute enhancements back to the community. More traditional organisations will develop in-house, use vendor solutions and selectively use established FOSS pieces (such as GCC). However, it seems to me that the trend is moving more and more to FOSS (especially Linux and GNU).
Linux is a good example of this trend towards FOSS, from specialised kernel modifications to entire distributions devoted to specific purposes. Linux now powers mobile phones (e.g. Android), embedded systems, wireless routers and even media players (e.g. Western Digital's WDTV). The world has embraced Linux like no OS before it, in stark contrast to the closed source OS model.
Red Hat's JBoss application server is also a great example of the potential of a complete FOSS stack, from OS to application.
What comes next?
We have seen many big organisations building systems on FOSS and, as I see it, FOSS will continue to grow, with the finance sector increasingly becoming involved as users and contributors. The success of Linux will likely pave the way for more OS platforms and applications offered as FOSS. It is also likely that we will see more commercial offerings providing FOSS versions (for example, Intel's TBB⁵).
My prediction is that FOSS will increasingly replace closed source and vendor supplied solutions. This seems especially likely as the trend to low volatility/low volume in the marketplace pressures companies to seek more cost-effective ways to deliver software.⁶
In finance/IT, organisations considering the development of new systems are now more likely than ever to use FOSS from day one. From the Linux kernel, right the way through to application software, FOSS is finding its way into more organisations.
We can probably also view these changes in use and availability as part of a general trend towards more openness in toolsets, platforms and environments. Initiatives like the Lodestone Foundation demonstrate this trend.⁷ More organisations are viewing their own implementation as their primary IP. As this trend progresses I believe we will see more packages being offered under the FOSS model, and more packages offered both as FOSS and with commercial support.


Mitigating Market Abuse

Lorne Chambers, Head of Sales and Account Management for SMARTS Integrity, NASDAQ OMX, looks at the challenges faced by regulators and how they can bridge the gaps in cross-market surveillance.
The responsibility for cross-market surveillance, which has traditionally been levied on exchanges, is rapidly expanding to encompass industry regulators as part of a collective effort to mitigate market abuse. Following the implementation of new regulatory policies in 2007, particularly Reg NMS in the US and MiFID in Europe, enhanced competition and market fragmentation have been key elements in exposing gaps in surveillance coverage across trading venues.
Current market structure has increased the potential for abuse; indeed, practices such as frontrunning, spoofing and insider trading can now be spread across multiple venues by market participants in an effort to reduce the chances of detection. For example, a participant might layer the order book on one venue and then proceed to trade in the same instrument on another venue.
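As a toy illustration of how a consolidated view catches this, the sketch below flags a participant who posts large resting interest on one side of the book on one venue and then trades the opposite side on another venue in the same instrument within a short window. The event layout, thresholds and window are illustrative assumptions, far simpler than a production surveillance model.

```python
# Toy cross-venue layering alert: large resting interest on one side of
# venue A, followed within a window by an opposite-side trade on venue B
# in the same instrument by the same participant. All thresholds and the
# event layout are illustrative.

WINDOW_S = 5.0
MIN_LAYER_QTY = 25_000

events = [  # (time, participant, venue, instrument, kind, side, qty)
    (0.0, "P1", "VENUE_A", "XYZ", "order", "BUY", 50_000),   # layered bids
    (1.2, "P1", "VENUE_A", "XYZ", "order", "BUY", 40_000),
    (2.0, "P1", "VENUE_B", "XYZ", "trade", "SELL", 10_000),  # trades away
]

recent_orders = []
for t, who, venue, instr, kind, side, qty in events:
    if kind == "order":
        recent_orders.append((t, who, venue, instr, side, qty))
        continue
    for (t0, who0, venue0, instr0, side0, qty0) in recent_orders:
        if (who0 == who and instr0 == instr and venue0 != venue
                and side0 != side and t - t0 <= WINDOW_S
                and qty0 >= MIN_LAYER_QTY):
            print(f"ALERT: {who} layered {side0} on {venue0}, "
                  f"traded {side} on {venue} {t - t0:.1f}s later")
            break  # one alert per trade is enough for this sketch
```

Crucially, neither venue alone sees anything suspicious; only a surveillance function with data from both venues can join the two halves of the pattern.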
With firms already facing increased competition and cost pressures, there are compelling reasons to consolidate surveillance within regulators or third parties such as the Investment Industry Regulatory Organization of Canada (IIROC) and the Financial Industry Regulatory Authority (FINRA) in the US. These organizations and regulators can potentially achieve economies of scale and monitor multiple venues in a coordinated manner.
This regulatory shift is occurring primarily across markets in Europe and North America. Europe faces a unique challenge in establishing a surveillance program that offers a consolidated view of data because it must navigate the local rules and regulatory bodies of over twenty member nations. As one of many efforts to remedy this situation, the European Securities and Markets Authority (ESMA) was formed in January 2011 with the purpose of harmonizing approaches across Europe through its guidelines on MiFID. However, unlike IIROC, which oversees market surveillance in Canada, ESMA currently does not have the capability to execute and monitor a pan-European surveillance program.
In addition to ESMA, the Intermarket Surveillance Group functions as a member-based organization bringing together over fifty exchanges, market centers and market regulators across the world to coordinate surveillance policies and minimize cross-border manipulation. The International Organization of Securities Commissions (IOSCO) and the Basel Committee on Banking Supervision recently fronted another initiative with the joint publication of a new set of standards aimed at preventing firms from capitalizing on rule-based differences between nations and at improving transparency within derivatives markets.
With the exception of Australia, which recently employed a comprehensive surveillance program through the Australian Securities and Investments Commission (ASIC) to regulate the Australian Securities Exchange and Chi-X, the issue of cross-market surveillance is not playing out to the same degree in the Asia-Pacific region as it is in the US, Europe and Canada. There is however potential for South America to face this issue in the near future, especially in Brazil where there is heightened public interest in establishing a new trading venue.
Why Regulators?
In the face of this complex multi-venue landscape, regulators offer an enhanced level of confidentiality in cross-market surveillance that is challenging to achieve at the broker-dealer or exchange level. Surveillance departments at exchanges and other trading venues have a significant need to share sensitive information in a secure manner without relinquishing secrets to competitors, who could, for example, reverse engineer trading strategies from customer-identified data.
In addition, there is a growing trend to outsource to third party quasi-regulators, such as FINRA and IIROC, that maintain a degree of separation from trading activities and possess the necessary tools and budgets to handle such complex operations. The majority of Canadian trading venues utilize IIROC's services, which offer a full range of monitoring and regulatory programs. In addition, the New York Stock Exchange (NYSE) and the Nasdaq Stock Market outsource T+1 surveillance to the US self-regulatory organization (SRO), FINRA.
Many surveillance solutions are becoming increasingly outmoded by immense volumes of data; up to ten billion trading messages can occur per day in a given country, and NASDAQ OMX alone processes more than four billion transactions daily. Shifts in liquidity among venues can also result in a venue's market share and exposure quickly changing from minor to significant, so a market's regulatory risk can quickly escalate to the point that systematic monitoring is essential. One of the approaches suggested in Europe holds regulators accountable for consolidating order books for listed companies that trade primarily in their country. The notion of the primary market has the potential to change regularly for certain instruments in a competitive environment.
Consistency in identifying market participants and their trading activities is essential to ensuring market integrity and fostering a more unified and systematic approach to cross-market surveillance. The Consolidated Audit Trail initiative in the US mandates that SROs submit a national market plan that would lead to the creation of an audit trail tracking the life of an order from broker to exchange/ATS with client-identifying details. This is a major step, considering most markets outside of the Middle East and some Asian countries do not systematically identify client accounts in surveillance data. There are end-user technologies that proffer solutions to such challenges, including SMARTS Integrity from NASDAQ OMX, which recreates order books from exchange data, links related instruments, and consolidates data across multiple venues to identify and combat potential market abuse.
As regulatory and surveillance needs can remain highly market-specific, there is a complementary argument for self-monitoring within exchanges: they maintain a more detailed view of both the market and their listed companies than regulators and third-party organizations, giving them more effective insight into market activity. In Europe, legislation is currently under consideration that could allow regulators to outsource consolidated market surveillance to third parties, including local exchanges.
Next Steps for Regulators
Establishing a manageable budget in times of austerity and attracting the right staff will be paramount if regulators are tasked with deploying successful cross-market surveillance programs. The process of normalizing multiple data formats and linking data from multiple sources will be equally important as market structure and surveillance needs continue to evolve.
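The normalization step itself is conceptually simple, as the hypothetical two-format example below suggests; the difficulty lies in doing it at scale and keeping pace with changing venue feeds:

```python
# Two invented venue formats mapped onto one common record so they can
# be linked and compared; real feeds (FIX, ITCH, proprietary) differ
# far more widely than this.

def normalize_venue_a(raw):
    # Venue A: pipe-delimited "symbol|side|qty|price"
    sym, side, qty, px = raw.split("|")
    return {"symbol": sym, "side": side.upper(), "qty": int(qty), "price": float(px)}

def normalize_venue_b(raw):
    # Venue B: dict with different field names and numeric side codes
    return {
        "symbol": raw["instr"],
        "side": "BUY" if raw["bs"] == 1 else "SELL",
        "qty": raw["size"],
        "price": raw["px"],
    }

records = [
    normalize_venue_a("VOD.L|buy|1000|172.35"),
    normalize_venue_b({"instr": "VOD.L", "bs": 2, "size": 500, "px": 172.40}),
]
print(records)
```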
Those looking to find an unfair advantage and exploit markets are investing heavily in technology, so it is critical that regulators partner with industry solutions that have the resources to continue to innovate and take advantage of new technologies such as big data, cloud computing, social media and sentiment analysis. Real-time cross-border systems, coupled with expertise, will further aid in linking a worldwide community of regulators.
In the US, the Consolidated Audit Trail is being implemented under the national market system as one of many steps to improve market compliance and regulation. Other regions, including Europe, are putting in place frameworks for industry-led consolidated tape solutions, and these efforts may well learn valuable lessons from the progress made in the US over the next several months.
One thing is abundantly clear: in the current era of market fragmentation and trading across markets and countries, it is essential for regulators to be able to monitor activity across multiple markets.

HFT In India?

Sanjay Rawal, CEO of Open Futures, talks about the latest trends in high frequency trading in India.

What are the principal elements of the HFT market in India? How well developed is it? How quickly is it growing?
Automated trading has seen a massive jump in the last three years. However, not all of this is high frequency. We have seen a few early adopters trying to use higher-frequency strategies, but most have tempered these by moving to strategies that involve more computation. The problem is that there is simply not enough liquidity in the market to ensure that you could close out most of your positions by the end of the day. It therefore seems more profitable to be in lower-frequency, but slightly more complex, trades.
What major challenges do you face? Are they technological or regulatory?
Both. Clearly, just like everywhere else, the Indian regulators are becoming more conscious of the risks that runaway algorithms carry, and they are therefore beginning the process of imposing limitations. As is often the case in such situations, we are likely to see regulatory overreach, and that will certainly impact the spread of high frequency trading.

The regulator has introduced new rules to curb the perceived risks of high frequency trading. Have these had an impact?
To some degree. However, I believe the best form of defence the regulator has is to impose pre-trade risk checks at the exchange level. As long as the risk management software run by the exchange is stable and adds little latency, even as market volume increases, it would be the surest way of ensuring that algorithms don't go haywire.
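By way of illustration, exchange-level pre-trade checks of the kind described might, in much simplified form, look like the sketch below; all thresholds, fields and rules here are invented:

```python
# Illustrative exchange-level pre-trade risk checks. These must be
# cheap to evaluate, since they sit on the critical path of every order.
MAX_ORDER_QTY = 100_000      # hypothetical fat-finger quantity limit
MAX_NOTIONAL = 50_000_000    # hypothetical per-order notional cap
PRICE_BAND_PCT = 0.05        # reject orders more than 5% from reference

def pre_trade_check(order, reference_price):
    """Return (accepted, reason), run before the order reaches the book."""
    if order["qty"] > MAX_ORDER_QTY:
        return False, "quantity exceeds fat-finger limit"
    if order["qty"] * order["price"] > MAX_NOTIONAL:
        return False, "notional exceeds per-order cap"
    if abs(order["price"] - reference_price) / reference_price > PRICE_BAND_PCT:
        return False, "price outside band"
    return True, "accepted"

print(pre_trade_check({"qty": 500, "price": 101.0}, reference_price=100.0))
print(pre_trade_check({"qty": 500, "price": 120.0}, reference_price=100.0))  # rejected
```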
Specifically on the rules regarding order-to-trade ratios, what are your thoughts on these charges?
A terrible mistake. How on earth does it benefit the exchange or the participants if liquidity is pulled out of the market? In some form or other, every bid in the system represents liquidity. It would be far better to impose limits on how far away you can bid or how long a bid has to stay in the market, or the exchange could incentivise bids in products which have low liquidity.
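To make the two approaches concrete, here is a toy sketch contrasting an order-to-trade ratio penalty with the minimum-resting-time alternative mentioned above; the thresholds are invented:

```python
MAX_ORDER_TO_TRADE = 50   # hypothetical ratio above which charges apply
MIN_RESTING_MS = 500      # hypothetical minimum quote lifetime

def order_to_trade_ratio(orders_sent, trades_executed):
    return float("inf") if trades_executed == 0 else orders_sent / trades_executed

def breaches_ratio(orders_sent, trades_executed):
    # Rule actually introduced: penalise sending many orders per trade.
    return order_to_trade_ratio(orders_sent, trades_executed) > MAX_ORDER_TO_TRADE

def breaches_resting_time(submit_ms, cancel_ms):
    # Alternative rule: a bid must stay in the market a minimum time.
    return (cancel_ms - submit_ms) < MIN_RESTING_MS

print(breaches_ratio(10_000, 150))    # True: ~67 orders per trade
print(breaches_resting_time(0, 200))  # True: cancelled after 200 ms
```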
What technology would you like to see on the Indian market?
No specific technology as such. Just as other markets have evolved, so will the Indian markets, to accommodate newer technologies. Anything that you have seen happening in any of the larger markets should eventually come to India. Of course, my view is that in the present context it will be difficult to justify the type of technology spend that might be needed. Hopefully, exchange-level pre-trade checks, as well as newer technologies and more complex strategies, will come next.

Trading To The New Norm

Mark Goodman, Head of Quantitative Electronic Services, Europe, Societe Generale Corporate & Investment Banking writes on venue and algorithm analysis in a low volume environment.
The initial stages of the financial crisis were characterised by high volatility and high volumes. The post-Lehman environment was difficult to navigate for many reasons, but low market volume was not one of them. To some extent this continued until early 2011, but since then we have seen a steady decline in trading activity, albeit interspersed with moments of fear-driven volume peaks such as last August.
However, since the focus of the financial crisis moved from institutional to sovereign risk, European markets have been characterised primarily by a lack of activity combined with moments of significant volatility. The markets have settled into a directionless drift, with no strong conviction to stimulate more activity; as a result, a low volume environment is becoming the norm.
Whilst there are few signs that volumes will increase, there are some indications of further downward pressure. Some of the core proposals of MiFID II could have a further negative impact on market activity. There is often heated debate about the positive or negative impact of high frequency trading on more traditional market participants, but the fact is that this activity accounts for around 35% of European exchange volumes. MiFID II does not seek an outright ban on these counterparties, but efforts to regulate them more tightly are likely to reduce their activity, and thus market volumes, at a time when markets are already struggling. Specifically, the removal of credits/debits and the imposition of market making obligations will reduce the profitability of some strategies. There is clearly a distinction to be made between volume and liquidity when trying to understand the impact of this legislation on market activity, but the initial outcome of lower market turnover is indisputable.
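To put rough, assumed numbers on the scale of this effect: if high frequency activity is around 35% of volume and new rules were to cut that activity by, say, 30%, total volumes would fall by roughly a tenth.

```python
# Back-of-envelope only: the 30% reduction is an assumed figure.
hft_share, hft_reduction = 0.35, 0.30
print(f"{hft_share * hft_reduction:.1%}")  # -> 10.5% of total volume lost
```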
Faced with this environment, the buy-side trader has to work harder to get trades done – even trades which previously may have been relatively simple. Actively managing access to liquidity and an in-depth understanding of the micro-structure of the market have become necessities. In addition, algorithms which worked in an environment of higher liquidity need to be reassessed and in some cases rebuilt.
More Venues, Or The Right Venues?
In terms of managing access to liquidity, one aspect is ensuring you can reach the maximum number of venues. Whilst this has been a core topic of discussion with our clients since MiFID I kick-started the process of fragmentation, the discussion has evolved from maximising the number of venues to a more cautious approach focused on using the right venues. There is an inherent contradiction in a trader restricting their venue selection in a low liquidity environment, but where the perception is that the shortfall is being made up of predatory, short-term alpha-seeking strategies, the concern is justified.
This change of approach can be seen in the widespread adoption of the FPL execution venue reporting guidelines released last May. Whilst these guidelines have been published as recommendations, we feel this level of transparency should be mandatory for brokers; clearly showing where an order was executed, including ensuring the end venue is correctly identified when orders are “onward-routed”, is not an unreasonable demand from the buy-side whose order is being worked. Buy-side traders are analysing the minutiae of venues and the associated impact on their trading, execution by execution, in order to exclude those venues where they are vulnerable to adverse selection.
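Assuming execution reports identify the final execution venue, as FIX's LastMkt field (tag 30) is intended to, the buy-side check reduces to a simple tally that also exposes fills whose end venue is opaque; field names in this sketch are illustrative:

```python
from collections import Counter

# Hypothetical fills for one order; "last_mkt" stands in for the final
# execution venue (a MIC code), which may be missing on onward-routed flow.
fills = [
    {"order_id": "O1", "last_mkt": "XLON", "qty": 400},
    {"order_id": "O1", "last_mkt": "CHIX", "qty": 600},
    {"order_id": "O1", "last_mkt": None,   "qty": 100},  # end venue not identified
]

venue_qty = Counter()
for f in fills:
    venue_qty[f["last_mkt"] or "UNKNOWN"] += f["qty"]

print(dict(venue_qty))  # flags the share of flow whose end venue is opaque
```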
However, as highlighted, excluding a venue means excluding liquidity, and this can be costly when trying to capture short-term alpha. A more nuanced approach is to use the right venue at the right time, treating each order according to its specific objective. An order with a longer-term alpha horizon, where there is no need to signal to the market, should only include high-quality, low adverse selection venues; an urgent order, where a significant market movement is expected, should focus primarily on the probability of execution when selecting venues.
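A simplified sketch of this objective-driven venue selection follows, using invented per-venue scores for adverse selection and fill probability:

```python
# Hypothetical per-venue profiles: adverse selection cost (bps) and
# probability of getting filled when posting there.
venues = {
    "A": {"adverse_bps": 0.5, "fill_prob": 0.4},
    "B": {"adverse_bps": 3.0, "fill_prob": 0.9},
    "C": {"adverse_bps": 1.2, "fill_prob": 0.6},
}

def select_venues(urgency, venues):
    if urgency == "low":
        # Long alpha horizon: only high-quality, low adverse selection venues.
        return [v for v, s in venues.items() if s["adverse_bps"] <= 1.0]
    # Urgent order: rank by probability of execution instead.
    return sorted(venues, key=lambda v: venues[v]["fill_prob"], reverse=True)

print(select_venues("low", venues))   # ['A']
print(select_venues("high", venues))  # ['B', 'C', 'A']
```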
This requires a systematic approach to profiling each venue in a way that enables the algorithm to differentiate between the different sources of liquidity. SocGen has developed a proprietary methodology called “Quality of Venue Measure” (QVM), which captures the cost of trading (adverse selection) on each venue as well as the signalling cost to the residual of the order of getting that execution done. An example of the signalling cost to the residual can be seen in the diagram below: this is an analysis of the behaviour of a stock following an execution in the respective venue. All orders are normalised to buys and an observation is made of the price direction following the execution.
[Diagram: price behaviour following an execution in each venue, orders normalised to buys]
This analysis is one way to reveal a distinct profile for each venue. If we try to capture a half-spread saving in venue B but continue to trade our order afterwards, the potential performance impact on the residual execution is up to 3bps. This does not mean that venue B should be excluded from all trading, but it should be an input to the routing decision when trading a larger, over-the-day order where that impact may be meaningful, or when completing the residual of an order, where any post-trade impact will not affect performance.
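The same style of measurement can be sketched in a few lines. The figures below are invented and the method is a generic post-trade markout calculation, not SocGen's proprietary QVM: executions are normalised to buys and the subsequent price drift is averaged per venue.

```python
def markout_bps(exec_price, later_mid, side):
    """Signed drift after our execution, in basis points; positive means
    the price moved against the residual of the order (we signalled)."""
    sign = 1 if side == "buy" else -1
    return sign * (later_mid - exec_price) / exec_price * 1e4

# Invented fills: execution price and the mid price one second later.
fills = [
    {"venue": "A", "side": "buy",  "px": 100.00, "mid_1s": 100.01},
    {"venue": "B", "side": "sell", "px": 100.00, "mid_1s":  99.97},
    {"venue": "B", "side": "buy",  "px": 100.00, "mid_1s": 100.03},
]

by_venue = {}
for f in fills:
    by_venue.setdefault(f["venue"], []).append(
        markout_bps(f["px"], f["mid_1s"], f["side"]))

for venue, ms in by_venue.items():
    print(venue, f"avg post-trade drift: {sum(ms)/len(ms):.1f} bps")
```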
For the broker, this systematic analysis of the differences between venues should not be a one-off exercise. Venue operators are under the same commercial pressures as brokers due to low volumes; as a result there is continual experimentation with pricing structures, crossing rules and other elements in order to attract new participants, and this can have a fundamental, ongoing impact on the nature of the venue itself. Whilst the Pipeline situation in the US earlier this year is an extreme example, failing to analyse each venue continually means that relative changes between them can be missed, with the result that using a venue has an unexpected, negative impact on performance.
Equally, there are instances where changes can attract more benign liquidity, which can improve the execution of the order by absorbing some of the market impact; examples generally relate to changes in matching rules, such as prioritising size over time or implementing auction models. In this situation a lack of analysis means you could be missing good quality liquidity, which is a scarce resource in today's market.
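The difference a matching-rule change makes can be seen in a toy comparison of time priority against size-over-time priority at a single price level; the resting orders below are invented:

```python
# Three resting orders at the same price, in arrival order.
resting = [
    {"id": "r1", "qty": 500,  "arrived": 1},
    {"id": "r2", "qty": 5000, "arrived": 2},
    {"id": "r3", "qty": 200,  "arrived": 3},
]

# Classic price-time priority: first in, first matched.
time_priority = sorted(resting, key=lambda o: o["arrived"])

# Size-over-time priority: larger orders jump the queue, time breaks ties.
size_priority = sorted(resting, key=lambda o: (-o["qty"], o["arrived"]))

print([o["id"] for o in time_priority])  # ['r1', 'r2', 'r3']
print([o["id"] for o in size_priority])  # ['r2', 'r1', 'r3'] -- rewards size
```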
Yesterday’s Market
The other, related, significant impact of this environment is that algorithms previously used to good effect by the buy-side have seen their performance decline. Effectively, these were designed and optimised for yesterday's market; the change in the market environment has not been reflected in the product offering of most brokers.
Whilst one reaction could be to release a new suite of algorithms, our experience has shown that a longer list of options to choose from is certainly not what clients are demanding. As the focus on the micro-structure of the market has deepened the buy-side trader's understanding, that same focus on the logic of the algorithms has led to an increase in individual requirements. Customisation is an over-used response to this demand; it is not about the algorithm, it is about defining the objective the client is trying to achieve. This approach to algorithmic development is in-depth and iterative, and combined with legacy technology it can result in a costly solution; many brokers are reluctant to go down this path when market volumes mean the cost may not be covered.
However, this is not a commercial issue but an architectural one: not only were most algorithms designed for yesterday's market, but most technology was designed for yesterday's client. Brokers need not only to reconsider their product offering, but to break it down and offer it in a way that can be deployed quickly, bringing the component parts together for each individual client. This tends to be a learning process for both counterparties, so each delivery should be seen as a step towards a final objective rather than an end point. Attempting to achieve this with technology built to deliver a uniform product suite for all clients, and then to commercialise the effort when faced with historically low volumes, is challenging.
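In very stylised form, the component approach amounts to assembling each client's algorithm from reusable parts rather than shipping one uniform product; the component names and behaviours below are purely illustrative:

```python
# Illustrative reusable components: each takes an order and adjusts it.
def passive_poster(order):
    order["tactic"] = "post at touch"
    return order

def dark_sweeper(order):
    order["dark"] = True
    return order

def urgency_booster(order):
    order["urgency"] = "high"
    return order

def build_algo(*components):
    """Compose components into one client-specific algorithm."""
    def algo(order):
        for component in components:
            order = component(order)
        return order
    return algo

# One client wants passive posting plus dark liquidity; another wants urgency.
client_a = build_algo(passive_poster, dark_sweeper)
client_b = build_algo(urgency_booster)

print(client_a({"symbol": "VOD.L"}))
print(client_b({"symbol": "VOD.L"}))
```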
There is general consensus in the market that the current situation will not change in the medium term. This means that both buy- and sell-side need to adapt their approach to trading to the new norm: for the buy-side this means a focus on how to maximise access to liquidity venues whilst minimising slippage; for the sell-side it means a more pragmatic and flexible approach to providing access to venues and algorithm solutions. Whether the sell-side can adapt to this new reality at a time when investment is scarce will dictate who the leaders in this field will be for the next few years.
