
Taming The 800-Pound Gorilla – Encouraging FIX Compliance Amongst Exchanges Globally

Q: Where does an 800-pound gorilla sit?
A: Anywhere it wants.


I landed my first job in the financial industry back in the halcyon days of the mid-1990s. From my naïve perspective, everything was wonderful. Stocks only went up. Nobody worried about a company’s P/E ratio, or whether it was capable of turning a profit. If it had a “.com” in its name, it was a guaranteed winner. A website that resold airline tickets had a higher market capitalization than the airlines. Within the financial industry, technology was king, and with money growing on trees, it seemed firms had unlimited IT budgets and armies of software developers.
Exchanges, in the mid-90s, were the 800-pound gorillas of the day. They could implement proprietary standards, and firms had no choice but to adopt them. With liquidity centralized on the exchanges, there was no real competition, so everyone in the industry let the gorillas sit wherever they wanted. And with unlimited IT budgets and armies of software developers, accommodating the seating needs of several gorillas wasn’t all that painful.
What a difference a decade makes …
Times have changed. With the global economic crisis, IT budgets and headcounts are shrinking across the industry, and costs must be justified. Liquidity in many regions has become decentralized, with new generations of ECNs and ATSs taking market share from central exchanges. Interexchange linkages mean that firms don’t have to connect to everyone to achieve price protection and best execution, although, obviously, exchanges would prefer firms access them directly and not through their competitors. Firms are discovering they now have far more choices in finding liquidity, and they have less reason to tolerate expensive proprietary or non-standard interfaces.

Is FIX the solution?
FIX provides benefit to the financial industry in the form of a “network effect.” In theory, work done to support FIX for one market can be reused with other markets, effectively sharing the development cost. Proprietary protocols lack the ability to generate network effects and increase costs for the industry. A market that only supports a proprietary protocol is like an 800-pound gorilla that wants to sit on my laptop. The resulting situation is clearly agreeable to the gorilla, but to me it’s expensive and inconvenient.

Proprietary protocols are not the only method of escalating market participants’ costs. While many markets claim to implement FIX, some deviate from the standard protocol to varying degrees, which drives up costs. Deviations usually fall into three categories: session, syntax and semantics.

Session costs
With a wide variety of established commercial and open-source FIX engines, session level deviations are few, but notable examples do exist. In these cases, the cost to a market’s participants can be severe. Firms using commercial FIX engines usually do not have the ability to modify an engine to support non-standard session behavior, so firms find themselves at the mercy of their engine vendors. The vendors, in turn, are left scratching their heads, wondering why the particular exchange deviated from the session standard and how they can recoup the cost of working around the exchange’s issues.

Syntax costs
Deviations in syntax are more common. These may include failing to send required fields, or forcing participants to send custom, user-defined fields that are not required by the FIX specification. While these do carry some expense and inconvenience, they are not usually catastrophic.
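As an illustration, a compliant counterparty can mechanically check an incoming tag=value message against the fields the specification requires. The sketch below is a minimal, hypothetical validator: the required-tag subset and the "|" delimiter (standing in for the SOH character) are simplifications for readability, not the full FIX rules.

```python
# Illustrative sketch only: validating that a NewOrderSingle carries a
# small subset of tags FIX requires. "|" stands in for the SOH (0x01)
# field delimiter used on the wire.

# ClOrdID, HandlInst, OrdType, Side, Symbol, TransactTime (simplified subset)
REQUIRED_NOS_TAGS = {11, 21, 40, 54, 55, 60}

def parse_fix(message: str, delimiter: str = "|") -> dict:
    """Parse a tag=value FIX message body into a {tag: value} dict."""
    fields = {}
    for pair in message.strip(delimiter).split(delimiter):
        tag, _, value = pair.partition("=")
        fields[int(tag)] = value
    return fields

def missing_required(message: str) -> set:
    """Return the required tags absent from the message."""
    return REQUIRED_NOS_TAGS - parse_fix(message).keys()

order = "11=ORD123|21=1|55=IBM|54=1|40=2"  # TransactTime (60) omitted
print(sorted(missing_required(order)))     # [60]
```

A market that omits such fields, or demands extra user-defined ones, forces every participant to special-case logic like this for that one venue.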

And semantics
Semantic deviations, however, are a market participant’s worst nightmare. The FIX Protocol defines precise semantic usage. Business processes are represented by standard transactional flows of messages, which themselves carry defined identifiers, states and key fields. Market participants’ systems are built assuming certain fundamental semantic behavior. Any nonstandard semantics will often require costly changes and could impose considerable additional latency in the participant’s system.

The costs to the industry of nonstandard behavior include analysis, development, QA, deployment, and integration testing. Firms that use vendor products must convince their vendor to support the non-standard behavior and must wait until the vendor can find the time to create, test and release the change. Just as the “network effect” can reduce costs for standards-compliant implementations, nonstandard behaviors multiply these costs. Every $10,000 change to support nonstandard behavior made by 1,000 market participants costs the industry $10 million.

What makes them tick?
Understanding the gorilla’s psychology may be useful in correcting its behavior. Gorillas usually choose to deviate from the protocol and sit on inappropriate objects for one of three reasons: they don’t know any better; it allows for easier interfacing with their internal systems; or they believe they are implementing new business functionality not supported by FIX.

Finding the knowledge
Knowledge of the FIX Protocol is not hard to acquire. The specification itself contains extensive documentation. Posting a question in the FPL website’s discussion forums (www.fixprotocol.org/discuss/) often yields a quick answer. Firms can collaborate with FPL directly and markets can solicit comments from their participants on their Rules of Engagement documents. Understandably, participant firms who will be connecting to the market have a vested interest in pointing out any non-compliant functionality that will be costly to implement.

Taming the legacy systems
Legacy systems are often the cause of the most difficult incompatibilities in semantics. An exchange may think it is saving money by directly exposing legacy semantics not compatible with FIX to its members. FIX compliance may require that the exchange perform translations or modify its legacy system to accept FIX semantics, and that might increase the exchange’s cost. But doing so benefits the industry; it ensures compatibility with the considerable quantity of FIX software that expects standard FIX semantics, reducing every market participant’s cost. Standard implementations reduce industry cost, while non-standard implementations pass the buck to the market participants and multiply the costs for the rest of the industry, which is never a good idea.

Standardizing innovation
Innovation is vital to the financial industry, and markets should be encouraged to create new functionality. But this can lead to a proliferation of user-defined fields and messages, which sharply reduces the “network effect” and increases costs for the industry.

Most market-specific features I have seen implemented as user-defined fields are already part of the most current FIX standard. Even with resources such as FIXimate (www.fixprotocol.org/FIXimate), it can be difficult to find functionality among the protocol’s more than 1,600 tags if one doesn’t know where to look. FPL also adds new features to the FIX Protocol in response to industry needs. For example, functionality that must be user-defined in a FIX 4.2 session may be standardized in FIX 4.4. But if new features needed by a market haven’t yet become standardized, the best course of action is for the market to request that FPL add the functionality.

Speed up the changes to encourage innovation
Prior to FIX 5.0, new versions of the protocol were released infrequently, often 18 months apart. This simply was not a viable model to support innovation. Now, FPL releases Extension Packs, which are bundled into Service Packs for FIX 5.0. An exchange could write a Gap Analysis proposing a change, which would then be approved and adopted as an Extension Pack. For minor changes, the entire process could be completed within two months. FPL would then publish the Extension Pack, complete with tag numbers and enumerations, allowing for immediate adoption.

Reducing implementation costs for the industry is important to FPL. I work with FPL’s regional representatives to collaborate with exchanges, ATSs, ECNs, and regulators to promote standardization.
The latest FIX version (currently FIX 5.0 Service Pack 2) provides the most value to an industry struggling with limited development resources. Numerous ambiguities and gaps in earlier versions are clarified in FIX 5.0 SP2, and the proprietary, incompatible representations that proliferated in older versions such as FIX 4.2 are replaced with standard ones. And FPL can only add new features to the latest FIX version in the form of Extension Packs. Fortunately, the cost to support FIX 5.0 SP2 isn’t onerous. Existing FIX order handling semantics have remained largely unchanged since FIX 4.4, few of the subsequent changes are mandatory, and all changes are stored in a machine-readable Repository.

Watch the market leaders
Several forward-thinking organizations have seen the benefits of standardization, and we are now seeing a number of exchanges globally supporting the FIX Protocol, as well as its use for market regulation, with growing support for the latest release, FIX 5.0 SP2. One such organization, the Investment Industry Regulatory Organization of Canada (IIROC), was quick to respond to its members’ needs. When IIROC polled its member exchanges and ATSs on the format of the regulatory feed each market must use to report transactions to its new market surveillance system, they were fully supportive of using the FIX Protocol. The markets and IIROC wanted to use the most standardized specification possible and avoid user-defined fields. FIX 5.0 SP2, with an Extension Pack to support IIROC’s needs, was the logical choice.

I worked closely with IIROC and observed first-hand that, while a myriad of user-defined fields dominates the Canadian markets, existing FIX 5.0 SP2 functionality rendered almost all of them unnecessary. The changes IIROC required were small and didn’t need a single additional field. Initial market reactions have been largely positive. IIROC’s collaboration with FPL has laid the groundwork for standardization and substantial cost savings among the Canadian markets.

It comes down to cost
In today’s competitive landscape, any market, be it an 800-pound gorilla or a smaller player, faces a serious competitive disadvantage if its FIX interface is nonstandard. Would a broker or vendor with limited resources choose a difficult, custom integration job for one market, or use those same resources to connect to five competitors who all follow the standard? Likewise, a market’s new, innovative features are worthless if brokers and vendors don’t implement them. Features using standard FIX fields that can be reused by other markets are more attractive than user-defined FIX fields that are limited to one market.

In these difficult economic times, a nonstandard FIX interface is truly a competitive disadvantage. Fortunately for the industry, many marketplaces have noticed this trend and have chosen standardization. FPL welcomes the opportunity to collaborate with trading venues and regulators to promote standardization, leading to decreased costs for the industry.

Standardising the message: Collaboration, clarity and flexibility key to current and future success of global messaging standard

The Investment Roadmap, a guide created through a collaboration between the world’s leading messaging standards bodies to provide consistent and clear direction on messaging standards usage, was released in May 2008. A year later, FPL’s Operations Director, Courtney Doyle, asked the authors of the roadmap for their assessment of its impact on the industry and how they saw its future evolution.

Courtney Doyle: What was the motivation and purpose behind this Investment Roadmap collaboration?

Genevy Dimitrion (ISITC Chair): Today, Market Practice and Standards are inextricably linked and yet for all our efforts, Market Practice exists as a mere documented recommendation. However, we see convergence between the two areas to the point where market practice can be encapsulated directly into the standards themselves. The roadmap has a role to play in this evolution, and we see it as critical for the next generation of standards. It also helps to provide future users direction on which messaging standards should be used throughout the trade lifecycle.

Jamie Shay (Head of SWIFT Standards): Our involvement was driven by clear market demand from our customers. They needed clarity about which syntax to use in which space and, more importantly, they wanted a way to make these standards interoperable. SWIFT and FIX have been working together for a number of years towards a single business model in ISO 20022. The decision to work together on an Investment Roadmap was a natural progression. It provides clear direction with regard to messaging standards usage as it visually “maps” industry standards protocols, FIX, ISO and FpML, to the appropriate asset class and underlying business processes.

Karel Engelen (Director and Global Head Technology Solutions, ISDA): Similar to SWIFT, ISDA felt it was important to map out the coverage of each of the different standards so people could get a complete view of the industry standards at a glance. We also thought it would be a useful tool for determining where we had duplication of effort or functional gaps to complete.

Scott Atwell (FPL Global Steering Committee Co-Chair): As the others have pointed out, we needed to provide greater clarity as to how the various standards ‘fit together’. We sought an approach that recognizes, leverages, and includes the financial industry standards without reinventing and creating redundant messages that generate cost and confusion for the industry. The effort was named ‘Investment Roadmap’ as its founding purpose was to aid industry firms’ technology investment decision making.


Courtney Doyle: The roadmap was released over a year ago. Is the community referring to it and using it as a guide on how to invest in standards? How should firms use it and what does it mean for them?

Scott Atwell: FPL has found the roadmap useful as a tool for standards investment. However, the roadmap benefits are multifaceted. It has driven an even greater level of collaboration and cooperation amongst our standards organizations. Discussing it often serves as a ‘conversation starter’ that leads to healthy discussion and debate. It has also served to facilitate key changes to the ISO 20022 process such as the ability for FIX to feed the ISO 20022 business model and to be the recognized syntax for the Pre-Trade/Trade model.

Genevy Dimitrion: ISITC consistently refers to the roadmap when presenting to our members as the guidelines on message standards to be used within the trade lifecycle. It has become an extremely helpful tool for our members in understanding the key standards available and how and when they should be used.

Karel Engelen: The Finance domain has always been well represented in the standards arena, but too much choice is often confusing. The roadmap helps you navigate by showing the standards likely to become dominant in a particular space so that you can align your own system development strategy with them.

Jamie Shay: To add to the previous comments, I don’t think the community is referring to the map per se but they are benefiting from it. One of the key things to remember is that it provides clarity about which syntax should be used in which space and that by working together we are making very concrete progress towards achieving interoperability among syntaxes, which leads to certainty and increasing STP – so there are real business benefits.

The roadmap has also meant increasing cooperation among standardization bodies. For example we are working in a much more collaborative fashion with FPL than before, in particular in the area of pre-trade where we’ve defined a common business model that is in the process of being evaluated by ISO.

Courtney Doyle: Some areas of the roadmap allow for multiple syntaxes as indicated by the cross hatching. Why is this and what should firms do in this case?

Jamie Shay: As stated previously, the roadmap provides clarity around which ‘syntax’ is appropriate for which space. The single ‘standard’ is the common data model and common underlying dictionary – ISO 20022. The actual message representation may remain different depending on the syntax chosen to represent that common business model.

In some areas, we have more than one syntax represented in a given space because there may not be a “dominant” syntax. In this case, firms should choose the syntax most suited to their environment or need.

Scott Atwell: As Jamie has stated, our objective is to feed and share a common business model, in which case having more than one syntax is not that significant as the core data and approach should not differ. In the Post-Trade / Pre-Settlement portion of the roadmap, for example, there is cross hatching of the FIX and ISO syntaxes and usage will depend on whether a firm’s STP initiatives are being driven more from the front or back office. Firms may choose to use FIX when automating from the front office forward (e.g. adding allocation delivery to an OMS), or may choose to use the ISO syntax when automating from a back office perspective forward (e.g. building upon existing custodial communication).

Genevy Dimitrion: What is most important to our members is the business process. Although there are overlapping syntaxes, ultimately the business process behind each of them is the same. It is not practical to expect all financial communities to adopt a single syntax.

The roadmap is a framework that reflects the reality of the world we live in, underpinned by a common standard data model. From an organizational perspective, the services you provide tend to drive the standard you use which is clearly highlighted in the roadmap. In the cases where the cross hatching exists, firms should look at their key areas of focus to select the appropriate standard to develop.

Karel Engelen: The world of finance is not a set of segregated business lines to the degree it once was. Securities dealers trade in OTCs, derivatives dealers use securities to create structured products or hedge their positions and everyone dabbles in foreign exchange.

Implementing firms often want to maximize the return on their existing software investment by having each of their systems cover more of the trade lifecycle. As a consequence, the standards that support these businesses have had to expand into other domains to support their user base.

As mentioned by the others, the roadmap recognizes that in some functional areas there is more than one standard that can fulfil a particular function although there could be differences in the characteristics of each implementation. Some may be better for high trade volumes of relatively simple products, while others may excel at precise definition of complex financial products or structures traded with low volumes.

We feel that it is up to each community of users to choose the right solution for itself as this may depend on its existing standards usage and technology base. An established messaging community, for example, might decide that leveraging its existing tech infrastructure and knowledge base was more cost effective than developing a parallel solution using a different standard for a new business problem.

Courtney Doyle: Are there thoughts of expanding this group to other standards bodies?

Jamie Shay: Yes, very much so. We are looking to collaborate with other standards organisations, such as XBRL. XBRL is involved in the financial ‘reporting’ end of the business and includes securities and transaction ‘issuers’. Capturing information at the site of the ‘issuer’ in the life cycle of a securities transaction will allow STP to flow throughout the entire chain of events, including the transaction life cycle and eventual reporting related to instruments, positions and statements. The roadmap is a living document. It should evolve as the industry evolves.

Karel Engelen: There is still plenty of scope for automation in finance, much of it outside the areas traditionally covered by FpML, FIX and SWIFT. We would be happy to see other standards integrated into the roadmap to further broaden the coverage to new product areas or processes.

Scott Atwell: Yes, I believe additional standards bodies will join the effort in the near future, and we are discussing reconstituting the group as ‘The Standards Coordination Group.’

Courtney Doyle: What are the next steps? How will the roadmap be maintained and updated?

Karel Engelen: The roadmap should not be seen as a static document. Each of the standards involved continues to evolve to meet the demands of its user base. The group that created it will need to periodically meet to review and update it. The end goal should be a set of interoperable standards covering the entire financial process and product space, unified by a common underlying business model whilst still allowing each to be attuned to the needs of the market it addresses.

Scott Atwell: This is an ongoing collaborative effort. I agree with Karel’s description of the end goal; however, there is a lot of work for us to complete before reaching it. FPL has committed significant technical resources to industry collaboration and the ISO process. ISO 20022’s Pre-Trade/Trade model is represented by FIX, and FPL is currently working with others to complete the Post-Trade model. In addition, FPL actively contributes to ISO WG4 and other areas focused on enhancing the ISO 20022 process. The roadmap will evolve over time. An example might be a previously over-the-counter product shifting to trading in a more listed manner.

Genevy Dimitrion: To follow on from Karel’s and Scott’s comments, we see the roadmap as an evolving document that will be revisited frequently as new products are introduced in the marketplace. We are looking to start addressing CFDs and the best standards for them.

Jamie Shay: The next steps are twofold: on the one side, we need to continue to work with FPL and FpML to agree on the common business models and get them approved by ISO. We also need to encourage other standards organisations to work with us to expand the scope to cover other instruments, other business processes and other business domains.

We also need to find an easier way to be able to “automatically” produce a chosen syntax from the single business model. Work is ongoing to ensure multi-syntax representations of ISO 20022 compliant messages.

Courtney Doyle: How does the roadmap fit in with ISO 20022 and the goals of interoperability among standards?

Scott Atwell: I believe the roadmap has already paved the way to improving ISO 20022, especially in terms of its ability to leverage and include existing, de facto standards. A key issue in years past was that the process required the creation of a new, ISO 20022 XML-specific message syntax for any messages added to the ISO 20022 data model. Now, the ISO 20022 process has been improved to the point that alternate syntaxes, such as FIX, can be recognized. This enables FIX and other messaging standards to contribute to the overall process, and more importantly allows users of these protocols to continue to leverage their current investment while achieving longer term benefits from the harmonization and collaboration of multiple standards.

The most important part of interoperability is that one can easily transition, in an automated fashion, from one stage of the trade lifecycle to the next. The roadmap facilitates a focus on these ‘peering points’ and the ISO 20022 data model provides a common model to reflect the data used throughout.

Jamie Shay: The roadmap provides clear direction about which syntax to use in which space, and building a common business model means these syntaxes will be interoperable. Interoperability will be possible once all the appropriate business models have been created in ISO 20022, and the industry embraces this standard.

Genevy Dimitrion: The roadmap clearly shows how ISO 20022 provides a framework for binding multiple syntaxes together into a single standard and thereby achieving semantic interoperability. The business process and modeling are key to the success of these standards and their interoperability. Focusing on business functions and interactions allows standards to be developed from the business model outward, in contrast to the past, when standards were created first and business functions were shoehorned into existing formats. It allows firms to focus on the interactions among the industry players and the key data needed to communicate successfully in an automated way and increase STP within their organizations.

Karel Engelen: There are still areas where standards based messages are yet to be defined. In some cases these areas will be a natural fit with one of the existing standards but, if they are not, then proposing that they be developed within the ISO 20022 framework, rather than creating another standard, is a good route to follow.

To follow on with the thoughts around interoperability, this is a longer term goal and the key to it will be a common business model that captures and integrates all the different perspectives each standard has on the underlying business information. ISO 20022 has made a good start on this but there is still a long way to go.

Lao Tzu said, “A journey of a thousand miles begins with a single step.” It’s a good job we have a Roadmap.

The Investment Roadmap is available at: www.fixprotocol.org/investmentroadmap.

Keep It Simple

By Anthony Victor
Despite the rapid advances in sophisticated trading tools, Bank of America Merrill Lynch’s Anthony Victor argues that in times of volatility a knowledge of the basics has never been more important.

While market structure and technologies may have transformed, basic trading skills are still critical to success. You do not succeed in any career unless you learn the basics and build a solid foundation in the fundamentals. For a role in electronic trading, the basics include understanding trading mechanics, market structure and technology, as well as the platforms that clients use to trade. Those old monochrome Quotron machines that were prevalent on trading floors when I started my career are now on the trash heap, replaced by state-of-the-art technology, including touch-screen order management systems and workstations that supply news, market data, and analytics.
Since 2000, there have been some very dramatic changes in market structure such as decimalization, the advent of Reg. NMS and an increasing number of liquidity pools. Some of these changes resulted in a reduction of bid/offer spreads and a decrease in average trade size, which in turn, pushed market participants to use more advanced trading technology and ultimately algorithmic trading.
Algorithmic trading strategies, initially used by Portfolio Desks to manage large baskets of stocks, were eventually rolled out directly to the buy side. Trading no longer required multiple phone calls with instructions to execute an order. With a mere push of a button, these instructions could be sent electronically and trading goals could be efficiently realized. However, use of these tools still requires a good understanding of the basics and a team of support professionals that understand the nuances of how algorithmic strategies operate in the marketplace.
Within the last five years, Electronic Sales Trading (EST) desks have emerged on Wall Street to support the clients’ electronic trading activity. Unlike traditional sales trading, which focuses on what clients are trading, electronic sales trading puts the emphasis on how they are trading. Most EST desks have evolved from a pure internal support role into a client-facing, direct coverage role that assesses a client’s performance via real-time benchmark monitoring and post-trade transaction cost analysis. Electronic Sales Traders need to understand market structure and their firm’s algorithmic offering (and that of the competition) to successfully support the trading platform.

While market structure, trading tools, and trading desk responsibilities have all evolved over time, basic trading skills are still critical to success. Part of that skill set includes the ability to maintain a disciplined approach to trading amidst a barrage of news, overall market fragmentation, and a huge volume of market data. Algorithms have helped the buy-side cope with the complexity of the marketplace, but the choice of strategy ultimately belongs to the trader.
In Q4 2008, when volatility (the amount of uncertainty or risk about the size of changes in a security’s value) peaked and preyed upon market participants’ emotions, many traders moved to more aggressive electronic trading strategies. In that tough environment, traders migrated away from passive strategies, like VWAP and low-participation algorithms, to more aggressive liquidity-seeking strategies, including an increased use of ‘dark pool’ aggregators. In a more volatile environment, traders felt pressure to make a stand or risk higher opportunity costs.
However, sometimes intuitive approaches do not work as expected. When analyzing the effects of the volatile market and looking at the slippage, or the difference between the execution price and the price at the time of order receipt (arrival price slippage), these aggressive strategies proved less successful than passive strategies, primarily due to the effects of volatility on spreads and depth of book. During that period, our research shows that S&P 500 spreads widened an average of 72% and book depth decreased 42% compared to Q1. Wide spreads and decreased book depth created a treacherous environment for aggressive, liquidity-seeking strategies, but favored more passive strategies.
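For readers unfamiliar with the metric, arrival price slippage can be sketched in a few lines. The prices below are illustrative only and not drawn from the research quoted in this article.

```python
# Arrival price slippage in basis points: the signed difference between the
# average execution price and the price at order receipt. Positive values
# mean the buy paid more (or the sell received less) than the arrival price.

def arrival_slippage_bps(avg_exec_price: float, arrival_price: float, side: str) -> float:
    sign = 1 if side == "buy" else -1
    return sign * (avg_exec_price - arrival_price) / arrival_price * 10_000

# Illustrative numbers: a buy order arriving with the stock at $20.00,
# filled at $20.05 on average, incurs 25 bps of slippage.
print(round(arrival_slippage_bps(20.05, 20.00, "buy"), 1))  # 25.0
```

Crossing a spread that has widened 72% pushes the average execution price further from arrival, which is exactly where aggressive strategies paid the price.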
The upside of passive
The success of passive strategies like VWAP can be attributed to several factors. VWAP orders tend to be longer in duration than other order types and in turn represent a smaller percentage of interval volume. In addition, the size of individual slices sent to the market tends to be smaller and more passively priced.
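The mechanics behind these factors are simple to state. Below is a minimal sketch of the VWAP calculation and an order’s participation rate; the fill prices, quantities, and interval volume are invented for illustration.

```python
# Minimal sketch of two benchmark quantities mentioned above. All numbers
# are illustrative, not market data.

def vwap(fills):
    """Volume-weighted average price of (price, qty) fills."""
    total_qty = sum(qty for _, qty in fills)
    return sum(price * qty for price, qty in fills) / total_qty

def participation_rate(order_qty, interval_volume):
    """Order quantity as a fraction of total volume traded in the interval."""
    return order_qty / interval_volume

fills = [(20.00, 300), (20.02, 500), (20.05, 200)]
print(round(vwap(fills), 4))                      # 20.02
print(f"{participation_rate(1000, 20000):.0%}")   # 5%
```

A longer-duration VWAP order spreads the same 1,000 shares over more interval volume, driving the participation rate, and with it the market impact of each slice, downward.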

Due to the longer order duration, these less-aggressive strategies are more patient in seeking out price improvement opportunities. This results in crossing the spread less frequently, thus saving the higher cost associated with wider spreads. Conversely, aggressive order types that seek to execute quickly tend to cross the spread much more frequently.
Individual slice size is a meaningful strategy characteristic because decreased depth-of-book also exacerbates arrival price slippage. With less available liquidity, algorithmic strategies may have to work through more price points to complete a given order slice. Slippage increases when there are more of these execution price points. At lower aggressiveness levels, the slices themselves are smaller, reducing the need to hit progressively worse price points to complete, thereby reducing the signaling risk (the risk of revealing information about trading intentions to the market) of any given slice. The larger slice sizes necessary for more aggressive order types tend to execute through more price levels, resulting in more slippage and producing a more detectable signal to the marketplace.
The cost of aggression
Though traders may have assumed that more aggressive strategies were a prudent execution method in the high volatility environment, it is evident in retrospect that the cost of choosing aggressive strategies was significant. Most notably, for more difficult trades (over 2% ADV), the cost difference between an aggressive strategy and a passive one topped 44 basis points in Q4 2008.

During this same period, traders were also looking for additional hidden liquidity to complement their urgent execution mentality. The result was an increase in the use of dark pool aggregators. These order types are generally midpoint-pegged, meaning that they execute at the middle of the spread. Slippage incurred was less than it would have been when fully crossing the spread, although additional costs were realized because spreads widened. As expected, the performance of these order types fell somewhere between more passive and more aggressive strategies.
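The cost position of midpoint-pegged order types – better than crossing the spread, but still exposed to spread widening – follows from simple arithmetic. A hedged sketch, with all quote values invented:

```python
# Hypothetical sketch: slippage of crossing the spread vs. a midpoint-pegged
# fill, before and after spreads widen. Quote values are invented.

def cost_bps(exec_price, mid):
    """Slippage of a buy relative to the midpoint, in basis points."""
    return (exec_price - mid) / mid * 10_000

def quotes(mid, spread):
    """Return (bid, ask) for a given midpoint and spread."""
    return mid - spread / 2, mid + spread / 2

mid = 50.00
for label, spread in [("normal", 0.02), ("widened (+72%)", 0.02 * 1.72)]:
    bid, ask = quotes(mid, spread)
    print(f"{label}: cross-spread buy {cost_bps(ask, mid):.2f} bps, "
          f"midpoint fill {cost_bps(mid, mid):.2f} bps")
```

The midpoint fill always shows zero slippage relative to the mid, but the saving it offers over crossing grows with the spread – consistent with the observation that these order types landed between passive and aggressive strategies in performance.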

Keep It Simple
In Q1 2009, as volatility and spreads began to level off and depth-of-book once again began to increase, we saw a migration away from aggressive strategies back to VWAP and lower-participation Implementation Shortfall. It seems that those traders who underperformed in the fall moved to an uncharacteristic, yet statistically more successful, passive trading style. These strategies continue to work well today.

In a high volatility environment, even with the availability of sophisticated algorithms, some of the most basic strategies have worked best. Well-worn algorithms like VWAP and Implementation Shortfall have generally been more successful than newer and more aggressive, liquidity-seeking counterparts. While perhaps contrary to their intuition, traders should consider sticking with basic, “first-wave” algorithms like VWAP and I/S in a high volatility environment. The first and simplest strategy you learned may sometimes prove to be the best.
The new high volatility environment will undoubtedly spur another evolutionary advance of next generation trading techniques and strategies. Successful traders should look to implement the lessons learned last fall by maintaining a disciplined approach to trading based on fundamentals and overall market structure expertise.

Meldrum & Kandylaki : Markit

SHEDDING LIGHT.


There is more to Markit than BOAT. Will Meldrum and Sophia Kandylaki discuss the firm’s panoply of products and services.

When was Markit launched?

Meldrum: Markit was spun out of Toronto-Dominion Bank (the Canadian-based bank) in 2001 as the first independent source of credit derivative pricing. Over the following couple of years, we expanded our suite of products to cover equities, commodities, foreign exchange and fixed income. In 2007 we bought Project Boat, the equities post-trade reporting platform created by a consortium of nine banks – including Citigroup, Credit Suisse, Deutsche Bank, Goldman Sachs, HSBC, Merrill Lynch, Morgan Stanley and UBS – to coincide with the launch of MiFID.

How did Markit develop BOAT?

Meldrum: Over the last year, we have worked closely with clients and data vendors to improve the quality of the dataset. We have also worked with the industry to better understand the implications of MiFID and how we could enhance our processes and workflow. We have recently restructured our data packages to make Markit BOAT data more easily available to a wider range of clients. The dataset provides a comprehensive view of the pan-European OTC equity markets. Overall, users have access to trade reports on an average of 20bn of OTC trades in equities every day, equivalent to about 25% of the daily volumes traded on all European equity markets.

What is the new pricing structure you introduced?

Kandylaki: The new schedule, which will become effective this May, includes nine regional data packages. They cover Austria, Eastern Europe, Euronext, Germany, Italy, Scandinavia, Spain, Switzerland and UK. They are targeted at users who do not require a full pan-European licence, which was the only licence available up until now. The structure will also include a new sliding scale discount fee for Markit BOAT’s pan-European data licence based on the number of registered users within a company.

In addition, we have launched a new service that gives users an end-of-day aggregated view of validated trades reported to the Markit BOAT platform. It aims to address the market’s needs for accurate historical trade data and correct measurement of off-exchange trading volumes. We recognise the importance of fine-tuning algorithms and we expect the service to also appeal to risk managers who need official OTC volume data to use as an input into their liquidity tests.

Although people often think of Markit as just BOAT, you claim there is much more to the company. How, and where, have you expanded your product range?

Meldrum: Our products fall into three categories – data, valuations and trade processing – which operate across all the different asset classes: equities, credit, fixed income, commodities and foreign exchange. There are also asset-class-focused products, which include Markit BOAT and Markit MSA, which we launched last December. The service collates information on cash and synthetic equity trades executed on primary exchanges, MTFs and OTC markets. We are working with 141 banks including Citi, Dresdner Kleinwort, Deutsche Bank, Goldman Sachs, HSBC, Kepler, Merrill Lynch, Morgan Stanley, Nomura, Panmure Gordon, Royal Bank of Scotland and UBS, who send us their trade information.

The data is then consolidated to provide users with a comprehensive view of market activity. The service ranks brokers on each European trading venue, stock or index according to the volume and value of their trades. The objective is to provide greater insight into the new trading landscape since the introduction of MiFID. The increase in the number of MTFs and dark pools has made it more difficult for market participants to monitor liquidity and market activity. The product shows them who is trading what, where they are trading, and what share the brokers have.

What is Markit Valuations Manager?

Meldrum: Markit Valuations Manager is one of our most recent product launches and it is the first multi-bank, cross-asset client valuations platform. It provides a secure, standardised view of OTC derivative positions and derivative and cash instrument valuations across counterparties on a single electronic platform. Users will be able to view the bank counterparty valuations alongside Markit’s independent valuations. Currently, portfolio managers receive numerous statements from their counterparties in multiple formats, and this requires several hours of manual consolidation. The product will be integrated with our trade processing service to enable full life cycle support for OTC derivative positions including counterparty position data delivery, normalisation, reconciliation and valuation.

Before the financial crisis, counterparty risk was not such a big issue but now it has become a huge focus of the industry. Cost saving is also high on the agenda. We recently conducted a survey of 50 asset managers that showed that there was an urgent need for this type of service. About 17% of the respondents said that a single file delivery of counterparty statements would save them between 50 and 1,000 hours of work a month while on average respondents estimated time savings of over 49 hours a month.

What is Markit Document Exchange?

Kandylaki: We launched this product last year and it is basically a documentation library that allows the buy- and sellside to post, manage and share compliance, counterparty, credit and regulatory documents securely. Quick access to counterparty information has become one of the key issues today from a regulatory and compliance standpoint, and this product simplifies the process and makes it much more efficient. For example, previously, if you were a new hedge fund wanting to trade with different banks, this could entail sending and receiving information from, say, 15 banks and multiple departments within those banks. Our product gives users a platform where they post their documents once and permission them to all their counterparties.

What are your plans for 2009?

Meldrum: We expect 2009 to be an extremely challenging year for global financial markets and we will not be immune from the contraction in global participation. On a more positive note the structural changes present new opportunities for Markit. We are well positioned to participate in areas where the licensing of IP, electronic connectivity of market participants, or provision of data is required.

[Biographies]
Sophia Kandylaki is a director at Markit and the product manager of Markit BOAT. Prior to this, Kandylaki was the product manager for Markit RED, the industry standard for reference entity and reference obligation identifiers used throughout the credit default swap markets. She holds a Bachelor of Laws degree (LLB) and a Master of Laws (LLM) in banking and finance Law from the Centre for Commercial Law Studies, Queen Mary’s College, University of London.
Will Meldrum is a managing director at Markit and head of equity data. Meldrum joined Markit in 2005 as director of acquisitions and strategic development. Prior to this, he worked at Deutsche Bank for four years, managing the bank’s interests across a diverse portfolio of investments with a key focus on industry consortia, electronic trading systems and data. Meldrum holds an MA from Edinburgh University and an MBA from London Business School.
©BEST EXECUTION

Profile : John Barker : Liquidnet

STRIKING THE MATCH.


Liquidity is a scarce commodity these days but Liquidnet believes there are more opportunities than closed doors. John Barker tells Best Execution why.

Can you tell me the background to Liquidnet?

Liquidnet started as a “Eureka!” moment at 4 am in 1999 when Seth Merrin, founder and chief executive officer, came up with the concept of capturing institutional block orders. He spent the next couple of years asking people in the industry to either ratify or pull apart his idea. The reception was good and he launched the company in the US in April 2001. It was successful from day one, mainly because of the simplicity and ease of use of the product. Three clicks of a mouse and the buyside trader was able to trade shares anonymously and efficiently.

In 2002, we went live in Europe, initially focusing on the UK, France, Germany, the Netherlands and Switzerland, and in 2007, we went into Asia, where we are currently in the core five markets – Hong Kong, Singapore, Australia, Japan and Korea. Altogether we are in 29 markets and in many ways the business has developed in a typically West-to-East pattern. Historically, the US is ahead of the UK by about two to three years, the UK is a year ahead of Europe, and Europe in turn is about five years ahead of Asia. However, in this case, the Asian markets were more technologically advanced than Europe: the buyside had already adopted direct market access (DMA) and order management systems (OMS).

What has been the impact of the financial crisis?

The whole financial community has suffered and we are no exception. We had grown 100% year on year from 2003-2007, but last year Europe’s principal trading grew by 24% as compared with 2007. While this figure is much lower than we had forecast at the beginning of the year, we still believe it is good against the current backdrop. Looking ahead, if Europe’s main share indices continue to perform poorly, then principal traded is likely to be flat in 2009.

The group that has been the most impacted is the investment banks. For example, we have not been affected in Europe by the withdrawal of hedge fund activity as our business is only focused on asset managers. Going forward, there will be opportunities for us and we think this could be the age of the agency broker although not all will survive. We are not resting on our laurels and will continue working closely with the buyside to deliver new products, develop existing products and expand into new markets as well as add new functionality.

Which new markets are you looking at?

We are still working our way through Europe and we currently have a healthy pipeline of new client members, with about half a dozen ready to go live. This year we are targeting the asset managers in Germany, France, Switzerland and Italy where we have already started to build relationships. To this end our sales team has recently hired Raj Samarasinhe who speaks several languages and has a good understanding of the different markets. In terms of the markets we provide access to, we expect to have Turkey, Slovenia and Poland on the system in the next couple of months.

What new products and services do you expect to introduce?

We just added Luxembourg-listed Global Depositary Receipts (GDRs) to the system. Initially, there will be 97 GDRs available to trade, all of which are issued under Regulation S, US-dollar traded, and settled via Euroclear. This range complements the LSE-listed Reg S GDRs, AIM securities, exchange-traded funds, contracts for difference and equities from 29 global markets that are already available. Looking ahead, we plan to introduce our new liquidity streaming product called H2O, which was launched in the US about four years ago. The product aggregates fragmented market liquidity and enables clients to interact with the liquidity of other streaming liquidity partners. Unlike the main buyside-only pool, it contains flow from non-buyside sources such as broker-dealers and alternative trading platforms. Effectively, these partners provide more liquidity to the Liquidnet pool, while buyside control is maintained and anonymity and minimal market impact are protected.

We also plan to get our programme trading desk up and running and introduce a transaction cost analysis service in Europe. The desk will be an electronic trading service that uses algorithms to access liquidity both on our own trading platforms and others. I think that large orders will go to the block desk while smaller cap orders will go to the programme-trading desk.

In addition, we are looking to expand our counterparty exposure monitoring process, which we launched in the US late last year. Counterparty risk has become very important since the collapse of Lehman Brothers. This platform collects trade data from various systems and generates a report that allows the firm to look at exposure at the prime broker or custodian level.

There have been dramatic changes thanks to MiFID and the financial crisis. Where do you see your place in the landscape?

Liquidnet reinvented and redefined the institutional marketplace and we really do not see any competition in our space. We see exchanges and MTFs as partners, but we differ from them in that we bring external liquidity only to the buyside. The main objective is to help the buyside enhance the quality and speed of trade execution, gain price improvement for their trades, and lower overall trading costs.

What do you see as the biggest challenges this year?

I think we can expect another six months of bad economic news and I would not be surprised to see another big name go. However, we remain positive about our business model and growth opportunities. The buyside feels very comfortable with our model and this will hold us in good stead in the future. There are so many opportunities for agency brokers and the biggest challenge is in delivering the products and services we discussed on time and on budget. The other important thing to do is to get in front of the asset management community and understand what their requirements are.

[Biography]
John Barker is managing director, Liquidnet, responsible for the EMEA region. Previously, he was head of equity operations at Tokyo-Mitsubishi International — a Japanese international bank based in London — and head of trade support for Deutsche Bank. From 1988 to 1998, Barker was director of operations at Instinet Global Services. Before 1988, he held leadership roles at Yamaichi International (Europe) and Midland Bank/HSBC.
©BEST EXECUTION


Profile : Stephane Loiseau : Société Générale

FINDING THE RIGHT PATH.


As liquidity fragments and markets stay volatile, it is not always easy to find the right pools. We ask Stephane Loiseau of Société Générale how the bank is helping clients navigate through these difficult times.

Can you please provide some background to the execution services group?

Over the past four years, we have been putting all our equity products under one umbrella. This includes sales/trading, single trades, block trading, programme trading, exchange traded funds, direct market access and algorithms. We wanted to take a packaged approach and offer multiple venues for clients to trade on. In effect, it is a toolbox for trading, and although the tools are each different, the common denominator is technology. There is also a strong global focus in that we cover 60 to 65 markets spread across Europe, the Americas and Asia.

Who are your main clients?

We cater to all institutions – active, passive/index – but in the last few months we are seeing increased activity from private banks. At one time they were considered just retail organisations but today the lines between retail and institutional are blurring.

What has been the impact of the financial crisis on SG and the industry?

The biggest impact has been the drop in volumes, the wider spreads and the lack of liquidity. Figures show that volumes have fallen by about 50% from last year, while the bid/offer spread is now three to four times what it was last year.

There is also a different competitive landscape, as some market participants have disappeared and there are fewer players providing capital. Also, there is less activity from some institutions such as hedge funds, which has had a direct impact on volumes and the cost of trading. For example, buying 5% of the volume weighted average price flow could be three to five times more expensive this year than a year ago due to the scarcity of liquidity.

As a result, we, along with the rest of the industry, have had to become much more flexible in the way we trade. About 60% to 70% of volumes are traded electronically and technology has become one of the differentiators. Any firm that has invested in the right tools such as smart order routing, algos and DMA is much better placed than those that did not. Clients want to automate the trading process as much as possible in order to access liquidity both quickly and cheaply.

What do you think the impact of MiFID has been? Do you think we will see more MTFs given the current volatile market conditions?

MiFID has definitely succeeded in introducing competition and has allowed the creation of a true pan-European trading platform, which has been positive for the industry. However, there have been far fewer MTFs launched than expected. Some people thought there could be up to 20, but in reality only two or three – including Chi-X and Turquoise – have made any real impact. I think it will be even harder this year for new entrants, because we are in a low-volume market and the primary exchanges are in a good position to capture the flow. For example, the MTFs did not benefit when the LSE had an outage last year, and while they have learnt some lessons I am not sure they would benefit today. One of the problems is that there is not that much differentiation between the platforms. The big question they will have to ask is how much time they should allow to attract enough liquidity to be successful. I would not be surprised to see some form of consolidation this year or next.

Do you think dark pools have been good for the market?

I do not think dark pools are for everyone or every order. For example, if you are buying Vodafone, there is no need to use a dark pool, but if you are a value investor with a buy order for a high percentage of average daily volume in a small-cap stock with thin liquidity, then a dark pool rather than a lit pool may be the better alternative. There is no perfect solution, and traders should look at all the options and conduct pre-trade analysis. The questions to ask are: what are they trying to achieve, and what is the timeframe? Is it a short- or long-term investment, and will there be any follow-on orders? What are the opportunity costs and the information leakage? How long should a trader be prepared to leave an order resting in a dark pool? There needs to be a balance between waiting long enough to see if the order will be crossed, the risk of the order being pinged and thereby increasing information leakage, and the opportunity cost of not being represented in a potentially volatile “lit” (i.e. displayed) venue.

What are your plans for the future?

For us the approach has always been to provide clients with not only the products they require, but also the tools they need to measure our performance, and we will continue to build upon our offering. We give them tick-by-tick, real time execution information and we plan to further enhance our post trade data and audit trail. The biggest challenges, of course, are the reduced volumes on both the buy- and sellsides and the market uncertainty. Clients are looking for liquidity and we will provide flexible solutions and more algos that enable clients to adapt to these evolving market conditions.

What do you see as the main challenges for the industry?

I think there needs to be more work in the area of clearing and settlement. MiFID introduced competition in the trading space, but clearing and settlement in Europe is still complicated and expensive. It is one of the main differences between us and the US, which has a single provider, DTCC. I believe that a single pan-European clearer would not only help market participants lower the total cost of trading but also add volume to the marketplace.

[Biography]
Stephane Loiseau is deputy global head of execution services of Société Générale Corporate & Investment Banking. He began his career with the French based bank in New York in 1996 as a trader on international equities before joining the programme trading team in London. He returned to New York in 2000 as head of the New York programme trading desk, and rose in the ranks to deputy head of global programme trading and then global head of the group in 2004. In mid 2006, Loiseau became co-head of global programme trading & electronic services and two years later, he relocated to London to take up his current position as well as head of execution for London.
©BEST EXECUTION

Smart Japan – Rapid pace of technology speeds adoption of alternative trading venues

By Hiroshi Matsubara and Philip Slavin

According to the World Federation of Exchanges, in December 2008, BATS was the third largest exchange on the planet: larger than Tokyo and London combined. It’s an incredible story for an exchange that no one had heard of three years ago, and which only launched in Europe four months ago.

The rate at which the markets have adopted alternative trading venues is a vivid illustration of the way the capital markets are changing. The single-market environment has disappeared in the US and Europe, with venues like BATS creating a credible alternative to traditional exchanges. As a result, in the US, more than 40 execution venues – incumbent exchanges as well as the newer electronic communication networks (ECNs) and alternative trading systems (ATSs) – compete with each other for share of order book trades. The growth of ATSs is such that they now have more than 30 percent of the trading share in US equities.

The story is similar in Europe, following the full implementation of the Markets in Financial Instruments Directive (MiFID) and the abolition of the concentration rules. The four multilateral trading facilities (MTFs) currently operational – Chi-X, Turquoise, Nasdaq OMX and BATS Europe – together account for 16 percent of European trading volume in large cap stocks.

The state of fragmentation in Japan
Just like algo trading before it, which has moved East from the US to Europe and Asia, market fragmentation will inevitably arrive in Japan. We are already seeing signs of it. Japan’s exchange concentration rule was abolished several years ago, and the move toward diversification of execution venues has started.

As in Europe, fragmentation has to date largely been driven by foreign brokers. Several have already introduced internal crossing services, and three block crossing networks are in operation. Whereas PTSs (the Japanese equivalent of an ECN or MTF) used to carry retail flows only, they are now absorbing institutional proprietary and agency flows by linking with leading institutional brokers. Domestic brokers who wish to compete should be prepared to invest now, at least by establishing their own internal markets and smart-routing orders between them and the TSE.

However, it is important not to overstate the current level of development. Although the share of off-exchange trading taken by alternative execution venues and broker principal bids is gradually increasing and now covers around 7 to 8 percent of total volume, fragmentation is still in its infancy in Japan, as is the adoption of smart order routing.

Indeed, 90 percent of all transactions in Japanese cash equities take place on the TSE. The Financial Instruments and Exchange Act (FIEA), the latest Japanese market regulation, in force since September 2007, is mainly focused on retail trading markets and is seen as weak when it comes to defining what is expected of institutional market participants in achieving best execution. It is perhaps not surprising, then, that the majority of Tokyo brokers define their best execution policy as “to execute on the Tokyo Stock Exchange.” Japanese institutional fund sponsors are forgoing some fund performance under the current state of the Japanese market.

Role of smart order routing
Nonetheless, one result of changes in the market, and of the role now occupied by PTSs, is that smart order routing (SOR) is beginning to establish a presence in Japan.

The route of a trade is moving, albeit slowly, away from the traditional route from buy-side blotter through a selected brokerage, on to a traditional exchange and then to a clearing house. Where SOR is deployed, the road to execution can be more diverse, as it becomes possible to break up orders, trade them separately on multiple venues and then clear through a variety of competing clearing and settlement regimes. These components then need to be reassembled so as to present a single, unified picture of the original order back to the end customer.
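The workflow described above – splitting a parent order across venues, executing separately, then reassembling a single unified fill for the customer – can be sketched in a few lines. This is a hypothetical illustration only; the venue names, quotes and greedy price-priority logic are invented, not a description of any broker's actual router:

```python
# Hypothetical sketch of the SOR workflow described above: split a parent
# order across venues, collect child executions, and reassemble a single
# unified fill for the client. Venue names and quotes are invented.

from dataclasses import dataclass

@dataclass
class Quote:
    venue: str
    price: float   # offer price for a buy order
    size: int      # displayed size at that price

def route(qty, quotes):
    """Greedy price-priority routing: take the best-priced liquidity first."""
    fills = []
    for q in sorted(quotes, key=lambda q: q.price):
        if qty == 0:
            break
        take = min(q.size, qty)
        fills.append((q.venue, q.price, take))
        qty -= take
    return fills

quotes = [Quote("TSE", 100.3, 400), Quote("PTS-A", 100.2, 300),
          Quote("PTS-B", 100.3, 500)]
fills = route(800, quotes)

# Reassemble the child executions into one unified parent fill.
total = sum(s for _, _, s in fills)
avg = sum(p * s for _, p, s in fills) / total
print(fills)
print(f"parent fill: {total} @ {avg:.4f}")
```

A production router would also weigh venue fees, clearing costs and latency, as the surrounding text notes; the aggregation step at the end is the "single, unified picture" the article refers to.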

To prepare for this eventuality, some of Tokyo’s leading brokers have already adopted SOR for the Tokyo market, and those that haven’t are investigating their options for introducing it. In fact, the concept of smart order routing is attracting a great deal of interest from buy-sides as well as sell-sides.

In particular, brokers are starting to recognise that they cannot get by without intelligent access to multiple liquidity sources. The ability to probe the optimal liquidity source dynamically and to route orders intelligently adds value to their execution services. Differentiation will depend on how smart the router is and where it sits in the value chain.

In doing so, they are following a path that their counterparts in America and Europe have already travelled. Where multiple trading venues are firmly established, SOR is regarded as an essential trading tool for delivering best execution.

Upgrading the system
However, SOR will only truly succeed if latency within the existing exchanges is improved. Without a low-latency platform, a smart order router’s interaction between external exchanges and internal crossing, for example, would not make sense.

The good news is that the incumbent exchanges are in the process of strengthening their trading system infrastructure in order to promote low latency and high data volume. What’s more, forthcoming market milestones such as the launch of Tdex+, TSE’s new trading system for options, in July 2009, as well as Arrowhead, the next-generation trading system for cash equities due in January 2010 should boost the market adoption of SOR. To help matters along, the OSE has also announced it will introduce a next generation trading system, and both the TSE and OSE have announced the introduction of co-location and remote membership schemes.

What is still needed, however, is the right regulatory framework. We believe that the Japanese FSA should look again at the adoption and definition of best execution duties, especially for asset managers serving institutional clients. More fundamentally, institutional fund sponsors, such as pension funds, should raise their awareness of the importance of trading costs to total fund performance. These steps will be critical for the creation of more efficient market trading structures and the full uptake of further electronic trading in Japan.

In Europe and the US, many firms have recognised the absolute need to implement a coherent SOR strategy, and Japanese firms are likely to follow. Whether this is deployed on site or via a third party, the net effect is the same – the computer is now taking a much bigger role in the overall decision-making process of how and where an order gets executed.

Beyond SOR
As we have established, SOR’s role of intelligently breaking up orders and sending them on to different venues is only part of the story. The resulting multiple executions need to be fed all the way through to the back office while still associated with the original order. Each execution will need to reflect the different trading fees and clearing regimes of the venue concerned. Another crucial part of the process is the ability to rewind the tape to see exactly how and why an order was executed. This is central to any best-execution policy, yet is often overlooked by pure-play SOR vendors.

It is essential that firms do not implement SOR systems thinking they will be acquiring a one-stop solution for all the challenges of trading across multiple venues; an SOR system without a complete smart workflow strategy behind it has very limited application. In addition, SOR technology needs to be integrated within the overall workflow, which may combine smart routing, DMA or other forms of sponsored access.

Therefore, SOR must be supported by, and operate in conjunction with, order management, market data, market connectivity, trade management, position keeping and audit trail. Although the markets themselves are fragmented, the technology package cannot be: smart trading functions must be tightly integrated into the trading platform.

Eventually, SOR and the other components that are needed for intelligent liquidity access will become embedded as standard features in quality order management systems, in much the same way as advanced trading tools and the ability to handle multiple asset classes have been. This is because SOR is fundamental to the way that trading works in a market place in which multiple venues trade the same stock.

The strong likelihood is that SOR will become the minimum requirement in a package of tools and strategies that are necessary for intelligent liquidity access. Organisations that can successfully combine smart routing across lit and dark liquidity together with real distribution and intelligent workflow will emerge the winners in this new environment.

Smart Japan in translation – The launch of sophisticated off-exchange electronic markets in Japan: the outlook for Japan’s advanced alternative execution venues, which have come into their own amid unstable global markets

By Yoichi Ishikawa
While exchanges around the world have taken heavy damage in the recent financial crisis, the Tokyo Stock Exchange too has seen weak share prices and volumes. Meanwhile, the expansion of alternative venues seen in the US and Europe over recent years is spreading to Asia and advancing rapidly in Japan. Yoichi Ishikawa of kabu.com Securities argues that in this turbulent period it is vital that the new alternative execution venues gain recognition and become established.

From the start of 2009 into February, trading in the Japanese equity market remained sluggish, with daily turnover on the Tokyo Stock Exchange falling below ¥2 trillion, raising concerns about liquidity risk. Moreover, the TSE’s share of turnover has long exceeded roughly 90% of all Japanese markets – an extreme concentration in a single venue that carries its own execution risk: with so many participants crowded into one place, it can be difficult to trade promptly at the desired price. Against this backdrop, several sophisticated new proprietary trading systems (PTSs) have emerged in the off-exchange market since 2008, and an advanced, electronic form of “best execution” – choosing the most favourable of several prices available for a single stock – has begun to take hold.

Off-exchange electronic markets come to life
Japan’s proprietary trading systems (PTSs) became possible when a 1998 legal amendment permitted electronic matching of off-exchange trades. Two firms began operating PTSs in 2001, but the systems failed to gain broad traction because they could not form prices in the same way as the exchanges and because participation was limited to a narrow set of investors. As a result, the PTS share of off-exchange trading remained below 2% from 2001 through 2006. A 2005 legal amendment then allowed PTSs to use the same auction method as the exchanges, and after kabu.com Securities launched a night-session PTS using that method in September 2006, the share grew to about 4% in 2007. Growth stalled, however, constrained by the night-time trading hours and a largely retail investor base. Then in 2008 the PTS industry saw the following changes in its environment, and the share expanded rapidly: about 5.5% for 2008 as a whole, and a record high of over 9% in the single month of November 2008.

  • Two PTSs (kabu.com Securities and one other firm) began daytime trading using auction-style methods similar to the exchanges, allowing investors to trade off-exchange on essentially the same terms as on-exchange.
  • These two firms began distributing real-time PTS price information, including multiple levels of depth, to global information vendors such as Thomson Reuters.
  • These PTSs made it possible to trade at finer tick sizes than the exchanges.
  • Sophisticated algorithmic trading, led by foreign securities firms, established a new style of trading that supplies liquidity to the PTS markets while comparing them against the primary market.
  • Two more firms were licensed to operate PTSs, bringing Japan’s total to six and raising awareness of PTSs among institutional and retail investors alike (see Figure A on the next page). The two new entrants, however, focus mainly on retail order flow.

Trading while comparing multiple prices
The figure at right shows “Fukuita PTS (Multi-board PTS)”, a tool kabu.com Securities provides to retail investors. It displays the order books of the exchange and kabu.comPTS side by side on a single screen and allows immediate execution. Similar multi-book tools that compare share prices across multiple exchanges and PTSs simultaneously are beginning to be offered by providers such as Thomson Reuters, and trading while comparing multiple prices and order books at a glance is starting to spread among investors. On the exchanges, because many participants specify the same price, orders often face long queues under time priority. On a PTS, investors can trade at prices unavailable on the exchange, or at the same price with a shorter queue owing to the venue’s smaller scale, making arbitrage trading possible — in effect, jumping the queue relative to trading on the exchange. Some PTSs also allow price increments as fine as one-tenth of the exchange tick size, which supports algorithmic trading and sliced orders. Through this queue jumping and the finer tick sizes, institutional investors have been able to save costs. Thanks to this more advanced trading environment, electronic trading driven by sophisticated algorithmic flow from foreign securities firms took hold at kabu.com Securities in the third quarter of fiscal 2008 (October–December 2008): average daily turnover rose about 16% quarter-on-quarter to roughly ¥1 billion, reaching a record ¥4.7 billion on October 7, 2008. Moreover, from late October 2008, while order counts fell as the overall market weakened under the financial turmoil, execution counts continued to climb and the fill rate rose from 0.6% to 1.0% (see Figure B on the next page).

Off-exchange electronic markets grow more sophisticated
With PTS markets equivalent to the exchanges starting up in 2008, electronic trading — notably algorithmic trading by foreign securities firms — has also grown ever more sophisticated. The new PTSs that follow the exchanges’ approach use essentially the same trading mechanism while allowing finer price increments. Sell-side trading systems are therefore required, under best execution policies, to automatically discover the venue (exchange or PTS) offering the better price and route the order there; leading foreign securities firms began offering such capabilities in 2009. These advanced systems are known as SOR (smart order routing) systems. SOR can automatically and electronically route orders to the most favorable venue while taking both price and cost into account. Building on the sell side’s adoption of dark pools since the year before last, SOR that selects the best price among many venues — exchanges, dark pools, and now PTSs — broadly supports institutional investors’ trading operations. With the Tokyo Stock Exchange’s new system scheduled to go live in January 2010 as an advanced, high-speed exchange, sell-side SOR systems will need the capacity to execute orders rapidly against these expanding electronic markets. And with best-execution orders becoming more sophisticated and smaller in size, trades that use VWAP (volume-weighted average price) on a PTS to execute large orders efficiently are also likely to become commonplace from 2009.
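The VWAP benchmark referred to above is simply turnover divided by volume over the relevant period. A minimal sketch, with hypothetical fills:

```python
def vwap(trades):
    """Volume-weighted average price over a list of (price, volume) trades."""
    turnover = sum(p * v for p, v in trades)   # sum of price x volume
    volume = sum(v for _, v in trades)         # total shares traded
    return turnover / volume

# hypothetical fills: (price, volume)
print(vwap([(100.0, 300), (100.2, 100), (99.9, 600)]))
```

A VWAP strategy slices a large order so its executions track this average, which is why it pairs naturally with the smaller, finer-priced child orders a PTS supports.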

Conclusion
The launch of multiple new proprietary trading systems (PTSs) in Japan from 2008, combined with the trend from 2009 toward highly electronic “best execution” at more favorable prices through institutional investors’ and the sell side’s SOR systems, should establish these venues as new execution markets that complement the exchanges both electronically and functionally, and help reduce execution risk and liquidity risk for the Japanese market as a whole.

Exchanges without borders – Meeting access and latency challenges across the Asia Pacific

By John Knuff
Can there possibly be a silver lining in the current financial meltdown? John Knuff of Equinix argues that now is the time to upgrade your infrastructure investment, allowing the Asia Pacific region to catch up with its US and European peers.

While the global financial crisis has inevitably had an impact on investment, most commentators seem to agree that the Asia Pacific market will see ongoing development, particularly as it continues to invest in the infrastructure and technologies that will allow it to match its US and European counterparts in key areas such as execution speed, easier market access and direct data feeds.
Analysts such as Celent see the current downturn as a significant opportunity for Asian exchanges, even suggesting in a recent report that they have the potential to overtake their US counterparts in the near future. Before this can happen, however, there needs to be sustained investment in the technology, skills and processes that will enable lower latency, easier access and faster data feeds across the region.
One of the key challenges remains the diversity of the region. While its geographical spread, and the vast distances involved, will always make it hard for traders to gain low-latency access to multiple market centers, the added complexity of local regulations and last-mile access makes region-wide performance goals even more difficult to achieve. Nevertheless, factors such as direct market access, the increasing presence of alternative trading systems and the introduction of crossing networks will have a tremendous impact once local regulations ease.
Investing to close the gap with other global markets
To assume that the different markets in the Asia Pacific region will progress seamlessly together towards a more deregulated and open environment would be unrealistic. The global financial crisis is already leading some Asia Pacific exchanges and regulators to be more defensive in their outlook. However, it also provides an opportunity for more traditional venues to develop and implement their own alternative trading strategies, so they can compete more favourably with new market entrants as conditions improve.

It is this imperative to remedy the handicap of limited bandwidth and slower trading platforms that is driving Asia Pacific financial institutions to continue updating their technology infrastructure. As many of the incumbent exchanges re-tool their matching engines and foster technology partnerships with global leaders like NASDAQ OMX and NYSE Euronext, the broker/dealer communities are quickly positioning themselves to be the partner of choice for many of their US and European counterparts.
Given this background, we believe it’s important for Asian market participants to ensure they are making the right infrastructure and connectivity choices today, to allow them to compete more effectively tomorrow.
A world of more end points and more trading venues
In an Asia Pacific market driven by the continued growth of automated and algorithmic trading, the emergence of new liquidity opportunities and increasing numbers of order destinations and market data sources, we’re increasingly going to see financial firms trading a much wider range of asset classes and instruments across broader geographies.

In those countries where the incumbent exchanges still handle the majority of trading, these new market developments will have a significant impact as local traders, limited by their current choices, increasingly send order flow to more accessible and transparent electronic markets. All this translates into more end points and execution venues, and is driving demand among financial services firms for a greater choice of networks with low-latency/high-bandwidth capabilities to support these higher message rates and optimise throughput.
With the landscape of the Asia Pacific market’s different trading centers evolving so quickly, it is becoming increasingly apparent that a strong element of foresight, and much broader connectivity options, will play as important a role as proximity when it comes to making location decisions across the region.

Finding solutions – Now in its 5th year, the FPL conference in Canada tackles the tough issues

What a lot can change in a year! Since the last FPL Canada conference, held in May 2008, Canada has been drawn into the liquidity crunch along with the rest of the world. Yet Canada has a risk and regulatory model that is different from many of its established trading partners, most notably the US and the UK. Can the world learn lessons from the Canadian experience?
With only weeks to go until Canada’s leading electronic trading event, it is still hard to predict what the credit crisis and regulatory environment will look like on June 1st, the first day of the conference. What we do know is that there will be increased regulatory involvement, particularly in areas that were previously not subject to scrutiny. In the run-up to the event, we asked a range of experts to comment on what they feel will be the hot topics at this year’s event.
Conference Hot Topics
(A) Market Volatility
Market volatility has certainly changed trading patterns. The increasing reliance on electronic trading, leveraged through Direct Market Access/Algorithmic Trading or a portfolio trading desk, is directly connected to the growing need to manage risk, volatility and capital availability. We have seen a dramatic uptake in these services/tools, as there has not only been a focus on how electronic trading is conducted, but also on the expectations of trading costs involved versus benchmarks. The greatest impact has been a higher use of electronic trading strategies relative to more traditional trading and a shift in the types of electronic trading strategies employed.
From a single-stock perspective, many traders were loath to execute at single, specific price points due to the potential for adverse percentage swings in the high single to double digits. Algorithms have been employed more frequently to help traders participate across intervals on an intraday basis, managing risk around the volatility. The violent intraday swings also create significantly more opportunities in the long-short space. Quantitative execution tools have become more of a focus, to take advantage of these opportunities on an automated basis.
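As a simple illustration of the interval-based participation described above, the sketch below slices a parent order evenly across intraday intervals — a basic TWAP-style schedule. The order size and interval count are hypothetical, and a real algorithm would also adapt each slice to live volume and price conditions.

```python
def twap_schedule(total_qty, n_intervals):
    """Split total_qty into n_intervals near-equal child orders,
    spreading any remainder across the earliest slices."""
    base, rem = divmod(total_qty, n_intervals)
    return [base + (1 if i < rem else 0) for i in range(n_intervals)]

# e.g. a 10,000-share parent order over 13 half-hour intervals
print(twap_schedule(10_000, 13))
```

Spreading executions this way is exactly how traders avoided committing at a single price point during the volatility spikes.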
What the industry is saying:
“In terms of disruptions relating to market volatility, the Canadian trading infrastructure generally held up admirably. Most dealer, vendor, and marketplace systems handled the massive increases in message traffic and activity with little noticeable impact on performance. This is proof positive that the investment in capacity and competition was well worth it and is now paying dividends.” Matt Trudeau, Chi-X Canada
“Electronic platforms using FIX and algorithmic routers handled significant market fluctuations with no impact to performance. Speed to market for orders made it possible for traders to minimize exposure to huge swings in pricing and to capitalize on opportunity.” Tom Brown, RBC Asset Management
“Many traditional desks were shell-shocked and did not know how to respond to the volatility combined with the lack of capital. Electronic trading tools like algos enabled people to manage extremely volatile situations with great responsiveness. They were able to set the parameters for their trades and let the algo respond as market conditions warranted. Trading of baskets/lists made having electronic execution tools critical. You couldn’t possibly manage complex lists in real-time without very sophisticated electronic front-end trading tools.” Anne-Marie Ryan, AMR Associates
“The recent volatility spike means that risk will likely be scrutinized more in the future than in the past. Post-trade transaction cost measurement systems generally do not consider risk but instead focus on cost. To properly align the interests of the firm and the trader, performance measurement systems will need to reflect both cost and risk considerations.” Chris Sparrow, Liquidnet Canada
Jenny Tsouvalis of Ontario Municipal Employees Retirement System (OMERS) sees a need for effective integration of investment management and trading processes. “On-line, real-time electronic trading systems provide quick access to liquidity and, when coupled with real-time pricing embedded into blotters, identify the effect of market changes on the portfolios and the effect of trading decisions.”
“Electronic trading has been successful because of its ability to be adaptive, so it is likely to change in reaction to current issues.” Randee Pavalow, Alpha Trading Systems.
“We’ve seen an increase in the use of algorithms and over the day orders as volatility has increased. The ability to smooth orders over a longer period limits the exposure to price swings during the day. VWAP, TWAP and percentage of volume, seem to be the algos of choice for many these days.” John Christofilos, Canaccord Capital
