
Enabling transparency in an increasingly fragmented market: Addressing EMEA challenges with a more consolidated view

By Artur Fischer
In this article, Equiduct Trading’s Joint CEO Artur Fischer argues that in times of extreme structural and economic change, the requirement for transparency is greater than ever. In an increasingly fragmented market, he believes, organisations risk effectively trading in the dark, with no clear consolidated view of market pricing. Here he identifies the growing requirement for a new generation of virtual order book that can consolidate all the visible pre-trade information generated by the significant relevant markets, effectively delivering transparency and providing firms with access to a single, unbiased source of pan-European equity price data.
The European equity markets have undergone a period of rapid and unprecedented change over the past two years. While some of these shifts have been driven mainly by the still-evolving global economic situation – leading to the disappearance or restructuring of some of the biggest names in finance – others have centred on newly introduced regulation, with the arrival of new types of execution venues and cross-border clearing venues among the most obvious and significant.
These changes have created some huge challenges (and, it should be said, equally huge opportunities) for market participants, whether they be large broker-dealers having to connect to all the new trading venues in search of liquidity, or a pension fund simply trying to understand what the “Best Execution” it has been promised actually means.
Each of the incumbent exchanges, the new Multilateral Trading Facilities (MTFs), and the growing number of Dark Pools or Crossing Networks offers an alternative USP for the execution of equity orders, and each operates with a slightly different business model – both pre- and post-trade. This has understandably stimulated competition for order flow liquidity, introduced alternatives in the post-trade space, and led to a major shake-up in fees. Not surprisingly, it has also irreversibly fragmented liquidity. However, this fragmentation is an evolving process; the picture is far from complete or even stable, and can be expected to go through several consolidation and subsequent fragmentation phases before the next “Big Bang”.
Opening up the European equity markets
With new entrants into the execution space, Europe’s equity market is opening up for investors from across the world. FIX-compliant technology is enabling easier connectivity to the new venues and providing an opportunity for a wider range of firms to get access to venues over and above the incumbents. In Vol 2 Issue 8 December 2008 of FIXGlobal, John Palazzo of Cheuvreux stated “FIX affords every broker the ability to get into these markets at an unprecedented pace” – at Equiduct we certainly agree, but there are still some considerable challenges.

How, for example, do “sell-side” firms determine whether they should connect to these new venues? How do they then prioritise which to connect to? How do they choose where to actually send their order? Also, how do “buy-side” investors understand which venues their brokers should be connected to, if those brokers are to deliver the mythical Best Execution? What price should they be using to mark-to-market at the end of each day and for intraday position risk purposes?
At Equiduct, we’re hoping to provide some of the answers to these important questions. We hope to be able to shed some light on the situation and show how to achieve best execution on the various available platforms with a range of analytical tools. Uniquely, the toolset includes a Pan-European aggregated feed.
Ensuring execution on the most appropriate platform
Firms across the trading spectrum, whether small or large, are increasingly using sophisticated smart-order-routing solutions and algorithmic trading systems to “slice” orders and to determine where they should distribute the pieces across the Dark Pools, MTFs and exchanges. However, for these systems – and indeed an individual trader – to start effectively predicting the future, it is important to understand the present and the past. Information providers such as Markit, or Fidessa with its Fragmentation Index, can confirm the common knowledge that liquidity fragmentation is a reality once a trade has been executed. However, they cannot see how the market should have performed, because they do not examine the pre-trade order and price information that was available at the time of trade.

At Equiduct we have been collating all visible pre-trade information (Level II data) for the top 700 shares across Belgium, France, Germany, The Netherlands and the UK from the major European venues (BATS, Chi-X, NYSE Euronext, London Stock Exchange, Nasdaq OMX, Turquoise, Xetra) since April 2008. Yes, a significant percentage of order flow has moved away from the incumbent exchanges, but what is not such common knowledge is that trades are still not always executed on the most appropriate platform. Indeed, our analysis shows that in April 2009 a significant proportion of trades executed on the incumbent exchanges should have been transacted on an alternative venue, and approximately 35% of executed trades are still not transacted on the best price venue. Significant price improvement could have been achieved if this had happened. (See Diagram 1)

To give the market a view into what is actually happening, Equiduct has created a range of products that provide both real-time and historical data analysis, including a powerful tick-by-tick Level II data store that can be used in real time or as a market reconstruction facility. This allows the user to simply select either a precise time point (with millisecond granularity) or an individual trade to see the corresponding consolidated book display at that very moment, and to check the quality of the execution. The tool is also able to regenerate the graphical market fragmentation and liquidity picture over different time periods, or the optimal split for a trade given the prevailing market conditions at the point of actual or theoretical execution.
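To make the idea of a consolidated virtual book concrete, the sketch below merges simplified per-venue Level II snapshots into a single ladder and reads off the European best bid and offer. It is a minimal illustration only – the venue names, prices and data layout are assumptions, not Equiduct’s actual feed or implementation.

```python
# Hypothetical per-venue Level II snapshots: lists of (price, size) levels,
# bids and asks each sorted best-first. Numbers are illustrative only.
books = {
    "Chi-X":    {"bids": [(10.02, 500), (10.01, 900)], "asks": [(10.04, 400), (10.05, 700)]},
    "Xetra":    {"bids": [(10.03, 300), (10.01, 600)], "asks": [(10.05, 800), (10.06, 200)]},
    "Euronext": {"bids": [(10.02, 700), (10.00, 400)], "asks": [(10.04, 600), (10.07, 300)]},
}

def consolidate(books, side):
    """Merge one side of every venue's book into a single virtual ladder,
    best price first, tagging each level with its venue of origin."""
    levels = [(price, size, venue)
              for venue, book in books.items()
              for price, size in book[side]]
    # Bids rank high-to-low, asks low-to-high.
    return sorted(levels, key=lambda lvl: lvl[0], reverse=(side == "bids"))

virtual_bids = consolidate(books, "bids")
virtual_asks = consolidate(books, "asks")
print("EBBO:", virtual_bids[0], virtual_asks[0])
# EBBO: (10.03, 300, 'Xetra') (10.04, 400, 'Chi-X')
```

Against such a virtual ladder, an execution can be checked after the fact: a fill at a price worse than the consolidated best available at the trade’s timestamp is a candidate for the missed price improvement the analysis above describes.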

Streamline, upgrade and relaunch: Nomura shares the technology challenges of its Lehman Brothers acquisition in uncertain times

By Adam Toms
Seven months on from Nomura’s acquisition of the Lehman Brothers equities business in Europe and Asia, the integration process is now complete. Nomura re-launched its trading platform in January 2009 and is fast climbing up the rankings on the London Stock Exchange. Adam Toms, head of the Market Access Group at Nomura (EMEA), charts the challenges and opportunities associated with restarting a business during an economic downturn.
Nomura already had an established product offering in Japan and a leading position on the Tokyo Stock Exchange; the Lehman acquisition has added scale and depth to our global execution capabilities, broadening the distribution network across clients globally and improving global liquidity access for clients. In Europe we took advantage of a three-month window out of the market to focus on relaunching with a lower latency, higher capacity, more technologically advanced offering. In addition, we recognised that equity markets are changing from both the execution and client requirement perspectives. Through analysis of the products and services currently being used across the Street, we have aimed to position our business for current market conditions and to build in the flexibility to adapt to the changing dynamics of the equity markets in future years.
Whilst the opportunity to rebuild a business from a standing start was compelling, the challenges involved in doing so were immense. In the three-month period prior to the launch of the Nomura platform, the entrance of new trading venues across Europe continued at a great pace. While this has taken second place to the greater financial services turmoil, it is nonetheless a key development which will have a profound effect on the way the European markets operate. From a business resumption perspective, we needed to reconnect the business to 31 different exchanges and MTFs in 25 countries in Europe. There are numerous third-party solutions available for venue connection in place of a direct membership, but we believe direct access is the key to enabling us to utilise our own technology in a homogeneous way across the platform. This gives us the opportunity to build ultra-low latency infrastructure, and not be reliant on external providers for our core business. As part of this, to ensure we are able to provide a highly robust platform with built-in failover provision, we run parallel data centres, each with two connections to each trading venue, giving us four routing options for business continuity purposes.
We believe that this accelerated production environment has enabled us to attain significant performance improvements and latency savings across the board. We’ve worked with technology partners Intel Corporation and SeaNet Technologies Inc in the Intel FasterLAB to evaluate and then tune new hardware and software, reducing engine latency across the core algorithmic strategies by up to 94%, from 10 milliseconds to, in some cases, less than 640 microseconds. Migration to dedicated, ultra-low latency market data feeds has improved access speeds by over a third, with immediate impact on slippage numbers. To further enhance the user experience, innovative anti-gaming logic has been applied to Nomura’s dark pool crossing engine, NX.
The development of smart-order-routing (SOR) technology has also been a priority, as this has represented a significant tangible change in European equity markets over the last 18 months. In the immediate post-MiFID period, the industry initially placed a lot of attention on whether or not SOR was offered, in a binary sense. Due to significant differences in the sophistication and logic offered by providers, dealing desks are now increasingly starting to differentiate on how good the technology is. In choosing a SOR, the most important consideration should be whether it offers better execution and price improvement, and this is not achieved simply by connecting to a greater number of MTFs. Instead, it is the ‘smartness’ of the SOR that will determine the outcome, including taking into account factors such as lit and dark venues needing to be treated differently. We have invested heavily in our SOR, adding distinctive logic and sophisticated algorithms to maximise liquidity capture and execution likelihood.
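At the core of any SOR is the decision of how to split a marketable order across venues. The sketch below shows the simplest version of that idea – greedily sweeping a consolidated ask ladder, best price first. It is a deliberately stripped-down illustration under assumed data, not Nomura’s router: a production SOR would also weight venue fees, fill probability, latency and dark-versus-lit treatment, as the discussion above notes.

```python
def route_order(qty, ask_levels):
    """Greedily sweep a consolidated ask ladder of (price, size, venue)
    tuples, best price first, returning child-order slices per venue."""
    slices, remaining = [], qty
    for price, size, venue in ask_levels:
        if remaining <= 0:
            break
        take = min(remaining, size)
        slices.append((venue, price, take))
        remaining -= take
    return slices, remaining  # any residue is left for a resting strategy

# Illustrative consolidated asks, best price first.
asks = [(10.04, 400, "Chi-X"), (10.04, 600, "Euronext"), (10.05, 700, "Xetra")]
children, residue = route_order(1200, asks)
print(children)  # [('Chi-X', 10.04, 400), ('Euronext', 10.04, 600), ('Xetra', 10.05, 200)]
```

The unfilled residue is where much of the ‘smartness’ lives: deciding whether to rest it passively, ping dark venues, or wait for the book to replenish.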
As the data demands and the complexity of algorithmic trading engine logic continue to increase steeply, Nomura has made significant investment in hardware to accommodate, and provide future capacity for, the huge increase in both micro decisions and order transactions. Each algo instance, for example, has had its capacity increased from 100,000 orders per day to in excess of 4,000,000.
Client connectivity presented an even more significant challenge. Previously, we had in excess of 2000 client FIX connections in place globally. We offer a flexible solution to connect to various execution channels, including Futures, Cash, PT, Algos and DMA, via the same or individual connections.
A significant number of these connections, particularly for electronic trading products, are via third-party vendors. We had to rapidly certify an enhanced set of algos, rather than electing for a more restrictive basic / pre-canned approach, onto a number of strategically key vendor platforms. Partnering with these vendors, we were able to relaunch with 15 vendor platforms live, and have subsequently expanded this further.
Exchange connectivity, vendor integration and direct client configurations also need to be replicated across the US and Asia products. This presents interesting technology challenges in identifying where single connections can be used for multiple regions, or where clients would like to elect for specific routes. There is also overlap available for vendor integration, where we can replicate algo enrichment screens, but the back end needs to reflect the regional and model differences.
On a broader technology level, the main challenge has been integrating front office and back office components. This was particularly important from a liquidity management perspective to ensure that all the execution channels across the floor have access to the same trading technology. This enables all order flow to be routed via our NX dark pool and onwards to our SOR, maximising crossing potential. Linking all the teams together and unifying our systems, while developing new products, has proved a substantial test for the IT team, but one that was achieved within a compressed timeframe.
The relaunch of a business of this scale has been a huge challenge from a technology perspective. But we took this opportunity to improve and re-optimise our current and future operations. The sheer complexity of both the internal and external integration work has underlined the reliance we have as a business, and as an industry, on a great range of technology and all the nuances that go along with it. Despite some interesting interpretations we have encountered along the way, the presence of FIX as an industry standard protocol has been fundamental in achieving this scale of business resumption with clients in such a short period of time.

Missed opportunity? Japan’s TSE should look to TOCOM to see the benefits FIX offers in improving efficiency and attracting liquidity

By David Chapel
In January 2010, the Tokyo Stock Exchange (TSE) launches its new, high-profile Arrowhead trading system for cash equities. TSE defines Arrowhead as its next-generation trading system, combining low latency with high reliability and scalability. However, to the surprise of many in the global trading community, Arrowhead will not include a FIX gateway. MetaBit Systems’ David Chapel examines that decision and argues that the Japanese exchange should reconsider.
In September 2008, the Tokyo Stock Exchange (TSE) announced its plans for ‘Remote Trading Participant Services’ that would allow offshore firms, with no branch in Japan, direct market participation – a novel proposition in Japan. Initially, during the tender phase, TSE considered offering a FIX gateway to the new Arrowhead system; however, a survey of exchange participants indicated that broker members had minimal interest in the global communication protocol.
The survey results came as a surprise to many, but a closer look at the TSE broker membership shows that it leans heavily towards smaller domestic players. Currently, TSE has 106 broker members, including foreign securities firms that have registered their local entities as a ‘General Domestic Trading Participant’, plus 11 foreign members. Of these, only around 40 members (split almost evenly between foreign securities firms, including those registered via their Japanese legal entity, and domestic securities firms offering international trades) would be likely to see the need for a FIX gateway.
Given the advantages that a FIX gateway would have offered to Arrowhead’s future Remote Trading Participants, interested in the protocol’s ability to interact with a local exchange through a standardised API, we would urge the TSE to reconsider its decision, despite the current low demand among its domestic members.
Why not FIX it?
FIX is not a new concept in Japan. At MetaBit, we have offered a FIX-to-native exchange gateway to all major securities exchanges in Japan since 2004. The development of Arrowhead provided our company with the catalyst to re-architect the existing FIX gateway with a focus on low latency and scalability. This need for speed was particularly important given the belief among exchanges that FIX is slow. Our aim was to bring FIX-to-native exchange connectivity below 500 microseconds of additional latency under sustained load, a goal which we have more than achieved. In addition, we have made extensive use of the Orc CameronFIX Universal Server to provide the core FIX connectivity.

Technical challenges and solutions for FIX
During the development of our upgraded FIX gateway, we had to consider the following technical challenges of providing a FIX implementation for direct exchange connectivity:

  • Performance is paramount; request latency (time to exchange) and request throughput (requests per second) represent the key metrics. Larger global members require sub-millisecond latency with throughput exceeding 1000 orders/second.
  • For scalability, most Japanese exchange architectures require multiple physical connections (so-called Virtual Servers, or VSs). Each VS is limited to a certain throughput by the exchange; higher throughputs can only be achieved by a broker member subscribing to more VSs. Due to cost, smaller and mid-tier brokers typically subscribe to 20 to 40 VSs, whilst it is common for large members to run above 100 VSs per exchange. Efficiently managing and load balancing across such a large number of connections leads to a significant increase in complexity.
  • Each exchange API has a unique message protocol and message structure. Creating a standardised multi-exchange product requires custom FIX mapping for each implementation. The aim is to keep custom FIX tags to the minimum possible whilst adhering to the global FIX Protocol.
  • The FIX API is a simple asynchronous model, well suited to high-throughput bidirectional messaging. However, Japan’s older exchange APIs use a synchronous delivery model, providing batched order requests (20 orders per batch) to increase throughput. This complicates mapping between the FIX and native exchange APIs and increases implementation complexity (a sketch of this batching appears after this list).
  • Members can run many different types of hardware and operating systems; hence a vendor needs to support as many systems as possible.
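The batching and VS load-balancing challenges above can be sketched together. The fragment below is a minimal illustration, assuming the 20-order batch limit mentioned above and a simple round-robin spread across a member’s VS connections; the class and method names are hypothetical, not MetaBit’s actual implementation.

```python
from collections import deque
from itertools import cycle

BATCH_SIZE = 20  # the exchange's synchronous API accepts 20 orders per batch

class VirtualServerPool:
    """Round-robin dispatcher over a member's VS connections. Names are
    hypothetical, for illustration only."""

    def __init__(self, vs_connections):
        self._next_vs = cycle(vs_connections)
        self._pending = deque()

    def on_fix_order(self, order):
        """Called asynchronously for each inbound FIX order message."""
        self._pending.append(order)
        if len(self._pending) >= BATCH_SIZE:
            self.flush()

    def flush(self):
        """Drain up to one batch of pending orders onto the next VS."""
        if not self._pending:
            return
        batch = [self._pending.popleft()
                 for _ in range(min(BATCH_SIZE, len(self._pending)))]
        vs = next(self._next_vs)
        vs.send_batch(batch)  # hypothetical call; blocks until acknowledged
```

A real gateway would also flush on a short timer, so that a lone order is not held back waiting for nineteen more, and would monitor per-VS throughput rather than rotating blindly.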

DMA tops the bill – Delegates at the Face2Face event in Malaysia urged greater intra-exchange connectivity and DMA trading

By Stephanie Lawton

Advances in electronic trading technologies, including DMA, and the role of FIX in streamlining global investment strategies were top of the agenda at the FIXGlobal Face2Face event, held in Kuala Lumpur, Malaysia on the 21st of May. FIXGlobal’s Stephanie Lawton reports.
More than 150 delegates attended the FIXGlobal Face2Face event, a day-long gathering of professionals from all sides of the trading community. With Direct Market Access trading due to start later in the year, the event was held at the Bursa Malaysia in Kuala Lumpur.
Opening the event, UBS’ Owen Tomlin, Head of Direct Execution Sales for SE Asia, gave an overview of the latest developments in the FIX Protocol, its impact on the regional trading environment and the advantages it offers for both the buy and sell-side of the industry.
Tomlin was followed by an exchange between Invesco’s Regional Head of Operations, Dean Chisholm, and Rebecca Thomas of UBS Investment Bank’s Direct Execution Sales & Marketing. The topic was ‘The Importance of Electronic Trading to the Buy-side.’
The speakers conceded that front-end capital expenditure was unavoidable for the adoption of electronic trading hardware and software, but agreed that prices were likely to come down as the technologies became more widespread. The advantages, however, were now indisputable. The need to manage compliance and risk, coupled with an increasing focus on global investment strategies, meant that electronic trading technologies were now an essential tool in navigating global markets and regulatory requirements. The role of the trader, they believed, would now be more focused on cost analysis and understanding market impact on trading strategies.
Tasked with updating delegates on the priorities of the Malaysian Securities Commission, its Head of Institution Supervision Department, Market Supervision Division, Puan Shareena Mohd Sheriff, said the Commission had looked at best practice in similar jurisdictions overseas before formulating guidelines to clarify the kind of electronic facilities that would be regulated under the Capital Market Services Act.
The next session opened with Peter Tierney, MD Asia region, NYSE Technologies, adding an international perspective on the changing trading environment for exchanges and cautioning the delegates that, with the proliferation of electronic trading, the onus was on the industry to become more efficient.
Tierney was joined by Bursa Malaysia’s CIO, Jit Jee Lim, who spoke about the challenges faced by the domestic exchange. Among the major projects close to fruition was the launch of DMA. The bourse was also looking at de-coupling the business of the exchange, looking at co-location and exploring new OMS and certification. In terms of FIX, Lim said the industry needed to embrace the Protocol and that Bursa Malaysia would support them by being FIX-enabled and capable of leap-frogging to the latest versions. Lim added that the bourse worked closely with the local regulator and that the ability to trade across multiple Asian exchanges was still under discussion.

Taming The 800-Pound Gorilla – Encouraging FIX Compliance Amongst Exchanges Globally

Q: Where does an 800-pound gorilla sit?
A: Anywhere it wants.


I landed my first job in the financial industry back in the halcyon days of the mid-1990s. From my naïve perspective, everything was wonderful. Stocks only went up. Nobody worried about a company’s P/E ratio, or whether it was capable of turning a profit. If it had a “.com” in its name, it was a guaranteed winner. A website that resold airline tickets had a higher market capitalization than the airlines. Within the financial industry, technology was King, and with money growing on trees, it seemed firms had unlimited IT budgets and armies of software developers.
Exchanges, in the mid-90s, were the 800-pound gorillas of the day. They could implement proprietary standards, and firms had no choice but to adopt them. With liquidity centralized on the exchanges, there was no real competition, so everyone in the industry let the gorillas sit wherever they wanted. And with unlimited IT budgets and armies of software developers, accommodating the seating needs of several gorillas wasn’t all that painful.
What a difference a decade makes …
Times have changed. With the global economic crisis, IT budgets and headcounts are shrinking across the industry, and costs must be justified. Liquidity in many regions has become decentralized, with new generations of ECNs and ATSs taking market share from central exchanges. Inter-exchange linkages mean that firms don’t have to connect to everyone to achieve price protection and best execution, although, obviously, exchanges would prefer firms access them directly and not through their competitors. Firms are discovering they now have far more choices in finding liquidity, and they have less reason to tolerate expensive proprietary or non-standard interfaces.

Is FIX the solution?
FIX provides benefit to the financial industry in the form of a “network effect.” In theory, work done to support FIX for one market can be reused with other markets, effectively sharing the development cost. Proprietary protocols lack the ability to generate network effects and increase costs for the industry. A market that only supports a proprietary protocol is like an 800-pound gorilla that wants to sit on my laptop. The resulting situation is clearly agreeable to the gorilla, but to me it’s expensive and inconvenient.

Proprietary protocols are not the only method of escalating market participants’ costs. While many markets claim to implement FIX, some deviate from the standard protocol to varying degrees, which drives up costs. Deviations usually fall into three categories: session, syntax and semantics.

Session costs
With a wide variety of established commercial and open-source FIX engines, session level deviations are few, but notable examples do exist. In these cases, the cost to a market’s participants can be severe. Firms using commercial FIX engines usually do not have the ability to modify an engine to support non-standard session behavior, so firms find themselves at the mercy of their engine vendors. They, in turn, are left scratching their heads wondering why the particular exchange deviated from the session standard and how they can recoup their costs to work around the exchange’s issues.

Syntax costs
Deviations in syntax are more common. These may include failing to send required fields, or forcing participants to send custom, user-defined fields that are not required by the FIX specification. While these do carry some expense and inconvenience, they are not usually catastrophic.

And semantics
Semantic deviations, however, are a market participant’s worst nightmare. The FIX Protocol defines precise semantic usage. Business processes are represented by standard transactional flows of messages, which themselves carry defined identifiers, states and key fields. Market participants’ systems are built assuming certain fundamental semantic behavior. Any nonstandard semantics will often require costly changes and could impose considerable additional latency in the participant’s system.

The costs to the industry of nonstandard behavior include analysis, development, QA, deployment, and integration testing. Firms who use vendor products must convince their vendor to support the non-standard behavior and must wait until the vendor can find the time to create, test and release the change. Just as the “network effect” can reduce costs for standards compliant implementations, nonstandard behaviors multiply these costs. Every $10,000 change to support nonstandard behavior made by 1,000 market participants costs the industry $10 million.

What makes them tick?
Understanding the gorilla’s psychology may be useful in correcting its behavior. Gorillas usually choose to deviate from the protocol and sit on inappropriate objects for one of three reasons: they don’t know any better; it allows for easier interfacing with their internal systems; or they believe they are implementing new business functionality not supported by FIX.

Finding the knowledge
Knowledge of the FIX Protocol is not hard to acquire. The specification itself contains extensive documentation. Posting a question in the FPL website’s discussion forums (www.fixprotocol.org/discuss/) often yields a quick answer. Firms can collaborate with FPL directly and markets can solicit comments from their participants on their Rules of Engagement documents. Understandably, participant firms who will be connecting to the market have a vested interest in pointing out any non-compliant functionality that will be costly to implement.

Taming the legacy systems
Legacy systems are often the cause of the most difficult incompatibilities in semantics. An exchange may think it is saving money by directly exposing legacy semantics not compatible with FIX to its members. FIX compliance may require that the exchange perform translations or modify its legacy system to accept FIX semantics, and that might increase the exchange’s cost. But doing so benefits the industry; it ensures compatibility with the considerable quantity of FIX software that expects standard FIX semantics, reducing every market participant’s cost. Standard implementations reduce industry cost, while non-standard implementations pass the buck to the market participants and multiply the costs for the rest of the industry, which is never a good idea.
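As a small illustration of the kind of translation involved, the sketch below maps invented legacy status codes onto the standard FIX OrdStatus (tag 39) values that members’ FIX software already understands. The legacy codes and the function are hypothetical; the tag 39 values themselves are standard FIX.

```python
# A minimal translation shim of the kind an exchange might run in front of
# a legacy matching engine: proprietary status codes are mapped onto the
# standard FIX OrdStatus (tag 39) values members' software expects.
# The legacy codes here are invented for illustration.
LEGACY_TO_FIX_ORDSTATUS = {
    "ACK":  "0",  # New
    "PART": "1",  # Partially filled
    "DONE": "2",  # Filled
    "CXL":  "4",  # Canceled
    "REJ":  "8",  # Rejected
}

def translate_status(legacy_code):
    try:
        return LEGACY_TO_FIX_ORDSTATUS[legacy_code]
    except KeyError:
        # Exposing an unmapped legacy state as a user-defined value would
        # push the cost onto every participant; better to fail loudly.
        raise ValueError("no standard FIX mapping for %r" % legacy_code)
```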

Standardizing innovation
Innovation is vital to the financial industry, and markets should be encouraged to create new functionality. But this can lead to a proliferation of user-defined fields and messages, which sharply reduces the “network effect” and increases costs for the industry.

Most market-specific features I have seen implemented as user-defined fields are already part of the most current FIX standard. Even with resources such as FIXimate (www.fixprotocol.org/FIXimate), with more than 1600 tags, it can be difficult to find functionality if one doesn’t know where to look. FPL also adds new features to the FIX Protocol in response to industry needs. For example, functionality that must be user-defined in a FIX 4.2 session may be standardized in FIX 4.4. But if new features needed by a market haven’t yet become standardized, the best course of action is for the market to request that FPL add the functionality.

Speed up the changes to encourage innovation
Prior to FIX 5.0, new versions of the protocol were released infrequently, often 18 months apart. This simply was not a viable model to support innovation. Now, FPL releases Extension Packs, which are bundled into Service Packs for FIX 5.0. An exchange could write a Gap Analysis proposing a change, which would then be approved and adopted as an Extension Pack. For minor changes, the entire process could be completed within two months. FPL would then publish the Extension Pack, complete with tag numbers and enumerations, allowing for immediate adoption.

Reducing implementation costs for the industry is important to FPL. I work with FPL’s regional representatives to collaborate with exchanges, ATSs, ECNs, and regulators to promote standardization.
The latest FIX version (currently FIX 5.0 Service Pack 2) provides the most value to an industry struggling with limited development resources. Numerous ambiguities and unsupported functionality in earlier versions are clarified in FIX 5.0 SP2. Numerous proprietary, incompatible methods for representing things in older versions of FIX, such as FIX 4.2, are replaced with standard representations in the latest version. And FPL can only add new features to the latest FIX version in the form of Extension Packs. Fortunately, the cost to support FIX 5.0 SP2 isn’t onerous. Existing FIX order handling semantics have remained largely unchanged since FIX 4.4, few of the subsequent changes are mandatory, and all changes are stored in a machine-readable Repository.

Watch the market leaders
Several forward-thinking firms have seen the benefits of standardization, and we are now seeing a number of exchanges globally supporting the FIX Protocol, as well as its use for market regulation, with growing support for the latest release, FIX 5.0 SP2. One such organization, the Investment Industry Regulatory Organization of Canada (IIROC), responded directly to its members’ needs. When IIROC polled its member exchanges and ATSs on the format of the regulatory feed each market must use to report transactions to its new market surveillance system, they were fully supportive of using the FIX Protocol. The markets and IIROC wanted to use the most standardized specification possible and avoid user-defined fields. FIX 5.0 SP2, with an Extension Pack to support IIROC’s needs, was the logical choice.

I worked closely with IIROC and observed first-hand that, while a myriad of user-defined fields dominates the Canadian markets, existing FIX 5.0 SP2 functionality rendered almost all of them unnecessary. The changes IIROC required were small and didn’t need a single additional field. Initial market reactions have been largely positive. IIROC’s collaboration with FPL has laid the groundwork for standardization and substantial cost savings among the Canadian markets.

It comes down to cost
In today’s competitive landscape, any market, be it an 800-pound gorilla or a smaller player, faces a serious competitive disadvantage if its FIX interface is nonstandard. Would a broker or vendor with limited resources choose a difficult, custom integration job for one market, or use those same resources to connect to five competitors who all follow the standard? Likewise, a market’s new, innovative features are worthless if brokers and vendors don’t implement them. Features using standard FIX fields that can be reused by other markets are more attractive than user-defined FIX fields that are limited to one market.

In these difficult economic times, a nonstandard FIX interface is truly a competitive disadvantage. Fortunately for the industry, many market places have noticed this trend and have chosen standardization. FPL welcomes the opportunity to collaborate with trading venues and regulators to promote standardization, leading to decreased costs for the industry.

Standardising the message: Collaboration, clarity and flexibility key to current and future success of global messaging standard

The Investment Roadmap, a guide created through a collaboration between the world’s leading messaging standards bodies to provide consistent and clear direction on messaging standards usage, was released in May 2008. A year later, FPL’s Operations Director, Courtney Doyle, asked the authors of the roadmap for their assessment of its impact on the industry and how they saw its future evolution.

Courtney Doyle: What was the motivation and purpose behind this Investment Roadmap collaboration?

Genevy Dimitrion (ISITC Chair): Today, Market Practice and Standards are inextricably linked and yet for all our efforts, Market Practice exists as a mere documented recommendation. However, we see convergence between the two areas to the point where market practice can be encapsulated directly into the standards themselves. The roadmap has a role to play in this evolution, and we see it as critical for the next generation of standards. It also helps to provide future users direction on which messaging standards should be used throughout the trade lifecycle.

Jamie Shay (Head of SWIFT Standards): Our involvement was driven by clear market demand from our customers. They needed clarity about which syntax to use in which space and, more importantly, they wanted a way to make these standards interoperable. SWIFT and FIX have been working together for a number of years towards a single business model in ISO 20022. The decision to work together on an Investment Roadmap was a natural progression. It provides clear direction with regard to messaging standards usage as it visually “maps” industry standards protocols, FIX, ISO and FpML, to the appropriate asset class and underlying business processes.

Karel Engelen (Director and Global Head Technology Solutions, ISDA): Similar to SWIFT, ISDA felt it was important to map out the coverage of each of the different standards so people could get a complete view of the industry standards at a glance. We also thought it would be a useful tool for determining where we had duplication of effort or functional gaps to complete.

Scott Atwell (FPL Global Steering Committee Co-Chair): As the others have pointed out, we needed to provide greater clarity as to how the various standards ‘fit together’. We sought an approach that recognizes, leverages, and includes the financial industry standards without reinventing and creating redundant messages that generate cost and confusion for the industry. The effort was named ‘Investment Roadmap’ as its founding purpose was to aid industry firms’ technology investment decision making.


Courtney Doyle: The roadmap was released over a year ago. Is the community referring to it and using it as a guide on how to invest in standards? How should firms use it and what does it mean for them?

Scott Atwell: FPL has found the roadmap useful as a tool for standards investment. However, the roadmap benefits are multifaceted. It has driven an even greater level of collaboration and cooperation amongst our standards organizations. Discussing it often serves as a ‘conversation starter’ that leads to healthy discussion and debate. It has also served to facilitate key changes to the ISO 20022 process such as the ability for FIX to feed the ISO 20022 business model and to be the recognized syntax for the Pre-Trade/Trade model.

Genevy Dimitrion: ISITC consistently refers to the roadmap when presenting to our members as the guidelines on message standards to be used within the trade lifecycle. It has become an extremely helpful tool for our members in understanding the key standards available and how and when they should be used.

Karel Engelen: The Finance domain has always been well represented in the standards arena, but too much choice is often confusing. The roadmap helps you navigate by showing the standards likely to become dominant in a particular space so that you can align your own system development strategy with them.

Jamie Shay: To add to the previous comments, I don’t think the community is referring to the map per se but they are benefiting from it. One of the key things to remember is that it provides clarity about which syntax should be used in which space and that by working together we are making very concrete progress towards achieving interoperability among syntaxes, which leads to certainty and increasing STP – so there are real business benefits.

The roadmap has also meant increasing cooperation among standardization bodies. For example we are working in a much more collaborative fashion with FPL than before, in particular in the area of pre-trade where we’ve defined a common business model that is in the process of being evaluated by ISO.

Courtney Doyle: Some areas of the roadmap allow for multiple syntaxes as indicated by the cross hatching. Why is this and what should firms do in this case?

Jamie Shay: As stated previously, the roadmap provides clarity around which ‘syntax’ is appropriate for which space. The single ‘standard’ is the common data model and common underlying dictionary – ISO 20022. The actual message representation may remain different depending on the syntax chosen to represent that common business model.

In some areas, we have more than one syntax represented in a given space because there may not be a “dominant” syntax. In this case, firms should choose the syntax most suited to their environment or need.

Scott Atwell: As Jamie has stated, our objective is to feed and share a common business model, in which case having more than one syntax is not that significant as the core data and approach should not differ. In the Post-Trade / Pre-Settlement portion of the roadmap, for example, there is cross hatching of the FIX and ISO syntaxes and usage will depend on whether a firm’s STP initiatives are being driven more from the front or back office. Firms may choose to use FIX when automating from the front office forward (e.g. adding allocation delivery to an OMS), or may choose to use the ISO syntax when automating from a back office perspective forward (e.g. building upon existing custodial communication).

Genevy Dimitrion: What is most important to our members is the business process. Although there are overlapping syntaxes, ultimately the business process behind each of them is the same. It is not practical to expect all financial communities to adopt a single syntax.

The roadmap is a framework that reflects the reality of the world we live in, underpinned by a common standard data model. From an organizational perspective, the services you provide tend to drive the standard you use which is clearly highlighted in the roadmap. In the cases where the cross hatching exists, firms should look at their key areas of focus to select the appropriate standard to develop.

Karel Engelen: The world of finance is not a set of segregated business lines to the degree it once was. Securities dealers trade in OTCs, derivatives dealers use securities to create structured products or hedge their positions and everyone dabbles in foreign exchange.

Implementing firms often want to maximize the return on their existing software investment by having each of their systems cover more of the trade lifecycle. As a consequence, the standards that support these businesses have had to expand into other domains to support their user base.

As mentioned by the others, the roadmap recognizes that in some functional areas there is more than one standard that can fulfil a particular function although there could be differences in the characteristics of each implementation. Some may be better for high trade volumes of relatively simple products, while others may excel at precise definition of complex financial products or structures traded with low volumes.

We feel that it is up to each community of users to choose the right solution for itself as this may depend on its existing standards usage and technology base. An established messaging community, for example, might decide that leveraging its existing tech infrastructure and knowledge base was more cost effective than developing a parallel solution using a different standard for a new business problem.

Courtney Doyle: Are there thoughts of expanding this group to other standards bodies?

Jamie Shay: Yes, very much so. We are looking to collaborate with other standards organisations, such as XBRL. XBRL is involved in the financial ‘reporting’ end of the business and includes securities and transaction ‘issuers’. Capturing information at the site of the ‘issuer’ in the life cycle of a securities transaction will allow STP to flow throughout the entire chain of events, including the transaction life cycle and eventual reporting related to instruments, positions and statements. The roadmap is a living document. It should evolve as the industry evolves.

Karel Engelen: There is still plenty of scope for automation in finance, much of it outside the areas traditionally covered by FpML, FIX and SWIFT. We would be happy to see other standards integrated into the roadmap to further broaden the coverage to new product areas or processes.

Scott Atwell: Yes, I believe additional standards bodies will join the effort in the near future, and we are discussing reconstituting the group as ‘The Standards Coordination Group.’

Courtney Doyle: What are the next steps? How will the roadmap be maintained and updated?

Karel Engelen: The roadmap should not be seen as a static document. Each of the standards involved continues to evolve to meet the demands of its user base. The group that created it will need to periodically meet to review and update it. The end goal should be a set of interoperable standards covering the entire financial process and product space, unified by a common underlying business model whilst still allowing each to be attuned to the needs of the market it addresses.

Scott Atwell: This is an ongoing collaborative effort. I agree with Karel’s description of the end goal; however, there is a lot of work for us to complete before reaching it. FPL has committed significant technical resources to industry collaboration and the ISO process. ISO 20022’s Pre-Trade/Trade model is represented by FIX, and FPL is currently working with others to complete the Post-Trade model. In addition, FPL actively contributes to ISO WG4 and other areas focused on enhancing the ISO 20022 process. The roadmap will evolve over time. An example might be a previously over-the-counter product shifting to trading in a more listed nature.

Genevy Dimitrion: To follow on from Karel’s and Scott’s comments, we see the roadmap as an evolving document that will be revisited frequently as new products are introduced in the marketplace. We are looking to start addressing CFDs and the best standards for them.

Jamie Shay: The next steps are twofold: on the one side, we need to continue to work with FPL and FpML to agree on the common business models and get them approved by ISO. We also need to encourage other standards organisations to work with us to expand the scope to cover other instruments, other business processes and other business domains.

We also need to find an easier way to be able to “automatically” produce a chosen syntax from the single business model. Work is ongoing to ensure multi-syntax representations of ISO 20022 compliant messages.

Courtney Doyle: How does the roadmap fit in with ISO 20022 and the goals of interoperability among standards?

Scott Atwell: I believe the roadmap has already paved the way to improving ISO 20022, especially in terms of its ability to leverage and include existing, de facto standards. A key issue in years past was that the process required the creation of a new, ISO 20022 XML-specific message syntax for any messages added to the ISO 20022 data model. Now, the ISO 20022 process has been improved to the point that alternate syntaxes, such as FIX, can be recognized. This enables FIX and other messaging standards to contribute to the overall process, and more importantly allows users of these protocols to continue to leverage their current investment while achieving longer term benefits from the harmonization and collaboration of multiple standards.

The most important part of interoperability is that one can easily transition, in an automated fashion, from one stage of the trade lifecycle to the next. The roadmap facilitates a focus on these ‘peering points’ and the ISO 20022 data model provides a common model to reflect the data used throughout.

Jamie Shay: The roadmap provides clear direction about which syntax to use in which space, and building a common business model means these syntaxes will be interoperable. Interoperability will be possible once all the appropriate business models have been created in ISO 20022, and the industry embraces this standard.

Genevy Dimitrion: The roadmap clearly shows how ISO 20022 provides a framework for binding multiple syntaxes together into a single standard, thereby achieving semantic interoperability. The business process and modeling are key to the success of these standards and their interoperability. Focusing on business functions and interactions allows standards to be developed around them, in contrast with the past, in which standards were created first and business functions were integrated into existing formats. It allows firms to focus on the interactions among the industry players and the key data needed to communicate successfully in an automated way and increase STP within their organizations.

Karel Engelen: There are still areas where standards based messages are yet to be defined. In some cases these areas will be a natural fit with one of the existing standards but, if they are not, then proposing that they be developed within the ISO 20022 framework, rather than creating another standard, is a good route to follow.

To follow on with the thoughts around interoperability, this is a longer term goal and the key to it will be a common business model that captures and integrates all the different perspectives each standard has on the underlying business information. ISO 20022 has made a good start on this but there is still a long way to go.

Confucius said “A journey of a thousand miles begins with a single step.” It’s a good job we have a Roadmap.

The Investment Roadmap is available at: www.fixprotocol.org/investmentroadmap.

Keep It Simple

By Anthony Victor
Despite the rapid advances in sophisticated trading tools, Bank of America Merrill Lynch’s Anthony Victor argues that in times of volatility a knowledge of the basics has never been more important.

While market structure and technologies may have transformed, basic trading skills are still critical to success. You do not succeed in any career unless you learn the basics and build a solid foundation in the fundamentals. For a role in electronic trading, the basics include understanding trading mechanics, market structure and technology, as well as the platforms that clients use to trade. Those old monochrome Quotron machines that were prevalent on trading floors when I started my career are now on the trash heap, replaced by state-of-the-art technology, including touch-screen order management systems and workstations that supply news, market data, and analytics.
Since 2000, there have been some very dramatic changes in market structure such as decimalization, the advent of Reg. NMS and an increasing number of liquidity pools. Some of these changes resulted in a reduction of bid/offer spreads and a decrease in average trade size, which in turn, pushed market participants to use more advanced trading technology and ultimately algorithmic trading.
Algorithmic trading strategies, initially used by Portfolio Desks to manage large baskets of stocks, were eventually rolled out directly to the buy side. Trading no longer required multiple phone calls with instructions to execute an order. With a mere push of a button, these instructions could be sent electronically and trading goals could be efficiently realized. However, use of these tools still requires a good understanding of the basics and a team of support professionals that understand the nuances of how algorithmic strategies operate in the marketplace.
Within the last five years, Electronic Sales Trading (EST) desks have emerged on Wall Street to support the clients’ electronic trading activity. Unlike traditional sales trading, which focuses on what clients are trading, electronic sales trading puts the emphasis on how they are trading. Most EST desks have evolved from a pure internal support role into a client-facing, direct coverage role that assesses a client’s performance via real-time benchmark monitoring and post-trade transaction cost analysis. Electronic Sales Traders need to understand market structure and their firm’s algorithmic offering (and that of the competition) to successfully support the trading platform.

While market structure, trading tools, and trading desk responsibilities have all evolved over time, basic trading skills are still critical to success. Part of that skill set includes the ability to maintain a disciplined approach to trading amidst a barrage of news, overall market fragmentation, and a huge volume of market data. Algorithms have helped the buy-side cope with the complexity of the marketplace, but the choice of strategy ultimately belongs to the trader.
In Q4 2008, when volatility (the amount of uncertainty or risk about the size of changes in a security’s value) peaked and preyed upon market participants’ emotions, many traders moved to more aggressive electronic trading strategies. In that tough environment, traders migrated away from passive strategies, like VWAP and low-participation algorithms, to more aggressive liquidity-seeking strategies, including an increased use of ‘dark pool’ aggregators. In a more volatile environment, traders felt pressure to make a stand or risk higher opportunity costs.
However, sometimes intuitive approaches do not work as expected. When analyzing the effects of the volatile market and looking at slippage – the difference between the execution price and the price at the time of order receipt (arrival price slippage) – these aggressive strategies proved less successful than passive strategies, primarily due to the effects of volatility on spreads and depth of book. During that period, our research shows that S&P 500 spreads widened an average of 72% and book depth decreased 42% compared to Q1. Wide spreads and decreased book depth created a treacherous environment for aggressive, liquidity-seeking strategies, but favored more passive strategies.
The upside of passive
The success of passive strategies like VWAP can be attributed to several factors. VWAP orders tend to be longer in duration than other order types and in turn represent a smaller percentage of interval volume. In addition, the size of individual slices sent to the market tends to be smaller and more passively priced.

Due to the longer order duration, these less-aggressive strategies are more patient in seeking out price improvement opportunities. This results in crossing the spread less frequently, thus saving the higher cost associated with wider spreads. Conversely, aggressive order types that seek to execute quickly tend to cross the spread much more frequently.
Individual slice size is a meaningful strategy characteristic due to the fact that decreased depth-of-book also exacerbates arrival price slippage. With less available liquidity, algorithmic strategies may have to work through more price points to complete a given order slice. Slippage increases when there are more of these execution price points. At lower aggressiveness levels, the slices themselves are smaller, reducing the need to hit progressively worse price points to complete, thereby reducing the signaling risk (the risk of signaling to the market information about trading intentions) of any given slice. The larger slice sizes necessary for more aggressive order types tend to execute through more price levels, resulting in more slippage and producing a more detectable signal to the marketplace.
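The mechanics described above are easy to make concrete. The toy sketch below schedules a parent order against a historical intraday volume profile, producing the small, time-distributed slices characteristic of VWAP, and computes arrival-price slippage in basis points. The volume profile and prices are invented for illustration, not Bank of America Merrill Lynch’s actual algorithm.

```python
# Toy VWAP scheduler: distribute a parent order across the day in
# proportion to a historical intraday volume profile (the familiar
# U-shape). Profile and prices are invented for illustration.
volume_profile = [0.12, 0.09, 0.07, 0.06, 0.06, 0.06,
                  0.06, 0.07, 0.08, 0.09, 0.11, 0.13]  # sums to 1.0

def vwap_slices(parent_qty, profile):
    slices = [int(parent_qty * w) for w in profile]
    slices[-1] += parent_qty - sum(slices)  # absorb rounding residue
    return slices

def arrival_slippage_bps(exec_px, arrival_px, side="buy"):
    """Arrival-price slippage in basis points (positive = cost)."""
    sign = 1 if side == "buy" else -1
    return sign * (exec_px - arrival_px) / arrival_px * 1e4

print(vwap_slices(50_000, volume_profile))
print(round(arrival_slippage_bps(25.13, 25.02)))  # ~44 bps of slippage
```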
The cost of aggression
Though traders may have assumed that more aggressive strategies were a prudent execution method in the high volatility environment, it is evident in retrospect that the cost associated with choosing aggressive strategies was significant. Most notably, for more difficult trades (over 2% ADV), the cost difference between using an aggressive strategy over a passive one topped 44 basis points in Q4 08.

During this same period, traders were also looking for additional hidden liquidity to complement their urgent execution mentality. The result was an increase in the use of dark pool aggregators. These order types are generally midpoint-pegged, meaning that they execute at the middle of the spread. Slippage incurred was less than it would have been when fully crossing the spread, although additional costs were realized because spreads widened. As expected, the performance of these order types fell somewhere between more passive and more aggressive strategies.
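A short arithmetic illustration, with invented prices, of why midpoint-pegged fills land between the passive and aggressive outcomes: crossing a widened spread pays the full spread, while a midpoint fill pays half of it.

```python
# Why midpoint pegging lands between passive and aggressive outcomes:
# an aggressive buy crossing the (widened) spread pays all of it, while
# a midpoint-pegged fill pays half. Prices are invented for illustration.
bid, ask = 25.00, 25.06
midpoint = (bid + ask) / 2                        # 25.03
cost_crossing_bps = (ask - bid) / bid * 1e4       # 24 bps: the full spread
cost_midpoint_bps = (midpoint - bid) / bid * 1e4  # 12 bps: half of it
print(cost_crossing_bps, cost_midpoint_bps)
```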

Keep It Simple
In Q1 2009, as volatility and spreads began to level off and depth-of-book once again began to increase, we saw a migration away from aggressive strategies back to VWAP and lower-participation Implementation Shortfall. It seems that those traders who underperformed in the fall moved to an uncharacteristic, yet statistically more successful, passive trading style. These strategies continue to work well today.

In a high volatility environment, even with the availability of sophisticated algorithms, some of the most basic strategies have worked best. Well-worn algorithms like VWAP and Implementation Shortfall have generally been more successful than their newer and more aggressive, liquidity-seeking counterparts. While perhaps contrary to their intuition, traders should consider sticking with basic, “first-wave” algorithms like VWAP and I/S in a high volatility environment. The first and simplest strategy you learned may sometimes prove to be the best.
The new high volatility environment will undoubtedly spur another evolutionary advance of next generation trading techniques and strategies. Successful traders should look to implement the lessons learned last fall by maintaining a disciplined approach to trading based on fundamentals and overall market structure expertise.

Meldrum & Kandylaki: Markit

SHEDDING LIGHT.


There is more to Markit than BOAT. Will Meldrum and Sophia Kandylaki discuss the firm’s panoply of products and services.

When was Markit launched?

Meldrum: Markit was spun out of Toronto Dominion (the Canadian-based bank) in 2001 as the first independent source of credit derivative pricing. Over the following couple of years, we expanded our suite of products to cover equities, commodities, foreign exchange and fixed income. In 2007 we bought Project Boat, the equities post-trade reporting platform created by a consortium of nine banks, including Citigroup, Credit Suisse, Deutsche Bank, Goldman Sachs, HSBC, Merrill Lynch, Morgan Stanley and UBS, to coincide with the launch of MiFID.

How did Markit develop BOAT?

Meldrum: Over the last year, we have worked closely with clients and data vendors to improve the quality of the dataset. We have also worked with the industry to better understand the implications of MiFID and how we could enhance our processes and workflow. We have recently restructured our data packages to make Markit BOAT data more easily available to a wider range of clients. The dataset provides a comprehensive view of the pan-European OTC equity markets. Overall, users have access to trade reports on an average of 20bn of OTC trades in equities every day, equivalent to about 25% of the daily volumes traded on all European equity markets.

What is the new pricing structure you introduced?

Kandylaki: The new schedule, which will become effective this May, includes nine regional data packages. They cover Austria, Eastern Europe, Euronext, Germany, Italy, Scandinavia, Spain, Switzerland and UK. They are targeted at users who do not require a full pan-European licence, which was the only licence available up until now. The structure will also include a new sliding scale discount fee for Markit BOAT’s pan-European data licence based on the number of registered users within a company.

In addition, we have launched a new service that gives users an end-of-day aggregated view of validated trades reported to the Markit BOAT platform. It aims to address the market’s need for accurate historical trade data and correct measurement of off-exchange trading volumes. We recognise the importance of fine-tuning algorithms, and we expect the service to also appeal to risk managers who need official OTC volume data as an input into their liquidity tests.

Although people often think of Markit as just BOAT, you claim there is much more to the company. How, and where, have you expanded your product range?

Meldrum: Our products fall into three categories – data, valuations and trade processing – which operate across all the different asset classes: equities, credit, fixed income, commodities and foreign exchange. There are also asset class-focused products, which include Markit BOAT and Markit MSA, which we launched last December. The latter service collates information on cash and synthetic equity trades executed on primary exchanges, MTFs and OTC markets. We are working with 141 banks, including Citi, Dresdner Kleinwort, Deutsche Bank, Goldman Sachs, HSBC, Kepler, Merrill Lynch, Morgan Stanley, Nomura, Panmure Gordon, Royal Bank of Scotland and UBS, who send us their trade information.

The data is then consolidated to provide users with a comprehensive view of market activity. The service ranks brokers on each European trading venue, stock or index according to the volume and value of their trades. The objective is to provide greater insight into the new trading landscape since the introduction of MiFID. The increase in the number of MTFs and dark pools has made it more difficult for market participants to monitor liquidity and market activity. The product shows them who is trading what, where they are trading it, and what share each broker has.
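In outline, the ranking described here is a group-and-sort over consolidated trade records. The sketch below illustrates the idea with invented records; the field names and values are assumptions for illustration, not the actual Markit MSA data model.

```python
# Toy ranking of brokers by traded value per venue and stock.
# Records and field names are invented, not the MSA data model.
from collections import defaultdict

trades = [
    {"broker": "Bank A", "venue": "Chi-X", "symbol": "VOD.L", "qty": 40_000, "price": 1.30},
    {"broker": "Bank B", "venue": "Chi-X", "symbol": "VOD.L", "qty": 90_000, "price": 1.31},
    {"broker": "Bank A", "venue": "LSE",   "symbol": "VOD.L", "qty": 25_000, "price": 1.29},
]

# Aggregate traded value per (venue, symbol, broker).
value_by_key = defaultdict(float)
for t in trades:
    value_by_key[(t["venue"], t["symbol"], t["broker"])] += t["qty"] * t["price"]

# Rank brokers within each venue/symbol pair by traded value, descending.
ranking = sorted(value_by_key.items(), key=lambda kv: (kv[0][0], kv[0][1], -kv[1]))
for (venue, symbol, broker), value in ranking:
    print(f"{venue} {symbol} {broker}: {value:,.0f}")
```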

What is Markit Valuations Manager?

Meldrum: Markit Valuations Manager is one of our most recent product launches and it is the first multi-bank, cross-asset client valuations platform. It provides a secure, standardised view of OTC derivative positions and derivative and cash instrument valuations across counterparties on a single electronic platform. Users will be able to view the bank counterparty valuations alongside Markit’s independent valuations. Currently, portfolio managers receive numerous statements from their counterparties in multiple formats, and this requires several hours of manual consolidation. The product will be integrated with our trade processing service to enable full life cycle support for OTC derivative positions including counterparty position data delivery, normalisation, reconciliation and valuation.
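The reconciliation work such a platform automates can be pictured as checking each counterparty’s mark against an independent valuation within a tolerance. The sketch below is a hypothetical illustration with invented positions and thresholds, not Markit’s methodology.

```python
# Sketch of valuation reconciliation: flag positions where the
# counterparty's mark diverges from an independent valuation by more
# than a tolerance. All positions and numbers are invented.

TOLERANCE = 0.02  # a 2% relative difference triggers a break

positions = [
    {"id": "CDS-001", "counterparty_mark": 1_050_000.0, "independent_mark": 1_000_000.0},
    {"id": "IRS-042", "counterparty_mark":   495_000.0, "independent_mark":   500_000.0},
]

def breaks(positions, tolerance=TOLERANCE):
    """Yield (id, difference) for positions whose marks differ by more
    than the relative tolerance."""
    for p in positions:
        diff = abs(p["counterparty_mark"] - p["independent_mark"])
        if diff / abs(p["independent_mark"]) > tolerance:
            yield p["id"], diff

for pos_id, diff in breaks(positions):
    print(f"{pos_id}: mark difference of {diff:,.0f} exceeds tolerance")  # flags CDS-001
```

Doing this once, on a normalised feed, is what replaces the hours of manual consolidation of counterparty statements described below.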

Before the financial crisis, counterparty risk was not such a big issue, but it has now become a huge focus for the industry. Cost saving is also high on the agenda. We recently conducted a survey of 50 asset managers that showed an urgent need for this type of service. About 17% of respondents said that a single file delivery of counterparty statements would save them between 50 and 1,000 hours of work a month, while on average respondents estimated time savings of over 49 hours a month.

What is Markit Document Exchange?

Kandylaki: We launched this product last year. It is essentially a documentation library that allows the buy- and sellside to post, manage and share compliance, counterparty, credit and regulatory documents securely. Quick access to counterparty information has become one of the key issues today from a regulatory and compliance standpoint, and this product simplifies the process and makes it much more efficient. For example, previously, if you were a new hedge fund and wanted to trade with different banks, this could entail sending and receiving information from, say, 15 banks and multiple departments within those banks. Our product gives users a platform where they post their documents once and permission them to all their counterparties.

What are your plans for 2009?

Meldrum: We expect 2009 to be an extremely challenging year for global financial markets and we will not be immune from the contraction in global participation. On a more positive note the structural changes present new opportunities for Markit. We are well positioned to participate in areas where the licensing of IP, electronic connectivity of market participants, or provision of data is required.

[Biographies]
Sophia Kandylaki is a director at Markit and the product manager of Markit BOAT. Prior to this, Kandylaki was the product manager for Markit RED, the industry standard for reference entity and reference obligation identifiers used throughout the credit default swap markets. She holds a Bachelor of Laws (LLB) and a Master of Laws (LLM) in banking and finance law from the Centre for Commercial Law Studies, Queen Mary, University of London.
Will Meldrum is a managing director at Markit and head of equity data. Meldrum joined Markit in 2005 as director of acquisitions and strategic development. Prior to this, he worked at Deutsche Bank for four years, managing the bank’s interests across a diverse portfolio of investments with a key focus on industry consortia, electronic trading systems and data. Meldrum holds an MA from Edinburgh University and an MBA from London Business School.
©BEST EXECUTION

Profile : John Barker : Liquidnet

STRIKING THE MATCH.


Liquidity is a scarce commodity these days but Liquidnet believes there are more opportunities than closed doors. John Barker tells Best Execution why.

Can you tell me the background to Liquidnet?

Liquidnet started as a “Eureka!” moment at 4 am in 1999, when Seth Merrin, founder and chief executive officer, came up with the concept of capturing institutional block orders. He spent the next couple of years asking people in the industry to either ratify or pull apart his idea. The reception was good and he launched the company in the US in April 2001. It was successful from day one, mainly because of the simplicity and ease of use of the product. With three clicks of a mouse, a buyside trader was able to trade shares anonymously and efficiently.

In 2002, we went live in Europe, initially focusing on the UK, France, Germany, the Netherlands and Switzerland, and in 2007, we went into Asia, where we are currently in the core five markets – Hong Kong, Singapore, Australia, Japan and Korea. Altogether we are in 29 markets and, in many ways, the business has developed in a typically West-to-East pattern. Historically, the US is ahead of the UK by about two to three years, the UK is about a year ahead of continental Europe, which in turn is about five years ahead of Asia. However, in this case, the Asian markets were more technologically advanced than Europe: the buyside had already adopted direct market access (DMA) and order management systems (OMS).

What has been the impact of the financial crisis?

The whole financial community has suffered and we are no exception. We had grown 100% year on year from 2003 to 2007, but last year Europe’s principal trading grew by 24% compared with 2007. While this figure is much lower than we had forecast at the beginning of the year, we still believe it is good against the current backdrop. Looking ahead, if Europe’s main share indices continue to perform poorly, then principal traded is likely to be flat in 2009.

The group that has been most affected is the investment banks. For example, we have not been hit in Europe by the withdrawal of hedge fund activity, as our business is focused solely on asset managers. Going forward, there will be opportunities for us, and we think this could be the age of the agency broker, although not all will survive. We are not resting on our laurels and will continue working closely with the buyside to deliver new products, develop existing products and expand into new markets, as well as add new functionality.

Which new markets are you looking at?

We are still working our way through Europe and we currently have a healthy pipeline of new client members, with about half a dozen ready to go live. This year we are targeting the asset managers in Germany, France, Switzerland and Italy, where we have already started to build relationships. To this end, our sales team has recently hired Raj Samarasinhe, who speaks several languages and has a good understanding of the different markets. In terms of the markets we provide access to, we expect to have Turkey, Slovenia and Poland on the system in the next couple of months.

What new products and services do you expect to introduce?

We just added Luxembourg-listed Global Depositary Receipts (GDRs) to the system. Initially, there will be 97 GDRs available to trade, all of which are issued under Regulation S, US dollar traded, and settled via Euroclear. This range complements the LSE-listed Reg-S GDRs, AIM securities, exchange traded funds, contracts for difference and equities from 29 global markets that are already available. Looking ahead, we plan to introduce our new liquidity streaming product called H2O, which was launched in the US about four years ago. The product aggregates fragmented market liquidity and enables clients to interact with the liquidity of other streaming liquidity partners. Unlike the main buyside-only pool, it contains flow from non-buyside sources such as broker-dealers and alternative trading platforms. Effectively, these partners provide more liquidity to the Liquidnet pool while buyside control is maintained and anonymity and minimal market impact are preserved.

We also plan to get our programme trading desk up and running and to introduce a transaction cost analysis service in Europe. The desk will be an electronic trading service that uses algorithms to access liquidity both on our own trading platforms and on others. I think that large orders will go to the block desk, while smaller orders will go to the programme trading desk.

In addition, we are looking to expand our counterparty exposure monitoring process, which we launched in the US late last year. Counterparty risk has become very important since the collapse of Lehman Brothers. This platform collects trade data from various systems and generates a report that allows a firm to look at its exposure at the prime broker or custodian level.

There have been dramatic changes thanks to MiFID and the financial crisis. Where do you see your place in the landscape?

Liquidnet reinvented and redefined the institutional marketplace, and we really do not see any competition in our space. We see exchanges and MTFs as partners, but we differ from them in that we bring external liquidity only to the buyside. The main objective is to help the buyside enhance the quality and speed of trade execution, gain price improvement for their trades, and lower overall trading costs.

What do you see as the biggest challenges this year?

I think we can expect another six months of bad economic news and I would not be surprised to see another big name go. However, we remain positive about our business model and growth opportunities. The buyside feels very comfortable with our model, and this will stand us in good stead in the future. There are so many opportunities for agency brokers, and the biggest challenge is in delivering the products and services we discussed on time and on budget. The other important thing is to get in front of the asset management community and understand what their requirements are.

[Biography]
John Barker is managing director, Liquidnet, responsible for the EMEA region. Previously, he was head of equity operations at Tokyo-Mitsubishi International — a Japanese international bank based in London — and head of trade support for Deutsche Bank. From 1988 to 1998, Barker was director of operations at Instinet Global Services. Before 1988, he held leadership roles at Yamaichi International (Europe) and Midland Bank/HSBC.
©BEST EXECUTION

Profile : Stephane Loiseau : Société Générale

FINDING THE RIGHT PATH.


As liquidity fragments and markets stay volatile, it is not always easy to find the right pools. We ask Stephane Loiseau of Société Générale how the bank is helping clients navigate through these difficult times.

Can you please provide some background to the execution services group?

Over the past four years, we have been putting all our equity products under one umbrella. This includes sales/trading, single trades, block trading, programme trading, exchange traded funds, direct market access and algorithms. We wanted to provide a packaged approach and offer multiple venues for clients to trade on. In effect, it is a toolbox for trading and, although the products are each different, the common denominator is technology. There is also a strong global focus in that we are covering 60 to 65 markets spread across Europe, the Americas and Asia.

Who are your main clients?

We cater to all institutions – active and passive/index – but in the last few months we have been seeing increased activity from private banks. At one time they were considered purely retail organisations, but today the lines between retail and institutional are blurring.

What has been the impact of the financial crisis on SG and the industry?

The biggest impact has been the drop in volumes, the widening of spreads and the lack of liquidity. Figures show that volumes have fallen by about 50% from last year, while bid/offer spreads are now three to four times what they were a year ago.

There is also a different competitive landscape, as some market participants have disappeared and there are fewer players providing capital. There is also less activity from some institutions, such as hedge funds, which has had a direct impact on volumes and the cost of trading. For example, buying 5% of the volume-weighted average price flow could be three to five times more expensive this year than a year ago due to the scarcity of liquidity.

As a result, we, along with the rest of the industry, have had to become much more flexible in the way we trade. About 60% to 70% of volumes are traded electronically, and technology has become one of the differentiators. Any firm that has invested in the right tools, such as smart order routing, algos and DMA, is much better placed than those that did not. Clients want to automate the trading process as much as possible in order to access liquidity both quickly and cheaply.

What do you think the impact of MiFID has been? Do you think we will see more MTFs given the current volatile market conditions?

MiFID has definitely succeeded in introducing competition and has allowed the creation of a true pan-European trading platform, which has been positive for the industry. However, there have been far fewer MTFs launched than expected. Some people thought there could be up to 20, but in reality only two or three, including Chi-X and Turquoise, have made any real impact. I think it will be even harder this year for new entrants because we are in a low-volume market and the primary exchanges are in a good position to capture the flow. For example, the MTFs did not benefit when the LSE had an outage last year, and while they have learnt some lessons, I am not sure they would benefit today. One of the problems is that there is not that much differentiation between the platforms. The big question they will have to ask is how much time they should allow to attract enough liquidity to be successful. I would not be surprised to see some form of consolidation this year or next.

Do you think dark pools have been good for the market?

I do not think dark pools are for everyone or for every order. For example, if you are buying Vodafone there is no need to use a dark pool, but if you are a value investor with a buy order for a high percentage of average daily volume in a small-cap stock with thin liquidity, then a dark pool rather than a lit pool may be the better alternative. There is no perfect solution, and traders should look at all the options and conduct pre-trade analysis. The questions to ask are: what are they trying to achieve and in what timeframe? Is it a short- or long-term investment, and will there be any follow-on orders? What are the opportunity costs and the risk of information leakage? How long should a trader be prepared to leave an order resting in a dark pool? There needs to be a balance between waiting long enough to see whether the order will be crossed, the risk of the order being pinged and thereby increasing information leakage, and the opportunity cost of not being represented in a potentially volatile lit (i.e. displayed) venue.
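The sizing logic described above can be expressed as a simple participation check: orders that are large relative to average daily volume in an illiquid name are better candidates for a dark pool. The sketch below is a hypothetical heuristic with an invented threshold, not Société Générale’s routing logic.

```python
# Hypothetical pre-trade heuristic: suggest a dark pool when an order
# is large relative to average daily volume (ADV). The threshold is
# invented for illustration; real routing weighs many more factors.

def suggest_venue(order_qty: int, adv: int, dark_threshold: float = 0.10) -> str:
    """Suggest 'dark pool' when the order exceeds dark_threshold of ADV."""
    participation = order_qty / adv
    return "dark pool" if participation > dark_threshold else "lit venue"

# A liquid large cap: 100k shares against 60m ADV -> a lit venue is fine.
print(suggest_venue(100_000, 60_000_000))   # lit venue
# A thin small cap: 50k shares against 200k ADV -> consider a dark pool.
print(suggest_venue(50_000, 200_000))       # dark pool
```

In practice the resting-time question raised above matters as much as the venue choice itself, which is why this kind of check is only one input to pre-trade analysis.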

What are your plans for the future?

For us, the approach has always been to provide clients with not only the products they require but also the tools they need to measure our performance, and we will continue to build upon our offering. We give them tick-by-tick, real-time execution information, and we plan to further enhance our post-trade data and audit trail. The biggest challenges, of course, are the reduced volumes on both the buy- and sellsides and the market uncertainty. Clients are looking for liquidity, and we will provide flexible solutions and more algos that enable them to adapt to these evolving market conditions.

What do you see as the main challenges for the industry?

I think there needs to be more work in the area of clearing and settlement. MiFID introduced competition in the trading space, but clearing and settlement in Europe is still complicated and expensive. It is one of the main differences between us and the US, which has a single provider, the DTCC. I believe that a single pan-European clearer would not only help market participants lower the total cost of trading but also add volume to the marketplace.

[Biography]
Stephane Loiseau is deputy global head of execution services at Société Générale Corporate & Investment Banking. He began his career with the French bank in New York in 1996 as a trader in international equities before joining the programme trading team in London. He returned to New York in 2000 as head of the New York programme trading desk, and rose through the ranks to deputy head of global programme trading and then global head of the group in 2004. In mid-2006, Loiseau became co-head of global programme trading & electronic services and two years later he relocated to London to take up his current position, as well as head of execution for London.
©BEST EXECUTION