
Nick Nielsen : Marshall Wace


TAKING THE HIGH-TECH ROAD.

 

Nick Nielsen, head of trading at Marshall Wace, explains how the firm keeps its edge.

How difficult are today’s markets to trade in?

One of the problems today is that people have become too used to benign markets and have forgotten how to adapt to changing conditions. Today, the ride has become a lot harder due to rapidly evolving market microstructure. I don’t think the issue is that markets are volatile, but that people are not engaging with them in a way that reflects current conditions. The biggest hurdle for traders is deciding how aggressive to be. As spreads have dropped, trade sizes have come down, which effectively means that you need to lengthen your execution window or decrease your aggression so as not to show market participants what you are doing.

What has Marshall Wace done to keep its edge?

I think today there is a lot of gain from spending money on technology. Firms can differentiate themselves with the right, scalable platform and intellectual property. We believe that you have to be innovative, constantly evolving and investing significant amounts of money into building and enhancing products and tools. We build everything ourselves, back-test it and make our own impact assessments. We have invested in our trading platform and infrastructure which all of our funds, including our Trade Optimised Portfolio System, are able to leverage. It enables us to trade efficiently across global markets in real time and capture more alpha for our funds.

As for TCA, we have built our own systems because we can better standardise methodologies and understand each assumption and calculation used. We also find it difficult to integrate our trading system with independent TCA vendors in a way that captures all the necessary information and provides the depth of analysis and flexibility that we require. Our goal is to ensure that we have cutting-edge technology in order to optimise the execution process and to provide an end-to-end service that is seamless for the portfolio managers and investors.

How do you see the hedge fund industry evolving?

If I had to guess, I would say that the big funds will get bigger and the small ones will continue to be niche. It will be those in the middle that are impacted, because running costs are high and regulation will only increase them. Firms will need economies of scale in order to survive, unless they have a niche product. I think the opportunities for a two-man hedge fund with $50m are limited.

What has been the impact of the reams of regulation?

There has been a great deal of regulation over the past few years but I am not sure that the markets have become any more efficient because of it. For example, market data as well as exchange fees are still too high. There is also a lot of talk about high frequency traders but I think looking ahead they will struggle and need to move up the value chain if they want to survive.

How important is speed?

I don’t think differentiation can be based on who has the fastest computer anymore. It is pretty much a level playing field. There used to be a greater difference, but that has been virtually eliminated and there is a limit to what can be achieved from marginal speed improvements. Upgrading hardware is still relatively cheap and will continue to happen, but I think we have reached the point where the alpha forecasts being encoded aren’t granular enough to take advantage of the speed with which computers can execute strategies. The costs of marginal speed are high relative to revenue. I think people will continue to maintain and try to improve things where it makes sense, but I expect the money they previously spent on hardware will, over time, be spent on research.

[Biog]
Nick Nielsen joined Marshall Wace LLP in 2008 and is head of trading in London. Prior to this, he was at Citadel Investment Group in London and Chicago. He started his career in high frequency trading at Goldman Sachs’ Hull Group in the US. Nielsen has a BS in finance from the Massachusetts Institute of Technology.
©BestExecution

Arun Sarwal : DST


RE-WRITING THE SCRIPT.


Arun Sarwal, chief executive officer of investment management solutions (IMS) at DST Global Solutions, explains how the company has been transformed.

How has the company changed since you took charge in 2009?

We have undergone a major transformation and restructuring of the business over the past three years to make it leaner and more efficient. We had provided services across the front, middle and back office but decided that the front office had enough players in the space and that there were more opportunities in the middle and back offices. The other major change is how we relate to clients. Like many organisations, we became silo-based, and traditionally IT vendors have been more transaction- than relationship-oriented. We realised we not only had to engage more with clients but also to become much more accountable. I introduced scorecards with key performance indicators. The result has been positive both externally and internally. Our margins for key solutions have increased by around 40% over the last three years and the revenue generated by our top 20 clients has risen by some 25%. Internally, we are able to promote from within, and nine out of our 10 senior jobs have been filled by our employees.

Who is your main client base now?

We have 260 clients with over 350 installations in 45 countries across all IMS applications. We serve four main market segments – asset management firms, asset service providers and sovereign wealth funds, as well as wealth managers, which is a fast-growing new business area for us.

Why have you focused on the middle office?

One reason is the challenges associated with data management. The pressures from regulation, as well as investor due diligence and operational risk, mean that financial services institutions have to find a much more effective way to manage their day-to-day operations. Increasingly, the middle office, which focuses on compliance and control, performance measurement and different types of reporting, is viewed as the central point for all relevant information. It is being asked to facilitate the management, consolidation and delivery of business-critical data. However, it takes time and money to invest in changing technology and we saw this as one of the biggest opportunities.

What new products have you introduced?

Two years ago we launched Anova which is a suite of investment analytics applications for the middle office function on a global operating platform. What we found in the asset management industry was that data in the middle office space was fragmented from both the geographic and analytical application perspectives. Another problem was that firms had developed a complex system of infrastructures and architectures derived from individual best of breed solutions. This resulted in a plethora of systems and solutions which caused inefficiencies, multiple points of potential failure and increased ownership costs. Firms wanted a single place for data and we developed Anova to address that problem. The platform can uniquely operate in a continuous processing mode for 24/7 “follow the sun” investment operations. Anova provides access to multiple analytical applications and data feeds using a single set of data. It covers all areas of middle office analytics – performance measurement, equity analytics, fixed income analytics, risk management and post trade and regulatory compliance. We also built the capability for role-based dashboards that enable firms to customise the data and manage their assets much more effectively. Users create their own dashboards depending on their particular functions and requirements. It offers the ability to leverage existing systems within a client’s operations with engines for areas such as cash flows, corporate actions and real-time investment valuations.

What about administration and your asset servicing solution?

HiPortfolio provides full life-cycle processing from order capture, through trade confirmation and transaction processing to accounting, financial analysis and tax reporting – for a wide range of instruments and tax jurisdictions. It is used by over 160 investment operations in 35 countries around the globe. Clients range from specialist management boutiques to the very largest global institutions and third party administrators with hundreds of concurrent users processing thousands of transactions and portfolios every day. We have also recently added new functionality and extended the depth to a number of key operational areas, including derivatives processing, reconciliation and unit pricing to offer operational efficiency and help deliver cost reductions.

Who are your main competitors?

Given the nature of data management, I would say it is the internal IT teams in the different institutions. However, when they started to look at the data challenges, I think many realised they would have to start from scratch, which was costly, time-consuming and risky. Our proposition is that we have been investing for several decades in this area and we have a model that is able to maximise as well as optimise the data. We are also able to tailor it to their workflows and have worked closely with IT departments over several years in developing a data model based on best practice.

What are some of the major concerns that clients have?

When you look at the broad scope of data, scalability is one of their major concerns. For example, one of our clients is a global bank in Asia which has several million accounts across nine countries and is targeting the mass affluent. They want to know that we can provide them with significant scale as their business grows. They also want a global operating model that runs 24/7, which enables them to pull investment and account data whenever they need to at any time.

What has been the impact of regulation?

Since the financial crisis, there have been several regulatory initiatives across the globe and they touch every part of the value chain. In the past, it would have been just the compliance departments that were responsible, but today the whole organisation needs to get involved. The other impact is that all these new rules are data hungry, and financial institutions consequently need accurate and timely information. As a result, we have seen a real uptick in our business.

What has been the impact of the delays and extensions we have seen with certain regulations?

In some cases, the deadlines have been unrealistic and their full implications will emerge over time. We work closely with clients who need to respond, as well as with industry groups, to try and influence regulators to ensure that implementation deadlines are practical. We think that there should be a phased-in rather than a drop-dead approach.

What are your plans overseas?

Asia is an important and growing market for us and home to a number of our key clients. We have been in the region for about ten years and are relatively strong in Hong Kong, Thailand, Indonesia and Singapore. We service more than 50% of the insurance market in China on our platform. Looking ahead, we are targeting wealth managers in Asia because of the increasing number of mass affluent. The same is true in Latin America, where we expect to establish ourselves. We are also looking to expand in Africa, where we have already been established for many years. Africa is also interesting at the moment, as there is an economic bubble growing across the region. This is particularly true in Nigeria, Kenya and Angola.

[biography]
Arun Sarwal joined the company in 2008 as chief executive for investment management solutions. He has around 25 years of global financial services experience, with particular emphasis on investment banking, investment & wealth management and outsourcing. Prior to joining DST he was the chief operating officer at Scottish Widows Investment Partnership. During his career, he has held senior management positions with ABN AMRO, Société Générale and KPMG and has gained experience of working in over 25 countries. Arun is a Fellow of the Institute of Chartered Accountants in England & Wales as well as of the Chartered Institute of Marketing. He is a Corporate Treasurer and has an MBA from Cranfield. He is the author of the International Handbook of Financial Instruments & Transactions.
©BestExecution

 

Fixed Income Runs on FIX: Update 1

Lisa Taikitsadaporn and Sassan Danesh of the FPL Global Fixed Income Committee (GFIC) and the Fixed Income Connectivity Working Group (FICWG) share the latest in the effort to standardise connectivity for FI trading.
As the move towards transparency in the OTC markets gathers pace, FPL’s Global Fixed Income Committee (GFIC) and the Fixed Income Connectivity Working Group (FICWG), which is made up of major sell-side banks, are continuing to work towards standardised connectivity between the sell-side and the execution venues through the use of the FIX Protocol.
In the swaps space, the work to extend best practices recommendations to users of the FIX 4.4 Protocol is nearing completion.
In 2012, GFIC and FICWG extended their collaborative efforts to cover cash bond trading. An additional set of recommended best practices for using FIX for standardised connectivity between banks and trading venues is being produced under GFIC guidance.
Swaps: FIX 4.4 Back-Port
With multiple execution venues running version 4.4 of the FIX Protocol, there has been demand for a document which explains how the best practices recommendations should be implemented in a FIX 4.4 environment. The working group has therefore examined how to back-port certain messages, components, fields and enumerations from version 5.0 SP2 and after to allow compliance with the best practices for venues currently running the older version of the protocol.
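To make the tag=value encoding that these recommendations build on concrete, the sketch below assembles a minimal FIX 4.4 message, computing the BodyLength (9) and CheckSum (10) fields as the protocol specifies. It is an illustrative helper only, not part of the FPL best practices documents, and the example order fields (a NewOrderSingle buying a hypothetical symbol) are made up.

```python
# Minimal FIX 4.4 tag=value message builder (illustrative sketch, not a
# production FIX engine). Standard tags used: 8=BeginString, 9=BodyLength,
# 35=MsgType ("D" = NewOrderSingle), 55=Symbol, 54=Side ("1" = Buy),
# 38=OrderQty, 40=OrdType ("2" = Limit), 44=Price, 10=CheckSum.
SOH = "\x01"  # FIX field delimiter

def build_fix_message(msg_type: str, fields: list[tuple[int, str]]) -> str:
    # Body = MsgType plus the application-level fields.
    body = f"35={msg_type}" + SOH
    body += "".join(f"{tag}={value}" + SOH for tag, value in fields)
    # Header: BeginString, then BodyLength = byte count after the tag-9
    # field's delimiter, up to and including the delimiter before tag 10.
    header = "8=FIX.4.4" + SOH + f"9={len(body)}" + SOH
    # CheckSum: sum of all preceding bytes modulo 256, zero-padded to 3 digits.
    checksum = sum(bytearray((header + body).encode())) % 256
    return header + body + f"10={checksum:03d}" + SOH

# Hypothetical limit order: buy 1000 VOD.L at 172.5.
msg = build_fix_message("D", [(55, "VOD.L"), (54, "1"),
                              (38, "1000"), (40, "2"), (44, "172.5")])
print(msg.replace(SOH, "|"))  # "|" substituted for SOH for readability
```

The same tag=value mechanics apply whether the session runs 4.4 or 5.0 SP2; the back-port work concerns which tags and enumerations are available, not the wire format itself.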
FIX 4.4
The FIX 4.4 back-port recommendation has been incorporated into Volume 1 of the “Best Practices for CDS and IRS Electronic Trading” document set and is currently at the final stage of ratification by the GFIC. The current documents are available here: http://fixprotocol.org/document/6515/Best%20Practices%20for%20CDS%20and%20IRS%20Electronic%20Trading%20Rev.%203.06.pdf.zip
Bonds: Best Practices Documentation
FIX 4.4
The working group is currently reviewing the quoting and negotiation model and has begun to identify the different workflows needed for cash bond trading. We encourage anyone who is an FPL member and has an interest in this to get involved with the GFIC and GFIC Technical Sub-Committee, the committees where this work is being discussed. To participate, please contact the FPL Program Office (fpl@fixprotocol.org).

Market Opinion : Bob McDowall

INTO THE BREACH.

Bob McDowall assesses the impact of the eurozone crisis on the investment banking community.

The exercise in quantitative easing by the European Central Bank in February accompanied by a contrived resolution of the Greek debt crisis have brought an era of orderly uncertainty to what was disorderly uncertainty. The crisis is not over but it has abated for the time being although the reverberations continue. The ambiguous economic and financial outlook in Europe has affected multinational companies and major imports to the region. Its economic health is important to the prosperity of other major economies including the so-called BRIC countries and the US.

Investors have many reasons for concern. The enthusiasm arising from the debt agreement could revert to pessimism. Their fears have been reflected in market volatility, as many are either sitting on the sidelines or have exited the European equities market. This has impacted the equities line of business at investment banks, and the recent turbulent conditions have replaced regulation and technology as the most immediate critical focus. Large as well as mid-sized firms have shed staff in Europe, and many firms that had ambitions to build equities franchises have been forced to re-evaluate their strategies. The barriers to developing business, in terms of technology investment and regulation, have grown. The few that continue to pursue a strategy of scaling up have to focus on efficiency, much of which is achieved through technology and headcount reduction.

Other factors inhibit the growth of equity franchises, such as the absence of initial public offerings. In addition, reverse takeovers seem to be more popular in the current market climate because they are relatively low-key and discreet. Contraction of bank balance sheets and lending, combined with low interest rates, has made the corporate debt market a more attractive way of raising funds and securing medium-term funding.

The largest firms that try to maintain their order flow and scalability have to increase their IT expenditure, principally by moving to multi-asset platforms. These are designed to bring together multi-asset electronic trading, research and analytics on one platform, leveraging technology across the businesses and ultimately reducing the level of spending.

A tale of two models

The emerging trend is that the equities business is increasingly moving to two contrasting business models, both of which can be profitable. On one side are the largest investment banks. They continue to maintain an active business, handling substantial order flow that covers most sectors and geographies to a greater or lesser extent. On the other side are the small specialist boutique banks and brokers that place heavy emphasis on depth and degree of specialism, client service and a highly variable cost base that can be managed in response to the level of activity without service disruption. The niche model is being augmented by good quality specialists leaving the larger firms, and this is likely to continue. The focus for this group is understandably moving away from European equities to the geographies where there is growth, including pan-Asian, regional and national Asian markets, Latin and South America and the emerging markets of the former Soviet republics.

Squeezed in the middle

Broking firms occupying the middle territory have a dilemma. They do not have the capital and skills to trade up and compete for the diminishing order flow. They should focus on a particular market segment, country or area of research by scaling down to their core geographic, sector or service strengths.

Other, probably more important, changes have occurred in the equities markets over the past ten years which will have an important influence on the nature of equity franchises in the future. These have been encapsulated by Professor John Kay, who is currently chairing the Review of UK Equity Markets and Long-Term Decision-Making, which will report to the Secretary of State for Business, Innovation and Skills in July 2012.

Kay, who is a visiting professor of economics at the London School of Economics and a fellow of St John’s College, Oxford, points out that equity markets today are primarily secondary markets. “As the source of new funds for investing in UK industry, equity markets are of almost no importance at all.” The growing focus on absolute over relative returns stresses that it is “pounds, not alpha” that pay for pension benefits. “Taken as a whole,” he said, “people’s pensions are paid for by beta, not alpha – because alpha aggregated over all investors is essentially zero. Higher-performing companies generate the additional returns. The business imperative of asset managers was not to create better-performing companies but to outperform their rivals.”

The main issue is that investors and traders have very different interests and perspectives. Investors look to gain returns from the underlying cash flow generated by the company, while traders are interested in the returns from movements in market share prices – trading on “momentum, arbitrage or being in some sense market makers”. Technology has advanced the speed and sophistication of the latter.

Asset managers, on the other hand, have a business model built around their relative performance. Their incentive structure rewards a sale of shares over engagement. Many regard this as something that is not so much to the benefit of their clients, but an additional cost that will have to be paid for in some way or another.

These observations illuminate the more subtle changes that are influencing the structure of the franchise models in equities markets. The eurozone crisis has only served to escalate these shifts.

Bob McDowall, is consulting associate to commercial think-tank Z/Yen – www.zyen.com
©BestExecution | 2012

 

Trading : SORs

SMART ORDER ROUTING: RAISING THE GAME.

Heather McKenzie looks at how SORs are adding value.

In a recent study, equity execution services provider Neonet found that smart order routing (SOR), operating with access to multiple trading venues including MTFs, contributes to raising the overall execution quality available to clients. This results in a significant price improvement when compared to trading on the primary exchange only.

For the study, which was published in February, Neonet combined its equity execution services with analytics and quality assurance tools from IFS LiquidMetrix. Using these tools, Neonet analysed the performance of trade flows on individual or multiple trading venues with millisecond precision – a requirement for evaluating sophisticated SOR engines.

Neonet though is not the only company that has recognised the benefits of SORs, which have become an established technology and form an integral part of algorithmic trading strategies. Smart routers are working at ever faster speeds to compare prices across lit and unlit trading venues, ensuring that they fit within the myriad strategies that might be in play at the same time. David Holcombe, specialist in markets and trading at UK-based consultancy, Rule Financial, says smart order routing “is the only way to truly navigate today’s fragmented capital markets in order to get the execution you want and need”.
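The core decision a smart router makes – comparing quotes across venues and sweeping the best-priced liquidity first – can be sketched minimally as below. The venue names and quotes are hypothetical, and a real SOR also weighs fees, latency, fill probability and signalling risk, not just displayed price.

```python
# Illustrative sketch of the basic SOR decision: given current top-of-book
# quotes on several venues, split a buy order across them, best ask first.
# Venue names and quotes are hypothetical.
from dataclasses import dataclass

@dataclass
class Quote:
    venue: str
    ask: float      # best offered price on this venue
    ask_size: int   # shares available at that price

def route_buy(order_qty: int, quotes: list[Quote]) -> list[tuple[str, int, float]]:
    """Return (venue, quantity, price) routes, sweeping cheapest asks first."""
    routes = []
    remaining = order_qty
    for q in sorted(quotes, key=lambda q: q.ask):  # cheapest ask first
        if remaining <= 0:
            break
        fill = min(remaining, q.ask_size)
        routes.append((q.venue, fill, q.ask))
        remaining -= fill
    return routes

book = [Quote("Primary", 100.02, 500),
        Quote("MTF-A", 100.01, 300),
        Quote("MTF-B", 100.03, 1000)]
print(route_buy(700, book))
# → take the 300 shares on MTF-A at 100.01, then 400 on Primary at 100.02
```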

In February, global investment bank UBS announced the launch of a new algorithmic trading strategy called UBS Swoop, specifically designed for discontinuous liquidity and hard to trade or low-volume securities.

“Traditional algorithms were built to trade on a schedule, which means they are not suited to appropriately capture unanticipated bursts of liquidity,” says Owain Self, global head of algorithmic trading at UBS. “Our clients tell us that at times, for certain securities, they need a strategy to operate on an ‘I would if I could’ basis. So we designed Swoop to wait, watch and intelligently act upon those unpredictable moments.”

The trading strategy employs non-schedule based behaviour that enables the algorithm to capture highly elusive liquidity in an opportunistic way. The liquidity-seeking order type has discretion to trade when appropriate opportunities present themselves and carefully allocate child orders – those that have been separated out from the original order.

Richard Balarkas, chief executive of electronic brokerage Instinet Europe, says SORs are essential for algorithmic trading strategies. “As I speak, the largest venue for trading European shares this morning is Bats Chi-X Europe, not the primary exchanges. To trade well without missing price and liquidity opportunities, you need to trade across multiple venues and to do so successfully, you need a smart order router.”

Tony Nash, head of execution services at Espirito Santo Investment Bank, believes SORs “are more important than any other component of the execution process”. He estimates that the scheduling component of the algorithm accounts for less than 30% of the performance compared with the 70% weighting applied to the SOR. The SOR can turn a well-scheduled distribution into a poor execution simply by increasing signalling risk or sweeping the venues in a sub-optimal order.

Spoilt for choice

Given the growing importance of SORs, it is unsurprising that there are many solutions on the market. Firms can choose the SOR of their broker or implement off-the-shelf packages from a variety of vendors.

Holcombe says SOR should remain a distinct layer in the technology stack. “SOR needs to focus on ensuring the best outcome for each child order, based on what the client is trying to achieve in balancing the required certainty of execution, versus specific price targets, versus market impact.” The algorithm takes the place of the human sales trader, working each parent order through its lifespan, releasing multiple child orders at predetermined intervals or triggers and seeking to achieve a better average price than would be achieved with a single execution.

“In this way, an institution can quite easily buy an off-the-shelf smart order router, as the mechanics of this are largely commoditised. The algorithm applies the firm’s ‘secret sauce’ – their trading logic that defines the trades they want, and need to do, in order to achieve their trading and investment targets.”
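The parent/child mechanics described above can be sketched very simply: a parent order is cut into child orders released over the execution window. The even split below is purely illustrative; real algorithms add triggers, randomised sizes and timing to avoid signalling.

```python
# Minimal sketch of parent/child order slicing: split a parent order into
# evenly sized child orders for release over the execution window.
def slice_parent_order(parent_qty: int, num_slices: int) -> list[int]:
    """Split parent_qty into num_slices child orders, as evenly as possible."""
    base, extra = divmod(parent_qty, num_slices)
    # The first `extra` children carry one additional share each,
    # so the slices always sum exactly to the parent quantity.
    return [base + (1 if i < extra else 0) for i in range(num_slices)]

children = slice_parent_order(10_000, 7)
print(children)  # → [1429, 1429, 1429, 1429, 1428, 1428, 1428]
```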

Mark Goodman, head of electronic services, Europe, Société Générale Corporate & Investment Banking (SG CIB), says the bank uses proprietary SOR technology in its algorithms. “We believe our signals-driven approach is unique and we do not believe a third-party platform could support this,” he says. SG CIB’s signals-based model drives the underlying algorithms while also enabling a high level of client-specific customisation.

SORs take on responsibility for the ‘how’ and ‘where’ element of an algorithm, he says. The how relates to “how should I trade this part of the order – should I be passive, aggressive, or rest in the dark, trying to get fills?” Many algorithms will handle the how at the macro level leaving only the where to SORs. “The where is the classic function of a SOR; which venues to post to, where to sweep first, using both dynamic liquidity heat-mapping and historical data analysis to drive decisions.”

Goodman says SG CIB believes this is a fairly limited method. How to trade an order can only be optimised using short-term signals, which need to be as close to the market as possible. Without this approach, decisions around passive order placement ignore factors such as volatility prediction, which indicate opportunities for high-frequency traders and leave the order open to adverse selection.

“There are many SORs in the market and the solutions they offer generally meet best execution requirements and offer a degree of customisation. However, we increasingly find that sellside firms recognise the importance of the continual effort required to keep SOR logic in line with changing market conditions and to provide not just best execution, but a competitive advantage.”

Rather than undertake this investment in technology, resources and time themselves, firms are approaching brokers such as SG who will customise their solutions to these clients’ requirements, says Goodman.

There are two important factors an institution should consider when implementing a SOR, says Balarkas. First, they must ensure that the router is set up and calibrated to operate in the best interests of the client. SORs can be compromised by preferring certain venues, not connecting to some venues, looking for internal crosses over external liquidity, chasing rebates for the broker, etc. Second, firms have to ask how well the router actually works. With prices changing hundreds of times per second across multiple venues, a SOR needs to take information in, act upon it, and get the order out to the best venue with the minimum delay, he says.

Balarkas does not think there is enough information available to assess how good a choice there is of SORs in the market. “I would imagine that if I am a buyside trader looking for a broker’s SOR that connects to every possible source of liquidity, has addressed all possible conflicts of interest in favour of the client and that publishes independently verified results; I might not have that much to choose from.”

Espirito Santo Investment Bank uses its brokers’ SORs, says Nash, but benefits from being able to compare the SOR performance of each executing broker. “It was important for us to be able to manage the brokers’ executions and monitor their SORs. We created a post-trade tool that is unique in the market; it compares the distribution of the venue executions against the consolidated market. This tool enables our clients to accurately compare the performance of the algorithms as well as comparing the venue bias of the SORs being used,” he says.

Nash adds that there is an array of SORs in the market but they vary in quality and price “enormously”. He says it can be a challenge to differentiate between them accurately. An ideal SOR, he says, should simply address the challenge of executing the order in the right venue at the right time and the right price with zero signalling risk.

©BestExecution | 2012

 

TCA : BUYSIDE PERSPECTIVE


SAME PROBLEM, NEW SOLUTIONS?


Dina Medland looks at why the buyside are not that happy with the current TCA offering.

A recent report on changing attitudes towards transaction cost analysis (TCA) in the US and Europe by Aite Group, the independent research and advisory firm, finds that “the buyside’s satisfaction level of TCA, in general, is low. There is a lot of inherent mistrust of data that is driven by the lack of an accepted, industry-wide TCA framework.” (TCA Evolution, Through the Lens of Buyside Customers by Simmy Grewal and Sang Lee, Aite Group 2011).

Simmy Grewal, one of the authors of the report, says, “We found that buyside firms rely on a number of TCA products, sometimes as many as five, which highlights the fact that asset managers are not happy with their TCA.”

However, using multiple TCA providers at a time when traders are facing increasingly competitive and complex trading environments and thinner margins, against a backdrop of rapid advances in technology, does not necessarily point to high dissatisfaction levels. Michael Sparkes, director at ITG, says, “People are using TCA more extensively than five years ago, partly because of technology changes and partly because of market structure and fragmentation. The Aite report presents a very powerful graph charting the rise of electronic trading.”

“While you could argue that the satisfaction data shows that users feel TCA has limitations, I interpret it to mean that people are now looking much more at what it is they actually want from TCA. Satisfaction levels reflect the fact that people want to do more with it and that is a challenge for the industry, not least because systems were not originally designed to address the questions now being asked,” he adds.

While historically TCA has been very much a post-trade exercise that compares an execution against a benchmark such as the volume-weighted average price (VWAP), there are problems with that approach. “VWAP will be affected by your own activity – the larger you are, the more you will be the VWAP. Where it is used we would recommend a participation-based approach rather than a time-interval VWAP,” says Sparkes.
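The VWAP benchmark itself is simple arithmetic, which the sketch below illustrates with hypothetical trade data: compute the market’s volume-weighted average price over the interval, the order’s own average fill price, and the slippage between them in basis points. It also shows why a large order skews the comparison: its own prints sit inside the benchmark, which is the distortion a participation-based approach tries to correct for.

```python
# Sketch of a VWAP benchmark comparison. Trade data is hypothetical;
# a participation-based variant would exclude or down-weight the
# order's own prints from the benchmark.
def vwap(trades: list[tuple[float, int]]) -> float:
    """Volume-weighted average price of (price, volume) pairs."""
    notional = sum(price * volume for price, volume in trades)
    total_volume = sum(volume for _, volume in trades)
    return notional / total_volume

market_trades = [(100.00, 2000), (100.10, 1000), (99.90, 1000)]  # whole market
my_fills = [(100.05, 500), (100.00, 500)]                        # our order

benchmark = vwap(market_trades)
avg_fill = vwap(my_fills)
# Positive slippage = we paid above the interval VWAP (a cost for a buy).
slippage_bps = (avg_fill - benchmark) / benchmark * 10_000
print(f"VWAP {benchmark:.4f}, fill {avg_fill:.4f}, slippage {slippage_bps:.1f} bps")
```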

“By far the most widely used measure in TCA is ‘implementation shortfall’, which looks at implicit costs – the slippage between the price when a trade enters the market and the final execution price. Our approach has been to develop models based on our client universe database to generate an ‘expected cost’, i.e. for an order with certain characteristics, traded in a certain set of market conditions. We believe this provides the fairest way of determining whether a client’s trading costs are in line with reasonable expectations or not, and highlights areas of particular strength or weakness,” he adds.
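In its simplest form, the implementation shortfall calculation compares the average execution price against the arrival price when the order entered the market, as in the hedged sketch below. ITG’s expected-cost models layer statistical estimates on top of this basic measure; the numbers here are purely hypothetical.

```python
# Sketch of basic implementation shortfall for a buy order: the cost of
# the achieved execution versus the arrival ("decision") price, in basis
# points of the arrival price. All numbers are hypothetical.
def implementation_shortfall_bps(arrival_price: float,
                                 fills: list[tuple[float, int]]) -> float:
    """fills: list of (price, shares) executions for a buy order."""
    total_shares = sum(shares for _, shares in fills)
    avg_exec = sum(price * shares for price, shares in fills) / total_shares
    # Positive = paid more than the arrival price (a cost for a buy).
    return (avg_exec - arrival_price) / arrival_price * 10_000

# Arrival price 50.00; filled 3000 shares at 50.05 and 1000 at 50.10.
cost = implementation_shortfall_bps(50.00, [(50.05, 3000), (50.10, 1000)])
print(f"shortfall: {cost:.1f} bps")  # average fill 50.0625 vs 50.00 arrival
```

A full implementation shortfall decomposition would also account for opportunity cost on any unfilled portion, commissions and fees, which this sketch omits.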

TCA providers agree that the level of detail demanded varies enormously, depending on the client. Poor performance in difficult markets, especially in equities, has meant that there has been a significant increase in interest in TCA beyond the trading desk, as institutions look to mitigate the costs of trading.

Dave Hagen, director of strategic accounts & trading at Linedata in the US, says, “A lot of our larger clients are now doing their TCA on their own. I’ve just had a fascinating conversation with a gentleman who actually went out and hired three quants to build out his own proprietary TCA.”

There has been talk, he says, of ‘in-flight TCA’, which essentially means in ‘real time’. “What I am hearing now is ‘am I doing the best for this order at the moment’ – but I have been hearing talk about moving to real-time TCA for five years and have yet to see it,” says Hagen. “There are no hard facts on this but I have a real sense that the European-domiciled clients I speak with seem ahead of the curve on this front.”

At Société Générale, Stephane Loiseau, head of execution services, appears to be at just that cutting edge. “TCA is not a post-trade event, it’s a lifecycle event. From when the order is first made to when it is settled, it has to be a metric that is used all the way through,” he says. Fragmentation in the market, scarce liquidity for macroeconomic reasons and regulation all pose challenges for traders, but the availability of tools now is also second to none, he adds.

“At SocGen we define TCA ourselves, and reduce participation in a venue as we go along. Trading strategy is a complex scenario with a lot of ‘ifs’ in play. By defining the objectives of the order, you can look at the tools and channels you are going to use, including choices of venues and brokers. Monitoring the parameters of the order as you go along, you can change the channels to adjust your trading strategy,” says Loiseau. SocGen’s QVM approach is now about a year old and is used primarily on equities, “but we want to adopt this on other asset classes. We have already noticed trends that are helpful to investors, such as quantifying the impact of promotions on venues,” he adds.

The absence of a consolidated tape in Europe remains “an issue compared to the US market. But there is a lot of information coming back to you and the question is how to make the most of it,” he says.

Shifting landscape

Loiseau sees change as having taken place in the last 18 months, with innovation now taking place at a granular level. “For the most sophisticated buyside the focus is on mapping what services and tools are available and building their own mix” he says.

TCA providers say the shift in the TCA landscape has been “huge.” “People want to understand how they can look at their costs to improve performance. A big topic now is toxicity of the venues – how do you know one venue is better than another?” says Peter Weiler, EVP sales at Abel/Noser in the US.

“Buyside firms are not doing away with using vendors, but you do see a segment of the market which has a lot of resources doing in-house work, especially if they are very quantitatively focused. The trend is towards using data to build algorithms instead of having an ‘algo sweep’ off the rack,” he adds.

Clients are also increasingly interested in using visualisation tools to make TCA more “actionable”, he says, and this is part of a wider trend across the buyside, sell-side and quantitative firms.

Panopticon, the provider of real-time visual data analysis, recently conducted a live webinar with OneMarketData on how you can improve the efficiency of your TCA using a combination of visual data analysis, high performance tick history databases and complex event processing (CEP) engines. It included a “deep dive” demo showing how you can more efficiently analyse massive tick, quote and trade history data sets.

Visual representations such as heat maps and trend lines are “critical and the next step will be to bring those visualisation tools into the trading environment on a real-time basis,” says Weiler. At IFS LiquidMetrix, the research and market data analytics company, managing director Sabine Toulson says, “People are telling us that they want to go in two directions: they need specific information on the venues used to trade, but also data at a macro level in order to make judgements on the execution performance.”

Not everyone agrees though on the compelling need for aggregate data, or with one of Aite’s conclusions that “As is the case with most aspects of financial performance, individual data is not as important as peer evaluation.” Sparkes points out that the report quotes 65% of respondents as saying that TCA could be built in-house, which would remove the importance of peer evaluation in the equation.

Nor does everyone get excited about visualisation tools and TCA. At SocGen Loiseau says, “Traders are not big fans of visualisation tools. They are good for the marketing brochure. We have had such tools here but would rather rely on the real-time indicators with no added latency.”

However, it remains clear that TCA is “moving downstream, to the trading level where we can shape trading. It is Behavioural Finance 101, it really is about building trading strategies from TCA data,” says Weiler at Abel/Noser.

©BestExecution | 2012

 

Inbound Connectivity: Supporting FIX Portal in the US

Justin Llewellyn-Jones of Fidessa explains how connectivity is adapting to meet the concerns of brokers and traders.

The ability to integrate electronic trading and FIX connectivity into receiving platforms is a minimum requirement for trading. As such, it has become something of a commoditized service. The industry no longer refers to an OMS-specific connection, for example, because it does not exist. Instead, brokers rely on consolidators that eliminate the need for individual connections, and provide both asset- and application-neutral connectivity from any client.

Justin Llewellyn-Jones, Fidessa

Interest in connectivity is being revived because of the growing complexities in marrying the need for fail-safe connectivity with the need to increase revenues and cut costs in the face of persistent downward pressure on margins.

Managing connectivity requires considerable effort: monitoring and capturing order failures and rejections, identifying the source of a problem, repairing it and going back to the client and convincing them to re-send the order is a constant challenge. The costs, time and resources required to maintain the communications infrastructure and relationships with telecommunications and application service providers can be significant.

What’s more, FIX expertise is a very specialized knowledge set that commands a correspondingly high price. The FIX language is really more of a framework than a firm protocol. Many participants use the language to their own ends to perform tasks such as interfacing with proprietary systems, which makes expert knowledge an absolute requirement.
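A rough illustration of that flexibility: a FIX message is just SOH-delimited tag=value fields, and nothing stops a counterparty from loading proprietary meaning into custom tags. The Python sketch below is invented and deliberately incomplete (a real message also needs body length, checksum, session fields and so on):

```python
# Illustrative sketch of why FIX is "more of a framework than a firm
# protocol": a message is SOH-delimited tag=value pairs, and firms attach
# their own meaning to custom tags. The message content is invented and
# omits required session fields (BodyLength, CheckSum, etc.).

SOH = "\x01"

def parse_fix(raw):
    """Split a raw FIX message into a {tag: value} dict."""
    return dict(field.split("=", 1) for field in raw.split(SOH) if field)

# A minimal (incomplete) new-order-single: 35=D is the standard MsgType,
# while tag 5001 stands in for a proprietary, counterparty-specific field.
msg = SOH.join(["8=FIX.4.2", "35=D", "55=VOD.L", "54=1", "38=1000", "5001=XYZ"])
fields = parse_fix(msg)
print(fields["35"], fields["55"], fields["5001"])
```

The standard tags parse the same everywhere; it is fields like the hypothetical `5001` that each counterparty handles differently, which is where the specialist knowledge comes in.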

As an expensive, commoditized service, connectivity appears to be a technology that should be outsourced in its entirety to specialist providers. Not surprisingly, a number of brokers have questioned the value they get from owning connectivity themselves. The infrastructure, the physical network connections, the relationships with telecommunication network providers do not add value, nor do they provide better quality of execution or improved relationships with clients.

But the wholesale outsourcing of connectivity management has not happened, which leaves brokers with ‘half and half’ solutions where, for example, a vendor’s consolidator and router might be used, but relationships with network service providers are maintained in-house. Brokers want to maintain a certain level of control, and do not trust vendors to deliver a cost effective solution, or time sensitive responses and customer service.

It is also considered that core services should not be outsourced at a time when it is becoming apparent that connectivity is a critical service, and one that is hard to separate from essential revenue-raising activity.

Brokers need to improve margins by attracting more flow from increasingly selective buy-sides. They need to do so by generating new investment returns, securing recognizable market differentiation or providing liquidity. Once more, we are seeing a definite trend where the buy-side is taking on the determination of the trading strategy. What buy-side traders need is the ability to confirm a particular broker’s trading strategy and then to customize it to a particular portfolio manager or investment fund. The implication is that buy-sides will increasingly demand instant access to new strategies.

As a result, buy-sides require a standardized ability to evaluate a particular strategy, through real-time, post-trade TCA, and information on execution costs which give them an exact read on quality of execution, where quality is not necessarily synonymous with price. Connectivity is therefore essential to a buy-side’s success because it allows access to a broker and that broker’s trading strategy, and it provides various tools to facilitate the evaluation of the broker’s trading strategy.

This is the competitive environment in which brokers operate. It is not just the quality of the strategy offered to buy-side clients that will secure their interest; it is also the speed at which it becomes available. Innovation no longer counts for much if there is a delay while connectivity is arranged. In a market where opportunities are lost in three minutes, a three-month period to build and test the underlying connectivity arrangements would be fatal.

Cost conscious, time conscious brokers need some means of making the necessary changes and evolving their service within a private space, whilst still depending on the standard FIX connectivity. A broker trying to provide a swift service will now typically be doing so behind the scenes, offering to map orders and run market data on the hypothetical results to demonstrate post-trade quality in order to attract and retain buy-side clients.

The fear is that outsourcing could compromise any of these elements as well as a broker’s ability to control his or her own destiny. The way in which brokers manage their clients, costs and time to market with new services or strategies is not viewed as safe if outsourced.

The challenge is for vendors to provide brokers with the hosted, outsourced connectivity services that remove concerns about network fees and relationships with telecommunications providers, but still provide brokers with absolute control over their internal network. Hosted trading platforms have already demonstrated that outsourcing core functionality to trusted, third-party experts is a highly successful strategy, and a positive way for brokers to respond to a rapidly changing market environment. We expect that over the next year, more and more brokers will take the plunge and leverage the networking and connectivity expertise of trusted vendors in order to service their customers better.

Completing the Circuit: Canadian Regulation

Wendy Rudd of the Investment Industry Regulatory Organization of Canada (IIROC) describes the Canadian approach to circuit breakers, minimum size and increment requirements and the role of dark liquidity.
Wendy Rudd
What is currently driving the regulatory policy agenda with regard to circuit breakers?
Globally, and Canada is no exception, we have seen the introduction of new rules in several areas related to the mitigation of volatility. Circuit breakers are just one of those areas. While some reforms may have been in the works already, the Flash Crash of May 2010 certainly served as a catalyst for a broader debate about market structure, trading activity and the reliability and stability of our equity trading venues.
Volatility is inevitable, so when does it become a regulatory concern?
From our perspective – and we regulate all trading activity on Canada’s three equity exchanges and eight alternative trading systems – we see it as a priority to mitigate the kind of short-term volatility that interrupts a fair and orderly market. We do not expect to handle this role alone; it is a shared responsibility that includes appropriate order handling by industry participants and consistent volatility controls at the exchange/ATS level.
What are the benefits of harmonizing circuit breaker rules with US markets?
One main advantage to a shared or complementary approach is that it limits the potential for certain kinds of regulatory arbitrage in markets that operate in the same time zone. Many Canadian-listed stocks also trade in the US, and roughly half of the dollar value traded in those shares takes place on US markets each day.
Which approaches are you considering taking for market-wide circuit breakers?
We are monitoring developments in the US, where regulators have proposed changes which include lower trigger thresholds calculated daily, using the S&P 500 (instead of the Dow Jones Industrial Average) and shorter pauses when those thresholds are triggered. We are currently exploring options for market-wide circuit breakers which include continuing our existing policy of harmonizing with the US, pursuing a ‘made-in-Canada’ alternative or identifying a hybrid approach that does a little bit of both. At this stage, we are soliciting industry feedback on the merits of these three approaches. With the help of that feedback, we expect to be able to choose the appropriate path soon. It is important to note that these kinds of circuit breakers are an important control but have traditionally acted more as insurance – they have only been tripped once in the US and Canada since being introduced in 1988.
How similar is IIROC’s new Single-Stock Circuit Breaker (SSCB) rule to the US rules?
Single-stock circuit breakers are relatively new for both jurisdictions. The US and Canada have implemented SSCBs which are similar in that a five-minute halt is triggered when a stock swings 10% within a five-minute period. Otherwise, the Canadian approach differs in several ways. For example, our SSCB does not trigger on a large swing in price when a stock is trading on widely disseminated news after a formal regulatory halt.
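The headline trigger can be sketched in a few lines of Python. This is an illustrative simplification of my own, not IIROC's rulebook: it ignores the exclusions discussed above (such as trading on widely disseminated news after a regulatory halt) and simply flags a 10% move within a rolling five-minute window:

```python
# Simplified single-stock circuit breaker check: flag a 10% price swing
# within a rolling five-minute window. Parameters are illustrative and
# regulatory exclusions are deliberately omitted.
from collections import deque

WINDOW_SECS = 5 * 60
THRESHOLD = 0.10

def sscb_triggered(ticks):
    """ticks: iterable of (timestamp_secs, price). Returns True if any
    price swings >= 10% versus another price seen within the preceding
    five minutes."""
    window = deque()  # (timestamp, price) pairs still inside the window
    for ts, px in ticks:
        while window and ts - window[0][0] > WINDOW_SECS:
            window.popleft()
        for _, ref in window:
            if abs(px - ref) / ref >= THRESHOLD:
                return True
        window.append((ts, px))
    return False

print(sscb_triggered([(0, 10.0), (60, 10.4), (120, 10.9)]))  # False: largest swing is 9%
print(sscb_triggered([(0, 10.0), (60, 10.4), (120, 11.1)]))  # True: an 11% swing inside the window
```

A production implementation would trigger the five-minute halt, resume with an auction and apply the exclusions; the sketch only shows the detection logic.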
Do you believe circuit breakers, market-wide or single-stock, have a deterrent effect on momentum trading?
We did not set out with a prescriptive approach to influence or change trading behaviour or strategy. IIROC’s circuit breaker policies were developed to provide added insurance against extraordinary short-term volatility. We intend to study the impact of any changes and we may be able to learn more about the impact of policy changes on trading behaviour.
How would you characterize the response to the minimum size and minimum increment rules for dark pools?
As expected there is a range of strong opinion about any efforts to regulate an emerging area of market structure like dark pools and dark orders. We have consulted extensively with the industry in Canada towards developing a policy proposal, including a series of one-on-ones with individual parties who commented. A common concern was that the dark market had not evolved far enough in Canada to warrant the kind of policy development we have pursued. We wanted to be proactive and to put the kind of regulatory standards in place that would protect market integrity while maintaining competitive capital markets.
What is your greatest concern about unlit trading venues?
I would say our biggest challenge (related to the growth of dark liquidity) is to ensure that we are able to maintain robust and meaningful price discovery and improvement in Canadian markets. We need to balance the need for fair and equal access to liquidity for all investors, while recognizing the role that dark pools play in facilitating the execution of large (for example, institutional) orders.
What other issues on your Canadian policy agenda have been keeping you busy?
Well, related to volatility, there have been several initiatives aside from circuit breakers. We proactively educated the industry and investors on the use of order types – especially stop losses without limits – in today’s high-speed, multiple-marketplace environment. We have started working with the marketplaces toward harmonization of their volatility controls. And we are increasing the transparency of our policies and procedures on breaking or re-pricing erroneous and unreasonable trades, in part through the implementation of our SSCB policy.
What else is on the agenda for the coming year?
Like FINRA in the US, we also regulate investment dealers and their activity, but I will restrict my comments to market regulation. We expect to be able to announce an important policy update relating to short sales very soon. Last year, after conducting several studies of market activity, we proposed rule changes that would include a repeal of the ‘tick test’ restriction that prohibited a short sale from being made at a lower price than the last sale price of that security.
We always balance our policy reforms against consistency with global regulatory approaches. That said, this is an example where we recommended a ‘Made in Canada’ policy, based on our empirical research which found the tick test was ineffective as a tool to restrict significant and rapid systemic declines in prices. Our studies saw no correlation between short sales and a high failure rate for trades. We have bolstered other areas relating to shorts. These include increasing settlement rigor through rule changes, heightening surveillance of settlement failure, as well as monitoring for correlations between an increase in short selling activity and downward price movements. We are also working towards increasing public transparency of short sales with new reports.
Another area we are actively analyzing is high-frequency trading (HFT). It is a subject where there are many opinions but still too few hard facts. We are conducting a study to better understand the characteristics of this type of activity and its effects on markets generally, and to develop a stronger set of data to assist us in identifying issues as well as assessing their impact.
How does technology assist you in the analysis of trading?
Just a day before the Flash Crash, we started running our Surveillance Technology Enhancement Program (STEP) and we are pleased to say it passed the test. Canadian regulation has the advantage of having a single regulator (IIROC) administering a single set of market rules (UMIR) on all exchanges and ATSs. STEP cements that edge by offering a consolidated, cross-market view of all trading activity. This is invaluable not only in terms of real-time surveillance and building custom alerts but in creating a rich set of data for post-market analysis. Finally – your readers will be interested to know that Canadian marketplaces connect to STEP using FIX to optimize messaging flexibility.

Hitting Their Stride: Mexican Markets Mature

Hector Casavantes of Finamex assesses improvements in the Mexican markets, like technology upgrades, high frequency trading and regional partnerships, as well as the work yet to be done.

Hector Casavantes
How has Bolsa Mexicana de Valores’ (BMV’s) exchange upgrade improved trading conditions in Mexico?
The technology update by the BMV placed us on a path toward a more standardized market, and it definitely helped to raise international investors’ awareness of the existing modern, transparent and easy-to-trade market. While the benefits of the upgrade are clearly apparent, there have also been collateral effects. For example, after the upgrade, new and more demanding players have moved into the picture, increasing the level of demand for system reliability, pricing models and additional features common on other markets, such as liquidity rebates.
There is still work to do, however, as the exchange platform, upgraded though it is, reserves certain access privileges for more established domestic participants. For example, purely electronic foreign brokers cannot obtain or see the Market on Close (MOC) book, the hidden midpoint peg bookmarks for IOIs, the regular full-depth order book, etc. The exchange needs to work on leveling access, and they have recently demonstrated that they are both aware of this issue and examining how to address it.
What advantages in terms of liquidity will the Mercado Integrado Latino Americano (MILA) bring to Mexican markets?
MILA is thought to provide access to new natural liquidity sources in both directions. From what we can see, Mexico may initially provide new asset classes besides equities, including global stocks, ETFs and eventually derivatives for South American MILA countries. The domestic buy-side investor from any of those countries, like pension funds, insurance companies and corporate treasuries, may find Mexican-listed names appealing within their risk strategy objectives. For Mexican institutional investors, MILA may provide investment options independent of fluctuation from the local macroeconomic, sector-related or seasonal forces.
For all the countries involved, MILA will provide a good opportunity for sharing technology, best practices and maybe adding productive competition, thereby encouraging increasingly cost-effective services. On the downside, there are a number of legal and regulatory issues that need to be resolved before the promise of the MILA integration is a reality. In the medium term, the expectations for MILA’s role in Mexico are quite high.
What is the role of High Frequency Trading (HFT) in Mexican markets?
In Mexico, HFT is found to be mostly focused on statistical and spread arbitrage strategies and includes other market making strategies to a lesser extent. These HFT strategies have contributed both liquidity and efficiency to the overall market structure, as well as facilitating the sharing of experiences and best practice about HFT strategies and technology.
The drawback of HFT in Mexico is that the majority of those strategies are executed within a small subset of symbols. Typically, these are Mexican symbols with ADRs or global ETFs, which pushes the rest of the market, mainly the mid- to low-liquidity stocks, under the radar. In time, I am confident that the mix of conventional investors and HFT firms will only increase the overall ‘cake size’ for all participants.
How have market data speeds from BMV and the Mexican Derivatives Exchange (MexDer) kept up with trading speeds?
The Mexican exchanges have invested heavily to increase their overall processing and messaging capacities. The efforts are noticeable, and that is the reason why some brokers now offer services for HFT clients and deploy proprietary HFT strategies. The average speeds, however, are still below what is expected by and necessary for many participants.
For example, a simple market-making automated strategy on fungible cross-listed stocks, say QQQ or AAPL, cannot, by any means, reflect the originating market price fluctuations. We need to systematically incorporate semi-static pricing models and low-pass filters to cushion the real motions of the source Best Bid and Offer (BBO), plus the foreign exchange effects, which translate into lost opportunities or reduced margins. In addition, the prospect of cancellation fees discourages quantitative traders and strategists from exploring more sophisticated HFT strategies. The exchange operators have confirmed their commitment to address these issues soon, and we expect that will result in significant improvements.
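The ‘low-pass filter’ idea can be illustrated with a simple exponential moving average of the source market's mid-price. This is a sketch under my own assumptions: the smoothing constant and prices are invented, and FX conversion and fees are omitted:

```python
# A hypothetical sketch of the "low-pass filter" idea: smooth the source
# market's mid-price with an exponential moving average before quoting,
# so the strategy does not chase fluctuations it cannot keep up with.
# Alpha and the price series are illustrative; FX effects are omitted.

def smoothed_mids(raw_mids, alpha=0.2):
    """Exponentially weighted moving average of a mid-price stream."""
    out = []
    ema = raw_mids[0]
    for mid in raw_mids:
        ema = alpha * mid + (1 - alpha) * ema
        out.append(round(ema, 4))
    return out

# A noisy BBO mid-price stream from the originating market:
mids = [100.0, 100.5, 99.8, 100.6, 100.1]
print(smoothed_mids(mids))  # reacts gradually rather than tick-by-tick
```

A smaller alpha cushions more of the source market's motion at the cost of slower reaction, which is exactly the trade-off between reduced margins and lost opportunities described above.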
As international brokers increase their presence in Mexico, how are Mexican brokers differentiating themselves?
Local knowledge of market rules and the economy as well as an understanding of domestic social contexts can provide a sort of differentiation. The core differentiating factor for almost every broker, however, remains the price, independence and other added values like research, trading ideas and prime brokerage. All of these services are becoming commoditized, and the success of international brokers has only helped accelerate this commoditization.
It is the duty of each local broker dealer to find a competitive niche, be it geographical, economical or technological, and follow a well-designed marketing, sales and operational strategy to provide genuine value to their clients.
To learn more about how MILA is using FIX, please visit http://www.fixprotocol.org or http://bit.ly/zJdrMo.

Making Sense of MiFID

AFME’s Securities Trading Committee Chairman Stephen McGoldrick unlocks the latest MiFID proposals and looks at the rules for Organized Trading Facilities, algo trading and a consolidated tape.
Stephen McGoldrick
Organized Trading Facilities (OTFs)
The OTF regime began life as a specific regulatory wrapper to put around broker crossing systems (which are a new mechanism for delivering an existing service). Crossing, which is almost the definition of a broker, has become highly automated. Whilst most crossing activities have not changed, other aspects of the industry were seen to require regulation – namely increased automation and greater scope of crossing. The initial proposals outlined an umbrella category of systems called OTFs, with one category created to hold broker crossing systems and another to hold the systems for G20 commitments around derivatives trading.
When the MiFID II proposals came out at the end of 2011, the ‘umbrella’ aspect had been simplified into a structure intended to be ‘all things to all people’, which is where it has come undone. MiFID II has created a regulatory receptacle for a practice and the two things differ in shape. The broker crossing system does not fit into the receptacle that has been created for it because much of the trading is against the books of the system’s operators, which is prohibited under the current proposals.
The regulators do not want speculative, proprietary trading within these systems, but unwinding risk created by clients is both useful and risk-reducing. An opt-in mechanism for compliance, allowing traders to decide if they want their orders traded this way may be a solution. Conflict management of this sort is common in the financial sector, as it ensures that any discretion is not exercised against the interests of the client. Certainly, when it comes to measuring the client’s interests against the operator of an OTF, it is absolutely unambiguous that their interests must come first. Therefore, any exercise of discretion that disadvantages the client relative to the operator is already prohibited. A formal, documented process to ensure that segregation stays in place is good, but to effectively prohibit the vast majority of trading on broker crossing systems seems to abandon the regulators’ objectives – to increase transparency and protect clients.
Furthermore, trades allowed into a broker crossing system would be instantly reported, creating post-trade transparency. The current proposals call for OTFs to be treated in the same way as Multilateral Trading Facilities (MTFs), which fosters uncertainty about the waivers for pre-trade transparency. Currently, there are clear criteria for granting a waiver to a platform: one is that orders are large in size, the other is taking reference prices from a third party platform. The Commission will not, however, be making the decisions about waivers; they have been handed to the European Securities Market Authority (ESMA) to determine. There is a danger in specifying too stringent limits for these waivers, which would create a very different landscape from that explicitly envisaged by MiFID I.
Systematic Internalisers (SIs)
Our understanding is that regulators did not want to split activity that was in an OTF into two, but rather to regulate the broker crossing systems and to remove the subjectivity of SIs. The current SI proposal is aimed at regulating automated market making by banks, so that institutions make markets by reference to market conditions, not by reference to their clients. In MiFID I, the SI regime was introduced to protect retail investors, but subsequently this seems to have changed. When the European Commission (EC) was asked by the Committee of European Securities Regulators (CESR) to clarify the rationale for an SI regime, they declined to do so. As a result there is a distinct lack of clarity regarding the intent of the SI rules. If we had a clearer vision of the direction the regulators wished to take the market, then it would be far easier to assess whether the regulations were moving us in the right direction – or not.
Appropriate transparency is fundamental to any market but the operative word is appropriate. If transparency is an end in itself, it could become detrimental to market participants. For a market with participants who operate at very different sizes, you cannot have a one-size-fits-all approach. Some participants may wish to rapidly trade 5% or 10% of the listed capital of a company while others might simply want to trade a few hundred pounds worth of shares. It is inexplicable to think that a singular approach will work for such vastly different-sized activities.
Algo Trading and Market Stability
The general regulation of algorithms in MiFID II looks sensible: that automated software for generating orders should be properly tested and documented by each organization. We see no harm in the regulators knowing what those algorithms are doing. Whether or not they want every change communicated to them is another matter, as many firms regularly make minor changes to algorithms. Inserting the regulator into the release cycle of software is less palatable, but having a robust control mechanism and obligations to test are sensible.
Where the regulation does not work is the requirement that every algorithm must continue to push liquidity regardless of prevailing market conditions. As drafted currently, this concept displays a real misunderstanding of how electronic execution works. Instead of identifying a particular subset of High Frequency Trading (HFT) and creating an obligation to assist with liquidity, a far better way is to create a marketplace that incentivizes the provision of liquidity. Forcing traders to take on risks they are not comfortable with is contrary to the central policy of de-risking the market. If a trader leaves the market because they cannot manage their risk, requiring them to take on risk in the first place is perilous.
What needs to be discussed with the regulators? Code review is impractical, but traders should be willing to explain an algorithm’s concept. If there had to be a record of what an algorithm was supposed to do, but in reality it does something else, then a difficult conversation with the regulator would ensue. A far more sensible approach is to categorize different types of activity, ensuring that liquidity providers are regulated and monitored, but also to have an ongoing incentive, perhaps being registered as market makers. The old model was that when a firm registered as a market maker, they received certain privileges as a reward for taking on those obligations.
The fundamental break in the proposal is that it fails to differentiate between algorithms that are execution tools for large orders and algorithms that take a wholesale interest in the market and slice it to minimize market impact. By not differentiating this from an algorithm that seeks to make markets by making two-way prices and exists purely as a liquidity provider, MiFID tries to fit two massively different things into the same rule book.
In many respects, this rule simply defines the best practice that most stockbrokers already apply to their algorithms. The trouble arises if this rule is intended to prevent a ‘flash crash.’ HFT did not create the ‘flash crash’; it was triggered by the oversell of S&P 500 E-mini futures by a traditional wholesale investor. It did involve a poor execution algorithm, which should have been tested better, but requiring all algorithms to make markets does not address this and will not prevent another ‘flash crash.’
Transparency and G20 Derivatives Commitments
We should be clear that the regulations requiring derivatives to trade on OTFs are not about giving regulators the ability to foresee market risk. They are about seeing executions coming out of, or being executed on, an OTF or MTF. This type of post-trade transparency gives regulators no additional ability to foresee risks evolving in the market. The risk in derivatives comes from where open positions rest, and the trade repositories (databases of open positions) and Central Counterparties (CCPs) provide this information. In concert with trade repositories, CCPs give regulators the ability to foresee concentrations of risk, or to see when a market participant is taking on disproportionate risk in a particular area. Where the trade is executed has nothing to do with that.
The UK Financial Services Authority (FSA) has suggested that an OTF with a continuing ban on proprietary trading would fulfill the commitments, but in a market where 95% of trades are done on a proprietary basis, this proposal is fundamentally flawed. It should be very clear that a ban on proprietary capital traded by an operator is not required for an OTF to fulfill the G20 commitments, and the proposal must be revisited.
European Consolidated Tape
The mantra that transparency is a good thing seems to have been adopted somewhat simplistically. Reducing the post-trade delays permitted when banks execute positions on behalf of their clients will increase the spreads charged for the trade. Premature flagging to the market pushes the price ahead of, or away from, the wholesale trader, and lower returns for investors are the result. We have lost sight of the fact that these delays have a role to play, and we should focus on establishing an appropriate delay period rather than merely shortening it. Pre-trade transparency, whereby orders must be transparent prior to execution, also uses a system of waivers that has been effective so far. Investor choice and trading efficiency have been created under the current structure, but there is a growing sense that transparency is binary: if transparency is a good thing, then a waiver from transparency must be bad. The concern is that regulators may seek to reduce the extent of the waivers as an end in itself, instead of assessing their relative impact on efficiency and transparency.
Regarding the consolidated tape, there is a proposal that standard data formats be created, maintained and applied to all post-trade data. For trade data to use a standardized format, whether it is executed on a regulated market, MTF or OTC, promises benefits in efficiency and ease of consolidation. The problem is that the commission proposed that multiple suppliers be allowed to act as consolidators of data. A European market where multiple vendors supply multiple tapes is no better than where we started.
It has been suggested that ESMA and the commission retain powers to implement such mechanisms as are required to ensure that the data is comparable. But what the markets need is certainty and clarity. The proposal would have carried more weight if providers were required to synchronize their offerings. Alternatively, mandating a single central provider would have been pragmatic, as it would allow for easier debate about data pricing. Currently, there is a real risk that once all the data sources are compiled, the result will be prohibitively expensive. In this area, traders are again dependent on the commission’s powers to define reasonable commercial terms. We need to be able to rely upon this, because otherwise it will take years of continued lobbying to establish what is already known: that market data in Europe is too expensive for a consolidated tape to be viable.
The lack of an affordable consolidated tape in Europe detracts from confidence in the markets. It is important for investors to know they can get out of their positions, make principal prices and have a clear understanding of what is happening in markets across Europe. Nothing is going to build liquidity in Europe more than confidence in the market structure.
FPL EMEA Regulatory Update
There is a considerable amount of activity taking place in the European regulatory environment and its impact is a central concern for many FPL member firms. As an organisation, FPL works closely with regulators globally to encourage the use of nonproprietary, free and open industry standards in the development of regulation, so all sectors of the financial community can benefit from increased consistency and transparency. The standards FPL promotes are those that have already achieved mass adoption by the trading community, enabling firms to more easily meet new requirements by leveraging existing investments across additional business areas.
FPL is increasingly approached to comment on regulatory consultations and in Quarter 1 2012 decided to re-convene the EMEA Regulatory Subcommittee to ensure that all responses submitted strongly reflect membership interests. FPL is pleased to announce that Stephen McGoldrick of Deutsche Bank and Matthew Coupe of Redkite Financial Markets have been elected to co-chair this group, which will benefit from their extensive knowledge and experience. FPL welcomes participation in this group from all FPL member firms with an interest in the region; to find out more, please contact fpl@fixprotocol.org.
