With Neil Joseph, Co-chair of EMEA FIX Trading Community and Senior Trader, J.P. Morgan Asset Management; Stuart Baden Powell, Co-chair of EMEA FIX Trading Community; Adam Conn, Co-chair, Investment Management Working Group (IMWG), and Head of Trading, Baring Asset Management; Michele Patron, Co-chair, IMWG, and Senior Quant Trader, AllianceBernstein; Paul Squires, Co-chair, IMWG, and Head of Trading, AXA Investment Managers.
Stuart Baden Powell (SBP): Neil, FIX Trading Community EMEA has a number of working groups across the region. Among the more recently created teams, alongside the regulatory and fixed income groups, is the Investment Management Working Group. The EMEA IMWG has developed well since its start around 18 months ago and at full strength represents over 10 trillion US dollars in AUM – what is your take on how the group has developed so far?
Neil Joseph (NJ): The group has developed extremely well so far. As one of the newest groups within the FIX Trading Community in EMEA, it is encouraging to see that the membership has already grown to 33 firms. This is helping the buy-side form a consolidated view on standards for communicating trading-related data, as well as prompting the group to address relevant industry-wide issues. By speaking with a single voice in all matters relating to standards, we can avoid duplication of effort across the buy-side – and hopefully for brokers and vendors too – as considered and uniform decisions can be made more effectively. Being led by three co-chairs appears to have really helped the group gain traction. Michele, would you like to expand on some areas that the group is currently focusing on and the progress to date?
Michele Patron (MP): As Stuart said, the group was officially formed in early 2012, during the FIX Trading Community EMEA conference. The current leadership structure (Adam, Paul and I) was finalised later that year. From the start we tried to understand which areas we should focus on to bring added value to the industry. What we have typically seen is that, while sell-side representatives are very good at coming together around a table to find solutions to shared business problems, the buy-side has been much less active in that regard. Of course, there had already been some opportunities for the buy-side – either sell-side driven or (semi-)independent – to sit down and discuss problems, especially on the regulatory side; but our thinking was that we needed to merge the business interests of our group with the technical pedigree of the FIX Trading Community. So we concentrated on workflows where standardisation can bring benefits to market participants. Current working streams are: potential use of the FIX Protocol for Initial Public Offerings, led by Adam; developments in fixed income trading, driven by Paul; and adoption of execution venue reporting standards in the EMEA region, on my plate. As far as the latter is concerned, we recently ran a buy-side survey in partnership with the Investment Management Association. Some of the early results will be briefly shared below, and the rest, along with further actions, will be shared during the 2014 EMEA Conference.
Adam Conn (AC): The attraction to me of being involved with the FIX Trading Community is that, whilst there are already some very good forums established to share experiences and to formulate and communicate opinion, this is an opportunity for all the interest groups within our industry – some perhaps with different end objectives – to work together on specific initiatives (as highlighted by Michele) that we identify, to do the right thing to enhance market structure. The work on standardisation of venue data and the better identification of the capacity in which a counterparty has acted, for instance – with the caveat that it does not expose fundamental investors to being picked off by proprietary trading systems – has benefits to users, end clients and regulators alike. Pushing against the accepted manual norm for IPOs and other new issue applications would also bring openness and transparency to one of the last impenetrable allocation procedures, without the need for an imposed solution. The advantage of having co-chairs with diverse day-to-day responsibilities is that we have been able to look at issues affecting both quantitative and fundamental investors across asset classes. As electronic trading accounts for a growing share of fixed income trading, the early adoption of a standard language/protocol is in all participants’ best interest.
The multi-asset angle
Paul Squires (PS): Adam and I both head up multi-asset class trading desks and I think this ‘extended mandate’ is a trend that is likely to continue to develop as regulatory reform drives change to market structure and operational efficiency becomes a more central focus. That is why we wanted to include fixed income in this predominantly equities-focussed working group (which is equally reflected in the broader FIX Trading Community global scope). We are already witnessing a high number of industry initiatives emerge (over 30 at our last count) in anticipation of what are likely to be fundamental changes to the way fixed income is traded, and we welcome such innovation on the basis that it increases the choice of execution strategy. It does, however, come at a cost to the buy-side as liquidity becomes dispersed across multiple venues that the trader needs access to. These channels may take a number of different formats (executable quotes, crossing networks, order books, streamed price aggregators) but if we can use a common language like FIX the connectivity is harmonised.
As has previously been mentioned, FIX underpinning such a network would reduce costs for all market participants. The real benefit, of course, is if this standardisation of protocol helps to unlock the inventory of bonds that have increasingly been sitting dormant in fixed income asset managers’ portfolios (97% is one estimate we have recently seen). Looking to the year ahead, we have already identified some areas where we think FIX can assist as the buy-side adapts to ever-changing market dynamics. Stuart, perhaps you would like to mention some of these?
Buy-side survey
SBP: Thanks Paul, you are spot on about the multi-asset component, and this is adding value through a cross-fertilisation of ideas within the group and the wider FIX Trading Community that can only benefit the various asset classes. With regard to some of the other areas, the IMWG recently worked in partnership with the Investment Management Association (IMA) to construct a buy-side survey on the specific topics of Tag 29 (capacity), Tag 30 (last market) and Tag 851 (posting or taking). The aim was to build on previous legacy work in this area and take stock of where things are today and what we can do for tomorrow. The survey was distributed to the members of the IMA and the IMWG alike. Michele, would you like to take us through some of the key findings?
MP: Back in 2012, the FIX Trading Community published the “Execution Venue Reporting Recommended Best Practices” document. As you said, that provides a comprehensive protocol for communicating to buy-side firms where and how each single fill has been executed. We have always felt that the topic is highly relevant, from both a strategic and a regulatory viewpoint. Therefore, since the very beginning, we have been dedicating a portion of our periodic meetings to covering different aspects of execution reporting. To get as detailed a picture as possible of the implementation of the reporting mechanisms and their usage on the buy-side, the IMA and the FIX Trading Community recently distributed a brief survey, the response to which was wide and enlightening. In general terms, it appears that, whilst the vast majority of participants think the information coming from those tags is insightful, a sizeable portion is unable to process the tags internally, or relies on counterparties/third parties to provide analysis around them. Certainly, part of the problem is that full standardisation of reporting has not happened yet: for example, figure 1a shows that 90% of respondents stressed that Tag 30 does not always contain a MIC code – as advised by the best practices document – and a fair amount of additional work is required to make this non-standard information usable. In our opinion, the highlight of the survey is shown in figure 1b: the near-unanimous call (95.2% of participants!) for an extra flag in Tag 29 to classify riskless principal trades explicitly. This gives us a clear action point, and we have started working to make it happen.
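To make the venue-reporting discussion concrete, the following is a minimal illustrative sketch (in Python) of how a buy-side system might validate incoming execution reports against the best practices document – checking that Tag 30 carries a MIC and that Tag 29 holds a standard capacity value. The MIC whitelist and the sample message are assumptions for illustration; the tag semantics follow the FIX specification.

SOH = "\x01"  # FIX field delimiter

# Tag 29 (LastCapacity) values defined by the FIX specification
LAST_CAPACITY = {"1": "Agent", "2": "Cross as agent",
                 "3": "Cross as principal", "4": "Principal"}

# Tiny illustrative subset only, not a full ISO 10383 list
KNOWN_MICS = {"XLON", "XPAR", "BATE", "CHIX", "TRQX"}

def parse_fix(raw):
    """Split a raw tag=value FIX message into a dict."""
    return dict(field.split("=", 1) for field in raw.strip(SOH).split(SOH))

def check_fill(msg):
    """Flag the issues the survey highlighted: non-MIC Tag 30, non-standard Tag 29."""
    issues = []
    if msg.get("30", "") not in KNOWN_MICS:
        issues.append("Tag 30 '%s' is not a recognised MIC" % msg.get("30", ""))
    if msg.get("29") not in LAST_CAPACITY:
        issues.append("Tag 29 '%s' is not a standard capacity value" % msg.get("29"))
    return issues

raw = SOH.join(["35=8", "29=1", "30=LSE", "851=2"]) + SOH  # 'LSE' instead of MIC 'XLON'
print(check_fill(parse_fix(raw)))  # flags the non-MIC venue code

A check along these lines is exactly the "additional work" respondents described having to do when Tag 30 arrives without a MIC.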
AC: Another advantage of doing this work within an organisation with global reach is the ability to make this an all-encompassing rather than merely regional initiative. With this collegiate approach we hope that regulators and other market authorities will view the FIX Trading Community as a resource for achieving beneficial changes in market practice. The more we meet, the clearer it becomes that there are many areas we could eventually work on together that could be enhanced without the need for prescriptive rule making.
Response to FCA paper (CP 13/17)
PS: Back to equities, and one of the most immediate topics needing attention is delivering a response to the FCA’s paper (CP 13/17) on the use of clients’ dealing commissions. While each asset manager will, inevitably, have its own table of commission rates, what is clear is that simply applying a standard commission rate for all clients and all types of execution strategy is no longer appropriate. Again, regardless of the well-intentioned objectives, the practicalities of precisely capturing the full matrix of required rates on each individual trade (bundled or execution-only, dependent on market or execution strategy, etc.) are not straightforward. Furthermore, identifying the correct commission rate on each trade may be one thing, but efficiently transmitting that detail to your counterpart is another (particularly when many trades are now matched at block level). FIX Tag 12 (commission) is a means by which we think a lot of these operational problems could be mitigated, so we are looking to see how we can promote its use and support amongst our community and its systems. As before, any progress we can make will also benefit our sell-side colleagues, so there is a common interest, and this reflects how we – as a group – can react to market developments as they emerge.
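As a hedged illustration of how such a rate could travel with the order, the sketch below populates Tag 12 (Commission) and Tag 13 (CommType, where "2" means percentage) on a NewOrderSingle from a client/service-level rate matrix. The rate matrix and lookup logic are hypothetical; only the tag meanings come from the FIX specification.

SOH = "\x01"

# Hypothetical commission matrix: (client, service level) -> rate in percent
COMMISSION_MATRIX = {
    ("FUND_A", "bundled"): 0.10,
    ("FUND_A", "execution_only"): 0.03,
}

def order_with_commission(client, service, symbol, qty):
    """Build a fragment of a NewOrderSingle carrying the client-specific rate."""
    rate = COMMISSION_MATRIX[(client, service)]
    fields = [
        "35=D",             # MsgType: NewOrderSingle
        "55=" + symbol,     # Symbol
        "38=" + str(qty),   # OrderQty
        "12=" + str(rate),  # Commission: the agreed rate for this client and trade
        "13=2",             # CommType 2 = percentage
    ]
    return SOH.join(fields) + SOH

print(order_with_commission("FUND_A", "execution_only", "VOD.L", 50000))

Carrying the rate on the order itself, rather than reconciling it after the fact, is the operational simplification the working group has in mind.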
Global Collaboration
NJ: This is yet another example of new work to be progressed this year and, as Michele has already mentioned with Tags 29 and 30, it is not just the introduction of new tags and standards but also the promotion of usage – and possible changes to usage – that often forms a large part of the work. Further supporting Adam’s point about the benefits of the FIX Trading Community being a global organisation, the IMWG now works closely with its equivalent groups in the US and Asia, and has helped form the Global Buy-side Committee, which gives Adam, Michele and Paul a seat on the Global Steering Committee. So, in summary, this relatively new group has grown quickly and benefited from the diversity of its membership. The areas covered so far – including last market, last capacity and the work to use FIX for IPOs – are seen as having direct benefit to many desks, including my own. With numerous initiatives underway and planned for 2014, we expect to see further benefits across a range of issues covering multiple asset classes.
The Americas Conference 2014 – the weather outside is frightful.
The 10th Americas FIX Trading Conference held in New York on 13th February was marked by yet another severe snowstorm in Manhattan. The weather was no match for the FIX Trading Community though, who turned out in good numbers to listen to the presentations and participate in the interactive roundtables which provided healthy debate and the opportunity to network.
Both business and technical leaders were very much in evidence as panellists discussed developments in execution venue analysis, the priorities for both buy-side and sell-side in 2014, and the buy-vs-build argument for vendors to address.
In the afternoon, two streams allowed the technicians to hear presentations on the next generation of FIX and specifics on FIX certification and client-onboarding whilst the business stream focused on post trade issues and the new market structure for derivatives, timely given the mandatory electronic trading for certain swaps on 15th February.
The day concluded with a presentation on the theme of making markets safer. It was a good way to finish the conference as the word that seemed to be most in use over the day was ‘transparency’. Whether it was execution venue analysis, TCA, IOIs, post trade, fixed income, REG SCI or algos the need for greater transparency was alluded to many times. Expect that to be the case for some time to come.
News : ESMA’s Verena Ross on fixed income liquidity : Huw Jones, Reuters

EU WATCHDOG SAYS IT WILL TAKE CARE IN FRAGILE BOND MARKET.
By Huw Jones, LONDON, Tue Feb 25, 2014 (Reuters) – The European Union’s securities watchdog said it will tread carefully in regulating government and corporate bonds to avoid crimping liquidity in already fragile markets that are key to funding economic growth.
Verena Ross, executive director of the European Securities and Markets Authority (ESMA), said changes to market practices were inevitable under new EU rules in 2016.
Many bonds are traded privately but under the rules, known as MiFID II, traders would have to post prices to the wider market – a step critics say will force some banks to only trade the most popular bonds as it would be harder to make a market in less liquid ones.
Ross said ESMA was, however, mindful of the need not to damage liquidity which it acknowledges is already under pressure.
Issuance of sovereign bonds in Europe in the second half of last year fell to 429 billion euros, its lowest level since the height of the financial crisis in 2008.
“It is clear that liquidity in bond markets is still pretty fragile and that ESMA’s regulatory framework may have a role in order to safeguard the functioning of this market,” Ross told a conference organised by the Association for Financial Markets in Europe (AFME), a banking lobby, on Tuesday.
ESMA will face political as well as industry pressure to go easy on corporate bonds as politicians look for ways to boost trading in a bid to reduce companies’ reliance on banks for funding.
Poor liquidity in corporate bonds is already prompting policymakers to look at alternative ways of funding companies, such as by reviving securitisation or pooling of loans into bonds.
Ross said market practices and possibly their structures will change as a result of the new transparency requirements aimed at giving investors more information and allowing regulators to see what is going on in the market.
“I know some of you are concerned about this but things cannot stay as they are today,” she said.
ESMA’s first task will be to define liquidity so that it can then determine which bonds are subject to the transparency rules, but so far no single definition has emerged. Bonds deemed illiquid would get waivers or be exempt from some rules.
Ross sought to reassure bond traders in the audience that current tough transparency rules for shares would not be “mechanistically” applied to bonds.
Rick Watson, AFME’s head of capital markets, said Ross’ speech made it clear the watchdog recognises the important differences between equity and bond markets and that it plans to calibrate liquidity based on fixed income specific data.
ESMA also recognised that fixed income markets will “require a flexible and dynamic approach to calibration and waivers thresholds”, Watson said.
The watchdog will monitor the liquidity of bonds regularly once the new rules are in force, meaning that if a bond becomes less liquid over time, as many do, it could later be exempted from the toughest transparency rules.
https://in.reuters.com/article/2014/02/25/eu-bonds-idINL6N0LU30S20140225
Announcement : EuroCCP choose Euroclear to settle US equities on Turquoise

EUROCLEAR BANK SELECTED FOR SETTLEMENT OF US STOCKS BY EuroCCP.
London and Brussels, 24 February 2014 – EuroCCP, the pan-European central counterparty (CCP) which clears over 4,000 securities, has chosen Euroclear Bank, the Brussels-based international central securities depository (ICSD), as sole settlement provider for all US-listed equities transactions struck on Turquoise, the pan-European trading venue.
Under the service which was launched today, transactions in US stocks listed and traded on Turquoise will be settled on behalf of clients at Euroclear Bank in any of the 54 settlement currencies offered by the ICSD.
Clients of Turquoise and EuroCCP need make no adjustments – contractual or operational – to be able to benefit from Euroclear Bank’s robust securities settlement and asset servicing expertise. Turquoise is the only MTF in Europe to offer a full range of US equities.
Robert Barnes, Chief Executive Officer of Turquoise, said: “Turquoise already offers its customers access to 18 European markets through a single connection. Today’s announcement increases the geographic diversification of the markets we offer to our customers and makes the process of trading US names – as well as in a European time zone – easier than ever.”
Diana Chan, Chief Executive Officer of EuroCCP, said: “We have a very good working relationship with Euroclear Bank and it is a natural evolution of this to now partner with them to settle all of Turquoise’s US-listed equities transactions.”
Yves Poullet, Chief Executive Officer of Euroclear Bank, stated: “The proliferation of multi-lateral trading facilities post MiFID, such as Turquoise, has led to a wholesale redrawing of how investors access and trade stocks. Ultimately, those venues that attract the largest pools of equity liquidity can expect to flourish in the future. I am delighted to extend the current working relationship that we have for Depository Receipts with Turquoise and EuroCCP to US equities. Our 1,300 clients can now benefit from full trade, clearing and settlement automation in blue-chip firms like Apple, Berkshire Hathaway, Exxon Mobil and Ford, to name but a few.”
Link to Euroclear press release
5 Reasons Why Firms Should Enhance Quality Assurance Testing
Market participants are under heavy pressure to ensure their technology is stable and performing as intended. But participants often rush to launch services to the marketplace and cut corners on performance testing. In their push to quickly release new functionality – whether in response to regulations, exchange updates or customer demand – firms may be putting themselves in a dangerous position. By Candyce Edelen, CEO, PropelGrowth.
Poorly performing systems can cause disastrous errors or trading outages that can damage a firm’s reputation—possibly even jolting investors enough that they take their business elsewhere. That’s why thorough quality assurance (QA) testing is essential, giving developers the opportunity to fix minor glitches before new technology is unleashed on customers. Proper QA practices can also help firms gain a competitive edge in a highly complex trading ecosystem, improving customer acquisition and retention rates by helping deliver cutting-edge services to customers faster and more reliably.
1. Speed Time to Market for Improved Competitiveness
In this fiercely competitive landscape, financial firms need to be ever-vigilant of, and responsive to, evolving market and customer demands. In large part, this means being able to deliver innovative offerings in a reasonable timeframe. But the process of delivering new solutions, new functionality or major upgrades is often involved and lengthy, requiring both extensive development and thorough QA.
Automation is a viable way for firms to improve time to market and keep systems quality high by speeding up regression testing. Using automated testing, organizations can deliver quickly while preventing major defects that could have a negative impact on customer relationships.
Direct Edge, a US-based stock exchange, is one firm using automated testing for just this purpose. It automated regression testing not only to speed up the overall process, but also to add more sophisticated test cases. Rene Ovalle, Head of Quality Assurance at Direct Edge, commented, “Introducing greater sophistication into QA has helped us reduce the regression testing cycle from months down to just 33 hours. Not only is the process exponentially faster now, but it is also far more thorough.”
Improved efficiency is a prime area where automated testing shows its value. It’s not uncommon for firms to see a 40% reduction in time spent on the end-to-end testing process. This means they can perform platform migrations faster, giving customers an uninterrupted user experience.
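As a rough sketch of what an automated regression pass can look like in practice, the snippet below replays canned orders against a test session and checks that the acknowledgements match a recorded baseline. The session object, its send()/receive() methods and the scenario data are assumptions for illustration, not any particular vendor's API; Tag 150 (ExecType) values come from the FIX specification.

EXPECTED = [
    # (order to send, expected ExecType in the acknowledgement)
    ({"35": "D", "55": "VOD.L", "38": "100"}, "0"),  # new order -> New ack
    ({"35": "F", "41": "OID1"}, "4"),                # cancel request -> Canceled ack
]

def run_regression(session):
    """Replay canned orders and check the venue responds as in the baseline."""
    failures = 0
    for order, expected_exec_type in EXPECTED:
        session.send(order)
        report = session.receive()  # blocking read of the execution report
        if report.get("150") != expected_exec_type:  # Tag 150 = ExecType
            failures += 1
            print("FAIL: sent %s, got ExecType %s" % (order, report.get("150")))
    return failures

Run nightly or continuously, a suite along these lines is what turns a months-long manual regression cycle into the hours-long automated one described above.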
2. Lower TCO to Free-up Budget
Some firms are skeptical about implementing automated testing technology, considering the initial upfront investment involved. They think it’s more affordable to simply hire a team of testers to handle the job. “It’s true that in the short-term, manual testing is the lower cost option. However, the overhead involved in sustaining this approach over time is significantly higher when stacked up against automation,” said Josh Tolman, Deputy CEO at Greenline.
Likewise, building testing tools in-house can present considerable long-term expenses. “For example,” said Tolman, “IT staff will need to devote a lot of time to not only building, but also keeping these systems up-to-date as regulations and other requirements change. That can become quite arduous. Furthermore, if building testing software is not within their area of expertise, then the quality of the system may suffer. Think about it like this— do you want the same people who are building your software to also be building your testing tools?”
According to Tolman, the average firm will see a 155% return on investment (ROI) after the first year using a third party testing tool, and a 327% ROI by the end of year three. The total cost of ownership (TCO) attached to automation is not only lower because it demands fewer employee resources, but it also minimizes the possibility of costly mistakes related to lack of domain knowledge or human error. Reduced TCO can also free-up budget, allowing firms to redirect money to other value-added areas of the business. Tolman used an ROI calculator developed by Greenline to arrive at these numbers. Of course, results will vary by firm.
Direct Edge’s experience with automated QA testing proves Tolman’s point. “A lot of our ROI can be seen in our improved productivity rates and how much more efficiently we’re handling testing. Prior to automation, all testing had to be completed during our business hours, but now it can be done 24 hours per day, 7 days per week. As a result, we’re now able to deliver new releases to our clients in a matter of weeks, not months,” said Ovalle.
3. Improve Testing Accuracy for Better Stability
Firms that use automated testing systems stand to dramatically enhance quality assurance. If they’re designed well, automated processes can be far more accurate, reliable and consistent than manual processes. Furthermore, sophisticated testing tools can introduce consistent, repeatable processes into a testing environment that can improve systems stability.
“With automation, many firms have reported a 75% reduction in production defects when compared to manual methods. Minimizing such defects can save sell-sides a significant amount of money – both in terms of resources needed to fix errors and pay-outs to clients in the event of trading errors,” commented Josh Nardo, Global Head of Services at Greenline. In essence, an automated approach will help make the testing process less prone to error, and thereby more cost-efficient.
4. Respond Nimbly to Regulatory Pressures
Today’s regulatory landscape is in a state of flux, forcing firms to repeatedly modify their front, middle and back office systems to meet evolving compliance requirements, often with short deadlines. Every change introduces new system integrity risks. In addition, regulators are putting added pressure on these firms to establish thorough testing practices and to ensure the operational integrity of their systems.
Compliance is further complicated by principles-based rather than prescriptive regulations. “We’ve observed that while regulating bodies do set forth some guidelines for compliance, they have not supplied precise rules. This leaves room for interpretation on the part of exchanges that are striving to comply with the latest standards,” said Ovalle. “At Direct Edge, we’ve made it a point to be a model in this space. Across our organization, we’re trying to create repeatable, sustainable, adaptable, demonstrable processes that help simplify the compliance burden.”
Automated testing is an essential ingredient that can help organizations stay on top of regulatory pressures. “Automation can help you tackle the compliance challenge in a few different ways,” said Tolman. “It allows you to create best practices in testing and auditable processes. It can also establish and enforce consistent testing procedures. Additionally, automation can make testing more comprehensive, while simultaneously making the process less manually intensive and time-consuming. Therefore, more testing gets done, and the tests are less prone to error.”
5. Reduce Testing Costs and Improve Testing Maturity
Some sell-side firms think outsourcing technology to a vendor would be more expensive than handling it internally. But this is largely a misconception. Testing complex financial technology requires specialized knowledge and experience. It’s neither simple nor inexpensive to find this skilled labor pool.
When you compare it to the cost of recruiting, training and retaining this type of talent, leveraging a vendor with proven knowledge in this space and skilled resources can ultimately be more affordable. Nardo comments, “Firms that choose to outsource automated testing to an experienced provider will typically see an overall 30 to 60% process savings. Not only is the process less expensive, but leveraging subject matter experts can also make tests more sophisticated.”
Direct Edge agrees. Reducing overhead and getting access to domain experts were the primary reasons they chose to outsource. “When setting up our automated testing environment, we elected to use Greenline’s support resources. As a result, we’ve been able to get our testing environment up and running quicker and more cost-effectively than if we had managed the entire process in-house. Greenline’s support team has also been very helpful in sharing their knowledge with our internal staff,” said Ovalle.
Meeting Today’s Demands
In this unpredictable environment characterized by constant change and unrelenting competition, firms need to find new ways of attracting and retaining trading flow. One way of doing this is by introducing automation into the quality assurance process, which can help organizations enhance their performance while meeting the pressure to offer improved functionality to clients. “We’re proud that our firm has one of the best uptimes of any exchange in the industry. A large part of this we attribute to having rigorous testing practices in place that help ensure the quality and reliability of the service we offer our clients,” said Ovalle.
MiFID II : Now the dust has settled…
The political dust may have settled on the MiFID II agreement but the end game is still not in sight. Regulators are now tasked with ironing out the various technical details and few expect implementation to be seamless.
Firstly, incorporating the different political agreements that have been reached is going to be a monumental task and could take weeks. Once the text is finished, it will move into ‘level 2’ discussions, which will largely be handled by the European Securities and Markets Authority (ESMA) and allow the industry further opportunities to wield its influence. The final version could be as long as 700 pages, which will then have to be pored over by legal teams at the European Commission and translated into 24 different languages before a plenary vote in the Parliament.
There are also still areas of debate, most notably the volume caps on dark trading. The proposals suggest limiting the amount of unlit trading in an individual stock to 4% on each venue and 8% as a whole across European regulated markets and MTFs. Current estimates from TABB Group put total European dark trading at around 11% of equity trades across the full spectrum of regulated markets, MTFs and broker crossing networks (BCNs). However, BCNs will be banned under MiFID II, as the equivalent organised trading facility (OTF) category created by the legislation will not be allowed to trade equities.
One of the issues is that the thresholds are arbitrarily set and many see them as unnecessarily restrictive. In addition, it is unclear how this rule will be enforced given the lack of a consolidated tape which can accurately and reliably spew out the trade data across the trading platforms. Ironically, this dilemma is bringing the intensely competitive broking community together in search of a solution to minimise the potentially damaging impact a cap could have. One alternative being bandied about is imposing a minimum size for dark pool orders that incrementally increases as trading nears the threshold.
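For a sense of the mechanics being debated, here is a back-of-the-envelope sketch of the proposed double volume cap test, assuming per-venue dark turnover and total turnover for a stock over the relevant look-back window are already available. The thresholds are the 4%/8% figures above; the venue names and numbers are illustrative.

VENUE_CAP = 0.04   # dark trading per venue, as a share of total trading in the stock
MARKET_CAP = 0.08  # dark trading across all venues combined

def cap_breaches(dark_by_venue, total_turnover):
    """Return which of the proposed caps a stock would breach."""
    breaches = []
    for venue, dark in dark_by_venue.items():
        if dark / total_turnover > VENUE_CAP:
            breaches.append(venue + " exceeds the 4% per-venue cap")
    if sum(dark_by_venue.values()) / total_turnover > MARKET_CAP:
        breaches.append("stock exceeds the 8% market-wide cap")
    return breaches

print(cap_breaches({"POOL_A": 3.0e6, "POOL_B": 4.5e6}, total_turnover=1.0e8))
# POOL_B trades 4.5% of total turnover dark -> per-venue breach; combined 7.5% stays under 8%

The sketch also makes the enforcement problem plain: without a consolidated tape, nobody holds a reliable total turnover figure to divide by.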
Another hot topic is derivatives trading and the OTFs, which must be multilateral, with restrictions on the use of own capital by OTF operators such as broker-dealers. While this will mean a more neutral playing field, there are questions over liquidity, as many of the instruments traded on these venues have relied on broker capital or inventory. Much will depend on which derivative contracts ESMA deems suitable.
It is not surprising perhaps that against this backdrop, the target date for 2016-2017 could slip.
Lynn Strongin Dodds, Editor.
Aquis Exchange : What's Next?
With Alasdair Haynes, CEO Aquis Exchange
During 2013 we had the idea, got a team together, built the technology, raised the finance, got regulatory approval, connected customers and went live. This is a great achievement for any organisation in this economic climate and I commend the team for their extraordinary work. Many firms take much longer than that just to get their regulatory approval, let alone go live.
2014 sees us executing that strategy. We started out with seven or eight customers and we hope to add more and more customers each month. We have an aggressive sales pipeline of over 40 customers that we want on board as quickly as possible. We are doing tiny volumes today, but that’s almost deliberate; if you have a party at home, you don’t turn the music on straight away, you wait until everybody arrives, you serve a few drinks and then you turn the music on! And that’s what we are doing here, we want to get everybody on board first, to make sure they’re all ‘at the party’ and when they are, we can turn the music up. That’s when we will make a difference. And that’s the approach we’ve taken; the major metric we look at is the number of customers we have on subscription each month in the early stages.
The strategy is very simple. We have launched in three markets and over the next few months, we will add more markets in Europe (14 in total), we will add customers and then we will start turning up the volume of the music. We think the subscription model comes down to critical mass, just as any new exchange does.
The vast majority of people we speak to like the concept and they appreciate the idea that it’s a strategy that will grow the market. Our plan is not about taking market share from one place to another and just breaking up the pie. We want to grow and adapt the model in order to realise the same effect that we’ve seen in telecoms, television, retail and in leisure. Where these industries have subscriptions, there has been enormous growth. We think the same thing can happen in financial services and that it is basically due to the transparency angle.
We have 12-18 months in which to improve our concept, to get out into the market and to make it happen. We are a private company, not a consortium of banks that could finance this kind of project for long periods without profitability. We have a model which we are showing to the industry. The industry is either going to like it, which we obviously hope and believe it will, or it won’t. If it doesn’t like the model, then firms won’t take a subscription.
The market is ready for change and the appetite for European equities certainly appears to be growing and we believe we have the technology and the best people in place to implement that technology. It’s still early days but the signs are encouraging, and in any business you do need a bit of luck! But we have a great opportunity as we are in the right place, it’s the right time, and we have the right product and business model.
Real Time TCA – The Next Frontier in Trading Analytics
The use of technologies for analysing real-time data can unleash insights and added value for buy-sides that have until now been lacking from broker offerings, according to Serdar Armutcu, Head of APAC at OneMarketData.
Take TCA, for example. Traditionally, brokers have provided TCA reporting on a post-trade basis, but in an environment of dynamic algorithms, fragmentation, dark pools and SORs, this is increasingly insufficient for meeting the challenges of analysing real-time order performance. There is still significant informational asymmetry between the buy-side and brokers on an intraday basis. Democratising access to intraday execution information is an area on which brokers can focus to differentiate themselves from the competition. Helping clients act on insights as they are revealed in real time will allow them to be seen as much more pro-active than the competition.
TCA Gets Real
In real-time TCA, the main difference is that data – market trading data, orders, executions, news data, as well as other types of proprietary data – comes in a continuous stream, which makes the data set much more complex in structure and far richer in scope for providing additional colour and diagnostics for the trader. While the traditional metrics used thus far provide a way to pierce the data and get at the longer-term trends in execution costs, they only offer a rear-view mirror picture; they therefore lack the specificity to guide the buy-side trader on how to adjust their actions on the next order, or how to set algo parameter values during trading, when market conditions can change very quickly. Another weakness of such TCA products is that they tend to focus on costs and generally do not consider risks. It is also difficult to incorporate into the analysis exogenous factors such as the trading environment the trader is working in and the instructions or constraints attached to the order by clients.
In contrast, in the real-time setting there is, in addition to these standard metrics, tremendous opportunity and scope to provide additional market colour via decision support tools that incorporate metrics and diagnostics covering areas which were not relevant a few years ago but which have a direct impact on execution quality now. Examples include real-time liquidity analysis by venue and dark pool, toxicity levels, latency statistics for different exchanges and venues, and prediction of order completion probabilities.
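As a small illustration of the real-time flavour, the sketch below computes one such metric – running implementation shortfall against the arrival price – updated on every fill as it streams in. The fill format and the idea of feeding the output to a dashboard or algo are assumptions for illustration.

def stream_slippage(arrival_price, side, fills):
    """Yield cumulative slippage in basis points after each incoming fill."""
    total_qty = 0.0
    total_cost = 0.0
    sign = 1 if side == "buy" else -1        # buys suffer when the price rises
    for qty, price in fills:                 # fills arrive as (qty, price) events
        total_qty += qty
        total_cost += qty * price
        avg_px = total_cost / total_qty
        slippage_bps = sign * (avg_px - arrival_price) / arrival_price * 1e4
        yield total_qty, slippage_bps        # feed a dashboard, alert or algo

for done, bps in stream_slippage(100.0, "buy", [(500, 100.02), (500, 100.06)]):
    print("%.0f shares done, %+.1f bps vs arrival" % (done, bps))

Unlike an end-of-day report, each value arrives while the parent order is still working, which is precisely what lets the trader adjust algo parameters mid-flight.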
This also makes it much more challenging to design interfaces and visualisations that do not overwhelm the end users. One of the trends in this space has been the entry of visualisation companies to fill the gap between the large data sets being produced by the execution process and the end users whose decisions depend on insights from such large continuous streams of data. While these tools can help with building quick dashboards, building useful visualisations is as much an art as it is a science, and requires a special sensitivity to things like colour, shape and form that cannot be gained by simply purchasing a software system. It is a bit like giving brushes and paint to somebody and expecting them to produce the Mona Lisa.
The other main challenge for brokers is justifying the investment in technology and people for such real-time analytics products at a time of shrinking commission pools, as these products have tended to be more costly and labour-intensive to implement than, say, pre- or post-trade analytics, because they require expensive items like data connections and market data feeds. But these costs have been coming down, thanks in large part to the commoditisation of the fundamental technology needed to implement such systems. Companies can now purchase off-the-shelf products to help them get up the analytics curve quickly in building their solutions. Amongst the technologies which have made the most impact when it comes to analysing real-time streaming data are Complex Event Processing (CEP) platforms.
Complex Event Processing (CEP)
The CEP model of computing has its roots in a range of fields. The concepts found in CEP were developed independently in areas such as Discrete Event Simulation, Network Management, Active Databases and Middleware.
A CEP application is, at its core, about finding patterns amongst a stream of events. The word complex is meant to imply an abstraction or a compound of a set of basic patterns, rather than something that is complicated to calculate. While a full discussion of CEP technologies is out of scope for this short article, it is important to state that while it is possible to implement a CEP-type application in a general-purpose programming language such as Java or C++, the field has evolved to a point where there are significant advantages to using specialised CEP platforms. Queries using the CEP-driven model of computing can be captured much more efficiently and clearly than in a lower-level language such as Java. CEP languages generally employ techniques such as filtering, aggregation and event stream processing constructs such as joins and merges to allow users to express temporal relationships amongst events in a highly efficient manner. Such queries would be cumbersome, if not impractical, to implement in a traditional database language such as SQL.
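To make the filter/window/aggregate pattern concrete, here is a toy imperative sketch of the kind of query a CEP platform would express declaratively: filter a stream of trade events for one symbol, maintain a sliding time window, and fire whenever windowed volume crosses a threshold. The event format, symbol and thresholds are illustrative assumptions.

from collections import deque

def volume_spike_monitor(events, symbol, window_secs, threshold):
    """Fire whenever windowed traded volume in `symbol` exceeds `threshold`."""
    window = deque()                    # (timestamp, size) pairs inside the window
    for ts, sym, size in events:        # event stream: (time, symbol, size)
        if sym != symbol:               # filtering step
            continue
        window.append((ts, size))
        while window and window[0][0] < ts - window_secs:  # slide the window
            window.popleft()
        vol = sum(s for _, s in window) # aggregation step
        if vol > threshold:
            yield ts, vol

events = [(0.0, "VOD.L", 400), (0.5, "BP.L", 900), (1.0, "VOD.L", 700)]
for ts, vol in volume_spike_monitor(events, "VOD.L", window_secs=5.0, threshold=1000):
    print("t=%.1f: VOD.L volume spike, %d shares in window" % (ts, vol))

A dedicated CEP language would reduce the body of this function to a one-line windowed query, which is the efficiency-of-expression point made above.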
So what are some of the important points you should be thinking about when you are considering CEP technology? Here is a shortlist of questions you can engage your CEP vendor next time you are in the market for a CEP platform.
What core design assumptions underlie your architecture and who designs and builds them?
It is important to ascertain whether there are any in-built limitations which might make it difficult to apply CEP technology to your particular problem or domain. How easy is it to scale up to handle an increased number of input streams or users? Is the design flexible enough to accommodate users distributed across different geographic regions?
Does the system provide the ability to extend functionality?
Can the system meet not only your current requirements with out of the box functionality but also any future requirements that might pop up unexpectedly? If you can extend the functionality through configuration rather than coding the result will be less costly, less error prone and a faster turnaround on your applications.
What type of support do you provide for third party market data vendor feeds, internal proprietary data sources and the ability to process and load non-structured data?
It will save you a lot of time if the process of connecting to market data vendors, subscribing to symbols and capturing the raw message traffic is part of the out of the box functionality so that time does not need to be spent on what most people would consider low value work or “plumbing”. Handling non-structured data is also important for example when backfilling gaps in your datasets and incorporating non-traditional sources into your quant analysis.
What type of storage mechanisms do you provide to store the results of the CEP calculations and market data? Is there an in-memory database to capture the streaming data and store results of calculations? How much data can be stored?
Most CEP platforms have some type of temporary storage area which can be written to from their event processing engines; usually, however, this area is insufficient to store both the results of queries and the streaming market data. It is important, then, to understand any storage limitations of the system for both intraday and historical data.
What type of logging, troubleshooting and diagnostic tools are available to monitor, measure and ensure the integrity of the data and of the system?
It is important to have a complete audit trail of the system for individual user queries, and to be able to monitor system performance statistics such as throughput latencies and to detect outages, periods when system performance is coming under strain, user entitlement violations, and so on. Logging should be easily customizable to suit the various needs of different user groups. Monitoring and troubleshooting tools should come as part of the system and not depend on proprietary third party technologies.
Trading in Transition
Technology is reshaping the trading landscape as algorithms and low-latency analytics continue to dominate. Yet despite this proliferation of tools and execution services, there are still many buy-side traders in Asia who are not comfortable with the process of selecting, managing and evaluating broker algorithms. This discomfort can be attributed to factors such as lack of training, limited technological support and the opaqueness of broker tools. But new services now appearing to address these issues, such as real-time TCA, will, we believe, help supply what has been missing from the trading landscape: simple, intuitive interfaces that enable the automation of complicated strategies and give users real-time, holistic feedback on changing market conditions, market impact and order executions. Uncovering the performance of trading behaviour through customised, personalised real-time transaction cost analysis is a critical component of any investor’s profitability.
The Buy-Side's SOR
Simo Puhakka, Chief Executive Officer, Pohjola Asset Management Execution Services Ltd, looks at the development of a Smart Order Router that meets the buy-side’s needs.
In 2011, Pohjola Asset Management Ltd (PAM), Finland’s leading fund manager, launched its Smart Order Routing and Algorithmic Trading Service in partnership with Investment Technology Group Inc. (ITG). The origin of this offering was PAM’s desire to reduce slippage costs when trading equities for their funds. In order to take full control of the routing logic, PAM built its own buy-side controlled SOR (Smart Order Router) with ITG.
What is this Smart Order Router built “by the buy-side, for the buy-side”? It is something different from that offered by traditional brokers: its sole purpose is to provide the right price at the right time.
It quickly became apparent that there is demand for this type of offering from other institutional traders. A separate entity, PAMES, was established to market and further develop the buy-side SOR. PAMES, which is a fully owned subsidiary of PAM, currently offers the SOR to institutional traders and asset managers. Its aim is to work with like-minded buy-side companies and to provide a low cost, highly efficient execution facility that delivers dramatically improved trading performance.
Is PAMES saying that traditional brokers do not always offer superior execution quality for their clients? The answer is: probably. In fact, traditional brokers’ routing logic quite understandably includes the need to create value for themselves. This is not the case for the buy-side SOR which does not, for example, internalise orders, look at trading venue fees or maker/taker refunds and is not conflicted by ownership of alternative trading venues.
When sending an order to a broker, the root of the problem is what happens before the order goes to the venue. What control do we have over the broker infrastructure, including their proprietary flow, internalisation, market making and crossing, not to mention the routing logic? It is noteworthy that whilst many brokers have the ability to connect to multiple venues, our investigation has revealed that a disproportionate percentage of flow is executed in very few venues, which may lead to higher opportunity costs for the buy-side.
In our case, the buy-side SOR user group has control over the SOR routing logic, and the router is designed to avoid conflicts arising from internalisation of flow, the costs of entering certain trading venues, venue ownership and other factors that may affect the quality of execution. As an example, we would not rest an order in an internal venue, nor would we post or leave an order sitting in a venue purely for financial gain. Our SOR seeks and finds liquidity from a number of different venues.
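A deliberately simplified sketch of that non-conflicted principle is shown below: the order is split across venues purely in proportion to displayed size at the best price, with no weighting for fees, rebates or venue ownership. The venue names and quote snapshot are illustrative assumptions, and a production SOR would of course also handle residues, re-quotes and hidden liquidity.

def route(order_qty, venues):
    """Allocate order_qty across venues by displayed size at the best price only."""
    best_px = min(px for px, _ in venues.values())  # best offer for a buy order
    at_best = {v: size for v, (px, size) in venues.items() if px == best_px}
    displayed = sum(at_best.values())
    allocation, remaining = {}, order_qty
    for venue, size in at_best.items():
        qty = min(size, round(order_qty * size / displayed), remaining)
        allocation[venue] = qty
        remaining -= qty
    return allocation  # any residue would re-route on the next market update

quotes = {"XLON": (100.02, 5000), "BATE": (100.02, 3000), "CHIX": (100.03, 9000)}
print(route(6000, quotes))  # splits between XLON and BATE; CHIX is off the best price

Note what is absent from the logic: no fee or rebate terms, no internal venue, no preference for owned platforms – price and displayed size are the only inputs.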
We believe that the traditional brokerage business model cannot be truly compatible with the buy-side’s interests and is thus likely to limit the quality of execution.
You sometimes hear people argue that the brokerage market is very competitive, and thus that those who do not offer best execution would not survive. The brokerage market is competitive, but it is still not even close to being truly competitive, due to the high barriers to entry and the number of uninformed customers.
Having said that, we cannot solely blame brokers for the prevailing situation. There is simply not enough demand for high quality execution because of the buy-side’s inability to measure it. If brokers believe that their customers cannot measure execution quality, they are unlikely to provide it, as it is more profitable not to. Any broker who spends resources and increases costs to provide unrecognised execution quality will be undercut by those who do not. All this leads to a merely acceptable level of execution quality, which, unfortunately, does not equal best execution.
In order to change the present state of the market, the buy-side needs to increase its understanding and become better informed – you simply cannot buy something that you cannot see. We all know how to conduct traditional macro-level TCA, but that is just not enough. Buy-side trading desks need to dig deeper into the TCA numbers and combine traditional TCA with micro-level analysis. Through micro-level analysis we are, for example, able to monitor the quality of our routing, which has an impact on traditional TCA numbers.
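One small sketch of what such micro-level analysis might look like: per-venue fill quality measured against the prevailing best quote at fill time. The fill record layout (venue as in FIX Tag 30, plus a best bid/offer snapshot) is an assumption for illustration – one of many possible routing-quality measures, not a description of our actual analytics.

from collections import defaultdict

def venue_quality(fills):
    """fills: iterable of (venue, side, fill_px, best_bid, best_ask) records."""
    stats = defaultdict(lambda: [0, 0])   # venue -> [fills at the touch, total]
    for venue, side, px, bid, ask in fills:
        touch = ask if side == "buy" else bid
        stats[venue][1] += 1
        if (side == "buy" and px <= touch) or (side == "sell" and px >= touch):
            stats[venue][0] += 1
    return {v: at_touch / total for v, (at_touch, total) in stats.items()}

fills = [("XLON", "buy", 100.02, 100.00, 100.02),
         ("BATE", "buy", 100.03, 100.00, 100.02)]
print(venue_quality(fills))  # {'XLON': 1.0, 'BATE': 0.0}

Aggregated over time, a measure like this feeds straight back into the macro-level TCA numbers it helps to explain.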
Fortunately, over the past few years we have started to see the buy-side become more informed. More buy-side participants have joined the PAMES buy-side SOR user group to work with like-minded companies, leveraging each other’s expertise and thus creating value for one another. In addition, at the end of 2013 we saw the launch of a new buy-side owned alternative trading system in the US, where brokers act only as subscribers. The platform shares the same goals as PAMES, promising to give the buy-side the opportunity to maximise trading at the best available price without unnecessary intermediation. It seems that the buy-side is slowly starting to demand efficient, non-conflicted execution – something that should be self-evident.