Finding The Right Mix: Outsourcing Technology In A Changing Regulatory Environment
A GlobalTrading roundtable discussion.
No firm is capable of managing its entire technology stack in-house; this much was agreed early on at the GlobalTrading roundtable hosted in New York in June to discuss outsourcing and its ongoing role in managing firms' technology. Exchanges are much better able to manage technology internally, but even for them this becomes increasingly challenging as they look to global markets.
The participants at the roundtable represented a range of buy-side firms, sell-side firms and exchanges, each with a different approach to outsourcing technology: what should be managed in-house as a source of competitive differentiation, and what can be commoditised and bought in from a technology provider. That each firm has a slightly different build-versus-buy mix itself enriches the marketplace, and firms settling on different answers to the same question is what keeps competitive differentiation alive. The firms in the room generally agreed that one key measure is the ability to have simple solutions that are easy to integrate into current workflows and that can be customised to fit the outlook and philosophy of the firm.

One buy-side attendee used the analogy of a pyramid of functionality, in which each piece of a firm's technology is either commoditised plumbing or a peripheral piece that adds competitive advantage. The commoditised functions can be bought in, but the competitive pieces at the top of the pyramid need to be developed internally, so the firm ends up moving those boundaries back and forth and arrives at a hybrid technology model. As the elements that confer competitive advantage evolve, and other elements become more standardised, the key is maintaining an evolving relationship with the outsource provider to preserve that flexibility.
One ongoing discussion point was the level of responsibility each participant in the technology chain bears in the event of outages or regulatory scrutiny, from vendor to sell-side through to the ultimate end client. While it was generally agreed that the majority of the emphasis sits with the sell-side, there was an acknowledgement that this emphasis is increasingly shifting to the buy-side, with responsibility for the technology spreading across different participants. Whether this will result in a migration of sell-side technologists to the buy-side remains to be seen, but there is certainly room for development on both sides of the Street to ensure well-integrated and controlled technology.
Building A Buy-side Community
With Jeff Estella, Director of Trading Analytics & Investment Operations, MFS Investment Management
To roll back the clock to October 2013, we were approached by Fidelity Trading Ventures regarding a new platform for trading. They were using some data to think about long-term institutions and the liquidity challenges they have with both traditional exchanges and with the different intermediaries that are in the marketplace now. They were asking if there was potential for an efficient platform without a profit motive to match buyers and sellers – one that could match long-term fundamental investors in a single environment, where it was possible to take the frictional element out of the equation.
Fidelity Trading Ventures approached 20+ institutions. After many months of non-disclosure agreements and meetings, nine firms were able to come together to form the consortium that would become Luminex. MFS got involved because this was focused around an opportunity to reduce the friction in matching buyers and sellers. We have a passion for innovation and change and wanted to help bring that to the market through our involvement in the consortium. Could this potentially help MFS’ clients? The answer is yes, but more importantly, it could help the entire marketplace. This is about the greater good for long term-focused investors.
The first milestone of success is already behind us. We were able to get a significant number of long term-focused institutions to commit their names and their financial resources to this industry consortium, to creating what is essentially a utility. That in and of itself is the first milestone.
The next milestone is building out scale. Luminex continues to follow up with institutions it originally sought out back in 2013, as well as institutions that expressed an interest in participating on the platform after it launched. This system only works if the network is big enough. We need diversified flow in there, with buyers and sellers in the same security at the same time, on an anonymous basis. We have heard the speculation that if all the participants are long term-focused institutions with a buy rating on XYZ security, there could be zero odds of Luminex being able to cross the trade. However, the data shows that there is a diversified set of orders in the system at all times, on both sides of the trade.
We’re creating an environment with like-minded firms that are looking to trade on an institutional block level. They’re more focused on a fundamental view on a security. We think it’s a very attractive proposition if we can put two different long-term market participants together that have a different view on that security at that point in time, with the least amount of friction, relative to other peer-to-peer systems. One would think that those two institutions may consider using an offering such as Luminex.
Changing Metrics
With Jonathan Clark, CEO, Luminex
Increasingly we are seeing the buy-side take more responsibility for its orders – how they’re handled, where they go, who sees them and so on. In order to fully understand this, it is important to step back and consider the origins of Luminex; why it was set up and what it is trying to fix.
Luminex was born out of the question of how we could address market structure issues related to the efficient sourcing of block liquidity, and whether we could help by creating a buy-side utility. The vision was for a buy-side owned and operated platform, driven not by profits but by the notion that the buy-side could offer a more transparent and cost-effective solution by creating a community of like-minded, high-quality firms that shared the same objective. Could they come together and put their flow into a platform to trade with one another, and do it at effectively utility prices?
Ideally it would be self-sustaining and would save execution costs for clients. In addition, there were already frustrations with ‘traditional’ block trading. Traders were having a hard time trading blocks, and they were concerned that when they did send block orders to trade there might be transparency issues and the possibility of information leakage that could make the markets move away from them before they could get their orders filled.
These kinds of transparency-related issues led to the effort to “raise the bar” by creating a platform where the buy-side could trade with increased trust and confidence. Luminex was being founded at the same time that many of the ATSs were facing issues; there had been sanctions, and some had been fined and were effectively forced to close, so there were certainly growing concerns that we felt needed to be addressed from a confidence point of view.
At that time, I was on the buy-side at BlackRock and was part of the initial group that had been approached to go out to the community to find out if there was a consensus about these issues of concern. Nine firms ultimately offered to work together towards solving these difficulties, and committed to funding the initial start-up of Luminex. The platform was announced in January of 2015, and we went live on 3 November 2015. We have been very pleased with the results so far.
Measuring success
There are a variety of ways of measuring the success of any given ATS platform. I know most people would use volume figures from the start, but that isn’t the best way to look at Luminex. It’s important to note that Luminex is open only to asset management firms with more than $1 billion in AUM and a focus on long-term investing. We have a strict minimum trade size of 5,000 shares. We don’t allow broker-dealers or high frequency traders to participate either, so comparing us by looking only at volume numbers misses the point, to a large degree.
I look at different metrics, particularly average print size. Today, our average print is about 32,000 shares, which is a very good number because it means we’re doing what we set out to do – allowing people to trade large blocks. The top quantity that’s coming in (that is, the order that traders are sending to us) often exceeds 150,000 shares per order. So there is certainly a commitment to trade, which is good to see, and we are also pleased with the trend of growth. However, it is also challenging, because changing trader behaviour is very difficult and requires our sales team to be in touch with clients on a daily basis. This means they have to travel to meet with the clients and ensure they are comfortable with the platform. We appreciate that this will not happen immediately, and that it will take time to develop further.
What we are doing is trying to change the conversation we have with firms. Firms that expect immediate volumes or instant success, those that ask “I sent an order, why didn’t I get it filled immediately?”, will have a more challenging experience on the platform. Those firms that understand what we are trying to achieve will have the greatest success: the firms that send in multiple orders a day and rest their orders for a good duration of time are ultimately the ones that will get the most out of the platform.
A firm that sends in one order per day, or maybe one a week in some esoteric name, will have a much less positive experience. It is our job to convince them that if they want to improve their chances of a hit, they should change that pattern of behaviour, and that just takes time and patience. Our Board recognises that this is going to be a long-term build, and we’re funded that way.
We have found that, on average, over 50% of the prints that occur here end up being a top five print on the day in a given security. This is a good way of looking at a metric that people most often don’t think about. It is a wider conversation than just headline volume figures and ultimately, we want to trade large blocks that are meaningful. After only six months, to have half of our prints be in the top five prints for the day means we have had an immediate impact and we are doing what we promised we would do for the sourcing of block liquidity.
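As a rough illustration (not Luminex’s actual methodology), a metric like this could be computed from a day’s consolidated prints along the following lines, with hypothetical field names:

```python
from collections import defaultdict

def top_five_print_share(prints, venue="LUMINEX"):
    """Fraction of a venue's prints that rank among the day's top five
    prints (by size) in their security. `prints` is a list of dicts
    with illustrative keys 'symbol', 'venue' and 'size'."""
    by_symbol = defaultdict(list)
    for p in prints:
        by_symbol[p["symbol"]].append(p)

    ours = in_top_five = 0
    for day_prints in by_symbol.values():
        top5 = sorted(day_prints, key=lambda p: p["size"], reverse=True)[:5]
        for p in day_prints:
            if p["venue"] == venue:
                ours += 1
                if p in top5:
                    in_top_five += 1
    return in_top_five / ours if ours else 0.0

day = [
    {"symbol": "XYZ", "venue": "LUMINEX", "size": 32_000},
    {"symbol": "XYZ", "venue": "EXCH", "size": 400},
    {"symbol": "XYZ", "venue": "EXCH", "size": 45_000},
]
print(top_five_print_share(day))  # 1.0: the single Luminex print is a top-five print
```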
I’m not prepared to denigrate the continuous market. In most market centres around the world it is a very efficient means of executing order flow, but primarily for smaller lots. With that in mind, firms (especially bigger institutions) that need to move larger pieces of stock are going to need other pools to do so, because the open market isn’t really the most efficient place to execute large blocks of stock.
What we are trying to argue is that we are a clean, healthy and effective platform for traders to consider when looking to trade blocks. There are of course other ATS providers that do offer block services, but we are trying to distinguish ourselves in a variety of ways. As I mentioned earlier, our pool, in effect, is solely the buy-side community. Our platform comprises investors with a long-term investment horizon; proprietary flow, market makers, HFT and the like are not in our pool.
Other pools have a number of different participant types, while we are trying to maintain a very tight-knit community where people can feel comfortable trading. However, I’m not going to dismiss the equity markets and their complexity, because again, they are a very effective way of trading smaller orders in a continuous fashion.
Our average order size is in stark contrast to that of many ATS operators. Their volume statistics – average trade size, average order size – look a lot like those of an exchange, at around 300 shares vs. our 32,000 shares. This is a completely different model from many of the broker-dealer sponsored dark pools.
Ultimately, however, traders have to come equipped with the right mind-set. It is our job to remind those who are only concerned with volume figures of the full story, so that they can re-engage and continue to move along the path towards success.
Expansion
We have a number of European clients who trade US products from European soil, and we are continuing to expand our focus in this area; I hope to see that number continue to grow over time. The reception has been very positive so far, and we are going to spend some more strategic time in that market to explain the benefits of trading through Luminex.
During those trips to Europe and when speaking to global managers here in the US, we are being asked when we will launch a European equivalent, whether in the UK or mainland Europe, or possibly even Asia or Latin America. We definitely see an opportunity to do more with our platform over the long term, whether in foreign equity markets or in other asset classes, but those are longer-term opportunities. In the short term, the Board and our team are focused on getting Luminex established as a consistent, successful platform here in the US. Getting it right here is what’s critical and, six months in, we are all focused on smoothing out the wrinkles you deal with in any start-up and continuing to build our customer base. If we demonstrate success here, we’ll keep an open mind about what other geographies or products make sense for expansion.
Making Liquidity Work
By Clive Williams, Global Head of Equity Trading, T Rowe Price
We joined Luminex partly because there are a large number of ongoing regulatory challenges around dark pool trading, and we needed a new venue to trade how we wanted. One difficulty we see is that dark pools are not really dark; they are more grey. The one thing that we have always argued is that dark pools should be for blocks, and there should be no leakage of information. That is where Luminex can come in and be advantageous: we have a minimum fill size of 5,000 shares and we have price improvement. It ticks all the boxes for us and gets back to that more instinctive feel of what a dark pool should be.
However, the challenge for any new venue is attracting enough initial users. Luminex does not have the client base that, for example, Liquidnet has. They have 450 and we’re closer to 150. Integrating new firms and their execution management systems takes time, and building that critical mass is a big challenge.
But the growth will occur, because much of the reasoning behind Luminex comes back to how firms want to use a dark pool. Firms should feel comfortable resting an order in a dark pool. If a desk is just pinging a pool, the chances of actually hitting a contra order are very small. The system is only going to work if people are comfortable putting flow in there and letting it rest. That’s the problem with the market as it currently stands: it has become so short-term that people aren’t using dark pools properly; they want continuous execution, and it is a challenge to change that mindset.
We would rather that a trader with liquidity that isn’t doing anything rest that order in a pool where there is the potential to find a contra and do a block-size trade, rather than just leaving it on the blotter doing nothing. Luminex is building out the client base that shares this approach; we have been pretty clear as to who we will and will not accept on the system.
This is partly why Luminex is purely focused on the US right now as well. We have to get the US right; we have to learn to crawl before we can walk. Given all of the regulatory changes in Europe – the caps on dark pools within MiFID II, for example – much remains unclear even as the implementation deadlines rapidly approach.
I understand that a firm that is a European asset manager, or that trades European equities, wants more options to execute block-sized trades, because the options in Europe are fewer than in the US. Hence the desire for Luminex to move into those markets, but it’s a matter of getting the capacity right in the US first; then we can transport a more finished model.
Obviously there is a large regulatory drive in the US around transparency and routing, and around the buy-side gaining a better understanding of both. The one problem the buy-side may have overlooked is that if we ask for the data and we get it, we need to put in the effort to understand what that data is telling us. It’s not just a box-ticking exercise. The buy-side is going to have to make some investment to understand its trade data and how to use it effectively – for example, how smart order routers operate – and a deeper understanding of market structure follows from that.
Welcoming Asian IOI Reform
By Canute Dalmasse, Head of Execution Services Asia Pacific, Goldman Sachs
In May this year, the Asia Trader Forum (ATF), a buy-side traders’ consortium, released its framework and guidelines for an IOI Code of Conduct and Best Practices. In addition to that framework, George Molina previously noted in this publication that “the problem with IOIs has always been the lack of transparency because of the perceived interpretation that brokers sometimes were fishing with IOIs – that they weren’t real”. At Goldman Sachs we believe the focus for IOIs should be on quality, not quantity, offering a genuine source of contra-side liquidity for clients. There should be clarity and tradability in all IOIs sent, ultimately incentivising clients to respond with the confidence that they will find the liquidity they seek.
The release of industry best practices is an important development for the viability of IOIs in Asia, and these guidelines should be wholly embraced by all market participants. Systematic use of qualifiers on all messages sent – and specifically, use of the ATF-recommended methodology to identify when an IOI should be considered natural or non-natural – is of utmost importance. In our view, the sell-side should work collectively and invest in the infrastructure to implement the guidelines for market best practices over an IOI’s lifespan. If the new framework is collectively embraced, IOIs will become an ever more important tool for helping institutional clients and their end investors source real liquidity, reducing market impact and trading costs.
Proposed reforms
- Scenarios: The ATF’s published ‘IOI Code of Conduct and Best Practices’ outlines four scenarios under which potential IOIs can be classified. The key distinction across these scenarios is whether an IOI should be flagged as natural or not. Industry practices have varied to date on what constitutes a natural IOI with differing interpretations on the types of flows that qualify. The new code makes a distinction between agency flows and facilitation unwinds of actual positions that can be flagged as natural, versus facilitation risk hedges and unrelated facilitation initiation flows that should not. By making these distinctions, the ATF Code of Conduct tries to draw clear lines as to what can acceptably be considered as natural going forward.
- Order Qualifiers: Enabled by advances in trading technology, the ability to seek and match against appropriate liquidity flows has become a common feature used by the buy-side. It is for this reason that the guidelines also specify data points to be set on individual IOIs that assist in the trading decision process. In addition to natural / non-natural tagging, IOIs should display price limits, order size and type, and trading instructions (e.g. Limit, Working with More Behind) using industry-standard tag values. The goal of qualifiers is to help clients interact more effectively with natural liquidity (a sketch of such a message follows this list).
- Best Market Practices: Underpinning the ability of buy-side firms to trust that IOIs are accurately described and properly worked is a series of 15 best market practices. These can be broadly grouped into execution controls, operational features and the care of client information, as outlined below.
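To make the qualifier discussion concrete, here is a minimal sketch of a FIX 4.4 IOI carrying the data points described above, using the standard IOINaturalFlag (tag 130) and IOIQualifier (tag 104) fields. The symbol, ID and values are hypothetical, and this is not a description of any broker’s actual configuration:

```python
SOH = "\x01"  # standard FIX field delimiter

def make_ioi(symbol, side, qty, price, natural, qualifiers):
    """Build a minimal FIX 4.4 IOI (35=6) body carrying a price limit,
    size, natural flag and qualifiers. Illustrative sketch only."""
    fields = [
        ("35", "6"),                       # MsgType = IOI
        ("23", "IOI-0001"),                # IOIID (hypothetical)
        ("28", "N"),                       # IOITransType = New
        ("55", symbol),                    # Symbol
        ("54", side),                      # Side: 1=Buy, 2=Sell
        ("27", str(qty)),                  # IOIQty
        ("44", f"{price:.2f}"),            # Price limit
        ("130", "Y" if natural else "N"),  # IOINaturalFlag
    ]
    fields += [("104", q) for q in qualifiers]  # IOIQualifier: L=Limit, M=More Behind
    return SOH.join(f"{t}={v}" for t, v in fields) + SOH

# An agency natural with a limit price and more behind:
print(make_ioi("0005.HK", "1", 100_000, 98.40, True, ["L", "M"]))
```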
Execution controls
Interaction with IOIs should be managed via access controls, with appropriate flow segregation and clients grouped on a tiered basis. Access to the IOI ecosystem should be two-way: for instance, a counterparty that does not allow its own orders to be advertised via IOIs should not be permitted to view others’ IOIs. To maximise transparency, brokers should also offer clients the option to reflect their own IOIs back to them. We believe these controls are best implemented through flow automation, allowing audits against the active orders from which the IOIs are generated and removing the risk of human error. Additional functionality or policy should ensure that an “in touch with” categorisation is only used where a client’s continued interest to trade has been confirmed, and not by default.
Operational features
Execution workflows may need to be modified to support the new best practices; however, there are valid concerns underlying the buy-side’s need for more robust functionality. For example, a commonly cited issue is that an IOI becomes stale. To ensure accurate and up-to-date information, refresh times on IOIs should be less than 15 minutes. Use of qualifiers should be systematic and standardised to aid consumption. If a broker cannot populate the designated FIX tags, their systems should reflect the same qualifiers in a comment field.
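One plausible way to enforce the sub-15-minute refresh guideline is a simple staleness guard of the kind sketched below; this is a hypothetical illustration, not any production broker system:

```python
import time

MAX_IOI_AGE_SECONDS = 15 * 60  # guideline: refresh IOIs in under 15 minutes

class IoiStalenessGuard:
    """Tracks when each IOI was last refreshed against its parent order
    and flags anything that has gone stale for cancellation or refresh.
    Purely illustrative; a real implementation would hook into the
    broker's IOI engine."""

    def __init__(self):
        self._last_refresh = {}  # ioi_id -> epoch seconds

    def on_refresh(self, ioi_id):
        self._last_refresh[ioi_id] = time.time()

    def stale_iois(self):
        now = time.time()
        return [ioi_id for ioi_id, ts in self._last_refresh.items()
                if now - ts > MAX_IOI_AGE_SECONDS]
```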
Care of client information
The very nature of IOIs is a passing of information, and as such brokers should exercise appropriate care and control. Systems should enforce minimum and maximum IOI sizes with respect to a given stock’s ADV, and further tailor quantity as appropriate to each client or client tier. In addition, brokers should consider the time value of the information conveyed and its importance to each client tier.
Why this matters to the sell-side
- Clients: With the release of the Code of Conduct and Best Practices memorandum, our clients have spoken on what they expect from the sell-side. The sell-side now has clear guidelines in the absence of a standardised regulatory framework. Many sell-side firms are already operating under some or all of the new best practices, making harmonised implementation within reach.
- Change: With change comes opportunity. As confidence levels grow on the buy-side there is potential for increased usage of IOIs as a liquidity sourcing tool. Brokers that improve their execution capabilities by adopting these measures will be well positioned to capitalise on these changes.
- Value-add: It is well known that the buy-side continues to consider their brokerage relationships against the backdrop of an ever evolving regulatory landscape. Brokers that can adapt to market standards and operate on a best-practice basis will have a value-add above those that do not.
Industry support and adoption
- Sell-side support: In order to realise the full benefits of increasing IOIs as a buy-side liquidity channel, the sell-side community should also come together to adopt these new standards. The members of the Asia Trader Forum have worked hard to map the best practices. Sell-side industry groups should now work to ensure their members understand this framework, with the goal of a sensible implementation in a reasonable timeframe.
FIX Trading Community – Half year 2016 report
By Tim Healy, Global Marketing and Communications Manager, FIX Trading Community
When I was writing a preview of the year ahead for 2016, I wrote that the year was likely to be busy and volatile given how January was unfolding. The year has flown by, and for the FIX Trading Community it has been marked by a number of successful events, new working groups, ongoing work on initiatives and increasing interest from the marketplace in the work that members have undertaken.
London, New York and Hong Kong have provided the locations for three tremendously successful events. Close to 2,000 delegates attended, proving that educational and networking events are core to the community. Technology and innovation are major themes for the membership, and the challenges of cybersecurity, the potential of blockchain and the burden of addressing regulation are not just regional issues. Looking ahead, Mumbai, Stockholm and Chicago will be the venues for FIX events in September, whilst October will see FIX conferences held in Tokyo and Sydney. The community truly does span the globe, and our events are organised by the members, for the members.
The FIX working groups have been convening regularly. As a colleague of mine always says, “this is where the magic really happens”. The true value of being involved in the community can really only be seen or heard on these calls, where the different market participants meet to discuss challenges and issues in the marketplace. Below I will cover some of the highlights of the year to date. The Global Post Trade Working Group is now in the final stages of producing further guidelines for the use of FIX in post-trade processing for a number of different asset classes, including equity swaps and options. The original equities guidelines were produced in 2012 and have been enhanced since then following feedback from members. This latest stage of the initiative is important as the use of FIX becomes truly multi-asset, both at-trade and post-trade.
The Cybersecurity Working Group has become extremely active in 2016, and the group members have concentrated efforts on encryption methods – most relevantly Stunnel encryption and router-to-router hardware encryption – as well as key storage and key management. Headlines on the threat and reality of attacks have been numerous recently. As discussed elsewhere in this publication, communication and collaboration will be beneficial as FIX members address the ongoing threats.
The Global Buy Side Working Group saw a changing of the guard as three new co-chairs took up their positions in EMEA and a new co-chair for the Americas took over to continue the work of their predecessors. The award-winning IPO Initiative continues to make strides and gain attention from both the market and regulators. Earlier in the year, the Financial Conduct Authority (FCA) in the UK published its interim conclusions on investment banking, and the work done by FIX to create a more efficient, less error-prone allocation process has not gone unnoticed.
Additionally, the execution venue initiative has moved out from under this group. Brokers tell us that their clients are asking for increased granularity in the data, so opening this initiative to the broader FIX membership is key to understanding the issues across the board and making our documentation reflect what we face in these markets. The expanded group has just recently launched.
While on the subject of interaction with regulators, it would be remiss not to mention the work done by the different MiFID subgroups that have been in existence for just over 12 months. Although the timeframe for the implementation of MiFID II has changed, the work being done has continued in earnest. Late in 2015, the MiFID Clock Synchronisation Working Group’s initial goal was publicised: enhancing the FIX Protocol to support higher timestamp resolution to enable market participants to be MiFID II compliant. This proactive enhancement highlighted FIX’s ongoing commitment to assist the market with the changing regulatory environment.
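As a flavour of what higher-resolution timestamps involve, the sketch below formats a FIX-style UTC timestamp at microsecond precision; the helper and the extended format string are illustrative assumptions, not the working group’s published specification:

```python
from datetime import datetime, timezone

def fix_utc_timestamp(dt=None, precision=6):
    """Format a FIX-style UTCTimestamp (e.g. tag 60, TransactTime) with
    sub-second precision, here microseconds. Illustrative sketch of an
    extended 'YYYYMMDD-HH:MM:SS.ffffff' format."""
    dt = dt or datetime.now(timezone.utc)
    frac = f"{dt.microsecond:06d}"[:precision]
    return dt.strftime("%Y%m%d-%H:%M:%S") + "." + frac

print(fix_utc_timestamp())  # e.g. 20160701-14:30:05.123456
```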
Additionally, the real value of the FIX Trading Community can be seen in the crossover work being done in a number of the working groups. The MMT Steering Committee is now pushing forward to produce a fully MiFID II-compliant version of the MMT standard, which came under the FIX umbrella in early 2014. Together with the MiFID Transparency Working Group, the aim is to ensure the regulatory technical standards (RTS 1 and 2) are adhered to for pre- and post-trade transparency.
The Best Execution Working Group has worked through a number of different scenarios to provide guidance on reporting obligations for the buy-side, sell-side and execution venues, and on who will be required to report. This is no easy task given the number of variables per asset class. Given the scope of both asset coverage and market participants, the working group is also liaising with the reinvigorated TCA Working Group and trade associations to promote its work and ensure no duplication of effort. As the year progresses, it is likely there will be further MiFID working groups initiated as the market looks to the community to provide a competition-free platform to discuss the challenges confronting it.
With regard to new initiatives, a new technical working group started this year to set a standard mapping FIX semantics to JavaScript Object Notation (JSON) encoding. The JSON Encoding Working Group is now making good progress toward providing a JSON encoding of FIX. In recent years, JSON has become a dominant representation of information for the API economy and internet-based services.
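As a toy illustration of what such a mapping involves, the sketch below re-encodes a tag=value FIX body as JSON using a tiny, hypothetical subset of the FIX dictionary; the working group’s actual encoding may well differ:

```python
import json

SOH = "\x01"
TAG_NAMES = {"35": "MsgType", "55": "Symbol", "54": "Side",
             "38": "OrderQty", "44": "Price"}  # tiny subset, for illustration

def fix_to_json(raw):
    """Re-encode a tag=value FIX body as a JSON object keyed by field name."""
    pairs = (field.split("=", 1) for field in raw.strip(SOH).split(SOH))
    return json.dumps({TAG_NAMES.get(tag, tag): value for tag, value in pairs})

msg = SOH.join(["35=D", "55=VOD.L", "54=1", "38=5000", "44=231.15"]) + SOH
print(fix_to_json(msg))
# {"MsgType": "D", "Symbol": "VOD.L", "Side": "1", "OrderQty": "5000", "Price": "231.15"}
```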
One of the most promising areas of innovation has come from the FIX Orchestra Working Group. Orchestration is the technical term for describing the behaviour of a protocol. There have been frequent calls for machine-readable rules of engagement that can describe message flows, message structure under various usage scenarios, conditionally required fields, and the conveyance of state. At the behest of Jim Kaye, the Global Steering Committee Co-chair, and John Greenan, a long-time industry consultant, the FIX Orchestra Working Group was formed. The working group is set to submit Release Candidate 1 to the Global Technical Committee for approval. The goal of FIX Orchestra is to improve operational efficiency, reduce the time and improve the effectiveness of testing a FIX service, and reduce the time it takes to certify trading partners. In short, the hope is that the FIX Orchestra initiative will reduce the cost of FIX ownership. The FIX Orchestra effort triggered a number of proposed changes to the FIX 2010 Repository structure, and as a result a 2016 Repository edition will be released in Release Candidate format shortly after the Release Candidate for FIX Orchestra.
The rest of 2016 will no doubt be incredibly busy. As I write this, the uncertainty surrounding financial markets across the globe has caused volatility to increase massively following the UK’s decision to leave the European Union. Geopolitical tension across the region could herald a period of anxiety in the markets. From a FIX perspective, it will be important that we keep a business-as-usual attitude. Our members value the organisation’s independence and neutrality. There may well be changes in the direction of regulation and, as an organisation, we will continue to work with regulators across the globe to ensure that we address any changes appropriately.
Staying Ahead Of The FX Technology Curve
With Christopher Matsko, Head of Foreign Exchange Trading Services, Portware
I joined Portware in August last year and before long I started digging into the platform and its operation. It became clear that Portware had sound technological know-how: it had all of the advanced execution capabilities for equities and futures, and had applied those execution tools to its FX trading platform, creating a premier multi-asset EMS. Without getting into material specifics, the result was growth in FX, both in clients and in interest in the solution. What Portware needed to bring this momentum to fruition was deeper familiarity with the underlying FX business landscape, to better anticipate client needs and distil various streams of feedback into viable products.
The FX module within our EMS was originally fit for purpose for macro hedge funds and firms with prime brokers. It was also possible to key into Portware’s API and carry out automated algo trading and RFQ streaming programmatically, from client-proprietary black-box models. By executing in this fashion, with a prime broker handling the settlement details, a firm is freed from worrying about the operational element of the trade.
However, the decision was soon made that the platform needed to expand its client reach and become ‘stickier’ as a true multi-asset enterprise platform, so we took a strategic position to focus on our core client base, namely Tier 1 asset managers. We then talked to these clients to see how we could help them solve their execution problems, and we found that the limitations in execution were purely the result of an operational gap in the real-money FX community: front-office traders didn’t focus on the complex workflows in the middle and back office. What was needed was a facilitation layer underneath the execution layer.
That multi-allocation space is very complex and takes considerable effort to get right. Splitting a 30-allocation order into five tranches with various execution venues and liquidity pools is difficult, and there is opportunity in the market for platforms that can help simplify and streamline this process.
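The bookkeeping at the heart of that problem can be sketched with a toy pro-rata split; the account names and rounding policy are assumptions, and a production system would also have to handle residuals, venue constraints and credit lines:

```python
def split_tranche(allocations, tranche_qty):
    """Pro-rata attribution of one executed tranche back to the order's
    underlying allocations. `allocations` maps account -> ordered qty."""
    total = sum(allocations.values())
    return {acct: round(tranche_qty * qty / total)
            for acct, qty in allocations.items()}

# A 30m block across three accounts, executed in 10m tranches:
allocs = {"FUND_A": 15_000_000, "FUND_B": 10_000_000, "FUND_C": 5_000_000}
print(split_tranche(allocs, 10_000_000))
# {'FUND_A': 5000000, 'FUND_B': 3333333, 'FUND_C': 1666667}
```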
The biggest challenge we face is the complex nature of the market place: marrying the OMS and its rules, allocations vis-à-vis the banks’ side of the trade, and blocking and splitting out those trades into the right allocations. Our platform is FIX-based but we have to do all the development work to tie the layers together between the banks and the buy-sides. Much of this work is custom to a given client or bank, but this gives us flexibility too.
From a technological perspective we are ahead of the curve in equities with artificial intelligence running automated executions, but this level of automation (interweaving AI execution capabilities) hasn’t yet been applied to real money FX, as many firms don’t envisage AI-assisted FX execution on a big asset manager’s desk. In my opinion, these institutions don’t yet have the same level of trust in machine execution given the inherent relationship-driven nature of the FX market and that’s perfectly natural. Some firms are very progressive and tech-driven and ahead of the curve, but some aren’t and we need to balance that.
The problem of allocations sits behind much of our technological development; taking a large order and slicing it into an algo trade, an RFQ trade and a bank-stream trade, and then tying them all back to their original allocations, is a considerable challenge. We are working toward a solution to that problem. It will be very interesting to see how things progress once these more advanced execution methods are taken up by the real-money buy-side community.
Another area of development concerns fixing trades. Fixing trades have evolved from a very manual process to one that can be smartly automated. Automating fixing trades gives traders more freedom to focus on and manage other large risk positions in their book. In other words, automation helps eliminate noise and distractions from trades which might otherwise be considered a “nuisance.”
Even though there have been serious issues around the WMR fix, it is still being used to get large block positions executed at a defined mid-rate (plus commission). Unless a firm is focused on generating alpha or any type of return on those executions, the best way forward is simply to automate those trades; quite often asset managers don’t realise the fixing trades are already going into algos as it is. All Portware needs to know are the times, the bank pools and the rotations, and fixing trades can all be done automatically.
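A minimal sketch of what that automation might look like, assuming a simple round-robin rotation across a bank pool ahead of a benchmark cut-off; the bank names, cut-off time and rotation-only policy are all hypothetical:

```python
from itertools import cycle

class FixingTradeRouter:
    """Rotates fixing trades across a bank pool ahead of a benchmark fix
    (e.g. the 4pm London WMR). Hypothetical sketch only."""

    def __init__(self, banks, cutoff="15:45"):
        self._rotation = cycle(banks)
        self.cutoff = cutoff  # submit before this local time

    def route(self, symbol, qty, side):
        return {"bank": next(self._rotation), "symbol": symbol,
                "qty": qty, "side": side, "submit_by": self.cutoff}

router = FixingTradeRouter(["BANK_A", "BANK_B", "BANK_C"])
print(router.route("EURUSD", 25_000_000, "BUY"))
print(router.route("GBPUSD", 10_000_000, "SELL"))
```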
Trends in the market
Bank algos present an interesting issue, with a diversity of opinions among the larger buy-side firms. Some buy-sides advocate the usage of bank algos as they feel they get a stable rate across diverse liquidity and can hit benchmarks, whereas others say they don’t want to give the bank too much information as the banks ‘can do whatever they want with it’. So we have to provide access to both solutions, and the challenge is in finding the balance.
Given the relative unease or mistrust in the way trade information is handled by the market participants who control the most FX volume, Portware began to consider possible solutions to that problem. We proposed diversified liquidity with minimal information leakage. As an example, we allow clients to formulate their own unique electronic communications network (ECN), combining relationship-based RFQ and streaming liquidity with access to third-party anonymous ECNs (think FastMatch / Hotspot) as well as non-bank LPs. A large asset manager can then cut a large position into smaller slices and work them into various liquidity channels at their discretion; this helps control information leakage and improves executions, which is as much about taking control of that information as it is about transparency and impact on the market. The best way to control market impact and information leakage is to control how much information a firm divulges to a liquidity pool. The ability to discreetly interact with liquidity channels is the same as the ability to control the flow of information into the marketplace.
The traditional RFQ model is in slow decline because traders want more discretion in trade execution rather than just clicking a button and leaving execution to a third party’s black box. Trades can now be automated according to a given desk’s rules of engagement, and traders can focus on the less liquid, more difficult trades on their blotters. The ongoing trends in FX are a natural evolution of the ‘more with less’ process and automation trends across the wider industry. Firms are being forced to narrow their specialisation into core businesses, and they need the rest of the process to be as automated as possible.

The issue of FX TCA has been around for a long time. In a fragmented marketplace there is ongoing debate over what TCA actually means, and because there is no “volume-at-price” data in the FX market (or other OTC markets), benchmarking and analytics run the risk of becoming more subjective. Despite this risk, the best way of finding out the cost of trading is still to include a firm’s own data set in any analytics, in addition to third-party market and reference data. This is as true in FX as it is in equities and any other asset class. There are challenges specific to third-party data in foreign exchange (e.g. validating execution size, credit relationships, anonymous vs. relationship-based executions), but we should not throw the baby out with the bath water. It is always best to use a healthy mix of market and reference data alongside proprietary execution and benchmark data: the former gives the firm colour from the marketplace, while comparisons to its own mid-rate and its own reports are best for measuring improvements over time.
So, for example, by combining the results of recently executed trades with some historical data, a firm can see which banks gave optimal spreads at a particular time of day in its traded currencies, based on the execution method: RFQ, algos or streaming. The next logical step is to take those inputs, apply pre-trade analytics given current market conditions, and put a smart order router (with an AI wrapper) on top to execute a trade at the lowest cost given the characteristics of the order. With a good data feedback loop, a trader can more intelligently feed trades into the market given their size and the market liquidity conditions.
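As a toy example of that feedback loop, the sketch below computes the average paid spread per bank per hour of day from a firm’s own fills; the field names and the fixed pip size are illustrative assumptions:

```python
from collections import defaultdict
from statistics import mean

PIP = 1e-4  # fixed for simplicity; varies by currency pair in practice

def spread_by_bank_hour(fills):
    """Average paid spread (in pips) per (bank, hour) bucket. Each fill
    is a dict with illustrative keys 'bank', 'hour', 'price' and 'mid'
    (the reference mid-rate at execution time)."""
    buckets = defaultdict(list)
    for f in fills:
        buckets[(f["bank"], f["hour"])].append(abs(f["price"] - f["mid"]) / PIP)
    return {k: mean(v) for k, v in buckets.items()}

fills = [
    {"bank": "BANK_A", "hour": 9, "price": 1.10052, "mid": 1.10050},
    {"bank": "BANK_A", "hour": 9, "price": 1.10063, "mid": 1.10060},
    {"bank": "BANK_B", "hour": 9, "price": 1.10071, "mid": 1.10065},
]
print(spread_by_bank_hour(fills))  # approx {('BANK_A', 9): 0.25, ('BANK_B', 9): 0.6}
```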
The future of the FX market will incorporate automation, artificial intelligence, and TCA, much like what is under way in equities and other asset classes. It’s about creating a common thread over the entire life cycle of a trade, and of trading holistically, so that the results of one trade inform the next, and a firm can constantly tweak and improve execution. Nobody is quite there yet but change is coming, and this is where the future of cutting edge development lies.
MiFID II Implementation: Hurry Up And Wait
By Chris Tiscornia, President and CEO of Westminster
As members of the European financial community attempt to develop a better understanding of the far-reaching consequences of the recent “Brexit” referendum for their business structures, many investment management firms are simultaneously preparing for changes to their research processes that will result from the implementation of the European Commission’s Delegated Acts for the Markets in Financial Instruments Directive (MiFID II) in January 2018. While the investment research industry may see more changes in the next two years than it has in the last two decades, the most important word here is “MAY.”
Since the Delegated Directive was published in April, many firms have been studying what modifications and enhancements to their infrastructure and workflow will be needed to comply with the regulatory requirements. As we await further guidance in the form of consultative papers from the respective European regulators, it is important to reconsider the potential impact that an increasingly complex and burdensome regime can have on the industry’s competitive landscape.
Research Payments
The EC Directive determined that the provision of research is not an inducement if:
- It is paid for directly with the investment firm’s own funds; or
- It is paid from a separate research payment account (RPA) controlled by the investment manager and funded via:
  - Specific research charges to the investment firm’s clients; or
  - Commission sharing agreements (CSAs) where a specific research charge is not linked to execution volumes or values.
Additionally, upon request, investment firms will be required to provide clients with information about the budgeted amount for research and a summary of providers paid, services consumed, amounts paid, benefits received and the total amount spent in comparison to the budget.
Regulatory Considerations
The establishment of procedures to improve transparency over the cost and value of research is in the best interests of investors. Furthermore, a more rigorous research valuation and budgeting process creates an increased level of accountability and should theoretically create a more efficient marketplace that allows the most impactful research inputs to be paid commensurately.
The challenge for regulators and industry participants alike is to foster a more transparent research procurement process without creating an administrative and reporting environment that is exceedingly complex and onerous. To the extent that reporting requirements are more burdensome and costly when research is funded with client assets through an RPA, some investment managers will be incentivised to pay for research directly with their own funds. Already, a number of firms have indicated that they will consider paying for research services themselves.
Investment Management Considerations
Given that most businesses need to control their cost structures in order to remain competitive, there is a material risk that smaller and mid-size asset managers will be disadvantaged if the industry moves towards a standard of direct funding for investment research.
While the largest asset managers have adequate scale and internal resources to absorb the cost of directly funding a research budget, smaller and mid-size managers would be hard-pressed to compete in the long term. A lack of resources would put these managers at a distinct competitive disadvantage in terms of their ability to price their product competitively, attract investment talent, access a necessary range of research, and ultimately manage their clients’ investments.
Research Considerations
As we have said before, a vibrant and competitive marketplace for objective investment research is an essential foundation to the capital markets. Investment managers with access to more quality information should be positioned to more effectively allocate their clients’ capital.
The investment research industry is in a dynamic period of evolution. Our clients continue to enhance their research processes through the cultivation of new research inputs comprised of proprietary information and data that did not even exist several years ago. Many of these services are introduced to the marketplace by third-party, independent research firms whose existence is dependent on the use of commissions as a currency. It is safe to say that emerging independent research firms would have a more challenging time launching in an environment where they need to compete for payments from an investment manager’s own funds.
Similarly, boutique broker-dealers that cover niche industries will have a harder time supporting their capital markets businesses. For instance, a boutique broker-dealer focused on investment banking for emerging health-care companies would have a difficult time managing its investment banking business without some level of support from its research and trading businesses. A further byproduct of such a scenario would be the negative impact it would have on the emerging companies’ ability to obtain financing and come to market.
Final Thoughts
A vibrant, orderly and effective marketplace is inherently made up of a broad range of ‘buying’ and ‘selling’ participants. In the case of investment research, these include various types and sizes of investment managers, proprietary trading operations, investment banks, bundled sell-side brokers, execution only brokers and independent research providers. It is clear that the EC Directive recognises the need for a variety of research payment models to support this broad range of industry participants and it is vital that Member States carry this recognition through consultation and implementation.
In the first half of 2015, industry participants were genuinely concerned that the forthcoming MiFID II guidelines could be interpreted to define investment research as a form of inducement. Many constituents – regulators, politicians, investment managers, investment research providers and others – cautioned that such a dramatic overhaul of industry practices that have worked well for over forty years would trigger many unintended consequences throughout the European economy.
While many in the industry breathed a sigh of relief when the EC’s Directive allowed for the use of trading commissions to fund an RPA, there is now a genuine risk that the level of scrutiny and the cost of managing the budgeting and reporting process will be so burdensome that larger firms with more resources will choose to pay for research directly using their own funds. That, in turn, will put increasing pressure on smaller asset managers with fewer financial and other resources to follow suit. We are hopeful that regulators recognise these issues and foster an environment that is not unnecessarily taxing to smaller asset managers, ultimately leveling the playing field for investment managers of all sizes to acquire research from many different sources.
Announcement: Project Sentinel
PROJECT SENTINEL COMPLETES MIFID II TECHNOLOGY SPECIFICATION.
Project Sentinel, the bank collaboration to mutualise the cost of MiFID II implementation in the OTC front office, has completed the detailed business requirements that meet MiFID II regulatory obligations.
London, New York, Thursday 28th July 2016: Solving the MiFID II challenge requires a significant amount of human resources and technology investment that, if undertaken on an individual basis, could prove very costly with only limited, if any, competitive advantage. Project Sentinel’s collaborative approach enables market participants to pool their resources by investing in a standards-based, strategic, MiFID II-compliant technology solution that will automate the regulatory business processes for the front office, delivering that functionality with maximum efficiency and minimum commercial risk.
The group continues to focus upon:
- Reducing costs and risks through mutualisation of analysis, regulatory interpretation and implementation
- Using economies of scale to create a competitive, market-leading, best-in-class solution
- Ensuring dedicated, ring-fenced resources working for the project
- Maintaining an up-to-date reference data model and regulatory consensus
The banks have now defined a set of MiFID II-compliant workflows. These articulate the detailed changes required to the trading flow between the client, the salesperson and the trader. There are 2,700 permutations of interaction types, client types, trading models, venue types and instruments; these have been combined into 7 key workflows that need significant change. Alongside these flows, the banks have constructed a logical architecture that details the various components required and their interfaces with a typical bank infrastructure, as well as a specifications document from which vendors will be able to understand the functional and non-functional requirements. In addition, the consortium has updated the reference data model defined in Phase 1 to reflect further details issued by ESMA, and has added the reference data required to support EMIR. This last piece was to demonstrate the flexibility and potential of the model to encompass other regulations.
All of the deliverables in Phases 1 and 2 were created in two blocks of seven weeks, using dedicated resource from Etrading Software and key personnel from the member banks. This collaboration has meant the banks can share both the direct cost of Etrading Software and the indirect cost of the additional internal resources needed to support MiFID II programmes.

Stephane Malrait, Global Head of eCommerce for Financial Markets at ING Bank N.V. said: “Sentinel is a great collaboration project between banks to document the regulatory requirement around MIFID II and propose a technical implementation solution to reduce cost for its participants. ING participation helped on the creation of a workflow solution in line with ING commitment toward innovation for its client base.”

Sentinel is now focused on assessing possible vendors to deliver the solution. Once the vendor assessment is complete, the banks will decide upon the final technical architecture and the implementation streams for the various components. This will also include consideration of the buy-versus-build options and the interfaces into the banks’ existing trading platforms – ensuring that, wherever possible, standards and commoditised technologies are utilised.

Sassan Danesh, Managing Partner, Etrading Software said: “The rapid completion of the latest Sentinel milestone, on time and on budget, is testimony to the desire of Sentinel banks to provide best-in-class services to their client base in a MiFID II environment. We are delighted to facilitate this important initiative to mutualise the technology costs of MiFID II compliance.”
Etrading Software continues to act as PMO to facilitate the collaboration amongst the Project Sentinel banks, applying its experience of etrading technology and managing bank consortia.
//
About Etrading Software
Etrading Software was founded in 2005 with the goal of bringing efficiency and stability to the burgeoning electronic trading markets. As experts in etrading processes, the partners and their senior team have a track record of delivering key projects and strategies to the marketplace. As budgetary constraints force businesses and their technology departments to think differently, banks are recognising the value of collaborating with each other to agree standards, mutualise costs in non-competitive areas and therefore focus on true areas of competitive advantage. Etrading Software is instrumental in this process by acting as a trusted partner, bringing together key participants, providing technical leadership and helping the group towards agreement and syndication.
Media Contacts:
Streets Consulting on behalf of Etrading Software:
Abby Munson
Lindsay Clarke
T: 020 7959 2235
Abby.Munson@streetsconsulting.com
Lindsay.clarke@streetsconsulting.com
Innovation In Quality Assurance: What Is The Impact On Trading Technology?
By Iosif Itkin, CEO, Exactpro
This article focuses on how particular changes in quality assurance might affect trading technology. There are plenty of discussions about the Internet of Things, the cloud and the web, but software verification for exchanges and brokerage platforms is closer to home for Exactpro. Here are some interesting QA trends:
• FrAgile process
• Crowd-sourced testing
• Formal methodologies
• Cognitive technology
When it comes to start-ups and the web industry, agile methods are favoured. They do not necessarily mean better products, but they can make software developers much happier. The outstanding keynote address at GTAC 2011, Google’s test automation conference, claimed that “test is dead”, and that FrAgile is the new agile. Reid Hoffman asserted that “if you are not embarrassed by the first version of your product, then you’ve launched too late”.
He went on to state that we need to test our ideas first, before building the product right. The point is to have an established user community by the time your competitors get a well-tested product. An extensive user pool is one of the key elements of crowd-testing.
The idea is simple – just deliver your product to the community and collect instant feedback on your software, both the upsides and the downsides. To do this efficiently, one has to rely on sophisticated code instrumentation and data collection. Otherwise, thoroughly processing the crowd-sourced output becomes impossible and the exercise turns futile.
Is it feasible to use a FrAgile process and crowd-testing in finance? Can we afford an algo running amok, burning money, which could very well result in prosecution? Maybe we should take a different course of action. Perhaps what we need is the very opposite of agile.
Areas like transport, medicine and nuclear power have to exercise more rigorous controls. This creates an environment conducive to the evolution of quality assurance methods. Advances in model checking mean the end of the guessing game. Last year, a TABB Forum post pondered why we are not using mathematics and computer science similar to what NASA uses to design a safe autopilot.
Is it not embarrassing that we have financial market crashes at the same time as spaceships explore the heliosphere? We can invest heavily in formal methods and expect financial software that is more reliable. Nevertheless, one has to keep in mind that, despite all the formal methods, the New Horizons probe experienced a shutdown that made it lose contact with Earth for three days. Thus, expecting your systems to fail is the only way to enhance the quality of your software.
So where do we go from here? Is it possible to be faster without compromising safety? Can other industries teach us anything? If not the computer industry, maybe show business has an answer. Some of you may be familiar with the characters of Battlestar Galactica and the Sarah Connor Chronicles, their doomsday scenarios and the people who ran into serious confrontation with technology. Had they not relied on the same technology that tried to kill them, the characters would not have lasted until the finale. They were guarded by systems with the same level of sophistication as those they were fighting against. This is the mentality we should adopt. In the context of a complex platform, it is imperative to build software to test our software. Our risk control and test instruments should not be inferior to what will hit us. Having a good robot on your side is the only way to survive the robot apocalypse.
So, can we estimate the impact on trading technology? Developing testing instruments and monitoring tools should have the same priority as developing the trading platform itself. It is possible to use agile methods and continuous integration if the system is designed with testability and risk control in mind. At the very least, you should be able to deploy, start, restart and configure it automatically. In addition, it is crucial to be able to shut the system down if necessary. Do not expect formal artefacts from the agile process; instead, have a parallel stream to develop the necessary test harness. The software-engineering-in-test approach proposed by Google is very close to building software to test software.
We can implement formal verification, theorem proving, static analysis and other approaches, but the software will break anyway. What’s more, the absence of sufficient monitoring and kill switches is what turns a problem into a disaster.
For the sake of audit and regulatory requirements, we develop passive testing tools. Instead of generating messages themselves, they capture the traffic and store it for further analysis. Passive testing tools can collect all the evidence you need.
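A minimal sketch of the idea, assuming a hypothetical hook that receives each message from the transport; a real tap would sit on a network capture or a drop-copy session:

```python
import json
from datetime import datetime, timezone

class PassiveFixTap:
    """Passively records FIX traffic for later analysis: it never
    generates messages, only stores what flows past, building an
    audit trail that can be replayed or mined. Illustrative sketch."""

    def __init__(self, store_path):
        self._store = open(store_path, "a")

    def on_message(self, direction, raw_fix):
        record = {
            "ts": datetime.now(timezone.utc).isoformat(),
            "dir": direction,                      # "in" or "out"
            "msg": raw_fix.replace("\x01", "|"),   # human-readable delimiter
        }
        self._store.write(json.dumps(record) + "\n")

tap = PassiveFixTap("fix_audit.jsonl")
tap.on_message("in", "8=FIX.4.4\x0135=D\x0155=VOD.L\x0154=1\x0138=5000\x01")
```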
It is paramount to ensure that your trading system can co-exist with the tools. Passive testing tools can serve as both surveillance and client on-boarding systems. They can also be used for crowd-sourcing. If your test environment is open to others and you have a strong passive test capability, you will be able to gain a lot of valuable feedback without even having to ask.