
Best Execution And The “Electronification” Of High Touch Trading

By Michael Mollemans, Head of Sales Trading Asia-Pacific, Pavilion Global Markets

 

Best execution is an evolving process, and so it is incumbent upon us to anticipate the trend in service expectations.  

The unbundling of research brought about by the Markets in Financial Instruments Directive (MiFID II) put the cost of execution into the limelight, with many buy-side traders responding by increasing their use of low cost, “low touch” algos. Best execution policy disclosures also brought quantitative performance and qualitative service factors into focus. As best execution regulation and service expectations evolve, so too do the traditional “low touch” and “high touch” roles, with the road ahead leading to an electronification of “high touch” client service partnerships. 

 Liquidity issues in certain names, and on certain days, are always a source of frustration for buy-side traders. Markets are dynamic and liquidity in small- and mid-cap names can be here today, gone tomorrow, and a few illiquid names in a “basket” or “program” can destroy overall performance numbers. A sales trader’s ability to combine expert advice on liquidity-seeking algorithm (algo) parameter settings with a breadth of counterparty relationships gained through experience makes all the difference when aiming to achieve the highest possible ranking in a client’s execution performance scorecard. Buy-side traders are increasingly taking a multi-factor approach when evaluating brokers against their peers, with value-added service and other qualitative measures weighted highly next to execution quality. A “high touch” approach, in an otherwise “low touch” algo business, not only helps achieve a superior weighted average performance result, but also helps move up in the performance scorecard ranks. 

Block crosses can help enhance trading performance but can also damage client trust if crosses are sourced without consent, or if the client’s unique qualitative parameters and constraints are not properly followed. Sales traders certainly need to interact with electronic crossing venues selectively while seeking to utilise the breadth of counterparty relationships. When crossing stock at a price, however, there is no substitute for skilled technical analysis of stock trends to estimate the periodicity of alpha and minimise the chance that the stock price trends in your favour only after your block has been executed at a relatively unfavourable price. The timing contribution of executing a block at a price, and the opportunity cost of not executing a block, must be carefully considered. Information leakage costs must also be considered when reaching out to counterparty relationships. Block executions provide an opportunity to minimise impact, but if unnecessary information leakage is created they can also hurt overall performance. Automated “smart IOI” (Indications of Interest) processes, geared to targeting known holders of a name, can be very helpful when trying to minimise leakage. 

Reliability of execution across a robust trading platform that can sustain any type of system failure is often cited by the buy-side as a key qualitative factor when it comes to making broker routing decisions. Availability of backup algo networks is valued because the last thing clients want to hear an hour into the trading session is “trade away please.” Market data issues, network outages, database failures, algo engine issues, etc. can happen, but the ability to fail over to a parallel backup algo infrastructure makes all the difference when it comes to providing reliability of execution. A sales trader’s ability to provide clients with quick and detailed communication about the nature of a system failure, and an informed estimate on when systems will return to normal, is always appreciated.  

The trade lifecycle doesn’t end at execution; it ends at settlement. Smooth settlement, especially with difficult emerging market ID matching requirements and amended settlement terms, requires constant communication between traders and settlements teams, which can be a challenge if these teams are separated in different buildings or cities. Trading and settlements teams also need to rely on experience to help anticipate issues and become problem solvers, when needed, to assure clients get a consistently reliable settlement experience. 

Analytics are front and centre in the trading relationship as buy-side traders aim to maximise alpha through smarter execution strategies. Sell-side traders are expected to have deeper discussions with clients around transaction cost analytics (TCA), and so data visualisation tools are being used more and more to assist communication with clients across a growing number of data points. TCA based primarily on performance measurements versus “VWAP” or “arrival” benchmarks can be misleading when best execution regulation requires the consideration of both quantitative and qualitative factors across the entire trade lifecycle. MiFID II regulation and the regulatory technical standard (RTS 28) require Best Execution Policy annual reports to be disclosed publicly for the first time, ahead of the April 30th 2018 deadline. However, regulators have made it clear that next year, and in the years to follow, best execution policy reports are expected to become increasingly specific and transparent about how venue and counterparty routing decisions are made.  

Pre-trade reports set strategy selection in motion. Then, real-time market data feed into technical trend analytics designed to help sales traders add alpha by capitalising on directional opportunities with the help of adjusted algo parameters and curve tilts. Real-time analytics are designed to be actionable but their value is largely determined by a sales trader’s ability to interpret the signals and take action on opportunities in spread capture, timing contribution, opportunity cost, etc.  

Venue analysis is a great tool to drive informed discussions with buy-side traders around performance at the venue level and help the sell-side to act in the best interest of clients by removing hard-coded venue biases. Venue analysis also sheds light on the venues that provide the most opportunity for earning spread capture and speed of execution, with the lowest reversion cost. Guided conversations with clients about venue performance provide added depth of understanding of their performance goals, which can help sales traders fine-tune dynamic venue prioritisation and smart order routing strategies across various market conditions and levels of trade urgency.  

Monitoring and reviewing execution quality requires constant communication between buy-side and sell-side traders, and yields a valuable feedback loop used to improve performance through refinements to algo parameter settings and customised client service partnerships. The distribution of weighted average trading performance is increasingly being examined by the buy-side, and tightening that distribution is becoming a goal in itself. Clearly, from a risk point of view, buy-side traders don’t want long-tail, “all over the map” performance outcomes.  
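
As an illustration of the two numbers such a scorecard tracks, the short Python sketch below computes a value-weighted average slippage and its weighted standard deviation across a set of orders; the figures and variable names are purely hypothetical, not a reference implementation of any broker's methodology.

```python
# Illustrative sketch: the value-weighted average performance of a broker's
# orders and the dispersion ("tightness") of that performance.
import numpy as np

slippage_bps = np.array([3.0, -1.5, 0.5, -8.0, 2.0])   # per-order performance vs benchmark
notional = np.array([5e6, 2e6, 1e6, 0.5e6, 3e6])        # per-order traded value

weighted_avg = np.average(slippage_bps, weights=notional)
dispersion = np.sqrt(np.average((slippage_bps - weighted_avg) ** 2, weights=notional))

print(f"weighted average: {weighted_avg:.2f} bps, std dev: {dispersion:.2f} bps")
```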

To be successful at delivering consistently above average, low standard deviation, trading performance to clients, sales traders need to bring automated “low touch” tools and “high touch” market experience together when navigating through the latest market-moving news announcements, sentiment changes and directional trend shifts. Algo engines are often being equipped with news factor data but they cannot yet compare with an experienced trader’s ability to take news on complex economic and geopolitical events and translate it into alpha producing opportunities for clients.    

Transparency is often cited by the buy-side as one of the main qualitative factors considered when making broker routing decisions. Transparency of order routing logic and smart order routing prioritisation settings require informed communication between buy-side and sell-side traders. Email blasts with technical update jargon and marketing brochures offering limited detail are not enough. Sales traders are expected to help clients understand what is happening, or changing, “under the hood” by thoughtfully and efficiently translating the proprietary technical language. The overall goal is to help clients choose the best algo parameter settings and smart order router preferences to get the best possible performance result.  

Execution consultancy services are needed more than ever as the range of algo strategies, alternative venues, and smart order routing preferencing options widen. Sales traders are expected to assess new technologies, like artificial intelligence or machine learning, and advise clients on where quantifiable, statistically significant, results are being seen, or to what extent it is just marketing. Sales traders are expected to stay updated on market structure changes and advise clients on how to best position themselves to take advantage of change.  

All in all, the sell-side service model that is best positioned for the anticipated change in best execution regulation and service expectations is one that brings the key components of “low touch” and “high touch” roles together seamlessly into one combined approach, which is the aim of the Pavilion Global Markets value proposition. Algos and automated processes do not produce best execution by themselves; rather, they are tools and efficiencies that allow sales traders to allocate more time to the qualitative value-added side of trading services. Access to liquidity is a primary concern and so sell-side traders are expected to regularly assess all new and alternative electronic venues. At the same time, they must source liquidity from a myriad of counterparty relationships gained through experience. Constant monitoring, review, and communication with clients around pre-trade, real-time and post-trade analytics will not only support efforts to produce superior weighted average performance results, but will also help tighten the standard deviation of performance outcomes over time.

Fixed Income TCA: A Competitive Differentiator


By Laurent Albert, Global Head of Execution, Natixis Asset Management Finance

Building out TCA reporting for fixed income improves execution quality while raising the level of service for end investors.

Unlike traditional asset management trading desks, we serve our internal wealth management and asset management clients while also offering our execution expertise to external portfolio managers. As such, we believe we owe it to them to have the metrics on-hand. This is the reason why we have extended our transaction cost analysis (TCA) reporting from our equities trades, which we have been doing for years, to our fixed income trading desk.

We were looking for a pragmatic approach when searching for TCA support for fixed income. We quickly decided to choose an independent player and outsource these services. We are currently working with an established equity TCA provider, but wanted an integral player for the fixed income market. We received proposals from six or seven different candidates and selected one for their methodology and user-friendly metrics. 

This was a real opportunity to build a truly detailed report that would increase our ability to demonstrate value. It was also a chance to improve our best selection process. We realized that once we had a robust TCA solution, we could then integrate it into our pre-trade systems on the dealing desk. This now means we can provide additional pre-trade information to help the trader select the best channel for execution, and improving selection means best execution.

Defining the challenge
The first challenge in creating a TCA system for fixed income is to define it, as this is a new approach in the fixed income world. Given the complexity of the fixed income markets compared to equities, we needed to find a very simple solution for measuring this very complex asset class. The simple metric for analyzing our best execution in fixed income is to compare our execution price with a composite bid and offer, calculated as a mix of dealer streaming prices and a selection of execution prices drawn from a specified time frame. This allows us to compare our traded price with the composite price.
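
As a rough sketch of that comparison, assuming a composite mid derived from the composite bid and offer (the function and field names are illustrative, not the provider's actual methodology), the slippage versus the composite could be computed as follows:

```python
# Minimal sketch of the composite-benchmark comparison described above.
def fi_slippage_bps(side: str, exec_price: float,
                    composite_bid: float, composite_ask: float) -> float:
    """Signed slippage of a bond execution versus the composite mid, in basis points.

    Positive values mean the trade beat the composite mid
    (bought below it or sold above it).
    """
    mid = 0.5 * (composite_bid + composite_ask)
    if side.lower() == "buy":
        return (mid - exec_price) / mid * 1e4
    return (exec_price - mid) / mid * 1e4

# Example: buying at 99.95 against a 99.93 / 100.03 composite quote
print(round(fi_slippage_bps("buy", 99.95, 99.93, 100.03), 2))  # ~3.0 bps of added value
```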

There are several analysis possibilities, once that data becomes available. We can assess the contribution of each bank by assets, or in a more managerial approach, we can use our TCA data to assess execution quality by each individual trader or instrument.


The first step was to assemble a huge amount of data and structure it before sending it to our provider. Our mandate was to work on the data to deliver transaction competencies. For two months now, we have been sharing this report with our clients and the feedback has been very positive. They are surprised and curious about the results since it’s a new approach in fixed income. On average, providing between 4 and 7 basis points of added value in government bonds over the last six months is acknowledged to be reasonable performance. However, if we can factually demonstrate a capacity to deliver 5 basis points of added value, that fact is important to clients given the percentage of transactions that are done at midpoint. In high yield or emerging market bonds, the TCA will provide interesting details about the basis point improvement, especially when it is much larger. This gives us the ability to rank, in basis points, our capacity to add value across fixed income instruments using simple metrics.

The challenge is to frame this data in an engaging way for fixed income portfolio managers. Whereas equity portfolio managers are well-accustomed to TCA, their fixed income peers are less used to it.   

Better benchmarks
We started with a pursuit of simple metrics based on the composite number, but that was merely the initial stage of TCA for fixed income, and our report keeps improving as we receive more feedback from our clients. A later stage could be to compare our execution quality with a global peer group benchmark across all fixed income assets. This would obviously make it more challenging for us, but our final clients will appreciate the capability, transparency policy and quality of our execution team compared to those of our peers.

For our fixed income TCA reporting, we currently cover a vast number of instruments, such as Treasury bills, asset-backed securities, high yield bonds, government bonds, non-rated credit, etc., which means that we have global coverage, excluding swaps and credit default swaps, though those will likely be added soon. Each third-party provider has limits to its methodology, but we work with our partner to improve the quality and detail of the reporting across asset classes and to increase comparability between instruments. Some providers deliver no comparative data when the instrument is illiquid, but others interpolate and offer data drawn from a wider timeline.

While our investment in TCA has already yielded results for clients in the months following the launch, and we will continue both refining and expanding our reporting, there is always room for further improvement. For example, if tomorrow the fixed income industry came together and developed some type of an EMS (Execution Management System) to merge trading venues and systems across the same screen, it would immediately increase our ability to capture liquidity for our clients. Key players would need to integrate their information, but if such an EMS existed, it would significantly improve the quality of our execution. This would allow us to use algorithms as we do for equities and seamlessly connect to multiple venues.

TCA: “Is This Good or Bad?”

By Jason Lam, Director – Head of APAC Electronic Equities Quantitative Analytics & Consulting, Deutsche Bank

When measuring transaction costs, a prevalent practice today is to reference price benchmarks that are relevant to the specific algorithm. For example, referencing the order interval market VWAP (Volume-Weighted-Average-Price) for VWAP strategy orders, arrival price for IS (Implementation Shortfall) orders, etc. The price benchmark slippage can then be computed by comparing the order average price to the reference price. This methodology is widely accepted and almost universally applied within the industry.

For over a decade, this has been the de facto standard to examine algo performance, to evaluate whether best execution has been achieved, and to compare performance across strategies or brokers for each of the trading objectives. However, when presented with results drawn upon these traditional metrics, do we often find ourselves asking the question “is this good or bad”? Surely, if this methodology is not effective in answering this, we must continue to seek a better solution! 

Figure 1: Typical price benchmark slippage results.

Why do traditional benchmark slippages not work well?
Figure 1 presents typical results of performance slippages. These measures are inherently noisy and sometimes misleading. Directional market moves and stock-specific volatility tend to skew these measures significantly, with higher volatility likely yielding higher deviation from the benchmark. For example, if an IS order managed to beat the arrival benchmark by 2 bps (basis points), most would be overjoyed with this performance, but “is this good or bad”? What if the majority of market trades were executed at even better prices? The optimism would quickly turn into disappointment. To counter these issues, the industry has continued to innovate new techniques to separate noise from signal – from normalizing slippage numbers in relative terms by representing them in spreads instead of bps, to applying adjustments based on pre- and/or post-trade models.

These models evolve over time and have become more complex and sophisticated. New factors have been added to these models, including beta, stock-specific volatility, liquidity conditions, impact estimates, etc. Furthermore, many agency brokers (Deutsche Bank included) have their own proprietary models, which makes it difficult for clients to compare adjusted performance numbers across brokers. Many clients find themselves having to apply adjustments themselves to reduce noise in these metrics. More importantly, how do we actually evaluate the effectiveness of these adjustments?

If an objective method existed to measure this, should we not directly apply it to measure transaction cost? Absolute EBEX is one answer.

Absolute EBEX
Traditional slippage metrics only provide the magnitude of slippage, not insight into how it compares with other orders or with the market. An effective measure must therefore include a component of peer group analysis. Absolute EBEX is an absolute performance measure proposed by the EDHEC Business School in France as part of the EBEX (EDHEC Best Execution) framework[1] for TCA (Transaction Cost Analysis). It quantifies the quality of execution with a simple score between 0 and 1, the higher the better. Absolute EBEX can be understood as the fraction of market volume traded at or worse than the order average price. If half of the market volume traded at prices better than the order average price, the order would achieve a score of 0.5. We find this analysis intuitive and simple to implement, requiring only the final order average price of individual orders and market trade data. 
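
A minimal sketch of that calculation, assuming the market trades over the relevant horizon are available as (price, volume) pairs; the function and field names are illustrative and not Deutsche Bank's production implementation:

```python
# Absolute EBEX as described above: the fraction of market volume traded at
# prices equal to or worse than the order's average price.
def absolute_ebex(side: str, order_avg_price: float,
                  market_trades: list[tuple[float, float]]) -> float:
    """market_trades is a list of (price, volume) over the relevant horizon.

    For a buy order, 'worse' means a higher price; for a sell, a lower price.
    Returns a score in [0, 1]; higher means more of the market traded at or
    worse than the price the order achieved.
    """
    total = sum(vol for _, vol in market_trades)
    if total == 0:
        return float("nan")
    if side.lower() == "buy":
        worse = sum(vol for px, vol in market_trades if px >= order_avg_price)
    else:
        worse = sum(vol for px, vol in market_trades if px <= order_avg_price)
    return worse / total

# Toy example: a buy order averaging 100.00 while 40% of market volume
# traded at 100.00 or higher scores 0.4.
trades = [(99.95, 3000), (100.00, 1000), (100.05, 2000), (99.90, 3000), (100.10, 1000)]
print(absolute_ebex("buy", 100.00, trades))  # 0.4
```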

Applications of EBEX
EBEX provides a standardised framework to assess the quality of execution across orders aggregated at any level. It is also generally free of the noise typically found in traditional price benchmark slippage. With these advantages, an effective systematic review process may be introduced to evaluate best execution, which is key to achieving the best possible results through agency algorithms. To monitor for changes in execution performance per strategy and market, one can simply observe the distribution and the aggregated absolute EBEX scores. The stability of the aggregated absolute EBEX score allows for the introduction of an expected performance band, with lower and upper bounds defined by an acceptable tolerance level, such as a multi-month rolling average of the 35th and 65th percentiles respectively. In addition, absolute EBEX, as an absolute measure, allows for a definitive minimum threshold defined by a constant, such as 0.3. In Figure 2, we present the absolute EBEX score over a sample set of VWAP orders. The round dot represents the aggregated score for each month. The vertical line represents the distribution of EBEX scores between the 35th and 65th percentiles. The shaded zone represents the acceptable performance band, which is a rolling two-month average of the 35th and 65th percentiles respectively. Also note the stability of the aggregated EBEX score and its distribution month over month, as compared with Figure 1. Both Figure 1 and Figure 2 are based on the same underlying order set. The EBEX analysis identified a drop in performance in October, but it was significantly less pronounced than what traditional slippage showed. Indeed, a deeper investigation found that higher volatility in October had skewed the traditional TCA. 

Figure 2: Absolute EBEX scores over a sample set of VWAP orders.
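
A rough sketch of this monitoring approach using pandas; the column names, the use of a simple mean as the monthly aggregate, and the alerting rule are assumptions for illustration, with only the 35th/65th percentile band, the two-month rolling window and the 0.3 floor taken from the description above:

```python
# Monthly aggregated absolute EBEX plus a tolerance band built from a rolling
# two-month average of the 35th and 65th percentiles.
import pandas as pd

def ebex_monitor(orders: pd.DataFrame) -> pd.DataFrame:
    """orders needs columns 'order_date' (datetime) and 'ebex' (per-order score)."""
    monthly = orders.set_index("order_date").resample("M")["ebex"]
    summary = pd.DataFrame({
        "aggregated": monthly.mean(),        # simple mean as the monthly aggregate
        "p35": monthly.quantile(0.35),
        "p65": monthly.quantile(0.65),
    })
    # Acceptable band: rolling two-month average of the 35th/65th percentiles
    summary["band_low"] = summary["p35"].rolling(2).mean()
    summary["band_high"] = summary["p65"].rolling(2).mean()
    # Flag months that breach the lower band or an absolute floor (e.g. 0.3)
    summary["alert"] = (summary["aggregated"] < summary["band_low"]) | \
                       (summary["aggregated"] < 0.3)
    return summary
```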

EBEX Variants
Alongside absolute EBEX, EDHEC also introduced directional EBEX as part of the EBEX framework. The goal of directional EBEX is to evaluate whether the correct time horizon has been chosen to trade the order by comparing NBBEX (Number of Before-Better Executions) and NABEX (Number of After-Better Executions), which can be understood as the analogues of absolute EBEX computed within the order interval and within the period between the order end time and the market close, respectively.

For schedule-based algorithms, such as VWAP and TWAP (Time-Weighted Average Price), clients may prefer to define the trading horizon by specifying the start and the end time of the order. In such cases, we find it more appropriate to use NBBEX over absolute EBEX to evaluate performance only within the order interval.   

Figure 3: Two buy orders that are identical except for the price of the latter, larger execution.

For shorter duration orders, where orders are more likely to be a larger part of interval volume (IV), we propose a minor adjustment to exclude own order flow. Consider the two buy orders in Figure 3. Each order has only two executions, and the price of the latter, larger execution is the only difference between the otherwise identical orders. The size of each bubble represents the relative trade size, and the color of each bubble distinguishes order executions (dark blue) from market trades (light blue).

One would prefer the performance of order B to that of order A, because the bulk of its executions were done at the low prices of the order interval rather than near the high. However, the NBBEX score does not reflect this, yielding scores of 0.89 and 0.11 for orders A and B respectively, the opposite of our preference. Order A scored well under NBBEX because its latter execution alone accounts for 87% of interval market volume and traded at a price higher than the order average price. NBBEX therefore counts the order’s own executions in the fraction of market volume done at or worse than its order average price.

By excluding own order executions, the modified NBBEX yields 0.15 and 0.85 respectively. We compute this by excluding own order executions at or worse than the order average price from the market volume within the same price range, and dividing the result by the order-interval market volume. Modified NBBEX converges to the original as the order’s share of interval volume decreases. Implementing modified NBBEX only requires the additional step of identifying each order’s own executed volume at or worse than its average price, and this information should be readily available.
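
The toy sketch below illustrates the mechanics on hypothetical data (not the exact orders behind Figure 3): a buy order whose large, late fill dominates interval volume is flattered by the original interval score, while the modified score strips the order's own flow from the numerator.

```python
# Interval score (NBBEX-style) with and without the order's own executions.
def interval_score(side, own_fills, market_trades, exclude_own=False):
    """own_fills and market_trades are lists of (price, volume); market_trades
    includes the order's own fills, as printed on the tape.

    Returns the fraction of order-interval market volume done at or worse than
    the order's average price; with exclude_own=True the order's own fills at or
    worse than its average price are removed from the numerator (modified NBBEX).
    """
    own_vol = sum(v for _, v in own_fills)
    avg_px = sum(p * v for p, v in own_fills) / own_vol

    def worse(price):
        # 'Worse' than the order average: higher for a buy, lower for a sell.
        return price >= avg_px if side.lower() == "buy" else price <= avg_px

    total_mkt = sum(v for _, v in market_trades)
    worse_mkt = sum(v for p, v in market_trades if worse(p))
    if exclude_own:
        worse_mkt -= sum(v for p, v in own_fills if worse(p))
    return worse_mkt / total_mkt

# Toy buy order: a small early fill near the low and a large late fill near the high.
own = [(100.00, 100), (100.20, 900)]                  # order average ~100.18
market = own + [(100.05, 100), (100.25, 50)]          # thin interval volume besides the order
print(round(interval_score("buy", own, market), 2))                     # ~0.83, flattered by own prints
print(round(interval_score("buy", own, market, exclude_own=True), 2))   # ~0.04, own flow excluded
```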

Conclusion
EBEX provides a standardised TCA framework to evaluate the quality of order executions aggregated at any level. Deutsche Bank has included EBEX as part of our formal Best Execution review process in APAC. The definition of EBEX is intuitive and easy to implement. It is a form of peer group analysis, since it considers all trades in the market, and it is generally free of the noise typically present in traditional price benchmark slippage. It is an absolute measure based on a simple score between 0 and 1, against which clear objectives, such as a minimum performance threshold, can be defined. It is also a versatile measure that does not require defining a relevant, strategy-specific reference benchmark for each algorithm. We believe EBEX is an effective measure and should be included in the suite of standard TCA metrics. 

Building the Next Generation of Algos

By Kathryn Zhao, Global Head of Electronic Trading, Cantor Fitzgerald 

While many buy-side firms trade on a suite of existing algorithms, is it time for a new breed of algo? Kathryn Zhao, Global Head of Electronic Trading at Cantor Fitzgerald, sat down with GlobalTrading to discuss the roadmap to building a new generation of algos.

Are algos commoditized?
Trading in today’s complex marketplace requires advanced technology solutions that are performant, robust and flexible. Plug-and-play algorithms used to be enough to satisfy the needs of large institutions who engaged in electronic trading; however, today it is a different story. Utilization of algorithmic trading is growing and so are the customization demands of the buy-side. Investments in analytics raise the bar for algo performance, while also varying the nature of a successful trading outcome. Modern, competitive electronic trading offerings are required to deliver customized solutions tailored to each client’s specific trading preferences, and that fits our business model perfectly.  

A helpful analogy is that of Precision Medicine, a medical model in which healthcare is customized according to each patient’s specific characteristics. By targeting each patient’s DNA and tailoring treatment accordingly, doctors can achieve the best healthcare outcome with the fewest side effects. Likewise, the next generation of algos must target each client’s trading DNA and optimize trading performance for each client. This is the exact definition of our Precision Algo Platform. 

With growing sophistication of quantitative models and ever increasing electronic solution variety, buy-side firms face a unique challenge: how to choose a product that is straightforward to use and yet easy to customize. We kept these considerations in our focus when we designed our algorithmic trading platform.  

Our Precision Algo framework is modular and easily customizable, with speed of turnaround in mind, to help traders reduce their order trading flow and take advantage of price and liquidity conditions as they occur. 

Replicate or blaze a new trail?
Our Precision Algo Platform was designed from scratch, with the latest technologies, by experienced electronic trading professionals devoted to pure meritocracy and to achieving the best results for our clients.  

When we began the process of developing a new suite of next generation algos nine months ago, we had a late-mover advantage. The electronic trading technology landscape has experienced dramatic changes over the past 20 years. Innovations in hardware, networking, and software have had an immense impact on the current state of the art. To truly stand out from the competition, an electronic trading product must be performant from a latency and throughput point of view and resilient to failure, without compromising capabilities and quality of execution. We took, and will continue to take, a craftsman-like approach to building and continually improving our software. Our platform is designed with performance in mind.  

For example, we use cutting-edge open-source libraries for inter-process communication and execution framework throughout the entire Client Gateway / Algo Engine / Exchange Gateway environment. We also use these highly efficient data structures for event sourcing to support full determinism in analysis and debugging. We pay a tremendous amount of attention to reliability and failover to eliminate the possibility of an outage and to minimize client impact if an outage does occur. Additionally, we rely on a comprehensive suite of automated testing and simulated execution tools to validate and guarantee the quality of our product. 

With regard to quantitative research, we choose to use AI/ML only where there is a clear case for adding value. There are certainly cases when AI/ML-powered models give the best results. However, in multiple cases traditional statistical models still represent an overall better solution. 

Clients’ key considerations and when to involve them?
The electronic trading business is highly competitive. In my view, state of the art technology and sophisticated quantitative research enable a differentiated product. Execution quality, customization, access to liquidity and system stability are the differentiators of a competitive low-touch offering.

Traders demand better performance and predictability of execution results, all without sacrificing the ability to source liquidity, transparency and control of their executions. With the launch of our Precision Algo Platform, we offer our clients a fully customizable algo suite optimized to each client’s trading DNA.

It is critical to engage clients actively at every stage of the process to ensure transparency and understanding of the full capabilities of the product, and to cater for clients’ current and future requirements. Underlying algos, such as VWAP / TWAP / POV, are the building blocks for “algo of algos”. As our Precision Algo Platform is modular and flexible, the majority of customizations are easy to achieve without having to directly alter the code base. 

Extending into other asset classes
The flexibility incorporated in a modular algo framework should enable the next generation of algos to be adapted to new asset classes and market structures without a full rebuild. Most of the code base, for example the Allocation and Order Optimization framework, the Macro-Trader (Scheduler) and the Micro-Trader (analytics-driven, quantitative model-based order placement logic), consists of components that are common to any electronic trading algorithm and can be shared across different asset classes. 

The essential differences between asset classes are market data and their microstructure characteristics. Specialized quantitative models may need to be developed and calibrated to deliver better performing algorithms. Unique aggregators are also required to deliver a unified and normalized interface to both market data and market access.

As we roll out our next generation, cross-asset electronic trading product, we hope to help shape the industry standard for algo development.

Adaptation to Shadow Banking Sector Regulatory Frameworks

This report examines the implementation and compliance reporting implications of three new EU regulations impacting the transactions reporting obligations of funds management companies within the 28-member state bloc operating inside the confines of the shadow banking system. Specifically, the new EU shadow banking system regulations featured in this report are:

  • The Securities Financing Transactions Regulation;
  • The Money Market Funds Regulation; and
  • The Securitisation Regulation.


Fixed income trading focus : Profile : Yann Couellan : BNP Paribas Asset Management

Yann Couellan, Head of FICC Trading at BNP Paribas Asset Management, assesses how fixed income trading is changing and how it will continue to evolve.

 

To date, what has been the impact of MiFID II, and other regulation, on fixed income?

It is clear that the European Market Infrastructure Regulation (EMIR) has driven fundamental changes in the fixed income landscape. Next June, Category 3 counterparties will be subject to clearing, and this will continue to move trading onto electronic platforms. However, we still plan to do bilateral trading as well, although we have had to fine-tune our approach and connectivity.

MiFID II has affected the fixed income market through extended transaction reporting, investor protection, new trading obligations that affect market liquidity, and the pre- and post-trade transparency requirements, which should have the most impact going forward. However, today it is too early to determine exactly what the beneficial impact will be. This is because, at the moment, there are not enough bonds that fall under the requirements and meet the transparency thresholds. For example, just 466 out of 71,000 bonds were seen as sufficiently liquid to be subject to the real-time transparency requirements of MiFID II. I think ESMA will consider recalibrating the thresholds and it will be interesting to see what happens.

How did the industry prepare?

In the same way as other buyside firms and trading teams, we prepared by having a dedicated team working closely with our legal and compliance departments, as well as the national regulator, to ensure that we were covering all aspects of the regulation. This ranged from how the regulation would impact the business – not just at the pure trading level – but also with regard to cost, compliance, trading and operational efficiency, etc. It has been a huge effort, but one that has to be ongoing and continuous. Take the daily reporting requirements: there is another dedicated team overseeing the workflow to make sure we are compliant. Technology has also been a key factor and buyside firms have invested significantly.

There is a lot of hype about technology and big data, but how is it changing the industry and fixed income trading? How are fixed income teams using data and data analytics tools to access liquidity?

Yes, there is a lot of hype about artificial intelligence and machine learning, but in many cases it is too early to judge what their potential will be. However, the liquidity environment is challenging in the fixed income markets and we need to embrace tools that allow us to combine our pre and post trade data with market intelligence to give portfolio managers and trading teams better insights and allow them to monitor bonds throughout their lifecycle. The fixed income market is different to equities, which are much more electronic, and it is not as easy to get valuable data on bonds because many do not trade that frequently. We are currently looking to develop our proprietary trading tools to leverage the use of big data and to use FIX connectivity to aggregate all the information from the disparate sources for our fund managers and traders to help improve the investment decision-making process.

How has the relationship between portfolio managers and traders changed over the past few years?

In the past, the portfolio managers would raise the orders and then hand them over to the traders to execute them, but the relationship between the two has changed. Today, there is greater communication and collaboration, with a deeper understanding of how they can work together to add value in terms of capturing market insights, optimising execution and improving the investment strategy. Traders can help fund managers make sense of the market and ensure that they capture relevant market information. However, the buyside relationship with the sellside trader is still also important because they can provide useful insights and tools.

Overall, what has been the impact of electronic trading, and is an execution management system (EMS) a necessity?

An EMS is very useful where trading is done on-exchange, such as with equities and futures. In fixed income, an EMS is less relevant due to the challenging liquidity conditions on many bonds and the market structure, with bonds executed using the request for quote (RFQ)-based trading model. However, technology and hybrid protocols derived from the equity and/or FX markets are now starting to be embraced by market participants. Here I am especially thinking about rules-based, and data-driven logic that drives autotrading, as well as all-to-all protocols and auctions for specific markets. Also, technology and comprehensive analytics that leverage data help to improve traders’ and portfolio managers’ market knowledge by challenging their perceptions.

Has fragmentation been an issue?

There is fragmentation, but everyone has access to liquidity, although some buyside firms have more valuable insights than others, depending on their embedded technology. I often hear that electronic trading has made markets more efficient, which is true, and e-trading platforms have been created that facilitate the exchange between market participants, although it is the market participants themselves who create liquidity. For example, on the day the ECB (European Central Bank) announced the Corporate Sector Purchase Programme (CSPP) in March 2016, trading was possible mainly via voice. The liquidity was not on the platforms, and going forward there needs to be a way of making trading more efficient. At the moment, a few platforms dominate the e-trading landscape. However, I see the development of niche platforms that will increase coverage of the fixed income markets. This, for example, could be in emerging market bonds, or in other specific areas of the market where there is limited access. However, one of the biggest challenges with implementing these niche types of platforms is the cost.

Looking ahead, what are the biggest challenges ahead and greatest opportunities?

In general, building efficient trading systems while keeping implementation costs low is a big challenge. There is so much dispersion of information in the market that the buyside ideally needs to connect to multiple data sources. Also, even if there is a logical and obvious move to electronic trading with a greater offering from the e-trading platforms, there is still a cost and complexity in aggregating everything that is not being traded electronically. Regulation has brought greater transparency, but the markets still remain opaque and there is still a lot of work that needs to be done to make the markets more efficient and ‘electronify’ market information in an easily readable and comprehensive format.

As for opportunities, firms need to have a good understanding of what is happening in the markets and to have strong trading desks that work together with fund managers to capture new clients. The key differentiators come down to trading efficiency, market understanding and access to liquidity.


Yann Couellan is Head of FICC Trading at BNP Paribas Asset Management. He has more than 25 years’ experience of fixed income, derivatives and money market dealing. He joined BNPP AM in November 2017 from AXA Investment Managers where he spent 10 years as Head of Fixed Income Execution. He previously worked in dealing roles at IXIS Asset Management, ETC (E-Trading Company) and Cantor Fitzgerald, after beginning his financial career in 1992 as a junior dealer at Maison Roussin. Couellan has been co-chair of the ICMA Secondary Market Practice committee since 2017.

©BestExecution 2018

Profile : Mazy Dar & Adam Toms : OpenFin

Mazy Dar, CEO and Adam Toms, CEO Europe at OpenFin explain how they are bringing the traders’ desktop into the 21st century.

How did you get the idea for OpenFin?

Mazy Dar: My co-founder, Chuck Doerr, and I got the idea from our previous lives at Creditex, where we spent 10 years focused on electronic trading of credit default swaps. Our main product was a front-end desktop application that was used by traders at all of the major CDS dealers. We realised early on that these traders required the kind of native desktop experience that you could only get from installed applications. But we also recognised that installing software on bank desktops is a huge challenge and significantly limits your ability to be responsive to trader requests and market changes. The experience at Creditex got us thinking that there is a missing layer of infrastructure in financial services. The need for native experience and simple application distribution is a problem not just for trading, but also for many other categories of applications across front, middle and back office, portfolio management, research and compliance. We started OpenFin in 2010 to provide this missing layer of infrastructure which we refer to as an “operating system”.

What is the basic concept of OpenFin then?

MD: If you look at what has happened in the mobile world, the iOS and Android operating systems enable instant app distribution to hundreds of millions of users. The instant installs and auto-updates are possible because iOS and Android ensure security and prevent the apps from accessing your photos and contacts, for example, without your permission. On desktops, when you install software, the applications have unfettered access to your files and other sensitive resources which means they are a big security threat. This is why it takes months, quarters or longer to install software on financial desktops. The OpenFin operating system solves the problem by modernising your existing desktop operating system to behave more like iOS and Android. We enable instant application deployment by ensuring security and preventing applications from having any access to the file system or data from other applications.

What this means for the financial industry is we can now have the same innovation cycle as the mobile world. The top 25 apps in the Apple App Store are updated monthly which means they are constantly getting better and better. In finance, applications were previously updated about once a year which leads to big bloated monolithic applications. Now, with OpenFin, firms can update their applications as often as they want which means they can deliver smaller incremental releases and be very responsive to customer requests.

What infrastructure did you choose?

MD: When we started OpenFin, Chuck and I thought the first big decision we had to make was choosing between Java and .NET as the foundation for our platform. A few months in, we decided that HTML5 was the right answer. It was still very early days for HTML5 in 2010, but we could see that it was the language of the web and an open standard not owned by any single technology vendor. Even though we were convinced it was the right decision, we didn’t realise at that time what an important catalyst HTML5 was going to be for our business. It didn’t happen right away but, starting in 2013, the largest banks and financial service firms started rewriting existing platforms and building new applications in HTML5. Most firms now have a “web-first” approach which means they write HTML5 apps on the front-end and are increasingly moving to private or public cloud on the back end.

Can you explain in more detail what the Financial Desktop Connectivity and Collaboration Consortium (FDC3) initiative is?

Adam Toms: FDC3 fosters collaboration. The initiative has attracted the participation of over 50 capital market firms. The objective is to create a standard protocol for interoperability and connectivity across all financial desktop applications. Currently, there are bilateral integrations but these can be expensive because they are bespoke and take place between application providers. FDC3 standards would replace this process with a common language between all applications. This would not only optimise the functionality of the desktop but also allow, as Mazy has said, the industry’s applications to more easily talk to each other. This approach significantly reduces the cost of integrations and optimises functionality for end users.

Alongside FDC3 is the independent Fintech Open Source Foundation (FINOS), whose goal is to expand adoption of open source software and open standards in the financial services industry. FINOS is a non-profit organisation promoting open innovation in financial services that was established in April from the Symphony Software Foundation. In May, we announced that we have moved the FDC3 under the FINOS umbrella to encourage further industry collaboration.

How has the technology progressed and how does it differ from other systems?

AT: The world is moving to more specialised applications and many fintechs are more narrowly focused on a particular product or area, to which they bring real depth of knowledge and sophistication. The question is how do you embed these specialised apps on the desktop, so they can be used more effectively in the user’s workflow and create a holistic picture to assist the end user in a more dynamic and complete way. Individual apps are usually well designed for the user, but they are not designed with the wider desktop and user experience in mind. OpenFin delivers this important holistic unification on the desktop. It makes apps more sticky and appealing to users and has the ability to invoke the right information into the user’s workflow exactly when it is required. This is particularly important when so many tools, applications and data sources are available to users; it has the potential to eradicate the burdensome task of navigating between them.

The OpenFin platform also differs in that it is completely neutral and agnostic. We allow all users equal access to the platform and we do not offer any products or services that would compete with the applications available to users. Deploying the OpenFin OS allows firms to integrate both legacy and new applications which were designed by both in-house and third-party providers.

How can OpenFin help the industry in the post MiFID II world?

AT: MiFID II is forcing the industry to become more analytical and to implement stronger compliance and operational controls in workflows. To support these requirements, a number of new applications have been developed, both internally by firms and by external vendors. New and existing tools can be used to augment and improve the ability of traders, for example, to navigate complex markets in a more informed and compliant manner. However, the new and old apps themselves often require information to be re-keyed and are relatively unconnected, creating the potential for operational risk and for crucial information to be missed because an app is hidden in a browser, its alerts are buried, or the user is not in the correct app at the right time. OpenFin allows the apps to be connected, invokes the right information at the right time in the workflow, breaks apps out of inflexible browsers and delivers a central notification centre to improve the user’s desktop experience. The ability for application builders to proactively invoke compliance and operational risk controls in the user workflow, or via notifications, is also very powerful.

How do you see OpenFin progressing?

MD: OpenFin is currently licensed to 145,000 financial desktops globally and is being used to enable the deployment of hundreds of applications to over 500 major banks and buyside firms in over 60 countries around the world. Our goal is to empower digital transformation and open collaboration for app builders and users by deploying the OpenFin OS on every buy and sellside desktop in financial services; we are in the middle of our journey right now.

Adam Toms is CEO of OpenFin Europe and also serves as a Non-Executive Director on the board of RSRCHXchange. Prior to OpenFin, Toms served as CEO of Instinet Europe. During his 21 years in financial markets, he has led numerous high profile businesses and large-scale integration projects resulting in a reputation for delivering industry change.

Mazy Dar is CEO of OpenFin. He previously led the credit derivative strategy at the Intercontinental Exchange and served on ICE’s executive committee. Before ICE, Dar was chief strategy officer of Creditex which was backed by JP Morgan, Deutsche Bank and TA Associates and acquired by ICE in 2008. He started his career at UBS as a software programmer and earned a degree in computer science and French literature from Cornell University.

©BestExecution 2018

Viewpoint : Market evolution : Mathieu Ghanem

GENERATION Z WON’T NEED YOU!

It is less than six months to Brexit, the volume of transactions in Bitcoin is now larger than PayPal’s and two US tech giants have passed the one trillion dollar market cap level – but the real concern for brokerages and wealth managers should be ‘Generation Z’, says Mathieu Ghanem, Managing Director, ADSS Head of Global Sales and Marketing

As CEOs and boards of investment firms look ahead to the year-end and plan for 2019, they will be assessing market fundamentals, the tangibles and intangibles, but for many the real question is whether they are still relevant to their clients and can provide the services those clients want.

For the last 10 years we have seen an exponential rise in passive investments with very large funds building their businesses based on a wide variety of Exchange Traded Funds (ETFs). Helped by the growth in personally managed pensions, an approach which has spread from North America into Europe driven by regulation and national policies, the funds have attracted trillions in deposits.

ETFs have led the democratisation of market data and trading. They have created awareness of financial-asset investments and opened the markets up to a new audience. This audience is not based in small or exclusive financial centres; it is global, online and hands on. It can easily access information and is not prepared to pay high fees for something it can do itself.

When the first millennials logged onto mobile trading platforms and could be seen on trains, planes and the street buying Apple, shorting the Dow or switching to gold, the writing was on the wall. These investors now seek out free trading apps and want to manage their own portfolios.

The investment industry has reached a crossroads. A combination of technology innovation, market structure, higher barriers of entry and regulatory restrictions, has altered the way investment firms operate. The acquisition of clients, the management of risk and the ability to make money have all changed, but the most important factor is that the audience they are targeting has moved on.

If we look back to the 1990s, brokers had, depending upon the nature of the securities or product they offered, total control over access to liquidity, so they competed on the types of service from OTC through to exchange traded. It was a closed shop and very easy for good sales teams to make significant profits.

This was always going to change and the introduction of new rules and the appearance of new technologies opened up electronic trading. For the first time brokers had to start competing for clients. This first step in the democratization of trading created direct competition between the brokers and banks, based on levels of service and of course, pricing.

The move to social trading has provided brokerages with a new audience, based at home or accessing FX, CFD, stocks and commodities via mobile platforms. However, this model was open to abuse. Some brokers, just interested in volume, created a ‘churn and burn’ industry which fed off inexperienced traders. This was always going to lead to greater regulation and, very quickly, protection for retail clients was stepped up, but the access to trading and trading-based information continued to increase.

Technology has also meant that the range of products available from wealth managers and brokerages converged, but with a blurring of roles. Investors could access the same assets, through different channels with different payment models. Sales commissions/brokerage fees were slowly moving to a more inclusive percentage fee for assets under management, which brought some clarity.

Through this period brokerages continued to invest in technology, whereas asset and wealth managers looked to product development adding more and more ETFs to their offering. This continued to push down fees which led to further development of passive investment systems and robot advisers.

‘Do it yourself trading’ has arrived and opened the markets up to the disruptive access providers, the pay-as-you-go model. The number of free trade apps is growing very quickly and the old-school wealth and asset managers are realising that they no longer have the same access to clients. Their monopoly has gone.

For example, in the cash equity market, fees have dropped from double digit percentages to 1% and now to 0.05% and with free trade apps this will be pushed even lower. Trading is being paid for on the low-cost airline model. The price of entry is extremely low with firms making their money on the extras – the premium seating, more luggage and express check-in. The problem for traditional providers is that they still want to charge you for the privilege of investing with them. This will not work for the investors of the future.

The financial services industry also has to accept that outdated forms of advertising will not get through to this audience. Generation Z is used to being carefully selected and targeted: linked in through friends and colleagues, and referred by firms and organisations they trust. They know their data is being used and are scornful of companies that do not try to understand them.

As we move into this new market, brokerages do have one important advantage: they understand technology. They have learnt the importance of continual improvements and updates, and how to link this into social media channels. They are also adopters of cryptocurrencies and blockchain technology, so they are able to adapt.

As we plan for 2019 we know we have to concentrate on the investors of the future and provide them with the access to the products and services they want through the platforms they understand.

©BestExecution 2018

Fintech : Distributed ledger technology : Heather McKenzie

THE PROMISE OF DLT.

Heather McKenzie looks at the latest developments and whether the technology will live up to the hype.

There has been a lot of buzz about distributed ledger technology (DLT) transforming the financial services industry and, as demonstrated by the Australian Stock Exchange (ASX), the post-trade world is proving to be fertile ground for the technology. However, time will tell as to whether it will be successful in fulfilling its potential.

In April ASX outlined plans to replace its 20-year-old Clearing House Electronic Subregister System (Chess) clearing and settlement system with a DLT-based system. A consultation paper issued by the ASX described the move as “a once in a generation opportunity to deliver material benefits for the securities industry and the broader economy. It will also be a major undertaking with significant operational impacts that need to be considered and appropriately managed.”

Originally scheduled to go live at the end of 2020, it was announced recently that operations are now scheduled to launch in March or April of 2021. The system will be based on US financial technology company Digital Asset’s DLT.

At the 2018 Macquarie Australia Conference on 1 May in Sydney, Dominic Stevens, managing director and chief executive at ASX, said the Chess replacement project would over time lead to greater efficiencies for the market through improved recordkeeping, reduced reconciliation, more timely transactions and better quality, source of truth data. “Beyond this, we also believe the change will provide the industry with the ability to create an exciting new generation of products and services – some of which we can’t conceive of using today’s technology,” he added.

To date, the ASX has completed three significant pieces of work in the replacement project. Initially it evaluated the technology solution and partner, Digital Asset. This took close to two years and involved the building of a subset of Chess’ clearing and settlement features to “enterprise grade specification”. This enabled the exchange to evaluate Digital Asset along with the functional and non-functional aspects of the technology. A consultation programme was also conducted, involving more than 120 organisations.

The third stage, which has just begun, involves the analysis, building and testing of the full system. The software will be released to the market to enable a long testing period and all users will be asked to participate in industry-wide testing. The consultation paper put forward a six-month window for implementing the system. “This may feel like a long time. However, Chess is deeply entwined in the operations of many market stakeholders,” said Stevens. “In 2015, settlement was moved from T+3 to T+2, a much simpler change and this took two years to roll out.”

Under the current system, all counterparties have their own bespoke operational databases, which differ from the ASX’s and from each other’s. Counterparties therefore must constantly send messages back and forth to reconcile their data with the ASX, as well as with registries, custodians and buyside firms. A DLT-based Chess will connect each participant to the ASX’s ‘single source of truth’ DLT database via a node. Databases will be kept in sync in real time, without having to reconcile with the ASX.
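To make the contrast concrete, here is a minimal, purely illustrative sketch (not based on ASX or Digital Asset code; all class and field names are invented) of the two models described above: bespoke per-firm records that must be reconciled against the exchange by messaging, versus a shared ledger that every participant’s node reads in real time.

```python
# Illustrative sketch only: invented names, not the ASX/Digital Asset design.

class Counterparty:
    """Current model: each firm keeps its own bespoke record of holdings."""
    def __init__(self, name):
        self.name = name
        self.records = {}  # e.g. {"XYZ": 1000}

    def reconcile(self, exchange_records):
        """Compare local records with the exchange's and return any breaks."""
        return {
            code: (qty, exchange_records.get(code))
            for code, qty in self.records.items()
            if exchange_records.get(code) != qty
        }  # each break means more messages back and forth to resolve


class SharedLedger:
    """DLT model: one source-of-truth record set read by every node."""
    def __init__(self):
        self.records = {}

    def settle(self, code, buyer, seller, qty):
        # A single update is visible to all participants at once,
        # so there is nothing left to reconcile afterwards.
        self.records[(buyer, code)] = self.records.get((buyer, code), 0) + qty
        self.records[(seller, code)] = self.records.get((seller, code), 0) - qty


# Usage: a reconciliation break under the old model, a shared view under the new one.
broker = Counterparty("BrokerA")
broker.records["XYZ"] = 1000
print(broker.reconcile({"XYZ": 900}))  # {'XYZ': (1000, 900)}

ledger = SharedLedger()
ledger.settle("XYZ", buyer="BrokerA", seller="BrokerB", qty=100)
print(ledger.records)  # identical view for every participant's node
```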

Hype versus reality

It’s likely that the eyes of the post-trade world will be on Australia as it tests DLT, which to date has been backed more by hype than by substance.

In its September 2017 report, ‘The potential impact of DLTs on securities post-trading harmonisation and on the wider EU financial market integration’, the European Central Bank (ECB) examined exactly those questions. The report errs very much on the side of caution, while acknowledging the potential of the technology.

“Market players and public authorities have embarked on a learning process that has familiarised many of them with the foundations of distributed systems,” states the report. “Yet while the debate on DLT adoption has largely focused on technical features, discussions on the potential impact of DLTs on financial market integration are still at a preliminary stage.”

T2S response

One of the areas studied in depth is the impact of DLT adoption on TARGET2-Securities (T2S) harmonisation activities. This could vary, depending on the different adoption models, says the report. DLTs can in principle accommodate omnibus account structures, including the provision of appropriate services on those accounts. This means that agreed T2S standards could in principle be kept in the case of DLT adoption, but there is no guarantee that developers and adopters of the technology will reach a unanimous decision on that point.

An instance where DLT solutions currently under development appear to be diverging in the T2S community is that of securities and cash account numbering, where the use of public keys to identify DLT users may diverge from T2S agreed standards. The report states that if DLT and non-DLT solutions are to coexist, interoperability between the two approaches needs to be ensured. There may be a need to provide ad hoc matching fields where a participant holds both a DLT and non-DLT account.
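As a hedged illustration of that interoperability point (the identifier formats and field names below are invented, not taken from the ECB report or T2S documentation), a simple matching table could translate between a participant’s DLT public-key identifier and its legacy account number where the same firm holds both types of account.

```python
# Hypothetical matching fields linking a DLT public-key identifier to a
# legacy (non-DLT) securities account for the same participant.
# All identifiers and formats below are invented for illustration.

MATCHING_FIELDS = [
    {"participant": "Bank A", "dlt_account": "pubkey-9f3a1c", "legacy_account": "SAC-000123"},
    {"participant": "Bank B", "dlt_account": "pubkey-41be77", "legacy_account": "SAC-000456"},
]

def to_legacy(dlt_account):
    """Resolve a DLT public-key identifier to its mapped legacy account, if any."""
    for row in MATCHING_FIELDS:
        if row["dlt_account"] == dlt_account:
            return row["legacy_account"]
    return None

print(to_legacy("pubkey-9f3a1c"))  # -> 'SAC-000123'
```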

In T2S, all participants adhere to a harmonised definition of legally defined moments relating to settlement finality. Potential DLT adoption may introduce fragmentation among the settlement finality rules of different systems, which would hamper their ability to interoperate. For most transactions, there will always need to be a designated system recognised by the EU public authorities to ensure that counterparties are protected against insolvency procedures.

With respect to the settlement cycle timeline, the adoption of DLT solutions, if fully interoperable across all involved institutions, could allow straight-through processing and settlement at trade. Nonetheless, costs borne by market participants in terms of additional liquidity need to be carefully considered, says the report.

Solutions that allow T2S central securities depositories to interface DLT-enabled securities settlement systems with the T2S platform, which is based on mainstream technology, possibly permitting delivery versus payment via standard real-time gross settlement dedicated cash accounts, may require the definition of new harmonisation activities to enable such connectivity. The possible introduction of a variety of DLT-based payment systems might also require technical interoperability to be ensured between old and new systems dealing with commercial bank money.

In the realm of central bank money in the euro area, this is not expected to have a negative impact on financial integration, since a common DLT model would be developed if DLTs were deemed viable for Eurosystem market infrastructures in the future.

David Pearson, head of post-trade strategy at Fidessa, believes “it will need something radical to happen to get DLT into the post-trade mainstream”. It is still early days for DLT and no one has yet proven the technology, even internally. Most organisations would have to make a “huge leap of faith” to pursue DLT projects, given the lack of experience and skills required. Pearson says there is much more momentum in buyside firms behind technologies such as robotics and artificial intelligence than there is behind DLT.

“It is likely that the large investment banks may decide to invest the time and effort into trying to develop a DLT-based solution. They have the money and the technological knowhow. Possibly in a year or two they will have an offering for buyside firms that will give access to cleaner, faster data on their blockchain.”

John Harvie, a director in the business performance improvement practice at Protiviti, says one of the factors slowing the adoption of DLT technology is the need for it to be market infrastructure-led. “This requires firms to recognise the need for co-opetition, the potential to generate a bigger more efficient market by collaborating and then competing to divide the market,” he says. “Some sectors are better at recognising this opportunity than others and there are plenty of historical precedents that have demonstrated how successful this can be. However, it does tend to slow down adoption as different parties need to agree standards and the group tends to move at the pace of the slowest.”

A more optimistic view of the prospects of DLT in post-trade is held by Jerry Norton, head of strategy for CGI’s financial services business. He believes financial services firms now have more confidence in DLT and blockchain technology. Unlike some previous post-trade technology projects, such as the GSTPA (Global Straight-Through Processing Association), DLT enables solutions to be ‘right sized’, eliminating the asymmetric benefits of larger, more technology-heavy projects. “DLT could deliver a better cost-benefit ratio because it is distributed and lighter weight in terms of the software participants would need. It is quite radical in this respect.”

If the ASX project is successful, says Pearson, it could provide the catalyst for much wider adoption of DLT in post-trade.

©BestExecution 2018
