
Fixed income trading focus : MiFID II : Sassan Danesh


WHAT DOES MIFID II MEAN FOR FIXED INCOME?

By Sassan Danesh, Partner, Etrading Software and Co-Chair of the OTC Products Committee of FIX Trading Community.

It is now abundantly clear that MiFID II will drive wholesale change to the current OTC model of client <–> sales person <–> trader interaction, particularly given the predominantly manual industry practices based on voice-trading that exist today.

This article describes how the industry is responding to the challenge by creating new front-office tools to process the large volume of electronic data that will become a part of the post-MiFID II world order. It further explores how standardisation and collaboration can reduce the cost of implementation within existing, often heterogeneous front-office environments.

Impact on the OTC front office

The current OTC workflow is heavily dependent on the salesperson facilitating the client / trader interaction with communication predominantly through voice or IB chat and labour-intensive booking processes.

Whilst MiFID II will move some of this workflow onto electronic trading venues, thereby enabling lower-touch interaction, it also places significant new obligations on the remaining business activities that will stay OTC, including:

  • Knowing whether the client is allowed to trade the instrument on an OTC basis;
  • Understanding whether the client is placing an order, an instruction or an inquiry;
  • Understanding whether or not the firm is a Systematic Internaliser for the instrument;
  • Determining whether ESMA categorises the instrument as liquid or illiquid;
  • Determining whether the trade size is above or below the MiFID II SSTI / LIS thresholds.

Automation of the OTC front-office as a strategic solution

Whilst it is possible to meet these new obligations in a tactical manner, for example by creating additional spreadsheets that salespeople and traders fill in during the trading lifecycle, such short-term fixes fail to position firms to take advantage of the new opportunities presented by MiFID II, such as the capability to consume and integrate the newly available pre- and post-trade data into pricing and risk management tools, or to track, analyse and optimise each client's interaction with the firm holistically and in real time across all channels.

These opportunities are driving a growing consensus in the industry that a more strategic re-engineering of the OTC front-office to meet MiFID II obligations provides better long-term value at lower commercial and regulatory risk.

The key to such a re-engineering effort is to standardise and digitise the existing OTC workflows to allow the development of new tools that can be integrated into the existing front-office workflows.

The global regulatory context

Of course, many broker-dealers already have experience of designing such automation as a result of Dodd-Frank Act implementation in the US. In many cases, the DFA solution involved the addition of a regulatory rules engine coupled with a regulatory reference database within the front office environment.

These components provide the necessary information and decision-making logic to the salesforce to manage the OTC workflow efficiently and consistently.

Given the wider scope of MiFID II and the regulatory differences across jurisdictions, a wholesale lift and shift of the US infrastructure is not easy. However, European broker-dealers have the opportunity to take into account the US best practice approaches and modify them to meet the needs of MiFID II.

European best practice implementation

One advantage of the delay in MiFID II implementation to January 2018 is the breathing space it provides firms to implement strategic solutions that allow the front-office to capitalise on the new opportunities afforded in a post-MiFID II world.

Such strategic solutions require the deployment of regulatory rules engines, regulatory reference databases and components to interface with APAs / ARMs; the capture of trading activity for real-time SI monitoring and best execution obligations; and streamlined workflow processes for all the permutations of interaction types / client types / trading models / venue types / instruments across the trading lifecycle.

Such an undertaking can be costly to implement. Across those five dimensions there are, in theory, some 2,700 individual workflow permutations that need to be analysed systematically to ensure conformance to MiFID II. However, it is noteworthy that all firms have the same obligations, and therefore the analysis and design of solutions is amenable to collaborative models, with the attendant benefit of sharing costs.
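
To make the scale of that combinatorial problem concrete, the short Python sketch below multiplies out one purely hypothetical breakdown of the five workflow dimensions. The category names and counts are illustrative assumptions (chosen so that the product happens to equal 2,700); they are not the actual breakdown behind the figure quoted above.

from itertools import product

# Hypothetical category counts per dimension -- illustrative only.
dimensions = {
    "interaction_type": ["order", "instruction", "inquiry"],                       # 3
    "client_type": ["professional", "eligible_counterparty", "retail",
                    "non_eea", "intra_group"],                                     # 5
    "trading_model": ["voice", "rfq", "click_to_trade", "order_book", "auction"],  # 5
    "venue_type": ["otc", "si", "mtf", "otf"],                                     # 4
    "instrument_class": ["class_" + str(i) for i in range(1, 10)],                 # 9
}

permutations = list(product(*dimensions.values()))
print(len(permutations))  # 3 * 5 * 5 * 4 * 9 = 2,700 permutations to analyse

Even modest counts per dimension quickly produce thousands of cases, which is why a systematic, shared analysis is more tractable than each firm enumerating the permutations independently.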

Current collaborative models for MiFID II implementation

The industry is already collaborating in multiple fora on MiFID II:

  • Via FIX Trading Community to understand the technical requirements;
  • Via trade associations to advocate refinement to the rules;
  • Via private initiatives to perform shared analysis of the rules.

These fora provide cost-effective opportunities for the industry to share the burden of analysis and implementation.

Looking ahead, there is also scope for the community to collaborate on additional open standards and common implementations; indeed, the FIX Trading Community is looking at establishing open standards for the publication and consumption of pre- and post-trade transparency data.

Such open standards allow end-users to lower the cost of consolidating the disparate market data sources that MiFID II will create, and also provide simplified connectivity and workflow integration to those APAs that adopt them.

Opportunities for collaboration to create additional standards

A common, open standard for APA connectivity is just one example of the ability of standards to simplify the design, implementation and integration of the new front-office tools necessary to meet regulatory requirements. There is scope for industry to explore additional areas where standards can help. In particular, given the increasing acceptance of the need for regulatory rules engines and associated regulatory reference data, this is an opportune time to discuss the establishment of open standards for access to such rules engines and data.

For example, could FIX or other open standards be used to define a standard API to query a rules engine to return whether the client is allowed to trade the specific instrument on an OTC basis with the firm? The creation of such open standards would provide opportunities for rules engine vendors to quickly and easily integrate their offerings within the existing front office technology stack and lead to significant cost and time-to-market savings for the industry.
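
As a thought experiment only, a minimal sketch of what such a query interface might look like is shown below in Python. No such standard exists today; the type names, fields and decision values are hypothetical illustrations, not part of FIX or any vendor's API.

from dataclasses import dataclass
from enum import Enum

class Permission(Enum):
    ALLOWED = "allowed"
    BLOCKED = "blocked"
    REFER = "refer"  # escalate to compliance for a manual decision

@dataclass
class OtcTradingQuery:
    client_id: str
    instrument_id: str   # e.g. an ISIN
    notional: float
    currency: str

@dataclass
class OtcTradingDecision:
    permission: Permission
    reasons: list        # identifiers of the rules that fired

def check_otc_permission(query: OtcTradingQuery) -> OtcTradingDecision:
    # Placeholder logic: a real rules engine would evaluate client
    # categorisation, instrument scope and firm-specific restrictions
    # against its regulatory reference data.
    if not query.client_id or not query.instrument_id:
        return OtcTradingDecision(Permission.BLOCKED, ["missing_identifiers"])
    return OtcTradingDecision(Permission.REFER, ["no_rules_loaded"])

print(check_otc_permission(OtcTradingQuery("CLIENT-001", "XS0000000001", 5_000_000, "EUR")))

The value of agreeing such an interface lies less in the specific fields than in giving every rules engine vendor the same contract to implement, so that front-office systems can swap engines without bespoke integration work.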

Similarly, is it possible to create open-source, cross-jurisdiction regulatory data models that vendors and end-user firms can code up to? Given the common specification (laid down in regulations), such an approach may well allow the industry to realise significant benefits by simplifying workflow integration across disparate vendor and home-grown systems.
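
A minimal sketch of what a shared, cross-jurisdiction instrument record might hold is shown below, reusing the MiFID II fields mentioned earlier (the ESMA liquidity categorisation and the SSTI / LIS thresholds). The structure, field names and values are assumptions made for illustration, not any published data model.

from dataclasses import dataclass, field

@dataclass(frozen=True)
class TransparencyThresholds:
    liquid: bool            # regulator's liquid / illiquid categorisation
    ssti_threshold: float   # size specific to the instrument, in EUR
    lis_threshold: float    # large in scale, in EUR

@dataclass(frozen=True)
class InstrumentRegulatoryRecord:
    isin: str
    asset_class: str                               # e.g. "corporate_bond"
    regimes: dict = field(default_factory=dict)    # regime name -> thresholds

# Illustrative record for one instrument; further regimes (other
# jurisdictions' rule sets) could be added as extra dictionary entries.
record = InstrumentRegulatoryRecord(
    isin="XS0000000001",
    asset_class="corporate_bond",
    regimes={
        "mifid_ii": TransparencyThresholds(liquid=False,
                                           ssti_threshold=300_000,
                                           lis_threshold=500_000),
    },
)
print(record.regimes["mifid_ii"].liquid)  # False

An open model of this kind would let vendors and in-house systems exchange regulatory reference data without each party maintaining its own mapping layer.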

Other standards can also be created for the normalised access and storage of the requisite trading activity data and also for integration to risk management and trading systems.

Opportunities for utilities

There are also opportunities to establish utilities that implement these standards on an open, non-discriminatory basis. Such utilities enable users to capture the value of any network effect and associated economies of scale in non-competitive functions whilst allowing vendors to provide value-added services to market participants on top of the basic utility infrastructure.

Data is one area that may be suitable for the creation of utilities. Given the continued and seemingly inexorable rise in the cost of market data in equities, there is an opportunity for OTC markets to leapfrog the equities model by establishing open-source data utilities with industry governance and at-cost pricing models.

Such utilities are already being discussed, for example for the supply of ISINs and the associated instrument reference data for OTC derivatives, based on the relevant ISO standards.

Another potentially suitable area is standardised connectivity to allow easy data transmission from content producer to consumer. Given the sensitivity of data in the OTC space, such utilities will also need appropriate data governance for content providers, as exemplified by Neptune in the bond markets.

In conclusion

It is clear that there is increased industry appetite in OTC markets to establish market infrastructures in areas of common concern through open standards and industry utilities operating under balanced governance.

Having been a backwater in terms of automation and standardisation, it now seems possible that MiFID II might prove to be the catalyst for OTC markets to create a modern, low-cost, standards-based infrastructure to allow the industry to service client needs efficiently and electronically.

The FIX Trading Community has a number of different working groups currently looking at MiFID regulation. If you would like to find out more, please contact us at fix@fixtrading.org

©BestExecution 2016


Addressing The Market Data Cost Challenge In Fixed Income

By Sassan Danesh, Managing Partner, and Kuhan Tharmananthar, Product Development, eTrading Software
Recent years have witnessed an inexorable rise in the electronification of trading, driven by a trinity of regulatory, technological and cost pressures. The benefits to investors have been significant, including speed of execution, reduced explicit transaction costs and improved price discovery. However, alongside these benefits has come the ever-growing cost of connecting to and consuming the data required to operate efficiently in the new electronic world.
Much of this data is essential: It is data that defines the instrument being traded; it is data and connectivity to it that reveals potential prices from different brokers; it is data that captures actual trade prices and market closing levels. It is not surprising that this data is now valuable and that an entire ecosystem has sprung up surrounding the generation, maintenance, distribution and consumption of this data.
The challenges associated with the high cost of data are not new to the equity markets, which have been wrestling with this problem for many years. However, the increased electronification that is now occurring in fixed income markets is threatening to create the same challenges for OTC markets. The timing is therefore ideal for the community to start discussions on creating a fixed income market data ecosystem with a more optimal cost structure and data governance model for all market participants.
Reducing market data costs through collaboration
Part of the reason for today’s high market data costs rests in the structure of the market data ecosystem: raw data is provided by brokers in various formats; various data vendors then clean, reformulate and enhance this data before selling it to market participants including those who generated it. This enriched data is delivered via proprietary channels and using custom formats. Market participants embed these proprietary formats into their own systems, building complex mapping and verification/scrubbing tools of their own to validate and clean data from different sources so they can build a single picture of the markets. This provides an opportunity for the market as a whole to collaborate to create a standardised distribution mechanism to simplify the ecosystem and drastically reduce market data costs to end-users. However, any successful initiative that wishes to simplify this ecosystem must address the needs and sensitivities of the key stakeholders, including the sell-side (typically the data originators); the buy-side (typically the data consumers); and ideally also the data distributors (typically data vendors who provide the connectivity and value-added services to end users).
Taking the needs and sensitivities of each stakeholder in turn:
Buy-side: The fragmentation of fixed income markets is driving the buy-side need to access market data from as wide a set of data sources as possible, and to be able to sift through this data systematically. The buy-side therefore have an in-built incentive to promote such cost reduction. However, given the disparate sources of market data, a key challenge for the buy-side is to receive this data in a standardised format based on open standards to allow aggregation across data sources and to avoid individual vendor lock-in that can result in increased costs.
Sell-side: Fixed income market data is typically generated by the market-making desks of individual brokers for their client base in a particular product. Typically, the broker does not charge for providing this data as it is a pre-requisite for client trading. Indeed, the broker has an incentive to supply the data in the most cost-effective manner to ensure as wide a dissemination to their specific client-base as possible. However, given the risk of information leakage and data being passed to other market actors that the dealer does not have a relationship with, a key sensitivity for the sell-side is ensuring an appropriate data governance wrapper around any data distribution mechanism.
Data vendors: The cost of establishing connectivity to sell-side (data originators) and buy-side (data consumers) is a major challenge for vendors. Reduced connectivity costs can allow vendors to focus their investments on value added services such as TCA or other pre- and post-trade analytics and to market such services to the larger client base that becomes possible as a result of simplified connectivity. However, a key sensitivity for vendors is in ensuring that their business models can be made consistent with an ecosystem in which connectivity has become commoditised.
The right model of industry collaboration
Crafting a collaborative model that meets the aforementioned needs and sensitivities is a challenge. Luckily, the industry already has an example that it can emulate: Neptune, an initiative by a group of market participants to distribute pre-trade axe/inventory, provides a model that addresses many of the critical requirements discussed above:

  1. A standardised, open-source data format to simplify the consumption of data from multiple sources
  2. A data governance model to address information leakage concerns
  3. Reduced connectivity costs and interoperability with EMS & OMS vendors via the use of FIX to allow vendors to provide value-added services such as reference data or additional analytics
  4. Low fees through a utility model that ensures the benefits of scale and the value of the data are captured by end-users

Of course market data has its own specific needs as well, such as providing for industry governance over the creation of the composite price. Such a composite can accelerate the adoption of buy-side TCA and best-execution analytics, since these are predicated on the availability of a clearly defined and reliable industry ‘mid’. Therefore a successful collaborative model must also address the creation of an appropriately balanced governance structure for an industry composite.
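
To make the governance point concrete, the sketch below computes one possible composite mid: the median of contributing dealers' bid/ask midpoints. This is a deliberately simple, hypothetical methodology with made-up quotes; choosing the actual calculation (weighting, outlier handling, minimum contributor counts, staleness rules) is precisely what an industry governance structure would have to decide.

from dataclasses import dataclass
from statistics import median

@dataclass
class Quote:
    dealer: str
    bid: float
    ask: float

def composite_mid(quotes):
    # Median of dealer midpoints: less sensitive to one off-market
    # contribution than a simple average.
    if not quotes:
        raise ValueError("no contributing quotes")
    return median((q.bid + q.ask) / 2 for q in quotes)

quotes = [
    Quote("Dealer A", bid=99.80, ask=100.00),
    Quote("Dealer B", bid=99.85, ask=100.05),
    Quote("Dealer C", bid=99.70, ask=100.10),
]
print(composite_mid(quotes))  # 99.9 with these illustrative quotes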
And finally
The key question, given the theoretical benefits of such a collaborative model, is whether the industry is ready to launch such an initiative.
The debate continues, but what is clear is the increased industry receptiveness to establishing open standards and industry utilities that make life easier for all participants whilst still providing scope for each firm to compete on top of the richer market infrastructure that such standards and utilities create.

The Evolution Of Fixed Income

With Lee Sanders, Head of Execution FX and UK and Asian Fixed Income Trading, AXA IM
Regulation is an increasingly important and dominant part of the trading world. New regulatory requirements dealing with transparency around fixed income markets are going to be hard to prepare for. The pre-trade transparency side of MiFID II and then the post-trade side will provide the fixed income world with considerable challenges.
Liquidity remains a principal concern. The first two months of this year have been some of the hardest since 2008 in terms of sourcing liquidity and that’s before the new regulation comes into force. The delay to MiFID II gives the industry longer to prepare for these changes, but as the deadline draws nearer and nearer, we are thinking more and more about how we will have to adapt to the way we trade.
Buy-side mind set change
In terms of a longer frame of reference for the evolution of fixed income trading on the buy-side, a few years ago a market practitioner told me that we should hire some equity traders onto fixed income desks to better deal with the changing market structure, and at times I can see how we have moved in that direction in terms of trading and market structure. There are a variety of metrics examining what we do, but one thing that is outstandingly clear is that fixed income orders are fragmenting in a meaningful way.
It is hard to find the balance sheet to take down orders so there are two choices with regard to the way we trade: either break the order down and try to find the aggregated liquidity or work it as an order. The banks don’t want to take balance sheet risk. They would rather bypass that risk if they can and act more like agents.
What is clear to us is that more of the inventory sits with the buy-side, hence it might be worthwhile if the buy-side traded more with the buy-side. However, there still seems to be a reluctance from the buy-side to trade with one another, even in the dark or on platforms. In addition, the buy-side is more aggressive on pricing in the dark than with their more established liquidity providers. The buy-side views other buy-side firms as competition and is less likely to want to leave anything on the table.
The buy-side will never be pure market makers as that is not the business we are in. However, there could be a change to the extent that the buy-side become a price maker and this is the direction the industry should be heading towards. The development of bond trading means that there is nothing wrong with orders coming in from the fund manager and our posting them on an aggregated liquidity environment and staying firm on that price. The investment bank model is changing which could be a good thing for market structure. Regulation and capital constraints are helping to facilitate that and it is all key to the industry’s continued evolution and development.
Next developments
Risk is effectively ‘warehoused’ when firms all move in the same direction, so the firms that may do well could be ‘contrarians’ – those who make prices and decisions opposing the direction of flow. However, it is difficult to see this method being effective given the size of the books being moved around by the largest asset managers in Europe.
The next wave of development could come from those contrarian investors and opportunistic liquidity providers as they come into trading platforms and put their liquidity on systems for traders to find. Certain banks are reinventing their business model into a more hybrid agency model, but they could also be called opportunistic liquidity providers. This model allows them to potentially take advantage of market conditions with a more prohibitive bid offer spread.
More patience is required by the buy-side; if fund managers have time, orders can be worked to completion at the market level, but this may be compromised if time is an issue.
There are many different definitions of liquidity; price, immediacy and bid-offer spread are all key, but one I like to use is that liquidity is the confidence that we will be able to sell something we bought at a fair price at a given point in the future. Other areas of interest are transaction cost analysis and liquidity points. There are some excellent pieces of work around optimum trading times being prepared by think tanks.
Another key development is the change to our management style relating to how the industry manages fixed income. We still have active portfolios but there has been an emergence of buy and maintain products, and short duration products.
One of the metrics which we use to show liquidity is the proportion of our trades fully executed on the day. As an example, if liquidity were poor, a figure below 50% of trades completed on the day would illustrate that, and we have never been below 92%. But AXA IM is an established asset manager and does have an advantage. Most banks have been discussing the 80/20 breakdown, where around 80% of revenue comes from 20% of the client base. Increased regulation could cut that tail away as banks could simply stop servicing those smaller clients; smaller asset managers could therefore be at a huge competitive disadvantage come January 2018, as no one will want to send liquidity their way because of the new transparency regimes.
Sell-side rationalisation
The rationalisation taking place across most of the banking sector could be perceived as positive. Banks that want to be everything to everyone in every market are likely to experience problems with that model. But fundamentally MiFID II is at the forefront of everyone’s mind. It frames all those other things which take place day-to-day; the liquidity conversation, the research conversation, the market structure conversation etc – it all comes back to MiFID II.

Countering Cyber-crime in Financial Services: Keeping Your House in Order


This report provides an overview of the cyber-threats faced by financial institutions in 2016. Financial services sector companies are re-investing vast sums of money to enhance their cyber-security – an estimated USD 9.5bn was spent in 2015 in the US alone. Firms are recruiting former government intelligence officers and hackers to help increase their organisations’ resilience to cyber-attack. However, systems vulnerabilities are still regularly exposed by sophisticated attackers. Financial institutions and governments are working to address these threats by creating organisations designed to collate and share the information needed to offset the risks posed by the activities of cyber-criminals.

 

https://research.greyspark.com/2016/countering-cyber-crime-in-financial-services/

 

 

Democratising Data: Self-service Advanced Analytics in the Financial Services Industry


This report assesses how self-service advanced analytics solutions can be used to enrich the output of business managers, focusing on their use in analytical discovery and modelling. Data repositories in the financial services industry are becoming larger and more complex, and the skillsets needed to manage, interrogate and interpret data – a combination of business acumen, statistical awareness and programming – are becoming increasingly expensive and difficult to source.

 

https://research.greyspark.com/2016/democratising-data/

Unifying Emerging Markets

Grigoriy Kozin, Director, Head of Prime Services at Otkritie Capital International Limited examines lessons that can be learnt from Russia by other emerging markets.
It is relatively easy for international clients to access the Russian market compared with other emerging markets. This is partly due to the role of brokers such as ourselves who are based in London. At Otkritie, we serve mainly international institutional clients and we take steps to alleviate the difficulties of trading in Russia.
When talking about emerging markets, there are three key components which make the markets more accessible and attractive to international clients. The first is the macroeconomics of the country: the political situation and the resulting impact on liquidity. Russia has a diversified set of asset classes, but the current geopolitical situation, the drop in the oil price and the devaluation of the ruble mean that cash equities do not look very attractive in terms of valuations, and as such we have seen a sharp drop in volumes.
At the same time, however, geopolitical turbulence usually brings with it increased volatility in the FX markets. As a result, the Moscow Exchange moved its focus to a range of FX products, such as FX spot and FX futures, which became the main revenue generators for the exchange. This volatility leads to a huge increase in turnover in that part of the market, and as such this is currently one of the main areas of interest to international clients, specifically hedge funds and proprietary trading firms.
Since 2011 there has been a dramatic turnaround in the volumes of index futures versus FX futures: on the Russian market, index futures used to have a market share of more than 75% and FX around 15%. But this year, index futures only make up 20% of market share and FX futures are now responsible for more than 70% of volume.
It is difficult to see how other emerging markets could follow this practice, as the volatility of FX is closely related to wider monetary policy. The Russian central bank is trying to maintain relatively independent monetary policy, which helps with confidence in trading, even if it has spurred on the volatility.
By way of encouraging liquidity and trading in other areas, the Russian markets recently started to develop exchange traded funds (ETFs) which are now a fast-growing investment vehicle. We believe that for international clients who would like experience of the Russian markets, these ETFs could be an interesting product with which to start trading to mirror the index. There are Russian index-based ETFs and ETFs based on Russian Euro bonds either hedged into the ruble or hedged into dollars, and this is a rapidly growing area for us.
Market structure
The second of the three key components is infrastructure: both trading infrastructure and wider market and settlement infrastructure. The Moscow Exchange has made huge improvements over the last five years in terms of ease of access for international clients. Five years ago, there was T+0 settlement with full pre-funding. There was no central depository and there were many infrastructure problems which meant that, for international clients, the Russian market was an inconvenient place in which to trade. So in 2012, a central depository was launched. Shortly thereafter, in 2013, a central counterparty was introduced on the Moscow Exchange and a T+2 settlement cycle with partial pre-funding was achieved. The exchange subsequently opened the door to the European clearing houses so that clients could clear their trades with international clearing houses.
All these factors made the Russian market a much more convenient place to trade for international clients. The regulators and the Russian exchange helped to close the market structure gaps between the Russian and the European markets. This is a valuable lesson for other emerging markets; best practices, especially those of a large neighbouring regulatory area such as Europe, should be adopted to help ease trading across those markets. However, there are still a number of things that the Moscow Exchange needs to do in order to attract more international clients. Official global clearing membership was introduced in 2014 on only one market (of three) and it should have been introduced across all markets in order to reduce the risk for market participants trading on the Moscow Exchange.
Legislation
The third component is legislation and taxation. Together with the Moscow Exchange, we are working hard to introduce best practices in terms of regulation of the financial sector and of legislation into Russia. The Central Bank is doing a reasonable job in this regard in terms of regulation.
Taxation in Russia, compared with other emerging markets, is relatively straightforward. It is in line with the approaches announced by the Organisation for Economic Cooperation and Development (OECD), so it is generally unified with the European approach. Again, there is still much work to be done here, specifically relating to the taxation of listed derivatives and other assets, but we are moving quite fast and in the right direction.
The progress made over the last five years has been significant. New legislation is being developed to regulate dividend payments, which will make it simpler to understand when and where dividends are paid. This is happening with the help of the Moscow Exchange and our international clients and obviously, the brokers who are bringing international clients into Russia.
Becoming global
Otkritie Capital International is required to be fully MiFID II compliant given that we are a UK-regulated broker, and as such the benefits of unifying regulatory standards between Europe and other emerging markets are clear to us. The Russian central bank does seem to be moving in the same direction in terms of its regulation. In 2015, the Central Bank signed a memorandum of understanding with IOSCO which seeks to drive global cooperation and increase transparency when clients are undertaking cross-border operations.
However, as a European broker we have many more concerns about MiFID II than local Russian brokers or local Russian market participants. Overall, we can see that the borders between the Russian and international markets are becoming less pronounced. One of the main goals of the Moscow Exchange is to bring liquidity back to local markets, rather than having so much of the volume traded through GDRs. A couple of years ago, more than 50% of the most liquid stocks were traded internationally in the form of ADRs and GDRs. Now we see increasing volumes moving back into local shares, which means that even for international clients and large funds seeking to invest in Russia, it is becoming more convenient to trade locally rather than internationally.
Sooner rather than later, Russian markets will become more like developed markets than emerging markets. Removing sanctions will of course help and will boost the economy. With FX settlement, for example, one of the biggest problems is that the Russian ruble is not a CLS-clearable currency, which means banks require larger limits to settle rubles and settlement usually takes more time than for CLS-clearable currencies. Making the ruble CLS-clearable would significantly help integrate Russia more deeply into international financial markets.
In conclusion, first of all, there needs to be a clear commitment on the part of the government to help international clients. It needs to be reflected in better legislation and taxation approaches for international clients. In addition, the local market has to learn from the experiences of other international markets in terms of market infrastructure. Provided that the market remains affordable for clients, it will bring in international investors.

The Evolution Of The Buy-side Trader

With James Li, Head of Asian Dealing, Baring Asset Management
The early buy-side trader was originally more of a processing role between the portfolio managers (PMs) and the broker before evolving into today's specialist, alpha generating, professional trading role.
The catalysts for evolution came as a result of the growing complexity of market structure, the increased adoption of technology and tighter regulation. The buy-side, cognisant of all of these factors, realised the advantages that specialised market professionals would add to performance and the protection of client assets. As buy-side firms began looking at execution quality and the mitigation of operating risk, the buy-side trading desk became more integrated with the investment teams to generate alpha for portfolios, whilst operationally examining how processes could be improved to reduce dependence on manual processes that persisted simply because they had always been done that way. The buy-side trader has had to evolve into the space between the PM and the broker, with responsibilities including deciding on the appropriate execution strategy to pursue, where to source liquidity whilst minimising information leakage, and then, post-trade, analysing the quality of the execution and taking away any lessons from the outcome.
Rather than rigidly sticking to one trading benchmark, fund managers realised that different market conditions, such as changes to volatility or momentum require a variety of approaches, recognising that traders could add value to the investment process.
The use of benchmarks has also evolved over the years with the ever-changing market landscape, from a traditional schedule-based VWAP approach to more price-sensitive arrival price or PWP approaches. Whilst historically there was typically one principal benchmark used in TCA measurement, the analysis has evolved into looking at multiple measures to gain a clearer picture of the effectiveness of the different strategies being employed. From this we can learn how to tweak our process to increase the alpha generated from the investment decision. I feel this approach gives my team the freedom to utilise their skill-sets and market knowledge in the pursuit of best execution, as opposed to being constrained by a narrower measure of their performance.
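
As a simple illustration of measuring an execution against more than one benchmark, the Python sketch below compares a buy order's fills with both an arrival price and an interval VWAP, expressed in basis points. The fills, benchmark levels and sign convention are assumptions for the example, not a description of any particular TCA product.

from dataclasses import dataclass

@dataclass
class Fill:
    price: float
    quantity: float

def vwap(fills):
    # Volume-weighted average price of the order's own fills.
    total_qty = sum(f.quantity for f in fills)
    return sum(f.price * f.quantity for f in fills) / total_qty

def slippage_bps(exec_price, benchmark, side):
    # Positive means the execution was worse than the benchmark.
    sign = 1.0 if side == "buy" else -1.0
    return sign * (exec_price - benchmark) / benchmark * 10_000

fills = [Fill(100.10, 3_000), Fill(100.20, 2_000)]
arrival_price = 100.00   # mid when the order reached the desk (assumed)
interval_vwap = 100.12   # market VWAP over the order's life (assumed)

avg_px = vwap(fills)
print(round(slippage_bps(avg_px, arrival_price, "buy"), 1))   # 14.0 bps vs arrival
print(round(slippage_bps(avg_px, interval_vwap, "buy"), 1))   # ~2.0 bps vs interval VWAP

Looking at both numbers together helps separate the cost of the decision's timing from the quality of the execution itself, which is the point of using multiple measures.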
Changing relationship with portfolio managers
Communication is always the key and we encourage active dialogue between the portfolio managers and trading desk. As client mandates are getting more complex and market structure is ever changing, we look at positioning ourselves to help the portfolio managers focus on their investment decision. In reality, we are providing high touch sales trading to the portfolio managers.
Whenever portfolio managers put on a trade, the dealing desk should be in a position to understand what they are trying to achieve – whether it is a regular distribution of a cash inflow across positions or an investment decision with a time urgency. To help the PMs with the investment decision process, the dealing desk can reflect the opportunity to trade block liquidity in core names or give colour on market noise. That is part of the value proposition a buy-side trading desk should bring.
Changing sell-side relationship
With the introduction of new regulations and technology, there may be a need to look at the range of counterparties we trade with. Our dealing desk has commission sharing agreements set up with most of the global brokers, and within this framework best execution has become more important than ever. Best execution involves looking at the duration of an order and examining how to achieve the order with minimal market impact. It involves careful examination of all the tools available: electronic trading, block trading, crossing networks, agency versus principal trading. Regulation has forced the buy-side to examine and justify those choices a lot more carefully.
We place considerable emphasis on brokers that provide us with intelligent execution consulting and block liquidity. One of the biggest challenges facing any buy-side trading desk is sourcing blocks to trade and we do need to use our counterparties’ expertise to locate liquidity, against the backdrop of increasingly fragmented markets.
Driven by regulation
It is difficult to overstate the impact of regulation, particularly on the buy-side. Conversation is now taking place around the impact of MiFID II and other global regulation on best execution and transparency requirements, and the impact here in Asia needs consideration too.
We can also examine the wider electronic trading rules, looking back particularly to the SFC’s introduction of electronic trading rules. Those rules meant that once signed up to a broker’s algorithmic trading products, the emphasis is very much on the buy-side trader to understand how those products work and how they might impact the market. In Europe, there is now a similar consultation exercise for asset managers to comply with the new systems and control guidelines issued by ESMA and similar work being carried out by such bodies as the PRA.
This has been a major shift from a buy-side perspective, as the previous ability to subscribe to a multitude of different electronic trading services in the market has been replaced by a more selective approach of handpicking the few that will best serve our needs, and even these are reviewed on a regular basis. It is crucial we carefully examine which brokers (from a TCA or a coverage perspective) are maintaining the standards we need them to meet. The trading tools we use on the desk are a lot more targeted and precise. For instance, we may subscribe to a broker but we won't subscribe to the whole algorithmic suite, only to those algos that are to be used on a regular basis.
Driven by technology
Each industry goes through a paradigm shift and trading is no different, with technology playing a big part. At Barings, we are big supporters of using technology to enhance performance and mitigate risk in a cost-effective manner. Whereas buy-side desks used to rely predominantly on high-touch sales trading, there has been increasing use of electronic trading over the years, to the point where we are customising algorithmic strategies with brokers. This year, I would like to dive deeper into individual markets to see where our electronic usage rates are. Ideally, developed markets which trade on narrow spreads, such as Japan, Hong Kong and Australia, should have higher usage rates than the rest of the region.
Barings is also actively participating in various industry initiatives. Our Head of Dealing, Adam Conn in London, has chaired a FIX Trading Community working group looking to automate new issue applications and allocations in both equities and fixed income, and in both the primary market and the secondary placings process. We also participate in joint Investment Association and AFME working groups to bring standardisation to IOI definitions and industry oversight of electronic trading in Europe, similar to what we have developed in Hong Kong.
It will also be interesting to see how new initiatives such as Symphony for messaging develop. Blockchain has become the hottest topic this year. I am not exactly sure how it will reshape the industry, but I believe it is going to have a huge impact at some point.
What qualities we look for in traders
Aside from a basic aptitude for trading and the ability to process information in a timely manner, strong communication skills (both delivering and listening), especially with the PMs, are a priority. Understanding market structure and the impact of changes in global and local regulation is increasingly vital in the ever-changing market landscape. Innovation is effectively problem solving, so the ability to work as a member of a team is critical. As markets change, new skill-sets, such as being technology savvy or having a strong quant background to analyse and identify trends in large amounts of data, will help us find answers to new questions. Most importantly, the best traders need an open mind and must be prepared to continually learn whilst maintaining a degree of humility.

The long and winding road of Brexit : Lynn Strongin Dodds


The long and winding road of Brexit

It didn't take long for the historic 52% Brexit vote to reverberate around the country and the world. Within hours, David Cameron signalled his intention to step down and hand over the reins of power at the October Tory party conference, while the UK's European Commissioner, Lord Hill, also handed in his notice. That is without mentioning a revolt in the Labour party, with a score or more of shadow ministers resigning, and a potential second referendum in Scotland as First Minister Nicola Sturgeon sought discussions with Brussels to "protect Scotland's place in the EU".

Not surprisingly, markets are more than jittery and the pound dropped to an historic low on the morning after the vote. It not only fell nearly twice as much as on Black Wednesday in 1992 but was also a large enough drop to rank as the second-largest one-day fall for any major currency over the past 40 years, according to analysis of daily Bloomberg data going back to 1971.

Financial volatility is likely to continue in the following weeks although the UK was not unprepared. The Chancellor of the Exchequer George Osborne has been co-ordinating with fellow finance ministers in Europe, the IMF, central banks and the US Treasury Secretary to ensure markets can handle the shock. As he put it, “Markets may not have been expecting the referendum result – but the Treasury, the Bank of England, and the Financial Conduct Authority have spent the last few months putting in place robust contingency plans for the immediate financial aftermath in the event of this result.”

Although it is difficult to predict the future, there are some certainties. The Tory party will hold a leadership election at its party conference and once the new prime minister is chosen, it will be up to him or her to activate Article 50 by notifying the European Council. Once this happens, the UK is cut out of EU decision-making at the highest level and there will be no way back unless by unanimous consent from all other member states.

Leaving the EU is not an automatic or easy process. It has to be negotiated with the remaining 27 members and ultimately approved by them by qualified majority. Although these talks are meant to be completed within a two-year timeframe, many believe it will take much longer. There are divisions within the bloc about how the UK should be treated, and the European Parliament has a veto over any new agreement formalising the relationship between the UK and the EU.

EU leaders are also particularly concerned about the prospect of “contagion”, with the UK’s decision already fueling demands from populist, anti-EU parties in France and the Netherlands for referendums of their own on EU membership.

Lynn Strongin Dodds,

Managing Editor.

©BestExecution 2016

 

 

Looking for the truth : Lynn Strongin Dodds

Looking for the truth

As the EU referendum debate showed, the constant 24-hour flow of news can be damaging to stock and bond markets. Investor behaviour can swing from irrational exuberance to pervasive gloom within seconds depending on the information. Ironically, though, those who ignore the daily diatribe have a better chance of beating the indices.

This is corroborated by a Harvard study on investment habits that showed investors who received no news performed better than those who received a constant stream of information, good or bad. Moreover, among investors who were trading a volatile stock, those who remained in the dark earned more than twice as much money as those whose trades were influenced by the news.

Traditional outlets are not the only ones that have an impact; social media such as Twitter can be first to the market with information. Twenty-eight per cent of news about mergers and acquisitions breaks on this medium, according to a 2015 study by Selerity, a financial data platform made famous last year for publishing Twitter's earnings before the social media company did.

It also has the power to significantly move the needle, as witnessed by a bogus tweet about an explosion at the White House last year. The value of US stocks plummeted by around $90bn, although markets recovered quickly when it became apparent the information was false.

While much of this behaviour may pertain to the retail investor, pension funds, albeit taking a long-term view, have to navigate markets that become volatile, and at times they can be just as temperamental. As Iain Cowell, Head of UK & Ireland Solutions team at Allianz Global Investors, who conducted his own separate study, puts it, "Technology has vastly improved, information is ubiquitous, but human nature hasn't changed at all."

Taking a more in-depth look at the role of technology, he found that, as with any new development, it is a mixed bag of opportunities and challenges. For example, real-time quotes and news delivery, as well as access to regulatory filings on a fund manager's desktop via high-speed networking on a fast, dependable PC, can turn raw information into models, presentations and investment insights. However, it also limits the time that fund managers have to think things through more thoughtfully.

Cowell also found that it is easier to search for inputs to support or refute an idea, but that it is also important for investors to pull away from the "information firehose". "If you don't allow yourself time to reflect and imagine what others are missing/what might be important a year out, it's becoming trickier to add value," he adds.

In the end it is about cutting out the noise, myths and rumours to focus on the truth and leveraging that knowledge effectively. Of course, this is easier said than done.

Lynn Strongin Dodds

Managing Editor.

©BestExecution 2016

 

 

Fixed Income Best Execution Methodology

With Carl James, Global Head of Fixed Income Trading, Pictet Asset Management
The methodology of equity TCA appears to be being transplanted straight into fixed income, yet we know this way of thinking simply doesn't work. The MiFID definition of best execution describes a process in which price is an important component, but it is not the sole component. In terms of terminology, it would be better to use best execution 'methodology' or best execution 'benchmarking' instead of TCA.
What many in the industry don’t realise is the extent to which fixed income is a completely different asset class (with a number of different instruments) compared to equities. In addition, the reason that instruments are bought and sold is different for fixed income as compared with other asset classes.
The question of best execution methodology begins with the choice of execution strategy. Is it agency or principal? If agency, is it in a synthetic central limit order book or is it a dark pool? When making these decisions, firms need to use the underlying data to support their reasoning. Traders need to improve how they use the data they already have at their disposal. If a trade in a similar name worked well in a dark pool, it could be worth trying the same again. Similarly, if it worked well through an RFQ using a particular set of brokers, then they could try those brokers again to see if the liquidity is still there.
If a firm examines the active inventories and the hit ratios of winning bids and non-winning bids, there is suddenly a huge amount of data available that will lead to a defensible position of why the firm used Broker X and Broker Y; this should then lead to a better informed strategy and a chance at a better overall outcome. It is a layering process which becomes a more dynamic set of inputs and outputs, rather than simply a tick-box exercise to reach whatever benchmark is being targeted. Traders are therefore actually able to demonstrate that sensible decisions are being made all the way down the decision-making process.
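
As a small illustration of the kind of analysis described above, the sketch below computes a per-broker hit ratio from a captured RFQ history (the share of quoted enquiries a broker actually won). The record structure and sample values are made up for the example; a desk would substitute its own captured data.

from collections import defaultdict
from dataclasses import dataclass

@dataclass
class RfqResponse:
    broker: str
    quoted: bool   # did the broker come back with a price?
    won: bool      # did that broker win the trade?

def hit_ratios(responses):
    quoted = defaultdict(int)
    won = defaultdict(int)
    for r in responses:
        if r.quoted:
            quoted[r.broker] += 1
            won[r.broker] += int(r.won)
    return {broker: won[broker] / quoted[broker] for broker in quoted}

history = [
    RfqResponse("Broker X", quoted=True, won=True),
    RfqResponse("Broker X", quoted=True, won=False),
    RfqResponse("Broker Y", quoted=True, won=True),
    RfqResponse("Broker Y", quoted=False, won=False),
]
print(hit_ratios(history))  # {'Broker X': 0.5, 'Broker Y': 1.0}

Tracked over time and alongside axe and inventory data, ratios like these form the defensible, data-backed record of why particular brokers were put in competition.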
Free data
There is a vast amount of data that asset managers can buy, but this would clearly have a budgetary impact. There is also a considerable amount of free data that buy-side firms have access to because they have already been given it. The first set is their own hit ratio. Firms can examine a broker’s profile to see how well they perform, whether they are the best broker for example, and then they can examine how they fare in terms of other brokers etc – this all gives additional colour.
In terms of other data, there are the axes and inventories being sent to the desk: if the firm is able to capture this data systematically, then actual flows and fills can be used to detect and recognise patterns in quotes and spreads. If it is possible to detect changes in what is being traded, then we can work out where a broker is in terms of their positions.
Firms do need to be a little more autonomous in pulling the data together in order to analyse it. This is part of the ongoing behavioural development of the buy-side, and it is time that these changes were fully embraced. The responsibility of the buy-side to take ownership of this area, not just by collecting the data but by analysing it and acting on the results, means that more proactive steps will surely follow.
Attitude versus technology
As more and more CEOs, CIOs, compliance officers and finance officers realise how important it is to trade correctly, there will be an increase in the number of buy-side firms taking ownership of this area.
This is partly a reflection of the maturity of the industry. There is more automation, lower margins, fewer people and generally a more efficient product. As regulation continues to tighten and the technology continues to improve, the industry needs to keep re-evaluating itself and driving further forwards. What the industry needs to do is to start capturing the brains that are heading towards Silicon Valley for their careers rather than towards finance. A more holistic approach to best execution methodology is just one of the areas where firms can make wholesale improvements in their approach to technology and data, and better use the data that they already have access to.
