
Regulation & compliance : MiFID II : Gill Wadsworth

SO CLOSE YET SO FAR.

Although January 3 has passed, it is still unclear how many firms were MiFID ready. Gill Wadsworth reports.

As far as achievable deadlines go, the MiFID II January 3 compliance date always looked ambitious. In December, with just weeks to go, many buyside and sellside firms had yet to get systems, processes and policies in place.

While the Financial Conduct Authority (FCA) gave no indication of regulatory forbearance, the watchdog’s director of enforcement and market oversight, Mark Steward, said in September that the FCA would not take enforcement action where there is evidence that firms have taken “sufficient steps” to comply with MiFID II by the deadline.

However, even with a little more breathing room, firms need to get their houses in order and comply with the regulations as soon as possible to avoid civil, regulatory or criminal consequences.

Born out of the regulatory gaps highlighted by the global financial crisis, MiFID II is designed to improve transparency and investor protection, strengthen competition, and align regulation of institutions doing business in the EU. Importantly, the directive puts an end to asset managers taking research from anywhere – for free – and using it to inform investment decisions. Instead, they now have to pay for research from approved providers.

The research challenge

It sounds straightforward, but as Mike Stepanovich, president of enterprise services at Visible Alpha, says, this is a major change in the way the buyside sources and rewards researchers. “An analyst or portfolio manager used to be able to open and read any email, they could use research from anywhere,” he adds. “After January 3rd how do they know which emails they can read and which they can’t?”

MiFID II means asset managers need to pare back their research providers and analyse which ones are worth doing business with in the future. This is a significant judgement call, and a survey by ONEaccess in association with Institutional Investor found that only just over half of asset managers expected to be able to assess research quality and value by the January 3, 2018 implementation date.

The problem many asset managers have is in being able to easily analyse and assess their research providers and then compare them with the competition. The survey found almost half of asset managers believe one of the most difficult aspects of research valuation is the lack of transparency from their counterparties as to the pricing for research.

Even where asset managers have got their preferred and approved research providers in place, there is no guarantee these relationships will work in practice, meaning the first half of 2018 could see serious chopping and changing of providers.

Sourish Gupta, director of business development for financial services at The Smart Cube, says: “2018 will really be a year for trial runs. It may not work out for some fund managers who rely on incoming research for idea generation and also prefer to compare views from a wide base of researchers. Fund managers may ultimately have to broaden and/or change research provider lists that they put in place at the start of 2018.”

Perhaps more pressing than which researchers they will pay is how asset managers will pay for that research (see Figure 1). There are two main options. The first is to absorb the costs in the P&L, which is proving most popular with more than half (52%) of those surveyed by ONEaccess and has been chosen by State Street and Columbia Threadneedle Investments, among others.

The second option is to use a research payment account (RPA) funded via commission sharing agreements, which passes the costs on to investors and is the method of choice for 29% of asset managers, including Fidelity Investments. Five per cent of asset managers are still unsure how to pay for research, which means they will need to spend the start of 2018 getting themselves ready.

Tony Freeman, executive director for industry relations at DTCC, says: “It is very clear that some buyside firms left it late to decide whether to adopt the P&L or RPA model. This has left not enough time at all to get policies implemented.”

In some ways the sellside are beholden to the buyside, waiting until asset managers have policies in place before they can make moves of their own. While this is a hindrance for some, MiFID II has offered specialist providers the chance to make inroads into the market.

Freeman says high tech best execution brokers that offer pure execution and no research are very much looking forward to how MiFID II will change the market.

“[Pre-MiFID II] execution-only brokers say trading has been directed towards integrated investment bank broker-dealers that offer both execution and research. This is a whole new scenario and they believe they will prosper as they focus entirely on best execution without the conflicts of research,” he says.

At the same time, investment banks will also need firm plans in place for how they offer execution and research simultaneously. Freeman says investment banks are assembling Chinese walls where they intend to deliver best in class in both disciplines. Others, however, may choose to drop one in favour of focusing on the other.

The data hurdle

Data management is another huge area of focus – MiFID II requires firms to source, integrate, manage, deploy, disclose and report reams of new data: client, entity, security, product, portfolio, multi-asset, trading and exchange – which in turn creates a significant compliance task.

“The single biggest challenge of MiFID II, overall, is data management across front, middle and back-office systems,” says Nels Ylitalo, director, regulatory solutions at FactSet. “The simultaneous deployment of new data requirements across front-to-back systems poses a huge challenge. Large pieces of the data ecosystem were coming online relatively close to MiFID II’s go-live date and data management systems and resources will be hard pressed to be ready for full compliance.”

While it is clear many firms are still not MiFID II compliant, even though the deadline has passed, there are those that are ahead of the game. It is here that the directive’s aim of improving competition will come to bear.

Those buyside firms that have identified the best value research suppliers and have systems in place to allow investment professionals to focus on the day job while still tracking and logging their processes, will have a competitive edge.

The sellside that offers streamlined services, understands its clients’ requirements and is flexible enough to adapt as the regulations bed in will be the one that comes out on top. As David Tattan, head of European business development at TORA, puts it, “Lots of MiFID II is about technological advances within these providers and getting operations in the right place not just to conform with regulations but to generate a competitive advantage.”

With so much still to learn from the implementation of MiFID II it seems rather premature to consider a third incarnation of the regulation, yet some already see the possibility of a MiFID III. Like a bad movie franchise that will not die, the directive will keep reappearing.

Ylitalo says, “MiFID III is already being drafted. All European regulation contains a legal review clause; MiFID III is not a matter of if, but when.”

However, other commentators are less certain there will be a wholesale revision of MiFID II, rather that the existing directive will be tweaked and amended as unintended consequences emerge.

Ravi Jain, manager, business consulting at Sapient Global Markets, says, “While regulators will keep tweaking requirements to make it easier for firms and merging different regulations like EMIR and MiFIR, an overhaul in the form of MiFID III is improbable in the near future.”

©BestExecution 2018


Technology : The cloud : Frances Faulds

TECHNOLOGY – IS THE ANSWER IN THE CLOUD?

Frances Faulds looks at where banks are adopting the cloud and how they are mitigating the cybersecurity risks.

Banks typically spend 80% of their IT budgets on maintaining their legacy systems, but rough estimates are that they could cut IT costs by 75% by adopting cloud-based solutions.

The economics, and benefits, of using the cloud are beginning to add up and make a compelling case, not just for replacing legacy systems but also as an alternative to new on-premise hardware. Research firm Gartner expects that spending on cloud or SaaS-based applications will increase from 30% of the total market spend in 2015 to 47% by 2020, and that cloud will become the dominant deployment model by 2025 due to the increasing maturity of cloud-based, core financial management applications.

Cybersecurity is, and will always be, an issue for banks across all their systems, but one that they are becoming more comfortable with as cloud solutions mature and proven track records lengthen. Cloud vendors’ expertise and spending on cybersecurity exceed those of most banks, and cloud technology offers other benefits, such as built-in data redundancy, that are simply not available with legacy systems.

Jerry Norton, global lead for corporate and transaction banking at CGI, says: “There is a perception out there that somehow the cloud is less secure. I think it is actually more likely the other way around; it is probably more secure than your own capabilities, just because of the sheer amount of money that has been spent on the prevention of attacks.”

While there were initial security concerns about running trading engines in the public cloud, Guy Warren, CEO of ITRS, a provider of data analytics, says the growth of in-house cloud and private cloud means that security issues can be overcome.

Moreover, it does not have to be an either/or choice for the financial services sector, as different systems can be migrated to the cloud gradually, reducing risk and easing the transition. The cost advantages of cloud versus on-premise computing are significant, and paying only when computing power is needed makes the move to cloud increasingly hard for banks to avoid.

A scaleable solution

“In the world of fixed IT estates, we typically see financial services firms’ servers running at less than 20% utilisation over 95% of the time,” says Warren. “Datacentre space is expensive, translating to huge wasted cost. The cloud allows for a more flexible environment which can grow in capacity when demand is there, and reduce when it isn’t.”
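To put rough numbers on that waste, the back-of-the-envelope comparison below contrasts a fixed estate sized for peak load with elastic capacity that is paid for only when used. All of the figures (server count, hourly cost, on-demand premium) are illustrative assumptions, not vendor pricing.

```python
# Illustrative only: fixed, mostly idle server estate vs. capacity paid for when used.
FIXED_SERVERS = 100            # provisioned for peak load
COST_PER_SERVER_HOUR = 0.50    # assumed blended cost (hardware, power, datacentre space)
HOURS_PER_YEAR = 24 * 365

AVG_UTILISATION = 0.20         # "less than 20% utilisation over 95% of the time"
ON_DEMAND_PREMIUM = 1.30       # assumed premium for elastic, on-demand pricing

fixed_cost = FIXED_SERVERS * COST_PER_SERVER_HOUR * HOURS_PER_YEAR
elastic_cost = (FIXED_SERVERS * AVG_UTILISATION * ON_DEMAND_PREMIUM
                * COST_PER_SERVER_HOUR * HOURS_PER_YEAR)

print(f"Fixed estate:   ${fixed_cost:,.0f} per year")
print(f"Elastic estate: ${elastic_cost:,.0f} per year")
print(f"Saving:         {1 - elastic_cost / fixed_cost:.0%}")
```

Even with a sizeable on-demand premium, an estate that sits largely idle leaves most of its cost on the table.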

Louis Lovas, director of solutions at OneMarketData, which offers tick data management and analytics for algo trading, best execution and TCA, says some firms use the company simply for historical data; where they are looking to the data provider to warehouse private data, security becomes a critical factor. Others decide the risk profile is too high for their comfort levels, so they keep the data in-house and use a hybrid model – both private/public cloud and their own datacentre infrastructure – and simply connect the two.

The cost savings are significant, according to Norton, not just in terms of the applications used on the cloud but also for architectures handling high transaction volumes at peak times. In addition, the ability to pay only for the computing power needed far outweighs any cost savings that can be made in-house.

“Up until very recently, banks had been moving to the cloud incrementally but now more are jumping in with both feet, and this is because of cost: the cost advantage is potentially huge,” he adds.

To date, the challenges have been around security and regulation, especially data location, and material outsourcing requirements in some jurisdictions, as well as integration. Sudip Dasgupta, senior director of technology at Sapient, says: “Banks are nervous about having large volumes of data shuttling back and forth between on-premise and the cloud. This is why applications such as regulatory and transaction reporting are good candidates for the public cloud since these systems can process data in the cloud and don’t need to be brought back in.”

There is a move towards Platform as a Service (PaaS), and Sapient’s clients are also looking for cloud platform providers with a comprehensive offering covering both PaaS and Infrastructure as a Service (IaaS). “Given the large amounts of IP that is invested in their existing systems, banks would like to have the option, that if something is too complex to rebuild on PaaS, to be able to move them to an IaaS model on the public cloud,” says Dasgupta.

If the ‘low hanging fruit’ – the easier applications to move to cloud – have already been moved, what is the next step? According to Dasgupta some platforms can be optimised by rebuilding on public cloud PaaS; some applications require thousands of processors in a grid computing deployment, and these can be migrated to the public cloud (PaaS or IaaS).

There are also some applications with fluctuating demand patterns that will benefit from the elastic scalability of IaaS deployment on the public cloud, where granular billing rates will result in cost savings for the bank.

The human element

For Norton, while there is still some regulatory uncertainty relating to data location and security, it is the cultural change needed within an organisation to drive the move to cloud technology that should not be underestimated, both in terms of the skill sets required and the leadership to drive change.

“To get to these sorts of numbers, there has got to be a much more standardised approach to IT, and how you differentiate yourself in IT, and a lot of banks still believe that IT has got to have some form of differentiation in it,” says Norton. “If you go to cloud, you have got to run it on a standard platform, using standard tools and it has got to be very consistent and standardised from a technology point of view, and a lot of banks find that difficult to accept.”

He believes that there “needs to be a change of mindset. It’s not the technology; it’s other factors holding things back. It is accelerating, and will do so even more once critical mass gets behind it.”

HSBC is one bank that has committed to the cloud and it is partnering with Google to take advantage of the latest technology, data analytics and machine learning to transform the way the bank does its business.

Speaking at the Google Cloud Next conference in March this year about the decision to adopt a cloud-first strategy, HSBC CIO Darryl West said: “Our journey in data is very similar to other companies. We have lots of good old-fashioned databases; all of our core systems are running on product systems that have been around for 20, 30, even 40 years. The systems are robust, scalable and they do a great job but they don’t have the database structures that actually allow us to do the data analytics, the machine learning, that we think we need to do.”

The difficulty and expense of extracting data into traditional data warehousing platforms led HSBC to take the plunge, about three years ago, and embed the bank into the evolving Hadoop* eco-system. The initial use cases have been those that have very large datasets and require very intense computing capability in short bursts.

West said that for anti-money laundering, finance liquidity reporting, risk analytics and risk services, it made more sense to use the capabilities of the cloud rather than building new data centres.

CGI’s Norton believes regulatory reporting is in fact forcing increased use of the cloud, simply because of the mounting data and calculations needed. Support systems – Salesforce, CRM, HR and personnel – as well as reporting and risk calculations, such as Monte Carlo and VaR, are all now being moved to the cloud. In fact, cloud technology is fast becoming the only option available to banks to get the storage and computing power they need.

“The regulator is wanting more and more of these calculations to be done in real-time, and this cannot possibly be done on premises,” says Norton. “The cost of running these tens of thousands of burners, all day, in your own premises, would be absurd. So, most banks are looking at doing risk calculations in a large cloud configuration, for when they need that computing capacity. All banks are looking at cloud, where the business case is really obvious: anywhere you have transactional systems and these huge peaks.”

However, Lovas says, it is only when banks take portal access to historical data and analytical toolsets that someone else maintains and manages that the amounts being spent on maintaining legacy systems today can really be attacked. “I believe these numbers will trend down over the next year as the big investment banks and pension funds become more comfortable with outsourcing their technology, their infrastructure, their data management, or their analytics.”

*Hadoop is an open source, Java-based programming framework that supports the processing and storage of extremely large data sets in a distributed computing environment.

©BestExecution 2018


Trading : Artificial intelligence : Maha Khan Phillips

ARTIFICIAL INTELLIGENCE AND TRADING.

Trade execution is changing, and artificial intelligence is playing a big role in shaping its future, writes Maha Khan Phillips.

One of the first examples of structured trading in Europe occurred in France nearly a millennium ago, when ‘Courretiers de change’ managed agricultural debts on behalf of banks. In the not too distant future, if some industry participants are to be believed, today’s trade execution technology will seem as outdated as those methods of trading do now.

Already, the industry is on the cusp of enormous change. Automated trading algorithms first came into use in the US equity markets over 20 years ago. According to Greenwich Associates, 99% of buyside and sellside trading desks with higher institutional commission volume now use an electronic trading product for equities. With the explosion of data and new types of information reshaping their world, management consultancy Opimas expects spending on artificial intelligence (AI) related technologies to exceed $1.5bn in 2017, and to reach $2.8bn by 2021.

“Over the past two to five years in particular, various financial institutions have invested resources in establishing data lakes that consist of structured data from disparate sources,” says Usman Khan, chief technology officer at Algomi. “As a result, they have leveraged this structured data in a myriad of areas, including determining profitable efficient trading strategies, how to position best for client business, portfolio optimisation, and early detection of compliance breaches. Each of these areas requires large scale automated analysis to extract value-added insights that lead to an increase in return on investment.”

Algos come of (machine) age

Greenwich Associates points out that trading algorithms have come a long way from simple ‘time-slicing’ algos that would send orders to the exchanges at pre-set intervals. Now, algorithms use volume prediction analytics, market impact models, liquidity heat maps, and venue analytics. However, all these advancements will need to be combined with machine learning (ML) to really drive the industry forward.
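As a point of reference, a bare-bones time-slicer of the kind Greenwich describes can be sketched in a few lines; the function and callback names below are illustrative, not any vendor’s API.

```python
import time

def twap_slice(parent_qty: int, duration_s: float, n_slices: int, send_order) -> None:
    """Split a parent order into equal child orders sent at pre-set intervals."""
    child_qty = parent_qty // n_slices
    interval = duration_s / n_slices
    for i in range(n_slices):
        # Last slice picks up any rounding remainder.
        qty = child_qty if i < n_slices - 1 else parent_qty - child_qty * (n_slices - 1)
        send_order(qty)                  # placeholder for the firm's order gateway
        if i < n_slices - 1:
            time.sleep(interval)         # wait for the next pre-set interval

# Example: work 10,000 shares in 5 equal slices (short duration for demonstration).
twap_slice(10_000, duration_s=5, n_slices=5,
           send_order=lambda qty: print(f"child order: {qty} shares"))
```

Modern algos layer volume prediction, market impact models and venue analytics on top of exactly this kind of schedule.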

According to Khan, machine learning can be used to automate the intra-day calibration of trading algorithm parameters that would traditionally be set manually by humans. ML can also be used to automate the analysis of a client’s historical transactional record, generating insights on what to sell next. It can be used to train an algorithm on how best to optimise a portfolio, on top of standard mean variance portfolio optimisation techniques, and to enhance natural language processing of compliance feeds to automate breach detection.
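A minimal sketch of what such intra-day calibration might look like is shown below, assuming a single participation-rate parameter and a simple “move towards the best recent setting” update rule; the rule and figures are assumptions for illustration, not a description of any production system.

```python
def recalibrate(current_rate: float,
                observations: list[tuple[float, float]],
                learning_rate: float = 0.3) -> float:
    """Nudge the participation rate towards the setting with the lowest recent slippage.

    observations: (participation rate used, realised slippage in bps) for recent fills.
    """
    best_rate, _ = min(observations, key=lambda obs: obs[1])
    return current_rate + learning_rate * (best_rate - current_rate)

# This morning's (assumed) results for comparable orders: rate vs slippage in bps.
intraday_obs = [(0.05, 4.2), (0.10, 2.8), (0.15, 3.9)]
new_rate = recalibrate(current_rate=0.12, observations=intraday_obs)
print(f"participation rate recalibrated to {new_rate:.1%}")   # ~11.4%
```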

Fintech firms are offering different trading solutions for their clients. Portware, the multi-asset execution management system bought by FactSet in 2015, has traded a notional $168bn on its Alpha Pro platform, an AI enabled trading management solution. Alpha Pro uses predictive analytics to evaluate historical and real-time order flow, and helps traders implement the execution schedules that best capture price opportunities and short-term alpha.

The firm claims every order that arrives on a trader’s blotter can be analysed, vetted, and paired with a recommended trading strategy and trade horizon, as well as a recommended algo type selection. “Alpha profiling provides a quantitative basis to do something that is the core function of a trading desk, to identify the optimal strategy for each order,” says Henri Waelbroeck, director of research for portfolio management and trading solutions at FactSet.

One of the features on the platform is an alpha switching engine, which uses ML to decide which of the algorithms that traders have access to is likely to have the best performance for a particular order, under specific market conditions.
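The general shape of such a switching engine can be sketched as a supervised classifier trained on historical orders: features describing the order and prevailing market conditions, with the label being the algo that performed best. The features, training data and use of scikit-learn below are assumptions for illustration, not FactSet’s implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Assumed training set: one row per historical parent order.
# Features: order size as % of ADV, quoted spread (bps), realised volatility, momentum.
X_hist = np.array([
    [0.5, 3.0, 0.12, -0.2],
    [4.0, 8.0, 0.35,  0.4],
    [1.2, 5.0, 0.20,  0.1],
    [6.5, 2.0, 0.10, -0.6],
])
# Label: the algo that delivered the lowest implementation shortfall for that order.
y_hist = np.array(["VWAP", "IS", "VWAP", "DARK"])

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_hist, y_hist)

# When a new order hits the blotter, score it against current market conditions.
new_order = np.array([[2.0, 6.5, 0.28, 0.3]])
print("suggested algo:", model.predict(new_order)[0])
```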

Jeremy Armitage, head of eFX at State Street, says algorithms can be used to see how the firm’s trading patterns are being interpreted by the market and used against them. “Is there a risk that what we are doing is being picked up by other people in the market electronically, and then they are trading against us in some fashion? We have a suite of algos to look at how what we are doing is impacting the market, and we use that to decide to trade more slowly, or to randomise our trades a bit, or trade at different market venues. Ideally, you want to trade with zero footprint, which every institutional investor knows is impossible.”

Adding alpha

One question is, are these new technologies adding alpha? Opimas says it is impossible to pin down how much alpha comes from the intelligent use of data sets, but provided an example of a hedge fund it worked with, which spent more than $50m annually on sourcing, analysing, and implementing alternative data strategies. “This implies that the fund managers believe they must generate at least 2% in alpha annually simply to cover the associated costs,” the firm points out.

There are, though, other ways algos can provide value. The buyside should use them to assess how their providers have performed as counterparties, suggests Gary Brackenridge, global head of asset management at Linedata. They should also use them to reduce costs and inefficiencies after a trade.

“What you are really trying to achieve is to generate alpha. This mythical alpha or idiosyncratic risk or exposure no one has ever found. We are all using the same tools and techniques to try and capture this unicorn. Big data, machine learning, and artificial intelligence give us more tools, but they don’t give us more unicorns to find. I liken this trend to the early days of low latency trading. There was an incredible arms race,” he argues.

Charles Ellis, a trader and quant strategist at Mediolanum Asset Management, believes that the industry needs a balance between quantitative data, fundamental processes and human understanding. “We actively look at how machine learning can be embedded into our investment process,” he says.


There is also a risk component. Bhavna Khurana, vice president of client solutions financial services at The Smart Cube, believes that traders should spend more time using AI to look at risk. “Large asset managers and banks can get a lot of help from AI and ML in terms of their risk function. The way it’s conducted now, it’s regulatory-driven, and there’s not a lot of analytical thought going into whether our models or functions are truly robust,” she says.

The future

However AI is used, industry pundits agree that the future will look very different. Predictive algorithms mean correlations between markets can appear on timescales shorter than light could travel between them, points out Michael Cooper, chief technology officer at Radianz. He cites a paper, Information Transmission Between Financial Markets in Chicago and New York, which suggests that predictive algorithms could statistically anticipate Chicago futures price changes before the information can even reach the equity market engines.

Though faster-than-light communication has been shown by Einstein to be physically impossible, say the authors of the report, they “observe a signal consistent with the emergence of such predictive algorithms, by documenting correlations that occur over timescales shorter than the theoretical limit of 3.93ms, the time it takes for light, which travels at a velocity of 299,475 kilometres per second in air, to travel the distance of 1,187 kilometres between the Chicago Mercantile Exchange’s matching engine in Aurora, Illinois, and Nasdaq Exchange’s matching engine in Carteret, New Jersey.”
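The quoted figures can be checked with a line of arithmetic: the minimum one-way latency is simply distance divided by the speed of light in air.

```python
distance_km = 1_187          # CME matching engine (Aurora, IL) to Nasdaq (Carteret, NJ)
c_air_km_per_s = 299_475     # speed of light in air, as quoted above

latency_ms = distance_km / c_air_km_per_s * 1_000
print(f"theoretical minimum one-way latency: {latency_ms:.2f} ms")
# Prints ~3.96 ms; the cited 3.93 ms presumably reflects rounding or a slightly
# different great-circle path, but the order of magnitude is the point.
```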

It may sound like science fiction, but Cooper argues that the industry will have to learn to acquire, curate, and maintain its data even as evolution continues to bring new opportunities and challenges.

“You are into a world of almost constant change,” he says.

©BestExecution 2018


FX trading focus : TCA : Stuart Farr

IMPROVING ORDER EXECUTION IN FX

Stuart Farr, President, Deltix.

There has been a marked uptick in the use of TCA (Transaction Cost Analysis) services within the FX market over the last few years, doubtless significantly inspired by the FX Global Code and MiFID II. Even so, the use of TCA in FX has been suggested in some quarters to be a box-ticking exercise (although this charge is levelled at TCA in other asset classes too). This is unfortunate, and maybe it is time for a name change: the very name ‘TCA’ suggests an after-the-fact analysis which may never see the light of day again.

Implicit here is the need for ‘pre-trade TCA’: helping traders intelligently decide the most appropriate execution algo for the required objective (how to execute) and on which venue (where to execute). Note that we use ‘execution algo’ here in the broadest sense, covering the method of order execution in general, whether using market, limit and other order types or true execution algos such as TWAP. We use ‘venue’ to include ECNs, banks and non-bank liquidity. Whatever the requirements are for regulatory, or indeed client, reporting, surely the most important aspect of TCA (or ‘execution analysis’) is the use of post-trade analytics to inform and improve current and future (pre-trade) order execution.

Optimising execution quality means performing ongoing execution analysis in real-time to enable traders to make informed decisions for subsequent orders. The key point here is that market characteristics (e.g. volume, spreads and volatility) continually change. Therefore, measurement of execution quality should be continuous and part of the workflow of order execution in order that decisions on algo selection and venue are based on current market conditions and the extent to which they deviate from historical market conditions for similar orders.
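One simple way to make that measurement continuous is to compare each incoming order’s market conditions with the historical distribution for similar orders and flag material deviations. The sketch below is illustrative only; the statistics, thresholds and data are assumptions rather than a description of Deltix’s software.

```python
from statistics import mean, stdev

def deviation_sigma(current: float, history: list[float]) -> float:
    """How many standard deviations the current reading sits from its historical mean."""
    return (current - mean(history)) / stdev(history)

# Assumed history for 'similar' orders (same pair, size bucket, time of day).
hist_spread_bps = [2.1, 2.4, 1.9, 2.2, 2.6, 2.0, 2.3]
hist_volatility = [0.11, 0.13, 0.10, 0.12, 0.14, 0.11, 0.12]

current = {"spread (bps)": (3.4, hist_spread_bps),
           "volatility":   (0.19, hist_volatility)}

for name, (value, history) in current.items():
    z = deviation_sigma(value, history)
    flag = "review algo/venue choice" if abs(z) > 2 else "within normal range"
    print(f"{name}: {z:+.1f} sigma vs history - {flag}")
```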

The essential component required for such on-going analysis and decision-making is recording faithfully, in real-time, the full depth of the order book for each liquidity venue to which the firm is connected. With multiple liquidity providers (LPs), this is the proverbial “drinking from the firehose” which requires the ability for the trading system to ingest quote updates and trades at rates measured in hundreds of thousands per second. A key aspect here is that this quote recording system is plugged into the trading firm’s production trading infrastructure. As such, the latencies and infrastructure implicit in the trading firm’s particular set-up are baked into the historical time-series thus recorded, and it is more straightforward to intertwine orders, executions and the state of the order book at the time of each order and execution.
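Conceptually, the recording step is no more than timestamping every depth update on the firm’s own infrastructure and appending it to a time series, as in the simplified sketch below (a CSV file stands in for a high-throughput tick store, and all names are illustrative).

```python
import csv
import time

def record_depth(writer, venue: str, book: dict) -> None:
    """Append one full-depth snapshot, timestamped on the firm's own infrastructure."""
    ts = time.time_ns()   # local receive time, so the firm's own latencies are baked in
    for side in ("bids", "asks"):
        for level, (price, size) in enumerate(book[side]):
            writer.writerow([ts, venue, side, level, price, size])

# In practice this would be driven by every quote update from every connected LP.
with open("fx_depth.csv", "a", newline="") as f:
    writer = csv.writer(f)
    snapshot = {"bids": [(1.23450, 5_000_000), (1.23449, 8_000_000)],
                "asks": [(1.23452, 3_000_000), (1.23453, 6_000_000)]}
    record_depth(writer, "LP_A", snapshot)
```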

Once this recording system is part of the trading infrastructure, a time-series of trading-firm specific quotes and trades is automatically created and maintained. Again, by being part of the production trading infrastructure, this time-series receives the necessary care and attention befitting such a valuable resource. This systematically maintained time-series database (or better, ‘knowledge base’) is the cornerstone of a systematic approach to improving FX execution quality through appropriate algo and venue selection.

Of course, the lack of a central tape or official close prices, as exist in the equities and futures markets, raises the question of which benchmark is relevant for measuring FX execution quality. But when a firm is using TCA as part of a solution to improve execution quality (rather than box-ticking), the benchmark can and should be specific to the objectives of that firm’s FX trading (e.g. hedging versus intra-day trading).

Because of the fragmented nature of liquidity in the FX market, improving FX order execution is not just about how to execute (i.e. which execution algo to use). FX execution analysis has to operate across venues, so where to execute is also critical. This is preferably undertaken in real-time by a smart order routing (SOR) algo. SOR algos determine venue selection typically by looking at the best bid or offer simultaneously provided by each of the connected LPs. Depending on the time of day, currency pair, order size, required aggressiveness, etc, the SOR algo may send child orders to multiple LPs and may use multiple levels of liquidity. The intelligent SOR algo accounts for the historical fill ratio and rejections. Even if a venue offers the current best price, it may be more risky to route the flow to this venue if historically this location has a high rejection rate. Thus, it is essential to SOR operation to have access to the time-series of market data, orders and actual executions in order to calculate fill and rejection profiles for each liquidity venue. The calibration of the SOR algo should be continually evaluated. This is done by back-testing candidate SOR algos (including different parameterisation of the ‘same’ algo) against the firm’s own knowledge base of market data, orders and executions.
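A stripped-down version of that venue-scoring logic might look like the following, where the best quoted price is penalised by each venue’s historical rejection rate; the weights, prices and fill ratios are assumptions for illustration.

```python
# Candidate venues: quoted offer price and historical fill ratio for similar orders.
venues = {
    "LP_A":  (1.23471, 0.97),
    "LP_B":  (1.23469, 0.74),   # best price, but historically rejects 26% of orders
    "ECN_1": (1.23472, 0.99),
}

REJECT_COST_PIPS = 1.0          # assumed slippage incurred when a reject forces re-routing

def expected_cost(price: float, fill_ratio: float) -> float:
    """Effective buy price once the chance of rejection is priced in (EUR/USD pips)."""
    return price + (1 - fill_ratio) * REJECT_COST_PIPS * 0.0001

best = min(venues, key=lambda v: expected_cost(*venues[v]))
for v, (price, fill) in venues.items():
    print(f"{v}: quoted {price:.5f}, expected {expected_cost(price, fill):.6f}")
print("route to:", best)
```

With these assumed numbers, the nominally best-priced venue loses out once its rejection history is priced in – exactly the trade-off described above.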

In summary, we need to combine traditional FX TCA with liquidity analysis across multiple venues and provide pre-trade analytics on current and historical data to provide traders with relevant and current analytics to optimise execution quality for the next order.


Stuart Farr is president of Deltix, a leading provider of software and services for quantitative research, algorithmic and automated systematic trading. Previously he was CEO and co-founder of Beauchamp Financial Technology, a global provider of front to back hedge fund software. Prior to that, he was Head of Credit Risk Technology at Credit Suisse First Boston and an engineer at NatWest Markets and Synopsis.

©BestExecution 2018

 

FX trading focus : Technology : Mathieu Ghanem

TO COIN A PHRASE – IT IS TIME TO CHANGE

Mathieu Ghanem, Global Head of Sales at ADS Securities.

Across the OTC industry and in particular the world of FX trading almost every firm is focused on the changes to regulations, whether this is at the macro-level, with MiFID II coming into play, or the micro level where individual products are subject to change. But, although extremely important, I believe that over the next five years it will be technology, not regulation, which has the greatest impact.

Technology in trading is always changing as systems are improved and updated, but at the moment the financial services industry is on the cusp of potentially one of the greatest changes in its history. This will not happen overnight, but there will be gradual implementation of systems in two key areas. These are the introduction of blockchain technology and the use of artificial intelligence (AI) solutions to manage a much greater proportion of trading.

The market has been slow to adopt blockchain, and at the moment use of the technology is mostly based around cryptocurrencies, but the systems open up many options. In recent years, regulators have made it clear that they want FX trading to be exchange based – mostly for transparency purposes. For a number of reasons, and using the current format, this would be very difficult to achieve, but blockchain, if properly implemented, could potentially resolve this. A secured cryptographic ledger system would transform the way the FX market operates. Clearing transactions through blockchain would move the market to a new model, revolutionising settlement by making it faster and decreasing risk, while removing the need for a central entity and making post-trade confirmation much cheaper.

This would be a dramatic change to the way that financial services are delivered. If it is a model that can be made to work, banks and/or financial institutions could play a pivotal role as custodians and arbitrators, helping to apply the necessary regulations and framework that, at the moment, central banks, on behalf of governments, provide.

The use of smart contracts, as part of the blockchain technology, also has many advantages, including the ability to keep the identity of the parties involved secured, and at the same time provide full visibility of the contracts being executed. Smart contracts also remove the need for a number of ‘middlemen’, reducing costs and in many areas additional complications. This would allow trading solutions to be moved from a centralised to a decentralised model, significantly lowering the price per trade. An immediate benefit of this would be the democratisation of market participation which would help increase liquidity.

Algorithms are already an integral part of trading, but increasingly it is their deep learning capability, now starting to be linked to marketing and customer relationship management (CRM), that is creating more opportunities. Traditional AI systems analyse markets and can trade on behalf of clients (or institutions) across multiple asset classes, optimising returns on investment or helping to manage risk. But the next generation of algos will also be able to take into consideration the big data stored in the marketing systems used to acquire and engage with clients.

Sophisticated digital marketing understands what motivates clients, and in the future may be able to actually dictate what motivates them. Geographical, social, economic and historical footprints are used to create a picture of the individual. When this is combined with current market behaviour, provided by the AI trading systems, this deep learning can be used to present the right client with the right opportunity at the right time. The important point is that the approach to the client rests on two aligned factors: firstly, it is a trade the investor is interested in; and secondly, it is based on a genuine market opportunity. Traditional systems have always strived to achieve this outcome, but cannot match the speed, breadth and efficiency of AI systems. At some point, as in other industries where AI is utilised, regulators will need to step in so that no single actor or dominant force (or an uncontrollable machine!) ends up taking over. If the regulation is in place, a network of transparency, automation and optimisation could be spread out across the world.

If automation is looked at alongside the development of blockchain-based trading systems, the potential impact is huge. Crypto-technology allows for the digitalisation of assets. Everything can be coin based, along with the services which support the trading, from insurance through to arbitration. The blockchain can become a complete, transparent and cost-effective environment. In this world, effectively owning the token without needing to hold the underlying asset would dramatically increase the speed of operation of world exchanges and could potentially contribute to new and massive economic growth.

ADS Securities was the first brokerage in the GCC (Gulf Cooperation Council) to start offering CFD Bitcoin and CFD Ethereum on its platform and is in the process of introducing other cryptocurrencies focusing on those most highly traded. But, for the firm this is just the starting point in engaging with the technology. It is currently building its third generation trading platform using blockchain technology for the backend systems. Settlement, clearing and reconciliation systems will all use blockchain, however the front end of the platform will be a development of the type of GUI which is already being used. It will provide investors with the interactivity and trading tools that they are used to when trading using the ADS Securities OREX platforms, but they will have the benefit of all trades being logged through a secure cryptographic ledger system.

The expansion of cryptocurrencies has created a very dynamic investment environment, but investors do need to be very careful when looking at buying into Bitcoin and other non-fiat currencies. Many traders are focused on the rapid increase in values, but do not look at the volatility and consider whether this is an asset which would work well in their investment portfolio.

Opening and maintaining a cryptocurrency wallet has its own issues and as the exchanges are still unregulated, investors need to be aware that there will be a co-mingling of funds. There is also a very real danger from hackers and concerns over the way that Initial Coin Offerings (ICOs) have been managed. 

Another problem for traders is that, unless you are accessing the cryptocurrency as a derivative product, you can only effectively go long on the investment. Instead of setting up a cryptocurrency wallet and buying direct from a coin exchange many investors are therefore choosing to access the asset through CFDs, which gives much greater security and flexibility especially when there is a lot of volatility in the market.

As the UK and Europe continue to negotiate the terms of Brexit, and questions of where the euro will be cleared are regularly raised, future conversations may be about which of the major central banks will be the first to regulate cryptocurrencies – Japan, for example, has already taken a step in that direction. It is likely that the country which has first-mover advantage will put itself in a very strong position, but it will also be taking on a major challenge and will need the resources and expertise to manage this very fast moving asset class.

The financial services industry has always been very quick to adopt change, especially where it provides value and opportunity. Cryptocurrencies are already providing value and many people can see the opportunities that will arise from the technology, so we can assume that through 2018 we are going to hear a lot more about blockchain and AI.


360T : The Future of FX : David Holcombe

THE FUTURE OF FX: EXCHANGE TRADED AND OTC LIQUIDITY?

Best Execution talks to David Holcombe, Product Lead for FX Futures and Clearing, 360T

Is the FX market really heading towards being an exchange traded and centrally cleared market?

This isn’t an all or nothing point in either direction. One size does not fit all in the FX market. The Deutsche Boerse Group FX strategy is actually a good view of the end state of the FX landscape – where informed clients will establish whether to clear any given trade, then use the right tools to achieve that.

When will that be? Cleared and exchange traded FX is still a small fragment of the overall FX market, so surely that “end state” view is still many years away?

My role is to ensure 360T exploits tight integration between 360T, the Eurex Exchange, Eurex Clearing and other group entities to create a truly front-to-back FX offering for our clients that covers exchange and OTC FX liquidity. We haven’t made a public song and dance about this, but this integration really is very well advanced, and you will see FX futures traded via 360T in the first quarter of 2018.

Beyond tools to let our clients choose the right FX product for the trade they need to do, using the right execution model, and the right credit and clearing model to exploit all the benefits available, the challenge the industry faces right now is to understand what clearing means, and what it can do for them.

OK, so if clearing is the start point for all of this, how do I know whether to clear something?

It’s actually a complex consideration. We’ve had a specialist consultancy in to prepare analysis to quantify the benefits of central clearing for FX, as in the absence of clearing mandates, the decision process to clear needs to consider multiple attributes for any given scenario. With so many moving parts as variables in the model, we are now going through these results with clients individually, offering to put their sample portfolios through our modelling tool to see where they will gain.

Ultimately though, one has to understand what the real drivers for each trade are, and also to consider the full impact that the trade will have on the portfolio in each form it could take, in order to then make an informed decision of how and where to get the best trade done with the best outcome.

So, once you need to clear, how do you choose between OTC executed flow or exchange products?

While the use of exchange listed products amplifies the benefits of clearing, the answer is still pretty much down to product access and liquidity.

It’s understandable that OTC execution is a place many start when considering clearing, because exchange-traded FX has never really been centre stage in the FX market. The majority of our clients, being real FX participants, state that a market built on the foundations of “how much and who’s asking”, with a myriad of ways in which LPs and clients can meet to bilaterally negotiate and trade OTC FX, has meant the US-focused exchange offerings, with their limited value dates and product flexibility, have always been too far from a good fit for their needs.

Also, it is fair to say that trading on an exchange platform doesn’t suit everyone, and clients with strong relationships that have historically served them very well, particularly in bilateral disclosed models, are understandably inclined to favour those routes to interact with the market. This is the execution model you will see in our 360T Block and EFP network: access to FX futures, while trading using familiar OTC models and tools.

When OTC products are the right route though, the availability of a clearing service for the product you need is the first obvious consideration. While interdealer NDF clearing is pretty much routine now, no CCP has yet been able to satisfy the regulators that they are effectively managing the settlement risk they concentrate between members for deliverable OTC FX products, in order to address the bulk of the market’s ADV – deliverable Spot, Forward and FX Swaps. Deliverable OTC FX clearing will become a reality in 2018 though, as we are one of two major CCPs currently finalising a deliverable FX clearing service, with the Deutsche Boerse Group’s Eurex Clearing service being the only one focused on letting you clear FX Spot, Forward, FX Swap, alongside cross currency swaps.

Once you have determined the position is going to be cleared, then your focus should be whether listed or OTC products give you the best route to get that position into the clearing house, considering all liquidity available: in the exchange orderbook and off-book – exactly the model 360T has with support for OTC alongside exchange listed FX products.

Well that’s clear – the customer gets the choice of using an OTC or exchange FX product, and accessing those exchange products on or off the exchange orderbook, but surely the problem with listed FX remains – the products are not a particularly good fit for OTC users?

The uptake of FX futures will be helped by next generation products that evolve FX futures from the US contracts with a small number of infrequent value dates, to something closer resembling the flexibility of OTC products.

We do have classic shaped FX futures contracts, albeit with OTC characteristics like having the currency pair quoted the right way around for OTC users, but a perfect example of next generation FX is the Eurex Rolling Spot Future. This is the simplest way to get FX spot exposure into the CCP, using an exchange listed product designed with a focus on removing incumbent costs in OTC rolls, with multiple liquidity providers considering the exchange orderbook and also how they can use the 360T block and EFP functionality to increase their distribution.

With all of these points aligning, the future of FX is here. Giving the customer true choice of product, execution type, and credit/clearing model so they can exploit the benefits that clearing can bring is certainly a challenge, but all of the foundations are already there for this client choice to become a reality in 2018 within 360T and the Deutsche Boerse Group.

www.360t.com


Market opinion : MiFID II : Jannah Patchay


THE DAY AFTER TOMORROW…

Jannah Patchay at Markets Evolution looks ahead at what the implementation of MiFID II will mean and asks whether the industry should gear up for MiFID III.

In blockbuster terms, this is the big one. As of January 3, MiFID II has landed. It hasn’t been entirely clear what would happen. Some harbingers of doom were forecasting a veritable extinction-level event for brokers and independent research. Others predicted a cataclysmic rift in market liquidity as the derivative trading obligation comes into force (an outcome now at least somewhat less likely, given the recent agreement of an equivalence determination between European and US trading venues).

Images have been conjured of markets frozen by uncertainty and gripped by a paralytic terror of new transparency rules. By the time you read this, the third of January will have come and gone, and whatever happened will have happened, and either modern civilisation will stutter on, or we shall all be wandering a post-apocalyptic market structure.

And so, time to look to the future… What next for European regulators, following their magnum opus of financial markets regulation? Rumours abound that work has already begun on MiFID III (ESMA Strikes Back?) – but what does this mean? The most likely scenario is that, rather than another monolithic piece of legislation, the concept of MiFID III involves a combination of additional regulatory guidance, Q&As and thematic reviews focusing on key areas.

Questions of scope would be near the top of the regulators’ agenda: primarily instrument, activity and jurisdictional scope. The introduction of the “traded on a trading venue” (TOTV) concept was meant to help identify those instruments that should fall within certain MiFID II obligations such as pre- and post-trade transparency and transaction reporting. Similarly, the introduction of standard instrument identifiers, ISINs, for derivatives was meant to enable the identification of (in theory) comparable and potentially fungible derivatives contracts, and to make it easier to build lists of TOTV instruments.

Unfortunately, although not unforeseeably (at least to anyone with some practical experience of derivatives markets), their implementation has been fraught with issues. The regulators’ ideas of which attributes constitute the salient features of a derivatives contract are far from aligned with those of industry practitioners. Furthermore, it has proven operationally and technologically challenging to gather data for, and produce and make available up-to-date lists of these instruments, in a timely manner, to enable market participants to meet the time-critical reporting obligations associated with them. It is likely that this will be a key area for resolution, being so fundamental to the whole enterprise.

It has been said that, whilst in the US, the CFTC’s implementation of the Dodd-Frank Act attempted to impose an exchange-traded futures market structure on OTC derivatives, MiFID II represents an attempt to impose an equities market structure on them. This becomes more and more apparent in the edge cases – the activities and services undertaken in OTC derivatives markets that are specific to those markets and have no parallels in the world of exchange trading. Prime brokerage and the practice of credit intermediation were also sorely overlooked by the CFTC as they rolled out Dodd-Frank, and this has taken years to remediate. Those hoping that lessons would have been learned this time around have been disappointed.

The early adoption, by many banks, of the Systematic Internaliser (SI) regime for non-equity instruments has been unexpected: a response to demands from their buyside clients, for whom trading with an SI removes the requirement to undertake post-trade reporting in near real-time.

For derivatives, the SI regime is sketchily defined and further guidance has been scant. The regime is intended to bring OTC trading with market makers in line with that on trading venues, by imposing quote and trade publication requirements on those firms deemed to trade on an “organised, frequent and systematic basis”, as measured against total European trading activity. A lack of sufficient trade data from EU trading venues and market participants had led to a temporary reprieve from the SI regime being granted, until September 2018.

Given the high profile of the SI regime, and the lengths to which the regulators have already gone, in their guidance, to close potential loopholes, the early adopters will no doubt be closely observed, and findings may quickly be incorporated into a review of the rules prior to September.

A great many of the new rules are focused on investor protection, and these have been defined in the directive, which is locally implemented by each EU member state (and hence subject to some variation). Any early indications of massively inconsistent client experiences, or a perception that some member states offer a more laissez-faire approach to dealing with clients, will be quickly seized upon.

It is worth bearing in mind that, after all, the intention of MiFID I was to harmonise European markets and client experiences, and MiFID II retains those objectives alongside the newer G20 commitments. Areas to watch include disclosures of costs and charges, and other information provided to clients with respect to executed trades and execution quality.

Interpretation and application of the rules has also been made murky by the sheer complexity of global financial firms. Not only banks, but also many of their buyside clients, operate via several legal entities in multiple jurisdictions, and it is not trivial to analyse their trading relationships and roles against the regulatory text or its intent. This has been made more difficult by the lack of guidance on the extraterritorial and cross-border reach and applicability of the rules.

The regulators’ silence in this respect may perhaps largely be attributed to the current political climate and sensitivities around Brexit, and the potential outcomes of negotiations between the UK and the EU around access to the single market. Will there be more inclination towards extending the EU’s jurisdictional reach, and hence widening the jurisdictional scope? Will rules on the provision of cross-border services be more or less stringently adopted? This will no doubt be a key area of focus in the coming months, taking direction and shape as the negotiations evolve.

©BestExecution 2018


Viewpoint : MiFID II : Tim Healy

MIFID II DATA DELUGE: WHY WE NEED TO COLLABORATE.

Tim Healy, FIX Trading Community

It is now over 2½ years since the first meetings were held by a number of different FIX working groups to review MiFID II topics, with the aim of providing guidelines and best practices for market participants. What became clear after those first calls was that many participants were also involved in other working groups run by various other trade associations. The risk was that there would be a duplication of effort and that firms would have different individuals working at cross purposes.

To help address this issue, FIX Trading Community and the Association for Financial Markets in Europe (AFME) partnered to form a joint industry working group looking at Order Data and Record Keeping (ORK). It was not an easy task, considering the sheer depth and volume of data required by the regulators, so joining forces made sense.

Initially, the working group needed to scope out exactly what it should focus on. Those items were decided as:

  • What events trigger Record Keeping?
  • A list of fields required by trading venues;
  • Specifics on these fields;
  • High Frequency Trading (HFT) superset fields;
  • FIX mapping with the required fields.

Members of the working group quickly realised that whilst the detail of the regulation had good intent, its magnitude and scope would mean extensive effort in detailing requirements. Additionally, the regulation does not necessarily have the robustness of a tried and tested trading protocol. Being able to map record keeping requirements to the FIX Protocol has allowed for a common interpretation that is practical to implement. This was done with very little need to add to the protocol, and the creation of a set of guidelines on how to use existing features has allowed for a flexible approach that caters for differing use-cases while driving consistency between them.

One of the key areas that the working group focused on initially was to come up with a standardised format that would allow the venues to receive the newly required regulatory information from their members in short code. This could include:

  • Execution algo;
  • Execution individual;
  • Investment decision algo;
  • Investment decision individual;
  • Client legal entity;
  • Client individual.

As a guiding principle, it has been accepted that it would create unnecessary cost and complexity for members if venues were to diverge in their approach to short codes, particularly when venues trade the same instruments. If individual venues required different values on orders, a translation would have to occur at the router level, after the venue had been selected, potentially introducing additional latency, and separate end-of-day mapping files would have to be produced for each venue. As with all the MiFID working groups, achieving consensus and providing clear standards was the main objective and, after polling the venues, a short code standard with FIX mappings was produced.
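By way of illustration only, an end-of-day short code mapping file might be generated along the following lines; the field names, code types and identifiers are hypothetical and do not reflect any venue’s actual specification or the working group’s published format.

```python
import csv
from datetime import date

# Hypothetical intra-day registry of short codes used on orders sent to a venue.
short_code_registry = [
    # (short code, code type, long code)
    (1001, "CLIENT_LEI",               "5299001ILLUSTRATIVE0"),   # made-up LEI-style value
    (2001, "EXECUTING_TRADER",         "GB19800101JOHN#SMITH"),   # made-up national ID
    (3001, "INVESTMENT_DECISION_ALGO", "ALGO-MOMENTUM-V3"),
]

filename = f"shortcode_mapping_{date.today():%Y%m%d}.csv"
with open(filename, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["trading_date", "short_code", "code_type", "long_code"])
    for code, code_type, long_code in short_code_registry:
        writer.writerow([date.today().isoformat(), code, code_type, long_code])

print(f"wrote {filename}")
```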

The main body of work for the broader ORK working group was to produce a definitive document that would provide guidance on various use cases and the reporting requirements that are needed for each counterparty involved, whether they be a buyside, sellside or venue/exchange. Of course, with any electronic order routing, there is a two-way flow of information between submitting firm and receiving firm, and firm to venue. ESMA has clearly stipulated that all parties bear the responsibility to ensure that the required data and information is recorded and that such information should be made available to the competent authorities using uniform standards and formats.

The document covers FIX order and execution events as well as quote events that happen via a trading venue. For each of these event types, it looks at the types of entities that will be receiving and sending these events. It considers the specific record keeping requirements of each of these entities and whether the information can be generated internally or whether it is required from another entity. If this information is required from another entity, the document provides the FIX tags that map to ESMA’s requirements. Again, this highlights that FIX is much more than just an electronic trading language. As one participant at a recent FIX event noted, “What we’re working with in FIX is a regulator’s dream.”

By being able to map ESMA’s requirements to the FIX Protocol and putting these guidelines in place, the members of the FIX Trading Community and AFME have shown that there is strength in working closely together to address regulation. There has been much talk over the years about the unintended consequences of MiFID II. From here, it would appear that one of those consequences will be greater collaboration to ensure all market participants meet their respective requirements. This is the way that FIX Trading Community has worked over a number of years, and it will continue to work to promote the use of cost-effective standards that are available to all.

©BestExecution 2018


 

Viewpoint : MiFID II : Peter Moss

WE’VE ONLY JUST BEGUN.

The January 3 MiFID II deadline marked a milestone for financial market participants across Europe. But there is still much work to be done, as Peter Moss, CEO at The SmartStream Reference Data Utility, tells Best Execution.

MiFID II went live on January 3; how would you describe the industry’s efforts to reach that milestone?

Firms made a huge effort to be ready for the MiFID II deadline. SmartStream worked mostly with the large brokers, either approved publication arrangement (APA) firms or systematic internalisers (SIs). These firms were very focused on the starting line, trying to get everything done during the last few months so that they arrived in the best possible shape.

What did that work involve?

Much of the activity was around changing trade automation processes, particularly to make sure data is available for the trading mechanisms to use. This enables organisations to quote prices in advance of a trade and to deliver post-trade reports immediately after a trade. A lot of work was involved in this.

What challenges did firms face?

One of the main difficulties was the delay in the launch of the second phase of the European Securities and Markets Authority’s [ESMA’s] Financial Instruments Reference Data System (FIRDS). Market participants anticipated the data would be ready in July or August, but it arrived only in mid-October. The launch provided access to the reference data that will eventually enable market participants to identify instruments subject to the Market Abuse Regulation and to MiFID II/MiFIR reference data reporting requirements.

Because the data arrived only in October, market participants had less time than hoped to prepare their reporting systems ahead of the MiFID II go-live. Moreover, the clarification of Regulatory Technical Standards 1 and 2 came only in early December, again leaving just a short time in which to work.

These delays caused some concern across Europe, with firms struggling to test their software in the absence of clarity about data requirements. A significant amount of work was back-ended to the final weeks of 2017.

Given these challenges, how well-prepared were firms for the MiFID II go-live?

Most firms arrived at the starting line in reasonable shape, despite the challenging circumstances. But inevitably some elements of MiFID II compliance were not fully tied down, and in many cases firms had to make assumptions that may not prove to be in line with those of others.

It sounds like firms still have more work to do in 2018.

That’s right. Pretty much every institution is resigned to the fact that their project teams for MiFID II will need to be kept in place for at least another nine months – probably until September 2018.

There will be immediate activity in Q1 2018 to remediate elements that did not work because of the last-minute rush to be ready. Adjustments may also be needed in line with any changes in guidance from ESMA.

Firms have reached the starting line, but that’s really only the start of the journey. Many firms focused on getting the fundamentals in place for the 3rd of January and are likely to be running optimisation projects now, to ensure their infrastructures are sustainable operationally, that they are resilient and that they are cost effective going forward. These firms may also be looking at secondary sources of some of the core data, so they have a backup in case of failure scenarios.

Reporting is an important element in MiFID II; how have firms approached this?

Transparency has been a primary focus of MiFID II and that has been achieved through very specific rules for trading venues and systematic internalisers (SIs). For SIs this has meant new rules around pre-trade price transparency, post-trade reporting and transaction reporting to the regulators. A number of Approved Publication Arrangements (APAs) have emerged and most of the market is taking advantage of their services to support pre- and post-trade reporting. At present, there are about 70-80 firms that will opt in as SIs across Europe. But there may be another 40-60 that will get caught by the redefinition of SIs by regulators in September 2018. This is when becoming an SI ceases to be optional and becomes mandatory. Regulators will identify firms that are doing significant volumes in certain instruments, and those that have not already opted in will be told that they need to adopt the SI reporting requirements.
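
The mechanics of that mandatory test can be sketched simply. The Python below is a hypothetical illustration of the volume check described above; the threshold values and test structure are placeholders for the purposes of the example, not ESMA’s actual calibrations.

```python
# Hypothetical sketch of the SI determination logic described above.
# The threshold values are placeholders, not ESMA's actual calibrations.
def must_register_as_si(
    firm_otc_volume: float,                 # firm's OTC volume in the instrument
    firm_total_volume: float,               # firm's total volume in the instrument
    eu_total_volume: float,                 # EU-wide volume in the instrument
    firm_share_threshold: float = 0.15,     # placeholder threshold
    market_share_threshold: float = 0.004,  # placeholder threshold
) -> bool:
    """Return True if the firm's OTC activity is significant on either test."""
    substantial_vs_own = firm_otc_volume / firm_total_volume >= firm_share_threshold
    substantial_vs_market = firm_otc_volume / eu_total_volume >= market_share_threshold
    return substantial_vs_own or substantial_vs_market

# A firm above either placeholder threshold would be told to adopt
# the SI reporting requirements rather than being able to choose.
print(must_register_as_si(firm_otc_volume=2_000_000,
                          firm_total_volume=10_000_000,
                          eu_total_volume=400_000_000))
```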

What do you think will happen over the next 12 months with regard to MiFID II?

I think there will be four main areas of activity: remediation projects, optimisation projects once firms have some breathing space, control framework projects to ensure the ‘belt and braces’ are in place, and finally compliance projects at firms that have been re-designated as SIs.

For the next 12 months, the market will see regulators bedding down what they need to do and making sure everything is running smoothly. There will be adjustments to the way things are working, and it is possible we will see a MiFID II.1, for example.

The changes regulators make will probably be relatively minor. For example, ESMA will probably adjust the thresholds for RTS 1 and 2 around the end of Q1, which will require firms to adjust their reference data accordingly. Financial regulators have been very clear that they will be watching how markets respond to this regulation and will be making adjustments.

For firms, there will be constant change until the end of 2018 as they bed down their systems to achieve optimal or routine operations. However, compared to what the market has been through to get ready for MiFID II, this work will be relatively straightforward.

Given all of the work firms have had to undertake, do they think it has been worth it?

The attitude towards MiFID II varies across organisations of course. Some definitely see it as an opportunity because the transparency in the market will create profound changes in the way instruments are traded. There’s consensus in the market that it will become a battleground, with organisations that have approached the change well benefiting more than those that haven’t.

There will probably be a barbell effect: large organisations that can deal with complexity will do well, as will small niche specialists. The generalists in the middle, without scale, may struggle a bit.

Although the majority of the heavy lifting is behind the industry, my message is that there is still work to do to refine and operationalise everything, creating robust and sustainable systems and infrastructures for MiFID II and beyond.

©BestExecution 2018


From High Touch To Low Touch

By Vaibhav Sagar, Senior Technology Consultant, Open System Tech

Higher buy-side expectations and the rapid pace of technological advances mean that the shift to low touch trading will continue.

Back in the day, most market making and client order execution was carried out by human traders. So-called high touch trading was often an opaque process, vulnerable to latency and susceptible to front-running.  Since there was no way for buy-side clients to monitor how the broker desks dealt with their orders, there was always an element of mistrust and misjudgment.

A typical high touch order flow involved buy-side orders being sent to sell-side dealers with a limit price range to work within. The sell-side trader would first try to cross the order with other client flow, then perhaps cross it against their own book positions. If these approaches failed, the broker would do their best to slice and dice the order in the open market.

This process had the advantages and disadvantages of relying on a trader’s judgement, networks and biases. Manual trading also carried the risk of higher slippage and market impact. Moreover, there was no systematic way to conduct transaction cost analysis or measure market impact.
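
For context, the following short Python sketch shows the kind of measurement that systematic transaction cost analysis makes routine: slippage of the average fill price against the arrival price, signed by order side. The function and its inputs are illustrative, not any particular vendor’s methodology.

```python
# A minimal sketch of one common TCA measure: slippage of the average fill
# price against the arrival (decision) price, expressed in basis points.
def slippage_bps(side: str, arrival_price: float, fills: list[tuple[float, float]]) -> float:
    """Return execution slippage in basis points (positive = cost).

    fills is a list of (price, quantity) pairs.
    """
    qty = sum(q for _, q in fills)
    avg_px = sum(p * q for p, q in fills) / qty
    sign = 1.0 if side.lower() == "buy" else -1.0
    return sign * (avg_px - arrival_price) / arrival_price * 10_000

# Buying at an average of 100.05 against an arrival price of 100.00
# costs 5 basis points.
print(round(slippage_bps("buy", 100.00, [(100.04, 600), (100.065, 400)]), 2))
```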

Arguably, high touch trading often made good sense for illiquid stocks. On the other hand, for liquid, high turnover stocks it was inefficient.

As a result, many buy-side clients started to prefer direct market access (DMA) orders, using brokers’ exchange connectivity to reach their exchange of choice and route orders accordingly. This helped solve the trust and front-running issues, but failed to tackle the problems of smart trading and market impact: DMA could send an order to a chosen exchange, but not necessarily to the exchange with the most liquidity.

Many large quant and high-volume shops also insisted on liquidity-seeking executions that minimised market impact and managed market risk. Since the buy-side firms’ primary concern was the investment decision, they left the trading intelligence to sell-side broker-dealers.

Now, of course, with advances in computing power, data storage and data analytics, more and more trading is automated. Algorithms are designed to target the venues with the most liquidity and adapt to changing market conditions to reduce market impact. They are configurable and can be tuned to behave in a passive, neutral or aggressive manner depending on trade and market conditions. Algorithms also incorporate historical execution data and analytics to make better routing decisions.
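
As a simplified illustration of two of those ideas, weighting child orders towards the most liquid venues and tuning behaviour with a passive/neutral/aggressive setting, the Python sketch below splits the next slice of a parent order across venues in proportion to observed liquidity. The participation rates and venue names are assumptions for the example, not any broker’s production logic.

```python
# A toy sketch (not any broker's production logic) of two ideas from the
# paragraph above: weighting child orders towards the most liquid venues,
# and tuning behaviour with a passive/neutral/aggressive setting.
URGENCY_PARTICIPATION = {"passive": 0.05, "neutral": 0.10, "aggressive": 0.25}

def child_order_sizes(parent_qty: int,
                      venue_liquidity: dict[str, float],
                      market_volume_forecast: float,
                      urgency: str = "neutral") -> dict[str, int]:
    """Split the next slice across venues in proportion to observed liquidity.

    The slice size is capped at a participation rate chosen by the urgency
    setting; an adaptive algo would keep refreshing venue_liquidity and the
    volume forecast from live and historical execution data.
    """
    rate = URGENCY_PARTICIPATION[urgency]
    slice_qty = min(parent_qty, int(rate * market_volume_forecast))
    total_liq = sum(venue_liquidity.values())
    return {venue: int(slice_qty * liq / total_liq)
            for venue, liq in venue_liquidity.items()}

print(child_order_sizes(50_000,
                        {"VENUE_A": 6_000_000, "VENUE_B": 3_000_000, "VENUE_C": 1_000_000},
                        market_volume_forecast=200_000,
                        urgency="neutral"))
# {'VENUE_A': 12000, 'VENUE_B': 6000, 'VENUE_C': 2000}
```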

All major broker-dealers have algorithm suites for different investment styles, adapted for diverse geographical regions, while specialist technology companies provide trading algorithms for smaller broker-dealers and buy-side firms. Regulatory demands, greater buy-side expectations and the sheer speed of technological development mean that more and more flow will migrate from high touch to low touch. That’s great news for people with technology skills.
